WorldWideScience

Sample records for sub-pixel scale cloud

  1. A Framework Based on 2-D Taylor Expansion for Quantifying the Impacts of Sub-Pixel Reflectance Variance and Covariance on Cloud Optical Thickness and Effective Radius Retrievals Based on the Bi-Spectral Method

    Science.gov (United States)

    Zhang, Z.; Werner, F.; Cho, H. -M.; Wind, G.; Platnick, S.; Ackerman, A. S.; Di Girolamo, L.; Marshak, A.; Meyer, Kerry

    2016-01-01

The bi-spectral method retrieves cloud optical thickness (τ) and cloud droplet effective radius (re) simultaneously from a pair of cloud reflectance observations, one in a visible or near-infrared (VISNIR) band and the other in a shortwave infrared (SWIR) band. A cloudy pixel is usually assumed to be horizontally homogeneous in the retrieval. Ignoring sub-pixel variations of cloud reflectances can lead to a significant bias in the retrieved τ and re. In the literature, the retrievals of τ and re are often assumed to be independent and considered separately when investigating the impact of sub-pixel cloud reflectance variations on the bi-spectral method. As a result, the impact on τ is attributed only to the sub-pixel variation of the VISNIR band reflectance, and the impact on re only to the sub-pixel variation of the SWIR band reflectance. In our new framework, we use the Taylor expansion of a two-variable function to understand and quantify the impacts of sub-pixel variances of VISNIR and SWIR cloud reflectances and their covariance on the τ and re retrievals. This framework takes into account the fact that the retrievals are determined by both VISNIR and SWIR band observations in a mutually dependent way. In comparison with previous studies, it provides a more comprehensive understanding of how sub-pixel cloud reflectance variations impact the τ and re retrievals based on the bi-spectral method. In particular, our framework provides a mathematical explanation of how the sub-pixel variation in the VISNIR band influences the re retrieval and why it can sometimes outweigh the influence of variations in the SWIR band and dominate the error in re retrievals, leading to a potential contribution of positive bias to the re retrieval. We test our framework using synthetic cloud fields from a large-eddy simulation and real observations from the Moderate Resolution Imaging Spectroradiometer (MODIS). The predicted results based on our framework agree very well with the numerical simulations. Our framework can be used

  2. A Framework for Quantifying the Impacts of Sub-Pixel Reflectance Variance and Covariance on Cloud Optical Thickness and Effective Radius Retrievals Based on the Bi-Spectral Method.

    Science.gov (United States)

    Zhang, Z; Werner, F.; Cho, H. -M.; Wind, Galina; Platnick, S.; Ackerman, A. S.; Di Girolamo, L.; Marshak, A.; Meyer, Kerry

    2017-01-01

    The so-called bi-spectral method retrieves cloud optical thickness (t) and cloud droplet effective radius (re) simultaneously from a pair of cloud reflectance observations, one in a visible or near infrared (VIS/NIR) band and the other in a shortwave-infrared (SWIR) band. A cloudy pixel is usually assumed to be horizontally homogeneous in the retrieval. Ignoring sub-pixel variations of cloud reflectances can lead to a significant bias in the retrieved t and re. In this study, we use the Taylor expansion of a two-variable function to understand and quantify the impacts of sub-pixel variances of VIS/NIR and SWIR cloud reflectances and their covariance on the t and re retrievals. This framework takes into account the fact that the retrievals are determined by both VIS/NIR and SWIR band observations in a mutually dependent way. In comparison with previous studies, it provides a more comprehensive understanding of how sub-pixel cloud reflectance variations impact the t and re retrievals based on the bi-spectral method. In particular, our framework provides a mathematical explanation of how the sub-pixel variation in VIS/NIR band influences the re retrieval and why it can sometimes outweigh the influence of variations in the SWIR band and dominate the error in re retrievals, leading to a potential contribution of positive bias to the re retrieval.
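The second-order Taylor expansion at the heart of this framework can be sketched numerically. The retrieval function below is a toy stand-in for the real bi-spectral lookup tables (its functional forms, coefficients, and reflectance statistics are illustrative assumptions, not the MODIS retrieval). The sub-pixel bias in re is then approximated by ½f_xx·Var(R_VIS) + ½f_yy·Var(R_SWIR) + f_xy·Cov(R_VIS, R_SWIR):

```python
import random

random.seed(0)

# Toy "retrieval" standing in for the bi-spectral lookup tables
# (hypothetical functional forms, not the real MODIS algorithm).
def retrieve(r_vis, r_swir):
    tau = 25.0 * r_vis ** 1.5                    # tau driven by the VIS/NIR band
    re = 20.0 * (1.0 - r_swir) / (0.5 + r_vis)   # re depends on both bands
    return tau, re

def partials(f, x, y, h=1e-4):
    """Second partial derivatives of f at (x, y) by central differences."""
    fxx = (f(x + h, y) - 2 * f(x, y) + f(x - h, y)) / h**2
    fyy = (f(x, y + h) - 2 * f(x, y) + f(x, y - h)) / h**2
    fxy = (f(x + h, y + h) - f(x + h, y - h)
           - f(x - h, y + h) + f(x - h, y - h)) / (4 * h**2)
    return fxx, fyy, fxy

# Correlated sub-pixel reflectances inside one coarse pixel (synthetic)
n = 20000
r_vis = [random.gauss(0.5, 0.08) for _ in range(n)]
r_swir = [0.3 + 0.5 * v + random.gauss(0, 0.02) for v in r_vis]

mv = sum(r_vis) / n
ms = sum(r_swir) / n
var_v = sum((v - mv) ** 2 for v in r_vis) / n
var_s = sum((s - ms) ** 2 for s in r_swir) / n
cov = sum((v - mv) * (s - ms) for v, s in zip(r_vis, r_swir)) / n

re_of = lambda x, y: retrieve(x, y)[1]
fxx, fyy, fxy = partials(re_of, mv, ms)

# "True" bias: mean of sub-pixel retrievals minus the retrieval from
# pixel-mean reflectances (the homogeneous-pixel assumption)
true_bias = sum(re_of(v, s) for v, s in zip(r_vis, r_swir)) / n - re_of(mv, ms)
# 2-D Taylor prediction of that bias
taylor_bias = 0.5 * fxx * var_v + 0.5 * fyy * var_s + fxy * cov

print(round(true_bias, 4), round(taylor_bias, 4))
```

Note that the covariance term is what couples the two bands: it is absent when the τ and re retrievals are treated as independent, which is the gap the framework above closes.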

  3. Chandra ACIS Sub-pixel Resolution

    Science.gov (United States)

    Kim, Dong-Woo; Anderson, C. S.; Mossman, A. E.; Allen, G. E.; Fabbiano, G.; Glotfelty, K. J.; Karovska, M.; Kashyap, V. L.; McDowell, J. C.

    2011-05-01

We investigate how to achieve the best possible ACIS spatial resolution by binning at the ACIS sub-pixel level and applying an event repositioning algorithm after removing pixel randomization from the pipeline data. We quantitatively assess the improvement in spatial resolution by (1) measuring point source sizes and (2) detecting faint point sources. The size of a bright (but not piled-up), on-axis point source can be reduced by about 20-30%. With the improved resolution, we detect 20% more faint sources when they are embedded in extended, diffuse emission in a crowded field. We further discuss the false source rate of about 10% among the newly detected sources, using a few ultra-deep observations. We also find that the new algorithm does not introduce a grid structure by an aliasing effect for dithered observations and does not worsen the positional accuracy.

  4. 2D Sub-Pixel Disparity Measurement Using QPEC / Medicis

    Directory of Open Access Journals (Sweden)

    M. Cournet

    2016-06-01

In the frame of its earth observation missions, CNES created a library called QPEC and one of its launchers, called Medicis. QPEC / Medicis is a sub-pixel two-dimensional stereo matching algorithm that works on an image pair. This tool is a block matching algorithm, which means that it is based on a local method. Moreover, it does not regularize the results found. It proposes several matching costs, such as the Zero-mean Normalised Cross-Correlation or statistical measures (the Mutual Information being one of them), and different match validation flags. QPEC / Medicis is able to compute a two-dimensional dense disparity map with sub-pixel precision. Hence, it is more versatile than the disparity estimation methods found in the computer vision literature, which often assume an epipolar geometry. CNES uses Medicis, among other applications, during the in-orbit image quality commissioning of earth observation satellites. For instance, the Pléiades-HR 1A & 1B and the Sentinel-2 geometric calibrations are based on this block matching algorithm. Over the years, it has become a common tool in ground segments for in-flight monitoring purposes. For these two kinds of applications, the two-dimensional search and the local sub-pixel measure without regularization can be essential. This tool is also used to generate digital elevation models automatically, although it was not initially designed for this purpose. This paper deals with the QPEC / Medicis algorithm. It also presents some of its CNES applications (in-orbit commissioning, in-flight monitoring or digital elevation model generation). Medicis software is also distributed outside CNES. This paper finally describes some of these external applications using Medicis, such as ground displacement measurement, or intra-oral scanning in the dental domain.

  5. Scaling the CERN OpenStack cloud

    Science.gov (United States)

    Bell, T.; Bompastor, B.; Bukowiec, S.; Castro Leon, J.; Denis, M. K.; van Eldik, J.; Fermin Lobo, M.; Fernandez Alvarez, L.; Fernandez Rodriguez, D.; Marino, A.; Moreira, B.; Noel, B.; Oulevey, T.; Takase, W.; Wiebalck, A.; Zilli, S.

    2015-12-01

CERN has been running a production OpenStack cloud since July 2013 to support physics computing and infrastructure services for the site. In the past year, the CERN cloud infrastructure has seen a constant increase in nodes, virtual machines, users and projects. This paper presents what has been done to make the CERN cloud infrastructure scale out.

  6. Scale analysis of convective clouds

    Directory of Open Access Journals (Sweden)

    Micha Gryschka

    2008-12-01

The size distribution of cumulus clouds due to shallow and deep convection is analyzed using satellite pictures, LES model results and data from the German rain radar network. The size distributions found can be described by simple power laws, as has also been proposed for other cloud data in the literature. As the precipitation observed at ground stations is ultimately determined by the number of clouds in an area and the individual sizes and rain rates of single clouds, the cloud size distributions might be used for developing empirical precipitation forecasts or for validating results from the cloud-resolving models being introduced into routine weather forecasting.
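Fitting such a power-law size distribution n(a) ∝ a^(-b) usually comes down to a least-squares fit of log(count density) against log(size) in logarithmically spaced bins. A minimal sketch with synthetic cloud areas (the exponent b = 1.7 and the size range are illustrative assumptions, not values from the study):

```python
import math
import random

random.seed(1)

# Draw synthetic cloud areas from n(a) ~ a^(-b) on [1, 1000] km^2
# by inverse-transform sampling (b = 1.7 is a hypothetical exponent).
b = 1.7
a_min, a_max = 1.0, 1000.0

def sample_area(u):
    k = 1.0 - b  # exponent of the CDF integrand
    c = a_min**k + u * (a_max**k - a_min**k)
    return c ** (1.0 / k)

areas = [sample_area(random.random()) for _ in range(50000)]

# Histogram in logarithmically spaced bins, then build (log size, log density)
nbins = 15
edges = [a_min * (a_max / a_min) ** (i / nbins) for i in range(nbins + 1)]
xs, ys = [], []
for lo, hi in zip(edges, edges[1:]):
    count = sum(lo <= a < hi for a in areas)
    if count > 0:
        xs.append(math.log((lo * hi) ** 0.5))   # geometric bin centre
        ys.append(math.log(count / (hi - lo)))  # density per unit area

# Ordinary least-squares slope in log-log space estimates -b
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
print(round(-slope, 2))  # estimate of the power-law exponent b
```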

  7. Mesoscale to Synoptic Scale Cloud Variability

    Science.gov (United States)

    Rossow, William B.

    1998-01-01

The atmospheric circulation and its interaction with the oceanic circulation involve non-linear and non-local exchanges of energy and water over a very large range of space and time scales. These exchanges are revealed, in part, by the related variations of clouds, which occur on a similar range of scales as the atmospheric motions that produce them. Collection of comprehensive measurements of the properties of the atmosphere, clouds and surface allows for diagnosis of some of these exchanges. The multi-satellite-network approach of the International Satellite Cloud Climatology Project (ISCCP) comes closest to providing complete coverage of the relevant range of space and time scales over which the clouds, atmosphere and ocean vary. A nearly 15-yr dataset is now available that covers the range from 3 hr and 30 km to decadal and planetary scales. This paper considers three topics: (1) cloud variations at the smallest scales and how they may influence radiation-cloud interactions, (2) cloud variations at "moderate" scales and how they may cause natural climate variability, and (3) cloud variations at the largest scales and how they affect the climate. The emphasis in this discussion is on the more mature subject of cloud-radiation interactions. There is now a need to begin similar detailed diagnostic studies of water exchange processes.

  8. Multi-scale Modeling of Arctic Clouds

    Science.gov (United States)

    Hillman, B. R.; Roesler, E. L.; Dexheimer, D.

    2017-12-01

The presence and properties of clouds are critically important to the radiative budget in the Arctic, but clouds are notoriously difficult to represent in global climate models (GCMs). The challenge stems partly from a disconnect between the scales at which these models are formulated and the scale of the physical processes important to the formation of clouds (e.g., convection and turbulence). Because of this, these processes are parameterized in large-scale models. Over the past decades, new approaches have been explored in which a cloud system resolving model (CSRM), or in the extreme a large eddy simulation (LES), is embedded into each grid cell of a traditional GCM to replace the cloud and convective parameterizations and explicitly simulate more of these important processes. This approach is attractive in that it allows for more explicit simulation of small-scale processes while also allowing for interaction between the small and large scales. The goal of this study is to quantify the performance of this framework in simulating Arctic clouds relative to a traditional global model, and to explore the limitations of such a framework using coordinated high-resolution (eddy-resolving) simulations. Simulations from the global model are compared with satellite retrievals of cloud fraction partitioned by cloud phase from CALIPSO, and limited-area LES simulations are compared with ground-based and tethered-balloon measurements from the ARM Barrow and Oliktok Point measurement facilities.

  9. Landform classification using a sub-pixel spatial attraction model to increase spatial resolution of a digital elevation model (DEM)

    Directory of Open Access Journals (Sweden)

    Marzieh Mokarrama

    2018-04-01

The purpose of the present study is to prepare a landform classification using a digital elevation model (DEM) with high spatial resolution. To reach this aim, a sub-pixel spatial attraction model was used as a novel method for preparing a DEM with high spatial resolution in the north of Darab, Fars province, Iran. The sub-pixel attraction models convert a pixel into sub-pixels based on the fraction values of the neighboring pixels, which alone can attract the central pixel. Under this approach, a maximum of eight neighboring pixels can be selected for calculating the attraction value; other pixels are assumed to be too far from the central pixel to exert any attraction. In the present study, the spatial resolution of a DEM was increased using a sub-pixel attraction model. The algorithm was applied to a DEM with a spatial resolution of 30 m (the Advanced Spaceborne Thermal Emission and Reflection Radiometer, ASTER) and of 90 m (the Shuttle Radar Topography Mission, SRTM). In the attraction model, scale factors S = 2, S = 3 and S = 4 with two neighborhood methods, touching (T = 1) and quadrant (T = 2), were applied to the DEMs using MATLAB software. The algorithm was evaluated against 487 sample points measured by surveyors. The spatial attraction model with scale factor S = 2 gives better results than scale factors greater than 2. Besides, the touching neighborhood method turned out to be more accurate than the quadrant method. In fact, dividing each pixel into more than two sub-pixels per side decreases the accuracy of the resulting DEM and increases the root-mean-square error (RMSE), showing that attraction models should not be used for S greater than 2. Thus, considering the results, the proposed model is highly capable of
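The attraction step described above can be sketched for a single coarse pixel: each sub-pixel is attracted by the fractions of the eight touching neighbors, weighted by inverse distance, and the central pixel's fraction is allocated to the most attracted sub-pixels. The 3×3 fraction window and the allocation rule below are illustrative assumptions, not the study's exact implementation:

```python
# Sub-pixel spatial attraction sketch for one coarse pixel, scale factor
# S = 2, "touching" (T = 1) neighborhood. fractions[][] holds the class
# fraction (0..1) of the 3x3 window around the central pixel (toy values).
S = 2
fractions = [
    [0.9, 0.8, 0.1],
    [0.7, 0.5, 0.2],
    [0.6, 0.3, 0.0],
]

def attraction(si, sj):
    """Attraction of sub-pixel (si, sj) of the central pixel: neighbor
    fractions weighted by inverse distance from the sub-pixel centre."""
    # Sub-pixel centre in coarse-pixel units; the central pixel spans [1, 2)
    y = 1 + (si + 0.5) / S
    x = 1 + (sj + 0.5) / S
    total = 0.0
    for ni in range(3):
        for nj in range(3):
            if (ni, nj) == (1, 1):
                continue  # only the 8 neighbors attract, not the pixel itself
            d = ((y - (ni + 0.5)) ** 2 + (x - (nj + 0.5)) ** 2) ** 0.5
            total += fractions[ni][nj] / d
    return total

# Rank the S*S sub-pixels by attraction and allocate the central pixel's
# fraction (0.5 -> 2 of the 4 sub-pixels) to the most attracted ones.
ranked = sorted(((attraction(i, j), i, j) for i in range(S) for j in range(S)),
                reverse=True)
n_assign = round(fractions[1][1] * S * S)
assigned = {(i, j) for _, i, j in ranked[:n_assign]}
print(sorted(assigned))
```

With the toy window above, the class fractions are highest toward the upper left, so the allocated sub-pixels cluster on that side of the central pixel.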

  10. Radial lens distortion correction with sub-pixel accuracy for X-ray micro-tomography.

    Science.gov (United States)

    Vo, Nghia T; Atwood, Robert C; Drakopoulos, Michael

    2015-12-14

    Distortion correction or camera calibration for an imaging system which is highly configurable and requires frequent disassembly for maintenance or replacement of parts needs a speedy method for recalibration. Here we present direct techniques for calculating distortion parameters of a non-linear model based on the correct determination of the center of distortion. These techniques are fast, very easy to implement, and accurate at sub-pixel level. The implementation at the X-ray tomography system of the I12 beamline, Diamond Light Source, which strictly requires sub-pixel accuracy, shows excellent performance in the calibration image and in the reconstructed images.
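The kind of non-linear radial model the abstract refers to can be sketched with the simplest odd-order polynomial about a known center of distortion, applied point-wise. The center and coefficient below are illustrative values, not the I12 calibration:

```python
# Minimal radial distortion correction sketch: r_u = r_d * (1 + k1 * r_d^2)
# about an assumed centre of distortion (cx, cy). Values are hypothetical.
cx, cy = 1024.0, 1024.0   # assumed centre of distortion (pixels)
k1 = -2.0e-8              # hypothetical radial coefficient (barrel if < 0)

def undistort(x, y):
    """Map a distorted pixel coordinate to its corrected position."""
    dx, dy = x - cx, y - cy
    r2 = dx * dx + dy * dy
    scale = 1.0 + k1 * r2
    return cx + dx * scale, cy + dy * scale

x_u, y_u = undistort(1500.0, 800.0)
print(round(x_u, 3), round(y_u, 3))
```

In a real calibration the coefficients (and, critically, the center of distortion) would be fitted to a target image, e.g. a dot pattern, before applying the mapping to tomography projections.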

  11. Sub-pixel analysis to support graphic security after scanning at low resolution

    Science.gov (United States)

    Haas, Bertrand; Cordery, Robert; Gou, Hongmei; Decker, Steve

    2006-02-01

Whether in the domain of audio, video or finance, our world tends to become increasingly digital. However, for diverse reasons, the transition from analog to digital is often much extended in time, and proceeds by long steps (and sometimes never completes). One such step is the conversion of information on analog media to digital information. We focus in this paper on the conversion (scanning) of printed documents to digital images. Analog media have the advantage over digital channels that they can harbor much imperceptible information that can be used for fraud detection and forensic purposes. But this secondary information usually fails to be retrieved during the conversion step. This is particularly relevant since the Check-21 act (Check Clearing for the 21st Century act) became effective in 2004 and allows images of checks to be handled by banks as usual paper checks. We use this situation of check scanning as our primary benchmark for graphic security features after scanning. We first present a quick review of the most common graphic security features currently found on checks, with their specific purposes, qualities and disadvantages, and we demonstrate their poor survivability after scanning under the average scanning conditions expected from the Check-21 Act. We then present a novel method of measuring distances between, and rotations of, line elements in a scanned image: based on an appropriate print model, we refine direct measurements to an accuracy beyond the size of a scanning pixel, so that we can determine expected distances, periodicity, sharpness and print quality of known characters, symbols and other graphic elements in a document image. Finally, we apply our method to fraud detection in documents after gray-scale scanning at 300 dpi resolution. We show in particular that alterations on legitimate checks, or copies of checks, can be successfully detected by measuring with sub-pixel accuracy the irregularities inherently introduced

  12. Improving sub-pixel imperviousness change prediction by ensembling heterogeneous non-linear regression models

    Science.gov (United States)

    Drzewiecki, Wojciech

    2016-12-01

In this work nine non-linear regression models were compared for sub-pixel impervious surface area mapping from Landsat images. The comparison was done in three study areas, both for the accuracy of imperviousness coverage evaluation at individual points in time and for the accuracy of imperviousness change assessment. The performance of individual machine learning algorithms (Cubist, Random Forest, stochastic gradient boosting of regression trees, k-nearest neighbors regression, random k-nearest neighbors regression, Multivariate Adaptive Regression Splines, averaged neural networks, and support vector machines with polynomial and radial kernels) was also compared with the performance of heterogeneous model ensembles constructed from the best models trained using particular techniques. The results proved that in the case of sub-pixel evaluation the most accurate prediction of change may not necessarily be based on the most accurate individual assessments. When single methods are considered, the Cubist algorithm may be advised for Landsat-based mapping of imperviousness for single dates. However, Random Forest may be endorsed when the most reliable evaluation of imperviousness change is the primary goal: it gave lower accuracies for individual assessments, but better prediction of change due to more correlated errors of the individual predictions. Heterogeneous model ensembles performed at least as well as the best individual models for individual time points. For imperviousness change assessment, the ensembles always outperformed single-model approaches. This means that it is possible to improve the accuracy of sub-pixel imperviousness change assessment using ensembles of heterogeneous non-linear regression models.
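The key mechanism here, that a model with larger but temporally correlated per-pixel errors can predict change better than a more accurate model with independent errors, can be illustrated with synthetic data (all numbers below are hypothetical, not results from the study):

```python
import random

random.seed(2)

n = 5000
truth_t1 = [random.uniform(0, 1) for _ in range(n)]
truth_t2 = [min(1.0, v + random.uniform(0, 0.2)) for v in truth_t1]  # growth

def rmse(pred, truth):
    return (sum((p - t) ** 2 for p, t in zip(pred, truth)) / len(truth)) ** 0.5

# Model A: small errors, but independent between the two dates
a_t1 = [t + random.gauss(0, 0.05) for t in truth_t1]
a_t2 = [t + random.gauss(0, 0.05) for t in truth_t2]

# Model B: larger errors, but strongly correlated between dates
# (a persistent per-pixel bias, as the abstract describes for Random Forest)
bias = [random.gauss(0, 0.08) for _ in range(n)]
b_t1 = [t + e + random.gauss(0, 0.02) for t, e in zip(truth_t1, bias)]
b_t2 = [t + e + random.gauss(0, 0.02) for t, e in zip(truth_t2, bias)]

true_change = [t2 - t1 for t1, t2 in zip(truth_t1, truth_t2)]
a_change = [p2 - p1 for p1, p2 in zip(a_t1, a_t2)]
b_change = [p2 - p1 for p1, p2 in zip(b_t1, b_t2)]

print("single-date RMSE: A", round(rmse(a_t1, truth_t1), 3),
      "B", round(rmse(b_t1, truth_t1), 3))
print("change RMSE:      A", round(rmse(a_change, true_change), 3),
      "B", round(rmse(b_change, true_change), 3))
```

Model A wins on single-date accuracy, yet Model B wins on change accuracy because its persistent bias cancels when the two dates are differenced.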

  13. Finite mixture models for sub-pixel coastal land cover classification

    CSIR Research Space (South Africa)

    Ritchie, Michaela C

    2017-05-01

Finite mixture models for sub-pixel coastal land cover classification (M. Ritchie, M. Lück-Vogel, P. Debba, V. Goodall; presented at ISRSE-37, Tshwane, South Africa, 10 May 2017). The study area is the False Bay coast of South Africa (Strand, Gordon's Bay), classified from a WorldView-2 image into classes including built-up/urban, herbaceous vegetation, shadow, sparse vegetation, water and woody vegetation; methods compared include Maximum Likelihood Classification (MLC), Gaussian Mixture Discriminant Analysis (GMDA) and t-distribution mixture discriminant analysis. (The full text of this record is presentation slides; only the recoverable summary is kept here.)

  14. Segmentation of arterial vessel wall motion to sub-pixel resolution using M-mode ultrasound.

    Science.gov (United States)

    Fancourt, Craig; Azer, Karim; Ramcharan, Sharmilee L; Bunzel, Michelle; Cambell, Barry R; Sachs, Jeffrey R; Walker, Matthew

    2008-01-01

    We describe a method for segmenting arterial vessel wall motion to sub-pixel resolution, using the returns from M-mode ultrasound. The technique involves measuring the spatial offset between all pairs of scans from their cross-correlation, converting the spatial offsets to relative wall motion through a global optimization, and finally translating from relative to absolute wall motion by interpolation over the M-mode image. The resulting detailed wall distension waveform has the potential to enhance existing vascular biomarkers, such as strain and compliance, as well as enable new ones.
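The first step described here, measuring the spatial offset between a pair of scans from their cross-correlation, can be sketched in one dimension: find the integer-lag correlation peak, then refine it to sub-pixel precision by fitting a parabola through the peak and its two neighbors (a standard refinement; the paper's global-optimization and interpolation steps are not reproduced here, and the Gaussian echoes are synthetic stand-ins for real M-mode returns):

```python
import math

def best_lag(a, b, max_lag):
    """Sub-pixel lag of the cross-correlation peak of b relative to a."""
    def xcorr(lag):
        return sum(a[i] * b[i + lag] for i in range(len(a))
                   if 0 <= i + lag < len(b))

    scores = {lag: xcorr(lag) for lag in range(-max_lag, max_lag + 1)}
    k = max(scores, key=scores.get)            # integer-lag peak
    if k in (-max_lag, max_lag):
        return float(k)                        # peak at search edge: no refinement
    c_m, c_0, c_p = scores[k - 1], scores[k], scores[k + 1]
    # vertex of the parabola through the peak and its two neighbours
    return k + 0.5 * (c_m - c_p) / (c_m - 2 * c_0 + c_p)

# Two synthetic Gaussian echoes offset by 2.3 samples, standing in for the
# vessel-wall return in consecutive M-mode scan lines
scan1 = [math.exp(-((i - 40) / 4.0) ** 2) for i in range(100)]
scan2 = [math.exp(-((i - 42.3) / 4.0) ** 2) for i in range(100)]
print(round(best_lag(scan1, scan2, 10), 2))
```

Applying this to all pairs of scans yields the matrix of relative offsets that the paper's global optimization then reconciles into a single wall-motion waveform.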

  15. Improving sub-pixel imperviousness change prediction by ensembling heterogeneous non-linear regression models

    Directory of Open Access Journals (Sweden)

    Drzewiecki Wojciech

    2016-12-01

In this work nine non-linear regression models were compared for sub-pixel impervious surface area mapping from Landsat images. The comparison was done in three study areas, both for the accuracy of imperviousness coverage evaluation at individual points in time and for the accuracy of imperviousness change assessment. The performance of individual machine learning algorithms (Cubist, Random Forest, stochastic gradient boosting of regression trees, k-nearest neighbors regression, random k-nearest neighbors regression, Multivariate Adaptive Regression Splines, averaged neural networks, and support vector machines with polynomial and radial kernels) was also compared with the performance of heterogeneous model ensembles constructed from the best models trained using particular techniques.

  16. Installing and Scaling out Ubuntu Enterprise Cloud in Virtual Environment

    DEFF Research Database (Denmark)

    Pantić, Zoran; Ali Babar, Muhammad

This document contains the supplemental material to the book “Guidelines for Building a Private Cloud Infrastructure”. It provides guidance on how to install Ubuntu Enterprise Cloud in a virtual environment, and afterwards how to scale out when needed. The purpose of this supplemental book is to provide a practical, step-by-step, detailed guide on how to dimension and install the machines and network. Some initial steps of configuring the cloud are also covered. The installation is performed in a virtual environment based on Windows 7 and VMware Workstation 7. The cloud installation is performed … cloud, both using the command line tools and the GUI-based tool HybridFox.

  17. Cloud Detection by Fusing Multi-Scale Convolutional Features

    Science.gov (United States)

    Li, Zhiwei; Shen, Huanfeng; Wei, Yancong; Cheng, Qing; Yuan, Qiangqiang

    2018-04-01

Cloud detection is an important pre-processing step for the accurate application of optical satellite imagery. Recent studies indicate that deep learning achieves the best performance in image segmentation tasks. Aiming at boosting the accuracy of cloud detection for multispectral imagery, especially imagery that contains only visible and near-infrared bands, in this paper we propose a deep learning based cloud detection method termed MSCN (multi-scale cloud net), which segments clouds by fusing multi-scale convolutional features. MSCN was trained on a global cloud cover validation collection, and was tested on more than ten types of optical images with different resolutions. Experiment results show that MSCN has obvious advantages in accuracy over the traditional multi-feature combined cloud detection method, especially in snow and other areas covered by bright non-cloud objects. Besides, MSCN produced more detailed cloud masks than the compared deep cloud detection convolutional network. The effectiveness of MSCN makes it promising for practical application to multiple kinds of optical imagery.

  18. Quantifying the Climate-Scale Accuracy of Satellite Cloud Retrievals

    Science.gov (United States)

    Roberts, Y.; Wielicki, B. A.; Sun-Mack, S.; Minnis, P.; Liang, L.; Di Girolamo, L.

    2014-12-01

    Instrument calibration and cloud retrieval algorithms have been developed to minimize retrieval errors on small scales. However, measurement uncertainties and assumptions within retrieval algorithms at the pixel level may alias into decadal-scale trends of cloud properties. We first, therefore, quantify how instrument calibration changes could alias into cloud property trends. For a perfect observing system the climate trend accuracy is limited only by the natural variability of the climate variable. Alternatively, for an actual observing system, the climate trend accuracy is additionally limited by the measurement uncertainty. Drifts in calibration over time may therefore be disguised as a true climate trend. We impose absolute calibration changes to MODIS spectral reflectance used as input to the CERES Cloud Property Retrieval System (CPRS) and run the modified MODIS reflectance through the CPRS to determine the sensitivity of cloud properties to calibration changes. We then use these changes to determine the impact of instrument calibration changes on trend uncertainty in reflected solar cloud properties. Secondly, we quantify how much cloud retrieval algorithm assumptions alias into cloud optical retrieval trends by starting with the largest of these biases: the plane-parallel assumption in cloud optical thickness (τC) retrievals. First, we collect liquid water cloud fields obtained from Multi-angle Imaging Spectroradiometer (MISR) measurements to construct realistic probability distribution functions (PDFs) of 3D cloud anisotropy (a measure of the degree to which clouds depart from plane-parallel) for different ISCCP cloud types. Next, we will conduct a theoretical study with dynamically simulated cloud fields and a 3D radiative transfer model to determine the relationship between 3D cloud anisotropy and 3D τC bias for each cloud type. Combining these results provides distributions of 3D τC bias by cloud type. Finally, we will estimate the change in
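The calibration-drift aliasing described above can be illustrated with a toy time series: a constant true reflectance, a hypothetical 1%-per-decade gain drift, and a power-law retrieval standing in for the real CERES/MODIS processing chain (all numbers are assumptions, not CERES values):

```python
# Sketch of how a slow calibration drift masquerades as a cloud trend.
years = [2000 + 0.1 * i for i in range(151)]    # ~monthly samples over 15 yr
true_r = 0.45                                   # constant truth: zero trend
drift_per_decade = 0.01                         # hypothetical 1%/decade gain drift
measured = [true_r * (1 + drift_per_decade * (y - 2000) / 10) for y in years]
tau = [25.0 * r ** 1.5 for r in measured]       # toy retrieval tau ~ R^1.5

def ols_slope(x, y):
    """Ordinary least-squares slope of y against x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
           sum((a - mx) ** 2 for a in x)

trend = ols_slope(years, tau) * 10              # optical-thickness trend per decade
print(round(trend, 3))  # spurious: the true trend is exactly zero
```

Even a 1% gain drift produces a non-zero retrieved trend, which is why the trend uncertainty budget must include the calibration term alongside natural variability.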

  19. Sub-pixel estimation of tree cover and bare surface densities using regression tree analysis

    Directory of Open Access Journals (Sweden)

    Carlos Augusto Zangrando Toneli

    2011-09-01

Sub-pixel analysis is capable of generating continuous fields, which represent the spatial variability of certain thematic classes. The aim of this work was to develop numerical models to represent the variability of tree cover and bare surfaces within the study area. This research was conducted in the riparian buffer within a watershed of the São Francisco River in the north of Minas Gerais, Brazil. IKONOS and Landsat TM imagery were used with the GUIDE algorithm to construct the models. The results were two index images derived with regression trees for the entire study area, one representing tree cover and the other representing bare surface. The use of non-parametric and non-linear regression tree models presented satisfactory results for characterizing wetland, deciduous and savanna patterns of forest formation.

  20. Transforming landscape ecological evaluations using sub-pixel remote sensing classifications: A study of invasive saltcedar (Tamarix spp.)

    Science.gov (United States)

    Frazier, Amy E.

    Invasive species disrupt landscape patterns and compromise the functionality of ecosystem processes. Non-native saltcedar (Tamarix spp.) poses significant threats to native vegetation and groundwater resources in the southwestern U.S. and Mexico, and quantifying spatial and temporal distribution patterns is essential for monitoring its spread. Advanced remote sensing classification techniques such as sub-pixel classifications are able to detect and discriminate saltcedar from native vegetation with high accuracy, but these types of classifications are not compatible with landscape metrics, which are the primary tool available for statistically assessing distribution patterns, because they do not have discrete class boundaries. The objective of this research is to develop new methods that allow sub-pixel classifications to be analyzed using landscape metrics. The research will be carried out through three specific aims: (1) develop and test a method to transform continuous sub-pixel classifications into categorical representations that are compatible with widely used landscape metric tools, (2) establish a gradient-based concept of landscape using sub-pixel classifications and the technique developed in the first objective to explore the relationships between pattern and process, and (3) generate a new super-resolution mapping technique method to predict the spatial locations of fractional land covers within a pixel. Results show that the threshold gradient method is appropriate for discretizing sub-pixel data, and can be used to generate increased information about the landscape compared to traditional single-value metrics. Additionally, the super-resolution classification technique was also able to provide detailed sub-pixel mapping information, but additional work will be needed to develop rigorous validation and accuracy assessment techniques.

  1. Microsecond-scale electric field pulses in cloud lightning discharges

    Science.gov (United States)

    Villanueva, Y.; Rakov, V. A.; Uman, M. A.; Brook, M.

    1994-01-01

    From wideband electric field records acquired using a 12-bit digitizing system with a 500-ns sampling interval, microsecond-scale pulses in different stages of cloud flashes in Florida and New Mexico are analyzed. Pulse occurrence statistics and waveshape characteristics are presented. The larger pulses tend to occur early in the flash, confirming the results of Bils et al. (1988) and in contrast with the three-stage representation of cloud-discharge electric fields suggested by Kitagawa and Brook (1960). Possible explanations for the discrepancy are discussed. The tendency for the larger pulses to occur early in the cloud flash suggests that they are related to the initial in-cloud channel formation processes and contradicts the common view found in the atmospheric radio-noise literature that the main sources of VLF/LF electromagnetic radiation in cloud flashes are the K processes which occur in the final, or J type, part of the cloud discharge.

  2. Synergetic cloud fraction determination for SCIAMACHY using MERIS

    Directory of Open Access Journals (Sweden)

    C. Schlundt

    2011-02-01

Since clouds play an essential role in the Earth's climate system, it is important to understand cloud characteristics as well as their distribution on a global scale using satellite observations. The main scientific objective of SCIAMACHY (SCanning Imaging Absorption spectroMeter for Atmospheric CHartographY), onboard the ENVISAT satellite, is the retrieval of vertical columns of trace gases.

    On the one hand, SCIAMACHY has to be sensitive to low variations in trace gas concentrations, which means the ground pixel size has to be large enough. On the other hand, such a large pixel size leads to the problem that SCIAMACHY spectra are often contaminated by clouds. SCIAMACHY spectral measurements are not well suited to derive a reliable sub-pixel cloud fraction that can be used as an input parameter for subsequent retrievals of cloud properties or vertical trace gas columns. Therefore, we use MERIS/ENVISAT spectral measurements, with their high spatial resolution, as sub-pixel information for the determination of the MerIs Cloud fRaction fOr Sciamachy (MICROS). Since MERIS covers an even broader swath than SCIAMACHY, no problems in the spatial and temporal collocation of measurements occur. This enables the derivation of a SCIAMACHY cloud fraction with an accuracy much higher than that of other current cloud fractions, which are based on SCIAMACHY's PMD (Polarization Measurement Device) data.

    We present our newly developed MICROS algorithm, based on a threshold approach, as well as a qualitative validation of our results against MERIS satellite images for different locations, especially with respect to bright surfaces such as snow/ice and sand. In addition, the SCIAMACHY cloud fractions derived from MICROS are intercompared with other current SCIAMACHY cloud fractions based on different approaches, demonstrating a considerable improvement in geometric cloud fraction determination using the MICROS algorithm.

  3. A characteristic scale in radiation fields of fractal clouds

    Energy Technology Data Exchange (ETDEWEB)

    Wiscombe, W.; Cahalan, R.; Davis, A.; Marshak, A. [Goddard Space Flight Center, Greenbelt, MD (United States)

    1996-04-01

The wavenumber spectrum of Landsat imagery of marine stratocumulus cloud shows a scale break when plotted on a double-log plot. We offer an explanation of this scale break in terms of smoothing by horizontal radiative fluxes, which is parameterized and incorporated into an improved independent pixel approximation. We compute the radiation fields emerging from fractal cloud models with horizontally variable optical depth. We use comparative spectral and multifractal analysis to assess the validity of the independent pixel approximation at the largest scales and demonstrate its shortcomings at the smallest scales.
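The spectral analysis referred to above amounts to computing a wavenumber power spectrum and fitting log-log slopes; a break appears where the slope changes. A minimal sketch on a synthetic power-law field (the k^(-5/3) exponent and field size are illustrative, and a pure power law has no break):

```python
import numpy as np

def wavenumber_spectrum(field):
    """One-dimensional power spectrum P(k), as used to look for a scale
    break on a double-log plot."""
    n = len(field)
    fk = np.fft.rfft(field - np.mean(field))
    power = np.abs(fk) ** 2 / n
    k = np.fft.rfftfreq(n, d=1.0)
    return k[1:], power[1:]  # drop the k=0 (mean) mode

def loglog_slope(k, p):
    """Least-squares slope of log P versus log k (the spectral exponent)."""
    return np.polyfit(np.log(k), np.log(p), 1)[0]

# Synthetic field with P(k) ~ k^(-5/3), built from random Fourier phases:
rng = np.random.default_rng(0)
n = 4096
k = np.fft.rfftfreq(n, d=1.0)
amp = np.zeros_like(k)
amp[1:] = k[1:] ** (-5.0 / 6.0)           # amplitude ~ k^(-5/6) → P ~ k^(-5/3)
phases = rng.uniform(0.0, 2.0 * np.pi, len(k))
field = np.fft.irfft(amp * np.exp(1j * phases), n)

kk, pp = wavenumber_spectrum(field)
print(round(loglog_slope(kk[:-1], pp[:-1]), 2))  # ≈ -1.67 (the Nyquist bin is excluded)
```

On real Landsat scenes one would fit the slope separately above and below a trial break scale and compare the two exponents.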

  4. Characterizing sub-pixel Landsat ETM+ fire severity on experimental fires in the Kruger National Park, South Africa

    CSIR Research Space (South Africa)

    Landmann, T

    2003-07-01

    Full Text Available Burn severity was quantitatively mapped using a unique linear spectral mixture model to determine sub-pixel abundances of different ashes and combustion completeness measured on the corresponding fire-affected pixels in Landsat data. A new burn...

  5. The role of cloud-scale resolution on radiative properties of oceanic cumulus clouds

    International Nuclear Information System (INIS)

    Kassianov, Evgueni; Ackerman, Thomas; Kollias, Pavlos

    2005-01-01

Both individual and combined effects of the horizontal and vertical variability of cumulus clouds on solar radiative transfer are investigated using a two-dimensional (x- and z-directions) cloud radar dataset. This high-resolution dataset of typical fair-weather marine cumulus is derived from ground-based 94 GHz cloud radar observations. The domain-averaged (along the x-direction) radiative properties are computed by a Monte Carlo method. It is shown that (i) different cloud-scale resolutions can be used for accurate calculations of the mean absorption and the upward and downward fluxes; (ii) the resolution effects can depend strongly on the solar zenith angle; and (iii) a few cloud statistics can be successfully applied for calculating the averaged radiative properties.

  6. Giant molecular cloud scaling relations: the role of the cloud definition

    Science.gov (United States)

    Khoperskov, S. A.; Vasiliev, E. O.; Ladeyschikov, D. A.; Sobolev, A. M.; Khoperskov, A. V.

    2016-01-01

We investigate the physical properties of molecular clouds in disc galaxies with different morphologies: a galaxy without prominent structure, a spiral barred galaxy and a galaxy with flocculent structure. Our N-body/hydrodynamical simulations take into account non-equilibrium H2 and CO chemical kinetics, self-gravity, star formation and feedback processes. For the simulated galaxies, the scaling relations of giant molecular clouds, or so-called Larson's relations, are studied for two types of cloud definition (or extraction method): the first is based on total column density position-position (PP) data sets and the second is indicated by the CO (1-0) line emission used in position-position-velocity (PPV) data. We find that the cloud populations obtained using both cloud extraction methods generally have similar physical parameters, except that for the CO data the mass spectrum of clouds has a tail of low-mass objects, M ∼ 10³-10⁴ M⊙. Owing to a varying column density threshold, the power-law indices in the scaling relations are significantly changed. In contrast, the relations are invariant to the CO brightness temperature threshold. Finally, we find that the mass spectra of clouds for PPV data are almost insensitive to the galactic morphology, whereas the spectra for PP data demonstrate significant variation.
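The PP-style cloud definition described above reduces to thresholding a column-density map and grouping the surviving pixels into connected structures; varying the threshold changes which clouds exist and hence the derived mass spectrum. A minimal sketch using `scipy.ndimage` (the toy map and thresholds are illustrative):

```python
import numpy as np
from scipy import ndimage

def extract_clouds(column_density, threshold):
    """PP-style cloud extraction: pixels above a column-density threshold are
    grouped into 4-connected structures; returns one total 'mass' per cloud
    (map units, with pixel area taken as 1)."""
    mask = column_density > threshold
    labels, n = ndimage.label(mask)
    return ndimage.sum_labels(column_density, labels, index=range(1, n + 1))

field = np.array([[0.0, 2.0, 2.0, 0.0],
                  [0.0, 3.0, 0.0, 0.0],
                  [0.0, 0.0, 0.0, 4.0],
                  [0.0, 0.0, 4.0, 4.0]])
print(extract_clouds(field, 1.0))  # two clouds, masses 7 and 12
print(extract_clouds(field, 2.5))  # a higher threshold shrinks the first cloud
```

Raising the threshold here removes the low-density pixels of the first cloud, illustrating how the threshold choice feeds directly into the cloud mass spectrum.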

  7. Gravity, turbulence and the scaling ``laws'' in molecular clouds

    Science.gov (United States)

    Ballesteros-Paredes, Javier

The so-called Larson (1981) scaling laws found empirically in molecular clouds have generally been interpreted as evidence that the clouds are turbulent and fractal. In the present contribution we discuss how recent observations and models of cloud formation suggest that: (a) these relations are the result of strong observational biases due to the cloud definition itself: since the filling factor of the dense structures is small, by thresholding the column density the computed mean density between clouds is nearly constant, and nearly the same as the threshold (Ballesteros-Paredes et al. 2012). (b) When accounting for column density variations, the velocity dispersion-size relation no longer appears. Instead, dense cores populate the upper-left corner of the δv-R diagram (Ballesteros-Paredes et al. 2011a). (c) Instead of a δv-R relation, a more appropriate relation seems to be δv²/R = 2GΣ, with Σ the cloud column density, which suggests that clouds are in collapse, rather than supported by turbulence (Ballesteros-Paredes et al. 2011a). (d) These results, along with the shapes of the star formation histories (Hartmann, Ballesteros-Paredes & Heitsch 2012), line profiles of collapsing clouds in numerical simulations (Heitsch, Ballesteros-Paredes & Hartmann 2009), core-to-core velocity dispersions (Heitsch, Ballesteros-Paredes & Hartmann 2009), the time evolution of the column density PDFs (Ballesteros-Paredes et al. 2011b), etc., strongly suggest that the actual source of the non-thermal motions is gravitational collapse of the clouds, so that the turbulent, chaotic component of the motions is only a by-product of the collapse, with no significant ``support'' role for the clouds. This calls into question whether the scale-free nature of the motions has a turbulent origin (Ballesteros-Paredes et al. 2011a, 2011b, 2012).
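The collapse relation δv²/R = 2GΣ can be checked numerically for a hypothetical cloud: given a mass and radius, it predicts a velocity dispersion of order km/s, as observed. A sketch in CGS units (the cloud parameters M = 10⁴ M⊙, R = 10 pc are illustrative sample values, not from the paper):

```python
# Numerical check of the collapse relation delta_v^2 / R = 2 G Sigma for a
# hypothetical cloud; all quantities in CGS units.
import math

G = 6.674e-8       # gravitational constant, cm^3 g^-1 s^-2
MSUN = 1.989e33    # solar mass, g
PC = 3.086e18      # parsec, cm

M = 1e4 * MSUN                       # hypothetical cloud mass
R = 10.0 * PC                        # hypothetical cloud radius
Sigma = M / (math.pi * R ** 2)       # mean surface density, g cm^-2

# Velocity dispersion implied by the collapse relation:
delta_v = math.sqrt(2.0 * G * Sigma * R)
print(f"{delta_v / 1e5:.2f} km/s")   # ~1.7 km/s, typical of observed clouds
```

That a plausible cloud mass and size return a realistic δv is the point of interpreting the relation as gravitational collapse rather than turbulent support.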

  8. METRICS FOR DYNAMIC SCALING OF DATABASE IN CLOUDS

    Directory of Open Access Journals (Sweden)

    Alexander V. Boichenko

    2013-01-01

Full Text Available This article analyzes the main methods of scaling databases (replication, sharding) and their support in popular relational databases and in NoSQL solutions with different data models: document-oriented, key-value, column-oriented and graph. The article provides an assessment of the capabilities of modern cloud-based solutions and gives a model for the organization of dynamic scaling in the cloud infrastructure. It analyzes different types of metrics and includes the basic metrics that characterize the functioning parameters of the database technology, as well as the goals of the integral metrics necessary for the implementation of adaptive algorithms for dynamic scaling of databases in the cloud infrastructure. This article was prepared with the support of RFBR grant № 13-07-00749.

  9. A Data Generator for Cloud-Scale Benchmarking

    Science.gov (United States)

    Rabl, Tilmann; Frank, Michael; Sergieh, Hatem Mousselly; Kosch, Harald

In many fields of research and business, data sizes are breaking the petabyte barrier. This imposes new problems and research possibilities on the database community. Usually, data of this size is stored in large clusters or clouds. Although clouds have become very popular in recent years, there is only little work on benchmarking cloud applications. In this paper we present a data generator for cloud-scale applications. Its architecture makes the data generator easy to extend and to configure. A key feature is the high degree of parallelism that allows linear scaling for arbitrary numbers of nodes. We show how distributions, relationships and dependencies in data can be computed in parallel with linear speed-up.
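A common way to get the linear, communication-free parallelism described above is to seed a random stream per record id, so any node can regenerate any slice (including referenced rows) deterministically. A minimal sketch of that idea, not the paper's actual generator; the column layout is invented for illustration:

```python
import numpy as np

def gen_rows(start, stop, seed=42):
    """Generate rows [start, stop) of a synthetic table. Each row's stream is
    seeded by its global id, so any worker can produce any slice
    independently, and the result is identical regardless of how rows are
    partitioned across nodes."""
    rows = []
    for i in range(start, stop):
        rng = np.random.default_rng([seed, i])  # per-row deterministic stream
        value = rng.integers(0, 1_000_000)
        ref = rng.integers(0, max(i, 1))        # reference to an earlier row id
        rows.append((i, int(value), int(ref)))
    return rows

# Two "nodes" generating disjoint halves agree with one node generating all:
assert gen_rows(0, 4) + gen_rows(4, 8) == gen_rows(0, 8)
```

Because `ref` is computed from the row's own stream, foreign-key-like relationships need no coordination between workers.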

  10. A robust sub-pixel edge detection method of infrared image based on tremor-based retinal receptive field model

    Science.gov (United States)

    Gao, Kun; Yang, Hu; Chen, Xiaomei; Ni, Guoqiang

    2008-03-01

Because of the complex thermal objects in an infrared image, prevalent image edge detection operators are often suited only to certain scenes and sometimes extract overly wide edges. From a biological point of view, image edge detection operators work reliably when assuming a convolution-based receptive field architecture. A DoG (Difference-of-Gaussians) model filter based on the ON-center retinal ganglion cell receptive field architecture, with artificial eye tremors introduced, is proposed for image contour detection. To handle the blurred edges of an infrared image, orthogonal polynomial interpolation and sub-pixel edge detection are subsequently applied in the neighborhood of each rough edge pixel to locate the edges at the sub-pixel level. Numerical simulations show that this method can locate the target edge accurately and robustly.

  11. Automatically Determining Scale Within Unstructured Point Clouds

    Science.gov (United States)

    Kadamen, Jayren; Sithole, George

    2016-06-01

Three-dimensional models obtained from imagery have an arbitrary scale and therefore have to be scaled. Automatically scaling these models requires the detection of objects in them, which can be computationally intensive; real-time object detection may pose problems for applications such as indoor navigation. This investigation poses the idea that relational cues, specifically height ratios, within indoor environments may offer an easier means to obtain scales for models created using imagery. The investigation aimed to show two things: (a) that the size of objects, especially their height off the ground, is consistent within an environment, and (b) that based on this consistency, objects can be identified and their general size used to scale a model. To test the idea, a hypothesis is first tested on a terrestrial lidar scan of an indoor environment. Later, as a proof of concept, the same test is applied to a model created using imagery. The most notable finding was that the detection of objects can be more readily done by studying the ratio between the dimensions of objects that have their dimensions defined by human physiology. For example, the dimensions of desks and chairs are related to the height of an average person. In the test, the difference between generalised and actual dimensions of objects was assessed. A maximum difference of 3.96% (2.93 cm) was observed from automated scaling. By analysing the ratio between the heights (distance from the floor) of the tops of objects in a room, identification was also achieved.
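Once objects with physiologically constrained heights are identified, the scale recovery itself is a ratio computation. A minimal sketch (the object heights below are hypothetical examples, not the paper's measurements):

```python
def scale_from_heights(measured, known):
    """Estimate the metric scale of an arbitrarily scaled model from objects
    whose real-world heights off the floor are roughly known (e.g. desk tops).
    `measured` are heights in model units, `known` in metres; the per-object
    ratios are averaged."""
    ratios = [k / m for m, k in zip(measured, known)]
    return sum(ratios) / len(ratios)

# Hypothetical model: a desk top and a door handle detected at these heights.
scale = scale_from_heights(measured=[3.55, 5.12], known=[0.72, 1.05])
print(round(scale, 4), "metres per model unit")
```

The spread of the individual ratios gives a direct check on whether the detected objects were correctly identified.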

  12. Aerosol-cloud interactions from urban, regional to global scales

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yuan [California Institute of Technology, Pasadena, CA (United States). Seismological Lab.

    2015-10-01

The studies in this dissertation aim at advancing our scientific understanding of the physical processes involved in aerosol-cloud-precipitation interactions and quantitatively assessing the impacts of aerosols on cloud systems of diverse scales over the globe, on the basis of observational data analysis and various modeling studies. As recognized in the Fifth Assessment Report of the Intergovernmental Panel on Climate Change, the magnitude of radiative forcing by atmospheric aerosols is highly uncertain, representing the largest uncertainty in projections of future climate change by anthropogenic activities. Using a newly implemented cloud microphysical scheme in a cloud-resolving model, the thesis assesses aerosol-cloud interactions for distinct weather systems, ranging from individual cumulus to mesoscale convective systems. The thesis also introduces a novel hierarchical modeling approach that resolves a long-standing mismatch between simulations by regional weather models and global climate models in the climate modeling community. More importantly, the thesis provides key scientific solutions to several challenging questions in climate science, including the global impacts of Asian pollution. As scientists wrestle with the complexities of climate change in response to varied anthropogenic forcing, perhaps no problem is more challenging than understanding the impacts of atmospheric aerosols from air pollution on clouds and the global circulation.

  13. Scaling predictive modeling in drug development with cloud computing.

    Science.gov (United States)

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

Growing data sets and increased time for analysis are hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Compute Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compared with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investment makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.

  14. Aerosol-cloud interactions from urban, regional to global scales

    International Nuclear Information System (INIS)

    Wang, Yuan

    2015-01-01

The studies in this dissertation aim at advancing our scientific understanding of the physical processes involved in aerosol-cloud-precipitation interactions and quantitatively assessing the impacts of aerosols on cloud systems of diverse scales over the globe, on the basis of observational data analysis and various modeling studies. As recognized in the Fifth Assessment Report of the Intergovernmental Panel on Climate Change, the magnitude of radiative forcing by atmospheric aerosols is highly uncertain, representing the largest uncertainty in projections of future climate change by anthropogenic activities. Using a newly implemented cloud microphysical scheme in a cloud-resolving model, the thesis assesses aerosol-cloud interactions for distinct weather systems, ranging from individual cumulus to mesoscale convective systems. The thesis also introduces a novel hierarchical modeling approach that resolves a long-standing mismatch between simulations by regional weather models and global climate models in the climate modeling community. More importantly, the thesis provides key scientific solutions to several challenging questions in climate science, including the global impacts of Asian pollution. As scientists wrestle with the complexities of climate change in response to varied anthropogenic forcing, perhaps no problem is more challenging than understanding the impacts of atmospheric aerosols from air pollution on clouds and the global circulation.

  15. Enabling Large-Scale Biomedical Analysis in the Cloud

    Directory of Open Access Journals (Sweden)

    Ying-Chih Lin

    2013-01-01

Full Text Available Recent progress in high-throughput instrumentation has led to an astonishing growth in both the volume and the complexity of biomedical data collected from various sources. This planet-scale data brings serious challenges to storage and computing technologies. Cloud computing is an alternative able to crack this nut, because it jointly addresses scalable storage and high-performance computing for large-scale data. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications should facilitate biomedical research by making the vast amount of diverse data meaningful and usable.

  16. Web-scale data management for the cloud

    CERN Document Server

    Lehner, Wolfgang

    2013-01-01

The efficient management of a consistent and integrated database is a central task in modern IT and highly relevant for science and industry. Hardly any critical enterprise solution comes without functionality for managing data in its different forms. Web-Scale Data Management for the Cloud addresses fundamental challenges posed by the need and desire to provide database functionality in the context of the Database as a Service (DBaaS) paradigm for database outsourcing. This book also discusses the motivation of the new paradigm of cloud computing, and its impact on data outsourcing and se

  17. Cloud-Scale Numerical Modeling of the Arctic Boundary Layer

    Science.gov (United States)

    Krueger, Steven K.

    1998-01-01

    The interactions between sea ice, open ocean, atmospheric radiation, and clouds over the Arctic Ocean exert a strong influence on global climate. Uncertainties in the formulation of interactive air-sea-ice processes in global climate models (GCMs) result in large differences between the Arctic, and global, climates simulated by different models. Arctic stratus clouds are not well-simulated by GCMs, yet exert a strong influence on the surface energy budget of the Arctic. Leads (channels of open water in sea ice) have significant impacts on the large-scale budgets during the Arctic winter, when they contribute about 50 percent of the surface fluxes over the Arctic Ocean, but cover only 1 to 2 percent of its area. Convective plumes generated by wide leads may penetrate the surface inversion and produce condensate that spreads up to 250 km downwind of the lead, and may significantly affect the longwave radiative fluxes at the surface and thereby the sea ice thickness. The effects of leads and boundary layer clouds must be accurately represented in climate models to allow possible feedbacks between them and the sea ice thickness. The FIRE III Arctic boundary layer clouds field program, in conjunction with the SHEBA ice camp and the ARM North Slope of Alaska and Adjacent Arctic Ocean site, will offer an unprecedented opportunity to greatly improve our ability to parameterize the important effects of leads and boundary layer clouds in GCMs.

  18. Thorough statistical comparison of machine learning regression models and their ensembles for sub-pixel imperviousness and imperviousness change mapping

    Directory of Open Access Journals (Sweden)

    Drzewiecki Wojciech

    2017-12-01

Full Text Available We evaluated the performance of nine machine learning regression algorithms and their ensembles for sub-pixel estimation of impervious area coverage from Landsat imagery. The accuracy of imperviousness mapping at individual time points was assessed based on RMSE, MAE and R2. These measures were also used to assess the imperviousness change intensity estimates. The applicability for detection of relevant changes in impervious area coverage at the sub-pixel level was evaluated using overall accuracy, F-measure and ROC Area Under Curve. The results proved that the Cubist algorithm may be advised for Landsat-based mapping of imperviousness for single dates. Stochastic gradient boosting of regression trees (GBM) may also be considered for this purpose. However, the Random Forest algorithm is endorsed for both imperviousness change detection and mapping of its intensity. In all applications the heterogeneous model ensembles performed at least as well as the best individual models or better. They may be recommended for improving the quality of sub-pixel imperviousness and imperviousness change mapping. The study also revealed limitations of the investigated methodology for detection of subtle changes of imperviousness inside the pixel. None of the tested approaches was able to reliably classify changed and non-changed pixels when the relevant change threshold was set at one or three percent. Also, for a five percent change threshold most of the algorithms did not ensure that the accuracy of the change map was higher than the accuracy of a random classifier. For a threshold of relevant change set at ten percent, all approaches performed satisfactorily.
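The accuracy measures named above (RMSE, MAE, R2) and the heterogeneous-ensemble idea are easy to make concrete. A minimal sketch with invented predictions standing in for two different regression models; on these numbers the simple mean ensemble happens to beat both members, mirroring the paper's finding:

```python
import numpy as np

def rmse(y, p): return float(np.sqrt(np.mean((y - p) ** 2)))
def mae(y, p):  return float(np.mean(np.abs(y - p)))
def r2(y, p):   return float(1 - np.sum((y - p) ** 2) / np.sum((y - np.mean(y)) ** 2))

y = np.array([0.1, 0.4, 0.7, 0.9])        # true imperviousness fractions
pred_a = np.array([0.2, 0.3, 0.7, 0.8])   # hypothetical model A predictions
pred_b = np.array([0.0, 0.5, 0.6, 1.0])   # hypothetical model B predictions
ensemble = (pred_a + pred_b) / 2          # simple heterogeneous ensemble

for name, p in [("A", pred_a), ("B", pred_b), ("mean", ensemble)]:
    print(name, round(rmse(y, p), 3), round(mae(y, p), 3), round(r2(y, p), 3))
```

Averaging helps here because the two models' errors partly cancel; with correlated errors the gain shrinks.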

  19. Thorough statistical comparison of machine learning regression models and their ensembles for sub-pixel imperviousness and imperviousness change mapping

    Science.gov (United States)

    Drzewiecki, Wojciech

    2017-12-01

We evaluated the performance of nine machine learning regression algorithms and their ensembles for sub-pixel estimation of impervious area coverage from Landsat imagery. The accuracy of imperviousness mapping at individual time points was assessed based on RMSE, MAE and R2. These measures were also used to assess the imperviousness change intensity estimates. The applicability for detection of relevant changes in impervious area coverage at the sub-pixel level was evaluated using overall accuracy, F-measure and ROC Area Under Curve. The results proved that the Cubist algorithm may be advised for Landsat-based mapping of imperviousness for single dates. Stochastic gradient boosting of regression trees (GBM) may also be considered for this purpose. However, the Random Forest algorithm is endorsed for both imperviousness change detection and mapping of its intensity. In all applications the heterogeneous model ensembles performed at least as well as the best individual models or better. They may be recommended for improving the quality of sub-pixel imperviousness and imperviousness change mapping. The study also revealed limitations of the investigated methodology for detection of subtle changes of imperviousness inside the pixel. None of the tested approaches was able to reliably classify changed and non-changed pixels when the relevant change threshold was set at one or three percent. Also, for a five percent change threshold most of the algorithms did not ensure that the accuracy of the change map was higher than the accuracy of a random classifier. For a threshold of relevant change set at ten percent, all approaches performed satisfactorily.

  20. Large-scale urban point cloud labeling and reconstruction

    Science.gov (United States)

    Zhang, Liqiang; Li, Zhuqiang; Li, Anjian; Liu, Fangyu

    2018-04-01

The large number of object categories and the many overlapping or closely neighboring objects in large-scale urban scenes pose great challenges for point cloud classification. In this paper, a novel framework is proposed for classification and reconstruction of airborne laser scanning point cloud data. To label point clouds, we present a rectified-linear-unit neural network named ReLu-NN, in which rectified linear units (ReLU) are taken as the activation function instead of the traditional sigmoid in order to speed up convergence. Since the features of the point cloud are sparse, we reduce the number of active neurons by dropout to avoid over-fitting during training. The set of feature descriptors for each 3D point is encoded through self-taught learning, forming a discriminative feature representation which is taken as the input of the ReLu-NN. The segmented building points are consolidated through an edge-aware point set resampling algorithm, and then reconstructed into 3D lightweight models using the 2.5D contouring method (Zhou and Neumann, 2010). Compared with deep learning approaches, the ReLu-NN introduced here can easily classify unorganized point clouds without rasterizing the data, and it does not need a large number of training samples. Most of the parameters in the network are learned, and thus the intensive parameter-tuning cost is significantly reduced. Experimental results on various datasets demonstrate that the proposed framework achieves better performance than other related algorithms in terms of classification accuracy and reconstruction quality.
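The two ingredients named above, ReLU activations and dropout on sparse features, can be sketched as a forward pass in plain NumPy. This is a generic illustration of the idea, not the paper's ReLu-NN architecture; layer sizes and the dropout rate are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def forward(x, weights, biases, drop_p=0.2, train=True):
    """Forward pass of a small ReLU network with inverted dropout on the
    hidden layers: ReLU for fast convergence, dropout to curb over-fitting
    on sparse point features. Returns raw class scores (logits)."""
    h = x
    for i, (w, b) in enumerate(zip(weights, biases)):
        h = h @ w + b
        if i < len(weights) - 1:              # hidden layers only
            h = relu(h)
            if train:
                mask = rng.random(h.shape) >= drop_p
                h = h * mask / (1.0 - drop_p)  # rescale so E[h] is unchanged
    return h

# 5 points with 8 features each, one hidden layer of 16, 3 output classes:
x = rng.normal(size=(5, 8))
ws = [rng.normal(size=(8, 16)) * 0.1, rng.normal(size=(16, 3)) * 0.1]
bs = [np.zeros(16), np.zeros(3)]
print(forward(x, ws, bs).shape)  # (5, 3)
```

At inference time `train=False` disables dropout; the `1/(1-p)` rescaling during training keeps the expected activations consistent between the two modes.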

  1. Energy Conservation Using Dynamic Voltage Frequency Scaling for Computational Cloud

    Directory of Open Access Journals (Sweden)

    A. Paulin Florence

    2016-01-01

Full Text Available Cloud computing is a new technology which supports resource sharing on a “pay as you go” basis around the world. It provides various services such as SaaS, IaaS, and PaaS. Computation is a part of IaaS, and all computational requests are to be served efficiently with optimal power utilization in the cloud. Recently, various algorithms have been developed to reduce power consumption, and the Dynamic Voltage and Frequency Scaling (DVFS) scheme is also used in this perspective. In this paper we have devised a methodology which analyzes the behavior of a given cloud request and identifies the type of algorithm involved. Once the type of algorithm is identified, its time complexity is calculated using its asymptotic notation. Using a best-fit strategy the appropriate host is identified and the incoming job is allocated to it. From the measured time complexity the required clock frequency of the host is determined. Accordingly, the CPU frequency is scaled up or down using the DVFS scheme, enabling up to 55% of total energy consumption to be saved.
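The frequency-selection step can be sketched as follows: from the job's estimated cycle count (derived from its time complexity) and its deadline, pick the lowest frequency that still finishes in time, since dynamic power grows roughly with V²f. The frequency table, cycle count and deadline below are illustrative, not from the paper:

```python
def pick_frequency(est_cycles, deadline_s, freqs_hz):
    """Pick the lowest available CPU frequency that still completes the job's
    estimated cycle count within the deadline; lower frequency means lower
    dynamic power. Falls back to the top frequency if none suffices."""
    for f in sorted(freqs_hz):
        if est_cycles / f <= deadline_s:
            return f
    return max(freqs_hz)  # even the top frequency misses the deadline

freqs = [1.2e9, 1.8e9, 2.4e9, 3.0e9]
# A job estimated (via its asymptotic complexity) at 9e9 cycles, 6 s deadline:
print(pick_frequency(9e9, 6.0, freqs) / 1e9, "GHz")  # 1.8 GHz suffices
```

Running at 1.8 GHz instead of 3.0 GHz here meets the deadline while leaving headroom for the quadratic power savings that DVFS exploits.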

  2. Large-scale parallel genome assembler over cloud computing environment.

    Science.gov (United States)

    Das, Arghya Kusum; Koppa, Praveen Kumar; Goswami, Sayan; Platania, Richard; Park, Seung-Jong

    2017-06-01

The size of high-throughput DNA sequencing data has already reached the terabyte scale. To manage this huge volume of data, many downstream sequencing applications have started using locality-based computing over different cloud infrastructures to take advantage of elastic (pay as you go) resources at a lower cost. However, the locality-based programming model (e.g. MapReduce) is relatively new. Consequently, developing scalable data-intensive bioinformatics applications using this model, and understanding the hardware environment that these applications require for good performance, both require further research. In this paper, we present a de Bruijn graph oriented Parallel Giraph-based Genome Assembler (GiGA), as well as the hardware platform required for its optimal performance. GiGA uses the power of Hadoop (MapReduce) and Giraph (large-scale graph analysis) to achieve high scalability over hundreds of compute nodes by collocating the computation and data. GiGA achieves significantly higher scalability, with competitive assembly quality, compared to contemporary parallel assemblers (e.g. ABySS and Contrail) over a traditional HPC cluster. Moreover, we show that the performance of GiGA is significantly improved by using an SSD-based private cloud infrastructure instead of a traditional HPC cluster. We observe that the performance of GiGA on 256 cores of this SSD-based cloud infrastructure closely matches that of 512 cores of the traditional HPC cluster.
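The de Bruijn graph model that GiGA distributes over Giraph can be shown in miniature: k-mers from the reads become edges between (k-1)-mer nodes, and walking the graph spells out the sequence. A toy sketch assuming error-free reads and a repeat-free sequence (real assemblers handle errors, repeats and distributed graph storage):

```python
def assemble(reads, k=4):
    """Toy de Bruijn assembly: each k-mer is an edge from its (k-1)-mer
    prefix to its (k-1)-mer suffix; a walk along the single non-branching
    path reconstructs the sequence."""
    kmers = {r[i:i + k] for r in reads for i in range(len(r) - k + 1)}
    adj, indeg = {}, {}
    for km in kmers:
        pre, suf = km[:-1], km[1:]
        adj.setdefault(pre, []).append(suf)
        indeg[suf] = indeg.get(suf, 0) + 1
    start = next(n for n in adj if indeg.get(n, 0) == 0)  # unique path start
    contig, node = start, start
    while node in adj:
        node = adj[node][0]
        contig += node[-1]   # each step appends one new character
    return contig

reads = ["the_quick_", "uick_brown", "brown_fox"]
print(assemble(reads))  # → the_quick_brown_fox
```

The set of k-mers deduplicates the overlap between reads, which is what makes the graph, rather than the raw reads, the natural distributed data structure.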

  3. Evaluation of the Use of Sub-Pixel Offset Tracking Techniques to Monitor Landslides in Densely Vegetated Steeply Sloped Areas

    Directory of Open Access Journals (Sweden)

    Luyi Sun

    2016-08-01

Full Text Available Sub-Pixel Offset Tracking (sPOT) is applied to derive high-resolution, centimetre-level landslide rates in the Three Gorges Region of China using TerraSAR-X Hi-resolution Spotlight (TSX HS) space-borne SAR images. These results contrast sharply with previous use of conventional differential Interferometric Synthetic Aperture Radar (DInSAR) techniques in areas with steep slopes, dense vegetation and large variability in water vapour, which indicated around 12% phase-coherent coverage. By contrast, sPOT is capable of measuring two-dimensional deformation of large gradient over steeply sloped areas covered in dense vegetation. Previous applications of sPOT in this region rely on corner reflectors (CRs), i.e. high-coherence features, to obtain reliable measurements. However, CRs are expensive and difficult to install, especially in remote areas, and other potential high-coherence features comparable with CRs are very few and lie outside the landslide boundary. The resulting sub-pixel-level deformation field can be statistically analysed to yield multi-modal maps of deformation regions. This approach is shown to have a significant impact when compared with previous offset tracking measurements of landslide deformation, as it demonstrates that sPOT can be applied even in densely vegetated terrain without relying on high-contrast surface features or requiring any de-noising process.
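Offset tracking estimates the shift between two acquisitions by cross-correlating image patches and refining the correlation peak to sub-pixel precision. A 1-D sketch of that idea (sPOT itself correlates 2-D SAR amplitude patches); the Gaussian "feature" and 3.3-sample shift are synthetic test data:

```python
import numpy as np

def subpixel_offset(a, b):
    """Estimate the shift of signal `b` relative to `a` by cross-correlation,
    refined to sub-pixel precision with a parabolic fit around the
    correlation peak."""
    c = np.correlate(b, a, mode="full")
    lags = np.arange(-(len(a) - 1), len(b))
    i = int(np.argmax(c))
    y0, y1, y2 = c[i - 1], c[i], c[i + 1]
    delta = 0.5 * (y0 - y2) / (y0 - 2.0 * y1 + y2)  # parabola vertex offset
    return lags[i] + delta

x = np.arange(64, dtype=float)
a = np.exp(-((x - 20.0) ** 2) / 8.0)    # a bright feature...
b = np.exp(-((x - 23.3) ** 2) / 8.0)    # ...shifted by 3.3 samples
print(round(subpixel_offset(a, b), 2))  # ≈ 3.3
```

Because the method uses amplitude rather than phase, it keeps working where DInSAR loses coherence, which is the point made in the abstract.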

  4. Cost Optimal Elastic Auto-Scaling in Cloud Infrastructure

    Science.gov (United States)

    Mukhopadhyay, S.; Sidhanta, S.; Ganguly, S.; Nemani, R. R.

    2014-12-01

Today, elastic scaling is a critical part of leveraging the cloud. Elastic scaling refers to adding resources only when needed and deleting resources when not in use; it ensures compute/server resources are not over-provisioned. Today, Amazon and Windows Azure are the only two platform providers that allow auto-scaling of cloud resources, where servers are automatically added and deleted. However, these solutions fall short on the following key features: (A) they require explicit policy definitions, such as server load thresholds, and therefore lack any predictive intelligence to make optimal decisions; (B) they do not decide on the right size of resource and thereby do not yield a cost-optimal resource pool. In a typical cloud deployment model, we consider two types of application scenario: (A) batch processing jobs (the Hadoop/Big Data case); (B) transactional applications (any application that processes continuous requests/responses). With reference to the classical queueing model, we consider a scenario where servers have a price and a capacity (size) and the system can add or delete servers to maintain a certain queue length. Classical queueing models apply to scenarios where the number of servers is constant, so we cannot apply stationary system analysis in this case. We investigate the following questions: (1) Can we define a job queue and use metrics on such a queue to predict the resource requirement in a quasi-stationary way, and can we map that into an optimal sizing problem? (2) Do we need to get down to the level of load (CPU/data) on each server to characterize the size requirement, and how do we learn that based on job type?
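The "optimal sizing" question above can be made concrete with a simple capacity calculation: for each instance type, find the smallest count that keeps predicted utilisation under a target, then take the cheapest option. The instance types, rates and prices below are illustrative; the abstract's full model adds queue-length prediction on top of this:

```python
import math

def size_pool(arrival_rate, instance_types, target_util=0.7):
    """Cost-optimal sizing sketch: for each instance type (name, service rate
    in jobs/s, price in $/h), find the smallest count keeping utilisation
    below the target, then return the cheapest (name, count, cost)."""
    best = None
    for name, mu, price in instance_types:
        n = max(1, math.ceil(arrival_rate / (mu * target_util)))
        cost = n * price
        if best is None or cost < best[2]:
            best = (name, n, cost)
    return best

types = [("small", 10.0, 0.12), ("large", 45.0, 0.40)]
print(size_pool(100.0, types))  # 4 large instances are cheapest here
```

Keeping utilisation below 1 is also the stability condition of the underlying queueing model (c·μ > λ for an M/M/c queue), so the target-utilisation headroom doubles as a safety margin against queue growth.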

  5. Urban Image Classification: Per-Pixel Classifiers, Sub-Pixel Analysis, Object-Based Image Analysis, and Geospatial Methods. Chapter 10

    Science.gov (United States)

    Myint, Soe W.; Mesev, Victor; Quattrochi, Dale; Wentz, Elizabeth A.

    2013-01-01

Remote sensing methods used to generate base maps for analyzing the urban environment rely predominantly on digital sensor data from space-borne platforms. This is due in part to new sources of high spatial resolution data covering the globe, a variety of multispectral and multitemporal sources, sophisticated statistical and geospatial methods, and compatibility with GIS data sources and methods. The goal of this chapter is to review the four groups of classification methods for digital sensor data from space-borne platforms: per-pixel, sub-pixel, object-based (spatial-based), and geospatial methods. Per-pixel methods are widely used methods that classify pixels into distinct categories based solely on the spectral and ancillary information within that pixel. They are used for everything from simple calculations of environmental indices (e.g., NDVI) to sophisticated expert systems that assign urban land covers. Researchers recognize, however, that even with the smallest pixel size the spectral information within a pixel is really a combination of multiple urban surfaces. Sub-pixel classification methods therefore aim to statistically quantify the mixture of surfaces to improve overall classification accuracy. While within-pixel variations exist, there is also significant evidence that groups of nearby pixels have similar spectral information and therefore belong to the same classification category. Object-oriented methods have emerged that group pixels prior to classification based on spectral similarity and spatial proximity. Classification accuracy using object-based methods shows significant success and promise for numerous urban applications. Like the object-oriented methods that recognize the importance of spatial proximity, geospatial methods for urban mapping also utilize neighboring pixels in the classification process. The primary difference, though, is that geostatistical methods (e.g., spatial autocorrelation methods) are utilized during both the pre- and post
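The per-pixel and sub-pixel ideas in this chapter can be contrasted in a few lines: NDVI is a per-pixel index, while linear spectral unmixing recovers sub-pixel surface fractions. A minimal sketch with invented two-band endmember spectra (real unmixing uses more bands and constrained solvers):

```python
import numpy as np

def ndvi(red, nir):
    """Per-pixel index: NDVI = (NIR - Red) / (NIR + Red)."""
    red, nir = np.asarray(red, float), np.asarray(nir, float)
    return (nir - red) / (nir + red)

def unmix(pixel, endmembers):
    """Sub-pixel analysis: least-squares fractions of endmember spectra whose
    mixture best reproduces the pixel spectrum (no sum-to-one or positivity
    constraints in this minimal sketch)."""
    a = np.asarray(endmembers, float).T
    f, *_ = np.linalg.lstsq(a, np.asarray(pixel, float), rcond=None)
    return f

veg   = np.array([0.05, 0.50])   # [red, nir] vegetation endmember (invented)
urban = np.array([0.30, 0.35])   # [red, nir] impervious endmember (invented)
mixed = 0.6 * veg + 0.4 * urban  # a 60/40 mixed pixel

print(round(float(ndvi(0.05, 0.50)), 3))         # → 0.818 (pure vegetation)
print(np.round(unmix(mixed, [veg, urban]), 3))   # → [0.6 0.4]
```

The unmixing recovers the 60/40 composition that a per-pixel classifier would have to collapse into a single label.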

  6. Molecular cloud-scale star formation in NGC 300

    Energy Technology Data Exchange (ETDEWEB)

    Faesi, Christopher M.; Lada, Charles J.; Forbrich, Jan [Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Menten, Karl M. [Max Planck Institut für Radioastronomie, Auf dem Hügel 69, D-53121 Bonn (Germany); Bouy, Hervé [Centro de Astrobiología, (INTA-CSIC), Departamento de Astrofísica, POB 78, ESAC Campus, 28691 Villanueva de la Cañada (Spain)

    2014-07-01

    We present the results of a galaxy-wide study of molecular gas and star formation in a sample of 76 H II regions in the nearby spiral galaxy NGC 300. We have measured the molecular gas at 250 pc scales using pointed CO(J = 2-1) observations with the Atacama Pathfinder Experiment telescope. We detect CO in 42 of our targets, deriving molecular gas masses ranging from our sensitivity limit of ∼10⁵ M☉ to 7 × 10⁵ M☉. We find a clear decline in the CO detection rate with galactocentric distance, which we attribute primarily to the radial decline of metallicity in NGC 300. We combine Galaxy Evolution Explorer far-ultraviolet, Spitzer 24 μm, and Hα narrowband imaging to measure the star formation activity in our sample. We have developed a new direct modeling approach for computing star formation rates (SFRs) that utilizes these data and population synthesis models to derive the masses and ages of the young stellar clusters associated with each of our H II region targets. We find a characteristic gas depletion time of 230 Myr at 250 pc scales in NGC 300, more similar to the results obtained for Milky Way giant molecular clouds than to the longer (>2 Gyr) global depletion times derived for entire galaxies and kiloparsec-sized regions within them. This difference is partially due to the fact that our study accounts for only the gas and stars within the youngest star-forming regions. We also note a large scatter in the NGC 300 SFR-molecular gas mass scaling relation that is furthermore consistent with the Milky Way cloud results. This scatter likely represents real differences in giant molecular cloud physical properties such as the dense gas fraction.
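The depletion time quoted above is simply the ratio of molecular gas mass to star formation rate. A back-of-envelope check with illustrative numbers (the SFR value here is assumed for the sketch, not taken from the paper):

```python
# Illustrative values: a cloud near the paper's upper mass limit and an
# assumed SFR; this pairing is not from any specific NGC 300 region.
m_gas_msun = 7.0e5        # molecular gas mass [solar masses]
sfr_msun_per_yr = 3.0e-3  # star formation rate [solar masses / year]

# Depletion time: how long the gas reservoir would last at the current SFR.
t_dep_myr = m_gas_msun / sfr_msun_per_yr / 1.0e6
```

With these numbers the reservoir lasts a couple of hundred Myr, the same order as the 230 Myr characteristic value reported for NGC 300.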

  7. Correction of sub-pixel topographical effects on land surface albedo retrieved from geostationary satellite (FengYun-2D) observations

    International Nuclear Information System (INIS)

    Roupioz, L; Nerry, F; Jia, L; Menenti, M

    2014-01-01

    The Qinghai-Tibetan Plateau is characterised by very strong relief, which affects albedo retrieval from satellite data. The objective of this study is to highlight the effects of sub-pixel topography and to account for those effects when retrieving land surface albedo from geostationary FengYun-2D (FY-2D) satellite data with 1.25 km spatial resolution, using the high spatial resolution (30 m) ASTER Digital Elevation Model (DEM). The methodology integrates the effects of sub-pixel topography on the estimation of the total irradiance received at the surface, allowing the computation of the topographically corrected surface reflectance. Surface albedo is then estimated by applying the parametric BRDF (Bidirectional Reflectance Distribution Function) model called RPV (Rahman-Pinty-Verstraete) to the terrain-corrected surface reflectance. The results, evaluated against ground measurements collected over several experimental sites on the Qinghai-Tibetan Plateau, document the advantage of integrating sub-pixel topography effects in the land surface reflectance at 1 km resolution to estimate the land surface albedo. The results obtained using the sub-pixel topographic correction are compared with those obtained using a pixel-level topographic correction. The preliminary results imply that, in highly rugged terrain, the sub-pixel correction method gives more accurate results; the pixel-level correction tends to overestimate surface albedo.
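The RPV model mentioned above combines a modified Minnaert term, a Henyey-Greenstein phase function, and a hot-spot factor. Below is a sketch of one common statement of the model; parameter values are purely illustrative, and sign and angle conventions vary between implementations, so treat this as an assumption-laden outline rather than the authors' code:

```python
import math

def rpv_reflectance(theta_s, theta_v, phi, rho0, k, theta_hg):
    """Surface reflectance under one common form of the RPV model.
    theta_s / theta_v: sun / view zenith angles [rad]; phi: relative azimuth."""
    mu_s, mu_v = math.cos(theta_s), math.cos(theta_v)
    # Modified Minnaert term: k controls the bowl (k<1) vs. bell (k>1) shape.
    minnaert = (mu_s * mu_v * (mu_s + mu_v)) ** (k - 1.0)
    # Henyey-Greenstein phase function of the phase angle g.
    cos_g = mu_s * mu_v + math.sin(theta_s) * math.sin(theta_v) * math.cos(phi)
    f_hg = (1.0 - theta_hg**2) / (1.0 + 2.0 * theta_hg * cos_g + theta_hg**2) ** 1.5
    # Hot-spot factor boosting reflectance near the retro-illumination direction.
    geom = math.sqrt(max(0.0, math.tan(theta_s)**2 + math.tan(theta_v)**2
                         - 2.0 * math.tan(theta_s) * math.tan(theta_v) * math.cos(phi)))
    hot_spot = 1.0 + (1.0 - rho0) / (1.0 + geom)
    return rho0 * minnaert * f_hg * hot_spot

# Illustrative geometry: sun and view both at 30 deg zenith, same azimuth,
# with neutral shape (k = 1) and isotropic scattering (theta_hg = 0).
rho = rpv_reflectance(math.radians(30), math.radians(30), 0.0,
                      rho0=0.1, k=1.0, theta_hg=0.0)
```

With the neutral parameters chosen, only the hot-spot factor departs from 1, so the returned reflectance is rho0 scaled by that factor.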

  8. Analyzing cloud base at local and regional scales to understand tropical montane cloud forest vulnerability to climate change

    Science.gov (United States)

    Van Beusekom, Ashley E.; González, Grizelle; Scholl, Martha A.

    2017-01-01

    The degree to which cloud immersion provides water in addition to rainfall, suppresses transpiration, and sustains tropical montane cloud forests (TMCFs) during rainless periods is not well understood. Climate and land use changes represent a threat to these forests if cloud base altitude rises as a result of regional warming or deforestation. To establish a baseline for quantifying future changes in cloud base, we installed a ceilometer at 100 m altitude in the forest upwind of the TMCF that occupies an altitude range from ∼ 600 m to the peaks at 1100 m in the Luquillo Mountains of eastern Puerto Rico. Airport Automated Surface Observing System (ASOS) ceilometer data, radiosonde data, and Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) satellite data were obtained to investigate seasonal cloud base dynamics, altitude of the trade-wind inversion (TWI), and typical cloud thickness for the surrounding Caribbean region. Cloud base is rarely quantified near mountains, so these results represent a first look at seasonal and diurnal cloud base dynamics for the TMCF. From May 2013 to August 2016, cloud base was lowest during the midsummer dry season, and cloud bases were lower than the mountaintops as often in the winter dry season as in the wet seasons. The lowest cloud bases most frequently occurred above 600 m, from 740 to 964 m. The Luquillo forest low cloud base altitudes were ∼ 200–600 m higher than those at six other sites in the Caribbean, highlighting the importance of site selection to measure topographic influence on cloud height. Proximity to the oceanic cloud system, where shallow cumulus clouds are seasonally invariant in altitude and cover, along with local trade-wind orographic lifting and cloud formation, may explain the dry-season low clouds. The results indicate that climate change threats to low-elevation TMCFs are not limited to the dry season; changes in synoptic-scale weather patterns

  9. Sub-Pixel Accuracy Crack Width Determination on Concrete Beams in Load Tests by Triangle Mesh Geometry Analysis

    Science.gov (United States)

    Liebold, F.; Maas, H.-G.

    2018-05-01

    This paper deals with the determination of crack widths of concrete beams during load tests from monocular image sequences. The procedure starts in a reference image of the probe with suitable surface texture under zero load, where a large number of points is defined by an interest operator. Then a triangulated irregular network is established to connect the points. Image sequences are recorded during load tests with the load increasing continuously or stepwise, or at intermittently changing load. The vertices of the triangles are tracked through the consecutive images of the sequence with sub-pixel accuracy by least squares matching. All triangles are then analyzed for changes by principal strain calculation. For each triangle showing significant strain, a crack width is computed by a thorough geometric analysis of the relative movement of the vertices.
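The per-triangle strain analysis described above can be sketched as follows: estimate the affine deformation gradient that maps each reference triangle onto its tracked counterpart, then take the eigenvalues of the Green-Lagrange strain tensor as the principal strains. This is a schematic reconstruction of the geometry, not the authors' code:

```python
import numpy as np

def principal_strains(ref_tri, def_tri):
    """Principal Green-Lagrange strains of a triangle, given its reference
    and deformed vertex coordinates as 3x2 arrays (one vertex per row)."""
    # Edge vectors spanning the triangle in both states (2x2 matrices).
    d_ref = np.column_stack([ref_tri[1] - ref_tri[0], ref_tri[2] - ref_tri[0]])
    d_def = np.column_stack([def_tri[1] - def_tri[0], def_tri[2] - def_tri[0]])
    # Deformation gradient F maps reference edges onto deformed edges.
    F = d_def @ np.linalg.inv(d_ref)
    # Green-Lagrange strain tensor; its eigenvalues are the principal strains.
    E = 0.5 * (F.T @ F - np.eye(2))
    return np.linalg.eigvalsh(E)  # ascending order

# Sanity check: a uniform 1% stretch along x gives principal strains
# (0, 0.5*(1.01^2 - 1)) = (0, 0.01005).
ref = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
deformed = ref * np.array([1.01, 1.0])
strains = principal_strains(ref, deformed)
```

A triangle straddling a crack shows a large positive principal strain, which is what flags it for the subsequent crack-width computation.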

  10. A Novel Sub-pixel Measurement Algorithm Based on Mixed Fractal and Digital Speckle Correlation in the Frequency Domain

    Directory of Open Access Journals (Sweden)

    Zhangfang Hu

    2014-10-01

    The digital speckle correlation is a non-contact, in-plane displacement measurement method based on machine vision. Motivated by the low accuracy and heavy computational load of the traditional digital speckle correlation method in the spatial domain, we introduce a sub-pixel displacement measurement algorithm that combines a fast interpolation method based on fractal theory with digital speckle correlation in the frequency domain. This algorithm overcomes both the blocking effect and the blurring caused by traditional interpolation methods, and the frequency-domain processing avoids the repeated searching required by correlation recognition in the spatial domain, greatly reducing the amount of computation and improving the speed of information extraction. A comparative experiment verifies that the proposed algorithm is effective.
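The speed advantage of frequency-domain correlation comes from replacing the exhaustive shift-by-shift spatial search with two FFTs and an element-wise product. A minimal sketch for recovering an integer-pixel shift (the fractal interpolation step for sub-pixel refinement is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)
ref = rng.standard_normal((64, 64))
# Fake "deformed" image: the reference shifted by (3, 5) pixels with wraparound.
shifted = np.roll(ref, shift=(3, 5), axis=(0, 1))

# Cross-correlation via the frequency domain: the correlation peak location
# gives the displacement, with no spatial-domain search.
xcorr = np.fft.ifft2(np.fft.fft2(ref).conj() * np.fft.fft2(shifted)).real
dy, dx = np.unravel_index(np.argmax(xcorr), xcorr.shape)
```

In a real speckle setup the images would first be interpolated to a finer grid (here, via the fractal method) so that the same peak search yields sub-pixel displacements.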

  11. Estimation of sub-pixel water area on the Tibet Plateau using multiple endmember spectral mixture analysis from MODIS data

    Science.gov (United States)

    Cui, Qian; Shi, Jiancheng; Xu, Yuanliu

    2011-12-01

    Water is a basic need for human society and a determining factor in the stability of ecosystems. There are many lakes on the Tibet Plateau, which can cause floods and mudslides when the water expands sharply. At present, water area is usually extracted from TM or SPOT data because of their high spatial resolution; however, their temporal resolution is insufficient. MODIS data have high temporal resolution and broad coverage, making them a valuable resource for detecting changes in water area. Because of their low spatial resolution, however, mixed pixels are common. In this paper, four spectral libraries are built using the MOD09A1 product; based on these, water bodies are extracted at the sub-pixel level using Multiple Endmember Spectral Mixture Analysis (MESMA) applied to the MODIS daily reflectance product MOD09GA. The unmixed result is compared with contemporaneous TM data, demonstrating that the method has high accuracy.
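Sub-pixel water fractions in MESMA ultimately rest on a linear mixing model: each pixel's spectrum is a fraction-weighted sum of endmember spectra. A stripped-down, noiseless two-endmember sketch with made-up spectra (real MESMA additionally searches over multiple candidate endmember sets per pixel):

```python
import numpy as np

# Hypothetical endmember spectra: rows are bands, columns are (water, land).
E = np.array([[0.020, 0.25],
              [0.010, 0.30],
              [0.005, 0.35]])

# A mixed pixel that is 40% water and 60% land (no noise added here).
true_fracs = np.array([0.4, 0.6])
pixel = E @ true_fracs

# Least-squares estimate of the sub-pixel fractions from the mixed spectrum.
fracs, *_ = np.linalg.lstsq(E, pixel, rcond=None)
```

In the noiseless case the fractions are recovered exactly; with real reflectances one would add sum-to-one and non-negativity constraints.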

  12. Color capable sub-pixel resolving optofluidic microscope and its application to blood cell imaging for malaria diagnosis.

    Directory of Open Access Journals (Sweden)

    Seung Ah Lee

    Miniaturization of imaging systems can significantly benefit clinical diagnosis in challenging environments, where access to physicians and good equipment can be limited. The sub-pixel resolving optofluidic microscope (SROFM) offers high-resolution imaging in the form of an on-chip device, combining microfluidics with inexpensive CMOS image sensors. In this work, we report on the implementation of color SROFM prototypes with a demonstrated optical resolution of 0.66 µm at their highest acuity. We applied the prototypes to perform color imaging of red blood cells (RBCs) infected with Plasmodium falciparum, a particularly harmful type of malaria parasite and one of the major causes of death in the developing world.

  13. Progress in Understanding the Impacts of 3-D Cloud Structure on MODIS Cloud Property Retrievals for Marine Boundary Layer Clouds

    Science.gov (United States)

    Zhang, Zhibo; Werner, Frank; Miller, Daniel; Platnick, Steven; Ackerman, Andrew; DiGirolamo, Larry; Meyer, Kerry; Marshak, Alexander; Wind, Galina; Zhao, Guangyu

    2016-01-01

    Theory: A novel framework based on 2-D Taylor expansion for quantifying the uncertainty in MODIS retrievals caused by sub-pixel reflectance inhomogeneity (Zhang et al. 2016). How cloud vertical structure influences MODIS LWP retrievals (Miller et al. 2016). Observation: Analysis of failed MODIS cloud property retrievals (Cho et al. 2015). Cloud property retrievals from 15 m resolution ASTER observations (Werner et al. 2016). Modeling: LES-satellite observation simulator (Zhang et al. 2012, Miller et al. 2016).

  14. Quantifying sub-pixel urban impervious surface through fusion of optical and inSAR imagery

    Science.gov (United States)

    Yang, L.; Jiang, L.; Lin, H.; Liao, M.

    2009-01-01

    In this study, we explored the potential to improve urban impervious surface modeling and mapping with the synergistic use of optical and Interferometric Synthetic Aperture Radar (InSAR) imagery. We used a Classification and Regression Tree (CART)-based approach to test the feasibility and accuracy of quantifying Impervious Surface Percentage (ISP) using four spectral bands of SPOT 5 high-resolution geometric (HRG) imagery and three parameters derived from the European Remote Sensing (ERS)-2 Single Look Complex (SLC) SAR image pair. Validated by an independent ISP reference dataset derived from the 33 cm-resolution digital aerial photographs, results show that the addition of InSAR data reduced the ISP modeling error rate from 15.5% to 12.9% and increased the correlation coefficient from 0.71 to 0.77. Spatially, the improvement is especially noted in areas of vacant land and bare ground, which were incorrectly mapped as urban impervious surfaces when using the optical remote sensing data. In addition, the accuracy of ISP prediction using InSAR images alone is only marginally less than that obtained by using SPOT imagery. The finding indicates the potential of using InSAR data for frequent monitoring of urban settings located in cloud-prone areas.

  15. Do clouds save the Great Barrier Reef? Satellite imagery elucidates the cloud-SST relationship at the local scale.

    Directory of Open Access Journals (Sweden)

    Susannah M Leahy

    Evidence of global climate change and rising sea surface temperatures (SSTs) is now well documented in the scientific literature. With corals already living close to their thermal maxima, increases in SSTs are of great concern for the survival of coral reefs. Cloud feedback processes may have the potential to constrain SSTs, serving to enforce an "ocean thermostat" and promoting the survival of coral reefs. In this study, it was hypothesized that cloud cover can affect summer SSTs in the tropics. Detailed direct and lagged relationships between cloud cover and SST across the central Great Barrier Reef (GBR) shelf were investigated using data from satellite imagery and in situ temperature and light loggers during two relatively hot summers (2005 and 2006) and two relatively cool summers (2007 and 2008). Across all study summers and shelf positions, SSTs exhibited distinct drops during periods of high cloud cover and, conversely, SST increases during periods of low cloud cover, with a three-day temporal lag between a change in cloud cover and a subsequent change in SST. Cloud cover alone was responsible for up to 32.1% of the variation in SSTs three days later. The relationship was strongest in both the El Niño (2005) and La Niña (2008) study summers and at the inner-shelf position in those summers. SST effects on subsequent cloud cover were weaker and more variable among study summers, with rising SSTs explaining up to 21.6% of the increase in cloud cover three days later. This work quantifies the often-observed cloud cooling effect on coral reefs. It highlights the importance of incorporating local-scale processes into bleaching forecasting models, and encourages the use of remote sensing imagery to value-add to coral bleaching field studies and to more accurately predict risks to coral reefs.
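The three-day lagged relationship reported above can be diagnosed with a simple lagged Pearson correlation, where the squared coefficient gives the share of variance explained. A synthetic sketch (the data are randomly generated, not from the study):

```python
import numpy as np

rng = np.random.default_rng(1)
days = 200
cloud = rng.random(days)  # daily cloud-cover fraction, made up
# Synthetic SST that cools in response to cloud cover three days earlier.
lag = 3
sst = 28.0 - 2.0 * np.roll(cloud, lag) + 0.1 * rng.standard_normal(days)

# Lagged Pearson correlation: cloud on day t vs. SST on day t + lag.
r = np.corrcoef(cloud[:-lag], sst[lag:])[0, 1]
variance_explained = r ** 2
```

Because the synthetic SST is built to respond to earlier cloud cover, the lagged correlation is strongly negative: high cloud cover precedes cooler water.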

  16. Automatic optimal filament segmentation with sub-pixel accuracy using generalized linear models and B-spline level-sets.

    Science.gov (United States)

    Xiao, Xun; Geyer, Veikko F; Bowne-Anderson, Hugo; Howard, Jonathon; Sbalzarini, Ivo F

    2016-08-01

    Biological filaments, such as actin filaments, microtubules, and cilia, are often imaged using different light-microscopy techniques. Reconstructing the filament curve from the acquired images constitutes the filament segmentation problem. Since filaments have lower dimensionality than the image itself, there is an inherent trade-off between tracing the filament with sub-pixel accuracy and avoiding noise artifacts. Here, we present a globally optimal filament segmentation method based on B-spline vector level-sets and a generalized linear model for the pixel intensity statistics. We show that the resulting optimization problem is convex and can hence be solved with global optimality. We introduce a simple and efficient algorithm to compute such optimal filament segmentations, and provide an open-source implementation as an ImageJ/Fiji plugin. We further derive an information-theoretic lower bound on the filament segmentation error, quantifying how well an algorithm could possibly do given the information in the image. We show that our algorithm asymptotically reaches this bound in the spline coefficients. We validate our method in comprehensive benchmarks, compare with other methods, and show applications from fluorescence, phase-contrast, and dark-field microscopy. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  17. Large-Scale Ocean Circulation-Cloud Interactions Reduce the Pace of Transient Climate Change

    Science.gov (United States)

    Trossman, D. S.; Palter, J. B.; Merlis, T. M.; Huang, Y.; Xia, Y.

    2016-01-01

    Changes to the large scale oceanic circulation are thought to slow the pace of transient climate change due, in part, to their influence on radiative feedbacks. Here we evaluate the interactions between CO2-forced perturbations to the large-scale ocean circulation and the radiative cloud feedback in a climate model. Both the change of the ocean circulation and the radiative cloud feedback strongly influence the magnitude and spatial pattern of surface and ocean warming. Changes in the ocean circulation reduce the amount of transient global warming caused by the radiative cloud feedback by helping to maintain low cloud coverage in the face of global warming. The radiative cloud feedback is key in affecting atmospheric meridional heat transport changes and is the dominant radiative feedback mechanism that responds to ocean circulation change. Uncertainty in the simulated ocean circulation changes due to CO2 forcing may contribute a large share of the spread in the radiative cloud feedback among climate models.

  18. High-Resolution Global Modeling of the Effects of Subgrid-Scale Clouds and Turbulence on Precipitating Cloud Systems

    Energy Technology Data Exchange (ETDEWEB)

    Bogenschutz, Peter [National Center for Atmospheric Research, Boulder, CO (United States); Moeng, Chin-Hoh [National Center for Atmospheric Research, Boulder, CO (United States)

    2015-10-13

    The PIs at the National Center for Atmospheric Research (NCAR), Chin-Hoh Moeng and Peter Bogenschutz, have primarily focused their time on the implementation of the Simplified Higher-Order Turbulence Closure (SHOC; Bogenschutz and Krueger 2013) in the Multi-scale Modeling Framework (MMF) global model and on testing SHOC on deep convective cloud regimes.

  19. Self managing monitoring for highly elastic large scale Cloud deployments

    OpenAIRE

    Ward, Jonathan Stuart; Barker, Adam David

    2014-01-01

    Infrastructure as a Service computing exhibits a number of properties, which are not found in conventional server deployments. Elasticity is among the most significant of these properties which has wide reaching implications for applications deployed in cloud hosted VMs. Among the applications affected by elasticity is monitoring. In this paper we investigate the challenges of monitoring large cloud deployments and how these challenges differ from previous monitoring problems. In order to mee...

  20. Estimation bias from using nonlinear Fourier plane correlators for sub-pixel image shift measurement and implications for the binary joint transform correlator

    Science.gov (United States)

    Grycewicz, Thomas J.; Florio, Christopher J.; Franz, Geoffrey A.; Robinson, Ross E.

    2007-09-01

    When using Fourier-plane digital algorithms or an optical correlator to measure the correlation between digital images, interpolation by center-of-mass or quadratic estimation techniques can be used to estimate image displacement to the sub-pixel level. However, this can lead to a bias in the correlation measurement: the sub-pixel output measurement is shifted closer to the nearest pixel center than the actual location. This paper investigates the bias in the outputs of both digital and optical correlators, and proposes methods to minimize this effect. We use digital studies and optical implementations of the joint transform correlator to demonstrate optical registration with accuracies better than 0.1 pixels. We use both simulations of image shift and movies of a moving target as inputs. We demonstrate bias error for both center-of-mass and quadratic interpolation, and discuss the reasons that this bias is present. Finally, we suggest measures to reduce or eliminate the bias effects. We show that when sub-pixel bias is present, it can be eliminated by modifying the interpolation method. By removing the bias error, we improve registration accuracy by thirty percent.
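The quadratic (three-point parabolic) interpolation discussed above fits a parabola through the correlation peak and its two neighbors; it is exact when the peak really is parabolic, and the bias the paper analyzes arises when it is not. A generic sketch of the estimator:

```python
def quadratic_subpixel(cm1, c0, cp1):
    """Sub-pixel offset of a correlation peak from three samples at
    x = -1, 0, +1 around the integer peak, via a parabolic fit."""
    return 0.5 * (cm1 - cp1) / (cm1 - 2.0 * c0 + cp1)

# Check against a known parabola peaking at x = 0.3: the estimator is
# exact for a truly parabolic peak, so there is no bias in this case.
true_peak = 0.3
samples = [1.0 - (x - true_peak) ** 2 for x in (-1, 0, 1)]
offset = quadratic_subpixel(*samples)
```

For real correlation peaks (e.g. sinc-like or triangular), the same formula pulls the estimate toward the nearest pixel center, which is the bias the paper measures and corrects.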

  1. EVIDENCE FOR CLOUD-CLOUD COLLISION AND PARSEC-SCALE STELLAR FEEDBACK WITHIN THE L1641-N REGION

    Energy Technology Data Exchange (ETDEWEB)

    Nakamura, Fumitaka [National Astronomical Observatory, Mitaka, Tokyo 181-8588 (Japan); Miura, Tomoya; Nishi, Ryoichi [Department of Physics, Niigata University, 8050 Ikarashi-2, Niigata 950-2181 (Japan); Kitamura, Yoshimi; Akashi, Toshiya; Ikeda, Norio [Institute of Space and Astronautical Science, Japan Aerospace Exploration Agency, 3-1-1 Yoshinodai, Sagamihara, Kanagawa 229-8510 (Japan); Shimajiri, Yoshito; Kawabe, Ryohei [Nobeyama Radio Observatory, Nobeyama, Minamimaki, Minamisaku, Nagano 384-1305 (Japan); Tsukagoshi, Takashi [Department of Astronomy, School of Science, University of Tokyo, Bunkyo, Tokyo 113-0033 (Japan); Momose, Munetake [Institute of Astrophysics and Planetary Sciences, Ibaraki University, Bunkyo 2-1-1, Mito 310-8512 (Japan); Li Zhiyun, E-mail: fumitaka.nakamura@nao.ac.jp [Department of Astronomy, University of Virginia, P.O. Box 400325, Charlottesville, VA 22904 (United States)

    2012-02-10

    We present high spatial resolution ¹²CO (J = 1-0) images taken by the Nobeyama 45 m telescope toward a 48' × 48' area, including the L1641-N cluster. The effective spatial resolution of the maps is 21'', corresponding to 0.04 pc at a distance of 400 pc. A recent 1.1 mm dust continuum map reveals that the dense gas is concentrated in several thin filaments. We find that a few dust filaments are located at the parts where ¹²CO (J = 1-0) emission drops sharply. Furthermore, the filaments have two components with different velocities. The velocity difference between the two components is about 3 km s⁻¹, corresponding to a Mach number of 10, significantly larger than the local turbulent velocity in the cloud. These facts imply that the collision of the two components (hereafter, the cloud-cloud collision) possibly contributed to the formation of these filaments. Since the two components appear to overlap toward the filaments on the plane of the sky, the collision may have occurred almost along the line of sight. Star formation in the L1641-N cluster was probably triggered by such a collision. We also find several parsec-scale CO shells whose centers are close to either the L1641-N cluster or the V 380 Ori cluster. We propose that these shells were created by multiple winds and/or outflows from cluster young stellar objects, i.e., 'protocluster winds'. One exceptional dust filament located at the western cloud edge lies along a shell; it is presumably part of the expanding shell. Both the cloud-cloud collision and protocluster winds are likely to influence the cloud structure and kinematics in this region.
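The quoted Mach number follows from dividing the 3 km/s velocity difference by the cloud's isothermal sound speed. A rough check, assuming a gas temperature of ~20 K and a mean molecular weight of 2.33 (both values are assumptions for this sketch, not stated in the abstract):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant [J/K]
m_H = 1.6735575e-27  # mass of a hydrogen atom [kg]

def sound_speed_km_s(t_kelvin, mu=2.33):
    """Isothermal sound speed for molecular gas of mean molecular
    weight mu (~2.33 for H2 plus He), in km/s."""
    return math.sqrt(k_B * t_kelvin / (mu * m_H)) / 1.0e3

# At ~20 K the sound speed is roughly 0.27 km/s, so a 3 km/s
# velocity difference is supersonic by an order of magnitude.
mach = 3.0 / sound_speed_km_s(20.0)
```

The exact value depends on the assumed temperature, but any plausible molecular-cloud temperature gives a Mach number of order 10, consistent with the abstract.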

  2. Introducing Subgrid-scale Cloud Feedbacks to Radiation for Regional Meteorological and Climate Modeling

    Science.gov (United States)

    Convection systems and associated cloudiness directly influence regional and local radiation budgets, and dynamics and thermodynamics through feedbacks. However, most subgrid-scale convective parameterizations in regional weather and climate models do not consider cumulus cloud ...

  3. CERNBox: Petabyte-Scale Cloud Synchronisation and Sharing Platform

    OpenAIRE

    Hugo González Labrador

    2016-01-01

    CERNBox is a cloud synchronisation service for end-users: it allows syncing and sharing files on all major mobile and desktop platforms (Linux, Windows, MacOSX, Android, iOS) aiming to provide offline availability to any data stored in the CERN EOS infrastructure. There is a high demand in the community for an easily accessible cloud storage solution such as CERNBox. Integration of the CERNBox service with the EOS storage back-end is the next step towards providing ’synchronisation and sharin...

  4. Sensitivities of simulated satellite views of clouds to subgrid-scale overlap and condensate heterogeneity

    Energy Technology Data Exchange (ETDEWEB)

    Hillman, Benjamin R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Marchand, Roger T. [Univ. of Washington, Seattle, WA (United States); Ackerman, Thomas P. [Univ. of Washington, Seattle, WA (United States)

    2017-08-01

    Satellite simulators are often used to account for limitations in satellite retrievals of cloud properties in comparisons between models and satellite observations. The purpose of the simulator framework is to enable more robust evaluation of model cloud properties, so that differences between models and observations can more confidently be attributed to model errors. However, these simulators are subject to uncertainties themselves. A fundamental uncertainty exists in connecting the spatial scales at which cloud properties are retrieved with those at which clouds are simulated in global models. In this study, we create a series of sensitivity tests using 4 km global model output from the Multiscale Modeling Framework to evaluate the sensitivity of simulated satellite retrievals when applied to climate models whose grid spacing is many tens to hundreds of kilometers. In particular, we examine the impact of cloud and precipitation overlap and of condensate spatial variability. We find the simulated retrievals are sensitive to these assumptions. Specifically, using maximum-random overlap with homogeneous cloud and precipitation condensate, which is often used in global climate models, leads to large errors in MISR and ISCCP-simulated cloud cover and in CloudSat-simulated radar reflectivity. To correct for these errors, an improved treatment of unresolved clouds and precipitation is implemented for use with the simulator framework and is shown to substantially reduce the identified errors.
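The maximum-random overlap assumption mentioned above treats vertically adjacent cloudy layers as maximally overlapped and layers separated by clear air as randomly overlapped. A sketch of the commonly used Geleyn-Hollingsworth form of the total-cover calculation (assuming every layer fraction is strictly below 1):

```python
def total_cloud_cover_max_random(fracs):
    """Total cloud cover from top-to-bottom layer cloud fractions under
    the maximum-random overlap assumption: adjacent cloudy layers overlap
    maximally; layers separated by a clear layer combine randomly."""
    clear = 1.0  # clear-sky fraction seen so far
    prev = 0.0   # cloud fraction of the layer above
    for c in fracs:
        clear *= (1.0 - max(c, prev)) / (1.0 - prev)
        prev = c
    return 1.0 - clear

# Two adjacent layers overlap maximally: total cover is the larger fraction.
adjacent = total_cloud_cover_max_random([0.3, 0.5])
# The same layers separated by a clear layer combine randomly: 1 - 0.7 * 0.5.
separated = total_cloud_cover_max_random([0.3, 0.0, 0.5])
```

The two test cases show why the assumption matters: the same pair of layer fractions yields 0.5 total cover when adjacent but 0.65 when separated, and simulated satellite cloud cover inherits this sensitivity.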

  5. Sharing Planetary-Scale Data in the Cloud

    Science.gov (United States)

    Sundwall, J.; Flasher, J.

    2016-12-01

    On 19 March 2015, Amazon Web Services (AWS) announced Landsat on AWS, an initiative to make data from the U.S. Geological Survey's Landsat satellite program freely available in the cloud. Because of Landsat's global coverage and long history, it has become a reference point for all Earth observation work and is considered the gold standard of natural resource satellite imagery. Within the first year of Landsat on AWS, the service served over a billion requests for Landsat imagery and metadata, globally. Availability of the data in the cloud has led to new product development by companies and startups including Mapbox, Esri, CartoDB, MathWorks, Development Seed, Trimble, Astro Digital, Blue Raster and Timbr.io. The model of staging data for analysis in the cloud established by Landsat on AWS has since been applied to high resolution radar data, European Space Agency satellite imagery, global elevation data and EPA air quality models. This session will provide an overview of lessons learned throughout these projects. It will demonstrate how cloud-based object storage is democratizing access to massive publicly-funded data sets that have previously only been available to people with access to large amounts of storage, bandwidth, and computing power. Technical discussion points will include: the differences between staging data for analysis using object storage versus file storage; using object stores to design simple RESTful APIs through thoughtful file naming conventions, header fields, and HTTP Range Requests; managing costs through data architecture and Amazon S3's "requester pays" feature; building tools that allow users to take their algorithm to the data in the cloud; and using serverless technologies to display dynamic frontends for massive data sets.
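The HTTP Range Request pattern mentioned above lets a client read a byte slice of a large object instead of downloading the whole file. A minimal stdlib sketch that only constructs the request object; the bucket URL is hypothetical and no network call is made:

```python
import urllib.request

# Hypothetical object URL; cloud-hosted scenes typically follow a
# predictable bucket/key naming convention, which is what makes
# simple RESTful access patterns like this possible.
url = "https://example-bucket.s3.amazonaws.com/scene/B4.TIF"

# A ranged GET asks the object store for only the first 1024 bytes,
# e.g. to inspect a file header without fetching the whole object.
req = urllib.request.Request(url, headers={"Range": "bytes=0-1023"})
```

Issuing the request with `urllib.request.urlopen(req)` would return a 206 Partial Content response from any server that honors range requests.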

  6. Aerosol-cloud interactions in a multi-scale modeling framework

    Science.gov (United States)

    Lin, G.; Ghan, S. J.

    2017-12-01

    Atmospheric aerosols play an important role in changing the Earth's climate through scattering/absorbing solar and terrestrial radiation and interacting with clouds. However, quantification of the aerosol effects remains one of the most uncertain aspects of current and future climate projection. Much of the uncertainty results from the multi-scale nature of aerosol-cloud interactions, which is very challenging to represent in traditional global climate models (GCMs). In contrast, the multi-scale modeling framework (MMF) provides a viable solution, which explicitly resolves clouds and precipitation in the cloud-resolving model (CRM) embedded in each GCM grid column. In the MMF version of the Community Atmosphere Model version 5 (CAM5), aerosol processes are treated with a parameterization called Explicit Clouds Parameterized Pollutants (ECPP). It uses the cloud/precipitation statistics derived from the CRM to treat the cloud processing of aerosols on the GCM grid. However, this treatment represents clouds on the CRM grid but aerosols on the GCM grid, which is inconsistent with the reality that cloud-aerosol interactions occur on the cloud scale. To overcome this limitation, we propose a new aerosol treatment in the MMF: Explicit Clouds Explicit Aerosols (ECEP), in which both clouds and aerosols are resolved explicitly on the CRM grid. We first applied the MMF with ECPP to the Accelerated Climate Modeling for Energy (ACME) model to obtain an MMF version of ACME, and then developed an alternative version of ACME-MMF with ECEP. Based on these two models, we have conducted two simulations: one with ECPP and the other with ECEP. Preliminary results showed that the ECEP simulations tend to predict higher aerosol concentrations than ECPP simulations, because of the more efficient vertical transport from the surface to the higher atmosphere but the less efficient wet removal. We also found that the cloud droplet number concentrations are different between the

  7. Scaling analysis of cloud and rain water in marine stratocumulus and implications for scale-aware microphysical parameterizations

    Science.gov (United States)

    Witte, M.; Morrison, H.; Jensen, J. B.; Bansemer, A.; Gettelman, A.

    2017-12-01

    The spatial covariance of cloud and rain water (or in simpler terms, small and large drops, respectively) is an important quantity for accurate prediction of the accretion rate in bulk microphysical parameterizations that account for subgrid variability using assumed probability density functions (pdfs). Past diagnoses of this covariance from remote sensing, in situ measurements and large eddy simulation output have implicitly assumed that the magnitude of the covariance is insensitive to grain size (i.e. horizontal resolution) and averaging length, but this is not the case because both cloud and rain water exhibit scale invariance across a wide range of scales - from tens of centimeters to tens of kilometers in the case of cloud water, a range that we will show is primarily limited by instrumentation and sampling issues. Since the individual variances systematically vary as a function of spatial scale, it should be expected that the covariance follows a similar relationship. In this study, we quantify the scaling properties of cloud and rain water content and their covariability from high frequency in situ aircraft measurements of marine stratocumulus taken over the southeastern Pacific Ocean aboard the NSF/NCAR C-130 during the VOCALS-REx field experiment of October-November 2008. First we confirm that cloud and rain water scale in distinct manners, indicating that there is a statistically and potentially physically significant difference in the spatial structure of the two fields. Next, we demonstrate that the covariance is a strong function of spatial scale, which implies important caveats regarding the ability of limited-area models with domains smaller than a few tens of kilometers across to accurately reproduce the spatial organization of precipitation. 
Finally, we present preliminary work on the development of a scale-aware parameterization of cloud-rain water subgrid covariability based on multifractal analysis, intended for application in large-scale models.
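The scale dependence at the heart of this abstract can be illustrated with a toy calculation: block-average two correlated 1-D fields at several averaging lengths and observe how the within-block covariance changes with scale. The fields below are synthetic red noise, not VOCALS-REx data, and the block lengths are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D "flight leg": correlated cloud (q_c) and rain (q_r) water
# fields built from a shared red-noise signal so that they covary.
n = 4096
base = np.cumsum(rng.standard_normal(n))
q_c = base + 0.5 * rng.standard_normal(n)
q_r = 0.7 * base + 0.5 * rng.standard_normal(n)

def block_covariance(x, y, block):
    """Mean within-block covariance of x and y at averaging length `block`."""
    nb = len(x) // block
    xs = x[:nb * block].reshape(nb, block)
    ys = y[:nb * block].reshape(nb, block)
    covs = ((xs - xs.mean(1, keepdims=True)) *
            (ys - ys.mean(1, keepdims=True))).mean(1)
    return covs.mean()

scales = [16, 64, 256, 1024]
cov_by_scale = {s: block_covariance(q_c, q_r, s) for s in scales}
# For scale-invariant (red-noise-like) fields, the within-block covariance
# grows with block size: the covariance is a function of spatial scale,
# not a fixed constant as past diagnoses implicitly assumed.
```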

  8. Holistic Interactions of Shallow Clouds, Aerosols, and Land-Ecosystems (HI-SCALE) Science Plan

    Energy Technology Data Exchange (ETDEWEB)

    Fast, JD [Pacific Northwest National Laboratory; Berg, LK [Pacific Northwest National Laboratory

    2015-12-01

Cumulus convection is an important component of the atmospheric radiation budget and hydrologic cycle over the Southern Great Plains and over many regions of the world, particularly during the summertime growing season when intense turbulence induced by surface radiation couples the land surface to clouds. Current convective cloud parameterizations contain uncertainties resulting in part from insufficient coincident data coupling cloud macrophysical and microphysical properties to inhomogeneities in boundary layer and aerosol properties. The Holistic Interactions of Shallow Clouds, Aerosols, and Land-Ecosystems (HI-SCALE) campaign is designed to provide a detailed set of measurements needed to obtain a more complete understanding of the life cycle of shallow clouds by coupling cloud macrophysical and microphysical properties to land surface properties, ecosystems, and aerosols. HI-SCALE consists of two 4-week intensive observational periods, one in the spring and the other in the late summer, to take advantage of different stages and distributions of “greenness” for various types of vegetation in the vicinity of the Atmospheric Radiation Measurement (ARM) Climate Research Facility’s Southern Great Plains (SGP) site, as well as aerosol properties that vary during the growing season. Most of the proposed instrumentation will be deployed on the ARM Aerial Facility (AAF) Gulfstream 1 (G-1) aircraft, including instruments that measure atmospheric turbulence, cloud water content and drop size distributions, aerosol precursor gases, aerosol chemical composition and size distributions, and cloud condensation nuclei concentrations. Routine ARM aerosol measurements made at the surface will be supplemented with aerosol microphysical property measurements. The G-1 aircraft will complete transects over the SGP Central Facility at multiple altitudes within the boundary layer, within clouds, and above clouds.

  9. Using cloud ice flux to parametrise large-scale lightning

    Directory of Open Access Journals (Sweden)

    D. L. Finney

    2014-12-01

Lightning is an important natural source of nitrogen oxides, especially in the middle and upper troposphere, so it is essential to represent lightning in chemistry transport and coupled chemistry–climate models. Using ERA-Interim meteorological reanalysis data we compare the lightning flash density distributions produced by several existing lightning parametrisations, as well as a new parametrisation developed on the basis of upward cloud ice flux at 440 hPa. The use of ice flux forms a link to the non-inductive charging mechanism of thunderstorms. Spatial and temporal distributions of lightning flash density are compared to tropical and subtropical observations for 2007–2011 from the Lightning Imaging Sensor (LIS) on the Tropical Rainfall Measuring Mission (TRMM) satellite. The widely used lightning flash parametrisation based on cloud-top height has large biases, but its derived annual total flash density has a better spatial correlation with the LIS observations than other existing parametrisations. A comparison of flash density simulated by the different schemes shows that the cloud-top height parametrisation produces many more instances of moderate flash densities and fewer low and high extremes than the other parametrisations. Other studies have shown that this feature of the cloud-top height parametrisation is in contrast to lightning observations over certain regions. Our new ice flux parametrisation shows a clear improvement over all the existing parametrisations, with lower root mean square errors (RMSEs) and better spatial correlations with the observations for distributions of annual totals and of seasonal and interannual variations. The greatest improvement with the new parametrisation is a more realistic representation of the zonal distribution, with a better balance between tropical and subtropical lightning flash estimates. The new parametrisation is appropriate for testing in chemistry transport and chemistry–climate models.
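The two skill metrics used in the comparison, RMSE and spatial correlation between simulated and observed flash density maps, can be sketched as follows. The grids and values are hypothetical illustrations, not LIS/TRMM data or output of any of the parametrisations.

```python
import numpy as np

def rmse(model, obs):
    """Root mean square error between two gridded fields."""
    return float(np.sqrt(np.mean((model - obs) ** 2)))

def spatial_corr(model, obs):
    """Pearson correlation between two gridded fields, flattened."""
    return float(np.corrcoef(model.ravel(), obs.ravel())[0, 1])

# Toy 2-D "annual total flash density" grids (hypothetical values).
obs = np.array([[0.0, 1.0, 4.0],
                [2.0, 6.0, 3.0],
                [0.5, 2.5, 1.0]])
model_a = obs * 0.8 + 0.2   # well-correlated, small amplitude bias
model_b = obs[::-1] * 1.5   # spatially displaced pattern, larger error

scores = {"A": (rmse(model_a, obs), spatial_corr(model_a, obs)),
          "B": (rmse(model_b, obs), spatial_corr(model_b, obs))}
# Scheme A scores lower RMSE and higher spatial correlation than scheme B,
# the kind of ranking used to compare the parametrisations above.
```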

  10. Evaluating Commercial and Private Cloud Services for Facility-Scale Geodetic Data Access, Analysis, and Services

    Science.gov (United States)

    Meertens, C. M.; Boler, F. M.; Ertz, D. J.; Mencin, D.; Phillips, D.; Baker, S.

    2017-12-01

UNAVCO, in its role as an NSF facility for geodetic infrastructure and data, has succeeded for over two decades using on-premises infrastructure; while the promise of cloud-based infrastructure is well established, significant questions remain about the suitability of such infrastructure for facility-scale services. Primarily through the GeoSciCloud award from NSF EarthCube, UNAVCO is investigating the costs, advantages, and disadvantages of providing its geodetic data and services in the cloud versus using UNAVCO's on-premises infrastructure. (IRIS is a collaborator on the project and is performing its own suite of investigations.) In contrast to the 2-3 year time scale of the research cycle, the time scale of operation and planning for NSF facilities is a minimum of five years, and for some services extends to a decade or more. Planning for on-premises infrastructure is deliberate, and migrations typically take months to years to fully implement. Migrations to a cloud environment can only go forward with similarly deliberate planning and understanding of all costs and benefits. The EarthCube GeoSciCloud project is intended to address the uncertainties of facility-level operations in the cloud. Investigations are being performed in a commercial cloud environment (Amazon AWS) during the first year of the project and in a private cloud environment (the NSF XSEDE resource at the Texas Advanced Computing Center) during the second year. These investigations are expected to illuminate the potential as well as the limitations of running facility-scale production services in the cloud. The work includes running cloud-based services parallel and equivalent to on-premises services, including: data serving via FTP from a large data store, operation of a metadata database, production-scale processing of multiple months of geodetic data, web-service delivery of quality-checked data and products, large-scale compute services for event post-processing, and serving real-time data

  11. Towards an integrated multiscale simulation of turbulent clouds on PetaScale computers

    International Nuclear Information System (INIS)

    Wang Lianping; Ayala, Orlando; Parishani, Hossein; Gao, Guang R; Kambhamettu, Chandra; Li Xiaoming; Rossi, Louis; Orozco, Daniel; Torres, Claudio; Grabowski, Wojciech W; Wyszogrodzki, Andrzej A; Piotrowski, Zbigniew

    2011-01-01

The development of precipitating warm clouds is affected by several effects of small-scale air turbulence, including enhancement of the droplet-droplet collision rate by turbulence, entrainment and mixing at cloud edges, and coupling of mechanical and thermal energies at various scales. Large-scale computation is a viable research tool for quantifying these multiscale processes. Specifically, top-down large-eddy simulations (LES) of shallow convective clouds typically resolve the scales of turbulent energy-containing eddies, while the effects of the turbulent cascade toward viscous dissipation are parameterized. Bottom-up hybrid direct numerical simulations (HDNS) of cloud microphysical processes fully resolve the dissipation-range flow scales but only partially resolve the inertial subrange scales. It is desirable to systematically decrease the grid length in LES and increase the domain size in HDNS so that the two can be better integrated to address the full range of scales and their coupling. In this paper, we discuss computational issues and physical modeling questions in expanding the ranges of scales realizable in LES and HDNS, and in bridging LES and HDNS. We review our ongoing efforts in transforming our simulation codes towards PetaScale computing, in improving physical representations in LES and HDNS, and in developing better methods to analyze and interpret the simulation results.

  12. Evaluating and Improving Cloud Processes in the Multi-Scale Modeling Framework

    Energy Technology Data Exchange (ETDEWEB)

    Ackerman, Thomas P. [Univ. of Washington, Seattle, WA (United States)

    2015-03-01

The research performed under this grant was intended to improve the embedded cloud model in the Multi-scale Modeling Framework (MMF) for convective clouds by using a two-moment microphysics scheme rather than the single-moment scheme used in all MMF runs to date. The technical report and associated documents describe the results of testing the cloud-resolving model with fixed boundary conditions and evaluating the model results against data. The overarching conclusion is that such model evaluations are problematic because errors in the forcing fields control the results so strongly that variations in parameterization values cannot be usefully constrained.

  13. Discrimination of Biomass Burning Smoke and Clouds in MAIAC Algorithm

    Science.gov (United States)

    Lyapustin, A.; Korkin, S.; Wang, Y.; Quayle, B.; Laszlo, I.

    2012-01-01

The multi-angle implementation of atmospheric correction (MAIAC) algorithm makes aerosol retrievals from MODIS data at 1 km resolution, providing information about fine-scale aerosol variability. This information is required in applications such as urban air quality analysis, aerosol source identification, etc. The quality of high-resolution aerosol data is directly linked to the quality of the cloud mask, in particular the detection of small (sub-pixel) and low clouds. This work continues research in this direction, describing a technique to detect small clouds and introducing a smoke test to discriminate biomass burning smoke from clouds. The smoke test relies on a relative increase of aerosol absorption at the MODIS wavelength of 0.412 micrometers, as compared to 0.47-0.67 micrometers, due to multiple scattering and enhanced absorption by organic carbon released during combustion. This general principle has been successfully used in the OMI detection of absorbing aerosols based on UV measurements. This paper provides the algorithm details and illustrates its performance on two examples of wildfires in the US Pacific Northwest and in Georgia/Florida in 2007.
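The spectral logic of the smoke test can be sketched schematically: smoke darkens the 0.412 µm band relative to the 0.47-0.67 µm bands, while clouds stay bright and spectrally flat. The ratio statistic, threshold, and pixel reflectances below are illustrative stand-ins, not MAIAC's operational values.

```python
import numpy as np

def smoke_flag(refl_412, refl_470, refl_670, ratio_threshold=1.25):
    """Flag pixels whose apparent absorption at 0.412 um is elevated
    relative to 0.47-0.67 um. Schematic stand-in for the MAIAC smoke test
    described in the abstract; not the operational algorithm."""
    # Smoke absorbs more strongly at 0.412 um, lowering that band's
    # reflectance relative to the longer visible bands; clouds are bright
    # and spectrally flat, so their ratio stays near 1.
    visible_mean = 0.5 * (refl_470 + refl_670)
    ratio = visible_mean / np.maximum(refl_412, 1e-6)
    return ratio > ratio_threshold

# Hypothetical pixels: a bright, spectrally flat cloud, and a smoke plume
# that is noticeably darker at 0.412 um than at 0.47-0.67 um.
r412 = np.array([0.80, 0.12])
r470 = np.array([0.82, 0.20])
r670 = np.array([0.81, 0.22])
flags = smoke_flag(r412, r470, r670)
```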

  14. Modelling cloud effects on ozone on a regional scale : A case study

    NARCIS (Netherlands)

    Matthijsen, J.; Builtjes, P.J.H.; Meijer, E.W.; Boersen, G.

    1997-01-01

We have investigated the influence of clouds on ozone on a regional scale (Europe) with a regional-scale photochemical dispersion model (LOTOS). The LOTOS model calculates ozone and other photo-oxidant concentrations in the lowest three km of the troposphere, using actual meteorological data and

15. On the Large-Scaling Issues of Cloud-based Applications for Earth Science Data

    Science.gov (United States)

    Hua, H.

    2016-12-01

Next-generation science data systems are needed to address the incoming flood of data from new missions such as NASA's SWOT and NISAR, whose SAR data volumes and throughput rates are orders of magnitude larger than those of present-day missions. Existing missions, such as OCO-2, may also require rapid turn-around for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the processing needs. Additionally, traditional on-premise hardware procurement is already limited by facility capacity constraints for these new missions. Experience has shown that embracing efficient cloud computing approaches for large-scale science data systems requires more than just moving existing code to cloud environments. At large cloud scales, we need to deal with scaling and cost issues. We present our experiences deploying multiple instances of our hybrid-cloud computing science data system (HySDS) to support large-scale processing of Earth Science data products. We explore optimization approaches for getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer 75%-90% cost savings but with an unpredictable computing environment driven by market forces.

  16. Molecular clouds in the NGC 6334 and NGC 6357 region: Evidence for a 100 pc-scale cloud-cloud collision triggering the Galactic mini-starbursts

    Science.gov (United States)

    Fukui, Yasuo; Kohno, Mikito; Yokoyama, Keiko; Torii, Kazufumi; Hattori, Yusuke; Sano, Hidetoshi; Nishimura, Atsushi; Ohama, Akio; Yamamoto, Hiroaki; Tachihara, Kengo

    2018-05-01

We carried out new CO (J = 1-0, 2-1, and 3-2) observations with NANTEN2 and ASTE in the region of the twin Galactic mini-starbursts NGC 6334 and NGC 6357. We detected two molecular velocity components separated by 12 km s^-1, continuous over 3° along the Galactic plane. In NGC 6334 the two components show similar two-peaked intensity distributions toward the young H II regions and are linked by a bridge feature. In NGC 6357 we found a spatially complementary distribution between the two velocity components, as well as a bridge feature in velocity. Based on these results we hypothesize that the two clouds in the two regions collided with each other in the past few Myr and triggered the formation of the starbursts over ~100 pc. We suggest that the formation of the starbursts happened toward the collisional region of extent ~10 pc with initially high molecular column densities. For NGC 6334 we present a scenario that includes spatial variation of the colliding epoch due to non-uniform cloud separation. The scenario possibly explains the apparent age differences among the young O stars in NGC 6334, which range from 10^4 yr to 10^6 yr; the latest collision happened within 10^5 yr toward the youngest stars in NGC 6334 I(N) and I, which exhibit molecular outflows without H II regions. For NGC 6357 the O stars formed a few Myr ago, and the cloud dispersal by the O stars is significant. We conclude that cloud-cloud collision offers a possible explanation of the mini-starbursts over a 100 pc scale.

  17. Large-Scale, Multi-Sensor Atmospheric Data Fusion Using Hybrid Cloud Computing

    Science.gov (United States)

    Wilson, B. D.; Manipon, G.; Hua, H.; Fetzer, E. J.

    2015-12-01

NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the "A-Train" platforms (AIRS, MODIS, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over decades. Moving to multi-sensor, long-duration studies presents serious challenges for large-scale data mining and fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another (MODIS) and to a model (ECMWF), stratify the comparisons using a classification of the "cloud scenes" from CloudSat, and repeat the entire analysis over 10 years of data. HySDS is a Hybrid-Cloud Science Data System that has been developed and applied under NASA AIST, MEaSUREs, and ACCESS grants. HySDS uses the SciFlow workflow engine to partition analysis workflows into parallel tasks (e.g. segmenting by time or space) that are pushed into a durable job queue. The tasks are "pulled" from the queue by worker virtual machines (VMs) and executed in an on-premise cloud (Eucalyptus or OpenStack) or at Amazon in the public cloud or GovCloud. In this way, years of data (millions of files) can be processed in a massively parallel way. Input variables (arrays) are pulled on demand into the cloud using OPeNDAP URLs or other subsetting services, thereby minimizing the size of the transferred data. We are using HySDS to automate the production of multiple versions of a ten-year A-Train water vapor climatology under a MEaSUREs grant. We will present the architecture of HySDS, describe the achieved "clock time" speedups in fusing datasets on our own nodes and in the Amazon Cloud, and discuss the Cloud cost tradeoffs for storage, compute, and data transfer. Our system demonstrates how one can pull A-Train variables (Levels 2 & 3) on-demand into the Amazon Cloud, and cache only those variables that are heavily used, so that any number of compute jobs can be
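The push/pull pattern described in this record (tasks pushed to a durable queue, pulled by workers) can be sketched in miniature. Threads stand in for worker VMs, a thread-safe in-memory queue stands in for the durable job queue, and `process_chunk` is an illustrative placeholder, not the HySDS/SciFlow API.

```python
import queue
import threading

tasks = queue.Queue()   # stands in for the durable job queue
results = {}
lock = threading.Lock()

def process_chunk(year_month):
    # Placeholder for pulling variables (e.g. via OPeNDAP) and computing
    # one time-chunk of a fused product.
    return f"product-{year_month}"

def worker():
    # Each worker pulls tasks until the queue is drained.
    while True:
        try:
            item = tasks.get_nowait()
        except queue.Empty:
            return
        out = process_chunk(item)
        with lock:
            results[item] = out
        tasks.task_done()

# Partition the workflow by time and push the chunks onto the queue.
for ym in ["2007-01", "2007-02", "2007-03", "2007-04"]:
    tasks.put(ym)

workers = [threading.Thread(target=worker) for _ in range(3)]
for w in workers:
    w.start()
for w in workers:
    w.join()
```

Because the tasks are independent, the same pattern scales from threads to a fleet of worker VMs pulling from a shared queue.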

  18. Measurement of 3-D Vibrational Motion by Dynamic Photogrammetry Using Least-Square Image Matching for Sub-Pixel Targeting to Improve Accuracy

    Science.gov (United States)

    Lee, Hyoseong; Rhee, Huinam; Oh, Jae Hong; Park, Jin Ho

    2016-01-01

This paper deals with an improved methodology to measure the three-dimensional dynamic displacements of a structure by digital close-range photogrammetry. A series of stereo images of a vibrating structure fitted with targets is taken at specified intervals using two consumer-grade cameras. A new methodology is proposed to accurately trace the spatial displacement of each target in three-dimensional space. This method combines correlation and least-square image matching so that sub-pixel targeting can be achieved to increase the measurement accuracy. Collinearity and space resection theory are used to determine the interior and exterior orientation parameters. To verify the proposed method, experiments were performed to measure the displacements of a cantilevered beam excited by an electrodynamic shaker, vibrating in a complex configuration with mixed bending and torsional motions at multiple frequencies simultaneously. The results of the present method showed good agreement with measurements by two laser displacement sensors. The proposed methodology requires only inexpensive consumer-grade cameras and can remotely detect the dynamic displacement of a structure vibrating in a complex three-dimensional deflection shape with sub-pixel accuracy. It has abundant potential applications in various fields, e.g., remote vibration monitoring of an inaccessible or dangerous facility. PMID:26978366
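The coarse-to-fine idea behind sub-pixel targeting (correlation locates the match to the nearest pixel, a local least-squares fit recovers the fractional part) can be sketched in 1-D. This is a generic three-point parabola refinement, not the paper's full least-square image matching.

```python
import numpy as np

def subpixel_peak_1d(corr):
    """Refine the integer argmax of a correlation curve to sub-pixel
    precision with a three-point parabola fit around the peak."""
    i = int(np.argmax(corr))
    if i == 0 or i == len(corr) - 1:
        return float(i)          # peak on the boundary: no refinement
    y0, y1, y2 = corr[i - 1], corr[i], corr[i + 1]
    denom = y0 - 2.0 * y1 + y2
    # Vertex of the parabola through the three samples.
    delta = 0.5 * (y0 - y2) / denom if denom != 0 else 0.0
    return i + delta

# Sample a smooth correlation peak whose true center lies between pixels.
x = np.arange(9, dtype=float)
true_center = 4.3
corr = np.exp(-0.5 * (x - true_center) ** 2)

est = subpixel_peak_1d(corr)     # close to 4.3, not just the integer 4
```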

  19. DESIGN OF DYADIC-INTEGER-COEFFICIENTS BASED BI-ORTHOGONAL WAVELET FILTERS FOR IMAGE SUPER-RESOLUTION USING SUB-PIXEL IMAGE REGISTRATION

    Directory of Open Access Journals (Sweden)

    P.B. Chopade

    2014-05-01

This paper presents an image super-resolution scheme based on sub-pixel image registration, using a specific class of dyadic-integer-coefficient wavelet filters derived from the construction of a half-band polynomial. First, the integer-coefficient half-band polynomial is designed by a splitting approach. Next, this half-band polynomial is factorized, and specific numbers of vanishing moments and roots are assigned, to obtain the dyadic-integer-coefficient low-pass analysis and synthesis filters. The potential of these dyadic-integer-coefficient wavelet filters is explored for image super-resolution using sub-pixel image registration. Two low-resolution frames are registered at a specific shift from one another to restore the resolution lost by the CCD array of the camera. The discrete wavelet transform (DWT) obtained from the designed coefficients is applied to these two low-resolution images to obtain the high-resolution image. The developed approach is validated by comparing quality metrics with those of existing filter banks.
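The half-band construction can be made concrete with the classic LeGall 5/3 biorthogonal pair, a well-known example of dyadic-integer-coefficient filters (the paper's specific filters differ): the product of the analysis and synthesis low-pass filters is a half-band polynomial, i.e. every second coefficient vanishes except the center, which equals 1/2.

```python
import numpy as np

# LeGall 5/3 biorthogonal low-pass pair; all coefficients are integers
# divided by powers of two (dyadic-integer coefficients).
h_analysis = np.array([-1, 2, 6, 2, -1]) / 8.0   # analysis low-pass
h_synthesis = np.array([1, 2, 1]) / 4.0          # synthesis low-pass

# Their product filter is the half-band polynomial the design starts from:
# p = [-1, 0, 9, 16, 9, 0, -1] / 32
p = np.convolve(h_analysis, h_synthesis)
center = len(p) // 2
# Half-band property: p[center] == 1/2, p[center +/- 2] == 0, sum(p) == 1.
```

Factorizing such a half-band polynomial, with a chosen allocation of vanishing moments and roots to each factor, is exactly the step the abstract describes.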

  20. Clausius-Clapeyron Scaling of Convective Available Potential Energy (CAPE) in Cloud-Resolving Simulations

    Science.gov (United States)

    Seeley, J.; Romps, D. M.

    2015-12-01

    Recent work by Singh and O'Gorman has produced a theory for convective available potential energy (CAPE) in radiative-convective equilibrium. In this model, the atmosphere deviates from a moist adiabat—and, therefore, has positive CAPE—because entrainment causes evaporative cooling in cloud updrafts, thereby steepening their lapse rate. This has led to the proposal that CAPE increases with global warming because the strength of evaporative cooling scales according to the Clausius-Clapeyron (CC) relation. However, CAPE could also change due to changes in cloud buoyancy and changes in the entrainment rate, both of which could vary with global warming. To test the relative importance of changes in CAPE due to CC scaling of evaporative cooling, changes in cloud buoyancy, and changes in the entrainment rate, we subject a cloud-resolving model to a suite of natural (and unnatural) forcings. We find that CAPE changes are primarily driven by changes in the strength of evaporative cooling; the effect of changes in the entrainment rate and cloud buoyancy are comparatively small. This builds support for CC scaling of CAPE.
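The Clausius-Clapeyron scaling invoked above follows from the fractional growth rate of saturation vapor pressure, d(ln e_s)/dT = L / (R_v T^2). A quick numerical estimate with standard constants:

```python
# Fractional increase of saturation vapor pressure per kelvin from the
# Clausius-Clapeyron relation: d(ln e_s)/dT = L / (R_v * T^2).
L_V = 2.5e6    # latent heat of vaporization, J/kg
R_V = 461.5    # gas constant for water vapor, J/(kg K)

def cc_rate(t_kelvin):
    """Clausius-Clapeyron fractional growth rate of e_s, per kelvin."""
    return L_V / (R_V * t_kelvin ** 2)

# Near-surface temperatures give the familiar ~6-7% per kelvin, which sets
# how fast evaporative cooling of entraining updrafts strengthens with
# warming in the Singh and O'Gorman picture.
rate_300K = cc_rate(300.0)
```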

  1. Interactions Between Atmospheric Aerosols and Marine Boundary Layer Clouds on Regional and Global Scales

    Science.gov (United States)

    Wang, Zhen

Airborne aerosols are crucial atmospheric constituents involved in global climate change and human health. Understanding the nature and magnitude of aerosol-cloud-precipitation interactions is critical for model predictions of the atmospheric radiation budget and the water cycle. The interactions depend on a variety of factors, including aerosol physicochemical complexity, cloud types, meteorological and thermodynamic regimes, and data processing techniques. This PhD work is an effort to quantify the relationships among aerosols, clouds, and precipitation on both global and regional scales by using satellite retrievals and aircraft measurements. The first study examines spatial distributions of the conversion rate of cloud water to rainwater in warm maritime clouds over the globe by using NASA A-Train satellite data. This study compares the time scale of the onset of precipitation across aerosol categories defined by values of aerosol optical depth, fine mode fraction, and Angstrom exponent. The results indicate that conversion time scales are quite sensitive to lower tropospheric static stability (LTSS) and cloud liquid water path (LWP), in addition to aerosol type. Analysis shows that the tropical Pacific Ocean exhibits the highest average conversion rates, while subtropical warm-cloud regions (the far northeastern Pacific Ocean, the far southeastern Pacific Ocean, and the western African coastal area) exhibit the opposite. Conversion times are mostly shorter in lower-LTSS regimes. When the LTSS condition is fixed, higher conversion rates coincide with higher LWP and lower aerosol index categories. After this general global view, the remaining studies focus on regional airborne observations, especially bulk cloud water chemistry and aerosol aqueous-phase reactions during summertime off the California coast.
Local air mass origins are categorized into three distinct types (ocean, ships, and land

  2. A Robust Multi-Scale Modeling System for the Study of Cloud and Precipitation Processes

    Science.gov (United States)

    Tao, Wei-Kuo

    2012-01-01

During the past decade, numerical weather and global non-hydrostatic models have started using more complex microphysical schemes originally developed for high-resolution cloud-resolving models (CRMs) with horizontal resolutions of 1-2 km or less. These microphysical schemes affect the dynamics through the release of latent heat (buoyancy loading and pressure gradients), the radiation through cloud coverage (vertical distribution of cloud species), and surface processes through rainfall (both amount and intensity). Recently, several major improvements to ice microphysical processes (or schemes) have been developed for a cloud-resolving model (the Goddard Cumulus Ensemble, GCE, model) and a regional-scale model (Weather Research and Forecasting, WRF). These improvements include an improved 3-ICE (cloud ice, snow, and graupel) scheme (Lang et al. 2010); a 4-ICE (cloud ice, snow, graupel, and hail) scheme; a spectral bin microphysics scheme; and two different two-moment microphysics schemes. The performance of these schemes has been evaluated using observational data from TRMM and other major field campaigns. In this talk, we will present high-resolution (1 km) GCE and WRF model simulations and compare the simulated results with observations from recent field campaigns [i.e., midlatitude continental spring season (MC3E, 2011), high-latitude cold season (C3VP, 2007; GCPEx, 2012), and tropical oceanic (TWP-ICE, 2006)].

  3. RACORO continental boundary layer cloud investigations: 1. Case study development and ensemble large-scale forcings

    Science.gov (United States)

    Vogelmann, Andrew M.; Fridlind, Ann M.; Toto, Tami; Endo, Satoshi; Lin, Wuyin; Wang, Jian; Feng, Sha; Zhang, Yunyan; Turner, David D.; Liu, Yangang; Li, Zhijin; Xie, Shaocheng; Ackerman, Andrew S.; Zhang, Minghua; Khairoutdinov, Marat

    2015-06-01

Observation-based modeling case studies of continental boundary layer clouds have been developed to study cloudy boundary layers, aerosol influences upon them, and their representation in cloud- and global-scale models. Three 60 h case study periods span the temporal evolution of cumulus, stratiform, and drizzling boundary layer cloud systems, representing mixed and transitional states rather than idealized or canonical cases. Based on in situ measurements from the Routine AAF (Atmospheric Radiation Measurement (ARM) Aerial Facility) CLOWD (Clouds with Low Optical Water Depth) Optical Radiative Observations (RACORO) field campaign and remote sensing observations, the cases are designed with a modular configuration to simplify use in large-eddy simulations (LES) and single-column models. Aircraft measurements of aerosol number size distribution are fit to lognormal functions for concise representation in models. Values of the aerosol hygroscopicity parameter, κ, are derived from observations to be 0.10, lower than the 0.3 typical over continents and suggestive of a large aerosol organic fraction. Ensemble large-scale forcing data sets are derived from the ARM variational analysis, European Centre for Medium-Range Weather Forecasts, and a multiscale data assimilation system. The forcings are assessed through comparison of measured bulk atmospheric and cloud properties to those computed in "trial" large-eddy simulations, where more efficient run times are enabled through modest reductions in grid resolution and domain size compared to the full-sized LES grid. Simulations capture many of the general features observed, but the state-of-the-art forcings were limited in representing details of cloud onset, tight gradients, and high-resolution transients of importance. Methods for improving the initial conditions and forcings are discussed. The cases developed are available to the general modeling community for studying continental boundary clouds.
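The lognormal condensation of aircraft size distributions mentioned above can be sketched with a moment fit in log-diameter space: the weighted mean and standard deviation of ln D give the geometric mean diameter and geometric standard deviation. The binned distribution below is synthetic, not a RACORO measurement.

```python
import numpy as np

# Bin-center diameters (um), equally spaced in log-diameter.
d = np.logspace(-2, 0, 40)
ln_d = np.log(d)

# Synthetic single-mode lognormal: Dg = 0.1 um, sigma_g = 1.6 (illustrative).
dg_true, sg_true, n_tot = 0.1, 1.6, 500.0
dNdlnD = (n_tot / (np.sqrt(2 * np.pi) * np.log(sg_true))
          * np.exp(-0.5 * ((ln_d - np.log(dg_true)) / np.log(sg_true)) ** 2))

# Moment fit: weighted mean/std of ln D recover the lognormal parameters.
w = dNdlnD / dNdlnD.sum()
mu = (w * ln_d).sum()
sigma = np.sqrt((w * (ln_d - mu) ** 2).sum())

dg_fit = float(np.exp(mu))       # geometric mean diameter (um)
sg_fit = float(np.exp(sigma))    # geometric standard deviation
```

Fitting each observed mode this way reduces a full measured spectrum to the two or three numbers a model's aerosol scheme actually ingests.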

  4. RACORO Continental Boundary Layer Cloud Investigations: 1. Case Study Development and Ensemble Large-Scale Forcings

    Science.gov (United States)

    Vogelmann, Andrew M.; Fridlind, Ann M.; Toto, Tami; Endo, Satoshi; Lin, Wuyin; Wang, Jian; Feng, Sha; Zhang, Yunyan; Turner, David D.; Liu, Yangang; hide

    2015-01-01

Observation-based modeling case studies of continental boundary layer clouds have been developed to study cloudy boundary layers, aerosol influences upon them, and their representation in cloud- and global-scale models. Three 60 h case study periods span the temporal evolution of cumulus, stratiform, and drizzling boundary layer cloud systems, representing mixed and transitional states rather than idealized or canonical cases. Based on in situ measurements from the Routine AAF (Atmospheric Radiation Measurement (ARM) Aerial Facility) CLOWD (Clouds with Low Optical Water Depth) Optical Radiative Observations (RACORO) field campaign and remote sensing observations, the cases are designed with a modular configuration to simplify use in large-eddy simulations (LES) and single-column models. Aircraft measurements of aerosol number size distribution are fit to lognormal functions for concise representation in models. Values of the aerosol hygroscopicity parameter, kappa, are derived from observations to be approximately 0.10, lower than the 0.3 typical over continents and suggestive of a large aerosol organic fraction. Ensemble large-scale forcing data sets are derived from the ARM variational analysis, European Centre for Medium-Range Weather Forecasts, and a multiscale data assimilation system. The forcings are assessed through comparison of measured bulk atmospheric and cloud properties to those computed in "trial" large-eddy simulations, where more efficient run times are enabled through modest reductions in grid resolution and domain size compared to the full-sized LES grid. Simulations capture many of the general features observed, but the state-of-the-art forcings were limited in representing details of cloud onset, tight gradients, and high-resolution transients of importance. Methods for improving the initial conditions and forcings are discussed.
The cases developed are available to the general modeling community for studying continental boundary

  5. Investigating the dependence of SCM simulated precipitation and clouds on the spatial scale of large-scale forcing at SGP

    Science.gov (United States)

    Tang, Shuaiqi; Zhang, Minghua; Xie, Shaocheng

    2017-08-01

Large-scale forcing data, such as vertical velocity and advective tendencies, are required to drive single-column models (SCMs), cloud-resolving models, and large-eddy simulations. Previous studies suggest that some errors in these model simulations can be attributed to the lack of spatial variability in the specified domain-mean large-scale forcing. This study investigates the spatial variability of the forcing and explores its impact on SCM-simulated precipitation and clouds. A gridded large-scale forcing data set for the March 2000 Cloud Intensive Operational Period at the Atmospheric Radiation Measurement program's Southern Great Plains site is used for analysis and to drive the single-column version of the Community Atmosphere Model Version 5 (SCAM5). When the gridded forcing data show large spatial variability, such as during a frontal passage, SCAM5 with the domain-mean forcing is not able to capture the convective systems that are partly located in the domain or that only occupy part of the domain. This problem has been largely reduced by using the gridded forcing data, which allows running SCAM5 in each subcolumn and then averaging the results within the domain. This is because the subcolumns have a better chance of capturing the timing of the frontal propagation and the small-scale systems. Other potential uses of the gridded forcing data, such as understanding and testing scale-aware parameterizations, are also discussed.

  6. Cloud-based computation for accelerating vegetation mapping and change detection at regional to national scales

    Science.gov (United States)

    Matthew J. Gregory; Zhiqiang Yang; David M. Bell; Warren B. Cohen; Sean Healey; Janet L. Ohmann; Heather M. Roberts

    2015-01-01

    Mapping vegetation and landscape change at fine spatial scales is needed to inform natural resource and conservation planning, but such maps are expensive and time-consuming to produce. For Landsat-based methodologies, mapping efforts are hampered by the daunting task of manipulating multivariate data for millions to billions of pixels. The advent of cloud-based...

  7. A band selection method for sub-pixel target detection in hyperspectral images based on laboratory and field reflectance spectral comparison

    Directory of Open Access Journals (Sweden)

    S. Sharifi hashjin

    2016-06-01

    In recent years, developing target detection algorithms for hyperspectral images has received growing interest. In comparison to the classification field, few studies have examined dimension reduction or band selection for target detection in hyperspectral images. This study presents a simple method for removing bad bands from the images in a supervised manner for sub-pixel target detection. The proposed method detects bad bands by comparing field and laboratory spectra of the target of interest. For evaluation, the target detection blind test dataset is used in this study. Experimental results show that the proposed method can improve the efficiency of two well-known target detection methods, ACE and CEM.
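The band-rejection idea can be sketched in a few lines. The reflectance values, band count, and relative-difference threshold below are all illustrative assumptions, not the authors' exact criterion:

```python
# Toy sketch of supervised "bad band" removal: compare a target's
# laboratory reflectance spectrum with its field spectrum and drop
# bands where the two disagree too strongly (e.g. bands distorted by
# atmospheric absorption). Values and threshold are invented.

def select_bands(lab, field, max_rel_diff=0.15):
    """Return indices of bands whose lab/field relative difference
    stays below max_rel_diff."""
    keep = []
    for i, (l, f) in enumerate(zip(lab, field)):
        rel = abs(l - f) / max(abs(l), 1e-9)
        if rel <= max_rel_diff:
            keep.append(i)
    return keep

lab_spectrum   = [0.42, 0.45, 0.50, 0.48, 0.47]
field_spectrum = [0.40, 0.44, 0.20, 0.47, 0.46]  # band 2 badly distorted

good = select_bands(lab_spectrum, field_spectrum)
print(good)  # [0, 1, 3, 4] -- band 2 is rejected
```

A detector such as ACE or CEM would then be run only on the retained band subset.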

  8. Traffic Flow Prediction Model for Large-Scale Road Network Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Zhaosheng Yang

    2014-01-01

    To increase the efficiency and precision of large-scale road network traffic flow prediction, a genetic algorithm-support vector machine (GA-SVM) model based on cloud computing is proposed in this paper, informed by an analysis of the characteristics and limitations of genetic algorithms and support vector machines. In the cloud computing environment, SVM parameters are first optimized by a parallel genetic algorithm, and the optimized parallel SVM model is then used to predict traffic flow. Using traffic flow data from the Haizhu District of Guangzhou City, the proposed model was verified and compared with a serial GA-SVM model and a parallel GA-SVM model based on MPI (Message Passing Interface). The results demonstrate that the parallel GA-SVM model based on cloud computing achieves higher prediction accuracy, shorter running time, and higher speedup.

  9. The parsec-scale relationship between ICO and AV in local molecular clouds

    Science.gov (United States)

    Lee, Cheoljong; Leroy, Adam K.; Bolatto, Alberto D.; Glover, Simon C. O.; Indebetouw, Remy; Sandstrom, Karin; Schruba, Andreas

    2018-03-01

    We measure the parsec-scale relationship between integrated CO intensity (ICO) and visual extinction (AV) in 24 local molecular clouds using maps of CO emission and dust optical depth from Planck. This relationship informs our understanding of CO emission across environments, but clean Milky Way measurements remain scarce. We find uniform ICO for a given AV, with the results bracketed by previous studies of the Pipe and Perseus clouds. Our measured ICO-AV relation broadly agrees with the standard Galactic CO-to-H2 conversion factor, the relation found for the Magellanic clouds at coarser resolution, and numerical simulations by Glover & Clark (2016). This supports the idea that CO emission primarily depends on shielding, which protects molecules from dissociating radiation. Evidence for CO saturation at high AV and for a threshold for CO emission at low AV remains uncertain due to insufficient resolution and ambiguities in background subtraction. Resolution of order 0.1 pc may be required to measure these features. We use this ICO-AV relation to predict how the CO-to-H2 conversion factor (XCO) would change if the Solar Neighbourhood clouds had a different dust-to-gas ratio (metallicity). The calculations highlight the need for improved observations of the CO emission threshold and H I shielding layer depth. They are also sensitive to the shape of the column density distribution. Because local clouds collectively show a self-similar distribution, we predict a shallow metallicity dependence for XCO down to a few tenths of solar metallicity. However, our calculations also imply dramatic variations in cloud-to-cloud XCO at subsolar metallicity.
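The metallicity scaling at the heart of this prediction can be written compactly (a standard formulation of the quantities involved, not an equation quoted from the paper):

```latex
X_{\mathrm{CO}} \equiv \frac{N(\mathrm{H}_2)}{I_{\mathrm{CO}}},
\qquad
A_V \propto \frac{\mathrm{DGR}}{\mathrm{DGR}_\odot}\, N_{\mathrm{H}} .
```

Lowering the dust-to-gas ratio (DGR) shifts every line of sight to lower AV at fixed gas column; the measured ICO-AV relation then gives the reduced ICO, and hence the increased XCO, for the same cloud column density distribution.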

  10. An LTE effective temperature scale for red supergiants in the Magellanic clouds

    Science.gov (United States)

    Tabernero, H. M.; Dorda, R.; Negueruela, I.; González-Fernández, C.

    2018-05-01

    We present a self-consistent study of cool supergiants (CSGs) belonging to the Magellanic clouds. We calculated stellar atmospheric parameters using LTE KURUCZ and MARCS atmospheric models for more than 400 individual targets by fitting a careful selection of weak metallic lines. We explore the existence of a Teff scale and its implications in two different metallicity environments (each Magellanic cloud). Critical and in-depth tests have been performed to assess the reliability of our stellar parameters (i.e. internal error budget, NLTE systematics). In addition, several Monte Carlo tests have been carried out to infer the significance of the Teff scale found. Our findings point towards a unique Teff scale that seems to be independent of the environment.

  11. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    OpenAIRE

    Sanggoo Kang; Kiwon Lee

    2016-01-01

    Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide applications in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in processing of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-bas...

  12. Analyzing cloud base at local and regional scales to understand tropical montane cloud forest vulnerability to climate change

    Science.gov (United States)

    Ashley E. Van Beusekom; Grizelle Gonzalez; Martha A. Scholl

    2017-01-01

    The degree to which cloud immersion provides water in addition to rainfall, suppresses transpiration, and sustains tropical montane cloud forests (TMCFs) during rainless periods is not well understood. Climate and land use changes represent a threat to these forests if cloud base altitude rises as a result of regional warming or deforestation. To establish a baseline...

  13. Contributions of Heterogeneous Ice Nucleation, Large-Scale Circulation, and Shallow Cumulus Detrainment to Cloud Phase Transition in Mixed-Phase Clouds with NCAR CAM5

    Science.gov (United States)

    Liu, X.; Wang, Y.; Zhang, D.; Wang, Z.

    2016-12-01

    representations of large-scale moisture transport, cloud microphysics, ice nucleation, and cumulus detrainment in order to improve the mixed-phase transition in GCMs.

  14. Instantaneous Linkages between Clouds and Large-Scale Meteorology over the Southern Ocean in Observations and a Climate Model

    Energy Technology Data Exchange (ETDEWEB)

    Wall, Casey J. [Department of Atmospheric Sciences, University of Washington, Seattle, Washington; Hartmann, Dennis L. [Department of Atmospheric Sciences, University of Washington, Seattle, Washington; Ma, Po-Lun [Atmospheric Sciences and Global Change Division, Pacific Northwest National Laboratory, Richland, Washington

    2017-12-01

    Instantaneous, coincident, footprint-level satellite observations of cloud properties and radiation taken during austral summer over the Southern Ocean are used to study relationships between clouds and large-scale meteorology. Cloud properties are very sensitive to the strength of vertical motion in the middle troposphere, and low-cloud properties are sensitive to estimated inversion strength, low-level temperature advection, and sea surface temperature. These relationships are quantified. An index for the meteorological anomalies associated with midlatitude cyclones is presented, and it is used to reveal the sensitivity of clouds to the meteorology within the warm and cold sectors of cyclones. The observed relationships between clouds and meteorology are compared to those in the Community Atmosphere Model version 5 (CAM5) using satellite simulators. Low clouds simulated by CAM5 are too few, too bright, and contain too much ice, and low clouds located in the cold sector of cyclones are too sensitive to variations in the meteorology. The latter two biases are dramatically reduced when CAM5 is coupled with an updated boundary-layer parameterization known as Cloud Layers Unified by Binormals (CLUBB). More generally, this study demonstrates that examining the instantaneous timescale is a powerful approach to understanding the physical processes that control clouds and how they are represented in climate models. Such an evaluation goes beyond the cloud climatology and exposes model bias under various meteorological conditions.

  15. Evaluating the Performance of the Goddard Multi-Scale Modeling Framework against GPM, TRMM and CloudSat/CALIPSO Products

    Science.gov (United States)

    Chern, J. D.; Tao, W. K.; Lang, S. E.; Matsui, T.; Mohr, K. I.

    2014-12-01

    Four six-month (March-August 2014) experiments with the Goddard Multi-scale Modeling Framework (MMF) were performed to study the impacts of different Goddard one-moment bulk microphysical schemes and large-scale forcings on the performance of the MMF. Recently a new Goddard one-moment bulk microphysics scheme with four ice classes (cloud ice, snow, graupel, and frozen drops/hail) has been developed based on cloud-resolving model simulations with large-scale forcings from field campaign observations. The new scheme has been successfully implemented in the MMF, and two MMF experiments were carried out with this new scheme and the old three-ice-class (cloud ice, snow, graupel) scheme. The MMF has global coverage and can rigorously evaluate microphysics performance for different cloud regimes. The results show that the MMF with the new scheme outperforms the old one. The MMF simulations are also strongly affected by the interaction between large-scale and cloud-scale processes. Two MMF sensitivity experiments, with and without nudging large-scale forcings to those of the ERA-Interim reanalysis, were carried out to study the impacts of large-scale forcings. The model-simulated mean and variability of surface precipitation, cloud types, and cloud properties such as cloud amount, hydrometeor vertical profiles, and cloud water contents in different geographic locations and climate regimes are evaluated against GPM, TRMM, and CloudSat/CALIPSO satellite observations. The Goddard MMF has also been coupled with the Goddard Satellite Data Simulation Unit (G-SDSU), a system with multi-satellite, multi-sensor, and multi-spectrum satellite simulators. The statistics of MMF-simulated radiances and backscattering can be directly compared with satellite observations to assess the strengths and/or deficiencies of MMF simulations and provide guidance on how to improve the MMF and microphysics.

  16. Holistic Interactions of Shallow Clouds, Aerosols, and Land-Ecosystems (HI-SCALE) Field Campaign Report

    Energy Technology Data Exchange (ETDEWEB)

    Fast, J. D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Berg, L. K. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Burleyson, C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Fan, J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Feng, Z. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hagos, S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Huang, M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Guenther, A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Laskin, A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ovchinnikov, M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Shilling, J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Shrivastava, M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Xiao, H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zaveri, R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zelenyuk-Imre, A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kuang, C. [Brookhaven National Lab. (BNL), Upton, NY (United States); Wang, J. [Brookhaven National Lab. (BNL), Upton, NY (United States); Smith, J. [University of California-Irvine; Turner, D. [National Severe Storms Laboratory; Gentine, P. [Columbia Univ., New York, NY (United States)

    2017-05-01

    Cumulus convection is an important component in the atmospheric radiation budget and hydrologic cycle over the southern Great Plains and over many regions of the world, particularly during the summertime growing season when intense turbulence induced by surface radiation couples the land surface to clouds. Current convective cloud parameterizations contain uncertainties resulting in part from insufficient coincident data that couples cloud macrophysical and microphysical properties to inhomogeneities in land surface, boundary layer, and aerosol properties. The Holistic Interactions of Shallow Clouds, Aerosols, and Land-Ecosystems (HI-SCALE) campaign was designed to provide a detailed set of measurements that are needed to obtain a more complete understanding of the lifecycle of shallow clouds by coupling cloud macrophysical and microphysical properties to land surface properties, ecosystems, and aerosols. Some of the land-atmosphere-cloud interactions that can be studied using HI-SCALE data are shown in Figure 1. HI-SCALE consisted of two 4-week intensive operation periods (IOPs), one in the spring (April 24-May 21) and the other in the late summer (August 28-September 24) of 2016, to take advantage of different stages of the plant lifecycle, the distribution of “greenness” for various types of vegetation in the vicinity of the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility Southern Great Plains (SGP) site, and aerosol properties that vary during the growing season. As expected, satellite measurements indicated that the Normalized Difference Vegetation Index (NDVI) was much “greener” in the vicinity of the SGP site during the spring IOP than the late summer IOP as a result of winter wheat maturing in the spring and being harvested in the early summer. As shown in Figure 2, temperatures were cooler than average and soil moisture was high during the spring IOP, while temperatures were warmer than average and

  17. A New WRF-Chem Treatment for Studying Regional Scale Impacts of Cloud-Aerosol Interactions in Parameterized Cumuli

    Energy Technology Data Exchange (ETDEWEB)

    Berg, Larry K.; Shrivastava, ManishKumar B.; Easter, Richard C.; Fast, Jerome D.; Chapman, Elaine G.; Liu, Ying

    2015-01-01

    A new treatment of cloud-aerosol interactions within parameterized shallow and deep convection has been implemented in WRF-Chem that can be used to better understand the aerosol lifecycle over regional to synoptic scales. The modifications to the model to represent cloud-aerosol interactions include treatment of the cloud droplet number mixing ratio; key cloud microphysical and macrophysical parameters (including the updraft fractional area, updraft and downdraft mass fluxes, and entrainment) averaged over the population of shallow clouds, or a single deep convective cloud; and vertical transport, activation/resuspension, aqueous chemistry, and wet removal of aerosol and trace gases in warm clouds. These changes have been implemented in both the WRF-Chem chemistry packages and the Kain-Fritsch cumulus parameterization, which has been modified to better represent shallow convective clouds. Preliminary testing of the modified WRF-Chem has been completed using observations from the Cumulus Humilis Aerosol Processing Study (CHAPS) as well as a high-resolution simulation that does not include parameterized convection. The simulation results are used to investigate the impact of cloud-aerosol interactions on the regional-scale transport of black carbon (BC), organic aerosol (OA), and sulfate aerosol. Based on the simulations presented here, changes in the column-integrated BC can be as large as -50% when cloud-aerosol interactions are considered (due largely to wet removal), or as large as +35% for sulfate in non-precipitating conditions due to sulfate production in the parameterized clouds. The modifications to WRF-Chem version 3.2.1 are found to account for changes in the cloud drop number concentration (CDNC) and changes in the chemical composition of cloud-drop residuals in a way that is consistent with observations collected during CHAPS.
Efforts are currently underway to port the changes described here to WRF-Chem version 3.5, and it is anticipated that they

  18. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    Science.gov (United States)

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    Due to the coming deluge of genome data, the need to store and process large-scale genome data, provide easy access to biomedical analysis tools, and support efficient data sharing and retrieval presents significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic, and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  19. Large-Scale, Parallel, Multi-Sensor Atmospheric Data Fusion Using Cloud Computing

    Science.gov (United States)

    Wilson, B. D.; Manipon, G.; Hua, H.; Fetzer, E. J.

    2013-12-01

    NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the 'A-Train' platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over decades. Moving to multi-sensor, long-duration analyses of important climate variables presents serious challenges for large-scale data mining and fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another (MODIS), and to a model (MERRA), stratify the comparisons using a classification of the 'cloud scenes' from CloudSat, and repeat the entire analysis over 10 years of data. To efficiently assemble such datasets, we are utilizing Elastic Computing in the Cloud and parallel map/reduce-based algorithms. However, these are data-intensive computing problems, so data transfer times and storage costs (for caching) are key issues. SciReduce is a Hadoop-like parallel analysis system, programmed in parallel python, that is designed from the ground up for Earth science. SciReduce executes inside VMWare images and scales to any number of nodes in the Cloud. Unlike Hadoop, SciReduce operates on bundles of named numeric arrays, which can be passed in memory or serialized to disk in netCDF4 or HDF5. Figure 1 shows the architecture of the full computational system, with SciReduce at the core. Multi-year datasets are automatically 'sharded' by time and space across a cluster of nodes so that years of data (millions of files) can be processed in a massively parallel way. Input variables (arrays) are pulled on-demand into the Cloud using OPeNDAP URLs or other subsetting services, thereby minimizing the size of the cached input and intermediate datasets. We are using SciReduce to automate the production of multiple versions of a ten-year A-Train water vapor climatology under a NASA MEASURES grant. We will
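The "bundles of named numeric arrays" model can be sketched as a plain map/reduce over time-sharded data. The variable name, shard contents, and statistic below are toy stand-ins, not SciReduce's actual API:

```python
# Minimal map/reduce sketch in the SciReduce spirit: each shard (e.g.
# one month of a retrieved variable) is a bundle of named arrays; the
# map step emits partial sums per shard, and the reduce step combines
# them into a multi-year mean. All values are invented.

def mapper(bundle):
    vals = bundle["water_vapor"]
    return {"sum": sum(vals), "count": len(vals)}

def reducer(partials):
    total = sum(p["sum"] for p in partials)
    count = sum(p["count"] for p in partials)
    return {"mean": total / count}

shards = [
    {"water_vapor": [1.0, 2.0, 3.0]},   # shard 1
    {"water_vapor": [4.0, 5.0]},        # shard 2
]
result = reducer([mapper(s) for s in shards])
print(result["mean"])  # 3.0
```

Because each mapper call touches only its own shard, the map step parallelizes trivially across cloud nodes; only the small partial sums travel to the reducer.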

  20. Large-Scale, Parallel, Multi-Sensor Data Fusion in the Cloud

    Science.gov (United States)

    Wilson, B. D.; Manipon, G.; Hua, H.

    2012-12-01

    NASA's Earth Observing System (EOS) is an ambitious facility for studying global climate change. The mandate now is to combine measurements from the instruments on the "A-Train" platforms (AIRS, AMSR-E, MODIS, MISR, MLS, and CloudSat) and other Earth probes to enable large-scale studies of climate change over periods of years to decades. However, moving from predominantly single-instrument studies to a multi-sensor, measurement-based model for long-duration analysis of important climate variables presents serious challenges for large-scale data mining and data fusion. For example, one might want to compare temperature and water vapor retrievals from one instrument (AIRS) to another instrument (MODIS), and to a model (ECMWF), stratify the comparisons using a classification of the "cloud scenes" from CloudSat, and repeat the entire analysis over years of AIRS data. To perform such an analysis, one must discover & access multiple datasets from remote sites, find the space/time "matchups" between instruments swaths and model grids, understand the quality flags and uncertainties for retrieved physical variables, assemble merged datasets, and compute fused products for further scientific and statistical analysis. To efficiently assemble such decade-scale datasets in a timely manner, we are utilizing Elastic Computing in the Cloud and parallel map/reduce-based algorithms. "SciReduce" is a Hadoop-like parallel analysis system, programmed in parallel python, that is designed from the ground up for Earth science. SciReduce executes inside VMWare images and scales to any number of nodes in the Cloud. Unlike Hadoop, in which simple tuples (keys & values) are passed between the map and reduce functions, SciReduce operates on bundles of named numeric arrays, which can be passed in memory or serialized to disk in netCDF4 or HDF5. Thus, SciReduce uses the native datatypes (geolocated grids, swaths, and points) that geo-scientists are familiar with. We are deploying within Sci

  1. Surveying the Dense Gas in Barnard 1 and NGC 1333 from Cloud to Core Scales

    Science.gov (United States)

    Storm, Shaye; Mundy, Lee; Teuben, Peter; Lee, Katherine; Fernandez-Lopez, Manuel; Looney, Leslie; Rosolowsky, Erik; Classy Collaboration

    2013-07-01

     The CARMA Large Area Star formation Survey (CLASSy) is mapping molecular emission across large areas of the nearby Perseus and Serpens Molecular Clouds. With an angular resolution of 7 arcsec, CLASSy probes dense gas on scales from a few thousand AU to parsecs with CARMA-23 and single-dish observations. The resulting maps of N2H+, HCN, and HCO+ J=1-0 trace the kinematics and structure of the high-density gas in regions covering a wide range of intrinsic star formation activity. This poster presents an overview of three completed CLASSy fields, NGC 1333, Barnard 1, and Serpens Main, and then focuses on the dendrogram analysis that CLASSy is using to characterize the emission structure. We have chosen a dendrogram analysis over traditional clump finding because dendrograms better encode the hierarchical nature of cloud structure and better facilitate analysis of cloud properties across the range of size scales probed by CLASSy. We present a new dendrogram methodology that allows for non-binary mergers of kernels, which results in a gas hierarchy that is more faithful to the S/N limitations of the data. The resulting trees from Barnard 1 and NGC 1333 are used to derive physical parameters of the identified gas structures, and to probe the kinematic relationship between gas structures at different spatial scales and evolutionary stages. We derive a flat relation between mean internal turbulence and structure size for the dense gas in both regions, but find a difference between the magnitude of the internal turbulence in regions with and without protostars; the dense gas in the B1 main core and NGC 1333 are characterized by mostly transonic to supersonic turbulence, while the B1 filaments and clumps southwest of the main core have mostly subsonic turbulence. These initial results, along with upcoming work analyzing the completed CLASSy observations, will be used to test current theories for star formation in turbulent molecular clouds.

  2. Fault Tolerance and Scaling in e-Science Cloud Applications: Observations from the Continuing Development of MODISAzure

    Energy Technology Data Exchange (ETDEWEB)

    Li, Jie [Univ. of Virginia, Charlottesville, VA (United States). Dept. of Computer Science; Humphrey, Marty [Univ. of Virginia, Charlottesville, VA (United States). Dept. of Computer Science; Cheah, You-Wei [Indiana Univ., Bloomington, IN (United States); Ryu, Youngryel [Univ. of California, Berkeley, CA (United States). Dept. of Environmental Science, Policy, and Management; Agarwal, Deb [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Jackson, Keith [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); van Ingen, Catharine [Microsoft Research. San Francisco, CA (United States)

    2010-04-01

     It can be natural to believe that many of the traditional issues of scale have been eliminated or at least greatly reduced via cloud computing. That is, if one can create a seemingly well-functioning cloud application that operates correctly on small or moderate-sized problems, then the very nature of cloud programming abstractions means that the same application will run as well on potentially significantly larger problems. In this paper, we present our experiences taking MODISAzure, our satellite data processing system built on the Windows Azure cloud computing platform, from the proof-of-concept stage to a point of being able to run on significantly larger problem sizes (e.g., from national-scale data sizes to global-scale data sizes). To our knowledge, this is the longest-running eScience application on the nascent Windows Azure platform. We found that while many infrastructure-level issues were thankfully masked from us by the cloud infrastructure, it was valuable to design additional redundancy and fault-tolerance capabilities such as transparent idempotent task retry and logging to support debugging of user code encountering unanticipated data issues. Further, we found that using a commercial cloud means anticipating inconsistent performance and black-box behavior of virtualized compute instances, as well as leveraging changing platform capabilities over time. We believe that the experiences presented in this paper can help future eScience cloud application developers on Windows Azure and other commercial cloud providers.
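The transparent idempotent-retry pattern described above can be sketched as follows. The function names, attempt count, and simulated transient failure are invented for illustration; the key property is that the task is safe to re-run, so a failed attempt is simply logged and repeated:

```python
import logging

logging.basicConfig(level=logging.WARNING)
log = logging.getLogger("retry-sketch")

def with_retry(task, attempts=3):
    """Run an idempotent task, logging and retrying on failure."""
    for i in range(1, attempts + 1):
        try:
            return task()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", i, attempts, exc)
    raise RuntimeError("task failed after %d attempts" % attempts)

calls = {"n": 0}

def flaky_download():            # stands in for a blob-storage fetch
    calls["n"] += 1
    if calls["n"] < 3:           # first two attempts hit transient errors
        raise IOError("transient storage error")
    return "tile-data"

print(with_retry(flaky_download))  # succeeds on the third attempt
```

The logged attempt numbers and error messages are what make unanticipated data issues in user code debuggable after the fact, which is the point the paper emphasizes.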

  3. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Sanggoo Kang

    2016-08-01

     Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide applications in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in processing of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-based image processing algorithms by comparing the performance of a single virtual server and multiple auto-scaled virtual servers under identical experimental conditions. In this study, the cloud computing environment is built with OpenStack, and four algorithms from the Orfeo toolbox are used for practical geo-based image processing experiments. The auto-scaling results from all experimental performance tests demonstrate the applicability of auto-scaling, with clear benefits in cloud utilization and response time. Auto-scaling contributes to the development of web-based satellite image application services using cloud-based technologies.
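A threshold-based scale-out/scale-in rule of the kind an auto-scaler applies can be sketched in a few lines. The watermarks, server limits, and load trace are illustrative assumptions, not OpenStack defaults:

```python
# Toy auto-scaling policy: add a virtual server when average CPU load
# exceeds a high watermark, remove one when it drops below a low
# watermark, within fixed fleet-size bounds. All numbers are invented.

def autoscale(servers, avg_cpu, high=0.70, low=0.25,
              min_servers=1, max_servers=8):
    if avg_cpu > high and servers < max_servers:
        return servers + 1   # scale out
    if avg_cpu < low and servers > min_servers:
        return servers - 1   # scale in
    return servers           # within the comfort band: no change

n = 2
for load in [0.80, 0.90, 0.60, 0.20, 0.15]:
    n = autoscale(n, load)
print(n)  # 2 -> 3 -> 4 -> 4 -> 3 -> 2
```

Real auto-scalers additionally require the load to stay past a watermark for a cooldown period before acting, to avoid thrashing; that refinement is omitted here for brevity.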

  4. Production of lightning NOx and its vertical distribution calculated from three-dimensional cloud-scale chemical transport model simulations

    KAUST Repository

    Ott, Lesley E.; Pickering, Kenneth E.; Stenchikov, Georgiy L.; Allen, Dale J.; DeCaria, Alex J.; Ridley, Brian; Lin, Ruei-Fong; Lang, Stephen; Tao, Wei-Kuo

    2010-01-01

    A three-dimensional (3-D) cloud-scale chemical transport model that includes a parameterized source of lightning NOx on the basis of observed flash rates has been used to simulate six midlatitude and subtropical thunderstorms observed during four

  5. Computational biology in the cloud: methods and new insights from computing at scale.

    Science.gov (United States)

    Kasson, Peter M

    2013-01-01

     The past few years have seen both explosions in the size of biological data sets and the proliferation of new, highly flexible on-demand computing capabilities. The sheer amount of information available from genomic and metagenomic sequencing, high-throughput proteomics, and experimental and simulation datasets on molecular structure and dynamics affords an opportunity for greatly expanded insight, but it creates new challenges of scale for computation, storage, and interpretation of petascale data. Cloud computing resources have the potential to help solve these problems by offering a utility model of computing and storage: near-unlimited capacity, the ability to burst usage, and cheap and flexible payment models. Effective use of cloud computing on large biological datasets requires dealing with non-trivial problems of scale and robustness, since performance-limiting factors can change substantially when a dataset grows by a factor of 10,000 or more. New computing paradigms are thus often needed. The use of cloud platforms also creates new opportunities to share data, reduce duplication, and provide easy reproducibility by making the datasets and computational methods easily available.

  6. A Coupled fcGCM-GCE Modeling System: A 3D Cloud Resolving Model and a Regional Scale Model

    Science.gov (United States)

    Tao, Wei-Kuo

    2005-01-01

     Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that cloud-resolving models (CRMs) agree with observations better than traditional single-column models in simulating various types of clouds and cloud systems from different geographic locations. Current and future NASA satellite programs can provide cloud, precipitation, aerosol, and other data at very fine spatial and temporal scales. Using these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF). The use of a GCM enables global coverage, and the use of a CRM allows for better and more sophisticated physical parameterizations. NASA satellite and field campaign cloud datasets can provide initial conditions as well as validation for both the MMF and CRMs. The Goddard MMF is based on the 2D Goddard Cumulus Ensemble (GCE) model and the Goddard finite-volume general circulation model (fvGCM), and it has started production runs, with two years of results (1998 and 1999). Also, at Goddard, we have implemented several Goddard microphysical schemes (2ICE, several 3ICE), Goddard radiation (including explicitly calculated cloud optical properties), and the Goddard Land Information System (LIS, which includes the CLM and NOAH land surface models) into a next-generation regional-scale model, WRF. In this talk, I will present: (1) a brief review of the GCE model and its applications to precipitation processes (microphysical and land processes); (2) the Goddard MMF, the major differences between the two existing MMFs (CSU MMF and Goddard MMF), and preliminary results (the comparison with traditional GCMs); (3) a discussion of the Goddard WRF version (its developments and applications); and (4) the characteristics of the four-dimensional cloud data

  7. Moving image analysis to the cloud: A case study with a genome-scale tomographic study

    Energy Technology Data Exchange (ETDEWEB)

    Mader, Kevin [4Quant Ltd., Switzerland & Institute for Biomedical Engineering at University and ETH Zurich (Switzerland); Stampanoni, Marco [Institute for Biomedical Engineering at University and ETH Zurich, Switzerland & Swiss Light Source at Paul Scherrer Institut, Villigen (Switzerland)

    2016-01-28

    Over the last decade, the time required to measure a terabyte of microscopic imaging data has gone from years to minutes. This shift has moved many of the challenges away from experimental design and measurement to scalable storage, organization, and analysis. As many scientists and scientific institutions lack training and competencies in these areas, major bottlenecks have arisen and led to substantial delays and gaps between measurement, understanding, and dissemination. We present in this paper a framework for analyzing large 3D datasets using cloud-based computational and storage resources. We demonstrate its applicability by showing the setup and costs associated with the analysis of a genome-scale study of bone microstructure. We then evaluate the relative advantages and disadvantages associated with local versus cloud infrastructures.

  9. Shortwave surface radiation network for observing small-scale cloud inhomogeneity fields

    Science.gov (United States)

    Lakshmi Madhavan, Bomidi; Kalisch, John; Macke, Andreas

    2016-03-01

    As part of the High Definition Clouds and Precipitation for advancing Climate Prediction Observational Prototype Experiment (HOPE), a high-density network of 99 silicon photodiode pyranometers was set up around Jülich (10 km × 12 km area) from April to July 2013 to capture the small-scale variability of cloud-induced radiation fields at the surface. In this paper, we provide the details of this unique setup of the pyranometer network, data processing, quality control, and uncertainty assessment under variable conditions. Some exemplary days with clear, broken cloudy, and overcast skies were explored to assess the spatiotemporal observations from the network along with other collocated radiation and sky imager measurements available during the HOPE period.

  10. Effects of Resolution on the Simulation of Boundary-layer Clouds and the Partition of Kinetic Energy to Subgrid Scales

    Directory of Open Access Journals (Sweden)

    Anning Cheng

    2010-02-01

    Seven boundary-layer cloud cases are simulated with the UCLA-LES (University of California, Los Angeles, large-eddy simulation) model at different horizontal and vertical gridspacings to investigate how the results depend on gridspacing. Some variables are more sensitive to horizontal gridspacing, others to vertical gridspacing, and still others to both, with similar or opposite trends. For cloud-related variables having the opposite dependence on horizontal and vertical gridspacings, changing the gridspacing proportionally in both directions gives the appearance of convergence. In this study, we mainly discuss the impact of subgrid-scale (SGS) kinetic energy (KE) on the simulations as the horizontal and vertical gridspacings are coarsened. A running-mean operator is used to separate the KE of the high-resolution benchmark simulations into that of the resolved scales of coarse-resolution simulations and that of the SGSs. The diagnosed SGS KE is compared with that parameterized by the Smagorinsky-Lilly SGS scheme at various gridspacings. It is found that the parameterized SGS KE for the coarse-resolution simulations is usually underestimated while the resolved KE is unrealistically large, compared to the benchmark simulations. However, the sum of resolved and SGS KEs is about the same for simulations at various gridspacings. The partitioning of SGS and resolved heat and moisture transports is consistent with that of SGS and resolved KE, which means that the parameterized transports are underestimated but the resolved-scale transports are overestimated. On the whole, energy shifts to large scales as the horizontal gridspacing becomes coarse; hence the size of clouds and the resolved circulation increase, and the clouds become more stratiform-like with an increase in cloud fraction, cloud liquid-water path and surface precipitation; when coarse vertical gridspacing is used, cloud sizes do not
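    The running-mean decomposition described in this abstract is easy to sketch: filter a benchmark field to obtain the resolved part of a coarse grid, and treat the residual as subgrid-scale. The snippet below is a minimal 1D illustration, not the UCLA-LES diagnostic; the synthetic field, the top-hat kernel, and the filter width of 8 points are all assumptions. It checks the exact identity that total KE equals resolved KE plus SGS KE plus a resolved-SGS cross term (the running mean is not a projection, so the cross term is generally nonzero).

    ```python
    import numpy as np

    def running_mean(u, width):
        """Top-hat running mean approximating the resolved scales of a coarser grid."""
        kernel = np.ones(width) / width
        return np.convolve(u, kernel, mode="same")

    rng = np.random.default_rng(0)
    u = rng.standard_normal(1024)      # benchmark high-resolution velocity field
    u_res = running_mean(u, 8)         # resolved part at 8x coarser spacing
    u_sgs = u - u_res                  # subgrid-scale residual

    ke_total = 0.5 * np.mean(u ** 2)
    ke_res = 0.5 * np.mean(u_res ** 2)
    ke_sgs = 0.5 * np.mean(u_sgs ** 2)
    cross = np.mean(u_res * u_sgs)     # cross term: top-hat filter is not a projection

    # Exact identity: total KE = resolved KE + SGS KE + cross term
    print(ke_total, ke_res + ke_sgs + cross)
    ```

    The same bookkeeping, applied to heat and moisture fluxes instead of KE, gives the transport partition discussed in the abstract.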

  11. A cloud based tool for knowledge exchange on local scale flood risk.

    Science.gov (United States)

    Wilkinson, M E; Mackay, E; Quinn, P F; Stutter, M; Beven, K J; MacLeod, C J A; Macklin, M G; Elkhatib, Y; Percy, B; Vitolo, C; Haygarth, P M

    2015-09-15

    There is an emerging and urgent need for new approaches to the management of environmental challenges such as flood hazard in the broad context of sustainability. This requires a new way of working that bridges disciplines and organisations and breaks down science-culture boundaries. With this, there is growing recognition that the appropriate involvement of local communities in catchment management decisions can result in multiple benefits. However, new tools are required to connect organisations and communities. The growth of cloud-based technologies offers a novel way to facilitate this exchange of information in environmental science and management; however, stakeholders need to be engaged as part of the development process from the beginning rather than being presented with a final product at the end. Here we present the development of a pilot Local Environmental Virtual Observatory Flooding Tool. The aim was to develop a cloud-based learning platform for stakeholders, bringing together fragmented data, models and visualisation tools to enable those stakeholders to make scientifically informed environmental management decisions at the local scale. It has been developed by engaging with different stakeholder groups in three catchment case studies in the UK and with a panel of national experts in relevant topic areas; these case study catchments are typical of many northern-latitude catchments. The tool was designed to communicate flood risk in locally impacted communities whilst engaging with landowners/farmers about the risk of runoff from the farmed landscape. It has been developed iteratively to reflect the needs, interests and capabilities of a wide range of stakeholders. The pilot tool combines cloud-based services, local catchment datasets, a hydrological model and bespoke visualisation tools to explore real-time hydrometric data and the impact of flood risk caused by future land use changes. The novel aspects of the

  12. Large scale IRAM 30 m CO-observations in the giant molecular cloud complex W43

    Science.gov (United States)

    Carlhoff, P.; Nguyen Luong, Q.; Schilke, P.; Motte, F.; Schneider, N.; Beuther, H.; Bontemps, S.; Heitsch, F.; Hill, T.; Kramer, C.; Ossenkopf, V.; Schuller, F.; Simon, R.; Wyrowski, F.

    2013-12-01

    We aim to fully describe the distribution and location of dense molecular clouds in the giant molecular cloud complex W43. It was previously identified as one of the most massive star-forming regions in our Galaxy. To trace the moderately dense molecular clouds in the W43 region, we initiated W43-HERO, a large program using the IRAM 30 m telescope, which covers a wide dynamic range of scales from 0.3 to 140 pc. We obtained on-the-fly maps in 13CO (2-1) and C18O (2-1) with a high spectral resolution of 0.1 km s-1 and a spatial resolution of 12''. These maps cover an area of ~1.5 square degrees and include the two main clouds of W43 and the lower density gas surrounding them. A comparison to Galactic models and previous distance calculations confirms the location of W43 near the tangential point of the Scutum arm at approximately 6 kpc from the Sun. The resulting intensity cubes of the observed region are separated into subcubes, which are centered on single clouds and then analyzed in detail. The optical depth, excitation temperature, and H2 column density maps are derived from the 13CO and C18O data. These results are then compared to those derived from Herschel dust maps. The mass of a typical cloud is several 10^4 M⊙, while the total mass in the dense molecular gas (>10^2 cm-3) in W43 is found to be ~1.9 × 10^6 M⊙. Probability distribution functions obtained from column density maps derived from the molecular line data and Herschel imaging show a log-normal distribution for low column densities and a power-law tail for high densities. A flatter slope for the molecular-line probability distribution function may imply that it selectively shows the gravitationally collapsing gas. Appendices are available in electronic form at http://www.aanda.org. The final datacubes (13CO and C18O) for the entire survey are only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/560/A24
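    One way optical depth enters such isotopologue analyses: under LTE, if 13CO and C18O share excitation conditions and a fixed abundance ratio, the observed 13CO/C18O brightness-temperature ratio determines the 13CO optical depth. The sketch below inverts the standard relation by bisection. It is a textbook illustration under an assumed LTE geometry and an assumed [13CO]/[C18O] abundance ratio of 8, not necessarily the exact procedure used in the W43-HERO analysis.

    ```python
    import math

    # Ratio of brightness temperatures T(13CO)/T(C18O) for lines sharing T_ex:
    #   R(tau) = (1 - exp(-tau)) / (1 - exp(-tau / X)),  X = assumed abundance ratio.
    # R falls monotonically from X (optically thin) toward 1 as 13CO saturates,
    # so an observed ratio below X yields the 13CO optical depth.

    def tau13_from_ratio(ratio, abund=8.0, lo=1e-6, hi=50.0):
        """Invert R(tau) = ratio by bisection (valid for 1 < ratio < abund)."""
        f = lambda t: (1.0 - math.exp(-t)) / (1.0 - math.exp(-t / abund)) - ratio
        for _ in range(200):
            mid = 0.5 * (lo + hi)
            if f(lo) * f(mid) <= 0.0:
                hi = mid        # root lies in the lower half
            else:
                lo = mid
        return 0.5 * (lo + hi)

    tau = tau13_from_ratio(4.0)   # an observed ratio of 4 implies moderately saturated 13CO
    ```

    With tau and an excitation temperature in hand, the usual LTE column density relations then give the H2 column density maps the abstract describes.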

  13. LARGE-SCALE CO MAPS OF THE LUPUS MOLECULAR CLOUD COMPLEX

    International Nuclear Information System (INIS)

    Tothill, N. F. H.; Loehr, A.; Stark, A. A.; Lane, A. P.; Harnett, J. I.; Bourke, T. L.; Myers, P. C.; Parshley, S. C.; Wright, G. A.; Walker, C. K.

    2009-01-01

    Fully sampled degree-scale maps of the 13CO (2-1) and CO (4-3) transitions toward three members of the Lupus Molecular Cloud Complex (Lupus I, III, and IV) trace the column density and temperature of the molecular gas. Comparison with IR extinction maps from the c2d project requires most of the gas to have a temperature of 8-10 K. Estimates of the cloud mass from 13CO emission are roughly consistent with most previous estimates, while the line widths are higher, around 2 km s-1. CO (4-3) emission is found throughout Lupus I, indicating widespread dense gas, and toward Lupus III and IV. Enhanced line widths at the NW end and along the edge of the B 228 ridge in Lupus I, and a coherent velocity gradient across the ridge, are consistent with interaction between the molecular cloud and an expanding H I shell from the Upper-Scorpius subgroup of the Sco-Cen OB Association. Lupus III is dominated by the effects of two HAe/Be stars, and shows no sign of external influence. Slightly warmer gas around the core of Lupus IV and a low line width suggest heating by the Upper-Centaurus-Lupus subgroup of Sco-Cen, without the effects of an H I shell.

  14. Cloud-enabled large-scale land surface model simulations with the NASA Land Information System

    Science.gov (United States)

    Duffy, D.; Vaughan, G.; Clark, M. P.; Peters-Lidard, C. D.; Nijssen, B.; Nearing, G. S.; Rheingrover, S.; Kumar, S.; Geiger, J. V.

    2017-12-01

    Developed by the Hydrological Sciences Laboratory at NASA Goddard Space Flight Center (GSFC), the Land Information System (LIS) is a high-performance software framework for terrestrial hydrology modeling and data assimilation. LIS provides the ability to integrate satellite and ground-based observational products and advanced modeling algorithms to extract land surface states and fluxes. Through a partnership with the National Center for Atmospheric Research (NCAR) and the University of Washington, the LIS model is currently being extended to include the Structure for Unifying Multiple Modeling Alternatives (SUMMA). With the addition of SUMMA in LIS, meaningful simulations containing a large multi-model ensemble will be enabled and can provide advanced probabilistic continental-domain modeling capabilities at spatial scales relevant for water managers. The resulting LIS/SUMMA application framework is difficult for non-experts to install due to the large number of dependencies on specific versions of operating systems, libraries, and compilers. This has created a significant barrier to entry for domain scientists who are interested in using the software on their own systems or in the cloud. In addition, the requirement to support multiple run-time environments across the LIS community has created a significant burden on the NASA team. To overcome these challenges, LIS/SUMMA has been deployed using Linux containers, which allow an entire software package along with all dependencies to be installed within a working runtime environment, and Kubernetes, which orchestrates the deployment of a cluster of containers. Within a cloud environment, users can now easily create a cluster of virtual machines and run large-scale LIS/SUMMA simulations. Installations that have taken weeks and months can now be performed in minutes. This presentation will discuss the steps required to create a cloud-enabled large-scale simulation, present examples of its use, and

  15. MULTI-SCALE ANALYSIS OF MAGNETIC FIELDS IN FILAMENTARY MOLECULAR CLOUDS IN ORION A

    International Nuclear Information System (INIS)

    Poidevin, Frédérick; Bastien, P.; Jones, T. J.

    2011-01-01

    New visible and K-band polarization measurements of stars surrounding molecular clouds in Orion A and stars in the Becklin-Neugebauer (BN) vicinity are presented. Our results confirm that magnetic fields located inside the Orion A molecular clouds and in their close neighborhood are spatially connected. On and around the BN object, we measured the angular offsets between the K-band polarization data and available submillimeter (submm) data. We find high values of the polarization degree, P_K, and of the optical depth, τ_K, close to an angular offset position of 90°, whereas lower values of P_K and τ_K are observed for smaller angular offsets. We interpret these results as evidence for the presence of various magnetic field components toward lines of sight in the vicinity of BN. On a larger scale, we measured the distribution of angular offsets between available H-band polarization data and the same submm data set. Here we find an increase of P_H with angular offset, which we interpret as a rotation of the magnetic field by ≲ 60°. This trend generalizes previous results on small scales toward and around lines of sight to BN and is consistent with a twist of the magnetic field on a larger scale toward OMC-1. A comparison of our results with several other studies suggests that a two-component magnetic field, perhaps helical, could be wrapping the OMC-1 filament.

  16. GATECloud.net: a platform for large-scale, open-source text processing on the cloud.

    Science.gov (United States)

    Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina

    2013-01-28

    Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research: GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.

  17. Cloud Feedbacks on Greenhouse Warming in a Multi-Scale Modeling Framework with a Higher-Order Turbulence Closure

    Science.gov (United States)

    Cheng, Anning; Xu, Kuan-Man

    2015-01-01

    Five-year simulation experiments with a multi-scale modeling framework (MMF) that uses an advanced intermediately prognostic higher-order turbulence closure (IPHOC) in its cloud-resolving model (CRM) component, also known as SPCAM-IPHOC (superparameterized Community Atmospheric Model), are performed to understand the fast tropical (30S-30N) cloud response to an instantaneous doubling of CO2 concentration with SST held fixed at present-day values. SPCAM-IPHOC has substantially improved the representation of low-level clouds compared with SPCAM, so the cloud responses to greenhouse warming in SPCAM-IPHOC are expected to be more realistic. The changes in rising motion, surface precipitation, cloud cover, and shortwave and longwave cloud radiative forcing in SPCAM-IPHOC under greenhouse warming will be presented.

  18. Rainbow: a tool for large-scale whole-genome sequencing data analysis using cloud computing.

    Science.gov (United States)

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance; Messina, Thomas; Fan, Hongtao; Jaeger, Edward; Stephens, Susan

    2013-06-27

    Technical improvements have decreased sequencing costs and, as a result, the size and number of genomic datasets have increased rapidly. Because of the lower cost, large amounts of sequence data are now being produced by small to midsize research groups. Crossbow is a software tool that can detect single nucleotide polymorphisms (SNPs) in whole-genome sequencing (WGS) data from a single subject; however, Crossbow has a number of limitations when applied to multiple subjects from large-scale WGS projects. The data storage and CPU resources that are required for large-scale whole-genome sequencing data analyses are too large for many core facilities and individual laboratories to provide. To help meet these challenges, we have developed Rainbow, a cloud-based software package that can assist in the automation of large-scale WGS data analyses. Here, we evaluated the performance of Rainbow by analyzing 44 different whole-genome-sequenced subjects. Rainbow has the capacity to process genomic data from more than 500 subjects in two weeks using cloud computing provided by the Amazon Web Service. The time includes the import and export of the data using the Amazon Import/Export service. The average cost of processing a single sample in the cloud was less than 120 US dollars. Compared with Crossbow, the main improvements incorporated into Rainbow include the ability: (1) to handle BAM as well as FASTQ input files; (2) to split large sequence files for better load balance downstream; (3) to log run-time metrics during data processing and to monitor multiple Amazon Elastic Compute Cloud (EC2) instances; and (4) to merge SOAPsnp outputs for multiple individuals into a single file to facilitate downstream genome-wide association studies. Rainbow is a scalable, cost-effective, and open-source tool for large-scale WGS data analysis. For human WGS data sequenced by either the Illumina HiSeq 2000 or HiSeq 2500 platforms, Rainbow can be used straight out of the box.
Rainbow is available

  19. A Coupled GCM-Cloud Resolving Modeling System, and a Regional Scale Model to Study Precipitation Processes

    Science.gov (United States)

    Tao, Wei-Kuo

    2007-01-01

    Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that cloud-resolving models (CRMs) agree with observations better than traditional single-column models in simulating various types of clouds and cloud systems from different geographic locations. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. It requires a coupled global circulation model (GCM) and cloud-scale model (termed a superparameterization or multi-scale modeling framework, MMF) to use these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems. The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellite and field campaign cloud related datasets can provide initial conditions as well as validation for both the MMF and CRMs. The Goddard MMF is based on the 2D Goddard Cumulus Ensemble (GCE) model and the Goddard finite volume general circulation model (fvGCM), and it has started production runs with two years of results (1998 and 1999). Also, at Goddard, we have implemented several Goddard microphysical schemes (2ICE, several 3ICE), Goddard radiation (including explicitly calculated cloud optical properties), and Goddard Land Information (LIS, that includes the CLM and NOAH land surface models) into a next-generation regional scale model, WRF. In this talk, I will present: (1) A brief review of the GCE model and its applications to precipitation processes (microphysical and land processes), (2) The Goddard MMF and the major differences between the two existing MMFs (CSU MMF and Goddard MMF), and preliminary results (the comparison with traditional GCMs), and (3) A discussion on the Goddard WRF version (its developments and applications).

  20. Accounting for Unresolved Spatial Variability in Large Scale Models: Development and Evaluation of a Statistical Cloud Parameterization with Prognostic Higher Order Moments

    Energy Technology Data Exchange (ETDEWEB)

    Robert Pincus

    2011-05-17

    This project focused on the variability of clouds that is present across a wide range of scales, from the synoptic to the millimeter. In particular, there is substantial variability in cloud properties at scales smaller than the grid spacing of the models used to make climate projections (GCMs) and weather forecasts. These models represent clouds and other small-scale processes with parameterizations that describe how those processes respond to and feed back on the large-scale state of the atmosphere.

  1. Supervised Outlier Detection in Large-Scale Mvs Point Clouds for 3d City Modeling Applications

    Science.gov (United States)

    Stucker, C.; Richard, A.; Wegner, J. D.; Schindler, K.

    2018-05-01

    We propose to use a discriminative classifier for outlier detection in large-scale point clouds of cities generated via multi-view stereo (MVS) from densely acquired images. What makes outlier removal hard are varying distributions of inliers and outliers across a scene. Heuristic outlier removal using a specific feature that encodes point distribution often delivers unsatisfying results. Although most outliers can be identified correctly (high recall), many inliers are erroneously removed (low precision), too. This aggravates 3D object reconstruction due to missing data. We thus propose to discriminatively learn class-specific distributions directly from the data to achieve high precision. We apply a standard Random Forest classifier that infers a binary label (inlier or outlier) for each 3D point in the raw, unfiltered point cloud and test two approaches for training. In the first, non-semantic approach, features are extracted without considering the semantic interpretation of the 3D points. The trained model approximates the average distribution of inliers and outliers across all semantic classes. Second, semantic interpretation is incorporated into the learning process, i.e., we train separate inlier/outlier classifiers per semantic class (building facades, roofs, ground, vegetation, fields, and water). Performance of learned filtering is evaluated on several large SfM point clouds of cities. We find that the results confirm our underlying assumption that discriminatively learning inlier/outlier distributions does improve precision over global heuristics, by up to ≈ 12 percentage points. Moreover, semantically informed filtering that models class-specific distributions further improves precision by up to ≈ 10 percentage points, being able to remove very isolated building, roof, and water points while preserving inliers on building facades and vegetation.
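    The per-class inlier/outlier scheme can be sketched with scikit-learn: a Random Forest is trained on simple per-point features, once per semantic class, so each model learns that class's own inlier/outlier distribution. The feature set, class list, and outlier fractions below are synthetic stand-ins, not the paper's actual descriptors.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)

    def make_points(n, outlier_frac):
        """Synthetic per-point features (e.g. local density, mean neighbour
        distance, photo-consistency): inliers compact, outliers diffuse."""
        n_out = int(n * outlier_frac)
        inliers = rng.normal([5.0, 0.2, 0.9], 0.3, size=(n - n_out, 3))
        outliers = rng.normal([1.0, 1.5, 0.3], 0.5, size=(n_out, 3))
        X = np.vstack([inliers, outliers])
        y = np.array([1] * (n - n_out) + [0] * n_out)   # 1 = inlier, 0 = outlier
        return X, y

    # One binary classifier per semantic class, each approximating that
    # class's own inlier/outlier distribution (the "semantic" variant).
    classifiers = {}
    for cls, frac in {"facade": 0.1, "vegetation": 0.3}.items():
        X, y = make_points(2000, frac)
        classifiers[cls] = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

    X_test, y_test = make_points(500, 0.2)
    pred = classifiers["facade"].predict(X_test)
    precision = np.mean(y_test[pred == 1] == 1)   # fraction of kept points that are true inliers
    ```

    In practice each 3D point would be routed to the classifier of its predicted semantic class before filtering.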

  2. A COMPACT HIGH VELOCITY CLOUD NEAR THE MAGELLANIC STREAM: METALLICITY AND SMALL-SCALE STRUCTURE

    Energy Technology Data Exchange (ETDEWEB)

    Kumari, Nimisha [Ecole Polytechnique, Route de Saclay, F-91128 Palaiseau (France); Fox, Andrew J.; Tumlinson, Jason; Thom, Christopher; Ely, Justin [Space Telescope Science Institute, Baltimore, MD 21218 (United States); Westmeier, Tobias [ICRAR, The University of Western Australia, 35 Stirling Highway, Crawley WA 6009 (Australia)

    2015-02-10

    The Magellanic Stream (MS) is a well-resolved gaseous tail originating from the Magellanic Clouds. Studies of its physical properties and chemical composition are needed to understand its role in Galactic evolution. We investigate the properties of a compact HVC (CHVC 224.0-83.4-197) lying close on the sky to the MS to determine whether it is physically connected to the Stream and to examine its internal structure. Our study is based on analysis of HST/COS spectra of three QSOs (Ton S210, B0120-28, and B0117-2837), all of which pass through this single cloud at small angular separation (≲ 0.72°), allowing us to compare physical conditions on small spatial scales. No significant variation is detected in the ionization structure from one part of the cloud to the other. Using Cloudy photoionization models, toward Ton S210 we derive elemental abundances of [C/H] = –1.21 ± 0.11, [Si/H] = –1.16 ± 0.11, [Al/H] = –1.19 ± 0.17, and [O/H] = –1.12 ± 0.22, which agree within 0.09 dex. The CHVC abundances match the 0.1 solar abundances measured along the main body of the Stream. This suggests that the CHVC (and by extension the extended network of filaments to which it belongs) has an origin in the MS. It may represent a fragment that has been removed from the Stream as it interacts with the gaseous Galactic halo.

  3. A cloud-based framework for large-scale traditional Chinese medical record retrieval.

    Science.gov (United States)

    Liu, Lijun; Liu, Li; Fu, Xiaodong; Huang, Qingsong; Zhang, Xianwen; Zhang, Yin

    2018-01-01

    Electronic medical records are increasingly common in medical practice. The secondary use of medical records has become increasingly important; it relies on the ability to retrieve complete information about desired patient populations. How to effectively and accurately retrieve relevant medical records from large-scale medical big data is becoming a big challenge. Therefore, we propose an efficient and robust cloud-based framework for large-scale Traditional Chinese Medical Records (TCMRs) retrieval. We propose a parallel index building method and build a distributed search cluster; the former is used to improve the performance of index building, and the latter is used to provide high-concurrency online TCMRs retrieval. Then, a real-time multi-indexing model is proposed to ensure the latest relevant TCMRs are indexed and retrieved in real time, and a semantics-based query expansion method and a multi-factor ranking model are proposed to improve retrieval quality. Third, we implement a template-based visualization method for displaying medical reports. The proposed parallel indexing method and distributed search cluster can improve the performance of index building and provide high-concurrency online TCMRs retrieval. The multi-indexing model can ensure the latest relevant TCMRs are indexed and retrieved in real time. The semantics-based expansion method and the multi-factor ranking model can enhance retrieval quality. The template-based visualization method can enhance availability and universality, with medical reports displayed via a friendly web interface. In conclusion, compared with current medical record retrieval systems, our system provides some advantages that are useful in improving the secondary use of large-scale traditional Chinese medical records in a cloud environment. The proposed system is more easily integrated with existing clinical systems and can be used in various scenarios. Copyright © 2017. Published by Elsevier Inc.

  4. Responses of Cloud Type Distributions to the Large-Scale Dynamical Circulation: Water Budget-Related Dynamical Phase Space and Dynamical Regimes

    Science.gov (United States)

    Wong, Sun; Del Genio, Anthony; Wang, Tao; Kahn, Brian; Fetzer, Eric J.; L'Ecuyer, Tristan S.

    2015-01-01

    Goals: Water budget-related dynamical phase space; Connect large-scale dynamical conditions to atmospheric water budget (including precipitation); Connect atmospheric water budget to cloud type distributions.

  5. A Super Voxel-Based Riemannian Graph for Multi-Scale Segmentation of LiDAR Point Clouds

    Science.gov (United States)

    Li, Minglei

    2018-04-01

    Automatically segmenting LiDAR points into respective independent partitions has become a topic of great importance in photogrammetry, remote sensing and computer vision. In this paper, we cast the problem of point cloud segmentation as a graph optimization problem by constructing a Riemannian graph. The scale space of the observed scene is explored by an octree-based over-segmentation with different depths. The over-segmentation produces many super voxels, which capture the structure of the scene and are used as nodes of the graph. The Kruskal coordinates are used to compute edge weights that are proportional to the geodesic distance between nodes. Then we compute the edge-weight matrix, in which the elements reflect the sectional curvatures associated with the geodesic paths between super voxel nodes on the scene surface. The final segmentation results are generated by clustering similar super voxels and cutting off the weak edges in the graph. The performance of this method was evaluated on LiDAR point clouds of both indoor and outdoor scenes. Additionally, extensive comparisons to state-of-the-art techniques show that our algorithm outperforms them on many metrics.
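    The final clustering step, cutting weak edges and grouping the remaining connected super voxels, can be illustrated with a small union-find pass. In this toy sketch, Euclidean distances between synthetic "super voxel" centroids stand in for the geodesic/curvature-based edge weights of the paper, and the cut threshold is an arbitrary assumption.

    ```python
    import numpy as np

    # Toy "super voxel" centroids sampled from two separate surfaces.
    rng = np.random.default_rng(1)
    pts = np.vstack([
        rng.normal(0.0, 0.1, size=(30, 3)),   # surface A
        rng.normal(3.0, 0.1, size=(30, 3)),   # surface B
    ])
    n = len(pts)
    dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)

    # Union-find over strong edges: weak (long) edges are cut, and the
    # remaining connected components become the segments.
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    CUT = 1.0                               # arbitrary weak-edge threshold
    for i in range(n):
        for j in range(i + 1, n):
            if dist[i, j] < CUT:            # keep strong edges only
                parent[find(i)] = find(j)

    labels = [find(i) for i in range(n)]
    n_segments = len(set(labels))           # the two surfaces emerge as two segments
    ```

    The real method replaces the distance threshold with edge weights derived from Kruskal coordinates and sectional curvature, but the cut-then-cluster structure is the same.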

  6. Development of a Standardized Methodology for the Use of COSI-Corr Sub-Pixel Image Correlation to Determine Surface Deformation Patterns in Large Magnitude Earthquakes.

    Science.gov (United States)

    Milliner, C. W. D.; Dolan, J. F.; Hollingsworth, J.; Leprince, S.; Ayoub, F.

    2014-12-01

    Coseismic surface deformation is typically measured in the field by geologists and with a range of geophysical methods such as InSAR, LiDAR and GPS. Current methods, however, either fail to capture the near-field coseismic surface deformation pattern where vital information is needed, or lack pre-event data. We develop a standardized and reproducible methodology to fully constrain the near-field coseismic surface deformation pattern at high resolution using aerial photography. We apply our methodology using the program COSI-Corr to successfully cross-correlate pairs of aerial, optical imagery before and after the 1992 Mw 7.3 Landers and 1999 Mw 7.1 Hector Mine earthquakes. This technique allows measurement of the coseismic slip distribution and the magnitude and width of off-fault deformation with sub-pixel precision. It can be applied in a cost-effective manner to recent and historic earthquakes using archive aerial imagery. We also use synthetic tests to constrain and correct for the bias imposed on the result by the use of a sliding window during correlation. Correcting for artificial smearing of the tectonic signal allows us to robustly measure the fault zone width along a surface rupture. Furthermore, the synthetic tests have constrained for the first time the measurement precision and accuracy of estimated fault displacements and fault-zone width. Our methodology provides the unique ability to robustly understand the kinematics of surface faulting while at the same time accounting for both off-fault deformation and the measurement biases that typically complicate such data. For both earthquakes we find that our displacement measurements derived from cross-correlation are systematically larger than the field displacement measurements, indicating the presence of off-fault deformation. We show that the Landers and Hector Mine earthquakes accommodated 46% and 38% of displacement away from the main primary rupture as off-fault deformation, over a mean
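    The core operation, measuring the displacement between pre- and post-event images to sub-pixel precision, can be sketched with plain FFT phase correlation. This is a simplified stand-in for COSI-Corr (no sliding windows, resampling, or orthorectification); the whitened correlation surface of a pure shift is a Dirichlet kernel centred on the true offset, so a two-sample sinc estimator refines the integer peak to a fractional one. The test image and the 0.30 / -0.25 pixel offsets are illustrative assumptions.

    ```python
    import numpy as np

    def fourier_shift(img, dy, dx):
        """Shift an image by a (sub-pixel) offset via the Fourier shift theorem."""
        ky = np.fft.fftfreq(img.shape[0])[:, None]
        kx = np.fft.fftfreq(img.shape[1])[None, :]
        phase = np.exp(-2j * np.pi * (ky * dy + kx * dx))
        return np.fft.ifft2(np.fft.fft2(img) * phase).real

    def subpixel_offset(ref, mov):
        """Phase correlation with sinc-based sub-pixel peak refinement."""
        cross = np.conj(np.fft.fft2(ref)) * np.fft.fft2(mov)
        corr = np.fft.ifft2(cross / np.maximum(np.abs(cross), 1e-12)).real
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        shift = []
        for ax, p in enumerate(peak):
            n = corr.shape[ax]
            idx = list(peak)
            idx[ax] = (p - 1) % n
            cm = corr[tuple(idx)]          # neighbour below the peak
            idx[ax] = (p + 1) % n
            cp = corr[tuple(idx)]          # neighbour above the peak
            c0 = corr[peak]
            # Two-sample sinc estimator: exact for an ideal Dirichlet peak.
            frac = cp / (c0 + cp) if cp > cm else -cm / (c0 + cm)
            s = p + frac
            shift.append(s - n if s > n / 2 else s)   # unwrap to signed offset
        return shift

    rng = np.random.default_rng(0)
    ref = rng.random((64, 64))
    mov = fourier_shift(ref, 0.30, -0.25)   # known synthetic "coseismic" displacement
    dy, dx = subpixel_offset(ref, mov)
    ```

    Applying this window by window across an image pair yields a dense displacement field, which is where the sliding-window smearing bias discussed in the abstract enters.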

  7. How important is biological ice nucleation in clouds on a global scale?

    International Nuclear Information System (INIS)

    Hoose, C; Kristjansson, J E; Burrows, S M

    2010-01-01

    The high ice nucleating ability of some biological particles has led to speculations about living and dead organisms being involved in cloud ice and precipitation formation, exerting a possibly significant influence on weather and climate. In the present study, the role of primary biological aerosol particles (PBAPs) as heterogeneous ice nuclei is investigated with a global model. Emission parametrizations for bacteria, fungal spores and pollen based on recent literature are introduced, as well as an immersion freezing parametrization based on classical nucleation theory and laboratory measurements. The simulated contribution of PBAPs to the global average ice nucleation rate is only 10⁻⁵ %, with an uppermost estimate of 0.6%. At the same time, observed PBAP concentrations in air and biological ice nucleus concentrations in snow are reasonably well captured by the model. This implies that 'bioprecipitation' processes (snow and rain initiated by PBAPs) are of minor importance on the global scale.

  8. COMPREHENSIVE COMPARISON OF TWO IMAGE-BASED POINT CLOUDS FROM AERIAL PHOTOS WITH AIRBORNE LIDAR FOR LARGE-SCALE MAPPING

    Directory of Open Access Journals (Sweden)

    E. Widyaningrum

    2017-09-01

    Full Text Available The integration of computer vision and photogrammetry to generate three-dimensional (3D) information from images has contributed to a wider use of point clouds for mapping purposes. Large-scale topographic map production requires 3D data with high precision and accuracy to represent the real conditions of the earth's surface. Apart from LiDAR point clouds, image-based matching is also believed to have the ability to generate reliable and detailed point clouds from multiple-view images. In order to examine and analyze possible fusion of LiDAR and image-based matching for large-scale detailed mapping purposes, point clouds are generated by Semi-Global Matching (SGM) and by Structure from Motion (SfM). In order to conduct a comprehensive and fair comparison, this study uses aerial photos and LiDAR data that were acquired at the same time. Qualitative and quantitative assessments have been applied to evaluate the LiDAR and image-matching point cloud data in terms of visualization, geometric accuracy, and classification result. The comparison results conclude that LiDAR is the best data for large-scale mapping.

  9. Automatic Scaling Hadoop in the Cloud for Efficient Process of Big Geospatial Data

    Directory of Open Access Journals (Sweden)

    Zhenlong Li

    2016-09-01

    Full Text Available Efficient processing of big geospatial data is crucial for tackling global and regional challenges such as climate change and natural disasters, but it is challenging not only due to the massive data volume but also due to the intrinsic complexity and high dimensions of the geospatial datasets. While traditional computing infrastructure does not scale well with the rapidly increasing data volume, Hadoop has attracted increasing attention in geoscience communities for handling big geospatial data. Recently, many studies were carried out to investigate adopting Hadoop for processing big geospatial data, but how to adjust the computing resources to efficiently handle the dynamic geoprocessing workload was barely explored. To bridge this gap, we propose a novel framework to automatically scale the Hadoop cluster in the cloud environment to allocate the right amount of computing resources based on the dynamic geoprocessing workload. The framework and auto-scaling algorithms are introduced, and a prototype system was developed to demonstrate the feasibility and efficiency of the proposed scaling mechanism using Digital Elevation Model (DEM) interpolation as an example. Experimental results show that this auto-scaling framework could (1) significantly reduce the computing resource utilization (by 80% in our example) while delivering similar performance as a full-powered cluster; and (2) effectively handle the spike processing workload by automatically increasing the computing resources to ensure the processing is finished within an acceptable time. Such an auto-scaling approach provides a valuable reference to optimize the performance of geospatial applications to address data- and computational-intensity challenges in GIScience in a more cost-efficient manner.
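    An auto-scaling rule of the kind the abstract describes can be sketched as a simple threshold policy. This is an illustrative sketch only: the paper's actual algorithm, thresholds and node limits are not given in the abstract, so every value and name below is an assumption.

```python
def scale_decision(pending_tasks, busy_nodes, total_nodes,
                   up_util=0.8, down_util=0.3, min_nodes=2, max_nodes=32):
    """Return the target cluster size for the next scaling step
    (toy threshold rule; all thresholds are illustrative assumptions)."""
    util = busy_nodes / total_nodes
    if util > up_util and pending_tasks > 0:
        # Spike workload: grow toward the backlog, capped at max_nodes.
        return min(max_nodes, total_nodes + max(1, pending_tasks // 2))
    if util < down_util:
        # Idle capacity: shrink, but keep one spare node and the floor.
        return max(min_nodes, busy_nodes + 1)
    return total_nodes
```

    A controller would evaluate this rule periodically and add or release cloud nodes to move the Hadoop cluster toward the returned target.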

  10. Large Scale Gaussian Processes for Atmospheric Parameter Retrieval and Cloud Screening

    Science.gov (United States)

    Camps-Valls, G.; Gomez-Chova, L.; Mateo, G.; Laparra, V.; Perez-Suay, A.; Munoz-Mari, J.

    2017-12-01

    Current Earth-observation (EO) applications for image classification have to deal with an unprecedented amount of heterogeneous and complex data sources. Spatio-temporally explicit classification methods are a requirement in a variety of Earth system data processing applications. Upcoming missions such as the super-spectral Copernicus Sentinels, EnMAP and FLEX will soon provide unprecedented data streams. Very high resolution (VHR) sensors like Worldview-3 also pose big challenges to data processing. The challenge is not only attached to optical sensors but also to infrared sounders and radar images, which have increased in spectral, spatial and temporal resolution. Besides, we should not forget the availability of the extremely large remote sensing data archives already collected by several past missions, such as ENVISAT, Cosmo-SkyMED, Landsat, SPOT, or Seviri/MSG. These large-scale data problems require enhanced processing techniques that should be accurate, robust and fast. Standard parameter retrieval and classification algorithms cannot cope with this new scenario efficiently. In this work, we review the field of large-scale kernel methods for both atmospheric parameter retrieval and cloud detection using infrared sounding IASI data and optical Seviri/MSG imagery. We propose novel Gaussian Processes (GPs) to train problems with millions of instances and a high number of input features. The algorithms cope with non-linearities efficiently, accommodate multi-output problems, and provide confidence intervals for the predictions. Several strategies to speed up the algorithms are devised: random Fourier features and variational approaches for cloud classification using IASI data and Seviri/MSG, and engineered randomized kernel functions and emulation in temperature, moisture and ozone atmospheric profile retrieval from IASI as a proxy to the upcoming MTG-IRS sensor. An excellent compromise between accuracy and scalability is obtained in all applications.
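    The random Fourier features mentioned in the abstract are a standard construction (Rahimi and Recht) for scaling kernel methods to millions of instances: an explicit feature map whose inner product approximates the RBF kernel. The sketch below is that generic construction, not the authors' engineered variant; function name and defaults are illustrative.

```python
import numpy as np

def random_fourier_features(X, n_features=500, lengthscale=1.0, seed=0):
    """Map X of shape (n, d) to features Z of shape (n, D) so that
    Z @ Z.T approximates the RBF kernel exp(-||x - y||^2 / (2 * lengthscale^2))."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.standard_normal((d, n_features)) / lengthscale  # spectral samples
    b = rng.uniform(0.0, 2.0 * np.pi, n_features)           # random phases
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)
```

    With the explicit map, a GP or ridge regression trains in O(n D^2) instead of the O(n^3) cost of the exact kernel, which is what makes millions of instances tractable.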

  11. Large Scale Monte Carlo Simulation of Neutrino Interactions Using the Open Science Grid and Commercial Clouds

    International Nuclear Information System (INIS)

    Norman, A.; Boyd, J.; Davies, G.; Flumerfelt, E.; Herner, K.; Mayer, N.; Mhashilhar, P.; Tamsett, M.; Timm, S.

    2015-01-01

    Modern long-baseline neutrino experiments, like the NOvA experiment at Fermilab, require large-scale, compute-intensive simulations of their neutrino beam fluxes and backgrounds induced by cosmic rays. The amount of simulation required to keep the systematic uncertainties in the simulation from dominating the final physics results is often 10x to 100x that of the actual detector exposure. For the first physics results from NOvA this has meant the simulation of more than 2 billion cosmic ray events in the far detector and more than 200 million NuMI beam spill simulations. Performing simulation at these high statistics levels has been made possible for NOvA through the use of the Open Science Grid and through large-scale runs on commercial clouds like Amazon EC2. We detail the challenges in performing large-scale simulation in these environments and how the computing infrastructure for the NOvA experiment has been adapted to seamlessly support the running of different simulation and data processing tasks on these resources. (paper)

  12. Advancing Clouds Lifecycle Representation in Numerical Models Using Innovative Analysis Methods that Bridge ARM Observations and Models Over a Breadth of Scales

    Energy Technology Data Exchange (ETDEWEB)

    Kollias, Pavlos [McGill Univ., Montreal, QC (Canada)]

    2016-09-06

    This is the final report for DE-SC0007096 - Advancing Clouds Lifecycle Representation in Numerical Models Using Innovative Analysis Methods that Bridge ARM Observations and Models Over a Breadth of Scales - PI: Pavlos Kollias. It outlines the main findings of the research conducted under the aforementioned award in the area of cloud research, from the cloud scale (10-100 m) to the mesoscale (20-50 km).

  13. Implications of Warm Rain in Shallow Cumulus and Congestus Clouds for Large-Scale Circulations

    Science.gov (United States)

    Nuijens, Louise; Emanuel, Kerry; Masunaga, Hirohiko; L'Ecuyer, Tristan

    2017-11-01

    Space-borne observations reveal that 20-40% of marine convective clouds below the freezing level produce rain. In this paper we speculate what the prevalence of warm rain might imply for convection and large-scale circulations over tropical oceans. We present results using a two-column radiative-convective model of hydrostatic, nonlinear flow on a non-rotating sphere, with parameterized convection and radiation, and review ongoing efforts in high-resolution modeling and observations of warm rain. The model experiments investigate the response of convection and circulation to sea surface temperature (SST) gradients between the columns and to changes in a parameter that controls the conversion of cloud condensate to rain. Convection over the cold ocean collapses to a shallow mode with tops near 850 hPa, but a congestus mode with tops near 600 hPa can develop at small SST differences when warm rain formation is more efficient. Here, interactive radiation and the response of the circulation are crucial: along with congestus a deeper moist layer develops, which leads to less low-level radiative cooling, a smaller buoyancy gradient between the columns, and therefore a weaker circulation and less subsidence over the cold ocean. The congestus mode is accompanied by more surface precipitation in the subsiding column and less surface precipitation in the deep convecting column. For the shallow mode over colder oceans, circulations also weaken with more efficient warm rain formation, but only marginally. Here, more warm rain reduces convective tops and the boundary layer depth—similar to Large-Eddy Simulation (LES) studies—which reduces the integrated buoyancy gradient. Elucidating the impact of warm rain can benefit from large-domain high-resolution simulations and observations. Parameterizations of warm rain may be constrained through collocated cloud and rain profiling from ground, and concurrent changes in convection and rain in subsiding and convecting branches of

  14. A parsec-scale optical jet from a massive young star in the Large Magellanic Cloud

    Science.gov (United States)

    McLeod, Anna F.; Reiter, Megan; Kuiper, Rolf; Klaassen, Pamela D.; Evans, Christopher J.

    2018-02-01

    Highly collimated parsec-scale jets, which are generally linked to the presence of an accretion disk, are commonly observed in low-mass young stellar objects. In the past two decades, a few of these jets have been directly (or indirectly) observed from higher-mass (larger than eight solar masses) young stellar objects, adding to the growing evidence that disk-mediated accretion also occurs in high-mass stars, the formation mechanism of which is still poorly understood. Of the observed jets from massive young stars, none is in the optical regime (massive young stars are typically highly obscured by their natal material), and none is found outside of the Milky Way. Here we report observations of HH 1177, an optical ionized jet that originates from a massive young stellar object located in the Large Magellanic Cloud. The jet is highly collimated over its entire measured length of at least ten parsecs and has a bipolar geometry. The presence of a jet indicates ongoing, disk-mediated accretion and, together with the high degree of collimation, implies that this system is probably formed through a scaled-up version of the formation mechanism of low-mass stars. We conclude that the physics that govern jet launching and collimation is independent of stellar mass.

  15. A parsec-scale optical jet from a massive young star in the Large Magellanic Cloud.

    Science.gov (United States)

    McLeod, Anna F; Reiter, Megan; Kuiper, Rolf; Klaassen, Pamela D; Evans, Christopher J

    2018-02-15

    Highly collimated parsec-scale jets, which are generally linked to the presence of an accretion disk, are commonly observed in low-mass young stellar objects. In the past two decades, a few of these jets have been directly (or indirectly) observed from higher-mass (larger than eight solar masses) young stellar objects, adding to the growing evidence that disk-mediated accretion also occurs in high-mass stars, the formation mechanism of which is still poorly understood. Of the observed jets from massive young stars, none is in the optical regime (massive young stars are typically highly obscured by their natal material), and none is found outside of the Milky Way. Here we report observations of HH 1177, an optical ionized jet that originates from a massive young stellar object located in the Large Magellanic Cloud. The jet is highly collimated over its entire measured length of at least ten parsecs and has a bipolar geometry. The presence of a jet indicates ongoing, disk-mediated accretion and, together with the high degree of collimation, implies that this system is probably formed through a scaled-up version of the formation mechanism of low-mass stars. We conclude that the physics that govern jet launching and collimation is independent of stellar mass.

  16. Quantification of waves in lidar observations of noctilucent clouds at scales from seconds to minutes

    Science.gov (United States)

    Kaifler, N.; Baumgarten, G.; Fiedler, J.; Lübken, F.-J.

    2013-12-01

    We present small-scale structures and waves observed in noctilucent clouds (NLC) by lidar at an unprecedented temporal resolution of 30 s or less. The measurements were taken with the Rayleigh/Mie/Raman lidar at the ALOMAR observatory in northern Norway (69° N) in the years 2008-2011. We find multiple-layer NLC 7.9% of the time for a brightness threshold of δβ = 12 × 10⁻¹⁰ m⁻¹ sr⁻¹. In comparison to 10 min averaged data, the 30 s dataset shows considerably more structure. For limited periods, quasi-monochromatic waves in NLC altitude variations are common, in accord with ground-based NLC imagery. For the combined dataset, on the other hand, we do not find preferred periods but rather significant periods at all timescales observed (1 min to 1 h). Typical wave amplitudes in the layer vertical displacements are 0.2 km, with maximum amplitudes up to 2.3 km. Average spectral slopes of temporal altitude and brightness variations are -2.01 ± 0.25 for centroid altitude, -1.41 ± 0.24 for peak brightness and -1.73 ± 0.25 for integrated brightness. Evaluating a new single-pulse detection system, we observe altitude variations of 70 s period and spectral slopes down to a scale of 10 s. We evaluate the suitability of NLC parameters as tracers for gravity waves.
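    The spectral slopes quoted above are obtained by fitting a power law to the power spectrum of a time series in log-log space. A minimal numpy sketch of that estimator (a generic illustration, not the authors' exact processing chain; the function name is assumed):

```python
import numpy as np

def spectral_slope(series, dt):
    """Least-squares slope of the power spectrum in log-log space, i.e. the
    exponent of an assumed power-law PSD ~ f**slope for a series sampled
    at interval dt (seconds)."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()
    psd = np.abs(np.fft.rfft(x)) ** 2
    freq = np.fft.rfftfreq(x.size, d=dt)
    keep = freq > 0.0  # drop the zero frequency before taking logs
    slope, _ = np.polyfit(np.log10(freq[keep]), np.log10(psd[keep]), 1)
    return slope
```

    Applied to the 30 s centroid-altitude series, such a fit yields the ~-2 slopes reported for NLC altitude variations.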

  17. Quantification of waves in lidar observations of noctilucent clouds at scales from seconds to minutes

    Directory of Open Access Journals (Sweden)

    N. Kaifler

    2013-12-01

    Full Text Available We present small-scale structures and waves observed in noctilucent clouds (NLC) by lidar at an unprecedented temporal resolution of 30 s or less. The measurements were taken with the Rayleigh/Mie/Raman lidar at the ALOMAR observatory in northern Norway (69° N) in the years 2008–2011. We find multiple-layer NLC 7.9% of the time for a brightness threshold of δβ = 12 × 10⁻¹⁰ m⁻¹ sr⁻¹. In comparison to 10 min averaged data, the 30 s dataset shows considerably more structure. For limited periods, quasi-monochromatic waves in NLC altitude variations are common, in accord with ground-based NLC imagery. For the combined dataset, on the other hand, we do not find preferred periods but rather significant periods at all timescales observed (1 min to 1 h). Typical wave amplitudes in the layer vertical displacements are 0.2 km, with maximum amplitudes up to 2.3 km. Average spectral slopes of temporal altitude and brightness variations are −2.01 ± 0.25 for centroid altitude, −1.41 ± 0.24 for peak brightness and −1.73 ± 0.25 for integrated brightness. Evaluating a new single-pulse detection system, we observe altitude variations of 70 s period and spectral slopes down to a scale of 10 s. We evaluate the suitability of NLC parameters as tracers for gravity waves.

  18. Production of lightning NOx and its vertical distribution calculated from three-dimensional cloud-scale chemical transport model simulations

    KAUST Repository

    Ott, Lesley E.

    2010-02-18

    A three-dimensional (3-D) cloud-scale chemical transport model that includes a parameterized source of lightning NOx on the basis of observed flash rates has been used to simulate six midlatitude and subtropical thunderstorms observed during four field projects. Production per intracloud (PIC) and cloud-to-ground (PCG) flash is estimated by assuming various values of PIC and PCG for each storm and determining which production scenario yields NOx mixing ratios that compare most favorably with in-cloud aircraft observations. We obtain a mean PCG value of 500 moles NO (7 kg N) per flash. The results of this analysis also suggest that on average, PIC may be nearly equal to PCG, which is contrary to the common assumption that intracloud flashes are significantly less productive of NO than are cloud-to-ground flashes. This study also presents vertical profiles of the mass of lightning NOx after convection based on 3-D cloud-scale model simulations. The results suggest that following convection, a large percentage of lightning NOx remains in the middle and upper troposphere where it originated, while only a small percentage is found near the surface. The results of this work differ from profiles calculated from 2-D cloud-scale model simulations with a simpler lightning parameterization that were peaked near the surface and in the upper troposphere (referred to as a “C-shaped” profile). The new model results (a backward C-shaped profile) suggest that chemical transport models that assume a C-shaped vertical profile of lightning NOx mass may place too much mass near the surface and too little in the middle troposphere.
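    The per-flash figure quoted above (500 moles of NO, 7 kg N) is a direct unit conversion: each mole of NO carries one mole of nitrogen atoms. A one-line check:

```python
# Each mole of NO contains one mole of N atoms.
MOLAR_MASS_N = 14.007  # g/mol

def moles_no_to_kg_n(moles_no):
    """Convert a per-flash NO production in moles to kilograms of nitrogen."""
    return moles_no * MOLAR_MASS_N / 1000.0
```

    So 500 moles of NO corresponds to about 7 kg of nitrogen per flash, matching the value in the abstract.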

  19. Automated Reconstruction of Building LoDs from Airborne LiDAR Point Clouds Using an Improved Morphological Scale Space

    Directory of Open Access Journals (Sweden)

    Bisheng Yang

    2016-12-01

    Full Text Available Reconstructing building models at different levels of detail (LoDs) from airborne laser scanning point clouds is urgently needed for wide application, as this method can balance the user's requirements and economic costs. Previous methods reconstruct building LoDs from the finest 3D building models rather than from point clouds, resulting in heavy costs and inflexible adaptivity. The scale space is a sound theory for the multi-scale representation of an object from a coarser level to a finer level. Therefore, this paper proposes a novel method to reconstruct buildings at different LoDs from airborne Light Detection and Ranging (LiDAR) point clouds based on an improved morphological scale space. The proposed method first extracts building candidate regions following the separation of ground and non-ground points. For each building candidate region, the proposed method generates a scale space by iteratively applying the improved morphological reconstruction with increasing scale, and constructs the corresponding topological relationship graphs (TRGs) across scales. Secondly, the proposed method robustly extracts building points by using features based on the TRG. Finally, the proposed method reconstructs each building at different LoDs according to the TRG. The experiments demonstrate that the proposed method robustly extracts buildings with details (e.g., door eaves and roof furniture) and illustrates good performance in distinguishing buildings from vegetation or other objects, while automatically reconstructing building LoDs from the finest building points.

  20. Finding Tropical Cyclones on a Cloud Computing Cluster: Using Parallel Virtualization for Large-Scale Climate Simulation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hasenkamp, Daren; Sim, Alexander; Wehner, Michael; Wu, Kesheng

    2010-09-30

    Extensive computing power has been used to tackle issues such as climate change, fusion energy, and other pressing scientific challenges. These computations produce a tremendous amount of data; however, many of the data analysis programs currently run on only a single processor. In this work, we explore the possibility of using the emerging cloud computing platform to parallelize such sequential data analysis tasks. As a proof of concept, we wrap a program for analyzing trends of tropical cyclones in a set of virtual machines (VMs). This approach allows the user to keep their familiar data analysis environment in the VMs, while we provide the coordination and data transfer services to ensure the necessary input and output are directed to the desired locations. This work extensively exercises the networking capability of the cloud computing systems and has revealed a number of weaknesses in the current cloud system software. In our tests, we are able to scale the parallel data analysis job to a modest number of VMs and achieve a speedup that is comparable to running the same analysis task using MPI. However, compared to MPI-based parallelization, the cloud-based approach has a number of advantages. The cloud-based approach is more flexible because the VMs can capture arbitrary software dependencies without requiring the user to rewrite their programs. The cloud-based approach is also more resilient to failure: as long as a single VM is running it can make progress, whereas the whole MPI analysis job fails as soon as one MPI node fails. In short, this initial work demonstrates that a cloud computing system is a viable platform for distributed scientific data analyses traditionally conducted on dedicated supercomputing systems.

  1. Finding Tropical Cyclones on a Cloud Computing Cluster: Using Parallel Virtualization for Large-Scale Climate Simulation Analysis

    International Nuclear Information System (INIS)

    Hasenkamp, Daren; Sim, Alexander; Wehner, Michael; Wu, Kesheng

    2010-01-01

    Extensive computing power has been used to tackle issues such as climate change, fusion energy, and other pressing scientific challenges. These computations produce a tremendous amount of data; however, many of the data analysis programs currently run on only a single processor. In this work, we explore the possibility of using the emerging cloud computing platform to parallelize such sequential data analysis tasks. As a proof of concept, we wrap a program for analyzing trends of tropical cyclones in a set of virtual machines (VMs). This approach allows the user to keep their familiar data analysis environment in the VMs, while we provide the coordination and data transfer services to ensure the necessary input and output are directed to the desired locations. This work extensively exercises the networking capability of the cloud computing systems and has revealed a number of weaknesses in the current cloud system software. In our tests, we are able to scale the parallel data analysis job to a modest number of VMs and achieve a speedup that is comparable to running the same analysis task using MPI. However, compared to MPI-based parallelization, the cloud-based approach has a number of advantages. The cloud-based approach is more flexible because the VMs can capture arbitrary software dependencies without requiring the user to rewrite their programs. The cloud-based approach is also more resilient to failure: as long as a single VM is running it can make progress, whereas the whole MPI analysis job fails as soon as one MPI node fails. In short, this initial work demonstrates that a cloud computing system is a viable platform for distributed scientific data analyses traditionally conducted on dedicated supercomputing systems.

  2. On unravelling mechanism of interplay between cloud and large scale circulation: a grey area in climate science

    Science.gov (United States)

    De, S.; Agarwal, N. K.; Hazra, Anupam; Chaudhari, Hemantkumar S.; Sahai, A. K.

    2018-04-01

    The interaction between cloud and large-scale circulation is a much less explored area in climate science. Unfolding the mechanism of coupling between these two parameters is imperative for improved simulation of the Indian summer monsoon (ISM) and for reducing imprecision in the climate sensitivity of global climate models. This work explores this mechanism with CFSv2 climate model experiments whose cloud has been modified by changing the critical relative humidity (CRH) profile of the model during the ISM. The study reveals that the variable CRH in CFSv2 improves the nonlinear interactions between high- and low-frequency oscillations in the wind field (revealed as the internal dynamics of the monsoon) and realistically modulates the spatial distribution of interactions over the Indian landmass during contrasting monsoon seasons, compared to the existing CRH profile of CFSv2. The lower-tropospheric wind error energy in the variable-CRH simulation of CFSv2 appears to be at a minimum, due to the reduced nonlinear convergence of error into the planetary-scale range from the long and synoptic scales (another facet of internal dynamics), compared to the other CRH experiments in normal and deficient monsoons. Hence, the interplay between cloud and large-scale circulation through CRH may be manifested as a change in the internal dynamics of the ISM, revealed through scale-interactive quasi-linear and nonlinear kinetic energy exchanges in the frequency as well as the wavenumber domain during the monsoon period, which eventually modify the internal variance of the CFSv2 model. Conversely, the reduced wind bias and proper modulation of the spatial distribution of scale interaction between the synoptic and low-frequency oscillations improve the eastward and northward extent of water vapour flux over the Indian landmass, which in turn feeds back into the realistic simulation of cloud condensates, contributing to improved ISM rainfall in CFSv2.

  3. The Radiative Properties of Small Clouds: Multi-Scale Observations and Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Feingold, Graham [NOAA ESRL; McComiskey, Allison [CIRES, University of Colorado

    2013-09-25

    Warm, liquid clouds and their representation in climate models continue to represent one of the most significant unknowns in climate sensitivity and climate change. Our project combines ARM observations, LES modeling, and satellite imagery to characterize shallow clouds and the role of aerosol in modifying their radiative effects.

  4. Dynamic Voltage Frequency Scaling Simulator for Real Workflows Energy-Aware Management in Green Cloud Computing.

    Science.gov (United States)

    Cotes-Ruiz, Iván Tomás; Prado, Rocío P; García-Galán, Sebastián; Muñoz-Expósito, José Enrique; Ruiz-Reyes, Nicolás

    2017-01-01

    Nowadays, the growing computational capabilities of Cloud systems rely on the reduction of the consumed power of their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center, and of special relevance is the adaptation of its performance to workload. Intensive computing applications in diverse areas of science generate complex workloads called workflows, whose successful management in terms of energy saving is still in its infancy. WorkflowSim is currently one of the most advanced simulators for research on workflow processing, offering advanced features such as task clustering and failure policies. In this work, an expected power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new management strategies in energy saving considering computing, reconfiguration and network costs as well as quality of service, and it incorporates the preeminent strategy for on-host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent in different real scenarios and to include a wide repertory of DVFS governors. Results showing the validity of the simulator in terms of resource utilization, frequency and voltage scaling, power, energy and time saving are presented. Also, results achieved by the intra-host DVFS strategy with different governors are compared to those of the data center using a recent and successful DVFS-based inter-host scheduling strategy as an overlapped mechanism to the DVFS intra-host technique.
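    The energy leverage that DVFS exploits can be illustrated with the standard CMOS dynamic-power model, P = C·V²·f. This is the textbook model, not WorkflowSim's exact power model; the function name and the capacitance value are illustrative assumptions.

```python
def dynamic_energy(cycles, voltage, freq, capacitance=1e-9):
    """Dynamic energy (J) for a task of `cycles` CPU cycles at a DVFS
    operating point: P = C * V**2 * f and runtime = cycles / f, so
    E = C * V**2 * cycles, independent of f for purely dynamic power."""
    power = capacitance * voltage ** 2 * freq  # W
    runtime = cycles / freq                    # s
    return power * runtime
```

    Because E scales with V², halving the supply voltage quarters the dynamic energy of a fixed-work task, which is why governors lower voltage together with frequency whenever the workload allows it.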

  5. Dynamic Voltage Frequency Scaling Simulator for Real Workflows Energy-Aware Management in Green Cloud Computing.

    Directory of Open Access Journals (Sweden)

    Iván Tomás Cotes-Ruiz

    Full Text Available Nowadays, the growing computational capabilities of Cloud systems rely on the reduction of the consumed power of their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center, and of special relevance is the adaptation of its performance to workload. Intensive computing applications in diverse areas of science generate complex workloads called workflows, whose successful management in terms of energy saving is still in its infancy. WorkflowSim is currently one of the most advanced simulators for research on workflow processing, offering advanced features such as task clustering and failure policies. In this work, an expected power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new management strategies in energy saving considering computing, reconfiguration and network costs as well as quality of service, and it incorporates the preeminent strategy for on-host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent in different real scenarios and to include a wide repertory of DVFS governors. Results showing the validity of the simulator in terms of resource utilization, frequency and voltage scaling, power, energy and time saving are presented. Also, results achieved by the intra-host DVFS strategy with different governors are compared to those of the data center using a recent and successful DVFS-based inter-host scheduling strategy as an overlapped mechanism to the DVFS intra-host technique.

  6. A case study on large-scale dynamical influence on bright band using cloud radar during the Indian summer monsoon

    Science.gov (United States)

    Jha, Ambuj K.; Kalapureddy, M. C. R.; Devisetty, Hari Krishna; Deshpande, Sachin M.; Pandithurai, G.

    2018-02-01

    The present study is a first-of-its-kind attempt at exploring the physical features (e.g., height, width, intensity, duration) of the tropical Indian bright band using a Ka-band cloud radar under the influence of large-scale cyclonic circulation, and attempts to explain, synoptically as well as locally, the abrupt changes in bright band features, viz., a rise in the bright band height by 430 m and a deepening of the bright band by about 300 m observed at around 14:00 UTC on Sep 14, 2016. The study extends the utility of cloud radar to understand how the bright band features are associated with light precipitation, ranging from 0 to 1.5 mm/h. Our analysis of the precipitation event of Sep 14-15, 2016 shows that a bright band above (below) 3.7 km with thickness less (more) than 300 m can potentially lead to light drizzle of 0-0.25 mm/h (drizzle/light rain) at the surface. It is also seen that the cloud radar may be more suitable for bright band study within light drizzle limits than under higher rain conditions. Further, the study illustrates that the bright band features can be determined using the polarimetric capability of the cloud radar. It is shown that an LDR value of −22 dB can be associated with the top height of the bright band in the Ka-band observations, which is useful in the extraction of the bright band top height and its width. This study is useful for understanding the bright band phenomenon and could be potentially useful in establishing the bright band-surface rain relationship through the perspective of a cloud radar, which would be helpful to enhance cloud radar-based quantitative estimates of precipitation.

  7. Comparison of prestellar core elongations and large-scale molecular cloud structures in the Lupus I region

    Energy Technology Data Exchange (ETDEWEB)

    Poidevin, Frédérick [UCL, KLB, Department of Physics and Astronomy, Gower Place, London WC1E 6BT (United Kingdom); Ade, Peter A. R.; Hargrave, Peter C.; Nutter, David [School of Physics and Astronomy, Cardiff University, Queens Buildings, The Parade, Cardiff CF24 3AA (United Kingdom); Angile, Francesco E.; Devlin, Mark J.; Klein, Jeffrey [Department of Physics and Astronomy, University of Pennsylvania, 209 South 33rd Street, Philadelphia, PA 19104 (United States); Benton, Steven J.; Netterfield, Calvin B. [Department of Physics, University of Toronto, 60 St. George Street, Toronto, ON M5S 1A7 (Canada); Chapin, Edward L. [XMM SOC, ESAC, Apartado 78, E-28691 Villanueva de la Cañada, Madrid (Spain); Fissel, Laura M.; Gandilo, Natalie N. [Department of Astronomy and Astrophysics, University of Toronto, 50 St. George Street, Toronto, ON M5S 3H4 (Canada); Fukui, Yasuo [Department of Physics, Nagoya University, Chikusa-ku, Nagoya, Aichi 464-8601 (Japan); Gundersen, Joshua O. [Department of Physics, University of Miami, 1320 Campo Sano Drive, Coral Gables, FL 33146 (United States); Korotkov, Andrei L. [Department of Physics, Brown University, 182 Hope Street, Providence, RI 02912 (United States); Matthews, Tristan G.; Novak, Giles [Department of Physics and Astronomy, Northwestern University, 2145 Sheridan Road, Evanston, IL 60208 (United States); Moncelsi, Lorenzo; Mroczkowski, Tony K. [California Institute of Technology, 1200 East California Boulevard, Pasadena, CA 91125 (United States); Olmi, Luca, E-mail: fpoidevin@iac.es [Physics Department, University of Puerto Rico, Rio Piedras Campus, Box 23343, UPR station, San Juan, PR 00931 (United States); and others

    2014-08-10

    Turbulence and magnetic fields are expected to be important for regulating molecular cloud formation and evolution. However, their effects on sub-parsec to 100 parsec scales, leading to the formation of starless cores, are not well understood. We investigate the prestellar core structure morphologies obtained from analysis of the Herschel-SPIRE 350 μm maps of the Lupus I cloud. This distribution is first compared on a statistical basis to the large-scale shape of the main filament. We find the distribution of the elongation position angle of the cores to be consistent with a random distribution, which means no specific orientation of the morphology of the cores is observed with respect to the mean orientation of the large-scale filament in Lupus I, nor relative to a large-scale bent filament model. This distribution is also compared to the mean orientation of the large-scale magnetic fields probed at 350 μm with the Balloon-borne Large Aperture Telescope for Polarimetry during its 2010 campaign. Here again we do not find any correlation between the core morphology distribution and the average orientation of the magnetic fields on parsec scales. Our main conclusion is that the local filament dynamics—including secondary filaments that often run orthogonally to the primary filament—and possibly small-scale variations in the local magnetic field direction, could be the dominant factors for explaining the final orientation of each core.

  8. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope are hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  9. Flower elliptical constellation of millimeter-wave radiometers for precipitating cloud monitoring at geostationary scale

    Science.gov (United States)

    Marzano, F. S.; Cimini, D.; Montopoli, M.; Rossi, T.; Mortari, D.; di Michele, S.; Bauer, P.

    2009-04-01

    Millimeter-wave observation of atmospheric parameters is becoming an appealing goal within satellite radiometry applications. The major technological advantage of millimeter-wave (MMW) radiometers is the reduced size of the overall system, for given performance, with respect to microwave sensors. Moreover, millimeter-wave sounding can exploit window frequencies and various gaseous absorption bands at 50/60 GHz, 118 GHz and 183 GHz. These bands can be used to estimate tropospheric temperature profiles, integrated water vapor and cloud liquid content and, using a differential spectral mode, light rainfall and snowfall. Millimeter-wave radiometers, for given observation conditions, can also exhibit relatively small fields-of-view (FOVs), of the order of some kilometers for low-Earth-orbit (LEO) satellites. However, the temporal resolution of LEO millimeter-wave observations remains a major drawback with respect to geostationary-Earth-orbit (GEO) satellites. An overpass roughly every 12 hours for a single LEO platform (conditioned on a sufficiently large swath of the scanning MMW radiometer) is too infrequent compared with the typical temporal scales of variation of atmospheric fields. This cannot be improved by resorting to GEO platforms, due to their high orbit altitude and the consequent degradation of the MMW-sensor FOVs. A way to tackle this impasse is to turn to the regional scale and to focus non-circular orbits over the area of interest, exploiting the concept of micro-satellite flower constellations. Flower Constellations (FCs) are a general class of elliptical orbits which can be tuned, through genetic algorithms, to optimize the revisit time and the orbital height while ensuring a repeating ground track. The constellation concept nicely matches the choice of mini-satellites as a baseline, due to their small size, weight (less than 500 kilograms) and relatively low cost (essential when

  10. Hydrologic scales, cloud variability, remote sensing, and models: Implications for forecasting snowmelt and streamflow

    Science.gov (United States)

    Simpson, James J.; Dettinger, M.D.; Gehrke, F.; McIntire, T.J.; Hufford, Gary L.

    2004-01-01

    Accurate prediction of available water supply from snowmelt is needed if the myriad human, environmental, agricultural, and industrial demands for water are to be satisfied, especially given legislatively imposed conditions on its allocation. Robust retrievals of hydrologic basin model variables (e.g., insolation or areal extent of snow cover) provide several advantages over the current operational use of either point measurements or parameterizations to help meet this requirement. Insolation can be provided at hourly time scales (or better if needed during rapid melt events associated with flooding) and at 1-km spatial resolution. These satellite-based retrievals incorporate the effects of highly variable (in both space and time) and unpredictable cloud cover on estimates of insolation. The insolation estimates are further adjusted for the effects of basin topography using a high-resolution digital elevation model prior to model input. Simulations of two Sierra Nevada rivers in the snowmelt seasons of 1998 and 1999 indicate that even the simplest improvements in modeled insolation can improve snowmelt simulations, with 10%-20% reductions in root-mean-square errors. Direct retrieval of the areal extent of snow cover may mitigate the need to rely entirely on internal calculations of this variable, a reliance that can yield large errors that are difficult to correct until long after the season is complete and that often leads to persistent underestimates or overestimates of the volumes of water delivered to operational reservoirs. Agencies responsible for accurately predicting available water resources from the melt of snowpack [e.g., both federal (the National Weather Service River Forecast Centers) and state (the California Department of Water Resources)] can benefit by incorporating the concepts developed herein into their operational forecasting procedures. © 2004 American Meteorological Society.
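The topographic adjustment of insolation mentioned above is commonly done with the direct-beam correction cos(i)/cos(z), where i is the beam's incidence angle on the sloped facet and z is the solar zenith angle. The paper does not specify its exact scheme, so the sketch below shows only this standard formula (angles in radians):

```python
# Standard direct-beam topographic correction factor, cos(i)/cos(z).
# A generic formula, not necessarily the adjustment scheme used in the paper.
import math

def terrain_factor(solar_zen, solar_az, slope, aspect):
    """Ratio of direct-beam insolation on a tilted facet to a horizontal one."""
    cos_i = (math.cos(solar_zen) * math.cos(slope) +
             math.sin(solar_zen) * math.sin(slope) * math.cos(solar_az - aspect))
    return max(cos_i, 0.0) / math.cos(solar_zen)   # clip self-shadowed facets

# A sun-facing 20-degree slope receives ~1.53x the horizontal direct beam
# when the sun sits 60 degrees from zenith:
print(round(terrain_factor(math.radians(60), 0.0, math.radians(20), 0.0), 2))  # 1.53
```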

  11. An Investigation of the High Efficiency Estimation Approach of the Large-Scale Scattered Point Cloud Normal Vector

    Directory of Open Access Journals (Sweden)

    Xianglin Meng

    2018-03-01

    Full Text Available Normal vector estimation for the large-scale scattered point cloud (LSSPC) plays an important role in point-based shape editing. However, normal vector estimation for LSSPC cannot meet the great challenge of the sharp increase in point cloud size, mainly because of its low computational efficiency. In this paper, a novel, fast method based on bi-linear interpolation is reported for normal vector estimation of LSSPC. We divide the point set into many small cubes to speed up the local point search and construct interpolation nodes on the isosurface expressed by the point cloud. After calculating the normal vectors of these interpolation nodes, a bi-linear interpolation of the normal vectors of the points in each cube is realized. The proposed approach has the merits of accuracy, simplicity, and high efficiency, because the algorithm only needs to search neighbors and calculate normal vectors for the interpolation nodes, which are usually far fewer than the points in the cloud. Experimental results on several real and simulated point sets show that our method is over three times faster than the Elliptic Gabriel Graph-based method, with an average deviation of less than 0.01 mm.
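The cube-partitioning idea for fast neighbor search can be sketched as follows. This is a hedged illustration: it uses a standard covariance (PCA) normal at a query point and omits the paper's bi-linear interpolation step; the function names and cube size are assumptions:

```python
# Sketch of cube hashing for O(1) local neighbor lookup plus a conventional
# covariance (PCA) normal at a query node. The paper's bi-linear interpolation
# of node normals onto the full cloud is not reproduced here.
import numpy as np
from collections import defaultdict

def build_grid(points, cube):
    """Hash each point index into an integer cube key for fast local lookup."""
    grid = defaultdict(list)
    for i, p in enumerate(points):
        grid[tuple((p // cube).astype(int))].append(i)
    return grid

def neighbors(grid, q, cube):
    """Indices of points in the 3x3x3 block of cubes around query point q."""
    cx, cy, cz = (q // cube).astype(int)
    idx = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                idx.extend(grid.get((cx + dx, cy + dy, cz + dz), []))
    return idx

def normal_at(points, grid, q, cube):
    """Normal = eigenvector of the smallest eigenvalue of the local covariance."""
    nbrs = points[neighbors(grid, q, cube)]
    cov = np.cov((nbrs - nbrs.mean(axis=0)).T)
    w, v = np.linalg.eigh(cov)           # eigenvalues in ascending order
    return v[:, 0]

rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(0, 10, 500), rng.uniform(0, 10, 500), np.zeros(500)])
grid = build_grid(pts, cube=1.0)
n = normal_at(pts, grid, np.array([5.0, 5.0, 0.0]), cube=1.0)
print(np.round(np.abs(n), 3))            # ~ [0, 0, 1] for a planar cloud
```

The speed-up comes from the hash lookup: only the 27 cubes around a node are scanned, instead of the whole cloud.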

  12. Cloud-Top Entrainment in Stratocumulus Clouds

    Science.gov (United States)

    Mellado, Juan Pedro

    2017-01-01

    Cloud entrainment, the mixing between cloudy and clear air at the boundary of clouds, constitutes one paradigm for the relevance of small scales in the Earth system: By regulating cloud lifetimes, meter- and submeter-scale processes at cloud boundaries can influence planetary-scale properties. Understanding cloud entrainment is difficult given the complexity and diversity of the associated phenomena, which include turbulence entrainment within a stratified medium, convective instabilities driven by radiative and evaporative cooling, shear instabilities, and cloud microphysics. Obtaining accurate data at the required small scales is also challenging, for both simulations and measurements. During the past few decades, however, high-resolution simulations and measurements have greatly advanced our understanding of the main mechanisms controlling cloud entrainment. This article reviews some of these advances, focusing on stratocumulus clouds, and indicates remaining challenges.

  13. Improvement of Representation of the Cloud-Aerosol Interaction in Large-Scale Models

    Energy Technology Data Exchange (ETDEWEB)

    Khain, Alexander [Hebrew Univ. of Jerusalem (Israel); Phillips, Vaughan [Lund Univ. (Sweden); Pinsky, Mark [Hebrew Univ. of Jerusalem (Israel); Lynn, Barry [Hebrew Univ. of Jerusalem (Israel)

    2016-12-20

    The main achievements reached under DOE award DE-SC0006788 are described. The plan of the project has been completed, and unique results concerning cloud-aerosol interaction have been obtained. It is shown that aerosols affect the intensity of hurricanes. The effects of small aerosols on the formation of ice in the anvils of deep convective clouds are discovered, and for the first time the mechanisms of drizzle formation are identified and described quantitatively. The mechanisms of warm rain formation are clarified, and the dominant role of adiabatic processes and turbulence is stressed. Important results concerning the effects of sea spray on the intensity of clouds and tropical cyclones are obtained. A novel method for calculating hail formation has been developed and implemented.

  14. Outcrop-scale fracture trace identification using surface roughness derived from a high-density point cloud

    Science.gov (United States)

    Okyay, U.; Glennie, C. L.; Khan, S.

    2017-12-01

    Owing to the advent of terrestrial laser scanners (TLS), high-density point cloud data have become increasingly available to the geoscience research community. Research groups have started producing their own point clouds for various applications, gradually shifting their emphasis from obtaining the data towards extracting more meaningful information from the point clouds. Extracting fracture properties from three-dimensional data in a (semi-)automated manner has been an active area of research in the geosciences. Several studies have developed processing algorithms for extracting only planar surfaces. In comparison, (semi-)automated identification of fracture traces at the outcrop scale, which could be used for mapping fracture distributions, has been investigated less frequently. Understanding the spatial distribution and configuration of natural fractures is of particular importance, as they directly influence fluid flow through the host rock. Surface roughness, typically defined as the deviation of a natural surface from a reference datum, has become an important metric in geoscience research, especially with the increasing density and accuracy of point clouds. In the study presented herein, a surface roughness model was employed to identify fracture traces and their distribution on an ophiolite outcrop in Oman. Surface roughness calculations were performed using orthogonal distance regression over various grid intervals. The results demonstrate that surface roughness can identify outcrop-scale fracture traces from which fracture distribution and density maps can be generated. However, considering outcrop conditions and properties and the purpose of the application, the definition of an adequate grid interval for the surface roughness model and the selection of threshold values for distribution maps are not straightforward and require user intervention and interpretation.
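Grid-based roughness from an orthogonal-distance plane fit can be sketched as below; for a plane, orthogonal distance regression reduces to total least squares via SVD. The grid interval, offset, and data are placeholders, since the paper stresses that these choices require user interpretation:

```python
# Minimal sketch of per-cell surface roughness: fit a best-fit plane by total
# least squares (equivalent to orthogonal distance regression for a plane) and
# take roughness as the RMS orthogonal distance. All values are illustrative.
import numpy as np

def roughness(cell_points):
    """RMS orthogonal distance of points in one grid cell to their TLS plane."""
    centered = cell_points - cell_points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]                       # direction of least variance
    d = centered @ normal                 # signed orthogonal distances
    return float(np.sqrt(np.mean(d ** 2)))

# A flat cell vs. a cell crossed by a fracture-like step:
flat = np.array([[x, y, 0.0] for x in range(5) for y in range(5)], float)
step = flat.copy()
step[:, 2] = np.where(step[:, 0] >= 2.5, 0.05, 0.0)  # 5 cm offset = candidate trace
assert roughness(flat) < 1e-9 < roughness(step)       # step raises cell roughness
```

Mapping this cell-wise roughness over the outcrop is what turns local misfit into a fracture-trace indicator; cells crossing a discontinuity stand out against smooth bedrock.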

  15. Final Technical Report for "High-resolution global modeling of the effects of subgrid-scale clouds and turbulence on precipitating cloud systems"

    Energy Technology Data Exchange (ETDEWEB)

    Larson, Vincent [Univ. of Wisconsin, Milwaukee, WI (United States)

    2016-11-25

    The Multiscale Modeling Framework (MMF) embeds a cloud-resolving model in each grid column of a General Circulation Model (GCM). An MMF model does not need to use a deep convective parameterization, and thereby dispenses with the uncertainties in such parameterizations. However, MMF models grossly under-resolve shallow boundary-layer clouds, and hence those clouds may still benefit from parameterization. Under this grant, we successfully created a climate model that embeds a cloud parameterization (“CLUBB”) within an MMF model. This involved interfacing CLUBB’s clouds with microphysics and reducing computational cost. We have evaluated the resulting simulated clouds and precipitation against satellite observations. The chief benefit of the project is to provide an MMF model that has an improved representation of clouds and that provides improved simulations of precipitation.

  16. From Global to Cloud Resolving Scale: Experiments with a Scale- and Aerosol-Aware Physics Package and Impact on Tracer Transport

    Science.gov (United States)

    Grell, G. A.; Freitas, S. R.; Olson, J.; Bela, M.

    2017-12-01

    A summary of the latest cumulus parameterization modeling efforts at NOAA's Earth System Research Laboratory (ESRL) will be presented on both regional and global scales. The physics package includes a scale-aware parameterization of subgrid cloudiness feedback to radiation (coupled PBL, microphysics, radiation, shallow and congestus type convection), the stochastic Grell-Freitas (GF) scale- and aerosol-aware convective parameterization, and an aerosol-aware microphysics package. GF is based on a stochastic approach originally implemented by Grell and Devenyi (2002) and described in more detail in Grell and Freitas (2014, ACP). It was expanded to include PDFs for vertical mass flux, as well as modifications to improve the diurnal cycle. This physics package will be used on different scales, spanning global to cloud resolving, to examine the impact on scalar transport and numerical weather prediction.

  17. Threshold-based queuing system for performance analysis of cloud computing system with dynamic scaling

    International Nuclear Information System (INIS)

    Shorgin, Sergey Ya.; Pechinkin, Alexander V.; Samouylov, Konstantin E.; Gaidamaka, Yuliya V.; Gudkova, Irina A.; Sopin, Eduard S.

    2015-01-01

    Cloud computing is a promising technology to manage and improve the utilization of computing center resources and deliver various computing and IT services. For the purpose of energy saving, there is no need to operate many servers under light loads, and they are switched off. On the other hand, some servers should be switched on under heavy loads to prevent very long delays. Thus, waiting times and system operating cost can be maintained at an acceptable level by dynamically adding or removing servers. One more fact that should be taken into account is the significant server setup costs and activation times. For better energy efficiency, a cloud computing system should not react to instantaneous increases or decreases of load. That is the main motivation for using queuing systems with hysteresis for cloud computing system modelling. In the paper, we provide a model of a cloud computing system in terms of a multiple-server, threshold-based, infinite-capacity queuing system with hysteresis and non-instantaneous server activation. For the proposed model, we develop a method for computing steady-state probabilities that allows estimating a number of performance measures
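The hysteresis idea in the abstract, i.e. reacting to sustained rather than instantaneous load changes, can be illustrated with a toy control loop (not the paper's analytical queuing model); the thresholds and server limits below are arbitrary:

```python
# Toy illustration of hysteresis thresholds: a server is added only when the
# queue exceeds the upper threshold and removed only when it falls below the
# lower one, so brief load fluctuations inside the band do not toggle servers.

def scale(servers, queue_len, low, high, s_min=1, s_max=10):
    """Return the new server count after one control step."""
    if queue_len > high and servers < s_max:
        return servers + 1
    if queue_len < low and servers > s_min:
        return servers - 1
    return servers          # inside the hysteresis band: do nothing

servers, trace = 2, []
for q in [5, 12, 18, 25, 14, 9, 9, 3, 2]:        # queue lengths over time
    servers = scale(servers, q, low=4, high=15)
    trace.append(servers)
print(trace)    # [2, 2, 3, 4, 4, 4, 4, 3, 2] -- stable while load stays in the band
```

With a single threshold instead of a band, the counts for the middle samples would oscillate, which is exactly the wasteful switching (given setup costs and activation times) that hysteresis avoids.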

  18. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    International Nuclear Information System (INIS)

    Evans, D; Fisk, I; Holzman, B; Pordes, R; Tiradani, A; Melo, A; Sheldon, P; Metson, S

    2011-01-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly 'on-demand', as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a university, and we conclude that it is most cost-effective to purchase dedicated resources for the 'base-line' needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage occur.

  19. Threshold-based queuing system for performance analysis of cloud computing system with dynamic scaling

    Energy Technology Data Exchange (ETDEWEB)

    Shorgin, Sergey Ya.; Pechinkin, Alexander V. [Institute of Informatics Problems, Russian Academy of Sciences (Russian Federation); Samouylov, Konstantin E.; Gaidamaka, Yuliya V.; Gudkova, Irina A.; Sopin, Eduard S. [Telecommunication Systems Department, Peoples’ Friendship University of Russia (Russian Federation)

    2015-03-10

    Cloud computing is a promising technology to manage and improve the utilization of computing center resources and deliver various computing and IT services. For the purpose of energy saving, there is no need to operate many servers under light loads, and they are switched off. On the other hand, some servers should be switched on under heavy loads to prevent very long delays. Thus, waiting times and system operating cost can be maintained at an acceptable level by dynamically adding or removing servers. One more fact that should be taken into account is the significant server setup costs and activation times. For better energy efficiency, a cloud computing system should not react to instantaneous increases or decreases of load. That is the main motivation for using queuing systems with hysteresis for cloud computing system modelling. In the paper, we provide a model of a cloud computing system in terms of a multiple-server, threshold-based, infinite-capacity queuing system with hysteresis and non-instantaneous server activation. For the proposed model, we develop a method for computing steady-state probabilities that allows estimating a number of performance measures.

  20. Dense range images from sparse point clouds using multi-scale processing

    NARCIS (Netherlands)

    Do, Q.L.; Ma, L.; With, de P.H.N.

    2013-01-01

    Multi-modal data processing based on visual and depth/range images has become relevant in computer vision for 3D reconstruction applications such as city modeling, robot navigation, etc. In this paper, we generate high-accuracy dense range images from sparse point clouds to facilitate such

  1. The variability of tropical ice cloud properties as a function of the large-scale context from ground-based radar-lidar observations over Darwin, Australia

    Science.gov (United States)

    Protat, A.; Delanoë, J.; May, P. T.; Haynes, J.; Jakob, C.; O'Connor, E.; Pope, M.; Wheeler, M. C.

    2011-08-01

    The high complexity of cloud parameterizations now held in models puts more pressure on observational studies to provide useful means to evaluate them. One approach to the problem put forth in the modelling community is to evaluate under what atmospheric conditions the parameterizations fail to simulate the cloud properties and under what conditions they do a good job. It is the ambition of this paper to characterize the variability of the statistical properties of tropical ice clouds in different tropical "regimes" recently identified in the literature to aid the development of better process-oriented parameterizations in models. For this purpose, the statistical properties of non-precipitating tropical ice clouds over Darwin, Australia are characterized using ground-based radar-lidar observations from the Atmospheric Radiation Measurement (ARM) Program. The ice cloud properties analysed are the frequency of ice cloud occurrence, the morphological properties (cloud top height and thickness), and the microphysical and radiative properties (ice water content, visible extinction, effective radius, and total concentration). The variability of these tropical ice cloud properties is then studied as a function of the large-scale cloud regimes derived from the International Satellite Cloud Climatology Project (ISCCP), the amplitude and phase of the Madden-Julian Oscillation (MJO), and the large-scale atmospheric regime as derived from a long-term record of radiosonde observations over Darwin. The vertical variability of ice cloud occurrence and microphysical properties is largest in all regimes (typically 1.5 orders of magnitude for ice water content and extinction, a factor of 3 in effective radius, and three orders of magnitude in concentration). 98 % of ice clouds in our dataset are characterized by either a small cloud fraction (smaller than 0.3) or a very large cloud fraction (larger than 0.9). In the ice part of the troposphere three distinct layers characterized by

  2. SEMANTIC3D.NET: a New Large-Scale Point Cloud Classification Benchmark

    Science.gov (United States)

    Hackel, T.; Savinov, N.; Ladicky, L.; Wegner, J. D.; Schindler, K.; Pollefeys, M.

    2017-05-01

    This paper presents a new 3D point cloud classification benchmark data set with over four billion manually labelled points, meant as input for data-hungry (deep) learning methods. We also discuss first submissions to the benchmark that use deep convolutional neural networks (CNNs) as a workhorse, which already show remarkable performance improvements over the state of the art. CNNs have become the de-facto standard for many tasks in computer vision and machine learning, like semantic segmentation or object detection in images, but have not yet led to a true breakthrough for 3D point cloud labelling tasks due to lack of training data. With the massive data set presented in this paper, we aim at closing this data gap to help unleash the full potential of deep learning methods for 3D labelling tasks. Our semantic3D.net data set consists of dense point clouds acquired with static terrestrial laser scanners. It contains 8 semantic classes and covers a wide range of urban outdoor scenes: churches, streets, railroad tracks, squares, villages, soccer fields and castles. We describe our labelling interface and show that our data set provides denser and more complete point clouds, with a much higher overall number of labelled points, compared to those already available to the research community. We further provide baseline method descriptions and comparisons between methods submitted to our online system. We hope semantic3D.net will pave the way for deep learning methods in 3D point cloud labelling to learn richer, more general 3D representations, and first submissions after only a few months indicate that this might indeed be the case.

  3. Linking the formation of molecular clouds and high-mass stars: a multi-tracer and multi-scale study

    International Nuclear Information System (INIS)

    Nguyen-Luong, Quang

    2012-01-01

    Star formation is a complex process involving many physical processes acting from the very large scales of the galaxy down to the very small scales of individual stars. Among the highly debated topics, the gas to star-formation-rate (SFR) relation is of interest to both the galactic and extragalactic communities. Although it has been studied extensively for external galaxies, how this relation behaves with respect to the molecular clouds of the Milky Way is still unclear. The detailed mechanisms of the formation of molecular clouds and stars, especially high-mass stars, are also still not clear. To tackle these two questions, we investigate molecular cloud formation and star formation activity in the W43 molecular cloud complex and the G035.39-00.33 filament. The first goal is to relate the gas-SFR relations of these two objects to those of other galactic molecular clouds and to extragalactic ones. The second goal is to look for indications that converging flows have formed the W43 molecular cloud, since the converging-flows theory is the first to explain star formation self-consistently, from the onset of molecular clouds to the formation of the seeds of (high-mass) stars. We use a large dataset of continuum tracers at 3.6-870 μm extracted from Galaxy-wide surveys such as HOBYS, EPOS, Hi-GAL, ATLASGAL, GLIMPSE, and MIPSGAL to trace the cloud structure, mass and star formation activity of both the W43 molecular cloud complex and the G035.39-00.33 filament. To explore the detailed formation mechanisms of the molecular cloud in W43 from low-density to very high-density gas, we take advantage of the existing H I and ¹³CO (1-0) molecular line data from the VGPS and GRS surveys in combination with new dedicated molecular line surveys with the IRAM 30 m. We characterise the W43 molecular complex as a massive complex (M_total ∼ 7.1 × 10⁶ M⊙ over a spatial extent of ∼ 140 pc) with a high concentration of dense clumps (M_clumps ∼ 8.4 × 10⁵ M⊙

  4. Large Scale Variability of Phytoplankton Blooms in the Arctic and Peripheral Seas: Relationships with Sea Ice, Temperature, Clouds, and Wind

    Science.gov (United States)

    Comiso, Josefino C.; Cota, Glenn F.

    2004-01-01

    Spatially detailed satellite data of ocean color, sea ice concentration, surface temperature, clouds, and wind have been analyzed to quantify and study the large-scale regional and temporal variability of phytoplankton blooms in the Arctic and peripheral seas from 1998 to 2002. In the Arctic basin, phytoplankton chlorophyll displays a large asymmetry, with the Eastern Arctic having about fivefold higher concentrations than the Western Arctic. Large monthly and yearly variability is also observed in the peripheral seas, with the largest blooms occurring in the Bering Sea, Sea of Okhotsk, and the Barents Sea during spring. There is large interannual and seasonal variability in biomass, with average chlorophyll concentrations in 2002 and 2001 being higher than in earlier years in spring and summer. The seasonality of the latitudinal distribution of blooms also differs: the North Atlantic bloom is usually most expansive in spring, while the North Pacific bloom is more extensive in autumn. Environmental factors that influence phytoplankton growth were examined, and the results show a relatively high negative correlation with sea ice retreat and a strong positive correlation with temperature in early spring. Plankton growth, as indicated by biomass accumulation, in the Arctic and subarctic increases up to a threshold surface temperature of about 276-277 K (3-4 °C), beyond which the concentrations start to decrease, suggesting an optimal temperature or nutrient depletion. The correlation with clouds is significant in some areas but negligible in others, while the correlations with wind speed and its components are generally weak. The effects of clouds and winds are less predictable with weekly climatologies because of the unknown effects of averaging variable and intermittent physical forcing (e.g., over storm event scales with mixing and upwelling of nutrients) and the time scales of acclimation by the phytoplankton.

  5. Stormbow: A Cloud-Based Tool for Reads Mapping and Expression Quantification in Large-Scale RNA-Seq Studies.

    Science.gov (United States)

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance

    2013-01-01

    RNA-Seq is becoming a promising replacement for microarrays in transcriptome profiling and differential gene expression studies. Technical improvements have decreased sequencing costs and, as a result, the size and number of RNA-Seq datasets have increased rapidly. However, the increasing volume of data from large-scale RNA-Seq studies poses a practical challenge for data analysis in a local environment. To meet this challenge, we developed Stormbow, a cloud-based software package, to process large volumes of RNA-Seq data in parallel. The performance of Stormbow has been tested by applying it to analyse 178 RNA-Seq samples in the cloud. In our test, it took 6 to 8 hours to process an RNA-Seq sample with 100 million reads, and the average cost was $3.50 per sample. Utilizing Amazon Web Services as the infrastructure for Stormbow allows us to easily scale up to handle large datasets with on-demand computational resources. Stormbow is a scalable, cost-effective, open-source tool for large-scale RNA-Seq data analysis, which can be freely downloaded and used out of the box to process Illumina RNA-Seq datasets.

  6. Improving representation of convective transport for scale-aware parameterization: 2. Analysis of cloud-resolving model simulations

    Science.gov (United States)

    Liu, Yi-Chin; Fan, Jiwen; Zhang, Guang J.; Xu, Kuan-Man; Ghan, Steven J.

    2015-04-01

    Following Part I, in which 3-D cloud-resolving model (CRM) simulations of a squall line and a mesoscale convective complex in the midlatitude continental and tropical regions are conducted and evaluated, we examine the scale dependence of eddy transport of water vapor, evaluate different eddy transport formulations, and improve the representation of convective transport across all scales by proposing a new formulation that more accurately represents the CRM-calculated eddy flux. CRM results show that there are strong grid-spacing dependencies of updraft and downdraft fractions regardless of altitude, cloud life stage, and geographical location. As for the eddy transport of water vapor, updraft eddy flux is a major contributor to total eddy flux in the lower and middle troposphere. However, downdraft eddy transport can be as large as updraft eddy transport in the lower atmosphere, especially at the mature stage of midlatitude continental convection. We show that the single-updraft approach significantly underestimates updraft eddy transport of water vapor because it fails to account for the large internal variability of updrafts, while a single downdraft represents the downdraft eddy transport of water vapor well. We find that using as few as three updrafts can account for the internal variability of updrafts well. Based on the evaluation with the CRM-simulated data, we recommend a simplified eddy transport formulation that considers three updrafts and one downdraft. Such a formulation is similar to the conventional one but much more accurately represents CRM-simulated eddy flux across all grid scales.
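
    The top-hat (plume-mean) decomposition evaluated here can be illustrated with a toy calculation on synthetic fields. The sketch below is not the paper's formulation: the draft thresholds, the linear w-q relationship, and the equal-population three-way split of the updraft are all illustrative assumptions. It shows why a single top-hat updraft underestimates the eddy flux while three sub-plumes recover more of it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic CRM snapshot on one model level: vertical velocity w (m/s) and
# water vapor mixing ratio q (kg/kg); moist air tends to rise (illustrative).
n = 10_000
w = rng.normal(0.0, 1.0, n)
q = 0.010 + 0.001 * w + rng.normal(0.0, 2e-4, n)

w_mean, q_mean = w.mean(), q.mean()
total_eddy_flux = np.mean((w - w_mean) * (q - q_mean))  # grid-mean w'q'

up = w > 0.5      # updraft mask (threshold is an assumption)
down = w < -0.5   # downdraft mask

def tophat_flux(mask):
    """Top-hat (single-plume) contribution of one draft class to w'q':
    area fraction times the product of the plume-mean anomalies."""
    sigma = mask.mean()
    return sigma * (w[mask].mean() - w_mean) * (q[mask].mean() - q_mean)

single_up = tophat_flux(up)
single_down = tophat_flux(down)

# Split the updraft into three equal-population sub-plumes by w-quantiles;
# this recovers internal updraft variability that a single top-hat misses.
edges = np.quantile(w[up], [0.0, 1 / 3, 2 / 3, 1.0])
three_up = 0.0
for i in range(3):
    upper = (w < edges[i + 1]) if i < 2 else (w <= edges[i + 1])
    three_up += tophat_flux(up & (w >= edges[i]) & upper)
```

On these synthetic fields the three-plume sum sits between the single-plume estimate and the full eddy flux, consistent with the paper's recommendation of three updrafts and one downdraft.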

  7. Characterization of the Cloud-Topped Boundary Layer at the Synoptic Scale Using AVHRR Observations during the SEMAPHORE Experiment.

    Science.gov (United States)

    Mathieu, A.; Sèze, G.; Lahellec, A.; Guerin, C.; Weill, A.

    2003-12-01

    Data from the Advanced Very High Resolution Radiometer (AVHRR) on the NOAA-11 and NOAA-12 satellite platforms are used during the daytime to study large sheets of stratocumulus over the North Atlantic Ocean. The application concerns an anticyclonic period of the Structure des Echanges Mer Atmosphère, Propriétés des Hétérogénéités Océaniques: Recherche Expérimentale (SEMAPHORE) campaign (10–17 November 1993). In the region of interest, the satellite images are recorded under large solar zenith angles. Extending the SEMAPHORE area, a region of about 3000 × 3000 km² is studied to characterize the atmospheric boundary layer. A statistical cloud classification method is applied to discriminate low-level and optically thick clouds. For AVHRR pixels covered with thick clouds, brightness temperatures are used to evaluate the boundary layer cloud-top temperature (CTT). The objective is to obtain accurate CTT maps for evaluation of a global model. In this application, the full-resolution fields are reduced to match the model grid size. An estimate of the overall temperature uncertainty associated with each grid point is also derived, which incorporates subgrid variability of the fields and the quality of the temperature retrieval. Results are compared with the SEMAPHORE campaign measurements. A comparison with “DX” products obtained with the same dataset, but at lower resolution, is also presented. The authors claim that such instantaneous CTT maps could be as intensively used as classical SST maps, and that both could be efficiently complemented with gridpoint error-bar maps. They may be used for multiple applications: (i) to provide a means to improve numerical weather prediction and climatological reanalyses, (ii) to represent a boundary layer global characterization to analyze the synoptic situation of field experiments, and (iii) to allow validation and to test development of large-scale and mesoscale models.
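
    The reduction of a full-resolution field to model grid size, with a subgrid error bar attached to each grid point, can be sketched as a block average. Everything below is synthetic: the field values, noise level, and block size are illustrative stand-ins, not the AVHRR or DX numbers.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for a full-resolution cloud-top temperature field (K);
# the real retrieval uses AVHRR brightness temperatures of thick low clouds.
ctt = 285.0 + rng.normal(0.0, 1.5, size=(128, 128))

block = 16  # full-resolution pixels per model grid cell side (assumption)
h, w = ctt.shape
tiles = ctt.reshape(h // block, block, w // block, block).swapaxes(1, 2)

grid_mean = tiles.mean(axis=(2, 3))  # coarse CTT map for the model
grid_std = tiles.std(axis=(2, 3))    # subgrid variability -> error-bar map
```

The standard deviation map plays the role of the gridpoint error bars the authors propose alongside the CTT map itself.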

  8. High-Resiliency and Auto-Scaling of Large-Scale Cloud Computing for OCO-2 L2 Full Physics Processing

    Science.gov (United States)

    Hua, H.; Manipon, G.; Starch, M.; Dang, L. B.; Southam, P.; Wilson, B. D.; Avis, C.; Chang, A.; Cheng, C.; Smyth, M.; McDuffie, J. L.; Ramirez, P.

    2015-12-01

    Next-generation science data systems are needed to address the incoming flood of data from new missions such as SWOT and NISAR, where data volumes and data throughput rates are an order of magnitude larger than those of present-day missions. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Existing missions, such as OCO-2, may also require rapid turnaround when processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the high processing needs. We present our experiences in deploying a hybrid-cloud computing science data system (HySDS) for the OCO-2 Science Computing Facility to support large-scale processing of their Level-2 full physics data products. We will explore optimization approaches for getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer ~10X cost savings but with an unpredictable computing environment driven by market forces. We will present how we enabled high-tolerance computing in order to achieve large-scale computing as well as operational cost savings.

  9. Using Amazon's Elastic Compute Cloud to scale CMS' compute hardware dynamically.

    CERN Document Server

    Melo, Andrew Malone

    2011-01-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud-computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly on-demand, as limits and caps on usage are imposed. Our trial workflows allow us t...

  10. Gas, dust, stars, star formation, and their evolution in M 33 at giant molecular cloud scales

    Science.gov (United States)

    Komugi, Shinya; Miura, Rie E.; Kuno, Nario; Tosaki, Tomoka

    2018-04-01

    We report on a multi-parameter analysis of giant molecular clouds (GMCs) in the nearby spiral galaxy M 33. A catalog of GMCs identified in 12CO(J = 3-2) was used to compile the associated 12CO(J = 1-0), dust, stellar mass, and star formation rate. Each of the 58 GMCs is categorized by its evolutionary stage. Applying principal component analysis to these parameters, we construct two principal components, PC1 and PC2, which retain 75% of the information from the original data set. PC1 is interpreted as expressing the total interstellar matter content, and PC2 as the total activity of star formation. Young clouds show lower star formation activity compared to intermediate-age and older clouds. Comparison of average cloud properties in different evolutionary stages implies that GMCs may be heated or grow denser and more massive via aggregation of diffuse material in their first ˜10 Myr. The PCA also objectively identified a set of tight relations between ISM and star formation. The ratio of the two CO lines is nearly constant, but weakly modulated by massive star formation. Dust is more strongly correlated with the star formation rate than the CO lines, supporting recent findings that dust may trace molecular gas better than CO. Stellar mass contributes weakly to the star formation rate, reminiscent of an extended form of the Schmidt-Kennicutt relation with the molecular gas term substituted by dust.
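
    The dimensionality reduction used here, building PC1 and PC2 from a standardized parameter matrix, can be sketched with plain NumPy. The mock catalog below is random data wired to have two dominant latent factors; the loadings and noise levels are assumptions for illustration, not the measured M 33 values.

```python
import numpy as np

rng = np.random.default_rng(2)

# Mock catalog: 58 GMCs x 5 parameters (CO(3-2), CO(1-0), dust, stellar
# mass, SFR), built from two latent factors so two PCs dominate.
n = 58
ism = rng.normal(size=n)  # latent "total ISM content"
sf = rng.normal(size=n)   # latent "star formation activity"
X = np.column_stack([
    ism + 0.1 * rng.normal(size=n),
    ism + 0.1 * rng.normal(size=n),
    ism + 0.3 * sf + 0.1 * rng.normal(size=n),
    0.3 * ism + 0.2 * rng.normal(size=n),
    sf + 0.3 * ism + 0.1 * rng.normal(size=n),
])

# Standardize each parameter, then diagonalize the covariance matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
cov = np.cov(Z, rowvar=False)
evals, evecs = np.linalg.eigh(cov)
order = np.argsort(evals)[::-1]
evals, evecs = evals[order], evecs[:, order]

explained = evals / evals.sum()  # fraction of variance per component
pc_scores = Z @ evecs[:, :2]     # PC1, PC2 score for each cloud
```

Projecting the standardized data onto the two leading eigenvectors gives each cloud its PC1/PC2 scores, which is what allows clouds in different evolutionary stages to be compared in a common plane.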

  11. Large-Scale Sentinel-1 Processing for Solid Earth Science and Urgent Response using Cloud Computing and Machine Learning

    Science.gov (United States)

    Hua, H.; Owen, S. E.; Yun, S. H.; Agram, P. S.; Manipon, G.; Starch, M.; Sacco, G. F.; Bue, B. D.; Dang, L. B.; Linick, J. P.; Malarout, N.; Rosen, P. A.; Fielding, E. J.; Lundgren, P.; Moore, A. W.; Liu, Z.; Farr, T.; Webb, F.; Simons, M.; Gurrola, E. M.

    2017-12-01

    With the increased availability of open SAR data (e.g. Sentinel-1 A/B), new challenges are being faced in processing and analyzing the voluminous SAR datasets to make geodetic measurements. Upcoming SAR missions such as NISAR are expected to generate close to 100 TB per day. The Advanced Rapid Imaging and Analysis (ARIA) project can now generate geocoded unwrapped phase and coherence products from Sentinel-1 TOPS mode data in an automated fashion, using the ISCE software. This capability is currently being exercised on various study sites across the United States and around the globe, including Hawaii, Central California, Iceland, and South America. The automated and large-scale SAR data processing and analysis capabilities use cloud computing techniques to speed the computations and provide scalable processing power and storage. Aspects such as how to process these voluminous SLCs and interferograms at global scales, how to keep up with the large daily SAR data volumes, and how to handle the high data rates are being explored. Scene-partitioning approaches in the processing pipeline help in handling global-scale processing up to unwrapped interferograms, with stitching done at a late stage. We have built an advanced science data system with rapid search functions to enable access to the derived data products. Rapid image processing of Sentinel-1 data to interferograms and time series is already being applied to natural hazards including earthquakes, floods, volcanic eruptions, and land subsidence due to fluid withdrawal. We will present the status of the ARIA science data system for generating science-ready data products and the challenges that arise from processing SAR datasets to derived time series data products at large scales. For example, how do we perform large-scale data quality screening on interferograms? What approaches can be used to minimize compute, storage, and data movement costs for time series analysis in the cloud? We will also

  12. Google Earth Engine: a new cloud-computing platform for global-scale earth observation data and analysis

    Science.gov (United States)

    Moore, R. T.; Hansen, M. C.

    2011-12-01

    Google Earth Engine is a new technology platform that enables monitoring and measurement of changes in the earth's environment, at planetary scale, on a large catalog of earth observation data. The platform offers intrinsically-parallel computational access to thousands of computers in Google's data centers. Initial efforts have focused primarily on global forest monitoring and measurement, in support of REDD+ activities in the developing world. The intent is to put this platform into the hands of scientists and developing world nations, in order to advance the broader operational deployment of existing scientific methods, and strengthen the ability for public institutions and civil society to better understand, manage and report on the state of their natural resources. Earth Engine currently hosts online nearly the complete historical Landsat archive of L5 and L7 data collected over more than twenty-five years. Newly-collected Landsat imagery is downloaded from USGS EROS Center into Earth Engine on a daily basis. Earth Engine also includes a set of historical and current MODIS data products. The platform supports generation, on-demand, of spatial and temporal mosaics, "best-pixel" composites (for example to remove clouds and gaps in satellite imagery), as well as a variety of spectral indices. Supervised learning methods are available over the Landsat data catalog. The platform also includes a new application programming framework, or "API", that allows scientists access to these computational and data resources, to scale their current algorithms or develop new ones. Under the covers of the Google Earth Engine API is an intrinsically-parallel image-processing system. Several forest monitoring applications powered by this API are currently in development and expected to be operational in 2011. Combining science with massive data and technology resources in a cloud-computing framework can offer advantages of computational speed, ease-of-use and collaboration, as
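
    The "best-pixel" compositing idea, choosing a per-pixel statistic across a stack of co-registered scenes so that cloudy observations drop out, can be sketched with a per-pixel median. This is one simple choice among several; Earth Engine's actual compositing operators are richer. The scene stack, reflectance values, and cloud rate below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stack of 12 co-registered scenes of one band (reflectance, 0-1); clouds
# appear as anomalously bright pixels in roughly 10% of observations.
scenes = np.full((12, 64, 64), 0.2) + rng.normal(0.0, 0.01, (12, 64, 64))
cloudy = rng.random((12, 64, 64)) < 0.1
scenes[cloudy] = 0.9

# The per-pixel median rejects the cloudy outliers wherever fewer than
# half of the scenes are contaminated at that location.
composite = np.median(scenes, axis=0)
```

The same map-over-pixels structure is what makes compositing embarrassingly parallel and hence a natural fit for a cluster-backed platform.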

  13. ''The ambipolar diffusion time scale and the location of star formation in magnetic interstellar clouds'': Setting the record straight

    International Nuclear Information System (INIS)

    Mouschovias, T.C.

    1984-01-01

    The point of a recent (1983) paper by Scott is that a previous paper (1979) by Mouschovias has concluded ''erroneously'' that star formation takes place off center in a cloud because of the use of an ''improper'' definition of a time scale for ambipolar diffusion. No such conclusion, Scott claims, follows from a ''proper'' definition, such as the ''traditional'' one by Spitzer. (i) Scott misrepresents the reasoning that led to the conclusion in the paper which he criticized. (ii) He is also wrong: both the ''traditional'' and the ''improper'' definitions vary similarly with radius, and both can have an off-center minimum; the spatial variation of the degree of ionization is the determining factor, not the specific value of the time scale at the origin, as Scott claims

  14. Influence of galactic arm scale dynamics on the molecular composition of the cold and dense ISM. I. Observed abundance gradients in dense clouds

    Science.gov (United States)

    Ruaud, M.; Wakelam, V.; Gratier, P.; Bonnell, I. A.

    2018-04-01

    Aim. We study the effect of large-scale dynamics on the molecular composition of the dense interstellar medium during the transition from diffuse to dense clouds. Methods: We followed the formation of dense clouds (on sub-parsec scales) through the dynamics of the interstellar medium at galactic scales. We used results from smoothed particle hydrodynamics (SPH) simulations, from which we extracted physical parameters that are used as inputs for our full gas-grain chemical model. In these simulations, the evolution of the interstellar matter is followed for 50 Myr. The warm low-density interstellar medium gas flows into spiral arms, where orbit crowding produces the shock formation of dense clouds, which are held together temporarily by the external pressure. Results: We show that, depending on the physical history of each SPH particle, the molecular composition of the modeled dense clouds presents a high dispersion in the computed abundances even if the local physical properties are similar. We find that carbon chains are the most affected species and show that these differences are directly connected to differences in (1) the electronic fraction, (2) the C/O ratio, and (3) the local physical conditions. We argue that differences in the dynamical evolution of the gas that formed dense clouds could account for the molecular diversity observed between and within these clouds. Conclusions: This study shows the importance of past physical conditions in establishing the chemical composition of the dense medium.

  15. Cloud computing method for dynamically scaling a process across physical machine boundaries

    Science.gov (United States)

    Gillen, Robert E.; Patton, Robert M.; Potok, Thomas E.; Rojas, Carlos C.

    2014-09-02

    A cloud computing platform includes first device having a graph or tree structure with a node which receives data. The data is processed by the node or communicated to a child node for processing. A first node in the graph or tree structure determines the reconfiguration of a portion of the graph or tree structure on a second device. The reconfiguration may include moving a second node and some or all of its descendant nodes. The second and descendant nodes may be copied to the second device.
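
    A minimal sketch of the idea, with invented names since the patent abstract does not specify an implementation: nodes in a tree either process data or delegate it to a child, and a subtree can be re-homed onto a second physical device by updating the node and all of its descendants.

```python
class Node:
    """One node of the processing tree; 'device' names its host machine."""

    def __init__(self, name, device, can_process=True):
        self.name, self.device, self.can_process = name, device, can_process
        self.children = []

    def add(self, child):
        self.children.append(child)
        return child

    def handle(self, data):
        # Process locally, or delegate to a child node for processing.
        if self.can_process or not self.children:
            return f"{self.name}@{self.device} processed {data}"
        return self.children[0].handle(data)

    def migrate(self, new_device):
        """Re-home this node and all descendants onto another machine."""
        self.device = new_device
        for c in self.children:
            c.migrate(new_device)

root = Node("root", "machine-1", can_process=False)
worker = root.add(Node("worker", "machine-1"))
worker.add(Node("leaf", "machine-1"))

result = root.handle("payload")
worker.migrate("machine-2")  # scale the subtree across a machine boundary
```

The abstract describes copying the second node and its descendants to the second device; copying rather than mutating would follow the same traversal.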

  16. Particles from a Diesel ship engine: Mixing state on the nano scale and cloud condensation abilities

    Science.gov (United States)

    Lieke, K. I.; Rosenørn, T.; Fuglsang, K.; Frederiksen, T.; Butcher, A. C.; King, S. M.; Bilde, M.

    2012-04-01

    Transport by ship plays an important role in global logistics. Current international policy initiatives by the International Maritime Organization (IMO) aim to reduce emissions from ship propulsion systems (primarily NOx and SOx). However, particulate emissions (e.g. soot) from ships are not yet regulated by legislation. To date, there is still a lack of knowledge regarding the global and local effects of the particulate matter emitted from ships at sea. Particles may influence the climate through their direct effects (scattering and absorption of long- and shortwave radiation) and indirectly through formation of clouds. Many studies have been carried out estimating the mass and particle number from ship emissions (e.g. Petzold et al. 2008), many of them in test rig studies (e.g. Kasper et al. 2007). It has been shown that particulate emissions vary with engine load and the chemical composition of fuels. Only a few studies have been carried out to characterize the chemical composition and cloud-nucleating ability of the particulate matter (e.g. Corbett et al. 1997). In most cases, the cloud-nucleating ability of emission particles is estimated from the number size distribution. We performed measurements to characterize particulate emissions from a MAN B&W low-speed engine on a test bed. A unique data set was obtained through the use of a scanning mobility particle sizing system (SMPS), combined with a cloud condensation nucleus (CCN) counter and a thermodenuder, all behind a dilution system. In addition, impactor samples were taken on nickel grids with carbon foil for use in an electron microscope (EM) to characterize the mineral phase and mixing state of the particles. The engine was operated at a series of different load conditions, and an exhaust gas recirculation (EGR) system was applied. Measurements were carried out before and after the EGR system, respectively.
Our observations show significant changes in number size distribution and CCN activity with varying conditions

  17. Molecular-cloud-scale Chemical Composition. II. Mapping Spectral Line Survey toward W3(OH) in the 3 mm Band

    Energy Technology Data Exchange (ETDEWEB)

    Nishimura, Yuri [Institute of Astronomy, The University of Tokyo, 2-21-1, Osawa, Mitaka, Tokyo 181-0015 (Japan); Watanabe, Yoshimasa; Yamamoto, Satoshi [Department of Physics, The University of Tokyo, 7-3-1, Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan); Harada, Nanase [Academia Sinica Institute of Astronomy and Astrophysics, No.1, Sec. 4, Roosevelt Road, 10617 Taipei, Taiwan, R.O.C. (China); Shimonishi, Takashi [Frontier Research Institute for Interdisciplinary Sciences, Tohoku University, Aramakiazaaoba 6-3, Aoba-ku, Sendai, Miyagi 980-8578 (Japan); Sakai, Nami [RIKEN, 2-1 Hirosawa, Wako, Saitama 351-0198 (Japan); Aikawa, Yuri [Department of Astronomy, The University of Tokyo, 7-3-1, Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan); Kawamura, Akiko [Chile Observatory, National Astronomical Observatory of Japan, 2-21-1, Osawa, Mitaka, Tokyo 181-8588 (Japan)

    2017-10-10

    To study molecular-cloud-scale chemical composition, we conducted a mapping spectral line survey toward the Galactic molecular cloud W3(OH), which is one of the most active star-forming regions in the Perseus arm. We conducted our survey with the Nobeyama Radio Observatory 45 m telescope, and observed an area of 16′ × 16′, which corresponds to 9.0 pc × 9.0 pc. The observed frequency ranges are 87–91, 96–103, and 108–112 GHz. We prepared the spectrum averaged over the observed area, in which eight molecular species (CCH, HCN, HCO{sup +}, HNC, CS, SO, C{sup 18}O, and {sup 13}CO) are identified. On the other hand, the spectrum of the W3(OH) hot core observed at a 0.17 pc resolution shows the lines of various molecules such as OCS, H{sub 2}CS, CH{sub 3}CCH, and CH{sub 3}CN in addition to the above species. In the spatially averaged spectrum, emission of the species concentrated just around the star-forming core, such as CH{sub 3}OH and HC{sub 3}N, is fainter than in the hot core spectrum, whereas emission of the species widely extended over the cloud, such as CCH, is relatively brighter. We classified the observed area into five subregions according to the integrated intensity of {sup 13}CO, and evaluated the contribution to the averaged spectrum from each subregion. The CCH, HCN, HCO{sup +}, and CS lines can be seen even in the spectrum of the subregion with the lowest {sup 13}CO integrated intensity range (<10 K km s{sup −1}). Thus, the contribution of the spatially extended emission is confirmed to be dominant in the spatially averaged spectrum.

  18. Large-Scale Control of the Arabian Sea Summer Monsoon Inversion and Low Clouds: A New Perspective

    Science.gov (United States)

    Wu, C. H.; Wang, S. Y.; Hsu, H. H.; Hsu, P. C.

    2016-12-01

    The Arabian Sea undergoes a so-called summer monsoon inversion that reaches maximum intensity in August, associated with a large amount of low-level clouds. The formation of the inversion and low clouds was generally thought to be a local system influenced by the India-Pakistan monsoon advancement. New empirical and numerical evidence suggests that, rather than being a mere byproduct of the nearby monsoon, the Arabian Sea monsoon inversion is coupled with a broad-scale monsoon evolution connected across the African Sahel, South Asia, and the East Asia-western North Pacific (WNP). Several subseasonal variations occur in tandem: the eastward expansion of the Asian-Pacific monsoonal heating likely suppresses the India-Pakistan monsoon while enhancing the low-level thermal inversion of the Arabian Sea; the upper-tropospheric anticyclone in South Asia weakens in August, smoothing the zonal contrast in geopotential heights (10°N-30°N); the subtropical WNP monsoon trough in the lower troposphere, which signals the revival of the East Asian summer monsoon, matures in August; the Sahel rainfall peaks in August, accompanied by an intensified tropical easterly jet. The occurrence of the latter two processes enhances upper-level anticyclones over Africa and the WNP, and this, in turn, induces subsidence in between over the Arabian Sea. Numerical experiments demonstrate the combined effect of the African and WNP monsoonal heating on the enhancement of the Arabian Sea monsoon inversion. A connection is further found in the interannual and decadal variations between the East Asian-WNP monsoon and the Arabian Sea monsoon inversion. In years with reduced low clouds over the Arabian Sea, the East Asian midlatitude jet stream remains strong in August while the WNP monsoon trough appears to be weakened. The Arabian Sea inversion (ridge) and WNP trough pattern, which forms a dipole structure, is also found to have intensified since the start of the 21st century.

  19. Horizontal Variability of Water and Its Relationship to Cloud Fraction near the Tropical Tropopause: Using Aircraft Observations of Water Vapor to Improve the Representation of Grid-scale Cloud Formation in GEOS-5

    Science.gov (United States)

    Selkirk, Henry B.; Molod, Andrea M.

    2014-01-01

    Large-scale models such as GEOS-5 typically calculate grid-scale fractional cloudiness through a PDF parameterization of the sub-gridscale distribution of specific humidity. The GEOS-5 moisture routine uses a simple rectangular PDF varying in height that follows a tanh profile. While below 10 km this profile is informed by moisture information from the AIRS instrument, there is relatively little empirical basis for the profile above that level. ATTREX provides an opportunity to refine the profile using estimates of the horizontal variability of measurements of water vapor, total water and ice particles from the Global Hawk aircraft at or near the tropopause. These measurements will be compared with estimates of large-scale cloud fraction from CALIPSO and lidar retrievals from the CPL on the aircraft. We will use the variability measurements to perform studies of the sensitivity of the GEOS-5 cloud-fraction to various modifications to the PDF shape and to its vertical profile.
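
    The mechanics of a rectangular (uniform) subgrid PDF can be shown in a few lines: grid-scale cloud fraction is simply the fraction of the distribution lying above saturation, and the PDF half-width narrows with height following a tanh shape. All numbers below are invented for illustration; they are not the GEOS-5 coefficients.

```python
import math

def cloud_fraction_rect(q_mean, q_sat, half_width):
    """Cloud fraction from a rectangular subgrid PDF of total water:
    the fraction of the uniform distribution exceeding saturation."""
    lo, hi = q_mean - half_width, q_mean + half_width
    if q_sat <= lo:
        return 1.0  # whole grid cell saturated
    if q_sat >= hi:
        return 0.0  # whole grid cell subsaturated
    return (hi - q_sat) / (hi - lo)

def half_width_profile(z_km, w_low=2e-4, w_high=2e-5, z0=10.0, dz=2.0):
    """Illustrative tanh profile for the PDF half-width vs. height,
    mimicking the narrowing described for GEOS-5 (numbers are made up)."""
    t = 0.5 * (1.0 - math.tanh((z_km - z0) / dz))
    return w_high + (w_low - w_high) * t

cf_sub = cloud_fraction_rect(q_mean=8e-3, q_sat=9e-3, half_width=2e-3)
cf_sat = cloud_fraction_rect(q_mean=1e-2, q_sat=9e-3, half_width=2e-3)
```

Tightening the half-width above 10 km, where ATTREX-style aircraft variability estimates would constrain it, directly changes the diagnosed cloud fraction near the tropopause.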

  20. THOR: A New Higher-Order Closure Assumed PDF Subgrid-Scale Parameterization; Evaluation and Application to Low Cloud Feedbacks

    Science.gov (United States)

    Firl, G. J.; Randall, D. A.

    2013-12-01

    The so-called "assumed probability density function (PDF)" approach to subgrid-scale (SGS) parameterization has been shown to be a promising method for more accurately representing boundary layer cloudiness under a wide range of conditions. A new parameterization has been developed, named the Two-and-a-Half ORder closure (THOR), that combines this approach with a higher-order turbulence closure. THOR predicts the time evolution of the turbulence kinetic energy components, the variance of ice-liquid water potential temperature (θil) and total non-precipitating water mixing ratio (qt) and the covariance between the two, and the vertical fluxes of horizontal momentum, θil, and qt. Ten corresponding third-order moments, in addition to the skewnesses of θil and qt, are calculated using diagnostic functions assuming negligible time tendencies. The statistical moments are used to define a trivariate double Gaussian PDF among vertical velocity, θil, and qt. The first three statistical moments of each variable are used to estimate the two Gaussian plume means, variances, and weights. Unlike previous similar models, plume variances are not assumed to be equal or zero. Instead, they are parameterized using the idea that the less dominant Gaussian plume (typically representing the updraft-containing portion of a grid cell) has greater variance than the dominant plume (typically representing the "environmental" or slowly subsiding portion of a grid cell). Correlations among the three variables are calculated using the appropriate covariance moments, and both plume correlations are assumed to be equal. The diagnosed PDF in each grid cell is used to calculate SGS condensation, SGS fluxes of cloud water species, and SGS buoyancy terms, and to inform other physical parameterizations about SGS variability. SGS condensation is extended from previous similar models to include condensation over both liquid and ice substrates, dependent on the grid cell temperature. Implementations have been
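
    The diagnostic step, turning a double-Gaussian PDF into an SGS cloud fraction, can be sketched in one dimension. This is a simplification of THOR's trivariate PDF (qt only, with no θil or vertical-velocity coupling), and every plume weight, mean, and width below is invented for illustration.

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def mixture_cloud_fraction(qs, w1, mu1, s1, mu2, s2):
    """Cloud fraction from a double-Gaussian PDF of total water qt:
    the weighted probability that qt exceeds saturation qs. The less
    dominant plume (weight 1 - w1) is allowed a larger width, echoing
    THOR's treatment of the updraft-containing plume."""
    w2 = 1.0 - w1
    p1 = 1.0 - norm_cdf((qs - mu1) / s1)  # dominant "environmental" plume
    p2 = 1.0 - norm_cdf((qs - mu2) / s2)  # broad, moister updraft plume
    return w1 * p1 + w2 * p2

cf = mixture_cloud_fraction(qs=9e-3, w1=0.8, mu1=8e-3, s1=3e-4,
                            mu2=9.5e-3, s2=8e-4)
```

Here the cloud fraction comes almost entirely from the minority updraft plume, which is exactly why letting the two plume variances differ matters for boundary layer cloudiness.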

  1. Advancing cloud lifecycle representation in numerical models using innovative analysis methods that bridge arm observations over a breadth of scales

    Energy Technology Data Exchange (ETDEWEB)

    Tselioudis, George [Columbia Univ., New York, NY (United States)

    2016-03-04

    From its location on the subtropics-midlatitude boundary, the Azores is influenced by both the subtropical high pressure and the midlatitude baroclinic storm regimes, and therefore experiences a wide range of cloud structures, from fair-weather scenes to stratocumulus sheets to deep convective systems. This project combined three types of data sets to study cloud variability in the Azores: a satellite analysis of cloud regimes, a reanalysis characterization of storminess, and a 19-month field campaign that occurred on Graciosa Island. Combined analysis of the three data sets provides a detailed picture of cloud variability and the respective dynamic influences, with emphasis on low clouds that constitute a major uncertainty source in climate model simulations. The satellite cloud regime analysis shows that the Azores cloud distribution is similar to the mean global distribution and can therefore be used to evaluate cloud simulation in global models. Regime analysis of low clouds shows that stratocumulus decks occur under the influence of the Azores high-pressure system, while shallow cumulus clouds are sustained by cold-air outbreaks, as revealed by their preference for post-frontal environments and northwesterly flows. An evaluation of CMIP5 climate model cloud regimes over the Azores shows that all models severely underpredict shallow cumulus clouds, while most models also underpredict the occurrence of stratocumulus cloud decks. It is demonstrated that carefully selected case studies can be related through regime analysis to climatological cloud distributions, and a methodology is suggested utilizing process-resolving model simulations of individual cases to better understand cloud-dynamics interactions and attempt to explain and correct climate model cloud deficiencies.

  2. A review of our understanding of the aerosol-cloud interaction from the perspective of a bin resolved cloud scale modelling

    Science.gov (United States)

    Flossmann, Andrea I.; Wobrock, Wolfram

    2010-09-01

    This review compiles the main results obtained using a mesoscale cloud model with bin-resolved cloud microphysics and aerosol particle scavenging, as developed by our group over the years and applied to the simulation of shallow and deep convective clouds. The main features of the model are reviewed in different dynamical frameworks covering parcel model dynamics, as well as 1.5D, 2D and 3D dynamics. The main findings are summarized to yield a digested presentation which completes the general understanding of cloud-aerosol interaction, as currently available from textbook knowledge. Furthermore, it should provide support for general cloud model development, as it will suggest potentially minor processes that might be neglected with respect to more important ones, and can support development of parameterizations for air quality, chemical transport and climate models. Our work has shown that in order to analyse dedicated campaign results, the supersaturation field and the complex dynamics of the specific clouds need to be reproduced. Only 3D dynamics represents the variation of the supersaturation over the entire cloud, the continuous nucleation and deactivation of hydrometeors, and the dependence upon initial particle size distribution and solubility. However, general statements on certain processes can be obtained also with simpler dynamics. In particular, we found: Nucleation incorporates about 90% of the initial aerosol particle mass inside the cloud drops. Collision and coalescence redistributes the scavenged aerosol particle mass in such a way that the particle mass follows the main water mass. Small drops are more polluted than larger ones, as pollutant mass mixing ratio decreases with drop size. Collision and coalescence mixes the chemical composition of the generated drops. Their complete evaporation will release processed particles that are mostly larger and more hygroscopic than the initial particles.
An interstitial aerosol is left unactivated between the

  3. Evaluating cloud processes in large-scale models: Of idealized case studies, parameterization testbeds and single-column modelling on climate time-scales

    Science.gov (United States)

    Neggers, Roel

    2016-04-01

    Boundary-layer schemes have always formed an integral part of General Circulation Models (GCMs) used for numerical weather and climate prediction. The spatial and temporal scales associated with boundary-layer processes and clouds are typically much smaller than those at which GCMs are discretized, which makes their representation through parameterization a necessity. The need for generally applicable boundary-layer parameterizations has motivated many scientific studies, which in effect has created its own active research field in the atmospheric sciences. Of particular interest has been the evaluation of boundary-layer schemes at "process-level". This means that parameterized physics are studied in isolated mode from the larger-scale circulation, using prescribed forcings and excluding any upscale interaction. Although feedbacks are thus prevented, the benefit is an enhanced model transparency, which might aid an investigator in identifying model errors and understanding model behavior. The popularity and success of the process-level approach is demonstrated by the many past and ongoing model inter-comparison studies that have been organized by initiatives such as GCSS/GASS. A red line in the results of these studies is that although most schemes somehow manage to capture first-order aspects of boundary layer cloud fields, there certainly remains room for improvement in many areas. Only too often are boundary layer parameterizations still found to be at the heart of problems in large-scale models, negatively affecting forecast skills of NWP models or causing uncertainty in numerical predictions of future climate. How to break this parameterization "deadlock" remains an open problem. This presentation attempts to give an overview of the various existing methods for the process-level evaluation of boundary-layer physics in large-scale models. 
This includes i) idealized case studies, ii) longer-term evaluation at permanent meteorological sites (the testbed approach

  4. Beating the tyranny of scale with a private cloud configured for Big Data

    Science.gov (United States)

    Lawrence, Bryan; Bennett, Victoria; Churchill, Jonathan; Juckes, Martin; Kershaw, Philip; Pepler, Sam; Pritchard, Matt; Stephens, Ag

    2015-04-01

The Joint Analysis System, JASMIN, consists of five significant hardware components: a batch computing cluster, a hypervisor cluster, bulk disk storage, high performance disk storage, and access to a tape robot. Each of the computing clusters consists of a heterogeneous set of servers, supporting a range of possible data analysis tasks - and a unique network environment makes it relatively trivial to migrate servers between the two clusters. The high performance disk storage will include the world's largest (publicly visible) deployment of the Panasas parallel disk system. Initially deployed in April 2012, JASMIN has already undergone two major upgrades, culminating in a system which, by April 2015, will have in excess of 16 PB of disk and 4000 cores. Layered on the basic hardware is a range of services, from managed services, such as the curated archives of the Centre for Environmental Data Archival or the data analysis environment for the National Centres for Atmospheric Science and Earth Observation, to a generic Infrastructure as a Service (IaaS) offering for the UK environmental science community. Here we present examples of some of the big data workloads being supported in this environment - ranging from data management tasks, such as checksumming 3 PB of data held in over one hundred million files, to science tasks, such as re-processing satellite observations with new algorithms, or calculating new diagnostics on petascale climate simulation outputs. We will demonstrate how the provision of a cloud environment closely coupled to a batch computing environment, all sharing the same high performance disk system, allows massively parallel processing without the necessity to shuffle data excessively - even as it supports many different virtual communities, each with guaranteed performance. 
We will discuss the advantages of having a heterogeneous range of servers with available memory from tens of GB at the low end to (currently) two TB at the high end
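A data-management workload like the checksumming task above parallelizes naturally over files; a minimal sketch in Python (the SHA-256 choice, chunk size, and worker count are illustrative assumptions, not details of the JASMIN deployment):

```python
import hashlib
from multiprocessing.dummy import Pool  # thread pool: hashing at scale is I/O-bound


def sha256_of(path, chunk=1 << 20):
    """Stream a file through SHA-256 in 1 MiB chunks to bound memory use."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return path, h.hexdigest()


def checksum_archive(paths, workers=8):
    """Hash many files concurrently; returns {path: hex digest}."""
    with Pool(workers) as pool:
        return dict(pool.map(sha256_of, paths))
```

Because the work is dominated by disk reads rather than CPU, a thread pool is usually sufficient; on a parallel file system the worker count can be raised until the storage, not the CPU, becomes the bottleneck.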

  5. Security Architecture of Cloud Computing

    OpenAIRE

    V.KRISHNA REDDY; Dr. L.S.S.REDDY

    2011-01-01

Cloud Computing offers services over the internet with dynamically scalable resources. Cloud Computing services provide benefits to users in terms of cost and ease of use. Cloud Computing services need to address security during the transmission of sensitive data and critical applications to shared and public cloud environments. Cloud environments are scaling up to meet data processing and storage needs. Cloud computing environments have various advantages as well as disadvantages o...

  6. Chargeback for cloud services.

    NARCIS (Netherlands)

    Baars, T.; Khadka, R.; Stefanov, H.; Jansen, S.; Batenburg, R.; Heusden, E. van

    2014-01-01

    With pay-per-use pricing models, elastic scaling of resources, and the use of shared virtualized infrastructures, cloud computing offers more efficient use of capital and agility. To leverage the advantages of cloud computing, organizations have to introduce cloud-specific chargeback practices.
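The core of any pay-per-use chargeback practice is metered usage multiplied by unit rates; a toy sketch (the meter names and prices are invented for illustration, not taken from the paper):

```python
# Toy pay-per-use chargeback: bill a consumer for metered cloud usage.
RATES = {"cpu_hours": 0.05, "gb_stored": 0.02, "gb_egress": 0.09}  # illustrative unit prices (USD)


def chargeback(usage, rates=RATES):
    """usage: {meter: quantity}; returns (itemized bill, total) for one billing period."""
    itemized = {meter: round(qty * rates[meter], 2) for meter, qty in usage.items()}
    return itemized, round(sum(itemized.values()), 2)


# One department's monthly usage: 1000 CPU-hours, 500 GB stored, 100 GB egress.
items, total = chargeback({"cpu_hours": 1000, "gb_stored": 500, "gb_egress": 100})
```

In practice the rates themselves would be derived from the provider's elastic pricing plus an internal overhead allocation, which is where cloud-specific chargeback differs from traditional fixed-cost IT accounting.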

  7. Molecular clouds toward three Spitzer bubbles S116, S117, and S118: Evidence for a cloud-cloud collision which formed the three H II regions and a 10 pc scale molecular cavity

    Science.gov (United States)

    Fukui, Yasuo; Ohama, Akio; Kohno, Mikito; Torii, Kazufumi; Fujita, Shinji; Hattori, Yusuke; Nishimura, Atsushi; Yamamoto, Hiroaki; Tachihara, Kengo

    2018-05-01

We carried out a molecular-line study toward the three Spitzer bubbles S116, S117, and S118, which show active formation of high-mass stars. We found molecular gas consisting of two components with a velocity difference of ~5 km s⁻¹. One of them, the small cloud, has a typical velocity of -63 km s⁻¹ and the other, the large cloud, has one of -58 km s⁻¹. The large cloud has a nearly circular intensity depression, the size of which is similar to that of the small cloud. We present an interpretation that its cavity was created by a collision between the two clouds and that this collision compressed the gas into a dense layer elongating along the western rim of the small cloud. In this scenario, the O stars including those in the three Spitzer bubbles were formed in the interface layer compressed by the collision. Assuming that the relative motion of the clouds has a tilt of 45° to the line of sight, we estimate that the collision continued for the last 1 Myr at a relative velocity of ~10 km s⁻¹. In the S116-S117-S118 system the H II regions are located outside of the cavity. This morphology is ascribed to the density-bound distribution of the large cloud which caused the H II regions to expand more easily toward the outer part of the large cloud than towards the inside of the cavity. The present case proves that a cloud-cloud collision creates a cavity without the action of O-star feedback, and suggests that the collision-compressed layer is highly filamentary.
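The ~1 Myr collision duration quoted above is consistent with a simple crossing-time estimate, cavity size divided by relative velocity; a quick check (constants rounded):

```python
# Crossing-time estimate for the cloud-cloud collision: t ~ L / v.
PC_KM = 3.086e13   # kilometres per parsec
YR_S = 3.156e7     # seconds per year


def collision_timescale_myr(size_pc, v_rel_kms):
    """Time (Myr) to traverse a region of size_pc at relative velocity v_rel_kms."""
    return size_pc * PC_KM / v_rel_kms / YR_S / 1e6


# The ~10 pc cavity at the ~10 km/s deprojected relative velocity gives ~1 Myr.
t_myr = collision_timescale_myr(10, 10)
```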

  8. Uncertainties of Large-Scale Forcing Caused by Surface Turbulence Flux Measurements and the Impacts on Cloud Simulations at the ARM SGP Site

    Science.gov (United States)

    Tang, S.; Xie, S.; Tang, Q.; Zhang, Y.

    2017-12-01

    Two types of instruments, the eddy correlation flux measurement system (ECOR) and the energy balance Bowen ratio system (EBBR), are used at the Atmospheric Radiation Measurement (ARM) program Southern Great Plains (SGP) site to measure surface latent and sensible fluxes. ECOR and EBBR typically sample different land surface types, and the domain-mean surface fluxes derived from ECOR and EBBR are not always consistent. The uncertainties of the surface fluxes will have impacts on the derived large-scale forcing data and further affect the simulations of single-column models (SCM), cloud-resolving models (CRM) and large-eddy simulation models (LES), especially for the shallow-cumulus clouds which are mainly driven by surface forcing. This study aims to quantify the uncertainties of the large-scale forcing caused by surface turbulence flux measurements and investigate the impacts on cloud simulations using long-term observations from the ARM SGP site.
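One simple way to combine ECOR and EBBR point measurements into a domain-mean flux is an area-weighted average over the land-surface types each system samples; a minimal sketch (the station values and areal fractions are illustrative, not ARM SGP numbers):

```python
# Area-weighted domain-mean surface flux from stations sampling different land types.
def domain_mean_flux(station_fluxes, weights):
    """station_fluxes: {land_type: flux in W/m^2}; weights: {land_type: areal fraction}.
    The weights must cover the whole domain (sum to 1)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(station_fluxes[t] * w for t, w in weights.items())


# e.g. one system over grassland, the other over cropland (values invented)
lh = domain_mean_flux({"grassland": 120.0, "cropland": 180.0},
                      {"grassland": 0.6, "cropland": 0.4})
```

The uncertainty the abstract describes enters through both inputs: the station fluxes disagree between instruments, and the areal fractions are themselves estimates, so perturbing either propagates directly into the derived large-scale forcing.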

  9. Smartphone Analytics: Mobilizing the Lab into the Cloud for Omic-Scale Analyses.

    Science.gov (United States)

    Montenegro-Burke, J Rafael; Phommavongsay, Thiery; Aisporna, Aries E; Huan, Tao; Rinehart, Duane; Forsberg, Erica; Poole, Farris L; Thorgersen, Michael P; Adams, Michael W W; Krantz, Gregory; Fields, Matthew W; Northen, Trent R; Robbins, Paul D; Niedernhofer, Laura J; Lairson, Luke; Benton, H Paul; Siuzdak, Gary

    2016-10-04

Active data screening is an integral part of many scientific activities, and mobile technologies have greatly facilitated this process by minimizing the reliance on large hardware instrumentation. To meet the demands of the rapidly growing field of metabolomics and its heavy data-processing workload, we designed the first remote metabolomic data screening platform for mobile devices. Two mobile applications (apps), XCMS Mobile and METLIN Mobile, facilitate access to XCMS and METLIN, which are the most important components in the computer-based XCMS Online platforms. These mobile apps allow for the visualization and analysis of metabolic data throughout the entire analytical process. Specifically, XCMS Mobile and METLIN Mobile provide the capabilities for remote monitoring of data processing, real-time notifications for data processing, visualization and interactive analysis of processed data (e.g., cloud plots, principal component analysis, box-plots, extracted ion chromatograms, and hierarchical cluster analysis), and database searching for metabolite identification. These apps, available on Apple iOS and Google Android operating systems, allow for the migration of metabolomic research onto mobile devices for better accessibility beyond direct instrument operation. The utility of the XCMS Mobile and METLIN Mobile functionalities is demonstrated here through the metabolomic LC-MS analyses of stem cells, colon cancer, aging, and bacterial metabolism.

  10. Can Clouds Replace Grids? A Real-Life Exabyte-Scale Test-Case

    CERN Document Server

    Shiers, J

    2008-01-01

The world’s largest scientific machine – comprising dual 27km circular proton accelerators cooled to 1.9 K and located some 100m underground – currently relies on major production Grid infrastructures for the offline computing needs of the 4 main experiments that will take data at this facility. After many years of sometimes difficult preparation the computing service has been declared “open” and ready to meet the challenges that will come shortly when the machine restarts in 2009. But the service is not without its problems: reliability – as seen by the experiments, as opposed to that measured by the official tools – still needs to be significantly improved. Prolonged downtimes or degradations of major services or even complete sites are still too common and the operational and coordination effort to keep the overall service running is probably not sustainable at this level. Recently “Cloud Computing” – in terms of pay-per-use fabric provisioning – has...

  11. An uncertainty principle for star formation - II. A new method for characterising the cloud-scale physics of star formation and feedback across cosmic history

    Science.gov (United States)

    Kruijssen, J. M. Diederik; Schruba, Andreas; Hygate, Alexander P. S.; Hu, Chia-Yu; Haydon, Daniel T.; Longmore, Steven N.

    2018-05-01

The cloud-scale physics of star formation and feedback represent the main uncertainty in galaxy formation studies. Progress is hampered by the limited empirical constraints outside the restricted environment of the Local Group. In particular, the poorly quantified time evolution of the molecular cloud lifecycle, star formation, and feedback obstructs robust predictions on the scales smaller than the disc scale height that are resolved in modern galaxy formation simulations. We present a new statistical method to derive the evolutionary timeline of molecular clouds and star-forming regions. By quantifying the excess or deficit of the gas-to-stellar flux ratio around peaks of gas or star formation tracer emission, we directly measure the relative rarity of these peaks, which allows us to derive their lifetimes. We present a step-by-step, quantitative description of the method and demonstrate its practical application. The method's accuracy is tested in nearly 300 experiments using simulated galaxy maps, showing that it is capable of constraining the molecular cloud lifetime and feedback time-scale to <0.1 dex precision. Access to the evolutionary timeline provides a variety of additional physical quantities, such as the cloud-scale star formation efficiency, the feedback outflow velocity, the mass loading factor, and the feedback energy or momentum coupling efficiencies to the ambient medium. We show that the results are robust for a wide variety of gas and star formation tracers, spatial resolutions, galaxy inclinations, and galaxy sizes. Finally, we demonstrate that our method can be applied out to high redshift (z ≲ 4) with a feasible time investment on current large-scale observatories. This is a major shift from previous studies that constrained the physics of star formation and feedback in the immediate vicinity of the Sun.
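The heart of the method, measuring the excess or deficit of the gas-to-stellar flux ratio around emission peaks, can be illustrated on synthetic maps; this is a deliberately simplified toy of that statistic, not the authors' full estimator (which varies the aperture size and fits the evolutionary timeline):

```python
import numpy as np


def flux_ratio_bias(gas, sfr, peak_mask):
    """Ratio of (gas / SFR tracer) summed over apertures in peak_mask to the
    map-wide (galactic average) ratio. Values > 1 mean the selected peaks are
    gas-enriched relative to the average, indicating a long gas-only phase."""
    focused = gas[peak_mask].sum() / sfr[peak_mask].sum()
    galactic = gas.sum() / sfr.sum()
    return focused / galactic


# Synthetic galaxy: uniform background plus a gas-only peak (young molecular
# cloud) and an SFR-only peak (exposed star-forming region, gas dispersed).
shape = (64, 64)
gas = np.ones(shape)
sfr = np.ones(shape)
gas[10:14, 10:14] += 20.0   # molecular-cloud peak, no star formation yet
sfr[40:44, 40:44] += 20.0   # star-forming region, gas already removed
peaks = np.zeros(shape, bool)
peaks[10:14, 10:14] = True  # focus apertures on the gas peaks
bias = flux_ratio_bias(gas, sfr, peaks)  # > 1: gas excess at gas peaks
```

In the real method this excess, measured as a function of aperture size around both gas and SFR peaks, encodes the relative rarity of each phase and hence the cloud lifetime and feedback timescale.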

  12. Comprehensive comparison of two image-based point clouds from aerial photos with airborne lidar for large-scale mapping : Door detection to envelope reconstruction

    NARCIS (Netherlands)

    Widyaningrum, E.; Gorte, B.G.H.

    2017-01-01

    The integration of computer vision and photogrammetry to generate three-dimensional (3D) information from images has contributed to a wider use of point clouds, for mapping purposes. Large-scale topographic map production requires 3D data with high precision and

  13. The solar noise barrier project 3. The effects of seasonal spectral variation, cloud cover and heat distribution on the performance of full-scale luminescent solar concentrator panels

    NARCIS (Netherlands)

    Debije, M.G.; Tzikas, C.; de Jong, M.; Kanellis, M.; Slooff, L.H.

    We report on the relative performances of two large-scale luminescent solar concentrator (LSC) noise barriers placed in an outdoor environment monitored for over a year. Comparisons are made for the performances of a number of attached photovoltaic cells with changing spectral illumination, cloud

  14. GT-WGS: an efficient and economic tool for large-scale WGS analyses based on the AWS cloud service.

    Science.gov (United States)

    Wang, Yiqi; Li, Gen; Ma, Mark; He, Fazhong; Song, Zhuo; Zhang, Wei; Wu, Chengkun

    2018-01-19

Whole-genome sequencing (WGS) plays an increasingly important role in clinical practice and public health. Due to the large data size, WGS data analysis is usually compute-intensive and IO-intensive. Currently it usually takes 30 to 40 h to finish a 50× WGS analysis task, which is far from the ideal speed required by the industry. Furthermore, the high-end infrastructure required by WGS computing is costly in terms of time and money. In this paper, we aim to improve the time efficiency of WGS analysis and minimize the cost by elastic cloud computing. We developed a distributed system, GT-WGS, for large-scale WGS analyses utilizing the Amazon Web Services (AWS). Our system won the first prize on the Wind and Cloud challenge held by the Genomics and Cloud Technology Alliance conference (GCTA) committee. The system makes full use of the dynamic pricing mechanism of AWS. We evaluate the performance of GT-WGS with a 55× WGS dataset (400GB fastq) provided by the GCTA 2017 competition. In the best case, it only took 18.4 min to finish the analysis and the AWS cost of the whole process is only 16.5 US dollars. The accuracy of GT-WGS is 99.9% consistent with that of the Genome Analysis Toolkit (GATK) best practice. We also evaluated the performance of GT-WGS on a real-world dataset provided by the XiangYa hospital, which consists of 5× whole-genome data for 500 samples, and on average GT-WGS managed to finish one 5× WGS analysis task in 2.4 min at a cost of $3.6. WGS is already playing an important role in guiding therapeutic intervention. However, its application is limited by the time cost and computing cost. GT-WGS excelled as an efficient and affordable WGS analyses tool to address this problem. The demo video and supplementary materials of GT-WGS can be accessed at https://github.com/Genetalks/wgs_analysis_demo.
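The per-sample averages reported for the XiangYa dataset imply simple cohort-level estimates if one assumes linear scaling, which the paper does not guarantee; a back-of-envelope sketch:

```python
import math


def cohort_estimate(n_samples, minutes_each=2.4, dollars_each=3.6, parallel_jobs=1):
    """Linear-scaling estimate for a cohort: (wall-clock hours, total USD).
    minutes_each and dollars_each default to the per-sample averages quoted above;
    parallel_jobs is an assumed level of concurrency on the cloud."""
    batches = math.ceil(n_samples / parallel_jobs)
    return batches * minutes_each / 60.0, n_samples * dollars_each


# 500 samples with 50 concurrent jobs (an assumed concurrency, not from the paper):
hours, usd = cohort_estimate(500, parallel_jobs=50)
# 10 batches of 2.4 min each, and 500 * $3.6 in total compute cost
```

The real system exploits AWS spot pricing, so the dollar figure in particular would fluctuate with market rates rather than scale exactly linearly.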

  15. TripleCloud: An Infrastructure for Exploratory Querying over Web-Scale RDF Data

    NARCIS (Netherlands)

    Gueret, C.D.M.; Kotoulas, S.; Groth, P.T.

    2011-01-01

    As the availability of large scale RDF data sets has grown, there has been a corresponding growth in researchers' and practitioners' interest in analyzing and investigating these data sets. However, given their size and messiness, there is significant overhead in setting up the infrastructure to

  16. Massive Cloud Computing Processing of P-SBAS Time Series for Displacement Analyses at Large Spatial Scale

    Science.gov (United States)

    Casu, F.; de Luca, C.; Lanari, R.; Manunta, M.; Zinno, I.

    2016-12-01

A methodology for computing surface deformation time series and mean velocity maps of large areas is presented. Our approach relies on the availability of a multi-temporal set of Synthetic Aperture Radar (SAR) data collected from ascending and descending orbits over an area of interest, and also permits estimation of the vertical and horizontal (East-West) displacement components of the Earth's surface. The adopted methodology is based on an advanced Cloud Computing implementation of the Differential SAR Interferometry (DInSAR) Parallel Small Baseline Subset (P-SBAS) processing chain which allows the unsupervised processing of large SAR data volumes, from the raw data (level-0) imagery up to the generation of DInSAR time series and maps. The presented solution, which is highly scalable, has been tested on the ascending and descending ENVISAT SAR archives, which have been acquired over a large area of Southern California (US) that extends for about 90,000 km2. Such an input dataset has been processed in parallel by exploiting 280 computing nodes of the Amazon Web Services Cloud environment. Moreover, to produce the final mean deformation velocity maps of the vertical and East-West displacement components of the whole investigated area, we also took advantage of the information available from external GPS measurements, which make it possible to account for possible regional trends not easily detectable by DInSAR and to refer the P-SBAS measurements to an external geodetic datum. The presented results clearly demonstrate the effectiveness of the proposed approach, which paves the way to the extensive use of the available ERS and ENVISAT SAR data archives. Furthermore, the proposed methodology is particularly suitable for the very large data flow provided by the Sentinel-1 constellation, thus permitting DInSAR analyses to be extended to a nearly global scale. This work is partially supported by: the DPC-CNR agreement, the EPOS-IP project and the ESA GEP project.

  17. NASA Goddard Earth Sciences Graduate Student Program. [FIRE CIRRUS-II examination of coupling between an upper tropospheric cloud system and synoptic-scale dynamics

    Science.gov (United States)

    Ackerman, Thomas P.

    1994-01-01

    The evolution of synoptic-scale dynamics associated with a middle and upper tropospheric cloud event that occurred on 26 November 1991 is examined. The case under consideration occurred during the FIRE CIRRUS-II Intensive Field Observing Period held in Coffeyville, KS during Nov. and Dec., 1991. Using data from the wind profiler demonstration network and a temporally and spatially augmented radiosonde array, emphasis is given to explaining the evolution of the kinematically-derived ageostrophic vertical circulations and correlating the circulation with the forcing of an extensively sampled cloud field. This is facilitated by decomposing the horizontal divergence into its component parts through a natural coordinate representation of the flow. Ageostrophic vertical circulations are inferred and compared to the circulation forcing arising from geostrophic confluence and shearing deformation derived from the Sawyer-Eliassen Equation. It is found that a thermodynamically indirect vertical circulation existed in association with a jet streak exit region. The circulation was displaced to the cyclonic side of the jet axis due to the orientation of the jet exit between a deepening diffluent trough and building ridge. The cloud line formed in the ascending branch of the vertical circulation with the most concentrated cloud development occurring in conjunction with the maximum large-scale vertical motion. The relationship between the large scale dynamics and the parameterization of middle and upper tropospheric clouds in large-scale models is discussed and an example of ice water contents derived from a parameterization forced by the diagnosed vertical motions and observed water vapor contents is presented.

  18. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  19. Cloud Computing for radiologists

    International Nuclear Information System (INIS)

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future

  20. Cloud computing for radiologists

    Directory of Open Access Journals (Sweden)

    Amit T Kharat

    2012-01-01

Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  1. Cloud Geospatial Analysis Tools for Global-Scale Comparisons of Population Models for Decision Making

    Science.gov (United States)

    Hancher, M.; Lieber, A.; Scott, L.

    2017-12-01

    The volume of satellite and other Earth data is growing rapidly. Combined with information about where people are, these data can inform decisions in a range of areas including food and water security, disease and disaster risk management, biodiversity, and climate adaptation. Google's platform for planetary-scale geospatial data analysis, Earth Engine, grants access to petabytes of continually updating Earth data, programming interfaces for analyzing the data without the need to download and manage it, and mechanisms for sharing the analyses and publishing results for data-driven decision making. In addition to data about the planet, data about the human planet - population, settlement and urban models - are now available for global scale analysis. The Earth Engine APIs enable these data to be joined, combined or visualized with economic or environmental indicators such as nighttime lights trends, global surface water, or climate projections, in the browser without the need to download anything. We will present our newly developed application intended to serve as a resource for government agencies, disaster response and public health programs, or other consumers of these data to quickly visualize the different population models, and compare them to ground truth tabular data to determine which model suits their immediate needs. Users can further tap into the power of Earth Engine and other Google technologies to perform a range of analysis from simple statistics in custom regions to more complex machine learning models. We will highlight case studies in which organizations around the world have used Earth Engine to combine population data with multiple other sources of data, such as water resources and roads data, over deep stacks of temporal imagery to model disease risk and accessibility to inform decisions.

  2. The Magellanic clouds

    International Nuclear Information System (INIS)

    1989-01-01

    As the two galaxies nearest to our own, the Magellanic Clouds hold a special place in studies of the extragalactic distance scale, of stellar evolution and the structure of galaxies. In recent years, results from the South African Astronomical Observatory (SAAO) and elsewhere have shown that it is possible to begin understanding the three dimensional structure of the Clouds. Studies of Magellanic Cloud Cepheids have continued, both to investigate the three-dimensional structure of the Clouds and to learn more about Cepheids and their use as extragalactic distance indicators. Other research undertaken at SAAO includes studies on Nova LMC 1988 no 2 and red variables in the Magellanic Clouds

  3. Inferring Large-Scale Terrestrial Water Storage Through GRACE and GPS Data Fusion in Cloud Computing Environments

    Science.gov (United States)

    Rude, C. M.; Li, J. D.; Gowanlock, M.; Herring, T.; Pankratius, V.

    2016-12-01

    Surface subsidence due to depletion of groundwater can lead to permanent compaction of aquifers and damaged infrastructure. However, studies of such effects on a large scale are challenging and compute intensive because they involve fusing a variety of data sets beyond direct measurements from groundwater wells, such as gravity change measurements from the Gravity Recovery and Climate Experiment (GRACE) or surface displacements measured by GPS receivers. Our work therefore leverages Amazon cloud computing to enable these types of analyses spanning the entire continental US. Changes in groundwater storage are inferred from surface displacements measured by GPS receivers stationed throughout the country. Receivers located on bedrock are anti-correlated with changes in water levels from elastic deformation due to loading, while stations on aquifers correlate with groundwater changes due to poroelastic expansion and compaction. Correlating linearly detrended equivalent water thickness measurements from GRACE with linearly detrended and Kalman filtered vertical displacements of GPS stations located throughout the United States helps compensate for the spatial and temporal limitations of GRACE. Our results show that the majority of GPS stations are negatively correlated with GRACE in a statistically relevant way, as most GPS stations are located on bedrock in order to provide stable reference locations and measure geophysical processes such as tectonic deformations. Additionally, stations located on the Central Valley California aquifer show statistically significant positive correlations. Through the identification of positive and negative correlations, deformation phenomena can be classified as loading or poroelastic expansion due to changes in groundwater. This method facilitates further studies of terrestrial water storage on a global scale. This work is supported by NASA AIST-NNX15AG84G (PI: V. Pankratius) and Amazon.
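The core statistic, a correlation between linearly detrended GRACE equivalent-water-thickness and detrended GPS vertical displacements, can be sketched with NumPy on synthetic series (the real pipeline also Kalman-filters the GPS data, which is omitted here):

```python
import numpy as np


def detrend(y):
    """Remove a least-squares linear trend from a series."""
    t = np.arange(y.size)
    slope, intercept = np.polyfit(t, y, 1)
    return y - (slope * t + intercept)


def grace_gps_correlation(water_thickness, gps_vertical):
    """Pearson correlation of the two linearly detrended series."""
    return np.corrcoef(detrend(water_thickness), detrend(gps_vertical))[0, 1]


# Bedrock-like toy case: elastic loading depresses the surface when storage
# rises, so the detrended series should be strongly anti-correlated.
t = np.linspace(0, 4 * np.pi, 200)
water = 10 * np.sin(t) + 0.5 * t        # seasonal storage signal + secular trend
gps = -3 * np.sin(t) + 0.1 * t          # elastic response + unrelated trend
r = grace_gps_correlation(water, gps)   # close to -1
```

Stations on aquifers would show the opposite sign: poroelastic compaction lowers the surface as water is withdrawn, giving a positive correlation with GRACE, which is exactly the classification the study exploits.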

  4. A cloud-scale chemical-transport simulation during EULINOX. A case study for July 21 1998.

    Science.gov (United States)

    Ramaroson, R.

    2002-12-01

The main issues addressed by the European project EULINOX were the quantification of NOx production from lightning, the transport of NOx and surface emissions (e.g. CO) by convective systems, and the lightning distribution around thunderstorms. O3, CO, CO2, NOx and CN concentrations, J(NO2), meteorological variables and lightning were observed and measured using ground systems and aircraft platforms during the project. Two aircraft were operated, the DLR Falcon and the Do-228, providing the distribution of species in the PBL and at higher altitudes across the anvil along the jet tracks. July 21st 1998 was a special day during EULINOX: a strong convective system, high electrical activity, and an NO peak of around 23 ppbv measured on board the Falcon in the anvil. Thunderstorms associated with strong convective systems were encountered and well covered by the network of measurement systems, for the meteorology as well as for the chemistry and lightning localization. This work focuses on two objectives: describing the meteorology during EULINOX and quantifying the impact of the cloud scales on tropospheric NO and CO concentrations. To this end, two types of simulation have been performed. The first uses the MM5 model in a four-domain nested version (ratio = 3) to simulate the convective cloud system or isolated cell with a 1.5 km by 1.5 km resolution for the finest grid. The second uses an off-line chemical transport model (MEDIUM) with detailed chemistry, taking the MM5 dynamics as input. On the broader scale, the general synoptic meteorology over Europe is well simulated by MM5. Over the finest domain, the model generated a supercell storm, though one that was rather weak in its vertical characteristics and unstable compared to the observations. The cell depth is in good agreement with observations, with the horizontal position slightly shifted. The chemical-transport simulation using MEDIUM, taking the MM5 meteorology output as input, shows a

  5. Comparison of CloudSat and TRMM radar reflectivities

    Indian Academy of Sciences (India)

    Tropical deep convective clouds drive the large scale circulation of ... information concerning tropical clouds since 1998 ..... and CloudSat Data Processing Center, NASA for providing .... ical precipitating clouds ranging from shallow to deep.

  6. Evaluating statistical cloud schemes

    OpenAIRE

    Grützun, Verena; Quaas, Johannes; Morcrette , Cyril J.; Ament, Felix

    2015-01-01

    Statistical cloud schemes with prognostic probability distribution functions have become more important in atmospheric modeling, especially since they are in principle scale adaptive and capture cloud physics in more detail. While in theory the schemes have a great potential, their accuracy is still questionable. High-resolution three-dimensional observational data of water vapor and cloud water, which could be used for testing them, are missing. We explore the potential of ground-based re...

  7. Sensitivity of regional meteorology and atmospheric composition during the DISCOVER-AQ period to subgrid-scale cloud-radiation interactions

    Science.gov (United States)

    Huang, X.; Allen, D. J.; Herwehe, J. A.; Alapaty, K. V.; Loughner, C.; Pickering, K. E.

    2014-12-01

    Subgrid-scale cloudiness directly influences global and regional atmospheric radiation budgets by attenuating shortwave radiation, leading to suppressed convection, decreased surface precipitation as well as other meteorological parameter changes. We use the latest version of WRF (v3.6, Apr 2014), which incorporates the Kain-Fritsch (KF) convective parameterization to provide subgrid-scale cloud fraction and condensate feedback to the rapid radiative transfer model-global (RRTMG) shortwave and longwave radiation schemes. We apply the KF scheme to simulate the DISCOVER-AQ Maryland field campaign (July 2011), and compare the sensitivity of meteorological parameters to the control run that does not include subgrid cloudiness. Furthermore, we will examine the chemical impact from subgrid cloudiness using a regional chemical transport model (CMAQ). There are several meteorological parameters influenced by subgrid cumulus clouds that are very important to air quality modeling, including changes in surface temperature that impact biogenic emission rates; changes in PBL depth that affect pollutant concentrations; and changes in surface humidity levels that impact peroxide-related reactions. Additionally, subgrid cumulus clouds directly impact air pollutant concentrations by modulating photochemistry and vertical mixing. Finally, we will compare with DISCOVER-AQ flight observation data and evaluate how well this off-line CMAQ simulation driven by WRF with the KF scheme simulates the effects of regional convection on atmospheric composition.
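The first-order radiative effect described, subgrid cloud fraction attenuating the grid-mean shortwave flux, can be illustrated with a toy weighted average of clear and cloudy columns; this is not the KF-RRTMG formulation, and the transmittances are invented for illustration:

```python
# Toy grid-mean downwelling shortwave at the surface under partial subgrid
# cloudiness: weight clear-sky and cloudy-sky transmission by cloud fraction.
def surface_shortwave(sw_toa, cloud_fraction, t_clear=0.75, t_cloud=0.35):
    """Grid-mean surface SW (W/m^2); t_clear and t_cloud are illustrative
    column transmittances, not values from any radiation scheme."""
    return sw_toa * ((1 - cloud_fraction) * t_clear + cloud_fraction * t_cloud)


clear = surface_shortwave(1000.0, 0.0)    # cloud-free grid cell
cloudy = surface_shortwave(1000.0, 0.4)   # 40% subgrid cumulus cover
```

Even a modest subgrid cloud fraction cuts the grid-mean surface shortwave substantially, which is the pathway by which the KF cloud feedback cools the surface, suppresses biogenic emissions and photolysis rates, and shallows the PBL in the coupled WRF-CMAQ runs.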

  8. Relationships Between Tropical Deep Convection, Tropospheric Mean Temperature and Cloud-Induced Radiative Fluxes on Intraseasonal Time Scales

    Science.gov (United States)

    Ramey, Holly S.; Robertson, Franklin R.

    2010-01-01

    Intraseasonal variability of deep convection represents a fundamental mode of variability in the organization of tropical convection. While most studies of intraseasonal oscillations (ISOs) have focused on the spatial propagation and dynamics of convectively coupled circulations, we examine the projection of ISOs on the tropically-averaged temperature and energy budget. The area of interest is the global oceans between 20°N/S. Our analysis then focuses on these questions: (i) How is tropospheric temperature related to tropical deep convection and the associated ice cloud fractional amount (ICF) and ice water path (IWP)? (ii) What is the source of moisture sustaining the convection, and what role does deep convection play in mediating the PBL - free atmospheric temperature equilibration? (iii) What effect do convectively generated upper-tropospheric clouds have on the TOA radiation budget? Our methodology is similar to that of Spencer et al. (2007), with some modifications and additional diagnostics of both clouds and boundary layer thermodynamics. A composite ISO time series of cloud, precipitation and radiation quantities, built from nearly 40 events during a six-year period, is referenced to the atmospheric temperature signal. The increase of convective precipitation cannot be sustained by evaporation within the domain, implying strong moisture transports into the tropical ocean area. While there is a decrease in net TOA radiation that develops after the peak in deep convective rainfall, there is little evidence that an "Infrared Iris"-like mechanism is dominant. Rather, the cloud-induced OLR increase seems largely produced by weakened convection with warmer cloud tops. Tropical ISO events offer an accessible target for studying ISOs not just in terms of propagation mechanisms, but also in terms of their global signals of heat, moisture and radiative flux feedback processes.

  9. Towards large-scale data analysis: challenges in the design of portable systems and use of Cloud computing.

    Science.gov (United States)

    Diaz, Javier; Arrizabalaga, Saioa; Bustamante, Paul; Mesa, Iker; Añorga, Javier; Goya, Jon

    2013-01-01

    Portable systems and global communications open a broad spectrum for new health applications. In the framework of electrophysiological applications, several challenges are faced when developing portable systems embedded in Cloud computing services. In order to facilitate new developers in this area based on our experience, five areas of interest are presented in this paper where strategies can be applied for improving the performance of portable systems: transducer and conditioning, processing, wireless communications, battery and power management. Likewise, for Cloud services, scalability, portability, privacy and security guidelines have been highlighted.

  10. What does it take to build a medium scale scientific cloud to process significant amounts of Earth observation data?

    Science.gov (United States)

    Hollstein, André; Diedrich, Hannes; Spengler, Daniel

    2017-04-01

    The deployment of the operational Sentinel fleet by Copernicus offers an unprecedented influx of freely available Earth Observation data, with Sentinel-2 being a prime example. It offers a broad range of land applications due to its high spatial sampling from 10 m to 20 m and its multi-spectral imaging capabilities with 13 spectral bands. The open access policy allows unrestricted use by everybody and provides data downloads on the respective sites. For a small area of interest and shorter time series, data processing and exploitation can easily be done manually. However, for multi-temporal analysis of larger areas, the data size can quickly grow beyond what is manageable in practice on a personal computer, which leads to an increasing interest in central data exploitation platforms. Prominent examples are Google Earth Engine, NASA Earth Exchange (NEX) or current developments such as CODE-DE in Germany. Open standards are still evolving, and the choice of a platform may create lock-in scenarios and a situation where scientists are no longer in full control of all aspects of their analysis. Securing the intellectual property of researchers can become a major issue in the future. Partnering with a startup company that is dedicated to providing tools for farm management and precision farming, GFZ builds a small-scale science cloud named GTS2 for processing and distribution of Sentinel-2 data. The service includes a sophisticated atmospheric correction algorithm, spatial co-registration of time series data, as well as a web API for data distribution. This approach counters the drift toward centralized research on infrastructures controlled by others. By keeping the full licensing rights, it allows developing new business models independent of the initially chosen processing provider. Currently, data is held for the greater German area but is extendable to larger areas on short notice due to a scalable distributed network file system. For a

  11. Large Spatial Scale Ground Displacement Mapping through the P-SBAS Processing of Sentinel-1 Data on a Cloud Computing Environment

    Science.gov (United States)

    Casu, F.; Bonano, M.; de Luca, C.; Lanari, R.; Manunta, M.; Manzo, M.; Zinno, I.

    2017-12-01

    Since its launch in 2014, the Sentinel-1 (S1) constellation has played a key role in SAR data availability and dissemination all over the world. Indeed, the free and open access data policy adopted by the European Copernicus program, together with the global coverage acquisition strategy, makes the Sentinel constellation a game changer in the Earth Observation scenario. As SAR data become ubiquitous, the technological and scientific challenge is to maximize the exploitation of such a huge data flow. In this direction, the use of innovative processing algorithms and distributed computing infrastructures, such as Cloud Computing platforms, can play a crucial role. In this work we present a Cloud Computing solution for the advanced interferometric (DInSAR) processing chain based on the Parallel SBAS (P-SBAS) approach, aimed at processing S1 Interferometric Wide Swath (IWS) data for the generation of large spatial scale deformation time series in an efficient, automatic and systematic way. This DInSAR chain ingests Sentinel-1 SLC images and carries out several processing steps, to finally compute deformation time series and mean deformation velocity maps. Different parallel strategies have been designed ad hoc for each processing step of the P-SBAS S1 chain, encompassing both multi-core and multi-node programming techniques, in order to maximize the computational efficiency achieved within a Cloud Computing environment and cut down the relevant processing times. The presented P-SBAS S1 processing chain has been implemented on the Amazon Web Services platform and a thorough analysis of the attained parallel performances has been performed to identify and overcome the major bottlenecks to scalability. The presented approach is used to perform national-scale DInSAR analyses over Italy, involving the processing of more than 3000 S1 IWS images acquired from both ascending and descending orbits. Such an experiment confirms the big advantage of
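The per-pair parallelism described in this record can be sketched generically. The snippet below is an illustrative toy, not the P-SBAS code: `form_interferogram` is a hypothetical stand-in for the real interferogram-formation step, small-baseline pair selection is reduced to an index-gap test, and Python's `ProcessPoolExecutor` spreads the independent pairs across cores.

```python
from concurrent.futures import ProcessPoolExecutor
from itertools import combinations

def form_interferogram(pair):
    """Hypothetical stand-in for one interferogram-formation step."""
    ref, sec = pair
    # ... coregistration, cross-multiplication, filtering would go here ...
    return f"ifg_{ref}_{sec}"

def run_parallel(acquisitions, max_baseline=3):
    """Process all small-baseline pairs concurrently (multi-core)."""
    # SBAS-style selection: keep only pairs with a small temporal baseline,
    # here approximated by the gap between acquisition indices.
    pairs = [(a, b) for a, b in combinations(acquisitions, 2)
             if b - a <= max_baseline]
    with ProcessPoolExecutor(max_workers=2) as pool:
        return list(pool.map(form_interferogram, pairs))

if __name__ == "__main__":
    print(run_parallel(range(6)))
```

On a cloud platform the same pattern scales out by replacing the local pool with per-node workers; the pair selection logic and the independence of each pair are what make the step parallel.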

  12. On the use of Cloud Computing and Machine Learning for Large-Scale SAR Science Data Processing and Quality Assessment Analysis

    Science.gov (United States)

    Hua, H.

    2016-12-01

    Geodetic imaging is revolutionizing geophysics, but the scope of discovery has been limited by the labor-intensive technological implementation of the analyses. The Advanced Rapid Imaging and Analysis (ARIA) project has proven capability to automate SAR data processing and analysis. Existing and upcoming SAR missions such as Sentinel-1A/B and NISAR are also expected to generate massive amounts of SAR data. This has brought to the forefront the need for analytical tools for SAR quality assessment (QA) on large volumes of SAR data - a critical step before higher-level time series and velocity products can be reliably generated. Leveraging an advanced hybrid-cloud science data system for large-scale processing, machine learning approaches were then applied to automate the analysis of various quality metrics. Machine learning-based user training of features, cross-validation, and prediction models were integrated into our cloud-based science data processing flow to enable large-scale, high-throughput QA analytics and improve the production quality of geodetic data products.

  13. Internetware cloud computing: Challenges

    OpenAIRE

    Qamar, S; Lal, Niranjan; Singh, Mrityunjay

    2010-01-01

    After decades of engineering development and infrastructural investment, Internet connections have become a commodity product in many countries, and Internet-scale "cloud computing" has started to compete with the traditional software business through its technological advantages and economies of scale. Cloud computing is a promising enabling technology of Internetware, and is termed the next big thing in the modern corporate world. Apart from the present-day software and technologies,...

  14. Large scale and cloud-based multi-model analytics experiments on climate change data in the Earth System Grid Federation

    Science.gov (United States)

    Fiore, Sandro; Płóciennik, Marcin; Doutriaux, Charles; Blanquer, Ignacio; Barbera, Roberto; Donvito, Giacinto; Williams, Dean N.; Anantharaj, Valentine; Salomoni, Davide D.; Aloisio, Giovanni

    2017-04-01

    In many scientific domains such as climate, data is often n-dimensional and requires tools that support specialized data types and primitives to be properly stored, accessed, analysed and visualized. Moreover, new challenges arise in large-scale scenarios and ecosystems where petabytes (PB) of data can be available and data can be distributed and/or replicated, such as the Earth System Grid Federation (ESGF) serving the Coupled Model Intercomparison Project, Phase 5 (CMIP5) experiment, providing access to 2.5PB of data for the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5). A case study on climate model intercomparison data analysis addressing several classes of multi-model experiments is being implemented in the context of the EU H2020 INDIGO-DataCloud project. Such experiments require the availability of large amounts of data (multi-terabyte order) related to the output of several climate model simulations, as well as the exploitation of scientific data management tools for large-scale data analytics. More specifically, the talk discusses in detail a use case on precipitation trend analysis in terms of requirements, architectural design solution, and infrastructural implementation. The experiment has been tested and validated on CMIP5 datasets, in the context of a large scale distributed testbed across EU and US involving three ESGF sites (LLNL, ORNL, and CMCC) and one central orchestrator site (PSNC). The general "environment" of the case study relates to: (i) multi-model data analysis inter-comparison challenges; (ii) addressed on CMIP5 data; and (iii) which are made available through the IS-ENES/ESGF infrastructure. The added value of the solution proposed in the INDIGO-DataCloud project is summarized in the following: (i) it implements a different paradigm (from client- to server-side); (ii) it intrinsically reduces data movement; (iii) it makes the end-user setup lightweight; (iv) it fosters re-usability (of data, final

  15. General overview: European Integrated project on Aerosol Cloud Climate and Air Quality interactions (EUCAARI) – integrating aerosol research from nano to global scales

    Directory of Open Access Journals (Sweden)

    D. Simpson

    2011-12-01

    In this paper we describe and summarize the main achievements of the European Aerosol Cloud Climate and Air Quality Interactions project (EUCAARI). EUCAARI started on 1 January 2007 and ended on 31 December 2010, leaving a rich legacy including: (a) a comprehensive database with a year of observations of the physical, chemical and optical properties of aerosol particles over Europe, (b) comprehensive aerosol measurements in four developing countries, (c) a database of airborne measurements of aerosols and clouds over Europe during May 2008, and (d) comprehensive modeling tools to study aerosol processes from nano to global scale and their effects on climate and air quality. In addition, a new Pan-European aerosol emissions inventory was developed and evaluated, a new cluster spectrometer was built and tested in the field, and several new aerosol parameterizations and computation modules for chemical transport and global climate models were developed and evaluated. These achievements and related studies have substantially improved our understanding and reduced the uncertainties of aerosol radiative forcing and air quality-climate interactions. The EUCAARI results can be utilized in European and global environmental policy to assess the aerosol impacts and the corresponding abatement strategies.

  16. Small-scale structure and chemical differentiation in the central region of the Sagittarius B2 molecular cloud

    International Nuclear Information System (INIS)

    Goldsmith, P.F.; Snell, R.L.; Hasegawa, T.; Ukita, N.; Nobeyama Radio Observatory, Minamimaki, Japan)

    1987-01-01

    Fifteen arcsec angular resolution observations of a number of molecular species in the center of the Sgr B2 molecular cloud, including HC3N in the ground and v7 = 1 vibrational states, SO, OCS, and HNCO, have been performed. Emission from HC3N is fairly uniformly distributed over the region studied; SO and OCS have a spatially extended component but are strongly centrally peaked. HNCO and vibrationally excited HC3N emission are essentially restricted to a very small region around the center of activity in the north. The differences between the spatial distributions are attributed to variations in the chemical abundances of the various clumps. The excitation requirements of the vibrationally excited HC3N imply the presence of dust and gas at high temperatures. The results further heighten the apparent contradiction presented by the lack of infrared emission from this source. 53 references

  17. Cloud Governance

    DEFF Research Database (Denmark)

    Berthing, Hans Henrik

    This presentation describes the benefits and value of adopting cloud computing, and incorporates results from a series of international ISACA analyses of cloud computing.

  18. GEWEX cloud assessment: A review

    Science.gov (United States)

    Stubenrauch, Claudia; Rossow, William B.; Kinne, Stefan; Ackerman, Steve; Cesana, Gregory; Chepfer, Hélène; Di Girolamo, Larry; Getzewich, Brian; Guignard, Anthony; Heidinger, Andy; Maddux, Brent; Menzel, Paul; Minnis, Patrick; Pearl, Cindy; Platnick, Steven; Poulsen, Caroline; Riedi, Jérôme; Sayer, Andrew; Sun-Mack, Sunny; Walther, Andi; Winker, Dave; Zeng, Shen; Zhao, Guangyu

    2013-05-01

    Clouds cover about 70% of the Earth's surface and play a dominant role in the energy and water cycle of our planet. Only satellite observations provide a continuous survey of the state of the atmosphere over the entire globe and across the wide range of spatial and temporal scales that comprise weather and climate variability. Satellite cloud data records now exceed more than 25 years; however, climatologies compiled from different satellite datasets can exhibit systematic biases. Questions therefore arise as to the accuracy and limitations of the various sensors. The Global Energy and Water cycle Experiment (GEWEX) Cloud Assessment, initiated in 2005 by the GEWEX Radiation Panel, provides the first coordinated intercomparison of publicly available, global cloud products (gridded, monthly statistics) retrieved from measurements of multi-spectral imagers (some with multi-angle view and polarization capabilities), IR sounders and lidar. Cloud properties under study include cloud amount, cloud height (in terms of pressure, temperature or altitude), cloud radiative properties (optical depth or emissivity), cloud thermodynamic phase and bulk microphysical properties (effective particle size and water path). Differences in average cloud properties, especially in the amount of high-level clouds, are mostly explained by the inherent instrument measurement capability for detecting and/or identifying optically thin cirrus, especially when overlying low-level clouds. The study of long-term variations with these datasets requires consideration of many factors. The monthly, gridded database presented here facilitates further assessments, climate studies, and the evaluation of climate models.

  19. Characterization of Cloud Water-Content Distribution

    Science.gov (United States)

    Lee, Seungwon

    2010-01-01

    The development of realistic cloud parameterizations for climate models requires accurate characterizations of subgrid distributions of thermodynamic variables. To this end, a software tool was developed to characterize cloud water-content distributions in climate-model sub-grid scales. This software characterizes distributions of cloud water content with respect to cloud phase, cloud type, precipitation occurrence, and geo-location using CloudSat radar measurements. It uses a statistical method called maximum likelihood estimation to estimate the probability density function of the cloud water content.
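As a concrete illustration of the estimation step, the sketch below fits a lognormal density to synthetic cloud-water-content samples by maximum likelihood; for a lognormal, the MLE of (μ, σ) is just the mean and (population) standard deviation of the log-transformed data. The lognormal form and all numeric values are illustrative assumptions, not the tool's actual configuration.

```python
import numpy as np

# Synthetic cloud water content samples (g/m^3), drawn from a lognormal
# purely for illustration.
rng = np.random.default_rng(42)
samples = rng.lognormal(mean=-1.0, sigma=0.5, size=10_000)

# Maximum likelihood estimates for a lognormal: fit (mu, sigma) to log(x).
logs = np.log(samples)
mu_hat = logs.mean()            # MLE of mu
sigma_hat = logs.std(ddof=0)    # MLE of sigma (the biased form is the MLE)

def lognormal_pdf(x, mu, sigma):
    """Estimated probability density function of cloud water content."""
    return np.exp(-(np.log(x) - mu) ** 2 / (2 * sigma**2)) / (
        x * sigma * np.sqrt(2 * np.pi))

print(f"mu_hat={mu_hat:.3f}, sigma_hat={sigma_hat:.3f}")
```

The same log-then-fit pattern generalizes to fitting separate distributions per cloud phase, cloud type, or precipitation class, as the record describes.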

  20. Nitric acid particles in cold thick ice clouds observed at global scale: Link with lightning, temperature, and upper tropospheric water vapor

    OpenAIRE

    Chepfer , H.; Minnis , P.; Dubuisson , P.; Chiriaco , Marjolaine; Sun-Mack , S.; Rivière , E.D.

    2007-01-01

    Signatures of nitric acid particles (NAP) in cold thick ice clouds have been derived from satellite observations. Most NAP are detected in the tropics (9 to 20% of clouds with T < 202.5 K). Higher occurrences were found in the rare midlatitude very cold clouds. NAP occurrence increases as cloud temperature decreases, and NAP are more numerous in January than July. Comparisons of NAP and lightning distributions show that lightning seems to be the main source of the NOx...

  1. How Often and Why Do MODIS Cloud Property Retrievals Fail for Liquid-Phase Clouds over Ocean? A Comprehensive Analysis Based on A-Train Observations

    Science.gov (United States)

    Zhang, Z.; Cho, H. M.; Platnick, S. E.; Meyer, K.; Lebsock, M. D.

    2014-12-01

    The cloud optical thickness (τ) and droplet effective radius (re) are two key cloud parameters retrieved by MODIS (Moderate Resolution Imaging Spectroradiometer). These MODIS cloud products are widely used in a broad range of earth system science applications. In this paper, we present a comprehensive analysis of the failed cloud τ and/or re retrievals for liquid-phase clouds over ocean in the Collection 6 MODIS cloud product. The main findings from this study are summarized as follows: MODIS retrieval failure rates for marine boundary layer (MBL) clouds have a strong dependence on the spectral combination used for retrieval (e.g., 0.86 + 2.1 µm vs. 0.86 + 3.7 µm) and the cloud morphology (i.e., "good" pixels vs. partly cloudy (PCL) pixels). Combining all clear-sky-restoral (CSR) categories (CSR = 0, 1, and 3), the 0.86 + 2.1 µm and 0.86 + 3.7 µm spectral combinations have overall failure rates of about 20% and 12%, respectively. The PCL pixels (CSR = 1 and 3) have significantly higher failure rates and contribute more to the total failure population than the "good" (CSR = 0) pixels. The majority of the failed retrievals are caused by the re-too-large failure, which explains about 85% and 70% of the failed 0.86 + 2.1 µm and 0.86 + 3.7 µm retrievals, respectively. The remaining failures are due either to the re-too-small failure or to τ retrieval failure. The geographical distribution of failure rates has a significant dependence on cloud regime: lower over the coastal stratocumulus cloud regime and higher over the broken trade-wind cumulus cloud regime over open oceans. Enhanced retrieval failure rates are found when MBL clouds have high sub-pixel inhomogeneity, are located at special Sun-satellite viewing geometries (such as sunglint, large viewing or solar zenith angles, or cloudbow and glory angles), or are subject to cloud masking, cloud overlapping and/or cloud phase retrieval issues. About 80% of the failed retrievals can be attributed to at

  2. Evolution of the Large Scale Circulation, Cloud Structure and Regional Water Cycle Associated with the South China Sea Monsoon During May-June, 1998

    Science.gov (United States)

    Lau, William K.-M.; Li, Xiao-Fan

    2001-01-01

    In this paper, changes in the large-scale circulation, cloud structures and regional water cycle associated with the evolution of the South China Sea (SCS) monsoon in May-June 1998 were investigated using data from the Tropical Rainfall Measuring Mission (TRMM) and field data from the South China Sea Monsoon Experiment (SCSMEX). Results showed that both tropical and extratropical processes strongly influenced the onset and evolution of the SCS monsoon. Prior to the onset of the SCS monsoon, enhanced convective activities associated with the Madden-Julian Oscillation were detected over the Indian Ocean, and the SCS was under the influence of the West Pacific Anticyclone (WPA) with prevailing low-level easterlies and suppressed convection. Establishment of low-level westerlies across Indo-China, following the development of a Bay of Bengal depression, played an important role in building up convective available potential energy over the SCS. The onset of the SCS monsoon appeared to be triggered by the equatorward penetration of an extratropical frontal system, which was established over the coastal region of southern China and Taiwan in early May. Convective activities over the SCS were found to vary inversely with those over the Yangtze River Valley (YRV). Analysis of TRMM microwave and precipitation radar data revealed that during the onset phase, convection over the northern SCS consisted of squall-type rain cells embedded in meso-scale complexes similar to extratropical systems. The radar Z-factor intensity indicated that SCS clouds possessed a bimodal distribution, with a pronounced signal (less than 30 dBZ) at a height of 2-3 km, and another (less than 25 dBZ) at the 8-10 km level, separated by a well-defined melting level indicated by a bright band at around the 5-km level. The stratiform-to-convective cloud ratio was approximately 1:1 in the pre-onset phase, but increased to 5:1 in the active phase. Regional water budget calculations indicated that during the

  3. Technology Trends in Cloud Infrastructure

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Cloud computing is growing at an exponential pace with an increasing number of workloads being hosted in mega-scale public clouds such as Microsoft Azure. Designing and operating such large infrastructures requires not only a significant capital spend for provisioning datacenters, servers, networking and operating systems, but also R&D investments to capitalize on disruptive technology trends and emerging workloads such as AI/ML. This talk will cover the various infrastructure innovations being implemented in large scale public clouds and opportunities/challenges ahead to deliver the next generation of scale computing. About the speaker Kushagra Vaid is the general manager and distinguished engineer for Hardware Infrastructure in the Microsoft Azure division. He is accountable for the architecture and design of compute and storage platforms, which are the foundation for Microsoft’s global cloud-scale services. He and his team have successfully delivered four generations of hyperscale cloud hardwar...

  4. Cloud Computing Law

    CERN Document Server

    Millard, Christopher

    2013-01-01

    This book is about the legal implications of cloud computing. In essence, ‘the cloud’ is a way of delivering computing resources as a utility service via the internet. It is evolving very rapidly with substantial investments being made in infrastructure, platforms and applications, all delivered ‘as a service’. The demand for cloud resources is enormous, driven by such developments as the deployment on a vast scale of mobile apps and the rapid emergence of ‘Big Data’. Part I of this book explains what cloud computing is and how it works. Part II analyses contractual relationships between cloud service providers and their customers, as well as the complex roles of intermediaries. Drawing on primary research conducted by the Cloud Legal Project at Queen Mary University of London, cloud contracts are analysed in detail, including the appropriateness and enforceability of ‘take it or leave it’ terms of service, as well as the scope for negotiating cloud deals. Specific arrangements for public sect...

  5. Large-scale, high-performance and cloud-enabled multi-model analytics experiments in the context of the Earth System Grid Federation

    Science.gov (United States)

    Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.

    2017-12-01

    The increasing model resolution in the development of comprehensive Earth System Models is rapidly leading to very large climate simulation outputs that pose significant scientific data management challenges in terms of data sharing, processing, analysis, visualization, preservation, curation, and archiving. Large-scale global experiments for Climate Model Intercomparison Projects (CMIP) have led to the development of the Earth System Grid Federation (ESGF), a federated data infrastructure which has been serving the CMIP5 experiment, providing access to 2PB of data for the IPCC Assessment Reports. In such a context, running a multi-model data analysis experiment is very challenging, as it requires the availability of a large amount of data related to multiple climate model simulations and scientific data management tools for large-scale data analytics. To address these challenges, a case study on climate model intercomparison data analysis has been defined and implemented in the context of the EU H2020 INDIGO-DataCloud project. The case study has been tested and validated on CMIP5 datasets, in the context of a large scale, international testbed involving several ESGF sites (LLNL, ORNL and CMCC), one orchestrator site (PSNC) and one more hosting INDIGO PaaS services (UPV). Additional ESGF sites, such as NCI (Australia) and a couple more in Europe, are also joining the testbed. The added value of the proposed solution is summarized in the following: it implements a server-side paradigm which limits data movement; it relies on a High-Performance Data Analytics (HPDA) stack to address performance; it exploits the INDIGO PaaS layer to support flexible, dynamic and automated deployment of software components; it provides user-friendly web access based on the INDIGO Future Gateway; and finally it integrates, complements and extends the support currently available through ESGF. Overall it provides a new "tool" for climate scientists to run multi-model experiments.
At the

  6. Monitoring Cloud-prone Complex Landscapes At Multiple Spatial Scales Using Medium And High Resolution Optical Data: A Case Study In Central Africa

    Science.gov (United States)

    Basnet, Bikash

    Tracking land surface dynamics over cloud-prone areas with complex mountainous terrain and a landscape that is heterogeneous at a scale of approximately 10 m is an important challenge in the remote sensing of tropical regions in developing nations, due to the small plot sizes. Persistent monitoring of natural resources in these regions at multiple spatial scales requires the development of tools to identify emerging land cover transformation due to anthropogenic causes, such as agricultural expansion and climate change. Along with cloud cover and obstruction by topographic distortions due to steep terrain, there are limitations to the accuracy of monitoring change using available historical satellite imagery, largely due to sparse data access and the lack of high-quality ground truth for classifier training. One such complex region is the Lake Kivu region in Central Africa, a biodiversity hotspot with a complex and heterogeneous landscape and intensive agricultural development, where individual plot sizes are often at the scale of 10 m. This work addressed these problems to create an effective monitoring process for the region. Procedures were developed that use optical data from satellite and aerial observations at multiple scales to tackle the monitoring challenges. First, a novel processing chain was developed to systematically monitor the spatio-temporal land cover dynamics of this region over the years 1988, 2001, and 2011 using Landsat data, complemented by ancillary data. Topographic compensation was performed on Landsat reflectances to avoid strong illumination angle impacts, and image compositing was used to compensate for frequent cloud cover and thus incomplete annual data availability in the archive. A systematic supervised classification, using the state-of-the-art machine learning classifier Random Forest, was applied to the composite Landsat imagery to obtain land cover thematic maps
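The compositing step, building one cloud-free image per epoch from many partly cloudy acquisitions, can be illustrated with a generic per-pixel median composite. This is a hedged sketch under an assumed boolean cloud mask, not the exact chain used in the study.

```python
import numpy as np

def median_composite(scenes, cloud_masks):
    """Per-pixel median over clear observations.

    scenes:      (n, H, W) array of reflectances from n acquisitions
    cloud_masks: (n, H, W) boolean array, True where a pixel is cloudy
    Returns an (H, W) composite; NaN where a pixel is cloudy in every scene.
    """
    stack = np.where(cloud_masks, np.nan, scenes)  # hide cloudy observations
    return np.nanmedian(stack, axis=0)

# Tiny demo: three 2x2 "scenes", with one cloudy observation at pixel (0, 0).
scenes = np.array([[[0.1, 0.2], [0.3, 0.4]],
                   [[0.5, 0.2], [0.3, 0.4]],
                   [[0.9, 0.2], [0.3, 0.4]]])
masks = np.zeros_like(scenes, dtype=bool)
masks[1, 0, 0] = True                    # scene 1 is cloudy at (0, 0)
composite = median_composite(scenes, masks)
```

The median suppresses residual cloud and shadow outliers that a simple mean would smear into the composite, which is why median-style compositing is common for cloud-prone archives.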

  7. A comparison of shock-cloud and wind-cloud interactions: effect of increased cloud density contrast on cloud evolution

    Science.gov (United States)

    Goldsmith, K. J. A.; Pittard, J. M.

    2018-05-01

    The similarities, or otherwise, of a shock or wind interacting with a cloud of density contrast χ = 10 were explored in a previous paper. Here, we investigate such interactions with clouds of higher density contrast. We compare the adiabatic hydrodynamic interaction of a Mach 10 shock with a spherical cloud of χ = 10³ with that of a cloud embedded in a wind with identical parameters to the post-shock flow. We find that initially there are only minor morphological differences between the shock-cloud and wind-cloud interactions, compared to when χ = 10. However, once the transmitted shock exits the cloud, the development of a turbulent wake and fragmentation of the cloud differs between the two simulations. On increasing the wind Mach number, we note the development of a thin, smooth tail of cloud material, which is then disrupted by the fragmentation of the cloud core and subsequent 'mass-loading' of the flow. We find that the normalized cloud mixing time (tmix) is shorter at higher χ. However, a strong Mach number dependence on tmix and the normalized cloud drag time, t'drag, is not observed. Mach-number-dependent values of tmix and t'drag from comparable shock-cloud interactions converge towards the Mach-number-independent time-scales of the wind-cloud simulations. We find that high-χ clouds can be accelerated up to 80-90 per cent of the wind velocity and travel large distances before being significantly mixed. However, complete mixing is not achieved in our simulations and at late times the flow remains perturbed.

  8. Atmospheric diffusion of large clouds

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, T. V. [Univ. of California, Lawrence Radiation Lab., Livermore, California (United States)

    1967-07-01

    Clouds of pollutants travel within a coordinate system that is fixed to the earth's surface, and they diffuse and grow within a coordinate system fixed to the cloud's center. This paper discusses an approach to predicting the cloud's properties, within the latter coordinate system, on space scales of a few hundred meters to a few hundred kilometers and for time periods of a few days. A numerical cloud diffusion model is presented which starts with a cloud placed arbitrarily within the troposphere. Similarity theories of atmospheric turbulence are used to predict the horizontal diffusivity as a function of initial cloud size, turbulent atmospheric dissipation, and time. Vertical diffusivity is input as a function of time and height. Therefore, diurnal variations of turbulent diffusion in the boundary layer and effects of temperature inversions, etc. can be modeled. Nondiffusive cloud depletion mechanisms, such as dry deposition, washout, and radioactive decay, are also a part of this numerical model. An effluent cloud, produced by a reactor run at the Nuclear Rocket Development Station, Nevada, is discussed in this paper. Measurements on this cloud, for a period of two days, are compared to calculations with the above numerical cloud diffusion model. In general, there is agreement, within a factor of two, for airborne concentrations, cloud horizontal area, surface air concentrations, and dry deposition as airborne concentration decreased by seven orders of magnitude during the two-day period. (author)
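The similarity-theory scaling of horizontal diffusivity with cloud size can be illustrated with a Richardson-Obukhov-type law, K_h = c·ε^(1/3)·L^(4/3); the constant, dissipation rate, and initial size below are assumptions for illustration, not the paper's values:

```python
# Sketch: horizontal cloud growth under a Richardson-Obukhov diffusivity,
# K_h = c * eps**(1/3) * L**(4/3). All constants are illustrative assumptions.
c = 0.5    # dimensionless similarity constant (assumed)
eps = 1e-4 # turbulent dissipation rate, m^2 s^-3 (assumed)
L = 500.0  # initial horizontal cloud scale, m
dt = 60.0  # time step, s

for _ in range(1440):                         # integrate over one day
    K_h = c * eps ** (1 / 3) * L ** (4 / 3)   # diffusivity grows with the cloud
    L = (L * L + 2.0 * K_h * dt) ** 0.5       # dL^2/dt = 2 * K_h
```

With these illustrative numbers the cloud spreads from a few hundred meters to tens of kilometers in a day, consistent with the range of space scales the abstract discusses.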

  9. Marine Cloud Brightening

    Energy Technology Data Exchange (ETDEWEB)

    Latham, John; Bower, Keith; Choularton, Tom; Coe, H.; Connolly, P.; Cooper, Gary; Craft, Tim; Foster, Jack; Gadian, Alan; Galbraith, Lee; Iacovides, Hector; Johnston, David; Launder, Brian; Leslie, Brian; Meyer, John; Neukermans, Armand; Ormond, Bob; Parkes, Ben; Rasch, Philip J.; Rush, John; Salter, Stephen; Stevenson, Tom; Wang, Hailong; Wang, Qin; Wood, Robert

    2012-09-07

    The idea behind the marine cloud-brightening (MCB) geoengineering technique is that seeding marine stratocumulus clouds with copious quantities of roughly monodisperse sub-micrometre sea water particles might significantly enhance the cloud droplet number concentration, and thereby the cloud albedo and possibly longevity. This would produce a cooling, which general circulation model (GCM) computations suggest could - subject to satisfactory resolution of technical and scientific problems identified herein - have the capacity to balance global warming up to the carbon dioxide-doubling point. We describe herein an account of our recent research on a number of critical issues associated with MCB. This involves (i) GCM studies, which are our primary tools for evaluating globally the effectiveness of MCB, and assessing its climate impacts on rainfall amounts and distribution, and also polar sea-ice cover and thickness; (ii) high-resolution modelling of the effects of seeding on marine stratocumulus, which are required to understand the complex array of interacting processes involved in cloud brightening; (iii) microphysical modelling sensitivity studies, examining the influence of seeding amount, seed-particle salt-mass, air-mass characteristics, updraught speed and other parameters on cloud-albedo change; (iv) sea water spray-production techniques; (v) computational fluid dynamics studies of possible large-scale periodicities in Flettner rotors; and (vi) the planning of a three-stage limited-area field research experiment, with the primary objectives of technology testing and determining to what extent, if any, cloud albedo might be enhanced by seeding marine stratocumulus clouds on a spatial scale of around 100 km. We stress that there would be no justification for deployment of MCB unless it was clearly established that no significant adverse consequences would result. There would also need to be an international agreement firmly in favour of such action.

  10. Marine cloud brightening.

    Science.gov (United States)

    Latham, John; Bower, Keith; Choularton, Tom; Coe, Hugh; Connolly, Paul; Cooper, Gary; Craft, Tim; Foster, Jack; Gadian, Alan; Galbraith, Lee; Iacovides, Hector; Johnston, David; Launder, Brian; Leslie, Brian; Meyer, John; Neukermans, Armand; Ormond, Bob; Parkes, Ben; Rasch, Phillip; Rush, John; Salter, Stephen; Stevenson, Tom; Wang, Hailong; Wang, Qin; Wood, Rob

    2012-09-13

    The idea behind the marine cloud-brightening (MCB) geoengineering technique is that seeding marine stratocumulus clouds with copious quantities of roughly monodisperse sub-micrometre sea water particles might significantly enhance the cloud droplet number concentration, and thereby the cloud albedo and possibly longevity. This would produce a cooling, which general circulation model (GCM) computations suggest could-subject to satisfactory resolution of technical and scientific problems identified herein-have the capacity to balance global warming up to the carbon dioxide-doubling point. We describe herein an account of our recent research on a number of critical issues associated with MCB. This involves (i) GCM studies, which are our primary tools for evaluating globally the effectiveness of MCB, and assessing its climate impacts on rainfall amounts and distribution, and also polar sea-ice cover and thickness; (ii) high-resolution modelling of the effects of seeding on marine stratocumulus, which are required to understand the complex array of interacting processes involved in cloud brightening; (iii) microphysical modelling sensitivity studies, examining the influence of seeding amount, seed-particle salt-mass, air-mass characteristics, updraught speed and other parameters on cloud-albedo change; (iv) sea water spray-production techniques; (v) computational fluid dynamics studies of possible large-scale periodicities in Flettner rotors; and (vi) the planning of a three-stage limited-area field research experiment, with the primary objectives of technology testing and determining to what extent, if any, cloud albedo might be enhanced by seeding marine stratocumulus clouds on a spatial scale of around 100×100 km. We stress that there would be no justification for deployment of MCB unless it was clearly established that no significant adverse consequences would result. There would also need to be an international agreement firmly in favour of such action.

  11. An Automated Approach to Map Winter Cropped Area of Smallholder Farms across Large Scales Using MODIS Imagery

    Directory of Open Access Journals (Sweden)

    Meha Jain

    2017-06-01

    Fine-scale agricultural statistics are an important tool for understanding trends in food production and their associated drivers, yet these data are rarely collected in smallholder systems. These statistics are particularly important for smallholder systems given the large amount of fine-scale heterogeneity in production that occurs in these regions. To overcome the lack of ground data, satellite data are often used to map fine-scale agricultural statistics. However, doing so is challenging for smallholder systems because of (1) complex sub-pixel heterogeneity; (2) little to no available calibration data; and (3) high amounts of cloud cover, as most smallholder systems occur in the tropics. We develop an automated method, termed the MODIS Scaling Approach (MSA), to map smallholder cropped area across large spatial and temporal scales using MODIS Enhanced Vegetation Index (EVI) satellite data. We use this method to map winter cropped area, a key measure of cropping intensity, across the Indian subcontinent annually from 2000–2001 to 2015–2016. The MSA defines a pixel as cropped based on winter growing season phenology and scales the percent of cropped area within a single MODIS pixel based on observed EVI values at peak phenology. We validated the result with eleven high-resolution scenes (spatial scale of 5 × 5 m² or finer) that we classified into cropped versus non-cropped maps using training data collected by visual inspection of the high-resolution imagery. The MSA had moderate to high accuracies when validated using these eleven scenes across India (R² ranging between 0.19 and 0.89, with an overall R² of 0.71 across all sites). This method requires no calibration data, making it easy to implement across large spatial and temporal scales, with 100% spatial coverage due to the compositing of EVI to generate cloud-free data sets. The accuracies found in this study are similar to those of other studies that map crop production using automated methods
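The scaling step of the MSA described above can be sketched as a thresholded linear mapping from peak-season EVI to sub-pixel cropped fraction; the threshold values here are illustrative assumptions, not the calibrated ones used in the study:

```python
import numpy as np

def cropped_fraction(peak_evi, evi_min=0.2, evi_max=0.6):
    """Map peak winter-season EVI to a per-pixel cropped-area fraction.

    Pixels at or below evi_min are treated as uncropped, pixels at or
    above evi_max as fully cropped; values in between scale linearly.
    The thresholds are illustrative, not the paper's calibrated values.
    """
    frac = (np.asarray(peak_evi) - evi_min) / (evi_max - evi_min)
    return np.clip(frac, 0.0, 1.0)

peak_evi = np.array([0.1, 0.3, 0.5, 0.7])
frac = cropped_fraction(peak_evi)  # -> [0.0, 0.25, 0.75, 1.0]
```

Summing these per-pixel fractions over a district, multiplied by the pixel area, would yield the kind of fine-scale cropped-area statistic the abstract describes.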

  12. Privacy proof in the cloud

    NARCIS (Netherlands)

    Jessen, Veerle; Weigand, Hans; Mouratidis, Haris

    Cloud computing has been a frequently researched subject as it brings many advantages, such as the ability to store data remotely and scale rapidly, but also comes with several issues, including privacy, trust and security. The decision whether it is best to go `into the cloud' or to `stay inside'

  13. CLOUD PARAMETERIZATIONS, CLOUD PHYSICS, AND THEIR CONNECTIONS: AN OVERVIEW

    International Nuclear Information System (INIS)

    LIU, Y.; DAUM, P.H.; CHAI, S.K.; LIU, F.

    2002-01-01

    This paper consists of three parts. The first part is concerned with the parameterization of cloud microphysics in climate models. We demonstrate the crucial importance of spectral dispersion of the cloud droplet size distribution in determining radiative properties of clouds (e.g., effective radius), and underline the necessity of specifying spectral dispersion in the parameterization of cloud microphysics. It is argued that the inclusion of spectral dispersion makes the issue of cloud parameterization essentially equivalent to that of the droplet size distribution function, bringing cloud parameterization to the forefront of cloud physics. The second part is concerned with theoretical investigations into the spectral shape of droplet size distributions in cloud physics. After briefly reviewing the mainstream theories (including entrainment and mixing theories, and stochastic theories), we discuss their deficiencies and the need for a paradigm shift from reductionist approaches to systems approaches. A systems theory that has recently been formulated by utilizing ideas from statistical physics and information theory is discussed, along with the major results derived from it. It is shown that the systems formalism not only easily explains many puzzles that have been frustrating the mainstream theories, but also reveals such new phenomena as scale-dependence of cloud droplet size distributions. The third part is concerned with the potential applications of the systems theory to the specification of spectral dispersion in terms of predictable variables and scale-dependence under different fluctuating environments
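The role of spectral dispersion in determining the effective radius can be made concrete with a Liu-Daum-type parameterization, r_e = β·(3·LWC/(4π·ρ_w·N))^(1/3); both the particular form of β used here and the input values are illustrative assumptions:

```python
import math

def effective_radius(lwc, n, eps):
    """Effective radius (m) from liquid water content lwc (kg m^-3),
    droplet number concentration n (m^-3), and relative dispersion eps.

    Uses a Liu-Daum-type dispersion factor,
        beta = (1 + 2*eps**2)**(2/3) / (1 + eps**2)**(1/3),
    an assumed form for illustration.
    """
    rho_w = 1000.0  # density of liquid water, kg m^-3
    beta = (1 + 2 * eps**2) ** (2 / 3) / (1 + eps**2) ** (1 / 3)
    r_vol = (3 * lwc / (4 * math.pi * rho_w * n)) ** (1 / 3)  # mean volume radius
    return beta * r_vol

# Ignoring dispersion (eps = 0) underestimates r_e relative to eps = 0.5:
r_mono = effective_radius(0.3e-3, 100e6, 0.0)
r_disp = effective_radius(0.3e-3, 100e6, 0.5)
```

The difference between r_mono and r_disp is exactly the kind of bias the paper argues a parameterization neglecting spectral dispersion would introduce into cloud radiative properties.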

  14. Making and Breaking Clouds

    Science.gov (United States)

    Kohler, Susanna

    2017-10-01

    they create. But to match with observations, this would suggest that molecular clouds are short-lived objects that are built (and therefore replenished) just as quickly as they are destroyed. Is this possible? In a recent study, a team of scientists led by Mordecai-Mark Mac Low (American Museum of Natural History and Heidelberg University, Germany) explore whether there is a way to create molecular clouds rapidly enough to match the necessary rate of destruction. Mac Low and collaborators find that some common mechanisms used to explain the formation of molecular clouds, such as gas being swept up by supernovae, can't quite operate quickly enough to combat the rate of cloud destruction. On the other hand, the Toomre gravitational instability, a large-scale gravitational instability that occurs in gas disks, can very rapidly assemble gas into clumps dense enough to form molecules. [Figure: a composite of visible and near-infrared images from the VLT ANTU telescope of the Barnard 68 molecular cloud, roughly half a light-year in diameter. ESO] Based on their findings, the authors argue that dense, star-forming molecular clouds persist only for a short time before collapsing into stars and then being blown apart by stellar feedback, but that these very clouds are built equally quickly via gravitational instabilities. Conveniently, this model has a very testable prediction: the Toomre instability is expected to become even stronger at higher redshift, which suggests that the fraction of gas in the form of molecules should increase at high redshifts. This appears to agree with observations, supporting the authors' picture of a rapid cycle of cloud assembly and destruction. Citation: Mordecai-Mark Mac Low et al. 2017 ApJL 847 L10. doi:10.3847/2041-8213/aa8a61
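The Toomre criterion mentioned above says a rotating gas disk is gravitationally unstable where Q = c_s·κ/(π·G·Σ) drops below about 1. A worked example with illustrative solar-neighbourhood numbers (assumed values, not taken from the paper):

```python
import math

# Toomre stability parameter Q = c_s * kappa / (pi * G * Sigma);
# Q < 1 implies large-scale gravitational instability.
G = 6.674e-11                        # gravitational constant, m^3 kg^-1 s^-2
c_s = 7e3                            # gas velocity dispersion, m/s (assumed)
kappa = 36e3 / 3.086e19              # epicyclic frequency, s^-1 (36 km/s/kpc, assumed)
Sigma = 10 * 1.989e30 / 3.086e16**2  # gas surface density: 10 M_sun/pc^2 in kg/m^2

Q = c_s * kappa / (math.pi * G * Sigma)
```

A colder or more gas-rich disk (smaller c_s, larger Σ) lowers Q, which is why gas-rich high-redshift disks are expected to be more Toomre-unstable, as the prediction in the article states.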

  15. From clouds to cores to envelopes to disks: a multi-scale view of magnetized star formation

    Science.gov (United States)

    Hull, Charles; Plambeck, R. L.; TADPOL survey Team

    2014-01-01

    Magnetic fields are thought to play an important role in the formation of stars. However, that importance has been called into question by previous observations showing misalignment between protostellar outflows and magnetic fields (B-fields), as well as inconsistency in field morphology between 10,000 and 1000 AU scales. To investigate these inconsistencies, we used the 1.3 mm full-Stokes polarimeter — which I tested, installed, and calibrated for CARMA, a mm-wave interferometer — to map dust polarization with ~2.5" resolution toward 29 star-forming cores and 8 star-forming regions as part of the TADPOL survey. We find that a subset of the sources have consistent B-field orientations between the large (~20") scales measured by single-dish submm bolometers and the small scales measured by CARMA. Those same sources also tend to have higher fractional polarizations (measured by CARMA), presumably because the B-fields are less twisted by dynamic effects. However, even in these sources, which seem to have retained the memory of the global B-field direction, the fields in the cores are misaligned with the disks and outflows in the central protostars — a key result of the TADPOL survey. Furthermore, the cores with lower polarization fractions tend to have B-fields that are perpendicular to outflows, which suggests that in these sources the B-fields have lost the memory of the larger-scale global field, and have been wrapped up by core rotation. This is an important result for disk formation theory, as it suggests that field misalignment may indeed be the solution to the magnetic braking catastrophe. Finally, we find that all sources exhibit the so-called “polarization hole” effect, where the polarization drops significantly near the total intensity peak. When this effect was seen in low-resolution single-dish maps, it was attributed to the averaging of unresolved structure in the plane of the sky. However, the higher resolution maps we present here resolve these

  16. Cloud Cover

    Science.gov (United States)

    Schaffhauser, Dian

    2012-01-01

    This article features a major statewide initiative in North Carolina that is showing how a consortium model can minimize risks for districts and help them exploit the advantages of cloud computing. Edgecombe County Public Schools in Tarboro, North Carolina, intends to exploit a major cloud initiative being refined in the state and involving every…

  17. Cloud Control

    Science.gov (United States)

    Ramaswami, Rama; Raths, David; Schaffhauser, Dian; Skelly, Jennifer

    2011-01-01

    For many IT shops, the cloud offers an opportunity not only to improve operations but also to align themselves more closely with their schools' strategic goals. The cloud is not a plug-and-play proposition, however--it is a complex, evolving landscape that demands one's full attention. Security, privacy, contracts, and contingency planning are all…

  18. RECENT THREATS TO CLOUD COMPUTING DATA AND ITS PREVENTION MEASURES

    OpenAIRE

    Rahul Neware*

    2017-01-01

    As cloud computing expands day by day due to its benefits, such as cost, speed, global scale, productivity, performance, and reliability, everyone from business vendors to governments is using it to grow fast. Although cloud computing has these and other benefits, cloud security remains a problem, and because of this security problem the adoption of cloud computing is not growing. This paper gives information about recent threats to cloud computing data and its prevention measures...

  19. Sentinel-1 data massive processing for large scale DInSAR analyses within Cloud Computing environments through the P-SBAS approach

    Science.gov (United States)

    Lanari, Riccardo; Bonano, Manuela; Buonanno, Sabatino; Casu, Francesco; De Luca, Claudio; Fusco, Adele; Manunta, Michele; Manzo, Mariarosaria; Pepe, Antonio; Zinno, Ivana

    2017-04-01

    -core programming techniques. Currently, Cloud Computing environments make available large collections of computing resources and storage that can be effectively exploited through the presented S1 P-SBAS processing chain to carry out interferometric analyses at a very large scale, in reduced time. This allows us to deal also with the problems connected to the use of S1 P-SBAS chain in operational contexts, related to hazard monitoring and risk prevention and mitigation, where handling large amounts of data represents a challenging task. As a significant experimental result we performed a large spatial scale SBAS analysis relevant to the Central and Southern Italy by exploiting the Amazon Web Services Cloud Computing platform. In particular, we processed in parallel 300 S1 acquisitions covering the Italian peninsula from Lazio to Sicily through the presented S1 P-SBAS processing chain, generating 710 interferograms, thus finally obtaining the displacement time series of the whole processed area. This work has been partially supported by the CNR-DPC agreement, the H2020 EPOS-IP project (GA 676564) and the ESA GEP project.

  20. Marine cloud brightening

    Science.gov (United States)

    Latham, John; Bower, Keith; Choularton, Tom; Coe, Hugh; Connolly, Paul; Cooper, Gary; Craft, Tim; Foster, Jack; Gadian, Alan; Galbraith, Lee; Iacovides, Hector; Johnston, David; Launder, Brian; Leslie, Brian; Meyer, John; Neukermans, Armand; Ormond, Bob; Parkes, Ben; Rasch, Phillip; Rush, John; Salter, Stephen; Stevenson, Tom; Wang, Hailong; Wang, Qin; Wood, Rob

    2012-01-01

    The idea behind the marine cloud-brightening (MCB) geoengineering technique is that seeding marine stratocumulus clouds with copious quantities of roughly monodisperse sub-micrometre sea water particles might significantly enhance the cloud droplet number concentration, and thereby the cloud albedo and possibly longevity. This would produce a cooling, which general circulation model (GCM) computations suggest could—subject to satisfactory resolution of technical and scientific problems identified herein—have the capacity to balance global warming up to the carbon dioxide-doubling point. We describe herein an account of our recent research on a number of critical issues associated with MCB. This involves (i) GCM studies, which are our primary tools for evaluating globally the effectiveness of MCB, and assessing its climate impacts on rainfall amounts and distribution, and also polar sea-ice cover and thickness; (ii) high-resolution modelling of the effects of seeding on marine stratocumulus, which are required to understand the complex array of interacting processes involved in cloud brightening; (iii) microphysical modelling sensitivity studies, examining the influence of seeding amount, seed-particle salt-mass, air-mass characteristics, updraught speed and other parameters on cloud–albedo change; (iv) sea water spray-production techniques; (v) computational fluid dynamics studies of possible large-scale periodicities in Flettner rotors; and (vi) the planning of a three-stage limited-area field research experiment, with the primary objectives of technology testing and determining to what extent, if any, cloud albedo might be enhanced by seeding marine stratocumulus clouds on a spatial scale of around 100×100 km. We stress that there would be no justification for deployment of MCB unless it was clearly established that no significant adverse consequences would result. There would also need to be an international agreement firmly in favour of such action

  1. CloudETL

    DEFF Research Database (Denmark)

    Liu, Xiufeng; Thomsen, Christian; Pedersen, Torben Bach

    2014-01-01

    Extract-Transform-Load (ETL) programs process data into data warehouses (DWs). Rapidly growing data volumes demand systems that scale out. Recently, much attention has been given to MapReduce for parallel handling of massive data sets in cloud environments. Hive is the most widely used RDBMS...

  2. Nitric acid particles in cold thick ice clouds observed at global scale: Link with lightning, temperature, and upper tropospheric water vapor

    Science.gov (United States)

    Chepfer, H.; Minnis, P.; Dubuisson, P.; Chiriaco, M.; Sun-Mack, S.; Rivière, E. D.

    2007-03-01

    Signatures of nitric acid particles (NAP) in cold thick ice clouds have been derived from satellite observations. Most NAP are detected in the tropics (9 to 20% of clouds with T < 202.5 K). Higher occurrences were found in the rare midlatitudes very cold clouds. NAP occurrence increases as cloud temperature decreases, and NAP are more numerous in January than July. Comparisons of NAP and lightning distributions show that lightning seems to be the main source of the NOx, which forms NAP in cold clouds over continents. Qualitative comparisons of NAP with upper tropospheric humidity distributions suggest that NAP may play a role in the dehydration of the upper troposphere when the tropopause is colder than 195 K.

  3. Evaluating Vegetation Type Effects on Land Surface Temperature at the City Scale

    Science.gov (United States)

    Wetherley, E. B.; McFadden, J. P.; Roberts, D. A.

    2017-12-01

    Understanding the effects of different plant functional types and urban materials on surface temperatures has significant consequences for climate modeling, water management, and human health in cities. To date, doing so at the urban scale has been complicated by small-scale surface heterogeneity and limited data. In this study we examined gradients of land surface temperature (LST) across sub-pixel mixtures of different vegetation types and urban materials across the entire Los Angeles, CA, metropolitan area (4,283 km²). We used AVIRIS airborne hyperspectral imagery (36 m resolution, 224 bands, 0.35 - 2.5 μm) to estimate sub-pixel fractions of impervious, pervious, tree, and turfgrass surfaces, validating them with simulated mixtures constructed from image spectra. We then used simultaneously imaged LST retrievals collected at multiple times of day to examine how temperature changed along gradients of the sub-pixel mixtures. Diurnal in situ LST measurements were used to confirm image values. Sub-pixel fractions were well correlated with simulated validation data for turfgrass (r² = 0.71), tree (r² = 0.77), impervious (r² = 0.77), and pervious (r² = 0.83) surfaces. The LST of pure pixels showed the effects of both the diurnal cycle and the surface type, with vegetated classes having a smaller diurnal temperature range of 11.6°C whereas non-vegetated classes had a diurnal range of 16.2°C (similar to in situ measurements collected simultaneously with the imagery). Observed LST across fractional gradients of turf/impervious and tree/impervious sub-pixel mixtures decreased linearly with increasing vegetation fraction. The slopes of decreasing LST were significantly different between tree and turf mixtures, with steeper slopes observed for turf (p < 0.05). These results suggest that different physiological characteristics and different access to irrigation water of urban trees and turfgrass results in significantly different LST effects, which can be detected at
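The linear LST-vegetation relationship described above can be sketched as a simple least-squares fit of temperature against sub-pixel vegetation fraction; the slope and noise level in the synthetic data below are illustrative, not the study's retrieved values:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic gradient: LST falls roughly linearly as sub-pixel vegetation
# fraction rises (slope and noise are illustrative assumptions).
veg_frac = np.linspace(0.0, 1.0, 50)                   # tree or turf cover fraction
lst = 45.0 - 12.0 * veg_frac + rng.normal(0, 0.5, 50)  # degrees C

# slope estimates degrees C of cooling per unit increase in vegetation fraction
slope, intercept = np.polyfit(veg_frac, lst, 1)
```

Comparing such fitted slopes between tree/impervious and turf/impervious mixtures is what allows the two vegetation types' cooling effects to be distinguished statistically.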

  4. Cloud Computing Fundamentals

    Science.gov (United States)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  5. Scaling Critical Zone analysis tasks from desktop to the cloud utilizing contemporary distributed computing and data management approaches: A case study for project based learning of Cyberinfrastructure concepts

    Science.gov (United States)

    Swetnam, T. L.; Pelletier, J. D.; Merchant, N.; Callahan, N.; Lyons, E.

    2015-12-01

    Earth science is making rapid advances through effective utilization of large-scale data repositories such as aerial LiDAR and access to NSF-funded cyberinfrastructures (e.g. the OpenTopography.org data portal, iPlant Collaborative, and XSEDE). Scaling analysis tasks that are traditionally developed using desktops, laptops or computing clusters to effectively leverage national and regional scale cyberinfrastructure poses unique challenges and barriers to adoption. To address some of these challenges, in Fall 2014 'Applied Cyberinfrastructure Concepts', a project-based learning course (ISTA 420/520) at the University of Arizona, focused on developing scalable models of 'Effective Energy and Mass Transfer' (EEMT, MJ m⁻² yr⁻¹) for use by the NSF Critical Zone Observatories (CZO) project. EEMT is a quantitative measure of the flux of available energy to the critical zone, and its computation involves inputs that have broad applicability (e.g. solar insolation). The course comprised 25 students with varying levels of computational skill and no prior domain background in the geosciences, who collaborated with domain experts to develop the scalable workflow. The original workflow, relying on the open-source QGIS platform on a laptop, was scaled to effectively utilize cloud environments (OpenStack), UA Campus HPC systems, iRODS, and other XSEDE and OSG resources. The project utilizes public data, e.g. DEMs produced by OpenTopography.org and climate data from Daymet, which are processed using GDAL, GRASS and SAGA and the Makeflow and Work-queue task management software packages. Students were placed into collaborative groups to develop the separate aspects of the project. They were allowed to change teams, alter workflows, and design and develop novel code. The students were able to identify all necessary dependencies, recompile source onto the target execution platforms, and demonstrate a functional workflow, which was further improved upon by one of the group leaders over

  6. Evaluation of Passive Multilayer Cloud Detection Using Preliminary CloudSat and CALIPSO Cloud Profiles

    Science.gov (United States)

    Minnis, P.; Sun-Mack, S.; Chang, F.; Huang, J.; Nguyen, L.; Ayers, J. K.; Spangenberg, D. A.; Yi, Y.; Trepte, C. R.

    2006-12-01

    During the last few years, several algorithms have been developed to detect and retrieve multilayered clouds using passive satellite data. Assessing these techniques has been difficult due to the need for active sensors such as cloud radars and lidars that can "see" through different layers of clouds. Such sensors have been available only at a few surface sites and on aircraft during field programs. With the launch of the CALIPSO and CloudSat satellites on April 28, 2006, it is now possible to observe multilayered systems all over the globe using collocated cloud radar and lidar data. As part of the A-Train, these new active sensors are also matched in time and space with passive measurements from the Aqua Moderate Resolution Imaging Spectroradiometer (MODIS) and Advanced Microwave Scanning Radiometer - EOS (AMSR-E). The Clouds and the Earth's Radiant Energy System (CERES) has been developing and testing algorithms to detect ice-over-water overlapping cloud systems and to retrieve the cloud liquid water path (LWP) and ice water path (IWP) for those systems. One technique uses a combination of the CERES cloud retrieval algorithm applied to MODIS data and a microwave retrieval method applied to AMSR-E data. The combination of a CO2-slicing cloud retrieval technique with the CERES algorithms applied to MODIS data (Chang et al., 2005) is used to detect and analyze such overlapped systems that contain thin ice clouds. A third technique uses brightness temperature differences and the CERES algorithms to detect similar overlapped systems. This paper uses preliminary CloudSat and CALIPSO data to begin a global-scale assessment of these different methods. The long-term goals are to assess and refine the algorithms to aid the development of an optimal combination of the techniques to better monitor ice and liquid water clouds in overlapped conditions.

  7. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  8. TURBULENCE DECAY AND CLOUD CORE RELAXATION IN MOLECULAR CLOUDS

    International Nuclear Information System (INIS)

    Gao, Yang; Law, Chung K.; Xu, Haitao

    2015-01-01

    The turbulent motion within molecular clouds is a key factor controlling star formation. Turbulence supports molecular cloud cores from evolving to gravitational collapse and hence sets a lower bound on the size of molecular cloud cores in which star formation can occur. On the other hand, without a continuous external energy source maintaining the turbulence, such as in molecular clouds, the turbulence decays with an energy dissipation time comparable to the dynamic timescale of clouds, which could change the size limits obtained from Jeans' criterion by assuming constant turbulence intensities. Here we adopt scaling relations of physical variables in decaying turbulence to analyze its specific effects on the formation of stars. We find that the decay of turbulence provides an additional approach for Jeans' criterion to be achieved, after which gravitational infall governs the motion of the cloud core. This epoch of turbulence decay is defined as cloud core relaxation. The existence of cloud core relaxation provides a more complete understanding of the effect of the competition between turbulence and gravity on the dynamics of molecular cloud cores and star formation
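The interplay the abstract describes can be illustrated with a Jeans length that folds turbulent support into an effective sound speed, c_eff² = c_s² + σ²/3; both this approximation and the molecular-cloud numbers below are illustrative assumptions:

```python
import math

# Illustrative molecular-cloud values (assumed, not the paper's):
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
m_H = 1.67e-27       # hydrogen mass, kg
mu = 2.33            # mean molecular weight of molecular gas
rho = 1e9 * mu * m_H # mass density for n = 10^3 cm^-3
c_s = 190.0          # thermal sound speed at ~10 K, m/s

def jeans_length(sigma):
    """Jeans length (m) with turbulent support folded into an effective
    sound speed (a common approximation, assumed here for illustration)."""
    c_eff = math.sqrt(c_s**2 + sigma**2 / 3.0)
    return c_eff * math.sqrt(math.pi / (G * rho))

lam_turbulent = jeans_length(1000.0)  # sigma = 1 km/s: turbulence-supported
lam_decayed = jeans_length(0.0)       # turbulence fully decayed
# lam_decayed < lam_turbulent: as turbulence decays, smaller cores satisfy
# the Jeans criterion and can begin gravitational infall ("core relaxation").
```

The drop from lam_turbulent to lam_decayed is the mechanism by which turbulence decay provides the "additional approach" to the Jeans criterion described above.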

  9. Context-aware distributed cloud computing using CloudScheduler

    Science.gov (United States)

    Seuster, R.; Leavett-Brown, CR; Casteels, K.; Driemel, C.; Paterson, M.; Ring, D.; Sobie, RJ; Taylor, RP; Weldon, J.

    2017-10-01

    The distributed cloud using the CloudScheduler VM provisioning service is one of the longest running systems for HEP workloads. It has run millions of jobs for ATLAS and Belle II over the past few years using private and commercial clouds around the world. Our goal is to scale the distributed cloud to the 10,000-core level, with the ability to run any type of application (low I/O, high I/O and high memory) on any cloud. To achieve this goal, we have been implementing changes that utilize context-aware computing designs that are currently employed in the mobile communication industry. Context-awareness makes use of real-time and archived data to respond to user or system requirements. In our distributed cloud, we have many opportunistic clouds with no local HEP services, software or storage repositories. A context-aware design significantly improves the reliability and performance of our system by locating the nearest location of the required services. We describe how we are collecting and managing contextual information from our workload management systems, the clouds, the virtual machines and our services. This information is used not only to monitor the system but also to carry out automated corrective actions. We are incrementally adding new alerting and response services to our distributed cloud. This will enable us to scale the number of clouds and virtual machines. Further, a context-aware design will enable us to run analysis or high I/O application on opportunistic clouds. We envisage an open-source HTTP data federation (for example, the DynaFed system at CERN) as a service that would provide us access to existing storage elements used by the HEP experiments.

  10. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating...... the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production......), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  11. Mobile Clouds

    DEFF Research Database (Denmark)

    Fitzek, Frank; Katz, Marcos

    A mobile cloud is a cooperative arrangement of dynamically connected communication nodes sharing opportunistic resources. In this book, authors provide a comprehensive and motivating overview of this rapidly emerging technology. The book explores how distributed resources can be shared by mobile...... users in very different ways and for various purposes. The book provides many stimulating examples of resource-sharing applications. Enabling technologies for mobile clouds are also discussed, highlighting the key role of network coding. Mobile clouds have the potential to enhance communications...... performance, improve utilization of resources and create flexible platforms to share resources in very novel ways. Energy efficient aspects of mobile clouds are discussed in detail, showing how being cooperative can bring mobile users significant energy saving. The book presents and discusses multiple...

  12. The PdBI arcsecond whirlpool survey (PAWS). I. A cloud-scale/multi-wavelength view of the interstellar medium in a grand-design spiral galaxy

    International Nuclear Information System (INIS)

    Schinnerer, Eva; Meidt, Sharon E.; Hughes, Annie; Colombo, Dario; Pety, Jérôme; Schuster, Karl F.; Dumas, Gaëlle; García-Burillo, Santiago; Dobbs, Clare L.; Leroy, Adam K.; Kramer, Carsten; Thompson, Todd A.; Regan, Michael W.

    2013-01-01

    The Plateau de Bure Interferometer Arcsecond Whirlpool Survey has mapped the molecular gas in the central ∼9 kpc of M51 in its ¹²CO(1-0) line emission at a cloud-scale resolution of ∼40 pc using both IRAM telescopes. We utilize this data set to quantitatively characterize the relation of molecular gas (or CO emission) to other tracers of the interstellar medium, star formation, and stellar populations of varying ages. Using two-dimensional maps, a polar cross-correlation technique and pixel-by-pixel diagrams, we find: (1) that (as expected) the distribution of the molecular gas can be linked to different components of the gravitational potential; (2) evidence for a physical link between CO line emission and radio continuum that seems not to be caused by massive stars, but rather depends on the gas density; (3) a close spatial relation between polycyclic aromatic hydrocarbon (PAH) and molecular gas emission, but no predictive power of PAH emission for the molecular gas mass; (4) that the I – H color map is an excellent predictor of the distribution (and to a lesser degree, the brightness) of CO emission; and (5) that the impact of massive (UV-intense) young star-forming regions on the bulk of the molecular gas in the central ∼9 kpc cannot be significant due to a complex spatial relation between molecular gas and star-forming regions that ranges from cospatial to spatially offset to absent. The last point, in particular, highlights the importance of galactic environment—and thus the underlying gravitational potential—for the distribution of molecular gas and star formation.

  13. Extending 3D near-cloud corrections from shorter to longer wavelengths

    International Nuclear Information System (INIS)

    Marshak, Alexander; Evans, K. Frank; Várnai, Tamás; Wen, Guoyong

    2014-01-01

    Satellite observations have shown a positive correlation between cloud amount and aerosol optical thickness (AOT) that can be explained by the humidification of aerosols near clouds, and/or by cloud contamination by sub-pixel size clouds and the cloud adjacency effect. The last effect may substantially increase reflected radiation in cloud-free columns, leading to overestimates in the retrieved AOT. For clear-sky areas near boundary layer clouds the main contribution to the enhancement of clear sky reflectance at shorter wavelengths comes from the radiation scattered into clear areas by clouds and then scattered to the sensor by air molecules. Because of the wavelength dependence of air molecule scattering, this process leads to a larger reflectance increase at shorter wavelengths, and can be corrected using a simple two-layer model [18]. However, correcting only for molecular scattering skews spectral properties of the retrieved AOT. Kassianov and Ovtchinnikov [9] proposed a technique that uses spectral reflectance ratios to retrieve AOT in the vicinity of clouds; they assumed that the cloud adjacency effect influences the spectral ratio between reflectances at two wavelengths less than it influences the reflectances themselves. This paper combines the two approaches: It assumes that the 3D correction for the shortest wavelength is known with some uncertainties, and then it estimates the 3D correction for longer wavelengths using a modified ratio method. The new approach is tested with 3D radiances simulated for 26 cumulus fields from Large-Eddy Simulations, supplemented with 40 aerosol profiles. The results showed that (i) for a variety of cumulus cloud scenes and aerosol profiles over ocean the 3D correction due to cloud adjacency effect can be extended from shorter to longer wavelengths and (ii) the 3D corrections for longer wavelengths are not very sensitive to unbiased random uncertainties in the 3D corrections at shorter wavelengths.
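
    The combined approach can be summarized schematically (the notation here is assumed for illustration and is not taken verbatim from the paper): writing ρ(λ) for the observed clear-sky reflectance near a cloud and Δρ(λ) for its 3D near-cloud enhancement, the ratio assumption says the adjacency effect perturbs the fractional enhancement roughly equally across wavelengths, so a known correction at the shortest wavelength λ₀ propagates to longer λ:

```latex
% Ratio assumption (schematic): the fractional 3D enhancement is nearly
% spectrally flat compared with the reflectances themselves,
\frac{\Delta\rho(\lambda)}{\rho(\lambda)}
  \approx \frac{\Delta\rho(\lambda_0)}{\rho(\lambda_0)}
% hence the 3D correction at a longer wavelength follows from the known
% (possibly uncertain) correction at \lambda_0:
\Delta\rho(\lambda) \approx \rho(\lambda)\,
  \frac{\Delta\rho(\lambda_0)}{\rho(\lambda_0)}
```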

  14. Evolution of Cloud Storage as Cloud Computing Infrastructure Service

    OpenAIRE

    Rajan, Arokia Paul; Shanmugapriyaa

    2013-01-01

    Enterprises are driving towards less cost, more availability, agility, managed risk - all of which is accelerated towards Cloud Computing. Cloud is not a particular product, but a way of delivering IT services that are consumable on demand, elastic to scale up and down as needed, and follow a pay-for-usage model. Out of the three common types of cloud computing service models, Infrastructure as a Service (IaaS) is a service model that provides servers, computing power, network bandwidth and S...

  15. Determination of clouds in MSG data for the validation of clouds in a regional climate model

    OpenAIRE

    Huckle, Roger

    2009-01-01

    Regional climate models (e.g. CLM) can help to assess the influence of anthropogenic climate change on the different regions of the earth. Validation of these models is very important. Satellite data are of great benefit, as data on a global scale and with high temporal resolution are available. In this thesis a cloud detection and object-based cloud classification for Meteosat Second Generation (MSG) was developed and used to validate CLM clouds. Results sometimes show too many clouds in the CLM.

  16. Soft Clouding

    DEFF Research Database (Denmark)

    Søndergaard, Morten; Markussen, Thomas; Wetton, Barnabas

    2012-01-01

    Soft Clouding is a blended concept, which describes the aim of a collaborative and transdisciplinary project. The concept is a metaphor implying a blend of cognitive, embodied interaction and semantic web. Furthermore, it is a metaphor describing our attempt at curating a new semantics of sound...... archiving. The Soft Clouding Project is part of LARM - a major infrastructure combining research in and access to sound and radio archives in Denmark. In 2012 the LARM infrastructure will consist of more than 1 million hours of radio, combined with metadata that describe the content. The idea is to analyse...... the concept of ‘infrastructure’ and ‘interface’ on a creative play with the fundamentals of LARM (and any sound archive situation combining many kinds and layers of data and sources). This paper will present and discuss the Soft clouding project from the perspective of the three practices and competencies...

  17. Cloud Chamber

    DEFF Research Database (Denmark)

    Gfader, Verina

    Cloud Chamber takes its roots in a performance project, titled The Guests 做东, devised by Verina Gfader for the 11th Shanghai Biennale, ‘Why Not Ask Again: Arguments, Counter-arguments, and Stories’. Departing from the inclusion of the biennale audience to write a future folk tale, Cloud Chamber......: fiction and translation and translation through time; post literacy; world picturing-world typing; and cartographic entanglements and expressions of subjectivity; through the lens of a social imaginary of worlding or cosmological quest. Art at its core? Contributions by Nikos Papastergiadis, Rebecca Carson...

  18. Scales

    Science.gov (United States)

    Scales are a visible peeling or flaking of outer skin layers. These layers are called the stratum ... Scales may be caused by dry skin, certain inflammatory skin conditions, or infections. Examples of disorders that ...

  19. Quantifying Diurnal Cloud Radiative Effects by Cloud Type in the Tropical Western Pacific

    Energy Technology Data Exchange (ETDEWEB)

    Burleyson, Casey D.; Long, Charles N.; Comstock, Jennifer M.

    2015-06-01

    Cloud radiative effects are examined using long-term datasets collected at the three Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facilities in the tropical western Pacific. We quantify the surface radiation budget, cloud populations, and cloud radiative effects by partitioning the data by cloud type, time of day, and as a function of large scale modes of variability such as El Niño Southern Oscillation (ENSO) phase and wet/dry seasons at Darwin. The novel facet of our analysis is that we break aggregate cloud radiative effects down by cloud type across the diurnal cycle. The Nauru cloud populations and subsequently the surface radiation budget are strongly impacted by ENSO variability whereas the cloud populations over Manus only shift slightly in response to changes in ENSO phase. The Darwin site exhibits large seasonal monsoon related variations. We show that while deeper convective clouds have a strong conditional influence on the radiation reaching the surface, their limited frequency reduces their aggregate radiative impact. The largest source of shortwave cloud radiative effects at all three sites comes from low clouds. We use the observations to demonstrate that potential model biases in the amplitude of the diurnal cycle and mean cloud frequency would lead to larger errors in the surface energy budget compared to biases in the timing of the diurnal cycle of cloud frequency. Our results provide solid benchmarks to evaluate model simulations of cloud radiative effects in the tropics.

  20. Enabling Global Observations of Clouds and Precipitation on Fine Spatio-Temporal Scales from CubeSat Constellations: Temporal Experiment for Storms and Tropical Systems Technology Demonstration (TEMPEST-D)

    Science.gov (United States)

    Reising, S. C.; Todd, G.; Padmanabhan, S.; Lim, B.; Heneghan, C.; Kummerow, C.; Chandra, C. V.; Berg, W. K.; Brown, S. T.; Pallas, M.; Radhakrishnan, C.

    2017-12-01

    The Temporal Experiment for Storms and Tropical Systems (TEMPEST) mission concept consists of a constellation of 5 identical 6U-Class satellites observing storms at 5 millimeter-wave frequencies with 5-10 minute temporal sampling to observe the time evolution of clouds and their transition to precipitation. Such a small satellite mission would enable the first global measurements of clouds and precipitation on the time scale of tens of minutes and the corresponding spatial scale of a few km. TEMPEST is designed to improve the understanding of cloud processes by providing critical information on temporal signatures of precipitation and helping to constrain one of the largest sources of uncertainty in cloud models. TEMPEST millimeter-wave radiometers are able to perform remote observations of the cloud interior to observe microphysical changes as the cloud begins to precipitate or ice accumulates inside the storm. The TEMPEST technology demonstration (TEMPEST-D) mission is in progress to raise the TRL of the instrument and spacecraft systems from 6 to 9 as well as to demonstrate radiometer measurement and differential drag capabilities required to deploy a constellation of 6U-Class satellites in a single orbital plane. The TEMPEST-D millimeter-wave radiometer instrument provides observations at 89, 165, 176, 180 and 182 GHz using a single compact instrument designed for 6U-Class satellites. The direct-detection topology of the radiometer receiver substantially reduces both its power consumption and design complexity compared to heterodyne receivers. The TEMPEST-D instrument performs precise, end-to-end calibration using a cross-track scanning reflector to view an ambient blackbody calibration target and cosmic microwave background every scan period. The TEMPEST-D radiometer instrument has been fabricated and successfully tested under environmental conditions (vibration, thermal cycling and vacuum) expected in low-Earth orbit. TEMPEST-D began in Aug. 2015, with a

  1. Formation of massive, dense cores by cloud-cloud collisions

    Science.gov (United States)

    Takahira, Ken; Shima, Kazuhiro; Habe, Asao; Tasker, Elizabeth J.

    2018-05-01

    We performed sub-parsec (∼0.014 pc) scale simulations of cloud-cloud collisions of two idealized turbulent molecular clouds (MCs) with different masses in the range of (0.76-2.67) × 10⁴ M_{⊙} and with collision speeds of 5-30 km s⁻¹. These parameters are larger than in Takahira, Tasker, and Habe (2014, ApJ, 792, 63), in which the colliding system showed a partial gaseous arc morphology that supports the NANTEN observations of objects identified as colliding MCs. Gas clumps with density greater than 10⁻²⁰ g cm⁻³ were identified as pre-stellar cores and tracked through the simulation to investigate the effects of the mass of the colliding clouds and of the collision speed on the resulting core population. Our results demonstrate that the properties of the smaller cloud are more important for the outcome of cloud-cloud collisions. The mass function of the formed cores can be approximated by a power law with index γ = -1.6 in slower cloud-cloud collisions (v ∼ 5 km s⁻¹), in good agreement with observations of MCs. A faster relative speed increases the number of cores formed in the early stage of the collision and shortens the gas accretion phase of cores in the shocked region, leading to the suppression of core growth. A bending point appears in the high-mass part of the core mass function, and the bending-point mass decreases with increasing collision speed for the same combination of colliding clouds. The part of the core mass function above the bending-point mass can be approximated by a power law with γ = -2 to -3, similar to the power index of the massive part of the observed stellar initial mass function. We discuss the implications of our results for massive-star formation in our Galaxy.
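
    The broken power-law shape of the core mass function described above can be sketched as follows (the bending mass, normalization, and the exact high-mass slope here are illustrative assumptions, not values fitted by the authors):

```python
# Schematic broken power-law core mass function (CMF): dN/dM ∝ M^gamma with
# gamma ≈ -1.6 below a "bending point" mass and a steeper gamma ≈ -2 to -3
# above it, the two branches matched so dN/dM is continuous at the bend.

def core_mass_function(m, m_bend=10.0, gamma_lo=-1.6, gamma_hi=-2.5, n0=1.0):
    """Piecewise power-law dN/dM (arbitrary units), continuous at m_bend."""
    if m <= 0:
        raise ValueError("mass must be positive")
    if m <= m_bend:
        return n0 * m ** gamma_lo
    # Normalize the steep branch so the two branches agree at m_bend.
    norm_hi = n0 * m_bend ** (gamma_lo - gamma_hi)
    return norm_hi * m ** gamma_hi
```

A faster collision would be modeled here by lowering `m_bend`, so the steeper high-mass slope sets in at smaller core masses, consistent with the suppressed core growth the abstract describes.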

  2. General overview: European Integrated project on Aerosol Cloud Climate and Air Quality interactions (EUCAARI) – integrating aerosol research from nano to global scales

    DEFF Research Database (Denmark)

    Kulmala, M.; Asmi, A.; Lappalainen, H. K.

    2011-01-01

    In this paper we describe and summarize the main achievements of the European Aerosol Cloud Climate and Air Quality Interactions project (EUCAARI). EUCAARI started on 1 January 2007 and ended on 31 December 2010 leaving a rich legacy including: (a) a comprehensive database with a year...... of observations of the physical, chemical and optical properties of aerosol particles over Europe, (b) comprehensive aerosol measurements in four developing countries, (c) a database of airborne measurements of aerosols and clouds over Europe during May 2008, (d) comprehensive modeling tools to study aerosol...

  3. Cloud computing.

    Science.gov (United States)

    Wink, Diane M

    2012-01-01

    In this bimonthly series, the author examines how nurse educators can use Internet and Web-based technologies such as search, communication, and collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. This article describes how cloud computing can be used in nursing education.

  4. Cloud Computing

    Indian Academy of Sciences (India)

    IAS Admin

    2014-03-01

    Mar 1, 2014 ... There are several types of services available on a cloud. We describe .... CPU speed has been doubling every 18 months at constant cost. Besides this ... Plain text (e.g., email) may be read by anyone who is able to access it.

  5. Cloud Computing for Complex Performance Codes.

    Energy Technology Data Exchange (ETDEWEB)

    Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Klein, Brandon Thorin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Miner, John Gifford [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    This report describes the use of cloud computing services for running complex public-domain performance assessment problems. The work consisted of two phases: Phase 1 demonstrated that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  6. Galaxy CloudMan: delivering cloud compute clusters.

    Science.gov (United States)

    Afgan, Enis; Baker, Dannon; Coraor, Nate; Chapman, Brad; Nekrutenko, Anton; Taylor, James

    2010-12-21

    Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is "cloud computing", which, in principle, offers on-demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate "as is" use by experimental biologists. We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon's EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add or customize an otherwise available cloud system to better meet their needs. The knowledge and effort required to deploy a compute cluster in the Amazon EC2 cloud are not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge.

  7. General overview: European Integrated project on Aerosol Cloud Climate and Air Quality interactions (EUCAARI)-integrating aerosol research from nano to global scales

    NARCIS (Netherlands)

    Kulmala, M.; Asmi, A.; Lappalainen, H.K.; Baltensperger, U.; Brenguier, J.-L.; Facchini, M.C.; Hansson, H.-C.; Hov, Ø.; O'Dowd, C.D.; Pöschl, U.; Wiedensohler, A.; Boers, R.; Boucher, O.; Leeuw, G. de; Denier van der Gon, H.A.C.; Feichter, J.; Krejci, R.; Laj, P.; Lihavainen, H.; Lohmann, U.; McFiggans, G.; Mentel, T.; Pilinis, C.; Riipinen, I.; Schulz, M.; Stohl, A.; Swietlicki, E.; Vignati, E.; Alves, C.; Amann, M.; Ammann, M.; Arabas, S.; Artaxo, P.; Baars, H.; Beddows, D.C.S.; Bergström, R.; Beukes, J.P.; Bilde, M.; Burkhart, J.F.; Canonaco, F.; Clegg, S.L.; Coe, H.; Crumeyrolle, S.; D'Anna, B.; Decesari, S.; Gilardoni, S.; Fischer, M.; Fjaeraa, A.M.; Fountoukis, C.; George, C.; Gomes, L.; Halloran, P.; Hamburger, T.; Harrison, R.M.; Herrmann, H.; Hoffmann, T.; Hoose, C.; Hu, M.; Hyvärinen, A.; Hõrrak, U.; Iinuma, Y.; Iversen, T.; Josipovic, M.; Kanakidou, M.; Kiendler-Scharr, A.; Kirkevåg, A.; Kiss, G.; Klimont, Z.; Kolmonen, P.; Komppula, M.; Kristjánsson, J.-E.; Laakso, L.; Laaksonen, A.; Labonnote, L.; Lanz, V.A.; Lehtinen, K.E.J.; Rizzo, L.V.; Makkonen, R.; Manninen, H.E.; McMeeking, G.; Merikanto, J.; Minikin, A.; Mirme, S.; Morgan, W.T.; Nemitz, E.; O'Donnell, D.; Panwar, T.S.; Pawlowska, H.; Petzold, A.; Pienaar, J.J.; Pio, C.; Plass-Duelmer, C.; Prévôt, A.S.H.; Pryor, S.; Reddington, C.L.; Roberts, G.; Rosenfeld, D.; Schwarz, J.; Seland, O.; Sellegri, K.; Shen, X.J.; Shiraiwa, M.; Siebert, H.; Sierau, B.; Simpson, D.; Sun, J.Y.; Topping, D.; Tunved, P.; Vaattovaara, P.; Vakkari, V.; Veefkind, J.P.; Visschedijk, A.; Vuollekoski, H.; Vuolo, R.; Wehner, B.; Wildt, J.; Woodward, S.; Worsnop, D.R.; Zadelhoff, G.J. van; Zardini, A.A.; Zhang, K.; Zyl, P.G. van; Kerminen, V.-M.; Carslaw, K.S.; Pandis, S.N.

    2011-01-01

    In this paper we describe and summarize the main achievements of the European Aerosol Cloud Climate and Air Quality Interactions project (EUCAARI). EUCAARI started on 1 January 2007 and ended on 31 December 2010 leaving a rich legacy including: (a) a comprehensive database with a year of

  8. Consolidation of cloud computing in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00224309; The ATLAS collaboration; Cordeiro, Cristovao; Hover, John; Kouba, Tomas; Love, Peter; Mcnab, Andrew; Schovancova, Jaroslava; Sobie, Randall; Giordano, Domenico

    2017-01-01

    Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in resp...

  9. Consolidation of Cloud Computing in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00224309; The ATLAS collaboration; Cordeiro, Cristovao; Di Girolamo, Alessandro; Hover, John; Kouba, Tomas; Love, Peter; Mcnab, Andrew; Schovancova, Jaroslava; Sobie, Randall

    2016-01-01

    Throughout the first year of LHC Run 2, ATLAS Cloud Computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS Cloud Computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vac resources, streamlined usage of the High Level Trigger cloud for simulation and reconstruction, extreme scaling on Amazon EC2, and procurement of commercial cloud capacity in Europe. Building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems. ...

  10. Consolidation of cloud computing in ATLAS

    Science.gov (United States)

    Taylor, Ryan P.; Domingues Cordeiro, Cristovao Jose; Giordano, Domenico; Hover, John; Kouba, Tomas; Love, Peter; McNab, Andrew; Schovancova, Jaroslava; Sobie, Randall; ATLAS Collaboration

    2017-10-01

    Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems.

  11. Identity-Based Authentication for Cloud Computing

    Science.gov (United States)

    Li, Hongwei; Dai, Yuanshun; Tian, Ling; Yang, Haomiao

    Cloud computing is a recently developed technology for complex systems that share massive-scale services among numerous users. Authentication of both users and services is therefore a significant issue for the trust and security of cloud computing. The SSL Authentication Protocol (SAP), once applied to cloud computing, becomes so complicated that users face a heavy load in both computation and communication. Based on the identity-based hierarchical model for cloud computing (IBHMCC) and its corresponding encryption and signature schemes, this paper presents a new identity-based authentication protocol for cloud computing and services. Simulation testing shows that the authentication protocol is more lightweight and efficient than SAP, especially on the user side. This merit, together with its great scalability, makes the model well suited to the massive-scale cloud.

  12. Cloud management and security

    CERN Document Server

    Abbadi, Imad M

    2014-01-01

    Written by an expert with over 15 years' experience in the field, this book establishes the foundations of Cloud computing, building an in-depth and diverse understanding of the technologies behind Cloud computing. In this book, the author begins with an introduction to Cloud computing, presenting fundamental concepts such as analyzing Cloud definitions, Cloud evolution, Cloud services, Cloud deployment types and highlighting the main challenges. Following on from the introduction, the book is divided into three parts: Cloud management, Cloud security, and practical examples. Part one presents the main components constituting the Cloud and federated Cloud infrastructure(e.g., interactions and deployment), discusses management platforms (resources and services), identifies and analyzes the main properties of the Cloud infrastructure, and presents Cloud automated management services: virtual and application resource management services. Part two analyzes the problem of establishing trustworthy Cloud, discuss...

  13. Cloud time

    CERN Document Server

    Lockwood, Dean

    2012-01-01

    The ‘Cloud’, hailed as a new digital commons, a utopia of collaborative expression and constant connection, actually constitutes a strategy of vitalist post-hegemonic power, which moves to dominate immanently and intensively, organizing our affective political involvements, instituting new modes of enclosure, and, crucially, colonizing the future through a new temporality of control. The virtual is often claimed as a realm of invention through which capitalism might be cracked, but it is precisely here that power now thrives. Cloud time, in service of security and profit, assumes all is knowable. We bear witness to the collapse of both past and future virtuals into a present dedicated to the exploitation of the spectres of both.

  14. Clustering, randomness, and regularity in cloud fields. 4: Stratocumulus cloud fields

    Science.gov (United States)

    Lee, J.; Chou, J.; Weger, R. C.; Welch, R. M.

    1994-01-01

    To complete the analysis of the spatial distribution of boundary layer cloudiness, the present study focuses on nine stratocumulus Landsat scenes. The results indicate many similarities between stratocumulus and cumulus spatial distributions. Most notably, at full spatial resolution all scenes exhibit a decidedly clustered distribution. The strength of the clustering signal decreases with increasing cloud size; the clusters themselves consist of a few clouds (less than 10), occupy a small percentage of the cloud field area (less than 5%), contain between 20% and 60% of the cloud field population, and are randomly located within the scene. In contrast, stratocumulus in almost every respect are more strongly clustered than are cumulus cloud fields. For instance, stratocumulus clusters contain more clouds per cluster, occupy a larger percentage of the total area, and have a larger percentage of clouds participating in clusters than the corresponding cumulus examples. To investigate clustering at intermediate spatial scales, the local dimensionality statistic is introduced. Results obtained from this statistic provide the first direct evidence for regularity among large (more than 900 m in diameter) clouds in stratocumulus and cumulus cloud fields, in support of the inhibition hypothesis of Ramirez and Bras (1990). Also, the size compensated point-to-cloud cumulative distribution function statistic is found to be necessary to obtain a consistent description of stratocumulus cloud distributions. A hypothesis regarding the underlying physical mechanisms responsible for cloud clustering is presented. It is suggested that cloud clusters often arise from 4 to 10 triggering events localized within regions less than 2 km in diameter and randomly distributed within the cloud field. As the size of the cloud surpasses the scale of the triggering region, the clustering signal weakens and the larger cloud locations become more random.
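
    The triggering-event hypothesis can be illustrated with a toy simulation (all sizes, counts, and parameters below are illustrative assumptions, not the authors' data): scatter a few trigger centers at random, place cloud positions around them, and compare the mean nearest-neighbor distance with the value expected for a spatially random pattern via the Clark-Evans ratio R, where R < 1 indicates clustering.

```python
import math
import random

def clark_evans_ratio(points, area):
    """Clark-Evans index: mean nearest-neighbor distance divided by the
    expectation 0.5/sqrt(density) for a random (Poisson) point pattern."""
    n = len(points)
    mean_nn = sum(
        min(math.dist(p, q) for q in points if q is not p) for p in points
    ) / n
    return mean_nn / (0.5 / math.sqrt(n / area))

random.seed(42)
size = 10.0  # domain edge length, km (illustrative)
# ~6 randomly located triggering regions, each spawning clouds within a few
# hundred meters, mimicking the hypothesized localized triggering events.
centers = [(random.uniform(0, size), random.uniform(0, size)) for _ in range(6)]
clouds = [
    (cx + random.gauss(0, 0.3), cy + random.gauss(0, 0.3))
    for cx, cy in centers
    for _ in range(30)
]
r = clark_evans_ratio(clouds, size * size)
print(round(r, 2))  # expected to be well below 1 for a clustered pattern
```

Replacing the clustered positions with uniformly random ones would push R toward 1, the signature of the randomness that the record reports for the locations of the clusters themselves.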

  15. Clustering, randomness, and regularity in cloud fields. 4. Stratocumulus cloud fields

    Science.gov (United States)

    Lee, J.; Chou, J.; Weger, R. C.; Welch, R. M.

    1994-07-01

    To complete the analysis of the spatial distribution of boundary layer cloudiness, the present study focuses on nine stratocumulus Landsat scenes. The results indicate many similarities between stratocumulus and cumulus spatial distributions. Most notably, at full spatial resolution all scenes exhibit a decidedly clustered distribution. The strength of the clustering signal decreases with increasing cloud size; the clusters themselves consist of a few clouds (less than 10), occupy a small percentage of the cloud field area (less than 5%), contain between 20% and 60% of the cloud field population, and are randomly located within the scene. In contrast, stratocumulus in almost every respect are more strongly clustered than are cumulus cloud fields. For instance, stratocumulus clusters contain more clouds per cluster, occupy a larger percentage of the total area, and have a larger percentage of clouds participating in clusters than the corresponding cumulus examples. To investigate clustering at intermediate spatial scales, the local dimensionality statistic is introduced. Results obtained from this statistic provide the first direct evidence for regularity among large (>900 m in diameter) clouds in stratocumulus and cumulus cloud fields, in support of the inhibition hypothesis of Ramirez and Bras (1990). Also, the size compensated point-to-cloud cumulative distribution function statistic is found to be necessary to obtain a consistent description of stratocumulus cloud distributions. A hypothesis regarding the underlying physical mechanisms responsible for cloud clustering is presented. It is suggested that cloud clusters often arise from 4 to 10 triggering events localized within regions less than 2 km in diameter and randomly distributed within the cloud field. As the size of the cloud surpasses the scale of the triggering region, the clustering signal weakens and the larger cloud locations become more random.
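    The point-to-cloud cumulative distribution function statistic used above can be illustrated numerically: measure the distance from random test points to the nearest cloud and compare the empirical CDF with the complete-spatial-randomness (Poisson) expectation F(r) = 1 - exp(-λπr²); clustered fields fall below that curve at small r. The sketch below uses hypothetical random cloud centres in a unit square, not the Landsat cloud fields of the study:

```python
import math
import random

random.seed(1)

# Hypothetical cloud-centre positions in a unit square (stand-ins for the
# Landsat cloud fields analyzed in the study)
clouds = [(random.random(), random.random()) for _ in range(200)]

def point_to_cloud_cdf(clouds, radii, n_test=2000):
    """Empirical CDF of the distance from random test points to the nearest cloud."""
    dists = []
    for _ in range(n_test):
        px, py = random.random(), random.random()
        dists.append(min(math.hypot(px - cx, py - cy) for cx, cy in clouds))
    return [sum(d <= r for d in dists) / n_test for r in radii]

radii = [0.01, 0.02, 0.05, 0.10]
empirical = point_to_cloud_cdf(clouds, radii)
# Complete spatial randomness predicts F(r) = 1 - exp(-lambda * pi * r^2)
lam = len(clouds)
poisson = [1 - math.exp(-lam * math.pi * r * r) for r in radii]
for r, e, p in zip(radii, empirical, poisson):
    print(f"r={r:.2f}  empirical={e:.3f}  Poisson={p:.3f}")
```

    Because the hypothetical centres here are themselves Poisson-distributed, the two columns roughly agree; a clustered field would show a deficit at small radii.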

  16. A study of Monte Carlo radiative transfer through fractal clouds

    Energy Technology Data Exchange (ETDEWEB)

    Gautier, C.; Lavallec, D.; O'Hirok, W.; Ricchiazzi, P. [Univ. of California, Santa Barbara, CA (United States)] [and others]

    1996-04-01

    An understanding of radiation transport (RT) through clouds is fundamental to studies of the earth's radiation budget and climate dynamics. The transmission through horizontally homogeneous clouds has been studied thoroughly using accurate, discrete ordinates radiative transfer models. However, the applicability of these results to general problems of the global radiation budget is limited by the plane parallel assumption and the fact that real cloud fields show variability, both vertically and horizontally, on all size scales. To understand how radiation interacts with realistic clouds, we have used a Monte Carlo radiative transfer model to compute the details of the photon-cloud interaction on synthetic cloud fields. Synthetic cloud fields, generated by a cascade model, reproduce the scaling behavior as well as the cloud variability observed in and estimated from satellite cloud data.
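    The cascade generation step can be sketched as a simple bounded multiplicative cascade: starting from one cell, the grid is repeatedly refined 2 × 2 and each new cell is multiplied by a random factor whose spread shrinks with scale, which produces scale-invariant variability while keeping the field positive. This is a generic sketch, not the specific cascade model used by the authors:

```python
import numpy as np

rng = np.random.default_rng(0)

def multiplicative_cascade(levels=6, f=0.7):
    """Generate a 2-D multiplicative-cascade field.

    At each level the grid is refined 2x and every new cell is multiplied
    by 1 +/- f * 2**-level, chosen at random, so fluctuations weaken with
    scale and the field stays positive (a bounded cascade).
    """
    field = np.ones((1, 1))
    for lev in range(levels):
        field = np.kron(field, np.ones((2, 2)))   # refine the grid 2x
        pert = f * 2.0 ** (-lev)
        field *= 1.0 + pert * rng.choice([-1.0, 1.0], size=field.shape)
    return field

cloud = multiplicative_cascade()
print(cloud.shape)  # -> (64, 64)
```

    The resulting field can stand in for a liquid-water or optical-depth map when experimenting with Monte Carlo photon transport on variable clouds.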

  17. Migrating enterprise storage applications to the cloud

    OpenAIRE

    Vrable, Michael Daniel

    2011-01-01

    Cloud computing has emerged as a model for hosting computing infrastructure and outsourcing management of that infrastructure. It offers the promise of simplified provisioning and management, lower costs, and access to resources that scale up and down with demand. Cloud computing has seen growing use for Web site hosting, large batch processing jobs, and similar tasks. Despite potential advantages, however, cloud computing is not much used for enterprise applications such as backup, shared fi...

  18. Cloud Computing Security Issues and Challenges

    OpenAIRE

    Kuyoro S. O.; Ibikunle F; Awodele O

    2011-01-01

    Cloud computing is a set of IT services that are provided to a customer over a network on a leased basis and with the ability to scale service requirements up or down. Usually cloud computing services are delivered by a third-party provider who owns the infrastructure. Its advantages, to mention but a few, include scalability, resilience, flexibility, efficiency and the outsourcing of non-core activities. Cloud computing offers an innovative business model for organizations to adopt IT services w...

  19. The Monoceros R2 Molecular Cloud

    Science.gov (United States)

    Carpenter, J. M.; Hodapp, K. W.

    2008-12-01

    The Monoceros R2 region was first recognized as a chain of reflection nebulae illuminated by A- and B-type stars. These nebulae are associated with a giant molecular cloud that is one of the closest massive star forming regions to the Sun. This chapter reviews the properties of the Mon R2 region, including the namesake reflection nebulae, the large-scale molecular cloud, global star formation activity, and properties of prominent star forming regions in the cloud.

  20. Interoperable Resource Management for establishing Federated Clouds

    OpenAIRE

    Kecskeméti, Gábor; Kertész, Attila; Marosi, Attila; Kacsuk, Péter

    2012-01-01

    Cloud Computing builds on the latest achievements of diverse research areas, such as Grid Computing, Service-oriented computing, business process modeling and virtualization. As this new computing paradigm was mostly led by companies, several proprietary systems arose. Recently, alongside these commercial systems, several smaller-scale, privately owned systems have been maintained and developed. This chapter focuses on issues faced by users interested in Multi-Cloud use and by Cloud providers w...

  1. Essentials of cloud computing

    CERN Document Server

    Chandrasekaran, K

    2014-01-01

    Foreword; Preface; Computing Paradigms: Learning Objectives, Preamble, High-Performance Computing, Parallel Computing, Distributed Computing, Cluster Computing, Grid Computing, Cloud Computing, Biocomputing, Mobile Computing, Quantum Computing, Optical Computing, Nanocomputing, Network Computing, Summary, Review Points, Review Questions, Further Reading; Cloud Computing Fundamentals: Learning Objectives, Preamble, Motivation for Cloud Computing, The Need for Cloud Computing, Defining Cloud Computing, NIST Definition of Cloud Computing, Cloud Computing Is a Service, Cloud Computing Is a Platform, 5-4-3 Principles of Cloud Computing, Five Essential Charact

  2. Scaling of Thermal Images at Different Spatial Resolution: The Mixed Pixel Problem

    Directory of Open Access Journals (Sweden)

    Hamlyn G. Jones

    2014-07-01

    Full Text Available The consequences of changes in spatial resolution for the application of thermal imagery in plant phenotyping in the field are discussed. Where image pixels are significantly smaller than the objects of interest (e.g., leaves), accurate estimates of leaf temperature are possible, but when pixels are of the same scale as, or larger than, the objects of interest, the observed temperatures become significantly biased by the background temperature as a result of the presence of mixed pixels. Approaches to the estimation of the true leaf temperature that apply both at the whole-pixel level and at the sub-pixel level are reviewed and discussed.
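    The mixed-pixel correction alluded to above can be made concrete with a two-component linear mixture model: assuming pixel radiance mixes linearly between leaf and background, and approximating radiance as proportional to T⁴ (a grey-body, Stefan-Boltzmann simplification, not the exact method of the paper), the leaf temperature follows from the observed pixel temperature, the background temperature and the leaf cover fraction:

```python
def unmix_leaf_temperature(t_pixel, t_background, leaf_fraction):
    """Recover the sub-pixel leaf temperature from a mixed thermal pixel.

    Assumes radiances mix linearly and radiance ~ T**4 (a grey-body,
    Stefan-Boltzmann simplification):
        T_pixel**4 = f * T_leaf**4 + (1 - f) * T_background**4
    Temperatures are in kelvin; leaf_fraction f is in (0, 1].
    """
    if not 0.0 < leaf_fraction <= 1.0:
        raise ValueError("leaf_fraction must be in (0, 1]")
    leaf_rad = (t_pixel ** 4 - (1.0 - leaf_fraction) * t_background ** 4) / leaf_fraction
    if leaf_rad <= 0.0:
        raise ValueError("inconsistent inputs: negative leaf radiance")
    return leaf_rad ** 0.25

# A pixel half covered by leaves: 305 K observed over a 315 K soil background
t_leaf = unmix_leaf_temperature(t_pixel=305.0, t_background=315.0, leaf_fraction=0.5)
print(round(t_leaf, 1))  # -> 293.9
```

    The example shows the bias direction: the observed 305 K sits well above the recovered leaf temperature because the warmer soil background dominates half of the pixel.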

  3. Self-Awareness of Cloud Applications

    NARCIS (Netherlands)

    Iosup, Alexandru; Zhu, Xiaoyun; Merchant, Arif; Kalyvianaki, Eva; Maggio, Martina; Spinner, Simon; Abdelzaher, Tarek; Mengshoel, Ole; Bouchenak, Sara

    2016-01-01

    Cloud applications today deliver an increasingly larger portion of the Information and Communication Technology (ICT) services. To address the scale, growth, and reliability of cloud applications, self-aware management and scheduling are becoming commonplace. How are they used in practice? In this

  4. Cloud Computing, Tieto Cloud Server Model

    OpenAIRE

    Suikkanen, Saara

    2013-01-01

    The purpose of this study is to find out what cloud computing is. To be able to make wise decisions when moving to the cloud or considering it, companies need to understand what the cloud consists of: which model suits their company best, what should be taken into account before moving to the cloud, what the cloud broker's role is, and what a SWOT analysis of the cloud shows. To be able to answer customer requirements and business demands, IT companies should develop and produce new service models. IT house T...

  5. Scientific Services on the Cloud

    Science.gov (United States)

    Chapman, David; Joshi, Karuna P.; Yesha, Yelena; Halem, Milt; Yesha, Yaacov; Nguyen, Phuong

    Scientific computing was one of the first applications for parallel and distributed computation. To this day, scientific applications remain some of the most compute intensive, and have inspired the creation of petaflop compute infrastructure such as the Oak Ridge Jaguar and Los Alamos RoadRunner. Large dedicated hardware infrastructure has become both a blessing and a curse to the scientific community. Scientists are interested in cloud computing for much the same reason as businesses and other professionals: the hardware is provided, maintained, and administered by a third party; software abstraction and virtualization provide reliability and fault tolerance; and graduated fees allow for multi-scale prototyping and execution. Cloud computing resources are only a few clicks away, and by far the easiest high-performance distributed platform to gain access to. There may still be dedicated infrastructure for ultra-scale science, but the cloud can easily play a major part in the scientific computing initiative.

  6. Blue skies for CLOUD

    CERN Multimedia

    2006-01-01

    Through the recently approved CLOUD experiment, CERN will soon be contributing to climate research. Tests are being performed on the first prototype of CLOUD, an experiment designed to assess cosmic radiation influence on cloud formation.

  7. Variability in Surface BRDF at Different Spatial Scales (30m-500m) Over a Mixed Agricultural Landscape as Retrieved from Airborne and Satellite Spectral Measurements

    Science.gov (United States)

    Roman, Miguel O.; Gatebe, Charles K.; Schaaf, Crystal B.; Poudyal, Rajesh; Wang, Zhuosen; King, Michael D.

    2012-01-01

    Over the past decade, the role of multiangle remote sensing has been central to the development of algorithms for the retrieval of global land surface properties including models of the bidirectional reflectance distribution function (BRDF), albedo, land cover/dynamics, burned area extent, as well as other key surface biophysical quantities represented by the anisotropic reflectance characteristics of vegetation. In this study, a new retrieval strategy for fine-to-moderate resolution multiangle observations was developed, based on the operational sequence used to retrieve the Moderate Resolution Imaging Spectroradiometer (MODIS) Collection 5 reflectance and BRDF/albedo products. The algorithm makes use of a semiempirical kernel-driven bidirectional reflectance model to provide estimates of intrinsic albedo (i.e., directional-hemispherical reflectance and bihemispherical reflectance), model parameters describing the BRDF, and extensive quality assurance information. The new retrieval strategy was applied to NASA's Cloud Absorption Radiometer (CAR) data acquired during the 2007 Cloud and Land Surface Interaction Campaign (CLASIC) over the well-instrumented Atmospheric Radiation Measurement Program (ARM) Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site in Oklahoma, USA. For the case analyzed, we obtained approx. 1.6 million individual surface bidirectional reflectance factor (BRF) retrievals, from nadir to 75 deg off-nadir, and at spatial resolutions ranging from 3 m - 500 m.
This unique dataset was used to examine the interaction of the spatial and angular characteristics of a mixed agricultural landscape; and provided the basis for detailed assessments of: (1) the use of a priori knowledge in kernel-driven BRDF model inversions; (2) the interaction between surface reflectance anisotropy and instrument spatial resolution; and (3) the uncertainties that arise when sub-pixel differences in the BRDF are aggregated to a moderate resolution satellite
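    The kernel-driven retrieval described above reduces to a linear least-squares inversion: reflectance is modeled as a weighted sum of an isotropic term and two angular kernels, and the weights are fit to the multiangle observations. The sketch below uses hypothetical placeholder kernels and synthetic data, not the operational RossThick/LiSparse forms or CAR measurements:

```python
import numpy as np

rng = np.random.default_rng(1)

# Kernel-driven BRDF model: R(theta) = f_iso + f_vol*K_vol + f_geo*K_geo.
# The kernels below are hypothetical smooth functions of view zenith angle,
# standing in for the operational RossThick/LiSparse kernels.
theta = np.radians(np.linspace(0.0, 75.0, 40))   # view zenith angles
K_vol = np.cos(theta) - 0.5                      # placeholder volumetric kernel
K_geo = -np.tan(theta) / 4.0                     # placeholder geometric kernel
A = np.column_stack([np.ones_like(theta), K_vol, K_geo])

f_true = np.array([0.3, 0.1, 0.05])              # 'true' model parameters
refl = A @ f_true + 0.002 * rng.standard_normal(theta.size)  # noisy BRFs

# The inversion itself is ordinary least squares on the multiangle samples
f_hat, *_ = np.linalg.lstsq(A, refl, rcond=None)
print(np.round(f_hat, 2))
```

    Once the three weights are known, intrinsic albedos follow by integrating the fitted model over the viewing hemisphere.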

  8. Moving towards Cloud Security

    OpenAIRE

    Edit Szilvia Rubóczki; Zoltán Rajnai

    2015-01-01

    Cloud computing hosts and delivers many different services via the Internet. There are many reasons why people opt for using cloud resources. Cloud development is increasing fast, while many related services lag behind; one example is mass awareness of cloud security. The new generation upload videos and pictures to cloud storage without a second thought, but only a few know about data privacy, data management and the ownership of data stored in the cloud. In an enterprise environment th...

  9. Cloud computing for comparative genomics

    Directory of Open Access Journals (Sweden)

    Pivovarov Rimma

    2010-05-01

    Full Text Available Abstract Background Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. Results We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. Conclusions The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.

  10. Cloud computing for comparative genomics.

    Science.gov (United States)

    Wall, Dennis P; Kudtarkar, Parul; Fusaro, Vincent A; Pivovarov, Rimma; Patil, Prasad; Tonellato, Peter J

    2010-05-18

    Large comparative genomics studies and tools are becoming increasingly more compute-expensive as the number of available genome sequences continues to rise. The capacity and cost of local computing infrastructures are likely to become prohibitive with the increase, especially as the breadth of questions continues to rise. Alternative computing architectures, in particular cloud computing environments, may help alleviate this increasing pressure and enable fast, large-scale, and cost-effective comparative genomics strategies going forward. To test this, we redesigned a typical comparative genomics algorithm, the reciprocal smallest distance algorithm (RSD), to run within Amazon's Elastic Computing Cloud (EC2). We then employed the RSD-cloud for ortholog calculations across a wide selection of fully sequenced genomes. We ran more than 300,000 RSD-cloud processes within the EC2. These jobs were farmed simultaneously to 100 high capacity compute nodes using the Amazon Web Service Elastic Map Reduce and included a wide mix of large and small genomes. The total computation time took just under 70 hours and cost a total of $6,302 USD. The effort to transform existing comparative genomics algorithms from local compute infrastructures is not trivial. However, the speed and flexibility of cloud computing environments provides a substantial boost with manageable cost. The procedure designed to transform the RSD algorithm into a cloud-ready application is readily adaptable to similar comparative genomics problems.
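    The job-farming pattern described above — many independent ortholog computations fanned out to worker nodes — can be sketched locally with an executor pool standing in for the Elastic MapReduce cluster. The `rsd_job` function here is a hypothetical placeholder, not the actual RSD algorithm:

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import combinations

# Hypothetical genome identifiers, stand-ins for the fully sequenced genomes
GENOMES = ["genome_a", "genome_bb", "genome_ccc", "genome_dddd"]

def rsd_job(pair):
    """Placeholder for one RSD ortholog computation on a genome pair.

    The real algorithm runs BLAST searches and evolutionary-distance
    estimates; here we only mimic an independent, self-contained job."""
    a, b = pair
    return (a, b, abs(len(a) - len(b)))

# Every genome pair is an independent job, so the whole workload can be
# farmed out to workers -- a small thread pool stands in for the 100 EC2
# nodes driven by Elastic MapReduce in the study.
pairs = list(combinations(GENOMES, 2))
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(rsd_job, pairs))
print(len(results))  # -> 6 pairwise results
```

    Because the jobs share no state, the same fan-out works unchanged whether the executor is a local pool or a cloud-scale batch service.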

  11. AVOCLOUDY: a simulator of volunteer clouds

    DEFF Research Database (Denmark)

    Sebastio, Stefano; Amoretti, Michele; Lluch Lafuente, Alberto

    2015-01-01

    The increasing demand of computational and storage resources is shifting users toward the adoption of cloud technologies. Cloud computing is based on the vision of computing as utility, where users no more need to buy machines but simply access remote resources made available on-demand by cloud...... application, intelligent agents constitute a feasible technology to add autonomic features to cloud operations. Furthermore, the volunteer computing paradigm—one of the Information and Communications Technology (ICT) trends of the last decade—can be pulled alongside traditional cloud approaches...... management solutions before their deployment in the production environment. However, currently available simulators of cloud platforms are not suitable to model and analyze such heterogeneous, large-scale, and highly dynamic systems. We propose the AVOCLOUDY simulator to fill this gap. This paper presents...

  12. Sahara Dust Cloud

    Science.gov (United States)

    2005-01-01

    [Figure removed: "Dust Particles" Quicktime movie, 7/15-7/24] A continent-sized cloud of hot air and dust originating from the Sahara Desert crossed the Atlantic Ocean and headed towards Florida and the Caribbean. A Saharan Air Layer, or SAL, forms when dry air and dust rise from Africa's west coast and ride the trade winds above the Atlantic Ocean. These dust clouds are not uncommon, especially during the months of July and August. They start when weather patterns called tropical waves pick up dust from the desert in North Africa, carry it a couple of miles into the atmosphere and drift westward. In a sequence of images created by data acquired by the Earth-orbiting Atmospheric Infrared Sounder ranging from July 15 through July 24, we see the distribution of the cloud in the atmosphere as it swirls off of Africa and heads across the ocean to the west. Using the unique silicate spectral signatures of dust in the thermal infrared, AIRS can detect the presence of dust in the atmosphere day or night. This detection works best if there are no clouds present on top of the dust; when clouds are present, they can interfere with the signal, making it much harder to detect dust as in the case of July 24, 2005. In the Quicktime movie, the scale at the bottom of the images shows +1 for dust definitely detected, and ranges down to -1 for no dust detected. The plots are averaged over a number of AIRS observations falling within grid boxes, and so it is possible to obtain fractional numbers. [Figure removed: "Total Water Vapor in the Atmosphere Around the Dust Cloud" Quicktime movie] The dust cloud is contained within a dry adiabatic layer which originates over the Sahara Desert. This Saharan Air Layer (SAL) advances westward over the Atlantic Ocean, overriding the cool, moist air nearer the surface. This burst of very dry air is visible in the AIRS retrieved total water

  13. Exploiting Virtualization and Cloud Computing in ATLAS

    International Nuclear Information System (INIS)

    Harald Barreiro Megino, Fernando; Van der Ster, Daniel; Benjamin, Doug; De, Kaushik; Gable, Ian; Paterson, Michael; Taylor, Ryan; Hendrix, Val; Vitillo, Roberto A; Panitkin, Sergey; De Silva, Asoka; Walker, Rod

    2012-01-01

    The ATLAS Computing Model was designed around the concept of grid computing; since the start of data-taking, this model has proven very successful in the federated operation of more than one hundred Worldwide LHC Computing Grid (WLCG) sites for offline data distribution, storage, processing and analysis. However, new paradigms in computing, namely virtualization and cloud computing, present improved strategies for managing and provisioning IT resources that could allow ATLAS to more flexibly adapt and scale its storage and processing workloads on varied underlying resources. In particular, ATLAS is developing a “grid-of-clouds” infrastructure in order to utilize WLCG sites that make resources available via a cloud API. This work will present the current status of the Virtualization and Cloud Computing R and D project in ATLAS Distributed Computing. First, strategies for deploying PanDA queues on cloud sites will be discussed, including the introduction of a “cloud factory” for managing cloud VM instances. Next, performance results when running on virtualized/cloud resources at CERN LxCloud, StratusLab, and elsewhere will be presented. Finally, we will present the ATLAS strategies for exploiting cloud-based storage, including remote XROOTD access to input data, management of EC2-based files, and the deployment of cloud-resident LCG storage elements.

  14. Cloud Macroscopic Organization: Order Emerging from Randomness

    Science.gov (United States)

    Yuan, Tianle

    2011-01-01

    Clouds play a central role in many aspects of the climate system and their forms and shapes are remarkably diverse. Appropriate representation of clouds in climate models is a major challenge because cloud processes span at least eight orders of magnitude in spatial scales. Here we show that there exists order in the cloud size distribution of low-level clouds, and that it follows a power-law distribution with exponent gamma close to 2. gamma is insensitive to yearly variations in environmental conditions, but has regional variations and land-ocean contrasts. More importantly, we demonstrate that this self-organizing behavior of clouds emerges naturally from a complex network model with simple, physical organizing principles: random clumping and merging. We also demonstrate symmetry between clear and cloudy skies in terms of macroscopic organization because of similar fundamental underlying organizing principles. The order in the apparently complex cloud-clear field thus has its root in random local interactions. Studying cloud organization with complex network models is an attractive new approach that has wide applications in climate science. We also propose the concept of a cloud statistical mechanics approach. This approach is fully complementary to deterministic models, and the two approaches provide a powerful framework to meet the challenge of representing clouds in our climate models when working in tandem.
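    The random clumping idea can be illustrated with a lattice toy model: fill grid cells at random, clump touching cells into "clouds", and tally the resulting size distribution, which near the percolation threshold becomes heavy-tailed (approximately power law). This is an illustrative stand-in, not the authors' complex network model:

```python
import random
from collections import Counter

random.seed(42)
N, P = 200, 0.55  # grid size; fill probability (clumping density)
grid = [[random.random() < P for _ in range(N)] for _ in range(N)]

def cloud_sizes(grid):
    """Label 4-connected filled cells into 'clouds' and return their sizes."""
    n = len(grid)
    seen = [[False] * n for _ in range(n)]
    sizes = []
    for i in range(n):
        for j in range(n):
            if grid[i][j] and not seen[i][j]:
                stack, size = [(i, j)], 0
                seen[i][j] = True
                while stack:          # depth-first flood fill
                    x, y = stack.pop()
                    size += 1
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        u, v = x + dx, y + dy
                        if 0 <= u < n and 0 <= v < n and grid[u][v] and not seen[u][v]:
                            seen[u][v] = True
                            stack.append((u, v))
                sizes.append(size)
    return sizes

sizes = cloud_sizes(grid)
hist = Counter(sizes)
# Many small clouds, few large ones: the hallmark of a heavy-tailed distribution
print(len(sizes), max(sizes))
```

    Fitting the slope of `hist` on log-log axes would give an exponent to compare against the observed gamma near 2, though a single lattice realization is far noisier than the satellite statistics.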

  15. Programming Microsoft's Clouds Windows Azure and Office 365

    CERN Document Server

    Rizzo, Thomas; van Otegem, Michiel; Bishop, Darrin; Durzi, George; Tejada, Zoiner; Mann, David

    2012-01-01

    A detailed look at a diverse set of Cloud topics, particularly Azure and Office 365 More and more companies are realizing the power and potential of Cloud computing as a viable way to save energy and money. This valuable book offers an in-depth look at a wide range of Cloud topics unlike any other book on the market. Examining how Cloud services allows users to pay as they go for exactly what they use, this guide explains how companies can easily scale their Cloud use up and down to fit their business requirements. After an introduction to Cloud computing, you'll discover how to prepare your e

  16. Cloud computing as a new technology trend in education

    OpenAIRE

    Шамина, Ольга Борисовна; Буланова, Татьяна Валентиновна

    2014-01-01

    The construction and operation of extremely large-scale, commodity-computer datacenters was the key enabler of Cloud Computing. Cloud Computing could offer services that would be profitable to use in education. With Cloud Computing it is possible to increase the quality of education, improve communicative culture and give teachers and students new application opportunities.

  17. Cloud Infrastructure & Applications - CloudIA

    Science.gov (United States)

    Sulistio, Anthony; Reich, Christoph; Doelitzscher, Frank

    The idea behind Cloud Computing is to deliver Infrastructure-as-a-Service and Software-as-a-Service over the Internet on an easy pay-per-use business model. To harness the potential of Cloud Computing for e-Learning and research purposes, and for small- and medium-sized enterprises, the Hochschule Furtwangen University has established a new project, called Cloud Infrastructure & Applications (CloudIA). The CloudIA project is a market-oriented cloud infrastructure that leverages different virtualization technologies, by supporting Service-Level Agreements for various service offerings. This paper describes the CloudIA project in detail and mentions our early experiences in building a private cloud using an existing infrastructure.

  18. Grids, Clouds and Virtualization

    CERN Document Server

    Cafaro, Massimo

    2011-01-01

    Research into grid computing has been driven by the need to solve large-scale, increasingly complex problems for scientific applications. Yet the applications of grid computing for business and casual users did not begin to emerge until the development of the concept of cloud computing, fueled by advances in virtualization techniques, coupled with the increased availability of ever-greater Internet bandwidth. The appeal of this new paradigm is mainly based on its simplicity, and the affordable price for seamless access to both computational and storage resources. This timely text/reference int

  19. Carbon pellet cloud striations

    International Nuclear Information System (INIS)

    Parks, P.B.

    1989-01-01

    Fine scale striations, with alternating rows of bright and dark zones, have been observed in the ablation clouds of carbon pellets injected into the TEXT tokamak. The striations extend along the magnetic field for about 1 cm with quite regular cross-field variations characterized by a wavelength of a few mm. Their potential as a diagnostic tool for measuring q-profiles in tokamaks provides motivation for investigating the origin of the striations. The authors propose that the striations are not due to a sequence of high and low ablation rates in which magnetic islands localized at rational surfaces, q = m/n, would be responsible for reducing the electron flux to the pellet region; the length of the closed field line which forms the local magnetic axis of the island is too long to prevent a depletion of plasma electrons in a flux tube intercepting the pellet for the duration 2r_p/v_p. Instead, they propose that striations are the manifestation of the saturated state of growing fluctuations inside the cloud. The instability is generated by E x B rotation of the ablation cloud. The outward centrifugal force points down the ablation density gradient, inducing the Rayleigh-Taylor instability. The instability is not present for wave numbers along the field lines, which may explain why the striations are long and uniform in that direction. The E field develops inside the ablation cloud as a result of cold electron return currents which are induced to cancel the incoming hot plasma electron current streaming along the field lines

  20. Silicon Photonics Cloud (SiCloud)

    DEFF Research Database (Denmark)

    DeVore, P. T. S.; Jiang, Y.; Lynch, M.

    2015-01-01

    Silicon Photonics Cloud (SiCloud.org) is the first silicon photonics interactive web tool. Here we report new features of this tool including mode propagation parameters and mode distribution galleries for user specified waveguide dimensions and wavelengths.

  1. Bigdata Driven Cloud Security: A Survey

    Science.gov (United States)

    Raja, K.; Hanifa, Sabibullah Mohamed

    2017-08-01

    Cloud Computing (CC) is a fast-growing technology for performing massive-scale and complex computing. It eliminates the need to maintain expensive computing hardware, dedicated space, and software. Recently, massive growth has been observed in the scale of data, or big data, generated through cloud computing. CC consists of a front end, which includes the users' computers and the software required to access the cloud network, and a back end, which consists of the various computers, servers and database systems that create the cloud. In SaaS (Software as-a-Service: end users utilize outsourced software), PaaS (Platform as-a-Service: the platform is provided), IaaS (Infrastructure as-a-Service: the physical environment is outsourced) and DaaS (Database as-a-Service: data can be housed within a cloud), the traditional cloud ecosystem delivering these services has become a powerful and popular architecture. Many challenges and issues concern security and threats, the most vital barrier for a cloud computing environment. The main barrier to the adoption of CC in health care relates to data security: when placing and transmitting data using public networks, cyber attacks in any form are anticipated. Hence, cloud service users need to understand the risk of data breaches and the choice of service delivery model during deployment. This survey covers CC security issues in depth (including data security in health care) so that researchers can develop robust security application models using Big Data (BD) on CC. BD evaluation is driven by fast-growing cloud-based applications developed using virtualized technologies. In this purview, MapReduce [12] is a good example of big data processing in a cloud environment, and a model for cloud providers.
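    The MapReduce pattern mentioned above splits work into a map phase that emits key-value pairs per record and a reduce phase that aggregates them per key. A minimal word-count sketch in plain Python (single process, standing in for a distributed cluster):

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    """Map: emit (word, 1) pairs -- the per-record work a cloud node performs."""
    return [(word.lower(), 1) for word in record.split()]

def reduce_phase(pairs):
    """Reduce: sum the counts per key, as the shuffle/reduce stage would."""
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

records = ["cloud security drives big data", "big data drives cloud adoption"]
counts = reduce_phase(chain.from_iterable(map_phase(r) for r in records))
print(counts["cloud"], counts["big"], counts["data"])  # -> 2 2 2
```

    In a real deployment the map calls run on many nodes over partitioned input and the framework shuffles pairs by key before reducing; the per-record logic stays exactly this simple.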

  2. Simplified cloud-oriented virtual machine management with MLN

    OpenAIRE

    Begnum, Kyrre

    2010-01-01

    System administrators are faced with the challenge of making their existing systems power-efficient and scalable. Although Cloud Computing is offered as a solution to this challenge by many, we argue that having multiple interfaces and cloud providers can result in more complexity than before. This paper addresses cloud computing from a user perspective. We show how complex scenarios, such as an on-demand render farm and scaling web-service, can be achieved utilizing clouds ...

  3. Cloud cover typing from environmental satellite imagery. Discriminating cloud structure with Fast Fourier Transforms (FFT)

    Science.gov (United States)

    Logan, T. L.; Huning, J. R.; Glackin, D. L.

    1983-01-01

    The use of two dimensional Fast Fourier Transforms (FFTs) subjected to pattern recognition technology for the identification and classification of low altitude stratus cloud structure from Geostationary Operational Environmental Satellite (GOES) imagery was examined. The development of a scene independent pattern recognition methodology, unconstrained by conventional cloud morphological classifications, was emphasized. A technique for extracting cloud shape, direction, and size attributes from GOES visual imagery was developed. These attributes were combined with two statistical attributes (cloud mean brightness, cloud standard deviation), and interrogated using unsupervised clustering and maximum likelihood classification techniques. Results indicate that: (1) the key cloud discrimination attributes are mean brightness, direction, shape, and minimum size; (2) cloud structure can be differentiated at given pixel scales; (3) cloud type may be identifiable at coarser scales; (4) there are positive indications of scene independence which would permit development of a cloud signature bank; (5) edge enhancement of GOES imagery does not appreciably improve cloud classification over the use of raw data; and (6) the GOES imagery must be apodized before generation of FFTs.
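    The attribute-extraction step can be sketched with a 2-D FFT: after removing the image mean, the dominant spectral peak gives a texture direction and spatial scale, which are combined with the two statistical attributes (mean brightness, standard deviation). A minimal sketch on a synthetic "cloud street" image, not GOES data:

```python
import numpy as np

rng = np.random.default_rng(0)

def fft_texture_attributes(image):
    """Attributes of the kind described above: mean brightness, standard
    deviation, and the dominant orientation and spatial scale taken from
    the strongest peak of the 2-D FFT power spectrum."""
    mean, std = float(image.mean()), float(image.std())
    spec = np.abs(np.fft.fftshift(np.fft.fft2(image - image.mean())))
    cy, cx = np.array(spec.shape) // 2          # zero-frequency bin
    ky, kx = np.unravel_index(np.argmax(spec), spec.shape)
    dy, dx = ky - cy, kx - cx                   # dominant wave vector
    direction = float(np.degrees(np.arctan2(dy, dx)))
    wavelength = image.shape[0] / max(np.hypot(dy, dx), 1)   # pixels per cycle
    return {"mean": mean, "std": std,
            "direction_deg": direction, "wavelength_px": float(wavelength)}

# Synthetic 'cloud street' pattern: horizontal bands every 8 pixels plus noise
y = np.arange(64)[:, None]
image = np.sin(2 * np.pi * y / 8.0) + 0.1 * rng.standard_normal((64, 64))
attrs = fft_texture_attributes(image)
print(round(attrs["wavelength_px"]))  # -> 8
```

    The recovered 8-pixel wavelength and ±90° direction match the band pattern, illustrating how spectral peaks translate into the size and direction attributes fed to the classifier.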

  4. Spectral Dependence of MODIS Cloud Droplet Effective Radius Retrievals for Marine Boundary Layer Clouds

    Science.gov (United States)

    Zhang, Zhibo; Platnick, Steven E.; Ackerman, Andrew S.; Cho, Hyoun-Myoung

    2014-01-01

    Low-level warm marine boundary layer (MBL) clouds cover large regions of Earth's surface. They have a significant role in Earth's radiative energy balance and hydrological cycle. Despite the fundamental role of low-level warm water clouds in climate, our understanding of these clouds is still limited. In particular, connections between their properties (e.g. cloud fraction, cloud water path, and cloud droplet size) and environmental factors such as aerosol loading and meteorological conditions continue to be uncertain or unknown. Modeling these clouds in climate models remains a challenging problem. As a result, the influence of aerosols on these clouds in the past and future, and the potential impacts of these clouds on global warming remain open questions leading to substantial uncertainty in climate projections. To improve our understanding of these clouds, we need continuous observations of cloud properties on both a global scale and over a long enough timescale for climate studies. At present, satellite-based remote sensing is the only means of providing such observations.

  5. Large-scale CO J = 1-0 observations of the giant molecular cloud associated with the infrared ring N35 with the Nobeyama 45 m telescope

    Science.gov (United States)

    Torii, Kazufumi; Fujita, Shinji; Matsuo, Mitsuhiro; Nishimura, Atsushi; Kohno, Mikito; Kuriki, Mika; Tsuda, Yuya; Minamidani, Tetsuhiro; Umemoto, Tomofumi; Kuno, Nario; Hattori, Yusuke; Yoshiike, Satoshi; Ohama, Akio; Tachihara, Kengo; Shima, Kazuhiro; Habe, Asao; Fukui, Yasuo

    2018-05-01

We report an observational study of the giant molecular cloud (GMC) associated with the Galactic infrared ring-like structure N35 and two nearby H II regions G024.392+00.072 (H II region A) and G024.510-00.060 (H II region B), using the new CO J = 1-0 data obtained as a part of the FOREST Unbiased Galactic Plane Imaging survey with the Nobeyama 45 m telescope (FUGIN) project at a spatial resolution of 21″. Our CO data reveal that the GMC, with a total molecular mass of 2.1 × 106 M⊙, has two velocity components spanning ˜10-15 km s-1. The majority of molecular gas in the GMC is included in the lower-velocity component (LVC) at ˜110-114 km s-1, while the higher-velocity components (HVCs) at ˜118-126 km s-1 consist of three smaller molecular clouds which are located near the three H II regions. The LVC and HVCs show spatially complementary distributions along the line-of-sight, despite large velocity separations of ˜5-15 km s-1, and are connected in velocity by CO emission with intermediate intensities. By comparing the observations with simulations, we discuss a scenario in which collisions of the three HVCs with the LVC at velocities of ˜10-15 km s-1 can explain these two observational signatures. The intermediate-velocity features between the LVC and HVCs can be understood as broad bridge features, which indicate the turbulent motion of the gas at the collision interfaces, while the spatially complementary distributions represent the cavities created in the LVC by the HVCs through the collisions. Our model indicates that the three H II regions were formed after the onset of the collisions, and it is therefore suggested that the high-mass star formation in the GMC was triggered by the collisions.

  6. RACORO Extended-Term Aircraft Observations of Boundary-Layer Clouds

    Science.gov (United States)

    Vogelmann, Andrew M.; McFarquhar, Greg M.; Ogren, John A.; Turner, David D.; Comstock, Jennifer M.; Feingold, Graham; Long, Charles N.; Jonsson, Haflidi H.; Bucholtz, Anthony; Collins, Don R.; hide

    2012-01-01

Small boundary-layer clouds are ubiquitous over many parts of the globe and strongly influence the Earth's radiative energy balance. However, our understanding of these clouds is insufficient to solve pressing scientific problems. For example, cloud feedback represents the largest uncertainty amongst all climate feedbacks in general circulation models (GCMs). Several issues complicate understanding boundary-layer clouds and simulating them in GCMs. The high spatial variability of boundary-layer clouds poses an enormous computational challenge, since their horizontal dimensions and internal variability occur at spatial scales much finer than the computational grids used in GCMs. Aerosol-cloud interactions further complicate boundary-layer cloud measurement and simulation. Additionally, aerosols influence processes such as precipitation and cloud lifetime. An added complication is that at small scales (order meters to tens of meters) distinguishing cloud from aerosol is increasingly difficult, due to the effects of aerosol humidification, cloud fragments and photon scattering between clouds.

  7. Head in the Clouds: A Review of Current and Future Potential for Cloud-Enabled Pedagogies

    Science.gov (United States)

    Stevenson, Michael; Hedberg, John G.

    2011-01-01

    This paper reviews the research on the disruptive and transformative potential of newly-emerging cloud-based pedagogies. It takes into consideration the extent to which Cloud Computing can be leveraged to disseminate and scale web-based applications within and across learning contexts. It examines ideas from current literature in Web 2.0- and…

  8. The CLOUD experiment

    CERN Multimedia

    Maximilien Brice

    2006-01-01

    The Cosmics Leaving Outdoor Droplets (CLOUD) experiment as shown by Jasper Kirkby (spokesperson). Kirkby shows a sketch to illustrate the possible link between galactic cosmic rays and cloud formations. The CLOUD experiment uses beams from the PS accelerator at CERN to simulate the effect of cosmic rays on cloud formations in the Earth's atmosphere. It is thought that cosmic ray intensity is linked to the amount of low cloud cover due to the formation of aerosols, which induce condensation.

  9. BUSINESS INTELLIGENCE IN CLOUD

    OpenAIRE

    Celina M. Olszak

    2014-01-01

The paper reviews and critiques current research on Business Intelligence (BI) in the cloud. This review highlights that organizations face various challenges in using BI in the cloud. The research objectives for this study are a conceptualization of the BI cloud issue, as well as an investigation of some benefits and risks of BI in the cloud. The study was based mainly on a critical analysis of the literature and some reports on BI cloud usage. The results of this research can be used by IT and business leaders ...

  10. Cloud Robotics Platforms

    Directory of Open Access Journals (Sweden)

    Busra Koken

    2015-01-01

Full Text Available Cloud robotics is a rapidly evolving field that allows robots to offload computation-intensive and storage-intensive jobs into the cloud. Robots are limited in terms of computational capacity, memory and storage. The cloud provides unlimited computation power, memory, storage and especially collaboration opportunities. Cloud-enabled robots are divided into two categories: standalone and networked robots. This article surveys cloud robotic platforms and standalone and networked robotic works such as grasping, simultaneous localization and mapping (SLAM) and monitoring.

  11. GIANT MOLECULAR CLOUD FORMATION IN DISK GALAXIES: CHARACTERIZING SIMULATED VERSUS OBSERVED CLOUD CATALOGS

    Energy Technology Data Exchange (ETDEWEB)

    Benincasa, Samantha M.; Pudritz, Ralph E.; Wadsley, James [Department of Physics and Astronomy, McMaster University, Hamilton, ON L8S 4M1 (Canada); Tasker, Elizabeth J. [Department of Physics, Faculty of Science, Hokkaido University, Kita-ku, Sapporo 060-0810 (Japan)

    2013-10-10

    We present the results of a study of simulated giant molecular clouds (GMCs) formed in a Milky Way-type galactic disk with a flat rotation curve. This simulation, which does not include star formation or feedback, produces clouds with masses ranging between 10{sup 4} M{sub ☉} and 10{sup 7} M{sub ☉}. We compare our simulated cloud population to two observational surveys: the Boston University-Five College Radio Astronomy Observatory Galactic Ring Survey and the BIMA All-Disk Survey of M33. An analysis of the global cloud properties as well as a comparison of Larson's scaling relations is carried out. We find that simulated cloud properties agree well with the observed cloud properties, with the closest agreement occurring between the clouds at comparable resolution in M33. Our clouds are highly filamentary—a property that derives both from their formation due to gravitational instability in the sheared galactic environment, as well as to cloud-cloud gravitational encounters. We also find that the rate at which potentially star-forming gas accumulates within dense regions—wherein n{sub thresh} ≥ 10{sup 4} cm{sup –3}—is 3% per 10 Myr, in clouds of roughly 10{sup 6} M{sub ☉}. This suggests that star formation rates in observed clouds are related to the rates at which gas can be accumulated into dense subregions within GMCs via filamentary flows. The most internally well-resolved clouds are chosen for listing in a catalog of simulated GMCs—the first of its kind. The cataloged clouds are available as an extracted data set from the global simulation.
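The quoted dense-gas accumulation rate (3% of the cloud mass per 10 Myr, for clouds of roughly 10^6 M☉) translates into an absolute rate as follows. This is simple arithmetic on the abstract's numbers, not part of the original analysis, and the function name is invented for illustration.

```python
def dense_gas_rate(cloud_mass_msun, fraction_per_10myr=0.03):
    """Convert a fractional accumulation rate (fraction of cloud mass
    per 10 Myr) into an absolute rate in M_sun per Myr."""
    return cloud_mass_msun * fraction_per_10myr / 10.0

# For a 1e6 M_sun cloud at 3% per 10 Myr: ~3000 M_sun of potentially
# star-forming dense gas accumulated per Myr.
rate = dense_gas_rate(1.0e6)
print(rate)
```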

  12. Cloud Processed CCN Suppress Stratus Cloud Drizzle

    Science.gov (United States)

    Hudson, J. G.; Noble, S. R., Jr.

    2017-12-01

Conversion of sulfur dioxide to sulfate within cloud droplets increases the sizes and decreases the critical supersaturation, Sc, of cloud residual particles that had nucleated the droplets. Since other particles remain at the same sizes and Sc, a size and Sc gap is often observed. Hudson et al. (2015) showed higher cloud droplet concentrations (Nc) in stratus clouds associated with bimodal high-resolution CCN spectra from the DRI CCN spectrometer compared to clouds associated with unimodal CCN spectra (not cloud processed). Here we show that CCN spectral shape (bimodal or unimodal) affects all aspects of stratus cloud microphysics and drizzle. Panel A shows mean differential cloud droplet spectra that have been divided according to traditional slopes, k, of the 131 measured CCN spectra in the Marine Stratus/Stratocumulus Experiment (MASE) off the Central California coast. k is generally high within the supersaturation, S, range of stratus clouds (< 0.5%). Because cloud processing decreases Sc of some particles, it reduces k. Panel A shows higher concentrations of small cloud droplets in clouds apparently grown on lower k CCN than in clouds grown on higher k CCN. At small droplet sizes the concentrations follow the k order of the legend: black, red, green, blue (lowest to highest k). Above 13 µm diameter the lines cross and the hierarchy reverses, so that blue (highest k) has the highest concentrations, followed by green, red and black (lowest k). This reversed hierarchy continues into the drizzle size range (panel B), where the most drizzle drops, Nd, are in clouds grown on the least cloud-processed CCN (blue), while clouds grown on the most processed CCN (black) have the lowest Nd. Suppression of stratus cloud drizzle by cloud processing is an additional 2nd indirect aerosol effect (IAE) that, along with the enhancement of 1st IAE by higher Nc (panel A), is above and beyond the original IAE. However, further similar analysis is needed in other cloud regimes to determine if MASE was
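The slope k discussed above comes from the traditional power-law parameterization of CCN spectra, N(S) = C S^k, fitted in log-log space. A minimal sketch of that fit is below; the spectra are synthetic illustrations of a steeper (unprocessed) versus flatter (cloud-processed) case, not MASE data.

```python
import numpy as np

def ccn_slope(s_percent, n_ccn):
    """Fit the traditional CCN power law N(S) = C * S**k in log-log
    space and return (C, k). Cloud processing lowers the Sc of some
    particles and thereby reduces the fitted slope k."""
    k, log_c = np.polyfit(np.log(s_percent), np.log(n_ccn), 1)
    return np.exp(log_c), k

# Illustrative spectra (cm^-3) at supersaturations S (%): synthetic
# numbers chosen only to contrast the two spectral shapes.
s = np.array([0.1, 0.2, 0.4, 0.8])
unimodal = 500.0 * s ** 0.7     # steep, not cloud-processed
processed = 500.0 * s ** 0.3    # flatter: sulfate growth lowered Sc

c1, k1 = ccn_slope(s, unimodal)
c2, k2 = ccn_slope(s, processed)
print(round(k1, 2), round(k2, 2))  # processing reduces k
```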

  13. eEcoLiDAR, eScience infrastructure for ecological applications of LiDAR point clouds: reconstructing the 3D ecosystem structure for animals at regional to continental scales

    Directory of Open Access Journals (Sweden)

    W. Daniel Kissling

    2017-07-01

Full Text Available The lack of high-resolution measurements of 3D ecosystem structure across broad spatial extents impedes major advancements in animal ecology and biodiversity science. We aim to fill this gap by using Light Detection and Ranging (LiDAR) technology to characterize the vertical and horizontal complexity of vegetation and landscapes at high resolution across regional to continental scales. The new LiDAR-derived 3D ecosystem structures will be applied in species distribution models for breeding birds in forests and marshlands, for insect pollinators in agricultural landscapes, and for songbirds at stopover sites during migration. This will allow novel insights into the hierarchical structure of animal-habitat associations, into why animal populations decline, and how they respond to habitat fragmentation and ongoing land use change. The processing of these massive amounts of LiDAR point cloud data will be achieved by developing a generic interactive eScience environment with multi-scale object-based image analysis (OBIA) and interpretation of LiDAR point clouds, including data storage, scalable computing, tools for machine learning and visualisation (feature selection, annotation/segmentation, object classification, and evaluation), and a PostGIS spatial database. The classified objects will include trees, forests, vegetation strata, edges, bushes, hedges, reedbeds, etc., with their related metrics, attributes and summary statistics (e.g. vegetation openness, height, density, vertical biomass distribution, etc.). The newly developed eScience tools and data will be available to other disciplines and applications in ecology and the Earth sciences, thereby achieving high impact. The project will foster new multi-disciplinary collaborations between ecologists and eScientists and contribute to training a new generation of geo-ecologists.
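As one hedged illustration of the kind of per-plot vertical-structure metrics such a LiDAR pipeline would derive (canopy height, fraction of returns per vegetation stratum), the sketch below computes simplified stand-ins for the attributes named in the abstract. Function names, strata boundaries, and the example returns are all assumptions.

```python
import numpy as np

def vertical_metrics(z, ground=0.0, strata=(0.5, 2.0, 5.0)):
    """Per-plot vertical structure metrics from LiDAR return heights:
    canopy height (95th percentile above ground) and the fraction of
    returns per height stratum (illustrative stratum edges in m)."""
    h = np.asarray(z, dtype=float) - ground
    edges = np.concatenate([[0.0], strata, [np.inf]])
    counts, _ = np.histogram(h, bins=edges)
    return {"height_p95": float(np.percentile(h, 95)),
            "strata_fraction": (counts / len(h)).tolist()}

# Heights (m) for a hypothetical plot: ground, shrub and canopy returns.
returns = [0.0, 0.1, 0.3, 1.0, 1.5, 3.0, 6.0, 8.0, 9.0, 10.0]
m = vertical_metrics(returns)
print(round(m["height_p95"], 2), m["strata_fraction"])
```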

  14. Global analysis of cloud field coverage and radiative properties, using morphological methods and MODIS observations

    Directory of Open Access Journals (Sweden)

    R. Z. Bar-Or

    2011-01-01

Full Text Available The recently recognized continuous transition zone between detectable clouds and cloud-free atmosphere ("the twilight zone") is affected by undetectable clouds and humidified aerosol. In this study, we suggest distinguishing cloud fields (including the detectable clouds and the surrounding twilight zone) from cloud-free areas, which are not affected by clouds. For this classification, a robust and simple-to-implement cloud field masking algorithm, which uses only the spatial distribution of clouds, is presented in detail. A global analysis, estimating Earth's cloud field coverage (50° S–50° N) for 28 July 2008, using the Moderate Resolution Imaging Spectroradiometer (MODIS) data, finds that while the declared cloud fraction is 51%, the global cloud field coverage reaches 88%. The results reveal the low likelihood of finding a cloud-free pixel and suggest that this likelihood may decrease as the pixel size becomes larger. A global latitudinal analysis of cloud fields finds that unlike oceans, which are more uniformly covered by cloud fields, land areas located under the subsidence zones of the Hadley cell (the desert belts) contain proper areas for investigating cloud-free atmosphere, as there is a 40–80% probability of detecting clear sky over them. Usually these golden pixels, with a higher likelihood of being free of clouds, are over deserts. Independent global statistical analysis, using MODIS aerosol and cloud products, reveals a sharp exponential decay of the global mean aerosol optical depth (AOD) as a function of the distance from the nearest detectable cloud, both above ocean and land. Similar statistical analysis finds an exponential growth of the mean aerosol fine-mode fraction (FMF) over oceans as the distance from the nearest cloud increases. A 30 km scale break clearly appears in several analyses here, suggesting this is a typical natural scale of cloud fields. This work shows different microphysical and optical properties of cloud fields
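The reported exponential decay of AOD with distance from the nearest detectable cloud can be illustrated by fitting AOD(d) = AOD_bg + A exp(-d/L). The profile below is synthetic, with its 30 km e-folding scale chosen only to echo the typical cloud-field scale mentioned in the abstract; it is not derived from MODIS data.

```python
import numpy as np

def fit_aod_decay(distance_km, aod, aod_background):
    """Fit AOD(d) = AOD_bg + A * exp(-d / L) by linearizing in log
    space after subtracting the far-field background; returns (A, L)."""
    y = np.log(aod - aod_background)
    slope, intercept = np.polyfit(distance_km, y, 1)
    return np.exp(intercept), -1.0 / slope

# Synthetic profile: background AOD 0.10, amplitude 0.15 near cloud,
# 30 km decay scale (illustrative values).
d = np.linspace(1.0, 60.0, 30)
aod = 0.10 + 0.15 * np.exp(-d / 30.0)
amp, scale = fit_aod_decay(d, aod, 0.10)
print(round(scale, 1))  # recovers the 30 km scale
```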

  15. Relationship between cloud radiative forcing, cloud fraction and cloud albedo, and new surface-based approach for determining cloud albedo

    OpenAIRE

    Y. Liu; W. Wu; M. P. Jensen; T. Toto

    2011-01-01

This paper focuses on three interconnected topics: (1) quantitative relationship between surface shortwave cloud radiative forcing, cloud fraction, and cloud albedo; (2) surface-based approach for measuring cloud albedo; (3) multiscale (diurnal, annual and inter-annual) variations and covariations of surface shortwave cloud radiative forcing, cloud fraction, and cloud albedo. An analytical expression is first derived to quantify the relationship between cloud radiative forcing, cloud fractio...
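A minimal sketch of the inversion idea behind the surface-based approach is below, assuming the simplest overcast/clear-sky decomposition CRF_sw ≈ -f · α_cloud · S. This is an illustrative relation that neglects atmospheric absorption and surface albedo; the analytical expression actually derived in the paper is more complete.

```python
def cloud_albedo_from_crf(crf_sw, cloud_fraction, insolation):
    """Invert CRF_sw ≈ -f * alpha_cloud * S for the effective cloud
    albedo, given surface SW cloud radiative forcing (W m^-2), cloud
    fraction f, and insolation S (W m^-2). Illustrative only."""
    return -crf_sw / (cloud_fraction * insolation)

# Example: -150 W m^-2 of surface SW forcing with f = 0.6 and
# S = 500 W m^-2 implies an effective cloud albedo of 0.5.
alpha = cloud_albedo_from_crf(-150.0, 0.6, 500.0)
print(alpha)  # 0.5
```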

  16. Cloud4Psi: cloud computing for 3D protein structure similarity searching.

    Science.gov (United States)

    Mrozek, Dariusz; Małysiak-Mrozek, Bożena; Kłapciński, Artur

    2014-10-01

Popular methods for 3D protein structure similarity searching, especially those that generate high-quality alignments such as Combinatorial Extension (CE) and Flexible structure Alignment by Chaining Aligned fragment pairs allowing Twists (FATCAT), are still time-consuming. As a consequence, performing similarity searching against large repositories of structural data requires increased computational resources that are not always available. Cloud computing provides huge amounts of computational power that can be provisioned on a pay-as-you-go basis. We have developed a cloud-based system that allows scaling of the similarity searching process vertically and horizontally. Cloud4Psi (Cloud for Protein Similarity) was tested in the Microsoft Azure cloud environment and provided good, almost linearly proportional acceleration when scaled out onto many computational units. Cloud4Psi is available as Software as a Service for testing purposes at: http://cloud4psi.cloudapp.net/. For source code and software availability, please visit the Cloud4Psi project home page at http://zti.polsl.pl/dmrozek/science/cloud4psi.htm. © The Author 2014. Published by Oxford University Press.
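"Almost linearly proportional acceleration" can be quantified with the usual speedup and parallel-efficiency definitions; near-linear scaling means efficiency stays close to 1 as units are added. The timings below are hypothetical, not Cloud4Psi benchmark results.

```python
def scaling_efficiency(t_serial, t_parallel, n_units):
    """Speedup S = T1/Tn and parallel efficiency E = S/n for a job
    scaled out onto n computational units (hypothetical timings)."""
    speedup = t_serial / t_parallel
    return speedup, speedup / n_units

# Hypothetical: a 120 s search finishing in 16 s on 8 units.
s, e = scaling_efficiency(120.0, 16.0, 8)
print(round(s, 2), round(e, 2))  # 7.5-fold speedup, ~94% efficiency
```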

  17. The effect of cloud shape on radiative characteristics

    International Nuclear Information System (INIS)

    Welch, R.M.; Zdunkowski, W.G.

    1981-01-01

Cumulus cloud radiative characteristics are calculated using Monte-Carlo codes as a function of solar zenith angle for clouds approximated by hemispherical, cylindrical and combination-type geometries. Values of cloud reflectivity, transmissivity and absorptivity are compared with values computed from assuming cubic and rectangular geometries, the basis for most previous finite cloud calculations. Poor agreement is obtained at large cloud sizes and only marginal agreement is obtained at small cloud sizes. Two approximations based upon various scalings of cloud optical depth (extinction parameters) are also constructed, but with limited success in reproducing the values produced by the convex shaped clouds. Reasonable agreement among the various approximations occurs at large solar zenith angles, but extremely poor agreement may occur at small angles. (orig./WB) [de]

  18. Satellite retrieval of cloud condensation nuclei concentrations by using clouds as CCN chambers

    Science.gov (United States)

    Rosenfeld, Daniel; Zheng, Youtong; Hashimshoni, Eyal; Pöhlker, Mira L.; Jefferson, Anne; Pöhlker, Christopher; Yu, Xing; Zhu, Yannian; Liu, Guihua; Yue, Zhiguo; Fischman, Baruch; Li, Zhanqing; Giguzin, David; Goren, Tom; Artaxo, Paulo; Pöschl, Ulrich

    2016-01-01

Quantifying the aerosol/cloud-mediated radiative effect at a global scale requires simultaneous satellite retrievals of cloud condensation nuclei (CCN) concentrations and cloud base updraft velocities (Wb). Hitherto, the inability to do so has been a major cause of high uncertainty regarding anthropogenic aerosol/cloud-mediated radiative forcing. This can be addressed by the emerging capability of estimating CCN and Wb of boundary layer convective clouds from an operational polar orbiting weather satellite. Our methodology uses such clouds as an effective analog for CCN chambers. The cloud base supersaturation (S) is determined by Wb and the satellite-retrieved cloud base drop concentrations (Ndb), which is the same as CCN(S). Validation against ground-based CCN instruments at Oklahoma, at Manaus, and onboard a ship in the northeast Pacific showed a retrieval accuracy of ±25% to ±30% for individual satellite overpasses. The methodology is presently limited to non-raining boundary layer convective clouds of at least 1 km depth that are not obscured by upper layer clouds, including semitransparent cirrus. The limitation for small solar backscattering angles of <25° restricts the satellite coverage to ∼25% of the world area in a single day. PMID:26944081

  19. Using MODIS Cloud Regimes to Sort Diagnostic Signals of Aerosol-Cloud-Precipitation Interactions.

    Science.gov (United States)

    Oreopoulos, Lazaros; Cho, Nayeong; Lee, Dongmin

    2017-05-27

    Coincident multi-year measurements of aerosol, cloud, precipitation and radiation at near-global scales are analyzed to diagnose their apparent relationships as suggestive of interactions previously proposed based on theoretical, observational, and model constructs. Specifically, we examine whether differences in aerosol loading in separate observations go along with consistently different precipitation, cloud properties, and cloud radiative effects. Our analysis uses a cloud regime (CR) framework to dissect and sort the results. The CRs come from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor and are defined as distinct groups of cloud systems with similar co-variations of cloud top pressure and cloud optical thickness. Aerosol optical depth used as proxy for aerosol loading comes from two sources, MODIS observations, and the MERRA-2 re-analysis, and its variability is defined with respect to local seasonal climatologies. The choice of aerosol dataset impacts our results substantially. We also find that the responses of the marine and continental component of a CR are frequently quite disparate. Overall, CRs dominated by warm clouds tend to exhibit less ambiguous signals, but also have more uncertainty with regard to precipitation changes. Finally, we find weak, but occasionally systematic co-variations of select meteorological indicators and aerosol, which serves as a sober reminder that ascribing changes in cloud and cloud-affected variables solely to aerosol variations is precarious.
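The cloud regime (CR) framework above groups cloud systems by co-variations of cloud top pressure (CTP) and cloud optical thickness (COT). The sketch below is a toy stand-in: a minimal k-means on normalized (CTP, COT) pairs, whereas the operational MODIS CRs are defined by clustering full 2-D joint histograms. All data here are synthetic.

```python
import numpy as np

def kmeans(x, k, iters=50, seed=0):
    """Minimal k-means clustering: assign each sample to its nearest
    center, then recompute centers as cluster means."""
    rng = np.random.default_rng(seed)
    centers = x[rng.choice(len(x), size=k, replace=False)].copy()
    for _ in range(iters):
        dists = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = np.argmin(dists, axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean(axis=0)
    return labels, centers

rng = np.random.default_rng(1)
# Two synthetic populations: deep/thick systems (low CTP, high COT)
# and shallow/thin systems (high CTP, low COT), scaled to [0, 1].
deep = rng.normal(loc=[0.2, 0.8], scale=0.05, size=(100, 2))
shallow = rng.normal(loc=[0.8, 0.2], scale=0.05, size=(100, 2))
samples = np.vstack([deep, shallow])
labels, centers = kmeans(samples, k=2)
print(np.bincount(labels))  # the two regimes separate cleanly
```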

  20. Cloud CCN feedback

    International Nuclear Information System (INIS)

    Hudson, J.G.

    1992-01-01

Cloud microphysics affects cloud albedo, precipitation efficiency, and the extent of cloud feedback in response to global warming. Compared to other cloud parameters, microphysics is unique in its large range of variability and the fact that much of the variability is anthropogenic. Probably the most important determinant of cloud microphysics is the spectrum of cloud condensation nuclei (CCN), which displays considerable variability and has a large anthropogenic component. When analyzed in combination, three field observation projects display the interrelationship between CCN and cloud microphysics. CCN were measured with the Desert Research Institute (DRI) instantaneous CCN spectrometer. Cloud microphysical measurements were obtained with the National Center for Atmospheric Research Lockheed Electra. Since CCN and cloud microphysics each affect the other, a positive feedback mechanism can result.

  1. Testing remote sensing on artificial observations: impact of drizzle and 3-D cloud structure on effective radius retrievals

    Directory of Open Access Journals (Sweden)

    T. Zinner

    2010-10-01

Full Text Available Remote sensing of cloud effective particle size with passive sensors like the Moderate Resolution Imaging Spectroradiometer (MODIS) is an important tool for cloud microphysical studies. As a measure of the radiatively relevant droplet size, effective radius can be retrieved with different combinations of visible through shortwave and midwave infrared channels. In practice, retrieved effective radii from these combinations can be quite different. This difference is perhaps indicative of different penetration depths and path lengths for the spectral reflectances used. In addition, operational liquid water cloud retrievals are based on the assumption of a relatively narrow distribution of droplet sizes; the role of larger precipitation particles in these distributions is neglected. Therefore, possible explanations for the discrepancy in some MODIS spectral size retrievals could include 3-D radiative transport effects, including sub-pixel cloud inhomogeneity, and/or the impact of drizzle formation.

    For three cloud cases the possible factors of influence are isolated and investigated in detail by the use of simulated cloud scenes and synthetic satellite data: marine boundary layer cloud scenes from large eddy simulations (LES with detailed microphysics are combined with Monte Carlo radiative transfer calculations that explicitly account for the detailed droplet size distributions as well as 3-D radiative transfer to simulate MODIS observations. The operational MODIS optical thickness and effective radius retrieval algorithm is applied to these and the results are compared to the given LES microphysics.

    We investigate two types of marine cloud situations each with and without drizzle from LES simulations: (1 a typical daytime stratocumulus deck at two times in the diurnal cycle and (2 one scene with scattered cumulus. Only small impact of drizzle formation on the retrieved domain average and on the differences between the three

  2. Guidelines for Building a Private Cloud Infrastructure

    DEFF Research Database (Denmark)

    Ali Babar, Muhammad; Pantić, Zoran

on open source software. One of the key objectives of this project was to create relevant material for providing a reference guide on the use of open source software for designing and implementing a private cloud. The primary focus of this document is to provide a brief background on different theoretical ..., and a view on the different aspects of cloud computing in this document: defining cloud computing; analysis of the economic, security, legal, privacy, and confidentiality aspects. There is also a short discussion about the potential impact on employees' future roles, and the challenges of migrating ... to a private cloud. The management of the instances and the related subjects are out of the scope of this document. This document is accompanied by three supplemental books that contain material from our experiences of scaling out in the virtual environment and cloud implementation in a physical environment...

  3. MODEL FOR SEMANTICALLY RICH POINT CLOUD DATA

    Directory of Open Access Journals (Sweden)

    F. Poux

    2017-10-01

Full Text Available This paper proposes an interoperable model for managing high dimensional point clouds while integrating semantics. Point clouds from sensors are a direct source of information physically describing a 3D state of the recorded environment. As such, they are an exhaustive representation of the real world at every scale: 3D reality-based spatial data. Their generation is increasingly fast, but processing routines and data models lack the knowledge to reason from information extraction rather than interpretation. The enhanced smart point cloud model developed here brings intelligence to point clouds via 3 connected meta-models, while linking available knowledge and classification procedures that permit semantic injection. Interoperability drives the model's adaptation to potentially many applications through specialized domain ontologies. A first prototype, implemented in Python and a PostgreSQL database, allows semantic and spatial concepts to be combined for basic hybrid queries on different point clouds.
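A hedged sketch of the hybrid semantic + spatial query idea follows, in plain Python rather than the paper's Python + PostgreSQL prototype. The record fields, class labels, and example coordinates are all invented for illustration; in the prototype this filter would be a WHERE clause over a spatial table.

```python
from dataclasses import dataclass

@dataclass
class SemanticPoint:
    """One record of a semantically enriched point cloud: raw spatial
    coordinates plus an injected class label (names illustrative)."""
    x: float
    y: float
    z: float
    label: str

def hybrid_query(points, label, bbox):
    """Hybrid semantic + spatial query: keep points of a given class
    inside an axis-aligned box (xmin, ymin, xmax, ymax)."""
    xmin, ymin, xmax, ymax = bbox
    return [p for p in points
            if p.label == label
            and xmin <= p.x <= xmax and ymin <= p.y <= ymax]

cloud = [SemanticPoint(1.0, 1.0, 5.0, "facade"),
         SemanticPoint(2.0, 2.0, 0.1, "ground"),
         SemanticPoint(9.0, 9.0, 4.0, "facade")]
hits = hybrid_query(cloud, "facade", (0.0, 0.0, 5.0, 5.0))
print(len(hits))  # 1
```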

  4. Model for Semantically Rich Point Cloud Data

    Science.gov (United States)

    Poux, F.; Neuville, R.; Hallot, P.; Billen, R.

    2017-10-01

This paper proposes an interoperable model for managing high dimensional point clouds while integrating semantics. Point clouds from sensors are a direct source of information physically describing a 3D state of the recorded environment. As such, they are an exhaustive representation of the real world at every scale: 3D reality-based spatial data. Their generation is increasingly fast, but processing routines and data models lack the knowledge to reason from information extraction rather than interpretation. The enhanced smart point cloud model developed here brings intelligence to point clouds via 3 connected meta-models, while linking available knowledge and classification procedures that permit semantic injection. Interoperability drives the model's adaptation to potentially many applications through specialized domain ontologies. A first prototype, implemented in Python and a PostgreSQL database, allows semantic and spatial concepts to be combined for basic hybrid queries on different point clouds.

  5. Comparison of Monthly Mean Cloud Fraction and Cloud Optical depth Determined from Surface Cloud Radar, TOVS, AVHRR, and MODIS over Barrow, Alaska

    Science.gov (United States)

    Uttal, Taneil; Frisch, Shelby; Wang, Xuan-Ji; Key, Jeff; Schweiger, Axel; Sun-Mack, Sunny; Minnis, Patrick

    2005-01-01

A one year comparison is made of mean monthly values of cloud fraction and cloud optical depth over Barrow, Alaska (71 deg 19.378 min North, 156 deg 36.934 min West) between 35 GHz radar-based retrievals, the TOVS Pathfinder Path-P product, the AVHRR APP-X product, and a MODIS based cloud retrieval product from the CERES-Team. The data sets represent largely disparate spatial and temporal scales; however, in this paper, the focus is to provide a preliminary analysis of how the mean monthly values derived from these different data sets compare, and to determine how they can best be used, separately and in combination, to provide reliable estimates of long-term trends of changing cloud properties. The radar and satellite data sets described here incorporate Arctic-specific modifications that account for cloud detection challenges specific to the Arctic environment. The year 2000 was chosen for this initial comparison because the cloud radar data was particularly continuous and reliable that year, and all of the satellite retrievals of interest were also available for the year 2000. Cloud fraction was chosen as a comparison variable because accurate detection of cloud is the primary product necessary for any other cloud property retrievals. Cloud optical depth was additionally selected as it is likely the single cloud property that is most closely correlated to cloud influences on surface radiation budgets.
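Before such comparisons, each record must be reduced to monthly means on a common footing despite each instrument having different gaps. The small sketch below illustrates the reduction with missing days ignored; the daily values are invented, not the Barrow data.

```python
import numpy as np

def monthly_mean(values):
    """Mean of a month of daily retrievals, ignoring missing days
    (NaN), so records with different gaps can be compared."""
    v = np.asarray(values, dtype=float)
    good = ~np.isnan(v)
    return float(v[good].mean()) if good.any() else float("nan")

# Illustrative daily cloud fractions from two hypothetical sources
# over the same (short) month; the gaps differ between instruments.
radar = [0.8, 0.9, np.nan, 0.7, 0.85]
satellite = [0.75, np.nan, np.nan, 0.8, 0.9]
bias = monthly_mean(satellite) - monthly_mean(radar)
print(round(monthly_mean(radar), 4), round(bias, 4))
```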

  6. Hybrid cloud for dummies

    CERN Document Server

    Hurwitz, Judith; Halper, Fern; Kirsch, Dan

    2012-01-01

    Understand the cloud and implement a cloud strategy for your business Cloud computing enables companies to save money by leasing storage space and accessing technology services through the Internet instead of buying and maintaining equipment and support services. Because it has its own unique set of challenges, cloud computing requires careful explanation. This easy-to-follow guide shows IT managers and support staff just what cloud computing is, how to deliver and manage cloud computing services, how to choose a service provider, and how to go about implementation. It also covers security and

  7. Secure cloud computing

    CERN Document Server

    Jajodia, Sushil; Samarati, Pierangela; Singhal, Anoop; Swarup, Vipin; Wang, Cliff

    2014-01-01

    This book presents a range of cloud computing security challenges and promising solution paths. The first two chapters focus on practical considerations of cloud computing. In Chapter 1, Chandramouli, Iorga, and Chokani describe the evolution of cloud computing and the current state of practice, followed by the challenges of cryptographic key management in the cloud. In Chapter 2, Chen and Sion present a dollar cost model of cloud computing and explore the economic viability of cloud computing with and without security mechanisms involving cryptographic mechanisms. The next two chapters addres

  8. Clouds of Venus

    Energy Technology Data Exchange (ETDEWEB)

    Knollenberg, R G [Particle Measuring Systems, Inc., 1855 South 57th Court, Boulder, Colorado 80301, U.S.A.; Hansen, J [National Aeronautics and Space Administration, New York (USA). Goddard Inst. for Space Studies; Ragent, B [National Aeronautics and Space Administration, Moffett Field, Calif. (USA). Ames Research Center; Martonchik, J [Jet Propulsion Lab., Pasadena, Calif. (USA); Tomasko, M [Arizona Univ., Tucson (USA)

    1977-05-01

    The current state of knowledge of the Venusian clouds is reviewed. The visible clouds of Venus are shown to be quite similar to low level terrestrial hazes of strong anthropogenic influence. Possible nucleation and particle growth mechanisms are presented. The Pioneer Venus experiments that emphasize cloud measurements are described and their expected findings are discussed in detail. The results of these experiments should define the cloud particle composition, microphysics, thermal and radiative heat budget, rough dynamical features and horizontal and vertical variations in these and other parameters. This information should be sufficient to initialize cloud models which can be used to explain the cloud formation, decay, and particle life cycle.

  9. Impact of cloud microphysics on cloud-radiation interactions in the CSU general circulation model

    Energy Technology Data Exchange (ETDEWEB)

    Fowler, L.D.; Randall, D.A.

    1995-04-01

    Our ability to study and quantify the impact of cloud-radiation interactions on global-scale climate variations relies strongly upon the ability of general circulation models (GCMs) to simulate the coupling between the spatial and temporal variations of the model-generated cloudiness and the atmospheric moisture budget components. In particular, the ability of GCMs to reproduce the geographical distribution of the sources and sinks of the planetary radiation balance depends upon their representation of the formation and dissipation of cloudiness in conjunction with cloud microphysical processes, and of the fractional amount and optical characteristics of cloudiness in conjunction with the mass of condensate stored in the atmosphere. A cloud microphysics package which encompasses five prognostic variables for the mass of water vapor, cloud water, cloud ice, rain, and snow has been implemented in the Colorado State University General Circulation Model (CSU GCM) to simulate large-scale condensation processes. Convection interacts with the large-scale environment through the detrainment of cloud water and cloud ice at the top of cumulus towers. The cloud infrared emissivity and cloud optical depth of the model-generated cloudiness are interactive and depend upon the mass of cloud water and cloud ice suspended in the atmosphere. The global atmospheric moisture budget and planetary radiation budget of the CSU GCM obtained from a perpetual January simulation are discussed. Geographical distributions of the atmospheric moisture species are presented. Global maps of the top-of-atmosphere outgoing longwave radiation and planetary albedo are compared against Earth Radiation Budget Experiment (ERBE) satellite data.

  10. Using Radar, Lidar, and Radiometer measurements to Classify Cloud Type and Study Middle-Level Cloud Properties

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Zhien

    2010-06-29

    The project is mainly focused on the characterization of cloud macrophysical and microphysical properties, especially for mixed-phase clouds and middle-level ice clouds, by combining radar, lidar, and radiometer measurements available from the ACRF sites. First, an advanced mixed-phase cloud retrieval algorithm will be developed to cover all mixed-phase clouds observed at the ACRF NSA site. The algorithm will be applied to the ACRF NSA observations to generate a long-term Arctic mixed-phase cloud product for model validation and Arctic mixed-phase cloud process studies. To improve the representation of Arctic mixed-phase clouds in GCMs, an advanced understanding of mixed-phase cloud processes is needed. By combining retrieved mixed-phase cloud microphysical properties with in situ data and large-scale meteorological data, the project aims to better understand the generation of ice crystals in supercooled water clouds, the maintenance mechanisms of Arctic mixed-phase clouds, and their connections with large-scale dynamics. The project will also try to develop a new retrieval algorithm to study the more complex mixed-phase clouds observed at the ACRF SGP site. Compared with optically thin ice clouds, optically thick middle-level ice clouds are less studied because of limited available tools. The project will develop a new two-wavelength radar technique for studying optically thick ice clouds at the SGP site by combining the MMCR with W-band radar measurements. With this new algorithm, the SGP site will have a better capability to study all ice clouds. Another area of the proposal is to generate a long-term cloud type classification product for the multiple ACRF sites. The cloud type classification product will not only facilitate the generation of the integrated cloud product by applying different retrieval algorithms to different types of clouds operationally, but will also support other research to better understand cloud properties and to validate model simulations.

  11. A cloud shadow detection method combined with cloud height iteration and spectral analysis for Landsat 8 OLI data

    Science.gov (United States)

    Sun, Lin; Liu, Xinyan; Yang, Yikun; Chen, TingTing; Wang, Quan; Zhou, Xueying

    2018-04-01

    Although enhanced over prior Landsat instruments, Landsat 8 OLI can achieve very high cloud detection precision; the detection of cloud shadows, however, still faces great challenges. Geometry-based cloud shadow detection methods are considered the most effective and are being improved constantly. The Function of Mask (Fmask) cloud shadow detection method is one of the most representative geometry-based methods and has been used for cloud shadow detection with Landsat 8 OLI. However, the Fmask method estimates cloud height employing fixed temperature lapse rates, which are highly uncertain, and errors in large-area cloud shadow detection can be caused by errors in the estimation of cloud height. This article improves the geometry-based cloud shadow detection method for Landsat OLI in the following two respects. (1) Cloud height no longer depends on the brightness temperature of the thermal infrared band but instead spans a possible dynamic range from 200 m to 12,000 m. In this case, the cloud shadow is not a specific location but a possible range, and further spectral analysis is carried out within that range to determine the cloud shadow location. This effectively avoids the cloud shadow omissions caused by errors in the determination of a cloud's height. (2) Object-based and pixel-based spectral analyses are combined to detect cloud shadows, realizing cloud shadow detection at both the target scale and the pixel scale. Based on the analysis of the spectral differences between cloud shadows and typical ground objects, the best cloud shadow detection bands of Landsat 8 OLI were determined. The combined use of spectrum and shape can effectively improve the detection precision of cloud shadows produced by thin clouds. Several cloud shadow detection experiments were carried out, and the results were verified against artificial recognition.
The results of these experiments indicated that this method can identify cloud shadows in different regions with correct
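The geometric step of the method above, sweeping an assumed cloud height over the 200 m to 12,000 m range and projecting each height along the solar ray, can be sketched as follows. This is a hedged illustration of the general sun-geometry projection, not the authors' implementation; the function name, the sun-angle conventions (elevation above horizon, azimuth clockwise from north), and the 200 m step are assumptions.

```python
import math

def shadow_candidates(cloud_x, cloud_y, sun_elev_deg, sun_azim_deg,
                      h_min=200.0, h_max=12000.0, step=200.0):
    """Project a cloud pixel along the solar ray for a range of assumed
    cloud heights, returning candidate shadow positions (ground metres).
    This is the geometry-only stage; spectral analysis of the candidate
    range would then pin down the actual shadow location."""
    candidates = []
    tan_elev = math.tan(math.radians(sun_elev_deg))  # sun above the horizon assumed
    h = h_min
    while h <= h_max:
        d = h / tan_elev                              # horizontal shadow offset
        dx = -d * math.sin(math.radians(sun_azim_deg))
        dy = -d * math.cos(math.radians(sun_azim_deg))
        candidates.append((cloud_x + dx, cloud_y + dy, h))
        h += step
    return candidates
```

Each candidate is a (x, y, assumed height) triple; the union of candidates over all heights forms the "possible range" the article searches spectrally.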

  12. Clouds enhance Greenland ice sheet mass loss

    Science.gov (United States)

    Van Tricht, Kristof; Gorodetskaya, Irina V.; L'Ecuyer, Tristan; Lenaerts, Jan T. M.; Lhermitte, Stef; Noel, Brice; Turner, David D.; van den Broeke, Michiel R.; van Lipzig, Nicole P. M.

    2015-04-01

    Clouds have a profound influence on both the Arctic and global climate, yet they still represent one of the key uncertainties in climate models, limiting the fidelity of future climate projections. The potentially important role of thin liquid-containing clouds over Greenland in enhancing ice sheet melt has recently gained interest, yet current research is spatially and temporally limited, focusing on particular events, and the large-scale impact of these clouds on the surface mass balance remains unknown. We used a combination of satellite remote sensing (CloudSat - CALIPSO), ground-based observations and climate model (RACMO) data to show that liquid-containing clouds warm the Greenland ice sheet 94% of the time. High surface reflectivity (albedo) for shortwave radiation reduces the cloud shortwave cooling effect on the absorbed fluxes, while not influencing the absorption of longwave radiation. Cloud warming over the ice sheet therefore dominates year-round. Only when albedo values drop below ~0.6 in the coastal areas during summer does the cooling effect start to overcome the warming effect. The year-round excess of energy due to the presence of liquid-containing clouds has an extensive influence on the mass balance of the ice sheet. Simulations using the SNOWPACK snow model showed a strong influence of these liquid-containing clouds not only on increased melt but also on increased sublimation mass loss. Simulations with the Community Earth System Model for the end of the 21st century (2080-2099) show that Greenland clouds will have a greater liquid water path and a smaller ice water path. This implies that cloud radiative forcing will be further enhanced in the future. Our results therefore urge the need to improve cloud microphysics in climate models, to improve future projections of ice sheet mass balance and global sea level rise.

  13. Fluctuations in a quasi-stationary shallow cumulus cloud ensemble

    Directory of Open Access Journals (Sweden)

    M. Sakradzija

    2015-01-01

    We propose an approach to the stochastic parameterisation of shallow cumulus clouds that represents convective variability and its dependence on model resolution. To collect information about individual cloud lifecycles and the cloud ensemble as a whole, we employ a large eddy simulation (LES) model and a cloud tracking algorithm, followed by conditional sampling of clouds at the cloud-base level. In the case of a shallow cumulus ensemble, the cloud-base mass flux distribution is bimodal, owing to the two shallow cloud subtypes, active and passive clouds. Each distribution mode can be approximated by a Weibull distribution, which generalises the exponential distribution by accounting for the change in distribution shape due to the diversity of cloud lifecycles. The exponential distribution of cloud mass flux previously suggested for deep convection parameterisation is a special case of the Weibull distribution, which opens a way towards unifying the statistical convective ensemble formalism for shallow and deep cumulus clouds. Based on these empirical and theoretical findings, a stochastic model has been developed to simulate a shallow convective cloud ensemble. It is formulated as a compound random process, with the number of convective elements drawn from a Poisson distribution and the cloud mass flux sampled from a mixed Weibull distribution. Convective memory is accounted for through the explicit cloud lifecycles, making the model formulation consistent with the choice of the Weibull cloud mass flux distribution function. The memory of individual shallow clouds is required to capture the correct convective variability. The resulting distribution of subgrid convective states in the considered shallow cumulus case is scale-adaptive: the smaller the grid size, the broader the distribution.
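The compound random process described above (a Poisson-distributed number of clouds, each with a mass flux drawn from a two-mode Weibull mixture) can be sketched as a minimal sampler. The function name, the mixing probability `p_active`, and all shape/scale parameters are illustrative placeholders, not the fitted LES values from the paper.

```python
import math
import random

def sample_ensemble_mass_flux(lam, k_active, scale_active,
                              k_passive, scale_passive, p_active, rng=None):
    """Draw one realisation of the shallow-cumulus ensemble: the number of
    clouds is Poisson(lam), and each cloud-base mass flux comes from a
    two-mode (active/passive) Weibull mixture."""
    rng = rng or random.Random()
    # Poisson sample via Knuth's multiplication method (fine for modest lam)
    limit, n, prod = math.exp(-lam), 0, 1.0
    while True:
        prod *= rng.random()
        if prod <= limit:
            break
        n += 1
    # sum per-cloud mass fluxes; random.weibullvariate(alpha=scale, beta=shape)
    total = 0.0
    for _ in range(n):
        if rng.random() < p_active:
            total += rng.weibullvariate(scale_active, k_active)
        else:
            total += rng.weibullvariate(scale_passive, k_passive)
    return n, total
```

Scale-adaptivity then follows naturally: a smaller grid box implies a smaller expected cloud count `lam`, so the sampled total mass flux fluctuates relatively more, broadening the distribution of subgrid states.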

  14. Radiative properties of clouds

    International Nuclear Information System (INIS)

    Twomey, S.

    1993-01-01

    The climatic effects of condensation nuclei in the formation of cloud droplets and the subsequent role of the cloud droplets as contributors to the planetary short-wave albedo are emphasized. Microphysical properties of clouds, which can be greatly modified by the degree of mixing with cloud-free air from outside, are discussed. The effect of clouds on visible radiation is assessed through multiple scattering of the radiation. Cloud water or ice absorbs more with increasing wavelength in the near-infrared region, with water vapor providing the stronger absorption over narrower wavelength bands. Cloud thermal infrared absorption can be solely related to liquid water content, at least for shallow clouds and clouds in the early development stage. Three-dimensional general circulation models have been used to study the climatic effect of clouds. It was found for such studies (which did not consider variations in cloud albedo) that the cooling effects due to the increase in planetary short-wave albedo from clouds were offset by heating effects due to thermal infrared absorption by the cloud. Two permanent direct effects of increased pollution are discussed in this chapter: (a) an increase of absorption in the visible and near infrared because of increased amounts of elemental carbon, which gives rise to a warming effect climatically, and (b) an increased optical thickness of clouds due to increasing cloud droplet number concentration caused by increasing cloud condensation nuclei number concentration, which gives rise to a cooling effect climatically. An increase in cloud albedo from 0.7 to 0.87 produces an appreciable climatic perturbation of cooling up to 2.5 K at the ground, using a hemispheric general circulation model. Effects of pollution on cloud thermal infrared absorption are negligible.

  15. Cloud Application Architectures Building Applications and Infrastructure in the Cloud

    CERN Document Server

    Reese, George

    2009-01-01

    If you're involved in planning IT infrastructure as a network or system architect, system administrator, or developer, this book will help you adapt your skills to work with these highly scalable, highly redundant infrastructure services. Cloud Application Architectures will help you determine whether and how to put your applications into these virtualized services, with critical guidance on issues of cost, availability, performance, scaling, privacy, and security.

  16. Cloud-edge mixing: Direct numerical simulation and observations in Indian Monsoon clouds

    Science.gov (United States)

    Kumar, Bipin; Bera, Sudarsan; Prabha, Thara V.; Grabowski, Wojceich W.

    2017-03-01

    A direct numerical simulation (DNS) with a decaying-turbulence setup has been carried out to study cloud-edge mixing and its impact on the droplet size distribution (DSD), applying thermodynamic conditions observed in monsoon convective clouds over the Indian subcontinent during the Cloud Aerosol Interaction and Precipitation Enhancement EXperiment (CAIPEEX). Evaporation at the cloud edges initiates mixing at small scales and gradually introduces larger-scale fluctuations of temperature, moisture, and vertical velocity due to droplet evaporation. Our focus is on the early evolution of the simulated fields, which show intriguing similarities to the CAIPEEX cloud observations. A strong dilution at the cloud edge, accompanied by significant spatial variations of the droplet concentration, mean radius, and spectral width, is found in both the DNS and the observations. In the DNS, fluctuations of the mean radius and spectral width come from the impact of small-scale turbulence on the motion and evaporation of inertial droplets. These fluctuations decrease as the volume over which DNS data are averaged increases, as one might expect. In cloud observations, these fluctuations also arise from other processes, such as entrainment/mixing below the observation level, secondary CCN activation, or variations of CCN activation at the cloud base. Despite large differences in spatial and temporal scales, the mixing diagram often used in entrainment/mixing studies with aircraft data is remarkably similar for the DNS and the cloud observations. We argue that this similarity questions the applicability of heuristic ideas based on mixing between two air parcels (which the mixing diagram is designed to represent) to the evolution of microphysical properties during turbulent mixing between a cloud and its environment.

  17. Clouds, Wind and the Biogeography of Central American Cloud Forests: Remote Sensing, Atmospheric Modeling, and Walking in the Jungle

    Science.gov (United States)

    Lawton, R.; Nair, U. S.

    2011-12-01

    Cloud forests stand at the core of the complex of montane ecosystems that provide the backbone of the multinational Mesoamerican Biological Corridor, which seeks to protect a biodiversity conservation "hotspot" of global significance in an area of rapidly changing land use. Although cloud forests are generally defined by frequent and prolonged immersion in cloud, workers differ in their interpretations of "frequent" and "prolonged", and quantitative assessments are rare. Here we focus on the dry season, in which cloud and mist from orographic clouds play a critical role in forest water relations, and discuss remote sensing of orographic clouds and regional atmospheric modeling at several scales to examine quantitatively the distribution of the atmospheric conditions that characterize cloud forests. Remote sensing using data from GOES reveals diurnal and longer-scale patterns in the distribution of dry season orographic clouds in Central America at both regional and local scales. Data from MODIS, used to calculate the base height of orographic cloud banks, reveal not only the geographic distribution of cloud forest sites, but also striking regional variation in the frequency of montane immersion in orographic cloud. At a more local scale, wind is known to have striking effects on forest structure and species distribution in tropical montane ecosystems, both as a general mechanical stress and as the major agent of ecological disturbance. High-resolution regional atmospheric modeling using CSU RAMS in the Monteverde cloud forests of Costa Rica provides quantitative information on the spatial distribution of canopy-level winds and insight into the spatial structure and local dynamics of cloud forest communities. This information will be useful not only in local conservation planning and the design of the Mesoamerican Biological Corridor, but also in assessments of the sensitivity of cloud forests to global and regional climate changes.

  18. Impact of Aerosol Processing on Orographic Clouds

    Science.gov (United States)

    Pousse-Nottelmann, Sara; Zubler, Elias M.; Lohmann, Ulrike

    2010-05-01

    [6]. Our investigation regarding the influence of aerosol processing will focus on the regional scale using a cloud-system resolving model with a much higher resolution. Emphasis will be placed on orographic mixed-phase precipitation. Different two-dimensional simulations of idealized orographic clouds will be conducted to estimate the effect of aerosol processing on orographic cloud formation and precipitation. Here, cloud lifetime, location and extent, as well as the cloud type, will be of particular interest. In a supplementary study, the new parameterization will be compared to observations of total and interstitial aerosol concentrations and size distributions at the remote high-alpine research station Jungfraujoch in Switzerland. In addition, our simulations will be compared to recent simulations of aerosol processing in warm, mixed-phase and cold clouds, which have been carried out for the location of the Jungfraujoch station [5]. References: [1] Pruppacher & Jaenicke (1995), The processing of water vapor and aerosols by atmospheric clouds, a global estimate, Atmos. Res., 38, 283-295. [2] Seifert & Beheng (2006), A two-moment microphysics parameterization for mixed-phase clouds. Part 1: Model description, Meteorol. Atmos. Phys., 92, 45-66. [3] Vignati et al. (2004), An efficient size-resolved aerosol microphysics module for large-scale transport models, J. Geophys. Res., 109, D22202. [4] Muhlbauer & Lohmann (2008), Sensitivity studies of the role of aerosols in warm-phase orographic precipitation in different flow regimes, J. Atmos. Sci., 65, 2522-2542. [5] Hoose et al. (2008), Aerosol processing in mixed-phase clouds in ECHAM5-HAM: Model description and comparison to observations, J. Geophys. Res., 113, D07210. [6] Hoose et al. (2008), Global simulations of aerosol processing in clouds, Atmos. Chem. Phys., 8, 6939-6963.

  19. Large-Scale Analysis of Relationships between Mineral Dust, Ice Cloud Properties, and Precipitation from Satellite Observations Using a Bayesian Approach: Theoretical Basis and First Results for the Tropical Atlantic Ocean

    Directory of Open Access Journals (Sweden)

    Lars Klüser

    2017-01-01

    Mineral dust and ice cloud observations from the Infrared Atmospheric Sounding Interferometer (IASI) are used to assess the relationships between desert dust aerosols and ice clouds over the tropical Atlantic Ocean during the 2008 hurricane season. Cloud property histograms are first adjusted for varying cloud top temperature or ice water path distributions with a Bayesian approach to account for meteorological constraints on the cloud variables. Then, histogram differences between dust load classes are used to describe the impact of dust load on cloud property statistics. The analysis of the histogram differences shows that ice crystal sizes are reduced with increasing aerosol load, while ice cloud optical depth and ice water path are increased. The distributions of all three variables broaden and become less skewed in dusty environments. For ice crystal size, the significant bimodality is reduced and the order of the peaks is reversed. Moreover, it is shown that the distributions of ice cloud variables are not simply shifted linearly; the variance, skewness, and complexity of the cloud variable distributions are also significantly affected. This implies that the whole cloud variable distributions have to be considered for indirect aerosol effects in any application for climate modelling.
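The histogram-adjustment step described above can be sketched as a simple reweighting: observations in each dust-load class are weighted so that the class's cloud-top-temperature (CTT) distribution matches a common reference before the cloud-property histograms are differenced. This is a minimal illustration of that idea, not the IASI processing itself; the function name, the binning scheme, and the normalisation are assumptions.

```python
from collections import Counter

def reweighted_histogram(values, ctt_bins, ref_ctt_counts, bins):
    """Weight each observation by (reference CTT frequency / class CTT
    frequency), then build a normalised histogram of the cloud variable.
    `values`: cloud property per observation; `ctt_bins`: CTT bin label per
    observation; `ref_ctt_counts`: CTT bin counts of the reference sample."""
    class_ctt_counts = Counter(ctt_bins)
    n_class = len(ctt_bins)
    n_ref = sum(ref_ctt_counts.values())
    weights = [
        (ref_ctt_counts.get(b, 0) / n_ref) / (class_ctt_counts[b] / n_class)
        for b in ctt_bins
    ]
    # accumulate a weighted, normalised histogram of the cloud variable
    hist = [0.0] * (len(bins) - 1)
    for v, w in zip(values, weights):
        for i in range(len(bins) - 1):
            if bins[i] <= v < bins[i + 1]:
                hist[i] += w
                break
    total = sum(hist) or 1.0
    return [h / total for h in hist]
```

Differencing such reweighted histograms between dusty and dust-free classes then isolates the dust signal from the shared meteorological constraint.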

  20. SECURITY AND PRIVACY ISSUES IN CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    Amina AIT OUAHMAN

    2014-10-01

    Today, cloud computing is defined and talked about across the ICT industry in different contexts and with different definitions attached to it. It is a new paradigm in the evolution of information technology and one of the biggest revolutions in this field to have taken place in recent times. According to the National Institute of Standards and Technology (NIST), "cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction" [1]. The importance of cloud computing is increasing and it is receiving growing attention in the scientific and industrial communities. A study by Gartner [2] ranked cloud computing first among the top 10 most important technologies, with better prospects in successive years for companies and organizations. Clouds bring tremendous benefits for both individuals and enterprises: they support economic savings, outsourcing mechanisms, resource sharing, anywhere/anytime accessibility, on-demand scalability, and service flexibility. Clouds minimize the need for user involvement by masking technical details such as software upgrades, licenses, and maintenance from customers. Clouds can also offer better security than individual server deployments: since a cloud aggregates resources, cloud providers can hire expert security personnel, while typical companies may be limited to a network administrator who might not be well versed in cyber security issues. The new concepts introduced by clouds, such as computation outsourcing, resource sharing, and external data warehousing, increase security and privacy concerns and create new security challenges. Moreover, the large scale of clouds, the proliferation of mobile access devices (e

  1. Moving towards Cloud Security

    Directory of Open Access Journals (Sweden)

    Edit Szilvia Rubóczki

    2015-01-01

    Cloud computing hosts and delivers many different services via the Internet, and there are many reasons why people opt for using cloud resources. Cloud development is increasing rapidly, while many related concerns lag behind, for example mass awareness of cloud security. The new generation uploads videos and pictures to cloud storage without a second thought, yet only a few users know about data privacy, data management and the ownership of data stored in the cloud. In an enterprise environment, users have to know the rules of cloud usage, yet they often have little knowledge of traditional IT security. It is important to measure the level of their knowledge and to evolve training systems that develop security awareness. The article argues for the importance of new metrics and algorithms for measuring the security awareness of corporate users and employees that include the requirements of emerging cloud security.

  2. Cloud Computing Strategy

    Science.gov (United States)

    2012-07-01

    regardless of access point or the device being used across the Global Information Grid (GIG). These data centers will host existing applications...state. It illustrates that the DoD Enterprise Cloud is an integrated environment on the GIG, consisting of DoD Components, commercial entities...Operations and Maintenance (O&M) costs by leveraging economies of scale, and automate monitoring and provisioning to reduce the human cost of service

  3. THE MASS-LOSS RETURN FROM EVOLVED STARS TO THE LARGE MAGELLANIC CLOUD. VI. LUMINOSITIES AND MASS-LOSS RATES ON POPULATION SCALES

    International Nuclear Information System (INIS)

    Riebel, D.; Meixner, M.; Srinivasan, S.; Sargent, B.

    2012-01-01

    We present results from the first application of the Grid of Red Supergiant and Asymptotic Giant Branch ModelS (GRAMS) model grid to the entire evolved stellar population of the Large Magellanic Cloud (LMC). GRAMS is a pre-computed grid of 80,843 radiative transfer models of evolved stars and circumstellar dust shells composed of either silicate or carbonaceous dust. We fit GRAMS models to ~30,000 asymptotic giant branch (AGB) and red supergiant (RSG) stars in the LMC, using 12 bands of photometry from the optical to the mid-infrared. Our published data set consists of thousands of evolved stars with individually determined evolutionary parameters such as luminosity and mass-loss rate. The GRAMS grid has a greater than 80% accuracy rate discriminating between oxygen- and carbon-rich chemistry. The global dust injection rate to the interstellar medium (ISM) of the LMC from RSGs and AGB stars is on the order of 2.1 × 10^-5 M☉ yr^-1, equivalent to a total mass injection rate (including the gas) into the ISM of ~6 × 10^-3 M☉ yr^-1. Carbon stars inject two and a half times as much dust into the ISM as do O-rich AGB stars, but the same amount of mass. We determine a bolometric correction factor for C-rich AGB stars in the Ks band as a function of J - Ks color, BC(Ks) = -0.40(J - Ks)^2 + 1.83(J - Ks) + 1.29. We determine several IR color proxies for the dust mass-loss rate (Mdot_d) from C-rich AGB stars, such as log Mdot_d = -18.90/((Ks - [8.0]) + 3.37) - 5.93. We find that a larger fraction of AGB stars exhibiting the 'long-secondary period' phenomenon are O-rich, compared with stars dominated by radial pulsations, and AGB stars without detectable mass loss do not appear on either the first-overtone or fundamental-mode pulsation sequences.
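The two fitted relations quoted in the abstract can be written directly as functions. The coefficients are taken verbatim from the abstract; the function names and the implicit validity range of the colours are assumptions of this sketch.

```python
def bc_ks(j_minus_ks):
    """Bolometric correction in the Ks band for C-rich AGB stars:
    BC(Ks) = -0.40(J - Ks)^2 + 1.83(J - Ks) + 1.29."""
    return -0.40 * j_minus_ks**2 + 1.83 * j_minus_ks + 1.29

def log_dust_mlr(ks_minus_8um):
    """IR colour proxy for the dust mass-loss rate (log10, in Msun/yr):
    log Mdot_d = -18.90/((Ks - [8.0]) + 3.37) - 5.93."""
    return -18.90 / (ks_minus_8um + 3.37) - 5.93
```

For example, a star with J - Ks = 1.0 gets BC(Ks) = 2.72 mag, and a zero Ks - [8.0] colour corresponds to log Mdot_d around -11.5, i.e. a dust mass-loss rate of a few times 10^-12 M☉ yr^-1.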

  4. Marine cloud brightening

    OpenAIRE

    Latham, John; Bower, Keith; Choularton, Tom; Coe, Hugh; Connolly, Paul; Cooper, Gary; Craft, Tim; Foster, Jack; Gadian, Alan; Galbraith, Lee; Iacovides, Hector; Johnston, David; Launder, Brian; Leslie, Brian; Meyer, John

    2012-01-01

    The idea behind the marine cloud-brightening (MCB) geoengineering technique is that seeding marine stratocumulus clouds with copious quantities of roughly monodisperse sub-micrometre sea water particles might significantly enhance the cloud droplet number concentration, and thereby the cloud albedo and possibly longevity. This would produce a cooling, which general circulation model (GCM) computations suggest could—subject to satisfactory resolution of technical and scientific problems identi...

  5. Cloud computing strategies

    CERN Document Server

    Chorafas, Dimitris N

    2011-01-01

    A guide to managing cloud projects, Cloud Computing Strategies provides the understanding required to evaluate the technology and determine how it can be best applied to improve business and enhance your overall corporate strategy. Based on extensive research, it examines the opportunities and challenges that loom in the cloud. It explains exactly what cloud computing is, what it has to offer, and calls attention to the important issues management needs to consider before passing the point of no return regarding financial commitments.

  6. Towards Indonesian Cloud Campus

    OpenAIRE

    Thamrin, Taqwan; Lukman, Iing; Wahyuningsih, Dina Ika

    2013-01-01

    Nowadays, cloud computing is the most discussed term in business and academic environments. A cloud campus has many benefits, such as access to file storage, e-mail, databases, educational resources, and research applications and tools anywhere, on demand, for faculty, administrators, staff, students and other university users. Furthermore, a cloud campus reduces universities' IT complexity and cost. This paper discusses the implementation of the Indonesian cloud campus and its various opportunities and benefits...

  7. Cloud Infrastructure Security

    OpenAIRE

    Velev , Dimiter; Zlateva , Plamena

    2010-01-01

    Part 4: Security for Clouds; Cloud computing can help companies accomplish more by eliminating the physical bonds between an IT infrastructure and its users. Users can purchase services from a cloud environment that could allow them to save money and focus on their core business. At the same time, certain concerns have emerged as potential barriers to rapid adoption of cloud services, such as security, privacy and reliability. Usually the information security professiona...

  8. Cloud services in organization

    OpenAIRE

    FUXA, Jan

    2013-01-01

    The work deals with the definition of the term cloud computing; cloud computing models, types, advantages and disadvantages; and a comparison of SaaS solutions such as Google Apps and Office 365 in the area of electronic communications. It also examines the use of cloud computing in corporate practice, both good and bad. The following section describes a methodology for choosing the appropriate cloud service for an organization. Another part deals with analyzing the possibilities of SaaS i...

  9. Orchestrating Your Cloud Orchestra

    OpenAIRE

    Hindle, Abram

    2015-01-01

    Cloud computing potentially ushers in a new era of computer music performance with exceptionally large computer music instruments consisting of 10s to 100s of virtual machines which we propose to call a `cloud-orchestra'. Cloud computing allows for the rapid provisioning of resources, but to deploy such a complicated and interconnected network of software synthesizers in the cloud requires a lot of manual work, system administration knowledge, and developer/operator skills. This is a barrier ...

  10. Cloud security mechanisms

    OpenAIRE

    2014-01-01

    Cloud computing has brought great benefits in cost and flexibility for provisioning services. The greatest challenge of cloud computing remains however the question of security. The current standard tools in access control mechanisms and cryptography can only partly solve the security challenges of cloud infrastructures. In the recent years of research in security and cryptography, novel mechanisms, protocols and algorithms have emerged that offer new ways to create secure services atop cloud...

  11. Cloud computing for radiologists

    OpenAIRE

    Amit T Kharat; Amjad Safvi; S S Thind; Amarjit Singh

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as...

  12. Cloud Robotics Model

    OpenAIRE

    Mester, Gyula

    2015-01-01

    Cloud Robotics was born from the merger of service robotics and cloud technologies. It allows robots to benefit from the powerful computational, storage, and communications resources of modern data centres. Cloud robotics allows robots to take advantage of the rapid increase in data transfer rates to offload tasks without hard real time requirements. Cloud Robotics has rapidly gained momentum with initiatives by companies such as Google, Willow Garage and Gostai as well as more than a dozen a...

  13. Genomics With Cloud Computing

    OpenAIRE

    Sukhamrit Kaur; Sandeep Kaur

    2015-01-01

    Abstract Genomics is the study of genomes, which produces large amounts of data requiring substantial storage and computational power. These needs are met by cloud computing, which provides various cloud platforms for genomics. These platforms offer many services to users, such as easy access to data, easy sharing and transfer, storage of hundreds of terabytes, and greater computational power. Some cloud platforms are Google Genomics, DNAnexus, and Globus Genomics. Various features of cloud computin...

  14. Secure Web System in a Cloud Environment

    OpenAIRE

    Pokherl, Bibesh

    2013-01-01

    Advent of cloud computing has brought a lot of benefits for users based on its essential characteristics. Users are attracted by its costs per use service and rapidly deploy their applications in the cloud and scale by using virtualization technology without investing in their own IT infrastructure. These applications can be accessed through web based technology, such as web browsers or mobile apps. However, security becomes a major challenge when user’s data and applications are stored in a ...

  15. On CLOUD nine

    CERN Multimedia

    2009-01-01

    The team from the CLOUD experiment - the world’s first experiment using a high-energy particle accelerator to study the climate - were on cloud nine after the arrival of their new three-metre diameter cloud chamber. This marks the end of three years’ R&D and design, and the start of preparations for data taking later this year.

  16. Cloud Computing Explained

    Science.gov (United States)

    Metz, Rosalyn

    2010-01-01

    While many talk about the cloud, few actually understand it. Three organizations' definitions come to the forefront when defining the cloud: Gartner, Forrester, and the National Institutes of Standards and Technology (NIST). Although both Gartner and Forrester provide definitions of cloud computing, the NIST definition is concise and uses…

  17. Greening the Cloud

    NARCIS (Netherlands)

    van den Hoed, Robert; Hoekstra, Eric; Procaccianti, G.; Lago, P.; Grosso, Paola; Taal, Arie; Grosskop, Kay; van Bergen, Esther

    The cloud has become an essential part of our daily lives. We use it to store our documents (Dropbox), to stream our music and films (Spotify and Netflix) and without giving it any thought, we use it to work on documents in the cloud (Google Docs). The cloud forms a massive storage and processing

  18. Security in the cloud.

    Science.gov (United States)

    Degaspari, John

    2011-08-01

    As more provider organizations look to the cloud computing model, they face a host of security-related questions. What are the appropriate applications for the cloud, what is the best cloud model, and what do they need to know to choose the best vendor? Hospital CIOs and security experts weigh in.

  19. Frank: The LOD cloud at your fingertips?

    NARCIS (Netherlands)

    Beek, Wouter; Rietveld, Laurens

    2015-01-01

    Large-scale, algorithmic access to LOD Cloud data has been hampered by the absence of queryable endpoints for many datasets, a plethora of serialization formats, and an abundance of idiosyncrasies such as syntax errors. As of late, very large-scale — hundreds of thousands of documents, tens of

  1. A Novel Cloud Computing Algorithm of Security and Privacy

    Directory of Open Access Journals (Sweden)

    Chih-Yung Chen

    2013-01-01

    Full Text Available The emergence of cloud computing has simplified large-scale deployment of distributed systems for software suppliers; when the same application is issued through a shared cloud service to different users, data management becomes more complex. Therefore, in a multi-tenant cloud trust environment, the issue that most worries enterprises facing cloud computing is security, while individual users worry about whether their private data are at risk of leakage. This research analyzes several different deployment patterns of cloud computing, together with relevant cases of secure and insecure cloud deployments, and finally proposes an optimized secure deployment architecture for cloud computing and a data-protection security mechanism, namely the Global Authentication Register System (GARS), to reduce the risk of cloud data leakage. We implemented a system simulation to test the availability, security, and performance of the GARS algorithm. Analysis of the experimental data shows that the cloud security and privacy solutions derived from this research can effectively protect cloud information security. Moreover, we offer proposals on cloud computing information security that may assist related organizations in developing cloud computing security practices.

  2. Quantitative Measures of Immersion in Cloud and the Biogeography of Cloud Forests

    Science.gov (United States)

    Lawton, R. O.; Nair, U. S.; Ray, D.; Regmi, A.; Pounds, J. A.; Welch, R. M.

    2010-01-01

    Sites described as tropical montane cloud forests differ greatly, in part because observers tend to differ in their opinion as to what constitutes frequent and prolonged immersion in cloud. This definitional difficulty interferes with hydrologic analyses, assessments of environmental impacts on ecosystems, and biogeographical analyses of cloud forest communities and species. Quantitative measurements of cloud immersion can be obtained on site, but the observations are necessarily spatially limited, although well-placed observers can examine 10-50 km of a mountain range under rainless conditions. Regional analyses, however, require observations at a broader scale. This chapter discusses remote sensing and modeling approaches that can provide quantitative measures of the spatiotemporal patterns of cloud cover and cloud immersion in tropical mountain ranges. These approaches integrate remote sensing tools of various spatial resolutions and frequencies of observation, digital elevation models, regional atmospheric models, and ground-based observations to provide measures of cloud cover, cloud base height, and the intersection of cloud and terrain. This combined approach was applied to the Monteverde region of northern Costa Rica to illustrate how the proportion of time the forest is immersed in cloud may vary spatially and temporally. The observed spatial variation was largely due to patterns of airflow over the mountains. The temporal variation reflected the diurnal rise and fall of the orographic cloud base, which was influenced in turn by synoptic weather conditions, the seasonal movement of the Intertropical Convergence Zone and the north-easterly trade winds. Knowledge of the proportion of the time that sites are immersed in clouds should facilitate ecological comparisons and biogeographical analyses, as well as land use planning and hydrologic assessments in areas where intensive on-site work is not feasible.
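
    The intersection of cloud and terrain described above can be computed directly once a digital elevation model and a time series of retrieved cloud-base heights are gridded to a common raster. A minimal sketch (the `immersion_frequency` helper and all array names are illustrative, not from the study):

```python
import numpy as np

def immersion_frequency(terrain_elev, cloud_base, cloud_mask):
    """Fraction of observation times each terrain cell is immersed in cloud.

    terrain_elev : (ny, nx) array of surface elevations [m], e.g. from a DEM
    cloud_base   : (nt, ny, nx) array of retrieved cloud-base heights [m]
    cloud_mask   : (nt, ny, nx) boolean array, True where cloud was detected

    A cell counts as immersed at time t when cloud is present and the
    retrieved cloud base lies at or below the terrain surface.
    """
    immersed = cloud_mask & (cloud_base <= terrain_elev[None, :, :])
    return immersed.mean(axis=0)

# Tiny synthetic example: a valley cell and a ridge cell observed 4 times.
terrain = np.array([[1200.0, 1800.0]])   # valley floor vs ridge crest [m]
base = np.full((4, 1, 2), 1500.0)        # cloud base at 1500 m each time
mask = np.ones((4, 1, 2), dtype=bool)
mask[0] = False                          # one clear scene
freq = immersion_frequency(terrain, base, mask)
# valley (1200 m) never reaches the cloud base -> 0.0
# ridge (1800 m) is immersed in the 3 cloudy scenes -> 0.75
```

    The same comparison extends naturally to model-derived cloud-base fields; only the gridding of the inputs changes.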

  3. Evaluating the impact of aerosol particles above cloud on cloud optical depth retrievals from MODIS

    Science.gov (United States)

    Alfaro-Contreras, Ricardo; Zhang, Jianglong; Campbell, James R.; Holz, Robert E.; Reid, Jeffrey S.

    2014-05-01

    Using two different operational Aqua Moderate Resolution Imaging Spectroradiometer (MODIS) cloud optical depth (COD) retrievals (0.86 versus 1.6 µm), we evaluate the impact of above-cloud smoke aerosol particles on near-IR (0.86 µm) COD retrievals. Aerosol Index (AI) from the collocated Ozone Monitoring Instrument (OMI) are used to identify above-cloud aerosol particle loading over the southern Atlantic Ocean, including both smoke and dust from the African subcontinent. Collocated Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation data constrain cloud phase and provide contextual above-cloud aerosol optical depth. The frequency of occurrence of above-cloud aerosol events is depicted on a global scale for the spring and summer seasons from OMI and Cloud Aerosol Lidar with Orthogonal Polarization. Seasonal frequencies for smoke-over-cloud off the southwestern Africa coastline reach 20-50% in boreal summer. We find a corresponding low COD bias of 10-20% for standard MODIS COD retrievals when averaged OMI AI are larger than 1. No such bias is found over the Saharan dust outflow region off northern Africa, since both MODIS 0.86 and 1.6 µm channels are vulnerable to radiance attenuation due to dust particles. A similar result is found for a smaller domain, in the Gulf of Tonkin region, from smoke advection over marine stratocumulus clouds and outflow into the northern South China Sea in spring. This study shows the necessity of accounting for the above-cloud aerosol events for future studies using standard MODIS cloud products in biomass burning outflow regions, through the use of collocated OMI AI and supplementary MODIS 1.6 µm COD products.
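
    The bias statistic used here, comparing the near-IR COD retrieval against the SWIR retrieval conditioned on above-cloud Aerosol Index, can be sketched as follows. The function name and input arrays are hypothetical stand-ins for the collocated MODIS/OMI data, not the study's actual processing code:

```python
import numpy as np

def mean_relative_cod_bias(cod_vis, cod_swir, ai, ai_threshold=1.0):
    """Mean relative bias of the 0.86 um COD against the 1.6 um COD,
    restricted to collocated pixels whose above-cloud Aerosol Index
    exceeds ai_threshold. Negative values indicate a low bias in the
    near-IR retrieval, as reported over the SE Atlantic smoke region.
    """
    cod_vis, cod_swir, ai = map(np.asarray, (cod_vis, cod_swir, ai))
    sel = (ai > ai_threshold) & (cod_swir > 0)
    if not sel.any():
        return np.nan   # no pixels meet the above-cloud aerosol criterion
    return float(np.mean((cod_vis[sel] - cod_swir[sel]) / cod_swir[sel]))

# Synthetic check: where AI > 1 the near-IR COD reads 15% low.
cod_swir = np.array([10.0, 10.0, 20.0, 20.0])
cod_vis  = np.array([ 8.5, 10.0, 17.0, 20.0])   # -15% in the smoky pixels
ai       = np.array([ 1.5,  0.2,  1.8,  0.5])
bias = mean_relative_cod_bias(cod_vis, cod_swir, ai)
# -> -0.15
```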

  4. Evaluating the impact of above-cloud aerosols on cloud optical depth retrievals from MODIS

    Science.gov (United States)

    Alfaro, Ricardo

    Using two different operational Aqua Moderate Resolution Imaging Spectroradiometer (MODIS) cloud optical depth (COD) retrievals (visible and shortwave infrared), the impacts of above-cloud absorbing aerosols on the standard COD retrievals are evaluated. For fine-mode aerosol particles, aerosol optical depth (AOD) values diminish sharply from the visible to the shortwave infrared channels. Thus, a suppressed above-cloud particle radiance aliasing effect occurs for COD retrievals using shortwave infrared channels. Aerosol Index (AI) from the spatially and temporally collocated Ozone Monitoring Instrument (OMI) are used to identify above-cloud aerosol particle loading over the southern Atlantic Ocean, including both smoke and dust from the African sub-continent. MODIS and OMI Collocated Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) data are used to constrain cloud phase and provide contextual above-cloud AOD values. The frequency of occurrence of above-cloud aerosols is depicted on a global scale for the spring and summer seasons from OMI and CALIOP, thus indicating the significance of the problem. Seasonal frequencies for smoke-over-cloud off the southwestern Africa coastline reach 20-50% in boreal summer. We find a corresponding low COD bias of 10-20% for standard MODIS COD retrievals when averaged OMI AI are larger than 1.0. No such bias is found over the Saharan dust outflow region off northern Africa, since both MODIS visible and shortwave infrared channels are vulnerable to dust particle aliasing, and thus a COD impact cannot be isolated with this method. A similar result is found for a smaller domain, in the Gulf of Tonkin region, from smoke advection over marine stratocumulus clouds and outflow into the northern South China Sea in spring. This study shows the necessity of accounting for the above-cloud aerosol events for future studies using standard MODIS cloud products in biomass burning outflow regions, through the use of

  5. Microphysical processing of aerosol particles in orographic clouds

    Science.gov (United States)

    Pousse-Nottelmann, S.; Zubler, E. M.; Lohmann, U.

    2015-08-01

    An explicit and detailed treatment of cloud-borne particles allowing for the consideration of aerosol cycling in clouds has been implemented into COSMO-Model, the regional weather forecast and climate model of the Consortium for Small-scale Modeling (COSMO). The effects of aerosol scavenging, cloud microphysical processing and regeneration upon cloud evaporation on the aerosol population and on subsequent cloud formation are investigated. For this, two-dimensional idealized simulations of moist flow over two bell-shaped mountains were carried out varying the treatment of aerosol scavenging and regeneration processes for a warm-phase and a mixed-phase orographic cloud. The results allowed us to identify different aerosol cycling mechanisms. In the simulated non-precipitating warm-phase cloud, aerosol mass is incorporated into cloud droplets by activation scavenging and released back to the atmosphere upon cloud droplet evaporation. In the mixed-phase cloud, a first cycle comprises cloud droplet activation and evaporation via the Wegener-Bergeron-Findeisen (WBF) process. A second cycle includes below-cloud scavenging by precipitating snow particles and snow sublimation and is connected to the first cycle via the riming process which transfers aerosol mass from cloud droplets to snowflakes. In the simulated mixed-phase cloud, only a negligible part of the total aerosol mass is incorporated into ice crystals. Sedimenting snowflakes reaching the surface remove aerosol mass from the atmosphere. The results show that aerosol processing and regeneration lead to a vertical redistribution of aerosol mass and number. Thereby, the processes impact the total aerosol number and mass and additionally alter the shape of the aerosol size distributions by enhancing the internally mixed/soluble Aitken and accumulation mode and generating coarse-mode particles. Concerning subsequent cloud formation at the second mountain, accounting for aerosol processing and regeneration increases

  6. CLOUD STORAGE SERVICES

    OpenAIRE

    Yan, Cheng

    2017-01-01

    Cloud computing is a hot topic in recent research and applications because it is widely used in various fields. To date, Google, Microsoft, IBM, Amazon, and other well-known companies have launched their own cloud computing applications and regard cloud computing as one of the most important strategies for the future. Cloud storage is the lower layer of a cloud computing system, supporting the services of the layers above it. At the same time, it is an effective way to store and manage heavy...

  7. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

    Full Text Available Cloud computing was, and will continue to be, a new way of providing Internet and computing services. This computing approach builds on many existing services, such as the Internet, grid computing, and Web services. As a system, cloud computing aims to provide on-demand services at a more acceptable price and infrastructure cost. It is precisely the transition from the computer as a product to a service delivered online to consumers. This paper describes the quality of cloud computing services, analyzing the advantages and characteristics it offers. It is a theoretical paper. Keywords: Cloud computing, QoS, quality of cloud computing

  8. Benchmarking Cloud Storage Systems

    OpenAIRE

    Wang, Xing

    2014-01-01

    With the rise of cloud computing, many cloud storage systems like Dropbox, Google Drive and Mega have been built to provide decentralized and reliable file storage. It is thus of prime importance to know their features, performance, and the best way to make use of them. In this context, we introduce BenchCloud, a tool designed as part of this thesis to conveniently and efficiently benchmark any cloud storage system. First, we provide a study of six commonly-used cloud storage systems to ident...

  9. Cloud Computing Bible

    CERN Document Server

    Sosinsky, Barrie

    2010-01-01

    The complete reference guide to the hot technology of cloud computing. Its potential for lowering IT costs makes cloud computing a major force for both IT vendors and users; it is expected to gain momentum rapidly with the launch of Office Web Apps later this year. Because cloud computing involves various technologies, protocols, platforms, and infrastructure elements, this comprehensive reference is just what you need if you'll be using or implementing cloud computing. Cloud computing offers significant cost savings by eliminating upfront expenses for hardware and software; its growing popularit

  10. An OpenFlow based network virtualization framework for the Cloud

    NARCIS (Netherlands)

    Matias, J.; Jacob, E.; Sanchez, D.; Demchenko, Y.

    2011-01-01

    The Cloud computing paradigm entails a challenging networking scenario. Due to the economy of scale, the Cloud is mainly supported by Data Center infrastructures. Therefore, virtualized environment manageability, seamless migration of virtual machines, inter-domain communication issues and

  11. A Diagnostic PDF Cloud Scheme to Improve Subtropical Low Clouds in NCAR Community Atmosphere Model (CAM5)

    Science.gov (United States)

    Qin, Yi; Lin, Yanluan; Xu, Shiming; Ma, Hsi-Yen; Xie, Shaocheng

    2018-02-01

    Low clouds strongly impact the radiation budget of the climate system, but their simulation in most GCMs has remained a challenge, especially over the subtropical stratocumulus region. Assuming a Gaussian distribution for the subgrid-scale total water and liquid water potential temperature, a new statistical cloud scheme is proposed and tested in NCAR Community Atmospheric Model version 5 (CAM5). The subgrid-scale variance is diagnosed from the turbulent and shallow convective processes in CAM5. The approach is able to maintain the consistency between cloud fraction and cloud condensate and thus alleviates the adjustment needed in the default relative humidity-based cloud fraction scheme. Short-term forecast simulations indicate that low cloud fraction and liquid water content, including their diurnal cycle, are improved due to a proper consideration of subgrid-scale variance over the southeastern Pacific Ocean region. Compared with the default cloud scheme, the new approach produced the mean climate reasonably well with improved shortwave cloud forcing (SWCF) due to more reasonable low cloud fraction and liquid water path over regions with predominant low clouds. Meanwhile, the SWCF bias over the tropical land regions is also alleviated. Furthermore, the simulated marine boundary layer clouds with the new approach extend further offshore and agree better with observations. The new approach is able to obtain the top of atmosphere (TOA) radiation balance with a slightly alleviated double ITCZ problem in preliminary coupled simulations. This study implies that a close coupling of cloud processes with other subgrid-scale physical processes is a promising approach to improve cloud simulations.
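
    Under the Gaussian assumption, cloud fraction and mean condensate follow in closed form from the same subgrid PDF, which is what keeps the two consistent. A minimal sketch of the standard Gaussian statistical cloud closure (illustrative only; not the authors' CAM5 implementation, which diagnoses the variance from the turbulence and shallow-convection schemes):

```python
import math

def gaussian_cloud(qt_mean, q_sat, sigma):
    """Diagnose cloud fraction and mean liquid water from a Gaussian
    subgrid PDF of total water (standard statistical-cloud closure).

    qt_mean : grid-mean total water mixing ratio [kg/kg]
    q_sat   : saturation mixing ratio at the grid-mean state [kg/kg]
    sigma   : subgrid standard deviation of total water [kg/kg]
    """
    Q1 = (qt_mean - q_sat) / sigma   # normalized saturation excess
    # Cloud fraction = probability that qt exceeds saturation:
    frac = 0.5 * (1.0 + math.erf(Q1 / math.sqrt(2.0)))
    # Mean condensate of the Gaussian truncated above saturation:
    ql = sigma * (Q1 * frac + math.exp(-0.5 * Q1 * Q1) / math.sqrt(2.0 * math.pi))
    return frac, ql

# Exactly saturated on average -> half the PDF is supersaturated:
frac, ql = gaussian_cloud(qt_mean=10e-3, q_sat=10e-3, sigma=0.5e-3)
# frac == 0.5; ql = sigma / sqrt(2*pi), about 2.0e-4 kg/kg
```

    Because both quantities derive from one PDF, no separate relative-humidity adjustment is needed to reconcile cloud fraction with condensate, which is the consistency point the abstract makes.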

  12. Diffusion and deposition of the Schooner clouds

    Energy Technology Data Exchange (ETDEWEB)

    Crawford, Todd V [Lawrence Radiation Laboratory, University of California, Livermore, CA (United States)

    1970-05-01

    Schooner was a 31-kt nuclear cratering experiment done as part of the U.S. Atomic Energy Commission's Plowshare Program. Detonation was at 0800 PST on December 8, 1968 at the Nevada Test Site. The resulting cloud had ceased its dynamic growth by about H+4 min. Two distinct parts, a base surge and a main cloud, were evident. Thereafter, further cloud growth was by diffusion and fallout as the cloud moved downwind. Aircraft sampling of the cloud at H+12.5 min revealed that the main cloud part contained about 10 times as much radioactivity as the base surge part. Later aircraft data, local fallout field measurements, and airborne particle size data indicate that the H+12.5-min cloud burdens, primarily the tungsten isotopes, were depleted by a factor of about 2, due to fallout, over the next few hours. The remaining airborne cloud burdens for each cloud were used as input to diffusion calculations. Calculated main cloud center concentrations using observed cloud sizes, cloud burdens, and meteorology agree with measurements to better than a factor of 2 over 1 1/2 days. These postshot calculations and data are about a factor of 3 higher than calculations done preshot. Base surge calculations are consistent with available data to within about a factor of 4, but the data needed to perform as complete an analysis as was done for the main cloud do not exist. Fallout, as distinguished from deposition of nonfalling debris, was important to a distance of about 500 km for the main cloud and to a distance of about 100 km for the base surge. At distances closer to ground zero, diffusion calculations under-predicted ground level concentration and deposition, but an isotopically scaled external gross gamma fallout calculation was within about a factor of 3 of the data. At larger distances downwind for the base surge, ground level exposure rate calculations and deposition for a variety of nuclides agree to within about a factor of 3 of measurements. (author)

  13. CLOUD COMPUTING SECURITY

    Directory of Open Access Journals (Sweden)

    Ştefan IOVAN

    2016-05-01

    Full Text Available Cloud computing represents software applications offered as a service online, as well as the software and hardware components in the data center. When services are offered broadly to any type of client, we are dealing with a public cloud. In the other case, in which a cloud is exclusively available to one organization and is not open to the public, it is considered a private cloud [1]. There is also a third, hybrid type, in which a user or an organization might use services available in both the public and private cloud. Among the main challenges of cloud computing are building trust and ensuring information privacy in every aspect of the services cloud computing offers. The variety of existing standards, like the lack of clarity in sustainability certification, is no real help in building trust. Question marks also arise regarding the efficiency of traditional security means when applied in the cloud domain. Besides the economic and technological advantages offered by the cloud, there are also advantages in the security area if information is migrated to the cloud. Shared resources available in the cloud include monitoring, use of best practices, and technology for an advanced security level, above the solutions affordable to the majority of small and medium businesses, big companies, and even some governmental organizations [2].

  14. Validation of Cloud Properties From Multiple Satellites Using CALIOP Data

    Science.gov (United States)

    Yost, Christopher R.; Minnis, Patrick; Bedka, Kristopher M.; Heck, Patrick W.; Palikonda, Rabindra; Sun-Mack, Sunny; Trepte, Qing

    2016-01-01

    The NASA Langley Satellite ClOud and Radiative Property retrieval System (SatCORPS) is routinely applied to multispectral imagery from several geostationary and polar-orbiting imagers to retrieve cloud properties for weather and climate applications. Validation of the retrievals with independent datasets is continuously ongoing in order to understand differences caused by calibration, spatial resolution, viewing geometry, and other factors. The CALIOP instrument provides a decade of detailed cloud observations which can be used to evaluate passive imager retrievals of cloud boundaries, thermodynamic phase, cloud optical depth, and water path on a global scale. This paper focuses on comparisons of CALIOP retrievals to retrievals from MODIS, VIIRS, AVHRR, GOES, SEVIRI, and MTSAT. CALIOP is particularly skilled at detecting weakly-scattering cirrus clouds with optical depths less than approx. 0.5. These clouds are often undetected by passive imagers and the effect this has on the property retrievals is discussed.

  15. Cloud chamber experiments on the origin of ice crystal complexity in cirrus clouds

    Directory of Open Access Journals (Sweden)

    M. Schnaiter

    2016-04-01

    Full Text Available This study reports on the origin of small-scale ice crystal complexity and its influence on the angular light scattering properties of cirrus clouds. Cloud simulation experiments were conducted at the AIDA (Aerosol Interactions and Dynamics in the Atmosphere cloud chamber of the Karlsruhe Institute of Technology (KIT. A new experimental procedure was applied to grow and sublimate ice particles at defined super- and subsaturated ice conditions and for temperatures in the −40 to −60 °C range. The experiments were performed for ice clouds generated via homogeneous and heterogeneous initial nucleation. Small-scale ice crystal complexity was deduced from measurements of spatially resolved single particle light scattering patterns by the latest version of the Small Ice Detector (SID-3. It was found that a high crystal complexity dominates the microphysics of the simulated clouds and the degree of this complexity is dependent on the available water vapor during the crystal growth. Indications were found that the small-scale crystal complexity is influenced by unfrozen H2SO4 / H2O residuals in the case of homogeneous initial ice nucleation. Angular light scattering functions of the simulated ice clouds were measured by the two currently available airborne polar nephelometers: the polar nephelometer (PN probe of the Laboratoire de Météorologie Physique (LaMP and the Particle Habit Imaging and Polar Scattering (PHIPS-HALO probe of KIT. The measured scattering functions are featureless and flat in the side and backward scattering directions. It was found that these functions have a rather low sensitivity to the small-scale crystal complexity for ice clouds that were grown under typical atmospheric conditions. These results have implications for the microphysical properties of cirrus clouds and for the radiative transfer through these clouds.

  16. Bipolar H II regions produced by cloud-cloud collisions

    Science.gov (United States)

    Whitworth, Anthony; Lomax, Oliver; Balfour, Scott; Mège, Pierre; Zavagno, Annie; Deharveng, Lise

    2018-05-01

    We suggest that bipolar H II regions may be the aftermath of collisions between clouds. Such a collision will produce a shock-compressed layer, and a star cluster can then condense out of the dense gas near the center of the layer. If the clouds are sufficiently massive, the star cluster is likely to contain at least one massive star, which emits ionizing radiation, and excites an H II region, which then expands, sweeping up the surrounding neutral gas. Once most of the matter in the clouds has accreted onto the layer, expansion of the H II region meets little resistance in directions perpendicular to the midplane of the layer, and so it expands rapidly to produce two lobes of ionized gas, one on each side of the layer. Conversely, in directions parallel to the midplane of the layer, expansion of the H II region stalls due to the ram pressure of the gas that continues to fall towards the star cluster from the outer parts of the layer; a ring of dense neutral gas builds up around the waist of the bipolar H II region, and may spawn a second generation of star formation. We present a dimensionless model for the flow of ionized gas in a bipolar H II region created according to the above scenario, and predict the characteristics of the resulting free-free continuum and recombination-line emission. This dimensionless model can be scaled to the physical parameters of any particular system. Our intention is that these predictions will be useful in testing the scenario outlined above, and thereby providing indirect support for the role of cloud-cloud collisions in triggering star formation.

  17. Rayleigh convective instability in the presence of phase transitions of water vapor. The formation of large-scale eddies and cloud structures

    International Nuclear Information System (INIS)

    Shmerlin, Boris Ya; Kalashnik, Maksim V

    2013-01-01

    Convective motions in moist saturated air are accompanied by the release of latent heat of condensation. Taking this effect into account, we consider the problem of convective instability of a moist saturated air layer, generalizing the formulation of the classical Rayleigh problem. An analytic solution demonstrating the fundamental difference between moist convection and Rayleigh convection is obtained. Upon losing stability in the two-dimensional case, localized convective rolls or spatially periodic chains of rollers with localized areas of upward motion evolve. In the case of axial symmetry, the growth of localized convective vortices with circulation characteristic of tropical cyclones (hurricanes) is possible at the early stages of development and on the scale of tornados to tropical cyclones. (methodological notes)

  18. Aerosol effects on cloud water amounts were successfully simulated by a global cloud-system resolving model.

    Science.gov (United States)

    Sato, Yousuke; Goto, Daisuke; Michibata, Takuro; Suzuki, Kentaroh; Takemura, Toshihiko; Tomita, Hirofumi; Nakajima, Teruyuki

    2018-03-07

    Aerosols affect climate by modifying cloud properties through their role as cloud condensation nuclei or ice nuclei, called aerosol-cloud interactions. In most global climate models (GCMs), the aerosol-cloud interactions are represented by empirical parameterisations, in which the mass of cloud liquid water (LWP) is assumed to increase monotonically with increasing aerosol loading. Recent satellite observations, however, have yielded contradictory results: LWP can decrease with increasing aerosol loading. This difference implies that GCMs overestimate the aerosol effect, but the reasons for the difference are not obvious. Here, we reproduce satellite-observed LWP responses using a global simulation with explicit representations of cloud microphysics, instead of the parameterisations. Our analyses reveal that the decrease in LWP originates from the response of evaporation and condensation processes to aerosol perturbations, which are not represented in GCMs. The explicit representation of cloud microphysics in global scale modelling reduces the uncertainty of climate prediction.

  19. Searchable Encryption in Cloud Storage

    OpenAIRE

    Ren-Junn Hwang; Chung-Chien Lu; Jain-Shing Wu

    2014-01-01

    Cloud outsourced storage is one of the important services in cloud computing. Cloud users upload data to cloud servers to reduce the cost of managing data and maintaining hardware and software. To ensure data confidentiality, users can encrypt their files before uploading them to a cloud system. However, retrieving the target file from among the encrypted files is difficult for the cloud server. This study proposes a protocol for performing multikeyword searches for encrypted cloud data by applying ...

  20. Enterprise Cloud Adoption - Cloud Maturity Assessment Model

    OpenAIRE

    Conway, Gerry; Doherty, Eileen; Carcary, Marian; Crowley, Catherine

    2017-01-01

    The introduction and use of cloud computing by an organization has the promise of significant benefits that include reduced costs, improved services, and a pay-per-use model. Organizations that successfully harness these benefits will potentially have a distinct competitive edge, due to their increased agility and flexibility to rapidly respond to an ever-changing and complex business environment. However, as cloud technology is a relatively new ph...

  1. STAR FORMATION IN TURBULENT MOLECULAR CLOUDS WITH COLLIDING FLOW

    International Nuclear Information System (INIS)

    Matsumoto, Tomoaki; Dobashi, Kazuhito; Shimoikura, Tomomi

    2015-01-01

    Using self-gravitational hydrodynamical numerical simulations, we investigated the evolution of high-density turbulent molecular clouds swept by a colliding flow. The interaction of shock waves due to turbulence produces networks of thin filamentary clouds with a sub-parsec width. The colliding flow accumulates the filamentary clouds into a sheet cloud and promotes active star formation for initially high-density clouds. Clouds with a colliding flow exhibit a finer filamentary network than clouds without a colliding flow. The probability distribution functions (PDFs) for the density and column density can be fitted by lognormal functions for clouds without colliding flow. When the initial turbulence is weak, the column density PDF has a power-law wing at high column densities. The colliding flow considerably deforms the PDF, such that the PDF exhibits a double peak. The stellar mass distributions reproduced here are consistent with the classical initial mass function with a power-law index of –1.35 when the initial clouds have a high density. The distribution of stellar velocities agrees with the gas velocity distribution, which can be fitted by Gaussian functions for clouds without colliding flow. For clouds with colliding flow, the velocity dispersion of gas tends to be larger than the stellar velocity dispersion. The signatures of colliding flows and turbulence appear in channel maps reconstructed from the simulation data. Clouds without colliding flow exhibit a cloud-scale velocity shear due to the turbulence. In contrast, clouds with colliding flow show a prominent anti-correlated distribution of thin filaments between the different velocity channels, suggesting collisions between the filamentary clouds
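
    The lognormal column-density PDFs described above can be illustrated with a small sketch: for a lognormal distribution, ln N is normally distributed, so fitting reduces to estimating the mean and standard deviation of the log-transformed values. The synthetic data and parameter values below are illustrative assumptions, not the paper's simulation output.

```python
import numpy as np

# Synthetic stand-in for a simulated column-density field (arbitrary units).
# mu_true and sigma_true are illustrative, not values from the paper.
rng = np.random.default_rng(0)
mu_true, sigma_true = 1.5, 0.6
column_density = rng.lognormal(mean=mu_true, sigma=sigma_true, size=100_000)

# Fitting a lognormal PDF reduces to Gaussian statistics in log space.
log_n = np.log(column_density)
mu_fit, sigma_fit = log_n.mean(), log_n.std()

print(f"mu = {mu_fit:.2f}, sigma = {sigma_fit:.2f}")
```

    A power-law wing at high column densities, as found for weak initial turbulence, would show up as an excess over this Gaussian in log space.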

  2. STAR FORMATION IN TURBULENT MOLECULAR CLOUDS WITH COLLIDING FLOW

    Energy Technology Data Exchange (ETDEWEB)

    Matsumoto, Tomoaki [Faculty of Humanity and Environment, Hosei University, Fujimi, Chiyoda-ku, Tokyo 102-8160 (Japan); Dobashi, Kazuhito; Shimoikura, Tomomi, E-mail: matsu@hosei.ac.jp [Department of Astronomy and Earth Sciences, Tokyo Gakugei University, Koganei, Tokyo 184-8501 (Japan)

    2015-03-10

    Using self-gravitational hydrodynamical numerical simulations, we investigated the evolution of high-density turbulent molecular clouds swept by a colliding flow. The interaction of shock waves due to turbulence produces networks of thin filamentary clouds with a sub-parsec width. The colliding flow accumulates the filamentary clouds into a sheet cloud and promotes active star formation for initially high-density clouds. Clouds with a colliding flow exhibit a finer filamentary network than clouds without a colliding flow. The probability distribution functions (PDFs) for the density and column density can be fitted by lognormal functions for clouds without colliding flow. When the initial turbulence is weak, the column density PDF has a power-law wing at high column densities. The colliding flow considerably deforms the PDF, such that the PDF exhibits a double peak. The stellar mass distributions reproduced here are consistent with the classical initial mass function with a power-law index of –1.35 when the initial clouds have a high density. The distribution of stellar velocities agrees with the gas velocity distribution, which can be fitted by Gaussian functions for clouds without colliding flow. For clouds with colliding flow, the velocity dispersion of gas tends to be larger than the stellar velocity dispersion. The signatures of colliding flows and turbulence appear in channel maps reconstructed from the simulation data. Clouds without colliding flow exhibit a cloud-scale velocity shear due to the turbulence. In contrast, clouds with colliding flow show a prominent anti-correlated distribution of thin filaments between the different velocity channels, suggesting collisions between the filamentary clouds.

  3. Microphysical processing of aerosol particles in orographic clouds

    Directory of Open Access Journals (Sweden)

    S. Pousse-Nottelmann

    2015-08-01

    Aerosol cycling in clouds has been implemented into the COSMO-Model, the regional weather forecast and climate model of the Consortium for Small-scale Modeling (COSMO). The effects of aerosol scavenging, cloud microphysical processing and regeneration upon cloud evaporation on the aerosol population and on subsequent cloud formation are investigated. For this, two-dimensional idealized simulations of moist flow over two bell-shaped mountains were carried out, varying the treatment of aerosol scavenging and regeneration processes for a warm-phase and a mixed-phase orographic cloud. The results allowed us to identify different aerosol cycling mechanisms. In the simulated non-precipitating warm-phase cloud, aerosol mass is incorporated into cloud droplets by activation scavenging and released back to the atmosphere upon cloud droplet evaporation. In the mixed-phase cloud, a first cycle comprises cloud droplet activation and evaporation via the Wegener–Bergeron–Findeisen (WBF) process. A second cycle includes below-cloud scavenging by precipitating snow particles and snow sublimation and is connected to the first cycle via the riming process, which transfers aerosol mass from cloud droplets to snowflakes. In the simulated mixed-phase cloud, only a negligible part of the total aerosol mass is incorporated into ice crystals. Sedimenting snowflakes reaching the surface remove aerosol mass from the atmosphere. The results show that aerosol processing and regeneration lead to a vertical redistribution of aerosol mass and number. Thereby, the processes impact the total aerosol number and mass and additionally alter the shape of the aerosol size distributions by enhancing the internally mixed/soluble Aitken and accumulation modes and generating coarse-mode particles. Concerning subsequent cloud formation at the second mountain, accounting for aerosol processing and regeneration increases the cloud droplet number concentration with possible implications for the ice crystal number

  4. A cloud/particle model of the interstellar medium - Galactic spiral structure

    Science.gov (United States)

    Levinson, F. H.; Roberts, W. W., Jr.

    1981-01-01

    A cloud/particle model for gas flow in galaxies is developed that incorporates cloud-cloud collisions and supernovae as dominant local processes. Cloud-cloud collisions are the main means of dissipation. To counter this dissipation and maintain local dispersion, supernova explosions in the medium administer radial snowplow pushes to all nearby clouds. The causal link between these processes is that cloud-cloud collisions will form stars and that these stars will rapidly become supernovae. The cloud/particle model is tested and used to investigate the gas dynamics and spiral structures in galaxies where these assumptions may be reasonable. Particular attention is given to whether large-scale galactic shock waves, which are thought to underlie the regular well-delineated spiral structure in some galaxies, form and persist in a cloud-supernova dominated interstellar medium; this question is answered in the affirmative.

  5. Star clouds of Magellan

    International Nuclear Information System (INIS)

    Tucker, W.

    1981-01-01

    The Magellanic Clouds are two irregular galaxies belonging to the Local Group, to which the Milky Way also belongs. By studying the Clouds, astronomers hope to gain insight into the origin and composition of the Milky Way. The overall structure and dynamics of the Clouds are clearest when studied in the radio region of the spectrum. One benefit of directly observing stellar luminosities in the Clouds has been the discovery of the period-luminosity relation. The Clouds are also a splendid laboratory for studying stellar evolution. It is believed that both Clouds may be at a very early stage in the development of a regular, symmetric galaxy. This raises a paradox, because some of the stars in the star clusters of the Clouds are as old as the oldest stars in our Galaxy; an explanation for this is given. The low velocity of the Clouds with respect to the center of the Milky Way shows they must be bound to it by gravity. Theories are given on how the Magellanic Clouds became associated with the Galaxy. According to current ideas, the Clouds' orbits will decay and they will spiral into the Galaxy

  6. Cloud Computing Governance Lifecycle

    Directory of Open Access Journals (Sweden)

    Soňa Karkošková

    2016-06-01

    Full Text Available Externally provisioned cloud services enable flexible and on-demand sourcing of IT resources. Cloud computing introduces new challenges, such as the need for business process redefinition, the establishment of specialized governance and management, new organizational structures and relationships with external providers, and the management of new types of risk arising from dependency on external providers. There is a general consensus that cloud computing, in addition to challenges, brings many benefits, but it is unclear how to achieve them. Cloud computing governance helps to create business value by obtaining the benefits of cloud computing services while optimizing investment and risk. The challenge organizations face in governing cloud services is how to design and implement cloud computing governance so as to gain the expected benefits. This paper aims to provide guidance on the implementation activities of the proposed Cloud computing governance lifecycle from the cloud consumer perspective. The proposed model is based on the SOA Governance Framework and consists of a lifecycle for the implementation and continuous improvement of a cloud computing governance model.

  7. THE CALIFORNIA MOLECULAR CLOUD

    International Nuclear Information System (INIS)

    Lada, Charles J.; Lombardi, Marco; Alves, Joao F.

    2009-01-01

    We present an analysis of wide-field infrared extinction maps of a region in Perseus just north of the Taurus-Auriga dark cloud complex. From this analysis we have identified a massive, nearby, but previously unrecognized, giant molecular cloud (GMC). Both a uniform foreground star density and measurements of the cloud's velocity field from CO observations indicate that this cloud is likely a coherent structure at a single distance. From comparison of foreground star counts with Galactic models, we derive a distance of 450 ± 23 pc to the cloud. At this distance the cloud extends over roughly 80 pc and has a mass of ~10^5 M_sun, rivaling the Orion (A) molecular cloud as the largest and most massive GMC in the solar neighborhood. Although surprisingly similar in mass and size to the more famous Orion molecular cloud (OMC), the newly recognized cloud displays significantly less star formation activity, with more than an order of magnitude fewer young stellar objects than found in the OMC, suggesting that both the level of star formation and perhaps the star formation rate in this cloud are an order of magnitude or more lower than in the OMC. Analysis of extinction maps of both clouds shows that the new cloud contains only 10% the amount of high-extinction (A_K > 1.0 mag) material found in the OMC. This, in turn, suggests that the level of star formation activity and perhaps the star formation rate in these two clouds may be directly proportional to the total amount of high-extinction material and presumably high-density gas within them, and that there might be a density threshold for star formation on the order of n(H2) ~ a few × 10^4 cm^-3.
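
    The 10% comparison of high-extinction material quoted above amounts to integrating an extinction map above a threshold. A minimal sketch, assuming a synthetic map (the A_K > 1.0 mag threshold is from the abstract; the map values are invented for illustration):

```python
import numpy as np

# Synthetic K-band extinction map in magnitudes (illustrative values only).
rng = np.random.default_rng(1)
a_k = rng.lognormal(mean=-1.5, sigma=0.8, size=(200, 200))

# Since A_K is proportional to column density, summing map values above
# the threshold gives the fraction of the cloud's mass at high extinction.
threshold = 1.0  # mag
high_ext_fraction = a_k[a_k > threshold].sum() / a_k.sum()

print(f"high-extinction mass fraction: {high_ext_fraction:.3f}")
```

    Comparing this fraction between two clouds is what yields the factor-of-ten difference reported above.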

  8. Coupled fvGCM-GCE Modeling System, 3D Cloud-Resolving Model and Cloud Library

    Science.gov (United States)

    Tao, Wei-Kuo

    2005-01-01

    Recent GEWEX Cloud System Study (GCSS) model comparison projects have indicated that cloud-resolving models (CRMs) agree with observations better than traditional single-column models in simulating various types of clouds and cloud systems from different geographic locations. Current and future NASA satellite programs can provide cloud, precipitation, aerosol and other data at very fine spatial and temporal scales. It requires a coupled global circulation model (GCM) and cloud-scale model (termed a super-parameterization or multi-scale modeling framework, MMF) to use these satellite data to improve the understanding of the physical processes that are responsible for the variation in global and regional climate and hydrological systems. The use of a GCM will enable global coverage, and the use of a CRM will allow for better and more sophisticated physical parameterization. NASA satellite and field campaign cloud-related datasets can provide initial conditions as well as validation for both the MMF and CRMs. A seed fund is available at NASA Goddard to build an MMF based on the 2D Goddard Cumulus Ensemble (GCE) model and the Goddard finite volume general circulation model (fvGCM). A prototype MMF is being developed and production runs will be conducted at the beginning of 2005. In this talk, I will present: (1) a brief review of the GCE model and its applications to precipitation processes, (2) the Goddard MMF and the major differences between the two existing MMFs (CSU MMF and Goddard MMF), (3) a cloud library generated by the Goddard MMF and the 3D GCE model, and (4) a brief discussion of developing a global cloud simulator with the GCE model.

  9. Expansion of magnetic clouds

    International Nuclear Information System (INIS)

    Suess, S.T.

    1987-01-01

    Magnetic clouds are a carefully defined subclass of all interplanetary signatures of coronal mass ejections whose geometry is thought to be that of a cylinder embedded in a plane. It has been found that the total magnetic pressure inside the clouds is higher than the ion pressure outside, and that the clouds are expanding at 1 AU at about half the local Alfven speed. The geometry of the clouds is such that even though the magnetic pressure inside is larger than the total pressure outside, expansion will not occur because the pressure is balanced by magnetic tension - the pinch effect. The evidence for expansion of clouds at 1 AU is nevertheless quite strong so another reason for its existence must be found. It is demonstrated that the observations can be reproduced by taking into account the effects of geometrical distortion of the low plasma beta clouds as they move away from the Sun

  10. Encyclopedia of cloud computing

    CERN Document Server

    Bojanova, Irena

    2016-01-01

    The Encyclopedia of Cloud Computing provides IT professionals, educators, researchers and students with a compendium of cloud computing knowledge. Authored by a spectrum of subject matter experts in industry and academia, this unique publication, in a single volume, covers a wide range of cloud computing topics, including technological trends and developments, research opportunities, best practices, standards, and cloud adoption. Providing multiple perspectives, it also addresses questions that stakeholders might have in the context of development, operation, management, and use of clouds. Furthermore, it examines cloud computing's impact now and in the future. The encyclopedia presents 56 chapters logically organized into 10 sections. Each chapter covers a major topic/area with cross-references to other chapters and contains tables, illustrations, side-bars as appropriate. Furthermore, each chapter presents its summary at the beginning and backend material, references and additional resources for further i...

  11. Considerations for Cloud Security Operations

    OpenAIRE

    Cusick, James

    2016-01-01

    Information Security in Cloud Computing environments is explored. Cloud Computing is presented, security needs are discussed, and mitigation approaches are listed. Topics covered include Information Security, Cloud Computing, Private Cloud, Public Cloud, SaaS, PaaS, IaaS, ISO 27001, OWASP, Secure SDLC.

  12. Cloud Computing Governance Lifecycle

    OpenAIRE

    Soňa Karkošková; George Feuerlicht

    2016-01-01

    Externally provisioned cloud services enable flexible and on-demand sourcing of IT resources. Cloud computing introduces new challenges such as need of business process redefinition, establishment of specialized governance and management, organizational structures and relationships with external providers and managing new types of risk arising from dependency on external providers. There is a general consensus that cloud computing in addition to challenges brings many benefits but it is uncle...

  13. Security in cloud computing

    OpenAIRE

    Moreno Martín, Oriol

    2016-01-01

    Security in Cloud Computing is becoming a challenge for next-generation data centers. This project will focus on investigating new security strategies for Cloud Computing systems. Cloud Computing is a recent paradigm to deliver services over the Internet. Businesses grow drastically because of it. Researchers focus their work on it. The rapid access to flexible and low-cost IT resources in an on-demand fashion allows users to avoid planning ahead for provisioning, and enterprises to save money ...

  14. Security and Privacy Implications of Cloud Computing – Lost in the Cloud

    OpenAIRE

    Tchifilionova , Vassilka

    2010-01-01

    Part 4: Security for Clouds; International audience; Cloud computing - the new paradigm, the future IT consumer utility, the economy-of-scale approach, the illusion of infinite resource availability - yet the debate over security and privacy issues is still ongoing and a common policy framework is missing. Research confirms that users are concerned when presented with scenarios in which companies may put their data to uses of which they may not be aware. Therefore, privacy and securit...

  15. CLOUD TECHNOLOGY IN EDUCATION

    Directory of Open Access Journals (Sweden)

    Alexander N. Dukkardt

    2014-01-01

    Full Text Available This article is devoted to a review of the main features of cloud computing that can be used in education. Particular attention is paid to those learning and support tasks that can be greatly improved by the use of cloud services. Several ways to implement this approach are proposed, based on widely accepted models of providing cloud services. Nevertheless, the authors have not ignored the currently existing problems of cloud technologies, identifying the most dangerous risks and their impact on the core business processes of the university.

  16. Cloud Computing: An Overview

    Science.gov (United States)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic services with minimum resources, Internet service providers invented cloud computing. Within a few years, emerging cloud computing has become one of the hottest technologies. From the publication of core papers by Google since 2003, to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to a public service, from a cost-saving tool to a revenue generator, and from ISPs to telecom operators. This paper introduces the concept, history, and pros and cons of cloud computing, as well as its value chain and standardization efforts.

  17. Genomics With Cloud Computing

    Directory of Open Access Journals (Sweden)

    Sukhamrit Kaur

    2015-04-01

    Full Text Available Abstract Genomics is the study of the genome, which produces large amounts of data requiring substantial storage and computational power. These needs are addressed by cloud computing, which provides various cloud platforms for genomics. These platforms offer many services to users, such as easy access to data, easy sharing and transfer, storage of hundreds of terabytes, and greater computational power. Some cloud platforms are Google Genomics, DNAnexus and Globus Genomics. Features that cloud computing brings to genomics include easy access and sharing of data, data security, and lower cost to pay for resources, but there are still some demerits, such as the long time needed to transfer data and limited network bandwidth.

  18. CLAAS: the CM SAF cloud property data set using SEVIRI

    Science.gov (United States)

    Stengel, M. S.; Kniffka, A. K.; Meirink, J. F. M.; Lockhoff, M. L.; Tan, J. T.; Hollmann, R. H.

    2014-04-01

    An 8-year record of satellite-based cloud properties named CLAAS (CLoud property dAtAset using SEVIRI) is presented, which was derived within the EUMETSAT Satellite Application Facility on Climate Monitoring. The data set is based on SEVIRI measurements of the Meteosat Second Generation satellites, of which the visible and near-infrared channels were intercalibrated with MODIS. Applying two state-of-the-art retrieval schemes ensures high accuracy in cloud detection, cloud vertical placement and microphysical cloud properties. These properties were further processed to provide daily to monthly averaged quantities, mean diurnal cycles and monthly histograms. In particular, the per-month histogram information enhances the insight into the spatio-temporal variability of clouds and their properties. Due to the underlying intercalibrated measurement record, the stability of the derived cloud properties is ensured, which is exemplarily demonstrated for three selected cloud variables for the entire SEVIRI disc and a European subregion. All data products and processing levels are introduced and validation results indicated. The sampling uncertainty of the averaged products in CLAAS is minimized due to the high temporal resolution of SEVIRI. This is emphasized by studying the impact of reduced temporal sampling rates taken at typical overpass times of polar-orbiting instruments. In particular, cloud optical thickness and cloud water path are very sensitive to the sampling rate, which in our study amounted to systematic deviations of over 10% if sampled only once a day. The CLAAS data set facilitates many cloud-related applications at small spatial scales of a few kilometres and short temporal scales of a few hours. Beyond this, the spatio-temporal characteristics of clouds on diurnal to seasonal, but also on multi-annual, scales can be studied.
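
    The sampling-rate sensitivity described above can be mimicked with a toy diurnal cycle: averaging all slots of a geostationary-like record versus taking a single polar-orbiter-style overpass. The sinusoidal cycle and the 13:30 overpass time are illustrative assumptions, not the CLAAS analysis itself.

```python
import numpy as np

# Toy diurnal cycle of cloud optical thickness: 96 slots of 15 min
# (SEVIRI-like), peaking at 06:00 local time (an invented shape).
hours = np.linspace(0.0, 24.0, 96, endpoint=False)
tau = 10.0 + 4.0 * np.cos(2.0 * np.pi * (hours - 6.0) / 24.0)

full_mean = tau.mean()                            # full-resolution daily mean
overpass = tau[np.argmin(np.abs(hours - 13.5))]   # single 13:30 sample

bias = (overpass - full_mean) / full_mean
print(f"full mean {full_mean:.2f}, overpass {overpass:.2f}, bias {bias:+.1%}")
```

    With this particular toy cycle the single afternoon sample misses the morning peak and biases the daily mean low by roughly 15%, illustrating how once-a-day sampling can produce systematic deviations of the size reported above.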

  19. The temperature of large dust grains in molecular clouds

    Science.gov (United States)

    Clark, F. O.; Laureijs, R. J.; Prusti, T.

    1991-01-01

    The temperature of large dust grains is calculated for three molecular clouds ranging in visual extinction from 2.5 to 8 mag, by comparing maps of either extinction derived from star counts or gas column density derived from molecular observations to the 100 μm intensity I(100). Both techniques show the dust temperature declining into the clouds. The two techniques do not agree in absolute scale.

  20. Visualization and labeling of point clouds in virtual reality

    DEFF Research Database (Denmark)

    Stets, Jonathan Dyssel; Sun, Yongbin; Greenwald, Scott W.

    2017-01-01

    We present a Virtual Reality (VR) application for labeling and handling point cloud data sets. A series of room-scale point clouds are recorded as a video sequence using a Microsoft Kinect. The data can be played and paused, and frames can be skipped just like in a video player. The user can walk...

  1. Cloud-particle galactic gas dynamics and star formation

    International Nuclear Information System (INIS)

    Roberts, W.W. Jr.

    1983-01-01

    Galactic gas dynamics, spiral structure, and star formation are discussed in the context of N-body computational studies based on a cloud-particle model of the interstellar medium. On the small scale, the interstellar medium appears to be cloud-dominated and supernova-perturbed. The cloud-particle model simulates cloud-cloud collisions, the formation of stellar associations, and supernova explosions as dominant local processes. On the large scale, in response to a spiral galactic gravitational field, global density waves and galactic shocks develop with large-scale characteristics similar to those found in continuum gas dynamical studies. Both the system of gas clouds and the system of young stellar associations forming from the clouds share in the global spiral structure. However, because it neither assumes a continuum of gas (as in continuum gas dynamical studies) nor requires a prescribed equation of state, such as the isothermal condition so often employed, the cloud-particle picture retains much of the detail lost in earlier work: namely, the small-scale features and structures so important in understanding the local, turbulent state of the interstellar medium, as well as the degree of raggedness often observed superposed on global spiral structure. (Auth.)

  2. Review of Cloud Computing and existing Frameworks for Cloud adoption

    OpenAIRE

    Chang, Victor; Walters, Robert John; Wills, Gary

    2014-01-01

    This paper presents a selected review for Cloud Computing and explains the benefits and risks of adopting Cloud Computing in a business environment. Although all the risks identified may be associated with two major Cloud adoption challenges, a framework is required to support organisations as they begin to use Cloud and minimise risks of Cloud adoption. Eleven Cloud Computing frameworks are investigated and a comparison of their strengths and limitations is made; the result of the comparison...

  3. +Cloud: An Agent-Based Cloud Computing Platform

    OpenAIRE

    González, Roberto; Hernández de la Iglesia, Daniel; de la Prieta Pintado, Fernando; Gil González, Ana Belén

    2017-01-01

    Cloud computing is revolutionizing the services provided through the Internet, and is continually adapting itself in order to maintain the quality of its services. This study presents the platform +Cloud, which proposes a cloud environment for storing information and files by following the cloud paradigm. This study also presents Warehouse 3.0, a cloud-based application that has been developed to validate the services provided by +Cloud.

  4. Lost in Cloud

    Science.gov (United States)

    Maluf, David A.; Shetye, Sandeep D.; Chilukuri, Sri; Sturken, Ian

    2012-01-01

    Cloud computing can reduce cost significantly because businesses can share computing resources. In recent years, Small and Medium Businesses (SMBs) have used the Cloud effectively for cost saving and for sharing IT expenses. With the success of SMBs, many perceive that larger enterprises ought to move into the Cloud environment as well. Government agencies' stove-piped environments are being considered as candidates for potential use of the Cloud, either as an enterprise entity or as pockets of small communities. Cloud Computing is the delivery of computing as a service rather than as a product, whereby shared resources, software, and information are provided to computers and other devices as a utility over a network. Underneath the offered services there exists a modern infrastructure, the cost of which is often spread across its services or its investors. As NASA is considered an Enterprise-class organization, like other enterprises, a shift has been occurring in perceiving its IT services as candidates for Cloud services. This paper discusses market trends in cloud computing from an enterprise angle and then addresses the topic of Cloud Computing for NASA in two possible forms. First, in the form of a public Cloud to support it as an enterprise, as well as to share it with the commercial and public sectors at large. Second, as a private Cloud wherein the infrastructure is operated solely for NASA, whether managed internally or by a third party, and hosted internally or externally. The paper addresses the strengths and weaknesses of both paradigms of public and private Clouds, in both internally and externally operated settings. The content of the paper is from a NASA perspective but is applicable to any large enterprise with thousands of employees and contractors.

  5. Simulation models for the evolution of cloud systems. I. Introduction and preliminary simulations

    International Nuclear Information System (INIS)

    Pumphrey, W.A.; Scalo, J.M.

    1983-01-01

    The evolution of systems of interacting gas clouds is investigated, with application to protogalaxies in galaxy clusters, proto-globular clusters in galaxies, and protostellar fragments in interstellar clouds. The evolution of these systems can be parameterized in terms of three dimensionless quantities: the number of clouds, the volume filling factor of clouds, and the fraction of the mass of the system in clouds. We discuss the range of parameter space in which direct cloud collisions, tidal encounters, interactions of clouds with ambient gas, cloud collapse, cloud orbital motion due to the gravitational acceleration of the rest of the system, and cumulative long-range gravitational scatterings are important. All of these processes except for long-range gravitational scattering and probably tidal encounters are competitive for the systems of interest. The evolution of the mass spectrum and velocity distribution of clouds in self-gravitating clouds should be dominated by direct collisions for high-mass clouds and by drag, accretion, or ablation for low-mass clouds. We tentatively identify the critical mass at which the collision time scale equals the collapse time scale with the low-mass turnovers observed in the mass spectrum of stars in open clusters, and predict that rich galaxy clusters should exhibit variations in the faint end of the luminosity function if these clusters form by fragmentation. If collisions perturb the attempted collapse of clouds, the low-mass "stars" should form before high-mass stars
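
    The critical mass argument above rests on equating a cloud-cloud collision timescale with the gravitational collapse (free-fall) timescale. A sketch of the two timescales, using standard formulas with illustrative numbers (the densities, cross-section, and velocity dispersion below are assumptions for the example, not values from the paper):

```python
import math

G = 6.674e-8    # gravitational constant in cgs units, cm^3 g^-1 s^-2
MYR = 3.156e13  # seconds per megayear

def free_fall_time(rho):
    """Collapse timescale t_ff = sqrt(3*pi / (32*G*rho)), rho in g/cm^3."""
    return math.sqrt(3.0 * math.pi / (32.0 * G * rho))

def collision_time(n_cloud, cross_section, v_disp):
    """Mean time between cloud-cloud collisions, t_coll = 1/(n*sigma*v)."""
    return 1.0 / (n_cloud * cross_section * v_disp)

# Illustrative values: internal clump density, clump number density,
# geometric cross-section for a 0.1 pc radius, 1 km/s velocity dispersion.
rho = 2.0e-21                                   # g/cm^3
n_cloud = 3.0e-56                               # clumps per cm^3
cross_section = math.pi * (0.1 * 3.086e18)**2   # cm^2
v_disp = 1.0e5                                  # cm/s

t_ff = free_fall_time(rho)
t_coll = collision_time(n_cloud, cross_section, v_disp)
print(f"t_ff   = {t_ff / MYR:.2f} Myr")
print(f"t_coll = {t_coll / MYR:.2f} Myr")
```

    Smaller clouds present smaller cross-sections and hence longer collision times; the critical mass sits where the two timescales cross, which the paper tentatively identifies with the low-mass turnover of the stellar mass spectrum.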

  6. THE MAGELLANIC MOPRA ASSESSMENT (MAGMA). I. THE MOLECULAR CLOUD POPULATION OF THE LARGE MAGELLANIC CLOUD

    International Nuclear Information System (INIS)

    Wong, Tony; Chu, You-Hua; Gruendl, Robert A.; Looney, Leslie W.; Seale, Jonathan; Welty, Daniel E.; Hughes, Annie; Maddison, Sarah; Ott, Jürgen; Muller, Erik; Fukui, Yasuo; Kawamura, Akiko; Mizuno, Yoji; Pineda, Jorge L.; Bernard, Jean-Philippe; Paradis, Deborah; Henkel, Christian; Klein, Ulrich

    2011-01-01

    We present the properties of an extensive sample of molecular clouds in the Large Magellanic Cloud (LMC) mapped at 11 pc resolution in the CO(1-0) line. Targets were chosen based on a limiting CO flux and peak brightness as measured by the NANTEN survey. The observations were conducted with the ATNF Mopra Telescope as part of the Magellanic Mopra Assessment. We identify clouds as regions of connected CO emission and find that the distributions of cloud sizes, fluxes, and masses are sensitive to the choice of decomposition parameters. In all cases, however, the luminosity function of CO clouds is steeper than dN/dL ∝ L^-2, suggesting that a substantial fraction of mass is in low-mass clouds. A correlation between size and linewidth, while apparent for the largest emission structures, breaks down when those structures are decomposed into smaller structures. We argue that the correlation between virial mass and CO luminosity is the result of comparing two covariant quantities, with the correlation appearing tighter on larger scales where a size-linewidth relation holds. The virial parameter (the ratio of a cloud's kinetic to self-gravitational energy) shows a wide range of values and exhibits no clear trends with the CO luminosity or the likelihood of hosting young stellar object (YSO) candidates, casting further doubt on the assumption of virialization for molecular clouds in the LMC. Higher CO luminosity increases the likelihood of a cloud harboring a YSO candidate, and more luminous YSOs are more likely to be coincident with detectable CO emission, confirming the close link between giant molecular clouds and massive star formation.
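
    The virial parameter referenced above is commonly defined as alpha_vir = 5 * sigma_v^2 * R / (G * M), roughly the ratio of twice the kinetic energy to the gravitational energy of a uniform sphere. A minimal sketch with illustrative cloud values (invented numbers, not MAGMA measurements):

```python
G = 4.301e-3  # gravitational constant in pc (km/s)^2 / M_sun

def virial_parameter(sigma_v, radius, mass):
    """alpha_vir = 5 * sigma_v^2 * R / (G * M).
    alpha_vir ~ 1 suggests rough virial balance; >> 1 suggests an
    unbound cloud (absent external pressure or magnetic support)."""
    return 5.0 * sigma_v**2 * radius / (G * mass)

# Illustrative LMC-like cloud: 2 km/s velocity dispersion, 11 pc radius,
# 1e5 solar masses.
alpha = virial_parameter(sigma_v=2.0, radius=11.0, mass=1.0e5)
print(f"alpha_vir = {alpha:.2f}")
```

    The covariance issue noted above arises because both the virial mass (proportional to sigma_v^2 * R) and the CO luminosity scale with cloud size, so the two correlate even without physical virialization.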

  7. Research on cloud computing solutions

    OpenAIRE

    Liudvikas Kaklauskas; Vaida Zdanytė

    2015-01-01

    Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang described six phases of computing paradigms, from dummy terminals/mainframes, to PCs, to networking computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, ...

  8. VMware vCloud security

    CERN Document Server

    Sarkar, Prasenjit

    2013-01-01

    VMware vCloud Security provides the reader with in-depth knowledge and practical exercises sufficient to implement a secured private cloud using VMware vCloud Director and vCloud Networking and Security. This book is primarily for technical professionals with system administration and security administration skills and significant VMware vCloud experience who want to learn about advanced concepts of vCloud security and compliance.

  9. Security in hybrid cloud computing

    OpenAIRE

    Koudelka, Ondřej

    2016-01-01

    This bachelor thesis deals with the area of hybrid cloud computing, specifically with its security. The major aim of the thesis is to analyze and compare the chosen hybrid cloud providers. As a minor aim, the thesis compares the security challenges of hybrid cloud as opposed to other deployment models. In order to accomplish said aims, this thesis defines the terms cloud computing and hybrid cloud computing in its theoretical part. Furthermore the security challenges for cloud computing a...

  10. Explicit prediction of ice clouds in general circulation models

    Science.gov (United States)

    Kohler, Martin

    1999-11-01

    ) and falling snow (diagnosed) components. An empirical parameterization of the effect of upward turbulent water fluxes in cloud layers is obtained from the CRM simulations by (1) identifying the time-scale of conversion of cloud ice to snow as the key parameter, and (2) regressing it onto cloud differential IR heating and environmental static stability. The updated UCLA-GCM achieves close agreement with observations in global mean top-of-atmosphere fluxes (within 1-4 W/m2). Artificially suppressing the impact of cloud turbulent fluxes reduces the global mean ice water path by a factor of 3 and produces errors in each of the solar and IR fluxes at the top of atmosphere of about 5-6 W/m2.

  11. Cloud security in vogelvlucht

    NARCIS (Netherlands)

    Pieters, Wolter

    2011-01-01

    Cloud computing is currently THE hype in IT, and although many of its aspects are not new, the concept does create the need for new forms of security. The idea of cloud computing, however, also offers an opportunity to reflect on exactly this: what is the role of information security in a

  12. CLOUD SERVICES IN EDUCATION

    Directory of Open Access Journals (Sweden)

    Z.S. Seydametova

    2011-05-01

    Full Text Available We present the online services based on cloud computing that Google provides to educational institutions. We describe our own experience of implementing Google Apps Education Edition in the educational process. We also analyze and compare other universities' experience of using cloud technologies.

  13. Cloud MicroAtlas

    Indian Academy of Sciences (India)

    We begin by outlining the life cycle of a tall cloud, and then briefly discuss cloud systems. We choose one aspect of this life cycle, namely, the rapid growth of water droplets in ice-free clouds, to then discuss in greater detail. Taking a single vortex to be a building block of turbulence, we demonstrate one mechanism by which ...

  14. Greening the cloud

    NARCIS (Netherlands)

    van den Hoed, Robert; Hoekstra, Eric; Procaccianti, Giuseppe; Lago, Patricia; Grosso, Paolo; Taal, Arie; Grosskop, Kay; van Bergen, Esther

    The cloud has become an essential part of our daily lives. We use it to store our documents (Dropbox), to stream our music and films (Spotify and Netflix) and without giving it any thought, we use it to work on documents in the cloud (Google Docs).

  15. Learning in the Clouds?

    Science.gov (United States)

    Butin, Dan W.

    2013-01-01

    Engaged learning--the type that happens outside textbooks and beyond the four walls of the classroom--moves beyond right and wrong answers to grappling with the uncertainties and contradictions of a complex world. iPhones back up to the "cloud." GoogleDocs is all about "cloud computing." Facebook is as ubiquitous as the sky.…

  16. Kernel structures for Clouds

    Science.gov (United States)

    Spafford, Eugene H.; Mckendry, Martin S.

    1986-01-01

    An overview of the internal structure of the Clouds kernel was presented. An indication of how these structures will interact in the prototype Clouds implementation is given. Many specific details have yet to be determined and await experimentation with an actual working system.

  17. GPU-accelerated micromagnetic simulations using cloud computing

    International Nuclear Information System (INIS)

    Jermain, C.L.; Rowlands, G.E.; Buhrman, R.A.; Ralph, D.C.

    2016-01-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics. - Highlights: • The benefits of cloud computing for GPU-accelerated micromagnetics are examined. • We present the MuCloud software for running simulations on cloud computing. • Simulation run times are measured to benchmark cloud computing performance. • Comparison benchmarks are analyzed between CPU and GPU based solvers.
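
    A rough way to frame the cost analysis described above is as a break-even between local CPU time and rented cloud GPU time. The sketch below is a generic illustration of that trade-off; the speedup and hourly prices in the example call are made-up placeholders, not figures from the paper.

```python
# Hedged sketch: break-even test for renting a cloud GPU versus running
# the same micromagnetic simulation on a local CPU. All numbers in the
# example call are illustrative placeholders, not benchmark results.

def cloud_gpu_worthwhile(cpu_hours, gpu_speedup, gpu_price_per_hour,
                         cpu_price_per_hour):
    """True when the rented GPU finishes the job for less money."""
    gpu_hours = cpu_hours / gpu_speedup
    return gpu_hours * gpu_price_per_hour < cpu_hours * cpu_price_per_hour

# A 100 CPU-hour run, 50x GPU speedup, $0.90/h GPU vs $0.05/h CPU:
print(cloud_gpu_worthwhile(100, 50, 0.90, 0.05))  # True ($1.80 vs $5.00)
```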

  18. GPU-accelerated micromagnetic simulations using cloud computing

    Energy Technology Data Exchange (ETDEWEB)

    Jermain, C.L., E-mail: clj72@cornell.edu [Cornell University, Ithaca, NY 14853 (United States); Rowlands, G.E.; Buhrman, R.A. [Cornell University, Ithaca, NY 14853 (United States); Ralph, D.C. [Cornell University, Ithaca, NY 14853 (United States); Kavli Institute at Cornell, Ithaca, NY 14853 (United States)

    2016-03-01

    Highly parallel graphics processing units (GPUs) can improve the speed of micromagnetic simulations significantly as compared to conventional computing using central processing units (CPUs). We present a strategy for performing GPU-accelerated micromagnetic simulations by utilizing cost-effective GPU access offered by cloud computing services with an open-source Python-based program for running the MuMax3 micromagnetics code remotely. We analyze the scaling and cost benefits of using cloud computing for micromagnetics. - Highlights: • The benefits of cloud computing for GPU-accelerated micromagnetics are examined. • We present the MuCloud software for running simulations on cloud computing. • Simulation run times are measured to benchmark cloud computing performance. • Comparison benchmarks are analyzed between CPU and GPU based solvers.

  19. Opportunity and Challenges for Migrating Big Data Analytics in Cloud

    Science.gov (United States)

    Amitkumar Manekar, S.; Pradeepini, G., Dr.

    2017-08-01

    Big Data analytics is a major topic nowadays. As data-generation capabilities grow ever more demanding and scalable, data acquisition and storage become crucial issues. Cloud storage is a widely used platform, and the technology will become crucial to executives handling data powered by analytics. The trend toward "big data as a service" is now discussed everywhere. On one hand, cloud-based big data analytics directly tackles ongoing issues of scale, speed, and cost; on the other hand, researchers are still working to solve security and other real-time problems of big data migration onto cloud-based platforms. This article focuses on finding possible ways to migrate big data to the cloud. Technology that supports coherent data migration, and the possibility of doing big data analytics on a cloud platform, are in demand for a new era of growth. This article also gives information about available technologies and techniques for migrating big data to the cloud.

  20. Mixed phase clouds: observations and theoretical advances (overview)

    Science.gov (United States)

    Korolev, Alexei

    2013-04-01

    Mixed phase clouds play an important role in precipitation formation and the radiation budget of the Earth. Microphysical measurements in mixed phase clouds are notoriously difficult due to many technical challenges. The airborne instrumentation for characterizing the microstructure of mixed phase clouds is discussed. The results of multiyear airborne observations and measurements of the frequency of occurrence of mixed phase, characteristic spatial scales, and humidity in mixed phase and ice clouds are presented. A theoretical framework describing the thermodynamics and phase transformation of a three-component system consisting of ice particles, liquid droplets, and water vapor is discussed. It is shown that the Wegener-Bergeron-Findeisen process plays a different role in clouds with different dynamics. The problem of the maintenance and longevity of mixed phase clouds is discussed.

  1. Statistical thermodynamics and the size distributions of tropical convective clouds.

    Science.gov (United States)

    Garrett, T. J.; Glenn, I. B.; Krueger, S. K.; Ferlay, N.

    2017-12-01

    Parameterizations for sub-grid cloud dynamics are commonly developed by using fine-scale modeling or measurements to explicitly resolve the mechanistic details of clouds to the best extent possible, and then formulating these behaviors in terms of the cloud state for use within a coarser grid. A second approach is to invoke physical intuition and some very general theoretical principles from equilibrium statistical thermodynamics. This second approach is quite widely used elsewhere in the atmospheric sciences: for example, to explain the heat capacity of air, blackbody radiation, or even the density profile of air in the atmosphere. Here we describe how entrainment and detrainment across cloud perimeters is limited by the amount of available air and the range of moist static energy in the atmosphere, which constrains cloud perimeter distributions to a power law with a -1 exponent along isentropes and to a Boltzmann distribution across isentropes. Further, the total cloud perimeter density in a cloud field is directly tied to the buoyancy frequency of the column. These simple results are shown to be reproduced within a complex dynamic simulation of a tropical convective cloud field and in passive satellite observations of cloud 3D structures. The implication is that equilibrium tropical cloud structures can be inferred from the bulk thermodynamic structure of the atmosphere without having to analyze computationally expensive dynamic simulations.
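
    The quoted perimeter statistics can be illustrated numerically: if cloud perimeters along an isentrope follow a power law n(p) ∝ p^-1, then every logarithmic bin of perimeter contains the same number of clouds. The sampling scheme, bounds, and sample size below are assumptions made for the demonstration, not taken from the simulations or observations.

```python
import numpy as np

# Hedged demo: sample perimeters p with density n(p) proportional to 1/p
# on [pmin, pmax] via inverse-CDF sampling, then count clouds per decade.
# A -1 power-law exponent means equal counts in every logarithmic bin.

rng = np.random.default_rng(0)
pmin, pmax = 1.0, 1e3
u = rng.random(100_000)
p = pmin * (pmax / pmin) ** u        # inverse CDF of the 1/p density

counts, _ = np.histogram(np.log10(p), bins=3)   # one bin per decade
print(counts)   # three roughly equal counts
```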

  2. Research on cloud-based remote measurement and analysis system

    Science.gov (United States)

    Gao, Zhiqiang; He, Lingsong; Su, Wei; Wang, Can; Zhang, Changfan

    2015-02-01

    The promising potential of cloud computing and its convergence with technologies such as cloud storage, cloud push, and mobile computing allows for the creation and delivery of new types of cloud service. Combining these ideas, this paper presents a cloud-based remote measurement and analysis system. The system consists of three parts: a signal acquisition client, a web server deployed on the cloud service, and a remote client. It is a special website developed using asp.net and Flex RIA technology, which resolves the trade-off between the two monitoring modes, B/S and C/S. The platform supplies customer condition monitoring and data analysis services over the Internet and is deployed on a cloud server. The signal acquisition device is responsible for collecting data (sensor data, audio, video, etc.) and regularly pushes the monitoring data to the cloud storage database. Data acquisition equipment in this system needs only data-collection and network capability, such as a smartphone or smart sensor. The system's scale adjusts dynamically according to the number of applications and users, so it does not waste resources. As a representative case study, we developed a prototype system based on the Ali cloud service using a rotor test rig as the research object. Experimental results demonstrate that the proposed system architecture is feasible.

  3. Cloud Computing and Its Applications in GIS

    Science.gov (United States)

    Kang, Cao

    2011-12-01

    this assessment of cloud computing technology, the exploration of the challenges and solutions to the migration of GIS algorithms to cloud computing infrastructures, and the examination of strategies for serving large amounts of GIS data in a cloud computing infrastructure, this dissertation lends support to the feasibility of building a cloud-based GIS system. However, there are still challenges that need to be addressed before a full-scale functional cloud-based GIS system can be successfully implemented. (Abstract shortened by UMI.)

  4. Optical Instruments Synergy in Determination of Optical Depth of Thin Clouds

    Energy Technology Data Exchange (ETDEWEB)

    Vladutescu, Daniela V.; Schwartz, Stephen E.

    2017-06-25

    Optically thin clouds have a strong radiative effect and need to be represented accurately in climate models. Cloud optical depth of thin clouds was retrieved using high resolution digital photography, lidar, and a radiative transfer model. The Doppler Lidar was operated at 1.5 μm, minimizing return from Rayleigh scattering, emphasizing return from aerosols and clouds. This approach examined cloud structure on scales 3 to 5 orders of magnitude finer than satellite products, opening new avenues for examination of cloud structure and evolution.

  5. Cloud computing basics

    CERN Document Server

    Srinivasan, S

    2014-01-01

    Cloud Computing Basics covers the main aspects of this fast-moving technology so that both practitioners and students will be able to understand cloud computing. The author highlights the key aspects of this technology that a potential user might want to investigate before deciding to adopt this service. This book explains how cloud services can be used to augment existing services such as storage, backup and recovery. It addresses the details of how cloud security works and what users must be prepared for when they move their data to the cloud. The book also discusses how businesses can prepare for compliance with the laws as well as industry standards such as the Payment Card Industry.

  6. Solar variability and clouds

    CERN Document Server

    Kirkby, Jasper

    2000-01-01

    Satellite observations have revealed a surprising imprint of the 11- year solar cycle on global low cloud cover. The cloud data suggest a correlation with the intensity of Galactic cosmic rays. If this apparent connection between cosmic rays and clouds is real, variations of the cosmic ray flux caused by long-term changes in the solar wind could have a significant influence on the global energy radiation budget and the climate. However a direct link between cosmic rays and clouds has not been unambiguously established and, moreover, the microphysical mechanism is poorly understood. New experiments are being planned to find out whether cosmic rays can affect cloud formation, and if so how. (37 refs).

  7. Abstracting application deployment on Cloud infrastructures

    Science.gov (United States)

    Aiftimiei, D. C.; Fattibene, E.; Gargana, R.; Panella, M.; Salomoni, D.

    2017-10-01

    Deploying a complex application on a Cloud-based infrastructure can be a challenging task. In this contribution we present an approach for Cloud-based deployment of applications and its present or future implementation in the framework of several projects, such as “!CHAOS: a cloud of controls” [1], a project funded by MIUR (Italian Ministry of Research and Education) to create a Cloud-based deployment of a control system and data acquisition framework, “INDIGO-DataCloud” [2], an EC H2020 project targeting, among other things, high-level deployment of applications on hybrid Clouds, and “Open City Platform” [3], an Italian project aiming to provide open Cloud solutions for Italian Public Administrations. We chose to use an orchestration service to hide the complex deployment of the application components, and to build an abstraction layer on top of the orchestration one. Through the Heat [4] orchestration service, we prototyped a dynamic, on-demand, scalable platform of software components, based on OpenStack infrastructures. On top of the orchestration service we developed a prototype of a web interface exploiting the Heat APIs. The user can start an instance of the application without having knowledge about the underlying Cloud infrastructure and services. Moreover, the platform instance can be customized by choosing parameters related to the application, such as the size of a File System or the number of instances of a NoSQL DB cluster. As soon as the desired platform is running, the web interface offers the possibility to scale some infrastructure components. In this contribution we describe the solution design and implementation, based on the application requirements, the details of the development of both the Heat templates and of the web interface, together with possible exploitation strategies of this work in Cloud data centers.
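
    A minimal sketch of the abstraction idea described above: the web layer collects a few high-level choices (file system size, NoSQL cluster size) and turns them into the JSON body of a Heat stack-create request (POST /v1/{tenant_id}/stacks). The parameter names and template snippet are hypothetical illustrations, not the project's actual interface.

```python
import json

# Hypothetical sketch: translate two user-facing choices into the JSON
# body of a Heat stack-create call. Parameter names ("fs_size",
# "db_cluster_size") and the template string are invented for
# illustration; nothing is sent over the network here.

def build_stack_request(stack_name, template, fs_size_gb, db_nodes):
    """Build the request body for Heat's POST /v1/{tenant_id}/stacks."""
    return {
        "stack_name": stack_name,
        "template": template,
        "parameters": {
            "fs_size": fs_size_gb,          # size of the File System
            "db_cluster_size": db_nodes,    # NoSQL DB cluster instances
        },
    }

body = build_stack_request(
    "analysis-platform",
    "heat_template_version: 2016-04-08",    # placeholder template
    fs_size_gb=100,
    db_nodes=3,
)
print(json.dumps(body, indent=2))
```

    Scaling a running component would then amount to re-submitting the body with a larger `db_cluster_size` via a stack-update call.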

  8. Stratocumulus Cloud Top Radiative Cooling and Cloud Base Updraft Speeds

    Science.gov (United States)

    Kazil, J.; Feingold, G.; Balsells, J.; Klinger, C.

    2017-12-01

    Cloud top radiative cooling is a primary driver of turbulence in the stratocumulus-topped marine boundary layer. A functional relationship between cloud top cooling and cloud base updraft speeds may therefore exist. A correlation of cloud top radiative cooling and cloud base updraft speeds has recently been identified empirically, providing a basis for satellite retrieval of cloud base updraft speeds. Such retrievals may enable analysis of aerosol-cloud interactions using satellite observations: updraft speeds at cloud base co-determine supersaturation and therefore the activation of cloud condensation nuclei, which in turn co-determine cloud properties and precipitation formation. We use large eddy simulation and an off-line radiative transfer model to explore the relationship between cloud-top radiative cooling and cloud base updraft speeds in a marine stratocumulus cloud over the course of the diurnal cycle. We find that during daytime, at low cloud water path (CWP < 50 g m-2), the two quantities are correlated, in agreement with the reported empirical relationship. During the night, in the absence of short-wave heating, CWP builds up (CWP > 50 g m-2) and long-wave emissions from cloud top saturate, while cloud base heating increases. In combination, cloud top cooling and cloud base updrafts become weakly anti-correlated. A functional relationship between cloud top cooling and cloud base updraft speed can hence be expected for stratocumulus clouds with a sufficiently low CWP and sub-saturated long-wave emissions, in particular during daytime. At higher CWPs, in particular at night, the relationship breaks down due to saturation of long-wave emissions from cloud top.

  9. Diffusion and scattering in multifractal clouds

    Energy Technology Data Exchange (ETDEWEB)

    Lovejoy, S. [McGill Univ., Montreal, Quebec (Canada)]; Schertzer, D. [Universite Pierre et Marie Curie, Paris (France)]; Watson, B. [St. Lawrence Univ., Canton, NY (United States)]; and others

    1996-04-01

    This paper describes investigations of radiative properties of multifractal clouds using two different approaches. In the first, diffusion is considered by examining the scaling properties of one dimensional random walks on media with multifractal diffusivities. The second approach considers the scattering statistics associated with radiative transport.

  10. Vertical microphysical profiles of convective clouds as a tool for obtaining aerosol cloud-mediated climate forcings

    Energy Technology Data Exchange (ETDEWEB)

    Rosenfeld, Daniel [Hebrew Univ. of Jerusalem (Israel)

    2015-12-23

    Quantifying the aerosol/cloud-mediated radiative effect at a global scale requires simultaneous satellite retrievals of cloud condensation nuclei (CCN) concentrations and cloud base updraft velocities (Wb). Hitherto, the inability to do so has been a major cause of high uncertainty regarding anthropogenic aerosol/cloud-mediated radiative forcing. This can be addressed by the emerging capability of estimating CCN and Wb of boundary layer convective clouds from an operational polar-orbiting weather satellite. Our methodology uses such clouds as an effective analog for CCN chambers. The cloud base supersaturation (S) is determined by Wb and the satellite-retrieved cloud base drop concentration (Ndb), which is the same as CCN(S). Developing and validating this methodology was possible thanks to the ASR/ARM measurements of CCN and vertical updraft profiles. Validation against ground-based CCN instruments at the ARM sites in Oklahoma, Manaus, and onboard a ship in the northeast Pacific showed a retrieval accuracy of ±25% to ±30% for individual satellite overpasses. The methodology is presently limited to non-raining boundary layer convective clouds of at least 1 km depth that are not obscured by upper-layer clouds, including semitransparent cirrus. The limitation at small solar backscattering angles (<25°) restricts the satellite coverage to ~25% of the world area in a single day. This methodology will likely allow overcoming the challenge of quantifying the aerosol indirect effect and facilitate a substantial reduction of the uncertainty in anthropogenic climate forcing.

  11. Formation of Massive Molecular Cloud Cores by Cloud-cloud Collision

    OpenAIRE

    Inoue, Tsuyoshi; Fukui, Yasuo

    2013-01-01

    Recent observations of molecular clouds around rich massive star clusters including NGC3603, Westerlund 2, and M20 revealed that the formation of massive stars could be triggered by a cloud-cloud collision. By using three-dimensional, isothermal, magnetohydrodynamics simulations with the effect of self-gravity, we demonstrate that massive, gravitationally unstable, molecular cloud cores are formed behind the strong shock waves induced by the cloud-cloud collision. We find that the massive mol...

  12. THE MASS-SIZE RELATION FROM CLOUDS TO CORES. II. SOLAR NEIGHBORHOOD CLOUDS

    International Nuclear Information System (INIS)

    Kauffmann, J.; Shetty, R.; Goodman, A. A.; Pillai, T.; Myers, P. C.

    2010-01-01

    We measure the mass and size of cloud fragments in several molecular clouds continuously over a wide range of spatial scales (0.05 pc and larger). A mass-size power law of the classical form, m(r) ∝ r^2, is not well suited to describe the derived mass-size data. Solar neighborhood clouds not forming massive stars (Pipe Nebula, Taurus, Perseus, and Ophiuchus) obey m(r) ≤ 870 M_sun (r/pc)^1.33. In contrast to this, clouds forming massive stars (Orion A, G10.15 - 0.34, G11.11 - 0.12) do exceed the aforementioned relation. Thus, this limiting mass-size relation may approximate a threshold for the formation of massive stars. Across all clouds, cluster-forming cloud fragments are found to be, at a given radius, more massive than fragments devoid of clusters. The cluster-bearing fragments are found to roughly obey a mass-size law m ∝ r^1.27 (where the exponent is highly uncertain in any given cloud, but is certainly smaller than 1.5).
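
    The limiting relation quoted above can be applied directly: a fragment lying above m = 870 M_sun (r/pc)^1.33 would, by this criterion, be a candidate site of massive star formation. The example fragment values below are invented for illustration.

```python
# Hedged sketch: test a fragment against the limiting mass-size relation
# m = 870 M_sun (r/pc)^1.33 quoted above. Fragments above the line are,
# by this criterion, candidates for massive star formation. The example
# masses and radii are made up.

def exceeds_limit(mass_msun, radius_pc):
    """True when the fragment lies above the limiting mass-size relation."""
    return mass_msun > 870.0 * radius_pc ** 1.33

print(exceeds_limit(2000.0, 1.0))   # True: above the 870 M_sun limit at 1 pc
print(exceeds_limit(500.0, 1.0))    # False
```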

  13. Location-aware network operation for cloud radio access network

    KAUST Repository

    Wang, Fanggang; Ruan, Liangzhong; Win, Moe Z.

    2017-01-01

    One of the major challenges in effectively operating a cloud radio access network (C-RAN) is the excessive overhead signaling and computation load that scale rapidly with the size of the network. In this paper, the exploitation of location

  14. Accelerating HPC Applications through Specialized Linear Algebra Clouds, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Cloud computing has the potential to permit scientists to scale up to solve large science problems without having to invest in hardware and software infrastructure....

  15. Resource Allocation for Cloud Radio Access Networks

    KAUST Repository

    Dhifallah, Oussama

    2016-04-01

    Cloud-radio access network (CRAN) is expected to be the core network architecture for next-generation mobile radio systems. In CRANs, joint signal processing is performed at multiple cloud computing centers (clouds) that are connected to several base stations (BSs) via high-capacity backhaul links. As a result, large-scale interference management and network power consumption reduction can be effectively achieved. Unlike recent works on CRANs, which consider single-cloud processing and treat inter-cloud interference as background noise, the first part of this thesis focuses on the more practical scenario of the downlink of a multi-cloud radio access network where BSs are connected to each cloud through wireline backhaul links. Assume that each cloud serves a set of pre-known single-antenna mobile users (MUs). This part focuses on minimizing the network total power consumption subject to practical constraints. The problems are solved using sophisticated techniques from optimization theory (e.g. a dual-decomposition-based algorithm and an alternating direction method of multipliers (ADMM)-based algorithm). One highlight of this part is that the proposed solutions can be implemented in a distributed fashion by allowing a reasonable information exchange between the coupled clouds. Additionally, feasible solutions of the considered optimization problems can be estimated locally at each iteration. Simulation results show that the proposed distributed algorithms converge to the centralized algorithms in a reasonable number of iterations. To further account for the backhaul congestion due to densification in CRANs, the second part of this thesis considers the downlink of a cache-enabled CRAN where each BS is equipped with a local cache of limited size used to store popular files without the need for backhauling. Further, each cache-enabled BS is connected to the cloud via a limited-capacity backhaul link and can serve a set of pre-known single-antenna MUs. This part

  16. When STAR meets the Clouds-Virtualization and Cloud Computing Experiences

    International Nuclear Information System (INIS)

    Lauret, J; Hajdu, L; Walker, M; Balewski, J; Goasguen, S; Stout, L; Fenn, M; Keahey, K

    2011-01-01

    In recent years, Cloud computing has become a very attractive paradigm and popular model for accessing distributed resources. The Cloud has emerged as the next big trend. The burst of platforms and projects providing Cloud resources and interfaces, at the very same time that Grid projects are entering a production phase in their life cycle, has however raised the question of the best approach to handling distributed resources. In particular, are Cloud resources scaling at the levels shown by Grids? Are they performing at the same level? What is their overhead on the IT teams and infrastructure? Rather than seeing the two as orthogonal, the STAR experiment has viewed them as complementary and has studied merging the best of the two worlds, with Grid middleware providing the aggregation of both Cloud and traditional resources. Since its first use of Cloud resources on Amazon EC2 in 2008/2009 using a Nimbus/EC2 interface, the STAR software team has tested and experimented with many novel approaches: from a traditional, native EC2 approach to the Virtual Organization Cluster (VOC) at Clemson University and Condor/VM on the GLOW resources at the University of Wisconsin. The STAR team is also planning to run as part of the DOE/Magellan project. In this paper, we will present an overview of our findings from using truly opportunistic resources and scaling out two orders of magnitude in both tests and practical usage.

  17. A simple dynamic rising nuclear cloud based model of ground radioactive fallout for atmospheric nuclear explosion

    International Nuclear Information System (INIS)

    Zheng Yi

    2008-01-01

    A simple model, based on a dynamically rising nuclear cloud, for predicting the radioactive fallout from an atmospheric nuclear explosion is presented. The deposition of particles and the change of the initial cloud radius with time before cloud stabilization are considered. Large-scale relative diffusion theory is used after cloud stabilization. The model is considered reasonable and dependable in comparison with four U.S. nuclear test cases and DELFIC model results. (authors)

  18. Cloud Computing: An Overview

    Directory of Open Access Journals (Sweden)

    Libor Sarga

    2012-10-01

    Full Text Available As cloud computing is gaining acclaim as a cost-effective alternative to acquiring processing resources for corporations, scientific applications and individuals, various challenges are rapidly coming to the fore. While academia struggles to procure a concise definition, corporations are more interested in competitive advantages it may generate and individuals view it as a way of speeding up data access times or a convenient backup solution. Properties of the cloud architecture largely preclude usage of existing practices while achieving end-users’ and companies’ compliance requires considering multiple infrastructural as well as commercial factors, such as sustainability in case of cloud-side interruptions, identity management and off-site corporate data handling policies. The article overviews recent attempts at formal definitions of cloud computing, summarizes and critically evaluates proposed delimitations, and specifies challenges associated with its further proliferation. Based on the conclusions, future directions in the field of cloud computing are also briefly hypothesized to include deeper focus on community clouds and bolstering innovative cloud-enabled platforms and devices such as tablets, smart phones, as well as entertainment applications.

  19. Community Cloud Computing

    Science.gov (United States)

    Marinos, Alexandros; Briscoe, Gerard

    Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.

  20. Cloud fluid compression and softening in spiral arms and the formation of giant molecular cloud complexes

    International Nuclear Information System (INIS)

    Cowie, L.L.

    1981-01-01

    In this, the second paper of a series on the galactodynamics of the cloudy interstellar medium, we consider the response of such a gas to a forcing potential in the tight-winding density wave theory. The cloud fluid is treated in the hydrodynamic limit with an equation of state which softens at high densities. It is shown that in the inner regions of the galaxy, cooling of the cloud fluid in the arms can result in gravitational instability and the formation of large bound complexes of clouds which we identify with the giant molecular clouds (GMCs). Masses, dimensions, distributions, and scale heights of the GMCs are predicted by the theory. It is suggested that the interstellar gas density in the disk is regulated by the gravitational instability mechanism in the arms which siphons material into star formation. Implications for the evolution of individual GMCs and for galactic morphology are discussed.

  1. HNSciCloud - Overview and technical Challenges

    Science.gov (United States)

    Gasthuber, Martin; Meinhard, Helge; Jones, Robert

    2017-10-01

    HEP is only one of many sciences with sharply increasing compute requirements that cannot be met by profiting from Moore’s law alone. Commercial clouds potentially allow for realising larger economies of scale. While some small-scale experience requiring dedicated effort has been collected, public cloud resources have not yet been integrated with the standard workflows of science organisations in their private data centres; in addition, European science has not yet ramped up to significant scale. The HELIX NEBULA Science Cloud project - HNSciCloud, partly funded by the European Commission, addresses these points. Ten organisations under CERN’s leadership, covering particle physics, bioinformatics, photon science and other sciences, have joined to procure public cloud resources as well as dedicated development efforts towards this integration. The HNSciCloud project faces the challenge of accelerating developments performed by the selected commercial providers. In order to guarantee cost-efficient usage of IaaS resources across a wide range of scientific communities, the technical requirements had to be carefully constructed. With respect to current IaaS offerings, data-intensive science is the biggest challenge; other points that need to be addressed concern identity federations, network connectivity and how to match business practices of large IaaS providers with those of public research organisations. In the first section, this paper will give an overview of the project and explain the findings so far. The last section will explain the key points of the technical requirements and present first results of the experience of the procurers with the services in comparison to their ’on-premise’ infrastructure.

  2. Cluster analysis of midlatitude oceanic cloud regimes: mean properties and temperature sensitivity

    Directory of Open Access Journals (Sweden)

    N. D. Gordon

    2010-07-01

    Full Text Available Clouds play an important role in the climate system by reducing the amount of shortwave radiation reaching the surface and the amount of longwave radiation escaping to space. Accurate simulation of clouds in computer models remains elusive, however, pointing to a lack of understanding of the connection between large-scale dynamics and cloud properties. This study uses a k-means clustering algorithm to group 21 years of satellite cloud data over midlatitude oceans into seven clusters, and demonstrates that the cloud clusters are associated with distinct large-scale dynamical conditions. Three clusters correspond to low-level cloud regimes with different cloud fraction and cumuliform or stratiform characteristics, but all occur under large-scale descent and a relatively dry free troposphere. Three clusters correspond to vertically extensive cloud regimes with tops in the middle or upper troposphere, and they differ according to the strength of large-scale ascent and enhancement of tropospheric temperature and humidity. The final cluster is associated with a lower troposphere that is dry and an upper troposphere that is moist and experiencing weak ascent and horizontal moist advection.

    Since the present balance of reflection of shortwave and absorption of longwave radiation by clouds could change as the atmosphere warms from increasing anthropogenic greenhouse gases, we must also better understand how increasing temperature modifies cloud and radiative properties. We therefore undertake an observational analysis of how midlatitude oceanic clouds change with temperature when dynamical processes are held constant (i.e., partial derivative with respect to temperature. For each of the seven cloud regimes, we examine the difference in cloud and radiative properties between warm and cold subsets. To avoid misinterpreting a cloud response to large-scale dynamical forcing as a cloud response to temperature, we require horizontal and vertical
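
    The clustering step described above can be sketched with a minimal k-means implementation. The feature set, sample count, and values below are illustrative assumptions, not the study's actual satellite inputs:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-in for gridded satellite retrievals: each row is one
# grid-box/day with cloud fraction, cloud-top pressure (hPa) and optical
# thickness. Real inputs would come from ISCCP-style products.
X = np.column_stack([
    rng.uniform(0.0, 1.0, 5000),
    rng.uniform(100.0, 1000.0, 5000),
    rng.uniform(0.1, 60.0, 5000),
])
X = (X - X.mean(0)) / X.std(0)           # standardise each feature

def kmeans(X, k, iters=50, seed=0):
    """Minimal k-means: returns (centroids, labels)."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # assign each sample to its nearest centroid
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each centroid to the mean of its members (keep it if empty)
        centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
    return centroids, labels

centroids, labels = kmeans(X, k=7)       # seven regimes, as in the study
print(np.bincount(labels, minlength=7))  # occurrence count of each regime
```

    In the study, each resulting cluster is then characterized by compositing the large-scale dynamical fields over its member grid boxes.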

  3. Integrating Cloud-Computing-Specific Model into Aircraft Design

    Science.gov (United States)

    Zhimin, Tian; Qi, Lin; Guangwen, Yang

    Cloud Computing is becoming increasingly relevant, as it will enable companies involved in spreading this technology to open the door to Web 3.0. The new categories of services it introduces will slowly replace many types of computational resources currently in use. In this perspective, grid computing, the basic element for the large-scale supply of cloud services, will play a fundamental role in defining how those services will be provided. The paper integrates a cloud-computing-specific model into aircraft design. This work has achieved good results in sharing licenses of large-scale and expensive software, such as CFD (Computational Fluid Dynamics), UG, CATIA, and so on.

  4. Diffuse interstellar clouds

    International Nuclear Information System (INIS)

    Black, J.H.

    1987-01-01

    The author defines and discusses the nature of diffuse interstellar clouds. He discusses how they contribute to the general extinction of starlight. The atomic and molecular species that have been identified in the ultraviolet, visible, and near infrared regions of the spectrum of a diffuse cloud are presented. The author illustrates some of the practical considerations that affect absorption line observations of interstellar atoms and molecules. Various aspects of the theoretical description of diffuse clouds required for a full interpretation of the observations are discussed

  5. Cloud Computing Security

    OpenAIRE

    Ngongang, Guy

    2011-01-01

    This project aimed to show how possible it is to use a network intrusion detection system in the cloud. The security in the cloud is a concern nowadays and security professionals are still finding means to make cloud computing more secure. First of all the installation of the ESX4.0, vCenter Server and vCenter lab manager in server hardware was successful in building the platform. This allowed the creation and deployment of many virtual servers. Those servers have operating systems and a...

  6. Aerosols, clouds and radiation

    Energy Technology Data Exchange (ETDEWEB)

    Twomey, S [University of Arizona, Tucson, AZ (USA). Inst. of Atmospheric Physics

    1991-01-01

    Most of the so-called 'CO₂ effect' is, in fact, an 'H₂O effect' brought into play by the climate modeler's assumption that planetary average temperature dictates water-vapor concentration (following Clapeyron-Clausius). That assumption ignores the removal process, which cloud physicists know to be influenced by the aerosol, since the latter primarily controls cloud droplet number and size. Droplet number and size are also influential for shortwave (solar) energy. The reflectance of many thin to moderately thick clouds changes when nuclei concentrations change and make shortwave albedo susceptible to aerosol influence.

  7. Trusted cloud computing

    CERN Document Server

    Krcmar, Helmut; Rumpe, Bernhard

    2014-01-01

    This book documents the scientific results of the projects related to the Trusted Cloud Program, covering fundamental aspects of trust, security, and quality of service for cloud-based services and applications. These results aim to allow trustworthy IT applications in the cloud by providing a reliable and secure technical and legal framework. In this domain, business models, legislative circumstances, technical possibilities, and realizable security are closely interwoven and thus are addressed jointly. The book is organized in four parts on "Security and Privacy", "Software Engineering and

  8. Large Interstellar Polarisation Survey. II. UV/optical study of cloud-to-cloud variations of dust in the diffuse ISM

    Science.gov (United States)

    Siebenmorgen, R.; Voshchinnikov, N. V.; Bagnulo, S.; Cox, N. L. J.; Cami, J.; Peest, C.

    2018-03-01

    It is well known that the dust properties of the diffuse interstellar medium exhibit variations towards different sight-lines on a large scale. We have investigated the variability of the dust characteristics on a small scale, and from cloud-to-cloud. We use low-resolution spectro-polarimetric data obtained in the context of the Large Interstellar Polarisation Survey (LIPS) towards 59 sight-lines in the Southern Hemisphere, and we fit these data using a dust model composed of silicate and carbon particles with sizes from the molecular to the sub-micrometre domain. Large (≥6 nm) silicates of prolate shape account for the observed polarisation. For 32 sight-lines we complement our data set with UVES archive high-resolution spectra, which enable us to establish the presence of single-cloud or multiple-clouds towards individual sight-lines. We find that the majority of these 35 sight-lines intersect two or more clouds, while eight of them are dominated by a single absorbing cloud. We confirm several correlations between extinction and parameters of the Serkowski law with dust parameters, but we also find previously undetected correlations between these parameters that are valid only in single-cloud sight-lines. We find that interstellar polarisation from multiple-clouds is smaller than from single-cloud sight-lines, showing that the presence of a second or more clouds depolarises the incoming radiation. We find large variations of the dust characteristics from cloud-to-cloud. However, when we average a sufficiently large number of clouds in single-cloud or multiple-cloud sight-lines, we always retrieve similar mean dust parameters. The typical dust abundances of the single-cloud cases are [C]/[H] = 92 ppm and [Si]/[H] = 20 ppm.

  9. Insights from a Regime Decomposition Approach on CERES and CloudSat-inferred Cloud Radiative Effects

    Science.gov (United States)

    Oreopoulos, L.; Cho, N.; Lee, D.

    2015-12-01

    Our knowledge of the Cloud Radiative Effect (CRE) not only at the Top-of-the-Atmosphere (TOA), but also (with the help of some modeling) at the surface (SFC) and within the atmospheric column (ATM) has been steadily growing in recent years. Not only do we have global values for these CREs, but we can now also plot global maps of their geographical distribution. The next step in our effort to advance our knowledge of CRE is to systematically assess the contributions of prevailing cloud systems to the global values. The presentation addresses this issue directly. We identify the world's prevailing cloud systems, which we call "Cloud Regimes" (CRs) via clustering analysis of MODIS (Aqua-Terra) daily joint histograms of Cloud Top Pressure and Cloud Optical Thickness (TAU) at 1 degree scales. We then composite CERES diurnal values of CRE (TOA, SFC, ATM) separately for each CR by averaging these values for each CR occurrence, and thus find the contribution of each CR to the global value of CRE. But we can do more. We can actually decompose vertical profiles of inferred instantaneous CRE from CloudSat/CALIPSO (2B-FLXHR-LIDAR product) by averaging over Aqua CR occurrences (since A-Train formation flying allows collocation). Such an analysis greatly enhances our understanding of the radiative importance of prevailing cloud mixtures at different atmospheric levels. We can, for example, in addition to examining whether the CERES findings on which CRs contribute to radiative cooling and warming of the atmospheric column are consistent with CloudSat, also gain insight on why and where exactly this happens from the shape of the full instantaneous CRE vertical profiles.
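
    The compositing described here, averaging CRE over the occurrences of each regime, can be sketched as follows; the regime labels and CRE values are synthetic stand-ins for the MODIS regime assignments and CERES fluxes:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical daily 1-degree samples: a regime label (0..6, as from the
# joint-histogram clustering) and a collocated TOA CRE value (W m^-2).
labels = rng.integers(0, 7, 10000)
cre_toa = rng.normal(-20.0, 15.0, 10000)

# Composite: mean CRE per regime, and each regime's relative frequency.
regimes = np.arange(7)
mean_cre = np.array([cre_toa[labels == r].mean() for r in regimes])
freq = np.array([(labels == r).mean() for r in regimes])

# The frequency-weighted regime means sum to the overall mean CRE,
# so each product is that regime's contribution to the global value.
contribution = freq * mean_cre
print(dict(zip(regimes.tolist(), np.round(contribution, 2))))
```

    The same grouping logic applies unchanged to the CloudSat/CALIPSO vertical profiles, with a profile (a vector per sample) averaged per regime instead of a scalar.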

  10. COMPARATIVE STUDY OF CLOUD COMPUTING AND MOBILE CLOUD COMPUTING

    OpenAIRE

    Nidhi Rajak*, Diwakar Shukla

    2018-01-01

    The present era is one of Information and Communication Technology (ICT), and much research is ongoing in Cloud Computing and Mobile Cloud Computing on topics such as security issues, data management, load balancing and so on. Cloud computing provides services to the end user over the Internet, and the primary objectives of this computing are resource sharing and pooling among the end users. Mobile Cloud Computing is a combination of Cloud Computing and Mobile Computing. Here, data is stored in...

  11. Molecular clouds near supernova remnants

    International Nuclear Information System (INIS)

    Wootten, H.A.

    1978-01-01

    The physical properties of molecular clouds near supernova remnants were investigated. Various properties of the structure and kinematics of these clouds are used to establish their physical association with well-known remnants. An infrared survey of the most massive clouds revealed embedded objects, probably stars whose formation was induced by the supernova blast wave. In order to understand the relationship between these and other molecular clouds, a control group of clouds was also observed. Excitation models for dense regions of all the clouds are constructed to evaluate molecular abundances in these regions. Those clouds that have embedded stars have lower molecular abundances than the clouds that do not. A cloud near the W28 supernova remnant also has low abundances. Molecular abundances are used to measure an important parameter, the electron density, which is not directly observable. In some clouds extensive deuterium fractionation is observed, which confirms electron density measurements in those clouds. Where large deuterium fractionation is observed, the ionization rate in the cloud interior can also be measured. The electron density and ionization rate in the cloud near W28 are higher than in most clouds. The molecular abundances and electron densities are functions of the chemical and dynamical state of evolution of the cloud. Those clouds with the lowest abundances are probably the youngest clouds. As low-abundance clouds, some clouds near supernova remnants may have been recently swept from the local interstellar material. Supernova remnants provide sites for star formation in ambient clouds by compressing them, and they sweep new clouds from more diffuse local matter.

  12. Taxonomy of cloud computing services

    NARCIS (Netherlands)

    Hoefer, C.N.; Karagiannis, Georgios

    2010-01-01

    Cloud computing is a highly discussed topic, and many big players of the software industry are entering the development of cloud services. Several companies want to explore the possibilities and benefits of cloud computing, but with the amount of cloud computing services increasing quickly, the need

  13. Cloud Computing (1/2)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Cloud computing, the recent years' buzzword for distributed computing, continues to attract and keep the interest of both the computing and business worlds. These lectures aim at explaining "What is Cloud Computing?" by identifying and analyzing its characteristics, models, and applications. The lectures will explore different "Cloud definitions" given by different authors and use them to introduce the particular concepts. The main cloud models (SaaS, PaaS, IaaS), cloud types (public, private, hybrid), cloud standards and security concerns will be presented. The borders between Cloud Computing and Grid Computing, Server Virtualization, and Utility Computing will be discussed and analyzed.

  14. Cloud Computing (2/2)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Cloud computing, the recent years' buzzword for distributed computing, continues to attract and keep the interest of both the computing and business worlds. These lectures aim at explaining "What is Cloud Computing?" by identifying and analyzing its characteristics, models, and applications. The lectures will explore different "Cloud definitions" given by different authors and use them to introduce the particular concepts. The main cloud models (SaaS, PaaS, IaaS), cloud types (public, private, hybrid), cloud standards and security concerns will be presented. The borders between Cloud Computing and Grid Computing, Server Virtualization, and Utility Computing will be discussed and analyzed.

  15. IBM SmartCloud essentials

    CERN Document Server

    Schouten, Edwin

    2013-01-01

    A practical, user-friendly guide that provides an introduction to cloud computing using IBM SmartCloud, along with a thorough understanding of resource management in a cloud environment.This book is great for anyone who wants to get a grasp of what cloud computing is and what IBM SmartCloud has to offer. If you are an IT specialist, IT architect, system administrator, or a developer who wants to thoroughly understand the cloud computing resource model, this book is ideal for you. No prior knowledge of cloud computing is expected.

  16. Islands in the Sky: Ecophysiological Cloud-Vegetation Linkages in Southern Appalachian Mountain Cloud Forests

    Science.gov (United States)

    Reinhardt, K.; Emanuel, R. E.; Johnson, D. M.

    2013-12-01

    Mountain cloud forest (MCF) ecosystems are characterized by a high frequency of cloud fog, with vegetation enshrouded in fog. The altitudinal boundaries of cloud-fog zones co-occur with conspicuous, sharp vegetation ecotones between MCF- and non-MCF-vegetation. This suggests linkages between cloud-fog and vegetation physiology and ecosystem functioning. However, very few studies have provided a mechanistic explanation for the sharp changes in vegetation communities, or how (if) cloud-fog and vegetation are linked. We investigated ecophysiological linkages between clouds and trees in Southern Appalachian spruce-fir MCF. These refugial forests occur in only six mountain-top, sky-island populations, and are immersed in clouds on up to 80% of all growing season days. Our fundamental research question was: How are cloud-fog and cloud-forest trees linked? We measured microclimate and physiology of canopy tree species across a range of sky conditions (cloud immersed, partly cloudy, sunny). Measurements included: 1) sunlight intensity and spectral quality; 2) carbon gain and photosynthetic capacity at leaf (gas exchange) and ecosystem (eddy covariance) scales; and 3) relative limitations to carbon gain (biochemical, stomatal, hydraulic). RESULTS: 1) Midday sunlight intensity ranged from very dark to full sun (up to 2500 μmol m⁻² s⁻¹), and was highly variable on minute-to-minute timescales whenever clouds were present in the sky. Clouds and cloud-fog increased the proportion of blue-light wavelengths by 5-15% compared to sunny conditions, and altered blue:red and red:far-red ratios, both of which have been shown to strongly affect stomatal functioning. 2) Cloud-fog resulted in ~50% decreased carbon gain at leaf and ecosystem scales, due to sunlight levels below photosynthetic light-saturation points. However, greenhouse studies and light-response-curve analyses demonstrated that MCF tree species have low light-compensation points (can photosynthesize even at low light levels), and maximum

  17. Cloud MicroAtlas∗

    Indian Academy of Sciences (India)

    ∗Any resemblance to the title of David Mitchell's book is purely intentional! RESONANCE | March 2017.

  18. Experimental project - Cloud chamber

    International Nuclear Information System (INIS)

    Nour, Elena; Quinchard, Gregory; Soudon, Paul

    2015-01-01

    This document reports an academic experimental project dealing with the general concepts of radioactivity and their application to the cloud chamber experiment. The author first recalls the history of the design and development of the cloud chamber, gives some definitions and characteristics of cosmic radiation, and describes the principle and physics of a cloud chamber. The second part is theoretical, and addresses the particles involved, the origins of electrons, and issues related to the transfer of energy (Bremsstrahlung effect, Bragg peak). The third part reports the experimental work: the assessment of a cloud droplet radius, the identification of a trace for each particle (alphas and electrons), and the study of the magnetic field deviation.

  19. Green symbiotic cloud communications

    CERN Document Server

    Mustafa, H D; Desai, Uday B; Baveja, Brij Mohan

    2017-01-01

    This book intends to change the perception of modern day telecommunications. Communication systems, usually perceived as “dumb pipes”, carrying information / data from one point to another, are evolved into intelligently communicating smart systems. The book introduces a new field of cloud communications. The concept, theory, and architecture of this new field of cloud communications are discussed. The book lays down nine design postulates that form the basis of the development of a first of its kind cloud communication paradigm entitled Green Symbiotic Cloud Communications or GSCC. The proposed design postulates are formulated in a generic way to form the backbone for development of systems and technologies of the future. The book can be used to develop courses that serve as an essential part of graduate curriculum in computer science and electrical engineering. Such courses can be independent or part of high-level research courses. The book will also be of interest to a wide range of readers including b...

  20. Entangled Cloud Storage

    DEFF Research Database (Denmark)

    Ateniese, Giuseppe; Dagdelen, Özgür; Damgård, Ivan Bjerre

    2012-01-01

    Entangled cloud storage enables a set of clients {P_i} to “entangle” their files {f_i} into a single clew c to be stored by a (potentially malicious) cloud provider S. The entanglement makes it impossible to modify or delete any significant part of the clew without affecting all files in c. A clew keeps the files in it private but still lets each client P_i recover his own data by interacting with S; no cooperation from other clients is needed. At the same time, the cloud provider is discouraged from altering or overwriting any significant part of c, as this would imply that none of the clients can recover their files. We provide theoretical foundations for entangled cloud storage, introducing the notion of an entangled encoding scheme that guarantees strong security requirements capturing the properties above. We also give a concrete construction based on privacy-preserving polynomial interpolation.
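
    As a toy illustration of why polynomial interpolation ties all files together (a deliberate simplification, not the paper's privacy-preserving construction):

```python
from fractions import Fraction

# Each client's file is a value f_i stored as the point (x_i, f_i); the
# "clew" here is the unique polynomial through all points. Every coefficient
# then depends on every file, so corrupting the clew affects all clients.
files = {1: 42, 2: 17, 3: 99}          # x_i -> f_i (hypothetical values)

def poly_mul_linear(poly, root):
    """Multiply a coefficient list (low -> high degree) by (x - root)."""
    out = [Fraction(0)] * (len(poly) + 1)
    for k, c in enumerate(poly):
        out[k + 1] += c
        out[k] -= root * c
    return out

def interpolate(points):
    """Lagrange interpolation; returns coefficients, low to high degree."""
    coeffs = [Fraction(0)] * len(points)
    for xi, yi in points.items():
        basis, denom = [Fraction(1)], Fraction(1)
        for xj in points:
            if xj != xi:
                basis = poly_mul_linear(basis, xj)
                denom *= (xi - xj)
        for k, c in enumerate(basis):
            coeffs[k] += Fraction(yi) * c / denom
    return coeffs

def evaluate(coeffs, x):
    return sum(c * x**k for k, c in enumerate(coeffs))

clew = interpolate(files)                   # the "entangled" encoding
print([evaluate(clew, x) for x in files])   # each client recovers its file
```

    The actual scheme works over a finite field and adds the privacy and verifiability guarantees; this sketch only shows the all-or-nothing dependence.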

  1. Cloudbus Toolkit for Market-Oriented Cloud Computing

    Science.gov (United States)

    Buyya, Rajkumar; Pandey, Suraj; Vecchiola, Christian

    This keynote paper: (1) presents the 21st century vision of computing and identifies various IT paradigms promising to deliver computing as a utility; (2) defines the architecture for creating market-oriented Clouds and computing atmosphere by leveraging technologies such as virtual machines; (3) provides thoughts on market-based resource management strategies that encompass both customer-driven service management and computational risk management to sustain SLA-oriented resource allocation; (4) presents the work carried out as part of our new Cloud Computing initiative, called Cloudbus: (i) Aneka, a Platform as a Service software system containing SDK (Software Development Kit) for construction of Cloud applications and deployment on private or public Clouds, in addition to supporting market-oriented resource management; (ii) internetworking of Clouds for dynamic creation of federated computing environments for scaling of elastic applications; (iii) creation of 3rd party Cloud brokering services for building content delivery networks and e-Science applications and their deployment on capabilities of IaaS providers such as Amazon along with Grid mashups; (iv) CloudSim supporting modelling and simulation of Clouds for performance studies; (v) Energy Efficient Resource Allocation Mechanisms and Techniques for creation and management of Green Clouds; and (vi) pathways for future research.

  2. Scalability of Parallel Scientific Applications on the Cloud

    Directory of Open Access Journals (Sweden)

    Satish Narayana Srirama

    2011-01-01

    Full Text Available Cloud computing, with its promise of virtually infinite resources, seems well suited to solving resource-greedy scientific computing problems. To study the effects of moving parallel scientific applications onto the cloud, we deployed several benchmark applications like matrix–vector operations and NAS parallel benchmarks, and DOUG (Domain decomposition On Unstructured Grids on the cloud. DOUG is an open source software package for parallel iterative solution of very large sparse systems of linear equations. The detailed analysis of DOUG on the cloud showed that parallel applications benefit considerably and scale reasonably well on the cloud. We could also observe the limitations of the cloud and compare it with a cluster in terms of performance. However, for efficiently running scientific applications on the cloud infrastructure, the applications must be reduced to frameworks that can successfully exploit the cloud resources, like the MapReduce framework. Several iterative and embarrassingly parallel algorithms are reduced to the MapReduce model and their performance is measured and analyzed. The analysis showed that Hadoop MapReduce has significant problems with iterative methods, while it suits embarrassingly parallel algorithms well. Scientific computing often uses iterative methods to solve large problems. Thus, for scientific computing on the cloud, this paper raises the necessity for better frameworks or optimizations for MapReduce.
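
    The contrast drawn between embarrassingly parallel and iterative workloads can be illustrated in plain Python. In Hadoop MapReduce, each pass of the second loop below would be a separate job launch with state reloaded from disk, which is where the per-iteration overhead the analysis observed comes from (the numbers are illustrative only):

```python
from functools import reduce

# Embarrassingly parallel: one map phase, one reduce phase.
# (On a cluster, the map calls run independently on different nodes.)
data = list(range(1, 101))
squares = map(lambda n: n * n, data)           # map phase
total = reduce(lambda a, b: a + b, squares)    # reduce phase
print(total)  # sum of squares 1..100 -> 338350

# Iterative method (a Jacobi-style fixed-point sweep): every step depends
# on the previous one, so a MapReduce engine must chain one job per
# iteration instead of finishing in a single map/reduce pass.
x = 0.0
for _ in range(50):
    x = 0.5 * x + 1.0   # converges to the fixed point x = 2
print(round(x, 6))
```

    Frameworks designed for iteration (keeping state resident in memory between passes) avoid this per-iteration job cost, which is the optimization the paper calls for.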

  3. CLOUD COMPUTING SECURITY ISSUES

    OpenAIRE

    Florin OGIGAU-NEAMTIU

    2012-01-01

    The term “cloud computing” has been in the spotlight of IT specialists in recent years because of its potential to transform this industry. The promised benefits have led companies to invest great sums of money in researching and developing this domain, and great steps have been made towards implementing this technology. Managers have traditionally viewed IT as difficult and expensive, and the promise of cloud computing leads many to think that IT will now be easy and cheap. The reality ...

  4. Cloud benchmarking for performance

    OpenAIRE

    Varghese, Blesson; Akgun, Ozgur; Miguel, Ian; Thai, Long; Barker, Adam

    2014-01-01

    Date of Acceptance: 20/09/2014 How can applications be deployed on the cloud to achieve maximum performance? This question has become significant and challenging with the availability of a wide variety of Virtual Machines (VMs) with different performance capabilities in the cloud. The above question is addressed by proposing a six step benchmarking methodology in which a user provides a set of four weights that indicate how important each of the following groups: memory, processor, computa...
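
    The weighting step of the benchmarking methodology can be sketched as a weighted score over attribute groups; the VM names, benchmark scores and weights below are invented for illustration, not taken from the paper:

```python
# Hypothetical normalised benchmark scores per VM type (higher is better)
# for four attribute groups; in the methodology the weights are supplied
# by the user to express how important each group is to the application.
vms = {
    "vm.small": {"memory": 0.4, "processor": 0.3, "compute": 0.5, "storage": 0.6},
    "vm.large": {"memory": 0.9, "processor": 0.8, "compute": 0.7, "storage": 0.5},
}
weights = {"memory": 3, "processor": 4, "compute": 2, "storage": 1}

def score(vm):
    """Weighted average of a VM's group scores under the user's weights."""
    return sum(weights[g] * vm[g] for g in weights) / sum(weights.values())

best = max(vms, key=lambda name: score(vms[name]))
print(best, round(score(vms[best]), 3))  # -> vm.large 0.78
```

    Ranking VMs by such a score lets the user pick the offering that best matches their stated priorities rather than raw price or size.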

  5. Toward Cloud Computing Evolution

    OpenAIRE

    Susanto, Heru; Almunawar, Mohammad Nabil; Kang, Chen Chin

    2012-01-01

    Information Technology (IT) has shaped the success of organizations, giving them a solid foundation that increases both their efficiency and their productivity. The computing industry is witnessing a paradigm shift in the way computing is performed worldwide. There is a growing awareness among consumers and enterprises to access their IT resources extensively through a "utility" model known as "cloud computing." Cloud computing was initially rooted in distributed grid-based computing. ...

  6. A TRUSTWORTHY CLOUD FORENSICS ENVIRONMENT

    OpenAIRE

    Zawoad , Shams; Hasan , Ragib

    2015-01-01

    Part 5: CLOUD FORENSICS; International audience; The rapid migration from traditional computing and storage models to cloud computing environments has made it necessary to support reliable forensic investigations in the cloud. However, current cloud computing environments often lack support for forensic investigations and the trustworthiness of evidence is often questionable because of the possibility of collusion between dishonest cloud providers, users and forensic investigators. This chapt...

  7. On Cloud-based Oversubscription

    OpenAIRE

    Householder, Rachel; Arnold, Scott; Green, Robert

    2014-01-01

    Rising trends in the number of customers turning to the cloud for their computing needs has made effective resource allocation imperative for cloud service providers. In order to maximize profits and reduce waste, providers have started to explore the role of oversubscribing cloud resources. However, the benefits of cloud-based oversubscription are not without inherent risks. This paper attempts to unveil the incentives, risks, and techniques behind oversubscription in a cloud infrastructure....

  8. SOME CONSIDERATIONS ON CLOUD ACCOUNTING

    OpenAIRE

    Doina Pacurari; Elena Nechita

    2013-01-01

    Cloud technologies have developed intensively during the last years. Cloud computing allows the customers to interact with their data and applications at any time, from any location, while the providers host these resources. A client company may choose to run in the cloud a part of its business (sales by agents, payroll, etc.), or even the entire business. The company can get access to a large category of cloud-based software, including accounting software. Cloud solutions are especially reco...

  9. A cosmic ray-climate link and cloud observations

    Directory of Open Access Journals (Sweden)

    Dunne Eimear M.

    2012-11-01

    Full Text Available Despite over 35 years of constant satellite-based measurements of cloud, reliable evidence of a long-hypothesized link between changes in solar activity and Earth’s cloud cover remains elusive. This work examines evidence of a cosmic ray-cloud link from a range of sources, including satellite-based cloud measurements and long-term ground-based climatological measurements. The satellite-based studies can be divided into two categories: (1) monthly to decadal timescale analysis and (2) daily timescale epoch-superpositional (composite) analysis. The latter analyses frequently focus on sudden high-magnitude reductions in the cosmic ray flux known as Forbush decrease events. At present, two long-term independent global satellite cloud datasets are available (ISCCP and MODIS). Although the differences between them are considerable, neither shows evidence of a solar-cloud link at either long or short timescales. Furthermore, reports of observed correlations between solar activity and cloud over the 1983–1995 period are attributed to the chance agreement between solar changes and artificially induced cloud trends. It is possible that the satellite cloud datasets and analysis methods may simply be too insensitive to detect a small solar signal. Evidence from ground-based studies suggests that some weak but statistically significant cosmic ray-cloud relationships may exist at regional scales, involving mechanisms related to the global electric circuit. However, a poor understanding of these mechanisms and their effects on cloud makes the net impacts of such links uncertain. Regardless of this, it is clear that there is no robust evidence of a widespread link between the cosmic ray flux and clouds.
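
    The epoch-superpositional (composite) technique in category (2) can be sketched as follows, with a synthetic daily series and hypothetical Forbush-decrease dates standing in for real cloud and cosmic-ray data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical daily cloud-cover anomaly series and event-day indices.
n_days = 2000
series = rng.normal(0.0, 1.0, n_days)
events = [300, 700, 1100, 1500]           # Forbush-decrease days (illustrative)

# Superposed-epoch composite: stack a +/-10 day window centred on each
# event, then average across events, lag by lag. Any cloud response locked
# to the events survives the averaging; unrelated variability cancels.
half = 10
windows = np.array([series[e - half:e + half + 1] for e in events])
composite = windows.mean(axis=0)          # mean anomaly at each lag

lags = np.arange(-half, half + 1)
print(lags[composite.argmin()], round(float(composite.min()), 3))
```

    With only a handful of events the composite is noisy, which is one reason such studies need significance testing against randomly chosen epoch dates.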

  10. CLOUD COMPUTING TECHNOLOGY TRENDS

    Directory of Open Access Journals (Sweden)

    Cristian IVANUS

    2014-05-01

    Full Text Available Cloud computing has been a tremendous innovation, through which applications became available online, accessible through an Internet connection and usable from any computing device (computer, smartphone, or tablet). According to one of the most recent studies, conducted in 2012 by Everest Group and Cloud Connect, 57% of companies said they already use SaaS (Software as a Service) applications, and 38% reported using standard PaaS (Platform as a Service) tools. However, in most cases, the users of these solutions highlighted that one of the main obstacles to the development of this technology is that, in the cloud, applications are not available without an Internet connection. The new challenge for cloud systems has now become offline use, specifically accessing SaaS applications without being connected to the Internet. This topic is directly related to user productivity within companies, as productivity growth is one of the key promises of the transformation to cloud applications. The aim of this paper is to present some important aspects of offline cloud systems and regulatory trends in the European Union (EU).

  11. A CloudSat-CALIPSO View of Cloud and Precipitation Properties Across Cold Fronts over the Global Oceans

    Science.gov (United States)

    Naud, Catherine M.; Posselt, Derek J.; van den Heever, Susan C.

    2015-01-01

    The distribution of cloud and precipitation properties across oceanic extratropical cyclone cold fronts is examined using four years of combined CloudSat radar and CALIPSO lidar retrievals. The global annual mean cloud and precipitation distributions show that low-level clouds are ubiquitous in the post-frontal zone while higher-level cloud frequency and precipitation peak in the warm sector along the surface front. Increases in temperature and moisture within the cold front region are associated with larger high-level but lower mid-/low-level cloud frequencies and precipitation decreases in the cold sector. This behavior seems to be related to a shift from stratiform to convective clouds and precipitation. Stronger ascent in the warm conveyor belt tends to enhance cloudiness and precipitation across the cold front. A strong temperature contrast between the warm and cold sectors also encourages greater post-cold-frontal cloud occurrence. While the seasonal contrasts in environmental temperature, moisture, and ascent strength are enough to explain most of the variations in cloud and precipitation across cold fronts in both hemispheres, they do not fully explain the differences between Northern and Southern Hemisphere cold fronts. These differences are better explained when the impact of the contrast in temperature across the cold front is also considered. In addition, these large-scale parameters do not explain the relatively high frequency of springtime post-frontal precipitation.

  12. High-mass star formation possibly triggered by cloud-cloud collision in the H II region RCW 34

    Science.gov (United States)

    Hayashi, Katsuhiro; Sano, Hidetoshi; Enokiya, Rei; Torii, Kazufumi; Hattori, Yusuke; Kohno, Mikito; Fujita, Shinji; Nishimura, Atsushi; Ohama, Akio; Yamamoto, Hiroaki; Tachihara, Kengo; Hasegawa, Yutaka; Kimura, Kimihiro; Ogawa, Hideo; Fukui, Yasuo

    2018-05-01

    We report on the possibility that the high-mass star located in the H II region RCW 34 was formed by triggering induced by a collision of molecular clouds. Molecular gas distributions of the ¹²CO and ¹³CO J = 2-1 and ¹²CO J = 3-2 lines in the direction of RCW 34 were measured using the NANTEN2 and ASTE telescopes. We found two clouds with velocity ranges of 0-10 km s⁻¹ and 10-14 km s⁻¹. Whereas the former cloud is as massive as ˜1.4 × 10⁴ M⊙ and has a morphology similar to the ring-like structure observed in the infrared wavelengths, the latter cloud, with a mass of ˜600 M⊙, which has not been recognized by previous observations, is distributed to just cover the bubble enclosed by the other cloud. The high-mass star with a spectral type of O8.5V is located near the boundary of the two clouds. The line intensity ratio of ¹²CO J = 3-2/J = 2-1 yields high values (≳1.0), suggesting that these clouds are associated with the massive star. We also confirm that the obtained position-velocity diagram shows a distribution similar to that derived by a numerical simulation of the supersonic collision of two clouds. Using the relative velocity between the two clouds (˜5 km s⁻¹), the collisional time scale is estimated to be ˜0.2 Myr with the assumption of a distance of 2.5 kpc. These results suggest that the high-mass star in RCW 34 was formed rapidly, within a time scale of ˜0.2 Myr, triggered by a cloud-cloud collision.
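    The quoted collisional timescale is essentially a crossing-time estimate, t ≈ L / v_rel. A minimal sketch, assuming an illustrative interaction-region depth of ~1 pc (the depth is not stated in the abstract; only v_rel ~ 5 km s⁻¹ and t ~ 0.2 Myr are):

```python
PC_KM = 3.0857e13   # kilometres in one parsec
MYR_S = 3.156e13    # seconds in one megayear

def collision_timescale_myr(depth_pc, v_rel_km_s):
    """Crossing time t = L / v_rel of the interaction region, in Myr."""
    return depth_pc * PC_KM / v_rel_km_s / MYR_S

t = collision_timescale_myr(1.0, 5.0)  # assumed ~1 pc depth, 5 km/s
print(f"{t:.2f} Myr")  # ~0.2 Myr, consistent with the quoted estimate
```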

  13. STAR FORMATION IN DISK GALAXIES. I. FORMATION AND EVOLUTION OF GIANT MOLECULAR CLOUDS VIA GRAVITATIONAL INSTABILITY AND CLOUD COLLISIONS

    International Nuclear Information System (INIS)

    Tasker, Elizabeth J.; Tan, Jonathan C.

    2009-01-01

    We investigate the formation and evolution of giant molecular clouds (GMCs) in a Milky-Way-like disk galaxy with a flat rotation curve. We perform a series of three-dimensional adaptive mesh refinement numerical simulations that follow both the global evolution on scales of ∼20 kpc and resolve down to sub-cloud scales. We identify clouds as regions with n_H ≥ 100 cm⁻³ and track the evolution of individual clouds as they orbit through the galaxy from their birth to their eventual destruction via merger or via destructive collision with another cloud. After ∼140 Myr a large fraction of the gas in the disk has fragmented into clouds with masses ∼10⁶ M⊙ and a mass spectrum similar to that of Galactic GMCs. The disk settles into a quasi-steady state in which gravitational scattering of clouds keeps the disk near the threshold of global gravitational instability. The cloud collision time is found to be a small fraction, ∼1/5, of the orbital time, and this is an efficient mechanism to inject turbulence into the clouds. This helps to keep clouds only moderately gravitationally bound, with virial parameters of order unity. Many other observed GMC properties, such as mass surface density, angular momentum, velocity dispersion, and vertical distribution, can be accounted for in this simple model with no stellar feedback.
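    The "virial parameters of order unity" mentioned above follow from α_vir = 5σ²R/(GM). A quick check with illustrative GMC numbers (the mass, radius, and velocity dispersion below are assumed, not taken from the simulations):

```python
G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30  # solar mass, kg
PC = 3.0857e16    # parsec, m

def virial_parameter(mass_msun, radius_pc, sigma_km_s):
    """alpha_vir = 5 sigma^2 R / (G M); ~1 for a moderately bound cloud."""
    sigma_m_s = sigma_km_s * 1.0e3
    return 5.0 * sigma_m_s**2 * radius_pc * PC / (G * mass_msun * M_SUN)

# Illustrative GMC: 10^6 M_sun, 20 pc radius, 6 km/s velocity dispersion.
print(round(virial_parameter(1.0e6, 20.0, 6.0), 2))  # order unity
```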

  14. Secure Skyline Queries on Cloud Platform.

    Science.gov (United States)

    Liu, Jinfei; Yang, Juncheng; Xiong, Li; Pei, Jian

    2017-04-01

    Outsourcing data and computation to a cloud server provides a cost-effective way to support large-scale data storage and query processing. However, due to security and privacy concerns, sensitive data (e.g., medical records) need to be protected from the cloud server and other unauthorized users. One approach is to outsource encrypted data to the cloud server and have the cloud server perform query processing on the encrypted data only. It remains a challenging task to support various queries over encrypted data in a secure and efficient way such that the cloud server does not gain any knowledge about the data, query, or query result. In this paper, we study the problem of secure skyline queries over encrypted data. The skyline query is particularly important for multi-criteria decision making but also presents significant challenges due to its complex computation. We propose a fully secure skyline query protocol on data encrypted using semantically secure encryption. As a key subroutine, we present a new secure dominance protocol, which can also be used as a building block for other queries. Finally, we provide both serial and parallelized implementations and empirically study the protocols in terms of efficiency and scalability under different parameter settings, verifying the feasibility of our proposed solutions.
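    A plaintext sketch of the dominance test and skyline computation at the heart of such a protocol (the secure version evaluates the same dominance relation on encrypted values; this toy assumes smaller is better in every dimension):

```python
def dominates(p, q):
    """p dominates q if p is no worse in every dimension and strictly
    better in at least one (smaller values preferred)."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def skyline(points):
    """Naive O(n^2) skyline: keep points dominated by no other point."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Example tuples, e.g. (price, distance) for multi-criteria decision making.
pts = [(3, 9), (5, 4), (6, 6), (9, 2), (8, 8)]
print(skyline(pts))  # [(3, 9), (5, 4), (9, 2)]
```

    The secure protocol's difficulty is performing exactly these comparisons without revealing the points, the query, or even the comparison outcomes to the server.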

  15. Precombination Cloud Collapse and Baryonic Dark Matter

    Science.gov (United States)

    Hogan, Craig J.

    1993-01-01

    A simple spherical model of dense baryon clouds in the hot big bang ('strongly nonlinear primordial isocurvature baryon fluctuations') is reviewed and used to describe the dependence of cloud behavior on the model parameters: baryon mass and initial overdensity. Gravitational collapse of clouds before and during recombination is considered, including radiation diffusion and trapping, remnant type and mass, and effects on linear large-scale fluctuation modes. Sufficiently dense clouds collapse early into black holes with a minimum mass of approx. 1 solar mass, which behave dynamically like collisionless cold dark matter. Clouds below a critical overdensity, however, delay collapse until recombination, remaining until then dynamically coupled to the radiation like ordinary diffuse baryons, and possibly producing remnants of other kinds and lower mass. The mean density in either type of baryonic remnant is unconstrained by observed element abundances. However, mixed or unmixed spatial variations in abundance may survive in the diffuse baryons and produce observable departures from standard predictions.

  16. The impact of radiatively active water-ice clouds on Martian mesoscale atmospheric circulations

    Science.gov (United States)

    Spiga, A.; Madeleine, J.-B.; Hinson, D.; Navarro, T.; Forget, F.

    2014-04-01

    Background and Goals: Water ice clouds are a key component of the Martian climate [1]. Understanding the properties of Martian water ice clouds is crucial to constrain the Red Planet's climate and hydrological cycle, both in the present and in the past [2]. In recent years, this statement has become all the more true as it was shown that the radiative effects of water ice clouds are far from being as negligible as hitherto believed; instead, water ice clouds play a key role in the large-scale thermal structure and dynamics of the Martian atmosphere [3, 4, 5]. Nevertheless, the radiative effect of water ice clouds at scales smaller than the large synoptic scale (the so-called mesoscales) remains to be explored. Here we use, for the first time, mesoscale modeling with radiatively active water ice clouds to address this open question.

  17. MREG V1.1 : a multi-scale image registration algorithm for SAR applications.

    Energy Technology Data Exchange (ETDEWEB)

    Eichel, Paul H.

    2013-08-01

    MREG V1.1 is the sixth-generation SAR image registration algorithm developed by the Signal Processing & Technology Department for Synthetic Aperture Radar applications. Like its predecessor algorithm REGI, it employs a powerful iterative multi-scale paradigm to achieve the competing goals of sub-pixel registration accuracy and the ability to handle large initial offsets. Since it is not model based, it allows high-fidelity tracking of spatially varying terrain-induced misregistration. Since it does not rely on image-domain phase, it is equally adept at coherent and noncoherent image registration. This document provides a brief history of the registration processors developed by Dept. 5962 leading up to MREG V1.1, a full description of the signal processing steps involved in the algorithm, and a user's manual with application-specific recommendations for CCD, TwoColor MultiView, and SAR stereoscopy.
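    A common building block for sub-pixel registration is cross-correlation followed by a parabolic fit around the correlation peak. The sketch below illustrates that idea in 1-D; it is a generic technique, not the MREG algorithm itself:

```python
import numpy as np

def subpixel_shift_1d(a, b):
    """Estimate the shift of b relative to a via FFT cross-correlation,
    refined to sub-sample precision by a parabolic fit at the peak."""
    n = len(a)
    corr = np.real(np.fft.ifft(np.fft.fft(b) * np.conj(np.fft.fft(a))))
    k = int(np.argmax(corr))
    # Parabolic interpolation over the peak and its two neighbours.
    y0, y1, y2 = corr[k - 1], corr[k], corr[(k + 1) % n]
    denom = y0 - 2.0 * y1 + y2
    frac = 0.5 * (y0 - y2) / denom if denom != 0 else 0.0
    shift = k + frac
    return shift - n if shift > n / 2 else shift  # wrap to signed shift

x = np.arange(256)
a = np.exp(-0.5 * ((x - 128) / 6.0) ** 2)   # smooth synthetic scene
print(subpixel_shift_1d(a, np.roll(a, 3)))  # ~3.0
```

    Multi-scale registration of the kind MREG performs repeats an estimate like this from coarse to fine resolution, so large initial offsets are absorbed at the coarse levels and sub-pixel accuracy is achieved at the finest.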

  18. Examination of Regional Trends in Cloud Properties over Surface Sites Derived from MODIS and AVHRR using the CERES Cloud Algorithm

    Science.gov (United States)

    Smith, W. L., Jr.; Minnis, P.; Bedka, K. M.; Sun-Mack, S.; Chen, Y.; Doelling, D. R.; Kato, S.; Rutan, D. A.

    2017-12-01

    Recent studies analyzing long-term measurements of surface insolation at ground sites suggest that decadal-scale trends of increasing (brightening) and decreasing (dimming) downward solar flux have occurred at various times over the last century. Regional variations have been reported that range from near 0 W m⁻²/decade to as large as 9 W m⁻²/decade depending on the location and time period analyzed. The more significant trends have been attributed to changes in overhead clouds and aerosols, although quantifying their relative impacts using independent observations has been difficult, owing in part to a lack of consistent long-term measurements of cloud properties. This paper examines new satellite-based records of cloud properties derived from MODIS (2000-present) and AVHRR (1981-present) data to infer cloud property trends over a number of surface radiation sites across the globe. The MODIS cloud algorithm was developed for the NASA Clouds and the Earth's Radiant Energy System (CERES) project to provide a consistent record of cloud properties to help improve broadband radiation measurements and to better understand cloud radiative effects. The CERES-MODIS cloud algorithm has been modified to analyze other satellites including the AVHRR on the NOAA satellites. Compared to MODIS, obtaining consistent cloud properties over a long period from AVHRR is a much more significant challenge owing to the number of different satellites, instrument calibration uncertainties, orbital drift and other factors. Nevertheless, both the MODIS and AVHRR cloud properties will be analyzed to determine trends, and their level of consistency and correspondence with surface radiation trends derived from the ground-based radiometer data. It is anticipated that this initial study will contribute to an improved understanding of surface solar radiation trends and their relationship to clouds.
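    The decadal trends discussed above reduce to a least-squares slope scaled to W m⁻² per decade. A minimal sketch with a synthetic, noise-free brightening record (the station values are invented for illustration):

```python
import numpy as np

def trend_per_decade(years, flux):
    """Least-squares linear trend of downward solar flux, W m^-2 per decade."""
    slope, _intercept = np.polyfit(years, flux, 1)  # slope in W m^-2 per year
    return 10.0 * slope

# Synthetic 30-year record with a 3 W m^-2/decade brightening built in.
years = np.arange(1985, 2015)
flux = 185.0 + 0.3 * (years - 1985)
print(trend_per_decade(years, flux))  # ~3.0
```

    Real records require far more care than this sketch suggests: gap handling, homogenization across instruments, and significance testing against autocorrelated noise all matter before a trend can be attributed to clouds or aerosols.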

  19. The influence of rain and clouds on a satellite dual frequency radar altimeter system operating at 13 and 35 GHz

    Science.gov (United States)

    Walsh, E. J.; Monaldo, F. M.; Goldhirsh, J.

    1983-01-01

    The effects of inhomogeneous spatial attenuation resulting from clouds and rain on the altimeter estimate of the range to mean sea level are modelled. It is demonstrated that typical cloud and rain attenuation variability at commonly expected spatial scales can significantly degrade altimeter range precision. Rain cell and cloud scale sizes and attenuations are considered as factors. The model simulation of altimeter signature distortion is described, and the distortion of individual radar pulse waveforms by different spatial scales of attenuation is considered. Examples of range errors found for models of a single cloud, a rain cell, and cloud streets are discussed.

  20. Life in the clouds: are tropical montane cloud forests responding to changes in climate?

    Science.gov (United States)

    Hu, Jia; Riveros-Iregui, Diego A

    2016-04-01

    The humid tropics represent only one example of the many places worldwide where anthropogenic disturbance and climate change are quickly affecting the feedbacks between water and trees. In this article, we address the need for a more long-term perspective on the effects of climate change on tropical montane cloud forests (TMCF) in order to fully assess the combined vulnerability and long-term response of tropical trees to changes in precipitation regimes, including cloud immersion. We first review the ecophysiological benefits that cloud water interception offers to trees in TMCF and then examine current climatological evidence that suggests changes in cloud base height and impending changes in cloud immersion for TMCF. Finally, we propose an experimental approach to examine the long-term dynamics of tropical trees in TMCF in response to environmental conditions on decade-to-century time scales. This information is important to assess the vulnerability and long-term response of TMCF to changes in cloud cover and fog frequency and duration.

  1. Cloud networking understanding cloud-based data center networks

    CERN Document Server

    Lee, Gary

    2014-01-01

    Cloud Networking: Understanding Cloud-Based Data Center Networks explains the evolution of established networking technologies into distributed, cloud-based networks. Starting with an overview of cloud technologies, the book explains how cloud data center networks leverage distributed systems for network virtualization, storage networking, and software-defined networking. The author offers insider perspective to key components that make a cloud network possible such as switch fabric technology and data center networking standards. The final chapters look ahead to developments in architectures

  2. The Atmospheric Aerosols And Their Effects On Cloud Albedo And Radiative Forcing

    International Nuclear Information System (INIS)

    Stefan, S.; Iorga, G.; Zoran, M.

    2007-01-01

    The aim of this study is to provide results of theoretical experiments intended to improve estimates of the indirect effect of aerosol on cloud albedo and, consequently, on radiative forcing. Cloud properties can change primarily because of changes in both the aerosol type and concentration in the atmosphere. Only a fraction of the aerosol interacts effectively with water and will, in turn, determine the cloud droplet number concentration (CDNC). We calculated the CDNC, droplet effective radius (r_eff), cloud optical thickness (τ), cloud albedo, and radiative forcing for various types of aerosol. Our results show to what extent a change of aerosol characteristics (number concentration and chemical composition) on a regional scale can modify cloud reflectivity. Higher cloud albedo values were obtained for continental (urban) clouds.
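    The CDNC → r_eff → τ → albedo chain described above can be sketched with standard textbook relations: a volume-mean radius from liquid water content and droplet number, τ ≈ 3·LWP/(2·ρ_w·r_eff), and a two-stream albedo A ≈ τ/(τ + 7.7). The LWC, LWP, droplet numbers, and spectral-shape factor k below are assumed illustrative values, not the paper's:

```python
import math

RHO_W = 1000.0  # density of liquid water, kg m^-3

def effective_radius_m(lwc_kg_m3, n_per_m3, k=0.8):
    """Droplet effective radius from liquid water content and CDNC,
    using an assumed spectral-shape factor k ~ 0.8."""
    r_vol = (3.0 * lwc_kg_m3 / (4.0 * math.pi * RHO_W * n_per_m3)) ** (1.0 / 3.0)
    return r_vol / k ** (1.0 / 3.0)

def cloud_albedo(lwp_kg_m2, r_eff_m):
    """Two-stream albedo from optical thickness tau = 3*LWP/(2*rho_w*r_eff)."""
    tau = 3.0 * lwp_kg_m2 / (2.0 * RHO_W * r_eff_m)
    return tau / (tau + 7.7)

# Clean (maritime-like) vs polluted (continental/urban) cloud, same water:
lwc, lwp = 3.0e-4, 0.09         # kg m^-3 and kg m^-2, assumed values
for n_cm3 in (100.0, 300.0):    # droplet number concentrations per cm^3
    r_eff = effective_radius_m(lwc, n_cm3 * 1e6)
    print(f"N={n_cm3:.0f} cm^-3: r_eff={r_eff * 1e6:.1f} um, "
          f"albedo={cloud_albedo(lwp, r_eff):.2f}")
```

    The polluted case yields smaller droplets and a brighter cloud for the same water amount, which is the qualitative result the study quantifies for different aerosol types.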

  3. Interaction of plasma cloud with external electric field in lower ionosphere

    Directory of Open Access Journals (Sweden)

    Y. S. Dimant

    2010-03-01

    Full Text Available In the auroral lower-E and upper-D region of the ionosphere, plasma clouds, such as sporadic-E layers and meteor plasma trails, occur daily. Large-scale electric fields, created by the magnetospheric dynamo, will polarize these highly conducting clouds, redistributing the electrostatic potential and generating anisotropic currents both within and around the cloud. Using a simplified model of the cloud and the background ionosphere, we develop the first self-consistent three-dimensional analytical theory of these phenomena. For dense clouds, this theory predicts highly amplified electric fields around the cloud, along with strong currents collected from the ionosphere and circulated through the cloud. This has implications for the generation of plasma instabilities, electron heating, and global MHD modeling of magnetosphere-ionosphere coupling via modifications of conductances induced by sporadic-E clouds.

  4. Narrowing the Gap in Quantification of Aerosol-Cloud Radiative Effects

    Science.gov (United States)

    Feingold, G.; McComiskey, A. C.; Yamaguchi, T.; Kazil, J.; Johnson, J. S.; Carslaw, K. S.

    2016-12-01

    Despite large advances in our understanding of aerosol and cloud processes over the past years, uncertainty in the aerosol-cloud radiative effect/forcing is still of major concern. In this talk we will advocate a methodology for quantifying the aerosol-cloud radiative effect that considers the primacy of fundamental cloud properties such as cloud amount and albedo alongside the need for process level understanding of aerosol-cloud interactions. We will present a framework for quantifying the aerosol-cloud radiative effect, regime-by-regime, through process-based modelling and observations at the large eddy scale. We will argue that understanding the co-variability between meteorological and aerosol drivers of the radiative properties of the cloud system may be as important an endeavour as attempting to untangle these drivers.

  5. Factors influencing the parameterization of anvil clouds within GCMs

    International Nuclear Information System (INIS)

    Leone, J.M. Jr.; Chin, Hung-Neng.

    1993-03-01

    The overall goal of this project is to improve the representation of clouds and their effects within global climate models (GCMs). The authors have concentrated on a small portion of the overall goal: the evolution of convectively generated cirrus clouds and their effects on the large-scale environment. Because of the large range of time and length scales involved, they have been using a multi-scale attack. For the early-time generation and development of the cirrus anvil they use a cloud-scale model with horizontal resolution of 1-2 kilometers, while for transport by the larger-scale flow they use a mesoscale model with a horizontal resolution of 20-60 kilometers. The eventual goal is to use the information obtained from these simulations together with available observations to derive improved cloud parameterizations for use in GCMs. This paper presents results from their cloud-scale studies and describes a new tool, a cirrus generator, that they have developed to aid in their mesoscale studies.

  6. USGEO DMWG Cloud Computing Recommendations

    Science.gov (United States)

    de la Beaujardiere, J.; McInerney, M.; Frame, M. T.; Summers, C.

    2017-12-01

    The US Group on Earth Observations (USGEO) Data Management Working Group (DMWG) has been developing Cloud Computing Recommendations for Earth Observations. This inter-agency report is currently in draft form; DMWG hopes to have released the report as a public Request for Information (RFI) by the time of AGU. The recommendations are geared toward organizations that have already decided to use the Cloud for some of their activities (i.e., the focus is not on "why you should use the Cloud," but rather "If you plan to use the Cloud, consider these suggestions.") The report comprises Introductory Material, including Definitions, Potential Cloud Benefits, and Potential Cloud Disadvantages, followed by Recommendations in several areas: Assessing When to Use the Cloud, Transferring Data to the Cloud, Data and Metadata Contents, Developing Applications in the Cloud, Cost Minimization, Security Considerations, Monitoring and Metrics, Agency Support, and Earth Observations-specific recommendations. This talk will summarize the recommendations and invite comment on the RFI.

  7. Cloud GIS Based Watershed Management

    Science.gov (United States)

    Bediroğlu, G.; Colak, H. E.

    2017-11-01

    In this study, we generated a Cloud GIS based watershed management system using a cloud computing architecture. Cloud GIS is used as SaaS (Software as a Service) and DaaS (Data as a Service). We applied GIS analysis in the cloud to test SaaS and deployed GIS datasets in the cloud to test DaaS. We used a hybrid cloud computing model, making use of ready-made web-based mapping services hosted in the cloud (world topology, satellite imagery). We uploaded data to the system after creating geodatabases including hydrology (rivers, lakes), soil maps, climate maps, rain maps, geology, and land use. The watershed of the study area was determined in the cloud using ready-hosted topology maps. After uploading all the datasets to the system, we applied various GIS analyses and queries. Results show that Cloud GIS technology brings speed and efficiency to watershed management studies. Besides this, the system can be easily implemented for similar land analysis and management studies.
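    Watershed delineation of the kind described rests on flow-direction and flow-accumulation rasters. A minimal D8 sketch (each cell drains to its steepest lower 8-neighbour; no distance weighting or pit filling, both of which production GIS tools add):

```python
def d8_flow_accumulation(dem):
    """Toy D8: route each cell to its lowest-drop 8-neighbour, then count
    upstream contributing cells (a building block of watershed delineation)."""
    rows, cols = len(dem), len(dem[0])
    nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
            (0, 1), (1, -1), (1, 0), (1, 1)]
    downstream = {}
    for r in range(rows):
        for c in range(cols):
            best, drop = None, 0.0
            for dr, dc in nbrs:
                rr, cc = r + dr, c + dc
                if 0 <= rr < rows and 0 <= cc < cols:
                    d = dem[r][c] - dem[rr][cc]
                    if d > drop:
                        best, drop = (rr, cc), d
            downstream[(r, c)] = best
    acc = {cell: 1 for cell in downstream}
    # Visit cells from highest to lowest so donors finish before receivers.
    for cell in sorted(downstream, key=lambda p: -dem[p[0]][p[1]]):
        if downstream[cell] is not None:
            acc[downstream[cell]] += acc[cell]
    return acc

# Tilted 3x3 surface draining toward the low corner (2, 2).
dem = [[9, 8, 7],
       [8, 7, 6],
       [7, 6, 1]]
acc = d8_flow_accumulation(dem)
print(acc[(2, 2)])  # all 9 cells drain through the outlet
```

    In the cloud setup described above, this kind of analysis would run server-side as part of the SaaS layer, with the DEM served from the DaaS layer.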

  8. Security Problems in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Rola Motawie

    2016-12-01

    Full Text Available Cloud is a pool of computing resources distributed among cloud users. Cloud computing has many benefits, such as scalability, flexibility, cost savings, reliability, maintenance, and mobile accessibility. Since cloud-computing technology is growing day by day, it comes with many security problems. Securing the data in the cloud environment is one of the most critical challenges and acts as a barrier to implementing the cloud. Cloud computing introduces many new concepts, such as resource sharing, multi-tenancy, and outsourcing, that create new challenges for the security community. In this work, we provide a comparative study of cloud computing privacy and security concerns. We identify and classify known security threats, cloud vulnerabilities, and attacks.

  9. Radiative Importance of Aerosol-Cloud Interaction

    Science.gov (United States)

    Tsay, Si-Chee

    1999-01-01

    Aerosol particles are input into the troposphere by biomass burning, among other sources. These aerosol palls cover large expanses of the Earth's surface. Aerosols may directly scatter solar radiation back to space, thus increasing the Earth's albedo, and act to cool the Earth's surface and atmosphere. Aerosols also contribute to the Earth's energy balance indirectly. Hygroscopic aerosols act as cloud condensation nuclei (CCN) and thus affect cloud properties. In 1977, Twomey theorized that additional available CCN would create smaller but more numerous cloud droplets in a cloud with a given amount of liquid water. This in turn would increase the cloud albedo, which would scatter additional radiation back to space and create a cooling pattern similar to the direct aerosol effect. Estimates of the magnitude of the aerosol indirect effect on a global scale range from 0.0 to -4.8 W/sq m. Thus the indirect effect can be of comparable magnitude and opposite in sign to estimates of global greenhouse gas forcing. Aerosol-cloud interaction is not a one-way process: just as aerosols influence clouds through cloud microphysics, clouds influence aerosols.
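    Twomey's argument can be quantified with a standard two-stream albedo: at fixed liquid water, τ scales as N^(1/3), so the albedo susceptibility is dA/d ln N = A(1−A)/3. The sketch below uses assumed values of τ and the CCN perturbation, chosen only for illustration:

```python
import math

def albedo_from_tau(tau):
    """Two-stream estimate of cloud albedo."""
    return tau / (tau + 7.7)

def twomey_delta_albedo(tau, n_ratio):
    """Albedo change when droplet number is multiplied by n_ratio at fixed
    liquid water path: tau ~ N^(1/3), so dA = A(1-A)/3 * ln(n_ratio)."""
    a = albedo_from_tau(tau)
    return a * (1.0 - a) / 3.0 * math.log(n_ratio)

# A tau = 10 cloud whose droplet number rises 30% from added CCN:
print(twomey_delta_albedo(10.0, 1.3))  # ~0.02 increase in albedo
```

    The A(1−A) factor peaks at A = 0.5, which is why clouds of intermediate albedo are most susceptible to added CCN; over dark oceans a few-percent albedo increase is radiatively significant.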