WorldWideScience

Sample records for high-velocity clouds

  1. Supernovae-generated high-velocity compact clouds

    Science.gov (United States)

    Yalinewich, A.; Beniamini, P.

    2018-05-01

    Context. A previous study claimed the discovery of an intermediate-mass black hole (IMBH). This hypothetical black hole was invoked in order to explain the high velocity dispersion in one of several dense molecular clouds near the Galactic center. The same study considered the possibility that this cloud was due to a supernova explosion, but disqualified this scenario because no X-rays were detected. Aims: We here check whether a supernova explosion could have produced that cloud, and whether this explanation is more likely than an IMBH. More specifically, we wish to determine whether a supernova inside a dense molecular cloud would emit in the X-rays. Methods: We approached this problem from two directions. First, we performed an analytic calculation of the thermal bremsstrahlung cooling time and compared it to the lifetime of the cloud. Second, we estimated the creation rate of these dense clouds in the central molecular zone (CMZ) region near the Galactic center, where they were observed. Based on this rate, we place lower bounds on the total mass of IMBHs and clouds and compare these to the masses of the components of the CMZ. Results: We find that the cooling time of the supernova remnant inside a molecular cloud is shorter than its dynamical time. This means that the temperature in such a remnant would be much lower than that of a typical supernova remnant. At such a low temperature, the remnant is not expected to emit in the X-rays. We also find that explaining the rate at which such dense clouds are created requires fine-tuning the number of IMBHs. Conclusions: We find the supernova model to be a more likely explanation for the formation of high-velocity compact clouds than an IMBH.
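    The cooling-time comparison at the heart of the Methods can be sketched numerically. This is an illustrative order-of-magnitude check, not the paper's calculation; the density, temperature, cloud radius, and velocity below are assumed values.

    ```python
    # Compare the free-free (bremsstrahlung) cooling time of a hot remnant
    # inside a dense molecular cloud with the cloud's dynamical (crossing) time.
    # Assumed illustrative values: n = 1e4 cm^-3, T = 1e7 K, R = 1 pc, v = 100 km/s.
    import math

    K_B = 1.38e-16          # Boltzmann constant, erg/K
    PC = 3.086e18           # parsec in cm
    YR = 3.15e7             # year in s

    def t_cool_brems(n, T):
        """Free-free cooling time (s) for a fully ionized hydrogen plasma.
        Emissivity ~ 1.4e-27 * sqrt(T) * n_e * n_i erg cm^-3 s^-1."""
        thermal_energy = 3.0 * n * K_B * T           # electrons + ions, erg/cm^3
        emissivity = 1.4e-27 * math.sqrt(T) * n * n  # erg/cm^3/s
        return thermal_energy / emissivity

    def t_dyn(R_pc, v_kms):
        """Dynamical (crossing) time in seconds."""
        return R_pc * PC / (v_kms * 1e5)

    tc = t_cool_brems(1e4, 1e7)
    td = t_dyn(1.0, 100.0)
    print(f"t_cool ~ {tc / YR:.0f} yr, t_dyn ~ {td / YR:.0f} yr")
    # For these dense-cloud values t_cool < t_dyn: the remnant cools before it
    # can shine in X-rays, consistent with the abstract's conclusion.
    ```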

  2. SIMULATIONS OF HIGH-VELOCITY CLOUDS. I. HYDRODYNAMICS AND HIGH-VELOCITY HIGH IONS

    International Nuclear Information System (INIS)

    Kwak, Kyujin; Henley, David B.; Shelton, Robin L.

    2011-01-01

    We present hydrodynamic simulations of high-velocity clouds (HVCs) traveling through the hot, tenuous medium in the Galactic halo. A suite of models was created using the FLASH hydrodynamics code, sampling various cloud sizes, densities, and velocities. In all cases, the cloud-halo interaction ablates material from the clouds. The ablated material falls behind the clouds, where it mixes with the ambient medium to produce intermediate-temperature gas, some of which radiatively cools to less than 10,000 K. Using a non-equilibrium ionization algorithm, we track the ionization levels of carbon, nitrogen, and oxygen in the gas throughout the simulation period. We present observation-related predictions, including the expected H I and high ion (C IV, N V, and O VI) column densities on sightlines through the clouds as functions of evolutionary time and off-center distance. The predicted column densities overlap those observed for Complex C. The observations are best matched by clouds that have interacted with the Galactic environment for tens to hundreds of megayears. Given the large distances across which the clouds would travel during such time, our results are consistent with Complex C having an extragalactic origin. The destruction of HVCs is also of interest; the smallest cloud (initial mass ∼120 M☉) lost most of its mass during the simulation period (60 Myr), while the largest cloud (initial mass ∼4 × 10⁵ M☉) remained largely intact, although deformed, during its simulation period (240 Myr).

  3. Distances, metallicities and origins of high-velocity clouds

    NARCIS (Netherlands)

    van Woerden, H; Wakker, BP; Peletier, RF; Schwarz, UJ; KraanKorteweg, RC; Henning, PA; Andernach, H

    2000-01-01

    A review is given of distances of high-velocity clouds (HVCs) derived from absorption-line measurements, and of the metallicities of HVCs. Chain A definitely lies in the Galactic halo, between 2.5 and 7 kpc above the plane. The distance limits available for other HVCs allow a variety of locations:

  4. MAGNETIZED GAS IN THE SMITH HIGH VELOCITY CLOUD

    International Nuclear Information System (INIS)

    Hill, Alex S.; McClure-Griffiths, Naomi M.; Mao, S. A.; Benjamin, Robert A.; Lockman, Felix J.

    2013-01-01

    We report the first detection of magnetic fields associated with the Smith High Velocity Cloud. We use a catalog of Faraday rotation measures toward extragalactic radio sources behind the Smith Cloud, new H I observations from the Robert C. Byrd Green Bank Telescope, and a spectroscopic map of Hα from the Wisconsin H-Alpha Mapper Northern Sky Survey. There are enhancements in rotation measure (RM) of ≈100 rad m⁻² which are generally well correlated with decelerated Hα emission. We estimate a lower limit on the line-of-sight component of the field of ≈8 μG along a decelerated filament; this is a lower limit due to our assumptions about the geometry. No RM excess is evident in sightlines dominated by H I or Hα at the velocity of the Smith Cloud. The smooth Hα morphology of the emission at the Smith Cloud velocity suggests photoionization by the Galactic ionizing radiation field as the dominant ionization mechanism, while the filamentary morphology and high (≈1 Rayleigh) Hα intensity of the lower-velocity magnetized ionized gas suggest an ionization process associated with shocks due to interaction with the Galactic interstellar medium. The presence of the magnetic field may contribute to the survival of high velocity clouds like the Smith Cloud as they move from the Galactic halo to the disk. We expect these data to provide a test for magnetohydrodynamic simulations of infalling gas.
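    A field estimate of this kind follows from the standard rotation-measure relation, RM = 0.81 n_e B_∥ L (in rad m⁻², cm⁻³, μG, pc). A minimal sketch, with assumed electron density and path length (the paper's actual geometry and densities differ):

    ```python
    # Line-of-sight magnetic field implied by a Faraday rotation measure,
    # using RM = 0.81 * n_e * B_parallel * L. The n_e and L values below are
    # illustrative assumptions, not values from the paper.

    def b_parallel(rm, n_e, L_pc):
        """Line-of-sight field (microgauss) for a rotation measure rm
        (rad m^-2) through a slab of density n_e (cm^-3) and depth L_pc (pc)."""
        return rm / (0.81 * n_e * L_pc)

    rm_excess = 100.0   # rad m^-2, the enhancement reported toward the filament
    n_e = 0.3           # cm^-3 (assumed)
    L = 50.0            # pc (assumed)
    print(f"B_parallel ~ {b_parallel(rm_excess, n_e, L):.1f} uG")
    ```

    With these inputs the estimate lands near the quoted ≈8 μG; a clumpier medium or longer path would lower it, which is why the paper treats it as a lower limit.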

  5. DISTRIBUTION AND ORIGIN OF HIGH-VELOCITY CLOUDS .3. CLOUDS, COMPLEXES AND POPULATIONS

    NARCIS (Netherlands)

    WAKKER, BP; VANWOERDEN, H

    1991-01-01

    We present the first complete catalogue of high-velocity clouds (HVCs), followed by a classification of these clouds into complexes and populations. The catalogue will form the basis for comparisons with theoretical models. The study described here yields the following conclusions: (1) Differential

  6. Galactic hail: the origin of the high-velocity cloud complex C

    NARCIS (Netherlands)

    Fraternali, F.; Marasco, A.; Armillotta, L.; Marinacci, F.

    High-velocity clouds consist of cold gas that appears to be raining down from the halo to the disc of the Milky Way. Over the past 50 years, two competing scenarios have attributed their origin either to gas accretion from outside the Galaxy or to circulation of gas from the Galactic disc powered by

  7. Complex C: A Low-Metallicity, High-Velocity Cloud Plunging into the Milky Way

    Science.gov (United States)

    Tripp, Todd M.; Wakker, Bart P.; Jenkins, Edward B.; Bowers, C. W.; Danks, A. C.; Green, R. F.; Heap, S. R.; Joseph, C. L.; Kaiser, M. E.; Linsky, J. L.; Woodgate, B. E.

    2003-06-01

    We present evidence that high-velocity cloud (HVC) complex C is a low-metallicity gas cloud that is plunging toward the disk and beginning to interact with the ambient gas that surrounds the Milky Way. This evidence begins with a new high-resolution (7 km s⁻¹ FWHM) echelle spectrum of 3C 351 obtained with the Space Telescope Imaging Spectrograph (STIS). 3C 351 lies behind the low-latitude edge of complex C, and the new spectrum provides accurate measurements of O I, Si II, Al II, Fe II, and Si III absorption lines at the velocity of complex C; N I, S II, Si IV, and C IV are not detected at 3 σ significance in complex C proper. However, Si IV and C IV as well as O I, Al II, Si II, and Si III absorption lines are clearly present at somewhat higher velocities associated with a “high-velocity ridge” (HVR) of 21 cm emission. This high-velocity ridge has a morphology similar to, and is roughly centered on, complex C proper. The similarities of the absorption-line ratios in the HVR and complex C suggest that these structures are intimately related. In complex C proper we find [O/H] = −0.76 (+0.23/−0.21). For other species the measured column densities indicate that ionization corrections are important. We use collisional and photoionization models to derive ionization corrections; in both models we find that the overall metallicity Z = 0.1–0.3 Z☉ in complex C proper, but nitrogen must be underabundant. The iron abundance indicates that complex C contains very little dust. The size and density implied by the ionization models indicate that the absorbing gas is not gravitationally confined. The gas could be pressure confined by an external medium, but alternatively we may be viewing the leading edge of the HVC, which is ablating and dissipating as it plunges into the Milky Way.
O VI column densities observed with the Far Ultraviolet Spectroscopic Explorer (FUSE) toward nine QSOs/AGNs behind complex C support this conclusion: N(O VI) is highest near 3C 351, and the O VI/H I

  8. Remote Numerical Simulations of the Interaction of High Velocity Clouds with Random Magnetic Fields

    Science.gov (United States)

    Santillan, Alfredo; Hernandez--Cervantes, Liliana; Gonzalez--Ponce, Alejandro; Kim, Jongsoo

    Numerical simulations of the interaction of high velocity clouds (HVCs) with the magnetized Galactic interstellar medium (ISM) are a powerful tool for describing the evolution of these objects in our Galaxy. In this work we present a new project, referred to as the Theoretical Virtual Observatory, oriented toward performing numerical simulations in real time through a Web page. This is a powerful astrophysical computational tool that consists of an intuitive graphical user interface (GUI) and a database produced by numerical calculations. On this website the user can make use of the existing numerical simulations from the database or run a new simulation, introducing initial conditions such as temperatures, densities, velocities, and magnetic field intensities for both the ISM and the HVC. The prototype is programmed using Linux, Apache, MySQL, and PHP (LAMP), following the open source philosophy. All simulations were performed with the MHD code ZEUS-3D, which solves the ideal MHD equations by finite differences on a fixed Eulerian mesh. Finally, we present typical results that can be obtained with this tool.

  9. Rotational explanation of the high-velocity molecular emission from the Orion Molecular Cloud

    International Nuclear Information System (INIS)

    Clark, F.O.; Biretta, J.A.; Martin, H.M.

    1979-01-01

    The high-velocity molecular emission of the Orion Molecular Cloud has been sampled using the J_N = 2_2–1_1 rotational spectral line of the SO molecule. The resulting profile, including the high-velocity wings, has been reproduced using only known large-scale properties of the gas and applications of the results of published theoretical calculations. No new physical mechanism is required; observed rotation and conservation of angular momentum are sufficient to reproduce the line profile. The resulting physical state appears to be consistent with all known physical properties. This solution is not unique, but indicates the strengths and weaknesses of such a model for interpretation of Orion as well as the similarities of alternative explanations.

  10. Searching for Dark Matter Annihilation in the Smith High-Velocity Cloud

    Science.gov (United States)

    Drlica-Wagner, Alex; Gomez-Vargas, German A.; Hewitt, John W.; Linden, Tim; Tibaldo, Luigi

    2014-01-01

    Recent observations suggest that some high-velocity clouds may be confined by massive dark matter halos. In particular, the proximity and proposed dark matter content of the Smith Cloud make it a tempting target for the indirect detection of dark matter annihilation. We argue that the Smith Cloud may be a better target than some Milky Way dwarf spheroidal satellite galaxies and use gamma-ray observations from the Fermi Large Area Telescope to search for a dark matter annihilation signal. No significant gamma-ray excess is found coincident with the Smith Cloud, and we set strong limits on the dark matter annihilation cross section assuming a spatially extended dark matter profile consistent with dynamical modeling of the Smith Cloud. Notably, these limits exclude the canonical thermal relic cross section (∼3 × 10⁻²⁶ cm³ s⁻¹) for dark matter masses ≲30 GeV annihilating via the b b̄ or τ⁺τ⁻ channels for certain assumptions of the dark matter density profile; however, uncertainties in the dark matter content of the Smith Cloud may significantly weaken these constraints.

  11. Searching for dark matter annihilation in the Smith high-velocity cloud

    International Nuclear Information System (INIS)

    Drlica-Wagner, Alex; Gómez-Vargas, Germán A.; Hewitt, John W.; Linden, Tim; Tibaldo, Luigi

    2014-01-01

    Recent observations suggest that some high-velocity clouds may be confined by massive dark matter halos. In particular, the proximity and proposed dark matter content of the Smith Cloud make it a tempting target for the indirect detection of dark matter annihilation. We argue that the Smith Cloud may be a better target than some Milky Way dwarf spheroidal satellite galaxies and use γ-ray observations from the Fermi Large Area Telescope to search for a dark matter annihilation signal. No significant γ-ray excess is found coincident with the Smith Cloud, and we set strong limits on the dark matter annihilation cross section assuming a spatially extended dark matter profile consistent with dynamical modeling of the Smith Cloud. Notably, these limits exclude the canonical thermal relic cross section (∼3 × 10⁻²⁶ cm³ s⁻¹) for dark matter masses ≲30 GeV annihilating via the b b̄ or τ⁺τ⁻ channels for certain assumptions of the dark matter density profile; however, uncertainties in the dark matter content of the Smith Cloud may significantly weaken these constraints.

  12. Searching for dark matter annihilation in the Smith high-velocity cloud

    Energy Technology Data Exchange (ETDEWEB)

    Drlica-Wagner, Alex [Center for Particle Astrophysics, Fermi National Accelerator Laboratory, Batavia, IL 60510 (United States); Gómez-Vargas, Germán A. [Departamento de Física, Pontificia Universidad Católica de Chile, Avenida Vicuña Mackenna 4860, Santiago (Chile); Hewitt, John W. [CRESST, University of Maryland, Baltimore County, Baltimore, MD 21250 (United States); Linden, Tim [The Kavli Institute for Cosmological Physics, University of Chicago, Chicago, IL 60637 (United States); Tibaldo, Luigi [W. W. Hansen Experimental Physics Laboratory, Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics and SLAC National Accelerator Laboratory, Stanford University, Stanford, CA 94305 (United States)

    2014-07-20

    Recent observations suggest that some high-velocity clouds may be confined by massive dark matter halos. In particular, the proximity and proposed dark matter content of the Smith Cloud make it a tempting target for the indirect detection of dark matter annihilation. We argue that the Smith Cloud may be a better target than some Milky Way dwarf spheroidal satellite galaxies and use γ-ray observations from the Fermi Large Area Telescope to search for a dark matter annihilation signal. No significant γ-ray excess is found coincident with the Smith Cloud, and we set strong limits on the dark matter annihilation cross section assuming a spatially extended dark matter profile consistent with dynamical modeling of the Smith Cloud. Notably, these limits exclude the canonical thermal relic cross section (∼3 × 10⁻²⁶ cm³ s⁻¹) for dark matter masses ≲30 GeV annihilating via the b b̄ or τ⁺τ⁻ channels for certain assumptions of the dark matter density profile; however, uncertainties in the dark matter content of the Smith Cloud may significantly weaken these constraints.

  13. High-energy radiation from collisions of high-velocity clouds and the Galactic disc

    Science.gov (United States)

    del Valle, Maria V.; Müller, A. L.; Romero, G. E.

    2018-04-01

    High-velocity clouds (HVCs) are interstellar clouds of atomic hydrogen that do not follow normal Galactic rotation and have velocities of several hundred kilometres per second. A considerable number of these clouds are falling down towards the Galactic disc. HVCs form large and massive complexes, so if they collide with the disc a great amount of energy is released into the interstellar medium. The cloud-disc interaction produces two shocks: one propagates through the cloud and the other through the disc. The properties of these shocks depend mainly on the cloud velocity and the disc-cloud density ratio. In this work, we study the conditions necessary for these shocks to accelerate particles by diffusive shock acceleration, and we study the non-thermal radiation that is produced. We analyse particle acceleration in both the cloud and disc shocks. Solving a time-dependent two-dimensional transport equation for both relativistic electrons and protons, we obtain particle distributions and non-thermal spectral energy distributions. In a shocked cloud, significant synchrotron radio emission is produced along with soft gamma rays. In the case of acceleration in the shocked disc, the non-thermal radiation is stronger; the gamma rays, of leptonic origin, might be detectable with current instruments. A large number of protons are injected into the Galactic interstellar medium, and locally exceed the cosmic ray background. We conclude that under adequate conditions the contribution from HVC-disc collisions to the Galactic population of relativistic particles and the associated extended non-thermal radiation might be important.
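    The way the two shock speeds scale with the disc-cloud density ratio can be sketched with a simple ram-pressure balance at the contact surface; this is a textbook strong-shock estimate, not the authors' model, and the input values are assumed:

    ```python
    # Ram-pressure balance at the cloud-disc contact discontinuity:
    #   n_cloud * (v - v_s)^2 = n_disc * v_s^2
    # => v_s = v / (1 + sqrt(n_disc / n_cloud)),
    # where v_s is the speed of the shock driven into the disc, and the
    # remainder v - v_s drives the reverse shock back into the cloud.
    import math

    def shock_speeds(v_kms, n_cloud, n_disc):
        """Return (disc-shock speed, cloud-shock speed) in km/s for an HVC
        of density n_cloud hitting a disc of density n_disc at v_kms."""
        v_disc_shock = v_kms / (1.0 + math.sqrt(n_disc / n_cloud))
        v_cloud_shock = v_kms - v_disc_shock
        return v_disc_shock, v_cloud_shock

    # Assumed: 300 km/s infall, disc ten times denser than the cloud.
    vd, vc = shock_speeds(300.0, 0.1, 1.0)
    print(f"disc shock ~{vd:.0f} km/s, cloud shock ~{vc:.0f} km/s")
    ```

    The denser medium hosts the slower shock, which is why the shocked disc and shocked cloud accelerate particles under quite different conditions.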

  14. Cool C-shocks and high-velocity flows in molecular clouds

    International Nuclear Information System (INIS)

    Smith, M.D.; Brand, P.W.J.L.

    1990-01-01

    C-shocks can be driven through dense clouds when the neutrals and magnetic field interact weakly due to a paucity of ions. We develop a method for calculating C-shock properties with the aim of interpreting the observed high-velocity molecular hydrogen. A high Mach number approximation, corresponding to low temperatures, is employed. Under strong cooling conditions the flow is continuous even though a subsonic region may be present downstream. Analytic expressions for the maximum temperature, dissociation fraction, self-ionization level and J-shock transition are derived. (author)

  15. HIGH-RESOLUTION OBSERVATIONS AND THE PHYSICS OF HIGH-VELOCITY CLOUD A0

    International Nuclear Information System (INIS)

    Verschuur, Gerrit L.

    2013-01-01

    The neutral hydrogen structure of high-velocity cloud A0 (at about −180 km s⁻¹) has been mapped with a 9.′1 resolution. Gaussian decomposition of the profiles is used to separately map families of components defined by similarities in center velocities and line widths. About 70% of the H I gas is in the form of a narrow, twisted filament whose typical line widths are of the order of 24 km s⁻¹. Many bright features with narrow line widths of the order of 6 km s⁻¹ (“clouds”) are located in and near the filament. A third category with properties between those of the filament and clouds appears in the data. The clouds are not always co-located with the broader line width filament emission as seen projected on the sky. Under the assumption that magnetic fields underlie the presence of the filament, a theorem is developed for its stability in terms of a toroidal magnetic field generated by the flow of gas along field lines. It is suggested that the axial magnetic field strength may be derived from the excess line width of the H I emission over and above that due to kinetic temperature by invoking the role of Alfvén waves that create what is in essence a form of magnetic turbulence. At a distance of 200 pc the axial and the derived toroidal magnetic field strengths in the filament are then about 6 μG, while for the clouds they are about 4 μG. The dependence of the derived field strength on distance is discussed.
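    A field estimate of this kind can be sketched by equating the non-thermal velocity dispersion to the Alfvén speed, v_A = B/√(4πρ). The thermal width and volume density below are assumptions for illustration (they are not taken from the paper), so the result only roughly brackets the quoted ∼6 μG:

    ```python
    # Magnetic field from the excess (non-thermal) part of an H I line width,
    # assuming sigma_nonthermal ~ v_A = B / sqrt(4*pi*rho).
    import math

    M_H = 1.67e-24   # hydrogen mass, g

    def b_from_linewidth(fwhm_kms, fwhm_thermal_kms, n_h):
        """Field strength (microgauss) implied by the non-thermal part of an
        H I line width for gas of density n_h (cm^-3)."""
        excess = max(fwhm_kms**2 - fwhm_thermal_kms**2, 0.0)
        sigma_nt = math.sqrt(excess) / 2.355 * 1e5        # FWHM -> sigma, cm/s
        return math.sqrt(4 * math.pi * n_h * M_H) * sigma_nt * 1e6  # G -> uG

    # Assumed: 24 km/s observed FWHM, ~16 km/s thermal width, n ~ 10 cm^-3.
    print(f"B ~ {b_from_linewidth(24.0, 16.0, 10.0):.0f} uG")
    ```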

  16. A systematic search for dwarf counterparts to ultra compact high velocity clouds

    Science.gov (United States)

    Bennet, Paul; Sand, David J.; Crnojevic, Denija; Strader, Jay

    2015-01-01

    Observations of the Universe on scales smaller than typical massive galaxies challenge the standard Lambda Cold Dark Matter paradigm for structure formation. It is thus imperative to discover and characterize the faintest dwarf galaxy systems, not just within the Local Group but in relatively isolated environments as well, in order to properly connect them with models of structure formation. Here we report on a systematic search of public ultraviolet and optical archives for dwarf galaxy counterparts to so-called Ultra Compact High Velocity Clouds (UCHVCs), which are compact, isolated HI sources recently found in the Galactic Arecibo L-band Feed Array HI (GALFA-HI) and Arecibo Legacy Fast ALFA (ALFALFA) surveys. Our search has uncovered at least three strong dwarf galaxy candidates, and we present their inferred star formation rates and structural properties here.

  17. A CATALOG OF ULTRA-COMPACT HIGH VELOCITY CLOUDS FROM THE ALFALFA SURVEY: LOCAL GROUP GALAXY CANDIDATES?

    International Nuclear Information System (INIS)

    Adams, Elizabeth A. K.; Giovanelli, Riccardo; Haynes, Martha P.

    2013-01-01

    We present a catalog of 59 ultra-compact high velocity clouds (UCHVCs) extracted from the 40% complete ALFALFA HI-line survey. The ALFALFA UCHVCs have median flux densities of 1.34 Jy km s⁻¹, median angular diameters of 10', and median velocity widths of 23 km s⁻¹. We show that the full UCHVC population cannot easily be associated with known populations of high velocity clouds. Of the 59 clouds presented here, only 11 are also present in the compact cloud catalog extracted from the commensal GALFA-HI survey, demonstrating the utility of this separate dataset and analysis. Based on their sky distribution and observed properties, we infer that the ALFALFA UCHVCs are consistent with the hypothesis that they may be very low mass galaxies within the Local Volume. In that case, most of their baryons would be in the form of gas, and because of their low stellar content, they remain unidentified by extant optical surveys. At distances of ∼1 Mpc, the UCHVCs have neutral hydrogen (H I) masses of ∼10⁵–10⁶ M☉, H I diameters of ∼2–3 kpc, and indicative dynamical masses within the H I extent of ∼10⁷–10⁸ M☉, similar to the Local Group ultra-faint dwarf Leo T. The recent ALFALFA discovery of the star-forming, metal-poor, low mass galaxy Leo P demonstrates that this hypothesis is true in at least one case. In the case of the individual UCHVCs presented here, confirmation of their extragalactic nature will require further work, such as the identification of an optical counterpart to constrain their distance.
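    The quoted mass ranges follow from two standard single-dish relations, sketched here with the catalog's median values at the assumed 1 Mpc distance:

    ```python
    # Standard single-dish HI relations:
    #   M_HI  = 2.36e5 * D^2 * S_int   [M_sun, with D in Mpc, S_int in Jy km/s]
    #   M_dyn = v^2 * R / G            within the H I extent
    # Input values are the catalog medians plus an assumed distance and radius.

    def hi_mass(d_mpc, flux_jy_kms):
        """HI mass (M_sun) from integrated 21 cm flux at distance d_mpc."""
        return 2.36e5 * d_mpc**2 * flux_jy_kms

    def dynamical_mass(v_half_kms, r_kpc):
        """Indicative dynamical mass (M_sun) within radius r_kpc for a
        rotation/dispersion speed of v_half_kms."""
        G = 6.674e-8                        # cgs
        v = v_half_kms * 1e5                # cm/s
        r = r_kpc * 3.086e21                # cm
        return v**2 * r / G / 1.989e33      # g -> M_sun

    m_hi = hi_mass(1.0, 1.34)               # median flux, D = 1 Mpc
    m_dyn = dynamical_mass(23.0 / 2, 1.25)  # half the median width; R ~ half of 2-3 kpc
    print(f"M_HI ~ {m_hi:.1e} M_sun, M_dyn ~ {m_dyn:.1e} M_sun")
    ```

    Both numbers land inside the ∼10⁵–10⁶ M☉ and ∼10⁷–10⁸ M☉ ranges quoted in the abstract.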

  18. A CATALOG OF ULTRA-COMPACT HIGH VELOCITY CLOUDS FROM THE ALFALFA SURVEY: LOCAL GROUP GALAXY CANDIDATES?

    Energy Technology Data Exchange (ETDEWEB)

    Adams, Elizabeth A. K.; Giovanelli, Riccardo; Haynes, Martha P., E-mail: betsey@astro.cornell.edu, E-mail: riccardo@astro.cornell.edu, E-mail: haynes@astro.cornell.edu [Center for Radiophysics and Space Research, Space Sciences Building, Cornell University, Ithaca, NY 14853 (United States)

    2013-05-01

    We present a catalog of 59 ultra-compact high velocity clouds (UCHVCs) extracted from the 40% complete ALFALFA HI-line survey. The ALFALFA UCHVCs have median flux densities of 1.34 Jy km s⁻¹, median angular diameters of 10', and median velocity widths of 23 km s⁻¹. We show that the full UCHVC population cannot easily be associated with known populations of high velocity clouds. Of the 59 clouds presented here, only 11 are also present in the compact cloud catalog extracted from the commensal GALFA-HI survey, demonstrating the utility of this separate dataset and analysis. Based on their sky distribution and observed properties, we infer that the ALFALFA UCHVCs are consistent with the hypothesis that they may be very low mass galaxies within the Local Volume. In that case, most of their baryons would be in the form of gas, and because of their low stellar content, they remain unidentified by extant optical surveys. At distances of ∼1 Mpc, the UCHVCs have neutral hydrogen (H I) masses of ∼10⁵–10⁶ M☉, H I diameters of ∼2–3 kpc, and indicative dynamical masses within the H I extent of ∼10⁷–10⁸ M☉, similar to the Local Group ultra-faint dwarf Leo T. The recent ALFALFA discovery of the star-forming, metal-poor, low mass galaxy Leo P demonstrates that this hypothesis is true in at least one case. In the case of the individual UCHVCs presented here, confirmation of their extragalactic nature will require further work, such as the identification of an optical counterpart to constrain their distance.

  19. THE FIRST DISTANCE CONSTRAINT ON THE RENEGADE HIGH-VELOCITY CLOUD COMPLEX WD

    Energy Technology Data Exchange (ETDEWEB)

    Peek, J. E. G.; Roman-Duval, Julia; Tumlinson, Jason [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Bordoloi, Rongmon [MIT-Kavli Center for Astrophysics and Space Research, 77 Massachusetts Avenue, Cambridge, MA 02139 (United States); Sana, Hugues [Institute of Astronomy, KU Leuven, Celestijnenlaan 200 D, B-3001 Leuven (Belgium); Zheng, Yong [Department of Astronomy, Columbia University, New York, NY 10027 (United States)

    2016-09-10

    We present medium-resolution, near-ultraviolet Very Large Telescope/FLAMES observations of the star USNO-A0600-15865535. We adapt a standard method of stellar typing to our measurement of the shape of the Balmer ϵ absorption line to demonstrate that USNO-A0600-15865535 is a blue horizontal branch star, residing in the lower stellar halo at a distance of 4.4 kpc from the Sun. We measure the H and K lines of singly ionized calcium and find two isolated velocity components, one originating in the disk and one associated with the high-velocity cloud complex WD. This detection demonstrates that complex WD is closer than ∼4.4 kpc, the first distance constraint on this +100 km s⁻¹ Galactic complex of clouds. We find that complex WD is not in corotation with the Galactic disk, which had been assumed for decades. We examine a number of scenarios and find that the most likely is that complex WD was ejected from the solar neighborhood and is only a few kiloparsecs from the Sun.

  20. A High-velocity Cloud Impact Forming a Supershell in the Milky Way

    Science.gov (United States)

    Park, Geumsook; Koo, Bon-Chul; Kang, Ji-hyun; Gibson, Steven J.; Peek, J. E. G.; Douglas, Kevin A.; Korpela, Eric J.; Heiles, Carl E.

    2016-08-01

    Neutral atomic hydrogen (H I) gas in interstellar space is largely organized into filaments, loops, and shells, the most prominent of which are “supershells.” These gigantic structures, which require ≳3 × 10⁵² erg to form, are generally thought to be produced by either the explosion of multiple supernovae (SNe) in OB associations or, alternatively, by the impact of high-velocity clouds (HVCs) falling into the Galactic disk. Here, we report the detection of a kiloparsec (kpc)-size supershell in the outskirts of the Milky Way with the compact HVC 040+01−282 (hereafter, CHVC040) at its geometrical center, using the “Inner-Galaxy Arecibo L-band Feed Array” H I 21 cm survey data. The morphological and physical properties of both objects suggest that CHVC040, which is either a fragment of a nearby disrupted galaxy or a cloud that originated from an intergalactic accreting flow, collided with the disk ∼5 Myr ago to form the supershell. Our results show that some compact HVCs can survive their trip through the Galactic halo and inject energy and momentum into the Milky Way disk.

  1. Evolution of star-bearing molecular clouds: the high-velocity HCO+ flow in NGC 2071

    International Nuclear Information System (INIS)

    Wootten, A.; Loren, R.B.; Sandqvist, A.; Friberg, P.; Hjalmarson, Aa.

    1984-01-01

    The J = 1–0 and J = 3–2 lines of HCO⁺ and H¹³CO⁺ have been observed in the molecular cloud NGC 2071, where they map the dense portions of a bidirectional molecular flow. The high resolution (42'') of our observations has enabled us to determine the distribution of mass, momentum, and energy in the flow as a function of projected distance from the cluster. Both momentum and energy diminish with distance from the central cluster of infrared sources. The highest velocities at a given intensity in this dense flow occur in a limited region coincident with an infrared cluster and the densest part of the molecular cloud. Higher resolution (33'') CO and ¹³CO observations reveal that the extreme velocities in the flow occur in regions displaced on opposite sides of the cluster, suggesting that the flow only becomes visible in molecular line emission at distances ≈0.1 pc from its supposed source. Lower velocity material containing most of the mass of the flow is found over larger regions, as expected if the flow has decelerated as it has evolved. Assuming conservation of momentum, the historical rate of momentum injection is found to have been roughly constant over a period of 10⁴ years, suggesting a constancy of the average luminosity of the central cluster over that time. The J = 3–2 HCO⁺ profile does not show the absorption which is a prominent feature of the J = 1–0 profile, and the J = 3–2 line appears to be a useful probe of conditions specific to the dense cores of clouds. The high velocity HCO⁺ emission correlates very well with spatial and velocity events of molecular hydrogen emission. The abundance of HCO⁺ [X(HCO⁺) ≈ 10⁻⁸], and by inference the electron density, is similar in material at all velocities.

  2. A Discovery of a Compact High Velocity Cloud-Galactic Supershell System

    Science.gov (United States)

    Park, Geumsook; Koo, Bon-Chul; Kang, Ji-hyun; Gibson, Steven J.; Peek, Joshua Eli Goldston; Douglas, Kevin A.; Korpela, Eric J.; Heiles, Carl E.

    2017-01-01

    High velocity clouds (HVCs) are neutral hydrogen (HI) gas clouds having very different radial velocities from those of the Galactic disk material. While some large HVC complexes are known to be gas streams tidally stripped from satellite galaxies of the Milky Way, there are also relatively isolated HVCs of small angular size, so-called “compact HVCs (CHVCs),” whose origin remains controversial. There are about 300 known CHVCs in the Milky Way, and many of them show a head-tail structure, implying a ram pressure interaction with the diffuse Galactic halo gas. It is, however, not clear whether CHVCs are completely dissipated in the Galactic halo to feed the multi-phase circumgalactic medium or whether they can survive their trip through the halo and collide with the Galactic disk. A colliding CHVC may leave a gigantic trail in the disk, and it has been suggested that some of the HI supershells that require ≳3 × 10⁵² erg may be produced by the collision of such HVCs. Here we report the detection of a kiloparsec (kpc)-size supershell in the outskirts of the Milky Way with the compact HVC 040+01−282 (hereafter, CHVC040) at its geometrical center, using the “Inner-Galaxy Arecibo L-band Feed Array” HI 21 cm survey data. The morphological and physical properties of both objects suggest that CHVC040, which is either a fragment of a nearby disrupted galaxy or a cloud that originated from an intergalactic accreting flow, collided with the disk ∼5 Myr ago to form the supershell. Our results show that some compact HVCs can survive their trip through the Galactic halo and inject energy and momentum into the Milky Way disk.

  3. A new all-sky map of Galactic high-velocity clouds from the 21-cm HI4PI survey

    Science.gov (United States)

    Westmeier, Tobias

    2018-02-01

    High-velocity clouds (HVCs) are neutral or ionized gas clouds in the vicinity of the Milky Way that are characterized by high radial velocities inconsistent with participation in the regular rotation of the Galactic disc. Previous attempts to create a homogeneous all-sky H I map of HVCs have been hampered by a combination of poor angular resolution, limited surface brightness sensitivity and suboptimal sampling. Here, a new and improved H I map of Galactic HVCs based on the all-sky HI4PI survey is presented. The new map is fully sampled and provides significantly better angular resolution (16.2 versus 36 arcmin) and column density sensitivity (2.3 versus 3.7 × 10¹⁸ cm⁻² at the native resolution) than the previously available LAB survey. The new HVC map resolves many of the major HVC complexes in the sky into an intricate network of narrow H I filaments and clumps that were not previously resolved by the LAB survey. The resulting sky coverage fraction of high-velocity H I emission above a column density level of 2 × 10¹⁸ cm⁻² is approximately 15 per cent, which reduces to about 13 per cent when the Magellanic Clouds and other non-HVC emission are removed. The differential sky coverage fraction as a function of column density obeys a truncated power law with an exponent of -0.93 and a turnover point at about 5 × 10¹⁹ cm⁻². H I column density and velocity maps of the HVC sky are made publicly available as FITS images for scientific use by the community.
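The quoted power-law behaviour is straightforward to work with numerically. A minimal sketch, assuming an exponentially truncated power-law form; only the exponent (-0.93), the turnover column density (~5 × 10¹⁹ cm⁻²), and the 15 per cent coverage at 2 × 10¹⁸ cm⁻² come from the abstract, while the exact functional form of the truncation is an assumption for illustration:

```python
import math

# Hedged sketch: differential sky coverage fraction of high-velocity HI
# emission modeled as a truncated power law. The exponent, turnover, and
# normalization are taken from the abstract; the exponential cutoff shape
# is an illustrative assumption.
ALPHA = -0.93                  # power-law exponent
N_TURN = 5.0e19                # cm^-2, turnover column density
N_REF, F_REF = 2.0e18, 0.15    # 15% sky coverage above 2e18 cm^-2

def sky_coverage(n_hi):
    """Approximate fraction of sky with HVC emission above column density n_hi."""
    shape = (n_hi / N_REF) ** ALPHA * math.exp(-(n_hi - N_REF) / N_TURN)
    return F_REF * shape

# Coverage drops steeply with the column density threshold:
for n in (2e18, 1e19, 5e19, 2e20):
    print(f"N_HI > {n:.0e} cm^-2 : {sky_coverage(n):.4f}")
```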

  4. Ultra-compact high velocity clouds in the ALFALFA HI survey: Candidate Local Group galaxies?

    Science.gov (United States)

    Adams, Elizabeth Ann Kovenz

    The increased sensitivity and spatial resolution of the ALFALFA HI survey has resulted in the detection of ultra-compact high velocity clouds (UCHVCs). These objects are good candidates to represent low-mass gas-rich galaxies in the Local Group and Local Volume with stellar populations that are too faint to be detected in extant optical surveys. This idea is referred to as the "minihalo hypothesis". We identify the UCHVCs within the ALFALFA dataset using a 3D matched-filtering signal identification algorithm. UCHVCs are selected based on compact angular size, high radial velocity (> 120 km s⁻¹), and isolation. Within the 40% complete ALFALFA survey (α.40), 59 UCHVCs are identified; 19 are in a most-isolated subset and are the best galaxy candidates. Due to the presence of large HVC complexes in the fall sky, most notably the Magellanic Stream, the association of UCHVCs with existing structure cannot be ruled out there. In the spring sky, the spatial and kinematic distribution of the UCHVCs is consistent with simulations of dark matter halos within the Local Group. In addition, the HI properties of the UCHVCs (if placed at 1 Mpc) are consistent with both theoretical and observational predictions for low-mass gas-rich galaxies. Importantly, the HI properties of the UCHVCs are consistent with those of two recently discovered low-mass gas-rich galaxies in the Local Group and Local Volume, Leo T and Leo P. Detailed follow-up observations are key for addressing the minihalo hypothesis. High-resolution HI observations can constrain the environment of a UCHVC and offer evidence for a hosting dark matter halo through signs of rotational support and comparison to theoretical models. Observations of one UCHVC at high resolution (15'') reveal the presence of a clumpy HI distribution, similar to both low-mass galaxies and circumgalactic compact HVCs. An extended envelope containing ∼50% of the HI flux is resolved out by the array configuration; observations at lower spatial resolution can recover

  5. ULTRA-COMPACT HIGH VELOCITY CLOUDS AS MINIHALOS AND DWARF GALAXIES

    Energy Technology Data Exchange (ETDEWEB)

    Faerman, Yakov; Sternberg, Amiel [Raymond and Beverly Sackler School of Physics and Astronomy, Tel Aviv University, Ramat Aviv 69978 (Israel); McKee, Christopher F., E-mail: yakovfae@post.tau.ac.il [Department of Physics and Department of Astronomy, University of California at Berkeley, Berkeley, CA 94720 (United States)

    2013-11-10

    We present dark matter minihalo models for the Ultra-Compact, High-Velocity H I Clouds (UCHVCs) recently discovered in the 21 cm ALFALFA survey. We assume gravitational confinement of 10⁴ K H I gas by flat-cored dark-matter subhalos within the Local Group. We show that for flat cores, typical (median) tidally stripped cosmological subhalos at redshift z = 0 have dark-matter masses of ∼10⁷ M☉ within the central 300 pc (independent of total halo mass), consistent with the 'Strigari mass scale' observed in low-luminosity dwarf galaxies. Flat-cored subhalos also resolve the mass discrepancy between simulated and observed satellites around the Milky Way. For the UCHVCs, we calculate the photoionization-limited hydrostatic gas profiles for any distance-dependent total observed H I mass and predict the associated (projected) H I half-mass radii, assuming the clouds are embedded in distant (d ≳ 300 kpc) and unstripped subhalos. For a typical UCHVC (0.9 Jy km s⁻¹), we predict physical H I half-mass radii of 0.18 to 0.35 kpc (or angular sizes of 0.'6 to 2.'1) for distances ranging from 300 kpc to 2 Mpc. As a consistency check, we model the gas-rich dwarf galaxy Leo T, for which there is a well-resolved H I column density profile and a known distance (420 kpc). For Leo T, we find that a subhalo with M₃₀₀ = 8 (± 0.2) × 10⁶ M☉ best fits the observed H I profile. We derive an upper limit of P_HIM ≲ 150 cm⁻³ K for the pressure of any enveloping hot intergalactic medium gas at the distance of Leo T. Our analysis suggests that some of the UCHVCs may in fact constitute a population of 21 cm-selected but optically faint dwarf galaxies in the Local Group.

  6. A COMPACT HIGH VELOCITY CLOUD NEAR THE MAGELLANIC STREAM: METALLICITY AND SMALL-SCALE STRUCTURE

    Energy Technology Data Exchange (ETDEWEB)

    Kumari, Nimisha [Ecole Polytechnique, Route de Saclay, F-91128 Palaiseau (France); Fox, Andrew J.; Tumlinson, Jason; Thom, Christopher; Ely, Justin [Space Telescope Science Institute, Baltimore, MD 21218 (United States); Westmeier, Tobias [ICRAR, The University of Western Australia, 35 Stirling Highway, Crawley WA 6009 (Australia)

    2015-02-10

    The Magellanic Stream (MS) is a well-resolved gaseous tail originating from the Magellanic Clouds. Studies of its physical properties and chemical composition are needed to understand its role in Galactic evolution. We investigate the properties of a compact HVC (CHVC 224.0-83.4-197) lying close on the sky to the MS to determine whether it is physically connected to the Stream and to examine its internal structure. Our study is based on analysis of HST/COS spectra of three QSOs (Ton S210, B0120-28, and B0117-2837) all of which pass through this single cloud at small angular separation (≲0.°72), allowing us to compare physical conditions on small spatial scales. No significant variation is detected in the ionization structure from one part of the cloud to the other. Using Cloudy photoionization models, toward Ton S210 we derive elemental abundances of [C/H] = –1.21 ± 0.11, [Si/H] = –1.16 ± 0.11, [Al/H] = –1.19 ± 0.17, and [O/H] = –1.12 ± 0.22, which agree within 0.09 dex. The CHVC abundances match the 0.1 solar abundances measured along the main body of the Stream. This suggests that the CHVC (and by extension the extended network of filaments to which it belongs) has an origin in the MS. It may represent a fragment that has been removed from the Stream as it interacts with the gaseous Galactic halo.

  7. THE EVOLUTION OF GAS CLOUDS FALLING IN THE MAGNETIZED GALACTIC HALO: HIGH-VELOCITY CLOUDS (HVCs) ORIGINATED IN THE GALACTIC FOUNTAIN

    International Nuclear Information System (INIS)

    Kwak, Kyujin; Shelton, Robin L.; Raley, Elizabeth A.

    2009-01-01

    In the Galactic fountain scenario, supernovae and/or stellar winds propel material into the Galactic halo. As the material cools, it condenses into clouds. By using FLASH three-dimensional magnetohydrodynamic simulations, we model and study the dynamical evolution of these gas clouds after they form and begin to fall toward the Galactic plane. In our simulations, we assume that the gas clouds form at a height of z = 5 kpc above the Galactic midplane, then begin to fall from rest. We investigate how the cloud's evolution, dynamics, and interaction with the interstellar medium (ISM) are affected by the initial mass of the cloud. We find that clouds with sufficiently large initial densities (n ≥ 0.1 H atoms cm⁻³) accelerate sufficiently and maintain sufficiently large column densities as to be observed and identified as high-velocity clouds (HVCs) even if the ISM is weakly magnetized (1.3 μG). However, the ISM can provide noticeable resistance to the motion of a low-density cloud (n ≤ 0.01 H atoms cm⁻³), thus making it more probable that a low-density cloud will attain the speed of an intermediate-velocity cloud rather than the speed of an HVC. We also investigate the effects of various possible magnetic field configurations. As expected, the ISM's resistance is greatest when the magnetic field is strong and perpendicular to the motion of the cloud. The trajectory of the cloud is guided by the magnetic field lines in cases where the magnetic field is oriented diagonal to the Galactic plane. The model cloud simulations show that the interactions between the cloud and the ISM can be understood via analogy to the shock tube problem, which involves shock and rarefaction waves. We also discuss accelerated ambient gas, streamers of material ablated from the clouds, and the cloud's evolution from a sphere-shaped to a disk- or cigar-shaped object.

  8. PRESENT-DAY GALACTIC EVOLUTION: LOW-METALLICITY, WARM, IONIZED GAS INFLOW ASSOCIATED WITH HIGH-VELOCITY CLOUD COMPLEX A

    Energy Technology Data Exchange (ETDEWEB)

    Barger, K. A.; Haffner, L. M.; Wakker, B. P.; Hill, Alex S. [Department of Astronomy, University of Wisconsin-Madison, Madison, WI 53706 (United States); Madsen, G. J. [Sydney Institute for Astronomy, School of Physics, University of Sydney, NSW 2006 (Australia); Duncan, A. K., E-mail: kbarger@astro.wisc.edu, E-mail: haffner@astro.wisc.edu, E-mail: Alex.Hill@csiro.au, E-mail: wakker@astro.wisc.edu, E-mail: greg.madsen@sydney.edu.au [Rose-Hulman Institute of Technology, Terre Haute, IN 47803 (United States)

    2012-12-20

    The high-velocity cloud Complex A is a probe of the physical conditions in the Galactic halo. The kinematics, morphology, distance, and metallicity of Complex A indicate that it represents new material that is accreting onto the Galaxy. We present Wisconsin Hα Mapper kinematically resolved observations of Complex A over the velocity range of -250 to -50 km s⁻¹ in the local standard of rest reference frame. These observations include the first full Hα intensity map of Complex A across (l, b) = (124°, 18°) to (171°, 53°) and deep targeted observations in Hα, [S II] λ6716, [N II] λ6584, and [O I] λ6300 toward regions with high H I column densities, background quasars, and stars. The Hα data imply that the masses of neutral and ionized material in the cloud are similar, both being greater than 10⁶ M☉. We find that the Bland-Hawthorn and Maloney model for the intensity of the ionizing radiation near the Milky Way is consistent with the known distance of the high-latitude part of Complex A and an assumed cloud geometry that puts the lower-latitude parts of the cloud at a distance of 7-8 kpc. This compatibility implies a 5% ionizing photon escape fraction from the Galactic disk. We also provide the nitrogen and sulfur upper abundance solutions for a series of temperatures, metallicities, and cloud configurations for purely photoionized gas; these solutions are consistent with the sub-solar abundances found by previous studies, especially for temperatures above 10⁴ K or for gas with a high fraction of singly ionized nitrogen and sulfur.

  9. DETECTION OF CA II ABSORPTION BY A HIGH-VELOCITY CLOUD IN THE DIRECTION OF THE QUASAR PKS 0837-120

    NARCIS (Netherlands)

    ROBERTSON, JG; SCHWARZ, UJ; VANWOERDEN, H; MURRAY, JD; MORTON, DC; HULSBOSCH, ANM

    1991-01-01

    We present optical absorption spectroscopy of the Ca II K and H lines along the sight line to the quasar PKS 0837-120, which lies in the direction of a high-velocity cloud (HVC) detected in H I 21-cm emission at V_LSR = +105 km s⁻¹. Our data show Ca II absorption due to the HVC as well as a lower

  10. Introducing a novel gravitation-based high-velocity compaction analysis method for pharmaceutical powders.

    Science.gov (United States)

    Tanner, Timo; Antikainen, Osmo; Ehlers, Henrik; Yliruusi, Jouko

    2017-06-30

    With modern tableting machines, large numbers of tablets are produced at high output. Consequently, methods to examine powder compression in a high-velocity setting are in demand. In the present study, a novel gravitation-based method was developed to examine powder compression. A steel bar is dropped onto a punch to compress microcrystalline cellulose and starch samples inside the die. The position of the bar is read by a high-accuracy laser displacement sensor, which provides a reliable distance-time plot of the bar movement. The in-die height and density of the compact can be obtained directly from these data, which can be examined further to obtain the velocity, acceleration, and energy distribution during compression, including the energy consumed in compact formation. Despite the high vertical compression speed, the method proved to be cost-efficient, accurate, and reproducible.
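The analysis chain the abstract describes (distance-time trace → velocity, acceleration, energy) amounts to numerical differentiation of the sensor data. A minimal sketch, assuming an idealized free-fall trace and a hypothetical bar mass (neither value is from the paper):

```python
import numpy as np

# Hedged sketch: recover velocity, acceleration, and kinetic energy from a
# distance-time trace of the dropped bar, as the laser sensor would record it.
# The free-fall trace and the 5 kg bar mass are illustrative assumptions.
g = 9.81                          # m/s^2
bar_mass = 5.0                    # kg, hypothetical
t = np.linspace(0.0, 0.30, 301)   # s, 1 ms sampling
z = 0.5 * g * t**2                # m, idealized pre-impact free fall

v = np.gradient(z, t)             # velocity, m/s
a = np.gradient(v, t)             # acceleration, m/s^2
e_kin = 0.5 * bar_mass * v**2     # kinetic energy, J

# The energy available for compact formation is the kinetic energy at impact:
print(f"impact velocity ~ {v[-1]:.2f} m/s, impact energy ~ {e_kin[-1]:.1f} J")
```

With real sensor data, the same two `np.gradient` calls apply directly to the measured distance-time samples; the drop in kinetic energy after contact with the punch gives the energy consumed in compact formation.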

  11. TOPICAL REVIEW Warm spraying—a novel coating process based on high-velocity impact of solid particles

    Directory of Open Access Journals (Sweden)

    Seiji Kuroda et al

    2008-01-01

    In recent years, coating processes based on the impact of high-velocity solid particles, such as cold spraying and aerosol deposition, have been developed and have attracted much industrial attention. A novel coating process called 'warm spraying' has been developed, in which coatings are formed by the high-velocity impact of solid powder particles heated to appropriate temperatures below the melting point of the powder material. The advantages of such a process are as follows: (1) the critical velocity needed to form a coating can be significantly lowered by heating, (2) the degradation of the feedstock powder, such as oxidation, can be significantly controlled compared with conventional thermal spraying, where the powder is molten, and (3) various coating structures, from porous to dense, can be realized by controlling the temperature and velocity of the particles. The principles and characteristics of this new process are discussed in light of other existing spray processes such as high-velocity oxy-fuel spraying and cold spraying. The gas dynamics of particle heating and acceleration by the spraying apparatus as well as the high-velocity impact phenomena of powder particles are discussed in detail. Several examples of depositing heat-sensitive materials such as titanium, metallic glass, WC-Co cermet and polymers are described with potential industrial applications.

  12. Microstructure Characterization of WCCo-Mo Based Coatings Produced Using High Velocity Oxygen Fuel

    Directory of Open Access Journals (Sweden)

    Serkan Islak

    2015-12-01

    The present study was carried out in order to investigate the microstructural properties of WCCo-Mo composite coatings deposited onto a SAE 4140 steel substrate by high velocity oxygen fuel (HVOF) thermal spraying. For this purpose, the quantity of Mo added to the WCCo was varied as 10, 20, 30, and 40 wt.%. The coatings are compared in terms of their phase composition, microstructure, and hardness. The phase composition and microstructure of the coating layers were examined using an X-ray diffractometer (XRD) and a scanning electron microscope (SEM). XRD results showed that the WCCo-Mo composite coatings were mainly composed of WC, W₂C, Co₃W₃C, Mo₂C, MoO₂, Mo, and Co phases. The average hardness of the coatings increased with increasing Mo content.

  13. Hubble Space Telescope Imaging of the Ultra-compact High Velocity Cloud AGC 226067: A Stripped Remnant in the Virgo Cluster

    Energy Technology Data Exchange (ETDEWEB)

    Sand, D. J.; Crnojević, D. [Texas Tech University, Physics and Astronomy Department, Box 41051, Lubbock, TX 79409-1051 (United States); Seth, A. C. [Department of Physics and Astronomy, University of Utah, Salt Lake City, UT 84112 (United States); Spekkens, K. [Royal Military College of Canada, Department of Physics, P.O. Box 17000, Station Forces, Kingston, Ontario, K7K 7B4 (Canada); Strader, J. [Center for Data Intensive and Time Domain Astronomy, Department of Physics and Astronomy, Michigan State University, 567 Wilson Road, East Lansing, MI 48824 (United States); Adams, E. A. K. [ASTRON, Netherlands Institute for Radio Astronomy, Postbus 2, 7900 AA Dwingeloo (Netherlands); Caldwell, N.; Randall, S. [Harvard-Smithsonian Center for Astrophysics, Cambridge, MA 02138 (United States); Guhathakurta, P. [UCO/Lick Observatory, University of California, Santa Cruz, 1156 High Street, Santa Cruz, CA 95064 (United States); Kenney, J. [Yale University Astronomy Department, P.O. Box 208101, New Haven, CT 06520-8101 (United States); Simon, J. D. [Observatories of the Carnegie Institution for Science, 813 Santa Barbara Street, Pasadena, CA 91101 (United States); Toloba, E. [Department of Physics, University of the Pacific, 3601 Pacific Avenue, Stockton, CA 95211 (United States); Willman, B., E-mail: david.sand@ttu.edu [Steward Observatory, University of Arizona, 933 North Cherry Avenue, Tucson, AZ 85721 (United States)

    2017-07-10

    We analyze the optical counterpart to the ultra-compact high velocity cloud AGC 226067, utilizing imaging taken with the Advanced Camera for Surveys (ACS) on the Hubble Space Telescope. The color-magnitude diagram of the main body of AGC 226067 reveals an exclusively young stellar population, with an age of ∼7-50 Myr, and is consistent with a metallicity of [Fe/H] ∼ -0.3 as previous work has measured via H II region spectroscopy. Additionally, the color-magnitude diagram is consistent with a distance of D ≈ 17 Mpc, suggesting an association with the Virgo cluster. A secondary stellar system located ∼1.′6 (∼8 kpc) away in projection has a similar stellar population. The lack of an old red giant branch (≳5 Gyr) is contrasted with a serendipitously discovered Virgo dwarf in the ACS field of view (Dw J122147+132853), and the total diffuse light from AGC 226067 is consistent with the luminosity function of the resolved ∼7-50 Myr stellar population. The main body of AGC 226067 has M_V = -11.3 ± 0.3, or M_stars = 5.4 ± 1.3 × 10⁴ M☉ given the stellar population. We searched 20 deg² of imaging data adjacent to AGC 226067 in the Virgo Cluster, and found two similar stellar systems dominated by a blue stellar population, far from any massive galaxy counterpart; if this population has star-formation properties similar to those of AGC 226067, it implies ∼0.1 M☉ yr⁻¹ in Virgo intracluster star formation. Given its unusual stellar population, AGC 226067 is likely a stripped remnant and is plausibly the result of compressed gas from the ram pressure stripped M86 subgroup (∼350 kpc away in projection) as it falls into the Virgo Cluster.

  14. Looking for Galaxies in All the Right Places: A Search for Stellar Populations in ALFALFA’s Ultra-compact High Velocity Clouds

    Science.gov (United States)

    Janesh, William; Rhode, Katherine L.; Salzer, John J.; Janowiecki, Steven; Adams, Elizabeth; Haynes, Martha P.; Giovanelli, Riccardo; Cannon, John M.

    2018-01-01

    Nearby gas-rich dwarf galaxies are excellent laboratories for investigating the baryonic feedback processes that govern star formation and galaxy evolution in galaxies at the extreme end of the mass function. Detecting and studying such objects may help resolve the well-known tension between cosmological model predictions for low-mass dark matter halos and observations. The ALFALFA neutral hydrogen (H I) survey has detected a sample of isolated ultra-compact high-velocity H I clouds (UCHVCs) with kinematic properties that make them likely members of the Local Volume, but that have no optical counterparts in existing optical surveys. This UCHVC sample possesses H I properties (at 1 Mpc, H I masses of ~10⁵-10⁶ M⊙, H I diameters of ~2-3 kpc, and dynamical masses of ~10⁷-10⁸ M⊙) similar to other known ultra-faint dwarf galaxies like Leo T. Following the discovery of Leo P, an extremely metal-poor, gas-rich star-forming dwarf galaxy associated with an ALFALFA UCHVC, we have initiated a campaign to obtain deep optical imaging of 56 UCHVCs using the wide field-of-view, high-resolution ODI camera on the WIYN 3.5-m telescope. Here we present a brief overview of our campaign to search for resolved stellar populations associated with the UCHVCs in our optical images, and initial results from our survey. After creating a stellar catalog from the pipeline-reduced and stacked ODI g- and i-band images, we apply a color-magnitude filter tuned for old, metal-poor stellar populations to select red giant branch stars at distances between 250 kpc and 2 Mpc. The spatial distribution of the stars selected by the filter is then smoothed, and overdensities in the fields are identified. Of the 22 targets analyzed to date, seven have associated stellar populations detected at high confidence (92% to 99.9% significance). The detected objects have a range of distances (from 350 kpc to 1.6 Mpc) and have optical properties similar to those of ultra-faint dwarf galaxies. These objects have

  15. High Velocity Gas Gun

    Science.gov (United States)

    1988-01-01

    A video tape related to orbital debris research is presented. The video tape covers the process of loading a High Velocity Gas Gun and firing it into a mounted metal plate. The process is then repeated in slow motion.

  16. High-Velocity Cloud Complex H and Weaver's "Jet": Two candidate dwarf satellite galaxies for which dark matter halo models indicate distances of ~27 kpc and ~108 kpc

    Science.gov (United States)

    Simonson, S. Christian

    2018-04-01

    Two anomalous-velocity H I features, High-Velocity Cloud Complex H (HVC H) (Blitz et al. 1999) and Weaver's "jet" (Weaver 1974), appear to be good candidates for dwarf satellites. In this work they are modeled as H I disks in dark matter halos that move in 3D orbits in the combined time-dependent gravitational fields of the Milky Way and M31. As they orbit in the Local Group they develop tidal distortions and produce debris. The current l, b, V appearance of the tidal features as they approach the Milky Way indicates distances of 27 ± 9 kpc for HVC H and 108 ± 36 kpc for Weaver's "jet". As these are within the distances to known Milky Way satellites, finding stellar components would be of interest for the star formation history of the Milky Way. This work uses recent Hubble Space Telescope results on M31 (van der Marel et al. 2012) to calculate the center-of-mass (COM) locations and the dark matter mass distributions of the Milky Way-M31 system since the Big Bang. Time-dependent COM orbits of the satellites have been computed in 3D, along with rings of test particles representing their disks. Tidal effects that develop on these rings have been compared with published 21-cm line data from Lockman (2003) and Simonson (1975). For HVC H at l = 130.5°, b = +1.5°, V = -200 km/s, the dark matter mass (in solar masses) is estimated as (5.2 ± 3.5) × 10⁸. The previously estimated H I mass is 6.4 × 10⁶, or 1.2% of the newly derived satellite mass. For Weaver's "jet", which covers 2° by 7° at l = 197.3°, b = +2.1°, V = -30 to -87 km/s, the dark matter mass is estimated as (1.8 ± 0.6) × 10⁹. The H I mass is (1.8 ± 1.1) × 10⁸, or 6% to 12% of the satellite mass. In the case of HVC H, owing to its disk angle of 45°, tidal debris is thrown upward. This would presumably contribute to a halo star stream. In the case of Weaver's "jet", the streamer represents accreting material for the disk. I am grateful to Leo Blitz for bringing Lockman's work on HVC H to my attention and for many helpful

  17. On Cloud-based Oversubscription

    OpenAIRE

    Householder, Rachel; Arnold, Scott; Green, Robert

    2014-01-01

    Rising trends in the number of customers turning to the cloud for their computing needs has made effective resource allocation imperative for cloud service providers. In order to maximize profits and reduce waste, providers have started to explore the role of oversubscribing cloud resources. However, the benefits of cloud-based oversubscription are not without inherent risks. This paper attempts to unveil the incentives, risks, and techniques behind oversubscription in a cloud infrastructure....

  18. Cloud GIS Based Watershed Management

    Science.gov (United States)

    Bediroğlu, G.; Colak, H. E.

    2017-11-01

    In this study, we generated a cloud-GIS-based watershed management system using a cloud computing architecture. Cloud GIS is used as SaaS (Software as a Service) and DaaS (Data as a Service): we ran GIS analyses on the cloud to test SaaS, and deployed GIS datasets on the cloud to test DaaS. We used a hybrid cloud computing model, making use of ready-made web-based mapping services hosted on the cloud (world topology, satellite imagery). We uploaded the data to the system after creating geodatabases including hydrology (rivers, lakes), soil maps, climate maps, rain maps, geology, and land use. The watershed of the study area was determined on the cloud using the ready-hosted topology maps. After uploading all the datasets to the system, we applied various GIS analyses and queries. The results show that cloud GIS technology brings speed and efficiency to watershed management studies. Besides this, the system can easily be implemented for similar land analysis and management studies.

  19. Southern high-velocity stars

    International Nuclear Information System (INIS)

    Augensen, H.J.; Buscombe, W.

    1978-01-01

    Using the model of the Galaxy presented by Eggen, Lynden-Bell and Sandage (1962), plane galactic orbits have been calculated for 800 southern high-velocity stars which possess parallax, proper motion, and radial velocity data. The stars with trigonometric parallaxes were selected from Buscombe and Morris (1958), supplemented by more recent spectroscopic data. Photometric parallaxes from infrared color indices were used for bright red giants studied by Eggen (1970), and for red dwarfs for which Rodgers and Eggen (1974) determined radial velocities. A color-color diagram based on published values of (U-B) and (B-V) for most of these stars is shown. (Auth.)
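The plane-orbit calculations referred to above can be sketched with a simple symplectic integrator. A minimal sketch, using a flat-rotation-curve logarithmic potential as a stand-in for the actual Eggen, Lynden-Bell and Sandage Galaxy model; the circular speed and the initial conditions are illustrative assumptions, not values from the study:

```python
import numpy as np

# Hedged sketch of a plane galactic orbit integration. The logarithmic
# potential Phi = v_c^2 ln(r) (flat rotation curve, v_c = 220 km/s) is an
# illustrative stand-in for the Eggen, Lynden-Bell & Sandage (1962) model.
V_C = 220.0  # km/s, assumed circular speed

def accel(pos):
    """In-plane acceleration for Phi = v_c^2 ln r: a = -v_c^2 * r_vec / r^2."""
    return -V_C**2 * pos / (pos @ pos)

def integrate_orbit(pos, vel, dt, n_steps):
    """Leapfrog (kick-drift-kick); units are kpc and km/s, so dt is in kpc/(km/s)."""
    pos, vel = np.asarray(pos, float).copy(), np.asarray(vel, float).copy()
    traj = [pos.copy()]
    for _ in range(n_steps):
        vel += 0.5 * dt * accel(pos)
        pos += dt * vel
        vel += 0.5 * dt * accel(pos)
        traj.append(pos.copy())
    return np.array(traj)

# Sanity check: a star launched on a circular orbit at r = 8 kpc stays there.
traj = integrate_orbit([8.0, 0.0], [0.0, V_C], dt=1e-3, n_steps=5000)
r = np.linalg.norm(traj, axis=1)
print(f"radius stays within {r.min():.3f}-{r.max():.3f} kpc")
```

Given a star's parallax, proper motion, and radial velocity, the same integrator would be seeded with the corresponding in-plane position and velocity to trace its galactic orbit.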

  20. Application of TiC reinforced Fe-based coatings by means of High Velocity Air Fuel Spraying

    Science.gov (United States)

    Bobzin, K.; Öte, M.; Knoch, M. A.; Liao, X.; Sommer, J.

    2017-03-01

    In the field of hydraulic applications, several development trends pose problems for the coatings currently used as wear and corrosion protection for piston rods. Aqueous hydraulic fluids and rising raw material prices necessitate the search for alternatives to conventional coatings such as galvanic hard chrome or High Velocity Oxygen Fuel (HVOF)-sprayed WC/Co coatings. In a previous study, Fe/TiC coatings sprayed by an HVOF process were identified as promising coating systems for wear and corrosion protection in hydraulic systems. In this feasibility study, the novel High Velocity Air Fuel (HVAF) process, a modification of the HVOF process, is investigated using the same feedstock material, meaning the powder is not optimized for the HVAF process. The asserted benefits of the HVAF process are higher particle velocities and lower process temperatures, which can result in lower porosity and oxidation of the coating; lower process costs and higher deposition rates are also claimed. In this study, the focus is on the applicability of Fe/TiC coatings by HVAF in general. The Fe/TiC HVAF coating was produced successfully. The HVAF and HVOF coatings, produced with the same powder, were investigated using micro-hardness, porosity, wear, and corrosion tests. A similar wear coefficient and micro-hardness were achieved for both processes. Furthermore, the propane/hydrogen proportion of the HVAF process and its influence on the coating thickness and porosity were investigated.

  1. Relationship between cloud radiative forcing, cloud fraction and cloud albedo, and new surface-based approach for determining cloud albedo

    OpenAIRE

    Y. Liu; W. Wu; M. P. Jensen; T. Toto

    2011-01-01

    This paper focuses on three interconnected topics: (1) the quantitative relationship between surface shortwave cloud radiative forcing, cloud fraction, and cloud albedo; (2) a surface-based approach for measuring cloud albedo; (3) multiscale (diurnal, annual, and inter-annual) variations and covariations of surface shortwave cloud radiative forcing, cloud fraction, and cloud albedo. An analytical expression is first derived to quantify the relationship between cloud radiative forcing, cloud fraction...

  2. Magnetic properties of iron oxide-based nanoparticles: Study using Mössbauer spectroscopy with a high velocity resolution and magnetization measurements

    Energy Technology Data Exchange (ETDEWEB)

    Ushakov, M.V. [Institute of Physics and Technology, Ural Federal University, Ekaterinburg 620002 (Russian Federation); Oshtrakh, M.I., E-mail: oshtrakh@gmail.com [Institute of Physics and Technology, Ural Federal University, Ekaterinburg 620002 (Russian Federation); Felner, I. [Racah Institute of Physics, The Hebrew University, Jerusalem (Israel); Semenova, A.S.; Kellerman, D.G. [Institute of Solid State Chemistry, Ural Branch, Russian Academy of Sciences, Ekaterinburg 620990 (Russian Federation); Šepelák, V. [Institute of Nanotechnology, Karlsruhe Institute of Technology, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen (Germany); Semionkin, V.A. [Institute of Physics and Technology, Ural Federal University, Ekaterinburg 620002 (Russian Federation); Morais, P.C. [School of Chemistry and Chemical Engineering, Anhui University, Hefei 230601 (China); Universidade de Brasília, Instituto de Física, DF, Brasília 70910-900 (Brazil)

    2017-06-01

    We review the results of a study of magnetite, maghemite, and nickel ferrite nanoparticles (NPs), used in magnetic fluids, by Mössbauer spectroscopy with a high velocity resolution and by magnetization measurements. The Mössbauer spectra of these NPs were fitted using a large number of magnetic sextets, reflecting the complexity of the NPs. The presence of polar molecules at the magnetite surface in a magnetic fluid increases the NPs' magnetic moment and the median hyperfine magnetic field. However, surface coating of maghemite NPs with dimercaptosuccinic acid decreases the median hyperfine magnetic field. The example of nickel ferrite NPs demonstrates a new physical model, based on the distribution of Ni²⁺ in the local microenvironment of Fe³⁺, which can explain the large number of magnetic sextets in Mössbauer spectra measured with a high velocity resolution.

  3. +Cloud: An Agent-Based Cloud Computing Platform

    OpenAIRE

    González, Roberto; Hernández de la Iglesia, Daniel; de la Prieta Pintado, Fernando; Gil González, Ana Belén

    2017-01-01

    Cloud computing is revolutionizing the services provided through the Internet, and is continually adapting itself in order to maintain the quality of its services. This study presents the platform +Cloud, which proposes a cloud environment for storing information and files by following the cloud paradigm. This study also presents Warehouse 3.0, a cloud-based application that has been developed to validate the services provided by +Cloud.

  4. Stratocumulus Cloud Top Radiative Cooling and Cloud Base Updraft Speeds

    Science.gov (United States)

    Kazil, J.; Feingold, G.; Balsells, J.; Klinger, C.

    2017-12-01

    Cloud top radiative cooling is a primary driver of turbulence in the stratocumulus-topped marine boundary layer. A functional relationship between cloud top cooling and cloud base updraft speeds may therefore exist. A correlation between cloud top radiative cooling and cloud base updraft speeds has recently been identified empirically, providing a basis for satellite retrieval of cloud base updraft speeds. Such retrievals may enable analysis of aerosol-cloud interactions using satellite observations: updraft speeds at cloud base co-determine supersaturation and therefore the activation of cloud condensation nuclei, which in turn co-determine cloud properties and precipitation formation. We use large eddy simulation and an off-line radiative transfer model to explore the relationship between cloud-top radiative cooling and cloud base updraft speeds in a marine stratocumulus cloud over the course of the diurnal cycle. We find that during daytime, at low cloud water path (CWP < 50 g m-2), cloud top cooling and cloud base updraft speeds are well correlated, in agreement with the reported empirical relationship. During the night, in the absence of short-wave heating, CWP builds up (CWP > 50 g m-2), long-wave emission from cloud top saturates, and cloud base heating increases. In combination, cloud top cooling and cloud base updrafts become weakly anti-correlated. A functional relationship between cloud top cooling and cloud base updraft speed can hence be expected for stratocumulus clouds with a sufficiently low CWP and unsaturated long-wave emission, in particular during daytime. At higher CWPs, in particular at night, the relationship breaks down due to saturation of long-wave emission from cloud top.
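The daytime correlation reported above can be illustrated with a toy calculation. All numbers below are synthetic stand-ins, not the study's LES output, and `pearson_r` is just a plain implementation of the Pearson correlation coefficient:

```python
# Illustrative sketch: correlate cloud-top cooling with cloud-base updraft
# speed, as the empirical relationship above describes. Synthetic data only.
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Synthetic low-CWP daytime sample: cooling and updrafts co-vary.
cooling_day = [20, 30, 40, 50, 60]            # cloud-top cooling, W m^-2
updraft_day = [0.21, 0.30, 0.42, 0.50, 0.61]  # cloud-base updraft, m s^-1

print(f"daytime r = {pearson_r(cooling_day, updraft_day):.2f}")
```

For this synthetic sample the correlation is strongly positive; in the nighttime, high-CWP regime described above one would instead expect a weakly negative coefficient.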

  5. Cloud Based Applications and Platforms (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Brodt-Giles, D.

    2014-05-15

    Presentation to the Cloud Computing East 2014 Conference, where we are highlighting our cloud computing strategy, describing the platforms on the cloud (including Smartgrid.gov), and defining our process for implementing cloud based applications.

  6. An investigation of coseismic OSL / TL time zeroing of quartz gouge based on low- to high-velocity friction experiments

    Science.gov (United States)

    Akasegawa, K.; Oohashi, K.; Hasebe, N.; Miura, K.

    2016-12-01

    To determine the age of a coseismic event on an active fault, we generally examine the crosscutting relationship between faults and overlying strata by trenching. However, this method cannot be applied when there are no overlying young strata in the vicinity of the fault zone. The alternative is dating of fault zone materials whose age was reset by seismic fault slip (for example, the ESR method, Ikeya et al., 1982, and the OSL and TL methods). The idea behind the OSL (optically stimulated luminescence) and TL (thermoluminescence) dating methods for determining a paleo-earthquake event is that the accumulated natural radiation damage becomes zero (time zeroing) through frictional heating and grinding. However, the physical and geological conditions required to induce time zeroing are not well understood, because there are only a few experimental investigations under limited conditions (Hiraga et al., 2004; Kim et al., 2014). In this study, we conduct low- to high-velocity friction experiments on quartz gouge under various experimental conditions (e.g., normal stress, displacement, moisture content) to establish an empirical relationship and the physical and geological conditions of coseismic OSL time zeroing. We carry out the friction experiments using quartz from the Tsushigawa granite taken from the east wall of the Nojima fault Ogura trench site, which was excavated in 2015. Samples were taken from the position most distant from the fault in the trench site. The samples were crushed using a mortar and sieved to the target grain size. The residue is used for the friction experiments after its radiation dose was determined using an artificial gamma-ray source. In this presentation, we show the results of the friction experiments and the dating of the quartz gouge, and discuss the physical and geological conditions of OSL time zeroing. References: Okumura, T., and Shitaoka, Y., 2011. Engineering Geology of Japan, No. 1, 5-17. 
Hiraga, S., Yoshimoto, A., and

  7. Cloud networking understanding cloud-based data center networks

    CERN Document Server

    Lee, Gary

    2014-01-01

    Cloud Networking: Understanding Cloud-Based Data Center Networks explains the evolution of established networking technologies into distributed, cloud-based networks. Starting with an overview of cloud technologies, the book explains how cloud data center networks leverage distributed systems for network virtualization, storage networking, and software-defined networking. The author offers insider perspective to key components that make a cloud network possible such as switch fabric technology and data center networking standards. The final chapters look ahead to developments in architectures

  8. Cloud-Based Data Storage

    Science.gov (United States)

    Waters, John K.

    2011-01-01

    The vulnerability and inefficiency of backing up data on-site is prompting school districts to switch to more secure, less troublesome cloud-based options. District auditors are pushing for a better way to back up their data than the on-site, tape-based system that had been used for years. About three years ago, Hendrick School District in…

  9. Peltier-based cloud chamber

    Science.gov (United States)

    Nar, Sevda Yeliz; Cakir, Altan

    2018-02-01

    Particles produced by nuclear decay, cosmic radiation and reactions can be identified through various methods. One of these methods, effective over the last century, is the cloud chamber. The chamber makes visible the cosmic particles whose radiation we are exposed to every second. The diffusion cloud chamber is a kind of cloud chamber that is cooled by dry ice, and this traditional model has some practical difficulties. In this work, a Peltier-based cloud chamber cooled by thermoelectric modules is studied. The new model provides a uniformly cooled chamber base and, moreover, has a longer lifetime than the traditional chamber in terms of observation time. This gain reduces the cost incurred for each cosmic particle observation. The chamber is an easy-to-use system compared with the traditional diffusion cloud chamber. The new model is portable, easier to build, and can be used in nuclear physics experiments. In addition, it would be very useful for observing muons, which provide direct evidence for the Lorentz contraction and time dilation predicted by Einstein's special relativity.

  10. Cloud-Based Mobile Learning

    Directory of Open Access Journals (Sweden)

    Alexandru BUTOI

    2013-01-01

    Full Text Available As cloud technologies are widely studied and mobile technologies are evolving, new directions for the development of mobile learning tools deployed on the cloud are proposed. M-Learning is treated as part of the ubiquitous learning paradigm and is a pervasive extension of E-Learning technologies. Development of such learning tools requires specific development strategies for effective abstraction of pedagogical principles at the software design and implementation level. The current paper explores an interdisciplinary approach to the design and development of cloud-based M-Learning tools by mapping a development strategy used for educational programs onto a software prototyping strategy. For such instruments to be effective from the learning-outcome point of view, the evaluation process must be rigorous; we propose a metric model for expressing the trainee's overall learning experience, with evaluated levels of interactivity, content presentation and graphical user interface usability.

  11. Effect of the post heat treatment on the sliding wear resistance of a nickel-base coating deposited by high velocity oxy-fuel (HVOF)

    International Nuclear Information System (INIS)

    Cadenas, P.; Rodriguez, M.; Staia, M. H.

    2007-01-01

    In the present research, a nickel-base coating was deposited on an AISI 1020 substrate using the high velocity oxy-fuel (HVOF) technique. The coating was subsequently post heat-treated by means of an oxyacetylene flame. For the conditions evaluated in the present study, it was found that the CTT coating has 1.15 times better wear resistance at the lower applied load, and nearly 50 times better at the highest applied load, when compared to the STT coatings. These results have been attributed to a better distribution of the hard phases, better cohesion between particles and an increase in hardness as a consequence of the post heat treatment. A severe wear regime was found for all samples, since the wear rates were higher than 1×10⁻⁵ mm³/m. For the CTT coatings, the wear mechanism was mainly adhesion and oxidation, while for the steel counterpart mechanisms such as oxidation, grooving and three-body abrasion were observed. (Author) 22 refs

  12. High velocity impact experiment (HVIE)

    Energy Technology Data Exchange (ETDEWEB)

    Toor, A.; Donich, T.; Carter, P.

    1998-02-01

    The HVIE space project was conceived as a way to measure the absolute EOS for approximately 10 materials at pressures up to ~30 Mb with order-of-magnitude higher accuracy than obtainable in any comparable experiment conducted on Earth. The experiment configuration is such that each of the 10 materials interacts with all of the others, thereby producing one hundred independent, simultaneous EOS experiments. The materials will be selected to provide critical information to weapons designers, National Ignition Facility target designers, and planetary and geophysical scientists. In addition, HVIE will provide important scientific information to other communities, including the Ballistic Missile Defense Organization and the lethality and vulnerability community. The basic HVIE concept is to place two probes in counter-rotating, highly elliptical orbits and collide them at high velocity (20 km/s) at 100 km altitude above the Earth. The low altitude of the experiment will provide quick debris strip-out of orbit due to atmospheric drag. The preliminary conceptual evaluation of the HVIE has found no show-stoppers. The design has been easy to keep within the lift capabilities of commonly available rides to low Earth orbit, including the space shuttle. The cost of approximately 69 million dollars for 100 EOS experiments that will yield the much-needed high-accuracy, absolute measurement data is a bargain!
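The 20 km/s closing speed quoted above follows from two counter-rotating probes meeting head-on, each at perigee speed. A sketch using the vis-viva equation (the semi-major axis value is illustrative, chosen here to reproduce the stated speed, not taken from the project documents):

```python
# Illustrative sketch: head-on collision of counter-rotating probes doubles
# the orbital speed at the collision point. Vis-viva: v^2 = mu*(2/r - 1/a).
import math

MU_EARTH = 398600.4418   # Earth gravitational parameter, km^3 s^-2
R_EARTH = 6378.137       # Earth equatorial radius, km

def perigee_speed(alt_km, a_km):
    """Orbital speed (km/s) at radius R_EARTH + alt_km on an orbit with
    semi-major axis a_km."""
    r = R_EARTH + alt_km
    return math.sqrt(MU_EARTH * (2.0 / r - 1.0 / a_km))

# A highly elliptical orbit with a ~ 17,300 km gives ~10 km/s at 100 km
# altitude, so two counter-rotating probes close head-on at ~20 km/s.
print(round(2 * perigee_speed(100.0, 17300.0), 1))
```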

  13. Identity-Based Authentication for Cloud Computing

    Science.gov (United States)

    Li, Hongwei; Dai, Yuanshun; Tian, Ling; Yang, Haomiao

    Cloud computing is a recently developed technology for complex systems with massive-scale service sharing among numerous users. Authentication of both users and services is therefore a significant issue for the trust and security of cloud computing. The SSL Authentication Protocol (SAP), once applied in cloud computing, becomes so complicated that users face a heavy load in both computation and communication. Based on the identity-based hierarchical model for cloud computing (IBHMCC) and its corresponding encryption and signature schemes, this paper presents a new identity-based authentication protocol for cloud computing and services. Simulation testing shows that the authentication protocol is more lightweight and efficient than SAP, especially on the user side. This merit, together with great scalability, makes the model well suited to massive-scale clouds.

  14. Graph based techniques for tag cloud generation

    DEFF Research Database (Denmark)

    Leginus, Martin; Dolog, Peter; Lage, Ricardo Gomes

    2013-01-01

    The tag cloud is one of the navigation aids for exploring documents; tag clouds also link documents through user-defined terms. We explore various graph-based techniques to improve tag cloud generation. Moreover, we introduce relevance measures based on underlying data, such as ratings or citation counts, for improved measurement of the relevance of tag clouds. We show that, on the given data sets, our approach outperforms the state-of-the-art baseline methods with respect to such relevance by 41% on the Movielens dataset and by 11% on the Bibsonomy dataset.
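The combination of a graph measure with an underlying-data relevance signal can be sketched as follows. This is not the authors' algorithm; the scoring weights, the use of degree centrality, and the example data are all illustrative:

```python
# Illustrative sketch: score tags by mixing degree centrality in a tag
# co-occurrence graph with a relevance signal from underlying data (ratings),
# then keep the top-k tags for the cloud.
from collections import defaultdict

def tag_cloud(tagged_items, ratings, k=3, alpha=0.5):
    # Build the co-occurrence graph: tags sharing an item become neighbors.
    neighbors = defaultdict(set)
    for tags in tagged_items:
        for t in tags:
            neighbors[t].update(set(tags) - {t})
    n = max(len(neighbors) - 1, 1)
    scores = {}
    for t in neighbors:
        centrality = len(neighbors[t]) / n      # degree centrality in [0, 1]
        relevance = ratings.get(t, 0.0) / 5.0   # mean rating scaled to [0, 1]
        scores[t] = alpha * centrality + (1 - alpha) * relevance
    return sorted(scores, key=scores.get, reverse=True)[:k]

items = [{"python", "cloud"}, {"cloud", "data"}, {"python", "data"}, {"cloud"}]
ratings = {"python": 4.5, "cloud": 3.0, "data": 2.0}
print(tag_cloud(items, ratings))  # tags ranked by combined score
```

Here all three tags tie on centrality, so the rating signal breaks the tie, which is exactly the role the underlying-data relevance measures play in the abstract.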

  15. Consideration of wear rates at high velocity

    Science.gov (United States)

    Hale, Chad S.

    The development of the research presented here is one in which high velocity relative sliding motion between two bodies in contact has been considered. Overall, the wear environment is truly three-dimensional. The attempt to characterize three-dimensional wear was not economically feasible because it must be analyzed at the micro-mechanical level to get results. Thus, an engineering approximation was carried out. This approximation was based on a metallographic study identifying the need to include viscoplasticity constitutive material models, coefficient of friction, relationships between the normal load and velocity, and the need to understand wave propagation. A sled test run at the Holloman High Speed Test Track (HHSTT) was considered for the determination of high velocity wear rates. In order to adequately characterize high velocity wear, it was necessary to formulate a numerical model that contained all of the physical events present. The experimental results of a VascoMax 300 maraging steel slipper sliding on an AISI 1080 steel rail during a January 2008 sled test mission were analyzed. During this rocket sled test, the slipper traveled 5,816 meters in 8.14 seconds and reached a maximum velocity of 1,530 m/s. This type of environment was never considered previously in terms of wear evaluation. Each of the features of the metallography were obtained through micro-mechanical experimental techniques. The byproduct of this analysis is that it is now possible to formulate a model that contains viscoplasticity, asperity collisions, temperature and frictional features. Based on the observations of the metallographic analysis, these necessary features have been included in the numerical model, which makes use of a time-dynamic program which follows the movement of a slipper during its experimental test run. The resulting velocity and pressure functions of time have been implemented in the explicit finite element code, ABAQUS. Two-dimensional, plane strain models

  16. Composite coating containing WC/12Co cermet and Fe-based metallic glass deposited by high-velocity oxygen fuel spraying

    International Nuclear Information System (INIS)

    Terajima, Takeshi; Takeuchi, Fumiya; Nakata, Kazuhiro; Adachi, Shinichiro; Nakashima, Koji; Igarashi, Takanori

    2010-01-01

    A composite coating containing WC/12Co cermet and Fe43Cr16Mo16C15B10 metallic glass was successfully deposited onto type 304 stainless steel by high-velocity oxygen fuel (HVOF) spraying, and the microstructure and tribological properties were investigated. The microstructure of the coating was characterized by scanning electron microscopy/electron probe micro-analysis (SEM/EPMA) and X-ray diffractometry (XRD). The hardness, adhesion strength and tribological properties of the coating were tested with a Vickers hardness tester, tensile tester and reciprocating wear tester, respectively. The composite coating, in which flattened WC/12Co was embedded in amorphous Fe43Cr16Mo16C15B10 layers, exhibited high hardness, good wear resistance and a low friction coefficient compared to the monolithic coating. The addition of 8% WC/12Co to the Fe43Cr16Mo16C15B10 matrix increased the cross-sectional hardness from 660 to 870 HV and reduced the friction coefficient from 0.65 to 0.5. WC/12Co reinforcement plays an important role in improving the tribological properties of the Fe43Cr16Mo16C15B10 coating.

  17. A Cloud Based Data Integration Framework

    OpenAIRE

    Jiang, Nan; Xu, Lai; Vrieze, Paul; Lim, Mian-Guan; Jarabo, Oscar

    2012-01-01

    Part 7: Cloud-Based Support; Virtual enterprise (VE) relies on resource sharing and collaboration across geographically dispersed and dynamically allied businesses in order to better respond to market opportunities. It is generally considered that effective data integration and management is crucial to realising the value of VE. This paper describes a cloud-based data integration framework that can be used to support VE to discover, explore and respond to emerging ...

  18. Cloud-Based RFID Mutual Authentication Protocol without Leaking Location Privacy to the Cloud

    OpenAIRE

    Dong, Qingkuan; Tong, Jiaqing; Chen, Yuan

    2015-01-01

    With the rapid developments of the IoT (Internet of Things) and the cloud computing, cloud-based RFID systems attract more attention. Users can reduce their cost of deploying and maintaining the RFID system by purchasing cloud services. However, the security threats of cloud-based RFID systems are more serious than those of traditional RFID systems. In cloud-based RFID systems, the connection between the reader and the cloud database is not secure and cloud service provider is not trusted. Th...

  19. Development of methods for inferring cloud thickness and cloud-base height from satellite radiance data

    Science.gov (United States)

    Smith, William L., Jr.; Minnis, Patrick; Alvarez, Joseph M.; Uttal, Taneil; Intrieri, Janet M.; Ackerman, Thomas P.; Clothiaux, Eugene

    1993-01-01

    Cloud-top height is a major factor determining the outgoing longwave flux at the top of the atmosphere. The downwelling radiation from the cloud strongly affects the cooling rate within the atmosphere and the longwave radiation incident at the surface. Thus, determination of cloud-base temperature is important for proper calculation of fluxes below the cloud. Cloud-base altitude is also an important factor in aircraft operations. Cloud-top height or temperature can be derived in a straightforward manner using satellite-based infrared data. Cloud-base temperature, however, is not observable from the satellite, but is related to the height, phase, and optical depth of the cloud in addition to other variables. This study uses surface and satellite data taken during the First ISCCP Regional Experiment (FIRE) Phase-2 Intensive Field Observation (IFO) period (13 Nov. - 7 Dec. 1991) to improve techniques for deriving cloud-base height from conventional satellite data.
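The kind of relationship that links cloud-base height to satellite-retrievable quantities can be sketched for a liquid-water cloud: optical depth relates to liquid water path as tau ≈ 3·LWP/(2·ρw·re), and LWP = LWC·Δz, so an assumed liquid water content yields a geometric thickness and hence a base height. The formula is standard; the assumed LWC value, the function name, and the example numbers are illustrative, not the paper's method:

```python
# Illustrative sketch: estimate cloud-base height from cloud-top height,
# optical depth tau, and effective radius r_e, assuming a constant liquid
# water content (lwc). Assumed values are placeholders, not retrievals.
def cloud_base_height(z_top_m, tau, r_e_m, lwc_kg_m3=0.3e-3, rho_w=1000.0):
    """Cloud-base height (m) from top height (m), optical depth, effective
    radius (m), and an assumed liquid water content (kg m^-3)."""
    lwp = 2.0 * tau * rho_w * r_e_m / 3.0   # liquid water path, kg m^-2
    dz = lwp / lwc_kg_m3                    # geometric thickness, m
    return z_top_m - dz

# Example: tau = 9, r_e = 10 micron, cloud top at 1500 m -> base near 1300 m.
print(round(cloud_base_height(1500.0, 9.0, 10e-6)))
```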

  20. Cloud Collaboration: Cloud-Based Instruction for Business Writing Class

    Science.gov (United States)

    Lin, Charlie; Yu, Wei-Chieh Wayne; Wang, Jenny

    2014-01-01

    Cloud computing technologies, such as Google Docs, Adobe Creative Cloud, Dropbox, and Microsoft Windows Live, have become increasingly appreciated as next-generation digital learning tools. Cloud computing technologies encourage students' active engagement, collaboration, and participation in their learning, facilitate group work, and support…

  1. Cloud-based Virtual Organization Engineering

    Directory of Open Access Journals (Sweden)

    Liviu Gabriel CRETU

    2012-01-01

    Full Text Available Nowadays we may notice that SOA has arrived at its maturity stage and Cloud Computing brings the next paradigm shift in the software delivery business model. In such a context, we consider that there is a need for frameworks to guide the creation, execution and management of virtual organizations (VO) based on services from different Clouds. This paper introduces the main components of such a framework, which innovatively combines the principles of event-driven SOA, REST and the ISO/IEC 42010:2007 multiple views and viewpoints in order to provide the required methodology for Cloud-based virtual organization (Cloud-VO) engineering. The framework considers the resource concept found in software architectures like REST or RDF as the basic building block of the Cloud-VO, and makes use of resources' URIs to create the Cloud-VO's resource allocation matrix. While the matrix is used to declare activity-resource relationships, the resource catalogue concept is introduced as a way to describe a resource in one place, using as many viewpoints as needed, and then to reuse that description for the creation or simulation of different VOs.

  2. Metastable structure formation during high velocity grinding

    International Nuclear Information System (INIS)

    Samarin, A.N.; Klyuev, M.M.

    1984-01-01

    Metastable structures in the surface layers of samples are investigated during forced high-velocity abrasive grinding. Samples of martensitic (40Kh13), austenitic (12Kh18N10T) and ferritic (05Kh23Yu5) steels and some alloys, in particular KhN77TYuR (EhI437B), were ground in one pass at treatment depths from 0.17 up to 2.6 mm. It is established that processes of homogenization, recrystallization and coagulation develop during forced high-velocity grinding, along with polymorphic transformations in the zone of thermomechanical effect, which leads to changes in the physical and mechanical properties of the surface.

  3. Comparison of cloud optical depth and cloud mask applying BRDF model-based background surface reflectance

    Science.gov (United States)

    Kim, H. W.; Yeom, J. M.; Woo, S. H.

    2017-12-01

    Over thin cloud regions, a satellite simultaneously detects reflectance from the thin cloud and from the land surface. Since this mixed reflectance is not pure cloud information, the background surface reflectance should be eliminated in order to accurately distinguish thin clouds such as cirrus. In previous research, Kim et al. (2017) developed a cloud masking algorithm using the Geostationary Ocean Color Imager (GOCI), one of the instruments on the Communication, Ocean and Meteorological Satellite (COMS). Although GOCI has only 8 spectral channels, covering the visible and near-infrared ranges, the cloud mask compares quantitatively well with the MODIS cloud mask (Collection 6 MYD35). In particular, validation against Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) data showed that this algorithm is especially effective for thin cloud detection, because the masking method concentrates on eliminating background surface effects from the top-of-atmosphere (TOA) reflectance. By applying the difference between the TOA reflectance and a bi-directional reflectance distribution function (BRDF) model-based background surface reflectance, both thick and thin cloud areas can be discriminated without the infrared channels that are usually used for cloud detection. Moreover, when the cloud mask result was used as input for the BRDF model simulation, and the optimized BRDF model-based surface reflectance was then used for an optimized cloud mask, the probability of detection (POD) was higher than that of the original cloud mask. In this study, we examine the correlation between cloud optical depth (COD) and the cloud mask result. COD depends mostly on cloud thickness and on the nature and size of the cloud particles; it ranges from less than 0.1 for thin clouds to over 1000 for large cumulus, due to scattering by droplets. With
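The core masking idea, thresholding the difference between TOA reflectance and the BRDF-modeled clear-sky background, can be sketched as follows. The threshold value, function name, and grids are illustrative; the actual GOCI algorithm involves more tests than this:

```python
# Illustrative sketch: flag a pixel as cloudy when its TOA reflectance
# exceeds the BRDF-modeled background surface reflectance by a threshold.
def cloud_mask(toa, background, threshold=0.08):
    """Boolean mask (True = cloud) for two matching 2-D reflectance grids."""
    return [
        [(t - b) > threshold for t, b in zip(toa_row, bg_row)]
        for toa_row, bg_row in zip(toa, background)
    ]

toa = [[0.10, 0.45], [0.30, 0.12]]   # observed top-of-atmosphere reflectance
bg  = [[0.08, 0.10], [0.09, 0.11]]   # BRDF model-based surface reflectance
print(cloud_mask(toa, bg))
```

Because the clear-sky background is subtracted per pixel, even a modest reflectance excess from a thin cloud stands out, which is why the approach favors thin cloud detection.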

  4. Cloud-based Architecture Capabilities Summary Report

    Energy Technology Data Exchange (ETDEWEB)

    Vang, Leng [Idaho National Lab. (INL), Idaho Falls, ID (United States); Prescott, Steven R [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-09-01

    In the collaborative scientific research arena, it is important to have an environment where analysts have access to a shared set of information, documents and software tools, and are able to accurately maintain and track historical changes in models. A new cloud-based environment would be accessible remotely from anywhere, regardless of computing platform, provided the platform has Internet access and proper browser capabilities. Information stored in this environment would be restricted based on user-assigned credentials. This report reviews the development of a Cloud-based Architecture Capabilities (CAC) web portal for PRA tools.

  5. Exploring the Effects of Cloud Vertical Structure on Cloud Microphysical Retrievals based on Polarized Reflectances

    Science.gov (United States)

    Miller, D. J.; Zhang, Z.; Platnick, S. E.; Ackerman, A. S.; Cornet, C.; Baum, B. A.

    2013-12-01

    A polarized cloud reflectance simulator was developed by coupling an LES cloud model with a polarized radiative transfer model to assess the capabilities of polarimetric cloud retrievals. With future remote sensing campaigns like NASA's Aerosols/Clouds/Ecosystems (ACE) planning to feature advanced polarimetric instruments, it is important for the cloud remote sensing community to understand the retrievable information available and the related systematic and methodological limitations. The cloud retrieval simulator we have developed allows us to probe these important questions in a realistic test bed. Our simulator utilizes a polarized adding-doubling radiative transfer model and an LES cloud field from a DHARMA simulation (Ackerman et al. 2004), with cloud properties based on the stratocumulus clouds observed during the DYCOMS-II field campaign. In this study we focus on how the vertical structure of cloud microphysics can influence polarized cloud effective radius retrievals. Numerous previous studies have explored how retrievals based on total reflectance are affected by cloud vertical structure (Platnick 2000, Chang and Li 2002), but no such studies exist for the effects of vertical structure on polarized retrievals. Unlike the total cloud reflectance, which is predominantly multiply scattered light, the polarized reflectance is primarily the result of singly scattered photons and is thus sensitive only to the uppermost region of the cloud. Cloud vertical structure, an important influencer on the microphysical development of cloud droplets, can therefore potentially be studied with polarimetric retrievals.

  6. Cloud-based Networked Visual Servo Control

    OpenAIRE

    Wu, Haiyan; Lu, Lei; Chen, Chih-Chung; Hirche, Sandra; Kühnlenz, Kolja

    2013-01-01

    The performance of vision-based control systems, in particular of highly dynamic vision-based motion control systems, is often limited by the low sampling rate of the visual feedback caused by the long image processing time. In order to overcome this problem, the networked visual servo control, which integrates networked computational resources for cloud image processing, is considered in this article. The main contributions of this article are i) a real-time transport protocol for transmitti...

  7. Crowdsourcing cloud-based software development

    CERN Document Server

    Li, Wei; Tsai, Wei-Tek; Wu, Wenjun

    2015-01-01

    This book presents the latest research on the software crowdsourcing approach to develop large and complex software in a cloud-based platform. It develops the fundamental principles, management organization and processes, and a cloud-based infrastructure to support this new software development approach. The book examines a variety of issues in software crowdsourcing processes, including software quality, costs, diversity of solutions, and the competitive nature of crowdsourcing processes. Furthermore, the book outlines a research roadmap of this emerging field, including all the key technology and management issues for the foreseeable future. Crowdsourcing, as demonstrated by Wikipedia and Facebook for online web applications, has shown promising results for a variety of applications, including healthcare, business, gold mining exploration, education, and software development. Software crowdsourcing is emerging as a promising solution to designing, developing and maintaining software. Preliminary software cr...

  8. High-velocity frictional properties of gabbro

    Science.gov (United States)

    Tsutsumi, Akito; Shimamoto, Toshihiko

    High-velocity friction experiments have been performed on a pair of hollow-cylindrical specimens of gabbro initially at room temperature, at slip rates from 7.5 mm/s to 1.8 m/s, with total circumferential displacements of 125 to 174 m, and at normal stresses to 5 MPa, using a rotary-shear high-speed friction testing machine. Steady-state friction increases slightly with increasing slip rate at slip rates to about 100 mm/s (velocity strengthening) and it decreases markedly with increasing slip rate at higher velocities (velocity weakening). Steady-state friction in the velocity weakening regime is lower for the non-melting case than the frictional melting case, due perhaps to severe thermal fracturing. A very large peak friction is always recognized upon the initiation of visible frictional melting, presumably owing to the welding of fault surfaces upon the solidification of melt patches. Frictional properties thus change dramatically with increasing displacement at high velocities, and such a non-linear effect must be incorporated into the analysis of earthquake initiation processes.
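The velocity-strengthening/velocity-weakening trend described above can be captured by a simple piecewise-logarithmic friction law. The functional form and all parameter values below are illustrative placeholders, not fits to the gabbro data:

```python
# Illustrative sketch: steady-state friction strengthens mildly with slip
# rate up to a crossover velocity v_c (~0.1 m/s here) and weakens markedly
# above it, as in the experiments described above. Parameters are invented.
import math

def steady_state_friction(v, mu0=0.6, a=0.01, b=0.25, v_c=0.1):
    """Illustrative steady-state friction coefficient vs slip rate v (m/s)."""
    if v <= v_c:
        return mu0 + a * math.log10(v / v_c)   # mild velocity strengthening
    return mu0 - b * math.log10(v / v_c)       # marked velocity weakening

for v in (0.0075, 0.1, 1.8):
    print(f"v = {v:6.4f} m/s -> mu_ss = {steady_state_friction(v):.3f}")
```

The key nonlinearity for earthquake modeling is that friction peaks near the crossover and drops steeply at seismic slip rates, which is the displacement- and rate-dependent behavior the abstract argues must enter analyses of earthquake initiation.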

  9. NASA Cloud-Based Climate Data Services

    Science.gov (United States)

    McInerney, M. A.; Schnase, J. L.; Duffy, D. Q.; Tamkin, G. S.; Strong, S.; Ripley, W. D., III; Thompson, J. H.; Gill, R.; Jasen, J. E.; Samowich, B.; Pobre, Z.; Salmon, E. M.; Rumney, G.; Schardt, T. D.

    2012-12-01

    Cloud-based scientific data services are becoming an important part of NASA's mission. Our technological response is built around the concept of specialized virtual climate data servers, repetitive cloud provisioning, image-based deployment and distribution, and virtualization-as-a-service (VaaS). A virtual climate data server (vCDS) is an Open Archive Information System (OAIS) compliant, iRODS-based data server designed to support a particular type of scientific data collection. iRODS is data grid middleware that provides policy-based control over collection-building, managing, querying, accessing, and preserving large scientific data sets. We have deployed vCDS Version 1.0 in the Amazon EC2 cloud using S3 object storage and are using the system to deliver a subset of NASA's Intergovernmental Panel on Climate Change (IPCC) data products to the latest CentOS federated version of Earth System Grid Federation (ESGF), which is also running in the Amazon cloud. vCDS-managed objects are exposed to ESGF through FUSE (Filesystem in User Space), which presents a POSIX-compliant filesystem abstraction to applications such as the ESGF server that require such an interface. A vCDS manages data as a distinguished collection for a person, project, lab, or other logical unit. A vCDS can manage a collection across multiple storage resources using rules and microservices to enforce collection policies. And a vCDS can federate with other vCDSs to manage multiple collections over multiple resources, thereby creating what can be thought of as an ecosystem of managed collections. With the vCDS approach, we are trying to enable the full information lifecycle management of scientific data collections and make tractable the task of providing diverse climate data services. In this presentation, we describe our approach, experiences, lessons learned, and plans for the future. (A) vCDS/ESG system stack. (B) Conceptual architecture for NASA cloud-based data services.

  10. Cloud field classification based on textural features

    Science.gov (United States)

    Sengupta, Sailes Kumar

    1989-01-01

    An essential component in global climate research is accurate cloud cover and type determination. Of the two approaches to texture-based classification (statistical and structural), only the former is effective in the classification of natural scenes such as land, ocean, and atmosphere. In the statistical approach that was adopted, parameters characterizing the stochastic properties of the spatial distribution of grey levels in an image are estimated and then used as features for cloud classification. Two types of textural measures were used. One is based on the distribution of the grey level difference vector (GLDV), and the other on a set of textural features derived from the MaxMin cooccurrence matrix (MMCM). The GLDV method looks at the difference D of grey levels at pixels separated by a horizontal distance d and computes several statistics based on this distribution. These are then used as features in subsequent classification. The MaxMin textural features, on the other hand, are based on the MMCM, a matrix whose (I,J)th entry gives the relative frequency of occurrences of the grey level pair (I,J) that are consecutive and thresholded local extremes separated by a given pixel distance d. Textural measures are then computed based on this matrix in much the same manner as is done in texture computation using the grey level cooccurrence matrix. The database consists of 37 cloud field scenes from LANDSAT imagery using a near IR visible channel. The classification algorithm used is the well known Stepwise Discriminant Analysis. The overall accuracy was estimated by the percentage of correct classifications in each case. It turns out that both types of classifiers, at their best combination of features and at any given spatial resolution, give approximately the same classification accuracy. A neural network based classifier with a feed-forward architecture and a back-propagation training algorithm is used to increase the classification accuracy, using these two classes of features.
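The GLDV statistics described above are straightforward to sketch. The following is a minimal illustration, not the authors' exact feature set: it builds the distribution of horizontal grey-level differences at displacement d and derives mean, contrast, and entropy features from it.

```python
import numpy as np

def gldv_features(img, d=1):
    """Grey Level Difference Vector statistics for horizontal displacement d.

    Returns (mean, contrast, entropy) of the absolute grey-level
    difference distribution -- an illustrative subset of the statistical
    texture features described above.
    """
    img = np.asarray(img, dtype=np.int32)
    diff = np.abs(img[:, d:] - img[:, :-d]).ravel()  # |D| per pixel pair
    counts = np.bincount(diff)
    p = counts / counts.sum()                        # P(|D| = k)
    ks = np.arange(len(p))
    mean = float((ks * p).sum())                     # average difference
    contrast = float((ks ** 2 * p).sum())            # second moment
    nz = p[p > 0]
    entropy = float(-(nz * np.log2(nz)).sum())       # distribution entropy
    return mean, contrast, entropy

# A perfectly flat image has zero difference everywhere,
# so all three features vanish.
flat = np.full((8, 8), 5)
m, c, e = gldv_features(flat)
```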

  11. Cloud fraction and cloud base measurements from scanning Doppler lidar during WFIP-2

    Science.gov (United States)

    Bonin, T.; Long, C.; Lantz, K. O.; Choukulkar, A.; Pichugina, Y. L.; McCarty, B.; Banta, R. M.; Brewer, A.; Marquis, M.

    2017-12-01

    The second Wind Forecast Improvement Project (WFIP-2) consisted of an 18-month field deployment of a variety of instrumentation with the principal objective of validating and improving NWP forecasts for wind energy applications in complex terrain. As part of the set of instrumentation, several scanning Doppler lidars were installed across the study domain, primarily to measure profiles of the mean wind and turbulence at high resolution within the planetary boundary layer. In addition to these measurements, Doppler lidar observations can be used to directly quantify the cloud fraction and cloud base, since clouds appear as a high backscatter return. These supplementary measurements of clouds can then be used to validate cloud cover and other properties in NWP output. Herein, statistics of the cloud fraction and cloud base height from the duration of WFIP-2 are presented. Additionally, these cloud fraction estimates from Doppler lidar are compared with similar measurements from a Total Sky Imager and Radiative Flux Analysis (RadFlux) retrievals at the Wasco site. During mostly cloudy to overcast conditions, estimates of the cloud radiating temperature from the RadFlux methodology are also compared with Doppler lidar measured cloud base height.
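The detection idea above, that clouds appear as a high backscatter return, can be sketched as a per-profile threshold test: a profile is cloudy if any range gate exceeds the threshold, and its cloud base is the lowest such gate. The threshold and the toy profiles below are illustrative assumptions, not WFIP-2 values.

```python
import numpy as np

def cloud_stats(backscatter, heights, threshold):
    """Cloud fraction and per-profile cloud base from lidar backscatter.

    backscatter: (n_profiles, n_gates) attenuated backscatter.
    threshold is instrument specific and chosen here for illustration.
    """
    cloudy = backscatter > threshold                     # boolean mask
    has_cloud = cloudy.any(axis=1)
    fraction = float(has_cloud.mean())
    base = np.where(has_cloud,
                    heights[np.argmax(cloudy, axis=1)],  # first cloudy gate
                    np.nan)                              # clear profile
    return fraction, base

heights = np.arange(0, 3000, 100)          # range gates every 100 m
bs = np.full((4, len(heights)), 0.1)       # 4 profiles, clear background
bs[0, 15:] = 5.0                           # cloud based at 1500 m
bs[1, 20:] = 5.0                           # cloud based at 2000 m
frac, base = cloud_stats(bs, heights, threshold=1.0)
```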

  12. Research on cloud background infrared radiation simulation based on fractal and statistical data

    Science.gov (United States)

    Liu, Xingrun; Xu, Qingshan; Li, Xia; Wu, Kaifeng; Dong, Yanbing

    2018-02-01

    Clouds are an important natural phenomenon, and their radiation causes serious interference to infrared detectors. Based on fractal theory and statistical data, a method is proposed to simulate cloud backgrounds, with the cloud infrared radiation data field assigned from satellite radiance data of clouds. A cloud infrared radiation simulation model is established in MATLAB; it can generate cloud background infrared images for different cloud types (low, middle, and high cloud) in different months, spectral bands, and sensor zenith angles.
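The abstract does not specify its fractal algorithm, but a common way to realize a fractal cloud-like field is spectral synthesis: filter white noise in Fourier space with a power-law spectrum ~ k^(-beta). A minimal sketch under that assumption; radiance values would then be assigned from satellite statistics as described.

```python
import numpy as np

def fractal_cloud(n=64, beta=3.0, seed=0):
    """Generate an n x n fractal cloud-like field by spectral synthesis.

    White noise is shaped with a k**(-beta) power spectrum, a standard
    fractal-texture construction (illustrative; not the paper's exact
    method).  Output is normalised to [0, 1] so it can be mapped onto
    cloud radiance statistics.
    """
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((n, n))
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    k = np.sqrt(kx ** 2 + ky ** 2)
    k[0, 0] = 1.0                      # avoid division by zero at DC
    amplitude = k ** (-beta / 2.0)     # power ~ amplitude**2 ~ k**-beta
    field = np.fft.ifft2(np.fft.fft2(noise) * amplitude).real
    field -= field.min()
    return field / field.max()

cloud = fractal_cloud()
```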

  13. Definition of "banner clouds" based on time lapse movies

    OpenAIRE

    Schween , J. H.; Kuettner , J.; Reinert , D.; Reuder , J.; Wirth , V.

    2007-01-01

    Banner clouds appear on the leeward side of a mountain and resemble a banner or a flag. This article provides a comprehensive definition of "banner clouds". It is based primarily on an extensive collection of time lapse movies, but previous attempts at an explanation of this phenomenon are also taken into account. The following ingredients are considered essential: the cloud must be attached to the mountain but not appear on the windward side; the cloud must originate ...

  14. Point Cloud Based Change Detection - an Automated Approach for Cloud-based Services

    Science.gov (United States)

    Collins, Patrick; Bahr, Thomas

    2016-04-01

    The fusion of stereo photogrammetric point clouds with LiDAR data or terrain information derived from SAR interferometry has significant potential for 3D topographic change detection. In the present case study the latest point cloud generation and analysis capabilities are used to examine a landslide that occurred in the village of Malin in Maharashtra, India, on 30 July 2014, and affected an area of ca. 44,000 m². It focuses on Pléiades high resolution satellite imagery and the Airbus DS WorldDEM™ as a product of the TanDEM-X mission. This case study was performed using the COTS software package ENVI 5.3. Integration of custom processes and automation is supported by IDL (Interactive Data Language). Thus, ENVI analytics is run via the object-oriented and IDL-based ENVITask API. The pre-event topography is represented by the WorldDEM™ product, delivered with a raster of 12 m x 12 m and based on the EGM2008 geoid (called pre-DEM). For the post-event situation a Pléiades 1B stereo image pair of the affected AOI was obtained. The ENVITask "GeneratePointCloudsByDenseImageMatching" was implemented to extract passive point clouds in LAS format from the panchromatic stereo datasets: • A dense image-matching algorithm is used to identify corresponding points in the two images. • A block adjustment is applied to refine the 3D coordinates that describe the scene geometry. • Additionally, the WorldDEM™ was input to constrain the range of heights in the matching area, and subsequently the length of the epipolar line. The "PointCloudFeatureExtraction" task was executed to generate the post-event digital surface model from the photogrammetric point clouds (called post-DEM). Post-processing consisted of the following steps: • Adding the geoid component (EGM2008) to the post-DEM. • Reprojecting the pre-DEM to the UTM Zone 43N (WGS-84) coordinate system and resizing. • Subtracting the pre-DEM from the post-DEM. • Filtering and threshold based classification of
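The core of the post-processing, subtracting the pre-event DEM from the post-event DEM and classifying the difference by a threshold, can be sketched as follows. The 2 m threshold and the toy rasters are illustrative assumptions; a real workflow would first handle the reprojection and geoid steps listed above.

```python
import numpy as np

def detect_change(pre_dem, post_dem, threshold=2.0):
    """Difference two co-registered DEMs and classify change by threshold.

    dh = post - pre; cells with dh < -threshold are flagged as material
    loss, dh > +threshold as deposition.  Both rasters must already be
    on the same grid and vertical datum.
    """
    dh = post_dem - pre_dem            # elevation change in metres
    loss = dh < -threshold             # material removed (e.g. scarp)
    gain = dh > threshold              # material deposited (e.g. debris)
    return dh, loss, gain

pre = np.zeros((3, 3))                 # flat pre-event surface (toy data)
post = np.array([[0.0, -5.0, 0.0],
                 [0.0, -5.0, 3.0],
                 [0.0,  0.0, 3.0]])
dh, loss, gain = detect_change(pre, post)
```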

  15. Teaching Thousands with Cloud-based GIS

    Science.gov (United States)

    Gould, Michael; DiBiase, David; Beale, Linda

    2016-04-01

    Teaching Thousands with Cloud-based GIS Educators often draw a distinction between "teaching about GIS" and "teaching with GIS." Teaching about GIS involves helping students learn what GIS is, what it does, and how it works. On the other hand, teaching with GIS involves using the technology as a means to achieve education objectives in the sciences, social sciences, professional disciplines like engineering and planning, and even the humanities. The same distinction applies to CyberGIS. Understandably, early efforts to develop CyberGIS curricula and educational resources tend to be concerned primarily with CyberGIS itself. However, if CyberGIS becomes as functional, usable and scalable as it aspires to be, teaching with CyberGIS has the potential to enable large and diverse global audiences to perform spatial analysis using hosted data, mapping and analysis services all running in the cloud. Early examples of teaching tens of thousands of students across the globe with cloud-based GIS include the massive open online courses (MOOCs) offered by Penn State University and others, as well as the series of MOOCs more recently developed and offered by Esri. In each case, ArcGIS Online was used to help students achieve educational objectives in subjects like business, geodesign, geospatial intelligence, and spatial analysis, as well as mapping. Feedback from the more than 100,000 total student participants to date, as well as from the educators and staff who supported these offerings, suggest that online education with cloud-based GIS is scalable to very large audiences. Lessons learned from the course design, development, and delivery of these early examples may be useful in informing the continuing development of CyberGIS education. While MOOCs may have passed the peak of their "hype cycle" in higher education, the phenomenon they revealed persists: namely, a global mass market of educated young adults who turn to free online education to expand their horizons. The

  16. A Reputation-Based Identity Management Model for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Lifa Wu

    2015-01-01

    In the field of cloud computing, most research on identity management has concentrated on protecting user data. However, users typically leave a trail when they access cloud services, and the resulting user traceability can potentially lead to the leakage of sensitive user information. Meanwhile, malicious users can do harm to cloud providers through the use of pseudonyms. To solve these problems, we introduce a reputation mechanism and design a reputation-based identity management model for cloud computing. In the model, pseudonyms are generated based on a reputation signature so as to guarantee the untraceability of pseudonyms, and a mechanism that calculates user reputation is proposed, which helps cloud service providers to identify malicious users. Analysis verifies that the model can ensure that users access cloud services anonymously and that cloud providers assess the credibility of users effectively without violating user privacy.
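The paper's pseudonym construction is built on a reputation signature and is not reproduced here. The following sketch only illustrates the general idea of deriving unlinkable pseudonyms from a secret value: an HMAC over the identity and an epoch counter stands in for the signature, so successive pseudonyms cannot be linked without the secret.

```python
import hashlib
import hmac

def make_pseudonym(identity: str, reputation_sig: bytes, epoch: int) -> str:
    """Derive a deterministic but unlinkable pseudonym (illustrative only).

    Without knowledge of reputation_sig, pseudonyms for different epochs
    are computationally unlinkable; with it, the provider can recompute
    and verify them.
    """
    msg = f"{identity}:{epoch}".encode()
    return hmac.new(reputation_sig, msg, hashlib.sha256).hexdigest()[:16]

sig = b"reputation-signature-secret"     # stand-in for the signature value
p1 = make_pseudonym("alice", sig, epoch=1)
p2 = make_pseudonym("alice", sig, epoch=2)
```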

  17. A survey of high-velocity H I in the Cetus region

    International Nuclear Information System (INIS)

    Cohen, R.J.

    1982-01-01

    The region around 02ʰ16ᵐ containing the Cohen and Davies complex of high-velocity clouds has been surveyed in the 21-cm line of H I using the Jodrell Bank MK II radio telescope (beamwidth 31 x 34 arcmin). The high-velocity cloud complex was sampled every 2ᵐ in right ascension and every 0°.5 in declination. The observations cover a velocity range of 2100 km s⁻¹ with a resolution of 7.3 km s⁻¹ and an rms noise level of 0.025 K. No HVCs were found outside the velocity range -400 to +100 km s⁻¹. The data are presented on microfiche as a set of contour maps showing 21-cm line temperature as a function of declination and radial velocity at constant values of right ascension. Discussion is centred on the very-high-velocity clouds at velocities of -360 to -190 km s⁻¹. It is concluded that they are probably debris from the tidal interaction between our Galaxy and the Magellanic Clouds. (author)

  18. ID based cryptography for secure cloud data storage

    OpenAIRE

    Kaaniche , Nesrine; Boudguiga , Aymen; Laurent , Maryline

    2013-01-01

    This paper addresses the security issues of storing sensitive data in a cloud storage service and the need for users to trust the commercial cloud providers. It proposes a cryptographic scheme for cloud storage, based on an original usage of ID-Based Cryptography. Our solution has several advantages. First, it provides secrecy for encrypted data which are stored in public servers. Second, it offers controlled data access and sharing among users, so that unauthorized us...

  19. Research on cloud-based remote measurement and analysis system

    Science.gov (United States)

    Gao, Zhiqiang; He, Lingsong; Su, Wei; Wang, Can; Zhang, Changfan

    2015-02-01

    The promising potential of cloud computing and its convergence with technologies such as cloud storage, cloud push, and mobile computing allows for the creation and delivery of new types of cloud service. Following this idea, the paper presents a cloud-based remote measurement and analysis system. The system mainly consists of three parts: a signal acquisition client, a web server deployed as a cloud service, and a remote client. It is implemented as a website developed using asp.net and Flex RIA technology, which resolves the trade-off between the two monitoring modes, B/S and C/S. The platform supplies condition monitoring and data analysis services to customers over the Internet and is deployed on a cloud server. The signal acquisition device is responsible for collecting data (sensor data, audio, video, etc.) and regularly pushing the monitoring data to the cloud storage database. Data acquisition equipment in this system needs only data collection and network capability, so devices such as smartphones and smart sensors qualify. The system's scale can be adjusted dynamically according to the number of applications and users, so it does not waste resources. As a representative case study, we developed a prototype system based on the Ali cloud service using a rotor test rig as the research object. Experimental results demonstrate that the proposed system architecture is feasible.

  20. Route Assessment for Unmanned Aerial Vehicle Based on Cloud Model

    Directory of Open Access Journals (Sweden)

    Xixia Sun

    2014-01-01

    An integrated route assessment approach based on cloud model is proposed in this paper, where various sources of uncertainties are well kept and modeled by cloud theory. Firstly, a systemic criteria framework incorporating models for scoring subcriteria is developed. Then, the cloud model is introduced to represent linguistic variables, and survivability probability histogram of each route is converted into normal clouds by cloud transformation, enabling both randomness and fuzziness in the assessment environment to be managed simultaneously. Finally, a new way to measure the similarity between two normal clouds satisfying reflexivity, symmetry, transitivity, and overlapping is proposed. Experimental results demonstrate that the proposed route assessment approach outperforms fuzzy logic based assessment approach with regard to feasibility, reliability, and consistency with human thinking.
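A normal cloud is characterized by three numbers: expectation Ex, entropy En, and hyper-entropy He. The standard forward generator and an overlap-style similarity can be sketched as below; the similarity formula here is a simple illustrative choice satisfying reflexivity and symmetry, not necessarily the paper's measure.

```python
import numpy as np

def forward_cloud(Ex, En, He, n=1000, seed=0):
    """Forward normal cloud generator: n drops (x, membership degree).

    Each drop uses a randomised entropy En' ~ N(En, He^2), giving the
    characteristic 'fuzzy' normal cloud of points.
    """
    rng = np.random.default_rng(seed)
    En_ = rng.normal(En, He, n)                 # randomised entropy
    x = rng.normal(Ex, np.abs(En_))             # drop positions
    mu = np.exp(-(x - Ex) ** 2 / (2 * En_ ** 2))  # membership degrees
    return x, mu

def cloud_similarity(c1, c2):
    """Similarity of two (Ex, En, He) clouds in [0, 1] (illustrative):
    1 for identical parameters, decaying with parameter distance."""
    d = np.linalg.norm(np.asarray(c1, float) - np.asarray(c2, float))
    return float(np.exp(-d))

x, mu = forward_cloud(Ex=5.0, En=1.0, He=0.1)
s_same = cloud_similarity((5, 1, 0.1), (5, 1, 0.1))
```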

  1. A physically based algorithm for non-blackbody correction of the cloud top temperature for the convective clouds

    Science.gov (United States)

    Wang, C.; Luo, Z. J.; Chen, X.; Zeng, X.; Tao, W.; Huang, X.

    2012-12-01

    Cloud top temperature is a key parameter to retrieve in the remote sensing of convective clouds, but passive remote sensing cannot measure it directly. Here we explore a synergistic way of estimating cloud top temperature by making use of simultaneous passive and active remote sensing of clouds (in this case, CloudSat and MODIS). The weighting function of the MODIS 11μm band is explicitly calculated by feeding cloud hydrometeor profiles from CloudSat retrievals, and temperature and humidity profiles from the ECMWF ERA-Interim reanalysis, into a radiative transfer model. Among 19,699 tropical deep convective clouds observed by CloudSat in 2008, the average effective emission level (EEL, where the weighting function attains its maximum) is at optical depth 0.91 with a standard deviation of 0.33. Furthermore, the vertical gradient of CloudSat radar reflectivity, an indicator of the fuzziness of the convective cloud top, is linearly proportional to d_{CTH-EEL}, the distance between the EEL of the 11μm channel and the cloud top height (CTH) determined by CloudSat, when d_{CTH-EEL}<0.6 km. Beyond 0.6 km, the distance has little sensitivity to the vertical gradient of CloudSat radar reflectivity. Based on these findings, we derive a formula between the fuzziness in the cloud top region, which is measurable by CloudSat, and the MODIS 11μm brightness temperature, assuming that the difference between the effective emission temperature and the 11μm brightness temperature is proportional to the cloud top fuzziness. This formula is verified using deep convective cloud profiles simulated by the Goddard Cumulus Ensemble model. We further discuss the application of this formula in estimating cloud top buoyancy, as well as the error characteristics of the radiative calculation within such deep-convective clouds.

  2. An investigation of cloud base height in Chiang Mai

    Science.gov (United States)

    Peengam, S.; Tohsing, K.

    2017-09-01

    Clouds play a very important role in the variation of surface solar radiation and in rain formation. To understand this role, it is necessary to know the physical and geometrical properties of clouds. However, clouds vary with location and time, which makes their properties difficult to obtain. In this work, a ceilometer was installed at a station of the Royal Rainmaking and Agricultural Aviation Department in Chiang Mai (17.80° N, 98.43° E) in order to measure cloud base height. The cloud base height data from this instrument were compared with those obtained from LiDAR, a more sophisticated instrument installed at the same site. It was found that the cloud base height from both instruments was in reasonable agreement, with a root mean square difference (RMSD) and mean bias difference (MBD) of 19.21% and 1.58%, respectively. Afterward, a six-month period (August 2016-January 2017) of data from the ceilometer was analyzed. The results show that the mean cloud base height during this period is 1.5 km, meaning that most clouds are in the category of low-level cloud.
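The RMSD and MBD statistics used for the instrument comparison can be computed as below. Normalising by the mean reference value is one common convention (the abstract does not state its exact normalisation), and the height values are made up for illustration.

```python
import numpy as np

def rmsd_mbd_percent(test, reference):
    """Relative RMSD and MBD (%) between two instruments' measurements,
    expressed as a percentage of the mean reference value."""
    test = np.asarray(test, float)
    reference = np.asarray(reference, float)
    diff = test - reference
    scale = reference.mean()
    rmsd = 100.0 * np.sqrt((diff ** 2).mean()) / scale
    mbd = 100.0 * diff.mean() / scale
    return float(rmsd), float(mbd)

# Hypothetical paired cloud base heights (km), ceilometer vs. lidar
ceilometer = [1.0, 1.5, 2.0, 1.2]
lidar      = [1.1, 1.4, 2.1, 1.2]
rmsd, mbd = rmsd_mbd_percent(ceilometer, lidar)
```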

  3. Service quality of cloud-based applications

    CERN Document Server

    Bauer, Eric

    2014-01-01

    This book explains why applications running on cloud might not deliver the same service reliability, availability, latency and overall quality to end users as they do when the applications are running on traditional (non-virtualized, non-cloud) configurations, and explains what can be done to mitigate that risk.

  4. Cloud-based Networked Visual Servo Control

    DEFF Research Database (Denmark)

    Wu, Haiyan; Lu, Lei; Chen, Chih-Chung

    2013-01-01

    , which integrates networked computational resources for cloud image processing, is considered in this article. The main contributions of this article are i) a real-time transport protocol for transmitting large volume image data on a cloud computing platform, which enables high sampling rate visual...

  5. Definition of "banner clouds" based on time lapse movies

    Directory of Open Access Journals (Sweden)

    J. H. Schween

    2007-01-01

    Banner clouds appear on the leeward side of a mountain and resemble a banner or a flag. This article provides a comprehensive definition of "banner clouds". It is based primarily on an extensive collection of time lapse movies, but previous attempts at an explanation of this phenomenon are also taken into account. The following ingredients are considered essential: the cloud must be attached to the mountain but not appear on the windward side; the cloud must originate from condensation of water vapour contained in the air (rather than consist of blowing snow); the cloud must be persistent; and the cloud must not be of convective nature. The definition is illustrated and discussed with the help of still images and time lapse movies taken at Mount Zugspitze in the Bavarian Alps.

  6. Professional SharePoint 2010 Cloud-Based Solutions

    CERN Document Server

    Fox, Steve; Stubbs, Paul; Follette, Donovan

    2011-01-01

    An authoritative guide to extending SharePoint's power with cloud-based services If you want to be part of the next major shift in the IT industry, you'll want this book. Melding two of the hottest trends in the industry—the widespread popularity of the SharePoint collaboration platform and the rapid rise of cloud computing—this practical guide shows developers how to extend their SharePoint solutions with the cloud's almost limitless capabilities. See how to get started, discover smart ways to leverage cloud data and services through Azure, start incorporating Twitter or LinkedIn

  7. Developing cloud-based Business Process Management (BPM): a survey

    Science.gov (United States)

    Mercia; Gunawan, W.; Fajar, A. N.; Alianto, H.; Inayatulloh

    2018-03-01

    In today's highly competitive business environment, modern enterprises struggle to cut unnecessary costs, eliminate waste, and deliver large benefits for the organization. Companies are increasingly turning to a more flexible IT environment to help them realize this goal. For this reason, the article applies cloud-based Business Process Management (BPM), which lets organizations focus on modeling, monitoring and process management. Cloud-based BPM consists of business processes, business information and IT resources, which help build real-time intelligence systems based on business management and cloud technology. Cloud computing is a paradigm that involves procuring dynamically measurable resources over the internet as an IT resource service. A cloud-based BPM service addresses common problems faced by traditional BPM, especially in promoting flexible, event-driven business processes that exploit opportunities in the marketplace.

  8. Performance Isolation in Cloud-Based Big Data Architectures

    NARCIS (Netherlands)

    Tekinerdogan, B.; Oral, Alp

    2017-01-01

    Cloud-based big data systems usually have many different tenants that require access to the server's functionality. In a nonisolated cloud system, the different tenants can freely use the resources of the server. Hereby, disruptive tenants who exceed their limits can easily cause degradation of

  9. The governance of cloud based Supply Chain Collaborations

    NARCIS (Netherlands)

    Chandra, Dissa Riandaso; van Hillegersberg, Jos

    2015-01-01

    Despite the promising benefits of cloud computing in enabling efficient, sustainable and agile Supply Chain Collaborations (SCCs), this service does not eliminate governance challenges in SCCs. Cloud based SCCs may flounder without a proper understanding of how to govern inter-organizational

  10. Electrical signature in polar night cloud base variations

    International Nuclear Information System (INIS)

    Harrison, R Giles; Ambaum, Maarten H P

    2013-01-01

    Layer clouds are globally extensive. Their lower edges are charged negatively by the fair weather atmospheric electricity current flowing vertically through them. Using polar winter surface meteorological data from Sodankylä (Finland) and Halley (Antarctica), we find that when meteorological diurnal variations are weak, an appreciable diurnal cycle, on average, persists in the cloud base heights, detected using a laser ceilometer. The diurnal cloud base heights from both sites correlate more closely with the Carnegie curve of global atmospheric electricity than with local meteorological measurements. The cloud base sensitivities are indistinguishable between the northern and southern hemispheres, averaging a (4.0 ± 0.5) m rise for a 1% change in the fair weather electric current density. This suggests that the global fair weather current, which is affected by space weather, cosmic rays and the El Niño Southern Oscillation, is linked with layer cloud properties. (letter)

  11. Big data mining analysis method based on cloud computing

    Science.gov (United States)

    Cai, Qing Qiu; Cui, Hong Gang; Tang, Hao

    2017-08-01

    In the information explosion era, big data's very large scale and its discrete, unstructured or semi-structured features have gone far beyond what traditional data management methods can handle. With the arrival of the cloud computing era, cloud computing provides a new technical way to analyze massive data, which can effectively solve the problem that traditional data mining methods cannot adapt to massive data. This paper introduces the meaning and characteristics of cloud computing, analyzes the advantages of using cloud computing technology to realize data mining, designs a mining algorithm for association rules based on the MapReduce parallel processing architecture, and carries out experimental verification. The parallel association rule mining algorithm based on a cloud computing platform can greatly improve the execution speed of data mining.
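The MapReduce structure of parallel association-rule mining can be sketched with a pair-counting map phase and a summing reduce phase, a simplification of Apriori-style candidate counting on MapReduce. Function names and data are illustrative, not the paper's implementation.

```python
from collections import Counter
from itertools import combinations

def map_phase(transaction):
    """Map: emit (item-pair, 1) for every item pair in one transaction.
    On a cluster, each mapper would process a shard of transactions."""
    return [(pair, 1) for pair in combinations(sorted(set(transaction)), 2)]

def reduce_phase(mapped):
    """Reduce: sum counts per pair (done in parallel per key on a cluster)."""
    counts = Counter()
    for pairs in mapped:
        for pair, n in pairs:
            counts[pair] += n
    return counts

def frequent_pairs(transactions, min_support):
    """Pairs occurring in at least min_support transactions -- the
    candidate itemsets from which association rules are derived."""
    counts = reduce_phase(map_phase(t) for t in transactions)
    return {pair: n for pair, n in counts.items() if n >= min_support}

baskets = [["milk", "bread"], ["milk", "bread", "eggs"], ["bread", "eggs"]]
freq = frequent_pairs(baskets, min_support=2)
```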

  12. Cloud Based Earth Observation Data Exploitation Platforms

    Science.gov (United States)

    Romeo, A.; Pinto, S.; Loekken, S.; Marin, A.

    2017-12-01

    In the last few years data produced daily by several private and public Earth Observation (EO) satellites reached the order of tens of Terabytes, representing for scientists and commercial application developers both a big opportunity for their exploitation and a challenge for their management. New IT technologies, such as Big Data and cloud computing, enable the creation of web-accessible data exploitation platforms, which offer to scientists and application developers the means to access and use EO data in a quick and cost effective way. RHEA Group is particularly active in this sector, supporting the European Space Agency (ESA) in the Exploitation Platforms (EP) initiative, developing technology to build multi cloud platforms for the processing and analysis of Earth Observation data, and collaborating with larger European initiatives such as the European Plate Observing System (EPOS) and the European Open Science Cloud (EOSC). An EP is a virtual workspace, providing a user community with access to (i) large volumes of data, (ii) an algorithm development and integration environment, (iii) processing software and services (e.g. toolboxes, visualization routines), (iv) computing resources, (v) collaboration tools (e.g. forums, wiki, etc.). When an EP is dedicated to a specific theme, it becomes a Thematic Exploitation Platform (TEP). Currently, ESA has seven TEPs in a pre-operational phase dedicated to geo-hazards monitoring and prevention, coastal zones, forestry areas, hydrology, polar regions, urban areas and food security. On the technology development side, solutions like the multi cloud EO data processing platform provide the technology to integrate ICT resources and EO data from different vendors in a single platform. In particular it offers (i) Multi-cloud data discovery, (ii) Multi-cloud data management and access and (iii) Multi-cloud application deployment.
This platform has been demonstrated with the EGI Federated Cloud, Innovation Platform Testbed Poland

  13. pCloud: A Cloud-based Power Market Simulation Environment

    Energy Technology Data Exchange (ETDEWEB)

    Rudkevich, Aleksandr; Goldis, Evgeniy

    2012-12-02

    This research conducted by the Newton Energy Group, LLC (NEG) is dedicated to the development of pCloud: a Cloud-based Power Market Simulation Environment. pCloud offers power industry stakeholders the capability to model electricity markets and is organized around the Software as a Service (SaaS) concept -- a software application delivery model in which software is centrally hosted and provided to many users via the internet. During Phase I of this project NEG developed a prototype design for pCloud as a SaaS-based commercial service offering and a system architecture supporting that design, ensured the feasibility of the architecture's key elements, formed technological partnerships and negotiated commercial agreements with partners, conducted market research and other related activities, and secured funding to continue development of pCloud between the end of Phase I and the beginning of Phase II, if awarded. Based on the results of Phase I activities, NEG has established that the development of a cloud-based power market simulation environment within the Windows Azure platform is technologically feasible and can be accomplished within the budget and timeframe available through the Phase II SBIR award with additional external funding. NEG believes that pCloud has the potential to become a game-changing technology for the modeling and analysis of electricity markets. This potential is due to the following critical advantages of pCloud over its competition: - Standardized access to advanced and proven power market simulators offered by third parties. - Automated parallelization of simulations and dynamic provisioning of computing resources on the cloud. This combination of automation and scalability dramatically reduces turn-around time while offering the capability to increase the number of analyzed scenarios by a factor of 10, 100 or even 1000. - Access to ready-to-use data and to cloud-based resources, leading to a reduction in software, hardware, and IT costs

  14. A QUALITY BASED ENHANCEMENT OF USER DATA PROTECTION VIA FUZZY RULE BASED SYSTEMS IN CLOUD ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    R Poorva Devi

    2016-04-01

    In cloud computing, individual customers access and consume an enormous number of services through the web, offered by cloud service providers (CSPs). Although the cloud offers security-as-a-service to its clients, users remain reluctant to rely on the cloud vendor's service. Many solutions, security components and measurements have been proposed for the cloud security issue, but only a 79.2% security outcome has been obtained by scientists, researchers and the wider cloud-based academic community. To overcome the problem of cloud security, the proposed model, "Quality based enhancement of user data protection via fuzzy rule based systems in cloud environment", helps cloud clients access cloud resources through remote monitoring management (RMMM); the services currently being requested and consumed by cloud users can be better analyzed with a managed service provider (MSP) than with a traditional CSP. Normally, users try to secure their own private data by applying key management and cryptographic computations, which again leads to security problems. The model provides a good-quality security result by making use of fuzzy rule based systems (constraint and conclusion segments) in the cloud environment. Using this technique, users may obtain an efficient security outcome through the Apache CloudStack simulation tool.

  15. Using Cloud-based Storage Technologies for Earth Science Data

    Science.gov (United States)

    Michaelis, A.; Readey, J.; Votava, P.

    2016-12-01

    Cloud based infrastructure may offer several key benefits of scalability, built in redundancy and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and software systems developed for NASA data repositories were not developed with a cloud based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Object storage services are provided through all the leading public (Amazon Web Services, Microsoft Azure, Google Cloud, etc.) and private (OpenStack) clouds, and may provide a more cost-effective means of storing large data collections online. We describe a system that utilizes object storage rather than traditional file system based storage to vend earth science data. The system described is not only cost effective, but shows superior performance for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
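The object-storage access pattern behind such systems, a flat key space of immutable objects with large arrays split into independently retrievable chunks, can be sketched with an in-memory stand-in for a service like S3. This is illustrative only; the client libraries described above target the HDF5/NetCDF4 APIs rather than this toy interface.

```python
import numpy as np

class ObjectStore:
    """Minimal in-memory stand-in for a cloud object store:
    a flat key -> bytes namespace with put/get, no POSIX semantics."""
    def __init__(self):
        self._objects = {}
    def put(self, key, data: bytes):
        self._objects[key] = data
    def get(self, key) -> bytes:
        return self._objects[key]

def put_array_chunked(store, name, arr, chunk_rows):
    """Store a 2-D array as one object per row-chunk -- the layout that
    lets cloud analytics fetch only the chunks a query touches."""
    for i in range(0, arr.shape[0], chunk_rows):
        chunk = arr[i:i + chunk_rows]
        store.put(f"{name}/chunk-{i // chunk_rows}", chunk.tobytes())

def get_chunk(store, name, index, ncols, dtype):
    """Retrieve a single chunk without reading the whole dataset."""
    data = store.get(f"{name}/chunk-{index}")
    return np.frombuffer(data, dtype=dtype).reshape(-1, ncols)

store = ObjectStore()
temps = np.arange(12, dtype=np.float64).reshape(6, 2)   # toy dataset
put_array_chunked(store, "temps", temps, chunk_rows=2)
middle = get_chunk(store, "temps", 1, ncols=2, dtype=np.float64)
```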

  16. A Machine Learning Based Intrusion Impact Analysis Scheme for Clouds

    Directory of Open Access Journals (Sweden)

    Junaid Arshad

    2012-01-01

    Full Text Available Clouds represent a major paradigm shift, inspiring the contemporary approach to computing. They present fascinating opportunities to address dynamic user requirements with the provision of on-demand, expandable computing infrastructures. However, Clouds introduce novel security challenges which need to be addressed to facilitate widespread adoption. This paper is focused on one such challenge: intrusion impact analysis. In particular, we highlight the significance of intrusion impact analysis for the overall security of Clouds. Additionally, we present a machine learning based scheme to address this challenge in accordance with the specific requirements of Clouds for intrusion impact analysis. We also present a rigorous evaluation performed to assess the effectiveness and feasibility of the proposed method. The evaluation results demonstrate a high degree of effectiveness in correctly determining the impact of an intrusion, along with a significant reduction in intrusion response time.

  17. A comparison of food crispness based on the cloud model.

    Science.gov (United States)

    Wang, Minghui; Sun, Yonghai; Hou, Jumin; Wang, Xia; Bai, Xue; Wu, Chunhui; Yu, Libo; Yang, Jie

    2018-02-01

    The cloud model is a typical model which transforms a qualitative concept into a quantitative description, and it has rarely been used in texture studies. The purpose of this study was to apply the cloud model to food crispness comparison. The acoustic signals of carrots, white radishes, potatoes, Fuji apples, and crystal pears were recorded during compression, and three time-domain signal characteristics were extracted: sound intensity, maximum short-time frame energy, and waveform index. These three characteristics and the cloud model were used to compare the crispness of the samples. The crispness based on the Ex value of the cloud model, in descending order, was carrot > potato > white radish > Fuji apple > crystal pear. To verify the results of the acoustic signals, mechanical measurement and sensory evaluation were conducted; both verification experiments confirmed the feasibility of the cloud model. The microstructures of the five samples were also analyzed, and the microstructure parameters were negatively related with crispness. The cloud model method can be used for crispness comparison of different kinds of foods, and is more accurate than traditional methods such as mechanical measurement and sensory evaluation. The cloud model method can also be applied extensively to other texture studies. © 2017 Wiley Periodicals, Inc.
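The Ex value mentioned above is one of the cloud model's three numerical characteristics (Ex, En, He). A standard way to obtain them from raw measurements is the backward normal cloud generator; the sketch below implements its common no-certainty-degree form, with made-up sample values, and is not the paper's exact estimation procedure.

```python
import math

# Backward normal cloud generator (no certainty degrees): estimate the cloud
# model's numerical characteristics (Ex, En, He) from raw samples.
def backward_cloud(samples: list) -> tuple:
    n = len(samples)
    ex = sum(samples) / n                                  # expectation Ex
    # Entropy En from the mean absolute deviation:
    en = math.sqrt(math.pi / 2) * sum(abs(x - ex) for x in samples) / n
    s2 = sum((x - ex) ** 2 for x in samples) / (n - 1)     # sample variance
    he = math.sqrt(abs(s2 - en ** 2))                      # hyper-entropy He
    return ex, en, he

# Illustrative acoustic-feature values for one food sample:
ex, en, he = backward_cloud([10.0, 12.0, 11.0, 13.0, 9.0])
print(round(ex, 2))  # prints 11.0 — larger Ex would indicate a crisper sample
```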

  18. Analyzing cloud base at local and regional scales to understand tropical montane cloud forest vulnerability to climate change

    Science.gov (United States)

    Ashley E. Van Beusekom; Grizelle Gonzalez; Martha A. Scholl

    2017-01-01

    The degree to which cloud immersion provides water in addition to rainfall, suppresses transpiration, and sustains tropical montane cloud forests (TMCFs) during rainless periods is not well understood. Climate and land use changes represent a threat to these forests if cloud base altitude rises as a result of regional warming or deforestation. To establish a baseline...

  19. Cloud model construct for transaction-based cooperative systems ...

    African Journals Online (AJOL)

    Cloud model construct for transaction-based cooperative systems. ... procure cutting edge Information Technology infrastructure are some of the problems faced ... Results also reveal that credit cooperatives will benefit from the model by taking ...

  20. Foundations of Blueprint for Cloud-based Service Engineering

    OpenAIRE

    Nguyen, D.K.

    2011-01-01

    Current cloud-based service offerings are often provided as one-size-fits-all solution and give little or no room for customization. This limits the ability for application developers to pick and choose offerings from multiple software, platform and infrastructure service providers and configure them dynamically and in an optimal fashion to address their application requirements. Furthermore, combining different independent cloud-based services necessitates a uniform description format that f...

  1. Optimising TCP for cloud-based mobile networks

    DEFF Research Database (Denmark)

    Artuso, Matteo; Christiansen, Henrik Lehrmann

    2016-01-01

    Cloud-based mobile networks are foreseen to be a technological enabler for the next generation of mobile networks. Their design requires substantial research as they pose unique challenges, especially from the point of view of additional delays in the fronthaul network. Commonly used network...... implementations of 3 popular operating systems are investigated in our network model. The results on the most influential parameters are used to design an optimized TCP for cloud-based mobile networks....

  2. Cloud Computing Based E-Learning System

    Science.gov (United States)

    Al-Zoube, Mohammed; El-Seoud, Samir Abou; Wyne, Mudasser F.

    2010-01-01

    Cloud computing technologies although in their early stages, have managed to change the way applications are going to be developed and accessed. These technologies are aimed at running applications as services over the internet on a flexible infrastructure. Microsoft office applications, such as word processing, excel spreadsheet, access database…

  3. THE GALFA-H I COMPACT CLOUD CATALOG

    Energy Technology Data Exchange (ETDEWEB)

    Saul, Destry R.; Peek, J. E. G.; Grcevich, J.; Putman, M. E.; Brown, A. R. H.; Hamden, E. T. [Department of Astronomy, Columbia University, New York, NY 10027 (United States); Douglas, K. A. [Physics and Astronomy, University of Calgary/Dominion Radio Astrophysical Observatory, P.O. Box 248, Penticton, BC V2A 6J9 (Canada); Korpela, E. J. [Space Sciences Laboratory, University of California, Berkeley, CA 94720 (United States); Stanimirovic, S.; Lee, M.; Burkhart, B.; Pingel, N. M. [Department of Astronomy, University of Wisconsin, Madison, 475 N Charter St, Madison, WI 53703 (United States); Heiles, C. [Radio Astronomy Lab, UC Berkeley, 601 Campbell Hall, Berkeley, CA 94720 (United States); Gibson, S. J. [Department of Physics and Astronomy, Western Kentucky University, Bowling Green, KY 42101 (United States); Begum, A. [Indian Institute of Science Education and Research, ITI Campus (Gas Rahat) Building, Govindpura, Bhopal-23 (India); Tonnesen, S. [Department of Astrophysical Sciences, Princeton University, Princeton, NJ 08544 (United States)

    2012-10-10

    We present a catalog of 1964 isolated, compact neutral hydrogen clouds from the Galactic Arecibo L-Band Feed Array Survey Data Release One. The clouds were identified by a custom machine-vision algorithm utilizing the difference of Gaussian kernels to search for clouds smaller than 20'. The clouds have velocities typically between |V_LSR| = 20 and 400 km s⁻¹, line widths of 2.5-35 km s⁻¹, and column densities ranging from 1 to 35 × 10¹⁸ cm⁻². The distances to the clouds in this catalog may cover several orders of magnitude, so the masses may range from less than a solar mass for clouds within the Galactic disk, to greater than 10⁴ M_☉ for high-velocity clouds (HVCs) at the tip of the Magellanic Stream. To search for trends, we separate the catalog into five populations based on position, velocity, and line width: HVCs; galaxy candidates; cold low-velocity clouds (LVCs); warm, low positive-velocity clouds in the third Galactic quadrant; and the remaining warm LVCs. The observed HVCs are found to be associated with previously identified HVC complexes. We do not observe a large population of isolated clouds at high velocities as some models predict. We see evidence for distinct histories at low velocities in detecting populations of clouds corotating with the Galactic disk and a set of clouds that is not corotating.
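As a loose illustration of splitting such a catalog by kinematics, the toy classifier below sorts clouds by LSR velocity and line width. The 90 km s⁻¹ cutoff and the two-way LVC split are invented simplifications; the paper's five-population scheme also uses position and different criteria.

```python
# Toy classifier: the velocity cutoff and line-width threshold below are
# illustrative assumptions, not the catalog's actual selection criteria.
def classify_cloud(v_lsr_kms: float, line_width_kms: float) -> str:
    """Assign a cloud to a broad population by kinematics alone."""
    if abs(v_lsr_kms) > 90:           # assumed high-velocity cutoff
        return "HVC"
    if line_width_kms < 5:            # narrow lines suggest cold gas
        return "cold LVC"
    return "warm LVC"

print(classify_cloud(-250.0, 20.0))   # prints HVC
print(classify_cloud(15.0, 3.0))      # prints cold LVC
```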

  4. Security Framework for Agent-Based Cloud Computing

    Directory of Open Access Journals (Sweden)

    K Venkateshwaran

    2015-06-01

    Full Text Available Agents can play a key role in bringing suitable cloud services to the customer based on their requirements. In agent-based cloud computing, the agent performs negotiation, coordination, cooperation, and collaboration on behalf of the customer to make decisions in an efficient manner. However, agent-based cloud computing has some security issues: (a) addition of a malicious agent to the cloud environment, which could demolish the process by attacking other agents; (b) denial of service through flooding attacks on other involved agents; and (c) misuse of exceptions in the agent interaction protocol, such as Not-Understood and Cancel_Meta, which may lead to terminating the connections of all the other agents participating in the negotiating services. This paper proposes algorithms to solve these issues and to ensure that there will be no intervention of any malicious activities during agent interaction.

  5. Analyzing cloud base at local and regional scales to understand tropical montane cloud forest vulnerability to climate change

    Science.gov (United States)

    Van Beusekom, Ashley E.; González, Grizelle; Scholl, Martha A.

    2017-01-01

    The degree to which cloud immersion provides water in addition to rainfall, suppresses transpiration, and sustains tropical montane cloud forests (TMCFs) during rainless periods is not well understood. Climate and land use changes represent a threat to these forests if cloud base altitude rises as a result of regional warming or deforestation. To establish a baseline for quantifying future changes in cloud base, we installed a ceilometer at 100 m altitude in the forest upwind of the TMCF that occupies an altitude range from ∼ 600 m to the peaks at 1100 m in the Luquillo Mountains of eastern Puerto Rico. Airport Automated Surface Observing System (ASOS) ceilometer data, radiosonde data, and Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) satellite data were obtained to investigate seasonal cloud base dynamics, altitude of the trade-wind inversion (TWI), and typical cloud thickness for the surrounding Caribbean region. Cloud base is rarely quantified near mountains, so these results represent a first look at seasonal and diurnal cloud base dynamics for the TMCF. From May 2013 to August 2016, cloud base was lowest during the midsummer dry season, and cloud bases were lower than the mountaintops as often in the winter dry season as in the wet seasons. The lowest cloud bases most frequently occurred at higher elevation than 600 m, from 740 to 964 m. The Luquillo forest low cloud base altitudes were higher than six other sites in the Caribbean by ∼ 200–600 m, highlighting the importance of site selection to measure topographic influence on cloud height. Proximity to the oceanic cloud system where shallow cumulus clouds are seasonally invariant in altitude and cover, along with local trade-wind orographic lifting and cloud formation, may explain the dry season low clouds. The results indicate that climate change threats to low-elevation TMCFs are not limited to the dry season; changes in synoptic-scale weather patterns

  6. Data Mining Based on Cloud-Computing Technology

    Directory of Open Access Journals (Sweden)

    Ren Ying

    2016-01-01

    Full Text Available There are performance bottlenecks and scalability problems when traditional data-mining system is used in cloud computing. In this paper, we present a data-mining platform based on cloud computing. Compared with a traditional data mining system, this platform is highly scalable, has massive data processing capacities, is service-oriented, and has low hardware cost. This platform can support the design and applications of a wide range of distributed data-mining systems.

  7. Deadline based scheduling for data-intensive applications in clouds

    Institute of Scientific and Technical Information of China (English)

    Fu Xiong; Cang Yeliang; Zhu Lipeng; Hu Bin; Deng Song; Wang Dong

    2016-01-01

    Cloud computing emerges as a new computing pattern that can provide elastic services for users around the world. It offers good opportunities to solve large-scale scientific problems with less effort. Application deployment remains an important issue in clouds. Appropriate scheduling mechanisms can shorten the total completion time of an application and therefore improve the quality of service (QoS) for cloud users. Unlike current scheduling algorithms, which mostly focus on single task allocation, we propose a deadline-based scheduling approach for data-intensive applications in clouds. It does not simply consider the total completion time of an application as the sum of all its subtasks' completion times. Not only is the computation capacity of the virtual machine (VM) considered, but the communication delay and data access latencies are also taken into account. Simulations show that our proposed approach has a decided advantage over two other algorithms.
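The key point above, counting data access latency as well as compute time when placing a task against a deadline, can be sketched as follows. This is a hedged toy in the spirit of the description, with invented VM parameters and a cheapest-feasible-VM policy; it is not the paper's actual algorithm.

```python
# Hedged sketch of deadline-aware task placement: a task's estimated finish
# time on a VM includes data-transfer delay plus compute time, and we pick
# the cheapest VM that still meets the deadline. All numbers are illustrative.

from dataclasses import dataclass

@dataclass
class VM:
    name: str
    mips: float        # compute capacity (instructions/s, arbitrary units)
    bandwidth: float   # MB/s to the data store
    cost: float        # price per hour (arbitrary units)

def finish_time(vm: VM, task_mi: float, data_mb: float) -> float:
    """Estimated completion = data access latency + computation time."""
    return data_mb / vm.bandwidth + task_mi / vm.mips

def place(vms: list, task_mi: float, data_mb: float, deadline: float):
    """Return the cheapest VM whose estimated finish time meets the deadline."""
    ok = [v for v in vms if finish_time(v, task_mi, data_mb) <= deadline]
    return min(ok, key=lambda v: v.cost) if ok else None

vms = [VM("small", 500, 10, 1.0), VM("big", 2000, 50, 4.0)]
print(place(vms, task_mi=1000, data_mb=100, deadline=5.0).name)  # prints big
```

Note that the "small" VM loses here purely on transfer time (10 s for the data alone), which is exactly the effect a compute-only scheduler would miss.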

  8. CLOUD-BASED PLATFORM FOR CREATING AND SHARING WEB MAPS

    Directory of Open Access Journals (Sweden)

    Jean Pierre Gatera

    2014-01-01

    Full Text Available The rise of cloud computing is one of the most important things happening in information technology today. While many things are moving into the cloud, this trend has also reached the Geographic Information System (GIS) world. For users of GIS technology, the cloud opens new possibilities for sharing web maps, applications, and spatial data. The goal of this presentation/demo is to demonstrate ArcGIS Online, a cloud-based collaborative platform that allows you to easily and quickly create interactive web maps that you can share with anyone. With ready-to-use content, apps, and templates you can produce web maps right away. And no matter what you use - desktops, browsers, smartphones, or tablets - you always have access to your content.

  9. Open Source Cloud-Based Technologies for Bim

    Science.gov (United States)

    Logothetis, S.; Karachaliou, E.; Valari, E.; Stylianidis, E.

    2018-05-01

    This paper presents a Cloud-based open source system for storing and processing data from a 3D survey approach. More specifically, we provide an online service for viewing, storing and analysing BIM. Cloud technologies were used to develop a web interface as a BIM data centre, which can handle large BIM data using a server. The server can be accessed by many users through various electronic devices anytime and anywhere so they can view online 3D models using browsers. Nowadays, the Cloud computing is engaged progressively in facilitating BIM-based collaboration between the multiple stakeholders and disciplinary groups for complicated Architectural, Engineering and Construction (AEC) projects. Besides, the development of Open Source Software (OSS) has been rapidly growing and their use tends to be united. Although BIM and Cloud technologies are extensively known and used, there is a lack of integrated open source Cloud-based platforms able to support all stages of BIM processes. The present research aims to create an open source Cloud-based BIM system that is able to handle geospatial data. In this effort, only open source tools will be used; from the starting point of creating the 3D model with FreeCAD to its online presentation through BIMserver. Python plug-ins will be developed to link the two software which will be distributed and freely available to a large community of professional for their use. The research work will be completed by benchmarking four Cloud-based BIM systems: Autodesk BIM 360, BIMserver, Graphisoft BIMcloud and Onuma System, which present remarkable results.

  10. OPEN SOURCE CLOUD-BASED TECHNOLOGIES FOR BIM

    Directory of Open Access Journals (Sweden)

    S. Logothetis

    2018-05-01

    Full Text Available This paper presents a Cloud-based open source system for storing and processing data from a 3D survey approach. More specifically, we provide an online service for viewing, storing and analysing BIM. Cloud technologies were used to develop a web interface as a BIM data centre, which can handle large BIM data using a server. The server can be accessed by many users through various electronic devices anytime and anywhere so they can view online 3D models using browsers. Nowadays, the Cloud computing is engaged progressively in facilitating BIM-based collaboration between the multiple stakeholders and disciplinary groups for complicated Architectural, Engineering and Construction (AEC) projects. Besides, the development of Open Source Software (OSS) has been rapidly growing and their use tends to be united. Although BIM and Cloud technologies are extensively known and used, there is a lack of integrated open source Cloud-based platforms able to support all stages of BIM processes. The present research aims to create an open source Cloud-based BIM system that is able to handle geospatial data. In this effort, only open source tools will be used; from the starting point of creating the 3D model with FreeCAD to its online presentation through BIMserver. Python plug-ins will be developed to link the two software which will be distributed and freely available to a large community of professional for their use. The research work will be completed by benchmarking four Cloud-based BIM systems: Autodesk BIM 360, BIMserver, Graphisoft BIMcloud and Onuma System, which present remarkable results.

  11. Security on Cloud Revocation Authority using Identity Based Encryption

    Science.gov (United States)

    Rajaprabha, M. N.

    2017-11-01

    In the era of cloud computing, most people save their documents, files, and other data in cloud storage. Security over the cloud is therefore important, because all of this confidential material resides there. To overcome private key infrastructure (PKI) issues, revocable Identity Based Encryption (IBE) techniques have been introduced which eliminate the demand for PKI. One such technique, the key-update cloud service provider, has two drawbacks: high computation and communication cost, and poor scalability. To overcome these problems, we propose a system in which a Cloud Revocation Authority (CRA) holds the secret key for each user. The secret key is protected with Advanced Encryption Standard security: the key is encrypted and sent to the CRA, which authenticates the person who wants to share data or files or to communicate. Only with that key can another user access the file; if a user applies an invalid key to a particular file, the information about that user and file is sent to the administrator, who has the right to blacklist that person from using the system services.

  12. Cardiovascular imaging environment: will the future be cloud-based?

    Science.gov (United States)

    Kawel-Boehm, Nadine; Bluemke, David A

    2017-07-01

    In cardiovascular CT and MR imaging, large datasets have to be stored, post-processed, analyzed, and distributed. Beyond basic assessment of volume and function in cardiac magnetic resonance imaging, more sophisticated quantitative analysis is increasingly requested, requiring specific software. Many institutions cannot afford the various types of software, or the expertise needed to perform sophisticated analysis. Areas covered: Various cloud services exist for data storage and analysis specific to cardiovascular CT and MR imaging. Instead of on-site data storage, cloud providers offer flexible storage services on a pay-per-use basis. To avoid the purchase and maintenance of specialized software for cardiovascular image analysis, e.g. to assess myocardial iron overload, MR 4D flow, and fractional flow reserve, evaluation can be performed with cloud-based software by the consumer, or the complete analysis can be performed by the cloud provider. However, challenges to widespread implementation of cloud services include regulatory issues regarding patient privacy and data security. Expert commentary: If patient privacy and data security are guaranteed, cloud imaging is a valuable option for coping with the storage of large image datasets and for offering sophisticated cardiovascular image analysis to institutions of all sizes.

  13. Thermal SiO as a probe of high velocity motions in regions of star formation

    International Nuclear Information System (INIS)

    Downes, D.; Genzel, R.; Hjalmarson, A.; Nyman, L.A.; Roennaeng, B.

    1982-01-01

    New observations of the v = 0, J = 2→1 line of SiO at 86.8 GHz show a close association of the thermal SiO emission with infrared and maser sources in regions of star formation. In addition to SiO emission with low velocity dispersion, we report the first detection of high velocity (''plateau'') emission toward W49 and W51. The low velocity SiO component may come from the core of the molecular cloud which contains the infrared and maser sources. The ''plateau'' may indicate mass loss. In Orion KL, the positional centroid of the high velocity SiO emission (|Δv| ≥ 20 km s⁻¹) is near that of the component we identify as the ''18 km s⁻¹ flow''. However, the centroids of the blue- and redshifted wings are displaced from each other by a few arcseconds, to the NW and NE of the position of the 18 km s⁻¹ component. The mass-loss rates of the high velocity flow and the 18 km s⁻¹ flow are similar.

  14. A cloud-based system for automatic glaucoma screening.

    Science.gov (United States)

    Fengshou Yin; Damon Wing Kee Wong; Ying Quan; Ai Ping Yow; Ngan Meng Tan; Gopalakrishnan, Kavitha; Beng Hai Lee; Yanwu Xu; Zhuo Zhang; Jun Cheng; Jiang Liu

    2015-08-01

    In recent years, there has been increasing interest in the use of automatic computer-based systems for the detection of eye diseases, including glaucoma. However, these systems are usually standalone software with basic functions only, limiting their usage on a large scale. In this paper, we introduce an online cloud-based system for automatic glaucoma screening through the use of medical image-based pattern classification technologies. It is designed in a hybrid cloud pattern to offer both accessibility and enhanced security. Raw data, including the patient's medical condition and fundus image, and the resultant medical reports are collected and distributed through the public cloud tier. In the private cloud tier, automatic analysis and assessment of colour retinal fundus images are performed. The ubiquitous anywhere-access nature of the system through the cloud platform facilitates a more efficient and cost-effective means of glaucoma screening, allowing the disease to be detected earlier and enabling early intervention and more efficient disease management.

  15. Knowledge-Based Object Detection in Laser Scanning Point Clouds

    Science.gov (United States)

    Boochs, F.; Karmacharya, A.; Marbs, A.

    2012-07-01

    Object identification and object processing in 3D point clouds have always posed challenges in terms of effectiveness and efficiency. In practice, this process is highly dependent on human interpretation of the scene represented by the point cloud data, as well as the set of modeling tools available for use. Such modeling algorithms are data-driven and concentrate on specific features of the objects, being accessible to numerical models. We present an approach that brings the human expert knowledge about the scene, the objects inside, and their representation by the data and the behavior of algorithms to the machine. This "understanding" enables the machine to assist human interpretation of the scene inside the point cloud. Furthermore, it allows the machine to understand possibilities and limitations of algorithms and to take this into account within the processing chain. This not only assists the researchers in defining optimal processing steps, but also provides suggestions when certain changes or new details emerge from the point cloud. Our approach benefits from the advancement in knowledge technologies within the Semantic Web framework. This advancement has provided a strong base for applications based on knowledge management. In the article we will present and describe the knowledge technologies used for our approach such as Web Ontology Language (OWL), used for formulating the knowledge base and the Semantic Web Rule Language (SWRL) with 3D processing and topologic built-ins, aiming to combine geometrical analysis of 3D point clouds, and specialists' knowledge of the scene and algorithmic processing.

  16. A novel technique for extracting clouds base height using ground based imaging

    Directory of Open Access Journals (Sweden)

    E. Hirsch

    2011-01-01

    Full Text Available The height of a cloud in the atmospheric column is a key parameter in its characterization. Several remote sensing techniques (passive and active, either ground-based or on space-borne platforms) and in-situ measurements are routinely used in order to estimate the top and base heights of clouds. In this article we present a novel method that combines thermal imaging from the ground with a sounded wind profile in order to derive the cloud base height. This method is independent of cloud type, making it efficient for both low boundary layer and high clouds. In addition, using thermal imaging ensures extraction of cloud features during daytime as well as at nighttime. The proposed technique was validated by comparison to active sounding by ceilometers (a standard ground-based method), to lifted condensation level (LCL) calculations, and to MODIS products obtained from space. Like all passive remote sensing techniques, the proposed method extracts only the height of the lowest cloud layer, so upper cloud layers are not detected. Nevertheless, the information derived from this method can be complementary to space-borne cloud top measurements when deep-convective clouds are present. Unlike techniques such as LCL, this method is not limited to boundary layer clouds, and can extract the cloud base height at any level, as long as sufficient thermal contrast exists between the radiative temperatures of the cloud and its surrounding air parcel. Another advantage of the proposed method is its simplicity and modest power needs, making it particularly suitable for field measurements and deployment at remote locations. Our method can be further simplified for use with a visible CCD or CMOS camera (although nighttime clouds will not be observed).
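The geometric idea behind combining ground imaging with a wind profile can be sketched simply: track a cloud feature across frames, convert its pixel motion to an angular rate, and, assuming the cloud advects with the wind at its level, recover height from h ≈ v_wind / ω for a feature near zenith. The camera scale and numbers below are illustrative assumptions, not the paper's calibration.

```python
# Hedged sketch: cloud base height from tracked angular motion plus sounded
# wind speed, for a feature near zenith. Camera parameters are assumptions.

def angular_rate(px_shift: float, dt_s: float, rad_per_px: float) -> float:
    """Angular speed (rad/s) of a feature tracked across two frames."""
    return px_shift * rad_per_px / dt_s

def cloud_base_height(wind_speed_ms: float, omega_rad_s: float) -> float:
    """Height (m) assuming the cloud moves with the wind: h = v / omega."""
    return wind_speed_ms / omega_rad_s

omega = angular_rate(px_shift=30, dt_s=10, rad_per_px=1e-3)  # 3e-3 rad/s
print(round(cloud_base_height(wind_speed_ms=6.0, omega_rad_s=omega)))  # prints 2000
```

In practice the wind speed would be interpolated from the radiosonde profile at the candidate height, so the retrieval is iterative rather than the single division shown here.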

  17. Cloud-Based Collaborative Writing and the Common Core Standards

    Science.gov (United States)

    Yim, Soobin; Warschauer, Mark; Zheng, Binbin; Lawrence, Joshua F.

    2014-01-01

    The Common Core State Standards emphasize the integration of technology skills into English Language Arts (ELA) instruction, recognizing the demand for technology-based literacy skills to be college- and career-ready. This study aims to examine how collaborative cloud-based writing is used in a Colorado school district, where one-to-one…

  18. HIGH VELOCITY THERMAL GUN FOR SURFACE PREPARATION AND TREATMENT

    Directory of Open Access Journals (Sweden)

    I.A. Gorlach

    2012-01-01

    Full Text Available Many surface preparation and treatment processes utilise compressed air to propel particles against surfaces in order to clean and treat them. The effectiveness of these processes depends on the velocity of the particles, which in turn depends on the pressure of the compressed air. This paper describes a thermal gun built on the principles of the High Velocity Air Fuel (HVAF) and High Velocity Oxy Fuel (HVOF) processes. The designed apparatus can be used for abrasive blasting, coating of surfaces, cutting of rocks, removing rubber from mining equipment, cleaning of contamination, etc.

  19. Superconducting spoke cavities for high-velocity applications

    Energy Technology Data Exchange (ETDEWEB)

    Hopper, Christopher S. [Old Dominion U.; Delayen, Jean R. [Old Dominion U., JLAB

    2013-10-01

    To date, superconducting spoke cavities have been designed, developed, and tested for particle velocities up to β₀ ~ 0.6, but there is a growing interest in possible applications of multispoke cavities for high-velocity applications. We have explored the design parameter space for low-frequency, high-velocity, double-spoke superconducting cavities in order to determine how each design parameter affects the electromagnetic properties, in particular the surface electromagnetic fields and the shunt impedance. We present detailed designs for cavities operating at 325 and 352 MHz and optimized for β₀ = 0.82 and 1.

  20. Development and Usage of Software as a Service for a Cloud and Non-Cloud Based Environment- An Empirical Study

    OpenAIRE

    Pratiyush Guleria Guleria; Vikas Sharma; Manish Arora

    2012-01-01

    Cloud computing is Internet-based computing, whereby shared resources, software, and information are provided to computers and other devices on demand. Cloud computing is a natural evolution of the widespread adoption of virtualization, service-oriented architecture and utility computing. The computer applications nowadays are becoming more and more complex; there is an ever increasing demand for computing resources. As this demand has risen, the concepts of cloud computing and grid computing...

  1. Methodology for cloud-based design of robots

    Science.gov (United States)

    Ogorodnikova, O. M.; Vaganov, K. A.; Putimtsev, I. D.

    2017-09-01

    This paper presents some important results of the cloud-based design of a robot arm by a group of students. A methodology for cloud-based design was developed and used to initiate an interdisciplinary project on the research and development of a specific manipulator. The whole project's data files were hosted by the Ural Federal University data center. The 3D (three-dimensional) model of the robot arm was created using Siemens PLM (Product Lifecycle Management) software and structured as a complex mechatronics product by means of the Siemens Teamcenter thin client; all processes were performed in the clouds. The robot arm was designed to load blanks of up to 1 kg into the work space of a milling machine used for student research.

  2. A Developed Artificial Bee Colony Algorithm Based on Cloud Model

    Directory of Open Access Journals (Sweden)

    Ye Jin

    2018-04-01

    Full Text Available The Artificial Bee Colony (ABC) algorithm is a bionic intelligent optimization method. The cloud model is a kind of uncertainty conversion model between a qualitative concept T̃, presented in natural language, and its quantitative expression; it integrates probability theory and fuzzy mathematics. A developed ABC algorithm based on the cloud model is proposed to enhance the accuracy of the basic ABC algorithm and to avoid getting trapped in local optima, by introducing a new selection mechanism, replacing the onlooker bees' search formula, and changing the scout bees' updating formula. Experiments on CEC15 show that the new algorithm has a faster convergence speed and higher accuracy than the basic ABC and some cloud model based ABC variants.
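Cloud-model-based ABC variants typically sample candidate solutions with the forward normal cloud generator, which draws "cloud drops" around a concept (Ex, En, He). The sketch below shows only that generator, with illustrative parameters; the ABC bookkeeping (employed, onlooker, and scout phases) is omitted and the usage is an assumption about such variants in general, not this paper's exact formulas.

```python
import math
import random

# Forward normal cloud generator: each drop perturbs the entropy En by the
# hyper-entropy He, then samples a position x and its certainty degree mu(x).
def cloud_drop(ex: float, en: float, he: float, rng: random.Random):
    en_prime = rng.gauss(en, he)          # En' ~ N(En, He^2)
    x = rng.gauss(ex, abs(en_prime))      # drop position x ~ N(Ex, En'^2)
    mu = math.exp(-((x - ex) ** 2) / (2 * en_prime ** 2))  # certainty degree
    return x, mu

rng = random.Random(1)
drops = [cloud_drop(ex=0.0, en=1.0, he=0.1, rng=rng) for _ in range(1000)]
# Drops cluster around Ex; certainty degrees lie in (0, 1].
```

In an ABC variant, Ex would be a food source's position and En/He would control how widely the bees explore around it, shrinking as the search converges.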

  3. Inverted Polarity Thunderstorms Linked with Elevated Cloud Base Height

    Science.gov (United States)

    Cummins, K. L.; Williams, E.

    2016-12-01

    The great majority of thunderstorms worldwide exhibit gross positive dipole structure, produce intracloud lightning that reduces this positive dipole (positive intracloud flashes), and produce negative cloud-to-ground lightning from the lower negative end of this dipole. During the STEPS experiment in 2000, much new evidence for thunderstorms (or cells within multi-cellular storms) with inverted polarity came to light, both from balloon soundings of electric field and from LMA analysis. Many of the storms with inverted polarity cells developed in eastern Colorado. Fleenor et al. (2009) followed up after STEPS to document a dominance of positive-polarity CG lightning in many of these cases. In the present study, surface thermodynamic observations (temperature and dew point temperature) have been used to estimate the cloud base heights and temperatures at the time of the Fleenor et al. lightning observations. It was found that when more than 90% of the observed CG lightning within a storm was negative, the cloud base heights were low (2000 m AGL or lower, and warmer, with T>10 C), and when more than 90% of the observed CG lightning within a storm was positive, the cloud base heights were high (3000 m AGL or higher, and colder, with T<10 C). Storms with mixed polarity were generally associated with intermediate cloud base heights. These findings on inverted polarity thunderstorms are remarkably consistent with results in other parts of the world where strong instability prevails in the presence of high cloud base height: the plateau regions of China (Liu et al., 1989; Qie et al., 2005), and pre-monsoon India (Pawar et al., 2016), particularly when mixed polarity cases are excluded. Calculations of adiabatic cloud water content for lifting from near 0 C cast some doubt on earlier speculation (Williams et al., 2005) that the graupel particles in these inverted polarity storms attain a wet growth condition, and so exhibit positive charging following laboratory experiments. This
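
    The surface-based cloud base estimate used in such studies can be illustrated with the standard lifting-condensation-level approximation (roughly 125 m of cloud base height per degree Celsius of dew-point depression); the study's exact method may differ, so this is only a sketch:

```python
def lcl_height_m(t_c, td_c):
    # Espy-type approximation: cloud base rises ~125 m per degC of
    # dew-point depression at the surface.
    return 125.0 * (t_c - td_c)

def cloud_base_temp_c(t_c, td_c, lapse_c_per_km=9.8):
    # Cool the surface parcel dry-adiabatically up to the cloud base.
    return t_c - lapse_c_per_km * lcl_height_m(t_c, td_c) / 1000.0

# Moist surface air -> low, warm base; dry surface air -> high, cold base,
# matching the negative-CG vs positive-CG contrast described above.
low_base = lcl_height_m(30.0, 22.0)       # 1000.0 m
high_base = lcl_height_m(30.0, 4.0)       # 3250.0 m
```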

  4. Modeling, Design, and Implementation of a Cloud Workflow Engine Based on Aneka

    OpenAIRE

    Zhou, Jiantao; Sun, Chaoxin; Fu, Weina; Liu, Jing; Jia, Lei; Tan, Hongyan

    2014-01-01

    This paper presents a Petri net-based model for cloud workflow which plays a key role in industry. Three kinds of parallelisms in cloud workflow are characterized and modeled. Based on the analysis of the modeling, a cloud workflow engine is designed and implemented in Aneka cloud environment. The experimental results validate the effectiveness of our approach of modeling, design, and implementation of cloud workflow.
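
    A Petri net workflow model of the kind described can be sketched minimally as follows; the class and the fork/join example are illustrative assumptions, not the paper's Aneka engine:

```python
class PetriNet:
    """Minimal place/transition net: a transition fires when every input
    place holds at least one token (illustrative sketch only)."""
    def __init__(self, marking):
        self.marking = dict(marking)          # place -> token count
        self.transitions = {}                 # name -> (inputs, outputs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        ins, _ = self.transitions[name]
        return all(self.marking.get(p, 0) > 0 for p in ins)

    def fire(self, name):
        ins, outs = self.transitions[name]
        assert self.enabled(name), name
        for p in ins:
            self.marking[p] -= 1              # consume input tokens
        for p in outs:
            self.marking[p] = self.marking.get(p, 0) + 1

# A fork/join workflow: 'split' enables two parallel tasks, 'join' synchronises,
# the kind of parallelism characterized in the paper's cloud workflow model.
net = PetriNet({"start": 1})
net.add_transition("split", ["start"], ["a_ready", "b_ready"])
net.add_transition("task_a", ["a_ready"], ["a_done"])
net.add_transition("task_b", ["b_ready"], ["b_done"])
net.add_transition("join", ["a_done", "b_done"], ["end"])
for t in ["split", "task_a", "task_b", "join"]:
    net.fire(t)
```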

  5. On the origin of high-velocity runaway stars

    NARCIS (Netherlands)

    Gvaramadze, V.V.; Gualandris, A.; Portegies Zwart, S.

    2009-01-01

    We explore the hypothesis that some high-velocity runaway stars attain their peculiar velocities in the course of exchange encounters between hard massive binaries and a very massive star (either an ordinary 50-100 M-circle dot star or a more massive one, formed through runaway mergers of ordinary

  6. A secure medical data exchange protocol based on cloud environment.

    Science.gov (United States)

    Chen, Chin-Ling; Yang, Tsai-Tung; Shih, Tzay-Farn

    2014-09-01

    In recent years, health care technologies such as electronic medical records have matured and can easily be stored. However, convenient access to medical resources remains a concern. Although many papers discuss medical systems, those systems face many security challenges, the most important being patients' privacy. We therefore propose a secure medical data exchange protocol based on a cloud environment. Our scheme exploits the characteristics of mobile devices, allowing people to use medical resources in the cloud environment to seek medical advice conveniently.

  7. Cloud Computing Task Scheduling Based on Cultural Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Li Jian-Wen

    2016-01-01

    Full Text Available A task scheduling strategy based on a cultural genetic algorithm (CGA) is proposed to improve the efficiency of task scheduling in the cloud computing platform, targeting the minimization of the total time and cost of task scheduling. An improved genetic algorithm is used to construct the main population space and the knowledge space under a cultural framework; these evolve independently and in parallel, forming a mechanism of mutual promotion to dispatch cloud tasks. At the same time, to prevent the genetic algorithm's tendency to fall into local optima, a non-uniform mutation operator is introduced to improve the search performance of the algorithm. The experimental results show that the CGA reduces the total time and lowers the cost of scheduling, making it an effective algorithm for cloud task scheduling.
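
    The genetic-algorithm core of such a scheduler can be sketched as follows: a chromosome assigns each task to a VM, fitness is the makespan, and a mutation rate that decays over generations stands in for the non-uniform mutation operator (the cultural knowledge-space layer is omitted, so this is a simplification, not the paper's CGA):

```python
import random

def ga_schedule(task_len, vm_speed, pop=30, gens=100, seed=3):
    """Toy GA for cloud task scheduling: chromosome[i] is the VM of task i,
    fitness is the makespan (finish time of the busiest VM)."""
    rng = random.Random(seed)
    n, m = len(task_len), len(vm_speed)

    def makespan(chrom):
        load = [0.0] * m
        for t, v in enumerate(chrom):
            load[v] += task_len[t] / vm_speed[v]
        return max(load)

    popn = [[rng.randrange(m) for _ in range(n)] for _ in range(pop)]
    for g in range(gens):
        popn.sort(key=makespan)
        elite = popn[: pop // 2]             # keep the better half
        rate = 0.3 * (1 - g / gens)          # decaying (non-uniform) mutation
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]        # one-point crossover
            for t in range(n):
                if rng.random() < rate:
                    child[t] = rng.randrange(m)
            children.append(child)
        popn = elite + children
    best = min(popn, key=makespan)
    return best, makespan(best)

tasks = [4, 8, 2, 6, 3, 7, 5, 1]
best, span = ga_schedule(tasks, vm_speed=[1.0, 2.0])
```

    For these example lengths the theoretical optimum makespan is 12.0 (balance 12 units of work onto the slow VM and 24 onto the twice-as-fast VM).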

  8. Experimental and numerical studies of high-velocity impact fragmentation

    Energy Technology Data Exchange (ETDEWEB)

    Kipp, M.E.; Grady, D.E.; Swegle, J.W.

    1993-08-01

    Developments are reported in both experimental and numerical capabilities for characterizing the debris spray produced in penetration events. We have performed a series of high-velocity experiments specifically designed to examine the fragmentation of the projectile during impact. High-strength, well-characterized steel spheres (6.35 mm diameter) were launched with a two-stage light-gas gun to velocities in the range of 3 to 5 km/s. Normal impact with PMMA plates, thicknesses of 0.6 to 11 mm, applied impulsive loads of various amplitudes and durations to the steel sphere. Multiple flash radiography diagnostics and recovery techniques were used to assess size, velocity, trajectory and statistics of the impact-induced fragment debris. Damage modes to the primary target plate (plastic) and to a secondary target plate (aluminum) were also evaluated. Dynamic fragmentation theories, based on energy-balance principles, were used to evaluate local material deformation and fracture state information from CTH, a three-dimensional Eulerian solid dynamics shock wave propagation code. The local fragment characterization of the material defines a weighted fragment size distribution, and the sum of these distributions provides a composite particle size distribution for the steel sphere. The calculated axial and radial velocity changes agree well with experimental data, and the calculated fragment sizes are in qualitative agreement with the radiographic data. A secondary effort involved the experimental and computational analyses of normal and oblique copper ball impacts on steel target plates. High-resolution radiography and witness plate diagnostics provided impact motion and statistical fragment size data. CTH simulations were performed to test computational models and numerical methods.

  9. Perceptions of Peer Review Using Cloud-Based Software

    Science.gov (United States)

    Andrichuk, Gjoa

    2016-01-01

    This study looks at the change in perception regarding the effect of peer feedback on writing skills using cloud-based software. Pre- and post-surveys were given. The students peer reviewed drafts of five sections of scientific reports using Google Docs. While students reported that they did not perceive their writing ability improved by being…

  10. Cloud-Based Technologies: Faculty Development, Support, and Implementation

    Science.gov (United States)

    Diaz, Veronica

    2011-01-01

    The number of instructional offerings in higher education that are online, blended, or web-enhanced, including courses and programs, continues to grow exponentially. Alongside the growth of e-learning, higher education has witnessed the explosion of cloud-based or Web 2.0 technologies, a term that refers to the vast array of socially oriented,…

  11. Cloud-Based Virtual Laboratory for Network Security Education

    Science.gov (United States)

    Xu, Le; Huang, Dijiang; Tsai, Wei-Tek

    2014-01-01

    Hands-on experiments are essential for computer network security education. Existing laboratory solutions usually require significant effort to build, configure, and maintain and often do not support reconfigurability, flexibility, and scalability. This paper presents a cloud-based virtual laboratory education platform called V-Lab that provides a…

  12. Blueprint template support for engineering cloud-based services

    NARCIS (Netherlands)

    Nguyen, D.K.; Lelli, F.; Taher, Y.; Parkin, M.S.; Papazoglou, M.; van den Heuvel, W.J.A.M.; Abramowicz, W.; Martín Llorente, I.; Surridge, M.; Zisman, A.; Vayssière, J.

    2011-01-01

    Current cloud-based service offerings are often provided as one-size-fits-all solutions and give little or no room for customization. This limits the ability for application developers to pick and choose offerings from multiple software, platform, infrastructure service providers and configure them

  13. Foundations of Blueprint for Cloud-based Service Engineering

    NARCIS (Netherlands)

    Nguyen, D.K.

    2011-01-01

    Current cloud-based service offerings are often provided as one-size-fits-all solution and give little or no room for customization. This limits the ability for application developers to pick and choose offerings from multiple software, platform and infrastructure service providers and configure

  14. ORGANIZATION OF CLOUD COMPUTING INFRASTRUCTURE BASED ON SDN NETWORK

    Directory of Open Access Journals (Sweden)

    Alexey A. Efimenko

    2013-01-01

    Full Text Available The article presents the main approaches to building cloud computing infrastructure based on SDN networks in modern data processing centers (DPCs). The main indicators of the management effectiveness of DPC network infrastructure are determined, and examples of solutions for the creation of virtual network devices are provided.

  15. Cloud based emergency health care information service in India.

    Science.gov (United States)

    Karthikeyan, N; Sukanesh, R

    2012-12-01

    A hospital is a health care organization providing patient treatment by expert physicians, surgeons and equipment. A report from a health care accreditation group says that miscommunication between patients and health care providers is the reason for the gap in providing emergency medical care to people in need. In developing countries, illiteracy is a major root cause of deaths resulting from undiagnosed diseases, constituting a serious public health problem. Mentally affected, differently abled and unconscious patients cannot communicate their medical history to medical practitioners, and medical practitioners cannot edit or view DICOM images instantly. Our aim is to provide a palm vein pattern recognition based medical record retrieval system, using cloud computing, for the people mentioned above. Distributed computing technology is emerging in the new forms of grid computing and cloud computing, which promise to deliver Information Technology (IT) as a service. In this paper, we describe how these new forms of distributed computing can help modern health care industries. Cloud computing is extending its benefits to industrial sectors, especially in medical scenarios: IT-related capabilities and resources are provided as services, on demand, via distributed computing. This paper is concerned with providing software as a service (SaaS) by means of cloud computing, with the aim of bringing the emergency health care sector under one umbrella with physically secured patient records. In framing emergency healthcare treatment, the crucial information needed about patients is their previous health records, so ubiquitous access to appropriate records is essential. Palm vein pattern recognition promises secure patient record access. Likewise, our paper reveals an efficient means to view, edit or transfer DICOM images instantly, which was a challenging task for medical practitioners in the

  16. CloudLM: a Cloud-based Language Model for Machine Translation

    Directory of Open Access Journals (Sweden)

    Ferrández-Tordera Jorge

    2016-04-01

    Full Text Available Language models (LMs) are an essential element in statistical approaches to natural language processing for tasks such as speech recognition and machine translation (MT). The advent of big data leads to the availability of massive amounts of data to build LMs, and in fact, for the most prominent languages, using current techniques and hardware, it is not feasible to train LMs on all the data available nowadays. At the same time, it has been shown that the more data is used for an LM, the better the performance, e.g. for MT, without any indication yet of reaching a plateau. This paper presents CloudLM, an open-source cloud-based LM intended for MT, which allows distributed LMs to be queried. CloudLM relies on Apache Solr and provides the functionality of state-of-the-art language modelling (it builds upon KenLM), while allowing massive LMs to be queried (as the use of local memory is drastically reduced), at the expense of slower decoding speed.
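
    The kind of language-model query CloudLM serves can be illustrated with a toy bigram model with add-one smoothing; real systems such as KenLM use Kneser-Ney smoothing and back-off, so this is only a conceptual sketch:

```python
import math
from collections import Counter

def train_bigram(corpus):
    """Tiny bigram LM with add-one smoothing; returns a sentence scorer."""
    unigrams, bigrams = Counter(), Counter()
    vocab = set()
    for sent in corpus:
        toks = ["<s>"] + sent.split() + ["</s>"]
        vocab.update(toks)
        unigrams.update(toks[:-1])            # context counts
        bigrams.update(zip(toks, toks[1:]))
    V = len(vocab)

    def logprob(sent):
        toks = ["<s>"] + sent.split() + ["</s>"]
        # log P(sent) = sum over bigrams of log (c(a,b)+1) / (c(a)+V)
        return sum(
            math.log((bigrams[(a, b)] + 1) / (unigrams[a] + V))
            for a, b in zip(toks, toks[1:])
        )
    return logprob

lp = train_bigram(["the cat sat", "the cat ran", "a dog ran"])
# A sentence made of seen bigrams scores higher than one with unseen bigrams.
```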

  17. Learners’ views about cloud computing-based group activities

    Directory of Open Access Journals (Sweden)

    Yildirim Serkan

    2017-01-01

    Full Text Available Because they can be used independently of time and place during software development, and because mobile technologies make information easier to access, cloud-based environments have attracted the attention of the education world, and this technology has started to be used in various activities. In this study, the effects on learners of extracurricular group assignments in cloud-based environments were evaluated for programming education in terms of group-work satisfaction, ease of use and user satisfaction. A total of 100 students (34 men and 66 women) participated in a computer programming course lasting eight weeks. Considering the advantages of cooperative learning in programming education, participants were divided into groups of at least three people. In this study, carried out in both conventional and cloud-based environments, a between-groups factorial design was used. The data, collected by questionnaires on opinions of group work, were examined with quantitative analysis methods. According to the results, extracurricular learning activities as group activities created satisfaction, although perceptions of ease of use of the environment and user satisfaction were only partly positive. Male participants perceived cloud-based environments as easier to use. Variables such as class level and computer and internet usage time had no effect on satisfaction or perceived ease of use. Evening-class students stated that they found cloud-based learning environments easy to use and were more satisfied with these environments, as well as happier with group work, than daytime students.

  18. Remote sensing image segmentation based on Hadoop cloud platform

    Science.gov (United States)

    Li, Jie; Zhu, Lingling; Cao, Fubin

    2018-01-01

    To solve the problem that remote sensing image segmentation is slow and real-time performance is poor, this paper studies a method of remote sensing image segmentation based on the Hadoop platform. On the basis of analyzing the structural characteristics of the Hadoop cloud platform and its MapReduce programming component, this paper proposes an image segmentation method combining OpenCV and the Hadoop cloud platform. Firstly, the MapReduce image processing model of the Hadoop cloud platform is designed, the image input and output are customized, and the segmentation method of the data file is rewritten. Then the Mean Shift image segmentation algorithm is implemented. Finally, a segmentation experiment is run on a remote sensing image and compared with the same Mean Shift segmentation implemented in MATLAB. The experimental results show that, while maintaining good segmentation quality, the segmentation rate of remote sensing image segmentation based on the Hadoop cloud platform is greatly improved compared with single-machine MATLAB segmentation, and the effectiveness of image segmentation also improves considerably.
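
    The Mean Shift step at the heart of the segmentation can be sketched, independently of Hadoop, with a flat-kernel implementation on a handful of 2D points; this illustrative code is not the paper's OpenCV/MapReduce implementation:

```python
def mean_shift(points, bandwidth=1.0, iters=50):
    """Flat-kernel mean shift: move each point to the mean of its neighbours
    within `bandwidth` until convergence; points sharing a mode form a segment."""
    modes = [list(p) for p in points]
    for _ in range(iters):
        for i, m in enumerate(modes):
            neigh = [p for p in points
                     if sum((a - b) ** 2 for a, b in zip(p, m)) <= bandwidth ** 2]
            modes[i] = [sum(c) / len(neigh) for c in zip(*neigh)]
    # group points whose modes converged to (almost) the same location
    labels, centres = [], []
    for m in modes:
        for j, c in enumerate(centres):
            if sum((a - b) ** 2 for a, b in zip(c, m)) < 1e-2:
                labels.append(j)
                break
        else:
            centres.append(m)
            labels.append(len(centres) - 1)
    return labels, centres

# Two well-separated blobs collapse to two modes.
pts = [(0.0, 0.0), (0.2, 0.1), (0.1, 0.3), (5.0, 5.0), (5.1, 4.9)]
labels, centres = mean_shift(pts)
```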

  19. Comparison of Cloud Base Height Derived from a Ground-Based Infrared Cloud Measurement and Two Ceilometers

    Directory of Open Access Journals (Sweden)

    Lei Liu

    2015-01-01

    Full Text Available The cloud base heights (CBHs) derived from the whole-sky infrared cloud-measuring system (WSIRCMS) and two ceilometers (Vaisala CL31 and CL51) from November 1, 2011, to June 12, 2012, at the Chinese Meteorological Administration (CMA) Beijing Observatory Station are analysed. Significant differences are found when comparing the measurements of the different instruments. More exactly, the cloud occurrence retrieved from CL31 is 3.8% higher than that from CL51, while that from WSIRCMS is 3.6% higher than from the ceilometers. More than 75.5% of the differences between the two ceilometers are within ±200 m and about 89.5% within ±500 m, while only 30.7% of the differences between WSIRCMS and the ceilometers are within ±500 m and about 55.2% within ±1000 m. These differences may be caused by the measurement principles and the CBH retrieval algorithms. A combination of a laser ceilometer and an infrared cloud instrument is recommended to improve the capability to determine cloud occurrence and retrieve CBHs.

  20. Cloud-based adaptive exon prediction for DNA analysis.

    Science.gov (United States)

    Putluri, Srinivasareddy; Zia Ur Rahman, Md; Fathima, Shaik Yasmeen

    2018-02-01

    Cloud computing offers significant research and economic benefits to healthcare organisations. Cloud services provide a safe place for storing and managing large amounts of sensitive data. Under the conventional flow of gene information, gene sequence laboratories send out raw and inferred information via the Internet to several sequence libraries. DNA sequencing storage costs will be minimised by the use of cloud services. In this study, the authors put forward a novel genomic informatics system using Amazon Cloud Services, where genomic sequence information is stored and accessed for processing. True identification of exon regions in a DNA sequence is a key task in bioinformatics, which helps in disease identification and drug design. The three-base periodicity property of exons forms the basis of all exon identification techniques, and adaptive signal processing techniques have been found to be promising in comparison with several other methods. Several adaptive exon predictors (AEPs) are developed using variable normalised least mean square and its maximum normalised variants to reduce computational complexity. Finally, the performance of the various AEPs is evaluated based on measures such as sensitivity, specificity and precision using standard genomic datasets taken from the National Center for Biotechnology Information genomic sequence database.
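
    The normalised-LMS adaptation underlying such exon predictors can be sketched as follows; the filter length, signals and identified system here are illustrative, not the authors' variable or maximum normalised variants:

```python
def nlms(x, d, taps=4, mu=0.5, eps=1e-6):
    """Normalised LMS adaptive filter: w learns to predict d[n] from the
    last `taps` input samples, step size normalised by input power."""
    w = [0.0] * taps
    err = []
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1 : n + 1][::-1]     # u[0] = x[n], newest first
        y = sum(wi * ui for wi, ui in zip(w, u))
        e = d[n] - y
        power = sum(ui * ui for ui in u) + eps
        w = [wi + (mu / power) * e * ui for wi, ui in zip(w, u)]
        err.append(e)
    return w, err

# Identify a short FIR response h = [0.5, -0.3] driven by a period-3
# impulse train (echoing the three-base periodicity exploited for exons).
x = [1.0 if n % 3 == 0 else 0.0 for n in range(300)]
d = [0.5 * x[n] - (0.3 * x[n - 1] if n >= 1 else 0.0) for n in range(300)]
w, err = nlms(x, d)
```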

  1. Streaming support for data intensive cloud-based sequence analysis.

    Science.gov (United States)

    Issa, Shadi A; Kienzler, Romeo; El-Kalioby, Mohamed; Tonellato, Peter J; Wall, Dennis; Bruggmann, Rémy; Abouelhoda, Mohamed

    2013-01-01

    Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of "resources-on-demand" and "pay-as-you-go", scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.
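
    The streaming idea, processing records as they arrive instead of waiting for the full transfer, can be sketched as follows; the FASTA-like record format and the GC-content task are illustrative assumptions, not part of the elastream package:

```python
import io

def stream_records(fh, chunk_size=1 << 16):
    """Yield '>'-delimited sequence records as soon as each is complete,
    so per-record analysis overlaps with the (simulated) transfer."""
    buf = ""
    while True:
        chunk = fh.read(chunk_size)
        if not chunk:
            break
        buf += chunk
        while True:
            nxt = buf.find(">", 1)            # start of the next record
            if nxt == -1:
                break
            yield buf[:nxt]
            buf = buf[nxt:]
    if buf:
        yield buf                             # last record

def gc_content(record):
    # Fraction of G/C bases in the sequence lines (header line skipped).
    seq = "".join(record.splitlines()[1:])
    return (seq.count("G") + seq.count("C")) / max(len(seq), 1)

data = ">r1\nGGCC\n>r2\nATAT\n>r3\nGCAT\n"
results = [gc_content(r) for r in stream_records(io.StringIO(data))]
# results == [1.0, 0.0, 0.5]
```

    In the real setting `fh` would be a network stream from the client, with each record dispatched to cloud workers as it completes.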

  2. Streaming Support for Data Intensive Cloud-Based Sequence Analysis

    Directory of Open Access Journals (Sweden)

    Shadi A. Issa

    2013-01-01

    Full Text Available Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS technology. Based on the concepts of “resources-on-demand” and “pay-as-you-go”, scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client’s site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.

  3. Challenges of Future VANET and Cloud-Based Approaches

    Directory of Open Access Journals (Sweden)

    Rakesh Shrestha

    2018-01-01

    Full Text Available Vehicular ad hoc networks (VANETs) have been studied intensively due to their wide variety of applications and services, such as passenger safety, enhanced traffic efficiency, and infotainment. With the evolution of technology and the sudden growth in the number of smart vehicles, traditional VANETs face several technical challenges in deployment and management due to limited flexibility and scalability, poor connectivity, and inadequate intelligence. Cloud computing is considered a way to satisfy these requirements in VANETs. However, next-generation VANETs will have special requirements of autonomous vehicles with high mobility, low latency, real-time applications, and connectivity, which may not be resolved by conventional cloud computing. Hence, merging fog computing with the conventional cloud is discussed as a potential solution for several issues in current and future VANETs. In addition, fog computing can be enhanced by integrating Software-Defined Networking (SDN), which provides flexibility, programmability, and global knowledge of the network. We present two example scenarios for timely dissemination of safety messages in future VANETs based on fog and on a combination of fog and SDN, and we explain the issues that need to be resolved for the deployment of three different cloud-based approaches.

  4. TUNNEL POINT CLOUD FILTERING METHOD BASED ON ELLIPTIC CYLINDRICAL MODEL

    Directory of Open Access Journals (Sweden)

    N. Zhu

    2016-06-01

    Full Text Available The large number of bolts and screws attached to the subway shield ring plates, along with the many metal-stent accessories and electrical equipment mounted on the tunnel walls, cause the laser point cloud data to include many non-tunnel-section points (hereinafter referred to as non-points), affecting the accuracy of modeling and deformation monitoring. This paper proposes a filtering method for the point cloud based on an elliptic cylindrical model. The original laser point cloud is first projected onto a horizontal plane, and a search algorithm extracts the edge points of both sides, which are then used to fit the tunnel central axis. Along the axis the point cloud is segmented regionally and then fitted iteratively to a smooth elliptic cylindrical surface. This processing enables the automatic filtering of the inner-wall non-points. Two groups of experiments showed consistent results: the elliptic cylindrical model based method effectively filters out the non-points and meets the accuracy requirements for subway deformation monitoring. The method provides a new mode for the periodic monitoring of the all-around deformation of tunnel sections in routine subway operation and maintenance.
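
    The residual-threshold filtering idea can be sketched on a simplified circular (rather than elliptic) cross-section; the centre/radius estimate and the tolerance here are illustrative assumptions, not the paper's iterative elliptic cylinder fit:

```python
import math

def filter_ring_points(points, tol=0.05):
    # Estimate the cross-section centre and radius, then drop points whose
    # radial residual exceeds tol * radius (the "non-points": bolts, equipment).
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    radii = [math.hypot(x - cx, y - cy) for x, y in points]
    r = sum(radii) / len(radii)
    kept = [p for p, ri in zip(points, radii) if abs(ri - r) <= tol * r]
    return kept, (cx, cy, r)

# 36 points on a unit ring (the tunnel wall) plus two protruding "bolt" points.
ring = [(math.cos(2 * math.pi * i / 36), math.sin(2 * math.pi * i / 36))
        for i in range(36)]
kept, (cx, cy, r) = filter_ring_points(ring + [(0.8, 0.0), (0.0, 0.75)])
# kept contains the 36 wall points; the two bolt points are filtered out.
```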

  5. Streaming Support for Data Intensive Cloud-Based Sequence Analysis

    Science.gov (United States)

    Issa, Shadi A.; Kienzler, Romeo; El-Kalioby, Mohamed; Tonellato, Peter J.; Wall, Dennis; Bruggmann, Rémy; Abouelhoda, Mohamed

    2013-01-01

    Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of “resources-on-demand” and “pay-as-you-go”, scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation. PMID:23710461

  6. KNOWLEDGE-BASED OBJECT DETECTION IN LASER SCANNING POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    F. Boochs

    2012-07-01

    Full Text Available Object identification and object processing in 3D point clouds have always posed challenges in terms of effectiveness and efficiency. In practice, this process is highly dependent on human interpretation of the scene represented by the point cloud data, as well as the set of modeling tools available for use. Such modeling algorithms are data-driven and concentrate on specific features of the objects, being accessible to numerical models. We present an approach that brings the human expert knowledge about the scene, the objects inside, and their representation by the data and the behavior of algorithms to the machine. This “understanding” enables the machine to assist human interpretation of the scene inside the point cloud. Furthermore, it allows the machine to understand possibilities and limitations of algorithms and to take this into account within the processing chain. This not only assists the researchers in defining optimal processing steps, but also provides suggestions when certain changes or new details emerge from the point cloud. Our approach benefits from the advancement in knowledge technologies within the Semantic Web framework. This advancement has provided a strong base for applications based on knowledge management. In the article we will present and describe the knowledge technologies used for our approach such as Web Ontology Language (OWL, used for formulating the knowledge base and the Semantic Web Rule Language (SWRL with 3D processing and topologic built-ins, aiming to combine geometrical analysis of 3D point clouds, and specialists’ knowledge of the scene and algorithmic processing.

  7. A cloud-based multimodality case file for mobile devices.

    Science.gov (United States)

    Balkman, Jason D; Loehfelm, Thomas W

    2014-01-01

    Recent improvements in Web and mobile technology, along with the widespread use of handheld devices in radiology education, provide unique opportunities for creating scalable, universally accessible, portable image-rich radiology case files. A cloud database and a Web-based application for radiologic images were developed to create a mobile case file with reasonable usability, download performance, and image quality for teaching purposes. A total of 75 radiology cases related to breast, thoracic, gastrointestinal, musculoskeletal, and neuroimaging subspecialties were included in the database. Breast imaging cases are the focus of this article, as they best demonstrate handheld display capabilities across a wide variety of modalities. This case subset also illustrates methods for adapting radiologic content to cloud platforms and mobile devices. Readers will gain practical knowledge about storage and retrieval of cloud-based imaging data, an awareness of techniques used to adapt scrollable and high-resolution imaging content for the Web, and an appreciation for optimizing images for handheld devices. The evaluation of this software demonstrates the feasibility of adapting images from most imaging modalities to mobile devices, even in cases of full-field digital mammograms, where high resolution is required to represent subtle pathologic features. The cloud platform allows cases to be added and modified in real time by using only a standard Web browser with no application-specific software. Challenges remain in developing efficient ways to generate, modify, and upload radiologic and supplementary teaching content to this cloud-based platform. Online supplemental material is available for this article. ©RSNA, 2014.

  8. Cloud Based Big Data Infrastructure: Architectural Components and Automated Provisioning

    OpenAIRE

    Demchenko, Yuri; Turkmen, Fatih; Blanchet, Christophe; Loomis, Charles; Laat, Caees de

    2016-01-01

    This paper describes the general architecture and functional components of the cloud based Big Data Infrastructure (BDI). The proposed BDI architecture is based on the analysis of the emerging Big Data and data intensive technologies and supported by the definition of the Big Data Architecture Framework (BDAF) that defines the following components of the Big Data technologies: Big Data definition, Data Management including data lifecycle and data structures, Big Data Infrastructure (generical...

  9. Simple Motor Control Concept Results High Efficiency at High Velocities

    Science.gov (United States)

    Starin, Scott; Engel, Chris

    2013-09-01

    The need for high-velocity motors in space applications for reaction wheels and detectors has stressed the limits of brushless permanent magnet motors (BPMMs). Due to inherent hysteresis core losses, conventional BPMMs must balance the need for torque against hysteresis losses. Cog-less motors have significantly lower hysteresis losses but suffer from lower efficiencies. Additionally, the inherently low inductance of cog-less motors results in high ripple currents or high switching frequencies, which lowers overall efficiency and increases performance demands on the control electronics. However, using a somewhat forgotten but fully qualified technology, Isotropic Magnet Motors (IMMs), extremely high velocities may be achieved at low input power using conventional drive electronics. This paper discusses the trade study efforts and empirical test data on a 34,000 RPM IMM.

  10. High-velocity runaway stars from three-body encounters

    Science.gov (United States)

    Gvaramadze, V. V.; Gualandris, A.; Portegies Zwart, S.

    2010-01-01

    We performed numerical simulations of dynamical encounters between hard, massive binaries and a very massive star (VMS; formed through runaway mergers of ordinary stars in the dense core of a young massive star cluster) to explore the hypothesis that this dynamical process could be responsible for the origin of high-velocity (≥ 200 - 400 km s-1) early or late B-type stars. We estimated the typical velocities produced in encounters between very tight massive binaries and VMSs (of mass ≥ 200 M⊙) and found that about 3 - 4% of all encounters produce velocities ≥ 400 km s-1, while in about 2% of encounters the escapers attain velocities exceeding the Milky Way's escape velocity. We therefore argue that the origin of high-velocity (≥ 200 - 400 km s-1) runaway stars and at least some so-called hypervelocity stars could be associated with dynamical encounters between the tightest massive binaries and VMSs formed in the cores of star clusters. We also simulated dynamical encounters between tight massive binaries and single ordinary 50 - 100 M⊙ stars. We found that from 1 to ≃ 4% of these encounters can produce runaway stars with velocities of ≥ 300 - 400 km s-1 (typical of the bound population of high-velocity halo B-type stars) and occasionally (in less than 1% of encounters) produce hypervelocity (≥ 700 km s-1) late B-type escapers.

  11. Building a cloud based distributed active archive data center

    Science.gov (United States)

    Ramachandran, Rahul; Baynes, Katie; Murphy, Kevin

    2017-04-01

    NASA's Earth Science Data System (ESDS) Program serves as a central cog in facilitating the implementation of NASA's Earth Science strategic plan. Since 1994, the ESDS Program has committed to the full and open sharing of Earth science data obtained from NASA instruments with all users. One of the key responsibilities of the ESDS Program is to continuously evolve the entire data and information system to maximize returns on the collected NASA data. An independent review was conducted in 2015 to holistically review the EOSDIS in order to identify gaps. The review recommended investigating two areas: first, whether commercial cloud providers offer potential for storage, processing, and operational efficiencies, and second, the potential development of new data access and analysis paradigms. In response, ESDS has initiated several prototypes investigating the advantages and risks of leveraging cloud computing. This poster will provide an overview of one such prototyping activity, "Cumulus". Cumulus is being designed and developed as a "native" cloud-based data ingest, archive, and management system that can be used for all future NASA Earth science data streams. The long-term vision for Cumulus, its requirements, overall architecture, and implementation details, as well as lessons learned from the completion of the first phase of this prototype, will be covered. We envision Cumulus will foster the design of new analysis/visualization tools that leverage collocated data from all of the distributed DAACs as well as elastic cloud computing resources, opening new research opportunities.

  12. Smart learning services based on smart cloud computing.

    Science.gov (United States)

    Kim, Svetlana; Song, Su-Mi; Yoon, Yong-Ik

    2011-01-01

    Context-aware technologies can make e-learning services smarter and more efficient, since context-aware services are based on the user's behavior. To add those technologies into existing e-learning services, a service architecture model is needed to transform the existing e-learning environment, which is situation-aware, into an environment that understands context as well. The context-awareness in e-learning may include awareness of the user profile and terminal context. In this paper, we propose a new notion of service that provides context-awareness to smart learning content in a cloud computing environment. We suggest adding the elastic four smarts (E4S) concept (smart pull, smart prospect, smart content, and smart push) to cloud services so that smart learning services become possible. The E4S focuses on meeting users' needs by collecting and analyzing users' behavior, prospecting future services, building corresponding contents, and delivering the contents through the cloud computing environment. Users' behavior can be collected through mobile devices such as smart phones that have built-in sensors. As a result, the proposed smart e-learning model in a cloud computing environment provides personalized and customized learning services to its users.

  13. Multiview point clouds denoising based on interference elimination

    Science.gov (United States)

    Hu, Yang; Wu, Qian; Wang, Le; Jiang, Huanyu

    2018-03-01

    Newly emerging low-cost depth sensors offer huge potential for three-dimensional (3-D) modeling, but their high noise levels prevent these sensors from obtaining accurate results. We therefore propose a method for denoising registered multiview point clouds with high noise. The proposed method aims to fully use redundant information to eliminate the interferences among point clouds of different views through an iterative procedure. In each iteration, noisy points are either deleted or moved to their weighted average targets, depending on which of two cases applies. Simulated data and practical data captured by a Kinect v2 sensor were tested in experiments, both qualitatively and quantitatively. Results showed that the proposed method can effectively reduce noise and recover local features from highly noisy multiview point clouds with good robustness, compared to the truncated signed distance function and moving least squares (MLS). Moreover, the resulting low-noise point clouds can be further smoothed by MLS to achieve improved results. This study demonstrates the feasibility of obtaining fine 3-D models with high-noise devices, especially depth sensors such as the Kinect.
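
    The core step the abstract describes (delete isolated points, pull the rest toward a weighted average of their neighbors) can be sketched in a few lines. This is an illustrative single pass, not the paper's algorithm; the radius, neighbor threshold, and Gaussian weighting are assumed parameters.

```python
import numpy as np

def denoise_point_cloud(points, radius=0.1, min_neighbors=3, sigma=0.05):
    """One denoising pass over an (N, 3) point array: points with too few
    neighbors inside `radius` are deleted as outliers; the rest are moved
    to the Gaussian-weighted average of their neighborhood."""
    cleaned = []
    for p in points:
        d = np.linalg.norm(points - p, axis=1)
        mask = d < radius
        neighbors = points[mask]              # includes p itself
        if len(neighbors) - 1 < min_neighbors:
            continue                          # isolated point: treat as noise
        w = np.exp(-(d[mask] ** 2) / (2 * sigma ** 2))
        cleaned.append((w[:, None] * neighbors).sum(axis=0) / w.sum())
    return np.array(cleaned)
```

    On a registered multiview cloud, such a pass would be iterated, with points from all views contributing to each neighborhood.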

  14. Smart Learning Services Based on Smart Cloud Computing

    Directory of Open Access Journals (Sweden)

    Yong-Ik Yoon

    2011-08-01

    Full Text Available Context-aware technologies can make e-learning services smarter and more efficient, since context-aware services are based on the user's behavior. To add those technologies into existing e-learning services, a service architecture model is needed to transform the existing e-learning environment, which is situation-aware, into an environment that understands context as well. The context-awareness in e-learning may include awareness of the user profile and terminal context. In this paper, we propose a new notion of service that provides context-awareness to smart learning content in a cloud computing environment. We suggest adding the elastic four smarts (E4S) concept (smart pull, smart prospect, smart content, and smart push) to cloud services so that smart learning services become possible. The E4S focuses on meeting users' needs by collecting and analyzing users' behavior, prospecting future services, building corresponding contents, and delivering the contents through the cloud computing environment. Users' behavior can be collected through mobile devices such as smart phones that have built-in sensors. As a result, the proposed smart e-learning model in a cloud computing environment provides personalized and customized learning services to its users.

  15. FINDING CUBOID-BASED BUILDING MODELS IN POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    W. Nguatem

    2012-07-01

    Full Text Available In this paper, we present an automatic approach for the derivation of 3D building models of level-of-detail 1 (LOD 1) from point clouds obtained from (dense) image matching or, for comparison only, from LIDAR. Our approach makes use of the predominance of vertical structures and orthogonal intersections in architectural scenes. After robustly determining the scene's vertical direction based on the 3D points, we use it as a constraint for a RANSAC-based search for vertical planes in the point cloud. The planes are further analyzed to segment reliable outlines of rectangular surfaces within these planes, which are connected to construct cuboid-based building models. We demonstrate that our approach is robust and effective over a range of real-world input data sets with varying point density, amount of noise, and outliers.
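
    The vertical-constrained RANSAC step can be sketched minimally: two sampled points together with the known vertical direction define a candidate vertical plane, and the candidate with the most inliers wins. The iteration count and inlier tolerance below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def ransac_vertical_plane(points, up, iters=200, tol=0.02, rng=None):
    """Find the best-supported plane containing the vertical direction
    `up`. Two sampled points plus `up` define a candidate plane whose
    normal is perpendicular to `up` by construction."""
    rng = rng or np.random.default_rng(0)
    up = up / np.linalg.norm(up)
    best_normal, best_d, best_count = None, 0.0, -1
    for _ in range(iters):
        i, j = rng.choice(len(points), size=2, replace=False)
        n = np.cross(points[j] - points[i], up)
        if np.linalg.norm(n) < 1e-9:
            continue                       # degenerate sample
        n /= np.linalg.norm(n)
        d = n @ points[i]                  # plane: n . x = d
        count = int(np.sum(np.abs(points @ n - d) < tol))
        if count > best_count:
            best_normal, best_d, best_count = n, d, count
    return best_normal, best_d, best_count
```

    In a full pipeline this search would be repeated, removing each detected plane's inliers before looking for the next wall.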

  16. Security of Heterogeneous Content in Cloud Based Library Information Systems Using an Ontology Based Approach

    Directory of Open Access Journals (Sweden)

    Mihai DOINEA

    2014-01-01

    Full Text Available As in any domain that involves the use of software, library information systems take advantage of cloud computing. The paper highlights the main aspects of cloud-based systems, describing some public solutions provided by the most important players on the market. Topics related to content security in cloud-based services are tackled in order to emphasize the requirements that must be met by these types of systems. A cloud-based implementation of a Library Information System is presented, and some adjacent tools used together with it to provide digital content and metadata links are described. In a cloud-based Library Information System, security is approached by means of ontologies. Aspects such as content security in terms of digital rights are presented, and a methodology for security optimization is proposed.

  17. Business Process as a Service Model Based Business and IT Cloud Alignment as a Cloud Offering

    OpenAIRE

    Robert Woitsch; Wilfrid Utz

    2015-01-01

    Cloud computing has proved to offer flexible IT solutions. Although large enterprises may benefit from this technology, SMEs are falling behind in cloud usage due to missing IT competence and hence lose the ability to efficiently adapt their IT to their business needs. This paper introduces the project idea of the H2020 project CloudSocket, elaborating the idea of Business Processes as a Service (BPaaS), where concept models and semantics are applied to align business processes with Cloud deplo...

  18. Space Science Cloud: a Virtual Space Science Research Platform Based on Cloud Model

    Science.gov (United States)

    Hu, Xiaoyan; Tong, Jizhou; Zou, Ziming

    Through independent and co-operational science missions, the Strategic Pioneer Program (SPP) on Space Science, the new space science initiative in China approved by CAS and implemented by the National Space Science Center (NSSC), is dedicated to seeking new discoveries and new breakthroughs in space science, thus deepening our understanding of the universe and planet Earth. In the framework of this program, in order to support the operations of space science missions and satisfy the demand of related research activities for e-Science, NSSC is developing a virtual space science research platform based on the cloud model, namely the Space Science Cloud (SSC). To support mission demonstration, SSC integrates an interactive satellite orbit design tool, a satellite structure and payload layout design tool, a payload observation coverage analysis tool, etc., to help scientists analyze and verify space science mission designs. Another important function of SSC is supporting mission operations, which run through the space satellite data pipelines. Mission operators can acquire and process observation data, then distribute the data products to other systems or issue the data and archives with the services of SSC. In addition, SSC provides useful data, tools, and models for space researchers. Several databases in the field of space science are integrated, and an efficient retrieval system is under development. Common tools for data visualization, deep processing (e.g., smoothing and filtering tools), analysis (e.g., an FFT analysis tool and a minimum variance analysis tool), and mining (e.g., a proton event correlation analysis tool) are also integrated to help researchers better utilize the data. The space weather models on SSC include a magnetic storm forecast model, a multi-station middle and upper atmospheric climate model, a solar energetic particle propagation model, and so on. All the services mentioned above are based on the e-Science infrastructures of CAS, e.g., cloud storage and

  19. Cloud-based services for your library a LITA guide

    CERN Document Server

    Mitchell, Erik T

    2013-01-01

    By exploring specific examples of cloud computing and virtualization, this book allows libraries considering cloud computing to start their exploration of these systems with a more informed perspective.

  20. Clone-based Data Index in Cloud Storage Systems

    Directory of Open Access Journals (Sweden)

    He Jing

    2016-01-01

    Full Text Available Storage systems have been challenged by the development of cloud computing. Traditional data indexes cannot satisfy the requirements of cloud computing because of their huge index volumes and the need for quick response times. Meanwhile, because of the increasing size of the data index and its dynamic characteristics, previous approaches, such as rebuilding the index or fully backing up the index before the data changes, cannot satisfy the needs of today's big data indexing. To solve these problems, we propose a double-layer index structure that overcomes the throughput limitation of a single-point server. Then, a clone-based B+ tree structure is proposed to achieve high performance and adapt to dynamic environments. The experimental results show that our clone-based solution has high efficiency.
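
    The clone-based idea can be illustrated with a much simpler structure: a persistent binary search tree that, like a clone-based B+ tree, copies only the nodes along the modified path and shares every untouched subtree between versions. This is an illustrative sketch of path copying, not the paper's B+ tree design.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class Node:
    key: int
    left: "Optional[Node]" = None
    right: "Optional[Node]" = None

def insert(root: Optional[Node], key: int) -> Node:
    """Persistent insert: clones only the nodes on the search path,
    sharing every untouched subtree with the previous version."""
    if root is None:
        return Node(key)
    if key < root.key:
        return Node(root.key, insert(root.left, key), root.right)
    if key > root.key:
        return Node(root.key, root.left, insert(root.right, key))
    return root  # duplicate key: nothing to change

def contains(root: Optional[Node], key: int) -> bool:
    while root is not None:
        if key == root.key:
            return True
        root = root.left if key < root.key else root.right
    return False
```

    Because old versions stay intact, readers can keep querying a snapshot while a writer produces the next version, which is the property that makes cloning attractive for a dynamic cloud index.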

  1. Efficient Resources Provisioning Based on Load Forecasting in Cloud

    Directory of Open Access Journals (Sweden)

    Rongdong Hu

    2014-01-01

    Full Text Available Cloud providers should ensure QoS while maximizing resource utilization. One optimal strategy is to timely allocate resources in a fine-grained mode according to an application's actual resource demand. The necessary precondition of this strategy is obtaining future load information in advance. We propose a multi-step-ahead load forecasting method, KSwSVR, based on statistical learning theory, which is suitable for the complex and dynamic characteristics of the cloud computing environment. It integrates an improved support vector regression algorithm and a Kalman smoother. Public trace data taken from multiple types of resources were used to verify its prediction accuracy, stability, and adaptability, compared with AR, BPNN, and standard SVR. Subsequently, based on the predicted results, a simple and efficient strategy is proposed for resource provisioning. A CPU allocation experiment indicated it can effectively reduce resource consumption while meeting service-level agreement requirements.
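
    KSwSVR itself combines an improved SVR with a Kalman smoother; as a hedged sketch of the latter ingredient, a minimal one-dimensional Kalman filter can denoise a load trace before it is fed to a regressor. The process and measurement variances below are illustrative assumptions, not the paper's tuning.

```python
import numpy as np

def kalman_smooth(z, q=1e-3, r=2.5e-3):
    """Minimal 1-D Kalman filter pass over a noisy load trace.
    q: process-noise variance, r: measurement-noise variance."""
    x, p = float(z[0]), 1.0
    out = np.empty(len(z), dtype=float)
    for t, meas in enumerate(z):
        p = p + q                      # predict: uncertainty grows
        k = p / (p + r)                # Kalman gain
        x = x + k * (meas - x)         # correct with the measurement
        p = (1 - k) * p
        out[t] = x
    return out
```

    The smoothed trace would then serve as cleaner training input for the multi-step-ahead regressor.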

  2. Managing the move to the cloud – analyzing the risks and opportunities of cloud-based accounting information systems

    OpenAIRE

    Asatiani, Aleksandre; Penttinen, Esko

    2015-01-01

    The accounting industry is being disrupted by the introduction of cloud-based accounting information systems (AIS) that allow for a more efficient allocation of work between the accountant and the client company. In cloud-based AIS, the accountant and the client company as well as third parties such as auditors can simultaneously work on the data in real time. This, in turn, enables a much more granular division of work between the parties. This teaching case considers Kluuvin Apteekki, a sma...

  3. Evaluation of Satellite-Based Upper Troposphere Cloud Top Height Retrievals in Multilayer Cloud Conditions During TC4

    Science.gov (United States)

    Chang, Fu-Lung; Minnis, Patrick; Ayers, J. Kirk; McGill, Matthew J.; Palikonda, Rabindra; Spangenberg, Douglas A.; Smith, William L., Jr.; Yost, Christopher R.

    2010-01-01

    Upper troposphere cloud top heights (CTHs), restricted to cloud top pressures (CTPs) less than 500 hPa, inferred using four satellite retrieval methods applied to Twelfth Geostationary Operational Environmental Satellite (GOES-12) data are evaluated using measurements during the July-August 2007 Tropical Composition, Cloud and Climate Coupling Experiment (TC4). The four methods are the single-layer CO2-absorption technique (SCO2AT), a modified CO2-absorption technique (MCO2AT) developed for improving both single-layered and multilayered cloud retrievals, a standard version of the Visible Infrared Solar-infrared Split-window Technique (old VISST), and a new version of VISST (new VISST) recently developed to improve cloud property retrievals. They are evaluated by comparing with ER-2 aircraft-based Cloud Physics Lidar (CPL) data taken during 9 days having extensive upper troposphere cirrus, anvil, and convective clouds. Compared to the 89% coverage by upper tropospheric clouds detected by the CPL, the SCO2AT, MCO2AT, old VISST, and new VISST retrieved CTPs less than 500 hPa in 76, 76, 69, and 74% of the matched pixels, respectively. Most of the differences are due to subvisible and optically thin cirrus clouds occurring near the tropopause that were detected only by the CPL. The mean upper tropospheric CTHs for the 9 days are 14.2 (+/- 2.1) km from the CPL and 10.7 (+/- 2.1), 12.1 (+/- 1.6), 9.7 (+/- 2.9), and 11.4 (+/- 2.8) km from the SCO2AT, MCO2AT, old VISST, and new VISST, respectively. Compared to the CPL, the MCO2AT CTHs had the smallest mean biases for semitransparent high clouds in both single-layered and multilayered situations whereas the new VISST CTHs had the smallest mean biases when upper clouds were opaque and optically thick. The biases for all techniques increased with increasing numbers of cloud layers. The transparency of the upper layer clouds tends to increase with the numbers of cloud layers.

  4. A MODELING METHOD OF FLUTTERING LEAVES BASED ON POINT CLOUD

    OpenAIRE

    J. Tang; Y. Wang; Y. Zhao; Y. Zhao; W. Hao; X. Ning; K. Lv; Z. Shi; M. Zhao

    2017-01-01

    Leaves falling gently or fluttering is a common phenomenon in natural scenes. The authenticity of falling leaves plays an important part in the dynamic modeling of natural scenes, and the falling-leaves model has wide applications in the fields of animation and virtual reality. We propose a novel modeling method of fluttering leaves based on point clouds in this paper. According to the shape, the weight of leaves and the wind speed, three basic trajectories of leaves falling are defined, which ar...

  5. Cloud Based Metalearning System for Predictive Modeling of Biomedical Data

    Directory of Open Access Journals (Sweden)

    Milan Vukićević

    2014-01-01

    Full Text Available Rapid growth and storage of biomedical data have enabled many opportunities for predictive modeling and improvement of healthcare processes. On the other hand, analysis of such large amounts of data is a difficult and computationally intensive task for most existing data mining algorithms. This problem is addressed by proposing a cloud-based system that integrates a metalearning framework for ranking and selecting the best predictive algorithms for the data at hand, together with open-source big data technologies for the analysis of biomedical data.

  6. RFID-based Electronic Identity Security Cloud Platform in Cyberspace

    OpenAIRE

    Bing Chen; Chengxiang Tan; Bo Jin; Xiang Zou; Yuebo Dai

    2012-01-01

    With the rapid development of networks, especially the Internet of Things, electronic identity administration in cyberspace is becoming more and more important. Personal identity management in cyberspace, associated with individuals in reality, has become a significant and urgent task for the further development of information construction in China. This paper therefore presents an RFID-based electronic identity security cloud platform in cyberspace to implement an efficient security management of cyb...

  7. Towards cloud based big data analytics for smart future cities

    OpenAIRE

    Khan, Zaheer; Anjum, Ashiq; Soomro, Kamran; Tahir, Muhammad

    2015-01-01

    A large amount of land-use, environment, socio-economic, energy and transport data is generated in cities. An integrated perspective of managing and analysing such big data can answer a number of science, policy, planning, governance and business questions and support decision making in enabling a smarter environment. This paper presents a theoretical and experimental perspective on smart-city-focused big data management and analysis by proposing a cloud-based analytics service. A proto...

  8. A MAS-Based Cloud Service Brokering System to Respond Security Needs of Cloud Customers

    Directory of Open Access Journals (Sweden)

    Jamal Talbi

    2017-03-01

    Full Text Available Cloud computing is becoming a key factor in computer science and an important technology for many organizations to deliver different types of services. The companies which provide services to customers are called cloud service providers. Cloud users (CUs) are growing in number and require secure, reliable and trustworthy cloud service providers (CSPs) from the market. It is therefore a challenge for a new customer to choose a highly secure provider. This paper presents a cloud service brokering system that analyzes and ranks the secure cloud service providers among those available. This model uses autonomous and flexible agents in a multi-agent system (MAS) that have intelligent behavior and suitable tools for helping the brokering system assess the security risks of the group of cloud providers, decide on the most secure provider, and justify the business needs of users in terms of security and reliability.

  9. Evaluating the Usage of Cloud-Based Collaboration Services through Teamwork

    Science.gov (United States)

    Qin, Li; Hsu, Jeffrey; Stern, Mel

    2016-01-01

    With the proliferation of cloud computing for both organizational and educational use, cloud-based collaboration services are transforming how people work in teams. The authors investigated the determinants of the usage of cloud-based collaboration services including teamwork quality, computer self-efficacy, and prior experience, as well as its…

  10. Generic-distributed framework for cloud services marketplace based on unified ontology

    Directory of Open Access Journals (Sweden)

    Samer Hasan

    2017-11-01

    Full Text Available Cloud computing is a pattern for delivering ubiquitous and on-demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. On the other hand, cloud consumers use available search engines (Google and Yahoo) to explore cloud service descriptions and find an adequate service. Unfortunately, general-purpose search engines are not designed to provide a small and complete set of results, which makes the process a big challenge. This paper presents a generic-distributed framework for a cloud services marketplace to automate the cloud service discovery and selection process and remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms, namely a dominant and recessive attributes algorithm borrowed from gene science and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors' knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters. On the other hand, the dominant and recessive attributes approach reduced the execution time by 57% but showed a lower value for recall.

  11. Generic-distributed framework for cloud services marketplace based on unified ontology.

    Science.gov (United States)

    Hasan, Samer; Valli Kumari, V

    2017-11-01

    Cloud computing is a pattern for delivering ubiquitous and on-demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. On the other hand, cloud consumers use available search engines (Google and Yahoo) to explore cloud service descriptions and find an adequate service. Unfortunately, general-purpose search engines are not designed to provide a small and complete set of results, which makes the process a big challenge. This paper presents a generic-distributed framework for a cloud services marketplace to automate the cloud service discovery and selection process and remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms, namely a dominant and recessive attributes algorithm borrowed from gene science and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors' knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters. On the other hand, the dominant and recessive attributes approach reduced the execution time by 57% but showed a lower value for recall.
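
    As a toy stand-in for the framework's matchers, service selection can be sketched as ranking candidates by attribute overlap with a request, a crude proxy for ontology-based semantic similarity. The service names and attributes below are invented for illustration.

```python
def jaccard(a, b):
    """Jaccard similarity of two attribute collections."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def rank_services(request_attrs, services):
    """Rank candidate cloud services by attribute overlap with the
    request. `services` maps service name -> iterable of attributes."""
    scored = [(jaccard(request_attrs, attrs), name)
              for name, attrs in services.items()]
    return [name for _, name in sorted(scored, reverse=True)]
```

    A real semantic matcher would score attribute pairs through the ontology (e.g., "object storage" close to "blob storage") instead of requiring exact string equality.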

  12. Cloud-based Jupyter Notebooks for Water Data Analysis

    Science.gov (United States)

    Castronova, A. M.; Brazil, L.; Seul, M.

    2017-12-01

    The development and adoption of technologies by the water science community to improve our ability to openly collaborate and share workflows will have a transformative impact on how we address the challenges associated with collaborative and reproducible scientific research. Jupyter notebooks offer one solution by providing an open-source platform for creating metadata-rich toolchains for modeling and data analysis applications. Adoption of this technology within the water sciences, coupled with publicly available datasets from agencies such as USGS, NASA, and EPA, enables researchers to easily prototype and execute data intensive toolchains. Moreover, implementing this software stack in a cloud-based environment extends its native functionality to provide researchers a mechanism to build and execute toolchains that are too large or computationally demanding for typical desktop computers. Additionally, this cloud-based solution enables scientists to disseminate data processing routines alongside journal publications in an effort to support reproducibility. For example, these data collection and analysis toolchains can be shared, archived, and published using the HydroShare platform or downloaded and executed locally to reproduce scientific analysis. This work presents the design and implementation of a cloud-based Jupyter environment and its application for collecting, aggregating, and munging various datasets in a transparent, sharable, and self-documented manner. The goals of this work are to establish a free and open source platform for domain scientists to (1) conduct data intensive and computationally intensive collaborative research, (2) utilize high performance libraries, models, and routines within a pre-configured cloud environment, and (3) enable dissemination of research products. This presentation will discuss recent efforts towards achieving these goals, and describe the architectural design of the notebook server in an effort to support collaborative

  13. A Location-Based Interactive Model of Internet of Things and Cloud (IoT-Cloud) for Mobile Cloud Computing Applications.

    Science.gov (United States)

    Dinh, Thanh; Kim, Younghan; Lee, Hyukjoon

    2017-03-01

    This paper presents a location-based interactive model of Internet of Things (IoT) and cloud integration (IoT-cloud) for mobile cloud computing applications, in comparison with the periodic sensing model. In the latter, sensing collections are performed without awareness of sensing demands. Sensors are required to report their sensing data periodically regardless of whether or not there are demands for their sensing services. This leads to unnecessary energy loss due to redundant transmission. In the proposed model, IoT-cloud provides sensing services on demand based on the interest and location of mobile users. By taking advantage of the cloud as a coordinator, the sensing schedule of sensors is controlled by the cloud, which knows when and where mobile users request sensing services. Therefore, when there is no demand, sensors are put into an inactive mode to save energy. Through extensive analysis and experimental results, we show that the location-based model achieves a significant improvement in terms of network lifetime compared to the periodic model.
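
    The energy argument can be made concrete with a toy simulation contrasting the two models: a periodic sensor transmits every slot, while an on-demand sensor transmits only in slots where the cloud relays a user demand for its location. The demand probability and per-transmission energy are assumed values, not figures from the paper.

```python
import random

def simulate_energy(demand_prob, slots=10000, e_tx=1.0, seed=7):
    """Total transmission energy of a periodic sensor (reports every
    slot) versus an on-demand sensor (reports only when the cloud
    signals a user demand)."""
    rng = random.Random(seed)
    periodic = on_demand = 0.0
    for _ in range(slots):
        periodic += e_tx                  # periodic model: always transmits
        if rng.random() < demand_prob:    # cloud relays a demand this slot
            on_demand += e_tx             # on-demand model transmits only now
    return periodic, on_demand
```

    With demands in roughly 10% of slots, the on-demand sensor spends about a tenth of the energy, which is the mechanism behind the longer network lifetime the abstract reports.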

  14. A Location-Based Interactive Model of Internet of Things and Cloud (IoT-Cloud for Mobile Cloud Computing Applications

    Directory of Open Access Journals (Sweden)

    Thanh Dinh

    2017-03-01

    Full Text Available This paper presents a location-based interactive model of Internet of Things (IoT) and cloud integration (IoT-cloud) for mobile cloud computing applications, in comparison with the periodic sensing model. In the latter, sensing collections are performed without awareness of sensing demands. Sensors are required to report their sensing data periodically regardless of whether or not there are demands for their sensing services. This leads to unnecessary energy loss due to redundant transmission. In the proposed model, IoT-cloud provides sensing services on demand based on the interest and location of mobile users. By taking advantage of the cloud as a coordinator, the sensing schedule of sensors is controlled by the cloud, which knows when and where mobile users request sensing services. Therefore, when there is no demand, sensors are put into an inactive mode to save energy. Through extensive analysis and experimental results, we show that the location-based model achieves a significant improvement in terms of network lifetime compared to the periodic model.

  15. A Location-Based Interactive Model of Internet of Things and Cloud (IoT-Cloud) for Mobile Cloud Computing Applications †

    Science.gov (United States)

    Dinh, Thanh; Kim, Younghan; Lee, Hyukjoon

    2017-01-01

    This paper presents a location-based interactive model of Internet of Things (IoT) and cloud integration (IoT-cloud) for mobile cloud computing applications, in comparison with the periodic sensing model. In the latter, sensing collections are performed without awareness of sensing demands. Sensors are required to report their sensing data periodically regardless of whether or not there are demands for their sensing services. This leads to unnecessary energy loss due to redundant transmission. In the proposed model, IoT-cloud provides sensing services on demand based on the interest and location of mobile users. By taking advantage of the cloud as a coordinator, the sensing schedule of sensors is controlled by the cloud, which knows when and where mobile users request sensing services. Therefore, when there is no demand, sensors are put into an inactive mode to save energy. Through extensive analysis and experimental results, we show that the location-based model achieves a significant improvement in terms of network lifetime compared to the periodic model. PMID:28257067

  16. Study on Cloud Computing Resource Scheduling Strategy Based on the Ant Colony Optimization Algorithm

    OpenAIRE

    Lingna He; Qingshui Li; Linan Zhu

    2012-01-01

    In order to replace traditional Internet software usage patterns and enterprise management modes, this paper proposes a new business computing mode: cloud computing. Resource scheduling strategy is a key technology in cloud computing. Based on a study of the cloud computing system structure and mode of operation, the paper focuses on the work scheduling process and resource allocation problems in cloud computing based on the ant colony algorithm, with detailed analysis and design of the...
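
    A minimal sketch of ant colony optimization applied to cloud task scheduling, assuming the goal is to minimize makespan (finish time of the busiest VM); the pheromone update and load-based heuristic are textbook ACO choices, not necessarily those of the paper.

```python
import random

def aco_schedule(task_len, vm_speed, ants=20, iters=50, rho=0.1, seed=0):
    """Toy ACO: assign each task to a VM so the makespan is small.
    Pheromone tau[t][v] biases task t toward VM v."""
    rng = random.Random(seed)
    T, V = len(task_len), len(vm_speed)
    tau = [[1.0] * V for _ in range(T)]
    best, best_ms = None, float("inf")
    for _ in range(iters):
        for _ in range(ants):
            load = [0.0] * V
            assign = []
            for t in range(T):
                # desirability = pheromone x heuristic (prefer lightly loaded VMs)
                w = [tau[t][v] / (1.0 + load[v] + task_len[t] / vm_speed[v])
                     for v in range(V)]
                v = rng.choices(range(V), weights=w)[0]
                load[v] += task_len[t] / vm_speed[v]
                assign.append(v)
            ms = max(load)
            if ms < best_ms:
                best, best_ms = assign, ms
        # evaporate pheromone, then reinforce the best-so-far assignment
        for t in range(T):
            for v in range(V):
                tau[t][v] *= (1 - rho)
            tau[t][best[t]] += 1.0 / best_ms
    return best, best_ms
```

    The evaporation rate rho keeps the colony from locking onto an early suboptimal assignment while reinforcement concentrates the search around good ones.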

  17. Development of a cloud-based Bioinformatics Training Platform.

    Science.gov (United States)

    Revote, Jerico; Watson-Haigh, Nathan S; Quenette, Steve; Bethwaite, Blair; McGrath, Annette; Shang, Catherine A

    2017-05-01

    The Bioinformatics Training Platform (BTP) has been developed to provide access to the computational infrastructure required to deliver sophisticated hands-on bioinformatics training courses. The BTP is a cloud-based solution that is in active use for delivering next-generation sequencing training to Australian researchers at geographically dispersed locations. The BTP was built to provide an easy, accessible, consistent and cost-effective approach to delivering workshops at host universities and organizations with a high demand for bioinformatics training but lacking the dedicated bioinformatics training suites required. To support broad uptake of the BTP, the platform has been made compatible with multiple cloud infrastructures. The BTP is an open-source and open-access resource. To date, 20 training workshops have been delivered to over 700 trainees at over 10 venues across Australia using the BTP. © The Author 2016. Published by Oxford University Press.

  18. Web-based CERES Clouds QC Property Viewing Tool

    Science.gov (United States)

    Smith, R. A.; Chu, C.; Sun-Mack, S.; Chen, Y.; Heckert, E.; Minnis, P.

    2014-12-01

    This presentation will display the capabilities of a web-based CERES cloud property viewer. Terra data will be chosen for examples. It will demonstrate viewing of cloud properties in gridded global maps, histograms, time series displays, latitudinal zonal images, binned data charts, data frequency graphs, and ISCCP plots. Images can be manipulated by the user to narrow boundaries of the map as well as color bars and value ranges, compare datasets, view data values, and more. Other atmospheric studies groups will be encouraged to put their data into the underlying NetCDF data format and view their data with the tool. A laptop will hopefully be available to allow conference attendees to try navigating the tool.

  19. A privacy authentication scheme based on cloud for medical environment.

    Science.gov (United States)

    Chen, Chin-Ling; Yang, Tsai-Tung; Chiang, Mao-Lun; Shih, Tzay-Farn

    2014-11-01

    With the rapid development of information technology, health care technologies have matured; electronic medical records, for example, can now be easily stored. However, making medical resources more convenient to access remains a pressing issue. Although many studies have discussed medical systems, these systems face many security challenges, the most important being patients' privacy. We therefore propose a privacy authentication scheme based on the cloud environment. Our scheme exploits the characteristics of mobile devices, allowing people to use medical resources in the cloud environment to obtain medical advice conveniently. A digital signature is used to ensure the security of the medical information, which is certified by the medical department in our proposed scheme.

  20. Batch Attribute-Based Encryption for Secure Clouds

    Directory of Open Access Journals (Sweden)

    Chen Yang

    2015-10-01

    Full Text Available Cloud storage is widely used by organizations due to its advantage of allowing universal access with low cost. Attribute-based encryption (ABE) is a kind of public key encryption suitable for cloud storage. The secret key of each user and the ciphertext are associated with an access policy and an attribute set, respectively; a user holding a secret key can decrypt a ciphertext only if the associated attributes match the predetermined access policy, which allows fine-grained access control to be enforced on outsourced files. One issue with existing ABE schemes is that they are designed for the users of a single organization: when an owner wants to share data with the users of different organizations, the owner needs to encrypt the messages to the receivers of one organization and then repeat the process for each further organization. This situation worsens as more and more mobile devices use cloud services, because the ABE encryption process is time-consuming and may quickly exhaust the power supplies of mobile devices. In this paper, we propose a batch attribute-based encryption (BABE) approach to address this problem in a provably secure way. With our approach, the data owner can outsource data in batches to the users of different organizations simultaneously, and is allowed to decide the receiving organizations and the attributes required for decryption. Theoretical and experimental analyses show that our approach is more efficient than traditional encryption implementations in both computation and communication.

  1. Cloud-Based Model Calibration Using OpenStudio: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Hale, E.; Lisell, L.; Goldwasser, D.; Macumber, D.; Dean, J.; Metzger, I.; Parker, A.; Long, N.; Ball, B.; Schott, M.; Weaver, E.; Brackney, L.

    2014-03-01

    OpenStudio is a free, open source Software Development Kit (SDK) and application suite for performing building energy modeling and analysis. The OpenStudio Parametric Analysis Tool has been extended to allow cloud-based simulation of multiple OpenStudio models parametrically related to a baseline model. This paper describes the new cloud-based simulation functionality and presents a model cali-bration case study. Calibration is initiated by entering actual monthly utility bill data into the baseline model. Multiple parameters are then varied over multiple iterations to reduce the difference between actual energy consumption and model simulation results, as calculated and visualized by billing period and by fuel type. Simulations are per-formed in parallel using the Amazon Elastic Cloud service. This paper highlights model parameterizations (measures) used for calibration, but the same multi-nodal computing architecture is available for other purposes, for example, recommending combinations of retrofit energy saving measures using the calibrated model as the new baseline.
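The calibration loop the abstract describes (vary parameters, compare simulation to utility bills, keep the best fit) can be sketched without OpenStudio itself. Below, the "simulation", the utility-bill data, and the single infiltration parameter are all invented stand-ins; the error metric is CV(RMSE), a common calibration metric for monthly billing data.

```python
# Toy calibration loop in the spirit of the paper (not OpenStudio itself):
# vary one model parameter and keep the value that minimizes the error
# between simulated and actual monthly energy use. All data are invented.

actual_kwh = [820, 760, 700, 650, 640, 700, 790, 810, 720, 680, 700, 780]

def simulate(infiltration_ach):
    # stand-in for an EnergyPlus run: a baseline load plus an infiltration term
    base = [800, 740, 680, 630, 620, 680, 770, 790, 700, 660, 680, 760]
    return [b + 40 * infiltration_ach for b in base]

def cvrmse(sim, act):
    # coefficient of variation of the RMSE, in percent
    n = len(act)
    rmse = (sum((s - a) ** 2 for s, a in zip(sim, act)) / n) ** 0.5
    return 100 * rmse / (sum(act) / n)

candidates = [0.0, 0.25, 0.5, 0.75, 1.0]      # infiltration values to try
best = min(candidates, key=lambda p: cvrmse(simulate(p), actual_kwh))
print("best infiltration:", best)
```

In the real tool each "candidate" is a full building-model variant simulated in parallel on cloud instances, which is why the multi-nodal architecture matters: the candidates are independent and embarrassingly parallel.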

  2. SnowCloud - a Framework to Predict Streamflow in Snowmelt-dominated Watersheds Using Cloud-based Computing

    Science.gov (United States)

    Sproles, E. A.; Crumley, R. L.; Nolin, A. W.; Mar, E.; Lopez-Moreno, J. J.

    2017-12-01

    Streamflow in snowy mountain regions is extraordinarily challenging to forecast, and prediction efforts are hampered by the lack of timely snow data, particularly in data-sparse regions. SnowCloud is a prototype web-based framework that integrates remote sensing, cloud computing, interactive mapping tools, and a hydrologic model to offer a new paradigm for delivering key data to water resource managers. We tested the skill of SnowCloud to forecast monthly streamflow with one month lead time in three snow-dominated headwaters. These watersheds represent a range of precipitation/runoff schemes: the Río Elqui in northern Chile (200 mm/yr, entirely snowmelt); the John Day River, Oregon, USA (635 mm/yr, primarily snowmelt); and the Río Aragon in northern Spain (850 mm/yr, snowmelt dominated). Model skill corresponded to snowpack contribution, with Nash-Sutcliffe Efficiencies of 0.86, 0.52, and 0.21, respectively. SnowCloud does not require the user to possess advanced programming skills or proprietary software. We access NASA's MOD10A1 snow cover product to calculate the snow metrics globally using Google Earth Engine's geospatial analysis and cloud computing service. The analytics and forecast tools are provided through a web-based portal that requires only internet access and minimal training. To test the efficacy of SnowCloud we provided the tools and a series of tutorials in English and Spanish to water resource managers in Chile, Spain, and the United States. Participants assessed their user experience and provided feedback, and the results of our multi-cultural assessment are also presented. While our results focus on SnowCloud, they outline methods to develop cloud-based tools that function effectively across cultures and languages. Our approach also addresses the primary challenges of science-based computing: human resource limitations, infrastructure costs, and expensive proprietary software. These challenges are particularly problematic in developing
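The skill scores quoted in the abstract (0.86, 0.52, 0.21) are Nash-Sutcliffe Efficiencies. The metric is simple to compute; the streamflow values below are invented for illustration.

```python
# Nash-Sutcliffe Efficiency: NSE = 1 - sum((obs - sim)^2) / sum((obs - mean_obs)^2).
# NSE = 1.0 is a perfect forecast; NSE = 0.0 means the forecast is no better
# than always predicting the observed mean. Data below are invented.

def nse(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

obs = [12.0, 18.0, 25.0, 40.0, 31.0, 20.0]   # monthly streamflow (m^3/s)
sim = [14.0, 17.0, 24.0, 37.0, 33.0, 21.0]   # one-month-ahead forecasts
print(round(nse(obs, sim), 3))
```

Note that NSE penalizes errors relative to the natural variability of the record, which is why the mostly-snowmelt Río Elqui (high, predictable seasonal signal) scores better than the mixed-regime Río Aragon.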

  3. PC-Cluster based Storage System Architecture for Cloud Storage

    OpenAIRE

    Yee, Tin Tin; Naing, Thinn Thu

    2011-01-01

    The design and architecture of a cloud storage system play a vital role in cloud computing infrastructure, improving storage capacity as well as cost effectiveness. A cloud storage system usually provides users with efficient, elastic storage space. One of the challenges of such a system is balancing the provision of huge elastic storage capacity against the expensive investment it requires. In order to solve this issue in the cloud storage infrastructure, low ...

  4. Testing a polarimetric cloud imager aboard research vessel Polarstern: comparison of color-based and polarimetric cloud detection algorithms.

    Science.gov (United States)

    Barta, András; Horváth, Gábor; Horváth, Ákos; Egri, Ádám; Blahó, Miklós; Barta, Pál; Bumke, Karl; Macke, Andreas

    2015-02-10

    Cloud cover estimation is an important part of routine meteorological observations. Cloudiness measurements are used in climate model evaluation, nowcasting solar radiation, parameterizing the fluctuations of sea surface insolation, and building energy transfer models of the atmosphere. Currently, the most widespread ground-based method to measure cloudiness is based on analyzing the unpolarized intensity and color distribution of the sky obtained by digital cameras. As a new approach, we propose that cloud detection can be aided by the additional use of skylight polarization measured by 180° field-of-view imaging polarimetry. In the fall of 2010, we tested such a novel polarimetric cloud detector aboard the research vessel Polarstern during expedition ANT-XXVII/1. One of our goals was to test the durability of the measurement hardware under the extreme conditions of a trans-Atlantic cruise. Here, we describe the instrument and compare the results of several different cloud detection algorithms, some conventional and some newly developed. We also discuss the weaknesses of our design and its possible improvements. The comparison with cloud detection algorithms developed for traditional nonpolarimetric full-sky imagers allowed us to evaluate the added value of polarimetric quantities. We found that (1) neural-network-based algorithms perform the best among the investigated schemes and (2) global information (the mean and variance of intensity), nonoptical information (e.g., sun-view geometry), and polarimetric information (e.g., the degree of polarization) improve the accuracy of cloud detection, albeit slightly.
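The polarimetric quantity the study feeds into its detection algorithms, the degree of polarization, is computed per pixel from Stokes parameters. The values and the threshold rule below are invented for illustration; the paper's best-performing detectors were neural networks, not a fixed threshold.

```python
# Degree of linear polarization from Stokes parameters I, Q, U.
# Clear sky typically shows high polarization at 90 degrees from the sun;
# cloudy pixels depolarize skylight. Values and threshold are assumptions.

def degree_of_linear_polarization(i, q, u):
    return (q ** 2 + u ** 2) ** 0.5 / i

clear_sky = degree_of_linear_polarization(1.00, 0.45, 0.30)   # strongly polarized
cloud = degree_of_linear_polarization(1.00, 0.05, 0.02)       # depolarized

def is_cloudy(i, q, u, threshold=0.15):
    """Hypothetical threshold rule in the spirit of the paper."""
    return degree_of_linear_polarization(i, q, u) < threshold

print(round(clear_sky, 3), round(cloud, 3), is_cloudy(1.0, 0.05, 0.02))
```

In practice such a polarimetric feature would be combined with the intensity and color features of conventional full-sky imagers, which is exactly the "added value" comparison the abstract describes.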

  5. Model Based Business and IT Cloud Alignment as a Cloud Offering

    OpenAIRE

    Robert Woitsch; Wilfrid Utz

    2015-01-01

    Cloud computing has proved to offer flexible IT solutions. Although large enterprises may benefit from this technology by educating their IT departments, SMEs are dramatically falling behind in cloud usage and hence lose the ability to efficiently adapt their IT to their business needs. This paper introduces the idea of the H2020 project CloudSocket, elaborating the concept of Business Processes as a Service, where concept models and semantics are applied to align business pro...

  6. Decision making in high-velocity environments: implications for healthcare.

    Science.gov (United States)

    Stepanovich, P L; Uhrig, J D

    1999-01-01

    Healthcare can be considered a high-velocity environment and, as such, can benefit from research conducted in other industries regarding strategic decision making. Strategic planning is not only relevant to firms in high-velocity environments, but is also important for high performance and survival. Specifically, decision-making speed seems to be instrumental in differentiating between high and low performers; fast decision makers outperform slow decision makers. This article outlines the differences between fast and slow decision makers, identifies five paralyses that can slow decision making in healthcare, and outlines the role of a planning department in circumventing these paralyses. Executives can use the proposed planning structure to improve both the speed and quality of strategic decisions. The structure uses planning facilitators to avoid the following five paralyses: 1. Analysis. Decision makers can no longer afford the luxury of lengthy, detailed analysis but must develop real-time systems that provide appropriate, timely information. 2. Alternatives. Many alternatives (beyond the traditional two or three) need to be considered and the alternatives must be evaluated simultaneously. 3. Group Think. Decision makers must avoid limited mind-sets and autocratic leadership styles by seeking out independent, knowledgeable counselors. 4. Process. Decision makers need to resolve conflicts through "consensus with qualification," as opposed to waiting for everyone to come on board. 5. Separation. Successful implementation requires a structured process that cuts across disciplines and levels.

  7. Strengthen Cloud Computing Security with Federal Identity Management Using Hierarchical Identity-Based Cryptography

    Science.gov (United States)

    Yan, Liang; Rong, Chunming; Zhao, Gansen

    More and more companies are beginning to provide different kinds of cloud computing services for Internet users; at the same time, these services also bring some security problems. Currently, the majority of cloud computing systems provide a digital identity for users to access their services, which brings some inconvenience for a hybrid cloud that includes multiple private and/or public clouds. Today most cloud computing systems use asymmetric and traditional public key cryptography to provide data security and mutual authentication. Identity-based cryptography has some attractive characteristics that seem to fit well the requirements of cloud computing. In this paper, we show that by adopting federated identity management together with hierarchical identity-based cryptography (HIBC), not only key distribution but also mutual authentication can be simplified in the cloud.

  8. Up in the cloud: reflections on teaching translation technology using a cloud-based platform

    DEFF Research Database (Denmark)

    Flanagan, Marian

    ... with teaching translation technology in the classroom (e.g. Doherty et al. 2012, Kenny and Way 2001, O'Brien and Kenny 2001, 2006), several unanswered questions still remain. Up until recently, the translation software often restricted the teaching approach. This paper reports on a new approach I took to teaching TT in the classroom. The approach was inspired by Pym (2006) and Doherty and Moorkens (2013), and it takes advantage of using cloud-based software. The aim was to evaluate the students' experience in the TT workshops. Moreover, I wanted to investigate particular aspects of teaching that were ... data via online questionnaires: pre-workshop (43 responses) and post-workshop (30 responses). The questionnaires consisted mainly of closed questions, but both provided the students with the opportunity to discuss their expectations (pre-workshop) and reflections (post-workshop). Following ...

  9. Mobile Agent based Market Basket Analysis on Cloud

    OpenAIRE

    Waghmare, Vijayata; Mukhopadhyay, Debajyoti

    2014-01-01

    This paper describes the design and development of a location-based mobile shopping application for bakery product shops; the whole application is deployed on the cloud. Its three-tier architecture consists of a front end, middleware, and a back end. The front-end level is a location-based mobile shopping application for Android mobile devices for purchasing bakery products from nearby shops; it also displays associations among the purchased products. The middleware level provides a web ser...

  10. The efficiency of ceramic-faced metal targets at high-velocity impact

    Science.gov (United States)

    Tolkachev, V. F.; Konyaev, A. A.; Pakhnutova, N. V.

    2017-11-01

    The paper presents experimental results and engineering evaluations concerning the efficiency of composite materials used as additional protection during the high-velocity interaction of a tungsten rod with a target in the velocity range of 1-5 km/s. The main parameter characterizing the high-velocity interaction of a projectile with a layered target is the penetration depth, which is determined here from experimental data, numerical simulation, and engineering evaluation with modified models. Boron carbide, aluminum oxide, and aluminum nickelide are applied as the front surface of the targets. Based on the experimental data and numerical simulation, the main characteristics of the ceramics are determined, which allows composite materials to be used effectively as additional elements of protection.

  11. CloudSat Preps for Launch at Vandenberg Air Force Base, CA

    Science.gov (United States)

    2005-01-01

    The CloudSat spacecraft sits encapsulated within its Boeing Delta launch vehicle dual payload attach fitting at Vandenberg Air Force Base, Calif. CloudSat will share its ride to orbit late next month with NASA's CALIPSO spacecraft. The two spacecraft are designed to reveal the secrets of clouds and aerosols.

  12. Cloud Study Investigators: Using NASA's CERES S'COOL in Problem-Based Learning

    Science.gov (United States)

    Moore, Susan; Popiolkowski, Gary

    2011-01-01

    This article describes how, by incorporating NASA's Students' Cloud Observations On-Line (S'COOL) project into a problem-based learning (PBL) activity, middle school students are engaged in authentic scientific research where they observe and record information about clouds and contribute ground truth data to NASA's Clouds and the Earth's…

  13. Retrieval of liquid water cloud properties from ground-based remote sensing observations

    NARCIS (Netherlands)

    Knist, C.L.

    2014-01-01

    Accurate ground-based remotely sensed microphysical and optical properties of liquid water clouds are essential references to validate satellite-observed cloud properties and to improve cloud parameterizations in weather and climate models. This requires the evaluation of algorithms for retrieval of

  14. A simple dynamic rising nuclear cloud based model of ground radioactive fallout for atmospheric nuclear explosion

    International Nuclear Information System (INIS)

    Zheng Yi

    2008-01-01

    A simple dynamic rising nuclear cloud based model for atmospheric nuclear explosion radioactive prediction was presented. The deposition of particles and initial cloud radius changing with time before the cloud stabilization was considered. Large-scale relative diffusion theory was used after cloud stabilization. The model was considered reasonable and dependable in comparison with four U.S. nuclear test cases and DELFIC model results. (authors)

  15. ROBUST AND EFFICIENT PRIVACY PRESERVING PUBLIC AUDITING FOR REGENERATING-CODE-BASED CLOUD STORAGE

    OpenAIRE

    Tessy Vincent*, Mrs.Krishnaveni.V.V

    2017-01-01

    Cloud computing is gaining popularity because of its guaranteed services, such as online data storage and backup solutions, web-based e-mail services, and virtualized infrastructure. Users are allowed to access the data stored in a cloud anytime, anywhere, using an internet-connected device at low cost. To provide security for outsourced data in cloud storage against various corruptions, adding fault tolerance to cloud storage together with data integrity checking and failure reparation becomes c...

  16. Comparison of cloud top heights derived from FY-2 meteorological satellites with heights derived from ground-based millimeter wavelength cloud radar

    Science.gov (United States)

    Wang, Zhe; Wang, Zhenhui; Cao, Xiaozhong; Tao, Fa

    2018-01-01

    Clouds are currently observed by both ground-based and satellite remote sensing techniques. Each technique has its own strengths and weaknesses depending on the observation method, instrument performance and the methods used for retrieval. It is important to study synergistic cloud measurements to improve the reliability of the observations and to verify the different techniques. The FY-2 geostationary orbiting meteorological satellites continuously observe the sky over China. Their cloud top temperature product can be processed to retrieve the cloud top height (CTH). The ground-based millimeter wavelength cloud radar can acquire information about the vertical structure of clouds, such as the cloud base height (CBH), the CTH and the cloud thickness, and can continuously monitor changes in the vertical profiles of clouds. The CTHs were retrieved using both cloud top temperature data from the FY-2 satellites and the cloud radar reflectivity data for the same time period (June 2015 to May 2016) and the resulting datasets were compared in order to evaluate the accuracy of CTH retrievals using FY-2 satellites. The results show that the concordance rate of cloud detection between the two datasets was 78.1%. Higher consistencies were obtained for thicker clouds with larger echo intensity and for more continuous clouds. The average difference in the CTH between the two techniques was 1.46 km. The difference in CTH between low- and mid-level clouds was less than that for high-level clouds. An attenuation threshold of the cloud radar for rainfall was 0.2 mm/min; a rainfall intensity below this threshold had no effect on the CTH. The satellite CTH can be used to compensate for the attenuation error in the cloud radar data.
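The two statistics the abstract reports, a concordance rate of cloud detection and an average CTH difference, are straightforward to compute from paired observations. The paired values below are invented; `None` marks a time at which an instrument detected no cloud.

```python
# Sketch of the comparison statistics in the abstract: concordance of cloud
# detection between the two instruments, and the mean CTH difference over
# the times when both detected cloud. Observations are invented.

# (satellite CTH in km or None, radar CTH in km or None) per observation time
pairs = [(8.2, 7.0), (None, None), (5.1, 4.0), (None, 2.5), (10.4, 8.9), (None, None)]

# concordant = both detected cloud, or both detected clear sky
agree = sum(1 for s, r in pairs if (s is None) == (r is None))
concordance = agree / len(pairs)

# mean absolute CTH difference over jointly cloudy times
diffs = [abs(s - r) for s, r in pairs if s is not None and r is not None]
mean_diff = sum(diffs) / len(diffs)

print(f"concordance: {concordance:.1%}, mean CTH difference: {mean_diff:.2f} km")
```

The same skeleton extends naturally to the stratifications in the paper, e.g. restricting `pairs` to low/mid/high clouds or to rainfall below the 0.2 mm/min attenuation threshold.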

  17. Treatment Protocol for High Velocity/High Energy Gunshot Injuries to the Face

    Science.gov (United States)

    Peled, Micha; Leiser, Yoav; Emodi, Omri; Krausz, Amir

    2011-01-01

    Major causes of facial combat injuries include blasts, high-velocity/high-energy missiles, and low-velocity missiles. High-velocity bullets fired from assault rifles encompass special ballistic properties, creating a transient cavitation space with a small entrance wound and a much larger exit wound. There is no dispute regarding the fact that primary emergency treatment of ballistic injuries to the face commences in accordance with the current advanced trauma life support (ATLS) recommendations; the main areas in which disputes do exist concern the question of the timing, sequence, and modes of surgical treatment. The aim of the present study is to present the treatment outcome of high-velocity/high-energy gunshot injuries to the face, using a protocol based on the experience of a single level I trauma center. A group of 23 injured combat soldiers who sustained bullet and shrapnel injuries to the maxillofacial region during a 3-week regional military conflict were evaluated in this study. Nine patients met the inclusion criteria (high-velocity/high-energy injuries) and were included in the study. According to our protocol, upon arrival patients underwent endotracheal intubation and were hemodynamically stabilized in the shock-trauma unit and underwent total-body computed tomography with 3-D reconstruction of the head and neck and computed tomography angiography. All patients underwent maxillofacial surgery upon the day of arrival according to the protocol we present. In view of our treatment outcomes, results, and low complication rates, we conclude that strict adherence to a well-founded and structured treatment protocol based on clinical experience is mandatory in providing efficient, appropriate, and successful treatment to a relatively large group of patients who sustain various degrees of maxillofacial injuries during a short period of time. PMID:23449809

  18. A new data collaboration service based on cloud computing security

    Science.gov (United States)

    Ying, Ren; Li, Hua-Wei; Wang, Li na

    2017-09-01

    With the rapid development of cloud computing, the storage and usage of data have undergone revolutionary changes. Data owners can store data in the cloud. While bringing convenience, it also brings many new challenges to cloud data security. A key issue is how to support a secure data collaboration service that supports access and updates to cloud data. This paper proposes a secure, efficient and extensible data collaboration service, which prevents data leaks in cloud storage, supports one to many encryption mechanisms, and also enables cloud data writing and fine-grained access control.

  19. Intelligent Aggregation Based on Content Routing Scheme for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jiachen Xu

    2017-10-01

    Full Text Available Cloud computing has emerged as today's most exciting computing paradigm for providing services using a shared framework, opening a new door to solving the problems posed by the explosive growth of digital resource demands and their corresponding convenience. With the exponential growth in the number of data types and in data size in so-called big data work, the backbone network is under great pressure because its transmission capacity grows more slowly than the data size, which would seriously hinder the development of the network without an effective approach to solve this problem. In this paper, an Intelligent Aggregation based on Content Routing (IACR) scheme for cloud computing, which can effectively reduce the amount of data in the network and play a basic supporting role in the development of cloud computing, is first put forward. The main innovations in this paper are: (1) a framework for intelligent aggregation based on content routing is proposed, which can support aggregation-based content routing; (2) the proposed IACR scheme can effectively route data with a high aggregation ratio to the data center through the same routing path, so as to effectively reduce the amount of data that the network transmits. The theoretical analyses and experimental results show that, compared with the previous original routing scheme, the IACR scheme can balance the load of the whole network, reduce the amount of data transmitted in the network by 41.8%, and reduce the transmission time by 31.6% in the same network with a more balanced network load.
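The core intuition of aggregation-aware routing can be illustrated numerically. This is not the IACR protocol itself; the packet count and aggregation ratio below are invented to show why steering aggregatable traffic onto a shared path reduces backbone load.

```python
# Illustrative sketch of the idea behind IACR (not the paper's protocol):
# readings that aggregate well are routed along the same path so an
# intermediate node can merge them before they reach the data center.
# If they take disjoint paths, every reading is forwarded verbatim.

def aggregate(values, ratio):
    """Merge a batch of readings; `ratio` is the fraction of packets kept."""
    return max(1, round(len(values) * ratio))

readings = list(range(100))              # 100 packets of correlated data
without_aggregation = len(readings)      # disjoint paths: no merge point
with_aggregation = aggregate(readings, ratio=0.4)   # assumed aggregation ratio

reduction = 1 - with_aggregation / without_aggregation
print(f"packets: {without_aggregation} -> {with_aggregation} ({reduction:.0%} less)")
```

The content-routing part of the scheme is what makes the merge point exist at all: the router classifies traffic by content so that high-aggregation-ratio flows share a path.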

  20. Biometric technology authentication, biocryptography, and cloud-based architecture

    CERN Document Server

    Das, Ravi

    2014-01-01

    Most biometric books are either extraordinarily technical for technophiles or extremely elementary for the lay person. Striking a balance between the two, Biometric Technology: Authentication, Biocryptography, and Cloud-Based Architecture is ideal for business, IT, or security managers that are faced with the task of making purchasing, migration, or adoption decisions. It brings biometrics down to an understandable level, so that you can immediately begin to implement the concepts discussed.Exploring the technological and social implications of widespread biometric use, the book considers the

  1. High-velocity winds from a dwarf nova during outburst

    Science.gov (United States)

    Cordova, F. A.; Mason, K. O.

    1982-01-01

    An ultraviolet spectrum of the dwarf nova TW Vir during an optical outburst shows shortward-shifted absorption features with edge velocities as high as 4800 km/s, about the escape velocity of a white dwarf. A comparison of this spectrum with the UV spectra of other cataclysmic variables suggests that mass loss is evident only for systems with relatively high luminosities (more than about 10 solar luminosities) and low inclination angles with respect to the observer's line of sight. The mass loss rate for cataclysmic variables is of order 10^-11 solar masses per year; this is from 0.01 to 0.001 of the mass accretion rate onto the compact star in the binary. The mass loss may occur by a mechanism similar to that invoked for early-type stars, i.e., radiation absorbed in the lines accelerates the accreting gas to the high velocities observed.
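The claim that 4800 km/s is "about the escape velocity of a white dwarf" is easy to check from v_esc = sqrt(2GM/R). The mass and radius below are representative textbook values for a white dwarf, not TW Vir's measured parameters.

```python
# Order-of-magnitude check: escape velocity from a typical white dwarf,
# v_esc = sqrt(2 G M / R). Mass and radius are representative assumptions.

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
M = 0.6 * M_SUN      # typical white dwarf mass
R = 8.8e6            # typical white dwarf radius, m (~8800 km)

v_esc = (2 * G * M / R) ** 0.5
print(f"v_esc ~ {v_esc / 1e3:.0f} km/s")
```

The result comes out in the low-thousands of km/s, the same regime as the observed 4800 km/s edge velocity, which is what motivates identifying the absorbing wind with material escaping the white dwarf.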

  2. RESPONSE OF STRUCTURES TO HIGH VELOCITY IMPACTS: A GENERALIZED ALGORITHM

    Directory of Open Access Journals (Sweden)

    Aversh'ev Anatoliy Sergeevich

    2012-10-01

    Full Text Available In this paper, a high-velocity impact between a spherical striker and a target is considered; different stages of loading and unloading, target deformations, and the propagation of non-stationary wave surfaces within the target are analyzed. The problem of modeling the strike and the subsequent deformations is solved using not only the equations of the mechanics of deformable rigid bodies but also the equations of fluid mechanics. The target material is simulated by means of an ideal "plastic gas". Modeling results and theoretical calculations are compared with the experimental results. The crater depth, its correlation with the striker diameter, and the values of the pressure and deformations of the target underneath the contact area are determined as the main characteristics of the dynamic interaction.

  3. Influences of the Air in Metal Powder High Velocity Compaction

    Directory of Open Access Journals (Sweden)

    Liu Jun

    2017-01-01

    Full Text Available During high-velocity impact compaction of metal powder, the air is compressed sharply and a portion of it remains in the compacts. In order to study its influence, a discrete-density volleyball-accumulation model for aluminium powder was established with the use of ABAQUS. The study found that the air in the powder pores obstructs the pressing process, because the remaining air reduces the strength and density of the compacts at current high-speed pressing velocities (V ≤ 100 m/s). When the speed increases further (V ≥ 100 m/s), the temperature of the air rises sharply, even far above the melting point of the material: when aluminium powder was compressed at a speed of 200 m/s, the air temperature could reach 2033 K, far higher than the melting point of 877 K. The increased density of the powders was a result of local softening and even melt adhesion as the high-temperature, high-pressure air between the particles flowed past.

  4. Development of a high velocity rain erosion test method

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Dong Teak; Jin, Doo Han [Korea University of Technology and Education, Cheonan (Korea, Republic of); Kang, Hyung [Agency for Defense Development, Daejeon (Korea, Republic of)

    2009-07-01

    The nose of a missile flying through a raining region at supersonic speed is subject to rain erosion because the nose is made of a brittle ceramic material. A simple yet very effective rain erosion test method was developed. A sabot assembly, similar to a hypodermic syringe, carries a specific amount of water and is launched by a low-pressure air gun. After the stopper stops the sabot assembly by impact, the steel plunger continues moving forward to squeeze the silicone rubber in front of it. The pressurized silicone rubber is then forced through the orifice at the front of the sabot at high velocity, thus accelerating the water droplet to an even higher velocity. Droplet velocities of up to 800 m/s were successfully attained using a low-pressure air gun. The ceramic specimen assembly is placed in front of the high-speed water droplet and the rain erosion damage on the surface of the specimen is observed.

  5. Observations of high-velocity molecular gas near Herbig-Haro objects: HH 24--27 and HH 1--2

    International Nuclear Information System (INIS)

    Snell, R.L.; Edwards, S.

    1982-01-01

    High-velocity CO has been detected in the vicinity of the Herbig-Haro objects HH 24--27. These observations indicate that there are two sources of high-velocity outflow; one centered on an infrared source near HH 26, and the second centered roughly 2' south of HH 24. The redshifted and blueshifted wings in both sources are spatially separated suggesting that the high-velocity gas is due to energetic bipolar outflow from young stars embedded in the molecular cloud. The association of Herbig-Haro objects with regions of high-velocity gas suggests a common origin for both in the interaction of a stellar wind with the ambient molecular cloud. The mass loss rates implied by our observations, assuming that the rate of mass loss has been constant throughout the dynamical lifetime of the bipolar lobes, are roughly 10^-6 M_sun yr^-1 for both sources. We have also searched for high-velocity gas near HH 1--2 but found no evidence for mass outflow in this region.

  6. NAPR: a Cloud-Based Framework for Neuroanatomical Age Prediction.

    Science.gov (United States)

    Pardoe, Heath R; Kuzniecky, Ruben

    2018-01-01

    The availability of cloud computing services has enabled the widespread adoption of the "software as a service" (SaaS) approach for software distribution, which utilizes network-based access to applications running on centralized servers. In this paper we apply the SaaS approach to neuroimaging-based age prediction. Our system, named "NAPR" (Neuroanatomical Age Prediction using R), provides access to predictive modeling software running on a persistent cloud-based Amazon Web Services (AWS) compute instance. The NAPR framework allows external users to estimate the age of individual subjects using cortical thickness maps derived from their own locally processed T1-weighted whole brain MRI scans. As a demonstration of the NAPR approach, we have developed two age prediction models that were trained using healthy control data from the ABIDE, CoRR, DLBS and NKI Rockland neuroimaging datasets (total N = 2367, age range 6-89 years). The provided age prediction models were trained using (i) relevance vector machines and (ii) Gaussian processes machine learning methods applied to cortical thickness surfaces obtained using Freesurfer v5.3. We believe that this transparent approach to out-of-sample evaluation and comparison of neuroimaging age prediction models will facilitate the development of improved age prediction models and allow for robust evaluation of the clinical utility of these methods.
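The predictive-modeling step can be illustrated with a deliberately simplified stand-in. NAPR's actual models are relevance vector machines and Gaussian processes over full cortical thickness maps; the sketch below fits a one-feature least-squares line of age against mean cortical thickness (which is known to decline with age), on invented training data.

```python
# Conceptual stand-in for NAPR's age prediction (NOT the service's RVM/GP
# models): ordinary least squares on a single invented feature, the mean
# cortical thickness in mm. Training pairs are synthetic.

train = [  # (mean cortical thickness in mm, age in years) - invented
    (2.80, 10), (2.72, 20), (2.65, 30), (2.58, 40),
    (2.50, 50), (2.43, 60), (2.35, 70), (2.28, 80),
]

n = len(train)
mean_t = sum(t for t, _ in train) / n
mean_a = sum(a for _, a in train) / n
slope = (sum((t - mean_t) * (a - mean_a) for t, a in train)
         / sum((t - mean_t) ** 2 for t, _ in train))
intercept = mean_a - slope * mean_t

def predict_age(thickness_mm):
    """Predicted age for a new subject's mean cortical thickness."""
    return slope * thickness_mm + intercept

print(round(predict_age(2.54), 1))
```

In the real service this prediction runs server-side on an AWS instance, so the user only uploads locally computed Freesurfer cortical thickness surfaces, which is the "software as a service" point of the paper.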

  7. Task Classification Based Energy-Aware Consolidation in Clouds

    Directory of Open Access Journals (Sweden)

    HeeSeok Choi

    2016-01-01

    Full Text Available We consider a cloud data center, in which the service provider supplies virtual machines (VMs) on hosts or physical machines (PMs) to its subscribers for computation in an on-demand fashion. For the cloud data center, we propose a task consolidation algorithm based on task classification (i.e., computation-intensive and data-intensive) and resource utilization (e.g., CPU and RAM). Furthermore, we design a VM consolidation algorithm to balance task execution time and energy consumption without violating a predefined service level agreement (SLA). Unlike the existing research on VM consolidation or scheduling that applies no-threshold or single-threshold schemes, we focus on a double-threshold (upper and lower) scheme for VM consolidation. More specifically, when a host operates with resource utilization below the lower threshold, all the VMs on the host are scheduled to be migrated to other hosts and the host is then powered down, whereas when a host operates with resource utilization above the upper threshold, a VM is migrated to avoid reaching 100% resource utilization. Based on experimental performance evaluations with real-world traces, we show that our task classification based energy-aware consolidation algorithm (TCEA) achieves a significant energy reduction without incurring predefined SLA violations.
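    The double-threshold rule described above can be sketched in a few lines. This is illustrative only: the paper's TCEA also weighs task class and SLA constraints, which are omitted here, and the host/VM names and the choice of migrating the largest VM from an overloaded host are assumptions.

```python
def plan_consolidation(hosts, lower=0.2, upper=0.8):
    """hosts: name -> {'util': host utilization in [0, 1],
                       'vms': [(vm_name, vm_util), ...]}.
    Returns a list of (action, target) tuples under the double-threshold scheme."""
    actions = []
    for name, host in sorted(hosts.items()):
        if host['util'] < lower:
            # Underloaded: evacuate every VM, then power the host down.
            for vm, _ in host['vms']:
                actions.append(('migrate', vm))
            actions.append(('power_off', name))
        elif host['util'] > upper:
            # Overloaded: move one VM (here, the largest) to avoid hitting 100%.
            victim = max(host['vms'], key=lambda p: p[1])[0]
            actions.append(('migrate', victim))
    return actions
```

    Hosts between the two thresholds are left alone, which is what keeps the scheme from oscillating between consolidation and load-relief migrations.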

  8. Cloud based, Open Source Software Application for Mitigating Herbicide Drift

    Science.gov (United States)

    Saraswat, D.; Scott, B.

    2014-12-01

    The spread of herbicide-resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag-based program also helps avoid herbicide misapplication and prevents herbicide drift damage between fields with differing crop technologies. The program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags risk being misplaced through mischief or lost to severe windstorms and thunderstorms. This presentation will discuss the design and development of a free, cloud-based application built on open-source technologies, called Flag the Technology Cloud (FTTCloud), that allows agricultural stakeholders to color-code their farm fields to indicate herbicide-resistant technologies. The developed software utilizes modern web development practices, widely used design technologies, and basic geographic information system (GIS) based interactive interfaces for representing, color-coding, searching, and visualizing fields. The program has also been made compatible with devices of different sizes: smartphones, tablets, desktops and laptops.

  9. Sensitive Data Protection Based on Intrusion Tolerance in Cloud Computing

    OpenAIRE

    Jingyu Wang; xuefeng Zheng; Dengliang Luo

    2011-01-01

    Service integration and on-demand provisioning in cloud computing can significantly improve the utilization of computing resources, reduce the power consumption per service, and help avoid computing-resource errors. However, cloud computing still faces the problem of intrusion tolerance for both the cloud computing platform and the sensitive data of the new enterprise data center. In order to address the problem of intrusion tolerance of cloud computing platform and sensitive data in...

  10. Telemedicine Based on Mobile Devices and Mobile Cloud Computing

    OpenAIRE

    Lidong Wang; Cheryl Ann Alexander

    2014-01-01

    Mobile devices such as smartphones and tablets support various kinds of mobile computing and services. They can access the cloud or offload their computation-intensive parts to cloud computing resources. Mobile cloud computing (MCC) integrates cloud computing into the mobile environment, which extends mobile devices’ battery lifetime, improves their data storage capacity and processing power, and improves their reliability and information security. In this paper, the applications of smartphon...

  11. Mesoscale meteorological model based on radioactive explosion cloud simulation

    International Nuclear Information System (INIS)

    Zheng Yi; Zhang Yan; Ying Chuntong

    2008-01-01

    In order to simulate the movement and concentration distribution of radioactive clouds from a nuclear explosion or a dirty bomb, the mesoscale meteorological model RAMS was used. Particle size, size-activity distribution, and gravitational fallout within the cloud were considered. The results show that the model can simulate the 'mushroom' cloud of an explosion. Three-dimensional flow fields and radioactive concentration fields were obtained. (authors)

  12. Depolarization Lidar Determination of Cloud-Base Microphysical Properties

    NARCIS (Netherlands)

    Donovan, D.P.; Klein Baltink, H; Henzing, J. S.; de Roode, S.R.; Siebesma, A.P.

    2016-01-01

    The links between multiple-scattering induced depolarization and cloud microphysical properties (e.g. cloud particle number density, effective radius, water content) have long been recognised. Previous efforts to use depolarization information in a quantitative manner to retrieve cloud

  13. MIN-CUT BASED SEGMENTATION OF AIRBORNE LIDAR POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    S. Ural

    2012-07-01

    Full Text Available Introducing an organization to the unstructured point cloud before extracting information from airborne lidar data is common in many applications. Aggregating points with similar features into 3-D segments that comply with the nature of actual objects is affected by the neighborhood, scale, features and noise, among other aspects. In this study, we present a min-cut based method for segmenting the point cloud. We first assess the neighborhood of each point in 3-D by investigating the local geometric and statistical properties of the candidates. Neighborhood selection is essential since point features are calculated within the local neighborhood. Following neighborhood determination, we calculate point features and determine the clusters in the feature space. We adapt a graph representation from image processing, especially used in pixel labeling problems, and establish it for unstructured 3-D point clouds. The edges of the graph connecting the points with each other and with the nodes representing feature clusters hold the smoothness costs in the spatial domain and the data costs in the feature domain. Smoothness costs ensure spatial coherence, while data costs control the consistency with the representative feature clusters. This graph representation formalizes the segmentation task as an energy minimization problem. It allows an approximate solution by min-cuts for a global minimum of this NP-hard minimization problem in low-order polynomial time. We test our method with an airborne lidar point cloud acquired with a maximum planned post spacing of 1.4 m and a vertical accuracy of 10.5 cm RMSE. We present the effects of neighborhood and feature determination on the segmentation results and assess the accuracy and efficiency of the implemented min-cut algorithm, as well as its sensitivity to the parameters of the smoothness and data cost functions. We find that a smoothness cost that only considers simple distance
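    The energy-minimization step can be illustrated with a toy binary-labeling version of the graph construction: terminal edges carry the data costs, neighbor edges carry the smoothness cost, and a standard max-flow computation yields the min-cut. The Edmonds-Karp solver and the tiny four-point example below are illustrative stand-ins for the paper's actual min-cut implementation and lidar features.

```python
from collections import deque

def max_flow(cap, s, t):
    """Edmonds-Karp max-flow on a dense capacity matrix.
    Returns (cut value, source-side membership list)."""
    n = len(cap)
    flow = [[0.0] * n for _ in range(n)]
    value = 0.0
    while True:
        # BFS for a shortest augmenting path in the residual graph.
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] - flow[u][v] > 1e-12:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            break
        # Find the bottleneck along the path, then push flow.
        b, v = float('inf'), t
        while v != s:
            u = parent[v]
            b = min(b, cap[u][v] - flow[u][v])
            v = u
        v = t
        while v != s:
            u = parent[v]
            flow[u][v] += b
            flow[v][u] -= b
            v = u
        value += b
    # Nodes still reachable from s in the residual graph form the source side.
    side = [False] * n
    side[s] = True
    q = deque([s])
    while q:
        u = q.popleft()
        for v in range(n):
            if not side[v] and cap[u][v] - flow[u][v] > 1e-12:
                side[v] = True
                q.append(v)
    return value, side

def segment(cost_if_a, cost_if_b, edges, smoothness):
    """Binary min-cut segmentation: a point is labeled 'A' if it stays
    on the source side of the cut, 'B' otherwise."""
    n = len(cost_if_a)
    s, t = n, n + 1
    cap = [[0.0] * (n + 2) for _ in range(n + 2)]
    for i in range(n):
        cap[s][i] = cost_if_b[i]   # this edge is cut when i is labeled B
        cap[i][t] = cost_if_a[i]   # this edge is cut when i is labeled A
    for i, j in edges:
        cap[i][j] = cap[j][i] = smoothness
    value, side = max_flow(cap, s, t)
    return ['A' if side[i] else 'B' for i in range(n)], value
```

    On a four-point chain whose first two points strongly prefer label A and whose last two prefer B, the optimal cut severs only the middle smoothness edge.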

  14. Identity based Encryption and Biometric Authentication Scheme for Secure Data Access in Cloud Computing

    DEFF Research Database (Denmark)

    Cheng, Hongbing; Rong, Chunming; Tan, Zheng-Hua

    2012-01-01

    Cloud computing will be a main information infrastructure in the future; it consists of many large data centers which are usually geographically distributed and heterogeneous. How to design secure data access for a cloud computing platform is a big challenge. In this paper, we propose a secure data access scheme based on identity-based encryption and biometric authentication for cloud computing. Firstly, we describe the security concerns of cloud computing and then propose an integrated data access scheme for cloud computing; the procedure of the proposed scheme includes parameter setup, key distribution, feature template creation, cloud data processing and secure data access control. Finally, we compare the proposed scheme with other schemes through comprehensive analysis and simulation. The results show that the proposed data access scheme is feasible and secure for cloud computing.

  15. A New Method of Cloud Detection Based on Cascaded AdaBoost

    International Nuclear Information System (INIS)

    Ma, C; Chen, F; Liu, J; Duan, J

    2014-01-01

    Cloud detection is a critical step in the processing of remote sensing images. How to quickly, accurately and effectively detect clouds in remote sensing images is still a challenging issue in this area. In order to avoid the disadvantages of current algorithms, the cascaded AdaBoost classifier algorithm is applied to cloud detection. A new algorithm combining a cascaded AdaBoost classifier with multiple features is proposed in this paper. First, multiple features based on color, texture and spectral characteristics are extracted from the remote sensing image. Second, the automatic cloud detection model is obtained based on the cascaded AdaBoost algorithm. The results show that the new algorithm can determine the cloud detection model and threshold values adaptively for remote sensing training data of different resolutions. The accuracy of cloud detection is improved, making this an effective new algorithm for cloud detection in remote sensing images.
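    A minimal sketch of the cascade idea, assuming nothing about the paper's actual feature set: each stage is a small AdaBoost ensemble of decision stumps, and a pixel or window is declared "cloud" only if every stage accepts it, so cheap early stages can reject most non-cloud samples quickly. The features, thresholds and training data below are invented for illustration.

```python
import math

def train_stage(X, y, rounds=2):
    """Tiny AdaBoost: X is a list of feature vectors, y holds labels in {+1, -1}.
    Returns a list of weighted decision stumps (feature, threshold, polarity, alpha)."""
    n = len(X)
    w = [1.0 / n] * n
    stumps = []
    for _ in range(rounds):
        best = None
        for f in range(len(X[0])):
            for thr in sorted({x[f] for x in X}):
                for pol in (1, -1):
                    preds = [pol if x[f] >= thr else -pol for x in X]
                    err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
                    if best is None or err < best[0]:
                        best = (err, f, thr, pol, preds)
        err, f, thr, pol, preds = best
        err = max(err, 1e-10)                      # avoid log(0) on a perfect stump
        alpha = 0.5 * math.log((1.0 - err) / err)  # stump weight
        # Reweight: boost the samples this stump got wrong.
        w = [wi * math.exp(-alpha * yi * p) for wi, yi, p in zip(w, y, preds)]
        total = sum(w)
        w = [wi / total for wi in w]
        stumps.append((f, thr, pol, alpha))
    return stumps

def stage_says_cloud(stumps, x):
    score = sum(a * (pol if x[f] >= thr else -pol) for f, thr, pol, a in stumps)
    return score >= 0

def cascade_detect(stages, x):
    """Cloud only if every boosted stage accepts; reject at the first failure."""
    return all(stage_says_cloud(stumps, x) for stumps in stages)
```

    In a real cascade each later stage is trained only on the samples the earlier stages accepted, which is what makes the detector both fast and adaptive.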

  16. A cloud computing based 12-lead ECG telemedicine service

    Science.gov (United States)

    2012-01-01

    Background Due to the great variability of 12-lead ECG instruments and medical specialists’ interpretation skills, it remains a challenge to deliver rapid and accurate 12-lead ECG reports with senior cardiologists’ decision making support in emergency telecardiology. Methods We create a new cloud and pervasive computing based 12-lead Electrocardiography (ECG) service to realize ubiquitous 12-lead ECG tele-diagnosis. Results This developed service enables ECG to be transmitted and interpreted via mobile phones. That is, tele-consultation can take place while the patient is on the ambulance, between the onsite clinicians and the off-site senior cardiologists, or among hospitals. Most importantly, this developed service is convenient, efficient, and inexpensive. Conclusions This cloud computing based ECG tele-consultation service expands the traditional 12-lead ECG applications onto the collaboration of clinicians at different locations or among hospitals. In short, this service can greatly improve medical service quality and efficiency, especially for patients in rural areas. This service has been evaluated and proved to be useful by cardiologists in Taiwan. PMID:22838382

  17. A cloud computing based 12-lead ECG telemedicine service.

    Science.gov (United States)

    Hsieh, Jui-Chien; Hsu, Meng-Wei

    2012-07-28

    Due to the great variability of 12-lead ECG instruments and medical specialists' interpretation skills, it remains a challenge to deliver rapid and accurate 12-lead ECG reports with senior cardiologists' decision making support in emergency telecardiology. We create a new cloud and pervasive computing based 12-lead Electrocardiography (ECG) service to realize ubiquitous 12-lead ECG tele-diagnosis. This developed service enables ECG to be transmitted and interpreted via mobile phones. That is, tele-consultation can take place while the patient is on the ambulance, between the onsite clinicians and the off-site senior cardiologists, or among hospitals. Most importantly, this developed service is convenient, efficient, and inexpensive. This cloud computing based ECG tele-consultation service expands the traditional 12-lead ECG applications onto the collaboration of clinicians at different locations or among hospitals. In short, this service can greatly improve medical service quality and efficiency, especially for patients in rural areas. This service has been evaluated and proved to be useful by cardiologists in Taiwan.

  18. A cloud computing based 12-lead ECG telemedicine service

    Directory of Open Access Journals (Sweden)

    Hsieh Jui-chien

    2012-07-01

    Full Text Available Abstract Background Due to the great variability of 12-lead ECG instruments and medical specialists’ interpretation skills, it remains a challenge to deliver rapid and accurate 12-lead ECG reports with senior cardiologists’ decision making support in emergency telecardiology. Methods We create a new cloud and pervasive computing based 12-lead Electrocardiography (ECG) service to realize ubiquitous 12-lead ECG tele-diagnosis. Results This developed service enables ECG to be transmitted and interpreted via mobile phones. That is, tele-consultation can take place while the patient is on the ambulance, between the onsite clinicians and the off-site senior cardiologists, or among hospitals. Most importantly, this developed service is convenient, efficient, and inexpensive. Conclusions This cloud computing based ECG tele-consultation service expands the traditional 12-lead ECG applications onto the collaboration of clinicians at different locations or among hospitals. In short, this service can greatly improve medical service quality and efficiency, especially for patients in rural areas. This service has been evaluated and proved to be useful by cardiologists in Taiwan.

  19. An expert fitness diagnosis system based on elastic cloud computing.

    Science.gov (United States)

    Tseng, Kevin C; Wu, Chia-Chuan

    2014-01-01

    This paper presents an expert diagnosis system based on cloud computing. It classifies a user's fitness level based on supervised machine learning techniques. This system is able to learn and make customized diagnoses according to the user's physiological data, such as age, gender, and body mass index (BMI). In addition, an elastic algorithm based on the Poisson distribution is presented to allocate computation resources dynamically. It predicts the required resources in the future according to the exponential moving average of past observations. The experimental results show that Naïve Bayes is the best classifier with the highest accuracy (90.8%) and that the elastic algorithm is able to closely capture the trend of requests generated from the Internet and thus assign corresponding computation resources to ensure the quality of service.
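    The elastic-allocation idea can be sketched as follows. Under a Poisson assumption the variance of the arrival count equals its mean, so a safety margin of a few standard deviations above the EMA-predicted rate covers most bursts. The smoothing factor, per-instance capacity, and z-value below are illustrative assumptions, not the paper's actual parameters.

```python
import math

def ema_forecast(observed_rates, alpha=0.3):
    """Exponential moving average of past request rates; the final
    smoothed value serves as the forecast for the next interval."""
    s = observed_rates[0]
    for r in observed_rates[1:]:
        s = alpha * r + (1 - alpha) * s
    return s

def instances_needed(expected_rate, capacity_per_instance, z=1.64):
    """Poisson peak demand ~ mean + z * sqrt(mean) requests per interval,
    rounded up to whole instances (z = 1.64 covers roughly 95% of load)."""
    peak = expected_rate + z * math.sqrt(expected_rate)
    return math.ceil(peak / capacity_per_instance)
```

    Scaling on the forecast plus a variance-based margin, rather than on the raw last observation, is what lets the allocator track the request trend without thrashing.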

  20. An Expert Fitness Diagnosis System Based on Elastic Cloud Computing

    Directory of Open Access Journals (Sweden)

    Kevin C. Tseng

    2014-01-01

    Full Text Available This paper presents an expert diagnosis system based on cloud computing. It classifies a user’s fitness level based on supervised machine learning techniques. This system is able to learn and make customized diagnoses according to the user’s physiological data, such as age, gender, and body mass index (BMI). In addition, an elastic algorithm based on the Poisson distribution is presented to allocate computation resources dynamically. It predicts the required resources in the future according to the exponential moving average of past observations. The experimental results show that Naïve Bayes is the best classifier with the highest accuracy (90.8%) and that the elastic algorithm is able to closely capture the trend of requests generated from the Internet and thus assign corresponding computation resources to ensure the quality of service.

  1. Simultaneous and synergistic profiling of cloud and drizzle properties using ground-based observations

    Science.gov (United States)

    Rusli, Stephanie P.; Donovan, David P.; Russchenberg, Herman W. J.

    2017-12-01

    Despite the importance of radar reflectivity (Z) measurements in the retrieval of liquid water cloud properties, it remains nontrivial to interpret Z due to the possible presence of drizzle droplets within the clouds. So far, there has been no published work that utilizes Z to identify the presence of drizzle above the cloud base in an optimized and a physically consistent manner. In this work, we develop a retrieval technique that exploits the synergy of different remote sensing systems to carry out this task and to subsequently profile the microphysical properties of the cloud and drizzle in a unified framework. This is accomplished by using ground-based measurements of Z, lidar attenuated backscatter below as well as above the cloud base, and microwave brightness temperatures. Fast physical forward models coupled to cloud and drizzle structure parameterization are used in an optimal-estimation-type framework in order to retrieve the best estimate for the cloud and drizzle property profiles. The cloud retrieval is first evaluated using synthetic signals generated from large-eddy simulation (LES) output to verify the forward models used in the retrieval procedure and the vertical parameterization of the liquid water content (LWC). From this exercise it is found that, on average, the cloud properties can be retrieved within 5 % of the mean truth. The full cloud-drizzle retrieval method is then applied to a selected ACCEPT (Analysis of the Composition of Clouds with Extended Polarization Techniques) campaign dataset collected in Cabauw, the Netherlands. An assessment of the retrieval products is performed using three independent methods from the literature; each was specifically developed to retrieve only the cloud properties, the drizzle properties below the cloud base, or the drizzle fraction within the cloud. One-to-one comparisons, taking into account the uncertainties or limitations of each retrieval, show that our results are consistent with what is derived
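    In the scalar case the optimal-estimation machinery reduces to a few lines. The sketch below retrieves a single parameter (say, an LWC scaling factor) from one synthetic observation by iterating Gauss-Newton steps on the usual MAP cost; the forward model and error variances are invented for illustration and stand in for the paper's coupled radar/lidar/radiometer forward models.

```python
def optimal_estimation(F, dF, y, var_obs, x_prior, var_prior, iters=20):
    """Scalar optimal estimation: minimize
    (y - F(x))^2 / var_obs + (x - x_prior)^2 / var_prior
    by Gauss-Newton iteration from the prior."""
    x = x_prior
    for _ in range(iters):
        k = dF(x)  # Jacobian of the forward model at the current state
        num = k * (y - F(x)) / var_obs + (x_prior - x) / var_prior
        den = k * k / var_obs + 1.0 / var_prior
        x += num / den
    return x

# Toy linear forward model: the observable is twice the state.
F = lambda x: 2.0 * x
dF = lambda x: 2.0
retrieved = optimal_estimation(F, dF, y=10.0, var_obs=1e-6,
                               x_prior=0.0, var_prior=100.0)
```

    With a tight observation error and a loose prior, the retrieval reproduces y/2 = 5 for this linear model; tightening the prior instead would pull the estimate back toward x_prior.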

  2. A cloud-based information repository for bridge monitoring applications

    Science.gov (United States)

    Jeong, Seongwoon; Zhang, Yilan; Hou, Rui; Lynch, Jerome P.; Sohn, Hoon; Law, Kincho H.

    2016-04-01

    This paper describes an information repository to support bridge monitoring applications on a cloud computing platform. Bridge monitoring, with instrumentation of sensors in particular, collects a significant amount of data. In addition to sensor data, a wide variety of information such as bridge geometry, the analysis model and sensor descriptions needs to be stored. Data management plays an important role in facilitating data utilization and data sharing. While bridge information modeling (BrIM) technologies and standards have been proposed and provide a means to enable integration and facilitate interoperability, current BrIM standards mostly support information about bridge geometry. In this study, we extend the BrIM schema to include analysis models and sensor information. Specifically, using the OpenBrIM standards as the base, we draw on CSI Bridge, a commercial software package widely used for bridge analysis and design, and SensorML, a standard schema for sensor definition, to define the data entities necessary for bridge monitoring applications. NoSQL database systems are employed for the data repository. A cloud service infrastructure is deployed to enhance the scalability, flexibility and accessibility of the data management system. The data model and systems are tested using the bridge model and the sensor data collected at the Telegraph Road Bridge, Monroe, Michigan.
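    The flavor of such an extended data model can be conveyed with a single hypothetical document as it might be stored in a document-oriented NoSQL system. The field names and values below are invented for illustration and do not reproduce the actual OpenBrIM, CSI Bridge, or SensorML schemas.

```python
import json

# One monitoring record tying geometry, analysis model and sensor metadata together.
bridge_doc = {
    "bridge_id": "TRB-01",                         # hypothetical identifier
    "geometry": {"spans_m": [32.0, 48.0, 32.0], "deck_width_m": 13.4},
    "analysis_model": {"tool": "CSI Bridge", "model_ref": "models/trb01_v3"},
    "sensors": [
        {
            "sensor_id": "ACC-07",
            "type": "accelerometer",
            "location": {"span": 2, "offset_m": 24.0},
            "sample_rate_hz": 200,
        }
    ],
}

serialized = json.dumps(bridge_doc)   # what would be sent to the document store
restored = json.loads(serialized)
```

    Keeping geometry, model references and sensor descriptions in one schemaless document is precisely what makes a NoSQL store a natural fit here: new sensor types can be added without migrating a relational schema.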

  3. Cloud-based processing of multi-spectral imaging data

    Science.gov (United States)

    Bernat, Amir S.; Bolton, Frank J.; Weiser, Reuven; Levitz, David

    2017-03-01

    Multispectral imaging holds great promise as a non-contact tool for the assessment of tissue composition. Performing multi-spectral imaging on a handheld mobile device would make it possible to bring this technology, and with it knowledge, to low-resource settings to provide state-of-the-art classification of tissue health. This modality, however, produces considerably larger data sets than white-light imaging and requires preliminary image analysis before it can be used. The data then need to be analyzed and logged without demanding too much of the system resources or long computation times and battery use on the end-point device. Cloud environments were designed to address these problems by allowing end-point devices (smartphones) to offload computationally hard tasks. To this end, we present a method in which a handheld device based around a smartphone captures a multi-spectral dataset in a movie file format (mp4), and we compare it to other image formats in terms of size, noise and correctness. We present the cloud configuration used for segmenting images into frames so that they can later be used for further analysis.

  4. a Modeling Method of Fluttering Leaves Based on Point Cloud

    Science.gov (United States)

    Tang, J.; Wang, Y.; Zhao, Y.; Hao, W.; Ning, X.; Lv, K.; Shi, Z.; Zhao, M.

    2017-09-01

    Leaves falling gently or fluttering are a common phenomenon in nature scenes. The authenticity of falling leaves plays an important part in the dynamic modeling of natural scenes, and the falling-leaf model has a wide range of applications in the fields of animation and virtual reality. We propose a novel modeling method for fluttering leaves based on point clouds in this paper. According to the shape and weight of the leaves and the wind speed, three basic trajectories of falling leaves are defined: rotation falling, roll falling and screw-roll falling. At the same time, a parallel algorithm based on OpenMP is implemented to satisfy real-time needs in practical applications. Experimental results demonstrate that the proposed method is amenable to the incorporation of a variety of desirable effects.
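    A hedged sketch of what one such parameterized trajectory might look like: the "rotation falling" case modeled as a steady descent with sinusoidal side-to-side sway and a spinning phase angle. The parameterization and constants are illustrative assumptions; the paper derives its trajectories from leaf shape, weight and wind speed.

```python
import math

def rotation_falling(t, drop_speed=1.2, sway_amp=0.5, sway_freq=2.0, spin_rate=3.0):
    """Position (x, z) and rotation angle of a leaf at time t (illustrative model)."""
    x = sway_amp * math.sin(sway_freq * t)   # side-to-side sway
    z = -drop_speed * t                      # steady descent
    theta = spin_rate * t                    # spin about the leaf's own axis
    return x, z, theta

# Sampling many leaves per frame is embarrassingly parallel, which is where
# an OpenMP-style parallel loop (one leaf per thread) would fit.
frames = [rotation_falling(i / 30.0) for i in range(300)]
```

    Each of the three trajectory types would swap in a different closed-form path while sharing the same per-frame sampling loop.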

  5. A MODELING METHOD OF FLUTTERING LEAVES BASED ON POINT CLOUD

    Directory of Open Access Journals (Sweden)

    J. Tang

    2017-09-01

    Full Text Available Leaves falling gently or fluttering are a common phenomenon in nature scenes. The authenticity of falling leaves plays an important part in the dynamic modeling of natural scenes, and the falling-leaf model has a wide range of applications in the fields of animation and virtual reality. We propose a novel modeling method for fluttering leaves based on point clouds in this paper. According to the shape and weight of the leaves and the wind speed, three basic trajectories of falling leaves are defined: rotation falling, roll falling and screw-roll falling. At the same time, a parallel algorithm based on OpenMP is implemented to satisfy real-time needs in practical applications. Experimental results demonstrate that the proposed method is amenable to the incorporation of a variety of desirable effects.

  6. Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing

    Science.gov (United States)

    Balasubramaniam, S.; Kavitha, V.

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud. PMID:25767826
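    The core idea of geometric data perturbation, sketched here for 2-D numeric record vectors: a rotation plus a translation (optionally with additive noise) hides the raw values while preserving the pairwise distances that distance-based mining depends on. The angle, shift and records are illustrative; the scheme in the paper operates on multidimensional health records.

```python
import math
import random

def geometric_perturb(records, angle, shift, noise_sd=0.0, seed=0):
    """Rotate each (x, y) record by `angle`, translate by `shift`,
    and add optional Gaussian noise before outsourcing to the cloud."""
    rng = random.Random(seed)
    c, s = math.cos(angle), math.sin(angle)
    return [(c * x - s * y + shift[0] + rng.gauss(0, noise_sd),
             s * x + c * y + shift[1] + rng.gauss(0, noise_sd))
            for x, y in records]

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])
```

    Because rotation and translation are isometries, any clustering or nearest-neighbor query run on the perturbed database in the cloud yields the same structure as on the original records, without revealing the original values.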

  7. Geometric Data Perturbation-Based Personal Health Record Transactions in Cloud Computing

    Directory of Open Access Journals (Sweden)

    S. Balasubramaniam

    2015-01-01

    Full Text Available Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud.

  8. Geometric data perturbation-based personal health record transactions in cloud computing.

    Science.gov (United States)

    Balasubramaniam, S; Kavitha, V

    2015-01-01

    Cloud computing is a new delivery model for information technology services and it typically involves the provision of dynamically scalable and often virtualized resources over the Internet. However, cloud computing raises concerns on how cloud service providers, user organizations, and governments should handle such information and interactions. Personal health records represent an emerging patient-centric model for health information exchange, and they are outsourced for storage by third parties, such as cloud providers. With these records, it is necessary for each patient to encrypt their own personal health data before uploading them to cloud servers. Current techniques for encryption primarily rely on conventional cryptographic approaches. However, key management issues remain largely unsolved with these cryptographic-based encryption techniques. We propose that personal health record transactions be managed using geometric data perturbation in cloud computing. In our proposed scheme, the personal health record database is perturbed using geometric data perturbation and outsourced to the Amazon EC2 cloud.

  9. High velocity electromagnetic particle launcher for aerosol production studies

    International Nuclear Information System (INIS)

    Benson, D.A.; Rader, D.J.

    1986-05-01

    This report describes the development of a new device for the study of metal combustion, breakup and aerosol production in a high-velocity environment. Metal wires are heated and electromagnetically launched with this device to produce molten metal droplets moving at velocities ranging up to about Mach 1. Such tests are presently intended to simulate the behavior of metal streamers ejected from a high-explosive detonation. A numerical model of launcher performance in terms of sample properties, sample geometry and pulser electrical parameters is presented, which can be used as a tool for designing specific test conditions. Results from several tests showing the range of sample velocities accessible with this device are described and compared with the model. Photographic measurements showing the behavior of tungsten and zirconium metal droplets are presented. Estimates of the Weber breakup and drag on the droplets, as well as calculations of the droplet trajectories, are described. Such studies may ultimately be useful in assessing environmental hazards in the handling and storage of devices containing metallic plutonium.
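    The breakup criterion mentioned above can be made concrete. The Weber number compares aerodynamic pressure to surface tension, We = rho_gas * v^2 * d / sigma, and values above roughly 12 are commonly taken as the onset of secondary droplet breakup. The gas density, droplet size and surface tension below are illustrative, not the report's measured values.

```python
def weber_number(gas_density, velocity, diameter, surface_tension):
    """We = rho * v^2 * d / sigma (dimensionless).
    SI units: kg/m^3, m/s, m, N/m."""
    return gas_density * velocity ** 2 * diameter / surface_tension

# A 1 mm molten-metal droplet moving near Mach 1 in air (illustrative values):
we = weber_number(gas_density=1.2, velocity=340.0,
                  diameter=1e-3, surface_tension=0.5)
breakup_expected = we > 12.0
```

    At these assumed conditions the Weber number is well above the breakup threshold, consistent with droplets fragmenting rather than surviving intact at Mach-1 speeds.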

  10. On the origin of high-velocity runaway stars

    Science.gov (United States)

    Gvaramadze, Vasilii V.; Gualandris, Alessia; Portegies Zwart, Simon

    2009-06-01

    We explore the hypothesis that some high-velocity runaway stars attain their peculiar velocities in the course of exchange encounters between hard massive binaries and a very massive star (either an ordinary 50-100 M_sun star or a more massive one, formed through runaway mergers of ordinary stars in the core of a young massive star cluster). In this process, one of the binary components becomes gravitationally bound to the very massive star, while the second one is ejected, sometimes at high speed. We performed three-body scattering experiments and found that early B-type stars (the progenitors of the majority of neutron stars) can be ejected with velocities of >~ 200-400 km s^-1 (typical of pulsars), while 3-4 M_sun stars can attain velocities of >~ 300-400 km s^-1 (typical of the bound population of halo late B-type stars). We also found that the ejected stars can occasionally attain velocities exceeding the Milky Way's escape velocity.

  11. Study of the relations between cloud properties and atmospheric conditions using ground-based digital images

    Science.gov (United States)

    Bakalova, Kalinka

    The aerosol constituents of the earth's atmosphere are of great significance for the radiation budget and global climate of the planet. They are the precursors of clouds, which in turn play an essential role in these processes and in the hydrological cycle of the Earth. Understanding the complex aerosol-cloud interactions requires detailed knowledge of the dynamical processes moving water vapor through the atmosphere and of the physical mechanisms involved in the formation and growth of cloud particles. Ground-based observations on regional and short time scales provide valuable, detailed information about atmospheric dynamics and cloud properties, and are used as a complementary tool to global satellite observations. The objective of the present paper is to study the physical properties of clouds as displayed in ground-based visible images and to juxtapose them with the specific surface and atmospheric meteorological conditions. The observations are being carried out over the urban area of the city of Sofia, Bulgaria. The data obtained from visible images of clouds enable a quantitative description of the texture and morphological features of clouds, such as shape, thickness, motion, etc. These characteristics are related to cloud microphysical properties. The changes in relative humidity and horizontal visibility are considered representative of variations in the type (natural or man-made) and amount of atmospheric aerosols near the earth's surface and, potentially, the cloud drop number concentration. The atmospheric dynamics is accounted for by means of the values of atmospheric pressure, temperature, wind velocity, etc., observed at the earth's surface. The advantage of ground-based observations of clouds compared to satellite ones lies in the high spatial and temporal resolution of the data obtained about the lowermost cloud layer, which in turn is sensitive to the meteorological regimes that determine cloud formation and evolution. It turns out

  12. Analysis of the security and privacy requirements of cloud-based electronic health records systems.

    Science.gov (United States)

    Rodrigues, Joel J P C; de la Torre, Isabel; Fernández, Gonzalo; López-Coronado, Miguel

    2013-08-21

    The Cloud Computing paradigm offers eHealth systems the opportunity to enhance the features and functionality that they offer. However, moving patients' medical information to the Cloud implies several risks in terms of the security and privacy of sensitive health records. In this paper, the risks of hosting Electronic Health Records (EHRs) on the servers of third-party Cloud service providers are reviewed. To protect the confidentiality of patient information and facilitate the process, some suggestions for health care providers are made. Moreover, security issues that Cloud service providers should address in their platforms are considered. This shows that, before moving patient health records to the Cloud, security and privacy concerns must be considered by both health care providers and Cloud service providers. Security requirements of a generic Cloud service provider are analyzed. To study the latest Cloud-based computing solutions, bibliographic material was obtained mainly from Medline sources. Furthermore, direct contact was made with several Cloud service providers. Some of the security issues that should be considered by both Cloud service providers and their health care customers are role-based access, network security mechanisms, data encryption, digital signatures, and access monitoring. Furthermore, to guarantee the safety of the information and comply with privacy policies, the Cloud service provider must be compliant with various certifications and third-party requirements, such as SAS70 Type II, PCI DSS Level 1, ISO 27001, and the US Federal Information Security Management Act (FISMA). Storing sensitive information such as EHRs in the Cloud means that precautions must be taken to ensure the safety and confidentiality of the data. A relationship built on trust with the Cloud service provider is essential to ensure a transparent process. 
Cloud service providers must make certain that all security mechanisms are in place to avoid unauthorized access
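    Among the mechanisms the record lists, role-based access is simple to illustrate. A minimal sketch in Python follows; the role names and permission sets are hypothetical examples, not taken from the paper:

```python
# Minimal role-based access check for EHR actions (illustrative only;
# these roles and permissions are hypothetical, not from the paper).
ROLE_PERMISSIONS = {
    "physician": {"read", "write"},
    "nurse": {"read"},
    "billing": set(),  # no access to clinical records in this toy policy
}

def can_access(role: str, action: str) -> bool:
    """Return True if the given role is allowed to perform the action."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

    A real EHR deployment would back such checks with an audited policy store and access monitoring rather than an in-memory table.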

  13. Radiative budget and cloud radiative effect over the Atlantic from ship-based observations

    Directory of Open Access Journals (Sweden)

    J. Kalisch

    2012-10-01

    Full Text Available The aim of this study is to determine cloud-type resolved cloud radiative budgets and cloud radiative effects from surface measurements of broadband radiative fluxes over the Atlantic Ocean. Furthermore, based on simultaneous observations of the state of the cloudy atmosphere, a radiative closure study has been performed by means of the ECHAM5 single column model in order to identify the model's ability to realistically reproduce the effects of clouds on the climate system.

    An extensive database of radiative and atmospheric measurements has been established along five meridional cruises of the German research icebreaker Polarstern. Besides pyranometer and pyrgeometer for downward broadband solar and thermal radiative fluxes, a sky imager and a microwave radiometer have been utilized to determine cloud fraction and cloud type on the one hand and temperature and humidity profiles as well as liquid water path for warm non-precipitating clouds on the other hand.

    Averaged over all cruise tracks, we obtain a total net (solar + thermal) radiative flux of 144 W m−2 that is dominated by the solar component. In general, the solar contribution is large for cirrus clouds and small for stratus clouds. No significant meridional dependencies were found for the surface radiation budgets and cloud effects. The strongest surface longwave cloud effects were observed in the presence of low-level clouds. Clouds with a high optical density induce strong negative solar radiative effects at high solar altitudes. The mean surface net cloud radiative effect is −33 W m−2.
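    The cloud effects quoted above follow the standard convention: the cloud radiative effect is the measured all-sky flux minus the modelled clear-sky flux, so negative values indicate that clouds cool the surface. A minimal sketch of that bookkeeping (the function names are ours, and the flux values used for illustration are arbitrary):

```python
def cloud_radiative_effect(f_allsky: float, f_clearsky: float) -> float:
    """Surface cloud radiative effect (W m^-2): measured all-sky flux
    minus the modelled clear-sky flux. Negative = clouds cool the surface."""
    return f_allsky - f_clearsky

def net_effect(sw_allsky: float, sw_clear: float,
               lw_allsky: float, lw_clear: float) -> float:
    """Net (solar + thermal) surface cloud effect as the sum of the
    shortwave and longwave components."""
    return (cloud_radiative_effect(sw_allsky, sw_clear)
            + cloud_radiative_effect(lw_allsky, lw_clear))
```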

    For the purpose of quickly estimating the mean surface longwave, shortwave and net cloud effects in moderate, subtropical and tropical climate regimes, a new parameterisation was created, considering the total cloud amount and the solar zenith angle.

    The ECHAM5 single column model provides a surface net cloud effect that is more

  14. A Review on Broker Based Cloud Service Model

    Directory of Open Access Journals (Sweden)

    Nagarajan Rajganesh

    2016-09-01

    Full Text Available Cloud computing emerged as utility-oriented computing that facilitates resource sharing under a pay-as-you-go model. Nowadays, cloud offerings are not limited to a fixed range of services, and anything can be shared as a service through the Internet. In this work, a detailed literature survey of cloud service discovery and composition is presented. A proposed architecture that includes a cloud broker is presented in our work. It focuses on the importance of suitable service selection and ranking in fulfilling the customer's service requirements. The proposed cloud broker employs techniques such as reasoning and decision-making capabilities for improved cloud service selection and composition.
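    One simple way a broker can rank candidate services, as a rough illustration of the selection step described above, is a weighted sum of normalized attribute scores. The attribute names and weights below are hypothetical, not taken from the paper:

```python
def rank_services(services, weights):
    """Rank candidate cloud services by a weighted sum of normalized
    attribute scores (higher is better). Illustrative sketch only."""
    def score(svc):
        return sum(weights[attr] * svc[attr] for attr in weights)
    return sorted(services, key=score, reverse=True)

# Hypothetical candidates with pre-normalized scores in [0, 1].
candidates = [
    {"name": "svc-a", "availability": 0.99, "cost_score": 0.4},
    {"name": "svc-b", "availability": 0.95, "cost_score": 0.9},
]
ranked = rank_services(candidates, {"availability": 0.7, "cost_score": 0.3})
```

    A real broker would add constraint filtering (e.g. region, compliance) before scoring, and could learn the weights from customer feedback.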

  15. Overview of Boundary Layer Clouds Using Satellite and Ground-Based Measurements

    Science.gov (United States)

    Xi, B.; Dong, X.; Wu, P.; Qiu, S.

    2017-12-01

    A comprehensive summary of boundary layer cloud properties based on several of our recent studies will be presented. The analyses include global cloud fractions and cloud macro/micro-physical properties based on satellite measurements using both CERES-MODIS and CloudSat/CALIPSO data products; the annual/seasonal/diurnal variations of stratocumulus clouds over different climate regions (mid-latitude land, mid-latitude ocean, and Arctic region) using DOE ARM ground-based measurements over the Southern Great Plains (SGP), Azores (GRW), and North Slope of Alaska (NSA) sites; the impact of environmental conditions on the formation and dissipation processes of marine boundary layer clouds over the Azores site; and characterization of Arctic mixed-phase cloud structure and the environmental conditions favorable for the formation/maintenance of mixed-phase clouds over the NSA site. Although the presentation covers widely spread topics, we will focus on the representativeness of the ground-based measurements over different climate regions; evaluation of satellite-retrieved cloud properties using these ground-based measurements; and understanding the uncertainties of both satellite and ground-based retrievals and measurements.

  16. On Cloud-Based Engineering of Dependable Systems

    OpenAIRE

    Alajrami, Sami

    2014-01-01

    The cloud computing paradigm is being adopted by many organizations in different application domains as it is cost effective and offers a virtually unlimited pool of resources. Engineering critical systems can benefit from clouds in attaining all dependability means: fault tolerance, fault prevention, fault removal and fault forecasting. Our research aims to investigate the potential of supporting engineering of dependable software systems with cloud computing and proposes an open, extensible...

  17. A Cloud Computing Based Patient Centric Medical Information System

    Science.gov (United States)

    Agarwal, Ankur; Henehan, Nathan; Somashekarappa, Vivek; Pandya, A. S.; Kalva, Hari; Furht, Borko

    This chapter discusses an emerging concept of a cloud computing based Patient Centric Medical Information System framework that will allow various authorized users to securely access patient records from various Care Delivery Organizations (CDOs) such as hospitals, urgent care centers, doctors, laboratories, and imaging centers, from any location. Such a system must seamlessly integrate all patient records, including images such as CT scans and MRIs, which can easily be accessed from any location and reviewed by any authorized user. In such a scenario the storage and transmission of medical records will have to be conducted in a totally secure and safe environment with a very high standard of data integrity, protecting patient privacy and complying with all Health Insurance Portability and Accountability Act (HIPAA) regulations.

  18. Method for validating cloud mask obtained from satellite measurements using ground-based sky camera.

    Science.gov (United States)

    Letu, Husi; Nagao, Takashi M; Nakajima, Takashi Y; Matsumae, Yoshiaki

    2014-11-01

    Error propagation in the Earth's atmospheric, oceanic, and land surface parameters of satellite products caused by misclassification of the cloud mask is a critical issue for improving the accuracy of satellite products. Thus, characterizing the accuracy of the cloud mask is important for investigating the influence of the cloud mask on satellite products. In this study, we propose a method for validating cloud masks derived from multiwavelength satellite data using ground-based sky camera (GSC) data. First, a cloud cover algorithm for GSC data was developed using the sky index and brightness index. Then, cloud masks derived from Moderate Resolution Imaging Spectroradiometer (MODIS) satellite data by two cloud-screening algorithms (i.e., MOD35 and CLAUDIA) were validated using the GSC cloud mask. The results indicate that MOD35 is likely to classify ambiguous pixels as "cloudy," whereas CLAUDIA is likely to classify them as "clear." Furthermore, the influence of error propagation caused by misclassification of the MOD35 and CLAUDIA cloud masks on MODIS-derived reflectance, brightness temperature, and normalized difference vegetation index (NDVI) in clear and cloudy pixels was investigated using the sky camera data. The results show that the influence of error propagation by the MOD35 cloud mask on the MODIS-derived monthly mean reflectance, brightness temperature, and NDVI for clear pixels is significantly smaller than that by the CLAUDIA cloud mask, while the influence of error propagation by the CLAUDIA cloud mask on MODIS-derived monthly mean cloud products for cloudy pixels is significantly smaller than that by the MOD35 cloud mask.
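    A commonly used sky index for RGB sky cameras contrasts the blue and red channels, since clear sky scatters disproportionately more blue light than grey cloud. The sketch below illustrates the idea only; the exact formulation used in the paper may differ, and the classification threshold here is purely illustrative:

```python
def sky_index(r: float, b: float) -> float:
    """Sky index from red and blue channel intensities: near 1 for deep
    blue sky, near 0 for grey (spectrally flat) cloud."""
    return (b - r) / (b + r) if (b + r) else 0.0

def is_cloudy(r: float, b: float, threshold: float = 0.12) -> bool:
    """Classify a pixel as cloudy when the sky index falls below an
    illustrative threshold (not the paper's calibrated value)."""
    return sky_index(r, b) < threshold
```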

  19. Spatiotemporal High-Resolution Cloud Mapping with a Ground-Based IR Scanner

    Directory of Open Access Journals (Sweden)

    Benjamin Brede

    2017-01-01

    Full Text Available The high spatiotemporal variability of clouds requires automated monitoring systems. This study presents a retrieval algorithm that evaluates observations of a hemispherically scanning thermal infrared radiometer, the NubiScope, to produce georeferenced, spatially explicit cloud maps. The algorithm uses atmospheric temperature and moisture profiles and an atmospheric radiative transfer code to differentiate between cloudy and cloudless measurements. In the case of a cloud, it estimates its position using the temperature profile and viewing geometry. The proposed algorithm was tested against 25 cloud maps generated by the Fmask algorithm from Landsat 7 images. The overall cloud detection rate ranged from 0.607 for zenith angles of 0–10° to 0.298 for 50–60° on a pixel basis. The overall detection of cloudless pixels was 0.987 for zenith angles of 30–40° and much more stable over the whole range of zenith angles than cloud detection. This demonstrates the algorithm's capability to detect clouds, and, even more reliably, cloudless areas. Cloud-base height was estimated well up to a height of 4000 m compared to ceilometer base heights but showed large deviations above that level. This study shows the potential of the NubiScope system to produce cloud maps with high spatial and temporal resolution. Future development is needed for a more accurate determination of cloud height with thermal infrared measurements.

  20. An improved approach for flow-based cloud point extraction.

    Science.gov (United States)

    Frizzarin, Rejane M; Rocha, Fábio R P

    2014-04-11

    Novel strategies are proposed to circumvent the main drawbacks of flow-based cloud point extraction (CPE). The surfactant-rich phase (SRP) was directly retained in the optical path of the spectrophotometric cell, thus avoiding its dilution prior to measurement and yielding higher sensitivity. Solenoid micro-pumps were exploited to improve mixing by the pulsed flow and also to modulate the flow rate for retention and removal of the SRP, thus avoiding the elution step, often carried out with organic solvents. The heat released and the increase in salt concentration provided by an on-line neutralization reaction were exploited to induce the cloud point without an external heating device. These innovations were demonstrated by the spectrophotometric determination of iron, yielding a linear response from 10 to 200 μg L(-1) with a coefficient of variation of 2.3% (n=7). The detection limit and sampling rate were estimated at 5 μg L(-1) (95% confidence level) and 26 samples per hour, respectively. The enrichment factor was 8.9 and the procedure consumed only 6 μg of TAN and 390 μg of Triton X-114 per determination. At the 95% confidence level, the results obtained for freshwater samples agreed with the reference procedure and those obtained for digests of bovine muscle, rice flour, brown bread and tort lobster agreed with the certified reference values. The proposed procedure thus shows advantages over previously proposed approaches for flow-based CPE, being a fast and environmentally friendly alternative for on-line separation and pre-concentration. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. Effects of Cloud-Based m-Learning on Student Creative Performance in Engineering Design

    Science.gov (United States)

    Chang, Yu-Shan; Chen, Si-Yi; Yu, Kuang-Chao; Chu, Yih-Hsien; Chien, Yu-Hung

    2017-01-01

    This study explored the effects of cloud-based m-learning on students' creative processes and products in engineering design. A nonequivalent pretest-posttest design was adopted, and 62 university students from Taipei City, Taiwan, were recruited as research participants in the study. The results showed that cloud-based m-learning had a positive…

  2. Observed Correlation between Aerosol and Cloud Base Height for Low Clouds at Baltimore and New York, United States

    Directory of Open Access Journals (Sweden)

    Sium Gebremariam

    2018-04-01

    Full Text Available The correlation between aerosol particulate matter with aerodynamic diameter ≤2.5 μm (PM2.5) and cloud base height (CBH) of low clouds (CBH lower than 1.5 km a.g.l.) at Baltimore and New York, United States, for an 8 year period (2007–2014) was investigated using information from Automated Surface Observing System (ASOS) observations and collocated U.S. Environmental Protection Agency (EPA) observations. The lifting condensation level (LCL) heights were calculated and compared with the CBH. The monthly average observations show that PM2.5 decreased from 2007 to 2014, while no significant trend was found for CBH and LCL. The variability of the LCL height agrees well with CBH, but LCL height is systematically lower than CBH (~180 m lower). A significant negative correlation was found between CBH−LCL and PM2.5. All of the cloud cases were separated into polluted and clean conditions based on the distribution of PM2.5 values. The distributions of CBH−LCL in the two groups show more cloud cases with smaller CBH−LCL in polluted conditions than in clean conditions.
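    The LCL comparison rests on standard parcel thermodynamics. A widely used first-order estimate (the Espy-type rule of roughly 125 m of lift per degree of dewpoint depression) can be sketched as follows; this is an illustration, not the study's exact calculation:

```python
def lcl_height_m(temp_c: float, dewpoint_c: float) -> float:
    """Approximate lifting condensation level height above ground (m)
    from surface temperature and dew point, using the classic
    ~125 m per degree of dewpoint depression rule of thumb."""
    return 125.0 * (temp_c - dewpoint_c)
```

    For example, a surface temperature of 25 °C with an 17 °C dew point gives an LCL near 1000 m; the study's ~180 m mean offset between LCL and CBH sits on top of such estimates.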

  3. Cloud-Based Speech Technology for Assistive Technology Applications (CloudCAST).

    Science.gov (United States)

    Cunningham, Stuart; Green, Phil; Christensen, Heidi; Atria, José Joaquín; Coy, André; Malavasi, Massimiliano; Desideri, Lorenzo; Rudzicz, Frank

    2017-01-01

    The CloudCAST platform provides a series of speech recognition services that can be integrated into assistive technology applications. The platform and the services provided by the public API are described. Several exemplar applications have been developed to demonstrate the platform to potential developers and users.

  4. Development of Web-Based Remote Desktop to Provide Adaptive User Interfaces in Cloud Platform

    OpenAIRE

    Shuen-Tai Wang; Hsi-Ya Chang

    2014-01-01

    Cloud virtualization technologies are becoming more and more prevalent, and cloud users often encounter the problem of how to access virtualized remote desktops easily over the web without installing special clients. To resolve this issue, we took advantage of HTML5 technology and developed a web-based remote desktop. It permits users to access a terminal running in our cloud platform from anywhere. We implemented a sketch of web interfac...

  5. A Multi-agent Supply Chain Information Coordination Mode Based on Cloud Computing

    OpenAIRE

    Wuxue Jiang; Jing Zhang; Junhuai Li

    2013-01-01

    In order to improve the efficiency and security of supply chain information coordination in a cloud computing environment, this paper proposes a supply chain information coordination mode based on cloud computing. This mode has two basic statuses: online and offline. In the online status, the cloud computing center is responsible for coordinating the whole supply chain's information. In the offline status, information exchange can be realized among different nodes by u...

  6. MOMCC: Market-Oriented Architecture for Mobile Cloud Computing Based on Service Oriented Architecture

    OpenAIRE

    Abolfazli, Saeid; Sanaei, Zohreh; Gani, Abdullah; Shiraz, Muhammad

    2012-01-01

    The vision of augmenting the computing capabilities of mobile devices, especially smartphones, at the least cost is becoming reality by leveraging cloud computing. Cloud exploitation by mobile devices breeds a new research domain called Mobile Cloud Computing (MCC). However, issues like portability and interoperability should be addressed for mobile augmentation, which is a non-trivial task using component-based approaches. Service Oriented Architecture (SOA) is a promising design philosop...

  7. A Semantic Based Policy Management Framework for Cloud Computing Environments

    Science.gov (United States)

    Takabi, Hassan

    2013-01-01

    The cloud computing paradigm has gained tremendous momentum and generated intense interest. Although security issues are delaying its fast adoption, cloud computing is an unstoppable force and we need to provide security mechanisms to ensure its secure adoption. In this dissertation, we mainly focus on issues related to policy management and access…

  8. Cloud Privacy Audit Framework: A Value-Based Design

    Science.gov (United States)

    Coss, David Lewis

    2013-01-01

    The rapid expansion of cloud technology provides enormous capacity, which allows for the collection, dissemination and re-identification of personal information. It is the cloud's resource capabilities such as these that fuel the concern for privacy. The impetus of these concerns is not too far removed from those expressed by Mason in 1986…

  9. OpenID connect as a security service in Cloud-based diagnostic imaging systems

    Science.gov (United States)

    Ma, Weina; Sartipi, Kamran; Sharghi, Hassan; Koff, David; Bak, Peter

    2015-03-01

    The evolution of cloud computing is driving the next generation of diagnostic imaging (DI) systems. Cloud-based DI systems are able to deliver better services to patients without being constrained by their own physical facilities. However, privacy and security concerns have consistently been regarded as the major obstacle to the adoption of cloud computing in healthcare domains. Furthermore, the traditional computing models and interfaces employed by DI systems are not ready for accessing diagnostic images through mobile devices. REST is an ideal technology for provisioning both mobile services and cloud computing. OpenID Connect, combining OpenID and OAuth, is an emerging REST-based federated identity solution. It is one of the most promising open standards, with the potential to become the de facto standard for securing cloud computing and mobile applications, and has been regarded as the "Kerberos of the Cloud". We introduce OpenID Connect as an identity and authentication service in cloud-based DI systems and propose enhancements that allow this technology to be incorporated within a distributed enterprise environment. The objective of this study is to offer solutions for secure radiology image sharing among DI-r (Diagnostic Imaging Repository) and heterogeneous PACS (Picture Archiving and Communication Systems) as well as mobile clients in the cloud ecosystem. By using OpenID Connect as an open-source identity and authentication service, deploying DI-r and PACS to private or community clouds should attain a security level equivalent to that of the traditional computing model.
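    The core of an OpenID Connect authorization-code flow is an authorization request whose `openid` scope distinguishes it from a plain OAuth request. A minimal, stdlib-only sketch of building such a request follows; the issuer URL, client ID, and redirect URI are placeholders, not an actual DI-r/PACS deployment:

```python
from urllib.parse import urlencode

def build_auth_request(issuer_authorize_url: str, client_id: str,
                       redirect_uri: str, state: str) -> str:
    """Build an OpenID Connect authorization-code-flow request URL.
    The endpoint and client values passed in are placeholders."""
    params = {
        "response_type": "code",     # authorization code flow
        "scope": "openid profile",   # 'openid' marks this as an OIDC request
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "state": state,              # opaque value for CSRF protection
    }
    return issuer_authorize_url + "?" + urlencode(params)
```

    After the user authenticates at the identity provider, the client exchanges the returned code at the token endpoint for an ID token (a signed JWT) alongside the usual OAuth access token.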

  10. Analysis of cloud-based solutions on EHRs systems in different scenarios.

    Science.gov (United States)

    Fernández-Cardeñosa, Gonzalo; de la Torre-Díez, Isabel; López-Coronado, Miguel; Rodrigues, Joel J P C

    2012-12-01

    Nowadays, with the growth of wireless connections, people can access resources hosted in the Cloud almost everywhere. In this context, organizations can take advantage of this fact, in terms of e-Health, by deploying Cloud-based solutions for e-Health services. In this paper, two Cloud-based solutions for different scenarios of Electronic Health Records (EHRs) management systems are proposed. We researched articles published between 2005 and 2011 in Medline about the implementation of e-Health services based on the Cloud. In order to analyze the best scenario for the deployment of Cloud Computing, two solutions, for a large Hospital and for a network of Primary Care Health centers, have been studied. An economic estimation of the cost of implementation for both scenarios has been made using the Amazon calculator tool. As a result of this analysis, two solutions are suggested depending on the scenario: to deploy a Cloud solution for a large Hospital, a typical Cloud solution in which only the needed services are hired is assumed. On the other hand, to work with several Primary Care Centers, the implementation of a network interconnecting these centers with a single Cloud environment is suggested. Finally, a hybrid solution is considered: EHRs with images would be hosted in the Hospital or Primary Care Centers and the rest would be migrated to the Cloud.

  11. Fault gouge rheology under confined, high-velocity conditions

    Science.gov (United States)

    Reches, Z.; Madden, A. S.; Chen, X.

    2012-12-01

    We recently developed the experimental capability to investigate the shear properties of fine-grain gouge under confined conditions and high velocity. The experimental system includes a rotary apparatus that can apply large displacements of tens of meters, slip velocities of 0.001–2.0 m/s, and normal stresses of up to 35 MPa (Reches and Lockner, 2010). The key new component is a Confined ROtary Cell (CROC) that can shear a gouge layer either dry or under pore pressure. The pore pressure is controlled by two syringe pumps. CROC includes a ring-shaped gouge chamber of 62.5 mm inner diameter, 81.25 mm outer diameter, and up to 3 mm gouge sample thickness. The lower, rotating part of CROC contains the sample chamber, and the upper, stationary part includes the loading hollow cylinder, settings for temperature and dilation measurements, and pore-pressure control. Each side of the gouge chamber has two pairs of industrial, spring-energized, self-lubricating teflon-graphite seals, built for particle media, that can work at temperatures up to 250 deg C. The space between each of the two sets of seals is pressurized by nitrogen. This design generates 'zero differential pressure' on the inner seal (which is in contact with the gouge powder) and prevents gouge leaks. For the preliminary dry experiments, we used ~2.0 mm thick layers of room-dry kaolinite powder. Total displacements were on the order of meters and normal stresses up to 4 MPa. The initial shear was accommodated by multiple internal slip surfaces within the kaolinite layer, organized as oriented Riedel shear structures. Later, the shear was localized within a thin, plate-parallel Y-surface. The kaolinite layer compacted at a quasi-asymptotic rate and displayed a steady-state friction coefficient of ~0.5 with no clear dependence on slip velocity up to 0.15 m/s. Further experiments with loose quartz sand (grain size ~125 micron) included both dry runs and pore-pressure (distilled water) controlled runs. The sand was

  12. PRINCIPLES OF MODERN UNIVERSITY "ACADEMIC CLOUD" FORMATION BASED ON OPEN SOFTWARE PLATFORM

    Directory of Open Access Journals (Sweden)

    Olena H. Hlazunova

    2014-09-01

    Full Text Available In this article, approaches to the use of cloud technologies in teaching higher education students are analyzed. The essence of the concept of an "academic cloud" and its structural elements are explained. A model of the academic cloud of a modern university, operating on the basis of open software platforms, is proposed. Examples of functional software and platforms that meet students' needs for e-learning resources are given. Models for deploying a cloud-oriented environment in higher education (private cloud, infrastructure as a service, and platform as a service) are analyzed. A comparison of the cost of deploying an "academic cloud" on the institution's own infrastructure versus leased vendor infrastructure is presented.

  13. Adaptive Cost-Based Task Scheduling in Cloud Environment

    Directory of Open Access Journals (Sweden)

    Mohammed A. S. Mosleh

    2016-01-01

    Full Text Available Task execution in cloud computing requires obtaining stored data from remote data centers. Though this storage process reduces the memory constraints of the user's computer, the time deadline is a serious concern. In this paper, Adaptive Cost-based Task Scheduling (ACTS) is proposed to provide data access to the virtual machines (VMs) within the deadline without increasing the cost. ACTS considers the data access completion time when selecting the most cost-effective path to access the data. To allocate data access paths, the data access completion time is computed by considering the mean and variance of the network service time and the arrival rate of network input/output requests. Task priorities are then assigned based on the data access time. Finally, the costs of the data paths are analyzed and paths are allocated based on task priority: the minimum-cost path is allocated to low-priority tasks and fast-access paths are allocated to high-priority tasks so as to meet the time deadline. Thus, efficient task scheduling can be achieved using ACTS. Experimental results in terms of execution time, computation cost, communication cost, bandwidth, and CPU utilization show that the proposed algorithm provides better performance than state-of-the-art methods.
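    The allocation idea, giving fast paths to deadline-critical tasks and cheap paths to the rest, can be caricatured in a few lines. This is an illustrative sketch under invented task and path attributes, not the ACTS algorithm itself:

```python
def allocate_paths(tasks, paths):
    """Toy cost-aware path allocation: tasks with the longest estimated
    data-access time (highest priority) get the fastest paths; remaining
    tasks fall through to slower, cheaper paths.
    tasks: (name, est_access_time) pairs; paths: (name, speed, cost)."""
    tasks_by_priority = sorted(tasks, key=lambda t: t[1], reverse=True)
    paths_by_speed = sorted(paths, key=lambda p: p[1], reverse=True)
    return {task[0]: path[0]
            for task, path in zip(tasks_by_priority, paths_by_speed)}
```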

  14. COEL: A Cloud-based Reaction Network Simulator

    Directory of Open Access Journals (Sweden)

    Peter eBanda

    2016-04-01

    Full Text Available Chemical Reaction Networks (CRNs) are a formalism to describe the macroscopic behavior of chemical systems. We introduce COEL, a web- and cloud-based CRN simulation framework that does not require a local installation, runs simulations on a large computational grid, provides reliable database storage, and offers a visually pleasing and intuitive user interface. We present an overview of the underlying software, the technologies, and the main architectural approaches employed. Some of COEL's key features include ODE-based simulations of CRNs and multicompartment reaction networks with rich interaction options, a built-in plotting engine, automatic DNA-strand displacement transformation and visualization, SBML/Octave/Matlab export, and a built-in genetic-algorithm-based optimization toolbox for rate constants. COEL is an open-source project hosted on GitHub (http://dx.doi.org/10.5281/zenodo.46544), which allows interested research groups to deploy it on their own server. Regular users can simply use the web instance at no cost at http://coel-sim.org. The framework is ideally suited for collaborative use in both research and education.
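    ODE-based simulation of a CRN under mass-action kinetics reduces to integrating the rate equations. As a minimal illustration (not COEL's API), here is a forward-Euler integration of the single reaction A → B, whose exact solution is [A](t) = [A]0·e^(−kt):

```python
import math

def simulate_decay(a0: float, k: float, dt: float, steps: int) -> float:
    """Forward-Euler integration of the mass-action ODE for A -> B
    (da/dt = -k*a). Returns the concentration of A after 'steps' steps."""
    a = a0
    for _ in range(steps):
        a += dt * (-k * a)
    return a

# With a small step the Euler result tracks the exact solution a0*exp(-k*t).
approx = simulate_decay(1.0, k=2.0, dt=0.001, steps=1000)  # t = 1.0
exact = math.exp(-2.0)
```

    Production simulators like COEL use adaptive stiff integrators rather than fixed-step Euler, but the underlying rate equations are the same.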

  15. Aerosol formation from high-velocity uranium drops: Comparison of number and mass distributions. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Rader, D.J.; Benson, D.A.

    1995-05-01

    This report presents the results of an experimental study of the aerosol produced by the combustion of high-velocity molten-uranium droplets produced by the simultaneous heating and electromagnetic launch of uranium wires. These tests are intended to simulate the reduction of high-velocity fragments into aerosol in high-explosive detonations or reactor accidents involving nuclear materials. As reported earlier, the resulting aerosol consists mainly of web-like chain agglomerates. A condensation nucleus counter was used to investigate the decay of the total particle concentration due to coagulation and losses. Number size distributions based on mobility equivalent diameter obtained soon after launch with a Differential Mobility Particle Sizer showed lognormal distributions with an initial count median diameter (CMD) of 0.3 μm and a geometric standard deviation, σg, of about 2; the CMD was found to increase and σg to decrease with time due to coagulation. Mass size distributions based on aerodynamic diameter were obtained for the first time with a Microorifice Uniform Deposit Impactor, which showed lognormal distributions with mass median aerodynamic diameters of about 0.5 μm and an aerodynamic geometric standard deviation of about 2. Approximate methods for converting between number and mass distributions and between mobility and aerodynamic equivalent diameters are presented.
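    The number-to-mass conversion mentioned at the end of the abstract is commonly done with the Hatch-Choate relations for lognormal distributions, and mobility-to-aerodynamic conversion with a density/shape-factor scaling. A sketch of both follows (neglecting the slip correction; the variable names are ours, not the report's):

```python
import math

def mass_median_from_count_median(cmd: float, sigma_g: float) -> float:
    """Hatch-Choate conversion for a lognormal size distribution:
    mass median diameter MMD = CMD * exp(3 * ln(sigma_g)**2)."""
    return cmd * math.exp(3.0 * math.log(sigma_g) ** 2)

def aerodynamic_from_volume_equivalent(d_ve: float, rho_p: float,
                                       chi: float = 1.0,
                                       rho_0: float = 1000.0) -> float:
    """Approximate aerodynamic diameter, neglecting slip correction:
    d_a = d_ve * sqrt(rho_p / (chi * rho_0)), with particle density rho_p
    and dynamic shape factor chi; densities in kg m^-3."""
    return d_ve * math.sqrt(rho_p / (chi * rho_0))
```

    For the reported CMD of 0.3 μm and σg of about 2, the Hatch-Choate mass median diameter comes out near 1.27 μm, illustrating how strongly mass distributions weight the large-particle tail.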

  16. Aerosol formation from high-velocity uranium drops: Comparison of number and mass distributions. Final report

    International Nuclear Information System (INIS)

    Rader, D.J.; Benson, D.A.

    1995-05-01

    This report presents the results of an experimental study of the aerosol produced by the combustion of high-velocity molten-uranium droplets produced by the simultaneous heating and electromagnetic launch of uranium wires. These tests are intended to simulate the reduction of high-velocity fragments into aerosol in high-explosive detonations or reactor accidents involving nuclear materials. As reported earlier, the resulting aerosol consists mainly of web-like chain agglomerates. A condensation nucleus counter was used to investigate the decay of the total particle concentration due to coagulation and losses. Number size distributions based on mobility equivalent diameter obtained soon after launch with a Differential Mobility Particle Sizer showed lognormal distributions with an initial count median diameter (CMD) of 0.3 μm and a geometric standard deviation, σg, of about 2; the CMD was found to increase and σg to decrease with time due to coagulation. Mass size distributions based on aerodynamic diameter were obtained for the first time with a Microorifice Uniform Deposit Impactor, which showed lognormal distributions with mass median aerodynamic diameters of about 0.5 μm and an aerodynamic geometric standard deviation of about 2. Approximate methods for converting between number and mass distributions and between mobility and aerodynamic equivalent diameters are presented.

  17. CLOUD COMPUTING BASED INFORMATION SYSTEMS -PRESENT AND FUTURE

    Directory of Open Access Journals (Sweden)

    Maximilian ROBU

    2012-12-01

    Full Text Available The current economic crisis and the global recession have affected the IT market as well. A solution came from the Cloud Computing area by optimizing IT budgets and eliminating different types of expenses (servers, licenses, and so on). Cloud Computing is an exciting and interesting phenomenon because of its relative novelty and exploding growth. Because of its rise in popularity and usage, Cloud Computing has established its role as a research topic. However, the tendency is to focus on the technical aspects of Cloud Computing, thus leaving the potential that this technology offers unexplored. With the help of this technology new market players arise, and they manage to break the traditional value chain of service provision. The main focus of this paper is the business aspects of the Cloud. In particular we will talk about the economic aspects of using Cloud Computing (when, why, and how to use it) and the impacts on the infrastructure; the legal issues that come from using Cloud Computing; and the scalability and partially unclear legislation.

  18. Statistical retrieval of thin liquid cloud microphysical properties using ground-based infrared and microwave observations

    Science.gov (United States)

    Marke, Tobias; Ebell, Kerstin; Löhnert, Ulrich; Turner, David D.

    2016-12-01

    In this article, liquid water cloud microphysical properties are retrieved by a combination of microwave and infrared ground-based observations. Clouds containing liquid water occur frequently in most climate regimes and play a significant role in terms of interaction with radiation. Small perturbations in the amount of liquid water contained in the cloud can cause large variations in the radiative fluxes. This effect is enhanced for thin clouds (low liquid water path, LWP), which makes an accurate retrieval of the cloud properties crucial. Due to large relative errors in retrieving low LWP values from observations in the microwave domain, and a high sensitivity of infrared methods when the LWP is low, a synergistic retrieval based on a neural network approach is built to estimate both LWP and cloud effective radius (reff). These statistical retrievals can be applied without high computational demand but imply constraints like prior information on cloud phase and cloud layering. The neural network retrievals are able to retrieve LWP and reff for thin clouds with a mean relative error of 9% and 17%, respectively. This is demonstrated using synthetic observations of a microwave radiometer (MWR) and a spectrally highly resolved infrared interferometer. The accuracy and robustness of the synergistic retrievals are confirmed by a low bias in a radiative closure study for the downwelling shortwave flux, even for marginally invalid scenes. Also, broadband infrared radiance observations, in combination with the MWR, have the potential to retrieve LWP with a higher accuracy than a MWR-only retrieval.
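
    As a toy illustration of such a statistical retrieval, the sketch below learns an inverse mapping from two synthetic "channels" to (LWP, reff). The forward model, channel coefficients, noise level, and value ranges are all invented; the paper trains a neural network on radiative-transfer simulations, for which an ordinary least-squares fit is only a stand-in.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical synthetic training set. The paper's forward model is a
    # radiative-transfer code; this linear toy mapping merely stands in for it.
    n = 2000
    lwp  = rng.uniform(1.0, 60.0, n)        # liquid water path, g/m^2 (thin clouds)
    reff = rng.uniform(4.0, 12.0, n)        # effective radius, um
    obs = np.column_stack([
        0.04 * lwp + 0.02 * reff,           # pseudo microwave channel
        0.10 * reff - 0.005 * lwp,          # pseudo infrared channel
    ]) + rng.normal(0.0, 0.02, (n, 2))      # instrument noise

    # Statistical retrieval: learn the inverse mapping observations -> (LWP, reff).
    # The paper trains a neural network; a least-squares fit plays that role here.
    X = np.column_stack([np.ones(n), obs])
    coef, *_ = np.linalg.lstsq(X, np.column_stack([lwp, reff]), rcond=None)

    pred = X @ coef
    rel_err = np.mean(np.abs(pred - np.column_stack([lwp, reff]))
                      / np.column_stack([lwp, reff]), axis=0)
    print(f"mean relative error: LWP {rel_err[0]:.1%}, reff {rel_err[1]:.1%}")
    ```

    The trained coefficients take the place of the network weights: once fitted offline, applying the retrieval to new observations is a single matrix product, which is why such statistical retrievals run without high computational demand.
    
    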

  19. Feasibility and demonstration of a cloud-based RIID analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Michael C., E-mail: wrightmc@ornl.gov [Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Hertz, Kristin L.; Johnson, William C. [Sandia National Laboratories, Livermore, CA 94551 (United States); Sword, Eric D.; Younkin, James R. [Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Sadler, Lorraine E. [Sandia National Laboratories, Livermore, CA 94551 (United States)

    2015-06-01

    A significant limitation in the operational utility of handheld and backpack radioisotope identifiers (RIIDs) is the inability of their onboard algorithms to accurately and reliably identify the isotopic sources of the measured gamma-ray energy spectrum. A possible solution is to move the spectral analysis computations to an external device, the cloud, where significantly greater capabilities are available. The implementation and demonstration of a prototype cloud-based RIID analysis system have shown this type of system to be feasible with currently available communication and computational technology. A system study has shown that the potential user community could derive significant benefits from an appropriately implemented cloud-based analysis system and has identified the design and operational characteristics required by the users and stakeholders for such a system. A general description of the hardware and software necessary to implement reliable cloud-based analysis, the value of the cloud expressed by the user community, and the aspects of the cloud implemented in the demonstrations are discussed. - Highlights: • A prototype cloud-based RIID analysis system was implemented and demonstrated. • A cloud-based system was shown to be feasible with currently available technology. • A system study identified the operational characteristics required by the users. • The system study showed that the user community could derive significant benefit. • An architecture was defined for field testing by users in relevant environments.

  20. Feasibility and demonstration of a cloud-based RIID analysis system

    International Nuclear Information System (INIS)

    Wright, Michael C.; Hertz, Kristin L.; Johnson, William C.; Sword, Eric D.; Younkin, James R.; Sadler, Lorraine E.

    2015-01-01

    A significant limitation in the operational utility of handheld and backpack radioisotope identifiers (RIIDs) is the inability of their onboard algorithms to accurately and reliably identify the isotopic sources of the measured gamma-ray energy spectrum. A possible solution is to move the spectral analysis computations to an external device, the cloud, where significantly greater capabilities are available. The implementation and demonstration of a prototype cloud-based RIID analysis system have shown this type of system to be feasible with currently available communication and computational technology. A system study has shown that the potential user community could derive significant benefits from an appropriately implemented cloud-based analysis system and has identified the design and operational characteristics required by the users and stakeholders for such a system. A general description of the hardware and software necessary to implement reliable cloud-based analysis, the value of the cloud expressed by the user community, and the aspects of the cloud implemented in the demonstrations are discussed. - Highlights: • A prototype cloud-based RIID analysis system was implemented and demonstrated. • A cloud-based system was shown to be feasible with currently available technology. • A system study identified the operational characteristics required by the users. • The system study showed that the user community could derive significant benefit. • An architecture was defined for field testing by users in relevant environments

  1. Development of Presentation Model with Cloud Based Infrastructure

    Directory of Open Access Journals (Sweden)

    Magdalena Widiantari Maria

    2018-01-01

    Full Text Available Computer-mediated communication comprises communication activities that use technology and that are progressing rapidly. Interactive communication nowadays no longer involves only person-to-person contact but is mediated by technology, and it has been adopted in many fields, including education and teaching. In this study, presentation media based on cloud infrastructure are designed to replace face-to-face, in-class lectures. In addition, the presentation media allow data storage for an indefinite time, accessible anywhere and at any time. This is in line with the concept of student-centered learning, in which students are encouraged to be more active in lecture activities. The purpose of this research is to design a presentation model based on cloud infrastructure. The research uses a research-and-development method consisting of four stages: the first phase is composing the concept of the presentation media design; the second phase is choosing the subject for which the presentation will be designed; the third stage is designing the presentation model; and the fourth phase is collecting materials for the subject to be presented by each lecturer.

  2. Content-based histopathology image retrieval using CometCloud.

    Science.gov (United States)

    Qi, Xin; Wang, Daihou; Rodero, Ivan; Diaz-Montes, Javier; Gensure, Rebekah H; Xing, Fuyong; Zhong, Hua; Goodell, Lauri; Parashar, Manish; Foran, David J; Yang, Lin

    2014-08-26

    The development of digital imaging technology is creating extraordinary levels of accuracy that provide support for improved reliability in different aspects of image analysis, such as content-based image retrieval, image segmentation, and classification. This has dramatically increased the volume and rate at which data are generated. Together these facts make querying and sharing non-trivial and render centralized solutions unfeasible. Moreover, in many cases these data are distributed and must be shared across multiple institutions, requiring decentralized solutions. In this context, a new generation of data/information driven applications must be developed to take advantage of the national advanced cyber-infrastructure (ACI), which enables investigators to seamlessly and securely interact with information/data that is distributed across geographically disparate resources. This paper presents the development and evaluation of a novel content-based image retrieval (CBIR) framework. The methods were tested extensively using both peripheral blood smears and renal glomeruli specimens. The datasets and performance were evaluated by two pathologists to determine the concordance. The CBIR algorithms that were developed can reliably retrieve the candidate image patches exhibiting intensity and morphological characteristics that are most similar to a given query image. The methods described in this paper are able to reliably discriminate among subtle staining differences and spatial pattern distributions. By integrating a newly developed dual-similarity relevance feedback module into the CBIR framework, the CBIR results were improved substantially. By aggregating the computational power of high performance computing (HPC) and cloud resources, we demonstrated that the method can be successfully executed in minutes on the Cloud compared to weeks using standard computers. In this paper, we present a set of newly developed CBIR algorithms and validate them using two types of specimens: peripheral blood smears and renal glomeruli.
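
    The retrieval step of a CBIR system can be sketched in a few lines: describe each patch by a feature vector and rank the library by similarity to the query. The histogram descriptor, cosine similarity, and synthetic "patches" below are illustrative stand-ins, not the paper's actual intensity and morphological features.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def intensity_histogram(patch: np.ndarray, bins: int = 16) -> np.ndarray:
        """Simple content descriptor: normalized intensity histogram."""
        h, _ = np.histogram(patch, bins=bins, range=(0, 256))
        return h / h.sum()

    def retrieve(query: np.ndarray, library: list, k: int = 3) -> list:
        """Rank library patches by cosine similarity of their descriptors."""
        q = intensity_histogram(query)
        sims = []
        for i, patch in enumerate(library):
            f = intensity_histogram(patch)
            sims.append((np.dot(q, f) / (np.linalg.norm(q) * np.linalg.norm(f)), i))
        return [i for _, i in sorted(sims, reverse=True)[:k]]

    # Synthetic "patches": dark, mid-gray and bright populations
    library = [rng.normal(m, 10, (32, 32)).clip(0, 255) for m in (60, 60, 128, 128, 200)]
    query = rng.normal(200, 10, (32, 32)).clip(0, 255)
    print(retrieve(query, library))  # the bright patch ranks first
    ```

    A relevance-feedback module like the paper's would then reweight the similarity measure from the pathologist's accepted and rejected results before re-ranking.
    
    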

  3. Design Thinking and Cloud Manufacturing: A Study of Cloud Model Sharing Platform Based on Separated Data Log

    Directory of Open Access Journals (Sweden)

    Zhe Wei

    2013-01-01

    Full Text Available To solve the product data consistency problem caused by portable systems that cannot conduct real-time updates of product data in a mobile environment under the mass-customization production mode, a new log-based optimistic replication method for product data is presented. This paper focuses on the design-thinking provider, probing into a manufacturing-resource design-thinking cloud platform based on manufacturing-resource-locating technologies, and also discusses several application scenarios of cloud locating technologies in the manufacturing environment. The actual demand of manufacturing creates a new mode that is service-oriented and has high efficiency and low consumption. Finally, this mode differs from the crowd-sourcing application model of Local Motors: the sharing-platform operator is responsible for the master plan for the platform, proposing an open interface standard and establishing a service operation mode.

  4. A Novel Cloud-Based Platform for Implementation of Oblivious Power Routing for Clusters of Microgrids

    DEFF Research Database (Denmark)

    Broojeni, Kianoosh; Amini, M. Hadi; Nejadpak, Arash

    2016-01-01

    is verified by MATLAB simulation. We also present a comprehensive cloud-based platform for further implementation of the proposed algorithm on the OPAL-RT real-time digital simulation system. The communication paths between the microgrids and the cloud environment can be emulated by OMNeT++.

  5. Trust-Enhanced Cloud Service Selection Model Based on QoS Analysis.

    Science.gov (United States)

    Pan, Yuchen; Ding, Shuai; Fan, Wenjuan; Li, Jing; Yang, Shanlin

    2015-01-01

    Cloud computing technology plays a very important role in many areas, such as the construction and development of the smart city. Meanwhile, numerous cloud services appear on cloud-based platforms. How to select trustworthy cloud services therefore remains a significant problem on such platforms, and it has been extensively investigated owing to the ever-growing needs of users. However, trust relationships in social networks have not been taken into account in existing methods of cloud service selection and recommendation. In this paper, we propose a cloud service selection model based on trust-enhanced similarity. Firstly, the direct, indirect, and hybrid trust degrees are measured based on the interaction frequencies among users. Secondly, we estimate the overall similarity by combining the experience usability, measured by the Jaccard coefficient, with the numerical distance, computed by the Pearson correlation coefficient. Then, by using the trust degree to modify the basic similarity, we obtain a trust-enhanced similarity. Finally, we utilize the trust-enhanced similarity to find similar trusted neighbors and predict the missing QoS values as the basis of cloud service selection and recommendation. The experimental results show that our approach is able to obtain optimal results by adjusting parameters and exhibits high effectiveness. The cloud service rankings produced by our model also have better QoS properties than those of other methods in the comparison experiments.
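
    The similarity construction described above can be sketched as follows. The toy data, the rescaling of the Pearson coefficient into [0, 1], the mixing weight alpha, and the interaction-count cap are all invented for illustration and are not the authors' exact formulas.

    ```python
    import math

    # Hypothetical toy data: services each user has invoked, user QoS ratings,
    # and pairwise interaction counts standing in for the social network.
    invoked = {"u1": {"s1", "s2", "s3"}, "u2": {"s2", "s3", "s4"}}
    ratings = {"u1": {"s2": 4.0, "s3": 5.0}, "u2": {"s2": 3.5, "s3": 4.5}}
    interactions = {("u1", "u2"): 8}
    MAX_INTERACTIONS = 10  # assumed normalization cap

    def jaccard(a, b):
        """Experience usability: overlap of invoked service sets."""
        return len(invoked[a] & invoked[b]) / len(invoked[a] | invoked[b])

    def pearson(a, b):
        """Numerical distance: correlation of ratings on co-invoked services."""
        common = ratings[a].keys() & ratings[b].keys()
        xs = [ratings[a][s] for s in common]
        ys = [ratings[b][s] for s in common]
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = math.sqrt(sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys))
        return num / den if den else 0.0

    def trust(a, b):
        """Direct trust degree from interaction frequency, normalized to [0, 1]."""
        count = interactions.get((a, b), interactions.get((b, a), 0))
        return min(count / MAX_INTERACTIONS, 1.0)

    def trust_enhanced_similarity(a, b, alpha=0.5):
        # Overall similarity combines both measures; the trust degree then
        # modifies the basic similarity, as in the paper's scheme.
        base = alpha * jaccard(a, b) + (1 - alpha) * (pearson(a, b) + 1) / 2
        return trust(a, b) * base

    print(round(trust_enhanced_similarity("u1", "u2"), 3))  # 0.6
    ```

    Neighbors with the highest trust-enhanced similarity would then supply the missing QoS predictions used for ranking.
    
    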

  6. The Study of Pallet Pooling Information Platform Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jia-bin Li

    2018-01-01

    Full Text Available Effective implementation of a pallet pooling system needs a strong information platform for support. Through an analysis of existing pallet pooling information platforms (PPIPs), the paper points out that existing studies of PPIPs are mainly based on traditional IT infrastructures and technologies, which impose software, hardware, resource-utilization, and process restrictions. Because the advantages of cloud computing technology, such as strong computing power, high flexibility, and low cost, meet the requirements of the PPIP well, this paper gives a PPIP architecture based on cloud computing with two parts: the user client and the cloud services. The cloud services include three layers: IaaS, PaaS, and SaaS. A method for deploying the PPIP based on cloud computing is proposed finally.

  7. Keyword-based Ciphertext Search Algorithm under Cloud Storage

    Directory of Open Access Journals (Sweden)

    Ren Xunyi

    2016-01-01

    Full Text Available With the development of network storage services, cloud storage has the advantages of high scalability, low cost, unrestricted access, and easy management. These advantages lead more and more small and medium enterprises to outsource large quantities of data to a third party. This approach frees small and medium enterprises from the costs of construction and maintenance, so it has broad market prospects. However, many cloud storage service providers cannot guarantee data security, which results in the leakage of user data, so many users still resort to traditional storage methods. This has become one of the important factors hindering the development of cloud storage. In this article, a keyword index is established by extracting keywords from the ciphertext data. After that, the encrypted data and the encrypted index are uploaded to the cloud server together. Users obtain the related ciphertext by searching the encrypted index, which addresses the data leakage problem.
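
    One minimal way to realize such an encrypted keyword index is a keyed-hash index: the server stores HMAC digests of keywords and can match a search "trapdoor" without learning the keyword itself. The key and document set below are invented, document encryption itself is omitted, and real searchable-encryption schemes add protections (e.g., against frequency analysis) that this sketch lacks.

    ```python
    import hmac
    import hashlib
    from collections import defaultdict

    INDEX_KEY = b"index-key"   # hypothetical secret held by the data owner

    def trapdoor(keyword: str) -> str:
        """Deterministic keyed digest of a keyword: the server can match it
        without learning the keyword itself."""
        return hmac.new(INDEX_KEY, keyword.lower().encode(), hashlib.sha256).hexdigest()

    def build_index(docs: dict) -> dict:
        """Map each keyword digest to the set of documents containing it."""
        index = defaultdict(set)
        for doc_id, text in docs.items():
            for word in set(text.lower().split()):
                index[trapdoor(word)].add(doc_id)
        return index   # uploaded alongside the (separately) encrypted documents

    def search(index, keyword: str) -> set:
        """The user sends only the trapdoor; the server looks it up blindly."""
        return index.get(trapdoor(keyword), set())

    docs = {"d1": "cloud storage security", "d2": "cloud scalability"}
    index = build_index(docs)
    print(sorted(search(index, "cloud")))     # ['d1', 'd2']
    print(sorted(search(index, "security")))  # ['d1']
    ```

    The server only ever sees digests and ciphertexts, so a breach at the provider reveals neither the keywords nor the documents.
    
    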

  8. Scenario-Based Digital Forensics Challenges in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Erik Miranda Lopez

    2016-10-01

    Full Text Available The aim of digital forensics is to extract information to answer the 5Ws (Why, When, Where, What, and Who) from the data extracted from the evidence. In order to achieve this, most digital forensic processes assume absolute control of the digital evidence. However, in a cloud environment forensic investigation, this is not always possible. Additionally, the unique characteristics of cloud computing create new technical, legal and architectural challenges when conducting a forensic investigation. We propose a hypothetical scenario to uncover and explain the challenges forensic practitioners face during cloud investigations, and we also provide solutions to address these challenges. Our hypothetical case scenario has shown that, in the long run, better live forensic tools, new methods tailored for cloud investigations, and new procedures and standards are indeed needed. Furthermore, we have come to the conclusion that the biggest challenge in forensic investigations is not technical but legal.

  9. Introducing two Random Forest based methods for cloud detection in remote sensing images

    Science.gov (United States)

    Ghasemian, Nafiseh; Akhoondzadeh, Mehdi

    2018-07-01

    Cloud detection is a necessary phase in satellite image processing to retrieve atmospheric and lithospheric parameters. Currently, some cloud detection methods based on the Random Forest (RF) model have been proposed, but they do not consider both spectral and textural characteristics of the image. Furthermore, they have not been tested in the presence of snow/ice. In this paper, we introduce two RF based algorithms, Feature Level Fusion Random Forest (FLFRF) and Decision Level Fusion Random Forest (DLFRF), to incorporate visible, infrared (IR) and thermal spectral and textural features (FLFRF), including Gray Level Co-occurrence Matrix (GLCM) and Robust Extended Local Binary Pattern (RELBP_CI), or visible, IR and thermal classifiers (DLFRF), for highly accurate cloud detection on remote sensing images. FLFRF first fuses visible, IR and thermal features. Thereafter, it uses the RF model to classify pixels as cloud, snow/ice and background or thick cloud, thin cloud and background. DLFRF considers visible, IR and thermal features (both spectral and textural) separately and inserts each set of features into the RF model. Then, it holds the vote matrix of each run of the model. Finally, it fuses the classifiers using the majority vote method. To demonstrate the effectiveness of the proposed algorithms, 10 Terra MODIS and 15 Landsat 8 OLI/TIRS images with different spatial resolutions are used in this paper. Quantitative analyses are based on manually selected ground truth data. Results show that after adding RELBP_CI to the input feature set, cloud detection accuracy improves. Also, the average cloud kappa values of FLFRF and DLFRF on MODIS images (1 and 0.99) are higher than those of other machine learning methods: Linear Discriminant Analysis (LDA), Classification And Regression Tree (CART), K Nearest Neighbor (KNN) and Support Vector Machine (SVM) (0.96). The average snow/ice kappa values of FLFRF and DLFRF on MODIS images (1 and 0.85) are higher than those of other traditional methods.
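
    Decision-level fusion by majority vote, the final step of DLFRF, can be sketched as follows. The three threshold "classifiers" and the band values are invented stand-ins for the per-feature-set Random Forests of the paper.

    ```python
    # Each "classifier" is a simple threshold rule on one band, standing in for
    # the visible, IR and thermal Random Forests of DLFRF (values are invented).
    def visible_clf(px):  return "cloud" if px["vis"] > 0.4 else "background"
    def ir_clf(px):       return "cloud" if px["ir"] > 0.3 else "background"
    def thermal_clf(px):  return "cloud" if px["bt"] < 265.0 else "background"

    def majority_vote(px):
        """Decision-level fusion: each classifier votes, the majority wins."""
        votes = [visible_clf(px), ir_clf(px), thermal_clf(px)]
        return max(set(votes), key=votes.count)

    pixels = [
        {"vis": 0.7, "ir": 0.5, "bt": 250.0},  # bright and cold: all vote cloud
        {"vis": 0.1, "ir": 0.1, "bt": 290.0},  # dark and warm: all vote background
        {"vis": 0.6, "ir": 0.2, "bt": 255.0},  # 2 of 3 classifiers vote cloud
    ]
    print([majority_vote(px) for px in pixels])  # ['cloud', 'background', 'cloud']
    ```

    Fusing at the decision level lets each feature set keep its own classifier, so a weak channel (here, the IR rule on the third pixel) is outvoted rather than contaminating a joint feature space.
    
    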

  10. Considering polarization in MODIS-based cloud property retrievals by using a vector radiative transfer code

    International Nuclear Information System (INIS)

    Yi, Bingqi; Huang, Xin; Yang, Ping; Baum, Bryan A.; Kattawar, George W.

    2014-01-01

    In this study, a full-vector, adding–doubling radiative transfer model is used to investigate the influence of the polarization state on cloud property retrievals from Moderate Resolution Imaging Spectroradiometer (MODIS) satellite observations. Two sets of lookup tables (LUTs) are developed for the retrieval purposes, both of which provide water cloud and ice cloud reflectivity functions at two wavelengths in various sun-satellite viewing geometries. However, only one of the LUTs considers polarization. The MODIS reflectivity observations at 0.65 μm (band 1) and 2.13 μm (band 7) are used to infer the cloud optical thickness and particle effective diameter, respectively. Results indicate that the retrievals for both water cloud and ice cloud show considerable sensitivity to polarization. The retrieved water and ice cloud effective diameter and optical thickness differences can vary by as much as ±15% due to polarization state considerations. In particular, the polarization state has more influence on completely smooth ice particles than on severely roughened ice particles. - Highlights: • Impact of polarization on satellite-based retrieval of water/ice cloud properties is studied. • Inclusion of polarization can change water/ice optical thickness and effective diameter values by up to ±15%. • Influence of polarization on cloud property retrievals depends on sun-satellite viewing geometries
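
    The lookup-table retrieval described above can be sketched as a simple grid inversion: precompute reflectivities over a (optical thickness, effective diameter) grid with the radiative transfer code, then find the grid point whose simulated reflectivities best match the observation. The toy forward model below is invented for the sketch; the real LUTs come from the adding-doubling code and depend on sun-satellite geometry.

    ```python
    import numpy as np

    # Toy forward model standing in for the adding-doubling code: band-1 (0.65 um)
    # reflectivity grows with optical thickness (tau); band-7 (2.13 um) falls with
    # effective diameter (deff). The functional forms are invented.
    taus  = np.linspace(1.0, 50.0, 50)
    deffs = np.linspace(10.0, 60.0, 26)
    T, D = np.meshgrid(taus, deffs, indexing="ij")
    lut = np.stack([T / (T + 6.0),                               # band-1
                    (1.0 - D / 100.0) * T / (T + 6.0)], axis=-1)  # band-7

    def retrieve(r065, r213):
        """Invert the LUT: nearest grid point in reflectivity space."""
        err = (lut[..., 0] - r065) ** 2 + (lut[..., 1] - r213) ** 2
        i, j = np.unravel_index(np.argmin(err), err.shape)
        return taus[i], deffs[j]

    # Simulate an observation from a known cloud, then invert it.
    true_tau, true_deff = 20.0, 30.0
    obs = (true_tau / (true_tau + 6.0),
           (1.0 - true_deff / 100.0) * true_tau / (true_tau + 6.0))
    tau_hat, deff_hat = retrieve(*obs)
    print(tau_hat, deff_hat)  # lands on the grid point nearest (20, 30)
    ```

    The paper's ±15% sensitivity arises because a LUT computed with polarization shifts the reflectivity surface, so the same observed pair inverts to a different (tau, deff) grid point.
    
    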

  11. Point-Cloud Compression for Vehicle-Based Mobile Mapping Systems Using Portable Network Graphics

    Science.gov (United States)

    Kohira, K.; Masuda, H.

    2017-09-01

    A mobile mapping system is effective for capturing dense point-clouds of roads and roadside objects. Point-clouds of urban areas, residential areas, and arterial roads are useful for maintenance of infrastructure, map creation, and automatic driving. However, the data size of point-clouds measured over large areas is enormous. A large storage capacity is required to store such point-clouds, and heavy loads are placed on the network if point-clouds are transferred through it. Therefore, it is desirable to reduce the data size of point-clouds without deterioration of quality. In this research, we propose a novel point-cloud compression method for vehicle-based mobile mapping systems. In our compression method, point-clouds are mapped onto 2D pixels using GPS time and the parameters of the laser scanner. Then, the images are encoded in the Portable Network Graphics (PNG) format and compressed using the PNG algorithm. In our experiments, our method could efficiently compress point-clouds without deteriorating the quality.
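
    The core idea, mapping the scan onto a 2D image and compressing it losslessly, can be sketched with standard-library tools alone. zlib's DEFLATE is the same entropy coder PNG uses internally; the scan geometry (one row per revolution, one column per firing angle) and the range values below are invented for the sketch.

    ```python
    import struct
    import zlib

    # Hypothetical scan: one row per laser revolution (indexed by GPS time),
    # one column per firing angle; cell value = range quantized to millimetres.
    ROWS, COLS = 100, 360
    ranges_mm = [[1000 + (r * COLS + c) % 5000 for c in range(COLS)]
                 for r in range(ROWS)]

    # Serialize the 2D image as 16-bit little-endian pixels, then apply DEFLATE,
    # the lossless algorithm PNG uses internally.
    raw = b"".join(struct.pack("<H", v) for row in ranges_mm for v in row)
    compressed = zlib.compress(raw, level=9)
    print(len(raw), len(compressed))  # the structured image compresses well

    # Lossless round trip: every range value is recovered exactly.
    restored = [v for (v,) in struct.iter_unpack("<H", zlib.decompress(compressed))]
    assert restored == [v for row in ranges_mm for v in row]
    ```

    Because neighboring pixels in the time-angle image come from nearby surface points, their ranges are highly correlated, which is exactly what makes PNG-style prediction and entropy coding effective here.
    
    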

  12. POINT-CLOUD COMPRESSION FOR VEHICLE-BASED MOBILE MAPPING SYSTEMS USING PORTABLE NETWORK GRAPHICS

    Directory of Open Access Journals (Sweden)

    K. Kohira

    2017-09-01

    Full Text Available A mobile mapping system is effective for capturing dense point-clouds of roads and roadside objects. Point-clouds of urban areas, residential areas, and arterial roads are useful for maintenance of infrastructure, map creation, and automatic driving. However, the data size of point-clouds measured over large areas is enormous. A large storage capacity is required to store such point-clouds, and heavy loads are placed on the network if point-clouds are transferred through it. Therefore, it is desirable to reduce the data size of point-clouds without deterioration of quality. In this research, we propose a novel point-cloud compression method for vehicle-based mobile mapping systems. In our compression method, point-clouds are mapped onto 2D pixels using GPS time and the parameters of the laser scanner. Then, the images are encoded in the Portable Network Graphics (PNG) format and compressed using the PNG algorithm. In our experiments, our method could efficiently compress point-clouds without deteriorating the quality.

  13. Provenance based data integrity checking and verification in cloud environments.

    Science.gov (United States)

    Imran, Muhammad; Hlavacs, Helmut; Haq, Inam Ul; Jan, Bilal; Khan, Fakhri Alam; Ahmad, Awais

    2017-01-01

    Cloud computing is a recent tendency in IT that moves computing and data away from desktop and hand-held devices into large scale processing hubs and data centers, respectively. It has been proposed as an effective solution for data outsourcing and on-demand computing to control the rising cost of IT setups and management in enterprises. However, with Cloud platforms, users' data is moved into remotely located storages, such that users lose control over their data. This unique feature of the Cloud faces many security and privacy challenges which need to be clearly understood and resolved. One of the important concerns that needs to be addressed is providing proof of data integrity, i.e., correctness of the user's data stored in the Cloud storage. The data in Clouds is physically not accessible to the users. Therefore, a mechanism is required through which users can check if the integrity of their valuable data is maintained or compromised. For this purpose some methods are proposed, like mirroring, checksumming and using third-party auditors, amongst others. However, these methods use extra storage space by maintaining multiple copies of data, or the presence of a third-party verifier is required. In this paper, we address the problem of proving data integrity in Cloud computing by proposing a scheme through which users are able to check the integrity of their data stored in Clouds. In addition, users can track violations of data integrity if they occur. For this purpose, we utilize a relatively new concept in Cloud computing called "Data Provenance". Our scheme is capable of reducing the need for any third-party services, additional hardware support and the replication of data items on the client side for integrity checking.
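
    A hash-chained provenance log is one simple way to let a client detect integrity violations without a third-party auditor or full replicas: each log entry commits to the previous one, so tampering with any record or with the current data breaks the chain. This is an illustrative mechanism under invented data, not the authors' actual scheme.

    ```python
    import hashlib
    import json

    def entry_hash(entry: dict, prev_hash: str) -> str:
        """Commit to this entry and to the previous entry's hash."""
        payload = json.dumps(entry, sort_keys=True) + prev_hash
        return hashlib.sha256(payload.encode()).hexdigest()

    def append(log: list, operation: str, data: str) -> None:
        """Record a provenance event: what happened and a digest of the data."""
        prev = log[-1]["hash"] if log else "0" * 64
        entry = {"op": operation,
                 "data_digest": hashlib.sha256(data.encode()).hexdigest()}
        entry["hash"] = entry_hash(entry, prev)
        log.append(entry)

    def verify(log: list, data: str) -> bool:
        """Recompute the chain; any modified record or datum breaks a link."""
        prev = "0" * 64
        for entry in log:
            body = {k: v for k, v in entry.items() if k != "hash"}
            if entry["hash"] != entry_hash(body, prev):
                return False
            prev = entry["hash"]
        return log[-1]["data_digest"] == hashlib.sha256(data.encode()).hexdigest()

    log = []
    append(log, "create", "v1 of the outsourced file")
    append(log, "update", "v2 of the outsourced file")
    print(verify(log, "v2 of the outsourced file"))   # True: integrity holds
    print(verify(log, "tampered file contents"))      # False: violation detected
    ```

    The client only needs to keep the tail hash locally; everything else can live with the provider, because a provider who alters history can no longer produce a chain ending in that hash.
    
    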

  14. Provenance based data integrity checking and verification in cloud environments.

    Directory of Open Access Journals (Sweden)

    Muhammad Imran

    Full Text Available Cloud computing is a recent tendency in IT that moves computing and data away from desktop and hand-held devices into large scale processing hubs and data centers, respectively. It has been proposed as an effective solution for data outsourcing and on-demand computing to control the rising cost of IT setups and management in enterprises. However, with Cloud platforms, users' data is moved into remotely located storages, such that users lose control over their data. This unique feature of the Cloud faces many security and privacy challenges which need to be clearly understood and resolved. One of the important concerns that needs to be addressed is providing proof of data integrity, i.e., correctness of the user's data stored in the Cloud storage. The data in Clouds is physically not accessible to the users. Therefore, a mechanism is required through which users can check if the integrity of their valuable data is maintained or compromised. For this purpose some methods are proposed, like mirroring, checksumming and using third-party auditors, amongst others. However, these methods use extra storage space by maintaining multiple copies of data, or the presence of a third-party verifier is required. In this paper, we address the problem of proving data integrity in Cloud computing by proposing a scheme through which users are able to check the integrity of their data stored in Clouds. In addition, users can track violations of data integrity if they occur. For this purpose, we utilize a relatively new concept in Cloud computing called "Data Provenance". Our scheme is capable of reducing the need for any third-party services, additional hardware support and the replication of data items on the client side for integrity checking.

  15. Making Cloud-based Systems Elasticity Testing Reproducible

    OpenAIRE

    Albonico , Michel; Mottu , Jean-Marie; Sunyé , Gerson; Alvares , Frederico

    2017-01-01

    International audience; Elastic cloud infrastructures vary computational resources at runtime, i.e., elasticity, which is error-prone. That makes testing elasticity crucial for those systems. Those errors are detected thanks to tests that should run deterministically many times all along the development. However, elasticity testing reproduction requires several features not supported natively by the main cloud providers, such as Amazon EC2. We identify three requirements that we c...

  16. Provenance based data integrity checking and verification in cloud environments

    Science.gov (United States)

    Haq, Inam Ul; Jan, Bilal; Khan, Fakhri Alam; Ahmad, Awais

    2017-01-01

    Cloud computing is a recent tendency in IT that moves computing and data away from desktop and hand-held devices into large scale processing hubs and data centers, respectively. It has been proposed as an effective solution for data outsourcing and on-demand computing to control the rising cost of IT setups and management in enterprises. However, with Cloud platforms, users’ data is moved into remotely located storages, such that users lose control over their data. This unique feature of the Cloud faces many security and privacy challenges which need to be clearly understood and resolved. One of the important concerns that needs to be addressed is providing proof of data integrity, i.e., correctness of the user’s data stored in the Cloud storage. The data in Clouds is physically not accessible to the users. Therefore, a mechanism is required through which users can check if the integrity of their valuable data is maintained or compromised. For this purpose some methods are proposed, like mirroring, checksumming and using third-party auditors, amongst others. However, these methods use extra storage space by maintaining multiple copies of data, or the presence of a third-party verifier is required. In this paper, we address the problem of proving data integrity in Cloud computing by proposing a scheme through which users are able to check the integrity of their data stored in Clouds. In addition, users can track violations of data integrity if they occur. For this purpose, we utilize a relatively new concept in Cloud computing called “Data Provenance”. Our scheme is capable of reducing the need for any third-party services, additional hardware support and the replication of data items on the client side for integrity checking. PMID:28545151

  17. QoE Guarantee Scheme Based on Cooperative Cognitive Cloud and Opportunistic Weight Particle Swarm

    Directory of Open Access Journals (Sweden)

    Weihang Shi

    2015-01-01

    Full Text Available It is well known that the Internet application of cloud services may be seriously affected by the inefficiency of cloud computing and inaccurate evaluation of quality of experience (QoE). In this paper, based on construction algorithms for a cooperative cognitive cloud platform and an optimization algorithm using opportunity-weight particle swarm clustering, a QoE guarantee mechanism is proposed. The mechanism, through the cooperation of the request-sending users and their cognitive neighbor users, combines the cooperation of sub-cloud platforms and constructs the optimal cloud platform for each service. At the same time, the particle swarm optimization algorithm can be enhanced dynamically according to the weights of the various opportunity requests, which optimizes the cooperative cognitive cloud platform. Finally, the QoE guarantee scheme is proposed by combining the opportunity-weight particle swarm optimization algorithm with the collaborative cognitive cloud platform. The experimental results show that the proposed mechanism is superior to a QoE guarantee scheme based only on a cooperative cloud and to one based only on particle swarm optimization, with advantages in optimization fitness, cloud computing service execution efficiency, and throughput performance.
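
    A minimal particle swarm optimizer on a toy fitness function illustrates the optimization core of such a scheme. The paper's "opportunity weight" is approximated here by the standard inertia weight, and the fitness function, swarm size, and coefficients are all illustrative, not the authors' configuration.

    ```python
    import random

    random.seed(42)

    def sphere(x):
        """Toy fitness: squared distance from the optimum at the origin."""
        return sum(v * v for v in x)

    DIM, SWARM, ITERS = 2, 30, 200
    W, C1, C2 = 0.7, 1.5, 1.5   # inertia weight, cognitive and social pulls

    pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(SWARM)]
    vel = [[0.0] * DIM for _ in range(SWARM)]
    pbest = [p[:] for p in pos]                 # each particle's best position
    gbest = min(pbest, key=sphere)[:]           # the swarm's best position

    for _ in range(ITERS):
        for i in range(SWARM):
            for d in range(DIM):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (W * vel[i][d]
                             + C1 * r1 * (pbest[i][d] - pos[i][d])
                             + C2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if sphere(pos[i]) < sphere(pbest[i]):
                pbest[i] = pos[i][:]
                if sphere(pos[i]) < sphere(gbest):
                    gbest = pos[i][:]

    print(f"best fitness: {sphere(gbest):.2e}")  # converges close to zero
    ```

    In the paper's setting the fitness would instead score candidate sub-cloud assignments, and the inertia term would be driven by the opportunity-request weights rather than held constant.
    
    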

  18. A Cloud-Based Architecture for Smart Video Surveillance

    Science.gov (United States)

    Valentín, L.; Serrano, S. A.; Oves García, R.; Andrade, A.; Palacios-Alonso, M. A.; Sucar, L. Enrique

    2017-09-01

    Turning a city into a smart city has attracted considerable attention. A smart city can be seen as a city that uses digital technology not only to improve the quality of people's life, but also to have a positive impact on the environment and, at the same time, offer efficient and easy-to-use services. A fundamental aspect to be considered in a smart city is people's safety and welfare; therefore, having a good security system becomes a necessity, because it allows us to detect and identify potential risk situations and then take appropriate decisions to help people or even prevent criminal acts. In this paper we present an architecture for automated video surveillance based on the cloud computing schema, capable of acquiring a video stream from a set of cameras connected to the network, processing that information, detecting, labeling and highlighting security-relevant events automatically, storing the information, and providing situational awareness in order to minimize response time to take the appropriate action.

  19. Development of Cloud-Based UAV Monitoring and Management System.

    Science.gov (United States)

    Itkin, Mason; Kim, Mihui; Park, Younghee

    2016-11-15

    Unmanned aerial vehicles (UAVs) are an emerging technology with the potential to revolutionize commercial industries and the public domain outside of the military. UAVs would be able to speed up rescue and recovery operations from natural disasters and can be used for autonomous delivery systems (e.g., Amazon Prime Air). An increase in the number of active UAV systems in dense urban areas is attributed to an influx of UAV hobbyists and commercial multi-UAV systems. As airspace for UAV flight becomes more limited, it is important to monitor and manage many UAV systems using modern collision avoidance techniques. In this paper, we propose a cloud-based web application that provides real-time flight monitoring and management for UAVs. For each connected UAV, detailed UAV sensor readings from the accelerometer, GPS sensor, ultrasonic sensor and visual position cameras are provided, along with status reports from the smaller internal components of UAVs (i.e., motor and battery). The dynamic map overlay visualizes active flight paths and current UAV locations, allowing the user to monitor all aircraft easily. Our system detects and prevents potential collisions by automatically adjusting UAV flight paths and then alerting users to the change. We develop our proposed system and demonstrate its feasibility and performance through simulation.
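
    The detect-and-resolve loop can be sketched as a look-ahead separation check over predicted positions, with an altitude offset as the resolution. The UAV states, separation threshold, and flight-level resolution below are invented for illustration; the paper's system works on live telemetry and adjusts full flight paths.

    ```python
    import math

    # Hypothetical UAV states: position (x, y, z in metres) and velocity (m/s).
    uavs = {
        "uav1": {"pos": [0.0, 0.0, 50.0],   "vel": [10.0, 0.0, 0.0]},
        "uav2": {"pos": [100.0, 0.0, 50.0], "vel": [-10.0, 0.0, 0.0]},
    }
    SAFE_DISTANCE = 20.0   # assumed minimum separation, metres
    HORIZON = 10.0         # look-ahead window, seconds
    STEP = 0.5             # prediction time step, seconds

    def predicted(u, t):
        """Constant-velocity position prediction t seconds ahead."""
        return [p + v * t for p, v in zip(u["pos"], u["vel"])]

    def first_conflict(a, b):
        """Scan the look-ahead window for a separation violation."""
        t = 0.0
        while t <= HORIZON:
            if math.dist(predicted(a, t), predicted(b, t)) < SAFE_DISTANCE:
                return t
            t += STEP
        return None

    t = first_conflict(uavs["uav1"], uavs["uav2"])
    print(f"conflict at t={t}s")           # the head-on pair meets mid-path
    uavs["uav2"]["pos"][2] += 30.0         # resolution: offset one flight level
    print(first_conflict(uavs["uav1"], uavs["uav2"]))  # None: deconflicted
    ```

    Running this check pairwise over all connected UAVs on each telemetry update is what lets the cloud side adjust paths before the aircraft themselves come within sensor range of each other.
    
    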

  20. Development of Cloud-Based UAV Monitoring and Management System

    Directory of Open Access Journals (Sweden)

    Mason Itkin

    2016-11-01

    Full Text Available Unmanned aerial vehicles (UAVs) are an emerging technology with the potential to revolutionize commercial industries and the public domain outside of the military. UAVs would be able to speed up rescue and recovery operations from natural disasters and can be used for autonomous delivery systems (e.g., Amazon Prime Air). An increase in the number of active UAV systems in dense urban areas is attributed to an influx of UAV hobbyists and commercial multi-UAV systems. As airspace for UAV flight becomes more limited, it is important to monitor and manage many UAV systems using modern collision avoidance techniques. In this paper, we propose a cloud-based web application that provides real-time flight monitoring and management for UAVs. For each connected UAV, detailed sensor readings from the accelerometer, GPS sensor, ultrasonic sensor and visual position cameras are provided, along with status reports from the smaller internal components of the UAV (i.e., motor and battery). A dynamic map overlay visualizes active flight paths and current UAV locations, allowing the user to monitor all aircraft easily. Our system detects and prevents potential collisions by automatically adjusting UAV flight paths and then alerting users to the change. We develop the proposed system and demonstrate its feasibility and performance through simulation.

  1. Development of Cloud-Based UAV Monitoring and Management System

    Science.gov (United States)

    Itkin, Mason; Kim, Mihui; Park, Younghee

    2016-01-01

    Unmanned aerial vehicles (UAVs) are an emerging technology with the potential to revolutionize commercial industries and the public domain outside of the military. UAVs would be able to speed up rescue and recovery operations from natural disasters and can be used for autonomous delivery systems (e.g., Amazon Prime Air). An increase in the number of active UAV systems in dense urban areas is attributed to an influx of UAV hobbyists and commercial multi-UAV systems. As airspace for UAV flight becomes more limited, it is important to monitor and manage many UAV systems using modern collision avoidance techniques. In this paper, we propose a cloud-based web application that provides real-time flight monitoring and management for UAVs. For each connected UAV, detailed sensor readings from the accelerometer, GPS sensor, ultrasonic sensor and visual position cameras are provided, along with status reports from the smaller internal components of the UAV (i.e., motor and battery). A dynamic map overlay visualizes active flight paths and current UAV locations, allowing the user to monitor all aircraft easily. Our system detects and prevents potential collisions by automatically adjusting UAV flight paths and then alerting users to the change. We develop the proposed system and demonstrate its feasibility and performance through simulation. PMID:27854267

  2. A CLOUD-BASED ARCHITECTURE FOR SMART VIDEO SURVEILLANCE

    Directory of Open Access Journals (Sweden)

    L. Valentín

    2017-09-01

    Full Text Available Turning a city into a smart city has attracted considerable attention. A smart city can be seen as a city that uses digital technology not only to improve the quality of people's lives, but also to have a positive impact on the environment while offering efficient and easy-to-use services. A fundamental aspect of a smart city is people's safety and welfare, so a good security system becomes a necessity: it allows potential risk situations to be detected and identified, so that appropriate decisions can be taken to help people or even to prevent criminal acts. In this paper we present a cloud-based architecture for automated video surveillance that acquires a video stream from a set of cameras connected to the network, processes that information to detect, label and highlight security-relevant events automatically, stores the information, and provides situational awareness in order to minimize the response time for taking the appropriate action.

  3. Towards a Cloud Based Smart Traffic Management Framework

    Science.gov (United States)

    Rahimi, M. M.; Hakimpour, F.

    2017-09-01

    Traffic big data has brought many opportunities for traffic management applications. However, several challenges, such as the heterogeneity, storage, management, processing and analysis of traffic big data, may hinder its efficient, real-time application. These challenges call for a well-adapted distributed framework for smart traffic management that can efficiently handle big traffic data integration, indexing, query processing, mining and analysis. In this paper, we present a novel, distributed, scalable and efficient framework for traffic management applications. The proposed cloud-computing-based framework addresses the technical challenges of efficient, real-time storage, management, processing and analysis of traffic big data. To evaluate the framework, we used OpenStreetMap (OSM) real trajectories and a road network in a distributed environment. Our evaluation results indicate that the data import speed of the framework exceeds 8000 records per second when the dataset size approaches 5 million records. We also evaluated data retrieval performance in the proposed framework: retrieval speed exceeds 15000 records per second at the same dataset size. Finally, we evaluated the scalability and performance of the framework by parallelising a critical pre-analysis step in transportation applications. The results show that the proposed framework achieves considerable performance and efficiency in traffic management applications.
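    A records-per-second figure like those quoted above is straightforward to measure. The sketch below times a bulk import into an in-memory dictionary standing in for the distributed backend; the record layout and store are assumptions for illustration.

```python
# Time a bulk import of trajectory records and report the achieved
# throughput in records per second.

import time

def import_records(store, records):
    """Bulk-insert records keyed by id and return records/second achieved."""
    start = time.perf_counter()
    for rec in records:
        store[rec["id"]] = rec
    elapsed = time.perf_counter() - start
    return len(records) / elapsed if elapsed > 0 else float("inf")

store = {}
records = [{"id": i, "lat": 35.7, "lon": 51.4, "t": i} for i in range(100_000)]
rate = import_records(store, records)
print(f"{len(store)} records imported at {rate:,.0f} records/second")
```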

  4. TOWARDS A CLOUD BASED SMART TRAFFIC MANAGEMENT FRAMEWORK

    Directory of Open Access Journals (Sweden)

    M. M. Rahimi

    2017-09-01

    Full Text Available Traffic big data has brought many opportunities for traffic management applications. However, several challenges, such as the heterogeneity, storage, management, processing and analysis of traffic big data, may hinder its efficient, real-time application. These challenges call for a well-adapted distributed framework for smart traffic management that can efficiently handle big traffic data integration, indexing, query processing, mining and analysis. In this paper, we present a novel, distributed, scalable and efficient framework for traffic management applications. The proposed cloud-computing-based framework addresses the technical challenges of efficient, real-time storage, management, processing and analysis of traffic big data. To evaluate the framework, we used OpenStreetMap (OSM) real trajectories and a road network in a distributed environment. Our evaluation results indicate that the data import speed of the framework exceeds 8000 records per second when the dataset size approaches 5 million records. We also evaluated data retrieval performance in the proposed framework: retrieval speed exceeds 15000 records per second at the same dataset size. Finally, we evaluated the scalability and performance of the framework by parallelising a critical pre-analysis step in transportation applications. The results show that the proposed framework achieves considerable performance and efficiency in traffic management applications.

  5. Knowledge Based Cloud FE Simulation of Sheet Metal Forming Processes.

    Science.gov (United States)

    Zhou, Du; Yuan, Xi; Gao, Haoxiang; Wang, Ailing; Liu, Jun; El Fakir, Omer; Politis, Denis J; Wang, Liliang; Lin, Jianguo

    2016-12-13

    The use of Finite Element (FE) simulation software to adequately predict the outcome of sheet metal forming processes is crucial to enhancing the efficiency and lowering the development time of such processes, whilst reducing costs involved in trial-and-error prototyping. Recent focus on the substitution of steel components with aluminum alloy alternatives in the automotive and aerospace sectors has increased the need to simulate the forming behavior of such alloys for ever more complex component geometries. However these alloys, and in particular their high strength variants, exhibit limited formability at room temperature, and high temperature manufacturing technologies have been developed to form them. Consequently, advanced constitutive models are required to reflect the associated temperature and strain rate effects. Simulating such behavior is computationally very expensive using conventional FE simulation techniques. This paper presents a novel Knowledge Based Cloud FE (KBC-FE) simulation technique that combines advanced material and friction models with conventional FE simulations in an efficient manner thus enhancing the capability of commercial simulation software packages. The application of these methods is demonstrated through two example case studies, namely: the prediction of a material's forming limit under hot stamping conditions, and the tool life prediction under multi-cycle loading conditions.

  6. Urbanization Causes Increased Cloud Base Height and Decreased Fog in Coastal Southern California

    Science.gov (United States)

    Williams, A. Park; Schwartz, Rachel E.; Iacobellis, Sam; Seager, Richard; Cook, Benjamin I.; Still, Christopher J.; Husak, Gregory; Michaelsen, Joel

    2015-01-01

    Subtropical marine stratus clouds regulate coastal and global climate, but future trends in these clouds are uncertain. In coastal Southern California (CSCA), interannual variations in summer stratus cloud occurrence are spatially coherent across 24 airfields and dictated by positive relationships with stability above the marine boundary layer (MBL) and MBL height. Trends, however, have been spatially variable since records began in the mid-1900s due to differences in nighttime warming. Among CSCA airfields, differences in nighttime warming, but not daytime warming, are strongly and positively related to fraction of nearby urban cover, consistent with an urban heat island effect. Nighttime warming raises the near-surface dew point depression, which lifts the altitude of condensation and cloud base height, thereby reducing fog frequency. Continued urban warming, rising cloud base heights, and associated effects on energy and water balance would profoundly impact ecological and human systems in highly populated and ecologically diverse CSCA.

  7. A Highly Scalable Data Service (HSDS) using Cloud-based Storage Technologies for Earth Science Data

    Science.gov (United States)

    Michaelis, A.; Readey, J.; Votava, P.; Henderson, J.; Willmore, F.

    2017-12-01

    Cloud-based infrastructure may offer several key benefits over a traditional data center approach: scalability, built-in redundancy, security mechanisms and reduced total cost of ownership. However, most of the tools and legacy software systems developed for online data repositories within the federal government were not designed with cloud-based infrastructure in mind and do not fully take advantage of commonly available cloud technologies. Moreover, services based on object storage are well established and offered by all the leading cloud service providers (Amazon Web Services, Microsoft Azure, Google Cloud, etc.), and can often provide unmatched "scale-out" capability and data availability to a large and growing consumer base at a price point unachievable by in-house solutions. We describe a system that uses object storage rather than traditional file-system-based storage to serve earth science data. The system described is not only cost effective, but shows a performance advantage for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
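    The core idea behind object-storage-backed array data can be sketched as follows: each fixed-size chunk of a dataset lives under its own object key, so a slice request maps onto independent (and parallelizable) object reads. The key scheme, chunk size and dict-as-object-store below are illustrative assumptions, not the actual HSDS wire format.

```python
# Chunked array storage over a key/value object store: writes split the
# dataset into fixed-size chunks; reads fetch only the overlapping chunks
# and reassemble the requested slice.

CHUNK = 1000  # elements per chunk (assumed for illustration)

def chunk_key(dataset, index):
    return f"{dataset}/chunk/{index}"

def write(store, dataset, values):
    for i in range(0, len(values), CHUNK):
        store[chunk_key(dataset, i // CHUNK)] = values[i:i + CHUNK]

def read(store, dataset, start, stop):
    """Reassemble values[start:stop] from only the chunks that overlap it."""
    out = []
    for idx in range(start // CHUNK, (stop - 1) // CHUNK + 1):
        chunk = store[chunk_key(dataset, idx)]
        lo = max(start - idx * CHUNK, 0)
        hi = min(stop - idx * CHUNK, CHUNK)
        out.extend(chunk[lo:hi])
    return out

store = {}  # stands in for an object store such as S3
write(store, "temperature", list(range(5000)))
print(read(store, "temperature", 998, 1003))  # [998, 999, 1000, 1001, 1002]
```

    Because each chunk is an independent object, many clients can read disjoint chunks concurrently, which is where the "scale-out" advantage comes from.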

  8. Cloud-based preoperative planning for total hip arthroplasty: a study of accuracy, efficiency, and compliance.

    Science.gov (United States)

    Maratt, Joseph D; Srinivasan, Ramesh C; Dahl, William J; Schilling, Peter L; Urquhart, Andrew G

    2012-08-01

    As digital radiography becomes more prevalent, several systems for digital preoperative planning have become available. The purpose of this study was to evaluate the accuracy and efficiency of an inexpensive, cloud-based digital templating system, which proved comparable to acetate templating. Cloud-based templating, however, is substantially faster and more convenient than acetate templating or locally installed software. Although this is a practical solution for this particular medical application, regulatory changes are necessary before the tremendous advantages of cloud-based storage and computing can be realized in medical research and clinical practice. Copyright 2012, SLACK Incorporated.

  9. A novel cost based model for energy consumption in cloud computing.

    Science.gov (United States)

    Horri, A; Dastghaibyfard, Gh

    2015-01-01

    Cloud data centers consume enormous amounts of electrical energy. To support green cloud computing, providers need to minimize cloud infrastructure energy consumption while maintaining the required QoS. In this study, an energy consumption model is proposed for the time-shared policy in the virtualization layer of cloud environments. The cost and energy usage of the time-shared policy were modeled in the CloudSim simulator based upon results obtained from a real system, and the proposed model was then evaluated under different scenarios. The proposed model takes cache interference costs into account; these costs depend on the size of the data. The model was implemented in the CloudSim simulator, and the simulation results indicate that energy consumption may be considerable and can vary with parameters such as the quantum parameter, data size, and the number of VMs on a host. The measured results validate the model and demonstrate that there is a tradeoff between energy consumption and QoS in the cloud environment.
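    To make the quoted dependencies concrete, here is a toy cost function in the spirit of the model described: energy for a time-shared host grows with the number of VMs and with cache-interference costs that scale with data size and context-switch rate (which the quantum parameter controls). The functional form and every coefficient are assumptions for illustration; the paper's actual model is not reproduced.

```python
# Toy energy model for a time-shared host: a smaller scheduling quantum
# means more context switches, hence more cache interference and energy.

def host_energy(n_vms, data_mb, quantum_ms, interval_s=60,
                idle_w=100.0, per_vm_w=15.0, cache_j_per_switch_mb=0.002):
    """Estimated energy (joules) over one interval; all coefficients assumed."""
    switches = (interval_s * 1000 / quantum_ms) * max(n_vms - 1, 0)
    cache_j = switches * data_mb * cache_j_per_switch_mb
    return (idle_w + n_vms * per_vm_w) * interval_s + cache_j

# Shrinking the quantum raises energy for the same workload:
print(host_energy(4, data_mb=64, quantum_ms=10))   # higher
print(host_energy(4, data_mb=64, quantum_ms=100))  # lower
```

    The QoS side of the tradeoff is the flip side of the same parameter: a larger quantum saves energy but makes each VM wait longer for CPU time.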

  10. Prediction Based Proactive Thermal Virtual Machine Scheduling in Green Clouds

    Science.gov (United States)

    Kinger, Supriya; Kumar, Rajesh; Sharma, Anju

    2014-01-01

    Cloud computing has rapidly emerged as a widely accepted computing paradigm, but research on Cloud computing is still at an early stage. Cloud computing provides many advanced features, but it still has some shortcomings, such as relatively high operating costs and environmental hazards like increasing carbon footprints. These hazards can be reduced to some extent by efficient scheduling of Cloud resources. The working temperature at which a machine is currently running can be taken as a criterion for Virtual Machine (VM) scheduling. This paper proposes a new proactive technique that considers the current and maximum threshold temperatures of Server Machines (SMs) before making scheduling decisions, with the help of a temperature predictor, so that the maximum temperature is never reached. Different workload scenarios have been taken into consideration. The results obtained show that the proposed system is better than existing VM scheduling systems, which do not consider the current temperature of nodes before making scheduling decisions. Thus, a reduction in the need for cooling systems in a Cloud environment has been obtained and validated. PMID:24737962
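    The proactive policy can be sketched as: before placing a VM, predict each server machine's temperature after placement and skip any SM whose prediction crosses its threshold. The trivial linear predictor and per-VM heating value below are assumptions for illustration, not the paper's predictor.

```python
# Thermal-aware VM placement: consult a temperature predictor and never
# schedule onto a server machine (SM) that would cross its max threshold.

def predict_temp(current_c, vm_heat_c=3.0):
    """Predicted SM temperature after placing one more VM (toy linear model)."""
    return current_c + vm_heat_c

def schedule_vm(servers):
    """Return the coolest safe server's name, or None to defer scheduling."""
    safe = [s for s in servers if predict_temp(s["temp"]) < s["max_temp"]]
    if not safe:
        return None  # defer rather than overheat a node
    return min(safe, key=lambda s: s["temp"])["name"]

servers = [
    {"name": "sm1", "temp": 68.0, "max_temp": 70.0},  # would exceed threshold
    {"name": "sm2", "temp": 55.0, "max_temp": 70.0},
    {"name": "sm3", "temp": 60.0, "max_temp": 70.0},
]
print(schedule_vm(servers))  # sm2
```

    Choosing the coolest safe node spreads heat across the cluster, which is what reduces the cooling demand the abstract mentions.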

  11. Prediction Based Proactive Thermal Virtual Machine Scheduling in Green Clouds

    Directory of Open Access Journals (Sweden)

    Supriya Kinger

    2014-01-01

    Full Text Available Cloud computing has rapidly emerged as a widely accepted computing paradigm, but research on Cloud computing is still at an early stage. Cloud computing provides many advanced features, but it still has some shortcomings, such as relatively high operating costs and environmental hazards like increasing carbon footprints. These hazards can be reduced to some extent by efficient scheduling of Cloud resources. The working temperature at which a machine is currently running can be taken as a criterion for Virtual Machine (VM) scheduling. This paper proposes a new proactive technique that considers the current and maximum threshold temperatures of Server Machines (SMs) before making scheduling decisions, with the help of a temperature predictor, so that the maximum temperature is never reached. Different workload scenarios have been taken into consideration. The results obtained show that the proposed system is better than existing VM scheduling systems, which do not consider the current temperature of nodes before making scheduling decisions. Thus, a reduction in the need for cooling systems in a Cloud environment has been obtained and validated.

  12. Prediction based proactive thermal virtual machine scheduling in green clouds.

    Science.gov (United States)

    Kinger, Supriya; Kumar, Rajesh; Sharma, Anju

    2014-01-01

    Cloud computing has rapidly emerged as a widely accepted computing paradigm, but research on Cloud computing is still at an early stage. Cloud computing provides many advanced features, but it still has some shortcomings, such as relatively high operating costs and environmental hazards like increasing carbon footprints. These hazards can be reduced to some extent by efficient scheduling of Cloud resources. The working temperature at which a machine is currently running can be taken as a criterion for Virtual Machine (VM) scheduling. This paper proposes a new proactive technique that considers the current and maximum threshold temperatures of Server Machines (SMs) before making scheduling decisions, with the help of a temperature predictor, so that the maximum temperature is never reached. Different workload scenarios have been taken into consideration. The results obtained show that the proposed system is better than existing VM scheduling systems, which do not consider the current temperature of nodes before making scheduling decisions. Thus, a reduction in the need for cooling systems in a Cloud environment has been obtained and validated.

  13. The thin border between cloud and aerosol: Sensitivity of several ground based observation techniques

    Science.gov (United States)

    Calbó, Josep; Long, Charles N.; González, Josep-Abel; Augustine, John; McComiskey, Allison

    2017-11-01

    Cloud and aerosol are two manifestations of what is essentially the same physical phenomenon: a suspension of particles in the air. The differences between the two come from different composition (e.g., the much higher amount of condensed water in the particles constituting a cloud) and/or particle size, and also from the different number of such particles (10-10,000 particles per cubic centimeter, depending on conditions). However, there are situations in which the distinction is far from obvious, and even when broken or scattered clouds are present in the sky, the border between cloud and not-cloud is not always well defined, a transition area that has been coined the "twilight zone". The current paper presents a discussion of the definitions of cloud and aerosol and of the need for distinguishing between them or for considering the continuum between the two, and suggests a quantification of the importance and frequency of such ambiguous situations, founded on several ground-based observing techniques. Specifically, sensitivity analyses are applied to sky camera images and to broadband and spectral radiometric measurements taken at Girona (Spain) and Boulder (CO, USA). Results indicate that, at these sites, in more than 5% of the daytime hours the sky may be considered either cloudless (but containing aerosols) or cloudy (with some kind of optically thin clouds), depending on the observing system and the thresholds applied. Similarly, at least 10% of the time the extension of scattered or broken clouds into clear areas is problematic to establish, and depends on where the limit between cloud and aerosol is put. These findings are relevant both to technical approaches for cloud screening and sky cover categorization algorithms and to radiative transfer studies, given the different effects of clouds and aerosols (and their different treatment in models) on the Earth's radiation balance.
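    The threshold sensitivity the paper quantifies can be illustrated with a common sky-camera heuristic: pixels are classified as cloudy when their red/blue ratio exceeds a cutoff, so the derived cloud fraction shifts with the chosen cutoff. The ratios and cutoffs below are invented for illustration, not the paper's data.

```python
# Cloud fraction from sky-camera red/blue ratios under different cutoffs:
# pixels near the decision boundary (the "twilight zone") flip category
# as the cutoff moves.

def cloud_fraction(red_blue_ratios, cutoff):
    """Fraction of pixels whose red/blue ratio exceeds the cloud cutoff."""
    cloudy = sum(1 for r in red_blue_ratios if r > cutoff)
    return cloudy / len(red_blue_ratios)

# A hazy scene: 30% of pixels sit between plausible cutoffs.
ratios = [0.55] * 40 + [0.75] * 30 + [0.95] * 30

for cutoff in (0.6, 0.8):
    print(f"cutoff {cutoff}: cloud fraction {cloud_fraction(ratios, cutoff):.2f}")
```

    Here the same sky is 60% or 30% cloudy depending only on the cutoff, which is exactly the ambiguity the sensitivity analyses in the paper set out to quantify.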

  14. Automated cloud classification using a ground based infra-red camera and texture analysis techniques

    Science.gov (United States)

    Rumi, Emal; Kerr, David; Coupland, Jeremy M.; Sandford, Andrew P.; Brettle, Mike J.

    2013-10-01

    Clouds play an important role in influencing the dynamics of local and global weather and climate conditions. Continuous monitoring of clouds is vital for weather forecasting and for air-traffic control. Convective clouds such as Towering Cumulus (TCU) and Cumulonimbus (CB) clouds are associated with thunderstorms, turbulence and atmospheric instability. Human observers periodically report the presence of CB and TCU clouds during operational hours at airports and observatories; however, such observations are expensive and time limited. Robust, automatic classification of cloud type using infrared ground-based instrumentation offers the advantage of continuous, real-time (24/7) data capture and the representation of cloud structure in the form of a thermal map, which can greatly help to characterise certain cloud formations. The work presented here utilised a ground-based infrared (8-14 μm) imaging device mounted on a pan/tilt unit for capturing high-spatial-resolution sky images. These images were processed to extract 45 separate textural features using statistical and spatial-frequency-based analytical techniques. These features were used to train a weighted k-nearest neighbour (KNN) classifier in order to determine cloud type. Ground truth data were obtained by inspection of images captured simultaneously from a visible-wavelength colour camera at the same installation, with approximately the same field of view as the infrared device. These images were classified by a trained cloud observer. Results from the KNN classifier gave an encouraging success rate. A Probability of Detection (POD) of up to 90% with a Probability of False Alarm (POFA) as low as 16% was achieved.
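    Verification scores like the POD and POFA quoted above come from a contingency table of classifier output against observer ground truth. Under one common pair of definitions (POD = hits / (hits + misses); POFA = false alarms / (false alarms + correct negatives)), they can be computed as follows; the tiny label set is invented for illustration.

```python
# POD and POFA for one positive class from parallel truth/prediction lists.

def pod_pofa(truth, predicted, positive="CB"):
    hits   = sum(t == positive and p == positive for t, p in zip(truth, predicted))
    misses = sum(t == positive and p != positive for t, p in zip(truth, predicted))
    fas    = sum(t != positive and p == positive for t, p in zip(truth, predicted))
    cns    = sum(t != positive and p != positive for t, p in zip(truth, predicted))
    return hits / (hits + misses), fas / (fas + cns)

truth     = ["CB", "CB", "CB", "CB", "TCU", "TCU", "TCU", "clear"]
predicted = ["CB", "CB", "CB", "TCU", "TCU", "CB", "TCU", "clear"]
pod, pofa = pod_pofa(truth, predicted)
print(f"POD={pod:.2f}, POFA={pofa:.2f}")  # POD=0.75, POFA=0.25
```

    Note that some verification literature instead defines a false-alarm *ratio* as fas / (hits + fas); which convention a paper uses is worth checking before comparing numbers.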

  15. A hybrid cloud read aligner based on MinHash and kmer voting that preserves privacy

    Science.gov (United States)

    Popic, Victoria; Batzoglou, Serafim

    2017-05-01

    Low-cost clouds can alleviate the compute and storage burden of the genome sequencing data explosion. However, moving personal genome data analysis to the cloud can raise serious privacy concerns. Here, we devise a method named Balaur, a privacy preserving read mapper for hybrid clouds based on locality sensitive hashing and kmer voting. Balaur can securely outsource a substantial fraction of the computation to the public cloud, while being highly competitive in accuracy and speed with non-private state-of-the-art read aligners on short read data. We also show that the method is significantly faster than the state of the art in long read mapping. Therefore, Balaur can enable institutions handling massive genomic data sets to shift part of their analysis to the cloud without sacrificing accuracy or exposing sensitive information to an untrusted third party.
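    As background for the abstract above, here is a sketch of the MinHash primitive that Balaur builds on (not its privacy protocol): each sequence's kmer set is summarized by its minima under many hash functions, and the fraction of matching minima between two signatures estimates the Jaccard similarity of the full kmer sets. The kmer length, hash count and sequences are illustrative assumptions.

```python
# MinHash signatures over kmer sets, estimating Jaccard similarity.

import hashlib

def kmers(seq, k=4):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def minhash_signature(items, num_hashes=64):
    """One minimum per seeded hash function; seeds act as distinct hashes."""
    sig = []
    for seed in range(num_hashes):
        sig.append(min(
            int(hashlib.sha256(f"{seed}:{it}".encode()).hexdigest(), 16)
            for it in items))
    return sig

def estimate_jaccard(sig_a, sig_b):
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

read = "ACGTACGTTG"
ref  = "ACGTACGTTGAC"
sim = estimate_jaccard(minhash_signature(kmers(read)), minhash_signature(kmers(ref)))
print(f"estimated kmer Jaccard: {sim:.2f}")
```

    The privacy appeal of such sketches is that the signature reveals far less than the raw sequence, which is one reason locality sensitive hashing suits hybrid-cloud outsourcing.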

  16. Buildings and Terrain of Urban Area Point Cloud Segmentation based on PCL

    International Nuclear Information System (INIS)

    Liu, Ying; Zhong, Ruofei

    2014-01-01

    One current problem in laser radar point data classification is the segmentation of buildings and urban terrain; this paper proposes a point cloud segmentation method based on the PCL library. PCL is a large, cross-platform, open-source C++ programming library that implements a large number of efficient point-cloud-related data structures and generic algorithms covering point cloud retrieval, filtering, segmentation, registration, feature extraction, curved surface reconstruction, visualization, etc. Because laser radar point clouds involve large amounts of unevenly distributed data, this paper proposes using the kd-tree data structure to organize the data; a Voxel Grid filter is then used for point cloud resampling, i.e., to reduce the amount of point cloud data while preserving the shape characteristics of the cloud. Using the Euclidean Cluster Extraction class of the PCL segmentation module, Euclidean clustering segments the three-dimensional point cloud into buildings and ground. The experimental results show that this method avoids the need for multiple copies of the existing data, saves program storage space through calls to PCL library methods and classes, shortens program compile time and improves the running speed of the program.
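    The Euclidean clustering step can be illustrated without PCL: points within a tolerance of each other are transitively grown into the same cluster. PCL's EuclideanClusterExtraction does the neighbor search with a kd-tree; the naive O(n^2) search in this sketch (with invented toy coordinates) just shows the clustering logic.

```python
# Naive Euclidean cluster extraction: indices whose points are within
# `tol` of each other (transitively) end up in the same cluster.

import math

def euclidean_clusters(points, tol=1.5):
    """Group point indices into clusters by transitive proximity."""
    unvisited = set(range(len(points)))
    clusters = []
    while unvisited:
        queue = [unvisited.pop()]
        cluster = []
        while queue:
            i = queue.pop()
            cluster.append(i)
            near = [j for j in unvisited
                    if math.dist(points[i], points[j]) <= tol]
            for j in near:
                unvisited.discard(j)
            queue.extend(near)
        clusters.append(sorted(cluster))
    return sorted(clusters)

building = [(0, 0, 5), (0, 1, 5), (1, 0, 6)]
ground = [(10, 10, 0), (10, 11, 0)]
print(euclidean_clusters(building + ground))  # [[0, 1, 2], [3, 4]]
```

    In PCL the same idea runs after voxel-grid downsampling, so the tolerance is chosen relative to the voxel size.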

  17. A Security Monitoring Method Based on Autonomic Computing for the Cloud Platform

    Directory of Open Access Journals (Sweden)

    Jingjie Zhang

    2018-01-01

    Full Text Available With the continuous development of cloud computing, cloud security has become one of the most important issues in cloud computing. For example, data stored on the cloud platform may be attacked, and its security is difficult to guarantee; we must therefore pay close attention to how the data stored in the cloud is protected. To protect data, monitoring is a necessary process. Based on autonomic computing, we develop a cloud data monitoring system on the cloud platform that periodically monitors whether the data is abnormal and analyzes the security of the data according to the monitored results. In this paper, the feasibility of the scheme is verified through simulation. The results show that the proposed method can adapt to dynamic changes in cloud platform load, and it can also accurately evaluate the degree of data abnormality. Meanwhile, by adjusting the monitoring frequency automatically, it improves the accuracy and timeliness of monitoring. Furthermore, it can reduce the monitoring cost of the system during normal operation.
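    The adaptive-frequency idea can be sketched as a simple control rule: the monitor shortens its polling interval as the observed anomaly score rises and relaxes back toward a base interval when readings look normal. The interval bounds and linear scaling below are assumptions for illustration, not the paper's controller.

```python
# Adaptive monitoring interval: higher anomaly score => tighter monitoring,
# which trades off monitoring cost against timeliness.

def next_interval(anomaly_score, base_s=60, min_s=5):
    """Seconds until the next monitoring probe."""
    score = min(max(anomaly_score, 0.0), 1.0)   # clamp to [0, 1]
    return max(min_s, base_s * (1.0 - score))

for score in (0.0, 0.5, 0.95):
    print(f"anomaly {score}: poll every {next_interval(score):.0f}s")
```

    The `min_s` floor keeps probing overhead bounded even under sustained anomalies, which is how such a scheme reduces cost during normal operation without missing incidents.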

  18. A cloud computing based platform for sleep behavior and chronic diseases collaborative research.

    Science.gov (United States)

    Kuo, Mu-Hsing; Borycki, Elizabeth; Kushniruk, Andre; Huang, Yueh-Min; Hung, Shu-Hui

    2014-01-01

    The objective of this study is to propose a cloud-computing-based platform for collaborative research on sleep behavior and chronic diseases. The platform consists of two main components: (1) a sensing bed sheet with textile sensors to automatically record the patient's sleep behaviors and vital signs, and (2) a service-oriented cloud computing architecture (SOCCA) that provides a data repository and allows for sharing and analysis of the collected data. We also describe our systematic approach to implementing the SOCCA. We believe that the new cloud-based platform can provide nursing and other health professional researchers located in different geographic locations with a cost-effective, flexible, secure and privacy-preserving research environment.

  19. City Hub : a cloud based IoT platform for Smart Cities

    OpenAIRE

    Lea, Rodger; Blackstock, Michael

    2014-01-01

    Cloud-based Smart City hubs are an attractive approach to addressing some of the complex issues faced when deploying PaaS infrastructure for Smart Cities. In this paper we introduce the general notion of IoT hubs and then discuss our work to generalize our IoT hub as a Smart City PaaS. Two key issues are identified: support for hybrid public/private clouds, and interoperability. We briefly describe our approach to these issues and discuss our experiences deploying two cloud-based Smart City hubs.

  20. The State of Cloud-Based Biospecimen and Biobank Data Management Tools.

    Science.gov (United States)

    Paul, Shonali; Gade, Aditi; Mallipeddi, Sumani

    2017-04-01

    Biobanks are critical for collecting and managing high-quality biospecimens from donors with appropriate clinical annotation. The high-quality human biospecimens and associated data are required to better understand disease processes. Therefore, biobanks have become an important and essential resource for healthcare research and drug discovery. However, collecting and managing huge volumes of data (biospecimens and associated clinical data) necessitate that biobanks use appropriate data management solutions that can keep pace with the ever-changing requirements of research. To automate biobank data management, biobanks have been investing in traditional Laboratory Information Management Systems (LIMS). However, there are a myriad of challenges faced by biobanks in acquiring traditional LIMS. Traditional LIMS are cost-intensive and often lack the flexibility to accommodate changes in data sources and workflows. Cloud technology is emerging as an alternative that provides the opportunity to small and medium-sized biobanks to automate their operations in a cost-effective manner, even without IT personnel. Cloud-based solutions offer the advantage of heightened security, rapid scalability, dynamic allocation of services, and can facilitate collaboration between different research groups by using a shared environment on a "pay-as-you-go" basis. The benefits offered by cloud technology have resulted in the development of cloud-based data management solutions as an alternative to traditional on-premise software. After evaluating the advantages offered by cloud technology, several biobanks have started adopting cloud-based tools. Cloud-based tools provide biobanks with easy access to biospecimen data for real-time sharing with clinicians. Another major benefit realized by biobanks by implementing cloud-based applications is unlimited data storage on the cloud and automatic backups for protecting any data loss in the face of natural calamities.

  1. Searchable attribute-based encryption scheme with attribute revocation in cloud storage.

    Science.gov (United States)

    Wang, Shangping; Zhao, Duqiao; Zhang, Yaling

    2017-01-01

    Attribute-based encryption (ABE) is a good way to achieve flexible and secure access control over data; attribute revocation is an extension of attribute-based encryption, and keyword search is an indispensable part of cloud storage. The combination of the two has important applications in cloud storage. In this paper, we construct a searchable attribute-based encryption scheme with attribute revocation for cloud storage. Keyword search in our scheme is attribute-based with access control: when a search succeeds, the cloud server returns the corresponding ciphertext to the user, who can then decrypt it. Moreover, our scheme supports multi-keyword search, which makes it more practical. Under the decisional bilinear Diffie-Hellman exponent (q-BDHE) and decisional Diffie-Hellman (DDH) assumptions in the selective security model, we prove that our scheme is secure.

  2. Continuously deformation monitoring of subway tunnel based on terrestrial point clouds

    NARCIS (Netherlands)

    Kang, Z.; Tuo, L.; Zlatanova, S.

    2012-01-01

    The deformation monitoring of subway tunnels is extraordinarily necessary. Therefore, a method for deformation monitoring based on terrestrial point clouds is proposed in this paper. First, the traditional adjacent-station registration is replaced by section-controlled registration, so that the

  3. Cloud-Based Social Media Visual Analytics Disaster Response System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose a next-generation cloud-based social media visual analytics disaster response system that will enable decision-makers and first-responders to obtain...

  4. OpenID Connect as a security service in cloud-based medical imaging systems.

    Science.gov (United States)

    Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter

    2016-04-01

    The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have consistently been regarded as the major obstacles to the adoption of cloud computing in healthcare domains. OpenID Connect, combining OpenID and OAuth, is an emerging representational state transfer (REST)-based federated identity solution. It is one of the most widely adopted open standards, poised to become the de facto standard for securing cloud computing and mobile applications, and has been described as the "Kerberos of the cloud." We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow this technology to be incorporated within distributed enterprise environments. The objective of this study is to offer solutions for secure sharing of medical images among diagnostic imaging repositories (DI-r) and heterogeneous picture archiving and communication systems (PACS), as well as Web-based and mobile clients, in the cloud ecosystem. The main objective is to use OpenID Connect's open-source single sign-on and authorization services in a user-centric manner, while ensuring that deploying DI-r and PACS to private or community clouds provides security levels equivalent to the traditional computing model.
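
    As a hedged aside, the ID-token handling at the heart of OpenID Connect can be sketched with the standard library alone. The toy JWT below uses HS256 with a shared secret purely for illustration; production OIDC providers sign with RS256 keys published via JWKS, and the issuer, audience, and claim values here are invented.

```python
import base64
import hashlib
import hmac
import json

def b64url(data: bytes) -> bytes:
    return base64.urlsafe_b64encode(data).rstrip(b"=")

def b64url_decode(data: bytes) -> bytes:
    return base64.urlsafe_b64decode(data + b"=" * (-len(data) % 4))

def make_id_token(claims: dict, secret: bytes) -> bytes:
    # Toy HS256 JWT: header.payload.signature, each base64url-encoded.
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    payload = b64url(json.dumps(claims).encode())
    signing_input = header + b"." + payload
    sig = b64url(hmac.new(secret, signing_input, hashlib.sha256).digest())
    return signing_input + b"." + sig

def verify_id_token(token: bytes, secret: bytes, issuer: str, audience: str) -> dict:
    header, payload, sig = token.split(b".")
    expected = b64url(hmac.new(secret, header + b"." + payload, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        raise ValueError("bad signature")
    claims = json.loads(b64url_decode(payload))
    # OIDC requires checking issuer and audience before trusting any claim.
    if claims.get("iss") != issuer or claims.get("aud") != audience:
        raise ValueError("wrong issuer or audience")
    return claims

secret = b"shared-secret"
tok = make_id_token({"iss": "https://op.example", "aud": "pacs-client", "sub": "dr-jones"}, secret)
print(verify_id_token(tok, secret, "https://op.example", "pacs-client")["sub"])  # dr-jones
```

    The issuer/audience checks are what lets a DI-r or PACS client accept tokens only from its own identity provider.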

  5. A privacy-preserving framework for outsourcing location-based services to the cloud

    OpenAIRE

    Zhu, Xiaojie; Ayday, Erman; Vitenberg, Roman

    2018-01-01

    Thanks to the popularity of mobile devices a large number of location-based services (LBS) have emerged. While a large number of privacy-preserving solutions for LBS have been proposed, most of these solutions do not consider the fact that LBS are typically cloud-based nowadays. Outsourcing data and computation to the cloud raises a number of significant challenges related to data confidentiality, user identity and query privacy, fine-grain access control, and query expressiveness. In this wo...

  6. An Analysis of Resilience of a Cloud Based Incident Notification Process

    OpenAIRE

    Vrieze, Paul; Xu, Lai

    2015-01-01

    Part 2: Agility and Resilience in Collaborative Networks; Cloud-based Business Process Management (BPM) systems have provided SMEs with BPM in a pay-per-use manner. Previous work has looked at cloud-based BPM from the perspectives of distribution of data, activities, and/or the process engine, and at related issues such as system scalability, data security, and the distribution of data and activities. To achieve business agility, business process collaboration needs to...

  7. Future Directions of Applying Healthcare Cloud for Home-based Chronic Disease Care

    OpenAIRE

    Hu, Yan; Eriksén, Sara; Lundberg, Jenny

    2017-01-01

    The care of chronic disease has become the main challenge for healthcare institutions around the world. To meet the growing needs of patients, moving the front desk of healthcare from hospital to home is essential. Recently, cloud computing has been applied to healthcare domain; however, adapting to and using this technology effectively for home-based care is still in its initial phase. We have proposed a conceptual hybrid cloud model for home-based chronic disease care, and have evaluated it...

  8. A Reference Architecture for a Cloud-Based Tools as a Service Workspace

    DEFF Research Database (Denmark)

    Chauhan, Aufeef; Babar, Muhammad Ali; Sheng, Quan Z.

    2015-01-01

    Software Architecture (SA) plays a critical role in developing and evolving cloud-based applications. We present a Reference Architecture (RA) for designing Cloud-based Tools as a service work SPACE (TSPACE) - a platform for provisioning chain of tools following the Software as a Service (SaaS...... evaluate the RA in terms of completeness and feasibility. Our proposed RA can provide valuable guidance and insights for designing and implementing concrete software architectures of TSPACE....

  9. Experimental Investigation of Multi-layer Insulation Effect on Damage of Stuffed Shield by High-velocity Impact

    Directory of Open Access Journals (Sweden)

    GUAN Gong-shun

    2016-09-01

    A stuffed shield with multi-layer insulation (MLI) was designed by improving on the aluminum Whipple shield, and a series of high-velocity impact tests was performed with a two-stage light gas gun facility in a vacuum environment. Damage modes of the stuffed shield with different MLI locations under impact by an aluminum-sphere projectile were obtained, and the effect of the MLI on the damage of the stuffed shield by high-velocity impact was studied. The results indicate that when the MLI is located on the front side of the first Al-plate, the protection performance of the stuffed shield is improved, with a larger perforation diameter of the first Al-plate and more dissipation of the projectile's impact kinetic energy. When the MLI is arranged on the back side of the first Al-plate, the expansion of the secondary debris cloud produced by the projectile impacting the first Al-plate is restrained, which does not help improve the protection performance of the stuffed shield. When the MLI is arranged on the front side of the stuffed wall, the perforation size of the stuffed wall increases; when the MLI is arranged on the front side of the rear wall, the distribution range of craters on the rear wall decreases.

  10. Move It or Lose It: Cloud-Based Data Storage

    Science.gov (United States)

    Waters, John K.

    2010-01-01

    There was a time when school districts showed little interest in storing or backing up their data to remote servers. Nothing seemed less secure than handing off data to someone else. But in the last few years the buzz around cloud storage has grown louder, and the idea that data backup could be provided as a service has begun to gain traction in…

  11. A Cloud-User Protocol Based on Ciphertext Watermarking Technology

    Directory of Open Access Journals (Sweden)

    Keyang Liu

    2017-01-01

    With the growth of cloud computing technology, more and more Cloud Service Providers (CSPs) provide cloud computing services to users and ask for the users' permission to use their data to improve the quality of service (QoS). Since these data are stored as plain text, users worry about the risk of privacy leakage. However, existing watermark-embedding and encryption technology is not suitable for protecting the Right to Be Forgotten. Hence, we propose a new cloud-user protocol as a solution to the plain-text outsourcing problem. We only allow users and CSPs to embed the ciphertext watermark, which is generated and embedded by a Trusted Third Party (TTP), into the ciphertext data for transfer. The receiver then decrypts it and obtains the watermarked data in plain text. In the arbitration stage, feature extraction and the identity of the user are used to identify the data. A fixed-Hamming-distance code helps raise the system's capacity for watermarks as much as possible. The extracted watermark can locate an unauthorized distributor and protect the rights of an honest CSP. Experimental results demonstrate the security and validity of our protocol.
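
    The fixed-Hamming-distance matching mentioned above can be illustrated with a toy sketch (not the paper's code; the watermark values and tolerance are invented): an extracted, possibly distorted watermark is attributed to the nearest registered watermark, provided it lies within the code's error tolerance.

```python
def hamming(a: int, b: int) -> int:
    # Number of differing bits between two equal-length bit patterns.
    return bin(a ^ b).count("1")

def identify(extracted: int, registered: dict, max_dist: int = 3):
    # Attribute the extracted mark to the nearest registered user,
    # but only if it is within the tolerance of the distance code.
    user, mark = min(registered.items(), key=lambda kv: hamming(extracted, kv[1]))
    return user if hamming(extracted, mark) <= max_dist else None

registered = {"alice": 0b10110010, "bob": 0b01001101}
noisy = 0b10110110  # alice's mark with one flipped bit (simulated distortion)
print(identify(noisy, registered))  # alice
```

    The tolerance is what lets arbitration succeed even after the watermarked plain text has been slightly modified.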

  12. Spectral cumulus parameterization based on cloud-resolving model

    Science.gov (United States)

    Baba, Yuya

    2018-02-01

    We have developed a spectral cumulus parameterization using a cloud-resolving model. It includes a new parameterization of the entrainment rate, derived from analysis of the cloud properties obtained from a cloud-resolving model simulation, that is valid for both shallow and deep convection. The new scheme was examined in a single-column model experiment and compared with the existing parameterization of Gregory (2001, Q J R Meteorol Soc 127:53-72) (GR scheme). The results showed that the GR scheme simulated shallower and more dilute convection than the new scheme. To further validate the physical performance of the parameterizations, Atmospheric Model Intercomparison Project (AMIP) experiments were performed and the results compared with reanalysis data. The new scheme performed better than the GR scheme in terms of the mean state and variability of the atmospheric circulation: it reduced the positive precipitation bias in the western Pacific region and the positive bias of outgoing shortwave radiation over the ocean. The new scheme also better simulated features of convectively coupled equatorial waves and the Madden-Julian oscillation. These improvements derive from the modified entrainment-rate parameterization, which suppresses excessive growth of entrainment and thus an excessive increase of low-level clouds.

  13. An Elliptic Curve Based Schnorr Cloud Security Model in Distributed Environment

    Directory of Open Access Journals (Sweden)

    Vinothkumar Muthurajan

    2016-01-01

    Cloud computing requires security upgrades in data transmission approaches. In general, key-based encryption/decryption (symmetric and asymmetric) mechanisms ensure secure data transfer between devices. Symmetric key mechanisms (pseudorandom functions) provide a lower protection level than asymmetric key (RSA, AES, and ECC) schemes. The presence of expired content and irrelevant resources adversely causes unauthorized data access. This paper investigates how integrity and secure data transfer are improved based on the Elliptic Curve-based Schnorr scheme. This paper proposes a virtual machine-based cloud model with a Hybrid Cloud Security Algorithm (HCSA) to remove expired content. HCSA-based auditing improves the prediction of malicious activity during data transfer. Duplication in the cloud server degrades the performance of EC-Schnorr-based encryption schemes, so this paper utilizes the Bloom filter concept to avoid cloud server duplication. The combination of EC-Schnorr and the Bloom filter efficiently improves security performance. A comparative analysis between the proposed HCSA and the existing Distributed Hash Table (DHT) regarding execution time, computational overhead, and auditing time with auditing requests and servers confirms the effectiveness of HCSA in creating the cloud security model.

  14. An Elliptic Curve Based Schnorr Cloud Security Model in Distributed Environment.

    Science.gov (United States)

    Muthurajan, Vinothkumar; Narayanasamy, Balaji

    2016-01-01

    Cloud computing requires security upgrades in data transmission approaches. In general, key-based encryption/decryption (symmetric and asymmetric) mechanisms ensure secure data transfer between devices. Symmetric key mechanisms (pseudorandom functions) provide a lower protection level than asymmetric key (RSA, AES, and ECC) schemes. The presence of expired content and irrelevant resources adversely causes unauthorized data access. This paper investigates how integrity and secure data transfer are improved based on the Elliptic Curve-based Schnorr scheme. This paper proposes a virtual machine-based cloud model with a Hybrid Cloud Security Algorithm (HCSA) to remove expired content. HCSA-based auditing improves the prediction of malicious activity during data transfer. Duplication in the cloud server degrades the performance of EC-Schnorr-based encryption schemes, so this paper utilizes the Bloom filter concept to avoid cloud server duplication. The combination of EC-Schnorr and the Bloom filter efficiently improves security performance. A comparative analysis between the proposed HCSA and the existing Distributed Hash Table (DHT) regarding execution time, computational overhead, and auditing time with auditing requests and servers confirms the effectiveness of HCSA in creating the cloud security model.
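
    The Bloom-filter deduplication idea referenced in this abstract can be sketched briefly. This is a minimal, generic Bloom filter (bit-array size and hash count are illustrative, not from the paper): it can report false positives but never false negatives, which is why a hit only means a block is *probably* already stored.

```python
import hashlib

class BloomFilter:
    """Probabilistic set membership: no false negatives, tunable false positives."""

    def __init__(self, size_bits: int = 1024, num_hashes: int = 4):
        self.size = size_bits
        self.k = num_hashes
        self.bits = 0  # bit array stored as one big integer

    def _positions(self, item: str):
        # Derive k bit positions from salted SHA-256 digests.
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, item: str) -> None:
        for pos in self._positions(item):
            self.bits |= 1 << pos

    def might_contain(self, item: str) -> bool:
        return all(self.bits >> pos & 1 for pos in self._positions(item))

bf = BloomFilter()
bf.add("block-sha256-abc")
print(bf.might_contain("block-sha256-abc"))  # True
print(bf.might_contain("block-sha256-xyz"))  # almost surely False
```

    Before uploading a block, the server checks the filter; only on a (probable) hit does it fall back to an exact comparison, saving storage and bandwidth.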

  15. Cooperative Cloud Service Aware Mobile Internet Coverage Connectivity Guarantee Protocol Based on Sensor Opportunistic Coverage Mechanism

    Directory of Open Access Journals (Sweden)

    Qin Qin

    2015-01-01

    In order to improve the Internet coverage ratio and provide a connectivity guarantee, we propose a coverage-connectivity guarantee protocol for the mobile Internet based on a sensor opportunistic coverage mechanism and cooperative cloud services. In this scheme, based on opportunistic covering rules, a highly reliable and real-time secure network coverage algorithm is achieved by opportunistically using sensor nodes and mobile Internet nodes. Then, a cloud service business support platform is created based on Internet application service management capabilities and wireless sensor network communication service capabilities, forming the architecture of the cloud support layer, and a cooperative cloud service awareness model is proposed. Finally, we propose the mobile Internet coverage-connectivity guarantee protocol. Experimental results demonstrate that the proposed algorithm performs excellently in terms of Internet security and stability, as well as coverage-connectivity ability.

  16. Integration of cloud-based storage in BES III computing environment

    International Nuclear Information System (INIS)

    Wang, L; Hernandez, F; Deng, Z

    2014-01-01

    We present ongoing work that aims to evaluate the suitability of cloud-based storage as a supplement to the Lustre file system for storing experimental data for the BES III physics experiment, and as a backend for storing files belonging to individual members of the collaboration. In particular, we discuss our findings regarding the support of cloud-based storage in the software stack of the experiment. We report on development work that improves the support of CERN's ROOT data analysis framework and allows efficient remote access to data through several cloud storage protocols. We also present our efforts to provide the experiment with efficient command-line tools for navigating and interacting with cloud-storage-based data repositories, both from interactive sessions and from grid jobs.

  17. Providing a New Model for Discovering Cloud Services Based on Ontology

    Directory of Open Access Journals (Sweden)

    B. Heydari

    2017-12-01

    Due to its efficient, flexible, and dynamic infrastructure for information technology and for estimating service-quality parameters, cloud computing has become one of the most important topics in the computing world. Discovering cloud services is a fundamental issue in achieving high efficiency. To carry out operations in the cloud, a user may need to request several different services, either simultaneously or according to a working routine; these services can be offered by different cloud providers under different decision-making policies. Therefore, service management is one of the important and challenging issues in cloud computing. With the advent of the semantic web, and consequently of practical services in the cloud computing space, access to different kinds of applications has become possible. Ontology is the core of the semantic web and can be used to ease the process of discovering services. A new ontology-based model is proposed in this paper. The results indicate that the proposed model discovers cloud services matching user searches in less time than other models.

  18. Subtropical and Polar Cirrus Clouds Characterized by Ground-Based Lidars and CALIPSO/CALIOP Observations

    Directory of Open Access Journals (Sweden)

    Córdoba-Jabonero Carmen

    2016-01-01

    Cirrus clouds are a product of weather processes, so their occurrence and macrophysical/optical properties can vary significantly over different regions of the world. Lidars provide height-resolved measurements with relatively good vertical and temporal resolution, making them the most suitable instrumentation for high-cloud observations. The aim of this work is to show the potential of lidar observations for cirrus cloud detection, in combination with a recently proposed methodology to retrieve the macrophysical and optical features of cirrus clouds. In this sense, a few case studies of cirrus clouds observed at both subtropical and polar latitudes are examined and compared to CALIPSO/CALIOP observations. Lidar measurements are carried out at two stations: the metropolitan city of Sao Paulo (MSP, Brazil; 23.3°S, 46.4°W), located at subtropical latitudes, and the Belgrano II base (BEL, Argentina; 78°S, 35°W) on the Antarctic continent. Optical (COD, cloud optical depth; LR, lidar ratio) and macrophysical (top/base heights and thickness) properties of both the subtropical and the polar cirrus clouds are reported. In general, subtropical cirrus clouds present lower LR values and are found at higher altitudes than those detected at polar latitudes, and cirrus clouds are detected at similar altitudes by CALIOP. However, poor agreement is found between the LR retrieved from ground-based lidars and from space-borne CALIOP measurements, likely due to the use of a fixed (or weakly variable) LR value in the CALIOP inversion procedures.

  19. Improved cloud parameterization for Arctic climate simulations based on satellite data

    Science.gov (United States)

    Klaus, Daniel; Dethloff, Klaus; Dorn, Wolfgang; Rinke, Annette

    2015-04-01

    The defective representation of Arctic cloud processes and properties remains a crucial problem in climate modelling and in reanalysis products. Satellite-based cloud observations (MODIS and CPR/CALIOP) and single-column model simulations (HIRHAM5-SCM) were exploited to evaluate and improve the simulated Arctic cloud cover of the atmospheric regional climate model HIRHAM5. The ECMWF reanalysis dataset 'ERA-Interim' (ERAint) was used for the model initialization, the lateral boundary forcing as well as the dynamical relaxation inside the pan-Arctic domain. HIRHAM5 has a horizontal resolution of 0.25° and uses 40 pressure-based and terrain-following vertical levels. In comparison with the satellite observations, the HIRHAM5 control run (HH5ctrl) systematically overestimates total cloud cover, but to a lesser extent than ERAint. The underestimation of high- and mid-level clouds is strongly outweighed by the overestimation of low-level clouds. Numerous sensitivity studies with HIRHAM5-SCM suggest (1) the parameter tuning, enabling a more efficient Bergeron-Findeisen process, combined with (2) an extension of the prognostic-statistical (PS) cloud scheme, enabling the use of negatively skewed beta distributions. This improved model setup was then used in a corresponding HIRHAM5 sensitivity run (HH5sens). While the simulated high- and mid-level cloud cover is improved only to a limited extent, the large overestimation of low-level clouds can be systematically and significantly reduced, especially over sea ice. Consequently, the multi-year annual mean area average of total cloud cover with respect to sea ice is almost 14% lower than in HH5ctrl. Overall, HH5sens slightly underestimates the observed total cloud cover but shows a halved multi-year annual mean bias of 2.2% relative to CPR/CALIOP at all latitudes north of 60° N. Importantly, HH5sens produces a more realistic ratio between the cloud water and ice content. 
The considerably improved cloud simulation manifests in

  20. Statistical Comparison of Cloud and Aerosol Vertical Properties between Two Eastern China Regions Based on CloudSat/CALIPSO Data

    Directory of Open Access Journals (Sweden)

    Yujun Qiu

    2017-01-01

    The relationship between cloud and aerosol properties was investigated over two adjacent 4° × 4° regions in the south (R1) and the north (R2) of eastern China. CloudSat/CALIPSO data were used to extract cloud and aerosol profile properties. The mean cloud occurrence probability (COP) was highest in the mixed cloud layer (−40°C to 0°C) and lowest in the warm cloud layer (>0°C). Atmospheric humidity was more statistically relevant to COP in the warm cloud layer than the aerosol conditions were. The differences in COP between the two regions in the mixed cloud layer and the ice cloud layer (<−40°C) correlated well with the differences in the aerosol extinction coefficient. Radar reflectivity factors greater than −10 dBZ occurred mainly in the warm and mixed cloud layers. A high-COP zone appeared in the above-0°C layer at cloud thicknesses of 2-3 km in both regions and in all four seasons, but the distribution of the zonal layer was more continuous in R2 than in R1, consistent with the higher aerosol optical thickness in R2 than in R1 in the above-0°C layer, indicating a positive correlation between aerosol and cloud probability.

  1. A browser-based 3D Visualization Tool designed for comparing CERES/CALIOP/CloudSAT level-2 data sets.

    Science.gov (United States)

    Chu, C.; Sun-Mack, S.; Chen, Y.; Heckert, E.; Doelling, D. R.

    2017-12-01

    At NASA Langley, Clouds and the Earth's Radiant Energy System (CERES) and Moderate Resolution Imaging Spectroradiometer (MODIS) data are merged with Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) data from the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) and with CloudSat Cloud Profiling Radar (CPR) data. The CERES merged product (C3M) matches up to three CALIPSO footprints with each MODIS pixel along its ground track and then assigns the nearest CloudSat footprint to each of those MODIS pixels. The cloud properties from MODIS, retrieved using the CERES algorithms, are included in C3M with the matched CALIPSO and CloudSat products, along with radiances from 18 MODIS channels. The dataset is used to validate the CERES-retrieved MODIS cloud properties and the difference in computed TOA and surface fluxes when using MODIS versus CALIOP/CloudSat retrieved clouds. This information is then used to tune the computed fluxes to match the CERES-observed TOA flux. A visualization tool is invaluable for determining the cause of these large cloud and flux differences in order to improve the methodology. This effort is part of a larger effort to allow users to order the CERES C3M product subsetted by time and parameter, as well as to provide the aforementioned visualization capabilities. This presentation will show a new graphical 3D interface, 3D-CERESVis, that allows users to view both passive remote sensing satellites (MODIS and CERES) and active satellites (CALIPSO and CloudSat), such that the detailed vertical structures of cloud properties from CALIPSO and CloudSat are displayed side by side with horizontally retrieved cloud properties from MODIS and CERES. Similarly, the CERES computed profile fluxes, whether using MODIS or CALIPSO and CloudSat clouds, can also be compared. 3D-CERESVis is a browser-based visualization tool that makes use of techniques such as multiple synchronized cursors, the COLLADA data format, and Cesium.

  2. Determination of the impact of RGB points cloud attribute quality on color-based segmentation process

    Directory of Open Access Journals (Sweden)

    Bartłomiej Kraszewski

    2015-06-01

    The article presents the results of research on the effect that the radiometric quality of point cloud RGB attributes has on color-based segmentation. In the research, a point cloud with a resolution of 5 mm, acquired with a FARO Photon 120 scanner, described a fragment of an office room, and color images were taken by various digital cameras: an SLR Nikon D3X, an SLR Canon D200 integrated with the laser scanner, a compact Panasonic TZ-30 camera, and a mobile phone camera. Color information from the images was spatially related to the point cloud in FARO Scene software. Color-based segmentation of the test data was performed with a developed application named "RGB Segmentation". The application is based on the public Point Cloud Library (PCL) and extracts from the source point cloud subsets of points fulfilling the segmentation criteria, using the region growing method. Using the developed application, segmentation of four test point clouds containing different RGB attributes from the various images was performed. The segmentation process was evaluated by comparing the segments acquired with the developed application against segments extracted manually by an operator, considering the number of obtained segments, the number of correctly identified objects, and the correctness of the segmentation process. The best segmentation correctness and the most identified objects were obtained using the data with RGB attributes from the Nikon D3X images. Based on the results, it was found that the quality of the RGB attributes of the point cloud affected only the number of identified objects; for the correctness of the segmentation, as well as its error, no apparent relationship between the quality of the color information and the result of the process was found. Keywords: terrestrial laser scanning, color-based segmentation, RGB attribute, region growing method, digital images, point clouds
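
    For illustration only, the region-growing idea behind color-based segmentation can be sketched on a toy RGB grid. This is a simplified stand-in, not the "RGB Segmentation" application: it grows regions by comparing each neighbor to the seed color, whereas PCL's color-based region growing also applies region-to-region merging tests.

```python
from collections import deque

def region_growing(image, threshold: float = 30.0):
    """Group 4-connected pixels whose RGB distance to the region seed is
    below the threshold. image: 2D list of (r, g, b) tuples -> label grid."""
    h, w = len(image), len(image[0])
    labels = [[None] * w for _ in range(h)]
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if labels[sy][sx] is not None:
                continue  # already assigned to a region
            seed = image[sy][sx]
            labels[sy][sx] = next_label
            queue = deque([(sy, sx)])
            while queue:
                y, x = queue.popleft()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w and labels[ny][nx] is None:
                        # Euclidean distance in RGB space to the seed color.
                        dist = sum((a - b) ** 2 for a, b in zip(seed, image[ny][nx])) ** 0.5
                        if dist < threshold:
                            labels[ny][nx] = next_label
                            queue.append((ny, nx))
            next_label += 1
    return labels

red, blue = (200, 0, 0), (0, 0, 200)
img = [[red, red, blue],
       [red, red, blue]]
print(region_growing(img))  # [[0, 0, 1], [0, 0, 1]]
```

    Noisier RGB attributes raise the in-region color distances, which is one mechanism by which radiometric quality can fragment or merge segments.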

  3. Cloud-Based DDoS HTTP Attack Detection Using Covariance Matrix Approach

    Directory of Open Access Journals (Sweden)

    Abdulaziz Aborujilah

    2017-01-01

    In this era of technology, cloud computing has become an essential part of the IT services used in daily life. In this regard, website hosting services are gradually moving to the cloud, which adds valuable new features to cloud-based websites but at the same time introduces new threats to such services. A DDoS attack is one such serious threat. A covariance matrix approach is used in this article to detect such attacks. The results were encouraging according to the confusion matrix and ROC descriptors.
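
    A hedged sketch of the covariance-matrix idea (not the authors' detector; the features and data are invented): model normal traffic windows by their mean vector and covariance matrix, then score new windows by Mahalanobis distance, which accounts for correlated features.

```python
def mean_vec(samples):
    n = len(samples)
    return [sum(col) / n for col in zip(*samples)]

def covariance(samples):
    # Sample covariance matrix (divides by n - 1).
    mu = mean_vec(samples)
    n, d = len(samples), len(mu)
    cov = [[0.0] * d for _ in range(d)]
    for s in samples:
        for i in range(d):
            for j in range(d):
                cov[i][j] += (s[i] - mu[i]) * (s[j] - mu[j])
    return [[c / (n - 1) for c in row] for row in cov]

def mahalanobis2(x, mu, cov):
    # Squared Mahalanobis distance for 2-D features (closed-form 2x2 inverse).
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    dx = [x[0] - mu[0], x[1] - mu[1]]
    return sum(dx[i] * inv[i][j] * dx[j] for i in range(2) for j in range(2))

# Features per time window: (requests/sec, mean inter-arrival ms) -- illustrative only.
normal = [(10, 100), (12, 95), (11, 102), (9, 98), (13, 97)]
mu, cov = mean_vec(normal), covariance(normal)
print(mahalanobis2((11, 99), mu, cov) < mahalanobis2((400, 2), mu, cov))  # True
```

    A window whose distance exceeds a calibrated threshold would be flagged as a potential HTTP-flood anomaly.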

  4. Reliability Evaluation for the Surface to Air Missile Weapon Based on Cloud Model

    Directory of Open Access Journals (Sweden)

    Deng Jianjun

    2015-01-01

    Fuzziness and randomness are integrated using numerical characteristics such as expected value, entropy, and hyper-entropy. A cloud model adapted to the reliability evaluation of surface-to-air missile weapons is put forward. A cloud scale for qualitative evaluation is constructed, and the quantitative and qualitative variables in the system reliability evaluation are placed in correspondence. Practical calculations show that analyzing the reliability of a surface-to-air missile weapon in this way is more effective, and that a model expressed with cloud theory is more consistent with the uncertain style of human thinking.
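
    The numerical characteristics named above (expected value Ex, entropy En, hyper-entropy He) define the normal cloud model's drop generator, which can be sketched as follows (parameter values are illustrative, not from the paper): each drop first perturbs the entropy, then samples a value and its certainty degree.

```python
import math
import random

def cloud_drops(ex, en, he, n, seed=42):
    """Normal cloud model drop generator: for each drop, draw En' ~ N(En, He),
    then x ~ N(Ex, |En'|), with certainty degree mu = exp(-(x-Ex)^2 / (2*En'^2))."""
    rng = random.Random(seed)
    drops = []
    for _ in range(n):
        en_prime = abs(rng.gauss(en, he)) or en  # guard against a zero draw
        x = rng.gauss(ex, en_prime)
        mu = math.exp(-((x - ex) ** 2) / (2 * en_prime ** 2))
        drops.append((x, mu))
    return drops

# Qualitative grade "high reliability" as (Ex=0.9, En=0.05, He=0.01) -- invented values.
drops = cloud_drops(0.9, 0.05, 0.01, 1000)
print(f"{len(drops)} drops, mean certainty degree {sum(m for _, m in drops) / len(drops):.2f}")
```

    The hyper-entropy He is what spreads the drops' thickness: He = 0 collapses the model to an ordinary Gaussian membership function.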

  5. Study of the fragmentation of astrophysical interest molecules (CnHm) induced by high velocity collision

    International Nuclear Information System (INIS)

    Tuna, Th.

    2008-07-01

    This work presents the study of atom-molecule collision processes in the high-velocity domain (v = 4.5 a.u.). The molecules concerned are the small unsaturated hydrocarbons C1-4H and C3H2. The molecules are accelerated with the Tandem accelerator in Orsay, and their fragmentation is analyzed by the 4π, 100%-efficient detector AGAT. Thanks to a shape analysis of the current signal from the silicon detectors, in association with the well-known grid method, we are able to measure all the fragmentation channels of the incident molecule. These dissociation measurements have been introduced into models of two objects of the interstellar medium in which many hydrocarbon molecules have been observed (TMC-1 and the Horsehead Nebula). We have extended our branching ratios, obtained by high-velocity collision, to other electronic processes included in the chemical databases, such as photodissociation and dissociative recombination. This procedure is feasible under the assumption that the molecular fragmentation is statistical. The deviations following our modification are very small in the model of TMC-1 but significant in the photodissociation region. The first part is dedicated to the description of the experimental setup that enabled us to study the fragmentation of CnHm molecules: the Orsay Tandem accelerator and the AGAT detector. The second part deals with negative ion sources, particularly the Sahat source, which is based on electron impact and has shown good characteristics for the production of anions and adequate stability for use with accelerators. The third part is dedicated to the experimental results in terms of cross-sections, numbers of fragments, and branching ratios associated with the various collisional processes. The last part presents an application of our fragmentation measurements to astrochemistry. In this field, the simulation codes of the interstellar medium require databases of chemical reactions that

  6. Towards Constraint-based High Performance Cloud System in the Process of Cloud Computing Adoption in an Organization

    OpenAIRE

    Simalango, Mikael Fernandus; Kang, Mun-Young; Oh, Sangyoon

    2010-01-01

    Cloud computing is penetrating into various domains and environments, from theoretical computer science to economy, from marketing hype to educational curriculum and from R&D lab to enterprise IT infrastructure. Yet, the currently developing state of cloud computing leaves several issues to address and also affects cloud computing adoption by organizations. In this paper, we explain how the transition into the cloud can occur in an organization and describe the mechanism for transforming lega...

  7. Influence of Ice Cloud Microphysics on Imager-Based Estimates of Earth's Radiation Budget

    Science.gov (United States)

    Loeb, N. G.; Kato, S.; Minnis, P.; Yang, P.; Sun-Mack, S.; Rose, F. G.; Hong, G.; Ham, S. H.

    2016-12-01

    A central objective of the Clouds and the Earth's Radiant Energy System (CERES) is to produce a long-term global climate data record of Earth's radiation budget from the TOA down to the surface along with the associated atmospheric and surface properties that influence it. CERES relies on a number of data sources, including broadband radiometers measuring incoming and reflected solar radiation and OLR, high-resolution spectral imagers, meteorological, aerosol and ozone assimilation data, and snow/sea-ice maps based on microwave radiometer data. While the TOA radiation budget is largely determined directly from accurate broadband radiometer measurements, the surface radiation budget is derived indirectly through radiative transfer model calculations initialized using imager-based cloud and aerosol retrievals and meteorological assimilation data. Because ice cloud particles exhibit a wide range of shapes, sizes and habits that cannot be independently retrieved a priori from passive visible/infrared imager measurements, assumptions about the scattering properties of ice clouds are necessary in order to retrieve ice cloud optical properties (e.g., optical depth) from imager radiances and to compute broadband radiative fluxes. This presentation will examine how the choice of an ice cloud particle model impacts computed shortwave (SW) radiative fluxes at the top-of-atmosphere (TOA) and surface. The ice cloud particle models considered correspond to those from prior, current and future CERES data product versions. During the CERES Edition2 (and Edition3) processing, ice cloud particles were assumed to be smooth hexagonal columns. In the Edition4, roughened hexagonal columns are assumed. The CERES team is now working on implementing in a future version an ice cloud particle model comprised of a two-habit ice cloud model consisting of roughened hexagonal columns and aggregates of roughened columnar elements. In each case, we use the same ice particle model in both the

  8. Privacy authentication using key attribute-based encryption in mobile cloud computing

    Science.gov (United States)

    Mohan Kumar, M.; Vijayan, R.

    2017-11-01

    Mobile cloud computing is becoming more popular as the number of smartphone users grows, so the security level of cloud computing has to be raised accordingly. Privacy authentication using key-attribute-based encryption helps users share business data with an organization through the cloud in a secure manner. Under privacy authentication, the sender of the data has permission to add the receivers to whom access is granted; for all others, access is denied. In the sender application, the user chooses the file to be sent to the receivers, and that data is encrypted using key-attribute-based encryption with the AES algorithm. The resulting cipher is stored in the Amazon cloud along with the key value and the receiver list.
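    The sender-side flow described above can be sketched as follows. Python's standard library has no AES, so a hash-derived keystream stands in for the cipher here, and all names (make_record, fetch, the attribute string) are illustrative, not taken from the paper:

```python
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    """Derive a pseudo-random keystream from a key (stand-in for AES;
    the stdlib has no AES, so this is for illustration only)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    return bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # an XOR stream cipher is its own inverse

def make_record(key: bytes, data: bytes, receivers):
    """Sender encrypts the file and stores cipher + receiver list in the cloud."""
    return {"cipher": encrypt(key, data), "receivers": set(receivers)}

def fetch(record, user: str, key: bytes) -> bytes:
    """Access is granted only to listed receivers; all others are denied."""
    if user not in record["receivers"]:
        raise PermissionError("access denied")
    return decrypt(key, record["cipher"])

# key derived from an attribute, as in attribute-based schemes (illustrative)
key = hashlib.sha256(b"attribute:finance-team").digest()
rec = make_record(key, b"quarterly report", ["alice", "bob"])
assert fetch(rec, "alice", key) == b"quarterly report"
```

    A real deployment would replace the keystream with AES (e.g., via a vetted crypto library) and keep the receiver list under the cloud provider's access control.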

  9. An Anomaly Detection Algorithm of Cloud Platform Based on Self-Organizing Maps

    Directory of Open Access Journals (Sweden)

    Jun Liu

    2016-01-01

    Full Text Available Virtual machines (VMs) on a cloud platform can be influenced by a variety of factors that lead to decreased performance and downtime, affecting the reliability of the cloud platform. Traditional anomaly detection algorithms and strategies for cloud platforms have flaws in detection accuracy, detection speed, and adaptability. In this paper, a dynamic and adaptive anomaly detection algorithm based on Self-Organizing Maps (SOMs) for virtual machines is proposed. A unified SOM-based method for modeling machine performance within the detection region is presented, which avoids the cost of modeling each virtual machine individually and enhances the detection speed and reliability for large-scale virtual machines on a cloud platform. The parameters that most affect the modeling speed are optimized in the SOM process to significantly improve the accuracy of the SOM modeling and therefore the anomaly detection accuracy for the virtual machine.
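    The core of such a detector can be sketched as follows (an illustration of the SOM idea, not the paper's implementation): train a SOM on vectors of normal VM performance metrics, then flag samples whose quantization error, i.e. distance to the best-matching unit, is large. The metric names and constants below are assumptions:

```python
import math
import random

random.seed(0)

def dist(a, b):
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def train_som(data, nodes=5, epochs=50, lr0=0.5, sigma0=2.0):
    """Train a tiny 1-D SOM on performance vectors (e.g. CPU, memory load)."""
    w = [[random.random(), random.random()] for _ in range(nodes)]
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)            # decaying learning rate
        sigma = max(sigma0 * (1 - epoch / epochs), 0.5)
        for x in data:
            bmu = min(range(nodes), key=lambda i: dist(w[i], x))
            for i in range(nodes):                  # neighbourhood update
                h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
                w[i] = [wi + lr * h * (xi - wi) for wi, xi in zip(w[i], x)]
    return w

def anomaly_score(w, x):
    """Quantization error: distance from a sample to its best-matching unit."""
    return min(dist(wi, x) for wi in w)

# normal operating region: CPU and memory utilisation around 0.3-0.5
normal = [[0.3 + random.random() * 0.2, 0.3 + random.random() * 0.2]
          for _ in range(200)]
som = train_som(normal)
# a saturated VM scores far higher than a healthy one
assert anomaly_score(som, [0.4, 0.4]) < anomaly_score(som, [0.95, 0.95])
```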

  10. An Enhanced Erasure Code-Based Security Mechanism for Cloud Storage

    Directory of Open Access Journals (Sweden)

    Wenfeng Wang

    2014-01-01

    Full Text Available Cloud computing offers a wide range of benefits, such as high performance, rapid elasticity, on-demand self-service, and low cost. However, data security continues to be a significant impediment to the promotion and popularization of cloud computing. To address the problem of data leakage caused by unreliable service providers and external cyber attacks, an enhanced erasure code-based security mechanism is proposed and elaborated in terms of four aspects: data encoding, data transmission, data placement, and data reconstruction, which together ensure data security throughout its entire path into cloud storage. Based on this mechanism, we implement a secure cloud storage system (SCSS). The key design issues, including data division, construction of the generator matrix, data encoding, fragment naming, and data decoding, are also described in detail. Finally, we analyze data availability and security and evaluate performance. Experimental results and analysis demonstrate that SCSS achieves high availability, strong security, and excellent performance.
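    The encode/reconstruct cycle at the heart of any erasure-coded store can be illustrated with the simplest possible code, a single XOR parity fragment. A system like SCSS would instead use a generator-matrix code (e.g. Reed-Solomon) that tolerates multiple losses; this sketch only shows the principle:

```python
def encode(data: bytes, k: int = 4):
    """Split data into k fragments plus one XOR parity fragment."""
    frag_len = -(-len(data) // k)                  # ceiling division
    padded = data.ljust(frag_len * k, b"\0")
    frags = [padded[i * frag_len:(i + 1) * frag_len] for i in range(k)]
    parity = frags[0]
    for f in frags[1:]:
        parity = bytes(a ^ b for a, b in zip(parity, f))
    return frags + [parity]

def reconstruct(frags, lost: int) -> bytes:
    """Recover one lost fragment by XOR-ing all the survivors."""
    survivors = [f for i, f in enumerate(frags) if i != lost]
    rec = survivors[0]
    for f in survivors[1:]:
        rec = bytes(a ^ b for a, b in zip(rec, f))
    return rec

frags = encode(b"cloud storage demo!!", k=4)       # 4 data + 1 parity fragment
assert reconstruct(frags, 2) == frags[2]           # any one loss is recoverable
```

    Placing the five fragments on five independent servers means no single server holds readable data, and any single server failure is survivable.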

  11. Feature Extraction from 3D Point Cloud Data Based on Discrete Curves

    Directory of Open Access Journals (Sweden)

    Yi An

    2013-01-01

    Full Text Available Reliable feature extraction from 3D point cloud data is an important problem in many application domains, such as reverse engineering, object recognition, industrial inspection, and autonomous navigation. In this paper, a novel method is proposed for extracting the geometric features from 3D point cloud data based on discrete curves. We extract the discrete curves from 3D point cloud data and research the behaviors of chord lengths, angle variations, and principal curvatures at the geometric features in the discrete curves. Then, the corresponding similarity indicators are defined. Based on the similarity indicators, the geometric features can be extracted from the discrete curves, which are also the geometric features of 3D point cloud data. The threshold values of the similarity indicators are taken from [0,1], which characterize the relative relationship and make the threshold setting easier and more reasonable. The experimental results demonstrate that the proposed method is efficient and reliable.
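    A minimal sketch of one such indicator: the turning angle between successive chords of a discrete curve, mapped to a similarity value in [0,1]. The mapping and threshold below are assumptions for illustration; the paper defines its own indicators:

```python
import math

def angle_variation(p0, p1, p2) -> float:
    """Turning angle at p1 between successive chords of a discrete curve."""
    a = (p1[0] - p0[0], p1[1] - p0[1])
    b = (p2[0] - p1[0], p2[1] - p1[1])
    dot = a[0] * b[0] + a[1] * b[1]
    na, nb = math.hypot(*a), math.hypot(*b)
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

def similarity(angle: float) -> float:
    """Map the turning angle to [0, 1]: 1 for a straight continuation,
    0 for a full reversal (assumed form of the indicator)."""
    return 1.0 - angle / math.pi

def feature_points(curve, threshold=0.75):
    """Points whose similarity falls below the threshold are corner candidates."""
    return [i for i in range(1, len(curve) - 1)
            if similarity(angle_variation(curve[i - 1], curve[i], curve[i + 1]))
            < threshold]

# an L-shaped polyline: only the corner at index 3 should be detected
curve = [(0, 0), (1, 0), (2, 0), (3, 0), (3, 1), (3, 2), (3, 3)]
assert feature_points(curve) == [3]
```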

  12. Development and clinical study of mobile 12-lead electrocardiography based on cloud computing for cardiac emergency.

    Science.gov (United States)

    Fujita, Hideo; Uchimura, Yuji; Waki, Kayo; Omae, Koji; Takeuchi, Ichiro; Ohe, Kazuhiko

    2013-01-01

    To improve emergency services through accurate diagnosis of cardiac emergencies, we developed a low-cost mobile electrocardiography system, "Cloud Cardiology®," based on cloud computing for prehospital diagnosis. It comprises a compact 12-lead ECG unit equipped with Bluetooth and an Android smartphone running a transmission application. The cloud server enables ECGs to be shared simultaneously inside and outside the hospital. We evaluated the clinical effectiveness by conducting a clinical trial with historical comparison, testing the system in a rapid-response car in real emergency service settings. We found that this system can shorten the onset-to-balloon time of patients with acute myocardial infarction, resulting in better clinical outcomes. We therefore propose that cloud-based simultaneous data sharing can be a powerful solution for emergency cardiology services, given its significant clinical outcome.

  13. Precession feature extraction of ballistic missile warhead with high velocity

    Science.gov (United States)

    Sun, Huixia

    2018-04-01

    This paper establishes the precession model of ballistic missile warhead, and derives the formulas of micro-Doppler frequency induced by the target with precession. In order to obtain micro-Doppler feature of ballistic missile warhead with precession, micro-Doppler bandwidth estimation algorithm, which avoids velocity compensation, is presented based on high-resolution time-frequency transform. The results of computer simulations confirm the effectiveness of the proposed method even with low signal-to-noise ratio.
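    Under the commonly used sinusoidal model, a scatterer on a precessing body induces a micro-Doppler frequency that oscillates at the precession rate, and the bandwidth to be estimated is the spread of that frequency track as read off a time-frequency representation. The amplitudes and rates below are illustrative, not the paper's radar parameters:

```python
import math

def micro_doppler(t, f_p=4.0, amplitude=120.0, phase=0.3):
    """Sinusoidal model of the micro-Doppler frequency (Hz) induced by a
    scatterer on a body precessing at f_p revolutions per second; the
    amplitude depends on geometry and carrier wavelength (values invented)."""
    return amplitude * math.cos(2 * math.pi * f_p * t + phase)

def bandwidth(track) -> float:
    """Micro-Doppler bandwidth estimated as the spread of the instantaneous
    frequency track, as a time-frequency analysis would measure it."""
    return max(track) - min(track)

t = [i / 1000.0 for i in range(1000)]      # 1 s sampled at 1 kHz
track = [micro_doppler(ti) for ti in t]
bw = bandwidth(track)                      # ~240 Hz, i.e. 2 x amplitude
```

    Because the bandwidth depends only on the spread of the frequency track, not on its centre, a bulk velocity offset shifts the whole track without changing the estimate, which is the sense in which such an estimator avoids velocity compensation.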

  14. Research on Methods for Discovering and Selecting Cloud Infrastructure Services Based on Feature Modeling

    Directory of Open Access Journals (Sweden)

    Huamin Zhu

    2016-01-01

    Full Text Available Nowadays more and more cloud infrastructure service providers offer large numbers of service instances that combine diversified resources, such as computing, storage, and network. However, for cloud infrastructure services, the lack of a description standard and the inadequate research on systematic discovery and selection methods have made it difficult for users to discover and choose services. First, considering the highly configurable properties of a cloud infrastructure service, the feature model method is used to describe such a service. Second, based on this description, a systematic discovery and selection method for cloud infrastructure services is proposed. Automatic analysis techniques for the feature model are introduced to verify the model's validity and to match the service and demand models. Finally, we determine the critical decision metrics and their corresponding measurement methods for cloud infrastructure services, where the subjective and objective weighting results are combined to determine the weights of the decision metrics. The best matching instances from various providers are then ranked by their comprehensive evaluations. Experimental results show that the proposed methods can effectively improve the accuracy and efficiency of cloud infrastructure service discovery and selection.
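    The final ranking step can be sketched as a weighted sum of min-max-normalized decision metrics, with cost metrics (e.g. price) inverted. The instance names, metric values, and weights below are hypothetical; the combined subjective/objective weights from the paper would be supplied as the weight vector:

```python
def normalize(values, benefit=True):
    """Min-max normalise one metric column; cost metrics are inverted."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    return [(v - lo) / span if benefit else (hi - v) / span for v in values]

def rank(instances, weights, benefit_flags):
    """Comprehensive evaluation = weighted sum of normalised metrics."""
    cols = list(zip(*[inst["metrics"] for inst in instances]))
    norm = [normalize(col, b) for col, b in zip(cols, benefit_flags)]
    scores = [sum(w * norm[j][i] for j, w in enumerate(weights))
              for i in range(len(instances))]
    order = sorted(zip(instances, scores), key=lambda p: -p[1])
    return [(inst["name"], round(s, 3)) for inst, s in order]

# metrics: (vCPUs, RAM GB, price $/h); price is a cost metric
instances = [
    {"name": "provider-A.small",  "metrics": (2, 4, 0.10)},
    {"name": "provider-B.medium", "metrics": (4, 8, 0.20)},
    {"name": "provider-C.large",  "metrics": (8, 16, 0.60)},
]
ranking = rank(instances, weights=[0.4, 0.4, 0.2],
               benefit_flags=[True, True, False])
```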

  15. A secure EHR system based on hybrid clouds.

    Science.gov (United States)

    Chen, Yu-Yi; Lu, Jun-Chao; Jan, Jinn-Ke

    2012-10-01

    Application services for remote medical care and electronic health records (EHRs) have become a hot topic and have stimulated growing research interest in recent years. Information and communication technologies have been applied to medical services and healthcare for a number of years to resolve problems in medical management. Sharing EHR information can provide professional medical programs with consultancy, evaluation, and tracing services, and can certainly improve the accessibility of medical services and medical information for the public at remote sites. With the widespread use of EHRs, building a secure EHR sharing environment has attracted much attention in both the healthcare industry and the academic community. The cloud computing paradigm is one of the popular health-IT infrastructures for facilitating EHR sharing and integration. In this paper, we propose an EHR sharing and integration system for healthcare clouds and analyze the security and privacy issues that arise in the access and management of EHRs.

  16. Education on the Cloud: Researching Student-Centered, Cloud-Based Learning Prospects in the Context of a European Network

    Science.gov (United States)

    Panoutsopoulos, Hercules; Donert, Karl; Papoutsis, Panos; Kotsanis, Ioannis

    2015-01-01

    During the last few years, ongoing developments in the technological field of Cloud computing have initiated discourse on the potential of the Cloud to be systematically exploited in educational contexts. Research interest has been stimulated by a range of advantages of Cloud technologies (e.g. adaptability, flexibility, scalability,…

  17. Characterization of AVHRR global cloud detection sensitivity based on CALIPSO-CALIOP cloud optical thickness information: demonstration of results based on the CM SAF CLARA-A2 climate data record

    Science.gov (United States)

    Karlsson, Karl-Göran; Håkansson, Nina

    2018-02-01

    The sensitivity in detecting thin clouds of the cloud screening method being used in the CM SAF cloud, albedo and surface radiation data set from AVHRR data (CLARA-A2) cloud climate data record (CDR) has been evaluated using cloud information from the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) onboard the CALIPSO satellite. The sensitivity, including its global variation, has been studied based on collocations of Advanced Very High Resolution Radiometer (AVHRR) and CALIOP measurements over a 10-year period (2006-2015). The cloud detection sensitivity has been defined as the minimum cloud optical thickness for which 50 % of clouds could be detected, with the global average sensitivity estimated to be 0.225. After using this value to reduce the CALIOP cloud mask (i.e. clouds with optical thickness below this threshold were interpreted as cloud-free cases), cloudiness results were found to be basically unbiased over most of the globe except over the polar regions where a considerable underestimation of cloudiness could be seen during the polar winter. The overall probability of detecting clouds in the polar winter could be as low as 50 % over the highest and coldest parts of Greenland and Antarctica, showing that a large fraction of optically thick clouds also remains undetected here. The study included an in-depth analysis of the probability of detecting a cloud as a function of the vertically integrated cloud optical thickness as well as of the cloud's geographical position. Best results were achieved over oceanic surfaces at mid- to high latitudes where at least 50 % of all clouds with an optical thickness down to a value of 0.075 were detected. Corresponding cloud detection sensitivities over land surfaces outside of the polar regions were generally larger than 0.2 with maximum values of approximately 0.5 over the Sahara and the Arabian Peninsula. 
For polar land surfaces the values were close to 1 or higher with maximum values of 4.5 for the parts
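    The sensitivity definition used above, the minimum cloud optical thickness at which 50 % of clouds are detected, can be computed from AVHRR-CALIOP collocations by binning the probability of detection (PoD) against CALIOP optical thickness and interpolating the 50 % crossing. The collocation data below are synthetic, for illustration only:

```python
def detection_sensitivity(samples, bins):
    """Cloud optical thickness at which the PoD reaches 50 %, found by
    linear interpolation between bin-centre PoD estimates.
    samples: (optical_thickness, detected 0/1) pairs."""
    pods = []
    for lo, hi in zip(bins, bins[1:]):
        hits = [d for cot, d in samples if lo <= cot < hi]
        pods.append(((lo + hi) / 2, sum(hits) / len(hits)))
    for (x0, p0), (x1, p1) in zip(pods, pods[1:]):
        if p0 < 0.5 <= p1:
            return x0 + (0.5 - p0) * (x1 - x0) / (p1 - p0)
    return None

# synthetic collocations: thin clouds mostly missed, thicker ones detected
samples = ([(0.05, 0)] * 80 + [(0.05, 1)] * 20 +
           [(0.15, 0)] * 40 + [(0.15, 1)] * 60 +
           [(0.30, 1)] * 90 + [(0.30, 0)] * 10)
s = detection_sensitivity(samples, bins=[0.0, 0.1, 0.2, 0.4])
# PoD rises from 0.2 to 0.6 between bin centres 0.05 and 0.15 -> s = 0.125
```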

  18. ESTIMATING AIRCRAFT HEADING BASED ON LASERSCANNER DERIVED POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    Z. Koppanyi

    2015-03-01

    Full Text Available Using LiDAR sensors for tracking and monitoring an operating aircraft is a new application. In this paper, we present data processing methods to estimate the heading of a taxiing aircraft using laser point clouds. During the data acquisition, a Velodyne HDL-32E laser scanner tracked a moving Cessna 172 airplane. The point clouds captured at different times were used for heading estimation. After addressing the problem and specifying the equation of motion to reconstruct the aircraft point cloud from the consecutive scans, three methods are investigated here. The first requires a reference model to estimate the relative angle from the captured data by fitting different cross-sections (horizontal profiles). In the second approach, the iterative closest point (ICP) method is used between consecutive point clouds to determine the horizontal translation of the captured aircraft body. Regarding the ICP, three different versions were compared, namely, the ordinary 3D, 3-DoF 3D, and 2-DoF 3D ICP. It was found that the 2-DoF 3D ICP provides the best performance. Finally, the last algorithm searches for the unknown heading and velocity parameters by minimizing the volume of the reconstructed plane. The three methods were compared using three test datasets, which are distinguished by object-sensor distance, heading, and velocity. We found that the ICP algorithm fails at long distances and when the aircraft motion direction is perpendicular to the scan plane, but the first and third methods give robust and accurate results at 40 m object distance and at ~12 knots for a small Cessna airplane.

  19. Estimating Aircraft Heading Based on Laserscanner Derived Point Clouds

    Science.gov (United States)

    Koppanyi, Z.; Toth, C., K.

    2015-03-01

    Using LiDAR sensors for tracking and monitoring an operating aircraft is a new application. In this paper, we present data processing methods to estimate the heading of a taxiing aircraft using laser point clouds. During the data acquisition, a Velodyne HDL-32E laser scanner tracked a moving Cessna 172 airplane. The point clouds captured at different times were used for heading estimation. After addressing the problem and specifying the equation of motion to reconstruct the aircraft point cloud from the consecutive scans, three methods are investigated here. The first requires a reference model to estimate the relative angle from the captured data by fitting different cross-sections (horizontal profiles). In the second approach, the iterative closest point (ICP) method is used between consecutive point clouds to determine the horizontal translation of the captured aircraft body. Regarding the ICP, three different versions were compared, namely, the ordinary 3D, 3-DoF 3D, and 2-DoF 3D ICP. It was found that the 2-DoF 3D ICP provides the best performance. Finally, the last algorithm searches for the unknown heading and velocity parameters by minimizing the volume of the reconstructed plane. The three methods were compared using three test datasets, which are distinguished by object-sensor distance, heading, and velocity. We found that the ICP algorithm fails at long distances and when the aircraft motion direction is perpendicular to the scan plane, but the first and third methods give robust and accurate results at 40 m object distance and at ~12 knots for a small Cessna airplane.
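    To make the ICP idea concrete, here is a translation-only (2-DoF) sketch in the spirit of the paper's best-performing variant: alternate nearest-neighbour matching with a mean-offset update until the estimate stops moving. The toy cross-section data are invented for illustration:

```python
import math

def icp_translation(source, target, iters=20):
    """Estimate the planar translation aligning two scans by alternating
    nearest-neighbour matching and mean-offset updates (heading estimation,
    handled by the full 2-DoF method, is omitted here)."""
    tx, ty = 0.0, 0.0
    for _ in range(iters):
        moved = [(x + tx, y + ty) for x, y in source]
        # match each moved source point to its nearest target point
        pairs = [(p, min(target,
                         key=lambda q: (q[0] - p[0]) ** 2 + (q[1] - p[1]) ** 2))
                 for p in moved]
        dx = sum(q[0] - p[0] for p, q in pairs) / len(pairs)
        dy = sum(q[1] - p[1] for p, q in pairs) / len(pairs)
        tx, ty = tx + dx, ty + dy
        if math.hypot(dx, dy) < 1e-9:
            break
    return tx, ty

# a toy cross-section, and the same scan shifted by a known small offset
target = [(float(i), (i % 3) * 0.5) for i in range(10)]
source = [(x - 0.4, y - 0.2) for x, y in target]
tx, ty = icp_translation(source, target)
assert abs(tx - 0.4) < 1e-6 and abs(ty - 0.2) < 1e-6
```

    Because the offset is smaller than half the point spacing, every nearest-neighbour match is correct and one iteration already recovers the exact shift; the paper's observed failures at long range correspond to sparse scans where these matches go wrong.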

  20. Distributed MRI reconstruction using Gadgetron-based cloud computing.

    Science.gov (United States)

    Xue, Hui; Inati, Souheil; Sørensen, Thomas Sangild; Kellman, Peter; Hansen, Michael S

    2015-03-01

    To expand the open source Gadgetron reconstruction framework to support distributed computing and to demonstrate that a multinode version of the Gadgetron can be used to provide nonlinear reconstruction with clinically acceptable latency. The Gadgetron framework was extended with new software components that enable an arbitrary number of Gadgetron instances to collaborate on a reconstruction task. This cloud-enabled version of the Gadgetron was deployed on three different distributed computing platforms ranging from a heterogeneous collection of commodity computers to the commercial Amazon Elastic Compute Cloud. The Gadgetron cloud was used to provide nonlinear, compressed sensing reconstruction on a clinical scanner with low reconstruction latency (e.g., for cardiac and neuroimaging applications). The proposed setup was able to handle acquisition and ℓ1-SPIRiT reconstruction of nine high temporal resolution real-time cardiac short-axis cine acquisitions, covering the ventricles for functional evaluation, in under 1 min. A three-dimensional high-resolution brain acquisition with 1 mm³ isotropic pixel size was acquired and reconstructed with nonlinear reconstruction in less than 5 min. A distributed computing-enabled Gadgetron provides a scalable way to improve reconstruction performance using commodity cluster computing. Nonlinear, compressed sensing reconstruction can be deployed clinically with low image reconstruction latency. © 2014 Wiley Periodicals, Inc.

  1. Analysis and Research on Spatial Data Storage Model Based on Cloud Computing Platform

    Science.gov (United States)

    Hu, Yong

    2017-12-01

    In this paper, the data processing and storage characteristics of cloud computing are analyzed and studied. On this basis, a cloud computing data storage model based on a BP neural network is proposed. The model selects a server cluster according to the attributes of the data, yielding a spatial data storage model with load balancing that offers practical feasibility and application advantages.

  2. On Designing a Generic Framework for Cloud-based Big Data Analytics

    OpenAIRE

    Khan, Samiya; Alam, Mansaf

    2017-01-01

    Big data analytics has gathered immense research attention lately because of its ability to harness useful information from heaps of data. Cloud computing has been adjudged one of the best infrastructural solutions for implementing big data analytics. This research paper proposes a five-layer model for cloud-based big data analytics that uses dew computing and edge computing concepts. Besides this, the paper also presents an approach for creating a custom big data stack by selecting ...

  3. A Decision Matrix and Monitoring based Framework for Infrastructure Performance Enhancement in A Cloud based Environment

    OpenAIRE

    Alam, Mansaf; Shakil, Kashish Ara

    2014-01-01

    The cloud environment is very different from the traditional computing environment, and tracking the performance of a cloud therefore imposes additional requirements. Data moves through the cloud very fast, so the resources and infrastructure at its disposal must be equally competent. Infrastructure-level performance in the cloud involves the performance of servers, network, and storage, which act as the heart and soul driving the entire cloud business. Thus a constant improve...

  4. High velocity properties of the dynamic frictional force between ductile metals

    International Nuclear Information System (INIS)

    Hammerberg, James Edward; Hollan, Brad L.; Germann, Timothy C.; Ravelo, Ramon J.

    2010-01-01

    The high velocity properties of the tangential frictional force between ductile metal interfaces seen in large-scale NonEquilibrium Molecular Dynamics (NEMD) simulations are characterized by interesting scaling behavior. In many cases a power law decrease in the frictional force with increasing velocity is observed at high velocities. We discuss the velocity dependence of the high velocity branch of the tangential force in terms of structural transformation and ultimate transition, at the highest velocities, to confined fluid behavior characterized by a critical strain rate. The particular case of an Al/Al interface is discussed.

  5. Comparing parameterized versus measured microphysical properties of tropical convective cloud bases during the ACRIDICON–CHUVA campaign

    Directory of Open Access Journals (Sweden)

    R. C. Braga

    2017-06-01

    Full Text Available The objective of this study is to validate parameterizations that were recently developed for satellite retrievals of cloud condensation nuclei supersaturation spectra, NCCN(S), at cloud base, alongside more traditional parameterizations connecting NCCN(S) with cloud base updrafts and drop concentrations. This was based on the HALO aircraft measurements during the ACRIDICON–CHUVA campaign over the Amazon region, which took place in September 2014. The properties of convective clouds were measured with a cloud combination probe (CCP), a cloud and aerosol spectrometer (CAS-DPOL), and a CCN counter onboard the HALO aircraft. An intercomparison of the cloud drop size distributions (DSDs) and the cloud water content (CWC) derived from the different instruments generally shows good agreement within the instrumental uncertainties. To this end, the directly measured cloud drop concentrations (Nd) near cloud base were compared with inferred values based on the measured cloud base updraft velocity (Wb) and NCCN(S) spectra. The measurements of Nd at cloud base were also compared with drop concentrations (Na) derived on the basis of an adiabatic assumption and obtained from the vertical evolution of cloud drop effective radius (re) above cloud base. The measurements of NCCN(S) and Wb reproduced the observed Nd within the measurement uncertainties when Twomey's classic (1959) parameterization was used. The agreement between the measured and calculated Nd was only within a factor of 2 when attempting to use the cloud base S obtained from the measured Wb, Nd, and NCCN(S). This underscores the yet unresolved challenge of aircraft measurements of S in clouds. Importantly, the vertical evolution of re with height reproduced the observation-based, nearly adiabatic cloud base drop concentrations, Na. The combination of these results provides aircraft observational support for the various components of the satellite-retrieved methodology that was recently developed to
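    Twomey's (1959) parameterization referenced above models the CCN spectrum as the power law NCCN(S) = C·S^k, whose parameters can be recovered from CCN-counter readings by a least-squares fit in log-log space. The spectrum below is synthetic (S in % supersaturation, N in cm⁻³), purely to demonstrate the fit:

```python
import math

def fit_twomey(spectrum):
    """Least-squares fit of Twomey's power-law CCN spectrum
    N_CCN(S) = C * S**k, done as a linear regression in log-log space."""
    xs = [math.log(s) for s, _ in spectrum]
    ys = [math.log(n) for _, n in spectrum]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    k = (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) /
         sum((x - mx) ** 2 for x in xs))
    c = math.exp(my - k * mx)
    return c, k

# synthetic CCN-counter readings following C = 500 cm^-3, k = 0.6
spectrum = [(s, 500.0 * s ** 0.6) for s in (0.1, 0.2, 0.4, 0.8)]
c, k = fit_twomey(spectrum)
assert abs(c - 500.0) < 1e-6 and abs(k - 0.6) < 1e-9
```

    With C and k in hand, evaluating the spectrum at the cloud-base supersaturation implied by the measured updraft gives the activated drop concentration compared against Nd in the study.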

  6. A Physically Based Algorithm for Non-Blackbody Correction of Cloud-Top Temperature and Application to Convection Study

    Science.gov (United States)

    Wang, Chunpeng; Lou, Zhengzhao Johnny; Chen, Xiuhong; Zeng, Xiping; Tao, Wei-Kuo; Huang, Xianglei

    2014-01-01

    Cloud-top temperature (CTT) is an important parameter for convective clouds and is usually different from the 11-micrometer brightness temperature due to non-blackbody effects. This paper presents an algorithm for estimating convective CTT by using simultaneous passive [Moderate Resolution Imaging Spectroradiometer (MODIS)] and active [CloudSat and Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO)] measurements of clouds to correct for the non-blackbody effect. To do this, a weighting function of the MODIS 11-micrometer band is explicitly calculated by feeding cloud hydrometeor profiles from CloudSat and CALIPSO retrievals and temperature and humidity profiles based on ECMWF analyses into a radiative transfer model. Among 16 837 tropical deep convective clouds observed by CloudSat in 2008, the averaged effective emission level (EEL) of the 11-micrometer channel is located at an optical depth of approximately 0.72, with a standard deviation of 0.3. The distance between the EEL and the cloud-top height determined by CloudSat is shown to be related to a parameter called cloud-top fuzziness (CTF), defined as the vertical separation between -30 and 10 dBZ of CloudSat radar reflectivity. On the basis of these findings, a relationship is then developed between the CTF and the difference between the MODIS 11-micrometer brightness temperature and the physical CTT, the latter being the non-blackbody correction of CTT. The correction of the non-blackbody effect of CTT is applied to analyze convective cloud-top buoyancy. With this correction, about 70% of the convective cores observed by CloudSat in the height range of 6-10 km have positive buoyancy near cloud top, meaning the clouds are still growing vertically, although their final fate cannot be determined from snapshot observations.
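    The effective emission level can be located by integrating extinction downward from cloud top until the mean optical depth of ~0.72 reported above is reached. The uniform-extinction profile below is an invented example, not CloudSat data:

```python
def effective_emission_height(heights, extinction, tau_eel=0.72):
    """Height at which optical depth integrated downward from cloud top
    reaches tau_eel. heights: layer boundaries in km, top to bottom;
    extinction: per-layer extinction in km^-1."""
    tau = 0.0
    for (h0, h1), ext in zip(zip(heights, heights[1:]), extinction):
        dtau = ext * (h0 - h1)
        if tau + dtau >= tau_eel:
            # linear interpolation within the layer
            return h0 - (tau_eel - tau) / ext
        tau += dtau
    return heights[-1]          # cloud too thin: EEL at cloud base

# a deep convective cloud: top at 12 km, uniform extinction 0.9 km^-1
heights = [12.0, 11.0, 10.0, 9.0, 8.0]
extinction = [0.9, 0.9, 0.9, 0.9]
h = effective_emission_height(heights, extinction)   # 12 - 0.72/0.9 = 11.2 km
```

    The gap between the CloudSat cloud-top height (12 km here) and this height is the quantity the paper relates to cloud-top fuzziness.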

  7. Investigation of tropical cirrus cloud properties using ground based lidar measurements

    Science.gov (United States)

    Dhaman, Reji K.; Satyanarayana, Malladi; Krishnakumar, V.; Mahadevan Pillai, V. P.; Jayeshlal, G. S.; Raghunath, K.; Venkat Ratnam, M.

    2016-05-01

    Cirrus clouds play a significant role in the Earth's radiation budget. Therefore, knowledge of the geometrical and optical properties of cirrus clouds is essential for climate modeling. In this paper, cirrus cloud microphysical and optical properties are derived using ground-based lidar measurements over an inland tropical station, Gadanki (13.5°N, 79.2°E), Andhra Pradesh, India. The variation of cirrus microphysical and optical properties with mid-cloud temperature is also studied. The cirrus cloud mean height is generally observed in the range of 9-17 km, with a peak occurrence at 13-14 km. The cirrus mid-cloud temperature ranges from -81°C to -46°C. The cirrus geometrical thickness ranges from 0.9-4.5 km. On cirrus occurrence days, sub-visual, thin, and dense cirrus were observed at 37.5%, 50%, and 12.5%, respectively. The monthly cirrus optical depth ranges from 0.01-0.47, but most (>80%) of the cirrus have values less than 0.1. Optical depth shows a strong dependence on cirrus geometrical thickness and mid-cloud height. The monthly mean cirrus extinction ranges from 2.8E-06 to 8E-05, and the depolarization ratio and lidar ratio vary from 0.13 to 0.77 and 2 to 52 sr, respectively. A positive correlation exists for both optical depth and extinction with mid-cloud temperature. The lidar ratio shows scattered behavior with mid-cloud temperature.

  8. Feasibility and demonstration of a cloud-based RIID analysis system

    Science.gov (United States)

    Wright, Michael C.; Hertz, Kristin L.; Johnson, William C.; Sword, Eric D.; Younkin, James R.; Sadler, Lorraine E.

    2015-06-01

    A significant limitation in the operational utility of handheld and backpack radioisotope identifiers (RIIDs) is the inability of their onboard algorithms to accurately and reliably identify the isotopic sources of the measured gamma-ray energy spectrum. A possible solution is to move the spectral analysis computations to an external device, the cloud, where significantly greater capabilities are available. The implementation and demonstration of a prototype cloud-based RIID analysis system have shown this type of system to be feasible with currently available communication and computational technology. A system study has shown that the potential user community could derive significant benefits from an appropriately implemented cloud-based analysis system and has identified the design and operational characteristics required by the users and stakeholders for such a system. A general description of the hardware and software necessary to implement reliable cloud-based analysis, the value of the cloud expressed by the user community, and the aspects of the cloud implemented in the demonstrations are discussed.

  9. CLOUD-BASED VS DESKTOP-BASED PROPERTY MANAGEMENT SYSTEMS IN HOTEL

    Directory of Open Access Journals (Sweden)

    Mustafa GULMEZ

    2015-06-01

    Full Text Available Even though keeping up with modern developments in the IT sector is crucial for the success and competitiveness of a hotel, it is usually very hard for new technologies to be accepted and implemented. This is the case with cloud technology, on which hoteliers' opinions are divided: some think it is just another fashion trend, unnecessary to take into consideration, while others believe it helps in performing daily operations more easily, leaving space for more interaction with guests in both the virtual and the real world. Usage of cloud technology in hotels is still in its early phase, and hoteliers still have to learn more about its advantages and adequate usage for the benefit of overall hotel operations. Using the example of the hotel property management system (PMS) and a comparison between the features of its older desktop version and new web-based programs, this research aims at finding out at which stage and how effective the usage of cloud technology in hotels is. For this, qualitative research with semi-structured interviews with hotel managers that use one of these programs was conducted. Reasons for usage and advantages of each version are discussed.

  10. FPFH-based graph matching for 3D point cloud registration

    Science.gov (United States)

    Zhao, Jiapeng; Li, Chen; Tian, Lihua; Zhu, Jihua

    2018-04-01

    Correspondence detection is a vital step in point cloud registration, and it can help obtain a reliable initial alignment. In this paper, we put forward an advanced point feature-based graph matching algorithm to solve the initial alignment problem of rigid 3D point cloud registration with partial overlap. Specifically, Fast Point Feature Histograms are first used to determine the initial possible correspondences. Next, a new objective function is provided to make the graph matching more suitable for partially overlapping point clouds. The objective function is optimized by the simulated annealing algorithm to obtain the final group of correct correspondences. Finally, we present a novel set partitioning method that can transform the NP-hard optimization problem into an O(n³)-solvable one. Experiments on the Stanford and UWA public data sets indicate that our method can obtain better results in terms of both accuracy and time cost compared with other point cloud registration methods.
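    The simulated annealing step can be sketched on a toy matching problem: score a subset of candidate correspondences by how well it preserves pairwise distances (which any rigid motion must), and anneal over subsets. The objective and all data below are illustrative, not the paper's exact formulation:

```python
import math
import random

random.seed(1)

def consistency(sel, cand, src, dst):
    """Score a correspondence subset by pairwise distance preservation:
    consistent pairs vote +1, inconsistent pairs vote -1."""
    pairs = [cand[i] for i in sorted(sel)]
    score = 0.0
    for a in range(len(pairs)):
        for b in range(a + 1, len(pairs)):
            (i, j), (k, l) = pairs[a], pairs[b]
            d_src = math.dist(src[i], src[k])
            d_dst = math.dist(dst[j], dst[l])
            score += 1.0 if abs(d_src - d_dst) < 0.1 else -1.0
    return score

def anneal(cand, src, dst, steps=3000, t0=0.5):
    """Simulated annealing over subsets of candidate correspondences:
    toggle one candidate per step, accept worse states with probability
    exp(delta / T) under a linearly cooling temperature."""
    sel = set(range(len(cand)))
    cur = consistency(sel, cand, src, dst)
    best, best_score = set(sel), cur
    for step in range(steps):
        t = t0 * (1 - step / steps) + 1e-9
        i = random.randrange(len(cand))
        trial = sel ^ {i}
        s = consistency(trial, cand, src, dst)
        if s >= cur or random.random() < math.exp((s - cur) / t):
            sel, cur = trial, s
            if s > best_score:
                best, best_score = set(sel), s
    return best

# two copies of a point set related by a translation: four true matches
# plus one spurious candidate (index 4) that distance checks should reject
src = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 2.0)]
dst = [(x + 5.0, y + 5.0) for x, y in src]
cand = [(0, 0), (1, 1), (2, 2), (3, 3), (0, 3)]
best = anneal(cand, src, dst)
assert 4 not in best and {0, 1, 2, 3} <= best
```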

  11. Recent Findings Based on Airborne Measurements at the Interface of Coastal California Clouds and Clear Air

    Science.gov (United States)

    Sorooshian, A.; Crosbie, E.; Wang, Z.; Chuang, P. Y.; Craven, J. S.; Coggon, M. M.; Brunke, M.; Zeng, X.; Jonsson, H.; Woods, R. K.; Flagan, R. C.; Seinfeld, J.

    2015-12-01

    Recent aircraft field experiments with the Center for Interdisciplinary Remotely Piloted Aircraft Studies (CIRPAS) Twin Otter have targeted interfaces between clear and cloudy areas along the California coast. These campaigns, based out of Marina, California in the July-August time frame, include the Eastern Pacific Emitted Aerosol Cloud Experiment (E-PEACE, 2011), Nucleation in California Experiment (NiCE, 2013), and the Biological Ocean Atmospheric Study (BOAS, 2015). Results will be presented related to (i) aqueous processing of natural and anthropogenic emissions, (ii) vertical re-distribution of ocean micronutrients, and (iii) stratocumulus cloud clearings and notable thermodynamic and aerosol contrasts across the clear-cloudy interface. The results have implications for modeling and observational studies of marine boundary layer clouds, especially in relation to aerosol-cloud interactions.

  12. Intrusion detection in cloud computing based attack patterns and risk assessment

    Directory of Open Access Journals (Sweden)

    Ben Charhi Youssef

    2017-05-01

    Full Text Available This paper is an extension of work originally presented at SYSCO CONF. We extend our previous work by presenting the initial results of implementing intrusion detection based on risk assessment in cloud computing. The idea focuses on a novel approach to detecting cyber-attacks on the cloud environment by analyzing attack patterns using risk assessment methodologies. The aim of our solution is to combine evidence obtained from Intrusion Detection Systems (IDSs) deployed in a cloud with the risk assessment associated with each attack pattern. Our approach presents a new qualitative solution that analyzes each symptom, indicator, and vulnerability, assessing the impact and likelihood of distributed, multi-step attacks directed at cloud environments. Implementing this approach will reduce the number of false alerts and improve the performance of the IDS.
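    The combination of IDS evidence with per-pattern risk can be sketched as risk = likelihood × impact × evidence strength, with low-risk alerts suppressed to cut the false-alert volume. The pattern knowledge base, scores, and threshold below are all hypothetical:

```python
# hypothetical attack-pattern knowledge base: likelihood and impact in [0, 1]
PATTERNS = {
    "ssh-bruteforce": {"likelihood": 0.7, "impact": 0.4},
    "vm-escape":      {"likelihood": 0.1, "impact": 0.9},
    "port-scan":      {"likelihood": 0.9, "impact": 0.1},
}

def risk(pattern: str, evidence_strength: float) -> float:
    """Qualitative risk = likelihood x impact, scaled by how strong the
    IDS evidence for this alert is (a sketch of the combination, not the
    paper's exact methodology)."""
    p = PATTERNS[pattern]
    return p["likelihood"] * p["impact"] * evidence_strength

def filter_alerts(alerts, threshold=0.05):
    """Suppress low-risk alerts to reduce the false-alert volume."""
    return [(a, e) for a, e in alerts if risk(a, e) >= threshold]

alerts = [("port-scan", 0.3), ("ssh-bruteforce", 0.8), ("vm-escape", 0.9)]
kept = filter_alerts(alerts)
# the weakly evidenced port scan is suppressed; the rest are escalated
assert ("port-scan", 0.3) not in kept
```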

  13. A Dynamic Resource Scheduling Method Based on Fuzzy Control Theory in Cloud Environment

    Directory of Open Access Journals (Sweden)

    Zhijia Chen

    2015-01-01

Full Text Available The resources in a cloud environment feature large scale, diversity, and heterogeneity. Moreover, user requirements for cloud computing resources are commonly characterized by uncertainty and imprecision. Hence, to improve the quality of cloud computing services, not only should traditional criteria such as cost and bandwidth be satisfied, but particular emphasis should also be placed on extended criteria such as system friendliness. This paper proposes a dynamic resource scheduling method based on fuzzy control theory. Firstly, a resource-requirements prediction model is established. Then the relationships between resource availability and resource requirements are derived. Afterwards, fuzzy control theory is adopted to realize a friendly match between user needs and resource availability. Results show that this approach improves resource scheduling efficiency and the quality of service (QoS) of cloud computing.
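
The fuzzy matching step can be illustrated with a toy controller. The membership functions, rule consequents, and the utilization input below are all illustrative assumptions, not the paper's actual design:

```python
# Minimal fuzzy-control sketch: map cluster utilization to a change in
# allocated resources via three hypothetical fuzzy rules.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def schedule_adjustment(utilization):
    """Fuzzy rules: low utilization -> shrink pool, medium -> hold, high -> grow."""
    low = tri(utilization, -0.4, 0.0, 0.5)
    med = tri(utilization, 0.2, 0.5, 0.8)
    high = tri(utilization, 0.5, 1.0, 1.4)
    # Rule consequents (change in allocated VMs), combined by
    # weighted-average (centroid) defuzzification.
    consequents = {-2: low, 0: med, +3: high}
    num = sum(v * w for v, w in consequents.items())
    den = sum(consequents.values())
    return num / den if den else 0.0
```

For example, a utilization of 0.5 fires only the "hold" rule, so no adjustment is made; higher utilizations yield a positive (grow) adjustment.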

  14. High-Precision Registration of Point Clouds Based on Sphere Feature Constraints

    Directory of Open Access Journals (Sweden)

    Junhui Huang

    2016-12-01

Full Text Available Point cloud registration is a key process in multi-view 3D measurement, and its precision directly affects the measurement precision. However, for point clouds with non-overlapping areas or curvature-invariant surfaces, it is difficult to achieve high precision. This paper presents a high-precision registration method based on sphere feature constraints to overcome this difficulty. Known sphere features with constraints are used to construct virtual overlapping areas, which provide more accurate corresponding point pairs and reduce the influence of noise. The transformation parameters between the registered point clouds are then solved by an optimization method with a weight function. In this way, the impact of large noise in the point clouds is reduced and high-precision registration is achieved. Simulation and experiments validate the proposed method.
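
The core step of solving transformation parameters from matched sphere features can be sketched with a standard Kabsch/SVD alignment of corresponding sphere centres. The paper's weight function and optimization details are omitted; the function below is a generic least-squares solver, not the authors' implementation:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t with dst ~ R @ src + t,
    given matched point sets (e.g., sphere centres in two scans)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)            # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t
```

Given at least three non-collinear sphere centres, this recovers the rigid transform exactly in the noise-free case.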

  15. Evaluation of the MiKlip decadal prediction system using satellite based cloud products

    Directory of Open Access Journals (Sweden)

    Thomas Spangehl

    2016-12-01

Full Text Available The decadal hindcast simulations performed for the Mittelfristige Klimaprognosen (MiKlip) project are evaluated using satellite-retrieved cloud parameters from the CM SAF cLoud, Albedo and RAdiation dataset from AVHRR data (CLARA-A1) provided by the EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF) and from the International Satellite Cloud Climatology Project (ISCCP). The forecast quality of two sets of hindcasts, Baseline-1-LR and Baseline-0, which use differing initialisations, is assessed. Basic evaluation focuses on multi-year ensemble mean fields and cloud-type histograms utilizing satellite simulator output. Additionally, ensemble evaluation employing analysis of variance (ANOVA), analysis rank histograms (ARH) and a deterministic correlation score is performed. Satellite simulator output is available for a subset of the full hindcast ensembles only; therefore, the raw model cloud cover is used complementarily. The new Baseline-1-LR hindcasts are closer to the satellite data with respect to the simulated tropical/subtropical mean cloud cover pattern than the reference hindcasts (Baseline-0), emphasizing improvements of the new MiKlip initialisation procedure. A slightly overestimated occurrence rate of optically thick cloud types is found for different experiments, including hindcasts and simulations using realistic sea surface boundaries according to the Atmospheric Model Intercomparison Project (AMIP). By contrast, the evaluation of cirrus and cirrostratus clouds is complicated by observation-based uncertainties. Time series of the 3-year mean total cloud cover averaged over the tropical warm pool (TWP) region show some correlation with the CLARA-A1 cloud fractional cover. Moreover, ensemble evaluation of the Baseline-1-LR hindcasts reveals potential predictability of the 2–5 lead year averaged total cloud cover for a large part of this region when regarding the full observational period. However, the hindcasts show only

  16. Seasonal Bias of Retrieved Ice Cloud Optical Properties Based on MISR and MODIS Measurements

    Science.gov (United States)

    Wang, Y.; Hioki, S.; Yang, P.; Di Girolamo, L.; Fu, D.

    2017-12-01

The precise estimation of two important cloud optical and microphysical properties, cloud particle optical thickness and cloud particle effective radius, is fundamental to the study of the radiative energy budget and the hydrological cycle. In retrieving these two properties, an appropriate choice of ice particle surface roughness is important because roughness substantially affects the single-scattering properties. At present, using a predetermined ice particle shape without spatial and temporal variation is common practice in satellite-based retrievals, and this approach leads to substantial retrieval uncertainties. The cloud radiances measured by each of the cameras of the Multi-angle Imaging SpectroRadiometer (MISR) instrument are used to estimate spherical albedo values at different scattering angles. By analyzing the directional distribution of the estimated spherical albedo values, the degree of ice particle surface roughness is estimated. With an optimal degree of ice particle roughness, cloud optical thickness and effective radius are retrieved based on a bi-spectral shortwave technique in conjunction with two Moderate Resolution Imaging Spectroradiometer (MODIS) bands centered at 0.86 and 2.13 μm. The seasonal biases of the retrieved cloud optical and microphysical properties, caused by the uncertainties in ice particle roughness, are investigated using one year of MISR-MODIS fused data.

  17. SocialCloudShare: a Facebook Application for a Relationship-based Information Sharing in the Cloud

    Directory of Open Access Journals (Sweden)

    Davide Albertini

    2014-10-01

Full Text Available In the last few years, Online Social Networks (OSNs) have become one of the most used platforms for sharing data (e.g., pictures, short texts) on the Internet. Nowadays Facebook and Twitter are the most popular OSN providers, though they implement different social models. However, independently of the social model they implement, OSN platforms have become a widespread repository of personal information. All these data (e.g., profile information, shared elements, users’ likes) are stored in a centralized repository that can be exploited for data mining and marketing analysis. With this data collection process, a great deal of sensitive information is gathered by OSN providers that, over time, have become more and more targeted by malicious attackers. To overcome this problem, in this paper we present an architectural framework that, by means of a Social Application registered in Facebook, allows users to move their data (e.g., relationships, resources) outside the OSN realm and store them in the public Cloud. Given that the public Cloud is not a secure and private environment, our proposal provides users security and privacy guarantees over their data by encrypting the resources and anonymizing their social graphs. The presented framework enforces Relationship-Based Access Control (ReBAC) rules over the anonymized social graph, giving OSN users the possibility to selectively share information and resources as they are used to doing in Facebook.

  18. Dynamic resource allocation engine for cloud-based real-time video transcoding in mobile cloud computing environments

    Science.gov (United States)

    Adedayo, Bada; Wang, Qi; Alcaraz Calero, Jose M.; Grecos, Christos

    2015-02-01

    The recent explosion in video-related Internet traffic has been driven by the widespread use of smart mobile devices, particularly smartphones with advanced cameras that are able to record high-quality videos. Although many of these devices offer the facility to record videos at different spatial and temporal resolutions, primarily with local storage considerations in mind, most users only ever use the highest quality settings. The vast majority of these devices are optimised for compressing the acquired video using a single built-in codec and have neither the computational resources nor battery reserves to transcode the video to alternative formats. This paper proposes a new low-complexity dynamic resource allocation engine for cloud-based video transcoding services that are both scalable and capable of being delivered in real-time. Firstly, through extensive experimentation, we establish resource requirement benchmarks for a wide range of transcoding tasks. The set of tasks investigated covers the most widely used input formats (encoder type, resolution, amount of motion and frame rate) associated with mobile devices and the most popular output formats derived from a comprehensive set of use cases, e.g. a mobile news reporter directly transmitting videos to the TV audience of various video format requirements, with minimal usage of resources both at the reporter's end and at the cloud infrastructure end for transcoding services.

  19. An Interactive Web-Based Analysis Framework for Remote Sensing Cloud Computing

    Science.gov (United States)

    Wang, X. Z.; Zhang, H. M.; Zhao, J. H.; Lin, Q. H.; Zhou, Y. C.; Li, J. H.

    2015-07-01

Spatiotemporal data, especially remote sensing data, are widely used in ecological, geographical, agricultural, and military research and applications. With the development of remote sensing technology, more and more remote sensing data are accumulated and stored in the cloud. An effective way for cloud users to access and analyse these massive spatiotemporal data in web clients has become an urgent issue. In this paper, we propose a new scalable, interactive and web-based cloud computing solution for massive remote sensing data analysis. We build a spatiotemporal analysis platform to provide end-users with a safe and convenient way to access massive remote sensing data stored in the cloud. The lightweight cloud storage system used to store public data and users' private data is constructed based on an open-source distributed file system; massive remote sensing data are stored as public data, while intermediate and input data are stored as private data. The elastic, scalable, and flexible cloud computing environment is built using Docker, an open-source lightweight container technology for the Linux operating system. In the Docker container, open-source software such as IPython, NumPy, GDAL, and GRASS GIS is deployed. Users can write scripts in the IPython Notebook web page through the web browser to process data, and the scripts are submitted to the IPython kernel to be executed. By comparing the performance of remote sensing data analysis tasks executed in Docker containers, KVM virtual machines and physical machines respectively, we conclude that the cloud computing environment built with Docker makes the greatest use of host system resources and can handle more concurrent spatio-temporal computing tasks. Docker provides resource isolation mechanisms for IO, CPU, and memory etc., which offers a security guarantee when processing remote sensing data in the IPython Notebook. Users can write

  20. CLOUD BASED WEB 3D GIS TAIWAN PLATFORM

    Directory of Open Access Journals (Sweden)

    W.-F. Tsai

    2012-09-01

Full Text Available This article presents the status of the web 3D GIS platform developed at the National Applied Research Laboratories. The purpose is to develop a global earth observation 3D GIS platform for applications to disaster monitoring and assessment in Taiwan. For quick response with preliminary and detailed assessment after a natural disaster occurs, the web 3D GIS platform is useful for accessing, transferring, integrating, displaying and analyzing huge multi-scale data following international OGC standards. The framework of the cloud service for data warehousing management and efficiency enhancement using VMware is illustrated in this article.

  1. Management of High-Velocity Injuries of the Head and Neck.

    Science.gov (United States)

    Majors, Jacob S; Brennan, Joseph; Holt, G Richard

    2017-11-01

Trauma centers must prepare to manage high-velocity injuries resulting from mass casualty incidents as global terrorism becomes a greater concern and an increasing risk. The most recent conflicts in Iraq and Afghanistan have significantly improved understanding of battlefield trauma and how to appropriately address these injuries. This article applies combat surgery experience to civilian situations, outlines the physiology and kinetics of high-velocity injuries, and reviews applicable triage and management strategies. Published by Elsevier Inc.

  2. Observations of temporal change of nighttime cloud cover from Himawari 8 and ground-based sky camera over Chiba, Japan

    Science.gov (United States)

    Lagrosas, N.; Gacal, G. F. B.; Kuze, H.

    2017-12-01

Detection of nighttime cloud from Himawari 8 is implemented using the difference of digital numbers from bands 13 (10.4 µm) and 7 (3.9 µm). A digital number difference of -1.39×10^4 can be used as a threshold to separate clouds from clear sky conditions. For ground-based observations over Chiba, a digital camera (Canon PowerShot A2300) is used to take images of the sky every 5 minutes at an exposure time of 5 s at the Center for Environmental Remote Sensing, Chiba University. From these images, cloud cover values are obtained using a threshold algorithm (Gacal et al., 2016). Ten-minute nighttime cloud cover values from these two datasets are compared and analyzed from 29 May to 05 June 2017 (20:00-03:00 JST). When compared with lidar data, the camera can detect thick high-level clouds up to 10 km. The results show that during clear sky conditions (02-03 June), both camera and satellite cloud cover values show 0% cloud cover. During cloudy conditions (05-06 June), the camera shows almost 100% cloud cover while satellite cloud cover values range from 60 to 100%. These low values can be attributed to the presence of low-level thin clouds (~2 km above the ground) as observed from the National Institute for Environmental Studies lidar located inside Chiba University. This difference in cloud cover values shows that the camera can produce accurate cloud cover values for low-level clouds that are sometimes not detected by satellites. The opposite occurs when high-level clouds are present (01-02 June). Derived satellite cloud cover shows almost 100% during the whole night while the ground-based camera shows cloud cover values that range from 10 to 100% during the same time interval. The fluctuating values can be attributed to the presence of thin clouds located at around 6 km from the ground and the presence of low-level clouds (~1 km). Since the camera relies on reflected city lights, it is possible that the high-level thin clouds are not observed by the camera but is
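
The band-difference test described above can be sketched as a simple thresholding step. The array names are illustrative, and the abstract does not state on which side of the threshold cloud falls, so "below" is assumed here:

```python
import numpy as np

# Hedged sketch of the band-difference cloud test: pixels whose
# band-13 minus band-7 digital-number difference falls below the
# threshold are flagged as cloud (direction assumed, not stated above).

THRESHOLD = -1.39e4  # DN difference separating cloud from clear sky

def night_cloud_mask(dn_b13, dn_b7, threshold=THRESHOLD):
    """Boolean mask: True where the pixel is classified as cloudy."""
    diff = np.asarray(dn_b13, float) - np.asarray(dn_b7, float)
    return diff < threshold

def cloud_cover_percent(mask):
    """Cloud cover as the percentage of cloudy pixels in the scene."""
    return 100.0 * np.count_nonzero(mask) / mask.size
```

The per-scene cloud cover percentage is then directly comparable to the camera-derived values.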

  3. Traffic Flow Prediction Model for Large-Scale Road Network Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Zhaosheng Yang

    2014-01-01

Full Text Available To increase the efficiency and precision of large-scale road network traffic flow prediction, a genetic algorithm-support vector machine (GA-SVM) model based on cloud computing is proposed in this paper, building on an analysis of the characteristics and shortcomings of the genetic algorithm and the support vector machine. In the cloud computing environment, SVM parameters are first optimized by a parallel genetic algorithm, and then the optimized parallel SVM model is used to predict traffic flow. Using traffic flow data from Haizhu District in Guangzhou City, the proposed model was verified and compared with the serial GA-SVM model and a parallel GA-SVM model based on MPI (message passing interface). The results demonstrate that the parallel GA-SVM model based on cloud computing has higher prediction accuracy, shorter running time, and higher speedup.
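
The GA part of the GA-SVM idea can be sketched with a toy genetic algorithm over the two usual SVM hyperparameters (C, gamma). The synthetic fitness function below stands in for cross-validated prediction accuracy, and all settings (population size, mutation rate, seed) are illustrative assumptions rather than the paper's configuration:

```python
import random

def fitness(ind):
    """Stand-in for cross-validated SVM accuracy, with a known optimum
    at C = 10, gamma = 0.1 (purely synthetic)."""
    C, gamma = ind
    return -((C - 10.0) ** 2 + (gamma - 0.1) ** 2)

def evolve(pop_size=30, generations=60, seed=1):
    rng = random.Random(seed)
    # Individuals encode (C, gamma).
    pop = [(rng.uniform(0, 100), rng.uniform(0, 1)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = ((a[0] + b[0]) / 2.0, (a[1] + b[1]) / 2.0)  # crossover
            if rng.random() < 0.3:                              # mutation
                child = (child[0] + rng.gauss(0, 1.0),
                         child[1] + rng.gauss(0, 0.05))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```

In the parallel variant, the fitness evaluations (SVM training runs) would be distributed across cloud workers, since they are independent within a generation.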

  4. Cloud-Coffee: implementation of a parallel consistency-based multiple alignment algorithm in the T-Coffee package and its benchmarking on the Amazon Elastic-Cloud.

    Science.gov (United States)

    Di Tommaso, Paolo; Orobitg, Miquel; Guirado, Fernando; Cores, Fernado; Espinosa, Toni; Notredame, Cedric

    2010-08-01

    We present the first parallel implementation of the T-Coffee consistency-based multiple aligner. We benchmark it on the Amazon Elastic Cloud (EC2) and show that the parallelization procedure is reasonably effective. We also conclude that for a web server with moderate usage (10K hits/month) the cloud provides a cost-effective alternative to in-house deployment. T-Coffee is a freeware open source package available from http://www.tcoffee.org/homepage.html

  5. CIMS: A Context-Based Intelligent Multimedia System for Ubiquitous Cloud Computing

    Directory of Open Access Journals (Sweden)

    Abhilash Sreeramaneni

    2015-06-01

Full Text Available Mobile users spend a tremendous amount of time surfing multimedia contents over the Internet to pursue their interests. A resource-constrained smart device demands more intensive computing tasks and lessens the battery life. To address the resource limitations (i.e., memory, lower maintenance cost, easier access, computing tasks) in mobile devices, mobile cloud computing is needed. Several approaches have been proposed to confront the challenges of mobile cloud computing, but difficulties still remain. In the coming years, collecting and processing context and interchanging the results over a heavily loaded network will entail vast computation and reduce battery life in mobiles. In this paper, we propose a “context-based intelligent multimedia system” (CIMS) for ubiquitous cloud computing. The main goal of this research is to lessen the computing load, storage complexities, and battery drain for mobile users by using pervasive cloud computing. Moreover, to reduce the computing and storage concerns in mobiles, the cloud server collects several groups of user profiles with similarities by executing K-means clustering on users’ data (context and multimedia contents). The distribution process conveys real-time notifications to smartphone users according to what is stated in their profiles. We considered a mobile cloud offloading system, which decides the offloading actions to/from cloud servers. Context-aware decision-making (CAD) customizes the mobile device performance with different specifications such as short response time and low energy consumption. The analysis shows that our CIMS takes advantage of cost-effective features to produce high-quality information for mobile (or smart) device users in real time. Moreover, our CIMS lessens the computation and storage complexities for mobile users as well as cloud servers. Simulation analysis suggests that our approach is more efficient than existing domains.
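
The profile-grouping step can be illustrated with a minimal K-means implementation. The feature vectors, the number of clusters, and the convergence test below are illustrative, not the paper's configuration:

```python
import numpy as np

def kmeans(X, k, iters=100, seed=0):
    """Plain Lloyd's K-means: returns (labels, centers)."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, float)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each profile vector to its nearest centre.
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        # Recompute centres; keep a centre unchanged if its cluster is empty.
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers
```

Profiles falling in the same cluster would then receive the same group-targeted notifications.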

  6. Analysis of Observation Data of Earth-Rockfill Dam Based on Cloud Probability Distribution Density Algorithm

    Directory of Open Access Journals (Sweden)

    Han Liwei

    2014-07-01

Full Text Available Monitoring data on an earth-rockfill dam constitute a form of spatial data. Such data include much uncertainty owing to limitations in the measurement information, material parameters, load, geometry size, initial conditions, boundary conditions and the calculation model, so the cloud probability density of the monitoring data must be addressed. In this paper, the cloud theory model was used to handle the uncertain transition between a qualitative concept and its quantitative description. An improved algorithm for cloud probability distribution density based on a backward cloud generator was then proposed and used to effectively convert parcels of accurate data into concepts that can be described by proper qualitative linguistic values. Such a qualitative description is expressed by the cloud numerical characteristics {Ex, En, He}, which represent the characteristics of all cloud drops. The algorithm was then applied to analyze the observation data of a piezometric tube in an earth-rockfill dam. Experimental results proved that the proposed algorithm is feasible: it reveals the changing regularity of the piezometric tube's water level, and seepage damage in the dam body can be detected.
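
A baseline one-dimensional backward cloud generator can be sketched with the standard moment-based estimators of {Ex, En, He}; the paper's improved algorithm refines this baseline, which is shown here only for orientation:

```python
import math

def backward_cloud(xs):
    """Estimate the cloud numerical characteristics {Ex, En, He}
    from a 1-D sample, using the classical moment estimators."""
    n = len(xs)
    ex = sum(xs) / n                                   # expectation Ex
    first_abs = sum(abs(x - ex) for x in xs) / n       # first absolute moment
    en = math.sqrt(math.pi / 2.0) * first_abs          # entropy En
    s2 = sum((x - ex) ** 2 for x in xs) / (n - 1)      # sample variance
    he = math.sqrt(max(s2 - en ** 2, 0.0))             # hyper-entropy He
    return ex, en, he
```

Applied to a window of piezometric-tube readings, {Ex, En, He} summarize the level, spread, and stability of the uncertainty in that window.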

  7. A keyword searchable attribute-based encryption scheme with attribute update for cloud storage.

    Science.gov (United States)

    Wang, Shangping; Ye, Jian; Zhang, Yaling

    2018-01-01

Ciphertext-policy attribute-based encryption (CP-ABE) is a new type of data encryption primitive that is well suited to cloud data storage because of its fine-grained access control. A keyword-based searchable encryption scheme enables users to quickly find interesting data stored in the cloud server without revealing any information about the searched keywords. In this work, we provide a keyword searchable attribute-based encryption scheme with attribute update for cloud storage, which combines an attribute-based encryption scheme with a keyword searchable encryption scheme. The new scheme supports user attribute updates; in particular, when a user's attribute needs to be updated, only that user's secret key components related to the attribute are updated, while other users' secret keys and the ciphertexts related to this attribute need not be updated, with the help of the cloud server. In addition, we outsource operations with high computational cost to the cloud server to reduce the user's computational burden. Moreover, our scheme is proven semantically secure against chosen ciphertext-policy and chosen-plaintext attacks in the general bilinear group model, and semantically secure against chosen keyword attacks under the bilinear Diffie-Hellman (BDH) assumption.

  8. Quality Assessment and Comparison of Smartphone and Leica C10 Laser Scanner Based Point Clouds

    Science.gov (United States)

    Sirmacek, Beril; Lindenbergh, Roderik; Wang, Jinhu

    2016-06-01

3D urban models are valuable for urban map generation, environment monitoring, safety planning and educational purposes. For 3D measurement of urban structures, airborne laser scanning sensors or multi-view satellite images are generally used as a data source. However, close-range sensors (such as terrestrial laser scanners) and low-cost cameras (which can generate point clouds based on photogrammetry) can provide denser sampling of 3D surface geometry. Unfortunately, terrestrial laser scanning sensors are expensive and trained persons are needed to use them for point cloud acquisition. Potentially, an effective 3D model can instead be generated using a low-cost smartphone sensor. Herein, we show examples of using smartphone camera images to generate 3D models of urban structures. We compare a smartphone-based 3D model of an example structure with a terrestrial laser scanning point cloud of the structure. This comparison gives us the opportunity to discuss the differences in terms of geometrical correctness, as well as the advantages, disadvantages and limitations in data acquisition and processing. We also discuss how smartphone-based point clouds can help to solve further problems with 3D urban model generation in a practical way. We show that terrestrial laser scanning point clouds which do not have color information can be colored using smartphones. The experiments, discussions and scientific findings might be insightful for future studies in the field of fast, easy and low-cost 3D urban model generation.
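
One simple way to quantify geometric agreement between a smartphone-derived point cloud and a laser-scan point cloud is a nearest-neighbour RMS distance. This is a generic measure sketched under the assumption of small clouds (brute-force distances), not the comparison protocol used in the paper:

```python
import numpy as np

def cloud_to_cloud_rmse(cloud_a, cloud_b):
    """RMS of the distance from each point in cloud_a to its
    nearest neighbour in cloud_b."""
    A, B = np.asarray(cloud_a, float), np.asarray(cloud_b, float)
    # Full pairwise distance matrix: fine for small clouds;
    # use a k-d tree for realistic scan sizes.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    nn = np.sqrt(d2.min(axis=1))
    return float(np.sqrt((nn ** 2).mean()))
```

Because the measure is asymmetric, evaluating it in both directions gives a fuller picture of where the two clouds disagree.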

  9. QUALITY ASSESSMENT AND COMPARISON OF SMARTPHONE AND LEICA C10 LASER SCANNER BASED POINT CLOUDS

    Directory of Open Access Journals (Sweden)

    B. Sirmacek

    2016-06-01

Full Text Available 3D urban models are valuable for urban map generation, environment monitoring, safety planning and educational purposes. For 3D measurement of urban structures, airborne laser scanning sensors or multi-view satellite images are generally used as a data source. However, close-range sensors (such as terrestrial laser scanners) and low-cost cameras (which can generate point clouds based on photogrammetry) can provide denser sampling of 3D surface geometry. Unfortunately, terrestrial laser scanning sensors are expensive and trained persons are needed to use them for point cloud acquisition. Potentially, an effective 3D model can instead be generated using a low-cost smartphone sensor. Herein, we show examples of using smartphone camera images to generate 3D models of urban structures. We compare a smartphone-based 3D model of an example structure with a terrestrial laser scanning point cloud of the structure. This comparison gives us the opportunity to discuss the differences in terms of geometrical correctness, as well as the advantages, disadvantages and limitations in data acquisition and processing. We also discuss how smartphone-based point clouds can help to solve further problems with 3D urban model generation in a practical way. We show that terrestrial laser scanning point clouds which do not have color information can be colored using smartphones. The experiments, discussions and scientific findings might be insightful for future studies in the field of fast, easy and low-cost 3D urban model generation.

  10. Best practices for implementing, testing and using a cloud-based communication system in a disaster situation.

    Science.gov (United States)

    Makowski, Dale

    2016-01-01

    This paper sets out the basics for approaching the selection and implementation of a cloud-based communication system to support a business continuity programme, including: • consideration for how a cloud-based communication system can enhance a business continuity programme; • descriptions of some of the more popular features of a cloud-based communication system; • options to evaluate when selecting a cloud-based communication system; • considerations for how to design a system to be most effective for an organisation; • best practices for how to conduct the initial load of data to a cloud-based communication system; • best practices for how to conduct an initial validation of the data loaded to a cloud-based communication system; • considerations for how to keep contact information in the cloud-based communication system current and accurate; • best practices for conducting ongoing system testing; • considerations for how to conduct user training; • review of other potential uses of a cloud-based communication system; and • review of other tools and features many cloud-based communication systems may offer.

  11. Outdoor Illegal Construction Identification Algorithm Based on 3D Point Cloud Segmentation

    Science.gov (United States)

    An, Lu; Guo, Baolong

    2018-03-01

Recently, illegal constructions have been appearing frequently in our surroundings, seriously restricting the orderly development of urban modernization. 3D point cloud data technology can be used to identify illegal buildings and address this problem effectively. This paper proposes an outdoor illegal construction identification algorithm based on 3D point cloud segmentation. Initially, in order to save memory space and reduce processing time, a lossless point cloud compression method based on a minimum spanning tree is proposed. Then, a ground point removal method based on multi-scale filtering is introduced to increase accuracy. Finally, building clusters on the ground are obtained using a region growing method, and as a result illegal constructions can be marked. The effectiveness of the proposed algorithm is verified on a public data set from the International Society for Photogrammetry and Remote Sensing (ISPRS).
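
The minimum-spanning-tree compression idea can be sketched as follows: connect the points with an MST (Prim's algorithm) and encode each point as a delta from its tree parent, so nearby points yield small residuals and integer coordinates round-trip losslessly. The entropy-coding stage and the paper's actual algorithm are not reproduced here:

```python
import math

def prim_mst(points):
    """Prim's algorithm on squared Euclidean distances.
    Returns (parent, order): parent index of each point in the MST and
    the order in which points were added (root first)."""
    n = len(points)
    dist2 = lambda a, b: sum((u - v) ** 2 for u, v in zip(a, b))
    in_tree = [False] * n
    best = [math.inf] * n
    parent = [-1] * n
    best[0] = 0.0
    order = []
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: best[i])
        in_tree[u] = True
        order.append(u)
        for v in range(n):
            if not in_tree[v]:
                d = dist2(points[u], points[v])
                if d < best[v]:
                    best[v], parent[v] = d, u
    return parent, order

def delta_encode(points):
    """Encode each point as (index, parent, delta-from-parent);
    the root stores its absolute coordinates."""
    parent, order = prim_mst(points)
    return [(i, parent[i],
             tuple(c - p for c, p in zip(points[i], points[parent[i]])))
            if parent[i] >= 0 else (i, -1, tuple(points[i]))
            for i in order]
```

Decoding walks the entries in order, adding each delta to the already-decoded parent, which recovers the original coordinates exactly.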

  12. A Systematic Mapping Study of Software Architectures for Cloud Based Systems

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2014-01-01

Context: Cloud computing has gained significant attention of researchers and practitioners. This emerging paradigm is being used to provide solutions in multiple domains without huge upfront investment because of its on-demand resource-provisioning model. However, the information about how software...... of this study is to systematically identify and analyze the currently published research on the topics related to software architectures for cloud-based systems in order to identify architecture solutions for achieving quality requirements. Method: We decided to carry out a systematic mapping study to find...... as much peer-reviewed literature on the topics related to software architectures for cloud-based systems as possible. This study has been carried out by following the guidelines for conducting systematic literature reviews and systematic mapping studies as reported in the literature. Based on our paper

  13. Cloud-based systems for monitoring earthquakes and other environmental quantities

    Science.gov (United States)

    Clayton, R. W.; Olson, M.; Liu, A.; Chandy, M.; Bunn, J.; Guy, R.

    2013-12-01

There are many advantages to using a cloud-based system to record and analyze environmental quantities such as earthquakes, radiation, various gases, dust and meteorological parameters. These advantages include robustness, dynamic scalability, and reduced costs. In this paper, we present our experiences over the last three years in developing a cloud-based earthquake monitoring system (the Community Seismic Network). This network consists of over 600 sensors (accelerometers) in the Southern California region that send data directly to the Google App Engine, where they are analyzed. The system is capable of handling many other types of sensor data and generating a situation-awareness analysis as a product. Other advantages of the cloud-based system are integration with peer networks and the ability to deploy anywhere in the world without having to build additional computing infrastructure.

  14. Automatic atlas based electron density and structure contouring for MRI-based prostate radiation therapy on the cloud

    International Nuclear Information System (INIS)

    Dowling, J A; Burdett, N; Chandra, S; Rivest-Hénault, D; Ghose, S; Salvado, O; Fripp, J; Greer, P B; Sun, J; Parker, J; Pichler, P; Stanwell, P

    2014-01-01

    Our group have been developing methods for MRI-alone prostate cancer radiation therapy treatment planning. To assist with clinical validation of the workflow we are investigating a cloud platform solution for research purposes. Benefits of cloud computing can include increased scalability, performance and extensibility while reducing total cost of ownership. In this paper we demonstrate the generation of DICOM-RT directories containing an automatic average atlas based electron density image and fast pelvic organ contouring from whole pelvis MR scans.

  15. Automatic Atlas Based Electron Density and Structure Contouring for MRI-based Prostate Radiation Therapy on the Cloud

    Science.gov (United States)

    Dowling, J. A.; Burdett, N.; Greer, P. B.; Sun, J.; Parker, J.; Pichler, P.; Stanwell, P.; Chandra, S.; Rivest-Hénault, D.; Ghose, S.; Salvado, O.; Fripp, J.

    2014-03-01

    Our group have been developing methods for MRI-alone prostate cancer radiation therapy treatment planning. To assist with clinical validation of the workflow we are investigating a cloud platform solution for research purposes. Benefits of cloud computing can include increased scalability, performance and extensibility while reducing total cost of ownership. In this paper we demonstrate the generation of DICOM-RT directories containing an automatic average atlas based electron density image and fast pelvic organ contouring from whole pelvis MR scans.

  16. Cloud-Based Software Platform for Smart Meter Data Management

    DEFF Research Database (Denmark)

    Liu, Xiufeng; Nielsen, Per Sieverts

    of the so-called big data possible. This can improve energy management, e.g., help utility companies to forecast energy loads and improve services, and help households to manage energy usage and save money. In this regard, the proposed paper focuses on building an innovative software platform for smart...... their knowledge; a scalable data analytics platform for data mining over big data sets for energy demand forecasting and consumption discovery; data as a service for other applications using smart meter data; and a portal for visualizing data analytics results. The design will incorporate hybrid clouds......, including Infrastructure as a Service (IaaS) and Platform as a Service (PaaS), which are suitable for on-demand provisioning, massive scaling, and manageability. Besides, the design will impose extensibility, efficiency, and high availability on the system. The paper will evaluate the system comprehensively...

  17. An adaptive process-based cloud infrastructure for space situational awareness applications

    Science.gov (United States)

    Liu, Bingwei; Chen, Yu; Shen, Dan; Chen, Genshe; Pham, Khanh; Blasch, Erik; Rubin, Bruce

    2014-06-01

    Space situational awareness (SSA) and defense space control capabilities are top priorities for groups that own or operate man-made spacecraft. Also, with the growing amount of space debris, there is an increased demand for contextual understanding that necessitates the capability of collecting and processing a vast amount of sensor data. Cloud computing, which features scalable and flexible storage and computing services, has been recognized as an ideal candidate that can meet the large-data contextual challenges of SSA. Cloud computing consists of physical service providers and middleware virtual machines together with infrastructure, platform, and software as service (IaaS, PaaS, SaaS) models. However, the typical Virtual Machine (VM) abstraction is on a per-operating-system basis, which is too low-level and limits the flexibility of a mission application architecture. In response to this technical challenge, a novel adaptive process-based cloud infrastructure for SSA applications is proposed in this paper. In addition, the details of the design rationale and a prototype are further examined. The SSA Cloud (SSAC) conceptual capability will potentially support space situation monitoring and tracking, object identification, and threat assessment. Lastly, the benefits of a more granular and flexible allocation of cloud computing resources are illustrated for data processing and implementation considerations within a representative SSA system environment. We show that container-based virtualization performs better than hypervisor-based virtualization technology in an SSA scenario.

  18. Kinematics of Local, High-Velocity K dwarfs in the SUPERBLINK Proper Motion Catalog

    Science.gov (United States)

    Kim, Bokyoung; Lepine, Sebastien

    2018-01-01

    We present a study of the kinematics of 345,480 K stars within 2 kpc of the Sun, based on data from the SUPERBLINK catalog of stars with high proper motions (> 40 mas/yr), combined with data from the 2MASS survey and the first GAIA release, which together yield proper motions accurate to ~2 mas/yr. All K dwarfs were selected based on their G-K colors, and photometric distances were estimated from a re-calibrated color-magnitude relationship for K dwarfs. We plot transverse velocities VT in various directions on the sky to examine the local distribution of K dwarfs in velocity space. We have also obtained radial velocity information for a subsample of 10,128 stars from RAVE and SDSS DR12, which we use to construct spatial velocity (U, V, W) plots. About a third (123,350) of the stars are high-velocity K dwarfs, with motions consistent with the local Galactic halo population. Our kinematic analysis suggests that their velocity-space distribution is very uniform, and we find no evidence of substructure that might arise, e.g., from local streams or moving groups.
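    The transverse velocities described above follow the standard relation V_T = 4.74 μ d (μ in arcsec/yr, d in pc). A minimal sketch of the conversion; the 40 mas/yr value is the catalog's proper-motion cutoff, while the 200 pc distance is purely illustrative:

```python
# Generic sketch of V_T = 4.74 * mu * d (mu in arcsec/yr, d in pc).
# Not code from the study; input values are illustrative.

def transverse_velocity(mu_mas_per_yr, distance_pc):
    """Transverse velocity in km/s from proper motion (mas/yr) and distance (pc)."""
    mu_arcsec = mu_mas_per_yr / 1000.0   # mas/yr -> arcsec/yr
    return 4.74 * mu_arcsec * distance_pc

# A star at 200 pc moving at the survey's 40 mas/yr cutoff:
print(round(transverse_velocity(40.0, 200.0), 2))  # -> 37.92 (km/s)
```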

  19. GPU-Based Point Cloud Superpositioning for Structural Comparisons of Protein Binding Sites.

    Science.gov (United States)

    Leinweber, Matthias; Fober, Thomas; Freisleben, Bernd

    2018-01-01

    In this paper, we present a novel approach to solve the labeled point cloud superpositioning problem for performing structural comparisons of protein binding sites. The solution is based on a parallel evolution strategy that operates on large populations and runs on GPU hardware. The proposed evolution strategy reduces the likelihood of getting stuck in a local optimum of the multimodal real-valued optimization problem represented by labeled point cloud superpositioning. The performance of the GPU-based parallel evolution strategy is compared to a previously proposed CPU-based sequential approach for labeled point cloud superpositioning, indicating that the GPU-based parallel evolution strategy leads to qualitatively better results and significantly shorter runtimes, with speed improvements of up to a factor of 1,500 for large populations. Binary classification tests based on the ATP, NADH, and FAD protein subsets of CavBase, a database containing putative binding sites, show average classification rate improvements from about 92 percent (CPU) to 96 percent (GPU). Further experiments indicate that the proposed GPU-based labeled point cloud superpositioning approach can be superior to traditional protein comparison approaches based on sequence alignments.
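    The evolution-strategy idea above can be illustrated with a toy (1+λ) variant that superposes two rigid 2D point clouds; the paper's method is a large-population GPU strategy on labeled 3D clouds, so the cloud sizes, mutation schedule, and parameters below are illustrative assumptions only:

```python
import numpy as np

# Toy (1+lambda) evolution strategy minimizing RMSD over a rigid 2D
# transform (rotation angle + translation). A simplification of the
# paper's large-population GPU approach; all parameters are illustrative.
rng = np.random.default_rng(0)

def rmsd(params, src, dst):
    theta, tx, ty = params
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    moved = src @ R.T + np.array([tx, ty])
    return np.sqrt(np.mean(np.sum((moved - dst) ** 2, axis=1)))

def es_align(src, dst, lam=20, sigma=0.5, iters=200):
    best = np.zeros(3)                      # start from identity transform
    best_f = rmsd(best, src, dst)
    for _ in range(iters):
        offspring = best + rng.normal(0.0, sigma, size=(lam, 3))
        fits = np.array([rmsd(o, src, dst) for o in offspring])
        i = fits.argmin()
        if fits[i] < best_f:                # elitist selection
            best, best_f = offspring[i], fits[i]
        sigma *= 0.99                       # simple mutation-step annealing
    return best, best_f

src = rng.normal(size=(30, 2))
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
dst = src @ R.T + np.array([1.0, -2.0])     # known ground-truth transform
params, err = es_align(src, dst)
print(round(err, 2))                        # residual RMSD after alignment
```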

  20. Cloud-based computation for accelerating vegetation mapping and change detection at regional to national scales

    Science.gov (United States)

    Matthew J. Gregory; Zhiqiang Yang; David M. Bell; Warren B. Cohen; Sean Healey; Janet L. Ohmann; Heather M. Roberts

    2015-01-01

    Mapping vegetation and landscape change at fine spatial scales is needed to inform natural resource and conservation planning, but such maps are expensive and time-consuming to produce. For Landsat-based methodologies, mapping efforts are hampered by the daunting task of manipulating multivariate data for millions to billions of pixels. The advent of cloud-based...

  1. Surface Water Quality Evaluation Based on a Game Theory-Based Cloud Model

    Directory of Open Access Journals (Sweden)

    Bing Yang

    2018-04-01

    Water quality evaluation is an essential measure for analyzing water quality. However, excessive randomness and fuzziness affect the evaluation process, reducing its accuracy. Therefore, this study proposed a cloud model for evaluating water quality to alleviate this problem. The analytic hierarchy process and entropy theory were used to calculate the subjective and objective weights, respectively, which were then coupled into a combination weight (CW) via game theory. The proposed game theory-based cloud model (GCM) was then applied to the Qixinggang section of the Beijiang River. The results show that the CW ranks fecal coliform as the most important factor, followed by total nitrogen and total phosphorus, while biochemical oxygen demand and fluoride were considered least important. There were 19 months (31.67%) at grade I, 39 months (65.00%) at grade II, and one month each at grade IV and grade V during 2010–2014. A total of 52 months (86.67%) of the GCM results were identical to the comprehensive evaluation result (CER). The water quality grades obtained by the GCM are close to those of the analytic hierarchy process weight (AHPW), because the weight coefficient of the AHPW was set to 0.7487. Generally, gaps of one or two grades exist among the results of the three groups of weights, suggesting that the index weight is not particularly sensitive in the cloud model. The accuracy of the water quality evaluation can be improved by modifying the quantitative boundaries. This study could provide a reference for water quality evaluation and prevention, and for the improvement of water quality assessment and other applications.
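    The game-theory coupling of subjective and objective weights is commonly computed by solving a small linear system for the combination coefficients. A minimal sketch with made-up weight vectors (not the study's AHP/entropy values):

```python
import numpy as np

# Game-theory combination weighting sketch: w = a1*w1 + a2*w2, with the
# coefficients a obtained from the Gram-matrix linear system that minimizes
# the deviation from both base vectors. Weight vectors below are invented.

def combine_weights(w1, w2):
    W = np.vstack([w1, w2])
    A = W @ W.T                       # 2x2 Gram matrix of the weight vectors
    b = np.diag(A)                    # right-hand side [w1.w1, w2.w2]
    a = np.linalg.solve(A, b)
    a = np.abs(a) / np.abs(a).sum()   # normalized combination coefficients
    w = a @ W
    return w / w.sum()                # combined weights summing to 1

w_ahp = np.array([0.40, 0.30, 0.20, 0.10])      # subjective (AHP-style)
w_entropy = np.array([0.20, 0.40, 0.30, 0.10])  # objective (entropy-style)
print(combine_weights(w_ahp, w_entropy))
```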

  2. Architecture Design of Healthcare Software-as-a-Service Platform for Cloud-Based Clinical Decision Support Service

    OpenAIRE

    Oh, Sungyoung; Cha, Jieun; Ji, Myungkyu; Kang, Hyekyung; Kim, Seok; Heo, Eunyoung; Han, Jong Soo; Kang, Hyunggoo; Chae, Hoseok; Hwang, Hee; Yoo, Sooyoung

    2015-01-01

    Objectives To design a cloud computing-based Healthcare Software-as-a-Service (SaaS) Platform (HSP) for delivering healthcare information services with low cost, high clinical value, and high usability. Methods We analyzed the architecture requirements of an HSP, including the interface, business services, cloud SaaS, quality attributes, privacy and security, and multi-lingual capacity. For cloud-based SaaS services, we focused on Clinical Decision Service (CDS) content services, basic functi...

  3. A cloud-based X73 ubiquitous mobile healthcare system: design and implementation.

    Science.gov (United States)

    Ji, Zhanlin; Ganchev, Ivan; O'Droma, Máirtín; Zhang, Xin; Zhang, Xueji

    2014-01-01

    Based on the user-centric paradigm for next-generation networks, this paper describes a ubiquitous mobile healthcare (uHealth) system based on the ISO/IEEE 11073 personal health data (PHD) standards (X73) and cloud computing techniques. A number of design issues associated with the system implementation are outlined. The system includes a middleware on the user side, providing a plug-and-play environment for heterogeneous wireless sensors and mobile terminals utilizing different communication protocols, and a distributed "big data" processing subsystem in the cloud. The design and implementation of this system are envisaged as an efficient solution for the next generation of uHealth systems.

  4. An HST/COS legacy survey of high-velocity ultraviolet absorption in the Milky Way's circumgalactic medium and the Local Group

    Science.gov (United States)

    Richter, P.; Nuza, S. E.; Fox, A. J.; Wakker, B. P.; Lehner, N.; Ben Bekhti, N.; Fechner, C.; Wendt, M.; Howk, J. C.; Muzahid, S.; Ganguly, R.; Charlton, J. C.

    2017-11-01

    Context. The Milky Way is surrounded by large amounts of diffuse gaseous matter that connects the stellar body of our Galaxy with its large-scale Local Group (LG) environment. Aims: To characterize the absorption properties of this circumgalactic medium (CGM) and its relation to the LG, we present the largest survey to date of metal absorption in Galactic high-velocity clouds (HVCs), using archival ultraviolet (UV) spectra of extragalactic background sources. The UV data were obtained with the Cosmic Origins Spectrograph (COS) onboard the Hubble Space Telescope (HST) and are supplemented by 21 cm radio observations of neutral hydrogen. Methods: Along 270 sightlines we measure metal absorption in the lines of Si II, Si III, C II, and C IV and associated H I 21 cm emission in HVCs in the velocity range |vLSR| = 100-500 km s-1. With this unprecedentedly large HVC sample we were able to improve the statistics on HVC covering fractions, ionization conditions, small-scale structure, CGM mass, and inflow rate. For the first time, we robustly determine the angular two-point correlation function of the high-velocity absorbers, systematically analyze antipodal sightlines on the celestial sphere, and compare the HVC absorption characteristics with those of damped Lyman α absorbers (DLAs) and constrained cosmological simulations of the LG (CLUES project). Results: The overall sky-covering fraction of high-velocity absorption is 77 ± 6 percent for the most sensitive ion in our survey, Si III, and for column densities log N(Si III) ≥ 12.1. This value is 4-5 times higher than the covering fraction of 21 cm neutral hydrogen emission at log N(H I) ≥ 18.7 along the same lines of sight, demonstrating that the Milky Way's CGM is multi-phase and predominantly ionized. The measured equivalent-width ratios of Si II, Si III, C II, and C IV are inhomogeneously distributed on large and small angular scales, suggesting a complex spatial distribution of the multi-phase gas that surrounds the

  5. Web-based Tsunami Early Warning System with instant Tsunami Propagation Calculations in the GPU Cloud

    Science.gov (United States)

    Hammitzsch, M.; Spazier, J.; Reißland, S.

    2014-12-01

    Usually, tsunami early warning and mitigation systems (TWS or TEWS) are based on several software components deployed in a client-server based infrastructure. The vast majority of systems include desktop-based clients with a graphical user interface (GUI) for the operators in early warning centers. However, in times of cloud computing and ubiquitous computing, the use of concepts and paradigms introduced by continuously evolving approaches in information and communications technology (ICT) has to be considered even for early warning systems (EWS). Based on the experiences and knowledge gained in three research projects - 'German Indonesian Tsunami Early Warning System' (GITEWS), 'Distant Early Warning System' (DEWS), and 'Collaborative, Complex, and Critical Decision-Support in Evolving Crises' (TRIDEC) - new technologies are exploited to implement a cloud-based and web-based prototype that opens up new prospects for EWS. This prototype, named 'TRIDEC Cloud', merges several complementary external and in-house cloud-based services into one platform for automated background computation with graphics processing units (GPU), for web-mapping of hazard-specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat-specific information in a collaborative and distributed environment. The prototype in its current version addresses tsunami early warning and mitigation. The integration of GPU-accelerated tsunami simulation computations has been an integral part of this prototype, fostering early warning with on-demand tsunami predictions based on actual source parameters. However, the platform is also meant for researchers around the world to make use of the cloud-based GPU computation to analyze other types of geohazards and natural hazards and to react upon the computed situation picture with a web-based GUI in a web browser at remote sites.
    The current website is an early alpha version for demonstration purposes to give the

  6. mPano: cloud-based mobile panorama view from single picture

    Science.gov (United States)

    Li, Hongzhi; Zhu, Wenwu

    2013-09-01

    Panorama views provide an informative and natural user experience for representing a whole scene. Advances in mobile augmented reality, mobile-cloud computing, and the mobile Internet enable panorama views on mobile phones with new functionalities, such as anytime-anywhere queries of where a landmark picture was taken and what the whole scene looks like. Generating and exploring panorama views on mobile devices faces significant challenges due to the limited computing capacity, battery life, and memory size of mobile phones, as well as the bandwidth of the mobile Internet connection. To address these challenges, this paper presents a novel cloud-based mobile panorama view system, named "mPano", that can generate and view a panorama on mobile devices from a single picture. In our system, first, we propose a novel iterative multi-modal image retrieval (IMIR) approach to obtain spatially adjacent images using both tag and content information from the single picture. Second, we propose a cloud-based parallel panorama-synthesis approach to generate the panorama view in the cloud, in contrast to today's local-client synthesis, which is all but impossible on mobile phones. Third, we propose a predictive-cache solution to reduce the latency of image delivery from the cloud server to the mobile client. We have built a real mobile panorama view system and performed experiments. The experimental results demonstrate the effectiveness of our system and the proposed key component technologies, especially for landmark images.

  7. CURB-BASED STREET FLOOR EXTRACTION FROM MOBILE TERRESTRIAL LIDAR POINT CLOUD

    Directory of Open Access Journals (Sweden)

    S. Ibrahim

    2012-07-01

    Mobile terrestrial laser scanners (MTLS) produce huge 3D point clouds describing the terrestrial surface, from which objects such as street furniture can be extracted. Extraction and modelling of the street curb and the street floor from MTLS point clouds is important for many applications such as right-of-way asset inventory, road maintenance, and city planning. The proposed pipeline for curb and street floor extraction consists of a sequence of five steps: organizing the 3D point cloud and nearest-neighbour search; 3D density-based segmentation to segment the ground; morphological analysis to refine the ground segment; derivative-of-Gaussian filtering to detect the curb; and solving the travelling salesman problem to form a closed polygon of the curb, followed by a point-in-polygon test to extract the street floor. Two mobile laser scanning datasets of different scenes were tested with the proposed pipeline. The extracted curb and street floor are evaluated against ground-truth data. The detection rates obtained for the extracted street floor on the two datasets are 95% and 96.53%. This study presents a novel approach to the detection and extraction of the road curb and the street floor from unorganized 3D point clouds captured by MTLS. It utilizes only the 3D coordinates of the point cloud.
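    The final step of the pipeline, the point-in-polygon test, can be sketched with the classic ray-casting algorithm; the curb polygon below is a toy rectangle, not MTLS data:

```python
# Ray-casting point-in-polygon sketch: count crossings of a horizontal ray
# with the polygon edges; an odd count means the point is inside.
# The toy "curb" polygon is illustrative, not extracted from a scan.

def point_in_polygon(x, y, poly):
    """poly is a list of (x, y) vertices of a closed polygon."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):                       # edge spans the ray's y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:                            # crossing to the right
                inside = not inside
    return inside

curb = [(0, 0), (10, 0), (10, 4), (0, 4)]  # toy closed curb polygon
print(point_in_polygon(5, 2, curb), point_in_polygon(12, 2, curb))  # -> True False
```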

  8. Efficient Multi-keyword Ranked Search over Outsourced Cloud Data based on Homomorphic Encryption

    Directory of Open Access Journals (Sweden)

    Nie Mengxi

    2016-01-01

    With the development of cloud computing, more and more data owners are motivated to outsource their data to cloud servers for greater flexibility and lower expenditure. Because the security of outsourced data must be guaranteed, encryption methods must be used, which makes traditional data utilization based on plaintext, e.g. keyword search, obsolete. To enable search over encrypted data, several schemes have been proposed, e.g. top-k single- or multiple-keyword retrieval. However, the efficiency of these schemes is not high enough to be practical in cloud computing. In this paper, we propose a new scheme based on homomorphic encryption to solve the challenging problem of privacy-preserving, efficient multi-keyword ranked search over outsourced cloud data. In our scheme, the inner product is adopted to measure relevance scores, and the technique of relevance feedback is used to reflect the search preferences of the data users. Security analysis shows that the proposed scheme can meet strict privacy requirements for such a secure cloud data utilization system. Performance evaluation demonstrates that the proposed scheme achieves low overhead in both computation and communication.
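    The inner-product relevance scoring can be sketched in the clear (in the actual scheme both vectors would be homomorphically encrypted before the server computes the products). The term vectors below are illustrative:

```python
# Plaintext sketch of the ranking principle only: relevance of a document to
# a multi-keyword query is the inner product of their term vectors, and the
# server returns the top-k indices. Vectors below are invented examples.

def top_k(query_vec, doc_vecs, k):
    scores = [(sum(q * d for q, d in zip(query_vec, doc)), i)
              for i, doc in enumerate(doc_vecs)]
    scores.sort(reverse=True)                 # highest relevance first
    return [i for _, i in scores[:k]]

docs = [[1, 0, 2, 0], [0, 1, 1, 1], [2, 1, 0, 0]]  # term-frequency vectors
query = [1, 0, 1, 0]                               # query over keywords 0 and 2
print(top_k(query, docs, 2))  # -> [0, 2]  (scores 3, 2; doc 1 scores only 1)
```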

  9. Criteria for the evaluation of a cloud-based hospital information system outsourcing provider.

    Science.gov (United States)

    Low, Chinyao; Hsueh Chen, Ya

    2012-12-01

    As cloud computing technology has proliferated rapidly worldwide, there has been a trend toward adopting cloud-based hospital information systems (CHISs). This study examines the critical criteria for selecting a CHIS outsourcing provider. The fuzzy Delphi method (FDM) is used to evaluate the primary indicators collected from 188 usable responses at a working hospital in Taiwan. Moreover, the fuzzy analytic hierarchy process (FAHP) is employed to calculate the weights of these criteria and establish a fuzzy multi-criteria model of CHIS outsourcing provider selection from 42 experts. The results indicate that the five most critical criteria for CHIS outsourcing provider selection are (1) system function, (2) service quality, (3) integration, (4) professionalism, and (5) economics. This study may contribute to understanding how cloud-based hospital systems can reinforce content design and offer a way to compete in the field by developing more appropriate systems.
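    The weight-derivation step underlying FAHP can be illustrated with the crisp-AHP geometric-mean method on a toy pairwise comparison matrix; the judgments below are illustrative, not the experts' data:

```python
import math

# Crisp-AHP sketch: derive criterion weights from a pairwise comparison
# matrix via row geometric means. FAHP fuzzifies these judgments; this toy
# shows only the underlying crisp step with invented 3x3 judgments.

def ahp_weights(pairwise):
    gmeans = [math.prod(row) ** (1.0 / len(row)) for row in pairwise]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Toy judgments: system function vs service quality vs integration
M = [[1.0,   2.0, 3.0],
     [0.5,   1.0, 2.0],
     [1 / 3, 0.5, 1.0]]
w = ahp_weights(M)
print([round(x, 3) for x in w])  # weights sum to 1, ordered by importance
```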

  10. Study on Huizhou architecture of point cloud registration based on optimized ICP algorithm

    Science.gov (United States)

    Zhang, Runmei; Wu, Yulu; Zhang, Guangbin; Zhou, Wei; Tao, Yuqian

    2018-03-01

    Current point cloud registration software has high hardware requirements and a heavy interactive workload, and the source code of the packages with better processing results is not open. To address this, a two-step registration method based on normal-vector distribution features and a coarse-feature-based iterative closest point (ICP) algorithm is proposed in this paper. The method combines the fast point feature histogram (FPFH) algorithm with a model of the point cloud's adjacency regions and normal-vector distribution, setting up a local coordinate system for each key point and obtaining the transformation matrix to complete the coarse registration; the coarse registration results of the two stations are then accurately registered using the ICP algorithm. Experimental results show that, compared with the traditional ICP algorithm, the method used in this paper has obvious time and precision advantages for large point clouds.
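    The fine-registration stage, standard point-to-point ICP, can be sketched as follows; the FPFH-based coarse alignment is assumed to have been applied already, and the 2D toy data are illustrative:

```python
import numpy as np

# Minimal point-to-point ICP sketch: alternate nearest-neighbour
# correspondences with the SVD (Kabsch) best rigid transform. Toy 2D data,
# not the paper's two-station scans.

def icp(src, dst, iters=30):
    cur = src.copy()
    for _ in range(iters):
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(axis=1)]       # brute-force nearest neighbours
        mu_c, mu_m = cur.mean(0), matched.mean(0)
        H = (cur - mu_c).T @ (matched - mu_m)  # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:               # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        cur = (cur - mu_c) @ R.T + mu_m        # apply best rigid transform
    return cur

rng = np.random.default_rng(1)
src = rng.normal(size=(40, 2))
theta = 0.1                                    # small known misalignment
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
dst = src @ R_true.T + np.array([0.1, -0.05])
aligned = icp(src, dst)
rms = np.sqrt(((aligned - dst) ** 2).mean())
print(round(rms, 6))                           # residual after fine registration
```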

  11. Machine learning based cloud mask algorithm driven by radiative transfer modeling

    Science.gov (United States)

    Chen, N.; Li, W.; Tanikawa, T.; Hori, M.; Shimada, R.; Stamnes, K. H.

    2017-12-01

    Cloud detection is a critically important first step required to derive many satellite data products. Traditional threshold-based cloud mask algorithms require a complicated design process and fine tuning for each sensor, and have difficulty over snow/ice covered areas. With advances in computational power and machine learning techniques, we have developed a new algorithm based on a neural network classifier driven by extensive radiative transfer modeling. Statistical validation results obtained using collocated CALIOP and MODIS data show that its performance is consistent over different ecosystems and significantly better than the MODIS Cloud Mask (MOD35 C6) during the winter seasons over mid-latitude snow-covered areas. Simulations using a reduced number of satellite channels also show satisfactory results, indicating its flexibility to be configured for different sensors.
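    The classifier-driven cloud-mask idea can be illustrated with a toy stand-in: a logistic regression trained on two synthetic "channel" reflectance features, in place of the paper's neural network trained on radiative-transfer simulations:

```python
import numpy as np

# Toy stand-in for a simulation-trained cloud mask: logistic regression on
# two synthetic reflectance features separating "clear" and "cloudy" pixels.
# Class means/spreads are invented, not radiative-transfer output.

rng = np.random.default_rng(2)
n = 500
clear = rng.normal([0.2, 0.1], 0.05, size=(n, 2))   # darker clear-sky pixels
cloudy = rng.normal([0.6, 0.4], 0.08, size=(n, 2))  # brighter cloudy pixels
X = np.vstack([clear, cloudy])
y = np.array([0] * n + [1] * n)

w, b = np.zeros(2), 0.0
for _ in range(2000):                               # batch gradient descent
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * (p - y).mean()

pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(int)
accuracy = (pred == y).mean()
print(round(accuracy, 3))                           # near-perfect on this toy data
```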

  12. Design of Technical Support System for Retail Company Based on Cloud

    Directory of Open Access Journals (Sweden)

    Shao Ping

    2017-01-01

    With the opening of the retail side of the electricity market in China, electricity retail companies have emerged as new participants in the electricity market. National and local governments have subsequently introduced corresponding policies and rules, making a technical support system one of the necessary conditions for a retail company's market access. Retail electricity companies have started building such systems but have not yet formed a standardized, complete architecture. This paper analyzes the business and data-interaction requirements of retail electricity companies, and then designs a functional architecture based on basic, advanced, and value-added applications, together with a cloud-based technical architecture. On this basis, the paper discusses the choice among private, public, and hybrid cloud models, and offers suggestions for rationalizing system construction, which can provide a reference for the construction of technical support systems for domestic electricity retail companies.

  13. High-velocity Penetration of Concrete Targets with Three Types of Projectiles: Experiments and Analysis

    Directory of Open Access Journals (Sweden)

    Shuang Zhang

    This study conducted high-velocity penetration experiments using conventional ogive-nose, double-ogive-nose, and grooved-tapered projectiles of approximately 2.5 kg, with initial velocities between 1000 and 1360 m/s, to penetrate or perforate concrete targets with unconfined compressive strengths of nominally 40 MPa. Penetration performance data for these three types of projectiles in two different materials (i.e., AerMet100 and DT300) were obtained. A crater depth model considering both the projectile mass and the initial velocity was proposed based on the test results and a theoretical analysis. The penetration ability and trajectory stability of the three projectile types were compared and analyzed accordingly. The results showed that, under these experimental conditions, the effects of the two projectile materials on the penetration depth and the mass erosion rate of the projectile were not obvious. The existing models could not reflect the crater depths for projectiles of greater weight or higher velocity, whereas the new model established in this study is reliable. The double-ogive nose has a certain drag-reduction effect; thus, the double-ogive-nose projectile has a higher penetration ability than the conventional ogive-nose projectile. Meanwhile, the grooved-tapered projectile has better trajectory stability, because the convex parts of the tapered shank generate a restoring moment that stabilizes the trajectory.

  14. Fragmentation of neutral carbon clusters formed by high velocity atomic collision

    International Nuclear Information System (INIS)

    Martinet, G.

    2004-05-01

    The aim of this work is to understand the fragmentation of small neutral carbon clusters formed by high-velocity atomic collisions on an atomic gas. In this experiment, the main deexcitation channel of neutral clusters formed by electron capture from ionic species is fragmentation. To measure the fragmentation channels, a new detection tool based on shape analysis of the current pulses delivered by semiconductor detectors has been developed. For the first time, all branching ratios of neutral carbon clusters have been measured unambiguously for cluster sizes up to 10 atoms. The measurements were compared to a statistical model in the microcanonical ensemble (Microcanonical Metropolis Monte Carlo). This model requires various structural properties of the carbon clusters. These data were calculated with Density Functional Theory (DFT-B3LYP) to find the geometries of the clusters, and then with the Coupled Cluster (CCSD(T)) formalism to obtain dissociation energies and other quantities needed for the fragmentation calculations. The experimental branching ratios were compared to the fragmentation model, which allowed the energy distribution deposited in the collision to be determined. Finally, a specific cluster effect has been found, namely a large population of excited states. This behaviour is completely different from the atomic carbon case, for which electron capture into the ground state predominates. (author)
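    The Metropolis acceptance rule at the heart of such statistical fragmentation models can be sketched with a toy canonical sampler (not the full microcanonical MMMC used in the thesis); the channel energies and temperature below are invented:

```python
import math
import random

# Toy canonical Metropolis sampler over fragmentation "channels" with
# Boltzmann weights exp(-E/T). A simplification of the microcanonical
# MMMC approach; energies (eV) and T are illustrative only.

random.seed(0)
energies = {"C10": 0.0, "C7+C3": 1.2, "C5+C5": 1.5, "3xC3+C1": 2.8}
channels = list(energies)
T = 1.0                               # temperature-like parameter, in eV

state = "C10"
counts = {c: 0 for c in channels}
for _ in range(50000):
    cand = random.choice(channels)    # propose a random channel
    dE = energies[cand] - energies[state]
    if dE <= 0 or random.random() < math.exp(-dE / T):
        state = cand                  # Metropolis acceptance rule
    counts[state] += 1

ratios = {c: counts[c] / 50000 for c in channels}
print(max(ratios, key=ratios.get))    # the lowest-energy channel dominates
```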

  15. Magnetic Circuit Design and Multiphysics Analysis of a Novel MR Damper for Applications under High Velocity

    Directory of Open Access Journals (Sweden)

    Jiajia Zheng

    2014-02-01

    A novel magnetorheological (MR) damper with a multistage piston and independent input currents is designed and analyzed. The equivalent magnetic circuit model is investigated, along with the relation between the magnetic flux density in the working gap and the input currents of the electromagnetic coils. The finite element method (FEM) is used to analyze the distribution of the magnetic field through the MR fluid region. Considering realistic operating conditions, coupling equations are presented to analyze the electromagnetic-thermal-flow coupling problems. The software COMSOL is used to analyze the multiphysics, i.e., electromagnetics, thermal dynamics, and fluid mechanics. A measurement index involving the total damping force, the dynamic range, and the induction time needed by the magnetic coils is put forward to evaluate the performance of the novel multistage MR damper. The simulation results show that the damper is promising for applications under high velocity and works better when more electromagnetic coils are driven with separate input currents. Moreover, to reduce energy consumption, it is recommended to apply more electromagnetic coils with relatively low currents, based on the analysis of the pressure drop along the annular gap.

  16. Validation of quasi-invariant ice cloud radiative quantities with MODIS satellite-based cloud property retrievals

    International Nuclear Information System (INIS)

    Ding, Jiachen; Yang, Ping; Kattawar, George W.; King, Michael D.; Platnick, Steven; Meyer, Kerry G.

    2017-01-01

    Similarity relations applied to ice cloud radiance calculations are theoretically analyzed and numerically validated. If τ(1–ϖ) and τ(1–ϖg) are conserved, where τ is the optical thickness, ϖ the single-scattering albedo, and g the asymmetry factor, substantially different phase functions may give rise to similar radiances in both conservative and non-conservative scattering cases, particularly at large optical thicknesses. In addition to theoretical analysis, this study uses operational ice cloud optical thickness retrievals from the Moderate Resolution Imaging Spectroradiometer (MODIS) Level 2 Collection 5 (C5) and Collection 6 (C6) cloud property products to verify the radiative similarity relations. It is found that, if the MODIS C5 and C6 ice cloud optical thickness values are multiplied by their respective (1–ϖg) factors, the resultant products, referred to as the effective optical thicknesses, become similar, with ratio values around unity. Furthermore, the ratios of the C5 and C6 ice cloud effective optical thicknesses display an angular variation pattern similar to that of the corresponding ice cloud phase function ratios. The MODIS C5 and C6 values of the ice cloud similarity parameter, defined as [(1–ϖ)/(1–ϖg)]^(1/2), also tend to be similar. - Highlights: • Similarity relations are theoretically analyzed and validated. • Similarity relations are verified with the MODIS Level 2 Collection 5 and 6 ice cloud property products. • The product of ice cloud optical thickness and (1–ϖg) is approximately invariant. • The similarity parameter derived from the MODIS ice cloud effective radius retrieval tends to be invariant.
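    The scaling relations can be checked numerically: two retrievals with different asymmetry factors yield similar effective optical thicknesses when τ(1–ϖg) is conserved. The values below are illustrative, not MODIS retrievals:

```python
# Sketch of the similarity quantities from the text: the effective optical
# thickness tau*(1 - omega*g) and the similarity parameter
# sqrt((1 - omega)/(1 - omega*g)). All numeric values are invented.

def effective_tau(tau, omega, g):
    """Scaled (effective) optical thickness tau * (1 - omega * g)."""
    return tau * (1.0 - omega * g)

def similarity_parameter(omega, g):
    """Similarity parameter s = sqrt((1 - omega) / (1 - omega * g))."""
    return ((1.0 - omega) / (1.0 - omega * g)) ** 0.5

omega = 0.95                  # single-scattering albedo (illustrative)
tau_a, g_a = 10.0, 0.85       # hypothetical "C5-style" retrieval
tau_b, g_b = 8.02, 0.80       # hypothetical "C6-style" retrieval
ratio = effective_tau(tau_a, omega, g_a) / effective_tau(tau_b, omega, g_b)
print(round(ratio, 2))        # close to 1: the scaled thicknesses agree
```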

  17. Aircraft-based investigation of Dynamics-Aerosol-Chemistry-Cloud Interactions in Southern West Africa

    Science.gov (United States)

    Flamant, Cyrille

    2017-04-01

    The EU-funded project DACCIWA (Dynamics-Aerosol-Chemistry-Cloud Interactions in West Africa, http://www.dacciwa.eu) is investigating the relationship between weather, climate, and air pollution in southern West Africa. The air over the coastal region of West Africa is a unique mixture of natural and anthropogenic gases, liquids, and particles, emitted into an environment in which multi-layer cloud decks frequently form. These exert a large influence on the local weather and climate, mainly through their impact on radiation, the surface energy balance, and thus the diurnal cycle of the atmospheric boundary layer. The main objective of the aircraft detachment was to build robust statistics of cloud properties in southern West Africa in different chemical landscapes, in order to investigate the physical processes involved in their life cycle in such a complex chemical environment. As part of the DACCIWA field campaigns, three European aircraft (the German DLR Falcon 20, the French SAFIRE ATR 42, and the British BAS Twin Otter) conducted a total of 50 research flights across Ivory Coast, Ghana, Togo, and Benin from 27 June to 16 July 2016, for a total of 155 flight hours, including hours sponsored through 3 EUFAR projects. The aircraft were used in different ways based on their strengths, but all three carried comparable instrumentation with the capability to measure gas-phase chemistry, aerosols, and clouds, thereby generating a rich dataset of atmospheric conditions across the region. Eight types of flight objectives were pursued to achieve the goals of DACCIWA: (i) stratus clouds, (ii) land-sea breeze clouds, (iii) mid-level clouds, (iv) biogenic emissions, (v) city emissions, (vi) flaring and ship emissions, (vii) dust and biomass burning aerosols, and (viii) air-sea interactions. An overview of the DACCIWA aircraft campaign as well as first highlights from the airborne observations will be presented.

  18. Object-Based Coregistration of Terrestrial Photogrammetric and ALS Point Clouds in Forested Areas

    Science.gov (United States)

    Polewski, P.; Erickson, A.; Yao, W.; Coops, N.; Krzystek, P.; Stilla, U.

    2016-06-01

    Airborne Laser Scanning (ALS) and terrestrial photogrammetry are methods applicable for mapping forested environments. While ground-based techniques provide valuable information about the forest understory, the measured point clouds are normally expressed in a local coordinate system, whose transformation into a georeferenced system requires additional effort. In contrast, ALS point clouds are usually georeferenced, yet the point density near the ground may be poor under dense overstory conditions. In this work, we propose to combine the strengths of the two data sources by co-registering the respective point clouds, thus enriching the georeferenced ALS point cloud with detailed understory information in a fully automatic manner. Due to markedly different sensor characteristics, coregistration methods which expect a high geometric similarity between keypoints are not suitable in this setting. Instead, our method focuses on the object (tree stem) level. We first calculate approximate stem positions in the terrestrial and ALS point clouds and construct, for each stem, a descriptor which quantifies the 2D and vertical distances to other stem centers (at ground height). Then, the similarities between all descriptor pairs from the two point clouds are calculated, and standard graph maximum matching techniques are employed to compute corresponding stem pairs (tiepoints). Finally, the tiepoint subset yielding the optimal rigid transformation between the terrestrial and ALS coordinate systems is determined. We test our method on simulated tree positions and a plot situated in the northern interior of the Coast Range in western Oregon, USA, using ALS data (76 x 121 m2) and a photogrammetric point cloud (33 x 35 m2) derived from terrestrial photographs taken with a handheld camera. Results on both simulated and real data show that the proposed stem descriptors are discriminative enough to derive good correspondences. Specifically, for the real plot data, 24
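
    The descriptor-and-matching idea above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the descriptor here is just the sorted distances to the k nearest stems (invariant under the rigid transform being recovered), and a greedy pairing stands in for the graph maximum-matching step; the threshold value is an assumption.

```python
import math

def stem_descriptor(stems, i, k=3):
    """Describe stem i by the sorted 2D distances to its k nearest neighbours
    (these distances are invariant under the rigid transform we want to find)."""
    xi, yi = stems[i]
    dists = sorted(math.hypot(x - xi, y - yi)
                   for j, (x, y) in enumerate(stems) if j != i)
    return dists[:k]

def match_stems(stems_a, stems_b, k=3, tol=0.5):
    """Pair stems with the most similar descriptors (greedy stand-in for the
    graph maximum-matching step described in the paper)."""
    candidates = []
    for i in range(len(stems_a)):
        da = stem_descriptor(stems_a, i, k)
        for j in range(len(stems_b)):
            db = stem_descriptor(stems_b, j, k)
            # Euclidean distance between descriptor vectors
            score = math.hypot(*[a - b for a, b in zip(da, db)])
            candidates.append((score, i, j))
    pairs, used_a, used_b = [], set(), set()
    for score, i, j in sorted(candidates):
        if score <= tol and i not in used_a and j not in used_b:
            pairs.append((i, j))
            used_a.add(i)
            used_b.add(j)
    return pairs
```

    The resulting tiepoint pairs would then feed a least-squares estimate of the rigid transformation between the terrestrial and ALS coordinate systems.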

  19. CLUSTOM-CLOUD: In-Memory Data Grid-Based Software for Clustering 16S rRNA Sequence Data in the Cloud Environment.

    Science.gov (United States)

    Oh, Jeongsu; Choi, Chi-Hwan; Park, Min-Kyu; Kim, Byung Kwon; Hwang, Kyuin; Lee, Sang-Heon; Hong, Soon Gyu; Nasir, Arshan; Cho, Wan-Sup; Kim, Kyung Mo

    2016-01-01

    High-throughput sequencing can produce hundreds of thousands of 16S rRNA sequence reads corresponding to different organisms present in environmental samples. Typically, analysis of microbial diversity in bioinformatics starts from pre-processing followed by clustering 16S rRNA reads into relatively fewer operational taxonomic units (OTUs). The OTUs are reliable indicators of microbial diversity and greatly accelerate the downstream analysis time. However, existing hierarchical clustering algorithms, which are generally more accurate than greedy heuristic algorithms, struggle with large sequence datasets. To keep pace with the rapid rise in sequencing data, we present CLUSTOM-CLOUD, the first distributed sequence clustering program based on In-Memory Data Grid (IMDG) technology-a distributed data structure that stores all data in the main memory of multiple computing nodes. The IMDG technology helps CLUSTOM-CLOUD to enhance both its capability of handling larger datasets and its computational scalability beyond its ancestor, CLUSTOM, while maintaining high accuracy. The clustering speed of CLUSTOM-CLOUD was evaluated on published 16S rRNA human microbiome sequence datasets using a small laboratory cluster (10 nodes) and the Amazon EC2 cloud-computing environment. Under the laboratory environment, it required only ~3 hours to process a dataset of 200 K reads regardless of the complexity of the human microbiome data. In turn, one million reads were processed in approximately 20, 14, and 11 hours when utilizing 20, 30, and 40 nodes on the Amazon EC2 cloud-computing environment. The running time evaluation indicates that CLUSTOM-CLOUD can handle much larger sequence datasets than CLUSTOM and is also a scalable distributed processing system. The comparative accuracy test using 16S rRNA pyrosequences of a mock community shows that CLUSTOM-CLOUD achieves higher accuracy than DOTUR, mothur, ESPRIT-Tree, UCLUST and Swarm. CLUSTOM-CLOUD is written in Java.

  20. Using a New Event-Based Simulation Framework for Investigating Resource Provisioning in Clouds

    Directory of Open Access Journals (Sweden)

    Simon Ostermann

    2011-01-01

    Full Text Available Today, Cloud computing proposes an attractive alternative to building large-scale distributed computing environments by which resources are no longer hosted by the scientists' computational facilities, but leased from specialised data centres only when and for how long they are needed. This new class of Cloud resources raises new interesting research questions in the fields of resource management, scheduling, fault tolerance, or quality of service, requiring hundreds to thousands of experiments for finding valid solutions. To enable such research, a scalable simulation framework is typically required for early prototyping, extensive testing and validation of results before the real deployment is performed. The scope of this paper is twofold. In the first part we present GroudSim, a Grid and Cloud simulation toolkit for scientific computing based on a scalable simulation-independent discrete-event engine. GroudSim provides a comprehensive set of features for complex simulation scenarios from simple job executions on leased computing resources to file transfers, calculation of costs and background load on resources. Simulations can be parameterised and are easily extendable by probability distribution packages for failures which normally occur in complex distributed environments. Experimental results demonstrate the improved scalability of GroudSim compared to a related process-based simulation approach. In the second part, we show the use of the GroudSim simulator to analyse the problem of dynamic provisioning of Cloud resources to scientific workflows that do not benefit from sufficient Grid resources as required by their computational demands. We propose and study four strategies for provisioning and releasing Cloud resources that take into account the general leasing model encountered in today's commercial Cloud environments based on resource bulks, fuzzy descriptions and hourly payment intervals. We study the impact of our techniques to the

  1. New Stereo Vision Digital Camera System for Simultaneous Measurement of Cloud Base Height and Atmospheric Visibility

    Science.gov (United States)

    Janeiro, F. M.; Carretas, F.; Palma, N.; Ramos, P. M.; Wagner, F.

    2013-12-01

    Clouds play an important role in many aspects of everyday life. They affect both the local weather as well as the global climate and are an important parameter in climate change studies. Cloud parameters are also important for weather prediction models, which make use of actual measurements. It is thus important to have low-cost instrumentation that can be deployed in the field to measure those parameters. Such instruments should also be automated and robust, since they may be deployed in remote places and be subject to adverse weather conditions. Although clouds are very important in environmental systems, they are also an essential component of airplane safety when visual flight rules (VFR) are enforced, such as in most small aerodromes where it is not economically viable to install instruments for assisted flying. Under VFR there are strict limits on the height of the cloud base, cloud cover and atmospheric visibility that ensure the safety of pilots and planes. Although there are instruments available on the market to measure those parameters, their relatively high cost makes them unavailable at many local aerodromes. In this work we present a new prototype which has been recently developed and deployed in a local aerodrome as a proof of concept. It is composed of two digital cameras that capture photographs of the sky and allow the measurement of the cloud height from the parallax effect. The new developments consist of a new geometry which allows the simultaneous measurement of cloud base height, wind speed at cloud base height and atmospheric visibility, which was not previously possible with only two cameras. The new orientation of the cameras comes at the cost of a more complex geometry to measure the cloud base height. The atmospheric visibility is calculated from the Lambert-Beer law after measuring the contrast between a set of dark objects and the background sky. The prototype includes the latest hardware developments that
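
    The two measurement principles described, parallax-based cloud height and Lambert-Beer visibility, reduce to a few lines of arithmetic. The sketch below is illustrative only (it assumes an idealized pinhole camera pair with parallel optical axes and the standard 2% Koschmieder contrast threshold, not the prototype's more complex geometry):

```python
import math

def cloud_base_height_m(baseline_m, focal_px, disparity_px):
    """Parallax: two upward-looking cameras a known baseline apart see the
    same cloud feature shifted by `disparity_px` pixels; height = B * f / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_px / disparity_px

def visibility_m(apparent_contrast, inherent_contrast, distance_m):
    """Lambert-Beer: the contrast of a dark object against the background sky
    decays as exp(-sigma * d); Koschmieder's 2% threshold then gives the
    visibility V = 3.912 / sigma."""
    sigma = -math.log(apparent_contrast / inherent_contrast) / distance_m
    return 3.912 / sigma
```

    For example, a 100 m baseline, a 1000 px focal length and a 50 px disparity put the cloud base at 2 km; the smaller the disparity, the higher (and less precisely measured) the cloud.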

  2. Theoretical Research Progress in High-Velocity/Hypervelocity Impact on Semi-Infinite Targets

    Directory of Open Access Journals (Sweden)

    Yunhou Sun

    2015-01-01

    Full Text Available With hypervelocity kinetic weapon and hypersonic cruise missile research projects being carried out, the damage mechanism of high-velocity/hypervelocity projectile impact on semi-infinite targets has become a research keystone in impact dynamics. Theoretical research progress in high-velocity/hypervelocity impact on semi-infinite targets is reviewed in this paper. The evaluation methods for the critical velocities of high-velocity and hypervelocity impact are summarized. The crater shape, crater scaling laws and empirical formulae, and simplified analysis models of crater parameters for spherical projectiles impacting semi-infinite targets are reviewed, as are long-rod penetration state differentiation and penetration depth calculation models for semifluid and deformed long-rod projectiles. Finally, some research proposals are given for further study.

  3. Supporting reputation based trust management enhancing security layer for cloud service models

    Science.gov (United States)

    Karthiga, R.; Vanitha, M.; Sumaiya Thaseen, I.; Mangaiyarkarasi, R.

    2017-11-01

    In the existing system, trust between cloud providers and consumers is inadequate to establish a service level agreement, even though consumer feedback is a good basis for assessing the overall reliability of cloud services. Investigators have recognized that trust can be managed and security provided based on feedback collected from participants. In this work, a face recognition system helps to identify users effectively: using an image comparison algorithm, the user's face is captured at registration time and stored in a database, and is later compared with the sample image already stored there. If the images match, the user is identified. When confidential data are outsourced to the cloud, data owners become worried about the confidentiality of their data in the cloud. Encrypting the data before outsourcing has been regarded as an important means of preserving user data privacy against the cloud server, so the AES algorithm is used to keep the data secure. Symmetric-key algorithms use a shared key; keeping data secret requires keeping this key secret, so only a user with the secret key can decrypt the data.

  4. An Intelligent and Secure Health Monitoring Scheme Using IoT Sensor Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jin-Xin Hu

    2017-01-01

    Full Text Available The Internet of Things (IoT) is the network of physical objects in which information and communication technology connect multiple embedded devices to the Internet for collecting and exchanging data. An important advancement is the ability to connect such devices to large resource pools such as the cloud. The integration of embedded devices and cloud servers offers wide applicability of IoT to many areas of our life. With the aging population increasing every day, embedded devices with cloud servers can provide the elderly with more flexible service without the need to visit hospitals. Despite the advantages of the sensor-cloud model, it still has various security threats. Therefore, the design and integration of security measures, like authentication and data confidentiality for ensuring the elderly's privacy, need to be taken into consideration. In this paper, an intelligent and secure health monitoring scheme using IoT sensors based on cloud computing and cryptography is proposed. The proposed scheme achieves authentication and provides essential security requirements.

  5. On the Large-Scaling Issues of Cloud-based Applications for Earth Science Dat

    Science.gov (United States)

    Hua, H.

    2016-12-01

    Next generation science data systems are needed to address the incoming flood of data from new missions such as NASA's SWOT and NISAR, whose SAR data volumes and data throughput rates are orders of magnitude larger than those of present-day missions. Existing missions, such as OCO-2, may also require fast turn-around times for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the processing needs. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Experience has shown that embracing efficient cloud computing approaches for large-scale science data systems requires more than just moving existing code to cloud environments. At large cloud scales, we need to deal with scaling and cost issues. We present our experiences deploying multiple instances of our hybrid-cloud computing science data system (HySDS) to support large-scale processing of Earth Science data products. We will explore optimization approaches for getting the best performance out of hybrid-cloud computing as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer 75%-90% cost savings but with an unpredictable computing environment driven by market forces.

  6. Key Based Mutual Authentication (KBMA Mechanism for Secured Access in MobiCloud Environment

    Directory of Open Access Journals (Sweden)

    Donald A. Cecil

    2016-01-01

    Full Text Available Mobile Cloud Computing (MCC) fuels innovation in Mobile Computing and opens new pathways between mobile devices and infrastructures. There are several issues in the MCC environment as it integrates various technologies. Among all issues, security lies at the top, where many users are not willing to adopt cloud services. This paper focuses on authentication. The objective of this paper is to provide a mechanism for authenticating all the entities involved in accessing cloud services. A mechanism called Key Based Mutual Authentication (KBMA) is proposed, which is divided into two processes, namely registration and authentication. Registration is a one-time process in which users are registered for accessing cloud services by giving the desired unique information. The authentication process is carried out mutually to verify the identities of the Device and the Cloud Service Provider (CSP). The Scyther tool is used for analysing vulnerability in terms of attacks. The results show that the proposed mechanism is resilient against various attacks.
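
    A mutual challenge-response exchange of the general kind KBMA describes can be sketched with HMACs over a key shared at registration time. This is a schematic illustration only; the actual KBMA message flow, key material and Scyther-verified protocol details are not reproduced here, and the key value below is a placeholder:

```python
import hmac
import hashlib
import os

def respond(key, challenge):
    """Prove knowledge of the shared key without ever transmitting it."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def mutual_auth(device_key, csp_key):
    """Each side issues a random nonce challenge and verifies the other's
    HMAC response; both checks must pass for mutual authentication."""
    c_dev, c_csp = os.urandom(16), os.urandom(16)
    # Device verifies the CSP's answer to its challenge, and vice versa
    ok_csp = hmac.compare_digest(respond(csp_key, c_dev), respond(device_key, c_dev))
    ok_dev = hmac.compare_digest(respond(device_key, c_csp), respond(csp_key, c_csp))
    return ok_csp and ok_dev
```

    Using fresh nonces each run prevents replay, and `hmac.compare_digest` avoids timing side channels in the comparison.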

  7. HealthNode: Software Framework for Efficiently Designing and Developing Cloud-Based Healthcare Applications

    Directory of Open Access Journals (Sweden)

    Ho-Kyeong Ra

    2018-01-01

    Full Text Available With the exponential improvement of software technology during the past decade, many efforts have been made to design remote and personalized healthcare applications. Many of these applications are built on mobile devices connected to the cloud. Although appealing, prototyping and validating the feasibility of an application-level idea is still challenging without a solid understanding of the cloud, mobile, and interconnectivity infrastructure. In this paper, we provide a solution by proposing HealthNode, a general-purpose framework for developing healthcare applications on cloud platforms using Node.js. To fully exploit the potential of Node.js when developing cloud applications, we focus on easing the implementation process. HealthNode presents an explicit guideline while supporting necessary features to achieve quick and expandable cloud-based healthcare applications. A case study applying HealthNode to various real-world health applications suggests that HealthNode can express architectural structure effectively within an implementation and that the proposed platform can support system understanding and software evolution.

  8. Cloud-Based Applications for Organizing and Reviewing Plastic Surgery Content.

    Science.gov (United States)

    Luan, Anna; Momeni, Arash; Lee, Gordon K; Galvez, Michael G

    2015-01-01

    Cloud-based applications including Box, Dropbox, Google Drive, Evernote, Notability, and Zotero are available for smartphones, tablets, and laptops and have revolutionized the manner in which medical students and surgeons read and utilize plastic surgery literature. Here we provide an overview of the use of Cloud computing in practice and propose an algorithm for organizing the vast amount of plastic surgery literature. Given the incredible amount of data being produced in plastic surgery and other surgical subspecialties, it is prudent for plastic surgeons to lead the process of providing solutions for the efficient organization and effective integration of the ever-increasing data into clinical practice.

  9. Droop Control with an Adjustable Complex Virtual Impedance Loop based on Cloud Model Theory

    DEFF Research Database (Denmark)

    Li, Yan; Shuai, Zhikang; Xu, Qinming

    2016-01-01

    Droop control framework with an adjustable virtual impedance loop is proposed in this paper, which is based on the cloud model theory. The proposed virtual impedance loop includes two terms: a negative virtual resistor and an adjustable virtual inductance. The negative virtual resistor term...... sometimes. The cloud model theory is applied to get online the changing line impedance value, which relies on the relevance of the reactive power responding the changing line impedance. The verification of the proposed control strategy is done according to the simulation in a low voltage microgrid in Matlab....

  10. Macrophysical and optical properties of midlatitude cirrus clouds from four ground-based lidars and collocated CALIOP observations

    Energy Technology Data Exchange (ETDEWEB)

    Dupont, Jean-Charles; Haeffelin, M.; Morille, Y.; Noel, V.; Keckhut, P.; Winker, D.; Comstock, Jennifer M.; Chervet, P.; Roblin, A.

    2010-05-27

    Ground-based lidar and CALIOP datasets gathered over four mid-latitude sites, two US and two French sites, are used to evaluate the consistency of cloud macrophysical and optical property climatologies that can be derived from such datasets. The consistency in average cloud height (both base and top height) between the CALIOP and ground datasets ranges from -0.4km to +0.5km. The cloud geometrical thickness distributions vary significantly between the different datasets, due in part to the original vertical resolutions of the lidar profiles. Average cloud geometrical thicknesses vary from 1.2 to 1.9km, i.e. by more than 50%. Cloud optical thickness distributions in subvisible, semi-transparent and moderate intervals differ by more than 50% between ground and space-based datasets. Cirrus clouds with optical thickness below 0.1 (not included in historical cloud climatologies) represent 30-50% of the non-opaque cirrus class. The differences in average cloud base altitude between ground and CALIOP datasets of 0.0-0.1 km, 0.0-0.2 km and 0.0-0.2 km can be attributed to irregular sampling of seasonal variations in the ground-based data, to day-night differences in detection capabilities by CALIOP, and to the restriction to situations without low-level clouds in ground-based data, respectively. The cloud geometrical thicknesses are not affected by irregular sampling of seasonal variations in the ground-based data, while up to 0.0-0.2 km and 0.1-0.3 km differences can be attributed to day-night differences in detection capabilities by CALIOP, and to the restriction to situations without low-level clouds in ground-based data, respectively.

  11. Energy Efficient Multiresource Allocation of Virtual Machine Based on PSO in Cloud Data Center

    Directory of Open Access Journals (Sweden)

    An-ping Xiong

    2014-01-01

    Full Text Available Presently, massive energy consumption in cloud data centers tends to be an escalating threat to the environment. To reduce energy consumption in cloud data centers, an energy efficient virtual machine allocation algorithm is proposed in this paper based on a proposed energy efficient multiresource allocation model and the particle swarm optimization (PSO) method. In this algorithm, the fitness function of PSO is defined as the total Euclidean distance to determine the optimal point between resource utilization and energy consumption. This algorithm can avoid falling into the local optima which are common in traditional heuristic algorithms. Compared to the traditional heuristic algorithms MBFD and MBFH, our algorithm shows significant energy savings in cloud data centers and also makes reasonable utilization of system resources at the same time.
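
    The core of such an approach is the fitness function that the swarm minimizes. A minimal sketch follows; the linear host power model, the normalizations and the parameter values are illustrative assumptions, not the paper's exact multiresource model:

```python
import math

def fitness(allocation, vm_load, n_hosts, p_idle=0.6, p_max=1.0):
    """Euclidean distance to the ideal operating point (full utilization of
    active hosts, zero normalized energy); lower is better for the PSO.

    `allocation[v]` is the host index assigned to VM v; `vm_load[v]` is that
    VM's normalized resource demand.
    """
    util = [0.0] * n_hosts
    for vm, host in enumerate(allocation):
        util[host] += vm_load[vm]
    if any(u > 1.0 for u in util):
        return float("inf")  # infeasible: a host is overloaded
    active = [u for u in util if u > 0.0]
    mean_util = sum(active) / len(active)
    # linear power model for each active host, normalized by the host count
    energy = sum(p_idle + (p_max - p_idle) * u for u in active) / n_hosts
    return math.hypot(1.0 - mean_util, energy)
```

    A PSO then moves candidate allocations toward low-fitness regions; note that consolidating VMs onto fewer hosts lowers both terms at once, which is exactly the utilization/energy trade-off the distance captures.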

  12. a Gross Error Elimination Method for Point Cloud Data Based on Kd-Tree

    Science.gov (United States)

    Kang, Q.; Huang, G.; Yang, S.

    2018-04-01

    Point cloud data has become one of the most widely used data sources in the field of remote sensing. Key steps of point cloud data pre-processing focus on gross error elimination and quality control. Owing to the sheer volume of point cloud data, existing gross error elimination methods spend massive resources in both space and time. This paper employs a new method that constructs a Kd-tree, searches it with the k-nearest neighbor algorithm, and applies an appropriate threshold to judge whether a target point is an outlier. Experimental results show that the proposed algorithm deletes gross errors in point cloud data while decreasing memory consumption and improving efficiency.
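
    The thresholded k-nearest-neighbor test reads roughly as below. For clarity this sketch uses a brute-force neighbor search; the Kd-tree in the paper accelerates exactly this search, and the k and n-sigma values here are illustrative defaults, not the authors' settings:

```python
import math
from statistics import mean, stdev

def knn_mean_dist(points, i, k):
    """Mean distance from point i to its k nearest neighbors (brute force)."""
    d = sorted(math.dist(points[i], p) for j, p in enumerate(points) if j != i)
    return mean(d[:k])

def remove_gross_errors(points, k=3, n_sigma=2.0):
    """Drop points whose mean k-NN distance exceeds mean + n_sigma * stdev
    of that statistic over the whole cloud; gross errors sit far from any
    neighbor and fail the test."""
    dists = [knn_mean_dist(points, i, k) for i in range(len(points))]
    mu, sd = mean(dists), stdev(dists)
    return [p for p, d in zip(points, dists) if d <= mu + n_sigma * sd]
```

    Replacing the sorted scan with a Kd-tree query turns each neighbor lookup from O(n) into O(log n), which is the efficiency gain the abstract reports.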

  13. An approach of point cloud denoising based on improved bilateral filtering

    Science.gov (United States)

    Zheng, Zeling; Jia, Songmin; Zhang, Guoliang; Li, Xiuzhi; Zhang, Xiangyin

    2018-04-01

    An omnidirectional mobile platform is designed for building point clouds based on an improved filtering algorithm employed to handle the depth image. First, the mobile platform can move flexibly and its control interface is convenient. Then, because the traditional bilateral filtering algorithm is time-consuming and inefficient, a novel method called local bilateral filtering (LBF) is proposed. LBF is applied to process depth images obtained by the Kinect sensor. The results show that noise removal is improved compared with standard bilateral filtering. In the off-line condition, the color images and processed images are used to build point clouds. Finally, experimental results demonstrate that our method improves the processing speed of depth images and the quality of the resulting point clouds.
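
    For reference, the classic bilateral filter that LBF improves upon looks like this: each output pixel is a weighted mean of its neighbors, with weights falling off with both spatial distance and depth-value difference, so depth edges are preserved. This is a plain-Python sketch of the standard filter (the paper's LBF variant and parameter choices are not reproduced):

```python
import math

def bilateral_filter(depth, radius=1, sigma_s=1.0, sigma_r=10.0):
    """Edge-preserving smoothing of a 2D depth image (list of rows).

    sigma_s controls the spatial Gaussian, sigma_r the range (depth-value)
    Gaussian; pixels across a depth discontinuity get near-zero weight.
    """
    h, w = len(depth), len(depth[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            num = den = 0.0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        ws = math.exp(-(dx * dx + dy * dy) / (2 * sigma_s ** 2))
                        wr = math.exp(-((depth[ny][nx] - depth[y][x]) ** 2)
                                      / (2 * sigma_r ** 2))
                        num += ws * wr * depth[ny][nx]
                        den += ws * wr
            out[y][x] = num / den
    return out
```

    The nested neighborhood loop is why the traditional filter is slow on full depth frames, which motivates locality restrictions like LBF.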

  14. Registration of vehicle based panoramic image and LiDAR point cloud

    Science.gov (United States)

    Chen, Changjun; Cao, Liang; Xie, Hong; Zhuo, Xiangyu

    2013-10-01

    Higher-quality surface information is obtained when data from optical images and LiDAR are integrated, owing to the fact that optical images and LiDAR point clouds have unique characteristics that make them preferable in many applications. Most previous works, however, focus on registration of pinhole perspective cameras to 2D or 3D LiDAR data. In this paper, a method for the registration of vehicle-based panoramic images and LiDAR point clouds is proposed. Using the translations among the panoramic image, single CCD image, laser scanner and Position and Orientation System (POS), along with the GPS/IMU data, precise co-registration between the panoramic image and the LiDAR point cloud in the world system is achieved. Results are presented on a real-world data set collected by a newly developed Mobile Mapping System (MMS) integrating a high-resolution panoramic camera, two laser scanners and a POS.

  15. Wireless-Uplinks-Based Energy-Efficient Scheduling in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Xing Liu

    2015-01-01

    Full Text Available Mobile cloud computing (MCC) combines cloud computing and the mobile internet to improve the computational capabilities of resource-constrained mobile devices (MDs). In MCC, mobile users can not only improve the computational capability of MDs but also save operating consumption by offloading mobile applications to the cloud. However, MCC faces the problem of energy efficiency because of time-varying channels when the offloading is being executed. In this paper, we address the issue of energy-efficient scheduling for the wireless uplink in MCC. By introducing Lyapunov optimization, we first propose a scheduling algorithm that can dynamically choose the channel to transmit data based on queue backlog and channel statistics. Then, we show that the proposed scheduling algorithm can make a tradeoff between queue backlog and energy consumption in a channel-aware MCC system. Simulation results show that the proposed scheduling algorithm can reduce the time-average energy consumption for offloading compared to the existing algorithm.
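
    The drift-plus-penalty idea behind such Lyapunov scheduling fits in one rule: at each slot, pick the channel minimizing V*power - Q*rate, where Q is the current queue backlog and V weighs energy against delay. The sketch below is schematic (the channel model and the paper's exact algorithm are not reproduced; field names are assumptions):

```python
def choose_channel(queue_backlog, channels, v=1.0):
    """Drift-plus-penalty channel selection: a large backlog Q favours fast,
    power-hungry channels; a small backlog favours energy savings."""
    return min(channels, key=lambda c: v * c["power"] - queue_backlog * c["rate"])
```

    Tuning V up biases the tradeoff toward lower time-average energy at the cost of larger average queue backlog, which is the O(1/V) vs O(V) tradeoff typical of Lyapunov optimization.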

  16. Analysis of the new health management based on health internet of things and cloud computing

    Science.gov (United States)

    Liu, Shaogang

    2018-05-01

    With the development and application of the Internet of Things and cloud technology in the medical field, a higher level of exploration space is provided for human health management. By analyzing Internet of Things technology and cloud technology, this paper studies a new form of health management system that conforms to the current social and technical level, and explores its system architecture, system characteristics and applications. The new IoT- and cloud-based health management platform can achieve real-time monitoring and prediction of human health: information gathered by a variety of sensors is transmitted over wireless networks to the monitoring system, which applies a software analysis model and gives targeted prevention and treatment measures, achieving real-time, intelligent health management.

  17. A GROSS ERROR ELIMINATION METHOD FOR POINT CLOUD DATA BASED ON KD-TREE

    Directory of Open Access Journals (Sweden)

    Q. Kang

    2018-04-01

    Full Text Available Point cloud data has become one of the most widely used data sources in the field of remote sensing. Key steps of point cloud data pre-processing focus on gross error elimination and quality control. Owing to the sheer volume of point cloud data, existing gross error elimination methods spend massive resources in both space and time. This paper employs a new method that constructs a Kd-tree, searches it with the k-nearest neighbor algorithm, and applies an appropriate threshold to judge whether a target point is an outlier. Experimental results show that the proposed algorithm deletes gross errors in point cloud data while decreasing memory consumption and improving efficiency.

  18. CIMIDx: Prototype for a Cloud-Based System to Support Intelligent Medical Image Diagnosis With Efficiency.

    Science.gov (United States)

    Bhavani, Selvaraj Rani; Senthilkumar, Jagatheesan; Chilambuchelvan, Arul Gnanaprakasam; Manjula, Dhanabalachandran; Krishnamoorthy, Ramasamy; Kannan, Arputharaj

    2015-03-27

    The Internet has greatly enhanced health care, helping patients stay up-to-date on medical issues and general knowledge. Many cancer patients use the Internet for cancer diagnosis and related information. Recently, cloud computing has emerged as a new way of delivering health services, but currently there is no generic and fully automated cloud-based self-management intervention for breast cancer patients, as practical guidelines are lacking. We investigated the prevalence and predictors of cloud use for medical diagnosis among women with breast cancer to gain insight into meaningful usage parameters for evaluating a generic, fully automated cloud-based self-intervention, by assessing how breast cancer survivors use a generic self-management model. This goal was implemented and evaluated with a new prototype called "CIMIDx", based on representative association rules that support the diagnosis of medical images (mammograms). The proposed Cloud-Based System to Support Intelligent Medical Image Diagnosis (CIMIDx) prototype includes two modules. The first is the design and development of the CIMIDx training and test cloud services. Deployed in the cloud, the prototype can be used for diagnosis and screening mammography by assessing the cancers detected, tumor sizes, histology, and stage classification accuracy. To analyze the prototype's classification accuracy, we conducted an experiment with data provided by clients. Second, by monitoring cloud server requests, the CIMIDx usage statistics were recorded for the cloud-based self-intervention groups. We conducted an evaluation of the CIMIDx cloud service usage, in which browsing functionalities were evaluated from the end-user's perspective. We performed several experiments to validate the CIMIDx prototype for breast health issues. The first set of experiments evaluated the diagnostic performance of the CIMIDx framework. We collected medical information from 150 breast cancer survivors from hospitals

  19. Unsupervised Learning Through Randomized Algorithms for High-Volume High-Velocity Data (ULTRA-HV).

    Energy Technology Data Exchange (ETDEWEB)

    Pinar, Ali [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kolda, Tamara G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Carlberg, Kevin Thomas [Wake Forest Univ., Winston-Salem, NC (United States); Ballard, Grey [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mahoney, Michael [Univ. of California, Berkeley, CA (United States)

    2018-01-01

    Through long-term investments in computing, algorithms, facilities, and instrumentation, DOE is an established leader in massive-scale, high-fidelity simulations, as well as science-leading experimentation. In both cases, DOE is generating more data than it can analyze, and the problem is intensifying quickly. The need for advanced algorithms that can automatically convert the abundance of data into a wealth of useful information by discovering hidden structures is well recognized. Such efforts, however, are hindered by the massive volume of the data and its high velocity. Here, the challenge is developing unsupervised learning methods to discover hidden structure in high-volume, high-velocity data.

  20. Media Culture 2020: Collaborative Teaching and Blended Learning Using Social Media and Cloud-Based Technologies

    Science.gov (United States)

    Vickers, Richard; Field, James; Melakoski, Cai

    2015-01-01

    In 2013 five universities from across Europe undertook an innovative project "Media Culture 2020", combining skills and forces to develop new practices that would face the challenge of the convergence of digital media, taking full advantage of social media and cloud-based technologies. The aim of the Media Culture 2020 project was to…

  1. Modeling and Security Threat Assessments of Data Processed in Cloud Based Information Systems

    Directory of Open Access Journals (Sweden)

    Darya Sergeevna Simonenkova

    2016-03-01

    Full Text Available The subject of the research is modeling and security threat assessment for data processed in cloud-based information systems (CBIS). The method allows determination of the current security threats to a CBIS, the states of the system in which vulnerabilities exist, the levels of possible violators, and the security properties, and generates recommendations for neutralizing CBIS security threats.

  2. Design and implementation of a cloud based lithography illumination pupil processing application

    Science.gov (United States)

    Zhang, Youbao; Ma, Xinghua; Zhu, Jing; Zhang, Fang; Huang, Huijie

    2017-02-01

    Pupil parameters are important for evaluating the quality of a lithography illumination system. In this paper, a cloud-based, full-featured pupil processing application is implemented. A web browser provides the UI (User Interface), the WebSocket protocol and JSON format are used for communication between client and server, and the computation is implemented on the server side, where the application integrates a variety of high-quality professional libraries, such as the image processing libraries libvips and ImageMagick and a LaTeX-based automatic reporting system, to support the program. The cloud-based framework takes advantage of the server's superior computing power and rich software collection, and the program can run anywhere there is a modern browser thanks to its web UI design. Compared with the traditional software operation model (purchased, licensed, shipped, downloaded, installed, maintained, and upgraded), the new cloud-based approach requires no installation and is easy to use and maintain, opening up a new way of working. Cloud-based applications are probably the future of software development.
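
    The paper does not publish its message schema. As a sketch of the kind of JSON request/response contract it describes between browser client and compute server, consider the following; the operation name, field names, and placeholder computation are all invented for illustration:

```python
import json

# Hypothetical server-side pupil-processing operation (stands in for the
# real image analysis done with libvips/ImageMagick on the server).
def ellipticity(params):
    return {"ellipticity": params["major"] / params["minor"]}

HANDLERS = {"ellipticity": ellipticity}

def handle_message(raw: str) -> str:
    """Parse one JSON request, dispatch it, and serialize the reply.

    In the real system this string would travel over a WebSocket; here
    we only model the encode/dispatch/decode contract."""
    req = json.loads(raw)
    handler = HANDLERS.get(req.get("op"))
    if handler is None:
        return json.dumps({"ok": False, "error": "unknown op"})
    return json.dumps({"ok": True, "result": handler(req["params"])})

reply = json.loads(handle_message(
    '{"op": "ellipticity", "params": {"major": 2.0, "minor": 1.0}}'))
print(reply)
```

Keeping the wire format to plain JSON text like this is what lets any modern browser act as the client, as the paper emphasizes.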

  3. Implementation of news module for news client based on ApiCloud

    OpenAIRE

    Fu Xin; Liang Yu; Cao Sanxing; Gu Hongbo

    2017-01-01

    With the development of new media technology, news client has become the main battlefield of news browsing. Based on the ApiCloud hybrid development platform, this paper uses HTML, JavaScript and other technologies to develop the mobile client news module, and uses WAMP integrated development environment to build a news publishing system on the server side.

  4. Using a Cloud-Based Computing Environment to Support Teacher Training on Common Core Implementation

    Science.gov (United States)

    Robertson, Cory

    2013-01-01

    A cloud-based computing environment, Google Apps for Education (GAFE), has provided the Anaheim City School District (ACSD) a comprehensive and collaborative avenue for creating, sharing, and editing documents, calendars, and social networking communities. With this environment, teachers and district staff at ACSD are able to utilize the deep…

  5. A Cloud-Based Scavenger Hunt: Orienting Undergraduates to ACS National Meetings

    Science.gov (United States)

    Kubasik, Matthew A.; Van Dyke, Aaron R.; Harper-Leatherman, Amanda S.; Miecznikowski, John R.; Steffen, L. Kraig; Smith-Carpenter, Jillian

    2016-01-01

    American Chemical Society (ACS) National Meetings are valuable for the development of undergraduate researchers but can be overwhelming for first-time attendees. To orient and engage students with the range of offerings at an ACS meeting, we developed a cloud-based scavenger hunt. Using their mobile devices, teams of undergraduates…

  6. Cloud discrimination for ASCENDS Mission Based on Optical Phase Conjugation as a Novel Approach, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — PI at ArkLight proposes a novel scheme for making cloud discrimination in the wavelength ranges of near-IR (1-1.8 μm) and mid-IR (3-4 μm). This scheme is based on...

  7. The Suitability of Cloud-Based Speech Recognition Engines for Language Learning

    Science.gov (United States)

    Daniels, Paul; Iwago, Koji

    2017-01-01

    As online automatic speech recognition (ASR) engines become more accurate and more widely implemented within CALL (computer-assisted language learning) software, it becomes important to evaluate the effectiveness and accuracy of these recognition engines using authentic speech samples. This study investigates two of the most prominent cloud-based speech recognition engines--Apple's…

  8. Implementation of news module for news client based on ApiCloud

    Directory of Open Access Journals (Sweden)

    Fu Xin

    2017-01-01

    Full Text Available With the development of new media technology, news client has become the main battlefield of news browsing. Based on the ApiCloud hybrid development platform, this paper uses HTML, JavaScript and other technologies to develop the mobile client news module, and uses WAMP integrated development environment to build a news publishing system on the server side.

  9. A Knowledge Base for Automatic Feature Recognition from Point Clouds in an Urban Scene

    Directory of Open Access Journals (Sweden)

    Xu-Feng Xing

    2018-01-01

    Full Text Available LiDAR technology can provide very detailed and highly accurate geospatial information on an urban scene for the creation of Virtual Geographic Environments (VGEs) for different applications. However, automatic 3D modeling and feature recognition from LiDAR point clouds are very complex tasks. This becomes even more complex when the data are incomplete (the occlusion problem) or uncertain. In this paper, we propose to build a knowledge base comprising an ontology and semantic rules aimed at automatic feature recognition from point clouds in support of 3D modeling. First, several ontology modules are defined from different perspectives to describe an urban scene. For instance, the spatial relations module allows the formalized representation of possible topological relations extracted from point clouds. Then, a knowledge base is proposed that contains the different concepts, their properties and their relations, together with constraints and semantic rules. Next, instances and their specific relations, which form an urban scene, are added to the knowledge base as facts. Based on the knowledge and semantic rules, a reasoning process is carried out to extract the semantic features of the objects and their components in the urban scene. Finally, several experiments are presented to show the validity of our approach in recognizing different semantic features of buildings from LiDAR point clouds.
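
    The reasoning step the abstract describes (applying semantic rules to instance facts until new feature labels emerge) can be sketched as a toy forward-chaining reasoner over subject-predicate-object triples. The rule and all identifiers below are invented for illustration and are not the paper's actual ontology:

```python
# Facts are (subject, predicate, object) triples; rules inspect the fact
# set and yield new triples. Chaining repeats until nothing new appears.
def forward_chain(facts, rules):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for rule in rules:
            for new in rule(facts):
                if new not in facts:
                    facts.add(new)
                    changed = True
    return facts

# Invented rule: a vertical planar segment adjacent to a roof is a wall.
def wall_rule(facts):
    for (s, p, o) in list(facts):
        if p == "orientation" and o == "vertical" \
                and (s, "adjacentTo", "roof1") in facts:
            yield (s, "isA", "wall")

facts = {("seg1", "orientation", "vertical"),
         ("seg1", "adjacentTo", "roof1")}
result = forward_chain(facts, [wall_rule])
print(("seg1", "isA", "wall") in result)
```

Real systems express such rules in SWRL or similar languages over an OWL ontology, but the fixed-point iteration over facts is the same idea.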

  10. Use of Cloud-Based Graphic Narrative Software in Medical Ethics Teaching

    Science.gov (United States)

    Weber, Alan S.

    2015-01-01

    Although used as a common pedagogical tool in K-12 education, online graphic narrative ("comics") software has not generally been incorporated into advanced professional or technical education. This contribution reports preliminary data from a study on the use of cloud-based graphics software Pixton.com to teach basic medical ethics…

  11. An Evaluation Methodology for the Usability and Security of Cloud-based File Sharing Technologies

    Science.gov (United States)

    2012-09-01

    Certifications (e.g., FISMA, ISO 27001, FIPS 140-2) indicate a cloud-based service's compliance with industry-standard security controls and management practices. Usability is the extent to which a product can be used "effectively, efficiently and with satisfaction" (International Organization for Standardization [ISO], 1998). Alternately, information security

  12. Students' Attitude to Cloud-Based Learning in University Diverse Environment: A Case of Russia

    Science.gov (United States)

    Atabekova, Anastasia; Gorbatenko, Rimma; Chilingaryan, Kamo

    2015-01-01

    The paper explores the ways in which Russian students with different social backgrounds view cloud-based foreign language learning. The empirical data were collected through questionnaires and in-depth interviews of students from metropolitan and regional universities, taking into account the students' family incomes, ethnic and religious…

  13. Aerosol and Cloud Properties during the Cloud Cheju ABC Plume -Asian Monsoon Experiment (CAPMEX) 2008: Linking between Ground-based and UAV Measurements

    Science.gov (United States)

    Kim, S.; Yoon, S.; Venkata Ramana, M.; Ramanathan, V.; Nguyen, H.; Park, S.; Kim, M.

    2009-12-01

    The Cheju Atmospheric Brown Cloud (ABC) Plume-Monsoon Experiment (CAPMEX), comprising comprehensive ground-based measurements and a series of data-gathering flights by specially equipped autonomous unmanned aerial vehicles (AUAVs) for aerosol and cloud observation, was conducted at Jeju (formerly Cheju), South Korea during August-September 2008, to improve our understanding of how the reduction of anthropogenic emissions in China (the so-called "great shutdown") during and after the 2008 Summer Beijing Olympic Games affected air quality and radiation budgets, and how atmospheric brown clouds (ABCs) influence the solar radiation budget off the Asian continent. Large numbers of in-situ and remote sensing instruments at the Gosan ABC observatory and miniaturized instruments on the aircraft measured a range of properties, such as the quantity of soot, size-segregated aerosol particle numbers, total particle numbers, size-segregated cloud droplet numbers (AUAV only), aerosol scattering properties (ground only), aerosol vertical distribution, column-integrated aerosol properties, and meteorological variables. By integrating ground-level and high-elevation AUAV measurements with NASA satellite observations (e.g., MODIS, CALIPSO), we investigate the long-range transport of aerosols, the impact of ABCs on clouds, and the role of biogenic and anthropogenic aerosols in cloud condensation nuclei (CCN). In this talk, we will present results from CAPMEX focusing on: (1) the characteristics of aerosol optical, physical, and chemical properties at the Gosan observatory; (2) aerosol solar heating calculated from the ground-based micro-pulse lidar and AERONET sun/sky radiometer synergy, compared with direct measurements from the UAV; and (3) aerosol-cloud interactions in conjunction with measurements by satellites and the Gosan observatory.

  14. Implementation of a 4-tier Cloud-Based Architecture for Collaborative Health Care Delivery

    Directory of Open Access Journals (Sweden)

    N. A. Azeez

    2016-06-01

    Full Text Available Cloud services permit healthcare providers to outsource information handling and provide different service resources, such as Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS), on the Internet, given that security and information-ownership concerns are attended to. Health Care Providers (HCPs) in Nigeria, however, have been confronted with various issues because of their methods of operation. Among these issues are ill-advised methods of data storage and the unreliable nature of patient medical records. Beyond these challenges, difficulty in accessing quality healthcare services, the high cost of medical services, and wrong diagnosis and treatment methodologies are not left out. Cloud computing has proved capable of providing a proficient and reliable means of securing medical information, and data mining tools in this form of distributed system will go a long way toward achieving the objectives set out for this project. The aim of this research, therefore, is to implement a cloud-based architecture suitable for integrating healthcare delivery into the cloud to provide a productive mode of operation. The proposed architecture consists of four phases (4-tier): a User Authentication and Access Control Engine (UAACE), which prevents unauthorized access to patient medical records and also utilizes standard encryption/decryption techniques to ensure the privacy of such records; a Data Analysis and Pattern Prediction Unit (DAPPU), which provides valuable data that guides decision making through standard data mining procedures; the Cloud Service Provider (CSP); and the Health Care Providers (HCPs). The architecture, which has been implemented on CloudSim, has proved to be efficient and reliable based on the results obtained when compared with previous work.
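
    The UAACE tier combines authentication with role-based access control. A minimal stdlib-only sketch of that combination is below; the user names, roles, permission table, and PBKDF2 parameters are all invented, and the paper's actual engine is not described at this level of detail:

```python
import hashlib
import hmac

# Toy credential store: salted PBKDF2 password hashes (values invented).
USERS = {"dr_ada": (b"salt1",
                    hashlib.pbkdf2_hmac("sha256", b"s3cret", b"salt1", 100_000))}
ROLES = {"dr_ada": "physician"}
PERMISSIONS = {"physician": {"read_record", "write_record"},
               "clerk": {"read_record"}}

def authenticate(user, password):
    """Check a password against the stored salted hash."""
    entry = USERS.get(user)
    if entry is None:
        return False
    salt, digest = entry
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)  # constant-time compare

def authorize(user, action):
    """Role-based access control: map user -> role -> allowed actions."""
    return action in PERMISSIONS.get(ROLES.get(user, ""), set())

print(authenticate("dr_ada", "s3cret"), authorize("dr_ada", "write_record"))
```

Separating the two checks mirrors the paper's design: authentication establishes identity once, while authorization is consulted on every record access.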

  15. A combined spectral and object-based approach to transparent cloud removal in an operational setting for Landsat ETM+

    Science.gov (United States)

    Watmough, Gary R.; Atkinson, Peter M.; Hutton, Craig W.

    2011-04-01

    The automated cloud cover assessment (ACCA) algorithm has provided automated estimates of cloud cover for the Landsat ETM+ mission since 2001. However, due to the lack of a band around 1.375 μm, cloud edges and transparent clouds such as cirrus cannot be detected. Use of Landsat ETM+ imagery for terrestrial land analysis is further hampered by the relatively long revisit period of its nadir-only viewing sensor. In this study, the ACCA threshold parameters were altered to minimise omission errors in the cloud masks. Object-based analysis was then used to reduce the commission errors introduced by the extended cloud filters. The method resulted in the removal of optically thin cirrus cloud and cloud edges, which are often missed by other methods in sub-tropical areas. Although not fully automated, the principles of the method developed here provide an opportunity for using otherwise sub-optimal or completely unusable Landsat ETM+ imagery in operational applications. Where specific images are required for particular research goals, the method can be used to remove cloud and transparent cloud, helping to reduce bias in subsequent land cover classifications.
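
    The two-stage idea (a spectral threshold that over-detects, followed by an object-based cleanup of spurious detections) can be sketched on a toy raster. The threshold value, minimum object size, and data below are invented and are not ACCA's actual band tests:

```python
import numpy as np

def mask_clouds(band, threshold, min_pixels):
    """Threshold a reflectance band, then drop connected components
    smaller than min_pixels: a crude stand-in for combined spectral
    plus object-based screening."""
    mask = band > threshold
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    for i in range(mask.shape[0]):
        for j in range(mask.shape[1]):
            if mask[i, j] and labels[i, j] == 0:
                current += 1
                stack = [(i, j)]
                labels[i, j] = current
                while stack:  # 4-connected flood fill
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                                and mask[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = current
                            stack.append((ny, nx))
    # Object-based step: discard detections too small to be real clouds.
    for lbl in range(1, current + 1):
        if (labels == lbl).sum() < min_pixels:
            mask[labels == lbl] = False
    return mask

band = np.array([[0.9, 0.9, 0.0, 0.0],
                 [0.9, 0.9, 0.0, 0.8],
                 [0.0, 0.0, 0.0, 0.0]])
m = mask_clouds(band, threshold=0.5, min_pixels=2)
print(int(m.sum()))  # the isolated bright pixel is discarded
```

In production one would use a labeling routine such as `scipy.ndimage.label` rather than a hand-rolled flood fill, but the filtering logic is the same.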

  16. International inter-rater agreement in scoring acne severity utilizing cloud-based image sharing of mobile phone photographs.

    Science.gov (United States)

    Foolad, Negar; Ornelas, Jennifer N; Clark, Ashley K; Ali, Ifrah; Sharon, Victoria R; Al Mubarak, Luluah; Lopez, Andrés; Alikhan, Ali; Al Dabagh, Bishr; Firooz, Alireza; Awasthi, Smita; Liu, Yu; Li, Chin-Shang; Sivamani, Raja K

    2017-09-01

    Cloud-based image sharing technology allows facilitated sharing of images but has not been well studied for acne assessments or treatment preferences among international evaluators. We evaluated the inter-rater variability of acne grading and treatment recommendations among an international group of dermatologists who assessed photographs. This is a prospective, single-visit photographic study assessing inter-rater agreement on acne photographs shared through an integrated mobile-device, cloud-based, HIPAA-compliant platform. Inter-rater agreement for global acne assessment and acne lesion counts was evaluated by Kendall's coefficient of concordance, while correlations between treatment recommendations and acne severity were calculated by Spearman's rank correlation coefficient. There was good agreement for the evaluation of inflammatory lesions (KCC = 0.62, P cloud-based image sharing for acne assessment. Cloud-based sharing may facilitate acne care and research among international collaborators. © 2017 The International Society of Dermatology.
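
    For readers unfamiliar with the agreement statistic used here, Kendall's coefficient of concordance (W) for m raters ranking n items can be computed from scratch as below; the rank data are made up, and no tie correction is applied:

```python
def kendalls_w(ranks):
    """Kendall's W from a list of rank vectors, one per rater.

    W = 12*S / (m^2 * (n^3 - n)), where S is the sum of squared
    deviations of the per-item rank totals from their mean.
    Ranges from 0 (no agreement) to 1 (complete agreement)."""
    m = len(ranks)          # number of raters
    n = len(ranks[0])       # number of items; each row holds ranks 1..n
    totals = [sum(r[j] for r in ranks) for j in range(n)]
    mean_total = m * (n + 1) / 2
    s = sum((t - mean_total) ** 2 for t in totals)
    return 12 * s / (m ** 2 * (n ** 3 - n))

perfect = [[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
print(kendalls_w(perfect))  # complete agreement gives 1.0
```

A reported KCC of 0.62, as in this study, therefore indicates substantial but not complete concordance among the raters.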

  17. Location-Based Services and Privacy Protection Under Mobile Cloud Computing

    OpenAIRE

    Yan, Yan; Xiaohong, Hao; Wanjun, Wang

    2015-01-01

    Location-based services can provide personalized services based on location information of moving objects and have already been widely used in public safety services, transportation, entertainment and many other areas. With the rapid development of mobile communication technology and popularization of intelligent terminals, there will be great commercial prospects to provide location-based services under mobile cloud computing environment. However, the high adhesion degree of mobile terminals...

  18. An Analysis of Cloud Model-Based Security for Computing Secure Cloud Bursting and Aggregation in Real Environment

    OpenAIRE

    Pritesh Jain; Vaishali Chourey; Dheeraj Rane

    2011-01-01

    Cloud computing has emerged as a major information and communications technology trend and has proved to be a key technology for market development and analysis for users in several fields. The practice of computing across two or more data centers separated by the Internet is growing in popularity due to an explosion in scalable computing demands. However, one of the major challenges facing cloud computing is how to secure and protect user data and the processes that operate on it. ...

  19. The Pose Estimation of Mobile Robot Based on Improved Point Cloud Registration

    Directory of Open Access Journals (Sweden)

    Yanzi Miao

    2016-03-01

    Full Text Available Due to GPS restrictions, an inertial sensor is usually used to estimate the location of indoor mobile robots. However, it is difficult to achieve high-accuracy localization and control by inertial sensors alone. In this paper, a new method is proposed to estimate an indoor mobile robot pose with six degrees of freedom based on an improved 3D Normal Distributions Transform algorithm (3D-NDT). First, point cloud data are captured by a Kinect sensor and segmented according to the distance to the robot. After the segmentation, the input point cloud data are processed by the Approximate Voxel Grid Filter algorithm with different-sized voxel grids. Second, initial registration and precise registration are performed according to the distance to the sensor: the most distant point cloud data use the 3D-NDT algorithm with large-sized voxel grids for initial registration, based on the transformation matrix from the odometry method, while the closest point cloud data use the 3D-NDT algorithm with small-sized voxel grids for precise registration. After the registrations above, a final transformation matrix is obtained. Based on this transformation matrix, the pose estimation problem of the indoor mobile robot is solved. Test results show that this method obtains accurate robot pose estimates and has good robustness.
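
    The voxel grid downsampling step the abstract describes can be sketched in a few lines: bucket points into cubic voxels and replace each bucket by its centroid. This is an approximate version in plain NumPy (the paper presumably uses a PCL-style implementation; the demo data are ours):

```python
from collections import defaultdict

import numpy as np

def voxel_grid_filter(points, voxel_size):
    """Bucket points into cubic voxels of edge length voxel_size and
    return one centroid per occupied voxel, reducing cloud density."""
    buckets = defaultdict(list)
    for p in points:
        key = tuple(np.floor(p / voxel_size).astype(int))
        buckets[key].append(p)
    return np.array([np.mean(b, axis=0) for b in buckets.values()])

pts = np.array([[0.1, 0.1, 0.1], [0.2, 0.2, 0.2],   # same voxel
                [1.5, 1.5, 1.5]])                    # different voxel
out = voxel_grid_filter(pts, voxel_size=1.0)
print(len(out))  # 3 points reduced to 2 centroids
```

Running the filter with large voxels for distant points and small voxels for near points, as the paper does, trades registration accuracy for speed where the sensor data are least reliable.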

  20. Point Cloud Based Relative Pose Estimation of a Satellite in Close Range

    Directory of Open Access Journals (Sweden)

    Lujiang Liu

    2016-06-01

    Full Text Available Determination of the relative pose of satellites is essential in space rendezvous operations and on-orbit servicing missions. The key problems are the adoption of a suitable sensor on board the chaser and efficient techniques for pose estimation. This paper aims to estimate the pose of a target satellite in close range on the basis of its known model, using point cloud data generated by a flash LIDAR sensor. A novel model-based pose estimation method is proposed; it includes a fast and reliable initial pose acquisition method based on global optimal searching that processes the dense point cloud data directly, and a pose tracking method based on the Iterative Closest Point (ICP) algorithm. A simulation system is also presented in order to evaluate the performance of the sensor and generate simulated sensor point cloud data. It also provides the true pose of the test target so that the pose estimation error can be quantified. To investigate the effectiveness of the proposed approach and the achievable pose accuracy, numerical simulation experiments were performed; the results demonstrate the algorithm's capability to operate on point clouds directly and under large pose variations. A field testing experiment was also conducted, and its results show that the proposed method is effective.
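
    At the core of each ICP iteration used in such pose tracking is the closed-form least-squares rigid transform between matched point sets (the SVD, or Kabsch, solution). A self-contained sketch with synthetic data is below; real ICP must additionally re-estimate the correspondences by nearest-neighbour search every iteration, which is omitted here:

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares R, t with R @ src_i + t ~ dst_i for known
    correspondences, via SVD of the cross-covariance matrix."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)        # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Synthetic check: rotate and translate a cloud, then recover the motion.
rng = np.random.default_rng(0)
src = rng.standard_normal((50, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
dst = src @ R_true.T + np.array([0.5, -0.2, 1.0])
R, t = best_rigid_transform(src, dst)
print(np.allclose(R, R_true), np.allclose(t, [0.5, -0.2, 1.0]))
```

The global initial acquisition step the paper proposes exists precisely because this local solver only converges when started near the true pose.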

  1. Enhancing data utilization through adoption of cloud-based data architectures (Invited Paper 211869)

    Science.gov (United States)

    Kearns, E. J.

    2017-12-01

    A traditional approach to data distribution and utilization of open government data involves continuously moving those data from a central government location to each potential user, who would then utilize them on their local computer systems. An alternate approach would be to bring those users to the open government data, where users would also have access to computing and analytics capabilities that would support data utilization. NOAA's Big Data Project is exploring such an alternate approach through an experimental collaboration with Amazon Web Services, Google Cloud Platform, IBM, Microsoft Azure, and the Open Commons Consortium. As part of this ongoing experiment, NOAA is providing open data of interest which are freely hosted by the Big Data Project Collaborators, who provide a variety of cloud-based services and capabilities to enable utilization by data users. By the terms of the agreement, the Collaborators may charge for those value-added services and processing capacities to recover their costs to freely host the data and to generate profits if so desired. Initial results have shown sustained increases in data utilization from 2 to over 100 times previously-observed access patterns from traditional approaches. Significantly increased utilization speed as compared to the traditional approach has also been observed by NOAA data users who have volunteered their experiences on these cloud-based systems. The potential for implementing and sustaining the alternate cloud-based approach as part of a change in operational data utilization strategies will be discussed.

  2. AN INTERACTIVE WEB-BASED ANALYSIS FRAMEWORK FOR REMOTE SENSING CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    X. Z. Wang

    2015-07-01

    Full Text Available Spatiotemporal data, especially remote sensing data, are widely used in ecological, geographical, agricultural, and military research and applications. With the development of remote sensing technology, more and more remote sensing data are accumulated and stored in the cloud, and an effective way for cloud users to access and analyse these massive spatiotemporal data in web clients has become an urgent issue. In this paper, we propose a new scalable, interactive, and web-based cloud computing solution for massive remote sensing data analysis. We build a spatiotemporal analysis platform to provide the end-user with a safe and convenient way to access massive remote sensing data stored in the cloud. The lightweight cloud storage system, used to store both public data and users' private data, is constructed based on an open-source distributed file system; massive remote sensing data are stored as public data, while intermediate and input data are stored as private data. The elastic, scalable, and flexible cloud computing environment is built using Docker, an open-source lightweight cloud computing container technology for the Linux operating system. In the Docker containers, open-source software such as IPython, NumPy, GDAL, and GRASS GIS is deployed. Users can write scripts in IPython Notebook web pages through the web browser to process data, and the scripts are submitted to the IPython kernel to be executed. By comparing the performance of remote sensing data analysis tasks executed in Docker containers, KVM virtual machines, and physical machines, we conclude that the cloud computing environment built with Docker makes the best use of the host system resources and can handle more concurrent spatiotemporal computing tasks. Docker technology provides resource isolation in terms of I/O, CPU, and memory, which offers a security guarantee when processing remote sensing data in the IPython Notebook

  3. Energy Productivity of the High Velocity Algae Raceway Integrated Design (ARID-HV)

    Energy Technology Data Exchange (ETDEWEB)

    Attalah, Said; Waller, Peter M.; Khawam, George; Ryan, Randy D.; Huesemann, Michael H.

    2015-06-03

    The original Algae Raceway Integrated Design (ARID) raceway was an effective method to increase algae culture temperature in open raceways. However, its energy input was high and flow mixing was poor. Thus, the High Velocity Algae Raceway Integrated Design (ARID-HV) raceway was developed to reduce energy input requirements and improve flow mixing in a serpentine flow path. A prototype ARID-HV system was installed in Tucson, Arizona. Based on algae growth simulation and hydraulic analysis, an optimal ARID-HV raceway was designed, and the electrical energy input requirement (kWh ha-1 d-1) was calculated. An algae growth model was used to compare the productivity of ARID-HV and conventional raceways. The model uses a pond surface energy balance to calculate water temperature as a function of environmental parameters. Algae growth and biomass loss are calculated based on rate constants during the day and night, respectively. A 10-year simulation of DOE strain 1412 (Chlorella sorokiniana) showed that the ARID-HV raceway had significantly higher production than a conventional raceway for all months of the year in Tucson, Arizona. It should be noted that this difference is species- and climate-specific and is not observed in other climates or with other algae species. The algae growth model results and the electrical energy input evaluation were used to compare the energy productivity (algae production rate/energy input) of the ARID-HV and conventional raceways for Chlorella sorokiniana in Tucson, Arizona. The energy productivity of the ARID-HV raceway was significantly greater than that of a conventional raceway for all months of the year.
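
    The growth-model structure described (rate constants applied during day and night, respectively) can be sketched as a minimal exponential day/night model. All rate constants and initial values below are invented for illustration; the paper's actual model also couples growth to the pond energy balance and water temperature, which is omitted here:

```python
import math

def simulate_biomass(b0, mu, r, day_hours, days):
    """Toy day/night algae growth model: exponential growth at specific
    rate mu (1/h) during daylight, first-order biomass loss at rate r
    (1/h) overnight. Returns biomass after the given number of days."""
    night_hours = 24 - day_hours
    b = b0
    for _ in range(days):
        b *= math.exp(mu * day_hours)    # photosynthetic growth
        b *= math.exp(-r * night_hours)  # respiration / biomass loss
    return b

# Hypothetical run: 0.1 g/L inoculum, 12 h days, 10-day simulation.
final = simulate_biomass(b0=0.1, mu=0.05, r=0.01, day_hours=12, days=10)
print(round(final, 3))
```

Dividing the production predicted by such a model by the measured electrical energy input gives the energy productivity metric the paper uses to compare raceway designs.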

  4. Dogs with heart diseases causing turbulent high-velocity blood flow have changes in platelet function and von Willebrand factor multimer distribution

    DEFF Research Database (Denmark)

    Tarnow, Inge; Kristensen, Annemarie Thuri; Olsen, Lisbeth Høier

    2005-01-01

    The purpose of this prospective study was to investigate platelet function using in vitro tests based on both high and low shear rates and von Willebrand factor (vWf) multimeric composition in dogs with cardiac disease and turbulent high-velocity blood flow. Client-owned asymptomatic, untreated d...

  5. Secure Cloud-Based Solutions for Different eHealth Services in Spanish Rural Health Centers

    Science.gov (United States)

    2015-01-01

    Background The combination of eHealth applications and/or services with cloud technology provides health care staff with sufficient mobility and accessibility to transparently check any data they may need without having to worry about its physical location. Objective The main aim of this paper is to put forward secure cloud-based solutions for a range of eHealth services such as electronic health records (EHRs), telecardiology, teleconsultation, and telediagnosis. Methods The scenario chosen for introducing the services is a set of four rural health centers located within the same Spanish region. iCanCloud software was used to perform simulations in the proposed scenario. We chose online traffic and the cost per unit in terms of time as the parameters for choosing the secure solution on the most optimum cloud for each service. Results We suggest that load balancers always be fitted for all solutions in communication together with several Internet service providers and that smartcards be used to maintain identity to an appropriate extent. The solutions offered via private cloud for EHRs, teleconsultation, and telediagnosis services require a volume of online traffic calculated at being able to reach 2 Gbps per consultation. This may entail an average cost of €500/month. Conclusions The security solutions put forward for each eHealth service constitute an attempt to centralize all information on the cloud, thus offering greater accessibility to medical information in the case of EHRs alongside more reliable diagnoses and treatment for telecardiology, telediagnosis, and teleconsultation services. Therefore, better health care for the rural patient can be obtained at a reasonable cost. PMID:26215155

  6. Secure Cloud-Based Solutions for Different eHealth Services in Spanish Rural Health Centers.

    Science.gov (United States)

    de la Torre-Díez, Isabel; Lopez-Coronado, Miguel; Garcia-Zapirain Soto, Begonya; Mendez-Zorrilla, Amaia

    2015-07-27

    The combination of eHealth applications and/or services with cloud technology provides health care staff with sufficient mobility and accessibility to transparently check any data they may need without having to worry about its physical location. The main aim of this paper is to put forward secure cloud-based solutions for a range of eHealth services such as electronic health records (EHRs), telecardiology, teleconsultation, and telediagnosis. The scenario chosen for introducing the services is a set of four rural health centers located within the same Spanish region. iCanCloud software was used to perform simulations in the proposed scenario. We chose online traffic and the cost per unit in terms of time as the parameters for choosing the secure solution on the most optimum cloud for each service. We suggest that load balancers always be fitted for all solutions in communication together with several Internet service providers and that smartcards be used to maintain identity to an appropriate extent. The solutions offered via private cloud for EHRs, teleconsultation, and telediagnosis services require a volume of online traffic calculated at being able to reach 2 Gbps per consultation. This may entail an average cost of €500/month. The security solutions put forward for each eHealth service constitute an attempt to centralize all information on the cloud, thus offering greater accessibility to medical information in the case of EHRs alongside more reliable diagnoses and treatment for telecardiology, telediagnosis, and teleconsultation services. Therefore, better health care for the rural patient can be obtained at a reasonable cost.

  7. High Velocity Oxidation and Hot Corrosion Resistance of Some ODS Alloys

    Science.gov (United States)

    Lowell, C. E.; Deadmore, D. L.

    1977-01-01

    Several oxide dispersion strengthened (ODS) alloys were tested for cyclic high-velocity oxidation and hot corrosion resistance. The results were compared to the resistance of an advanced NiCrAl-coated superalloy. An ODS FeCrAl alloy was identified as having sufficient oxidation and hot corrosion resistance to allow potential use in an aircraft gas turbine without a coating.

  8. Cross layer optimization for cloud-based radio over optical fiber networks

    Science.gov (United States)

    Shao, Sujie; Guo, Shaoyong; Qiu, Xuesong; Yang, Hui; Meng, Luoming

    2016-07-01

    To adapt to 5G communication, the cloud radio access network is a paradigm introduced by operators that aggregates all base stations' computational resources into a cloud BBU pool. Interactions between RRHs and BBUs, and resource scheduling among BBUs in the cloud, have become more frequent and complex with growing system scale and user requirements. This promotes networking demand among RRHs and BBUs and drives the formation of elastic optical fiber switching and networking. In such a network, multiple strata of resources (radio, optical, and BBU processing) are interwoven with each other. In this paper, we propose a novel multiple stratum optimization (MSO) architecture for cloud-based radio over optical fiber networks (C-RoFN) with software-defined networking. Additionally, a global evaluation strategy (GES) is introduced in the proposed architecture. MSO can enhance responsiveness to end-to-end user demands and globally optimize radio frequency, optical spectrum, and BBU processing resources to maximize radio coverage. The feasibility and efficiency of the proposed architecture with the GES strategy are experimentally verified on an OpenFlow-enabled testbed in terms of resource occupation and path provisioning latency.

  9. Design Intend Solving: Dynamic Composition Method for Innovative Design Based on Virtual Cloud Manufacturing Resource Generators

    Directory of Open Access Journals (Sweden)

    Yi-Cong Gao

    2013-01-01

Full Text Available Recently, there has been growing interest in the composition of cloud manufacturing resources (CMRs). Composition of CMRs is a feasible way to fulfill a user request when no single cloud manufacturing resource can satisfy the required functionality. In this paper, we propose a new case-based approach for the composition of CMRs. The basic idea is to provide a computational framework for the composition of CMRs by imitating the common design practice of reviewing past designs to obtain solution concepts for a new composite cloud manufacturing resource (CCMR). The notion of virtual cloud manufacturing resource generators (VCMRGs) is introduced to conceptualize and represent underlying CCMRs contained in existing CCMRs. VCMRGs are derived from previous CCMRs and serve as new conceptual building blocks for the composition of CMRs. Feasible composite CMRs are generated by combining VCMRGs using adaptation rules. The reuse of prior CCMRs is accomplished via VCMRGs within the framework of case-based reasoning. We demonstrate that the proposed approach yields lower execution times for fulfilling user requests and shows good scalability.

  10. a Threshold-Free Filtering Algorithm for Airborne LIDAR Point Clouds Based on Expectation-Maximization

    Science.gov (United States)

    Hui, Z.; Cheng, P.; Ziggah, Y. Y.; Nie, Y.

    2018-04-01

Filtering is a key step for most applications of airborne LiDAR point clouds. Although many filtering algorithms have been put forward in recent years, most require parameter setting or threshold tuning, which is time-consuming and reduces the degree of automation of the algorithm. To overcome this problem, this paper proposes a threshold-free filtering algorithm based on expectation-maximization (EM). The algorithm rests on the assumption that the point cloud is drawn from a mixture of Gaussian models, so separating ground points from non-ground points can be recast as separating the components of a Gaussian mixture. EM is applied to compute maximum likelihood estimates of the mixture parameters. Using the estimated parameters, the likelihood of each point belonging to ground or objects can be computed, and after several iterations each point is labelled with the component of larger likelihood. Furthermore, intensity information is used to refine the filtering results obtained with the EM method. The proposed algorithm was tested on two different datasets used in practice. Experimental results showed that the proposed method filters non-ground points effectively. For quantitative evaluation, this paper adopted the dataset provided by the ISPRS; the proposed algorithm obtains a 4.48 % total error, which is much lower than most of the eight classical filtering algorithms reported by the ISPRS.
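The separation the abstract describes, treating ground and object returns as components of a Gaussian mixture fitted by EM, can be sketched in miniature. This is a hedged illustration of the general technique on synthetic 1D elevations, not the authors' algorithm (which also exploits intensity and operates on real LiDAR tiles):

```python
import numpy as np

def em_two_gaussians(z, n_iter=50):
    """Fit a two-component 1D Gaussian mixture to elevations z via EM,
    then label each point with its most likely component."""
    mu = np.array([np.percentile(z, 25), np.percentile(z, 75)])
    sigma = np.array([z.std(), z.std()]) + 1e-6
    pi = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point.
        dens = np.stack([
            pi[k] / (sigma[k] * np.sqrt(2 * np.pi))
            * np.exp(-0.5 * ((z - mu[k]) / sigma[k]) ** 2)
            for k in range(2)
        ])
        resp = dens / dens.sum(axis=0)
        # M-step: re-estimate weights, means, and deviations.
        nk = resp.sum(axis=1)
        pi = nk / len(z)
        mu = (resp * z).sum(axis=1) / nk
        sigma = np.sqrt((resp * (z - mu[:, None]) ** 2).sum(axis=1) / nk) + 1e-6
    return resp.argmax(axis=0), mu

# Synthetic tile: ground near 100 m, vegetation/buildings near 110 m.
rng = np.random.default_rng(0)
z = np.concatenate([rng.normal(100, 0.3, 500), rng.normal(110, 2.0, 200)])
labels, mu = em_two_gaussians(z)
ground = labels == np.argmin(mu)  # lower-mean component taken as ground
```

The threshold-free property comes from the decision boundary being implied by the fitted mixture rather than set by hand.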

  11. Symmetrical compression distance for arrhythmia discrimination in cloud-based big-data services.

    Science.gov (United States)

    Lillo-Castellano, J M; Mora-Jiménez, I; Santiago-Mozos, R; Chavarría-Asso, F; Cano-González, A; García-Alberola, A; Rojo-Álvarez, J L

    2015-07-01

The current development of cloud computing is completely changing the paradigm of knowledge extraction from huge databases. An example of this technology in the cardiac arrhythmia field is the SCOOP platform, a national-level, scientific, cloud-based big-data service for implantable cardioverter defibrillators. In this scenario, we propose a new methodology for automatic classification of intracardiac electrograms (EGMs) in a cloud computing system, designed for minimal signal preprocessing. A new compression-based similarity measure (CSM) is created for low computational burden, the so-called weighted fast compression distance, which outperforms other CSMs in the literature. Using simple machine learning techniques, a set of 6848 EGMs extracted from the SCOOP platform were classified into seven cardiac arrhythmia classes and one noise class, reaching nearly 90% accuracy when previous patient arrhythmia information was available and 63% otherwise, in all cases exceeding the majority-class baseline. Results show that this methodology can be used as a high-quality cloud computing service, supporting physicians in improving patient diagnosis.
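The authors' weighted fast compression distance is their own construction; as a hedged stand-in, the classic normalized compression distance (NCD) that such measures build on can be written in a few lines with a stock compressor:

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: near 0 for very similar byte
    strings, closer to 1 for unrelated ones."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Toy "electrograms" quantised to bytes: a near-copy of a beat pattern
# should sit closer to the original than a structurally different signal.
a = bytes([10, 20, 30, 20, 10] * 40)
b = bytes([10, 21, 30, 19, 10] * 40)  # near-copy of a
c = bytes(range(200))                 # different structure, same length
```

Classification then reduces to nearest-neighbour search under this distance, which is what makes the approach attractive for minimally preprocessed signals.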

  12. Optical and geometrical properties of cirrus clouds in Amazonia derived from 1 year of ground-based lidar measurements

    Science.gov (United States)

    Gouveia, Diego A.; Barja, Boris; Barbosa, Henrique M. J.; Seifert, Patric; Baars, Holger; Pauliquevis, Theotonio; Artaxo, Paulo

    2017-03-01

Cirrus clouds cover a large fraction of tropical latitudes and play an important role in Earth's radiation budget. Their optical properties, altitude, and vertical and horizontal coverage control their radiative forcing; hence detailed cirrus measurements at different geographical locations are of utmost importance. Studies reporting cirrus properties over tropical rain forests like the Amazon, however, are scarce. Satellite profilers give no information on the diurnal cycle, and satellite imagers do not report the cloud vertical structure, while ground-based lidar studies are restricted to a few cases. In this paper, we derive the first comprehensive statistics of optical and geometrical properties of upper-tropospheric cirrus clouds in Amazonia. We used 1 year (July 2011 to June 2012) of ground-based lidar atmospheric observations north of Manaus, Brazil. This dataset was processed by an automatic cloud detection and optical properties retrieval algorithm. Upper-tropospheric cirrus clouds were observed more frequently than reported previously for tropical regions. The frequency of occurrence was found to be as high as 88 % during the wet season and not lower than 50 % during the dry season. The diurnal cycle shows a minimum around local noon and a maximum during late afternoon, associated with the diurnal cycle of precipitation. The mean values of cirrus cloud top and base heights, cloud thickness, and cloud optical depth were 14.3 ± 1.9 (SD) km, 12.9 ± 2.2 km, 1.4 ± 1.1 km, and 0.25 ± 0.46, respectively. Cirrus clouds were found at temperatures down to -90 °C. Cirrus were frequently observed within the tropical tropopause layer (TTL), likely associated with slow mesoscale uplifting or with the remnants of overshooting convection. The vertical distribution was not uniform, and thin and subvisible cirrus occurred more frequently closer to the tropopause. The mean lidar ratio was 23.3 ± 8.0 sr. However, for

  13. Cloud-Based versus Local-Based Web Development Education: An Experimental Study in Learning Experience

    Science.gov (United States)

    Pike, Ronald E.; Pittman, Jason M.; Hwang, Drew

    2017-01-01

    This paper investigates the use of a cloud computing environment to facilitate the teaching of web development at a university in the Southwestern United States. A between-subjects study of students in a web development course was conducted to assess the merits of a cloud computing environment instead of personal computers for developing websites.…

  14. Spatially Extended and High-Velocity Dispersion Molecular Component in Spiral Galaxies: Single-Dish Versus Interferometric Observations

    Science.gov (United States)

    Caldú-Primo, Anahi; Schruba, Andreas; Walter, Fabian; Leroy, Adam; Bolatto, Alberto D.; Vogel, Stuart

    2015-02-01

    Recent studies of the molecular medium in nearby galaxies have provided mounting evidence that the molecular gas can exist in two phases: one that is clumpy and organized as molecular clouds and another one that is more diffuse. This last component has a higher velocity dispersion than the clumpy one. In order to investigate these two molecular components further, we compare the fluxes and line widths of CO in NGC 4736 and NGC 5055, two nearby spiral galaxies for which high-quality interferometric as well as single-dish data sets are available. Our analysis leads to two main results: (1) employing three different methods, we determine the flux recovery of the interferometer as compared to the single-dish to be within a range of 35%-74% for NGC 4736 and 81%-92% for NGC 5055, and (2) when focusing on high (S/N ≥ 5) lines of sight (LOSs), the single-dish line widths are larger by ˜(40 ± 20)% than the ones derived from interferometric data, which is in agreement with stacking all LOSs. These results point to a molecular gas component that is distributed over spatial scales larger than 30″(˜1 kpc), and is therefore filtered out by the interferometer. The available observations do not allow us to distinguish between a truly diffuse gas morphology and a uniform distribution of small clouds that are separated by less than the synthesized beam size (˜3″ or ˜100 pc), as they would both be invisible for the interferometer. This high velocity dispersion component has a dispersion similar to what is found in the atomic medium, as traced through observations of the H i line.

  15. Spatially extended and high-velocity dispersion molecular component in spiral galaxies: Single-dish versus interferometric observations

    International Nuclear Information System (INIS)

    Caldú-Primo, Anahi; Walter, Fabian; Schruba, Andreas; Leroy, Adam; Bolatto, Alberto D.; Vogel, Stuart

    2015-01-01

    Recent studies of the molecular medium in nearby galaxies have provided mounting evidence that the molecular gas can exist in two phases: one that is clumpy and organized as molecular clouds and another one that is more diffuse. This last component has a higher velocity dispersion than the clumpy one. In order to investigate these two molecular components further, we compare the fluxes and line widths of CO in NGC 4736 and NGC 5055, two nearby spiral galaxies for which high-quality interferometric as well as single-dish data sets are available. Our analysis leads to two main results: (1) employing three different methods, we determine the flux recovery of the interferometer as compared to the single-dish to be within a range of 35%–74% for NGC 4736 and 81%–92% for NGC 5055, and (2) when focusing on high (S/N ≥ 5) lines of sight (LOSs), the single-dish line widths are larger by ∼(40 ± 20)% than the ones derived from interferometric data, which is in agreement with stacking all LOSs. These results point to a molecular gas component that is distributed over spatial scales larger than 30″(∼1 kpc), and is therefore filtered out by the interferometer. The available observations do not allow us to distinguish between a truly diffuse gas morphology and a uniform distribution of small clouds that are separated by less than the synthesized beam size (∼3″ or ∼100 pc), as they would both be invisible for the interferometer. This high velocity dispersion component has a dispersion similar to what is found in the atomic medium, as traced through observations of the H i line.

  16. Research on Environmental Adjustment of Cloud Ranch Based on BP Neural Network PID Control

    Science.gov (United States)

    Ren, Jinzhi; Xiang, Wei; Zhao, Lin; Wu, Jianbo; Huang, Lianzhen; Tu, Qinggang; Zhao, Heming

    2018-01-01

In order to gradually replace traditional manual ranch management with an intelligent mode, this paper proposes a pasture environment control system based on a cloud server and puts forward a PID control algorithm based on a BP neural network to better control temperature and humidity in the pasture environment. First, the pasture temperature and humidity (the controlled object) are modelled to obtain a transfer function. Then the traditional PID control algorithm and the BP-neural-network-based PID algorithm are applied to this transfer function. The resulting step-tracking curves show that the PID controller based on a BP neural network has an obvious advantage in settling time, error, and related measures. By computing reasonable control parameters for temperature and humidity, this algorithm can be used effectively in the cloud service platform.
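For reference, a plain positional PID loop on a first-order temperature plant looks as below. The paper's contribution is tuning the three gains online with a BP neural network; this hedged sketch keeps them fixed, and the plant constants are invented for illustration:

```python
class PID:
    """Discrete positional PID controller with fixed gains.
    (The paper tunes kp/ki/kd online with a BP neural network;
    here they are constants for illustration.)"""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# First-order "barn temperature" plant: dT/dt = (u - (T - ambient)) / tau
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.1)
T, ambient, tau = 10.0, 10.0, 5.0
for _ in range(2000):                  # simulate 200 s
    u = pid.step(25.0, T)              # heater command toward 25 degC
    T += ((u - (T - ambient)) / tau) * 0.1
```

In the BP-NN variant, kp, ki, and kd would be outputs of a small network updated by backpropagation at each step rather than constants.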

  17. Property-Based Anonymous Attestation in Trusted Cloud Computing

    Directory of Open Access Journals (Sweden)

    Zhen-Hu Ning

    2014-01-01

Full Text Available In remote attestation on the trusted cloud computing platform (TCCP), the trusted computer (TC) bears an excessive burden, and the anonymity and platform-configuration-information security of computing nodes cannot be guaranteed. To overcome these defects, based on research into and analysis of current schemes, we propose an anonymous attestation protocol based on property certificates. The platform configuration information is converted by a matrix algorithm into a property certificate, and remote attestation is implemented by a trusted ring signature scheme based on the Strong RSA Assumption. Through the property-certificate-based trusted ring signature scheme, we achieve anonymity of computing nodes and prevent leakage of platform configuration information. By simulation, we obtain the computational efficiency of the scheme. We also extend the protocol to obtain an ECC-based anonymous attestation. By scenario comparison, we show that the RSA-based trusted ring signature scheme has advantages as the number of ring members grows.

  18. A mobile cloud-based Parkinson's disease assessment system for home-based monitoring.

    Science.gov (United States)

    Pan, Di; Dhall, Rohit; Lieberman, Abraham; Petitti, Diana B

    2015-03-26

    Parkinson's disease (PD) is the most prevalent movement disorder of the central nervous system, and affects more than 6.3 million people in the world. The characteristic motor features include tremor, bradykinesia, rigidity, and impaired postural stability. Current therapy based on augmentation or replacement of dopamine is designed to improve patients' motor performance but often leads to levodopa-induced adverse effects, such as dyskinesia and motor fluctuation. Clinicians must regularly monitor patients in order to identify these effects and other declines in motor function as soon as possible. Current clinical assessment for Parkinson's is subjective and mostly conducted by brief observations made during patient visits. Changes in patients' motor function between visits are hard to track and clinicians are not able to make the most informed decisions about the course of therapy without frequent visits. Frequent clinic visits increase the physical and economic burden on patients and their families. In this project, we sought to design, develop, and evaluate a prototype mobile cloud-based mHealth app, "PD Dr", which collects quantitative and objective information about PD and would enable home-based assessment and monitoring of major PD symptoms. We designed and developed a mobile app on the Android platform to collect PD-related motion data using the smartphone 3D accelerometer and to send the data to a cloud service for storage, data processing, and PD symptoms severity estimation. To evaluate this system, data from the system were collected from 40 patients with PD and compared with experts' rating on standardized rating scales. The evaluation showed that PD Dr could effectively capture important motion features that differentiate PD severity and identify critical symptoms. For hand resting tremor detection, the sensitivity was .77 and accuracy was .82. For gait difficulty detection, the sensitivity was .89 and accuracy was .81. 
In PD severity estimation, the

  19. HIGH-VELOCITY MOLECULAR OUTFLOW IN CO J = 7-6 EMISSION FROM THE ORION HOT CORE

    International Nuclear Information System (INIS)

    Furuya, Ray S.; Shinnaga, Hiroko

    2009-01-01

Using the Caltech Submillimeter Observatory 10.4 m telescope, we performed sensitive mapping observations of 12CO J = 7-6 emission at 807 GHz toward Orion IRc2. The image has an angular resolution of 10'', the highest angular resolution published for this transition toward the Orion Hot Core. In addition, thanks to the on-the-fly mapping technique, the fidelity of the new image is rather high, particularly in comparison with previous images. We have succeeded in mapping the northwest-southeast high-velocity molecular outflow, whose terminal velocity is shifted by ∼70-85 km s^-1 with respect to the systemic velocity of the cloud. This yields an extremely short dynamical time scale of ∼900 years. The estimated outflow mass-loss rate shows an extraordinarily high value, on the order of 10^-3 M_sun yr^-1. Assuming that the outflow is driven by Orion IRc2, our result agrees with the picture so far obtained for a 20 M_sun (proto)star in the process of formation.
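A quick order-of-magnitude check of the quoted dynamical timescale: it is simply the outflow extent divided by the terminal velocity. The extent is not stated in the abstract, so the 0.07 pc below is a hypothetical value chosen only to show the arithmetic:

```python
# Dynamical timescale = outflow extent / terminal velocity.
PC_KM = 3.086e13   # kilometres per parsec
YR_S = 3.156e7     # seconds per year

extent_pc = 0.07   # assumed extent, not from the abstract
v_kms = 75.0       # mid-range of the quoted 70-85 km/s

t_dyn_yr = extent_pc * PC_KM / v_kms / YR_S  # on the order of 900 yr
```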

  20. Cloud-Based Perception and Control of Sensor Nets and Robot Swarms

    Science.gov (United States)

    2016-04-01

A distributed stream processing framework provides the necessary API and infrastructure to develop and execute such applications in a cluster of computation... streaming DDDAS applications based on the challenges they present to the backend Cloud control system. [Figure 2: Parallel SLAM Application] ...state-of-the-art deep learning-based object detectors can recognize among hundreds of object classes, and this capability would be very useful for mobile

  1. Threshold-based queuing system for performance analysis of cloud computing system with dynamic scaling

    International Nuclear Information System (INIS)

    Shorgin, Sergey Ya.; Pechinkin, Alexander V.; Samouylov, Konstantin E.; Gaidamaka, Yuliya V.; Gudkova, Irina A.; Sopin, Eduard S.

    2015-01-01

Cloud computing is a promising technology for managing and improving the utilization of computing center resources to deliver various computing and IT services. For energy saving there is no need to operate many servers unnecessarily under light loads, so they are switched off. On the other hand, some servers should be switched on under heavy loads to prevent very long delays. Thus, waiting times and system operating cost can be kept at an acceptable level by dynamically adding or removing servers. A further fact to take into account is the significant server setup cost and activation time. For better energy efficiency, a cloud computing system should not react to instantaneous increases or decreases of load. That is the main motivation for using queuing systems with hysteresis to model cloud computing systems. In the paper, we provide a model of a cloud computing system in terms of a multiple-server, threshold-based, infinite-capacity queuing system with hysteresis and noninstantaneous server activation. For the proposed model, we develop a method for computing steady-state probabilities that allows a number of performance measures to be estimated.
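The hysteresis idea the abstract motivates, separate up and down thresholds so the system ignores instantaneous load swings, can be sketched as a scaling rule. The thresholds and limits here are invented for illustration; the paper's model is an analytical queuing system, not this simulation:

```python
def scale_decision(queue_len, active_servers, low=2, high=10,
                   min_servers=1, max_servers=8):
    """Hysteresis scaling: add a server only above `high` jobs per server,
    remove one only below `low`, so brief spikes cause no churn."""
    per_server = queue_len / active_servers
    if per_server > high and active_servers < max_servers:
        return active_servers + 1
    if per_server < low and active_servers > min_servers:
        return active_servers - 1
    return active_servers

# Walk a load trace through the rule: servers ramp up, then drain back.
servers = 2
for load in [5, 25, 25, 40, 12, 6, 3, 1]:
    servers = scale_decision(load, servers)
```

Because the add and remove thresholds differ, a load hovering between them leaves the number of active servers unchanged.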

  2. A point cloud based pipeline for depth reconstruction from autostereoscopic sets

    Science.gov (United States)

    Niquin, Cédric; Prévost, Stéphanie; Remion, Yannick

    2010-02-01

We present a three-step pipeline to construct a 3D mesh of a scene from a set of N images, intended for viewing on auto-stereoscopic displays. The first step matches pixels to create a point cloud using a new graph-cut-based algorithm. It exploits the data redundancy of the N images to ensure the geometric consistency of the scene and to reduce graph complexity, speeding up the computation. It performs accurate occlusion detection, and its results can be used in applications such as view synthesis. The second step slightly moves the points along the Z-axis to refine the point cloud, using a new cost that combines occlusion positions and light variations deduced from the matching; the Z values are selected with a dynamic programming algorithm. This step generates a point cloud fine enough for applications such as augmented reality. From either of the two point clouds, the last step creates a colored mesh, a convenient data structure for graphics APIs, and also generates N depth maps, allowing comparison of our results with those of other methods.

  3. Threshold-based queuing system for performance analysis of cloud computing system with dynamic scaling

    Energy Technology Data Exchange (ETDEWEB)

    Shorgin, Sergey Ya.; Pechinkin, Alexander V. [Institute of Informatics Problems, Russian Academy of Sciences (Russian Federation); Samouylov, Konstantin E.; Gaidamaka, Yuliya V.; Gudkova, Irina A.; Sopin, Eduard S. [Telecommunication Systems Department, Peoples’ Friendship University of Russia (Russian Federation)

    2015-03-10

Cloud computing is a promising technology for managing and improving the utilization of computing center resources to deliver various computing and IT services. For energy saving there is no need to operate many servers unnecessarily under light loads, so they are switched off. On the other hand, some servers should be switched on under heavy loads to prevent very long delays. Thus, waiting times and system operating cost can be kept at an acceptable level by dynamically adding or removing servers. A further fact to take into account is the significant server setup cost and activation time. For better energy efficiency, a cloud computing system should not react to instantaneous increases or decreases of load. That is the main motivation for using queuing systems with hysteresis to model cloud computing systems. In the paper, we provide a model of a cloud computing system in terms of a multiple-server, threshold-based, infinite-capacity queuing system with hysteresis and noninstantaneous server activation. For the proposed model, we develop a method for computing steady-state probabilities that allows a number of performance measures to be estimated.

  4. The design of an m-Health monitoring system based on a cloud computing platform

    Science.gov (United States)

    Xu, Boyi; Xu, Lida; Cai, Hongming; Jiang, Lihong; Luo, Yang; Gu, Yizhi

    2017-01-01

    Compared to traditional medical services provided within hospitals, m-Health monitoring systems (MHMSs) face more challenges in personalised health data processing. To achieve personalised and high-quality health monitoring by means of new technologies, such as mobile network and cloud computing, in this paper, a framework of an m-Health monitoring system based on a cloud computing platform (Cloud-MHMS) is designed to implement pervasive health monitoring. Furthermore, the modules of the framework, which are Cloud Storage and Multiple Tenants Access Control Layer, Healthcare Data Annotation Layer, and Healthcare Data Analysis Layer, are discussed. In the data storage layer, a multiple tenant access method is designed to protect patient privacy. In the data annotation layer, linked open data are adopted to augment health data interoperability semantically. In the data analysis layer, the process mining algorithm and similarity calculating method are implemented to support personalised treatment plan selection. These three modules cooperate to implement the core functions in the process of health monitoring, which are data storage, data processing, and data analysis. Finally, we study the application of our architecture in the monitoring of antimicrobial drug usage to demonstrate the usability of our method in personal healthcare analysis.

  5. Improved retrieval of cloud base heights from ceilometer using a non-standard instrument method

    Science.gov (United States)

    Wang, Yang; Zhao, Chuanfeng; Dong, Zipeng; Li, Zhanqing; Hu, Shuzhen; Chen, Tianmeng; Tao, Fa; Wang, Yuzhao

    2018-04-01

Cloud-base height (CBH) is a basic cloud parameter but has not been measured accurately, especially under polluted conditions, owing to interference from aerosols. Taking advantage of a comprehensive field experiment in northern China, in which a variety of advanced cloud-probing instruments were operated, different methods of detecting CBH are assessed. The Micro-Pulse Lidar (MPL) and the Vaisala ceilometer (CL51) provided two types of backscatter profiles. The latter has been widely employed as a standard means of measuring CBH, using the manufacturer's operational algorithm to generate standard CBH products (CL51 MAN), whose quality is rigorously assessed here against a research algorithm we developed, the value distribution equalization (VDE) algorithm, applied to the lidar backscatter profiles from both instruments. The VDE algorithm is found to produce more accurate CBH estimates for both instruments and to cope well with heavy aerosol loading. By contrast, CL51 MAN overestimates CBH by 400 m and misses many low-level clouds under such conditions. These findings are important given that the CL51 has been adopted operationally by many meteorological stations in China.

  6. A Hierarchical Auction-Based Mechanism for Real-Time Resource Allocation in Cloud Robotic Systems.

    Science.gov (United States)

    Wang, Lujia; Liu, Ming; Meng, Max Q-H

    2017-02-01

Cloud computing enables users to share computing resources on demand. The cloud computing framework cannot be directly mapped to cloud robotic systems with ad hoc networks, since cloud robotic systems have additional constraints such as limited bandwidth and a dynamic structure. Moreover, most multirobot applications with cooperative control adopt a decentralized approach to avoid a single point of failure. Robots need to continuously exchange intensive data to execute tasks in a coordinated manner, which implies real-time requirements. Thus, a resource allocation strategy is required, especially in such resource-constrained environments. This paper proposes a hierarchical auction-based mechanism, namely the link quality matrix (LQM) auction, which is suited to ad hoc networks through the introduction of a link quality indicator. The proposed algorithm is fast, robust, accurate, and scalable, and it reduces both global communication and unnecessary repeated computation. The method is designed for firm real-time resource retrieval in physical multirobot systems. A joint surveillance scenario empirically validates the proposed mechanism by assessing several practical metrics. The results show that the proposed LQM auction outperforms state-of-the-art algorithms for resource allocation.

  7. Segmentation of Large Unstructured Point Clouds Using Octree-Based Region Growing and Conditional Random Fields

    Science.gov (United States)

    Bassier, M.; Bonduel, M.; Van Genechten, B.; Vergauwen, M.

    2017-11-01

Point cloud segmentation is a crucial step in scene understanding and interpretation. The goal is to decompose the initial data into sets of workable clusters with similar properties. Additionally, it is a key step in the automated procedure from point cloud data to BIM. Current approaches typically segment only a single type of primitive, such as planes or cylinders, tend to oversegment the data, and are often sensor- or scene-dependent. In this work, a method is presented to automatically segment large unstructured point clouds of buildings. More specifically, the segmentation is formulated as a graph optimisation problem. First, the data is oversegmented with a greedy octree-based region growing method, conditioned on the segmentation of planes as well as smooth surfaces. Next, the candidate clusters are represented by a Conditional Random Field, after which the most likely configuration of candidate clusters is computed given a set of local and contextual features. The experiments show that the method is a fast and reliable framework for unstructured point cloud segmentation. Processing speeds of up to 40,000 points per second are recorded for the region growing, and the recall and precision of the graph clustering are approximately 80%. Overall, nearly 22% of the oversegmentation is removed by clustering the data. These clusters will be classified and used as a basis for the reconstruction of BIM models.
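The greedy region-growing step can be illustrated in miniature. This hedged sketch grows regions by Euclidean proximity only; the paper's version works on an octree and additionally tests planarity/smoothness before accepting a neighbour:

```python
from collections import deque

def dist(p, q):
    """Euclidean distance between two points given as tuples."""
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def region_grow(points, radius=1.5):
    """Greedy region growing: BFS from each unvisited seed, absorbing any
    point within `radius` (a stand-in for the paper's octree neighbourhood
    plus planarity/smoothness tests)."""
    labels = [-1] * len(points)
    region = 0
    for seed in range(len(points)):
        if labels[seed] != -1:
            continue
        labels[seed] = region
        queue = deque([seed])
        while queue:
            i = queue.popleft()
            for j in range(len(points)):
                if labels[j] == -1 and dist(points[i], points[j]) <= radius:
                    labels[j] = region
                    queue.append(j)
        region += 1
    return labels

pts = [(0, 0), (1, 0), (2, 0), (10, 10), (11, 10)]
labels = region_grow(pts)  # two spatial clusters -> two regions
```

A production version would replace the O(n^2) neighbour scan with an octree or k-d tree query, which is precisely why the paper's growing is octree-based.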

  8. Satellite-based trends of solar radiation and cloud parameters in Europe

    Science.gov (United States)

    Pfeifroth, Uwe; Bojanowski, Jedrzej S.; Clerbaux, Nicolas; Manara, Veronica; Sanchez-Lorenzo, Arturo; Trentmann, Jörg; Walawender, Jakub P.; Hollmann, Rainer

    2018-04-01

Solar radiation is the main driver of the Earth's climate. Measuring solar radiation and analysing its interaction with clouds are essential for understanding the climate system. The EUMETSAT Satellite Application Facility on Climate Monitoring (CM SAF) generates satellite-based, high-quality climate data records, with a focus on the energy balance and water cycle. Here, several of these data records are analyzed in a common framework to assess the consistency of trends and spatio-temporal variability in surface solar radiation, top-of-atmosphere reflected solar radiation, and cloud fraction. This multi-parameter analysis focuses on Europe and covers the period 1992 to 2015. A high correlation between these three variables is found over Europe. Overall, the climate data records consistently reveal an increase in surface solar radiation and a decrease in top-of-atmosphere reflected radiation; these trends are corroborated by negative trends in cloud cover. This consistency documents the high quality and stability of the CM SAF climate data records, which are mostly derived independently from each other. The results of this study indicate that one of the main reasons for the positive trend in surface solar radiation since the 1990s is a decrease in cloud coverage, even if an aerosol contribution cannot be completely ruled out.

  9. 2.5D Multi-View Gait Recognition Based on Point Cloud Registration

    Science.gov (United States)

    Tang, Jin; Luo, Jian; Tjahjadi, Tardi; Gao, Yan

    2014-01-01

This paper presents a method for modeling a 2.5-dimensional (2.5D) human body and extracting gait features for identifying the human subject. To achieve view-invariant gait recognition, a multi-view synthesizing method based on point cloud registration (MVSM) is proposed to generate multi-view training galleries. The concept of a density- and curvature-based Color Gait Curvature Image is introduced to map 2.5D data onto a 2D space, enabling data dimension reduction by discrete cosine transform and 2D principal component analysis. Gait recognition is achieved via a 2.5D view-invariant gait recognition method based on point cloud registration. Experimental results on the in-house database captured by a Microsoft Kinect camera show a significant performance gain when using MVSM. PMID:24686727

  10. Impact of Arctic sea-ice retreat on the recent change in cloud-base height during autumn

    Science.gov (United States)

    Sato, K.; Inoue, J.; Kodama, Y.; Overland, J. E.

    2012-12-01

Cloud-base observations over the ice-free Chukchi and Beaufort Seas in autumn were conducted using a shipboard ceilometer and radiosondes during the 1999-2010 cruises of the Japanese R/V Mirai. To understand the recent change in cloud-base height over the Arctic Ocean, these cloud-base height data were compared with observations made under ice-covered conditions during SHEBA (the Surface Heat Budget of the Arctic Ocean project, 1998). Our ice-free results showed a 30 % decrease (increase) in the frequency of low clouds with a ceiling below (above) 500 m. Temperature profiles revealed that the boundary layer was well developed over the ice-free ocean in the 2000s, whereas a stable layer dominated during the ice-covered period in 1998. The change in surface boundary conditions likely caused the difference in cloud-base height, although it had little impact on air temperatures in the mid- and upper troposphere. Data from the 2010 R/V Mirai cruise were investigated in detail in terms of the air-sea temperature difference. This suggests that stratus clouds over the sea ice have been replaced by stratocumulus clouds with a lower cloud fraction, owing to the decrease in static stability induced by the sea-ice retreat. The relationship between cloud-base height and the air-sea temperature difference (SST-Ts) was analyzed in detail using special section data from the 2010 cruise. Stratus clouds near the sea surface were predominant under warm advection, whereas stratocumulus clouds with a cloud-free layer were significant under cold advection. The threshold difference between sea surface and air temperatures for distinguishing the dominant cloud types was 3 K. Anomalous upward turbulent heat fluxes associated with the sea-ice retreat have likely contributed to warming of the lower troposphere.
[Figure: frequency distribution of cloud-base height (km) detected by a ceilometer/lidar (black bars) and radiosondes (gray bars), with profiles of potential temperature.]
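The 3 K threshold reported above can be expressed as a trivial classifier on the air-sea temperature difference; the function name, interface, and framing below are illustrative, not taken from the paper:

```python
def classify_cloud_type(sst, t_air, threshold=3.0):
    """Classify the likely dominant boundary-layer cloud type from the
    air-sea temperature difference (SST - Ts), using the 3 K threshold
    reported in the study. Names and interface are illustrative."""
    diff = sst - t_air
    if diff >= threshold:
        # Cold advection: strong surface heating drives convective mixing,
        # lifting the cloud base (stratocumulus with a cloud-free layer).
        return "stratocumulus"
    # Warm advection: a stable layer keeps stratus near the sea surface.
    return "stratus"

print(classify_cloud_type(sst=2.0, t_air=-4.0))  # diff = 6 K
print(classify_cloud_type(sst=1.0, t_air=0.0))   # diff = 1 K
```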

  11. BUSINESS INTELLIGENCE IN CLOUD

    OpenAIRE

    Celina M. Olszak

    2014-01-01

The paper reviews and critiques current research on Business Intelligence (BI) in the cloud. This review highlights that organizations face various challenges in using cloud-based BI. The research objectives for this study are a conceptualization of the BI cloud issue, as well as an investigation of some benefits and risks of BI in the cloud. The study was based mainly on a critical analysis of the literature and some reports on the use of BI in the cloud. The results of this research can be used by IT and business leaders ...

  12. A Wing Pod-based Millimeter Wave Cloud Radar on HIAPER

    Science.gov (United States)

    Vivekanandan, Jothiram; Tsai, Peisang; Ellis, Scott; Loew, Eric; Lee, Wen-Chau; Emmett, Joanthan

    2014-05-01

    One of the attractive features of a millimeter wave radar system is its ability to detect micron-sized particles that constitute clouds with lower than 0.1 g m-3 liquid or ice water content. Scanning or vertically-pointing ground-based millimeter wavelength radars are used to study stratocumulus (Vali et al. 1998; Kollias and Albrecht 2000) and fair-weather cumulus (Kollias et al. 2001). Airborne millimeter wavelength radars have been used for atmospheric remote sensing since the early 1990s (Pazmany et al. 1995). Airborne millimeter wavelength radar systems, such as the University of Wyoming King Air Cloud Radar (WCR) and the NASA ER-2 Cloud Radar System (CRS), have added mobility to observe clouds in remote regions and over oceans. Scientific requirements of millimeter wavelength radar are mainly driven by climate and cloud initiation studies. Survey results from the cloud radar user community indicated a common preference for a narrow beam W-band radar with polarimetric and Doppler capabilities for airborne remote sensing of clouds. For detecting small amounts of liquid and ice, it is desired to have -30 dBZ sensitivity at a 10 km range. Additional desired capabilities included a second wavelength and/or dual-Doppler winds. Modern radar technology offers various options (e.g., dual-polarization and dual-wavelength). Even though a basic fixed beam Doppler radar system with a sensitivity of -30 dBZ at 10 km is capable of satisfying cloud detection requirements, the above-mentioned additional options, namely dual-wavelength, and dual-polarization, significantly extend the measurement capabilities to further reduce any uncertainty in radar-based retrievals of cloud properties. This paper describes a novel, airborne pod-based millimeter wave radar, preliminary radar measurements and corresponding derived scientific products. Since some of the primary engineering requirements of this millimeter wave radar are that it should be deployable on an airborne platform
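The quoted sensitivity figure (-30 dBZ at 10 km) implies a simple range scaling: because received power falls off as the square of range, the minimum detectable reflectivity worsens by 20·log10 of the range ratio. A small sketch of this standard radar-equation behaviour (not code from the radar project itself):

```python
import math

def min_detectable_dbz(r_km, ref_dbz=-30.0, ref_range_km=10.0):
    """Range scaling of radar sensitivity: the minimum detectable
    reflectivity degrades by 20*log10(r / r_ref) dB with range,
    standard behaviour for a meteorological radar."""
    return ref_dbz + 20.0 * math.log10(r_km / ref_range_km)

# Sensitivity at a few ranges, anchored to -30 dBZ at 10 km
for r in (1, 5, 10, 20):
    print(f"{r:>2} km: {min_detectable_dbz(r):.1f} dBZ")
```

At 1 km the same radar would detect down to -50 dBZ, which is why short-range cloud studies can see much fainter targets.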

  13. Replicas Strategy and Cache Optimization of Video Surveillance Systems Based on Cloud Storage

    Directory of Open Access Journals (Sweden)

    Rongheng Li

    2018-04-01

With the rapid development of video surveillance technology, and especially the popularity of cloud-based video surveillance applications, video data has begun to grow explosively. However, in a cloud-based video surveillance system, replicas occupy a large amount of storage space, and the slow response to video playback constrains the performance of the system. In this paper, comprehensively considering the characteristics of video data, we propose a dynamic redundant replicas mechanism based on security levels that can dynamically adjust the number of replicas. Based on the location correlation between cameras, this paper also proposes a data cache strategy to improve the response speed of data reading. Experiments illustrate that: (1) our dynamic redundant replicas mechanism can save storage space while ensuring data security; (2) the cache mechanism can predict the playback behavior of users in advance and improve the response speed of data reading according to the location and time correlation of the front-end cameras; and (3) in terms of cloud-based video surveillance, our proposed approaches significantly outperform existing methods.
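A minimal sketch of the idea of security-level-driven replica counts; the tier names, replica counts, and hot-data threshold below are assumptions for illustration, not the paper's algorithm:

```python
# Illustrative sketch (not the paper's mechanism): map a video segment's
# security level to a replica count, trading storage space for safety.
REPLICAS_BY_LEVEL = {   # assumed three-tier policy
    "high": 3,          # critical footage: full redundancy
    "medium": 2,
    "low": 1,           # routine footage: single copy saves storage
}

def target_replicas(security_level: str, access_rate: float) -> int:
    """Dynamically adjust the replica count: add a replica for hot data
    so reads can be served from more nodes. The access-rate threshold
    (requests/hour) is an illustrative assumption."""
    replicas = REPLICAS_BY_LEVEL[security_level]
    if access_rate > 100.0:
        replicas += 1
    return replicas

print(target_replicas("low", 5.0))     # 1
print(target_replicas("high", 250.0))  # 4
```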

  14. Optical fibre multi-parameter sensing with secure cloud based signal capture and processing

    Science.gov (United States)

    Newe, Thomas; O'Connell, Eoin; Meere, Damien; Yuan, Hongwei; Leen, Gabriel; O'Keeffe, Sinead; Lewis, Elfed

    2016-05-01

Recent advancements in cloud computing technologies in the context of optical and optical-fibre-based systems are reported. The proliferation of real-time, multi-channel sensor systems represents a significant growth in data volume. Coupled with a growing need for security, this poses many challenges but also presents a huge opportunity for an evolutionary step in the widespread application of these sensing technologies. A tiered infrastructural system approach is adopted that is designed to facilitate the delivery of optical-fibre-based "SENsing as a Service" (SENaaS). Within this infrastructure, novel optical sensing platforms, deployed within different environments, are interfaced with a cloud-based backbone infrastructure which facilitates the secure collection, storage and analysis of real-time data. Feedback systems, which harness this data to effect a change within the monitored location, environment or condition, are also discussed. The cloud-based system presented here can also be used with chemical and physical sensors that require real-time data analysis, processing and feedback.

  15. A cloud based tool for knowledge exchange on local scale flood risk.

    Science.gov (United States)

    Wilkinson, M E; Mackay, E; Quinn, P F; Stutter, M; Beven, K J; MacLeod, C J A; Macklin, M G; Elkhatib, Y; Percy, B; Vitolo, C; Haygarth, P M

    2015-09-15

    There is an emerging and urgent need for new approaches for the management of environmental challenges such as flood hazard in the broad context of sustainability. This requires a new way of working which bridges disciplines and organisations, and that breaks down science-culture boundaries. With this, there is growing recognition that the appropriate involvement of local communities in catchment management decisions can result in multiple benefits. However, new tools are required to connect organisations and communities. The growth of cloud based technologies offers a novel way to facilitate this process of exchange of information in environmental science and management; however, stakeholders need to be engaged with as part of the development process from the beginning rather than being presented with a final product at the end. Here we present the development of a pilot Local Environmental Virtual Observatory Flooding Tool. The aim was to develop a cloud based learning platform for stakeholders, bringing together fragmented data, models and visualisation tools that will enable these stakeholders to make scientifically informed environmental management decisions at the local scale. It has been developed by engaging with different stakeholder groups in three catchment case studies in the UK and a panel of national experts in relevant topic areas. However, these case study catchments are typical of many northern latitude catchments. The tool was designed to communicate flood risk in locally impacted communities whilst engaging with landowners/farmers about the risk of runoff from the farmed landscape. It has been developed iteratively to reflect the needs, interests and capabilities of a wide range of stakeholders. The pilot tool combines cloud based services, local catchment datasets, a hydrological model and bespoke visualisation tools to explore real time hydrometric data and the impact of flood risk caused by future land use changes. The novel aspects of the

  16. Integral Field Spectroscopy of Markarian 273: Mapping High-Velocity Gas Flows and an Off-Nucleus Seyfert 2 Nebula.

    Science.gov (United States)

    Colina; Arribas; Borne

    1999-12-10

Integral field optical spectroscopy with the INTEGRAL fiber-based system is used to map the extended ionized regions and gas flows in Mrk 273, one of the closest ultraluminous infrared galaxies. The Hβ and [O III] λ5007 maps show the presence of two distinct regions separated by 4″ (3.1 kpc) along position angle (P.A.) 240 degrees. The northeastern region coincides with the optical nucleus of the galaxy and shows the spectral characteristics of LINERs. The southwestern region is dominated by [O III] emission and is classified as a Seyfert 2. Therefore, in the optical, Mrk 273 is an ultraluminous infrared galaxy with a LINER nucleus and an extended off-nucleus Seyfert 2 nebula. The kinematics of the [O III] ionized gas shows (1) the presence of highly disturbed gas in the regions around the LINER nucleus, (2) a high-velocity gas flow with a peak-to-peak amplitude of 2.4×10³ km s⁻¹, and (3) quiescent gas in the outer regions (at 3 kpc). We hypothesize that the high-velocity flow is the starburst-driven superwind generated in an optically obscured nuclear starburst and that the quiescent gas is directly ionized by a nuclear source, similar to the ionization cones typically seen in Seyfert galaxies.

  17. Study on Cloud Security Based on Trust Spanning Tree Protocol

    Science.gov (United States)

    Lai, Yingxu; Liu, Zenghui; Pan, Qiuyue; Liu, Jing

    2015-09-01

Attacks on the Spanning Tree Protocol (STP) expose the weakness of link-layer protocols and put the higher layers in jeopardy. Although these problems have been studied for many years and various solutions have been proposed, many security issues remain. To enhance the security and credibility of layer-2 networks, we propose a trust-based spanning tree protocol aiming at achieving higher credibility of LAN switches with a simple and lightweight authentication mechanism. If correctly implemented in each trusted switch, the authentication of trust-based STP can guarantee the credibility of the topology information that is announced to other switches in the LAN. To verify the enforcement of the trusted protocol, we present a new trust evaluation method for the STP using a specification-based state model. We implement a prototype of trust-based STP to investigate its practicality. Experiments show that the trusted protocol can achieve its security goals and effectively avoid STP attacks with low computation overhead and good convergence performance.

  18. gemcWeb: A Cloud Based Nuclear Physics Simulation Software

    Science.gov (United States)

    Markelon, Sam

    2017-09-01

gemcWeb allows users to run nuclear physics simulations from the web. Being completely device agnostic, scientists can run simulations from anywhere with an Internet connection. With a full user system, gemcWeb allows users to revisit and revise their projects, and to share configurations and results with collaborators. gemcWeb is based on the simulation software gemc, which is in turn based on standard Geant4. gemcWeb requires no C++, gemc, or Geant4 knowledge. A simple but powerful GUI allows users to configure their project from geometries and configurations stored on the deployment server. Simulations are then run on the server, with results posted to the user and then securely stored. Python-based and open-source, the main version of gemcWeb is hosted internally at Jefferson National Laboratory and used by the CLAS12 and Electron-Ion Collider Project groups. However, as the software is open-source and hosted as a GitHub repository, an instance can be deployed on the open web or on any institution's intranet. An instance can be configured to host experiments specific to an institution, and the code base can be modified by any individual or group. Special thanks to: Maurizio Ungaro, Ph.D., creator of gemc; Markus Diefenthaler, Ph.D., advisor; and Kyungseon Joo, Ph.D., advisor.

  19. Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain

    Science.gov (United States)

    Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem

    2016-01-01

The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed to manage the complexity of the power and data analysis toolchain and to reduce simulation turnaround time. A system-model data repository serves as the trusted store of high-level inputs and results, while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools, and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers, enables scalable computation and storage, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools and data.

  20. Auto-Scaling of Geo-Based Image Processing in an OpenStack Cloud Computing Environment

    OpenAIRE

    Sanggoo Kang; Kiwon Lee

    2016-01-01

    Cloud computing is a base platform for the distribution of large volumes of data and high-performance image processing on the Web. Despite wide applications in Web-based services and their many benefits, geo-spatial applications based on cloud computing technology are still developing. Auto-scaling realizes automatic scalability, i.e., the scale-out and scale-in processing of virtual servers in a cloud computing environment. This study investigates the applicability of auto-scaling to geo-bas...

  1. Study into Point Cloud Geometric Rigidity and Accuracy of TLS-Based Identification of Geometric Bodies

    Science.gov (United States)

    Klapa, Przemyslaw; Mitka, Bartosz; Zygmunt, Mariusz

    2017-12-01

The capability of obtaining a multimillion point cloud in a very short time has made Terrestrial Laser Scanning (TLS) a widely used tool in many fields of science and technology. TLS accuracy matches traditional devices used in land surveying (tacheometry, GNSS-RTK), but like any measurement it is burdened with error, which affects the precise identification of objects based on their image in the form of a point cloud. A point's coordinates are determined indirectly, by measuring angles and the travel time of the electromagnetic wave; each such component has a measurement error which is translated into the final result. The XYZ coordinates of a measured point are therefore determined with some uncertainty, and the accuracy of these coordinates decreases as the distance to the instrument increases. The paper presents the results of an examination of the geometrical stability of a point cloud obtained by means of a terrestrial laser scanner, together with an accuracy evaluation of solids determined from the cloud. A Leica P40 scanner and two different arrangements of measuring points were used in the tests. The first concept involved placing a few balls in the field and scanning them from various sides at similar distances. The second part of the measurement involved placing the balls and scanning them several times from one side, but at varying distances from the instrument to the object. Each measurement comprised a scan of the object with automatic determination of its position and geometry. The desk studies involved semiautomatic fitting of solids, measurement of their geometrical elements, and comparison of the parameters that determine their geometry and location in space. The differences in the measured geometrical elements of the balls and in the translation vectors of the solids' centres indicate that the geometry of the point cloud changes with the scanning distance and scan parameters. The results indicate changes in the geometry of the scanned objects
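Fitting a sphere to a scanned ball, as done in the desk studies above, can be sketched with a standard algebraic least-squares fit; the paper's own semiautomatic fitting procedure is not specified, so this is a generic stand-in:

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit: from x.x = 2 c.x + (r^2 - c.c),
    solve the linear system A [cx, cy, cz, d]^T = b with A = [2x, 2y, 2z, 1]
    and b = |x|^2, then recover r = sqrt(d + |c|^2).
    points: (N, 3) array. Returns (center, radius)."""
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = sol[:3]
    radius = np.sqrt(sol[3] + center @ center)
    return center, radius

# Synthetic check: noisy samples on a sphere of radius 0.5 m at (1, 2, 3)
rng = np.random.default_rng(0)
d = rng.normal(size=(500, 3))
d /= np.linalg.norm(d, axis=1, keepdims=True)
pts = np.array([1.0, 2.0, 3.0]) + 0.5 * d + rng.normal(scale=1e-3, size=(500, 3))
c, r = fit_sphere(pts)
print(np.round(c, 3), round(float(r), 3))
```

Comparing the fitted centres and radii across scans at different ranges is exactly the kind of comparison the study performs on the ball targets.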

  2. Intelligent Agent Based Semantic Web in Cloud Computing Environment

    OpenAIRE

    Mukhopadhyay, Debajyoti; Sharma, Manoj; Joshi, Gajanan; Pagare, Trupti; Palwe, Adarsha

    2013-01-01

Considering today's web scenario, there is a need for effective and meaningful search over the web, which is provided by the Semantic Web. Existing search engines are keyword based; they are weak at answering intelligent queries from the user because their results depend on the information available in web pages. Semantic search engines, by contrast, provide efficient and relevant results, as the Semantic Web is an extension of the current web in which information is given well-defined meaning....

  3. Automated Coarse Registration of Point Clouds in 3d Urban Scenes Using Voxel Based Plane Constraint

    Science.gov (United States)

    Xu, Y.; Boerner, R.; Yao, W.; Hoegner, L.; Stilla, U.

    2017-09-01

For obtaining full coverage of 3D scans in a large-scale urban area, registration between point clouds acquired via terrestrial laser scanning (TLS) is normally mandatory. However, due to the complex urban environment, the automatic registration of different scans is still a challenging problem. In this work, we propose an automatic, marker-free method for fast, coarse registration between point clouds using the geometric constraints of planar patches under a voxel structure. Our proposed method consists of four major steps: voxelization of the point cloud, approximation of planar patches, matching of corresponding patches, and estimation of the transformation parameters. In the voxelization step, the point cloud of each scan is organized in a 3D voxel structure, by which the entire point cloud is partitioned into small individual patches. In the following step, we represent the points of each voxel by an approximated plane function and select those patches resembling planar surfaces. Afterwards, a RANSAC-based strategy is applied to match corresponding patches: among all the planar patches of a scan, we randomly select a set of three planar patches and build a coordinate frame from their normal vectors and intersection point. The transformation parameters between scans are calculated from two such coordinate frames. The set of planar patches whose transformation parameters yield the largest number of coplanar patches is identified as the optimal candidate for estimating the correct transformation parameters. Experimental results using TLS datasets of different scenes reveal that our proposed method is both effective and efficient for the coarse registration task. In particular, for fast orientation between scans, our proposed method achieves a registration error below approximately 2 degrees on the test datasets and is much more efficient than the classical baseline methods.
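The frame-from-three-planes step can be sketched as follows; the interface and the Gram-Schmidt construction are illustrative assumptions, and the RANSAC sampling loop and coplanarity scoring are omitted:

```python
import numpy as np

def frame_from_normals(n1, n2, n3):
    """Build an orthonormal frame from three (roughly independent)
    plane normals via Gram-Schmidt: a sketch of deriving a coordinate
    frame from a triple of planar patches."""
    e1 = n1 / np.linalg.norm(n1)
    e2 = n2 - (n2 @ e1) * e1
    e2 /= np.linalg.norm(e2)
    e3 = np.cross(e1, e2)
    return np.column_stack([e1, e2, e3])

def align_scans(normals_src, normals_dst, p_src, p_dst):
    """Rotation R and translation t mapping the source scan onto the
    destination: R rotates one frame onto the other, t aligns the
    planes' common intersection points. Interface is illustrative."""
    F_src = frame_from_normals(*normals_src)
    F_dst = frame_from_normals(*normals_dst)
    R = F_dst @ F_src.T
    t = p_dst - R @ p_src
    return R, t

# Toy example: destination scan = source rotated 90 deg about z, then shifted
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
src_n = [np.array([1.0, 0, 0]), np.array([0, 1.0, 0]), np.array([0, 0, 1.0])]
dst_n = [Rz @ n for n in src_n]
p_src = np.array([1.0, 2.0, 0.0])
R, t = align_scans(src_n, dst_n, p_src, Rz @ p_src + np.array([5.0, 0, 0]))
print(np.allclose(R, Rz), t)
```

In the full method this estimation would run inside a RANSAC loop, keeping the candidate whose transformation makes the most patches coplanar across scans.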

  4. Enhancing student motivation using LectureTools: A cloud-based teaching and learning platform

    Directory of Open Access Journals (Sweden)

    P. H. Patrio Chiu

    2015-06-01

A cloud-based teaching and learning platform, LectureTools, was piloted at City University of Hong Kong in the 2012-13 academic year. LectureTools is an online platform that provides a suite of cloud-based teaching and learning applications. It combines the functions of interactive presentation, a real-time student response system, student inquiry, and online note-taking synchronised with the presentation slides into one cloud-based platform. A comprehensive study investigated the effectiveness of the platform for enhancing student motivation among graduate (n=158) and undergraduate (n=96) students. Both groups of students reported enhanced motivation when using LectureTools. The scores on all six learning motivation scales of the Motivated Strategies for Learning Questionnaire, a psychometric instrument based on the cognitive view of motivation, increased when students engaged with the tool in class. Those who used the tool scored significantly higher on intrinsic goal orientation than those who did not. The students' quantitative feedback showed that they found the tool useful and that it improved their motivation. Qualitative feedback from the instructors indicated that the tool was useful for engaging passive students. They reported that the most useful function was the interactive online questions with real-time results, while the in-class student inquiry function was difficult to use in practice.

  5. Geo-spatial Service and Application based on National E-government Network Platform and Cloud

    Science.gov (United States)

    Meng, X.; Deng, Y.; Li, H.; Yao, L.; Shi, J.

    2014-04-01

With the acceleration of China's informatization process, the party and government have taken substantive strides in advancing the development and application of digital technology, which promotes the evolution of e-government and its informatization. Meanwhile, as a service mode based on innovative resources, cloud computing can connect huge resource pools to provide a variety of IT services, and has become a relatively mature technical pattern backed by further studies and massive practical applications. Based on cloud computing technology and the national e-government network platform, the "National Natural Resources and Geospatial Database (NRGD)" project integrated and transformed natural resources and geospatial information dispersed across various sectors and regions, established a logically unified and physically dispersed fundamental database, and developed a national integrated information database system supporting main e-government applications. Cross-sector e-government applications and services are realized to provide long-term, stable and standardized natural resources and geospatial fundamental information products and services for national e-government and public users.

  6. Fuzzy Comprehensive Evaluation of Ecological Risk Based on Cloud Model: Taking Chengchao Iron Mine as Example

    Science.gov (United States)

    Ruan, Jinghua; Chen, Yong; Xiao, Xiao; Yong, Gan; Huang, Ranran; Miao, Zuohua

    2018-01-01

To address the fuzziness and randomness inherent in the evaluation process, this paper constructs a fuzzy comprehensive evaluation method based on the cloud model. The evaluation index system was established based on inherent risk, present level and control situation, and has been shown to convey the main aspects of ecological risk in a mine at the macro level and to facilitate comparison among mines. The comment sets and membership functions improved by the cloud model can effectively reflect the unity of fuzziness and randomness. In addition, the concept of fuzzy entropy is introduced to further characterize the fuzziness of the assessment results and the complexity of the ecological problems at the target mine. A practical example at the Chengchao Iron Mine showed that the assessment results reflect the actual situation appropriately and provide new theoretical guidance for comprehensive ecological risk evaluation of underground iron mines.
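The cloud model underlying such membership functions is commonly realized with the forward normal cloud generator, parameterized by expectation Ex, entropy En and hyper-entropy He; the parameter values below are illustrative, not the paper's calibrated ones:

```python
import random, math

def cloud_droplets(Ex, En, He, n=1000, seed=42):
    """Forward normal cloud generator: each droplet first samples a
    per-droplet entropy En' ~ N(En, He^2), then a value x ~ N(Ex, En'^2),
    with certainty degree mu = exp(-(x - Ex)^2 / (2 En'^2)). The spread
    of En' is what unifies fuzziness (membership) with randomness."""
    rng = random.Random(seed)
    drops = []
    for _ in range(n):
        En_i = abs(rng.gauss(En, He)) or En  # guard against a zero draw
        x = rng.gauss(Ex, En_i)
        mu = math.exp(-((x - Ex) ** 2) / (2 * En_i ** 2))
        drops.append((x, mu))
    return drops

drops = cloud_droplets(Ex=50.0, En=5.0, He=0.5)
xs = [x for x, _ in drops]
print(round(sum(xs) / len(xs), 1))  # sample mean should be near Ex = 50
```

A comprehensive evaluation would define one such cloud per comment grade and score an index by its certainty degree under each grade's cloud.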

  7. Evaluation of secure capability-based access control in the M2M local cloud platform

    DEFF Research Database (Denmark)

    Anggorojati, Bayu; Prasad, Neeli R.; Prasad, Ramjee

    2016-01-01

Managing access to and protecting resources is one of the important aspects of managing security, especially in a distributed computing system such as Machine-to-Machine (M2M). One such platform, known as the M2M local cloud platform and referring to the BETaaS architecture [1], conceptually consists of multiple distributed M2M gateways, which creates new challenges in access control. Some existing access control systems lack the scalability and flexibility to manage access from users or entities that belong to different authorization domains, or fail to provide fine-grained and flexible access right delegation. Recently, capability-based access control has been considered as a method to manage access in the Internet of Things (IoT) or M2M domain. In this paper, the implementation and evaluation of a proposed secure capability-based access control in the M2M local cloud platform is presented.
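In its simplest form, capability-based access control binds rights to a token that any gateway can verify locally, without a central authorization server. A generic HMAC-based sketch (not the BETaaS implementation; the key handling and names are assumptions):

```python
import hmac, hashlib, json, base64

SECRET = b"gateway-shared-secret"  # assumed; real deployments negotiate keys

def issue_capability(subject, resource, rights):
    """Mint a capability token: the rights are bound to the subject and
    resource by an HMAC tag, so any gateway holding the key can verify
    the token on its own."""
    body = json.dumps({"sub": subject, "res": resource, "rights": rights},
                      sort_keys=True).encode()
    tag = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return base64.b64encode(body).decode() + "." + tag

def check_capability(token, resource, right):
    """Verify the tag, then check that the claimed rights cover the
    requested operation on the requested resource."""
    body_b64, tag = token.rsplit(".", 1)
    body = base64.b64decode(body_b64)
    expected = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, expected):
        return False  # forged or tampered token
    claims = json.loads(body)
    return claims["res"] == resource and right in claims["rights"]

tok = issue_capability("sensor-17", "/m2m/temperature", ["read"])
print(check_capability(tok, "/m2m/temperature", "read"))   # True
print(check_capability(tok, "/m2m/temperature", "write"))  # False
```

Delegation can then be modelled as minting a new token with a subset of the delegator's rights, which is the flexibility the abstract highlights.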

  8. Literature Review of Cloud Based E-learning Adoption by Students: State of the Art and Direction for Future Work

    Science.gov (United States)

    Hassan Kayali, Mohammad; Safie, Nurhizam; Mukhtar, Muriati

    2016-11-01

Cloud computing is a new paradigm shift in information technology. Most studies of the cloud are business related, while studies of cloud-based e-learning are few. The field is still in its infancy, and researchers have used several adoption theories to discover its dimensions. The purpose of this paper is to review and integrate the literature to understand the current situation of cloud-based e-learning adoption. A total of 312 articles were extracted from ScienceDirect, Emerald, and IEEE. Screening processes were applied to select only the articles related to cloud-based e-learning: 231 were removed because they relate to business organizations, and a further 63 were removed because they are technical articles. A total of 18 articles were included in this paper. A frequency analysis was conducted to identify the most frequent factors, theories, statistical software, respondents, and countries of the studies. The findings showed that usefulness and ease of use are the most frequent factors, and TAM is the most prevalent adoption theory in the literature. The mean number of respondents in the reviewed studies is 377, and Malaysia is the most researched country in terms of cloud-based e-learning. Studies of cloud-based e-learning are few, and more empirical studies are needed.

  9. SenseSeer, mobile-cloud-based lifelogging framework

    OpenAIRE

    Albatal, Rami; Gurrin, Cathal; Zhou, Jiang; Yang, Yang; Carthy, Denise; LI, Na

    2013-01-01

    Smart-phones are becoming our constant companions, they are with us all of the time, being used for calling, web surfing, apps, music listening, TV viewing, social networking, buying, gaming, and a myriad of other uses. Smart-phones are a technology that knows us much better than most of us could imagine. Based on our usage and the fact that we are never far away from our smart phones, they know where we go, who we interact with, what information we consume, and with a little clever software,...

  10. CLUSTOM-CLOUD: In-Memory Data Grid-Based Software for Clustering 16S rRNA Sequence Data in the Cloud Environment.

    Directory of Open Access Journals (Sweden)

    Jeongsu Oh

High-throughput sequencing can produce hundreds of thousands of 16S rRNA sequence reads corresponding to different organisms present in environmental samples. Typically, analysis of microbial diversity in bioinformatics starts from pre-processing, followed by clustering 16S rRNA reads into relatively few operational taxonomic units (OTUs). The OTUs are reliable indicators of microbial diversity and greatly accelerate the downstream analysis time. However, existing hierarchical clustering algorithms, which are generally more accurate than greedy heuristic algorithms, struggle with large sequence datasets. To keep pace with the rapid rise in sequencing data, we present CLUSTOM-CLOUD, the first distributed sequence clustering program based on In-Memory Data Grid (IMDG) technology, a distributed data structure that stores all data in the main memory of multiple computing nodes. The IMDG technology helps CLUSTOM-CLOUD both to handle larger datasets and to scale computationally better than its ancestor, CLUSTOM, while maintaining high accuracy. The clustering speed of CLUSTOM-CLOUD was evaluated on published 16S rRNA human microbiome sequence datasets using a small laboratory cluster (10 nodes) and the Amazon EC2 cloud-computing environment. Under the laboratory environment, it required only ~3 hours to process a dataset of 200 K reads regardless of the complexity of the human microbiome data. In turn, one million reads were processed in approximately 20, 14, and 11 hours when utilizing 20, 30, and 40 nodes on the Amazon EC2 cloud-computing environment. The running-time evaluation indicates that CLUSTOM-CLOUD can handle much larger sequence datasets than CLUSTOM and is a scalable distributed processing system. A comparative accuracy test using 16S rRNA pyrosequences of a mock community shows that CLUSTOM-CLOUD achieves higher accuracy than DOTUR, mothur, ESPRIT-Tree, UCLUST and Swarm.
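The scaling problem CLUSTOM-CLOUD attacks comes from hierarchical clustering's quadratic pairwise-distance work. A toy single-node sketch of complete-linkage OTU clustering at the conventional 97% identity cutoff; Hamming distance on equal-length toy reads stands in for the alignment-based distances real pipelines use:

```python
from itertools import combinations

def hamming(a, b):
    """Per-site mismatch fraction for equal-length sequences (toy
    distance; real pipelines use alignment-based distances)."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

def complete_linkage(seqs, cutoff=0.03):
    """Naive complete-linkage clustering at a 97% identity cutoff.
    Quadratic in the number of sequences and fully in-memory, which is
    exactly the scaling bottleneck an IMDG distributes across nodes."""
    clusters = [[s] for s in seqs]
    while True:
        best = None
        for i, j in combinations(range(len(clusters)), 2):
            # Complete linkage: distance between clusters is the max
            # pairwise distance between their members.
            d = max(hamming(a, b) for a in clusters[i] for b in clusters[j])
            if d <= cutoff and (best is None or d < best[0]):
                best = (d, i, j)
        if best is None:
            return clusters
        _, i, j = best
        clusters[i] += clusters[j]
        del clusters[j]

seqs = ["ACGTACGTACGTACGTACGTACGTACGTACGTACGT" * 3,
        "ACGTACGTACGTACGTACGTACGTACGTACGTACGA" * 3,
        "TTTTACGTACGTACGTACGTACGTACGTACGTACGT" * 3]
print(len(complete_linkage(seqs)))  # first two reads merge -> 2 OTUs
```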

  11. CLUSTOM-CLOUD: In-Memory Data Grid-Based Software for Clustering 16S rRNA Sequence Data in the Cloud Environment

    Science.gov (United States)

    Park, Min-Kyu; Kim, Byung Kwon; Hwang, Kyuin; Lee, Sang-Heon; Hong, Soon Gyu; Nasir, Arshan; Cho, Wan-Sup; Kim, Kyung Mo

    2016-01-01

High-throughput sequencing can produce hundreds of thousands of 16S rRNA sequence reads corresponding to different organisms present in the environmental samples. Typically, analysis of microbial diversity in bioinformatics starts from pre-processing followed by clustering 16S rRNA reads into relatively fewer operational taxonomic units (OTUs). The OTUs are reliable indicators of microbial diversity and greatly accelerate the downstream analysis time. However, existing hierarchical clustering algorithms that are generally more accurate than greedy heuristic algorithms struggle with large sequence datasets. To keep pace with the rapid rise in sequencing data, we present CLUSTOM-CLOUD, which is the first distributed sequence clustering program based on In-Memory Data Grid (IMDG) technology, a distributed data structure to store all data in the main memory of multiple computing nodes. The IMDG technology helps CLUSTOM-CLOUD to enhance both its capability of handling larger datasets and its computational scalability better than its ancestor, CLUSTOM, while maintaining high accuracy. Clustering speed of CLUSTOM-CLOUD was evaluated on published 16S rRNA human microbiome sequence datasets using the small laboratory cluster (10 nodes) and under the Amazon EC2 cloud-computing environments. Under the laboratory environment, it required only ~3 hours to process dataset of size 200 K reads regardless of the complexity of the human microbiome data. In turn, one million reads were processed in approximately 20, 14, and 11 hours when utilizing 20, 30, and 40 nodes on the Amazon EC2 cloud-computing environment. The running time evaluation indicates that CLUSTOM-CLOUD can handle much larger sequence datasets than CLUSTOM and is also a scalable distributed processing system. The comparative accuracy test using 16S rRNA pyrosequences of a mock community shows that CLUSTOM-CLOUD achieves higher accuracy than DOTUR, mothur, ESPRIT-Tree, UCLUST and Swarm. CLUSTOM-CLOUD is written in

  12. Geo-spatial Service and Application based on National E-government Network Platform and Cloud

    OpenAIRE

    Meng, X.; Deng, Y.; Li, H.; Yao, L.; Shi, J.

    2014-01-01

With the acceleration of China's informatization process, the party and government have taken substantive strides in advancing the development and application of digital technology, which promotes the evolution of e-government and its informatization. Meanwhile, as a service mode based on innovative resources, cloud computing can connect huge resource pools to provide a variety of IT services, and has become a relatively mature technical pattern with further studies and massive practical ap...

  13. Towards a More Reliable and Available Docker-based Container Cloud

    OpenAIRE

    Verma, Mudit; Dhawan, Mohan

    2017-01-01

    Operating System-level virtualization technology, or containers as they are commonly known, represents the next generation of light-weight virtualization, and is primarily represented by Docker. However, Docker's current design does not complement the SLAs from Docker-based container cloud offerings promising both reliability and high availability. The tight coupling between the containers and the Docker daemon proves fatal for the containers' uptime during the daemon's unavailability due to eith...

  14. Cloud-Based Evaluation of Anatomical Structure Segmentation and Landmark Detection Algorithms : VISCERAL Anatomy Benchmarks

    OpenAIRE

    Jimenez-del-Toro, Oscar; Muller, Henning; Krenn, Markus; Gruenberg, Katharina; Taha, Abdel Aziz; Winterstein, Marianne; Eggel, Ivan; Foncubierta-Rodriguez, Antonio; Goksel, Orcun; Jakab, Andres; Kontokotsios, Georgios; Langs, Georg; Menze, Bjoern H.; Fernandez, Tomas Salas; Schaer, Roger

    2016-01-01

    Variations in the shape and appearance of anatomical structures in medical images are often relevant radiological signs of disease. Automatic tools can help automate parts of this manual process. A cloud-based evaluation framework is presented in this paper including results of benchmarking current state-of-the-art medical imaging algorithms for anatomical structure segmentation and landmark detection: the VISCERAL Anatomy benchmarks. The algorithms are implemented in virtual machines in the ...

  15. Hydrogen distribution in a containment with a high-velocity hydrogen-steam source

    International Nuclear Information System (INIS)

    Bloom, G.R.; Muhlestein, L.D.; Postma, A.K.; Claybrook, S.W.

    1982-09-01

    Hydrogen mixing and distribution tests are reported for a modeled high velocity hydrogen-steam release from a postulated small pipe break or release from a pressurizer relief tank rupture disk into the lower compartment of an Ice Condenser Plant. The tests, which in most cases used helium as a simulant for hydrogen, demonstrated that the lower compartment gas was well mixed for both hydrogen release conditions used. The gas concentration differences between any spatial locations were less than 3 volume percent during the hydrogen/steam release period and were reduced to less than 0.5 volume percent within 20 minutes after termination of the hydrogen source. The high velocity hydrogen/steam jet provided the dominant mixing mechanism; however, natural convection and forced air recirculation played important roles in providing a well mixed atmosphere following termination of the hydrogen source. 5 figures, 4 tables
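    The report's well-mixed criterion (spatial concentration differences below 3 volume percent during the release, below 0.5 volume percent 20 minutes after it ends) amounts to a spread check over readings at different compartment locations. A small sketch; the sensor values below are illustrative, not the report's data.

```python
# Well-mixedness as a spread check over per-location gas concentrations.

def max_spread(concentrations):
    """Largest concentration difference between any two locations (vol %)."""
    return max(concentrations) - min(concentrations)

during_release = [7.9, 8.6, 9.2, 10.4]    # vol % during the H2/steam release
after_20_min   = [8.31, 8.45, 8.52, 8.60] # vol % 20 min after source ends

assert max_spread(during_release) < 3.0   # report's during-release criterion
assert max_spread(after_20_min) < 0.5     # report's post-release criterion
```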

  16. Towards high velocity deformation characterisation of metals and composites using Digital Image Correlation

    DEFF Research Database (Denmark)

    Eriksen, Rasmus Normann Wilken; Berggreen, Christian; Boyd, S.W

    2010-01-01

    Characterisation of materials subject to high velocity deformation is necessary as many materials behave differently under such conditions. It is particularly important for accurate numerical simulation of high strain rate events. High velocity servo-hydraulic test machines have enabled material testing in the strain rate regime from 1–500 ε/s. The range is much lower than that experienced under ballistic, shock or impact loads; nevertheless it is a useful starting point for the application of optical techniques. The present study examines the possibility of using high speed cameras to capture images and then extracting deformation data using Digital Image Correlation (DIC) from tensile testing in the intermediate strain rate regime available with the test machines. Three different materials, aluminium alloy 1050, S235 steel and glass fibre reinforced plastic (GFRP), were tested at different...
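    The strain rates quoted in the abstract follow from the usual nominal relation for a tensile test: strain rate equals actuator (crosshead) velocity divided by the specimen gauge length. A small sketch with illustrative numbers; the velocity and gauge length are assumptions, not the study's specimen dimensions.

```python
# Nominal engineering strain rate in a tensile test.

def nominal_strain_rate(velocity_m_s, gauge_length_m):
    """Strain rate (1/s) = crosshead velocity / specimen gauge length."""
    return velocity_m_s / gauge_length_m

# A 10 m/s servo-hydraulic actuator on a 20 mm gauge section reaches the
# top of the 1-500 1/s regime mentioned in the abstract:
rate = nominal_strain_rate(10.0, 0.020)  # ~500 1/s
```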

  17. Building Contour Extraction Based on LiDAR Point Cloud

    Directory of Open Access Journals (Sweden)

    Zhang Xu-Qing

    2017-01-01

    Full Text Available This paper presents a new method for extracting building contour lines from LiDAR point cloud data. Edge points between building test points are detected, and least-squares fitting is applied to obtain the building edge lines; each edge line's slope is then assigned a weight determined by the length of that edge line. The weighted means of the positive and negative slopes of the building edge lines are computed. Based on the assumption that adjacent edges are perpendicular, regularization is applied to extract a perpendicular edge skeleton. Experiments show that the extracted building edges have good accuracy and good applicability in complex urban areas.
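    The two numerical steps in the abstract, least-squares fitting of a line through detected edge points and a length-weighted mean of the fitted slopes, can be sketched as follows. The point coordinates and edge lengths are illustrative assumptions, not data or code from the paper.

```python
# Step 1: least-squares line fit through edge points.
# Step 2: mean edge slope, weighted by each edge's length.

def fit_line(points):
    """Least-squares slope/intercept of y = m*x + b through (x, y) points."""
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sxx = sum(p[0] * p[0] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b

def weighted_mean_slope(edges):
    """Mean slope over (slope, length) pairs, weighted by edge length."""
    total = sum(length for _, length in edges)
    return sum(m * length for m, length in edges) / total

# Noisy edge points roughly along y = 2x:
m, b = fit_line([(0.0, 0.1), (1.0, 1.9), (2.0, 4.1), (3.0, 5.9)])
# Two parallel edges of one building as (slope, edge length in metres);
# the longer edge dominates the building's orientation estimate:
dominant = weighted_mean_slope([(m, 12.0), (2.05, 4.0)])
```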

  18. Research on Digital Forensic Readiness Design in a Cloud Computing-Based Smart Work Environment

    Directory of Open Access Journals (Sweden)

    Sangho Park

    2018-04-01

    Full Text Available Recently, the work environments of organizations have been transitioning into smart work environments through the application of cloud computing technology. A smart work environment allows an organization's information assets to be accessed from outside the company via cloud computing, lets information be shared without restriction on location through mobile terminals, and supports effective work in varied locations and mobile settings. These changes, however, alter the security risks: the risk of leaking an organization's information assets rises because mobile terminals are easily lost or stolen, and the risk of wireless-network hacking increases in mobile environments. Given these changes, the reactive digital forensic approach, which investigates digital evidence only after a security incident has occurred, shows its limits, raising the need for proactive digital forensic approaches that can address security incidents preemptively. Accordingly, in this research we design a digital forensic readiness model at the level of preemptive prevention that accounts for the cloud computing-based smart work environment. First, we survey previous research on the cloud computing-based smart work environment and on digital forensic readiness, and we analyze a total of 50 components of digital forensic readiness. Based on this analysis, we design seven detailed areas, namely, outside the organization environment, within the organization guideline, system information, terminal information, user information, usage information, and additional function.
    Then, we design a draft of the digital forensic readiness model in the cloud

  19. Auditory velocity discrimination in the horizontal plane at very high velocities.

    Science.gov (United States)

    Frissen, Ilja; Féron, François-Xavier; Guastavino, Catherine

    2014-10-01

    We determined velocity discrimination thresholds and Weber fractions for sounds revolving around the listener at very high velocities. Sounds used were a broadband white noise and two harmonic sounds with fundamental frequencies of 330 Hz and 1760 Hz. Experiment 1 used velocities ranging between 288°/s and 720°/s in an acoustically treated room and Experiment 2 used velocities between 288°/s and 576°/s in a highly reverberant hall. A third experiment addressed potential confounds in the first two experiments. The results show that people can reliably discriminate velocity at very high velocities and that both thresholds and Weber fractions decrease as velocity increases. These results violate Weber's law but are consistent with the empirical trend observed in the literature. While thresholds for the noise and 330 Hz harmonic stimulus were similar, those for the 1760 Hz harmonic stimulus were substantially higher. There were no reliable differences in velocity discrimination between the two acoustical environments, suggesting that auditory motion perception at high velocities is robust against the effects of reverberation. Copyright © 2014 Elsevier B.V. All rights reserved.
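    A Weber fraction is the just-noticeable velocity difference divided by the base velocity, W = Δv/v; Weber's law predicts a constant W across base velocities, whereas the study reports W decreasing as velocity increases. A small sketch of the computation; the threshold values below are illustrative placeholders, not the study's measured data.

```python
# Weber fraction for auditory velocity discrimination: W = delta_v / v.

def weber_fraction(delta_v, v):
    """Relative discrimination threshold at base velocity v (deg/s)."""
    return delta_v / v

# Hypothetical thresholds at two of the tested base velocities:
w_slow = weber_fraction(delta_v=72.0, v=288.0)   # W = 0.25
w_fast = weber_fraction(delta_v=108.0, v=720.0)  # W = 0.15

# A smaller fraction at the higher velocity is the pattern the study
# reports, and it violates Weber's law (which predicts w_fast == w_slow).
assert w_fast < w_slow
```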

  20. Survey of high-velocity molecular gas in the vicinity of Herbig-Haro objects. I

    International Nuclear Information System (INIS)

    Edwards, S.; Snell, R.L.

    1983-01-01

    A survey of high-velocity molecular gas toward 49 Herbig-Haro objects is presented. Observations of the ¹²CO J = 1-0 transition obtained with the 14 m telescope of the Five College Radio Astronomy Observatory reveal three new spatially extended high-velocity molecular outflows. One is in the NGC 1333 region near HH 12, and two are in the NGC 7129 region, the first near LkHα 234 and the second near a far-infrared source. The relationship between optical Herbig-Haro emission knots and large-scale motions of the ambient molecular material is investigated, and the properties of high-velocity molecular outflows in the vicinity of Herbig-Haro objects are discussed. Of 11 energetic outflows in the vicinity of Herbig-Haro objects, eight are found in four pairs separated by 0.2-1.0 pc. We estimate that energetic outflows characterized by mass loss rates ≥ 10⁻⁷ M☉ yr⁻¹ occur for at least 10⁴ yr once in the lifetime of all stars with masses greater than 1 M☉.