WorldWideScience

Sample records for cii computers

  1. Conception and production of a time sharing system for a Mitra-15 CII mini-computer dedicated to APL

    International Nuclear Information System (INIS)

    The installation of a time-sharing system on a mini-computer poses several interesting problems. These technical problems are especially interesting when the goal is to equitably divide the physical resources of the machine amongst users of a high-level, conversational language like APL. Original solutions were necessary to retain the speed and performance of the original hardware and software. The system has been implemented in such a way that several users may simultaneously access logical resources, such as the library zones; their read/write requests are managed by semaphores which may also be directly controlled by the APL programmer. (author)
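
    The semaphore-managed zone access described above maps directly onto modern concurrency primitives. Below is a minimal sketch in Python of read/write requests on a shared library zone serialized by a semaphore; the zone name, the binary-semaphore choice, and the workload are illustrative assumptions, not details of the Mitra-15 implementation.

```python
import threading

class LibraryZone:
    """A shared workspace zone whose read/write requests are serialized
    by a semaphore, in the spirit of the time-sharing system above."""

    def __init__(self, name):
        self.name = name
        self.data = {}
        self._sem = threading.Semaphore(1)  # binary: one request at a time

    def write(self, key, value):
        with self._sem:                     # P (wait) on entry, V (signal) on exit
            self.data[key] = value

    def read(self, key):
        with self._sem:
            return self.data.get(key)

# Two concurrent "users" of the same zone.
zone = LibraryZone("WSLIB")

def user(uid):
    for i in range(1000):
        zone.write(f"var{uid}", i)
        assert zone.read(f"var{uid}") is not None

threads = [threading.Thread(target=user, args=(u,)) for u in (1, 2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("zone contents:", zone.data)
```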

  2. Velocity-resolved [CII] emission and [CII]/FIR Mapping along Orion with Herschel

    CERN Document Server

    Goicoechea, J R; Etxaluze, M; Goldsmith, P F; Ossenkopf, V; Gerin, M; Bergin, E A; Black, J H; Cernicharo, J; Cuadrado, S; Encrenaz, P; Falgarone, E; Fuente, A; Hacar, A; Lis, D C; Marcelino, N; Melnick, G J; Muller, H S P; Persson, C; Pety, J; Rollig, M; Schilke, P; Simon, R; Snell, R L; Stutzki, J

    2015-01-01

    We present the first 7.5'x11.5' velocity-resolved map of the [CII]158um line toward the Orion molecular cloud-1 (OMC-1) taken with the Herschel/HIFI instrument. In combination with far-infrared (FIR) photometric images and velocity-resolved maps of the H41alpha hydrogen recombination and CO J=2-1 lines, this data set provides an unprecedented view of the intricate small-scale kinematics of the ionized/PDR/molecular gas interfaces and of the radiative feedback from massive stars. The main contribution to the [CII] luminosity (~85%) is from the extended, FUV-illuminated face of the cloud (G_0>500, n_H>5x10^3 cm^-3) and from dense PDRs (G_0~10^4, n_H~10^5 cm^-3) at the interface between OMC-1 and the HII region surrounding the Trapezium cluster. Around 15% of the [CII] emission arises from a different gas component without CO counterpart. The [CII] excitation, PDR gas turbulence, line opacity (from [13CII]) and role of the geometry of the illuminating stars with respect to the cloud are investigated. We construct...

  3. Origin and z-distribution of Galactic diffuse [CII] emission

    CERN Document Server

    Velusamy, T

    2014-01-01

    We determine the source of the diffuse [CII] emission by studying its spatial (radial and vertical) distributions. We used the HIFI [CII] Galactic survey (GOT C+), along with HI, 12CO, and 13CO data toward 354 lines of sight, and several HIFI [CII] and [CI] position-velocity maps. We quantified the emission in each spectral line profile by evaluating the intensities in 3 km/s wide velocity bins, 'spaxels'. Using the detection of [CII] with CO or [CI], we separated the dense and diffuse gas components. We derived 2-D Galactic disk maps using the spaxel velocities for kinematic distances. We separated the warm and cold H2 gases by comparing CO emissions with and without associated [CII]. We find evidence of widespread diffuse [CII] emission with a z-scale distribution larger than that for the total [CII] or CO, and it consists of (i) diffuse molecular (CO-faint) H2 clouds and (ii) diffuse HI clouds and/or WIM. In the inner Galaxy we find a lack of [CII] detections in a majority (~62%) of HI spaxels and show tha...
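
    The 'spaxel' construction is simple to state concretely: integrate each line profile in fixed 3 km/s velocity bins and keep the significant bins. A minimal sketch (the 3 km/s bin width follows the abstract; the synthetic spectrum and the reporting threshold are invented):

```python
import numpy as np

# Synthetic [CII] spectrum: brightness temperature vs. LSR velocity.
v = np.arange(-100.0, 100.0, 0.5)                 # km/s channel grid (invented)
T = 4.0 * np.exp(-0.5 * ((v - 20.0) / 5.0) ** 2)  # one Gaussian component

# Integrate the profile in 3 km/s wide velocity bins ("spaxels").
bin_width = 3.0                                   # km/s, as in the survey
edges = np.arange(v.min(), v.max() + bin_width, bin_width)
idx = np.digitize(v, edges) - 1
dv = v[1] - v[0]
intensity = np.array([T[idx == i].sum() * dv for i in range(len(edges) - 1)])

centers = 0.5 * (edges[:-1] + edges[1:])
for vc, inten in zip(centers, intensity):
    if inten > 0.5:                               # report significant spaxels
        print(f"spaxel at {vc:+6.1f} km/s: {inten:5.2f} K km/s")
```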

  4. Comparing [CII], HI, and CO dynamics of nearby galaxies

    CERN Document Server

    de Blok, W J G; Smith, J -D T; Herrera-Camus, R; Bolatto, A D; Requena-Torres, M A; Crocker, A F; Croxall, K V; Kennicutt, R C; Koda, J; Armus, L; Boquien, M; Dale, D; Kreckel, K; Meidt, S

    2016-01-01

    The HI and CO components of the interstellar medium (ISM) are usually used to derive the dynamical mass M_dyn of nearby galaxies. Both components become too faint to be used as a tracer in observations of high-redshift galaxies. In those cases, the 158 $\mu$m line of atomic carbon [CII] may be the only way to derive M_dyn. As the distribution and kinematics of the ISM tracer affect the determination of M_dyn, it is important to quantify the relative distributions of HI, CO and [CII]. HI and CO are well characterised observationally; for [CII], however, only very few measurements exist. Here we compare observations of CO, HI, and [CII] emission of a sample of nearby galaxies, drawn from the HERACLES, THINGS and KINGFISH surveys. We find that within R_25, the average [CII] exponential radial profile is slightly shallower than that of the CO, but much steeper than the HI distribution. This is also reflected in the integrated spectrum ("global profile"), where the [CII] spectrum looks more like that of the CO tha...

  5. Human apolipoprotein C-II: complete nucleic acid sequence of preapolipoprotein C-II.

    OpenAIRE

    Fojo, S S; Law, S W; Brewer, H B

    1984-01-01

    Apolipoprotein (apo) C-II is a cofactor for lipoprotein lipase, the enzyme that catalyzes the hydrolysis of triglycerides on plasma triglyceride-rich lipoproteins. The complete coding sequence of apoC-II mRNA has been determined from an apoC-II clone isolated from a human liver cDNA library. A 17-base-long synthetic oligonucleotide based on amino acid residues 5-10 of apoC-II was utilized as a hybridization probe to select recombinant plasmids containing the apoC-II sequence. Two thousand fou...

  6. [CII] dynamics in the S140 region

    International Nuclear Information System (INIS)

    We report the observation of [C II] emission in a cut through the S140 region together with single pointing observations of several molecular tracers, including hydrides, in key regions of the photon-dominated region (PDR) and molecular cloud [1]. At a distance of 910 pc, a B0V star ionizes the edge of the molecular cloud L1204, creating S140. In addition, the dense molecular cloud hosts a cluster of embedded massive young stellar objects only 75" from the H II region [e.g. 2, 3]. We used HIFI on Herschel to observe [CII] in a strip following the direction of the impinging radiation across the ionisation front and through the cluster of embedded YSOs. With [C II], we can trace the ionising radiation and, together with the molecular tracers such as CO isotopologues and HCO+, study the dynamical processes in the region. Combining HIFI's high spectral resolution data with ground-based molecular data allows us to study the dynamics and excitation conditions both in the ionization front and the dense molecular star forming region and model their physical conditions [4].

  7. [CII] dynamics in the S140 region

    Energy Technology Data Exchange (ETDEWEB)

    Dedes, C. [ETH Zurich, Institute for Astronomy, Zurich (Switzerland)]; Röllig, M.; Okada, Y.; Ossenkopf, V. [1. Physikalisches Institut, Universität Köln (Germany)]; Mookerjea, B. [Tata Institute of Fundamental Research, Mumbai (India)]; Collaboration: WADI Team

    2015-01-22

    We report the observation of [C II] emission in a cut through the S140 region together with single pointing observations of several molecular tracers, including hydrides, in key regions of the photon-dominated region (PDR) and molecular cloud [1]. At a distance of 910 pc, a B0V star ionizes the edge of the molecular cloud L1204, creating S140. In addition, the dense molecular cloud hosts a cluster of embedded massive young stellar objects only 75" from the H II region [e.g. 2, 3]. We used HIFI on Herschel to observe [CII] in a strip following the direction of the impinging radiation across the ionisation front and through the cluster of embedded YSOs. With [C II], we can trace the ionising radiation and, together with the molecular tracers such as CO isotopologues and HCO+, study the dynamical processes in the region. Combining HIFI's high spectral resolution data with ground-based molecular data allows us to study the dynamics and excitation conditions both in the ionization front and the dense molecular star forming region and model their physical conditions [4].

  8. [CII] dynamics in the S140 region

    Science.gov (United States)

    Dedes, C.; Röllig, M.; Mookerjea, B.; Okada, Y.; Ossenkopf, V.; WADI Team

    2015-01-01

    We report the observation of [C II] emission in a cut through the S140 region together with single pointing observations of several molecular tracers, including hydrides, in key regions of the photon-dominated region (PDR) and molecular cloud [1]. At a distance of 910 pc, a B0V star ionizes the edge of the molecular cloud L1204, creating S140. In addition, the dense molecular cloud hosts a cluster of embedded massive young stellar objects only 75" from the H II region [e.g. 2, 3]. We used HIFI on Herschel to observe [CII] in a strip following the direction of the impinging radiation across the ionisation front and through the cluster of embedded YSOs. With [C II], we can trace the ionising radiation and, together with the molecular tracers such as CO isotopologues and HCO+, study the dynamical processes in the region. Combining HIFI's high spectral resolution data with ground-based molecular data allows us to study the dynamics and excitation conditions both in the ionization front and the dense molecular star forming region and model their physical conditions [4].

  9. [CII] synthetic emission maps of simulated galactic disks

    Science.gov (United States)

    Franeck, A.; Walch, S.; Glover, S. C. O.; Seifried, D.; Girichidis, P.; Naab, T.; Klessen, R.; Peters, T.; Wünsch, R.; Gatto, A.; Clark, P. C.

    2016-05-01

    We carry out radiative transfer simulations for the [CII] emission of simulated galactic disks from the SILCC project. Here we present the integrated [CII] intensity map of a typical simulation which assumes solar neighbourhood conditions with ΣGAS = 10 M⊙/pc2 and a supernova rate of 15 SN/Myr with randomly distributed supernovae (SNe) at t = 50 Myr. We analyse the intensity profile, which reveals different components. These are clearly distinguishable and trace cold, molecular as well as warm, outflowing gas. We find that [CII] is a promising tool to analyse the multi-phase structure of the ISM. (SILCC: SImulating the LifeCycle of molecular Clouds; hera.ph1.uni-koeln.de/~silcc/)

  10. Recombination Lines of CII in the Spectra of Planetary Nebulae

    OpenAIRE

    Sochi, Taha

    2010-01-01

    The current report presents the work to investigate the recombination lines of CII in the spectra of planetary nebulae. Two CIII targets were prepared and used to generate theoretical data required in the investigation of recombination lines that arise from collisions between electrons and ions in thin plasma found in planetary nebulae and other astrophysical objects. One of these targets contains 9 atomic terms while the other contains 26 terms. For each one of these targets, theoretical dat...

  11. [CII] line emission in massive star-forming galaxies at z=4.7

    CERN Document Server

    Wagg, J; Carilli, C L; Espada, D; Peck, A; Riechers, D; Walter, F; Wootten, A; Aravena, M; Barkats, D; Cortes, J R; Hills, R; Hodge, J; Impellizzeri, C M V; Iono, D; Leroy, A; Martin, S; Rawlings, M G; Maiolino, R; McMahon, R G; Scott, K S; Villard, E; Vlahakis, C

    2012-01-01

    We present Atacama Large Millimeter/submillimeter Array (ALMA) observations of the [CII] 157.7 micron fine structure line and thermal dust continuum emission from a pair of gas-rich galaxies at z=4.7, BR1202-0725. This system consists of a luminous quasar host galaxy and a bright submm galaxy (SMG), while a fainter star-forming galaxy is also spatially coincident within a 4'' (25 kpc) region. All three galaxies are detected in the submm continuum, indicating FIR luminosities in excess of 10^13 Lsun for the two most luminous objects. The SMG and the quasar host galaxy are both detected in [CII] line emission with luminosities, L([CII]) = (10.0 +/- 1.5)x10^9 Lsun and L([CII]) = (6.5+/-1.0)x10^9 Lsun, respectively. We estimate a luminosity ratio, L([CII])/L(FIR) = (8.3+/-1.2)x10^-4 for the starburst SMG to the North, and L([CII])/L(FIR) = (2.5+/-0.4)x10^-4 for the quasar host galaxy, in agreement with previous high-redshift studies that suggest lower [CII]-to-FIR luminosity ratios in quasars than in starburst gal...
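
    The quoted ratios carry propagated uncertainties: for independent quantities, fractional errors add in quadrature. A quick check of the SMG value follows; the [CII] numbers are from the abstract, while the FIR luminosity and its error bar are assumptions chosen only to illustrate the propagation.

```python
import math

# SMG [CII] luminosity from the abstract; FIR value/error are assumed.
L_cii, dL_cii = 10.0e9, 1.5e9    # Lsun
L_fir, dL_fir = 1.2e13, 0.1e13   # Lsun (hypothetical uncertainty)

ratio = L_cii / L_fir
# Independent errors: fractional uncertainties add in quadrature.
dratio = ratio * math.hypot(dL_cii / L_cii, dL_fir / L_fir)
print(f"L([CII])/L(FIR) = ({ratio * 1e4:.1f} +/- {dratio * 1e4:.1f}) x 10^-4")
```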

  12. Recombination Lines of CII in the Spectra of Planetary Nebulae

    CERN Document Server

    Sochi, Taha

    2010-01-01

    The current report presents the work carried out by the author to investigate the recombination lines of CII in the spectra of planetary nebulae. Two CIII targets were prepared and used to generate theoretical data required in the investigation of recombination lines that arise from collisions between electrons and ions in thin plasma found in planetary nebulae and other astrophysical objects. One of these targets contains 9 atomic terms while the other contains 26 terms. For each one of these targets, theoretical data concerning bound and autoionizing states were generated in the intermediate coupling approximation by R-matrix and Autostructure codes and compared to experimental data. The comparison revealed very good agreement. These theoretical data were then used to generate emissivity data and compare it to the carbon recombination lines found in the observational line list of Zhang et al [2005] on the planetary nebula NGC 7027. The main tool used in this analysis is the `Emissivity' code which is a prog...

  13. No severe bottleneck during human evolution: evidence from two apolipoprotein C-II deficiency alleles.

    OpenAIRE

    Xiong, W J; Li, W. H.; Posner, I; Yamamura, T.; Yamamoto, A.; Gotto, A M; Chan, L

    1991-01-01

    The DNA sequences of a Japanese and a Venezuelan apolipoprotein (apo) C-II deficiency allele, of a normal Japanese apo C-II gene, and of a chimpanzee apo C-II gene were amplified by PCR, and their nucleotide sequences were determined on multiple clones of the PCR products. The normal Japanese sequence is identical to--and the chimpanzee sequence differs by only three nucleotides from--a previously published normal Caucasian sequence. In contrast, the two human mutant sequences each differ fro...

  14. Balloon observations of interstellar CII (158 microns) and OI (63 microns) forbidden lines

    Science.gov (United States)

    Shibai, H.; Okuda, H.; Nakagawa, T.; Maihara, T.; Mizutani, K.; Matsuhara, H.; Kobayashi, Y.; Hiromoto, N.; Low, F. J.; Nishimura, T.

    1993-01-01

    Interstellar CII and OI forbidden lines were observed by the Balloon-Borne Infrared Telescope (BIRT) with a Fabry-Perot spectrometer. Two balloon flights were successfully made. With a method of 'frequency switching', diffuse CII forbidden-line emission was efficiently detected and mapped in extended regions around HII/molecular cloud complexes and in a wide area of the Galactic plane. It has been shown that the CII forbidden-line emission is very strong and ubiquitously distributed in interstellar space in the Galaxy.

  15. Internal structure of spiral arms traced with [CII]: Unraveling the WIM, HI, and molecular emission lanes

    CERN Document Server

    Velusamy, T; Goldsmith, P F; Pineda, J L

    2015-01-01

    The spiral arm tangencies are ideal lines of sight in which to determine the distribution of interstellar gas components in the spiral arms and study the influence of spiral density waves on the interarm gas in the Milky Way. We present a large scale (~15deg) position-velocity map of the Galactic plane in [CII] from l = 326.6 to 341.4deg observed with Herschel HIFI. We use [CII] l-v maps along with those for HI and 12CO to derive the average spectral line intensity profiles over the longitudinal range of each tangency. Using the VLSR of the emission features, we locate the [CII], HI, and 12CO emissions along a cross cut of the spiral arm. In the spectral line profiles at the tangencies [CII] has two emission peaks, one associated with the compressed WIM and the other with the molecular gas PDRs. When represented as a cut across the inner to outer edge of the spiral arm, the [CII]-WIM peak appears closest to the inner edge while 12CO and [CII] associated with molecular gas are at the outermost edge. HI has broader ...

  16. GREAT [CII] and CO observations of the BD+40°4124 region

    CERN Document Server

    Sandell, Göran; Requena-Torres, Miguel Angel; Heyminck, Stefan; Güsten, Rolf; Stutzki, Jürgen; Simon, Robert; Graf, Urs U (DOI: 10.1051/0004-6361/201218920)

    2012-01-01

    The BD+40°4124 region was observed with high angular and spectral resolution with the German heterodyne instrument GREAT in CO J = 13→12 and [CII] on SOFIA. These observations show that the [CII] emission is very strong in the reflection nebula surrounding the young Herbig Ae/Be star BD+40°4124. A strip map over the nebula shows that the [CII] emission approximately coincides with the optical nebulosity. The strongest [CII] emission is centered on the B2 star and a deep spectrum shows that it has faint wings, which suggests that the ionized gas is expanding. We also see faint CO J = 13→12 at the position of BD+40°4124, which suggests that the star may still be surrounded by an accretion disk. We also detected [CII] emission and strong CO J = 13→12 toward V1318 Cyg. Here the [CII] emission is fainter than in BD+40°4124 and appears to come from the outflow, since it shows red and blue wings with very little emission at the systemic velocity, where the C...

  17. Rapid radioimmunoassay of human apolipoproteins C-II and C-III

    Energy Technology Data Exchange (ETDEWEB)

    Gustafson, S.; Oestlund-Lindqvist, A.M.; Vessby, B. (Uppsala Univ. (Sweden))

    1984-06-01

    Apolipoprotein (apo) C-II is an activator of lipoprotein lipase, while apo C-III has the ability to inhibit apo C-II activated lipolysis. In order to study further the relationship between lipoprotein lipase mediated hydrolysis and the serum concentrations of apo C-II and apo C-III, radioimmunoassays for these apolipoproteins have been developed. Formalin-treated Staphylococcus aureus Cowan I was used for immunoprecipitation and was shown to give rapid uptake of immune complexes that could easily be harvested by centrifugation. The assays were shown to be sensitive (10 µg/l), specific, precise (inter- and intra-assay coefficients of variation below 10%), rapid (completed in less than 6 h) and simple to perform. Delipidation of serum and lipoproteins had no effect on the results, indicating that the immunologically active sites of apo C-II and apo C-III are exposed to the aqueous environment under assay conditions. Serum apo C-II and apo C-III levels of normolipidaemic subjects were approximately 25 mg/l and 110 mg/l, respectively. Highly significant positive correlations were found between VLDL apo C-II and VLDL apo C-III, respectively, and VLDL triglycerides, VLDL cholesterol and total serum TG. There was also a highly significant correlation between the HDL cholesterol concentration and the HDL apo C-III concentration.

  18. A physical model for the [CII]-FIR deficit in luminous galaxies

    CERN Document Server

    Narayanan, Desika

    2016-01-01

    Observations of ionised carbon at 158 micron ([CII]) from luminous star-forming galaxies at z~0 show that their ratios of [CII] to far infrared (FIR) luminosity are systematically lower than those of more modestly star-forming galaxies. In this paper, we provide a theory for the origin of this so-called "[CII] deficit" in galaxies. Our model treats the interstellar medium as a collection of clouds with radially-stratified chemical and thermal properties, which are dictated by the clouds' volume and surface densities, as well as the interstellar radiation and cosmic ray fields to which they are exposed. [CII] emission arises from the outer, HI dominated layers of clouds, and from regions where the hydrogen is H2 but the carbon is predominantly C+. In contrast, the most shielded regions of clouds are dominated by CO and produce little [CII] emission. This provides a natural mechanism to explain the observed [CII]-star formation relation: galaxies' star formation rates are largely driven by the surface densities...

  19. [CII] 158 $\mu$m Emission as a Star Formation Tracer

    CERN Document Server

    Herrera-Camus, R; Wolfire, M G; Smith, J D; Croxall, K V; Kennicutt, R C; Calzetti, D; Helou, G; Walter, F; Leroy, A K; Draine, B; Brandl, B R; Armus, L; Sandstrom, K M; Dale, D A; Aniano, G; Meidt, S E; Boquien, M; Hunt, L K; Galametz, M; Tabatabaei, F S; Murphy, E J; Appleton, P; Roussel, H; Engelbracht, C; Beirao, P

    2014-01-01

    The [CII] 157.74 $\mu$m transition is the dominant coolant of the neutral interstellar gas, and has great potential as a star formation rate (SFR) tracer. Using the Herschel KINGFISH sample of 46 nearby galaxies, we investigate the relation of [CII] surface brightness and luminosity with SFR. We conclude that [CII] can be used for measurements of SFR on both global and kiloparsec scales in normal star-forming galaxies in the absence of strong active galactic nuclei (AGN). The uncertainty of the $\Sigma_{\rm [CII]}-\Sigma_{\rm SFR}$ calibration is $\pm$0.21 dex. The main source of scatter in the correlation is associated with regions that exhibit warm IR colors, and we provide an adjustment based on IR color that reduces the scatter. We show that the color-adjusted $\Sigma_{\rm [CII]}-\Sigma_{\rm SFR}$ correlation is valid over almost 5 orders of magnitude in $\Sigma_{\rm SFR}$, holding for both normal star-forming galaxies and non-AGN luminous infrared galaxies. Using [CII] luminosity instead of surface bright...

  20. C[II] 158 micrometre brightness as a function of galaxy activity

    CERN Document Server

    Curran, S J

    2008-01-01

    We investigate the possibility that the known decrease in the relative luminosity of the 158 micrometre C[II] line with the far-infrared luminosity in extragalactic sources is due to evolutionary effects: because of the flux-limited nature of the surveys, large luminosities are indicative of large distances, and we do find significant increases in both L_[CII] and L_FIR with redshift. However, the fact that the C[II] luminosity does not climb as steeply with look-back time as that of the far-infrared makes the decline in the L_[CII]/L_FIR ratio with cosmic age significant at >3 sigma, which may in turn be responsible for the decline in L_[CII]/L_FIR with L_FIR. Investigating this further, we find that the [CII] luminosity exhibits similar drops as measured against the carbon monoxide and radio continuum luminosities. The former indicates that at higher redshifts a larger fraction of the carbon is locked up in the form of molecules, rather than ionised gas. The latter hints at increased activity in these galax...

  1. A Sample of [CII] Clouds Tracing Dense Clouds in Weak FUV Fields observed by Herschel

    CERN Document Server

    Pineda, Jorge L; Langer, William D; Goldsmith, Paul F; Li, Di; Yorke, Harold W (Jet Propulsion Laboratory)

    2010-01-01

    The [CII] fine-structure line at 158um is an excellent tracer of the warm diffuse gas in the ISM and the interfaces between molecular clouds and their surrounding atomic and ionized envelopes. Here we present the initial results from Galactic Observations of Terahertz C+ (GOT C+), a Herschel Key Project devoted to studying the [CII] fine structure emission in the Galactic plane using the HIFI instrument. We use the [CII] emission together with observations of CO as a probe to understand the effects of newly formed stars on their interstellar environment and characterize the physical and chemical state of the star-forming gas. We collected data along 16 lines of sight passing near star forming regions in the inner Galaxy near longitudes 330 degrees and 20 degrees. We identify fifty-eight [CII] components that are associated with high column density molecular clouds as traced by 13CO emission. We combine [CII], 12CO, and 13CO observations to derive the physical conditions of the [CII]-emitting regions in our ...

  2. Radiative Transfer meets Bayesian statistics: where does your Galaxy's [CII] come from?

    CERN Document Server

    Accurso, Gioacchino; Bisbas, Thomas G; Viti, Serena

    2016-01-01

    The [CII] 158$\mu$m emission line can arise in all phases of the ISM, therefore being able to disentangle the different contributions is an important yet unresolved problem when undertaking galaxy-wide, integrated [CII] observations. We present a new multi-phase 3D radiative transfer interface that couples Starburst99, a stellar spectrophotometric code, with the photoionisation and astrochemistry codes Mocassin and 3D-PDR. We model entire star forming regions, including the ionised, atomic and molecular phases of the ISM, and apply a Bayesian inference methodology to parametrise how the fraction of the [CII] emission originating from molecular regions, $f_{[CII],mol}$, varies as a function of typical integrated properties of galaxies in the local Universe. The main parameters responsible for the variations of $f_{[CII],mol}$ are specific star formation rate (sSFR), gas phase metallicity, HII region electron number density ($n_e$), and dust mass fraction. For example, $f_{[CII],mol}$ can increase from 60% to 8...

  3. SPAIDE: A Real-time Research Platform for the Clarion CII/90K Cochlear Implant

    Directory of Open Access Journals (Sweden)

    Dykmans P

    2005-01-01

    SPAIDE (sound-processing algorithm integrated development environment) is a real-time platform of Advanced Bionics Corporation (Sylmar, Calif, USA) to facilitate advanced research on sound-processing and electrical-stimulation strategies with the Clarion CII and 90K implants. The platform is meant for testing in the laboratory. SPAIDE is conceptually based on a clear separation of the sound-processing and stimulation strategies and, in particular, on the distinction between sound-processing and stimulation channels and electrode contacts. The development environment has a user-friendly interface to specify sound-processing and stimulation strategies, and includes the possibility to simulate the electrical stimulation. SPAIDE allows for real-time sound capturing from file or audio input on PC, sound processing and application of the stimulation strategy, and streaming the results to the implant. The platform is able to cover a broad range of research applications: from noise reduction and mimicking of normal hearing, through complex (simultaneous) stimulation strategies, to psychophysics. The hardware setup consists of a personal computer, an interface board, and a speech processor. The software is both expandable and to a great extent reusable in other applications.
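
    The separation SPAIDE enforces between sound-processing channels, stimulation channels, and electrode contacts can be sketched in code. The following is a conceptual illustration only, not the SPAIDE API: every class name, frequency band, and current limit is invented.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class ProcessingChannel:
    band_hz: tuple                           # analysis band of this channel
    extract: Callable[[List[float]], float]  # envelope/feature extractor

@dataclass
class StimulationChannel:
    electrode: int                           # physical contact on the array
    max_ua: float                            # compliance-limited current (uA)

@dataclass
class Strategy:
    processing: List[ProcessingChannel]
    stimulation: List[StimulationChannel]
    mapping: Dict[int, int]                  # processing idx -> stimulation idx

    def frame(self, audio_block):
        """One processing frame: audio features in, current levels out."""
        pulses = {}
        for p_idx, s_idx in self.mapping.items():
            level = self.processing[p_idx].extract(audio_block)
            chan = self.stimulation[s_idx]
            pulses[chan.electrode] = min(level * chan.max_ua, chan.max_ua)
        return pulses

rms = lambda block: (sum(x * x for x in block) / len(block)) ** 0.5
strategy = Strategy(
    processing=[ProcessingChannel((250, 500), rms),
                ProcessingChannel((500, 1000), rms)],
    stimulation=[StimulationChannel(electrode=1, max_ua=300.0),
                 StimulationChannel(electrode=5, max_ua=300.0)],
    mapping={0: 0, 1: 1},
)
print(strategy.frame([0.1, -0.2, 0.05, 0.3]))  # electrode -> current level
```

    Keeping the mapping explicit is what lets one vary the stimulation side (e.g. electrode assignment) without touching the sound processing, which is the separation the abstract emphasizes.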

  4. PROTECTING CRITICAL DATABASES – TOWARDS A RISK-BASED ASSESSMENT OF CRITICAL INFORMATION INFRASTRUCTURES (CIIs) IN SOUTH AFRICA

    Directory of Open Access Journals (Sweden)

    Mzukisi N Njotini

    2013-04-01

    South Africa has made great strides towards protecting critical information infrastructures (CIIs). For example, South Africa recognises the significance of safeguarding places or areas that are essential to the national security of South Africa or the economic and social well-being of South African citizens. For this reason South Africa has established mechanisms to assist in preserving the integrity and security of CIIs. The measures provide inter alia for the identification of CIIs; the registration of the full names, address and contact details of the CII administrators (the persons who manage CIIs); the identification of the location(s) of CIIs or their component parts; and the outlining of the general descriptions of information or data stored in CIIs. It is argued that the measures to protect CIIs in South Africa are inadequate. In particular, the measures rely on a one-size-fits-all approach to identify and classify CIIs. For this reason the South African measures are likely to lead to the adoption of a paradigm that considers every infrastructure, data or database, regardless of its significance or importance, to be key or critical.

  5. A Herschel [CII] Galactic plane survey I: the global distribution of ISM gas components

    CERN Document Server

    Pineda, Jorge L; Velusamy, Thangasamy; Goldsmith, Paul F

    2013-01-01

    [Abridged] The [CII] 158um line is an important tool for understanding the life cycle of interstellar matter. Ionized carbon is present in a variety of phases of the interstellar medium, including the diffuse ionized medium, warm and cold atomic clouds, clouds in transition from atomic to molecular, and dense and warm photon dominated regions (PDRs). The Galactic Observations of Terahertz C+ (GOTC+) project surveys the [CII] line over the entire Galactic disk with velocity-resolved observations using the Herschel/HIFI instrument. We present the first longitude-velocity maps of the [CII] emission for Galactic latitudes b=0deg, +-0.5deg, and +-1.0deg. [CII] emission is mostly associated with spiral arms, mainly emerging from Galactocentric distances between 4 and 10 kpc. We estimate that most of the observed [CII] emission is produced by dense PDRs (47%), with smaller contributions from CO-dark H2 gas (28%), cold atomic gas (21%), and ionized gas (4%). Atomic gas inside the Solar radius is mostly in the form of...

  6. The ALMA Patchy Deep Survey: A blind search for [CII] emitters at z~4.5

    CERN Document Server

    Matsuda, Y; Iono, D; Hatsukade, B; Kohno, K; Tamura, Y; Yamaguchi, Y; Shimizu, I

    2015-01-01

    We present a result of a blind search for [CII] 158 $\mu$m emitters at $z\sim 4.5$ using ALMA Cycle 0 archival data. We collected extra-galactic data covering 330-360 GHz (band 7) from 8 Cycle 0 projects from which initial results have already been published. The total number of fields is 243 and the total on-source exposure time is 19.2 hours. We searched for line emitters in continuum-subtracted data cubes with spectral resolutions of 50, 100, 300 and 500 km/s. We could not detect any new line emitters above a 6-$\sigma$ significance level. This result provides upper limits to the [CII] luminosity function at $z\sim 4.5$ over $L_{\rm [CII]} \sim 10^8 - 10^{10} L_{\odot}$ or star formation rate, SFR $\sim$ 10-1000 M$_\odot$/yr. These limits are at least 2 orders of magnitude larger than the [CII] luminosity functions expected from the $z \sim 4$ UV luminosity function or from numerical simulation. However, this study demonstrates that we would be able to better constrain the [CII] luminosity function a...
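
    The search procedure, smoothing continuum-subtracted spectra to several velocity resolutions and thresholding at 6 sigma, can be sketched per sightline as follows. The channel width, noise level, and injected line are synthetic; the 50-500 km/s kernels and the 6-sigma cut follow the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)
dv = 25.0                                  # km/s per channel (assumed)
spec = rng.normal(0.0, 1.0, 512)           # noise-only spectrum (unit rms)
spec[200:208] += 3.0                       # inject one faint broad line

for width_kms in (50, 100, 300, 500):
    n = max(1, round(width_kms / dv))      # channels per smoothing kernel
    sm = np.convolve(spec, np.ones(n) / n, mode="same")
    sigma = 1.0 / np.sqrt(n)               # boxcar mean: noise drops as 1/sqrt(n)
    peaks = np.flatnonzero(sm > 6 * sigma)
    print(f"{width_kms:3d} km/s resolution: {len(peaks)} channels above 6 sigma")
```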

  7. The [CII] Deficit in LIRGs and ULIRGs is Due to High-Temperature Saturation

    CERN Document Server

    Muñoz, Joseph A

    2015-01-01

    Current predictions for the line ratios from photo-dissociative regions (PDRs) in galaxies adopt theoretical models that consider only individual parcels of PDR gas each characterized by the local density and far-UV radiation field. However, these quantities are not measured directly from unresolved galaxies, making the connection between theory and observation ambiguous. We develop a model that uses galaxy-averaged, observable inputs to explain and predict measurements of the [CII] fine structure line in luminous and ultra-luminous infrared galaxies. We find that the [CII] deficit observed in the highest IR surface-brightness systems is a natural consequence of saturating the upper fine-structure transition state at gas temperatures above 91 K. To reproduce the measured amplitude of the [CII]/FIR ratio in deficit galaxies, we require that [CII] trace approximately 10-17% of all gas in these systems, roughly independent of IR surface brightness and consistent with observed [CII] to CO(1--0) line ratios. Calcu...

  8. Collisional Excitation of the [CII] Fine Structure Transition in Interstellar Clouds

    CERN Document Server

    Goldsmith, Paul F; Pineda, Jorge L; Velusamy, T

    2012-01-01

    We analyze the collisional excitation of the 158 micron (1900.5 GHz) fine structure transition of ionized carbon (C+) in terms of line intensities produced by simple cloud models. The single C+ fine structure transition is a very important coolant of the atomic interstellar medium and of photon dominated regions in which carbon is partially or completely in ionized form. The [CII] line is widely used as a tracer of star formation in the Milky Way and other galaxies. Excitation of the [CII] fine structure transition can be via collisions with hydrogen molecules, atoms, and electrons. Velocity-resolved observations of [CII] have become possible with the HIFI instrument on Herschel and the GREAT instrument on SOFIA. Analysis of these observations is complicated by the fact that it is difficult to determine the optical depth of the [CII] line due to the relative weakness and blending of the components of the analogous transition of 13C+. We discuss the excitation and radiative transfer of the [CII] line, deriv...
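
    The heart of the excitation problem fits in a two-level sketch: collisional de-excitation at rate C_ul = R_ul n competes with the spontaneous rate A_ul, which defines a critical density n_crit = A_ul/R_ul. The A-coefficient and level data below are standard; the H2 de-excitation rate coefficient is a rough representative value, not the paper's temperature-dependent fit.

```python
import math

A_UL = 2.3e-6          # s^-1, [CII] 2P3/2 -> 2P1/2 spontaneous emission
T_STAR = 91.25         # K, transition energy / k_B
G_U, G_L = 4, 2        # statistical weights
R_UL_H2 = 3.8e-10      # cm^3 s^-1, approximate de-excitation rate with H2

def upper_fraction(n_h2, t_kin):
    """Fraction of C+ in the upper level (optically thin, no background)."""
    c_ul = R_UL_H2 * n_h2                                  # collisional de-excitation
    c_lu = c_ul * (G_U / G_L) * math.exp(-T_STAR / t_kin)  # detailed balance
    return c_lu / (c_lu + c_ul + A_UL)

print(f"n_crit ~ {A_UL / R_UL_H2:.1e} cm^-3 for H2 collisions")
for n in (1e2, 1e3, 1e4, 1e5):
    print(f"n(H2) = {n:8.0e} cm^-3 -> f_upper = {upper_fraction(n, 100.0):.3f}")
```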

  9. [CII] emission in z ~ 6 strongly lensed, star-forming galaxies

    CERN Document Server

    Knudsen, Kirsten K; Kneib, Jean-Paul; Jauzac, Mathilde; Clement, Benjamin; Drouart, Guillaume; Egami, Eiichi; Lindroos, Lukas

    2016-01-01

    The far-infrared fine-structure line [CII] at 1900.5GHz is known to be one of the brightest cooling lines in local galaxies, and therefore it has been suggested to be an efficient tracer for star-formation in very high-redshift galaxies. However, recent results for galaxies at $z>6$ have yielded numerous non-detections in star-forming galaxies, except for quasars and submillimeter galaxies. We report the results of ALMA observations of two lensed, star-forming galaxies at $z = 6.029$ and $z=6.703$. The galaxy A383-5.1 (star formation rate [SFR] of 3.2 M$_\odot$yr$^{-1}$ and magnification of $\mu = 11.4$) shows a line detection with $L_{\rm [CII]} = 8.3\times10^{6}$ L$_\odot$, making it the so far lowest $L_{\rm [CII]}$ ever detected at $z>6$. For MS0451-H (SFR = 0.4 M$_\odot$yr$^{-1}$ and $\mu = 100\pm20$) we provide an upper limit on $L_{\rm [CII]}$; both results lie below the local SFR-$L_{\rm [CII]}$ relation, consistent with the deficits reported for other star-forming galaxies at $z>6$, though other effects could also play a role in terms of decreasing $L_{\rm [CII]}$. The detection of A383-5.1 is encouraging and suggests that detections are...

  10. Tracing the reionization epoch with ALMA: [CII] emission in z~7 galaxies

    CERN Document Server

    Pentericci, L; Castellano, M; Fontana, A; Maiolino, R; Guaita, L; Vanzella, E; Grazian, A; Santini, P; Yan, H; Cristiani, S; Conselice, C; Giavalisco, M; Hathi, N; Koekemoer, A

    2016-01-01

    We present new results on [CII] 158$\mu$m emission from four galaxies in the reionization epoch. These galaxies were previously confirmed to be at redshifts between 6.6 and 7.15 from the presence of the Ly$\alpha$ emission line in their spectra. The Ly$\alpha$ emission line is redshifted by 100-200 km/s compared to the systemic redshift given by the [CII] line. These velocity offsets are smaller than what is observed in z~3 Lyman break galaxies with similar UV luminosities and emission line properties. Smaller velocity shifts reduce the visibility of Ly$\alpha$ and hence somewhat alleviate the need for a very neutral IGM at z~7 to explain the drop in the fraction of Ly$\alpha$ emitters observed at this epoch. The galaxies show [CII] emission with L[CII] = 0.6-1.6 x 10$^8 L_\odot$: these luminosities place them consistently below the SFR-L[CII] relation observed for low redshift star forming and metal poor galaxies and also below z=5.5 Lyman break galaxies with similar star formation rates. We argue that previou...

  11. Selected issues of the universal communication environment implementation for CII standard

    Science.gov (United States)

    Zagoździńska, Agnieszka; Poźniak, Krzysztof T.; Drabik, Paweł K.

    2011-10-01

    In the contemporary FPGA market there is a wide assortment of structures, integrated development environments, and boards from different producers. This variety allows designers to match resources to the requirements of an individual project. There is a need to standardize projects so that they can be used in research laboratories equipped with tools from different producers. The proposed solution is CII standardization of VHDL components. This paper specifies a universal communication environment for the CII standard. The link can be used in different FPGA structures, and implementing it enables object-oriented VHDL programming with the use of CII standardization. The whole environment comprises an FPGA environment and PC software. The paper describes selected issues of the FPGA environment, including specific solutions that enable the environment to be used in structures from different producers. The flexibility of transmitting data of different sizes with the use of CII is presented. The specified tool makes it possible to exploit the full variety of FPGA structures and to design faster and more effectively.

  12. The scale height of gas traced by [CII] in the Galactic plane

    CERN Document Server

    Langer, W D; Velusamy, T

    2014-01-01

    The distribution of various interstellar gas components and the pressure in the interstellar medium (ISM) is a result of the interplay of different dynamical mechanisms and energy sources on the gas in the Milky Way. The scale heights of the different gas tracers, such as HI and CO, are a measure of these processes. The scale height of [CII] emission in the Galactic plane is important for understanding those ISM components not traced by CO or HI. We determine the average distribution of [CII] perpendicular to the plane in the inner Galactic disk and compare it to the distributions of other key gas tracers, such as CO and HI. We calculated the vertical, z, distribution of [CII] in the inner Galactic disk by adopting a model for the emission that combines the latitudinal, b, spectrally unresolved BICE survey with the spectrally resolved Herschel Galactic plane survey of [CII] at b = 0 deg. Our model assumed a Gaussian emissivity distribution vertical to the plane, and related the distribution in z to that of...
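
    The geometry behind such a model is compact: a cloud at kinematic distance d and latitude b sits at height z = d sin(b) above the plane, and a Gaussian emissivity in z is then fit to the recovered distribution. A synthetic-data sketch (all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(0)
true_sigma_z = 73.0                        # pc, assumed true Gaussian scale
z = rng.normal(0.0, true_sigma_z, 5000)    # heights of emitting clouds
d = rng.uniform(2000.0, 8000.0, 5000)      # pc, kinematic distances
b = np.degrees(np.arcsin(z / d))           # latitudes an observer would see

# Recover z from (d, b) as a survey analysis would, then estimate the
# Gaussian scale height from the second moment of the distribution.
z_rec = d * np.sin(np.radians(b))
print(f"recovered scale height: {z_rec.std():.1f} pc (input {true_sigma_z} pc)")
```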

  13. The Apolipoprotein E/CI/CII Gene Cluster and Late-Onset Alzheimer Disease

    OpenAIRE

    Yu, Chang-En; Payami, Haydeh; Olson, Jane M.; Boehnke, Michael; Wijsman, Ellen M; Orr, Harry T.; Kukull, Walter A.; Goddard, Katrina A B; Nemens, Ellen; White, June A.; Alonso, M. Elisa; Taylor, Todd D.; Ball, Melvyn J.; Kaye, Jeffrey; Morris, John

    1994-01-01

    The chromosome 19 apolipoprotein E/CI/CII gene cluster was examined for evidence of linkage to a familial Alzheimer disease (FAD) locus. The family groups studied were Volga German (VG), early-onset non-VG (ENVG; mean age at onset ...

  14. ALMA detection of [CII] 158 micron emission from a strongly lensed z=2 star-forming galaxy

    CERN Document Server

    Schaerer, D; Jones, T; Dessauges-Zavadsky, M; Sklias, P; Zamojski, M; Cava, A; Richard, J; Ellis, R; Rawle, T D; Egami, E; Combes, F

    2015-01-01

    Our objectives are to determine the properties of the interstellar medium (ISM) and of star-formation in typical star-forming galaxies at high redshift. Following up on our previous multi-wavelength observations with HST, Spitzer, Herschel, and the Plateau de Bure Interferometer (PdBI), we have studied a strongly lensed z=2.013 galaxy, the arc behind the galaxy cluster MACS J0451+0006, with ALMA to measure the [CII] 158 micron emission line, one of the main coolants of the ISM. [CII] emission from the southern part of this galaxy is detected at 10 $\sigma$. Taking into account strong gravitational lensing, which provides a magnification of $\mu=49$, the intrinsic lensing-corrected [CII] 158 micron luminosity is $L(CII)=1.2 \times 10^8 L_\odot$. The observed ratio of [CII]-to-IR emission, $L(CII)/L(FIR) \approx (1.2-2.4) \times 10^{-3}$, is found to be similar to that in nearby galaxies. The same also holds for the observed ratio $L(CII)/L(CO)=2.3 \times 10^3$, which is comparable to that of star-forming galaxi...

  15. Structure of the Glycosyltransferase EryCIII in Complex with its Activating P450 Homologue EryCII

    OpenAIRE

    Moncrieffe, Martin C.; Fernandez, Maria-Jose; Spiteller, Dieter; Matsumura, Hiroyoshi; Gay, Nicholas J.; Luisi, Ben F.; Leadlay, Peter F

    2012-01-01

    In the biosynthesis of the clinically important antibiotic erythromycin D, the glycosyltransferase (GT) EryCIII, in concert with its partner EryCII, attaches a nucleotide-activated sugar to the macrolide scaffold with high specificity. To understand the role of EryCII, we have determined the crystal structure of the EryCIII·EryCII complex at 3.1 Å resolution. The structure reveals a heterotetramer with a distinctive, elongated quaternary organization. The EryCIII subunits form an extensive se...

  16. In vitro regulation of phage lambda cII gene expression by Escherichia coli integration host factor.

    OpenAIRE

    Peacock, S.; Weissbach, H; Nash, H A

    1984-01-01

    The effect of Escherichia coli integration host factor (IHF) on phage lambda gene expression has been examined in a simplified DNA-directed in vitro system that measures the formation of the first dipeptide of the gene product. Plasmid pKC30cII, which contains the phage lambda genes N, cII and O, under control of the PL promoter, was used as template to study the expression of the first dipeptide of the gene products--i.e., fMet-Asp for N protein, fMet-Val for cII, and fMet-Thr for O. Purifie...

  17. GOT C+ Survey of [CII] 158 Micrometer Emission: Atomic to Molecular Cloud Transitions in the Inner Galaxy

    Science.gov (United States)

    Velusamy, T.; Langer, W. D.; Willacy, K.; Pineda, J. L.; Goldsmith, P. F.

    2012-01-01

    We present the results of the distribution of CO-dark H2 gas in a sample of 2200 interstellar clouds in the inner Galaxy (l = -90 deg to +57 deg) detected in the velocity-resolved [CII] spectra observed in the GOT C+ survey using the Herschel HIFI. We analyze the [CII] intensities along with the ancillary HI, (12)CO and (13)CO data for each cloud to determine their evolutionary state and to derive the H2 column densities in the C(+) and C(+)/CO transition layers in the cloud. We discuss the overall Galactic distribution of the [CII] clouds and their properties as a function of Galactic radius. The GOT C+ results on the global distribution of [CII] clouds and CO-dark H2 gas trace the FUV and star formation rates in the Galactic disk.

  18. The Faintness of the 158 um [CII] Transition in the z=6.42 Quasar SDSS J1148+5251

    OpenAIRE

    Bolatto, Alberto D.; Di Francesco, James; Willott, Chris J.

    2004-01-01

    We report the non-detection of the [CII] 157.74 um transition in the z=6.42 quasar SDSS J1148+5251 after 37.5 hours of integration with the James Clerk Maxwell Telescope. This transition is the main cooling line of the star-forming interstellar medium, and usually the brightest FIR line in galaxies. Our observed RMS of 1.3 mK in the Ta* scale translates to L([CII])<2.6 x 10^9 Lsun.

  19. GRB 980425 host: [CII], [OI] and CO lines reveal recent enhancement of star formation due to atomic gas inflow

    CERN Document Server

    Michałowski, Michał J; Wardlow, J L; Karska, A; Messias, H; van der Werf, P; Hunt, L K; Baes, M; Castro-Tirado, A J; Gentile, G; Hjorth, J; Le Floc'h, E; Perez Martinez, R; Nicuesa Guelbenzu, A; Rasmussen, J; Rizzo, J R; Rossi, A; Sanchez-Portal, M; Schady, P; Sollerman, J; Xu, D

    2016-01-01

    We have recently suggested that gas accretion can be studied using host galaxies of gamma-ray bursts (GRBs). We obtained the first ever far-infrared (FIR) line observations of a GRB host, namely Herschel/PACS resolved [CII] 158 um and [OI] 63 um spectroscopy, as well as APEX CO(2-1) and ALMA CO(1-0) observations of the GRB 980425 host. It has elevated [CII]/FIR and [OI]/FIR ratios and higher values of star formation rate (SFR) derived from line ([CII], [OI], Ha) than from continuum (UV, IR, radio) indicators. [CII] emission exhibits a normal morphology, peaking at the galaxy center, whereas [OI] is concentrated close to the GRB position and the nearby Wolf-Rayet region. The high [OI] flux indicates a high radiation field and gas density. The [CII]/CO luminosity ratio of the GRB 980425 host is close to the highest values found for local star-forming galaxies. Its CO-derived molecular gas mass is low given its SFR and metallicity, but the [CII]-derived molecular gas mass is close to the expected value. The [OI] a...

  20. Velocity resolved [CII] spectroscopy of the center and the BCLMP302 region of M33 (HerM33es)

    CERN Document Server

    Mookerjea, B; Kramer, C; Nikola, T; Braine, J; Ossenkopf, V; Roellig, M; Henkel, C; van der Werf, P; van der Tak, F; Wiedner, M C

    2015-01-01

    We aim to understand the contribution of the ionized, atomic and molecular phases of the ISM to the [CII] emission from clouds near the dynamical center and the BCLMP302 HII region in the north of the nearby galaxy M33 at a spatial resolution of 50pc. We combine high resolution [CII] spectra taken with the HIFI spectrometer onboard the Herschel satellite with [CII] Herschel-PACS maps and ground-based observations of CO(2-1) and HI. All data are at a common spatial resolution of 50pc. Typically, the [CII] lines have widths intermediate between the narrower CO(2-1) and broader HI line profiles. We decomposed the [CII] spectra in terms of contributions from molecular and atomic gas detected in CO(2-1) and HI, respectively. We find that the relative contribution of molecular and atomic gas traced by CO(2-1) and HI varies, depending mostly on the local physical conditions and geometry. We estimate that 11-60% and 5-34% of the [CII] intensities in the center and in BCLMP302, respectively, arise at velocities showing no...

  1. SPAIDE: A Real-time Research Platform for the Clarion CII/90K Cochlear Implant

    OpenAIRE

    Dykmans P; Vanpoucke F; Bracke P; Van Immerseel L; Peeters S

    2005-01-01

    SPAIDE (sound-processing algorithm integrated development environment) is a real-time platform of Advanced Bionics Corporation (Sylmar, Calif, USA) to facilitate advanced research on sound-processing and electrical-stimulation strategies with the Clarion CII and 90K implants. The platform is meant for testing in the laboratory. SPAIDE is conceptually based on a clear separation of the sound-processing and stimulation strategies and, in particular, on the distinction between sound-processing a...

  2. Stability of CII is a key element in the cold stress response of bacteriophage lambda infection.

    OpenAIRE

    Obuchowski, M; Shotland, Y; Koby, S; Giladi, H; Gabig, M; Wegrzyn, G; Oppenheim, A B

    1997-01-01

    Bacteria are known to adapt to environmental changes such as temperature fluctuations. It was found that temperature affects the lysis-lysogeny decision of lambda such that at body temperature (37 degrees C) the phage can select between the lytic and lysogenic pathways, while at ambient temperature (20 degrees C) the lytic pathway is blocked. This temperature-dependent discriminatory developmental pathway is governed mainly by the phage CII activity as a transcriptional activator. Mutations i...

  3. [CII] absorption and emission in the diffuse interstellar medium across the Galactic Plane

    CERN Document Server

    Gerin, M; Goicoechea, J R; Gusdorf, A; Godard, B; de Luca, M; Falgarone, E; Goldsmith, P F; Lis, D C; Menten, K M; Neufeld, D; Phillips, T G; Liszt, H

    2014-01-01

    Ionized carbon is the main gas-phase reservoir of carbon in the neutral diffuse interstellar medium and its 158 micron fine structure transition [CII] is the most important cooling line of the diffuse interstellar medium (ISM). We combine [CII] absorption and emission spectroscopy to gain an improved understanding of physical conditions in the different phases of the ISM. We present high resolution [CII] spectra obtained with the Herschel/HIFI instrument towards bright dust continuum sources in the Galactic plane, probing simultaneously the diffuse gas along the line of sight and the background high-mass star forming regions. These data are complemented by observations of the 492 and 809 GHz fine structure lines of atomic carbon and by medium spectral resolution spectral maps of the fine structure lines of atomic oxygen at 63 and 145 microns with Herschel/PACS. We show that the presence of foreground absorption may completely cancel the emission from the background source in medium spectral resolution...
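
    The cancellation effect noted at the end is easy to demonstrate numerically: a narrow foreground absorption dip superposed on comparable background emission nearly vanishes once convolved to medium spectral resolution. A toy sketch with invented profile parameters:

```python
import numpy as np

v = np.linspace(-50.0, 50.0, 1000)                         # km/s
emission = 5.0 * np.exp(-0.5 * (v / 3.0) ** 2)             # background line
absorption = -4.8 * np.exp(-0.5 * ((v - 1.0) / 4.0) ** 2)  # foreground dip
spec = emission + absorption

kernel = np.exp(-0.5 * (v / 30.0) ** 2)                    # coarse spectral response
kernel /= kernel.sum()
smoothed = np.convolve(spec, kernel, mode="same")
print(f"peak at high resolution: {spec.max():.2f}")
print(f"peak after smoothing:    {smoothed.max():.2f}")
```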

  4. Neon and [CII] 158 micron Emission Line Profiles in Dusty Starbursts and Active Galactic Nuclei

    CERN Document Server

    Samsonyan, Anahit; Lebouteiller, Vianney; Barry, Donald; Sargsyan, Lusine

    2016-01-01

    The sample of 379 extragalactic sources is presented that have mid-infrared, high resolution spectroscopy with the Spitzer Infrared Spectrograph (IRS) and also spectroscopy of the [CII] 158 um line with the Herschel Photodetector Array Camera and Spectrometer (PACS). The emission line profiles of [NeII] 12.81 um, [NeIII] 15.55 um, and [CII] 158 um are presented, and intrinsic line widths are determined (full width half maximum of Gaussian profiles after instrumental correction). All line profiles together with overlays comparing positions of PACS and IRS observations are made available in the Cornell Atlas of Spitzer IRS Sources (CASSIS). Sources are classified from AGN to starburst based on equivalent widths of the 6.2 um polycyclic aromatic hydrocarbon feature. It is found that intrinsic line widths do not change across classifications for [CII], with median widths of 207 km/s for AGN, 248 km/s for composites, and 233 km/s for starbursts. The [NeII] line widths also do not change with classificati...
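
    The instrumental correction mentioned here is the usual quadrature deconvolution for Gaussian profiles: FWHM_int = sqrt(FWHM_obs^2 - FWHM_inst^2). A minimal sketch (the input widths are illustrative, not values from the paper's tables):

```python
import math

def intrinsic_fwhm(observed_kms, instrumental_kms):
    """Gaussian widths add in quadrature, so they deconvolve the same way."""
    if observed_kms <= instrumental_kms:
        return 0.0                       # line is unresolved
    return math.sqrt(observed_kms ** 2 - instrumental_kms ** 2)

print(f"{intrinsic_fwhm(260.0, 100.0):.0f} km/s")  # -> 240 km/s
```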

  5. Globules and pillars seen in the [CII] 158 micron line with SOFIA

    CERN Document Server

    Schneider, N; Tremblin, P; Hennemann, M; Minier, V; Hill, T; Comerón, F; Requena-Torres, M A; Kraemer, K E; Simon, R; Röllig, M; Stutzki, J; Djupvik, A A; Zinnecker, H; Marston, A; Csengeri, T; Cormier, D; Lebouteiller, V; Audit, E; Motte, F; Bontemps, S; Sandell, G; Allen, L; Megeath, T; Gutermuth, R A

    2012-01-01

    Molecular globules and pillars are spectacular features, found only in the interface region between a molecular cloud and an HII-region. Impacting far-ultraviolet (FUV) radiation creates photon dominated regions (PDRs) on their surfaces that can be traced by typical cooling lines. With the GREAT receiver onboard SOFIA we mapped and spectrally resolved the [CII] 158 micron atomic fine-structure line and the highly excited 12CO J=11-10 molecular line from three objects in Cygnus X (a pillar, a globule, and a strong IRAS source). We focus here on the globule and compare our data with existing Spitzer data and recent Herschel Open-Time PACS data. Extended [CII] emission and more compact CO emission were found in the globule. We ascribe this emission mainly to an internal PDR, created by a possibly embedded star cluster with at least one early B-star. However, external PDR emission caused by the excitation by the Cyg OB2 association cannot be fully excluded. The velocity-resolved [CII] emission traces the emission ...

  6. Bright [CII] and dust emission in three z>6.6 quasar host galaxies observed by ALMA

    CERN Document Server

    Venemans, B P; Zschaechner, L; Decarli, R; De Rosa, G; Findlay, J R; McMahon, R G; Sutherland, W J

    2015-01-01

    We present ALMA detections of the [CII] 158 micron emission line and the underlying far-infrared continuum of three quasars at 6.6 < z < 6.9. The [CII] line and FIR luminosities of these z > 6 quasar hosts correlate with the quasar's bolometric luminosity. In one quasar, the [CII] line is significantly redshifted by ~1700 km/s with respect to the MgII broad emission line. Comparing to values in the literature, we find that, on average, the MgII is blueshifted by 480 km/s (with a standard deviation of 630 km/s) with respect to the host galaxy redshift, i.e. one of our quasars is an extreme outlier. Through modeling we can rule out a flat rotation curve for our brightest [CII] emitter. Finally, we find that the ratio of black hole mass to host galaxy (dynamical) mass is higher by a factor 3-4 (with significant scatter) than local relations.

  7. Isolation of Escherichia coli rpoB mutants resistant to killing by lambda cII protein and altered in pyrE gene attenuation

    DEFF Research Database (Denmark)

    Hammer, Karin; Jensen, Kaj Frank; Poulsen, Peter;

    1987-01-01

    Escherichia coli mutants simultaneously resistant to rifampin and to the lethal effects of bacteriophage lambda cII protein were isolated. The sck mutant strains carry alterations in rpoB that allow them to survive cII killing (thus the name sck), but that do not impair either the expression of c...

  8. Sub-mm Emission Line Deep Fields: CO and [CII] Luminosity Functions out to z = 6

    CERN Document Server

    Popping, Gergö; Decarli, Roberto; Spaans, Marco; Somerville, Rachel S; Trager, Scott C

    2016-01-01

    Now that ALMA is reaching its full capabilities, observations of sub-mm emission line deep fields become feasible. Deep fields are ideal to study the luminosity function of sub-mm emission lines, ultimately tracing the atomic and molecular gas properties of galaxies. We couple a semi-analytic model of galaxy formation with a radiative transfer code to make predictions for the luminosity function of CO J=1-0 up to CO J=6-5 and [CII] at redshifts z=0-6. We find that: 1) our model correctly reproduces the CO and [CII] emission of low- and high-redshift galaxies and reproduces the available constraints on the CO luminosity function at z < 1.5, but underpredicts the CO luminosity of individual galaxies at intermediate redshifts. We argue that this is driven by a lack of cold gas in galaxies at intermediate redshifts as predicted by cosmological simulations of galaxy formation. This may lie at the root of other problems theoretical models face at the same redshifts.

  9. Search for [CII] emission in z=6.5-11 star-forming galaxies

    CERN Document Server

    González-López, Jorge; Decarli, Roberto; Walter, Fabian; Vallini, Livia; Neri, Roberto; Bertoldi, Frank; Bolatto, Alberto D; Carilli, Christopher L; Cox, Pierre; da Cunha, Elisabete; Ferrara, Andrea; Gallerani, Simona; Infante, Leopoldo

    2014-01-01

    We present the search for the [CII] emission line in three $z>6.5$ Lyman-alpha emitters (LAEs) and one J-Dropout galaxy using the Combined Array for Research in Millimeter-wave Astronomy (CARMA) and the Plateau de Bure Interferometer (PdBI). We observed three bright $z\sim6.5-7$ LAEs discovered in the SUBARU deep field (SDF) and the multiply imaged lensed $z\sim 11$ galaxy candidate found behind the galaxy cluster MACSJ0647.7+7015. For the LAEs IOK-1 ($z=6.965$), SDF J132415.7+273058 ($z=6.541$) and SDF J132408.3+271543 ($z=6.554$) we find upper limits for the [CII] line luminosity of $<2.05$, $<4.52$ and $<10.56\times10^{8}{\rm L}_{\odot}$ respectively. We find upper limits to the FIR luminosity of the galaxies using a spectral energy distribution template of the local galaxy NGC 6946 and taking into account the effects of the Cosmic Microwave Background on the mm observations. For IOK-1, SDF J132415.7+273058 and SDF J132408.3+271543 we find upper limits for the FIR luminosity of $<2.33$, $<3.79$ ...

  10. Varying [CII]/[NII] line ratios in the interacting system BR1202-0725 at z=4.7

    CERN Document Server

    Decarli, R; Carilli, C; Bertoldi, F; Cox, P; Ferkinhoff, C; Groves, B; Maiolino, R; Neri, R; Riechers, D; Weiss, A

    2014-01-01

    We study the properties of the interstellar medium in the interacting system BR1202-0725 at z=4.7 via its [NII] and [CII] fine-structure line emission. This system consists of a QSO, a sub-mm galaxy (SMG), and two Ly-alpha emitters (LAEs). Such a diversity in galaxy properties makes BR1202-0725 a unique laboratory of star formation and galaxy evolution at high redshift. We present ionized nitrogen ([NII] 205 micron) observations of this system, obtained with the IRAM Plateau de Bure Interferometer. We find no [NII] emission at the quasar location, but tentative [NII] line detections associated with the SMG and one of the LAEs. Together with available ionized carbon ([CII] 158 micron) ALMA observations of this system, we find the following: The [CII]/[NII] luminosity ratio is >5.5 for the QSO and the SMG, but it is as low as ~2 in the LAE, suggesting that, in this source, most of the [CII] emission is associated with the ionized medium (HII regions) rather than the neutral one (PDRs). This study demonstrates t...

  11. The Faintness of the 158 um [CII] Transition in the z=6.42 Quasar SDSS J1148+5251

    CERN Document Server

    Bolatto, A D; Willott, C J

    2004-01-01

    We report the non-detection of the [CII] 157.74 um transition in the z=6.42 quasar SDSS J1148+5251 after 37.5 hours of integration with the James Clerk Maxwell Telescope. This transition is the main cooling line of the star-forming interstellar medium, and usually the brightest FIR line in galaxies. Our observed RMS of 1.3 mK in the Ta* scale translates to L([CII])<2.6 x 10^9 Lsun. Using a recent estimate of the far-infrared continuum of this quasar, we derive for SDSS J1148+5251 L([CII])/L(FIR)<5 x 10^-4, a ratio similar to that observed in local ultra-luminous infrared galaxies but considerably smaller than what is typical in nearby normal and starburst galaxies. This indicates that the small L([CII])/L(FIR) ratio observed locally in luminous far-infrared objects also persists at the highest redshifts.

  12. Bright [CII] 158$\\mu$m emission in a quasar host galaxy at $z=6.54$

    CERN Document Server

    Bañados, E; Walter, F; Venemans, B P; Farina, E P; Fan, X

    2015-01-01

    The [CII] 158$\\mu$m fine-structure line is known to trace regions of active star formation and is the main coolant of the cold, neutral atomic medium. In this \\textit{Letter}, we report a strong detection of the [CII] line in the host galaxy of the brightest quasar known at $z>6.5$, the Pan-STARRS1 selected quasar PSO J036.5078+03.0498 (hereafter P036+03), using the IRAM NOEMA millimeter interferometer. Its [CII] and total far-infrared luminosities are $(5.8 \\pm 0.7) \\times 10^9 \\,L_\\odot$ and $(7.6\\pm1.5) \\times 10^{12}\\,L_\\odot$, respectively. This results in a $L_{[CII]} /L_{TIR}$ ratio of $\\sim 0.8\\times 10^{-3}$, which is at the high end for those found for active galaxies, though it is lower than the average found in typical main sequence galaxies at $z\\sim 0$. We also report a tentative additional line which we identify as a blended emission from the $3_{22} - 3_{13}$ and $5_{23} - 4_{32}$ H$_2$O transitions. If confirmed, this would be the most distant detection of water emission to date. P036+03 riva...

  13. Synergy of CO/[CII]/Ly$\\alpha$ Line Intensity Mapping with the SKA

    CERN Document Server

    Chang, Tzu-Ching; Santos, Mario; Silva, Marta; Aguirre, James; Doré, Olivier; Pritchard, Jonathan

    2015-01-01

    We present the science enabled by cross-correlations of the SKA1-LOW 21-cm EoR surveys with other line mapping programs. In particular, we identify and investigate potential synergies with planned programs, such as the line intensity mapping of redshifted CO rotational lines, [CII] and Ly-$\\alpha$ emissions during reionization. We briefly describe how these tracers of the star-formation rate at $z \\sim 8$ can be modeled jointly before forecasting their auto- and cross-power spectra measurements with the nominal 21cm EoR survey. The use of multiple line tracers would be invaluable to validate and enrich our understanding of the EoR.
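
    Computationally, such forecasts rest on the cross-power spectrum of two intensity maps. A toy sketch (not the authors' pipeline), cross-correlating two correlated Gaussian random fields that stand in for the 21-cm and [CII] maps:

    ```python
    # Illustrative sketch only: azimuthally averaged cross-power spectrum of
    # two 2D maps that share a common signal (a stand-in for 21cm x [CII]).
    import numpy as np

    def cross_power_2d(map_a, map_b, pix_size, nbins=15):
        n = map_a.shape[0]
        fa, fb = np.fft.fftn(map_a), np.fft.fftn(map_b)
        cross = (fa * np.conj(fb)).real.ravel() * pix_size ** 2 / n ** 2
        k1d = 2.0 * np.pi * np.fft.fftfreq(n, d=pix_size)
        kx, ky = np.meshgrid(k1d, k1d, indexing="ij")
        kr = np.hypot(kx, ky).ravel()
        bins = np.linspace(kr[kr > 0].min(), kr.max(), nbins + 1)
        idx = np.digitize(kr, bins)
        pk = np.array([cross[idx == i].mean() for i in range(1, nbins + 1)])
        return 0.5 * (bins[:-1] + bins[1:]), pk

    rng = np.random.default_rng(1)
    common = rng.normal(size=(128, 128))   # shared cosmological signal
    map_21cm = common + 0.5 * rng.normal(size=(128, 128))  # + own noise
    map_cii = common + 0.5 * rng.normal(size=(128, 128))
    k, pk = cross_power_2d(map_21cm, map_cii, pix_size=1.0)
    print(pk[:3])   # positive on average: only the common signal correlates
    ```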

  14. Contrasting [CII], methylidynium ion CH^+, and methanol physical and chemical tracers around Orion KL.

    Science.gov (United States)

    Morris, P.; Pearson, J.; Neufeld, D.; Gupta, H.; Herschel HEXOS Team

    2011-05-01

    Spectral maps with high velocity resolution and S/N ratios have been taken as part of the HEXOS Key Program with Herschel/HIFI of the inner region of the Orion KL nebula (typically 2' x 2') around the deeply embedded young massive star IRc2, in the [CII] 1900 GHz line and the CH+ J=1-0 (835 GHz) and J=2-1 (1669 GHz) transitions. The CH+ 1-0 map also contains methanol E-symmetry K=5-4 Q-branch emission lines. These maps provide excellent contrasts of the velocity, temperature/density, and isotopic structure of the cooling regions which have accommodated molecular cloud contraction and subsequent star formation. We present the distribution of the intense [CII] and CH+ lines from the turbulent hot and dense core and the surrounding relatively cool material, exhibiting a wide range of velocity profiles and 12C/13C isotopologue ratios. This allows a test of the previously deduced agreement of the isotope ratio with PDR models, which imply that chemical fractionation in this highly illuminated region is unimportant, but we further consider optical depth effects which may attenuate the apparent abundance ratios. The CH+ J=1-0 transition is observed in emission over the entire map while the J=2-1 is primarily in absorption. The excitation of CH+ is problematic due to its rapid reactions with H2 and electrons. The line width is comparable to that of other extended molecules. Possible mechanisms for excitation will be discussed in light of the observational data, in relation to the photodissociation surface illuminated by theta1 Ori C to the south of IRc2.

  15. Extreme CII emission in type 2 quasars at z~2.5: a signature of kappa-distributed electron energies?

    CERN Document Server

    Humphrey, Andrew

    2014-01-01

    We investigate the flux ratio between the 1335 A and 2326 A lines of singly ionized carbon in the extended narrow line regions of type 2 quasars at z~2.5. We find the observed CII 1335 / CII] 2326 flux ratio, which is not sensitive to the C/H abundance ratio, to be often several times higher than predicted by the canonical AGN photoionization models that use solar metallicity and a Maxwell-Boltzmann electron energy distribution. We study several potential solutions for this discrepancy: low gas metallicity, shock ionization, continuum fluorescence, and kappa-distributed electron energies. Although we cannot definitively distinguish between several of the proposed solutions, we argue that a kappa distribution gives the most natural explanation. We also provide a grid of AGN photoionization models using kappa-distributed electron energies.
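
    For readers unfamiliar with the proposed solution: a kappa distribution matches a Maxwell-Boltzmann core but adds a power-law tail of suprathermal electrons, which boosts collisional excitation of UV lines. An illustrative sketch (normalizations are done numerically; the kT and kappa values are arbitrary examples):

    ```python
    # Illustrative sketch only: kappa vs Maxwell-Boltzmann electron energy
    # distributions at the same core temperature (kT and kappa arbitrary).
    import numpy as np

    def maxwell_boltzmann(e, kt):
        return np.sqrt(e) * np.exp(-e / kt)        # unnormalized

    def kappa_distribution(e, kt, kappa):
        # Standard kappa form; tends to Maxwell-Boltzmann as kappa -> inf.
        return np.sqrt(e) * (1.0 + e / ((kappa - 1.5) * kt)) ** (-(kappa + 1.0))

    e = np.linspace(1e-3, 20.0, 4000)              # energy in units of kT
    de = e[1] - e[0]
    mb = maxwell_boltzmann(e, 1.0)
    kp = kappa_distribution(e, 1.0, kappa=10.0)
    mb /= mb.sum() * de                            # normalize numerically
    kp /= kp.sum() * de

    tail = e > 5.0   # suprathermal electrons able to excite UV transitions
    print("tail fraction, Maxwell-Boltzmann:", (mb[tail] * de).sum())
    print("tail fraction, kappa = 10      :", (kp[tail] * de).sum())
    ```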

  16. A 158 Micron [CII] Line Survey of Galaxies at z ~ 1 to 2: An Indicator of Star Formation in the Early Universe

    CERN Document Server

    Stacey, G J; Ferkinhoff, C; Nikola, T; Parshley, S C; Benford, D J; Staguhn, J G; Fiolet, N

    2010-01-01

    We have detected the 158 {\mu}m [CII] line from 12 galaxies at z~1-2. This is the first survey of this important star-formation tracer at redshifts covering the epoch of maximum star formation in the Universe, and it quadruples the number of reported high-z [CII] detections. The line is very luminous, between <0.024-0.65% of the far-infrared continuum luminosity of our sources, and arises from PDRs on molecular cloud surfaces. An exception is PKS 0215+015, where half of the [CII] emission could arise from XDRs near the central AGN. The L[CII]/LFIR ratio in our star-formation-dominated systems is ~8 times larger than that of our AGN-dominated systems. Therefore this ratio selects for star-formation-dominated systems. Furthermore, the L[CII]/LFIR and L[CII]/L(CO(1-0)) ratios in our star-forming galaxies and nearby starburst galaxies are the same, so that luminous star-forming galaxies at earlier epochs (z~1-2) appear to be scaled-up versions of local starbursts entailing kiloparsec-scale starbursts. Most of the F...

  17. Simulator of Galaxy Millimeter/Submillimeter Emission (SIGAME): The [CII]-SFR Relationship of Massive z=2 Main Sequence Galaxies

    OpenAIRE

    Olsen, Karen P.; Greve, Thomas R.; Narayanan, Desika; Thompson, Robert; Toft, Sune; Brinch, Christian

    2015-01-01

    We present S\\'IGAME simulations of the [CII]157.7$\\mu$ fine structure line emission from cosmological smoothed particle hydrodynamics (SPH) simulations of seven main sequence galaxies at z=2. Using sub-grid physics prescriptions the gas in our simulations is modeled as a multi-phased interstellar medium (ISM) comprised of molecular gas residing in giant molecular clouds, an atomic gas phase associated with photo-dissociation regions (PDRs) at the cloud surfaces, and a diffuse, ionized gas pha...

  18. Simulator of Galaxy Millimeter/Submillimeter Emission (SIGAME): The [CII]-SFR Relationship of Massive z=2 Main Sequence Galaxies

    CERN Document Server

    Olsen, Karen P; Narayanan, Desika; Thompson, Robert; Toft, Sune; Brinch, Christian

    2015-01-01

    We present SIGAME simulations of the [CII] 157.7 $\mu$m fine-structure line emission from cosmological smoothed particle hydrodynamics (SPH) simulations of main sequence galaxies at z = 2. Using sub-grid physics prescriptions, the gas in our galaxy simulations is modelled as a multi-phased interstellar medium (ISM) comprised of molecular gas residing in the inner regions of giant molecular clouds, an atomic gas phase associated with photodissociation regions at the surface of the clouds, and a diffuse, fully ionized gas phase. Adopting a density profile of the clouds and taking into account heating by the local FUV radiation field and cosmic rays - both scaled by the local star formation rate density - we calculate the [CII] emission from each of the aforementioned ISM phases using a large velocity gradient approach for each cloud, on resolved and global scales. The [CII] emission peaks in the central regions of the galaxies, and the majority (~60%) of the emission in this region originates in the molecular gas phase. At larger galactocentric distanc...

  19. Design and Fabrication of TES Detector Modules for the TIME-Pilot [CII] Intensity Mapping Experiment

    Science.gov (United States)

    Hunacek, J.; Bock, J.; Bradford, C. M.; Bumble, B.; Chang, T.-C.; Cheng, Y.-T.; Cooray, A.; Crites, A.; Hailey-Dunsheath, S.; Gong, Y.; Kenyon, M.; Koch, P.; Li, C.-T.; O'Brient, R.; Shirokoff, E.; Shiu, C.; Staniszewski, Z.; Uzgil, B.; Zemcov, M.

    2016-08-01

    We are developing a series of close-packed modular detector arrays for TIME-Pilot, a new mm-wavelength grating spectrometer array that will map the intensity fluctuations of the redshifted 157.7 $\mu$m emission line of singly ionized carbon ([CII]) from redshift z ~ 5 to 9. TIME-Pilot's two banks of 16 parallel-plate waveguide spectrometers (one bank per polarization) will have a spectral range of 183-326 GHz and a resolving power of R ~ 100. The spectrometers use a curved diffraction grating to disperse and focus the light on a series of output arcs, each sampled by 60 transition edge sensor (TES) bolometers with gold micro-mesh absorbers. These low-noise detectors will be operated from a 250 mK base temperature and are designed to have a background-limited NEP of $\sim 10^{-17}\,{\rm W}/{\rm Hz}^{1/2}$. This proceeding presents an overview of the detector design in the context of the TIME-Pilot instrument. Additionally, a prototype detector module produced at the Microdevices Laboratory at JPL is shown.
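
    Two of the quoted numbers can be checked with a short calculation, assuming uniform channels of width nu/R across the band (an assumption, not the instrument's actual channelization): the band then holds roughly 58 channels, of the same order as the 60 TES bolometers per output arc, and maps [CII] over z ~ 4.8-9.4.

    ```python
    # Illustrative check only: constant-R channelization of the 183-326 GHz
    # band and the implied [CII] redshift coverage. Uniform nu/R channels
    # are an assumption; the band edges and R = 100 are from the record.
    import numpy as np

    NU_CII = 1900.537                      # GHz, [CII] rest frequency
    NU_LO, NU_HI, R = 183.0, 326.0, 100.0

    n_chan = int(np.log(NU_HI / NU_LO) / np.log(1.0 + 1.0 / R))
    print(f"~{n_chan} channels of width nu/R across the band")
    print(f"[CII] coverage: z = {NU_CII / NU_HI - 1:.2f} "
          f"to {NU_CII / NU_LO - 1:.2f}")
    ```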

  20. Herschel/PACS Survey of protoplanetary disks in Taurus/Auriga -- Observations of [OI] and [CII], and far infrared continuum

    CERN Document Server

    Howard, Christian D; Vacca, William D; Duchêne, Gaspard; Mathews, Geoffrey; Augereau, Jean-Charles; Barrado, David; Dent, William R F; Eiroa, Carlos; Grady, Carol; Kamp, Inga; Meeus, Gwendolyn; Ménard, Francois; Pinte, Christophe; Podio, Linda; Riviere-Marichalar, Pablo; Roberge, Aki; Thi, Wing-Fai; Vicente, Silvia; Williams, Jonathan P

    2013-01-01

    The Herschel Space Observatory was used to observe ~ 120 pre-main-sequence stars in Taurus as part of the GASPS Open Time Key project. PACS was used to measure the continuum as well as several gas tracers such as [OI] 63 \\mu m, [OI] 145 \\mu m, [CII] 158 \\mu m, OH, H2O and CO. The strongest line seen is [OI] at 63 \\mu m. We find a clear correlation between the strength of the [OI] 63 \\mu m line and the 63 \\mu m continuum for disk sources. In outflow sources, the line emission can be up to 20 times stronger than in disk sources, suggesting that the line emission is dominated by the outflow. The tight correlation seen for disk sources suggests that the emission arises from the inner disk ($<$ 50 AU) and lower surface layers of the disk where the gas and dust are coupled. The [OI] 63 \\mu m is fainter in transitional stars than in normal Class II disks. Simple SED models indicate that the dust responsible for the continuum emission is colder in these disks, leading to weaker line emission. [CII] 158 \\mu m emiss...

  1. rctB mutations that increase copy number of Vibrio cholerae oriCII in Escherichia coli

    DEFF Research Database (Denmark)

    Koch, Birgit; Ma, Xiaofang; Løbner-Olesen, Anders

    2012-01-01

    RctB serves as the initiator protein for replication from oriCII, the origin of replication of Vibrio cholerae chromosome II. RctB is conserved between members of Vibrionaceae but shows no homology to known replication initiator proteins and has no recognizable sequence motifs. We used an ori...

  2. A multi-wavelength exploration of the [CII]/IR ratio in H-ATLAS/GAMA galaxies out to z=0.2

    CERN Document Server

    Ibar, E; Herrera-Camus, R; Hopwood, R; Bauer, A; Ivison, R J; Michałowski, M J; Dannerbauer, H; van der Werf, P; Riechers, D; Bourne, N; Baes, M; Valtchanov, I; Dunne, L; Verma, A; Brough, S; Cooray, A; De Zotti, G; Dye, S; Eales, S; Furlanetto, C; Maddox, S; Smith, M; Steele, O; Thomas, D; Valiante, E

    2015-01-01

    We explore the behaviour of the [CII] 157.74um forbidden fine-structure line observed in a sample of 28 galaxies selected from ~50 deg^2 of the H-ATLAS survey. The sample is restricted to galaxies with flux densities higher than S_160um > 150 mJy and optical spectra from the GAMA survey at 0.02 < z < 0.2. We find that galaxies with [CII]/IR ratios > 2.5x10^-3 differ significantly from those showing lower ratios. In particular, those with high ratios tend to have: (1) low L_IR; ... We find that ... is the main parameter responsible for controlling the [CII]/IR ratio. It is possible that a relatively high ... creates a positively charged dust grain distribution, impeding an efficient photo-electric extraction of electrons from these grains to then collisionally excite carbon atoms. Within the brighter IR population, 11 < log(L_IR/L_sun) < 12, the [CII]/IR ratio is unlikely to be modified by [CII] self-absorption or controlled by the presence of a moderately luminous AGN (identified via the BPT diagram).

  3. ALMA Spectroscopic Survey in the Hubble Ultra Deep Field: Search for [CII] line and dust emission in $6<z<8$ galaxies

    CERN Document Server

    Aravena, Manuel; Walter, Fabian; Bouwens, Rychard; Oesch, Pascal; Carilli, Christopher; Bauer, Franz E; Da Cunha, Elisabete; Daddi, Emanuele; González-López, Jorge; Ivison, R J; Riechers, Dominik; Smail, Ian R; Swinbank, Mark; Weiss, Axel; Anguita, Timo; Bacon, Roland; Bell, Eric; Bertoldi, Frank; Cortes, Paulo; Cox, Pierre; Hodge, Jacqueline; Ibar, Eduardo; Inami, Hanae; Infante, Leopoldo; Karim, Alexander; Magnelli, Benjamin; Ota, Kazuaki; Popping, Gergö; van der Werf, Paul; Wagg, Jeffrey

    2016-01-01

    We present a search for [CII] line and dust continuum emission from optical dropout galaxies at $z>6$ using ASPECS, our ALMA Spectroscopic Survey in the Hubble Ultra-Deep Field (UDF). Our observations, which cover the frequency range $212-272$ GHz, encompass approximately the redshift range $6<z<8$ for the [CII] line. We identify a set of [CII] line candidates in this redshift range at significances $>$4.5 $\sigma$, two of which correspond to blind detections with no optical counterparts. At this significance level, our statistical analysis shows that about 60\% of our candidates are expected to be spurious. For one of our blindly selected [CII] line candidates, we tentatively detect the CO(6-5) line in our parallel 3-mm line scan. None of the line candidates are individually detected in the 1.2 mm continuum. A stack of all [CII] candidates results in a tentative detection with $S_{1.2mm}=14\pm5\mu$Jy. This implies a dust-obscured star formation rate (SFR) of $(3\pm1)$ M$_\odot$ yr$^{-1}$. We find that the two highest-SFR objects have candidate [CII] lines with luminosities that are consistent with the low-redshift $L_{\rm [C...
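
    The quoted stacked flux is the kind of number an inverse-variance-weighted stack produces. A schematic sketch with invented per-source measurements (not the survey's actual values):

    ```python
    # Illustrative sketch only: inverse-variance-weighted stack of
    # per-source continuum fluxes (all values below are invented).
    import numpy as np

    def ivw_stack(fluxes, sigmas):
        w = 1.0 / np.asarray(sigmas, dtype=float) ** 2
        mean = np.sum(w * fluxes) / np.sum(w)
        err = 1.0 / np.sqrt(np.sum(w))
        return mean, err

    fluxes_ujy = [22.0, 5.0, 18.0, 9.0, 12.0]    # hypothetical 1.2 mm fluxes
    sigmas_ujy = [11.0, 12.0, 10.0, 13.0, 11.0]  # hypothetical 1-sigma errors
    s, e = ivw_stack(fluxes_ujy, sigmas_ujy)
    print(f"stacked flux = {s:.1f} +/- {e:.1f} uJy")
    ```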

  4. The Soft, Fluctuating UVB at $z\\sim6$ as Traced by C IV, SiIV, and CII

    CERN Document Server

    Finlator, K; Davé, R; Zackrisson, E; Thompson, R; Huang, S

    2016-01-01

    The sources that drove cosmological reionization left clues regarding their identity in the slope and inhomogeneity of the ultraviolet ionizing background (UVB): Bright quasars (QSOs) generate a hard UVB with predominantly large-scale fluctuations while Population II stars generate a softer one with smaller-scale fluctuations. Metal absorbers probe the UVB's slope because different ions are sensitive to different energies. Likewise, they probe spatial fluctuations because they originate in regions where a galaxy-driven UVB is harder and more intense. We take a first step towards studying the reionization-epoch UVB's slope and inhomogeneity by comparing observations of 12 metal absorbers at $z\\sim6$ versus predictions from a cosmological hydrodynamic simulation using three different UVBs: a soft, spatially-inhomogeneous "galaxies+QSOs" UVB; a homogeneous "galaxies+QSOs" UVB (Haardt & Madau 2012); and a QSOs-only model. All UVBs reproduce the observed column density distributions of CII, SiIV, and CIV reaso...

  5. Gas and dust cooling along the major axis of M33 (HerM33es): ISO/LWS CII observations

    CERN Document Server

    Kramer, C; Garcia-Burillo, S; Relano, M; Aalto, S; Boquien, M; Braine, J; Buchbender, C; Gratier, P; Israel, F P; Nikola, T; Roellig, M; Verley, S; van der Werf, P; Xilouris, E M

    2013-01-01

    We aim to better understand the heating of the gas by observing the prominent gas cooling line [CII] at 158um in the low-metallicity environment of the Local Group spiral galaxy M33 at scales of 280pc. In particular, we aim at describing the variation of the photoelectric heating efficiency with galactic environment. In this unbiased study, we used ISO/LWS [CII] observations along the major axis of M33, in combination with Herschel PACS and SPIRE continuum maps, IRAM 30m CO 2-1 and VLA HI data to study the variation of velocity integrated intensities. The ratio of [CII] emission over the far-infrared continuum is used as a proxy for the heating efficiency, and models of photon-dominated regions are used to study the local physical densities, FUV radiation fields, and average column densities of the molecular clouds. The heating efficiency stays constant at 0.8% in the inner 4.5kpc radius of the galaxy where it starts to increase to reach values of ~3% in the outskirts at about 6kpc radial distance. The rise o...

  6. Lipoprotein lipase activity and mass, apolipoprotein C-II mass and polymorphisms of apolipoproteins E and A5 in subjects with prior acute hypertriglyceridaemic pancreatitis

    Directory of Open Access Journals (Sweden)

    García-Arias Carlota

    2009-06-01

    Abstract Background Severe hypertriglyceridaemia due to chylomicronaemia may trigger an acute pancreatitis. However, the basic underlying mechanism is usually not well understood. We decided to analyze some proteins involved in the catabolism of triglyceride-rich lipoproteins in patients with severe hypertriglyceridaemia. Methods Twenty-four survivors of acute hypertriglyceridaemic pancreatitis (cases) and 31 patients with severe hypertriglyceridaemia (controls) were included. Clinical and anthropometrical data, chylomicronaemia, lipoprotein profile, postheparin lipoprotein lipase mass and activity, hepatic lipase activity, apolipoprotein C-II and C-III mass, and apo E and A5 polymorphisms were assessed. Results Only five cases were found to have LPL mass and activity deficiency, all of them thin and having had the first episode in childhood. No cases had apolipoprotein C-II deficiency. No significant differences were found between the non-deficient LPL cases and the controls in terms of obesity, diabetes, alcohol consumption, drug therapy, gender distribution, evidence of fasting chylomicronaemia, lipid levels, LPL activity and mass, hepatic lipase activity, C-II and C-III mass, or apo E polymorphisms. However, the SNP S19W of apo A5 tended to be more prevalent in cases than controls (40% vs. 23%, NS). Conclusion Primary defects in LPL and C-II are rare in survivors of acute hypertriglyceridaemic pancreatitis; lipase activity measurements should be restricted to those having their first episode during childhood.

  7. [CII] 158$\\mu$m and [NII] 205$\\mu$m emission from IC 342 - Disentangling the emission from ionized and photo-dissociated regions

    CERN Document Server

    Röllig, Markus; Güsten, R; Stutzki, J; Israel, F; Jacobs, K

    2016-01-01

    Aims: We investigate how much of the [CII] emission in the nucleus of the nearby spiral galaxy IC 342 is contributed by PDRs and by the ionized gas. We examine the spatial variations of starburst/PDR activity and study the correlation of the [CII] line with the [NII] 205 $\mu$m emission line coming exclusively from the HII regions. Methods: We present small maps of [CII] and [NII] lines recently observed with the GREAT receiver on board SOFIA. In particular we present a super-resolution method to derive how unresolved, kinematically correlated structures in the beam contribute to the observed line shapes. Results: We find that the emission coming from the ionized gas shows a kinematic component in addition to the general Doppler signature of the molecular gas. We interpret this as the signature of two bi-polar lobes of ionized gas expanding out of the galactic plane. We then show how this requires an adaptation of our understanding of the geometrical structure of the nucleus of IC 342. Examining the starbu...

  8. Metabolism of apolipoproteins C-II, C-III, and B in hypertriglyceridemic men. Changes after heparin-induced lipolysis

    International Nuclear Information System (INIS)

    The C apolipoproteins are normally transferred to high density lipoproteins (HDL) after lipolysis of very low density lipoprotein (VLDL) triglyceride. In previous studies, a loss of plasma C apolipoproteins was documented after heparin-induced lipolysis in hypertriglyceridemic subjects. The present studies were designed to determine if this decline in plasma C apolipoproteins was due to their clearance with VLDL remnants. Five Type IV hypertriglyceridemic and two normal subjects were injected with 125I-VLDL and 131I-low density lipoproteins (LDL) to document kinetically an excess of VLDL apolipoprotein (apo) B flux relative to LDL apo B flux in the Type IV subjects. A mean of 46% VLDL apo B was cleared from the circulation, without conversion to intermediate density lipoprotein (IDL) or LDL. Heparin was then infused (9000 IU over 4 hours) to generate an excess of VLDL remnants that were not converted to IDL or LDL. VLDL triglyceride, apo B, and apo C concentrations fell at a similar rate. VLDL apo B declined by 42% (p less than 0.01). However, no increases were observed in IDL or LDL apo B in the Type IV subjects. This resulted in a 14% (p less than 0.01) decline in plasma apo B concentrations, indicating a clearance of VLDL remnants. VLDL apo C-II and C-III concentrations fell by 42% (p less than 0.025) and 52% (p less than 0.01), respectively. During the first 2.5 hours of infusion, they were almost quantitatively recovered in HDL. Thereafter, the C apolipoproteins declined in HDL during which time VLDL apo C concentrations continued to decline

  9. Metabolism of apolipoproteins C-II, C-III, and B in hypertriglyceridemic men. Changes after heparin-induced lipolysis

    Energy Technology Data Exchange (ETDEWEB)

    Huff, M.W.; Breckenridge, W.C.; Strong, W.L.; Wolfe, B.M.

    1988-09-01

    The C apolipoproteins are normally transferred to high density lipoproteins (HDL) after lipolysis of very low density lipoprotein (VLDL) triglyceride. In previous studies, a loss of plasma C apolipoproteins was documented after heparin-induced lipolysis in hypertriglyceridemic subjects. The present studies were designed to determine if this decline in plasma C apolipoproteins was due to their clearance with VLDL remnants. Five Type IV hypertriglyceridemic and two normal subjects were injected with 125I-VLDL and 131I-low density lipoproteins (LDL) to document kinetically an excess of VLDL apolipoprotein (apo) B flux relative to LDL apo B flux in the Type IV subjects. A mean of 46% VLDL apo B was cleared from the circulation, without conversion to intermediate density lipoprotein (IDL) or LDL. Heparin was then infused (9000 IU over 4 hours) to generate an excess of VLDL remnants that were not converted to IDL or LDL. VLDL triglyceride, apo B, and apo C concentrations fell at a similar rate. VLDL apo B declined by 42% (p less than 0.01). However, no increases were observed in IDL or LDL apo B in the Type IV subjects. This resulted in a 14% (p less than 0.01) decline in plasma apo B concentrations, indicating a clearance of VLDL remnants. VLDL apo C-II and C-III concentrations fell by 42% (p less than 0.025) and 52% (p less than 0.01), respectively. During the first 2.5 hours of infusion, they were almost quantitatively recovered in HDL. Thereafter, the C apolipoproteins declined in HDL during which time VLDL apo C concentrations continued to decline.

  10. Witnessing the birth of the red sequence: ALMA high-resolution imaging of [CII] and dust in two interacting ultra-red starbursts at z = 4.425

    CERN Document Server

    Oteo, I; Dunne, L; Smail, I; Swinbank, M; Zhang, Z-Y; Lewis, A; Maddox, S; Riechers, D; Serjeant, S; Van der Werf, P; Bremer, M; Cigan, P; Clements, D L; Cooray, A; Dannerbauer, H; Eales, S; Ibar, E; Messias, H; Michałowski, M J; Pérez-Fournon, I; van Kampen, E

    2016-01-01

    Exploiting the sensitivity and spatial resolution of the Atacama Large Millimeter/submillimeter Array (ALMA), we have studied the morphology and the physical scale of the interstellar medium - both gas and dust - in SGP38326, an unlensed pair of interacting starbursts at $z= 4.425$. SGP38326 is the most luminous starbursting system known at $z > 4$, with an IR-derived ${\rm SFR \sim 4300\,} M_\odot\,{\rm yr}^{-1}$. SGP38326 also contains a molecular gas reservoir among the most massive ever found in the early Universe, and it is the likely progenitor of a massive, red-and-dead elliptical galaxy at $z \sim 3$. Probing scales of $\sim 0.1"$ or $\sim 800 \, {\rm pc}$ we find that the smooth distribution of the continuum emission from cool dust grains contrasts with the more irregular morphology of the gas, as traced by the [CII] fine structure emission. The gas is also extended over larger physical scales than the dust. The velocity information provided by the resolved [CII] emission reveals that the dynamics...

  11. Apolipoprotein C-II Adopts Distinct Structures in Complex with Micellar and Submicellar Forms of the Amyloid-Inhibiting Lipid-Mimetic Dodecylphosphocholine.

    Science.gov (United States)

    Ryan, Timothy M; Griffin, Michael D W; McGillivray, Duncan J; Knott, Robert B; Wood, Kathleen; Masters, Colin L; Kirby, Nigel; Curtain, Cyril C

    2016-01-01

    The formation of amyloid deposits is a common feature of a broad range of diseases, including atherosclerosis, Alzheimer's disease, and Parkinson's disease. The basis and role of amyloid deposition in the pathogenesis of these diseases are still being defined; however, an interesting feature of amyloidogenic proteins is that the majority of the pathologically associated proteins are involved in lipid homeostasis, be it in lipid transport, incorporation into membranes, or the regulation of lipid pathways. Thus, amyloid-forming proteins commonly bind lipids, and lipids are generally involved in the proper folding of these proteins. However, understanding of the basis for these lipid-related aspects of amyloidogenesis is lacking. We have therefore used the apolipoprotein C-II amyloid model system in conjunction with X-ray and neutron scattering analyses to address this problem. Apolipoprotein C-II is a well-studied model system of systemic amyloid fibril formation, with a clear and well-defined pathway for fibril formation, where the effects of lipid interaction are characterized, particularly for the lipid mimetic dodecylphosphocholine. We show that the micellar state of an inhibitory lipid can have a very significant effect on protein conformation, with micelles stabilizing a particular α-helical structure, whereas submicellar lipids stabilize a very different dimeric, α-helical structure. These results indicate that lipids may have an important role in the development and progression of amyloid-related diseases. PMID:26745412

  12. [CII] and $^{12}$CO(1-0) Emission Maps in HLSJ091828.6+514223: A Strongly Lensed Interacting System at $z=5.24$

    CERN Document Server

    Rawle, T D; Bussmann, R S; Gurwell, M; Ivison, R J; Boone, F; Combes, F; Danielson, A L R; Rex, M; Richard, J; Smail, I; Swinbank, A M; Blain, A W; Clement, B; Dessauges-Zavadsky, M; Edge, A C; Fazio, G G; Jones, T; Kneib, J -P; Omont, A; Perez-Gonzalez, P G; Schaerer, D; Valtchanov, I; van der Werf, P P; Walth, G; Zamojski, M; Zemcov, M

    2013-01-01

    We present Submillimeter Array (SMA) [CII] 158um and Jansky Very Large Array (JVLA) $^{12}$CO(1-0) line emission maps for the bright, lensed, submillimeter source at $z=5.2430$ behind Abell 773: HLSJ091828.6+514223 (HLS0918). We combine these measurements with previously reported line profiles, including multiple $^{12}$CO rotational transitions, [CI], water and [NII], providing some of the best constraints on the properties of the interstellar medium (ISM) in a galaxy at $z>5$. HLS0918 has a total far-infrared (FIR) luminosity L_FIR(8-1000um) = (1.6$\\pm$0.1)x10^14 L_sun/mu, where the total magnification mu_total = 8.9$\\pm$1.9, via a new lens model from the [CII] and continuum maps. Despite a HyLIRG luminosity, the FIR continuum shape resembles that of a local LIRG. We simultaneously fit all of the observed spectral line profiles, finding four components which correspond cleanly to discrete spatial structures identified in the maps. The two most redshifted spectral components occupy the nucleus of a massive g...
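
    A small worked example of the magnification correction implied by the quoted values: the intrinsic luminosity is the apparent one divided by mu_total, with the fractional uncertainties combined in quadrature (standard first-order error propagation; only the two quoted numbers are used).

    ```python
    # Correcting an observed (lensed) luminosity for magnification:
    # L_intrinsic = L_apparent / mu. Values are those quoted in the record.
    import numpy as np

    l_app, dl_app = 1.6e14, 0.1e14   # Lsun, apparent L_FIR(8-1000um)
    mu, dmu = 8.9, 1.9               # total lensing magnification

    l_int = l_app / mu
    dl_int = l_int * np.hypot(dl_app / l_app, dmu / mu)
    print(f"intrinsic L_FIR = ({l_int:.2e} +/- {dl_int:.1e}) Lsun")
    ```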

  13. ALMA Observation of 158 micron [CII] Line and Dust Continuum of a z=7 Normally Star-forming Galaxy in the Epoch of Reionization

    CERN Document Server

    Ota, Kazuaki; Ohta, Kouji; Hatsukade, Bunyo; Carilli, Chris L; da Cunha, Elisabete; González-López, Jorge; Decarli, Roberto; Hodge, Jacqueline A; Nagai, Hiroshi; Egami, Eiichi; Jiang, Linhua; Iye, Masanori; Kashikawa, Nobunari; Riechers, Dominik A; Bertoldi, Frank; Cox, Pierre; Neri, Roberto; Weiss, Axel

    2014-01-01

    We present ALMA observations of the [CII] line and far-infrared (FIR) continuum of a normally star-forming galaxy in the reionization epoch, the z=6.96 Ly-alpha emitter (LAE) IOK-1. Probing to sensitivities of sigma_line = 240 micro-Jy/beam (40 km/s channel) and sigma_cont = 21 micro-Jy/beam, we found the galaxy undetected in both [CII] and continuum. Comparison of UV - FIR spectral energy distribution (SED) of IOK-1, including our ALMA limit, with those of several types of local galaxies (including the effects of the cosmic microwave background, CMB, on the FIR continuum) suggests that IOK-1 is similar to local dwarf/irregular galaxies in SED shape rather than highly dusty/obscured galaxies. Moreover, our 3 sigma FIR continuum limit, corrected for CMB effects, implies intrinsic dust mass M_dust < 6.4 x 10^7 M_sun, FIR luminosity L_FIR < 3.7 x 10^{10} L_sun (42.5 - 122.5 micron), total IR luminosity L_IR < 5.7 x 10^{10} L_sun (8 - 1000 micron) and dust-obscured star formation rate (SFR) < 10 M_sun...

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  15. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  19. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television as many long feared, but the computer: the ubiquitous portal of work and personal lives. At this point, the computer is almost so common we don't notice it in our view. It's difficult to envision that not that long ago it was a gigantic, room-sized structure accessed by only a few, inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little more noted than a toaster. These dramati

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  1. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  2. OMNET - high speed data communications for PDP-11 computers

    International Nuclear Information System (INIS)

    Omnet is a high speed data communications network designed at CERN for PDP-11 computers. It has grown from a link multiplexor system built for a CII 10070 computer into a full multi-point network, to which some fifty computers are now connected. It provides communications facilities for several large experimental installations as well as many smaller systems and has connections to all parts of the CERN site. The transmission protocol is discussed and brief details are given of the hardware and software used in its implementation. Also described is the gateway interface to the CERN packet switching network, 'Cernet'. (orig.)

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  8. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  10. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  11. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  13. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences. Operations Office Figure 6: Transfers from all sites in the last 90 days. For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  16. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. GlideInWMS and its components are now also deployed at CERN, adding to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  19. High performance computational integral imaging system using multi-view video plus depth representation

    Science.gov (United States)

    Shi, Shasha; Gioia, Patrick; Madec, Gérard

    2012-12-01

    Integral imaging is an attractive auto-stereoscopic three-dimensional (3D) technology for next-generation 3DTV, but its application is obstructed by poor image quality, huge data volume and high processing complexity. In this paper, a new computational integral imaging (CII) system using multi-view video plus depth (MVD) representation is proposed to solve these problems. The originality of this system lies in three aspects. Firstly, a particular depth-image-based rendering (DIBR) technique is used in the encoding process to exploit the inter-view correlation between different sub-images (SIs). Thereafter, the same DIBR method is applied on the display side to interpolate virtual SIs and improve the reconstructed 3D image quality. Finally, a novel parallel group projection (PGP) technique is proposed to simplify the reconstruction process. According to experimental results, the proposed CII system improves compression efficiency and displayed image quality, while reducing calculation complexity. [Figure not available: see full text.]
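
    The DIBR step above can be pictured as a depth-dependent horizontal shift of pixels from the reference view into the virtual view. Below is a minimal sketch in Python; the function name, the simplified shift-sensor camera model and all parameter values are invented for illustration and are not the paper's method:

        import numpy as np

        def dibr_warp(color, depth, baseline, focal):
            # Forward-warp a reference view to a horizontally shifted virtual
            # view. Each pixel moves by disparity ~ baseline * focal / depth;
            # nearer pixels win on collision (z-buffer); disocclusions stay as
            # zeros (holes a real system would inpaint from neighboring views).
            h, w = depth.shape
            virtual = np.zeros_like(color)
            z_buffer = np.full((h, w), -np.inf)
            disparity = baseline * focal / np.maximum(depth, 1e-6)
            for y in range(h):
                for x in range(w):
                    xv = int(round(x + disparity[y, x]))
                    if 0 <= xv < w and disparity[y, x] > z_buffer[y, xv]:
                        z_buffer[y, xv] = disparity[y, x]
                        virtual[y, xv] = color[y, x]
            return virtual

        # Toy usage: a 4x4 image whose left half is much closer to the camera.
        color = np.arange(16, dtype=np.float32).reshape(4, 4)
        depth = np.where(np.arange(4) < 2, 1.0, 4.0)[None, :].repeat(4, axis=0)
        print(dibr_warp(color, depth, baseline=0.05, focal=40.0))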

  20. Allelic variants of the genes of apolipoproteins B and CII in patients with coronary heart disease and in healthy individuals from the Moscow population

    Energy Technology Data Exchange (ETDEWEB)

    Pogoda, T.V.; Kolosova, T.V.; Lyudvikova, E.K. [Institute of Molecular Genetics, Moscow (Russian Federation)] [and others]

    1995-07-01

    Allelic frequencies of a microsatellite of the apolipoprotein CII gene (APOCII) and a minisatellite of the apolipoprotein B gene (APOB) were studied by using polymerase chain reaction (PCR). The study was conducted on a random sample of male Moscow inhabitants and a sample of patients with coronary heart disease (CHD) from the same population. Fourteen variants of the APOB minisatellite (the 82% heterozygosity level) and 13 alleles of the APOCII microsatellite (the 85% heterozygosity level) were found. CHD patients significantly differed from the control group in the distributions of alleles in these loci: APOB 32, APOB 46, APOB 48, and APOB 50 as well as APOCII 17 and APOCII 29 were found more frequently. A relationship was found between the distributions of APOB and APOCII in the CHD patients. The CHD patients with alleles APOCII 21 and APOCII 30 very often had the allele APOB 32; and patients with the genotype APOB 34, 36 had the allele APOCII 29 even more often than affected individuals in general. Individuals of the control group with the allele APOCII 30 exhibited hypertriglyceridemia without increased levels of total cholesterol and apolipoprotein B in plasma. 14 refs., 3 figs., 6 tabs.
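
    Case-control comparisons of allele distributions like the one above are conventionally tested with a chi-square contingency test. A minimal Python sketch; the counts below are hypothetical placeholders, not the study's data:

        from scipy.stats import chi2_contingency

        # Hypothetical allele counts (NOT the study's data): rows are CHD
        # patients and controls, columns are three APOB minisatellite alleles.
        table = [
            [30, 45, 25],   # CHD patients
            [50, 30, 20],   # healthy controls
        ]
        chi2, p, dof, expected = chi2_contingency(table)
        print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")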

  1. Solution Conditions Affect the Ability of the K30D Mutation To Prevent Amyloid Fibril Formation by Apolipoprotein C-II: Insights from Experiments and Theoretical Simulations.

    Science.gov (United States)

    Mao, Yu; Todorova, Nevena; Zlatic, Courtney O; Gooley, Paul R; Griffin, Michael D W; Howlett, Geoffrey J; Yarovsky, Irene

    2016-07-12

    Apolipoproteins form amphipathic helical structures that bind lipid surfaces. Paradoxically, lipid-free apolipoproteins display a strong propensity to form cross-β structure and self-associate into disease-related amyloid fibrils. Studies of apolipoprotein C-II (apoC-II) amyloid fibrils suggest that a K30-D69 ion pair accounts for the dual abilities to form helix and cross-β structure. Consistent with this is the observation that a K30D mutation prevents fibril formation under standard fibril forming conditions. However, we found that fibril formation by K30D apoC-II proceeded readily at low pH and a higher salt or protein concentration. Structural analysis demonstrated that K30D apoC-II fibrils at pH 7 have a structure similar to that of the wild-type fibrils but are less stable. Molecular dynamics simulations of the wild-type apoC-II fibril model at pH 7 and 3 showed that the loss of charge on D69 at pH 3 leads to greater separation between residues K30 and D69 within the fibril with a corresponding reduction in β-strand content around residue 30. In contrast, in simulations of the K30D mutant model at pH 7 and 3, residues D30 and D69 moved closer at pH 3, accompanied by an increase in β-strand content around residue 30. The simulations also demonstrated a strong dominance of inter- over intramolecular contacts between ionic residues of apoC-II and suggested a cooperative mechanism for forming favorable interactions between the individual strands under different conditions. These observations demonstrate the important role of the buried K30-D69 ion pair in the stability and solution properties of apoC-II amyloid fibrils. PMID:27311794

  2. Apolipoprotein C-II Is a Potential Serum Biomarker as a Prognostic Factor of Locally Advanced Cervical Cancer After Chemoradiation Therapy

    International Nuclear Information System (INIS)

    Purpose: To identify pretreatment serum protein levels that can be measured in a generally applicable way to predict chemoradiation treatment outcomes in patients with locally advanced squamous cell cervical carcinoma (CC). Methods and Materials: In a screening study, measurements were conducted twice. At first, 6 serum samples from CC patients (3 with no evidence of disease [NED] and 3 with cancer-caused death [CD]) and 2 from healthy controls were tested. Next, 12 serum samples from different CC patients (8 NED, 4 CD) and 4 from healthy controls were examined. Subsequently, 28 different CC patients (18 NED, 10 CD) and 9 controls were analyzed in the validation study. Protein chips were treated with the sample sera, and the serum protein pattern was detected by surface-enhanced laser desorption and ionization time-of-flight mass spectrometry (SELDI-TOF MS). Then, single-MS-based peptide mass fingerprinting (PMF) and tandem MS (MS/MS)-based peptide/protein identification methods were used to identify the protein corresponding to the detected peak. A turbidimetric assay was then used to measure the levels of the protein that best matched this peptide peak. Results: The same peak at 8918 m/z was identified in both screening studies. Neither the screening study nor the validation study showed significant differences in the appearance of this peak between the controls and NED. However, the intensity of the peak in CD was significantly lower than that of controls and NED in both pilot studies (P=.02, P=.04) and in the validation study (P=.01, P=.001). The protein that best matched this peptide peak at 8918 m/z was identified as apolipoprotein C-II (ApoC-II) using the PMF and MS/MS methods. The turbidimetric assay showed that the mean serum levels of ApoC-II tended to be lower in the CD group than in the NED group (P=.078). Conclusion: ApoC-II could be used as a biomarker for predicting and estimating the radiation treatment outcome of patients with CC

  3. Apolipoprotein C-II and lipoprotein lipase show a temporal and geographic correlation with surfactant lipid synthesis in preparation for birth

    Directory of Open Access Journals (Sweden)

    Gérard-Hudon Marie-Christine

    2010-11-01

    Abstract. Background: Fatty acids are precursors in the synthesis of surfactant phospholipids. Recently, we showed expression of apolipoprotein C-II (apoC-II), the essential cofactor of lipoprotein lipase (LPL), in the fetal mouse lung and found the protein on the day of the surge of surfactant synthesis (gestation day 17.5) in secretory granule-like structures in the distal epithelium. In the present study, we answer the following questions: Does apoC-II protein localization change according to the stage of lung development, and thus according to the need for surfactant? Are LPL molecules translocated to the luminal surface of capillaries? Do the sites of apoC-II and LPL gene expression change according to the stage of lung development and to protein localization? Results: The present study investigated whether the sites of apoC-II and LPL mRNA and protein accumulation are regulated in the mouse lung between gestation day 15 and postnatal day 10. The major sites of apoC-II and LPL gene expression changed over time and were found mainly in the distal epithelium at the end of gestation but not after birth. Accumulation of apoC-II in secretory granule-like structures was not systematically observed, but was found in the distal epithelium only at the end of gestation and soon after birth, mainly in epithelia with no or small lumina. A noticeable increase in surfactant lipid content was measured before the end of gestation day 18, which correlates temporally with the presence of apoC-II in secretory granules in distal epithelium with no or small lumina but not with large lumina. LPL was detected in capillaries at all the developmental times studied. Conclusions: This study demonstrates that apoC-II and LPL mRNAs correlate temporally and geographically with surfactant lipid synthesis in preparation for birth and suggests that fatty acid recruitment from the circulation by apoC-II-activated LPL is regionally modulated by apoC-II secretion. We propose a model

  4. Coupling of a real time computer to nuclear detectors systems

    International Nuclear Information System (INIS)

    Electronic computers are now included in nuclear physics experiment systems. This corresponds to a general trend to replace conventional multichannel analyzers by on-line, real-time computers. An on-line computer performing nuclear data acquisition and storage offers the advantage of running reduction and calculation routines in real time. This advantage becomes a need when the number of experimental parameters increases. At the Saclay variable-energy cyclotron we have connected a C.I.I. C 90-10 computer. We describe the input/output hardware features. In order to establish a dialogue with physicists, we have built a main display unit able to control many display consoles at different points; we describe them as well as some utility routines. (author)

  5. [CII] gas in IC 342

    CERN Document Server

    Röllig, M; Güsten, R; Stutzki, J; Hübers, H W; Hartogh, P; Jacobs, K; Guan, X; Israel, F

    2012-01-01

    Methods: We used the dual-band receiver GREAT on board the SOFIA airborne telescope to perform observations of the [C II] 158 μm fine-structure line at the positions of two giant molecular clouds (GMCs) in the center of IC 342 (GMCs C and E) and compared the spectra with corresponding ground-based data for low- and mid-J CO and [C I]. We performed model calculations assuming a clumpy photo-dissociation region (PDR) environment using the KOSMA-tau PDR model code to derive physical parameters of the local medium. Results: The [C II] 158 μm emission resembles the spectral signature of ground-based atomic and molecular lines, which indicates a common origin. The emission from GMC E can be decomposed into a cool, molecular component with weak far-ultraviolet (FUV) fields and low mean densities of 10^3 cm^-3 and a strongly excited starburst/PDR region with higher densities of 10^4 cm^-3 and FUV intensities of 250-300 Draine fields. The emission from GMC C is consistent with gas densities of 5000 cm^-3, FUV i...

  6. 1939 Quay County CII Aerial Photo Index

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — Aerial photographs are retrievable on a frame by frame basis. The aerial photo inventory contains imagery from various sources that are now archived at the Earth...

  7. Coupling of a real time computer to nuclear detectors systems; Couplage d'un calculateur en temps reel a un ensemble experimental de detection

    Energy Technology Data Exchange (ETDEWEB)

    Lugol, J. [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1967-06-01

    Electronic computers are now included in nuclear physics experiment systems. This corresponds to a general trend to replace conventional multichannel analyzers by on-line, real-time computers. An on-line computer performing nuclear data acquisition and storage offers the advantage of running reduction and calculation routines in real time. This advantage becomes a need when the number of experimental parameters increases. At the Saclay variable-energy cyclotron we have connected a C.I.I. C 90-10 computer. We describe the input/output hardware features. In order to establish a dialogue with physicists, we have built a main display unit able to control many display consoles at different points; we describe them as well as some utility routines. (author) [French original, translated:] Nuclear physics experiments increasingly call on electronic computers, which are replacing the traditional multichannel analyzers. An on-line computer handling the acquisition and storage of experimental data has the advantage of being able to execute, in real time, simple calculation and data-reduction programs. We show the need for a computer to take charge of nuclear physics experiments in which the number of experimental parameters becomes large. At the Saclay variable-energy cyclotron, the adopted solution is a C.I.I. C 90-10 computer. We describe the interfaces required for the coupling. To provide a dialogue with the physicists, a display system was built; we describe it together with some typical programs illustrating the use of the whole system. (author)

  8. Cloud Computing Vs. Grid Computing

    OpenAIRE

    Seyyed Mohsen Hashemi; Amid Khatibi Bardsiri

    2012-01-01

    Cloud computing emerges as one of the hottest topics in the field of information technology. Cloud computing is based on several other computing research areas such as HPC, virtualization, utility computing and grid computing. In order to make clear the essentials of cloud computing, we propose the characteristics of this area which make cloud computing what it is and distinguish it from other research areas. Service orientation, loose coupling, strong fault tolerance, the business model and...

  9. Velocity-resolved [CII] Emission and [CII]/FIR Mapping along Orion with Herschel

    Science.gov (United States)

    Goicoechea, Javier R.; Teyssier, D.; Etxaluze, M.; Goldsmith, P. F.; Ossenkopf, V.; Gerin, M.; Bergin, E. A.; Black, J. H.; Cernicharo, J.; Cuadrado, S.; Encrenaz, P.; Falgarone, E.; Fuente, A.; Hacar, A.; Lis, D. C.; Marcelino, N.; Melnick, G. J.; Müller, H. S. P.; Persson, C.; Pety, J.; Röllig, M.; Schilke, P.; Simon, R.; Snell, R. L.; Stutzki, J.

    2015-10-01

    We present the first ~7.5' × 11.5' velocity-resolved (~0.2 km s^-1) map of the [C ii] 158 μm line toward the Orion molecular cloud 1 (OMC 1) taken with the Herschel/HIFI instrument. In combination with far-IR (FIR) photometric images and velocity-resolved maps of the H41α hydrogen recombination and CO J = 2-1 lines, this data set provides an unprecedented view of the intricate small-scale kinematics of the ionized/photodissociation region (PDR)/molecular gas interfaces and of the radiative feedback from massive stars. The main contribution to the [C ii] luminosity (~85%) is from the extended, FUV-illuminated face of the cloud (G_0 > 500, n_H > 5 × 10^3 cm^-3) and from dense PDRs (G_0 ≳ 10^4, n_H ≳ 10^5 cm^-3) at the interface between OMC 1 and the H ii region surrounding the Trapezium cluster. Around ~15% of the [C ii] emission arises from a different gas component without a CO counterpart. The [C ii] excitation, PDR gas turbulence, line opacity (from [13C ii]), and role of the geometry of the illuminating stars with respect to the cloud are investigated. We construct maps of the L[C ii]/L_FIR and L_FIR/M_Gas ratios and show that L[C ii]/L_FIR decreases from the extended cloud component (~10^-2-10^-3) to the more opaque star-forming cores (~10^-3-10^-4). The lowest values are reminiscent of the "[C ii] deficit" seen in local ultraluminous IR galaxies hosting vigorous star formation. Spatial correlation analysis shows that the decreasing L[C ii]/L_FIR ratio correlates better with the column density of dust through the molecular cloud than with L_FIR/M_Gas. We conclude that the [C ii]-emitting column relative to the total dust column along each line of sight is responsible for the observed L[C ii]/L_FIR variations through the cloud. Uses observations obtained with the IRAM 30 m telescope. IRAM is supported by INSU/CNRS (France), MPG (Germany), and IGN (Spain).
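
    Once the [C ii] and FIR maps are reprojected onto a common grid, a ratio map such as L[C ii]/L_FIR reduces to a masked pixel-wise division. A minimal numpy sketch, with array names, units and thresholds invented for illustration rather than taken from the paper:

        import numpy as np

        def ratio_map(cii, fir, noise=1.0, snr_floor=3.0):
            # Pixel-wise line-to-continuum ratio; pixels below an illustrative
            # signal-to-noise floor (noise = 1-sigma uncertainty) become NaN.
            good = (cii > snr_floor * noise) & (fir > 0)
            out = np.full(cii.shape, np.nan)
            out[good] = cii[good] / fir[good]
            return out

        # Toy usage on random "maps" sharing the same grid
        rng = np.random.default_rng(0)
        cii = rng.uniform(0.0, 10.0, (4, 4))
        fir = rng.uniform(1.0, 1.0e3, (4, 4))
        print(ratio_map(cii, fir))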

  10. Rethinking Computations

    Czech Academy of Sciences Publication Activity Database

    Wiedermann, Jiří; van Leeuwen, J.

    Exeter: AISB, 2013 - (Bishop, M.; Erden, Y.), s. 6-10 ISBN 978-1-908187-31-4. [AISB Symposium on Computing and Philosophy: The Scandal of Computation - What is Computation? /6./. Exeter (GB), 03.04.2013-05.04.2013] R&D Projects: GA ČR GAP202/10/1333 Institutional support: RVO:67985807 Keywords : computation * epistemology * philosophy of computing Subject RIV: IN - Informatics, Computer Science

  11. Cloud Computing

    OpenAIRE

    Bhavana Gupta

    2012-01-01

    Cloud computing is a type of computing environment where business owners outsource their computing needs, including application software services, to a third party; when they need to use the computing power, or employees need to use application resources like databases, email, etc., they access the resources via the Internet. Cloud computing is the use of computing resources (hardware and software) that are delivered as a service over a network (typically the Internet).

  12. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  13. Computer Music

    Science.gov (United States)

    Cook, Perry

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.). Although most people would think that analog synthesizers and electronic music substantially predate the use of computers in music, many experiments and complete computer music systems were being constructed and used as early as the 1950s.

  14. Grid Computing

    Indian Academy of Sciences (India)

    2016-05-01

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers on demand. In this article, we describe the grid computing model and enumerate the major differences between grid and cloud computing.

  15. Cloud Computing

    OpenAIRE

    Bhavana Gupta

    2012-01-01

    Cloud computing is a type of computing environment where business owners outsource their computing needs, including application software services, to a third party; when they need to use the computing power, or employees need to use application resources like databases, email, etc., they access the resources via the Internet.

  16. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer's point of view, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture.

  17. Computational chemistry

    OpenAIRE

    Truhlar, Donald G.; McKoy, Vincent

    2000-01-01

    Computational chemistry has come of age. With significant strides in computer hardware and software over the last few decades, computational chemistry has achieved full partnership with theory and experiment as a tool for understanding and predicting the behavior of a broad range of chemical, physical, and biological phenomena. The Nobel Prize award to John Pople and Walter Kohn in 1998 highlighted the importance of these advances in computational chemistry. With massively parallel computers ...

  18. Duality Computing in Quantum Computers

    Institute of Scientific and Technical Information of China (English)

    LONG Gui-Lu; LIU Yang

    2008-01-01

    In this letter, we propose a duality computing mode, which resembles the particle-wave duality property observed when a quantum system such as a quantum computer passes through a double-slit. In this mode, computing operations are not necessarily unitary. The duality mode provides a natural link between classical computing and quantum computing. In addition, the duality mode provides a new tool for quantum algorithm design.

  19. Computational manufacturing

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper presents a general framework for computational manufacturing. The methodology of computational manufacturing aims at integrating computational geometry, machining principles, sensor information fusion, optimization, computational intelligence and virtual prototyping to solve problems of the modeling, reasoning, control, planning and scheduling of manufacturing processes and systems. There are three typical problems in computational manufacturing, i.e., scheduling (time-domain), geometric reasoning (space-domain) and decision-making (interaction between time-domain and space-domain). Some theoretical fundamentals of computational manufacturing are also discussed.

  20. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Green computing is all about using computers in a smarter, eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals, as well as reduced resource consumption and proper disposal of electronic waste. Computers make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  1. Stream Computing

    CERN Document Server

    Kak, Subhash

    2008-01-01

    Stream computing is the use of multiple autonomic and parallel modules together with integrative processors at a higher level of abstraction to embody "intelligent" processing. The biological basis of this computing is sketched and the matter of learning is examined.

  2. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    The problematic addressed in the dissertation is generally shaped by a sensation that something is amiss within the area of Ubiquitous Computing. Ubiquitous Computing as a vision—as a program—sets out to challenge the idea of the computer as a desktop computer and to explore the potential of the new microprocessors and network technologies. However, the understanding of the computer represented within this program poses a challenge for the intentions of the program. The computer is understood as a multitude of invisible intelligent information devices, which confines the computer as a tool. How could computation, for instance, be implemented in design and architecture, and in what new directions will we take the technological developments? We need a new understanding of the computer to guide these developments, as none of the previous understandings apply to these new conditions and new opportunities. I propose that we begin...

  3. Computable models

    CERN Document Server

    Turner, Raymond

    2009-01-01

    Computational models can be found everywhere in present day science and engineering. In providing a logical framework and foundation for the specification and design of specification languages, Raymond Turner uses this framework to introduce and study computable models. In doing so he presents the first systematic attempt to provide computational models with a logical foundation. Computable models have wide-ranging applications from programming language semantics and specification languages, through to knowledge representation languages and formalism for natural language semantics. They are al

  4. Cloud computing

    OpenAIRE

    Kodera, Lukáš

    2014-01-01

    This thesis deals with cloud computing in the Czech Republic, specifically with providers of cloud services. The theoretical part explains what cloud computing is, the different kinds of cloud computing, the virtualization necessary for cloud computing, the main concerns about cloud security, and where the cloud is physically stored. In the practical part, the author chooses the best solution for a company from selected cloud providers in the Czech Republic using mathematical methods, and then compares th...

  5. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  6. Computing fundamentals introduction to computers

    CERN Document Server

    Wempen, Faithe

    2014-01-01

    The absolute beginner's guide to learning basic computer skills Computing Fundamentals, Introduction to Computers gets you up to speed on basic computing skills, showing you everything you need to know to conquer entry-level computing courses. Written by a Microsoft Office Master Instructor, this useful guide walks you step-by-step through the most important concepts and skills you need to be proficient on the computer, using nontechnical, easy-to-understand language. You'll start at the very beginning, getting acquainted with the actual, physical machine, then progress through the most common

  7. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) its basic concept of natural computing has neither been defined theoretically nor implemented practically; (2) it cannot...

  8. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  9. Computer Manual.

    Science.gov (United States)

    Illinois State Office of Education, Springfield.

    This manual, designed to provide the teacher with methods of understanding the computer and its potential in the classroom, includes four units with exercises and an answer sheet. Unit 1 covers computer fundamentals, the minicomputer, programming languages, an introduction to BASIC, and control instructions. Variable names and constants described…

  10. Human Computation

    CERN Document Server

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  11. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed.Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn
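
    For the single tridiagonal linear systems mentioned above, the sequential baseline against which parallel schemes such as cyclic reduction are usually measured is the Thomas algorithm. A short illustrative sketch (in Python rather than the book's vectorized Cray code):

        import numpy as np

        def thomas_solve(a, b, c, d):
            # Solve a tridiagonal system: a = sub-diagonal (a[0] unused),
            # b = main diagonal, c = super-diagonal (c[-1] unused), d = RHS.
            n = len(d)
            cp, dp = np.empty(n), np.empty(n)
            cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
            for i in range(1, n):                      # forward elimination
                denom = b[i] - a[i] * cp[i - 1]
                cp[i] = c[i] / denom if i < n - 1 else 0.0
                dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
            x = np.empty(n)
            x[-1] = dp[-1]
            for i in range(n - 2, -1, -1):             # back substitution
                x[i] = dp[i] - cp[i] * x[i + 1]
            return x

        # The system [[2,1,0],[1,2,1],[0,1,2]] x = [3,4,3] has solution [1,1,1].
        print(thomas_solve(np.array([0.0, 1.0, 1.0]), np.array([2.0, 2.0, 2.0]),
                           np.array([1.0, 1.0, 0.0]), np.array([3.0, 4.0, 3.0])))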

  12. CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    Dr. Vinod Kumar

    2013-01-01

    Cloud computing is the delivery of computing as a service rather than a product, whereby shared resources, software, and information are provided to computers and other devices as a utility (like the electricity grid) over a network (typically the Internet). Cloud computing provides computation, software, data access, and storage services that do not require end-user knowledge of the physical location and configuration of the system that delivers the services. A parallel to this concept can be drawn with the electricity grid, wherein end-users consume power without needing to understand the component devices or infrastructure required to provide the service.

  13. Computer science

    CERN Document Server

    Blum, Edward K

    2011-01-01

    Computer Science: The Hardware, Software and Heart of It focuses on the deeper aspects of the two recognized subdivisions of Computer Science, Software and Hardware. These subdivisions are shown to be closely interrelated as a result of the stored-program concept. Computer Science: The Hardware, Software and Heart of It includes certain classical theoretical computer science topics such as Unsolvability (e.g. the halting problem) and Undecidability (e.g. Godel's incompleteness theorem) that treat problems that exist under the Church-Turing thesis of computation. These problem topics explain in

  14. Cloud Computing

    OpenAIRE

    Mirashe, Shivaji P.; Kalyankar, N. V.

    2010-01-01

    Computing as you know it is about to change, your applications and documents are going to move from the desktop into the cloud. I'm talking about cloud computing, where applications and files are hosted on a "cloud" consisting of thousands of computers and servers, all linked together and accessible via the Internet. With cloud computing, everything you do is now web based instead of being desktop based. You can access all your programs and documents from any computer that's connected to the ...

  15. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  16. Computer Literacy: Teaching Computer Ethics.

    Science.gov (United States)

    Troutner, Joanne

    1986-01-01

    Suggests learning activities for teaching computer ethics in three areas: (1) equal access; (2) computer crime; and (3) privacy. Topics include computer time, advertising, class enrollments, copyright law, sabotage ("worms"), the Privacy Act of 1974 and the Freedom of Information Act of 1966. (JM)

  17. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  18. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  19. Fog computing

    OpenAIRE

    Poplštein, Karel

    2016-01-01

    The purpose of this bachelor's thesis is to address fog computing technology, which emerged as a possible solution for the requirements of the Internet of things and aims to lower latency and network bandwidth usage by moving a substantial part of computing operations to the network edge. The thesis identifies advantages as well as potential threats and analyses possible solutions to these problems, proceeding to a comparison of cloud and fog computing and specifying areas of use for both of them. Finally...

  20. Grid Computing

    Directory of Open Access Journals (Sweden)

    Amr Rekaby

    2013-02-01

    Grid computing is a new generation of distributed computing. The target of the grid paradigm is how to construct strong processing power and storage resources from many small and weak resources. Grid computing is a mesh of interconnected resources worldwide which constructs massive powerful capabilities. The user of the grid has the ability to use any (or many) of these interconnected resources in the grid to solve his problems, which cannot be solved by locally owned resource capabilities.

  1. Computational Deception

    OpenAIRE

    Nijholt, Anton; Acosta, P.S.; Cravo, P.

    2010-01-01

    In the future our daily life interactions with other people, with computers, robots and smart environments will be recorded and interpreted by computers or embedded intelligence in environments, furniture, robots, displays, and wearables. These sensors record our activities, our behaviour, and our interactions. Fusion of such information and reasoning about such information makes it possible, using computational models of human behaviour and activities, to provide context- and person-aware in...

  2. Computer Virus

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Computer viruses are small software programs that are designed to spread from one computer to another and to interfere with computer operation. A virus might delete data on your computer, use your e-mail program to spread itself to other computers, or even erase everything on your hard disk. Viruses are most easily spread by attachments in e-mail messages or instant messaging messages. That is why it is essential that you never

  3. Evolutionary Computing

    OpenAIRE

    Eiben, Aguston; Schoenauer, Marc

    2002-01-01

    Evolutionary computing (EC) is an exciting development in Computer Science. It amounts to building, applying and studying algorithms based on the Darwinian principles of natural selection. In this paper we briefly introduce the main concepts behind evolutionary computing. We present the main components of all evolutionary algorithms (EAs), sketch the differences between different types of EAs and survey application areas ranging from optimization, modeling and simulation to entertainment.

  4. Cloud Computing

    CERN Document Server

    Mirashe, Shivaji P

    2010-01-01

    Computing as you know it is about to change, your applications and documents are going to move from the desktop into the cloud. I'm talking about cloud computing, where applications and files are hosted on a "cloud" consisting of thousands of computers and servers, all linked together and accessible via the Internet. With cloud computing, everything you do is now web based instead of being desktop based. You can access all your programs and documents from any computer that's connected to the Internet. How will cloud computing change the way you work? For one thing, you're no longer tied to a single computer. You can take your work anywhere because it's always accessible via the web. In addition, cloud computing facilitates group collaboration, as all group members can access the same programs and documents from wherever they happen to be located. Cloud computing might sound far-fetched, but chances are you're already using some cloud applications. If you're using a web-based email program, such as Gmail or Ho...

  5. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Since the first idea of using GPUs for general-purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (Stream), and the new framework, OpenCL, which tries to unify the GPGPU computing models.
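
    To make the data-based parallelism concrete, below is the canonical OpenCL vector-add example expressed through the pyopencl Python bindings; it assumes a working OpenCL platform and device are installed, and it is a generic illustration rather than code from the paper:

        import numpy as np
        import pyopencl as cl

        a_np = np.random.rand(50000).astype(np.float32)
        b_np = np.random.rand(50000).astype(np.float32)

        ctx = cl.create_some_context()            # pick an OpenCL device
        queue = cl.CommandQueue(ctx)
        mf = cl.mem_flags
        a_g = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a_np)
        b_g = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b_np)
        res_g = cl.Buffer(ctx, mf.WRITE_ONLY, a_np.nbytes)

        # One work-item per array element: pure data parallelism.
        prg = cl.Program(ctx, """
        __kernel void add(__global const float *a,
                          __global const float *b,
                          __global float *res) {
            int gid = get_global_id(0);
            res[gid] = a[gid] + b[gid];
        }
        """).build()
        prg.add(queue, a_np.shape, None, a_g, b_g, res_g)

        res_np = np.empty_like(a_np)
        cl.enqueue_copy(queue, res_np, res_g)
        assert np.allclose(res_np, a_np + b_np)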

  6. I, Computer

    Science.gov (United States)

    Barack, Lauren

    2005-01-01

    What child hasn't chatted with friends through a computer? But chatting with a computer? Some Danish scientists have literally put a face on their latest software program, bringing to virtual life storyteller Hans Christian Andersen, who engages users in actual conversations. The digitized Andersen resides at the Hans Christian Andersen Museum in…

  7. Computational Science

    Institute of Scientific and Technical Information of China (English)

    K. Li

    2007-01-01

    @@ Computer science is the discipline that anchors the computer industry which has been improving processor performance, communication bandwidth and storage capacity on the so called "Moore's law" curve or at the rate of doubling every 18 to 24 months during the past decades.

  8. Computational Sustainability

    OpenAIRE

    Eaton, Eric; University of Pennsylvania; Gomes, Carla P.; Cornell University; Williams, Brian; Massachusetts Institute of Technology

    2014-01-01

    Computational sustainability problems, which exist in dynamic environments with high amounts of uncertainty, provide a variety of unique challenges to artificial intelligence research and the opportunity for significant impact upon our collective future. This editorial provides an overview of artificial intelligence for computational sustainability, and introduces this special issue of AI Magazine.

  9. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  10. Quantum Computation in Computational Geometry

    OpenAIRE

    Sadakane, Kunihiko; Sugawara, Noriko; Tokuyama, Takeshi

    2002-01-01

    We discuss applications of quantum computation to geometric data processing. These applications include problems on convex hulls, minimum enclosing balls, linear programming, and intersection problems. Technically, we apply well-known Grover’s algorithm (and its variants) combined with geometric algorithms, and no further knowledge of quantum computing is required. However, revealing these applications and emphasizing potential usefulness of quantum computation in geometric data processing wi...

  11. Granular Computing

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    The basic ideas and principles of granular computing (GrC) have been studied explicitly or implicitly in many fields in isolation. With the recent renewed and fast growing interest, it is time to extract the commonality from a diversity of fields and to study systematically and formally the domain-independent principles of granular computing in a unified model. A framework of granular computing can be established by applying its own principles. We examine such a framework from two perspectives: granular computing as structured thinking and as structured problem solving. From the philosophical perspective, or the conceptual level, granular computing focuses on structured thinking based on multiple levels of granularity. The implementation of such a philosophy at the application level deals with structured problem solving.

  12. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    With technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production ...), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  13. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-03-01

    Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed in the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and right to recognition in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved in the process of development of computer hardware and software. The practice-oriented interpretation of computational thinking which is dominant among educators is described, along with some ways of its formation. It is shown that computational thinking is a metasubject result of general education as well as its tool. From the point of view of the author, purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described. This process is connected with the evolution of computer and information technologies and with the increase in the number of tasks for whose effective solution computational thinking is required. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  14. Compute Canada: Advancing Computational Research

    International Nuclear Information System (INIS)

    High Performance Computing (HPC) is redefining the way that research is done. Compute Canada's HPC infrastructure provides a national platform that enables Canadian researchers to compete on an international scale, attracts top talent to Canadian universities and broadens the scope of research.

  15. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.
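
    The deblurring mentioned above is classically an inverse-filtering operation in the frequency domain; an optical processor performs the Fourier transforms with lenses, but the same idea is easy to state numerically. A minimal Wiener-deconvolution sketch in Python, offered as a generic illustration rather than Stroke's optical setup:

        import numpy as np

        def wiener_deblur(blurred, psf, k=1e-2):
            # Wiener deconvolution: a regularized inverse filter, the digital
            # analogue of coherent-optical image deblurring. `psf` is the blur
            # kernel zero-padded to the image shape; `k` is a noise-to-signal
            # ratio acting as regularization.
            H = np.fft.fft2(psf)
            G = np.fft.fft2(blurred)
            W = np.conj(H) / (np.abs(H) ** 2 + k)
            return np.real(np.fft.ifft2(W * G))

        # Toy usage: blur a random image with a 3x3 box kernel, then deblur.
        rng = np.random.default_rng(1)
        img = rng.random((64, 64))
        psf = np.zeros_like(img)
        psf[:3, :3] = 1.0 / 9.0
        blurred = np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(psf)))
        restored = wiener_deblur(blurred, psf)
        print(np.abs(restored - img).mean())   # small residual error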

  16. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  17. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.
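
    As a flavor of the finite-difference methods the book covers, here is a minimal Python sketch (parameters chosen only for illustration) integrating the 1D heat equation u_t = D u_xx with an explicit scheme and checking the peak against the free-space Gaussian solution:

        import numpy as np

        # Explicit finite differences for u_t = D * u_xx on x in [0, 1],
        # with u = 0 held at both boundaries.
        D, nx = 1.0, 101
        dx = 1.0 / (nx - 1)
        dt = 0.4 * dx**2 / D        # below the stability limit dt <= dx^2/(2D)
        u = np.zeros(nx)
        u[nx // 2] = 1.0 / dx       # approximate point source at x = 0.5

        steps = 400
        for _ in range(steps):
            u[1:-1] += D * dt / dx**2 * (u[2:] - 2.0 * u[1:-1] + u[:-2])

        # Peak of the analytic free-space solution is 1 / sqrt(4*pi*D*t).
        t = steps * dt
        print(u.max(), 1.0 / np.sqrt(4.0 * np.pi * D * t))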

  18. Computing methods

    CERN Document Server

    Berezin, I S

    1965-01-01

    Computing Methods, Volume 2 is a five-chapter text that presents numerical methods for solving sets of several mathematical equations. This volume covers the computation of sets of linear algebraic equations, high-degree equations and transcendental equations, numerical methods of finding eigenvalues, and approximate methods of solving ordinary differential equations, partial differential equations and integral equations. The book is intended as a textbook for students in mechanical-mathematical and physics-mathematical faculties specializing in computer mathematics, and for persons interested in the
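
    As a small taste of the methods for transcendental equations covered in such texts, here is Newton's method in Python applied to x = cos(x), a standard textbook example not taken from this book:

        import math

        def newton(f, df, x0, tol=1e-12, max_iter=50):
            # Newton iteration: x_{n+1} = x_n - f(x_n) / f'(x_n)
            x = x0
            for _ in range(max_iter):
                step = f(x) / df(x)
                x -= step
                if abs(step) < tol:
                    return x
            raise RuntimeError("Newton iteration did not converge")

        # Solve x - cos(x) = 0; the root is the Dottie number, ~0.739085.
        print(newton(lambda x: x - math.cos(x),
                     lambda x: 1.0 + math.sin(x), x0=1.0))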

  19. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  20. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study the problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has unbounded computing power. The thesis is based on two ... impossibility result indicating that a similar equivalence does not hold for Multiparty Computation (MPC): we show that even if protocols are given black-box access for free to an idealized secret sharing scheme secure for the access structure in question, it is not possible to handle all relevant access ... here and discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and the proofs of their security are given. [1] Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority. In...
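
    The flavor of such protocols is easiest to see in the simplest (non-verifiable, honest-but-curious) building block: additive secret sharing, where any n-1 of n shares reveal nothing and addition can be done share-by-share. A toy Python sketch, not the thesis's VSS protocol:

        import secrets

        P = 2**127 - 1  # public prime modulus; all arithmetic is mod P

        def share(secret, n):
            # Any n-1 shares are uniformly random and reveal nothing.
            shares = [secrets.randbelow(P) for _ in range(n - 1)]
            shares.append((secret - sum(shares)) % P)
            return shares

        def reconstruct(shares):
            return sum(shares) % P

        # Each party adds its shares locally; reconstructing the sums yields
        # x + y without any single party learning x or y.
        x, y = 41, 1
        xs, ys = share(x, 3), share(y, 3)
        zs = [(a + b) % P for a, b in zip(xs, ys)]
        print(reconstruct(zs))  # -> 42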

  1. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1.) an intermediary step between any theoretical construct and its targeted empirical space and 2.) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project we also seek to introduce to scholars of religion some...

  2. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of the state of the art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  3. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and of modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and to a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials, with and without the presence of gases. Computational chemistry has application in studying catalysis and properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  4. Essentials of cloud computing

    CERN Document Server

    Chandrasekaran, K

    2014-01-01

    Foreword. Preface. Computing Paradigms: Learning Objectives; Preamble; High-Performance Computing; Parallel Computing; Distributed Computing; Cluster Computing; Grid Computing; Cloud Computing; Biocomputing; Mobile Computing; Quantum Computing; Optical Computing; Nanocomputing; Network Computing; Summary; Review Points; Review Questions; Further Reading. Cloud Computing Fundamentals: Learning Objectives; Preamble; Motivation for Cloud Computing; The Need for Cloud Computing; Defining Cloud Computing; NIST Definition of Cloud Computing; Cloud Computing Is a Service; Cloud Computing Is a Platform; 5-4-3 Principles of Cloud Computing; Five Essential Charact...

  5. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  6. Computational mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Goudreau, G.L.

    1993-03-01

    The Computational Mechanics thrust area sponsors research into the underlying solid, structural and fluid mechanics and heat transfer necessary for the development of state-of-the-art general purpose computational software. The scale of computational capability spans office workstations, departmental computer servers, and Cray-class supercomputers. The DYNA, NIKE, and TOPAZ codes have achieved world fame through our broad collaborators program, in addition to their strong support of on-going Lawrence Livermore National Laboratory (LLNL) programs. Several technology transfer initiatives have been based on these established codes, teaming LLNL analysts and researchers with counterparts in industry, extending code capability to specific industrial interests of casting, metalforming, and automobile crash dynamics. The next-generation solid/structural mechanics code, ParaDyn, is targeted toward massively parallel computers, which will extend performance from gigaflop to teraflop power. Our work for FY-92 is described in the following eight articles: (1) Solution Strategies: New Approaches for Strongly Nonlinear Quasistatic Problems Using DYNA3D; (2) Enhanced Enforcement of Mechanical Contact: The Method of Augmented Lagrangians; (3) ParaDyn: New Generation Solid/Structural Mechanics Codes for Massively Parallel Processors; (4) Composite Damage Modeling; (5) HYDRA: A Parallel/Vector Flow Solver for Three-Dimensional, Transient, Incompressible Viscous Flow; (6) Development and Testing of the TRIM3D Radiation Heat Transfer Code; (7) A Methodology for Calculating the Seismic Response of Critical Structures; and (8) Reinforced Concrete Damage Modeling.

  7. [DNA computing].

    Science.gov (United States)

    Błasiak, Janusz; Krasiński, Tadeusz; Popławski, Tomasz; Sakowski, Sebastian

    2011-01-01

    Biocomputers can be an alternative to traditional "silicon-based" computers, whose continued development may be limited by further miniaturization (imposed by the Heisenberg uncertainty principle) and by the growing volume of information that must pass between the central processing unit and the main memory (the von Neumann bottleneck). The idea of DNA computing came true for the first time in 1994, when Adleman solved the Hamiltonian path problem using short DNA oligomers and DNA ligase. In the early 2000s a series of biocomputer models was presented, with a seminal work by Shapiro and his colleagues, who presented a molecular two-state finite automaton in which the restriction enzyme FokI constituted the hardware and short DNA oligomers served as the software as well as the input/output signals. DNA molecules also provided energy for this machine. DNA computing can be exploited in many applications, from studies of gene expression patterns to the diagnosis and therapy of cancer. Research on DNA computing is still in progress both in vitro and in vivo, and its promising results allow hope for a breakthrough in computer science. PMID:21735816
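
    Shapiro's machine is conceptually a two-state finite automaton. As a purely illustrative sketch of what such an automaton computes (ordinary Python rather than molecular hardware; the states, alphabet, and transition table below are invented for illustration), consider:

      # Toy two-state finite automaton, conceptually analogous to the
      # Shapiro-style molecular automaton described above. In the molecular
      # machine, FokI cleavage plays the role of the transition function and
      # DNA oligomers encode the input and the current state.
      TRANSITIONS = {                 # (state, symbol) -> next state
          ("S0", "a"): "S1",
          ("S0", "b"): "S0",
          ("S1", "a"): "S0",
          ("S1", "b"): "S1",
      }

      def run(tape, state="S0"):
          """Process the input symbols one by one and return the final state."""
          for symbol in tape:
              state = TRANSITIONS[(state, symbol)]
          return state

      # This particular table ends in S1 exactly when the input contains an
      # odd number of 'a' symbols.
      print(run("abaab"))             # -> S1 (three 'a's: odd)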

  8. Computational mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Raboin, P J

    1998-01-01

    The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for "Springback Predictability" and with the Federal Aviation Administration (FAA) for the "Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris." In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

  9. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope are hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  10. Computer busses

    CERN Document Server

    Buchanan, William

    2000-01-01

    As more and more equipment is interface- or 'bus'-driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for the particular application. There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed using a top-down approach, helpin

  11. COMPUTATIONAL THINKING

    OpenAIRE

    Evgeniy K. Khenner

    2016-01-01

    Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed in the foreign scientific and educational literature over the last decade, and to substantiate its importance, practical utility, and its right to a place in Russian education. Methods. The research is based on an analysis of foreign studies of the phenomenon of computational thinking and of the ways it is formed in the process of education;...

  12. Computer systems

    Science.gov (United States)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  13. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  14. Computer viruses

    Science.gov (United States)

    Denning, Peter J.

    1988-01-01

    The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs, and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.

  15. Reconfigurable Computing

    CERN Document Server

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a comp

  16. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  17. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). The book illustrates Riemannian computing theory through applications in computer vision, machine learning, and robotics, and emphasizes algorithmic advances that will allow re-application in other...
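
    One concrete instance of such geometric computation is the affine-invariant geodesic distance between symmetric positive-definite (SPD) matrices, d(A, B) = ||logm(A^(-1/2) B A^(-1/2))||_F, used for example with diffusion tensors. A minimal NumPy/SciPy sketch (my illustration, not code from the book):

      # Minimal sketch (not from the book): affine-invariant geodesic distance
      # between symmetric positive-definite (SPD) matrices,
      #     d(A, B) = || logm(A^(-1/2) B A^(-1/2)) ||_F,
      # as used e.g. for diffusion tensors in biomedical image analysis.
      import numpy as np
      from scipy.linalg import sqrtm, logm

      def spd_distance(A, B):
          """Geodesic distance between SPD matrices A and B."""
          A_inv_sqrt = np.linalg.inv(sqrtm(A))
          M = A_inv_sqrt @ B @ A_inv_sqrt   # congruence: whitens A to identity
          return float(np.linalg.norm(logm(M), "fro"))

      A = np.array([[2.0, 0.3], [0.3, 1.0]])
      B = np.array([[1.5, -0.2], [-0.2, 2.5]])
      print(spd_distance(A, B))   # > 0
      print(spd_distance(A, A))   # ~ 0 (a point is at distance 0 from itself)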

  18. Computational Logistics

    DEFF Research Database (Denmark)

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  19. [Grid computing

    CERN Multimedia

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).

  20. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

    One of the major challenges in today's post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these sensitivities have been determined using bump-and-revalue, but the increasing magnitude of these computations does...

  1. Computational Logistics

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  2. Computing News

    CERN Multimedia

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...

  3. Computational trigonometry

    Energy Technology Data Exchange (ETDEWEB)

    Gustafson, K. [Univ. of Colorado, Boulder, CO (United States)]

    1994-12-31

    By means of the author's earlier theory of antieigenvalues and antieigenvectors, a new computational approach to iterative methods is presented. This enables an explicit trigonometric understanding of iterative convergence and provides new insights into the sharpness of error bounds. Direct applications to Gradient descent, Conjugate gradient, GCR(k), Orthomin, CGN, GMRES, CGS, and other matrix iterative schemes will be given.
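
    For a symmetric positive-definite matrix A, Gustafson's first antieigenvalue is mu_1(A) = min over nonzero x of <Ax, x> / (||Ax|| ||x||), with the closed form 2*sqrt(lambda_min*lambda_max)/(lambda_min + lambda_max); the associated angle is the trigonometric quantity behind the convergence bounds mentioned above. A small numerical check (my sketch, not code from the paper):

      # Sketch (not from the paper): for a symmetric positive-definite A,
      # the first antieigenvalue
      #     mu_1(A) = min_x <Ax, x> / (||Ax|| ||x||)
      #             = 2*sqrt(l_min*l_max) / (l_min + l_max)
      # is the cosine of the largest angle by which A can turn a vector.
      import numpy as np

      rng = np.random.default_rng(0)
      M = rng.standard_normal((4, 4))
      A = M @ M.T + 4 * np.eye(4)      # random SPD test matrix

      l = np.linalg.eigvalsh(A)        # eigenvalues in ascending order
      closed_form = 2 * np.sqrt(l[0] * l[-1]) / (l[0] + l[-1])

      # Brute-force the minimum of cos(angle between x and Ax) over random x.
      xs = rng.standard_normal((200000, 4))
      Ax = xs @ A                      # A is symmetric, so A.T == A
      cos = np.einsum("ij,ij->i", xs, Ax) / (
          np.linalg.norm(xs, axis=1) * np.linalg.norm(Ax, axis=1))
      print(closed_form, cos.min())    # the two values should nearly agree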

  4. Computational Logistics

    DEFF Research Database (Denmark)

    Jensen, Rune Møller; Pacino, Dario; Voß, Stefan

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  5. Amorphous Computing

    Science.gov (United States)

    Sussman, Gerald

    2002-03-01

    Digital computers have always been constructed to behave as precise arrangements of reliable parts, and our techniques for organizing computations depend upon this precision and reliability. Two emerging technologies, however, are beginning to undercut these assumptions about constructing and programming computers. These technologies -- microfabrication and bioengineering -- will make it possible to assemble systems composed of myriad information-processing units at almost no cost, provided: 1) that not all the units need to work correctly; and 2) that there is no need to manufacture precise geometrical arrangements or interconnection patterns among them. Microelectronic mechanical components are becoming so inexpensive to manufacture that we can anticipate combining logic circuits, microsensors, actuators, and communications devices integrated on the same chip to produce particles that could be mixed with bulk materials, such as paints, gels, and concrete. Imagine coating bridges or buildings with smart paint that can sense and report on traffic and wind loads and monitor the structural integrity of the bridge. A smart paint coating on a wall could sense vibrations, monitor the premises for intruders, or cancel noise. Even more striking, there has been such astounding progress in understanding the biochemical mechanisms in individual cells that it appears we'll be able to harness these mechanisms to construct digital-logic circuits. Imagine a discipline of cellular engineering that could tailor-make biological cells that function as sensors and actuators, as programmable delivery vehicles for pharmaceuticals, as chemical factories for the assembly of nanoscale structures. Fabricating such systems seems to be within our reach, even if it is not yet within our grasp. Fabrication, however, is only part of the story. We can envision producing vast quantities of individual computing elements, whether microfabricated particles, engineered cells, or macromolecular computing

  6. Computational Electromagnetics

    CERN Document Server

    Rylander, Thomas; Bondeson, Anders

    2013-01-01

    Computational Electromagnetics is a young and growing discipline, expanding as a result of the steadily increasing demand for software for the design and analysis of electrical devices. This book introduces three of the most popular numerical methods for simulating electromagnetic fields: the finite difference method, the finite element method and the method of moments. In particular it focuses on how these methods are used to obtain valid approximations to the solutions of Maxwell's equations, using, for example, "staggered grids" and "edge elements." The main goal of the book is to make the reader aware of different sources of errors in numerical computations, and also to provide the tools for assessing the accuracy of numerical methods and their solutions. To reach this goal, convergence analysis, extrapolation, von Neumann stability analysis, and dispersion analysis are introduced and used frequently throughout the book. Another major goal of the book is to provide students with enough practical understan...
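
    To make the "staggered grids" idea concrete, here is a minimal one-dimensional finite-difference time-domain sketch on a Yee-style staggered grid (my illustration, not code from the book): E and H are sampled at interleaved points and updated leapfrog in time, with stability governed by the Courant number.

      # Minimal 1D FDTD sketch on a staggered ("Yee") grid -- illustration
      # only, not code from the book. E lives at integer grid points, H at
      # half-integer points; the two fields are updated leapfrog in time.
      # Units are normalized so the Courant number S = c*dt/dx appears
      # directly; S <= 1 is the von Neumann stability condition in 1D.
      import numpy as np

      n = 200                      # number of E sample points
      S = 0.5                      # Courant number
      E = np.zeros(n)              # grid ends act as simple reflecting walls
      H = np.zeros(n - 1)

      for step in range(300):
          H += S * (E[1:] - E[:-1])            # update H from the curl of E
          E[1:-1] += S * (H[1:] - H[:-1])      # update E from the curl of H
          E[100] += np.exp(-((step - 30) / 10.0) ** 2)   # soft Gaussian source

      print(np.abs(E).max())       # a pulse has propagated out from the source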

  7. Computational Combustion

    Energy Technology Data Exchange (ETDEWEB)

    Westbrook, C K; Mizobuchi, Y; Poinsot, T J; Smith, P J; Warnatz, J

    2004-08-26

    Progress in the field of computational combustion over the past 50 years is reviewed. Particular attention is given to those classes of models that are common to most system modeling efforts, including fluid dynamics, chemical kinetics, liquid sprays, and turbulent flame models. The developments in combustion modeling are placed into the time-dependent context of the accompanying exponential growth in computer capabilities and Moore's Law. Superimposed on this steady growth, the occasional sudden advances in modeling capabilities are identified and their impacts are discussed. Integration of submodels into system models for spark ignition, diesel and homogeneous charge, compression ignition engines, surface and catalytic combustion, pulse combustion, and detonations are described. Finally, the current state of combustion modeling is illustrated by descriptions of a very large jet lifted 3D turbulent hydrogen flame with direct numerical simulation and 3D large eddy simulations of practical gas burner combustion devices.

  8. Egalitarian computing

    OpenAIRE

    Biryukov, Alex; Khovratovich, Dmitry

    2016-01-01

    In this paper we explore several contexts where an adversary has an upper hand over the defender by using special hardware in an attack. These include password processing, hard-drive protection, cryptocurrency mining, resource sharing, code obfuscation, etc. We suggest memory-hard computing as a generic paradigm, where every task is amalgamated with a certain procedure requiring intensive access to RAM both in terms of size and (very importantly) bandwidth, so that transferring the com...
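
    As a toy illustration of the memory-hard idea (a sketch of a scrypt-like fill-then-mix pattern, not the construction proposed in the paper), the function below forces both a large buffer and many data-dependent random accesses to it:

      # Toy memory-hard function in the spirit described above -- a
      # scrypt-like "fill, then randomly mix" pattern. Illustrative sketch
      # only, not the construction from the paper.
      import hashlib

      def memory_hard_hash(password: bytes, n_blocks: int = 1 << 15) -> bytes:
          # Phase 1: fill a large buffer with a sequential hash chain.
          block = hashlib.sha256(password).digest()
          buf = []
          for _ in range(n_blocks):
              block = hashlib.sha256(block).digest()
              buf.append(block)
          # Phase 2: data-dependent random accesses over the whole buffer;
          # this is what makes low-memory shortcuts expensive in both RAM
          # size and bandwidth.
          state = block
          for _ in range(n_blocks):
              j = int.from_bytes(state[:4], "big") % n_blocks
              state = hashlib.sha256(state + buf[j]).digest()
          return state

      print(memory_hard_hash(b"correct horse").hex())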

  9. LHCb computing

    CERN Document Server

    Corti, G

    2001-01-01

    The LHCb computing model is strongly influenced by the copious amount of data that will be produced by the experiment and by the time to process them. At present the experiment is in the process of migrating its software applications to Object Oriented technology. The software strategy of the experiment is to have an architecture and a framework built on independent components on which to base all experimental data processing applications. (3 refs).

  10. Computational Thinking

    OpenAIRE

    Bottino, Rosa; Chioccariello, Augusto

    2015-01-01

    Digital technology has radically changed the way people work in industry, finance, services, media and commerce. Informatics has contributed to the scientific and technological development of our society in general and to the digital revolution in particular. Computational thinking is the term indicating the key ideas of this discipline that might be included in the key competencies underlying the curriculum of compulsory education. The educational potential of informatics h...

  11. Computational universes

    OpenAIRE

    Svozil, Karl

    2003-01-01

    Suspicions that the world might be some sort of a machine or algorithm existing ``in the mind'' of some symbolic number cruncher have lingered from antiquity. Although popular at times, the most radical forms of this idea never reached mainstream. Modern developments in physics and computer science have lent support to the thesis, but empirical evidence is needed before it can begin to replace our contemporary world view.

  12. Computability theory

    OpenAIRE

    Zimmermann, Karl-Heinz

    2011-01-01

    Why do we need a formalization of the notion of algorithm or effective computation? In order to show that a specific problem is algorithmically solvable, it is sufficient to provide an algorithm that solves it in a sufficiently precise manner. However, in order to prove that a problem is in principle not solvable by an algorithm, a rigorous formalism is necessary that allows mathematical proofs. The need for such a formalism became apparent in the works of David Hilbert (1900) on the foundati...

  13. Everything Computes

    Institute of Scientific and Technical Information of China (English)

    Bill; Hofmann

    1999-01-01

    Dear American Professor, I am a student in Beijing. At the beginning of last semester, we four roommates gathered some 10,000 yuan (a big sum here, approximately 1,150 USD) and bought a computer, which is our joint property. Since the computer came into our room, it has been used round the clock except for the time we were having classes. So even at midnight, when I woke up from a dream, I could still see

  14. Topics in Chemical Instrumentation: CII. Automated Anodic Stripping Voltammetry.

    Science.gov (United States)

    Stock, John T.; Ewing, Galen W., Ed.

    1980-01-01

    Presents details of anodic stripping voltammetry (ASV) in college chemistry laboratory experiments. Provides block diagrams of the analyzer system, circuitry and power supplies of the automated stripping analyzer, and instructions for implementing microcomputer control of the ASV. (CS)

  15. CI, CII, and CO as tracers of gas phase carbon

    Science.gov (United States)

    Keene, Jocelyn

    1990-01-01

    In the dense interstellar medium, we find that about 20 percent of the total carbon abundance is in the form of CO, about 3 percent in C(sub I), and about 10 percent in C(sub II), with uncertainties of factors of order 2. The abundance of other forms of gaseous carbon is negligible. CO is widespread throughout molecular clouds, as is C(sub I). C(sub II) has so far been observed only near bright star-formation regions because of its high excitation energy. Further from ultraviolet sources it may be less abundant. Altogether we have accounted for about 1/3 of the total carbon abundance associated with dense molecular clouds. Since the other gaseous forms are thought to have negligible abundances, the rest of the carbon is probably in solid form.
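
    A quick consistency check of these percentages (my arithmetic, not text from the record): 0.20 (CO) + 0.03 (CI) + 0.10 (CII) = 0.33, or about 1/3, which matches the closing statement that roughly one third of the total carbon abundance has been accounted for.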

  16. Bacteria as computers making computers

    OpenAIRE

    Danchin, Antoine

    2008-01-01

    Various efforts to integrate biological knowledge into networks of interactions have produced a lively microbial systems biology. Putting molecular biology and computer sciences in perspective, we review another trend in systems biology, in which recursivity and information replace the usual concepts of differential equations, feedback and feedforward loops and the like. Noting that the processes of gene expression separate the genome from the cell machinery, we analyse the role of the separa...

  17. Customizable computing

    CERN Document Server

    Chen, Yu-Ting; Gill, Michael; Reinman, Glenn; Xiao, Bingjun

    2015-01-01

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory

  18. Computer vision

    Science.gov (United States)

    Gennery, D.; Cunningham, R.; Saund, E.; High, J.; Ruoff, C.

    1981-01-01

    The field of computer vision is surveyed and assessed, key research issues are identified, and possibilities for a future vision system are discussed. The problems of descriptions of two and three dimensional worlds are discussed. The representation of such features as texture, edges, curves, and corners are detailed. Recognition methods are described in which cross correlation coefficients are maximized or numerical values for a set of features are measured. Object tracking is discussed in terms of the robust matching algorithms that must be devised. Stereo vision, camera control and calibration, and the hardware and systems architecture are discussed.

  19. Computational crystallization.

    Science.gov (United States)

    Altan, Irem; Charbonneau, Patrick; Snell, Edward H

    2016-07-15

    Crystallization is a key step in macromolecular structure determination by crystallography. While a robust theoretical treatment of the process is available, due to the complexity of the system, the experimental process is still largely one of trial and error. In this article, efforts in the field are discussed together with a theoretical underpinning using a solubility phase diagram. Prior knowledge has been used to develop tools that computationally predict the crystallization outcome and define mutational approaches that enhance the likelihood of crystallization. For the most part these tools are based on binary outcomes (crystal or no crystal), and the full information contained in an assembly of crystallization screening experiments is lost. The potential of this additional information is illustrated by examples where new biological knowledge can be obtained and where a target can be sub-categorized to predict which class of reagents provides the crystallization driving force. Computational analysis of crystallization requires complete and correctly formatted data. While massive crystallization screening efforts are under way, the data available from many of these studies are sparse. The potential for this data and the steps needed to realize this potential are discussed. PMID:26792536

  20. Social Computing

    CERN Document Server

    CERN. Geneva

    2011-01-01

    The past decade has witnessed a momentous transformation in the way people interact with each other. Content is now co-produced, shared, classified, and rated by millions of people, while attention has become the ephemeral and valuable resource that everyone seeks to acquire. This talk will describe how social attention determines the production and consumption of content within both the scientific community and social media, how its dynamics can be used to predict the future and the role that social media plays in setting the public agenda. About the speaker Bernardo Huberman is a Senior HP Fellow and Director of the Social Computing Lab at Hewlett Packard Laboratories. He received his Ph.D. in Physics from the University of Pennsylvania, and is currently a Consulting Professor in the Department of Applied Physics at Stanford University. He originally worked in condensed matter physics, ranging from superionic conductors to two-dimensional superfluids, and made contributions to the theory of critical p...

  1. Computer Tree

    Directory of Open Access Journals (Sweden)

    Onur AĞAOĞLU

    2014-12-01

    It is crucial that gifted and talented students be supported by educational methods suited to their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled "Computer Tree" serves to identify learner readiness levels and to define the basic conceptual framework. A language teacher also contributes to the process, since the lesson caters for the creative function of basic linguistic skills. The teaching technique is applied at the 9-11 age level. The lesson introduces an evaluation process covering the basic information, skills, and interests of the target group. Furthermore, it includes an observation process by way of peer assessment. The lesson is considered a good sample of planning for any subject, given the unpredicted convergence of visual and technical abilities with linguistic abilities.

  2. Computed Tomography (CT) -- Sinuses

    Medline Plus

    What is CT (Computed Tomography) of the Sinuses? Computed tomography, more commonly known ...

  3. Computed Tomography (CT) -- Sinuses

    Science.gov (United States)

    Computed tomography (CT) of the sinuses ... Computed tomography, more commonly known ...

  4. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... Physician Resources Professions Site Index A-Z Computed Tomography (CT) - Sinuses Computed tomography (CT) of the sinuses ... CT of the Sinuses? What is CT (Computed Tomography) of the Sinuses? Computed tomography, more commonly known ...

  5. Study of Quantum Computing

    Directory of Open Access Journals (Sweden)

    Prashant Anil Patil

    2012-04-01

    This paper gives detailed information about quantum computers and the differences between quantum computers and traditional computers, covering the basics of quantum computers, which are slightly similar to but still different from traditional computers. Many research groups are working towards the highly technological goal of building a quantum computer, which would dramatically improve computational power for particular tasks. Quantum computers are very useful for computation in science and research: large amounts of data and information can be computed, processed, stored, retrieved, transmitted, and displayed in less time and with an accuracy that traditional computers cannot provide.

  6. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  7. Irreversible computable functions

    OpenAIRE

    Hoyrup, Mathieu

    2014-01-01

    The strong relationship between topology and computations has played a central role in the development of several branches of theoretical computer science: foundations of functional programming, computational geometry, computability theory, computable analysis. Often it happens that a given function is not computable simply because it is not continuous. In many cases, the function can moreover be proved to be non-computable in the stronger sense that it does not preserve computability: it map...

  8. HIGH PERFORMANCE COMPUTING APPLIED TO CLOUD COMPUTING

    OpenAIRE

    Li, Luxingzi

    2015-01-01

    The purpose of this thesis was to introduce high performance computing and cloud computing. The purpose was also to describe how to apply high performance computing to cloud computing as well as its possibilities and challenges. There were two case studies in the thesis project to present the application of cloud computing. Both quantitative and qualitative methods were used in this research. The majority of materials were from books and Internet resources. The thesis may be us...

  9. Unconventional Quantum Computing Devices

    OpenAIRE

    Lloyd, Seth

    2000-01-01

    This paper investigates a variety of unconventional quantum computation devices, including fermionic quantum computers and computers that exploit nonlinear quantum mechanics. It is shown that unconventional quantum computing devices can in principle compute some quantities more rapidly than `conventional' quantum computers.

  10. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general-purpose computers, high in terms of both dollar expenditure and elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  11. Distributed Computing: An Overview

    Directory of Open Access Journals (Sweden)

    Md. Firoj Ali

    2015-07-01

    Decrease in hardware costs and advances in computer networking technologies have led to increased interest in the use of large-scale parallel and distributed computing systems. Distributed computing systems offer the potential for improved performance and resource sharing. This paper gives an overview of distributed computing: the difference between parallel and distributed computing, the terminologies used in distributed computing, task allocation and performance parameters in distributed computing systems, parallel distributed algorithm models, and the advantages and scope of distributed computing.

  12. Computability and Non-computability Issues in Amorphous Computing

    Czech Academy of Sciences Publication Activity Database

    Wiedermann, Jiří

    Berlin: Springer, 2012 - (Baeten, J.; Ball, T.; de Boer, F.), s. 1-9. (Lecture Notes in Computer Science. 7604). ISBN 978-3-642-33474-0. ISSN 0302-9743. [TCS 2012. IFIP TC 1/WG 2.2 International Conference /7./. Amsterdam (NL), 26.09.2012-28.09.2012] R&D Projects: GA ČR GAP202/10/1333 Institutional support: RVO:67985807 Keywords : amorphous computing * computability * non-computability * molecular communication Subject RIV: IN - Informatics, Computer Science

  13. A semi-empirical model for the M star GJ832 using modeling tools developed for computing semi-empirical solar models

    Science.gov (United States)

    Linsky, Jeffrey; Fontenla, Juan; France, Kevin

    2016-05-01

    We present a semi-empirical model of the photosphere, chromosphere, transition region, and corona for the M2 dwarf star GJ832, which hosts two exoplanets. The atmospheric model uses a modification of the Solar Radiation Physical Modeling tools developed by Fontenla and collaborators. These computer codes model non-LTE spectral line formation for 52 atoms and ions and include a large number of lines from 20 abundant diatomic molecules that are present in the much cooler photosphere and chromosphere of this star. We constructed the temperature distribution to fit Hubble Space Telescope observations of chromospheric lines (e.g., MgII), transition region lines (CII, CIV, SiIV, and NV), and the UV continuum. Temperatures in the coronal portion of the model are consistent with ROSAT and XMM-Newton X-ray observations and the FeXII 124.2 nm line. The excellent fit of the model to the data demonstrates that the highly developed model atmosphere code developed to explain regions of the solar atmosphere with different activity levels has wide applicability to stars, including this M star with an effective temperature 2200 K cooler than the Sun. We describe similarities and differences between the M star model and models of the quiet and active Sun.

  14. Applied Parallel Computing Industrial Computation and Optimization

    DEFF Research Database (Denmark)

    Madsen, Kaj; Olesen, Dorte

    Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96).

  15. BONFIRE: benchmarking computers and computer networks

    OpenAIRE

    Bouckaert, Stefan; Vanhie-Van Gerwen, Jono; Moerman, Ingrid; Phillips, Stephen; Wilander, Jerker

    2011-01-01

    The benchmarking concept is not new in the field of computing or computer networking. With “benchmarking tools”, one usually refers to a program or set of programs, used to evaluate the performance of a solution under certain reference conditions, relative to the performance of another solution. Since the 1970s, benchmarking techniques have been used to measure the performance of computers and computer networks. Benchmarking of applications and virtual machines in an Infrastructure-as-a-Servi...

  16. Typologies of Computation and Computational Models

    OpenAIRE

    Burgin, Mark; Dodig-Crnkovic, Gordana

    2013-01-01

    We need much better understanding of information processing and computation as its primary form. Future progress of new computational devices capable of dealing with problems of big data, internet of things, semantic web, cognitive robotics and neuroinformatics depends on the adequate models of computation. In this article we first present the current state of the art through systematization of existing models and mechanisms, and outline basic structural framework of computation. We argue tha...

  17. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive coverage of the principles and aspects of computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, the development of processors and storage systems, and the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  18. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  19. Computational thinking and thinking about computing

    OpenAIRE

    Wing, Jeannette M.

    2008-01-01

    Computational thinking will influence everyone in every field of endeavour. This vision poses a new educational challenge for our society, especially for our children. In thinking about computing, we need to be attuned to the three drivers of our field: science, technology and society. Accelerating technological advances and monumental societal demands force us to revisit the most basic scientific questions of computing.

  20. Computational Methods for Simulating Quantum Computers

    NARCIS (Netherlands)

    Raedt, H. De; Michielsen, K.

    2006-01-01

    This review gives a survey of numerical algorithms and software to simulate quantum computers. It covers the basic concepts of quantum computation and quantum algorithms and includes a few examples that illustrate the use of simulation software for ideal and physical models of quantum computers.

  1. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

    This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference ACS-2014. From its beginning, the Advanced Computer Systems conference concentrated on methods and algorithms of artificial intelligence. Later years brought new areas of interest in technical informatics related to soft computing and more technological aspects of computer science, such as multimedia and computer graphics, software engineering, web systems, information security and safety, and project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security, and Software Technologies.

  2. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  3. Computer Viruses: An Overview.

    Science.gov (United States)

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  4. Cloud Computing (4)

    Institute of Scientific and Technical Information of China (English)

    Wang Bai; Xu Liutong

    2010-01-01

    Case Study: Cloud computing is still a new phenomenon. Although many IT giants are developing their own cloud computing infrastructures, platforms, software, and services, few have really succeeded in becoming cloud computing providers.

  5. Computed Tomography (CT) -- Head

    Science.gov (United States)

    Computed tomography (CT) of the head uses special x-ray ... Computed tomography, more commonly known as a CT or CAT ...

  6. Computing technology in the 1980's. [computers

    Science.gov (United States)

    Stone, H. S.

    1978-01-01

    Advances in computing technology have been led by consistently improving semiconductor technology. The semiconductor industry has turned out ever faster, smaller, and less expensive devices since transistorized computers were first introduced 20 years ago. For the next decade, there appear to be new advances possible, with the rate of introduction of improved devices at least equal to the historic trends. The implication of these projections is that computers will enter new markets and will truly be pervasive in business, home, and factory as their cost diminishes and their computational power expands to new levels. The computer industry as we know it today will be greatly altered in the next decade, primarily because the raw computer system will give way to computer-based turn-key information and control systems.

  7. A Review on Modern Distributed Computing Paradigms: Cloud Computing, Jungle Computing and Fog Computing

    OpenAIRE

    Hajibaba, Majid; Gorgin, Saeid

    2014-01-01

    Distributed computing attempts to improve performance on large-scale computing problems through resource sharing. Moreover, rising low-cost computing power, coupled with advances in communications and networking and the advent of big data, now enables new distributed computing paradigms such as Cloud, Jungle, and Fog computing. Cloud computing brings a number of advantages to consumers in terms of accessibility and elasticity. It is based on centralization of resources that possess huge processing po...

  8. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
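
    A minimal sketch of the routing idea (my illustration; the node names and topologies below are invented, not taken from the patent text): when the path through the first network would cross the identified defective link, the message is routed through the second, independent network instead.

      # Illustrative sketch of the fault-administration idea above: two
      # independent networks connect the same compute nodes; if the route in
      # network 1 would use the identified defective link, fall back to a
      # path through network 2. Topologies here are invented for the demo.
      from collections import deque

      def bfs_path(adj, src, dst):
          """Shortest path in an adjacency-list graph, or None."""
          prev, frontier = {src: None}, deque([src])
          while frontier:
              u = frontier.popleft()
              if u == dst:
                  path = []
                  while u is not None:
                      path.append(u)
                      u = prev[u]
                  return path[::-1]
              for v in adj.get(u, ()):
                  if v not in prev:
                      prev[v] = u
                      frontier.append(v)
          return None

      net1 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}   # link 1-2 will fail
      net2 = {0: [2], 2: [0, 3], 3: [2, 1], 1: [3]}   # independent wiring
      defective = {(1, 2), (2, 1)}                     # identified bad link

      def route(src, dst):
          path = bfs_path(net1, src, dst)
          uses_bad = path and any((a, b) in defective
                                  for a, b in zip(path, path[1:]))
          return path if path and not uses_bad else bfs_path(net2, src, dst)

      print(route(0, 3))   # avoids the defective 1-2 link via network 2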

  9. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  10. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance Computing. The ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  11. Cloud Computing (1)

    Institute of Scientific and Technical Information of China (English)

    Wang Bai; Xu Liutong

    2010-01-01

    Editor's Desk: Cloud computing is a topic of intense interest in the Internet field. Major IT giants have launched their own cloud computing products. This four-part lecture series will discuss cloud computing technology in the following aspects: The first part provides a brief description of the origin and characteristics of cloud computing from the user's point of view; the other parts introduce typical applications of cloud computing, technically analyze the specific content within the cloud, its components, architecture and computational paradigm, compare cloud computing to other distributed computing technologies, and discuss its successful cases, commercial models, related technical and economic issues, and development trends.

  12. Cloud Computing (2)

    Institute of Scientific and Technical Information of China (English)

    Wang Bai; Xu Liutong

    2010-01-01

    Editor's Desk: Cloud computing is a topic of intense interest in the Internet field. Major IT giants have launched their own cloud computing products. This four-part lecture series discusses cloud computing technology in the following aspects: The first part provided a brief description of the origin and characteristics of cloud computing from the user's point of view; the other parts introduce typical applications of cloud computing, technically analyze the specific content within the cloud, its components, architecture and computational paradigm, compare cloud computing to other distributed computing technologies, and discuss its successful cases, commercial models, related technical and economic issues, and development trends.

  13. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

    Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why they are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly that it is very hard even for professionals to stay up to date. Computer people do not

  14. Undergraduate computational physics projects on quantum computing

    Science.gov (United States)

    Candela, D.

    2015-08-01

    Computational projects on quantum computing suitable for students in a junior-level quantum mechanics course are described. In these projects students write their own programs to simulate quantum computers. Knowledge is assumed of introductory quantum mechanics through the properties of spin 1/2. Initial, more easily programmed projects treat the basics of quantum computation, quantum gates, and Grover's quantum search algorithm. These are followed by more advanced projects to increase the number of qubits and implement Shor's quantum factoring algorithm. The projects can be run on a typical laptop or desktop computer, using most programming languages. Supplementing resources available elsewhere, the projects are presented here in a self-contained format especially suitable for a short computational module for physics students.
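
    As a taste of what such student-written simulators look like, here is a minimal state-vector sketch (my own illustration, not code from the article) that prepares a Bell state on a two-qubit register:

      # Minimal state-vector sketch in the spirit of the projects described
      # above (my illustration, not code from the article): a two-qubit
      # register is a length-4 complex vector, and gates act on it via
      # Kronecker products.
      import numpy as np

      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
      I = np.eye(2)
      CNOT = np.array([[1, 0, 0, 0],                 # control = qubit 0,
                       [0, 1, 0, 0],                 # target  = qubit 1,
                       [0, 0, 0, 1],                 # basis order |00>,|01>,
                       [0, 0, 1, 0]])                # |10>,|11>

      state = np.zeros(4, dtype=complex)
      state[0] = 1.0                    # start in |00>

      state = np.kron(H, I) @ state     # Hadamard on qubit 0
      state = CNOT @ state              # entangle: result is a Bell state

      probs = np.abs(state) ** 2
      print(probs)   # ~[0.5, 0, 0, 0.5]: |00> and |11> equally likely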

  15. Understanding Student Computational Thinking with Computational Modeling

    OpenAIRE

    Aiken, John M.; Caballero, Marcos D.; Douglas, Scott S.; Burk, John B.; Scanlon, Erin M.; Thoms, Brian D.; Schatz, Michael F.

    2012-01-01

    Recently, the National Research Council's framework for next generation science standards highlighted "computational thinking" as one of its "fundamental practices". 9th Grade students taking a physics course that employed the Modeling Instruction curriculum were taught to construct computational models of physical systems. Student computational thinking was assessed using a proctored programming assignment, written essay, and a series of think-aloud interviews, where the students produced an...

  16. Serious computer games in computer science education

    OpenAIRE

    Jože Rugelj

    2016-01-01

    The role and importance of serious computer games in contemporary educational practice is presented in this paper, as well as the theoretical fundamentals that justify their use in different forms of education. We present a project for designing and developing serious games that takes place within the curriculum for computer science teachers' education, as independent project work in teams. In this project work, students have to use their knowledge in the fields of didactics and computer scienc...

  17. Calculus of computers: Integrating the computing environment

    International Nuclear Information System (INIS)

    The licensing analysis process has changed greatly since the early days of commercial nuclear power. Today's computing requirements demand an integrated computing environment and a thorough knowledge of the analysis methods, the software, and the computer hardware in order to make the best use of resources. The paper discusses the benefits of using standard communication networks that link all resources, so that the analyst controls his environment and can become more effective. The paper also reviews the history of licensing and analysis tools and discusses computer networking for improved productivity.

  18. How Computers Work: Computational Thinking for Everyone

    OpenAIRE

    Rex Page; Ruben Gamboa

    2013-01-01

    What would you teach if you had only one course to help students grasp the essence of computation and perhaps inspire a few of them to make computing a subject of further study? Assume they have the standard college prep background. This would include basic algebra, but not necessarily more advanced mathematics. They would have written a few term papers, but would not have written computer programs. They could surf and twitter, but could not exclusive-or and nand. What about computers would i...

  19. Computer Viruses. Technology Update.

    Science.gov (United States)

    Ponder, Tim, Comp.; Ropog, Marty, Comp.; Keating, Joseph, Comp.

    This document provides general information on computer viruses, how to help protect a computer network from them, and measures to take if a computer becomes infected. Highlights include the origins of computer viruses; virus contraction; a description of some common virus types (File Virus, Boot Sector/Partition Table Viruses, Trojan Horses, and…

  20. Great Principles of Computing

    OpenAIRE

    Denning, Peter J.

    2008-01-01

    The Great Principles of Computing is a framework for understanding computing as a field of science.

  1. Mathematics for computer graphics

    CERN Document Server

    Vince, John

    2006-01-01

    Helps you understand the mathematical ideas used in computer animation, virtual reality, CAD, and other areas of computer graphics. This work also helps you to rediscover the mathematical techniques required to solve problems and design computer programs for computer graphic applications

  2. Computability and unsolvability

    CERN Document Server

    Davis, Martin

    1985-01-01

    ""A clearly written, well-presented survey of an intriguing subject."" - Scientific American. Classic text considers general theory of computability, computable functions, operations on computable functions, Turing machines self-applied, unsolvable decision problems, applications of general theory, mathematical logic, Kleene hierarchy, computable functionals, classification of unsolvable decision problems and more.

  3. Computer Literacy for Teachers.

    Science.gov (United States)

    Sarapin, Marvin I.; Post, Paul E.

    Basic concepts of computer literacy are discussed as they relate to industrial arts/technology education. Computer hardware development is briefly examined, and major software categories are defined, including database management, computer graphics, spreadsheet programs, telecommunications and networking, word processing, and computer assisted and…

  4. Students’ Choice for Computers

    Institute of Scientific and Technical Information of China (English)

    Cai Wei

    2015-01-01

    Nowadays, computers are widely used as useful tools in our daily lives, so you can see students using computers everywhere. The purpose of our survey is to find the answers to the following questions: 1. What brand of computer do students most often choose? 2. In students' view, what is the most important factor when choosing a computer? 3. What do students most want to do with computers? Afterwards, we hope the students will know what kind of computer they really need and how many factors must be considered when buying one.

  5. Roadmap to greener computing

    CERN Document Server

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directi

  6. Computer mathematics for programmers

    CERN Document Server

    Abney, Darrell H; Sibrel, Donald W

    1985-01-01

    Computer Mathematics for Programmers presents the mathematics that is essential to the computer programmer. The book comprises 10 chapters. The first chapter introduces several computer number systems. Chapter 2 shows how to perform arithmetic operations using the number systems introduced in Chapter 1. The third chapter covers the way numbers are stored in computers, how the computer performs arithmetic on real numbers and integers, and how round-off errors are generated in computer programs. Chapter 4 details the use of algorithms and flowcharting as problem-solving tools for computer p
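
    A short demonstration (my example, not the book's) shows both themes at once in Python: the number systems of the early chapters and the round-off errors of Chapter 3.

        # Number systems: binary literal to decimal and back.
        print(bin(42), int("101010", 2))   # 0b101010 42

        # Round-off: 0.1 has no exact binary representation, so errors creep in.
        print(0.1 + 0.2)                   # 0.30000000000000004
        print(0.1 + 0.2 == 0.3)            # False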

  7. Study on Parallel Computing

    Institute of Scientific and Technical Information of China (English)

    Guo-Liang Chen; Guang-Zhong Sun; Yun-Quan Zhang; Ze-Yao Mo

    2006-01-01

    In this paper, we present a general survey on parallel computing. The main contents include the parallel computer system, which is the hardware platform of parallel computing; the parallel algorithm, which is its theoretical base; and parallel programming, which is its software support. After that, we also introduce some parallel applications and enabling technologies. We argue that parallel computing research should form an integrated methodology of "architecture - algorithm - programming - application"; only in this way can parallel computing research develop continuously and become more realistic.

  8. Social Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Adam Mcmahon

    2011-08-01

    While both volunteer computing and social networks have proved successful, the merging of these two models is a new field: Social Volunteer Computing. A Social Volunteer Computing system utilizes the relationships within a social network to determine how computational resources flow towards tasks that need to be completed, and the results of these computations are added back into the social network as content. Such a system will provide scientists and artists a new facility to obtain computational resources and disseminate their work. RenderWeb 2.0, a prototype Social Volunteer Computing system, is introduced that allows animations created in Blender to be distributed and rendered within Facebook.

  9. Computation in Classical Mechanics

    CERN Document Server

    Timberlake, Todd

    2007-01-01

    There is a growing consensus that physics majors need to learn computational skills, but many departments are still devoid of computation in their physics curriculum. Some departments may lack the resources or commitment to create a dedicated course or program in computational physics. One way around this difficulty is to include computation in a standard upper-level physics course. An intermediate classical mechanics course is particularly well suited for including computation. We discuss the ways we have used computation in our classical mechanics courses, focusing on how computational work can improve students' understanding of physics as well as their computational skills. We present examples of computational problems that serve these two purposes. In addition, we provide information about resources for instructors who would like to include computation in their courses.

  10. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C^3P), a five year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C^3P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C^3P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  11. Introduction to parallel computing

    CERN Document Server

    2003-01-01

    Introduction to Parallel Computing is a complete end-to-end source of information on almost all aspects of parallel computing from introduction to architectures to programming paradigms to algorithms to programming standards. It is the only book to have complete coverage of traditional Computer Science algorithms (sorting, graph and matrix algorithms), scientific computing algorithms (FFT, sparse matrix computations, N-body methods), and data intensive algorithms (search, dynamic programming, data-mining).

  12. Automata and Quantum Computing

    OpenAIRE

    Ambainis, Andris; Yakaryilmaz, Abuzer

    2015-01-01

    Quantum computing is a new model of computation, based on quantum physics. Quantum computers can be exponentially faster than conventional computers for problems such as factoring. Besides full-scale quantum computers, more restricted models such as quantum versions of finite automata have been studied. In this paper, we survey various models of quantum finite automata and their properties. We also provide some open questions and new directions for researchers.

  13. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  14. Computing Modular Polynomials

    OpenAIRE

    Charles, Denis; Lauter, Kristin

    2004-01-01

    We present a new probabilistic algorithm to compute modular polynomials modulo a prime. Modular polynomials parameterize pairs of isogenous elliptic curves and are useful in many aspects of computational number theory and cryptography. Our algorithm has the distinguishing feature that it does not involve the computation of Fourier coefficients of modular forms. We avoid computing the exponentially large integral coefficients by working directly modulo a prime and computing isogenies between e...
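
    For context, the classical modular polynomial the abstract refers to is the standard object (background fact, not a description of the authors' algorithm):

        \[
        \Phi_\ell(X, Y) \in \mathbb{Z}[X, Y], \qquad
        \Phi_\ell\bigl(j(E_1), j(E_2)\bigr) = 0
        \iff E_1 \text{ and } E_2 \text{ are } \ell\text{-isogenous},
        \]

    with degree \ell + 1 in each variable. Its integer coefficients grow very quickly with \ell, which is what makes working directly modulo a prime, as the abstract describes, attractive.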

  15. Crime in computer networks

    OpenAIRE

    Skrbková, Michaela

    2013-01-01

    This diploma thesis deals with computer crime, especially in computer networks. The aim is to assess the level of security in the joint-stock company Žďas, identify potential threats and security weaknesses, and suggest possible solutions. The work is divided into two parts. The first part focuses on theoretical knowledge of computer crime. It defines the term computer crime and mentions a list of computer-related offenses based on the classification created by the Council of Europe. It briefl...

  16. Toward Cloud Computing Evolution

    OpenAIRE

    Susanto, Heru; Almunawar, Mohammad Nabil; Kang, Chen Chin

    2012-01-01

    -Information Technology (IT) shaped the success of organizations, giving them a solid foundation that increases both their level of efficiency as well as productivity. The computing industry is witnessing a paradigm shift in the way computing is performed worldwide. There is a growing awareness among consumers and enterprises to access their IT resources extensively through a "utility" model known as "cloud computing." Cloud computing was initially rooted in distributed grid-based computing. ...

  17. Computation in Classical Mechanics

    OpenAIRE

    Timberlake, Todd; Hasbun, Javier E.

    2007-01-01

    There is a growing consensus that physics majors need to learn computational skills, but many departments are still devoid of computation in their physics curriculum. Some departments may lack the resources or commitment to create a dedicated course or program in computational physics. One way around this difficulty is to include computation in a standard upper-level physics course. An intermediate classical mechanics course is particularly well suited for including computation. We discuss th...

  18. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer.This book discusses the algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  19. Cloud Computing (3)

    Institute of Scientific and Technical Information of China (English)

    Wang Bai; Xu Liutong

    2010-01-01

    Editor's Desk: In the preceding two parts of this series, several aspects of cloud computing (including definition, classification, characteristics, typical applications, and service levels) were discussed. This part continues with a discussion of Cloud Computing Open Architecture and the Market-Oriented Cloud. A comparison is made between cloud computing and other distributed computing technologies, and Google's cloud platform is analyzed to determine how distributed computing is implemented in its particular model.

  20. Research on Comparison of Cloud Computing and Grid Computing

    OpenAIRE

    Liu Yuxi; Wang Jianhua

    2012-01-01

    The development of the computer industry has been promoted by progress in distributed computing, parallel computing, and grid computing, out of which the cloud computing movement arose. This study describes the types of cloud computing services and the similarities and differences between cloud computing and grid computing; it discusses the aspects in which cloud computing improves on grid computing, notes the common problems faced by both, and addresses some security issues.

  1. Computational Biology, Advanced Scientific Computing, and Emerging Computational Architectures

    Energy Technology Data Exchange (ETDEWEB)

    None

    2007-06-27

    This CRADA was established at the start of FY02 with $200 K from IBM and matching funds from DOE to support post-doctoral fellows in collaborative research between International Business Machines and Oak Ridge National Laboratory to explore effective use of emerging petascale computational architectures for the solution of computational biology problems. 'No cost' extensions of the CRADA were negotiated with IBM for FY03 and FY04.

  2. Understanding Student Computational Thinking with Computational Modeling

    CERN Document Server

    Aiken, John M; Douglas, Scott S; Burk, John B; Scanlon, Erin M; Thoms, Brian D; Schatz, Michael F

    2012-01-01

    Recently, the National Research Council's framework for next generation science standards highlighted "computational thinking" as one of its "fundamental practices". Students taking a physics course that employed the Arizona State University's Modeling Instruction curriculum were taught to construct computational models of physical systems. Student computational thinking was assessed using a proctored programming assignment, written essay, and a series of think-aloud interviews, where the students produced and discussed a computational model of a baseball in motion via a high-level programming environment (VPython). Roughly a third of the students in the study were successful in completing the programming assignment. Student success on this assessment was tied to how students synthesized their knowledge of physics and computation. On the essay and interview assessments, students displayed unique views of the relationship between force and motion; those who spoke of this relationship in causal (rather than obs...
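
    For readers unfamiliar with the kind of model at stake, here is a minimal sketch of a baseball in motion with air drag (my illustration in plain Python, not the study's actual VPython assignment; all constants are assumed values).

        # Euler integration of projectile motion with quadratic air drag.
        m, g, C = 0.145, 9.8, 0.0013     # mass (kg), gravity (m/s^2), drag constant (kg/m)
        x, y = 0.0, 1.0                  # initial position (m)
        vx, vy = 30.0, 30.0              # initial velocity (m/s)
        dt = 0.001                       # time step (s)
        while y > 0.0:
            v = (vx ** 2 + vy ** 2) ** 0.5
            ax = -(C / m) * v * vx       # drag opposes the velocity
            ay = -g - (C / m) * v * vy
            vx, vy = vx + ax * dt, vy + ay * dt
            x, y = x + vx * dt, y + vy * dt
        print(f"range with drag: {x:.1f} m")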

  3. Enzyme Computation - Computing the Way Proteins Do

    Directory of Open Access Journals (Sweden)

    Jaime-Alberto Parra-Plaza

    2013-08-01

    This article presents enzyme computation, a computational paradigm based on the molecular activity inside biological cells, particularly the capacity of proteins to represent information, of enzymes to transform that information, and of genes to produce both elements according to the dynamic requirements of a given system. The paradigm exploits the rich computational possibilities offered by metabolic pathways and genetic regulatory networks and translates those possibilities into a distributed computational space made up of active agents which communicate through the mechanism of message passing. Enzyme computation has been tested on diverse problems, such as image processing, species classification, symbolic regression, and constraint satisfaction. Also, given its distributed nature, an implementation in dynamically reconfigurable hardware has been possible.

  4. Duality quantum computing

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    In this article, we review the development of a newly proposed quantum computer, the duality computer (or duality quantum computer), and the duality mode of quantum computers. The duality computer is based on the particle-wave duality principle of quantum mechanics. Compared to an ordinary quantum computer, the duality quantum computer is a quantum computer on the move, passing through a multi-slit. It offers more computing operations than are possible with an ordinary quantum computer. The two most distinctive operations are the quantum division operation and the quantum combiner operation. The division operation divides the wave function of a quantum computer into many attenuated, identical parts. The combiner operation combines the wave functions in different parts into a single part. The duality mode is a way in which a quantum computer with some extra qubit resource simulates a duality computer. The main structure of the duality quantum computer and the duality mode, their mathematical description, and algorithm designs are reviewed.
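
    Schematically (my paraphrase of the formalism, not an excerpt from the article), the divider splits the wave function into attenuated sub-waves, a different unitary can act at each slit, and the combiner re-merges them, so a duality computer can apply linear combinations of unitaries:

        \[
        |\psi\rangle \xrightarrow{\text{divide}} \sum_{i} p_i |\psi\rangle
        \xrightarrow{U_i} \sum_{i} p_i U_i |\psi\rangle
        \xrightarrow{\text{combine}} \Bigl( \sum_{i} p_i U_i \Bigr) |\psi\rangle,
        \qquad \sum_{i} |p_i| \le 1,
        \]

    whereas an ordinary quantum computer is restricted to a single unitary at each step.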

  5. Review of quantum computation

    International Nuclear Information System (INIS)

    Digital computers are machines that can be programmed to perform logical and arithmetical operations. Contemporary digital computers are "universal," in the sense that a program that runs on one computer can, if properly compiled, run on any other computer that has access to enough memory space and time. Any one universal computer can simulate the operation of any other; and the set of tasks that any such machine can perform is common to all universal machines. Since Bennett's discovery that computation can be carried out in a non-dissipative fashion, a number of Hamiltonian quantum-mechanical systems have been proposed whose time-evolutions over discrete intervals are equivalent to those of specific universal computers. The first quantum-mechanical treatment of computers was given by Benioff, who exhibited a Hamiltonian system with a basis whose members corresponded to the logical states of a Turing machine. In order to make the Hamiltonian local, in the sense that its structure depended only on the part of the computation being performed at that time, Benioff found it necessary to make the Hamiltonian time-dependent. Feynman discovered a way to make the computational Hamiltonian both local and time-independent by incorporating the direction of computation in the initial condition. In Feynman's quantum computer, the program is a carefully prepared wave packet that propagates through different computational states. Deutsch presented a quantum computer that exploits the possibility of existing in a superposition of computational states to perform tasks that a classical computer cannot, such as generating purely random numbers, and carrying out superpositions of computations as a method of parallel processing. In this paper, we show that such computers, by virtue of their common function, possess a common form for their quantum dynamics
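
    Feynman's construction mentioned above can be summarized as follows (a standard rendering of the clock Hamiltonian, given here as my summary rather than a quotation): a "clock" register |t> is coupled to the data register, and a single time-independent Hamiltonian moves the computation through its T gates,

        \[
        H = \sum_{t=0}^{T-1} \Bigl( |t+1\rangle\langle t| \otimes U_{t+1}
            + |t\rangle\langle t+1| \otimes U_{t+1}^{\dagger} \Bigr),
        \]

    so that the direction of the computation is fixed by the initial wave packet, exactly as described above.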

  6. Physics vs. computer science

    International Nuclear Information System (INIS)

    With computers becoming more frequently used in theoretical and experimental physics, physicists can no longer afford to be ignorant of the basic techniques and results of computer science. Computing principles belong in a physicist's tool box, along with experimental methods and applied mathematics, and the easiest way to educate physicists in computing is to provide, as part of the undergraduate curriculum, a computing course designed specifically for physicists. As well, the working physicist should interact with computer scientists, giving them challenging problems in return for their expertise. (orig.)

  7. Computers for imagemaking

    CERN Document Server

    Clark, D

    1981-01-01

    Computers for Image-Making tells the computer non-expert all he needs to know about Computer Animation. In the hands of expert computer engineers, computer picture-drawing systems have, since the earliest days of computing, produced interesting and useful images. As a result of major technological developments since then, it no longer requires the expert's skill to draw pictures; anyone can do it, provided they know how to use the appropriate machinery. This collection of specially commissioned articles reflects the diversity of user applications in this expanding field

  8. Language and Computers

    CERN Document Server

    Dickinson, Markus; Meurers, Detmar

    2012-01-01

    Language and Computers introduces students to the fundamentals of how computers are used to represent, process, and organize textual and spoken information. Concepts are grounded in real-world examples familiar to students’ experiences of using language and computers in everyday life. A real-world introduction to the fundamentals of how computers process language, written specifically for the undergraduate audience, introducing key concepts from computational linguistics. Offers a comprehensive explanation of the problems computers face in handling natural language. Covers a broad spectru

  9. Polymorphous computing fabric

    Science.gov (United States)

    Wolinski, Christophe Czeslaw; Gokhale, Maya B.; McCabe, Kevin Peter

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  10. Explorations in quantum computing

    CERN Document Server

    Williams, Colin P

    2011-01-01

    By the year 2020, the basic memory components of a computer will be the size of individual atoms. At such scales, the current theory of computation will become invalid. "Quantum computing" is reinventing the foundations of computer science and information theory in a way that is consistent with quantum physics - the most accurate model of reality currently known. Remarkably, this theory predicts that quantum computers can perform certain tasks breathtakingly faster than classical computers -- and, better yet, can accomplish mind-boggling feats such as teleporting information, breaking suppos

  11. GRID COMPUTING AND CLOUD COMPUTING: DESCRIPTION AND COMPARISION

    OpenAIRE

    Sunita Rani, P.K. Suri

    2012-01-01

    In a basic grid computing system, every computer can access the resources of every other computer belonging to the network. In the ideal grid computing system, every resource is shared, turning a computer network into a powerful supercomputer. It is a special kind of distributed computing. In distributed computing, different computers within the same network share one or more resources. Cloud computing is the use of third-party services (Web services) to meet computing needs. There are diffe...

  12. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  13. Computed Tomography (CT) -- Sinuses

    Medline Plus

    ... ring, called a gantry. The computer workstation that processes the imaging information is located in a separate ... follows a spiral path. A special computer program processes this large volume of data to create two- ...

  14. Computed Tomography (CT) -- Head

    Medline Plus

    ... ring, called a gantry. The computer workstation that processes the imaging information is located in a separate ... follows a spiral path. A special computer program processes this large volume of data to create two- ...

  15. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

    Cloud computing is, and will remain, a new way of providing Internet services and computing. This approach builds on many existing services and technologies, such as the Internet, grid computing, and Web services. As a system, cloud computing aims to provide on-demand services that are more acceptable in price and infrastructure; it is precisely the transition from the computer to a service offered to consumers as a product delivered online. This paper describes the quality of cloud computing services, analyzing the advantages and characteristics they offer. It is a theoretical paper.

    Keywords: Cloud computing, QoS, quality of cloud computing

  16. Cognitive Computing for Security.

    Energy Technology Data Exchange (ETDEWEB)

    Debenedictis, Erik [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rothganger, Fredrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Aimone, James Bradley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Marinella, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Evans, Brian Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Warrender, Christina E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mickel, Patrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    Final report for Cognitive Computing for Security LDRD 165613. It reports on the development of a hybrid general-purpose/neuromorphic computer architecture, with an emphasis on potential implementation with memristors.

  17. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... News Physician Resources Professions Site Index A-Z Computed Tomography (CT) - Head What is CT Scanning of the ... Head? What is CT Scanning of the Head? Computed tomography, more commonly known as a CT or CAT ...

  18. Applying Computational Intelligence

    CERN Document Server

    Kordon, Arthur

    2010-01-01

    Offers guidelines on creating value from the application of computational intelligence methods. This work introduces a methodology for effective real-world application of computational intelligence while minimizing development cost, and outlines the critical, underestimated technology marketing efforts required

  19. Computed Tomography (CT) -- Sinuses

    Medline Plus

    ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  20. Computed Tomography (CT) -- Head

    Medline Plus

    ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  1. Pervasive and mobile computing

    OpenAIRE

    Conti, Marco

    2005-01-01

    The Pervasive and Mobile Computing Journal (PMC) is a professional, peer-reviewed journal that publishes high-quality scientific articles (both theory and practice) covering all aspects of pervasive computing and communications.

  2. Computer Crime and Insurance.

    Science.gov (United States)

    Beaudoin, Ralph H.

    1985-01-01

    The susceptibility of colleges and universities to computer crime is great. While insurance coverage is available to cover the risks, an aggressive loss-prevention program is the wisest approach to limiting the exposures presented by computer technology. (MLW)

  3. Applications of computer algebra

    CERN Document Server

    1985-01-01

    Today, certain computer software systems exist which surpass the computational ability of researchers when their mathematical techniques are applied to many areas of science and engineering. These computer systems can perform a large portion of the calculations seen in mathematical analysis. Despite this massive power, thousands of people use these systems as a routine resource for everyday calculations. These software programs are commonly called "Computer Algebra" systems. They have names such as MACSYMA, MAPLE, muMATH, REDUCE and SMP. They are receiving credit as a computational aid with increasing regularity in articles in the scientific and engineering literature. When most people think about computers and scientific research these days, they imagine a machine grinding away, processing numbers arithmetically. It is not generally realized that, for a number of years, computers have been performing non-numeric computations. This means, for example, that one inputs an equation and obtains a closed form...

  4. Book Review: Computational Topology

    DEFF Research Database (Denmark)

    Raussen, Martin

    Computational Topology by Herbert Edelsbrunner and John L. Harer. American Mathematical Society, 2010 - ISBN 978-0-8218-4925-5.

  5. Computational Continuum Mechanics

    CERN Document Server

    Shabana, Ahmed A

    2011-01-01

    This text presents the theory of continuum mechanics using computational methods. Ideal for students and researchers, the second edition features a new chapter on computational geometry and finite element analysis.

  6. Computing for Belle

    CERN Document Server

    CERN. Geneva

    2004-01-01

    ...2s-1, 10 times as much as we obtain now. This presentation describes Belle's efficient computing operations, struggles to manage large amounts of raw and physics data, and plans for Belle computing for Super KEKB/Belle.

  7. The Global Computer

    DEFF Research Database (Denmark)

    Sharp, Robin

    2002-01-01

    This paper describes a Danish project, involving partners from Copenhagen University, DTU, the University of Southern Denmark, Aalborg University, Copenhagen Business School and UNI-C, for exploiting Grid technology to provide computer resources for applications with very large computational...

  8. ICASE Computer Science Program

    Science.gov (United States)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  9. Computational Science Facility (CSF)

    Data.gov (United States)

    Federal Laboratory Consortium — PNNL Institutional Computing (PIC) is focused on meeting DOE's mission needs and is part of PNNL's overarching research computing strategy. PIC supports large-scale...

  10. Intelligent Computer Graphics 2012

    CERN Document Server

    Miaoulis, Georgios

    2013-01-01

    In Computer Graphics, the use of intelligent techniques started more recently than in other research areas. However, during these last two decades, the use of intelligent Computer Graphics techniques has grown year after year, and more and more interesting techniques are presented in this area.   The purpose of this volume is to present current work of the Intelligent Computer Graphics community, a community growing year after year. This volume is a kind of continuation of the previously published Springer volumes “Artificial Intelligence Techniques for Computer Graphics” (2008), “Intelligent Computer Graphics 2009” (2009), “Intelligent Computer Graphics 2010” (2010) and “Intelligent Computer Graphics 2011” (2011).   Usually, this kind of volume contains, every year, selected extended papers from the corresponding 3IA Conference of the year. However, the current volume is made from directly reviewed and selected papers, submitted for publication in the volume “Intelligent Computer Gr...

  11. Aspects of Computability in Physics

    OpenAIRE

    Shipman, Joseph

    1997-01-01

    This paper reviews connections between physics and computation, and explores their implications. The main topics are computational "hardness" of physical systems, computational status of fundamental theories, quantum computation, and the Universe as a computer.

  12. Security of computer networks

    OpenAIRE

    Kolář, Tomáš

    2012-01-01

    This thesis is focused on the design and documentation of a computer network and its security in a medium-sized company. The first part of the thesis describes the basics of computer networks, computer infiltrations, types of attack, and the preventive protection of corporate networks. The practical part is devoted to documentation of the old corporate network and the complete design of a new computer network, including its security against attacks and against the loss of corporate data.

  13. Computer-assisted psychotherapy

    OpenAIRE

    Wright, Jesse H.; Wright, Andrew S.

    1997-01-01

    The rationale for using computers in psychotherapy includes the possibility that therapeutic software could improve the efficiency of treatment and provide access for greater numbers of patients. Computers have not been able to reliably duplicate the type of dialogue typically used in clinician-administered therapy. However, computers have significant strengths that can be used to advantage in designing treatment programs. Software developed for computer-assisted therapy gen...

  14. Computational intelligence in optimization

    CERN Document Server

    Tenne, Yoel

    2010-01-01

    This volume presents a collection of recent studies covering the spectrum of computational intelligence applications with emphasis on their application to challenging real-world problems. Topics covered include: Intelligent agent-based algorithms, Hybrid intelligent systems, Cognitive and evolutionary robotics, Knowledge-Based Engineering, fuzzy sets and systems, Bioinformatics and Bioengineering, Computational finance and Computational economics, Data mining, Machine learning, and Expert systems. "Computational Intelligence in Optimization" is a comprehensive reference for researchers, prac

  15. Computational physics an introduction

    CERN Document Server

    Vesely, Franz J

    1994-01-01

    Author Franz J. Vesely offers students an introductory text on computational physics, providing them with the important basic numerical/computational techniques. His unique text sets itself apart from others by focusing on specific problems of computational physics. The author also provides a selection of modern fields of research. Students will benefit from the appendixes which offer a short description of some properties of computing and machines and outline the technique of 'Fast Fourier Transformation.'

  16. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    OpenAIRE

    Karlheinz Schwarz; Rainer Breitling; Christian Allen

    2013-01-01

    Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized ...

  17. Computer Supported Collaborative Research

    OpenAIRE

    Hinze-Hoare, Vita

    2009-01-01

    Although the areas of Human Computer Interaction (HCI), Computer Supported Collaborative Work (CSCW), and Computer Supported Collaborative Learning (CSCL) are now relatively well established, the related field of Computer Supported Collaborative Research (CSCR) is newly proposed here. An analysis of the principles and issues behind CSCR is performed, leading to a full definition and specification of the CSCR domain, with a view to setting up an e-laboratory designed to support...

  18. Man and computer

    International Nuclear Information System (INIS)

    The discussion of the cultural and sociological consequences of computer evolution is hindered by human prejudice. For example, the sentence 'a computer is at best as intelligent as its programmer' veils actual developments. The theoretical limits of computer intelligence are the limits of intelligence in general. Modern computer systems replace not only human labour, but also human decision making and thereby human responsibility. The historical situation is unique. Human head-work is being automated and man is losing function. (orig.)

  19. COMPUTER GAMES AND EDUCATION

    OpenAIRE

    Sukhov, Anton

    2015-01-01

    This paper is devoted to research on the educational resources and possibilities of modern computer games. The “internal” educational aspects of computer games include the educational mechanism (a separate or integrated “tutorial”) and the representation of a real or even fantastic educational process within virtual worlds. The “external” dimension represents the educational opportunities of computer games for personal and professional development in different genres of computer games (various transport, so...

  20. Quantum Analogue Computing

    OpenAIRE

    Kendon, Vivien M; Nemoto, Kae; Munro, William J.

    2010-01-01

    We briefly review what a quantum computer is, what it promises to do for us, and why it is so hard to build one. Among the first applications anticipated to bear fruit is quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data is encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilb...

  1. Nanoelectronics: Metrology and Computation

    OpenAIRE

    Lundstrom, Mark S.; Clark, Jason Vaughn; Klimeck, Gerhard; Raman, Arvind

    2008-01-01

    Research in nanoelectronics poses new challenges for metrology, but advances in theory, simulation and computing and networking technology provide new opportunities to couple simulation and metrology. This paper begins with a brief overview of current work in computational nanoelectronics. Three examples of how computation can assist metrology will then be discussed. The paper concludes with a discussion of how cyberinfrastructure can help connect computing and metrology using the nanoHUB (ww...

  2. Learning through computer games

    OpenAIRE

    Stojanova, Biljana; Sivevska, Despina

    2009-01-01

    In this text we will talk about modern computer technology and its influence on children's education. Computer technology is entering many spheres of human activity and is changing the lifestyle of the modern man. It influences the educational process by changing the way of learning. How that works, we can see and understand if we direct learners’ attention through computer games. Computer games are wildly popular with young people. They show new ways of learning ...

  3. Approximation and Computation

    CERN Document Server

    Gautschi, Walter; Rassias, Themistocles M

    2011-01-01

    Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research, including theoretical developments, new computational alg

  4. Integrable Quantum Computation

    OpenAIRE

    Zhang, Yong

    2011-01-01

    Integrable quantum computation is defined as quantum computing via the integrable condition, in which two-qubit gates are either nontrivial unitary solutions of the Yang--Baxter equation or the Swap gate (permutation). To make the definition clear, in this article, we explore the physics underlying the quantum circuit model, and then present a unified description on both quantum computing via the Bethe ansatz and quantum computing via the Yang--Baxter equation.

  5. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  6. Cloud Computing: a Prologue

    OpenAIRE

    Ullah, Sultan; Xuefeng, Zheng

    2013-01-01

    Cloud computing represents an emerging Internet-based supercomputing model. It is the convergence and evolution of several concepts from virtualization, distributed storage, grid, and automation management that enables a more flexible approach to deploying and scaling applications. However, cloud computing moves the application software and databases to large data centers, where the management of the data and services may not be fully trustworthy. The concept of cloud c...

  7. Biomolecular computation for bionanotechnology

    CERN Document Server

    Liu, Jian-Qin

    2006-01-01

    Computers built with moleware? The drive toward non-silicon computing is underway, and this first-of-its-kind guide to molecular computation gives researchers a firm grasp of the technologies, biochemical details, and theoretical models at the cutting edge. It explores advances in molecular biology and nanotechnology and illuminates how the convergence of various technologies is propelling computational capacity beyond the limitations of traditional hardware technology and into the realm of moleware.

  8. Computational materials: Embedding Computation into the Everyday

    OpenAIRE

    Thomsen, Mette Ramsgard; Karmon, Ayelet

    2009-01-01

    This paper presents research into material design merging the structural logics of surface tectonics with computation. The research asks how the understanding and design of interactive systems change as computation becomes an integrated part of our material surroundings. Rather than thinking of the ubiquitous system as something embedded into the existing context of the built environment, this paper speculates on the design of bespoke materials specified and designed with respect to both ...

  9. Physics of quantum computation

    International Nuclear Information System (INIS)

    In the paper, the modern status of the theory of quantum computation is considered. The fundamental principles of quantum computers and their basic notions such as quantum processors and computational basis states of the quantum Turing machine as well as the quantum Fourier transform are discussed. Some possible experimental realizations on the basis of NMR methods are given

  10. Computers in Engineering Teaching.

    Science.gov (United States)

    Rushby, N. J.

    This bibliography cites 26 books, papers, and reports dealing with various uses of computers in engineering education; and describes several computer programs available for use in teaching aeronautical, chemical, civil, electrical and electronic, mechanical, and nuclear engineering. Each computer program entry is presented by name, author,…

  11. Computer-assisted instruction

    NARCIS (Netherlands)

    J. Voogt; P. Fisser

    2015-01-01

    Since the early days of computer technology in education in the 1960s, it was claimed that computers can assist instructional practice and hence improve student learning. Since then computer technology has developed, and its potential for education has increased. In this article, we first discuss th

  12. Computational Thinking Patterns

    Science.gov (United States)

    Ioannidou, Andri; Bennett, Vicki; Repenning, Alexander; Koh, Kyu Han; Basawapatna, Ashok

    2011-01-01

    The iDREAMS project aims to reinvent Computer Science education in K-12 schools, by using game design and computational science for motivating and educating students through an approach we call Scalable Game Design, starting at the middle school level. In this paper we discuss the use of Computational Thinking Patterns as the basis for our…

  13. Advances in physiological computing

    CERN Document Server

    Fairclough, Stephen H

    2014-01-01

    This edited collection will provide an overview of the field of physiological computing, i.e. the use of physiological signals as input for computer control. It will cover a breadth of current research, from brain-computer interfaces to telemedicine.

  14. Computer Science Experiments

    CERN Document Server

    Walker, Pamela

    2010-01-01

    Computers are more prevalent in our daily lives than ever before, yet many people are unfamiliar with the concepts and technology of computer science. Offering 20 experiments and activities based on computer research, this book aims to expand students' learning experiences in this field by covering key science concepts.

  15. Coping with Computing Success.

    Science.gov (United States)

    Breslin, Richard D.

    Elements of computing success of Iona College, the challenges it currently faces, and the strategies conceived to cope with future computing needs are discussed. The college has mandated computer literacy for students and offers nine degrees in the computerized information system/management information system areas. Since planning is needed in…

  16. The Computer Delusion.

    Science.gov (United States)

    Oppenheimer, Todd

    1997-01-01

    Challenges research and prevailing attitudes that maintain that computers improve teaching and academic achievement. Criticizes and questions research methodology, computer literacy education, the need for computer skills to make a competitive workforce, support from the business community resulting from technology programs, and Internet use. (LRW)

  17. Optimizing Computer Technology Integration

    Science.gov (United States)

    Dillon-Marable, Elizabeth; Valentine, Thomas

    2006-01-01

    The purpose of this study was to better understand what optimal computer technology integration looks like in adult basic skills education (ABSE). One question guided the research: How is computer technology integration best conceptualized and measured? The study used the Delphi method to map the construct of computer technology integration and…

  18. A new computing principle

    International Nuclear Information System (INIS)

    In 1954, while reviewing the theory of communication and cybernetics, the late Professor Dennis Gabor presented a new mathematical principle for the design of advanced computers. During our work on these computers, it was found that the Gabor formulation can be further advanced to include more recent developments in Lie algebras and geometric probability, giving rise to a new computing principle.

  19. Quantum walk computation

    Energy Technology Data Exchange (ETDEWEB)

    Kendon, Viv [School of Physics and Astronomy, University of Leeds, LS2 9JT (United Kingdom)

    2014-12-04

    Quantum versions of random walks have diverse applications that are motivating experimental implementations as well as theoretical studies. Recent results showing quantum walks are “universal for quantum computation” relate to algorithms, to be run on quantum computers. We consider whether an experimental implementation of a quantum walk could provide useful computation before we have a universal quantum computer.

  20. Computing environment logbook

    Science.gov (United States)

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
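
    The description suggests a simple data structure. Here is a minimal sketch (a hypothetical API of my own devising, not the patented implementation): events are appended with an optional inverse action, the history can be searched, and selected past events can be undone.

        class Logbook:
            """Log of events in a computing environment, searchable and undoable."""

            def __init__(self):
                self.history = []                # list of (description, undo_fn)

            def log(self, description, undo_fn=None):
                self.history.append((description, undo_fn))

            def search(self, term):
                # Find past events whose description mentions the search term.
                return [d for d, _ in self.history if term in d]

            def undo(self, description):
                # Undo one selected past event by invoking its stored inverse.
                for i, (d, undo_fn) in enumerate(self.history):
                    if d == description and undo_fn is not None:
                        undo_fn()
                        del self.history[i]
                        return True
                return False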

  1. Uncertainty In Quantum Computation

    OpenAIRE

    Kak, Subhash

    2002-01-01

    We examine the effect of previous history on starting a computation on a quantum computer. Specifically, we assume that the quantum register has some unknown state on it, and it is required that this state be cleared and replaced by a specific superposition state without any phase uncertainty, as needed by quantum algorithms. We show that, in general, this task is computationally impossible.

  2. The computer program HERA

    International Nuclear Information System (INIS)

    The computer programme HERA is used for comparative calculation of temperature gradients in sodium-cooled fuel element clusters. It belongs to the group of computer programmes assuming the subchannels formed by the rods to be the smallest element of the flow diameter. The short description outlines the basic characteristics of this computer programme. (HR)

  3. Computed Tomography (CT) -- Head

    Medline Plus

    ... Computed Tomography (CT) - Head. Computed tomography (CT) of the head uses special x-ray ... What is CT Scanning of the Head? Computed tomography, more commonly known as a CT or CAT ...

  4. Quantum computation: Honesty test

    Science.gov (United States)

    Morimae, Tomoyuki

    2013-11-01

    Alice does not have a quantum computer so she delegates a computation to Bob, who does own one. But how can Alice check whether the computation that Bob performs for her is correct? An experiment with photonic qubits demonstrates such a verification protocol.

  5. How Computers Work: Computational Thinking for Everyone

    Directory of Open Access Journals (Sweden)

    Rex Page

    2013-01-01

    What would you teach if you had only one course to help students grasp the essence of computation and perhaps inspire a few of them to make computing a subject of further study? Assume they have the standard college prep background. This would include basic algebra, but not necessarily more advanced mathematics. They would have written a few term papers, but would not have written computer programs. They could surf and twitter, but could not exclusive-or and nand. What about computers would interest them or help them place their experience in context? This paper provides one possible answer to this question by discussing a course that has completed its second iteration. Grounded in classical logic, elucidated in digital circuits and computer software, it expands into areas such as CPU components and massive databases. The course has succeeded in garnering the enthusiastic attention of students with a broad range of interests, exercising their problem solving skills, and introducing them to computational thinking.
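
    To make the "exclusive-or and nand" remark concrete, here is the classic construction of XOR from four NAND gates (my example, not material from the course):

        def nand(a, b):
            return not (a and b)

        def xor(a, b):
            # Exclusive-or built from four NAND gates.
            c = nand(a, b)
            return nand(nand(a, c), nand(b, c))

        # Print the truth table: 0 0 0 / 0 1 1 / 1 0 1 / 1 1 0
        for a in (False, True):
            for b in (False, True):
                print(int(a), int(b), int(xor(a, b)))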

  6. The science of computing - Parallel computation

    Science.gov (United States)

    Denning, P. J.

    1985-01-01

    Although parallel computation architectures have been known for computers since the 1920s, it was only in the 1970s that microelectronic components technologies advanced to the point where it became feasible to incorporate multiple processors in one machine. Concomitantly, the development of algorithms for parallel processing also lagged due to hardware limitations. The speed of computing with solid-state chips is limited by gate switching delays. The physical limit implies that a 1 Gflop operational speed is the maximum for sequential processors. A computer recently introduced features a 'hypercube' architecture with 128 processors connected in networks at 5, 6 or 7 points per grid, depending on the design choice. Its computing speed rivals that of supercomputers, but at a fraction of the cost. The added speed with less hardware is due to parallel processing, which utilizes algorithms representing different parts of an equation that can be broken into simpler statements and processed simultaneously. Present, highly developed computer languages like FORTRAN, PASCAL, COBOL, etc., rely on sequential instructions. Thus, increased emphasis will now be directed at parallel processing algorithms to exploit the new architectures.
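
    The core idea, breaking an equation into independent parts and processing them simultaneously, fits in a few lines of Python (an illustration, not the article's code; the chunk boundaries are arbitrary):

        from multiprocessing import Pool

        def partial_sum(bounds):
            lo, hi = bounds
            return sum(i * i for i in range(lo, hi))   # one independent piece

        if __name__ == "__main__":
            chunks = [(0, 250_000), (250_000, 500_000),
                      (500_000, 750_000), (750_000, 1_000_000)]
            with Pool(4) as pool:                      # four workers in parallel
                total = sum(pool.map(partial_sum, chunks))
            print(total)                               # sum of squares below 1,000,000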

  7. Pediatric Computational Models

    Science.gov (United States)

    Soni, Bharat K.; Kim, Jong-Eun; Ito, Yasushi; Wagner, Christina D.; Yang, King-Hay

    A computational model is a computer program that attempts to simulate a behavior of a complex system by solving mathematical equations associated with principles and laws of physics. Computational models can be used to predict the body's response to injury-producing conditions that cannot be simulated experimentally or measured in surrogate/animal experiments. Computational modeling also provides means by which valid experimental animal and cadaveric data can be extrapolated to a living person. Widely used computational models for injury biomechanics include multibody dynamics and finite element (FE) models. Both multibody and FE methods have been used extensively to study adult impact biomechanics in the past couple of decades.

  8. Essential numerical computer methods

    CERN Document Server

    Johnson, Michael L

    2010-01-01

    The use of computers and computational methods has become ubiquitous in biological and biomedical research. During the last two decades most basic algorithms have not changed, but what has changed is the huge increase in computer speed and ease of use, along with the corresponding orders-of-magnitude decrease in cost. A general perception exists that the only applications of computers and computer methods in biological and biomedical research are either basic statistical analysis or the searching of DNA sequence databases. While these are important applications, they only scratch the surface

  9. Computer Security Handbook

    CERN Document Server

    Bosworth, Seymour; Whyne, Eric

    2012-01-01

    The classic and authoritative reference in the field of computer security, now completely updated and revised With the continued presence of large-scale computers; the proliferation of desktop, laptop, and handheld computers; and the vast international networks that interconnect them, the nature and extent of threats to computer security have grown enormously. Now in its fifth edition, Computer Security Handbook continues to provide authoritative guidance to identify and to eliminate these threats where possible, as well as to lessen any losses attributable to them. With seventy-seven chapters...

  10. Rough-Granular Computing

    Institute of Scientific and Technical Information of China (English)

    Andrzej Skowron

    2006-01-01

    Solving complex problems by multi-agent systems in distributed environments requires new approximate reasoning methods based on new computing paradigms. One such recently emerging computing paradigm is Granular Computing(GC). We discuss the Rough-Granular Computing(RGC) approach to modeling of computations in complex adaptive systems and multiagent systems as well as for approximate reasoning about the behavior of such systems. The RGC methods have been successfully applied for solving complex problems in areas such as identification of objects or behavioral patterns by autonomous systems, web mining, and sensor fusion.

  11. Theory of computation

    CERN Document Server

    Tourlakis, George

    2012-01-01

    Learn the skills and acquire the intuition to assess the theoretical limitations of computer programming. Offering an accessible approach to the topic, Theory of Computation focuses on the metatheory of computing and the theoretical boundaries between what various computational models can and cannot do—from the most general model, the URM (Unbounded Register Machines), to the finite automaton. A wealth of programming-like examples and easy-to-follow explanations build the general theory gradually, which guides readers through the modeling and mathematical analysis of computational phenomena.
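
    The finite automaton at the restricted end of that spectrum is small enough to sketch directly. A toy Python DFA (names and language chosen for illustration) that accepts binary strings with an even number of 1s:

      # Deterministic finite automaton: two states, a transition table,
      # and a set of accepting states.
      ACCEPTING = {"even"}
      DELTA = {
          ("even", "0"): "even", ("even", "1"): "odd",
          ("odd", "0"): "odd",   ("odd", "1"): "even",
      }

      def accepts(s: str) -> bool:
          state = "even"
          for ch in s:
              state = DELTA[(state, ch)]
          return state in ACCEPTING

      assert accepts("1100") and not accepts("10")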

  12. Topology for computing

    CERN Document Server

    Zomorodian, Afra J

    2005-01-01

    The emerging field of computational topology utilizes theory from topology and the power of computing to solve problems in diverse fields. Recent applications include computer graphics, computer-aided design (CAD), and structural biology, all of which involve understanding the intrinsic shape of some real or abstract space. A primary goal of this book is to present basic concepts from topology and Morse theory to enable a non-specialist to grasp and participate in current research in computational topology. The author gives a self-contained presentation of the mathematical concepts from a comp

  13. Cloud Computing Bible

    CERN Document Server

    Sosinsky, Barrie

    2010-01-01

    The complete reference guide to the hot technology of cloud computing. Its potential for lowering IT costs makes cloud computing a major force for both IT vendors and users; it is expected to gain momentum rapidly with the launch of Office Web Apps later this year. Because cloud computing involves various technologies, protocols, platforms, and infrastructure elements, this comprehensive reference is just what you need if you'll be using or implementing cloud computing. Cloud computing offers significant cost savings by eliminating upfront expenses for hardware and software; its growing popularity...

  14. Quantum computing and spintronics

    International Nuclear Information System (INIS)

    Attempts to build a computer that can operate according to quantum laws have led to the concepts of quantum computing algorithms and hardware. In this review, after some general considerations concerning quantum information science and a set of basic requirements for any quantum computer proposal, we highlight recent developments that point the way to quantum computing on the basis of solid-state nanostructures. One of the major directions of research on the way to quantum computing is to exploit the spin (in addition to the orbital) degree of freedom of the electron, giving birth to the field of spintronics. We address semiconductor approaches based on spin-orbit coupling in semiconductor nanostructures. (authors)

  15. Computer algebra and operators

    Science.gov (United States)

    Fateman, Richard; Grossman, Robert

    1989-01-01

    The symbolic computation of operator expansions is discussed. Some of the capabilities that prove useful when performing computer algebra computations involving operators are considered. These capabilities may be broadly divided into three areas: the algebraic manipulation of expressions from the algebra generated by operators; the algebraic manipulation of the actions of the operators upon other mathematical objects; and the development of appropriate normal forms and simplification algorithms for operators and their actions. Brief descriptions are given of the computer algebra computations that arise when working with various operators and their actions.
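
    One way to see why operator expressions need special handling is to compute with noncommutative symbols. A small sketch using present-day sympy (an illustration, not one of the systems discussed in the article):

      # Operator products do not commute, so A*B and B*A must be kept
      # distinct; sympy's noncommutative symbols capture exactly this.
      from sympy import symbols, expand

      A, B = symbols("A B", commutative=False)
      print(expand((A + B)**2))   # A**2 + A*B + B*A + B**2: four terms

      # Ordinary commuting symbols collapse the cross terms:
      x, y = symbols("x y")
      print(expand((x + y)**2))   # x**2 + 2*x*y + y**2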

  16. Computing meaning v.4

    CERN Document Server

    Bunt, Harry; Pulman, Stephen

    2013-01-01

    This book is a collection of papers by leading researchers in computational semantics. It presents a state-of-the-art overview of recent and current research in computational semantics, including descriptions of new methods for constructing and improving resources for semantic computation, such as WordNet, VerbNet, and semantically annotated corpora. It also presents new statistical methods in semantic computation, such as the application of distributional semantics in the compositional calculation of sentence meanings. Computing the meaning of sentences, texts, and spoken or texted dialogue i
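
    The distributional, compositional approach mentioned above can be caricatured in a few lines: words as vectors, a sentence meaning as a normalized sum, similarity as cosine. The vectors below are made up for illustration; real systems learn them from corpora.

      import numpy as np

      VEC = {
          "dog":   np.array([0.9, 0.1, 0.0]),
          "cat":   np.array([0.8, 0.2, 0.0]),
          "barks": np.array([0.1, 0.9, 0.1]),
          "meows": np.array([0.1, 0.8, 0.2]),
      }

      def sentence_vector(words):
          # Compose word meanings by vector addition, then normalize.
          v = sum(VEC[w] for w in words)
          return v / np.linalg.norm(v)

      def cosine(u, v):
          return float(u @ v)   # inputs are already unit length

      s1 = sentence_vector(["dog", "barks"])
      s2 = sentence_vector(["cat", "meows"])
      print(cosine(s1, s2))     # close to 1: similar sentence meanings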

  17. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  18. Secure cloud computing

    CERN Document Server

    Jajodia, Sushil; Samarati, Pierangela; Singhal, Anoop; Swarup, Vipin; Wang, Cliff

    2014-01-01

    This book presents a range of cloud computing security challenges and promising solution paths. The first two chapters focus on practical considerations of cloud computing. In Chapter 1, Chandramouli, Iorga, and Chokani describe the evolution of cloud computing and the current state of practice, followed by the challenges of cryptographic key management in the cloud. In Chapter 2, Chen and Sion present a dollar cost model of cloud computing and explore the economic viability of cloud computing with and without security mechanisms involving cryptographic mechanisms. The next two chapters address...

  19. Computer - Assisted Accounting

    OpenAIRE

    SORIN-CIPRIAN TEIUŞAN

    2009-01-01

    What is computer-assisted accounting? Where is the place and what is the role of the computer in the financial-accounting activity? What is the position and importance of the computer in the accountant’s activity? All these are questions that require scientific research in order to find the answers. The paper approaches the issue of the support granted to the accountant to organize and manage the accounting activity by the computer. Starting from the notions of accounting and computer, the co...

  20. Scalable distributed computing hierarchy: cloud, fog and dew computing

    OpenAIRE

    Skala, Karolj; Davidović, Davor; Afgan, Enis; Sović, Ivan; Šojat, Zorislav

    2015-01-01

    The paper considers the conceptual approach for organization of the vertical hierarchical links between the scalable distributed computing paradigms: Cloud Computing, Fog Computing and Dew Computing. In this paper, the Dew Computing is described and recognized as a new structural layer in the existing distributed computing hierarchy. In the existing computing hierarchy, the Dew computing is positioned as the ground level for the Cloud and Fog computing paradigms. Vertical, complementary, hier...

  1. Computer assisted radiology

    International Nuclear Information System (INIS)

    The proceedings of the CAR'93 symposium present the 126 oral papers and the 58 posters contributed to the four Technical Sessions entitled: (1) Image Management, (2) Medical Workstations, (3) Digital Image Generation - DIG, and (4) Application Systems - AS. Topics discussed in Session (1) are: picture archiving and communication systems, teleradiology, hospital information systems and radiological information systems, technology assessment and implications, standards, and data bases. Session (2) deals with computer vision, computer graphics, design and application, man computer interaction. Session (3) goes into the details of the diagnostic examination methods such as digital radiography, MRI, CT, nuclear medicine, ultrasound, digital angiography, and multimodality imaging. Session (4) is devoted to computer-assisted techniques, such as computer assisted radiological diagnosis, knowledge based systems, computer assisted radiation therapy and computer assisted surgical planning. (UWA). 266 figs

  2. Design of Computer Experiments

    DEFF Research Database (Denmark)

    Dehlendorff, Christian

    The main topic of this thesis is design and analysis of computer and simulation experiments and is dealt with in six papers and a summary report. Simulation and computer models have in recent years received increasingly more attention due to their increasing complexity and usability. Software...... packages make the development of rather complicated computer models using predefined building blocks possible. This implies that the range of phenomenas that are analyzed by means of a computer model has expanded significantly. As the complexity grows so does the need for efficient experimental designs...... and analysis methods, since the complex computer models often are expensive to use in terms of computer time. The choice of performance parameter is an important part of the analysis of computer and simulation models and Paper A introduces a new statistic for waiting times in health care units. The statistic...

  3. Programming in Biomolecular Computation:

    DEFF Research Database (Denmark)

    Hartmann, Lars; Jones, Neil; Simonsen, Jakob Grue;

    2011-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem...... conspicuously absent from the literature on biomolecular computing; to partially redress this absence, we introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined...... by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways...

  4. Hyperswitch Communication Network Computer

    Science.gov (United States)

    Peterson, John C.; Chow, Edward T.; Priel, Moshe; Upchurch, Edwin T.

    1993-01-01

    Hyperswitch Communications Network (HCN) computer is prototype multiple-processor computer being developed. Incorporates improved version of hyperswitch communication network described in "Hyperswitch Network For Hypercube Computer" (NPO-16905). Designed to support high-level software and expansion of itself. HCN computer is message-passing, multiple-instruction/multiple-data computer offering significant advantages over older single-processor and bus-based multiple-processor computers, with respect to price/performance ratio, reliability, availability, and manufacturing. Design of HCN operating-system software provides flexible computing environment accommodating both parallel and distributed processing. Also achieves balance among the following competing factors: performance in processing and communications, ease of use, and tolerance of (and recovery from) faults.

  5. Computational Biology and High Performance Computing 2000

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  6. Natural Computing in Computational Finance Volume 4

    CERN Document Server

    O’Neill, Michael; Maringer, Dietmar

    2012-01-01

    This book follows on from Natural Computing in Computational Finance, Volumes I, II and III. As in the previous volumes of this series, the book consists of a series of chapters each of which was selected following a rigorous, peer-reviewed, selection process. The chapters illustrate the application of a range of cutting-edge natural computing and agent-based methodologies in computational finance and economics. The applications explored include option model calibration, financial trend reversal detection, enhanced indexation, algorithmic trading, corporate payout determination and agent-based modeling of liquidity costs, and trade strategy adaptation. While describing cutting edge applications, the chapters are written so that they are accessible to a wide audience. Hence, they should be of interest to academics, students and practitioners in the fields of computational finance and economics.

  7. Performance Evaluation of Cluster Computing

    OpenAIRE

    K D.Kavitha; RojaRamani.Adapa

    2013-01-01

    Cluster Computing addresses the latest results in these fields that support High Performance Distributed Computing (HPDC). In HPDC environments, parallel and/or distributed computing techniques are applied to the solution of computationally intensive applications across networks of computers. A computing cluster is a type of parallel or distributed computer system consisting of a collection of interconnected stand-alone computers working together as a single integrated computing resource...

  8. Serious computer games in computer science education

    Directory of Open Access Journals (Sweden)

    Jože Rugelj

    2015-11-01

    The role and importance of serious computer games in contemporary educational practice are presented in this paper, as well as the theoretical fundamentals that justify their use in different forms of education. We present a project for designing and developing serious games that takes place within the curriculum for computer science teachers’ education as independent project work in teams. In this project work students have to use their knowledge in the fields of didactics and computer science to develop games. The developed game is tested and evaluated in schools in the framework of their practical training. The results of the evaluation can help students improve their games and verify the extent to which specified learning goals have been achieved.

  9. Photonic Quantum Computing

    Science.gov (United States)

    Barz, Stefanie

    2013-05-01

    Quantum physics has revolutionized our understanding of information processing and enables computational speed-ups that are unattainable using classical computers. In this talk I will present a series of experiments in the field of photonic quantum computing. The first experiment is in the field of photonic state engineering and realizes the generation of heralded polarization-entangled photon pairs. It overcomes the limited applicability of photon-based schemes for quantum information processing tasks, which arises from the probabilistic nature of photon generation. The second experiment uses polarization-entangled photonic qubits to implement "blind quantum computing," a new concept in quantum computing. Blind quantum computing enables a nearly-classical client to access the resources of a more computationally-powerful quantum server without divulging the content of the requested computation. Finally, the concept of blind quantum computing is applied to the field of verification. A new method is developed and experimentally demonstrated, which verifies the entangling capabilities of a quantum computer based on a blind Bell test.

  10. Computers and neurosurgery.

    Science.gov (United States)

    Shaikhouni, Ammar; Elder, J Bradley

    2012-11-01

    At the turn of the twentieth century, the only computational device used in neurosurgical procedures was the brain of the surgeon. Today, most neurosurgical procedures rely at least in part on the use of a computer to help perform surgeries accurately and safely. The techniques that revolutionized neurosurgery were mostly developed after the 1950s. Just before that era, the transistor was invented in the late 1940s, and the integrated circuit was invented in the late 1950s. During this time, the first automated, programmable computational machines were introduced. The rapid progress in the field of neurosurgery not only occurred hand in hand with the development of modern computers, but one also can state that modern neurosurgery would not exist without computers. The focus of this article is the impact modern computers have had on the practice of neurosurgery. Neuroimaging, neuronavigation, and neuromodulation are examples of tools in the armamentarium of the modern neurosurgeon that owe each step in their evolution to progress made in computer technology. Advances in computer technology central to innovations in these fields are highlighted, with particular attention to neuroimaging. Developments over the last 10 years in areas of sensors and robotics that promise to transform the practice of neurosurgery further are discussed. Potential impacts of advances in computers related to neurosurgery in developing countries and underserved regions are also discussed. As this article illustrates, the computer, with its underlying and related technologies, is central to advances in neurosurgery over the last half century. PMID:22985531

  11. Reconfigurable computing for tool-path computation

    OpenAIRE

    Jimeno Morenilla, Antonio; Cuenca Asensi, Sergio

    2002-01-01

    Tool path generation is one of the most complex problems in Computer Aided Manufacturing. Although some efficient strategies have been developed to solve it, most of them are only useful for 3 and 5 axis standard machining. The algorithm called Virtual Digitising computes the tool path by means of a “virtually digitised” model of the surface and a geometry specification of the tool and its motion, so can be used even in non-standard machining (retrofitting). This algorithm is simple, robust a...

  12. COMPUTATIONAL SCIENCE CENTER

    International Nuclear Information System (INIS)

    Computational Science is an integral component of Brookhaven's multi science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together

  13. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2006-11-01

    Computational Science is an integral component of Brookhaven's multi science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to

  14. Computation with narrow CTCs

    CERN Document Server

    Say, A C Cem

    2011-01-01

    We examine some variants of computation with closed timelike curves (CTCs), where various restrictions are imposed on the memory of the computer, and the information carrying capacity and range of the CTC. We give full characterizations of the classes of languages recognized by polynomial time probabilistic and quantum computers that can send a single classical bit to their own past. Such narrow CTCs are demonstrated to add the power of limited nondeterminism to deterministic computers, and lead to exponential speedup in constant-space probabilistic and quantum computation. We show that, given a time machine with constant negative delay, one can implement CTC-based computations without the need to know about the runtime beforehand.

  15. The CMS Computing Model

    International Nuclear Information System (INIS)

    The CMS experiment at LHC has developed a baseline Computing Model addressing the needs of a computing system capable to operate in the first years of LHC running. It is focused on a data model with heavy streaming at the raw data level based on trigger, and on the achievement of the maximum flexibility in the use of distributed computing resources. The CMS distributed Computing Model includes a Tier-0 centre at CERN, a CMS Analysis Facility at CERN, several Tier-1 centres located at large regional computing centres, and many Tier-2 centres worldwide. The workflows have been identified, along with a baseline architecture for the data management infrastructure. This model is also being tested in Grid Service Challenges of increasing complexity, coordinated with the Worldwide LHC Computing Grid community

  16. Programming in Biomolecular Computation

    DEFF Research Database (Denmark)

    Hartmann, Lars; Jones, Neil; Simonsen, Jakob Grue

    2010-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We introduce a model of computation that is evidently programmable......, by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only...... in a strong sense: a universal algorithm exists, that is able to execute any program, and is not asymptotically inefficient. A prototype model has been implemented (for now in silico on a conventional computer). This work opens new perspectives on just how computation may be specified at the biological level....

  17. CLOUD COMPUTING AND SECURITY

    Directory of Open Access Journals (Sweden)

    Asharani Shinde

    2015-10-01

    This document gives an insight into Cloud Computing, giving an overview of key features as well as a detailed study of the exact working of Cloud computing. Cloud Computing lets you access all your applications and documents from anywhere in the world, freeing you from the confines of the desktop and thus making it easier for group members in different locations to collaborate. Certainly cloud computing can bring about strategic, transformational and even revolutionary benefits fundamental to future enterprise computing, but it also offers immediate and pragmatic opportunities to improve efficiencies today while cost effectively and systematically setting the stage for strategic change. As this technology makes computing, sharing, and networking easy and interesting, we should think about the security and privacy of information too. Thus the key points to be discussed are: what the cloud is, what its key features are, current applications, future status, and the security issues and possible solutions.

  18. Computability and dynamical systems

    OpenAIRE

    Buescu, Jorge; Graça, Daniel; Zhong, Ning

    2011-01-01

    In this paper we explore results that establish a link between dynamical systems and computability theory (not numerical analysis). In the last few decades, computers have increasingly been used as simulation tools for gaining insight into dynamical behavior. However, due to the presence of errors inherent in such numerical simulations, with few exceptions, computers have not been used for the nobler task of proving mathematical results. Nevertheless, there have been some recent...

  19. The Computer Science Network

    OpenAIRE

    Landweber, Lawrence H.

    1982-01-01

    The CSNET project, sponsored by the National Science Foundation, has as its goal the design and implementation of a computer communications network to provide services to computer science research groups in the United States. Experience with Arpanet has shown that access to a computer network can lead to significantly higher level of interaction between geographically dispersed researchers. This can result in an increase in the quantity and quality of research produced by these researchers. I...

  20. Intelligent distributed computing

    CERN Document Server

    Thampi, Sabu

    2015-01-01

    This book contains a selection of refereed and revised papers of the Intelligent Distributed Computing Track originally presented at the third International Symposium on Intelligent Informatics (ISI-2014), September 24-27, 2014, Delhi, India.  The papers selected for this Track cover several Distributed Computing and related topics including Peer-to-Peer Networks, Cloud Computing, Mobile Clouds, Wireless Sensor Networks, and their applications.

  1. Parallel computing works!

    CERN Document Server

    Fox, Geoffrey C; Messina, Guiseppe C

    2014-01-01

    A clear illustration of how parallel computers can be successfully applied to large-scale scientific computations. This book demonstrates how a variety of applications in physics, biology, mathematics and other sciences were implemented on real parallel computers to produce new scientific results. It investigates issues of fine-grained parallelism relevant for future supercomputers with particular emphasis on hypercube architecture. The authors describe how they used an experimental approach to configure different massively parallel machines, design and implement basic system software, and develop

  2. Electronics and computer acronyms

    CERN Document Server

    Brown, Phil

    1988-01-01

    Electronics and Computer Acronyms presents a list of almost 2,500 acronyms related to electronics and computers. The material for this book is drawn from a number of subject areas, including electrical, electronics, computers, telecommunications, fiber optics, microcomputers/microprocessors, audio, video, and information technology. The acronyms also encompass avionics, military, data processing, instrumentation, units, measurement, standards, services, organizations, associations, and companies. This dictionary offers a comprehensive and broad view of electronics and all that is associated with...

  3. Basics of Quantum Computation

    OpenAIRE

    Vedral, Vlatko; Martin B. Plenio

    1998-01-01

    Quantum computers require quantum logic, something fundamentally different from classical Boolean logic. This difference leads to a greater efficiency of quantum computation over its classical counterpart. In this review we explain the basic principles of quantum computation, including the construction of basic gates and networks. We illustrate the power of quantum algorithms using the simple problem of Deutsch, and explain, again in very simple terms, the well-known algorithm of Shor for factoring...
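
    Deutsch's problem, cited above as the simplest showcase of quantum algorithms, can be simulated classically with a few small matrices. A pedagogical numpy sketch (state vectors only, not an implementation from the review):

      # Decide whether f:{0,1}->{0,1} is constant or balanced with a
      # single oracle query, via the standard two-qubit Deutsch circuit.
      import numpy as np

      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

      def oracle(f):
          # U_f|x, y> = |x, y XOR f(x)> as a 4x4 permutation matrix.
          U = np.zeros((4, 4))
          for x in (0, 1):
              for y in (0, 1):
                  U[2 * x + (y ^ f(x)), 2 * x + y] = 1
          return U

      def deutsch(f):
          state = np.kron([1, 0], [0, 1])        # start in |0>|1>
          state = np.kron(H, H) @ state          # Hadamard on both qubits
          state = oracle(f) @ state              # one oracle query
          state = np.kron(H, np.eye(2)) @ state  # Hadamard on first qubit
          p1 = state[2]**2 + state[3]**2         # P(first qubit reads 1)
          return "balanced" if p1 > 0.5 else "constant"

      assert deutsch(lambda x: 0) == "constant"
      assert deutsch(lambda x: x) == "balanced"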

  4. Multiparty Cloud Computation

    OpenAIRE

    Zheng, Qingji; Zhang, Xinwen

    2012-01-01

    With the increasing popularity of the cloud, clients outsource their data to clouds in order to take advantage of unlimited virtualized storage space and low management cost. This trend prompts privately outsourced computation, called multiparty cloud computation (MCC): given k clients storing their data in the cloud, how can they compute a joint functionality by contributing their private data as inputs, while making use of the cloud's powerful computation capability? Namely, ...
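
    A classical building block behind such joint computation is additive secret sharing. The toy sketch below computes only a joint sum among non-colluding share-holders; it illustrates the idea under simplifying assumptions and is not the MCC construction of the paper:

      # Each client splits its private value into random shares that sum
      # to the value mod P; any subset of shares short of all of them
      # reveals nothing. Summing shares server-side reveals only the total.
      import random

      P = 2**61 - 1   # a large prime modulus

      def share(secret, n):
          parts = [random.randrange(P) for _ in range(n - 1)]
          parts.append((secret - sum(parts)) % P)
          return parts

      inputs = [17, 42, 99]    # the clients' private values
      servers = [0, 0, 0]      # three non-colluding share-holders
      for x in inputs:
          for i, s in enumerate(share(x, 3)):
              servers[i] = (servers[i] + s) % P

      joint_sum = sum(servers) % P
      assert joint_sum == sum(inputs) % P   # 158, with no input revealed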

  5. Genomics With Cloud Computing

    OpenAIRE

    Sukhamrit Kaur; Sandeep Kaur

    2015-01-01

    Genomics is the study of the genome, which produces large amounts of data requiring large storage and computational power. These needs are met by cloud computing, which provides various cloud platforms for genomics. These platforms offer many services to users, such as easy access to data, easy sharing and transfer, storage in hundreds of terabytes, and greater computational power. Some cloud platforms are Google Genomics, DNAnexus, and Globus Genomics. Various features of cloud computing...

  6. Fostering Computational Thinking

    OpenAIRE

    Caballero, Marcos D.; Kohlmyer, Matthew A.; Schatz, Michael F.

    2011-01-01

    Students taking introductory physics are rarely exposed to computational modeling. In a one-semester large lecture introductory calculus-based mechanics course at Georgia Tech, students learned to solve physics problems using the VPython programming environment. During the term, 1357 students in this course solved a suite of fourteen computational modeling homework questions delivered using an online commercial course management system. Their proficiency with computational modeling was evaluated...
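
    The flavour of such homework can be suggested with a plain-Python stand-in for the VPython environment (problem and values chosen for illustration, not taken from the course materials):

      # Iterative update of velocity and position, the core pattern of
      # introductory computational modeling: a ball thrown straight up.
      g = 9.8                 # gravitational acceleration, m/s^2
      dt = 0.001              # time step, s
      pos, vel, t = 0.0, 20.0, 0.0
      while pos >= 0.0:
          vel -= g * dt       # update velocity from the net force
          pos += vel * dt     # update position from the velocity
          t += dt
      print(f"time of flight ~ {t:.2f} s")   # analytic answer 2v/g ~ 4.08 s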

  7. Incremental Computation with Names

    OpenAIRE

    Hammer, Matthew A.; Dunfield, Joshua; Headley, Kyle; Labich, Nicholas; Foster, Jeffrey S.; Hicks, Michael; Van Horn, David

    2015-01-01

    Over the past thirty years, there has been significant progress in developing general-purpose, language-based approaches to incremental computation, which aims to efficiently update the result of a computation when an input is changed. A key design challenge in such approaches is how to provide efficient incremental support for a broad range of programs. In this paper, we argue that first-class names are a critical linguistic feature for efficient incremental computation. Names identify compu...

  8. Asynchronous Multiparty Computation

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Geisler, Martin; Krøigaard, Mikkel;

    2009-01-01

    We propose an asynchronous protocol for general multiparty computation. The protocol has perfect security and communication complexity  where n is the number of parties, |C| is the size of the arithmetic circuit being computed, and k is the size of elements in the underlying field. The protocol...... multithreading. Benchmarking of a VIFF implementation of our protocol confirms that it is applicable to practical non-trivial secure computations....

  9. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biology.

  10. Quantum Computational Complexity

    OpenAIRE

    Watrous, John

    2008-01-01

    This article surveys quantum computational complexity, with a focus on three fundamental notions: polynomial-time quantum computations, the efficient verification of quantum proofs, and quantum interactive proof systems. Properties of quantum complexity classes based on these notions, such as BQP, QMA, and QIP, are presented. Other topics in quantum complexity, including quantum advice, space-bounded quantum computation, and bounded-depth quantum circuits, are also discussed.

  11. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fields.

  12. Computer-based simulations

    OpenAIRE

    Antonoaie, C.; Antonoaie, N.

    2010-01-01

    A computer-based simulation replicates an environment through a computer program designed to consider multiple variables, interactions, and system constraints. Computer-based simulation is used in organization studies to model human social systems to better understand the dynamics between individual and group behaviours. These methods advance organization studies research in many ways. They can be used for extrapolating theory, validating hypotheses, or revealing emergent behaviour. Simulation...

  13. (Computer) Vision without Sight

    OpenAIRE

    Manduchi, Roberto; Coughlan, James

    2012-01-01

    Computer vision holds great promise for helping persons with blindness or visual impairments (VI) to interpret and explore the visual world. To this end, it is worthwhile to assess the situation critically by understanding the actual needs of the VI population and which of these needs might be addressed by computer vision. This article reviews the types of assistive technology application areas that have already been developed for VI, and the possible roles that computer vision can play in fa...

  14. Computational Social Choice (Tutorial)

    OpenAIRE

    Brandt, Felix

    2015-01-01

    Over the past few years there has been a lively exchange of ideas between computer science, in particular theoretical computer science and artificial intelligence, on the one hand and economics, in particular game theory and social choice, on the other. This exchange goes in both directions and has produced active research areas such as algorithmic game theory and computational social choice. Social choice theory concerns the formal analysis and design of methods for aggregating po...

  15. Mobile computing for radiology.

    Science.gov (United States)

    Auffermann, William F; Chetlen, Alison L; Sharma, Arjun; Colucci, Andrew T; DeQuesada, Ivan M; Grajo, Joseph R; Kung, Justin W; Loehfelm, Thomas W; Sherry, Steven J

    2013-12-01

    The rapid advances in mobile computing technology have the potential to change the way radiology and medicine as a whole are practiced. Several mobile computing advances have not yet found application to the practice of radiology, while others have already been applied to radiology but are not in widespread clinical use. This review addresses several areas where radiology and medicine in general may benefit from adoption of the latest mobile computing technologies and speculates on potential future applications. PMID:24200475

  16. Computer science I essentials

    CERN Document Server

    Raus, Randall

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Computer Science I includes fundamental computer concepts, number representations, Boolean algebra, switching circuits, and computer architecture.

  17. Mind and Computation

    OpenAIRE

    Radovan, Mario

    1995-01-01

    The paper examines basic positions concerning the computational model of the mind, and the background assumptions on which these positions are based. It is argued that the question of the relation between the human mind and computational machines does not concern observer-independent phenomena in the world so much as it concerns our attitude toward these phenomena. Taken literally, the mind is not a programmable machine, but there are pragmatic reasons to assign a computational interpre...

  18. Research in computer science

    Science.gov (United States)

    Ortega, J. M.

    1986-01-01

    Various graduate research activities in the field of computer science are reported. Among the topics discussed are: (1) failure probabilities in multi-version software; (2) Gaussian Elimination on parallel computers; (3) three dimensional Poisson solvers on parallel/vector computers; (4) automated task decomposition for multiple robot arms; (5) multi-color incomplete cholesky conjugate gradient methods on the Cyber 205; and (6) parallel implementation of iterative methods for solving linear equations.

  19. Virtualizace a cloud computing (Virtualization and cloud computing)

    OpenAIRE

    Davídek, Michal

    2013-01-01

    The subject of this diploma thesis is current computing technologies known as Cloud computing. The main goal is to compare the technologies and Cloud services provided by companies today. The thesis examines whether the use of Cloud technologies can save a company money through optimized data management, effective backups, license administration, and management of computing resources for individual applications/users. The final part presents recommendations for Cloud service providers or en...

  20. Sensor sentinel computing device

    Energy Technology Data Exchange (ETDEWEB)

    Damico, Joseph P.

    2016-08-02

    Technologies pertaining to authenticating data output by sensors in an industrial environment are described herein. A sensor sentinel computing device receives time-series data from a sensor by way of a wireline connection. The sensor sentinel computing device generates a validation signal that is a function of the time-series signal. The sensor sentinel computing device then transmits the validation signal to a programmable logic controller in the industrial environment.
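
    The abstract does not say how the validation signal is computed. One plausible reading, offered purely as an assumption, is a keyed MAC over each window of samples, which the controller can recompute and compare:

      # Hypothetical sketch: HMAC over a window of time-series samples,
      # with a key shared between the sentinel device and the controller.
      import hashlib
      import hmac
      import struct

      KEY = b"shared-sentinel-key"   # provisioned out of band (assumed)

      def validation_signal(samples):
          payload = b"".join(struct.pack("<d", s) for s in samples)
          return hmac.new(KEY, payload, hashlib.sha256).digest()

      window = [3.10, 3.20, 3.15, 3.30]     # sensor readings
      tag = validation_signal(window)       # sent alongside the data

      # Receiver recomputes the tag and compares in constant time.
      assert hmac.compare_digest(tag, validation_signal(window))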

  1. Discrete computational structures

    CERN Document Server

    Korfhage, Robert R

    1974-01-01

    Discrete Computational Structures describes discrete mathematical concepts that are important to computing, covering necessary mathematical fundamentals, computer representation of sets, graph theory, storage minimization, and bandwidth. The book also explains conceptual frameworks (Gorn trees, searching, subroutines) and directed graphs (flowcharts, critical paths, information networks). The text discusses algebra, particularly as it applies to computing, and concentrates on semigroups, groups, lattices, and propositional calculus, including a new tabular method of Boolean function minimization. The text emphasizes...

  2. Distributed Computing Economics

    OpenAIRE

    Gray, Jim

    2004-01-01

    Computing economics are changing. Today there is rough price parity between (1) one database access, (2) ten bytes of network traffic, (3) 100,000 instructions, (4) 10 bytes of disk storage, and (5) a megabyte of disk bandwidth. This has implications for how one structures Internet-scale distributed computing: one puts computing as close to the data as possible in order to avoid expensive network traffic.
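
    The quoted parity implies a simple break-even rule. A back-of-the-envelope check using only the figures above:

      # 100,000 instructions cost about the same as ten bytes of network
      # traffic, so moving one byte "costs" roughly 10,000 instructions.
      instructions = 100_000
      network_bytes = 10
      breakeven = instructions / network_bytes
      print(f"~{breakeven:,.0f} instructions per byte")   # 10,000

      # Rule of thumb that follows: unless a task performs on the order
      # of 10,000 instructions per byte it transfers, it is cheaper to
      # move the computation to the data than the data to the computation.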

  3. Optimization of Computer Networks

    OpenAIRE

    Saoud Sarwar; Deepa Mahra

    2011-01-01

    Computer networks have pervaded our lives and are present in all their aspects. Information transmission, such as Internet usage, relies on computer networks. As more and more people use computer networks, traffic increases. This puts a heavy load on the infrastructure delivering the data from one point to another. Consequently, network design needs to use optimized parameters to deliver quality of service. This paper attempts to find a mathematical model for optim...

  4. Computer information systems framework

    International Nuclear Information System (INIS)

    Management information systems (MIS) is a commonly used term in the computer profession. New information technology has caused management to expect more from the computer. The process of supplying information follows a well defined procedure. An MIS should be capable of providing usable information to the various areas and levels of an organization. MIS is different from data processing. MIS and the business hierarchy provide a good framework for the many organizations which are using computers. (A.B.)

  5. Parallel computing for electromagnetic field computation

    OpenAIRE

    Vollaire, Christian; Nicolas, Laurent; Nicolas, Alain

    1998-01-01

    This paper deals with parallel computation in electrical engineering. Shared memory and distributed memory architectures are presented, with their implications for the development of parallel numerical algorithms. The necessity of optimizing parallel performance is highlighted. Both the Cray C98 and the Cray T3E are finally compared.

  6. Numbers and computers

    CERN Document Server

    Kneusel, Ronald T

    2015-01-01

    This is a book about numbers and how those numbers are represented in and operated on by computers. It is crucial that developers understand this area because the numerical operations allowed by computers, and the limitations of those operations, especially in the area of floating point math, affect virtually everything people try to do with computers. This book aims to fill this gap by exploring, in sufficient but not overwhelming detail, just what it is that computers do with numbers. Divided into two parts, the first deals with standard representations of integers and floating point numbers.
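
    The floating point limitations the book dwells on are easy to demonstrate: binary floating point cannot represent 0.1 exactly, so arithmetic that looks exact in decimal is not.

      print(0.1 + 0.2)             # 0.30000000000000004
      print(0.1 + 0.2 == 0.3)      # False
      print((0.1).hex())           # 0x1.999999999999ap-4: the actual bits

      # The standard remedy is comparison within a tolerance.
      import math
      print(math.isclose(0.1 + 0.2, 0.3))   # True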

  7. COMPUTER BASED ENVIRONMENT CONTROLS

    OpenAIRE

    Macoveiciuc Pastorel

    2011-01-01

    The aim of these notes is to give an overview of the main activities of computer based controls. The basic principles of computer controls should be common to all sectors and to most types of hardware and software. The absence of a common definition of computer control may, in part, be due to the relative newness of computer controls. A key feature of many organisations today is change. Although not necessarily the driver of change, IT is invariably an intrinsic component and much ...

  8. Computing machinery and understanding.

    Science.gov (United States)

    Ramscar, Michael

    2010-08-01

    How are natural symbol systems best understood? Traditional "symbolic" approaches seek to understand cognition by analogy to highly structured, prescriptive computer programs. Here, we describe some problems the traditional computational metaphor inevitably leads to, and a very different approach to computation (Ramscar, Yarlett, Dye, Denny, & Thorpe, 2010; Turing, 1950) that allows these problems to be avoided. The way we conceive of natural symbol systems depends to a large degree on the computational metaphors we use to understand them, and machine learning suggests an understanding of symbolic thought that is very different to traditional views (Hummel, 2010). The empirical question then is: Which metaphor is best? PMID:21564241

  9. Discrete and computational geometry

    CERN Document Server

    Devadoss, Satyan L

    2011-01-01

    Discrete geometry is a relatively new development in pure mathematics, while computational geometry is an emerging area in applications-driven computer science. Their intermingling has yielded exciting advances in recent years, yet what has been lacking until now is an undergraduate textbook that bridges the gap between the two. Discrete and Computational Geometry offers a comprehensive yet accessible introduction to this cutting-edge frontier of mathematics and computer science. This book covers traditional topics such as convex hulls, triangulations, and Voronoi diagrams, as well as...
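
    One of the traditional topics listed above, the convex hull, fits in a few lines. A compact sketch of Andrew's monotone chain algorithm (an illustration, not code from the book):

      # Build lower and upper hulls over points sorted by x, then y;
      # O(n log n) overall, dominated by the sort.
      def cross(o, a, b):
          return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])

      def convex_hull(points):
          pts = sorted(set(points))
          if len(pts) <= 2:
              return pts
          def half(seq):
              h = []
              for p in seq:
                  while len(h) >= 2 and cross(h[-2], h[-1], p) <= 0:
                      h.pop()   # drop points making a non-left turn
                  h.append(p)
              return h
          lower, upper = half(pts), half(reversed(pts))
          return lower[:-1] + upper[:-1]   # counter-clockwise hull

      print(convex_hull([(0,0), (2,0), (1,1), (2,2), (0,2), (1,0)]))
      # [(0, 0), (2, 0), (2, 2), (0, 2)]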

  10. Computational approaches to vision

    Science.gov (United States)

    Barrow, H. G.; Tenenbaum, J. M.

    1986-01-01

    Vision is examined in terms of a computational process, and the competence, structure, and control of computer vision systems are analyzed. Theoretical and experimental data on the formation of a computer vision system are discussed. Consideration is given to early vision, the recovery of intrinsic surface characteristics, higher levels of interpretation, and system integration and control. A computational visual processing model is proposed and its architecture and operation are described. Examples of state-of-the-art vision systems, which include some of the levels of representation and processing mechanisms, are presented.

  11. Modeling Trusted Computing

    Institute of Scientific and Technical Information of China (English)

    CHEN Shuyi; WEN Yingyou; ZHAO Hong

    2006-01-01

    In this paper, a formal approach based on predicate logic is proposed for representing and reasoning of trusted computing models. Predicates are defined to represent the characteristics of the objects and the relationship among these objects in a trusted system according to trusted computing specifications. Inference rules of trusted relation are given too. With the semantics proposed, some trusted computing models are formalized and verified, which shows that Predicate calculus logic provides a general and effective method for modeling and reasoning trusted computing systems.

  12. Computer Games and Art

    Directory of Open Access Journals (Sweden)

    Anton Sukhov

    2015-10-01

    This article is devoted to the search for relevant sources (primary and secondary) and for characteristics of computer games that allow them to be included in the field of art (such as the creation of artistic games, computer graphics, active interaction with other forms of art, signs of a spiritual aesthetic act, the distinct temporality of computer games, “aesthetic illusion”, and interactivity). In general, modern computer games can be attributed both to commercial art and popular culture (blockbuster games) and to elite forms of contemporary media art (author’s games, visionary games).

  13. Highly parallel computation

    Science.gov (United States)

    Denning, Peter J.; Tichy, Walter F.

    1990-01-01

    Among the highly parallel computing architectures required for advanced scientific computation, those designated 'MIMD' and 'SIMD' have yielded the best results to date. An evaluation of the present development status of such architectures shows that neither has attained a decisive advantage in the treatment of most near-homogeneous problems; for problems involving numerous dissimilar parts, however, currently speculative architectures such as 'neural networks' or 'data flow' machines may be required. Data flow computers are the most practical form of MIMD fine-grained parallel computers yet conceived; they automatically solve the problem of assigning virtual processors to the real processors in the machine.

  14. Computational Abstraction Steps

    DEFF Research Database (Denmark)

    Thomsen, Lone Leth; Thomsen, Bent; Nørmark, Kurt

    2010-01-01

    In this paper we discuss computational abstraction steps as a way to create class abstractions from concrete objects, and from examples. Computational abstraction steps are regarded as symmetric counterparts to computational concretisation steps, which are well-known in terms of function calls and...... capturing concrete values, objects, or actions. As the next step, some of these are lifted to a higher level by computational means. In the object-oriented paradigm the target of such steps is classes. We hypothesise that the proposed approach primarily will be beneficial to novice programmers or during the...

  15. Dictionary of computing

    CERN Document Server

    Illingworth, Valerie

    2004-01-01

    The world of computing continues to expand and to cross new frontiers of public awareness. Jargon grows apace, and confusion abounds as the field moves from the domain of specialists into general knowledge. In preparing the Dictionary of Computing, the need for clear explanations of the concepts that affect more and more aspects of life and the terminology that accompanies them, has been recognized. The dictionary is aimed mainly at students and teachers of computing but should also be of value to professional and amateur computer users. The fourth edition of the dictionary contains ne

  16. Frontiers in Computer Education

    CERN Document Server

    Zhu, Egui; 2011 International Conference on Frontiers in Computer Education (ICFCE 2011)

    2012-01-01

    This book is the proceedings of the 2011 International Conference on Frontiers in Computer Education (ICFCE 2011) in Sanya, China, December 1-2, 2011. The contributions can be useful for researchers, software engineers, and programmers, all interested in promoting the computer and education development. Topics covered are computing and communication technology, network management, wireless networks, telecommunication, Signal and Image Processing, Machine Learning, educational management, educational psychology, educational system, education engineering, education technology and training.  The emphasis is on methods and calculi for computer science and education technology development, verification and verification tools support, experiences from doing developments, and the associated theoretical problems.

  17. Computability theory an introduction

    CERN Document Server

    Jones, Neil D

    1973-01-01

    Computability Theory: An Introduction provides information pertinent to the major concepts, constructions, and theorems of the elementary theory of computability of recursive functions. This book provides mathematical evidence for the validity of the Church-Turing thesis. Organized into six chapters, this book begins with an overview of the concept of effective process so that a clear understanding of the effective computability of partial and total functions is obtained. This text then introduces a formal development of the equivalence of Turing machine computability, enumerability, and decidability.

  18. Combinatorial scientific computing

    CERN Document Server

    Naumann, Uwe

    2012-01-01

    Combinatorial Scientific Computing explores the latest research on creating algorithms and software tools to solve key combinatorial problems on large-scale high-performance computing architectures. It includes contributions from international researchers who are pioneers in designing software and applications for high-performance computing systems. The book offers a state-of-the-art overview of the latest research, tool development, and applications. It focuses on load balancing and parallelization on high-performance computers, large-scale optimization, algorithmic differentiation of numeric

  19. Computationally efficient multibody simulations

    Science.gov (United States)

    Ramakrishnan, Jayant; Kumar, Manoj

    1994-01-01

    Computationally efficient approaches to the solution of the dynamics of multibody systems are presented in this work. The computational efficiency is derived from both the algorithmic and implementational standpoint. Order(n) approaches provide a new formulation of the equations of motion eliminating the assembly and numerical inversion of a system mass matrix as required by conventional algorithms. Computational efficiency is also gained in the implementation phase by the symbolic processing and parallel implementation of these equations. Comparison of this algorithm with existing multibody simulation programs illustrates the increased computational efficiency.

  20. Introduction to Computer Programming Languages.

    Science.gov (United States)

    Bork, Alfred M.

    1971-01-01

    A brief introduction to computer programming explains the basic grammar of computer languages as well as fundamental computer techniques. What constitutes a computer program is made clear; then three simple kinds of statements basic to computation are defined: assignment statements, input-output statements, and branching statements. A…

  1. Who Owns Computer Software?

    Science.gov (United States)

    Branscomb, Anne Wells

    1995-01-01

    Discusses the protection of intellectual property as it applies to computer software and its impact on private enterprise and the public good. Highlights include the role of patents, copyrights, and trade secrets; some court cases; and recommendations for alternatives to the existing legal framework for protecting computer software. (KRN)

  2. Quantum Analog Computing

    Science.gov (United States)

    Zak, M.

    1998-01-01

    Quantum analog computing is based upon the similarity between the mathematical formalism of quantum mechanics and the phenomena to be computed. It exploits the dynamical convergence of several competing phenomena to an attractor which can represent an extremum of a function, an image, a solution to a system of ODEs, or a stochastic process.

  3. Computer algebra in gravity

    CERN Document Server

    Heinicke, C; Heinicke, Christian; Hehl, Friedrich W.

    2001-01-01

    We survey the application of computer algebra in the context of gravitational theories. After some general remarks, we show how to check the second Bianchi identity by means of the Reduce package Excalc. Subsequently we list some computer algebra systems and packages relevant to applications in gravitational physics. We conclude by presenting a couple of typical examples.

  4. COMPUTER MODELS/EPANET

    Science.gov (United States)

    Pipe network flow analysis was among the first civil engineering applications programmed for solution on the early commercial mainframe computers in the 1960s. Since that time, advancements in analytical techniques and computing power have enabled us to solve systems with tens o...

  5. Theory and computational science

    International Nuclear Information System (INIS)

    The theoretical and computational science carried out at the Daresbury Laboratory in 1984/5 is detailed in the Appendix to the Daresbury Annual Report. The Theory, Computational Science and Applications Groups provide support work for the experimental projects conducted at Daresbury. Use of the FPS-164 processor is also described. (U.K.)

  6. Learning with Portable Computers.

    Science.gov (United States)

    Gardner, John; And Others

    1994-01-01

    Reviews the Pupils' Learning and Access to Information Technology project that was conducted in elementary and secondary schools in Northern Ireland to investigate the impact of using portable home computers on students' learning. Performance gains are examined, and operational issues in the use of portable computers are discussed. (Contains 17…

  7. Fault tolerant computing systems

    International Nuclear Information System (INIS)

    Fault tolerance involves the provision of strategies for error detection, damage assessment, fault treatment and error recovery. A survey is given of the different sorts of strategies used in highly reliable computing systems, together with an outline of recent research on the problems of providing fault tolerance in parallel and distributed computing systems. (orig.)

  8. Computer Processed Evaluation.

    Science.gov (United States)

    Griswold, George H.; Kapp, George H.

    A student testing system was developed consisting of computer generated and scored equivalent but unique repeatable tests based on performance objectives for undergraduate chemistry classes. The evaluation part of the computer system, made up of four separate programs written in FORTRAN IV, generates tests containing varying numbers of multiple…

  9. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available Computed Tomography (CT) - Sinuses What is CT (Computed Tomography) of the Sinuses? What are ...

  10. Properties of Stabilizing Computations

    Directory of Open Access Journals (Sweden)

    Mark Burgin

    2015-04-01

    Full Text Available Models play an important role in the development of computer science and information technology applications. The Turing machine is one of the most popular models of computing devices and computations. This model, or more exactly, a family of models, provides means for exploring the capabilities of information technology. However, a Turing machine stops after giving a result. In contrast to this, computers, networks and their software, such as an operating system, very often work without stopping but give various results. There are different modes of such functioning, and Turing machines do not provide adequate models for these processes. One of the closest to halting computation is stabilizing computation, when the output has to stabilize in order to become the result of a computational process. Such stabilizing computations are modeled by inductive Turing machines. In comparison with Turing machines, inductive Turing machines represent the next step in the development of computer science, providing better models for contemporary computers and computer networks. At the same time, inductive Turing machines reflect pivotal traits of stabilizing computational processes. In this paper, we study relations between different modes of inductive Turing machine functioning. In particular, it is demonstrated that acceptance by output stabilizing and acceptance by state stabilizing are linguistically equivalent.
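
    As a concrete illustration of the output-stabilizing mode described above, the following toy sketch may help. It is not the formal inductive-Turing-machine model, only a hedged analogy in Python: the loop keeps emitting outputs, and the result is whatever value the output stream stabilizes at (the names stabilizing_sqrt and max_steps are illustrative, not taken from the paper).

      # Toy model of a stabilizing computation: the process keeps writing
      # new outputs rather than halting by fiat; the result is the value at
      # which the output stream stops changing. (Illustrative analogy only.)
      def stabilizing_sqrt(a: float, max_steps: int = 100) -> float:
          x = a
          previous = None
          for _ in range(max_steps):
              x = 0.5 * (x + a / x)    # Newton step: a new output each round
              if x == previous:        # the output has stabilized -> result
                  return x
              previous = x
          return x                     # best output produced so far

      print(stabilizing_sqrt(2.0))     # 1.4142135623730951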

  11. Computing Ontology Creation

    OpenAIRE

    Stefanov, Krassen; Yordanova, Korneliya

    2003-01-01

    In this paper an approach for the development of an Ontology for the domain of Computing Education is presented. This approach was applied in the FP5 IST Project DIOGENE - A Training Web Broker for ICT Professionals. We outline our work on the creation of the Computing Ontology and give some guidelines and hints for further usage of the Ontology.

  12. Computer Series, 25.

    Science.gov (United States)

    Moore, John W., Ed.

    1982-01-01

    Nine computer programs (available from the authors) are described including graphic display of molecular structures from crystallographic data, computer assisted instruction (CAI) with MATH subroutine, CAI preparation-for-chemistry course, calculation of statistical thermodynamic properties, qualitative analysis program, automated conductimetric…

  13. Computer Crimes in Schools.

    Science.gov (United States)

    Telem, Moshe

    1984-01-01

    Analyzes the occurrence of computer crimes in schools, focusing on the main types of crimes possible, potential criminals in schools, and how the organizational characteristics of schools invite computer crimes. Means to counter this problem and minimize it as far as possible are suggested. (MBR)

  14. The Computational Materials Repository

    DEFF Research Database (Denmark)

    Landis, David D.; Hummelshøj, Jens S.; Nestorov, Svetlozar;

    2012-01-01

    The possibilities for designing new materials based on quantum physics calculations are rapidly growing, but these design efforts lead to a significant increase in the amount of computational data created. The Computational Materials Repository (CMR) addresses this data challenge and provides a...

  15. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... ray beam follows a spiral path. A special computer program processes this large volume of data to create two-dimensional cross-sectional images of your body, which are then displayed on a ... by computer software, the result is a very detailed multidimensional ...

  16. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... ray beam follows a spiral path. A special computer program processes this large volume of data to create two-dimensional cross-sectional images of your body, which are then displayed on a ... by computer software, the result is a very detailed multidimensional ...

  17. Introduction to Cloud Computing

    OpenAIRE

    Conway, Gerry

    2011-01-01

    This paper describes cloud computing, its main characteristics and the models that are currently used for both deployment and delivery. It examines the benefits and business issues of using the cloud, and how they can be addressed. It describes some of the early adopters of cloud computing, together with their experiences.

  18. Statistical Mapping by Computer.

    Science.gov (United States)

    Utano, Jack J.

    The function of a statistical map is to provide readers with a visual impression of the data so that they may be able to identify any geographic characteristics of the displayed phenomena. The increasingly important role played by the computer in the production of statistical maps is manifested by the varied examples of computer maps in recent…

  19. Learning Computational Grammars

    NARCIS (Netherlands)

    Nerbonne, J.; Belz, A.; Cancedda, N.; Dejean, H.; Hammerton, J.; Koeling, R.; Konstantopoulos, S.; Osborne, M.; Thollard, F.; Tjong Kim Sang, E.F.; Daelemans, W.; Zajac, R.

    2001-01-01

    This paper reports on the LEARNING COMPUTATIONAL GRAMMARS (LCG) project, a postdoc network devoted to studying the application of machine learning techniques to grammars suitable for computational use. We were interested in a more systematic survey to understand the relevance of many factors to the

  20. Computer Virus Protection

    Science.gov (United States)

    Rajala, Judith B.

    2004-01-01

    A computer virus is a program--a piece of executable code--that has the unique ability to replicate. Like biological viruses, computer viruses can spread quickly and are often difficult to eradicate. They can attach themselves to just about any type of file, and are spread by replicating and being sent from one individual to another. Simply having…

  1. Exercises in Computational Chemistry

    DEFF Research Database (Denmark)

    Spanget-Larsen, Jens

    2016-01-01

    A selection of HyperChem© PC-exercises in computational chemistry. Answers to most questions are appended (Roskilde University 2014-16).

  2. Quantum information and computation

    OpenAIRE

    Bub, Jeffrey

    2005-01-01

    This article deals with theoretical developments in the subject of quantum information and quantum computation, and includes an overview of classical information and some relevant quantum mechanics. The discussion covers topics in quantum communication, quantum cryptography, and quantum computation, and concludes by considering whether a perspective in terms of quantum information sheds new light on the conceptual problems of quantum mechanics.

  3. Computation of bankruptcy rules

    OpenAIRE

    Saavedra, Verónica; Lopez, Marcelo; Necco, Claudia Mónica; Quintas, Luis Guillermo

    2003-01-01

    We implemented a system that computes bankruptcy rules. The implemented rules are: the Talmud, the Proportional, the Truncated Proportional, the Adjusted Proportional, the Constrained Equal Awards, and the Random Arrival rule. The system computes, compares, and graphs the different allocations to claimants. We present some applications and examples exported by the system.
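
    To make two of the listed rules concrete, here is a minimal sketch assuming the standard textbook definitions of the Proportional and Constrained Equal Awards rules (the function names and the bisection scheme are illustrative choices; this is not the system described in the paper):

      # Proportional rule: each claimant receives a share of the estate
      # proportional to the size of their claim.
      def proportional(estate, claims):
          total = sum(claims)
          return [estate * c / total for c in claims]

      # Constrained Equal Awards: everyone receives min(claim, lam), where
      # lam is chosen (here by bisection) so the awards exhaust the estate.
      # Assumes the bankruptcy setting: estate <= sum(claims).
      def constrained_equal_awards(estate, claims, iterations=100):
          lo, hi = 0.0, max(claims)
          for _ in range(iterations):
              lam = (lo + hi) / 2
              if sum(min(c, lam) for c in claims) < estate:
                  lo = lam
              else:
                  hi = lam
          return [min(c, lam) for c in claims]

      claims = [100.0, 200.0, 300.0]
      print(proportional(200.0, claims))              # [33.33.., 66.66.., 100.0]
      print(constrained_equal_awards(200.0, claims))  # about [66.67, 66.67, 66.67]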

  4. Economics of computer dosimetry

    International Nuclear Information System (INIS)

    The literature concerning automatic dose planning methods contains sparse information about the time required and the cost. Bentley (1964) estimates the cost at 2 British Pounds ($5.60) for summation of digital dose-data from four fields in an array with 250 points on an ICT 1301, and states that the cost per plan will be lower if several plans are computed at the same time. With the analytical method described by Hope and Walters (1964) the summation time per field varies between 1/2 and 1 1/2 min on a Ferranti Sirius computer. The type of computer and the calculation method used strongly influence the time required for the computation. The cost per minute for a high-speed computer is, however, very much higher than for a slow computer, and the great variations in computing times between different machines will not necessarily be reflected in the final cost. This will, on the other hand, always be proportional to the number of dose-data to be handled and it is thus expensive to use arrays with a great number of points for the input-fields and for the resulting distributions. Variations in technique make it difficult to compare different computer methods. These difficulties are, however, small compared with those which arise when one tries to estimate and compare the manual techniques used at various hospitals

  5. Simulation of quantum computers

    NARCIS (Netherlands)

    De Raedt, H; Michielsen, K; Hams, AH; Miyashita, S; Saito, K; Landau, DP; Lewis, SP; Schuttler, HB

    2001-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software con

  6. Simulation of quantum computers

    NARCIS (Netherlands)

    Raedt, H. De; Michielsen, K.; Hams, A.H.; Miyashita, S.; Saito, K.

    2000-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software con

  7. Physicist or computer specialist?

    International Nuclear Information System (INIS)

    Since to most clinicians physical and computer science are two of the great mysteries of the world, the physicist in a hospital is expected by clinicians to be fully conversant with, and competent to make profound pronouncements on, all methods of computing, specific computing problems, and the suitability of computing machinery ranging from desk calculators to Atlas. This is not surprising, since the proportion of the syllabus devoted to physics and mathematics in an M.B. degree is indeed meagre, and the word 'computer' has been surrounded with an aura of mysticism which suggests that it is some fantastic piece of electronic gadgetry comprehensible only to a veritable genius. The clinician consequently turns to the only scientific colleague with whom he has direct contact - the medical physicist - and expects him to be an authority. The physicist is thus thrust, however unwillingly, into the forefront of the advance of computer assistance to scientific medicine. It is therefore essential for him to acquire sufficient knowledge of computing science to enable him to provide satisfactory answers to clinicians' queries, to proffer more detailed advice as to programming, and to convince clinicians that the computer is really a 'simpleton' which can only add and subtract, and even that only under instruction

  8. Theory of computational complexity

    CERN Document Server

    Du, Ding-Zhu

    2011-01-01

    DING-ZHU DU, PhD, is a professor in the Department of Computer Science at the University of Minnesota. KER-I KO, PhD, is a professor in the Department of Computer Science at the State University of New York at Stony Brook.

  9. Abstractions for biomolecular computations

    CERN Document Server

    Okunoye, Babatunde O

    2008-01-01

    Deoxyribonucleic acid is increasingly being understood to be an informational molecule, capable of information processing. It has found application in the determination of non-deterministic algorithms and in the design of molecular computing devices. This is a theoretical analysis of the mathematical properties and relations of the molecules constituting DNA, which explains in part why DNA is a successful computing molecule.

  10. Computer Aided Mathematics

    DEFF Research Database (Denmark)

    Sinclair, Robert

    1998-01-01

    Course notes of a PhD course held in 1998. The central idea is to introduce students to computational mathematics using object oriented programming in C++.

  11. Computer Use Exposed

    NARCIS (Netherlands)

    J.M. Richter (Janneke)

    2009-01-01

    Ever since the introduction of the personal computer, our daily lives are influenced more and more by computers. A day in the life of a PhD-student illustrates this: "At the breakfast table, I check my e-mail to see if the meeting later that day has been confirmed, and I check the time

  12. Computers and Classroom Culture.

    Science.gov (United States)

    Schofield, Janet Ward

    This book explores the meaning of computer technology in schools. The book is based on data gathered from a two-year observation of more than 30 different classrooms in an urban high school: geometry classes in which students used artificially intelligent tutors; business classes in which students learned word processing; and computer science…

  13. Testing On Computers

    Directory of Open Access Journals (Sweden)

    Michael Russell

    1999-06-01

    Full Text Available Russell and Haney (1997) reported that open-ended test items administered on paper may underestimate the achievement of students accustomed to writing on computers. This study builds on Russell and Haney's work by examining the effect of taking open-ended tests on computers and on paper for students with different levels of computer skill. Using items from the Massachusetts Comprehensive Assessment System (MCAS) and the National Assessment of Educational Progress (NAEP), this study focuses on language arts, science and math tests administered to eighth grade students. In addition, information on students' prior computer use and keyboarding speed was collected. Unlike the previous study that found large effects for open-ended writing and science items, this study reports mixed results. For the science test, performance on computers had a positive group effect. For the two language arts tests, an overall group effect was not found. However, for students whose keyboarding speed is at least 0.5 or one-half of a standard deviation above the mean, performing the language arts test on computer had a moderate positive effect. Conversely, for students whose keyboarding speed was 0.5 standard deviations below the mean, performing the tests on computer had a substantial negative effect. For the math test, performing the test on computer had an overall negative effect, but this effect became less pronounced as keyboarding speed increased. Implications are discussed in terms of testing policies and future research.

  14. Computer Communications and Learning.

    Science.gov (United States)

    Bellman, Beryl L.

    1992-01-01

    Computer conferencing offers many opportunities for linking college students and faculty at a distance. From the Binational English and Spanish Telecommunications Network (BESTNET) has evolved a variety of bilingual video/computer/face-to-face instructional packages to serve institutions and nontraditional students on several continents. (MSE)

  15. Computers in construction

    DEFF Research Database (Denmark)

    Howard, Rob

    The evolution of technology, particularly computing in building, learning from the past in order to anticipate what may happen in the future

  16. CERN School of Computing

    CERN Multimedia

    2007-01-01

    The 2007 CERN School of Computing, organised by CERN in collaboration with the University of Split (FESB) will be held from 20 to 31 August 2007 in Dubrovnik, Croatia. It is aimed at postgraduate students and research workers with a few years' experience in scientific physics, computing or related fields. Special themes this year are: GRID Technologies: The Grid track delivers unique theoretical and hands-on education on some of the most advanced GRID topics; Software Technologies: The Software track addresses some of the most relevant modern techniques and tools for large scale distributed software development and handling as well as for computer security; Physics Computing: The Physics Computing track focuses on informatics topics specific to the HEP community. After setting-the-scene lectures, it addresses data acquisition and ROOT. Grants from the European Union Framework Programme 6 (FP6) are available to participants to cover part or all of the cost of the School. More information can be found at...

  17. Philosophy of Computer Science

    Directory of Open Access Journals (Sweden)

    Aatami Järvinen

    2014-06-01

    Full Text Available The diversity and interdisciplinarity of computer science, and the multiplicity of its uses in other sciences, make it difficult to define the discipline and to prescribe how it should be practised; they also cause friction between computer scientists from different branches. Because of how they are structured, computer science programs are criticized for not offering adequate methodological training or a deep understanding of different research traditions. To contribute to a solution, some programs have decided to include courses that enable students to gain awareness of epistemological and methodological issues in computer science, as well as to give meaning to the practice of computer scientists. In this article the needs and objectives of courses on the philosophy of computer science are analyzed, and their structure and management are explained.

  18. Symmetry Effects in Computation

    Science.gov (United States)

    Yao, Andrew Chi-Chih

    2008-12-01

    The concept of symmetry has played a key role in the development of modern physics. For example, using symmetry, C.N. Yang and other physicists have greatly advanced our understanding of the fundamental laws of physics. Meanwhile, computer scientists have been pondering why some computational problems seem intractable, while others are easy. Just as in physics, the laws of computation sometimes can only be inferred indirectly by considerations of general principles such as symmetry. The symmetry properties of a function can indeed have a profound effect on how fast the function can be computed. In this talk, we present several elegant and surprising discoveries along this line, made by computer scientists using symmetry as their primary tool. Note from Publisher: This article contains the abstract only.

  19. Computational invariant theory

    CERN Document Server

    Derksen, Harm

    2015-01-01

    This book is about the computational aspects of invariant theory. Of central interest is the question how the invariant ring of a given group action can be calculated. Algorithms for this purpose form the main pillars around which the book is built. There are two introductory chapters, one on Gröbner basis methods and one on the basic concepts of invariant theory, which prepare the ground for the algorithms. Then algorithms for computing invariants of finite and reductive groups are discussed. Particular emphasis lies on interrelations between structural properties of invariant rings and computational methods. Finally, the book contains a chapter on applications of invariant theory, covering fields as disparate as graph theory, coding theory, dynamical systems, and computer vision. The book is intended for postgraduate students as well as researchers in geometry, computer algebra, and, of course, invariant theory. The text is enriched with numerous explicit examples which illustrate the theory and should be ...

  20. Indirection and computer security.

    Energy Technology Data Exchange (ETDEWEB)

    Berg, Michael J.

    2011-09-01

    The discipline of computer science is built on indirection. David Wheeler famously said, 'All problems in computer science can be solved by another layer of indirection. But that usually will create another problem'. We propose that every computer security vulnerability is yet another problem created by the indirections in system designs and that focusing on the indirections involved is a better way to design, evaluate, and compare security solutions. We are not proposing that indirection be avoided when solving problems, but that understanding the relationships between indirections and vulnerabilities is key to securing computer systems. Using this perspective, we analyze common vulnerabilities that plague our computer systems, consider the effectiveness of currently available security solutions, and propose several new security solutions.

  1. Computational Ocean Acoustics

    CERN Document Server

    Jensen, Finn B; Porter, Michael B; Schmidt, Henrik

    2011-01-01

    Since the mid-1970s, the computer has played an increasingly pivotal role in the field of ocean acoustics. Faster and less expensive than actual ocean experiments, and capable of accommodating the full complexity of the acoustic problem, numerical models are now standard research tools in ocean laboratories. The progress made in computational ocean acoustics over the last thirty years is summed up in this authoritative and innovatively illustrated new text. Written by some of the field's pioneers, all Fellows of the Acoustical Society of America, Computational Ocean Acoustics presents the latest numerical techniques for solving the wave equation in heterogeneous fluid–solid media. The authors discuss various computational schemes in detail, emphasizing the importance of theoretical foundations that lead directly to numerical implementations for real ocean environments. To further clarify the presentation, the fundamental propagation features of the techniques are illustrated in color. Computational Ocean A...

  2. Parallelism in matrix computations

    CERN Document Server

    Gallopoulos, Efstratios; Sameh, Ahmed H

    2016-01-01

    This book is primarily intended as a research monograph that could also be used in graduate courses for the design of parallel algorithms in matrix computations. It assumes general but not extensive knowledge of numerical linear algebra, parallel architectures, and parallel programming paradigms. The book consists of four parts: (I) Basics; (II) Dense and Special Matrix Computations; (III) Sparse Matrix Computations; and (IV) Matrix functions and characteristics. Part I deals with parallel programming paradigms and fundamental kernels, including reordering schemes for sparse matrices. Part II is devoted to dense matrix computations such as parallel algorithms for solving linear systems, linear least squares, the symmetric algebraic eigenvalue problem, and the singular-value decomposition. It also deals with the development of parallel algorithms for special linear systems such as banded, Vandermonde, Toeplitz, and block Toeplitz systems. Part III addresses sparse matrix computations: (a) the development of pa...

  3. Offline computing and networking

    International Nuclear Information System (INIS)

    This note summarizes the work of the Offline Computing and Networking Group. The report is divided into two sections; the first deals with the computing and networking requirements and the second with the proposed way to satisfy those requirements. In considering the requirements, we have considered two types of computing problems. The first is CPU-intensive activity such as production data analysis (reducing raw data to DST), production Monte Carlo, or engineering calculations. The second is physicist-intensive computing such as program development, hardware design, physics analysis, and detector studies. For both types of computing, we examine a variety of issues. These included a set of quantitative questions: how much CPU power (for turn-around and for through-put), how much memory, mass-storage, bandwidth, and so on. There are also very important qualitative issues: what features must be provided by the operating system, what tools are needed for program design, code management, database management, and for graphics

  4. Fostering Computational Thinking

    CERN Document Server

    Caballero, Marcos D; Schatz, Michael F

    2011-01-01

    Students taking introductory physics are rarely exposed to computational modeling. In a one-semester large-lecture introductory calculus-based mechanics course at Georgia Tech, students learned to solve physics problems using the VPython programming environment. During the term, 1357 students in this course solved a suite of fourteen computational modeling homework questions delivered using an online commercial course management system. Their proficiency with computational modeling was evaluated in a proctored environment using a novel central force problem. The majority of students (60.4%) successfully completed the evaluation. Analysis of erroneous student-submitted programs indicated that a small set of student errors explained why most programs failed. We discuss the design and implementation of the computational modeling homework and evaluation, the results from the evaluation, and the implications for instruction in computational modeling in introductory STEM courses.
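
    For readers unfamiliar with what such an exercise looks like, here is a minimal sketch of a central force model of the kind used in the evaluation, assuming plain Python in place of the course's VPython environment (the Euler-Cromer scheme, toy units, and initial conditions are illustrative choices, not taken from the paper):

      # Euler-Cromer integration of motion under an inverse-square central
      # force, in toy units with gravitational parameter GM = 1.
      GM = 1.0

      def step(pos, vel, dt):
          x, y = pos
          r = (x * x + y * y) ** 0.5
          ax, ay = -GM * x / r**3, -GM * y / r**3
          vx, vy = vel[0] + ax * dt, vel[1] + ay * dt   # update velocity first,
          return (x + vx * dt, y + vy * dt), (vx, vy)   # then position

      pos, vel = (1.0, 0.0), (0.0, 1.0)   # circular-orbit initial conditions
      for _ in range(1000):
          pos, vel = step(pos, vel, dt=0.01)
      print(pos)   # should remain near radius 1 if the model is sound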

  5. Unconditionally verifiable blind computation

    CERN Document Server

    Fitzsimons, Joseph F

    2012-01-01

    Blind Quantum Computing (BQC) allows a client to have a server carry out a quantum computation for them such that the client's input, output and computation remain private. Recently the authors together with Broadbent proposed a universal unconditionally secure BQC scheme where the client only needs to be able to prepare single qubits in separable states randomly chosen from a finite set and send them to the server, who has the balance of the required quantum computational resources. A desirable property for any BQC protocol is verification, whereby the client can verify with high probability whether the server has followed the instructions of the protocol, or if there has been some deviation resulting in a corrupted output state. A verifiable BQC protocol can be viewed as an interactive proof system leading to consequences for complexity theory. In this paper we extend the BQC protocol presented in [Broadbent, Fitzsimons and Kashefi, FOCS 2009 p517] with new functionality allowing blind computational basis m...

  6. Blind Quantum Computation

    CERN Document Server

    Arrighi, P; Arrighi, Pablo; Salvail, Louis

    2003-01-01

    We investigate the possibility of having someone carry out the work of executing a function for you, but without letting him learn anything about your input. Say Alice wants Bob to compute some well-known function f upon her input x, but wants to prevent Bob from learning anything about x. The situation arises for instance if client Alice has limited computational resources in comparison with mistrusted server Bob, or if x is an inherently mobile piece of data. Could there be a protocol whereby Bob is forced to compute f(x) "blindly", i.e. without observing x? We provide such a blind computation protocol for the class of functions which admit an efficient procedure to generate random input-output pairs, e.g. factorization. The setting is quantum, the security is unconditional, the eavesdropper is as malicious as can be. Keywords: Secure Circuit Evaluation, Secure Two-party Computation, Information Hiding, Information gain vs disturbance.
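
    The flavour of blinding can be illustrated classically. The sketch below is not the quantum protocol of the paper; it is the well-known RSA blind-signature trick, shown only because it uses the ingredient the abstract names: a random value whose image under the function is easy to generate. The toy key values are the standard textbook example and are far too small to be secure.

      import math, random

      # Toy RSA parameters (textbook example): N = 61 * 53 = 3233, and
      # e * d = 1 (mod lcm(60, 52)) with e = 17, d = 413. Illustration only.
      N, e, d = 3233, 17, 413

      m = 1234                               # Alice's private input
      r = random.randrange(2, N)             # random blinding factor
      while math.gcd(r, N) != 1:             # must be invertible mod N
          r = random.randrange(2, N)

      blinded = (m * pow(r, e, N)) % N       # Alice sends m * r^e mod N, hiding m
      reply = pow(blinded, d, N)             # Bob computes blindly: (m * r^e)^d
      result = (reply * pow(r, -1, N)) % N   # Alice unblinds: m^d * r * r^(-1)
                                             # (pow(r, -1, N) needs Python 3.8+)
      assert result == pow(m, d, N)          # equals computing on m directly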

  7. COMPUTER SECURITY WITH COMPUTER PROTECTION AND NETWORK MANAGEMENT

    OpenAIRE

    Smt Ambikatai V Mittapally*

    2016-01-01

    Computer security, also known as cybersecurity or IT security, is the protection of information systems from theft or damage to the hardware, the software, and the information on them, as well as from disruption or misdirection of the services they provide. It includes controlling physical access to the hardware, as well as protecting against harm that may come via network access, data and code injection, and due to malpractice by operators, whether intentional, accidental, or due to them bein...

  8. A Computable Economist’s Perspective on Computational Complexity

    OpenAIRE

    Vela Velupillai, K.

    2007-01-01

    A computable economist's view of the world of computational complexity theory is described. This means that the model of computation underpinning theories of computational complexity plays a central role. The emergence of computational complexity theories from diverse traditions is emphasised. The unifications that emerged in the modern era were codified by means of the notions of efficiency of computations, non-deterministic computations, completeness, reducibility and verifiability - all three of...

  9. Coping with distributed computing

    International Nuclear Information System (INIS)

    The rapid increase in the availability of high performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desk top is well-known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent: he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by providing some examples of the approaches taken at various HEP institutions. In addition, a brief review of commercial directions or products for distributed computing and management will be given

  10. Navier-Stokes computer

    International Nuclear Information System (INIS)

    A new scientific supercomputer, known as the Navier-Stokes Computer (NSC), has been designed. The NSC is a multi-purpose machine, and for applications in the field of computational fluid dynamics (CFD), this supercomputer is expected to yield a computational speed far exceeding that of present-day supercomputers. This computer has a few very powerful processors (known as nodes) connected by an internodal network. There are three versions of the NSC nodes: micro-, mini- and full-node. The micro-node was developed to prove, to demonstrate and to refine the key architectural features of the NSC. Architectures of the two recent versions of the NSC nodes are presented, with the main focus on the full-node. At a clock speed of 20 MHz, the mini- and the full-node have peak computational speeds of 200 and 640 MFLOPS, respectively. The full-node is the final version for the NSC nodes, and an NSC is expected to have 128 full-nodes. To test the suitability of different algorithms on the NSC architecture, an NSC simulator was developed. Some of the existing computational fluid dynamics codes were placed on this simulator to determine important and relevant issues relating to the efficient use of the NSC architecture

  11. Cloud Computing Governance Lifecycle

    Directory of Open Access Journals (Sweden)

    Soňa Karkošková

    2016-06-01

    Full Text Available Externally provisioned cloud services enable flexible and on-demand sourcing of IT resources. Cloud computing introduces new challenges, such as the need to redefine business processes, to establish specialized governance and management, organizational structures and relationships with external providers, and to manage new types of risk arising from dependency on external providers. There is a general consensus that cloud computing brings many benefits in addition to these challenges, but it is unclear how to achieve them. Cloud computing governance helps to create business value by obtaining benefits from the use of cloud computing services while optimizing investment and risk. The challenge organizations face in governing cloud services is how to design and implement cloud computing governance so as to gain the expected benefits. This paper aims to provide guidance on the implementation activities of the proposed cloud computing governance lifecycle from the cloud consumer perspective. The proposed model is based on the SOA Governance Framework and consists of a lifecycle for the implementation and continuous improvement of a cloud computing governance model.

  12. Computing with synthetic protocells.

    Science.gov (United States)

    Courbet, Alexis; Molina, Franck; Amar, Patrick

    2015-09-01

    In this article we present a new kind of computing device that uses biochemical reaction networks as building blocks to implement logic gates. The architecture of a computing machine relies on these generic and composable building blocks, computation units, that can be used in multiple instances to perform complex boolean functions. Standard logical operations are implemented by biochemical networks, encapsulated and insulated within synthetic vesicles called protocells. These protocells are capable of exchanging energy and information with each other through transmembrane electron transfer. In the paradigm of computation we propose, protoputing, a machine can solve only one problem and therefore has to be built specifically. Thus, the programming phase in the standard computing paradigm is represented in our approach by the set of assembly instructions (specific attachments) that directs the wiring of the protocells that constitute the machine itself. To demonstrate the computing power of protocellular machines, we apply it to solve a NP-complete problem, known to be very demanding in computing power, the 3-SAT problem. We show how to program the assembly of a machine that can verify the satisfiability of a given boolean formula. Then we show how to use the massive parallelism of these machines to verify in less than 20 min all the valuations of the input variables and output a fluorescent signal when the formula is satisfiable or no signal at all otherwise. PMID:25969126
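
    For scale, the 3-SAT verification task itself is easy to state in code. A minimal sequential sketch follows, only to show what the protocellular machine checks in parallel (the clause encoding and names are illustrative choices, not from the paper):

      from itertools import product

      # A clause is a list of literals; the literal (i, True) means variable i,
      # and (i, False) means its negation.
      formula = [[(0, True), (1, False), (2, True)],
                 [(0, False), (1, True), (2, True)],
                 [(0, False), (1, False), (2, False)]]

      def satisfiable(formula, n_vars):
          # Enumerate every valuation of the inputs (the protocell machine
          # explores these in parallel) and report whether one satisfies
          # every clause of the formula.
          for valuation in product([False, True], repeat=n_vars):
              if all(any(valuation[i] == sign for i, sign in clause)
                     for clause in formula):
                  return True
          return False

      print(satisfiable(formula, 3))   # True, e.g. x0=True, x1=True, x2=False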

  13. Global computing for bioinformatics.

    Science.gov (United States)

    Loewe, Laurence

    2002-12-01

    Global computing, the collaboration of idle PCs via the Internet in a SETI@home style, emerges as a new way of massive parallel multiprocessing with potentially enormous CPU power. Its relations to the broader, fast-moving field of Grid computing are discussed without attempting a review of the latter. This review (i) includes a short table of milestones in global computing history, (ii) lists opportunities global computing offers for bioinformatics, (iii) describes the structure of problems well suited for such an approach, (iv) analyses the anatomy of successful projects and (v) points to existing software frameworks. Finally, an evaluation of the various costs shows that global computing indeed has merit, if the problem to be solved is already coded appropriately and a suitable global computing framework can be found. Then, either significant amounts of computing power can be recruited from the general public, or--if employed in an enterprise-wide Intranet for security reasons--idle desktop PCs can substitute for an expensive dedicated cluster. PMID:12511066

  14. Practical scientific computing

    CERN Document Server

    Muhammad, A

    2011-01-01

    Scientific computing is about developing mathematical models, numerical methods and computer implementations to study and solve real problems in science, engineering, business and even social sciences. Mathematical modelling requires deep understanding of classical numerical methods. This essential guide provides the reader with sufficient foundations in these areas to venture into more advanced texts. The first section of the book presents numEclipse, an open source tool for numerical computing based on the notion of MATLAB®. numEclipse is implemented as a plug-in for Eclipse, a leading integ

  15. Recognizing Computational Science

    Science.gov (United States)

    Bland-Hawthorn, J.

    2006-08-01

    There are prestigious international awards that recognize the role of theory and experiment in science and mathematics, but there are no awards of a similar stature that explicitly recognize the role of computational science in a scientific field. In 1945, John von Neumann noted that "many branches of both pure and applied mathematics are in great need of computing instruments to break the present stalemate created by the failure of the purely analytical approach to nonlinear problems." In the past few decades, great strides in mathematics and in the applied sciences can be linked to computational science.

  16. Fast Local Computation Algorithms

    OpenAIRE

    Rubinfeld, Ronitt; Tamir, Gil; Vardi, Shai; Xie, Ning

    2011-01-01

    For input x, let F(x) denote the set of outputs that are the "legal" answers for a computational problem F. Suppose x and members of F(x) are so large that there is no time to read them in their entirety. We propose a model of local computation algorithms which, for a given input x, support queries by a user to values of specified locations y_i in a legal output y in F(x). When more than one legal output y exists for a given x, the local computation algorithm should...

  17. Computer architecture technology trends

    CERN Document Server

    1991-01-01

    Please note this is a Short Discount publication. This year's edition of Computer Architecture Technology Trends analyses the trends which are taking place in the architecture of computing systems today. Due to the sheer number of different applications to which computers are being applied, there seems no end to the different adoptions which proliferate. There are, however, some underlying trends which appear. Decision makers should be aware of these trends when specifying architectures, particularly for future applications. This report is fully revised and updated and provides insight in

  18. When computers were human

    CERN Document Server

    Grier, David Alan

    2013-01-01

    Before Palm Pilots and iPods, PCs and laptops, the term "computer" referred to the people who did scientific calculations by hand. These workers were neither calculating geniuses nor idiot savants but knowledgeable people who, in other circumstances, might have become scientists in their own right. When Computers Were Human represents the first in-depth account of this little-known, 200-year epoch in the history of science and technology. Beginning with the story of his own grandmother, who was trained as a human computer, David Alan Grier provides a poignant introduction to the wider wo

  19. Cervical computed tomography

    International Nuclear Information System (INIS)

    This book describes the possibilities of cervical computed tomography with the apparatus available at present. The normal anatomy of the cervical region as it appears in computed tomography is described with special regard to its compartmental structure and functional aspects; this is supplemented by normal anatomical measurements obtained from cervical computed tomograms of 60 healthy individuals of different ages and both sexes. The morphology of cervical anomalies obtained via CT and of the various acquired cervical disease processes is discussed and illustrated by means of the authors' own observations; the diagnostic value of the findings obtained by CT is discussed, and a diagnosis is established. (orig./MG)

  20. Computer assisted radiology

    International Nuclear Information System (INIS)

    The organization of the book follows the plan of the meeting, with chapters representing the general meeting sessions and articles representing the meeting presentations. These are grouped by modality or kindred application, where relevant. Some sessions are not similarly divided and individual papers are positioned, presumably, in order of presentation. Each section labeled workshop addresses a specific topic. The first session is on digital image generation and contains sections on magnetic resonance imaging, nuclear medicine, computed tomography, ultrasound, digital radiography, and digital subtraction and angiography. The remaining sections are on application programming, picture archiving and communications systems, computer graphics, and computer vision

  1. Research in Computational Astrobiology

    Science.gov (United States)

    Chaban, Galina; Colombano, Silvano; Scargle, Jeff; New, Michael H.; Pohorille, Andrew; Wilson, Michael A.

    2003-01-01

    We report on several projects in the field of computational astrobiology, which is devoted to advancing our understanding of the origin, evolution and distribution of life in the Universe using theoretical and computational tools. Research projects included modifying existing computer simulation codes to use efficient, multiple time step algorithms, statistical methods for analysis of astrophysical data via optimal partitioning methods, electronic structure calculations on water-nucleic acid complexes, incorporation of structural information into genomic sequence analysis methods, and calculations of shock-induced formation of polycyclic aromatic hydrocarbon compounds.

  2. Computational Science and Innovation

    International Nuclear Information System (INIS)

    Simulations - utilizing computers to solve complicated science and engineering problems - are a key ingredient of modern science. The U.S. Department of Energy (DOE) is a world leader in the development of high-performance computing (HPC), the development of applied math and algorithms that utilize the full potential of HPC platforms, and the application of computing to science and engineering problems. An interesting general question is whether the DOE can strategically utilize its capability in simulations to advance innovation more broadly. In this article, I will argue that this is certainly possible.

  3. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  4. Mobile computing handbook

    CERN Document Server

    Ilyas, Mohammad

    2004-01-01

    INTRODUCTION AND APPLICATIONS OF MOBILE COMPUTING Wearable Computing, A. Smailagic and D.P. Siewiorek Developing Mobile Applications: A Lime Primer, G.P. Picco, A.L. Murphy, and G.-C. Roman Pervasive Application Development: Approaches and Pitfalls, G. Banavar, N. Cohen, and D. Soroker ISAM, Joining Context-Awareness and Mobility to Building Pervasive Applications, I. Augustin, A. Corrêa Yamin, J.L. Victória Barbosa, L. Cavalheiro da Silva, R. Araújo Real, G. Frainer, G.G. Honrich Cavalheiro, and C.F. Resin Geyer Integrating Mobile Wireless Devices into the Computational Grid, T. Phan, L. Huan

  5. Desktop grid computing

    CERN Document Server

    Cerin, Christophe

    2012-01-01

    Desktop Grid Computing presents common techniques used in numerous models, algorithms, and tools developed during the last decade to implement desktop grid computing. These techniques enable the solution of many important sub-problems for middleware design, including scheduling, data management, security, load balancing, result certification, and fault tolerance. The book's first part covers the initial ideas and basic concepts of desktop grid computing. The second part explores challenging current and future problems. Each chapter presents the sub-problems, discusses theoretical and practical

  6. Instant Google Compute Engine

    CERN Document Server

    Papaspyrou, Alexander

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. This book is a step-by-step guide to installing and using Google Compute Engine. "Instant Google Compute Engine" is great for developers and operators who are new to Cloud computing, and who are looking to get a good grounding in using Infrastructure-as-a-Service as part of their daily work. It's assumed that you will have some experience with the Linux operating system as well as familiarity with the concept of virtualization technologies, suc

  7. Single neuron computation

    CERN Document Server

    McKenna, Thomas M; Zornetzer, Steven F

    1992-01-01

    This book contains twenty-two original contributions that provide a comprehensive overview of computational approaches to understanding a single neuron structure. The focus on cellular-level processes is twofold. From a computational neuroscience perspective, a thorough understanding of the information processing performed by single neurons leads to an understanding of circuit- and systems-level activity. From the standpoint of artificial neural networks (ANNs), a single real neuron is as complex an operational unit as an entire ANN, and formalizing the complex computations performed by real n

  8. Digital computers in action

    CERN Document Server

    Booth, A D

    1965-01-01

    Digital Computers in Action is an introduction to the basics of digital computers as well as their programming and various applications in fields such as mathematics, science, engineering, economics, medicine, and law. Other topics include engineering automation, process control, special purpose games-playing devices, machine translation and mechanized linguistics, and information retrieval. This book consists of 14 chapters and begins by discussing the history of computers, from the idea of performing complex arithmetical calculations to the emergence of a modern view of the structure of a ge

  9. Human Computer Interaction

    Science.gov (United States)

    Bhagwani, Akhilesh; Sengar, Chitransh; Talwaniper, Jyotsna; Sharma, Shaan

    2012-08-01

    The paper deals with the study of HCI (human-computer interaction) or BCI (brain-computer interface) technology, which can be used for capturing brain signals and translating them into commands that allow humans to control devices such as computers, robots, rehabilitation technology and virtual reality environments just by thinking. A BCI provides a direct communication pathway between the brain and an external device. BCIs are often aimed at assisting, augmenting, or repairing human cognitive or sensory-motor functions. The paper also discusses many advantages of BCI technology, along with some of its applications and some major drawbacks.

  10. Logic circuit and computer

    International Nuclear Information System (INIS)

    This book contains eight chapters, covering: an introduction to computers, including the history of computers, integrated circuits, microprocessors and microcomputers; number systems and binary codes, such as complements and parity bits; Boolean algebra and logic circuits, including Karnaugh maps, Quine-McCluskey minimization, and prime implicants; integrated logic circuits such as adders, subtractors, carry propagation, and magnitude comparators; sequential logic circuits and memory, including flip-flops, serial binary adders, and counters; IC logic gates, including IC logic levels and ECL; and the development of microprocessor structure, instructions, and addressing modes.
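
    Two of the book's building blocks are simple enough to sketch in a few lines, assuming Python stands in for the gate-level circuits it describes (the function names are illustrative):

      # Even parity bit: the extra bit makes the total number of 1s even.
      def parity_bit(bits):
          p = 0
          for b in bits:
              p ^= b
          return p

      # One-bit full adder: sum and carry-out of a single binary column.
      def full_adder(a, b, cin):
          s = a ^ b ^ cin
          cout = (a & b) | (cin & (a ^ b))
          return s, cout

      print(parity_bit([1, 0, 1, 1]))   # 1 -> four 1s in total, even parity
      print(full_adder(1, 1, 0))        # (0, 1): 1 + 1 = 10 in binary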

  11. Emerging Hybrid Computational Models

    Czech Academy of Sciences Publication Activity Database

    Neruda, Roman

    Berlin : Springer, 2006 - (Huang, D.; Li, K.; Irwin, G.), s. 379-389 ISBN 3-540-37274-1. - (Lecture Notes in Artificial Intelligence . 4114). [ICIC 2006. International Conference on Intelligent Computing. Kunming (CN), 16.08.2006-19.08.2006] R&D Projects: GA MŠk 1M0567 Grant ostatní: HPC-Europa(XE) RII3-CT-2003-506079 Institutional research plan: CEZ:AV0Z10300504 Keywords : computational intelligence * intelligent agents * hybrid models Subject RIV: IN - Informatics, Computer Science

  12. Power plant process computer

    International Nuclear Information System (INIS)

    The concept of instrumentation and control in nuclear power plants incorporates the use of process computers for tasks which are on-line with respect to real-time requirements but which do not perform closed-loop control. The general scope of tasks is: - alarm annunciation on CRTs - data logging - data recording for post-trip reviews and plant behaviour analysis - nuclear data computation - graphic displays. Process computers are additionally used for dedicated tasks such as the aeroball measuring system and the turbine stress evaluator. Further applications are personal dose supervision and access monitoring. (orig.)

  13. Efficient computation of hashes

    International Nuclear Information System (INIS)

    The sequential computation of hashes at the core of many distributed storage systems, found for example in grid services, can hinder efficiency in service quality and even pose security challenges that can only be addressed by the use of parallel hash tree modes. The main contributions of this paper are, first, the identification of several efficiency and security challenges posed by the use of sequential hash computation based on the Merkle-Damgård engine. In addition, alternatives for the parallel computation of hash trees are discussed, and a prototype for a new parallel implementation of the Keccak function, the SHA-3 winner, is introduced.
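
    A minimal sketch of the hash-tree idea follows, assuming SHA3-256 from Python's standard hashlib as a stand-in for the paper's Keccak prototype (the chunk size, the padding-by-duplication of an odd node, and the leaf/node framing are illustrative, not a standardized tree mode):

      import hashlib

      def h(data: bytes) -> bytes:
          return hashlib.sha3_256(data).digest()

      def merkle_root(message: bytes, chunk: int = 64) -> bytes:
          # Leaf hashes are mutually independent, so unlike Merkle-Damgard
          # chaining they can be computed in parallel.
          pieces = [message[i:i + chunk]
                    for i in range(0, len(message), chunk)] or [b""]
          level = [h(p) for p in pieces]
          # Combine pairwise until a single root digest remains.
          while len(level) > 1:
              if len(level) % 2:
                  level.append(level[-1])        # duplicate the odd node out
              level = [h(level[i] + level[i + 1])
                       for i in range(0, len(level), 2)]
          return level[0]

      print(merkle_root(b"data " * 100).hex())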

  14. Attacks on computer systems

    Directory of Open Access Journals (Sweden)

    Dejan V. Vuletić

    2012-01-01

    Full Text Available Computer systems are a critical component of human society in the 21st century. The economic sector, defense, security, energy, telecommunications, industrial production, finance and other vital infrastructure depend on computer systems that operate at local, national or global scales. A particular problem is that, due to the rapid development of ICT and the unstoppable growth of its application in all spheres of human society, their vulnerability and exposure to very serious potential dangers increase. This paper analyzes some typical attacks on computer systems.

  15. Advances in computers

    CERN Document Server

    Memon, Atif

    2012-01-01

    Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field. In-depth surveys and tutorials on new computer technology; well-known authors and researchers in the field; extensive bibliographies with m

  16. Computational Tractability - Beyond Turing?

    Science.gov (United States)

    Marcer, Peter; Rowlands, Peter

    A fundamental problem in the theory of computing concerns whether descriptions of systems at all times remain tractable, that is, whether the complexity that inevitably results can be reduced to a polynomial form (P) or whether some problems lead to a non-polynomial (NP) exponential growth in complexity. Here, we propose that the universal computational rewrite system that can be shown to be ultimately responsible for the development of mathematics, physics, chemistry, biology and even human consciousness is so structured that Nature will always be structured as P at any scale and so will be computationally tractable.

  17. Archives and the computer

    CERN Document Server

    Cook, Michael Garnet

    1986-01-01

    Archives and the Computer deals with the use of the computer and its systems and programs in archiving data and other related materials. The book covers topics such as the scope of automated systems in archives; systems for records management, archival description, and retrieval; and machine-readable archives. The selection also features examples of archives from different institutions such as the University of Liverpool, Berkshire County Record Office, and the National Maritime Museum. The text is recommended for archivists who would like to know more about the use of computers in archiving of...

  18. Games, puzzles, and computation

    CERN Document Server

    Hearn, Robert A

    2009-01-01

    The authors show that there are underlying mathematical reasons for why games and puzzles are challenging (and perhaps why they are so much fun). They also show that games and puzzles can serve as powerful models of computation, quite different from the usual models of automata and circuits, offering a new way of thinking about computation. The appendices provide a substantial survey of all known results in the field of game complexity, serving as a reference guide for readers interested in the computational complexity of particular games, or interested in open problems about such complexities.

  19. Introduction to grid computing

    CERN Document Server

    Magoules, Frederic; Tan, Kiat-An; Kumar, Abhinit

    2009-01-01

    A Thorough Overview of the Next Generation in Computing. Poised to follow in the footsteps of the Internet, grid computing is on the verge of becoming more robust and accessible to the public in the near future. Focusing on this novel, yet already powerful, technology, Introduction to Grid Computing explores state-of-the-art grid projects, core grid technologies, and applications of the grid. After comparing the grid with other distributed systems, the book covers two important aspects of a grid system: scheduling of jobs, and resource discovery and monitoring in the grid. It then discusses existing a...

  20. Convergence: Computing and communications

    Energy Technology Data Exchange (ETDEWEB)

    Catlett, C. [National Center for Supercomputing Applications, Champaign, IL (United States)

    1996-12-31

    This paper highlights the operations of the National Center for Supercomputing Applications (NCSA). NCSA is developing and implementing a national strategy to create, use, and transfer advanced computing and communication tools and information technologies for science, engineering, education, and business. The primary focus of the presentation is historical and expected growth in computing capacity, personal computer performance, and Internet and World Wide Web sites. Data are presented to show changes over the past 10 to 20 years in these areas. 5 figs., 4 tabs.

  1. Computer science handbook

    CERN Document Server

    Tucker, Allen B

    2004-01-01

    Due to the great response to the famous Computer Science Handbook edited by Allen B. Tucker, … in 2004 Chapman & Hall/CRC published a second edition of this comprehensive reference book. Within more than 70 chapters, every one new or significantly revised, one can find any kind of information and references about computer science one can imagine. … All in all, there is absolutely nothing about computer science that cannot be found in the encyclopedia with its 110 survey articles … (Christoph Meinel, Zentralblatt MATH)

  2. Quantum-dot computing

    International Nuclear Information System (INIS)

    A quantum computer would put the latest PC to shame. Not only would such a device be faster than a conventional computer, but by exploiting the quantum-mechanical principle of superposition it could change the way we think about information processing. However, two key goals need to be met before a quantum computer becomes reality. The first is to be able to control the state of a single quantum bit (or 'qubit') and the second is to build a two-qubit gate that can produce 'entanglement' between the qubit states. (U.K.)

  3. Quantum-dot computing

    Energy Technology Data Exchange (ETDEWEB)

    Milburn, Gerard

    2003-10-01

    A quantum computer would put the latest PC to shame. Not only would such a device be faster than a conventional computer, but by exploiting the quantum-mechanical principle of superposition it could change the way we think about information processing. However, two key goals need to be met before a quantum computer becomes reality. The first is to be able to control the state of a single quantum bit (or 'qubit') and the second is to build a two-qubit gate that can produce 'entanglement' between the qubit states. (U.K.)
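
    As a concrete illustration of the two goals the abstract names, the sketch below simulates them on plain state vectors: a single-qubit Hadamard rotation creating a superposition, and a two-qubit CNOT gate turning that superposition into an entangled Bell state. This is a toy numpy model of the standard gates, not tied to the quantum-dot hardware discussed in this record.

    import numpy as np

    # Single-qubit basis state and gate.
    ket0 = np.array([1.0, 0.0])
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard: |0> -> (|0>+|1>)/sqrt(2)

    # CNOT on two qubits (control = first qubit, target = second).
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    # Goal 1: control the state of a single qubit -- put it in superposition.
    q0 = H @ ket0

    # Goal 2: a two-qubit gate that produces entanglement between qubit states.
    state = CNOT @ np.kron(q0, ket0)   # -> (|00> + |11>)/sqrt(2), a Bell state

    print(np.round(state, 3))          # [0.707 0.    0.    0.707]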

  4. High Performance Computing: A Survey

    OpenAIRE

    Mr. Nilesh C. Thakkar, Mr. Nitesh M. Sureja

    2012-01-01

    This paper surveys techniques used for high performance computing. High performance computing is used to develop machines, such as supercomputers, that provide very large amounts of computing power, and it concerns both software and hardware development. As the complexity of computing increases day by day, there is a need for a cost-effective computing environment that provides very high computing power. Research and simulation activities are common examples where we req...

  5. GRID COMPUTING AND CHECKPOINT APPROACH

    OpenAIRE

    Pankaj gupta

    2011-01-01

    Grid computing is a means of allocating the computational power of a large number of computers to a complex, difficult computation or problem. Grid computing is a distributed computing paradigm that differs from traditional distributed computing in that it is aimed toward large-scale systems that even span organizational boundaries. In this paper we investigate the different techniques of fault tolerance which are used in many real-time distributed systems. The main focus is on types of fault occu...
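
    Checkpointing, one fault-tolerance technique this record highlights, amounts to periodically persisting enough state to restart a long computation after a failure. Below is a minimal file-based sketch; the file name, interval, and state layout are illustrative assumptions, not taken from the paper.

    import os
    import pickle

    CKPT = "job.ckpt"        # illustrative checkpoint file name
    INTERVAL = 1000          # checkpoint every N iterations (illustrative)

    def load_checkpoint():
        # Resume from the last saved state, or start fresh.
        if os.path.exists(CKPT):
            with open(CKPT, "rb") as f:
                return pickle.load(f)
        return {"i": 0, "acc": 0}

    def save_checkpoint(state):
        # Write atomically: dump to a temp file, then rename over the old one.
        with open(CKPT + ".tmp", "wb") as f:
            pickle.dump(state, f)
        os.replace(CKPT + ".tmp", CKPT)

    state = load_checkpoint()
    for i in range(state["i"], 1_000_000):
        state["acc"] += i            # stand-in for real work
        state["i"] = i + 1
        if state["i"] % INTERVAL == 0:
            save_checkpoint(state)   # a crash loses at most INTERVAL iterations
    save_checkpoint(state)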

  6. Research on cloud computing solutions

    OpenAIRE

    Liudvikas Kaklauskas; Vaida Zdanytė

    2015-01-01

    Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang adapted six phases of computing paradigms, from dummy terminals/mainframes, to PCs, to network computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, ...

  7. Computer surety: computer system inspection guidance. [Contains glossary

    Energy Technology Data Exchange (ETDEWEB)

    1981-07-01

    This document discusses computer surety in NRC-licensed nuclear facilities from the perspective of physical protection inspectors. It gives background information and a glossary of computer terms, along with threats and computer vulnerabilities, methods used to harden computer elements, and computer audit controls.

  8. Soft computing: forms and limits in computational aesthetics

    OpenAIRE

    Fazi, M Beatrice

    2011-01-01

    This paper contends that soft computing can help us investigate the aesthetics of digital computation. Employing broader conceptions of aesthetics and perception, and whilst drawing upon the ontology of Alfred N. Whitehead, it uses soft computing to address the 'prehensive' dimension of the quantitative procedures of computation, and explores the interrelationship between the factuality and formality of computational structures.

  9. Computational imaging: Combining optics, computation and perception

    OpenAIRE

    Masiá Corcoy, Belén; Gutiérrez Pérez, Diego

    2013-01-01

    This thesis presents contributions to different parts of the imaging pipeline, from image capture, through the processing carried out in the intermediate steps, to the display of images on a monitor or another device. We group the various techniques and algorithms used in the different stages under the concept of Computational Imaging. The topics are diverse, but the driving force and common thread has been the idea that a combination ...

  10. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... special computer program processes this large volume of data to create two-dimensional cross-sectional images of ... Society of Urogenital Radiology note that the available data suggest that it is safe to continue breastfeeding ...

  11. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... special computer program processes this large volume of data to create two-dimensional cross-sectional images of ... Society of Urogenital Radiology note that the available data suggest that it is safe to continue breastfeeding ...

  12. CMS computing model evolution

    International Nuclear Information System (INIS)

    The CMS Computing Model was developed and documented in 2004. Since then the model has evolved to become more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we will discuss the changes planned for the restart of the LHC program in 2015: the changes planned in the use and definition of the computing tiers that were defined within the MONARC project; how we intend to use new services and infrastructure to provide more efficient and transparent access to the data; and the computing plans to make better use of the available capacity by scheduling more of the processor nodes, making better use of the disk storage, and using the network more intelligently.

  13. Resilient computer system design

    CERN Document Server

    Castano, Victor

    2015-01-01

    This book presents a paradigm for designing new generation resilient and evolving computer systems, including their key concepts, elements of supportive theory, methods of analysis and synthesis of ICT with new properties of evolving functioning, as well as implementation schemes and their prototyping. The book explains why new ICT applications require a complete redesign of computer systems to address challenges of extreme reliability, high performance, and power efficiency. The authors present a comprehensive treatment for designing the next generation of computers, especially addressing safety-critical, autonomous, real-time, military, banking, and wearable health care systems. The book:
    - describes design solutions for a new computer system, the evolving reconfigurable architecture (ERA), that is free from drawbacks inherent in current ICT and related engineering models;
    - pursues simplicity, reliability, and scalability principles of design implemented through redundancy and re-configurability; targeted for energy-,...

  14. Feynman Lectures on Computation

    CERN Document Server

    Feynman, Richard Phillips; Allen, Robin W

    1999-01-01

    "When, in 1984-86, Richard P. Feynman gave his famous course on computation at the California Institute of Technology, he asked Tony Hey to adapt his lecture notes into a book. Although led by Feynman,"

  15. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... be viewed on a computer monitor, printed on film or transferred to a CD or DVD. CT ... distinguished from one another on an x-ray film or CT electronic image. In a conventional x- ...

  16. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... be viewed on a computer monitor, printed on film or transferred to a CD or DVD. CT ... distinguished from one another on an x-ray film or CT electronic image. In a conventional x- ...

  17. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... Videos related to Computed Tomography (CT) - Head About this Site RadiologyInfo.org is produced by: Please note ... you can search the ACR-accredited facilities database. This website does not provide cost information. The costs ...

  18. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... Images related to Computed Tomography (CT) - Sinuses About this Site RadiologyInfo.org is produced by: Please note ... you can search the ACR-accredited facilities database. This website does not provide cost information. The costs ...

  19. GPU computing and applications

    CERN Document Server

    See, Simon

    2015-01-01

    This book presents a collection of state-of-the-art research on GPU computing and applications. The major part of the book is selected from the work presented at the 2013 Symposium on GPU Computing and Applications held at Nanyang Technological University, Singapore (Oct 9, 2013). Three major domains of GPU application are covered: (1) engineering design and simulation; (2) biomedical sciences; and (3) interactive and digital media. The book also addresses fundamental issues in GPU computing, with a focus on big data processing. Researchers and developers in GPU computing and applications will benefit from this book, and training professionals and educators can also use it to learn about possible applications of GPU technology in various areas.

  20. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... look like? The CT scanner is typically a large, box-like machine with a hole, or short ... spiral path. A special computer program processes this large volume of data to create two-dimensional cross- ...