WorldWideScience

Sample records for cii computers

  1. Simulated CII observations for SPICA/SAFARI

    CERN Document Server

    Levrier, F; Hennebelle, P; Falgarone, E; Petit, F Le; Goicoechea, J R

    2009-01-01

    We investigate the case of CII 158 micron observations for SPICA/SAFARI using a three-dimensional magnetohydrodynamical (MHD) simulation of the diffuse interstellar medium (ISM) and the Meudon PDR code. The MHD simulation consists of two converging flows of warm gas (10,000 K) within a cubic box 50 pc in length. The interplay of thermal instability, magnetic field and self-gravity leads to the formation of cold, dense clumps within a warm, turbulent interclump medium. We sample several clumps along a line of sight through the simulated cube and use them as input density profiles in the Meudon PDR code. This allows us to derive intensity predictions for the CII 158 micron line and provide time estimates for the mapping of a given sky area.
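
    The quoted time estimates follow standard survey bookkeeping: the total time is the number of beams needed to tile the area times a per-pointing integration set by the radiometer equation. A minimal Python sketch of that arithmetic, with every instrument number a hypothetical placeholder rather than a SPICA/SAFARI specification:

      def mapping_time_hours(area_deg2, beam_fwhm_arcsec, line_flux, snr, nefd):
          """Rough mapping-time estimate: N_pointings x t_per_pointing.
          All inputs are hypothetical, not SAFARI specifications."""
          beam_area = 1.133 * beam_fwhm_arcsec ** 2       # Gaussian beam area, arcsec^2
          n_pointings = area_deg2 * 3600.0 ** 2 / beam_area
          sigma = line_flux / snr                         # required 1-sigma line sensitivity
          t_per_pointing = (nefd / sigma) ** 2            # radiometer equation, seconds
          return n_pointings * t_per_pointing / 3600.0

      # e.g. 0.5 deg^2 mapped at 10" resolution to SNR = 5 on a 1e-18 W/m^2
      # line, with a made-up sensitivity NEFD = 5e-18 W m^-2 s^(1/2):
      print(mapping_time_hours(0.5, 10.0, 1e-18, 5.0, 5e-18))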

  2. Velocity-resolved [CII] emission and [CII]/FIR Mapping along Orion with Herschel

    CERN Document Server

    Goicoechea, J R; Etxaluze, M; Goldsmith, P F; Ossenkopf, V; Gerin, M; Bergin, E A; Black, J H; Cernicharo, J; Cuadrado, S; Encrenaz, P; Falgarone, E; Fuente, A; Hacar, A; Lis, D C; Marcelino, N; Melnick, G J; Muller, H S P; Persson, C; Pety, J; Rollig, M; Schilke, P; Simon, R; Snell, R L; Stutzki, J

    2015-01-01

    We present the first 7.5'x11.5' velocity-resolved map of the [CII]158um line toward the Orion molecular cloud-1 (OMC-1) taken with the Herschel/HIFI instrument. In combination with far-infrared (FIR) photometric images and velocity-resolved maps of the H41alpha hydrogen recombination and CO J=2-1 lines, this data set provides an unprecedented view of the intricate small-scale kinematics of the ionized/PDR/molecular gas interfaces and of the radiative feedback from massive stars. The main contribution to the [CII] luminosity (~85%) is from the extended, FUV-illuminated face of the cloud (G_0>500, n_H>5x10^3 cm^-3) and from dense PDRs (G_0~10^4, n_H~10^5 cm^-3) at the interface between OMC-1 and the HII region surrounding the Trapezium cluster. Around 15% of the [CII] emission arises from a different gas component without CO counterpart. The [CII] excitation, PDR gas turbulence, line opacity (from [13CII]) and role of the geometry of the illuminating stars with respect to the cloud are investigated. We construct...

  3. The Local [CII] 158 um Emission Line Luminosity Function

    CERN Document Server

    Hemmati, Shoubaneh; Diaz-Santos, Tanio; Armus, Lee; Capak, Peter; Faisst, Andreas; Masters, Daniel

    2016-01-01

    We present, for the first time, the local [CII] 158 um emission line luminosity function measured using a sample of more than 500 galaxies from the Revised Bright Galaxy Sample (RBGS). [CII] luminosities are measured from the Herschel PACS observations of the Luminous Infrared Galaxies in the Great Observatories All-sky LIRG Survey (GOALS) and estimated for the rest of the sample based on the far-IR luminosity and color. The sample covers 91.3% of the sky and is complete at S_60 um > 5.24 Jy. We calculated the completeness as a function of [CII] line luminosity and distance, based on the far-IR color and flux densities. The [CII] luminosity function is constrained in the range ~10^(7-9) L_sun from both the 1/V_max and maximum likelihood methods. The shape of our derived [CII] emission line luminosity function agrees well with the IR luminosity function. For the CO(1-0) and [CII] luminosity functions to agree, we propose a varying ratio of [CII]/CO(1-0) as a function of CO luminosity, with larger ratios for f...
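
    The 1/V_max method mentioned above weights each galaxy by the inverse of the largest volume in which it would still pass the survey flux cut. A minimal sketch using Euclidean volumes and synthetic data for brevity (the paper additionally applies the [CII] completeness corrections described above):

      import numpy as np

      def lf_vmax(L_cii, d_max_mpc, sky_fraction=0.913, nbins=8):
          """Schmidt (1968) 1/V_max luminosity-function estimator."""
          v_max = sky_fraction * (4.0 / 3.0) * np.pi * d_max_mpc ** 3
          bins = np.linspace(7.0, 9.0, nbins + 1)          # the ~10^(7-9) L_sun range
          phi, edges = np.histogram(np.log10(L_cii), bins=bins, weights=1.0 / v_max)
          return phi / np.diff(edges), edges               # Mpc^-3 dex^-1

      # Synthetic stand-ins: L_cii in L_sun, and d_max the distance out to
      # which each source would stay above the S_60um > 5.24 Jy limit.
      rng = np.random.default_rng(0)
      L = 10 ** rng.uniform(7.0, 9.0, 500)
      d_max = 30.0 * (L / 1e7) ** 0.5                      # toy flux-limit scaling
      phi, edges = lf_vmax(L, d_max)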

  4. Simulating the [CII] emission of high redshift galaxies

    DEFF Research Database (Denmark)

    Pardos Olsen, Karen; Greve, Thomas Rodriguez; Narayanan, Desika;

    2016-01-01

    The fine structure line of [CII] at 158 microns can arise throughout the interstellar medium (ISM) and has been proposed as a tracer of star formation rate (SFR). But the origin of [CII] and how it depends on e.g. metallicity and radiation field of a galaxy remain uncertain. Simulating [CII] can [...] density. For the chemistry and radiative transfer, the photoionization code CLOUDY is implemented. I will show results for z=2 star-forming galaxies yet to be observed, as well as preliminary results for galaxies at z~6-7 where observations have presented contradictory detections and non-detections of star-forming galaxies.

  5. Comparing [CII], HI, and CO dynamics of nearby galaxies

    CERN Document Server

    de Blok, W J G; Smith, J -D T; Herrera-Camus, R; Bolatto, A D; Requena-Torres, M A; Crocker, A F; Croxall, K V; Kennicutt, R C; Koda, J; Armus, L; Boquien, M; Dale, D; Kreckel, K; Meidt, S

    2016-01-01

    The HI and CO components of the interstellar medium (ISM) are usually used to derive the dynamical mass M_dyn of nearby galaxies. Both components become too faint to be used as a tracer in observations of high-redshift galaxies. In those cases, the 158 $\mu$m line of atomic carbon [CII] may be the only way to derive M_dyn. As the distribution and kinematics of the ISM tracer affect the determination of M_dyn, it is important to quantify the relative distributions of HI, CO and [CII]. HI and CO are well-characterised observationally; however, for [CII] only very few measurements exist. Here we compare observations of CO, HI, and [CII] emission of a sample of nearby galaxies, drawn from the HERACLES, THINGS and KINGFISH surveys. We find that within R_25, the average [CII] exponential radial profile is slightly shallower than that of the CO, but much steeper than the HI distribution. This is also reflected in the integrated spectrum ("global profile"), where the [CII] spectrum looks more like that of the CO tha...
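
    The radial-profile comparison reduces to fitting an exponential Sigma(r) = Sigma_0 exp(-r/h) to each tracer and comparing scale lengths h. A minimal sketch on synthetic data (real inputs would be azimuthally averaged HERACLES, THINGS and KINGFISH profiles):

      import numpy as np
      from scipy.optimize import curve_fit

      def exp_profile(r, sigma0, h):
          """Exponential disk: Sigma(r) = Sigma_0 * exp(-r / h)."""
          return sigma0 * np.exp(-r / h)

      rng = np.random.default_rng(1)
      r = np.linspace(0.05, 1.0, 20)                       # radius in units of R_25
      cii = 10.0 * np.exp(-r / 0.35) * (1 + 0.05 * rng.standard_normal(r.size))

      (sigma0, h), _ = curve_fit(exp_profile, r, cii, p0=(5.0, 0.5))
      print(f"[CII] scale length: {h:.2f} R_25")
      # Repeating the fit for CO and HI and comparing the fitted h values
      # quantifies "slightly shallower than CO, much steeper than HI".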

  6. On the [CII]-SFR relation in high redshift galaxies

    CERN Document Server

    Vallini, L; Ferrara, A; Pallottini, A; Yue, B

    2015-01-01

    After two ALMA observing cycles, only a handful of [CII] $158\,\mu m$ emission line searches in z>6 galaxies have reported a positive detection, questioning the applicability of the local [CII]-SFR relation to high-z systems. To investigate this issue we use the Vallini et al. 2013 (V13) model, based on high-resolution, radiative transfer cosmological simulations, to predict the [CII] emission from the interstellar medium of a z~7 (halo mass $M_h=1.17\times10^{11}M_{\odot}$) galaxy. We improve the V13 model by including (a) a physically-motivated metallicity (Z) distribution of the gas, (b) the contribution of Photo-Dissociation Regions (PDRs), and (c) the effects of the Cosmic Microwave Background on the [CII] line luminosity. We study the relative contribution of diffuse neutral gas to the total [CII] emission ($F_{diff}/F_{tot}$) for different SFR and Z values. We find that the [CII] emission arises predominantly from PDRs: regardless of the galaxy properties, $F_{diff}/F_{tot}\leq 10$% since, at these early epoc...

  7. Prospects for detecting CII emission during the Epoch of Reionization

    CERN Document Server

    Silva, Marta B; Cooray, Asantha; Gong, Yan

    2014-01-01

    We produce simulations of emission of the atomic CII line in large sky fields in order to determine the current prospects for mapping this line during the high-redshift Epoch of Reionization. We estimate the CII line intensity, redshift evolution and spatial fluctuations using observational relations between CII emission and the SFR in a galaxy for the frequency range of 200 GHz to 300 GHz. We obtained a frequency-averaged intensity of CII emission of ${\rm I_{CII}=(4 \pm 2)\times10^{2}\, Jy\, sr^{-1}}$ in the redshift range $z \sim 5.3 - 8.5$. Observations of CII emission in this frequency range will suffer contamination from emission lines at lower redshifts, in particular from the CO rotation lines. For the relevant frequency range we estimated the CO contamination (originating in emission from galaxies at $z < 2.5$), using simulations, to be ${\rm I_{CO} \approx 10^{3}\, Jy\, sr^{-1}}$ and independently confirmed the result based on observational relations. We generated maps as ...

  8. The Spatially Resolved [CII] Cooling Line Deficit in Galaxies

    CERN Document Server

    Smith, J D T; Draine, Bruce; De Looze, Ilse; Sandstrom, Karin; Armus, Lee; Beirao, Pedro; Bolatto, Alberto; Boquien, Mederic; Brandl, Bernhard; Crocker, Alison; Dale, Daniel A; Galametz, Maud; Groves, Brent; Helou, George; Herrera-Camus, Rodrigo; Hunt, Leslie; Kennicutt, Robert; Walter, Fabian; Wolfire, Mark

    2016-01-01

    We present [CII] 158um measurements from over 15,000 resolved regions within 54 nearby galaxies of the KINGFISH program to investigate the so-called [CII] "line cooling deficit" long known to occur in galaxies with different luminosities. The [CII]/TIR ratio ranges from above 1% to below 0.1% in the sample, with a mean value of 0.48+-0.21%. We find that the surface density of 24um emission dominates this trend, with [CII]/TIR dropping as νIν(24um) increases. Deviations from this overall decline are correlated with changes in the gas-phase metal abundance, with higher metallicity associated with deeper deficits at a fixed surface brightness. We supplement the local sample with resolved [CII] measurements from nearby luminous infrared galaxies and high-redshift sources from z=1.8-6.4, and find that star formation rate density drives a continuous trend of deepening [CII] deficit across six orders of magnitude in SFRD. The tightness of this correlation suggests that an approximate star formation rate density ca...

  9. [CII] synthetic emission maps of simulated galactic disks

    Science.gov (United States)

    Franeck, A.; Walch, S.; Glover, S. C. O.; Seifried, D.; Girichidis, P.; Naab, T.; Klessen, R.; Peters, T.; Wünsch, R.; Gatto, A.; Clark, P. C.

    2016-05-01

    We carry out radiative transfer simulations for the [CII] emission of simulated galactic disks from the SILCC project. Here we present the integrated [CII] intensity map of a typical simulation which assumes solar neighbourhood conditions with Σ_GAS = 10 M⊙/pc^2 and a supernova rate of 15 SN/Myr with randomly distributed supernovae (SNe) at t = 50 Myr. We analyse the intensity profile, which reveals different components. These are clearly distinguishable and trace cold, molecular as well as warm, outflowing gas. We find that [CII] is a promising tool to analyse the multi-phase structure of the ISM. SILCC: Simulating the LIfe Cycle of molecular Clouds: hera.ph1.uni-koeln.de/~silcc/

  10. KirCII - promising tool for polyketide diversification

    DEFF Research Database (Denmark)

    Musiol-Kroll, Ewa Maria; Härtner, Thomas; Kulik, Andreas;

    2014-01-01

    Kirromycin is produced by Streptomyces collinus Tü 365. This compound is synthesized by a large assembly line of type I polyketide synthases and non-ribosomal peptide synthetases (PKS I/NRPS), encoded by the genes kirAI-kirAVI and kirB. The PKSs KirAI-KirAV have no acyltransferase domains integrated [...] introducing the non-native substrates in an in vivo context. Thus, KirCII represents a promising tool for polyketide diversification.

  11. Enhanced [CII] emission in a z=4.76 submillimetre galaxy

    CERN Document Server

    De Breuck, Carlos; Caselli, Paola; Coppin, Kristen; Hailey-Dunsheath, Steve; Nagao, Tohru

    2011-01-01

    We present the detection of bright [CII] emission in the z=4.76 submillimetre galaxy LESS J033229.4-275619 using the Atacama Pathfinder EXperiment. This represents the highest redshift [CII] detection in a submm selected, star-formation dominated system. The AGN contributions to the [CII] and far-infrared (FIR) luminosities are small. We find an atomic mass derived from [CII] comparable to the molecular mass derived from CO. The ratio of the [CII], CO and FIR luminosities imply a radiation field strength G_0~10^3 and a density ~10^4 cm^-3 in a kpc-scale starburst, as seen in local and high redshift starbursts. The high L_[CII]/L_FIR=2.4x10^-3 and the very high L_[CII]/L_CO(1-0) ~ 10^4 are reminiscent of low metallicity dwarf galaxies, suggesting that the highest redshift star-forming galaxies may also be characterised by lower metallicities. We discuss the implications of a reduced metallicity on studies of the gas reservoirs, and conclude that especially at very high redshift, [CII] may be a more powerful an...

  12. ALMA resolves turbulent, rotating [CII] emission in a young starburst galaxy at z=4.8

    CERN Document Server

    De Breuck, Carlos; Swinbank, Mark; Caselli, Paola; Coppin, Kristen; Davis, Timothy A; Maiolino, Roberto; Nagao, Tohru; Smail, Ian; Walter, Fabian; Weiss, Axel; Zwaan, Martin

    2014-01-01

    We present spatially resolved Atacama Large Millimeter/submillimeter Array (ALMA) [CII] observations of the z=4.7555 submillimetre galaxy ALESS 73.1. Our 0.5" FWHM map resolves the [CII]-emitting gas, which is centred close to the active galactic nucleus (AGN). The gas kinematics are dominated by rotation but with high turbulence, v_rot/sigma_int~3.1, and a Toomre Q parameter ~0.4. The diameter of the dust continuum emission is [...] 80 Gyr^{-1}, especially since there are no clear indications of recent merger activity. Finally, our high signal-to-noise [CII] measurement revises the observed [NII]/[CII] ratio, which suggests a close to solar metallicity, unless the [CII] flux contains significant contributions from HII regions. Our observations suggest that ALESS 73.1 is a nascent galaxy undergoing its first major burst of star formation, embedded within an unstable but metal-rich gas disk.
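
    The quoted Toomre parameter comes from the standard stability criterion for a rotating gas disk, Q = kappa * sigma / (pi * G * Sigma), with Q < 1 marking gravitational instability. A minimal sketch with illustrative numbers (not the measured ALESS 73.1 values):

      import numpy as np

      G = 4.301e-3  # gravitational constant in pc (km/s)^2 / M_sun

      def toomre_q_gas(kappa, sigma_gas, surf_density):
          """Q = kappa * sigma / (pi * G * Sigma) for a gas disk.
          kappa: epicyclic frequency [km/s/pc]; sigma_gas: velocity
          dispersion [km/s]; surf_density: gas surface density [M_sun/pc^2]."""
          return kappa * sigma_gas / (np.pi * G * surf_density)

      # Illustrative only: a dense, turbulent disk can still give Q < 1,
      # consistent with the unstable disk (Q ~ 0.4) described above.
      print(toomre_q_gas(kappa=0.05, sigma_gas=40.0, surf_density=500.0))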

  13. Disentangling the ISM phases of the dwarf galaxy NGC 4214 using [CII] SOFIA/GREAT observations

    CERN Document Server

    Fahrion, Katja; Bigiel, Frank; Hony, Sacha; Abel, Nick P; Cigan, Phil; Csengeri, Timea; Graf, Urs; Lebouteiller, Vianney; Madden, Suzanne C; Wu, Ronin; Young, Lisa

    2016-01-01

    The [CII] 158 um fine structure line is one of the dominant cooling lines in the interstellar medium (ISM) and is an important tracer of star formation. Recent velocity-resolved studies with Herschel/HIFI and SOFIA/GREAT showed that the [CII] line can constrain the properties of the ISM phases in star-forming regions. The [CII] line as a tracer of star formation is particularly important in low-metallicity environments where CO emission is weak because of the presence of large amounts of CO-dark gas. The nearby irregular dwarf galaxy NGC 4214 offers an excellent opportunity to study an actively star-forming ISM at low metallicity. We analyzed the spectrally resolved [CII] line profiles in three distinct regions at different evolutionary stages of NGC 4214 with respect to ancillary HI and CO data in order to study the origin of the [CII] line. We used SOFIA/GREAT [CII] 158 um observations, HI data from THINGS, and CO(2-1) data from HERACLES to decompose the spectrally resolved [CII] line profiles into componen...
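
    Decompositions of this kind fit the [CII] profile with components whose velocities and widths are anchored to the CO and HI spectra. A minimal two-Gaussian sketch on synthetic data (all names and numbers illustrative, not values from the paper):

      import numpy as np
      from scipy.optimize import curve_fit

      def gauss(v, amp, v0, sigma):
          return amp * np.exp(-0.5 * ((v - v0) / sigma) ** 2)

      def two_components(v, a1, v1, s1, a2, v2, s2):
          """[CII] as a narrow CO-like (molecular) plus a broad HI-like
          (atomic) Gaussian component."""
          return gauss(v, a1, v1, s1) + gauss(v, a2, v2, s2)

      rng = np.random.default_rng(2)
      v = np.linspace(250.0, 350.0, 200)                   # km/s
      spec = two_components(v, 1.0, 295.0, 4.0, 0.4, 300.0, 12.0)
      spec += 0.03 * rng.standard_normal(v.size)

      popt, _ = curve_fit(two_components, v, spec,
                          p0=(1.0, 295.0, 5.0, 0.5, 300.0, 10.0))
      # Integrated fluxes scale as amp * width; sqrt(2*pi) cancels in the ratio.
      f_narrow, f_broad = popt[0] * abs(popt[2]), popt[3] * abs(popt[5])
      print("molecular-like fraction:", f_narrow / (f_narrow + f_broad))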

  14. Stability of CII is a key element in the cold stress response of bacteriophage lambda infection.

    Science.gov (United States)

    Obuchowski, M; Shotland, Y; Koby, S; Giladi, H; Gabig, M; Wegrzyn, G; Oppenheim, A B

    1997-10-01

    Bacteria are known to adapt to environmental changes such as temperature fluctuations. It was found that temperature affects the lysis-lysogeny decision of lambda such that at body temperature (37 degrees C) the phage can select between the lytic and lysogenic pathways, while at ambient temperature (20 degrees C) the lytic pathway is blocked. This temperature-dependent discriminatory developmental pathway is governed mainly by the phage CII activity as a transcriptional activator. Mutations in cII or point mutations at the pRE promoter lead to an over-1,000-fold increase in mature-phage production at low temperature while mutations in cI cause a smaller increase in phage production. Interference with CII activity can restore lytic growth at low temperature. We found that at low temperature the stability of CII in vivo is greatly increased. It was also found that phage DNA replication is blocked at 20 degrees C but can be restored by supplying O and P in trans. It is proposed that CII hampers transcription of the rightward pR promoter, thus reducing the levels of the lambda O and P proteins, which are necessary for phage DNA replication. Our results implicate CII itself or host proteins affecting CII stability as a "molecular thermometer".

  15. The nature of the [CII] emission in dusty star-forming galaxies from the SPT-survey

    CERN Document Server

    Gullberg, Bitten; Vieira, Joaquin; Weiss, Axel; Aguirre, James; Aravena, Manuel; Béthermin, Matthieu; Bradford, C Matt; Bothwell, Matt; Carlstrom, John; Chapman, Scott; Fassnacht, Chris; Gonzalez, Anthony; Greve, Thomas; Hezavah, Yashar; Holtzapfel, William L; Husband, Kate; Ma, Jingzhe; Malkan, Matt; Marrone, Dan; Menten, Karl; Murphy, Eric; Reichardt, Chris; Spilker, Justin; Stark, Anthony; Strandet, Maria; Welikala, Niraj

    2015-01-01

    We present [CII] observations of 20 strongly lensed dusty star-forming galaxies at 2.1 < z < 5.7 (S > 20 mJy) from the South Pole Telescope survey, with far-infrared (FIR) luminosities determined from extensive photometric data. The [CII] line is robustly detected in 17 sources, all but one being spectrally resolved. Eleven out of 20 sources observed in [CII] also have low-J CO detections from ATCA. A comparison with mid- and high-J CO lines from ALMA reveals consistent [CII] and CO velocity profiles, suggesting that there is little differential lensing between these species. The [CII], low-J CO and FIR data allow us to constrain the properties of the interstellar medium. We find [CII] to CO(1-0) luminosity ratios in the SPT sample of 5200 +- 1800, with significantly less scatter than in other samples. This line ratio can be best described by a medium of [CII] and CO emitting gas with a higher [CII] than CO excitation temperature, high CO optical depth tau_CO >> 1, and low to moderate [CII] optical depth tau_CII ~< 1. T...

  16. Internal structure of spiral arms traced with [CII]: Unraveling the WIM, HI, and molecular emission lanes

    CERN Document Server

    Velusamy, T; Goldsmith, P F; Pineda, J L

    2015-01-01

    The spiral arm tangencies are ideal lines of sight in which to determine the distribution of interstellar gas components in the spiral arms and study the influence of spiral density waves on the interarm gas in the Milky Way. We present a large-scale (~15deg) position-velocity map of the Galactic plane in [CII] from l = 326.6 to 341.4deg observed with Herschel HIFI. We use [CII] l-v maps along with those for HI and 12CO to derive the average spectral line intensity profiles over the longitudinal range of each tangency. Using the VLSR of the emission features, we locate the [CII], HI, and 12CO emissions along a cross cut of the spiral arm. In the spectral line profiles at the tangencies [CII] has two emission peaks, one associated with the compressed WIM and the other with the molecular gas PDRs. When represented as a cut across the inner to outer edge of the spiral arm, the [CII]-WIM peak appears closest to the inner edge while 12CO and [CII] associated with molecular gas are at the outermost edge. HI has broader ...

  17. Radiative Transfer meets Bayesian statistics: where does your Galaxy's [CII] come from?

    CERN Document Server

    Accurso, Gioacchino; Bisbas, Thomas G; Viti, Serena

    2016-01-01

    The [CII] 158$\mu$m emission line can arise in all phases of the ISM; being able to disentangle the different contributions is therefore an important yet unresolved problem when undertaking galaxy-wide, integrated [CII] observations. We present a new multi-phase 3D radiative transfer interface that couples Starburst99, a stellar spectrophotometric code, with the photoionisation and astrochemistry codes Mocassin and 3D-PDR. We model entire star-forming regions, including the ionised, atomic and molecular phases of the ISM, and apply a Bayesian inference methodology to parametrise how the fraction of the [CII] emission originating from molecular regions, $f_{[CII],mol}$, varies as a function of typical integrated properties of galaxies in the local Universe. The main parameters responsible for the variations of $f_{[CII],mol}$ are specific star formation rate (sSFR), gas phase metallicity, HII region electron number density ($n_e$), and dust mass fraction. For example, $f_{[CII],mol}$ can increase from 60% to 8...

  18. [CII] 158 $\mu$m Emission as a Star Formation Tracer

    CERN Document Server

    Herrera-Camus, R; Wolfire, M G; Smith, J D; Croxall, K V; Kennicutt, R C; Calzetti, D; Helou, G; Walter, F; Leroy, A K; Draine, B; Brandl, B R; Armus, L; Sandstrom, K M; Dale, D A; Aniano, G; Meidt, S E; Boquien, M; Hunt, L K; Galametz, M; Tabatabaei, F S; Murphy, E J; Appleton, P; Roussel, H; Engelbracht, C; Beirao, P

    2014-01-01

    The [CII] 157.74 $\mu$m transition is the dominant coolant of the neutral interstellar gas, and has great potential as a star formation rate (SFR) tracer. Using the Herschel KINGFISH sample of 46 nearby galaxies, we investigate the relation of [CII] surface brightness and luminosity with SFR. We conclude that [CII] can be used for measurements of SFR on both global and kiloparsec scales in normal star-forming galaxies in the absence of strong active galactic nuclei (AGN). The uncertainty of the $\Sigma_{\rm [CII]}-\Sigma_{\rm SFR}$ calibration is $\pm$0.21 dex. The main source of scatter in the correlation is associated with regions that exhibit warm IR colors, and we provide an adjustment based on IR color that reduces the scatter. We show that the color-adjusted $\Sigma_{\rm [CII]}-\Sigma_{\rm SFR}$ correlation is valid over almost 5 orders of magnitude in $\Sigma_{\rm SFR}$, holding for both normal star-forming galaxies and non-AGN luminous infrared galaxies. Using [CII] luminosity instead of surface bright...
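
    Applying such a calibration is a one-line power law plus the quoted +-0.21 dex scatter. A minimal sketch with placeholder coefficients (the real normalization and slope are given in the paper and not reproduced here):

      import numpy as np

      A, B = 1.0e-7, 1.0            # hypothetical placeholders, not the published fit

      def sigma_sfr(sigma_cii, scatter_dex=0.21):
          """Sigma_SFR = A * Sigma_[CII]^B with the +-0.21 dex calibration
          uncertainty propagated; returns (central, lower, upper)."""
          central = A * np.asarray(sigma_cii, dtype=float) ** B
          return central, central / 10 ** scatter_dex, central * 10 ** scatter_dex

      print(sigma_sfr(1.0e7))       # arbitrary input surface brightness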

  19. A physical model for the [CII]-FIR deficit in luminous galaxies

    CERN Document Server

    Narayanan, Desika

    2016-01-01

    Observations of ionised carbon at 158 micron ([CII]) from luminous star-forming galaxies at z~0 show that their ratios of [CII] to far-infrared (FIR) luminosity are systematically lower than those of more modestly star-forming galaxies. In this paper, we provide a theory for the origin of this so-called "[CII] deficit" in galaxies. Our model treats the interstellar medium as a collection of clouds with radially-stratified chemical and thermal properties, which are dictated by the clouds' volume and surface densities, as well as the interstellar radiation and cosmic ray fields to which they are exposed. [CII] emission arises from the outer, HI-dominated layers of clouds, and from regions where the hydrogen is H2 but the carbon is predominantly C+. In contrast, the most shielded regions of clouds are dominated by CO and produce little [CII] emission. This provides a natural mechanism to explain the observed [CII]-star formation relation: galaxies' star formation rates are largely driven by the surface densities...

  20. Anti-CII antibody as a novel indicator to assess disease activity in systemic lupus erythematosus.

    Science.gov (United States)

    He, C; Mao, T; Feng, Y; Song, T; Qin, C; Yan, R; Feng, P

    2015-11-01

    Systemic lupus erythematosus (SLE) is a chronic autoimmune disorder that affects a variety of organ systems. Anti-dsDNA Abs and complement factors have been used as indicators of lupus activity for more than 50 years. A novel indicator of activation in SLE is reported in this paper. Anti-collagen type II (CII) Ab was markedly elevated in patients with SLE compared to patients with ankylosing spondylitis (AS) and healthy controls (HCs). Anti-CII-Ab-positive patients with SLE showed significantly higher levels of serum IgG and higher titers of ANA but lower levels of C3 and C4 than controls. A positive correlation was demonstrated between anti-CII Ab and serum IgG in SLE patients (r = 0.50, p < 0.0001). Negative correlations of anti-CII Ab with C3 and C4 were observed in SLE patients (r = -0.36, p = 0.0013; r = -0.37, p = 0.0006, respectively). The reduced anti-CII Ab level was accompanied by a decreased level of serum IgG and increased levels of C3 and C4 after regular treatment. Therefore, anti-CII Ab could be a novel indicator for monitoring activity of SLE.
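
    The reported statistics are plain Pearson correlations. A synthetic illustration of how such r and p values are computed (hypothetical cohort, not the study's data):

      import numpy as np
      from scipy.stats import pearsonr

      rng = np.random.default_rng(0)
      anti_cii = rng.normal(50.0, 10.0, 80)            # hypothetical Ab levels
      igg = 0.5 * anti_cii + rng.normal(0.0, 8.0, 80)  # built-in positive trend

      r, p = pearsonr(anti_cii, igg)
      print(f"r = {r:.2f}, p = {p:.2g}")               # compare r = 0.50, p < 0.0001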

  21. PROTECTING CRITICAL DATABASES – TOWARDS A RISK-BASED ASSESSMENT OF CRITICAL INFORMATION INFRASTRUCTURES (CIIS) IN SOUTH AFRICA

    Directory of Open Access Journals (Sweden)

    Mzukisi N Njotini

    2013-04-01

    South Africa has made great strides towards protecting critical information infrastructures (CIIs). For example, South Africa recognises the significance of safeguarding places or areas that are essential to the national security of South Africa or the economic and social well-being of South African citizens. For this reason South Africa has established mechanisms to assist in preserving the integrity and security of CIIs. The measures provide inter alia for the identification of CIIs; the registration of the full names, address and contact details of the CII administrators (the persons who manage CIIs); the identification of the location(s) of CIIs or their component parts; and the outlining of the general descriptions of information or data stored in CIIs. It is argued that the measures to protect CIIs in South Africa are inadequate. In particular, the measures rely on a one-size-fits-all approach to identify and classify CIIs. For this reason the South African measures are likely to lead to the adoption of a paradigm that considers every infrastructure, data or database, regardless of its significance or importance, to be key or critical.

  22. Tracing the reionization epoch with ALMA: [CII] emission in z~7 galaxies

    CERN Document Server

    Pentericci, L; Castellano, M; Fontana, A; Maiolino, R; Guaita, L; Vanzella, E; Grazian, A; Santini, P; Yan, H; Cristiani, S; Conselice, C; Giavalisco, M; Hathi, N; Koekemoer, A

    2016-01-01

    We present new results on [CII] 158$\mu$m emission from four galaxies in the reionization epoch. These galaxies were previously confirmed to be at redshifts between 6.6 and 7.15 from the presence of the Ly$\alpha$ emission line in their spectra. The Ly$\alpha$ emission line is redshifted by 100-200 km/s compared to the systemic redshift given by the [CII] line. These velocity offsets are smaller than what is observed in z~3 Lyman break galaxies with similar UV luminosities and emission line properties. Smaller velocity shifts reduce the visibility of Ly$\alpha$ and hence somewhat alleviate the need for a very neutral IGM at z~7 to explain the drop in the fraction of Ly$\alpha$ emitters observed at this epoch. The galaxies show [CII] emission with L[CII]=0.6-1.6 x10$^8 L_\odot$: these luminosities place them consistently below the SFR-L[CII] relation observed for low-redshift star-forming and metal-poor galaxies and also below z=5.5 Lyman break galaxies with similar star formation rates. We argue that previou...

  23. The ALMA Patchy Deep Survey: A blind search for [CII] emitters at z~4.5

    CERN Document Server

    Matsuda, Y; Iono, D; Hatsukade, B; Kohno, K; Tamura, Y; Yamaguchi, Y; Shimizu, I

    2015-01-01

    We present a result of a blind search for [CII] 158 $\mu$m emitters at $z\sim 4.5$ using ALMA Cycle 0 archival data. We collected extragalactic data covering 330-360 GHz (band 7) from 8 Cycle 0 projects from which initial results have already been published. The total number of fields is 243 and the total on-source exposure time is 19.2 hours. We searched for line emitters in continuum-subtracted data cubes with spectral resolutions of 50, 100, 300 and 500 km/s. We could not detect any new line emitters above a 6-$\sigma$ significance level. This result provides upper limits to the [CII] luminosity function at $z\sim 4.5$ over $L_{\rm [CII]} \sim 10^8 - 10^{10} L_{\odot}$ or star formation rate, SFR $\sim$ 10-1000 M$_\odot$/yr. These limits are at least 2 orders of magnitude larger than the [CII] luminosity functions expected from the $z \sim 4$ UV luminosity function or from numerical simulation. However, this study demonstrates that we would be able to better constrain the [CII] luminosity function a...
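
    Operationally, a blind search like this thresholds continuum-subtracted cubes at several velocity smoothings. A toy sketch of the thresholding step on synthetic data (the real pipeline also smooths spectrally and vets candidates):

      import numpy as np

      def find_line_peaks(cube, sigma_thresh=6.0):
          """Normalize a cube by a robust per-pixel noise estimate and
          return voxels above the significance threshold."""
          noise = 1.4826 * np.median(np.abs(cube), axis=0)   # MAD-like noise map
          return np.argwhere(cube / noise > sigma_thresh)

      rng = np.random.default_rng(3)
      cube = rng.standard_normal((100, 64, 64))   # (channel, y, x), pure noise
      cube[50, 32, 32] += 8.0                     # inject one fake emitter
      print(find_line_peaks(cube))                # -> [[50 32 32]]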

  24. Apolipoprotein C-II and C-III metabolism in a kindred of familial hypobetalipoproteinemia

    Energy Technology Data Exchange (ETDEWEB)

    Malmendier, C.L.; Delcroix, C.; Lontie, J.F.; Dubois, D.Y. (Research Foundation on Atherosclerosis, Brussels (Belgium))

    1991-01-01

    Three affected members of a kindred with asymptomatic hypobetalipoproteinemia (HBL) showed low levels of triglycerides, low-density lipoprotein (LDL) cholesterol, and apolipoproteins (apo) B, C-II, and C-III. Turnover of iodine-labeled apo C-II and apo C-III associated in vitro with plasma lipoproteins was studied after intravenous injection. Radioactivity in plasma and lipoproteins (95% recovered in the high-density lipoprotein (HDL) density range) and in 24-hour urine samples was observed for 16 days. A parallelism of the slowest slopes of the plasma decay curves was observed between apo C-II and apo C-III, indicating a partially common catabolic route. The urine/plasma radioactivity ratio (U/P) varied with time, suggesting heterogeneity of metabolic pathways. A new compartmental model using the SAAM program was built, not only fitting plasma and urine data simultaneously, but also taking into account the partially common metabolism of lipoprotein particles (LP) containing apo C-II and apo C-III. The low apo C-II and C-III plasma concentrations observed in HBL compared with normal resulted from both an increased catabolism and a reduced synthesis, these changes being more marked for apo C-III. The modifications in the rate constants of the different pathways calculated from the new model are in favor of an increased direct removal of particles following the fast pathway, likely in the very-low-density lipoprotein (VLDL) density range.
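
    The "parallel slowest slopes" statement refers to multi-exponential plasma decay curves. A minimal biexponential fit on hypothetical tracer data (a SAAM-style compartmental model would fit plasma and urine simultaneously):

      import numpy as np
      from scipy.optimize import curve_fit

      def biexp(t, a_fast, k_fast, a_slow, k_slow):
          """Two-pool decay; a shared k_slow between apo C-II and apo C-III
          would reproduce the parallel terminal slopes noted above."""
          return a_fast * np.exp(-k_fast * t) + a_slow * np.exp(-k_slow * t)

      rng = np.random.default_rng(4)
      t = np.linspace(0.0, 16.0, 30)                       # days
      y = biexp(t, 0.7, 1.5, 0.3, 0.12) * (1 + 0.03 * rng.standard_normal(t.size))

      popt, _ = curve_fit(biexp, t, y, p0=(0.5, 1.0, 0.5, 0.1))
      print("slow-phase rate constant: %.3f /day" % popt[3])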

  25. Computational Intelligence in Information Systems Conference

    CERN Document Server

    Au, Thien-Wan; Omar, Saiful

    2017-01-01

    This book constitutes the Proceedings of the Computational Intelligence in Information Systems conference (CIIS 2016), held in Brunei, November 18–20, 2016. The CIIS conference provides a platform for researchers to exchange the latest ideas and to present new research advances in general areas related to computational intelligence and its applications. The 26 revised full papers presented in this book have been carefully selected from 62 submissions. They cover a wide range of topics and application areas in computational intelligence and informatics.

  26. A spatially resolved study of photoelectric heating and [CII] cooling in the LMC

    CERN Document Server

    Rubin, D; Madden, S C; Tielens, A G G M; Meixner, M; Indebetouw, R; Reach, W; Ginsburg, A; Kim, S; Mochizuki, K; Babler, B; Block, M; Bracker, S B; Engelbracht, C W; For, B -Q; Gordon, K; Hora, J L; Leitherer, C; Meade, M; Misselt, K; Sewilo, M; Vijh, U; Whitney, B

    2008-01-01

    (abridged) We study photoelectric heating throughout the Large Magellanic Cloud. We quantify the importance of the [CII] cooling line and the photoelectric heating process of various environments in the LMC and investigate which parameters control the extent of photoelectric heating. We use the BICE [CII] map and the Spitzer/SAGE infrared maps. We examine the spatial variations in the efficiency of photoelectric heating: photoelectric heating rate over power absorbed by grains. We correlate the photoelectric heating efficiency and the emission from various dust constituents and study the variations as a function of Hα emission, dust temperatures, and the total infrared luminosity. From this we estimate radiation field, gas temperature, and electron density. We find systematic variations in photoelectric efficiency. The highest efficiencies are found in the diffuse medium, while the lowest coincide with bright star-forming regions (~1.4 times lower). The [CII] line emission constitutes 1.32% of the far in...

  27. [CII] at 1 < z < 2: Star Formation in the Early Universe with ZEUS (1 and 2)

    Science.gov (United States)

    Ferkinhoff, Carl; Hailey-Dunsheath, S.; Nikola, T.; Oberst, T.; Parshley, S.; Stacey, G.; Benford, D.; staguhn, J.

    2010-01-01

    We report the detection of the [CII] 158 micron fine structure line from six submillimeter galaxies with redshifts between 1.12 and 1.73. This more than doubles the total number of [CII] 158 micron detections reported from high-redshift sources. These observations were made with the Redshift (z) and Early Universe Spectrometer (ZEUS) at the Caltech Submillimeter Observatory on Mauna Kea, Hawaii between December 2006 and March 2009. ZEUS is a background-limited submm echelle grating spectrometer (Hailey-Dunsheath 2009). Currently we are constructing ZEUS-2. This new instrument will utilize the same grating but will feature a two-dimensional transition-edge-sensed bolometer array with a SQUID multiplexing readout system, enabling simultaneous background-limited observations in the 200, 340, 450 and 650 micron telluric windows. ZEUS-2 will allow for long-slit imaging spectroscopy in nearby galaxies and a [CII] survey from z ~ 0.25 to 2.5.

  28. ALMA [CII] detection of a redshift 7 lensed galaxy behind RXJ1347.1-1145

    CERN Document Server

    Bradač, Maruša; Huang, Kuang-Han; Vallini, Livia; Finney, Emily; Hoag, Austin; Lemaux, Brian; Schmidt, Kasper; Treu, Tommaso; Carilli, Chris; Dijkstra, Mark; Ferrara, Andrea; Fontana, Adriano; Jones, Tucker; Ryan, Russell; Wagg, Jeff

    2016-01-01

    We present the results of ALMA spectroscopic follow-up of a $z=6.765$ Lyman-$\alpha$ emitting galaxy behind the cluster RXJ1347-1145. We report the detection of a [CII] 158 $\mu$m line fully consistent with the Lyman-$\alpha$ redshift and with the peak of the optical emission. Given the magnification of $\mu=5.0 \pm 0.3$, the intrinsic (corrected for lensing) luminosity of the [CII] line is $L_{[CII]} = 1.4^{+0.2}_{-0.3} \times 10^7 L_{\odot}$, which is ${\sim}5$ times fainter than other detections of $z\sim 7$ galaxies. The result indicates that the low $L_{[CII]}$ of $z\sim 7$ galaxies compared to their local counterparts is likely caused by their low metallicities and/or feedback. The small velocity offset ($\Delta v = 20_{-40}^{+140}$ km/s) between the Lyman-$\alpha$ and [CII] lines is unusual, and may be indicative of ionizing photons escaping.

  29. Measuring the Epoch of Reionization using [CII] Intensity Mapping with TIME-Pilot

    Science.gov (United States)

    Crites, Abigail; Bock, James; Bradford, Matt; Bumble, Bruce; Chang, Tzu-Ching; Cheng, Yun-Ting; Cooray, Asantha R.; Hailey-Dunsheath, Steve; Hunacek, Jonathon; Li, Chao-Te; O'Brient, Roger; Shirokoff, Erik; Staniszewski, Zachary; Shiu, Corwin; Uzgil, Bade; Zemcov, Michael B.; Sun, Guochao

    2017-01-01

    TIME-Pilot (the Tomographic Ionized carbon Intensity Mapping Experiment) is a new instrument designed to probe the epoch of reionization (EoR) by measuring the 158 um ionized carbon emission line [CII] from redshift 5-9. TIME-Pilot will also probe the molecular gas content of the universe during the epoch spanning the peak of star formation (z ~ 1-3) by making an intensity mapping measurement of the CO transitions in the TIME-Pilot band (CO(3-2), CO(4-3), CO(5-4), and CO(6-5)). I will describe the instrument we are building, which is an R ~ 100 spectrometer sensitive to 200-300 GHz radiation. The camera is designed to measure the line emission from galaxies using an intensity mapping technique. This instrument will allow us to detect the [CII] clustering fluctuations from faint galaxies during the EoR and compare these measurements to predicted [CII] amplitudes from current models. The CO measurements will allow us to constrain models for galaxies at lower redshift. The [CII] intensity mapping measurements that will be made with TIME-Pilot, and detailed measurements made with future more sensitive mm-wavelength spectrometers, are complementary to 21-cm measurements of the EoR and to direct detections of high-redshift galaxies with HST, ALMA, and, in the future, JWST.

  30. The Escherichia coli RNA polymerase alpha subunit and transcriptional activation by bacteriophage lambda CII protein.

    Science.gov (United States)

    Gabig, M; Obuchowski, M; Ciesielska, A; Latała, B; Wegrzyn, A; Thomas, M S; Wegrzyn, G

    1998-01-01

    Bacteriophage lambda is not able to lysogenise the Escherichia coli rpoA341 mutant. This mutation causes a single amino acid substitution, Lys271Glu, in the C-terminal domain of the RNA polymerase alpha subunit (alphaCTD). Our previous studies indicated that the impaired lysogenisation of the rpoA341 host is due to a defect in transcriptional activation by the phage CII protein and suggested a role for alphaCTD in this process. Here we used a series of truncation and point mutants in the rpoA gene placed on a plasmid to investigate the process of transcriptional activation by the cII gene product. Our results indicate that amino-acid residues 265, 268 and 271 in the alpha subunit may play an important role in the CII-mediated activation of the pE promoter (most probably residue 271) or may be involved in putative interactions between alphaCTD and an UP-like element near pE (most probably residues 265 and 268). Measurement of the activity of pE-lacZ, pI-lacZ and p(aQ)-lacZ fusions in the rpoA+ and rpoA341 hosts demonstrated that the mechanism of activation of these CII-dependent promoters may be different in each case.

  31. Selected issues of the universal communication environment implementation for CII standard

    Science.gov (United States)

    Zagoździńska, Agnieszka; Poźniak, Krzysztof T.; Drabik, Paweł K.

    2011-10-01

    The contemporary FPGA market offers a wide assortment of device structures, integrated development environments, and boards from different producers. This variety allows designers to match resources to the requirements of an individual project, but it also creates a need to standardize projects so that they remain usable in research laboratories equipped with tools from different producers. The proposed solution is CII standardization of VHDL components. This paper specifies a universal communication environment for the CII standard. The link can be used in different FPGA structures, and its implementation enables object-oriented VHDL programming with the use of CII standardization. The complete environment comprises an FPGA part and PC software. The paper describes selected issues of the FPGA environment, including specific solutions that enable its use in structures from different producers. The flexibility of transmitting data of different sizes with the use of CII is presented. The specified tool makes it possible to fully exploit the variety of FPGA structures and to design faster and more effectively.
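
    The flexible, variable-size transmissions described above can be illustrated with a simple length-prefixed frame on the PC side. The layout below is an assumption for illustration, not taken from the CII specification:

      import struct

      # Assumed frame layout: 2-byte component address | 4-byte payload
      # length | payload bytes (big-endian, no padding).

      def encode_frame(address: int, payload: bytes) -> bytes:
          return struct.pack(">HI", address, len(payload)) + payload

      def decode_frame(frame: bytes) -> tuple[int, bytes]:
          address, length = struct.unpack(">HI", frame[:6])
          payload = frame[6:6 + length]
          if len(payload) != length:
              raise ValueError("truncated frame")
          return address, payload

      frame = encode_frame(0x0042, b"\x01\x02\x03")
      print(decode_frame(frame))                  # (66, b'\x01\x02\x03')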

  32. Herschel Extreme Lensing Line Observations: [CII] Variations in Galaxies at Redshifts z=1–3

    Science.gov (United States)

    Malhotra, Sangeeta; Rhoads, James E.; Finkelstein, K.; Yang, Huan; Carilli, Chris; Combes, Françoise; Dassas, Karine; Finkelstein, Steven; Frye, Brenda; Gerin, Maryvonne; Guillard, Pierre; Nesvadba, Nicole; Rigby, Jane; Shin, Min-Su; Spaans, Marco; Strauss, Michael A.; Papovich, Casey

    2017-01-01

    We observed the [C ii] line in 15 lensed galaxies at redshifts 1 < z < 3. [...] The [CII]/FIR ratio of the HELLO sample is similar to the values seen for low-redshift galaxies, indicating that small grains and PAHs dominate the heating in the neutral ISM, although some of the high [CII]/FIR ratios may be due to turbulent heating. Herschel is an ESA space observatory with science instruments provided by European-led Principal Investigator consortia and with important participation from NASA.

  33. GRB 980425 host: [CII], [OI] and CO lines reveal recent enhancement of star formation due to atomic gas inflow

    CERN Document Server

    Michałowski, Michał J; Wardlow, J L; Karska, A; Messias, H; van der Werf, P; Hunt, L K; Baes, M; Castro-Tirado, A J; Gentile, G; Hjorth, J; Floc'h, E Le; Martinez, R Perez; Guelbenzu, A Nicuesa; Rasmussen, J; Rizzo, J R; Rossi, A; Sanchez-Portal, M; Schady, P; Sollerman, J; Xu, D

    2016-01-01

    We have recently suggested that gas accretion can be studied using host galaxies of gamma-ray bursts (GRBs). We obtained the first ever far-infrared (FIR) line observations of a GRB host, namely Herschel/PACS resolved [CII] 158 um and [OI] 63 um spectroscopy, as well as APEX CO(2-1) and ALMA CO(1-0) observations of the GRB 980425 host. It has elevated [CII]/FIR and [OI]/FIR ratios and higher values of star formation rate (SFR) derived from line ([CII], [OI], Ha) than from continuum (UV, IR, radio) indicators. [CII] emission exhibits a normal morphology, peaking at the galaxy center, whereas [OI] is concentrated close to the GRB position and the nearby Wolf-Rayet region. The high [OI] flux indicates high radiation field and gas density. The [CII]/CO luminosity ratio of the GRB 980425 host is close to the highest values found for local star-forming galaxies. Its CO-derived molecular gas mass is low given its SFR and metallicity, but the [CII]-derived molecular gas mass is close to the expected value. The [OI] a...

  34. Velocity resolved [CII] spectroscopy of the center and the BCLMP302 region of M33 (HerM33es)

    CERN Document Server

    Mookerjea, B; Kramer, C; Nikola, T; Braine, J; Ossenkopf, V; Roellig, M; Henkel, C; van der Werf, P; van der Tak, F; Wiedner, M C

    2015-01-01

    We aim to understand the contribution of the ionized, atomic and molecular phases of the ISM to the [CII] emission from clouds near the dynamical center and the BCLMP302 HII region in the north of the nearby galaxy M33, at a spatial resolution of 50pc. We combine high-resolution [CII] spectra taken with the HIFI spectrometer onboard the Herschel satellite with [CII] Herschel-PACS maps and ground-based observations of CO(2-1) and HI. All data are at a common spatial resolution of 50pc. Typically, the [CII] lines have widths intermediate between the narrower CO(2-1) and broader HI line profiles. We decomposed the [CII] spectra in terms of contributions from molecular and atomic gas detected in CO(2-1) and HI, respectively. We find that the relative contribution of molecular and atomic gas traced by CO(2-1) and HI varies, depending mostly on the local physical conditions and geometry. We estimate that 11-60% and 5-34% of the [CII] intensities in the center and in BCLMP302, respectively, arise at velocities showing no...

  35. Neon and [CII] 158 micron Emission Line Profiles in Dusty Starbursts and Active Galactic Nuclei

    CERN Document Server

    Samsonyan, Anahit; Lebouteiller, Vianney; Barry, Donald; Sargsyan, Lusine

    2016-01-01

    A sample of 379 extragalactic sources is presented that have mid-infrared, high-resolution spectroscopy with the Spitzer Infrared Spectrograph (IRS) and also spectroscopy of the [CII] 158 um line with the Herschel Photodetector Array Camera and Spectrometer (PACS). The emission line profiles of [NeII] 12.81 um, [NeIII] 15.55 um, and [CII] 158 um are presented, and intrinsic line widths are determined (full width at half maximum of Gaussian profiles after instrumental correction). All line profiles, together with overlays comparing positions of PACS and IRS observations, are made available in the Cornell Atlas of Spitzer IRS Sources (CASSIS). Sources are classified from AGN to starburst based on equivalent widths of the 6.2 um polycyclic aromatic hydrocarbon feature. It is found that intrinsic line widths do not change with classification for [CII], with median widths of 207 km per s for AGN, 248 km per s for composites, and 233 km per s for starbursts. The [NeII] line widths also do not change with classificati...
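
    The instrumental correction for Gaussian profiles is the usual subtraction in quadrature, FWHM_int = sqrt(FWHM_obs^2 - FWHM_instr^2). A minimal sketch (the instrumental width below is a made-up number, not the PACS value):

      import numpy as np

      def intrinsic_fwhm(fwhm_obs, fwhm_instr):
          """Quadrature deconvolution for Gaussian profiles; returns 0 for
          lines narrower than the instrumental width (unresolved)."""
          diff = np.asarray(fwhm_obs, dtype=float) ** 2 - float(fwhm_instr) ** 2
          return np.sqrt(np.clip(diff, 0.0, None))

      # e.g. a 310 km/s observed width with a hypothetical 230 km/s
      # instrumental resolution:
      print(intrinsic_fwhm(310.0, 230.0))         # ~208 km/s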

  36. [CII] absorption and emission in the diffuse interstellar medium across the Galactic Plane

    CERN Document Server

    Gerin, M; Goicoechea, J R; Gusdorf, A; Godard, B; de Luca, M; Falgarone, E; Goldsmith, P F; Lis, D C; Menten, K M; Neufeld, D; Phillips, T G; Liszt, H

    2014-01-01

    Ionized carbon is the main gas-phase reservoir of carbon in the neutral diffuse interstellar medium and its 158 micron fine structure transition [CII] is the most important cooling line of the diffuse interstellar medium (ISM). We combine [CII] absorption and emission spectroscopy to gain an improved understanding of physical conditions in the different phases of the ISM. We present high-resolution [CII] spectra obtained with the Herschel/HIFI instrument towards bright dust continuum sources in the Galactic plane, probing simultaneously the diffuse gas along the line of sight and the background high-mass star-forming regions. These data are complemented by observations of the 492 and 809 GHz fine structure lines of atomic carbon and by medium-spectral-resolution maps of the fine structure lines of atomic oxygen at 63 and 145 microns with Herschel/PACS. We show that the presence of foreground absorption may completely cancel the emission from the background source in medium spectral resolution...

  37. Globules and pillars seen in the [CII] 158 micron line with SOFIA

    CERN Document Server

    Schneider, N; Tremblin, P; Hennemann, M; Minier, V; Hill, T; Comerón, F; Requena-Torres, M A; Kraemer, K E; Simon, R; Röllig, M; Stutzki, J; Djupvik, A A; Zinnecker, H; Marston, A; Csengeri, T; Cormier, D; Lebouteiller, V; Audit, E; Motte, F; Bontemps, S; Sandell, G; Allen, L; Megeath, T; Gutermuth, R A

    2012-01-01

    Molecular globules and pillars are spectacular features found only in the interface region between a molecular cloud and an HII region. Impacting far-ultraviolet (FUV) radiation creates photon-dominated regions (PDRs) on their surfaces that can be traced by typical cooling lines. With the GREAT receiver onboard SOFIA we mapped and spectrally resolved the [CII] 158 micron atomic fine-structure line and the highly excited 12CO J=11-10 molecular line from three objects in Cygnus X (a pillar, a globule, and a strong IRAS source). We focus here on the globule and compare our data with existing Spitzer data and recent Herschel Open-Time PACS data. Extended [CII] emission and more compact CO emission was found in the globule. We ascribe this emission mainly to an internal PDR, created by a possibly embedded star cluster with at least one early B-star. However, external PDR emission caused by the excitation by the Cyg OB2 association cannot be fully excluded. The velocity-resolved [CII] emission traces the emission ...

  38. Investigating overdensities around z>6 galaxies through ALMA observations of [CII]

    CERN Document Server

    Miller, Tim B; Hayward, Christopher C; Behroozi, Peter S; Bradford, C Matt; Willott, Chris J; Wagg, Jeff

    2016-01-01

    We present a search for companion [CII] emitters to known luminous sources at $6 < z < 6.5$ in deep ALMA observations. Our data are deep enough to detect sources down to L$_{\rm [CII]} \sim 10^8$ L$_\odot$ at z $\sim 6$. We identify five robust line detections from a blind search of five deep fields centered on ultra-luminous infrared galaxies and QSOs, suggesting these objects may be highly biased tracers of mass in the early Universe. We find these companion lines to have comparable properties to other known galaxies at the same epoch. All companions lie less than 650 km s$^{-1}$ and between 20-70 kpc (projected) from their central source, providing a constraint on their average halo masses of 1.4$\times$10$^{12}$ M$_\odot$. To place these discoveries in context, we employ a mock galaxy catalog to estimate the luminosity function for [CII] during reionization and compare to our observations. The simulations support this result by showing a similar level of elevated counts found around such luminous sources. Final...

  39. Constraining star formation through redshifted CO and CII emission in archival CMB data

    Science.gov (United States)

    Switzer, Eric

    LCDM is a strikingly successful paradigm to explain the CMB anisotropy and its evolution into observed galaxy clustering statistics. The formation and evolution of galaxies within this context is more complex and only partly characterized. Measurements of the average star formation and its precursors over cosmic time are required to connect theories of galaxy evolution to LCDM evolution. The fine structure transition in CII at 158 um traces star formation rates and the ISM radiation environment. Cold, molecular gas fuels star formation and is traced well by a ladder of CO emission lines. Catalogs of emission lines in individual galaxies have provided the most information about CII and CO to date but are subject to selection effects. Intensity mapping is an alternative approach to measuring line emission. It surveys the sum of all line radiation as a function of redshift, and requires angular resolution to reach cosmologically interesting scales, but not to resolve individual sources. It directly measures moments of the luminosity function from all emitting objects. Intensity mapping of CII and CO can perform an unbiased census of stars and cold gas across cosmic time. We will use archival COBE-FIRAS and Planck data to bound or measure cosmologically redshifted CII and CO line emission through 1) the monopole spectrum, 2) cross-power between FIRAS/Planck and public galaxy survey catalogs from BOSS and the 2MASS redshift surveys, 3) auto-power of the FIRAS/Planck data itself. FIRAS is unique in its spectral range and all-sky coverage, provided by the space-borne FTS architecture. In addition to sensitivity to a particular emission line, intensity mapping is sensitive to all other contributions to surface brightness. We will remove CMB and foreground spatial and spectral templates using models from WMAP and Planck data. Interlopers and residual foregrounds additively bias the auto-power and monopole, but both can still be used to provide rigorous upper bounds. The ...
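
    The cross-power statistic at the heart of item 2) correlates the intensity map with a galaxy catalog, so that uncorrelated foregrounds average away. A minimal 1D sketch with numpy (real analyses work on 3D spatial-spectral cubes with survey window corrections):

      import numpy as np

      def cross_power_1d(map_a, map_b, dx):
          """Cross-power spectrum of two 1D fields sampled at spacing dx."""
          fa, fb = np.fft.rfft(map_a), np.fft.rfft(map_b)
          k = 2.0 * np.pi * np.fft.rfftfreq(map_a.size, d=dx)
          p_cross = (fa * np.conj(fb)).real * dx / map_a.size
          return k, p_cross

      # Toy example: a shared large-scale mode plus independent noise.
      rng = np.random.default_rng(5)
      lss = np.sin(np.linspace(0.0, 20.0 * np.pi, 1024))
      intensity = lss + rng.normal(0.0, 1.0, 1024)   # stand-in for FIRAS/Planck
      galaxies = lss + rng.normal(0.0, 1.0, 1024)    # stand-in for BOSS/2MASS
      k, pk = cross_power_1d(intensity, galaxies, dx=1.0)
      # The common mode survives in pk while the independent noise does not,
      # which is why cross-correlation bounds are robust to interlopers.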

  40. Physical Conditions of the Gas in an ALMA [CII]-identified Submillimetre Galaxy at z = 4.44

    CERN Document Server

    Huynh, M T; Coppin, K E K; Emonts, B H C; Ivison, R J; Seymour, N; Smail, Ian; Smolcic, V; Swinbank, A M; Brandt, W N; Chapman, S C; Dannerbauer, H; De Breuck, C; Greve, T R; Hodge, J A; Karim, A; Knudsen, K K; Menten, K M; van der Werf, P P; Walter, F; Weiss, A

    2013-01-01

    We present CO(2-1) observations of the submillimetre galaxy ALESS65.1 performed with the Australia Telescope Compact Array at 42.3 GHz. A previous ALMA study of submillimetre galaxies in the Extended Chandra Deep Field South detected [CII] 157.74 micron emission from this galaxy at a redshift of z = 4.44. No CO(2-1) emission was detected, but we derive a firm upper limit to the cold gas mass in ALESS65.1 of M_gas < [...], consistent with z > 4 SMGs being the likely progenitors of massive red-and-dead galaxies at z > 2. The ratio of the [CII], CO and far-infrared luminosities implies a strong far-ultraviolet field of G_0 > 10^3, as seen in Galactic star-forming regions or local ULIRGs. The observed L_[CII]/L_FIR = 2.3 x 10^{-3} is high compared to local ULIRGs and, combined with L_[CII]/L_CO > 2700, it is consistent with ALESS65.1 either having an extended (several kpc) [CII]-emitting region or lower than solar metallicity.

  41. Detector modules and spectrometers for the TIME-Pilot [CII] intensity mapping experiment

    Science.gov (United States)

    Hunacek, Jonathon; Bock, James; Bradford, C. Matt; Bumble, Bruce; Chang, Tzu-Ching; Cheng, Yun-Ting; Cooray, Asantha; Crites, Abigail; Hailey-Dunsheath, Steven; Gong, Yan; Li, Chao-Te; O'Brient, Roger; Shirokoff, Erik; Shiu, Corwin; Sun, Jason; Staniszewski, Zachary; Uzgil, Bade; Zemcov, Michael

    2016-07-01

    This proceeding presents the current TIME-Pilot instrument design and status with a focus on the close-packed modular detector arrays and spectrometers. Results of laboratory tests with prototype detectors and spectrometers are discussed. TIME-Pilot is a new mm-wavelength grating spectrometer array under development that will study the Epoch of Reionization (the period of time when the first stars and galaxies ionized the intergalactic medium) by mapping the fluctuations of the redshifted 157.7 μm emission line of singly ionized carbon ([CII]) from redshift z ~ 5.2 to 8.5. As a tracer of star formation, the [CII] power spectrum can provide information on the sources driving reionization and complements 21 cm data (which traces neutral hydrogen in the intergalactic medium). Intensity mapping provides a measure of the mean [CII] intensity without the need to resolve and detect faint sources individually. We plan to target a 1 degree by 0.35 arcminute field on the sky and a spectral range of 199-305 GHz, producing a spatial-spectral slab which is 140 Mpc by 0.9 Mpc on-end and 1230 Mpc in the redshift direction. With careful removal of intermediate-redshift CO sources, we anticipate a detection of the halo-halo clustering term in the [CII] power spectrum consistent with current models for star formation history in 240 hours on the JCMT. TIME-Pilot will use two stacks of 16 parallel-plate waveguide spectrometers (one stack per polarization) with a resolving power R ~ 100 and a spectral range of 183 to 326 GHz. The range is divided into 60 spectral channels, of which 16 at the band edges on each spectrometer serve as atmospheric monitors. The diffraction gratings are curved to produce a compact instrument, each focusing the diffracted light onto an output arc sampled by the 60 bolometers. The bolometers are built in buttable dies of 8 (low-frequency) or 12 (high-frequency) spectral channels by 8 spatial channels and are mated to the spectrometer stacks. Each detector ...
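
    The channelization numbers quoted above are mutually consistent under logarithmic channel spacing (constant R = nu/dnu); the sketch below is a reading of the quoted figures, not the actual grating design:

      import numpy as np

      nu_lo, nu_hi, n_chan = 183.0, 326.0, 60         # GHz, full band
      edges = np.geomspace(nu_lo, nu_hi, n_chan + 1)  # channel edges
      centers = np.sqrt(edges[:-1] * edges[1:])       # geometric-mean centers
      print((centers / np.diff(edges)).mean())        # ~104, i.e. R ~ 100

      # Redshift coverage for [CII] (rest frequency 1900.54 GHz) over the
      # 199-305 GHz science range quoted above:
      nu_cii = 1900.54
      print(nu_cii / 305.0 - 1.0, nu_cii / 199.0 - 1.0)   # z ~ 5.2 to 8.6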

  42. Bright [CII] and dust emission in three z>6.6 quasar host galaxies observed by ALMA

    CERN Document Server

    Venemans, B P; Zschaechner, L; Decarli, R; De Rosa, G; Findlay, J R; McMahon, R G; Sutherland, W J

    2015-01-01

    We present ALMA detections of the [CII] 158 micron emission line and the underlying far-infrared continuum of three quasars at z > 6.6. [...] z ~ 6 quasar hosts correlate with the quasar's bolometric luminosity. In one quasar, the [CII] line is significantly redshifted by ~1700 km/s with respect to the MgII broad emission line. Comparing to values in the literature, we find that, on average, the MgII is blueshifted by 480 km/s (with a standard deviation of 630 km/s) with respect to the host galaxy redshift, i.e. one of our quasars is an extreme outlier. Through modeling we can rule out a flat rotation curve for our brightest [CII] emitter. Finally, we find that the ratio of black hole mass to host galaxy (dynamical) mass is higher by a factor 3-4 (with significant scatter) than local relations.

  43. Isolation of Escherichia coli rpoB mutants resistant to killing by lambda cII protein and altered in pyrE gene attenuation

    DEFF Research Database (Denmark)

    Hammer, Karin; Jensen, Kaj Frank; Poulsen, Peter;

    1987-01-01

    Escherichia coli mutants simultaneously resistant to rifampin and to the lethal effects of bacteriophage lambda cII protein were isolated. The sck mutant strains carry alterations in rpoB that allow them to survive cII killing (thus the name sck), but that do not impair either the expression of c...

  44. Sub-mm Emission Line Deep Fields: CO and [CII] Luminosity Functions out to z = 6

    CERN Document Server

    Popping, Gergö; Decarli, Roberto; Spaans, Marco; Somerville, Rachel S; Trager, Scott C

    2016-01-01

    Now that ALMA is reaching its full capabilities, observations of sub-mm emission line deep fields become feasible. Deep fields are ideal to study the luminosity function of sub-mm emission lines, ultimately tracing the atomic and molecular gas properties of galaxies. We couple a semi-analytic model of galaxy formation with a radiative transfer code to make predictions for the luminosity function of CO J=1-0 up to CO J=6-5 and [CII] at redshifts z=0-6. We find that: 1) our model correctly reproduces the CO and [CII] emission of low- and high-redshift galaxies and reproduces the available constraints on the CO luminosity function at z [...] 1.5 and the CO luminosity of individual galaxies at intermediate redshifts. We argue that this is driven by a lack of cold gas in galaxies at intermediate redshifts as predicted by cosmological simulations of galaxy formation. This may lay at the root of other problems theoretical models face at the same redshifts.

  5. Search for [CII] emission in z=6.5-11 star-forming galaxies

    CERN Document Server

    González-López, Jorge; Decarli, Roberto; Walter, Fabian; Vallini, Livia; Neri, Roberto; Bertoldi, Frank; Bolatto, Alberto D; Carilli, Christopher L; Cox, Pierre; da Cunha, Elisabete; Ferrara, Andrea; Gallerani, Simona; Infante, Leopoldo

    2014-01-01

    We present the search for the [CII] emission line in three $z>6.5$ Lyman-alpha emitters (LAEs) and one J-dropout galaxy using the Combined Array for Research in Millimeter-wave Astronomy (CARMA) and the Plateau de Bure Interferometer (PdBI). We observed three bright $z\\sim6.5-7$ LAEs discovered in the Subaru Deep Field (SDF) and the multiply imaged, lensed $z\\sim 11$ galaxy candidate found behind the galaxy cluster MACSJ0647.7+7015. For the LAEs IOK-1 ($z=6.965$), SDF J132415.7+273058 ($z=6.541$) and SDF J132408.3+271543 ($z=6.554$) we find upper limits for the [CII] line luminosity of $<2.05$, $<4.52$ and $<10.56\\times10^{8}{\\rm L}_{\\odot}$ respectively. We find upper limits to the FIR luminosity of the galaxies using a spectral energy distribution template of the local galaxy NGC 6946 and taking into account the effects of the Cosmic Microwave Background on the mm observations. For IOK-1, SDF J132415.7+273058 and SDF J132408.3+271543 we find upper limits for the FIR luminosity of $<2.33$, $<3.79$ ...

  6. Planck's Dusty GEMS. II. Extended [CII] emission and absorption in the Garnet at z=3.4 seen with ALMA

    CERN Document Server

    Nesvadba, N; Canameras, R; Boone, F; Falgarone, E; Frye, B; Gerin, M; Koenig, S; Lagache, G; Floc'h, E Le; Malhotra, S; Scott, D

    2016-01-01

    We present spatially resolved ALMA [CII] observations of the bright (flux density S=400 mJy at 350 microns), gravitationally lensed, starburst galaxy PLCK G045.1+61.1 at z=3.427, the "Garnet". This source is part of our set of "Planck's Dusty GEMS", discovered with Planck's all-sky survey. Two emission-line clouds with a relative velocity offset of ~600 km/s extend towards the north-east and south-west, respectively, of a small, intensely star-forming clump with a star-formation intensity of 220 Msun/yr/kpc^2, akin to maximal starbursts. [CII] is also seen in absorption, redshifted by +350 km/s relative to the brightest CO component. [CII] absorption has previously only been found in the Milky Way along sightlines toward bright high-mass star-forming regions, and this is the first detection in another galaxy. Similar to Galactic environments, the [CII] absorption feature is associated with [CI] emission, implying that this is diffuse gas shielded from the UV radiation of the clump, and likely at large di...

  7. rctB mutations that increase copy number of Vibrio cholerae oriCII in Escherichia coli

    DEFF Research Database (Denmark)

    Koch, Birgit; Ma, Xiaofang; Løbner-Olesen, Anders

    2012-01-01

    RctB serves as the initiator protein for replication from oriCII, the origin of replication of Vibrio cholerae chromosome II. RctB is conserved between members of Vibrionaceae but shows no homology to known replication initiator proteins and has no recognizable sequence motifs. We used an ori...

  8. Bright [CII] 158$\\mu$m emission in a quasar host galaxy at $z=6.54$

    CERN Document Server

    Bañados, E; Walter, F; Venemans, B P; Farina, E P; Fan, X

    2015-01-01

    The [CII] 158$\\mu$m fine-structure line is known to trace regions of active star formation and is the main coolant of the cold, neutral atomic medium. In this \\textit{Letter}, we report a strong detection of the [CII] line in the host galaxy of the brightest quasar known at $z>6.5$, the Pan-STARRS1 selected quasar PSO J036.5078+03.0498 (hereafter P036+03), using the IRAM NOEMA millimeter interferometer. Its [CII] and total far-infrared luminosities are $(5.8 \\pm 0.7) \\times 10^9 \\,L_\\odot$ and $(7.6\\pm1.5) \\times 10^{12}\\,L_\\odot$, respectively. This results in a $L_{[CII]} /L_{TIR}$ ratio of $\\sim 0.8\\times 10^{-3}$, which is at the high end of the range found for active galaxies, though it is lower than the average found in typical main-sequence galaxies at $z\\sim 0$. We also report a tentative additional line which we identify as blended emission from the $3_{22} - 3_{13}$ and $5_{23} - 4_{32}$ H$_2$O transitions. If confirmed, this would be the most distant detection of water emission to date. P036+03 riva...
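
    The quoted ratio follows directly from the two luminosities; a small arithmetic check with first-order error propagation (assuming uncorrelated errors, which the abstract does not state):

      # Values from the abstract above
      l_cii, dl_cii = 5.8e9, 0.7e9      # Lsun
      l_tir, dl_tir = 7.6e12, 1.5e12    # Lsun

      ratio = l_cii / l_tir
      dratio = ratio * ((dl_cii / l_cii)**2 + (dl_tir / l_tir)**2) ** 0.5
      print(f"L_[CII]/L_TIR = {ratio:.2e} +/- {dratio:.1e}")  # ~(7.6 +/- 1.8)e-4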

  9. Synergy of CO/[CII]/Ly$\\alpha$ Line Intensity Mapping with the SKA

    CERN Document Server

    Chang, Tzu-Ching; Santos, Mario; Silva, Marta; Aguirre, James; Doré, Olivier; Pritchard, Jonathan

    2015-01-01

    We present the science enabled by cross-correlations of the SKA1-LOW 21-cm EoR surveys with other line mapping programs. In particular, we identify and investigate potential synergies with planned programs, such as the line intensity mapping of redshifted CO rotational lines, [CII] and Ly-$\\alpha$ emissions during reionization. We briefly describe how these tracers of the star-formation rate at $z \\sim 8$ can be modeled jointly before forecasting their auto- and cross-power spectra measurements with the nominal 21cm EoR survey. The use of multiple line tracers would be invaluable to validate and enrich our understanding of the EoR.
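
    The key statistic in such joint analyses is the cross-power spectrum and the associated cross-correlation coefficient r = P_AB / sqrt(P_A * P_B); below is a toy mode-averaged estimator for two maps on a common grid (illustrative only, not the survey pipeline):

      import numpy as np

      def cross_corr_coeff(map_a, map_b):
          """Mode-averaged r = P_ab / sqrt(P_a * P_b), skipping the k=0 (mean) mode."""
          fa = np.fft.fftn(map_a).ravel()[1:]
          fb = np.fft.fftn(map_b).ravel()[1:]
          p_ab = np.real(fa * np.conj(fb)).sum()
          return p_ab / np.sqrt((np.abs(fa)**2).sum() * (np.abs(fb)**2).sum())

      # Two maps sharing a common signal show r > 0 even with independent noise
      rng = np.random.default_rng(1)
      signal = rng.normal(size=(64, 64))
      print(cross_corr_coeff(signal + rng.normal(size=(64, 64)),
                             signal + rng.normal(size=(64, 64))))  # ~0.5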

  10. The [CII]/[NII] far-infrared line ratio at z>5: extreme conditions for “normal” galaxies

    Science.gov (United States)

    Pavesi, Riccardo; Riechers, Dominik; Capak, Peter L.; Carilli, Chris Luke; Sharon, Chelsea E.; Stacey, Gordon J.; Karim, Alexander; Scoville, Nicholas; Smolcic, Vernesa

    2017-01-01

    Thanks to the Atacama Large (sub-)Millimeter Array (ALMA), observations of atomic far-infrared fine structure lines are a very productive way of measuring physical properties of the interstellar medium (ISM) in galaxies at high redshift, because they provide an unobscured view into the physical conditions of star formation. While the bright [CII] line has become a routine probe of the dynamical properties of the gas, its intensity needs to be compared to other lines in order to establish the physical origin of the emission. [NII] selectively traces the emission coming from the ionized fraction of the [CII]-emitting gas, offering insight into the phase structure of the ISM. Here we present ALMA measurements of [NII] 205 μm fine structure line emission from a representative sample of galaxies at z=5-6 spanning two orders of magnitude in star formation rate (SFR). Our results show at least two different regimes of ionized gas properties for galaxies in the first billion years of cosmic time, separated by their L[CII]/L[NII] ratio. First, we find extremely low [NII] emission compared to [CII] from a “typical” Lyman Break Galaxy (LBG-1), likely due to low dust content and reminiscent of local dwarfs. Second, the dusty Lyman Break Galaxy HZ10 and the extreme starburst AzTEC-3 show ionized gas fractions typical of local star-forming galaxies and show hints of spatial variations in their [CII]/[NII] line ratio. These observations of far-infrared lines in “normal” galaxies at z>5 yield some of the first constraints on ISM models for young galaxies in the first billion years of cosmic time and shed light on the observed evolution of the dust and gas properties.
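
    A standard way to combine these two lines, sketched below under the common assumption that the [CII] 158 um / [NII] 205 um ratio intrinsic to ionized gas is roughly constant (~3, only weakly density dependent); the luminosities in the example are hypothetical:

      R_ION = 3.0  # assumed [CII]158/[NII]205 ratio for purely ionized gas

      def ionized_cii_fraction(l_cii, l_nii, r_ion=R_ION):
          """Fraction of the [CII] luminosity attributable to ionized gas."""
          return min(1.0, r_ion * l_nii / l_cii)

      print(ionized_cii_fraction(l_cii=1e9, l_nii=5e7))  # hypothetical values -> 0.15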

  11. Alimentary lipemia: plasma high-density lipoproteins and apolipoproteins CII and CIII in healthy subjects.

    Science.gov (United States)

    Kashyap, M L; Barnhart, R L; Srivastava, L S; Perisutti, G; Allen, C; Hogg, E; Glueck, C J; Jackson, R L

    1983-02-01

    Three healthy male and three female inpatient volunteers consumed isocaloric diets for 4 wk. At weekly intervals, a fatty meal (100 g fat) was consumed by each fasting subject and blood was drawn at 2-h intervals for 12 h. Of the four oral fat loads, two contained saturated fat (polyunsaturated/saturated fat ratio = 0.34) and two contained unsaturated fat (polyunsaturated/saturated fat ratio = 2.21). The magnitude of alimentary lipemia, expressed as area under the plasma triglyceride curve, was 3- to 4-fold higher in males than females. Alimentary lipemia was inversely related to the subjects' fasting plasma high-density lipoprotein (HDL) cholesterol and HDL apolipoprotein (apo) C-III, and directly related to plasma triglycerides. The P/S ratios of the daily diet or the fat meal did not significantly influence the plasma triglyceride curve. After fat intake, mean (+/- SEM) plasma total apoC-II and C-III fell to 54 +/- 20% and 73 +/- 5% of baseline, respectively, at 12 h in five of six subjects. After oral fat, an initial fall and a subsequent rise in apoC-II and C-III in HDL was associated with reciprocal changes in apoC concentrations in very low-density lipoproteins. We speculate from the data that 1) plasma HDL and their apoC concentrations are important determinants of chylomicron clearance and 2) transfer of apoCs from HDL to triglyceride-rich lipoproteins in the early phase of fat absorption does not result in the total recycling of apoCs from these lipoproteins to HDL during the late phase of alimentary lipemia.

  12. Explaining the [CII]158um Deficit in Luminous Infrared Galaxies - First Results from a Herschel/PACS Study of the GOALS Sample

    CERN Document Server

    Diaz-Santos, T; Charmandaris, V; Stierwalt, S; Murphy, E J; Haan, S; Inami, H; Malhotra, S; Meijerink, R; Stacey, G; Petric, A O; Evans, A S; Veilleux, S; van der Werf, P P; Lord, S; Lu, N; Howell, J H; Appleton, P; Mazzarella, J M; Surace, J A; Xu, C K; Schulz, B; Sanders, D B; Bridge, C; Chan, B H P; Frayer, D T; Iwasawa, K; Melbourne, J; Sturm, E

    2013-01-01

    We present the first results of a survey of the [CII]158um emission line in 241 luminous infrared galaxies (LIRGs) comprising the Great Observatories All-sky LIRG Survey (GOALS) sample, obtained with the PACS instrument on board Herschel. The [CII] luminosities of the LIRGs in GOALS range from ~10^7 to 2x10^9 Lsun. We find that LIRGs show a tight correlation of [CII]/FIR with far-IR flux density ratios, with a strong negative trend spanning from ~10^-2 to 10^-4, as the average temperature of dust increases. We find correlations between the [CII]/FIR ratio and the strength of the 9.7um silicate absorption feature as well as with the luminosity surface density of the mid-IR emitting region (Sigma_MIR), suggesting that warmer, more compact starbursts have substantially smaller [CII]/FIR ratios. Pure star-forming (SF) LIRGs have a mean [CII]/FIR ~ 4x10^-3, while galaxies with low 6.2um PAH equivalent widths (EWs), indicative of the presence of active galactic nuclei (AGN), span the full range in [CII]/FIR. However, we...

  13. OT2_tvelusam_4: Probing Galactic Spiral Arm Tangencies with [CII]

    Science.gov (United States)

    Velusamy, T.

    2011-09-01

    We propose to use the unique viewing geometry of the Galactic spiral arm tangents, which provide an ideal environment for studying the effects of density waves on spiral structure. We propose a well-sampled map of the [C II] 1.9 THz line emission along a 15-degree longitude region across the Norma-3kpc arm tangency, which includes the edge of the Perseus Arm. The COBE-FIRAS instrument observed the strongest [C II] and [N II] emission along these spiral arm tangencies. The Herschel Open Time Key Project Galactic Observations of Terahertz C+ (GOT C+) also detects the strongest [CII] emission near these spiral arm tangential directions in its sparsely sampled HIFI survey of [CII] in the Galactic plane. The [C II] 158-micron line is the strongest infrared line emitted by the ISM and is an excellent tracer and probe of both the diffuse gas in the cold neutral medium (CNM) and the warm ionized medium (WIM). Furthermore, as demonstrated in the GOT C+ results, [C II] is an efficient tracer of the dark H2 gas in the ISM that is not traced by CO or HI observations. Thus, taking advantage of the long path lengths through the spiral arm across the tangencies, we can use the [C II] emission to trace and characterize the diffuse atomic and ionized gas as well as the diffuse H2 molecular gas in cloud transitions from HI to H2 and C+ to C and CO, throughout the ISM. The main goal of our proposal is to use well-sampled (at arcminute scale) [C II] observations to study these gas components of the ISM in the spiral-arm and inter-arm regions, to constrain models of the spiral structure and to understand the influence of spiral density waves on the Galactic gas and the dynamical interaction between the different components. The proposed HIFI observations will consist of OTF 15-degree longitude scans and one 2-degree latitude scan sampled every 40 arcsec across the Norma-3kpc/Perseus spiral tangency.

  14. A 158 Micron [CII] Line Survey of Galaxies at z ~ 1 to 2: An Indicator of Star Formation in the Early Universe

    CERN Document Server

    Stacey, G J; Ferkinhoff, C; Nikola, T; Parshley, S C; Benford, D J; Staguhn, J G; Fiolet, N

    2010-01-01

    We have detected the 158 {\\mu}m [CII] line from 12 galaxies at z~1-2. This is the first survey of this important star-formation tracer at redshifts covering the epoch of maximum star formation in the Universe, and it quadruples the number of reported high-z [CII] detections. The line is very luminous, at between 0.024% and 0.65% of the far-infrared continuum luminosity of our sources, and arises from PDRs on molecular cloud surfaces. An exception is PKS 0215+015, where half of the [CII] emission could arise from XDRs near the central AGN. The L[CII]/LFIR ratio in our star-formation-dominated systems is ~8 times larger than that of our AGN-dominated systems; this ratio therefore selects for star-formation-dominated systems. Furthermore, the L[CII]/LFIR and L[CII]/L(CO(1-0)) ratios in our star-forming galaxies and nearby starburst galaxies are the same, so that luminous star-forming galaxies at earlier epochs (z~1-2) appear to be scaled-up versions of local starbursts entailing kilo-parsec-scale starbursts. Most of the F...

  15. Detection of molecular gas in an ALMA [CII]-identified Submillimetre Galaxy at z = 4.44

    CERN Document Server

    Huynh, M T; Norris, R P; Smail, Ian; Chow, K E; Coppin, K E K; Emonts, B H C; Ivison, R J; Smolcic, V; Swinbank, A M

    2014-01-01

    We present the detection of $^{12}$CO(2-1) in the $z = 4.44$ submillimetre galaxy ALESS65.1 using the Australia Telescope Compact Array. A previous ALMA study of submillimetre galaxies in the Extended Chandra Deep Field South determined the redshift of this optically and near-infrared undetected source through the measurement of [CII] 157.74 $\\mu$m emission. Using the luminosity of the $^{12}$CO(2-1) emission we estimate the gas mass to be $M_{\\rm gas} \\sim 1.7 \\times 10^{10}$ ${\\rm M}_\\odot$. The gas depletion timescale of ALESS65.1 is $\\sim$ 25 Myr, similar to other high redshift submillimetre galaxies and consistent with $z > 4$ SMGs being the progenitors of massive "red-and-dead" galaxies at $z > 2$. The ratio of the [CII], $^{12}$CO and far-infrared luminosities implies a strong far-ultraviolet field of $G_0 \\sim 10^{3.25}$, which is at the high end of the far-ultraviolet fields seen in local starbursts, but weaker than the far-ultraviolet fields of most nearby ULIRGs. The high ratio of $L_{\\rm [CII]}/L_...
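
    The quoted depletion timescale is simply t_dep = M_gas / SFR; since the SFR itself is not quoted in this excerpt, the sketch below only inverts the two numbers that are (so the SFR shown is the implied one, not a measured value):

      m_gas = 1.7e10   # Msun, from the abstract
      t_dep = 25e6     # yr, from the abstract

      sfr_implied = m_gas / t_dep
      print(f"implied SFR ~ {sfr_implied:.0f} Msun/yr")  # ~680 Msun/yr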

  16. Design and Fabrication of TES Detector Modules for the TIME-Pilot [CII] Intensity Mapping Experiment

    Science.gov (United States)

    Hunacek, J.; Bock, J.; Bradford, C. M.; Bumble, B.; Chang, T.-C.; Cheng, Y.-T.; Cooray, A.; Crites, A.; Hailey-Dunsheath, S.; Gong, Y.; Kenyon, M.; Koch, P.; Li, C.-T.; O'Brient, R.; Shirokoff, E.; Shiu, C.; Staniszewski, Z.; Uzgil, B.; Zemcov, M.

    2016-08-01

    We are developing a series of close-packed modular detector arrays for TIME-Pilot, a new mm-wavelength grating spectrometer array that will map the intensity fluctuations of the redshifted 157.7 um emission line of singly ionized carbon ([CII]) from redshift z ~ 5 to 9. TIME-Pilot's two banks of 16 parallel-plate waveguide spectrometers (one bank per polarization) will have a spectral range of 183-326 GHz and a resolving power of R ~ 100. The spectrometers use a curved diffraction grating to disperse and focus the light on a series of output arcs, each sampled by 60 transition edge sensor (TES) bolometers with gold micro-mesh absorbers. These low-noise detectors will be operated from a 250 mK base temperature and are designed to have a background-limited NEP of ~10^-17 W/Hz^(1/2). This proceeding presents an overview of the detector design in the context of the TIME-Pilot instrument. Additionally, a prototype detector module produced at the Microdevices Laboratory at JPL is shown.
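
    At constant resolving power R = nu/delta_nu, adjacent channels are spaced by a factor (1 + 1/R), so the channel count across the band follows from a logarithm; a quick consistency check of the numbers above (not from the paper):

      import math

      R = 100.0                       # resolving power
      nu_min, nu_max = 183.0, 326.0   # GHz

      n_channels = math.log(nu_max / nu_min) / math.log(1.0 + 1.0 / R)
      print(round(n_channels))        # ~58, consistent with the ~60 channels quoted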

  17. Phospholipids enhance nucleation but not elongation of apolipoprotein C-II amyloid fibrils.

    Science.gov (United States)

    Ryan, Timothy M; Teoh, Chai L; Griffin, Michael D W; Bailey, Michael F; Schuck, Peter; Howlett, Geoffrey J

    2010-06-25

    Amyloid fibrils and their oligomeric intermediates accumulate in several age-related diseases where their presence is considered to play an active role in disease progression. A common characteristic of amyloid fibril formation is an initial lag phase indicative of a nucleation-elongation mechanism for fibril assembly. We have investigated fibril formation by human apolipoprotein (apo) C-II. ApoC-II readily forms amyloid fibrils in a lipid-dependent manner via an initial nucleation step followed by fibril elongation, breaking, and joining. We used fluorescence techniques and stopped-flow analysis to identify the individual kinetic steps involved in the activation of apoC-II fibril formation by the short-chain phospholipid dihexanoyl phosphatidylcholine (DHPC). Submicellar DHPC activates fibril formation by promoting the rapid formation of a tetrameric species followed by a slow isomerisation that precedes monomer addition and fibril growth. Global fitting of the concentration dependence of apoC-II fibril formation showed that DHPC increased the overall tetramerisation constant from 7.5 x 10^-13 to 1.2 x 10^-6 μM^-3 without significantly affecting the rate of fibril elongation, breaking, or joining. Studies on the effect of DHPC on the free pool of apoC-II monomer and on fibril formation by cross-linked apoC-II dimers further demonstrate that DHPC affects nucleation but not elongation. These studies demonstrate the capacity of small lipid compounds to selectively target individual steps in the amyloid fibril-forming pathway.
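
    To illustrate the size of the quoted effect, a hedged sketch assuming the simplest reading of the units (an overall constant K in μM^-3 relating tetramer to monomer concentrations via [T] = K[M]^4; the paper's full kinetic model is more detailed, and the monomer concentration below is hypothetical):

      def tetramer_conc_um(monomer_um, k_um3):
          """Tetramer concentration [T] = K * [M]^4 (assumed mass-action form)."""
          return k_um3 * monomer_um**4

      m = 30.0  # uM, hypothetical free-monomer concentration
      print(tetramer_conc_um(m, 7.5e-13))  # without DHPC: ~6e-7 uM
      print(tetramer_conc_um(m, 1.2e-6))   # with DHPC: ~1 uM, a ~1.6e6-fold increase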

  18. Ethyl pyruvate therapy attenuates experimental severe arthritis caused by type II collagen (CII) in the mouse (CIA).

    Science.gov (United States)

    Di Paola, R; Mazzon, E; Galuppo, M; Esposito, E; Bramanti, P; Fink, M P; Cuzzocrea, S

    2010-01-01

    This study tested the hypothesis that ethyl pyruvate (EP), a simple aliphatic ester with anti-inflammatory effects, can reduce type II collagen-induced mouse arthritis (CIA). DBA/1J mice were used for the study, developing erosive hind paw arthritis when immunized with CII in an emulsion in complete Freund's adjuvant (CFA). The incidence of CIA was 100 percent by day 28 in the CII-challenged mice, and the severity of CIA progressed over a 35-day period, with radiographic evaluation revealing focal resorption of bone. The histopathology of CIA included erosion of the cartilage at the joint margins. EP treatment (40 mg/kg/day i.p.) starting at the onset of arthritis (day 25) ameliorated the clinical signs at days 26-35 and improved histological status in the joint and paw. Immunohistochemical analysis for nitrotyrosine, poly (ADP-ribose) (PAR) and inducible nitric oxide synthase (iNOS) revealed positive staining in inflamed joints from mice subjected to CIA, while no staining was observed for HO-1 and Nrf-2 in the same group. The degree of staining for nitrotyrosine, PAR and iNOS was significantly reduced in CII-challenged mice treated with EP. Immunopositive staining for HO-1 and Nrf-2 was observed instead in joints obtained from the EP-treated group. Plasma levels of TNF-α and IL-6 and the joint tissue levels of macrophage inflammatory protein (MIP)-1α and MIP-2 were also significantly reduced by EP treatment. Thirty-five days after immunization, EP treatment significantly increased plasma levels of IL-10. These data demonstrate that EP treatment exerts an anti-inflammatory effect during chronic inflammation and is able to ameliorate the tissue damage associated with CIA.

  19. A Pressure-dependent Model for the Regulation of Lipoprotein Lipase by Apolipoprotein C-II*

    Science.gov (United States)

    Meyers, Nathan L.; Larsson, Mikael; Olivecrona, Gunilla; Small, Donald M.

    2015-01-01

    Apolipoprotein C-II (apoC-II) is the co-factor for lipoprotein lipase (LPL) at the surface of triacylglycerol-rich lipoproteins. LPL hydrolyzes triacylglycerol, which increases local surface pressure as surface area decreases and amphipathic products transiently accumulate at the lipoprotein surface. To understand how apoC-II adapts to these pressure changes, we characterized the behavior of apoC-II at multiple lipid/water interfaces. ApoC-II adsorption to a triacylglycerol/water interface resulted in large increases in surface pressure. ApoC-II was exchangeable at this interface and desorbed on interfacial compressions. These compressions increase surface pressure and mimic the action of LPL. Analysis of gradual compressions showed that apoC-II undergoes a two-step desorption, which indicates that lipid-bound apoC-II can exhibit at least two conformations. We characterized apoC-II at phospholipid/triacylglycerol/water interfaces, which more closely mimic lipoprotein surfaces. ApoC-II had a large exclusion pressure, similar to that of apoC-I and apoC-III. However, apoC-II desorbed at retention pressures higher than those seen with the other apoCs. This suggests that it is unlikely that apoC-I and apoC-III inhibit LPL via displacement of apoC-II from the lipoprotein surface. Upon rapid compressions and re-expansions, re-adsorption of apoC-II increased pressure by lower amounts than its initial adsorption. This indicates that apoC-II removed phospholipid from the interface upon desorption. These results suggest that apoC-II regulates the activity of LPL in a pressure-dependent manner. ApoC-II is provided as a component of triacylglycerol-rich lipoproteins and is the co-factor for LPL as pressure increases. Above its retention pressure, apoC-II desorbs and removes phospholipid. This triggers release of LPL from lipoproteins. PMID:26026161

  20. 4th INNS Symposia Series on Computational Intelligence in Information Systems

    CERN Document Server

    Au, Thien

    2015-01-01

    This book constitutes the refereed proceedings of the Fourth International Neural Network Symposia series on Computational Intelligence in Information Systems, INNS-CIIS 2014, held in Bandar Seri Begawan, Brunei in November 2014. INNS-CIIS aims to provide a platform for researchers to exchange the latest ideas and present the most current research advances in general areas related to computational intelligence and its applications in various domains. The 34 revised full papers presented in this book have been carefully reviewed and selected from 72 submissions. They cover a wide range of topics and application areas in computational intelligence and informatics.  

  1. ALMA Spectroscopic Survey in the Hubble Ultra Deep Field: Search for [CII] line and dust emission in $6<z<8$ galaxies

    CERN Document Server

    Aravena, Manuel; Walter, Fabian; Bouwens, Rychard; Oesch, Pascal; Carilli, Christopher; Bauer, Franz E; Da Cunha, Elisabete; Daddi, Emanuele; González-López, Jorge; Ivison, R J; Riechers, Dominik; Smail, Ian R; Swinbank, Mark; Weiss, Axel; Anguita, Timo; Bacon, Roland; Bell, Eric; Bertoldi, Frank; Cortes, Paulo; Cox, Pierre; Hodge, Jacqueline; Ibar, Eduardo; Inami, Hanae; Infante, Leopoldo; Karim, Alexander; Magnelli, Benjamin; Ota, Kazuaki; Popping, Gergö; van der Werf, Paul; Wagg, Jeffrey

    2016-01-01

    We present a search for [CII] line and dust continuum emission from optical dropout galaxies at $z>6$ using ASPECS, our ALMA Spectroscopic Survey in the Hubble Ultra-Deep Field (UDF). Our observations, which cover the frequency range $212-272$ GHz, encompass approximately the redshift range $6<z<8$ for [CII] line emission. We identify [CII] line candidates in this redshift range with significances $>$4.5 $\\sigma$, two of which correspond to blind detections with no optical counterparts. At this significance level, our statistical analysis shows that about 60\\% of our candidates are expected to be spurious. For one of our blindly selected [CII] line candidates, we tentatively detect the CO(6-5) line in our parallel 3-mm line scan. None of the line candidates are individually detected in the 1.2 mm continuum. A stack of all [CII] candidates results in a tentative detection with $S_{1.2mm}=14\\pm5\\mu$Jy. This implies a dust-obscured star formation rate (SFR) of $(3\\pm1)$ M$_\\odot$ yr$^{-1}$. We find that the two highest-SFR objects have candidate [CII] lines with luminosities that are consistent with the low-redshift $L_{\\rm [C...

  2. A multi-wavelength exploration of the [CII]/IR ratio in H-ATLAS/GAMA galaxies out to z=0.2

    CERN Document Server

    Ibar, E; Herrera-Camus, R; Hopwood, R; Bauer, A; Ivison, R J; Michałowski, M J; Dannerbauer, H; van der Werf, P; Riechers, D; Bourne, N; Baes, M; Valtchanov, I; Dunne, L; Verma, A; Brough, S; Cooray, A; De Zotti, G; Dye, S; Eales, S; Furlanetto, C; Maddox, S; Smith, M; Steele, O; Thomas, D; Valiante, E

    2015-01-01

    We explore the behaviour of the [CII] 157.74um forbidden fine-structure line observed in a sample of 28 galaxies selected from ~50 deg^2 of the H-ATLAS survey. The sample is restricted to galaxies with flux densities higher than S_160um > 150 mJy and optical spectra from the GAMA survey at 0.02 < z < 0.2. Galaxies presenting [CII]/IR ratios > 2.5x10^-3 differ systematically from those showing lower ratios; in particular, those with high ratios tend to have lower IR luminosities, suggesting that L_IR is the main parameter responsible for controlling the [CII]/IR ratio. It is possible that a relatively high L_IR creates a positively charged dust grain distribution, impeding an efficient photo-electric extraction of electrons from these grains to then collisionally excite carbon atoms. Within the brighter IR population, the [CII]/IR ratio is unlikely to be modified by [CII] self-absorption or controlled by the presence of a moderately luminous AGN (identified via the BPT diagram).

  3. [CII] 158$\\mu$m and [NII] 205$\\mu$m emission from IC 342 - Disentangling the emission from ionized and photo-dissociated regions

    CERN Document Server

    Röllig, Markus; Güsten, R; Stutzki, J; Israel, F; Jacobs, K

    2016-01-01

    Aims: We investigate how much of the [CII] emission in the nucleus of the nearby spiral galaxy IC 342 is contributed by PDRs and by the ionized gas. We examine the spatial variations of starburst/PDR activity and study the correlation of the [CII] line with the [NII] 205um emission line coming exclusively from the HII regions. Methods: We present small maps of the [CII] and [NII] lines recently observed with the GREAT receiver on board SOFIA. In particular we present a super-resolution method to derive how unresolved, kinematically correlated structures in the beam contribute to the observed line shapes. Results: We find that the emission coming from the ionized gas shows a kinematic component in addition to the general Doppler signature of the molecular gas. We interpret this as the signature of two bi-polar lobes of ionized gas expanding out of the galactic plane. We then show how this requires an adaptation of our understanding of the geometrical structure of the nucleus of IC 342. Examining the starbu...

  4. Gas and dust cooling along the major axis of M33 (HerM33es): ISO/LWS CII observations

    CERN Document Server

    Kramer, C; Garcia-Burillo, S; Relano, M; Aalto, S; Boquien, M; Braine, J; Buchbender, C; Gratier, P; Israel, F P; Nikola, T; Roellig, M; Verley, S; van der Werf, P; Xilouris, E M

    2013-01-01

    We aim to better understand the heating of the gas by observing the prominent gas cooling line [CII] at 158um in the low-metallicity environment of the Local Group spiral galaxy M33 at scales of 280pc. In particular, we aim at describing the variation of the photoelectric heating efficiency with galactic environment. In this unbiased study, we used ISO/LWS [CII] observations along the major axis of M33, in combination with Herschel PACS and SPIRE continuum maps, IRAM 30m CO 2-1 and VLA HI data to study the variation of velocity-integrated intensities. The ratio of [CII] emission over the far-infrared continuum is used as a proxy for the heating efficiency, and models of photon-dominated regions are used to study the local physical densities, FUV radiation fields, and average column densities of the molecular clouds. The heating efficiency stays constant at 0.8% within the inner 4.5 kpc radius of the galaxy, beyond which it starts to increase, reaching values of ~3% in the outskirts at about 6 kpc radial distance. The rise o...

  5. Lipoprotein lipase activity and mass, apolipoprotein C-II mass and polymorphisms of apolipoproteins E and A5 in subjects with prior acute hypertriglyceridaemic pancreatitis

    Directory of Open Access Journals (Sweden)

    García-Arias Carlota

    2009-06-01

    Background: Severe hypertriglyceridaemia due to chylomicronaemia may trigger an acute pancreatitis. However, the basic underlying mechanism is usually not well understood. We decided to analyze some proteins involved in the catabolism of triglyceride-rich lipoproteins in patients with severe hypertriglyceridaemia. Methods: Twenty-four survivors of acute hypertriglyceridaemic pancreatitis (cases) and 31 patients with severe hypertriglyceridaemia (controls) were included. Clinical and anthropometrical data, chylomicronaemia, lipoprotein profile, postheparin lipoprotein lipase mass and activity, hepatic lipase activity, apolipoprotein C-II and C-III mass, and apo E and A5 polymorphisms were assessed. Results: Only five cases were found to have LPL mass and activity deficiency, all of them thin and having had their first episode in childhood. No cases had apolipoprotein C-II deficiency. No significant differences were found between the non-deficient LPL cases and the controls in terms of obesity, diabetes, alcohol consumption, drug therapy, gender distribution, evidence of fasting chylomicronaemia, lipid levels, LPL activity and mass, hepatic lipase activity, C-II and C-III mass or apo E polymorphisms. However, the SNP S19W of apo A5 tended to be more prevalent in cases than controls (40% vs. 23%, NS). Conclusion: Primary defects in LPL and C-II are rare in survivors of acute hypertriglyceridaemic pancreatitis; lipase activity measurements should be restricted to those having their first episode during childhood.

  6. A Foreground Removal Strategy for future C[II] Intensity Mapping Experiments: Insights From Galaxies Selected by Stellar Mass and Redshift

    CERN Document Server

    Sun, Guochao; Viero, Marco P; Bock, Jamie; Bradford, C Matt; Chang, Tzu-Ching; Cheng, Yun-Ting; Cooray, Asantha; Crites, Abigail; Hailey-Dunsheath, Steve; Hunacek, Jonathon; Uzgil, Bade; Zemcov, Michael

    2016-01-01

    Intensity mapping provides a unique avenue to understand the epoch of reionization (EoR), which occurred approximately 500 million to 1 billion years after the Big Bang. The C[II] 158$\\mu$m fine-structure line is one of the brightest emission lines of typical star-forming galaxies and a promising tracer of the global star-formation activity during the epoch of reionization. However, C[II] intensity maps are contaminated by interloping CO rotational line emission ($3 \\leq J_{\\rm upp} \\leq 6$) from lower-redshift galaxies, whose total power is a function of the population's stochasticity. Here we present a model of CO contamination from foreground galaxies to guide the masking strategy of future C[II] intensity mapping experiments. The model is based on empirical measurements of the mean and scatter of the bolometric infrared luminosities, converted to CO line strengths, of these lower-redshift galaxies. We find that the addition of scatter, parameterized by a log-normal distribution with $\\sigma = 0.33\\pm 0.04$\\,dex, to ...
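
    A minimal sketch of the scatter model described above: log-normal scatter in dex around a median line luminosity, which leaves the median unchanged but boosts the mean (and hence the shot-noise power) of the foreground population. The median luminosity used here is hypothetical:

      import numpy as np

      rng = np.random.default_rng(0)
      sigma_dex = 0.33                 # scatter quoted in the abstract
      log_l_med = 8.0                  # hypothetical median log10(L_CO / Lsun)

      l_samples = 10.0 ** rng.normal(log_l_med, sigma_dex, size=100_000)
      print(np.median(l_samples) / 10**log_l_med)  # ~1.0: median preserved
      print(l_samples.mean() / 10**log_l_med)      # ~1.33: mean boosted by scatter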

  7. Hydrolysis of guinea pig nascent very low density lipoproteins catalyzed by lipoprotein lipase: activation by human apolipoprotein C-II.

    Science.gov (United States)

    Fitzharris, T J; Quinn, D M; Goh, E H; Johnson, J D; Kashyap, M L; Srivastava, L S; Jackson, R L; Harmony, J A

    1981-08-01

    Very low density lipoproteins isolated from guinea pig liver perfusate (VLDLp) lack the equivalent of human apolipoprotein C-II (apoC-II), the activator of lipoprotein lipase (LpL). These lipoproteins are therefore ideal substrates with which to investigate the mechanism by which apoC-II activates the enzyme. VLDLp binds apoC-II, and apoC-II associated with VLDLp markedly increases the rate of lipoprotein lipase-catalyzed hydrolysis of VLDLp-triglycerides. The activator potency of apoC-II is independent of the method of enrichment of VLDLp with apoC-II: delipidated human apoC-II and apoC-II transferred from human high density lipoproteins activate lipoprotein lipase to equal extents. ApoC-II causes pH-dependent changes in both the apparent Km and Vmax of LpL-catalyzed hydrolysis of VLDLp-triglycerides. At pH 7.4-7.5, the major effect of apoC-II is to decrease the apparent Km by 3.3- to 4.0-fold; the apparent Vmax is increased 1.3-fold. At pH 6.5 and 8.5, the decrease of the apparent Km is less marked, 1.6-fold and 1.4-fold, respectively. At pH 6.5, apoC-II increases the apparent Vmax by 1.3-fold, while at pH 8.5 the primary effect of apoC-II is a 1.6-fold increase of the apparent Vmax. Based on a simple kinetic model, the data suggest that apoC-II favors direct interaction between enzyme and triglyceride within the lipoprotein particle, as well as subsequent catalytic turnover.
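
    The reported Km and Vmax shifts translate into an overall activation through the standard Michaelis-Menten rate law v = Vmax[S]/(Km + [S]); a sketch with arbitrary baseline units, using the pH 7.4 factors quoted above (the absolute parameter values are placeholders):

      def mm_rate(s, vmax, km):
          """Michaelis-Menten rate v = Vmax * [S] / (Km + [S])."""
          return vmax * s / (km + s)

      km0, vmax0 = 1.0, 1.0                          # arbitrary baseline units
      s = 0.2 * km0                                  # substrate well below Km
      v_plain = mm_rate(s, vmax0, km0)
      v_apoc2 = mm_rate(s, 1.3 * vmax0, km0 / 3.5)   # Vmax x1.3, Km /(3.3-4.0)
      print(v_apoc2 / v_plain)                       # ~3.2-fold activation at low [S]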

  8. [CII] and $^{12}$CO(1-0) Emission Maps in HLSJ091828.6+514223: A Strongly Lensed Interacting System at $z=5.24$

    CERN Document Server

    Rawle, T D; Bussmann, R S; Gurwell, M; Ivison, R J; Boone, F; Combes, F; Danielson, A L R; Rex, M; Richard, J; Smail, I; Swinbank, A M; Blain, A W; Clement, B; Dessauges-Zavadsky, M; Edge, A C; Fazio, G G; Jones, T; Kneib, J -P; Omont, A; Perez-Gonzalez, P G; Schaerer, D; Valtchanov, I; van der Werf, P P; Walth, G; Zamojski, M; Zemcov, M

    2013-01-01

    We present Submillimeter Array (SMA) [CII] 158um and Jansky Very Large Array (JVLA) $^{12}$CO(1-0) line emission maps for the bright, lensed, submillimeter source at $z=5.2430$ behind Abell 773: HLSJ091828.6+514223 (HLS0918). We combine these measurements with previously reported line profiles, including multiple $^{12}$CO rotational transitions, [CI], water and [NII], providing some of the best constraints on the properties of the interstellar medium (ISM) in a galaxy at $z>5$. HLS0918 has a total far-infrared (FIR) luminosity L_FIR(8-1000um) = (1.6$\\pm$0.1)x10^14 L_sun/mu, where the total magnification mu_total = 8.9$\\pm$1.9, via a new lens model from the [CII] and continuum maps. Despite a HyLIRG luminosity, the FIR continuum shape resembles that of a local LIRG. We simultaneously fit all of the observed spectral line profiles, finding four components which correspond cleanly to discrete spatial structures identified in the maps. The two most redshifted spectral components occupy the nucleus of a massive g...
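
    Note that the quoted luminosity is the apparent (lensed) one; dividing out the magnification gives the intrinsic value, with the uncertainty dominated by the lens model (first-order propagation, assuming uncorrelated errors):

      mu, dmu = 8.9, 1.9                 # magnification from the abstract
      l_app, dl_app = 1.6e14, 0.1e14     # Lsun, apparent (= mu * L_FIR)

      l_fir = l_app / mu
      dl_fir = l_fir * ((dl_app / l_app)**2 + (dmu / mu)**2) ** 0.5
      print(f"intrinsic L_FIR ~ {l_fir:.1e} +/- {dl_fir:.1e} Lsun")  # ~1.8e13 Lsun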

  9. Witnessing the birth of the red sequence: ALMA high-resolution imaging of [CII] and dust in two interacting ultra-red starbursts at z = 4.425

    CERN Document Server

    Oteo, I; Dunne, L; Smail, I; Swinbank, M; Zhang, Z-Y; Lewis, A; Maddox, S; Riechers, D; Serjeant, S; Van der Werf, P; Bremer, M; Cigan, P; Clements, D L; Cooray, A; Dannerbauer, H; Eales, S; Ibar, E; Messias, H; Michałowski, M J; Pérez-Fournon, I; van Kampen, E

    2016-01-01

    Exploiting the sensitivity and spatial resolution of the Atacama Large Millimeter/submillimeter Array (ALMA), we have studied the morphology and the physical scale of the interstellar medium - both gas and dust - in SGP38326, an unlensed pair of interacting starbursts at $z= 4.425$. SGP38326 is the most luminous starbursting system known at $z > 4$, with an IR-derived ${\\rm SFR \\sim 4300 \\,} M_\\odot \\, {\\rm yr}^{-1}$. SGP38326 also contains a molecular gas reservoir among the most massive ever found in the early Universe, and it is the likely progenitor of a massive, red-and-dead elliptical galaxy at $z \\sim 3$. Probing scales of $\\sim 0.1"$ or $\\sim 800 \\, {\\rm pc}$ we find that the smooth distribution of the continuum emission from cool dust grains contrasts with the more irregular morphology of the gas, as traced by the [CII] fine structure emission. The gas is also extended over larger physical scales than the dust. The velocity information provided by the resolved [CII] emission reveals that the dynamics...

  10. Probing the Mass and Structure of the Ring Nebula in Lyra with SOFIA/GREAT Observations of the [CII] 158 micron line

    CERN Document Server

    Sahai, R; Werner, M W; Güsten, R; Wiesemeyer, H; Sandell, G

    2012-01-01

    We have obtained new velocity-resolved spectra of the [CII] 158 micron line towards the Ring Nebula in Lyra (NGC 6720), one of the best-studied planetary nebulae, in order to probe its controversial 3-dimensional structure and to estimate the mass of circumstellar material in this object. We used the Terahertz receiver GREAT aboard the SOFIA airborne telescope to obtain the [CII] spectra at eight locations within and outside the bright optical ring of NGC 6720. Emission was detected at all positions except for the most distant position along the nebula's minor axis, and generally covers a broad velocity range, ~50 km/s (FWZI), except at a position along the major axis located just outside the optical ring, where it is significantly narrower (~25 km/s). The one narrow spectrum appears to be probing circumstellar material lying outside the main nebular shell that has not been accelerated by past fast wind episodes from the central star, and therefore most likely comes from equatorial and/or low-latitude regions...

  11. Lipids and apolipoproteins A-I, B and C-II and different rapid weight loss programs (weight lifters, wrestlers, boxers and judokas).

    Science.gov (United States)

    Jauhiainen, M; Laitinen, M; Penttilä, I; Nousiainen, U; Ahonen, E

    1985-01-01

    Apolipoproteins A-I, B, C-II and lipids were studied before and after rapid weight-loss schedules. The compared groups were all athletic. Apolipoproteins were determined by electroimmunoassay methods using apoproteins purified by the chromatofocusing column method. Dextran T10 was shown to increase rocket height in the ApoB assay; Dextran concentrations over 1% gave poor response. The linearity during calibration was from 0.3 to 3.0 g ApoB/l. Baseline values of ApoA-I in wrestlers, weightlifters, boxers and judokas were slightly higher compared to the "normal" population; ApoB was clearly reduced (mean value of 690 mg/l). Weight loss was significant in each experimental group (mean 4.1% across the active exercise, sauna and diuretic groups together). Considering the sportsmen as a whole, the most pronounced changes in the passive weight-loss (sauna) and diuretic groups were elevated apoprotein concentrations, whereas weight loss by active rapid exercise resulted in no apoprotein changes but instead an increase in HDL cholesterol and a decrease in triglycerides. The present study was the first to evaluate baseline values of apoproteins A-I, B and C-II in first-class athletes and the possible changes in these and lipid values with rapid weight-loss practices.

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  15. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB (Figure 3: number of events per month, data). In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  17. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television as many long feared, but the computer - the ubiquitous portal of work and personal lives. At this point, the computer is almost so common we don't notice it in our view. It's difficult to envision that, not that long ago, it was a gigantic, room-sized structure accessible only to a few, inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little more noted than a toaster. These dramati...

  18. COMPUTING

    CERN Document Server

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real data reconstruction and re-reconstructions, and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  4. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes. Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1s: response time, data transfer rate and success rate for Tape-to-Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful tests prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  7. Measuring Galaxy Clustering and the Evolution of [CII] Mean Intensity with far-IR Line Intensity Mapping During 0.5 < z < 1.5

    CERN Document Server

    Uzgil, Bade D; Bradford, Charles M; Lidz, Adam

    2014-01-01

    Infrared fine-structure emission lines from trace metals are powerful diagnostics of the interstellar medium in galaxies. We explore the possibility of studying the redshifted far-IR fine-structure line emission using the three-dimensional (3-D) power spectra obtained with an imaging spectrometer. The intensity mapping approach measures the spatio-spectral fluctuations due to line emission from all galaxies, including those below the individual detection threshold. The technique provides 3-D measurements of galaxy clustering and moments of the galaxy luminosity function. Furthermore, the linear portion of the power spectrum can be used to measure the total line emission intensity including all sources through cosmic time with redshift information naturally encoded. Total line emission, when compared to the total star formation activity and/or other line intensities reveals evolution of the interstellar conditions of galaxies in aggregate. As a case study, we consider measurement of [CII] autocorrelation in th...
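
    The measured line power spectrum in this approach is usually written as a clustering term plus shot noise, P_line(k) = Ibar^2 * b^2 * P_m(k) + P_shot, where Ibar is the mean line intensity and b the galaxy bias; a schematic sketch with illustrative numbers (not the paper's model values):

      def line_power(p_matter, i_bar, bias, p_shot):
          """P_line(k) = Ibar^2 * b^2 * P_m(k) + P_shot (clustering + shot noise)."""
          return i_bar**2 * bias**2 * p_matter + p_shot

      # Illustrative single-k evaluation: P_m in (Mpc/h)^3, Ibar in Jy/sr
      print(line_power(p_matter=1e3, i_bar=1e3, bias=2.0, p_shot=1e8))
      # -> 4.1e9 (Jy/sr)^2 (Mpc/h)^3, clustering-dominated at this k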

  8. Shocked and Scorched: A GREAT Investigation of [CII] and [OI] emission from free-floating Evaporating Gas Globules in Massive Star Formation Regions

    Science.gov (United States)

    Sahai, Raghvendra

    We propose to use GREAT to observe [CII] 158 micron and [OI] 63 micron emission towards 3 select members of a new class of tadpole-shaped free-floating evaporating gas globules (frEGGs) in two massive star-formation regions. Since discovering the most prominent member of this class in an HST imaging survey, we have identified substantial populations of such objects in several massive star-forming regions using Spitzer IRAC 8 micron images. By virtue of their distinct, isolated morphologies, frEGGs are ideal astrophysical laboratories for probing star formation in irradiated environments. Our molecular-line observations (CO, 13CO J=2-1 & HCO+ J=3-2) reveal the presence of dense molecular cores associated with these objects, with total masses of cool (~15 K) molecular gas exceeding 0.5-3 Msun, and our radio continuum imaging reveals bright photo-ionized peripheries around these objects. This pilot study will allow us to determine the mass of warm (a few 100 K) atomic gas which must exist in the photodissociation regions surrounding the molecular gas in frEGGs. The line profiles will be used to probe the photoevaporative flow that is expected to drive the evolution of these objects. We will use sophisticated 3-D numerical simulations of the dynamical and chemical evolution of dense, irradiated globules to reproduce our SOFIA data and additional existing multiwavelength data on frEGGs. Our proposed study will pave the way for a larger [CII] survey of frEGGs that will lead to new insights into the complex star formation process under the influence of the harsh ionizing radiation from massive stars.

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operations have been at a lower level as the Run 1 samples are completed and smaller samples for upgrades and preparations ramp up. Much of the computing activity is focused on preparations for Run 2 and on improvements in data access and in the flexibility of using resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently, transferring on average close to 520 TB per week with peaks close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months.   Tape utilisation was a focus for the operations teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  10. COMPUTING

    CERN Document Server

    I. Fisk

    2012-01-01

    Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month for 2012. Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign, with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 owing to the increase in the data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  11. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier-1 and Tier-2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug-tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and of regular computing shifts, monitoring the services and infrastructure as well as interfacing to the data operations tasks, are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08 (CCRC08), which is being run by the WLCG collaboration in two phases, involving the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between the Tier-0 and the Tier-1s. In addition, the capa...

  13. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy Ions The heavy-ion group has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days. For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless, the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and to the computing project members who spend countless hours nursing the...

  14. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and running of the samples through the High Level Trigger (HLT). The data were then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  15. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events accumulated in CMS during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, on the distribution of RECO data to T1s, which then serve data on request to T2s, along a topology known as the “fat tree”. Indeed, during this period the model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and the success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. A GlideInWMS installation is now deployed at CERN, in addition to the GlideInWMS factory located in the US. A new operational collaboration between the CERN team and the UCSD GlideIn factory operators covers each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental in site commissioning, increasing the number of sites available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  18. High performance computational integral imaging system using multi-view video plus depth representation

    Science.gov (United States)

    Shi, Shasha; Gioia, Patrick; Madec, Gérard

    2012-12-01

    Integral imaging is an attractive auto-stereoscopic three-dimensional (3D) technology for next-generation 3DTV, but its application is hindered by poor image quality, huge data volume and high processing complexity. In this paper, a new computational integral imaging (CII) system using multi-view video plus depth (MVD) representation is proposed to address these problems. The originality of this system lies in three aspects. First, a particular depth-image-based rendering (DIBR) technique is used in the encoding process to exploit the inter-view correlation between different sub-images (SIs). Second, the same DIBR method is applied on the display side to interpolate virtual SIs and improve the reconstructed 3D image quality. Finally, a novel parallel group projection (PGP) technique is proposed to simplify the reconstruction process. According to experimental results, the proposed CII system improves compression efficiency and displayed image quality, while reducing calculation complexity.
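
    The DIBR step at the heart of this system can be illustrated with a minimal pixel-shift warp: each pixel of a reference view is displaced by a disparity proportional to the inverse of its depth. The Python sketch below is our own simplification under a pinhole-camera assumption (the paper's actual DIBR and PGP methods are more elaborate); it omits occlusion ordering and hole filling:

        import numpy as np

        def dibr_warp(color, depth, baseline, focal):
            # Warp a reference view to a virtual view shifted by `baseline`.
            # Per-pixel disparity: d = baseline * focal / depth.
            h, w = depth.shape
            out = np.zeros_like(color)
            disp = np.round(baseline * focal / depth).astype(int)
            cols = np.arange(w)
            for y in range(h):
                x_new = cols + disp[y]
                ok = (x_new >= 0) & (x_new < w)
                out[y, x_new[ok]] = color[y, cols[ok]]
            return out  # zeros mark disocclusion holes, to be inpainted

        # Toy usage: a flat scene 2 m away, 8-bit grayscale.
        color = np.random.default_rng(1).integers(0, 255, (4, 8), dtype=np.uint8)
        depth = np.full((4, 8), 2.0)
        virtual = dibr_warp(color, depth, baseline=0.05, focal=100.0)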

  19. Apolipoprotein C-II Is a Potential Serum Biomarker as a Prognostic Factor of Locally Advanced Cervical Cancer After Chemoradiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Harima, Yoko, E-mail: harima@takii.kmu.ac.jp [Department of Radiology, Takii Hospital, Kansai Medical University, Moriguchi, Osaka (Japan); Ikeda, Koshi; Utsunomiya, Keita; Komemushi, Atsushi; Kanno, Shohei; Shiga, Toshiko [Department of Radiology, Takii Hospital, Kansai Medical University, Moriguchi, Osaka (Japan); Tanigawa, Noboru [Department of Radiology, Hirakata Hospital, Kansai Medical University, Hirakata, Osaka (Japan)

    2013-12-01

    Purpose: To identify pretreatment serum protein levels that offer a generally applicable measurement for predicting chemoradiation treatment outcomes in patients with locally advanced squamous cell cervical carcinoma (CC). Methods and Materials: In a screening study, measurements were conducted twice. First, 6 serum samples from CC patients (3 with no evidence of disease [NED] and 3 with cancer-caused death [CD]) and 2 from healthy controls were tested. Next, 12 serum samples from different CC patients (8 NED, 4 CD) and 4 from healthy controls were examined. Subsequently, 28 different CC patients (18 NED, 10 CD) and 9 controls were analyzed in the validation study. Protein chips were treated with the sample sera, and the serum protein pattern was detected by surface-enhanced laser desorption and ionization time-of-flight mass spectrometry (SELDI-TOF MS). Single-MS-based peptide mass fingerprinting (PMF) and tandem MS (MS/MS)-based peptide/protein identification methods were then used to identify the protein corresponding to the detected peak, and a turbidimetric assay was used to measure the levels of the protein that best matched this peptide peak. Results: The same peak at 8918 m/z was identified in both screening studies. Neither the screening study nor the validation study showed significant differences in the appearance of this peak between controls and NED. However, the intensity of the peak in CD was significantly lower than that of controls and NED in both pilot studies (P=.02, P=.04) and in the validation study (P=.01, P=.001). The protein best matching the peptide peak at 8918 m/z was identified as apolipoprotein C-II (ApoC-II) using the PMF and MS/MS methods. The turbidimetric assay showed that mean serum levels of ApoC-II tended to be lower in the CD group than in the NED group (P=.078). Conclusion: ApoC-II could be used as a biomarker for predicting and estimating the chemoradiation treatment outcome of patients with CC.

  20. Apolipoprotein C-II and lipoprotein lipase show a temporal and geographic correlation with surfactant lipid synthesis in preparation for birth

    Directory of Open Access Journals (Sweden)

    Gérard-Hudon Marie-Christine

    2010-11-01

    Abstract Background Fatty acids are precursors in the synthesis of surfactant phospholipids. Recently, we showed expression of apolipoprotein C-II (apoC-II), the essential cofactor of lipoprotein lipase (LPL), in the fetal mouse lung and found the protein on the day of the surge of surfactant synthesis (gestation day 17.5) in secretory granule-like structures in the distal epithelium. In the present study, we answer the following questions: Does apoC-II protein localization change according to the stage of lung development, and thus according to the need for surfactant? Are LPL molecules translocated to the luminal surface of capillaries? Do the sites of apoC-II and LPL gene expression change according to the stage of lung development and to protein localization? Results The present study investigated whether the sites of apoC-II and LPL mRNA and protein accumulation are regulated in the mouse lung between gestation day 15 and postnatal day 10. The major sites of apoC-II and LPL gene expression changed over time and were found mainly in the distal epithelium at the end of gestation but not after birth. Accumulation of apoC-II in secretory granule-like structures was not systematically observed, but was found in the distal epithelium only at the end of gestation and soon after birth, mainly in epithelia with no or small lumina. A noticeable increase in surfactant lipid content was measured before the end of gestation day 18, which correlates temporally with the presence of apoC-II in secretory granules in distal epithelium with no or small lumina but not with large lumina. LPL was detected in capillaries at all the developmental times studied. Conclusions This study demonstrates that apoC-II and LPL mRNAs correlate temporally and geographically with surfactant lipid synthesis in preparation for birth and suggests that fatty acid recruitment from the circulation by apoC-II-activated LPL is regionally modulated by apoC-II secretion. We propose a model

  1. Acrylamide-induced carcinogenicity in mouse lung involves mutagenicity: cII gene mutations in the lung of big blue mice exposed to acrylamide and glycidamide for up to 4 weeks.

    Science.gov (United States)

    Manjanatha, Mugimane G; Guo, Li-Wu; Shelton, Sharon D; Doerge, Daniel R

    2015-06-01

    Potential health risks for humans from exposure to acrylamide (AA) and its epoxide metabolite glycidamide (GA) have garnered much attention lately because substantial amounts of AA are present in a variety of fried and baked starchy foods. AA is tumorigenic in rodents, and a large number of in vitro and in vivo studies indicate that AA is genotoxic. A recent cancer bioassay on AA demonstrated that the lung was one of the target organs for tumor induction in mice; however, the mutagenicity of AA in this tissue is unclear. Therefore, to investigate whether or not gene mutation is involved in the etiology of AA- or GA-induced mouse lung carcinogenicity, we screened for cII mutant frequency (MF) in lungs from male and female Big Blue (BB) mice administered 0, 1.4, or 7.0 mM AA or GA in drinking water for up to 4 weeks (19-111 mg/kg bw/day). Both doses of AA and GA produced significant increases in cII MFs, with the high doses producing responses 2.7-5.6-fold higher than the corresponding controls (P ≤ 0.05; control MFs = 17.2 ± 2.2 and 15.8 ± 3.5 × 10^-6 in males and females, respectively). Molecular analysis of the mutants from the high doses indicated that AA and GA produced similar mutation spectra and that these spectra were significantly different from those in control mice (P ≤ 0.01). The predominant types of mutations in the lung cII gene from AA- and GA-treated mice were A:T → T:A and G:C → C:G transversions, and -1/+1 frameshifts at a homopolymeric run of Gs. The MFs and types of mutations induced by AA and GA in the lung are consistent with AA exerting its genotoxicity via metabolism to GA. These results suggest that AA is a mutagenic carcinogen in mouse lungs, and further studies on its potential health risk to humans are therefore warranted. Environ. Mol. Mutagen. 56:446-456, 2015.
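
    The mutant frequency itself is simple arithmetic: the number of cII mutant plaques divided by the total plaque-forming units screened. A small Python sketch (the plaque counts below are hypothetical, chosen only to land at the top of the reported 2.7-5.6-fold range against the male control MF quoted above):

        def mutant_frequency(mutant_plaques, total_pfu):
            # cII MF = mutant plaques per total plaque-forming units screened.
            return mutant_plaques / total_pfu

        control_mf = 17.2e-6                      # male control MF from the abstract
        treated_mf = mutant_frequency(48, 5.0e5)  # hypothetical plaque counts
        print(f"treated MF = {treated_mf:.1e}, "
              f"fold over control = {treated_mf / control_mf:.1f}")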

  2. Lipoprotein lipase-catalyzed hydrolysis of phosphatidylcholine of guinea pig very low density lipoproteins and discoidal complexes of phospholipid and apolipoprotein: effect of apolipoprotein C-II on the catalytic mechanism.

    Science.gov (United States)

    Shirai, K; Fitzharris, T J; Shinomiya, M; Muntz, H G; Harmony, J A; Jackson, R L; Quinn, D M

    1983-06-01

    To elucidate the mechanism by which apolipoprotein C-II (apoC-II) enhances the activity of lipoprotein lipase (LpL), discoidal phospholipid complexes were prepared with apoC-III and di[14C]palmitoyl phosphatidylcholine (DPPC), containing various amounts of apoC-II. The rate of DPPC hydrolysis catalyzed by purified bovine milk LpL was determined on the isolated complexes. The rate of hydrolysis was optimal at pH 8.0. Analysis of enzyme kinetic data over a range of phospholipid concentrations revealed that the major effect of apoC-II was to increase the maximal velocity (Vmax) some 50-fold, with a limited effect on the Michaelis constant (Km). Vmax of the apoC-III complex containing no apoC-II was 9.2 nmol/min per mg LpL vs. 482 nmol/min per mg LpL for the complex containing only apoC-II. The effect of apoC-II on the enzyme kinetic parameters for LpL-catalyzed hydrolysis of DPPC complexes was compared with that on the parameters for hydrolysis of DPPC and trioleoylglycerol incorporated into guinea pig very low density lipoproteins (VLDLp), which lack the equivalent of human apoC-II. Tri[3H]oleoylglycerol-labeled VLDLp were obtained by perfusion of guinea pig liver with [3H]oleic acid. Di[14C]palmitoyl phosphatidylcholine was incorporated into the VLDLp by incubating VLDLp with sonicated vesicles of di[14C]palmitoyl phosphatidylcholine and purified bovine liver phosphatidylcholine exchange protein. The rates of LpL-catalyzed hydrolysis of trioleoylglycerol and DPPC were determined at pH 7.4 and 8.5 in the presence and absence of apoC-II. In the presence of apoC-II, the Vmax for DPPC hydrolysis in guinea pig VLDLp increased at both pH 7.4 and pH 8.5 (2.4- and 3.2-fold, respectively); the value of Km did not change at either pH (0.23 mM). On the other hand, the Km for triacylglycerol hydrolysis in the presence of apoC-II decreased at both pH 7.4 (3.05 vs. 0.54 mM) and pH 8.5 (2.73 vs. 0.62 mM). These kinetic studies suggest
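
    The Vmax and Km values above come from fitting initial rates to the Michaelis-Menten equation, v = Vmax*[S]/(Km + [S]). A minimal fitting sketch in Python using scipy's curve_fit (the substrate points and noise are synthetic, generated from the apoC-II-only parameters quoted above rather than from the paper's raw data):

        import numpy as np
        from scipy.optimize import curve_fit

        def michaelis_menten(s, vmax, km):
            return vmax * s / (km + s)

        # Synthetic rates (substrate in mM, rate in nmol/min per mg LpL),
        # built from Vmax = 482, Km = 0.23 mM plus 5% multiplicative noise.
        s = np.array([0.05, 0.1, 0.2, 0.4, 0.8, 1.6])
        rng = np.random.default_rng(1)
        v = michaelis_menten(s, 482.0, 0.23) * (1 + 0.05 * rng.normal(size=s.size))

        (vmax_fit, km_fit), _ = curve_fit(michaelis_menten, s, v, p0=(400.0, 0.2))
        print(f"Vmax ~ {vmax_fit:.0f} nmol/min per mg, Km ~ {km_fit:.2f} mM")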

  3. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  4. Grid Computing

    Indian Academy of Sciences (India)

    2016-05-01

    A computing grid interconnects resources such as high-performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers on demand. In this article, we describe the grid computing model and enumerate the major differences between grid and cloud computing.

  5. Computer Virus

    Institute of Scientific and Technical Information of China (English)

    高振桥

    2002-01-01

    If you work with a computer, it is certain that you cannot avoid dealing with at least one computer virus. But how much do you know about it? Well, actually, a computer virus is not a biological one such as those that cause illnesses in people. It is a kind of computer program

  6. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  7. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture.

  8. Computational chemistry

    OpenAIRE

    2000-01-01

    Computational chemistry has come of age. With significant strides in computer hardware and software over the last few decades, computational chemistry has achieved full partnership with theory and experiment as a tool for understanding and predicting the behavior of a broad range of chemical, physical, and biological phenomena. The Nobel Prize award to John Pople and Walter Kohn in 1998 highlighted the importance of these advances in computational chemistry. With massively parallel computers ...

  9. Duality Computing in Quantum Computers

    Institute of Scientific and Technical Information of China (English)

    LONG Gui-Lu; LIU Yang

    2008-01-01

    In this letter, we propose a duality computing mode, which resembles the particle-wave duality property when a quantum system, such as a quantum computer, passes through a double-slit. In this mode, computing operations are not necessarily unitary. The duality mode provides a natural link between classical computing and quantum computing. In addition, the duality mode provides a new tool for quantum algorithm design.

  10. Computational manufacturing

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper presents a general framework for computational manufacturing. The methodology of computational manufacturing aims at integrating computational geometry, machining principles, sensor information fusion, optimization, computational intelligence and virtual prototyping to solve problems in the modeling, reasoning, control, planning and scheduling of manufacturing processes and systems. There are three typical problems in computational manufacturing, i.e., scheduling (time-domain), geometric reasoning (space-domain) and decision-making (interaction between time-domain and space-domain). Some theoretical fundamentals of computational manufacturing are also discussed.

  11. Computer Algebra.

    Science.gov (United States)

    Pavelle, Richard; And Others

    1981-01-01

    Describes the nature and use of computer algebra and its applications to various physical sciences. Includes diagrams illustrating, among others, a computer algebra system and flow chart of operation of the Euclidean algorithm. (SK)

  12. Quantum computing

    OpenAIRE

    Li, Shu-Shen; Long, Gui-lu; Bai, Feng-Shan; Feng, Song-Lin; Zheng, Hou-Zhi

    2001-01-01

    Quantum computing is a quickly growing research field. This article introduces the basic concepts of quantum computing, recent developments in quantum searching, and decoherence in a possible quantum dot realization.

  13. Computational dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Siebert, B.R.L.; Thomas, R.H.

    1996-01-01

    The paper presents a definition of the term "Computational Dosimetry", interpreted as the sub-discipline of computational physics devoted to radiation metrology. It is shown that computational dosimetry is more than a mere collection of computational methods. Computational simulations directed at basic understanding and modelling are important tools provided by computational dosimetry, while another very important application is the support it can give to the design, optimization and analysis of experiments. However, the primary task of computational dosimetry is to reduce the variance in the determination of absorbed dose (and its related quantities), for example in the disciplines of radiological protection and radiation therapy. In this paper emphasis is given to the discussion of potential pitfalls in the application of computational dosimetry, and recommendations are given for their avoidance. The need to compare calculated and experimental data whenever possible is strongly stressed.
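
    A toy example of the kind of simulation computational dosimetry builds on is a Monte Carlo estimate of narrow-beam photon transmission through an absorber, which can be checked against the analytic exp(-mu*d) law. In the Python sketch below, the attenuation coefficient and thickness are arbitrary illustrative values:

        import numpy as np

        rng = np.random.default_rng(42)
        mu = 0.2      # linear attenuation coefficient (1/cm), illustrative
        depth = 5.0   # absorber thickness (cm)
        n = 100_000
        # Photon free path lengths are exponentially distributed, mean 1/mu.
        paths = rng.exponential(scale=1.0 / mu, size=n)
        transmitted = (paths > depth).mean()
        print(f"MC transmission = {transmitted:.4f} "
              f"vs analytic = {np.exp(-mu * depth):.4f}")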

  14. Contextual Computing

    CERN Document Server

    Porzel, Robert

    2011-01-01

    This book uses the latest in knowledge representation and human-computer interaction to address the problem of contextual computing in artificial intelligence. It uses high-level context to solve some challenging problems in natural language understanding.

  15. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals, as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives, and traditionally they have been extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  16. Computable models

    CERN Document Server

    Turner, Raymond

    2009-01-01

    Computational models can be found everywhere in present day science and engineering. In providing a logical framework and foundation for the specification and design of specification languages, Raymond Turner uses this framework to introduce and study computable models. In doing so he presents the first systematic attempt to provide computational models with a logical foundation. Computable models have wide-ranging applications from programming language semantics and specification languages, through to knowledge representation languages and formalism for natural language semantics. They are al

  17. Computing fundamentals introduction to computers

    CERN Document Server

    Wempen, Faithe

    2014-01-01

    The absolute beginner's guide to learning basic computer skills Computing Fundamentals, Introduction to Computers gets you up to speed on basic computing skills, showing you everything you need to know to conquer entry-level computing courses. Written by a Microsoft Office Master Instructor, this useful guide walks you step-by-step through the most important concepts and skills you need to be proficient on the computer, using nontechnical, easy-to-understand language. You'll start at the very beginning, getting acquainted with the actual, physical machine, then progress through the most common

  18. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  19. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    of the new microprocessors and network technologies. However, the understanding of the computer represented within this program poses a challenge for the intentions of the program. The computer is understood as a multitude of invisible intelligent information devices which confines the computer as a tool...

  20. Distributed Computing.

    Science.gov (United States)

    Ryland, Jane N.

    1988-01-01

    The microcomputer revolution, in which small and large computers have gained tremendously in capability, has created a distributed computing environment. This circumstance presents administrators with the opportunities and the dilemmas of choosing appropriate computing resources for each situation. (Author/MSE)

  1. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) its basic concept of natural computing has neither been defined theoretically nor implemented practically; (2) it cannot en...

  2. Computational Complexity

    Directory of Open Access Journals (Sweden)

    J. A. Tenreiro Machado

    2017-02-01

    Complex systems (CS) involve many elements that interact at different scales in time and space. The challenges in modeling CS have led to the development of novel computational tools with applications in a wide range of scientific areas. The computational problems posed by CS exhibit intrinsic difficulties that are a major concern in Computational Complexity Theory. [...

  3. Computer Ease.

    Science.gov (United States)

    Drenning, Susan; Getz, Lou

    1992-01-01

    Computer Ease is an intergenerational program designed to put an Ohio elementary school's computer lab, software library, staff, and students at the disposal of older adults desiring to become computer literate. Three 90-minute instructional sessions allow seniors to experience 1-to-1 high-tech instruction by enthusiastic, nonthreatening…

  4. Computer science

    CERN Document Server

    Blum, Edward K

    2011-01-01

    Computer Science: The Hardware, Software and Heart of It focuses on the deeper aspects of the two recognized subdivisions of Computer Science, Software and Hardware. These subdivisions are shown to be closely interrelated as a result of the stored-program concept. Computer Science: The Hardware, Software and Heart of It includes certain classical theoretical computer science topics such as Unsolvability (e.g. the halting problem) and Undecidability (e.g. Godel's incompleteness theorem) that treat problems that exist under the Church-Turing thesis of computation. These problem topics explain in

  5. Human Computation

    CERN Document Server

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  6. Computer Science Research: Computation Directorate

    Energy Technology Data Exchange (ETDEWEB)

    Durst, M.J. (ed.); Grupe, K.F. (ed.)

    1988-01-01

    This report contains short papers in the following areas: large-scale scientific computation; parallel computing; general-purpose numerical algorithms; distributed operating systems and networks; knowledge-based systems; and technology information systems.

  7. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  8. Computer Literacy: Teaching Computer Ethics.

    Science.gov (United States)

    Troutner, Joanne

    1986-01-01

    Suggests learning activities for teaching computer ethics in three areas: (1) equal access; (2) computer crime; and (3) privacy. Topics include computer time, advertising, class enrollments, copyright law, sabotage ("worms"), the Privacy Act of 1974 and the Freedom of Information Act of 1966. (JM)

  9. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and the translation of Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  10. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  11. Quantum Computing

    CERN Document Server

    Steane, A M

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarise not just quantum computing, but the whole subject of quantum information theory. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, the review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the EPR experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from classical information theory, and, arguably, quantum from classical physics. Basic quantum information ideas are described, including key distribution, teleportation, data compression, quantum error correction, the universal quantum computer and qua...

  12. Computer Virus

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Computer viruses are small software programs that are designed to spread from one computer to another and to interfere with computer operation. A virus might delete data on your computer, use your e-mail program to spread itself to other computers, or even erase everything on your hard disk. Viruses are most easily spread by attachments in e-mail messages or instant messaging messages. That is why it is essential that you never

  13. Fog computing

    OpenAIRE

    Poplštein, Karel

    2016-01-01

    The purpose of this bachelor's thesis is to address fog computing technology, which emerged as a possible answer to the requirements of the Internet of Things and aims to lower latency and network bandwidth use by moving a substantial part of computing operations to the network edge. The thesis identifies advantages as well as potential threats and analyses possible solutions to these problems, proceeding to a comparison of cloud and fog computing and specifying areas of use for both of them. Finally...

  14. Biological computation

    CERN Document Server

    Lamm, Ehud

    2011-01-01

    Introduction and Biological Background: Biological Computation; The Influence of Biology on Mathematics - Historical Examples; Biological Introduction; Models and Simulations. Cellular Automata: Biological Background; The Game of Life; General Definition of Cellular Automata; One-Dimensional Automata; Examples of Cellular Automata; Comparison with a Continuous Mathematical Model; Computational Universality; Self-Replication; Pseudo Code. Evolutionary Computation: Evolutionary Biology and Evolutionary Computation; Genetic Algorithms; Example Applications; Analysis of the Behavior of Genetic Algorithms; Lamarckian Evolution; Genet

  15. Computer Software.

    Science.gov (United States)

    Kay, Alan

    1984-01-01

    Discusses the nature and development of computer software. Programing, programing languages, types of software (including dynamic spreadsheets), and software of the future are among the topics considered. (JN)

  16. Cloud Computing

    CERN Document Server

    Mirashe, Shivaji P

    2010-01-01

    Computing as you know it is about to change, your applications and documents are going to move from the desktop into the cloud. I'm talking about cloud computing, where applications and files are hosted on a "cloud" consisting of thousands of computers and servers, all linked together and accessible via the Internet. With cloud computing, everything you do is now web based instead of being desktop based. You can access all your programs and documents from any computer that's connected to the Internet. How will cloud computing change the way you work? For one thing, you're no longer tied to a single computer. You can take your work anywhere because it's always accessible via the web. In addition, cloud computing facilitates group collaboration, as all group members can access the same programs and documents from wherever they happen to be located. Cloud computing might sound far-fetched, but chances are you're already using some cloud applications. If you're using a web-based email program, such as Gmail or Ho...

  17. Computational Sustainability

    OpenAIRE

    Eaton, Eric; University of Pennsylvania; Gomes, Carla P.; Cornell University; Williams, Brian; Massachusetts Institute of Technology

    2014-01-01

    Computational sustainability problems, which exist in dynamic environments with high amounts of uncertainty, provide a variety of unique challenges to artificial intelligence research and the opportunity for significant impact upon our collective future. This editorial provides an overview of artificial intelligence for computational sustainability, and introduces this special issue of AI Magazine.

  18. Grid Computing

    Science.gov (United States)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  19. Computational Science

    Institute of Scientific and Technical Information of China (English)

    K. Li

    2007-01-01

    Computer science is the discipline that anchors the computer industry, which has been improving processor performance, communication bandwidth and storage capacity on the so-called "Moore's law" curve, i.e., at the rate of doubling every 18 to 24 months, during the past decades.

  20. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Since the first idea of using GPUs for general-purpose computing, things have evolved over the years and there are now several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A newer emerging standard, OpenCL (Open Computing Language), tries to unify the different GPU general-purpose computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We present the benefits of the CUDA programming model, and we compare the two main approaches, CUDA and AMD APP (Stream), with the new framework, OpenCL, that tries to unify the GPGPU computing models.
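
    The data-based parallelism described here is easiest to see in an elementwise kernel such as SAXPY (y = a*x + y), where every element can be computed independently. A minimal sketch using CuPy, a NumPy-compatible Python array library that executes on CUDA GPUs (our choice for illustration; the paper itself works with the CUDA C APIs, and this assumes a CUDA device and a CuPy installation):

        import cupy as cp  # NumPy-compatible arrays backed by a CUDA GPU

        n = 1 << 20
        a = 2.0
        x = cp.arange(n, dtype=cp.float32)
        y = cp.ones(n, dtype=cp.float32)
        # One logical operation; under the hood CuPy launches elementwise
        # CUDA kernels over all n elements in parallel.
        y = a * x + y
        print(cp.asnumpy(y[:4]))  # copy a few results back to the host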

  1. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  2. Granular Computing

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    The basic ideas and principles of granular computing (GrC) have been studied explicitly or implicitly in many fields in isolation. With the recent renewed and fast-growing interest, it is time to extract the commonality from a diversity of fields and to study systematically and formally the domain-independent principles of granular computing in a unified model. A framework of granular computing can be established by applying its own principles. We examine such a framework from two perspectives: granular computing as structured thinking and as structured problem solving. From the philosophical perspective, or the conceptual level, granular computing focuses on structured thinking based on multiple levels of granularity. The implementation of this philosophy at the application level deals with structured problem solving.

  3. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  4. Chromatin computation.

    Directory of Open Access Journals (Sweden)

    Barbara Bryant

    In living cells, DNA is packaged along with protein and RNA into chromatin. Chemical modifications to nucleotides and histone proteins are added, removed and recognized by multi-functional molecular complexes. Here I define a new computational model, in which chromatin modifications are information units that can be written onto a one-dimensional string of nucleosomes, analogous to the symbols written onto cells of a Turing machine tape, and chromatin-modifying complexes are modeled as read-write rules that operate on a finite set of adjacent nucleosomes. I illustrate the use of this "chromatin computer" to solve an instance of the Hamiltonian path problem. I prove that chromatin computers are computationally universal, and therefore more powerful than the logic circuits often used to model transcription factor control of gene expression. Features of biological chromatin provide a rich instruction set for efficient computation of nontrivial algorithms in biological time scales. Modeling chromatin as a computer shifts how we think about chromatin function, suggests new approaches to medical intervention, and lays the groundwork for the engineering of a new class of biological computing machines.
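
    The flavor of the model can be conveyed with a toy rule system: a string of nucleosomes carries marks, and a "complex" is a rule that reads a window of adjacent nucleosomes and rewrites it. The Python sketch below is our own invented example of such a read-write rule (processive spreading of an activating mark), not one of the rules from the paper:

        def apply_rule(tape):
            # Rule: an 'A' (activating mark) followed by 'U' (unmarked)
            # spreads the mark one nucleosome to the right per step.
            out = list(tape)
            for i in range(len(tape) - 1):
                if tape[i] == "A" and tape[i + 1] == "U":
                    out[i + 1] = "A"
            return "".join(out)

        tape = "AUUUUUUU"   # an 8-nucleosome "tape" with one activating mark
        for step in range(4):
            print(step, tape)
            tape = apply_rule(tape)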

  5. Computing methods

    CERN Document Server

    Berezin, I S

    1965-01-01

    Computing Methods, Volume 2 is a five-chapter text that presents the numerical methods of solving sets of several mathematical equations. This volume includes computation sets of linear algebraic equations, high degree equations and transcendental equations, numerical methods of finding eigenvalues, and approximate methods of solving ordinary differential equations, partial differential equations and integral equations.The book is intended as a text-book for students in mechanical mathematical and physics-mathematical faculties specializing in computer mathematics and persons interested in the

  6. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  7. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  8. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.

  9. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...

  10. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1) an intermediary step between any theoretical construct and its targeted empirical space and 2) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project, we also seek to introduce to scholars of religion some...

  11. COMPUTERS HAZARDS

    Directory of Open Access Journals (Sweden)

    Andrzej Augustynek

    2007-01-01

    Full Text Available In June 2006, over 12.6 million Polish users of the Web were registered. On average, each of them spent 21 hours and 37 minutes monthly browsing the Web. That is why the psychological aspects of computer utilization have become an urgent research subject. The results of research into the development of the Polish information society, carried out at the AGH University of Science and Technology under the leadership of Leslaw H. Haber from 2000 until the present, indicate the emergence of dynamic changes in the ways computers are used and in the circumstances of their use. One of the interesting regularities has been the inversely proportional relation between the level of computer skills and the frequency of Web utilization. It has been found that in 2005, compared to 2000, the following changes occurred: a significant drop in the number of students who never used computers and the Web; a remarkable increase in computer knowledge and skills (particularly pronounced in the case of first-year students); a decreasing gap in computer skills between students of the first and the third year, and between male and female students; and a declining popularity of computer games. It has also been demonstrated that the hazard of computer screen addiction was highest in the case of unemployed youth outside the school system. As much as 12% of this group of young people were addicted to computers. The large amount of leisure time that these youths enjoyed induced them to excessive utilization of the Web. Polish housewives are another population group at risk of addiction to the Web. The duration of the long Web chats carried out by ever younger users has been another matter of concern. Since the phenomenon of computer addiction is relatively new, no specific therapy methods have been developed. In general, the therapy applied to computer addiction syndrome is similar to the techniques applied in cases of alcohol or gambling addiction. Individual and group

  12. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of the state of the art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  13. Quantum Computers

    Science.gov (United States)

    2010-03-04

    efficient or less costly than their classical counterparts. A large-scale quantum computer is certainly an extremely ambitious goal, appearing to us ... outperform the largest classical supercomputers in solving some specific problems important for data encryption. In the long term, another application ... which the quantum computer depends, causing the quantum mechanically destructive process known as decoherence. Decoherence comes in several forms

  14. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers, modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilites per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has application in studying catalysis, properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution is outlined.

  15. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  16. Quantum computers.

    Science.gov (United States)

    Ladd, T D; Jelezko, F; Laflamme, R; Nakamura, Y; Monroe, C; O'Brien, J L

    2010-03-04

    Over the past several decades, quantum information science has emerged to seek answers to the question: can we gain some advantage by storing, transmitting and processing information encoded in systems that exhibit unique quantum properties? Today it is understood that the answer is yes, and many research groups around the world are working towards the highly ambitious technological goal of building a quantum computer, which would dramatically improve computational power for particular tasks. A number of physical systems, spanning much of modern physics, are being developed for quantum computation. However, it remains unclear which technology, if any, will ultimately prove successful. Here we describe the latest developments for each of the leading approaches and explain the major challenges for the future.

  17. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    The second half of the 20th century has been characterized by an explosive development in information technology (Maney, Hamm, & O'Brien, 2011). Processing power, storage capacity and network bandwidth have increased exponentially, resulting in new possibilities and shifting IT paradigms. In step with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production...

  18. Computational mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Raboin, P J

    1998-01-01

    The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable in driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for "Springback Predictability" and with the Federal Aviation Administration (FAA) for the "Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris." In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

  19. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  20. Computer busses

    CERN Document Server

    Buchanan, William

    2000-01-01

    As more and more equipment is interface- or 'bus'-driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important, both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for the particular application. There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed using a top-down approach, helpin

  1. Reconfigurable Computing

    CERN Document Server

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a comp

  2. COMPUTATIONAL THINKING

    OpenAIRE

    Evgeniy K. Khenner

    2016-01-01

    Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, its practical utility, and its claim to recognition in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education;...

  3. Computer immunology.

    Science.gov (United States)

    Forrest, Stephanie; Beauchemin, Catherine

    2007-04-01

    This review describes a body of work on computational immune systems that behave analogously to the natural immune system. These artificial immune systems (AIS) simulate the behavior of the natural immune system and in some cases have been used to solve practical engineering problems such as computer security. AIS have several strengths that can complement wet lab immunology. It is easier to conduct simulation experiments and to vary experimental conditions, for example, to rule out hypotheses; it is easier to isolate a single mechanism to test hypotheses about how it functions; agent-based models of the immune system can integrate data from several different experiments into a single in silico experimental system.
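
    A concrete instance of the analogy is the negative-selection algorithm associated with this research tradition: candidate detectors are generated at random and retained only if they match no "self" pattern, after which anything they do match is flagged as anomalous. A minimal Python sketch (the bit-string encoding and the r-contiguous matching rule are illustrative choices, not taken from the paper):

        import random

        random.seed(0)
        R = 4  # a detector matches a string if they agree on R contiguous bits

        def matches(detector: str, s: str, r: int = R) -> bool:
            """r-contiguous-bits rule: True if detector and s agree on r adjacent positions."""
            run = 0
            for a, b in zip(detector, s):
                run = run + 1 if a == b else 0
                if run >= r:
                    return True
            return False

        def generate_detectors(self_set, n_detectors, length=12):
            """Negative selection: keep random candidates that match no 'self' string."""
            detectors = []
            while len(detectors) < n_detectors:
                cand = ''.join(random.choice('01') for _ in range(length))
                if not any(matches(cand, s) for s in self_set):
                    detectors.append(cand)
            return detectors

        self_set = {'000011110000', '111100001111'}       # patterns defined as normal
        detectors = generate_detectors(self_set, 5)
        probe = '010101010101'                            # an unseen pattern
        print(any(matches(d, probe) for d in detectors))  # True means flagged as non-self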

  4. Distributed computing

    CERN Document Server

    Van Renesse, R

    1991-01-01

    This series will start with an introduction to distributed computing systems. Distributed computing paradigms will be presented followed by a discussion on how several important contemporary distributed operating systems use these paradigms. Topics will include processing paradigms, storage paradigms, scalability and robustness. Throughout the course everything will be illustrated by modern distributed systems notably the Amoeba distributed operating system of the Free University in Amsterdam and the Plan 9 operating system of AT&T Bell Laboratories. Plan 9 is partly designed and implemented by Ken Thompson, the main person behind the successful UNIX operating system.

  5. Computational Artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to try to clarify the issue and in doing so revisits and reconsiders the notion of 'computational artifact'.

  6. Computer viruses

    Science.gov (United States)

    Denning, Peter J.

    1988-01-01

    The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs,and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.
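
    The "digitally authenticated object programs" mentioned here can be illustrated with a simple integrity check: record a cryptographic hash of each program while it is known to be clean, and treat any later mismatch as possible infection. A minimal Python sketch (the program images are hypothetical byte strings):

        import hashlib

        def digest(image: bytes) -> str:
            """SHA-256 hash of a program image."""
            return hashlib.sha256(image).hexdigest()

        # Baseline hashes, recorded while the programs are known to be clean.
        programs = {"editor": b"\x7fELF...editor", "mailer": b"\x7fELF...mailer"}
        baseline = {name: digest(image) for name, image in programs.items()}

        # Later, a virus appends a copy of itself to one program;
        # its hash no longer matches the recorded baseline.
        programs["mailer"] += b"<viral code>"
        tampered = [n for n, image in programs.items() if digest(image) != baseline[n]]
        print("possibly infected:", tampered)   # ['mailer']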

  7. Computer systems

    Science.gov (United States)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  8. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope is hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  9. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  10. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). The book illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics, with emphasis on algorithmic advances that will allow re-application in other...

  11. Computational trigonometry

    Energy Technology Data Exchange (ETDEWEB)

    Gustafson, K. [Univ. of Colorado, Boulder, CO (United States)

    1994-12-31

    By means of the author's earlier theory of antieigenvalues and antieigenvectors, a new computational approach to iterative methods is presented. This enables an explicit trigonometric understanding of iterative convergence and provides new insights into the sharpness of error bounds. Direct applications to Gradient descent, Conjugate gradient, GCR(k), Orthomin, CGN, GMRES, CGS, and other matrix iterative schemes will be given.
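
    For orientation, the central quantity of this theory, for a symmetric positive definite matrix A with extreme eigenvalues \lambda_{\min} and \lambda_{\max}, is the operator angle \phi(A); in LaTeX notation (standard definitions from Gustafson's published theory, summarized here rather than quoted from the abstract):

        \cos\phi(A) = \min_{x \neq 0} \frac{\langle Ax, x \rangle}{\|Ax\|\,\|x\|}
                    = \frac{2\sqrt{\lambda_{\min}\lambda_{\max}}}{\lambda_{\min} + \lambda_{\max}},
        \qquad
        \sin\phi(A) = \frac{\lambda_{\max} - \lambda_{\min}}{\lambda_{\max} + \lambda_{\min}}.

    The trigonometric reading of convergence is then, for example, that steepest descent contracts the A-norm error by at least the factor \sin\phi(A) per step, recovering the classical Kantorovich bound.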

  12. Computational Logistics

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  13. Computational Logistics

    DEFF Research Database (Denmark)

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  14. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

    One of the major challenges in today's post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these derivatives have been determined using bump-and-revalue, but the increasing magnitude of these computations does...

  15. Computational biology

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann’s early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved t...

  16. Computational Logistics

    DEFF Research Database (Denmark)

    Jensen, Rune Møller; Pacino, Dario; Voß, Stefan

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized...

  17. Computing News

    CERN Multimedia

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...

  18. Grid computing

    CERN Multimedia

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).

  19. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    Full Text Available We develop some parts of the frame theory in Banach spaces from the point of view of Computable Analysis. We define computable M-basis and use it to construct a computable Banach space of scalar valued sequences. Computable Xd frames and computable Banach frames are also defined and computable versions of sufficient conditions for their existence are obtained.
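
    For orientation, the Hilbert-space frame condition that these Banach-space notions generalize reads, in LaTeX notation (standard definition, added here for context rather than quoted from the paper): a sequence (x_k) is a frame for H if there are constants 0 < A \le B with

        A\,\|f\|^{2} \;\le\; \sum_{k} \bigl|\langle f, x_k \rangle\bigr|^{2} \;\le\; B\,\|f\|^{2}
        \qquad \text{for all } f \in H.

    Xd frames and Banach frames replace the \ell^2 sum by the norm of an associated sequence space and require a bounded reconstruction operator; the computable versions studied in the paper then concern effective witnesses to such conditions.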

  20. Computational Combustion

    Energy Technology Data Exchange (ETDEWEB)

    Westbrook, C K; Mizobuchi, Y; Poinsot, T J; Smith, P J; Warnatz, J

    2004-08-26

    Progress in the field of computational combustion over the past 50 years is reviewed. Particular attention is given to those classes of models that are common to most system modeling efforts, including fluid dynamics, chemical kinetics, liquid sprays, and turbulent flame models. The developments in combustion modeling are placed into the time-dependent context of the accompanying exponential growth in computer capabilities and Moore's Law. Superimposed on this steady growth, the occasional sudden advances in modeling capabilities are identified and their impacts are discussed. Integration of submodels into system models for spark ignition, diesel and homogeneous charge, compression ignition engines, surface and catalytic combustion, pulse combustion, and detonations are described. Finally, the current state of combustion modeling is illustrated by descriptions of a very large jet lifted 3D turbulent hydrogen flame with direct numerical simulation and 3D large eddy simulations of practical gas burner combustion devices.
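
    At the core of the chemical-kinetics submodels reviewed here is the modified Arrhenius form for an elementary reaction rate coefficient, in LaTeX notation (standard combustion notation, added for orientation):

        k(T) = A\,T^{b} \exp\!\left(-\frac{E_a}{R\,T}\right),

    where A is the pre-exponential factor, b the temperature exponent, E_a the activation energy and R the gas constant; a detailed mechanism couples hundreds or thousands of such rates into a stiff system of ODEs for the species concentrations.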

  1. Computational Electromagnetics

    CERN Document Server

    Rylander, Thomas; Bondeson, Anders

    2013-01-01

    Computational Electromagnetics is a young and growing discipline, expanding as a result of the steadily increasing demand for software for the design and analysis of electrical devices. This book introduces three of the most popular numerical methods for simulating electromagnetic fields: the finite difference method, the finite element method and the method of moments. In particular it focuses on how these methods are used to obtain valid approximations to the solutions of Maxwell's equations, using, for example, "staggered grids" and "edge elements." The main goal of the book is to make the reader aware of different sources of errors in numerical computations, and also to provide the tools for assessing the accuracy of numerical methods and their solutions. To reach this goal, convergence analysis, extrapolation, von Neumann stability analysis, and dispersion analysis are introduced and used frequently throughout the book. Another major goal of the book is to provide students with enough practical understan...
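
    A minimal illustration of the finite difference method on a staggered ("Yee") grid is the one-dimensional FDTD update for the E and H fields; a Python sketch in normalized units (grid size, Courant number and the Gaussian source are illustrative, not taken from the book):

        import numpy as np

        n, steps = 200, 400
        ez = np.zeros(n)          # electric field, sampled at integer grid points
        hy = np.zeros(n - 1)      # magnetic field, staggered at half-points
        for t in range(steps):
            # Faraday's law: update H from the spatial difference of E.
            hy += 0.5 * (ez[1:] - ez[:-1])            # Courant number 0.5
            # Ampere's law: update E from the spatial difference of H.
            ez[1:-1] += 0.5 * (hy[1:] - hy[:-1])
            # Soft Gaussian source injected at the center of the grid.
            ez[n // 2] += np.exp(-((t - 30) / 10.0) ** 2)
        print("field energy ~", float(np.sum(ez**2) + np.sum(hy**2)))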

  2. Computational Physics

    Science.gov (United States)

    Thijssen, Jos

    2013-10-01

    1. Introduction; 2. Quantum scattering with a spherically symmetric potential; 3. The variational method for the Schrödinger equation; 4. The Hartree-Fock method; 5. Density functional theory; 6. Solving the Schrödinger equation in periodic solids; 7. Classical equilibrium statistical mechanics; 8. Molecular dynamics simulations; 9. Quantum molecular dynamics; 10. The Monte Carlo method; 11. Transfer matrix and diagonalisation of spin chains; 12. Quantum Monte Carlo methods; 13. The finite element method for partial differential equations; 14. The lattice Boltzmann method for fluid dynamics; 15. Computational methods for lattice field theories; 16. High performance computing and parallelism; Appendix A. Numerical methods; Appendix B. Random number generators; References; Index.

  3. Everything Computes

    Institute of Scientific and Technical Information of China (English)

    Bill; Hofmann

    1999-01-01

    Dear American Professor, I am a student in Beijing. At the beginning of last semester, we four roommates gathered some 10,000 yuan (a big sum here, approximately 1,150 USD) and bought a computer, which is our joint property. Since the computer came into our room, it has been used round the clock, except for the time we were having classes. So even at midnight, when I woke up from a dream, I could still see

  4. Computer Game

    Science.gov (United States)

    1992-01-01

    Using NASA studies of advanced lunar exploration and colonization, KDT Industries, Inc. and Wesson International have developed MOONBASE, a computer game. The player, or team commander, must build and operate a lunar base using NASA technology. He has 10 years to explore the surface, select a site and assemble structures brought from Earth into an efficient base. The game was introduced in 1991 by Texas Space Grant Consortium.

  5. Computational Electromagnetics

    Science.gov (United States)

    2011-02-20

    a collaboration between Caltech's postdoctoral associate N. Albin and OB) have shown that, for a variety of reasons, the first-order ... KZK approximation", Nathan Albin, Oscar P. Bruno, Theresa Y. Cheung and Robin O. Cleveland, preprint (2011); "A Spectral FC Solver for the Compressible ... Navier-Stokes Equations in General Domains I: Explicit time-stepping", Nathan Albin and Oscar P. Bruno, to appear in Journal of Computational Physics

  6. Topics in Chemical Instrumentation: CII. Automated Anodic Stripping Voltammetry.

    Science.gov (United States)

    Stock, John T.; Ewing, Galen W., Ed.

    1980-01-01

    Presents details of anodic stripping analysis (ASV) in college chemistry laboratory experiments. Provides block diagrams of the analyzer system, circuitry and power supplies of the automated stripping analyzer, and instructions for implementing microcomputer control of the ASV. (CS)

  7. Computer vision

    Science.gov (United States)

    Gennery, D.; Cunningham, R.; Saund, E.; High, J.; Ruoff, C.

    1981-01-01

    The field of computer vision is surveyed and assessed, key research issues are identified, and possibilities for a future vision system are discussed. The problems of descriptions of two and three dimensional worlds are discussed. The representation of such features as texture, edges, curves, and corners are detailed. Recognition methods are described in which cross correlation coefficients are maximized or numerical values for a set of features are measured. Object tracking is discussed in terms of the robust matching algorithms that must be devised. Stereo vision, camera control and calibration, and the hardware and systems architecture are discussed.
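
    The correlation-based recognition described here can be sketched in a few lines: slide a template over an image and score every position with the normalized cross-correlation coefficient. A Python sketch (numpy-based; the brute-force search and the random test image are illustrative):

        import numpy as np

        def ncc(patch: np.ndarray, template: np.ndarray) -> float:
            """Normalized cross-correlation coefficient of two equal-sized arrays."""
            p = patch - patch.mean()
            t = template - template.mean()
            denom = np.sqrt((p * p).sum() * (t * t).sum())
            return float((p * t).sum() / denom) if denom > 0 else 0.0

        def best_match(image: np.ndarray, template: np.ndarray):
            """Grid position where the template correlates most strongly with the image."""
            th, tw = template.shape
            scores = {(i, j): ncc(image[i:i + th, j:j + tw], template)
                      for i in range(image.shape[0] - th + 1)
                      for j in range(image.shape[1] - tw + 1)}
            return max(scores, key=scores.get)

        rng = np.random.default_rng(0)
        image = rng.random((32, 32))
        template = image[10:14, 20:24].copy()   # plant the template in the image
        print(best_match(image, template))      # recovers (10, 20)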

  8. Customizable computing

    CERN Document Server

    Chen, Yu-Ting; Gill, Michael; Reinman, Glenn; Xiao, Bingjun

    2015-01-01

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory

  9. Computational crystallization.

    Science.gov (United States)

    Altan, Irem; Charbonneau, Patrick; Snell, Edward H

    2016-07-15

    Crystallization is a key step in macromolecular structure determination by crystallography. While a robust theoretical treatment of the process is available, due to the complexity of the system, the experimental process is still largely one of trial and error. In this article, efforts in the field are discussed together with a theoretical underpinning using a solubility phase diagram. Prior knowledge has been used to develop tools that computationally predict the crystallization outcome and define mutational approaches that enhance the likelihood of crystallization. For the most part these tools are based on binary outcomes (crystal or no crystal), and the full information contained in an assembly of crystallization screening experiments is lost. The potential of this additional information is illustrated by examples where new biological knowledge can be obtained and where a target can be sub-categorized to predict which class of reagents provides the crystallization driving force. Computational analysis of crystallization requires complete and correctly formatted data. While massive crystallization screening efforts are under way, the data available from many of these studies are sparse. The potential for this data and the steps needed to realize this potential are discussed.

  10. Tensor computations in computer algebra systems

    CERN Document Server

    Korolkova, A V; Sevastyanov, L A

    2014-01-01

    This paper considers three types of tensor computations. On their basis, we attempt to formulate criteria that must be satisfied by a computer algebra system dealing with tensors. We briefly overview the current state of tensor computations in different computer algebra systems. The tensor computations are illustrated with appropriate examples implemented in specific systems: Cadabra and Maxima.
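
    Cadabra and Maxima manipulate such expressions symbolically; the underlying index gymnastics can be illustrated numerically with einsum-style contraction. A Python sketch (numpy; only a component-level analogue of what a computer algebra system does, with an illustrative metric and vector):

        import numpy as np

        g = np.diag([-1.0, 1.0, 1.0, 1.0])    # example metric tensor (Minkowski)
        v = np.array([2.0, 1.0, 0.0, 0.0])    # a contravariant vector v^a

        v_low = np.einsum('ab,b->a', g, v)    # lower the index: v_a = g_ab v^b
        norm2 = np.einsum('a,a->', v_low, v)  # full contraction: v_a v^a
        print(v_low, norm2)                   # [-2. 1. 0. 0.] -3.0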

  11. Brain computer

    Directory of Open Access Journals (Sweden)

    Sarah N. Abdulkader

    2015-07-01

    Full Text Available Brain computer interface technology represents a rapidly growing field of research with application systems. Its contributions in medical fields range from prevention to neuronal rehabilitation for serious injuries. Mind reading and remote communication have their unique fingerprint in numerous fields such as education, self-regulation, production, marketing and security, as well as games and entertainment. It creates a mutual understanding between users and the surrounding systems. This paper shows the application areas that could benefit from brain waves in facilitating or achieving their goals. We also discuss the major usability and technical challenges that face the utilization of brain signals in the various components of a BCI system. Different solutions that aim to limit and decrease their effects have also been reviewed.

  12. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    papers [1,2]. In [1] we assume that the adversary can corrupt any set from a given adversary structure. In this setting we study a problem of doing efficient VSS and MPC given an access to a secret sharing scheme (SS). For all adversary structures where VSS is possible at all, we show that, up...... an impossibility result indicating that a similar equivalence does not hold for Multiparty Computation (MPC): we show that even if protocols are given black-box access for free to an idealized secret sharing scheme secure for the access structure in question, it is not possible to handle all relevant access...... adversary structure. We propose new VSS and MPC protocols that are substantially more efficient than the ones previously known. Another contribution of [2] is an attack against a Weak Secret Sharing Protocol (WSS) of [3]. The attack exploits the fact that the adversary is adaptive. We present this attack...

  13. Social Computing

    CERN Document Server

    CERN. Geneva

    2011-01-01

    The past decade has witnessed a momentous transformation in the way people interact with each other. Content is now co-produced, shared, classified, and rated by millions of people, while attention has become the ephemeral and valuable resource that everyone seeks to acquire. This talk will describe how social attention determines the production and consumption of content within both the scientific community and social media, how its dynamics can be used to predict the future and the role that social media plays in setting the public agenda. About the speaker Bernardo Huberman is a Senior HP Fellow and Director of the Social Computing Lab at Hewlett Packard Laboratories. He received his Ph.D. in Physics from the University of Pennsylvania, and is currently a Consulting Professor in the Department of Applied Physics at Stanford University. He originally worked in condensed matter physics, ranging from superionic conductors to two-dimensional superfluids, and made contributions to the theory of critical p...

  14. Computer Tree

    Directory of Open Access Journals (Sweden)

    Onur AĞAOĞLU

    2014-12-01

    Full Text Available It is crucial that gifted and talented students be supported by different educational methods according to their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled "Computer Tree" serves to identify learner readiness levels and to define the basic conceptual framework. A language teacher also contributes to the process, since the lesson caters for the creative function of the basic linguistic skills. The teaching technique is applied to students aged 9-11. The lesson introduces an evaluation process covering the basic information, skills, and interests of the target group. Furthermore, it includes an observation process by way of peer assessment. The lesson is considered a good sample of planning for any subject, given the unpredicted convergence of visual and technical abilities with linguistic abilities.

  15. computer networks

    Directory of Open Access Journals (Sweden)

    N. U. Ahmed

    2002-01-01

    Full Text Available In this paper, we construct a new dynamic model for the Token Bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing and network dimensioning. We also propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
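
    For readers unfamiliar with the algorithm being modeled, the TB policing rule in its simplest form: tokens accrue at rate r up to a bucket depth b, and a packet conforms only if enough tokens are available when it arrives. A Python sketch (parameter names and units are illustrative assumptions, not the authors' notation):

        import time

        class TokenBucket:
            def __init__(self, rate: float, depth: float):
                self.rate, self.depth = rate, depth       # tokens/second, bucket size
                self.tokens, self.last = depth, time.monotonic()

            def conforms(self, packet_size: float) -> bool:
                """Refill by elapsed time, then admit the packet if tokens suffice."""
                now = time.monotonic()
                self.tokens = min(self.depth,
                                  self.tokens + (now - self.last) * self.rate)
                self.last = now
                if packet_size <= self.tokens:
                    self.tokens -= packet_size
                    return True
                return False                              # non-conforming: drop or mark

        tb = TokenBucket(rate=1000.0, depth=1500.0)       # e.g. bytes/s, bytes
        print(tb.conforms(1200), tb.conforms(1200))       # True False (bucket drained)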

  16. Computational micromechanics

    Science.gov (United States)

    Ortiz, M.

    1996-09-01

    Selected issues in computational micromechanics are reviewed, with particular emphasis on multiple-scale problems and micromechanical models of material behavior. Examples considered include: the bridging of atomistic and continuum scales, with application to nanoindentation and the brittle-to-ductile transition; the development of dislocation-based constitutive relations for pure metallic crystals and intermetallic compounds, with applications to fracture of single crystals and bicrystals; the simulation of non-planar three-dimensional crack growth at the microscale, with application to mixed mode I III effective behavior and crack trapping and bridging in fiber-reinforced composites; and the direct micromechanical simulation of fragmentation of brittle solids and subsequent flow of the comminuted phase.

  17. Computed Tomography (CT) -- Sinuses

    Science.gov (United States)

    Computed tomography (CT) of the ... of CT of the Sinuses? What is CT (Computed Tomography) of the Sinuses? Computed tomography, more commonly ...

  18. Study of Quantum Computing

    Directory of Open Access Journals (Sweden)

    Prashant Anil Patil

    2012-04-01

    Full Text Available This paper gives detailed information about quantum computers and the differences between quantum computers and traditional computers, whose underlying principles are superficially similar yet fundamentally different. Many research groups are working towards the highly technological goal of building a quantum computer, which would dramatically improve computational power for particular tasks. Quantum computers are very useful for computation in science and research, where large amounts of data and information must be computed, processed, stored, retrieved, transmitted and displayed in less time and with an accuracy that traditional computers do not provide.
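
    The basic contrast with traditional computers can be stated compactly in the standard qubit formalism (added here for orientation, not taken from the paper). A single qubit and an n-qubit register are, in LaTeX notation,

        |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \quad |\alpha|^2 + |\beta|^2 = 1,
        \qquad
        |\Psi\rangle = \sum_{x \in \{0,1\}^n} c_x\,|x\rangle,

    so a computation acts on up to 2^n amplitudes at once, although a measurement returns a single outcome x with probability |c_x|^2, which is why the speedup applies only to particular tasks.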

  19. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  20. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr). Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala

  1. Computing with functionals—computability theory or computer science?

    OpenAIRE

    Normann, Dag

    2006-01-01

    We review some of the history of the computability theory of functionals of higher types, and we will demonstrate how contributions from logic and theoretical computer science have shaped this still active subject.

  2. Program Facilitates Distributed Computing

    Science.gov (United States)

    Hui, Joseph

    1993-01-01

    KNET computer program facilitates distribution of computing between UNIX-compatible local host computer and remote host computer, which may or may not be UNIX-compatible. Capable of automatic remote log-in. User communicates interactively with remote host computer. Data output from remote host computer directed to local screen, to local file, and/or to local process. Conversely, data input from keyboard, local file, or local process directed to remote host computer. Written in ANSI standard C language.

  3. Applied Parallel Computing Industrial Computation and Optimization

    DEFF Research Database (Denmark)

    Madsen, Kaj; NA NA NA Olesen, Dorte

    Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96).

  4. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation provides comprehensive coverage of the principles and aspects of computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  5. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  6. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  7. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

    This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference ACS-2014. The Advanced Computer Systems conference concentrated from its beginning on methods and algorithms of artificial intelligence. The years that followed brought new areas of interest concerning technical informatics related to soft computing and some more technological aspects of computer science, such as multimedia and computer graphics, software engineering, web systems, information security and safety, and project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security and Software Technologies.

  8. Cloud Computing (4)

    Institute of Scientific and Technical Information of China (English)

    Wang Bai; Xu Liutong

    2010-01-01

    8 Case Study: Cloud computing is still a new phenomenon. Although many IT giants are developing their own cloud computing infrastructures, platforms, software, and services, few have really succeeded in becoming cloud computing providers.

  9. Avoiding Computer Viruses.

    Science.gov (United States)

    Rowe, Joyce; And Others

    1989-01-01

    The threat of computer sabotage is a real concern to business teachers and others responsible for academic computer facilities. Teachers can minimize the possibility. Eight suggestions for avoiding computer viruses are given. (JOW)

  10. Computer Viruses: An Overview.

    Science.gov (United States)

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  11. ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING: APPLICATION OF COMPUTATIONAL BIOPHYSICAL TRANSPORT, COMPUTATIONAL CHEMISTRY, AND COMPUTATIONAL BIOLOGY

    Science.gov (United States)

    Computational toxicology (CompTox) leverages the significant gains in computing power and computational techniques (e.g., numerical approaches, structure-activity relationships, bioinformatics) realized over the last few years, thereby reducing costs and increasing efficiency i...

  12. Computed Tomography (CT) - Spine

    Science.gov (United States)

    Computed tomography (CT) of the ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ...

  13. Computed Tomography (CT) -- Head

    Science.gov (United States)

    Computed tomography (CT) of the ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ...

  14. PR Educators Stress Computers.

    Science.gov (United States)

    Fleming, Charles A.

    1988-01-01

    Surveys the varied roles computers play in public relations education. Asserts that, because computers are used extensively in the public relations field, students should become acquainted with the varied capabilities of computers and their role in public relations practice. (MM)

  15. A semi-empirical model for the M star GJ832 using modeling tools developed for computing semi-empirical solar models

    Science.gov (United States)

    Linsky, Jeffrey; Fontenla, Juan; France, Kevin

    2016-05-01

    We present a semi-empirical model of the photosphere, chromosphere, transition region, and corona for the M2 dwarf star GJ832, which hosts two exoplanets. The atmospheric model uses a modification of the Solar Radiation Physical Modeling tools developed by Fontenla and collaborators. These computer codes model non-LTE spectral line formation for 52 atoms and ions and include a large number of lines from 20 abundant diatomic molecules that are present in the much cooler photosphere and chromosphere of this star. We constructed the temperature distribution to fit Hubble Space Telescope observations of chromospheric lines (e.g., MgII), transition region lines (CII, CIV, SiIV, and NV), and the UV continuum. Temperatures in the coronal portion of the model are consistent with ROSAT and XMM-Newton X-ray observations and the FeXII 124.2 nm line. The excellent fit of the model to the data demonstrates that the highly developed model atmosphere code developed to explain regions of the solar atmosphere with different activity levels has wide applicability to stars, including this M star with an effective temperature 2200 K cooler than the Sun. We describe similarities and differences between the M star model and models of the quiet and active Sun.

  16. Distributed Computing: An Overview

    OpenAIRE

    Md. Firoj Ali; Rafiqul Zaman Khan

    2015-01-01

    Decrease in hardware costs and advances in computer networking technologies have led to increased interest in the use of large-scale parallel and distributed computing systems. Distributed computing systems offer the potential for improved performance and resource sharing. In this paper we have made an overview on distributed computing. In this paper we studied the difference between parallel and distributed computing, terminologies used in distributed computing, task allocation in distribute...

  17. Introduction to computers

    OpenAIRE

    Rajaraman, A

    1995-01-01

    An article on computer applications for knowledge processing, intended to generate awareness among librarians of the possibilities offered by ICT to improve services. Compares computers and the human brain, provides a historical perspective on the development of computer technology, explains the components of the computer and the computer languages, and identifies the areas where computers can be applied and their benefits. Explains available storage systems and the database management process. Points out ...

  18. A Review on Modern Distributed Computing Paradigms: Cloud Computing, Jungle Computing and Fog Computing

    OpenAIRE

    Hajibaba, Majid; Gorgin, Saeid

    2014-01-01

    Distributed computing attempts to improve performance in large-scale computing problems by resource sharing. Moreover, rising low-cost computing power coupled with advances in communications/networking and the advent of big data now enables new distributed computing paradigms such as Cloud, Jungle and Fog computing. Cloud computing brings a number of advantages to consumers in terms of accessibility and elasticity. It is based on centralization of resources that possess huge processing po...

  19. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

    Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly; it is very hard even for professionals to keep updated. Computer people do not

  20. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

    Full Text Available Cloud computing was, and will remain, a new way of providing Internet services and computing. This approach builds on many existing services, such as the Internet, grid computing, and Web services. Cloud computing as a system aims to provide on-demand services that are more acceptable in price and infrastructure. It is precisely the transition from the computer to a service offered to consumers as a product delivered online. This paper is meant to describe the quality of cloud computing services, analyzing the advantages and characteristics offered by them. It is a theoretical paper. Keywords: Cloud computing, QoS, quality of cloud computing

  1. Cloud Computing (2)

    Institute of Scientific and Technical Information of China (English)

    Wang Bai; Xu Liutong

    2010-01-01

    Editor's Desk: Cloud computing is a topic of intense interest in the Internet field. Major IT giants have launched their own cloud computing products. This four-part lecture series discusses cloud computing technology in the following aspects: the first part provided a brief description of the origin and characteristics of cloud computing from the user's point of view; the other parts introduce typical applications of cloud computing, technically analyze the specific content within the cloud, its components, architecture and computational paradigm, compare cloud computing to other distributed computing technologies, and discuss its successful cases, commercial models, related technical and economic issues, and development trends.

  2. Cloud Computing (1)

    Institute of Scientific and Technical Information of China (English)

    Wang Bai; Xu Liutong

    2010-01-01

    Editor's Desk: Cloud computing is a topic of intense interest in the Internet field. Major IT giants have launched their own cloud computing products. This four-part lecture series will discuss cloud computing technology in the following aspects: the first part provides a brief description of the origin and characteristics of cloud computing from the user's point of view; the other parts introduce typical applications of cloud computing, technically analyze the specific content within the cloud, its components, architecture and computational paradigm, compare cloud computing to other distributed computing technologies, and discuss its successful cases, commercial models, related technical and economic issues, and development trends.

  3. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance Computing. The ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  4. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
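
    The fallback idea can be sketched as a graph problem: if the preferred network's route would cross the identified defective link, compute a route through the second network instead. A Python sketch (breadth-first search; the node and link data are hypothetical, not taken from the patent):

        from collections import deque

        def bfs_path(links, src, dst):
            """Shortest-hop path from src to dst over an undirected set of links."""
            adj = {}
            for a, b in links:
                adj.setdefault(a, []).append(b)
                adj.setdefault(b, []).append(a)
            prev, queue = {src: None}, deque([src])
            while queue:
                u = queue.popleft()
                if u == dst:                   # walk predecessors back to src
                    path = []
                    while u is not None:
                        path.append(u)
                        u = prev[u]
                    return path[::-1]
                for v in adj.get(u, []):
                    if v not in prev:
                        prev[v] = u
                        queue.append(v)
            return None

        net1 = {(0, 1), (1, 2), (2, 3)}        # first data communications network
        net2 = {(0, 2), (2, 3)}                # independent second network
        defective = (1, 2)                     # fault identified in network 1
        route = bfs_path(net1 - {defective}, 0, 3) or bfs_path(net2, 0, 3)
        print(route)                           # falls back to network 2: [0, 2, 3]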

  5. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  6. Advances in unconventional computing

    CERN Document Server

    2017-01-01

    Unconventional computing is a niche for interdisciplinary science, a cross-breed of computer science, physics, mathematics, chemistry, electronic engineering, biology, material science and nanotechnology. The aims of this book are to uncover and exploit principles and mechanisms of information processing in, and functional properties of, physical, chemical and living systems, in order to develop efficient algorithms, design optimal architectures and manufacture working prototypes of future and emergent computing devices. This first volume presents theoretical foundations of the future and emergent computing paradigms and architectures. The topics covered are computability, (non-)universality and complexity of computation; physics of computation, analog and quantum computing; reversible and asynchronous devices; cellular automata and other mathematical machines; P-systems and cellular computing; infinity and spatial computation; chemical and reservoir computing. The book is an encyclopedia, the first ever complete autho...

  7. The Computer Manpower Evolution

    Science.gov (United States)

    Rooney, Joseph J.

    1975-01-01

    Advances and employment outlook in the field of computer science are discussed as well as the problems related to improving the quality of computer education. Specific computer jobs discussed include: data processing machine repairers, systems analysts, programmers, computer and peripheral equipment operators, and keypunch operators. (EA)

  8. Elementary School Computer Literacy.

    Science.gov (United States)

    New York City Board of Education, Brooklyn, NY.

    This curriculum guide presents lessons for computer literacy instruction in the elementary grades. The first section of the guide includes 22 lessons on hardware, covering such topics as how computers work, keyboarding, word processing, and computer peripherals. The 13 lessons in the second section cover social topics related to the computer,…

  9. Computability and unsolvability

    CERN Document Server

    Davis, Martin

    1985-01-01

    ""A clearly written, well-presented survey of an intriguing subject."" - Scientific American. Classic text considers general theory of computability, computable functions, operations on computable functions, Turing machines self-applied, unsolvable decision problems, applications of general theory, mathematical logic, Kleene hierarchy, computable functionals, classification of unsolvable decision problems and more.

  10. Great Principles of Computing

    OpenAIRE

    Denning, Peter J.

    2008-01-01

    The Great Principles of Computing is a framework for understanding computing as a field of science. The website ... April 2008 (Rev. 8/31/08).

  11. Computer Viruses. Technology Update.

    Science.gov (United States)

    Ponder, Tim, Comp.; Ropog, Marty, Comp.; Keating, Joseph, Comp.

    This document provides general information on computer viruses, how to help protect a computer network from them, and measures to take if a computer becomes infected. Highlights include the origins of computer viruses; virus contraction; a description of some common virus types (File Virus, Boot Sector/Partition Table Viruses, Trojan Horses, and…

  12. Study on Parallel Computing

    Institute of Scientific and Technical Information of China (English)

    Guo-Liang Chen; Guang-Zhong Sun; Yun-Quan Zhang; Ze-Yao Mo

    2006-01-01

    In this paper, we present a general survey on parallel computing. The main contents include the parallel computer system, which is the hardware platform of parallel computing; parallel algorithms, which are its theoretical basis; and parallel programming, which is its software support. After that, we also introduce some parallel applications and enabling technologies. We argue that parallel computing research should form an integrated methodology of "architecture - algorithm - programming - application". Only in this way can parallel computing research achieve continuous development and become more realistic.

  13. Students’ Choice for Computers

    Institute of Scientific and Technical Information of China (English)

    Cai; Wei

    2015-01-01

    Nowadays, computers are widely used as useful tools in our daily life, so you can see students using computers everywhere. The purpose of our survey is to find out the answers to the following questions: 1. What brand of computers do students often choose? 2. What is the most important factor in choosing a computer, in students' view? 3. What do students most want to do with computers? After that, we hope the students will know what kind of computer they really need and which factors must be thought about when buying one.

  14. Roadmap to greener computing

    CERN Document Server

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directi

  15. Computer mathematics for programmers

    CERN Document Server

    Abney, Darrell H; Sibrel, Donald W

    1985-01-01

    Computer Mathematics for Programmers presents the mathematics that is essential to the computer programmer. The book comprises 10 chapters. The first chapter introduces several computer number systems. Chapter 2 shows how to perform arithmetic operations using the number systems introduced in Chapter 1. The third chapter covers the way numbers are stored in computers, how the computer performs arithmetic on real numbers and integers, and how round-off errors are generated in computer programs. Chapter 4 details the use of algorithms and flowcharting as problem-solving tools for computer p

  16. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C³P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  17. Computation in Classical Mechanics

    CERN Document Server

    Timberlake, Todd

    2007-01-01

    There is a growing consensus that physics majors need to learn computational skills, but many departments are still devoid of computation in their physics curriculum. Some departments may lack the resources or commitment to create a dedicated course or program in computational physics. One way around this difficulty is to include computation in a standard upper-level physics course. An intermediate classical mechanics course is particularly well suited for including computation. We discuss the ways we have used computation in our classical mechanics courses, focusing on how computational work can improve students' understanding of physics as well as their computational skills. We present examples of computational problems that serve these two purposes. In addition, we provide information about resources for instructors who would like to include computation in their courses.

  18. Research on Comparison of Cloud Computing and Grid Computing

    OpenAIRE

    Liu Yuxi; Wang Jianhua

    2012-01-01

    The development of the computer industry has been promoted by the progress of distributed computing, parallel computing and grid computing, from which the cloud computing movement arose. This study describes the types of cloud computing services and the similarities and differences between cloud computing and grid computing, discusses the aspects in which cloud computing improves on grid computing, and covers the common problems faced by both paradigms, including some security issues.

  19. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer.This book discusses the algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  20. Distributed computing in bioinformatics.

    Science.gov (United States)

    Jain, Eric

    2002-01-01

    This paper provides an overview of methods and current applications of distributed computing in bioinformatics. Distributed computing is a strategy of dividing a large workload among multiple computers to reduce processing time, or to make use of resources such as programs and databases that are not available on all computers. Participating computers may be connected either through a local high-speed network or through the Internet.
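
    As a rough single-machine stand-in for the strategy described above, the sketch below divides a toy bioinformatics workload (computing the GC content of DNA sequences; an invented example) among worker processes using Python's multiprocessing. A real distributed setup would spread the same map step across networked machines.

        from multiprocessing import Pool

        def gc_content(seq):
            """Fraction of G and C bases in one DNA sequence."""
            return (seq.count("G") + seq.count("C")) / len(seq)

        if __name__ == "__main__":
            sequences = ["ATGCGC", "TTTTAA", "GGGCCC", "ATATAT"]
            # Divide the workload among several worker processes.
            with Pool(processes=4) as pool:
                results = pool.map(gc_content, sequences)
            print(results)  # [0.666..., 0.0, 1.0, 0.0]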

  1. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  2. Toward Cloud Computing Evolution

    OpenAIRE

    Susanto, Heru; Almunawar, Mohammad Nabil; Kang, Chen Chin

    2012-01-01

    Information Technology (IT) shaped the success of organizations, giving them a solid foundation that increases both their level of efficiency and their productivity. The computing industry is witnessing a paradigm shift in the way computing is performed worldwide. There is a growing awareness among consumers and enterprises to access their IT resources extensively through a "utility" model known as "cloud computing." Cloud computing was initially rooted in distributed grid-based computing. ...

  3. Ion Trap Quantum Computing

    Science.gov (United States)

    2011-12-01

    In an inspiring speech at the MIT Physics of Computation 1st Conference in 1981, Feynman proposed the development of a computer that would obey the... A course on ion trap based quantum computing for physics and computer science students would include lecture notes, slides, lesson plans, a syllabus, reading lists, videos, demonstrations, and laboratories. LIST OF REFERENCES: [1] R. P. Feynman, "Simulating physics with computers," Int. J

  4. Cloud Computing (3)

    Institute of Scientific and Technical Information of China (English)

    Wang Bai; Xu Liutong

    2010-01-01

    @@ Editor's Desk: In the preceding two parts of this series, several aspects of cloud computing-including definition, classification, characteristics, typical applications, and service levels-were discussed. This part continues with a discussion of Cloud Computing Open Architecture and the Market-Oriented Cloud. A comparison is made between cloud computing and other distributed computing technologies, and Google's cloud platform is analyzed to determine how distributed computing is implemented in its particular model.

  5. Understanding Student Computational Thinking with Computational Modeling

    CERN Document Server

    Aiken, John M; Douglas, Scott S; Burk, John B; Scanlon, Erin M; Thoms, Brian D; Schatz, Michael F

    2012-01-01

    Recently, the National Research Council's framework for next generation science standards highlighted "computational thinking" as one of its "fundamental practices". Students taking a physics course that employed the Arizona State University's Modeling Instruction curriculum were taught to construct computational models of physical systems. Student computational thinking was assessed using a proctored programming assignment, written essay, and a series of think-aloud interviews, where the students produced and discussed a computational model of a baseball in motion via a high-level programming environment (VPython). Roughly a third of the students in the study were successful in completing the programming assignment. Student success on this assessment was tied to how students synthesized their knowledge of physics and computation. On the essay and interview assessments, students displayed unique views of the relationship between force and motion; those who spoke of this relationship in causal (rather than obs...
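
    A computational model of the kind described here can be sketched in a few lines. The following is an assumed minimal version in plain Python rather than VPython, stepping a ball through flight with Euler integration; the initial values are arbitrary.

        # Euler-step model of a baseball in free flight (air resistance ignored).
        dt = 0.01            # time step, s
        g = -9.8             # gravitational acceleration, m/s^2
        x, y = 0.0, 1.0      # initial position, m
        vx, vy = 30.0, 20.0  # initial velocity, m/s

        while y > 0:
            vy += g * dt     # the net force (gravity) changes the velocity...
            x += vx * dt     # ...and the velocity changes the position.
            y += vy * dt

        print(f"Range: {x:.1f} m")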

  6. Heterogeneous Distributed Computing for Computational Aerosciences

    Science.gov (United States)

    Sunderam, Vaidy S.

    1998-01-01

    The research supported under this award focuses on heterogeneous distributed computing for high-performance applications, with particular emphasis on computational aerosciences. The overall goal of this project was to investigate issues in, and develop solutions to, the efficient execution of computational aeroscience codes in heterogeneous concurrent computing environments. In particular, we worked in the context of the PVM[1] system and, subsequent to detailed conversion efforts and performance benchmarking, devised novel techniques to increase the efficacy of heterogeneous networked environments for computational aerosciences. Our work has been based upon the NAS Parallel Benchmark suite, but has also recently expanded in scope to include the NAS I/O benchmarks as specified in the NHT-1 document. In this report we summarize our research accomplishments under the auspices of the grant.

  7. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources, such as computer software and hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets radiology free from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  8. Duality quantum computing

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    In this article, we review the development of a newly proposed quantum computer: the duality computer (or duality quantum computer) and the duality mode of quantum computers. The duality computer is based on the particle-wave duality principle of quantum mechanics. Compared to an ordinary quantum computer, the duality quantum computer is a quantum computer on the move, passing through a multi-slit. It offers more computing operations than are possible with an ordinary quantum computer. The two most distinctive operations are the quantum division operation and the quantum combiner operation. The division operation divides the wave function of a quantum computer into many attenuated, identical parts. The combiner operation combines the wave functions in the different parts into a single part. The duality mode is a way in which a quantum computer with some extra qubit resources simulates a duality computer. The main structure of the duality quantum computer and the duality mode, their mathematical description and algorithm designs are reviewed.
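
    A minimal sketch of these two operations, assuming the generalized-gate notation commonly used in the duality quantum computing literature (the attenuation coefficients p_i below are our notation, not quoted from the article): the divider splits the wave function into n attenuated copies, each branch may pass through its own unitary U_i, and the combiner re-merges the branches.

        \[
          |\psi\rangle \;\xrightarrow{\text{divide}}\; \sum_{i=1}^{n} p_i\,|\psi\rangle
          \;\xrightarrow{\;U_i\;}\; \sum_{i=1}^{n} p_i\,U_i\,|\psi\rangle ,
          \qquad p_i \ge 0, \quad \sum_{i=1}^{n} p_i = 1 .
        \]

    Read together, the divider and combiner realize a linear combination of unitaries, which is the sense in which the duality computer offers operations unavailable as a single step on an ordinary quantum computer.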

  9. Explorations in quantum computing

    CERN Document Server

    Williams, Colin P

    2011-01-01

    By the year 2020, the basic memory components of a computer will be the size of individual atoms. At such scales, the current theory of computation will become invalid. "Quantum computing" is reinventing the foundations of computer science and information theory in a way that is consistent with quantum physics - the most accurate model of reality currently known. Remarkably, this theory predicts that quantum computers can perform certain tasks breathtakingly faster than classical computers -- and, better yet, can accomplish mind-boggling feats such as teleporting information, breaking suppos

  10. Computers for imagemaking

    CERN Document Server

    Clark, D

    1981-01-01

    Computers for Image-Making tells the computer non-expert all he needs to know about Computer Animation. In the hands of expert computer engineers, computer picture-drawing systems have, since the earliest days of computing, produced interesting and useful images. As a result of major technological developments since then, it no longer requires the expert's skill to draw pictures; anyone can do it, provided they know how to use the appropriate machinery. This collection of specially commissioned articles reflects the diversity of user applications in this expanding field

  11. Polymorphous computing fabric

    Science.gov (United States)

    Wolinski, Christophe Czeslaw; Gokhale, Maya B.; McCabe, Kevin Peter

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  12. Language and Computers

    CERN Document Server

    Dickinson, Markus; Meurers, Detmar

    2012-01-01

    Language and Computers introduces students to the fundamentals of how computers are used to represent, process, and organize textual and spoken information. Concepts are grounded in real-world examples familiar to students’ experiences of using language and computers in everyday life. A real-world introduction to the fundamentals of how computers process language, written specifically for the undergraduate audience, introducing key concepts from computational linguistics. Offers a comprehensive explanation of the problems computers face in handling natural language Covers a broad spectru

  13. Computer techniques for electromagnetics

    CERN Document Server

    Mittra, R

    1973-01-01

    Computer Techniques for Electromagnetics discusses the ways in which computer techniques solve practical problems in electromagnetics. It discusses the impact of the emergence of high-speed computers in the study of electromagnetics. This text provides a brief background on the approaches used by mathematical analysts in solving integral equations. It also demonstrates how to use computer techniques in computing current distribution, radar scattering, and waveguide discontinuities, and inverse scattering. This book will be useful for students looking for a comprehensive text on computer techni

  14. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  15. Reversible computing fundamentals, quantum computing, and applications

    CERN Document Server

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique.Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergr

  16. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance Computing. The ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  17. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

    Full Text Available

    Cloud computing is, and will continue to be, a new way of providing Internet services and computing. The approach builds on many existing services and technologies, such as the Internet, grid computing, and Web services. As a system, cloud computing aims to provide on-demand services that are more acceptable in terms of price and infrastructure. It marks the transition from the computer as a product to a service offered to consumers and delivered online. This paper describes the quality of cloud computing services, analyzing the advantages and characteristics they offer. It is a theoretical paper.

    Keywords: Cloud computing, QoS, quality of cloud computing

  18. Intelligent Computer Graphics 2012

    CERN Document Server

    Miaoulis, Georgios

    2013-01-01

    In Computer Graphics, the use of intelligent techniques started more recently than in other research areas. However, during these last two decades, the use of intelligent Computer Graphics techniques is growing up year after year and more and more interesting techniques are presented in this area.   The purpose of this volume is to present current work of the Intelligent Computer Graphics community, a community growing up year after year. This volume is a kind of continuation of the previously published Springer volumes “Artificial Intelligence Techniques for Computer Graphics” (2008), “Intelligent Computer Graphics 2009” (2009), “Intelligent Computer Graphics 2010” (2010) and “Intelligent Computer Graphics 2011” (2011).   Usually, this kind of volume contains, every year, selected extended papers from the corresponding 3IA Conference of the year. However, the current volume is made from directly reviewed and selected papers, submitted for publication in the volume “Intelligent Computer Gr...

  19. The Global Computer

    DEFF Research Database (Denmark)

    Sharp, Robin

    2002-01-01

    This paper describes a Danish project, involving partners from Copenhagen University, DTU, the University of Southern Denmark, Aalborg University, Copenhagen Business School and UNI-C, for exploiting Grid technology to provide computer resources for applications with very large computational...

  20. Applications of computer algebra

    CERN Document Server

    1985-01-01

    Today, certain computer software systems exist which surpass the computational ability of researchers when their mathematical techniques are applied to many areas of science and engineering. These computer systems can perform a large portion of the calculations seen in mathematical analysis. Despite this massive power, thousands of people use these systems as a routine resource for everyday calculations. These software programs are commonly called "Computer Algebra" systems. They have names such as MACSYMA, MAPLE, muMATH, REDUCE and SMP. They are receiving credit as a computational aid with increasing regularity in articles in the scientific and engineering literature. When most people think about computers and scientific research these days, they imagine a machine grinding away, processing numbers arithmetically. It is not generally realized that, for a number of years, computers have been performing non-numeric computations. This means, for example, that one inputs an equation and obtains a closed for...

  1. Applying Computational Intelligence

    CERN Document Server

    Kordon, Arthur

    2010-01-01

    Offers guidelines on creating value from the application of computational intelligence methods. This work introduces a methodology for effective real-world application of computational intelligence while minimizing development cost, and outlines the critical, underestimated technology marketing efforts required

  2. Computational Science Facility (CSF)

    Data.gov (United States)

    Federal Laboratory Consortium — PNNL Institutional Computing (PIC) is focused on meeting DOE's mission needs and is part of PNNL's overarching research computing strategy. PIC supports large-scale...

  3. ICASE Computer Science Program

    Science.gov (United States)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  4. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  5. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  6. Computational Continuum Mechanics

    CERN Document Server

    Shabana, Ahmed A

    2011-01-01

    This text presents the theory of continuum mechanics using computational methods. Ideal for students and researchers, the second edition features a new chapter on computational geometry and finite element analysis.

  7. Computer Intrusions and Attacks.

    Science.gov (United States)

    Falk, Howard

    1999-01-01

    Examines some frequently encountered unsolicited computer intrusions, including computer viruses, worms, Java applications, trojan horses or vandals, e-mail spamming, hoaxes, and cookies. Also discusses virus-protection software, both for networks and for individual users. (LRW)

  8. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... the Sinuses? What is CT (Computed Tomography) of the Sinuses? Computed tomography, more commonly known as a ... What are some common uses of the procedure? CT of the sinuses is primarily used ...

  9. Book Review: Computational Topology

    DEFF Research Database (Denmark)

    Raussen, Martin

    2011-01-01

    Computational Topology by Herbert Edelsbrunner and John L. Harer. American Mathematical Society, 2010 - ISBN 978-0-8218-4925-5.

  10. Cognitive Computing for Security.

    Energy Technology Data Exchange (ETDEWEB)

    Debenedictis, Erik [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rothganger, Fredrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Aimone, James Bradley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Marinella, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Evans, Brian Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Warrender, Christina E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mickel, Patrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    Final report for Cognitive Computing for Security LDRD 165613. It reports on the development of a hybrid general-purpose/neuromorphic computer architecture, with an emphasis on potential implementation with memristors.

  11. Introduction to Quantum Computation

    Science.gov (United States)

    Ekert, A.

    A computation is a physical process. It may be performed by a piece of electronics or on an abacus, or in your brain, but it is a process that takes place in nature and as such it is subject to the laws of physics. Quantum computers are machines that rely on characteristically quantum phenomena, such as quantum interference and quantum entanglement in order to perform computation. In this series of lectures I want to elaborate on the computational power of such machines.
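
    As a concrete illustration of the phenomena named here, the toy simulation below builds a two-qubit entangled (Bell) state from gate matrices in NumPy. It is an invented example that simulates the mathematics on a classical machine; it does not, of course, run on quantum hardware.

        import numpy as np

        # Gates as matrices; the two-qubit state |00> as a length-4 vector.
        H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
        I = np.eye(2)
        CNOT = np.array([[1, 0, 0, 0],
                         [0, 1, 0, 0],
                         [0, 0, 0, 1],
                         [0, 0, 1, 0]])

        state = np.zeros(4)
        state[0] = 1.0                                 # start in |00>
        state = CNOT @ (np.kron(H, I) @ state)         # entangle the qubits

        # Born rule: measurement probabilities are |amplitude|^2.
        print(np.abs(state) ** 2)                      # [0.5 0.  0.  0.5]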

  12. Computational physics an introduction

    CERN Document Server

    Vesely, Franz J

    1994-01-01

    Author Franz J. Vesely offers students an introductory text on computational physics, providing them with the important basic numerical/computational techniques. His unique text sets itself apart from others by focusing on specific problems of computational physics. The author also provides a selection of modern fields of research. Students will benefit from the appendixes which offer a short description of some properties of computing and machines and outline the technique of 'Fast Fourier Transformation.'

  13. Computably regular topological spaces

    OpenAIRE

    Weihrauch, Klaus

    2013-01-01

    This article continues the study of computable elementary topology started by the author and T. Grubba in 2009 and extends the author's 2010 study of axioms of computable separation. Several computable T3- and Tychonoff separation axioms are introduced and their logical relation is investigated. A number of implications between these axioms are proved and several implications are excluded by counter examples, however, many questions have not yet been answered. Known results on computable metr...

  14. Biomolecular computation for bionanotechnology

    CERN Document Server

    Liu, Jian-Qin

    2006-01-01

    Computers built with moleware? The drive toward non-silicon computing is underway, and this first-of-its-kind guide to molecular computation gives researchers a firm grasp of the technologies, biochemical details, and theoretical models at the cutting edge. It explores advances in molecular biology and nanotechnology and illuminates how the convergence of various technologies is propelling computational capacity beyond the limitations of traditional hardware technology and into the realm of moleware.

  15. Approximation and Computation

    CERN Document Server

    Gautschi, Walter; Rassias, Themistocles M

    2011-01-01

    Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research, including theoretic developments, new computational alg

  16. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  17. COMPUTER GAMES AND EDUCATION

    OpenAIRE

    Sukhov, Anton

    2015-01-01

    This paper is devoted to research on the educational resources and possibilities of modern computer games. The "internal" educational aspects of computer games include the educational mechanism (a separate or integrated "tutorial") and the representation of a real or even fantastic educational process within virtual worlds. The "external" dimension represents the educational opportunities of computer games for personal and professional development in different genres of computer games (various transport, so...

  18. Computational intelligence in optimization

    CERN Document Server

    Tenne, Yoel

    2010-01-01

    This volume presents a collection of recent studies covering the spectrum of computational intelligence applications with emphasis on their application to challenging real-world problems. Topics covered include: Intelligent agent-based algorithms, Hybrid intelligent systems, Cognitive and evolutionary robotics, Knowledge-Based Engineering, fuzzy sets and systems, Bioinformatics and Bioengineering, Computational finance and Computational economics, Data mining, Machine learning, and Expert systems. "Computational Intelligence in Optimization" is a comprehensive reference for researchers, prac

  19. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  20. Computer system identification

    OpenAIRE

    Lesjak, Borut

    2008-01-01

    The concept of computer system identity in computer science bears just as much importance as does the identity of an individual in a human society. Nevertheless, the identity of a computer system is incomparably harder to determine, because there is no standard system of identification we could use and, moreover, a computer system during its life-time is quite indefinite, since all of its regular and necessary hardware and software upgrades soon make it almost unrecognizable: after a number o...

  1. Space Spurred Computer Graphics

    Science.gov (United States)

    1983-01-01

    Dicomed Corporation was asked by NASA in the early 1970s to develop processing capabilities for recording images sent from Mars by Viking spacecraft. The company produced a film recorder which increased the intensity levels and the capability for color recording. This development led to a strong technology base resulting in sophisticated computer graphics equipment. Dicomed systems are used to record CAD (computer aided design) and CAM (computer aided manufacturing) equipment, to update maps and produce computer generated animation.

  2. Mobile collaborative cloudless computing

    OpenAIRE

    Cruz, Nuno Miguel Machado, 1978-

    2015-01-01

    Doctoral thesis, Informatics (Informatics Engineering), Universidade de Lisboa, Faculdade de Ciências, 2015. Although the computational power of mobile devices has been increasing, it is still not enough for some classes of applications. At present, these applications delegate the computing power burden to servers located on the Internet. This model assumes always-on Internet connectivity and implies a non-negligible latency. Cloud computing is an innovative computing paradigm wh...

  3. Designing with computational intelligence

    CERN Document Server

    Lopes, Heitor; Mourelle, Luiza

    2017-01-01

    This book discusses a number of real-world applications of computational intelligence approaches. Using various examples, it demonstrates that computational intelligence has become a consolidated methodology for automatically creating new competitive solutions to complex real-world problems. It also presents a concise and efficient synthesis of different systems using computationally intelligent techniques.

  4. Computational Social Creativity.

    Science.gov (United States)

    Saunders, Rob; Bown, Oliver

    2015-01-01

    This article reviews the development of computational models of creativity where social interactions are central. We refer to this area as computational social creativity. Its context is described, including the broader study of creativity, the computational modeling of other social phenomena, and computational models of individual creativity. Computational modeling has been applied to a number of areas of social creativity and has the potential to contribute to our understanding of creativity. A number of requirements for computational models of social creativity are common in artificial life and computational social science simulations. Three key themes are identified: (1) computational social creativity research has a critical role to play in understanding creativity as a social phenomenon and advancing computational creativity by making clear epistemological contributions in ways that would be challenging for other approaches; (2) the methodologies developed in artificial life and computational social science carry over directly to computational social creativity; and (3) the combination of computational social creativity with individual models of creativity presents significant opportunities and poses interesting challenges for the development of integrated models of creativity that have yet to be realized.

  5. On Understanding Computers.

    Science.gov (United States)

    Olds, Henry F., Jr.; And Others

    1983-01-01

    Three articles discuss the use of computers in education: (1) "References for a Broader Vision" (Henry F. Olds, Jr.); (2) "What Every Teacher Should Know About Computer Simulations" (David Grady); and (3) "The Computer as Palette and Model Builder" (Interview of Alan Kay). (CJ)

  6. Education for Computers

    Science.gov (United States)

    Heslep, Robert D.

    2012-01-01

    The computer engineers who refer to the education of computers do not have a definite idea of education and do not bother to justify the fuzzy ones to which they allude. Hence, they logically cannot specify the features a computer must have in order to be educable. This paper puts forth a non-standard, but not arbitrary, concept of education that…

  7. Computing environment logbook

    Science.gov (United States)

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality for finding one or more selected past events in that history and, further, enables an undo of the selected past events.
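
    A toy version of such a logbook is easy to sketch. Everything below (the class name, the undo_fn callbacks) is invented for illustration and is not the patented design.

        import time

        class Logbook:
            """Minimal event logbook with history search and undo."""

            def __init__(self):
                self.events = []          # (timestamp, description, undo_fn)

            def log(self, description, undo_fn=None):
                self.events.append((time.time(), description, undo_fn))

            def search(self, keyword):
                """Return past events whose description mentions the keyword."""
                return [e for e in self.events if keyword in e[1]]

            def undo(self, event):
                """Undo one selected past event, if it recorded an undo action."""
                if event[2] is not None:
                    event[2]()

        book = Logbook()
        created = {}
        book.log("create variable x", undo_fn=lambda: created.pop("x", None))
        created["x"] = 42
        for match in book.search("variable"):
            book.undo(match)
        print(created)                    # {} -- the create event was undone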

  8. People Shaping Educational Computing.

    Science.gov (United States)

    Blair, Marjorie; Lobello, Sharon

    1984-01-01

    Discusses contributions to educational computing of Seymour Papert, LOGO creator; Irwin Hoffman, first school-based computer education program developer; Dorothy Deringer, National Science Foundation's monitor and supporter of educational computing projects; Sherwin Steffin, educational software company vice-president; and Jessie Muse, National…

  9. Women and Computer Science.

    Science.gov (United States)

    Breene, L. Anne

    1992-01-01

    Discusses issues concerning women in computer science education, and in the workplace, and sex bias in the computer science curriculum. Concludes that computing environment has not improved for women over last 20 years. Warns that, although number of white males entering college is declining, need for scientists and engineers is not. (NB)

  10. Ethics and Computer Scientists.

    Science.gov (United States)

    Pulliam, Sylvia Clark

    The purpose of this study was to explore the perceptions that computer science educators have about computer ethics. The study focused on four areas: (1) the extent to which computer science educators believe that ethically inappropriate practices are taking place (both on campus and throughout society); (2) perceptions of such educators about…

  11. Deductive Computer Programming. Revision

    Science.gov (United States)

    1989-09-30

    Lecture Notes in Computer Science 354 ... "automata", in Temporal Logic in Specification, Lecture Notes in Computer Science 398, Springer-Verlag, 1989, pp. 124-164.
    [MP4] Z. Manna and A. Pnueli, ... Lecture Notes in Computer Science 372, Springer-Verlag, 1989, pp. 534-558.
    CONTRIBUTION TO BOOKS: [MP5] Z. Manna and A. Pnueli, "An exercise in the

  12. Computer-assisted instruction

    NARCIS (Netherlands)

    Voogt, J.; Fisser, P.; Wright, J.D.

    2015-01-01

    Since the early days of computer technology in education in the 1960s, it was claimed that computers can assist instructional practice and hence improve student learning. Since then computer technology has developed, and its potential for education has increased. In this article, we first discuss th

  13. Computers at the Crossroads.

    Science.gov (United States)

    Ediger, Marlow

    1988-01-01

    Discusses reasons for the lack of computer and software use in the classroom, especially on the elementary level. Highlights include deficiencies in available software, including lack of interaction and type of feedback; philosophies of computer use; the psychology of learning and computer use; and suggestions for developing quality software. (4…

  14. Advances in physiological computing

    CERN Document Server

    Fairclough, Stephen H

    2014-01-01

    This edited collection will provide an overview of the field of physiological computing, i.e. the use of physiological signals as input for computer control. It will cover a breadth of current research, from brain-computer interfaces to telemedicine.

  15. Computer Training at Harwell

    Science.gov (United States)

    Hull, John

    1969-01-01

    By using teletypewriters connected to the Harwell multi-access computing system, lecturers can easily demonstrate the operation of the computer in the classroom; this saves time and eliminates errors and staff can carry out exercises using the main computer. (EB)

  16. Mixing Computations and Proofs

    Directory of Open Access Journals (Sweden)

    Michael Beeson

    2016-01-01

    Full Text Available We examine the relationship between proof and computation in mathematics, especially in formalized mathematics. We compare the various approaches to proofs with a significant computational component, including (i) verifying the algorithms, (ii) verifying the results of the unverified algorithms, and (iii) trusting an external computation.
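
    Approach (ii) is easy to illustrate: run an unverified computation, then independently check its result. The sketch below (invented function names) verifies only that the returned factors multiply back to n, which is far cheaper than verifying the factoring algorithm itself.

        def unverified_factor(n):
            """Stand-in for any unverified external factoring tool."""
            factors, d = [], 2
            while d * d <= n:
                while n % d == 0:
                    factors.append(d)
                    n //= d
                d += 1
            if n > 1:
                factors.append(n)
            return factors

        n = 2310
        factors = unverified_factor(n)

        # The verification step is simple and independently checkable.
        product = 1
        for f in factors:
            product *= f
        assert product == n, "result check failed"
        print(n, "=", " * ".join(map(str, factors)))   # 2310 = 2 * 3 * 5 * 7 * 11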

  17. The science of computing - Parallel computation

    Science.gov (United States)

    Denning, P. J.

    1985-01-01

    Although parallel computation architectures have been known for computers since the 1920s, it was only in the 1970s that microelectronic component technologies advanced to the point where it became feasible to incorporate multiple processors in one machine. Concomitantly, the development of algorithms for parallel processing lagged due to hardware limitations. The speed of computing with solid-state chips is limited by gate switching delays. The physical limit implies that a 1 Gflop operational speed is the maximum for sequential processors. A computer recently introduced features a 'hypercube' architecture with 128 processors connected in networks at 5, 6 or 7 points per grid, depending on the design choice. Its computing speed rivals that of supercomputers, but at a fraction of the cost. The added speed with less hardware is due to parallel processing, which utilizes algorithms representing different parts of an equation that can be broken into simpler statements and processed simultaneously. Present-day, highly developed computer languages like FORTRAN, PASCAL, COBOL, etc., rely on sequential instructions. Thus, increased emphasis will now be directed at parallel processing algorithms to exploit the new architectures.
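
    The idea of breaking one computation into independent parts processed simultaneously can be sketched with Python's standard library (a hypothetical example, unrelated to the hypercube machine described above): each worker sums a separate slice of a series, and the partial results are combined at the end.

        import math
        from concurrent.futures import ProcessPoolExecutor

        def partial_sum(bounds):
            """One independent piece of the series sum(1/k^2)."""
            lo, hi = bounds
            return sum(1.0 / (k * k) for k in range(lo, hi))

        if __name__ == "__main__":
            n, workers = 1_000_000, 4
            step = n // workers
            chunks = [(i * step + 1, (i + 1) * step + 1) for i in range(workers)]
            with ProcessPoolExecutor(max_workers=workers) as pool:
                total = sum(pool.map(partial_sum, chunks))
            print(total, math.pi ** 2 / 6)   # both values agree to ~6 digits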

  18. Neural Computation and the Computational Theory of Cognition

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-01-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…

  19. Scalable distributed computing hierarchy: cloud, fog and dew computing

    OpenAIRE

    Skala, Karolj; Davidović, Davor; Afgan, Enis; Sović, Ivan; Šojat, Zorislav

    2015-01-01

    The paper considers a conceptual approach for organizing the vertical hierarchical links between the scalable distributed computing paradigms: Cloud Computing, Fog Computing and Dew Computing. In this paper, Dew Computing is described and recognized as a new structural layer in the existing distributed computing hierarchy, positioned as the ground level for the Cloud and Fog computing paradigms. Vertical, complementary, hier...

  20. Cloud Computing Bible

    CERN Document Server

    Sosinsky, Barrie

    2010-01-01

    The complete reference guide to the hot technology of cloud computing. Its potential for lowering IT costs makes cloud computing a major force for both IT vendors and users; it is expected to gain momentum rapidly with the launch of Office Web Apps later this year. Because cloud computing involves various technologies, protocols, platforms, and infrastructure elements, this comprehensive reference is just what you need if you'll be using or implementing cloud computing. Cloud computing offers significant cost savings by eliminating upfront expenses for hardware and software; its growing popularit

  1. Theory of computation

    CERN Document Server

    Tourlakis, George

    2012-01-01

    Learn the skills and acquire the intuition to assess the theoretical limitations of computer programming Offering an accessible approach to the topic, Theory of Computation focuses on the metatheory of computing and the theoretical boundaries between what various computational models can do and not do—from the most general model, the URM (Unbounded Register Machines), to the finite automaton. A wealth of programming-like examples and easy-to-follow explanations build the general theory gradually, which guides readers through the modeling and mathematical analysis of computational pheno

  2. Perspectives in Computation

    CERN Document Server

    Geroch, Robert

    2009-01-01

    Computation is the process of applying a procedure or algorithm to the solution of a mathematical problem. Mathematicians and physicists have been occupied for many decades pondering which problems can be solved by which procedures, and, for those that can be solved, how this can most efficiently be done. In recent years, quantum mechanics has augmented our understanding of the process of computation and of its limitations. Perspectives in Computation covers three broad topics: the computation process and its limitations, the search for computational efficiency, and the role of quantum mechani

  3. Rough-Granular Computing

    Institute of Scientific and Technical Information of China (English)

    Andrzej Skowron

    2006-01-01

    Solving complex problems by multi-agent systems in distributed environments requires new approximate reasoning methods based on new computing paradigms. One such recently emerging computing paradigm is Granular Computing (GC). We discuss the Rough-Granular Computing (RGC) approach to the modeling of computations in complex adaptive systems and multi-agent systems, as well as to approximate reasoning about the behavior of such systems. The RGC methods have been successfully applied for solving complex problems in areas such as identification of objects or behavioral patterns by autonomous systems, web mining, and sensor fusion.

  4. Essential numerical computer methods

    CERN Document Server

    Johnson, Michael L

    2010-01-01

    The use of computers and computational methods has become ubiquitous in biological and biomedical research. During the last 2 decades most basic algorithms have not changed, but what has is the huge increase in computer speed and ease of use, along with the corresponding orders of magnitude decrease in cost. A general perception exists that the only applications of computers and computer methods in biological and biomedical research are either basic statistical analysis or the searching of DNA sequence data bases. While these are important applications they only scratch the surface

  5. Computing meaning v.4

    CERN Document Server

    Bunt, Harry; Pulman, Stephen

    2013-01-01

    This book is a collection of papers by leading researchers in computational semantics. It presents a state-of-the-art overview of recent and current research in computational semantics, including descriptions of new methods for constructing and improving resources for semantic computation, such as WordNet, VerbNet, and semantically annotated corpora. It also presents new statistical methods in semantic computation, such as the application of distributional semantics in the compositional calculation of sentence meanings. Computing the meaning of sentences, texts, and spoken or texted dialogue i

  6. Computer algebra and operators

    Science.gov (United States)

    Fateman, Richard; Grossman, Robert

    1989-01-01

    The symbolic computation of operator expansions is discussed. Some of the capabilities that prove useful when performing computer algebra computations involving operators are considered. These capabilities may be broadly divided into three areas: the algebraic manipulation of expressions from the algebra generated by operators; the algebraic manipulation of the actions of the operators upon other mathematical objects; and the development of appropriate normal forms and simplification algorithms for operators and their actions. Brief descriptions are given of the computer algebra computations that arise when working with various operators and their actions.
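
    As a small illustration of the first capability (manipulating expressions from an algebra generated by operators), modern systems such as SymPy support noncommutative symbols; the snippet below is an assumed example, not taken from the article.

        from sympy import Symbol, expand

        # Noncommutative symbols model operators: A*B != B*A in general.
        A = Symbol("A", commutative=False)
        B = Symbol("B", commutative=False)

        # Expanding (A + B)**2 keeps the cross terms in order.
        print(expand((A + B) ** 2))    # A**2 + A*B + B*A + B**2

        # The commutator [A, B] = A*B - B*A is not simplified to zero.
        print(expand(A * B - B * A))   # A*B - B*A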

  7. Computer Security Handbook

    CERN Document Server

    Bosworth, Seymour; Whyne, Eric

    2012-01-01

    The classic and authoritative reference in the field of computer security, now completely updated and revised With the continued presence of large-scale computers; the proliferation of desktop, laptop, and handheld computers; and the vast international networks that interconnect them, the nature and extent of threats to computer security have grown enormously. Now in its fifth edition, Computer Security Handbook continues to provide authoritative guidance to identify and to eliminate these threats where possible, as well as to lessen any losses attributable to them. With seventy-seven chapter

  8. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  9. Replacing the computer mouse

    OpenAIRE

    Dernoncourt, Franck

    2014-01-01

    In a few months the computer mouse will be half-a-century-old. It is known to have many drawbacks, the main ones being: loss of productivity due to constant switching between keyboard and mouse, and health issues such as RSI. Like the keyboard, it is an unnatural human-computer interface. However the vast majority of computer users still use computer mice nowadays. In this article, we explore computer mouse alternatives. Our research shows that moving the mouse cursor can be done efficiently ...

  10. Analogue computing methods

    CERN Document Server

    Welbourne, D

    1965-01-01

    Analogue Computing Methods presents the field of analogue computation and simulation in a compact and convenient form, providing an outline of models and analogues that have been produced to solve physical problems for the engineer and how to use and program the electronic analogue computer. This book consists of six chapters. The first chapter provides an introduction to analogue computation and discusses certain mathematical techniques. The electronic equipment of an analogue computer is covered in Chapter 2, while its use to solve simple problems, including the method of scaling is elaborat

  11. Topology for computing

    CERN Document Server

    Zomorodian, Afra J

    2005-01-01

    The emerging field of computational topology utilizes theory from topology and the power of computing to solve problems in diverse fields. Recent applications include computer graphics, computer-aided design (CAD), and structural biology, all of which involve understanding the intrinsic shape of some real or abstract space. A primary goal of this book is to present basic concepts from topology and Morse theory to enable a non-specialist to grasp and participate in current research in computational topology. The author gives a self-contained presentation of the mathematical concepts from a comp

  12. Trust Based Pervasive Computing

    Institute of Scientific and Technical Information of China (English)

    LI Shiqun; Shane Balfe; ZHOU Jianying; CHEN Kefei

    2006-01-01

    A pervasive computing environment is a distributed and mobile space. Trust relationships must be established and ensured between devices and the systems in the pervasive computing environment. The trusted computing (TC) technology introduced by the Trusted Computing Group is a distributed-system-wide approach to the provision of integrity protection of resources. The TC's notion of trust and security can be described as conformed system behaviors of a platform environment such that the conformation can be attested to a remote challenger. In this paper the trust requirements in a pervasive/ubiquitous environment are analyzed. Then security schemes for pervasive computing are proposed using primitives offered by TC technology.

  13. Secure cloud computing

    CERN Document Server

    Jajodia, Sushil; Samarati, Pierangela; Singhal, Anoop; Swarup, Vipin; Wang, Cliff

    2014-01-01

    This book presents a range of cloud computing security challenges and promising solution paths. The first two chapters focus on practical considerations of cloud computing. In Chapter 1, Chandramouli, Iorga, and Chokani describe the evolution of cloud computing and the current state of practice, followed by the challenges of cryptographic key management in the cloud. In Chapter 2, Chen and Sion present a dollar cost model of cloud computing and explore the economic viability of cloud computing with and without security mechanisms involving cryptographic mechanisms. The next two chapters address…

  14. Cloud Computing Technologies

    Directory of Open Access Journals (Sweden)

    Sean Carlin

    2012-06-01

    This paper outlines the key characteristics that cloud computing technologies possess and illustrates the cloud computing stack containing the three essential services (SaaS, PaaS and IaaS) that have come to define the technology and its delivery model. The underlying virtualization technologies that make cloud computing possible are also identified and explained. The various challenges that face cloud computing technologies today are investigated and discussed. The future of cloud computing technologies along with its various applications and trends are also explored, giving a brief outlook of where and how the technology will progress into the future.

  15. Richard Feynman and computation

    Science.gov (United States)

    Hey, Tony

    1999-04-01

    The enormous contribution of Richard Feynman to modern physics is well known, both to teaching through his famous Feynman Lectures on Physics, and to research with his Feynman diagram approach to quantum field theory and his path integral formulation of quantum mechanics. Less well known perhaps is his long-standing interest in the physics of computation and this is the subject of this paper. Feynman lectured on computation at Caltech for most of the last decade of his life, first with John Hopfield and Carver Mead, and then with Gerry Sussman. The story of how these lectures came to be written up as the Feynman Lectures on Computation is briefly recounted. Feynman also discussed the fundamentals of computation with other legendary figures of the computer science and physics community such as Ed Fredkin, Rolf Landauer, Carver Mead, Marvin Minsky and John Wheeler. He was also instrumental in stimulating developments in both nanotechnology and quantum computing. During the 1980s Feynman revisited long-standing interests both in parallel computing with Geoffrey Fox and Danny Hillis, and in reversible computation and quantum computing with Charles Bennett, Norman Margolus, Tom Toffoli and Wojciech Zurek. This paper records Feynman's links with the computational community and includes some reminiscences about his involvement with the fundamentals of computing.

  16. ALMA correlator computer systems

    Science.gov (United States)

    Pisano, Jim; Amestica, Rodrigo; Perez, Jesus

    2004-09-01

    We present a design for the computer systems which control, configure, and monitor the Atacama Large Millimeter Array (ALMA) correlator and process its output. Two distinct computer systems implement this functionality: a rack-mounted PC controls and monitors the correlator, and a cluster of 17 PCs process the correlator output into raw spectral results. The correlator computer systems interface to other ALMA computers via gigabit Ethernet networks utilizing CORBA and raw socket connections. ALMA Common Software provides the software infrastructure for this distributed computer environment. The control computer interfaces to the correlator via multiple CAN busses and the data processing computer cluster interfaces to the correlator via sixteen dedicated high speed data ports. An independent array-wide hardware timing bus connects to the computer systems and the correlator hardware ensuring synchronous behavior and imposing hard deadlines on the control and data processor computers. An aggregate correlator output of 1 gigabyte per second with 16 millisecond periods and computational data rates of approximately 1 billion floating point operations per second define other hard deadlines for the data processing computer cluster.
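
    The quoted rates invite a quick sanity check. The sketch below (Python, ours; the record contains no code) recomputes the per-period data volume and the mean per-port rate, assuming the sixteen ports share the aggregate output evenly and that "gigabyte" means 10**9 bytes.

        # Back-of-envelope check of the correlator figures quoted above.
        # Assumptions of ours, not from the record: the 16 data ports share
        # the aggregate output evenly, and "gigabyte" means 10**9 bytes.
        AGGREGATE_RATE = 1e9   # bytes per second, from the abstract
        PERIOD = 16e-3         # seconds, from the abstract
        NUM_PORTS = 16         # dedicated high-speed data ports, from the abstract

        bytes_per_period = AGGREGATE_RATE * PERIOD
        per_port_rate = AGGREGATE_RATE / NUM_PORTS
        print(f"data per 16 ms period: {bytes_per_period / 1e6:.0f} MB")
        print(f"mean rate per port:    {per_port_rate / 1e6:.1f} MB/s")

    With these assumptions each 16 ms period carries 16 MB, and each port averages 62.5 MB/s.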

  17. Design of Computer Experiments

    DEFF Research Database (Denmark)

    Dehlendorff, Christian

    The main topic of this thesis is design and analysis of computer and simulation experiments and is dealt with in six papers and a summary report. Simulation and computer models have in recent years received increasingly more attention due to their increasing complexity and usability. Software packages make the development of rather complicated computer models using predefined building blocks possible. This implies that the range of phenomena that are analyzed by means of a computer model has expanded significantly. As the complexity grows so does the need for efficient experimental designs and analysis methods, since the complex computer models often are expensive to use in terms of computer time. The choice of performance parameter is an important part of the analysis of computer and simulation models and Paper A introduces a new statistic for waiting times in health care units. The statistic…
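
    For background on why such designs matter: when each simulator run is expensive, space-filling designs spread a small budget of runs evenly over the input space. The sketch below implements a minimal Latin hypercube sampler, one classic such design; the thesis summarized above is not claimed to use this exact construction.

        import numpy as np

        # Minimal Latin hypercube sampler: each dimension is cut into n_runs
        # strata and every stratum receives exactly one point.
        def latin_hypercube(n_runs, n_dims, seed=0):
            rng = np.random.default_rng(seed)
            samples = np.empty((n_runs, n_dims))
            for d in range(n_dims):
                strata = rng.permutation(n_runs)            # stratum index per run
                samples[:, d] = (strata + rng.random(n_runs)) / n_runs
            return samples

        print(latin_hypercube(5, 2))   # 5 runs over a 2-dimensional unit cube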

  18. Desktop Computing Integration Project

    Science.gov (United States)

    Tureman, Robert L., Jr.

    1992-01-01

    The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.

  19. Hyperswitch Communication Network Computer

    Science.gov (United States)

    Peterson, John C.; Chow, Edward T.; Priel, Moshe; Upchurch, Edwin T.

    1993-01-01

    Hyperswitch Communications Network (HCN) computer is prototype multiple-processor computer being developed. Incorporates improved version of hyperswitch communication network described in "Hyperswitch Network For Hypercube Computer" (NPO-16905). Designed to support high-level software and expansion of itself. HCN computer is message-passing, multiple-instruction/multiple-data computer offering significant advantages over older single-processor and bus-based multiple-processor computers, with respect to price/performance ratio, reliability, availability, and manufacturing. Design of HCN operating-system software provides flexible computing environment accommodating both parallel and distributed processing. Also achieves balance among following competing factors: performance in processing and communications, ease of use, and tolerance of (and recovery from) faults.

  20. Community Cloud Computing

    CERN Document Server

    Marinos, Alexandros

    2009-01-01

    Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges…

  1. Computational Biology and High Performance Computing 2000

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  2. Natural Computing in Computational Finance Volume 4

    CERN Document Server

    O’Neill, Michael; Maringer, Dietmar

    2012-01-01

    This book follows on from Natural Computing in Computational Finance, Volumes I, II and III. As in the previous volumes of this series, the book consists of a series of chapters, each of which was selected following a rigorous, peer-reviewed selection process. The chapters illustrate the application of a range of cutting-edge natural computing and agent-based methodologies in computational finance and economics. The applications explored include option model calibration, financial trend reversal detection, enhanced indexation, algorithmic trading, corporate payout determination, agent-based modeling of liquidity costs, and trade strategy adaptation. While describing cutting-edge applications, the chapters are written so that they are accessible to a wide audience. Hence, they should be of interest to academics, students and practitioners in the fields of computational finance and economics.

  3. Quantum analogue computing.

    Science.gov (United States)

    Kendon, Vivien M; Nemoto, Kae; Munro, William J

    2010-08-13

    We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.
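
    The cost argument can be made concrete with a line of arithmetic. The toy loop below (our illustration, not from the paper) contrasts the linear growth of a binary register with the exponential number of levels a single analogue quantity must resolve for the same precision.

        # Illustrative arithmetic only: a binary register grows linearly with
        # precision, while an analogue encoding must resolve twice as many
        # levels per extra bit, per the abstract's claim that one more bit of
        # precision doubles the size of the computer.
        for bits in (8, 16, 32, 64):
            print(f"{bits:2d} bits: {bits} two-state cells (binary) vs "
                  f"{2**bits:,} distinguishable levels (analogue)")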

  4. Computers and neurosurgery.

    Science.gov (United States)

    Shaikhouni, Ammar; Elder, J Bradley

    2012-11-01

    At the turn of the twentieth century, the only computational device used in neurosurgical procedures was the brain of the surgeon. Today, most neurosurgical procedures rely at least in part on the use of a computer to help perform surgeries accurately and safely. The techniques that revolutionized neurosurgery were mostly developed after the 1950s. Just before that era, the transistor was invented in the late 1940s, and the integrated circuit was invented in the late 1950s. During this time, the first automated, programmable computational machines were introduced. The rapid progress in the field of neurosurgery not only occurred hand in hand with the development of modern computers, but one also can state that modern neurosurgery would not exist without computers. The focus of this article is the impact modern computers have had on the practice of neurosurgery. Neuroimaging, neuronavigation, and neuromodulation are examples of tools in the armamentarium of the modern neurosurgeon that owe each step in their evolution to progress made in computer technology. Advances in computer technology central to innovations in these fields are highlighted, with particular attention to neuroimaging. Developments over the last 10 years in areas of sensors and robotics that promise to transform the practice of neurosurgery further are discussed. Potential impacts of advances in computers related to neurosurgery in developing countries and underserved regions are also discussed. As this article illustrates, the computer, with its underlying and related technologies, is central to advances in neurosurgery over the last half century.

  5. COMPUTER-ASSISTED ACCOUNTING

    Directory of Open Access Journals (Sweden)

    SORIN-CIPRIAN TEIUŞAN

    2009-01-01

    What is computer-assisted accounting? Where is the place and what is the role of the computer in the financial-accounting activity? What is the position and importance of the computer in the accountant's activity? All these are questions that require scientific research in order to find the answers. The paper approaches the issue of the support the computer grants to the accountant in organizing and managing the accounting activity. Starting from the notions of accounting and computer, the concept of computer-assisted accounting is introduced; it has a general character and refers to accounting performed with the help of the computer, or to using the computer to automate the procedures performed by the person doing the accounting; it is a concept used to define the computer applications of the accounting activity. The arguments for using the computer to assist accounting concern the informatization of accounting, the automation of financial-accounting activities, and the endowment of contemporary accounting with modern technology.

  6. Serious computer games in computer science education

    Directory of Open Access Journals (Sweden)

    Jože Rugelj

    2015-11-01

    The role and importance of serious computer games in contemporary educational practice is presented in this paper, as well as the theoretical fundamentals that justify their use in different forms of education. We present a project for designing and developing serious games that takes place within the curriculum for computer science teachers' education, as independent project work in teams. In this project work, students have to use their knowledge of didactics and computer science to develop games. The developed game is tested and evaluated in schools in the framework of the students' practical training. The results of the evaluation can help students improve their games and verify to what extent the specified learning goals have been achieved.

  7. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2006-11-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to…

  8. New computing systems and their impact on computational mechanics

    Science.gov (United States)

    Noor, Ahmed K.

    1989-01-01

    Recent advances in computer technology that are likely to impact computational mechanics are reviewed. The technical needs for computational mechanics technology are outlined. The major features of new and projected computing systems, including supersystems, parallel processing machines, special-purpose computing hardware, and small systems are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed, and a novel partitioning strategy is outlined for maximizing the degree of parallelism on multiprocessor computers with a shared memory.

  9. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2010-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways and without arcane encodings of data and algorithm); it is also uniform: new "hardware" is not needed to solve new problems; and (last but not least) it is Turing complete…

  10. CLOUD COMPUTING AND SECURITY

    Directory of Open Access Journals (Sweden)

    Asharani Shinde

    2015-10-01

    This document gives an insight into cloud computing, presenting an overview of its key features as well as a detailed study of how exactly cloud computing works. Cloud computing lets you access all your applications and documents from anywhere in the world, freeing you from the confines of the desktop and thus making it easier for group members in different locations to collaborate. Certainly cloud computing can bring about strategic, transformational and even revolutionary benefits fundamental to future enterprise computing, but it also offers immediate and pragmatic opportunities to improve efficiencies today, while cost-effectively and systematically setting the stage for strategic change. As this technology makes computing, sharing and networking easy and interesting, we should think about the security and privacy of information too. Thus the key points to be discussed are: what the cloud is, its key features, current applications, future status, and the security issues with their possible solutions.

  11. Computation with narrow CTCs

    CERN Document Server

    Say, A C Cem

    2011-01-01

    We examine some variants of computation with closed timelike curves (CTCs), where various restrictions are imposed on the memory of the computer, and the information carrying capacity and range of the CTC. We give full characterizations of the classes of languages recognized by polynomial time probabilistic and quantum computers that can send a single classical bit to their own past. Such narrow CTCs are demonstrated to add the power of limited nondeterminism to deterministic computers, and lead to exponential speedup in constant-space probabilistic and quantum computation. We show that, given a time machine with constant negative delay, one can implement CTC-based computations without the need to know about the runtime beforehand.
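
    The self-consistency requirement behind such models can be sketched in a few lines. The toy below (ours, not the paper's construction) treats a single classical bit on a Deutsch-style CTC: the bit's distribution must be a fixed point of the stochastic map the computation applies to it.

        import numpy as np

        # Toy illustration: a single classical bit sent to its own past must
        # carry a self-consistent distribution, i.e. a fixed point p of the
        # stochastic map M that the computation applies to it (M @ p = p).
        def ctc_fixed_point(M):
            vals, vecs = np.linalg.eig(M)
            k = np.argmin(abs(vals - 1.0))   # eigenvector for eigenvalue 1
            p = np.real(vecs[:, k])
            return p / p.sum()               # normalise to a distribution

        M = np.array([[0.7, 0.3],            # flip the bit with probability 0.3,
                      [0.3, 0.7]])           # keep it otherwise
        print(ctc_fixed_point(M))            # -> [0.5 0.5]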

  12. Programming in Biomolecular Computation

    DEFF Research Database (Denmark)

    Hartmann, Lars; Jones, Neil; Simonsen, Jakob Grue

    2010-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is Turing complete in a strong sense: a universal algorithm exists, that is able to execute any program, and is not asymptotically inefficient. A prototype model has been implemented (for now in silico on a conventional computer). This work opens new perspectives on just how computation may be specified at the biological level.
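
    Two of the claimed properties, computation driven by a small rule set and programs stored as ordinary data, can be illustrated with a toy rewriting interpreter. The rules below are our invention and far simpler than the paper's chemical-like reaction rules.

        # Toy illustration: the "program" is itself data (a list of rewrite
        # rules), and execution repeatedly applies the first matching rule.
        program = [("ab", "ba"), ("ba", "b")]

        def run(rules, tape, max_steps=100):
            for _ in range(max_steps):
                for lhs, rhs in rules:
                    if lhs in tape:
                        tape = tape.replace(lhs, rhs, 1)  # apply one rewrite
                        break
                else:
                    return tape                           # no rule applies: halt
            return tape

        print(run(program, "aabb"))   # -> "bb"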

  13. Sensor sentinel computing device

    Science.gov (United States)

    Damico, Joseph P.

    2016-08-02

    Technologies pertaining to authenticating data output by sensors in an industrial environment are described herein. A sensor sentinel computing device receives time-series data from a sensor by way of a wireline connection. The sensor sentinel computing device generates a validation signal that is a function of the time-series signal. The sensor sentinel computing device then transmits the validation signal to a programmable logic controller in the industrial environment.
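
    The record does not say how the validation signal is computed. One plausible realization, sketched below with hypothetical names and key handling, is a keyed HMAC over a window of samples, which the programmable logic controller can recompute to authenticate the data.

        import hmac, hashlib, struct

        # A minimal sketch of a "validation signal" over sensor time-series
        # data. Assumption on our part: an HMAC under a key shared with the
        # PLC; the patent record does not specify the actual function.
        SHARED_KEY = b"sentinel-demo-key"    # hypothetical key provisioning

        def validation_signal(samples):
            """Authenticate a window of float samples from the sensor feed."""
            payload = b"".join(struct.pack("!d", s) for s in samples)
            return hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()

        window = [20.1, 20.4, 19.9, 20.0]    # hypothetical sensor readings
        print(validation_signal(window))     # sent to the PLC with the data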

  14. Parallel computing works!

    CERN Document Server

    Fox, Geoffrey C; Messina, Guiseppe C

    2014-01-01

    A clear illustration of how parallel computers can be successfully applied to large-scale scientific computations. This book demonstrates how a variety of applications in physics, biology, mathematics and other sciences were implemented on real parallel computers to produce new scientific results. It investigates issues of fine-grained parallelism relevant for future supercomputers with particular emphasis on hypercube architecture. The authors describe how they used an experimental approach to configure different massively parallel machines, design and implement basic system software, and develop…

  15. Electronics and computer acronyms

    CERN Document Server

    Brown, Phil

    1988-01-01

    Electronics and Computer Acronyms presents a list of almost 2,500 acronyms related to electronics and computers. The material for this book is drawn from a number of subject areas, including electrical, electronics, computers, telecommunications, fiber optics, microcomputers/microprocessors, audio, video, and information technology. The acronyms also encompass avionics, military, data processing, instrumentation, units, measurement, standards, services, organizations, associations, and companies. This dictionary offers a comprehensive and broad view of electronics and all that is associated with…

  16. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol

  17. Discrete computational structures

    CERN Document Server

    Korfhage, Robert R

    1974-01-01

    Discrete Computational Structures describes discrete mathematical concepts that are important to computing, covering necessary mathematical fundamentals, computer representation of sets, graph theory, storage minimization, and bandwidth. The book also explains conceptual framework (Gorn trees, searching, subroutines) and directed graphs (flowcharts, critical paths, information network). The text discusses algebra, particularly as it applies to computing, and concentrates on semigroups, groups, lattices, and propositional calculus, including a new tabular method of Boolean function minimization. The text emphasizes…

  18. Computable de Finetti measures

    CERN Document Server

    Freer, Cameron E

    2009-01-01

    We prove a uniformly computable version of de Finetti's theorem on exchangeable sequences of real random variables. As a consequence, exchangeable stochastic processes in probabilistic functional programming languages can be automatically rewritten as procedures that do not modify non-local state. Along the way, we prove that a distribution on the unit interval is computable if and only if its moments are uniformly computable.

  19. Computational movement analysis

    CERN Document Server

    Laube, Patrick

    2014-01-01

    This SpringerBrief discusses the characteristics of spatiotemporal movement data, including uncertainty and scale. It investigates three core aspects of Computational Movement Analysis: Conceptual modeling of movement and movement spaces, spatiotemporal analysis methods aiming at a better understanding of movement processes (with a focus on data mining for movement patterns), and using decentralized spatial computing methods in movement analysis. The author presents Computational Movement Analysis as an interdisciplinary umbrella for analyzing movement processes with methods from a range of fields…

  20. Computer science I essentials

    CERN Document Server

    Raus, Randall

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Computer Science I includes fundamental computer concepts, number representations, Boolean algebra, switching circuits, and computer architecture.

  1. Intelligent distributed computing

    CERN Document Server

    Thampi, Sabu

    2015-01-01

    This book contains a selection of refereed and revised papers of the Intelligent Distributed Computing Track originally presented at the third International Symposium on Intelligent Informatics (ISI-2014), September 24-27, 2014, Delhi, India.  The papers selected for this Track cover several Distributed Computing and related topics including Peer-to-Peer Networks, Cloud Computing, Mobile Clouds, Wireless Sensor Networks, and their applications.

  2. Introduction to reversible computing

    CERN Document Server

    Perumalla, Kalyan S

    2013-01-01

    Few books comprehensively cover the software and programming aspects of reversible computing. Filling this gap, Introduction to Reversible Computing offers an expanded view of the field that includes the traditional energy-motivated hardware viewpoint as well as the emerging application-motivated software approach. Collecting scattered knowledge into one coherent account, the book provides a compendium of both classical and recently developed results on reversible computing. It explores up-and-coming theories, techniques, and tools for the application of reversible computing…

  3. Computer and Applied Ethics

    OpenAIRE

    越智, 貢

    2014-01-01

    With this essay I treat some problems raised by the new developments in science and technology, that is, those of Computer Ethics, to show how and how far Applied Ethics differs from traditional ethics. I take up the backgrounds on which Computer Ethics rests, particularly the historical conditions of morality. Differences of conditions in time and space explain how Computer Ethics and Applied Ethics are not any traditional ethics in concrete cases. But I also investigate the normative rea...

  4. Research in computer science

    Science.gov (United States)

    Ortega, J. M.

    1986-01-01

    Various graduate research activities in the field of computer science are reported. Among the topics discussed are: (1) failure probabilities in multi-version software; (2) Gaussian Elimination on parallel computers; (3) three dimensional Poisson solvers on parallel/vector computers; (4) automated task decomposition for multiple robot arms; (5) multi-color incomplete cholesky conjugate gradient methods on the Cyber 205; and (6) parallel implementation of iterative methods for solving linear equations.

  5. Factors Affecting Computer Anxiety in High School Computer Science Students.

    Science.gov (United States)

    Hayek, Linda M.; Stephens, Larry

    1989-01-01

    Examines factors related to computer anxiety measured by the Computer Anxiety Index (CAIN). Achievement in two programing courses was inversely related to computer anxiety. Students who had a home computer and had computer experience before high school had lower computer anxiety than those who had not. Lists 14 references. (YP)

  6. Cloud Computing: An Overview

    Science.gov (United States)

    Qian, Ling; Luo, Zhiguo; Du, Yujian; Guo, Leitao

    In order to support the maximum number of users and elastic services with the minimum resources, Internet service providers invented cloud computing. Within a few years, emerging cloud computing has become the hottest technology. From the publication of the core papers by Google since 2003, to the commercialization of Amazon EC2 in 2006, and to the service offering of AT&T Synaptic Hosting, cloud computing has evolved from internal IT systems to a public service, from a cost-saving tool to a revenue generator, and from ISPs to telecoms. This paper introduces the concept, history, pros and cons of cloud computing, as well as the value chain and standardization efforts.

  7. Computer Games and Art

    Directory of Open Access Journals (Sweden)

    Anton Sukhov

    2015-10-01

    This article is devoted to the search for relevant sources (primary and secondary) and the characteristics of computer games that allow them to be included in the field of art (such as the creation of artistic games, computer graphics, active interaction with other forms of art, signs of a spiritual aesthetic act, the distinct temporality of computer games, "aesthetic illusion", and interactivity). In general, modern computer games can be attributed to commercial art and popular culture (blockbuster games) and to elite forms of contemporary media art (author's games, visionary games).

  8. Computational approaches to vision

    Science.gov (United States)

    Barrow, H. G.; Tenenbaum, J. M.

    1986-01-01

    Vision is examined in terms of a computational process, and the competence, structure, and control of computer vision systems are analyzed. Theoretical and experimental data on the formation of a computer vision system are discussed. Consideration is given to early vision, the recovery of intrinsic surface characteristics, higher levels of interpretation, and system integration and control. A computational visual processing model is proposed and its architecture and operation are described. Examples of state-of-the-art vision systems, which include some of the levels of representation and processing mechanisms, are presented.

  9. Modeling Trusted Computing

    Institute of Scientific and Technical Information of China (English)

    CHEN Shuyi; WEN Yingyou; ZHAO Hong

    2006-01-01

    In this paper, a formal approach based on predicate logic is proposed for representing and reasoning about trusted computing models. Predicates are defined to represent the characteristics of the objects and the relationships among these objects in a trusted system according to trusted computing specifications. Inference rules for the trust relation are given as well. With the semantics proposed, some trusted computing models are formalized and verified, which shows that predicate calculus provides a general and effective method for modeling and reasoning about trusted computing systems.
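
    The flavor of such predicate-based reasoning can be shown with a toy forward-chaining loop. The predicate and rule names below are ours, not the paper's; the single rule propagates trust along attestation edges.

        # Toy forward chaining over trust predicates. Predicate names are our
        # invention; the rule reads: if X is trusted and X attests Y, then Y
        # is trusted.
        facts = {("trusted", "TPM"),
                 ("attests", "TPM", "bootloader"),
                 ("attests", "bootloader", "kernel")}

        def derive(facts):
            new = set()
            for f in facts:
                if f[0] == "attests" and ("trusted", f[1]) in facts:
                    new.add(("trusted", f[2]))
            return new - facts

        while True:
            step = derive(facts)
            if not step:
                break          # fixed point: nothing new can be inferred
            facts |= step

        print(sorted(f for f in facts if f[0] == "trusted"))
        # [('trusted', 'TPM'), ('trusted', 'bootloader'), ('trusted', 'kernel')]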

  10. Discrete and computational geometry

    CERN Document Server

    Devadoss, Satyan L

    2011-01-01

    Discrete geometry is a relatively new development in pure mathematics, while computational geometry is an emerging area in applications-driven computer science. Their intermingling has yielded exciting advances in recent years, yet what has been lacking until now is an undergraduate textbook that bridges the gap between the two. Discrete and Computational Geometry offers a comprehensive yet accessible introduction to this cutting-edge frontier of mathematics and computer science. This book covers traditional topics such as convex hulls, triangulations, and Voronoi diagrams, as well as…

  11. Theory and Computation

    Data.gov (United States)

    Federal Laboratory Consortium — Flexible computational infrastructure, software tools and theoretical consultation are provided to support modeling and understanding of the structure and properties...

  12. Numbers and computers

    CERN Document Server

    Kneusel, Ronald T

    2015-01-01

    This is a book about numbers and how those numbers are represented in and operated on by computers. It is crucial that developers understand this area because the numerical operations allowed by computers, and the limitations of those operations, especially in the area of floating point math, affect virtually everything people try to do with computers. This book aims to fill this gap by exploring, in sufficient but not overwhelming detail, just what it is that computers do with numbers. Divided into two parts, the first deals with standard representations of integers and floating point numbers…

  13. Frontiers in Computer Education

    CERN Document Server

    Zhu, Egui; 2011 International Conference on Frontiers in Computer Education (ICFCE 2011)

    2012-01-01

    This book is the proceedings of the 2011 International Conference on Frontiers in Computer Education (ICFCE 2011) in Sanya, China, December 1-2, 2011. The contributions can be useful for researchers, software engineers, and programmers, all interested in promoting the development of computing and education. Topics covered are computing and communication technology, network management, wireless networks, telecommunication, signal and image processing, machine learning, educational management, educational psychology, educational systems, education engineering, education technology and training. The emphasis is on methods and calculi for computer science and education technology development, verification and verification tools support, experiences from doing developments, and the associated theoretical problems.

  14. Computer assisted audit techniques

    Directory of Open Access Journals (Sweden)

    Dražen Danić

    2008-12-01

    The purpose of this work is to point to the possibilities of more efficient auditing. In an environment of ever more intensive use of computer technology, the aims and scope of auditing do not change when the audit is done in a computerized information environment. Computer-assisted audit techniques (CAATs) can improve the efficiency and productivity of audit procedures. In a computerized information system, CAATs are the ways in which an auditor can use the computer to gather, or to help gather, audit evidence. There are several reasons why auditors apply computer-assisted techniques; most often, they do so to improve audit efficiency when the data volume is large. Whether, and to what degree, auditors apply such techniques depends on several factors: the auditors' computer knowledge, professional skill and experience, the availability of computer equipment, the adequacy of computer support, the infeasibility of manual tests, efficiency, and time limits. Through several examples from practice, we show the possibilities of ACL as one of the CAAT tools.

  15. Handheld-computers

    NARCIS (Netherlands)

    Ramaekers, P.; Huiskes, J.

    1994-01-01

    The Proefstation voor de Varkenshouderij (Research Station for Pig Husbandry) is investigating the added value of handheld computers for recording diseases and treatments, compared with written records.

  16. Annual review of computer science

    Energy Technology Data Exchange (ETDEWEB)

    Traub, J.F. (Columbia Univ., New York, NY (USA)); Grosz, B.J. (Harvard Univ., Cambridge, MA (USA)); Lampson, B.W. (Digital Equipment Corp. (US)); Nilsson, N.J. (Stanford Univ., CA (USA))

    1988-01-01

    This book contains the annual review of computer science. Topics covered include: Database security, parallel algorithmic techniques for combinatorial computation, algebraic complexity theory, computer applications in manufacturing, and computational geometry.

  17. Computational matter: evolving computational solutions in materials

    NARCIS (Netherlands)

    Miller, Julian F.; Broersma, Hajo; Silva, Sara

    2015-01-01

    Natural Evolution has been exploiting the physical properties of matter since life first appeared on earth. Evolution-in-materio (EIM) attempts to program matter so that computational problems can be solved. The beauty of this approach is that artificial evolution may be able to utilize unknown phys…

  18. Educational Computer Utilization and Computer Communications.

    Science.gov (United States)

    Singh, Jai P.; Morgan, Robert P.

    As part of an analysis of educational needs and telecommunications requirements for future educational satellite systems, three studies were carried out. 1) The role of the computer in education was examined and both current status and future requirements were analyzed. Trade-offs between remote time sharing and remote batch process were explored…

  19. Asynchronous Multiparty Computation

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Geisler, Martin; Krøigaard, Mikkel

    2009-01-01

    We propose an asynchronous protocol for general multiparty computation. The protocol has perfect security and communication complexity  where n is the number of parties, |C| is the size of the arithmetic circuit being computed, and k is the size of elements in the underlying field. The protocol g...

  20. Quantum Knitting Computer

    OpenAIRE

    Fujii, Toshiyuki; Matsuo, Shigemasa; Hatakenaka, Noriyuki

    2009-01-01

    We propose a fluxon-controlled quantum computer incorporated with three-qubit quantum error correction using special gate operations, i.e., joint-phase and SWAP gate operations, inherent in capacitively coupled superconducting flux qubits. The proposed quantum computer acts exactly like a knitting machine at home.

  1. Programming the social computer.

    Science.gov (United States)

    Robertson, David; Giunchiglia, Fausto

    2013-03-28

    The aim of 'programming the global computer' was identified by Milner and others as one of the grand challenges of computing research. At the time this phrase was coined, it was natural to assume that this objective might be achieved primarily through extending programming and specification languages. The Internet, however, has brought with it a different style of computation that (although harnessing variants of traditional programming languages) operates in a style different to those with which we are familiar. The 'computer' on which we are running these computations is a social computer in the sense that many of the elementary functions of the computations it runs are performed by humans, and successful execution of a program often depends on properties of the human society over which the program operates. These sorts of programs are not programmed in a traditional way and may have to be understood in a way that is different from the traditional view of programming. This shift in perspective raises new challenges for the science of the Web and for computing in general.

  2. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem conspicu...

  3. Simulation of quantum computers

    NARCIS (Netherlands)

    De Raedt, H; Michielsen, K; Hams, AH; Miyashita, S; Saito, K; Landau, DP; Lewis, SP; Schuttler, HB

    2001-01-01

    We describe a simulation approach to study the functioning of Quantum Computer hardware. The latter is modeled by a collection of interacting spin-1/2 objects. The time evolution of this spin system maps one-to-one to a quantum program carried out by the Quantum Computer. Our simulation software con…

  4. Computer Aided Mathematics

    DEFF Research Database (Denmark)

    Sinclair, Robert

    1998-01-01

    Course notes of a PhD course held in 1998. The central idea is to introduce students to computational mathematics using object oriented programming in C++.

  5. The Computing World

    Science.gov (United States)

    1992-04-01

    …before Zuse would finish the machine. The British ended up receiving a smuggled replica of the German message-scrambling device. Alan Turing applied… …general-purpose computer, followed closely by Alan Turing and his programmable digital computer. These pioneers thus launched the modern era of…

  6. Computer Processed Evaluation.

    Science.gov (United States)

    Griswold, George H.; Kapp, George H.

    A student testing system was developed consisting of computer generated and scored equivalent but unique repeatable tests based on performance objectives for undergraduate chemistry classes. The evaluation part of the computer system, made up of four separate programs written in FORTRAN IV, generates tests containing varying numbers of multiple…

  7. Quantum Analog Computing

    Science.gov (United States)

    Zak, M.

    1998-01-01

    Quantum analog computing is based upon similarity between the mathematical formalism of quantum mechanics and the phenomena to be computed. It exploits a dynamical convergence of several competing phenomena to an attractor which can represent an extremum of a function, an image, a solution to a system of ODEs, or a stochastic process.
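
    The attractor idea admits a one-function illustration. In the sketch below (our example, not from the paper), a gradient-flow dynamical system x' = -f'(x) relaxes to its attractor, which sits at the extremum (here the minimum) of f.

        # Toy version of the idea above: let a dynamical system relax to its
        # attractor, which encodes the extremum of a function. The function
        # f(x) = (x - 2)**2 + 1 and the step size are our choices.
        def f_prime(x):
            return 2 * (x - 2)       # derivative of f

        x, dt = 10.0, 0.1
        for _ in range(200):         # explicit Euler integration of x' = -f'(x)
            x -= dt * f_prime(x)

        print(f"attractor reached at x = {x:.4f}")   # close to the minimiser x = 2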

  8. Teaching Using Computer Games

    Science.gov (United States)

    Miller, Lee Dee; Shell, Duane; Khandaker, Nobel; Soh, Leen-Kiat

    2011-01-01

    Computer games have long been used for teaching. Current reviews lack categorization and analysis using learning models which would help instructors assess the usefulness of computer games. We divide the use of games into two classes: game playing and game development. We discuss the Input-Process-Outcome (IPO) model for the learning process when…

  9. Exercises in Computational Chemistry

    DEFF Research Database (Denmark)

    Spanget-Larsen, Jens

    2016-01-01

    A selection of HyperChem© PC-exercises in computational chemistry. Answers to most questions are appended (Roskilde University 2014-16).

  10. Computational chemistry at Janssen.

    Science.gov (United States)

    van Vlijmen, Herman; Desjarlais, Renee L; Mirzadegan, Tara

    2016-12-19

    Computer-aided drug discovery activities at Janssen are carried out by scientists in the Computational Chemistry group of the Discovery Sciences organization. This perspective gives an overview of the organizational and operational structure, the science, internal and external collaborations, and the impact of the group on Drug Discovery at Janssen.

  11. Computers, Networks and Education.

    Science.gov (United States)

    Kay, Alan C.

    1991-01-01

    Discussed is how globally networked, easy-to-use computers can enhance learning only within an educational environment that encourages students to question "facts" and seek challenges. The strengths and weaknesses of computers used as amplifiers for learning are described. (KR)

  12. Computer Anxiety and Instruction.

    Science.gov (United States)

    Baumgarte, Roger

    While the computer is commonly viewed as a tool for simplifying and enriching lives, many individuals react to this technology with feelings of anxiety, paranoia, and alienation. These reactions may have potentially serious career and educational consequences. Fear of computers reflects a generalized fear of current technology and is most…

  13. Computer-assisted Crystallization.

    Science.gov (United States)

    Semeister, Joseph J., Jr.; Dowden, Edward

    1989-01-01

    To avoid the tedious task of recording temperature by hand, a computer was used to calculate the heat of crystallization for the compound sodium thiosulfate. Described are the computer-interfacing procedures. Provides pictures of laboratory equipment and typical graphs from experiments. (YP)
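
    A sketch of the calculation such an interfaced computer might perform, assuming simple water calorimetry (q = m * c * deltaT); the sample readings and masses below are hypothetical, not taken from the article.

        # Heat absorbed by a water bath from logged temperatures, assuming
        # q = m * c * deltaT. All numbers here are hypothetical placeholders.
        WATER_MASS_G = 100.0      # mass of water in the calorimeter, grams
        C_WATER = 4.184           # specific heat of water, J / (g * K)

        logged_temps_C = [21.0, 23.5, 26.2, 27.9, 28.4]   # hypothetical log
        dT = logged_temps_C[-1] - logged_temps_C[0]

        q_joules = WATER_MASS_G * C_WATER * dT   # heat released to the bath
        print(f"heat absorbed by the bath: {q_joules:.0f} J")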

  14. Computer Aided Lecturing.

    Science.gov (United States)

    Van Meter, Donald E.

    1994-01-01

    Surveyed students taking a natural resource conservation course to determine the effects of computer software that provides tools for creating and managing visual presentations to students. Results indicated that 94% of the respondents believed computer-aided lectures helped them and recommended their continued use; note taking was more effective,…

  15. The Computational Materials Repository

    DEFF Research Database (Denmark)

    Landis, David D.; Hummelshøj, Jens S.; Nestorov, Svetlozar

    2012-01-01

    The possibilities for designing new materials based on quantum physics calculations are rapidly growing, but these design efforts lead to a significant increase in the amount of computational data created. The Computational Materials Repository (CMR) addresses this data challenge and provides a software infrastructure that supports the collection, storage, retrieval, analysis, and sharing of data produced by many electronic-structure simulators.

  16. Computations in Plasma Physics.

    Science.gov (United States)

    Cohen, Bruce I.; Killeen, John

    1983-01-01

    Discusses contributions of computers to research in magnetic and inertial-confinement fusion, charged-particle-beam propogation, and space sciences. Considers use in design/control of laboratory and spacecraft experiments and in data acquisition; and reviews major plasma computational methods and some of the important physics problems they…

  17. Text analysis and computers

    OpenAIRE

    1995-01-01

    Content: Erhard Mergenthaler: Computer-assisted content analysis (3-32); Udo Kelle: Computer-aided qualitative data analysis: an overview (33-63); Christian Mair: Machine-readable text corpora and the linguistic description of languages (64-75); Jürgen Krause: Principles of content analysis for information retrieval systems (76-99); Conference Abstracts (100-131).

  18. Computers in construction

    DEFF Research Database (Denmark)

    Howard, Rob

    The evolution of technology, particularly computing in building, learning from the past in order to anticipate what may happen in the future.

  19. Computational chemistry at Janssen

    Science.gov (United States)

    van Vlijmen, Herman; Desjarlais, Renee L.; Mirzadegan, Tara

    2016-12-01

    Computer-aided drug discovery activities at Janssen are carried out by scientists in the Computational Chemistry group of the Discovery Sciences organization. This perspective gives an overview of the organizational and operational structure, the science, internal and external collaborations, and the impact of the group on Drug Discovery at Janssen.

  20. Computer Virus Protection

    Science.gov (United States)

    Rajala, Judith B.

    2004-01-01

    A computer virus is a program--a piece of executable code--that has the unique ability to replicate. Like biological viruses, computer viruses can spread quickly and are often difficult to eradicate. They can attach themselves to just about any type of file, and are spread by replicating and being sent from one individual to another. Simply having…

  1. Computational physics: a perspective.

    Science.gov (United States)

    Stoneham, A M

    2002-06-15

    Computing comprises three distinct strands: hardware, software and the ways they are used in real or imagined worlds. Its use in research is more than writing or running code. Having something significant to compute and deploying judgement in what is attempted and achieved are especially challenging. In science or engineering, one must define a central problem in computable form, run such software as is appropriate and, last but by no means least, convince others that the results are both valid and useful. These several strands are highly interdependent. A major scientific development can transform disparate aspects of information and computer technologies. Computers affect the way we do science, as well as changing our personal worlds. Access to information is being transformed, with consequences beyond research or even science. Creativity in research is usually considered uniquely human, with inspiration a central factor. Scientific and technological needs are major forces in innovation, and these include hardware and software opportunities. One can try to define the scientific needs for established technologies (atomic energy, the early semiconductor industry), for rapidly developing technologies (advanced materials, microelectronics) and for emerging technologies (nanotechnology, novel information technologies). Did these needs define new computing, or was science diverted into applications of then-available codes? Regarding credibility, why is it that engineers accept computer realizations when designing engineered structures, whereas predictive modelling of materials has yet to achieve industrial confidence outside very special cases? The tensions between computing and traditional science are complex, unpredictable and potentially powerful.

  2. Learning with Ubiquitous Computing

    Science.gov (United States)

    Rosenheck, Louisa

    2008-01-01

    If ubiquitous computing becomes a reality and is widely adopted, it will inevitably have an impact on education. This article reviews the background of ubiquitous computing and current research projects done involving educational "ubicomp." Finally it explores how ubicomp may and may not change education in both formal and informal settings and…

  3. Logic via Computer Programming.

    Science.gov (United States)

    Wieschenberg, Agnes A.

    This paper poses the question "How do we teach logical thinking and sophisticated mathematics to unsophisticated college students?" One answer among many is through the writing of computer programs. The writing of computer algorithms is mathematical problem solving and logic in disguise, and it may attract students who would otherwise stop…

  4. Computer Use Exposed

    NARCIS (Netherlands)

    J.M. Richter (Janneke)

    2009-01-01

    Ever since the introduction of the personal computer, our daily lives have been influenced more and more by computers. A day in the life of a PhD student illustrates this: "At the breakfast table, I check my e-mail to see if the meeting later that day has been confirmed, and I check the time…

  5. Preventing Computer Glitches

    Science.gov (United States)

    Goldsborough, Reid

    2009-01-01

    It has been said that a computer lets a person make more mistakes faster than any other invention in human history, with the possible exceptions of handguns and tequila. Computers also make mistakes on their own, whether they're glitches, conflicts, bugs, crashes, or failures. Avoiding glitches is considerably less frustrating than trying to fix…

  6. Ubiquitous Human Computing

    OpenAIRE

    Zittrain, Jonathan L.

    2008-01-01

    Ubiquitous computing means network connectivity everywhere, linking devices and systems as small as a thumb tack and as large as a worldwide product distribution chain. What could happen when people are so readily networked? This short essay explores issues arising from two possible emerging models of ubiquitous human computing: fungible networked brainpower and collective personal vital sign monitoring.

  7. Theory of computational complexity

    CERN Document Server

    Du, Ding-Zhu

    2011-01-01

    DING-ZHU DU, PhD, is a professor in the Department of Computer Science at the University of Minnesota. KER-I KO, PhD, is a professor in the Department of Computer Science at the State University of New York at Stony Brook.

  8. Fostering Computational Thinking

    CERN Document Server

    Caballero, Marcos D; Schatz, Michael F

    2011-01-01

    Students taking introductory physics are rarely exposed to computational modeling. In a one-semester large lecture introductory calculus-based mechanics course at Georgia Tech, students learned to solve physics problems using the VPython programming environment. During the term 1357 students in this course solved a suite of fourteen computational modeling homework questions delivered using an online commercial course management system. Their proficiency with computational modeling was evaluated in a proctored environment using a novel central force problem. The majority of students (60.4%) successfully completed the evaluation. Analysis of erroneous student-submitted programs indicated that a small set of student errors explained why most programs failed. We discuss the design and implementation of the computational modeling homework and evaluation, the results from the evaluation and the implications for instruction in computational modeling in introductory STEM courses.
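
    The kind of computational modeling homework described here is easy to illustrate. The following is a minimal plain-Python analogue of a central force problem, assuming an inverse-square force and an Euler-Cromer integrator; the constants, step size and initial conditions are illustrative choices, not details of the Georgia Tech assignment (which used VPython):

        # Minimal central-force orbit integrator (Euler-Cromer scheme).
        # All constants are illustrative assumptions, not course data.
        GM = 1.0            # gravitational parameter, arbitrary units
        dt = 1e-3           # time step
        x, y = 1.0, 0.0     # initial position
        vx, vy = 0.0, 1.0   # initial velocity (circular orbit for these values)

        for step in range(10000):
            r3 = (x * x + y * y) ** 1.5
            ax, ay = -GM * x / r3, -GM * y / r3   # inverse-square central force
            vx += ax * dt                         # update velocity first...
            vy += ay * dt
            x += vx * dt                          # ...then position (Euler-Cromer)
            y += vy * dt

        print("final position: (%.3f, %.3f)" % (x, y))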

  9. The Computer Festival

    Institute of Scientific and Technical Information of China (English)

    1998-01-01

    The Beijing neighborhood of Zhongguancun is considered China’s Silicon Valley. After ten years of rapid development, it has carved out a dominant position for itself in computer markets, technology and labor force. Even on an average day, the famous "Computer Street" attracts a large number of visitors and consumers, but at a recent computer fair, the crowds were even larger. The purpose of the festival was to encourage computer use in homes and offices, to further promote the development of high-tech production and to keep pushing the modernization of information in China. The once-a-year computer festival will probably become a new custom in Chinese people’s lives.

  10. Parallelism in matrix computations

    CERN Document Server

    Gallopoulos, Efstratios; Sameh, Ahmed H

    2016-01-01

    This book is primarily intended as a research monograph that could also be used in graduate courses for the design of parallel algorithms in matrix computations. It assumes general but not extensive knowledge of numerical linear algebra, parallel architectures, and parallel programming paradigms. The book consists of four parts: (I) Basics; (II) Dense and Special Matrix Computations; (III) Sparse Matrix Computations; and (IV) Matrix functions and characteristics. Part I deals with parallel programming paradigms and fundamental kernels, including reordering schemes for sparse matrices. Part II is devoted to dense matrix computations such as parallel algorithms for solving linear systems, linear least squares, the symmetric algebraic eigenvalue problem, and the singular-value decomposition. It also deals with the development of parallel algorithms for special linear systems such as banded, Vandermonde, Toeplitz, and block Toeplitz systems. Part III addresses sparse matrix computations: (a) the development of pa...

  11. Computational invariant theory

    CERN Document Server

    Derksen, Harm

    2015-01-01

    This book is about the computational aspects of invariant theory. Of central interest is the question how the invariant ring of a given group action can be calculated. Algorithms for this purpose form the main pillars around which the book is built. There are two introductory chapters, one on Gröbner basis methods and one on the basic concepts of invariant theory, which prepare the ground for the algorithms. Then algorithms for computing invariants of finite and reductive groups are discussed. Particular emphasis lies on interrelations between structural properties of invariant rings and computational methods. Finally, the book contains a chapter on applications of invariant theory, covering fields as disparate as graph theory, coding theory, dynamical systems, and computer vision. The book is intended for postgraduate students as well as researchers in geometry, computer algebra, and, of course, invariant theory. The text is enriched with numerous explicit examples which illustrate the theory and should be ...

  12. Indirection and computer security.

    Energy Technology Data Exchange (ETDEWEB)

    Berg, Michael J.

    2011-09-01

    The discipline of computer science is built on indirection. David Wheeler famously said, 'All problems in computer science can be solved by another layer of indirection. But that usually will create another problem'. We propose that every computer security vulnerability is yet another problem created by the indirections in system designs and that focusing on the indirections involved is a better way to design, evaluate, and compare security solutions. We are not proposing that indirection be avoided when solving problems, but that understanding the relationships between indirections and vulnerabilities is key to securing computer systems. Using this perspective, we analyze common vulnerabilities that plague our computer systems, consider the effectiveness of currently available security solutions, and propose several new security solutions.
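
    The point about indirections breeding vulnerabilities can be made concrete with a small sketch. The dispatch table below is a layer of indirection, and an unvalidated index through it silently aliases an unintended target; the handler names are hypothetical, chosen only for illustration:

        # A dispatch table is an indirection; an unchecked index is the flaw.
        def view_profile(): return "viewing profile"
        def edit_profile(): return "editing profile"
        def admin_panel():  return "ADMIN PANEL"

        HANDLERS = [view_profile, edit_profile]   # user-reachable handlers
        HANDLERS.append(admin_panel)              # privileged entry

        def dispatch(index):
            return HANDLERS[index]()              # the indirection -- and the bug

        print(dispatch(0))     # intended use
        print(dispatch(-1))    # negative indexing quietly reaches admin_panel()

        def dispatch_checked(index):
            if not 0 <= index < 2:                # validate before indirecting
                raise ValueError("bad handler index")
            return HANDLERS[index]()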

  13. Unconditionally verifiable blind computation

    CERN Document Server

    Fitzsimons, Joseph F

    2012-01-01

    Blind Quantum Computing (BQC) allows a client to have a server carry out a quantum computation for them such that the client's input, output and computation remain private. Recently the authors together with Broadbent proposed a universal unconditionally secure BQC scheme where the client only needs to be able to prepare single qubits in separable states randomly chosen from a finite set and send them to the server, who has the balance of the required quantum computational resources. A desirable property for any BQC protocol is verification, whereby the client can verify with high probability whether the server has followed the instructions of the protocol, or if there has been some deviation resulting in a corrupted output state. A verifiable BQC protocol can be viewed as an interactive proof system leading to consequences for complexity theory. In this paper we extend the BQC protocol presented in [Broadbent, Fitzsimons and Kashefi, FOCS 2009 p517] with new functionality allowing blind computational basis m...

  14. Blind Quantum Computation

    CERN Document Server

    Arrighi, P; Arrighi, Pablo; Salvail, Louis

    2003-01-01

    We investigate the possibility of having someone carry out the work of executing a function for you, but without letting him learn anything about your input. Say Alice wants Bob to compute some well-known function f upon her input x, but wants to prevent Bob from learning anything about x. The situation arises for instance if client Alice has limited computational resources in comparison with mistrusted server Bob, or if x is an inherently mobile piece of data. Could there be a protocol whereby Bob is forced to compute f(x) "blindly", i.e. without observing x? We provide such a blind computation protocol for the class of functions which admit an efficient procedure to generate random input-output pairs, e.g. factorization. The setting is quantum, the security is unconditional, the eavesdropper is as malicious as can be. Keywords: Secure Circuit Evaluation, Secure Two-party Computation, Information Hiding, Information gain vs disturbance.
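
    The "efficient procedure to generate random input-output pairs" that the protocol requires is simple to sketch classically for the factorization example: one samples the output (the prime factors) first and derives the input from it. A minimal illustration, where the helper name and bit sizes are my own choices:

        # For f = "factor N", random input-output pairs are cheap to generate:
        # pick the factors first, then multiply. (A sketch of the ingredient the
        # protocol relies on, not of the quantum protocol itself.)
        from sympy import randprime

        def random_factoring_pair(bits=32):
            p = randprime(2 ** (bits - 1), 2 ** bits)
            q = randprime(2 ** (bits - 1), 2 ** bits)
            return p * q, (p, q)     # (input N, output f(N))

        N, (p, q) = random_factoring_pair()
        print(N, "=", p, "*", q)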

  15. Philosophy of Computer Science

    Directory of Open Access Journals (Sweden)

    Aatami Järvinen

    2014-06-01

    Full Text Available The diversity and interdisciplinarity of the computer sciences, and the multiplicity of their uses in other sciences, make it difficult to define them and to prescribe how they should be performed. They also cause friction between computer scientists from different branches. Because of how they are structured, computer science programs are criticized for not offering adequate methodological training or a deep understanding of different research traditions. To work toward a solution, some have decided to include in their curricula courses that enable students to become aware of epistemological and methodological issues in Computer Science, as well as to give meaning to the practice of computer scientists. In this article the needs and objectives of courses on the philosophy of Computer Science are analyzed, and their structure and management are explained.

  16. Computational Ocean Acoustics

    CERN Document Server

    Jensen, Finn B; Porter, Michael B; Schmidt, Henrik

    2011-01-01

    Since the mid-1970s, the computer has played an increasingly pivotal role in the field of ocean acoustics. Faster and less expensive than actual ocean experiments, and capable of accommodating the full complexity of the acoustic problem, numerical models are now standard research tools in ocean laboratories. The progress made in computational ocean acoustics over the last thirty years is summed up in this authoritative and innovatively illustrated new text. Written by some of the field's pioneers, all Fellows of the Acoustical Society of America, Computational Ocean Acoustics presents the latest numerical techniques for solving the wave equation in heterogeneous fluid–solid media. The authors discuss various computational schemes in detail, emphasizing the importance of theoretical foundations that lead directly to numerical implementations for real ocean environments. To further clarify the presentation, the fundamental propagation features of the techniques are illustrated in color. Computational Ocean A...
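
    The numerical techniques surveyed here are far richer than anything that fits in a few lines, but their simplest relative is easy to show. Below is a sketch of an explicit finite-difference scheme for the 1-D wave equation u_tt = c^2 u_xx; the grid sizes and the initial pulse are arbitrary choices, and real ocean-acoustics codes solve far richer heterogeneous fluid-solid problems:

        # Explicit leapfrog scheme for the 1-D wave equation, illustrative only.
        import numpy as np

        nx, nt = 200, 400
        c, dx = 1.0, 1.0
        dt = 0.5 * dx / c                    # satisfies the CFL stability condition
        r2 = (c * dt / dx) ** 2

        u_prev = np.zeros(nx)
        u = np.zeros(nx)
        u[nx // 2] = 1.0                     # initial pulse in the middle

        for _ in range(nt):
            u_next = np.zeros(nx)
            u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                            + r2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
            u_prev, u = u, u_next            # fixed (u = 0) boundaries

        print(float(np.abs(u).max()))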

  17. Place-Specific Computing

    DEFF Research Database (Denmark)

    Messeter, Jörn; Johansson, Michael

    An increased interest in the notion of place has evolved in interaction design. Proliferation of wireless infrastructure, developments in digital media, and a ‘spatial turn’ in computing provide the base for place-specific computing as a suggested new genre of interaction design. In the REcult...... project place-specific computing is explored through design-oriented research. This article reports six pilot studies where design students have designed concepts for place-specific computing in Berlin (Germany), Cape Town (South Africa), Rome (Italy) and Malmö (Sweden). Background and arguments...... for place-specific computing as a genre of interaction design are described. A total number of 36 design concepts designed for 16 designated zones in the four cities are presented. An analysis of the design concepts is presented indicating potentials, possibilities and problems as directions for future...

  18. Ubiquitous Computing Systems

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind; Friday, Adrian

    2009-01-01

    First introduced two decades ago, the term ubiquitous computing is now part of the common vernacular. Ubicomp, as it is commonly called, has grown not just quickly but broadly so as to encompass a wealth of concepts and technology that serves any number of purposes across all of human endeavor......, an original ubicomp pioneer, Ubiquitous Computing Fundamentals brings together eleven ubiquitous computing trailblazers who each report on his or her area of expertise. Starting with a historical introduction, the book moves on to summarize a number of self-contained topics. Taking a decidedly human...... perspective, the book includes discussion on how to observe people in their natural environments and evaluate the critical points where ubiquitous computing technologies can improve their lives. Among a range of topics this book examines: How to build an infrastructure that supports ubiquitous computing...

  19. Place-Specific Computing

    DEFF Research Database (Denmark)

    Messeter, Jörn

    2009-01-01

    An increased interest in the notion of place has evolved in interaction design based on the proliferation of wireless infrastructures, developments in digital media, and a ‘spatial turn’ in computing. In this article, place-specific computing is suggested as a genre of interaction design...... that addresses the shaping of interactions among people, place-specific resources and global socio-technical networks, mediated by digital technology, and influenced by the structuring conditions of place. The theoretical grounding for place-specific computing is located in the meeting between conceptions...... examples of place-specific computing are presented from a series of pilot studies, conducted in close collaboration with design students in Malmö, Berlin, Cape Town and Rome, that generated 36 design concepts in the genre. Reflecting on these examples, issues in the design of place-specific computing...

  20. CERN School of Computing

    CERN Multimedia

    2007-01-01

    The 2007 CERN School of Computing, organised by CERN in collaboration with the University of Split (FESB) will be held from 20 to 31 August 2007 in Dubrovnik, Croatia. It is aimed at postgraduate students and research workers with a few years' experience in scientific physics, computing or related fields. Special themes this year are: GRID Technologies: The Grid track delivers unique theoretical and hands-on education on some of the most advanced GRID topics; Software Technologies: The Software track addresses some of the most relevant modern techniques and tools for large scale distributed software development and handling as well as for computer security; Physics Computing: The Physics Computing track focuses on informatics topics specific to the HEP community. After setting-the-scene lectures, it addresses data acquisition and ROOT. Grants from the European Union Framework Programme 6 (FP6) are available to participants to cover part or all of the cost of the School. More information can be found at...

  1. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    Directory of Open Access Journals (Sweden)

    Karlheinz Schwarz

    2013-09-01

    Full Text Available Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized below. In each section a further focusing will be provided by occasionally organizing special issues on topics of high interest, collecting papers on fundamental work in the field. More applied papers should be submitted to their corresponding specialist journals. To help us achieve our goal with this journal, we have an excellent editorial board to advise us on the exciting current and future trends in computation from methodology to application. We very much look forward to hearing all about the research going on across the world. [...]

  2. Computing Algebraic Immunity by Reconfigurable Computer

    Science.gov (United States)

    2012-09-01

    ... the linear system, then the amount of computation required is O((n choose d)^ω), where ω is the well-known “exponent of Gaussian reduction” (ω = 3 for Gauss's method). ... f(x1, x2, x3) = x1x2 ⊕ x1x3 ⊕ x2x3. The top half of Table 2 shows the minterm canonical form of f̄. Here, the first (leftmost) column represents all ...
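
    The linear-algebra formulation in the snippet can be sketched in software (the report itself targets reconfigurable hardware). Below is a brute-force check over GF(2), using the example function above; the code layout and helper names are mine:

        # Algebraic immunity: least d such that f or f+1 has a nonzero
        # annihilator g of degree <= d, found by GF(2) Gaussian reduction.
        from itertools import combinations, product

        def gf2_rank(rows):
            # Rank over GF(2); rows are integer bitmasks.
            rank = 0
            while rows:
                pivot = rows.pop()
                if pivot:
                    rank += 1
                    low = pivot & -pivot
                    rows = [r ^ pivot if r & low else r for r in rows]
            return rank

        def has_annihilator(values, points, mons):
            # One row per point where the function is 1, one column per monomial.
            rows = [sum(1 << j for j, s in enumerate(mons) if all(x[i] for i in s))
                    for x, v in zip(points, values) if v]
            return gf2_rank(rows) < len(mons)    # nontrivial kernel exists

        def algebraic_immunity(f, n):
            points = list(product((0, 1), repeat=n))
            vals = [f(x) for x in points]
            for d in range(n + 1):
                mons = [s for k in range(d + 1) for s in combinations(range(n), k)]
                if (has_annihilator(vals, points, mons)
                        or has_annihilator([v ^ 1 for v in vals], points, mons)):
                    return d

        maj3 = lambda x: (x[0] & x[1]) ^ (x[0] & x[2]) ^ (x[1] & x[2])
        print(algebraic_immunity(maj3, 3))       # -> 2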

  3. Big data computing

    CERN Document Server

    Akerkar, Rajendra

    2013-01-01

    Due to market forces and technological evolution, Big Data computing is developing at an increasing rate. A wide variety of novel approaches and tools have emerged to tackle the challenges of Big Data, creating both more opportunities and more challenges for students and professionals in the field of data computation and analysis. Presenting a mix of industry cases and theory, Big Data Computing discusses the technical and practical issues related to Big Data in intelligent information management. Emphasizing the adoption and diffusion of Big Data tools and technologies in industry, the book i

  4. Attacks on computer systems

    Directory of Open Access Journals (Sweden)

    Dejan V. Vuletić

    2012-01-01

    Full Text Available Computer systems are a critical component of human society in the 21st century. The economic sector, defense, security, energy, telecommunications, industrial production, finance and other vital infrastructure depend on computer systems that operate at local, national or global scales. A particular problem is that, due to the rapid development of ICT and the unstoppable growth of its application in all spheres of human society, the vulnerability and exposure of these systems to very serious potential dangers increase. This paper analyzes some typical attacks on computer systems.

  5. Research in Computational Astrobiology

    Science.gov (United States)

    Chaban, Galina; Colombano, Silvano; Scargle, Jeff; New, Michael H.; Pohorille, Andrew; Wilson, Michael A.

    2003-01-01

    We report on several projects in the field of computational astrobiology, which is devoted to advancing our understanding of the origin, evolution and distribution of life in the Universe using theoretical and computational tools. Research projects included modifying existing computer simulation codes to use efficient, multiple time step algorithms, statistical methods for analysis of astrophysical data via optimal partitioning methods, electronic structure calculations on water–nucleic acid complexes, incorporation of structural information into genomic sequence analysis methods and calculations of shock-induced formation of polycyclic aromatic hydrocarbon compounds.

  6. Archives and the computer

    CERN Document Server

    Cook, Michael Garnet

    1986-01-01

    Archives and the Computer deals with the use of the computer and its systems and programs in archiving data and other related materials. The book covers topics such as the scope of automated systems in archives; systems for records management, archival description, and retrieval; and machine-readable archives. The selection also features examples of archives from different institutions such as the University of Liverpool, Berkshire County Record Office, and the National Maritime Museum.The text is recommended for archivists who would like to know more about the use of computers in archiving of

  7. Mobile computing handbook

    CERN Document Server

    Ilyas, Mohammad

    2004-01-01

    INTRODUCTION AND APPLICATIONS OF MOBILE COMPUTING: Wearable Computing, A. Smailagic and D.P. Siewiorek; Developing Mobile Applications: A Lime Primer, G.P. Picco, A.L. Murphy, and G.-C. Roman; Pervasive Application Development: Approaches and Pitfalls, G. Banavar, N. Cohen, and D. Soroker; ISAM, Joining Context-Awareness and Mobility to Building Pervasive Applications, I. Augustin, A. Corrêa Yamin, J.L. Victória Barbosa, L. Cavalheiro da Silva, R. Araújo Real, G. Frainer, G.G. Honrich Cavalheiro, and C.F. Resin Geyer; Integrating Mobile Wireless Devices into the Computational Grid, T. Phan, L. Huan

  8. Desktop grid computing

    CERN Document Server

    Cerin, Christophe

    2012-01-01

    Desktop Grid Computing presents common techniques used in numerous models, algorithms, and tools developed during the last decade to implement desktop grid computing. These techniques enable the solution of many important sub-problems for middleware design, including scheduling, data management, security, load balancing, result certification, and fault tolerance. The book's first part covers the initial ideas and basic concepts of desktop grid computing. The second part explores challenging current and future problems. Each chapter presents the sub-problems, discusses theoretical and practical

  9. Digital computers in action

    CERN Document Server

    Booth, A D

    1965-01-01

    Digital Computers in Action is an introduction to the basics of digital computers as well as their programming and various applications in fields such as mathematics, science, engineering, economics, medicine, and law. Other topics include engineering automation, process control, special purpose games-playing devices, machine translation and mechanized linguistics, and information retrieval. This book consists of 14 chapters and begins by discussing the history of computers, from the idea of performing complex arithmetical calculations to the emergence of a modern view of the structure of a ge

  10. Single neuron computation

    CERN Document Server

    McKenna, Thomas M; Zornetzer, Steven F

    1992-01-01

    This book contains twenty-two original contributions that provide a comprehensive overview of computational approaches to understanding a single neuron structure. The focus on cellular-level processes is twofold. From a computational neuroscience perspective, a thorough understanding of the information processing performed by single neurons leads to an understanding of circuit- and systems-level activity. From the standpoint of artificial neural networks (ANNs), a single real neuron is as complex an operational unit as an entire ANN, and formalizing the complex computations performed by real n
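
    A classic instance of the cellular-level computation the book surveys is the leaky integrate-and-fire model. The following is a minimal textbook sketch, not taken from the book, with all parameters chosen only for illustration:

        # Leaky integrate-and-fire neuron: integrate input, spike at threshold.
        tau, v_rest, v_thresh, v_reset = 20.0, -65.0, -50.0, -65.0  # ms, mV
        dt, drive = 0.1, 20.0                                       # ms, mV

        v, spikes = v_rest, []
        for step in range(int(200 / dt)):        # simulate 200 ms
            v += (-(v - v_rest) + drive) * dt / tau
            if v >= v_thresh:                    # threshold crossing -> spike
                spikes.append(step * dt)
                v = v_reset                      # reset after the spike

        print(len(spikes), "spikes, first at", spikes[0], "ms")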

  11. Computational Science and Innovation

    CERN Document Server

    Dean, D J

    2010-01-01

    Simulations - utilizing computers to solve complicated science and engineering problems - are a key ingredient of modern science. The U.S. Department of Energy (DOE) is a world leader in the development of high-performance computing (HPC), the development of applied math and algorithms that utilize the full potential of HPC platforms, and the application of computing to science and engineering problems. An interesting general question is whether the DOE can strategically utilize its capability in simulations to advance innovation more broadly. In this article, I will argue that this is certainly possible.

  12. Introduction to grid computing

    CERN Document Server

    Magoules, Frederic; Tan, Kiat-An; Kumar, Abhinit

    2009-01-01

    A Thorough Overview of the Next Generation in Computing. Poised to follow in the footsteps of the Internet, grid computing is on the verge of becoming more robust and accessible to the public in the near future. Focusing on this novel, yet already powerful, technology, Introduction to Grid Computing explores state-of-the-art grid projects, core grid technologies, and applications of the grid. After comparing the grid with other distributed systems, the book covers two important aspects of a grid system: scheduling of jobs and resource discovery and monitoring in grid. It then discusses existing a

  13. Practical scientific computing

    CERN Document Server

    Muhammad, A

    2011-01-01

    Scientific computing is about developing mathematical models, numerical methods and computer implementations to study and solve real problems in science, engineering, business and even social sciences. Mathematical modelling requires deep understanding of classical numerical methods. This essential guide provides the reader with sufficient foundations in these areas to venture into more advanced texts. The first section of the book presents numEclipse, an open source tool for numerical computing based on the notion of MATLAB®. numEclipse is implemented as a plug-in for Eclipse, a leading integ

  14. Advances in computers

    CERN Document Server

    Memon, Atif

    2012-01-01

    Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field. In-depth surveys and tutorials on new computer technology; well-known authors and researchers in the field; extensive bibliographies with m

  15. When computers were human

    CERN Document Server

    Grier, David Alan

    2013-01-01

    Before Palm Pilots and iPods, PCs and laptops, the term "computer" referred to the people who did scientific calculations by hand. These workers were neither calculating geniuses nor idiot savants but knowledgeable people who, in other circumstances, might have become scientists in their own right. When Computers Were Human represents the first in-depth account of this little-known, 200-year epoch in the history of science and technology. Beginning with the story of his own grandmother, who was trained as a human computer, David Alan Grier provides a poignant introduction to the wider wo

  16. Computing with Harmonic Functions

    OpenAIRE

    Axler, Sheldon

    2015-01-01

    This document is the manual for a free Mathematica package for computing with harmonic functions. This package allows the user to make calculations that would take a prohibitive amount of time if done without a computer. For example, the Poisson integral of any polynomial can be computed exactly. This software can find exact solutions to Dirichlet, Neumann, and biDirichlet problems in R^n with polynomial data on balls, ellipsoids, and annular regions. It can also find bases for spaces of sphe...
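
    The flavor of these exact computations is easy to reproduce with a general computer algebra system. The following is a sympy sketch (not the Mathematica package itself) of a Dirichlet problem on the unit disk with polynomial boundary data g = x^2, with the candidate solution written down by hand:

        # Verify an exact harmonic extension of polynomial boundary data.
        import sympy as sp

        x, y = sp.symbols("x y")
        u = (x**2 - y**2) / 2 + sp.Rational(1, 2)   # candidate solution

        laplacian = sp.diff(u, x, 2) + sp.diff(u, y, 2)
        print(sp.simplify(laplacian))                      # 0: u is harmonic
        print(sp.simplify(u.subs(y**2, 1 - x**2) - x**2))  # 0: u = g on the circle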

  17. Advanced Computer Typography.

    Science.gov (United States)

    1981-12-01

    Advanced Computer Typography, A. V. Hershey, Naval Postgraduate School, Monterey, California, December 1981. Final report, Dec 1979 - Dec 1981; report number NPS012-81-005; unclassified, approved for public release. (Only the scanned report-form header survives in this record.)

  18. Convergence: Computing and communications

    Energy Technology Data Exchange (ETDEWEB)

    Catlett, C. [National Center for Supercomputing Applications, Champaign, IL (United States)

    1996-12-31

    This paper highlights the operations of the National Center for Supercomputing Applications (NCSA). NCSA is developing and implementing a national strategy to create, use, and transfer advanced computing and communication tools and information technologies for science, engineering, education, and business. The primary focus of the presentation is historical and expected growth in the computing capacity, personal computer performance, and Internet and WorldWide Web sites. Data are presented to show changes over the past 10 to 20 years in these areas. 5 figs., 4 tabs.

  19. Computer science handbook

    CERN Document Server

    Tucker, Allen B

    2004-01-01

    Due to the great response to the famous Computer Science Handbook edited by Allen B. Tucker, … in 2004 Chapman & Hall/CRC published a second edition of this comprehensive reference book. Within more than 70 chapters, every one new or significantly revised, one can find any kind of information and references about computer science one can imagine. … All in all, there is absolutely nothing about computer science that cannot be found in the encyclopedia with its 110 survey articles …-Christoph Meinel, Zentralblatt MATH

  20. Games, puzzles, and computation

    CERN Document Server

    Hearn, Robert A

    2009-01-01

    The authors show that there are underlying mathematical reasons for why games and puzzles are challenging (and perhaps why they are so much fun). They also show that games and puzzles can serve as powerful models of computation-quite different from the usual models of automata and circuits-offering a new way of thinking about computation. The appendices provide a substantial survey of all known results in the field of game complexity, serving as a reference guide for readers interested in the computational complexity of particular games, or interested in open problems about such complexities.

  1. Computer architecture technology trends

    CERN Document Server

    1991-01-01

    Please note this is a Short Discount publication. This year's edition of Computer Architecture Technology Trends analyses the trends which are taking place in the architecture of computing systems today. Due to the sheer number of different applications to which computers are being applied, there seems no end to the different adoptions which proliferate. There are, however, some underlying trends which appear. Decision makers should be aware of these trends when specifying architectures, particularly for future applications. This report is fully revised and updated and provides insight in

  2. Computer surety: computer system inspection guidance. [Contains glossary

    Energy Technology Data Exchange (ETDEWEB)

    1981-07-01

    This document discusses computer surety in NRC-licensed nuclear facilities from the perspective of physical protection inspectors. It gives background information and a glossary of computer terms, along with threats and computer vulnerabilities, methods used to harden computer elements, and computer audit controls.

  3. Neural computation and the computational theory of cognition.

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-04-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism-neural processes are computations in the generic sense. After that, we reject on empirical grounds the common assimilation of neural computation to either analog or digital computation, concluding that neural computation is sui generis. Analog computation requires continuous signals; digital computation requires strings of digits. But current neuroscientific evidence indicates that typical neural signals, such as spike trains, are graded like continuous signals but are constituted by discrete functional elements (spikes); thus, typical neural signals are neither continuous signals nor strings of digits. It follows that neural computation is sui generis. Finally, we highlight three important consequences of a proper understanding of neural computation for the theory of cognition. First, understanding neural computation requires a specially designed mathematical theory (or theories) rather than the mathematical theories of analog or digital computation. Second, several popular views about neural computation turn out to be incorrect. Third, computational theories of cognition that rely on non-neural notions of computation ought to be replaced or reinterpreted in terms of neural computation.

  4. Introduction to computer networking

    CERN Document Server

    Robertazzi, Thomas G

    2017-01-01

    This book gives a broad look at both fundamental networking technology and new areas that support it and use it. It is a concise introduction to the most prominent, recent technological topics in computer networking. Topics include network technology such as wired and wireless networks, enabling technologies such as data centers, software defined networking, cloud and grid computing and applications such as networks on chips, space networking and network security. The accessible writing style and non-mathematical treatment make this a useful book for the student, network and communications engineer, computer scientist and IT professional. • Features a concise, accessible treatment of computer networking, focusing on new technological topics; • Provides non-mathematical introduction to networks in their most common forms today; • Includes new developments in switching, optical networks, WiFi, Bluetooth, LTE, 5G, and quantum cryptography.

  5. Workshop on Computational Optimization

    CERN Document Server

    2016-01-01

    This volume is a comprehensive collection of extended contributions from the Workshop on Computational Optimization 2014, held in Warsaw, Poland, September 7-10, 2014. The book presents recent advances in computational optimization. The volume includes important real problems like parameter settings for controlling processes in bioreactors and other processes, resource-constrained project scheduling, infection distribution, molecule distance geometry, quantum computing, real-time management and optimal control, bin packing, medical image processing, and localization of abrupt atmospheric contamination sources. It shows how to develop algorithms for them based on new metaheuristic methods like evolutionary computation, ant colony optimization, constraint programming and others. This research demonstrates how some real-world problems arising in engineering, economics, medicine and other domains can be formulated as optimization tasks.

  6. Cloud computing security.

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Dongwan; Claycomb, William R.; Urias, Vincent E.

    2010-10-01

    Cloud computing is a paradigm rapidly being embraced by government and industry as a solution for cost-savings, scalability, and collaboration. While a multitude of applications and services are available commercially for cloud-based solutions, research in this area has yet to fully embrace the full spectrum of potential challenges facing cloud computing. This tutorial aims to provide researchers with a fundamental understanding of cloud computing, with the goals of identifying a broad range of potential research topics, and inspiring a new surge in research to address current issues. We will also discuss real implementations of research-oriented cloud computing systems for both academia and government, including configuration options, hardware issues, challenges, and solutions.

  7. Computational drug discovery

    Institute of Scientific and Technical Information of China (English)

    Si-sheng OU-YANG; Jun-yan LU; Xiang-qian KONG; Zhong-jie LIANG; Cheng LUO; Hualiang JIANG

    2012-01-01

    Computational drug discovery is an effective strategy for accelerating and economizing the drug discovery and development process. Because of the dramatic increase in the availability of biological macromolecule and small molecule information, the applicability of computational drug discovery has been extended and broadly applied to nearly every stage in the drug discovery and development workflow, including target identification and validation, lead discovery and optimization, and preclinical tests. Over the past decades, computational drug discovery methods such as molecular docking, pharmacophore modeling and mapping, de novo design, molecular similarity calculation and sequence-based virtual screening have been greatly improved. In this review, we present an overview of these important computational methods, platforms and successful applications in this field.
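
    Of the methods listed, molecular similarity calculation has the shortest possible sketch: the Tanimoto coefficient on fingerprint bit sets. The fingerprints below are toy data, not real chemistry:

        # Tanimoto (Jaccard) similarity between two molecular fingerprints,
        # represented as sets of "on" bit positions. Toy values for illustration.
        def tanimoto(fp_a, fp_b):
            union = len(fp_a | fp_b)
            return len(fp_a & fp_b) / union if union else 1.0

        mol_a = {1, 4, 7, 9, 12}        # hypothetical fingerprint bits
        mol_b = {1, 4, 9, 15}
        print(tanimoto(mol_a, mol_b))   # 0.5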

  8. Computation and Spacetime Structure

    CERN Document Server

    Stannett, Mike

    2011-01-01

    We investigate the relationship between computation and spacetime structure, focussing on the role of closed timelike curves (CTCs) in promoting computational speedup. We note first that CTC traversal can be interpreted in two distinct ways, depending on one's understanding of spacetime. Focussing on one interpretation leads us to develop a toy universe in which no CTC can be traversed more than once, whence no computational speedup is possible. Focussing on the second (and more standard) interpretation leads to the surprising conclusion that CTCs act as perfect information repositories: just as black holes have entropy, so do CTCs. If we also assume that P is not equal to NP, we find that all observers agree that, even if unbounded time travel existed in their youth, this capability eventually vanishes as they grow older. Thus the computational assumption "P is not NP" is also an assumption concerning cosmological structure.

  9. Theoretical Computer Science

    DEFF Research Database (Denmark)

    2002-01-01

    The proceedings contain 8 papers from the Conference on Theoretical Computer Science. Topics discussed include: query by committee, linear separation and random walks; hardness results for neural network approximation problems; a geometric approach to leveraging weak learners; mind change...

  10. Computational Abstraction Steps

    DEFF Research Database (Denmark)

    Thomsen, Lone Leth; Thomsen, Bent; Nørmark, Kurt

    2010-01-01

    In this paper we discuss computational abstraction steps as a way to create class abstractions from concrete objects, and from examples. Computational abstraction steps are regarded as symmetric counterparts to computational concretisation steps, which are well-known in terms of function calls and class instantiations. Our teaching experience shows that many novice programmers find it difficult to write programs with abstractions that materialise to concrete objects later in the development process. The contribution of this paper is the idea of initiating a programming process by creating or capturing concrete values, objects, or actions. As the next step, some of these are lifted to a higher level by computational means. In the object-oriented paradigm the target of such steps is classes. We hypothesise that the proposed approach primarily will be beneficial to novice programmers or during...
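
    The abstraction step itself can be sketched in a few lines. Here is a toy illustration of the direction of travel the abstract describes, from concrete objects to a lifted class; the example data and the class are mine, not the paper's:

        # Concrete objects come first...
        point_a = {"x": 0, "y": 0}
        point_b = {"x": 3, "y": 4}

        # ...then the shared shape is lifted to a class abstraction.
        class Point:
            def __init__(self, x, y):
                self.x, self.y = x, y

            @classmethod
            def from_example(cls, concrete):
                return cls(concrete["x"], concrete["y"])

        b = Point.from_example(point_b)
        print(b.x, b.y)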

  11. Computer-Assisted Instruction.

    Science.gov (United States)

    Broadbent, Brooke

    1990-01-01

    Provides insight into how computers can be used in union education. States that they will never replace an effective classroom environment where participants' questions are answered by instructors, but can support existing systems. (JOW)

  12. Feynman Lectures on Computation

    CERN Document Server

    Feynman, Richard Phillips; Allen, Robin W

    1999-01-01

    "When, in 1984-86, Richard P. Feynman gave his famous course on computation at the California Institute of Technology, he asked Tony Hey to adapt his lecture notes into a book. Although led by Feynman,"

  13. Encyclopedia of cloud computing

    CERN Document Server

    Bojanova, Irena

    2016-01-01

    The Encyclopedia of Cloud Computing provides IT professionals, educators, researchers and students with a compendium of cloud computing knowledge. Authored by a spectrum of subject matter experts in industry and academia, this unique publication, in a single volume, covers a wide range of cloud computing topics, including technological trends and developments, research opportunities, best practices, standards, and cloud adoption. Providing multiple perspectives, it also addresses questions that stakeholders might have in the context of development, operation, management, and use of clouds. Furthermore, it examines cloud computing's impact now and in the future. The encyclopedia presents 56 chapters logically organized into 10 sections. Each chapter covers a major topic/area with cross-references to other chapters and contains tables, illustrations, side-bars as appropriate. Furthermore, each chapter presents its summary at the beginning and backend material, references and additional resources for further i...

  14. Computational modeling in biomechanics

    CERN Document Server

    Mofrad, Mohammad

    2010-01-01

    This book provides a glimpse of the diverse and important roles that modern computational technology is playing in various areas of biomechanics. It includes unique chapters on ab initio quantum mechanical, molecular dynamics and scale-coupling methods.

  15. Computers, Children and Epistemology.

    Science.gov (United States)

    Blitman, Elaine; And Others

    1984-01-01

    The implementation of a computer education program in a Hawaiian school is described. The use of the LOGO programming language is central to the program, and teachers' comments on student behaviors and reactions are included. (MNS)

  16. Computers, Networks and Work.

    Science.gov (United States)

    Sproull, Lee; Kiesler, Sara

    1991-01-01

    Discussed are how computer networks can affect the nature of work and the relationships between managers and employees. The differences between face-to-face exchanges and electronic interactions are described. (KR)

  17. Kinetic equations: computation

    CERN Document Server

    Pareschi, Lorenzo

    2013-01-01

    Kinetic equations bridge the gap between a microscopic description and a macroscopic description of the physical reality. Due to the high dimensionality the construction of numerical methods represents a challenge and requires a careful balance between accuracy and computational complexity.

  18. Computability in HOL

    DEFF Research Database (Denmark)

    Hougaard, Ole Ildsgaard

    1994-01-01

    This paper describes the implementation of a formal model for computability theory in the logical system HOL. Computability is modeled through an imperative language formally defined with the use of the Backus-Naur form and natural semantics. I will define the concepts of computable functions...... will then evolve in two directions: The first subject is the reduction of recursive sets, leading to the unsolvability of the halting problem. The other is two general results of computability theory: the s-m-n theorem and Kleene's version of the 2nd recursion theorem. The use of the HOL system implies...... that the theory must be proven in the absence of Church's thesis, and, in fact, all proofs have to be done in detail. The paper will show how the HOL system is used to define the modeling language, as well as demonstrating the interaction between the HOL system and the theory. At some points the HOL system

  19. Computing the functional proteome

    DEFF Research Database (Denmark)

    O'Brien, Edward J.; Palsson, Bernhard

    2015-01-01

    Constraint-based models enable the computation of feasible, optimal, and realized biological phenotypes from reaction network reconstructions and constraints on their operation. To date, stoichiometric reconstructions have largely focused on metabolism, resulting in genome-scale metabolic models (M...
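
    The core computation behind constraint-based models is a linear program: maximize an objective flux subject to steady-state mass balance S v = 0 and flux bounds. Below is a toy flux balance analysis sketch with an invented three-reaction network; real reconstructions have thousands of reactions:

        # Toy flux balance analysis with scipy's linear programming solver.
        import numpy as np
        from scipy.optimize import linprog

        # Reactions: v0 = uptake of A, v1 = A -> B, v2 = B -> "biomass".
        S = np.array([[1, -1,  0],    # metabolite A balance
                      [0,  1, -1]])   # metabolite B balance
        bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10
        c = [0, 0, -1]                # linprog minimizes, so negate biomass

        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
        print(res.x)                  # optimal fluxes, here [10, 10, 10]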

  20. Computer Fever Hits China

    Institute of Scientific and Technical Information of China (English)

    1996-01-01

    THE term "home computer" became commonplace amongst Chinese citizens only a few years ago. Today, however, an untold number of Chinese women, representing thousands upon thousands of Chinese families, have joined in the home computer craze sweeping the nation. China is home to well over 300 million families, including 74 million families in cities and towns, with the latter figure representing a population greater than that of the United States. The estimated computer market for China stands at 100 million units. While one out of every thousand Chinese families currently owns a computer, the annual increase is rising at the phenomenal rate of 20 percent. Women usually control the purse strings in Chinese families, especially in cities and

  1. Computational neurology and psychiatry

    CERN Document Server

    Bhattacharya, Basabdatta; Cochran, Amy

    2017-01-01

    This book presents the latest research in computational methods for modeling and simulating brain disorders. In particular, it shows how mathematical models can be used to study the relationship between a given disorder and the specific brain structure associated with that disorder. It also describes the emerging field of computational psychiatry, including the study of pathological behavior due to impaired functional connectivity, pathophysiological activity, and/or aberrant decision-making. Further, it discusses the data analysis techniques that will be required to analyze the increasing amount of data being generated about the brain. Lastly, the book offers some tips on the application of computational models in the field of quantitative systems pharmacology. Mainly written for computational scientists eager to discover new application fields for their model, this book also benefits neurologists and psychiatrists wanting to learn about new methods.

  2. Quantum computing: towards reality

    Science.gov (United States)

    Trabesinger, Andreas

    2017-03-01

    The concept of computers that harness the laws of quantum mechanics has transformed our thinking about how information can be processed. Now the environment exists to make prototype devices a reality.

  3. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... be viewed on a computer monitor, printed on film or transferred to a CD or DVD. CT ... distinguished from one another on an x-ray film or CT electronic image. In a conventional x- ...

  4. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... be viewed on a computer monitor, printed on film or transferred to a CD or DVD. CT ... distinguished from one another on an x-ray film or CT electronic image. In a conventional x- ...

  5. Computer Center: Software Review.

    Science.gov (United States)

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with color graphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  6. The Computer in Lexicography

    Science.gov (United States)

    Bailey, Richard W.; Robinson, Jay L.

    1973-01-01

    Expanded version of a paper presented to the section on Computer Research in Language and Literature of the Midwest Modern Language Association, October 24, 1969. Article is part of "Lexicography and Dialect Geography, Festgabe for Hans Kurath". (DD)

  7. NETL Super Computer

    Data.gov (United States)

    Federal Laboratory Consortium — The NETL Super Computer was designed for performing engineering calculations that apply to fossil energy research. It is one of the world’s larger supercomputers,...

  8. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... of the Head? What is CT Scanning of the Head? Computed tomography, more commonly known as a ... What are some common uses of the procedure? CT scanning of the head is typically ...

  9. ENERGY STAR Certified Computers

    Data.gov (United States)

    U.S. Environmental Protection Agency — Certified models meet all ENERGY STAR requirements as listed in the Version 6.1 ENERGY STAR Program Requirements for Computers that are effective as of June 2, 2014....

  10. Computer monitors drilling performance

    Energy Technology Data Exchange (ETDEWEB)

    1984-05-01

    Computer systems that can monitor over 40 drilling variables, display them graphically, record and transmit the information have been developed separately by two French companies. The systems, Vigigraphic and Visufora, involve the linking of a master computer with various surface and downhole sensors to measure the data on a real-time (as experienced) basis and compute the information. Vigigraphic is able to produce graphic displays grouped on four screens - drilling, tripping, geological and mud data. It computes at least 200 variables from the sensor readings, and it can store over 100 variables. Visufora allows the operator to group the drilling variables as desired. It can monitor and analyze surface and downhole parameters. The system can be linked with MWD tools. Twenty channels of input are assigned to surface values and the remaining 20 channels can be used to monitor downhole instrumentation.

  11. Complex networks and computing

    Institute of Scientific and Technical Information of China (English)

    Shuigeng ZHOU; Zhongzhi ZHANG

    2009-01-01

    Nowadays complex networks are pervasive in various areas of science and technology. Popular examples of complex networks include the Internet, social networks of collaboration, citations and co-authoring, as well as biological networks such as gene and protein interactions and others. Complex networks research spans mathematics, computer science, engineering, biology and the social sciences. Even within computer science, a growing number of problems are either found to be related to complex networks or studied from the perspective of complex networks, such as searching on Web and P2P networks, routing in sensor networks, language processing, software engineering etc. The interaction and mergence of complex networks and computing is inspiring new chances and challenges in computer science.
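
    A small computational taste of the field (assuming the networkx library is available; the graph model and sizes are arbitrary choices): generate a scale-free network and look at its heavy-tailed degree distribution.

        # Degree distribution of a Barabasi-Albert scale-free graph.
        import networkx as nx
        from collections import Counter

        g = nx.barabasi_albert_graph(n=1000, m=2, seed=42)
        degree_counts = Counter(d for _, d in g.degree())
        for k in sorted(degree_counts)[:5]:
            print(k, degree_counts[k])                        # many low-degree nodes...
        print("max degree:", max(d for _, d in g.degree()))   # ...and a few hubs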

  12. Resilient computer system design

    CERN Document Server

    Castano, Victor

    2015-01-01

    This book presents a paradigm for designing new generation resilient and evolving computer systems, including their key concepts, elements of supportive theory, methods of analysis and synthesis of ICT with new properties of evolving functioning, as well as implementation schemes and their prototyping. The book explains why new ICT applications require a complete redesign of computer systems to address challenges of extreme reliability, high performance, and power efficiency. The authors present a comprehensive treatment for designing the next generation of computers, especially addressing safety-critical, autonomous, real time, military, banking, and wearable health care systems. The book describes design solutions for a new computer system - an evolving reconfigurable architecture (ERA) that is free from drawbacks inherent in current ICT and related engineering models - and pursues simplicity, reliability, and scalability principles of design implemented through redundancy and re-configurability; targeted for energy-,...

  13. Computer and information science

    CERN Document Server

    2016-01-01

    This edited book presents scientific results of the 15th IEEE/ACIS International Conference on Computer and Information Science (ICIS 2016), which was held on June 26–29 in Okayama, Japan. The aim of this conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the numerous fields of computer science, to share their experiences and exchange new ideas and information in a meaningful way, to present research results on all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. The conference organizers selected the best papers from those papers accepted for presentation at the conference. The papers were chosen based on review scores submitted by members of the program committee, and underwent further rigorous rounds of review. This publication captures 12 of the conference’s most promising...

  14. GPU computing and applications

    CERN Document Server

    See, Simon

    2015-01-01

    This book presents a collection of state-of-the-art research on GPU computing and applications. The major part of this book is selected from the work presented at the 2013 Symposium on GPU Computing and Applications held at Nanyang Technological University, Singapore (Oct 9, 2013). Three major domains of GPU application are covered in the book: (1) engineering design and simulation; (2) biomedical sciences; and (3) interactive & digital media. The book also addresses the fundamental issues in GPU computing with a focus on big data processing. Researchers and developers in GPU computing and applications will benefit from this book. Training professionals and educators can also benefit from this book to learn the possible application of GPU technology in various areas.

  15. Adiabatic Quantum Computing

    CERN Document Server

    Pinski, Sebastian D

    2011-01-01

    Adiabatic Quantum Computing (AQC) is a relatively new subject in the world of quantum computing, let alone Physics. Inspiration for this project has come from recent controversy around D-Wave Systems in British Columbia, Canada, who claim to have built a working AQC which is now commercially available and hope to be distributing a 1024 qubit chip by the end of 2008. Their 16 qubit chip was demonstrated online for the Supercomputing 2007 conference within which a few small problems were solved; although the explanations that journalists and critics received were minimal and very little was divulged in the question and answer session. This 'unconvincing' demonstration has caused physicists and computer scientists to hit back at D-Wave. The aim of this project is to give an introduction to the historic advances in classical and quantum computing and to explore the methods of AQC. Through numerical simulations an algorithm for the Max Independent Set problem is empirically obtained.
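
    The "numerical simulations" mentioned here follow a standard pattern: interpolate between an easy initial Hamiltonian and the problem Hamiltonian, and evolve slowly. The following is a single-qubit sketch of that pattern, using a toy problem of my own rather than the project's Max Independent Set instance:

        # Adiabatic evolution H(s) = (1-s) H0 + s H1 for one qubit.
        import numpy as np
        from scipy.linalg import expm

        X = np.array([[0, 1], [1, 0]], dtype=complex)
        H0 = -X                                    # ground state |+>
        H1 = np.diag([0.0, -1.0]).astype(complex)  # ground state |1>

        T, steps = 50.0, 5000                      # anneal time, resolution
        dt = T / steps
        psi = np.array([1, 1], dtype=complex) / np.sqrt(2)   # ground of H0

        for k in range(steps):
            s = (k + 0.5) / steps
            H = (1 - s) * H0 + s * H1
            psi = expm(-1j * H * dt) @ psi         # one Schrodinger step

        print(abs(psi[1]) ** 2)    # near 1: the anneal found the ground state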

  16. 1987 computer science research: Computation directorate

    Energy Technology Data Exchange (ETDEWEB)

    McGraw, J.R.; Grupe, K.F. (eds.)

    1987-01-01

    The topics of research presented here reflect our view of the broad range of issues we need to address in support of our computing environment. Large-scale Scientific Computations represents one of our newest ventures. The goal is to more closely link expertise in the problem domains (e.g., fluid dynamics) with expertise in sophisticated numerical methods, thus allowing for a broader range of solution strategies to get better answers. Parallel Numerical Algorithms focuses more tightly on the development and analysis of numerical techniques for use in parallel computing situations. Issues here include the solution of extremely large partial differential equations, matrix solution techniques, and Monte Carlo programming techniques. In the area of General Numerical Algorithms we recognize the need for a significant amount of research on numerics without the additional complexity of parallelism. This area includes work on partial differential equations, ordinary differential equations, interpolation, and a variety of statistical analyses. Parallel Systems Software addresses issues related to going from a parallel algorithm to its correct and efficient implementation on a particular system. Distributed Operating Systems and Networks describes our efforts to provide a very flexible environment for users to access a diverse set of machines and services in an efficient and simple manner. Expert Systems Software covers another relatively new and expanding area. We are looking at various ways that knowledge engineering ideas can reduce development time for writing new code systems and improve our control over experimental processes. In the section on General Purpose Software we include several projects that span a wide range of topics. The last section, Technology Information Systems, reports the status of a special effort to provide sophisticated methods for allowing users to access remote information centers.

  17. Computers and clinical arrhythmias.

    Science.gov (United States)

    Knoebel, S B; Lovelace, D E

    1983-02-01

    Cardiac arrhythmias are ubiquitous in normal and abnormal hearts. These disorders may be life-threatening or benign, symptomatic or unrecognized. Arrhythmias may be the precursor of sudden death, a cause or effect of cardiac failure, a clinical reflection of acute or chronic disorders, or a manifestation of extracardiac conditions. Progress is being made toward unraveling the diagnostic and therapeutic problems involved in arrhythmogenesis. Many of the advances would not be possible, however, without the availability of computer technology. To preserve the proper balance and purposeful progression of computer usage, engineers and physicians have been exhorted not to work independently in this field. Both should learn some of the other's trade. The two disciplines need to come together to solve important problems with computers in cardiology. The intent of this article was to acquaint the practicing cardiologist with some of the extant and envisioned computer applications and some of the problems with both. We conclude that computer-based database management systems are necessary for sorting out the clinical factors of relevance for arrhythmogenesis, but computer database management systems are beset with problems that will require sophisticated solutions. The technology for detecting arrhythmias on routine electrocardiograms is quite good but human over-reading is still required, and the rationale for computer application in this setting is questionable. Systems for qualitative, continuous monitoring and review of extended time ECG recordings are adequate with proper noise rejection algorithms and editing capabilities. The systems are limited presently for clinical application to the recognition of ectopic rhythms and significant pauses. Attention should now be turned to the clinical goals for detection and quantification of arrhythmias. We should be asking the following questions: How quantitative do systems need to be? Are computers required for the detection of

  18. Computer aided product design

    DEFF Research Database (Denmark)

    Constantinou, Leonidas; Bagherpour, Khosrow; Gani, Rafiqul

    1996-01-01

    A general methodology for Computer Aided Product Design (CAPD) with specified property constraints, capable of solving a large range of problems, is presented. The methodology employs the group contribution approach and generates acyclic, cyclic and aromatic compounds of various degrees......-liquid equilibria (LLE), solid-liquid equilibria (SLE) and gas solubility. Finally, a computer program based on the extended methodology has been developed, and the results from five case studies highlighting various features of the methodology are presented....
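
    The group contribution approach mentioned above estimates a compound's properties by summing tabulated contributions of its functional groups. The sketch below shows the idea for a Joback-style boiling point estimate; the group names and numerical contributions are illustrative assumptions, not data from the paper.

        # Hedged sketch of a group-contribution property estimate.
        # Contribution values below are illustrative, not the paper's data.
        GROUP_TB_CONTRIB = {"CH3": 23.58, "CH2": 22.88, "OH": 92.88}

        def estimate_boiling_point(groups):
            """Joback-style estimate: Tb (K) = 198.2 + sum of group contributions."""
            return 198.2 + sum(GROUP_TB_CONTRIB[g] * n for g, n in groups.items())

        # 1-propanol = CH3 + 2 x CH2 + OH
        print(estimate_boiling_point({"CH3": 1, "CH2": 2, "OH": 1}))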

  19. Recent computational chemistry

    Science.gov (United States)

    Onishi, Taku

    2015-12-01

    Thanks to earlier developments in quantum theory and calculation methods, we can now investigate quantum phenomena in real materials and molecules, and can design functional materials by computation. As limits and problems still exist in theory, cooperation between theory and computation is becoming more important for clarifying unknown quantum mechanisms and discovering more efficient functional materials; it is likely to become the next-generation standard. Finally, our theoretical methodology for boundary solids is introduced.

  20. Ternary optical computer principle

    Institute of Scientific and Technical Information of China (English)

    金翊; 何华灿; 吕养天

    2003-01-01

    The fundamental principle and the characteristics of the ternary optical computer, which uses horizontally polarized light, vertically polarized light and no intensity to express information, are propounded in this paper. The practicability of making key parts of the ternary optical computer from modern micro- or integrated optical devices, opto-electronic and electro-photonic elements is discussed. The principle can be applied in three-state optical fiber communication via horizontally and vertically polarized light.
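
    To make the three-state encoding concrete, the sketch below maps the three light states described above onto the trit values 0, 1 and 2 and implements one illustrative ternary gate; both the mapping and the choice of gate are our own assumptions, not taken from the paper.

        # Three distinguishable optical states used as trit values; this mapping
        # is a hypothetical illustration, not the paper's encoding.
        NO_INTENSITY, HORIZONTAL, VERTICAL = 0, 1, 2

        def trit_min(a, b):
            """One possible ternary gate: MIN, a ternary analogue of AND."""
            return min(a, b)

        for a in (NO_INTENSITY, HORIZONTAL, VERTICAL):
            for b in (NO_INTENSITY, HORIZONTAL, VERTICAL):
                print(a, b, "->", trit_min(a, b))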

  1. Metacomputing on Commodity Computers

    Science.gov (United States)

    1999-05-01

    Charlotte is the first parallel programming system to provide one-click computing on the Web. That is, without any administrative effort, volunteers... The idea behind one-click computing is to allow... capable browser to a Web site. A key ingredient in one-click computing is its lack of requirements: user accounts are not required, the availability of

  2. Computer Games and Instruction

    Science.gov (United States)

    Tobias, Sigmund, Ed.; Fletcher, J. D., Ed.

    2011-01-01

    There is intense interest in computer games. A total of 65 percent of all American households play computer games, and sales of such games increased 22.9 percent last year. The average amount of game playing time was found to be 13.2 hours per week. The popularity and market success of games is evident from both the increased earnings from games,…

  3. Theoretical and computational chemistry.

    Science.gov (United States)

    Meuwly, Markus

    2010-01-01

    Computer-based and theoretical approaches to chemical problems can provide atomistic understanding of complex processes at the molecular level. Examples ranging from rates of ligand-binding reactions in proteins to structural and energetic investigations of diastereomers relevant to organo-catalysis are discussed in the following. They highlight the range of application of theoretical and computational methods to current questions in chemical research.

  4. Multifractals and Entropy Computing

    CERN Document Server

    Slomczynski, W; Zyczkowski, K; Slomczynski, Wojciech; Kwapien, Jaroslaw; Zyczkowski, Karol

    1998-01-01

    We discuss the properties of invariant measures corresponding to iterated function systems (IFSs) with place-dependent probabilities and compute their generalized entropies. It is shown that with certain dynamical systems one can associate the corresponding IFSs in such a way that their generalized entropies are equal. We use this method to compute the entropy of some classical and quantum dynamical systems. Numerical techniques are based on integration over fractal measures.
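
    For readers unfamiliar with IFSs, the sketch below iterates a two-map IFS with a place-dependent probability and collects points from the resulting invariant measure. The maps and probability function are invented for illustration and are not those studied in the paper.

        # Illustrative IFS with place-dependent probabilities (not the paper's system).
        import random

        maps = [lambda x: x / 3.0, lambda x: x / 3.0 + 2.0 / 3.0]  # two contractions on [0, 1]

        def p0(x):
            # Probability of applying the first map depends on the current point.
            return 0.3 + 0.4 * x

        def sample_invariant_measure(n=100_000, seed=1):
            rng = random.Random(seed)
            x, points = 0.5, []
            for _ in range(n):
                x = maps[0](x) if rng.random() < p0(x) else maps[1](x)
                points.append(x)
            return points  # histogram these to approximate the invariant measure

        points = sample_invariant_measure()
        print(sum(points) / len(points))  # crude summary of the sampled measure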

  5. Adiabatic quantum computing

    OpenAIRE

    Lobe, Elisabeth; Stollenwerk, Tobias; Tröltzsch, Anke

    2015-01-01

    In recent years, the field of adiabatic quantum computing has gained importance due to advances in the realisation of such machines, especially by the company D-Wave Systems. These machines are suited to solving discrete optimisation problems which are typically very hard to solve on a classical computer. Due to the quantum nature of the device it is assumed that there is a substantial speedup compared to classical HPC facilities. We explain the basic principles of adiabatic ...
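
    Machines of this kind take discrete optimisation problems in a quadratic binary form (QUBO/Ising). As a hedged illustration of that formulation, the sketch below minimises a tiny QUBO by brute force on a classical computer; the matrix is an arbitrary example and no vendor API is involved.

        # Brute-force minimisation of a tiny QUBO: minimise x^T Q x over x in {0,1}^n.
        # The Q matrix is an arbitrary illustrative example.
        from itertools import product

        Q = [[-1.0,  2.0,  0.0],
             [ 0.0, -1.0,  2.0],
             [ 0.0,  0.0, -1.0]]

        def qubo_energy(x):
            n = len(x)
            return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

        best = min(product((0, 1), repeat=3), key=qubo_energy)
        print(best, qubo_energy(best))  # -> (1, 0, 1) with energy -2.0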

  6. Philosophy of Computer Science

    OpenAIRE

    Aatami Järvinen; Adele Mustonen; Jaana Ihalainen; Kaarle Lajunen

    2014-01-01

    The diversity and interdisciplinarity of the computer sciences, and the multiplicity of their uses in other sciences, make it difficult to define them and to prescribe how they should be performed. Furthermore, they also cause friction between computer scientists from different branches. Because of how they are structured, these science programs are criticized for not offering adequate methodological training, or a deep understanding of different research traditions. To collaborate on a solution, some have decided to...

  7. Partnership in Computational Science

    Energy Technology Data Exchange (ETDEWEB)

    Huray, Paul G.

    1999-02-24

    This is the final report for the "Partnership in Computational Science" (PICS) award in an amount of $500,000 for the period January 1, 1993 through December 31, 1993. A copy of the proposal with its budget is attached as Appendix A. This report first describes the consequent significance of the DOE award in building infrastructure of high performance computing in the Southeast and then describes the work accomplished under this grant and a list of publications resulting from it.

  8. Computational temporal ghost imaging

    CERN Document Server

    Devaux, Fabrice; Denis, Severine; Lantz, Eric

    2016-01-01

    We present a very simple device, inspired by computational ghost imaging, that allows the retrieval of a single non-reproducible, periodic or non-periodic, temporal signal. The reconstruction is performed by a single-shot, spatially multiplexed measurement of the spatial intensity correlations between computer-generated random images and the images modulated by the temporal signal, recorded and summed on a CMOS camera chip used with no temporal resolution.
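
    The correlation step behind computational ghost imaging is compact enough to show directly: the unknown signal is recovered by correlating each known random pattern with the single integrated ("bucket") value it produced. The sketch below is a generic one-dimensional illustration under our own assumptions, not a model of the authors' spatially multiplexed optical scheme.

        # Generic computational ghost imaging in 1-D (illustrative setup only).
        import numpy as np

        rng = np.random.default_rng(0)
        n, m = 64, 4000                                      # signal length, number of patterns
        signal = np.sin(np.linspace(0, 4 * np.pi, n)) ** 2   # "unknown" temporal signal

        patterns = rng.random((m, n))      # computer-generated random patterns
        buckets = patterns @ signal        # integrated measurements, no temporal resolution

        # Reconstruction: covariance between bucket values and each pattern position.
        recon = (buckets[:, None] * patterns).mean(axis=0) \
                - buckets.mean() * patterns.mean(axis=0)
        print(np.corrcoef(recon, signal)[0, 1])  # approaches 1 as m grows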

  9. Paraconsistent Computational Logic

    DEFF Research Database (Denmark)

    Jensen, Andreas Schmidt; Villadsen, Jørgen

    2012-01-01

    In classical logic everything follows from inconsistency and this makes classical logic problematic in areas of computer science where contradictions seem unavoidable. We describe a many-valued paraconsistent logic, discuss the truth tables and include a small case study.
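
    To give a flavour of what a many-valued paraconsistent truth table can look like, the sketch below implements a three-valued negation and conjunction in which a "both true and false" value does not let everything follow; the value set and tables are a common textbook choice, not necessarily the ones used in the paper.

        # Three-valued paraconsistent sketch; the tables are a common choice
        # (Logic-of-Paradox style), not necessarily the paper's logic.
        T, B, F = "true", "both", "false"   # B: both true and false
        RANK = {F: 0, B: 1, T: 2}

        def neg(a):
            return {T: F, B: B, F: T}[a]

        def conj(a, b):
            # Conjunction as the minimum in the order F < B < T.
            return min(a, b, key=lambda v: RANK[v])

        for a in (T, B, F):
            for b in (T, B, F):
                print(a, b, "->", conj(a, b))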

  10. Optics and Symbolic Computing

    Science.gov (United States)

    1987-03-01

    computation. The well-structured data formats of vectors, matrices, etc. used in numeric computing give way to data structures that can change their shapes... complex circuitry, especially to emulate and achieve capabilities typically associated with... if operation requires uniformity over the complete... point spread function (PSF) P as an MxN matrix of rank R, then it is possible to decompose matrix P into a product of two unitary matrices and a diagonal matrix
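
    The decomposition referred to at the end of this excerpt is the singular value decomposition. As a brief illustration, the NumPy sketch below factors a hypothetical PSF matrix into two unitary matrices and a diagonal matrix of singular values.

        # SVD of a hypothetical point spread function (PSF) matrix: P = U diag(S) Vt.
        import numpy as np

        rng = np.random.default_rng(0)
        P = rng.random((5, 4))                       # illustrative MxN PSF matrix

        U, S, Vt = np.linalg.svd(P, full_matrices=False)
        print(np.allclose(P, U @ np.diag(S) @ Vt))   # True: the product reconstructs P
        print(S)                                     # singular values; rank R = number of nonzeros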

  11. Reconfigurable environmentally adaptive computing

    Science.gov (United States)

    Coxe, Robin L. (Inventor); Galica, Gary E. (Inventor)

    2008-01-01

    Described are methods and apparatus, including computer program products, for reconfigurable environmentally adaptive computing technology. An environmental signal representative of an external environmental condition is received. A processing configuration is automatically selected, based on the environmental signal, from a plurality of processing configurations. A reconfigurable processing element is reconfigured to operate according to the selected processing configuration. In some examples, the environmental condition is detected and the environmental signal is generated based on the detected condition.
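
    The selection step the abstract describes can be pictured as a mapping from an environmental signal to one of several processing configurations. The sketch below is our own hypothetical rendering; the signal semantics, thresholds and configuration names are not from the patent.

        # Hedged sketch of environmentally adaptive configuration selection.
        # Signal semantics, thresholds and names are hypothetical.
        def select_configuration(signal: float) -> str:
            """Map a normalized environmental signal (0-1) to a configuration."""
            if signal > 0.8:
                return "radiation_hardened"
            if signal > 0.3:
                return "low_power"
            return "full_throughput"

        # A reconfigurable processing element would then be reprogrammed to
        # operate according to the selected configuration.
        print(select_configuration(0.9))  # -> radiation_hardened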

  12. CLOUD COMPUTING SECURITY

    Directory of Open Access Journals (Sweden)

    DANISH JAMIL,

    2011-04-01

    Full Text Available It is no secret that cloud computing is becoming more and more popular today and is ever increasing in popularity with large companies as they share valuable resources in a cost effective way. Due to this increasing demand for more clouds there is an ever growing threat of security becoming a major issue. This paper shall look at ways in which security threats can be a danger to cloud computing and how they can be avoided.

  13. Cloud computing strategies

    CERN Document Server

    Chorafas, Dimitris N

    2011-01-01

    A guide to managing cloud projects, Cloud Computing Strategies provides the understanding required to evaluate the technology and determine how it can be best applied to improve business and enhance your overall corporate strategy. Based on extensive research, it examines the opportunities and challenges that loom in the cloud. It explains exactly what cloud computing is, what it has to offer, and calls attention to the important issues management needs to consider before passing the point of no return regarding financial commitments.

  14. Professionalism in Computer Forensics

    Science.gov (United States)

    Irons, Alastair D.; Konstadopoulou, Anastasia

    The paper seeks to address the need to consider issues regarding professionalism in computer forensics, in order to allow the discipline to develop and to ensure its credibility from the differing perspectives of practitioners, the criminal justice system and the public. Professionalism in computer forensics needs to be examined and developed in order to promote the discipline and maintain its credibility.

  15. Recent computational chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Onishi, Taku [Department of Chemistry for Materials, and The Center of Ultimate Technology on nano-Electronics, Mie University (Japan); Center for Theoretical and Computational Chemistry, Department of Chemistry, University of Oslo (Norway)

    2015-12-31

    Thanks to earlier developments in quantum theory and calculation methods, we can now investigate quantum phenomena in real materials and molecules, and can design functional materials by computation. As limits and problems still exist in theory, cooperation between theory and computation is becoming more important for clarifying unknown quantum mechanisms and discovering more efficient functional materials; it is likely to become the next-generation standard. Finally, our theoretical methodology for boundary solids is introduced.

  16. Computing for calculus

    CERN Document Server

    Christensen, Mark J

    1981-01-01

    Computing for Calculus focuses on BASIC as the computer language used for solving calculus problems. This book discusses the input statement for numeric variables, advanced intrinsic functions, numerical estimation of limits, and linear approximations and tangents. The elementary estimation of areas, numerical and string arrays, line drawing algorithms, and the bisection and secant methods are also elaborated. This text likewise covers implicit functions and differentiation, upper and lower rectangular estimates, Simpson's rule and parabolic approximation, and interpolating polynomials. Other to
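
    Of the techniques listed above, the bisection method is the easiest to show compactly. The book works in BASIC; the sketch below expresses the same idea in Python for brevity.

        # Bisection method: find a root of f on [a, b] where f(a) and f(b) differ in sign.
        def bisect(f, a, b, tol=1e-10):
            fa = f(a)
            assert fa * f(b) < 0, "f must change sign on [a, b]"
            while b - a > tol:
                m = (a + b) / 2.0
                if fa * f(m) <= 0:
                    b = m               # root lies in [a, m]
                else:
                    a, fa = m, f(m)     # root lies in [m, b]
            return (a + b) / 2.0

        print(bisect(lambda x: x * x - 2.0, 0.0, 2.0))  # ~1.41421356 (sqrt 2)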

  17. Computing Borel's Regulator II

    CERN Document Server

    Choo, Zacky; Sánchez-García, Rubén J; Snaith, Victor P

    2009-01-01

    In our earlier article we described a power series formula for the Borel regulator evaluated on the odd-dimensional homology of the general linear group of a number field and, concentrating on dimension three for simplicity, described a computer algorithm which calculates the value to any chosen degree of accuracy. In this sequel we give an algorithm for the construction of the input homology classes and describe the results of one cyclotomic field computation.

  18. Quantum computing with trapped ions

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, R.J.

    1998-01-01

    The significance of quantum computation for cryptography is discussed. Following a brief survey of the requirements for quantum computational hardware, an overview of the ion trap quantum computation project at Los Alamos is presented. The physical limitations to quantum computation with trapped ions are analyzed and an assessment of the computational potential of the technology is made.

  19. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    Science.gov (United States)

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on the computer hardware achievement, computer anxiety and computer attitude of eighth-grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  20. Computability, complexity, and languages fundamentals of theoretical computer science

    CERN Document Server

    Davis, Martin D; Rheinboldt, Werner

    1983-01-01

    Computability, Complexity, and Languages: Fundamentals of Theoretical Computer Science provides an introduction to the various aspects of theoretical computer science. Theoretical computer science is the mathematical study of models of computation. This text is composed of five parts encompassing 17 chapters, and begins with an introduction to the use of proofs in mathematics and the development of computability theory in the context of an extremely simple abstract programming language. The succeeding parts demonstrate the performance of abstract programming language using a macro expa

  1. Computer-aided design and computer science technology

    Science.gov (United States)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  2. A Fundamental Tradeoff between Computation and Communication in Distributed Computing

    OpenAIRE

    Li, Songze; Maddah-Ali, Mohammad Ali; Yu, Qian; Avestimehr, A. Salman

    2016-01-01

    How can we optimally trade extra computing power to reduce the communication load in distributed computing? We answer this question by characterizing a fundamental tradeoff relationship between computation and communication in distributed computing, i.e., the two are inverse-linearly proportional to each other. More specifically, a general distributed computing framework, motivated by commonly used structures like MapReduce, is considered, where the goal is to compute $Q$ arbitrary output fun...
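
    For intuition, a frequently cited form of this inverse-linear tradeoff for a MapReduce-like system with K nodes gives a communication load of L(r) = (1/r)(1 - r/K) at computation load r (each map task replicated r times). The sketch below is our paraphrase of that result and should be checked against the paper.

        # Inverse-linear computation-communication tradeoff (paraphrased form;
        # verify against the paper): L(r) = (1/r) * (1 - r/K) for K nodes.
        K = 10
        for r in range(1, K + 1):
            L = (1.0 / r) * (1.0 - r / K)
            print(f"r = {r:2d}  ->  L(r) = {L:.4f}")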

  3. Parallel Computing in SCALE

    Energy Technology Data Exchange (ETDEWEB)

    DeHart, Mark D [ORNL; Williams, Mark L [ORNL; Bowman, Stephen M [ORNL

    2010-01-01

    The SCALE computational architecture has remained basically the same since its inception 30 years ago, although constituent modules and capabilities have changed significantly. This SCALE concept was intended to provide a framework whereby independent codes can be linked to provide a more comprehensive capability than possible with the individual programs - allowing flexibility to address a wide variety of applications. However, the current system was designed originally for mainframe computers with a single CPU and with significantly less memory than today's personal computers. It has been recognized that the present SCALE computation system could be restructured to take advantage of modern hardware and software capabilities, while retaining many of the modular features of the present system. Preliminary work is being done to define specifications and capabilities for a more advanced computational architecture. This paper describes the state of current SCALE development activities and plans for future development. With the release of SCALE 6.1 in 2010, a new phase of evolutionary development will be available to SCALE users within the TRITON and NEWT modules. The SCALE (Standardized Computer Analyses for Licensing Evaluation) code system developed by Oak Ridge National Laboratory (ORNL) provides a comprehensive and integrated package of codes and nuclear data for a wide range of applications in criticality safety, reactor physics, shielding, isotopic depletion and decay, and sensitivity/uncertainty (S/U) analysis. Over the last three years, since the release of version 5.1 in 2006, several important new codes have been introduced within SCALE, and significant advances applied to existing codes. Many of these new features became available with the release of SCALE 6.0 in early 2009. However, beginning with SCALE 6.1, a first generation of parallel computing is being introduced. In addition to near-term improvements, a plan for longer term SCALE enhancement

  4. Making It in Computer Sales.

    Science.gov (United States)

    Davidson, Robert L., III

    1987-01-01

    Discusses some of the possibilities for careers in computer sales. Describes some of the attributes of quality computer salespersons, as illustrated by interviews with two experts on computer sales. (TW)

  5. Coronary Computed Tomography Angiography (CTA)

    Science.gov (United States)

    Coronary computed tomography angiography (CCTA) ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ...

  6. Experiences with distributed computing for meteorological applications: grid computing and cloud computing

    OpenAIRE

    Oesterle, F.; Ostermann, S; R. Prodan; G. J. Mayr

    2015-01-01

    Experiences with three practical meteorological applications with different characteristics are used to highlight the core computer science aspects and applicability of distributed computing to meteorology. Through presenting cloud and grid computing this paper shows use case scenarios fitting a wide range of meteorological applications from operational to research studies. The paper concludes that distributed computing complements and extends existing high performance comput...

  7. Computer Use and Computer Anxiety in Older Korean Americans.

    Science.gov (United States)

    Yoon, Hyunwoo; Jang, Yuri; Xie, Bo

    2016-09-01

    Responding to the limited literature on computer use in ethnic minority older populations, the present study examined predictors of computer use and computer anxiety in older Korean Americans. Separate regression models were estimated for computer use and computer anxiety with the common sets of predictors: (a) demographic variables (age, gender, marital status, and education), (b) physical health indicators (chronic conditions, functional disability, and self-rated health), and (c) sociocultural factors (acculturation and attitudes toward aging). Approximately 60% of the participants were computer-users, and they had significantly lower levels of computer anxiety than non-users. A higher likelihood of computer use and lower levels of computer anxiety were commonly observed among individuals with younger age, male gender, advanced education, more positive ratings of health, and higher levels of acculturation. In addition, positive attitudes toward aging were found to reduce computer anxiety. Findings provide implications for developing computer training and education programs for the target population.

  8. Computer science a concise introduction

    CERN Document Server

    Sinclair, Ian

    2014-01-01

    Computer Science: A Concise Introduction covers the fundamentals of computer science. The book describes micro-, mini-, and mainframe computers and their uses; the ranges and types of computers and peripherals currently available; applications to numerical computation; and commercial data processing and industrial control processes. The functions of data preparation, data control, computer operations, applications programming, systems analysis and design, database administration, and network control are also encompassed. The book then discusses batch, on-line, and real-time systems; the basic

  9. `95 computer system operation project

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Taek; Lee, Hae Cho; Park, Soo Jin; Kim, Hee Kyung; Lee, Ho Yeun; Lee, Sung Kyu; Choi, Mi Kyung [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-12-01

    This report describes overall project work related to the operation of mainframe computers, the management of nuclear computer codes and the nuclear computer code conversion project. The results of the project are as follows: 1. The operation and maintenance of the three mainframe computers and other utilities. 2. The management of the nuclear computer codes. 3. The completion of the computer code conversion project. 26 tabs., 5 figs., 17 refs. (Author)

  10. Exercises in molecular computing.

    Science.gov (United States)

    Stojanovic, Milan N; Stefanovic, Darko; Rudchenko, Sergei

    2014-06-17

    CONSPECTUS: The successes of electronic digital logic have transformed every aspect of human life over the last half-century. The word "computer" now signifies a ubiquitous electronic device, rather than a human occupation. Yet evidently humans, large assemblies of molecules, can compute, and it has been a thrilling challenge to develop smaller, simpler, synthetic assemblies of molecules that can do useful computation. When we say that molecules compute, what we usually mean is that such molecules respond to certain inputs, for example, the presence or absence of other molecules, in a precisely defined but potentially complex fashion. The simplest way for a chemist to think about computing molecules is as sensors that can integrate the presence or absence of multiple analytes into a change in a single reporting property. Here we review several forms of molecular computing developed in our laboratories. When we began our work, combinatorial approaches to using DNA for computing were used to search for solutions to constraint satisfaction problems. We chose to work instead on logic circuits, building bottom-up from units based on catalytic nucleic acids, focusing on DNA secondary structures in the design of individual circuit elements, and reserving the combinatorial opportunities of DNA for the representation of multiple signals propagating in a large circuit. Such circuit design directly corresponds to the intuition about sensors transforming the detection of analytes into reporting properties. While this approach was unusual at the time, it has been adopted since by other groups working on biomolecular computing with different nucleic acid chemistries. We created logic gates by modularly combining deoxyribozymes (DNA-based enzymes cleaving or combining other oligonucleotides), in the role of reporting elements, with stem-loops as input detection elements. For instance, a deoxyribozyme that normally exhibits an oligonucleotide substrate recognition region is
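
    The sensor-as-logic-gate intuition in this account can be stated in a couple of lines: a two-input gate reports activity only for the input combination it was built to detect. The sketch below is an abstract rendering of a deoxyribozyme-based AND gate, our own illustration rather than a model of the chemistry.

        # Abstract rendering of a molecular AND gate: the reporting deoxyribozyme
        # is active only when both input oligonucleotides are present.
        def and_gate(input1_present: bool, input2_present: bool) -> bool:
            # Both stem-loop detection elements must bind their inputs.
            return input1_present and input2_present

        for i1 in (False, True):
            for i2 in (False, True):
                state = "cleaves substrate" if and_gate(i1, i2) else "inactive"
                print(i1, i2, "->", state)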

  11. Quantum Computational Cryptography

    Science.gov (United States)

    Kawachi, Akinori; Koshiba, Takeshi

    As computational approaches to classical cryptography have succeeded in establishing the foundation of network security, computational approaches even to quantum cryptography are promising, since quantum computational cryptography could offer richer applications than quantum key distribution alone. Our project focused especially on quantum one-wayness and quantum public-key cryptosystems. The one-wayness of functions (or permutations) is one of the most important notions in computational cryptography. First, we give an algorithmic characterization of quantum one-way permutations. In other words, we show a necessary and sufficient condition for quantum one-way permutations in terms of reflection operators. Second, we introduce a problem of distinguishing between two quantum states as a new underlying problem that is harder to solve than the graph automorphism problem. The new problem is a natural generalization of the distinguishability problem between two probability distributions, which is commonly used in computational cryptography. We show that the problem has several cryptographic properties that enable us to construct a quantum public-key cryptosystem, which is likely to withstand any attack of a quantum adversary.

  12. Optical computer motherboards

    Science.gov (United States)

    Jannson, Tomasz P.; Xu, Guoda; Bartha, John M.; Gruntman, Michael A.

    1997-09-01

    In this paper, we investigate the application of precision plastic optics in a communication/computer sub-system, such as a hybrid computer motherboard. We believe that using optical waveguides for next-generation computer motherboards can provide a high performance alternative to present multi-layer printed circuit motherboards. In response to this demand, we suggest our novel concept of a hybrid motherboard based on an internal-fiber-coupling (IFC) wavelength-division-multiplexing (WDM) optical backplane. The IFC/WDM backplane provides dedicated Tx/Rx connections, and applies low-cost, high-performance components, including CD LDs, GRIN plastic fibers, molding housing, and nonimaging optics connectors. Preliminary motherboard parameters are: speed 100 MHz/100 m, or 1 GHz/10 m; fiber loss approximately 0.01 dB/m; almost zero fan-out/fan-in optical power loss, and eight standard wavelength channels. The proposed hybrid computer motherboard, based on innovative optical backplane technology, should solve low-speed, low-parallelism bottlenecks in present electric computer motherboards.

  13. Petaflops computing: planning ahead

    Energy Technology Data Exchange (ETDEWEB)

    McGraw, J R

    1998-06-17

    This talk considers the problem of defining success criteria for petaflop computers. Current expectations for teraflop systems show an alarming acceleration of a trend we have seen for many years in high performance computers. Namely, it is becoming increasingly difficult to effectively use the computational capability of these machines. If this situation is not reversed quickly, the term "petaflop computer" may simply mean the next fastest computer that we cannot use. In many cases, we have some understanding of why we cannot achieve anywhere near the peak performance of these machines on real applications. Effective use of these resources is a highly complex optimization problem that must be solved over all of the different components of each application program. Given this complexity, it is the responsibility of our community to better quantify our progress in developing high performance systems with more meaningful metrics than simply "peak floating point operations per second." We need to develop metrics and tools that help us to enhance the end-to-end performance of solving large scientific applications on these advanced machines.

  14. Computer Registration Becoming Mandatory

    CERN Multimedia

    2003-01-01

    Following the decision by the CERN Management Board (see Weekly Bulletin 38/2003), registration of all computers connected to CERN's network will be enforced and only registered computers will be allowed network access. The implementation has started with the IT buildings, continues with building 40 and the Prevessin site (as of Tuesday 4th November 2003), and will cover the whole of CERN before the end of this year. We therefore strongly recommend that you register all your computers in CERN's network database (Ethernet and wireless cards) as soon as possible, without waiting for the access restriction to take force. This will allow you to access the network without interruption and help IT service providers to contact you in case of problems (security problems, viruses, etc.) • Users WITH a CERN computing account register at: http://cern.ch/register/ (CERN Intranet page) • Visitors WITHOUT a CERN computing account (e.g. short term visitors) register at: http://cern.ch/registerVisitorComp...

  15. Computer Registration Becoming Mandatory

    CERN Multimedia

    2003-01-01

    Following the decision by the CERN Management Board (see Weekly Bulletin 38/2003), registration of all computers connected to CERN's network will be enforced and only registered computers will be allowed network access. The implementation has started with the IT buildings, continues with building 40 and the Prevessin site (as of Tuesday 4th November 2003), and will cover the whole of CERN before the end of this year. We therefore strongly recommend that you register all your computers in CERN's network database, including all network access cards (Ethernet AND wireless), as soon as possible, without waiting for the access restriction to take force. This will allow you to access the network without interruption and help IT service providers to contact you in case of problems (e.g. security problems, viruses, etc.) Users WITH a CERN computing account register at: http://cern.ch/register/ (CERN Intranet page) Visitors WITHOUT a CERN computing account (e.g. short term visitors) register at: http://cern.ch/regis...

  16. Perceptually-Inspired Computing

    Directory of Open Access Journals (Sweden)

    Ming Lin

    2015-08-01

    Full Text Available Human sensory systems allow individuals to see, hear, touch, and interact with the surrounding physical environment. Understanding human perception and its limits enables us to better exploit the psychophysics of human perceptual systems to design more efficient, adaptive algorithms and develop perceptually-inspired computational models. In this talk, I will survey some recent efforts on perceptually-inspired computing with applications to crowd simulation and multimodal interaction. In particular, I will present data-driven personality modeling based on the results of user studies, example-guided physics-based sound synthesis using auditory perception, as well as perceptually-inspired simplification for multimodal interaction. These perceptually guided principles can be used to accelerate multi-modal interaction and visual computing, thereby creating more natural human-computer interaction and providing more immersive experiences. I will also present their use in interactive applications for entertainment, such as video games, computer animation, and shared social experience. I will conclude by discussing possible future research directions.

  17. Fast Local Computation Algorithms

    CERN Document Server

    Rubinfeld, Ronitt; Vardi, Shai; Xie, Ning

    2011-01-01

    For input $x$, let $F(x)$ denote the set of outputs that are the "legal" answers for a computational problem $F$. Suppose $x$ and members of $F(x)$ are so large that there is not time to read them in their entirety. We propose a model of local computation algorithms which, for a given input $x$, support queries by a user to values of specified locations $y_i$ in a legal output $y \in F(x)$. When more than one legal output $y$ exists for a given $x$, the local computation algorithm should output in a way that is consistent with at least one such $y$. Local computation algorithms are intended to distill the common features of several concepts that have appeared in various algorithmic subfields, including local distributed computation, local algorithms, locally decodable codes, and local reconstruction. We develop a technique, based on known constructions of small sample spaces of $k$-wise independent random variables and Beck's analysis in his algorithmic approach to the Lovász Local Lemma, which und...

  18. Geospatial computing in mobile devices

    CERN Document Server

    Chen, Ruizhi

    2014-01-01

    Geospatial computing includes utilizing computing devices and sensors to acquire, process, analyze, manage, and visualize geospatial data, which users can then interact with via a large variety of smart geospatial applications. Geospatial computing is a computationally demanding task in terms of computation power, data storage capacity, and memory space. Therefore, it has primarily been performed on non-mobile computers. Recent developments allow smartphones to meet many of the demanding requirements of geospatial computing. This book addresses the topic of geospatial computing in smartphones, i

  19. Computational problems in engineering

    CERN Document Server

    Mladenov, Valeri

    2014-01-01

    This book provides readers with modern computational techniques for solving a variety of problems from electrical, mechanical, civil and chemical engineering. Mathematical methods are presented in a unified manner, so they can be applied consistently to problems in applied electromagnetics, strength of materials, fluid mechanics, heat and mass transfer, environmental engineering, biomedical engineering, signal processing, automatic control and more. • Features contributions from distinguished researchers on significant aspects of current numerical methods and computational mathematics; • Presents actual results and innovative methods that provide numerical solutions, while minimizing computing times; • Includes new and advanced methods and modern variations of known techniques that can solve difficult scientific problems efficiently.

  20. Computational Protein Design

    DEFF Research Database (Denmark)

    Johansson, Kristoffer Enøe

    Proteins are the major functional group of molecules in biology. The impact of protein science on medicine and chemical production is rapidly increasing. However, the greatest potential remains to be realized. The field of protein design has advanced computational modeling from a tool of support...... to a central method that enables new developments. For example, novel enzymes with functions not found in natural proteins have been de novo designed to give enough activity for experimental optimization. This thesis presents the current state-of-the-art within computational design methods together...... with a novel method based on probability theory. With the aim of assembling a complete pipeline for protein design, this work touches upon several aspects of protein design. The presented work is the computational half of a design project where the other half is dedicated to the experimental part