WorldWideScience

Sample records for cii computers

  1. Intensity Mapping of the [CII] Fine Structure Line during the Epoch of Reionization

    Science.gov (United States)

    Gong, Yan; Cooray, A.; Silva, M.; Santos, M. G.; Bock, J.; Bradford, M.; Zemcov, M.

    2012-01-01

    The atomic CII fine-structure line is one of the brightest lines in a typical star-forming galaxy spectrum, with a luminosity 0.1% to 1% of the bolometric luminosity. It is potentially a reliable tracer of the dense gas distribution at high redshifts and could provide an additional probe of the era of reionization. By taking into account the spontaneous, stimulated and collisional emission of the CII line, we calculate the spin temperature and the mean intensity as a function of redshift. When averaged over a cosmologically large volume, we find that the CII emission from ionized carbon in individual galaxies is larger than the signal generated by carbon in the intergalactic medium (IGM). Assuming that the CII luminosity is proportional to the carbon mass in dark matter halos, we also compute the power spectrum of the CII line intensity at various redshifts. In order to avoid contamination from CO rotational lines at low redshift when targeting a CII survey at high redshifts, we propose the cross-correlation of CII and 21-cm line emission from high redshifts. To explore the detectability of the CII signal from reionization, we also evaluate the expected errors on the CII power spectrum and CII-21 cm cross power spectrum based on the design of future millimeter surveys. We note that the CII-21 cm cross power spectrum contains interesting features that capture physics during reionization, including the ionized bubble sizes and the mean ionization fraction, which are challenging to measure from 21-cm data alone. We propose an instrumental concept for the reionization CII experiment targeting the frequency range of 200 to 300 GHz with 1, 3 and 10 meter apertures and a bolometric spectrometer array with 64 independent spectral pixels with about 20,000 bolometers.
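    As a concrete illustration of the cross-correlation idea above, the cross-power spectrum of two sky maps of the same field (say, a [CII] intensity map and a 21-cm map) can be estimated with a standard FFT sketch. This is a minimal illustrative implementation; the function name, binning scheme, and normalization convention are assumptions, not taken from the paper.

```python
import numpy as np

def cross_power_2d(map_a, map_b, pix, n_bins=12):
    """Azimuthally averaged cross-power spectrum of two equally shaped 2D maps.

    pix is the pixel size (radians); k is returned in rad^-1.
    With map_a == map_b this reduces to the (non-negative) auto-power spectrum.
    """
    ny, nx = map_a.shape
    fa = np.fft.fft2(map_a - map_a.mean())
    fb = np.fft.fft2(map_b - map_b.mean())
    # Per-mode cross power with a conventional area normalization.
    p2d = (fa * np.conj(fb)).real * pix**2 / (nx * ny)
    # Radial wavenumber of every Fourier mode.
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=pix)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=pix)
    kmag = np.hypot(kx[None, :], ky[:, None])
    # Average modes into linear k bins.
    edges = np.linspace(0.0, kmag.max(), n_bins + 1)
    idx = np.clip(np.digitize(kmag.ravel(), edges) - 1, 0, n_bins - 1)
    sums = np.bincount(idx, weights=p2d.ravel(), minlength=n_bins)
    counts = np.bincount(idx, minlength=n_bins)
    k_mid = 0.5 * (edges[:-1] + edges[1:])
    return k_mid, sums / np.maximum(counts, 1)

# Sanity check on a random map: the auto-power of a map with itself.
rng = np.random.default_rng(0)
demo = rng.normal(size=(64, 64))
k, p_auto = cross_power_2d(demo, demo, pix=1.0)
```

    In a real analysis the interesting quantity is the cross power between the two different tracers, which suppresses contaminants (such as low-redshift CO interlopers) that are uncorrelated with the 21-cm field.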

  2. The [CII] 158 μm line emission in high-redshift galaxies

    Science.gov (United States)

    Lagache, G.; Cousin, M.; Chatzikos, M.

    2018-02-01

    Gas is a crucial component of galaxies, providing the fuel to form stars, and it is impossible to understand the evolution of galaxies without knowing their gas properties. The [CII] fine structure transition at 158 μm is the dominant cooling line of cool interstellar gas, and is the brightest of emission lines from star forming galaxies from FIR through metre wavelengths, almost unaffected by attenuation. With the advent of ALMA and NOEMA, capable of detecting [CII]-line emission in high-redshift galaxies, there has been a growing interest in using the [CII] line as a probe of the physical conditions of the gas in galaxies, and as a star formation rate (SFR) indicator at z ≥ 4. In this paper, we have used a semi-analytical model of galaxy evolution (G.A.S.) combined with the photoionisation code CLOUDY to predict the [CII] luminosity of a large number of galaxies (25 000 at z ≃ 5) at 4 ≤ z ≤ 8. We assumed that the [CII]-line emission originates from photo-dominated regions. At such high redshift, the CMB represents a strong background and we discuss its effects on the luminosity of the [CII] line. We studied the L[CII]-SFR and L[CII]-Zg relations and show that they do not strongly evolve with redshift from z = 4 to z = 8. Galaxies with higher [CII] luminosities tend to have higher metallicities and higher SFRs, but the correlations are very broad, with a scatter of about 0.5 and 0.8 dex for L[CII]-SFR and L[CII]-Zg, respectively. Our model reproduces the L[CII]-SFR relations observed in high-redshift star-forming galaxies, with [CII] luminosities lower than expected from local L[CII]-SFR relations. Accordingly, the local observed L[CII]-SFR relation does not apply at high-z (z ≳ 5), even when CMB effects are ignored. Our model naturally produces the [CII] deficit (i.e. the decrease of L[CII]/LIR with LIR), which appears to be strongly correlated with the intensity of the radiation field in our simulated galaxies. We then predict the

  3. Autoimmune severe hypertriglyceridemia induced by anti-apolipoprotein C-II antibody.

    Science.gov (United States)

    Yamamoto, Hiroyasu; Tanaka, Minoru; Yoshiga, Satomi; Funahashi, Tohru; Shimomura, Iichiro; Kihara, Shinji

    2014-05-01

    Among type V hyperlipoproteinemias, only one-fourth of the patients have genetic defects in lipoprotein lipase (LPL) or in its associated molecules; the exact mechanism in other patients is usually unknown. The aim of the study was to report a case of severe hypertriglyceridemia induced by anti-apolipoprotein (apo) C-II autoantibody and to clarify its pathogenesis. A 29-year-old Japanese woman presented with severe persistent hypertriglyceridemia since the age of 20 years. The past history was negative for acute pancreatitis, eruptive xanthomas, or lipemia retinalis. LPL mass and activities were normal. Plasma apo C-II levels were extremely low, but no mutation was observed in APOC2. Apo C-II protein was detected in the serum by immunoprecipitation and Western blotting. Large amounts of IgG and IgM were incorporated with apo C-II protein coimmunoprecipitated by anti-apo C-II antibody. IgG, but not IgM, purified from the serum prevented interaction of apo C-II with lipid substrate and diminished LPL hydrolysis activity. We identified anti-apo C-II antibody in a myeloma-unrelated severe hypertriglyceridemic patient. In vitro analysis confirmed that the autoantibody disrupted the interaction between apo C-II and lipid substrate, suggesting the etiological role of anti-apo C-II antibody in severe hypertriglyceridemia in this patient.

  4. Rapid radioimmunoassay of human apolipoproteins C-II and C-III

    Energy Technology Data Exchange (ETDEWEB)

    Gustafson, S; Oestlund-Lindqvist, A M; Vessby, B [Uppsala Univ. (Sweden)]

    1984-06-01

    Apolipoprotein (apo) C-II is an activator of lipoprotein lipase, while apo C-III has the ability to inhibit apo C-II activated lipolysis. In order to study further the relationship between lipoprotein lipase mediated hydrolysis and the serum concentrations of apo C-II and apo C-III, radioimmunoassays for these apolipoproteins have been developed. Formalin-treated Staphylococcus aureus Cowan I was used for immunoprecipitation and was shown to give rapid uptake of immune complexes that could easily be harvested by centrifugation. The assays were shown to be sensitive (10 μg/l), specific, precise (inter- and intra-assay coefficients of variation below 10%), rapid (completed in less than 6 h) and simple to perform. Delipidation of serum and lipoproteins had no effect on the results, indicating that the immunologically active sites of apo C-II and apo C-III are exposed to the aqueous environment under assay conditions. Serum apo C-II and apo C-III levels of normolipidaemic subjects were approximately 25 mg/l and 110 mg/l, respectively. Highly significant positive correlations were found between VLDL apo C-II and VLDL apo C-III, respectively, and VLDL triglycerides, VLDL cholesterol and total serum TG. There was also a highly significant correlation between the HDL cholesterol concentration and the HDL apo C-III concentration.
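    The inter- and intra-assay coefficients of variation quoted above are simple ratios of spread to mean over replicate measurements. A minimal sketch with hypothetical replicate values (illustrative numbers, not data from the study):

```python
import statistics

def assay_cv(replicates):
    """Coefficient of variation in percent: 100 * SD / mean."""
    return 100.0 * statistics.stdev(replicates) / statistics.fmean(replicates)

# Hypothetical apo C-II readings (mg/l): one serum sample run five times within
# a single assay, and the same sample run once in each of five separate assays.
intra = [24.1, 25.3, 24.8, 25.0, 24.6]
inter = [23.5, 26.0, 24.9, 25.8, 24.2]
cv_intra = assay_cv(intra)   # within-assay scatter
cv_inter = assay_cv(inter)   # between-assay scatter, typically larger
```

    An assay meeting the paper's precision criterion would show both values below 10%.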

  5. [Cii] emission from L1630 in the Orion B molecular cloud.

    Science.gov (United States)

    Pabst, C H M; Goicoechea, J R; Teyssier, D; Berné, O; Ochsendorf, B B; Wolfire, M G; Higgins, R D; Riquelme, D; Risacher, C; Pety, J; Le Petit, F; Roueff, E; Bron, E; Tielens, A G G M

    2017-10-01

    L1630 in the Orion B molecular cloud, which includes the iconic Horsehead Nebula, illuminated by the star system σ Ori, is an example of a photodissociation region (PDR). In PDRs, stellar radiation impinges on the surface of dense material, often a molecular cloud, thereby inducing a complex network of chemical reactions and physical processes. Observations toward L1630 allow us to study the interplay between stellar radiation and a molecular cloud under relatively benign conditions, that is, intermediate densities and an intermediate UV radiation field. Contrary to the well-studied Orion Molecular Cloud 1 (OMC1), which hosts much harsher conditions, L1630 has little star formation. Our goal is to relate the [Cii] fine-structure line emission to the physical conditions predominant in L1630 and compare it to studies of OMC1. The [Cii] 158 μm line emission of L1630 around the Horsehead Nebula, an area of 12' × 17', was observed using the upgraded German Receiver for Astronomy at Terahertz Frequencies (upGREAT) onboard the Stratospheric Observatory for Infrared Astronomy (SOFIA). Of the [Cii] emission from the mapped area, 95% (13 L⊙) originates from the molecular cloud; the adjacent Hii region contributes only 5%, that is, 1 L⊙. From comparison with other data (CO(1-0)-line emission, far-infrared (FIR) continuum studies, emission from polycyclic aromatic hydrocarbons (PAHs)), we infer a gas density of the molecular cloud of n_H ∼ 3 × 10³ cm⁻³, with surface layers, including the Horsehead Nebula, having a density of up to n_H ∼ 4 × 10⁴ cm⁻³. The temperature of the surface gas is T ∼ 100 K. The average [Cii] cooling efficiency within the molecular cloud is 1.3 × 10⁻². The fraction of the mass of the molecular cloud within the studied area that is traced by [Cii] is only 8%. Our PDR models are able to reproduce the FIR-[Cii] correlations and also the CO(1-0)-[Cii] correlations. Finally, we compare our results on the heating efficiency of the

  6. GOT C+: Galactic Plane Survey of the 1.9 THz [CII] Line

    Science.gov (United States)

    Langer, William

    2012-01-01

    The ionized carbon [CII] 1.9 THz fine structure line is a major gas coolant in the interstellar medium (ISM) and controls the thermal conditions in diffuse gas clouds and Photodissociation Regions (PDRs). The [CII] line is also an important tracer of the atomic gas and atomic to molecular transition in diffuse clouds throughout the Galaxy. I will review some of the results from the recently completed Galactic Observations of Terahertz C+ (GOT C+) survey. This Herschel Open Time Key Project is a sparse, but uniform volume sample survey of [CII] line emission throughout the Galactic disk using the HIFI heterodyne receiver. HIFI observations, with their high spectral resolution, isolate and locate individual clouds in the Galaxy and provide excitation information on the gas. I will present [CII] position-velocity maps that reveal the distribution and motion of the clouds in the inner Galaxy and discuss results on the physical properties of the gas using spectral observations of [CII] and ancillary HI and 12CO, 13CO, and C18O J=1-0 data. The [CII] emission is also a useful tracer of the "Dark H2 Gas", and I will discuss its distribution in a sample of interstellar clouds. This research was conducted at the Jet Propulsion Laboratory, California Institute of Technology under contract with the National Aeronautics and Space Administration.

  7. Selected issues of the universal communication environment implementation for CII standard

    Science.gov (United States)

    Zagoździńska, Agnieszka; Poźniak, Krzysztof T.; Drabik, Paweł K.

    2011-10-01

    In the contemporary FPGA market there is a wide assortment of structures, integrated development environments, and boards from different producers. This variety allows designers to fit resources to individual requirements, but it also creates a need to standardize projects so that they remain useful in research laboratories equipped with tools from different producers. The proposed solution is CII standardization of VHDL components. This paper specifies a universal communication environment for the CII standard. The link can be used in different FPGA structures, and its implementation enables object-oriented VHDL programming with the use of CII standardization. The environment as a whole comprises an FPGA part and PC software. The paper describes selected issues of the FPGA environment, including specific solutions that enable its use in structures from different producers. The flexibility of transmitting data of different sizes with CII is presented. The specified tool makes it possible to exploit the full variety of FPGA structures and to design faster and more effectively.

  8. [CII] at 1 < z < 2 Universe with ZEUS (1 and 2)

    Science.gov (United States)

    Ferkinhoff, Carl; Hailey-Dunsheath, S.; Nikola, T.; Oberst, T.; Parshley, S.; Stacey, G.; Benford, D.; staguhn, J.

    2010-01-01

    We report the detection of the [CII] 158 micron fine structure line from six submillimeter galaxies with redshifts between 1.12 and 1.73. This more than doubles the total number of [CII] 158 micron detections reported from high redshift sources. These observations were made with the Redshift (z) and Early Universe Spectrometer (ZEUS) at the Caltech Submillimeter Observatory on Mauna Kea, Hawaii between December 2006 and March 2009. ZEUS is a background limited submm echelle grating spectrometer (Hailey-Dunsheath 2009). Currently we are constructing ZEUS-2. This new instrument will utilize the same grating but will feature a two dimensional transition-edge sensed bolometer array with a SQUID multiplexing readout system, enabling simultaneous background limited observations in the 200, 340, 450 and 650 micron telluric windows. ZEUS-2 will allow for long slit imaging spectroscopy in nearby galaxies and a [CII] survey from z ≈ 0.25 to 2.5.

  9. PROTECTING CRITICAL DATABASES – TOWARDS A RISK-BASED ASSESSMENT OF CRITICAL INFORMATION INFRASTRUCTURES (CIIS) IN SOUTH AFRICA

    Directory of Open Access Journals (Sweden)

    Mzukisi N Njotini

    2013-04-01

    South Africa has made great strides towards protecting critical information infrastructures (CIIs). For example, South Africa recognises the significance of safeguarding places or areas that are essential to the national security of South Africa or the economic and social well-being of South African citizens. For this reason South Africa has established mechanisms to assist in preserving the integrity and security of CIIs. The measures provide inter alia for the identification of CIIs; the registration of the full names, addresses and contact details of the CII administrators (the persons who manage CIIs); the identification of the location(s) of CIIs or their component parts; and the outlining of the general descriptions of information or data stored in CIIs. It is argued that the measures to protect CIIs in South Africa are inadequate. In particular, the measures rely on a one-size-fits-all approach to identify and classify CIIs. For this reason the South African measures are likely to lead to the adoption of a paradigm that considers every infrastructure, data or database, regardless of its significance or importance, to be key or critical.

  10. Characterizing the Multi-Phase Origin of the [CII] Emission in M101 and NGC 6946

    Science.gov (United States)

    Tarantino, Elizabeth; Bolatto, Alberto; Herrera-Camus, Rodrigo

    2018-01-01

    The bright far-infrared line [CII] is a dominant cooling channel of the neutral interstellar medium (ISM) and is a tracer of star formation. However, [CII] can be excited in different environments of the ISM, such as in dense photodissociation regions (PDRs), the cold/warm neutral medium (CNM/WNM), and the warm ionized medium (WIM). Separating the [CII] emission into its multiple components is vital for understanding star formation and for using [CII] as a star formation tracer. We present spectrally resolved SOFIA/GREAT data of the 158 μm [CII] emission, as well as ancillary HI and CO 2-1 data, to disentangle the multiple phases of the ISM. We use 18 pointings that sample the range of different environments present in these galaxies, including star formation activity, metallicity, radiation field strength, and gas content. We find that on average the [CII] is more associated with the dense CO gas coming from PDRs than the neutral medium, consistent with other results in the literature. Additionally, the [CII] observations allow us to access the “CO-faint” molecular gas in regions that have too low a metallicity to produce CO. This adds to the small number of studies that have explored this “CO-faint” regime.

  11. Activation of catalase activity by a peroxisome-localized small heat shock protein Hsp17.6CII.

    Science.gov (United States)

    Li, Guannan; Li, Jing; Hao, Rong; Guo, Yan

    2017-08-20

    Plant catalases are important antioxidant enzymes and are indispensable for plants to cope with adverse environmental stresses. However, little is known about how catalase activity is regulated, especially at the organelle level. In this study, we identified that the small heat shock protein Hsp17.6CII (AT5G12020) interacts with and activates catalases in the peroxisome of Arabidopsis thaliana. Although Hsp17.6CII is classified into the cytosol-located small heat shock protein subfamily, we found that Hsp17.6CII is located in the peroxisome. Moreover, Hsp17.6CII contains a novel non-canonical peroxisome targeting signal 1 (PTS1), QKL, 16 amino acids upstream from the C-terminus. The QKL signal peptide can partially localize GFP to the peroxisome, and mutations in the tripeptide abolish this activity. In vitro catalase activity and holdase activity assays showed that Hsp17.6CII increases CAT2 activity and prevents it from thermal aggregation. These results indicate that Hsp17.6CII is a peroxisome-localized catalase chaperone. Overexpression of Hsp17.6CII conferred enhanced catalase activity and tolerance to abiotic stresses in Arabidopsis. Interestingly, overexpression of Hsp17.6CII in catalase-deficient mutants, nca1-3 and cat2 cat3, failed to rescue their stress-sensitive phenotypes and catalase activity, suggesting that Hsp17.6CII-mediated stress response is dependent on NCA1 and catalase activity. Overall, we identified a novel peroxisome-located catalase chaperone that is involved in plant abiotic stress resistance by activating catalase activity.

  12. Apolipoprotein C-II and C-III metabolism in a kindred of familial hypobetalipoproteinemia

    International Nuclear Information System (INIS)

    Malmendier, C.L.; Delcroix, C.; Lontie, J.F.; Dubois, D.Y.

    1991-01-01

    Three affected members of a kindred with asymptomatic hypobetalipoproteinemia (HBL) showed low levels of triglycerides, low-density lipoprotein (LDL)-cholesterol, and apolipoproteins (apo) B, C-II, and C-III. Turnover of iodine-labeled apo C-II and apo C-III associated in vitro with plasma lipoproteins was studied after intravenous injection. Radioactivity in plasma and lipoproteins (95% recovered in the high-density lipoprotein [HDL] density range) and in 24-hour urine samples was observed for 16 days. A parallelism of the slowest slopes of the plasma decay curves was observed between apo C-II and apo C-III, indicating a partially common catabolic route. The urine/plasma radioactivity ratio (U/P) varied with time, suggesting heterogeneity of metabolic pathways. A new compartmental model using the SAAM program was built, not only fitting plasma and urine data simultaneously, but also taking into account the partially common metabolism of lipoprotein particles (LP) containing apo C-II and apo C-III. The low apo C-II and C-III plasma concentrations observed in HBL compared with normal resulted from both an increased catabolism and a reduced synthesis, these changes being more marked for apo C-III. The modifications in the rate constants of the different pathways calculated from the new model are in favor of an increased direct removal of particles following the fast pathway, likely in the very-low-density lipoprotein (VLDL) density range.
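    The "slowest slopes" of tracer decay curves like those described above are conventionally read off by log-linear regression of the late-time points, after the fast pool has emptied. A minimal sketch on synthetic biexponential data (the rate constants and pool sizes are made up for illustration, not fitted values from the study):

```python
import numpy as np

def terminal_rate(t, activity, t_min):
    """Terminal (slowest) rate constant from a log-linear fit of points with t >= t_min."""
    sel = t >= t_min
    slope, _ = np.polyfit(t[sel], np.log(activity[sel]), 1)
    return -slope

# Synthetic plasma curve: fast pool (k = 1.5 /day) plus slow pool (k = 0.12 /day).
t = np.linspace(0.0, 16.0, 33)                        # days after injection
activity = 70.0 * np.exp(-1.5 * t) + 30.0 * np.exp(-0.12 * t)
k_slow = terminal_rate(t, activity, t_min=6.0)        # fast pool is negligible by day 6
```

    Two tracers whose curves yield the same terminal rate constant appear "parallel" on a semi-log plot, which is the observation used here to infer a partially shared catabolic route.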

  13. Toxicity of the bacteriophage λ cII gene product to Escherichia coli arises from inhibition of host cell DNA replication

    International Nuclear Information System (INIS)

    Kedzierska, Barbara; Glinkowska, Monika; Iwanicki, Adam; Obuchowski, Michal; Sojka, Piotr; Thomas, Mark S.; Wegrzyn, Grzegorz

    2003-01-01

    The bacteriophage λ cII gene codes for a transcriptional activator protein which is a crucial regulator at the stage of the 'lysis-versus-lysogeny' decision during phage development. The CII protein is highly toxic to the host, Escherichia coli, when overproduced. However, the molecular mechanism of this toxicity is not known. Here we demonstrate that DNA synthesis, but not total RNA synthesis, is strongly inhibited in cII-overexpressing E. coli cells. The toxicity was also observed when the transcriptional stimulator activity of CII was abolished either by a point mutation in the cII gene or by a point mutation, rpoA341, in the gene coding for the RNA polymerase α subunit. Moreover, inhibition of cell growth, caused by both wild-type and mutant CII proteins in either rpoA + or rpoA341 hosts, could be relieved by overexpression of the E. coli dnaB and dnaC genes. In vitro replication of an oriC-based plasmid DNA was somewhat impaired by the presence of CII, and several CII-resistant E. coli strains contain mutations near dnaC. We conclude that the DNA replication machinery may be a target for the toxic activity of CII.

  14. Computational Intelligence in Information Systems Conference

    CERN Document Server

    Au, Thien-Wan; Omar, Saiful

    2017-01-01

    This book constitutes the Proceedings of the Computational Intelligence in Information Systems conference (CIIS 2016), held in Brunei, November 18–20, 2016. The CIIS conference provides a platform for researchers to exchange the latest ideas and to present new research advances in general areas related to computational intelligence and its applications. The 26 revised full papers presented in this book have been carefully selected from 62 submissions. They cover a wide range of topics and application areas in computational intelligence and informatics.

  15. GOT C+ Survey of [CII] 158 μm Emission: Atomic to Molecular Cloud Transitions in the Inner Galaxy

    Science.gov (United States)

    Velusamy, T.; Langer, W. D.; Willacy, K.; Pineda, J. L.; Goldsmith, P. F.

    2013-03-01

    We present the results of the distribution of CO-dark H2 gas in a sample of 2223 interstellar clouds in the inner Galaxy (l=-90° to +57°) detected in the velocity resolved [CII] spectra observed in the GOT C+ survey using the Herschel HIFI. We analyze the [CII] intensities along with the ancillary HI, 12CO and 13CO data for each cloud to determine their evolutionary state and to derive the H2 column densities in the C+ and C+/CO transition layers in the cloud. We discuss the overall Galactic distribution of the [CII] clouds and their properties as a function of Galactic radius. GOT C+ results on the global distribution of [CII] clouds and CO-dark H2 gas trace the FUV intensity and star formation rate in the Galactic disk.

  16. KirCII - a promising tool for polyketide diversification

    DEFF Research Database (Denmark)

    Musiol-Kroll, Ewa Maria; Härtner, Thomas; Kulik, Andreas

    2014-01-01

    Kirromycin is produced by Streptomyces collinus Tü 365. This compound is synthesized by a large assembly line of type I polyketide synthases and non-ribosomal peptide synthetases (PKS I/NRPS), encoded by the genes kirAI-kirAVI and kirB. The PKSs KirAI-KirAV have no acyltransferase domains integra...... introducing the non-native substrates in an in vivo context. Thus, KirCII represents a promising tool for polyketide diversification....

  17. Galactic Observations of Terahertz C+ (GOT C+): [CII] Detection of Warm "Dark Gas" in the ISM

    Science.gov (United States)

    Langer, W. D.; Velusamy, T.; Pineda, J.; Goldsmith, P.; Li, D.; Yorke, H. W.

    2011-11-01

    The Herschel HIFI Key Program, Galactic Observations of Terahertz C+ (GOT C+) is a survey of [CII] 1.9 THz emission throughout the Galaxy. Comparison of the first results of this survey with HI and CO isotopomer emission reveals excess [CII] emission beyond that expected from HI and CO layers alone, and is best explained as coming from a hidden layer of H2 gas, the so-called ISM "dark gas".

  18. Host regulation of lysogenic decision in bacteriophage lambda: transmembrane modulation of FtsH (HflB), the cII degrading protease, by HflKC (HflA).

    Science.gov (United States)

    Kihara, A; Akiyama, Y; Ito, K

    1997-05-27

    The cII gene product of bacteriophage lambda is unstable and required for the establishment of lysogenization. Its intracellular amount is important for the decision between lytic growth and lysogenization. Two genetic loci of Escherichia coli are crucial for these commitments of the infecting lambda genome. One of them, hflA, encodes the HflKC membrane protein complex, which has been believed to be a protease degrading the cII protein. However, both its absence and its overproduction stabilized cII in vivo, and the proposed serine protease-like sequence motif in HflC was dispensable for the lysogenization control. Moreover, the HflKC protein was found to reside on the periplasmic side of the plasma membrane. In contrast, the other host gene, ftsH (hflB), encoding an integral membrane ATPase/protease, is positively required for degradation of cII, since loss of its function stabilized cII and its overexpression accelerated cII degradation. In vitro, purified FtsH catalyzed ATP-dependent proteolysis of cII and HflKC antagonized the FtsH action. These results, together with our previous finding that FtsH and HflKC form a complex, suggest that FtsH is the cII degrading protease and HflKC is a modulator of the FtsH function. We propose that this transmembrane modulation differentiates the FtsH actions on different substrate proteins such as the membrane-bound SecY protein and the cytosolic cII protein. This study necessitates a revision of the prevailing view about the host control over the lambda lysogenic decision.

  19. Dielectronic recombination rate coefficients to the excited states of CII from CIII

    International Nuclear Information System (INIS)

    Kato, Takako; Safronova, U.; Ohira, Mituhiko.

    1996-02-01

    Energy levels, radiative transition probabilities and autoionization rates for CII including 1s²2l2l'nl'' (n = 2-6, l'' ≤ (n-1)) states were calculated using the multi-configurational Hartree-Fock (Cowan code) method. Autoionizing levels above three thresholds, 1s²2s²(¹S), 1s²2s2p(³P) and 1s²2s2p(¹P), were considered. Branching ratios related to the first threshold and the intensity factor were calculated for satellite lines of the CII ion. The dielectronic recombination rate coefficients to the excited states for n = 2-6 are calculated with these atomic data. The rate coefficients are fitted to an analytical formula and the fit parameters are given. The values for excited states higher than n = 6 are extrapolated and the total dielectronic recombination rate coefficients are derived. The effective recombination rate coefficients for different electron densities are also derived. (author)
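    Fitted dielectronic recombination rates of this kind are commonly expressed in a Burgess-type analytic form, α(T) = T^(-3/2) Σᵢ cᵢ exp(−Eᵢ/T). The sketch below evaluates such a fit; the coefficients and energies are placeholders for illustration, not the fit parameters tabulated in this paper.

```python
import numpy as np

def dr_rate(T, c, E):
    """Burgess-type fit: alpha(T) = T**-1.5 * sum_i c_i * exp(-E_i / T).

    T is the electron temperature in eV; the c_i carry whatever units are
    needed to make alpha a rate coefficient (e.g. cm^3 s^-1 eV^1.5).
    """
    T = np.asarray(T, dtype=float)
    return T**-1.5 * sum(ci * np.exp(-Ei / T) for ci, Ei in zip(c, E))

# Placeholder fit parameters for two resonance groups (illustrative only).
c = [1.0e-8, 5.0e-8]         # strength coefficients
E = [0.5, 8.0]               # characteristic energies in eV
T = np.logspace(-1, 2, 50)   # 0.1 eV to 100 eV
alpha = dr_rate(T, c, E)
```

    The exponential factors suppress the rate well below each resonance energy, which is why each group of autoionizing levels contributes over a characteristic temperature window.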

  20. Isolation of Escherichia coli rpoB mutants resistant to killing by lambda cII protein and altered in pyrE gene attenuation

    DEFF Research Database (Denmark)

    Hammer, Karin; Jensen, Kaj Frank; Poulsen, Peter

    1987-01-01

    Escherichia coli mutants simultaneously resistant to rifampin and to the lethal effects of bacteriophage lambda cII protein were isolated. The sck mutant strains carry alterations in rpoB that allow them to survive cII killing (thus the name sck), but that do not impair either the expression of cII or the activation by cII of the lambda promoters pE and pI. The sck-1, sck-2, and sck-3 mutations modify transcription termination. The growth of lambda, but not of the N-independent lambda variant, lambda nin-5, is hindered by these mutations, which act either alone or in concert with the bacterial nusA1 mutation. In contrast to their effect on lambda growth, the three mutations reduce transcription termination in bacterial operons. The E. coli pyrE gene, which is normally regulated by attenuation, is expressed constitutively in the mutant strains. The sck mutations appear to prevent pyrE attenuation by slowing

  1. Constraining star formation through redshifted CO and CII emission in archival CMB data

    Science.gov (United States)

    Switzer, Eric

    LCDM is a strikingly successful paradigm to explain the CMB anisotropy and its evolution into observed galaxy clustering statistics. The formation and evolution of galaxies within this context is more complex and only partly characterized. Measurements of the average star formation and its precursors over cosmic time are required to connect theories of galaxy evolution to LCDM evolution. The fine structure transition in CII at 158 um traces star formation rates and the ISM radiation environment. Cold, molecular gas fuels star formation and is traced well by a ladder of CO emission lines. Catalogs of emission lines in individual galaxies have provided the most information about CII and CO to date, but are subject to selection effects. Intensity mapping is an alternative approach to measuring line emission. It surveys the sum of all line radiation as a function of redshift, and requires angular resolution to reach cosmologically interesting scales, but not to resolve individual sources. It directly measures moments of the luminosity function from all emitting objects. Intensity mapping of CII and CO can perform an unbiased census of stars and cold gas across cosmic time. We will use archival COBE-FIRAS and Planck data to bound or measure cosmologically redshifted CII and CO line emission through 1) the monopole spectrum, 2) cross-power between FIRAS/Planck and public galaxy survey catalogs from BOSS and the 2MASS redshift surveys, 3) auto-power of the FIRAS/Planck data itself. FIRAS is unique in its spectral range and all-sky coverage, provided by the space-borne FTS architecture. In addition to sensitivity to a particular emission line, intensity mapping is sensitive to all other contributions to surface brightness. We will remove CMB and foreground spatial and spectral templates using models from WMAP and Planck data. Interlopers and residual foregrounds additively bias the auto-power and monopole, but both can still be used to provide rigorous upper bounds. The

  2. 4th INNS Symposia Series on Computational Intelligence in Information Systems

    CERN Document Server

    Au, Thien

    2015-01-01

    This book constitutes the refereed proceedings of the Fourth International Neural Network Symposia series on Computational Intelligence in Information Systems, INNS-CIIS 2014, held in Bandar Seri Begawan, Brunei in November 2014. INNS-CIIS aims to provide a platform for researchers to exchange the latest ideas and present the most current research advances in general areas related to computational intelligence and its applications in various domains. The 34 revised full papers presented in this book have been carefully reviewed and selected from 72 submissions. They cover a wide range of topics and application areas in computational intelligence and informatics.  

  3. Further Validation of the Conners' Adult Attention Deficit/Hyperactivity Rating Scale Infrequency Index (CII) for Detection of Non-Credible Report of Attention Deficit/Hyperactivity Disorder Symptoms.

    Science.gov (United States)

    Cook, Carolyn M; Bolinger, Elizabeth; Suhr, Julie

    2016-06-01

    Attention deficit/hyperactivity disorder (ADHD) can be easily presented in a non-credible manner, through non-credible report of ADHD symptoms and/or by non-credible performance on neuropsychological tests. While most studies have focused on detection of non-credible performance using performance validity tests, there are few studies examining the ability to detect non-credible report of ADHD symptoms. We provide further validation data for a recently developed measure of non-credible ADHD symptom report, the Conners' Adult ADHD Rating Scales (CAARS) Infrequency Index (CII). Using archival data from 86 adults referred for concerns about ADHD, we examined the accuracy of the CII in detecting extreme scores on the CAARS and invalid reporting on validity indices of the Minnesota Multiphasic Personality Inventory-2 Restructured Form (MMPI-2-RF). We also examined the accuracy of the CII in detecting non-credible performance on standalone and embedded performance validity tests. The CII was 52% sensitive to extreme scores on CAARS DSM symptom subscales (with 97% specificity) and 20%-36% sensitive to invalid responding on MMPI-2-RF validity scales (with near 90% specificity), providing further evidence for the interpretation of the CII as an indicator of non-credible ADHD symptom report. However, the CII detected only 18% of individuals who failed a standalone performance validity test (Word Memory Test), with 87.8% specificity, and was not accurate in detecting non-credible performance using embedded digit span cutoffs. Future studies should continue to examine how best to assess for non-credible symptom report in ADHD referrals.
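    Sensitivity and specificity figures like those quoted above reduce to simple counts of hits and correct rejections against a reference standard. A toy sketch with made-up index scores and classifications (the actual CII items and cutoffs are in the CAARS materials, not reproduced here):

```python
def sens_spec(scores, truth, cutoff):
    """Sensitivity and specificity of flagging score >= cutoff, where truth
    marks the cases the reference standard calls non-credible."""
    tp = sum(1 for s, t in zip(scores, truth) if s >= cutoff and t)
    fn = sum(1 for s, t in zip(scores, truth) if s < cutoff and t)
    tn = sum(1 for s, t in zip(scores, truth) if s < cutoff and not t)
    fp = sum(1 for s, t in zip(scores, truth) if s >= cutoff and not t)
    return tp / (tp + fn), tn / (tn + fp)

# Made-up index scores and reference classifications for eight examinees.
scores = [4, 3, 1, 0, 2, 0, 1, 3]
truth = [True, True, True, False, False, False, False, False]
sens, spec = sens_spec(scores, truth, cutoff=2)
```

    Raising the cutoff trades sensitivity for specificity, which is why validity indices like the CII are typically reported at cutoffs chosen to hold specificity near 90% or above.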

  4. Comparison of programmed and cabled re-entrance systems. Elaboration of a cabled re-entrance system for a CII 90.40

    Energy Technology Data Exchange (ETDEWEB)

    Perraudeau, Jean

    1976-11-26

    The objective of this research thesis is to study problems related to re-entrance and, more particularly, to study a re-entrance system for a CII 90.40 computer. Such a system can be realised in either a programmed or a cabled form, and both approaches are described and compared. A generalisation of this comparison is briefly proposed. As the computer already possesses a programmed re-entrance system, the author focuses on the study of the cabled re-entrance system, which improves the performance and capabilities of this computer, particularly for real-time use. The design, realisation and development of such a cabled system are reported. The first part reports a theoretical study of re-entrance (definition, problems, applications), a presentation of the computer, a description of the programmed re-entrance system, a presentation of the principle of the chosen cabled re-entrance system, a definition of the structure and operating mode of the cabled stack and a description of its various components, and a flowchart analysis of function execution. The second part reports the practical realisation: definition, technological overview, technology used in the cabled stack, sequencing and multiplexing principle, pulse transmission, logical layouts, and circuit adjustments. The third part presents a practical example. An assessment and perspectives are finally discussed.

  5. Comparison of programmed and cabled re-entrance systems. Elaboration of a cabled re-entrance system for a CII 90.40

    International Nuclear Information System (INIS)

    Perraudeau, Jean

    1976-01-01

    The objective of this research thesis is to study problems related to re-entrance and, more particularly, to study a re-entrance system for a CII 90.40 computer. Such a system can be realised in either a programmed or a cabled form, and both approaches are described and compared. A generalisation of this comparison is briefly proposed. As the computer already possesses a programmed re-entrance system, the author focuses on the study of the cabled re-entrance system, which improves the performance and capabilities of this computer, particularly for real-time use. The design, realisation and development of such a cabled system are reported. The first part reports a theoretical study of re-entrance (definition, problems, applications), a presentation of the computer, a description of the programmed re-entrance system, a presentation of the principle of the chosen cabled re-entrance system, a definition of the structure and operating mode of the cabled stack and a description of its various components, and a flowchart analysis of function execution. The second part reports the practical realisation: definition, technological overview, technology used in the cabled stack, sequencing and multiplexing principle, pulse transmission, logical layouts, and circuit adjustments. The third part presents a practical example. An assessment and perspectives are finally discussed.

  6. [CII] dynamics in the S140 region

    International Nuclear Information System (INIS)

    Dedes, C.; Röllig, M.; Okada, Y.; Ossenkopf, V.; Mookerjea, B.

    2015-01-01

    We report the observation of [C II] emission in a cut through the S140 region together with single-pointing observations of several molecular tracers, including hydrides, in key regions of the photon-dominated region (PDR) and molecular cloud [1]. At a distance of 910 pc, a B0V star ionizes the edge of the molecular cloud L1204, creating S140. In addition, the dense molecular cloud hosts a cluster of embedded massive young stellar objects only 75' from the H II region [e.g. 2, 3]. We used HIFI on Herschel to observe [CII] in a strip following the direction of the impinging radiation across the ionization front and through the cluster of embedded YSOs. With [C II], we can trace the ionizing radiation and, together with molecular tracers such as CO isotopologues and HCO+, study the dynamical processes in the region. Combining HIFI's high spectral resolution data with ground-based molecular data allows us to study the dynamics and excitation conditions both in the ionization front and in the dense molecular star-forming region, and to model their physical conditions [4].

  7. First Results from the Herschel and ALMA Spectroscopic Surveys of the SMC: The Relationship Between [CII]-bright Gas and CO-bright Gas at Low Metallicity

    OpenAIRE

    Jameson, Katherine E.; Bolatto, Alberto D.; Wolfire, Mark; Warren, Steven R.; Herrera-Camus, Rodrigo; Croxall, Kevin; Pellegrini, Eric; Smith, John-David; Rubio, Monica; Indebetouw, Remy; Israel, Frank P.; Meixner, Margaret; Roman-Duval, Julia; van Loon, Jacco Th.; Muller, Erik

    2018-01-01

    The Small Magellanic Cloud (SMC) provides the only laboratory to study the structure of molecular gas at high resolution and low metallicity. We present results from the Herschel Spectroscopic Survey of the SMC (HS3), which mapped the key far-IR cooling lines [CII], [OI], [NII], and [OIII] in five star-forming regions, and new ALMA 7m-array maps of 12CO and 13CO (2-1) with coverage overlapping four of the five HS3 regions. We detect [CII] and [OI] throughout all of the r...

  8. [CII] dynamics in the S140 region

    Energy Technology Data Exchange (ETDEWEB)

    Dedes, C. [ETH Zurich, Institute for Astronomy, Zurich (Switzerland); Röllig, M.; Okada, Y.; Ossenkopf, V. [1. Physikalisches Institut Universität Köln (Germany); Mookerjea, B. [Tata Institute of Fundamental Research, Mumbai (India); Collaboration: WADI Team

    2015-01-22

    We report the observation of [C II] emission in a cut through the S140 region together with single-pointing observations of several molecular tracers, including hydrides, in key regions of the photon-dominated region (PDR) and molecular cloud [1]. At a distance of 910 pc, a B0V star ionizes the edge of the molecular cloud L1204, creating S140. In addition, the dense molecular cloud hosts a cluster of embedded massive young stellar objects only 75' from the H II region [e.g. 2, 3]. We used HIFI on Herschel to observe [CII] in a strip following the direction of the impinging radiation across the ionization front and through the cluster of embedded YSOs. With [C II], we can trace the ionizing radiation and, together with molecular tracers such as CO isotopologues and HCO+, study the dynamical processes in the region. Combining HIFI's high spectral resolution data with ground-based molecular data allows us to study the dynamics and excitation conditions both in the ionization front and in the dense molecular star-forming region, and to model their physical conditions [4].

  9. C+/CO Transitions in the Diffuse ISM: Transitional Cloud Sample from the GOT C+ Survey of [CII] in the Inner Galaxy at l = -30° to 30°

    Science.gov (United States)

    Velusamy, T.; Pineda, J. L.; Langer, W. D.; Willacy, K.; Goldsmith, P. F.

    2011-05-01

    Our knowledge of interstellar gas has been limited primarily to the diffuse atomic phase traced by HI and the well-shielded molecular phase traced by CO. Recently, using the first results of the Herschel Key Project GOT C+, a HIFI C+ survey of the Galactic plane, Velusamy, Langer, Pineda et al. (A&A 521, L18, 2010) have shown that in the diffuse interstellar transition clouds a significant fraction of the carbon exists primarily as C+ with little C0 and CO, in a warm 'dark gas' layer in which hydrogen is mostly H2 with little atomic H, surrounding a modest 12CO-emitting core. The [CII] fine structure transition at 1.9 THz (158 μm) is the best tracer of this component of the interstellar medium, which is critical to our understanding of atomic-to-molecular cloud transitions. The Herschel Key Project GOT C+ is designed to study such clouds by observing with HIFI the [CII] line emission along 500 lines of sight (LOSs) throughout the Galactic disk. Here we present the identification and chemical status of a few hundred diffuse and transition clouds traced by [CII], along with auxiliary HI and CO data covering ~100 LOSs in the inner Galaxy between l = -30° and 30°. We identify transition clouds as [CII] components that are characterized by the presence of both HI and 12CO, but no 13CO emission. The intensities I(CII) and I(HI) are used as measures of the visual extinction, AV, in the cloud up to the C+/C0/CO transition layer, and a comparison with I(12CO) yields a more complete H2 molecular inventory. Our results show that [CII] emission is an excellent tool to study transition clouds and their carbon chemistry in the ISM, in particular as a unique tracer of molecular H2, which is not easily observed by other means. The large sample presented here will serve as a resource to study the chemical and physical status of diffuse transition clouds in a wide range of Galactic environments and constrain physical parameters such as the FUV intensity and cosmic
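The cloud-classification rule described above (a [CII] component with both HI and 12CO but no 13CO is called a transition cloud) can be sketched as a simple decision function. The S/N threshold, function name, and category labels below are my illustrative assumptions, not values from the survey.

```python
# Hedged sketch of the transition-cloud selection rule; the 3-sigma
# detection threshold and the category labels are illustrative assumptions.

def classify_component(snr_cii: float, snr_hi: float,
                       snr_12co: float, snr_13co: float,
                       snr_floor: float = 3.0) -> str:
    """Classify one [CII] velocity component by which tracers are detected."""
    detected = lambda snr: snr >= snr_floor
    if (detected(snr_cii) and detected(snr_hi)
            and detected(snr_12co) and not detected(snr_13co)):
        return "transition"       # C+/CO transition layer, no 13CO core
    if detected(snr_cii) and detected(snr_hi) and not detected(snr_12co):
        return "diffuse atomic"   # [CII] + HI only: atomic / CO-dark gas
    if detected(snr_13co):
        return "dense molecular"  # 13CO present: well-shielded core
    return "unclassified"

print(classify_component(8.0, 10.0, 5.0, 1.0))  # -> transition
```

The ordering of the checks matters: 13CO non-detection is what separates a transition layer from a cloud with a dense, well-shielded core.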

  10. SDP_wlanger_3: State of the Diffuse ISM: Galactic Observations of the Terahertz CII Line (GOT CPlus)

    Science.gov (United States)

    Langer, W.

    2011-09-01

    Star formation activity throughout the Galactic disk depends on the thermal and dynamical state of the interstellar gas, which in turn depends on heating and cooling rates, modulated by the gravitational potential and shock and turbulent pressures. Molecular cloud formation, and thus star formation, may be regulated by pressures in the interstellar medium (ISM). To understand these processes we need information about the properties of the diffuse atomic and diffuse molecular gas clouds, and of Photon Dominated Regions (PDRs). An important tracer of these regions is the CII line at 158 microns (1900.5 GHz). We propose a "pencil-beam" survey of CII with HIFI band 7b, based on deep integrations and systematic sparse sampling of the Galactic disk plus selected targets, totaling over 900 lines of sight. We will detect both emission and, against the bright inner Galaxy and selected continuum sources, absorption lines. These spectra will provide the astronomical community with a large, rich statistical database of diffuse cloud properties throughout the Galaxy for understanding the Milky Way ISM and, by extension, other galaxies. It will be extremely valuable for determining the properties of the atomic gas, the role of barometric pressure and turbulence in cloud evolution, and the properties of the interface between the atomic and molecular clouds. The CII line is one of the major ISM cooling lines and is present throughout the Galactic plane. It is the strongest far-IR emission line in the Galaxy, with a total luminosity about 1000 times that of the CO J=1-0 line. Combined with other data, it can be used to determine density, pressure, and radiation environment in gas clouds and PDRs, and their dynamics via velocity fields. HSO is the best opportunity over the next several years to probe the ISM in this tracer and will provide a template for large-scale surveys with dedicated small telescopes and future surveys of other important ISM tracers.

  11. KPOT_wlanger_1: State of the Diffuse ISM: Galactic Observations of the Terahertz CII Line (GOT CPlus)

    Science.gov (United States)

    Langer, W.

    2007-10-01

    Star formation activity throughout the Galactic disk depends on the thermal and dynamical state of the interstellar gas, which in turn depends on heating and cooling rates, modulated by the gravitational potential and shock and turbulent pressures. Molecular cloud formation, and thus star formation, may be regulated by pressures in the interstellar medium (ISM). To understand these processes we need information about the properties of the diffuse atomic and diffuse molecular gas clouds, and of Photon Dominated Regions (PDRs). An important tracer of these regions is the CII line at 158 microns (1900.5 GHz). We propose a "pencil-beam" survey of CII with HIFI band 7b, based on deep integrations and systematic sparse sampling of the Galactic disk plus selected targets, totaling over 900 lines of sight. We will detect both emission and, against the bright inner Galaxy and selected continuum sources, absorption lines. These spectra will provide the astronomical community with a large, rich statistical database of diffuse cloud properties throughout the Galaxy for understanding the Milky Way ISM and, by extension, other galaxies. It will be extremely valuable for determining the properties of the atomic gas, the role of barometric pressure and turbulence in cloud evolution, and the properties of the interface between the atomic and molecular clouds. The CII line is one of the major ISM cooling lines and is present throughout the Galactic plane. It is the strongest far-IR emission line in the Galaxy, with a total luminosity about 1000 times that of the CO J=1-0 line. Combined with other data, it can be used to determine density, pressure, and radiation environment in gas clouds and PDRs, and their dynamics via velocity fields. HSO is the best opportunity over the next several years to probe the ISM in this tracer and will provide a template for large-scale surveys with dedicated small telescopes and future surveys of other important ISM tracers.

  12. OT2_tvelusam_4: Probing Galactic Spiral Arm Tangencies with [CII]

    Science.gov (United States)

    Velusamy, T.

    2011-09-01

    We propose to use the unique viewing geometry of the Galactic spiral arm tangents, which provide an ideal environment for studying the effects of density waves on spiral structure. We propose a well-sampled map of the [C II] 1.9 THz line emission along a 15-degree longitude region across the Norma-3kpc arm tangency, which includes the edge of the Perseus Arm. The COBE-FIRAS instrument observed the strongest [C II] and [N II] emission along these spiral arm tangencies. The Herschel Open Time Key Project Galactic Observations of Terahertz C+ (GOT C+) also detects the strongest [CII] emission near these tangential directions in its sparsely sampled HIFI survey of [CII] in the Galactic plane. The [C II] 158-micron line is the strongest infrared line emitted by the ISM and is an excellent tracer and probe of both the diffuse gas in the cold neutral medium (CNM) and the warm ionized medium (WIM). Furthermore, as demonstrated by the GOT C+ results, [C II] is an efficient tracer of the dark H2 gas in the ISM that is not traced by CO or HI observations. Thus, taking advantage of the long path lengths through the spiral arm across the tangencies, we can use the [C II] emission to trace and characterize the diffuse atomic and ionized gas, as well as the diffuse H2 molecular gas in clouds transitioning from HI to H2 and from C+ to C and CO, throughout the ISM. The main goal of our proposal is to use the well-sampled (at arcminute scale) [C II] data to study these gas components of the ISM in the spiral-arm and inter-arm regions, to constrain models of the spiral structure, and to understand the influence of spiral density waves on the Galactic gas and the dynamical interaction between the different components. The proposed HIFI observations will consist of OTF 15-degree longitude scans and one 2-degree latitude scan, sampled every 40 arcsec, across the Norma-3kpc/Perseus spiral tangency.
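As a quick check of the sampling quoted above, a 15-degree OTF longitude scan sampled every 40 arcsec implies the following number of positions (my arithmetic, not a figure from the proposal):

```python
# Back-of-the-envelope count of OTF sample positions; purely arithmetic.

DEG_TO_ARCSEC = 3600          # 1 degree = 3600 arcseconds
scan_length_deg = 15.0        # longitude coverage of the proposed scan
sampling_arcsec = 40.0        # sampling interval quoted in the proposal

n_positions = int(scan_length_deg * DEG_TO_ARCSEC / sampling_arcsec)
print(n_positions)  # 1350 sample positions along the longitude strip
```

On the order of a thousand spectra per scan is what makes such a strip "well-sampled" compared with the sparse GOT C+ survey pointings.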

  13. Lipoprotein lipase activity and mass, apolipoprotein C-II mass and polymorphisms of apolipoproteins E and A5 in subjects with prior acute hypertriglyceridaemic pancreatitis

    Directory of Open Access Journals (Sweden)

    García-Arias Carlota

    2009-06-01

    Background: Severe hypertriglyceridaemia due to chylomicronaemia may trigger an acute pancreatitis. However, the basic underlying mechanism is usually not well understood. We decided to analyze some proteins involved in the catabolism of triglyceride-rich lipoproteins in patients with severe hypertriglyceridaemia. Methods: Twenty-four survivors of acute hypertriglyceridaemic pancreatitis (cases) and 31 patients with severe hypertriglyceridaemia (controls) were included. Clinical and anthropometrical data, chylomicronaemia, lipoprotein profile, postheparin lipoprotein lipase mass and activity, hepatic lipase activity, apolipoprotein C-II and C-III mass, and apo E and A5 polymorphisms were assessed. Results: Only five cases were found to have LPL mass and activity deficiency, all of them thin and having had their first episode in childhood. No cases had apolipoprotein C-II deficiency. No significant differences were found between the non-deficient LPL cases and the controls in terms of obesity, diabetes, alcohol consumption, drug therapy, gender distribution, evidence of fasting chylomicronaemia, lipid levels, LPL activity and mass, hepatic lipase activity, C-II and C-III mass, or apo E polymorphisms. However, the SNP S19W of apo A5 tended to be more prevalent in cases than in controls (40% vs. 23%, NS). Conclusion: Primary defects in LPL and C-II are rare in survivors of acute hypertriglyceridaemic pancreatitis; lipase activity measurements should be restricted to those having their first episode during childhood.

  14. Lipoprotein lipase activity and mass, apolipoprotein C-II mass and polymorphisms of apolipoproteins E and A5 in subjects with prior acute hypertriglyceridaemic pancreatitis

    Science.gov (United States)

    2009-01-01

    Background Severe hypertriglyceridaemia due to chylomicronaemia may trigger an acute pancreatitis. However, the basic underlying mechanism is usually not well understood. We decided to analyze some proteins involved in the catabolism of triglyceride-rich lipoproteins in patients with severe hypertriglyceridaemia. Methods Twenty-four survivors of acute hypertriglyceridaemic pancreatitis (cases) and 31 patients with severe hypertriglyceridaemia (controls) were included. Clinical and anthropometrical data, chylomicronaemia, lipoprotein profile, postheparin lipoprotein lipase mass and activity, hepatic lipase activity, apolipoprotein C-II and C-III mass, apo E and A5 polymorphisms were assessed. Results Only five cases were found to have LPL mass and activity deficiency, all of them thin and having had their first episode in childhood. No cases had apolipoprotein C-II deficiency. No significant differences were found between the non-deficient LPL cases and the controls in terms of obesity, diabetes, alcohol consumption, drug therapy, gender distribution, evidence of fasting chylomicronaemia, lipid levels, LPL activity and mass, hepatic lipase activity, C-II and C-III mass or apo E polymorphisms. However, the SNP S19W of apo A5 tended to be more prevalent in cases than controls (40% vs. 23%, NS). Conclusion Primary defects in LPL and C-II are rare in survivors of acute hypertriglyceridaemic pancreatitis; lipase activity measurements should be restricted to those having their first episode during childhood. PMID:19534808

  15. Role of the RNA polymerase α subunits in CII-dependent activation of the bacteriophage λ pE promoter: identification of important residues and positioning of the α C-terminal domains

    Science.gov (United States)

    Kedzierska, Barbara; Lee, David J.; Węgrzyn, Grzegorz; Busby, Stephen J. W.; Thomas, Mark S.

    2004-01-01

    The bacteriophage λ CII protein stimulates the activity of three phage promoters, pE, pI and paQ, upon binding to a site overlapping the –35 element at each promoter. Here we used preparations of RNA polymerase carrying a DNA cleavage reagent attached to specific residues in the C-terminal domain of the RNA polymerase α subunit (αCTD) to demonstrate that one αCTD binds near position –41 at pE, whilst the other αCTD binds further upstream. The αCTD bound near position –41 is oriented such that its 261 determinant is in close proximity to σ70. The location of αCTD in CII-dependent complexes at the pE promoter is very similar to that found at many activator-independent promoters, and represents an alternative configuration for αCTD at promoters where activators bind sites overlapping the –35 region. We also used an in vivo alanine scan analysis to show that the DNA-binding determinant of αCTD is involved in stimulation of the pE promoter by CII, and this was confirmed by in vitro transcription assays. We also show that whereas the K271E substitution in αCTD results in a drastic decrease in CII-dependent activation of pE, the pI and paQ promoters are less sensitive to this substitution, suggesting that the role of αCTD at the three lysogenic promoters may be different. PMID:14762211

  16. OMNET - high speed data communications for PDP-11 computers

    International Nuclear Information System (INIS)

    Parkman, C.F.; Lee, J.G.

    1979-12-01

    Omnet is a high speed data communications network designed at CERN for PDP-11 computers. It has grown from a link multiplexor system built for a CII 10070 computer into a full multi-point network, to which some fifty computers are now connected. It provides communications facilities for several large experimental installations as well as many smaller systems and has connections to all parts of the CERN site. The transmission protocol is discussed and brief details are given of the hardware and software used in its implementation. Also described is the gateway interface to the CERN packet switching network, 'Cernet'. (orig.)

  17. Carbon Chemistry in Transitional Clouds from the GOT C+ Survey of CII 158 micron Emission in the Galactic Plane

    Science.gov (United States)

    Langer, W. D.; Velusamy, T.; Pineda, J.; Willacy, K.; Goldsmith, P. F.

    2011-05-01

    In understanding the lifecycle and chemistry of the interstellar gas, the transition from diffuse atomic to molecular gas clouds is a very important stage. The evolution of carbon from C+ to C0 and CO is a fundamental part of this transition, and C+ along with its carbon chemistry is a key diagnostic. Until now our knowledge of interstellar gas has been limited primarily to the diffuse atomic phase traced by HI and the dense molecular H2 phase traced by CO. However, we have generally been missing an important layer in diffuse and transition clouds, the warm "dark gas", which is mostly H2 with little HI and CO, and is best traced with C+. Here, we discuss the chemistry in the transition from C+ to C0 and CO in these clouds as understood from a sparse survey of the CII 1.9 THz (158 micron) line over about 40 degrees of longitude in the inner Galaxy, carried out as part of the Galactic Observations of Terahertz C+ (GOT C+) program, a Herschel Space Observatory Open Time Key Program to study interstellar clouds by sampling ionized carbon. Using the first results from GOT C+ along 11 LOSs, in a sample of 53 transition clouds, Velusamy, Langer et al. (A&A 521, L18, 2010) detected an excess of CII intensities indicative of a thick H2 layer (a significant warm H2 "dark gas" component) around the 12CO core. Here we present a much larger, statistically significant sample of a few hundred diffuse and transition clouds traced by CII, along with auxiliary HI and CO data, in the inner Galaxy between l = -30° and +30°. Our new and more extensive sample of transition clouds is used to elucidate the time-dependent physical and carbon-chemical evolution of diffuse to transition clouds, and of transition layers. We consider C+ to CO conversion pathways, such as H+ + O and C+ + H2 chemistry for CO production, to constrain the physical parameters, such as the FUV intensity and cosmic ray ionization rate, that drive the CO chemistry in the diffuse transition clouds.

  18. Herschel HIFI GOT C+ Survey: CII, HI, and CO Emissions in a Sample of Transition Clouds and Star-Forming Regions in the Inner Galaxy

    Science.gov (United States)

    Pineda, Jorge; Velusamy, Thangasamy; Langer, William D.; Goldsmith, Paul; Li, Di; Yorke, Harold

    The GOT C+ survey, a HIFI Herschel Key Project, studies the diffuse ISM throughout the Galactic plane, using C+ as a cloud tracer. The C+ line at 1.9 THz traces a so-far poorly studied stage in ISM cloud evolution: the transitional clouds going from atomic HI to molecular H2. This transition-cloud phase, which is difficult to observe in HI and CO alone, may be best characterized via CII emission or absorption. The C+ line is also an excellent tracer of the warm diffuse gas and of the warm, dense gas in Photon Dominated Regions (PDRs). We can, therefore, use the CII emission as a probe to understand the effects of star formation on the interstellar environment. We present our first results on the transition between dense, hot gas (traced by CII) and dense, cold gas (traced by 12CO and 13CO) along a few representative lines of sight in the inner Galaxy, from longitude 325 degrees to 25 degrees, taken during the HIFI Priority Science Phase. Comparisons of the high spectral resolution (~1 km/s) HIFI data on C+ with HI, 12CO, and 13CO spectra allow us to separate out the different ISM components along each line of sight. Our results provide detailed information about the transition of diffuse atomic to molecular gas clouds needed to understand star formation and the lifecycle of the interstellar gas. These observations are being carried out with the Herschel Space Observatory, an ESA cornerstone mission with contributions from NASA. This research was conducted at the Jet Propulsion Laboratory, California Institute of Technology, under contract with the National Aeronautics and Space Administration. JLP was supported under the NASA Postdoctoral Program at JPL, Caltech, administered by Oak Ridge Associated Universities through a contract with NASA, and is currently supported as a Caltech-JPL postdoctoral associate.

  19. CI, [CII] and CO observations towards TNJ 1338–1942: Probing the ISM in a massive proto-cluster galaxy at z = 4.11

    DEFF Research Database (Denmark)

    König, S; Greve, T R; Seymour, N

    2012-01-01

    … density, temperature, ambient UV-field) prevailing in the interstellar medium (ISM) of these objects. Here we report on ongoing CI, [CII] and CO observations of TNJ 1338–1942 at z = 4.11 with the IRAM 30m telescope, the JCMT and ATCA. With these observations we will make a first attempt at constraining the average ISM conditions in TNJ 1338–1942.

  20. Change in Serum Lipid during Growth Hormone Therapy in a Growth Hormone-Deficient Patient with Decreased Serum Apolipoprotein C-II

    OpenAIRE

    Tadashi, Moriwake; Masanori, Takaiwa; Masako, Kawakami; Shouichi, Tanaka; Tetsuya, Nakamura; Department of Pediatrics, Iwakuni National Hospital; Department of Pediatrics, Iwakuni National Hospital; Department of Pediatrics, Iwakuni National Hospital; Department of Internal Medicine, Iwakuni National Hospital; Department of Radiology, Iwakuni National Hospital

    2003-01-01

    Introduction: The effects of GH on lipid metabolism have been discussed frequently in relation to quality of adult life in childhood-onset GH deficiency, but those effects are not fully understood. In the present study, we analyzed the longitudinal change in serum lipid metabolites and apolipoproteins in a GH-deficient patient who had a history of cholelithiasis with decreased apolipoprotein C-II. Case: K.Y., a four-year-old boy, visited the emergency clinic of Iwakuni National H...

  1. Coupling of a real-time computer to a nuclear detection system

    Energy Technology Data Exchange (ETDEWEB)

    Lugol, J [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1967-06-01

    Electronic computers are now included in nuclear physics experimental systems, as part of a general trend to replace conventional multichannel analyzers with on-line, real-time computers. An on-line computer performing nuclear data acquisition and storage offers the advantage of running simple reduction and calculation routines in real time. This advantage becomes a necessity as the number of experimental parameters increases. At the Saclay variable-energy cyclotron we have connected a C.I.I. C 90-10 computer, and we describe the input/output hardware required for the coupling. In order to establish a dialogue with the physicists, we have built a main display unit able to control several display consoles at different locations; we describe these as well as some utility routines for the system as a whole. (author)

  2. Endogenous estrogen status, but not genistein supplementation, modulates 7,12-dimethylbenz[a]anthracene-induced mutation in the liver cII gene of transgenic Big Blue rats.

    Science.gov (United States)

    Chen, Tao; Hutts, Robert C; Mei, Nan; Liu, Xiaoli; Bishop, Michelle E; Shelton, Sharon; Manjanatha, Mugimane G; Aidoo, Anane

    2005-06-01

    A growing number of studies suggest that isoflavones found in soybeans have estrogenic activity and may safely alleviate the symptoms of menopause. One of these isoflavones, genistein, is commonly used by postmenopausal women as an alternative to hormone replacement therapy. Although sex hormones have been implicated as an important risk factor for the development of hepatocellular carcinoma, there are limited data on the potential effects of the estrogens, including phytoestrogens, on chemical mutagenesis in liver. Because of the association between mutation induction and the carcinogenesis process, we investigated whether endogenous estrogen and supplemental genistein affect 7,12-dimethylbenz[a]anthracene (DMBA)-induced mutagenesis in rat liver. Intact and ovariectomized female Big Blue rats were treated with 80 mg DMBA/kg body weight. Some of the rats also received a supplement of 1,000 ppm genistein. Sixteen weeks after the carcinogen treatment, the rats were sacrificed, their livers were removed, and mutant frequencies (MFs) and types of mutations were determined in the liver cII gene. DMBA significantly increased the MFs in liver for both the intact and ovariectomized rats. While there was no significant difference in MF between the ovariectomized and intact control animals, the mutation induction by DMBA in the ovariectomized groups was significantly higher than that in the intact groups. Dietary genistein did not alter these responses. Molecular analysis of the mutants showed that DMBA induced chemical-specific types of mutations in the liver cII gene. These results suggest that endogenous ovarian hormones have an inhibitory effect on liver mutagenesis by DMBA, whereas dietary genistein does not modulate spontaneous or DMBA-induced mutagenesis in either intact or ovariectomized rats.

  3. Collagen-induced arthritis in nonhuman primates: multiple epitopes of type II collagen can induce autoimmune-mediated arthritis in outbred cynomolgus monkeys.

    Science.gov (United States)

    Shimozuru, Y; Yamane, S; Fujimoto, K; Terao, K; Honjo, S; Nagai, Y; Sawitzke, A D; Terato, K

    1998-03-01

To define which regions of the type II collagen (CII) molecule result in anticollagen antibody production and the subsequent development of autoantibodies in a collagen-induced arthritis (CIA) nonhuman primate model. Male and female cynomolgus monkeys (2-6 of each sex per group) were immunized with either chicken (Ch), human, or monkey (Mk) CII, or with cyanogen bromide (CB)-generated peptide fragments of ChCII emulsified in Freund's complete adjuvant. Monkeys were observed for the development of arthritis, and sera were collected and analyzed for anticollagen antibody specificity by enzyme-linked immunosorbent assay. Overt arthritis developed in all groups of monkeys immunized with intact CII and with all major CB peptide fragments of ChCII except CB8. Onset and severity of arthritis correlated best with serum anti-MkCII antibody levels. The levels of IgG autoantibody to MkCII reflected the degree of cross-reactivity of anti-heterologous CII antibodies with MkCII, which depended on the genetic background of individual monkeys rather than on sex. CII from several species and disparate regions of the CII molecule were able to induce autoantibody-mediated arthritis in outbred cynomolgus monkeys. The strong anti-MkCII response suggests that epitope spreading or induction of broad-based CII cross-reactivity occurred in these animals. Autoantibody levels to MkCII were higher in CIA-susceptible monkeys than in resistant monkeys, despite comparable antibody levels in response to the various immunizations of CII. These results closely parallel the type of anticollagen responses found in sera from rheumatoid arthritis patients. This may be accounted for by the major histocompatibility complex heterogeneity associated with an outbred population, or it may represent a primate-specific pattern of reactivity to CII.

  4. Type II collagen in cartilage evokes peptide-specific tolerance and skews the immune response.

    Science.gov (United States)

    Malmström, V; Kjellén, P; Holmdahl, R

    1998-06-01

T cell recognition of type II collagen (CII) is a crucial event in the induction of collagen-induced arthritis in the mouse. Several CII peptides have been shown to be of importance, depending on which MHC haplotype the mouse carries. By sequencing rat CII and comparing the sequence with mouse, human, bovine and chicken CII, we have found that the immunodominant peptides all differ at critical positions from the autologous mouse sequence. Transgenic expression of the immunodominant Aq-restricted heterologous CII 256-270 epitope inserted into type I collagen (TSC mice) or type II collagen (MMC-1 mice) led to epitope-specific tolerance. Immunization of TSC mice with chick CII led to arthritis and immune responses dependent on the subdominant, Aq-restricted and chick-specific CII 190-200 epitope. Immunization with bovine CII of F1 mice, expressing both H-2q and H-2r as well as the transgenic Aq-restricted CII 256-270 epitope in cartilage, led to arthritis dependent on the Ar-restricted, bovine-specific epitope CII 607-621. These data show that the immunodominance of CII recognition is directed towards heterologous determinants, and that T cells directed towards the corresponding autologous epitopes are tolerated without evidence of active suppression.

  5. Metabolism of apolipoproteins C-II, C-III, and B in hypertriglyceridemic men. Changes after heparin-induced lipolysis

    International Nuclear Information System (INIS)

    Huff, M.W.; Breckenridge, W.C.; Strong, W.L.; Wolfe, B.M.

    1988-01-01

    The C apolipoproteins are normally transferred to high density lipoproteins (HDL) after lipolysis of very low density lipoprotein (VLDL) triglyceride. In previous studies, a loss of plasma C apolipoproteins was documented after heparin-induced lipolysis in hypertriglyceridemic subjects. The present studies were designed to determine if this decline in plasma C apolipoproteins was due to their clearance with VLDL remnants. Five Type IV hypertriglyceridemic and two normal subjects were injected with 125I-VLDL and 131I-low density lipoproteins (LDL) to document kinetically an excess of VLDL apolipoprotein (apo) B flux relative to LDL apo B flux in the Type IV subjects. A mean of 46% VLDL apo B was cleared from the circulation, without conversion to intermediate density lipoprotein (IDL) or LDL. Heparin was then infused (9000 IU over 4 hours) to generate an excess of VLDL remnants that were not converted to IDL or LDL. VLDL triglyceride, apo B, and apo C concentrations fell at a similar rate. VLDL apo B declined by 42% (p less than 0.01). However, no increases were observed in IDL or LDL apo B in the Type IV subjects. This resulted in a 14% (p less than 0.01) decline in plasma apo B concentrations, indicating a clearance of VLDL remnants. VLDL apo C-II and C-III concentrations fell by 42% (p less than 0.025) and 52% (p less than 0.01), respectively. During the first 2.5 hours of infusion, they were almost quantitatively recovered in HDL. Thereafter, the C apolipoproteins declined in HDL, during which time VLDL apo C concentrations continued to decline.

  6. Conception and production of a time sharing system for a Mitra-15 CII mini-computer dedicated to APL

    International Nuclear Information System (INIS)

    Perrin, Rene

    1977-01-01

    The installation of a time-sharing system on a mini-computer poses several interesting problems. These technical problems are especially interesting when the goal is to equitably divide the physical resources of the machine amongst users of a high-level, conversational language like APL. Original solutions were necessary to retain the speed and performance of the original hardware and software. The system has been implemented in such a way that several users may simultaneously access logical resources, such as the library zones; their read/write requests are managed by semaphores, which may also be directly controlled by the APL programmer. (author)
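The semaphore scheme described above can be sketched in modern terms. The following Python model is illustrative only (the class and function names are invented, not taken from the Mitra-15 system): several concurrent users write to a shared library zone, and a binary semaphore serializes their requests.

```python
import threading

class LibraryZone:
    """A shared library zone whose read/write requests are serialized
    by a semaphore, loosely mirroring the scheme described above.
    All names here are illustrative, not from the original system."""

    def __init__(self):
        self._sem = threading.Semaphore(1)  # binary semaphore: one request at a time
        self.data = {}

    def write(self, key, value):
        self._sem.acquire()      # P operation: block until the zone is free
        try:
            self.data[key] = value
        finally:
            self._sem.release()  # V operation: hand the zone to the next user

def user(zone, uid, n):
    # Each simulated APL user updates its own slot n times.
    for i in range(n):
        zone.write(uid, i)

zone = LibraryZone()
threads = [threading.Thread(target=user, args=(zone, u, 100)) for u in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(sorted(zone.data.items()))  # each user's last write survives
```

Exposing the semaphore to the programmer, as the abstract notes, would simply mean letting user code call the P and V operations directly.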

  7. Oral administration of type-II collagen peptide 250-270 suppresses specific cellular and humoral immune response in collagen-induced arthritis.

    Science.gov (United States)

    Zhu, Ping; Li, Xiao-Yan; Wang, Hong-Kun; Jia, Jun-Feng; Zheng, Zhao-Hui; Ding, Jin; Fan, Chun-Mei

    2007-01-01

    Oral antigen administration is an attractive approach for the treatment of autoimmune and inflammatory diseases. Establishing immune markers and methods for evaluating antigen-specific cellular and humoral immune responses will help the application of oral tolerance to the treatment of human diseases. The present study examined the effects of chicken collagen II (CII), the recombinant polymerized human collagen II 250-270 (rhCII 250-270) peptide and the synthesized human CII 250-270 (syCII 250-270) peptide on the induction of antigen-specific autoimmune responses in peripheral blood mononuclear cells (PBMC) from rheumatoid arthritis (RA) patients, and on the specific cellular and humoral immune responses in collagen-induced arthritis (CIA) mice fed with CII (250-270) prior to immunization with CII. In the study, proliferation, activation and intracellular cytokine production of antigen-specific T lymphocytes were analyzed simultaneously by bromodeoxyuridine (BrdU) incorporation and flow cytometry at the single-cell level. Antigen-specific antibodies and antibody-forming cells were detected by ELISA and ELISPOT, respectively. CII (250-270) stimulated the response of specific lymphocytes in PBMC from RA patients, including increased expression of the surface activation markers CD69 and CD25, and increased DNA synthesis. Mice fed with CII (250-270) before CII immunization had significantly lower arthritic scores than mice immunized with CII alone, and their body weight increased during the study period. Furthermore, specific T cell activity, proliferation and interferon (IFN)-gamma secretion in spleen cells were actively suppressed in CII (250-270)-fed mice, and serum anti-CII and anti-CII (250-270) antibody activities and the frequency of specific antibody-forming spleen cells were significantly lower in CII (250-270)-fed mice than in mice immunized with CII alone. These observations suggest that oral administration of CII (250-270) can suppress specific cellular and humoral immune responses in collagen-induced arthritis.

  8. Glycosylation of type II collagen is of major importance for T cell tolerance and pathology in collagen-induced arthritis

    DEFF Research Database (Denmark)

    Bäcklund, Johan; Treschow, Alexandra; Bockermann, Robert

    2002-01-01

    Type II collagen (CII) is a candidate cartilage-specific autoantigen, which can become post-translationally modified by hydroxylation and glycosylation. T cell recognition of CII is essential for the development of murine collagen-induced arthritis (CIA) and also occurs in rheumatoid arthritis (RA). The common denominator of murine CIA and human RA is the presentation of an immunodominant CII-derived glycosylated peptide on murine Aq and human DR4 molecules, respectively. To investigate the importance of T cell recognition of glycosylated CII in CIA development after immunization with heterologous CII, we treated neonatal mice with different heterologous CII-peptides (non-modified, hydroxylated and galactosylated). Treatment with the galactosylated peptide (galactose at position 264) was superior in protecting mice from CIA. Protection was accompanied by a reduced antibody response to CII...

  9. Effect of (3,5,6-trimethylpyrazin-2-yl)methyl 2-[4-(2-methylpropyl)phenyl]propanoate (ITE), a newly developed anti-inflammatory drug, on type II collagen-induced arthritis in mice.

    Science.gov (United States)

    Ma, Tao; Cao, Ying-Lin; Xu, Bei-Bei; Zhou, Xiao-Mian

    2004-06-01

    The effect of (3,5,6-trimethylpyrazin-2-yl)methyl 2-[4-(2-methylpropyl)phenyl]propanoate (ITE) on type II collagen (CII)-induced arthritis in mice was studied. Mice were immunized twice with CII, and ITE was given orally once a day for 40 d after the first immunization. Clinical assessment showed that ITE had no effect on the day of onset of arthritis but did lower the incidence rate of arthritis and the arthritis score. ITE also had a marked suppressive effect on the mouse hind paw edema induced by CII. ITE suppressed the delayed-type mouse ear skin reaction to CII but had no effect on the level of serum anti-CII antibodies. These results suggest that ITE inhibits the development of CII-induced arthritis in mice by suppressing delayed-type hypersensitivity to CII.

  10. Arthrogenicity of type II collagen monoclonal antibodies associated with complement activation and antigen affinity.

    Science.gov (United States)

    Koobkokkruad, Thongchai; Kadotani, Tatsuya; Hutamekalin, Pilaiwanwadee; Mizutani, Nobuaki; Yoshino, Shin

    2011-11-04

    The collagen antibody-induced arthritis (CAIA) model, which employs a cocktail of monoclonal antibodies (mAbs) to type II collagen (CII), has been widely used for studying the pathogenesis of autoimmune arthritis. In this model, not all mAbs to CII are capable of inducing arthritis because one of the initial events is the formation of collagen-antibody immune complexes on the cartilage surface or in the synovium, and subsequent activation of the complement by the complexes induces arthritis, suggesting that a combination of mAbs showing strong ability to bind mouse CII and activate the complement may effectively induce arthritis in mice. In the present study, we examined the relationship between the induction of arthritis by the combination of IgG2a (CII-6 and C2A-12), IgG2b (CII-3, C2B-14 and C2B-16) and IgM (CM-5) subclones of monoclonal antibodies (mAb) of anti-bovine or chicken CII and the ability of mAbs to activate complement and bind mouse CII. DBA/1J mice were injected with several combinations of mAbs followed by lipopolysaccharide. Furthermore, the ability of mAbs to activate the complement and bind mouse CII was examined by ELISA. First, DBA/1J mice were injected with the combined 4 mAbs (CII-3, CII-6, C2B-14, and CM-5) followed by lipopolysaccharide, resulting in moderate arthritis. Excluding one of the mAbs, i.e., using only CII-3, CII-6, and C2B-14, induced greater inflammation of the joints. Next, adding C2A-12 but not C2B-16 to these 3 mAbs produced more severe arthritis. A combination of five clones, consisting of all 5 mAbs, was less effective. Histologically, mice given the newly developed 4-clone cocktail had marked proliferation of synovial tissues, massive infiltration by inflammatory cells, and severe destruction of cartilage and bone. Furthermore, 4 of the 6 clones (CII-3, CII-6, C2B-14, and C2A-12) showed not only a strong cross-reaction with mouse CII but also marked activation of the complement in vitro. 
The combination of 4 mAbs showing strong cross-reactivity with mouse CII and marked complement activation was thus the most effective at inducing arthritis.

  11. Arthrogenicity of type II collagen monoclonal antibodies associated with complement activation and antigen affinity

    Directory of Open Access Journals (Sweden)

    Mizutani Nobuaki

    2011-11-01

    Background: The collagen antibody-induced arthritis (CAIA) model, which employs a cocktail of monoclonal antibodies (mAbs) to type II collagen (CII), has been widely used for studying the pathogenesis of autoimmune arthritis. In this model, not all mAbs to CII are capable of inducing arthritis because one of the initial events is the formation of collagen-antibody immune complexes on the cartilage surface or in the synovium, and subsequent activation of the complement by the complexes induces arthritis, suggesting that a combination of mAbs showing strong ability to bind mouse CII and activate the complement may effectively induce arthritis in mice. In the present study, we examined the relationship between the induction of arthritis by the combination of IgG2a (CII-6 and C2A-12), IgG2b (CII-3, C2B-14 and C2B-16) and IgM (CM-5) subclones of monoclonal antibodies (mAb) of anti-bovine or chicken CII and the ability of mAbs to activate complement and bind mouse CII. Methods: DBA/1J mice were injected with several combinations of mAbs followed by lipopolysaccharide. Furthermore, the ability of mAbs to activate the complement and bind mouse CII was examined by ELISA. Results: First, DBA/1J mice were injected with the combined 4 mAbs (CII-3, CII-6, C2B-14, and CM-5) followed by lipopolysaccharide, resulting in moderate arthritis. Excluding one of the mAbs, i.e., using only CII-3, CII-6, and C2B-14, induced greater inflammation of the joints. Next, adding C2A-12 but not C2B-16 to these 3 mAbs produced more severe arthritis. A combination of five clones, consisting of all 5 mAbs, was less effective. Histologically, mice given the newly developed 4-clone cocktail had marked proliferation of synovial tissues, massive infiltration by inflammatory cells, and severe destruction of cartilage and bone. Furthermore, 4 of the 6 clones (CII-3, CII-6, C2B-14, and C2A-12) showed not only a strong cross-reaction with mouse CII but also marked activation of the complement in vitro.

  12. Oral and nasal administration of chicken type II collagen suppresses adjuvant arthritis in rats with intestinal lesions induced by meloxicam.

    Science.gov (United States)

    Zheng, Yong-Qiu; Wei, Wei; Shen, Yu-Xian; Dai, Min; Liu, Li-Hua

    2004-11-01

    To investigate the curative effects of oral and nasal administration of chicken type II collagen (CII) on adjuvant arthritis (AA) in rats with meloxicam-induced intestinal lesions. An AA model in Sprague-Dawley (SD) rats with or without intestinal lesions induced by meloxicam was established, and the rats were divided randomly into six groups: AA model, AA model+meloxicam, AA model+oral CII, AA model+nasal CII, AA model+meloxicam+oral CII and AA model+meloxicam+nasal CII (n = 12). Rats were treated with meloxicam intragastrically for 7 d from d 14 after immunization with complete Freund's adjuvant (CFA), and then treated with chicken CII intragastrically or nasally for 7 d. Histological changes of the right hind knees were examined. Hind paw secondary swelling and intestinal lesions were evaluated. Synoviocyte proliferation was measured by the 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide (MTT) method. Activities of myeloperoxidase (MPO) and diamine oxidase (DAO) from supernatants of intestinal homogenates were assayed by spectrophotometric analysis. Intragastric administration of meloxicam (1.5 mg/kg) induced multiple intestinal lesions in AA rats. There was a significant decrease of intestinal DAO activities in the AA+meloxicam group (P<0.01) and the AA model group (P<0.01) compared with the normal group. DAO activities of intestinal homogenates in the AA+meloxicam group were significantly less than those in AA rats (P<0.01). There was a significant increase of intestinal MPO activities in the AA+meloxicam group compared with the normal control (P<0.01). Oral or nasal administration of CII (20 microg/kg) could suppress the secondary hind paw swelling (P<0.05 for oral CII; P<0.01 for nasal CII), synoviocyte proliferation (P<0.01) and histopathological degradation in AA rats, but had no significant effects on the DAO and MPO changes. However, oral administration of CII (20 microg/kg) showed limited efficacy on arthritis in the AA+meloxicam model and the

  13. The relationship between SARA fractions and crude oil stability

    Directory of Open Access Journals (Sweden)

    Siavash Ashoori

    2017-03-01

    Asphaltene precipitation and deposition are serious issues in the petroleum industry. Monitoring asphaltene stability in crude oil remains a difficult problem and has been the subject of many studies. To investigate crude oil stability by saturate, aromatic, resin and asphaltene (SARA) analysis, seven crudes with different compositions were used. The methods applied for SARA quantification are IP-143 and ASTM D893-69, and the colloidal instability index (CII) is computed from the SARA values. Comparison of the CII results across the oil compositions demonstrated that the stability of asphaltenes in crude oil is a phenomenon related to all of these components and cannot be attributed to any one of them individually.
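The abstract does not spell out the formula, but the colloidal instability index is conventionally computed from the SARA fractions as the ratio of flocculation-promoting components (saturates and asphaltenes) to peptizing components (aromatics and resins). A minimal sketch, using illustrative fractions rather than data from the paper:

```python
def colloidal_instability_index(saturates, aromatics, resins, asphaltenes):
    """CII as commonly defined in the asphaltene-stability literature:
    flocculants (saturates + asphaltenes) over peptizing agents
    (aromatics + resins). Fractions are in wt%."""
    return (saturates + asphaltenes) / (aromatics + resins)

# Illustrative SARA fractions (wt%), not data from the paper:
cii = colloidal_instability_index(saturates=46.0, aromatics=32.0,
                                  resins=14.0, asphaltenes=8.0)
print(round(cii, 2))  # 1.17
```

In the usual reading of this index, values above about 0.9 suggest an unstable crude and values below about 0.7 a stable one, which is consistent with the paper's point that stability depends on all four fractions jointly.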

  14. Search for C II Emission on Cosmological Scales at Redshift z ∼ 2.6

    Science.gov (United States)

    Pullen, Anthony R.; Serra, Paolo; Chang, Tzu-Ching; Doré, Olivier; Ho, Shirley

    2018-05-01

    We present a search for C II emission over cosmological scales at high redshifts. The C II line is a prime candidate to be a tracer of star formation over large-scale structure since it is one of the brightest emission lines from galaxies. Redshifted C II emission appears in the submillimeter regime, meaning it could potentially be present in the higher frequency intensity data from the Planck satellite used to measure the cosmic infrared background (CIB). We search for C II emission over redshifts z = 2 - 3.2 in the Planck 545 GHz intensity map by cross-correlating the 3 highest frequency Planck maps with spectroscopic quasars and CMASS galaxies from the Sloan Digital Sky Survey III (SDSS-III), which we then use to jointly fit for C II intensity, CIB parameters, and thermal Sunyaev-Zeldovich (SZ) emission. We report a measurement of an anomalous emission I_ν = 6.6^{+5.0}_{-4.8} × 10^4 Jy/sr at 95% confidence, which could be explained by C II emission, favoring collisional excitation models of C II emission that tend to be more optimistic than models based on C II luminosity scaling relations from local measurements; however, a comparison of Bayesian information criteria reveals that this model and the CIB & SZ only model are equally plausible. Thus, more sensitive measurements will be needed to confirm the existence of large-scale C II emission at high redshifts. Finally, we forecast that intensity maps from Planck cross-correlated with quasars from the Dark Energy Spectroscopic Instrument (DESI) would increase our sensitivity to C II emission by a factor of 5, while the proposed Primordial Inflation Explorer (PIXIE) could increase the sensitivity further.

  15. Kiloparsec-scale gaseous clumps and star formation at z = 5-7

    Science.gov (United States)

    Carniani, S.; Maiolino, R.; Amorin, R.; Pentericci, L.; Pallottini, A.; Ferrara, A.; Willott, C. J.; Smit, R.; Matthee, J.; Sobral, D.; Santini, P.; Castellano, M.; De Barros, S.; Fontana, A.; Grazian, A.; Guaita, L.

    2018-05-01

    We investigate the morphology of the [CII] emission in a sample of "normal" star-forming galaxies at 5 < z < 7. By taking into account the presence of all these components, we find that the L[CII]-SFR relation at early epochs is fully consistent with the local relation, but it has a dispersion of 0.48±0.07 dex, which is about two times larger than observed locally. We also find that the deviation from the local L[CII]-SFR relation has a weak anti-correlation with the EW(Lyα). The morphological analysis also reveals that [CII] emission is generally much more extended than the UV emission. As a consequence, these primordial galaxies are characterised by a [CII] surface brightness generally much lower than expected from the local Σ_[CII]-Σ_SFR relation. These properties are likely a consequence of a combination of different effects, namely: gas metallicity, [CII] emission from obscured star-forming regions, strong variations of the ionisation parameter, and circumgalactic gas in accretion or ejected by these primeval galaxies.

  16. Apolipoprotein CII

    Science.gov (United States)

    ... lipoprotein (VLDL), which is made up of mostly triglycerides (a type of fat in your blood). This ... gov/pubmed/23257303. Semenkovich CF. Disorders of lipid metabolism. In: Goldman L, Schafer AI, eds. Goldman's Cecil ...

  17. Generating Natural Language Under Pragmatic Constraints.

    Science.gov (United States)

    1987-03-01

    If an interpretation with the group's concepts is not already present in memory, a new high-level interpretation can be created and indexed off ...

  18. A Novel Apolipoprotein C-II Mimetic Peptide That Activates Lipoprotein Lipase and Decreases Serum Triglycerides in Apolipoprotein E–Knockout Mice

    Science.gov (United States)

    Sakurai, Toshihiro; Sakurai-Ikuta, Akiko; Sviridov, Denis; Freeman, Lita; Ahsan, Lusana; Remaley, Alan T.

    2015-01-01

    Apolipoprotein A-I (apoA-I) mimetic peptides are currently being developed as possible new agents for the treatment of cardiovascular disease based on their ability to promote cholesterol efflux and their other beneficial antiatherogenic properties. Many of these peptides, however, have been reported to cause transient hypertriglyceridemia due to inhibition of lipolysis by lipoprotein lipase (LPL). We describe a novel bihelical amphipathic peptide (C-II-a) that contains an amphipathic helix (18A) for binding to lipoproteins and stimulating cholesterol efflux as well as a motif based on the last helix of apolipoprotein C-II (apoC-II) that activates lipolysis by LPL. The C-II-a peptide promoted cholesterol efflux from ATP-binding cassette transporter ABCA1-transfected BHK cells similar to apoA-I mimetic peptides. Furthermore, it was shown in vitro to be comparable to the full-length apoC-II protein in activating lipolysis by LPL. When added to serum from a patient with apoC-II deficiency, it restored normal levels of LPL-induced lipolysis and also enhanced lipolysis in serum from patients with type IV and V hypertriglyceridemia. Intravenous injection of C-II-a (30 mg/kg) in apolipoprotein E–knockout mice resulted in a significant reduction of plasma cholesterol and triglycerides of 38 ± 6% and 85 ± 7%, respectively, at 4 hours. When coinjected with the 5A peptide (60 mg/kg), the C-II-a (30 mg/kg) peptide was found to completely block the hypertriglyceridemic effect of the 5A peptide in C57Bl/6 mice. In summary, C-II-a is a novel peptide based on apoC-II, which promotes cholesterol efflux and lipolysis and may therefore be useful for the treatment of apoC-II deficiency and other forms of hypertriglyceridemia. PMID:25395590

  19. Chronic inflammation of the prostate type IV with respect to risk of prostate cancer

    Directory of Open Access Journals (Sweden)

    Antonio B. Porcaro

    2014-09-01

    Background: Chronic inflammatory infiltrate (CII) might be involved in prostate cancer (PCA) and benign hyperplasia (BPH); however, its significance is controversial. Chronic inflammatory prostatitis type IV is the most common non-cancer diagnosis in men undergoing biopsy because of suspected PCA. Objective: To evaluate potential associations of coexistent CII and PCA in biopsy specimens after prostate assessment. Design, setting, and participants: Between January 2007 and December 2008, 415 consecutive patients who underwent prostate biopsy were retrospectively evaluated. The investigated variables included age (years) and PSA (ug/l); moreover, CII+, glandular atrophy (GA+), glandular hyperplasia (GH+), prostate intraepithelial neoplasm (PIN+), atypical small acinar cell proliferation (ASAP+) and PCA-positive cores (P+) were evaluated as categorical and continuous (proportion of positive cores) variables. Outcome measurements and statistical analysis: Associations of CII+ and PCA risk were assessed by statistical methods. Results and limitations: In the patient population, a biopsy core positive for PCA was detected in 34.2% of cases and the rate of high-grade PCA (HGPCA: bGS ≥ 8) was 4.82%. CII+ was significantly and inversely associated with a positive biopsy core P+ (P < 0.0001; OR = 0.26) and HGPCA (P = 0.0005; OR = 0.05). Moreover, the associations indicated that patients with coexistent CII+ on needle biopsy were 74% less likely to have coexistent PCA than men without CII+, as well as 95% less likely to have HGPCA in the biopsy core than men without coexistent CII+. There were limits in our study, which was single centre and included only one dedicated pathologist. Conclusions: There was an inverse association between chronic inflammation of the prostate type IV and risk of PCA; moreover, HGPCA was less likely to be detected in cancers associated with coexistent CII. In the prostate microenvironment, prostate chronic inflammation may be protective; however, its role in
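As a quick arithmetic check on the reported associations: an odds ratio below 1 translates to a percent reduction in odds of (1 − OR) × 100, which reproduces the 74% and 95% figures quoted above.

```python
def pct_less_likely(odds_ratio):
    """Percent reduction in odds implied by an odds ratio < 1."""
    return round((1.0 - odds_ratio) * 100)

# The abstract's two reported associations:
print(pct_less_likely(0.26))  # 74  (CII+ vs coexistent PCA)
print(pct_less_likely(0.05))  # 95  (CII+ vs HGPCA)
```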

  20. Dynamics Determine Signaling in a Multicomponent System Associated with Rheumatoid Arthritis.

    Science.gov (United States)

    Lindgren, Cecilia; Tyagi, Mohit; Viljanen, Johan; Toms, Johannes; Ge, Changrong; Zhang, Naru; Holmdahl, Rikard; Kihlberg, Jan; Linusson, Anna

    2018-05-24

    Strategies that target multiple components are usually required for treatment of diseases originating from complex biological systems. The multicomponent system consisting of the DR4 major histocompatibility complex type II molecule, the glycopeptide CII259-273 from type II collagen, and a T-cell receptor is associated with development of rheumatoid arthritis (RA). We introduced non-native amino acids and amide bond isosteres into CII259-273 and investigated the effect on binding to DR4 and the subsequent T-cell response. Molecular dynamics simulations revealed that complexes between DR4 and derivatives of CII259-273 were highly dynamic. Signaling in the overall multicomponent system was found to depend on formation of an appropriate number of dynamic intramolecular hydrogen bonds between DR4 and CII259-273, together with the positioning of the galactose moiety of CII259-273 in the DR4 binding groove. Interestingly, the system tolerated modifications at several positions in CII259-273, indicating opportunities to use analogues to increase our understanding of how rheumatoid arthritis develops and for evaluation as vaccines to treat RA.

  1. A novel recombinant peptide containing only two T-cell tolerance epitopes of chicken type II collagen that suppresses collagen-induced arthritis.

    Science.gov (United States)

    Xi, Caixia; Tan, Liuxin; Sun, Yeping; Liang, Fei; Liu, Nan; Xue, Hong; Luo, Yuan; Yuan, Fang; Sun, Yuying; Xi, Yongzhi

    2009-02-01

    Immunotherapy of rheumatoid arthritis (RA) using oral-dosed native chicken or bovine type II collagen (nCII) to induce specific immune tolerance is an attractive strategy. However, the majority of clinical trials of oral tolerance in human diseases including RA in recent years have been disappointing. Here, we describe a novel recombinant peptide, rcCTE1-2, which contains only two tolerogenic epitopes (CTE1 and CTE2) of chicken type II collagen (cCII). These epitopes are critical T-cell determinants for suppression of RA; rcCTE1-2 was developed here and its suppressive effects were compared with those of ncCII in the collagen-induced arthritis (CIA) model. The rcCTE1-2 was produced using the prokaryotic pET expression system and purified by Ni-NTA His affinity chromatography. Strikingly, our results showed clearly that rcCTE1-2 was as efficacious as ncCII at the dose of 50 microg/kg/d. This dose significantly reduced footpad swelling, arthritic incidence and scores, and deferred the onset of disease. Furthermore, rcCTE1-2 at 50 microg/kg/d could lower the level of anti-nCII antibody in the serum of CIA animals, decrease the Th1-cytokine IFN-gamma level, and increase the Th3-cytokine TGF-beta(1) level produced by spleen cells from CIA mice after in vivo stimulation with ncCII. Importantly, rcCTE1-2 was even more potent than native cCII, which has been used in the clinic for RA. Equally importantly, the finding that the major T-cell determinants of cCII are also recognized by H-2(b) MHC-restricted T cells has not previously been reported. Taken together, these results suggest that we have successfully developed a novel recombinant peptide, rcCTE1-2, that can induce a potent tolerogenic response in CIA.

  2. Apolipoprotein C-II Is a Potential Serum Biomarker as a Prognostic Factor of Locally Advanced Cervical Cancer After Chemoradiation Therapy

    International Nuclear Information System (INIS)

    Harima, Yoko; Ikeda, Koshi; Utsunomiya, Keita; Komemushi, Atsushi; Kanno, Shohei; Shiga, Toshiko; Tanigawa, Noboru

    2013-01-01

    Purpose: To determine pretreatment serum protein levels as a generally applicable measurement to predict chemoradiation treatment outcomes in patients with locally advanced squamous cell cervical carcinoma (CC). Methods and Materials: In a screening study, measurements were conducted twice. First, 6 serum samples from CC patients (3 with no evidence of disease [NED] and 3 with cancer-caused death [CD]) and 2 from healthy controls were tested. Next, 12 serum samples from different CC patients (8 NED, 4 CD) and 4 from healthy controls were examined. Subsequently, 28 different CC patients (18 NED, 10 CD) and 9 controls were analyzed in the validation study. Protein chips were treated with the sample sera, and the serum protein pattern was detected by surface-enhanced laser desorption and ionization–time-of-flight mass spectrometry (SELDI-TOF MS). Single MS-based peptide mass fingerprinting (PMF) and tandem MS (MS/MS)-based peptide/protein identification methods were then used to identify the protein corresponding to the detected peak, and a turbidimetric assay was used to measure the levels of the protein that best matched this peptide peak. Results: The same peak at 8918 m/z was identified in both screening studies. Neither the screening study nor the validation study showed significant differences in the appearance of this peak between the controls and NED. However, the intensity of the peak in CD was significantly lower than that of controls and NED in both pilot studies (P=.02, P=.04) and in the validation study (P=.01, P=.001). The protein that best matched this peptide peak at 8918 m/z was identified as apolipoprotein C-II (ApoC-II) using the PMF and MS/MS methods. The turbidimetric assay showed that mean serum levels of ApoC-II tended to be lower in the CD group than in the NED group (P=.078). Conclusion: ApoC-II could be used as a biomarker for predicting and estimating the radiation treatment outcome of patients with CC.
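The peak-to-protein matching step can be illustrated with a toy nearest-mass lookup. Note this is a deliberate simplification of the PMF/MS-MS identification the study actually used, and the candidate masses below are approximate literature values, not data from the study:

```python
# A minimal sketch of matching a SELDI-TOF peak to a candidate protein
# by nearest average mass. The candidate masses are illustrative,
# approximate values, not taken from the study.
candidates = {
    "ApoC-II": 8915.0,          # approximate average mass of mature apoC-II
    "ApoC-III": 8765.0,
    "ApoA-II monomer": 8708.0,
}

def best_match(peak_mz, proteins, tol_da=20.0):
    """Return the protein whose mass lies closest to the peak,
    or None if nothing falls within the tolerance window."""
    name, mass = min(proteins.items(), key=lambda kv: abs(kv[1] - peak_mz))
    return name if abs(mass - peak_mz) <= tol_da else None

print(best_match(8918.0, candidates))  # ApoC-II
```

A real PMF workflow matches many tryptic-fragment masses against a database rather than a single intact mass, but the tolerance-window idea is the same.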

  3. Apolipoprotein C-II Is a Potential Serum Biomarker as a Prognostic Factor of Locally Advanced Cervical Cancer After Chemoradiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Harima, Yoko, E-mail: harima@takii.kmu.ac.jp [Department of Radiology, Takii Hospital, Kansai Medical University, Moriguchi, Osaka (Japan); Ikeda, Koshi; Utsunomiya, Keita; Komemushi, Atsushi; Kanno, Shohei; Shiga, Toshiko [Department of Radiology, Takii Hospital, Kansai Medical University, Moriguchi, Osaka (Japan); Tanigawa, Noboru [Department of Radiology, Hirakata Hospital, Kansai Medical University, Hirakata, Osaka (Japan)

    2013-12-01

    Purpose: To determine pretreatment serum protein levels as a generally applicable measurement to predict chemoradiation treatment outcomes in patients with locally advanced squamous cell cervical carcinoma (CC). Methods and Materials: In a screening study, measurements were conducted twice. First, 6 serum samples from CC patients (3 with no evidence of disease [NED] and 3 with cancer-caused death [CD]) and 2 from healthy controls were tested. Next, 12 serum samples from different CC patients (8 NED, 4 CD) and 4 from healthy controls were examined. Subsequently, 28 different CC patients (18 NED, 10 CD) and 9 controls were analyzed in the validation study. Protein chips were treated with the sample sera, and the serum protein pattern was detected by surface-enhanced laser desorption and ionization–time-of-flight mass spectrometry (SELDI-TOF MS). Single MS-based peptide mass fingerprinting (PMF) and tandem MS (MS/MS)-based peptide/protein identification methods were then used to identify the protein corresponding to the detected peak, and a turbidimetric assay was used to measure the levels of the protein that best matched this peptide peak. Results: The same peak at 8918 m/z was identified in both screening studies. Neither the screening study nor the validation study showed significant differences in the appearance of this peak between the controls and the NED group. However, the intensity of the peak in the CD group was significantly lower than in controls and NED in both pilot studies (P=.02, P=.04) and in the validation study (P=.01, P=.001). The protein that best matched this peptide peak at 8918 m/z was identified as apolipoprotein C-II (ApoC-II) using PMF and MS/MS methods. The turbidimetric assay showed that mean serum levels of ApoC-II tended to be lower in the CD group than in the NED group (P=.078). Conclusion: ApoC-II could be used as a biomarker for predicting and estimating the radiation treatment outcome of patients with CC.

  4. Approving cancer treatments based on endpoints other than overall survival: an analysis of historical data using the PACE Continuous Innovation Indicators™ (CII).

    Science.gov (United States)

    Brooks, Neon; Campone, Mario; Paddock, Silvia; Shortenhaus, Scott; Grainger, David; Zummo, Jacqueline; Thomas, Samuel; Li, Rose

    2017-01-01

    There is an active debate about the role that endpoints other than overall survival (OS) should play in the drug approval process. Yet the term 'surrogate endpoint' implies that OS is the only critical metric for regulatory approval of cancer treatments. We systematically analyzed the relationship between U.S. Food and Drug Administration (FDA) approval and publication of OS evidence to better understand the risks and benefits of delaying approval until OS evidence is available. Using the PACE Continuous Innovation Indicators (CII) platform, we analyzed the effects of cancer type, treatment goal, and year of approval on the lag time between FDA approval and publication of the first significant OS finding for 53 treatments approved between 1952 and 2016 for 10 cancer types (n = 71 approved indications). More than 59% of treatments were approved before significant OS data for the approved indication were published. Of the drugs in the sample, 31% had lags between approval and first published OS evidence of 4 years or longer. The average number of years between approval and first OS evidence varied by cancer type and did not reliably predict the eventual amount of OS evidence accumulated. Striking the right balance between early access and minimizing risk is a central challenge for regulators worldwide. We illustrate that endpoints other than OS have long helped to provide timely access to new medicines, including many current standards of care. We found that many critical drugs are approved years before OS data are published, and that OS may not be the most appropriate endpoint in some treatment contexts. Our examination of approved treatments without significant OS data suggests contexts where OS may not be the most relevant endpoint and highlights the importance of using a wide variety of fit-for-purpose evidence types in the approval process.

  5. Quadriceps exercise intolerance in patients with chronic obstructive pulmonary disease

    DEFF Research Database (Denmark)

    Gifford, Jayson R; Trinity, Joel D; Layec, Gwenael

    2015-01-01

    This study sought to determine if qualitative alterations in skeletal muscle mitochondrial respiration, associated with decreased mitochondrial efficiency, contribute to exercise intolerance in patients with chronic obstructive pulmonary disease (COPD). Using permeabilized muscle fibers from the vastus lateralis of 13 patients with COPD and 12 healthy controls, complex I (CI)- and complex II (CII)-driven State 3 mitochondrial respiration were measured separately (State 3:CI and State 3:CII) and in combination (State 3:CI+CII). State 2 respiration was also measured. Exercise tolerance was assessed... (P<.05). Overall, this study indicates that COPD is associated with qualitative alterations in skeletal muscle mitochondria that affect the contribution of CI- and CII-driven respiration, which potentially contributes to the exercise intolerance associated with this disease.

  6. Construction of collagen II/hyaluronate/chondroitin-6-sulfate tri-copolymer scaffold for nucleus pulposus tissue engineering and preliminary analysis of its physico-chemical properties and biocompatibility.

    Science.gov (United States)

    Li, Chang-Qing; Huang, Bo; Luo, Gang; Zhang, Chuan-Zhi; Zhuang, Ying; Zhou, Yue

    2010-02-01

    To construct a novel scaffold for nucleus pulposus (NP) tissue engineering, a porous type II collagen (CII)/hyaluronate (HyA)-chondroitin-6-sulfate (6-CS) scaffold was prepared using the 1-ethyl-3-(3-dimethylaminopropyl)-carbodiimide (EDC) and N-hydroxysuccinimide (NHS) cross-linking system. The physico-chemical properties and biocompatibility of the CII/HyA-CS scaffolds were evaluated. The results suggested that CII/HyA-CS scaffolds have a highly porous structure (porosity: 94.8 +/- 1.5%), high water-binding capacity (79.2 +/- 2.8%) and significantly improved mechanical stability after EDC/NHS crosslinking (denaturation temperature: 74.6 +/- 1.8 and 58.1 +/- 2.6 degrees C for the crosslinked and non-crosslinked scaffolds, respectively; collagenase degradation rate: 39.5 +/- 3.4 and 63.5 +/- 2.0% for the crosslinked and non-crosslinked scaffolds, respectively). The CII/HyA-CS scaffolds also showed satisfactory cytocompatibility and histocompatibility as well as low immunogenicity. These results indicate CII/HyA-CS scaffolds may be an alternative material for NP tissue engineering due to the similarity of their composition and physico-chemical properties to those of the extracellular matrix (ECM) of native NP.

  7. Creation of n-dimension spectra, projection and visualization of sub-volumes

    International Nuclear Information System (INIS)

    Be, M.M.

    1980-06-01

    A database has been created that allows the processing of multiparameter data (five parameters at most) resulting from nuclear physics experiments, with the help of a CII 10 020 computer. From this base, one can set conditions on the various parameters and thus obtain one to five monoparametric spectra and one to four bidimensional spectra, which are created simultaneously. These mono- and bidimensional spectra can be displayed as soon as their extraction is complete.
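    The gating scheme this record describes (conditions set on several parameters of an event base, with monoparametric and bidimensional spectra extracted simultaneously) can be sketched as follows. This is a minimal, hypothetical reconstruction in Python: the original ran on a CII 10 020 and the record does not specify its data format, so the function name, gate interface and bin counts here are all invented.

    ```python
    def project_spectra(events, gates, axes, nbins=64, full_scale=4096):
        """Apply gate conditions to a multiparameter event list (up to five
        parameters per event) and build the requested spectra in one pass.

        events : list of tuples, one value per parameter
        gates  : {param_index: (low, high)} conditions on the parameters
        axes   : ints for monoparametric spectra, (i, j) pairs for
                 bidimensional ones (interface is illustrative)
        """
        width = full_scale // nbins
        spectra = {ax: ([[0] * nbins for _ in range(nbins)]
                        if isinstance(ax, tuple) else [0] * nbins)
                   for ax in axes}
        for ev in events:
            # keep only events that satisfy every gate condition
            if not all(lo <= ev[p] < hi for p, (lo, hi) in gates.items()):
                continue
            for ax in axes:
                if isinstance(ax, tuple):   # bidimensional spectrum
                    i, j = ev[ax[0]] // width, ev[ax[1]] // width
                    spectra[ax][min(i, nbins - 1)][min(j, nbins - 1)] += 1
                else:                       # monoparametric spectrum
                    spectra[ax][min(ev[ax] // width, nbins - 1)] += 1
        return spectra

    # two events pass the gate on parameter 0, one is rejected
    events = [(100, 200, 300), (4000, 200, 300), (100, 3000, 300)]
    result = project_spectra(events, gates={0: (0, 1000)}, axes=[1, (1, 2)])
    ```

    Building all requested spectra in a single pass over the event base mirrors the record's point that the spectra "are created simultaneously" rather than by re-reading the data once per spectrum.
    
    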

  8. Data acquisition system using a C 90-10 computer

    International Nuclear Information System (INIS)

    Smiljanic, Gabro

    1969-05-01

    The aim of this study is to make possible the acquisition of experimental data into the memory of a digital computer. These data come from analog-to-digital converters that analyze the amplitude of the pulses provided by detectors. Normally the computer executes its main program (data processing, transfer to magnetic tape, display, etc.). When information is available at the output of a converter, an interrupt of the main program is requested and, once granted, a subroutine handles the transfer of the information into the computer. Bi- and tri-parametric acquisition is also considered. The computer and the converters are commercial devices (the C 90-10 computer from CII and the CA 12 or CA 25 converters from Intertechnique), while the systems for adapting the input-output levels and for display were designed and built at CEA Saclay. An interface device was built to connect the converters; this is the hardware side of the system. The programs necessary for the operation of the computer were also studied and developed; this is the software side of the system. As far as possible the interface is designed to be universal, i.e. able to work with equipment from other manufacturers. The acquisition of the data is carried out in two phases: a) the converter expresses the amplitude of the input signal as a binary number, which is transferred into the interface at the same time as an interrupt of the main program is requested; b) after acceptance of this interrupt, the subroutine transfers the information from the interface into the computer, then increments the word located at the address determined from the information received. In other words, the system behaves like an amplitude analyzer, whose operation is well known, but it is much more flexible because the programs can be quickly adapted to the needs of the experiment at hand and because of the possibility to treat
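    The acquisition cycle this record describes (a converter event raises an interrupt, and the service routine increments the memory word addressed by the converted amplitude) is essentially a pulse-height histogram. A toy sketch, assuming a hypothetical interrupt interface and channel count since the original is 1969 CII/Intertechnique hardware:

    ```python
    class AmplitudeAnalyzer:
        """Toy model of the interrupt-driven acquisition described above:
        the main program runs freely, and each converter event invokes a
        service routine that increments one word of histogram memory."""

        def __init__(self, channels=4096):
            self.memory = [0] * channels   # one counting word per amplitude channel

        def on_converter_interrupt(self, code):
            # `code` stands for the binary number the converter produced
            # for the pulse amplitude; out-of-range codes are discarded
            if 0 <= code < len(self.memory):
                self.memory[code] += 1

    analyzer = AmplitudeAnalyzer()
    for code in (10, 10, 500, 4095):   # simulated converter outputs
        analyzer.on_converter_interrupt(code)
    ```

    The flexibility the record claims over a hard-wired amplitude analyzer lives in this routine: changing the addressing rule (e.g. combining two converter codes for bi-parametric acquisition) is a program change, not a hardware change.
    
    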

  9. Epicutaneous immunization with type II collagen inhibits both onset and progression of chronic collagen-induced arthritis.

    Directory of Open Access Journals (Sweden)

    Jessica Strid

    Full Text Available Epicutaneous immunization is a potential non-invasive technique for antigen-specific immune-modulation. Topical application of protein antigens to barrier-disrupted skin induces potent antigen-specific immunity with a strong Th2-bias. In this study, we investigate whether the autoimmune inflammatory response of chronic collagen-induced arthritis (CCIA) in DBA/1-TCR-beta Tg mice can be modified by epicutaneous immunization. We show that epicutaneous immunization with type II collagen (CII) inhibited development and progression of CCIA and, importantly, also ameliorated ongoing disease as indicated by clinical scores of disease severity, paw swelling and joint histology. Treated mice show reduced CII-driven T cell proliferation and IFN-gamma production, as well as significantly lower levels of CII-specific IgG2a serum antibodies. In contrast, CII-driven IL-4 production and IgE antibody levels were increased, consistent with skewing of the CII response from Th1 to Th2 in treated mice. IL-4 production in treated mice was inversely correlated with disease severity. Moreover, T cells from treated mice inhibited proliferation and IFN-gamma production by T cells from CCIA mice, suggesting induction of regulatory T cells that actively inhibit effector responses in arthritic mice. The levels of CD4+CD25+ T cells were however not increased following epicutaneous CII treatment. Together, these results suggest that epicutaneous immunization may be used as an immune-modulating procedure to actively re-programme pathogenic Th1 responses, and could have potential as a novel, specific and simple treatment for chronic autoimmune inflammatory diseases such as rheumatoid arthritis.

  10. Understanding Cyber Threats and Vulnerabilities

    NARCIS (Netherlands)

    Luiijf, H.A.M.

    2012-01-01

    This chapter reviews current and anticipated cyber-related threats to the Critical Information Infrastructure (CII) and Critical Infrastructures (CI). The potential impact of cyber-terrorism on CII and CI has been raised many times since the term was first coined during the 1980s. Being the

  11. Endogenous collagen peptide activation of CD1d-restricted NKT cells ameliorates tissue-specific inflammation in mice

    DEFF Research Database (Denmark)

    Liu, Yawei; Teige, Anna; Mondoc, Emma

    2011-01-01

    NKT cells in the mouse recognize antigen in the context of the MHC class I-like molecule CD1d and play an important role in peripheral tolerance and protection against autoimmune and other diseases. NKT cells are usually activated by CD1d-presented lipid antigens. However, peptide recognition...... in the context of CD1 has also been documented, although no self-peptide ligands have been reported to date. Here, we have identified an endogenous peptide that is presented by CD1d to activate mouse NKT cells. This peptide, the immunodominant epitope from mouse collagen type II (mCII707-721), was not associated...... with either MHC class I or II. Activation of CD1d-restricted mCII707-721-specific NKT cells was induced via TCR signaling and classical costimulation. In addition, mCII707-721-specific NKT cells induced T cell death through Fas/FasL, in an IL-17A-independent fashion. Moreover, mCII707-721-specific NKT cells...

  12. Hydrophobic Man-1-P derivatives correct abnormal glycosylation in Type I congenital disorder of glycosylation fibroblasts.

    Science.gov (United States)

    Eklund, Erik A; Merbouh, Nabyl; Ichikawa, Mie; Nishikawa, Atsushi; Clima, Jessica M; Dorman, James A; Norberg, Thomas; Freeze, Hudson H

    2005-11-01

    Patients with Type I congenital disorders of glycosylation (CDG-I) make incomplete lipid-linked oligosaccharides (LLO). These glycans are poorly transferred to proteins resulting in unoccupied glycosylation sequons. Mutations in phosphomannomutase (PMM2) cause CDG-Ia by reducing the activity of PMM, which converts mannose (Man)-6-P to Man-1-P before formation of GDP-Man. These patients have reduced Man-1-P and GDP-Man. To replenish intracellular Man-1-P pools in CDG-Ia cells, we synthesized two hydrophobic, membrane permeable acylated versions of Man-1-P and determined their ability to normalize LLO size and N-glycosylation in CDG-Ia fibroblasts. Both compounds, compound I (diacetoxymethyl 2,3,4,6-tetra-O-acetyl-alpha-D-mannopyranosyl phosphate) (C-I) and compound II (diacetoxymethyl 2,3,4,6-tetra-O-ethyloxycarbonyl-alpha-D-mannopyranosyl phosphate) (C-II), contain two acetoxymethyl (CH2OAc) groups O-linked to phosphorous. C-I contains acetyl esters and C-II contains ethylcarbonate (CO2Et) esters on the Man residue. Both C-I and C-II normalized truncated LLO, but C-II was about 2-fold more efficient than C-I. C-II replenished the GDP-Man pool in CDG-Ia cells and was more efficiently incorporated into glycoproteins than exogenous Man at low concentrations (25-75 mM). In a glycosylation assay of DNaseI in CDG-Ia cells, C-II restored glycosylation to control cell levels. C-II also corrected impaired LLO biosynthesis in cells from a Dolichol (Dol)-P-Man deficient patient (CDG-Ie) and partially corrected LLO in cells from an ALG12 mannosyltransferase-deficient patient (CDG-Ig), whereas cells from an ALG3-deficient patient (CDG-Id) and from an MPDU1-deficient patient (CDG-If) were not corrected. These results validate the general concept of using pro-Man-1-P substrates as potential therapeutics for CDG-I patients.

  13. Effect of carbon ion implantation on the tribology of metal-on-metal bearings for artificial joints.

    Science.gov (United States)

    Koseki, Hironobu; Tomita, Masato; Yonekura, Akihiko; Higuchi, Takashi; Sunagawa, Sinya; Baba, Koumei; Osaki, Makoto

    2017-01-01

    Metal-on-metal (MoM) bearings have become popular due to a major advantage over metal-on-polymer bearings for total hip arthroplasty in that the larger femoral head and hydrodynamic lubrication of the former reduce the rate of wear. However, concerns remain regarding adverse reactions to metal debris, including metallosis caused by metal wear generated at the taper-head interface and other modular junctions. Our group has hypothesized that carbon ion implantation (CII) may improve metal wear properties. The purpose of this study was to investigate the wear properties and friction coefficients of CII surfaces with an aim to ultimately apply these surfaces to MoM bearings in artificial joints. CII was applied to cobalt-chromium-molybdenum (Co-Cr-Mo) alloy substrates by plasma source ion implantation. The substrates were characterized using scanning electron microscopy and a 3D measuring laser microscope. Sliding contact tests were performed with a simple-geometry pin-on-plate wear tester at a load of 2.5 N, a calculated contact pressure of 38.5 MPa (max: 57.8 MPa), a reciprocating velocity of 30 mm/s, a stroke length of 60 mm, and a reciprocating cycle count of 172,800 cycles. The surfaces of the CII substrates were generally featureless, with a smooth surface topography at the same level as untreated Co-Cr-Mo alloy. Compared to the untreated Co-Cr-Mo alloy, the CII-treated bearings had lower friction coefficients, higher resistance to catastrophic damage, and prevented the adhesion of wear debris. The results of this study suggest that the CII surface stabilizes the wear status due to the low friction coefficient and low infiltration of partner materials, and these properties also prevent the adhesion of wear debris and inhibit excessive wear. Carbon is considered to be biologically inert; therefore, CII is anticipated to be applicable to the bearing surfaces of MoM prostheses.

  14. Class I and II Small Heat Shock Proteins Together with HSP101 Protect Protein Translation Factors during Heat Stress.

    Science.gov (United States)

    McLoughlin, Fionn; Basha, Eman; Fowler, Mary E; Kim, Minsoo; Bordowitz, Juliana; Katiyar-Agarwal, Surekha; Vierling, Elizabeth

    2016-10-01

    The ubiquitous small heat shock proteins (sHSPs) are well documented to act in vitro as molecular chaperones to prevent the irreversible aggregation of heat-sensitive proteins. However, the in vivo activities of sHSPs remain unclear. To investigate the two most abundant classes of plant cytosolic sHSPs (class I [CI] and class II [CII]), RNA interference (RNAi) and overexpression lines were created in Arabidopsis (Arabidopsis thaliana) and shown to have reduced and enhanced tolerance, respectively, to extreme heat stress. Affinity purification of CI and CII sHSPs from heat-stressed seedlings recovered eukaryotic translation elongation factor (eEF) 1B (α-, β-, and γ-subunits) and eukaryotic translation initiation factor 4A (three isoforms), although the association with CI sHSPs was stronger and additional proteins involved in translation were recovered with CI sHSPs. eEF1B subunits became partially insoluble during heat stress and, in the CI and CII RNAi lines, showed reduced recovery to the soluble cell fraction after heat stress, which was also dependent on HSP101. Furthermore, after heat stress, CI sHSPs showed increased retention in the insoluble fraction in the CII RNAi line and vice versa. Immunolocalization revealed that both CI and CII sHSPs were present in cytosolic foci, some of which colocalized with HSP101 and with eEF1Bγ and eEF1Bβ. Thus, CI and CII sHSPs have both unique and overlapping functions and act either directly or indirectly to protect specific translation factors in cytosolic stress granules. © 2016 American Society of Plant Biologists. All Rights Reserved.

  15. Helminth antigens enable CpG-activated dendritic cells to inhibit the symptoms of collagen-induced arthritis through Foxp3+ regulatory T cells.

    Directory of Open Access Journals (Sweden)

    Franco Carranza

    Full Text Available Dendritic cells (DC) have the potential to control the outcome of autoimmunity by modulating the immune response. In this study, we tested the ability of Fasciola hepatica total extract (TE) to induce tolerogenic properties in CpG-ODN (CpG)-matured DC, and then evaluated the therapeutic potential of these cells to diminish the inflammatory response in collagen-induced arthritis (CIA). DBA/1J mice were injected with TE plus CpG-treated DC (T/C-DC) pulsed with bovine collagen II (CII) between two immunizations with CII, and clinical CIA scores were determined. The levels of CII-specific IgG2 and IgG1 in sera, the histological analyses of the joints, the cytokine profile in the draining lymph node (DLN) cells and in the joints, and the number and functionality of CD4+CD25+Foxp3+ regulatory T (Treg) cells were evaluated. Vaccination of mice with CII-pulsed T/C-DC diminished the severity and incidence of CIA symptoms and the production of inflammatory cytokines, while inducing the production of anti-inflammatory cytokines. The therapeutic effect was mediated by Treg cells, since the adoptive transfer of CD4+CD25+ T cells inhibited the inflammatory symptoms in CIA. The in vitro blockage of TGF-β in cultures of DLN cells plus CII-pulsed T/C-DC inhibited the expansion of Treg cells. Vaccination with CII-pulsed T/C-DC seems to be a very efficient approach to diminishing the exacerbated immune response in CIA, by inducing the development of Treg cells, and it is therefore an interesting candidate for a cell-based therapy for rheumatoid arthritis (RA).

  16. Collaborative Access Control For Critical Infrastructures

    Science.gov (United States)

    Baina, Amine; El Kalam, Anas Abou; Deswarte, Yves; Kaaniche, Mohamed

    A critical infrastructure (CI) can fail with various degrees of severity due to physical and logical vulnerabilities. Since many interdependencies exist between CIs, failures can have dramatic consequences on the entire infrastructure. This paper focuses on threats that affect information and communication systems that constitute the critical information infrastructure (CII). A new collaborative access control framework called PolyOrBAC is proposed to address security problems that are specific to CIIs. The framework offers each organization participating in a CII the ability to collaborate with other organizations while maintaining control of its resources and internal security policy. The approach is demonstrated on a practical scenario involving the electrical power grid.
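    The central idea of the framework, that each organization collaborates while keeping control of its own resources and internal security policy, can be illustrated with a toy sketch. This is not the actual PolyOrBAC model; the class, rule format and names below are invented for illustration only.

    ```python
    class Organization:
        """Illustrative sketch: each organization in the CII holds its own
        access-control rules, and cross-organization requests are decided
        by the owner of the resource, never by the requester."""

        def __init__(self, name):
            self.name = name
            self.rules = set()   # (partner, role, resource, action) permissions

        def permit(self, partner, role, resource, action):
            # the owner unilaterally grants a partner organization access
            self.rules.add((partner, role, resource, action))

        def authorize(self, partner, role, resource, action):
            # the owner's local policy is the sole authority for its resources
            return (partner, role, resource, action) in self.rules

    # hypothetical CII scenario, loosely echoing the power-grid example
    grid = Organization("power-grid-operator")
    grid.permit("telecom-provider", "dispatcher", "load-report", "read")

    allowed = grid.authorize("telecom-provider", "dispatcher", "load-report", "read")
    denied = grid.authorize("telecom-provider", "dispatcher", "breaker-control", "write")
    ```

    The design point carried over from the paper is the locus of decision: a partner's request is evaluated against the resource owner's policy, so no participating organization has to export or surrender its internal rules.
    
    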

  17. 6 CFR 29.2 - Definitions.

    Science.gov (United States)

    2010-01-01

    ...-management planning, or risk audit; or (3) Any planned or past operational problem or solution regarding... importance or use of the CII, when accompanied by an express statement as described in 6 CFR 29.5. (h... CII Act, including the maintenance, management, and review of the information provided in furtherance...

  18. A mouse model of adoptive immunotherapeutic targeting of autoimmune arthritis using allo-tolerogenic dendritic cells.

    Directory of Open Access Journals (Sweden)

    Jie Yang

    Full Text Available OBJECTIVE: Tolerogenic dendritic cells (tDCs) are immunosuppressive cells with potent tolerogenic ability and are promising immunotherapeutic tools for treating rheumatoid arthritis (RA). However, it is currently unknown whether allogeneic tDCs (allo-tDCs) induce tolerance in RA, and whether the number of adoptively transferred allo-tDCs or the requirement for pulsing with relevant auto-antigens is important. METHODS: tDCs were derived from bone marrow precursors of C57BL/6 mice, induced in vitro by GM-CSF, IL-10 and TGF-β1. Collagen-induced arthritis (CIA) was modeled in DBA/1 mice by immunization with type II collagen (CII) to test the therapeutic ability of allo-tDCs against CIA. Clinical and histopathologic scores, arthritic incidence, cytokine and anti-CII antibody secretion, and CD4+ Th subsets were analyzed. RESULTS: tDCs were characterized in vitro by a stable immature phenotype and a potent immunosuppressive ability. Following adoptive transfer of low doses (5×10⁵) of CII-loaded allo-tDCs, remarkable anti-arthritic activity, improved clinical scores and improved histological end-points were found. Serological levels of inflammatory cytokines and anti-CII antibodies were also significantly lower in CIA mice treated with CII-pulsed allo-tDCs than in those treated with unpulsed allo-tDCs. Moreover, treatment with allo-tDCs altered the proportion of Treg/Th17 cells. CONCLUSION: These findings suggest that allo-tDCs, especially following antigen loading, reduce the severity of CIA in a dose-dependent manner. The dampening of CIA was associated with modulated cytokine secretion, Treg/Th17 polarization and inhibition of anti-CII secretion. This study highlights the potential therapeutic utility of allo-tDCs in autoimmune arthritis and should facilitate the future design of allo-tDC immunotherapeutic strategies against RA.

  19. Endogenous collagen peptide activation of CD1d-restricted NKT cells ameliorates tissue-specific inflammation in mice.

    Science.gov (United States)

    Liu, Yawei; Teige, Anna; Mondoc, Emma; Ibrahim, Saleh; Holmdahl, Rikard; Issazadeh-Navikas, Shohreh

    2011-01-01

    NKT cells in the mouse recognize antigen in the context of the MHC class I-like molecule CD1d and play an important role in peripheral tolerance and protection against autoimmune and other diseases. NKT cells are usually activated by CD1d-presented lipid antigens. However, peptide recognition in the context of CD1 has also been documented, although no self-peptide ligands have been reported to date. Here, we have identified an endogenous peptide that is presented by CD1d to activate mouse NKT cells. This peptide, the immunodominant epitope from mouse collagen type II (mCII707-721), was not associated with either MHC class I or II. Activation of CD1d-restricted mCII707-721-specific NKT cells was induced via TCR signaling and classical costimulation. In addition, mCII707-721-specific NKT cells induced T cell death through Fas/FasL, in an IL-17A-independent fashion. Moreover, mCII707-721-specific NKT cells suppressed a range of in vivo inflammatory conditions, including delayed-type hypersensitivity, antigen-induced airway inflammation, collagen-induced arthritis, and EAE, which were all ameliorated by mCII707-721 vaccination. The findings presented here offer new insight into the intrinsic roles of NKT cells in health and disease. Given the results, endogenous collagen peptide activators of NKT cells may offer promise as novel therapeutics in tissue-specific autoimmune and inflammatory diseases.

  20. C57BL/6 mice need MHC class II Aq to develop collagen-induced arthritis dependent on autoreactive T cells.

    Science.gov (United States)

    Bäcklund, Johan; Li, Cuiqin; Jansson, Erik; Carlsen, Stefan; Merky, Patrick; Nandakumar, Kutty-Selva; Haag, Sabrina; Ytterberg, Jimmy; Zubarev, Roman A; Holmdahl, Rikard

    2013-07-01

    Collagen-induced arthritis (CIA) has traditionally been performed in MHC class II A(q)-expressing mice, whereas most genetically modified mice are on the C57BL/6 background (expressing the b haplotype of the major histocompatibility complex (MHC) class II region). However, C57BL/6 mice develop arthritis after immunisation with chicken-derived collagen type II (CII), but arthritis susceptibility has been variable, and the immune specificity has not been clarified. To establish a CIA model on the C57BL/6 background with a more predictable and defined immune response to CII. Both chicken and rat CII were arthritogenic in C57BL/6 mice provided they were introduced with high doses of Mycobacterium tuberculosis adjuvant. However, contaminating pepsin was strongly immunogenic and was essential for arthritis development. H-2(b)-restricted T cell epitopes on chicken or rat CII could not be identified, but expression of A(q) on the C57BL/6 background induced T cell response to the CII260-270 epitope, and also prolonged the arthritis to be more chronic. The putative (auto)antigen and its arthritogenic determinants in C57BL/6 mice remains undisclosed, questioning the value of the model for addressing T cell-driven pathological pathways in arthritis. To circumvent this impediment, we recommend MHC class II congenic C57BL/6N.Q mice, expressing A(q), with which T cell determinants have been thoroughly characterised.

  1. The familial hyperchylomicronemia syndrome: New insights into underlying genetic defects

    Energy Technology Data Exchange (ETDEWEB)

    Santamarina-Fojo, S.; Brewer, H.B. (National Inst. of Health, Bethesda, MD (United States))

    1991-02-20

    This case history reports the diagnosis of familial hyperchylomicronemia, a rare genetic syndrome inherited as an autosomal recessive trait. It is characterized by severe fasting hypertriglyceridemia and massive accumulation of chylomicrons in plasma. The two major molecular defects in the disease are a deficiency of lipoprotein lipase or of apo C-II. The locations of the mutations in the human apolipoprotein (apo) C-II gene are identified.

  2. Exacerbation of collagen induced arthritis by Fcγ receptor targeted collagen peptide due to enhanced inflammatory chemokine and cytokine production

    Directory of Open Access Journals (Sweden)

    Szarka E

    2012-04-01

    Full Text Available Eszter Szarka,1,* Zsuzsa Neer,1,* Péter Balogh,2 Monika Ádori,1 Adrienn Angyal,1 József Prechl,3 Endre Kiss,1,3 Dorottya Kövesdi,1 Gabriella Sármay1. 1Department of Immunology, Eötvös Loránd University, 1117 Budapest; 2Department of Immunology and Biotechnology, University of Pécs, Pécs; 3Immunology Research Group of the Hungarian Academy of Science at Eötvös Loránd University, 1117 Budapest, Hungary. *These authors contributed equally to this work. Abstract: Antibodies specific for bovine type II collagen (CII) and Fcγ receptors play a major role in collagen-induced arthritis (CIA), a mouse model of rheumatoid arthritis (RA). Our aim was to clarify the mechanism of immune complex-mediated inflammation and modulation of the disease. CII-pre-immunized DBA/1 mice were intravenously boosted with extravidin-coupled biotinylated monomeric CII-peptide epitope (ARGLTGRPGDA) and its complexes with a biotinylated FcγRII/III-specific single-chain Fv (scFv) fragment. Disease scores were monitored, antibody titers and cytokines were determined by ELISA, and binding of complexes was detected by flow cytometry and immunohistochemistry. Cytokine and chemokine secretion was monitored by protein-profiler microarray. When intravenously administered into collagen-primed DBA/1 mice, both the CII-peptide and its complex with 2.4G2 scFv significantly accelerated CIA and increased the severity of the disease, whereas the monomeric peptide and monomeric 2.4G2 scFv had no effect. FcγRII/III-targeted CII-peptide complexes bound to marginal zone macrophages and dendritic cells, and significantly elevated the synthesis of peptide-specific IgG2a. Furthermore, CII-peptide-containing complexes augmented the in vivo secretion of cytokines, including IL-10, IL-12, IL-17, IL-23, and chemokines (CXCL13, MIP-1, MIP-2). These data indicate that complexes formed by the CII-peptide epitope aggravate CIA by inducing the secretion of chemokines and the IL-12/23 family of pro

  3. Effect of carbon ion implantation on the tribology of metal-on-metal bearings for artificial joints

    Directory of Open Access Journals (Sweden)

    Koseki H

    2017-05-01

    Full Text Available Hironobu Koseki,1 Masato Tomita,2 Akihiko Yonekura,2 Takashi Higuchi,1 Sinya Sunagawa,2 Koumei Baba,3,4 Makoto Osaki2. 1Department of Locomotive Rehabilitation Science, Unit of Rehabilitation Sciences, 2Department of Orthopedic Surgery, Nagasaki University Graduate School of Biomedical Sciences, Sakamoto, Nagasaki, Japan; 3Industrial Technology Center of Nagasaki, Ikeda, Omura, Nagasaki, Japan; 4Affiliated Division, Nagasaki University School of Engineering, Bunkyo, Nagasaki, Japan. Abstract: Metal-on-metal (MoM) bearings have become popular due to a major advantage over metal-on-polymer bearings for total hip arthroplasty in that the larger femoral head and hydrodynamic lubrication of the former reduce the rate of wear. However, concerns remain regarding adverse reactions to metal debris, including metallosis caused by metal wear generated at the taper-head interface and other modular junctions. Our group has hypothesized that carbon ion implantation (CII) may improve metal wear properties. The purpose of this study was to investigate the wear properties and friction coefficients of CII surfaces with an aim to ultimately apply these surfaces to MoM bearings in artificial joints. CII was applied to cobalt-chromium-molybdenum (Co-Cr-Mo) alloy substrates by plasma source ion implantation. The substrates were characterized using scanning electron microscopy and a 3D measuring laser microscope. Sliding contact tests were performed with a simple-geometry pin-on-plate wear tester at a load of 2.5 N, a calculated contact pressure of 38.5 MPa (max: 57.8 MPa), a reciprocating velocity of 30 mm/s, a stroke length of 60 mm, and a reciprocating cycle count of 172,800 cycles. The surfaces of the CII substrates were generally featureless, with a smooth surface topography at the same level as untreated Co-Cr-Mo alloy. Compared to the untreated Co-Cr-Mo alloy, the CII-treated bearings had lower friction coefficients, higher resistance to catastrophic damage, and

  4. Evolution of Intra-Industry Trade between the Colombian Regions and the Andean Community, 1990-2004: A Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Héctor Mauricio Posada

    2007-06-01

    This article seeks to measure and compare the levels of intra-industry trade (CII) between Colombia and its main economic regions and the Andean Community of Nations (CAN). It finds that previous studies, owing to geographic and aggregation biases, have overestimated this trade. The CII is predominantly vertical in nature, with Colombia showing a tendency to produce the higher-quality varieties. At the regional level, the "core" departments of each region account for most of the sectoral composition of the significant CII flows with the CAN, showing that this trade is tied to the level of industrial development above other determinants such as geographic proximity.


  6. Regulatory cross-talk links Vibrio cholerae chromosome II replication and segregation.

    Directory of Open Access Journals (Sweden)

    Yoshiharu Yamaichi

    2011-07-01

    There is little knowledge of the factors and mechanisms that coordinate bacterial chromosome replication and segregation. Previous studies have revealed that genes (and their products) that surround the origin of replication (oriCII) of Vibrio cholerae chromosome II (chrII) are critical for controlling the replication and segregation of this chromosome. rctB, which flanks one side of oriCII, encodes a protein that initiates chrII replication; rctA, which flanks the other side of oriCII, inhibits rctB activity. The chrII parAB2 operon, which is essential for chrII partitioning, is located immediately downstream of rctA. Here, we explored how rctA exerts negative control over chrII replication. Our observations suggest that RctB has at least two DNA binding domains: one for binding to oriCII and initiating replication, and the other for binding to rctA and thereby inhibiting RctB's ability to initiate replication. Notably, the inhibitory effect of rctA could be alleviated by binding of ParB2 to a centromere-like parS site within rctA. Furthermore, by binding to rctA, ParB2 and RctB inversely regulate expression of the parAB2 genes. Together, our findings suggest that fluctuations in binding of the partitioning protein ParB2 and the chrII initiator RctB to rctA underlie a regulatory network controlling both oriCII firing and the production of the essential chrII partitioning proteins. Thus, by binding both RctB and ParB2, rctA serves as a nexus for regulatory cross-talk coordinating chrII replication and segregation.

  7. Alpha-1 antitrypsin protein and gene therapies decrease autoimmunity and delay arthritis development in mouse model

    Directory of Open Access Journals (Sweden)

    Atkinson Mark A

    2011-02-01

    Abstract Background Alpha-1 antitrypsin (AAT) is a multi-functional protein that has anti-inflammatory and tissue-protective properties. We previously reported that human AAT (hAAT) gene therapy prevented autoimmune diabetes in non-obese diabetic (NOD) mice and suppressed arthritis development in combination with doxycycline in mice. In the present study we investigated the feasibility of hAAT monotherapy for the treatment of chronic arthritis in collagen-induced arthritis (CIA), a mouse model of rheumatoid arthritis (RA). Methods DBA/1 mice were immunized with bovine type II collagen (bCII) to induce arthritis. These mice were pretreated either with hAAT protein or with a recombinant adeno-associated virus vector expressing hAAT (rAAV-hAAT). Control groups received saline injections. Arthritis development was evaluated by prevalence of arthritis and arthritic index. Serum levels of B-cell activating factor of the TNF-α family (BAFF) and of antibodies against both bovine (bCII) and mouse collagen II (mCII) were tested by ELISA. Results Human AAT protein therapy as well as recombinant adeno-associated virus (rAAV8)-mediated hAAT gene therapy significantly delayed onset and ameliorated disease development of arthritis in the CIA mouse model. Importantly, hAAT therapies significantly reduced serum levels of BAFF and of autoantibodies against bCII and mCII, suggesting that the effects are mediated, at least partially, via B-cells. Conclusion These results suggest a new drug candidate for arthritis therapy. Human AAT protein and gene therapies are able to ameliorate and delay arthritis development and reduce autoimmunity, indicating promising potential of these therapies as a new treatment strategy for RA.

  8. Chemical changes demonstrated in cartilage by synchrotron infrared microspectroscopy in an antibody-induced murine model of rheumatoid arthritis

    Science.gov (United States)

    Croxford, Allyson M.; Selva Nandakumar, Kutty; Holmdahl, Rikard; Tobin, Mark J.; McNaughton, Don; Rowley, Merrill J.

    2011-06-01

    Collagen antibody-induced arthritis develops in mice following passive transfer of monoclonal antibodies (mAbs) to type II collagen (CII) and is attributed to effects of proinflammatory immune complexes, but transferred mAbs may react directly and damagingly with CII. To determine whether such mAbs cause cartilage damage in vivo in the absence of inflammation, mice lacking complement factor 5 that do not develop joint inflammation were injected intravenously with two arthritogenic mAbs to CII, M2139 and CIIC1. Paws were collected at day 3, decalcified, paraffin embedded, and 5-μm sections were examined using standard histology and synchrotron Fourier-transform infrared microspectroscopy (FTIRM). None of the mice injected with mAb showed visual or histological evidence of inflammation, but there were histological changes in the articular cartilage including loss of proteoglycan and altered chondrocyte morphology. Findings using FTIRM at high lateral resolution revealed loss of collagen and the appearance of a new peak at 1635 cm^−1 at the surface of the cartilage, interpreted as cellular activation. Thus, we demonstrate the utility of synchrotron FTIRM for examining chemical changes in diseased cartilage at the microscopic level and establish that arthritogenic mAbs to CII do cause cartilage damage in vivo in the absence of inflammation.

  9. Reversal of tolerance induced by transplantation of skin expressing the immunodominant T cell epitope of rat type II collagen entitles development of collagen-induced arthritis but not graft rejection

    DEFF Research Database (Denmark)

    Bäcklund, Johan; Treschow, Alexandra; Firan, Mihail

    2002-01-01

    … collagen (CI), e.g. in skin, are tolerized against rat CII and resistant to CIA. In this study we transplanted skin from TSC transgenic mice onto non-transgenic CIA-susceptible littermates to investigate whether introduction of this epitope to a naïve immune system would lead to T cell priming and graft rejection, or instead to tolerance and arthritis protection. Interestingly, TSC grafts were accepted, and not even immunization of recipient mice with CII in adjuvant induced graft rejection. Instead, TSC skin recipients displayed a reduced T and B cell response to CII and were also protected from arthritis. However, additional priming could break arthritis protection and was accompanied by an increased T cell response to the grafted epitope. Strikingly, despite the regained T cell response, development of arthritis was not accompanied by graft rejection, showing that these immune-mediated inflammatory …

  10. Arthrogenicity of type II collagen monoclonal antibodies associated with complement activation and antigen affinity

    OpenAIRE

    Koobkokkruad, Thongchai; Kadotani, Tatsuya; Hutamekalin, Pilaiwanwadee; Mizutani, Nobuaki; Yoshino, Shin

    2011-01-01

    Abstract Background The collagen antibody-induced arthritis (CAIA) model, which employs a cocktail of monoclonal antibodies (mAbs) to type II collagen (CII), has been widely used for studying the pathogenesis of autoimmune arthritis. In this model, not all mAbs to CII are capable of inducing arthritis because one of the initial events is the formation of collagen-antibody immune complexes on the cartilage surface or in the synovium, and subsequent activation of the complement by the complexes...

  11. Betahistine attenuates murine collagen-induced arthritis by suppressing both inflammatory and Th17 cell responses.

    Science.gov (United States)

    Tang, Kuo-Tung; Chao, Ya-Hsuan; Chen, Der-Yuan; Lim, Yun-Ping; Chen, Yi-Ming; Li, Yi-Rong; Yang, Deng-Ho; Lin, Chi-Chen

    2016-10-01

    The objective of this study was to evaluate the potential therapeutic effects of betahistine dihydrochloride (betahistine) in a collagen-induced arthritis (CIA) mouse model. CIA was induced in DBA/1 male mice by primary immunization with 100 μl of emulsion containing 2 mg/ml chicken type II collagen (CII) mixed with complete Freund's adjuvant (CFA) in a 1:1 ratio, and booster immunization with 100 μl of emulsion containing 2 mg/ml CII mixed with incomplete Freund's adjuvant (IFA) in a 1:1 ratio. Immunization was performed subcutaneously at the base of the tail. After the boost on day 21, betahistine (1 and 5 mg/kg) was orally administered daily for 2 weeks. The severity of CIA was determined by arthritic scores and assessment of histopathological joint destruction. Expression of cytokines in the paw and anti-CII antibodies in the serum was evaluated by ELISA. The proliferative response against CII in lymph node cells was measured by [3H]-thymidine incorporation assay. The frequencies of different CII-specific CD4+ T cell subsets in the lymph node were determined by flow-cytometric analysis. Betahistine treatment attenuated the severity of arthritis and reduced the levels of pro-inflammatory cytokines, including TNF-α, IL-6, IL-23 and IL-17A, in the paw tissues of CIA mice. Lymph node cells from betahistine-treated mice showed decreased proliferation, as well as a lower frequency of Th17 cells. In vitro, betahistine suppressed CD4+ T cell differentiation into Th17 cells. These results indicate that betahistine is effective in suppressing both inflammatory and Th17 responses in mouse CIA and that it may have therapeutic value as an adjunct treatment for rheumatoid arthritis.

  12. Resolution and optimization methods for tour planning problems

    International Nuclear Information System (INIS)

    Vasserot, Jean-Pierre

    1976-12-01

    The aim of this study is to describe computerized methods for solving the computer-supported tour planning problem. After presenting this problem in operational-research terms, the existing resolution methods are reviewed, together with the approaches that led to their development. These methods are criticized and compared, and several improvements and new procedures are proposed, some of which make it possible to solve more general problems. Finally, the structure of a program of this kind, written at CII to solve such problems under multiple constraints, is analysed [fr]
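
    The report's own algorithms are only summarized above. As a hedged illustration of the kind of constructive heuristic tour-planning programs of this era typically started from (not the CII program's actual method), a nearest-neighbour tour builder can be sketched as:

```python
from math import dist  # Euclidean distance between two points (Python 3.8+)

def nearest_neighbour_tour(points):
    """Greedy tour construction: start at point 0 and repeatedly visit
    the closest unvisited point. A classic baseline heuristic for tour
    planning; shown only as an illustration, not as the method of the
    report above."""
    unvisited = list(range(1, len(points)))
    tour = [0]
    while unvisited:
        last = points[tour[-1]]
        # Pick the unvisited point nearest to the tour's current end.
        nxt = min(unvisited, key=lambda i: dist(last, points[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

# Hypothetical depot (index 0) and three stops.
pts = [(0, 0), (5, 0), (1, 0), (0, 2)]
print(nearest_neighbour_tour(pts))  # → [0, 2, 3, 1]
```

    Real tour planners layer constraints (vehicle capacity, time windows) and improvement steps (e.g. 2-opt exchanges) on top of such a construction.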

  13. The Compression Intensity Index: a practical anatomical estimate of the biomechanical risk for a deep tissue injury.

    Science.gov (United States)

    Gefen, Amit

    2008-01-01

    Pressure-related deep tissue injury (DTI) is a severe form of pressure ulcer that initiates in compressed muscle tissues under bony prominences and progresses superficially towards the skin. Patients with impaired motosensory capacities are at high risk of developing DTI, and there is a critical medical need for DTI risk assessment tools. A new anatomical index, the Compression Intensity Index, CII = (BW/(R·t))^(1/2), which depends on the body weight (BW), the radius of curvature of the ischial tuberosities (R) and the thickness of the underlying gluteus muscles (t), is suggested for approximating the loading intensity in muscle tissue during sitting in permanent wheelchair users, as part of a clinically oriented risk assessment for DTI. Preliminary CII data were calculated for 6 healthy and 4 paraplegic subjects following MRI scans, and data were compared between the groups and against a gold standard, a previously developed subject-specific MRI-finite-element (MRI-FE) method of calculating muscle tissue stresses (Linder-Ganz et al., J. Biomech. 2007). Marked differences between the R and t parameters of the two groups caused the CII values of the paraplegics to be approximately 1.6-fold higher than those of the healthy subjects (p < …) … bedridden patients. Hence, CII measurements can be integrated into DTI risk assessment tools, the need for which is now being discussed intensively in the American and European Pressure Ulcer Advisory Panel meetings.
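
    The index CII = sqrt(BW/(R·t)) is simple enough to sketch directly. The following is a minimal illustration with invented subject values in SI units (the paper's actual measurements came from MRI scans and are not reproduced here):

```python
from math import sqrt

def compression_intensity_index(bw, r, t):
    """CII = sqrt(BW / (R * t)): body weight BW over the product of the
    ischial tuberosity's radius of curvature R and the underlying
    gluteus muscle thickness t. Intended for relative comparison
    between subjects, not as an absolute tissue stress."""
    return sqrt(bw / (r * t))

# Invented values: weight in newtons, R and t in metres. A sharper bony
# prominence (smaller R) over a thinner, atrophied muscle (smaller t)
# raises the index, as in the paraplegic group of the study.
healthy = compression_intensity_index(700.0, 0.035, 0.030)
paraplegic = compression_intensity_index(700.0, 0.025, 0.017)
print(round(paraplegic / healthy, 2))  # ratio in the region of ~1.6
```

    With these assumed anatomical values the paraplegic/healthy ratio lands near the ~1.6-fold difference the abstract reports between groups.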

  14. A method of experimental rheumatoid arthritis induction using collagen type II isolated from chicken sternal cartilage.

    Science.gov (United States)

    Su, Zhaoliang; Shotorbani, Siamak Sandoghchian; Jiang, Xugan; Ma, Rui; Shen, Huiling; Kong, Fanzhi; Xu, Huaxi

    2013-07-01

    At present, collagen-induced arthritis (CIA) is the best known and most extensively used model of the immunological and pathological characteristics of human rheumatoid arthritis (RA). This model is useful not only in aiding our understanding of the pathogenesis of this disease, but also in the development of new therapies. Bovine, porcine and human collagen has been used to induce CIA; however, the response varies between strains and injection conditions, and false positive results and reduced potency are common as a result of minor contaminants or deglycosylated protein. Therefore, in the present study, type II collagen (CII) was isolated and purified from chicken sternal cartilage and was found to successfully induce the RA model. Furthermore, T helper 17 (Th17) cells were observed to infiltrate the joint on day 45 following induction by CII. In vitro, expression of toll-like receptor 2 (TLR2) increased in peritoneal macrophages stimulated by CII. In addition, blockade of TLR2 markedly decreased levels of TGF-β and IL-6 in the cell culture supernatant. The results indicate that CII isolated from chicken sternal cartilage may be recognized by TLR2 on macrophages, leading to TGF-β and IL-6 production and subsequent activation of the Th17 cells that mediate CIA development.

  15. Dephosphorylation of the Core Clock Protein KaiC in the Cyanobacterial KaiABC Circadian Oscillator Proceeds via an ATP Synthase Mechanism

    Energy Technology Data Exchange (ETDEWEB)

    Egli, Martin; Mori, Tetsuya; Pattanayek, Rekha; Xu, Yao; Qin, Ximing; Johnson, Carl H. (Vanderbilt)

    2014-10-02

    The circadian clock of the cyanobacterium Synechococcus elongatus can be reconstituted in vitro from three proteins, KaiA, KaiB, and KaiC, in the presence of ATP, to tick in a temperature-compensated manner. KaiC, the central cog of this oscillator, forms a homohexamer with 12 ATP molecules bound between its N- and C-terminal domains and exhibits unusual properties. Both the N-terminal (CI) and C-terminal (CII) domains harbor ATPase activity, and the subunit interfaces between CII domains are the sites of autokinase and autophosphatase activities. Hydrolysis of ATP correlates with phosphorylation at threonine and serine sites across subunits in an orchestrated manner, such that first T432 and then S431 are phosphorylated, followed by dephosphorylation of these residues in the same order. Although structural work has provided insight into the mechanisms of ATPase and kinase, the location and mechanism of the phosphatase have remained enigmatic. From the available experimental data based on a range of approaches, including KaiC crystal structures and small-angle X-ray scattering models, metal ion dependence, site-directed mutagenesis (i.e., E318, the general base), and measurements of the associated clock periods, phosphorylation patterns, and dephosphorylation courses, as well as a lack of sequence motifs in KaiC that are typically associated with known phosphatases, we hypothesized that KaiCII makes use of the same active site for phosphorylation and dephosphorylation. We observed that wild-type KaiC (wt-KaiC) exhibits an ATP synthase activity that is significantly reduced in the T432A/S431A mutant. We interpret the first observation as evidence that KaiCII is a phosphotransferase instead of a phosphatase and the second that the enzyme is capable of generating ATP, both from ADP and Pi (in a reversal of the ATPase reaction) and from ADP and P-T432/P-S431 (dephosphorylation).
This new concept regarding the mechanism of dephosphorylation is also supported by the

  16. Dopamine D2 Receptor Is Involved in Alleviation of Type II Collagen-Induced Arthritis in Mice.

    Science.gov (United States)

    Lu, Jian-Hua; Liu, Yi-Qian; Deng, Qiao-Wen; Peng, Yu-Ping; Qiu, Yi-Hua

    2015-01-01

    Human and murine lymphocytes express dopamine (DA) D2-like receptors including DRD2, DRD3, and DRD4. However, their roles in rheumatoid arthritis (RA) are less clear. Here we showed that lymphocyte DRD2 activation alleviates both imbalance of T-helper (Th)17/T-regulatory (Treg) cells and inflamed symptoms in a mouse arthritis model of RA. Collagen-induced arthritis (CIA) was prepared by intradermal injection of chicken collagen type II (CII) in tail base of DBA/1 mice or Drd2 (-/-) C57BL/6 mice. D2-like receptor agonist quinpirole downregulated expression of proinflammatory Th17-related cytokines interleukin- (IL-) 17 and IL-22 but further upregulated expression of anti-inflammatory Treg-related cytokines transforming growth factor- (TGF-) β and IL-10 in lymphocytes in vitro and in ankle joints in vivo in CIA mice. Quinpirole intraperitoneal administration reduced both clinical arthritis score and serum anti-CII IgG level in CIA mice. However, Drd2 (-/-) CIA mice manifested more severe limb inflammation and higher serum anti-CII IgG level and further upregulated IL-17 and IL-22 expression and downregulated TGF-β and IL-10 expression than wild-type CIA mice. In contrast, Drd1 (-/-) CIA mice did not alter limb inflammation or anti-CII IgG level compared with wild-type CIA mice. These results suggest that DRD2 activation is involved in alleviation of CIA symptoms by amelioration of Th17/Treg imbalance.

  17. Design of glycopeptides used to investigate class II MHC binding and T-cell responses associated with autoimmune arthritis.

    Directory of Open Access Journals (Sweden)

    Ida E Andersson

    The glycopeptide fragment CII259-273 from type II collagen (CII) binds to the murine Aq and human DR4 class II Major Histocompatibility Complex (MHC II) proteins, which are associated with development of murine collagen-induced arthritis (CIA) and rheumatoid arthritis (RA), respectively. It has been shown that CII259-273 can be used in therapeutic vaccination against CIA. This glycopeptide also elicits responses from T-cells obtained from RA patients, which indicates that it has an important role in RA as well. We now present a methodology for studies of (glyco)peptide-receptor interactions based on a combination of structure-based virtual screening, ligand-based statistical molecular design and biological evaluations. This methodology included the design of a CII259-273 glycopeptide library in which two anchor positions crucial for binding in pockets of Aq and DR4 were varied. Synthesis and biological evaluation of the designed glycopeptides provided novel structure-activity relationship (SAR) understanding of binding to Aq and DR4. Glycopeptides that retained high affinities for these MHC II proteins and induced strong responses in panels of T-cell hybridomas were also identified. An analysis of all the responses revealed groups of glycopeptides with different response patterns that are of high interest for vaccination studies in CIA. Moreover, the SAR understanding obtained in this study provides a platform for the design of second-generation glycopeptides with tuned MHC affinities and T-cell responses.

  18. Are there any indians left in Colombia? The indigenista movement from 1940 to 1950

    Directory of Open Access Journals (Sweden)

    Jimena Perry

    2016-09-01

    This paper examines the creation of the Colombian Indigenista Institute, CII (1942), its institutional history, and its closure (1949). The CII, influenced by the Mexican indigenista movement, fought for indigenous peoples' visibility for eight years. In the end, however, the CII failed due to a combination of internal factors, such as ideological disagreements, and external ones, including violence, scarce resources, and lack of awareness on the part of politicians and elites about the status, challenges, and even existence of Amerindian groups in Colombia. Three Colombian scholars created the CII. They attended the First Inter-American Indigenista Congress (Pátzcuaro, 1940), where they got in touch with Manuel Gamio, then director of the Interamerican Indigenista Institute. Gamio's ideas —that Indians have the right to govern themselves, to have tribal organizations, to elect their community leaders, and to maintain and assert their cultural identity— inspired the Colombians. In this paper, I trace the influences of the Mexican indigenista movement on Colombian indigenismo projects between 1940 and 1950. I draw on a variety of sources including Colombian newspapers and magazines, as well as letters of the indigenistas housed at the Colombian library Luis Ángel Arango in Bogotá and the Nettie Lee Benson Latin American Collection at the University of Texas at Austin. I conclude by explaining why Colombian indigenismo of the decade between 1940 and 1950 ultimately failed.

  19. A cII-dependent promoter is located within the Q gene of bacteriophage lambda.

    OpenAIRE

    Hoopes, B C; McClure, W R

    1985-01-01

    We have found a cII-dependent promoter, PaQ, within the Q gene of bacteriophage lambda. Transcription experiments and abortive initiation assays performed in vitro showed that the promoter strength and the cII affinity of PaQ were comparable to the other cII-dependent lambda promoters, PE and PI. The location and leftward direction of PaQ suggests a possible role in the delay of lambda late-gene expression by cII protein, a phenomenon that has been called cII-dependent inhibition. We have con...

  20. The Complexity of Integrated-Instrument Components Media for Natural Science (IPA) at Elementary Schools

    Directory of Open Access Journals (Sweden)

    Angreni Siska

    2018-01-01

    This research aims to describe the complexity of the Integrated Instrument Components (CII) media used in science learning at elementary schools in District Siulak Mukai and in District Siulak. The research applied a descriptive method that included survey forms; the instruments used were observation sheets. The results showed that the natural-science CII media at elementary schools in District Siulak were more complex than those at elementary schools in District Siulak Mukai.

  1. Iron hexacyanide/cytochrome-C - intramolecular electron transfer and binding constants - (pulse radiolytic study). Progress report

    International Nuclear Information System (INIS)

    Ilan, Y.; Shafferman, A.

    Internal oxidation and reduction rates of horse cytochrome c in the complexes CII·Fe(III)(CN)6^3− and CIII·Fe(II)(CN)6^4− are 4.6 × 10^4 s^−1 and 3.3 × 10^2 s^−1, respectively. The binding sites of the iron hexacyanide ions on either CII or CIII are kinetically almost indistinguishable; binding constants range from 0.87 × 10^3 to 2 × 10^3 M^−1. The present pulse radiolytic kinetic data are compared with those from NMR, T-jump and equilibrium dialysis studies.

  2. Dopamine D2 Receptor Is Involved in Alleviation of Type II Collagen-Induced Arthritis in Mice

    Directory of Open Access Journals (Sweden)

    Jian-Hua Lu

    2015-01-01

    Human and murine lymphocytes express dopamine (DA) D2-like receptors including DRD2, DRD3, and DRD4. However, their roles in rheumatoid arthritis (RA) are less clear. Here we showed that lymphocyte DRD2 activation alleviates both the imbalance of T-helper (Th)17/T-regulatory (Treg) cells and inflamed symptoms in a mouse arthritis model of RA. Collagen-induced arthritis (CIA) was prepared by intradermal injection of chicken collagen type II (CII) in the tail base of DBA/1 mice or Drd2−/− C57BL/6 mice. The D2-like receptor agonist quinpirole downregulated expression of the proinflammatory Th17-related cytokines interleukin (IL)-17 and IL-22 but further upregulated expression of the anti-inflammatory Treg-related cytokines transforming growth factor (TGF)-β and IL-10 in lymphocytes in vitro and in ankle joints in vivo in CIA mice. Quinpirole intraperitoneal administration reduced both the clinical arthritis score and the serum anti-CII IgG level in CIA mice. However, Drd2−/− CIA mice manifested more severe limb inflammation, higher serum anti-CII IgG levels, further upregulated IL-17 and IL-22 expression and downregulated TGF-β and IL-10 expression than wild-type CIA mice. In contrast, Drd1−/− CIA mice did not alter limb inflammation or anti-CII IgG level compared with wild-type CIA mice. These results suggest that DRD2 activation is involved in alleviation of CIA symptoms by amelioration of the Th17/Treg imbalance.

  3. Biochemical analysis of a papain-like protease isolated from the latex of Asclepias curassavica L.

    Science.gov (United States)

    Liggieri, Constanza; Obregon, Walter; Trejo, Sebastian; Priolo, Nora

    2009-02-01

    Most of the species belonging to the Asclepiadaceae family secrete an endogenous milk-like fluid into a network of laticifer cells, in which sub-cellular organelles intensively synthesize proteins and secondary metabolites. A new papain-like endopeptidase (asclepain c-II) has been isolated and characterized from the latex extracted from petioles of Asclepias curassavica L. (Asclepiadaceae). Asclepain c-II was the minor proteolytic component in the latex, but showed higher specific activity than asclepain c-I, the main active fraction previously studied. The two enzymes displayed quite distinct biochemical characteristics, confirming that they are different enzymes. The crude extract was purified by cation-exchange chromatography (FPLC), and two active fractions, homogeneous by sodium dodecyl sulphate-polyacrylamide gel electrophoresis and mass spectrometry, were isolated. Asclepain c-II displayed a molecular mass of 23,590 Da, a pI higher than 9.3, maximum proteolytic activity at pH 9.4-10.2, and poor thermostability. The activity of asclepain c-II is inhibited by cysteine protease inhibitors such as E-64, but not by other protease inhibitors such as 1,10-phenanthroline, phenylmethanesulfonyl fluoride, and pepstatin. The N-terminal sequence (LPSFVDWRQKGVVFPIRNQGQCGSCWTFSA) showed high similarity with those of other plant cysteine proteinases. When assayed on N-alpha-CBZ-amino acid p-nitrophenyl esters, the enzyme exhibited a preference for the glutamine derivative. Kinetic parameters were determined with N-alpha-CBZ-L-Gln-p-nitrophenyl ester as substrate: K_m = 0.1634 mM, k_cat = 121.48 s^−1, and k_cat/K_m = 7.4 × 10^5 M^−1 s^−1.
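
    As a quick arithmetic check, dividing the reported k_cat by K_m, with K_m converted from mM to M, reproduces the quoted catalytic efficiency on the order of 7.4 × 10^5 per molar per second:

```python
# Reported Michaelis-Menten parameters for asclepain c-II acting on
# N-alpha-CBZ-L-Gln-p-nitrophenyl ester (values from the abstract).
k_cat = 121.48        # turnover number, s^-1
K_m_mM = 0.1634       # Michaelis constant, mM

# Catalytic efficiency k_cat / K_m, with K_m converted from mM to M.
efficiency = k_cat / (K_m_mM * 1e-3)   # M^-1 s^-1
print(f"k_cat/K_m = {efficiency:.2e} M^-1 s^-1")  # ≈ 7.4e+05
```

    Note the units: quoting the efficiency per mM would give a value a thousandfold smaller (~7.4 × 10^2 mM^−1 s^−1).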

  4. The role of lipopolysaccharide injected systemically in the reactivation of collagen-induced arthritis in mice

    Science.gov (United States)

    Yoshino, Shin; Ohsawa, Motoyasu

    2000-01-01

    We investigated the role of bacterial lipopolysaccharide (LPS) in the reactivation of autoimmune disease by using collagen-induced arthritis (CIA) in mice, in which autoimmunity to the joint cartilage component type II collagen (CII) was involved. CIA was induced by immunization with CII emulsified with complete Freund's adjuvant at the base of the tail (day 0), followed by a booster injection on day 21. Varying doses of LPS from E. coli were i.p. injected on day 50. Arthritis began to develop on day 25 after immunization with CII and reached a peak on day 35. Thereafter, arthritis subsided gradually, but moderate joint inflammation was still observed on day 50. An i.p. injection of LPS on day 50 markedly reactivated arthritis in a dose-related fashion. Histologically, on day 55, there was marked oedema of the synovium that had proliferated by the day of LPS injection, new formation of fibrin, and intense infiltration of neutrophils accompanied by a large number of mononuclear cells. The reactivation of CIA by LPS was associated with increases in anti-CII IgG and IgG2a antibodies as well as various cytokines including IL-12, IFN-γ, IL-1β, and TNF-α. LPS from S. enteritidis, S. typhimurium, and K. pneumoniae, and its component lipid A from E. coli, also reactivated the disease. Polymyxin B sulphate suppressed LPS- or lipid A-induced reactivation of CIA. These results suggest that LPS may play an important role in the reactivation of autoimmune joint inflammatory diseases such as rheumatoid arthritis in humans. PMID:10742285

  5. Prophylactic Injection of Recombinant Alpha-Enolase Reduces Arthritis Severity in the Collagen-Induced Arthritis Mice Model.

    Directory of Open Access Journals (Sweden)

    Clément Guillou

    To evaluate the ability of the glycolytic enzyme alpha-enolase (ENO1) or its immunodominant peptide (pEP1) to reduce the severity of CIA in DBA/1 mice when injected in a prophylactic way. Mice were treated with mouse ENO1 or pEP1 one day prior to collagen II immunization. Clinical assessment was evaluated using 4 parameters (global and articular scores, ankle thickness and weight). Titers of serum anti-ENO1, anti-cyclic citrullinated peptide (anti-CCP) and anti-CII (total IgG and IgG1/IgG2a isotypes) antibodies were measured by ELISA at different time-points. Disease activity was assessed by histological analysis of both anterior and hind paws at the end of experimentation. Prophylactic injection of 100 μg of ENO1 reduced the severity of CIA. Serum levels of anti-CII antibodies were reduced in ENO1-treated mice. Concordantly, ENO1-treated mice joints presented less severe histological signs of arthritis. ENO1 did not induce a shift toward a Th2 response, since the IgG1/IgG2a ratio of anti-CII antibodies remained unchanged and IL-4 serum levels were similar to those measured in the control group. Pre-immunization with ENO1 or its immunodominant peptide pEP1 reduces CIA severity at the clinical, immunological and histological levels. Effects of pEP1 were less pronounced. This immunomodulatory effect is associated with a reduction in anti-CII antibody production but is not due to a Th1/Th2 shift.

  6. Prophylactic Injection of Recombinant Alpha-Enolase Reduces Arthritis Severity in the Collagen-Induced Arthritis Mice Model

    Science.gov (United States)

    Guillou, Clément; Derambure, Céline; Fréret, Manuel; Verdet, Mathieu; Avenel, Gilles; Golinski, Marie-Laure; Sabourin, Jean-Christophe; Loarer, François Le; Adriouch, Sahil; Boyer, Olivier; Lequerré, Thierry; Vittecoq, Olivier

    2015-01-01

    Objective To evaluate the ability of the glycolytic enzyme alpha-enolase (ENO1) or its immunodominant peptide (pEP1) to reduce the severity of CIA in DBA/1 mice when injected in a prophylactic way. Methods Mice were treated with mouse ENO1 or pEP1 one day prior to collagen II immunization. Clinical assessment was evaluated using 4 parameters (global and articular scores, ankle thickness and weight). Titers of serum anti-ENO1, anti-cyclic citrullinated peptides (anti-CCP) and anti-CII (total IgG and IgG1/IgG2a isotypes) antibodies were measured by ELISA at different time-points. Disease activity was assessed by histological analysis of both anterior and hind paws at the end of experimentation. Results Prophylactic injection of 100 μg of ENO1 reduced severity of CIA. Serum levels of anti-CII antibodies were reduced in ENO1-treated mice. Concordantly, ENO1-treated mice joints presented less severe histological signs of arthritis. ENO1 did not induce a shift toward a Th2 response since IgG1/IgG2a ratio of anti-CII antibodies remained unchanged and IL-4 serum levels were similar to those measured in the control group. Conclusions Pre-immunization with ENO1 or its immunodominant peptide pEP1 reduces CIA severity at the clinical, immunological and histological levels. Effects of pEP1 were less pronounced. This immunomodulatory effect is associated with a reduction in anti-CII antibodies production but is not due to a Th1/Th2 shift. PMID:26302382

  7. Green options for anti-sulfate of slag cement concrete containing pozzolans

    Directory of Open Access Journals (Sweden)

    Yang Chien-Jou

    2017-01-01

    Full Text Available This study mainly adopted the densified mixture design algorithm (DMDA) with pozzolans (fly ash and slag), slag cements of different fineness (1:1; MF40 and HF40) and Type I cement (E40) to construct mixtures with w/cm = 0.40, and applied ACI 211.1R with Type II cement as the control group (CII40, w/c = 0.40). The life cycle inventory of LEED suggested by PCA for cementitious materials (kg/m3) covered cement use, CO2 emission, raw materials, energy consumption, compressive strength, and immersion in Na2SO4 solutions of different concentrations. Results showed that cement content, CO2 emission, raw materials and energy consumption for E40, MF40 and HF40, relative to CII40, were 14% ∼ 26%, 14% ∼ 26%, 13% ∼ 26% and 17% ∼ 28%, respectively. At 28 days, the compressive strengths of all mixtures were greater than 41 MPa. After 25 repeated cycles of immersion in 5000 ppm Na2SO4 solution and oven-drying at 105 °C, the exterior showed no damage, and weight loss and pulse velocity change were less than -1% and -5%. In saturated Na2SO4 solution, however, they ranged from -0.91% (E40) to -2.62% (MF40) and from -6.7% (E40) to -10.9% (MF40). The exterior showed obvious scaling (chalking or spalling) at the second (CII40), fifth (MF40 and HF40) and ninth cycle (E40). The comprehensive evaluation of green options for sulfate resistance ranked the mixtures E40 > HF40 > MF40 > CII40.

  8. Submillimeter and far infrared line observations of M17 SW: A clumpy molecular cloud penetrated by UV radiation

    Science.gov (United States)

    Stutzki, J.; Stacey, G. J.; Genzel, R.; Harris, A. I.; Jaffe, D. T.; Lugten, J. B.

    1987-01-01

    Millimeter, submillimeter, and far infrared spectroscopic observations of the M17 SW star formation region are discussed. The results require the molecular cloud near the interface to be clumpy or filamentary. As a consequence, far ultraviolet radiation from the central OB stellar cluster can penetrate into the dense molecular cloud to a depth of several pc, thus creating bright and extended (CII) emission from the photodissociated surfaces of dense atomic and molecular clumps or sheets. The extended (CII) emission throughout the molecular cloud SW of the M17 complex has a level 20 times higher than expected from a single molecular cloud interface exposed to an ultraviolet radiation field typical of the solar neighborhood. This suggests that the molecular cloud as a whole is penetrated by ultraviolet radiation and has a clumpy or filamentary structure. The number of B stars expected to be embedded in the M17 molecular cloud probably can provide the UV radiation necessary for the extended (CII) emission. Alternatively, the UV radiation could be external, if the interstellar radiation in the vicinity of M17 is higher than in the solar neighborhood.

  9. Examining the interlinkages between regional infrastructure disparities, economic growth, and poverty: A case of Indian states

    Directory of Open Access Journals (Sweden)

    Chotia Varun

    2015-01-01

    Full Text Available This paper investigates the interlinkages between regional infrastructure disparities, economic growth, and poverty in the 21 major Indian states. An overall comprehensive index of infrastructure, the Composite Infrastructure Index (CII), is calculated for each Indian state using the Principal Component Analysis technique. In order to analyse the regional disparities between states in terms of infrastructure, the states are ranked by their calculated CII. We extend our analysis by evaluating the inter-relationship between the Composite Infrastructure Index, Per Capita Net State Domestic Product (PCNSDP), and poverty. The empirical analysis also shows that composite infrastructural growth and economic growth go hand in hand.
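
    The PCA-based construction the abstract describes can be sketched as a first-principal-component composite score. The indicator names and data below are synthetic, and the scaling choices (standardization, min-max rescaling, sign convention) are illustrative assumptions rather than the paper's exact procedure:

```python
import numpy as np

def composite_index(X):
    """First-principal-component composite index.

    X: (n_states, n_indicators) array of infrastructure indicators.
    Returns an index rescaled to [0, 1], higher = better infrastructure.
    """
    # Standardize each indicator to zero mean and unit variance.
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    # First principal component via SVD of the standardized matrix.
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    w = Vt[0]
    if w.sum() < 0:          # sign convention: more infrastructure = higher index
        w = -w
    scores = Z @ w
    return (scores - scores.min()) / (scores.max() - scores.min())

# Synthetic example: 5 states x 3 hypothetical indicators
# (road density, power capacity, tele-density).
rng = np.random.default_rng(0)
X = rng.uniform(10, 100, size=(5, 3))
cii = composite_index(X)
ranking = np.argsort(-cii)   # state indices, best-ranked first
```

    States can then be ranked by sorting on the resulting index, which is how the paper compares regional disparities.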

  10. The TIME-Pilot intensity mapping experiment

    Science.gov (United States)

    Crites, A. T.; Bock, J. J.; Bradford, C. M.; Chang, T. C.; Cooray, A. R.; Duband, L.; Gong, Y.; Hailey-Dunsheath, S.; Hunacek, J.; Koch, P. M.; Li, C. T.; O'Brient, R. C.; Prouve, T.; Shirokoff, E.; Silva, M. B.; Staniszewski, Z.; Uzgil, B.; Zemcov, M.

    2014-08-01

    TIME-Pilot is designed to make measurements from the Epoch of Reionization (EoR), when the first stars and galaxies formed and ionized the intergalactic medium. This will be done via measurements of the redshifted 157.7 μm line of singly ionized carbon ([CII]). In particular, TIME-Pilot will produce the first detection of [CII] clustering fluctuations, a signal proportional to the integrated [CII] intensity, summed over all EoR galaxies. TIME-Pilot is thus sensitive to the emission from dwarf galaxies, thought to be responsible for the balance of ionizing UV photons, that will be difficult to detect individually with JWST and ALMA. A detection of [CII] clustering fluctuations would validate current theoretical estimates of the [CII] line as a new cosmological observable, opening the door for a new generation of instruments with advanced technology spectroscopic array focal planes that will map [CII] fluctuations to probe the EoR history of star formation, bubble size, and ionization state. Additionally, TIME-Pilot will produce high signal-to-noise measurements of CO clustering fluctuations, which trace the role of molecular gas in star-forming galaxies at redshifts 0 < z < 2. With its unique atmospheric noise mitigation, TIME-Pilot also significantly improves sensitivity for measuring the kinetic Sunyaev-Zel'dovich (kSZ) effect in galaxy clusters. TIME-Pilot will employ a linear array of spectrometers, each consisting of a parallel-plate diffraction grating. The spectrometer bandwidth covers 185-323 GHz, both to probe the entire redshift range of interest and to include channels at the edges of the band for atmospheric noise mitigation. We illuminate the telescope with f/3 horns, which balances the desire to couple to the sky with the best efficiency per beam against the need to pack a large number of horns into the fixed field of view. Feedhorns couple radiation to the waveguide spectrometer gratings. Each spectrometer grating has 190 facets and provides resolving power
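
    The quoted 185-323 GHz bandwidth maps directly onto a [CII] redshift window via z = nu_rest / nu_obs - 1, using the line's rest frequency of about 1900.537 GHz (157.74 μm). A minimal check of that coverage:

```python
# Redshift window implied by the 185-323 GHz band for the [CII] line.
# Rest frequency ~1900.537 GHz (157.74 um); z = nu_rest / nu_obs - 1.
NU_CII_GHZ = 1900.537

def cii_redshift(nu_obs_ghz):
    """Redshift at which [CII] is observed at the given frequency."""
    return NU_CII_GHZ / nu_obs_ghz - 1.0

z_low = cii_redshift(323.0)    # high-frequency band edge -> lowest redshift
z_high = cii_redshift(185.0)   # low-frequency band edge -> highest redshift
print(f"[CII] covered from z = {z_low:.2f} to z = {z_high:.2f}")
# -> [CII] covered from z = 4.88 to z = 9.27
```

    This z ≈ 5-9 window is exactly the reionization epoch the instrument targets, which is why the band edges double as atmospheric-noise monitoring channels rather than extending the science range further.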

  11. Preventive effects of CTLA4Ig-overexpressing adipose tissue-derived mesenchymal stromal cells in rheumatoid arthritis.

    Science.gov (United States)

    Choi, Eun Wha; Yun, Tae Won; Song, Ji Woo; Lee, Minjae; Yang, Jehoon; Choi, Kyu-Sil

    2015-03-01

    Rheumatoid arthritis is a systemic autoimmune disorder. In this study, we first compared the therapeutic effects of syngeneic and xenogeneic adipose tissue-derived stem cells on a collagen-induced arthritis mouse model. Second, we investigated the synergistic preventive effects of CTLA4Ig and adipose tissue-derived mesenchymal stromal cells (ASCs) as a therapeutic substance. Arthritis was induced in all groups except for the normal, saline (N) group, using chicken type II collagen (CII). Animals were divided into C (control, saline), H (hASCs), M (mASCs) and N groups (experiment I) and C, H, CT (CTLA4Ig-overexpressing human ASCs [CTLA4Ig-hASCs]) and N groups (experiment II), according to the transplanted material. Approximately 2 × 10^6 ASCs or 150 μL of saline was intravenously administered on days 24, 27, 30 and 34, and all animals were killed on days 42 to 44 after CII immunization. Anti-mouse CII autoantibodies were significantly lower in the H, M and CT groups than in the C group. Cartilage damage severity score and C-telopeptide of type II collagen were significantly lower in the CT group than in the C group. The serum levels of IL-6 were significantly lower in the H, M and CT groups than in the C group. The serum levels of keratinocyte chemoattractant were significantly lower in the CT group than in the C group. The effects of ASCs on the decrease of anti-mouse CII autoantibody levels were similar between syngeneic and xenogeneic transplantation, and CTLA4Ig-hASCs showed synergistic preventive effects compared with non-transduced hASCs. Copyright © 2015 International Society for Cellular Therapy. Published by Elsevier Inc. All rights reserved.

  12. Transforming growth factor (TGF)-β signalling is increased in rheumatoid synovium but TGF-β blockade does not modify experimental arthritis.

    Science.gov (United States)

    Gonzalo-Gil, E; Criado, G; Santiago, B; Dotor, J; Pablos, J L; Galindo, M

    2013-11-01

    The aim of this study was to analyse the distribution of regulatory and inhibitory mothers against decapentaplegic homologue (Smad) proteins as markers of active transforming growth factor (TGF)-β signalling in rheumatoid arthritis (RA) synovial tissue and to investigate the effect of TGF-β blockade in the development and progression of collagen-induced arthritis. The expression of Smad proteins in synovial tissues from RA, osteoarthritic and healthy controls was analysed by immunohistochemistry. Arthritis was induced in DBA/1 mice by immunization with chicken type-II collagen (CII). TGF-β was blocked in vivo with the specific peptide p17 starting at the time of immunization or on the day of arthritis onset. T cell population frequencies and specific responses to CII were analysed. The expression of cytokines and transcription factors was quantified in spleen and joint samples. Statistical differences between groups were compared using the Mann-Whitney U-test or one-way analysis of variance (ANOVA) using the Kruskal-Wallis test. p-Smad-2/3 and inhibitory Smad-7 expression were detected in RA and control tissues. In RA, most lymphoid infiltrating cells showed nuclear p-Smad-2/3 without Smad-7 expression. Treatment with TGF-β antagonist did not affect clinical severity, joint inflammation and cartilage damage in collagen-induced arthritis. Frequency of T cell subsets, mRNA levels of cytokines and transcription factors, specific proliferation to CII, serum interleukin (IL)-6 and anti-CII antibodies were comparable in p17 and phosphate-buffered saline (PBS)-treated groups. The pattern of Smad protein expression demonstrates active TGF-β signalling in RA synovium. However, specific TGF-β blockade does not have a significant effect in the mouse model of collagen-induced arthritis. © 2013 British Society for Immunology.

  13. GOT C+: A Herschel Space Observatory Key Program to Study the Diffuse ISM

    Science.gov (United States)

    Langer, William; Velusamy, T.; Goldsmith, P. F.; Li, D.; Pineda, J.; Yorke, H.

    2010-01-01

    Star formation activity is regulated by pressures in the interstellar medium, which in turn depend on heating and cooling rates, modulated by the gravitational potential, and shock and turbulent pressures. To understand these processes we need information about the properties of diffuse atomic and diffuse molecular gas clouds. The ionized carbon CII fine structure line at 1.9 THz is an important tracer of the atomic gas in the diffuse regions and of the atomic-to-molecular cloud transformation. Furthermore, C+ is a major ISM coolant and the Galaxy's strongest emission line, with a total luminosity about 1000 times that of CO J=1-0. Galactic Observations of the Terahertz C+ Line (GOT C+) is a Herschel Space Observatory Open Time Key Program to study the diffuse interstellar medium by sampling CII line emission throughout the Galactic disk. GOT C+ will obtain high spectral resolution CII spectra using the Heterodyne Instrument for the Far Infrared (HIFI). It employs deep integrations, wide velocity coverage (350 km s-1) with 0.22 km s-1 resolution, and systematic sparse sampling of the Galactic disk together with observations of selected targets, over 900 lines of sight in total. It will be a resource for the atomic gas properties in (a) the Galactic disk, (b) the Galaxy's central 300 pc, (c) the Galactic warp, (d) high-latitude HI clouds, and (e) Photon Dominated Regions (PDRs). Along with HI, CO isotope, and CI spectra, our C+ data will provide the astronomical community with a rich statistical database of diffuse cloud properties, for understanding the role of barometric pressure and turbulence in cloud evolution in the Galactic ISM and, by extension, other galaxies. The GOT C+ project will provide a template for future, even larger-scale CII surveys. This research was conducted at the Jet Propulsion Laboratory, California Institute of Technology and is supported by a NASA grant.

  14. A novel APOC2 gene mutation identified in a Chinese patient with severe hypertriglyceridemia and recurrent pancreatitis.

    Science.gov (United States)

    Jiang, Jingjing; Wang, Yuhui; Ling, Yan; Kayoumu, Abudurexiti; Liu, George; Gao, Xin

    2016-01-16

    The severe forms of hypertriglyceridemia are usually caused by genetic defects. In this study, we described a Chinese female with severe hypertriglyceridemia caused by a novel homozygous mutation in the APOC2 gene. Lipid profiles of the pedigree were studied in detail. LPL and HL activity were also measured. The coding regions of 5 candidate genes (namely LPL, APOC2, APOA5, LMF1, and GPIHBP1) were sequenced using genomic DNA from peripheral leucocytes. The ApoE gene was also genotyped. Serum triglyceride level was extremely high in the proband, compared with other family members. Plasma LPL activity was also significantly reduced in the proband. Serum ApoCII was very low in the proband as well as in the heterozygous mutation carriers. A novel mutation (c.86A > CC) was identified on exon 3 [corrected] of the APOC2 gene, which converted the Asp [corrected] codon at position 29 into Ala, followed by a termination codon (TGA). This study presented the first case of ApoCII deficiency in the Chinese population, with a novel mutation c.86A > CC in the APOC2 gene identified. Serum ApoCII protein might be a useful screening test for identifying mutation carriers.

  15. Utility of the Conners' Adult ADHD Rating Scale validity scales in identifying simulated attention-deficit hyperactivity disorder and random responding.

    Science.gov (United States)

    Walls, Brittany D; Wallace, Elizabeth R; Brothers, Stacey L; Berry, David T R

    2017-12-01

    Recent concern about malingered self-report of symptoms of attention-deficit hyperactivity disorder (ADHD) in college students has resulted in an urgent need for scales that can detect feigning of this disorder. The present study provided further validation data for a recently developed validity scale for the Conners' Adult ADHD Rating Scale (CAARS), the CAARS Infrequency Index (CII), as well as for the Inconsistency Index (INC). The sample included 139 undergraduate students: 21 individuals with diagnoses of ADHD, 29 individuals responding honestly, 54 individuals responding randomly (full or half), and 35 individuals instructed to feign. Overall, the INC showed moderate sensitivity to random responding (.44-.63) and fairly high specificity to ADHD (.86-.91). The CII demonstrated modest sensitivity to feigning (.31-.46) and excellent specificity to ADHD (.91-.95). Sequential application of validity scales had correct classification rates of honest (93.1%), ADHD (81.0%), feigning (57.1%), half random (42.3%), and full random (92.9%). The present study suggests that the CII is modestly sensitive (true positive rate) to feigned ADHD symptoms, and highly specific (true negative rate) to ADHD. Additionally, this study highlights the utility of applying the CAARS validity scales in a sequential manner for identifying feigning. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
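
    The sequential application of validity scales described above can be sketched as a simple decision cascade: screen for random responding with the Inconsistency Index first, then for feigning with the Infrequency Index. The cutoff values below are hypothetical placeholders, not the published CAARS thresholds:

```python
def classify_profile(inc, cii, inc_cutoff=8, cii_cutoff=21):
    """Sequential CAARS validity screen (sketch).

    inc: Inconsistency Index (INC) score; cii: CAARS Infrequency Index
    (CII) score. Cutoffs are illustrative placeholders only.
    """
    if inc >= inc_cutoff:    # step 1: flag inconsistent / random responding
        return "random"
    if cii >= cii_cutoff:    # step 2: flag infrequent / feigned symptoms
        return "feigning"
    return "valid"           # step 3: interpret clinical scales normally
```

    Applying the scales in this order mirrors the study's sequential procedure: random protocols are screened out before feigning is assessed, since random responding can spuriously elevate infrequency items.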

  16. EFFECT OF FOLLICULAR DIAMETER ON THE QUALITY OF OOCYTES OBTAINED FROM OVARIES OF EWES (Ovis aries) AND GOATS (Capra hircus)

    Directory of Open Access Journals (Sweden)

    Ricardo de Macêdo Chaves

    2010-10-01

    Full Text Available This study aimed to evaluate the influence of follicular diameter on the quality of oocytes from ovaries of female sheep and goats. Ovaries of sheep (156 units) and goats (105 units) from a slaughterhouse were used. The follicles were measured, aspirated and divided into three classes of follicular diameter: class 1 (CI, 2 to 3 mm), class 2 (CII, 4 to 5 mm) and class 3 (CIII, ≥ 6 mm). The recovered oocytes were evaluated and classified according to morphological aspect into 5 quality groups: grade 1 (GI), grade 2 (GII), grade 3 (GIII), naked (N) and atresic (A). Of the 468 follicles aspirated from sheep, 327 were CI, 84 CII and 57 CIII, yielding 83 GI, 78 GII, 95 GIII, 119 N and 93 A oocytes; of the 422 follicles aspirated from goats, 197 were CI, 132 CII and 92 CIII, yielding 64 GI, 70 GII, 91 GIII, 123 N and 74 A oocytes. The data showed no significant correlation between oocyte quality and follicular diameter (P > 0.05). Under the conditions of this study it was concluded that follicular diameter has no influence on the quality of cumulus-oocyte complexes (COC) recovered from the ovaries of female sheep and goats.

  17. Oral type II collagen in the treatment of rheumatoid arthritis. A six-month double blind placebo-controlled study.

    Science.gov (United States)

    Cazzola, M; Antivalle, M; Sarzi-Puttini, P; Dell'Acqua, D; Panni, B; Caruso, I

    2000-01-01

    To evaluate the efficacy of oral chicken type II collagen (CII) in the treatment of rheumatoid arthritis (RA). Sixty patients with clinically active RA of long duration (mean 7.2 +/- 5.5 years) were treated for 6 months with oral chicken CII at 0.25 mg/day (n = 31) or with placebo (n = 29) in a double-blind randomized study. The response rate to treatment of the collagen-treated group, based on the ACR 20% criteria, was higher than that of the control group but this difference was not statistically significant at any time. Intention-to-treat (ITT) analysis did not show statistically significant improvement in any of the several secondary outcome measures over the 6 months of the study in the collagen-treated patients in comparison with the placebo-treated group. However, in 2 collagen-treated patients we observed a clinical remission according to the criteria of the American Rheumatism Association. Our study seems to show that the oral treatment of RA patients with chicken CII is ineffective and results in only small and inconsistent benefits. Furthermore, our results raise the possibility that in a sub-group of patients oral collagen administration, usually considered devoid of harmful effects, may actually induce disease flares.

  18. Safety and immunogenicity of a novel therapeutic DNA vaccine encoding chicken type II collagen for rheumatoid arthritis in normal rats.

    Science.gov (United States)

    Juan, Long; Xiao, Zhao; Song, Yun; Zhijian, Zhang; Jing, Jin; Kun, Yu; Yuna, Hao; Dongfa, Dai; Lili, Ding; Liuxin, Tan; Fei, Liang; Nan, Liu; Fang, Yuan; Yuying, Sun; Yongzhi, Xi

    2015-01-01

    Current clinically available treatments for rheumatoid arthritis (RA) fail to cure the disease or unsatisfactorily halt disease progression. To overcome these limitations, the development of therapeutic DNA vaccines and boosters may offer new promising strategies. Because type II collagen (CII) as a critical autoantigen in RA and native chicken type II collagen (nCCII) has been used to effectively treat RA, we previously developed a novel therapeutic DNA vaccine encoding CCII (pcDNA-CCOL2A1) with efficacy comparable to that of the current "gold standard", methotrexate(MTX). Here, we systemically evaluated the safety and immunogenicity of the pcDNA-CCOL2A1 vaccine in normal Wistar rats. Group 1 received only a single intramuscular injection into the hind leg with pcDNA-CCOL2A1 at the maximum dosage of 3 mg/kg on day 0; Group 2 was injected with normal saline (NS) as a negative control. All rats were monitored daily for any systemic adverse events, reactions at the injection site, and changes in body weights. Plasma and tissues from all experimental rats were collected on day 14 for routine examinations of hematology and biochemistry parameters, anti-CII IgG antibody reactivity, and histopathology. Our results indicated clearly that at the maximum dosage of 3 mg/kg, the pcDNA-CCOL2A1 vaccine was safe and well-tolerated. No abnormal clinical signs or deaths occurred in the pcDNA-CCOL2A1 group compared with the NS group. Furthermore, no major alterations were observed in hematology, biochemistry, and histopathology, even at the maximum dose. 
In particular, no anti-CII IgG antibodies were detected in vaccinated normal rats at 14 d after vaccination; this was relevant because we previously demonstrated that the pcDNA-CCOL2A1 vaccine, when administered at the therapeutic dosage of 300 μg/kg alone, did not induce anti-CII IgG antibody production and significantly reduced levels of anti-CII IgG antibodies in the plasma of rats with established collagen-induced arthritis.

  19. C II 158 μm Observations of a Sample of Late-type Galaxies from the Virgo Cluster

    Science.gov (United States)

    Leech, K.; Volk, H.; Heinrichsen, I.; Hippelein, H.; Metcalfe, L.; Pierini, D.; Popescu, C.; Tuffs, R.; Xu, C.

    1999-01-01

    We have observed 19 Virgo cluster spiral galaxies with the Long Wavelength Spectrometer (LWS) onboard ESA's Infrared Space Observatory (ISO), obtaining spectra around the [CII] 157.741 μm fine structure line.

  20. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  1. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  2. Modelling the interactions between Pseudomonas putida and Escherichia coli O157:H7 in fish-burgers: use of the lag-exponential model and of a combined interaction index.

    Science.gov (United States)

    Speranza, B; Bevilacqua, A; Mastromatteo, M; Sinigaglia, M; Corbo, M R

    2010-08-01

    The objective of the current study was to examine the interactions between Pseudomonas putida and Escherichia coli O157:H7 in coculture studies on fish-burgers packed in air and under different modified atmospheres (30:40:30 O2:CO2:N2, 5:95 O2:CO2 and 50:50 O2:CO2) throughout storage at 8 °C. The lag-exponential model was applied to describe the microbial growth. To give a quantitative measure of the occurring microbial interactions, two simple parameters were developed: the combined interaction index (CII) and the partial interaction index (PII). Under air, the interaction was significant only in the exponential growth phase (CII, 1.72), whereas under the modified atmospheres the interactions were highly significant in both the exponential and the stationary phase (CII ranged from 0.33 to 1.18). PII values for E. coli O157:H7 were lower than those calculated for Ps. putida. The interactions occurring in the system affected both the E. coli O157:H7 and pseudomonad subpopulations, and the packaging atmosphere proved to be a key element. The article provides useful information on the interactions occurring between E. coli O157:H7 and Ps. putida on fish-burgers. The proposed index successfully describes the competitive growth of both micro-organisms, also giving a quantitative measure of a qualitative phenomenon.
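
    The lag-exponential growth description used in the study can be sketched as a three-phase curve: flat during the lag, linear in log10 counts during exponential growth, capped at the stationary level. Parameter names and values below are illustrative, not fitted values from the paper:

```python
import numpy as np

def lag_exponential(t, y0, ymax, mu, lam):
    """Lag-exponential growth curve in log10 counts (sketch).

    y0, ymax: initial and stationary log10 CFU/g; mu: growth rate
    (log10 CFU per h); lam: lag time (h).
    """
    y = y0 + mu * np.maximum(t - lam, 0.0)   # flat during lag, then linear
    return np.minimum(y, ymax)               # capped at the stationary phase

t = np.array([0.0, 2.0, 4.0, 8.0, 12.0])     # sampling times (h)
y = lag_exponential(t, y0=3.0, ymax=9.0, mu=0.5, lam=2.0)
# y = [3.0, 3.0, 4.0, 6.0, 8.0]
```

    An interaction index in the spirit of the paper's CII would then compare curves fitted to monoculture and coculture counts; the abstract does not give the exact formula, so only the growth model is sketched here.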

  3. The Italian validation of the minimal assessment of cognitive function in multiple sclerosis (MACFIMS) and the application of the Cognitive Impairment Index scoring procedure in MS patients.

    Science.gov (United States)

    Argento, Ornella; Incerti, Chiara C; Quartuccio, Maria E; Magistrale, Giuseppe; Francia, Ada; Caltagirone, Carlo; Pisani, Valerio; Nocentini, Ugo

    2018-04-27

    Cognitive dysfunction occurs in almost 50-60% of patients with multiple sclerosis (MS), even in early stages of the disease, and affects different aspects of patients' lives. The aims of the present study were (1) to introduce and validate an Italian version of the minimal assessment of cognitive functions in MS (MACFIMS) battery and (2) to propose the use of the Cognitive Impairment Index (CII) as a scoring procedure to define the degree of impairment in relapsing-remitting (RRMS) and secondary-progressive (SPMS) patients. A total of 240 healthy controls (HC) and 123 MS patients performed the Italian version of the MACFIMS, comprising the same tests as the original except for the Paced Auditory Serial Addition Test. The CII was derived for each score of the 11 scales for participants of both groups. The results show that cognitive impairment affects around 50% of our sample of MS patients. In the RRMS group, only 15.7% of patients showed severe impairment, while in the SPMS group 51.4% of patients fell into the "severely impaired" group. These results are in line with previously reported percentages of impairment in MS patients, showing that the CII applied to the Italian version of the MACFIMS is sensitive and reliable in detecting different degrees of impairment in MS patients.
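
    A Cognitive Impairment Index of the kind described, graded per test against control norms and summed over the battery, might look like the following sketch. The grade boundaries and norms are illustrative assumptions, not the study's actual cutoffs:

```python
def impairment_grade(score, ctrl_mean, ctrl_sd):
    """Grade one test 0 (intact) to 4 (severe) by SDs below the control
    mean. Grade boundaries here are illustrative placeholders."""
    z = (ctrl_mean - score) / ctrl_sd   # positive = worse than controls
    if z < 1.0:
        return 0
    if z < 1.5:
        return 1
    if z < 2.0:
        return 2
    if z < 3.0:
        return 3
    return 4

def cognitive_impairment_index(scores, norms):
    """Sum the per-test grades over the battery (11 scores in MACFIMS)."""
    return sum(impairment_grade(s, m, sd) for s, (m, sd) in zip(scores, norms))

# Toy battery of 3 tests, all with control norms (mean, sd) = (50, 10):
total = cognitive_impairment_index([55, 38, 25], [(50, 10)] * 3)
# grades 0, 1, 3 -> total index of 4
```

    A global cutoff on the summed index then yields the impairment categories (intact, mildly impaired, severely impaired) reported in the study.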

  4. Privacy-Preserving Computation with Trusted Computing via Scramble-then-Compute

    OpenAIRE

    Dang Hung; Dinh Tien Tuan Anh; Chang Ee-Chien; Ooi Beng Chin

    2017-01-01

    We consider privacy-preserving computation of big data using trusted computing primitives with limited private memory. Simply ensuring that the data remains encrypted outside the trusted computing environment is insufficient to preserve data privacy, for data movement observed during computation could leak information. While it is possible to thwart such leakage using generic solutions such as ORAM [42], designing efficient privacy-preserving algorithms is challenging. Besides computation effi...

  5. Mitra 15. Development of a centralized multipoint network at 10 megabauds

    International Nuclear Information System (INIS)

    Mestrallet, Michel.

    1975-01-01

    The APIS system was designed to control the irradiation devices located in Osiris (70 MW swimming-pool reactor). In a first stage the satellite units work autonomously; each is equipped with a small computer (C.I.I. Mitra 15). A larger central computer enables tables and operations to be worked out and prepared from one shut-down to the next. It is also used for link-up trials, to establish test programmes and, secondarily, to serve as a small computing centre. Each processing unit possesses a peripheral MINIBUS to which the terminal links are connected. This MINIBUS takes the form of a printed circuit placed in a crate, allowing the link-up card position to be entirely standardized. The MITRA 15 possesses a micro-programme which automatically handles deviations on detection of a fault. It is built around a main ferrite-core memory equipped with 4 accesses for the connection of 1 to 4 micro-programmed processing units (P.U.) or direct memory access. According to the nature of the micro-programmes integrated into the P.U., these can serve as a central unit, exchange unit or special unit adapted to special processing.

  6. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    Full Text Available We develop some parts of the frame theory in Banach spaces from the point of view of Computable Analysis. We define computable M-basis and use it to construct a computable Banach space of scalar valued sequences. Computable Xd frames and computable Banach frames are also defined and computable versions of sufficient conditions for their existence are obtained.
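
    For reference, the classical Hilbert-space frame condition that the Banach-space generalizations in this abstract (Xd-frames, Banach frames) relax, replacing the ℓ² norm of the coefficient sequence with the norm of a suitable sequence space:

```latex
% A sequence {f_k} in a Hilbert space H is a frame if there exist
% constants 0 < A <= B < infinity such that, for all x in H,
A \, \lVert x \rVert^{2}
  \;\le\; \sum_{k} \bigl|\langle x, f_k \rangle\bigr|^{2}
  \;\le\; B \, \lVert x \rVert^{2}.
```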

  7. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relatively high cost of performing these computations on commercially available general-purpose computers, in terms of both dollar expenditure and elapsed time. Today's computing technology will support a program designed to create specialized computing facilities dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  8. Expression of three sHSP genes involved in heat pretreatment-induced chilling tolerance in banana fruit.

    Science.gov (United States)

    He, Li-hong; Chen, Jian-ye; Kuang, Jian-fei; Lu, Wang-jin

    2012-07-01

    Banana fruit is highly susceptible to chilling injury. In previous research it was shown that heat pretreatment of banana fruit at 38 °C for 3 days before storage at a chilling temperature of 8 °C for 12 days prevented increases in visible chilling injury index, electrolyte leakage and malondialdehyde content and also decreases in lightness and chroma, indicating that heat pretreatment could effectively alleviate chilling injury of banana fruit. However, little is known about the role of small heat shock proteins (sHSPs) in postharvest chilling tolerance of banana fruit. In the present study, three cytosolic sHSP expression profiles in peel and pulp tissues of banana fruit during heat pretreatment and subsequent chilled storage (8 °C) were investigated in relation to heat pretreatment-induced chilling tolerance. Three full-length cDNAs of cytosolic sHSP genes, including two class I sHSP (CI sHSP) and one class II sHSP (CII sHSP) cDNAs, named Ma-CI sHSP1, Ma-CI sHSP2 and Ma-CII sHSP3 respectively, were isolated and characterised from harvested banana fruit. Accumulation of Ma-CI sHSP1 mRNA transcripts in peel and pulp tissues and Ma-CII sHSP3 mRNA transcripts in peel tissue increased during heat pretreatment. Expression of all three Ma-sHSP genes in peel and pulp tissues was induced during subsequent chilled storage. Furthermore, Ma-CI sHSP1 and Ma-CII sHSP3 mRNA transcripts in pulp tissue and Ma-CI sHSP2 mRNA transcripts in peel and pulp tissues were obviously enhanced by heat pretreatment at days 6 and 9 of subsequent chilled storage. These results suggested that heat pretreatment enhanced the expression of Ma-sHSPs, which might be involved in heat pretreatment-induced chilling tolerance of banana fruit. Copyright © 2012 Society of Chemical Industry.

  9. Improving Shadow Suppression for Illumination Robust Face Recognition

    KAUST Repository

    Zhang, Wuming; Zhao, Xi; Morvan, Jean-Marie; Chen, Liming

    2017-01-01

    surface, lighting source and camera sensor, and elaborates the formation of face color appearance. Specifically, the proposed illumination processing pipeline enables the generation of Chromaticity Intrinsic Image (CII) in a log chromaticity space which

  10. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and the translation of Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten
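The Reverse Polish Notation mentioned in this description can be illustrated with a minimal stack-based evaluator. This sketch is for illustration only and is not code from the book:

```python
# Minimal Reverse Polish Notation (postfix) evaluator -- an illustrative
# sketch, not taken from the book being described.
def eval_rpn(tokens):
    ops = {"+": lambda a, b: a + b,
           "-": lambda a, b: a - b,
           "*": lambda a, b: a * b,
           "/": lambda a, b: a / b}
    stack = []
    for tok in tokens:
        if tok in ops:
            b = stack.pop()           # right operand is on top of the stack
            a = stack.pop()
            stack.append(ops[tok](a, b))
        else:
            stack.append(float(tok))  # operand: push onto the stack
    return stack.pop()

# (3 + 4) * 2 written in postfix: 3 4 + 2 *
print(eval_rpn("3 4 + 2 *".split()))  # -> 14.0
```

Postfix notation needs no parentheses or precedence rules, which is why it maps so directly onto machine-level stack operations.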

  11. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    OpenAIRE

    Karlheinz Schwarz; Rainer Breitling; Christian Allen

    2013-01-01

    Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized ...

  12. MCR III. Multicomponent reactions and their libraries, a new type of organic chemistry of the isocyanides and phosphorus derivatives

    NARCIS (Netherlands)

    Chattopadhyaya, J.; Domling, A.; Lorenz, K.; Richter, W.; Ugi, I.; Werner, B.

    1997-01-01

Various new one-pot multicomponent reactions (MCRs) of C(II) and P(III) derivatives and their libraries are described here. The preparation of some nucleobase- and phospholipid compound libraries by MCRs has been carried out.

  13. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    Science.gov (United States)

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on computer hardware achievement, computer anxiety and computer attitude of the eight grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  14. Cloud Computing: The Future of Computing

    OpenAIRE

    Aggarwal, Kanika

    2013-01-01

Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is to assign computation to a great number of distributed computers, rather than a local computer ...

  15. 75 FR 9493 - Commission Statement in Support of Convergence and Global Accounting Standards

    Science.gov (United States)

    2010-03-02

    ..., Office of International Corporate Finance, Division of Corporation Finance, at (202) 551-3450, Jeffrey S...''), CFA Institute (``CFA''), Council of Institutional Investors (``CII''), International Corporate Governance Network (``ICGN''), Institute of International Finance, Investors Technical Advisory Committee...

  16. Simulating the [CII] emission of high redshift galaxies

    DEFF Research Database (Denmark)

    Olsen, Karen Pardos; Greve, Thomas Rodriguez; Narayanan, Desika

    2016-01-01

and radiative transfer, the photoionization code CLOUDY is implemented. I will show results for z=2 star-forming galaxies yet to be observed, as well as preliminary results for galaxies at z~6-7 where observations have presented contradictory detections and non-detections of star-forming galaxies....

  17. Neural computation and the computational theory of cognition.

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-04-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism-neural processes are computations in the generic sense. After that, we reject on empirical grounds the common assimilation of neural computation to either analog or digital computation, concluding that neural computation is sui generis. Analog computation requires continuous signals; digital computation requires strings of digits. But current neuroscientific evidence indicates that typical neural signals, such as spike trains, are graded like continuous signals but are constituted by discrete functional elements (spikes); thus, typical neural signals are neither continuous signals nor strings of digits. It follows that neural computation is sui generis. Finally, we highlight three important consequences of a proper understanding of neural computation for the theory of cognition. First, understanding neural computation requires a specially designed mathematical theory (or theories) rather than the mathematical theories of analog or digital computation. Second, several popular views about neural computation turn out to be incorrect. Third, computational theories of cognition that rely on non-neural notions of computation ought to be replaced or reinterpreted in terms of neural computation. Copyright © 2012 Cognitive Science Society, Inc.

  18. Myocardial Infarction Injury in Patients with Chronic Lung Disease Entering Pulmonary Rehabilitation: Frequency and Association with Heart Rate Parameters.

    Science.gov (United States)

    Sima, Carmen A; Lau, Benny C; Taylor, Carolyn M; van Eeden, Stephan F; Reid, W Darlene; Sheel, Andrew W; Kirkham, Ashley R; Camp, Pat G

    2018-03-14

    Myocardial infarction (MI) remains under-recognized in chronic lung disease (CLD) patients. Rehabilitation health professionals need accessible clinical measurements to identify the presence of prior MI in order to determine appropriate training prescription. To estimate prior MI in CLD patients entering a pulmonary rehabilitation program, as well as its association with heart rate parameters such as resting heart rate and chronotropic response index. Retrospective cohort design. Pulmonary rehabilitation outpatient clinic in a tertiary care university-affiliated hospital. Eighty-five CLD patients were studied. Electrocardiograms at rest and peak cardiopulmonary exercise testing, performed before pulmonary rehabilitation, were analyzed. Electrocardiographic evidence of prior MI, quantified by the Cardiac Infarction Injury Score (CIIS), was contrasted with reported myocardial events and then correlated with resting heart rate and chronotropic response index parameters. CIIS, resting heart rate, and chronotropic response index. Sixteen CLD patients (19%) demonstrated electrocardiographic evidence of prior MI, but less than half (8%) had a reported MI history (P CLD patients with a resting heart rate higher than 80 beats/min had approximately 5 times higher odds of having prior MI, as evidenced by a CIIS ≥20. CLD patients entering pulmonary rehabilitation are at risk of unreported prior MI. Elevated resting heart rate seems to be an indicator of prior MI in CLD patients; therefore, careful adjustment of training intensity such as intermittent training is recommended under these circumstances. III. Copyright © 2018 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.

  19. Recombinant adenovirus-mediated gene transfer suppresses experimental arthritis

    Directory of Open Access Journals (Sweden)

    E. Quattrocchi

    2011-09-01

Full Text Available Collagen Induced Arthritis (CIA) is a widely studied animal model to develop and test novel therapeutic approaches for treating Rheumatoid Arthritis (RA) in humans. Soluble Cytotoxic T-Lymphocyte Antigen 4 (CTLA4-Ig), which binds the B7 molecule on antigen-presenting cells and blocks CD28-mediated T-lymphocyte activation, has been shown to ameliorate experimental autoimmune diseases such as lupus, diabetes and CIA. The objective of our research was to investigate in vivo the effectiveness of blocking the B7/CD28 T-lymphocyte co-stimulatory pathway, utilizing gene transfer technology, as a therapeutic strategy against CIA. Replication-deficient adenoviruses encoding a chimeric CTLA4-Ig fusion protein, or β-galactosidase as control, were injected intravenously once at arthritis onset. Disease activity was monitored by the assessment of clinical score, paw thickness and type II collagen (CII)-specific cellular and humoral immune responses for 21 days. The adenovirally delivered CTLA4-Ig fusion protein at a dose of 2×10^8 pfu suppressed established CIA, whereas the control β-galactosidase did not significantly affect the disease course. CII-specific lymphocyte proliferation, IFN-γ production and anti-CII antibodies were significantly reduced by CTLA4-Ig treatment. Our results demonstrate that blockade of the B7/CD28 co-stimulatory pathway by adenovirus-mediated CTLA4-Ig gene transfer is effective in treating established CIA, suggesting its potential in treating RA.

  20. Consistent response of bird populations to climate change on two continents.

    Science.gov (United States)

    Stephens, Philip A; Mason, Lucy R; Green, Rhys E; Gregory, Richard D; Sauer, John R; Alison, Jamie; Aunins, Ainars; Brotons, Lluís; Butchart, Stuart H M; Campedelli, Tommaso; Chodkiewicz, Tomasz; Chylarecki, Przemysław; Crowe, Olivia; Elts, Jaanus; Escandell, Virginia; Foppen, Ruud P B; Heldbjerg, Henning; Herrando, Sergi; Husby, Magne; Jiguet, Frédéric; Lehikoinen, Aleksi; Lindström, Åke; Noble, David G; Paquet, Jean-Yves; Reif, Jiri; Sattler, Thomas; Szép, Tibor; Teufelbauer, Norbert; Trautmann, Sven; van Strien, Arco J; van Turnhout, Chris A M; Vorisek, Petr; Willis, Stephen G

    2016-04-01

    Global climate change is a major threat to biodiversity. Large-scale analyses have generally focused on the impacts of climate change on the geographic ranges of species and on phenology, the timing of ecological phenomena. We used long-term monitoring of the abundance of breeding birds across Europe and the United States to produce, for both regions, composite population indices for two groups of species: those for which climate suitability has been either improving or declining since 1980. The ratio of these composite indices, the climate impact indicator (CII), reflects the divergent fates of species favored or disadvantaged by climate change. The trend in CII is positive and similar in the two regions. On both continents, interspecific and spatial variation in population abundance trends are well predicted by climate suitability trends. Copyright © 2016, American Association for the Advancement of Science.
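The climate impact indicator (CII) described above is the ratio of two composite population indices. A minimal sketch of that computation follows; the index values are made up, and compositing by geometric mean is an assumption for illustration, not a detail stated in the abstract:

```python
import math

# Sketch of a climate impact indicator (CII): the ratio of a composite
# population index for climate-favored species to one for climate-
# disadvantaged species. Geometric-mean compositing is an assumption.
def composite_index(species_indices):
    """Geometric mean of per-species population indices (1.0 = baseline)."""
    logs = [math.log(x) for x in species_indices]
    return math.exp(sum(logs) / len(logs))

# Hypothetical per-species population indices relative to a baseline year
favored       = [1.10, 1.25, 0.95]  # species with improving climate suitability
disadvantaged = [0.80, 0.90, 1.05]  # species with declining climate suitability

cii = composite_index(favored) / composite_index(disadvantaged)
# cii > 1 indicates the divergence between the two groups that the
# abstract attributes to climate change
```

A CII drifting above 1 over time is the "divergent fates" signal: the favored group outperforms the disadvantaged one.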

  1. Neural Computation and the Computational Theory of Cognition

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-01-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…

  2. Computer-aided design and computer science technology

    Science.gov (United States)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  3. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    Directory of Open Access Journals (Sweden)

    Karlheinz Schwarz

    2013-09-01

Full Text Available Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized below. In each section a further focusing will be provided by occasionally organizing special issues on topics of high interest, collecting papers on fundamental work in the field. More applied papers should be submitted to their corresponding specialist journals. To help us achieve our goal with this journal, we have an excellent editorial board to advise us on the exciting current and future trends in computation from methodology to application. We very much look forward to hearing all about the research going on across the world. [...]

  4. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr). Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala

  5. Computer architecture fundamentals and principles of computer design

    CERN Document Server

    Dumas II, Joseph D

    2005-01-01

Introduction to Computer Architecture: What is Computer Architecture?; Architecture vs. Implementation; Brief History of Computer Systems; The First Generation; The Second Generation; The Third Generation; The Fourth Generation; Modern Computers - The Fifth Generation; Types of Computer Systems; Single Processor Systems; Parallel Processing Systems; Special Architectures; Quality of Computer Systems; Generality and Applicability; Ease of Use; Expandability; Compatibility; Reliability; Success and Failure of Computer Architectures and Implementations; Quality and the Perception of Quality; Cost Issues; Architectural Openness, Market Timi

  6. Computer surety: computer system inspection guidance. [Contains glossary

    Energy Technology Data Exchange (ETDEWEB)

    1981-07-01

    This document discusses computer surety in NRC-licensed nuclear facilities from the perspective of physical protection inspectors. It gives background information and a glossary of computer terms, along with threats and computer vulnerabilities, methods used to harden computer elements, and computer audit controls.

  7. BONFIRE: benchmarking computers and computer networks

    OpenAIRE

    Bouckaert, Stefan; Vanhie-Van Gerwen, Jono; Moerman, Ingrid; Phillips, Stephen; Wilander, Jerker

    2011-01-01

    The benchmarking concept is not new in the field of computing or computer networking. With “benchmarking tools”, one usually refers to a program or set of programs, used to evaluate the performance of a solution under certain reference conditions, relative to the performance of another solution. Since the 1970s, benchmarking techniques have been used to measure the performance of computers and computer networks. Benchmarking of applications and virtual machines in an Infrastructure-as-a-Servi...

  8. A survey of computational physics introductory computational science

    CERN Document Server

    Landau, Rubin H; Bordeianu, Cristian C

    2008-01-01

    Computational physics is a rapidly growing subfield of computational science, in large part because computers can solve previously intractable problems or simulate natural processes that do not have analytic solutions. The next step beyond Landau's First Course in Scientific Computing and a follow-up to Landau and Páez's Computational Physics, this text presents a broad survey of key topics in computational physics for advanced undergraduates and beginning graduate students, including new discussions of visualization tools, wavelet analysis, molecular dynamics, and computational fluid dynamics

  9. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). · Illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics · Emphasis on algorithmic advances that will allow re-application in other...

  10. Current steering and current focusing in cochlear implants: comparison of monopolar, tripolar, and virtual channel electrode configurations.

    NARCIS (Netherlands)

    Berenstein, C.K.; Mens, L.H.M.; Mulder, J.J.S.; Vanpoucke, F.J.

    2008-01-01

    OBJECTIVES: To compare the effects of Monopole (Mono), Tripole (Tri), and "Virtual channel" (Vchan) electrode configurations on spectral resolution and speech perception in a crossover design. DESIGN: Nine experienced adults who received an Advanced Bionics CII/90K cochlear implant participated in a

  11. 47 CFR 22.973 - Information exchange.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 2 2010-10-01 2010-10-01 false Information exchange. 22.973 Section 22.973 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) COMMON CARRIER SERVICES PUBLIC MOBILE SERVICES Cellular Radiotelephone Service § 22.973 Information exchange. (a) Prior notification. Public safety/CII...

  12. 5 CFR 213.3102 - Entire executive civil service.

    Science.gov (United States)

    2010-01-01

    ...) Executive branch employees (other than employees of intelligence agencies) who are entitled to placement... intelligence agencies defined in 5 U.S.C. 2302(a)(2)(C)(ii) who are entitled to placement under § 353.110. (2... printed volume and on GPO Access. ...

  13. Population-based versus practice-based recall for childhood immunizations: a randomized controlled comparative effectiveness trial.

    Science.gov (United States)

    Kempe, Allison; Saville, Alison; Dickinson, L Miriam; Eisert, Sheri; Reynolds, Joni; Herrero, Diana; Beaty, Brenda; Albright, Karen; Dibert, Eva; Koehler, Vicky; Lockhart, Steven; Calonge, Ned

    2013-06-01

    We compared the effectiveness and cost-effectiveness of population-based recall (Pop-recall) versus practice-based recall (PCP-recall) at increasing immunizations among preschool children. This cluster-randomized trial involved children aged 19 to 35 months needing immunizations in 8 rural and 6 urban Colorado counties. In Pop-recall counties, recall was conducted centrally using the Colorado Immunization Information System (CIIS). In PCP-recall counties, practices were invited to attend webinar training using CIIS and offered financial support for mailings. The percentage of up-to-date (UTD) and vaccine documentation were compared 6 months after recall. A mixed-effects model assessed the association between intervention and whether a child became UTD. Ten of 195 practices (5%) implemented recall in PCP-recall counties. Among children needing immunizations, 18.7% became UTD in Pop-recall versus 12.8% in PCP-recall counties (P immunization rates in preschool children.

  14. Privacy-Preserving Computation with Trusted Computing via Scramble-then-Compute

    Directory of Open Access Journals (Sweden)

    Dang Hung

    2017-07-01

Full Text Available We consider privacy-preserving computation of big data using trusted computing primitives with limited private memory. Simply ensuring that the data remains encrypted outside the trusted computing environment is insufficient to preserve data privacy, for data movement observed during computation could leak information. While it is possible to thwart such leakage using a generic solution such as ORAM [42], designing efficient privacy-preserving algorithms is challenging. Besides computation efficiency, it is critical to keep trusted code bases lean, for large ones are unwieldy to vet and verify. In this paper, we advocate a simple approach wherein many basic algorithms (e.g., sorting) can be made privacy-preserving by adding a step that securely scrambles the data before feeding it to the original algorithms. We call this approach Scramble-then-Compute (StC), and give a sufficient condition whereby existing external memory algorithms can be made privacy-preserving via StC. This approach facilitates code reuse, and its simplicity contributes to a smaller trusted code base. It is also general, allowing algorithm designers to leverage an extensive body of known efficient algorithms for better performance. Our experiments show that StC could offer up to 4.1× speedups over known, application-specific alternatives.
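The Scramble-then-Compute pattern can be sketched as a tiny wrapper: permute the input, then run the unmodified algorithm. This is a toy illustration of the structure only; in the paper's setting the scramble step itself must be performed obliviously, which this sketch does not attempt:

```python
import random

# Toy sketch of the Scramble-then-Compute (StC) pattern: permute the input
# before handing it to an unmodified algorithm whose memory-access pattern
# would otherwise leak information about the input order. Illustrative only;
# a real StC scramble must be data-oblivious.
def scramble_then_compute(data, algorithm, seed=None):
    rng = random.Random(seed)
    scrambled = list(data)
    rng.shuffle(scrambled)       # the "scramble" step
    return algorithm(scrambled)  # the original, unmodified algorithm

result = scramble_then_compute([5, 3, 8, 1], sorted)
print(result)  # -> [1, 3, 5, 8]
```

Sorting is a natural fit because its output is invariant under any input permutation, which is essentially the sufficient condition the paper formalizes for external-memory algorithms.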

  15. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design...... and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view......, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture....

  16. Computer scientist looks at reliability computations

    International Nuclear Information System (INIS)

    Rosenthal, A.

    1975-01-01

    Results from the theory of computational complexity are applied to reliability computations on fault trees and networks. A well known class of problems which almost certainly have no fast solution algorithms is presented. It is shown that even approximately computing the reliability of many systems is difficult enough to be in this class. In the face of this result, which indicates that for general systems the computation time will be exponential in the size of the system, decomposition techniques which can greatly reduce the effective size of a wide variety of realistic systems are explored
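The decomposition idea can be illustrated on the easy special case of series/parallel systems, where reliability does reduce to fast arithmetic; for general networks, as the abstract notes, no fast algorithm is believed to exist. A minimal sketch under the standard assumption of independent components:

```python
# Reliability of series/parallel compositions of independent components --
# the tractable special case that decomposition techniques try to expose.
# General network reliability is believed to admit no fast exact algorithm,
# so no such simple closed form exists in general.
def series(*rs):
    """All components must work: product of component reliabilities."""
    p = 1.0
    for r in rs:
        p *= r
    return p

def parallel(*rs):
    """At least one component must work: 1 - product of failure probs."""
    q = 1.0
    for r in rs:
        q *= (1.0 - r)
    return 1.0 - q

# Two 0.9-reliable units in parallel, in series with a 0.95-reliable unit:
system = series(parallel(0.9, 0.9), 0.95)
print(round(system, 4))  # -> 0.9405
```

Decomposing a realistic system into nested series/parallel blocks like this is exactly how its "effective size" can be reduced before any expensive general-purpose computation is attempted.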

  17. Increased chemotaxis and activity of circulatory myeloid progenitor cells may contribute to enhanced osteoclastogenesis and bone loss in the C57BL/6 mouse model of collagen-induced arthritis.

    Science.gov (United States)

    Ikić Matijašević, M; Flegar, D; Kovačić, N; Katavić, V; Kelava, T; Šućur, A; Ivčević, S; Cvija, H; Lazić Mosler, E; Kalajzić, I; Marušić, A; Grčević, D

    2016-12-01

Our study aimed to determine the functional activity of different osteoclast progenitor (OCP) subpopulations and signals important for their migration to bone lesions, causing local and systemic bone resorption during the course of collagen-induced arthritis in C57BL/6 mice. Arthritis was induced with chicken type II collagen (CII), and assessed by clinical scoring and detection of anti-CII antibodies. We observed decreased trabecular bone volume of axial and appendicular skeleton by histomorphometry and micro-computed tomography as well as decreased bone formation and increased bone resorption rate in arthritic mice in vivo. In the affected joints, bone loss was accompanied with severe osteitis and bone marrow hypercellularity, coinciding with the areas of active osteoclasts and bone erosions. Flow cytometry analysis showed increased frequency of putative OCP cells (CD3(-)B220(-)NK1.1(-)CD11b(-/lo)CD117(+)CD115(+) for bone marrow and CD3(-)B220(-)NK1.1(-)CD11b(+)CD115(+)Gr-1(+) for peripheral haematopoietic tissues), which exhibited enhanced differentiation potential in vitro. Moreover, the total CD11b(+) population was expanded in arthritic mice, as well as CD11b(+)F4/80(+) macrophage, CD11b(+)NK1.1(+) natural killer cell and CD11b(+)CD11c(+) myeloid dendritic cell populations in both bone marrow and peripheral blood. In addition, arthritic mice had increased expression of tumour necrosis factor-α, interleukin-6, CC chemokine ligand-2 (Ccl2) and Ccl5, with increased migration and differentiation of circulatory OCPs in response to CCL2 and, particularly, CCL5 signals. Our study characterized the frequency and functional properties of OCPs under inflammatory conditions associated with arthritis, which may help to clarify crucial molecular signals provided by immune cells to mediate systemically enhanced osteoresorption. © 2016 British Society for Immunology.

  18. Aberrant regulation of synthesis and degradation of viral proteins in coliphage lambda-infected UV-irradiated cells and in minicells

    International Nuclear Information System (INIS)

    Shaw, J.E.; Epp, C.; Pearson, M.L.; Reeve, J.N.

    1987-01-01

    The patterns of bacteriophage lambda proteins synthesized in UV-irradiated Escherichia coli cells and in anucleate minicells are significantly different; both systems exhibit aberrations of regulation in lambda gene expression. In unirradiated cells or cells irradiated with low UV doses (less than 600 J/m2), regulation of lambda protein synthesis is controlled by the regulatory proteins CI, N, CII, CIII, Cro, and Q. As the UV dose increases, activation of transcription of the cI, rexA, and int genes by CII and CIII proteins fails to occur and early protein synthesis, normally inhibited by the action of Cro, continues. After high UV doses (greater than 2000 J/m2), late lambda protein synthesis does not occur. Progression through the sequence of regulatory steps in lambda gene expression is slower in infected minicells. In minicells, there is no detectable cII- and cIII-dependent synthesis of CI, RexA, or Int proteins and inhibition of early protein synthesis by Cro activity is always incomplete. The synthesis of early b region proteins is not subject to control by CI, N, or Cro proteins, and evidence is presented suggesting that, in minicells, transcription of the early b region is initiated at a promoter(s) within the b region. Proteolytic cleavage of the regulatory proteins O and N and of the capsid proteins C, B, and Nu3 is much reduced in infected minicells. Exposure of minicells to very high UV doses before infection does not completely inhibit late lambda protein synthesis

  19. Heterotic computing: exploiting hybrid computational devices.

    Science.gov (United States)

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  20. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  1. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  2. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  3. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference, ACS-2014. From its beginning, the Advanced Computer Systems conference concentrated on methods and algorithms of artificial intelligence. Over time, new areas of interest emerged, concerning technical informatics related to soft computing and more technological aspects of computer science, such as multimedia and computer graphics, software engineering, web systems, information security and safety, and project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security and Software Technologies.

  4. +91 1662 263153, Fax

    African Journals Online (AJOL)

    Michael Horsfall

Chromium Tolerance and Bioremoval by Cyanobacteria Isolated from Textile Mill Oxidation Pond in. Pure and .... photosynthetic pigments of cyanobacteria (CI - Nostoc calcicola, CII - Chroococcus minutus, CI + II - Consortium) at the peak growth stage .... with high EPS production by the cyanobacterial cells resulting in loss ...

  5. 48 CFR 225.504 - Evaluation examples.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Evaluation examples. 225.504 Section 225.504 Federal Acquisition Regulations System DEFENSE ACQUISITION REGULATIONS SYSTEM... 225.504 Evaluation examples. For examples that illustrate the evaluation procedures in 225.502(c)(ii...

  6. Use of the computer program in a cloud computing

    Directory of Open Access Journals (Sweden)

    Radovanović Sanja

    2013-01-01

Full Text Available Cloud computing represents a specific form of networking, in which a computer program simulates the operation of one or more server computers. In terms of copyright, all technological processes that take place within cloud computing are covered by the notion of copying computer programs, and the exclusive right of reproduction. However, this right suffers some limitations in order to allow normal use of a computer program by its users. Given that cloud computing is a virtualized network, the issue of normal use of the computer program requires putting all aspects of permitted copying into the context of a specific computing environment and the specific processes within the cloud. In this sense, the paper points out that the user of a computer program in cloud computing needs to obtain the consent of the right holder for any act undertaken using the program. In other words, copyright applies in full in cloud computing, and so does the freedom of contract (in the case of this particular restriction as well).

  7. Quantum Computing and the Limits of the Efficiently Computable

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    I'll discuss how computational complexity---the study of what can and can't be feasibly computed---has been interacting with physics in interesting and unexpected ways. I'll first give a crash course about computer science's P vs. NP problem, as well as about the capabilities and limits of quantum computers. I'll then touch on speculative models of computation that would go even beyond quantum computers, using (for example) hypothetical nonlinearities in the Schrodinger equation. Finally, I'll discuss BosonSampling ---a proposal for a simple form of quantum computing, which nevertheless seems intractable to simulate using a classical computer---as well as the role of computational complexity in the black hole information puzzle.

  8. COMPARATIVE STUDY OF CLOUD COMPUTING AND MOBILE CLOUD COMPUTING

    OpenAIRE

    Nidhi Rajak*, Diwakar Shukla

    2018-01-01

The present era is that of Information and Communication Technology (ICT), and much research is ongoing in Cloud Computing and Mobile Cloud Computing, covering security issues, data management, load balancing and so on. Cloud computing provides services to the end user over the Internet, and the primary objectives of this computing are resource sharing and pooling among the end users. Mobile Cloud Computing is a combination of Cloud Computing and Mobile Computing. Here, data is stored in...

  9. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  10. Quantum Computing's Classical Problem, Classical Computing's Quantum Problem

    OpenAIRE

    Van Meter, Rodney

    2013-01-01

    Tasked with the challenge to build better and better computers, quantum computing and classical computing face the same conundrum: the success of classical computing systems. Small quantum computing systems have been demonstrated, and intermediate-scale systems are on the horizon, capable of calculating numeric results or simulating physical systems far beyond what humans can do by hand. However, to be commercially viable, they must surpass what our wildly successful, highly advanced classica...

  11. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  12. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  13. 45 CFR 1326.9 - Contributions.

    Science.gov (United States)

    2010-10-01

    ... INDIAN TRIBES FOR SUPPORT AND NUTRITION SERVICES § 1326.9 Contributions. (a) Each tribal organization... contributions to expand comprehensive and coordinated services systems supported under this part, while using nutrition services contributions only to expand services as provided under section 307(a)(13)(c)(ii) of the...

  14. Computational biomechanics for medicine imaging, modeling and computing

    CERN Document Server

    Doyle, Barry; Wittek, Adam; Nielsen, Poul; Miller, Karol

    2016-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises eighteen of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, UK, Switzerland, Scotland, France and Russia. Some of the interesting topics discussed are: tailored computational models; traumatic brain injury; soft-tissue mechanics; medical image analysis; and clinically-relevant simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  15. Synthetic Computation: Chaos Computing, Logical Stochastic Resonance, and Adaptive Computing

    Science.gov (United States)

    Kia, Behnam; Murali, K.; Jahed Motlagh, Mohammad-Reza; Sinha, Sudeshna; Ditto, William L.

Nonlinear and chaotic systems can exhibit numerous behaviors and patterns, and one can select different patterns from this rich library. In this paper we focus on synthetic computing, a field that engineers and synthesizes nonlinear systems to obtain computation. We explain the importance of nonlinearity, and describe how nonlinear systems can be engineered to perform computation. More specifically, we provide an overview of chaos computing, a field that manually programs chaotic systems to build different types of digital functions. We also briefly describe logical stochastic resonance (LSR), and then extend the approach of LSR to realize combinational digital logic systems via suitable concatenation of existing logical stochastic resonance blocks. Finally we demonstrate how a chaotic system can be engineered and mated with different machine learning techniques, such as artificial neural networks, random searching, and genetic algorithms, to design different autonomous systems that can adapt and respond to environmental conditions.
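
    The reconfigurable-logic idea behind chaos computing can be illustrated with a minimal sketch. All parameters below (the logistic map, the base state x_star, the encoding step delta, and the two thresholds) are illustrative choices of ours, not values from the paper: the two logic inputs perturb the initial condition of a chaotic map, the map is iterated once, and the readout threshold alone selects which gate the element implements.

    ```python
    def chaotic_gate(i1, i2, threshold, delta=0.125, x_star=0.25):
        """One chaos-computing step: encode two logic inputs as
        perturbations of an initial condition, iterate the logistic
        map once, and threshold the result to read out a logic value."""
        x0 = x_star + delta * (i1 + i2)   # inputs shift the initial state
        x1 = 4.0 * x0 * (1.0 - x0)        # fully chaotic logistic map
        return 1 if x1 > threshold else 0

    # The same chaotic element acts as a different gate depending only
    # on the readout threshold -- the reconfigurability the paper highlights.
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, chaotic_gate(a, b, 0.80), chaotic_gate(a, b, 0.97))
    ```

    With threshold 0.80 the element realizes an OR gate; raising the threshold to 0.97 turns the very same element into an AND gate, showing how one nonlinear system can be "programmed" into different digital functions.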

  16. Explorations in computing an introduction to computer science

    CERN Document Server

    Conery, John S

    2010-01-01

Introduction (Computation; The Limits of Computation; Algorithms; A Laboratory for Computational Experiments). The Ruby Workbench, introducing Ruby and the RubyLabs environment for computational experiments (Interactive Ruby; Numbers; Variables; Methods; RubyLabs). The Sieve of Eratosthenes, an algorithm for finding prime numbers (The Sieve Algorithm; The mod Operator; Containers; Iterators; Boolean Values and the delete_if Method; Exploring the Algorithm; The sieve Method; A Better Sieve; Experiments with the Sieve). A Journey of a Thousand Miles, iteration as a strategy for solving computational problems (Searching and Sorting ...
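
    The sieve algorithm that this book develops in Ruby can be sketched compactly. The version below is an illustrative Python rendering of the same idea (not the book's RubyLabs code): mark off the multiples of each prime, starting from its square, and the unmarked numbers that remain are prime.

    ```python
    def sieve(n):
        """Return all primes <= n using the Sieve of Eratosthenes."""
        if n < 2:
            return []
        is_prime = [True] * (n + 1)
        is_prime[0] = is_prime[1] = False
        p = 2
        while p * p <= n:
            if is_prime[p]:
                # Every multiple of p from p*p upward is composite.
                for m in range(p * p, n + 1, p):
                    is_prime[m] = False
            p += 1
        return [i for i, flag in enumerate(is_prime) if flag]

    print(sieve(30))  # → [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
    ```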

  17. Effect of temperature on enzymatic and physiological factors related to chilling injury in carambola fruit (Averrhoa carambola L.).

    Science.gov (United States)

    Pérez-Tello, G O; Silva-Espinoza, B A; Vargas-Arispuro, I; Briceño-Torres, B O; Martinez-Tellez, M A

    2001-10-05

Three groups of carambola fruits (Averrhoa carambola L.) were stored at 2 and 10 degrees C (85-90% relative humidity). The major physicochemical, physiological, and enzymatic responses of fruit were measured in each group over a 30-day period: chilling injury index (CII), decay (%), intracuticular waxes, cuticle permeability, pulp firmness, weight loss, sucrose, fructose and glucose contents, ion electrolyte leakage in pulp (%), ethylene and carbon dioxide production rates, and the activities of peroxidase (POD), polyphenol oxidase (PPO), and phenylalanine ammonia-lyase (PAL) enzymes. CII values were statistically different at 2 and 10 degrees C, showing high significance with respect to sucrose content and weight loss (P < 0.05). Chilling injury included darkened ribs and skin desiccation. A possible relationship between POD and PPO activities and CI symptom development was found at 2 degrees C. A significant sucrose content increase was observed at 10 degrees C. CI symptoms were associated with POD and PAL activities. Copyright 2001 Academic Press.

  18. Cyberspace and Critical Information Infrastructures

    Directory of Open Access Journals (Sweden)

    Dan COLESNIUC

    2013-01-01

Full Text Available Every economy of an advanced nation relies on information systems and interconnected networks; thus, in order to ensure the prosperity of a nation, making cyberspace a secure place becomes as crucial as securing society. Cyber security means ensuring the safety of this cyberspace from threats, which can take different forms, such as stealing secret information from national companies and government institutions, attacking infrastructure vital to the functioning of the nation, or attacking the privacy of individual citizens. The critical information infrastructure (CII) represents the indispensable "nervous system" that allows modern societies to work and live. Without it, there would be no distribution of energy, no services like banking or finance, no air traffic control, and so on. At the same time, in the development process of CII, security was never considered a top priority, and for this reason it is exposed to a high risk from organized crime.

  19. Computer Networking Laboratory for Undergraduate Computer Technology Program

    National Research Council Canada - National Science Library

    Naghedolfeizi, Masoud

    2000-01-01

...) To improve the quality of education in the existing courses related to computer networks and data communications as well as other computer science courses such as programming languages and computer...

  20. Mathematics, Physics and Computer Sciences The computation of ...

    African Journals Online (AJOL)

Mathematics, Physics and Computer Sciences The computation of system matrices for biquadratic square finite ... Global Journal of Pure and Applied Sciences ... The computation of system matrices for biquadratic square finite elements.

  1. Computability, complexity, and languages fundamentals of theoretical computer science

    CERN Document Server

    Davis, Martin D; Rheinboldt, Werner

    1983-01-01

    Computability, Complexity, and Languages: Fundamentals of Theoretical Computer Science provides an introduction to the various aspects of theoretical computer science. Theoretical computer science is the mathematical study of models of computation. This text is composed of five parts encompassing 17 chapters, and begins with an introduction to the use of proofs in mathematics and the development of computability theory in the context of an extremely simple abstract programming language. The succeeding parts demonstrate the performance of abstract programming language using a macro expa

  2. A large-scale computer facility for computational aerodynamics

    International Nuclear Information System (INIS)

    Bailey, F.R.; Balhaus, W.F.

    1985-01-01

The combination of computer system technology and numerical modeling has advanced to the point that computational aerodynamics has emerged as an essential element in aerospace vehicle design methodology. To provide for further advances in modeling of aerodynamic flow fields, NASA has initiated at the Ames Research Center the Numerical Aerodynamic Simulation (NAS) Program. The objective of the Program is to develop a leading-edge, large-scale computer facility, and make it available to NASA, DoD, other Government agencies, industry and universities as a necessary element in ensuring continuing leadership in computational aerodynamics and related disciplines. The Program will establish an initial operational capability in 1986 and systematically enhance that capability by incorporating evolving improvements in state-of-the-art computer system technologies as required to maintain a leadership role. This paper briefly reviews the present and future requirements for computational aerodynamics and discusses the Numerical Aerodynamic Simulation Program objectives, computational goals, and implementation plans

  3. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  4. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down to three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  5. 76 FR 55012 - Certain Steel Wheels From the People's Republic of China: Preliminary Affirmative Countervailing...

    Science.gov (United States)

    2011-09-06

    ... POI, Centurion was owned by a Hong Kong-registered company and a private individual. Jining CII Wheel... provincial price proposals for 2006 and 2008, because they are working documents for the National Development... average useful life (AUL) of the renewable physical assets used to produce the subject merchandise...

  6. On teaching computer ethics within a computer science department.

    Science.gov (United States)

    Quinn, Michael J

    2006-04-01

    The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.

  7. Parallel quantum computing in a single ensemble quantum computer

    International Nuclear Information System (INIS)

    Long Guilu; Xiao, L.

    2004-01-01

We propose a parallel quantum computing mode for an ensemble quantum computer. In this mode, some qubits are in pure states while other qubits are in mixed states. It enables a single ensemble quantum computer to perform 'single-instruction multiple-data' parallel computation. Parallel quantum computing can provide additional speedup in Grover's algorithm and Shor's algorithm. In addition, it also makes fuller use of qubit resources in an ensemble quantum computer. As a result, some qubits discarded in the preparation of an effective pure state in the Schulman-Vazirani and the Cleve-DiVincenzo algorithms can be reutilized

  8. STARFIRE: The Spectroscopic Terahertz Airborne Receiver for Far-InfraRed Exploration

    Science.gov (United States)

    Aguirre, James; STARFIRE Collaboration

    2018-01-01

Understanding the formation and evolution of galaxies is one of the foremost goals of astrophysics and cosmology today. The cosmic star formation rate has undergone a dramatic evolution over the course of the last seven billion years, with a peak in cosmic star formation near z ~ 1, largely in dust-obscured star forming galaxies (DSFGs), followed by a dramatic fall in both the star formation rate and the fraction of star formation occurring in DSFGs. A variety of unextincted diagnostic lines are present in the far-infrared (FIR) which can provide insight into the conditions of star formation in DSFGs. Spectroscopy in the far-infrared is thus scientifically crucial for understanding galaxy evolution, yet remains technically difficult, particularly for wavelengths shorter than those accessible to ALMA.STARFIRE (the Spectroscopic Terahertz Airborne Receiver for Far-InfraRed Exploration) is a proposed integral-field spectrometer using kinetic inductance detectors, operating from 240 - 420 μm and coupled to a 2.5 meter low-emissivity carbon-fiber balloon-borne telescope. Using dispersive spectroscopy and the stratospheric platform, STARFIRE can achieve better performance than SOFIA or Herschel-SPIRE FTS. STARFIRE is designed to study the ISM of galaxies at 0.5 < z < 1.5 via intensity mapping, measuring clustering as well as shot noise, and will relate the mean [CII] intensity as a function of redshift (a proxy for star formation rate density) to the large scale structure. In addition, STARFIRE will detect at least 50 galaxies directly in the [CII] line, and will also be able to stack on optical galaxies to below the SPIRE confusion limit to measure the [CII] luminosity of more typical galaxies.

  9. OT2_smalhotr_3: Herschel Extreme Lensing Line Observations (HELLO)

    Science.gov (United States)

    Malhotra, S.

    2011-09-01

We request 59.8 hours of Herschel time to observe 20 normal star-forming galaxies in the [CII] 158 micron and [OI] 63 micron lines. These galaxies lie at high redshift (1 < z < 3) and are strongly gravitationally lensed, allowing detection of [CII], [OI], or both. Herschel offers the unique opportunity to study both lines with high sensitivity throughout this epoch (using HIFI for [CII] and PACS for [OI]). These two lines are the main cooling lines of the atomic medium. By measuring their fluxes, we will measure (1) the cooling efficiency of gas, (2) gas densities and temperatures near star-forming regions, and (3) gas pressures, which are important to drive the winds that provide feedback to star-formation processes. By combining the proposed observations with existing multiwavelength data on these objects, we will obtain as complete a picture of galaxy-scale star formation and ISM physical conditions at high redshifts as we have at z=0. Then perhaps we can understand why star formation and AGN activity peaked at this epoch. In Herschel cycle OT1, 49 high redshift IR luminous galaxies were approved for spectroscopy, but only two so-called normal galaxies were included. This is an imbalance that should be corrected, to balance Herschel's legacy.

  10. Hematotoxicity response in rats by the novel copper-based anticancer agent: casiopeina II

    International Nuclear Information System (INIS)

    Vizcaya-Ruiz, A. de; Rivero-Mueller, A.; Ruiz-Ramirez, L.; Howarth, J.A.; Dobrota, M.

    2003-01-01

The in vivo toxicity of the novel copper-based anticancer agent casiopeina II (Cu(4,7-dimethyl-1,10-phenanthroline)(glycine)NO3) (CII) was investigated. Casiopeinas are a family of copper-coordinated complexes that have shown promising anticancer activity. The major toxic effect attributed to a single i.v. administration of CII (5 mg/kg dose) in the rat was a hemolytic anemia (reduced hemoglobin concentration (HB), red blood cell (RBC) count and packed cell volume (PCV), accompanied by a marked neutrophilic leukocytosis) at 12 h and 5 days after administration, attributed to direct erythrocyte damage. Increased reticulocyte levels and the presence of normoblasts in peripheral blood 5 days post-administration indicated an effective erythropoietic response, with recovery at 15 days. The increase in spleen weight and the morphological evidence of congestion of the red pulp (RP) with erythrocytes (E), resulting in a higher ratio of red to white pulp (WP), were consistent with increased uptake of damaged erythrocytes by the reticuloendothelial system, observed by histopathology and electron microscopy. Extramedullary hemopoiesis was markedly increased at 5 days, giving further evidence of a regenerative erythropoietic response with effective recovery by 15 days. Morphological changes in spleen cellularity were consistent with hematotoxicity, mainly a reduction of the red pulp/white pulp ratio, an increase in erythrocyte content at 12 h, and an infiltration of nucleated cells in the red pulp at 5 days, with a tendency towards recovery 15 days after administration. The erythrocyte damage is attributed to the generation of free radicals and oxidative damage to the membrane and within cells, resulting from the reduction of Cu(II) and the probable dissociation of the CII complex

  11. ELASTIC CLOUD COMPUTING ARCHITECTURE AND SYSTEM FOR HETEROGENEOUS SPATIOTEMPORAL COMPUTING

    Directory of Open Access Journals (Sweden)

    X. Shi

    2017-10-01

Full Text Available Spatiotemporal computation implements a variety of different algorithms. When big data are involved, a desktop computer or standalone application may not be able to complete the computation task due to limited memory and computing power. Now that a variety of hardware accelerators and computing platforms are available to improve the performance of geocomputation, different algorithms may have different behavior on different computing infrastructure and platforms. Some are perfect for implementation on a cluster of graphics processing units (GPUs), while GPUs may not be useful on certain kinds of spatiotemporal computation. This is the same situation in utilizing a cluster of Intel's many-integrated-core (MIC) or Xeon Phi, as well as Hadoop or Spark platforms, to handle big spatiotemporal data. Furthermore, considering the energy efficiency requirement in general computation, Field Programmable Gate Array (FPGA) may be a better solution for better energy efficiency when the performance of computation could be similar or better than GPUs and MICs. It is expected that an elastic cloud computing architecture and system that integrates all of GPUs, MICs, and FPGAs could be developed and deployed to support spatiotemporal computing over heterogeneous data types and computational problems.

  12. Elastic Cloud Computing Architecture and System for Heterogeneous Spatiotemporal Computing

    Science.gov (United States)

    Shi, X.

    2017-10-01

    Spatiotemporal computation implements a variety of different algorithms. When big data are involved, desktop computer or standalone application may not be able to complete the computation task due to limited memory and computing power. Now that a variety of hardware accelerators and computing platforms are available to improve the performance of geocomputation, different algorithms may have different behavior on different computing infrastructure and platforms. Some are perfect for implementation on a cluster of graphics processing units (GPUs), while GPUs may not be useful on certain kind of spatiotemporal computation. This is the same situation in utilizing a cluster of Intel's many-integrated-core (MIC) or Xeon Phi, as well as Hadoop or Spark platforms, to handle big spatiotemporal data. Furthermore, considering the energy efficiency requirement in general computation, Field Programmable Gate Array (FPGA) may be a better solution for better energy efficiency when the performance of computation could be similar or better than GPUs and MICs. It is expected that an elastic cloud computing architecture and system that integrates all of GPUs, MICs, and FPGAs could be developed and deployed to support spatiotemporal computing over heterogeneous data types and computational problems.

  13. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  14. Cloud Computing Fundamentals

    Science.gov (United States)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  15. International Conference of Intelligence Computation and Evolutionary Computation ICEC 2012

    CERN Document Server

    Intelligence Computation and Evolutionary Computation

    2013-01-01

    2012 International Conference of Intelligence Computation and Evolutionary Computation (ICEC 2012) is held on July 7, 2012 in Wuhan, China. This conference is sponsored by Information Technology & Industrial Engineering Research Center.  ICEC 2012 is a forum for presentation of new research results of intelligent computation and evolutionary computation. Cross-fertilization of intelligent computation, evolutionary computation, evolvable hardware and newly emerging technologies is strongly encouraged. The forum aims to bring together researchers, developers, and users from around the world in both industry and academia for sharing state-of-art results, for exploring new areas of research and development, and to discuss emerging issues facing intelligent computation and evolutionary computation.

  16. ZIVIS: A City Computing Platform Based on Volunteer Computing

    International Nuclear Information System (INIS)

    Antoli, B.; Castejon, F.; Giner, A.; Losilla, G.; Reynolds, J. M.; Rivero, A.; Sangiao, S.; Serrano, F.; Tarancon, A.; Valles, R.; Velasco, J. L.

    2007-01-01

Volunteer computing has emerged as a new form of distributed computing. Unlike other computing paradigms like Grids, which tend to be based on complex architectures, volunteer computing has demonstrated a great ability to integrate dispersed, heterogeneous computing resources with ease. This article presents ZIVIS, a project which aims to deploy a city-wide computing platform in Zaragoza (Spain). ZIVIS is based on BOINC (Berkeley Open Infrastructure for Network Computing), a popular open source framework to deploy volunteer and desktop grid computing systems. A scientific code which simulates the trajectories of particles moving inside a stellarator fusion device has been chosen as the pilot application of the project. In this paper we describe the approach followed to port the code to the BOINC framework as well as some novel techniques, based on standard Grid protocols, we have used to access the output data present in the BOINC server from a remote visualizer. (Author)

  17. Essentials of cloud computing

    CERN Document Server

    Chandrasekaran, K

    2014-01-01

Foreword. Preface. Computing Paradigms (Learning Objectives; Preamble; High-Performance Computing; Parallel Computing; Distributed Computing; Cluster Computing; Grid Computing; Cloud Computing; Biocomputing; Mobile Computing; Quantum Computing; Optical Computing; Nanocomputing; Network Computing; Summary; Review Points; Review Questions; Further Reading). Cloud Computing Fundamentals (Learning Objectives; Preamble; Motivation for Cloud Computing; The Need for Cloud Computing; Defining Cloud Computing; NIST Definition of Cloud Computing; Cloud Computing Is a Service; Cloud Computing Is a Platform; 5-4-3 Principles of Cloud Computing; Five Essential Charact...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is on increasing the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and on the full implementation of the xrootd federation ...

  19. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using GPUs for general-purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general-purpose computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (Stream), and the new framework, OpenCL, that tries to unify the GPGPU computing models.
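    As a rough illustration of the data-parallel model this abstract describes, the sketch below emulates CUDA-style thread indexing in plain Python: each logical thread computes one element of a SAXPY result from its (block, thread) coordinates. All names here are illustrative, not part of any vendor API.

```python
# Pure-Python sketch of the CUDA data-parallel model: every "thread"
# computes one output element, identified by its (block, thread)
# coordinates -- the same indexing a CUDA kernel uses.

def saxpy_kernel(block_idx, thread_idx, block_dim, a, x, y, out):
    """One logical GPU thread: out[i] = a * x[i] + y[i]."""
    i = block_idx * block_dim + thread_idx
    if i < len(x):                 # guard against out-of-range threads
        out[i] = a * x[i] + y[i]

def launch(grid_dim, block_dim, a, x, y):
    """Emulate a kernel launch by iterating over the whole thread grid."""
    out = [0.0] * len(x)
    for b in range(grid_dim):
        for t in range(block_dim):
            saxpy_kernel(b, t, block_dim, a, x, y, out)
    return out

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [10.0, 20.0, 30.0, 40.0, 50.0]
result = launch(grid_dim=2, block_dim=3, a=2.0, x=x, y=y)  # 6 threads cover 5 elements
```

    A real CUDA kernel would execute these logical threads concurrently on the GPU; the serial loops here only mimic the indexing scheme.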

  20. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and right to affirmation in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved along with the development of computer hardware and software. The practice-oriented interpretation of computational thinking, which is dominant among educators, is described along with some ways of its formation. It is shown that computational thinking is a metasubject result of general education as well as its tool. From the point of view of the author, purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described; this process is connected with the evolution of computer and information technologies, as well as with the increasing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  1. Fast computation of the characteristics method on vector computers

    International Nuclear Information System (INIS)

    Kugo, Teruhiko

    2001-11-01

    Fast computation of the characteristics method to solve the neutron transport equation in a heterogeneous geometry has been studied. Two vector computation algorithms, an odd-even sweep (OES) method and an independent sequential sweep (ISS) method, have been developed, and their efficiency for a typical fuel assembly calculation has been investigated. For both methods, a vector computation is 15 times faster than a scalar computation. Comparing the OES and ISS methods, the following is found: 1) there is only a small difference in computation speed, 2) the ISS method shows faster convergence, and 3) the ISS method saves about 80% of the computer memory required by the OES method. It is, therefore, concluded that the ISS method is superior to the OES method as a vectorization method. In the vector computation, a table-look-up method that reduces the computation time of the exponential function saves only 20% of the whole computation time. Both the coarse mesh rebalance method and the Aitken acceleration method are effective for accelerating the characteristics method; a combination of the two saves 70-80% of the outer iterations compared with free iteration. (author)
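    The table-look-up idea mentioned for the exponential function can be sketched as follows: precompute exp on a uniform grid once, then answer each later evaluation by linear interpolation. The grid range and spacing below are illustrative choices, not values from the paper.

```python
import math

# Table look-up for exp(x): one-time precomputation, then cheap
# interpolation instead of repeated math.exp calls.
XMIN, XMAX, N = -10.0, 0.0, 1000          # optical paths give negative exponents
STEP = (XMAX - XMIN) / N
TABLE = [math.exp(XMIN + i * STEP) for i in range(N + 1)]

def exp_lookup(x):
    """Approximate exp(x) for XMIN <= x <= XMAX by linear interpolation."""
    pos = (x - XMIN) / STEP
    i = min(int(pos), N - 1)
    frac = pos - i
    return TABLE[i] * (1.0 - frac) + TABLE[i + 1] * frac

# Worst-case interpolation error over the table range
max_err = max(abs(exp_lookup(-10.0 * k / 100) - math.exp(-10.0 * k / 100))
              for k in range(101))
```

    With this grid spacing the interpolation error stays below about 1e-4, which is usually negligible next to the discretization error of the transport sweep itself.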

  2. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed.Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn
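    As one concrete instance of the single tridiagonal linear systems the book covers, here is a minimal sketch of the serial Thomas algorithm, the O(n) baseline that vectorized schemes such as cyclic reduction reorganize for parallel machines. The code is illustrative, not taken from the book.

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system Ax = d, where a is the sub-diagonal,
    b the main diagonal, c the super-diagonal (a[0] and c[-1] unused).
    Serial forward elimination followed by back substitution."""
    n = len(b)
    cp = [0.0] * n
    dp = [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]          # eliminated pivot
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):           # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 3x3 system with known solution x = [1, 1, 1]
x = thomas(a=[0.0, 1.0, 1.0], b=[2.0, 2.0, 2.0],
           c=[1.0, 1.0, 0.0], d=[3.0, 4.0, 3.0])
```

    The data dependence of the forward sweep is exactly what parallel variants break up: cyclic reduction halves the system size at each step at the cost of extra arithmetic.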

  3. Natural Computing in Computational Finance Volume 4

    CERN Document Server

    O’Neill, Michael; Maringer, Dietmar

    2012-01-01

    This book follows on from Natural Computing in Computational Finance, Volumes I, II and III. As in the previous volumes of this series, the book consists of a series of chapters, each of which was selected following a rigorous, peer-reviewed selection process. The chapters illustrate the application of a range of cutting-edge natural computing and agent-based methodologies in computational finance and economics. The applications explored include option model calibration, financial trend reversal detection, enhanced indexation, algorithmic trading, corporate payout determination, agent-based modeling of liquidity costs, and trade strategy adaptation. While describing cutting-edge applications, the chapters are written so that they are accessible to a wide audience. Hence, they should be of interest to academics, students and practitioners in the fields of computational finance and economics.

  4. Computer technology and computer programming research and strategies

    CERN Document Server

    Antonakos, James L

    2011-01-01

    Covering a broad range of new topics in computer technology and programming, this volume discusses encryption techniques, SQL generation, Web 2.0 technologies, and visual sensor networks. It also examines reconfigurable computing, video streaming, animation techniques, and more. Readers will learn about an educational tool and game to help students learn computer programming. The book also explores a new medical technology paradigm centered on wireless technology and cloud computing designed to overcome the problems of increasing health technology costs.

  5. Role of adipocyte-derived lipoprotein lipase in adipocyte hypertrophy

    Directory of Open Access Journals (Sweden)

    Orlando Robert A

    2007-10-01

    Full Text Available Abstract Background A major portion of the fatty acids available for adipocyte uptake is derived from lipoprotein lipase (LPL)-mediated hydrolysis of circulating lipoprotein particles. In vivo studies aimed at identifying the precise role of adipocyte-derived LPL in the fat storage function of adipose tissue have been unable to provide conclusive evidence due to compensatory mechanisms that activate endogenous fatty acid synthesis. To address this gap in knowledge, we have measured the effect of reducing adipocyte LPL expression on intracellular lipid accumulation using a well-established cultured model of adipocyte differentiation. Methods siRNA specific for mouse LPL was transfected into 3T3-L1 adipocytes. Expression of LPL was measured by quantitative real-time PCR, and cell surface-associated LPL enzymatic activity was measured by colorimetric detection following substrate (p-nitrophenyl butyrate) hydrolysis. Apolipoprotein CII and CIII expression ratios were also measured by qRT-PCR. Intracellular lipid accumulation was quantified by Nile Red staining. Results During differentiation of 3T3-L1 pre-adipocytes, LPL mRNA expression increases 6-fold, resulting in a 2-fold increase in cell surface-associated LPL enzymatic activity. Parallel to this increase in LPL expression, we found that intracellular lipids increased ~10-fold, demonstrating a direct correlation between adipocyte-derived LPL expression and lipid storage. We next reduced LPL expression in adipocytes using siRNA transfections to directly quantify the contributions of adipocyte-derived LPL to lipid storage. This treatment reduced LPL mRNA expression and cell surface-associated LPL enzymatic activity to ~50% of non-treated controls, while intracellular lipid levels were reduced by 80%. Exogenous addition of purified LPL (to restore extracellular lipolytic activity) or palmitate (as a source of free fatty acids) to siRNA-treated cells restored intracellular lipid levels to those measured for non

  6. Cytoplasmic Copper Detoxification in Salmonella Can Contribute to SodC Metalation but Is Dispensable during Systemic Infection.

    Science.gov (United States)

    Fenlon, Luke A; Slauch, James M

    2017-12-15

    Salmonella enterica serovar Typhimurium is a leading cause of foodborne disease worldwide. Severe infections result from the ability of S. Typhimurium to survive within host immune cells, despite being exposed to various host antimicrobial factors. SodCI, a copper-zinc-cofactored superoxide dismutase, is required to defend against phagocytic superoxide. SodCII, an additional periplasmic superoxide dismutase, although produced during infection, does not function in the host. Previous studies suggested that CueP, a periplasmic copper binding protein, facilitates acquisition of copper by SodCII. CopA and GolT, both inner membrane ATPases that pump copper from the cytoplasm to the periplasm, are a source of copper for CueP. Using in vitro SOD assays, we found that SodCI can also utilize CueP to acquire copper. However, both SodCI and SodCII have a significant fraction of activity independent of CueP and cytoplasmic copper export. We utilized a series of mouse competition assays to address the in vivo role of CueP-mediated SodC activation. A copA golT cueP triple mutant was equally as competitive as the wild type, suggesting that sufficient SodCI is active to defend against phagocytic superoxide independent of CueP and cytoplasmic copper export. We also confirmed that a strain containing a modified SodCII, which is capable of complementing a sodCI deletion, was fully virulent in a copA golT cueP background competed against the wild type. These competitions also address the potential impact of cytoplasmic copper toxicity within the phagosome. Our data suggest that Salmonella does not encounter inhibitory concentrations of copper during systemic infection. IMPORTANCE Salmonella is a leading cause of gastrointestinal disease worldwide. In severe cases, Salmonella can cause life-threatening systemic infections, particularly in very young children, the elderly, or people who are immunocompromised. To cause disease, Salmonella must survive the hostile environment inside host

  7. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into computers in future in order to give them their components functionality. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  8. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  9. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

    Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly that it is very hard even for professionals to keep updated. Computer people do not

  10. The influence of cochlear implant electrode position on performance

    NARCIS (Netherlands)

    Marel, K.S. van der; Briaire, J.J.; Verbist, B.M.; Muurling, T.J.; Frijns, J.H.M.

    2015-01-01

    To study the relation between variables related to cochlear implant electrode position and speech perception performance scores in a large patient population. The study sample consisted of 203 patients implanted with a CII or HiRes90K implant with a HiFocus 1 or 1J electrode of Advanced Bionics.

  11. 76 FR 16422 - Medicare, Medicaid, and Children's Health Insurance Programs; Provider Enrollment Application Fee...

    Science.gov (United States)

    2011-03-23

    ...-physician practitioner organizations), CMS-855S or associated Internet-based PECOS enrollment application... calculated in accordance with the following: Section 1866(j)(2)(C)(i)(I) of the Social Security Act (the Act... period ending with June of the previous year. As stated in the Regulatory Impact Analysis section of the...

  12. 47 CFR 90.675 - Information exchange.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 5 2010-10-01 2010-10-01 false Information exchange. 90.675 Section 90.675... exchange. (a) Prior coordination. Public safety/CII licensees may notify an ESMR or part 22 Cellular... cell is activated. (c) Public safety information exchange. (1) Upon request by an ESMR or part 22...

  13. 75 FR 8428 - Submission for OMB Review; Comment Request

    Science.gov (United States)

    2010-02-24

    ... review and clearance under the Paperwork Reduction Act of 1995, Public Law 104-13 on or after the... through the Community Investment Impact System (CIIS). The QILR will help the CDFI Fund to meet its own... . Respondents: Private Sector: businesses or other for-profits, not- for-profit institutions. Estimated Total...

  14. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  15. Computer Series, 3: Computer Graphics for Chemical Education.

    Science.gov (United States)

    Soltzberg, Leonard J.

    1979-01-01

    Surveys the current scene in computer graphics from the point of view of a chemistry educator. Discusses the scope of current applications of computer graphics in chemical education, and provides information about hardware and software systems to promote communication with vendors of computer graphics equipment. (HM)

  16. Framework for Computation Offloading in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Dejan Kovachev

    2012-12-01

    Full Text Available The inherently limited processing power and battery lifetime of mobile phones hinder the execution of computationally intensive applications like content-based video analysis or 3D modeling. Offloading computationally intensive application parts from the mobile platform into a remote cloud infrastructure or nearby idle computers addresses this problem. This paper presents our Mobile Augmentation Cloud Services (MACS) middleware, which enables adaptive extension of Android application execution from a mobile client into the cloud. Applications are developed using the standard Android development pattern. The middleware does the heavy lifting of adaptive application partitioning, resource monitoring and computation offloading. These elastic mobile applications can run as usual mobile applications, but they can also use remote computing resources transparently. Two prototype applications using the MACS middleware demonstrate the benefits of the approach. The evaluation shows that applications which involve costly computations can benefit from offloading, with around 95% energy savings and significant performance gains compared to local execution only.
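    The offloading decision at the heart of such middleware can be caricatured with a simple cost model: offload when remote execution time plus transfer time beats local execution time. The function and the numbers below are illustrative assumptions, not the actual MACS policy.

```python
def should_offload(cycles, data_bytes, local_ips, cloud_ips, bandwidth_bps):
    """Offload when remote execution plus data transfer is faster than
    local execution. Deliberately simplified: real middleware also
    monitors battery, memory, and partition dependencies.
    cycles: instructions to execute; *_ips: instructions per second;
    bandwidth_bps: network bandwidth in bits per second."""
    t_local = cycles / local_ips
    t_remote = cycles / cloud_ips + data_bytes * 8 / bandwidth_bps
    return t_remote < t_local

# Heavy computation, small payload: offloading wins.
heavy = should_offload(cycles=5e9, data_bytes=1e5,
                       local_ips=1e8, cloud_ips=1e10, bandwidth_bps=1e7)
# Light computation, large payload: staying local wins.
light = should_offload(cycles=1e7, data_bytes=5e7,
                       local_ips=1e8, cloud_ips=1e10, bandwidth_bps=1e7)
```

    The same comparison, with energy per cycle and per transmitted bit in place of time, yields the energy-savings argument cited in the abstract.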

  17. Center for computer security: Computer Security Group conference. Summary

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-06-01

    Topics covered include: computer security management; detection and prevention of computer misuse; certification and accreditation; protection of computer security, perspective from a program office; risk analysis; secure accreditation systems; data base security; implementing R and D; key notarization system; DOD computer security center; the Sandia experience; inspector general's report; and backup and contingency planning. (GHT)

  18. Pascal-SC a computer language for scientific computation

    CERN Document Server

    Bohlender, Gerd; von Gudenberg, Jürgen Wolff; Rheinboldt, Werner; Siewiorek, Daniel

    1987-01-01

    Perspectives in Computing, Vol. 17: Pascal-SC: A Computer Language for Scientific Computation focuses on the application of Pascal-SC, a programming language developed as an extension of standard Pascal, in scientific computation. The publication first elaborates on the introduction to Pascal-SC, a review of standard Pascal, and real floating-point arithmetic. Discussions focus on optimal scalar product, standard functions, real expressions, program structure, simple extensions, real floating-point arithmetic, vector and matrix arithmetic, and dynamic arrays. The text then examines functions a

  19. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  20. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  1. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer groups has been reorganized to take charge for the general purpose computers DEC10 and VAX and the computer network (Dataswitch, DECnet, IBM - connections to GSI and IPP, preparation for Datex-P). (orig.)

  2. Mobile cloud computing for computation offloading: Issues and challenges

    Directory of Open Access Journals (Sweden)

    Khadija Akherfi

    2018-01-01

    Full Text Available Despite the evolution and enhancements that mobile devices have experienced, they are still considered limited computing devices. Today, users have become more demanding and expect to execute computationally intensive applications on their smartphone devices. Therefore, Mobile Cloud Computing (MCC) integrates mobile computing and Cloud Computing (CC) in order to extend the capabilities of mobile devices using offloading techniques. Computation offloading tackles limitations of Smart Mobile Devices (SMDs), such as limited battery lifetime, limited processing capabilities, and limited storage capacity, by offloading execution and workload to other, richer systems with better performance and resources. This paper presents the current offloading frameworks and computation offloading techniques, and analyzes them along with their main critical issues. In addition, it explores different important parameters based on which the frameworks are implemented, such as the offloading method and the level of partitioning. Finally, it summarizes the issues in offloading frameworks in the MCC domain that require further research.

  3. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

    Full Text Available Cloud computing was, and will continue to be, a new way of providing Internet services and computing. This approach builds on many existing services, such as the Internet, grid computing and Web services. Cloud computing as a system aims to provide on-demand services that are more acceptable in price and infrastructure. It is precisely the transition from the computer as a product to a service offered to consumers online. This paper is meant to describe the quality of cloud computing services, analyzing the advantages and characteristics offered by it. It is a theoretical paper. Keywords: Cloud computing, QoS, quality of cloud computing

  4. Building a cluster computer for the computing grid of tomorrow

    International Nuclear Information System (INIS)

    Wezel, J. van; Marten, H.

    2004-01-01

    The Grid Computing Centre Karlsruhe takes part in the development, test and deployment of hardware and cluster infrastructure, grid computing middleware, and applications for particle physics. The construction of a large cluster computer with thousands of nodes and several PB data storage capacity is a major task and focus of research. CERN based accelerator experiments will use GridKa, one of only 8 world wide Tier-1 computing centers, for its huge computer demands. Computing and storage is provided already for several other running physics experiments on the exponentially expanding cluster. (orig.)

  5. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C{sup 3}P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C{sup 3}P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high-performance computing facility based exclusively on parallel computers. While the initial focus of C{sup 3}P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  6. Pervasive Computing

    NARCIS (Netherlands)

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  7. New computational paradigms changing conceptions of what is computable

    CERN Document Server

    Cooper, SB; Sorbi, Andrea

    2007-01-01

    This superb exposition of a complex subject examines new developments in the theory and practice of computation from a mathematical perspective. It covers topics ranging from classical computability to complexity, from biocomputing to quantum computing.

  8. Computing at Stanford.

    Science.gov (United States)

    Feigenbaum, Edward A.; Nielsen, Norman R.

    1969-01-01

    This article provides a current status report on the computing and computer science activities at Stanford University, focusing on the Computer Science Department, the Stanford Computation Center, the recently established regional computing network, and the Institute for Mathematical Studies in the Social Sciences. Also considered are such topics…

  9. Mutation spectrum in FE1-MUTA(TM) Mouse lung epithelial cells exposed to nanoparticulate carbon black

    DEFF Research Database (Denmark)

    Jacobsen, Nicklas Raun; White, Paul A; Gingerich, John

    2011-01-01

    It has been shown previously that carbon black (CB), Printex 90 exposure induces cII and lacZ mutants in the FE1-Muta(TM) Mouse lung epithelial cell line and causes oxidatively damaged DNA and the production of reactive oxygen species (ROS). The purpose of this study was to determine the mutation...

  10. Mechanical behavior of the ascending aorta: experimental characterization and numerical simulation.

    OpenAIRE

    GARCIA HERRERA, CLAUDIO

    2008-01-01

    In this work, an experimental and numerical characterization of the mechanical behavior of the human aortic wall is carried out. The importance of this topic is highlighted, owing to the growing interest in knowing the properties and the mechanical response of the aorta. 193 p.

  11. Preparation of imidazo[1,2-c]pyrimidinones from a chloropyrimidine and an electron poor ω-allylic amine

    OpenAIRE

    Heaney, Frances; Bourke, Sharon; Burke, Cathriona; Cunningham, Desmond; McArdle, Patrick

    1998-01-01

    Synthesis of the title ring system by a sequential intermolecular nucleophilic displacement and intramolecular ‘conjugate’ addition has been achieved both as a one pot and as a stepwise procedure; an X-ray structure determination has been carried out to distinguish between the possible isomeric structures 8c-I and 8c-II.

  12. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  13. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber
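    As a tiny example of the geometrical collision detection mentioned above, pedestrian simulations commonly model agents as discs and flag a collision when the distance between centers falls below the sum of the radii. The sketch below, with illustrative names, shows that test.

```python
import math

# Disc-disc collision test: two agents collide when their centers are
# closer than the sum of their radii.

def discs_collide(center_a, radius_a, center_b, radius_b):
    dx = center_b[0] - center_a[0]
    dy = center_b[1] - center_a[1]
    return math.hypot(dx, dy) < radius_a + radius_b

close = discs_collide((0.0, 0.0), 0.5, (0.8, 0.0), 0.5)   # overlapping discs
apart = discs_collide((0.0, 0.0), 0.5, (2.0, 0.0), 0.5)   # well separated
```

    Avoidance steering then typically projects this same test forward along each agent's velocity to detect collisions before they occur.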

  14. Abstract quantum computing machines and quantum computational logics

    Science.gov (United States)

    Chiara, Maria Luisa Dalla; Giuntini, Roberto; Sergioli, Giuseppe; Leporini, Roberto

    2016-06-01

    Classical and quantum parallelism are deeply different, although it is sometimes claimed that quantum Turing machines are nothing but special examples of classical probabilistic machines. We introduce the concepts of deterministic state machine, classical probabilistic state machine and quantum state machine. On this basis, we discuss the question: To what extent can quantum state machines be simulated by classical probabilistic state machines? Each state machine is devoted to a single task determined by its program. Real computers, however, behave differently, being able to solve different kinds of problems. This capacity can be modeled, in the quantum case, by the mathematical notion of abstract quantum computing machine, whose different programs determine different quantum state machines. The computations of abstract quantum computing machines can be linguistically described by the formulas of a particular form of quantum logic, termed quantum computational logic.
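The deterministic/probabilistic distinction drawn in this abstract can be made concrete with a toy sketch. The parity machine and all names below are invented for illustration and are not taken from the paper:

```python
import random

def run_deterministic(delta, start, word):
    """Deterministic state machine: delta maps (state, symbol) -> state."""
    state = start
    for sym in word:
        state = delta[(state, sym)]
    return state

def run_probabilistic(delta, start, word, rng=None):
    """Classical probabilistic state machine: delta maps (state, symbol)
    to a distribution {next_state: probability}; each step samples the
    next state from that distribution."""
    rng = rng or random.Random(0)
    state = start
    for sym in word:
        dist = delta[(state, sym)]
        state = rng.choices(list(dist), weights=list(dist.values()))[0]
    return state

# Deterministic example: final state records whether the input word
# contains an even number of 1s.
parity = {("even", 0): "even", ("even", 1): "odd",
          ("odd", 0): "odd", ("odd", 1): "even"}
print(run_deterministic(parity, "even", [1, 0, 1]))  # -> even
```

A quantum state machine would replace the probability distribution with a unitary acting on amplitudes, which is where the simulation question posed in the abstract becomes nontrivial.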

  15. Applied Parallel Computing Industrial Computation and Optimization

    DEFF Research Database (Denmark)

Madsen, Kaj; Olesen, Dorte

Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96).

  16. Illustrated computer tomography

    International Nuclear Information System (INIS)

    Takahashi, S.

    1983-01-01

    This book provides the following information: basic aspects of computed tomography; atlas of computed tomography of the normal adult; clinical application of computed tomography; and radiotherapy planning and computed tomography

  17. Computational Pathology

    Science.gov (United States)

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  18. COMPUTER-ASSISTED ACCOUNTING

    Directory of Open Access Journals (Sweden)

    SORIN-CIPRIAN TEIUŞAN

    2009-01-01

Full Text Available What is computer-assisted accounting? What are the place and role of the computer in financial-accounting activity? What are the position and importance of the computer in the accountant's work? All these are questions that require scientific research in order to find the answers. The paper approaches the issue of the support the computer grants to the accountant in organizing and managing the accounting activity. Starting from the notions of accounting and computer, the concept of computer-assisted accounting is introduced; it has a general character and refers to accounting performed with the help of the computer, or to using the computer to automate the procedures performed by the person doing the accounting; it is a concept used to define the computer applications of the accounting activity. The arguments for using the computer to assist accounting concern the informatization of accounting, the automation of financial-accounting activities and the endowment of contemporary accounting with modern technology.

  19. Engineering computations at the national magnetic fusion energy computer center

    International Nuclear Information System (INIS)

    Murty, S.

    1983-01-01

    The National Magnetic Fusion Energy Computer Center (NMFECC) was established by the U.S. Department of Energy's Division of Magnetic Fusion Energy (MFE). The NMFECC headquarters is located at Lawrence Livermore National Laboratory. Its purpose is to apply large-scale computational technology and computing techniques to the problems of controlled thermonuclear research. In addition to providing cost effective computing services, the NMFECC also maintains a large collection of computer codes in mathematics, physics, and engineering that is shared by the entire MFE research community. This review provides a broad perspective of the NMFECC, and a list of available codes at the NMFECC for engineering computations is given

  20. Reversible computing fundamentals, quantum computing, and applications

    CERN Document Server

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique.Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergr

  1. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  2. Touchable Computing: Computing-Inspired Bio-Detection.

    Science.gov (United States)

    Chen, Yifan; Shi, Shaolong; Yao, Xin; Nakano, Tadashi

    2017-12-01

    We propose a new computing-inspired bio-detection framework called touchable computing (TouchComp). Under the rubric of TouchComp, the best solution is the cancer to be detected, the parameter space is the tissue region at high risk of malignancy, and the agents are the nanorobots loaded with contrast medium molecules for tracking purpose. Subsequently, the cancer detection procedure (CDP) can be interpreted from the computational optimization perspective: a population of externally steerable agents (i.e., nanorobots) locate the optimal solution (i.e., cancer) by moving through the parameter space (i.e., tissue under screening), whose landscape (i.e., a prescribed feature of tissue environment) may be altered by these agents but the location of the best solution remains unchanged. One can then infer the landscape by observing the movement of agents by applying the "seeing-is-sensing" principle. The term "touchable" emphasizes the framework's similarity to controlling by touching the screen with a finger, where the external field for controlling and tracking acts as the finger. Given this analogy, we aim to answer the following profound question: can we look to the fertile field of computational optimization algorithms for solutions to achieve effective cancer detection that are fast, accurate, and robust? Along this line of thought, we consider the classical particle swarm optimization (PSO) as an example and propose the PSO-inspired CDP, which differs from the standard PSO by taking into account realistic in vivo propagation and controlling of nanorobots. Finally, we present comprehensive numerical examples to demonstrate the effectiveness of the PSO-inspired CDP for different blood flow velocity profiles caused by tumor-induced angiogenesis. The proposed TouchComp bio-detection framework may be regarded as one form of natural computing that employs natural materials to compute.
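The classical PSO that the proposed CDP builds on can be sketched generically. This is a minimal, standard PSO for function minimization; the parameter values and the sphere test function are illustrative, and the in vivo propagation and control constraints of the PSO-inspired CDP are not modelled here:

```python
import random

def pso_minimize(f, dim, n_particles=20, iters=100, bounds=(-5.0, 5.0),
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal standard particle swarm optimization (minimization)."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                # per-particle best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Inertia + cognitive pull (own best) + social pull (swarm best).
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Locate the minimum of the sphere function (optimum at the origin).
best, best_val = pso_minimize(lambda x: sum(v * v for v in x), dim=2)
```

In the detection analogy of the abstract, `f` plays the role of the tissue landscape and the particles stand in for the externally steered nanorobots.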

  3. International Conference on Computer, Communication and Computational Sciences

    CERN Document Server

    Mishra, Krishn; Tiwari, Shailesh; Singh, Vivek

    2017-01-01

    Exchange of information and innovative ideas are necessary to accelerate the development of technology. With advent of technology, intelligent and soft computing techniques came into existence with a wide scope of implementation in engineering sciences. Keeping this ideology in preference, this book includes the insights that reflect the ‘Advances in Computer and Computational Sciences’ from upcoming researchers and leading academicians across the globe. It contains high-quality peer-reviewed papers of ‘International Conference on Computer, Communication and Computational Sciences (ICCCCS 2016), held during 12-13 August, 2016 in Ajmer, India. These papers are arranged in the form of chapters. The content of the book is divided into two volumes that cover variety of topics such as intelligent hardware and software design, advanced communications, power and energy optimization, intelligent techniques used in internet of things, intelligent image processing, advanced software engineering, evolutionary and ...

  4. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  5. Hardware for soft computing and soft computing for hardware

    CERN Document Server

    Nedjah, Nadia

    2014-01-01

Single and Multi-Objective Evolutionary Computation (MOEA), Genetic Algorithms (GAs), Artificial Neural Networks (ANNs), Fuzzy Controllers (FCs), Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) are becoming omnipresent in almost every intelligent system design. Unfortunately, the application of the majority of these techniques is complex and so requires a huge computational effort to yield useful and practical results. Therefore, dedicated hardware for evolutionary, neural and fuzzy computation is a key issue for designers. With the spread of reconfigurable hardware such as FPGAs, digital as well as analog hardware implementations of such computation become cost-effective. The idea behind this book is to offer a variety of hardware designs for soft computing techniques that can be embedded in any final product, and to introduce the successful application of soft computing techniques to solve many hard problems encountered during the design of embedded hardware. Reconfigurable em...

  6. 3rd International Conference on Computational Mathematics and Computational Geometry

    CERN Document Server

    Ravindran, Anton

    2016-01-01

This volume presents original research contributed to the 3rd Annual International Conference on Computational Mathematics and Computational Geometry (CMCGS 2014), organized and administered by Global Science and Technology Forum (GSTF). Computational Mathematics and Computational Geometry are closely related subjects, but are often studied by separate communities and published in different venues. This volume is unique in its combination of these topics. After the conference, which took place in Singapore, selected contributions were chosen for this volume and peer-reviewed. The section on Computational Mathematics contains papers that are concerned with developing new and efficient numerical algorithms for mathematical sciences or scientific computing. They also cover analysis of such algorithms to assess accuracy and reliability. The parts of this project that are related to Computational Geometry aim to develop effective and efficient algorithms for geometrical applications such as representation and computati...

  7. COMPUTATIONAL SCIENCE CENTER

    International Nuclear Information System (INIS)

    DAVENPORT, J.

    2006-01-01

Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together

  8. Computational Biology and High Performance Computing 2000

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  10. Usage of Cloud Computing Simulators and Future Systems For Computational Research

    OpenAIRE

    Lakshminarayanan, Ramkumar; Ramalingam, Rajasekar

    2016-01-01

Cloud Computing is Internet-based computing, whereby shared resources, software and information are provided to computers and devices on demand, like the electricity grid. Currently, IaaS (Infrastructure as a Service), PaaS (Platform as a Service) and SaaS (Software as a Service) are used as business models for Cloud Computing. Nowadays, the adoption and deployment of Cloud Computing is increasing in various domains, forcing researchers to conduct research in the area of Cloud Computing ...

  11. Future Computer Requirements for Computational Aerodynamics

    Science.gov (United States)

    1978-01-01

    Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.

  12. Perbandingan Kemampuan Embedded Computer dengan General Purpose Computer untuk Pengolahan Citra

    Directory of Open Access Journals (Sweden)

    Herryawan Pujiharsono

    2017-08-01

Full Text Available Advances in computer technology have allowed image processing to be widely developed to assist people in many fields of work. However, not every field of work can be served by image processing, because some settings do not support the use of a computer; this has driven the development of image processing on dedicated microcontrollers or microprocessors. Progress in microcontrollers and microprocessors now makes it possible to develop image processing on an embedded computer, or single board computer (SBC). This study aims to test the image-processing capability of an embedded computer and to compare the results with a general purpose computer. The test was carried out by measuring the execution time of four image-processing operations applied to ten image sizes. The results of this study show that the execution-time optimization of the embedded computer is better than that of the general purpose computer, with the embedded computer's average execution time being 4-5 times the execution time of the general purpose computer, and the largest image size that does not place too great a load on the CPU being 256x256 pixels for the embedded computer and 400x300 pixels for the general purpose computer.
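The measurement procedure used in the study, timing an image operation across a range of image sizes, can be sketched generically. The inversion operation and the two sizes below are placeholders, not the study's actual four operations or ten sizes:

```python
import time

def benchmark(op, sizes, repeats=3):
    """Time an image operation over several image sizes; for each size,
    keep the best of `repeats` runs to reduce timing noise."""
    results = {}
    for w, h in sizes:
        # Synthetic grayscale image: h rows of w pixel values in 0..255.
        img = [[(x * y) % 256 for x in range(w)] for y in range(h)]
        best = float("inf")
        for _ in range(repeats):
            t0 = time.perf_counter()
            op(img)
            best = min(best, time.perf_counter() - t0)
        results[(w, h)] = best
    return results

def invert(img):
    # Simple point operation: intensity inversion.
    return [[255 - px for px in row] for row in img]

times = benchmark(invert, [(256, 256), (400, 300)])
```

Running the same harness on both an SBC and a desktop machine and comparing `times` per size reproduces the shape of the study's comparison.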

  13. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  14. Quantum analogue computing.

    Science.gov (United States)

    Kendon, Vivien M; Nemoto, Kae; Munro, William J

    2010-08-13

    We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.
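The cost argument above, that in a direct (analogue) encoding an extra bit of precision doubles the machine size while a binary encoding grows only linearly, can be illustrated with a toy resource count. Counting "components" as one per distinguishable level (analogue) versus one per bit (binary) is an assumption made here for illustration:

```python
def analogue_components(bits):
    # Direct encoding must physically resolve 2**bits distinct levels.
    return 2 ** bits

def binary_components(bits):
    # Binary encoding: one extra bit of precision adds one component.
    return bits

for b in (8, 9, 16):
    print(b, binary_components(b), analogue_components(b))
```

At 16 bits the analogue count is already 65536 components against 16, which is the exponential gap behind the precision and error-correction concerns raised in the abstract.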

  15. Regional projection of climate impact indices over the Mediterranean region

    Science.gov (United States)

Casanueva, Ana; Frías, M. Dolores; Herrera, Sixto; Bedia, Joaquín; San Martín, Daniel; Gutiérrez, José Manuel; Zaninovic, Ksenija

    2014-05-01

Climate Impact Indices (CIIs) are being increasingly used in different socioeconomic sectors to transfer information about climate change impacts and risks to stakeholders. CIIs are typically based on different weather variables such as temperature, wind speed, precipitation or humidity and comprise, in a single index, the relevant meteorological information for the particular impact sector (in this study wildfires and tourism). This dependence on several climate variables poses important limitations to the application of statistical downscaling techniques, since physical consistency among variables is required in most cases to obtain reliable local projections. The present study assesses the suitability of the "direct" downscaling approach, in which the downscaling method is directly applied to the CII. In particular, for illustrative purposes, we consider two popular indices used in the wildfire and tourism sectors, the Fire Weather Index (FWI) and the Physiological Equivalent Temperature (PET), respectively. As an example, two case studies are analysed over two representative Mediterranean regions of interest for the EU CLIM-RUN project: continental Spain for the FWI and Croatia for the PET. Results obtained with this "direct" downscaling approach are similar to those found from the application of the statistical downscaling to the individual meteorological drivers prior to the index calculation ("component" downscaling); thus, a wider range of statistical downscaling methods could be used. As an illustration, future changes in both indices are projected by applying two direct statistical downscaling methods, analogs and linear regression, to the ECHAM5 model. Larger differences were found between the two direct statistical downscaling approaches than between the direct and the component approaches with a single downscaling method. While these examples focus on particular indices and Mediterranean regions of interest for CLIM-RUN stakeholders, the same study
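Of the two direct methods mentioned, the analog method is the simpler to sketch: for each target large-scale state, take the index value observed locally on the most similar day in the historical record. A minimal, hypothetical version follows; the predictor variables, values and Euclidean distance metric are invented for illustration and are not the study's actual configuration:

```python
import math

def analog_downscale(target, predictors, local_obs):
    """Analog downscaling sketch: return the local observation from the
    historical day whose large-scale predictor vector is nearest
    (Euclidean distance) to the target state."""
    best_i = min(range(len(predictors)),
                 key=lambda i: math.dist(target, predictors[i]))
    return local_obs[best_i]

# Hypothetical record: daily predictor vectors (e.g. temperature,
# humidity) paired with the locally observed index value (e.g. FWI).
predictors = [(15.0, 0.60), (28.0, 0.20), (22.0, 0.45)]
local_fwi = [4.2, 31.5, 12.8]

print(analog_downscale((27.0, 0.25), predictors, local_fwi))  # -> 31.5
```

Applying this directly to the CII (direct approach) versus to each meteorological driver before recomputing the index (component approach) is exactly the comparison the study makes.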

  16. Linking xylem water storage with anatomical parameters in five temperate tree species.

    Science.gov (United States)

    Jupa, Radek; Plavcová, Lenka; Gloser, Vít; Jansen, Steven

    2016-06-01

The release of water from storage compartments to the transpiration stream is an important functional mechanism that provides the buffering of sudden fluctuations in water potential. The ability of tissues to release water per change in water potential, referred to as hydraulic capacitance, is assumed to be associated with the anatomy of storage tissues. However, information about how specific anatomical parameters determine capacitance is limited. In this study, we measured sapwood capacitance (C) in terminal branches and roots of five temperate tree species (Fagus sylvatica L., Picea abies L., Quercus robur L., Robinia pseudoacacia L., Tilia cordata Mill.). Capacitance was calculated separately for water released mainly from capillary (CI; open vessels, tracheids, fibres, intercellular spaces and cracks) and elastic storage compartments (CII; living parenchyma cells), corresponding to two distinct phases of the moisture release curve. We found that C was generally higher in roots than branches, with CI being 3-11 times higher than CII. Sapwood density and the ratio of dead to living xylem cells were most closely correlated with C. In addition, the magnitude of CI was strongly correlated with fibre/tracheid lumen area, whereas CII was highly dependent on the thickness of axial parenchyma cell walls. Our results indicate that water released from capillary compartments predominates over water released from elastic storage in both branches and roots, suggesting the limited importance of parenchyma cells for water storage in juvenile xylem of temperate tree species. Contrary to intact organs, water released from open conduits in our small wood samples significantly increased CI at relatively high water potentials. Linking anatomical parameters with the hydraulic capacitance of a tissue contributes to a better understanding of water release mechanisms and their implications for plant hydraulics. © The Author 2016. Published by Oxford University Press. All rights

  17. Seawater pH Predicted for the Year 2100 Affects the Metabolic Response to Feeding in Copepodites of the Arctic Copepod Calanus glacialis.

    Science.gov (United States)

    Thor, Peter; Bailey, Allison; Halsband, Claudia; Guscelli, Ella; Gorokhova, Elena; Fransson, Agneta

    2016-01-01

Widespread ocean acidification (OA) is transforming the chemistry of the global ocean, and the Arctic is recognised as a region where the earliest and strongest impacts of OA are expected. In the present study, metabolic effects of OA and its interaction with food availability were investigated in Calanus glacialis from the Kongsfjord, West Spitsbergen. We measured metabolic rates and RNA/DNA ratios (an indicator of biosynthesis) concurrently in fed and unfed individuals of copepodite stages CII-CIII and CV subjected to two different pH levels representative of present day and the "business as usual" IPCC scenario (RCP8.5) prediction for the year 2100. The copepods responded more strongly to changes in food level than to decreasing pH, both with respect to metabolic rate and RNA/DNA ratio. However, significant interactions between effects of pH and food level showed that effects of pH and food level act in synergy in copepodites of C. glacialis. While metabolic rates in copepodites of stage CII-CIII increased by 78% as a response to food under present day conditions (high pH), the increase was 195% in CII-CIIIs kept at low pH, a 2.5 times greater increase. This interaction was absent for RNA/DNA, so the increase in metabolic rates was clearly not a reaction to changing biosynthesis at low pH per se but rather a reaction to increased metabolic costs per unit of biosynthesis. Interestingly, we did not observe this difference in costs of growth in stage CV. A 2.5 times increase in metabolic costs of growth will leave the copepodites with much less energy for growth. This may imply significant changes to the C. glacialis population during future OA.

  18. Computers for imagemaking

    CERN Document Server

    Clark, D

    1981-01-01

    Computers for Image-Making tells the computer non-expert all he needs to know about Computer Animation. In the hands of expert computer engineers, computer picture-drawing systems have, since the earliest days of computing, produced interesting and useful images. As a result of major technological developments since then, it no longer requires the expert's skill to draw pictures; anyone can do it, provided they know how to use the appropriate machinery. This collection of specially commissioned articles reflects the diversity of user applications in this expanding field

  19. Quantum computer science

    CERN Document Server

    Lanzagorta, Marco

    2009-01-01

    In this text we present a technical overview of the emerging field of quantum computation along with new research results by the authors. What distinguishes our presentation from that of others is our focus on the relationship between quantum computation and computer science. Specifically, our emphasis is on the computational model of quantum computing rather than on the engineering issues associated with its physical implementation. We adopt this approach for the same reason that a book on computer programming doesn't cover the theory and physical realization of semiconductors. Another distin

  20. Polymorphous computing fabric

    Science.gov (United States)

    Wolinski, Christophe Czeslaw [Los Alamos, NM; Gokhale, Maya B [Los Alamos, NM; McCabe, Kevin Peter [Los Alamos, NM

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  1. Intracochlear Position of Cochlear Implants Determined Using CT Scanning versus Fitting Levels: Higher Threshold Levels at Basal Turn

    NARCIS (Netherlands)

    Beek, F.B. van der; Briaire, J.J.; Marel, K.S. van der; Verbist, B.M.; Frijns, J.H.

    2016-01-01

    OBJECTIVES: In this study, the effects of the intracochlear position of cochlear implants on the clinical fitting levels were analyzed. DESIGN: A total of 130 adult subjects who used a CII/HiRes 90K cochlear implant with a HiFocus 1/1J electrode were included in the study. The insertion angle and

  2. Resonance – Journal of Science Education | News

    Indian Academy of Sciences (India)

    Ed Lorenz: Father of the 'Butterfly Effect' (G Ambika). General Articles: Multivariable Chinese Remainder Theorem (B Sury, pp. 206-216); C-II Acid and the Stereochemistry of Abietic Acid (S N Balasubrahmanyam, pp. 217-234).

  3. Evaluation of the CAARS Infrequency Index for the Detection of Noncredible ADHD Symptom Report in Adulthood

    Science.gov (United States)

    Fuermaier, Anselm B. M.; Tucha, Lara; Koerts, Janneke; Weisbrod, Matthias; Grabemann, Marco; Zimmermann, Marco; Mette, Christian; Aschenbrenner, Steffen; Tucha, Oliver

    2016-01-01

    The reliance on self-reports in detecting noncredible symptom report of attention-deficit/hyperactivity disorder in adulthood (aADHD) has been questioned due to findings showing that symptoms can easily be feigned on self-report scales. In response, Suhr and colleagues developed an infrequency index for the Conners' Adult ADHD Rating Scale (CII)…

  4. Cellulose pretreatment with 1-n-butyl-3-methylimidazolium chloride for solid acid-catalyzed hydrolysis.

    Science.gov (United States)

    Kim, Soo-Jin; Dwiatmoko, Adid Adep; Choi, Jae Wook; Suh, Young-Woong; Suh, Dong Jin; Oh, Moonhyun

    2010-11-01

    This study focused on developing a cellulose pretreatment process using 1-n-butyl-3-methylimidazolium chloride ([bmim]Cl) for subsequent hydrolysis over Nafion(R) NR50. Thus, several pretreatment variables, such as the pretreatment period and temperature and the [bmim]Cl amount, were varied. Additionally, the [bmim]Cl-treated cellulose samples were characterized by X-ray diffraction analysis, and their crystallinity index values, including CI(XD), CI(XD-CI) and CI(XD-CII), were then calculated. When correlated with these values, the concentrations of total reducing sugars (TRS) obtained by the pretreatment of native cellulose (NC) and of glucose produced by the hydrolysis reaction were found to show a distinct relationship with the [CI(NC)-CI(XD)] and CI(XD-CII) values, respectively. Consequently, the role of the cellulose pretreatment step with [bmim]Cl is to loosen the crystalline cellulose through partial transformation of cellulose I to cellulose II and thereby promote TRS release, while the subsequent hydrolysis of [bmim]Cl-treated cellulose over Nafion(R) NR50 is effective in converting cellulose II to glucose. Copyright 2010 Elsevier Ltd. All rights reserved.
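
    The abstract does not state how the CI(XD) values were computed; a widely used choice for XRD crystallinity of cellulose is the Segal peak-height method, sketched below under that assumption (the function name, peak positions and intensity figures are illustrative, not taken from the paper):

```python
def segal_ci(i_200, i_am):
    """Segal peak-height crystallinity index in percent, assuming:
    i_200 -- intensity of the (200) crystalline reflection of cellulose I
             (near 22.6 deg 2-theta for Cu K-alpha radiation)
    i_am  -- amorphous-halo intensity (near 18 deg 2-theta)."""
    return 100.0 * (i_200 - i_am) / i_200

# Illustrative intensities: a native cellulose vs. an ionic-liquid-treated
# sample whose weakened (200) contrast yields a lower index.
ci_native = segal_ci(1000.0, 200.0)    # 80.0
ci_treated = segal_ci(1000.0, 550.0)   # 45.0
```

    A drop in such an index after [bmim]Cl treatment would be consistent with the partial cellulose I to cellulose II transformation the authors describe.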

  5. Traumatic dissection of the internal carotid artery and cerebral embolism in a football player: report of the first case

    Directory of Open Access Journals (Sweden)

    Luisa S. Talledo Paredes

    2012-03-01

    Full Text Available We report the case of an 18-year-old man who suffered a cervical contusion from the impact of a football. Three days later he was hospitalized with right hemiplegia and expressive aphasia. Brain MRI showed a left ischemic lesion in the basal ganglia. Carotid Doppler revealed a thrombus extending from the origin of the left common carotid artery into the left internal carotid artery (CII), and CT angiography documented a CII dissection (type IV traumatic carotid injury). Anticoagulation with warfarin and physical rehabilitation were chosen, with a favorable outcome. The incidence of traumatic carotid injury is very low, and it is even rarer in football players; its treatment remains controversial, which is why we consider this case report to be of interest.

  6. Know Your Personal Computer Introduction to Computers

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 1; Issue 1. Know Your Personal Computer Introduction to Computers. Siddhartha Kumar Ghoshal. Series Article Volume 1 Issue 1 January 1996 pp 48-55. Fulltext. Click here to view fulltext PDF. Permanent link:

  7. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year in which the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing, Software and Analysis challenge (CSA06) last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, which collects user experience and feedback during analysis activities and develops tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann and P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  10. Parallel computers and three-dimensional computational electromagnetics

    International Nuclear Information System (INIS)

    Madsen, N.K.

    1994-01-01

    The authors have continued to enhance their ability to use new massively parallel processing computers to solve time-domain electromagnetic problems. New vectorization techniques have improved the performance of their code DSI3D by factors of 5 to 15, depending on the computer used. New radiation boundary conditions and far-field transformations now allow the computation of radar cross-section values for complex objects. A new parallel-data extraction code has been developed that allows the extraction of data subsets from large problems, which have been run on parallel computers, for subsequent post-processing on workstations with enhanced graphics capabilities. A new charged-particle-pushing version of DSI3D is under development. Finally, DSI3D has become a focal point for several new Cooperative Research and Development Agreement activities with industrial companies such as Lockheed Advanced Development Company, Varian, Hughes Electron Dynamics Division, General Atomic, and Cray.

  11. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2006-11-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to...

  12. Attitudes towards Computer and Computer Self-Efficacy as Predictors of Preservice Mathematics Teachers' Computer Anxiety

    Science.gov (United States)

    Awofala, Adeneye O. A.; Akinoso, Sabainah O.; Fatade, Alfred O.

    2017-01-01

    The study investigated attitudes towards computer and computer self-efficacy as predictors of computer anxiety among 310 preservice mathematics teachers from five higher institutions of learning in Lagos and Ogun States of Nigeria using the quantitative research method within the blueprint of the descriptive survey design. Data collected were…

  13. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. 
The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from...
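
    The review's path from classical error-correcting codes to quantum error correction begins with the repetition code; the minimal classical sketch below (not from the review) shows the majority-vote idea that quantum error-correcting codes generalize:

```python
def encode(bit, n=3):
    """Repetition code: protect one bit by sending n copies."""
    return [bit] * n

def noisy(codeword, flip_positions):
    """Channel model: flip the bits at the given positions."""
    return [b ^ 1 if i in flip_positions else b for i, b in enumerate(codeword)]

def decode(codeword):
    """Majority vote recovers the bit when fewer than half the copies flip."""
    return 1 if sum(codeword) * 2 > len(codeword) else 0

assert decode(noisy(encode(1), {0})) == 1     # one flip: corrected
assert decode(noisy(encode(1), {0, 1})) == 0  # two flips: beyond the code's power
```

    The quantum analogue cannot simply copy states (no-cloning), which is why entangling the logical qubit with ancilla qubits replaces naive repetition in quantum error correction.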

  14. Quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. 
The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from...

  15. Seventh Medical Image Computing and Computer Assisted Intervention Conference (MICCAI 2012)

    CERN Document Server

    Miller, Karol; Nielsen, Poul; Computational Biomechanics for Medicine: Models, Algorithms and Implementation

    2013-01-01

    One of the greatest challenges for mechanical engineers is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, biomedical sciences, and medicine. This book is an opportunity for computational biomechanics specialists to present and exchange opinions on the opportunities of applying their techniques to computer-integrated medicine. Computational Biomechanics for Medicine: Models, Algorithms and Implementation collects the papers from the Seventh Computational Biomechanics for Medicine Workshop held in Nice in conjunction with the Medical Image Computing and Computer Assisted Intervention conference. The topics covered include: medical image analysis, image-guided surgery, surgical simulation, surgical intervention planning, disease prognosis and diagnostics, injury mechanism analysis, implant and prostheses design, and medical robotics.

  16. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Computation and today’s microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels... Both Spatial Computation and microkernels break away a relatively monolithic architecture into individual lightweight pieces, well specialized... for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address...

  17. Computing Nash equilibria through computational intelligence methods

    Science.gov (United States)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, and differential evolution, to compute Nash equilibria of finite strategic games as global minima of a real-valued, nonnegative function. An issue of particular interest is to detect more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
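
    The formulation described here, Nash detection as global minimization of a nonnegative regret function, can be sketched in a few lines. The game (Matching Pennies), the objective and the differential-evolution settings below are illustrative choices, not the authors' exact setup:

```python
import random

def u1(p, q):
    """Expected payoff to player 1 in Matching Pennies, where p and q are
    the probabilities that players 1 and 2 play heads."""
    return (2*p - 1) * (2*q - 1)

def nash_objective(x):
    """Sum of squared best-response regrets; zero exactly at a Nash equilibrium."""
    p, q = x
    r1 = max(u1(1, q), u1(0, q)) - u1(p, q)    # player 1's incentive to deviate
    r2 = max(-u1(p, 1), -u1(p, 0)) + u1(p, q)  # player 2's (u2 = -u1, zero-sum)
    return r1 * r1 + r2 * r2

def differential_evolution(f, dim, pop=30, gens=200, F=0.7, CR=0.9, seed=1):
    """Bare-bones DE/rand/1/bin over the unit box."""
    rng = random.Random(seed)
    xs = [[rng.random() for _ in range(dim)] for _ in range(pop)]
    for _ in range(gens):
        for i in range(pop):
            a, b, c = rng.sample([x for j, x in enumerate(xs) if j != i], 3)
            trial = [min(1.0, max(0.0, a[k] + F * (b[k] - c[k])))
                     if rng.random() < CR else xs[i][k]
                     for k in range(dim)]
            if f(trial) <= f(xs[i]):   # greedy selection
                xs[i] = trial
    return min(xs, key=f)

p_star, q_star = differential_evolution(nash_objective, dim=2)
# Converges towards the game's unique mixed equilibrium, p = q = 0.5.
```

    Multistart and deflection, as used in the paper, would rerun such a minimizer while suppressing already-found minima in order to detect several equilibria.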

  18. A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing

    Science.gov (United States)

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and the turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC. PMID:25127245
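
    The abstract does not spell out the framework's offloading criterion; the classic break-even analysis that motivates computational offloading in MCC can be sketched as follows (all parameter names and numbers are purely illustrative):

```python
def should_offload(cycles, local_speed, remote_speed, tx_bytes, bandwidth):
    """Offload when remote compute time plus data-transfer time beats
    local compute time. Units: cycles, cycles/s, bytes, bytes/s."""
    t_local = cycles / local_speed
    t_remote = cycles / remote_speed + tx_bytes / bandwidth
    return t_remote < t_local

# A compute-heavy component with little state to ship: offloading wins.
heavy = should_offload(5e9, 1e9, 20e9, 1e6, 5e6)    # True
# A component that must ship a lot of data: better to stay on the SMD.
chatty = should_offload(1e8, 1e9, 20e9, 50e6, 5e6)  # False
```

    The paper's reported 91% reduction in transmitted data directly improves the transfer term in any such trade-off.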

  19. A lightweight distributed framework for computational offloading in mobile cloud computing.

    Directory of Open Access Journals (Sweden)

    Muhammad Shiraz

    Full Text Available The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and the turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.

  20. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    to understand the computer as a material like any other material we would use for design, like wood, aluminum, or plastic. That as soon as the computer forms a composition with other materials it becomes just as approachable and inspiring as other smart materials. I present a series of investigations of what...... Computational Composite, and Telltale). Through the investigations, I show how the computer can be understood as a material and how it partakes in a new strand of materials whose expressions come to be in context. I uncover some of their essential material properties and potential expressions. I develop a way...

  1. Girls and Computing: Female Participation in Computing in Schools

    Science.gov (United States)

    Zagami, Jason; Boden, Marie; Keane, Therese; Moreton, Bronwyn; Schulz, Karsten

    2015-01-01

    Computer education, with a focus on Computer Science, has become a core subject in the Australian Curriculum and the focus of national innovation initiatives. Equal participation by girls, however, remains unlikely based on their engagement with computing in recent decades. In seeking to understand why this may be the case, a Delphi consensus…

  2. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating...... the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production......), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  3. Computer in radiology

    International Nuclear Information System (INIS)

    Kuesters, H.

    1985-01-01

    With this publication, the author presents the requirements that user-specific software should fulfill to achieve effective practice rationalisation through computer usage, and the hardware configuration necessary as basic equipment. This should make it more difficult in the future for sales representatives to sell radiologists unusable computer systems. Furthermore, questions are answered that were asked by computer-interested radiologists during system presentations. On the one hand there still exists a prejudice against standard-text programmes, and on the other hand undefined fears that handling a computer is too difficult and that one has to learn a computer language first to be able to work with computers. Finally, it is pointed out that real competitive advantages can be obtained through computer usage. (orig.) [de

  4. Computability and unsolvability

    CERN Document Server

    Davis, Martin

    1985-01-01

    ""A clearly written, well-presented survey of an intriguing subject."" - Scientific American. Classic text considers general theory of computability, computable functions, operations on computable functions, Turing machines self-applied, unsolvable decision problems, applications of general theory, mathematical logic, Kleene hierarchy, computable functionals, classification of unsolvable decision problems and more.

  5. Unconventional Quantum Computing Devices

    OpenAIRE

    Lloyd, Seth

    2000-01-01

    This paper investigates a variety of unconventional quantum computation devices, including fermionic quantum computers and computers that exploit nonlinear quantum mechanics. It is shown that unconventional quantum computing devices can in principle compute some quantities more rapidly than 'conventional' quantum computers.

  6. 75 FR 30839 - Privacy Act of 1974; CMS Computer Match No. 2010-03, HHS Computer Match No. 1003, SSA Computer...

    Science.gov (United States)

    2010-06-02

    ... 1974; CMS Computer Match No. 2010-03, HHS Computer Match No. 1003, SSA Computer Match No. 1048, IRS... Services (CMS). ACTION: Notice of renewal of an existing computer matching program (CMP) that has an...'' section below for comment period. DATES: Effective Dates: CMS filed a report of the Computer Matching...

  7. Advances in unconventional computing

    CERN Document Server

    2017-01-01

    Unconventional computing is a niche for interdisciplinary science, a cross-breed of computer science, physics, mathematics, chemistry, electronic engineering, biology, material science and nanotechnology. The aims of this book are to uncover and exploit principles and mechanisms of information processing in, and functional properties of, physical, chemical and living systems, in order to develop efficient algorithms, design optimal architectures and manufacture working prototypes of future and emergent computing devices. This first volume presents theoretical foundations of the future and emergent computing paradigms and architectures. The topics covered are computability, (non-)universality and complexity of computation; physics of computation, analog and quantum computing; reversible and asynchronous devices; cellular automata and other mathematical machines; P-systems and cellular computing; infinity and spatial computation; chemical and reservoir computing. The book is the encyclopedia, the first ever complete autho...

  8. NET-COMPUTER: Internet Computer Architecture and its Application in E-Commerce

    Directory of Open Access Journals (Sweden)

    P. O. Umenne

    2012-12-01

    Full Text Available Research in Intelligent Agents has yielded interesting results, some of which have been translated into commercial ventures. Intelligent Agents are executable software components that represent the user, perform tasks on behalf of the user and, when the task terminates, send the result to the user. Intelligent Agents are best suited for the Internet: a collection of computers connected together in a world-wide computer network. Swarm and HYDRA computer architectures for Agents’ execution were developed at the University of Surrey, UK in the 90s. The objective of the research was to develop a software-based computer architecture on which Agents’ execution could be explored. The combination of Intelligent Agents and the HYDRA computer architecture gave rise to a new computer concept: the NET-Computer, in which the computing resources reside on the Internet. The Internet computers form the hardware and software resources, and the user is provided with a simple interface to access the Internet and run user tasks. The Agents autonomously roam the Internet (NET-Computer) executing the tasks. A growing segment of the Internet is E-Commerce for online shopping for products and services. The Internet computing resources provide a marketplace for product suppliers and consumers alike. Consumers are looking for suppliers selling products and services, while suppliers are looking for buyers. Searching the vast amount of information available on the Internet causes a great deal of problems for both consumers and suppliers. Intelligent Agents executing on the NET-Computer can surf through the Internet and select specific information of interest to the user. The simulation results show that Intelligent Agents executing on the HYDRA computer architecture could be applied in E-Commerce.

  9. Computational intelligence synergies of fuzzy logic, neural networks and evolutionary computing

    CERN Document Server

    Siddique, Nazmul

    2013-01-01

    Computational Intelligence: Synergies of Fuzzy Logic, Neural Networks and Evolutionary Computing presents an introduction to some of the cutting edge technological paradigms under the umbrella of computational intelligence. Computational intelligence schemes are investigated with the development of a suitable framework for fuzzy logic, neural networks and evolutionary computing, neuro-fuzzy systems, evolutionary-fuzzy systems and evolutionary neural systems. Applications to linear and non-linear systems are discussed with examples. Key features: Covers all the aspect

  10. Injury and Compensation Claims Module Maintenance Manual

    Science.gov (United States)

    1989-01-01


  11. 5 CFR 353.110 - OPM placement assistance.

    Science.gov (United States)

    2010-01-01

    ..., NW., Washington, DC 20415.) (i) Executive branch employees (other than an employee of an intelligence... control; and (iv) Employees of the intelligence agencies (defined in 5 U.S.C. 2302(a)(2)(C)(ii)) when... competitive status or is eligible to acquire it under 5 U.S.C. 3304(C). If the employee's agency is abolished...

  12. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  13. Computer assisted radiology

    International Nuclear Information System (INIS)

    Lemke, H.U.; Jaffe, C.C.; Felix, R.

    1993-01-01

    The proceedings of the CAR'93 symposium present the 126 oral papers and the 58 posters contributed to the four Technical Sessions entitled: (1) Image Management, (2) Medical Workstations, (3) Digital Image Generation - DIG, and (4) Application Systems - AS. Topics discussed in Session (1) are: picture archiving and communication systems, teleradiology, hospital information systems and radiological information systems, technology assessment and implications, standards, and data bases. Session (2) deals with computer vision, computer graphics, design and application, man computer interaction. Session (3) goes into the details of the diagnostic examination methods such as digital radiography, MRI, CT, nuclear medicine, ultrasound, digital angiography, and multimodality imaging. Session (4) is devoted to computer-assisted techniques, as there are: computer assisted radiological diagnosis, knowledge based systems, computer assisted radiation therapy and computer assisted surgical planning. (UWA). 266 figs [de

  14. The Research of the Parallel Computing Development from the Angle of Cloud Computing

    Science.gov (United States)

    Peng, Zhensheng; Gong, Qingge; Duan, Yanyu; Wang, Yun

    2017-10-01

    Cloud computing is the development of parallel computing, distributed computing and grid computing. The development of cloud computing makes parallel computing come into people's lives. Firstly, this paper expounds the concept of cloud computing and introduces several traditional parallel programming models. Secondly, it analyzes and studies the principles, advantages and disadvantages of OpenMP, MPI and MapReduce respectively. Finally, it compares the MPI and OpenMP models with MapReduce from the angle of cloud computing. The results of this paper are intended to provide a reference for the development of parallel computing.
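The MapReduce model the abstract compares against MPI and OpenMP can be illustrated with a minimal, single-machine sketch: a map phase emits (key, value) pairs, a shuffle groups them by key, and a reduce phase folds each group. Real frameworks distribute these phases across a cluster; the word-count task and all function names here are illustrative only.

```python
# Single-machine sketch of the MapReduce dataflow (illustrative, not a
# distributed implementation): map -> shuffle -> reduce.
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word occurrence."""
    for doc in documents:
        for word in doc.split():
            yield word, 1

def shuffle(pairs):
    """Shuffle: group all values by key."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: fold each group of values into a single result."""
    return {key: sum(values) for key, values in groups.items()}

docs = ["cloud computing", "parallel computing"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts["computing"])  # -> 2
```

In a cluster setting the map and reduce calls would run on different nodes and the shuffle would move data over the network, which is exactly the communication cost such comparisons weigh against MPI's explicit message passing.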

  15. Disciplines, models, and computers: the path to computational quantum chemistry.

    Science.gov (United States)

    Lenhard, Johannes

    2014-12-01

    Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architecture. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computa- tional quantum chemistry became organized in a market-like fashion and this market is much bigger than the number of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), the arguably pivotal theory in the turn to computational quantum chemistry around 1990.

  16. Pacing a data transfer operation between compute nodes on a parallel computer

    Science.gov (United States)

    Blocksome, Michael A [Rochester, MN

    2011-09-13

    Methods, systems, and products are disclosed for pacing a data transfer between compute nodes on a parallel computer that include: transferring, by an origin compute node, a chunk of an application message to a target compute node; sending, by the origin compute node, a pacing request to a target direct memory access (`DMA`) engine on the target compute node using a remote get DMA operation; determining, by the origin compute node, whether a pacing response to the pacing request has been received from the target DMA engine; and transferring, by the origin compute node, a next chunk of the application message if the pacing response to the pacing request has been received from the target DMA engine.
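The pacing scheme described above can be sketched as a producer that sends one chunk, issues a pacing request, and waits for the pacing response before sending the next chunk. This is a hedged, illustrative model only: Python queues stand in for the DMA engines and the network, and the message strings are invented placeholders, not the patent's actual protocol.

```python
# Illustrative sketch of paced chunk transfer: the origin may not send
# the next chunk until the target acknowledges the pacing request.
import threading
from queue import Queue

def origin(message, chunk_size, to_target, from_target):
    for i in range(0, len(message), chunk_size):
        to_target.put(message[i:i + chunk_size])  # transfer one chunk
        to_target.put("PACING_REQUEST")           # stands in for the remote-get DMA request
        assert from_target.get() == "PACING_RESPONSE"  # block until target paces us
    to_target.put(None)                           # end-of-message sentinel

def target(to_target, from_target, received):
    while True:
        item = to_target.get()
        if item is None:
            break
        if item == "PACING_REQUEST":
            from_target.put("PACING_RESPONSE")    # target-side engine replies
        else:
            received.append(item)                 # deliver the chunk

to_t, from_t, received = Queue(), Queue(), []
worker = threading.Thread(target=target, args=(to_t, from_t, received))
worker.start()
origin("abcdefgh", 3, to_t, from_t)
worker.join()
print("".join(received))  # -> abcdefgh
```

The point of the round trip is flow control: the origin never gets more than one unacknowledged chunk ahead of the target, which bounds buffering on the receiving node.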

  17. A Computational Fluid Dynamics Algorithm on a Massively Parallel Computer

    Science.gov (United States)

    Jespersen, Dennis C.; Levit, Creon

    1989-01-01

    The discipline of computational fluid dynamics is demanding ever-increasing computational power to deal with complex fluid flow problems. We investigate the performance of a finite-difference computational fluid dynamics algorithm on a massively parallel computer, the Connection Machine. Of special interest is an implicit time-stepping algorithm; to obtain maximum performance from the Connection Machine, it is necessary to use a nonstandard algorithm to solve the linear systems that arise in the implicit algorithm. We find that the Connection Machine ran achieve very high computation rates on both explicit and implicit algorithms. The performance of the Connection Machine puts it in the same class as today's most powerful conventional supercomputers.

  18. Computer performance evaluation of FACOM 230-75 computer system, (2)

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1980-08-01

    In this report are described computer performance evaluations for the FACOM 230-75 computers in JAERI. The evaluations are performed on the following items: (1) Cost/benefit analysis of timesharing terminals, (2) Analysis of the response time of timesharing terminals, (3) Analysis of throughput time for batch job processing, (4) Estimation of current potential demands for computer time, (5) Determination of the appropriate number of card readers and line printers. These evaluations are done mainly from the standpoint of cost reduction of computing facilities. The techniques adopted are very practical ones. This report will be useful for those people who are concerned with the management of a computing installation. (author)

  19. Computations and interaction

    NARCIS (Netherlands)

    Baeten, J.C.M.; Luttik, S.P.; Tilburg, van P.J.A.; Natarajan, R.; Ojo, A.

    2011-01-01

    We enhance the notion of a computation of the classical theory of computing with the notion of interaction. In this way, we enhance a Turing machine as a model of computation to a Reactive Turing Machine that is an abstract model of a computer as it is used nowadays, always interacting with the user

  20. Symbiotic Cognitive Computing

    OpenAIRE

    Farrell, Robert G.; Lenchner, Jonathan; Kephart, Jeffrey O.; Webb, Alan M.; Muller, Michael J.; Erickson, Thomas D.; Melville, David O.; Bellamy, Rachel K.E.; Gruen, Daniel M.; Connell, Jonathan H.; Soroker, Danny; Aaron, Andy; Trewin, Shari M.; Ashoori, Maryam; Ellis, Jason B.

    2016-01-01

    IBM Research is engaged in a research program in symbiotic cognitive computing to investigate how to embed cognitive computing in physical spaces. This article proposes five key principles of symbiotic cognitive computing. We describe how these principles are applied in a particular symbiotic cognitive computing environment and in an illustrative application.

  1. Opportunity for Realizing Ideal Computing System using Cloud Computing Model

    OpenAIRE

    Sreeramana Aithal; Vaikunth Pai T

    2017-01-01

    An ideal computing system is a computing system with ideal characteristics. The major components and their performance characteristics of such hypothetical system can be studied as a model with predicted input, output, system and environmental characteristics using the identified objectives of computing which can be used in any platform, any type of computing system, and for application automation, without making modifications in the form of structure, hardware, and software coding by an exte...

  2. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  4. Theory of computation

    CERN Document Server

    Tourlakis, George

    2012-01-01

    Learn the skills and acquire the intuition to assess the theoretical limitations of computer programming Offering an accessible approach to the topic, Theory of Computation focuses on the metatheory of computing and the theoretical boundaries between what various computational models can do and not do—from the most general model, the URM (Unbounded Register Machines), to the finite automaton. A wealth of programming-like examples and easy-to-follow explanations build the general theory gradually, which guides readers through the modeling and mathematical analysis of computational pheno

  5. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
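The routing idea in this record can be reduced to a small sketch: links flagged defective in the first network cause traffic for that hop to use the second, independent network instead. The function and network names below are invented for illustration; the real system operates on a parallel machine's hardware link tables, not a Python dict.

```python
# Hedged sketch of fault administration with two independent networks:
# a link known to be defective in network A is routed via network B.
def route(src, dst, defective_links_a):
    """Choose the network for the (src, dst) hop; links are undirected."""
    if (src, dst) in defective_links_a or (dst, src) in defective_links_a:
        return "network-B"   # route communications data around the fault
    return "network-A"       # healthy link: use the default network

defective = {(0, 1)}          # link 0<->1 identified as defective in A
print(route(0, 1, defective))  # -> network-B
print(route(1, 2, defective))  # -> network-A
```

The benefit is that fault administration needs only to *identify* the bad link; no global rerouting is required because the second network already connects the same compute nodes.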

  6. CI, CII, and CO as tracers of gas phase carbon

    International Nuclear Information System (INIS)

    Keene, J.

    1990-01-01

    In the dense interstellar medium, we find that about 20 percent of the total carbon abundance is in the form of CO, about 3 percent in C I, and 10 percent in C II, with uncertainties of factors of order 2. The abundance of other forms of gaseous carbon is negligible. CO is widespread throughout molecular clouds, as is C I. C II has only been observed near bright star-formation regions so far because of its high excitation energy. Further from ultraviolet sources it may be less abundant. Altogether we have accounted for about 1/3 of the total carbon abundance associated with dense molecular clouds. Since the other gaseous forms are thought to have negligible abundances, the rest of the carbon is probably in solid form.

  7. Pulmonary lobar volumetry using novel volumetric computer-aided diagnosis and computed tomography

    Science.gov (United States)

    Iwano, Shingo; Kitano, Mariko; Matsuo, Keiji; Kawakami, Kenichi; Koike, Wataru; Kishimoto, Mariko; Inoue, Tsutomu; Li, Yuanzhong; Naganawa, Shinji

    2013-01-01

    OBJECTIVES To compare the accuracy of pulmonary lobar volumetry using the conventional number of segments method and novel volumetric computer-aided diagnosis using 3D computed tomography images. METHODS We acquired 50 consecutive preoperative 3D computed tomography examinations for lung tumours reconstructed at 1-mm slice thicknesses. We calculated the lobar volume and the emphysematous lobar volume. ... The volumetry computer-aided diagnosis system could more precisely measure lobar volumes than the conventional number of segments method. Because semi-automatic computer-aided diagnosis and automatic computer-aided diagnosis were complementary, in clinical use it would be more practical to first measure volumes by automatic computer-aided diagnosis, and then use semi-automatic measurements if automatic computer-aided diagnosis failed. PMID:23526418

  8. Elementary EFL Teachers' Computer Phobia and Computer Self-Efficacy in Taiwan

    Science.gov (United States)

    Chen, Kate Tzuching

    2012-01-01

    The advent and application of computer and information technology has increased the overall success of EFL teaching; however, such success is hard to assess, and teachers prone to computer avoidance face negative consequences. Two major obstacles are high computer phobia and low computer self-efficacy. However, little research has been carried out…

  9. Cloud Computing as Evolution of Distributed Computing – A Case Study for SlapOS Distributed Cloud Computing Platform

    Directory of Open Access Journals (Sweden)

    George SUCIU

    2013-01-01

    Full Text Available The cloud computing paradigm has been defined from several points of view, the main two directions being either as an evolution of the grid and distributed computing paradigm, or, on the contrary, as a disruptive revolution in the classical paradigms of operating systems, network layers and web applications. This paper presents a distributed cloud computing platform called SlapOS, which unifies technologies and communication protocols into a new technology model for offering any application as a service. Both cloud and distributed computing can be efficient methods for optimizing resources that are aggregated from a grid of standard PCs hosted in homes, offices and small data centers. The paper fills a gap in the existing distributed computing literature by providing a distributed cloud computing model which can be applied for deploying various applications.

  10. Computers and Computation. Readings from Scientific American.

    Science.gov (United States)

    Fenichel, Robert R.; Weizenbaum, Joseph

    A collection of articles from "Scientific American" magazine has been put together at this time because the current period in computer science is one of consolidation rather than innovation. A few years ago, computer science was moving so swiftly that even the professional journals were more archival than informative; but today it is…

  11. Review of quantum computation

    International Nuclear Information System (INIS)

    Lloyd, S.

    1992-01-01

    Digital computers are machines that can be programmed to perform logical and arithmetical operations. Contemporary digital computers are ''universal,'' in the sense that a program that runs on one computer can, if properly compiled, run on any other computer that has access to enough memory space and time. Any one universal computer can simulate the operation of any other; and the set of tasks that any such machine can perform is common to all universal machines. Since Bennett's discovery that computation can be carried out in a non-dissipative fashion, a number of Hamiltonian quantum-mechanical systems have been proposed whose time-evolutions over discrete intervals are equivalent to those of specific universal computers. The first quantum-mechanical treatment of computers was given by Benioff, who exhibited a Hamiltonian system with a basis whose members corresponded to the logical states of a Turing machine. In order to make the Hamiltonian local, in the sense that its structure depended only on the part of the computation being performed at that time, Benioff found it necessary to make the Hamiltonian time-dependent. Feynman discovered a way to make the computational Hamiltonian both local and time-independent by incorporating the direction of computation in the initial condition. In Feynman's quantum computer, the program is a carefully prepared wave packet that propagates through different computational states. Deutsch presented a quantum computer that exploits the possibility of existing in a superposition of computational states to perform tasks that a classical computer cannot, such as generating purely random numbers, and carrying out superpositions of computations as a method of parallel processing. In this paper, we show that such computers, by virtue of their common function, possess a common form for their quantum dynamics

  12. Computer Security Handbook

    CERN Document Server

    Bosworth, Seymour; Whyne, Eric

    2012-01-01

    The classic and authoritative reference in the field of computer security, now completely updated and revised With the continued presence of large-scale computers; the proliferation of desktop, laptop, and handheld computers; and the vast international networks that interconnect them, the nature and extent of threats to computer security have grown enormously. Now in its fifth edition, Computer Security Handbook continues to provide authoritative guidance to identify and to eliminate these threats where possible, as well as to lessen any losses attributable to them. With seventy-seven chapter

  13. Secure cloud computing

    CERN Document Server

    Jajodia, Sushil; Samarati, Pierangela; Singhal, Anoop; Swarup, Vipin; Wang, Cliff

    2014-01-01

    This book presents a range of cloud computing security challenges and promising solution paths. The first two chapters focus on practical considerations of cloud computing. In Chapter 1, Chandramouli, Iorga, and Chokani describe the evolution of cloud computing and the current state of practice, followed by the challenges of cryptographic key management in the cloud. In Chapter 2, Chen and Sion present a dollar cost model of cloud computing and explore the economic viability of cloud computing with and without security mechanisms involving cryptographic mechanisms. The next two chapters addres

  14. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  15. Cloud Computing Bible

    CERN Document Server

    Sosinsky, Barrie

    2010-01-01

    The complete reference guide to the hot technology of cloud computing. Its potential for lowering IT costs makes cloud computing a major force for both IT vendors and users; it is expected to gain momentum rapidly with the launch of Office Web Apps later this year. Because cloud computing involves various technologies, protocols, platforms, and infrastructure elements, this comprehensive reference is just what you need if you'll be using or implementing cloud computing. Cloud computing offers significant cost savings by eliminating upfront expenses for hardware and software; its growing popularit

  16. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...

  17. Cartoon computation: quantum-like computing without quantum mechanics

    International Nuclear Information System (INIS)

    Aerts, Diederik; Czachor, Marek

    2007-01-01

    We present a computational framework based on geometric structures. No quantum mechanics is involved, and yet the algorithms perform tasks analogous to quantum computation. Tensor products and entangled states are not needed-they are replaced by sets of basic shapes. To test the formalism we solve in geometric terms the Deutsch-Jozsa problem, historically the first example that demonstrated the potential power of quantum computation. Each step of the algorithm has a clear geometric interpretation and allows for a cartoon representation. (fast track communication)

  18. Digital optical computers at the optoelectronic computing systems center

    Science.gov (United States)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  19. Blackboard architecture and qualitative model in a computer aided assistant designed to define computers for HEP computing

    International Nuclear Information System (INIS)

    Nodarse, F.F.; Ivanov, V.G.

    1991-01-01

    Using BLACKBOARD architecture and a qualitative model, an expert system was developed to assist the user in defining computers for High Energy Physics computing. The COMEX system requires an IBM AT personal computer or compatible with more than 640 Kb RAM and a hard disk. 5 refs.; 9 figs

  20. Bioinspired computation in combinatorial optimization: algorithms and their computational complexity

    DEFF Research Database (Denmark)

    Neumann, Frank; Witt, Carsten

    2012-01-01

    Bioinspired computation methods, such as evolutionary algorithms and ant colony optimization, are being applied successfully to complex engineering and combinatorial optimization problems, and it is very important that we understand the computational complexity of these algorithms. This tutorial...... problems. Classical single objective optimization is examined first. They then investigate the computational complexity of bioinspired computation applied to multiobjective variants of the considered combinatorial optimization problems, and in particular they show how multiobjective optimization can help...... to speed up bioinspired computation for single-objective optimization problems. The tutorial is based on a book written by the authors with the same title. Further information about the book can be found at www.bioinspiredcomputation.com....
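A standard object of such runtime analyses is the (1+1) evolutionary algorithm on the OneMax problem (maximize the number of one-bits), for which the expected optimization time is known to be O(n log n). The sketch below is a textbook version of that algorithm, not code from the tutorial; the iteration cap is an arbitrary safety bound.

```python
# (1+1) EA on OneMax: keep one parent, flip each bit with prob. 1/n,
# accept the offspring if it is at least as good (elitist selection).
import random

def one_max(bits):
    """Fitness: number of one-bits."""
    return sum(bits)

def one_plus_one_ea(n, rng, max_iters=100_000):
    x = [rng.randint(0, 1) for _ in range(n)]
    for _ in range(max_iters):
        # standard-bit mutation: flip each bit independently with prob. 1/n
        y = [b ^ (rng.random() < 1.0 / n) for b in x]
        if one_max(y) >= one_max(x):   # elitist acceptance
            x = y
        if one_max(x) == n:            # global optimum reached
            break
    return x

rng = random.Random(1)
best = one_plus_one_ea(20, rng)
print(one_max(best))  # -> 20
```

Analyses of exactly this loop (expected hitting time of the all-ones string, drift arguments, etc.) are the kind of computational-complexity results the tutorial surveys.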

  1. Computation as Medium

    DEFF Research Database (Denmark)

    Jochum, Elizabeth Ann; Putnam, Lance

    2017-01-01

    Artists increasingly utilize computational tools to generate art works. Computational approaches to art making open up new ways of thinking about agency in interactive art because they invite participation and allow for unpredictable outcomes. Computational art is closely linked...... to the participatory turn in visual art, wherein spectators physically participate in visual art works. Unlike purely physical methods of interaction, computer assisted interactivity affords artists and spectators more nuanced control of artistic outcomes. Interactive art brings together human bodies, computer code......, and nonliving objects to create emergent art works. Computation is more than just a tool for artists, it is a medium for investigating new aesthetic possibilities for choreography and composition. We illustrate this potential through two artistic projects: an improvisational dance performance between a human...

  2. Community Cloud Computing

    Science.gov (United States)

    Marinos, Alexandros; Briscoe, Gerard

    Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.

  3. Genitores potenciais para hibridações identificados por divergência genética em feijão carioca Bean parents for hybridization identified by genetic divergence in "carioca" bean

    Directory of Open Access Journals (Sweden)

    Nerinéia Dalfollo Ribeiro

    2003-06-01

    Full Text Available Ninety carioca bean genotypes (Phaseolus vulgaris L. were evaluated over two growing seasons in Santa Maria, RS, Brazil, in order to determine which agro-morphological characteristics serve as the best descriptors, to cluster the genotypes by genetic dissimilarity, and to identify the most promising hybrid combinations for developing segregating populations. Of the 20 agro-morphological characteristics evaluated, only nine (pod rust, lodging, general score, seed coat colour, grain yield, 100-seed weight, height of insertion of the first pod, height of insertion of the last pod, and number of seeds per pod made the greatest contribution to genetic divergence. The carioca bean genotypes were clustered by the complete-linkage hierarchical method. Segregating populations with superior genetic variability can be obtained from hybridizations between the genotype ESAL 550 and genotypes of group 2 (LH-6, 17-4-32, R-78, H-4-5 and R-102 and/or genotypes of group 3 (FT 97-188, Cati-Taquari, CII-328, Carioca Precoce, FT 97-41, LH-11, FT 91-4067, Iapar 31, CI 102, Carioca MG, CII-54 and R-102.

  4. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers, modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has application in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  5. Nurses' computer literacy and attitudes towards the use of computers in health care.

    Science.gov (United States)

    Gürdaş Topkaya, Sati; Kaya, Nurten

    2015-05-01

    This descriptive and cross-sectional study was designed to address nurses' computer literacy and attitudes towards the use of computers in health care and to determine the correlation between these two variables. This study was conducted with the participation of 688 nurses who worked at two university-affiliated hospitals. These nurses were chosen using a stratified random sampling method. The data were collected using the Multicomponent Assessment of Computer Literacy and the Pretest for Attitudes Towards Computers in Healthcare Assessment Scale v. 2. The nurses, in general, had positive attitudes towards computers, and their computer literacy was good. Computer literacy in general had significant positive correlations with individual elements of computer competency and with attitudes towards computers. If the computer is to be an effective and beneficial part of the health-care system, it is necessary to help nurses improve their computer competency. © 2014 Wiley Publishing Asia Pty Ltd.

  6. Parallelized computation for computer simulation of electrocardiograms using personal computers with multi-core CPU and general-purpose GPU.

    Science.gov (United States)

    Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong

    2010-10-01

    Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel CPU of Core (TM) 2 Quad Q6600 and a GPU of Geforce 8800GT, with software support by OpenMP and CUDA. It was tested in three parallelization device setups: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus 1 core of CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setting (c). In the simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setting (a), 16.8 in setting (b), and 20.0 in setting (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
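    The load-prediction scheduling described above can be illustrated with a small sketch (not the authors' code; the device names and rates below are invented, chosen proportional to the reported speedups): at each step, work is split between devices in proportion to their most recently measured throughput, so the faster GPU receives the larger share.

```python
# Hedged sketch of load-prediction dynamic scheduling between a CPU and
# a GPU: allocate work units proportionally to each device's predicted
# throughput, so both devices finish their share at about the same time.

def split_work(total_units, rates):
    """Allocate units to each device proportionally to its predicted rate."""
    total_rate = sum(rates.values())
    return {dev: round(total_units * r / total_rate) for dev, r in rates.items()}

# Invented throughputs, proportional to the speedups reported above.
rates = {"cpu_4core": 3.9, "gpu": 16.8}
alloc = split_work(2070, rates)   # -> {'cpu_4core': 390, 'gpu': 1680}
```

    In a real scheduler the rates would be re-measured after every step, so the split adapts as the load on each device changes.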

  7. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    Science.gov (United States)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and use them towards running large scale environmental simulations and models to serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to those of native applications, and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Users can easily enable their websites so that visitors can volunteer their computer resources to help run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational sizes. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform to enable large scale hydrological simulations and model runs in an open and integrated environment.
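    The relational-database queue management described above can be sketched in a few lines; the schema and names below are illustrative assumptions, not the platform's actual design. Each volunteer node atomically claims a pending task and later writes its result back:

```python
import sqlite3

# Illustrative sketch (not the paper's implementation) of queue management
# in a relational database for distributed volunteer nodes: tasks are
# claimed atomically and results are written back when done.

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE tasks (id INTEGER PRIMARY KEY, cell TEXT, "
           "status TEXT DEFAULT 'pending', result REAL)")
db.executemany("INSERT INTO tasks (cell) VALUES (?)",
               [(f"subbasin-{i}",) for i in range(4)])

def claim_task(conn):
    """Atomically hand one pending task to a volunteer node."""
    with conn:  # commits (or rolls back) the claim as one transaction
        row = conn.execute("SELECT id, cell FROM tasks "
                           "WHERE status='pending' LIMIT 1").fetchone()
        if row:
            conn.execute("UPDATE tasks SET status='running' WHERE id=?", (row[0],))
    return row

def submit_result(conn, task_id, value):
    """Record the value computed by the volunteer node."""
    with conn:
        conn.execute("UPDATE tasks SET status='done', result=? WHERE id=?",
                     (value, task_id))

task = claim_task(db)        # -> (1, 'subbasin-0')
submit_result(db, task[0], 42.0)
```

    A production queue would also time out tasks whose volunteer disappears mid-run and return them to the pending pool.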

  8. Visual ergonomics and computer work--is it all about computer glasses?

    Science.gov (United States)

    Jonsson, Christina

    2012-01-01

    The Swedish Provisions on Work with Display Screen Equipment and the EU Directive on the minimum safety and health requirements for work with display screen equipment cover several important visual ergonomics aspects. But a review of cases and questions to the Swedish Work Environment Authority clearly shows that most attention is given to the demands for eyesight tests and special computer glasses. Other important visual ergonomics factors are at risk of being neglected. Today computers are used everywhere, both at work and at home. Computers can be laptops, PDAs, tablet computers, smart phones, etc. The demands on eyesight tests and computer glasses still apply, but the visual demands and the visual ergonomics conditions are quite different compared to the use of a stationary computer. Based on this review, we raise the question of whether the demand on the employer to provide employees with computer glasses is outdated.

  9. Computing with concepts, computing with numbers: Llull, Leibniz, and Boole

    NARCIS (Netherlands)

    Uckelman, S.L.

    2010-01-01

    We consider two ways to understand "reasoning as computation", one which focuses on the computation of concept symbols and the other on the computation of number symbols. We illustrate these two ways with Llull’s Ars Combinatoria and Leibniz’s attempts to arithmetize language, respectively. We then

  10. Processing computed tomography images by using personal computer

    International Nuclear Information System (INIS)

    Seto, Kazuhiko; Fujishiro, Kazuo; Seki, Hirofumi; Yamamoto, Tetsuo.

    1994-01-01

    Processing of CT images was attempted by using a popular personal computer. The image-processing program was written in C. The original images, acquired with a CT scanner (TCT-60A, Toshiba), were transferred to the computer on 8-inch flexible diskettes. Many fundamental image-processing operations were implemented, such as displaying images on the monitor, calculating CT values and drawing profile curves. The result showed that a popular personal computer had the ability to process CT images. It seemed that the 8-inch flexible diskette was still a useful medium for transferring image data. (author)
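    The profile curve mentioned above is simply the sequence of CT values along a line through the image. A minimal sketch using the standard linear rescale from stored pixel values to Hounsfield units (the slope and intercept shown are common defaults, not values from the paper):

```python
# Toy profile-curve computation: convert stored pixel values to CT values
# (Hounsfield units) with a linear rescale, then read them along one row.
# Slope/intercept are illustrative defaults, not scanner-specific values.

def to_hounsfield(pixel, slope=1.0, intercept=-1024.0):
    """Linear rescale from a stored pixel value to a CT value (HU)."""
    return pixel * slope + intercept

def profile_curve(image, row, slope=1.0, intercept=-1024.0):
    """CT values along one image row: a simple 'profile curve'."""
    return [to_hounsfield(p, slope, intercept) for p in image[row]]

image = [[0, 1024, 2048], [512, 1024, 1536]]
# profile_curve(image, 0) -> [-1024.0, 0.0, 1024.0]
```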

  11. Computational biomechanics

    International Nuclear Information System (INIS)

    Ethier, C.R.

    2004-01-01

    Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)

  12. Roadmap to greener computing

    CERN Document Server

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directi

  13. Computer proficiency questionnaire: assessing low and high computer proficient seniors.

    Science.gov (United States)

    Boot, Walter R; Charness, Neil; Czaja, Sara J; Sharit, Joseph; Rogers, Wendy A; Fisk, Arthur D; Mitzner, Tracy; Lee, Chin Chin; Nair, Sankaran

    2015-06-01

    Computers and the Internet have the potential to enrich the lives of seniors and aid in the performance of important tasks required for independent living. A prerequisite for reaping these benefits is having the skills needed to use these systems, which is highly dependent on proper training. One prerequisite for efficient and effective training is being able to gauge current levels of proficiency. We developed a new measure (the Computer Proficiency Questionnaire, or CPQ) to measure computer proficiency in the domains of computer basics, printing, communication, Internet, calendaring software, and multimedia use. Our aim was to develop a measure appropriate for individuals with a wide range of proficiencies, from noncomputer users to extremely skilled users. To assess the reliability and validity of the CPQ, a diverse sample of older adults, including 276 older adults with no or minimal computer experience, was recruited and asked to complete the CPQ. The CPQ demonstrated excellent reliability (Cronbach's α = .98), with subscale reliabilities ranging from .86 to .97. Age, computer use, and general technology use all predicted CPQ scores. Factor analysis revealed three main factors of proficiency related to Internet and e-mail use; communication and calendaring; and computer basics. Based on our findings, we also developed a short-form CPQ (CPQ-12) with similar properties but 21 fewer questions. The CPQ and CPQ-12 are useful tools to gauge computer proficiency for training and research purposes, even among older adults with low computer proficiency. © The Author 2013. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
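    The reliability figures above are Cronbach's alpha values, computed from the item variances and the variance of the total scores: α = k/(k−1) · (1 − Σσ²ᵢ/σ²ₜ). A minimal sketch (the toy score matrix is invented for illustration):

```python
# Cronbach's alpha, the internal-consistency statistic reported for the
# CPQ. Each inner list holds one item's scores across the same respondents.

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum(item variances) / variance of totals)."""
    k = len(items)
    n = len(items[0])

    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    totals = [sum(item[i] for item in items) for i in range(n)]
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))

# Two perfectly correlated toy items give alpha = 8/9 ≈ 0.889.
print(cronbach_alpha([[1, 2, 3], [2, 4, 6]]))
```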

  14. Distributed multiscale computing

    NARCIS (Netherlands)

    Borgdorff, J.

    2014-01-01

    Multiscale models combine knowledge, data, and hypotheses from different scales. Simulating a multiscale model often requires extensive computation. This thesis evaluates distributing these computations, an approach termed distributed multiscale computing (DMC). First, the process of multiscale

  15. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  16. Computer mathematics for programmers

    CERN Document Server

    Abney, Darrell H; Sibrel, Donald W

    1985-01-01

    Computer Mathematics for Programmers presents the Mathematics that is essential to the computer programmer.The book is comprised of 10 chapters. The first chapter introduces several computer number systems. Chapter 2 shows how to perform arithmetic operations using the number systems introduced in Chapter 1. The third chapter covers the way numbers are stored in computers, how the computer performs arithmetic on real numbers and integers, and how round-off errors are generated in computer programs. Chapter 4 details the use of algorithms and flowcharting as problem-solving tools for computer p

  17. Ubiquitous Computing: The Universal Use of Computers on College Campuses.

    Science.gov (United States)

    Brown, David G., Ed.

    This book is a collection of vignettes from 13 universities where everyone on campus has his or her own computer. These 13 institutions have instituted "ubiquitous computing" in very different ways at very different costs. The chapters are: (1) "Introduction: The Ubiquitous Computing Movement" (David G. Brown); (2) "Dartmouth College" (Malcolm…

  18. Activity-Driven Computing Infrastructure - Pervasive Computing in Healthcare

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind; Christensen, Henrik Bærbak; Olesen, Anders Konring

    In many work settings, and especially in healthcare, work is distributed among many cooperating actors, who are constantly moving around and are frequently interrupted. In line with other researchers, we use the term pervasive computing to describe a computing infrastructure that supports work...

  19. Computer Skills Training and Readiness to Work with Computers

    Directory of Open Access Journals (Sweden)

    Arnon Hershkovitz

    2016-05-01

    Full Text Available In today’s job market, computer skills are part of the prerequisites for many jobs. In this paper, we report on a study of readiness to work with computers (the dependent variable) among unemployed women (N=54) after participating in a unique, web-supported training focused on computer skills and empowerment. Overall, the level of participants’ readiness to work with computers was much higher at the end of the course than it was at its beginning. During the analysis, we explored associations between this variable and variables from four categories: log-based (describing the online activity); computer literacy and experience; job-seeking motivation and practice; and training satisfaction. Only two variables were associated with the dependent variable: knowledge post-test duration and satisfaction with content. After building a prediction model for the dependent variable, another log-based variable was highlighted: the total number of actions in the course website throughout the course. Overall, our analyses shed light on the predominance of log-based variables over variables from other categories. These findings might hint at the need for developing new assessment tools for learners and trainees that take into consideration human-computer interaction when measuring self-efficacy variables.

  20. Computational Science at the Argonne Leadership Computing Facility

    Science.gov (United States)

    Romero, Nichols

    2014-03-01

    The goal of the Argonne Leadership Computing Facility (ALCF) is to extend the frontiers of science by solving problems that require innovative approaches and the largest-scale computing systems. ALCF's most powerful computer - Mira, an IBM Blue Gene/Q system - has nearly one million cores. How does one program such systems? What software tools are available? Which scientific and engineering applications are able to utilize such levels of parallelism? This talk will address these questions and describe a sampling of projects that are using ALCF systems in their research, including ones in nanoscience, materials science, and chemistry. Finally, the ways to gain access to ALCF resources will be presented. This research used resources of the Argonne Leadership Computing Facility at Argonne National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under contract DE-AC02-06CH11357.

  1. Reconfigurable computing the theory and practice of FPGA-based computation

    CERN Document Server

    Hauck, Scott

    2010-01-01

    Reconfigurable Computing marks a revolutionary and hot topic that bridges the gap between the separate worlds of hardware and software design: the key feature of reconfigurable computing is its groundbreaking ability to perform computations in hardware to increase performance while retaining the flexibility of a software solution. Reconfigurable computers serve as affordable, fast, and accurate tools for developing designs ranging from single chip architectures to multi-chip and embedded systems. Scott Hauck and Andre DeHon have assembled a group of the key experts in the fields of both hardwa

  2. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  3. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  4. Cloud Computing (1/2)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Cloud computing, the buzzword of recent years for distributed computing, continues to attract and keep the interest of both the computing and business worlds. These lectures aim at explaining "What is Cloud Computing?" by identifying and analyzing its characteristics, models, and applications. The lectures will explore different "Cloud definitions" given by different authors and use them to introduce the particular concepts. The main cloud models (SaaS, PaaS, IaaS), cloud types (public, private, hybrid), cloud standards and security concerns will be presented. The borders between Cloud Computing and Grid Computing, Server Virtualization, and Utility Computing will be discussed and analyzed.

  5. Cloud Computing (2/2)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Cloud computing, the buzzword of recent years for distributed computing, continues to attract and keep the interest of both the computing and business worlds. These lectures aim at explaining "What is Cloud Computing?" by identifying and analyzing its characteristics, models, and applications. The lectures will explore different "Cloud definitions" given by different authors and use them to introduce the particular concepts. The main cloud models (SaaS, PaaS, IaaS), cloud types (public, private, hybrid), cloud standards and security concerns will be presented. The borders between Cloud Computing and Grid Computing, Server Virtualization, and Utility Computing will be discussed and analyzed.

  6. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot...... encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary. (3) Philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster’s second- order...

  7. Computer Virus and Trends

    OpenAIRE

    Tutut Handayani; Soenarto Usna,Drs.MMSI

    2004-01-01

    Since its first appearance in the mid-1980s, the computer virus has invited controversies that continue to this day. Along with the development of computer systems technology, computer viruses find new ways to spread themselves through a variety of existing communications media. This paper discusses several topics related to computer viruses, namely: the definition and history of computer viruses; the basics of computer viruses; the current state of computer viruses; and ...

  8. Computational error and complexity in science and engineering computational error and complexity

    CERN Document Server

    Lakshmikantham, Vangipuram; Chui, Charles K; Chui, Charles K

    2005-01-01

    The book “Computational Error and Complexity in Science and Engineering” pervades all the science and engineering disciplines where computation occurs. Scientific and engineering computation happens to be the interface between the mathematical model/problem and the real world application. One needs to obtain good quality numerical values for any real-world implementation. Mathematical symbols alone are of no use to engineers/technologists. The computational complexity of the numerical method used to solve the mathematical model, computed along with the solution, tells us how much computational effort has been spent to achieve that quality of result. Anyone who wants the specified physical problem to be solved has every right to know the quality of the solution as well as the resources spent for the solution. The computed error as well as the complexity provide the scientifically convincing answers to these questions. Specifically some of the disciplines in which the book w...

  9. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research...... concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1.) an intermediary step between any theoretical construct...... and its targeted empirical space and 2.) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project we also seek to introduce to scholars of religion some...

  10. The Nature of Computational Thinking in Computing Education

    DEFF Research Database (Denmark)

    Spangsberg, Thomas Hvid; Brynskov, Martin

    2018-01-01

    Computational Thinking has gained popularity in recent years within educational and political discourses. It is more than ever crucial to discuss the term itself and what it means. In June 2017, Denning articulated that computational thinking can be viewed as either “traditional” or “new”. New...

  11. Physical Computing and Its Scope--Towards a Constructionist Computer Science Curriculum with Physical Computing

    Science.gov (United States)

    Przybylla, Mareen; Romeike, Ralf

    2014-01-01

    Physical computing covers the design and realization of interactive objects and installations and allows students to develop concrete, tangible products of the real world, which arise from the learners' imagination. This can be used in computer science education to provide students with interesting and motivating access to the different topic…

  12. Medical Computational Thinking

    DEFF Research Database (Denmark)

    Musaeus, Peter; Tatar, Deborah Gail; Rosen, Michael A.

    2017-01-01

    Computational thinking (CT) in medicine means deliberating when to pursue computer-mediated solutions to medical problems and evaluating when such solutions are worth pursuing in order to assist in medical decision making. Teaching computational thinking (CT) at medical school should be aligned...

  13. Adaptation of HAMMER computer code to CYBER 170/750 computer

    International Nuclear Information System (INIS)

    Pinheiro, A.M.B.S.; Nair, R.P.K.

    1982-01-01

    The adaptation of HAMMER computer code to CYBER 170/750 computer is presented. The HAMMER code calculates cell parameters by multigroup transport theory and reactor parameters by few group diffusion theory. The auxiliary programs, the carried out modifications and the use of HAMMER system adapted to CYBER 170/750 computer are described. (M.C.K.) [pt

  14. Man and computer

    International Nuclear Information System (INIS)

    Fischbach, K.F.

    1981-01-01

    The discussion of cultural and sociological consequences of computer evolution is hindered by human prejudice. For example the sentence 'a computer is at best as intelligent as its programmer' veils actual developments. Theoretical limits of computer intelligence are the limits of intelligence in general. Modern computer systems replace not only human labour, but also human decision making and thereby human responsibility. The historical situation is unique. Human head-work is being automated and man is losing function. (orig.) [de

  15. The Modern Catholic Just War Tradition

    Science.gov (United States)

    2012-04-15

    offered a positive solution to the problems facing humanity. In fact in an address to the Vatican Diplomatic Corps, January 13, 2003, Pope John Paul II ...cii Pope John Paul II . “Address of His Holiness Pope John Paul II to the Diplomatice Corps.” Speech to the Vatican Diplomatic Corps Libreria...28 POPE JOHN PAUL II AND PAOP BENEDICT XVI

  16. International Conference on Computational Intelligence, Cyber Security, and Computational Models

    CERN Document Server

    Ramasamy, Vijayalakshmi; Sheen, Shina; Veeramani, C; Bonato, Anthony; Batten, Lynn

    2016-01-01

    This book aims at promoting high-quality research by researchers and practitioners from academia and industry presented at the International Conference on Computational Intelligence, Cyber Security, and Computational Models ICC3 2015, organized by PSG College of Technology, Coimbatore, India, during December 17 – 19, 2015. The book is enriched with innovations in broad areas of research like computational modeling, computational intelligence and cyber security. These emerging interdisciplinary research areas have helped to solve multifaceted problems and have gained a lot of attention in recent years. The book encompasses theory and applications, providing design, analysis and modeling of the aforementioned key areas.

  17. Early Childhood Teacher Candidates' Attitudes towards Computer and Computer Assisted Instruction

    OpenAIRE

    Oğuz, Evrim; Ellez, A. Murat; Akamca, Güzin Özyılmaz; Kesercioğlu, Teoman İ.; Girgin, Günseli

    2011-01-01

    The aim of this research is to evaluate preschool teacher candidates’ attitudes towards computers and attitudes towards the use of computer assisted instruction. The sample of this study includes 481 early childhood education students who attended Dokuz Eylül University’s department of Early Childhood Education. Data were collected by using the “Scale of Computer Assisted Instruction Attitudes” developed by Arslan (2006), the “Computer Attitudes Scale” developed by Çelik & Bindak (2005) and “General Info...

  18. Brief: Managing computing technology

    International Nuclear Information System (INIS)

    Startzman, R.A.

    1994-01-01

    While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950s, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically is probably the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies.

  19. Computational Viscoelasticity

    CERN Document Server

    Marques, Severino P C

    2012-01-01

    This text is a guide to solving problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on the codes’ structure and use, data preparation and output interpretation and verification. The first part of the book introduces the reader to the subject and provides the models, equations and notation to be used in the computational applications. The second part presents the most important computational techniques, finite element formulation and boundary element formulation, and the solutions of viscoelastic problems with Abaqus.

  20. Application of Blind Quantum Computation to Two-Party Quantum Computation

    Science.gov (United States)

    Sun, Zhiyuan; Li, Qin; Yu, Fang; Chan, Wai Hong

    2018-03-01

    Blind quantum computation (BQC) allows a client who has only limited quantum power to achieve quantum computation with the help of a remote quantum server and still keep the client's input, output, and algorithm private. Recently, Kashefi and Wallden extended BQC to achieve two-party quantum computation which allows two parties Alice and Bob to perform a joint unitary transform upon their inputs. However, in their protocol Alice has to prepare rotated single qubits and perform Pauli operations, and Bob needs to have a powerful quantum computer. In this work, we also utilize the idea of BQC to put forward an improved two-party quantum computation protocol in which the operations of both Alice and Bob are simplified since Alice only needs to apply Pauli operations and Bob is just required to prepare and encrypt his input qubits.
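    The Pauli operations that carry the encryption in such protocols can be illustrated with a quantum one-time pad sketch: a qubit is hidden by applying X^a Z^b for secret random bits a and b, and recovered by undoing the two Paulis in reverse order. This is a generic illustration of Pauli encryption, not the paper's specific construction.

```python
import random

# Quantum one-time pad on a single-qubit state vector (amplitudes of |0>
# and |1>), pure Python. X is a bit flip, Z a phase flip; each is its own
# inverse, so Z^b X^a undoes X^a Z^b exactly.

def apply_X(state):
    a0, a1 = state
    return (a1, a0)              # bit flip

def apply_Z(state):
    a0, a1 = state
    return (a0, -a1)             # phase flip

def pauli_encrypt(state, a, b):
    """Apply X^a Z^b for secret random bits a, b."""
    if b: state = apply_Z(state)
    if a: state = apply_X(state)
    return state

def pauli_decrypt(state, a, b):
    """Undo with Z^b X^a, using that each Pauli is self-inverse."""
    if a: state = apply_X(state)
    if b: state = apply_Z(state)
    return state

psi = (0.6, 0.8)
a, b = random.randint(0, 1), random.randint(0, 1)
assert pauli_decrypt(pauli_encrypt(psi, a, b), a, b) == psi
```

    Without knowing a and b, the encrypted qubit is maximally mixed to an observer, which is what keeps the client's data private from the server.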

  1. Application of Blind Quantum Computation to Two-Party Quantum Computation

    Science.gov (United States)

    Sun, Zhiyuan; Li, Qin; Yu, Fang; Chan, Wai Hong

    2018-06-01

    Blind quantum computation (BQC) allows a client who has only limited quantum power to achieve quantum computation with the help of a remote quantum server and still keep the client's input, output, and algorithm private. Recently, Kashefi and Wallden extended BQC to achieve two-party quantum computation which allows two parties Alice and Bob to perform a joint unitary transform upon their inputs. However, in their protocol Alice has to prepare rotated single qubits and perform Pauli operations, and Bob needs to have a powerful quantum computer. In this work, we also utilize the idea of BQC to put forward an improved two-party quantum computation protocol in which the operations of both Alice and Bob are simplified since Alice only needs to apply Pauli operations and Bob is just required to prepare and encrypt his input qubits.

  2. Implementing an Affordable High-Performance Computing for Teaching-Oriented Computer Science Curriculum

    Science.gov (United States)

    Abuzaghleh, Omar; Goldschmidt, Kathleen; Elleithy, Yasser; Lee, Jeongkyu

    2013-01-01

    With the advances in computing power, high-performance computing (HPC) platforms have had an impact on not only scientific research in advanced organizations but also computer science curriculum in the educational community. For example, multicore programming and parallel systems are highly desired courses in the computer science major. However,…

  3. Computer Education and Computer Use by Preschool Educators

    Science.gov (United States)

    Towns, Bernadette

    2010-01-01

    Researchers have found that teachers seldom use computers in the preschool classroom. However, little research has examined why preschool teachers elect not to use computers. This case study focused on identifying whether community colleges that prepare teachers for early childhood education include in their curriculum how teachers can effectively…

  4. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  5. Cloud Computing for radiologists

    International Nuclear Information System (INIS)

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future

  6. Cloud computing for radiologists

    Directory of Open Access Journals (Sweden)

    Amit T Kharat

    2012-01-01

    Full Text Available Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on the maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It frees radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues, which need to be addressed to ensure the success of Cloud computing in the future.

  7. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  8. Introduction: 'History of computing'. Historiography of computers and computer use in the Netherlands

    Directory of Open Access Journals (Sweden)

    Adrienne van den Boogaard

    2008-06-01

    Full Text Available Along with the international trends in history of computing, Dutch contributions over the past twenty years moved away from a focus on machinery to the broader scope of use of computers, appropriation of computing technologies in various traditions, labour relations and professionalisation issues, and, lately, software. It is only natural that an emerging field like computer science sets out to write its genealogy and canonise the important steps in its intellectual endeavour. It is fair to say that a historiography diverging from such “home” interest started in 1987 with the work of Eda Kranakis – then active in The Netherlands – commissioned by the national bureau for technology assessment, and Gerard Alberts, turning a commemorative volume of the Mathematical Center into a history of the same institute. History of computing in The Netherlands made a major leap in the spring of 1994 when Dirk de Wit, Jan van den Ende and Ellen van Oost defended their dissertations, on the roads towards adoption of computing technology in banking, in science and engineering, and on the gender aspect in computing. Here, history of computing had already moved from machines to the use of computers. The three authors joined Gerard Alberts and Onno de Wit in preparing a volume on the rise of IT in The Netherlands, the sequel of which is now in preparation by a team led by Adrienne van den Bogaard. Dutch research reflected the international attention for professionalisation issues (Ensmenger, Haigh) very early on in the dissertation by Ruud van Dael, Something to do with computers (2001), revealing how occupations dealing with computers typically escape the pattern of closure by professionalisation as expected by the, thus outdated, sociology of professions. History of computing not only takes use and users into consideration, but finally, as one may say, confronts the technological side of putting the machine to use, software, head on. The groundbreaking works

  9. Approximation and Computation

    CERN Document Server

    Gautschi, Walter; Rassias, Themistocles M

    2011-01-01

    Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research including theoretic developments, new computational alg

  10. Computer science handbook. Vol. 13.3. Environmental computer science. Computer science methods for environmental protection and environmental research

    International Nuclear Information System (INIS)

    Page, B.; Hilty, L.M.

    1994-01-01

    Environmental computer science is a new partial discipline of applied computer science, which makes use of methods and techniques of information processing in environmental protection. Thanks to the inter-disciplinary nature of environmental problems, computer science acts as a mediator between numerous disciplines and institutions in this sector. The handbook reflects the broad spectrum of state-of-the art environmental computer science. The following important subjects are dealt with: Environmental databases and information systems, environmental monitoring, modelling and simulation, visualization of environmental data and knowledge-based systems in the environmental sector. (orig.) [de

  11. Computer ray tracing speeds.

    Science.gov (United States)

    Robb, P; Pawlowski, B

    1990-05-01

    The results of measuring the ray trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to supercomputers, are described. A correlation of ray trace speed has been made with the LINPACK benchmark, which allows the ray trace speed to be estimated using LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (Reduced Instruction Set Computer) technology, are as fast as or faster than mainframe computers in compute-bound situations.
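
The reported correlation suggests a simple linear estimator of ray trace speed from LINPACK performance. The sketch below fits such a line by ordinary least squares; all sample numbers are invented for illustration and are not the paper's measurements.

```python
# Hypothetical sketch: estimating ray-trace speed from LINPACK MFLOPS
# via a least-squares fit. The sample pairs below are made up.

def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# (LINPACK MFLOPS, measured ray traces per second) -- illustrative pairs
samples = [(1.0, 120.0), (5.0, 610.0), (12.0, 1450.0), (27.0, 3300.0)]
a, b = fit_line([s[0] for s in samples], [s[1] for s in samples])

def estimate_ray_speed(linpack_mflops):
    """Predict ray traces per second from a LINPACK figure."""
    return a * linpack_mflops + b
```

With these invented points the fit is close to linear, which is the shape of relationship the paper's correlation implies.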

  12. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  13. Computer-assisted instruction

    NARCIS (Netherlands)

    Voogt, J.; Fisser, P.; Wright, J.D.

    2015-01-01

    Since the early days of computer technology in education in the 1960s, it has been claimed that computers can assist instructional practice and hence improve student learning. Since then, computer technology has developed, and its potential for education has increased. In this article, we first discuss

  14. A Distributed Snapshot Protocol for Efficient Artificial Intelligence Computation in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    JongBeom Lim

    2018-01-01

    Full Text Available Many artificial intelligence applications often require a huge amount of computing resources. As a result, cloud computing adoption rates are increasing in the artificial intelligence field. To support the demand for artificial intelligence applications and guarantee the service level agreement, cloud computing should provide not only computing resources but also fundamental mechanisms for efficient computing. In this regard, a snapshot protocol has been used to create a consistent snapshot of the global state in cloud computing environments. However, the existing snapshot protocols are not optimized in the context of artificial intelligence applications, where large-scale iterative computation is the norm. In this paper, we present a distributed snapshot protocol for efficient artificial intelligence computation in cloud computing environments. The proposed snapshot protocol is based on a distributed algorithm to run interconnected multiple nodes in a scalable fashion. Our snapshot protocol is able to deal with artificial intelligence applications, in which a large number of computing nodes are running. We reveal that our distributed snapshot protocol guarantees the correctness, safety, and liveness conditions.
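
Such protocols build on the classic Chandy-Lamport marker algorithm for recording a consistent global state. A minimal single-threaded sketch of the marker idea follows; the `Node` class and the in-memory "network" are our illustrative assumptions, not the paper's protocol or API.

```python
# Sketch of the Chandy-Lamport marker idea: the initiator records its
# state and floods markers; each node records its state on first marker.
from collections import deque

class Node:
    def __init__(self, name):
        self.name = name
        self.state = 0            # local application state
        self.recorded_state = None
        self.inbox = deque()

    def start_snapshot(self, nodes):
        """Initiator: record local state, send markers to all peers."""
        self.recorded_state = self.state
        for peer in nodes:
            if peer is not self:
                peer.inbox.append(("MARKER", self.name))

    def receive(self, nodes):
        while self.inbox:
            kind, src = self.inbox.popleft()
            if kind == "MARKER" and self.recorded_state is None:
                # First marker seen: record state, forward markers.
                self.recorded_state = self.state
                for peer in nodes:
                    if peer is not self:
                        peer.inbox.append(("MARKER", self.name))

nodes = [Node("a"), Node("b"), Node("c")]
nodes[0].state, nodes[1].state, nodes[2].state = 10, 20, 30
nodes[0].start_snapshot(nodes)
for n in nodes:
    n.receive(nodes)
# Every node has now recorded its local state consistently.
```

A real implementation would also record in-flight channel messages between a node's own recording and the arrival of markers on each channel; the sketch keeps only the state-recording skeleton.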

  15. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available Computed tomography (CT) of the sinuses. Computed tomography, more commonly known ...

  16. From computer to brain foundations of computational neuroscience

    CERN Document Server

    Lytton, William W

    2002-01-01

    Biology undergraduates, medical students and life-science graduate students often have limited mathematical skills. Similarly, physics, math and engineering students have little patience for the detailed facts that make up much of biological knowledge. Teaching computational neuroscience as an integrated discipline requires that both groups be brought forward onto common ground. This book does this by making ancillary material available in an appendix and providing basic explanations without becoming bogged down in unnecessary details. The book will be suitable for undergraduates and beginning graduate students taking a computational neuroscience course and also to anyone with an interest in the uses of the computer in modeling the nervous system.

  17. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered

  18. Computer users at risk: Health disorders associated with prolonged computer use

    OpenAIRE

    Abida Ellahi; M. Shahid Khalil; Fouzia Akram

    2011-01-01

    In view of the ISO standards, which emphasize the assessment of the use of a product, this research aims to assess the prolonged use of computers and its effects on human health. The objective of this study was to investigate the association between the extent of computer use (per day) and carpal tunnel syndrome, computer stress syndrome, computer vision syndrome and musculoskeletal problems. The second objective was to investigate the extent of simultaneous occurrence of carpal tunnel syndr...

  19. Molecular computing towards a novel computing architecture for complex problem solving

    CERN Document Server

    Chang, Weng-Long

    2014-01-01

    This textbook introduces a concise approach to the design of molecular algorithms for students or researchers who are interested in dealing with complex problems. Through numerous examples and exercises, you will understand the main differences between molecular circuits and traditional digital circuits in manipulating the same problem, and you will also learn how to design a molecular algorithm for solving a problem from start to finish. The book starts with an introduction to computational aspects of digital computers and molecular computing, data representation of molecular computing, molecular operations of molecular computing and number representation of molecular computing, and provides many molecular algorithms to construct the parity generator and the parity checker of error-detection codes in digital communication, to encode integers of different formats, single precision and double precision floating-point numbers, to implement addition and subtraction of unsigned integers, and to construct logic operations...

  20. Computation at the edge of chaos: Phase transition and emergent computation

    International Nuclear Information System (INIS)

    Langton, C.

    1990-01-01

    In order for computation to emerge spontaneously and become an important factor in the dynamics of a system, the material substrate must support the primitive functions required for computation: the transmission, storage, and modification of information. Under what conditions might we expect physical systems to support such computational primitives? This paper presents research on Cellular Automata which suggests that the optimal conditions for the support of information transmission, storage, and modification, are achieved in the vicinity of a phase transition. We observe surprising similarities between the behaviors of computations and systems near phase-transitions, finding analogs of computational complexity classes and the Halting problem within the phenomenology of phase-transitions. We conclude that there is a fundamental connection between computation and phase-transitions, and discuss some of the implications for our understanding of nature if such a connection is borne out. 31 refs., 16 figs
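
Langton quantified "the vicinity of a phase transition" with the λ parameter of a cellular-automaton rule table: the fraction of transitions that map to a non-quiescent state. A small sketch (the helper names are ours; the rule-110 encoding is the standard Wolfram numbering):

```python
# Langton's lambda parameter for a CA rule table: the fraction of
# rule-table entries whose output is not the quiescent state.

def lambda_param(rule_table, quiescent=0):
    non_quiescent = sum(1 for out in rule_table.values() if out != quiescent)
    return non_quiescent / len(rule_table)

# Elementary CA rule 110: neighbourhood (left, centre, right) -> next state.
rule_110 = {
    (l, c, r): (110 >> (l * 4 + c * 2 + r)) & 1
    for l in (0, 1) for c in (0, 1) for r in (0, 1)
}
print(lambda_param(rule_110))  # 5 of 8 transitions are non-quiescent -> 0.625
```

Low λ gives frozen dynamics, high λ gives chaotic dynamics; the interesting, computation-supporting behavior appears at intermediate values, matching the phase-transition picture in the abstract.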

  1. Computational force, mass, and energy

    International Nuclear Information System (INIS)

    Numrich, R.W.

    1997-01-01

    This paper describes a correspondence between computational quantities commonly used to report computer performance measurements and mechanical quantities from classical Newtonian mechanics. It defines a set of three fundamental computational quantities that are sufficient to establish a system of computational measurement. From these quantities, it defines derived computational quantities that have analogous physical counterparts. These computational quantities obey three laws of motion in computational space. The solutions to the equations of motion, with appropriate boundary conditions, determine the computational mass of the computer. Computational forces, with magnitudes specific to each instruction and to each computer, overcome the inertia represented by this mass. The paper suggests normalizing the computational mass scale by picking the mass of a register on the CRAY-1 as the standard unit of mass
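
Schematically, the analogy can be written as follows; the symbols are our illustrative notation, not the paper's.

```latex
% Purely schematic rendering of the Newtonian analogy.
\begin{align*}
  F_i &= m\,a_i
      && \text{computational force of instruction $i$ overcoming inertia,}\\
  m'  &= \frac{m}{m_{\mathrm{CRAY\text{-}1\ register}}}
      && \text{computational mass normalized to the CRAY-1 register unit.}
\end{align*}
```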

  2. Computational physics

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  3. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  4. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.

  5. Computational physics

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  6. Do Clouds Compute? A Framework for Estimating the Value of Cloud Computing

    Science.gov (United States)

    Klems, Markus; Nimis, Jens; Tai, Stefan

    On-demand provisioning of scalable and reliable compute services, along with a cost model that charges consumers based on actual service usage, has been an objective in distributed computing research and industry for a while. Cloud Computing promises to deliver on this objective: consumers are able to rent infrastructure in the Cloud as needed, deploy applications and store data, and access them via Web protocols on a pay-per-use basis. The acceptance of Cloud Computing, however, depends on the ability of Cloud Computing providers and consumers to implement a model for business value co-creation. Therefore, a systematic approach to measure costs and benefits of Cloud Computing is needed. In this paper, we discuss the need for valuation of Cloud Computing, identify key components, and structure these components in a framework. The framework assists decision makers in estimating Cloud Computing costs and in comparing these costs to conventional IT solutions. We demonstrate by means of representative use cases how our framework can be applied to real world scenarios.
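
A toy instance of the kind of cost comparison such a framework formalizes is the break-even point between renting pay-per-use capacity and buying hardware up front. All figures below are invented for illustration.

```python
# Toy cloud-vs-on-premise comparison: cumulative rent vs. up-front
# purchase plus monthly running cost. Figures are illustrative only.

def months_to_break_even(capex, onprem_monthly, cloud_monthly):
    """First month at which owning becomes no more expensive than renting."""
    if cloud_monthly <= onprem_monthly:
        return None  # renting never overtakes owning in cost
    month = 0
    while capex + onprem_monthly * month > cloud_monthly * month:
        month += 1
    return month

# e.g. $120k up front and $2k/month to run, vs. $7k/month in the cloud
print(months_to_break_even(120_000, 2_000, 7_000))  # 24
```

A real valuation would add the benefit side the paper emphasizes (elasticity, avoided over-provisioning), which pure cost curves like this one ignore.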

  7. Controlling data transfers from an origin compute node to a target compute node

    Science.gov (United States)

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2011-06-21

    Methods, apparatus, and products are disclosed for controlling data transfers from an origin compute node to a target compute node that include: receiving, by an application messaging module on the target compute node, an indication of a data transfer from an origin compute node to the target compute node; and administering, by the application messaging module on the target compute node, the data transfer using one or more messaging primitives of a system messaging module in dependence upon the indication.

  8. Computer Lexis and Terminology

    Directory of Open Access Journals (Sweden)

    Gintautas Grigas

    2011-04-01

    Full Text Available The computer has become a widely used tool in everyday work and at home. Every computer user sees texts on the screen containing many words naming new concepts. Those words come from the terminology used by specialists. A common vocabulary shared by computer terminology and the lexis of everyday language comes into existence. The article deals with the part of computer terminology which passes into everyday usage and with the influence of ordinary language on computer terminology. The relation between English and Lithuanian computer terminology and the construction and pronunciation of acronyms are discussed as well.

  9. Computations in plasma physics

    International Nuclear Information System (INIS)

    Cohen, B.I.; Killeen, J.

    1984-01-01

    A review of computer applications in plasma physics is presented. The contribution of computers to the investigation of magnetic and inertial confinement of a plasma and of charged particle beam propagation is described. Typical uses of computers for simulation and control of laboratory and cosmic experiments with a plasma, and for data accumulation in these experiments, are considered. Basic computational methods applied in plasma physics are discussed. Future trends of computer utilization in plasma research are considered in terms of the increasing role of microprocessors and high-speed data plotters and the necessity of more powerful computer applications

  10. Explorations in quantum computing

    CERN Document Server

    Williams, Colin P

    2011-01-01

    By the year 2020, the basic memory components of a computer will be the size of individual atoms. At such scales, the current theory of computation will become invalid. "Quantum computing" is reinventing the foundations of computer science and information theory in a way that is consistent with quantum physics - the most accurate model of reality currently known. Remarkably, this theory predicts that quantum computers can perform certain tasks breathtakingly faster than classical computers -- and, better yet, can accomplish mind-boggling feats such as teleporting information, breaking suppos

  11. Physics vs. computer science

    International Nuclear Information System (INIS)

    Pike, R.

    1982-01-01

    With computers becoming more frequently used in theoretical and experimental physics, physicists can no longer afford to be ignorant of the basic techniques and results of computer science. Computing principles belong in a physicist's tool box, along with experimental methods and applied mathematics, and the easiest way to educate physicists in computing is to provide, as part of the undergraduate curriculum, a computing course designed specifically for physicists. As well, the working physicist should interact with computer scientists, giving them challenging problems in return for their expertise. (orig.)

  12. The role of dedicated data computing centers in the age of cloud computing

    Science.gov (United States)

    Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr

    2017-10-01

    Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.

  13. Mathematics for computer graphics

    CERN Document Server

    Vince, John

    2006-01-01

    Helps you understand the mathematical ideas used in computer animation, virtual reality, CAD, and other areas of computer graphics. This work also helps you to rediscover the mathematical techniques required to solve problems and design computer programs for computer graphic applications

  14. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  15. DCE. Future IHEP's computing environment

    International Nuclear Information System (INIS)

    Zheng Guorui; Liu Xiaoling

    1995-01-01

    IHEP's computing environment consists of several different computing environments established on IHEP computer networks, of which the BES environment supporting HEP computing is the main part. In connection with the improvement and extension of the BES environment, the authors outline the development of these computing environments from the viewpoint of establishing a high energy physics (HEP) environment. The direction of the IHEP computing environment's development toward distributed computing, based on current trends in distributed computing, is presented

  16. A Heterogeneous High-Performance System for Computational and Computer Science

    Science.gov (United States)

    2016-11-15

    expand the research infrastructure at the institution but also to enhance the high-performance computing training provided to both undergraduate and... cloud computing, supercomputing, and the availability of cheap memory and storage led to enormous amounts of data to be sifted through in forensic... High-Performance Computing (HPC) tools that can be integrated with existing curricula and support our research to modernize and dramatically advance

  17. Geometric computations with interval and new robust methods applications in computer graphics, GIS and computational geometry

    CERN Document Server

    Ratschek, H

    2003-01-01

    This undergraduate and postgraduate text will familiarise readers with interval arithmetic and related tools to gain reliable and validated results and logically correct decisions for a variety of geometric computations, plus the means for alleviating the effects of the errors. It also considers computations on geometric point-sets, which are neither robust nor reliable in processing with standard methods. The authors provide two effective tools for obtaining correct results: (a) interval arithmetic, and (b) ESSA, a new powerful algorithm which improves many geometric computations and makes th

  18. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer. This book discusses the algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  19. A Model of Computation for Bit-Level Concurrent Computing and Programming: APEC

    Science.gov (United States)

    Ajiro, Takashi; Tsuchida, Kensei

    A concurrent model of computation and a language based on the model for bit-level operation are useful for developing asynchronous and concurrent programs compositionally, which frequently use bit-level operations. Some examples are programs for video games, hardware emulation (including virtual machines), and signal processing. However, few models and languages are optimized and oriented to bit-level concurrent computation. We previously developed a visual programming language called A-BITS for bit-level concurrent programming. The language is based on a dataflow-like model that computes using processes that provide serial bit-level operations and FIFO buffers connected to them. It can express bit-level computation naturally and develop compositionally. We then devised a concurrent computation model called APEC (Asynchronous Program Elements Connection) for bit-level concurrent computation. This model enables precise and formal expression of the process of computation, and a notion of primitive program elements for controlling and operating can be expressed synthetically. Specifically, the model is based on a notion of uniform primitive processes, called primitives, that have three terminals and four ordered rules at most, as well as on bidirectional communication using vehicles called carriers. A new notion is that a carrier moving between two terminals can briefly express some kinds of computation such as synchronization and bidirectional communication. The model's properties make it most applicable to bit-level computation compositionally, since the uniform computation elements are enough to develop components that have practical functionality. Through future application of the model, our research may enable further research on a base model of fine-grain parallel computer architecture, since the model is suitable for expressing massive concurrency by a network of primitives.

  20. An introduction to computer viruses

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D.R.

    1992-03-01

    This report on computer viruses is based upon a thesis written for the Master of Science degree in Computer Science from the University of Tennessee in December 1989 by David R. Brown. This thesis is entitled An Analysis of Computer Virus Construction, Proliferation, and Control and is available through the University of Tennessee Library. This paper contains an overview of the computer virus arena that can help the reader to evaluate the threat that computer viruses pose. The extent of this threat can only be determined by evaluating many different factors. These factors include the relative ease with which a computer virus can be written, the motivation involved in writing a computer virus, the damage and overhead incurred by infected systems, and the legal implications of computer viruses, among others. Based upon the research, the development of a computer virus seems to require more persistence than technical expertise. This is a frightening proclamation to the computing community. The education of computer professionals to the dangers that viruses pose to the welfare of the computing industry as a whole is stressed as a means of inhibiting the current proliferation of computer virus programs. Recommendations are made to assist computer users in preventing infection by computer viruses. These recommendations support solid general computer security practices as a means of combating computer viruses.

  1. Navier-Stokes computer

    International Nuclear Information System (INIS)

    Hayder, M.E.

    1988-01-01

    A new scientific supercomputer, known as the Navier-Stokes Computer (NSC), has been designed. The NSC is a multi-purpose machine, and for applications in the field of computational fluid dynamics (CFD), this supercomputer is expected to yield a computational speed far exceeding that of present-day supercomputers. This computer has a few very powerful processors (known as nodes) connected by an internodal network. There are three versions of the NSC nodes: micro-, mini- and full-node. The micro-node was developed to prove, demonstrate and refine the key architectural features of the NSC. Architectures of the two recent versions of the NSC nodes are presented, with the main focus on the full-node. At a clock speed of 20 MHz, the mini- and the full-node have peak computational speeds of 200 and 640 MFLOPS, respectively. The full-node is the final version of the NSC nodes, and an NSC is expected to have 128 full-nodes. To test the suitability of different algorithms on the NSC architecture, an NSC simulator was developed. Some of the existing computational fluid dynamics codes were ported to this simulator to determine important and relevant issues relating to the efficient use of the NSC architecture.

  2. Computer algebra applications

    International Nuclear Information System (INIS)

    Calmet, J.

    1982-01-01

    A survey of applications based either on fundamental algorithms in computer algebra or on the use of a computer algebra system is presented. Recent work in biology, chemistry, physics, mathematics and computer science is discussed. In particular, applications in high energy physics (quantum electrodynamics), celestial mechanics and general relativity are reviewed. (Auth.)

  3. Demonstration of blind quantum computing.

    Science.gov (United States)

    Barz, Stefanie; Kashefi, Elham; Broadbent, Anne; Fitzsimons, Joseph F; Zeilinger, Anton; Walther, Philip

    2012-01-20

    Quantum computers, besides offering substantial computational speedups, are also expected to preserve the privacy of a computation. We present an experimental demonstration of blind quantum computing in which the input, computation, and output all remain unknown to the computer. We exploit the conceptual framework of measurement-based quantum computation that enables a client to delegate a computation to a quantum server. Various blind delegated computations, including one- and two-qubit gates and the Deutsch and Grover quantum algorithms, are demonstrated. The client only needs to be able to prepare and transmit individual photonic qubits. Our demonstration is crucial for unconditionally secure quantum cloud computing and might become a key ingredient for real-life applications, especially when considering the challenges of making powerful quantum computers widely available.

  4. Computational intelligence and neuromorphic computing potential for cybersecurity applications

    Science.gov (United States)

    Pino, Robinson E.; Shevenell, Michael J.; Cam, Hasan; Mouallem, Pierre; Shumaker, Justin L.; Edwards, Arthur H.

    2013-05-01

    In today's highly mobile, networked, and interconnected internet world, the flow and volume of information is overwhelming and continuously increasing. Therefore, it is believed that the next frontier in technological evolution and development will rely on our ability to develop intelligent systems that can help us process, analyze, and make sense of information autonomously, just as a well-trained and educated human expert would. In computational intelligence, neuromorphic computing promises to allow for the development of computing systems able to imitate natural neurobiological processes and form the foundation for intelligent system architectures.

  5. Women are underrepresented in computational biology: An analysis of the scholarly literature in biology, computer science and computational biology.

    Directory of Open Access Journals (Sweden)

    Kevin S Bonham

    2017-10-01

    Full Text Available While women are generally underrepresented in STEM fields, there are noticeable differences between fields. For instance, the gender ratio in biology is more balanced than in computer science. We were interested in how this difference is reflected in the interdisciplinary field of computational/quantitative biology. To this end, we examined the proportion of female authors in publications from the PubMed and arXiv databases. There are fewer female authors on research papers in computational biology, as compared to biology in general. This is true across authorship position, year, and journal impact factor. A comparison with arXiv shows that quantitative biology papers have a higher ratio of female authors than computer science papers, placing computational biology in between its two parent fields in terms of gender representation. Both in biology and in computational biology, a female last author increases the probability of other authors on the paper being female, pointing to a potential role of female PIs in influencing the gender balance.

  6. Women are underrepresented in computational biology: An analysis of the scholarly literature in biology, computer science and computational biology.

    Science.gov (United States)

    Bonham, Kevin S; Stefan, Melanie I

    2017-10-01

    While women are generally underrepresented in STEM fields, there are noticeable differences between fields. For instance, the gender ratio in biology is more balanced than in computer science. We were interested in how this difference is reflected in the interdisciplinary field of computational/quantitative biology. To this end, we examined the proportion of female authors in publications from the PubMed and arXiv databases. There are fewer female authors on research papers in computational biology, as compared to biology in general. This is true across authorship position, year, and journal impact factor. A comparison with arXiv shows that quantitative biology papers have a higher ratio of female authors than computer science papers, placing computational biology in between its two parent fields in terms of gender representation. Both in biology and in computational biology, a female last author increases the probability of other authors on the paper being female, pointing to a potential role of female PIs in influencing the gender balance.

  7. Computer narratology: narrative templates in computer games

    OpenAIRE

    Praks, Vítězslav

    2009-01-01

    Relations and interactions between literature and computer games were examined. The study contains a theoretical analysis of the game as an aesthetic artefact. To play a game means to leave the practical world for the sake of a fictional world. Artistic communication has more similarities with game communication than with normal, practical communication. Game study can help us understand basic concepts of art communication (game rules - poetic rules, game world - fiction, function in game - meaning in art). Compute...

  8. Place-Specific Computing

    DEFF Research Database (Denmark)

    Messeter, Jörn; Johansson, Michael

    An increased interest in the notion of place has evolved in interaction design. Proliferation of wireless infrastructure, developments in digital media, and a ‘spatial turn’ in computing provide the base for place-specific computing as a suggested new genre of interaction design. In the REcult project, place-specific computing is explored through design-oriented research. This article reports six pilot studies where design students have designed concepts for place-specific computing in Berlin (Germany), Cape Town (South Africa), Rome (Italy) and Malmö (Sweden). Background and arguments for place-specific computing as a genre of interaction design are described. A total of 36 design concepts designed for 16 designated zones in the four cities are presented. An analysis of the design concepts is presented, indicating potentials, possibilities and problems as directions for future...

  9. Non-Causal Computation

    Directory of Open Access Journals (Sweden)

    Ämin Baumeler

    2017-07-01

    Full Text Available Computation models such as circuits describe sequences of computation steps that are carried out one after the other. In other words, algorithm design is traditionally subject to the restriction imposed by a fixed causal order. We address a novel computing paradigm beyond quantum computing, replacing this assumption by mere logical consistency: We study non-causal circuits, where a fixed time structure within a gate is locally assumed whilst the global causal structure between the gates is dropped. We present examples of logically consistent non-causal circuits outperforming all causal ones; they imply that suppressing loops entirely is more restrictive than just avoiding the contradictions they can give rise to. That fact is already known for correlations as well as for communication, and we here extend it to computation.

  10. Toward Cloud Computing Evolution

    OpenAIRE

    Susanto, Heru; Almunawar, Mohammad Nabil; Kang, Chen Chin

    2012-01-01

    Information Technology (IT) has shaped the success of organizations, giving them a solid foundation that increases both their efficiency and their productivity. The computing industry is witnessing a paradigm shift in the way computing is performed worldwide. There is a growing awareness among consumers and enterprises of accessing their IT resources extensively through a "utility" model known as "cloud computing." Cloud computing was initially rooted in distributed grid-based computing. ...

  11. Reliability in the utility computing era: Towards reliable Fog computing

    DEFF Research Database (Denmark)

    Madsen, Henrik; Burtschy, Bernard; Albeanu, G.

    2013-01-01

    This paper considers current paradigms in computing and outlines the most important aspects concerning their reliability. The Fog computing paradigm, as a non-trivial extension of the Cloud, is considered, and the reliability of networks of smart devices is discussed. Combining the reliability requirements of the grid and cloud paradigms with the reliability requirements of networks of sensors and actuators, it follows that designing a reliable Fog computing platform is feasible.

  12. The Computer Revolution.

    Science.gov (United States)

    Berkeley, Edmund C.

    "The Computer Revolution", a part of the "Second Industrial Revolution", is examined with reference to the social consequences of computers. The subject is introduced in an opening section which discusses the revolution in the handling of information and the history, powers, uses, and workings of computers. A second section examines in detail the…

  13. Essential numerical computer methods

    CERN Document Server

    Johnson, Michael L

    2010-01-01

    The use of computers and computational methods has become ubiquitous in biological and biomedical research. During the last two decades most basic algorithms have not changed, but what has is the huge increase in computer speed and ease of use, along with the corresponding orders-of-magnitude decrease in cost. A general perception exists that the only applications of computers and computer methods in biological and biomedical research are either basic statistical analysis or the searching of DNA sequence databases. While these are important applications, they only scratch the surface of the current and potential applications of computers and computer methods in biomedical research. The various chapters within this volume include a wide variety of applications that extend far beyond this limited perception. As part of the Reliable Lab Solutions series, Essential Numerical Computer Methods brings together chapters from volumes 210, 240, 321, 383, 384, 454, and 467 of Methods in Enzymology. These chapters provide ...

  14. Applications of computer algebra

    CERN Document Server

    1985-01-01

    Today, certain computer software systems exist which surpass the computational ability of researchers when their mathematical techniques are applied to many areas of science and engineering. These computer systems can perform a large portion of the calculations seen in mathematical analysis. Despite this massive power, thousands of people use these systems as a routine resource for everyday calculations. These software programs are commonly called "Computer Algebra" systems. They have names such as MACSYMA, MAPLE, muMATH, REDUCE and SMP. They are receiving credit as a computational aid with increasing regularity in articles in the scientific and engineering literature. When most people think about computers and scientific research these days, they imagine a machine grinding away, processing numbers arithmetically. It is not generally realized that, for a number of years, computers have been performing non-numeric computations. This means, for example, that one inputs an equation and obtains a closed for...

  15. Quantum Computing

    Indian Academy of Sciences (India)

    Quantum Computing - Building Blocks of a Quantum Computer. C S Vijay and Vishal Gupta. General Article, Resonance – Journal of Science Education, Volume 5, Issue 9, September 2000, pp. 69-81.

  16. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  17. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify a technique for forming representations of modeling methodology in computer science lessons. The necessity of studying computer modeling lies in the fact that current trends toward strengthening the general-education and worldview functions of computer science call for additional research of the…

  18. Computer Graphics 2: More of the Best Computer Art and Design.

    Science.gov (United States)

    1994

    This collection of computer generated images aims to present media tools and processes, stimulate ideas, and inspire artists and art students working in computer-related design. The images are representative of state-of-the-art editorial, broadcast, packaging, fine arts, and graphic techniques possible through computer generation. Each image is…

  19. Mitochondrial electron transport chain functions in long-lived Ames dwarf mice

    Science.gov (United States)

    Choksi, Kashyap B.; Nuss, Jonathan E.; DeFord, James H.; Papaconstantinou, John

    2011-01-01

    The age-associated decline in tissue function has been attributed to ROS-mediated oxidative damage due to mitochondrial dysfunction. The long-lived Ames dwarf mouse exhibits resistance to oxidative stress, a physiological characteristic of longevity. It is not known, however, whether there are differences in the electron transport chain (ETC) functions in Ames tissues that are associated with their longevity. In these studies we analyzed enzyme activities of ETC complexes, CI-CV and the coupled CI-CII and CII-CIII activities of mitochondria from several tissues of young, middle aged and old Ames dwarf mice and their corresponding wild type controls to identify potential mitochondrial prolongevity functions. Our studies indicate that post-mitotic heart and skeletal muscle from Ames and wild-type mice show similar changes in ETC complex activities with aging, with the exception of complex IV. Furthermore, the kidney, a slowly proliferating tissue, shows dramatic differences in ETC functions unique to the Ames mice. Our data show that there are tissue specific mitochondrial functions that are characteristic of certain tissues of the long-lived Ames mouse. We propose that this may be a factor in the determination of extended lifespan of dwarf mice. PMID:21934186

  20. Paracetamol and salicylic acid removal from contaminated water by microalgae.

    Science.gov (United States)

    Escapa, C; Coimbra, R N; Paniagua, S; García, A I; Otero, M

    2017-12-01

    The biomass growth, pharmaceutical removal and light conversion efficiency of Chlorella sorokiniana in the presence of paracetamol (PC) and salicylic acid (SaC) were assessed and compared at two different concentrations of these pharmaceuticals (I: 25 mg l⁻¹, II: 250 mg l⁻¹). Microalgae were resistant to these concentrations and, moreover, their growth was significantly stimulated (p ≤ 0.05) under these drugs (biomass concentration increased above 33% PCI, 35% SaCI, 13% PCII and 45% SaCII, as compared with the respective positive controls). At the steady state of the semicontinuous culture, C. sorokiniana showed removal efficiencies above 41% and 69% for PCI and PCII, respectively; and above 93% and 98% for SaCI and SaCII, respectively. Under an irradiance of 370 μE m⁻² s⁻¹, higher quantum yields were reached by microalgae in the presence of drugs, either at dose I or II, than by the respective positive controls. These results point to C. sorokiniana as a robust strain for the bioremediation of paracetamol- and salicylic acid-concentrated wastewaters. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. A Compute Environment of ABC95 Array Computer Based on Multi-FPGA Chip

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The ABC95 array computer is a multi-function-network computer based on FPGA technology. The multi-function network supports conflict-free access by processors to data in memory and supports processor-to-processor data access over an enhanced MESH network. The ABC95 instruction set includes control instructions, scalar instructions, and vector instructions; the network instructions in particular are introduced. A programming environment for ABC95 array computer assembly language is designed, and a VC++ programming environment for the ABC95 array computer is presented. It includes functions to load ABC95 array computer programs and data, to store them, to run them, and so on. In particular, the data type for the ABC95 array computer's conflict-free access is defined. The results show that these technologies support effective program development for the ABC95 array computer.

  2. Medical image computing for computer-supported diagnostics and therapy. Advances and perspectives.

    Science.gov (United States)

    Handels, H; Ehrhardt, J

    2009-01-01

    Medical image computing has become one of the most challenging fields in medical informatics. In image-based diagnostics of the future software assistance will become more and more important, and image analysis systems integrating advanced image computing methods are needed to extract quantitative image parameters to characterize the state and changes of image structures of interest (e.g. tumors, organs, vessels, bones etc.) in a reproducible and objective way. Furthermore, in the field of software-assisted and navigated surgery medical image computing methods play a key role and have opened up new perspectives for patient treatment. However, further developments are needed to increase the grade of automation, accuracy, reproducibility and robustness. Moreover, the systems developed have to be integrated into the clinical workflow. For the development of advanced image computing systems methods of different scientific fields have to be adapted and used in combination. The principal methodologies in medical image computing are the following: image segmentation, image registration, image analysis for quantification and computer assisted image interpretation, modeling and simulation as well as visualization and virtual reality. Especially, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients and will gain importance in diagnostic and therapy of the future. From a methodical point of view the authors identify the following future trends and perspectives in medical image computing: development of optimized application-specific systems and integration into the clinical workflow, enhanced computational models for image analysis and virtual reality training systems, integration of different image computing methods, further integration of multimodal image data and biosignals and advanced methods for 4D medical image computing. The development of image analysis systems for diagnostic support or

  3. Quantum walk computation

    International Nuclear Information System (INIS)

    Kendon, Viv

    2014-01-01

    Quantum versions of random walks have diverse applications that are motivating experimental implementations as well as theoretical studies. Recent results showing quantum walks are “universal for quantum computation” relate to algorithms to be run on quantum computers. We consider whether an experimental implementation of a quantum walk could provide useful computation before we have a universal quantum computer.

  4. Distributed computing system with dual independent communications paths between computers and employing split tokens

    Science.gov (United States)

    Rasmussen, Robert D. (Inventor); Manning, Robert M. (Inventor); Lewis, Blair F. (Inventor); Bolotin, Gary S. (Inventor); Ward, Richard S. (Inventor)

    1990-01-01

    This is a distributed computing system providing flexible fault tolerance; ease of software design and concurrency specification; and dynamic load balancing. The system comprises a plurality of computers, each having a first input/output interface and a second input/output interface for interfacing to communications networks, each second input/output interface including a bypass for bypassing the associated computer. A global communications network interconnects the first input/output interfaces, providing each computer the ability to broadcast messages simultaneously to the remainder of the computers. A meshwork communications network interconnects the second input/output interfaces, providing each computer the ability to establish a communications link with another of the computers, bypassing the remainder of the computers. Each computer is controlled by a resident copy of a common operating system. Communication between respective computers is by means of split tokens, each having a moving first portion which is sent from computer to computer and a resident second portion which is disposed in the memory of at least one of the computers, the location of the second portion being part of the first portion. The split tokens represent both functions to be executed by the computers and data to be employed in the execution of those functions. The first input/output interfaces each include logic for detecting a collision between messages and for terminating the broadcasting of a message, whereby collisions between messages are detected and avoided.
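
    The split-token mechanism lends itself to a small data-structure sketch. The following is a hypothetical illustration (all class and field names are ours, not from the patent): the moving first portion names the function to execute and embeds the location of the resident second portion, which holds the data.

```python
from dataclasses import dataclass

@dataclass
class ResidentPortion:
    """Resident second portion: data employed in executing the function."""
    data: list

@dataclass
class MovingPortion:
    """Moving first portion, sent from computer to computer."""
    function: str       # function to be executed
    home_computer: int  # which computer holds the resident portion
    resident_key: str   # where in that computer's memory it sits

# Each computer's memory maps keys to resident portions.
memories = {
    0: {"job-42": ResidentPortion(data=[3, 1, 2])},
    1: {},
}

def receive(token: MovingPortion, memories):
    """A computer receiving the moving portion locates the resident portion
    via the location embedded in the token, then executes the function."""
    resident = memories[token.home_computer][token.resident_key]
    if token.function == "sort":
        resident.data.sort()
    return resident.data

token = MovingPortion(function="sort", home_computer=0, resident_key="job-42")
print(receive(token, memories))  # -> [1, 2, 3]
```

    The design choice the sketch highlights is that only the small first portion travels the network; the bulk data stays resident, and the token carries enough information to find it.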

  5. Cloud computing for radiologists

    OpenAIRE

    Amit T Kharat; Amjad Safvi; S S Thind; Amarjit Singh

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as software and hardware on a pay-per-use model. Using cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as...

  6. Modelling, abstraction, and computation in systems biology: A view from computer science.

    Science.gov (United States)

    Melham, Tom

    2013-04-01

    Systems biology is centrally engaged with computational modelling across multiple scales and at many levels of abstraction. Formal modelling, precise and formalised abstraction relationships, and computation also lie at the heart of computer science--and over the past decade a growing number of computer scientists have been bringing their discipline's core intellectual and computational tools to bear on biology in fascinating new ways. This paper explores some of the apparent points of contact between the two fields, in the context of a multi-disciplinary discussion on conceptual foundations of systems biology. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. All-optical reservoir computing.

    Science.gov (United States)

    Duport, François; Schneider, Bendix; Smerieri, Anteo; Haelterman, Marc; Massar, Serge

    2012-09-24

    Reservoir Computing is a novel computing paradigm that uses a nonlinear recurrent dynamical system to carry out information processing. Recent electronic and optoelectronic Reservoir Computers based on an architecture with a single nonlinear node and a delay loop have shown performance on standardized tasks comparable to state-of-the-art digital implementations. Here we report an all-optical implementation of a Reservoir Computer, made of off-the-shelf components for optical telecommunications. It uses the saturation of a semiconductor optical amplifier as nonlinearity. The present work shows that, within the Reservoir Computing paradigm, all-optical computing with state-of-the-art performance is possible.
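
    The single-nonlinear-node-plus-delay-loop architecture mentioned above can be imitated in software. The sketch below is our own illustrative analogue (the node count, tanh nonlinearity, and scaling constants are our assumptions, not the authors' optical parameters): a delay line of virtual node states is updated through one nonlinearity, and a linear readout is trained on a toy one-step memory task.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50                        # virtual nodes held in the delay loop
mask = rng.uniform(-1, 1, N)  # fixed random input mask
alpha, beta = 0.5, 0.9        # feedback and input scaling

def reservoir_states(u):
    """Drive the single-node-plus-delay-loop reservoir with input u."""
    state = np.zeros(N)
    states = []
    for sample in u:
        # one nonlinearity, applied to delayed state plus masked input
        state = np.tanh(alpha * state + beta * mask * sample)
        states.append(state.copy())
    return np.array(states)

# Toy task: linear readout trained to recover u[t-1] (one-step memory).
u = rng.uniform(-1, 1, 500)
X = reservoir_states(u)
target = np.roll(u, 1)
target[0] = 0.0
w, *_ = np.linalg.lstsq(X[10:], target[10:], rcond=None)
err = np.mean((X[10:] @ w - target[10:]) ** 2)
print(f"readout MSE: {err:.4f}")  # a zero readout would score ~0.33
```

    Only the linear readout weights are trained; the reservoir itself (mask, feedback) stays fixed, which is what makes hardware implementations such as the optical one above attractive.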

  8. 77 FR 20047 - Certain Computer and Computer Peripheral Devices and Components Thereof and Products Containing...

    Science.gov (United States)

    2012-04-03

    ... INTERNATIONAL TRADE COMMISSION [DN 2889] Certain Computer and Computer Peripheral Devices and... Certain Computer and Computer Peripheral Devices and Components Thereof and Products Containing the Same... importation, and the sale within the United States after importation of certain computer and computer...

  9. Numbers and computers

    CERN Document Server

    Kneusel, Ronald T

    2015-01-01

    This is a book about numbers and how those numbers are represented in and operated on by computers. It is crucial that developers understand this area because the numerical operations allowed by computers, and the limitations of those operations, especially in the area of floating point math, affect virtually everything people try to do with computers. This book aims to fill this gap by exploring, in sufficient but not overwhelming detail, just what it is that computers do with numbers. Divided into two parts, the first deals with standard representations of integers and floating point numb
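
    A short example of the kind of behavior the book addresses (the snippet is our illustration, not taken from the book): decimal fractions have no exact binary floating point representation, and fixed-width integers overflow.

```python
import struct

# Decimal 0.1 and 0.2 are stored as nearest-binary approximations, so
# arithmetic that looks exact in decimal accumulates representation error.
a = 0.1 + 0.2
print(a == 0.3)     # False: both sides are approximations
print(f"{a:.20f}")  # the stored value is slightly above 0.3

# Fixed-width integers have their own limit: a 32-bit two's-complement
# value wraps past 2**31 - 1.  Python's own ints are unbounded, so we
# round-trip through a packed 32-bit representation to show the wrap.
big = 2**31 - 1
wrapped = struct.unpack("<i", struct.pack("<I", (big + 1) % 2**32))[0]
print(wrapped)      # -2147483648: overflow in a fixed 32-bit register
```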

  10. New computing systems, future computing environment, and their implications on structural analysis and design

    Science.gov (United States)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  16. The computer boys take over computers, programmers, and the politics of technical expertise

    CERN Document Server

    Ensmenger, Nathan L

    2010-01-01

    This is a book about the computer revolution of the mid-twentieth century and the people who made it possible. Unlike most histories of computing, it is not a book about machines, inventors, or entrepreneurs. Instead, it tells the story of the vast but largely anonymous legions of computer specialists -- programmers, systems analysts, and other software developers -- who transformed the electronic computer from a scientific curiosity into the defining technology of the modern era. As the systems that they built became increasingly powerful and ubiquitous, these specialists became the focus of a series of critiques of the social and organizational impact of electronic computing. To many of their contemporaries, it seemed the "computer boys" were taking over, not just in the corporate setting, but also in government, politics, and society in general. In The Computer Boys Take Over, Nathan Ensmenger traces the rise to power of the computer expert in modern American society. His rich and nuanced portrayal of the ...

  17. Forensic Computing (Dagstuhl Seminar 13482)

    OpenAIRE

    Freiling, Felix C.; Hornung, Gerrit; Polcák, Radim

    2014-01-01

    Forensic computing (sometimes also called digital forensics, computer forensics or IT forensics) is a branch of forensic science pertaining to digital evidence, i.e., any legal evidence that is processed by digital computer systems or stored on digital storage media. Forensic computing is a new discipline evolving within the intersection of several established research areas such as computer science, computer engineering and law. Forensic computing is rapidly gaining importance since the...

  18. Design of Computer Experiments

    DEFF Research Database (Denmark)

    Dehlendorff, Christian

    The main topic of this thesis is the design and analysis of computer and simulation experiments, dealt with in six papers and a summary report. Simulation and computer models have in recent years received increasing attention due to their growing complexity and usability. Software packages make the development of rather complicated computer models from predefined building blocks possible. This implies that the range of phenomena analyzed by means of a computer model has expanded significantly. As the complexity grows, so does the need for efficient experimental designs and analysis methods, since complex computer models are often expensive to use in terms of computer time. The choice of performance parameter is an important part of the analysis of computer and simulation models, and Paper A introduces a new statistic for waiting times in health care units. The statistic...

  19. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem conspicuously absent from the literature on biomolecular computing; to partially redress this absence, we introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code, and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways...

  20. Performing an allreduce operation on a plurality of compute nodes of a parallel computer

    Science.gov (United States)

    Faraj, Ahmad [Rochester, MN

    2012-04-17

    Methods, apparatus, and products are disclosed for performing an allreduce operation on a plurality of compute nodes of a parallel computer. Each compute node includes at least two processing cores. Each processing core has contribution data for the allreduce operation. Performing an allreduce operation on a plurality of compute nodes of a parallel computer includes: establishing one or more logical rings among the compute nodes, each logical ring including at least one processing core from each compute node; performing, for each logical ring, a global allreduce operation using the contribution data for the processing cores included in that logical ring, yielding a global allreduce result for each processing core included in that logical ring; and performing, for each compute node, a local allreduce operation using the global allreduce results for each processing core on that compute node.
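The two-phase decomposition described in the abstract (per-core logical rings spanning all nodes, followed by a node-local combine) can be sketched serially; this is an illustrative model of the data flow only, not the patented implementation, and the ring reduction is collapsed into a plain sum.

```python
# Serial sketch of the two-phase allreduce described above; an
# illustrative model of the data flow, not the patented implementation.

def allreduce_two_phase(contrib):
    """contrib[node][core] -> same-shaped result, every entry the total."""
    n_nodes, n_cores = len(contrib), len(contrib[0])
    # Phase 1: one logical ring per core index, spanning every node;
    # each ring performs a global allreduce over its members' data.
    ring_results = [sum(contrib[node][c] for node in range(n_nodes))
                    for c in range(n_cores)]
    # Phase 2: each node locally combines the per-ring results held by
    # its cores, yielding the full allreduce result on every core.
    total = sum(ring_results)
    return [[total] * n_cores for _ in range(n_nodes)]

print(allreduce_two_phase([[1, 2], [3, 4], [5, 6]]))  # 3 nodes, 2 cores
# → [[21, 21], [21, 21], [21, 21]]
```

Splitting the work this way means each global ring carries only one core's worth of data, with the cheap node-local combine done afterwards in shared memory.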

  1. Quantum computing

    OpenAIRE

    Burba, M.; Lapitskaya, T.

    2017-01-01

    This article gives an elementary introduction to quantum computing. It is a draft for a book chapter of the "Handbook of Nature-Inspired and Innovative Computing", Eds. A. Zomaya, G.J. Milburn, J. Dongarra, D. Bader, R. Brent, M. Eshaghian-Wilner, F. Seredynski (Springer, Berlin Heidelberg New York, 2006).

  2. Gender differences in the use of computers, programming, and peer interactions in computer science classrooms

    Science.gov (United States)

    Stoilescu, Dorian; Egodawatte, Gunawardena

    2010-12-01

    Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are more interested in the use of computers than in programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new definitions for computer science culture but to see how male and female students see themselves involved in computer science practices, how they see computer science as a successful career, and what they like and dislike about current computer science practices. The study took place in a mid-sized university in Ontario. Sixteen students and two instructors were interviewed to get their views. We found that male and female views differ on computer use, programming, and the pattern of student interactions. Female and male students did not have any major issues in using computers. In computer programming, female students were much less involved than male students, who were heavily involved. As for opinions about successful computer science professionals, both female and male students emphasized hard work, detail-oriented approaches, and enjoyment of playing with computers. The myth of the geek as the typical profile of a successful computer science student was not found to be true.

  3. Emission computed tomography

    International Nuclear Information System (INIS)

    Ott, R.J.

    1986-01-01

    Emission Computed Tomography is a technique used for producing single or multiple cross-sectional images of the distribution of radionuclide labelled agents in vivo. The techniques of Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) are described with particular regard to the function of the detectors used to produce images and the computer techniques used to build up images. (UK)

  4. Intelligent distributed computing

    CERN Document Server

    Thampi, Sabu

    2015-01-01

    This book contains a selection of refereed and revised papers of the Intelligent Distributed Computing Track originally presented at the third International Symposium on Intelligent Informatics (ISI-2014), September 24-27, 2014, Delhi, India.  The papers selected for this Track cover several Distributed Computing and related topics including Peer-to-Peer Networks, Cloud Computing, Mobile Clouds, Wireless Sensor Networks, and their applications.

  5. Computing environment logbook

    Science.gov (United States)

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
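A minimal sketch of the behaviour described in this record (event history, keyword search, undo of a selected past event); the class and method names are invented, and undo is modelled here as storing an inverse action with each event, which the patent text does not specify.

```python
# Illustrative sketch only: names and the undo-by-inverse-action
# mechanism are assumptions, not taken from the patent.

class EnvironmentLogbook:
    def __init__(self):
        self.history = []                   # past events, oldest first

    def log(self, description, undo_action=None):
        self.history.append({"event": description, "undo": undo_action})

    def search(self, keyword):
        # Find past events whose description mentions the keyword.
        return [e for e in self.history if keyword in e["event"]]

    def undo(self, event):
        if event["undo"]:
            event["undo"]()                 # run the inverse action
        self.history.remove(event)

state = {"x": 0}
book = EnvironmentLogbook()
book.log("set x = 5", undo_action=lambda: state.update(x=0))
state["x"] = 5
hit = book.search("x = 5")[0]               # locate the past event
book.undo(hit)                              # roll it back
print(state["x"])  # → 0
```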

  6. Computing meaning v.4

    CERN Document Server

    Bunt, Harry; Pulman, Stephen

    2013-01-01

    This book is a collection of papers by leading researchers in computational semantics. It presents a state-of-the-art overview of recent and current research in computational semantics, including descriptions of new methods for constructing and improving resources for semantic computation, such as WordNet, VerbNet, and semantically annotated corpora. It also presents new statistical methods in semantic computation, such as the application of distributional semantics in the compositional calculation of sentence meanings. Computing the meaning of sentences, texts, and spoken or texted dialogue i

  7. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.
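As a small instance of one technique the book covers (finite difference methods; the code itself is our own minimal example, not taken from the book), a central difference approximates a derivative with error of order h²:

```python
import math

# Central difference: f'(x) ≈ (f(x+h) - f(x-h)) / (2h), error O(h^2).

def central_diff(f, x, h=1e-5):
    return (f(x + h) - f(x - h)) / (2 * h)

# d/dx sin(x) at x = 1 should be cos(1):
error = abs(central_diff(math.sin, 1.0) - math.cos(1.0))
print(error < 1e-9)  # → True
```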

  8. Cloud Computing:Strategies for Cloud Computing Adoption

    OpenAIRE

    Shimba, Faith

    2010-01-01

    The advent of cloud computing in recent years has sparked an interest from different organisations, institutions and users to take advantage of web applications. This is a result of the new economic model for the Information Technology (IT) department that cloud computing promises. The model promises a shift from an organisation required to invest heavily for limited IT resources that are internally managed, to a model where the organisation can buy or rent resources that are managed by a clo...

  9. Computational Ocean Acoustics

    CERN Document Server

    Jensen, Finn B; Porter, Michael B; Schmidt, Henrik

    2011-01-01

    Since the mid-1970s, the computer has played an increasingly pivotal role in the field of ocean acoustics. Faster and less expensive than actual ocean experiments, and capable of accommodating the full complexity of the acoustic problem, numerical models are now standard research tools in ocean laboratories. The progress made in computational ocean acoustics over the last thirty years is summed up in this authoritative and innovatively illustrated new text. Written by some of the field's pioneers, all Fellows of the Acoustical Society of America, Computational Ocean Acoustics presents the latest numerical techniques for solving the wave equation in heterogeneous fluid–solid media. The authors discuss various computational schemes in detail, emphasizing the importance of theoretical foundations that lead directly to numerical implementations for real ocean environments. To further clarify the presentation, the fundamental propagation features of the techniques are illustrated in color. Computational Ocean A...

  10. Research on cloud computing solutions

    OpenAIRE

    Liudvikas Kaklauskas; Vaida Zdanytė

    2015-01-01

    Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang adapted six phases of computing paradigms, from dummy terminals/mainframes, to PCs, to network computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, ...

  11. Deliverable D8.4 - Report on Collaboration with Standardization Bodies, NIS and Other External Groups

    OpenAIRE

    Papastergiou, Spyros; Karantjias, Thanos; Chatzikou, Menia; Vidali, Peggy; Douligeris, Christos; Stavrakakis, Ioannis; Kalogeraki, Eleni-Maria; Zacharias, Marios; Purcarea, Razvan; Cocor, Gabriel; Schauer, Stefan; König, Sandra; Mouratidis, Haris; Pavlidis, Michalis; Polatidis, Nikolaos

    2018-01-01

    Despite the importance of Critical Information Infrastructures (CIIs) and dynamic ICT‐based maritime supply chains (SCs) for port operations, state‐of‐the‐art Risk Management (RM) methodologies for maritime environments pay limited attention to cyber‐security and do not adequately address security processes for international SCs. Motivated by these limitations, MITIGATE will introduce, integrate, validate and commercialize a novel RM system, which will empower stakeholders’ collaboration for ...

  12. Deliverable D4.2. Report on Standards and Regulations Compliance

    OpenAIRE

    Duzha, Armend; Polemi, Nineta; Gouvas, Panagiotis; Karantjias, Athanasis; Papastergious, Spyros; Patsakis, Constantinos; Douligeris, Christios; Glykos, Stamatios; Exarchou, Georgios

    2016-01-01

    Despite the importance of Critical Information Infrastructures (CIIs) and dynamic ICT-based maritime supply chains (SCs) for port operations, state-of-the-art Risk Management (RM) methodologies for maritime environments pay limited attention to cyber-security and do not adequately address security processes for international SCs. Motivated by these limitations, MITIGATE will introduce, integrate, validate and commercialize a novel RM system, which will empower stakeholders’ collaboration for ...

  13. Deliverable D7.5 - Best Practices for Replicability and Wider Use

    OpenAIRE

    Karantjias, Thanos; Papastergiou, Spyros; Zacharias, Marios; Purcarea, Razvan; Cocor, Gabriel; Chatzikou, Menia; Douligeris, Christos; Negkas, Dimitrios; Mitropoulos, Sarantis; Daikos, Lampros; Bosse, Claudia

    2018-01-01

    Despite the importance of Critical Information Infrastructures (CIIs) and dynamic ICT‐based maritime supply chains (SCs) for port operations, state‐of‐the‐art Risk Management (RM) methodologies for maritime environments pay limited attention to cyber‐security and do not adequately address security processes for international SCs. Motivated by these limitations, MITIGATE will introduce, integrate, validate and commercialize a novel RM system, which will empower stakeholders’ collaboration for ...

  14. Computing in high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Watase, Yoshiyuki

    1991-09-15

    The increasingly important role played by computing and computers in high energy physics is displayed in the 'Computing in High Energy Physics' series of conferences, bringing together experts in different aspects of computing - physicists, computer scientists, and vendors.

  15. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    International Nuclear Information System (INIS)

    Evans, D; Fisk, I; Holzman, B; Pordes, R; Tiradani, A; Melo, A; Sheldon, P; Metson, S

    2011-01-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services makes them inappropriate for some CMS production services and functions. We also found that the resources are not truly 'on-demand', as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a University, and conclude that it is most cost-effective to purchase dedicated resources for the 'base-line' needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.
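The abstract's conclusion (own resources for the base load, burst to the cloud for spikes) can be illustrated with back-of-envelope arithmetic; every price, core count, and utilization figure below is invented for illustration and is not taken from the paper.

```python
# Back-of-envelope sketch of the base-line vs. burst reasoning; all
# constants are invented for illustration, not figures from the paper.

DEDICATED_HOURLY = 0.04   # amortized cost per core-hour, owned cluster (assumed)
CLOUD_HOURLY = 0.10       # on-demand cost per core-hour (assumed)

def annual_cost(baseline_cores, peak_cores, peak_hours, total_hours=8760):
    # Hybrid: own enough cores for the base load, rent the spike on demand.
    hybrid = (baseline_cores * total_hours * DEDICATED_HOURLY
              + (peak_cores - baseline_cores) * peak_hours * CLOUD_HOURLY)
    # All-dedicated: own enough for the peak, idle the excess off-peak.
    all_dedicated = peak_cores * total_hours * DEDICATED_HOURLY
    return hybrid, all_dedicated

hybrid, dedicated = annual_cost(baseline_cores=1000, peak_cores=1500,
                                peak_hours=500)
print(hybrid < dedicated)  # → True
```

With these assumed numbers the hybrid strategy wins because the extra 500 cores are needed for only 500 of 8760 hours, so even a higher hourly cloud price beats owning idle hardware.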

  16. Computer algebra and operators

    Science.gov (United States)

    Fateman, Richard; Grossman, Robert

    1989-01-01

    The symbolic computation of operator expansions is discussed. Some of the capabilities that prove useful when performing computer algebra computations involving operators are considered. These capabilities may be broadly divided into three areas: the algebraic manipulation of expressions from the algebra generated by operators; the algebraic manipulation of the actions of the operators upon other mathematical objects; and the development of appropriate normal forms and simplification algorithms for operators and their actions. Brief descriptions are given of the computer algebra computations that arise when working with various operators and their actions.
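As a toy illustration of two of the capabilities named above (manipulating expressions in an operator algebra and reducing them to a normal form), the following sketch rewrites words in x and D = d/dx using the Weyl-algebra relation Dx = xD + 1. It is an invented minimal example, not code from the paper.

```python
from collections import Counter

# Toy operator algebra: words over {x, D} with the relation D x = x D + 1,
# rewritten into normally-ordered form x^a D^b with integer coefficients.

def normal_form(word):
    """Map normally-ordered words -> integer coefficients."""
    terms = Counter({word: 1})
    changed = True
    while changed:
        changed = False
        new = Counter()
        for w, c in terms.items():
            i = w.find("Dx")
            if i == -1:
                new[w] += c                          # already normal
            else:
                new[w[:i] + "xD" + w[i + 2:]] += c   # commute D past x ...
                new[w[:i] + w[i + 2:]] += c          # ... plus derivative term
                changed = True
        terms = new
    return terms

# (Dx)(Dx) normal-orders to x^2 D^2 + 3 x D + 1:
print(dict(normal_form("DxDx")))
```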

  17. Computing in high energy physics

    International Nuclear Information System (INIS)

    Watase, Yoshiyuki

    1991-01-01

    The increasingly important role played by computing and computers in high energy physics is displayed in the 'Computing in High Energy Physics' series of conferences, bringing together experts in different aspects of computing - physicists, computer scientists, and vendors

  18. Living with Computers. Young Danes' Uses of and Thoughts on the Uses of Computers

    DEFF Research Database (Denmark)

    Stald, Gitte Bang

    1998-01-01

    Young Danes, computers, users, super users, non-users, computer access.

  19. Gender and Computers: Two Surveys of Computer-Related Attitudes.

    Science.gov (United States)

    Wilder, Gita; And Others

    1985-01-01

    Describes two surveys used to (1) determine sex differences in attitudes toward computers and video games among schoolchildren and the relationship of these attitudes to attitudes about science, math, and writing; and (2) sex differences in attitudes toward computing among a select group of highly motivated college freshmen. (SA)

  20. Development of real-time visualization system for Computational Fluid Dynamics on parallel computers

    International Nuclear Information System (INIS)

    Muramatsu, Kazuhiro; Otani, Takayuki; Matsumoto, Hideki; Takei, Toshifumi; Doi, Shun

    1998-03-01

    A real-time visualization system for computational fluid dynamics was developed on a network connecting a parallel computing server and a client terminal. Using the system, a user can visualize the results of a CFD (Computational Fluid Dynamics) simulation on the client terminal while the computation is actually running on the server. Through a GUI (Graphical User Interface) on the client terminal, the user is also able to change parameters of the analysis and visualization in real time during the calculation. The system carries out both the CFD simulation and the generation of pixel image data on the parallel computer, and compresses the data. The amount of data sent from the parallel computer to the client is therefore much smaller than without compression, so images appear swiftly for the user. Parallelization of the image data generation is based on the Owner Computation Rule. The GUI on the client is built as a Java applet, so real-time visualization is possible on any client PC on which a Web browser is installed. (author)
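A hedged sketch of the pipeline this record outlines: each worker generates only the pixel rows it owns (the Owner Computation Rule applied to image generation), and the assembled frame is compressed before being sent to the client. The rendering function, the row-block decomposition, and the use of zlib are stand-ins, not the actual system's code.

```python
import zlib

# Sketch only: render function, decomposition, and compressor are
# stand-ins for the system described in the abstract.

def owned_rows(height, rank, n_workers):
    # Contiguous block decomposition of image rows among workers.
    base, extra = divmod(height, n_workers)
    start = rank * base + min(rank, extra)
    return range(start, start + base + (1 if rank < extra else 0))

def render_row(y, width):
    # Stand-in for the real per-row rendering of the CFD field.
    return bytes((x * y) % 256 for x in range(width))

def render_frame(width, height, n_workers):
    frame = [None] * height
    for rank in range(n_workers):       # each rank computes only its rows
        for y in owned_rows(height, rank, n_workers):
            frame[y] = render_row(y, width)
    return b"".join(frame)

raw = render_frame(64, 48, n_workers=4)
packed = zlib.compress(raw)             # shrink before the network hop
print(len(raw), len(packed))
```

Compressing on the server is what keeps the server-to-client traffic small regardless of the simulation's size, which is the point the abstract makes.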

  1. Computer Self-Efficacy: A Practical Indicator of Student Computer Competency in Introductory IS Courses

    Directory of Open Access Journals (Sweden)

    Rex Karsten

    1998-01-01

    Students often receive their first college-level computer training in introductory information systems courses. Students and faculty frequently expect this training to develop a level of student computer competence that will support computer use in future courses. In this study, we applied measures of computer self-efficacy to students in a typical introductory IS course. The measures provided useful evidence that student perceptions of their ability to use computers effectively in the future significantly improved as a result of their training experience. The computer self-efficacy measures also provided enhanced insight into course-related factors of practical concern to IS educators. Study results also suggest computer self-efficacy measures may be a practical and informative means of assessing computer-training outcomes in the introductory IS course context.

  2. Foundations of Neuromorphic Computing

    Science.gov (United States)

    2013-05-01

    Paradigms: few sensors/complex computations and many sensors/simple computation. Challenges with nano-enabled neuromorphic chips: a wide variety of... Final technical report, May 2013, reporting period 2009 – September 2012 (in-house contract); approved for public release; distribution...

  3. Physical fitness and mitochondrial respiratory capacity in horse skeletal muscle.

    Directory of Open Access Journals (Sweden)

    Dominique-Marie Votion

    BACKGROUND: Within the animal kingdom, horses are among the most powerful aerobic athletic mammals. Determination of muscle respiratory capacity and control improves our knowledge of mitochondrial physiology in horses and of high aerobic performance in general. METHODOLOGY/PRINCIPAL FINDINGS: We applied high-resolution respirometry and multiple substrate-uncoupler-inhibitor titration protocols to study mitochondrial physiology in small (1.0–2.5 mg) permeabilized muscle fibres sampled from the triceps brachii of healthy horses. Oxidative phosphorylation (OXPHOS) capacity (pmol O₂·s⁻¹·mg⁻¹ wet weight) with combined Complex I and II (CI+II) substrate supply (malate+glutamate+succinate) increased from 77 ± 18 in overweight horses to 103 ± 18, 122 ± 15, and 129 ± 12 in untrained, trained and competitive horses (N = 3, 8, 16, and 5, respectively). Similar to human muscle mitochondria, equine OXPHOS capacity was limited by the phosphorylation system to 0.85 ± 0.10 (N = 32) of electron transfer capacity, independent of fitness level. In 15 trained horses, OXPHOS capacity increased from 119 ± 12 to 134 ± 37 when pyruvate was included in the CI+II substrate cocktail. Relative to this maximum OXPHOS capacity, Complex I (CI)-linked OXPHOS capacities were only 50% with glutamate+malate, 64% with pyruvate+malate, 68% with pyruvate+malate+glutamate, and ~78% with CII-linked succinate+rotenone. OXPHOS capacity with glutamate+malate increased with fitness relative to CI+II-supported ETS capacity, from a flux control ratio of 0.38 to 0.40, 0.41 and 0.46 in overweight to competitive horses, whereas the CII/CI+II substrate control ratio remained constant at 0.70. Therefore, the apparent deficit of CI- over CII-linked pathway capacity was reduced with physical fitness. CONCLUSIONS/SIGNIFICANCE: The scope of mitochondrial density-dependent OXPHOS capacity and the density-independent (qualitative) increase of CI-linked respiratory capacity with increased
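The control ratios quoted in this record are simple flux normalizations; as a hedged numeric illustration (a back-calculation from figures in the abstract, not a value reported by the authors):

```python
# Flux control ratio: a pathway flux normalized by a reference capacity.

def flux_control_ratio(flux, reference_capacity):
    return flux / reference_capacity

# Abstract: OXPHOS is limited to 0.85 of electron transfer (ETS) capacity.
# For competitive horses (CI+II OXPHOS = 129 pmol/(s*mg)), the implied
# ETS capacity would be roughly:
ets = 129 / 0.85
print(round(ets, 1))  # → 151.8
```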

  4. Energy-Aware Computation Offloading of IoT Sensors in Cloudlet-Based Mobile Edge Computing.

    Science.gov (United States)

    Ma, Xiao; Lin, Chuang; Zhang, Han; Liu, Jianwei

    2018-06-15

    Mobile edge computing is proposed as a promising computing paradigm to relieve the excessive burden of data centers and mobile networks, which is induced by the rapid growth of Internet of Things (IoT). This work introduces the cloud-assisted multi-cloudlet framework to provision scalable services in cloudlet-based mobile edge computing. Due to the constrained computation resources of cloudlets and limited communication resources of wireless access points (APs), IoT sensors with identical computation offloading decisions interact with each other. To optimize the processing delay and energy consumption of computation tasks, a theoretical analysis of the computation offloading decision problem of IoT sensors is presented in this paper. In more detail, the computation offloading decision problem of IoT sensors is formulated as a computation offloading game and the condition of Nash equilibrium is derived by introducing the tool of a potential game. By exploiting the finite improvement property of the game, the Computation Offloading Decision (COD) algorithm is designed to provide decentralized computation offloading strategies for IoT sensors. Simulation results demonstrate that the COD algorithm can significantly reduce the system cost compared with the random-selection algorithm and the cloud-first algorithm. Furthermore, the COD algorithm can scale well with increasing IoT sensors.
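The finite improvement property invoked above can be illustrated with best-response dynamics in a toy congestion game, where repeated unilateral improvements are guaranteed to reach a Nash equilibrium; the cost model and all constants below are invented for illustration and are not the paper's model.

```python
# Toy best-response dynamics illustrating the finite improvement property
# of a potential game; the congestion cost model and all constants are
# invented for illustration, not taken from the paper.

LOCAL_COST = 8.0                 # cost of processing locally (assumed)

def offload_cost(n_offloaders):
    # Shared-channel delay grows with the number of offloading sensors.
    return 2.0 + 1.5 * n_offloaders

def best_response_dynamics(n_sensors):
    offload = [False] * n_sensors            # start with everyone local
    improved = True
    while improved:                          # terminates: potential game
        improved = False
        for i in range(n_sensors):
            others = sum(offload) - offload[i]
            wants_offload = offload_cost(others + 1) < LOCAL_COST
            if wants_offload != offload[i]:
                offload[i] = wants_offload   # unilateral improvement step
                improved = True
    return offload                           # a Nash equilibrium

eq = best_response_dynamics(6)
print(sum(eq), "of 6 sensors offload at equilibrium")
# prints: 3 of 6 sensors offload at equilibrium
```

With these constants, offloading is attractive only while fewer than four sensors share the channel, so the dynamics settle with three offloaders and no sensor able to improve unilaterally.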

  5. Cloud Computing Governance Lifecycle

    Directory of Open Access Journals (Sweden)

    Soňa Karkošková

    2016-06-01

    Externally provisioned cloud services enable flexible and on-demand sourcing of IT resources. Cloud computing introduces new challenges, such as the need to redefine business processes, to establish specialized governance and management, organizational structures and relationships with external providers, and to manage new types of risk arising from dependency on external providers. There is a general consensus that cloud computing brings many benefits in addition to challenges, but it is unclear how to achieve them. Cloud computing governance helps to create business value by obtaining benefits from the use of cloud computing services while optimizing investment and risk. The challenge organizations face in governing cloud services is how to design and implement cloud computing governance so as to gain the expected benefits. This paper aims to provide guidance on the implementation activities of the proposed cloud computing governance lifecycle from the cloud consumer's perspective. The proposed model is based on the SOA Governance Framework and consists of a lifecycle for the implementation and continuous improvement of a cloud computing governance model.

  6. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    .... "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  7. Computational invariant theory

    CERN Document Server

    Derksen, Harm

    2015-01-01

    This book is about the computational aspects of invariant theory. Of central interest is the question how the invariant ring of a given group action can be calculated. Algorithms for this purpose form the main pillars around which the book is built. There are two introductory chapters, one on Gröbner basis methods and one on the basic concepts of invariant theory, which prepare the ground for the algorithms. Then algorithms for computing invariants of finite and reductive groups are discussed. Particular emphasis lies on interrelations between structural properties of invariant rings and computational methods. Finally, the book contains a chapter on applications of invariant theory, covering fields as disparate as graph theory, coding theory, dynamical systems, and computer vision. The book is intended for postgraduate students as well as researchers in geometry, computer algebra, and, of course, invariant theory. The text is enriched with numerous explicit examples which illustrate the theory and should be ...

  8. Perceptually-Inspired Computing

    Directory of Open Access Journals (Sweden)

    Ming Lin

    2015-08-01

    Human sensory systems allow individuals to see, hear, touch, and interact with the surrounding physical environment. Understanding human perception and its limits enables us to better exploit the psychophysics of human perceptual systems to design more efficient, adaptive algorithms and to develop perceptually-inspired computational models. In this talk, I will survey some recent efforts on perceptually-inspired computing with applications to crowd simulation and multimodal interaction. In particular, I will present data-driven personality modeling based on the results of user studies, example-guided physics-based sound synthesis using auditory perception, as well as perceptually-inspired simplification for multimodal interaction. These perceptually guided principles can be used to accelerate multimodal interaction and visual computing, thereby creating more natural human-computer interaction and providing more immersive experiences. I will also present their use in interactive applications for entertainment, such as video games, computer animation, and shared social experiences. I will conclude by discussing possible future research directions.

  9. Quantum Computing

    Indian Academy of Sciences (India)

    In the first part of this article, we had looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and some practical difficulties in building such a device. Quantum Algorithms.

  10. Volunteer Computing for Science Gateways

    OpenAIRE

    Anderson, David

    2017-01-01

    This poster offers information about volunteer computing for science gateways that offer high-throughput computing services. Volunteer computing can be used to obtain computing power; this increases the visibility of the gateway to the general public while adding computing capacity at little cost.

  11. Development of a computer-aided digital reactivity computer system for PWRs

    International Nuclear Information System (INIS)

    Chung, S.-K.; Sung, K.-Y.; Kim, D.; Cho, D.-Y.

    1993-01-01

    Reactor physics tests at initial startup and after reloading are performed to verify the nuclear design and to ensure safe operation. Two kinds of reactivity computers, analog and digital, have been widely used in pressurized water reactor (PWR) core physics tests. The test data of both reactivity computers are displayed only on a strip chart recorder, and these data are managed by hand, so the accuracy of the test results depends on operator expertise and experience. This paper describes the development of the computer-aided digital reactivity computer system (DRCS), which is enhanced by system management software and an improved system for the application of the PWR core physics test.
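
    The core calculation inside any digital reactivity computer is inverse point kinetics: reactivity is reconstructed from the measured flux signal by integrating the delayed-neutron precursor balance. The sketch below is our own minimal one-delayed-group illustration with made-up kinetics constants, not the DRCS implementation described in the paper.

```python
# One-delayed-group inverse point kinetics.
# All constants are illustrative, not taken from the paper.
BETA = 0.0065      # delayed-neutron fraction
LAMBDA_D = 0.08    # precursor decay constant, 1/s
GEN_TIME = 2e-5    # prompt-neutron generation time, s

def reactivity_trace(n_samples, dt):
    """Return reactivity (in dollars) for each flux sample after the first,
    assuming the reactor starts from steady state."""
    n0 = n_samples[0]
    c = BETA * n0 / (GEN_TIME * LAMBDA_D)   # equilibrium precursor level
    rho = []
    for k in range(1, len(n_samples)):
        n_prev, n = n_samples[k - 1], n_samples[k]
        dndt = (n - n_prev) / dt
        # advance the precursor balance dC/dt = (beta/Lambda) n - lambda C
        c += dt * (BETA / GEN_TIME * n_prev - LAMBDA_D * c)
        # inverse kinetics: rho = beta + Lambda*(dn/dt)/n - Lambda*lambda*C/n
        r = BETA + GEN_TIME * dndt / n - GEN_TIME * LAMBDA_D * c / n
        rho.append(r / BETA)                # express in dollars
    return rho

# A constant flux signal should read as zero reactivity.
print(max(abs(r) for r in reactivity_trace([1.0] * 100, 0.01)) < 1e-6)
```

    A real system would add filtering of the noisy flux signal and six delayed groups, but the structure of the calculation is the same.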

  12. Modern computer hardware and the role of central computing facilities in particle physics

    International Nuclear Information System (INIS)

    Zacharov, V.

    1981-01-01

    Important recent changes in the hardware technology of computer system components are reviewed, and the impact of these changes on the present and future pattern of computing in particle physics is assessed. The place of central computing facilities is particularly examined, to answer the important question of what, if anything, their future role should be. Parallelism in computing system components is considered an important property that can be exploited to advantage. The paper includes a short discussion of the position of communications and network technology in modern computer systems. (orig.)

  13. Computer Assisted Instructional Design for Computer-Based Instruction. Final Report. Working Papers.

    Science.gov (United States)

    Russell, Daniel M.; Pirolli, Peter

    Recent advances in artificial intelligence and the cognitive sciences have made it possible to develop successful intelligent computer-aided instructional systems for technical and scientific training. In addition, computer-aided design (CAD) environments that support the rapid development of such computer-based instruction have also been recently…

  14. COMPUTING: International symposium

    International Nuclear Information System (INIS)

    Anon.

    1984-01-01

    Recent Developments in Computing, Processor, and Software Research for High Energy Physics, a four-day international symposium, was held in Guanajuato, Mexico, from 8 to 11 May, with 112 attendees from nine countries. The symposium was the third in a series of meetings exploring activities in leading-edge computing technology, in both processor and software research, and their effects on high energy physics. Topics covered included fixed-target on- and off-line reconstruction processors; lattice gauge and general theoretical processors and computing; multiprocessor projects; electron-positron collider on- and off-line reconstruction processors; the state of the art in university computer science and industry; software research; accelerator processors; and proton-antiproton collider on- and off-line reconstruction processors.

  15. Quantum steady computation

    International Nuclear Information System (INIS)

    Castagnoli, G.

    1991-01-01

    This paper reports that current conceptions of quantum mechanical computers inherit from conventional digital machines two apparently interacting features, machine imperfection and temporal development of the computational process. On account of machine imperfection, the process would become ideally reversible only in the limiting case of zero speed. Therefore the process is irreversible in practice and cannot be considered to be a fundamental quantum one. By giving up classical features and using a linear, reversible and non-sequential representation of the computational process - not realizable in classical machines - the process can be identified with the mathematical form of a quantum steady state. This form of steady quantum computation would seem to have an important bearing on the notion of cognition

  16. Quantum steady computation

    Energy Technology Data Exchange (ETDEWEB)

    Castagnoli, G. (Dipt. di Informatica, Sistemistica, Telematica, Univ. di Genova, Viale Causa 13, 16145 Genova (IT))

    1991-08-10

    This paper reports that current conceptions of quantum mechanical computers inherit from conventional digital machines two apparently interacting features, machine imperfection and temporal development of the computational process. On account of machine imperfection, the process would become ideally reversible only in the limiting case of zero speed. Therefore the process is irreversible in practice and cannot be considered to be a fundamental quantum one. By giving up classical features and using a linear, reversible and non-sequential representation of the computational process - not realizable in classical machines - the process can be identified with the mathematical form of a quantum steady state. This form of steady quantum computation would seem to have an important bearing on the notion of cognition.

  17. Scalable optical quantum computer

    International Nuclear Information System (INIS)

    Manykin, E A; Mel'nichenko, E V

    2014-01-01

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare-earth ions Pr³⁺, regularly located in the lattice of the orthosilicate (Y₂SiO₅) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered most promising for quantum computations and communications.

  18. Computers: Instruments of Change.

    Science.gov (United States)

    Barkume, Megan

    1993-01-01

    Discusses the impact of computers in the home, the school, and the workplace. Looks at changes in computer use by occupations and by industry. Provides information on new job titles in computer occupations. (JOW)

  19. Computational fluid dynamics on a massively parallel computer

    Science.gov (United States)

    Jespersen, Dennis C.; Levit, Creon

    1989-01-01

    A finite-difference code for the compressible Navier-Stokes equations was implemented on the Connection Machine, a massively parallel computer. The code is based on the ARC2D/ARC3D program and uses the implicit factored algorithm of Beam and Warming. The code uses odd-even elimination to solve the linear systems. Timings and computation rates are given for the code, and a comparison is made with a Cray X-MP.
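
    Odd-even elimination, also known as cyclic reduction, halves the number of unknowns at every step, which is why it maps well onto massively parallel hardware like the Connection Machine. The following is our own illustrative implementation for a single tridiagonal system (not the ARC2D/ARC3D code); it assumes the system size is 2**k - 1 and that the unused corner coefficients a[0] and c[-1] are passed as zero.

```python
def cyclic_reduction(a, b, c, d):
    """Solve a tridiagonal system by odd-even (cyclic) reduction.
    a: sub-diagonal (a[0] must be 0), b: diagonal,
    c: super-diagonal (c[-1] must be 0), d: right-hand side.
    Requires len(b) == 2**k - 1."""
    n = len(b)
    if n == 1:
        return [d[0] / b[0]]
    # Eliminate the even-indexed unknowns, keeping indices 1, 3, 5, ...
    # Each kept row i absorbs its neighbours i-1 and i+1.
    idx = list(range(1, n, 2))
    ra, rb, rc, rd = [], [], [], []
    for i in idx:
        al = a[i] / b[i - 1]
        ga = c[i] / b[i + 1]
        ra.append(-al * a[i - 1])
        rb.append(b[i] - al * c[i - 1] - ga * a[i + 1])
        rc.append(-ga * c[i + 1])
        rd.append(d[i] - al * d[i - 1] - ga * d[i + 1])
    x_odd = cyclic_reduction(ra, rb, rc, rd)  # solve the half-size system
    x = [0.0] * n
    for j, i in enumerate(idx):
        x[i] = x_odd[j]
    # Back-substitute the remaining (even-indexed) unknowns.
    for i in range(0, n, 2):
        left = x[i - 1] if i > 0 else 0.0
        right = x[i + 1] if i < n - 1 else 0.0
        x[i] = (d[i] - a[i] * left - c[i] * right) / b[i]
    return x
```

    Unlike the sequential Thomas algorithm, every row combination within a reduction level is independent, so each level can be done in one parallel step.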

  20. Contracting for Computer Software in Standardized Computer Languages

    OpenAIRE

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the co...

  1. The CMS Computing Model

    International Nuclear Information System (INIS)

    Bonacorsi, D.

    2007-01-01

    The CMS experiment at the LHC has developed a baseline Computing Model addressing the needs of a computing system capable of operating in the first years of LHC running. It is focused on a data model with heavy streaming at the raw data level based on trigger, and on achieving maximum flexibility in the use of distributed computing resources. The CMS distributed Computing Model includes a Tier-0 centre at CERN, a CMS Analysis Facility at CERN, several Tier-1 centres located at large regional computing centres, and many Tier-2 centres worldwide. The workflows have been identified, along with a baseline architecture for the data management infrastructure. This model is also being tested in Grid Service Challenges of increasing complexity, coordinated with the Worldwide LHC Computing Grid community.

  2. Computer self-efficacy and computer attitude as correlates of ...

    African Journals Online (AJOL)

    The Internet, as a useful tool that supports teaching and learning, is not in full use in most secondary schools in Nigeria, thus preventing students from maximizing its potential in advancing their academic pursuits. This study, therefore, examined the extent to which computer self-efficacy and computer attitude ...

  3. The Future of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Anamaroa Siclovan

    2011-12-01

    Cloud computing was, and will be, a new way of providing Internet services and computers. This approach builds on many existing services, such as the Internet, grid computing, and Web services. Cloud computing as a system aims to provide on-demand services that are more acceptable in price and infrastructure. It is exactly the transition from the computer to a service offered to consumers as a product delivered online. This represents an advantage for the organization both regarding cost and the opportunity for new business. This paper presents future perspectives in cloud computing and discusses some issues of the cloud computing paradigm. It is a theoretical paper. Keywords: Cloud Computing, Pay-per-use

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  5. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  6. The Computational Physics Program of the national MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1989-01-01

    Since June 1974, the MFE Computer Center has been engaged in a significant computational physics effort. The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The Computational Physics Group has been involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to tokamaks and compact toroids. A third area is the investigation of kinetic instabilities using a 3-D particle code; this work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence have been under examination, with the hope of being able to explain anomalous transport. Also, we are collaborating in an international effort to evaluate fully three-dimensional linear stability of toroidal devices. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers. A summary of these programs is included in this paper. 6 tabs

  7. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance Computing. The ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  8. On several computer-oriented studies

    International Nuclear Information System (INIS)

    Takahashi, Ryoichi

    1982-01-01

    To utilize fully digital techniques for solving various difficult problems, nuclear engineers have recourse to computer-oriented approaches. Current trends in such fields as optimization theory, control system theory and computational fluid dynamics reflect the ability to use computers to obtain numerical solutions to complex problems. Special-purpose computers will be used as an integral part of the solving system to process large amounts of data, to implement a control law and even to support decision-making. Many problem-solving systems designed in the future will incorporate special-purpose computers as system components. The optimum use of computer systems is discussed: why an energy model, an energy data base and a big computer are used; why the economic process-computer will be allocated to nuclear plants in the future; why the super-computer should be demonstrated at once. (Mori, K.)

  9. Experiments in computing: a survey.

    Science.gov (United States)

    Tedre, Matti; Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general.

  10. Heterogeneous compute in computer vision: OpenCL in OpenCV

    Science.gov (United States)

    Gasparakis, Harris

    2014-02-01

    We explore the relevance of Heterogeneous System Architecture (HSA) in computer vision, both as a long-term vision and as a near-term emerging reality via the recently ratified OpenCL 2.0 Khronos standard. After a brief review of OpenCL 1.2 and 2.0, including HSA features such as Shared Virtual Memory (SVM) and platform atomics, we identify which genres of computer vision workloads stand to benefit by leveraging those features, and we suggest a new mental framework that replaces GPU compute with hybrid HSA APU compute. As a case in point, we discuss, in some detail, popular object recognition algorithms (part-based models), emphasizing the interplay and concurrent collaboration between the GPU and CPU. We conclude by describing how OpenCL has been incorporated in OpenCV, a popular open-source computer vision library, emphasizing recent work on the Transparent API, to appear in OpenCV 3.0, which unifies the native CPU and OpenCL execution paths under a single API, allowing the same code to execute either on the CPU or on an OpenCL-enabled device, without even recompiling.

  11. Tracking and computing

    International Nuclear Information System (INIS)

    Niederer, J.

    1983-01-01

    This note outlines several ways in which large scale simulation computing and programming support may be provided to the SSC design community. One aspect of the problem is getting supercomputer power without the high cost and long lead times of large scale institutional computing. Another aspect is the blending of modern programming practices with more conventional accelerator design programs in ways that do not also swamp designers with the details of complicated computer technology

  12. Parallel computing works!

    CERN Document Server

    Fox, Geoffrey C; Messina, Guiseppe C

    2014-01-01

    A clear illustration of how parallel computers can be successfully applied to large-scale scientific computations. This book demonstrates how a variety of applications in physics, biology, mathematics and other sciences were implemented on real parallel computers to produce new scientific results. It investigates issues of fine-grained parallelism relevant for future supercomputers, with particular emphasis on hypercube architecture. The authors describe how they used an experimental approach to configure different massively parallel machines, design and implement basic system software, and develop

  13. Discrete computational structures

    CERN Document Server

    Korfhage, Robert R

    1974-01-01

    Discrete Computational Structures describes discrete mathematical concepts that are important to computing, covering necessary mathematical fundamentals, computer representation of sets, graph theory, storage minimization, and bandwidth. The book also explains conceptual frameworks (Gorn trees, searching, subroutines) and directed graphs (flowcharts, critical paths, information networks). The text discusses algebra as it applies to computing, concentrating on semigroups, groups, lattices, and propositional calculus, including a new tabular method of Boolean function minimization. The text emphasizes

  14. COMPUTER GAMES AND EDUCATION

    OpenAIRE

    Sukhov, Anton

    2018-01-01

    This paper is devoted to research on the educational resources and possibilities of modern computer games. The “internal” educational aspects of computer games include the educational mechanism (a separate or integrated “tutorial”) and the representation of a real or even fantastic educational process within virtual worlds. The “external” dimension represents the educational opportunities of computer games for personal and professional development in different genres of computer games (various transport, so...

  15. Mission: Define Computer Literacy. The Illinois-Wisconsin ISACS Computer Coordinators' Committee on Computer Literacy Report (May 1985).

    Science.gov (United States)

    Computing Teacher, 1985

    1985-01-01

    Defines computer literacy and describes a computer literacy course which stresses ethics, hardware, and disk operating systems throughout. Core units on keyboarding, word processing, graphics, database management, problem solving, algorithmic thinking, and programing are outlined, together with additional units on spreadsheets, simulations,…

  16. `95 computer system operation project

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Taek; Lee, Hae Cho; Park, Soo Jin; Kim, Hee Kyung; Lee, Ho Yeun; Lee, Sung Kyu; Choi, Mi Kyung [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-12-01

    This report describes overall project works related to the operation of mainframe computers, the management of nuclear computer codes and the project of nuclear computer code conversion. The results of the project are as follows: 1. The operation and maintenance of the three mainframe computers and other utilities. 2. The management of the nuclear computer codes. 3. The completion of the computer code conversion project. 26 tabs., 5 figs., 17 refs. (Author)

  17. '95 computer system operation project

    International Nuclear Information System (INIS)

    Kim, Young Taek; Lee, Hae Cho; Park, Soo Jin; Kim, Hee Kyung; Lee, Ho Yeun; Lee, Sung Kyu; Choi, Mi Kyung

    1995-12-01

    This report describes overall project works related to the operation of mainframe computers, the management of nuclear computer codes and the project of nuclear computer code conversion. The results of the project are as follows: 1. The operation and maintenance of the three mainframe computers and other utilities. 2. The management of the nuclear computer codes. 3. The completion of the computer code conversion project. 26 tabs., 5 figs., 17 refs. (Author)

  18. Using Amazon's Elastic Compute Cloud to scale CMS' compute hardware dynamically.

    CERN Document Server

    Melo, Andrew Malone

    2011-01-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud-computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly on-demand, as limits and caps on usage are imposed. Our trial workflows allow us t...

  19. Computed Tomography (CT) -- Head

    Medline Plus

    When the image slices are reassembled by computer software, the result is a very detailed multidimensional view ...

  20. Utility Computing: Reality and Beyond

    Science.gov (United States)

    Ivanov, Ivan I.

    Utility Computing is not a new concept. It involves organizing and providing a wide range of computing-related services as public utilities. Much like water, gas, electricity and telecommunications, the concept of computing as a public utility was announced in 1955. Utility Computing remained a concept for nearly 50 years. Now some models and forms of Utility Computing are emerging, such as storage and server virtualization, grid computing, and automated provisioning. Recent trends in Utility Computing as a complex technology involve business procedures that could profoundly transform the nature of companies' IT services, organizational IT strategies and technology infrastructure, and business models. In the ultimate Utility Computing models, organizations will be able to acquire as many IT services as they need, whenever and wherever they need them. Based on networked businesses and new secure online applications, Utility Computing would facilitate "agility-integration" of IT resources and services within and between virtual companies. With the application of Utility Computing there could be concealment of the complexity of IT, reduction of operational expenses, and conversion of IT costs to variable `on-demand' services. How far should technology, business and society go to adopt Utility Computing forms, modes and models?