WorldWideScience

Sample records for computer printouts

  1. Translator program converts computer printout into braille language

    Science.gov (United States)

    Powell, R. A.

    1967-01-01

    Computer program converts print image tape files into six dot Braille cells, enabling a blind computer programmer to monitor and evaluate data generated by his own programs. The Braille output is printed 8 lines per inch.
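
    The 1967 program itself is not reproduced in this record. As a rough illustration of the underlying idea, the Python sketch below maps printable characters onto six-dot Braille cells using the Unicode Braille Patterns block; the abbreviated mapping table and all names are illustrative, not the original program's logic.

        # Sketch: map text to six-dot Braille cells (Unicode Braille Patterns block).
        # Dot numbering follows the standard 6-dot layout: dots 1-3 in the left
        # column, dots 4-6 in the right. The letter encodings are standard Grade 1
        # Braille, abbreviated here to a-j for illustration.

        BRAILLE_DOTS = {
            "a": (1,), "b": (1, 2), "c": (1, 4), "d": (1, 4, 5), "e": (1, 5),
            "f": (1, 2, 4), "g": (1, 2, 4, 5), "h": (1, 2, 5), "i": (2, 4),
            "j": (2, 4, 5),
        }

        def to_braille(text: str) -> str:
            """Convert text to Unicode Braille cells; unknown characters become blank cells."""
            cells = []
            for ch in text.lower():
                dots = BRAILLE_DOTS.get(ch, ())
                # Unicode Braille patterns start at U+2800; dot n sets bit (n - 1).
                cells.append(chr(0x2800 + sum(1 << (d - 1) for d in dots)))
            return "".join(cells)

        print(to_braille("bad cage"))  # prints the corresponding Braille cells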

  2. A Computer-Based Simulation of an Acid-Base Titration

    Science.gov (United States)

    Boblick, John M.

    1971-01-01

    Reviews the advantages of computer simulated environments for experiments, referring in particular to acid-base titrations. Includes pre-lab instructions and a sample computer printout of a student's use of an acid-base simulation. Ten references. (PR)

  3. Results of the First National Assessment of Computer Competence (The Printout).

    Science.gov (United States)

    Balajthy, Ernest

    1988-01-01

    Discusses the findings of the National Assessment of Educational Progress 1985-86 survey of American students' computer competence, focusing on findings of interest to reading teachers who use computers. (MM)

  4. An atomic-absorption programme for the Apple 2 plus computer

    International Nuclear Information System (INIS)

    Wepener, J.H.; Pearton, D.C.G.

    1982-01-01

    An interactive computer programme, the AA-PROGRAM APPLE, has been designed and written to process data obtained during routine analysis by atomic-absorption spectrophotometry. The programme is fast, convenient for the user, and was found to perform satisfactorily during routine operation in the laboratory. The computer used is an Apple II Plus with a video screen, and the language of the programme is Applesoft BASIC. Operating instructions for the computer and a printout of the programme are given in the Appendices

  5. Holistic Approaches to Reading (The Printout).

    Science.gov (United States)

    Balajthy, Ernest

    1989-01-01

    Presents eight guidelines to consider when using computers for language instruction, emphasizing computer use in a social and purposeful context. Suggests computer software which adheres to these guidelines. (MM)

  6. Connection and record programs, used for the PM-6 peripheral computer

    International Nuclear Information System (INIS)

    Karabutova, N.E.; Tikhomirova, I.N.

    1976-01-01

    The ''Pochta'' monitor program, intended to organize inter-program communication between the channel and monitor modules of the PM-6 peripheral computer, is described. The protocol programs ''Protocol-Consule'' and ''Protocol'', used to print protocol texts on typewriters and teleprinters, are also considered. The ''Recording'' standard routine, which performs code translation, is presented. Detailed instructions are given for the use of all the above-mentioned programs

  7. Computers in experimental nuclear power facilities

    International Nuclear Information System (INIS)

    Jukl, M.

    1982-01-01

    The CIS 3000 information system, used for monitoring the operating modes of large technological equipment, is described. The CIS system consists of two ADT computers, an external drum store, an analog input side, a bivalent input side, 4 control consoles with monitors and acoustic signalling, a print-out area with typewriters and punching machines, and linear recorders. Various applications of the installed CIS configuration are described, as is the general-purpose program for processing measured values into a protocol. The program operates in the conversational mode. Different processing variants are shown on the display monitor. (M.D.)

  8. Encouraging Recreational Reading (The Printout).

    Science.gov (United States)

    Balajthy, Ernest

    1988-01-01

    Describes computer software, including "The Electronic Bookshelf" and "Return to Reading," which provides motivation for recreational reading in various ways, including: quizzes, games based on books, and whole language activities for children's literature and young adult fiction. (MM)

  9. Computer handling of Savannah River Plant environmental monitoring data

    International Nuclear Information System (INIS)

    Zeigler, C.C.

    1975-12-01

    At the Savannah River Plant, computer programs are used to calculate, store, and retrieve radioactive and nonradioactive environmental monitoring data. Objectives are to provide daily, monthly, and annual summaries of all routine monitoring data; to calculate and tabulate releases according to radioisotopic species or nonradioactive pollutant, source point, and mode of entry to the environment (atmosphere, stream, or earthen seepage basins). The computer programs use a compatible numeric coding system for the data, and printouts are in the form required for internal and external reports. Data input and program maintenance are accomplished with punched cards, paper or magnetic tapes, and when applicable, with computer terminals. Additional aids for data evaluation provided by the programs are statistical counting errors, maximum and minimum values, standard deviations of averages, and other statistical analyses
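
    The programs themselves are not included in this record; the Python sketch below illustrates the kind of statistical aids listed (counting errors, extrema, standard deviation of the average) on hypothetical inputs. For a Poisson count N the one-sigma counting error is sqrt(N).

        import math

        def summarize(counts, count_time_s):
            """Summary statistics of the kind the abstract lists, for raw counts."""
            n = len(counts)
            rates = [c / count_time_s for c in counts]
            mean = sum(rates) / n
            # Sample variance of the rates and standard deviation of their average.
            var = sum((r - mean) ** 2 for r in rates) / (n - 1)
            sd_of_average = math.sqrt(var / n)
            # One-sigma relative counting error of each raw count: sqrt(N) / N.
            rel_counting_errors = [math.sqrt(c) / c for c in counts]
            return {
                "mean_rate": mean,
                "min": min(rates),
                "max": max(rates),
                "sd_of_average": sd_of_average,
                "relative_counting_errors": rel_counting_errors,
            }

        # Hypothetical gross counts from four 60-second measurements:
        print(summarize([980, 1012, 1005, 957], count_time_s=60.0))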

  10. Computer processing of nuclear material data in the German Democratic Republic - as of August 1980

    International Nuclear Information System (INIS)

    Burmester, M.; Helming, M.

    1981-01-01

    A description is given of the computer-based processing of safeguards information within the framework of the State System of Accounting for and Control of Nuclear Material. The software includes the programmes ICR, PILMBR, LISTE, POL, DELE and SIP, which produce the required reports to the IAEA on magnetic tape and in the form of printouts, and provide a range of relevant information and data that substantially facilitate the fulfilment of national obligations in the field of nuclear material control. (author)

  11. K-TIF: a two-fluid computer program for downcomer flow dynamics. [PWR]

    Energy Technology Data Exchange (ETDEWEB)

    Amsden, A.A.; Harlow, F.H.

    1977-10-01

    The K-TIF computer program has been developed for numerical solution of the time-varying dynamics of steam and water in a pressurized water reactor downcomer. The current status of physical and mathematical modeling is presented in detail. The report also contains a complete description of the numerical solution technique, a full description and listing of the computer program, and instructions for its use, with a sample printout for a specific test problem. A series of calculations, performed with no change in the modeling parameters, shows consistent agreement with the experimental trends over a wide range of conditions, lending confidence to the calculations as a basis for investigating the complicated physics of steam-water flows in the downcomer.

  12. Three Mile Island nuclear reactor accident of March 1979. Environmental radiation data: Volume IV. A report to the President's Commission on the Accident at Three Mile Island

    International Nuclear Information System (INIS)

    Bretthauer, E.W.; Grossman, R.F.; Thome, D.J.; Smith, A.E.

    1981-03-01

    This report contains a listing of environmental radiation monitoring data collected in the vicinity of Three Mile Island (TMI) following the March 28, 1979 accident. These data were collected by the EPA, NRC, DOE, HHS, the Commonwealth of Pennsylvania, or the Bethlehem Steel Corporation. The original report was printed in September 1979 and the update was released in December 1979. This volume consists of the following: Table 10 Summary of US Department of Energy (DOE) sampling and analytical procedures; Table 11 Computer printout of environmental data collected by DOE; Table 12 Summary of Commonwealth of Pennsylvania sampling and analytical procedures; Table 13 Computer printout of environmental data collected by the Commonwealth of Pennsylvania; Table 14 Summary of State of New Jersey sampling and analytical procedures; Table 15 Computer printout of data collected by the State of New Jersey

  13. The impact of technical conditions of X-ray imaging on reproducibility and precision of digital computer-assisted X-ray radiogrammetry (DXR)

    International Nuclear Information System (INIS)

    Malich, A.; Boettcher, J.; Pfeil, A.; Sauner, D.; Heyne, J.P.; Petrovitch, A.; Hansch, A.; Kaiser, W.A.; Linss, W.

    2004-01-01

    To evaluate the reproducibility of imaging and analysis for bone mineral density (BMD) determination using digital computer-assisted X-ray radiogrammetry (DXR; Pronosco X-posure, version V.2, Sectra Pronosco, Denmark); to verify potential factors that influence BMD extrapolation, such as tube voltage, film-focus distance (FFD), film quality and brand (Kodak T-MAT-Plus, Konika SRH, Agfa Scopix), imaging technology (conventional, digital), imaging system (Kodak, Agfa) and exposure level (mAs); and to clarify whether DXR analysis based on printouts of digital images is comparable to analysis of conventional images. The hand of a cadaver was X-rayed using varied parameters: 4-8 mAs, 40-52 kV, 90-130 cm FFD. To assess reproducibility, radiographs under standardised conditions were taken 10 times using a conventional machine (Philips Super 80 CP) and printouts of a digital system (Digital Diagnost Philips Optimus). One image was additionally scanned and analysed 10 times to assess the reproducibility of the analysis step. The reliability error of the system was 0.49% for the imaging process using conventional radiographs (standard conditions: 6 mAs, 40 kV, 1 m FFD), 2.89% using printouts of digital images (4 mAs, 42 kV, 1 m FFD), and 0.22% for the analysis process. BMD calculation is not affected by alterations in FFD (precision error 1.21%), mAs (0.83%) or film quality/brand (0.38%), but differs significantly depending on tube voltage (2.70%). The system was not able to analyse conventional images with tube voltages of 49/52 kV. DXR technology is stable with respect to most of the tested parameters. Normative data should only be used for calculations at similar tube voltages or with correction factors; all other parameters had no significant influence on the BMD calculation. Reproducibility is high. For technical reasons it is not recommended to use printouts of digital images for BMD determination. (orig.)
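
    The reliability errors quoted read like coefficients of variation over repeated measurements. Assuming that standard definition (the exact formula is not given in this abstract), a minimal Python sketch with hypothetical values:

        import statistics

        def reliability_error_percent(bmd_values):
            """Coefficient of variation in percent: 100 * SD / mean.
            The paper's exact definition may differ."""
            return 100.0 * statistics.stdev(bmd_values) / statistics.mean(bmd_values)

        # Ten hypothetical repeat BMD measurements of the same hand (g/cm^2):
        repeats = [0.584, 0.586, 0.581, 0.588, 0.585,
                   0.583, 0.587, 0.582, 0.586, 0.584]
        print(f"reliability error = {reliability_error_percent(repeats):.2f} %")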

  14. An open source workflow for 3D printouts of scientific data volumes

    Science.gov (United States)

    Loewe, P.; Klump, J. F.; Wickert, J.; Ludwig, M.; Frigeri, A.

    2013-12-01

    As the amount of scientific data continues to grow, researchers need new tools to help them visualize complex data. Immersive data-visualisations are helpful, yet fail to provide the tactile feedback and the sensory feedback on spatial orientation that tangible objects provide. This gap in sensory feedback from virtual objects has led to the development of tangible representations of geospatial information to solve real-world problems. Examples are animated globes [1], interactive environments like tangible GIS [2], and on-demand 3D prints. The production of a tangible representation of a scientific data set is one step in a line of scientific thinking, leading from the physical world into scientific reasoning and back: the process starts with a physical observation, or with a data stream generated by an environmental sensor. This data stream is turned into a geo-referenced data set, which is turned into a volume representation, which in turn is converted into command sequences for the printing device, leading to the creation of a 3D printout. As a last but crucial step, this new object has to be documented, linked to the associated metadata, and curated in long-term repositories to preserve its scientific meaning and context. The workflow to produce tangible 3D data-prints from science data at the German Research Centre for Geosciences (GFZ) was implemented as software based on the Free and Open Source geoinformatics tools GRASS GIS and Paraview. The workflow was successfully validated in various application scenarios at GFZ, using a RapMan printer to create 3D specimens of elevation models, geological underground models, ice-penetrating radar soundings for planetology, and space-time stacks for Tsunami model quality assessment. While these first pilot applications have demonstrated the feasibility of the overall approach [3], current research focuses on the provision of the workflow as Software as a Service (SAAS), thematic generalisation of information content and
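
    The GRASS GIS/Paraview workflow itself is not reproduced in this record. As a minimal sketch of one step in the chain described above, turning a volume representation into printer-ready geometry, the Python fragment below triangulates a tiny elevation grid into ASCII STL facets, a common intermediate in 3D-print toolchains. All values are hypothetical, and a real printable model would also need side walls and a base to form a closed solid.

        # Sketch: triangulate a small elevation grid into ASCII STL facets.
        # Slicing software recomputes normals, so they are left as zero here.

        def grid_to_stl(z, dx=1.0, dy=1.0):
            rows, cols = len(z), len(z[0])
            out = ["solid dem"]
            for i in range(rows - 1):
                for j in range(cols - 1):
                    a = (j * dx,       i * dy,       z[i][j])
                    b = ((j + 1) * dx, i * dy,       z[i][j + 1])
                    c = (j * dx,       (i + 1) * dy, z[i + 1][j])
                    d = ((j + 1) * dx, (i + 1) * dy, z[i + 1][j + 1])
                    for tri in ((a, b, c), (b, d, c)):  # two triangles per grid cell
                        out.append("  facet normal 0 0 0")
                        out.append("    outer loop")
                        out.extend(f"      vertex {x} {y} {h}" for x, y, h in tri)
                        out.append("    endloop")
                        out.append("  endfacet")
            out.append("endsolid dem")
            return "\n".join(out)

        # A 2 x 2 hypothetical elevation grid:
        print(grid_to_stl([[0.0, 0.2], [0.1, 0.5]]))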

  15. PFP MICON DCS computer software documentation

    Energy Technology Data Exchange (ETDEWEB)

    Silvan, G.R.

    1996-03-26

    This document contains the complete printout of the MICON A/S system configuration used in the Plutonium Finishing Plant. The document is divided into several volumes. Volume 1 covers the workstation display and configuration. All other volumes contain the controller configurations, or programs.

  16. PFP MICON DCS computer software documentation

    International Nuclear Information System (INIS)

    Silvan, G.R.

    1996-01-01

    This document contains the complete printout of the MICON A/S system configuration used in the Plutonium Finishing Plant. The document is divided into several volumes. Volume 1 covers the workstation display and configuration. All other volumes contain the controller configurations, or programs

  17. HEATUP: a computer program for the thermal analysis of a LOFC accident in an HTGR

    International Nuclear Information System (INIS)

    Siman-Tov, I.I.; Turner, W.D.

    1976-11-01

    The HEATUP code, a modification of the general, time-dependent, one-, two-, and three-dimensional program HEATING5, was designed for the thermal analysis of a Loss of Forced Circulation accident in a High Temperature Gas-Cooled Reactor. This report contains a description of the computational model which includes: a description of the basic problem; a short review of preliminary results related to the choice of thermal properties, boundary conditions and initial conditions; a full description of a typical three-dimensional R-Z model and a limited one of a two-dimensional RZ model. HEATUP's additional computations are presented together with the method of input preparation. The three-dimensional model of the Fulton Generating Station Loss of Forced Circulation accident is used as a sample problem. A complete presentation of the input data is made. Also, the computer printout of the sample problem input data and results are given

  18. HEATUP: a computer program for the thermal analysis of a LOFC accident in an HTGR

    Energy Technology Data Exchange (ETDEWEB)

    Siman-Tov, I.I.; Turner, W.D.

    1976-11-01

    The HEATUP code, a modification of the general, time-dependent, one-, two-, and three-dimensional program HEATING5, was designed for the thermal analysis of a Loss of Forced Circulation accident in a High Temperature Gas-Cooled Reactor. This report contains a description of the computational model which includes: a description of the basic problem; a short review of preliminary results related to the choice of thermal properties, boundary conditions and initial conditions; a full description of a typical three-dimensional R-Z model and a limited one of a two-dimensional RZ model. HEATUP's additional computations are presented together with the method of input preparation. The three-dimensional model of the Fulton Generating Station Loss of Forced Circulation accident is used as a sample problem. A complete presentation of the input data is made. Also, the computer printout of the sample problem input data and results are given.

  19. Shuttle user analysis (study 2.2): Volume 3. Business Risk And Value of Operations in space (BRAVO). Part 4: Computer programs and data look-up

    Science.gov (United States)

    1974-01-01

    Computer program listings as well as graphical and tabulated data needed by the analyst to perform a BRAVO analysis were examined. A graphical aid that can be used to determine the earth coverage of satellites in synchronous equatorial orbits is described. A listing of the satellite synthesis computer program, a sample printout for the DSCS-II satellite program, and a listing of the symbols used in the program are included. The APL-language listing of the payload cost-estimating computer program is given; this language is compatible with many of the time-sharing remote-terminal computers used in the United States. Data on the Intelsat communications network were studied. Costs for telecommunications systems leasing, line-of-sight microwave relay communications systems, submarine telephone cables, and terrestrial power generation systems are also described.

  20. TVGP and SQUAW changes at Md., January 1, 1975--January 1, 1976. Technical report No. 76-093. Physics Department No. PP 76-182

    International Nuclear Information System (INIS)

    Hill, D.G.

    1976-03-01

    Changes made in two computer programs are discussed. For TVGP, these include the addition of film curl to the optical constants, the use of the full track fit, and the addition of fiducial quantities to the output tape. For SQUAW, the changes include revision of the methods of counting lines on a page, summarizing of track failures and average rms, changes in the printout for type-3 tracks, addition of film rms to the printout for each mass hypothesis, and repair of a few small bugs

  21. Computer program for Scatchard analysis of protein:ligand interaction - use for determination of soluble and nuclear steroid receptor concentrations

    International Nuclear Information System (INIS)

    Leake, R.; Cowan, S.; Eason, R.

    1998-01-01

    Steroid receptor concentration may be determined routinely in biopsy samples of breast and endometrial cancer by the competition method. This method yields data for both the soluble and nuclear fractions of the tissue. The data are usually subjected to Scatchard analysis. This Appendix describes a computer program written initially for a PDP-11; it has been modified for use with IBM, Apple Macintosh and BBC microcomputers. The nature of the correction for competition is described and examples of the printout are given. The program is flexible, and its use for different receptors is explained. It can be readily adapted to other assays in which Scatchard analysis is appropriate.
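
    The published program is not included in this record; the sketch below shows only the core Scatchard computation it builds on: bound/free is regressed against bound, so the slope is -1/Kd and the x-intercept is Bmax for a single class of sites. The competition correction described in the Appendix is not modeled, and the data are hypothetical.

        def scatchard(bound, free):
            """Least-squares Scatchard fit: slope = -1/Kd, x-intercept = Bmax."""
            y = [b / f for b, f in zip(bound, free)]  # bound/free ratios
            n = len(bound)
            mx, my = sum(bound) / n, sum(y) / n
            sxx = sum((b - mx) ** 2 for b in bound)
            sxy = sum((b - mx) * (v - my) for b, v in zip(bound, y))
            slope = sxy / sxx
            intercept = my - slope * mx
            kd = -1.0 / slope          # dissociation constant
            bmax = -intercept / slope  # total binding sites (x-intercept)
            return kd, bmax

        # Hypothetical data (fmol bound, fmol free):
        kd, bmax = scatchard([10, 18, 24, 28], [5, 15, 30, 55])
        print(f"Kd = {kd:.1f}, Bmax = {bmax:.1f}")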

  22. 3D-printing of undisturbed soil imaged by X-ray

    Science.gov (United States)

    Bacher, Matthias; Koestel, John; Schwen, Andreas

    2014-05-01

    The unique pore structures in soils are easily altered by water flow; each sample has a different morphology, and the results of repetitions vary as well. Soil macropores reproduced in durable 3D-printed material resist erosion and have a known morphology. The potential and limitations of reproducing an undisturbed soil sample by 3D-printing were therefore evaluated. We scanned an undisturbed soil column of Ultuna clay soil with a diameter of 7 cm by micro X-ray computed tomography at a resolution of 51 micron. A subsample cube of 2.03 cm side length with connected macropores was cut out from this 3D-image and printed in five different materials by a 3D-printing service provider. The materials were ABS, Alumide, High Detail Resin, Polyamide and Prime Grey. The five print-outs of the subsample were tested for hydraulic conductivity using the falling head method, and hydrophobicity was tested by an adapted sessile drop method. To determine the morphology of the print-outs and compare it to the real soil, the print-outs were also scanned by X-ray. The images were analysed with the open source program ImageJ. The five 3D-image print-outs copied from the subsample of the soil column were compared by means of their macropore network connectivity, porosity, surface volume, tortuosity and skeleton. The comparison of pore morphology between the real soil and the print-outs showed that Polyamide reproduced the soil macropore structure best, while the Alumide print-out was the least detailed. Only the largest macropore was represented in all five print-outs. Printing residue or printing aid material remained in and clogged the pores of all print-out materials apart from Prime Grey; infiltration was therefore blocked in these print-outs, and the materials are not suitable even though the 3D-printed pore shapes were well reproduced. All of the investigated materials were insoluble. The sessile drop method showed angles between 53 and 85 degrees. Prime Grey had the fastest flow rate; the
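
    No numbers from the conductivity tests are given in this record; as a reminder of the falling-head relation used, K = (a * L) / (A * t) * ln(h1 / h2), here is a small worked sketch with hypothetical dimensions.

        import math

        def falling_head_k(a, A, L, t, h1, h2):
            """Saturated hydraulic conductivity from a falling-head test.
            a: standpipe cross-section, A: sample cross-section, L: sample length,
            t: elapsed time, h1/h2: initial/final head (consistent units)."""
            return (a * L) / (A * t) * math.log(h1 / h2)

        # Hypothetical test on a 2.03 cm print-out cube (cm and s):
        K = falling_head_k(a=0.5, A=2.03 ** 2, L=2.03, t=120.0, h1=60.0, h2=35.0)
        print(f"K = {K:.2e} cm/s")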

  23. Atmospheric, Magnetospheric, and Plasmas in Space (AMPS) spacelab payload definition study, appendixes

    Science.gov (United States)

    Keeley, J. T.

    1976-01-01

    An equipment list, instrument baseline data, engineering drawings, mass properties computer printouts, electrical energy management, and control and display functional analysis pertinent to the AMPS (Satellite Payload) are presented.

  24. Computer programmes for the control and data manipulation of a sequential x-ray-fluorescence spectrometer

    International Nuclear Information System (INIS)

    Spimpolo, G.F.

    1984-01-01

    Two computer programmes have been written for use on a fully automated Siemens SRS200 sequential X-ray-fluorescence spectrometer. The first is used to control the spectrometer via an LC200 logic controller using a Data General Nova IV minicomputer; the second is used for the on-line evaluation of the intensity results and the printout of the analytical results. This system is an alternative to the systems offered by Siemens Ltd, which consist of a Process PR310 or Digital DEC PDP1103 computer and the Siemens Spectra 310 software package. The multibatch capabilities of the programmes, with the option of measuring one sample or a tray of samples before the results are calculated, give the new programmes a major advantage over the dedicated software and, together with the elimination of human error in calculation, have resulted in increased efficiency and quality in routine analyses. A description is given of the two programmes, as well as instructions and guidelines for the user.

  25. Radioimmunoassay evaluation and quality control by use of a simple computer program for a low cost desk top calculator

    International Nuclear Information System (INIS)

    Schwarz, S.

    1980-01-01

    A simple computer program for the data processing and quality control of radioimmunoassays is presented. It is written for a low-cost programmable desk-top calculator (Hewlett Packard 97), which can be afforded by smaller laboratories. The untreated counts from the scintillation spectrometer are entered manually; the printout gives the following results: initial data, logit-log transformed calibration points, parameters of goodness of fit and of the position of the standard curve, and control and unknown sample dose estimates (mean value from single-dose interpolations and scatter of replicates), together with the automatic calculation of within-assay variance and, by use of magnetic cards holding the control parameters of all previous assays, between-assay variance. (orig.)
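
    The calculator program itself is not reproduced here; the Python sketch below illustrates the logit-log transformation it applies, fitting logit(B/B0) against log(dose) and interpolating unknowns. All calibration data are hypothetical.

        import math

        def logit_log_fit(doses, b_over_b0):
            """Fit logit(y) = a + b * log(dose), with y the bound fraction B/B0."""
            xs = [math.log(d) for d in doses]
            ys = [math.log(y / (1.0 - y)) for y in b_over_b0]
            n = len(xs)
            mx, my = sum(xs) / n, sum(ys) / n
            sxx = sum((x - mx) ** 2 for x in xs)
            sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            b = sxy / sxx
            a = my - b * mx
            return a, b

        def dose_estimate(a, b, y):
            """Interpolate an unknown dose from its bound fraction."""
            return math.exp((math.log(y / (1.0 - y)) - a) / b)

        a, b = logit_log_fit([1, 3, 10, 30, 100], [0.85, 0.70, 0.50, 0.30, 0.15])
        print(f"unknown dose = {dose_estimate(a, b, 0.42):.1f}")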

  26. Some Aspects of Process Computers Configuration Control in Nuclear Power Plant Krsko - Process Computer Signal Configuration Database (PCSCDB)

    International Nuclear Information System (INIS)

    Mandic, D.; Kocnar, R.; Sucic, B.

    2002-01-01

    During the operation of NEK and other nuclear power plants it has been recognized that certain issues related to the usage of digital equipment and associated software in NPP technological process protection, control and monitoring are not adequately addressed in the existing programs and procedures. The term and the process of Process Computers Configuration Control join three 10CFR50 Appendix B quality requirements of Process Computers application in NPPs: Design Control, Document Control, and Identification and Control of Materials, Parts and Components. This paper describes the Process Computer Signal Configuration Database (PCSCDB), which was developed and implemented in order to resolve some aspects of Process Computer Configuration Control related to the signals or database points that exist in the life cycle of different Process Computer Systems (PCS) in Nuclear Power Plant Krsko. PCSCDB is a controlled master database related to the definition and description of the configurable database points associated with all Process Computer Systems in NEK. It holds attributes related to the configuration of addressable and configurable real-time database points, as well as attributes related to signal life cycle references and history data, such as: input/output signals; manually input database points; program constants; setpoints; database points calculated by application programs or SCADA calculation tools; control flags (for example, enabling or disabling a certain program feature); signal acquisition design references to the DCM (Document Control Module, application software for document control within the Management Information System - MIS) and MECL (Master Equipment and Component List, MIS application software for identification and configuration control of plant equipment and components); usage of particular database points in particular application software packages and in man-machine interface features (display mimics, printout reports, ...); and signal history (EEAR Engineering

  27. 19 CFR 4.99 - Forms; substitution.

    Science.gov (United States)

    2010-04-01

    ... Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY... the instructions shall be followed. (c) The port director, in his discretion, may accept a computer printout instead of Customs Form 1302 for use at a specific port. However, to ensure that computer...

  28. International co-operation in use of computers

    Energy Technology Data Exchange (ETDEWEB)

    Sterling, T D [Medical Computing Centre, University of Cincinnati, College of Medicine, Cincinnati, OH (United States)

    1966-06-15

    National and international co-operation takes on concreteness in the exchange of programmes and research results. On a practical level and for the exchange of actual techniques, the diversity between computers and computer languages may offer almost overwhelming obstacles, which can be overcome but not without considerable effort. If computing centres have similar constellations of hardware, then the exchange of programmes is of course routine. We were able to exchange programmes with the Cancer Institute Board at Melbourne, Australia. It turned out that our programmes, although developed for a 60Co teletherapy beam unit (Model Eldorado A), could be fitted to the dose distribution from the Melbourne 4-MeV linear accelerator with the change of a few constants. The work of fitting the equations themselves was done by the Melbourne group. Once the new constants were found it was a simple matter of transcribing our programme on a reel of magnetic tape and returning it to Melbourne. Treatment centres may have access to different computers. However, it is becoming increasingly true that different computers will be able to accept programmes written in many languages. Language compatibility makes it possible to take programmes and to rewrite them, at a very small cost, to fit another computer. Very often the only changes that have to be introduced are of 'input-output' instructions. If this is the case, then a copy of the flow diagram and a print-out of the source programme is usually all that is needed to make a programme operational on a different machine. But even here we have found it desirable to send a programmer along so that resolution of detailed problems may be expedited. In this way we exchanged programmers with the Mallinckrodt Institute of Radiology at St. Louis, Missouri to make our own external beam methods compatible with their computer and to make their interstitial and intracavitary programmes operational on ours.

  29. 75 FR 27821 - Privacy Act of 1974; System of Records

    Science.gov (United States)

    2010-05-18

    ...-USNCB. The files contain electronic and hard copy records containing identifying particulars about... safety from persons, events, or things; investigative notes; computer printouts; letters; memoranda...: Case files closed as of April 5, 1982 and thereafter are disposed of as follows: The hard copy (paper...

  30. Automated technical validation--a real time expert system for decision support.

    Science.gov (United States)

    de Graeve, J S; Cambus, J P; Gruson, A; Valdiguié, P M

    1996-04-15

    Dealing daily with various machines and various control specimens produces a lot of data that cannot be processed manually. To help decision-making we wrote specific software coping with traditional QC, with patient data (mean of normals, delta check) and with criteria related to the analytical equipment (flags and alarms). Four machines (3 Ektachem 700 and 1 Hitachi 911) analysing 25 common chemical tests are controlled. Three different control specimens are run every day on the various pieces of equipment, plus one more once a week (regional survey). The data are collected on a 486 microcomputer connected to the central computer. For every parameter the standard deviation is compared with the published acceptable limits and the Westgard rules are computed. The mean of normals is continuously monitored. The final decision triggers either an alarm sound and a print-out of the cause of rejection or, if no alarm occurs, the daily print-out of recorded data, with or without the Levey-Jennings graphs.
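
    The software itself is not given in this record; the sketch below implements two of the common Westgard multirules alluded to (1-3s and 2-2s) on hypothetical control data. The delta check and mean-of-normals monitoring are not modeled.

        def westgard_flags(values, mean, sd):
            """Evaluate two Westgard rules on a run of control values.
            1_3s: one value beyond +/- 3 SD.
            2_2s: two consecutive values beyond the same +/- 2 SD limit."""
            z = [(v - mean) / sd for v in values]
            flags = []
            if any(abs(s) > 3 for s in z):
                flags.append("1_3s")
            for s1, s2 in zip(z, z[1:]):
                if (s1 > 2 and s2 > 2) or (s1 < -2 and s2 < -2):
                    flags.append("2_2s")
                    break
            return flags

        # Hypothetical glucose control: target 5.0 mmol/L, SD 0.1:
        print(westgard_flags([5.05, 5.21, 5.22, 4.96], mean=5.0, sd=0.1))  # ['2_2s']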

  31. 32 CFR 292.4 - Specific policy.

    Science.gov (United States)

    2010-07-01

    ... INFORMATION ACT PROGRAM DEFENSE INTELLIGENCE AGENCY (DIA) FREEDOM OF INFORMATION ACT § 292.4 Specific policy... of source and object codes, regardless of medium are not agency records. (This does not include the... existing computer program or printout for retrieval of the requested information. (c) The prior application...

  32. 75 FR 43500 - Privacy Act of 1974; System of Records

    Science.gov (United States)

    2010-07-26

    ... require an original signature or a notarized signature as a means of proving the identity of the... notarized signature as a means of proving the identity of the individual requesting access to the records..., cleared and trained. Manual records and computer printouts are available only to authorized personnel...

  33. 39 CFR 265.6 - Availability of records.

    Science.gov (United States)

    2010-07-01

    ... by the human eye, such as a computer print-out. On request, records will be provided in a different... investigation, or by an agency conducting a lawful national security intelligence investigation, information... authorized user of a postage meter or PC Postage product (postage evidencing systems) printing a specified...

  34. Comparative Analyses of Physics Candidates Scores in West African and National Examinations Councils

    Science.gov (United States)

    Utibe, Uduak James; Agah, John Joseph

    2015-01-01

    The study is a comparative analysis of physics candidates' scores in West African and National Examinations Councils. It also investigates influence of gender. Results of 480 candidates were randomly selected from three randomly selected Senior Science Colleges using the WASSCE and NECOSSCE computer printout sent to the schools, transformed using…

  35. Heat-flux gage measurements on a flat plate at a Mach number of 4.6 in the VSD high speed wind tunnel, a feasibility test (LA28). [wind tunnel tests of measuring instruments for boundary layer flow]

    Science.gov (United States)

    1975-01-01

    The feasibility of employing thin-film heat-flux gages was studied as a method of defining boundary layer characteristics at supersonic speeds in a high speed blowdown wind tunnel. Flow visualization techniques (using oil) were employed. Tabulated data (computer printouts), a test facility description, and photographs of test equipment are given.

  36. Water cooling of RF structures

    International Nuclear Information System (INIS)

    Battersby, G.; Zach, M.

    1994-06-01

    We present computer codes for heat transfer in water cooled rf cavities. RF parameters obtained by SUPERFISH or analytically are operated on by a set of codes using PLOTDATA, a command-driven program developed and distributed by TRIUMF [1]. Emphasis is on practical solutions with designer's interactive input during the computations. Results presented in summary printouts and graphs include the temperature, flow, and pressure data. (authors). 4 refs., 4 figs
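
    The TRIUMF codes are not reproduced in this record; as a minimal illustration of the flow and temperature bookkeeping involved, the sketch below evaluates a steady-state coolant heat balance, Q = mdot * c_p * dT, with hypothetical numbers. Pressure drop and local wall temperatures, which the codes also report, are not modeled.

        # Steady-state coolant heat balance for a water-cooled rf structure.
        CP_WATER = 4186.0  # specific heat of water, J/(kg K)

        def coolant_temp_rise(power_w, flow_l_per_min):
            """Bulk water temperature rise for a given dissipated rf power."""
            mdot = flow_l_per_min / 60.0  # kg/s, taking 1 L of water as 1 kg
            return power_w / (mdot * CP_WATER)

        # Hypothetical case: 25 kW dissipated in a cavity wall, 40 L/min flow:
        print(f"dT = {coolant_temp_rise(25e3, 40.0):.1f} K")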

  37. Radiation dose to construction workers at operating nuclear power plant sites. Volume 2. Appendices A--F. Final report, September 1975--September 1978

    International Nuclear Information System (INIS)

    Endres, G.W.R.; Shipler, D.B.

    1978-12-01

    These appendices contain the dosimetry procedures and details of the personnel and environmental dosimeters used for the Radiation Dose to Construction Workers at Operating Nuclear Power Plant Sites Study. A printout of the computer codes used to analyze dosimeter data is included along with all the raw data obtained. Appendices C through F contain computer output and log-normal plots of dosimetry data for environmental location and construction worker groups

  38. Computed tomographic diagnosis of pulmonary nodules

    International Nuclear Information System (INIS)

    Onoue, Masataka

    1986-01-01

    One hundred and fifty-two pulmonary nodules (PNs) were examined by thin-section computed tomography (CT) and conventional tomography (tomography). In this study, 109 PNs were analyzed to assess tissue density by calculating the representative CT number (RCT no.) from a computer printout. For the primary malignancies, the mean RCT no. was 72 HU, with a standard deviation (SD) of 21 HU, and for metastases it was 66 ± 19 HU. The RCT no. separating primary malignancy from benign lesions was 157 HU; the one including metastasis was 183 HU. In addition, dual-energy CT scanning was performed to evaluate the capability of diagnosing calcification in 35 PNs. Dual-energy CT scanning increased the reliability of CT diagnosis for PNs with RCT no. between 100 HU and 300 HU. The descriptive criteria of CT and tomography for 127 PNs were analyzed, and statistical diagnoses by CT and tomography were compared with their final diagnoses. CT and tomography demonstrated similar sensitivities in the evaluation of primary malignancies (86 % for CT and 90 % for tomography) and the same sensitivity (86 %) in diagnosing metastases. In evaluating benign PNs, CT was superior to tomography owing to its capability to detect minimal calcification and fat density; the specificity was 80 % with CT and 52 % with tomography (p < 0.025). Overall accuracy in the diagnosis of PNs was 82 % with CT and 73 % with tomography, which was not statistically different. It can be concluded that CT is a reliable examination in the evaluation of PNs, and there is an advantage to the use of CT over tomography in diagnosing benign lesions. (author)

  39. WASTE-PRA: a computer package for probabilistic risk assessment of shallow-land burial of low-level radioactive waste

    International Nuclear Information System (INIS)

    Cox, N.D.; Atwood, C.L.

    1985-12-01

    This report is a user's manual for a package of computer programs and data files to be used for probabilistic risk assessment of shallow-land burial of low-level radioactive waste. The nuclide transport pathways modeled are an unsaturated groundwater column, an aquifer, and the atmosphere. An individual or the population receives a dose commitment through shine, inhalation, ingestion, direct exposure, and/or a puncture wound. The methodology of risk assessment is based on the response surface method of uncertainty analysis. The parameters of the model for predicting dose commitment due to a release are treated as statistical variables, in order to compute statistical distributions for various contributions to the dose commitment. The likelihood of a release is similarly treated as a statistical variable. Uncertainty distributions are obtained both for the dose commitment and for the corresponding risk. Plots and printouts are produced to aid in comparing the importance of various release scenarios and in assessing the total risk of a set of scenarios. The entire methodology is illustrated by an example. Information is included on parameter uncertainties, reference site characteristics, and probabilities of release events
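
    The package itself is not included in this record. The sketch below shows only the underlying idea of propagating parameter uncertainty into a dose-commitment distribution, here by brute-force Monte Carlo sampling of a toy model; WASTE-PRA instead works with a fitted response surface, and every distribution and parameter shown is hypothetical.

        import random
        import statistics

        def dose_model(release_ci, dilution, dose_factor):
            """Toy transport model: dose = release / dilution * dose factor."""
            return release_ci / dilution * dose_factor

        random.seed(1)
        doses = []
        for _ in range(10_000):
            release = random.lognormvariate(0.0, 0.5)   # Ci released
            dilution = random.uniform(1e4, 1e6)         # dispersion factor
            dcf = random.lognormvariate(-2.0, 0.3)      # rem per Ci at receptor
            doses.append(dose_model(release, dilution, dcf))

        doses.sort()
        print("median dose:", statistics.median(doses))
        print("95th percentile:", doses[int(0.95 * len(doses))])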

  40. Three Mile Island nuclear reactor accident of March 1979. Environmental radiation data: Volume III. A report to the President's Commission on the Accident at Three Mile Island

    International Nuclear Information System (INIS)

    Bretthauer, E.W.; Grossman, R.F.; Thome, D.J.; Smith, A.E.

    1981-03-01

    This report contains a listing of environmental radiation monitoring data collected in the vicinity of Three Mile Island (TMI) following the March 28, 1979 accident. These data were collected by the EPA, NRC, DOE, HHS, the Commonwealth of Pennsylvania, or the Bethlehem Steel Corporation. This volume consists of Table 9, Computer printout of environmental data collected by the NRC

  41. List of documents received by the INDC Secretariat

    International Nuclear Information System (INIS)

    1980-06-01

    This list is produced directly from computer printout in two sorts: one ordered by accession number, and the other ordered by document number within each origin series (e.g. listing all INDC(SEC)-documents in one block). References to earlier INDSWG and ''interim'' INDC reports received between 1962 and 1967 are listed in report INDC/199 (dated November 1967)

  42. Analysis of chemical components from plant tissue samples

    Science.gov (United States)

    Laseter, J. L.

    1972-01-01

    Information is given on the type and concentration of sterols, free fatty acids, and total fatty acids in plant tissue samples. All samples were analyzed by gas chromatography and then by gas chromatography-mass spectrometry combination. In each case the mass spectral data was accumulated as a computer printout and plot. Typical gas chromatograms are included as well as tables describing test results.

  43. Local Area Networks (The Printout).

    Science.gov (United States)

    Aron, Helen; Balajthy, Ernest

    1989-01-01

    Describes the Local Area Network (LAN), a project in which students used LAN-based word processing and electronic mail software as the center of a writing process approach. Discusses the advantages and disadvantages of networking. (MM)

  44. MHD Advanced Power Train Phase I, Final Report, Volume 7

    Energy Technology Data Exchange (ETDEWEB)

    A. R. Jones

    1985-08-01

    This appendix provides additional data in support of the MHD/Steam Power Plant Analyses reported in report Volume 5. The data is in the form of 3PA/SUMARY computer code printouts. The order of presentation in all four cases is as follows: (1) Overall Performance; (2) Component/Subsystem Information; (3) Plant Cost Accounts Summary; and (4) Plant Costing Details and Cost of Electricity.

  45. Format Guide for Scientific and Technical Reports.

    Science.gov (United States)

    1984-01-01

    supported by the discussion. Graphic Services The Graphic Services Section (Code 2632) provides a variety of layout and design services. Camera-ready artwork...complex typography, elaborate graphic elements, extensive computer printouts, and other unusual materials that explain the project. With few exceptions...2630 Publications Branch Office 222/253 72379 S Publications Control Center 222/253 73508 Editorial 222/253 72782 Graphic Services 222/234 72756 73989

  46. Use of Computer Imaging in Rhinoplasty: A Survey of the Practices of Facial Plastic Surgeons.

    Science.gov (United States)

    Singh, Prabhjyot; Pearlman, Steven

    2017-08-01

    The objective of this study was to quantify the use of computer imaging by facial plastic surgeons. AAFPRS facial plastic surgeons were surveyed about their use of computer imaging during rhinoplasty consultations. The survey collected information about surgeon demographics, practice settings, practice patterns, and rates of computer imaging (CI) for primary and revision rhinoplasty. For those surgeons who used CI, additional information was also collected, including who performed the imaging and whether the patient was given the morphed images after the consultation. A total of 238 out of 1200 (19.8%) facial plastic surgeons responded to the survey. Of those who responded, 195 surgeons (83%) were board certified by the American Board of Facial Plastic and Reconstructive Surgeons (ABFPRS). The majority of respondents (150 surgeons, 63%) used CI during rhinoplasty consultation. Of the surgeons who use CI, 92% performed the image morphing themselves. Approximately two-thirds of surgeons who use CI gave their patient a printout of the morphed images after the consultation. Computer imaging is a frequently utilized tool for facial plastic surgeons during cosmetic consultations with patients. Based on the results of this study, it can be suggested that the majority of facial plastic surgeons who use CI do so for both primary and revision rhinoplasty. As more sophisticated systems become available, it is possible that utilization of CI modalities will increase. This provides the surgeon with further tools to use at his or her disposal during discussion of aesthetic surgery. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.

  47. A compact fast data acquisition and data analysis system for time of flight mass spectrometry. Time and S.I. intensity measurements with a new multistop TDC

    International Nuclear Information System (INIS)

    Della-Negra, S.; Le Beyec, Y.

    1987-01-01

    A data acquisition and processing system for time of flight mass spectrometry, based on a PC-AT computer with an additional memory card, was developed, and analysis software was written. About 100,000 counts/sec can be analyzed and stored in the memory. In the coincidence mode, 1000 start events with 10 stop events (per sec) on one time-to-digital converter allow 10 spectra to be recorded. Examples of printouts are shown

  8. "On-screen" writing and composing: two years experience with Manuscript Manager, Apple II and IBM-PC versions.

    Science.gov (United States)

    Offerhaus, L

    1989-06-01

    The problems of the direct composition of a biomedical manuscript on a personal computer are discussed. Most word processing software is unsuitable because literature references, once stored, cannot be rearranged if major changes are necessary. These obstacles have been overcome in Manuscript Manager, a combination of word processing and database software. As it follows Council of Biology Editors and Vancouver rules, the printouts should be technically acceptable to most leading biomedical journals.

  49. Information Services: Their Organization, Control and Use.

    Science.gov (United States)

    1981-01-01

    subscribed to weekly MARC tapes of British and American literature, and each tape was run on the AWRE computer, against a stored profile of Dewey Decimal ...are made, and local information, e.g., call number, is added. The books are also indexed by the Universal Decimal Classification (UDC). A further update...Classification (DDC) numbers. A printout of the selected records was used as a book selection tool, whilst AMCOS stored these records in a cumulating

  50. Computerized assessment of the measurement of individual doses

    International Nuclear Information System (INIS)

    Kiibus, A.

    1981-06-01

    The department for the measurement of individual doses carries out regular dose monitoring by means of film badges for approximately 14000 individuals. The operation is facilitated by a Honeywell Bull Mini 6 Mod 43 computer, programmed in COBOL, which is applied to the registration of input data such as delivery of badges, film development, calibration, invoices, recording of individual doses, and customers. The print-out consists of customers, badge codes, dosimeter lists, development specifications, dose statements, addresses, bills, dose statistics and the register of individuals. Because the service is charged for, the activity is financially self-supporting. (G.B.)

  51. Windows Program For Driving The TDU-850 Printer

    Science.gov (United States)

    Parrish, Brett T.

    1995-01-01

    Program provides WYSIWYG compatibility between video display and printout. PDW is Microsoft Windows printer-driver computer program for use with Raytheon TDU-850 printer. Provides previously unavailable linkage between printer and IBM PC-compatible computers running Microsoft Windows. Enhances capabilities of Raytheon TDU-850 hardcopier by emulating all textual and graphical features normally supported by laser/ink-jet printers and makes printer compatible with any Microsoft Windows application. Also provides capabilities not found in laser/ink-jet printer drivers by providing certain Windows applications with ability to render high quality, true gray-scale photographic hardcopy on TDU-850. Written in C language.

  52. [Evaluation of a registration card for logging electrocardiographic records into standard personal computers].

    Science.gov (United States)

    Pizzuti, A; Baralis, G; Bassignana, A; Antonielli, E; Di Leo, M

    1997-01-01

    The MS200 Cardioscope, from MRT Micro AS, Norway, is a 12-channel ECG card to be inserted directly into a standard personal computer (PC). The standard ISA-bus-compatible half-length card comes with a set of 10 cables with electrodes and the software for recording, displaying and saving ECG signals. The system is supplied with DOS or Windows software; we tested the 1.5 DOS version. In 30 patients with various cardiac diseases the ECG signal was recorded with the MS200 and with standard Hellige CardioSmart equipment. The saved ECGs were recalled and printed using an Epson Stylus 800 ink-jet printer. Two cardiologists reviewed the recordings, looking at output quality, amplitude and speed precision, artifacts, etc. 1) Installation: the card proved to be totally compatible with the hardware; no changes in default settings had to be made. 2) Usage: the screens are clear; the commands and menus are intuitive and easy to use. Due to the boot-strap and software loading procedures and, most importantly, off-line printing, the time needed to obtain a complete ECG printout was longer than that of the reference machine. 3) Archiving and retrieval of ECGs: the ECG curves can be saved in original or compressed form; selecting the latter, noise and non-ECG information is filtered away and disk space consumption is reduced: on average, 20 Kb are needed for 10 seconds of signal. The MS200 can be run on a Local Area Network and is prepared for integration with an existing information system; we are currently testing the system in this setting. 4) The MS200 includes options for on-line diagnosis, a technology we have not tested in the present work. 5) The only setting allowed for printing full pages is letter size (A4): the quality of printouts is good, with a resolution of 180 DPI. In conclusion, the MS200 system seems reliable and

  53. Three Mile Island nuclear reactor accident of March 1979. Environmental radiation data: Volume V. A report to the President's Commission on the Accident at Three Mile Island

    International Nuclear Information System (INIS)

    Bretthauer, E.W.; Grossman, R.F.; Thome, D.J.; Smith, A.E.

    1981-03-01

    This report contains a listing of environmental radiation monitoring data collected in the vicinity of Three Mile Island (TMI) following the March 28, 1979 accident. These data were collected by the EPA, NRC, DOE, HHS, the Commonwealth of Pennsylvania, or the Bethlehem Steel Corporation. This volume consists of the following two tables: Table 16, Summary of Metropolitan Edison Company (Met-Ed) sampling and analytical procedures; and Table 17, Computer printout of data collected by Met-Ed

  54. Automatic processing of list of journals and publications in the Nuclear Research Institute

    International Nuclear Information System (INIS)

    Vymetal, L.

    Using an EC 1040 computer, the Institute of Nuclear Research processed the list of journals in the reference library of the Czechoslovak Atomic Energy Commission, including journals acquired by all institutions subordinate to the Commission, i.e., UJV Rez (Nuclear Research Institute), the Nuclear Information Centre Prague, UVVVR Prague (Institute for Research, Production and Application of Radioisotopes) and the Institute of Radioecology and Applied Nuclear Techniques Kosice. Computer processing allowed files to be obtained arranged by libraries, subject matters of the journals, countries of publication, and journal titles. Automated processing of publications by UJV staff is being prepared. The preparation of data for computer processing of both files is described, and specimens of printouts are shown. (Ha)

  55. BIPAL - a data library for computing the burnup of fissionable isotopes and products of their decay

    International Nuclear Information System (INIS)

    Kralovcova, E.; Hep, J.; Valenta, V.

    1978-01-01

    The BIPAL databank contains data on 100 heavy metal isotopes, starting with 206Tl and finishing with 253Es. Four are stable; the others are unstable. The following data are currently stored in the databank: the serial number and name of each isotope, decay modes and, for stable isotopes, the isotopic abundance (%), numbers of P decays and Q captures, numbers of corresponding final products, branching ratios, half-lives and their units, decay constants, thermal neutron capture and fission cross sections, and other data (mainly alpha, beta and gamma intensities). The description of the data and a printout of the BIPAL library are presented. (J.B.)
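
    Two of the stored fields are linked by the standard relation lambda = ln 2 / T1/2; a short check in Python, using a familiar nuclide from the library's mass range as the example:

        import math

        def decay_constant(half_life_s):
            """lambda = ln 2 / T1/2, linking two of the stored fields."""
            return math.log(2.0) / half_life_s

        # Example: Pu-239, half-life 24110 years:
        t_half_s = 24110 * 365.25 * 24 * 3600
        print(f"lambda = {decay_constant(t_half_s):.3e} 1/s")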

  56. Transferring data from an oscilloscope to an IBM using an Apple II+

    Science.gov (United States)

    Miller, D. L.; Frenklach, M. Y.; Laughlin, P. J.; Clary, D. W.

    1984-01-01

    A set of PASCAL programs permitting the use of a laboratory microcomputer to facilitate and control the transfer of data from a digital oscilloscope (used with photomultipliers in experiments on soot formation in hydrocarbon combustion) to a mainframe computer and the subsequent mainframe processing of these data is presented. Advantages of this approach include the possibility of on-line computations, transmission flexibility, automatic transfer and selection, increased capacity and analysis options (such as smoothing, averaging, Fourier transformation, and high-quality plotting), and more rapid availability of results. The hardware and software are briefly characterized, the programs are discussed, and printouts of the listings are provided.

  57. Computer-assisted radiological quantification of rheumatoid arthritis

    International Nuclear Information System (INIS)

    Peloschek, P.L.

    2000-03-01

    The specific objective was to develop the layout and structure of a platform for effective quantification of rheumatoid arthritis (RA). A fully operative stand-alone Java application (RheumaCoach) was developed to support the efficacy of the scoring process in RA (Web address: http://www.univie.ac.at/radio/radio.htm). Addressed as potential users of such a program are physicians enrolled in clinical trials to evaluate the course of RA and its modulation with drug therapies, and scientists developing new scoring modalities. The software 'RheumaCoach' consists of three major modules: the Tutorial starts with 'Rheumatoid Arthritis', to teach the basic pathology of the disease; the section 'Imaging Standards' then explains how to produce proper radiographs; 'Principles - How to use the Larsen Score', 'Radiographic Findings' and 'Quantification by Scoring' explain the requirements for unbiased scoring of RA. At the Data Input Sheet, care was taken to follow the radiologist's approach to analysing films, as published previously. At the Compute Sheet the calculated Larsen score may be compared with former scores, and the further possibilities (calculate, export, print, send) are easily accessible. In a first pre-clinical study the system was tested in an unstructured evaluation. Two structured evaluations (30 fully documented and blinded cases of RA, four radiologists scoring hands and feet with or without the RheumaCoach) followed. Between the evaluations we continuously improved the software. For all readers the usage of the RheumaCoach speeded up the procedure; altogether, scoring without computer assistance needed about 20 % more time. Availability of the programme via the internet provides common access for potential quality control in multi-center studies. Documentation of results in a specifically designed printout improves communication between radiologists and rheumatologists. The possibilities of direct export to other programmes and electronic

  58. List of documents received by the INDC Secretariat

    International Nuclear Information System (INIS)

    1989-05-01

    The Nuclear Data Section of the IAEA receives documents originated by or for the International Nuclear Data Committee for distribution. This list includes all INDC documents received and distributed by the INDC Secretariat during the period September 1987 to February 1989. The list is produced directly from computer printout in two groups: one ordered by accession number, and the other ordered by document number within each series. The document also lists, in an Appendix, the titles of reports received as single copies for information

  59. List of documents received by the INDC Secretariat

    International Nuclear Information System (INIS)

    1991-01-01

    The Nuclear Data Section of the IAEA receives documents originated by or for the International Nuclear Data Committee (INDC) for distribution. This list includes all INDC documents received and distributed by the INDC Secretariat during the period March 1989 to June 1990. The list is produced directly from computer printout in two groups: one ordered by accession number, and the other ordered by document number within each origin series. The document also lists, in an appendix, the titles of reports received as single copies for information

  60. The Printout: Desktop Publishing in the Classroom.

    Science.gov (United States)

    Balajthy, Ernest; Link, Gordon

    1988-01-01

    Reviews software available to the classroom teacher for desktop publishing and describes specific classroom activities. Suggests using desktop publishing to produce large print texts for students with limited sight or for primary students. (NH)

  61. Reading Diagnosis via the Microcomputer (The Printout).

    Science.gov (United States)

    Weisberg, Renee; Balajthy, Ernest

    1989-01-01

    Examines and evaluates microcomputer software designed to assist in diagnosing students' reading abilities and making instructional decisions. Claims that existing software shows valuable potential when used sensibly and critically by trained reading clinicians. (MM)

  62. A guide to automation techniques in calorimetry

    International Nuclear Information System (INIS)

    Renz, D.P.; Wetzel, J.R.; Breakall, K.L.; James, S.J.; Kasperski, P.W.

    1992-01-01

    Many improvements occurring in calorimetry measurement technology in the past few years will help current users of calorimeters to achieve better use of their existing systems. These include a more user friendly operator computer interface, a more detailed printout at the end of each run, improved data processing to eliminate operator error, and improved system monitoring to detect system or environmental problems. In addition, an electrical calibration heater has been developed to replace plutonium-238 heat standards for calibrating calorimeters, and several automation systems allow for easier and safer system operation

  63. U.S. Central Station Nuclear Power Plants: operating history

    International Nuclear Information System (INIS)

    1976-01-01

    The information assembled in this booklet highlights the operating history of U. S. Central Station nuclear power plants through December 31, 1976. The information presented is based on data furnished by the operating electric utilities. The information is presented in the form of statistical tables and computer printouts of major shutdown periods for each nuclear unit. The capacity factor data for each unit is presented both on the basis of its net design electrical rating and its net maximum dependable capacity, as reported by the operating utility to the Nuclear Regulatory Commission

  4. Preliminary report on digitalization of renal microangiograms used in analysing renal parenchymal diseases.

    Science.gov (United States)

    Takahashi, M; Kaneko, M

    1983-01-01

    Glomerulography is a useful method for the angiographic diagnosis of various renal parenchymal diseases. A new system for digitalization of the glomerulogram has been developed using a high resolution television camera and a CT computer. We describe the fundamental procedures involved in the clinical application of digital glomerulography by applying this method to a renal microangiogram of a cow. This new method aids a clearer understanding of the detailed microvasculatures by providing better magnification and storage and allowing for further processing of the original analogue images. With a computer printout of any part of the glomerulogram also possible, an estimation of the glomerular counts and their distribution can now be given for any unit of cross-sectional area of the renal cortex.

  5. Fourier Analysis: Creating A “Virtual Laboratory” Using Computer Simulation

    Directory of Open Access Journals (Sweden)

    Jeff Butterfield

    1998-01-01

    Full Text Available At times the desire for specialized laboratory apparatus to support class activities outstrips the available resources. When this is the case the instructor must look for creative alternatives to help meet the desired objectives. This report examines how a virtual laboratory was created to model and analyze high-speed networking signals in a LAN class using a spreadsheet simulation. The students were able to print out various waveforms (e.g., signals of different frequencies/network media) that are similar to output from test equipment that would have otherwise been cost prohibitive. The activity proved to be valuable in helping students to understand an otherwise difficult concept that is central to modern networking applications. Such simulation is not limited to network signals, but may be applicable in many situations where the artifact under study may be described mathematically.
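
The spreadsheet "virtual laboratory" reduces to evaluating Fourier partial sums: a unit square wave of fundamental frequency f is approximated by (4/pi) * sum over k of sin(2*pi*(2k-1)*f*t)/(2k-1), keeping the first n odd harmonics. A minimal Python sketch under arbitrary choices of frequency and harmonic count:

```python
import math

def square_wave_partial_sum(t, f, n_harmonics):
    """Fourier partial sum of a unit square wave (odd harmonics only)."""
    return (4 / math.pi) * sum(
        math.sin(2 * math.pi * (2 * k - 1) * f * t) / (2 * k - 1)
        for k in range(1, n_harmonics + 1)
    )

f = 10e6  # assume a 10 MHz fundamental, roughly LAN-class signalling
samples = [square_wave_partial_sum(i / (50 * f), f, 5) for i in range(50)]
print(["%+.2f" % s for s in samples[:10]])
```

Tabulating such samples for increasing n_harmonics shows the waveform sharpening toward a square wave, which is the effect the students examined.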

  6. Simplified application of electronic data processing in a natural science and technology special library in combination with an improved literature description

    Energy Technology Data Exchange (ETDEWEB)

    Bretnuetz, E.

    1975-10-01

    A pilot project in a special library for natural science and technology to record bibliographic data on several kinds of literature within a simplified scheme and to process them in a computer by simple programs is described. The printout consists of several lists arranged according to several aspects. At the same time a relevant thesaurus is tested as to its suitability for an improved description of the literature. The results show that the literature handled is identified sufficiently within this simplified scheme. After supplementation by some special terms, the thesaurus can be used for a deeper analysis of the literature. (auth)

  7. Three Mile Island nuclear reactor accident of March 1979. Environmental radiation data: Volume II. A report to the President's Commission on the Accident at Three Mile Island

    International Nuclear Information System (INIS)

    Bretthauer, E.W.; Grossman, R.F.; Thome, D.J.; Smith, A.E.

    1981-03-01

    This report contains a listing of environmental radiation monitoring data collected in the vicinity of Three Mile Island (TMI) following the March 28, 1979 accident. These data were collected by the EPA, NRC, DOE, HHS, the Commonwealth of Pennsylvania, or the Bethlehem Steel Corporation. The original report was printed in September 1979 and the update was released in December 1979. Table 6-Summary of Department of Health, Education, and Welfare (HEW) sampling and analytical procedures; Table 7-Computer printout of environmental data collected by HEW; Table 8-Summary of US Nuclear Regulatory Commission (NRC) sampling and analytical procedures

  8. The use of historical data storage and retrieval systems at nuclear power plants

    International Nuclear Information System (INIS)

    Langen, P.A.

    1984-01-01

    In order to assist the nuclear plant operator in the assessment of useful historical plant information, C-E has developed the Historical Data Storage and Retrieval (HDSR) system, which will record, store, recall, and display historical information as it is needed by plant personnel. The system has been designed to respond to the user's needs under a variety of situations. The user is offered the choice of viewing historical data on color video displays as groups or on computer printouts as logs. The graphical representation is based upon a sectoring concept that provides a zoom-in enlargement of sections of the HDSR graphs

  9. Computer aided periodical and regulated service tests on radiation measuring systems

    International Nuclear Information System (INIS)

    Sandner, W.; Lin, R.; Rothhaupt, W.

    1994-01-01

    Measuring systems for radioactive radiation, which must be registered by official order, have to be tested periodically according to laid-down rules (WKP). A strategy for a test device was drawn up for a flexible adaptation of the procedure to individual requests, but also for a standardization of the logical interface to the measuring system. In particular, the interaction of testing and normal measuring procedures is clearly defined and transparent; the original functional parts of the measuring run are used during the test as far as possible. Adaptation to individual requirements is controlled by ASCII files, so that the program code remains unchanged. The functional possibilities are extensive, also for inspections by customers and authorities. Due to the nearly automatic run of the procedure, including printout of the results, the tests are always comparable. The standard was checked in some actual projects, based on SYSTEM 7000 (Thermo Instrument Systems GmbH) and a PC running under DOS. (orig.) [de

  10. List of documents received by the INDC Secretariat

    International Nuclear Information System (INIS)

    1979-05-01

    The Nuclear Data Section of the International Atomic Energy Agency receives documents originated by or for the International Nuclear Data Committee for distribution. This list includes all INDC documents received and distributed between January 1968 and May 1979, and supersedes INDC(SEC)-66/UN. This list is produced directly from computer printout in two sorts: one ordered by accession number, and the other ordered by document number within each origin series (e.g. listing all INDC(SEC)-documents in one block). References to earlier INDSWG and ''interim'' INDC reports received between 1962 and 1967 are listed in report INDC/199 (dated November 1967)

  11. List of documents received by the INDC Secretariat

    International Nuclear Information System (INIS)

    1987-09-01

    The Nuclear Data Section of the International Atomic Energy Agency receives documents originated by or for the International Nuclear Data Committee for distribution. This list includes all INDC documents received and distributed by the INDC Secretariat during the period January 1984 to March 1986. This list is produced directly from computer printout in two sorts: one ordered by accession number, and the other ordered by document number within each origin series (e.g. listing all INDC(SEC)-documents in one block). In addition to the INDC documents received by the INDC Secretariat for distribution, this document also lists the titles of reports received as single copies for information

  12. Simultaneous real-time data collection methods

    Science.gov (United States)

    Klincsek, Thomas

    1992-01-01

    This paper describes the development of electronic test equipment which executes, supervises, and reports on various tests. This validation process uses computers to analyze test results and report conclusions. The test equipment consists of an electronics component and the data collection and reporting unit. The PC software, display screens, and real-time database are described. Pass-fail procedures and data replay are discussed. The OS/2 operating system and Presentation Manager user interface were used to create a highly interactive automated system. The system outputs are hardcopy printouts and MS-DOS format files which may be used as input for other PC programs.

  13. Isotopic analysis of radioactive waste packages (an inexpensive approach)

    International Nuclear Information System (INIS)

    Padula, D.A.; Richmond, J.S.

    1983-01-01

    A computer printout of the isotopic analysis for all radioactive waste packages containing resins, or other aqueous filter media is now required at the disposal sites at Barnwell, South Carolina, and Beatty, Nevada. Richland, Washington requires an isotopic analysis for all radioactive waste packages. The NRC (Nuclear Regulatory Commission), through 10 CFR 61, will require shippers of radioactive waste to classify and label for disposal all radioactive waste forms. These forms include resins, filters, sludges, and dry active waste (trash). The waste classification is to be based upon 10 CFR 61 (Section 1-7). The isotopes upon which waste classification is to be based are tabulated. 7 references, 8 tables

  14. A computerized program to educate adults about environmental health risks

    International Nuclear Information System (INIS)

    Adams, M.; Dewey, J.; Schur, P.

    1993-01-01

    A computerized program called Environmental Risk Appraisal (ERA) has been developed to educate adults about environmental health risks and to motivate positive behavior change. A questionnaire addresses issues such as radon, environmental tobacco smoke, pesticides, lead, air and water pollution, and work-site risks. Responses are computer processed in seconds to produce an individualized computer printout containing a score, educational messages, and phone numbers to call for more information. A variety of audiences including environmental groups, worksites, women's organizations and health professionals were represented in this study of 269 participants. Many respondents indicated they were exposed to important environmental hazards and nearly 40 percent reported they had, or might have had, an environment-related illness at some time. Preliminary evaluation indicates the program is effective as an educational tool in raising awareness of environmental health risks
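
A questionnaire-to-printout program of this kind reduces to scoring coded responses and attaching canned educational messages. A minimal Python sketch follows; the items, point weights, and messages are invented for illustration and are not taken from the ERA instrument.

```python
# Hypothetical questionnaire items; the real ERA instrument differs.
RISK_POINTS = {
    "radon_tested":   {"yes": 0, "no": 2, "unsure": 1},
    "smoker_in_home": {"yes": 3, "no": 0, "unsure": 1},
    "lead_paint":     {"yes": 2, "no": 0, "unsure": 1},
}
MESSAGES = {
    "radon_tested":   "Consider a home radon test kit.",
    "smoker_in_home": "Environmental tobacco smoke is a known health risk.",
    "lead_paint":     "Homes built before 1978 may contain lead-based paint.",
}

answers = {"radon_tested": "no", "smoker_in_home": "yes", "lead_paint": "unsure"}
score = sum(RISK_POINTS[q][a] for q, a in answers.items())
print("Risk score:", score)
for q, a in answers.items():
    if RISK_POINTS[q][a] > 0:
        print("-", MESSAGES[q])
```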

  15. Work improvement by computerizing the process of shielding block production

    International Nuclear Information System (INIS)

    Kang, Dong Hyuk; Jeong, Do Hyeong; Kang, Dong Yoon; Jeon, Young Gung; Hwang, Jae Woong

    2013-01-01

    Introduction of a CR (Computed Radiography) system created a process of printing therapy irradiation images and converting the degree of enlargement. The aim was to increase job efficiency and contribute to work improvement by computerizing this process with home-grown software. Microsoft EXCEL (ver. 2007) and VISUAL BASIC (ver. 6.0) were used to write the software. A window for each shield block was designed to enter patients' treatment information. Distances on the digital images were measured, the measured data were entered into the Excel program to calculate the degree of enlargement, and printouts were produced to manufacture shield blocks. By computerizing the existing method with this program, the degree of enlargement can easily be calculated and patients' treatment information can be entered into the printouts by using the macro function. As a result, errors in calculation which may occur during the process of production, or errors in which treatment information is delivered incorrectly, can be reduced. In addition, with the simplification of the conversion process for the degree of enlargement, no copy machine was needed, which reduced the use of paper. Work has been improved by computerizing the process of block production and applying it in practice, which simplifies the existing method. This software can be adapted to the actual conditions of each hospital in various ways using the many features of EXCEL and VISUAL BASIC, which have already been proven and widely used
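
The enlargement arithmetic that the EXCEL macro automates is divergent-beam geometry: a distance measured on the film image scales linearly with distance from the source. A minimal Python sketch under the assumption that a fiducial marker of known length calibrates the digital image; the distances used are illustrative, not clinical values.

```python
# Divergent-beam scaling: a contour measured at the film plane maps to the
# block tray plane by the ratio of source-to-tray and source-to-film distances.

def enlargement_factor(fiducial_true_mm, fiducial_on_image_mm):
    """Scale converting image millimetres to true film-plane millimetres."""
    return fiducial_true_mm / fiducial_on_image_mm

def to_tray_scale(distance_on_image_mm, factor, source_tray_mm, source_film_mm):
    """Project a measured image distance back to the block tray plane."""
    return distance_on_image_mm * factor * (source_tray_mm / source_film_mm)

f = enlargement_factor(fiducial_true_mm=100.0, fiducial_on_image_mm=80.0)
print("%.1f mm at the tray" % to_tray_scale(40.0, f, 650.0, 1400.0))
```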

  16. Work improvement by computerizing the process of shielding block production

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dong Hyuk; Jeong, Do Hyeong; Kang, Dong Yoon; Jeon, Young Gung; Hwang, Jae Woong [Proton Therapy Center, National Cancer Center, Goyang (Korea, Republic of)

    2013-09-15

    Introduction of a CR (Computed Radiography) system created a process of printing therapy irradiation images and converting the degree of enlargement. The aim was to increase job efficiency and contribute to work improvement by computerizing this process with home-grown software. Microsoft EXCEL (ver. 2007) and VISUAL BASIC (ver. 6.0) were used to write the software. A window for each shield block was designed to enter patients' treatment information. Distances on the digital images were measured, the measured data were entered into the Excel program to calculate the degree of enlargement, and printouts were produced to manufacture shield blocks. By computerizing the existing method with this program, the degree of enlargement can easily be calculated and patients' treatment information can be entered into the printouts by using the macro function. As a result, errors in calculation which may occur during the process of production, or errors in which treatment information is delivered incorrectly, can be reduced. In addition, with the simplification of the conversion process for the degree of enlargement, no copy machine was needed, which reduced the use of paper. Work has been improved by computerizing the process of block production and applying it in practice, which simplifies the existing method. This software can be adapted to the actual conditions of each hospital in various ways using the many features of EXCEL and VISUAL BASIC, which have already been proven and widely used.

  17. Automated multispectra alpha spectrometer and data reduction system

    International Nuclear Information System (INIS)

    Hochel, R.C.

    1975-12-01

    A complete hardware and software package for the accumulation and rapid analysis of multiple alpha pulse height spectra has been developed. The system utilizes a 4096-channel analyzer capable of accepting up to sixteen inputs from solid-state surface barrier detectors via mixer-router modules. The analyzer is interfaced to a desk-top programmable calculator and thermal line printer. A chained software package including spectrum printout, peak analysis, plutonium-238 and plutonium-239 data reduction, and automatic energy calibration routines was written. With the chained program a complete printout, peak analysis, and plutonium data reduction of a 512-channel alpha spectrum are obtained in about three minutes with an accuracy within five percent of hand analyses
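
The peak-analysis step of such a chained package typically subtracts a background estimated from channels flanking the region of interest, then reports the net area and centroid of the peak. A minimal Python sketch with invented channel data:

```python
# Net area and centroid of a peak in a pulse-height spectrum, with a flat
# background taken from bg_width channels on each side of the region.

def peak_centroid_and_area(counts, lo, hi, bg_width=3):
    left = counts[lo - bg_width:lo]
    right = counts[hi + 1:hi + 1 + bg_width]
    bg_per_channel = (sum(left) + sum(right)) / (len(left) + len(right))
    net = [counts[ch] - bg_per_channel for ch in range(lo, hi + 1)]
    area = sum(net)
    centroid = sum(ch * n for ch, n in zip(range(lo, hi + 1), net)) / area
    return centroid, area

spectrum = [5] * 512                       # flat 5-count background
for ch, extra in zip(range(40, 47), [10, 60, 180, 250, 170, 55, 12]):
    spectrum[ch] += extra                  # synthetic alpha peak
print(peak_centroid_and_area(spectrum, 40, 46))
```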

  18. Connection of control circuits of machine for automatic measurement of radioactive samples

    International Nuclear Information System (INIS)

    Vorlicek, J.

    1984-01-01

    A windowless through-flow gas detector is used for measurement. The automatic machine is controlled by four flip-flops defining the following states: dish replacement in the measuring space, washing, measurement, measured-value print-out, and resetting. The first and second outputs of the first, second and third flip-flops are connected to six inputs of a block whose four outputs provide counter reset and stop-watch reset, washing, measurement, and print-out. Such machine control eliminates measurement errors by disabling sample measurement until air introduced into the measuring space (on an unwashed dish, or on several dishes passed under the detector) has been removed. The elimination of this error is also guaranteed in manual operation. (M.D.)

  19. Wien Automatic System Package (WASP). A computer code for power generating system expansion planning. Version WASP-III Plus. User's manual. Volume 2: Appendices

    International Nuclear Information System (INIS)

    1995-01-01

    With several Member States, the IAEA has completed a new version of the WASP program, which has been called WASP-III Plus since it follows quite closely the methodology of the WASP-III model. The major enhancements in WASP-III Plus with respect to the WASP-III version are: increase in the number of thermal fuel types (from 5 to 10); verification of which configurations generated by CONGEN have already been simulated in previous iterations with MERSIM; direct calculation of combined Loading Order of FIXSYS and VARSYS plants; simulation of system operation includes consideration of physical constraints imposed on some fuel types (i.e., fuel availability for electricity generation); extended output of the resimulation of the optimal solution; generation of a file that can be used for graphical representation of the results of the resimulation of the optimal solution and cash flows of the investment costs; calculation of cash flows allows the user to include the capital costs of plants firmly committed or in construction (FIXSYS plants); user control of the distribution of capital cost expenditures during the construction period (if required to be different from the general 'S' curve distribution used as default). This second volume of the document to support use of the WASP-III Plus computer code consists of 5 appendices giving some additional information about the WASP-III Plus program. Appendix A is mainly addressed to the WASP-III Plus system analyst and supplies some information which could help in the implementation of the program on the user computer facilities. This appendix also includes some aspects about WASP-III Plus that could not be treated in detail in Chapters 1 to 11. Appendix B identifies all error and warning messages that may appear in the WASP printouts and advises the user how to overcome the problem. Appendix C presents the flow charts of the programs along with a brief description of the objectives and structure of each module. Appendix D describes the

  20. 77 FR 37068 - Muzaffer Aslan, M.D.; Decision and Order

    Science.gov (United States)

    2012-06-20

    ... submitted printouts it obtained from the California Substance Utilization Review & Evaluation System showing.../ acetaminophen, a schedule III controlled substance, as well as zolpidem tartrate and diethylproprion hcl, both...

  1. A Novel Approach For Ankle Foot Orthosis Developed By Three Dimensional Technologies

    Science.gov (United States)

    Belokar, R. M.; Banga, H. K.; Kumar, R.

    2017-12-01

    This study presents a novel approach for testing mechanical properties of medical orthosis developed by three dimensional (3D) technologies. A hand-held 3D laser scanner is used for generating 3D mesh geometry directly from the patient's limb. Subsequently a 3D printable orthotic design is produced from the crude input model by means of Computer Aided Design (CAD) software. The Fused Deposition Modelling (FDM) method in Additive Manufacturing (AM) technologies is used to fabricate the 3D printable Ankle Foot Orthosis (AFO) prototype in order to test the mechanical properties of the printout. According to the test results, the printed Acrylonitrile Butadiene Styrene (ABS) AFO prototype has a sufficient elasticity modulus and durability for a patient-specific medical device manufactured by 3D technologies.

  2. Quantification of video-taped images in microcirculation research using inexpensive imaging software (Adobe Photoshop).

    Science.gov (United States)

    Brunner, J; Krummenauer, F; Lehr, H A

    2000-04-01

    Study end-points in microcirculation research are usually video-taped images rather than numeric computer print-outs. Analysis of these video-taped images for the quantification of microcirculatory parameters usually requires computer-based image analysis systems. Most software programs for image analysis are custom-made, expensive, and limited in their applicability to selected parameters and study end-points. We demonstrate herein that an inexpensive, commercially available computer software (Adobe Photoshop), run on a Macintosh G3 computer with inbuilt graphic capture board provides versatile, easy to use tools for the quantification of digitized video images. Using images obtained by intravital fluorescence microscopy from the pre- and postischemic muscle microcirculation in the skinfold chamber model in hamsters, Photoshop allows simple and rapid quantification (i) of microvessel diameters, (ii) of the functional capillary density and (iii) of postischemic leakage of FITC-labeled high molecular weight dextran from postcapillary venules. We present evidence of the technical accuracy of the software tools and of a high degree of interobserver reliability. Inexpensive commercially available imaging programs (i.e., Adobe Photoshop) provide versatile tools for image analysis with a wide range of potential applications in microcirculation research.
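
The diameter measurement itself reduces to converting a pixel distance across the vessel into microns with a magnification calibration, e.g. from a stage micrometer imaged at the same setting. A minimal Python sketch; the calibration numbers are illustrative assumptions, not values from the study.

```python
# Pixel-to-micron conversion for vessel diameter measurements.

def pixels_to_microns(pixel_distance, microns_per_pixel):
    return pixel_distance * microns_per_pixel

# Assume a 100 um graticule division spans 312 pixels at this magnification.
microns_per_pixel = 100.0 / 312.0
print("%.1f um vessel diameter" % pixels_to_microns(28.0, microns_per_pixel))
```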

  3. Thermal Analysis of Braille Formed by Using Screen Printing and Inks with Thermo Powder

    Directory of Open Access Journals (Sweden)

    Svіtlana HAVENKO

    2015-03-01

    Full Text Available In order to improve the integration of blind people into society, suitable conditions should be provided for them. The expansion of Braille (BR) use could serve this purpose. Depending on the materials used for Braille, it can be formed or printed in different ways: embossing, screen printing, thermoforming, digital printing. The aim of this research is to determine the effect of thermal properties of screen printing inks and inks with thermo-powder on the qualitative parameters of Braille. Screen printing inks and inks with thermo-powder were chosen for the research. In the qualitative analysis of printouts with Braille, thermal stability was evaluated by analyzing thermograms obtained with a Q-1500 derivatograph. This paper presents the findings of the thermogravimetric (TG), differential thermogravimetric (DTG) and differential thermal analysis (DTA) of printouts printed on Plike paperboard using traditional screen printing inks and screen printing inks with thermo-powder. Based on the findings it is determined that the thermal stability of printouts printed with thermo-powder ink is higher than that of printouts printed with traditional screen printing inks. It is determined that the appropriate drying temperature range for screen printing inks with thermo-powder is 98 ºC to 198 ºC because in this range a better relief of the Braille dots is obtained. DOI: http://dx.doi.org/10.5755/j01.ms.21.1.5702

  4. PLC and DTAM Software Programs for Pumping Instrumentation and Control Skid P

    International Nuclear Information System (INIS)

    KOCH, M.R.

    2000-01-01

    This document describes the software programs for the Programmable Logic Controller and the Data Table Access Module for Pumping Instrumentation and Control skid ''P''. The Appendices contain copies of the printouts of these software programs

  5. Towards gloss control in fine art reproduction

    Science.gov (United States)

    Baar, Teun; Brettel, Hans; Ortiz Segovia, Maria V.

    2015-03-01

    The studies regarding fine art reproduction mainly focus on the accuracy of colour and the recreation of surface texture properties. Since reflection properties other than colour are neglected, important details of the artwork are lost. For instance, gloss properties, often characteristic to painters and particular movements in the history of art, are not well reproduced. The inadequate reproduction of the different gloss levels of a piece of fine art leads to a specular reflection mismatch in printed copies with respect to the original works that affects the perceptual quality of the printout. We used different print parameters of a 3D high resolution printing setup to control the gloss level on a printout locally. Our method can be used to control gloss automatically and in crucial applications such as fine art reproduction.

  6. PLC and DTAM Software Programs for Pumping Instrumentation and Control Skid ''M''

    International Nuclear Information System (INIS)

    KOCH, M.R.

    2000-01-01

    This document describes the software programs for the Programmable Logic Controller and the Data Table Access Module for Pumping Instrumentation and Control skid ''M''. The Appendices contain copies of the printouts of these software programs

  7. PLC/DTAM Software Programs for Pumping Instrumentation and Control Skid ''L''

    International Nuclear Information System (INIS)

    KOCH, M.R.

    2000-01-01

    This document describes the software programs for the Programmable Logic Controller and the Data Table Access Module for Pumping Instrumentation and Control skid ''L''. The Appendices contain copies of the printouts of these software programs

  8. Calibration of ADRET voltage generator type CV102. Program CODAV

    International Nuclear Information System (INIS)

    Lagarde, Gerard.

    1978-07-01

    The CODAV program, designed by the Metrology SES/SME laboratory, is used for the calibration of the ADRET voltage generator type CV.102. A JCAM.10 microcomputer runs the measurement cycle and the printout of the results [fr

  9. Design of an autonomous mobile robot for service applications

    CSIR Research Space (South Africa)

    De Villiers, M

    2011-02-01

    Full Text Available This research project proposes the development of an autonomous, omnidirectional vehicle that will be used for general indoor service applications. A suggested trial application for this service robot will be to deliver printouts to various network...

  10. PLC and DTAM Software Programs for Pumping Instrumentation and Control Skid M

    Energy Technology Data Exchange (ETDEWEB)

    KOCH, M.R.

    2000-02-14

    This document describes the software programs for the Programmable Logic Controller and the Data Table Access Module for Pumping Instrumentation and Control skid ''M''. The Appendices contain copies of the printouts of these software programs.

  11. PLC/DTAM Software Programs for Pumping Instrumentation and Control Skid L

    Energy Technology Data Exchange (ETDEWEB)

    KOCH, M.R.

    2000-01-03

    This document describes the software programs for the Programmable Logic Controller and the Data Table Access Module for Pumping Instrumentation and Control skid ''L''. The Appendices contain copies of the printouts of these software programs.

  12. A users guide for the radioactive waste management code 'SIMULATION 2'

    International Nuclear Information System (INIS)

    Moore, D.; Tymons, B.J.

    1984-09-01

    This report is a users' guide to the radioactive waste management program SIMULATION. It gives a complete description of the calculational method used (with worked examples), a specification of the input data requirements, and samples of printout from the program. (author)

  13. Project W-058 monitor and control system logic

    International Nuclear Information System (INIS)

    ROBERTS, J.B.

    1999-01-01

    This supporting document contains the printout of the control logic for the Project W-058 Monitor and Control System, as developed by Programmable Control Services, Inc. The logic is arranged in five appendices, one for each programmable logic controller console

  14. ERTS-1 Virgin Islands experiment 589: Determine boundaries of ERTS and aircraft data within which useful water quality information can be obtained. [water pollution in St. Thomas harbor, Virgin Islands

    Science.gov (United States)

    Coulbourn, W. C.; Egan, W. G.; Olsen, D. A. (Principal Investigator); Heaslip, G. B.

    1973-01-01

    The author has identified the following significant results. The boundaries of application of ERTS-1 and aircraft data are established for St. Thomas harbor within which useful water quality information can be obtained. In situ physical, chemical, and biological water quality and benthic data were collected. Moored current meters were employed. Optical measurements of solar irradiance, color test panel radiance and water absorption were taken. Procedures for correlating in situ optical, biological, and chemical data with underflight aircraft I2S data and ERTS-1 MSS scanner data are presented. Comparison of bulk and precision CCT computer printout data for this application is made, and a simple method for geometrically locating bulk data individual pixels based on land-water interface is described. ERTS spacecraft data and I2S aircraft imagery are correlated with optical in situ measurements of the harbor water, with the aircraft green photographic and ERTS-1 MSS-4 bands being the most useful. The biological pigments correlate inversely with the optical data for inshore areas and directly further seaward. Automated computer data processing facilitated analysis.

  15. Environmental control implications of generating electric power from coal. 1977 technology status report. Appendix A (Part 2). Coal preparation and cleaning assessment study appendix

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-12-01

    This report presents the results of integrating coal washability and coal reserves data obtained from the U.S. Bureau of Mines. Two computer programs were developed to match the appropriate entries in each data set and then merge the data into the form presented in this report. Approximately 18% of the total demonstrated coal reserves were matched with washability data. However, about 35% of the reserves that account for 80% of current production were successfully matched. Each computer printout specifies the location and size of the reserve, and then describes the coal with data on selected physical and chemical characteristics. Washability data are presented for three crush sizes (1.5 in., 3/8 in., and 14 mesh) and several specific gravities. In each case, the percent recovery, Btu/lb, percent ash, percent sulfur, lb SO2/10^6 Btu, and reserves available at 1.2 lb SO2/10^6 Btu are given. The sources of the original data and the methods used in the integration are discussed briefly.
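
The matching step described above is, in essence, a key join between the two Bureau of Mines data sets. A minimal Python sketch with an invented location/seam key; the actual programs presumably needed looser matching rules, since only part of the reserves could be paired with washability entries.

```python
# Join reserve records to washability records on a shared key, keeping only
# reserves for which washability data exist. Keys and values are invented.

reserves = [
    {"key": "PA-Pittsburgh", "million_tons": 120.0},
    {"key": "WV-Sewell", "million_tons": 80.0},
]
washability = {
    "PA-Pittsburgh": {"crush": "1.5 in", "pct_sulfur": 1.8, "btu_per_lb": 13100},
}

merged = [dict(r, **washability[r["key"]]) for r in reserves
          if r["key"] in washability]
print(merged)   # only the matched fraction survives, as in the report
```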

  16. ISOLA II: a FORTRAN IV program for the calculation of long-term dose distribution in the vicinity of nuclear installations

    International Nuclear Information System (INIS)

    Huebschmann, W.; Nagel, D.

    The computer code ISOLA serves for the annual calculation of the radiation burden of the environment of the Nuclear Research Center at Karlsruhe resulting from the release of alpha-active and beta-active off-gases. In the improved version ISOLA II the model of a double Gaussian distribution function is strictly maintained, so that the influence due to neighboring sectors is included. The emissions are assumed to be constant in time during a given time period. The user may select either the print-out of an isodose map for a desired area (for example a map square 20 km on each edge) or a list of doses for up to 2000 specified points (for example in the surrounding communities). The input and output formats are shown by means of an example
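
The double Gaussian model is the standard Gaussian plume with total reflection at the ground. A minimal Python sketch of the ground-level concentration formula it builds on; the release rate, wind speed, and dispersion parameters below are illustrative, and the actual code additionally averages over wind statistics, sums sector contributions, and converts concentration to dose.

```python
import math

def ground_level_concentration(q, u, sigma_y, sigma_z, y, stack_height):
    """Ground-level concentration (Bq/m^3) for a continuous point release of
    q Bq/s in wind u m/s, double-Gaussian plume reflected at the ground."""
    return (q / (math.pi * sigma_y * sigma_z * u)
            * math.exp(-y**2 / (2 * sigma_y**2))
            * math.exp(-stack_height**2 / (2 * sigma_z**2)))

# Illustrative numbers: 1 GBq/s release, 5 m/s wind, dispersion at ~1 km range.
print(ground_level_concentration(1e9, 5.0, 80.0, 50.0, y=0.0, stack_height=100.0))
```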

  17. Computerized management report system for monitoring manpower and cost

    International Nuclear Information System (INIS)

    Bullington, V.R.; Stephenson, R.L.; Cardwell, R.G.

    1980-04-01

    Although most cost systems offer complete detail and traceability, not all provide timely detail in a concise form useful to senior management. This system was developed for a multifunction research organization funded from many sources. It extracts cost and manpower data from the general cost systems, summarizes it, compares it by program with previous cost periods, and presents it with minimum detail yet with maximum overview. The system monitors the basic manpower distribution of effort at the source, that is, the division time-card input. Cost data are taken from the central computer ahead of the print-out and report-distribution steps; thus, the summary information is available several days ahead of the detailed reports. This procedure has been regularly used for several months, and has proven to be a valuable tool in management action and planning. 9 figures

  18. 50 CFR 679.50 - Groundfish Observer Program.

    Science.gov (United States)

    2010-10-01

    ... weights, scale calibration records, bin sensor readouts, and production records. (viii) Assistance... regulations; printouts or tallies of scale weights; scale calibration records; bin sensor readouts; and... Program's drug and alcohol policy. Observer job pamphlets and the drug and alcohol policy are available...

  19. Nuclear timer/counter

    International Nuclear Information System (INIS)

    Wuthayavanich, S.

    1978-01-01

    This thesis describes the development of a Timer/Counter compatible with the standard Nuclear Instrument Module specifications. The unit exhibits high accuracy, light weight and ease of maintenance. The unit also has a built-in precision discriminator to reject unwanted signals that may cause interference in counting. With a line-frequency time base the timer can be preset in steps from 0.1 sec to 9 x 10^5 min. The counter, with a six-digit miniature display and an overflow output, has a maximum counting rate of 10 MHz. The accumulated counting data can be transferred to a teletype or printer for hard copy printout with the aid of an ORTEC 777 Line Printer or 432 A Print-out Control, or any print-out interface with input compatible with the print output of the Timer/Counter. Owing to its NIM compatibility the unit is directly powered by the NIM power supply

  20. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  1. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  2. Privacy-Preserving Computation with Trusted Computing via Scramble-then-Compute

    OpenAIRE

    Dang Hung; Dinh Tien Tuan Anh; Chang Ee-Chien; Ooi Beng Chin

    2017-01-01

    We consider privacy-preserving computation of big data using trusted computing primitives with limited private memory. Simply ensuring that the data remains encrypted outside the trusted computing environment is insufficient to preserve data privacy, for data movement observed during computation could leak information. While it is possible to thwart such leakage using generic solutions such as ORAM [42], designing efficient privacy-preserving algorithms is challenging. Besides computation effi...

  3. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    Full Text Available We develop some parts of the frame theory in Banach spaces from the point of view of Computable Analysis. We define computable M-basis and use it to construct a computable Banach space of scalar valued sequences. Computable Xd frames and computable Banach frames are also defined and computable versions of sufficient conditions for their existence are obtained.

  4. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relative high cost of performing these computations on commercially available general purpose computers, a cost high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  5. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  6. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    OpenAIRE

    Karlheinz Schwarz; Rainer Breitling; Christian Allen

    2013-01-01

    Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized ...

  7. Planning for Student Assessment: Participant's Handbook. Bilingual Evaluation Technical Assistance, Workshop III.

    Science.gov (United States)

    California Univ., Los Angeles. Center for the Study of Evaluation.

    This participant's handbook is designed to be used in conjunction with a workshop for planning bilingual student assessment. The following materials are included: (1) simulation materials, including descriptions of simulated programs, tests, test manuals, and printouts; (2) checklists, diagrams, and charts illustrating important points of the…

  8. 76 FR 40329 - Certain Polyester Staple Fiber From the People's Republic of China: Notice of Preliminary Results...

    Science.gov (United States)

    2011-07-08

    ... autonomy from the government in making decisions regarding the selection of management; and (4) whether the... authority to negotiate and sign contracts and other agreements; (3) the companies have autonomy from the...'') as published in the International Financial Statistics of the International Monetary Fund, a printout...

  9. Clinical evaluation of automated processing of electrocardiograms by the Veterans Administration program (AVA 3.4).

    Science.gov (United States)

    Brohet, C R; Richman, H G

    1979-06-01

    Automated processing of electrocardiograms by the Veterans Administration program was evaluated for both agreement with physician interpretation and interpretative accuracy as assessed with nonelectrocardiographic criteria. One thousand unselected electrocardiograms were analyzed by two reviewer groups, one familiar and the other unfamiliar with the computer program. A significant number of measurement errors involving repolarization changes and left axis deviation occurred; however, interpretative disagreements related to statistical decision were largely language-related. Use of a printout with a more traditional format resulted in agreement with physician interpretation by both reviewer groups in more than 80 percent of cases. Overall sensitivity based on agreement with nonelectrocardiographic criteria was significantly greater with use of the computer program than with use of the conventional criteria utilized by the reviewers. This difference was particularly evident in the subgroup analysis of myocardial infarction and left ventricular hypertrophy. The degree of overdiagnosis of left ventricular hypertrophy and posteroinferior infarction was initially unacceptable, but this difficulty was corrected by adjustment of probabilities. Clinical acceptability of the Veterans Administration program appears to require greater physician education than that needed for other computer programs of electrocardiographic analysis; the flexibility of interpretation by statistical decision offers the potential for better diagnostic accuracy.

  10. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    Science.gov (United States)

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on computer hardware achievement, computer anxiety and computer attitude of the eight grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  11. Cloud Computing: The Future of Computing

    OpenAIRE

    Aggarwal, Kanika

    2013-01-01

    Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is to assign the computing to a great number of distributed computers, rather than the local computer ...

  12. CHARACTERISTICS OF PRODUCTS MADE OF 17-4PH STEEL BY MEANS OF 3D PRINTING METHOD

    Directory of Open Access Journals (Sweden)

    Mariusz WALCZAK

    2016-09-01

    Full Text Available The article presents the results of tests of 17-4PH steel fabricated by laser additive manufacturing (LAM), namely direct metal laser sintering (DMLS). This grade of steel is characterized above all by excellent stress corrosion resistance and is applied as a construction material in the chemical, aircraft, medical and mould making industries. 3D metal printing is a relatively new method, enabling significant change of the structural properties of these materials at printing parameters predetermined by the printer's manufacturer for "offline" printing mode. In order to achieve this goal, the authors have carried out an analysis of chemical composition, SEM tests and tests of product surface roughness. Furthermore the products have been subjected to X-ray analysis by means of computed tomography (X-ray CT). Structural discontinuities have been found in the upper layer and inside the printouts subjected to tests.

  13. List of documents received by the INDC Secretariat

    International Nuclear Information System (INIS)

    1984-08-01

    The Nuclear Data Section of the International Atomic Energy Agency receives documents originated by or for the International Nuclear Data Committee for distribution. The present list includes all INDC documents received and distributed by the INDC secretariat during the period January 1982 to June 1984, and supersedes last year's edition of this report, INDC(SEC)-85/UN. This list is produced directly from computer printout in two sorts: one ordered by accession number, and the other ordered by document number within each origin series (e.g. listing all INDC(SEC)-documents in one block). In addition to the INDC documents received by the INDC Secretariat for distribution, this document also lists the titles of reports received as single copies for information. All of the single copy documents which have been received between June 1983 and July 1984 are listed in Appendix A to this document

  14. Three Mile Island nuclear reactor accident of March 1979. Environmental radiation data: Volume I. A report to the President's Commission on the Accident at Three Mile Island

    International Nuclear Information System (INIS)

    Bretthauer, E.W.; Grossman, R.F.; Thome, D.J.; Smith, A.E.

    1981-03-01

    This report contains a listing of environmental radiation monitoring data collected in the vicinity of Three Mile Island (TMI) following the March 28, 1979 accident. These data were collected by the EPA, NRC, DOE, HHS, the Commonwealth of Pennsylvania, or the Bethlehem Steel Corporation. The original report was printed in September 1979 and the update was released in December 1979. Volume 1 consists of the following 5 tables: Table 1-Measurements made by principal participants; Table 2-Cross-check program instituted by US Environmental Protection Agency (EPA) for iodine-131 in milk. Table 3-Comparison of EPA and US Nuclear Regulatory Commission (NRC) air data collected at the Three Mile Island (TMI) Observation Center; Table 4-Summary of EPA Environmental Monitoring Systems Laboratory-Las Vegas (EMSL-LV) and EPA Eastern Environmental Radiation Facility-Montgomery (EERF-Montgomery) sampling and analytical procedures; Table 5-Computer printout of environmental data collected by EPA

  15. Use of remote sensing techniques for inventorying and planning utilization of land resources in South Dakota

    Science.gov (United States)

    Myers, V. I.; Frazee, C. J.; Rusche, A. E.; Moore, D. G.; Nelson, G. D.; Westin, F. C.

    1974-01-01

    The basic procedures for interpreting remote sensing imagery to rapidly develop general soils and land use inventories were developed and utilized in Pennington County, South Dakota. These procedures and remote sensing data products were illustrated and explained to many user groups, some of whom are interested in obtaining similar data. The general soils data were integrated with land soils data supplied by the county director of equalization to prepare a land value map. A computer print-out of this map indicating a land value for each quarter section is being used in tax reappraisal of Pennington County. The land use data provided the land use planners with the present use of land in Pennington County. Additional uses of remote sensing applications are also discussed including tornado damage assessment, hail damage evaluation, and presentation of soil and land value information on base maps assembled from ERTS-1 imagery.

  16. Exshall: A Turkel-Zwas explicit large time-step FORTRAN program for solving the shallow-water equations in spherical coordinates

    Science.gov (United States)

    Navon, I. M.; Yu, Jian

    A FORTRAN computer program is presented and documented applying the Turkel-Zwas explicit large time-step scheme to a hemispheric barotropic model with constraint restoration of integral invariants of the shallow-water equations. We then detail the algorithms embodied in the code EXSHALL, particularly algorithms related to the efficiency and stability of the T-Z scheme and the quadratic constraint restoration method, which is based on a variational approach. In particular we provide details about the high-latitude filtering, Shapiro filtering, and Robert filtering algorithms used in the code. We explain in detail the various subroutines in the EXSHALL code with emphasis on the algorithms implemented in the code and present flowcharts of some major subroutines. Finally, we provide a visual example illustrating a 4-day run using real initial data, along with a sample printout and graphic isoline contours of the height field and velocity fields.
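
Of the filters named, the Robert (Asselin) filter is the simplest to state: it damps the computational mode of leapfrog time stepping by nudging the current time level toward the mean of its neighbours. A minimal Python sketch with a typical filter coefficient; fields are scalars here for brevity, whereas the model applies the same operation gridpoint by gridpoint.

```python
# Robert-Asselin time filter for leapfrog schemes:
#   x_filtered(n) = x(n) + nu * (x(n-1) - 2*x(n) + x(n+1))

def robert_asselin(prev, curr, new, nu=0.06):
    """Return the filtered value of the current time level."""
    return curr + nu * (prev - 2.0 * curr + new)

h_prev, h_curr, h_new = 100.0, 103.0, 101.0
print(robert_asselin(h_prev, h_curr, h_new))   # 102.7: the 2*dt mode is damped
```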

  17. Man-Machine Integrated Design and Analysis System (MIDAS): Functional Overview

    Science.gov (United States)

    Corker, Kevin; Neukom, Christian

    1998-01-01

    This series of screen print-outs illustrates the structure and function of the Man-Machine Integrated Design and Analysis System (MIDAS). Views into the use of the system and its editors are featured. The use case in this set of graphs includes the development of a simulation scenario.

  18. Documentation of the status of international geothermal power plants and a list by country of selected geothermally active governmental and private sector entities

    Energy Technology Data Exchange (ETDEWEB)

    1992-10-01

    This report includes the printouts from the International Geothermal Power Plant Data Base and the Geothermally Active Entity Data Base. Also included are the explanation of the abbreviations used in the power plant data base, maps of geothermal installations by country, and data base questionnaires and mailing lists.

  19. On the Improvement of the "Copyright Law" of Korea for Library Services for Persons with Disabilities

    Science.gov (United States)

    Yoon, Hee-Yoon; Kim, Sin-Young

    2013-01-01

    One of the most important issues for world libraries at the present time is to extend copyright limitations and exceptions for reproduction, for library preservation and distribution services including lending and ILL/DDS, the printout and transmission of Internet information resources, copying of library materials which are rarely available…

  20. Documentation of the status of international geothermal power plants and a list by country of selected geothermally active governmental and private sector entities

    International Nuclear Information System (INIS)

    1992-10-01

    This report includes the printouts from the International Geothermal Power Plant Data Base and the Geothermally Active Entity Data Base. Also included are the explanation of the abbreviations used in the power plant data base, maps of geothermal installations by country, and data base questionnaires and mailing lists

  1. Agreement among graders on Heidelberg retina tomograph (HRT) topographic change analysis (TCA) glaucoma progression interpretation.

    Science.gov (United States)

    Iester, Michele M; Wollstein, Gadi; Bilonick, Richard A; Xu, Juan; Ishikawa, Hiroshi; Kagemann, Larry; Schuman, Joel S

    2015-04-01

    To evaluate agreement among experts of Heidelberg retina tomography's (HRT) topographic change analysis (TCA) printout interpretations of glaucoma progression and explore methods for improving agreement. 109 eyes of glaucoma, glaucoma suspect and healthy subjects with ≥5 visits and 2 good quality HRT scans acquired at each visit were enrolled. TCA printouts were graded as progression or non-progression. Each grader was presented with 2 sets of tests: a randomly selected single test from each visit and both tests from each visit. Furthermore, the TCA printouts were classified with grader's individual criteria and with predefined criteria (reproducible changes within the optic nerve head, disregarding changes along blood vessels or at steep rim locations and signs of image distortion). Agreement among graders was modelled using common latent factor measurement error structural equation models for ordinal data. Assessment of two scans per visit without using the predefined criteria reduced overall agreement, as indicated by a reduction in the slope, reflecting the correlation with the common factor, for all graders with no effect on reducing the range of the intercepts between the graders. Using the predefined criteria improved grader agreement, as indicated by the narrower range of intercepts among the graders compared with assessment using individual grader's criteria. A simple set of predefined common criteria improves agreement between graders in assessing TCA progression. The inclusion of additional scans from each visit does not improve the agreement. We, therefore, recommend setting standardised criteria for TCA progression evaluation.

  2. Neural computation and the computational theory of cognition.

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-04-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism-neural processes are computations in the generic sense. After that, we reject on empirical grounds the common assimilation of neural computation to either analog or digital computation, concluding that neural computation is sui generis. Analog computation requires continuous signals; digital computation requires strings of digits. But current neuroscientific evidence indicates that typical neural signals, such as spike trains, are graded like continuous signals but are constituted by discrete functional elements (spikes); thus, typical neural signals are neither continuous signals nor strings of digits. It follows that neural computation is sui generis. Finally, we highlight three important consequences of a proper understanding of neural computation for the theory of cognition. First, understanding neural computation requires a specially designed mathematical theory (or theories) rather than the mathematical theories of analog or digital computation. Second, several popular views about neural computation turn out to be incorrect. Third, computational theories of cognition that rely on non-neural notions of computation ought to be replaced or reinterpreted in terms of neural computation. Copyright © 2012 Cognitive Science Society, Inc.

  3. Neural Computation and the Computational Theory of Cognition

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-01-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…

  4. The National Shipbuilding Research Program: Report on a Shipyard Surface Preparation and Quality Program

    Science.gov (United States)

    1998-07-23

    Deliverables include laser writer print-outs, and electronic copies using the FrameMaker file format for duplication and printing by a service bureau, the FrameMaker ... The software platform used to develop the written and visual texts for the program (FrameMaker) provides this facility for creating on-line

  5. Simple Model of the Circulation.

    Science.gov (United States)

    Greenway, Clive A.

    1980-01-01

    Describes a program in BASIC-11 that explores the relationships between various variables in the circulatory system and permits manipulation of several semi-independent variables to model the effects of hemorrhage, drug infusions, etc. A flow chart and accompanying sample printout are provided; the program is listed in the appendix. (CS)
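
The core relationships such a model iterates fit in a few lines: mean arterial pressure is cardiac output times total peripheral resistance, and cardiac output is heart rate times stroke volume. A minimal sketch in Python rather than BASIC-11; the baseline and hemorrhage numbers are illustrative assumptions.

```python
# MAP = CO * TPR, with CO = HR * SV. Units: HR in beats/min, SV in ml,
# TPR in mmHg/(L/min); MAP comes out in mmHg.

def mean_arterial_pressure(hr_bpm, sv_ml, tpr):
    cardiac_output_l_min = hr_bpm * sv_ml / 1000.0
    return cardiac_output_l_min * tpr

print(mean_arterial_pressure(70, 70, 20))   # baseline: ~98 mmHg
print(mean_arterial_pressure(110, 25, 25))  # hemorrhage: SV falls, reflex HR/TPR rise
```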

  6. Digital blocks in Camac standard for synchrocyclotron investigations

    International Nuclear Information System (INIS)

    Zhuravlev, N.I.; Li Zu Ehk; Nguen Man' Shat; Petrov, A.G.

    1975-01-01

    Described are brief characteristics and block diagrams of the following 12 blocks in the CAMAC standard designed for experiments on a synchrocyclotron: output register, digital printout, frame controller, logic signal commutator, controlled delay, binary counters of 4 types, exposure-set counter, decimal counter with full indication and L signal grader

  7. 78 FR 12091 - Brian Earl Cressman, M.D.; Decision and Order

    Science.gov (United States)

    2013-02-21

    ... Disposition ("MSD"), seeking: (1) Summary disposition; and (2) a recommendation that "the Respondent's DEA COR as a practitioner be revoked based on the Respondent's lack of a state license." MSD, at 5. A...'s ACSC was attached to the MSD. MSD App. A. Additionally, the Government included a printout from...

  8. Technical Reports (Part I). End of Project Report, 1968-1971, Volume III.

    Science.gov (United States)

    Western Nevada Regional Education Center, Lovelock.

    The pamphlets included in this volume are technical reports prepared as outgrowths of the Student Information Systems of the Western Nevada Regional Education Center (WN-REC) funded by a Title III (Elementary and Secondary Education Act) grant. These reports describe methods of interpreting the printouts from the Student Information System;…

  9. Computer-aided design and computer science technology

    Science.gov (United States)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  10. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    Directory of Open Access Journals (Sweden)

    Karlheinz Schwarz

    2013-09-01

    Full Text Available Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems from fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized below. In each section a further focusing will be provided by occasionally organizing special issues on topics of high interest, collecting papers on fundamental work in the field. More applied papers should be submitted to their corresponding specialist journals. To help us achieve our goal with this journal, we have an excellent editorial board to advise us on the exciting current and future trends in computation from methodology to application. We very much look forward to hearing all about the research going on across the world. [...]

  11. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr). Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala

  12. Operation and maintenance manual Bendix model M-163-01 monitor-controller

    Energy Technology Data Exchange (ETDEWEB)

    Chandler, L.E.

    1981-04-01

    As a part of the modular automatic welding system, the Model M-163-01 Monitor-Controller permits weld energy and electrode force values to be preset for each of six different weld combinations. It also provides in-process monitoring of RMS current, electrode force, and pulse position, and a digital printout of weld data.

  13. Computer architecture fundamentals and principles of computer design

    CERN Document Server

    Dumas II, Joseph D

    2005-01-01

    Introduction to Computer Architecture: What is Computer Architecture?; Architecture vs. Implementation; Brief History of Computer Systems (The First Generation; The Second Generation; The Third Generation; The Fourth Generation; Modern Computers - The Fifth Generation); Types of Computer Systems (Single Processor Systems; Parallel Processing Systems; Special Architectures); Quality of Computer Systems (Generality and Applicability; Ease of Use; Expandability; Compatibility; Reliability); Success and Failure of Computer Architectures and Implementations (Quality and the Perception of Quality; Cost Issues; Architectural Openness, Market Timi

  14. Computer surety: computer system inspection guidance. [Contains glossary

    Energy Technology Data Exchange (ETDEWEB)

    1981-07-01

    This document discusses computer surety in NRC-licensed nuclear facilities from the perspective of physical protection inspectors. It gives background information and a glossary of computer terms, along with threats and computer vulnerabilities, methods used to harden computer elements, and computer audit controls.

  15. BONFIRE: benchmarking computers and computer networks

    OpenAIRE

    Bouckaert, Stefan; Vanhie-Van Gerwen, Jono; Moerman, Ingrid; Phillips, Stephen; Wilander, Jerker

    2011-01-01

    The benchmarking concept is not new in the field of computing or computer networking. With “benchmarking tools”, one usually refers to a program or set of programs, used to evaluate the performance of a solution under certain reference conditions, relative to the performance of another solution. Since the 1970s, benchmarking techniques have been used to measure the performance of computers and computer networks. Benchmarking of applications and virtual machines in an Infrastructure-as-a-Servi...

  16. A survey of computational physics introductory computational science

    CERN Document Server

    Landau, Rubin H; Bordeianu, Cristian C

    2008-01-01

    Computational physics is a rapidly growing subfield of computational science, in large part because computers can solve previously intractable problems or simulate natural processes that do not have analytic solutions. The next step beyond Landau's First Course in Scientific Computing and a follow-up to Landau and Páez's Computational Physics, this text presents a broad survey of key topics in computational physics for advanced undergraduates and beginning graduate students, including new discussions of visualization tools, wavelet analysis, molecular dynamics, and computational fluid dynamics

  17. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). • Illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics • Emphasis on algorithmic advances that will allow re-application in other...

  18. Use of a generic protocol in documentation of prescription errors in Estonia, Norway and Sweden

    Directory of Open Access Journals (Sweden)

    Haavik S

    2012-06-01

    Full Text Available Pharmacists have an important role in detecting, preventing, and solving prescription problems, which, if left unresolved, may pose a risk of harming the patient. Objective: The objective of this study was to evaluate the feasibility of a generic study instrument for documenting prescription problems requiring contact with the prescriber before dispensing. The study was organized: 1) by country: Estonia, Norway and Sweden; 2) by type of prescription: handwritten prescriptions, printouts of prescriptions in the electronic medical record, and prescriptions transmitted electronically to pharmacies; and 3) by recording method: self-completion by pharmacists and independent observers. Methods: Observational study with independent observers at community pharmacies in Estonia (n=4) and Sweden (n=7), and self-completed protocols in Norway (n=9). Results: Pharmacists in Estonia contacted the prescriber for 1.47% of the prescriptions, about 3 times as often as in Norway (0.45%) and Sweden (0.38%). Handwritten prescriptions dominated among the problem prescriptions in Estonia (73.2%), printouts of prescriptions in the electronic medical record in Norway (89.1%), and electronically transmitted prescriptions in Sweden (55.9%). More administrative errors were identified on handwritten prescriptions and printouts of prescriptions in the electronic medical record in Estonia and in Norway than on electronically transmitted prescriptions in Sweden (p<0.05 for prescription types and p<0.01 for countries). However, clinically important errors and delivery problems appeared equally often on the different types of prescriptions. In all three countries, only a few cases of drug interactions and adverse drug reactions were identified. Conclusion: Despite the different patterns of prescription problems in the three countries, the instrument was feasible and can be regarded as appropriate to document and classify prescription problems necessitating contact

  19. Hierarchical cluster analysis of progression patterns in open-angle glaucoma patients with medical treatment.

    Science.gov (United States)

    Bae, Hyoung Won; Rho, Seungsoo; Lee, Hye Sun; Lee, Naeun; Hong, Samin; Seong, Gong Je; Sung, Kyung Rim; Kim, Chan Yun

    2014-04-29

    To classify medically treated open-angle glaucoma (OAG) by the pattern of progression using hierarchical cluster analysis, and to determine OAG progression characteristics by comparing clusters. The study included 95 eyes of 95 OAG patients who received medical treatment and who had undergone visual field (VF) testing at least once per year for 5 or more years. OAG was classified into subgroups using hierarchical cluster analysis based on the following five variables: baseline mean deviation (MD), baseline visual field index (VFI), MD slope, VFI slope, and Glaucoma Progression Analysis (GPA) printout. Other parameters were then compared between clusters. The hierarchical cluster analysis yielded two clusters. Cluster 1 showed -4.06 ± 2.43 dB baseline MD, 92.58% ± 6.27% baseline VFI, -0.28 ± 0.38 dB per year MD slope, -0.52% ± 0.81% per year VFI slope, and all "no progression" cases in the GPA printout, whereas cluster 2 showed -8.68 ± 3.81 dB baseline MD, 77.54% ± 12.98% baseline VFI, -0.72 ± 0.55 dB per year MD slope, -2.22% ± 1.89% per year VFI slope, and seven "possible" and four "likely" progression cases in the GPA printout. There were no significant differences in age, sex, mean IOP, central corneal thickness, or axial length between clusters. However, cluster 2 included significantly more high-tension glaucoma patients and used a greater number of antiglaucoma eye drops than cluster 1. Hierarchical cluster analysis of progression patterns divided OAG into slow and fast progression groups, as evidenced by the parameters of glaucomatous progression in VF testing. In the fast progression group, the prevalence of high-tension glaucoma was greater and the number of antiglaucoma medications administered was higher than in the slow progression group. Copyright 2014 The Association for Research in Vision and Ophthalmology, Inc.
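
    As an illustration of the clustering step described above, the sketch below runs Ward-linkage hierarchical clustering on the five variables named in the abstract and cuts the result into two clusters. It is a hedged sketch only: the eye data are invented placeholders, the GPA coding is assumed, and numpy/scipy are assumed to be available, since the paper does not state its software.

    # Hypothetical sketch: group eyes by five progression variables using
    # Ward-linkage hierarchical clustering (numpy and scipy assumed).
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.stats import zscore

    # One row per eye: [baseline MD (dB), baseline VFI (%),
    #                   MD slope (dB/yr), VFI slope (%/yr),
    #                   GPA printout coded 0=none, 1=possible, 2=likely]
    eyes = np.array([
        [-4.1, 92.6, -0.28, -0.52, 0],
        [-3.5, 94.0, -0.15, -0.30, 0],
        [-8.7, 77.5, -0.72, -2.22, 1],
        [-9.2, 75.1, -0.90, -2.50, 2],
    ])

    # Standardise each variable so no single scale dominates the distances.
    Z = linkage(zscore(eyes, axis=0), method="ward")

    # Cut the dendrogram into two clusters (slow vs. fast progression).
    labels = fcluster(Z, t=2, criterion="maxclust")
    print(labels)  # e.g. [1 1 2 2]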

  20. READS: the rapid electronic assessment documentation system.

    LENUS (Irish Health Repository)

    Hickey, Ann

    2012-12-13

    Patient documentation is time consuming and can detract from care. The authors report a novel computer programme that manipulates routinely collected information to quantify nursing workload, along with the reason for admission, functional status, estimates of in-hospital mortality and life expectancy. The programme stores information in a database, and produces a print-out in a situation/background/assessment/recommendation (SBAR) format. The average time taken to enter 629 patient encounters was 6.6 minutes. Pain was the most common presentation for low workload patients, while high workload patients often presented with altered mental status and reduced mobility. There was only a modest correlation between the risk of death and nursing workload. The programme measures nursing workload without further paperwork, and improves routine documentation with a legible brief report that is automatically generated. This report can be shared and provides data that is immediately available for day-to-day care, audit, quality control and service planning.

  1. Report of the International Ice Patrol Service in the North Atlantic Ocean. Season of 1979.

    Science.gov (United States)

    1979-01-01

    used is (Scobie and Schultz, 1976) and it is basically the sum of a mean value and a wind-driven component; these updated currents were used during ... with direction and U (east-west) and V (north-south) components on a printout and plots. REFERENCES: Scobie, R. W. and R. H. Schultz (1976). Oceanography of ...

  2. Fulltext PDF

    Indian Academy of Sciences (India)

    educational institutions of the country to interact and learn important topics on PDE and its applications from some of the experts on this subject. Applications are invited ... After completing the online application, one must take a printout of the application, get it forwarded by the head of the institution, and email its scanned copy to ...

  3. Technical Reports (Part II). End of Project Report, 1968-1971, Volume IV.

    Science.gov (United States)

    Western Nevada Regional Education Center, Lovelock.

    The pamphlets included in this volume are technical reports prepared as outgrowths of the Student Information System of the Western Nevada Regional Education Center funded by a Title III grant under the Elementary and Secondary Education Act of 1965. These reports demonstrate the use of the stored data; methods of interpreting the printouts from…

  4. Computation code TEP 1 for automated evaluation of technical and economic parameters of operation of WWER-440 nuclear power plant units

    International Nuclear Information System (INIS)

    Zadrazil, J.; Cvan, M.; Strimelsky, V.

    1987-01-01

    The TEP 1 program is used for automated evaluation of the technical and economic parameters of nuclear power plant units with WWER-440 reactors. This is an application program developed by the Research Institute for Nuclear Power Plants in Jaslovske Bohunice for the KOMPLEX-URAN 2M information system, delivered by the USSR to the V-2 nuclear power plants in Jaslovske Bohunice and in Dukovany. The TEP 1 program is written in FORTRAN IV and its operation has two parts: first, the evaluation of technical and economic parameters of operation over a calculation interval of 10 minutes; and second, the control of the calculation procedure, follow-up on input data, determination of technical and economic parameters over a longer time interval, and data printout and storage. The TEP 1 program was tested at the first unit of the V-2 power plant, and no serious faults appeared in the process of evaluating the technical and economic parameters. A modification of the TEP 1 program for the Dukovany nuclear power plant is now being tested on the first unit of that plant. (Z.M.)

  5. Privacy-Preserving Computation with Trusted Computing via Scramble-then-Compute

    Directory of Open Access Journals (Sweden)

    Dang Hung

    2017-07-01

    Full Text Available We consider privacy-preserving computation of big data using trusted computing primitives with limited private memory. Simply ensuring that the data remains encrypted outside the trusted computing environment is insufficient to preserve data privacy, for data movement observed during computation could leak information. While it is possible to thwart such leakage using a generic solution such as ORAM [42], designing efficient privacy-preserving algorithms is challenging. Besides computation efficiency, it is critical to keep trusted code bases lean, for large ones are unwieldy to vet and verify. In this paper, we advocate a simple approach wherein many basic algorithms (e.g., sorting) can be made privacy-preserving by adding a step that securely scrambles the data before feeding it to the original algorithms. We call this approach Scramble-then-Compute (StC), and give a sufficient condition whereby existing external memory algorithms can be made privacy-preserving via StC. This approach facilitates code-reuse, and its simplicity contributes to a smaller trusted code base. It is also general, allowing algorithm designers to leverage an extensive body of known efficient algorithms for better performance. Our experiments show that StC could offer up to 4.1× speedups over known, application-specific alternatives.
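
    A minimal sketch of the scramble-then-compute idea as summarized above: securely shuffle the records first, then hand them to an unmodified algorithm such as sorting, so the access pattern observed afterwards no longer reveals the original input order. The function names are hypothetical, and this toy omits everything the paper actually engineers (trusted hardware, external memory, scalable oblivious shuffling); it only makes the control flow concrete.

    # Conceptual toy of Scramble-then-Compute, not the paper's implementation.
    import secrets

    def scramble(records):
        """Fisher-Yates shuffle driven by a cryptographic RNG."""
        records = list(records)
        for i in range(len(records) - 1, 0, -1):
            j = secrets.randbelow(i + 1)
            records[i], records[j] = records[j], records[i]
        return records

    def scramble_then_sort(records, key=None):
        # Step 1: securely scramble; step 2: run the original algorithm as-is.
        return sorted(scramble(records), key=key)

    print(scramble_then_sort([5, 3, 9, 1]))  # [1, 3, 5, 9]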

  6. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design...... and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view......, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture....

  7. Compilation of modal analyses of volcanic rocks from the Nevada Test Site area, Nye County, Nevada

    International Nuclear Information System (INIS)

    Page, W.R.

    1990-01-01

    Volcanic rock samples collected from the Nevada Test Site, Nye County, Nevada, between 1960 and 1985 were analyzed by thin section to obtain petrographic mode data. In order to provide rapid accessibility to the entire database, all data from the cards were entered into a computerized database. This computer format will enable workers involved in stratigraphic studies in the Nevada Test Site area and other locations in southern Nevada to perform independent analyses of the data. The data were compiled from the mode cards into two separate computer files. The first file consists of data collected from core samples taken from drill holes in the Yucca Mountain area. The second group of samples was collected from measured sections and surface mapping traverses in the Nevada Test Site area. Each data file is composed of computer printouts of tables with mode data from thin section point counts, comments on additional data, and location data. Tremendous care was taken in transferring the data from the cards to the computer, in order to preserve the original information and interpretations provided by the analyzer. In addition to the data files above, a file is included that consists of Nevada Test Site petrographic data published in other US Geological Survey and Los Alamos National Laboratory reports. These data are presented to supply the user with an essentially complete modal database of samples from the volcanic stratigraphic section in the Nevada Test Site area. 18 refs., 4 figs

  8. Computer scientist looks at reliability computations

    International Nuclear Information System (INIS)

    Rosenthal, A.

    1975-01-01

    Results from the theory of computational complexity are applied to reliability computations on fault trees and networks. A well known class of problems which almost certainly have no fast solution algorithms is presented. It is shown that even approximately computing the reliability of many systems is difficult enough to be in this class. In the face of this result, which indicates that for general systems the computation time will be exponential in the size of the system, decomposition techniques which can greatly reduce the effective size of a wide variety of realistic systems are explored
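
    The abstract's central point, that exact reliability evaluation is exponential in general while decomposition can collapse the work for structured systems, can be made concrete with a toy comparison. The sketch below is an illustration under invented numbers, not the paper's method: a brute-force evaluation that enumerates all 2^n component states, next to the O(n) series/parallel closed form for the same small system.

    # Brute force enumerates 2^n component states; decomposition is linear.
    from itertools import product

    def brute_force_reliability(p, system_works):
        """Sum the probability of every component-state vector in which the
        system functions; p[i] is the success probability of component i."""
        total = 0.0
        for state in product([0, 1], repeat=len(p)):  # 2^n states
            prob = 1.0
            for works, pi in zip(state, p):
                prob *= pi if works else (1.0 - pi)
            if system_works(state):
                total += prob
        return total

    # Example: components 0 and 1 in parallel, in series with component 2.
    p = [0.9, 0.9, 0.95]
    works = lambda s: (s[0] or s[1]) and s[2]
    print(brute_force_reliability(p, works))      # 0.9405...
    # The series/parallel decomposition gives the same answer directly:
    print((1 - (1 - p[0]) * (1 - p[1])) * p[2])   # 0.9405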

  9. Heterotic computing: exploiting hybrid computational devices.

    Science.gov (United States)

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  10. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  11. Seven health physics calculator programs for the HP-41CV

    International Nuclear Information System (INIS)

    Rittmann, P.D.

    1984-08-01

    Several user-oriented programs for the Hewlett-Packard HP-41CV are explained. The first program builds, stores, alters, and ages a list of radionuclides. This program only handles single- and double-decay chains. The second program performs convenient conversions for the six nuclides of concern in plutonium handling. The conversions are between mass, activity, and weight percents of the isotopes. The source can be aged and/or neutron generation rates can be computed. The third program is a timekeeping program that improves the process of manually estimating and tracking personnel exposure during high dose rate tasks by replacing the pencil, paper, and stopwatch method. This program requires a time module. The remaining four programs deal with computations of time-integrated air concentrations at various distances from an airborne release. Building wake effects, source depletion by ground deposition, and sector averaging can all be included in the final printout of the X/Q - Hanford and X/Q - Pasquill programs. The shorter versions of these, H/Q and P/Q, compute centerline or sector-averaged values and include a subroutine to facilitate dose estimation by entering dose factors and quantities released. The horizontal and vertical dispersion parameters in the Pasquill-Gifford programs were modeled with simple, two-parameter functions that agreed very well with the usual textbook graphs. 8 references, 7 appendices
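
    As an illustration of the kind of calculation the four dispersion programs perform, the sketch below evaluates a ground-level centerline X/Q for a Gaussian plume, with the dispersion parameters modeled as simple two-parameter power laws in the spirit the abstract describes. The coefficients are placeholders rather than the values fitted in the original HP-41CV programs, and the building-wake, ground-deposition, and sector-averaging corrections are omitted.

    # Hedged sketch of a centerline X/Q computation (placeholder constants).
    import math

    def sigma(x_m, a, b):
        """Two-parameter power-law fit to a dispersion curve: a * x**b."""
        return a * x_m ** b

    def chi_over_q(x_m, u_ms, h_m=0.0,
                   ay=0.08, by=0.90, az=0.06, bz=0.80):
        """Ground-level centerline X/Q (s/m^3) for a Gaussian plume from a
        release at effective height h_m, evaluated x_m metres downwind in a
        mean wind of u_ms m/s."""
        sy = sigma(x_m, ay, by)   # horizontal dispersion parameter (m)
        sz = sigma(x_m, az, bz)   # vertical dispersion parameter (m)
        return math.exp(-h_m**2 / (2.0 * sz**2)) / (math.pi * sy * sz * u_ms)

    # Receptor 1000 m downwind, 2 m/s wind, ground-level release:
    print(f"{chi_over_q(1000.0, 2.0):.2e} s/m^3")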

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers at the end of November; it will take about two weeks. The Computing Shifts procedure was tested at full scale during this period and proved to be very efficient: 30 Computing Shift Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  13. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  14. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

    This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference ACS-2014. The Advanced Computer Systems conference concentrated from its beginning on methods and algorithms of artificial intelligence. Later years brought new areas of interest in technical informatics related to soft computing and some more technological aspects of computer science, such as multimedia and computer graphics, software engineering, web systems, information security and safety, and project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security and Software Technologies.

  15. Use of the computer program in a cloud computing

    Directory of Open Access Journals (Sweden)

    Radovanović Sanja

    2013-01-01

    Full Text Available Cloud computing represents a specific form of networking, in which a computer program simulates the operation of one or more server computers. In terms of copyright, all technological processes that take place within cloud computing are covered by the notion of copying computer programs, and by the exclusive right of reproduction. However, this right suffers some limitations in order to allow normal use of the computer program by users. Based on the fact that cloud computing is a virtualized network, the issue of normal use of the computer program requires putting all aspects of the permitted copying into the context of a specific computing environment and of specific processes within the cloud. In this sense, the paper points out that the user of a computer program in cloud computing needs to obtain the consent of the right holder for any act which he undertakes using the program. In other words, copyright applies in cloud computing at full scale, and thus so does the freedom of contract (in the case of this particular restriction as well).

  16. Quantum Computing and the Limits of the Efficiently Computable

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    I'll discuss how computational complexity (the study of what can and can't be feasibly computed) has been interacting with physics in interesting and unexpected ways. I'll first give a crash course about computer science's P vs. NP problem, as well as about the capabilities and limits of quantum computers. I'll then touch on speculative models of computation that would go even beyond quantum computers, using (for example) hypothetical nonlinearities in the Schrodinger equation. Finally, I'll discuss BosonSampling (a proposal for a simple form of quantum computing, which nevertheless seems intractable to simulate using a classical computer) as well as the role of computational complexity in the black hole information puzzle.

  17. COMPARATIVE STUDY OF CLOUD COMPUTING AND MOBILE CLOUD COMPUTING

    OpenAIRE

    Nidhi Rajak*, Diwakar Shukla

    2018-01-01

    The present era is that of Information and Communication Technology (ICT), and much research is under way on Cloud Computing and Mobile Cloud Computing, covering topics such as security issues, data management, load balancing and so on. Cloud computing provides services to the end user over the Internet, and the primary objectives of this computing are resource sharing and pooling among the end users. Mobile Cloud Computing is a combination of Cloud Computing and Mobile Computing. Here, data is stored in...

  18. Patients Reading Their Medical Records: Differences in Experiences and Attitudes between Regular and Inexperienced Readers

    Science.gov (United States)

    Huvila, Isto; Daniels, Mats; Cajander, Åsa; Åhlfeldt, Rose-Mharie

    2016-01-01

    Introduction: We report results of a study of how ordering and reading of printouts of medical records by regular and inexperienced readers relate to how the records are used, to the health information practices of patients, and to their expectations of the usefulness of new e-Health services and online access to medical records. Method: The study…

  19. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  20. Quantum Computing's Classical Problem, Classical Computing's Quantum Problem

    OpenAIRE

    Van Meter, Rodney

    2013-01-01

    Tasked with the challenge to build better and better computers, quantum computing and classical computing face the same conundrum: the success of classical computing systems. Small quantum computing systems have been demonstrated, and intermediate-scale systems are on the horizon, capable of calculating numeric results or simulating physical systems far beyond what humans can do by hand. However, to be commercially viable, they must surpass what our wildly successful, highly advanced classica...

  1. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  2. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals, as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  3. Computational biomechanics for medicine imaging, modeling and computing

    CERN Document Server

    Doyle, Barry; Wittek, Adam; Nielsen, Poul; Miller, Karol

    2016-01-01

    The Computational Biomechanics for Medicine titles provide an opportunity for specialists in computational biomechanics to present their latest methodologies and advancements. This volume comprises eighteen of the newest approaches and applications of computational biomechanics, from researchers in Australia, New Zealand, USA, UK, Switzerland, Scotland, France and Russia. Some of the interesting topics discussed are: tailored computational models; traumatic brain injury; soft-tissue mechanics; medical image analysis; and clinically-relevant simulations. One of the greatest challenges facing the computational engineering community is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. We hope the research presented within this book series will contribute to overcoming this grand challenge.

  4. Synthetic Computation: Chaos Computing, Logical Stochastic Resonance, and Adaptive Computing

    Science.gov (United States)

    Kia, Behnam; Murali, K.; Jahed Motlagh, Mohammad-Reza; Sinha, Sudeshna; Ditto, William L.

    Nonlinearity and chaos can illustrate numerous behaviors and patterns, and one can select different patterns from this rich library of patterns. In this paper we focus on synthetic computing, a field that engineers and synthesizes nonlinear systems to obtain computation. We explain the importance of nonlinearity, and describe how nonlinear systems can be engineered to perform computation. More specifically, we provide an overview of chaos computing, a field that manually programs chaotic systems to build different types of digital functions. Also we briefly describe logical stochastic resonance (LSR), and then extend the approach of LSR to realize combinational digital logic systems via suitable concatenation of existing logical stochastic resonance blocks. Finally we demonstrate how a chaotic system can be engineered and mated with different machine learning techniques, such as artificial neural networks, random searching, and genetic algorithm, to design different autonomous systems that can adapt and respond to environmental conditions.

  5. Explorations in computing an introduction to computer science

    CERN Document Server

    Conery, John S

    2010-01-01

    Introduction (Computation; The Limits of Computation; Algorithms; A Laboratory for Computational Experiments). The Ruby Workbench (introducing Ruby and the RubyLabs environment for computational experiments): Interactive Ruby; Numbers; Variables; Methods; RubyLabs. The Sieve of Eratosthenes (an algorithm for finding prime numbers): The Sieve Algorithm; The mod Operator; Containers; Iterators; Boolean Values and the delete_if Method; Exploring the Algorithm; The sieve Method; A Better Sieve; Experiments with the Sieve. A Journey of a Thousand Miles (iteration as a strategy for solving computational problems): Searching and Sortin
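
    The sieve chapters outlined above build the algorithm around deleting the multiples of each newly found prime; a compact sketch of that approach, written here in Python rather than the book's Ruby, looks as follows.

    # Sieve of Eratosthenes by repeatedly deleting multiples, mirroring the
    # "delete_if" style of the chapter outline.
    def sieve(n):
        """Return all primes <= n."""
        candidates = list(range(2, n + 1))
        primes = []
        while candidates:
            p = candidates[0]               # smallest survivor is prime
            primes.append(p)
            # Cross out every multiple of p, including p itself.
            candidates = [k for k in candidates if k % p != 0]
        return primes

    print(sieve(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]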

  6. Computer Networking Laboratory for Undergraduate Computer Technology Program

    National Research Council Canada - National Science Library

    Naghedolfeizi, Masoud

    2000-01-01

    ...) To improve the quality of education in the existing courses related to computer networks and data communications, as well as in other computer science courses such as programming languages and computer...

  7. Mathematics, Physics and Computer Sciences The computation of ...

    African Journals Online (AJOL)

    Mathematics, Physics and Computer Sciences: The computation of system matrices for biquadratic square finite ... Global Journal of Pure and Applied Sciences ... The computation of system matrices for biquadratic square finite elements.

  8. Computability, complexity, and languages fundamentals of theoretical computer science

    CERN Document Server

    Davis, Martin D; Rheinboldt, Werner

    1983-01-01

    Computability, Complexity, and Languages: Fundamentals of Theoretical Computer Science provides an introduction to the various aspects of theoretical computer science. Theoretical computer science is the mathematical study of models of computation. This text is composed of five parts encompassing 17 chapters, and begins with an introduction to the use of proofs in mathematics and the development of computability theory in the context of an extremely simple abstract programming language. The succeeding parts demonstrate the performance of abstract programming language using a macro expa

  9. A large-scale computer facility for computational aerodynamics

    International Nuclear Information System (INIS)

    Bailey, F.R.; Balhaus, W.F.

    1985-01-01

    The combination of computer system technology and numerical modeling have advanced to the point that computational aerodynamics has emerged as an essential element in aerospace vehicle design methodology. To provide for further advances in modeling of aerodynamic flow fields, NASA has initiated at the Ames Research Center the Numerical Aerodynamic Simulation (NAS) Program. The objective of the Program is to develop a leading-edge, large-scale computer facility, and make it available to NASA, DoD, other Government agencies, industry and universities as a necessary element in ensuring continuing leadership in computational aerodynamics and related disciplines. The Program will establish an initial operational capability in 1986 and systematically enhance that capability by incorporating evolving improvements in state-of-the-art computer system technologies as required to maintain a leadership role. This paper briefly reviews the present and future requirements for computational aerodynamics and discusses the Numerical Aerodynamic Simulation Program objectives, computational goals, and implementation plans

  10. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  11. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down into three areas: the theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  12. 3D face reconstruction from 2D pictures: first results of a web-based computer aided system for aesthetic procedures.

    Science.gov (United States)

    Oliveira-Santos, Thiago; Baumberger, Christian; Constantinescu, Mihai; Olariu, Radu; Nolte, Lutz-Peter; Alaraibi, Salman; Reyes, Mauricio

    2013-05-01

    The human face is a vital component of our identity and many people undergo medical aesthetics procedures in order to achieve an ideal or desired look. However, communication between physician and patient is fundamental to understand the patient's wishes and to achieve the desired results. To date, most plastic surgeons rely on either "free hand" 2D drawings on picture printouts or computerized picture morphing. Alternatively, hardware dependent solutions allow facial shapes to be created and planned in 3D, but they are usually expensive or complex to handle. To offer a simple and hardware independent solution, we propose a web-based application that uses 3 standard 2D pictures to create a 3D representation of the patient's face on which facial aesthetic procedures such as filling, skin clearing or rejuvenation, and rhinoplasty are planned in 3D. The proposed application couples a set of well-established methods together in a novel manner to optimize 3D reconstructions for clinical use. Face reconstructions performed with the application were evaluated by two plastic surgeons and also compared to ground truth data. Results showed the application can provide accurate 3D face representations to be used in clinics (within an average of 2 mm error) in less than 5 min.

  13. Task 5c: measurement and instrumentation under subsystem design of the LLL safeguard material control program. [For fuel reprocessing plant

    Energy Technology Data Exchange (ETDEWEB)

    1976-12-31

    A product survey was conducted of all security products currently available on the market. Documentation is presented of the survey and a printout of the data is included. A general description is given of new but recommended instrumentation and security devices for application to fuel reprocessing plants. Security systems and hardware recommended for development, assembly, and testing are discussed briefly. (DLC)

  14. A comprehensive inventory of radiological and nonradiological contaminants in waste buried or projected to be buried in the subsurface disposal area of the INEL RWMC during the years 1984-2003, Volume 2

    International Nuclear Information System (INIS)

    1995-05-01

    This is the second volume of this comprehensive report of the inventory of radiological and nonradiological contaminants in waste buried or projected to be buried in the subsurface disposal area of the Idaho National Engineering Laboratory. Appendix B contains a complete printout of contaminant inventory and other information from the CIDRA Database and is presented in volumes 2 and 3 of the report

  15. Task 5c: measurement and instrumentation under subsystem design of the LLL safeguard material control program

    International Nuclear Information System (INIS)

    1976-01-01

    A product survey was conducted of all security products currently available on the market. Documentation is presented of the survey and a printout of the data is included. A general description is given of new but recommended instrumentation and security devices for application to fuel reprocessing plants. Security systems and hardware recommended for development, assembly, and testing are discussed briefly

  16. On teaching computer ethics within a computer science department.

    Science.gov (United States)

    Quinn, Michael J

    2006-04-01

    The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.

  17. DELIN and DELOG codes for graphic representation of gamma ray spectra; Programas DELIN y DELOG para la representacion grafica de espectros gamma

    Energy Technology Data Exchange (ETDEWEB)

    Romero, L; Travesi, A

    1983-07-01

    Two FORTRAN IV codes have been developed for graphic representation of the gamma-ray spectra obtained with GeLi detectors and multichannel analyzers. The graphic plotting is carried out with the H.P. Graphic Plotter Mod HP-7221 A, using the graphic package software GRAPHICS-1000 from Hewlett-Packard. The codes have great versatility, and the representation of gamma spectra can be done in a linear, semilog, or log-log scale, as desired. The gamma-ray spectra data are fed into the computer through magnetic tape or perforated paper tape. The different output options and complementary data are given in a conversational way through a terminal with TV display. Among the options that can be selected by the user are the following: smoothing the spectra; drawing the spectra point by point or continuous; output drawing in 1, 2, or 4 sheets with automatic division of the energy scale; overlapping of selected spectra regions in Y-scale ampliation with automatic print-out of the region limits and ampliation factor; printing spectra data and identifications of selected photopeaks. The codes can be employed with any computer using printing devices compatible with HP-GRAPHICS 1000 software, but are easily modified for other printing software thanks to their modular structure of FORTRAN IV subroutines.

  18. Delin and Delog codes for graphic representation of gamma ray spectra

    International Nuclear Information System (INIS)

    Travesi, A.; Romero, L.

    1983-01-01

    Two FORTRAN IV codes have been developed for graphic representation of the gamma-ray spectra obtained with GeLi detectors and multichannel analyzers. The graphic plotting is carried out with the H.P. Graphic Plotter Mod HP-7221 A, using the graphic package software GRAPHICS-1000 from Hewlett-Packard. The codes have great versatility, and the representation of gamma spectra can be done in a linear, semilog, or log-log scale, as desired. The gamma-ray spectra data are fed into the computer through magnetic tape or perforated paper tape. The different output options and complementary data are given in a conversational way through a terminal with TV display. Among the options that can be selected by the user are the following: 1) smoothing the spectra; 2) drawing the spectra point by point or continuous; 3) output drawing in 1, 2 or 4 sheets with automatic division of the energy scale; 4) overlapping of selected spectra regions in γ-scale ampliation with automatic printout of the region limits and ampliation factor; 5) printing spectra data and identifications of selected photopeaks. The codes can be employed with any computer using printing devices compatible with HP-GRAPHICS 1000 software, but are easily modified for other printing software thanks to their modular structure of FORTRAN IV subroutines. (author)

  19. DELIN and DELOG codes for graphic representation of gamma ray spectra

    International Nuclear Information System (INIS)

    Romero, L.; Travesi, A.

    1983-01-01

    Two FORTRAN IV codes have been developed for graphic representation of the gamma-ray spectra obtained with GeLi detectors and multichannel analyzers. The graphic plotting is carried out with the H.P. Graphic Plotter Mod HP-7221 A, using the graphic package software GRAPHICS-1000 from Hewlett-Packard. The codes have great versatility, and the representation of gamma spectra can be done in a linear, semilog, or log-log scale, as desired. The gamma-ray spectra data are fed into the computer through magnetic tape or perforated paper tape. The different output options and complementary data are given in a conversational way through a terminal with TV display. Among the options that can be selected by the user are the following: smoothing the spectra; drawing the spectra point by point or continuous; output drawing in 1, 2, or 4 sheets with automatic division of the energy scale; overlapping of selected spectra regions in Y-scale ampliation with automatic print-out of the region limits and ampliation factor; printing spectra data and identifications of selected photopeaks. The codes can be employed with any computer using printing devices compatible with HP-GRAPHICS 1000 software, but are easily modified for other printing software thanks to their modular structure of FORTRAN IV subroutines.

  20. Parallel quantum computing in a single ensemble quantum computer

    International Nuclear Information System (INIS)

    Long Guilu; Xiao, L.

    2004-01-01

    We propose a parallel quantum computing mode for ensemble quantum computer. In this mode, some qubits are in pure states while other qubits are in mixed states. It enables a single ensemble quantum computer to perform 'single-instruction-multidata' type of parallel computation. Parallel quantum computing can provide additional speedup in Grover's algorithm and Shor's algorithm. In addition, it also makes a fuller use of qubit resources in an ensemble quantum computer. As a result, some qubits discarded in the preparation of an effective pure state in the Schulman-Varizani and the Cleve-DiVincenzo algorithms can be reutilized

  1. ELASTIC CLOUD COMPUTING ARCHITECTURE AND SYSTEM FOR HETEROGENEOUS SPATIOTEMPORAL COMPUTING

    Directory of Open Access Journals (Sweden)

    X. Shi

    2017-10-01

    Full Text Available Spatiotemporal computation implements a variety of different algorithms. When big data are involved, desktop computer or standalone application may not be able to complete the computation task due to limited memory and computing power. Now that a variety of hardware accelerators and computing platforms are available to improve the performance of geocomputation, different algorithms may have different behavior on different computing infrastructure and platforms. Some are perfect for implementation on a cluster of graphics processing units (GPUs), while GPUs may not be useful on certain kind of spatiotemporal computation. This is the same situation in utilizing a cluster of Intel's many-integrated-core (MIC) or Xeon Phi, as well as Hadoop or Spark platforms, to handle big spatiotemporal data. Furthermore, considering the energy efficiency requirement in general computation, Field Programmable Gate Array (FPGA) may be a better solution for better energy efficiency when the performance of computation could be similar or better than GPUs and MICs. It is expected that an elastic cloud computing architecture and system that integrates all of GPUs, MICs, and FPGAs could be developed and deployed to support spatiotemporal computing over heterogeneous data types and computational problems.

  2. Elastic Cloud Computing Architecture and System for Heterogeneous Spatiotemporal Computing

    Science.gov (United States)

    Shi, X.

    2017-10-01

    Spatiotemporal computation implements a variety of different algorithms. When big data are involved, desktop computer or standalone application may not be able to complete the computation task due to limited memory and computing power. Now that a variety of hardware accelerators and computing platforms are available to improve the performance of geocomputation, different algorithms may have different behavior on different computing infrastructure and platforms. Some are perfect for implementation on a cluster of graphics processing units (GPUs), while GPUs may not be useful on certain kind of spatiotemporal computation. This is the same situation in utilizing a cluster of Intel's many-integrated-core (MIC) or Xeon Phi, as well as Hadoop or Spark platforms, to handle big spatiotemporal data. Furthermore, considering the energy efficiency requirement in general computation, Field Programmable Gate Array (FPGA) may be a better solution for better energy efficiency when the performance of computation could be similar or better than GPUs and MICs. It is expected that an elastic cloud computing architecture and system that integrates all of GPUs, MICs, and FPGAs could be developed and deployed to support spatiotemporal computing over heterogeneous data types and computational problems.

  3. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  4. Cloud Computing Fundamentals

    Science.gov (United States)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  5. International Conference of Intelligence Computation and Evolutionary Computation ICEC 2012

    CERN Document Server

    Intelligence Computation and Evolutionary Computation

    2013-01-01

    2012 International Conference of Intelligence Computation and Evolutionary Computation (ICEC 2012) is held on July 7, 2012 in Wuhan, China. This conference is sponsored by Information Technology & Industrial Engineering Research Center.  ICEC 2012 is a forum for presentation of new research results of intelligent computation and evolutionary computation. Cross-fertilization of intelligent computation, evolutionary computation, evolvable hardware and newly emerging technologies is strongly encouraged. The forum aims to bring together researchers, developers, and users from around the world in both industry and academia for sharing state-of-art results, for exploring new areas of research and development, and to discuss emerging issues facing intelligent computation and evolutionary computation.

  6. ZIVIS: A City Computing Platform Based on Volunteer Computing

    International Nuclear Information System (INIS)

    Antoli, B.; Castejon, F.; Giner, A.; Losilla, G.; Reynolds, J. M.; Rivero, A.; Sangiao, S.; Serrano, F.; Tarancon, A.; Valles, R.; Velasco, J. L.

    2007-01-01

    Volunteer computing has emerged as a new form of distributed computing. Unlike other computing paradigms such as Grids, which tend to be based on complex architectures, volunteer computing has demonstrated a great ability to integrate dispersed, heterogeneous computing resources with ease. This article presents ZIVIS, a project which aims to deploy a city-wide computing platform in Zaragoza (Spain). ZIVIS is based on BOINC (Berkeley Open Infrastructure for Network Computing), a popular open-source framework for deploying volunteer and desktop grid computing systems. A scientific code which simulates the trajectories of particles moving inside a stellarator fusion device has been chosen as the pilot application of the project. In this paper we describe the approach followed to port the code to the BOINC framework, as well as some novel techniques, based on standard Grid protocols, that we have used to access the output data present in the BOINC server from a remote visualizer. (Author)

  7. Essentials of cloud computing

    CERN Document Server

    Chandrasekaran, K

    2014-01-01

    Foreword; Preface; Computing Paradigms: Learning Objectives, Preamble, High-Performance Computing, Parallel Computing, Distributed Computing, Cluster Computing, Grid Computing, Cloud Computing, Biocomputing, Mobile Computing, Quantum Computing, Optical Computing, Nanocomputing, Network Computing, Summary, Review Points, Review Questions, Further Reading; Cloud Computing Fundamentals: Learning Objectives, Preamble, Motivation for Cloud Computing, The Need for Cloud Computing, Defining Cloud Computing, NIST Definition of Cloud Computing, Cloud Computing Is a Service, Cloud Computing Is a Platform, 5-4-3 Principles of Cloud Computing, Five Essential Charact

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office: Since the beginning of 2013, the Computing Operations team has successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. (Figure 3 shows the number of events per month for data.) In LS1, the emphasis is on increasing the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  9. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Since the first idea of using GPUs for general-purpose computing, things have evolved over the years and there are now several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A newer emerging standard, OpenCL (Open Computing Language), tries to unify the different GPU general-purpose computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We present the benefits of the CUDA programming model and compare the two main approaches, CUDA and AMD APP (Stream), with the new framework, OpenCL, that tries to unify the GPGPU computing models.
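
    As an illustration of the data-parallel model that CUDA and OpenCL share, the sketch below launches a grid of threads in which each thread handles one array element. It is written in Python using Numba's CUDA support rather than the CUDA C API the paper discusses, purely for brevity; the kernel structure (one thread per element, a bounds check, a blocks-times-threads launch configuration) is the same idea.

      from numba import cuda
      import numpy as np

      @cuda.jit
      def vector_add(a, b, out):
          i = cuda.grid(1)          # global thread index across the whole grid
          if i < out.size:          # guard: the grid may be larger than the data
              out[i] = a[i] + b[i]

      n = 1 << 20
      a = np.random.rand(n).astype(np.float32)
      b = np.random.rand(n).astype(np.float32)
      out = np.zeros_like(a)
      threads_per_block = 256
      blocks = (n + threads_per_block - 1) // threads_per_block
      vector_add[blocks, threads_per_block](a, b, out)  # data-parallel launch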

  10. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and right to a place in Russian education. Methods. The research is based on an analysis of foreign studies of the phenomenon of computational thinking and the ways it is formed in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved alongside the development of computer hardware and software. The practice-oriented interpretation of computational thinking which is dominant among educators is described, along with some ways of forming it. It is shown that computational thinking is a metasubject result of general education as well as one of its tools. In the author's view, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described, a process connected with the evolution of computer and information technologies and the growing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  11. Fast computation of the characteristics method on vector computers

    International Nuclear Information System (INIS)

    Kugo, Teruhiko

    2001-11-01

    Fast computation of the characteristics method to solve the neutron transport equation in a heterogeneous geometry has been studied. Two vector computation algorithms, an odd-even sweep (OES) method and an independent sequential sweep (ISS) method, have been developed, and their efficiency for a typical fuel assembly calculation has been investigated. For both methods, a vector computation is 15 times faster than a scalar computation. Comparing the OES and ISS methods, the following was found: 1) there is a small difference in computation speed, 2) the ISS method shows faster convergence, and 3) the ISS method saves about 80% of computer memory size compared with the OES method. It is, therefore, concluded that the ISS method is superior to the OES method as a vectorization method. In the vector computation, a table-look-up method that reduces the computation time of the exponential function saves only 20% of the whole computation time. Both the coarse mesh rebalance method and the Aitken acceleration method are effective as acceleration methods for the characteristics method; a combination of them saves 70-80% of outer iterations compared with free iteration. (author)
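
    The table-look-up idea mentioned in the abstract replaces repeated evaluations of the exponential attenuation factor with interpolation in a precomputed table. A minimal sketch in Python follows; the grid size and the linear interpolation scheme are assumptions chosen here for illustration, as the paper does not specify them.

      import numpy as np

      # Precompute exp(-x) on a uniform grid once; interpolate at run time.
      X_MAX, N = 20.0, 2048
      _step = X_MAX / (N - 1)
      _table = np.exp(-np.arange(N) * _step)

      def exp_neg(x):
          """Approximate exp(-x) for 0 <= x <= X_MAX by linear interpolation."""
          i = min(int(x / _step), N - 2)     # index of the grid cell containing x
          frac = x / _step - i               # fractional position within the cell
          return _table[i] * (1.0 - frac) + _table[i + 1] * frac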

  12. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed. Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn

  13. Natural Computing in Computational Finance Volume 4

    CERN Document Server

    O’Neill, Michael; Maringer, Dietmar

    2012-01-01

    This book follows on from Natural Computing in Computational Finance Volumes I, II and III. As in the previous volumes of this series, the book consists of a series of chapters, each of which was selected following a rigorous, peer-reviewed selection process. The chapters illustrate the application of a range of cutting-edge natural computing and agent-based methodologies in computational finance and economics. The applications explored include option model calibration, financial trend reversal detection, enhanced indexation, algorithmic trading, corporate payout determination, agent-based modeling of liquidity costs, and trade strategy adaptation. While describing cutting-edge applications, the chapters are written so that they are accessible to a wide audience. Hence, they should be of interest to academics, students and practitioners in the fields of computational finance and economics.

  14. Computer technology and computer programming research and strategies

    CERN Document Server

    Antonakos, James L

    2011-01-01

    Covering a broad range of new topics in computer technology and programming, this volume discusses encryption techniques, SQL generation, Web 2.0 technologies, and visual sensor networks. It also examines reconfigurable computing, video streaming, animation techniques, and more. Readers will learn about an educational tool and game to help students learn computer programming. The book also explores a new medical technology paradigm centered on wireless technology and cloud computing designed to overcome the problems of increasing health technology costs.

  15. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into future computers in order for their components to function. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  16. Die spacer thickness reproduction for central incisor crown fabrication with combined computer-aided design and 3D printing technology: an in vitro study.

    Science.gov (United States)

    Hoang, Lisa N; Thompson, Geoffrey A; Cho, Seok-Hwan; Berzins, David W; Ahn, Kwang Woo

    2015-05-01

    The inability to control die spacer thickness has been reported. However, little information is available on the congruency between the computer-aided design parameters for die spacer thickness and the actual printout. The purpose of this study was to evaluate the accuracy and precision of the die spacer thickness achieved by combining computer-aided design and 3-dimensional printing technology. An ivorine maxillary central incisor was prepared for a ceramic crown. The prepared tooth was duplicated by using polyvinyl siloxane duplicating silicone, and 80 die-stone models were produced from Type IV dental stone. The dies were randomly divided into 5 groups with assigned die spacer thicknesses of 25 μm, 45 μm, 65 μm, 85 μm, and 105 μm (n=16). The printed resin copings, obtained from a printer (ProJet DP 3000; 3D Systems), were cemented onto their respective die-stone models with self-adhesive resin cement and stored at room temperature until sectioning into halves in a buccolingual direction. The internal gap was measured at 5 defined locations per side of the sectioned die. Images of the printed resin coping/die-stone model internal gap dimensions were obtained with an inverted bright field metallurgical microscope at ×100 magnification. The acquired digital image was calibrated, and measurements were made using image analysis software. Mixed models (α=.05) were used to evaluate accuracy. A false discovery rate at 5% was used to adjust for multiple testing. Coefficient of variation was used to determine the precision for each group and was evaluated statistically with the Wald test (α=.05). The accuracy, expressed in terms of the mean differences between the prescribed die spacer thickness and the measured internal gap (standard deviation), was 50 μm (11) for the 25 μm group simulated die spacer thickness, 30 μm (10) for the 45 μm group, 15 μm (14) for the 65 μm group, 3 μm (23) for the 85 μm group, and -10 μm (32) for the 105 μm group. The

  17. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  18. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

    Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly that it is very hard even for professionals to keep updated. Computer people do not

  19. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  20. Computer Series, 3: Computer Graphics for Chemical Education.

    Science.gov (United States)

    Soltzberg, Leonard J.

    1979-01-01

    Surveys the current scene in computer graphics from the point of view of a chemistry educator. Discusses the scope of current applications of computer graphics in chemical education, and provides information about hardware and software systems to promote communication with vendors of computer graphics equipment. (HM)

  1. Framework for Computation Offloading in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Dejan Kovachev

    2012-12-01

    The inherently limited processing power and battery lifetime of mobile phones hinder the execution of computationally intensive applications such as content-based video analysis or 3D modeling. Offloading computationally intensive application parts from the mobile platform into a remote cloud infrastructure or nearby idle computers addresses this problem. This paper presents our Mobile Augmentation Cloud Services (MACS) middleware, which enables adaptive extension of Android application execution from a mobile client into the cloud. Applications are developed using the standard Android development pattern. The middleware does the heavy lifting of adaptive application partitioning, resource monitoring and computation offloading. These elastic mobile applications can run as usual mobile applications, but they can also use remote computing resources transparently. Two prototype applications using the MACS middleware demonstrate the benefits of the approach. The evaluation shows that applications which involve costly computations can benefit from offloading, with around 95% energy savings and significant performance gains compared to local execution only.
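
    The decision of when offloading pays off is usually framed as a simple cost model: offload when the remote execution time plus transfer time, and the energy of transmitting the state, beat local execution. A generic sketch of that reasoning in Python follows; the parameter names and thresholds are illustrative assumptions, not the MACS middleware's actual partitioning policy.

      def should_offload(cycles, data_bytes, f_local_hz, f_remote_hz,
                         bandwidth_bps, p_active_w, p_tx_w):
          """Return True if offloading saves both time and energy.

          cycles     -- CPU cycles the code section needs
          data_bytes -- program state that must be shipped to the server
          """
          t_local = cycles / f_local_hz                 # run on the phone
          t_xfer = data_bytes * 8 / bandwidth_bps       # ship the state
          t_remote = cycles / f_remote_hz + t_xfer      # run in the cloud
          e_local = p_active_w * t_local                # energy: local compute
          e_remote = p_tx_w * t_xfer                    # energy: radio transmission
          return t_remote < t_local and e_remote < e_local

    With a much faster remote CPU and a compute-heavy, data-light section, both conditions typically hold, which is consistent with the large energy savings reported above for suitable workloads.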

  2. Center for computer security: Computer Security Group conference. Summary

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-06-01

    Topics covered include: computer security management; detection and prevention of computer misuse; certification and accreditation; protection of computer security, perspective from a program office; risk analysis; secure accreditation systems; data base security; implementing R and D; key notarization system; DOD computer security center; the Sandia experience; inspector general's report; and backup and contingency planning. (GHT)

  3. Pascal-SC a computer language for scientific computation

    CERN Document Server

    Bohlender, Gerd; von Gudenberg, Jürgen Wolff; Rheinboldt, Werner; Siewiorek, Daniel

    1987-01-01

    Perspectives in Computing, Vol. 17: Pascal-SC: A Computer Language for Scientific Computation focuses on the application of Pascal-SC, a programming language developed as an extension of standard Pascal, in scientific computation. The publication first elaborates on the introduction to Pascal-SC, a review of standard Pascal, and real floating-point arithmetic. Discussions focus on optimal scalar product, standard functions, real expressions, program structure, simple extensions, real floating-point arithmetic, vector and matrix arithmetic, and dynamic arrays. The text then examines functions a

  4. Inkjet 3D printing of microfluidic structures—on the selection of the printer towards printing your own microfluidic chips

    International Nuclear Information System (INIS)

    Walczak, Rafał; Adamski, Krzysztof

    2015-01-01

    This article reports, for the first time, the results of detailed research on the application of inkjet 3D printing for the fabrication of microfluidic structures. CAD designed test structures were printed with four different printers. Dimensional fidelity, shape conformity, and surface roughness were studied for each printout. It was found that the minimum dimension (width or depth) for a properly printed microfluidic channel was approximately 200 μm. Although the nominal resolution of the printers was one order of magnitude better, smaller structures were significantly deformed or not printed at all. It was also found that a crucial step in one-step fabrication of embedded microchannels is the removal of the support material. We also discuss the source of print error and present a way to evaluate other printers. The printouts obtained from the four different printers were compared, and the optimal printing technique and printer were used to fabricate a microfluidic structure for the spectrophotometric characterisation of beverages. UV/VIS absorbance characteristics were collected using this microfluidic structure, demonstrating that the fabricated spectrophotometric chip operated properly. Thus, a proof-of-concept for using inkjet 3D printing for the fabrication of microfluidic structures was obtained. (paper)

  5. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  6. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  7. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer group has been reorganized to take charge of the general-purpose computers DEC10 and VAX and the computer network (Dataswitch, DECnet, IBM connections to GSI and IPP, preparation for Datex-P). (orig.)

  8. Mobile cloud computing for computation offloading: Issues and challenges

    Directory of Open Access Journals (Sweden)

    Khadija Akherfi

    2018-01-01

    Despite the evolution and enhancements that mobile devices have experienced, they are still considered limited computing devices. Today, users are more demanding and expect to execute computationally intensive applications on their smartphone devices. Therefore, Mobile Cloud Computing (MCC) integrates mobile computing and Cloud Computing (CC) in order to extend the capabilities of mobile devices using offloading techniques. Computation offloading tackles limitations of Smart Mobile Devices (SMDs), such as limited battery lifetime, limited processing capabilities, and limited storage capacity, by offloading the execution and workload to other, richer systems with better performance and resources. This paper presents the current offloading frameworks and computation offloading techniques, and analyzes them along with their main critical issues. In addition, it explores different important parameters based on which the frameworks are implemented, such as the offloading method and the level of partitioning. Finally, it summarizes the issues in offloading frameworks in the MCC domain that require further research.

  9. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

    Cloud computing was, and will remain, a new way of providing Internet services and computing. This computing approach builds on many existing services and technologies, such as the Internet, grid computing, and Web services. As a system, cloud computing aims to provide on-demand services that are more acceptable in price and infrastructure. It is precisely the transition from the computer as a product to a service offered to consumers and delivered online. This paper describes the quality of cloud computing services, analyzing the advantages and characteristics they offer. It is a theoretical paper. Keywords: cloud computing, QoS, quality of cloud computing

  10. Building a cluster computer for the computing grid of tomorrow

    International Nuclear Information System (INIS)

    Wezel, J. van; Marten, H.

    2004-01-01

    The Grid Computing Centre Karlsruhe takes part in the development, testing and deployment of hardware and cluster infrastructure, grid computing middleware, and applications for particle physics. The construction of a large cluster computer with thousands of nodes and several PB of data storage capacity is a major task and focus of research. CERN-based accelerator experiments will use GridKa, one of only 8 worldwide Tier-1 computing centers, for their huge computing demands. Computing and storage are already provided for several other running physics experiments on the exponentially expanding cluster. (orig.)

  11. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C³P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high-performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  12. Pervasive Computing

    NARCIS (Netherlands)

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  13. New computational paradigms changing conceptions of what is computable

    CERN Document Server

    Cooper, SB; Sorbi, Andrea

    2007-01-01

    This superb exposition of a complex subject examines new developments in the theory and practice of computation from a mathematical perspective. It covers topics ranging from classical computability to complexity, from biocomputing to quantum computing.

  14. Computing at Stanford.

    Science.gov (United States)

    Feigenbaum, Edward A.; Nielsen, Norman R.

    1969-01-01

    This article provides a current status report on the computing and computer science activities at Stanford University, focusing on the Computer Science Department, the Stanford Computation Center, the recently established regional computing network, and the Institute for Mathematical Studies in the Social Sciences. Also considered are such topics…

  15. Material test data of SUS304 welded joints

    Energy Technology Data Exchange (ETDEWEB)

    Asayama, Tai [Japan Nuclear Cycle Development Inst., Oarai, Ibaraki (Japan). Oarai Engineering Center; Kawakami, Tomohiro [Nuclear Energy System Incorporation, Tokyo (Japan)

    1999-10-01

    This report summarizes the material test data of SUS304 welded joints. Numbers of the data are as follows: Tensile tests 71 (Post-irradiation: 39, Others: 32), Creep tests 77 (Post-irradiation: 20, Others: 57), Fatigue tests 50 (Post-irradiation: 0), Creep-fatigue tests 14 (Post-irradiation: 0). This report consists of the printouts from 'the structural material data processing system'. (author)

  16. Patient Care Planning: An Interdisciplinary Approach

    OpenAIRE

    Prophet, Colleen M.

    1989-01-01

    The INFORMM Patient Care Planning System provides interdepartmental communication and individualized patient care plans based upon current standards of care. This interdisciplinary system facilitates the identification of patient problems and nursing diagnoses as well as patient care orders. The selected nurses' and physicians' orders are integrated and organized by care plan categories in printouts. As a system by-product, Patient Care Planning automatically generates and calculates patient ...

  17. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  18. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  19. Abstract quantum computing machines and quantum computational logics

    Science.gov (United States)

    Chiara, Maria Luisa Dalla; Giuntini, Roberto; Sergioli, Giuseppe; Leporini, Roberto

    2016-06-01

    Classical and quantum parallelism are deeply different, although it is sometimes claimed that quantum Turing machines are nothing but special examples of classical probabilistic machines. We introduce the concepts of deterministic state machine, classical probabilistic state machine and quantum state machine. On this basis, we discuss the question: To what extent can quantum state machines be simulated by classical probabilistic state machines? Each state machine is devoted to a single task determined by its program. Real computers, however, behave differently, being able to solve different kinds of problems. This capacity can be modeled, in the quantum case, by the mathematical notion of abstract quantum computing machine, whose different programs determine different quantum state machines. The computations of abstract quantum computing machines can be linguistically described by the formulas of a particular form of quantum logic, termed quantum computational logic.

  20. Applied Parallel Computing Industrial Computation and Optimization

    DEFF Research Database (Denmark)

    Madsen, Kaj; NA NA NA Olesen, Dorte

    Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96).

  1. Illustrated computer tomography

    International Nuclear Information System (INIS)

    Takahashi, S.

    1983-01-01

    This book provides the following information: basic aspects of computed tomography; atlas of computed tomography of the normal adult; clinical application of computed tomography; and radiotherapy planning and computed tomography

  2. Computational Pathology

    Science.gov (United States)

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  3. COMPUTER-ASSISTED ACCOUNTING

    Directory of Open Access Journals (Sweden)

    SORIN-CIPRIAN TEIUŞAN

    2009-01-01

    What is computer-assisted accounting? What are the place and role of the computer in the financial-accounting activity? What are the position and importance of the computer in the accountant's activity? All these are questions that require scientific research in order to find the answers. The paper approaches the issue of the support the computer grants to the accountant in organizing and managing the accounting activity. Starting from the notions of accounting and computer, the concept of computer-assisted accounting is introduced; it has a general character and refers to accounting performed with the help of the computer, or to using the computer to automate the procedures performed by the person doing the accounting activity. It is a concept used to define the computer applications of the accounting activity. The arguments for using the computer to assist accounting target the informatization of accounting, the automation of financial-accounting activities, and the endowment of contemporary accounting with modern technology.

  4. Engineering computations at the national magnetic fusion energy computer center

    International Nuclear Information System (INIS)

    Murty, S.

    1983-01-01

    The National Magnetic Fusion Energy Computer Center (NMFECC) was established by the U.S. Department of Energy's Division of Magnetic Fusion Energy (MFE). The NMFECC headquarters is located at Lawrence Livermore National Laboratory. Its purpose is to apply large-scale computational technology and computing techniques to the problems of controlled thermonuclear research. In addition to providing cost effective computing services, the NMFECC also maintains a large collection of computer codes in mathematics, physics, and engineering that is shared by the entire MFE research community. This review provides a broad perspective of the NMFECC, and a list of available codes at the NMFECC for engineering computations is given

  5. Reversible computing fundamentals, quantum computing, and applications

    CERN Document Server

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique.Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergr

  6. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  7. Touchable Computing: Computing-Inspired Bio-Detection.

    Science.gov (United States)

    Chen, Yifan; Shi, Shaolong; Yao, Xin; Nakano, Tadashi

    2017-12-01

    We propose a new computing-inspired bio-detection framework called touchable computing (TouchComp). Under the rubric of TouchComp, the best solution is the cancer to be detected, the parameter space is the tissue region at high risk of malignancy, and the agents are the nanorobots loaded with contrast medium molecules for tracking purpose. Subsequently, the cancer detection procedure (CDP) can be interpreted from the computational optimization perspective: a population of externally steerable agents (i.e., nanorobots) locate the optimal solution (i.e., cancer) by moving through the parameter space (i.e., tissue under screening), whose landscape (i.e., a prescribed feature of tissue environment) may be altered by these agents but the location of the best solution remains unchanged. One can then infer the landscape by observing the movement of agents by applying the "seeing-is-sensing" principle. The term "touchable" emphasizes the framework's similarity to controlling by touching the screen with a finger, where the external field for controlling and tracking acts as the finger. Given this analogy, we aim to answer the following profound question: can we look to the fertile field of computational optimization algorithms for solutions to achieve effective cancer detection that are fast, accurate, and robust? Along this line of thought, we consider the classical particle swarm optimization (PSO) as an example and propose the PSO-inspired CDP, which differs from the standard PSO by taking into account realistic in vivo propagation and controlling of nanorobots. Finally, we present comprehensive numerical examples to demonstrate the effectiveness of the PSO-inspired CDP for different blood flow velocity profiles caused by tumor-induced angiogenesis. The proposed TouchComp bio-detection framework may be regarded as one form of natural computing that employs natural materials to compute.
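
    For readers unfamiliar with the optimizer the authors start from, a minimal classical PSO loop is sketched below in Python. It is the textbook algorithm only; the paper's PSO-inspired CDP additionally models in vivo propagation and external steering of the nanorobots, which this sketch does not attempt. All parameter values are conventional defaults, not the paper's.

      import numpy as np

      def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5,
              lo=-5.0, hi=5.0, seed=0):
          """Classical particle swarm optimization: minimize f over a box."""
          rng = np.random.default_rng(seed)
          x = rng.uniform(lo, hi, (n_particles, dim))   # positions
          v = np.zeros_like(x)                          # velocities
          pbest = x.copy()                              # personal bests
          pbest_val = np.array([f(p) for p in x])
          gbest = pbest[pbest_val.argmin()].copy()      # global best
          for _ in range(iters):
              r1 = rng.random((n_particles, dim))
              r2 = rng.random((n_particles, dim))
              v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
              x = np.clip(x + v, lo, hi)
              val = np.array([f(p) for p in x])
              better = val < pbest_val
              pbest[better], pbest_val[better] = x[better], val[better]
              gbest = pbest[pbest_val.argmin()].copy()
          return gbest, pbest_val.min()

      # Example: the swarm converges near the origin on a sphere function.
      best, best_val = pso(lambda p: float((p ** 2).sum()), dim=3)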

  8. International Conference on Computer, Communication and Computational Sciences

    CERN Document Server

    Mishra, Krishn; Tiwari, Shailesh; Singh, Vivek

    2017-01-01

    Exchange of information and innovative ideas are necessary to accelerate the development of technology. With advent of technology, intelligent and soft computing techniques came into existence with a wide scope of implementation in engineering sciences. Keeping this ideology in preference, this book includes the insights that reflect the ‘Advances in Computer and Computational Sciences’ from upcoming researchers and leading academicians across the globe. It contains high-quality peer-reviewed papers of ‘International Conference on Computer, Communication and Computational Sciences (ICCCCS 2016), held during 12-13 August, 2016 in Ajmer, India. These papers are arranged in the form of chapters. The content of the book is divided into two volumes that cover variety of topics such as intelligent hardware and software design, advanced communications, power and energy optimization, intelligent techniques used in internet of things, intelligent image processing, advanced software engineering, evolutionary and ...

  9. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  10. Hardware for soft computing and soft computing for hardware

    CERN Document Server

    Nedjah, Nadia

    2014-01-01

    Single- and Multi-Objective Evolutionary Computation (MOEA), Genetic Algorithms (GAs), Artificial Neural Networks (ANNs), Fuzzy Controllers (FCs), Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO) are becoming omnipresent in almost every intelligent system design. Unfortunately, the application of the majority of these techniques is complex and requires a huge computational effort to yield useful and practical results. Therefore, dedicated hardware for evolutionary, neural and fuzzy computation is a key issue for designers. With the spread of reconfigurable hardware such as FPGAs, digital as well as analog hardware implementations of such computation become cost-effective. The idea behind this book is to offer a variety of hardware designs for soft computing techniques that can be embedded in any final product, and to introduce the successful application of soft computing techniques to solve the many hard problems encountered during the design of embedded hardware. Reconfigurable em

  11. 3rd International Conference on Computational Mathematics and Computational Geometry

    CERN Document Server

    Ravindran, Anton

    2016-01-01

    This volume presents original research contributed to the 3rd Annual International Conference on Computational Mathematics and Computational Geometry (CMCGS 2014), organized and administered by the Global Science and Technology Forum (GSTF). Computational Mathematics and Computational Geometry are closely related subjects, but are often studied by separate communities and published in different venues. This volume is unique in its combination of these topics. After the conference, which took place in Singapore, selected contributions were chosen for this volume and peer-reviewed. The section on Computational Mathematics contains papers concerned with developing new and efficient numerical algorithms for the mathematical sciences or scientific computing. They also cover analysis of such algorithms to assess accuracy and reliability. The parts of this project that are related to Computational Geometry aim to develop effective and efficient algorithms for geometrical applications such as representation and computati

  12. COMPUTATIONAL SCIENCE CENTER

    International Nuclear Information System (INIS)

    DAVENPORT, J.

    2006-01-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together

  13. Computational Biology and High Performance Computing 2000

    Energy Technology Data Exchange (ETDEWEB)

    Simon, Horst D.; Zorn, Manfred D.; Spengler, Sylvia J.; Shoichet, Brian K.; Stewart, Craig; Dubchak, Inna L.; Arkin, Adam P.

    2000-10-19

    The pace of extraordinary advances in molecular biology has accelerated in the past decade due in large part to discoveries coming from genome projects on human and model organisms. The advances in the genome project so far, happening well ahead of schedule and under budget, have exceeded any dreams by its protagonists, let alone formal expectations. Biologists expect the next phase of the genome project to be even more startling in terms of dramatic breakthroughs in our understanding of human biology, the biology of health and of disease. Only today can biologists begin to envision the necessary experimental, computational and theoretical steps necessary to exploit genome sequence information for its medical impact, its contribution to biotechnology and economic competitiveness, and its ultimate contribution to environmental quality. High performance computing has become one of the critical enabling technologies, which will help to translate this vision of future advances in biology into reality. Biologists are increasingly becoming aware of the potential of high performance computing. The goal of this tutorial is to introduce the exciting new developments in computational biology and genomics to the high performance computing community.

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction: The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load are close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme: The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference, where a large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations: Facility and infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  15. Usage of Cloud Computing Simulators and Future Systems For Computational Research

    OpenAIRE

    Lakshminarayanan, Ramkumar; Ramalingam, Rajasekar

    2016-01-01

    Cloud Computing is Internet-based computing whereby shared resources, software and information are provided to computers and devices on demand, like the electricity grid. Currently, IaaS (Infrastructure as a Service), PaaS (Platform as a Service) and SaaS (Software as a Service) are used as business models for Cloud Computing. Nowadays, the adoption and deployment of Cloud Computing is increasing in various domains, forcing researchers to conduct research in the area of Cloud Computing ...

  16. Future Computer Requirements for Computational Aerodynamics

    Science.gov (United States)

    1978-01-01

    Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.

  17. Comparison of the Capabilities of an Embedded Computer and a General Purpose Computer for Image Processing

    Directory of Open Access Journals (Sweden)

    Herryawan Pujiharsono

    2017-08-01

    Advances in computer technology have led to image processing being widely developed to help people in many fields of work. However, not every field of work lends itself to image processing, because some do not support the use of a computer; this has driven the development of image processing on microcontrollers or special-purpose microprocessors. Advances in microcontrollers and microprocessors now make it possible to develop image processing on an embedded computer or single-board computer (SBC). This study aims to test the image processing capability of an embedded computer and to compare the results with a general-purpose computer. The tests were carried out by measuring the execution time of four image processing operations applied to ten image sizes. The results of this study show that the embedded computer's execution-time optimization compares well with the general-purpose computer: the average execution time of the embedded computer is 4-5 times that of the general-purpose computer, and the largest image size that does not overload the CPU is 256x256 pixels for the embedded computer and 400x300 pixels for the general-purpose computer.
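
    The measurement methodology translates directly into code: generate a test image at each size, run each operation, and record wall-clock time on both machines. A sketch in Python is given below; the paper does not name its four operations or its library, so the choices here (grayscale conversion, Gaussian blur, Canny edges and binary thresholding, via OpenCV) are illustrative assumptions.

      import time
      import numpy as np
      import cv2  # OpenCV; an assumed choice -- the paper does not name its library

      # Four common image operations, standing in for the paper's (unnamed) four.
      OPS = {
          "grayscale": lambda im: cv2.cvtColor(im, cv2.COLOR_BGR2GRAY),
          "blur": lambda im: cv2.GaussianBlur(im, (5, 5), 0),
          "edges": lambda im: cv2.Canny(cv2.cvtColor(im, cv2.COLOR_BGR2GRAY), 100, 200),
          "threshold": lambda im: cv2.threshold(im[..., 0], 128, 255, cv2.THRESH_BINARY)[1],
      }

      def benchmark(sizes=((256, 256), (400, 300), (640, 480)), repeats=10):
          for w, h in sizes:
              img = np.random.randint(0, 256, (h, w, 3), dtype=np.uint8)
              for name, op in OPS.items():
                  t0 = time.perf_counter()
                  for _ in range(repeats):
                      op(img)
                  dt = (time.perf_counter() - t0) / repeats
                  print(f"{w}x{h} {name}: {dt * 1e3:.2f} ms")

      benchmark()  # run on both machines and compare the printed timings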

  18. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  19. Quantum analogue computing.

    Science.gov (United States)

    Kendon, Vivien M; Nemoto, Kae; Munro, William J

    2010-08-13

    We briefly review what a quantum computer is, what it promises to do for us and why it is so hard to build one. Among the first applications anticipated to bear fruit is the quantum simulation of quantum systems. While most quantum computation is an extension of classical digital computation, quantum simulation differs fundamentally in how the data are encoded in the quantum computer. To perform a quantum simulation, the Hilbert space of the system to be simulated is mapped directly onto the Hilbert space of the (logical) qubits in the quantum computer. This type of direct correspondence is how data are encoded in a classical analogue computer. There is no binary encoding, and increasing precision becomes exponentially costly: an extra bit of precision doubles the size of the computer. This has important consequences for both the precision and error-correction requirements of quantum simulation, and significant open questions remain about its practicality. It also means that the quantum version of analogue computers, continuous-variable quantum computers, becomes an equally efficient architecture for quantum simulation. Lessons from past use of classical analogue computers can help us to build better quantum simulators in future.

  20. Computers for imagemaking

    CERN Document Server

    Clark, D

    1981-01-01

    Computers for Image-Making tells the computer non-expert all he needs to know about Computer Animation. In the hands of expert computer engineers, computer picture-drawing systems have, since the earliest days of computing, produced interesting and useful images. As a result of major technological developments since then, it no longer requires the expert's skill to draw pictures; anyone can do it, provided they know how to use the appropriate machinery. This collection of specially commissioned articles reflects the diversity of user applications in this expanding field

  1. Quantum computer science

    CERN Document Server

    Lanzagorta, Marco

    2009-01-01

    In this text we present a technical overview of the emerging field of quantum computation along with new research results by the authors. What distinguishes our presentation from that of others is our focus on the relationship between quantum computation and computer science. Specifically, our emphasis is on the computational model of quantum computing rather than on the engineering issues associated with its physical implementation. We adopt this approach for the same reason that a book on computer programming doesn't cover the theory and physical realization of semiconductors. Another distin

  2. Polymorphous computing fabric

    Science.gov (United States)

    Wolinski, Christophe Czeslaw [Los Alamos, NM; Gokhale, Maya B [Los Alamos, NM; McCabe, Kevin Peter [Los Alamos, NM

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  3. Tomography

    International Nuclear Information System (INIS)

    Brown, B.H.; Barber, D.C.; Freeston, I.L.

    1983-01-01

    Tomography images of a body are constructed by placing a plurality of surface electrodes at spaced intervals on the body and causing currents to flow in the body (e.g. by applying a potential between each pair of electrodes in turn, or by induction). The potential between pairs of electrodes is measured, and the potential expected in each case is calculated on the assumption that the body consists of a medium of uniform impedance. The isopotentials corresponding to the calculated results are plotted to create a uniform image of the body, and the ratio between the measured potential and the calculated potential is obtained in each case. The image is then modified in accordance with the respective ratios, by increasing the assumed impedance along an isopotential in proportion to a ratio greater than unity, or decreasing it in proportion to a ratio less than unity. The modified impedances along the isopotentials for each pair of electrodes are superimposed. The calculations are carried out using a computer and the plotting is carried out by a visual display unit and/or a print-out unit. (author)
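
    The ratio-based update lends itself to a compact sketch. The following toy code (our assumptions, not the patented implementation: random isopotential masks, a handful of electrode pairs, a flattened pixel grid) mimics the described procedure:

      import numpy as np

      # Toy sketch of the ratio-based image update (assumptions, not the
      # patented implementation): random isopotential masks over a pixel grid.
      N_PIX, N_PAIRS = 64, 8
      rng = np.random.default_rng(0)

      v_measured   = rng.uniform(0.8, 1.2, N_PAIRS)    # hypothetical readings
      v_calculated = np.ones(N_PAIRS)                   # uniform-impedance model
      # isopotential[i, p] is True where pixel p lies on pair i's isopotential
      isopotential = rng.random((N_PAIRS, N_PIX)) < 0.2

      impedance = np.ones(N_PIX)                        # start from a uniform image
      for i in range(N_PAIRS):
          ratio = v_measured[i] / v_calculated[i]
          # a ratio above unity raises the assumed impedance along the
          # isopotential, a ratio below unity lowers it; successive pairs
          # superimpose their modifications on the same image
          impedance[isopotential[i]] *= ratio
      print(impedance.round(3))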

  4. Thyroid Uptake Measurement System

    International Nuclear Information System (INIS)

    Nguyen Duc Tuan; Nguyen Thi Bao My; Nguyen Van Sy

    2007-01-01

    The NED-UP.M7 is a complete thyroid uptake and analysis system specifically designed for nuclear medicine. Capable of performing a full range of studies, this system provides fast, accurate results for uptake studies. The heart of the NED-UP.M7 is a microprocessor-controlled 2048-channel Compact Multi-Channel Analyzer, coupled to a 2 inch x 2 inch NaI(Tl) detector with a USB personal computer interface. The system offers simple, straightforward operation using pre-programmed isotopes, and menu-driven prompts to guide the user step by step through each procedure. The pre-programmed radionuclides include I-123, I-125, I-131, Tc-99m and Cs-137. User-defined radionuclides also allow for isotope identification, while the printer provides hard-copy printouts for patient and department record keeping. The included software program running on a PC (Windows XP-based) is a user-friendly program with a menu-driven graphic interface for easily controlling the system and managing patient measurement results in a standard Excel form. (author)

  5. Airborne gamma-ray spectrometer and magnetometer survey: Barrow Quadrangle, Alaska. Final report. Volume I

    International Nuclear Information System (INIS)

    1981-03-01

    During the months of July-August 1980, Aero Service Division, Western Geophysical Company of America, conducted an airborne high sensitivity gamma-ray spectrometer and magnetometer survey over eleven (11) 3° x 1° and one (1) 4° x 1° NTMS quadrangles of the Alaskan North Slope. This report discusses the results obtained over the Barrow map area. The final data are presented in four different forms: on magnetic tape; on microfiche; in graphic form as profiles and histograms; and in map form as anomaly maps, flight path maps, and computer printer maps. The histograms and the multiparameter profiles are presented with the anomaly maps and flight path map in a separate bound volume. Complete data listings of both the reduced single record and the reduced averaged record data are found in the back of this report. The format of the printout of the microfiches and the format of the data files delivered on magnetic tape are in accordance with the specifications of the BFEC 1200-C and are described in appendices F through L of this report.

  6. A new fully automated TLD badge reader

    International Nuclear Information System (INIS)

    Kannan, S.; Ratna, P.; Kulkarni, M.S.

    2003-01-01

    At present, personnel monitoring in India is carried out using a number of manual and semiautomatic TLD badge readers and the BARC TL dosimeter badge designed during 1970. Of late, the manual TLD badge readers have been almost completely replaced by semiautomatic readers with a number of performance improvements, such as hot gas heating to reduce the readout time considerably, a PC-based design with storage of the glow curve for every dosimeter, on-line dose computation, and printout of dose reports. However, the semiautomatic system suffers from the lack of a machine-readable ID code on the badge, and the physical design of the dosimeter card is not readily compatible with automation. This paper describes a fully automated TLD badge reader developed in the RSS Division, using a new TLD badge with a machine-readable ID code. The new PC-based reader has a built-in reader for the ID code, in the form of an array of holes, on the dosimeter card. The reader has a number of self-diagnostic features to ensure a high degree of reliability. (author)

  7. Video-based lectures: An emerging paradigm for teaching human anatomy and physiology to student nurses

    Directory of Open Access Journals (Sweden)

    Rabab El-Sayed Hassan El-Sayed

    2013-09-01

    Video-based teaching material is a rich and powerful medium being used in computer-assisted learning. This paper aimed to assess the learning outcomes and student nurses' acceptance of and satisfaction with video-based lectures versus the traditional method of teaching human anatomy and physiology courses. Data were collected from 27 students in a Bachelor of Nursing program, and experimental control was achieved using an alternating-treatments design. Overall, students experienced 10 lectures, which were delivered by the teacher as either video-based or PowerPoint-based lectures. Results revealed that video-based lectures offered more successes and fewer failures in the immediate and follow-up measures as compared with the traditional method of teaching human anatomy and physiology based on printout illustrations, but these differences were not statistically significant. Moreover, student nurses appeared positive about their learning experiences, as they rated highly all the items assessing their acceptance of and satisfaction with the video-based lectures. KEYWORDS: Video-based lecture, Traditional, Print-based illustration

  8. Support compass energy. BINE database. Support programs for energy saving measures and renewable energies; Foerderkompass Energie. Eine BINE-Datenbank. Foerderprogramme fuer Energie sparende Massnahmen und erneuerbare Energien

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-07-01

    With respect to energy saving measures and renewable energies, BINE Informationsdienst (Bonn, Federal Republic of Germany) presents a database with a comprehensive description of support information. The information is quickly accessible, available at any time, and kept up to date by an Internet update service. The database contains: (a) current support programs for private, commercial and institutional investors; (b) support conditions and guidance on filing an application; (c) application forms, instruction sheets, original texts of the regulations, addresses, Internet links; (d) information on whether different support programs can be combined. The functions of the database under consideration are: (a) convenient search by project and target group; (b) daily update service via the Internet; (c) export and processing of the results in common office applications; (d) printout of all search results, individually or completely; (e) fast overview of all changes of the last four weeks. The advantages are: time-saving, fast search; up-to-date and carefully researched information; and ad hoc availability on your personal computer.

  9. FRENCH TAXES NOTIFICATION OF JULY/AUGUST 2001 RELATING TO THE 2000 INCOME DECLARATION

    CERN Document Server

    HR Division

    2001-01-01

    Members of the personnel residing in France who are not of French nationality are about to receive or have already received at their home addresses a document from their Centre des Impôts (CDI) [The Tax Office], which is drafted in a way that raises a number of questions. On page 1 of this pre-printed recto/verso form appears a computer print-out of the following statements: 'LA DECLARATION QUE VOUS AVEZ DEPOSEE NE COMPORTE AUCUN REVENU POUR L'ANNEE 2000. JE VOUS INFORME QUE LE PRESENT AVIS NE CONSTITUE PAS UN JUSTIFICATIF D'ABSENCE DE TOUT REVENU. VOUS AVEZ DES REVENUS PERCUS EN PROVENANCE D'ORGANISMES INTERNATIONAUX, DE MISSIONS DIPLOMATIQUES OU CONSULAIRES EXONERES D'IMPOT EN FRANCE. INDIQUEZ-LES AU BAS DE CET AVIS.' (i.e. The declaration which you have returned shows no income for 2000. I would like to inform you that this notification does not represent certification of the absence of any income. You receive income from an international organization or a diplomatic mission or consulate which is exempt...

  10. FRENCH TAXES NOTIFICATION OF JULY/AUGUST 2000 RELATING TO THE 1999 INCOME DECLARATION

    CERN Document Server

    2000-01-01

    Members of the personnel residing in France who are not of French nationality are about to receive or have already received at their home addresses a document from their Centre des Impôts (CDI) [The Tax Office], which is drafted in a way that raises a number of questions. On page 1 of this pre-printed recto/verso form appears a computer printout of the following statements: 'LA DECLARATION QUE VOUS AVEZ DEPOSEE NE COMPORTE AUCUN REVENU POUR L'ANNEE 1999. JE VOUS INFORME QUE LE PRESENT AVIS NE CONSTITUE PAS UN JUSTIFICATIF D'ABSENCE DE TOUT REVENU. VOUS AVEZ DES REVENUS PERCUS EN PROVENANCE D'ORGANISMES INTERNATIONAUX, DE MISSIONS DIPLOMATIQUES OU CONSULAIRES EXONERES D'IMPOT EN FRANCE. INDIQUEZ-LES AU BAS DE CET AVIS.' (i.e. The declaration which you have returned shows no income for 1999. I would like to inform you that this notification does not represent certification of the absence of any income. You receive income from an international organisation or a diplomatic mission or consulate which is exempt fr...

  11. Know Your Personal Computer Introduction to Computers

    Indian Academy of Sciences (India)

    Siddhartha Kumar Ghoshal. Series Article. Resonance – Journal of Science Education, Volume 1, Issue 1, January 1996, pp. 48-55.

  12. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  14. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  15. Parallel computers and three-dimensional computational electromagnetics

    International Nuclear Information System (INIS)

    Madsen, N.K.

    1994-01-01

    The authors have continued to enhance their ability to use new massively parallel processing computers to solve time-domain electromagnetic problems. New vectorization techniques have improved the performance of their code DSI3D by factors of 5 to 15, depending on the computer used. New radiation boundary conditions and far-field transformations now allow the computation of radar cross-section values for complex objects. A new parallel-data extraction code has been developed that allows the extraction of data subsets from large problems, which have been run on parallel computers, for subsequent post-processing on workstations with enhanced graphics capabilities. A new charged-particle-pushing version of DSI3D is under development. Finally, DSI3D has become a focal point for several new Cooperative Research and Development Agreement activities with industrial companies such as Lockheed Advanced Development Company, Varian, Hughes Electron Dynamics Division, General Atomic, and Cray

  16. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2006-11-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the Riken/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to

  17. Attitudes towards Computer and Computer Self-Efficacy as Predictors of Preservice Mathematics Teachers' Computer Anxiety

    Science.gov (United States)

    Awofala, Adeneye O. A.; Akinoso, Sabainah O.; Fatade, Alfred O.

    2017-01-01

    The study investigated attitudes towards computer and computer self-efficacy as predictors of computer anxiety among 310 preservice mathematics teachers from five higher institutions of learning in Lagos and Ogun States of Nigeria using the quantitative research method within the blueprint of the descriptive survey design. Data collected were…

  18. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  19. Quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  20. Seventh Medical Image Computing and Computer Assisted Intervention Conference (MICCAI 2012)

    CERN Document Server

    Miller, Karol; Nielsen, Poul; Computational Biomechanics for Medicine : Models, Algorithms and Implementation

    2013-01-01

    One of the greatest challenges for mechanical engineers is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, biomedical sciences, and medicine. This book is an opportunity for computational biomechanics specialists to present and exchange opinions on the opportunities of applying their techniques to computer-integrated medicine. Computational Biomechanics for Medicine: Models, Algorithms and Implementation collects the papers from the Seventh Computational Biomechanics for Medicine Workshop held in Nice in conjunction with the Medical Image Computing and Computer Assisted Intervention conference. The topics covered include: medical image analysis, image-guided surgery, surgical simulation, surgical intervention planning, disease prognosis and diagnostics, injury mechanism analysis, implant and prostheses design, and medical robotics.

  1. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Computation and today’s microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels... Both Spatial Computation and microkernels break away a relatively monolithic architecture into individual lightweight pieces, well specialized... for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  2. Computing Nash equilibria through computational intelligence methods

    Science.gov (United States)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, as well as differential evolution, to compute Nash equilibria of finite strategic games as global minima of a real-valued, nonnegative function. An issue of particular interest is to detect more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
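
    A minimal sketch of the underlying idea (not the paper's code; the regret function, softmax parametrization and use of SciPy's differential evolution are our assumptions) casts a bimatrix game's Nash condition as the zero set of a nonnegative function and minimizes it:

      import numpy as np
      from scipy.optimize import differential_evolution

      # Sketch under assumptions (not the paper's code): a Nash equilibrium of a
      # 2x2 bimatrix game as a global minimum of a nonnegative "regret" function.
      A = np.array([[3.0, 0.0], [5.0, 1.0]])    # row player's payoffs
      B = np.array([[3.0, 5.0], [0.0, 1.0]])    # column player's payoffs

      def softmax(z):                            # unconstrained -> simplex
          e = np.exp(z - z.max())
          return e / e.sum()

      def regret(z):
          x, y = softmax(z[:2]), softmax(z[2:])          # mixed strategies
          rx = np.maximum(A @ y - x @ A @ y, 0.0)        # row player's gains
          ry = np.maximum(x @ B - x @ B @ y, 0.0)        # column player's gains
          return np.sum(rx**2) + np.sum(ry**2)           # zero at a Nash point

      res = differential_evolution(regret, [(-5, 5)] * 4, seed=1, tol=1e-12)
      print(softmax(res.x[:2]).round(3), softmax(res.x[2:]).round(3), res.fun)

    At a Nash point no unilateral deviation pays, so both regret terms vanish and the objective attains its global minimum of zero; multistart and deflection are ways of steering such a minimizer towards further equilibria.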

  3. A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing

    Science.gov (United States)

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in processing potential, storage capacity and battery lifetime of the Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks are proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time-consuming and resource-intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs features of centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC. PMID:25127245
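
    The core trade-off behind such offloading decisions can be sketched in a few lines (illustrative only; the cost model, names and numbers are our assumptions, not the paper's framework):

      # Illustrative sketch only (the cost model, names and numbers are our
      # assumptions, not the paper's framework): the basic offloading trade-off
      # between local execution and shipping the input to a cloud service.
      def should_offload(instructions, local_ips, cloud_ips,
                         bytes_to_send, bandwidth_bps):
          t_local = instructions / local_ips
          t_cloud = instructions / cloud_ips + 8 * bytes_to_send / bandwidth_bps
          return t_cloud < t_local          # offload only when it is faster

      # a heavy component with a small input payload favours the cloud
      print(should_offload(instructions=5e9, local_ips=1e8, cloud_ips=4e9,
                           bytes_to_send=2e5, bandwidth_bps=5e6))   # True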

  4. A lightweight distributed framework for computational offloading in mobile cloud computing.

    Directory of Open Access Journals (Sweden)

    Muhammad Shiraz

    The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in processing potential, storage capacity and battery lifetime of the Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks are proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time-consuming and resource-intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs features of centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.

  5. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    to understand the computer as a material like any other material we would use for design, like wood, aluminum, or plastic. That as soon as the computer forms a composition with other materials it becomes just as approachable and inspiring as other smart materials. I present a series of investigations of what...... Computational Composite, and Telltale). Through the investigations, I show how the computer can be understood as a material and how it partakes in a new strand of materials whose expressions come to be in context. I uncover some of their essential material properties and potential expressions. I develop a way...

  6. Girls and Computing: Female Participation in Computing in Schools

    Science.gov (United States)

    Zagami, Jason; Boden, Marie; Keane, Therese; Moreton, Bronwyn; Schulz, Karsten

    2015-01-01

    Computer education, with a focus on Computer Science, has become a core subject in the Australian Curriculum and the focus of national innovation initiatives. Equal participation by girls, however, remains unlikely based on their engagement with computing in recent decades. In seeking to understand why this may be the case, a Delphi consensus…

  7. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating...... the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production......), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  8. Computer in radiology

    International Nuclear Information System (INIS)

    Kuesters, H.

    1985-01-01

    With this publication, the author presents the requirements that user-specific software should fulfil to achieve effective practice rationalisation through computer use, together with the hardware configuration necessary as basic equipment. This should make it more difficult in the future for sales representatives to sell radiologists unusable computer systems. Furthermore, questions are answered that were asked by computer-interested radiologists during system presentations. On the one hand there still exists a prejudice against standard-text programmes, and on the other side undefined fears that handling a computer is too difficult and that one first has to learn a computer language to be able to work with computers. Finally, it is pointed out that real competitive advantages can be obtained through computer use. (orig.) [de

  9. Computability and unsolvability

    CERN Document Server

    Davis, Martin

    1985-01-01

    ""A clearly written, well-presented survey of an intriguing subject."" - Scientific American. Classic text considers general theory of computability, computable functions, operations on computable functions, Turing machines self-applied, unsolvable decision problems, applications of general theory, mathematical logic, Kleene hierarchy, computable functionals, classification of unsolvable decision problems and more.

  10. Unconventional Quantum Computing Devices

    OpenAIRE

    Lloyd, Seth

    2000-01-01

    This paper investigates a variety of unconventional quantum computation devices, including fermionic quantum computers and computers that exploit nonlinear quantum mechanics. It is shown that unconventional quantum computing devices can in principle compute some quantities more rapidly than `conventional' quantum computers.

  11. 75 FR 30839 - Privacy Act of 1974; CMS Computer Match No. 2010-03, HHS Computer Match No. 1003, SSA Computer...

    Science.gov (United States)

    2010-06-02

    ... 1974; CMS Computer Match No. 2010-03, HHS Computer Match No. 1003, SSA Computer Match No. 1048, IRS... Services (CMS). ACTION: Notice of renewal of an existing computer matching program (CMP) that has an...'' section below for comment period. DATES: Effective Dates: CMS filed a report of the Computer Matching...

  12. Advances in unconventional computing

    CERN Document Server

    2017-01-01

    Unconventional computing is a niche for interdisciplinary science, a cross-breeding of computer science, physics, mathematics, chemistry, electronic engineering, biology, material science and nanotechnology. The aims of this book are to uncover and exploit principles and mechanisms of information processing in, and functional properties of, physical, chemical and living systems in order to develop efficient algorithms, design optimal architectures and manufacture working prototypes of future and emergent computing devices. This first volume presents theoretical foundations of the future and emergent computing paradigms and architectures. The topics covered are computability, (non-)universality and complexity of computation; physics of computation, analog and quantum computing; reversible and asynchronous devices; cellular automata and other mathematical machines; P-systems and cellular computing; infinity and spatial computation; chemical and reservoir computing. The book is the encyclopedia, the first ever complete autho...

  13. NET-COMPUTER: Internet Computer Architecture and its Application in E-Commerce

    Directory of Open Access Journals (Sweden)

    P. O. Umenne

    2012-12-01

    Research in Intelligent Agents has yielded interesting results, some of which have been translated into commercial ventures. Intelligent Agents are executable software components that represent the user, perform tasks on behalf of the user and, when the task terminates, send the result to the user. Intelligent Agents are best suited for the Internet: a collection of computers connected together in a world-wide computer network. Swarm and HYDRA computer architectures for Agents' execution were developed at the University of Surrey, UK in the 90s. The objective of the research was to develop a software-based computer architecture on which Agents execution could be explored. The combination of Intelligent Agents and HYDRA computer architecture gave rise to a new computer concept: the NET-Computer, in which the computing resources reside on the Internet. The Internet computers form the hardware and software resources, and the user is provided with a simple interface to access the Internet and run user tasks. The Agents autonomously roam the Internet (NET-Computer) executing the tasks. A growing segment of the Internet is E-Commerce for online shopping for products and services. The Internet computing resources provide a marketplace for product suppliers and consumers alike. Consumers are looking for suppliers selling products and services, while suppliers are looking for buyers. Searching the vast amount of information available on the Internet causes a great deal of problems for both consumers and suppliers. Intelligent Agents executing on the NET-Computer can surf through the Internet and select specific information of interest to the user. The simulation results show that Intelligent Agents executing on the HYDRA computer architecture could be applied in E-Commerce.

  14. Computational intelligence synergies of fuzzy logic, neural networks and evolutionary computing

    CERN Document Server

    Siddique, Nazmul

    2013-01-01

    Computational Intelligence: Synergies of Fuzzy Logic, Neural Networks and Evolutionary Computing presents an introduction to some of the cutting edge technological paradigms under the umbrella of computational intelligence. Computational intelligence schemes are investigated with the development of a suitable framework for fuzzy logic, neural networks and evolutionary computing, neuro-fuzzy systems, evolutionary-fuzzy systems and evolutionary neural systems. Applications to linear and non-linear systems are discussed with examples. Key features: Covers all the aspect

  15. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  16. Computer assisted radiology

    International Nuclear Information System (INIS)

    Lemke, H.U.; Jaffe, C.C.; Felix, R.

    1993-01-01

    The proceedings of the CAR'93 symposium present the 126 oral papers and the 58 posters contributed to the four Technical Sessions entitled: (1) Image Management, (2) Medical Workstations, (3) Digital Image Generation - DIG, and (4) Application Systems - AS. Topics discussed in Session (1) are: picture archiving and communication systems, teleradiology, hospital information systems and radiological information systems, technology assessment and implications, standards, and data bases. Session (2) deals with computer vision, computer graphics, design and application, and man-computer interaction. Session (3) goes into the details of diagnostic examination methods such as digital radiography, MRI, CT, nuclear medicine, ultrasound, digital angiography, and multimodality imaging. Session (4) is devoted to computer-assisted techniques, such as computer-assisted radiological diagnosis, knowledge-based systems, computer-assisted radiation therapy and computer-assisted surgical planning. (UWA). 266 figs [de

  17. The Research of the Parallel Computing Development from the Angle of Cloud Computing

    Science.gov (United States)

    Peng, Zhensheng; Gong, Qingge; Duan, Yanyu; Wang, Yun

    2017-10-01

    Cloud computing is the development of parallel computing, distributed computing and grid computing. The development of cloud computing brings parallel computing into people’s lives. Firstly, this paper expounds the concept of cloud computing and introduces several traditional parallel programming models. Secondly, it analyzes and studies the principles, advantages and disadvantages of OpenMP, MPI and Map Reduce respectively. Finally, it compares the MPI and OpenMP models with Map Reduce from the angle of cloud computing. The results of this paper are intended to provide a reference for the development of parallel computing.
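
    For readers unfamiliar with the third model, a minimal single-process rendering of Map Reduce's two phases (a toy word count; our sketch, not the paper's code) looks like this:

      from collections import defaultdict
      from itertools import chain

      # Minimal single-process rendering of the Map Reduce model (a toy word
      # count; an illustrative sketch, not the paper's code).
      docs = ["cloud computing grew from parallel computing",
              "parallel computing and grid computing"]

      def mapper(doc):                       # map phase: emit (word, 1) pairs
          return [(w, 1) for w in doc.split()]

      def reducer(pairs):                    # reduce phase: sum counts per key
          counts = defaultdict(int)
          for word, n in pairs:
              counts[word] += n
          return dict(counts)

      print(reducer(chain.from_iterable(map(mapper, docs))))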

  18. Disciplines, models, and computers: the path to computational quantum chemistry.

    Science.gov (United States)

    Lenhard, Johannes

    2014-12-01

    Many disciplines and scientific fields have undergone a computational turn in the past several decades. This paper analyzes this sort of turn by investigating the case of computational quantum chemistry. The main claim is that the transformation from quantum to computational quantum chemistry involved changes in three dimensions. First, on the side of instrumentation, small computers and a networked infrastructure took over the lead from centralized mainframe architecture. Second, a new conception of computational modeling became feasible and assumed a crucial role. And third, the field of computational quantum chemistry became organized in a market-like fashion and this market is much bigger than the number of quantum theory experts. These claims will be substantiated by an investigation of the so-called density functional theory (DFT), the arguably pivotal theory in the turn to computational quantum chemistry around 1990.

  19. Pacing a data transfer operation between compute nodes on a parallel computer

    Science.gov (United States)

    Blocksome, Michael A [Rochester, MN

    2011-09-13

    Methods, systems, and products are disclosed for pacing a data transfer between compute nodes on a parallel computer that include: transferring, by an origin compute node, a chunk of an application message to a target compute node; sending, by the origin compute node, a pacing request to a target direct memory access (`DMA`) engine on the target compute node using a remote get DMA operation; determining, by the origin compute node, whether a pacing response to the pacing request has been received from the target DMA engine; and transferring, by the origin compute node, a next chunk of the application message if the pacing response to the pacing request has been received from the target DMA engine.
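
    Schematically, the claimed pacing loop can be sketched as follows (plain Python stand-ins; the hypothetical callbacks replace the remote-get DMA operations of the actual patent):

      # Schematic sketch of the pacing loop described in the record. The
      # callbacks are hypothetical stand-ins; the actual mechanism uses
      # remote-get DMA operations between compute nodes.
      def paced_send(message, chunk_size, send_chunk, request_pacing,
                     wait_for_pacing_response):
          for offset in range(0, len(message), chunk_size):
              send_chunk(message[offset:offset + chunk_size])  # transfer a chunk
              request_pacing()               # pacing request to the target DMA
              wait_for_pacing_response()     # block until the target responds

      paced_send(b"x" * 10, 4,
                 send_chunk=lambda c: print("sent", len(c), "bytes"),
                 request_pacing=lambda: print("pacing request"),
                 wait_for_pacing_response=lambda: print("pacing response"))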

  20. A Computational Fluid Dynamics Algorithm on a Massively Parallel Computer

    Science.gov (United States)

    Jespersen, Dennis C.; Levit, Creon

    1989-01-01

    The discipline of computational fluid dynamics is demanding ever-increasing computational power to deal with complex fluid flow problems. We investigate the performance of a finite-difference computational fluid dynamics algorithm on a massively parallel computer, the Connection Machine. Of special interest is an implicit time-stepping algorithm; to obtain maximum performance from the Connection Machine, it is necessary to use a nonstandard algorithm to solve the linear systems that arise in the implicit algorithm. We find that the Connection Machine can achieve very high computation rates on both explicit and implicit algorithms. The performance of the Connection Machine puts it in the same class as today's most powerful conventional supercomputers.
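
    The reason implicit time-stepping stresses a parallel machine is the linear system solved at every step. A minimal illustration (1D heat equation with backward Euler; an assumption-level sketch, not the algorithm of the record) makes the per-step solve explicit:

      import numpy as np

      # Assumption-level illustration (1D heat equation, backward Euler), not
      # the record's algorithm: implicit time-stepping needs a solve per step.
      nx, dt, dx, alpha = 50, 1e-3, 1.0 / 50, 1.0
      r = alpha * dt / dx**2

      # (I - r * Laplacian) u_new = u_old, a tridiagonal system
      A = np.eye(nx) * (1 + 2 * r)
      A += np.diag([-r] * (nx - 1), 1) + np.diag([-r] * (nx - 1), -1)

      u = np.exp(-((np.linspace(0, 1, nx) - 0.5) ** 2) / 0.01)   # initial bump
      for _ in range(100):
          u = np.linalg.solve(A, u)    # the per-step solve that parallel
                                       # machines must handle efficiently
      print(round(float(u.max()), 4))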

  1. Computer performance evaluation of FACOM 230-75 computer system, (2)

    International Nuclear Information System (INIS)

    Fujii, Minoru; Asai, Kiyoshi

    1980-08-01

    In this report are described computer performance evaluations for the FACOM 230-75 computers in JAERI. The evaluations are performed on the following items: (1) Cost/benefit analysis of timesharing terminals, (2) Analysis of the response time of timesharing terminals, (3) Analysis of throughput time for batch job processing, (4) Estimation of current potential demands for computer time, (5) Determination of the appropriate number of card readers and line printers. These evaluations are done mainly from the standpoint of cost reduction of computing facilities. The techniques adopted are very practical ones. This report will be useful for those people who are concerned with the management of a computing installation. (author)

  2. Computations and interaction

    NARCIS (Netherlands)

    Baeten, J.C.M.; Luttik, S.P.; Tilburg, van P.J.A.; Natarajan, R.; Ojo, A.

    2011-01-01

    We enhance the notion of a computation of the classical theory of computing with the notion of interaction. In this way, we enhance a Turing machine as a model of computation to a Reactive Turing Machine that is an abstract model of a computer as it is used nowadays, always interacting with the user

  3. Symbiotic Cognitive Computing

    OpenAIRE

    Farrell, Robert G.; Lenchner, Jonathan; Kephart, Jeffrey O.; Webb, Alan M.; Muller, Michael J.; Erickson, Thomas D.; Melville, David O.; Bellamy, Rachel K.E.; Gruen, Daniel M.; Connell, Jonathan H.; Soroker, Danny; Aaron, Andy; Trewin, Shari M.; Ashoori, Maryam; Ellis, Jason B.

    2016-01-01

    IBM Research is engaged in a research program in symbiotic cognitive computing to investigate how to embed cognitive computing in physical spaces. This article proposes 5 key principles of symbiotic cognitive computing.  We describe how these principles are applied in a particular symbiotic cognitive computing environment and in an illustrative application.  

  4. Opportunity for Realizing Ideal Computing System using Cloud Computing Model

    OpenAIRE

    Sreeramana Aithal; Vaikunth Pai T

    2017-01-01

    An ideal computing system is a computing system with ideal characteristics. The major components and performance characteristics of such a hypothetical system can be studied as a model with predicted input, output, system and environmental characteristics, using the identified objectives of computing, which can be used on any platform and any type of computing system, and for application automation, without making modifications in the form of structure, hardware, and software coding by an exte...

  5. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  7. Theory of computation

    CERN Document Server

    Tourlakis, George

    2012-01-01

    Learn the skills and acquire the intuition to assess the theoretical limitations of computer programming Offering an accessible approach to the topic, Theory of Computation focuses on the metatheory of computing and the theoretical boundaries between what various computational models can do and not do—from the most general model, the URM (Unbounded Register Machines), to the finite automaton. A wealth of programming-like examples and easy-to-follow explanations build the general theory gradually, which guides readers through the modeling and mathematical analysis of computational pheno

  8. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
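
    The routing idea reduces to a simple fallback rule, sketched here with illustrative stand-ins (the link-health test and send functions are hypothetical, not the patented mechanism):

      # Toy sketch of the two-network strategy (all names are illustrative,
      # not the patented mechanism): detour around a defective primary link
      # through the second, independent data communications network.
      def route(dest, payload, primary_link_ok, send_primary, send_secondary):
          if primary_link_ok(dest):
              send_primary(dest, payload)
          else:
              send_secondary(dest, payload)   # route around the defective link

      defective = {3}                          # hypothetical bad link to node 3
      route(3, b"data",
            primary_link_ok=lambda d: d not in defective,
            send_primary=lambda d, p: print("primary ->", d),
            send_secondary=lambda d, p: print("secondary ->", d))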

  9. Pulmonary lobar volumetry using novel volumetric computer-aided diagnosis and computed tomography

    Science.gov (United States)

    Iwano, Shingo; Kitano, Mariko; Matsuo, Keiji; Kawakami, Kenichi; Koike, Wataru; Kishimoto, Mariko; Inoue, Tsutomu; Li, Yuanzhong; Naganawa, Shinji

    2013-01-01

    OBJECTIVES To compare the accuracy of pulmonary lobar volumetry using the conventional number of segments method and novel volumetric computer-aided diagnosis using 3D computed tomography images. METHODS We acquired 50 consecutive preoperative 3D computed tomography examinations for lung tumours reconstructed at 1-mm slice thicknesses. We calculated the lobar volume and the emphysematous lobar volume ... The volumetry computer-aided diagnosis system could more precisely measure lobar volumes than the conventional number of segments method. Because semi-automatic computer-aided diagnosis and automatic computer-aided diagnosis were complementary, in clinical use, it would be more practical to first measure volumes by automatic computer-aided diagnosis, and then use semi-automatic measurements if automatic computer-aided diagnosis failed. PMID:23526418
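
    As background, emphysematous volume on CT is commonly quantified by thresholding voxel attenuation; the record does not state its criterion, so the -950 HU cutoff and all numbers below are assumptions:

      import numpy as np

      # Hedged background sketch: emphysematous volume on CT is commonly taken
      # as the volume of voxels below a HU threshold. The -950 HU cutoff and
      # all numbers here are our assumptions; the record does not specify them.
      rng = np.random.default_rng(3)
      lobe_hu = rng.normal(-820, 90, size=(40, 40, 20))   # toy lobe CT values
      voxel_mm3 = 0.7 * 0.7 * 1.0                         # 1-mm slice thickness

      lobe_volume_ml = lobe_hu.size * voxel_mm3 / 1000.0
      emphysema_ml = np.count_nonzero(lobe_hu < -950) * voxel_mm3 / 1000.0
      print(round(lobe_volume_ml, 1), round(emphysema_ml, 2))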

  10. Elementary EFL Teachers' Computer Phobia and Computer Self-Efficacy in Taiwan

    Science.gov (United States)

    Chen, Kate Tzuching

    2012-01-01

    The advent and application of computer and information technology has increased the overall success of EFL teaching; however, such success is hard to assess, and teachers prone to computer avoidance face negative consequences. Two major obstacles are high computer phobia and low computer self-efficacy. However, little research has been carried out…

  11. Cloud Computing as Evolution of Distributed Computing – A Case Study for SlapOS Distributed Cloud Computing Platform

    Directory of Open Access Journals (Sweden)

    George SUCIU

    2013-01-01

    The cloud computing paradigm has been defined from several points of view, the main two directions being either as an evolution of the grid and distributed computing paradigm, or, on the contrary, as a disruptive revolution in the classical paradigms of operating systems, network layers and web applications. This paper presents a distributed cloud computing platform called SlapOS, which unifies technologies and communication protocols into a new technology model for offering any application as a service. Both cloud and distributed computing can be efficient methods for optimizing resources that are aggregated from a grid of standard PCs hosted in homes, offices and small data centers. The paper fills a gap in the existing distributed computing literature by providing a distributed cloud computing model which can be applied for deploying various applications.

  12. Computers and Computation. Readings from Scientific American.

    Science.gov (United States)

    Fenichel, Robert R.; Weizenbaum, Joseph

    A collection of articles from "Scientific American" magazine has been put together at this time because the current period in computer science is one of consolidation rather than innovation. A few years ago, computer science was moving so swiftly that even the professional journals were more archival than informative; but today it is…

  13. Review of quantum computation

    International Nuclear Information System (INIS)

    Lloyd, S.

    1992-01-01

    Digital computers are machines that can be programmed to perform logical and arithmetical operations. Contemporary digital computers are ''universal,'' in the sense that a program that runs on one computer can, if properly compiled, run on any other computer that has access to enough memory space and time. Any one universal computer can simulate the operation of any other; and the set of tasks that any such machine can perform is common to all universal machines. Since Bennett's discovery that computation can be carried out in a non-dissipative fashion, a number of Hamiltonian quantum-mechanical systems have been proposed whose time-evolutions over discrete intervals are equivalent to those of specific universal computers. The first quantum-mechanical treatment of computers was given by Benioff, who exhibited a Hamiltonian system with a basis whose members corresponded to the logical states of a Turing machine. In order to make the Hamiltonian local, in the sense that its structure depended only on the part of the computation being performed at that time, Benioff found it necessary to make the Hamiltonian time-dependent. Feynman discovered a way to make the computational Hamiltonian both local and time-independent by incorporating the direction of computation in the initial condition. In Feynman's quantum computer, the program is a carefully prepared wave packet that propagates through different computational states. Deutsch presented a quantum computer that exploits the possibility of existing in a superposition of computational states to perform tasks that a classical computer cannot, such as generating purely random numbers, and carrying out superpositions of computations as a method of parallel processing. In this paper, we show that such computers, by virtue of their common function, possess a common form for their quantum dynamics

  14. Computer Security Handbook

    CERN Document Server

    Bosworth, Seymour; Whyne, Eric

    2012-01-01

    The classic and authoritative reference in the field of computer security, now completely updated and revised With the continued presence of large-scale computers; the proliferation of desktop, laptop, and handheld computers; and the vast international networks that interconnect them, the nature and extent of threats to computer security have grown enormously. Now in its fifth edition, Computer Security Handbook continues to provide authoritative guidance to identify and to eliminate these threats where possible, as well as to lessen any losses attributable to them. With seventy-seven chapter

  15. Secure cloud computing

    CERN Document Server

    Jajodia, Sushil; Samarati, Pierangela; Singhal, Anoop; Swarup, Vipin; Wang, Cliff

    2014-01-01

    This book presents a range of cloud computing security challenges and promising solution paths. The first two chapters focus on practical considerations of cloud computing. In Chapter 1, Chandramouli, Iorga, and Chokani describe the evolution of cloud computing and the current state of practice, followed by the challenges of cryptographic key management in the cloud. In Chapter 2, Chen and Sion present a dollar cost model of cloud computing and explore the economic viability of cloud computing with and without security mechanisms involving cryptographic mechanisms. The next two chapters addres

  16. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  17. Cloud Computing Bible

    CERN Document Server

    Sosinsky, Barrie

    2010-01-01

    The complete reference guide to the hot technology of cloud computing. Its potential for lowering IT costs makes cloud computing a major force for both IT vendors and users; it is expected to gain momentum rapidly with the launch of Office Web Apps later this year. Because cloud computing involves various technologies, protocols, platforms, and infrastructure elements, this comprehensive reference is just what you need if you'll be using or implementing cloud computing. Cloud computing offers significant cost savings by eliminating upfront expenses for hardware and software; its growing popularit

  18. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...

  19. Cartoon computation: quantum-like computing without quantum mechanics

    International Nuclear Information System (INIS)

    Aerts, Diederik; Czachor, Marek

    2007-01-01

    We present a computational framework based on geometric structures. No quantum mechanics is involved, and yet the algorithms perform tasks analogous to quantum computation. Tensor products and entangled states are not needed; they are replaced by sets of basic shapes. To test the formalism we solve in geometric terms the Deutsch-Jozsa problem, historically the first example that demonstrated the potential power of quantum computation. Each step of the algorithm has a clear geometric interpretation and allows for a cartoon representation. (fast track communication)
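
    As a point of comparison, the standard quantum-circuit solution of the one-bit Deutsch problem that the paper recasts geometrically can be simulated with plain linear algebra. The sketch below is the textbook quantum version, not the authors' geometric formalism; it assumes only numpy.

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

    def deutsch(f):
        """Classify f: {0,1} -> {0,1} as 'constant' or 'balanced'."""
        # Start in |0>|1> and apply H to both qubits.
        state = np.kron(H @ np.array([1.0, 0.0]), H @ np.array([0.0, 1.0]))
        # Oracle U_f|x>|y> = |x>|y XOR f(x)>, built as a 4x4 permutation matrix.
        U = np.zeros((4, 4))
        for x in (0, 1):
            for y in (0, 1):
                U[2 * x + (y ^ f(x)), 2 * x + y] = 1.0
        # Apply the oracle, then H on the first qubit, then read qubit 1.
        state = np.kron(H, np.eye(2)) @ (U @ state)
        p0 = state[0] ** 2 + state[1] ** 2  # probability that qubit 1 reads |0>
        return "constant" if np.isclose(p0, 1.0) else "balanced"

    print(deutsch(lambda x: 0))  # -> constant
    print(deutsch(lambda x: x))  # -> balanced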

  20. Digital optical computers at the optoelectronic computing systems center

    Science.gov (United States)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  1. Blackboard architecture and qualitative model in a computer aided assistant designed to define computers for HEP computing

    International Nuclear Information System (INIS)

    Nodarse, F.F.; Ivanov, V.G.

    1991-01-01

    Using a BLACKBOARD architecture and a qualitative model, an expert system was developed to assist the user in defining the computers needed for High Energy Physics computing. The COMEX system requires an IBM AT personal computer, or compatible, with more than 640 Kb RAM and a hard disk. 5 refs.; 9 figs

  2. Bioinspired computation in combinatorial optimization: algorithms and their computational complexity

    DEFF Research Database (Denmark)

    Neumann, Frank; Witt, Carsten

    2012-01-01

    Bioinspired computation methods, such as evolutionary algorithms and ant colony optimization, are being applied successfully to complex engineering and combinatorial optimization problems, and it is very important that we understand the computational complexity of these algorithms. This tutorial examines such problems. Classical single-objective optimization is examined first. The authors then investigate the computational complexity of bioinspired computation applied to multiobjective variants of the considered combinatorial optimization problems, and in particular they show how multiobjective optimization can help to speed up bioinspired computation for single-objective optimization problems. The tutorial is based on a book written by the authors with the same title. Further information about the book can be found at www.bioinspiredcomputation.com.
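
    The canonical starting point of such runtime analyses is the (1+1) evolutionary algorithm on the OneMax function, whose expected optimization time is O(n log n). The following is a minimal sketch of that algorithm, written for illustration here rather than taken from the book.

    import random

    def one_plus_one_ea(n, seed=0):
        """(1+1) EA on OneMax; returns the number of fitness evaluations used."""
        rng = random.Random(seed)
        x = [rng.randint(0, 1) for _ in range(n)]
        evals = 0
        while sum(x) < n:                                # optimum: all ones
            y = [b ^ (rng.random() < 1 / n) for b in x]  # flip each bit w.p. 1/n
            evals += 1
            if sum(y) >= sum(x):                         # accept if not worse
                x = y
        return evals

    print(one_plus_one_ea(100))  # typically on the order of n ln n, about 460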

  3. Computation as Medium

    DEFF Research Database (Denmark)

    Jochum, Elizabeth Ann; Putnam, Lance

    2017-01-01

    Artists increasingly utilize computational tools to generate art works. Computational approaches to art making open up new ways of thinking about agency in interactive art because they invite participation and allow for unpredictable outcomes. Computational art is closely linked to the participatory turn in visual art, wherein spectators physically participate in visual art works. Unlike purely physical methods of interaction, computer assisted interactivity affords artists and spectators more nuanced control of artistic outcomes. Interactive art brings together human bodies, computer code, and nonliving objects to create emergent art works. Computation is more than just a tool for artists, it is a medium for investigating new aesthetic possibilities for choreography and composition. We illustrate this potential through two artistic projects: an improvisational dance performance between a human...

  4. Community Cloud Computing

    Science.gov (United States)

    Marinos, Alexandros; Briscoe, Gerard

    Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.

  5. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials with and without the presence of gases. Computational chemistry has application in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  6. Nurses' computer literacy and attitudes towards the use of computers in health care.

    Science.gov (United States)

    Gürdaş Topkaya, Sati; Kaya, Nurten

    2015-05-01

    This descriptive and cross-sectional study was designed to address nurses' computer literacy and attitudes towards the use of computers in health care and to determine the correlation between these two variables. This study was conducted with the participation of 688 nurses who worked at two university-affiliated hospitals. These nurses were chosen using a stratified random sampling method. The data were collected using the Multicomponent Assessment of Computer Literacy and the Pretest for Attitudes Towards Computers in Healthcare Assessment Scale v. 2. The nurses, in general, had positive attitudes towards computers, and their computer literacy was good. Computer literacy in general had significant positive correlations with individual elements of computer competency and with attitudes towards computers. If the computer is to be an effective and beneficial part of the health-care system, it is necessary to help nurses improve their computer competency.

  7. Parallelized computation for computer simulation of electrocardiograms using personal computers with multi-core CPU and general-purpose GPU.

    Science.gov (United States)

    Shen, Wenfeng; Wei, Daming; Xu, Weimin; Zhu, Xin; Yuan, Shizhong

    2010-10-01

    Biological computations like electrocardiological modelling and simulation usually require high-performance computing environments. This paper introduces an implementation of parallel computation for computer simulation of electrocardiograms (ECGs) in a personal computer environment with an Intel CPU of Core (TM) 2 Quad Q6600 and a GPU of Geforce 8800GT, with software support by OpenMP and CUDA. It was tested in three parallelization settings: (a) a four-core CPU without a general-purpose GPU, (b) a general-purpose GPU plus one core of the CPU, and (c) a four-core CPU plus a general-purpose GPU. To effectively take advantage of a multi-core CPU and a general-purpose GPU, an algorithm based on load-prediction dynamic scheduling was developed and applied to setting (c). In the simulation with 1600 time steps, the speedup of the parallel computation as compared to the serial computation was 3.9 in setting (a), 16.8 in setting (b), and 20.0 in setting (c). This study demonstrates that a current PC with a multi-core CPU and a general-purpose GPU provides a good environment for parallel computations in biological modelling and simulation studies.
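
    The reported speedups are simply ratios of serial to parallel wall-clock time, and Amdahl's law bounds what any such setting can achieve. A quick sketch of both computations; the timings below are hypothetical stand-ins, not the paper's measurements.

    # Speedup and Amdahl's-law bound; all timings here are made up.
    def speedup(t_serial, t_parallel):
        return t_serial / t_parallel

    def amdahl(p, n):
        """Ideal speedup on n workers when fraction p of the work parallelizes."""
        return 1.0 / ((1 - p) + p / n)

    t_serial = 1600.0                              # serial run, arbitrary units
    for name, t in [("4-core CPU", 410.0), ("GPU + 1 core", 95.0), ("CPU + GPU", 80.0)]:
        print(f"{name}: speedup = {speedup(t_serial, t):.1f}")

    print(f"Amdahl bound for p=0.95 on 4 cores: {amdahl(0.95, 4):.2f}")  # ~3.48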

  8. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    Science.gov (United States)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and use them towards running large-scale environmental simulations and models to serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications and to utilize the power of Graphics Processing Units (GPU). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Users can easily enable their websites so that visitors can volunteer their computer resources to run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational chunks. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform to enable large-scale hydrological simulations and model runs in an open and integrated environment.
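
    The server-side core of such a platform is a queue of small simulation chunks with checkout, timeout, and result collection. A minimal sketch of that pattern follows; every class, field, and method name here is hypothetical, not the platform's actual API.

    import queue, time

    class VolunteerQueue:
        """Hand out simulation chunks to volunteer clients; re-queue stale ones."""
        def __init__(self, chunks, timeout=60.0):
            self.todo = queue.Queue()
            for c in chunks:
                self.todo.put(c)
            self.in_flight = {}          # chunk id -> (chunk, time handed out)
            self.timeout = timeout
            self.results = {}

        def checkout(self):
            now = time.time()
            for cid, (chunk, t0) in list(self.in_flight.items()):
                if now - t0 > self.timeout:    # volunteer went away: re-queue
                    self.todo.put(chunk)
                    del self.in_flight[cid]
            chunk = self.todo.get_nowait()
            self.in_flight[chunk["id"]] = (chunk, now)
            return chunk

        def submit(self, cid, result):
            self.in_flight.pop(cid, None)
            self.results[cid] = result

    q = VolunteerQueue([{"id": i, "cells": list(range(i * 100, (i + 1) * 100))}
                        for i in range(10)])
    chunk = q.checkout()
    q.submit(chunk["id"], "partial result")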

  9. Visual ergonomics and computer work--is it all about computer glasses?

    Science.gov (United States)

    Jonsson, Christina

    2012-01-01

    The Swedish Provisions on Work with Display Screen Equipment and the EU Directive on the minimum safety and health requirements for work with display screen equipment cover several important visual ergonomics aspects. But a review of cases and questions to the Swedish Work Environment Authority clearly shows that most attention is given to the demands for eyesight tests and special computer glasses. Other important visual ergonomics factors are at risk of being neglected. Today computers are used everywhere, both at work and at home. Computers can be laptops, PDAs, tablet computers, smart phones, etc. The demands on eyesight tests and computer glasses still apply, but the visual demands and the visual ergonomics conditions are quite different compared to the use of a stationary computer. Based on this review, we raise the question of whether the demand on the employer to provide the employees with computer glasses is outdated.

  10. Computing with concepts, computing with numbers: Llull, Leibniz, and Boole

    NARCIS (Netherlands)

    Uckelman, S.L.

    2010-01-01

    We consider two ways to understand "reasoning as computation", one which focuses on the computation of concept symbols and the other on the computation of number symbols. We illustrate these two ways with Llull’s Ars Combinatoria and Leibniz’s attempts to arithmetize language, respectively. We then

  11. Processing computed tomography images by using personal computer

    International Nuclear Information System (INIS)

    Seto, Kazuhiko; Fujishiro, Kazuo; Seki, Hirofumi; Yamamoto, Tetsuo.

    1994-01-01

    Processing of CT images was attempted by using a popular personal computer. The program for image processing was written with a C compiler. The original images, acquired with a CT scanner (TCT-60A, Toshiba), were transferred to the computer on 8-inch flexible diskettes. Many fundamental image-processing operations were implemented, such as displaying the image on the monitor, calculating CT values, and drawing profile curves. The results showed that a popular personal computer has the ability to process CT images. It seemed that the 8-inch flexible diskette was still a useful medium for transferring image data. (author)
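
    Two of the operations mentioned, reading a CT value at a pixel and drawing a profile curve along a scan line, reduce to simple array indexing once the image is in memory. A sketch with numpy, using a synthetic array in place of the diskette data:

    import numpy as np

    image = np.random.randint(-1000, 1000, size=(256, 256))  # synthetic CT slice

    def ct_value(img, row, col):
        return int(img[row, col])       # CT number at a single pixel

    def profile_curve(img, row):
        return img[row, :]              # pixel values across one scan line

    print(ct_value(image, 128, 128))
    print(profile_curve(image, 128)[:8])  # first few samples of the profile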

  12. Computational biomechanics

    International Nuclear Information System (INIS)

    Ethier, C.R.

    2004-01-01

    Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)

  13. Roadmap to greener computing

    CERN Document Server

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directi

  14. Computer proficiency questionnaire: assessing low and high computer proficient seniors.

    Science.gov (United States)

    Boot, Walter R; Charness, Neil; Czaja, Sara J; Sharit, Joseph; Rogers, Wendy A; Fisk, Arthur D; Mitzner, Tracy; Lee, Chin Chin; Nair, Sankaran

    2015-06-01

    Computers and the Internet have the potential to enrich the lives of seniors and aid in the performance of important tasks required for independent living. A prerequisite for reaping these benefits is having the skills needed to use these systems, which is highly dependent on proper training. One prerequisite for efficient and effective training is being able to gauge current levels of proficiency. We developed a new measure (the Computer Proficiency Questionnaire, or CPQ) to measure computer proficiency in the domains of computer basics, printing, communication, Internet, calendaring software, and multimedia use. Our aim was to develop a measure appropriate for individuals with a wide range of proficiencies from noncomputer users to extremely skilled users. To assess the reliability and validity of the CPQ, a diverse sample of older adults, including 276 older adults with no or minimal computer experience, was recruited and asked to complete the CPQ. The CPQ demonstrated excellent reliability (Cronbach's α = .98), with subscale reliabilities ranging from .86 to .97. Age, computer use, and general technology use all predicted CPQ scores. Factor analysis revealed three main factors of proficiency related to Internet and e-mail use; communication and calendaring; and computer basics. Based on our findings, we also developed a short-form CPQ (CPQ-12) with similar properties but 21 fewer questions. The CPQ and CPQ-12 are useful tools to gauge computer proficiency for training and research purposes, even among low computer proficient older adults.
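
    The reliability statistic reported above, Cronbach's alpha, is computed from the item variances and the variance of the summed score: alpha = k/(k-1) * (1 - sum(var_i)/var_total). A sketch with synthetic data standing in for the CPQ responses:

    import numpy as np

    def cronbach_alpha(scores):
        """scores: 2-D array, rows = respondents, columns = questionnaire items."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1)      # variance of each item
        total_var = scores.sum(axis=1).var(ddof=1)  # variance of the sum score
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    rng = np.random.default_rng(0)
    base = rng.normal(size=(276, 1))                 # shared "proficiency" factor
    items = base + 0.3 * rng.normal(size=(276, 12))  # 12 correlated items
    print(round(cronbach_alpha(items), 2))           # close to 1, as for the CPQ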

  15. Distributed multiscale computing

    NARCIS (Netherlands)

    Borgdorff, J.

    2014-01-01

    Multiscale models combine knowledge, data, and hypotheses from different scales. Simulating a multiscale model often requires extensive computation. This thesis evaluates distributing these computations, an approach termed distributed multiscale computing (DMC). First, the process of multiscale

  16. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  17. Computer mathematics for programmers

    CERN Document Server

    Abney, Darrell H; Sibrel, Donald W

    1985-01-01

    Computer Mathematics for Programmers presents the mathematics that is essential to the computer programmer. The book comprises 10 chapters. The first chapter introduces several computer number systems. Chapter 2 shows how to perform arithmetic operations using the number systems introduced in Chapter 1. The third chapter covers the way numbers are stored in computers, how the computer performs arithmetic on real numbers and integers, and how round-off errors are generated in computer programs. Chapter 4 details the use of algorithms and flowcharting as problem-solving tools for computer p
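
    The round-off behaviour that Chapter 3 describes is easy to demonstrate in any language that uses IEEE 754 binary floating point; Python is used here purely for illustration. Because 0.1 has no exact binary representation, repeated addition accumulates error.

    total = sum(0.1 for _ in range(10))
    print(total)                     # 0.9999999999999999, not 1.0
    print(total == 1.0)              # False
    print(abs(total - 1.0) < 1e-9)   # True: compare with a tolerance instead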

  18. Ubiquitous Computing: The Universal Use of Computers on College Campuses.

    Science.gov (United States)

    Brown, David G., Ed.

    This book is a collection of vignettes from 13 universities where everyone on campus has his or her own computer. These 13 institutions have instituted "ubiquitous computing" in very different ways at very different costs. The chapters are: (1) "Introduction: The Ubiquitous Computing Movement" (David G. Brown); (2) "Dartmouth College" (Malcolm…

  19. Activity-Driven Computing Infrastructure - Pervasive Computing in Healthcare

    DEFF Research Database (Denmark)

    Bardram, Jakob Eyvind; Christensen, Henrik Bærbak; Olesen, Anders Konring

    In many work settings, and especially in healthcare, work is distributed among many cooperating actors, who are constantly moving around and are frequently interrupted. In line with other researchers, we use the term pervasive computing to describe a computing infrastructure that supports work...

  20. Computer Skills Training and Readiness to Work with Computers

    Directory of Open Access Journals (Sweden)

    Arnon Hershkovitz

    2016-05-01

    Full Text Available In today’s job market, computer skills are part of the prerequisites for many jobs. In this paper, we report on a study of readiness to work with computers (the dependent variable) among unemployed women (N=54) after participating in a unique, web-supported training focused on computer skills and empowerment. Overall, the level of participants’ readiness to work with computers was much higher at the end of the course than it was at its beginning. During the analysis, we explored associations between this variable and variables from four categories: log-based (describing the online activity); computer literacy and experience; job-seeking motivation and practice; and training satisfaction. Only two variables were associated with the dependent variable: knowledge post-test duration and satisfaction with content. After building a prediction model for the dependent variable, another log-based variable was highlighted: the total number of actions in the course website throughout the course. Overall, our analyses shed light on the predominance of log-based variables over variables from other categories. These findings might hint at the need to develop new assessment tools for learners and trainees that take human-computer interaction into consideration when measuring self-efficacy variables.

  1. Computational Science at the Argonne Leadership Computing Facility

    Science.gov (United States)

    Romero, Nichols

    2014-03-01

    The goal of the Argonne Leadership Computing Facility (ALCF) is to extend the frontiers of science by solving problems that require innovative approaches and the largest-scale computing systems. ALCF's most powerful computer, Mira, an IBM Blue Gene/Q system, has nearly one million cores. How does one program such systems? What software tools are available? Which scientific and engineering applications are able to utilize such levels of parallelism? This talk will address these questions and describe a sampling of projects that are using ALCF systems in their research, including ones in nanoscience, materials science, and chemistry. Finally, the ways to gain access to ALCF resources will be presented. This research used resources of the Argonne Leadership Computing Facility at Argonne National Laboratory, which is supported by the Office of Science of the U.S. Department of Energy under contract DE-AC02-06CH11357.

  2. Reconfigurable computing the theory and practice of FPGA-based computation

    CERN Document Server

    Hauck, Scott

    2010-01-01

    Reconfigurable Computing marks a revolutionary and hot topic that bridges the gap between the separate worlds of hardware and software design: the key feature of reconfigurable computing is its groundbreaking ability to perform computations in hardware to increase performance while retaining the flexibility of a software solution. Reconfigurable computers serve as affordable, fast, and accurate tools for developing designs ranging from single chip architectures to multi-chip and embedded systems. Scott Hauck and Andre DeHon have assembled a group of the key experts in the fields of both hardwa

  3. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  4. Conray cystography and volumetry of the cyst using computed tomography

    International Nuclear Information System (INIS)

    Kobayashi, Tatsuya; Negoro, Makoto; Asano, Yoshio; Kageyama, Naoki

    1980-01-01

    The method is to administer 30% Conray in a dosage not exceeding 1 cc into the cystic cavity, which has been previously drained into the subgaleal space via the Ommaya tube and reservoir, and to take CT scans in the plane passing through the cyst. The cyst becomes clearly enhanced by this small amount of contrast media, and its size and extension into the brain are better evaluated than on the non-enhanced CT. The EMI units of the cystic contents with and without enhancement are then calculated from the print-out data. Based on fundamental experiments using a human skull phantom filled with water and a sphere containing varying densities of contrast media, the relationship between the EMI number of the cystic content and the dilution ratio of the contrast media is found to follow the formula Y = 988.15·X^(-0.73), where Y is the EMI number of the cystic content and X is the dilution ratio of the contrast media. Subtracting the EMI number of the cystic content before administration of the contrast media from that after administration gives the increase in EMI number. When this increase is inserted into the above formula, the cyst volume, which is equivalent to the dilution ratio if exactly 1 cc of contrast media is injected into the cystic cavity, can be easily assessed. Three cases with cystic brain tumors, an ependymoma and two craniopharyngiomas, were studied using this method, and the calculated volumes of the cysts were compared with the values obtained from Conray cystograms. From these studies, it is concluded that (1) this volumetric method seems to be more accurate than Conray cystography, especially where the cyst is irregular in shape, and (2) this method is useful not only for the volumetry of intracranial cysts per se but also for evaluating the treatment of cystic tumors by external or internal irradiation, or chemotherapy. (J.P.N.)
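
    Solving the calibration formula for X makes the volumetry a one-line computation: with exactly 1 cc injected, the dilution ratio equals the cyst volume in cc, so X = (988.15/Y)^(1/0.73), where Y is the measured increase in EMI number. A sketch; the EMI increase used below is illustrative, not a value from the paper.

    def cyst_volume_cc(delta_emi):
        """Invert Y = 988.15 * X**(-0.73); with 1 cc injected, X is the volume."""
        return (988.15 / delta_emi) ** (1 / 0.73)

    # Example: an EMI-number increase of 50 after injecting 1 cc of 30% Conray.
    print(round(cyst_volume_cc(50.0), 1))  # ~59.6 cc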

  5. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  6. Cloud Computing (1/2)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Cloud computing, the recent years' buzzword for distributed computing, continues to attract and keep the interest of both the computing and business world. These lectures aim at explaining "What is Cloud Computing?" identifying and analyzing its characteristics, models, and applications. The lectures will explore different "Cloud definitions" given by different authors and use them to introduce the particular concepts. The main cloud models (SaaS, PaaS, IaaS), cloud types (public, private, hybrid), cloud standards and security concerns will be presented. The borders between Cloud Computing and Grid Computing, Server Virtualization, Utility Computing will be discussed and analyzed.

  7. Cloud Computing (2/2)

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Cloud computing, the recent years' buzzword for distributed computing, continues to attract and keep the interest of both the computing and business world. These lectures aim at explaining "What is Cloud Computing?" identifying and analyzing its characteristics, models, and applications. The lectures will explore different "Cloud definitions" given by different authors and use them to introduce the particular concepts. The main cloud models (SaaS, PaaS, IaaS), cloud types (public, private, hybrid), cloud standards and security concerns will be presented. The borders between Cloud Computing and Grid Computing, Server Virtualization, Utility Computing will be discussed and analyzed.

  8. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary. (3) Philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster’s second-order...

  9. Computer Virus and Trends

    OpenAIRE

    Tutut Handayani; Soenarto Usna,Drs.MMSI

    2004-01-01

    Since its first appearance in the mid-1980s, the computer virus has invited controversies that continue to this day. Along with the development of computer systems technology, computer viruses keep finding new ways to spread through a variety of existing communications media. This paper discusses several topics related to computer viruses: the definition and history of computer viruses; the basics of computer viruses; the current state of computer viruses; and ...

  10. Computational error and complexity in science and engineering computational error and complexity

    CERN Document Server

    Lakshmikantham, Vangipuram; Chui, Charles K; Chui, Charles K

    2005-01-01

    The book "Computational Error and Complexity in Science and Engineering” pervades all the science and engineering disciplines where computation occurs. Scientific and engineering computation happens to be the interface between the mathematical model/problem and the real world application. One needs to obtain good quality numerical values for any real-world implementation. Just mathematical quantities symbols are of no use to engineers/technologists. Computational complexity of the numerical method to solve the mathematical model, also computed along with the solution, on the other hand, will tell us how much computation/computational effort has been spent to achieve that quality of result. Anyone who wants the specified physical problem to be solved has every right to know the quality of the solution as well as the resources spent for the solution. The computed error as well as the complexity provide the scientific convincing answer to these questions. Specifically some of the disciplines in which the book w...

  11. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1.) an intermediary step between any theoretical construct and its targeted empirical space and 2.) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project, we also seek to introduce to scholars of religion some...

  12. The Nature of Computational Thinking in Computing Education

    DEFF Research Database (Denmark)

    Spangsberg, Thomas Hvid; Brynskov, Martin

    2018-01-01

    Computational Thinking has gained popularity in recent years within educational and political discourses. It is more than ever crucial to discuss the term itself and what it means. In June 2017, Denning articulated that computational thinking can be viewed as either “traditional” or “new”. New...

  13. Physical Computing and Its Scope--Towards a Constructionist Computer Science Curriculum with Physical Computing

    Science.gov (United States)

    Przybylla, Mareen; Romeike, Ralf

    2014-01-01

    Physical computing covers the design and realization of interactive objects and installations and allows students to develop concrete, tangible products of the real world, which arise from the learners' imagination. This can be used in computer science education to provide students with interesting and motivating access to the different topic…

  14. Medical Computational Thinking

    DEFF Research Database (Denmark)

    Musaeus, Peter; Tatar, Deborah Gail; Rosen, Michael A.

    2017-01-01

    Computational thinking (CT) in medicine means deliberating when to pursue computer-mediated solutions to medical problems and evaluating when such solutions are worth pursuing in order to assist in medical decision making. Teaching computational thinking (CT) at medical school should be aligned...

  15. Adaptation of HAMMER computer code to CYBER 170/750 computer

    International Nuclear Information System (INIS)

    Pinheiro, A.M.B.S.; Nair, R.P.K.

    1982-01-01

    The adaptation of the HAMMER computer code to the CYBER 170/750 computer is presented. The HAMMER code calculates cell parameters by multigroup transport theory and reactor parameters by few-group diffusion theory. The auxiliary programs, the modifications carried out, and the use of the HAMMER system adapted to the CYBER 170/750 computer are described. (M.C.K.) [pt

  16. Man and computer

    International Nuclear Information System (INIS)

    Fischbach, K.F.

    1981-01-01

    The discussion of cultural and sociological consequences of computer evolution is hindered by human prejudice. For example, the sentence 'a computer is at best as intelligent as its programmer' veils actual developments. Theoretical limits of computer intelligence are the limits of intelligence in general. Modern computer systems replace not only human labour, but also human decision making and thereby human responsibility. The historical situation is unique. Human head-work is being automated, and man is losing function. (orig.) [de

  17. Advanced Technology for Portable Personal Visualization.

    Science.gov (United States)

    1992-06-01

    ... later printout as slides. The number of users and applications of the HMD system continues to grow. Software engineering team projects investigated scaling and moving objects. To reduce the distortion in the magnetic field, we are experimenting with a "unicorn mount" that moves the Polhemus source by linear translation of a transducer inside a tube inserted into the mount. Memory requirements would grow without bound if we retained all the past sample points.

  18. Instrumentation of a manually programmed neutron diffractometer

    DEFF Research Database (Denmark)

    Hansen, K.B.; Neisig, K.E.

    1966-01-01

    This paper describes essentially the digital part of the instrumentation for a neutron diffractometer in which the measuring procedure is governed by a control unit involving a fixed number of program points. A simultaneously running test program monitors the information transfer from the data sources and to the print-out in table form. The experimental conditions must be set by a panel-switch-selected program, which allows a desired parameter program to be executed.

  19. International Conference on Computational Intelligence, Cyber Security, and Computational Models

    CERN Document Server

    Ramasamy, Vijayalakshmi; Sheen, Shina; Veeramani, C; Bonato, Anthony; Batten, Lynn

    2016-01-01

    This book aims at promoting high-quality research by researchers and practitioners from academia and industry presented at the International Conference on Computational Intelligence, Cyber Security, and Computational Models (ICC3 2015), organized by PSG College of Technology, Coimbatore, India, during December 17 – 19, 2015. The book is enriched with innovations in broad areas of research like computational modeling, computational intelligence, and cyber security. These emerging interdisciplinary research areas have helped to solve multifaceted problems and have gained a lot of attention in recent years. The book encompasses theory and applications, providing design, analysis, and modeling of the aforementioned key areas.

  20. Early Childhood Teacher Candidates\\' Attitudes towards Computer and Computer Assisted Instruction

    OpenAIRE

    Oğuz, Evrim; Ellez, A. Murat; Akamca, Güzin Özyılmaz; Kesercioğlu, Teoman İ.; Girgin, Günseli

    2011-01-01

    The aim of this research is to evaluate preschool teacher candidates’ attitudes towards computers and attitudes towards the use of computer assisted instruction. The sample of this study includes 481 early childhood education students who attended Dokuz Eylül University’s department of Early Childhood Education. Data were collected by using the “Scale of Computer Assisted Instruction Attitudes” developed by Arslan (2006), the “Computer Attitudes Scale” developed by Çelik & Bindak (2005), and the “General Info...

  1. Brief: Managing computing technology

    International Nuclear Information System (INIS)

    Startzman, R.A.

    1994-01-01

    While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950's, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically probably is the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies

  2. Computational Viscoelasticity

    CERN Document Server

    Marques, Severino P C

    2012-01-01

    This text is a guide to solving problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on the codes’ structure and use, data preparation, and output interpretation and verification. The first part of the book introduces the reader to the subject and provides the models, equations, and notation to be used in the computational applications. The second part presents the most important computational techniques, finite element formulation and boundary element formulation, and presents the solutions of viscoelastic problems with Abaqus.

  3. Application of Blind Quantum Computation to Two-Party Quantum Computation

    Science.gov (United States)

    Sun, Zhiyuan; Li, Qin; Yu, Fang; Chan, Wai Hong

    2018-03-01

    Blind quantum computation (BQC) allows a client who has only limited quantum power to achieve quantum computation with the help of a remote quantum server and still keep the client's input, output, and algorithm private. Recently, Kashefi and Wallden extended BQC to achieve two-party quantum computation which allows two parties Alice and Bob to perform a joint unitary transform upon their inputs. However, in their protocol Alice has to prepare rotated single qubits and perform Pauli operations, and Bob needs to have a powerful quantum computer. In this work, we also utilize the idea of BQC to put forward an improved two-party quantum computation protocol in which the operations of both Alice and Bob are simplified since Alice only needs to apply Pauli operations and Bob is just required to prepare and encrypt his input qubits.

  4. Application of Blind Quantum Computation to Two-Party Quantum Computation

    Science.gov (United States)

    Sun, Zhiyuan; Li, Qin; Yu, Fang; Chan, Wai Hong

    2018-06-01

    Blind quantum computation (BQC) allows a client who has only limited quantum power to achieve quantum computation with the help of a remote quantum server and still keep the client's input, output, and algorithm private. Recently, Kashefi and Wallden extended BQC to achieve two-party quantum computation which allows two parties Alice and Bob to perform a joint unitary transform upon their inputs. However, in their protocol Alice has to prepare rotated single qubits and perform Pauli operations, and Bob needs to have a powerful quantum computer. In this work, we also utilize the idea of BQC to put forward an improved two-party quantum computation protocol in which the operations of both Alice and Bob are simplified since Alice only needs to apply Pauli operations and Bob is just required to prepare and encrypt his input qubits.
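
    The encryption step assigned to Bob and the Pauli operations assigned to Alice both rest on the Pauli one-time pad: applying X^a Z^b with secret key bits (a, b) makes the qubit look maximally mixed to anyone without the key. The sketch below shows that generic primitive on state vectors, not the paper's full protocol.

    import numpy as np

    X = np.array([[0, 1], [1, 0]])
    Z = np.array([[1, 0], [0, -1]])

    def pauli_pad(state, a, b):
        """Encrypt (or, applied again, decrypt) one qubit with key bits a, b."""
        key_op = np.linalg.matrix_power(X, int(a)) @ np.linalg.matrix_power(Z, int(b))
        return key_op @ state

    rng = np.random.default_rng(1)
    a, b = rng.integers(0, 2, size=2)     # secret key bits
    psi = np.array([0.6, 0.8])            # Bob's input qubit
    enc = pauli_pad(psi, a, b)            # what the server sees
    dec = pauli_pad(enc, a, b)            # X and Z are self-inverse
    print(np.allclose(np.abs(dec), np.abs(psi)))  # True, up to a global phase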

  5. Implementing an Affordable High-Performance Computing for Teaching-Oriented Computer Science Curriculum

    Science.gov (United States)

    Abuzaghleh, Omar; Goldschmidt, Kathleen; Elleithy, Yasser; Lee, Jeongkyu

    2013-01-01

    With the advances in computing power, high-performance computing (HPC) platforms have had an impact on not only scientific research in advanced organizations but also computer science curriculum in the educational community. For example, multicore programming and parallel systems are highly desired courses in the computer science major. However,…

  6. Computer Education and Computer Use by Preschool Educators

    Science.gov (United States)

    Towns, Bernadette

    2010-01-01

    Researchers have found that teachers seldom use computers in the preschool classroom. However, little research has examined why preschool teachers elect not to use computers. This case study focused on identifying whether community colleges that prepare teachers for early childhood education include in their curriculum how teachers can effectively…

  7. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  8. Cloud Computing for radiologists

    International Nuclear Information System (INIS)

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future

  9. Cloud computing for radiologists

    Directory of Open Access Journals (Sweden)

    Amit T Kharat

    2012-01-01

    Full Text Available Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  10. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  11. Introduction: 'History of computing'. Historiography of computers and computer use in the Netherlands

    Directory of Open Access Journals (Sweden)

    Adrienne van den Boogaard

    2008-06-01

    Full Text Available Along with the international trends in the history of computing, Dutch contributions over the past twenty years moved away from a focus on machinery to the broader scope of the use of computers, the appropriation of computing technologies in various traditions, labour relations and professionalisation issues, and, lately, software. It is only natural that an emerging field like computer science sets out to write its genealogy and canonise the important steps in its intellectual endeavour. It is fair to say that a historiography diverging from such “home” interest started in 1987 with the work of Eda Kranakis – then active in The Netherlands – commissioned by the national bureau for technology assessment, and Gerard Alberts, turning a commemorative volume of the Mathematical Center into a history of the same institute. History of computing in The Netherlands made a major leap in the spring of 1994 when Dirk de Wit, Jan van den Ende and Ellen van Oost defended their dissertations, on the roads towards adoption of computing technology in banking, in science and engineering, and on the gender aspect in computing. Here, history of computing had already moved from machines to the use of computers. The three authors joined Gerard Alberts and Onno de Wit in preparing a volume on the rise of IT in The Netherlands, the sequel of which is now in preparation by a team led by Adrienne van den Bogaard. Dutch research reflected the international attention to professionalisation issues (Ensmenger, Haigh) very early on, in the dissertation by Ruud van Dael, Something to do with computers (2001), revealing how occupations dealing with computers typically escape the pattern of closure by professionalisation expected by the, thus outdated, sociology of professions. History of computing not only takes use and users into consideration but finally, as one may say, confronts the technological side of putting the machine to use, software, head on. The groundbreaking works

  12. Approximation and Computation

    CERN Document Server

    Gautschi, Walter; Rassias, Themistocles M

    2011-01-01

    Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to the renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research including theoretic developments, new computational alg

  13. Computer science handbook. Vol. 13.3. Environmental computer science. Computer science methods for environmental protection and environmental research

    International Nuclear Information System (INIS)

    Page, B.; Hilty, L.M.

    1994-01-01

    Environmental computer science is a new partial discipline of applied computer science, which makes use of methods and techniques of information processing in environmental protection. Thanks to the inter-disciplinary nature of environmental problems, computer science acts as a mediator between numerous disciplines and institutions in this sector. The handbook reflects the broad spectrum of state-of-the-art environmental computer science. The following important subjects are dealt with: Environmental databases and information systems, environmental monitoring, modelling and simulation, visualization of environmental data and knowledge-based systems in the environmental sector. (orig.) [de

  14. Computer ray tracing speeds.

    Science.gov (United States)

    Robb, P; Pawlowski, B

    1990-05-01

    The results of measuring the ray trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to supercomputers, are described. A correlation of ray trace speed has been made with the LINPACK benchmark, which allows the ray trace speed to be estimated using LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (Reduced Instruction Set Computer) technology, are as fast as or faster than mainframe computers in compute-bound situations.
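
    The correlation described can be used for prediction with an ordinary least-squares fit of ray-trace speed against LINPACK performance. The benchmark pairs below are hypothetical stand-ins, not the paper's data; the point is only the shape of the estimate.

    import numpy as np

    linpack_mflops = np.array([0.5, 2.0, 6.0, 12.0, 25.0])        # hypothetical
    rays_per_sec   = np.array([40.0, 170.0, 480.0, 1000.0, 2050.0])

    slope, intercept = np.polyfit(linpack_mflops, rays_per_sec, 1)

    def estimate_ray_speed(mflops):
        """Predict ray-trace speed from a machine's LINPACK figure."""
        return slope * mflops + intercept

    print(round(estimate_ray_speed(10.0)))  # predicted rays per second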

  15. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  16. Computer-assisted instruction

    NARCIS (Netherlands)

    Voogt, J.; Fisser, P.; Wright, J.D.

    2015-01-01

    Since the early days of computer technology in education in the 1960s, it was claimed that computers can assist instructional practice and hence improve student learning. Since then computer technology has developed, and its potential for education has increased. In this article, we first discuss

  17. A Distributed Snapshot Protocol for Efficient Artificial Intelligence Computation in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    JongBeom Lim

    2018-01-01

    Full Text Available Many artificial intelligence applications often require a huge amount of computing resources. As a result, cloud computing adoption rates are increasing in the artificial intelligence field. To support the demand for artificial intelligence applications and guarantee the service level agreement, cloud computing should provide not only computing resources but also fundamental mechanisms for efficient computing. In this regard, a snapshot protocol has been used to create a consistent snapshot of the global state in cloud computing environments. However, the existing snapshot protocols are not optimized in the context of artificial intelligence applications, where large-scale iterative computation is the norm. In this paper, we present a distributed snapshot protocol for efficient artificial intelligence computation in cloud computing environments. The proposed snapshot protocol is based on a distributed algorithm to run interconnected multiple nodes in a scalable fashion. Our snapshot protocol is able to deal with artificial intelligence applications, in which a large number of computing nodes are running. We reveal that our distributed snapshot protocol guarantees the correctness, safety, and liveness conditions.
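
    Consistent-snapshot protocols of this kind descend from the Chandy-Lamport marker rule: on first receiving a marker, a node records its local state and forwards markers; messages arriving on a channel before that channel's marker are recorded as in-flight. A much-simplified, single-node sketch of that base rule (names are hypothetical; the paper's protocol adds optimizations for iterative AI workloads):

    class SnapshotNode:
        def __init__(self, node_id):
            self.id = node_id
            self.state = 0                      # local application state
            self.recorded_state = None          # set when the snapshot is taken
            self.closed = set()                 # channels whose marker arrived
            self.channel_logs = {}              # channel -> in-flight messages

        def on_message(self, channel, msg, send_markers):
            if msg == "MARKER":
                if self.recorded_state is None:  # first marker: snapshot now
                    self.recorded_state = self.state
                    send_markers("MARKER")       # forward on outgoing channels
                self.closed.add(channel)         # this channel's record is done
            else:
                if self.recorded_state is not None and channel not in self.closed:
                    # in flight during the snapshot: part of the channel state
                    self.channel_logs.setdefault(channel, []).append(msg)
                self.state += msg                # normal application processing

    node = SnapshotNode("n1")
    node.on_message("c1", 5, print)         # ordinary message; state becomes 5
    node.on_message("c1", "MARKER", print)  # snapshot taken, marker forwarded
    node.on_message("c2", 7, print)         # captured as in-flight on c2
    print(node.recorded_state, node.channel_logs)  # 5 {'c2': [7]}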

  18. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Computed tomography (CT) of the sinuses ... What is CT (Computed Tomography) of the Sinuses? Computed tomography, more commonly known ...

  19. From computer to brain foundations of computational neuroscience

    CERN Document Server

    Lytton, William W

    2002-01-01

    Biology undergraduates, medical students and life-science graduate students often have limited mathematical skills. Similarly, physics, math and engineering students have little patience for the detailed facts that make up much of biological knowledge. Teaching computational neuroscience as an integrated discipline requires that both groups be brought forward onto common ground. This book does this by making ancillary material available in an appendix and providing basic explanations without becoming bogged down in unnecessary details. The book will be suitable for undergraduates and beginning graduate students taking a computational neuroscience course and also to anyone with an interest in the uses of the computer in modeling the nervous system.

  20. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...

  1. Computer users at risk: Health disorders associated with prolonged computer use

    OpenAIRE

    Abida Ellahi; M. Shahid Khalil; Fouzia Akram

    2011-01-01

    In keeping with the ISO standards that emphasize assessing the use of a product, this research aims to assess prolonged computer use and its effects on human health. The objective of this study was to investigate the association between the extent of computer use (per day) and carpal tunnel syndrome, computer stress syndrome, computer vision syndrome and musculoskeletal problems. The second objective was to investigate the extent of simultaneous occurrence of carpal tunnel syndr...

  2. Molecular computing: towards a novel computing architecture for complex problem solving

    CERN Document Server

    Chang, Weng-Long

    2014-01-01

    This textbook introduces a concise approach to the design of molecular algorithms for students or researchers who are interested in dealing with complex problems. Through numerous examples and exercises, you will understand the main differences between molecular circuits and traditional digital circuits in manipulating the same problem, and you will also learn how to design a molecular algorithm for solving a problem from start to finish. The book starts with an introduction to computational aspects of digital computers and molecular computing, data representation, molecular operations and number representation in molecular computing, and provides many molecular algorithms to construct the parity generator and the parity checker of error-detection codes in digital communication, to encode integers of different formats, single precision and double precision floating-point numbers, to implement addition and subtraction of unsigned integers, and to construct logic operations...

  3. Computation at the edge of chaos: Phase transition and emergent computation

    International Nuclear Information System (INIS)

    Langton, C.

    1990-01-01

    In order for computation to emerge spontaneously and become an important factor in the dynamics of a system, the material substrate must support the primitive functions required for computation: the transmission, storage, and modification of information. Under what conditions might we expect physical systems to support such computational primitives? This paper presents research on Cellular Automata which suggests that the optimal conditions for the support of information transmission, storage, and modification, are achieved in the vicinity of a phase transition. We observe surprising similarities between the behaviors of computations and systems near phase-transitions, finding analogs of computational complexity classes and the Halting problem within the phenomenology of phase-transitions. We conclude that there is a fundamental connection between computation and phase-transitions, and discuss some of the implications for our understanding of nature if such a connection is borne out. 31 refs., 16 figs
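
    Langton's lambda parameter, the fraction of rule-table entries that map to a non-quiescent state, is the knob this work turns to move cellular automata through the phase transition. A small, hedged sketch follows; the state count, radius and seeding are arbitrary choices, not the paper's exact setup.

        import random

        K, R = 4, 1          # 4 states, radius-1 neighborhoods
        QUIESCENT = 0

        def random_rule(lam, seed=0):
            """Rule table in which a fraction ~lam of entries are non-quiescent."""
            rng = random.Random(seed)
            return {n: rng.randrange(1, K) if rng.random() < lam else QUIESCENT
                    for n in range(K ** (2 * R + 1))}

        def step(cells, table):
            out = []
            for i in range(len(cells)):
                code = 0
                for j in range(i - R, i + R + 1):   # encode the neighborhood base K
                    code = code * K + cells[j % len(cells)]
                out.append(table[code])
            return out

        cells = [random.randrange(K) for _ in range(64)]
        for lam in (0.0, 0.45, 1.0):    # ordered, intermediate and chaotic regimes
            table = random_rule(lam)
            history = [cells]
            for _ in range(20):
                history.append(step(history[-1], table))

    Intermediate values of lambda, between the frozen and chaotic extremes, are where the paper locates the interesting, computation-supporting behavior.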

  4. Computational force, mass, and energy

    International Nuclear Information System (INIS)

    Numrich, R.W.

    1997-01-01

    This paper describes a correspondence between computational quantities commonly used to report computer performance measurements and mechanical quantities from classical Newtonian mechanics. It defines a set of three fundamental computational quantities that are sufficient to establish a system of computational measurement. From these quantities, it defines derived computational quantities that have analogous physical counterparts. These computational quantities obey three laws of motion in computational space. The solutions to the equations of motion, with appropriate boundary conditions, determine the computational mass of the computer. Computational forces, with magnitudes specific to each instruction and to each computer, overcome the inertia represented by this mass. The paper suggests normalizing the computational mass scale by picking the mass of a register on the CRAY-1 as the standard unit of mass

  5. Computational physics

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  6. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers. In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy...

  7. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.

  8. Computational physics

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  9. Do Clouds Compute? A Framework for Estimating the Value of Cloud Computing

    Science.gov (United States)

    Klems, Markus; Nimis, Jens; Tai, Stefan

    On-demand provisioning of scalable and reliable compute services, along with a cost model that charges consumers based on actual service usage, has been an objective in distributed computing research and industry for a while. Cloud Computing promises to deliver on this objective: consumers are able to rent infrastructure in the Cloud as needed, deploy applications and store data, and access them via Web protocols on a pay-per-use basis. The acceptance of Cloud Computing, however, depends on the ability for Cloud Computing providers and consumers to implement a model for business value co-creation. Therefore, a systematic approach to measure costs and benefits of Cloud Computing is needed. In this paper, we discuss the need for valuation of Cloud Computing, identify key components, and structure these components in a framework. The framework assists decision makers in estimating Cloud Computing costs and to compare these costs to conventional IT solutions. We demonstrate by means of representative use cases how our framework can be applied to real world scenarios.
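
    The framework itself is structured prose, but the core comparison it supports can be shown with a deliberately crude break-even sketch: pay-per-use cloud cost versus a fixed on-premise installation. Every figure below is an invented placeholder, not data from the paper.

        def cloud_cost(hours_used, price_per_hour):
            return hours_used * price_per_hour       # pure pay-per-use

        def on_prem_cost(capex, months, opex_per_month):
            return capex + months * opex_per_month   # fixed, usage-independent

        months = 36
        usage_hours = 2_000 * months                 # expected total usage
        cloud = cloud_cost(usage_hours, price_per_hour=0.10)
        local = on_prem_cost(capex=150_000, months=months, opex_per_month=1_500)
        print(f"cloud: ${cloud:,.0f}   on-prem: ${local:,.0f}")

    A real valuation along the lines of the framework would also itemize the benefit side (elasticity, time-to-market), not just the cost side sketched here.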

  10. Controlling data transfers from an origin compute node to a target compute node

    Science.gov (United States)

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2011-06-21

    Methods, apparatus, and products are disclosed for controlling data transfers from an origin compute node to a target compute node that include: receiving, by an application messaging module on the target compute node, an indication of a data transfer from an origin compute node to the target compute node; and administering, by the application messaging module on the target compute node, the data transfer using one or more messaging primitives of a system messaging module in dependence upon the indication.

  11. Computer Lexis and Terminology

    Directory of Open Access Journals (Sweden)

    Gintautas Grigas

    2011-04-01

    Full Text Available The computer has become a widely used tool in everyday work and at home. Every computer user sees texts on its screen containing many words naming new concepts. Those words come from the terminology used by specialists. A common vocabulary shared by computer terminology and the lexis of everyday language has come into existence. The article deals with the part of computer terminology that passes into everyday usage, and with the influence of ordinary language on computer terminology. The relation between English and Lithuanian computer terminology, and the construction and pronunciation of acronyms, are discussed as well.

  12. Computations in plasma physics

    International Nuclear Information System (INIS)

    Cohen, B.I.; Killeen, J.

    1984-01-01

    A review of computer applications in plasma physics is presented. The computer's contribution to the investigation of magnetic and inertial confinement of a plasma and of charged-particle beam propagation is described. Typical uses of computers for simulation and control of laboratory and cosmic experiments with a plasma, and for data accumulation in these experiments, are considered. Basic computational methods applied in plasma physics are discussed. Future trends in computer utilization in plasma research are considered in terms of the increasing role of microprocessors and high-speed data plotters and the necessity of more powerful computer applications.

  13. Explorations in quantum computing

    CERN Document Server

    Williams, Colin P

    2011-01-01

    By the year 2020, the basic memory components of a computer will be the size of individual atoms. At such scales, the current theory of computation will become invalid. "Quantum computing" is reinventing the foundations of computer science and information theory in a way that is consistent with quantum physics - the most accurate model of reality currently known. Remarkably, this theory predicts that quantum computers can perform certain tasks breathtakingly faster than classical computers -- and, better yet, can accomplish mind-boggling feats such as teleporting information, breaking suppos...

  14. Physics vs. computer science

    International Nuclear Information System (INIS)

    Pike, R.

    1982-01-01

    With computers becoming more frequently used in theoretical and experimental physics, physicists can no longer afford to be ignorant of the basic techniques and results of computer science. Computing principles belong in a physicist's tool box, along with experimental methods and applied mathematics, and the easiest way to educate physicists in computing is to provide, as part of the undergraduate curriculum, a computing course designed specifically for physicists. As well, the working physicist should interact with computer scientists, giving them challenging problems in return for their expertise. (orig.)

  15. The efficacy of a health-related quality-of-life intervention during 48 weeks of biologic treatment of patients with moderate to severe psoriasis: study protocol for a multicenter randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Prinsen Cecilia AC

    2012-12-01

    Full Text Available Abstract Background: Interest in health-related quality of life (HRQoL) outcome research in dermatology is increasing, especially in the systemic treatment of psoriasis with biologic agents. In other specialties, such as oncology, the application of a HRQoL intervention is considered to be an aid for monitoring disease and treatment over time, for communication with the patient, and for improving treatment outcome. However, in dermatology practice, the application of this intervention is relatively new. Moreover, evidence on the effectiveness of a HRQoL intervention in dermatology is missing. It is hypothesized that the application of a HRQoL intervention in dermatology practice will have a positive impact on patients’ HRQoL as well as on doctor-patient communication. Methods/design: In a prospective multicenter cluster randomized controlled trial, patients diagnosed with moderate to severe psoriasis who receive biologic treatment will be followed for 48 weeks. The study sites, and not the patients, will be randomly allocated via a computer-based randomization system to either the intervention (treatment with etanercept and standardized HRQoL assessment and communication) or the control group (treatment with etanercept alone). The HRQoL intervention will include (1) the electronic assessment of the Skindex-29, a well-studied dermatology-specific HRQoL questionnaire, and (2) the communication of the resulting Skindex-29 data with the patient. Prior to study start, dermatologists in the intervention group will be educated and trained in standardized HRQoL assessment and communication using the Skindex-29. At six consecutive visits, patients at study sites in the intervention group will be asked to complete the Skindex-29 on a desk-top pc at the clinic, just before their consultation with the dermatologist. A print-out of the completed questionnaire will be made and, guided by this print-out, feedback on the HRQoL scores will be given during the ...
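
    The unit of randomization here is the study site, not the patient. A hedged sketch of that cluster principle follows; the trial's actual computer-based randomization system is not described in the record, and the site names and seed below are invented.

        import random

        sites = [f"site_{i:02d}" for i in range(1, 13)]   # hypothetical clusters
        rng = random.Random(2012)        # fixed seed for a reproducible allocation
        rng.shuffle(sites)
        half = len(sites) // 2
        allocation = {s: "intervention" for s in sites[:half]}
        allocation.update({s: "control" for s in sites[half:]})
        # every patient enrolled at a site receives that site's assigned arm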

  16. The role of dedicated data computing centers in the age of cloud computing

    Science.gov (United States)

    Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr

    2017-10-01

    Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.

  17. Mathematics for computer graphics

    CERN Document Server

    Vince, John

    2006-01-01

    Helps you understand the mathematical ideas used in computer animation, virtual reality, CAD, and other areas of computer graphics. This work also helps you to rediscover the mathematical techniques required to solve problems and design computer programs for computer graphic applications

  18. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  19. DCE. Future IHEP's computing environment

    International Nuclear Information System (INIS)

    Zheng Guorui; Liu Xiaoling

    1995-01-01

    IHEP's computing environment consists of several different computing environments established on IHEP computer networks, of which the BES environment supporting HEP computing is the main part. In connection with the procedure of improving and extending the BES environment, the authors outline the development of computing environments from the viewpoint of establishing a high energy physics (HEP) environment. The direction of development toward distributed computing in the IHEP computing environment, based on current trends in distributed computing, is presented.

  20. A Heterogeneous High-Performance System for Computational and Computer Science

    Science.gov (United States)

    2016-11-15

    ...expand the research infrastructure at the institution but also to enhance the high-performance computing training provided to both undergraduate and... cloud computing, supercomputing, and the availability of cheap memory and storage led to enormous amounts of data to be sifted through in forensic... High-Performance Computing (HPC) tools that can be integrated with existing curricula and support our research to modernize and dramatically advance...

  1. Geometric computations with interval and new robust methods applications in computer graphics, GIS and computational geometry

    CERN Document Server

    Ratschek, H

    2003-01-01

    This undergraduate and postgraduate text will familiarise readers with interval arithmetic and related tools to gain reliable and validated results and logically correct decisions for a variety of geometric computations, plus the means for alleviating the effects of the errors. It also considers computations on geometric point-sets, which are neither robust nor reliable in processing with standard methods. The authors provide two effective tools for obtaining correct results: (a) interval arithmetic, and (b) ESSA, the new powerful algorithm which improves many geometric computations and makes th...

  2. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer. This book discusses algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster...

  3. A Model of Computation for Bit-Level Concurrent Computing and Programming: APEC

    Science.gov (United States)

    Ajiro, Takashi; Tsuchida, Kensei

    A concurrent model of computation and a language based on the model for bit-level operation are useful for developing asynchronous and concurrent programs compositionally, which frequently use bit-level operations. Some examples are programs for video games, hardware emulation (including virtual machines), and signal processing. However, few models and languages are optimized and oriented to bit-level concurrent computation. We previously developed a visual programming language called A-BITS for bit-level concurrent programming. The language is based on a dataflow-like model that computes using processes that provide serial bit-level operations and FIFO buffers connected to them. It can express bit-level computation naturally and develop compositionally. We then devised a concurrent computation model called APEC (Asynchronous Program Elements Connection) for bit-level concurrent computation. This model enables precise and formal expression of the process of computation, and a notion of primitive program elements for controlling and operating can be expressed synthetically. Specifically, the model is based on a notion of uniform primitive processes, called primitives, that have three terminals and four ordered rules at most, as well as on bidirectional communication using vehicles called carriers. A new notion is that a carrier moving between two terminals can briefly express some kinds of computation such as synchronization and bidirectional communication. The model's properties make it most applicable to bit-level computation compositionally, since the uniform computation elements are enough to develop components that have practical functionality. Through future application of the model, our research may enable further research on a base model of fine-grain parallel computer architecture, since the model is suitable for expressing massive concurrency by a network of primitives.
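
    A loose sketch of the dataflow flavor the abstract describes: bit-serial processes connected by FIFO buffers, firing only when their inputs hold data. APEC's primitives, terminals and carriers are not modeled here; this illustrates only the general style of computation.

        from collections import deque

        class Wire(deque):
            """FIFO buffer carrying single bits between processes."""

        def xor_process(a: Wire, b: Wire, out: Wire):
            while a and b:                   # fire while both inputs hold a bit
                out.append(a.popleft() ^ b.popleft())

        a, b, out = Wire([1, 0, 1, 1]), Wire([1, 1, 0, 1]), Wire()
        xor_process(a, b, out)
        print(list(out))                     # [0, 1, 1, 0]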

  4. An introduction to computer viruses

    Energy Technology Data Exchange (ETDEWEB)

    Brown, D.R.

    1992-03-01

    This report on computer viruses is based upon a thesis written for the Master of Science degree in Computer Science from the University of Tennessee in December 1989 by David R. Brown. This thesis is entitled An Analysis of Computer Virus Construction, Proliferation, and Control and is available through the University of Tennessee Library. This paper contains an overview of the computer virus arena that can help the reader to evaluate the threat that computer viruses pose. The extent of this threat can only be determined by evaluating many different factors. These factors include the relative ease with which a computer virus can be written, the motivation involved in writing a computer virus, the damage and overhead incurred by infected systems, and the legal implications of computer viruses, among others. Based upon the research, the development of a computer virus seems to require more persistence than technical expertise. This is a frightening proclamation to the computing community. The education of computer professionals to the dangers that viruses pose to the welfare of the computing industry as a whole is stressed as a means of inhibiting the current proliferation of computer virus programs. Recommendations are made to assist computer users in preventing infection by computer viruses. These recommendations support solid general computer security practices as a means of combating computer viruses.

  5. Navier-Stokes computer

    International Nuclear Information System (INIS)

    Hayder, M.E.

    1988-01-01

    A new scientific supercomputer, known as the Navier-Stokes Computer (NSC), has been designed. The NSC is a multi-purpose machine, and for applications in the field of computational fluid dynamics (CFD), this supercomputer is expected to yield a computational speed far exceeding that of present-day supercomputers. This computer has a few very powerful processors (known as nodes) connected by an internodal network. There are three versions of the NSC nodes: micro-, mini- and full-node. The micro-node was developed to prove, demonstrate and refine the key architectural features of the NSC. Architectures of the two recent versions of the NSC nodes are presented, with the main focus on the full-node. At a clock speed of 20 MHz, the mini- and the full-node have peak computational speeds of 200 and 640 MFLOPS, respectively. The full-node is the final version of the NSC nodes, and an NSC is expected to have 128 full-nodes. To test the suitability of different algorithms on the NSC architecture, an NSC simulator was developed. Some of the existing computational fluid dynamics codes were placed on this simulator to determine important and relevant issues relating to the efficient use of the NSC architecture.
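
    The figures quoted above fix the machine's aggregate peak, which is worth making explicit:

        # 128 full-nodes at 640 MFLOPS each, as stated in the abstract
        nodes, node_peak_mflops = 128, 640
        print(nodes * node_peak_mflops / 1000, "GFLOPS aggregate peak")  # 81.92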

  6. Computer algebra applications

    International Nuclear Information System (INIS)

    Calmet, J.

    1982-01-01

    A survey of applications based either on fundamental algorithms in computer algebra or on the use of a computer algebra system is presented. Recent work in biology, chemistry, physics, mathematics and computer science is discussed. In particular, applications in high energy physics (quantum electrodynamics), celestial mechanics and general relativity are reviewed. (Auth.)

  7. Demonstration of blind quantum computing.

    Science.gov (United States)

    Barz, Stefanie; Kashefi, Elham; Broadbent, Anne; Fitzsimons, Joseph F; Zeilinger, Anton; Walther, Philip

    2012-01-20

    Quantum computers, besides offering substantial computational speedups, are also expected to preserve the privacy of a computation. We present an experimental demonstration of blind quantum computing in which the input, computation, and output all remain unknown to the computer. We exploit the conceptual framework of measurement-based quantum computation that enables a client to delegate a computation to a quantum server. Various blind delegated computations, including one- and two-qubit gates and the Deutsch and Grover quantum algorithms, are demonstrated. The client only needs to be able to prepare and transmit individual photonic qubits. Our demonstration is crucial for unconditionally secure quantum cloud computing and might become a key ingredient for real-life applications, especially when considering the challenges of making powerful quantum computers widely available.

  8. Computational intelligence and neuromorphic computing potential for cybersecurity applications

    Science.gov (United States)

    Pino, Robinson E.; Shevenell, Michael J.; Cam, Hasan; Mouallem, Pierre; Shumaker, Justin L.; Edwards, Arthur H.

    2013-05-01

    In today's highly mobile, networked, and interconnected internet world, the flow and volume of information is overwhelming and continuously increasing. Therefore, it is believed that the next frontier in technological evolution and development will rely on our ability to develop intelligent systems that can help us process, analyze, and make sense of information autonomously, just as a well-trained and educated human expert would. In computational intelligence, neuromorphic computing promises to allow for the development of computing systems able to imitate natural neurobiological processes and form the foundation for intelligent system architectures.

  9. Women are underrepresented in computational biology: An analysis of the scholarly literature in biology, computer science and computational biology.

    Directory of Open Access Journals (Sweden)

    Kevin S Bonham

    2017-10-01

    Full Text Available While women are generally underrepresented in STEM fields, there are noticeable differences between fields. For instance, the gender ratio in biology is more balanced than in computer science. We were interested in how this difference is reflected in the interdisciplinary field of computational/quantitative biology. To this end, we examined the proportion of female authors in publications from the PubMed and arXiv databases. There are fewer female authors on research papers in computational biology, as compared to biology in general. This is true across authorship position, year, and journal impact factor. A comparison with arXiv shows that quantitative biology papers have a higher ratio of female authors than computer science papers, placing computational biology in between its two parent fields in terms of gender representation. Both in biology and in computational biology, a female last author increases the probability of other authors on the paper being female, pointing to a potential role of female PIs in influencing the gender balance.
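
    The tabulation the study describes reduces to counting author gender by byline position. A toy sketch over invented records follows; the study itself inferred gender from names across PubMed and arXiv at scale, and the 'F'/'M' labels here are placeholders.

        from collections import Counter

        papers = [  # hypothetical (name, inferred_gender) author lists
            {"authors": [("Ana", "F"), ("Ben", "M"), ("Carla", "F")]},
            {"authors": [("Dev", "M"), ("Eve", "F")]},
        ]

        counts = {"first": Counter(), "last": Counter()}
        for p in papers:
            counts["first"][p["authors"][0][1]] += 1
            counts["last"][p["authors"][-1][1]] += 1

        for position, c in counts.items():
            print(position, round(c["F"] / sum(c.values()), 2))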

  10. Women are underrepresented in computational biology: An analysis of the scholarly literature in biology, computer science and computational biology.

    Science.gov (United States)

    Bonham, Kevin S; Stefan, Melanie I

    2017-10-01

    While women are generally underrepresented in STEM fields, there are noticeable differences between fields. For instance, the gender ratio in biology is more balanced than in computer science. We were interested in how this difference is reflected in the interdisciplinary field of computational/quantitative biology. To this end, we examined the proportion of female authors in publications from the PubMed and arXiv databases. There are fewer female authors on research papers in computational biology, as compared to biology in general. This is true across authorship position, year, and journal impact factor. A comparison with arXiv shows that quantitative biology papers have a higher ratio of female authors than computer science papers, placing computational biology in between its two parent fields in terms of gender representation. Both in biology and in computational biology, a female last author increases the probability of other authors on the paper being female, pointing to a potential role of female PIs in influencing the gender balance.

  11. Computer narratology: narrative templates in computer games

    OpenAIRE

    Praks, Vítězslav

    2009-01-01

    Relations and interactions between literature and computer games are examined. The study contains a theoretical analysis of the game as an aesthetic artefact. To play a game means to leave the practical world for the sake of a fictional world. Artistic communication has more similarities with game communication than with normal, practical communication. Game study can help us understand basic concepts of art communication (game rules - poetic rules, game world - fiction, function in game - meaning in art). Compute...

  12. Place-Specific Computing

    DEFF Research Database (Denmark)

    Messeter, Jörn; Johansson, Michael

    An increased interest in the notion of place has evolved in interaction design. Proliferation of wireless infrastructure, developments in digital media, and a ‘spatial turn’ in computing provide the base for place-specific computing as a suggested new genre of interaction design. In the REcult project, place-specific computing is explored through design-oriented research. This article reports six pilot studies where design students have designed concepts for place-specific computing in Berlin (Germany), Cape Town (South Africa), Rome (Italy) and Malmö (Sweden). Background and arguments for place-specific computing as a genre of interaction design are described. A total number of 36 design concepts designed for 16 designated zones in the four cities are presented. An analysis of the design concepts is presented, indicating potentials, possibilities and problems as directions for future...

  13. Non-Causal Computation

    Directory of Open Access Journals (Sweden)

    Ämin Baumeler

    2017-07-01

    Full Text Available Computation models such as circuits describe sequences of computation steps that are carried out one after the other. In other words, algorithm design is traditionally subject to the restriction imposed by a fixed causal order. We address a novel computing paradigm beyond quantum computing, replacing this assumption by mere logical consistency: We study non-causal circuits, where a fixed time structure within a gate is locally assumed whilst the global causal structure between the gates is dropped. We present examples of logically consistent non-causal circuits outperforming all causal ones; they imply that suppressing loops entirely is more restrictive than just avoiding the contradictions they can give rise to. That fact is already known for correlations as well as for communication, and we here extend it to computation.

  14. Toward Cloud Computing Evolution

    OpenAIRE

    Susanto, Heru; Almunawar, Mohammad Nabil; Kang, Chen Chin

    2012-01-01

    Information Technology (IT) has shaped the success of organizations, giving them a solid foundation that increases both their level of efficiency and their productivity. The computing industry is witnessing a paradigm shift in the way computing is performed worldwide. There is a growing awareness among consumers and enterprises to access their IT resources extensively through a "utility" model known as "cloud computing." Cloud computing was initially rooted in distributed grid-based computing. ...

  15. Reliability in the utility computing era: Towards reliable Fog computing

    DEFF Research Database (Denmark)

    Madsen, Henrik; Burtschy, Bernard; Albeanu, G.

    2013-01-01

    This paper considers current paradigms in computing and outlines the most important aspects concerning their reliability. The Fog computing paradigm, as a non-trivial extension of the Cloud, is considered, and the reliability of networks of smart devices is discussed. Combining the reliability requirements of the grid and cloud paradigms with the reliability requirements of networks of sensors and actuators, it follows that designing a reliable Fog computing platform is feasible.

  16. The Computer Revolution.

    Science.gov (United States)

    Berkeley, Edmund C.

    "The Computer Revolution", a part of the "Second Industrial Revolution", is examined with reference to the social consequences of computers. The subject is introduced in an opening section which discusses the revolution in the handling of information and the history, powers, uses, and working s of computers. A second section examines in detail the…

  17. Essential numerical computer methods

    CERN Document Server

    Johnson, Michael L

    2010-01-01

    The use of computers and computational methods has become ubiquitous in biological and biomedical research. During the last 2 decades most basic algorithms have not changed, but what has is the huge increase in computer speed and ease of use, along with the corresponding orders of magnitude decrease in cost. A general perception exists that the only applications of computers and computer methods in biological and biomedical research are either basic statistical analysis or the searching of DNA sequence data bases. While these are important applications they only scratch the surface of the current and potential applications of computers and computer methods in biomedical research. The various chapters within this volume include a wide variety of applications that extend far beyond this limited perception. As part of the Reliable Lab Solutions series, Essential Numerical Computer Methods brings together chapters from volumes 210, 240, 321, 383, 384, 454, and 467 of Methods in Enzymology. These chapters provide ...

  18. Applications of computer algebra

    CERN Document Server

    1985-01-01

    Today, certain computer software systems exist which surpass the computational ability of researchers when their mathematical techniques are applied to many areas of science and engineering. These computer systems can perform a large portion of the calculations seen in mathematical analysis. Despite this massive power, thousands of people use these systems as a routine resource for everyday calculations. These software programs are commonly called "Computer Algebra" systems. They have names such as MACSYMA, MAPLE, muMATH, REDUCE and SMP. They are receiving credit as a computational aid with increasing regularity in articles in the scientific and engineering literature. When most people think about computers and scientific research these days, they imagine a machine grinding away, processing numbers arithmetically. It is not generally realized that, for a number of years, computers have been performing non-numeric computations. This means, for example, that one inputs an equation and obtains a closed form...

  19. Quantum Computing

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 5, Issue 9, September 2000, pp. 69-81. Quantum Computing - Building Blocks of a Quantum Computer, by C S Vijay and Vishal Gupta (General Article).

  20. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for...

  1. Methodical Approaches to Teaching of Computer Modeling in Computer Science Course

    Science.gov (United States)

    Rakhimzhanova, B. Lyazzat; Issabayeva, N. Darazha; Khakimova, Tiyshtik; Bolyskhanova, J. Madina

    2015-01-01

    The purpose of this study was to justify of the formation technique of representation of modeling methodology at computer science lessons. The necessity of studying computer modeling is that the current trends of strengthening of general education and worldview functions of computer science define the necessity of additional research of the…

  2. Computer Graphics 2: More of the Best Computer Art and Design.

    Science.gov (United States)

    1994

    This collection of computer generated images aims to present media tools and processes, stimulate ideas, and inspire artists and art students working in computer-related design. The images are representative of state-of-the-art editorial, broadcast, packaging, fine arts, and graphic techniques possible through computer generation. Each image is…

  3. A Compute Environment of ABC95 Array Computer Based on Multi-FPGA Chip

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    ABC95 is an array computer with a multi-function network based on FPGA technology. The multi-function network supports conflict-free processor access to data in memory and processor-to-processor data exchange over an enhanced MESH network. The ABC95 instruction set includes control instructions, scalar instructions, and vector instructions; the network instructions in particular are introduced. A programming environment for ABC95 assembly language is designed, and a VC++-based programming environment for the ABC95 array computer is presented. It includes functions to load ABC95 programs and data, store results, run programs, and so on. In particular, the data type for ABC95 conflict-free access is defined. The results show that these technologies can support ABC95 programmers effectively.

  4. Medical image computing for computer-supported diagnostics and therapy. Advances and perspectives.

    Science.gov (United States)

    Handels, H; Ehrhardt, J

    2009-01-01

    Medical image computing has become one of the most challenging fields in medical informatics. In image-based diagnostics of the future, software assistance will become more and more important, and image analysis systems integrating advanced image computing methods are needed to extract quantitative image parameters to characterize the state and changes of image structures of interest (e.g. tumors, organs, vessels, bones etc.) in a reproducible and objective way. Furthermore, in the field of software-assisted and navigated surgery, medical image computing methods play a key role and have opened up new perspectives for patient treatment. However, further developments are needed to increase the degree of automation, accuracy, reproducibility and robustness. Moreover, the systems developed have to be integrated into the clinical workflow. For the development of advanced image computing systems, methods of different scientific fields have to be adapted and used in combination. The principal methodologies in medical image computing are the following: image segmentation, image registration, image analysis for quantification and computer-assisted image interpretation, modeling and simulation, as well as visualization and virtual reality. In particular, model-based image computing techniques open up new perspectives for the prediction of organ changes and risk analysis of patients, and will gain importance in the diagnostics and therapy of the future. From a methodical point of view, the authors identify the following future trends and perspectives in medical image computing: development of optimized application-specific systems and integration into the clinical workflow, enhanced computational models for image analysis and virtual reality training systems, integration of different image computing methods, further integration of multimodal image data and biosignals, and advanced methods for 4D medical image computing. The development of image analysis systems for diagnostic support or ...

  5. Quantum walk computation

    International Nuclear Information System (INIS)

    Kendon, Viv

    2014-01-01

    Quantum versions of random walks have diverse applications that are motivating experimental implementations as well as theoretical studies. Recent results showing quantum walks are “universal for quantum computation” relate to algorithms, to be run on quantum computers. We consider whether an experimental implementation of a quantum walk could provide useful computation before we have a universal quantum computer

  6. Distributed computing system with dual independent communications paths between computers and employing split tokens

    Science.gov (United States)

    Rasmussen, Robert D. (Inventor); Manning, Robert M. (Inventor); Lewis, Blair F. (Inventor); Bolotin, Gary S. (Inventor); Ward, Richard S. (Inventor)

    1990-01-01

    This is a distributed computing system providing flexible fault tolerance; ease of software design and concurrency specification; and dynamic balancing of loads. The system comprises a plurality of computers each having a first input/output interface and a second input/output interface for interfacing to communications networks, each second input/output interface including a bypass for bypassing the associated computer. A global communications network interconnects the first input/output interfaces, providing each computer the ability to broadcast messages simultaneously to the remainder of the computers. A meshwork communications network interconnects the second input/output interfaces, providing each computer with the ability to establish a communications link with another of the computers, bypassing the remainder of the computers. Each computer is controlled by a resident copy of a common operating system. Communication between respective ones of the computers is by means of split tokens, each having a moving first portion which is sent from computer to computer and a resident second portion which is disposed in the memory of at least one of the computers, wherein the location of the second portion is part of the first portion. The split tokens represent both functions to be executed by the computers and data to be employed in the execution of the functions. The first input/output interfaces each include logic for detecting a collision between messages and for terminating the broadcasting of a message, whereby collisions between messages are detected and avoided.
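
    The split-token idea is the distinctive data structure here: a moving part travels between computers and carries the location of a resident part held in some node's memory. A hedged sketch of plausible shapes follows; the patent specifies no concrete layout, so every field name below is invented.

        from dataclasses import dataclass

        @dataclass
        class ResidentPart:
            node_id: int       # which computer's memory holds this part
            data: bytes        # data employed when the function executes

        @dataclass
        class MovingPart:
            function: str      # function the token asks a node to execute
            resident_at: int   # location of the resident part, per the claim

        memory = {7: ResidentPart(node_id=7, data=b"operands")}
        token = MovingPart(function="integrate", resident_at=7)
        resident = memory[token.resident_at]   # dereference on arrival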

  7. Cloud computing for radiologists

    OpenAIRE

    Amit T Kharat; Amjad Safvi; S S Thind; Amarjit Singh

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as...

  8. Modelling, abstraction, and computation in systems biology: A view from computer science.

    Science.gov (United States)

    Melham, Tom

    2013-04-01

    Systems biology is centrally engaged with computational modelling across multiple scales and at many levels of abstraction. Formal modelling, precise and formalised abstraction relationships, and computation also lie at the heart of computer science--and over the past decade a growing number of computer scientists have been bringing their discipline's core intellectual and computational tools to bear on biology in fascinating new ways. This paper explores some of the apparent points of contact between the two fields, in the context of a multi-disciplinary discussion on conceptual foundations of systems biology. Copyright © 2012 Elsevier Ltd. All rights reserved.

  9. All-optical reservoir computing.

    Science.gov (United States)

    Duport, François; Schneider, Bendix; Smerieri, Anteo; Haelterman, Marc; Massar, Serge

    2012-09-24

    Reservoir Computing is a novel computing paradigm that uses a nonlinear recurrent dynamical system to carry out information processing. Recent electronic and optoelectronic Reservoir Computers based on an architecture with a single nonlinear node and a delay loop have shown performance on standardized tasks comparable to state-of-the-art digital implementations. Here we report an all-optical implementation of a Reservoir Computer, made of off-the-shelf components for optical telecommunications. It uses the saturation of a semiconductor optical amplifier as nonlinearity. The present work shows that, within the Reservoir Computing paradigm, all-optical computing with state-of-the-art performance is possible.
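
    The architecture in question is a single nonlinear node plus a delay loop holding many "virtual" nodes. A purely numerical toy of that idea follows; tanh stands in for the saturating semiconductor optical amplifier, and all sizes and gains are arbitrary.

        import math
        import random

        N = 50                                   # virtual nodes in the delay loop
        mask = [random.uniform(-1, 1) for _ in range(N)]   # input masking weights
        loop = [0.0] * N                         # state circulating in the loop

        def step(u, feedback=0.8, gain=1.5):
            """Feed one input sample past every virtual node."""
            for i in range(N):
                drive = gain * mask[i] * u + feedback * loop[i]
                loop[i] = math.tanh(drive)       # saturating nonlinearity
            return list(loop)

        states = [step(u) for u in (0.1, -0.3, 0.7, 0.0)]
        # a trained linear readout over 'states' completes the reservoir computer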

  10. 77 FR 20047 - Certain Computer and Computer Peripheral Devices and Components Thereof and Products Containing...

    Science.gov (United States)

    2012-04-03

    ... INTERNATIONAL TRADE COMMISSION [DN 2889] Certain Computer and Computer Peripheral Devices and... Certain Computer and Computer Peripheral Devices and Components Thereof and Products Containing the Same... importation, and the sale within the United States after importation of certain computer and computer...

  11. Numbers and computers

    CERN Document Server

    Kneusel, Ronald T

    2015-01-01

    This is a book about numbers and how those numbers are represented in and operated on by computers. It is crucial that developers understand this area because the numerical operations allowed by computers, and the limitations of those operations, especially in the area of floating point math, affect virtually everything people try to do with computers. This book aims to fill this gap by exploring, in sufficient but not overwhelming detail, just what it is that computers do with numbers. Divided into two parts, the first deals with standard representations of integers and floating point numbers...
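
    The book's central caution is easy to demonstrate: binary floating point cannot represent 0.1 exactly, so even trivial arithmetic shows rounding error. This standard example is not taken from the book itself.

        import math

        print(0.1 + 0.2 == 0.3)              # False: neither side is exact
        print(f"{0.1 + 0.2:.17f}")           # 0.30000000000000004
        print(math.isclose(0.1 + 0.2, 0.3))  # True: compare with a tolerance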

  12. New computing systems, future computing environment, and their implications on structural analysis and design

    Science.gov (United States)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape-to-Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successes prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  18. The computer boys take over: computers, programmers, and the politics of technical expertise

    CERN Document Server

    Ensmenger, Nathan L

    2010-01-01

    This is a book about the computer revolution of the mid-twentieth century and the people who made it possible. Unlike most histories of computing, it is not a book about machines, inventors, or entrepreneurs. Instead, it tells the story of the vast but largely anonymous legions of computer specialists -- programmers, systems analysts, and other software developers -- who transformed the electronic computer from a scientific curiosity into the defining technology of the modern era. As the systems that they built became increasingly powerful and ubiquitous, these specialists became the focus of a series of critiques of the social and organizational impact of electronic computing. To many of their contemporaries, it seemed the "computer boys" were taking over, not just in the corporate setting, but also in government, politics, and society in general. In The Computer Boys Take Over, Nathan Ensmenger traces the rise to power of the computer expert in modern American society. His rich and nuanced portrayal of the ...

  19. Forensic Computing (Dagstuhl Seminar 13482)

    OpenAIRE

    Freiling, Felix C.; Hornung, Gerrit; Polcák, Radim

    2014-01-01

    Forensic computing (sometimes also called digital forensics, computer forensics or IT forensics) is a branch of forensic science pertaining to digital evidence, i.e., any legal evidence that is processed by digital computer systems or stored on digital storage media. Forensic computing is a new discipline evolving within the intersection of several established research areas such as computer science, computer engineering and law. Forensic computing is rapidly gaining importance since the...

  20. Design of Computer Experiments

    DEFF Research Database (Denmark)

    Dehlendorff, Christian

    The main topic of this thesis is design and analysis of computer and simulation experiments and is dealt with in six papers and a summary report. Simulation and computer models have in recent years received increasingly more attention due to their increasing complexity and usability. Software packages make the development of rather complicated computer models using predefined building blocks possible. This implies that the range of phenomena that are analyzed by means of a computer model has expanded significantly. As the complexity grows, so does the need for efficient experimental designs and analysis methods, since complex computer models often are expensive to use in terms of computer time. The choice of performance parameter is an important part of the analysis of computer and simulation models, and Paper A introduces a new statistic for waiting times in health care units. The statistic...

  1. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem...... conspicuously absent from the literature on biomolecular computing; to partially redress this absence, we introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined...... by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways...

  2. Performing an allreduce operation on a plurality of compute nodes of a parallel computer

    Science.gov (United States)

    Faraj, Ahmad [Rochester, MN

    2012-04-17

    Methods, apparatus, and products are disclosed for performing an allreduce operation on a plurality of compute nodes of a parallel computer. Each compute node includes at least two processing cores. Each processing core has contribution data for the allreduce operation. Performing an allreduce operation on a plurality of compute nodes of a parallel computer includes: establishing one or more logical rings among the compute nodes, each logical ring including at least one processing core from each compute node; performing, for each logical ring, a global allreduce operation using the contribution data for the processing cores included in that logical ring, yielding a global allreduce result for each processing core included in that logical ring; and performing, for each compute node, a local allreduce operation using the global allreduce results for each processing core on that compute node.
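
    The ring scheme this record describes is easy to picture in miniature. The sketch below is a single-process illustration, assuming a sum reduction; the node and core layout is a hypothetical stand-in, not the patented implementation.

      # Single-process sketch of a sum allreduce over logical rings.
      # Layout (hypothetical): two compute nodes, two cores per node,
      # one logical ring per core index, as in the record's description.

      def ring_allreduce(contributions):
          """Every member of the ring ends up with the ring-wide sum."""
          total = sum(contributions)               # reduce phase around the ring
          return [total] * len(contributions)      # broadcast phase

      node_a = [1, 2]    # contributions of cores 0 and 1 on node A
      node_b = [3, 4]    # contributions of cores 0 and 1 on node B

      ring0 = ring_allreduce([node_a[0], node_b[0]])   # global allreduce, ring 0
      ring1 = ring_allreduce([node_a[1], node_b[1]])   # global allreduce, ring 1

      # Local allreduce on each node combines its cores' global results.
      print(ring0[0] + ring1[0])   # node A total: 10
      print(ring0[1] + ring1[1])   # node B total: 10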

  3. Quantum computing

    OpenAIRE

    Burba, M.; Lapitskaya, T.

    2017-01-01

    This article gives an elementary introduction to quantum computing. It is a draft for a book chapter of the "Handbook of Nature-Inspired and Innovative Computing", Eds. A. Zomaya, G.J. Milburn, J. Dongarra, D. Bader, R. Brent, M. Eshaghian-Wilner, F. Seredynski (Springer, Berlin Heidelberg New York, 2006).

  4. Gender differences in the use of computers, programming, and peer interactions in computer science classrooms

    Science.gov (United States)

    Stoilescu, Dorian; Egodawatte, Gunawardena

    2010-12-01

    Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new definitions for computer science culture but to see how male and female students see themselves involved in computer science practices, how they see computer science as a successful career, and what they like and dislike about current computer science practices. The study took place in a mid-sized university in Ontario. Sixteen students and two instructors were interviewed to get their views. We found that male and female views differ on computer use, programming, and the pattern of student interactions. Female and male students did not have any major issues in using computers. In computer programming, female students were not as involved in computing activities as male students, who were heavily involved. As for opinions about successful computer science professionals, both female and male students emphasized hard work, detail-oriented approaches, and enjoyment in working with computers. The myth of the geek as the typical profile of a successful computer science student was not found to be true.

  5. Emission computed tomography

    International Nuclear Information System (INIS)

    Ott, R.J.

    1986-01-01

    Emission Computed Tomography is a technique used for producing single or multiple cross-sectional images of the distribution of radionuclide labelled agents in vivo. The techniques of Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) are described with particular regard to the function of the detectors used to produce images and the computer techniques used to build up images. (UK)

  6. Intelligent distributed computing

    CERN Document Server

    Thampi, Sabu

    2015-01-01

    This book contains a selection of refereed and revised papers of the Intelligent Distributed Computing Track originally presented at the third International Symposium on Intelligent Informatics (ISI-2014), September 24-27, 2014, Delhi, India.  The papers selected for this Track cover several Distributed Computing and related topics including Peer-to-Peer Networks, Cloud Computing, Mobile Clouds, Wireless Sensor Networks, and their applications.

  7. Computing environment logbook

    Science.gov (United States)

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
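
    A toy sketch of the logbook idea follows, assuming events are undone by applying a stored inverse action; the Logbook class and its method names are illustrative, not the patented interface.

      class Logbook:
          """Toy logbook of a computing environment's events."""

          def __init__(self):
              self.history = []         # chronological (event, undo_fn) pairs

          def log(self, event, undo_fn):
              self.history.append((event, undo_fn))

          def search(self, keyword):
              """Return past events whose description matches keyword."""
              return [event for event, _ in self.history if keyword in event]

          def undo_last(self, keyword):
              """Undo the most recent matching event and drop it from history."""
              for i in range(len(self.history) - 1, -1, -1):
                  event, undo_fn = self.history[i]
                  if keyword in event:
                      undo_fn()         # apply the stored inverse action
                      del self.history[i]
                      return event

      env = {}
      book = Logbook()
      env["x"] = 1
      book.log("set x=1", lambda: env.pop("x", None))
      print(book.search("x"))    # ['set x=1']
      book.undo_last("x")
      print(env)                 # {}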

  8. Feynman diagram drawing made easy

    International Nuclear Information System (INIS)

    Baillargeon, M.

    1997-01-01

    We present a drawing package optimised for Feynman diagrams. These can be constructed interactively with a mouse-driven graphical interface or from a script file, the latter being more suitable for working with a diagram generator. It provides most features encountered in Feynman diagrams and allows the user to modify every part of a diagram after its creation. Special attention has been paid to obtaining a high-quality printout as easily as possible. This package is written in Tcl/Tk and in C. (orig.)

  9. Computing meaning v.4

    CERN Document Server

    Bunt, Harry; Pulman, Stephen

    2013-01-01

    This book is a collection of papers by leading researchers in computational semantics. It presents a state-of-the-art overview of recent and current research in computational semantics, including descriptions of new methods for constructing and improving resources for semantic computation, such as WordNet, VerbNet, and semantically annotated corpora. It also presents new statistical methods in semantic computation, such as the application of distributional semantics in the compositional calculation of sentence meanings. Computing the meaning of sentences, texts, and spoken or texted dialogue i

  10. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.
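
    The techniques named in this description are straightforward to try in Python, the language the book uses. Below is an independent sketch of two of them, Simpson's-rule quadrature and the fast Fourier transform, using NumPy; it is not an excerpt from the book.

      import numpy as np

      # Simpson's-rule quadrature of sin(x) on [0, pi]; the exact answer is 2.
      def simpson(f, a, b, n=100):              # n must be even
          x = np.linspace(a, b, n + 1)
          y = f(x)
          h = (b - a) / n
          return h / 3 * (y[0] + y[-1] + 4 * y[1:-1:2].sum() + 2 * y[2:-1:2].sum())

      print(simpson(np.sin, 0.0, np.pi))        # ~2.000000

      # Fast Fourier transform of a sampled sine: one sharp spectral peak.
      t = np.linspace(0.0, 1.0, 256, endpoint=False)
      signal = np.sin(2 * np.pi * 8 * t)        # an 8 Hz tone sampled for 1 s
      spectrum = np.abs(np.fft.rfft(signal))
      print(spectrum.argmax())                  # peak at bin 8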

  11. Cloud Computing: Strategies for Cloud Computing Adoption

    OpenAIRE

    Shimba, Faith

    2010-01-01

    The advent of cloud computing in recent years has sparked an interest from different organisations, institutions and users to take advantage of web applications. This is a result of the new economic model for the Information Technology (IT) department that cloud computing promises. The model promises a shift from an organisation required to invest heavily for limited IT resources that are internally managed, to a model where the organisation can buy or rent resources that are managed by a clo...

  12. Computational Ocean Acoustics

    CERN Document Server

    Jensen, Finn B; Porter, Michael B; Schmidt, Henrik

    2011-01-01

    Since the mid-1970s, the computer has played an increasingly pivotal role in the field of ocean acoustics. Faster and less expensive than actual ocean experiments, and capable of accommodating the full complexity of the acoustic problem, numerical models are now standard research tools in ocean laboratories. The progress made in computational ocean acoustics over the last thirty years is summed up in this authoritative and innovatively illustrated new text. Written by some of the field's pioneers, all Fellows of the Acoustical Society of America, Computational Ocean Acoustics presents the latest numerical techniques for solving the wave equation in heterogeneous fluid–solid media. The authors discuss various computational schemes in detail, emphasizing the importance of theoretical foundations that lead directly to numerical implementations for real ocean environments. To further clarify the presentation, the fundamental propagation features of the techniques are illustrated in color. Computational Ocean A...

  13. Research on cloud computing solutions

    OpenAIRE

    Liudvikas Kaklauskas; Vaida Zdanytė

    2015-01-01

    Cloud computing can be defined as a new style of computing in which dynamically scalable and often virtualized resources are provided as services over the Internet. Advantages of the cloud computing technology include cost savings, high availability, and easy scalability. Voas and Zhang adapted six phases of computing paradigms, from dummy terminals/mainframes, to PCs, networking computing, to grid and cloud computing. There are four types of cloud computing: public cloud, private cloud, ...

  14. Computing in high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Watase, Yoshiyuki

    1991-09-15

    The increasingly important role played by computing and computers in high energy physics is displayed in the 'Computing in High Energy Physics' series of conferences, bringing together experts in different aspects of computing - physicists, computer scientists, and vendors.

  15. Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources

    International Nuclear Information System (INIS)

    Evans, D; Fisk, I; Holzman, B; Pordes, R; Tiradani, A; Melo, A; Sheldon, P; Metson, S

    2011-01-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost-structure and transient nature of EC2 services makes them inappropriate for some CMS production services and functions. We also found that the resources are not truly 'on-demand' as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a University, and conclude that it is most cost effective to purchase dedicated resources for the 'base-line' needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.

  16. Computer algebra and operators

    Science.gov (United States)

    Fateman, Richard; Grossman, Robert

    1989-01-01

    The symbolic computation of operator expansions is discussed. Some of the capabilities that prove useful when performing computer algebra computations involving operators are considered. These capabilities may be broadly divided into three areas: the algebraic manipulation of expressions from the algebra generated by operators; the algebraic manipulation of the actions of the operators upon other mathematical objects; and the development of appropriate normal forms and simplification algorithms for operators and their actions. Brief descriptions are given of the computer algebra computations that arise when working with various operators and their actions.
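
    The kind of manipulation described here can be sketched with a general-purpose computer algebra system. The snippet below uses SymPy's noncommutative symbols to expand an operator product while preserving operator ordering; it is an illustration, not the system discussed in the record.

      from sympy import expand, symbols

      # Noncommutative symbols model operators: A*B and B*A stay distinct.
      A, B = symbols("A B", commutative=False)

      print(expand((A + B)**2))    # A**2 + A*B + B*A + B**2
      print(expand(A*B - B*A))     # the commutator does not collapse to 0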

  17. Computing in high energy physics

    International Nuclear Information System (INIS)

    Watase, Yoshiyuki

    1991-01-01

    The increasingly important role played by computing and computers in high energy physics is displayed in the 'Computing in High Energy Physics' series of conferences, bringing together experts in different aspects of computing - physicists, computer scientists, and vendors

  18. Living with Computers. Young Danes' Uses of and Thoughts on the Uses of Computers

    DEFF Research Database (Denmark)

    Stald, Gitte Bang

    1998-01-01

    Young Danes, computers, users, super users, non-users, computer access.

  19. Gender and Computers: Two Surveys of Computer-Related Attitudes.

    Science.gov (United States)

    Wilder, Gita; And Others

    1985-01-01

    Describes two surveys used to (1) determine sex differences in attitudes toward computers and video games among schoolchildren, and the relationship of these attitudes to attitudes about science, math, and writing; and (2) identify sex differences in attitudes toward computing among a select group of highly motivated college freshmen. (SA)

  20. Development of real-time visualization system for Computational Fluid Dynamics on parallel computers

    International Nuclear Information System (INIS)

    Muramatsu, Kazuhiro; Otani, Takayuki; Matsumoto, Hideki; Takei, Toshifumi; Doi, Shun

    1998-03-01

    A real-time visualization system for computational fluid dynamics was developed for a network connecting a parallel computing server and a client terminal. Using the system, a user at the client terminal can visualize the results of a CFD (Computational Fluid Dynamics) simulation while the computation is still running on the server. Through a GUI (Graphical User Interface) on the client terminal, the user is also able to change parameters of the analysis and of the visualization during the calculation. The system carries out both the CFD simulation and the generation of pixel image data on the parallel computer, and compresses the data; the amount of data sent from the parallel computer to the client is therefore so small compared with uncompressed transfer that images appear swiftly and comfortably on the client. Parallelization of image data generation is based on the owner-computes rule. The GUI on the client is built as a Java applet, so real-time visualization is possible on a client PC whenever a Web browser is installed on it. (author)
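
    The compress-then-ship idea at the heart of this system can be illustrated in a few lines. The sketch below uses zlib on a synthetic frame as a stand-in for whatever codec and image pipeline the actual system used.

      import zlib
      import numpy as np

      # Synthetic 512x512 greyscale "frame" (smooth fields compress well).
      x = np.linspace(0.0, 4.0 * np.pi, 512)
      frame = (127 * (1 + np.sin(x)[None, :] * np.cos(x)[:, None])).astype(np.uint8)

      raw = frame.tobytes()
      packed = zlib.compress(raw, 6)        # server side: compress the pixels
      print(len(raw), len(packed))          # far fewer bytes cross the network

      # Client side: inflate and reshape the buffer back into an image.
      restored = np.frombuffer(zlib.decompress(packed), np.uint8).reshape(512, 512)
      assert (restored == frame).all()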

  1. Computer Self-Efficacy: A Practical Indicator of Student Computer Competency in Introductory IS Courses

    Directory of Open Access Journals (Sweden)

    Rex Karsten

    1998-01-01

    Full Text Available Students often receive their first college-level computer training in introductory information systems courses. Students and faculty frequently expect this training to develop a level of student computer competence that will support computer use in future courses. In this study, we applied measures of computer self-efficacy to students in a typical introductory IS course. The measures provided useful evidence that student perceptions of their ability to use computers effectively in the future significantly improved as a result of their training experience. The computer self-efficacy measures also provided enhanced insight into course-related factors of practical concern to IS educators. Study results also suggest computer self-efficacy measures may be a practical and informative means of assessing computer-training outcomes in the introductory IS course context

  2. Foundations of Neuromorphic Computing

    Science.gov (United States)

    2013-05-01

    Final technical report (May 2013, covering 2009 to September 2012) on the foundations of neuromorphic computing, approved for public release. The report contrasts two design paradigms, few sensors with complex computations versus many sensors with simple computation, and discusses challenges with nano-enabled neuromorphic chips.

  3. Energy-Aware Computation Offloading of IoT Sensors in Cloudlet-Based Mobile Edge Computing.

    Science.gov (United States)

    Ma, Xiao; Lin, Chuang; Zhang, Han; Liu, Jianwei

    2018-06-15

    Mobile edge computing is proposed as a promising computing paradigm to relieve the excessive burden of data centers and mobile networks, which is induced by the rapid growth of Internet of Things (IoT). This work introduces the cloud-assisted multi-cloudlet framework to provision scalable services in cloudlet-based mobile edge computing. Due to the constrained computation resources of cloudlets and limited communication resources of wireless access points (APs), IoT sensors with identical computation offloading decisions interact with each other. To optimize the processing delay and energy consumption of computation tasks, theoretic analysis of the computation offloading decision problem of IoT sensors is presented in this paper. In more detail, the computation offloading decision problem of IoT sensors is formulated as a computation offloading game and the condition of Nash equilibrium is derived by introducing the tool of a potential game. By exploiting the finite improvement property of the game, the Computation Offloading Decision (COD) algorithm is designed to provide decentralized computation offloading strategies for IoT sensors. Simulation results demonstrate that the COD algorithm can significantly reduce the system cost compared with the random-selection algorithm and the cloud-first algorithm. Furthermore, the COD algorithm can scale well with increasing IoT sensors.
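
    The finite improvement property that the COD algorithm exploits can be demonstrated with a toy congestion game: sensors repeatedly make unilateral cost-reducing switches until no one can improve, which is a Nash equilibrium. The cost model below is invented for illustration and is not the paper's formulation.

      # Toy congestion game: each sensor computes locally (fixed cost) or
      # offloads (cost grows with the number of offloaders). Iterated
      # unilateral improvements must terminate at a Nash equilibrium
      # (the finite improvement property of potential games).
      LOCAL_COST = 5.0

      def offload_cost(n_offloaders):
          return 1.0 + 2.0 * n_offloaders   # invented congestion cost

      def improvement_dynamics(n_sensors=6):
          choice = ["local"] * n_sensors
          improved = True
          while improved:
              improved = False
              for i in range(n_sensors):
                  n_off = sum(c == "offload" for c in choice)
                  if choice[i] == "local" and offload_cost(n_off + 1) < LOCAL_COST:
                      choice[i] = "offload"   # switching strictly lowers cost
                      improved = True
                  elif choice[i] == "offload" and LOCAL_COST < offload_cost(n_off):
                      choice[i] = "local"
                      improved = True
          return choice

      print(improvement_dynamics())   # a stable mix: no sensor wants to switch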

  4. Cloud Computing Governance Lifecycle

    Directory of Open Access Journals (Sweden)

    Soňa Karkošková

    2016-06-01

    Full Text Available Externally provisioned cloud services enable flexible and on-demand sourcing of IT resources. Cloud computing introduces new challenges such as the need for business process redefinition, the establishment of specialized governance, management, organizational structures and relationships with external providers, and the management of new types of risk arising from dependency on external providers. There is a general consensus that cloud computing brings many benefits in addition to challenges, but it is unclear how to achieve them. Cloud computing governance helps to create business value by obtaining benefits from the use of cloud computing services while optimizing investment and risk. The challenge organizations face in relation to governing cloud services is how to design and implement cloud computing governance so as to gain the expected benefits. This paper aims to provide guidance on the implementation activities of the proposed Cloud computing governance lifecycle from the cloud consumer perspective. The proposed model is based on the SOA Governance Framework and consists of a lifecycle for implementation and continuous improvement of the cloud computing governance model.

  5. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    .... "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  6. Computational invariant theory

    CERN Document Server

    Derksen, Harm

    2015-01-01

    This book is about the computational aspects of invariant theory. Of central interest is the question how the invariant ring of a given group action can be calculated. Algorithms for this purpose form the main pillars around which the book is built. There are two introductory chapters, one on Gröbner basis methods and one on the basic concepts of invariant theory, which prepare the ground for the algorithms. Then algorithms for computing invariants of finite and reductive groups are discussed. Particular emphasis lies on interrelations between structural properties of invariant rings and computational methods. Finally, the book contains a chapter on applications of invariant theory, covering fields as disparate as graph theory, coding theory, dynamical systems, and computer vision. The book is intended for postgraduate students as well as researchers in geometry, computer algebra, and, of course, invariant theory. The text is enriched with numerous explicit examples which illustrate the theory and should be ...
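
    Gröbner basis methods, the subject of one of the book's introductory chapters, are available in general-purpose systems. A small SymPy example follows; it illustrates the tool, not code from the book.

      from sympy import groebner, symbols

      x, y = symbols("x y")

      # Groebner basis of the ideal <x**2 + y, x*y - 1> in lex order.
      G = groebner([x**2 + y, x*y - 1], x, y, order="lex")
      print(G)

      # Reduce a polynomial modulo the basis; a zero remainder would
      # certify membership in the ideal.
      print(G.reduce(x**3 + x*y**2)[1])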

  7. Perceptually-Inspired Computing

    Directory of Open Access Journals (Sweden)

    Ming Lin

    2015-08-01

    Full Text Available Human sensory systems allow individuals to see, hear, touch, and interact with the surrounding physical environment. Understanding human perception and its limits enables us to better exploit the psychophysics of human perceptual systems to design more efficient, adaptive algorithms and develop perceptually-inspired computational models. In this talk, I will survey some recent efforts on perceptually-inspired computing with applications to crowd simulation and multimodal interaction. In particular, I will present data-driven personality modeling based on the results of user studies, example-guided physics-based sound synthesis using auditory perception, as well as perceptually-inspired simplification for multimodal interaction. These perceptually guided principles can be used to accelerate multi-modal interaction and visual computing, thereby creating more natural human-computer interaction and providing more immersive experiences. I will also present their use in interactive applications for entertainment, such as video games, computer animation, and shared social experience. I will conclude by discussing possible future research directions.

  8. Quantum Computing

    Indian Academy of Sciences (India)

    In the first part of this article, we had looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and some practical difficulties in building such a device. Quantum Algorithms.

  9. Volunteer Computing for Science Gateways

    OpenAIRE

    Anderson, David

    2017-01-01

    This poster offers information about volunteer computing for science gateways that provide high-throughput computing services. Volunteer computing can be used to obtain computing power; it increases the visibility of the gateway to the general public while adding computing capacity at little cost.

  10. Development of a computer-aided digital reactivity computer system for PWRs

    International Nuclear Information System (INIS)

    Chung, S.-K.; Sung, K.-Y.; Kim, D.; Cho, D.-Y.

    1993-01-01

    Reactor physics tests at initial startup and after reloading are performed to verify the nuclear design and to ensure safe operation. Two kinds of reactivity computers, analog and digital, have been widely used in pressurized water reactor (PWR) core physics tests. The test data of both reactivity computers are displayed only on a strip chart recorder, and these data are managed by hand, so the accuracy of the test results depends on operator expertise and experience. This paper describes the development of the computer-aided digital reactivity computer system (DRCS), which is enhanced by system management software and an improved system for the application of the PWR core physics test
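
    Digital reactivity computers of this kind conventionally reconstruct reactivity from the measured neutron signal by inverse point kinetics. The one-delayed-group sketch below is a generic illustration with made-up constants, not the DRCS algorithm itself.

      import numpy as np

      # Inverse point kinetics with one delayed-neutron group
      # (all constants are illustrative, not plant data).
      beta, lam, Lam = 0.0065, 0.08, 2.0e-5   # delayed fraction, decay const, gen. time
      dt = 0.01
      t = np.arange(0.0, 10.0, dt)
      n = np.exp(0.05 * t)                    # synthetic measured neutron density

      C = beta * n[0] / (Lam * lam)           # equilibrium precursor concentration
      rho = np.zeros_like(n)
      for k in range(1, len(n)):
          C += dt * (beta * n[k - 1] / Lam - lam * C)   # precursor balance
          dndt = (n[k] - n[k - 1]) / dt
          rho[k] = beta + Lam * dndt / n[k] - Lam * lam * C / n[k]

      print(rho[-1])   # small positive reactivity for a slow exponential rise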

  11. Modern computer hardware and the role of central computing facilities in particle physics

    International Nuclear Information System (INIS)

    Zacharov, V.

    1981-01-01

    Important recent changes in the hardware technology of computer system components are reviewed, and the impact of these changes assessed on the present and future pattern of computing in particle physics. The place of central computing facilities is particularly examined, to answer the important question as to what, if anything, should be their future role. Parallelism in computing system components is considered to be an important property that can be exploited with advantage. The paper includes a short discussion of the position of communications and network technology in modern computer systems. (orig.)

  12. Computer Assisted Instructional Design for Computer-Based Instruction. Final Report. Working Papers.

    Science.gov (United States)

    Russell, Daniel M.; Pirolli, Peter

    Recent advances in artificial intelligence and the cognitive sciences have made it possible to develop successful intelligent computer-aided instructional systems for technical and scientific training. In addition, computer-aided design (CAD) environments that support the rapid development of such computer-based instruction have also been recently…

  13. COMPUTING: International symposium

    International Nuclear Information System (INIS)

    Anon.

    1984-01-01

    Recent Developments in Computing, Processor, and Software Research for High Energy Physics, a four-day international symposium, was held in Guanajuato, Mexico, from 8-11 May, with 112 attendees from nine countries. The symposium was the third in a series of meetings exploring activities in leading-edge computing technology in both processor and software research and their effects on high energy physics. Topics covered included fixed-target on- and off-line reconstruction processors; lattice gauge and general theoretical processors and computing; multiprocessor projects; electron-positron collider on- and off-line reconstruction processors; state-of-the-art in university computer science and industry; software research; accelerator processors; and proton-antiproton collider on- and off-line reconstruction processors

  14. Quantum steady computation

    International Nuclear Information System (INIS)

    Castagnoli, G.

    1991-01-01

    This paper reports that current conceptions of quantum mechanical computers inherit from conventional digital machines two apparently interacting features, machine imperfection and temporal development of the computational process. On account of machine imperfection, the process would become ideally reversible only in the limiting case of zero speed. Therefore the process is irreversible in practice and cannot be considered to be a fundamental quantum one. By giving up classical features and using a linear, reversible and non-sequential representation of the computational process - not realizable in classical machines - the process can be identified with the mathematical form of a quantum steady state. This form of steady quantum computation would seem to have an important bearing on the notion of cognition

  15. Quantum steady computation

    Energy Technology Data Exchange (ETDEWEB)

    Castagnoli, G. (Dipt. di Informatica, Sistemistica, Telematica, Univ. di Genova, Viale Causa 13, 16145 Genova (IT))

    1991-08-10

    This paper reports that current conceptions of quantum mechanical computers inherit from conventional digital machines two apparently interacting features, machine imperfection and temporal development of the computational process. On account of machine imperfection, the process would become ideally reversible only in the limiting case of zero speed. Therefore the process is irreversible in practice and cannot be considered to be a fundamental quantum one. By giving up classical features and using a linear, reversible and non-sequential representation of the computational process - not realizable in classical machines - the process can be identified with the mathematical form of a quantum steady state. This form of steady quantum computation would seem to have an important bearing on the notion of cognition.

  16. Scalable optical quantum computer

    International Nuclear Information System (INIS)

    Manykin, E A; Mel'nichenko, E V

    2014-01-01

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr³⁺, regularly located in the lattice of the orthosilicate (Y₂SiO₅) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  17. Computers: Instruments of Change.

    Science.gov (United States)

    Barkume, Megan

    1993-01-01

    Discusses the impact of computers in the home, the school, and the workplace. Looks at changes in computer use by occupations and by industry. Provides information on new job titles in computer occupations. (JOW)

  18. Computational fluid dynamics on a massively parallel computer

    Science.gov (United States)

    Jespersen, Dennis C.; Levit, Creon

    1989-01-01

    A finite difference code was implemented for the compressible Navier-Stokes equations on the Connection Machine, a massively parallel computer. The code is based on the ARC2D/ARC3D program and uses the implicit factored algorithm of Beam and Warming. The code uses odd-even elimination to solve linear systems. Timings and computation rates are given for the code, and a comparison is made with a Cray XMP.
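
    The data-parallel style that suits a machine like the Connection Machine, updating every grid point at once with whole-array operations, can be imitated in NumPy. The sketch below is a simple explicit diffusion step for illustration, not the implicit factored Beam-Warming scheme the code actually uses.

      import numpy as np

      # Whole-array stencil update: every interior point advances at once,
      # in the data-parallel spirit of the Connection Machine.
      def diffuse(u, nu=0.1, steps=100):
          for _ in range(steps):
              u[1:-1, 1:-1] += nu * (u[2:, 1:-1] + u[:-2, 1:-1] +
                                     u[1:-1, 2:] + u[1:-1, :-2] -
                                     4.0 * u[1:-1, 1:-1])
          return u

      u = np.zeros((64, 64))
      u[32, 32] = 1.0           # point disturbance
      print(diffuse(u).max())   # the spike spreads out and flattens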

  19. Contracting for Computer Software in Standardized Computer Languages

    OpenAIRE

    Brannigan, Vincent M.; Dayhoff, Ruth E.

    1982-01-01

    The interaction between standardized computer languages and contracts for programs which use these languages is important to the buyer or seller of software. The rationale for standardization, the problems in standardizing computer languages, and the difficulties of determining whether the product conforms to the standard are issues which must be understood. The contract law processes of delivery, acceptance testing, acceptance, rejection, and revocation of acceptance are applicable to the co...

  20. The CMS Computing Model

    International Nuclear Information System (INIS)

    Bonacorsi, D.

    2007-01-01

    The CMS experiment at LHC has developed a baseline Computing Model addressing the needs of a computing system capable to operate in the first years of LHC running. It is focused on a data model with heavy streaming at the raw data level based on trigger, and on the achievement of the maximum flexibility in the use of distributed computing resources. The CMS distributed Computing Model includes a Tier-0 centre at CERN, a CMS Analysis Facility at CERN, several Tier-1 centres located at large regional computing centres, and many Tier-2 centres worldwide. The workflows have been identified, along with a baseline architecture for the data management infrastructure. This model is also being tested in Grid Service Challenges of increasing complexity, coordinated with the Worldwide LHC Computing Grid community

  1. Computer self-efficacy and computer attitude as correlates of ...

    African Journals Online (AJOL)

    The Internet as a useful tool that supports teaching and learning is not in full use in most secondary schools in Nigeria hence limiting the students from maximizing the potentials of Internet in advancing their academic pursuits. This study, therefore, examined the extent to which computer self-efficacy and computer attitude ...

  2. The Future of Cloud Computing

    Directory of Open Access Journals (Sweden)

    Anamaroa SIclovan

    2011-12-01

    Full Text Available Cloud computing was, and will be, a new way of providing Internet services and computing resources. This approach builds on many existing services and technologies, such as the Internet, grid computing, and Web services. Cloud computing as a system aims to provide on-demand services that are more acceptable in price and infrastructure. It is precisely the transition from the computer to a service offered to consumers as a product delivered online. This represents an advantage for the organization regarding both cost and the opportunity for new business. This paper presents future perspectives in cloud computing and discusses some issues of the cloud computing paradigm. It is a theoretical paper. Keywords: Cloud Computing, Pay-per-use

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  4. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  5. The Computational Physics Program of the national MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1989-01-01

    Since June 1974, the MFE Computer Center has been engaged in a significant computational physics effort. The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The Computational Physics Group has been involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to tokamaks and compact toroids. A third area is the investigation of kinetic instabilities using a 3-D particle code; this work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence have been under examination, with the hope of being able to explain anomalous transport. Also, we are collaborating in an international effort to evaluate fully three-dimensional linear stability of toroidal devices. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers. A summary of these programs is included in this paper. 6 tabs

  6. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance Computing. The ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  7. On several computer-oriented studies

    International Nuclear Information System (INIS)

    Takahashi, Ryoichi

    1982-01-01

    To fully utilize digital techniques for solving various difficult problems, nuclear engineers have recourse to computer-oriented approaches. The current trends in such fields as optimization theory, control system theory and computational fluid dynamics reflect the ability to use computers to obtain numerical solutions to complex problems. Special-purpose computers will be used as integral parts of problem-solving systems to process large amounts of data, to implement control laws and even to support decision-making. Many problem-solving systems designed in the future will incorporate special-purpose computers as system components. The optimum use of computer systems is discussed: why energy models, energy databases and big computers are used; why the economic process computer will be allocated to nuclear plants in the future; why the supercomputer should be demonstrated at once. (Mori, K.)

  8. Experiments in computing: a survey.

    Science.gov (United States)

    Tedre, Matti; Moisseinen, Nella

    2014-01-01

    Experiments play a central role in science. The role of experiments in computing is, however, unclear. Questions about the relevance of experiments in computing attracted little attention until the 1980s. As the discipline then saw a push towards experimental computer science, a variety of technically, theoretically, and empirically oriented views on experiments emerged. As a consequence of those debates, today's computing fields use experiments and experiment terminology in a variety of ways. This paper analyzes experimentation debates in computing. It presents five ways in which debaters have conceptualized experiments in computing: feasibility experiment, trial experiment, field experiment, comparison experiment, and controlled experiment. This paper has three aims: to clarify experiment terminology in computing; to contribute to disciplinary self-understanding of computing; and, due to computing's centrality in other fields, to promote understanding of experiments in modern science in general.

  9. Heterogeneous compute in computer vision: OpenCL in OpenCV

    Science.gov (United States)

    Gasparakis, Harris

    2014-02-01

    We explore the relevance of Heterogeneous System Architecture (HSA) in Computer Vision, both as a long term vision, and as a near term emerging reality via the recently ratified OpenCL 2.0 Khronos standard. After a brief review of OpenCL 1.2 and 2.0, including HSA features such as Shared Virtual Memory (SVM) and platform atomics, we identify what genres of Computer Vision workloads stand to benefit by leveraging those features, and we suggest a new mental framework that replaces GPU compute with hybrid HSA APU compute. As a case in point, we discuss, in some detail, popular object recognition algorithms (part-based models), emphasizing the interplay and concurrent collaboration between the GPU and CPU. We conclude by describing how OpenCL has been incorporated in OpenCV, a popular open source computer vision library, emphasizing recent work on the Transparent API, to appear in OpenCV 3.0, which unifies the native CPU and OpenCL execution paths under a single API, allowing the same code to execute either on the CPU or on an OpenCL-enabled device, without even recompiling.
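
    The Transparent API mentioned above is easy to demonstrate: wrapping an image in cv2.UMat lets the same OpenCV call be dispatched to an OpenCL device when one is available. The image below is synthetic so the snippet stays self-contained.

      import cv2
      import numpy as np

      # Same call, two execution targets: a numpy array runs on the CPU,
      # while a cv2.UMat may be dispatched to an OpenCL device.
      img = (np.random.rand(480, 640, 3) * 255).astype(np.uint8)   # stand-in image

      gray_cpu = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)             # native path
      gray_ocl = cv2.cvtColor(cv2.UMat(img), cv2.COLOR_BGR2GRAY)   # T-API path

      # UMat results are copied back to the host with .get().
      print(np.array_equal(gray_cpu, gray_ocl.get()))              # expected True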

  10. Tracking and computing

    International Nuclear Information System (INIS)

    Niederer, J.

    1983-01-01

    This note outlines several ways in which large scale simulation computing and programming support may be provided to the SSC design community. One aspect of the problem is getting supercomputer power without the high cost and long lead times of large scale institutional computing. Another aspect is the blending of modern programming practices with more conventional accelerator design programs in ways that do not also swamp designers with the details of complicated computer technology

  11. Parallel computing works!

    CERN Document Server

    Fox, Geoffrey C; Messina, Guiseppe C

    2014-01-01

    A clear illustration of how parallel computers can be successfully applied to large-scale scientific computations. This book demonstrates how a variety of applications in physics, biology, mathematics and other sciences were implemented on real parallel computers to produce new scientific results. It investigates issues of fine-grained parallelism relevant for future supercomputers with particular emphasis on hypercube architecture. The authors describe how they used an experimental approach to configure different massively parallel machines, design and implement basic system software, and develop

  12. Discrete computational structures

    CERN Document Server

    Korfhage, Robert R

    1974-01-01

    Discrete Computational Structures describes discrete mathematical concepts that are important to computing, covering necessary mathematical fundamentals, computer representation of sets, graph theory, storage minimization, and bandwidth. The book also explains the conceptual framework (Gorn trees, searching, subroutines) and directed graphs (flowcharts, critical paths, information networks). The text discusses algebra, particularly as it applies to computing, and concentrates on semigroups, groups, lattices, and propositional calculus, including a new tabular method of Boolean function minimization. The text emphasize

  13. COMPUTER GAMES AND EDUCATION

    OpenAIRE

    Sukhov, Anton

    2018-01-01

    This paper is devoted to the research of educational resources and possibilities of modern computer games. The "internal" educational aspects of computer games include the educational mechanism (a separate or integrated "tutorial") and the representation of a real or even fantastic educational process within virtual worlds. The "external" dimension represents the educational opportunities of computer games for personal and professional development in different genres of computer games (various transport, so...

  14. Mission: Define Computer Literacy. The Illinois-Wisconsin ISACS Computer Coordinators' Committee on Computer Literacy Report (May 1985).

    Science.gov (United States)

    Computing Teacher, 1985

    1985-01-01

    Defines computer literacy and describes a computer literacy course which stresses ethics, hardware, and disk operating systems throughout. Core units on keyboarding, word processing, graphics, database management, problem solving, algorithmic thinking, and programing are outlined, together with additional units on spreadsheets, simulations,…

  15. Medical three-dimensional printing opens up new opportunities in cardiology and cardiac surgery.

    Science.gov (United States)

    Bartel, Thomas; Rivard, Andrew; Jimenez, Alejandro; Mestres, Carlos A; Müller, Silvana

    2018-04-14

    Advanced percutaneous and surgical procedures in structural and congenital heart disease require precise pre-procedural planning and continuous quality control. Although current imaging modalities and post-processing software assist with peri-procedural guidance, their capabilities for spatial conceptualization remain limited in two- and three-dimensional representations. In contrast, 3D printing offers not only improved visualization for procedural planning, but also substantial information on the accuracy of surgical reconstruction and device implantation. Peri-procedural 3D printing has the potential to set standards of quality assurance and individualized healthcare in cardiovascular medicine and surgery. Nowadays, a variety of clinical applications are available that show how accurate 3D computer reformatting and physical 3D printouts of native anatomy, embedded pathology, and implants are, and how they may assist in the development of innovative therapies. Accurate imaging of pathology, including the target region for intervention, its anatomic features, and its spatial relation to the surrounding structures, is critical for selecting the optimal approach and evaluating procedural results. This review describes clinical applications of 3D printing, outlines current limitations, and highlights future implications for quality control, advanced medical education, and training.

  16. '95 computer system operation project

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Taek; Lee, Hae Cho; Park, Soo Jin; Kim, Hee Kyung; Lee, Ho Yeun; Lee, Sung Kyu; Choi, Mi Kyung [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-12-01

    This report describes overall project works related to the operation of mainframe computers, the management of nuclear computer codes, and the project of nuclear computer code conversion. The results of the project are as follows: 1. The operation and maintenance of the three mainframe computers and other utilities. 2. The management of the nuclear computer codes. 3. The finishing of the computer code conversion project. 26 tabs., 5 figs., 17 refs. (Author)

  17. '95 computer system operation project

    International Nuclear Information System (INIS)

    Kim, Young Taek; Lee, Hae Cho; Park, Soo Jin; Kim, Hee Kyung; Lee, Ho Yeun; Lee, Sung Kyu; Choi, Mi Kyung

    1995-12-01

    This report describes overall project works related to the operation of mainframe computers, the management of nuclear computer codes, and the project of nuclear computer code conversion. The results of the project are as follows: 1. The operation and maintenance of the three mainframe computers and other utilities. 2. The management of the nuclear computer codes. 3. The finishing of the computer code conversion project. 26 tabs., 5 figs., 17 refs. (Author)

  18. Using Amazon's Elastic Compute Cloud to scale CMS' compute hardware dynamically.

    CERN Document Server

    Melo, Andrew Malone

    2011-01-01

    Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud-computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost-structure and transient nature of EC2 services makes them inappropriate for some CMS production services and functions. We also found that the resources are not truly on-demand as limits and caps on usage are imposed. Our trial workflows allow us t...

  19. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... When the image slices are reassembled by computer software, the result is a very detailed multidimensional view ...

  20. Utility Computing: Reality and Beyond

    Science.gov (United States)

    Ivanov, Ivan I.

    Utility Computing is not a new concept. It involves organizing and providing a wide range of computing-related services as public utilities. Much like water, gas, electricity and telecommunications, computing as a public utility was announced in 1955. Utility Computing remained a concept for nearly 50 years. Now some models and forms of Utility Computing are emerging, such as storage and server virtualization, grid computing, and automated provisioning. Recent trends in Utility Computing as a complex technology involve business procedures that could profoundly transform the nature of companies' IT services, organizational IT strategies and technology infrastructure, and business models. In the ultimate Utility Computing models, organizations will be able to acquire as many IT services as they need, whenever and wherever they need them. Based on networked businesses and new secure online applications, Utility Computing would facilitate "agility-integration" of IT resources and services within and between virtual companies. With the application of Utility Computing there could be concealment of the complexity of IT, reduction of operational expenses, and conversion of IT costs to variable `on-demand' services. How far should technology, business and society go to adopt Utility Computing forms, modes and models?

  1. Pervasive Computing Support for Hospitals: An Overview of the Activity-Based Computing Project

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Bardram, Jakob E

    2007-01-01

    The activity-based computing project researched pervasive computing support for clinical hospital work. Such technologies have potential for supporting the mobile, collaborative, and disruptive use of heterogeneous embedded devices in a hospital.

  2. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Science.gov (United States)

    2010-10-01

    ... at one site or multiple site licenses, and the format and media in which the software or... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software...

  3. Physicist or computer specialist?

    Energy Technology Data Exchange (ETDEWEB)

    Clifton, J S [University College Hospital, London (United Kingdom)

    1966-06-15

    Since to most clinicians physical and computer science are two of the great mysteries of the world, the physicist in a hospital is expected by clinicians to be fully conversant with, and competent to make profound pronouncements on, all methods of computing, specific computing problems, and the suitability of computing machinery ranging from desk calculators to Atlas. This is not surprising since the proportion of the syllabus devoted to physics and mathematics in an M.B. degree is indeed meagre, and the word 'computer' has been surrounded with an aura of mysticism which suggests that it is some fantastic piece of electronic gadgetry comprehensible only to a veritable genius. The clinician consequently turns to the only scientific colleague with whom he has direct contact, the medical physicist, and expects him to be an authority. The physicist is thus thrust, however unwillingly, into the forefront of the advance of computer assistance to scientific medicine. It is therefore essential for him to acquire sufficient knowledge of computing science to enable him to provide satisfactory answers to the clinicians' queries, to proffer more detailed advice on programming, and to convince clinicians that the computer is really a 'simpleton' which can only add and subtract, and even that only under instruction.

  4. Patterns of students' computer use and relations to their computer and information literacy

    DEFF Research Database (Denmark)

    Bundsgaard, Jeppe; Gerick, Julia

    2017-01-01

    Background: Previous studies have shown that there is a complex relationship between students’ computer and information literacy (CIL) and their use of information and communication technologies (ICT) for both recreational and school use. Methods: This study seeks to dig deeper into these complex...... relations by identifying different patterns of students’ school-related and recreational computer use in the 21 countries participating in the International Computer and Information Literacy Study (ICILS 2013). Results: Latent class analysis (LCA) of the student questionnaire and performance data from......, raising important questions about differences in contexts. Keywords: ICILS, Computer use, Latent class analysis (LCA), Computer and information literacy....

  5. Applications of symbolic algebraic computation

    International Nuclear Information System (INIS)

    Brown, W.S.; Hearn, A.C.

    1979-01-01

    This paper is a survey of applications of systems for symbolic algebraic computation. In most successful applications, calculations that can be taken to a given order by hand are then extended one or two more orders by computer. Furthermore, with a few notable exceptions, these applications also involve numerical computation in some way. Therefore the authors emphasize the interface between symbolic and numerical computation, including: 1. Computations with both symbolic and numerical phases. 2. Data involving both the unpredictable size and shape that typify symbolic computation and the (usually inexact) numerical values that characterize numerical computation. 3. Applications of one field to the other. It is concluded that the fields of symbolic and numerical computation can advance most fruitfully in harmony rather than in competition. (Auth.)
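
    The symbolic/numerical interface the survey emphasizes is routine in modern systems. The snippet below carries a calculation through a symbolic phase (a series expansion) and a numerical phase (fast array evaluation); it is a present-day illustration, not one of the surveyed systems.

      import numpy as np
      from sympy import lambdify, series, sin, symbols

      x = symbols("x")

      # Symbolic phase: take the series of sin(x) a few orders "by computer".
      approx = series(sin(x), x, 0, 8).removeO()   # 7th-order Taylor polynomial
      print(approx)

      # Numerical phase: compile the symbolic result for fast array evaluation.
      f = lambdify(x, approx, "numpy")
      xs = np.linspace(0.0, 1.0, 5)
      print(np.max(np.abs(f(xs) - np.sin(xs))))    # tiny truncation error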

  6. Coping with distributed computing

    International Nuclear Information System (INIS)

    Cormell, L.

    1992-09-01

    The rapid increase in the availability of high performance, cost-effective RISC/UNIX workstations has been both a blessing and a curse. The blessing of having extremely powerful computing engines available on the desk top is well-known to many users. The user has tremendous freedom, flexibility, and control of his environment. That freedom can, however, become the curse of distributed computing. The user must become a system manager to some extent, he must worry about backups, maintenance, upgrades, etc. Traditionally these activities have been the responsibility of a central computing group. The central computing group, however, may find that it can no longer provide all of the traditional services. With the plethora of workstations now found on so many desktops throughout the entire campus or lab, the central computing group may be swamped by support requests. This talk will address several of these computer support and management issues by providing some examples of the approaches taken at various HEP institutions. In addition, a brief review of commercial directions or products for distributed computing and management will be given

  7. Distributed GPU Computing in GIScience

    Science.gov (United States)

    Jiang, Y.; Yang, C.; Huang, Q.; Li, J.; Sun, M.

    2013-12-01

    Geoscientists strive to discover potential principles and patterns hidden inside ever-growing Big Data for scientific discoveries. To better achieve this objective, more capable computing resources are required to process, analyze and visualize Big Data (Ferreira et al., 2003; Li et al., 2013). Current CPU-based computing techniques cannot promptly meet the computing challenges posed by the increasing volume of datasets from different domains, such as social media, earth observation and environmental sensing (Li et al., 2013). Meanwhile, CPU-based computing resources structured as clusters or supercomputers are costly. In the past several years, as GPU-based technology has matured in both capability and performance, GPU-based computing has emerged as a new computing paradigm. Compared to the traditional microprocessor, the modern GPU, as a compelling alternative, offers outstanding parallel processing capability with cost-effectiveness and efficiency (Owens et al., 2008), although it was initially designed for graphical rendering in the visualization pipeline. This presentation reports a distributed GPU computing framework for integrating GPU-based computing within a distributed environment. Within this framework, 1) on each single computer, both GPU-based and CPU-based computing resources can be fully utilized to improve the performance of visualizing and processing Big Data; 2) within a network environment, a variety of computers can be used to build a virtual supercomputer to support CPU-based and GPU-based computing in a distributed computing environment; 3) GPUs, as specific graphics-targeted devices, are used to greatly improve rendering efficiency in distributed geo-visualization, especially for 3D/4D visualization. Key words: Geovisualization, GIScience, Spatiotemporal Studies. Reference: 1. Ferreira de Oliveira, M. C., & Levkowitz, H. (2003). From visual data exploration to visual data mining: A survey. Visualization and Computer Graphics, IEEE
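
    In the spirit of point 1) above (a sketch under stated assumptions, not the authors' framework), the following dispatches a toy raster statistic to the GPU through the CuPy library when a CUDA device is available and falls back to NumPy on the CPU otherwise:

        import numpy as np

        try:
            import cupy as cp      # assumes a CUDA-capable GPU and CuPy installed
            GPU = True
        except ImportError:
            GPU = False

        def tile_statistic(tile: np.ndarray) -> float:
            """Toy per-tile statistic, run on GPU when present, CPU otherwise."""
            if GPU:
                d = cp.asarray(tile)               # host -> device transfer
                return float(cp.mean(cp.sqrt(d)))  # computed on the GPU
            return float(np.mean(np.sqrt(tile)))   # CPU fallback

        tiles = [np.random.rand(1024, 1024) for _ in range(4)]  # placeholder data
        print([round(tile_statistic(t), 4) for t in tiles])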

  8. Computational Continuum Mechanics

    CERN Document Server

    Shabana, Ahmed A

    2011-01-01

    This text presents the theory of continuum mechanics using computational methods. Ideal for students and researchers, the second edition features a new chapter on computational geometry and finite element analysis.

  9. Private quantum computation: an introduction to blind quantum computing and related protocols

    Science.gov (United States)

    Fitzsimons, Joseph F.

    2017-06-01

    Quantum technologies hold the promise of not only faster algorithmic processing of data, via quantum computation, but also of more secure communications, in the form of quantum cryptography. In recent years, a number of protocols have emerged which seek to marry these concepts for the purpose of securing computation rather than communication. These protocols address the task of securely delegating quantum computation to an untrusted device while maintaining the privacy, and in some instances the integrity, of the computation. We present a review of the progress to date in this emerging area.
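
    One classical ingredient of such protocols can be shown in a few lines: in measurement-based blind quantum computation (in the spirit of the Broadbent-Fitzsimons-Kashefi protocol), the client one-time-pads each measurement angle so the server never learns the true computation. The sketch below shows only that hiding step, with all parameters chosen for illustration:

        import math
        import random

        STEP = math.pi / 4   # protocol angles are multiples of pi/4

        def blind_angle(phi: float) -> tuple[float, int, float]:
            """Return (theta, r, delta): the secret pad, a secret bit, and the
            padded angle actually reported to the untrusted server."""
            theta = random.randrange(8) * STEP              # secret random pad
            r = random.randrange(2)                         # secret outcome flip
            delta = (phi + theta + r * math.pi) % (2 * math.pi)
            return theta, r, delta

        phi = 3 * STEP                 # the client's true measurement angle
        theta, r, delta = blind_angle(phi)
        print(f"true angle {phi:.3f} hidden inside reported angle {delta:.3f}")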

  10. Scalable optical quantum computer

    Energy Technology Data Exchange (ETDEWEB)

    Manykin, E A; Mel'nichenko, E V [Institute for Superconductivity and Solid-State Physics, Russian Research Centre 'Kurchatov Institute', Moscow (Russian Federation)]

    2014-12-31

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr{sup 3+}, regularly located in the lattice of the orthosilicate (Y{sub 2}SiO{sub 5}) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  11. Symmetry Groups for the Decomposition of Reversible Computers, Quantum Computers, and Computers in between

    Directory of Open Access Journals (Sweden)

    Alexis De Vos

    2011-06-01

    Full Text Available Whereas quantum computing circuits follow the symmetries of the unitary Lie group, classical reversible computation circuits follow the symmetries of a finite group, i.e., the symmetric group. We confront the decomposition of an arbitrary classical reversible circuit with w bits and the decomposition of an arbitrary quantum circuit with w qubits. Both decompositions use the control gate as building block, i.e., a circuit transforming only one (qu)bit, the transformation being controlled by the other w−1 (qu)bits. We explain why the former circuit can be decomposed into 2w − 1 control gates, whereas the latter circuit needs 2^w − 1 control gates. We investigate whether computer circuits, not based on the full unitary group but instead on a subgroup of the unitary group, may be decomposable either into 2w − 1 or into 2^w − 1 control gates.
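
    A quick worked comparison of the two counts from the abstract, 2w − 1 control gates for the classical decomposition versus 2^w − 1 for the quantum one:

        # Tabulate both gate counts for small register widths w.
        for w in range(1, 6):
            classical = 2 * w - 1    # classical reversible circuit on w bits
            quantum = 2 ** w - 1     # quantum circuit on w qubits
            print(f"w={w}: classical {classical:2d}, quantum {quantum:2d}")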

  12. Advances in physiological computing

    CERN Document Server

    Fairclough, Stephen H

    2014-01-01

    This edited collection will provide an overview of the field of physiological computing, i.e. the use of physiological signals as input for computer control. It will cover a breadth of current research, from brain-computer interfaces to telemedicine.
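
    The core loop of physiological computing reduces to mapping a physiological reading onto a control decision; the thresholds and actions below are invented purely for illustration:

        # Hypothetical biofeedback rule: adapt an interface to heart rate.
        def control_action(heart_rate: float) -> str:
            if heart_rate > 100:
                return "simplify interface"    # user appears stressed
            if heart_rate < 60:
                return "increase difficulty"   # user appears under-aroused
            return "no change"

        for hr in [55, 72, 110]:               # placeholder sensor readings
            print(hr, "->", control_action(hr))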

  13. Quantum mechanics and computation

    International Nuclear Information System (INIS)

    Cirac Sasturain, J. I.

    2000-01-01

    We review how some of the basic principles of Quantum Mechanics can be used in the field of computation. In particular, we explain why a quantum computer can perform certain tasks in a much more efficient way than the computers we have available nowadays. We give the requirements for a quantum system to be able to implement a quantum computer and illustrate these requirements in some particular physical situations. (Author) 16 refs

  14. Computed tomography for radiographers

    International Nuclear Information System (INIS)

    Brooker, M.

    1986-01-01

    Computed tomography is regarded by many as a complicated union of sophisticated x-ray equipment and computer technology. This book overcomes these complexities. The rigid technicalities of the machinery and the clinical aspects of computed tomography are discussed, including the preparation of patients, both physically and mentally, for scanning. Furthermore, the author also explains how to set up and run a computed tomography department, including advice on how the room should be designed.

  15. Computational physics an introduction

    CERN Document Server

    Vesely, Franz J

    1994-01-01

    Author Franz J. Vesely offers students an introductory text on computational physics, providing them with the important basic numerical/computational techniques. His unique text sets itself apart from others by focusing on specific problems of computational physics. The author also provides a selection of modern fields of research. Students will benefit from the appendixes which offer a short description of some properties of computing and machines and outline the technique of 'Fast Fourier Transformation.'
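
    The 'Fast Fourier Transformation' mentioned in the appendix can be sketched in a dozen lines; this is the standard radix-2 Cooley-Tukey recursion (a textbook version, not the book's own listing), valid for input lengths that are powers of two:

        import cmath

        def fft(a):
            """Radix-2 Cooley-Tukey FFT; len(a) must be a power of two."""
            n = len(a)
            if n == 1:
                return list(a)
            even, odd = fft(a[0::2]), fft(a[1::2])
            tw = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
            return [even[k] + tw[k] for k in range(n // 2)] + \
                   [even[k] - tw[k] for k in range(n // 2)]

        print(fft([1, 2, 3, 4]))  # matches numpy.fft.fft([1, 2, 3, 4])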

  16. CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.

    Science.gov (United States)

    Skrein, Dale

    1994-01-01

    CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)
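
    In the same hands-on spirit (though this is our toy sketch, not CPU SIM itself), a fetch-decode-execute loop for a one-accumulator machine fits in a few lines:

        # Toy accumulator machine: each instruction is an (opcode, operand) pair.
        def run(program):
            acc, pc = 0, 0                       # accumulator, program counter
            while pc < len(program):
                op, arg = program[pc]
                pc += 1                          # fetch advances the counter
                if op == "LOAD":
                    acc = arg
                elif op == "ADD":
                    acc += arg
                elif op == "JNZ" and acc != 0:   # jump if accumulator nonzero
                    pc = arg
                elif op == "HALT":
                    break
            return acc

        # Count down from 3 to 0 by looping over ADD -1.
        print(run([("LOAD", 3), ("ADD", -1), ("JNZ", 1), ("HALT", 0)]))  # 0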

  17. Computational Physics Program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1984-12-01

    The principal objective of the computational physics group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. A summary of the group's activities is presented, including computational studies in MHD equilibria and stability, plasma transport, Fokker-Planck, and efficient numerical and programming algorithms. References are included.

  18. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available Computed tomography (CT) of the head uses special x-ray ... What is CT Scanning of the Head? Computed tomography, more commonly known as a CT or CAT ...

  19. Computational Modeling | Bioenergy | NREL

    Science.gov (United States)

    Computational Modeling: NREL uses computational modeling to increase the ... Cell walls are the source of biofuels and biomaterials, and our modeling investigates their properties. Quantum Mechanical Models: NREL studies chemical and electronic properties and processes to reduce barriers ...

  20. Computer Viruses: An Overview.

    Science.gov (United States)

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)