WorldWideScience

Sample records for macintosh ii computer

  1. Customized Geological Map Patterns for the Macintosh Computer.

    Science.gov (United States)

    Boyer, Paul Slayton

    1986-01-01

    Describes how the graphics capabilities of the Apple Macintosh computer can be used in geological teaching by customizing fill patterns with lithologic symbols. Presents two methods for doing this: creating a dummy document, or changing the pattern resource resident in the operating system. Special symbols can also replace fonts. (TW)

  2. Coping with Computer Viruses: General Discussion and Review of Symantec Anti-Virus for the Macintosh.

    Science.gov (United States)

    Primich, Tracy

    1992-01-01

    Discusses computer viruses that attack the Macintosh and describes Symantec AntiVirus for Macintosh (SAM), a commercial program designed to detect and eliminate viruses; sample screen displays are included. SAM is recommended for use in library settings, as are two public domain virus protection programs. (four references) (MES)

  3. Macintosh Computer Classroom and Laboratory Security: Preventing Unwanted Changes to the System.

    Science.gov (United States)

    Senn, Gary J.; Smyth, Thomas J. C.

    Because of the graphical interface and "openness" of the operating system, Macintosh computers are susceptible to undesirable changes by the user. This presentation discusses the advantages and disadvantages of software packages that offer protection for the Macintosh system. The two basic forms of software security packages include a…

  4. The Library Macintosh at SCIL [Small Computers in Libraries]'88.

    Science.gov (United States)

    Valauskas, Edward J.; And Others

    1988-01-01

    The first of three papers describes the role of Macintosh workstations in a library. The second paper explains why the Macintosh was selected for end-user searching in an academic library, and the third discusses advantages and disadvantages of desktop publishing for librarians. (8 references) (MES)

  5. How to Build an AppleSeed: A Parallel Macintosh Cluster for Numerically Intensive Computing

    Science.gov (United States)

    Decyk, V. K.; Dauger, D. E.

    We have constructed a parallel cluster consisting of a mixture of Apple Macintosh G3 and G4 computers running the Mac OS, and have achieved very good performance on numerically intensive, parallel plasma particle-in-cell simulations. A subset of the MPI message-passing library was implemented in Fortran77 and C. This library enabled us to port code, without modification, from other parallel processors to the Macintosh cluster. Unlike Unix-based clusters, no special expertise in operating systems is required to build and run the cluster. This enables us to move parallel computing from the realm of experts to the mainstream of computing.
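
    The authors' MPI subset (implemented in Fortran77 and C) is not reproduced in this record; as a minimal sketch of the message-passing pattern such clusters run, here is a nearest-neighbor edge exchange using mpi4py as a stand-in. The array size and ring topology are illustrative assumptions, not details from the paper.

    ```python
    # Nearest-neighbor exchange typical of parallel particle-in-cell codes.
    # mpi4py stands in for the authors' MPI subset; run e.g.:
    #   mpiexec -n 4 python pic_halo.py
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()
    size = comm.Get_size()

    local_field = np.random.rand(128)   # this rank's slab of the field grid (assumed size)

    left = (rank - 1) % size            # periodic ring of processes
    right = (rank + 1) % size
    recv_edge = np.empty(1)

    # Send our rightmost grid value to the right neighbor while receiving
    # the corresponding edge value from the left neighbor.
    comm.Sendrecv(sendbuf=local_field[-1:], dest=right,
                  recvbuf=recv_edge, source=left)
    print(f"rank {rank}: got edge value {recv_edge[0]:.3f} from rank {left}")
    ```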

  6. Macintosh Plus

    CERN Document Server

    1986-01-01

    Apple introduced the Macintosh Plus on January 16, 1986. The Macintosh Plus has an 8 MHz 68000 processor and an internal 800K floppy disk drive. It supports up to 4 MB of RAM. The Plus is a significant improvement over the previous compact Macs, primarily due to the addition of the SCSI bus. Previous Macs did not have SCSI, making it more difficult to find a suitable external hard drive: drives had to connect through the floppy drive port, the printer port, or the modem port, and such drives are considerably slower (as much as 4 times slower) than external SCSI hard drives. The Macintosh Plus is a very important computer in the history of Apple computers. It established many of the standards that Apple followed for over a decade.

  7. An Evaluation of Windows-Based Computer Forensics Application Software Running on a Macintosh

    OpenAIRE

    Gregory H. Carlton

    2008-01-01

    The two most common computer forensics applications perform exclusively on Microsoft Windows Operating Systems, yet contemporary computer forensics examinations frequently encounter one or more of the three most common operating system environments, namely Windows, OS-X, or some form of UNIX or Linux. Additionally, government and private computer forensics laboratories frequently encounter budget constraints that limit their access to computer hardware. Currently, Macintosh computer systems are marketed with the ability to accommodate these three common operating system environments, including Windows XP in native and virtual environments.

  8. An Evaluation of Windows-Based Computer Forensics Application Software Running on a Macintosh

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2008-09-01

    Full Text Available The two most common computer forensics applications perform exclusively on Microsoft Windows Operating Systems, yet contemporary computer forensics examinations frequently encounter one or more of the three most common operating system environments, namely Windows, OS-X, or some form of UNIX or Linux. Additionally, government and private computer forensics laboratories frequently encounter budget constraints that limit their access to computer hardware. Currently, Macintosh computer systems are marketed with the ability to accommodate these three common operating system environments, including Windows XP in native and virtual environments. We performed a series of experiments to measure the functionality and performance of the two most commonly used Windows-based computer forensics applications on a Macintosh running Windows XP in native mode and in two virtual environments relative to a similarly configured Dell personal computer. The research results are directly beneficial to practitioners, and the process illustrates effective pedagogy whereby students were engaged in applied research.

  9. A Macintosh based data system for array spectrometers (Poster)

    Science.gov (United States)

    Bregman, J.; Moss, N.

    An interactive data acquisition and reduction system has been assembled by combining a Macintosh computer with an instrument controller (an Apple II computer) via an RS-232 interface. The data system provides flexibility for operating different linear array spectrometers. The standard Macintosh interface is used to provide ease of operation and to allow transferring the reduced data to commercial graphics software.
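
    As a rough sketch of what the Macintosh side of such an RS-232 link does, the snippet below polls an instrument controller and parses one ASCII data frame. It uses the modern pyserial library as a stand-in for the original serial driver; the port name, command string, and frame format are all assumptions for illustration.

    ```python
    # Poll an instrument controller over RS-232 and read one data frame.
    # pyserial is a modern stand-in; port name and protocol are hypothetical.
    import serial

    with serial.Serial("/dev/tty.usbserial0", baudrate=9600, timeout=2) as port:
        port.write(b"READ\r")                      # hypothetical "send a frame" command
        frame = port.read_until(b"\r")             # one CR-terminated frame
        counts = [int(v) for v in frame.split()]   # assumed: ASCII counts per pixel
        print(f"received {len(counts)} detector channels")
    ```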

  10. System Software 7 Macintosh

    CERN Multimedia

    1991-01-01

    System 7 is a single-user graphical user interface-based operating system for Macintosh computers and was part of the classic Mac OS line of operating systems. It was introduced on May 13, 1991, by Apple Computer. It succeeded System 6, and was the main Macintosh operating system until it was succeeded by Mac OS 8 in 1997. Features added with the System 7 release included virtual memory, personal file sharing, QuickTime, QuickDraw 3D, and an improved user interface. System 7 was the first major evolution of the Macintosh system, bringing a significant improvement in the user interface, improved stability, and many new features such as the ability to use multiple applications at the same time. It was also the last Macintosh operating system whose name contained the word "system"; Macintosh operating systems were later called "Mac OS" (for Macintosh Operating System).

  11. High speed acquisition of multi-parameter data using a Macintosh IIcx

    International Nuclear Information System (INIS)

    Berno, A.; Vogel, J.S.; Caffee, M.

    1990-08-01

    Accelerator mass spectrometry systems based on >3 MV tandem accelerators often use multi-anode ionization detectors and/or time-of-flight detectors to identify individual isotopes through multi-parameter analysis. A Macintosh IIcx has been programmed to collect AMS data from a CAMAC-implemented analyzer and to display the histogrammed individual parameters and a double-parameter array. The computer-CAMAC connection is through a NuBus to CAMAC dataway interface which allows direct addressing to all functions and locations in the crate. The asynchronous data from counting the rare isotope is sorted into a CAMAC memory module by a list sequence controller. Isotope switching is controlled by a one-cycle timing generator. A rate-dependent amount of time is used to transfer the data from the memory module at the end of each timing cycle. The present configuration uses 10 to 75 ms for rates of 500-10000 cps. Parameter analysis occurs during the rest of the 520 ms data collection cycle. Completed measurements of the isotope concentrations of each sample are written to files which are compatible with standard Macintosh databases or other processing programs. The system is inexpensive and operates at speeds comparable to those obtainable using larger computers.
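
    The timing figures quoted above imply a small, rate-dependent readout overhead; the arithmetic below makes that explicit. It is a back-of-the-envelope check using only numbers from the abstract.

    ```python
    # Readout overhead implied by the quoted timing: 10-75 ms of each
    # 520 ms collection cycle is spent draining the CAMAC memory module.
    CYCLE_MS = 520.0

    for rate_cps, transfer_ms in [(500, 10.0), (10_000, 75.0)]:
        overhead = transfer_ms / CYCLE_MS
        analysis_ms = CYCLE_MS - transfer_ms
        print(f"{rate_cps:>6} cps: {overhead:6.1%} of cycle on transfer, "
              f"{analysis_ms:.0f} ms left for parameter analysis")
    ```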

  12. No Special Equipment Required: The Accessibility Features Built into the Windows and Macintosh Operating Systems make Computers Accessible for Students with Special Needs

    Science.gov (United States)

    Kimball, Walter H.; Cohen, Libby G.; Dimmick, Deb; Mills, Rick

    2003-01-01

    The proliferation of computers and other electronic learning devices has made knowledge and communication accessible to people with a wide range of abilities. Both Windows and Macintosh computers have accessibility options to help with many different special needs. This document discusses solutions for: (1) visual impairments; (2) hearing…

  13. 37 CFR 1.96 - Submission of computer program listings.

    Science.gov (United States)

    2010-07-01

    ... specifications (no other format shall be allowed): (i) Computer Compatibility: IBM PC/XT/AT, or compatibles, or Apple Macintosh; (ii) Operating System Compatibility: MS-DOS, MS-Windows, Unix, or Macintosh; (iii) Line...

  14. Why not make a PC cluster of your own? 5. AppleSeed: A Parallel Macintosh Cluster for Scientific Computing

    Science.gov (United States)

    Decyk, Viktor K.; Dauger, Dean E.

    We have constructed a parallel cluster consisting of Apple Macintosh G4 computers running both Classic Mac OS as well as the Unix-based Mac OS X, and have achieved very good performance on numerically intensive, parallel plasma particle-in-cell simulations. Unlike other Unix-based clusters, no special expertise in operating systems is required to build and run the cluster. This enables us to move parallel computing from the realm of experts to the mainstream of computing.

  15. Greek-English Word Processing on the Macintosh.

    Science.gov (United States)

    Rusten, Jeffrey

    1986-01-01

    Discusses the complete Greek-English word processing system of the Apple Macintosh computer. Describes the features of its operating system, shows how the Greek fonts look and work, and enumerates both the advantages and drawbacks of the Macintosh. (SED)

  16. Informatics in radiology (infoRAD): free DICOM image viewing and processing software for the Macintosh computer: what's available and what it can do for you.

    Science.gov (United States)

    Escott, Edward J; Rubinstein, David

    2004-01-01

    It is often necessary for radiologists to use digital images in presentations and conferences. Most imaging modalities produce images in the Digital Imaging and Communications in Medicine (DICOM) format. The image files tend to be large and thus cannot be directly imported into most presentation software, such as Microsoft PowerPoint; the large files also consume storage space. There are many free programs that allow viewing and processing of these files on a personal computer, including conversion to more common file formats such as the Joint Photographic Experts Group (JPEG) format. Free DICOM image viewing and processing software for computers running the Microsoft Windows operating system has already been evaluated. However, many people use the Macintosh (Apple Computer) platform, and a number of programs are available for these users. The World Wide Web was searched for free DICOM image viewing or processing software that was designed for the Macintosh platform or is written in Java and is therefore platform independent. The features of these programs and their usability were evaluated. There are many free programs for the Macintosh platform that enable viewing and processing of DICOM images. (c) RSNA, 2004.
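
    As one concrete example of the DICOM-to-JPEG conversion these programs perform, the sketch below uses the pydicom and Pillow libraries (which are not among the reviewed programs). Window/level handling is simplified to a min-max rescale, and the file name is hypothetical.

    ```python
    # Convert a DICOM slice to an 8-bit JPEG suitable for a slide.
    # pydicom + Pillow illustrate the conversion; real viewers apply
    # proper modality LUTs and window/level settings.
    import numpy as np
    import pydicom
    from PIL import Image

    ds = pydicom.dcmread("slice001.dcm")       # hypothetical input file
    pixels = ds.pixel_array.astype(np.float32)

    pixels -= pixels.min()                     # crude min-max rescale to 0-255
    if pixels.max() > 0:
        pixels *= 255.0 / pixels.max()

    Image.fromarray(pixels.astype(np.uint8)).save("slice001.jpg", quality=90)
    ```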

  17. Think different: applying the old macintosh mantra to the computability of the SUSY auxiliary field problem

    Energy Technology Data Exchange (ETDEWEB)

    Calkins, Mathew; Gates, D.E.A.; Gates, S. James Jr. [Center for String and Particle Theory, Department of Physics, University of Maryland, College Park, MD 20742-4111 (United States); Golding, William M. [Sensors and Electron Devices Directorate, US Army Research Laboratory, Adelphi, Maryland 20783 (United States)]

    2015-04-13

    Starting with valise supermultiplets obtained from 0-branes plus field redefinitions, valise adinkra networks, and the “Garden Algebra,” we discuss an architecture for algorithms that (starting from on-shell theories and, through a well-defined computation procedure), search for off-shell completions. We show in one dimension how to directly attack the notorious “off-shell auxiliary field” problem of supersymmetry with algorithms in the adinkra network-world formulation.

  18. Scientific Graphical Displays on the Macintosh

    Energy Technology Data Exchange (ETDEWEB)

    Grotch, S. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    In many organizations scientists have ready access to more than one computer, often both a workstation (e.g., SUN, HP, SGI) as well as a Macintosh or other PC. The scientist commonly uses the workstation for "number-crunching" and data analysis whereas the Macintosh is relegated to either word processing or serves as a "dumb terminal" to a larger mainframe computer. In an informal poll of my colleagues, very few of them used their Macintoshes for either statistical analysis or for graphical data display. I believe that this state of affairs is particularly unfortunate because over the last few years both the computational capability, and even more so, the software availability for the Macintosh have become quite formidable. In some instances, very powerful tools are now available on the Macintosh that may not exist (or be far too costly) on the so-called "high end" workstations. Many scientists are simply unaware of the wealth of extremely useful, "off-the-shelf" software that already exists on the Macintosh for scientific graphical and statistical analysis.

  19. A user's guide to LUGSAN II. A computer program to calculate and archive lug and sway brace loads for aircraft-carried stores

    Energy Technology Data Exchange (ETDEWEB)

    Dunn, W.N. [Sandia National Labs., Albuquerque, NM (United States). Mechanical and Thermal Environments Dept.

    1998-03-01

    LUG and Sway brace ANalysis (LUGSAN) II is an analysis and database computer program that is designed to calculate store lug and sway brace loads for aircraft captive carriage. LUGSAN II combines the rigid body dynamics code, SWAY85, with a Macintosh HyperCard database to function both as an analysis and archival system. This report describes the LUGSAN II application program, which operates on the Macintosh System (HyperCard 2.2 or later) and includes function descriptions, layout examples, and sample sessions. Although this report is primarily a user's manual, a brief overview of the LUGSAN II computer code is included with suggested resources for programmers.

  20. Computing at Belle II

    International Nuclear Information System (INIS)

    Kuhr, Thomas

    2012-01-01

    Belle II, a next-generation B-factory experiment, will search for new physics effects in a data sample about 50 times larger than the one collected by its predecessor, the Belle experiment. To match the advances in accelerator and detector technology, the computing system and the software have to be upgraded as well. The Belle II computing model is presented and an overview of the distributed computing system and the offline software framework is given.

  1. Vectronic's Power Macintosh G3 (B & W)

    CERN Multimedia

    1999-01-01

    Apple introduced the Power Macintosh G3 Blue and White (B & W) on January 5, 1999. The Power Macintosh G3 line stayed in production until August 1999, and was replaced by the Power Macintosh G4, which used the same chassis. The Power Macintosh G3 originally cost between $1599 and $2900 depending on options. The three original Power Macintosh G3 models shipped with a 300 MHz, 350 MHz, or 400 MHz PowerPC 750 (G3) processor. Just pull on the small round handle on the side of the tower, and the entire side of the computer opens up. The G3's motherboard is mounted on that surface, giving you easy access for upgrading RAM or installed PCI cards. Apple added new ports (USB and the much-anticipated FireWire) that took the place of historic, and quickly becoming antiquated, Mac serial (printer and modem) ports. The Power Macintosh G3 has two USB (12 Mbps) ports, two FireWire (400 Mbps) ports, one 10/100BaseT Ethernet port, an RJ-11 jack for an optional 56K modem, a sound out and sound in jack, and one ADB (Apple D...

  2. Digital optical computer II

    Science.gov (United States)

    Guilfoyle, Peter S.; Stone, Richard V.

    1991-12-01

    OptiComp is currently completing a 32-bit, fully programmable digital optical computer (DOC II) that is designed to operate in a UNIX environment running RISC microcode. OptiComp's DOC II architecture is focused toward parallel microcode implementation where data is input in a dual rail format. By exploiting the physical principles inherent to optics (speed and low power consumption), an architectural balance of optical interconnects and software code efficiency can be achieved, including high fan-in and fan-out. OptiComp's DOC II program is jointly sponsored by the Office of Naval Research (ONR), the Strategic Defense Initiative Office (SDIO), the NASA space station group and Rome Laboratory (USAF). This paper not only describes the motivational basis behind DOC II but also provides an optical overview and architectural summary of the device that allows the emulation of any digital instruction set.

  3. Computer science II essentials

    CERN Document Server

    Raus, Randall

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Computer Science II includes organization of a computer, memory and input/output, coding, data structures, and program development. Also included is an overview of the most commonly

  4. Knowledge-based geographic information systems on the Macintosh computer: a component of the GypsES project

    Science.gov (United States)

    Gregory Elmes; Thomas Millette; Charles B. Yuill

    1991-01-01

    GypsES, a decision-support and expert system for the management of the gypsy moth, addresses five related research problems in a modular, computer-based project. The modules are hazard rating, monitoring, prediction, treatment decision and treatment implementation. One common component is a geographic information system designed to function intelligently. We refer to this...

  5. Power Macintosh 7300/166

    CERN Document Server

    1997-01-01

    The Power Macintosh 7300 was released in 1997 and used the same case as the Power Macintosh 7600. Its main advance was a faster processor. It also had a bigger hard drive (2 GB) and a faster CD-ROM drive (12x instead of 8x). On the other hand, Apple chose to remove the audiovisual connections that were present on all its predecessors in the 7x00 range.

  6. Design and implementation of a Macintosh-CAMAC based system for neutral beam diagnostics

    International Nuclear Information System (INIS)

    Wight, J.; Hong, R.M.; Phillips, J.C.; Lee, R.L.; Colleraine, A.P.; Kim, J.

    1989-12-01

    An automated personal computer based CAMAC data acquisition system is being implemented on the DIII-D neutral beamlines for certain diagnostics. The waterflow calorimetry (WFC) diagnostic is the first system to be upgraded. It includes data acquisition by a Macintosh II computer containing a National Instruments IEEE-488 card and running their LabVIEW software. Macintosh to CAMAC communications are carried out through an IEEE-488 crate controller. The Doppler shift spectroscopy, residual gas analysis, and armor tile infrared image diagnostics will be modified in similar ways. To reduce the demand for Macintosh CPU time, the extensive serial highway data activity is performed by means of a new Kinetic Systems 3982 List Sequencing Crate Controller dedicated to these operations. A simple Local Area Network file server is used to store data from all diagnostics together, in a format readable by a standard commercial database. This reduces the problem of redundant data storage and allows simpler inter-diagnostic analysis. 3 refs., 4 figs
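
    For a sense of what the Macintosh-to-crate-controller conversation looks like, the sketch below issues a query over an IEEE-488 (GPIB) link using pyvisa as a modern stand-in for the LabVIEW driver stack described above. The GPIB address and command string are invented for illustration.

    ```python
    # Query a CAMAC crate controller over IEEE-488 (GPIB).
    # pyvisa stands in for the original LabVIEW/NI-488 stack; the resource
    # address and the "READ:WFC?" command are hypothetical.
    import pyvisa

    rm = pyvisa.ResourceManager()
    crate = rm.open_resource("GPIB0::8::INSTR")   # assumed controller address
    crate.timeout = 5000                          # milliseconds

    reply = crate.query("READ:WFC?")              # hypothetical calorimetry readout
    values = [float(v) for v in reply.split(",")]
    print(f"read {len(values)} water-flow calorimetry channels")
    crate.close()
    ```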

  7. Hamlet on the Macintosh: An Experimental Seminar That Worked.

    Science.gov (United States)

    Strange, William C.

    1987-01-01

    Describes experimental college Shakespeare seminar that used Macintosh computers and software called ELIZA and ADVENTURE to develop character dialogs and adventure games based on Hamlet's characters and plots. Programming languages are examined, particularly their relationship to metaphor, and the use of computers in humanities is discussed. (LRW)

  8. SAGE FOR MACINTOSH (MSAGE) VERSION 1.0 SOLVENT ALTERNATIVES GUIDE - USER'S GUIDE

    Science.gov (United States)

    The guide provides instructions for using the Solvent Alternatives Guide (SAGE) for Macintosh, version 1.0. The guide assumes that the user is familiar with the fundamentals of operating a Macintosh personal computer under the System 7.0 (or higher) operating system. SAGE for ...

  9. Macintosh in the laboratory - an approach

    International Nuclear Information System (INIS)

    Taylor, B.

    1988-01-01

    The high degree of parallelism possible in a distributed configuration of VME multiprocessors minimizes dead-time in the read-out of a large detector and allows sophisticated triggering and filtering systems to be implemented. Experience has also shown that systems based on standard instrumentation can be readily reconfigured as needs change, and enhanced as technology evolves. Apple Macintosh computers can be used as cost-effective software development workstations for such systems and their graphics-oriented user interface has proved well suited to control and monitoring tasks during data-taking. (orig./HSI).

  10. MACSSA (Macintosh Safeguards Systems Analyzer)

    International Nuclear Information System (INIS)

    Argentesi, F.; Costantini, L.; Kohl, M.

    1986-01-01

    This paper discusses MACSSA, a fully interactive menu-driven software system for accountancy of nuclear safeguards systems, written for the Apple Macintosh. Plant inventory and inventory change records can be entered interactively or can be downloaded from a mainframe database. Measurement procedures and instrument parameters can be defined. Partial or total statistics on propagated errors are computed and shown in tabular or graphical form.

  11. Plasma Physics Calculations on a Parallel Macintosh Cluster

    Science.gov (United States)

    Decyk, Viktor; Dauger, Dean; Kokelaar, Pieter

    2000-03-01

    We have constructed a parallel cluster consisting of 16 Apple Macintosh G3 computers running the MacOS, and achieved very good performance on numerically intensive, parallel plasma particle-in-cell simulations. A subset of the MPI message-passing library was implemented in Fortran77 and C. This library enabled us to port code, without modification, from other parallel processors to the Macintosh cluster. For large problems where message packets are large and relatively few in number, performance of 50-150 MFlops/node is possible, depending on the problem. This is fast enough that 3D calculations can be routinely done. Unlike Unix-based clusters, no special expertise in operating systems is required to build and run the cluster. Full details are available on our web site: http://exodus.physics.ucla.edu/appleseed/.

  12. MacVEE - the intimate Macintosh-VME system

    International Nuclear Information System (INIS)

    Taylor, B.G.

    1986-01-01

    The marriage of a mass-produced personal computer with the versatile VMEbus and CAMAC systems creates a cost-effective solution to many laboratory small system requirements. This paper describes MacVEE (Microcomputer Applied to the Control of VME Electronic Equipment), a novel system in which an Apple Macintosh computer is equipped with a special interface which allows it direct memory-mapped access to single or multiple VME and CAMAC crates interconnected by a ribbon cable bus. The bus is driven by an electronics plinth called MacPlinth, which attaches to the computer and becomes an integral part of it. (Auth.)

  13. A Comparison of the Apple Macintosh and IBM PC in Laboratory Applications.

    Science.gov (United States)

    Williams, Ron

    1986-01-01

    Compares Apple Macintosh and IBM PC microcomputers in terms of their usefulness in the laboratory. No attempt is made to equalize the two computer systems since they represent opposite ends of the computer spectrum. Indicates that the IBM PC is the most useful general-purpose personal computer for laboratory applications. (JN)

  14. LERC-SLAM - THE NASA LEWIS RESEARCH CENTER SATELLITE LINK ATTENUATION MODEL PROGRAM (MACINTOSH VERSION)

    Science.gov (United States)

    Manning, R. M.

    1994-01-01

    antenna required to establish a link with the satellite, the statistical parameters that characterize the rainrate process at the terminal site, the length of the propagation path within the potential rain region, and its projected length onto the local horizontal. The IBM PC version of LeRC-SLAM (LEW-14979) is written in Microsoft QuickBASIC for an IBM PC compatible computer with a monitor and printer capable of supporting an 80-column format. The IBM PC version is available on a 5.25 inch MS-DOS format diskette. The program requires about 30K RAM. The source code and executable are included. The Macintosh version of LeRC-SLAM (LEW-14977) is written in Microsoft Basic, Binary (b) v2.00 for Macintosh II series computers running MacOS. This version requires 400K RAM and is available on a 3.5 inch 800K Macintosh format diskette, which includes source code only. The Macintosh version was developed in 1987 and the IBM PC version was developed in 1989. IBM PC is a trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. Macintosh is a registered trademark of Apple Computer, Inc.

  15. Library Signage: Applications for the Apple Macintosh and MacPaint.

    Science.gov (United States)

    Diskin, Jill A.; FitzGerald, Patricia

    1984-01-01

    Describes specific applications of the Macintosh computer at Carnegie-Mellon University Libraries, where MacPaint was used as a flexible, easy to use, and powerful tool to produce informational, instructional, and promotional signage. Profiles of system hardware and software, an evaluation of the computer program MacPaint, and MacPaint signage…

  16. High speed acquisition of multiparameter data using a Macintosh IIcx

    Science.gov (United States)

    Berno, Anthony; Vogel, John S.; Caffee, Marc

    1991-05-01

    Accelerator mass spectrometry systems based on > 3 MV tandem accelerators often use multianode ionization detectors and/or time-of-flight detectors to identify individual isotopes through multiparameter analysis. A Macintosh IIcx has been programmed to collect AMS data from a CAMAC-implemented analyzer and to display the histogrammed individual parameters and a double-parameter array. The computer-CAMAC connection is through a NuBus to CAMAC dataway interface which allows direct addressing to all functions and registers in the crate. Asynchronous data from the rare isotope are sorted into a CAMAC memory module by a list sequence controller. Isotope switching is controlled by a one-cycle timing generator. A rate-dependent amount of time is used to transfer the data from the memory module at the end of each timing cycle. The present configuration uses 10-75 ms for rates of 500-10000 cps. Parameter analysis occurs during the rest of the 520 ms data collection cycle. Completed measurements of the isotope concentrations of each sample are written to files which are compatible with standard Macintosh databases or other processing programs. The system is inexpensive and operates at speeds comparable to those obtainable using larger computers.

  17. High Resolution Displays In The Apple Macintosh And IBM PC Environments

    Science.gov (United States)

    Winegarden, Steven

    1989-07-01

    High resolution displays are one of the key elements that distinguish user oriented document finishing or publishing stations. A number of factors have been involved in bringing these to the desktop environment. At Sigma Designs we have concentrated on enhancing the capabilities of IBM PCs and compatibles and Apple Macintosh computer systems.

  18. A mannequin study of intubation with the AP advance and GlideScope Ranger videolaryngoscopes and the Macintosh laryngoscope.

    Science.gov (United States)

    Hodd, Jack A R; Doyle, D John; Gupta, Shipra; Dalton, Jarrod E; Cata, Juan P; Brewer, Edward J; James, Monyulona; Sessler, Daniel I

    2011-10-01

    The AP Advance (APA) is a videolaryngoscope with interchangeable blades: intubators can choose standard Macintosh blades or a difficult-airway blade with increased curvature and a channel to guide the tube to the larynx. The APA may therefore be comparably effective in both normal and difficult airways. We tested the hypotheses that intubation with the APA is no slower than Macintosh laryngoscopy for normal mannequin airways, and that it is no slower than videolaryngoscopy using a GlideScope Ranger in difficult mannequin airways. Medical professionals whose roles potentially include tracheal intubation were trained with each device. Participants intubated simulated (Laerdal SimMan) normal and difficult airways with the APA, GlideScope, and a conventional Macintosh blade. Speed of intubation was compared using Cox proportional hazards regression, with a hazard ratio >0.8 considered noninferior. We also compared laryngeal visualization, failures, and participant preferences. Unadjusted intubation times in the normal airway with the APA and Macintosh were virtually identical (median, 22 vs 23 seconds); after adjustment for effects of experience, order, and period, the hazard ratio (95% confidence interval) comparing APA with Macintosh laryngoscopy was 0.87 (0.65, 1.17), which was not significantly more than our predefined noninferiority boundary of 0.8 (P = 0.26). Intubation with the APA was faster than with the GlideScope in difficult airways (hazard ratio = 7.6 [5.0, 11.3], P < 0.001). Few participants failed to intubate with the APA, whereas 33% and 37% failed with the GlideScope and Macintosh, respectively. In the difficult airway, 99% of participants achieved a Cormack and Lehane grade I to II view with the APA, versus 85% and 33% with the GlideScope and Macintosh, respectively. When asked to choose 1 device overall, 82% chose the APA. Intubation times were similar with the APA and Macintosh laryngoscopes in mannequins with normal airways. However, intubation with the APA was significantly faster than with the GlideScope in the difficult airway.
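
    The noninferiority analysis described above (Cox regression with a hazard-ratio boundary of 0.8) can be sketched as follows with the lifelines library on synthetic data; neither the package choice nor the data reflect the study itself.

    ```python
    # Noninferiority check via Cox proportional hazards: fit time-to-intubation
    # against a device indicator, then see whether the hazard-ratio CI stays
    # above the 0.8 boundary. Synthetic data; lifelines is an assumed tool.
    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(0)
    n = 100
    df = pd.DataFrame({
        "apa": rng.integers(0, 2, n),                       # 1 = APA, 0 = Macintosh
        "event": 1,                                         # all intubations completed
        "seconds": rng.gamma(shape=9.0, scale=2.5, size=n)  # ~22 s on average
    })

    cph = CoxPHFitter().fit(df, duration_col="seconds", event_col="event")
    hr = float(np.exp(cph.params_["apa"]))
    hr_lo, hr_hi = np.exp(cph.confidence_intervals_.loc["apa"])
    print(f"hazard ratio {hr:.2f} (95% CI {hr_lo:.2f}-{hr_hi:.2f}); "
          "noninferior if the interval stays above 0.8")
    ```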

  19. Macintosh Troubleshooting Pocket Guide for Mac OS

    CERN Document Server

    Lerner, David; Corporation, Tekserve

    2009-01-01

    The Macintosh Troubleshooting Pocket Guide covers the most common user hardware and software trouble. It's not just a book for Mac OS X (although it includes tips for OS X and Jaguar), it's for anyone who owns a Mac of any type-- there are software tips going back as far as OS 6. This slim guide distills the answers to the urgent questions that Tekserve's employee's answer every week into a handy guide that fits in your back pocket or alongside your keyboard.

  20. A Survey of Computer Use by Undergraduate Psychology Departments in Virginia.

    Science.gov (United States)

    Stoloff, Michael L.; Couch, James V.

    1987-01-01

    Reports a survey of computer use in psychology departments in Virginia's four-year colleges. Results showed that faculty, students, and clerical staff used word processing, statistical analysis, and database management most frequently. The three most common computer brands were the Apple II family, IBM PCs, and the Apple Macintosh. (Author/JDH)

  1. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (MACINTOSH VERSION)

    Science.gov (United States)

    Culbert, C.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches them against the rule network. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. The PC version and the Macintosh
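
    To make the forward-chaining cycle concrete, here is a deliberately naive Python sketch of the assert/match/fire loop; it rescans every fact on each pass, which is precisely the work the Rete algorithm avoids. The facts and rule are invented examples, not CLIPS syntax.

    ```python
    # Naive forward chaining: fire any rule whose conditions are all present,
    # assert its conclusion, and repeat until nothing new can be derived.
    facts = {("socrates", "is", "human"), ("human", "is", "mortal")}

    rules = [
        # one hand-written instance of: if X is Y and Y is Z, conclude X is Z
        {"if": [("socrates", "is", "human"), ("human", "is", "mortal")],
         "then": ("socrates", "is", "mortal")},
    ]

    changed = True
    while changed:
        changed = False
        for rule in rules:
            if all(c in facts for c in rule["if"]) and rule["then"] not in facts:
                facts.add(rule["then"])   # "fire" the rule
                changed = True

    print(sorted(facts))  # now includes ('socrates', 'is', 'mortal')
    ```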

  2. Office X for Macintosh the missing manual

    CERN Document Server

    Barber, Nan; Reynolds, David

    2002-01-01

    Mac OS X, Apple's super-advanced, Unix-based operating system, offers every desirable system-software feature known to humans. But without a compatible software library, the Mac of the future was doomed. Microsoft Office X for Macintosh is exactly the software suite most Mac fans were waiting for. Its four programs--Word, Excel, PowerPoint, and Entourage--have been completely overhauled to take advantage of the stunning looks and rock-like stability of Mac OS X. But this magnificent package comes without a single page of printed instructions. Fortunately, Pogue Press/O'Reilly is once again

  3. NETS - A NEURAL NETWORK DEVELOPMENT TOOL, VERSION 3.0 (MACINTOSH VERSION)

    Science.gov (United States)

    Phillips, T. A.

    1994-01-01

    allows the user to generate C code to implement the network loaded into the system. This permits the placement of networks as components, or subroutines, in other systems. In short, once a network performs satisfactorily, the Generate C Code option provides the means for creating a program separate from NETS to run the network. Other features: files may be stored in binary or ASCII format; multiple input propagation is permitted; bias values may be included; capability to scale data without writing scaling code; quick interactive testing of network from the main menu; and several options that allow the user to manipulate learning efficiency. NETS is written in ANSI standard C language to be machine independent. The Macintosh version (MSC-22108) includes code for both a graphical user interface version and a command line interface version. The machine independent version (MSC-21588) only includes code for the command line interface version of NETS 3.0. The Macintosh version requires a Macintosh II series computer and has been successfully implemented under System 7. Four executables are included on these diskettes, two for floating point operations and two for integer arithmetic. It requires Think C 5.0 to compile. A minimum of 1Mb of RAM is required for execution. Sample input files and executables for both the command line version and the Macintosh user interface version are provided on the distribution medium. The Macintosh version is available on a set of three 3.5 inch 800K Macintosh format diskettes. The machine independent version has been successfully implemented on an IBM PC series compatible running MS-DOS, a DEC VAX running VMS, a SunIPC running SunOS, and a CRAY Y-MP running UNICOS. Two executables for the IBM PC version are included on the MS-DOS distribution media, one compiled for floating point operations and one for integer arithmetic. The machine independent version is available on a set of three 5.25 inch 360K MS-DOS format diskettes (standard
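
    Once trained, a network saved by a tool like NETS reduces at run time to a fixed-weight forward pass, which is what its generated C code computes. The numpy sketch below shows that idea on a hand-built 2-2-1 XOR network; the weights are illustrative and were not produced by NETS.

    ```python
    # Forward pass through a fixed 2-2-1 network computing XOR.
    # Hand-chosen weights; a tool like NETS would learn these by backpropagation.
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    W1 = np.array([[20.0, 20.0],      # hidden unit 1: acts like OR
                   [-20.0, -20.0]])   # hidden unit 2: acts like NAND
    b1 = np.array([-10.0, 30.0])
    W2 = np.array([20.0, 20.0])       # output: AND of the two hidden units
    b2 = -30.0

    def run_network(x):
        return sigmoid(W2 @ sigmoid(W1 @ x + b1) + b2)

    for x in ([0, 0], [0, 1], [1, 0], [1, 1]):
        y = run_network(np.array(x, dtype=float))
        print(x, "->", round(float(y)))   # prints 0, 1, 1, 0
    ```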

  4. A CAMAC-based data acquisition system with a Macintosh interface

    International Nuclear Information System (INIS)

    McKisson, J.E.; Ely, D.W.; Weisenberger, A.G.; Piercy, R.B.; Haskins, P.S.

    1990-01-01

    This paper describes a commercially available Macintosh-based data acquisition system and its application to a specific measurement. Based on Computer Aided Measurement and Control (CAMAC) and Nuclear Instrumentation Module (NIM) standard modules, the data acquisition system features a hardware and software interface to a Macintosh computer. This system has been used both for laboratory and remote site measurements, and has been found to perform well as both a highly interactive laboratory system and as a very automatable system for long term data acquisition. Ease in configuration allows for flexibility in fast response applications where a data acquisition system is needed in short time. The system software also supports much of the data analysis and presentation of results with a versatile set of histogram display and manipulation tools. In a recent application, the system controlled data acquisition for two germanium detectors used as part of the whole-spacecraft induced activation measurements of the Long Duration Exposure Facility (LDEF) satellite

  5. An abstract interactive graphics interface for the IBM/PC and Macintosh.

    OpenAIRE

    Ko-Hsin, Liang

    1988-01-01

    Approved for public release; distribution is unlimited. Different computer systems have different programming environments in spite of their similar capabilities. GEM and the Macintosh software system both provide an operating environment in which the users can utilize all kinds of functions and routines to produce a user-friendly application program. Unfortunately, the programmers have to repeat the learning procedure and recode the source works if for some reason the application p...

  6. DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (MACINTOSH VERSION)

    Science.gov (United States)

    Rogers, J. L.

    1994-01-01

    effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.
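
    The backward trace described above is, in graph terms, a reachability walk over module inputs. Below is a minimal sketch, assuming a hypothetical design-module graph (DeMAID's actual data structures are not described in this record).

    ```python
    # If `target`'s output must be recomputed, find every upstream module
    # that has to be re-executed first. Breadth-first walk over inputs.
    from collections import deque

    # inputs[m] = modules whose outputs m consumes (hypothetical example)
    inputs = {
        "performance": ["aerodynamics", "structures"],
        "aerodynamics": ["geometry"],
        "structures": ["geometry"],
        "geometry": [],
    }

    def must_rerun(target):
        seen, queue, order = {target}, deque([target]), []
        while queue:
            for dep in inputs[queue.popleft()]:
                if dep not in seen:
                    seen.add(dep)
                    queue.append(dep)
                    order.append(dep)
        return order

    print(must_rerun("performance"))  # ['aerodynamics', 'structures', 'geometry']
    ```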

  7. Embedded computer systems for control applications in EBR-II

    International Nuclear Information System (INIS)

    Carlson, R.B.; Start, S.E.

    1993-01-01

    The purpose of this paper is to describe the embedded computer systems approach taken at Experimental Breeder Reactor II (EBR-II) for non-safety related systems. The hardware and software structures for typical embedded systems are presented. The embedded systems development process is described. Three examples are given which illustrate typical embedded computer applications in EBR-II.

  8. A CLINICAL ASSESSMENT OF MACINTOSH BLADE, MILLER BLADE AND KING VISIONTM VIDEOLARYNGOSCOPE FOR LARYNGEAL EXPOSURE AND DIFFICULTY IN ENDOTRACHEAL INTUBATION

    Directory of Open Access Journals (Sweden)

    Apoorva Mahendera

    2016-03-01

    Full Text Available CONTEXT Previous studies suggest that the glottic view is better achieved with straight blades, while tracheal intubation is easier with curved blades, and that videolaryngoscopes are better than conventional laryngoscopes. AIMS Comparison of conventional laryngoscopes (Macintosh blade and Miller blade) with a channelled videolaryngoscope (King Vision™) with respect to laryngeal visualisation and difficulty in endotracheal intubation. SETTINGS AND DESIGN This prospective randomised comparative study was conducted at a tertiary care hospital (in ASA I and ASA II patients) after approval from the Institutional Ethics Committee. METHODS We compared the Macintosh, Miller, and King Vision™ videolaryngoscope for glottic visualisation and ease of tracheal intubation. Patients undergoing elective surgeries under general anaesthesia requiring endotracheal intubation were randomly divided into three groups (N=180). After induction of anaesthesia, laryngoscopy was performed and the trachea intubated. We recorded visualisation of the glottis (Cormack-Lehane grade, CL), ease of intubation, number of attempts, need to change the blade, and need for external laryngeal manipulation. STATISTICAL ANALYSIS Demographic data, mandibular length, and Mallampati classification were compared using ANOVA, the Chi-square test, and the Kruskal-Wallis test, with a P value <0.005 considered statistically significant. RESULTS CL grade 1 was most often observed in the King Vision™ VL group (90%), followed by the Miller (28.33%) and Macintosh (15%) groups. We found intubation to be easier (grade 1) with the King Vision™ VL group (73.33%), followed by the Macintosh (38.33%) and Miller (1.67%) groups. External manipulation (BURP) was needed more frequently in the Miller group (71.67%), followed by the Macintosh (28.33%) and King Vision™ VL (6.67%) groups. All (100%) patients were intubated on the first attempt with the King Vision™ VL, followed by the Macintosh (90%) and Miller (58.33%) groups. CONCLUSIONS In patients with normal airway

  9. A network-based Macintosh serial host interface program

    International Nuclear Information System (INIS)

    Wight, J.

    1991-03-01

    A program has been written for the Apple Macintosh to replace conventional host RS232 terminals with customizable user interfaces. Serial port NuBus cards in the Macintosh allow many simultaneous sessions to be maintained. A powerful system is attained by connecting multiple Macintoshes on a network, each running this program. Each is then able to share incoming data from any of its serial ports with any other Macintosh, as well as accept data from any other Macintosh for output to any of its serial ports. The program has been used to eliminate multiple host terminals, modernize the user interface, and to centralize operation of a complex control system. Minimal changes to host software have been required. By making extensive use of Macintosh resources, the same executable code serves in a variety of roles. An object oriented C language with a class library made the development straightforward and easy to modify. This program is used to control a 2 MW neutral beam system on the DIII-D magnetic fusion tokamak. 7 figs

  10. EBR-II high-ramp transients under computer control

    International Nuclear Information System (INIS)

    Forrester, R.J.; Larson, H.A.; Christensen, L.J.; Booty, W.F.; Dean, E.M.

    1983-01-01

    During reactor run 122, EBR-II was subjected to 13 computer-controlled overpower transients at ramps of 4 MWt/s to qualify the facility and fuel for transient testing of LMFBR oxide fuels as part of the EBR-II operational-reliability-testing (ORT) program. A computer-controlled automatic control-rod drive system (ACRDS), designed by EBR-II personnel, permitted automatic control on demand power during the transients

  11. DET/MPS - THE GSFC ENERGY BALANCE PROGRAM, DIRECT ENERGY TRANSFER/MULTIMISSION SPACECRAFT MODULAR POWER SYSTEM (MACINTOSH A/UX VERSION)

    Science.gov (United States)

    Jagielski, J. M.

    1994-01-01

    The DET/MPS programs model and simulate the Direct Energy Transfer and Multimission Spacecraft Modular Power System in order to aid both in design and in analysis of orbital energy balance. Typically, the DET power system has the solar array connected directly to the spacecraft bus, and the central building block of MPS is the Standard Power Regulator Unit. DET/MPS allows a minute-by-minute simulation of the power system's performance as it responds to various orbital parameters, focusing its output on solar array output and battery characteristics. While this package is limited in terms of orbital mechanics, it is sufficient to calculate eclipse and solar array data for circular or non-circular orbits. DET/MPS can be adjusted to run one or sequential orbits up to about one week, simulated time. These programs have been used on a variety of Goddard Space Flight Center spacecraft projects. DET/MPS is written in FORTRAN 77 with some VAX-type extensions. Any FORTRAN 77 compiler that includes VAX extensions should be able to compile and run the program with little or no modifications. The compiler must at least support free-form (or tab-delineated) source format and 'do while ... end do' control structures. DET/MPS is available for three platforms: GSC-13374, for DEC VAX series computers running VMS, is available in DEC VAX Backup format on a 9-track 1600 BPI tape (standard distribution) or TK50 tape cartridge; GSC-13443, for UNIX-based computers, is available on a .25 inch streaming magnetic tape cartridge in UNIX tar format; and GSC-13444, for Macintosh computers running A/UX with either the NKR FORTRAN or AbSoft MacFORTRAN II compilers, is available on a 3.5 inch 800K Macintosh format diskette. Source code and test data are supplied. The UNIX version of DET requires 90K of main memory for execution. DET/MPS was developed in 1990. A/UX and Macintosh are registered trademarks of Apple Computer, Inc. VMS, DEC VAX and TK50 are trademarks of Digital Equipment Corporation. UNIX is a

  12. A randomized controlled study to evaluate and compare Truview blade with Macintosh blade for laryngoscopy and intubation under general anesthesia.

    Science.gov (United States)

    Timanaykar, Ramesh T; Anand, Lakesh K; Palta, Sanjeev

    2011-04-01

    The Truview EVO2™ laryngoscope is a recently introduced device with a unique blade that provides a magnified laryngeal view at 42° anterior reflected view. It facilitates visualization of the glottis without alignment of oral, pharyngeal, and tracheal axes. We compared the view obtained at laryngoscopy, intubating conditions and hemodynamic parameters of Truview with Macintosh blade. In a prospective, randomized and controlled manner, 200 patients of ASA I and II of either sex (20-50 years), presenting for surgery requiring tracheal intubation, were assigned to undergo intubation using a Truview or Macintosh laryngoscope. Visualization of the vocal cord, ease of intubation, time taken for intubation, number of attempts, and hemodynamic parameters were evaluated. Truview provided better results for the laryngeal view using Cormack and Lehane grading, particularly in patients with higher airway Mallampati grading (P < 0.05). The time taken for intubation (33.06±5.6 vs. 23.11±5.7 seconds) was longer with Truview than with Macintosh blade (P < 0.01). The Percentage of Glottic Opening (POGO) score was significantly higher with Truview (97.26±8) than with the Macintosh blade (83.70±21.5). Hemodynamic parameters increased after tracheal intubation from pre-intubation values (P < 0.05) in both groups, but they were comparable between the groups. No postoperative adverse events were noted. Tracheal intubation using the Truview blade provided a consistently improved laryngeal view as compared to the Macintosh blade, without the need to align the oral, pharyngeal and tracheal axes, with equal attempts for successful intubation and similar changes in hemodynamics. However, the time taken for intubation was longer with Truview.

  13. A randomized controlled study to evaluate and compare Truview blade with Macintosh blade for laryngoscopy and intubation under general anesthesia

    Directory of Open Access Journals (Sweden)

    Ramesh T Timanaykar

    2011-01-01

    Full Text Available Background: The Truview EVO2™ laryngoscope is a recently introduced device with a unique blade that provides a magnified laryngeal view at 42° anterior reflected view. It facilitates visualization of the glottis without alignment of oral, pharyngeal, and tracheal axes. We compared the view obtained at laryngoscopy, intubating conditions and hemodynamic parameters of Truview with Macintosh blade. Materials and Methods: In a prospective, randomized and controlled manner, 200 patients of ASA I and II of either sex (20-50 years), presenting for surgery requiring tracheal intubation, were assigned to undergo intubation using a Truview or Macintosh laryngoscope. Visualization of the vocal cord, ease of intubation, time taken for intubation, number of attempts, and hemodynamic parameters were evaluated. Results: Truview provided better results for the laryngeal view using Cormack and Lehane grading, particularly in patients with higher airway Mallampati grading (P < 0.05). The time taken for intubation (33.06±5.6 vs. 23.11±5.7 seconds) was longer with Truview than with Macintosh blade (P < 0.01). The Percentage of Glottic Opening (POGO) score was significantly higher with Truview (97.26±8) than with the Macintosh blade (83.70±21.5). Hemodynamic parameters increased after tracheal intubation from pre-intubation values (P < 0.05) in both the groups, but they were comparable amongst the groups. No postoperative adverse events were noted. Conclusion: Tracheal intubation using the Truview blade provided a consistently improved laryngeal view as compared to the Macintosh blade, without the need to align the oral, pharyngeal and tracheal axes, with equal attempts for successful intubation and similar changes in hemodynamics. However, the time taken for intubation was longer with Truview.

  14. Computer Viruses: An Overview.

    Science.gov (United States)

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  15. Type II Quantum Computing With Superconductors

    National Research Council Canada - National Science Library

    Orlando, Terry

    2004-01-01

    ... for adiabatic quantum computing using these qubits. The major experimental results on single superconducting persistent current qubits have been the observation of the quantum energy level crossings in niobium qubits, and the microwave measurements...

  16. Scientific computing vol II - eigenvalues and optimization

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the second of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses more advanced topics than volume one, and is largely not a prerequisite for volume three. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 49 examples, 110 exercises, 66 algorithms, 24 interactive JavaScript programs, 77 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in LAPACK, GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either upper level undergraduate...

  17. Desktop Publishing on the Macintosh: A Software Perspective.

    Science.gov (United States)

    Devan, Steve

    1987-01-01

    Discussion of factors to be considered in selecting desktop publishing software for the Macintosh microcomputer focuses on the two approaches to such software, i.e., batch and interactive, and three technical considerations, i.e., document, text, and graphics capabilities. Some new developments in graphics software are also briefly described. (MES)

  18. Comparative study between the use of Macintosh Laryngoscope ...

    African Journals Online (AJOL)

    Comparative study between the use of Macintosh Laryngoscope and Airtraq in patients with cervical spine immobilization. ... Conclusion: The Airtraq Laryngoscope offers a new approach for the management of difficult airway like patients with potential cervical spine injury, it is fast, easy to use, gets an easy view of the ...

  19. Computer simulation of a magnetohydrodynamic dynamo II

    International Nuclear Information System (INIS)

    Kageyama, Akira; Sato, Tetsuya.

    1994-11-01

    We performed a computer simulation of a magnetohydrodynamic dynamo in a rapidly rotating spherical shell. Extensive parameter runs are carried out changing the electrical resistivity. It is found that the total magnetic energy can grow more than ten times larger than the total kinetic energy of the convection motion when the resistivity is sufficiently small. When the resistivity is relatively large and the magnetic energy is comparable or smaller than the kinetic energy, the convection motion maintains its well-organized structure. However, when the resistivity is small and the magnetic energy becomes larger than the kinetic energy, the well-organized convection motion is highly disturbed. The generated magnetic field is organized as a set of flux tubes which can be divided into two categories. The magnetic field component parallel to the rotation axis tends to be confined inside the anticyclonic columnar convection cells. On the other hand, the component perpendicular to the rotation axis is confined outside the convection cells. (author)

  20. Computer and engineering calculations of Brazilian Tokamak-II

    International Nuclear Information System (INIS)

    Wang, S.; Chen, Y.; Sa, W.P. de; Nascimento, I.C.; Tuszel, A.G.; Galvao, R.M.O.; Machida, M.

    1990-01-01

    Analytical and computer calculations carried out by researchers of the Physics Institute of the University of Sao Paulo (IFUSP) to define the engineering project and construction of the TBR-II tokamak are presented. The hydrodynamic behaviour of the plasma and the parameters determined for its magnetic confinement were analysed. The computer code was developed using magnetohydrodynamic (MHD) equations which describe the interactions among the plasma, the magnetic field, and the electrical current circulating in more than 20 coils distributed around the toroidal vessel containing the plasma. The electromagnetic, thermal and mechanical couplings are also presented. The TBR-II will be fed by two turbo-generators of 15 MW each. (M.C.K.) [pt

  1. Macintosh support is provided at the level of the Service Desk

    CERN Multimedia

    2011-01-01

    Since September 2010, Apple laptops and desktops running Mac OS have been recognized and supported at CERN by the IT department. Therefore, the “Macintosh support” procedure now follows the same ITIL*) schema as all other IT services, i.e.: All CERN users must address any request for support on Macintosh PCs to the Service Desk. The Service Desk will escalate questions or problems it cannot solve to “IT 2nd level” support people, provided by the “computing support” contract managed by the IT department. Mac OS being officially supported by the IT department, 3rd level support is provided by CERN IT staff; they may give specialized expert assistance, within the scope described at the ITUM-2 presentation, for all incidents or requests which can be neither resolved nor fulfilled by the Service Desk (1st level) and the 2nd level support people. Therefore, users who have problems related to Mac OS should simply fill in the appropriate form from th...

  2. Supporting geoscience with graphical-user-interface Internet tools for the Macintosh

    Science.gov (United States)

    Robin, Bernard

    1995-07-01

    This paper describes a suite of Macintosh graphical-user-interface (GUI) software programs that can be used in conjunction with the Internet to support geoscience education. These software programs allow science educators to access and retrieve a large body of resources from an increasing number of network sites, taking advantage of the intuitive, simple-to-use Macintosh operating system. With these tools, educators easily can locate, download, and exchange not only text files but also sound resources, video movie clips, and software application files from their desktop computers. Another major advantage of these software tools is that they are available at no cost and may be distributed freely. The following GUI software tools are described including examples of how they can be used in an educational setting: ∗ Eudora—an e-mail program ∗ NewsWatcher—a newsreader ∗ TurboGopher—a Gopher program ∗ Fetch—a software application for easy File Transfer Protocol (FTP) ∗ NCSA Mosaic—a worldwide hypertext browsing program. An explosive growth of online archives currently is underway as new electronic sites are being added continuously to the Internet. Many of these resources may be of interest to science educators who learn they can share not only ASCII text files, but also graphic image files, sound resources, QuickTime movie clips, and hypermedia projects with colleagues from locations around the world. These powerful, yet simple to learn GUI software tools are providing a revolution in how knowledge can be accessed, retrieved, and shared.

  3. User's manual for the NEFTRAN II computer code

    International Nuclear Information System (INIS)

    Olague, N.E.; Campbell, J.E.; Leigh, C.D.; Longsine, D.E.

    1991-02-01

    This document describes the NEFTRAN II (NEtwork Flow and TRANsport in Time-Dependent Velocity Fields) computer code and is intended to provide the reader with sufficient information to use the code. NEFTRAN II was developed as part of a performance assessment methodology for storage of high-level nuclear waste in unsaturated, welded tuff. NEFTRAN II is a successor to the NEFTRAN and NWFT/DVM computer codes and contains several new capabilities. These capabilities include: (1) the ability to input pore velocities directly to the transport model and bypass the network fluid flow model, (2) the ability to transport radionuclides in time-dependent velocity fields, (3) the ability to account for the effect of time-dependent saturation changes on the retardation factor, and (4) the ability to account for time-dependent flow rates through the source regime. In addition to these changes, the input to NEFTRAN II has been modified to be more convenient for the user. This document is divided into four main sections consisting of (1) a description of all the models contained in the code, (2) a description of the program and subprograms in the code, (3) a data input guide and (4) verification and sample problems. Although NEFTRAN II is the fourth generation code, this document is a complete description of the code and reference to past user's manuals should not be necessary. 19 refs., 33 figs., 25 tabs
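    The saturation-dependent retardation mentioned in capability (3) follows, in the general transport literature, a simple pattern: a time-dependent saturation changes the water content, which changes the retardation factor and hence the effective radionuclide velocity. The sketch below illustrates only that textbook relation; it is not NEFTRAN II's Fortran source, and all parameter values are hypothetical.

      # Textbook linear-sorption retardation in unsaturated media
      # (a generic illustration, NOT code from NEFTRAN II).
      def retardation(bulk_density, kd, porosity, saturation):
          """R = 1 + rho_b*Kd/(porosity*saturation); rho_b in g/cm^3 and
          Kd in cm^3/g, so the correction term is dimensionless."""
          theta_w = porosity * saturation   # volumetric water content
          return 1.0 + bulk_density * kd / theta_w

      def retarded_velocity(pore_velocity, r_factor):
          """Effective transport velocity of a sorbing radionuclide."""
          return pore_velocity / r_factor

      # Falling saturation raises R and slows the nuclide down.
      for s in (1.0, 0.6, 0.3):
          r = retardation(bulk_density=1.6, kd=0.5, porosity=0.35, saturation=s)
          print(f"S={s:.1f}  R={r:.2f}  v_eff={retarded_velocity(1e-2, r):.2e} m/s")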

  4. Computing Models of CDF and D0 in Run II

    International Nuclear Information System (INIS)

    Lammel, S.

    1997-05-01

    The next collider run of the Fermilab Tevatron, Run II, is scheduled for autumn of 1999. Both experiments, the Collider Detector at Fermilab (CDF) and the D0 experiment, are being modified to cope with the higher luminosity and shorter bunch spacing of the Tevatron. New detector components, higher event complexity, and an increased data volume require changes from the data acquisition systems up to the analysis systems. In this paper we present a summary of the computing models of the two experiments for Run II

  6. Modernization of computer of plant Vandellos-II

    International Nuclear Information System (INIS)

    Fuente Arias, E. de la

    2014-01-01

    The plant computer of the Vandellos II nuclear power plant, whose modernization Westinghouse will carry out, is a centralized system which performs real-time monitoring and supervision of plant processes and the calculations necessary for an efficient assessment of plant operation, without performing any action on the plant. Its main function is to provide current and historical information on the status of the plant, both in normal operation and under emergency conditions. (Author)

  7. SEISRISK II; a computer program for seismic hazard estimation

    Science.gov (United States)

    Bender, Bernice; Perkins, D.M.

    1982-01-01

    The computer program SEISRISK II calculates probabilistic ground motion values for use in seismic hazard mapping. SEISRISK II employs a model that allows earthquakes to occur as points within source zones and as finite-length ruptures along faults. It assumes that earthquake occurrences have a Poisson distribution, that occurrence rates remain constant during the time period considered, that ground motion resulting from an earthquake is a known function of magnitude and distance, that seismically homogeneous source zones are defined, that fault locations are known, that fault rupture lengths depend on magnitude, and that earthquake rates as a function of magnitude are specified for each source. SEISRISK II calculates for each site on a grid of sites the level of ground motion that has a specified probability of being exceeded during a given time period. The program was designed to process a large (essentially unlimited) number of sites and sources efficiently and has been used to produce regional and national maps of seismic hazard. It is a substantial revision of an earlier program, SEISRISK I, which has never been documented. SEISRISK II runs considerably faster and gives more accurate results than the earlier program and in addition includes rupture length and acceleration variability which were not contained in the original version. We describe the model and how it is implemented in the computer program and provide a flowchart and listing of the code.
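    The Poisson-occurrence assumption ties an annual exceedance rate to the probability of exceedance over a mapping period. The following minimal sketch shows that relation only (it is not SEISRISK II code, and the 10%-in-50-years target is just the conventional mapping example):

      import math

      # Poisson relation behind probabilistic seismic hazard maps
      # (illustrative only; not SEISRISK II source code).
      def prob_exceedance(annual_rate, years):
          """P(at least one exceedance in `years`) under a Poisson model."""
          return 1.0 - math.exp(-annual_rate * years)

      def rate_for_probability(p, years):
          """Annual rate whose exceedance probability over `years` is p."""
          return -math.log(1.0 - p) / years

      lam = rate_for_probability(0.10, 50)   # ~1/475 per year
      print(f"rate = {lam:.5f}/yr, return period = {1/lam:.0f} yr")
      print(f"check: P = {prob_exceedance(lam, 50):.2f}")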

  8. CLIPS 6.0 - C LANGUAGE INTEGRATED PRODUCTION SYSTEM, VERSION 6.0 (MACINTOSH VERSION)

    Science.gov (United States)

    Riley, G.

    1994-01-01

    command line version. For the UNIX version of CLIPS 6.0, the command line interface has been successfully implemented on a Sun4 running SunOS, a DECstation running DEC RISC ULTRIX, an SGI Indigo Elan running IRIX, a DEC Alpha AXP running OSF/1, and an IBM RS/6000 running AIX. Command line interface executables are included for Sun4 computers running SunOS 4.1.1 or later and for the DEC RISC ULTRIX platform. The makefiles may have to be modified slightly to be used on other UNIX platforms. The UNIX, Macintosh, and IBM PC versions of CLIPS 6.0 each have a platform specific interface. Source code, a makefile, and an executable for the Windows 3.1 interface version of CLIPS 6.0 are provided only on the IBM PC distribution diskettes. Source code, a makefile, and an executable for the Macintosh interface version of CLIPS 6.0 are provided only on the Macintosh distribution diskettes. Likewise, for the UNIX version of CLIPS 6.0, only source code and a makefile for an X-Windows interface are provided. The X-Windows interface requires MIT's X Window System, Version 11, Release 4 (X11R4), the Athena Widget Set, and the Xmu library. The source code for the Athena Widget Set is provided on the distribution medium. The X-Windows interface has been successfully implemented on a Sun4 running SunOS 4.1.2 with the MIT distribution of X11R4 (not OpenWindows), an SGI Indigo Elan running IRIX 4.0.5, and a DEC Alpha AXP running OSF/1 1.2. The VAX version of CLIPS 6.0 comes only with the generic command line interface. ASCII makefiles for the command line version of CLIPS are provided on all the distribution media for UNIX, VMS, and DOS. Four executables are provided with the IBM PC version: a windowed interface executable for Windows 3.1 built using Borland C++ v3.1, an editor for use with the windowed interface, a command line version of CLIPS for Windows 3.1, and a 386 command line executable for DOS built using Zortech C++ v3.1. All four executables are capable of utilizing extended memory

  9. Computational design of binding proteins to EGFR domain II.

    Directory of Open Access Journals (Sweden)

    Yoon Sup Choi

    Full Text Available We developed a process to produce novel interactions between two previously unrelated proteins. This process selects protein scaffolds and designs protein interfaces that bind to a surface patch of interest on a target protein. Scaffolds with shapes complementary to the target surface patch were screened using an exhaustive computational search of the human proteome and optimized by directed evolution using phage display. This method was applied to successfully design scaffolds that bind to epidermal growth factor receptor (EGFR) domain II, the interface of EGFR dimerization, with high reactivity toward the target surface patch of EGFR domain II. One potential application of these tailor-made protein interactions is the development of therapeutic agents against specific protein targets.

  10. Life system modeling and intelligent computing. Pt. II. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Li, Kang; Irwin, George W. (eds.) [Belfast Queen's Univ. (United Kingdom). School of Electronics, Electrical Engineering and Computer Science]; Fei, Minrui; Jia, Li [Shanghai Univ. (China). School of Mechatronical Engineering and Automation]

    2010-07-01

    This book is part II of a two-volume work that contains the refereed proceedings of the International Conference on Life System Modeling and Simulation, LSMS 2010, and the International Conference on Intelligent Computing for Sustainable Energy and Environment, ICSEE 2010, held in Wuxi, China, in September 2010. The 194 revised full papers presented were carefully reviewed and selected from over 880 submissions and recommended for publication by Springer in two volumes of Lecture Notes in Computer Science (LNCS) and one volume of Lecture Notes in Bioinformatics (LNBI). This particular volume of Lecture Notes in Computer Science (LNCS) includes 55 papers covering 7 relevant topics. The papers in this volume are organized in topical sections on advanced evolutionary computing theory and algorithms; advanced neural network and fuzzy system theory and algorithms; modeling and simulation of societies and collective behavior; biomedical signal processing, imaging, and visualization; intelligent computing and control in distributed power generation systems; intelligent methods in power and energy infrastructure development; and intelligent modeling, monitoring, and control of complex nonlinear systems. (orig.)

  11. Job monitoring on DIRAC for Belle II distributed computing

    Science.gov (United States)

    Kato, Yuji; Hayasaka, Kiyoshi; Hara, Takanori; Miyake, Hideki; Ueda, Ikuo

    2015-12-01

    We developed a monitoring system for Belle II distributed computing which consists of active and passive methods. In this paper we describe the passive monitoring system, in which information stored in the DIRAC database is processed and visualized. We divide the DIRAC workload management flow into steps and store characteristic variables that indicate issues; these variables are chosen carefully based on our operational experience and then visualized. As a result, we are able to detect issues effectively. Finally, we discuss future development toward automating log analysis, notification of issues, and disabling of problematic sites.

  12. SAMGrid experiences with the Condor technology in Run II computing

    International Nuclear Information System (INIS)

    Baranovski, A.; Loebel-Carpenter, L.; Garzoglio, G.; Herber, R.; Illingworth, R.; Kennedy, R.; Kreymer, A.; Kumar, A.; Lueking, L.; Lyon, A.; Merritt, W.; Terekhov, I.; Trumbo, J.; Veseli, S.; White, S.; St. Denis, R.; Jain, S.; Nishandar, A.

    2004-01-01

    SAMGrid is a globally distributed system for data handling and job management, developed at Fermilab for the D0 and CDF experiments in Run II. The Condor system is being developed at the University of Wisconsin for management of distributed resources, computational and otherwise. We briefly review the SAMGrid architecture and its interaction with Condor, which was presented earlier. We then present our experiences using the system in production, which have two distinct aspects. At the global level, we deployed Condor-G, the Grid-extended Condor, for the resource brokering and global scheduling of our jobs. At the heart of the system is Condor's Matchmaking Service. More recently, at the computing element level, we have benefited from the large computing cluster at the University of Wisconsin campus. The architecture of the computing facility and the philosophy of Condor's resource management have prompted us to improve the application infrastructure for D0 and CDF, in aspects such as removing reliance on a shared file system or on dedicated resources. As a result, we have increased productivity and made our applications more portable and Grid-ready. Our fruitful collaboration with the Condor team has been made possible by the Particle Physics Data Grid

  13. Computer and Information Sciences II : 26th International Symposium on Computer and Information Sciences

    CERN Document Server

    Lent, Ricardo; Sakellari, Georgia

    2012-01-01

    Information technology is the enabling foundation for all human activity at the beginning of the 21st century, and advances in this area are crucial to all of us. These advances are taking place all over the world and can only be followed and perceived when researchers from all over the world assemble and exchange their ideas at conferences such as the 26th International Symposium on Computer and Information Sciences, held at the Royal Society in London on 26th to 28th September 2011, whose proceedings are presented in this volume. Computer and Information Sciences II contains novel advances in the state of the art covering applied research in electrical and computer engineering and computer science, across the broad area of information technology. It provides access to the main innovative activities in research across the world, and points to the results obtained recently by some of the most active teams in both Europe and Asia.

  14. Computer imaging of EBR-II fuel handling equipment

    International Nuclear Information System (INIS)

    Peters, G.G.; Hansen, L.H.

    1995-01-01

    This paper describes a three-dimensional graphics application used to visualize the positions of remotely operated fuel handling equipment in the EBR-II reactor. A three-dimensional (3D) visualization technique is necessary to simulate direct visual observation of the transfers of fuel and experiments into and out of the reactor because the fuel handling equipment is submerged in liquid sodium and therefore is not visible to the operator. The system described in this paper uses actual signals to drive a three-dimensional computer-generated model in real time in response to movements of equipment in the plant. This paper will present details on how the 3D model of the in-tank equipment was created and how real-time dynamic behavior was added to each of the moving components.

  15. Internal radiation dose calculations with the INREM II computer code

    International Nuclear Information System (INIS)

    Dunning, D.E. Jr.; Killough, G.G.

    1978-01-01

    A computer code, INREM II, was developed to calculate the internal radiation dose equivalent to organs of man which results from the intake of a radionuclide by inhalation or ingestion. Deposition and removal of radioactivity from the respiratory tract is represented by the International Commission on Radiological Protection Task Group Lung Model. A four-segment catenary model of the gastrointestinal tract is used to estimate movement of radioactive material that is ingested, or swallowed after being cleared from the respiratory tract. Retention of radioactivity in other organs is specified by linear combinations of decaying exponential functions. The formation and decay of radioactive daughters is treated explicitly, with each radionuclide in the decay chain having its own uptake and retention parameters, as supplied by the user. The dose equivalent to a target organ is computed as the sum of contributions from each source organ in which radioactivity is assumed to be situated. This calculation utilizes a matrix of dosimetric S-factors (rem/μCi-day) supplied by the user for the particular choice of source and target organs. Output permits the evaluation of components of dose from cross-irradiations when penetrating radiations are present. INREM II has been utilized with current radioactive decay data and metabolic models to produce extensive tabulations of dose conversion factors for a reference adult for approximately 150 radionuclides of interest in environmental assessments of light-water-reactor fuel cycles. These dose conversion factors represent the 50-year dose commitment per microcurie intake of a given radionuclide for 22 target organs, including contributions from specified source organs and surplus activity in the rest of the body. These tabulations are particularly significant in their consistent use of contemporary models and data and in the detail of documentation
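    The summation over source organs described above reduces to H(target) = sum over sources of U(source) x S(target <- source), with U the integrated activity in μCi-days and S in rem/μCi-day. The sketch below shows only that bookkeeping; the organ names, activities, and S-factors are hypothetical, not values from INREM II.

      # Dose-equivalent bookkeeping: H(target) = sum_src U[src] * S[target<-src].
      # All organ names, U values (uCi-day) and S-factors (rem/uCi-day)
      # are hypothetical illustrations.
      S = {
          ("liver", "liver"):      2.1e-4,
          ("liver", "lung"):       3.5e-6,
          ("liver", "total_body"): 1.0e-6,
      }
      U = {"liver": 120.0, "lung": 40.0, "total_body": 300.0}

      def dose_equivalent(target, activities, s_factors):
          """Committed dose equivalent (rem) to `target` from all sources."""
          return sum(u * s_factors.get((target, src), 0.0)
                     for src, u in activities.items())

      print(f"H(liver) = {dose_equivalent('liver', U, S):.3e} rem")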

  16. EBR-II Cover Gas Cleanup System upgrade distributed control and front end computer systems

    International Nuclear Information System (INIS)

    Carlson, R.B.

    1992-01-01

    The Experimental Breeder Reactor II (EBR-II) Cover Gas Cleanup System (CGCS) control system was upgraded in 1991 to improve control and provide a graphical operator interface. The upgrade consisted of a main control computer, a distributed control computer, a front end input/output computer, a main graphics interface terminal, and a remote graphics interface terminal. This paper briefly describes the Cover Gas Cleanup System and the overall control system; gives reasons behind the computer system structure; and then gives a detailed description of the distributed control computer, the front end computer, and how these computers interact with the main control computer. The descriptions cover both hardware and software

  17. Endotracheal intubation in patients with cervical spine immobilization: a comparison of macintosh and airtraq laryngoscopes.

    LENUS (Irish Health Repository)

    Maharaj, Chrisen H

    2007-07-01

    The Airtraq laryngoscope (Prodol Ltd., Vizcaya, Spain) is a novel single-use tracheal intubation device. The authors compared ease of intubation with the Airtraq and Macintosh laryngoscopes in patients with cervical spine immobilization in a randomized, controlled clinical trial.

  18. Microcomputer Decisions for the 1990s [and] Apple's Macintosh: A Viable Choice.

    Science.gov (United States)

    Grosch, Audrey N.

    1989-01-01

    Discussion of the factors that should be considered when purchasing or upgrading a microcomputer focuses on the MS-DOS and OS/2 operating systems. Macintosh purchasing decisions are discussed in a sidebar. A glossary is provided. (CLB)

  19. Principles of quantum computation and information volume II

    International Nuclear Information System (INIS)

    Kok, P

    2007-01-01

    Any new textbook in quantum information has some pretty strong competition to contend with. Not only is there the classic text by Nielsen and Chuang from 2000, but also John Preskill's lecture notes, available for free online. Nevertheless, a proper textbook seems more enduring than online notes, and the field has progressed considerably in the seven years since Nielsen and Chuang was published. A new textbook is a great opportunity to give a snapshot of our current state of knowledge in quantum information. Therein also lies a problem: The field has expanded so much that it is impossible to cover everything at the undergraduate level. Quantum information theory is relevant to an extremely large portion of physics, from solid state and condensed matter physics to particle physics. Every discipline that has some relation to quantum mechanics is affected by our understanding of quantum information theory. Those who wish to write a book on quantum information therefore have to make some profound choices: Do you keep the ultimate aim of a quantum computer in mind, or do you focus on quantum communication and precision measurements as well? Do you describe how to build a quantum computer with all possible physical systems or do you present only the underlying principles? Do you include only the tried and tested ideas, or will you also explore more speculative directions? You don't have to take a black-or-white stance on these questions, but how you approach them will profoundly determine the character of your book. The authors of 'Principles of Quantum Computation and Information (Volume II: Basic Tools and Special Topics)' have chosen to focus on the construction of quantum computers, but restrict themselves mainly to general techniques. Only in the last chapter do they explicitly address the issues that arise in the different implementations. The book is the second volume in a series, and consists of four chapters (labelled 5 to 8) called 'Quantum Information Theory

  20. 10 CFR Appendix II to Part 504 - Fuel Price Computation

    Science.gov (United States)

    2010-01-01

    ... DEPARTMENT OF ENERGY (CONTINUED) ALTERNATE FUELS EXISTING POWERPLANTS Pt. 504, App. II Appendix II to Part... effects of future real price increases for each fuel. The delivered price of an alternate fuel used to calculate delivered fuel expenses must reflect the petitioner's delivered price of the alternate fuel and...

  1. PARIS II: Computer Aided Solvent Design for Pollution Prevention

    Science.gov (United States)

    This product is a summary of U.S. EPA researchers' work developing the solvent substitution software tool PARIS II (Program for Assisting the Replacement of Industrial Solvents, version 2.0). PARIS II finds less toxic solvents or solvent mixtures to replace more toxic solvents co...

  2. Computer simulation of vortex pinning in type II superconductors. II. Random point pins

    International Nuclear Information System (INIS)

    Brandt, E.H.

    1983-01-01

    Pinning of vortices in a type II superconductor by randomly positioned identical point pins is simulated using the two-dimensional method described in a previous paper (Part I). The system is characterized by the vortex and pin numbers (N_v, N_p), the vortex and pin interaction ranges (R_v, R_p), and the amplitude of the pin potential A_p. The computation is performed for many cases: dilute or dense, sharp or soft, attractive or repulsive, weak or strong pins, and ideal or amorphous vortex lattice. The total pinning force F as a function of the mean vortex displacement X increases first linearly (over a distance usually much smaller than the vortex spacing and than R_p) and then saturates, fluctuating about its average F-bar. We interpret F-bar as the maximum pinning force j_c B of a large specimen. For weak pins the prediction of Larkin and Ovchinnikov for two-dimensional collective pinning is confirmed: F-bar = const. x W-bar/(R_p c_66), where W-bar is the mean square pinning force and c_66 is the shear modulus of the vortex lattice. If the initial vortex lattice is chosen highly defective (''amorphous'') the constant is 1.3--3 times larger than for the ideal triangular lattice. This finding may explain the often observed ''history effect.'' The function F-bar(A_p) exhibits a jump, which for dilute, sharp, attractive pins occurs close to the ''threshold value'' predicted for isolated pins by Labusch. This jump reflects the onset of plastic deformation of the vortex lattice, and in some cases of vortex trapping, but is not a genuine threshold
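    A numerical reading of the collective-pinning relation quoted above (the formula as reconstructed here, with an arbitrary constant and made-up parameter values; W-bar, being a mean square force, scales as the square of the single-pin strength):

      # Evaluate F_bar = const * W_bar / (R_p * c66); all values arbitrary.
      def collective_pinning_force(w_bar, r_p, c66, const=1.0):
          """2D collective-pinning estimate of the mean pinning force."""
          return const * w_bar / (r_p * c66)

      # Doubling the single-pin strength f quadruples W_bar (~ f^2), hence F_bar.
      for f in (1.0, 2.0):
          print(f"f={f}: F_bar={collective_pinning_force(f**2, 0.5, 10.0):.3f}")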

  3. Computational Models for Nonlinear Aeroelastic Systems, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. and Duke University propose to develop and demonstrate new and efficient computational methods of modeling nonlinear aeroelastic systems. The...

  4. 75 FR 64258 - Cloud Computing Forum & Workshop II

    Science.gov (United States)

    2010-10-19

    ... architecture and taxonomy; defining target United States Government Cloud Computing Business Use Cases; and... architecture to support cloud adoption; key cloud computing issues and proposed solutions; security in the... attend this meeting must register at https://www-s.nist.gov/CRS/ by close of business Thursday, October...

  5. Nuclear Magnetic Resonance Spectrometer Console Upgrade for a Type II Quantum Computer

    National Research Council Canada - National Science Library

    Cory, David

    2003-01-01

    ...) spectrometer to enable an improved implementation of type II quantum computers (TTQC). This upgrade is fully functional and has permitted our NMR studies to be moved to higher strength magnetic fields for better sensitivity and spectral dispersion...

  6. BelleII@home: Integrate volunteer computing resources into DIRAC in a secure way

    Science.gov (United States)

    Wu, Wenjing; Hara, Takanori; Miyake, Hideki; Ueda, Ikuo; Kan, Wenxiao; Urquijo, Phillip

    2017-10-01

    The exploitation of volunteer computing resources has become a popular practice in the HEP computing community because of the huge amount of potential computing power it provides. In recent HEP experiments, grid middleware has been used to organize the services and the resources; however, it relies heavily on X.509 authentication, which is at odds with the untrusted nature of volunteer computing resources. One big challenge in utilizing volunteer computing resources is therefore how to integrate them into the grid middleware in a secure way. The DIRAC interware, commonly used as the major component of the grid computing infrastructure for several HEP experiments, poses an even bigger challenge to this paradox, as its pilot is more closely coupled with operations requiring X.509 authentication than the pilot implementations of its peer grid interware. The Belle II experiment is a B-factory experiment at KEK, and it uses DIRAC for its distributed computing. In the BelleII@home project, in order to integrate volunteer computing resources into the Belle II distributed computing platform in a secure way, we adopted a new approach which detaches the payload from the Belle II DIRAC pilot (a customized pilot that pulls and processes jobs from the Belle II distributed computing platform), so that the payload can run on volunteer computers without requiring any X.509 authentication. In this approach we developed a gateway service running on a trusted server which handles all the operations requiring X.509 authentication. So far, we have developed and deployed the prototype of BelleII@home and tested its full workflow, which proves the feasibility of this approach. The approach can also be applied on HPC systems whose worker nodes do not have outbound connectivity to interact with the DIRAC system in general.
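    A minimal sketch of the gateway idea follows; every URL, endpoint, and field name is hypothetical, and the actual BelleII@home code differs. The point is only that the volunteer host speaks plain HTTPS while the trusted gateway alone holds the X.509 proxy and talks to DIRAC.

      import requests

      GATEWAY = "https://gateway.example.org"   # trusted server (hypothetical)

      def fetch_payload(volunteer_id):
          """Volunteer side: pull a payload over plain HTTPS; no grid
          credential is present on the volunteer machine."""
          r = requests.get(f"{GATEWAY}/payload", params={"vid": volunteer_id})
          r.raise_for_status()
          return r.json()   # e.g. {"job_id": ..., "input_url": ...}

      def upload_result(volunteer_id, job_id, data):
          """Volunteer side: return results to the gateway, which then
          registers them with DIRAC using its own X.509 proxy."""
          r = requests.post(f"{GATEWAY}/result/{job_id}",
                            params={"vid": volunteer_id}, data=data)
          r.raise_for_status()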

  7. Effects of Computer-Assisted Jigsaw II Cooperative Learning Strategy on Physics Achievement and Retention

    Science.gov (United States)

    Gambari, Isiaka Amosa; Yusuf, Mudasiru Olalere

    2016-01-01

    This study investigated the effects of computer-assisted Jigsaw II cooperative strategy on physics achievement and retention. The study also determined how moderating variables of achievement levels as it affects students' performance in physics when Jigsaw II cooperative learning is used as an instructional strategy. Purposive sampling technique…

  8. Computer measurement system of reactor period for China fast burst reactor-II

    International Nuclear Information System (INIS)

    Zhao Wuwen; Jiang Zhiguo

    1997-01-01

    The author briefly introduces the hardware, principle, and software of the reactor period computer measurement system for the China Fast Burst Reactor-II (CFBR-II). The paper also gives the relation between fission yield and pre-reactivity for the CFBR-II system of a bare reactor with a decoupled component and the system of a bare reactor with multiple light materials. The computer measurement system makes reactor period measurement automatic and intelligent, and improves the speed and precision of on-line period data processing.

  9. Computational complexity of the landscape II-Cosmological considerations

    Science.gov (United States)

    Denef, Frederik; Douglas, Michael R.; Greene, Brian; Zukowski, Claire

    2018-05-01

    We propose a new approach for multiverse analysis based on computational complexity, which leads to a new family of "computational" measure factors. By defining a cosmology as a space-time containing a vacuum with specified properties (for example small cosmological constant) together with rules for how time evolution will produce the vacuum, we can associate global time in a multiverse with clock time on a supercomputer which simulates it. We argue for a principle of "limited computational complexity" governing early universe dynamics as simulated by this supercomputer, which translates to a global measure for regulating the infinities of eternal inflation. The rules for time evolution can be thought of as a search algorithm, whose details should be constrained by a stronger principle of "minimal computational complexity". Unlike previously studied global measures, ours avoids standard equilibrium considerations and the well-known problems of Boltzmann Brains and the youngness paradox. We also give various definitions of the computational complexity of a cosmology, and argue that there are only a few natural complexity classes.

  10. Evaluation of the Airtraq and Macintosh laryngoscopes in patients at increased risk for difficult tracheal intubation.

    LENUS (Irish Health Repository)

    Maharaj, C H

    2008-02-01

    The Airtraq, a novel single use indirect laryngoscope, has demonstrated promise in the normal and simulated difficult airway. We compared the ease of intubation using the Airtraq with the Macintosh laryngoscope, in patients at increased risk for difficult tracheal intubation, in a randomised, controlled clinical trial. Forty consenting patients presenting for surgery requiring tracheal intubation, who were deemed to possess at least three characteristics indicating an increased risk for difficulty in tracheal intubation, were randomly assigned to undergo tracheal intubation using a Macintosh (n = 20) or Airtraq (n = 20) laryngoscope. All patients were intubated by one of three anaesthetists experienced in the use of both laryngoscopes. Four patients were not successfully intubated with the Macintosh laryngoscope, but were intubated successfully with the Airtraq. The Airtraq reduced the duration of intubation attempts (mean (SD); 13.4 (6.3) vs 47.7 (8.5) s), the need for additional manoeuvres, and the intubation difficulty score (0.4 (0.8) vs 7.7 (3.0)). Tracheal intubation with the Airtraq also reduced the degree of haemodynamic stimulation and minor trauma compared to the Macintosh laryngoscope.

  11. Assessment of the storz video Macintosh laryngoscope for use in difficult airways: A human simulator study.

    Science.gov (United States)

    Bair, Aaron E; Olmsted, Kalani; Brown, Calvin A; Barker, Tobias; Pallin, Daniel; Walls, Ron M

    2010-10-01

    Video laryngoscopy has been shown to improve glottic exposure when compared to direct laryngoscopy in operating room studies. However, its utility in the hands of emergency physicians (EPs) remains undefined. A simulated difficult airway was used to determine if intubation by EPs using a video Macintosh system resulted in an improved glottic view, was easier, was faster, or was more successful than conventional direct laryngoscopy. Emergency medicine (EM) residents and attending physicians at two academic institutions performed endotracheal intubation in one normal and two identical difficult airway scenarios. With the difficult scenarios, the participants used video laryngoscopy during the second case. Intubations were performed on a medium-fidelity human simulator. The difficult scenario was created by limiting cervical spine mobility and inducing trismus. The primary outcome was the proportion of direct versus video intubations with a grade I or II Cormack-Lehane glottic view. Ease of intubation (self-reported via 10-cm visual analog scale [VAS]), time to intubation, and success rate were also recorded. Descriptive statistics as well as medians with interquartile ranges (IQRs) are reported where appropriate. The Wilcoxon matched pairs signed-rank test was used for comparison testing of nonparametric data. Participants (n = 39) were residents (59%) and faculty. All had human intubation experience; 51% reported more than 100 prior intubations. On difficult laryngoscopy, a Cormack-Lehane grade I or II view was obtained in 20 (51%) direct laryngoscopies versus 38 (97%) of the video-assisted laryngoscopies (p < 0.01). The median VAS score for difficult airways was 50 mm (IQR = 28–73 mm) for direct versus 18 mm (IQR = 9–50 mm) for video (p < 0.01). The median time to intubation in difficult airways was 25 seconds (IQR = 16–44 seconds) for direct versus 20 seconds (IQR = 12–35 seconds) for video laryngoscopy (p < 0.01). All intubations were successful without

  12. A randomized controlled trial comparing C Mac D Blade and Macintosh laryngoscope for nasotracheal intubation in patients undergoing surgeries for head and neck cancer.

    Science.gov (United States)

    Hazarika, Hrishikesh; Saxena, Anudeep; Meshram, Pradeep; Kumar Bhargava, Ajay

    2018-01-01

    Several devices are available to manage the difficult airway, but there is scant evidence on the use of the C-MAC D-Blade for nasotracheal intubation in a difficult airway scenario. We compared the C-MAC D-Blade videolaryngoscope™ and the standard Macintosh laryngoscope for nasal intubation in patients with difficult airways selected by the El-Ganzouri risk index, using the time and number of attempts required for intubation, glottic view in terms of Cormack-Lehane grade, ease of intubation, success rate, use of accessory maneuvers, incidence of complications, and hemodynamic changes. One hundred American Society of Anesthesiologists (ASA) I-III patients aged 20-70 years with an EGRI score of 1-≤7 scheduled for head and neck surgery requiring nasal intubation were included. ASA IV patients, patients with mouth opening <2.5 cm, patients difficult to mask ventilate, and patients with hyperkalemia or a history of malignant hyperthermia were excluded from the study. The primary outcome was time taken to intubation; secondary outcomes were the number of attempts, glottic view in terms of Cormack-Lehane grade, use of accessory maneuvers, success rate, incidence of trauma, ease of intubation, and hemodynamic changes before and after intubation. Time required for intubation was shorter in Group C (39.56 ± 15.65 s) than in Group M (50.34 ± 15.65 s). Cormack-Lehane Grade I and II views were more frequent in the C-MAC D-Blade group (P < 0.05). Success rate and ease of intubation were greater in the C-MAC D-Blade group than in the Macintosh group (P < 0.05). The number of attempts and the incidence of complications such as trauma, bleeding, and failed intubation were greater in the Macintosh group than in the C-MAC D-Blade group. Hemodynamic changes were comparable in both groups. The C-MAC D-Blade videolaryngoscope™ is a better tool for anesthetic management of the difficult airway for nasal intubation than the conventional Macintosh laryngoscope.

  13. Novel theory of the HD dipole moment. II. Computations

    International Nuclear Information System (INIS)

    Thorson, W.R.; Choi, J.H.; Knudson, S.K.

    1985-01-01

    In the preceding paper we derived a new theory of the dipole moments of homopolar but isotopically asymmetric molecules (such as HD, HT, and DT) in which the electrical asymmetry appears directly in the electronic Hamiltonian (in an appropriate Born-Oppenheimer separation) and the dipole moment may be computed as a purely electronic property. In the present paper we describe variation-perturbation calculations and convergence studies on the dipole moment for HD, which is found to have the value 8.51 x 10^-4 debye at 1.40 a.u. Using the two alternative formulations of the electronic problem, we can provide a test of basis-set adequacy and convergence of the results, and such convergence studies are reported here. We have also computed vibration-rotation transition matrix elements and these are compared with experimental and other theoretical results

  14. Modernization of computer of plant Vandellos-II; Modernizacion del ordenador de planta de Vandello-II

    Energy Technology Data Exchange (ETDEWEB)

    Fuente Arias, E. de la

    2014-07-01

    The plant computer of the Vandellos II nuclear power plant, whose modernization Westinghouse will carry out, is a centralized system which performs real-time monitoring and supervision of plant processes and the calculations necessary for an efficient assessment of plant operation, without performing any action on the plant. Its main function is to provide current and historical information on the status of the plant, both in normal operation and under emergency conditions. (Author)

  15. Computer imaging of EBR-II handling equipment

    International Nuclear Information System (INIS)

    Hansen, L.H.; Peters, G.G.

    1994-10-01

    This paper describes a three-dimensional graphics application used to visualize the positions of remotely operated fuel handling equipment in the EBR-II reactor. The system described in this paper uses actual signals to move a three-dimensional graphics model in real-time in response to movements of equipment in the plant. A three-dimensional (3D) visualization technique is necessary to simulate direct visual observation of the transfers of fuel and experiments into and out of the reactor because the fuel handling equipment is submerged in liquid sodium and therefore is not visible to the operator. This paper will present details on how the 3D model was created and how real-time dynamic behavior was added to each of the moving components

  16. Woman in Either/Or, I & II: A Computer Analysis

    Directory of Open Access Journals (Sweden)

    Alastair McKinnon

    2013-11-01

    Full Text Available The author analyzes Kierkegaard's pronouncements on woman and the feminine in Either/Or, especially in parts I and II. Key words such as “woman”, “feminine”, “virginity”, “bride”, and “wife”, among many others, were selected, and Michael J. Greenacre's SimCA 2.0 correspondence-analysis program was employed to show both the common and the differing perspectives of aesthete A and aesthete B, as well as to identify a series of discursive dimensions present in the text, yielding a set of valuable conclusions for the understanding of this Kierkegaardian work and of the Danish philosopher's opinion of woman and what is proper to her. The results of the analysis are presented at the end of the paper in tables which compare A's and B's vocabulary and show their frequencies; other figures plot the behaviour of the key words against the most important discursive dimensions.

  17. Fast Algorithm for Computing the Discrete Hartley Transform of Type-II

    Directory of Open Access Journals (Sweden)

    Mounir Taha Hamood

    2016-06-01

    Full Text Available The generalized discrete Hartley transforms (GDHTs) have proved to be an efficient alternative to the generalized discrete Fourier transforms (GDFTs) for real-valued data applications. In this paper, the development of a direct-computation radix-2 decimation-in-time (DIT) algorithm for the fast calculation of the GDHT of type-II (DHT-II) is presented. The mathematical analysis and the implementation of the developed algorithm are derived, showing that this algorithm possesses a regular structure and can be implemented in-place for efficient memory utilization. The performance of the proposed algorithm is analyzed and the computational complexity is calculated for different transform lengths. A comparison between this algorithm and existing DHT-II algorithms shows that it can be considered a good compromise between structural and computational complexity.
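    For reference, the transform the fast algorithm accelerates can be computed directly in O(N^2). Under one common convention for the type-II transform (conventions and normalizations vary between authors), X[k] = sum_n x[n]*cas(2*pi*k*(n+1/2)/N) with cas(t) = cos(t) + sin(t); a radix-2 DIT implementation must reproduce this output.

      import math

      def cas(t):
          """Hartley kernel: cas(t) = cos(t) + sin(t)."""
          return math.cos(t) + math.sin(t)

      def dht2_direct(x):
          """Direct O(N^2) type-II DHT under one common convention
          (sign and normalization conventions vary between authors)."""
          N = len(x)
          return [sum(x[n] * cas(2.0 * math.pi * k * (n + 0.5) / N)
                      for n in range(N))
                  for k in range(N)]

      print(dht2_direct([1.0, 2.0, 3.0, 4.0]))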

  18. GSTARS computer models and their applications, Part II: Applications

    Science.gov (United States)

    Simoes, F.J.M.; Yang, C.T.

    2008-01-01

    In part 1 of this two-paper series, a brief summary of the basic concepts and theories used in developing the Generalized Stream Tube model for Alluvial River Simulation (GSTARS) computer models was presented. Part 2 provides examples that illustrate some of the capabilities of the GSTARS models and how they can be applied to solve a wide range of river and reservoir sedimentation problems. Laboratory and field case studies are used, and the examples show representative applications of the earlier and of the more recent versions of GSTARS. Some of the more recent capabilities implemented in GSTARS3, one of the latest versions of the series, are also discussed here in more detail. © 2008 International Research and Training Centre on Erosion and Sedimentation and the World Association for Sedimentation and Erosion Research.

  19. Computer Aided Design of Polygalacturonase II from Aspergillus niger

    Directory of Open Access Journals (Sweden)

    Ibrahim Ali Noorbatcha

    2011-12-01

    Full Text Available Pectin is a complex polysaccharide found in the cell walls of plants, consisting mainly of esterified D-galacturonic acid residues in an α-(1-4) chain. In fruit juice production, pectin contributes to juice viscosity, thereby reducing the juice yield and increasing the filtration time. Polygalacturonase improves the juice production process by rapidly degrading pectin. In this project we have designed a novel polygalacturonase enzyme using computer-aided design approaches. The three-dimensional structure of polygalacturonase is first modeled on the basis of the known crystal structure. The active site of this enzyme is identified by manual and automated docking methods. A Lamarckian genetic algorithm is used for automated docking, and the active site is validated by comparison with existing experimental data. This is followed by in silico mutations of the enzyme, and the automated docking process is repeated using the mutant enzymes. The strength of ligand binding inside the active site is evaluated by computing the binding score using the Potential of Mean Force (PMF) method. The in silico mutations R256Q and K258N are found to decrease the binding strength of the ligand at the active site, indicating lowered enzyme activity, which is consistent with the experimental results. Hence in silico mutations can be used to design new polygalacturonase enzymes with improved enzyme activity.

  20. Electromagnetic Systems Effects Database (EMSED). AERO 90, Phase II User's Manual

    National Research Council Canada - National Science Library

    Sawires, Kalim

    1998-01-01

    The Electromagnetic Systems Effects Database (EMSED), also called AIRBASE, is a training guide for users not familiar with the AIRBASE database and its operating platform, the Macintosh computer (Mac...

  1. User-driven integrated software lives: ``Paleomag'' paleomagnetics analysis on the Macintosh

    Science.gov (United States)

    Jones, Craig H.

    2002-12-01

    "PaleoMag," a paleomagnetics analysis package originally developed for the Macintosh operating system in 1988, allows examination of demagnetization of individual samples and analysis of directional data from collections of samples. Prior to recent reinvigorated development of the software for both Macintosh and Windows, it was widely used despite not running properly on machines and operating systems sold after 1995. This somewhat surprising situation demonstrates that there is a continued need for integrated analysis software within the earth sciences, in addition to well-developed scripting and batch-mode software. One distinct advantage of software like PaleoMag is in the ability to combine quality control with analysis within a unique graphical environment. Because such demands are frequent within the earth sciences, means of nurturing the development of similar software should be found.

  2. Computer Series, 115.

    Science.gov (United States)

    Birk, James P., Ed.

    1990-01-01

    Reviewed are six computer programs which may be useful in teaching college level chemistry. Topics include dynamic data storage in FORTRAN, "KC?DISCOVERER," pH of acids and bases, calculating percent boundary surfaces for orbitals, and laboratory interfacing with PT Nomograph for the Macintosh. (CW)

  3. A CAMAC-VME-Macintosh data acquisition system for nuclear experiments

    Science.gov (United States)

    Anzalone, A.; Giustolisi, F.

    1989-10-01

    A multiprocessor system for data acquisition and analysis in low-energy nuclear physics has been realized. The system is built around CAMAC, the VMEbus, and the Macintosh PC. Multiprocessor software has been developed, using RTF, MACsys, and CERN cross-software. The execution of several programs that run on several VME CPUs and on an external PC is coordinated by a mailbox protocol. No operating system is used on the VME CPUs. The hardware, software, and system performance are described.

  4. Retention of tracheal intubation skills by novice personnel: a comparison of the Airtraq and Macintosh laryngoscopes.

    LENUS (Irish Health Repository)

    Maharaj, C H

    2007-03-01

    Direct laryngoscopic tracheal intubation is a potentially lifesaving manoeuvre, but it is a difficult skill to acquire and to maintain. These difficulties are exacerbated if the opportunities to utilise this skill are infrequent, and by the fact that the consequences of poorly performed intubation attempts may be severe. Novice users find the Airtraq laryngoscope easier to use than the conventional Macintosh laryngoscope. We therefore wished to determine whether novice users would have greater retention of intubation skills with the Airtraq rather than the Macintosh laryngoscope. Twenty medical students who had no prior airway management experience participated in this study. Following brief didactic instruction, each took turns performing laryngoscopy and intubation using the Macintosh and Airtraq devices in easy and simulated difficult laryngoscopy scenarios. The degree of success with each device, the time taken to perform intubation and the assistance required, and the potential for complications were then assessed. Six months later, the assessment process was repeated. No didactic instruction or practice attempts were provided on this latter occasion. Tracheal intubation skills declined markedly with both devices. However, the Airtraq continued to provide better intubating conditions, resulting in greater success of intubation, with fewer optimisation manoeuvres required, and reduced potential for dental trauma, particularly in the difficult laryngoscopy scenarios. The substantial decline in direct laryngoscopy skills over time emphasise the need for continued reinforcement of this complex skill.

  5. Russian HyperTutor: Designing Interactive Multimedia for the Macintosh.

    Science.gov (United States)

    Mitrevski, George

    1995-01-01

    Describes an interactive, multimedia computer program designed to teach Russian grammar, and accompany a commercial textbook. Each of the 35 lessons integrates graphics, sound, and animation. A dictionary and extensive vocabulary exercises are also included. Tutorials provide simple but concise grammar explanations that the teacher can edit or…

  6. Computer control system of the cooler-synchrotron TARN-II

    International Nuclear Information System (INIS)

    Watanabe, S.; Watanabe, T.; Yoshizawa, M.; Katayama, T.

    1993-11-01

    The client-server model enables us to develop a flexible control system such as the TARN-II computer control system. The system forms a single machine including a message bus for communication between components. An auxiliary control path in the client-server model serves high-speed device control. The configuration and performance of the control system are described. (author)

  7. Design layout for gas monitoring system II, GMS-2, computer system

    International Nuclear Information System (INIS)

    Vo, V.; Philipp, B.L.; Manke, M.P.

    1995-01-01

    This document provides a general overview of the computer systems software that perform the data acquisition and control for the 241-SY-101 Gas Monitoring System II (GMS-2). It outlines the system layout, and contains descriptions of components and the functions they perform. The GMS-2 system was designed and implemented by Los Alamos National Laboratory and supplied to Westinghouse Hanford Company

  8. CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.

    Science.gov (United States)

    Skrein, Dale

    1994-01-01

    CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)

  9. THE IMPROVEMENT OF COMPUTER NETWORK PERFORMANCE WITH BANDWIDTH MANAGEMENT IN KEMURNIAN II SENIOR HIGH SCHOOL

    Directory of Open Access Journals (Sweden)

    Bayu Kanigoro

    2012-05-01

    Full Text Available This research describes the improvement of computer network performance with bandwidth management at Kemurnian II Senior High School. The main issue addressed is the absence of bandwidth division among computers: when one user is downloading data, that user absorbs all of the available bandwidth, leaving other users with none. In addition, IP address division by room (computer, teacher, and administration rooms) had already been done to support the learning process at Kemurnian II Senior High School, so a wireless network is needed. The method consisted of on-site observation and interviews with the related parties at Kemurnian II Senior High School, analysis of the running network, and design of a new topology including the wireless network, its configuration, and bandwidth separation and limitation on a MikroTik router. The result is that network traffic at Kemurnian II Senior High School can be shared evenly among users; IX and IIX traffic are separated, which improves network access speed at the school, as does the implementation of the wireless network. Keywords: Bandwidth Management; Wireless Network
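    The per-user cap at the heart of such a setup is conventionally a token-bucket limiter. The sketch below illustrates the mechanism generically; it is not the school's MikroTik configuration, and the rates are made up.

      import time

      class TokenBucket:
          """Generic token-bucket limiter: one instance per user, so a
          heavy downloader exhausts only their own budget."""
          def __init__(self, rate_bps, burst_bytes):
              self.rate = rate_bps / 8.0      # refill rate, bytes/second
              self.capacity = burst_bytes
              self.tokens = burst_bytes
              self.last = time.monotonic()

          def allow(self, nbytes):
              """Admit nbytes if budget remains, else drop or queue."""
              now = time.monotonic()
              self.tokens = min(self.capacity,
                                self.tokens + (now - self.last) * self.rate)
              self.last = now
              if nbytes <= self.tokens:
                  self.tokens -= nbytes
                  return True
              return False

      user = TokenBucket(rate_bps=2_000_000, burst_bytes=250_000)  # ~2 Mbit/s
      print(user.allow(100_000), user.allow(200_000))  # True False (no refill yet)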

  10. Iron(II) porphyrins induced conversion of nitrite into nitric oxide: A computational study.

    Science.gov (United States)

    Zhang, Ting Ting; Liu, Yong Dong; Zhong, Ru Gang

    2015-09-01

    Nitrite reduction to nitric oxide by heme proteins has been reported as a protective mechanism against hypoxic injury in mammalian physiology. In this study, the pathways of nitrite reduction to nitric oxide mediated by iron(II) porphyrin (P) complexes, which are generally recognized as models for heme proteins, were investigated using density functional theory (DFT). In view of the two isomers formed by the combination of nitrite and Fe(II)(P), the N-nitro and O-nitrito iron(II) porphyrin complexes, and the two binding sites of the proton on the different O atoms of the nitrite moiety, four main pathways for the conversion of nitrite into nitric oxide mediated by iron(II) porphyrins were proposed. The results indicate that the pathway from the N-bound Fe(II)(P)(NO2) isomer to Fe(III)(P)(NO) and water is similar to that from the O-bound isomer to nitric oxide and Fe(III)(P)(OH), in both thermodynamic and kinetic aspects. Based on the initial computational studies of five-coordinate nitrite complexes, the conversion of nitrite into NO mediated by Fe(II)(P)(L) complexes with 14 kinds of proximal ligands was also investigated. In general, the same conclusion, that the pathways of the N-bound isomers are similar to those of the O-bound isomers, was obtained for iron(II) porphyrins with ligands. Different effects of the ligands on the reduction reactions were also found. Notably, negative proximal ligands can improve the reactivity of N-nitro iron(II) porphyrins in the conversion of nitrite into nitric oxide compared to neutral ligands. These findings will help expand our understanding of the mechanism of nitrite reduction to nitric oxide by iron(II) porphyrins. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Empowering Middle School Teachers with Portable Computers.

    Science.gov (United States)

    Weast, Jerry D.; And Others

    1993-01-01

    A Sioux Falls (South Dakota) project that supplied middle school teachers with Macintosh computers and training to use them showed gratifying results. Easy access to portable notebook computers made teachers more active computer users, increased teacher interaction and collaboration, enhanced teacher productivity regarding management tasks and…

  12. The theoretical and computational models of the GASFLOW-II code

    International Nuclear Information System (INIS)

    Travis, J.R.

    1999-01-01

    GASFLOW-II is a finite-volume computer code that solves the time-dependent compressible Navier-Stokes equations for multiple gas species in a dispersed liquid water two-phase medium. The fluid-dynamics algorithm is coupled to the chemical kinetics of combusting gases to simulate diffusion or propagating flames in complex geometries of nuclear containments. GASFLOW-II is therefore able to predict gaseous distributions and thermal and pressure loads on containment structures and safety related equipment in the event combustion occurs. Current developments of GASFLOW-II are focused on hydrogen distribution, mitigation measures including carbon dioxide inerting, and possible combustion events in nuclear reactor containments. Fluid turbulence is calculated to enhance the transport and mixing of gases in rooms and volumes that may be connected by a ventilation system. Condensation, vaporization, and heat transfer to walls, floors, ceilings, internal structures, and within the fluid are calculated to model the appropriate mass and energy sinks. (author)

  13. Reproducing a Prospective Clinical Study as a Computational Retrospective Study in MIMIC-II.

    Science.gov (United States)

    Kury, Fabrício S P; Huser, Vojtech; Cimino, James J

    2015-01-01

    In this paper we sought to reproduce, as a computational retrospective study in an EHR database (MIMIC-II), a recent large prospective clinical study: the 2013 publication, by the Japanese Association for Acute Medicine (JAAM), about disseminated intravascular coagulation, in the journal Critical Care (PMID: 23787004). We designed in SQL and Java a set of electronic phenotypes that reproduced the study's data sampling, and used R to perform the same statistical inference procedures. All produced source code is available online at https://github.com/fabkury/paamia2015. Our program identified 2,257 eligible patients in MIMIC-II, and the results remarkably agreed with the prospective study. A minority of the needed data elements was not found in MIMIC-II, and statistically significant inferences were possible in the majority of the cases.
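
    The "electronic phenotype" approach, a cohort defined by queries against the database, can be sketched as follows; this is a hypothetical, simplified analogue in Python/sqlite3 with invented table and column names, not the study's actual SQL or Java (which is in the linked repository):

        import sqlite3

        # Hypothetical schema standing in for MIMIC-II tables (names are illustrative).
        con = sqlite3.connect(":memory:")
        con.executescript("""
            CREATE TABLE admissions (subject_id INT, hadm_id INT, age REAL);
            CREATE TABLE labevents  (hadm_id INT, item TEXT, value REAL);
            INSERT INTO admissions VALUES (1, 10, 67.0), (2, 11, 45.0);
            INSERT INTO labevents  VALUES (10, 'platelets', 80.0), (11, 'platelets', 250.0);
        """)

        # An "electronic phenotype": adults whose platelet count meets a DIC-style criterion.
        eligible = con.execute("""
            SELECT a.subject_id
            FROM admissions a JOIN labevents l ON l.hadm_id = a.hadm_id
            WHERE a.age >= 18 AND l.item = 'platelets' AND l.value < 100
        """).fetchall()
        print(eligible)   # -> [(1,)]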

  14. Computer aided systems human engineering: A hypermedia tool

    Science.gov (United States)

    Boff, Kenneth R.; Monk, Donald L.; Cody, William J.

    1992-01-01

    The Computer Aided Systems Human Engineering (CASHE) system, Version 1.0, is a multimedia ergonomics database on CD-ROM for the Apple Macintosh II computer, being developed for use by human system designers, educators, and researchers. It will initially be available on CD-ROM and will allow users to access ergonomics data and models stored electronically as text, graphics, and audio. The CASHE CD-ROM, Version 1.0 will contain the Boff and Lincoln (1988) Engineering Data Compendium, MIL-STD-1472D and a unique, interactive simulation capability, the Perception and Performance Prototyper. Its features also include a specialized data retrieval, scaling, and analysis capability and the state of the art in information retrieval, browsing, and navigation.

  15. Conformational effects on the circular dichroism of Human Carbonic Anhydrase II: a multilevel computational study.

    Directory of Open Access Journals (Sweden)

    Tatyana G Karabencheva-Christova

    Full Text Available Circular Dichroism (CD) spectroscopy is a powerful method for investigating conformational changes in proteins and therefore has numerous applications in structural and molecular biology. Here a computational investigation of the CD spectrum of Human Carbonic Anhydrase II (HCAII), with a main focus on the near-UV CD spectra of the wild-type enzyme and its seven tryptophan mutant forms, is presented and compared to experimental studies. Multilevel computational methods (Molecular Dynamics, Semiempirical Quantum Mechanics, Time-Dependent Density Functional Theory) were applied in order to gain insight into the mechanisms of interaction between the aromatic chromophores within the protein environment and to understand how the conformational flexibility of the protein influences these mechanisms. The analysis suggests that combining semiempirical CD calculations, crystal structures, and molecular dynamics (MD) could help achieve better agreement between computed and experimental protein spectra and provide unique insight into the dynamic nature of the mechanisms of chromophore interactions.

  16. Tracheal intubation by inexperienced medical residents using the Airtraq and Macintosh laryngoscopes--a manikin study.

    LENUS (Irish Health Repository)

    Maharaj, Chrisen H

    2006-11-01

    The Airtraq laryngoscope is a novel intubation device that may possess advantages over conventional direct laryngoscopes when used by personnel who are infrequently required to perform tracheal intubation. We conducted a prospective study in 20 medical residents with little prior airway management experience. After brief didactic instruction, each participant took turns performing laryngoscopy and intubation using the Macintosh (Welch Allyn, NY) and Airtraq (Prodol Ltd., Vizcaya, Spain) devices, in 3 laryngoscopy scenarios in a Laerdal Intubation Trainer (Laerdal, Stavanger, Norway) and 1 scenario in a Laerdal SimMan manikin (Laerdal, Kent, UK). They then performed tracheal intubation of the normal airway a second time to characterize the learning curve. In all scenarios tested, the Airtraq decreased the duration of intubation attempts, reduced the number of optimization maneuvers required, and reduced the potential for dental trauma. The residents found the Airtraq easier to use in all scenarios compared with the Macintosh laryngoscope. The Airtraq may constitute a superior device for use by personnel infrequently required to perform tracheal intubation.

  17. SASSYS-1 computer code verification with EBR-II test data

    International Nuclear Information System (INIS)

    Warinner, D.K.; Dunn, F.E.

    1985-01-01

    The EBR-II natural circulation experiment, XX08 Test 8A, is simulated with the SASSYS-1 computer code and the results for the latter are compared with published data taken during the transient at selected points in the core. The SASSYS-1 results provide transient temperature and flow responses for all points of interest simultaneously during one run, once such basic parameters as pipe sizes, initial core flows, and elevations are specified. The SASSYS-1 simulation results for the EBR-II experiment XX08 Test 8A, conducted in March 1979, are within the published plant data uncertainties and, thereby, serve as a partial verification/validation of the SASSYS-1 code

  18. HYDRA-II: A hydrothermal analysis computer code: Volume 1, Equations and numerics

    International Nuclear Information System (INIS)

    McCann, R.A.

    1987-04-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in Cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the Cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. This volume, Volume I - Equations and Numerics, describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. The final volume, Volume III - Verification/Validation Assessments, presents results of numerical simulations of single- and multiassembly storage systems and comparisons with experimental data. 4 refs
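
    As a generic illustration of the finite-difference approach (not HYDRA-II's actual discretization), a one-dimensional explicit conduction solver can be written as:

        import numpy as np

        # Explicit 1-D heat conduction: dT/dt = alpha * d2T/dx2
        alpha, L, n = 1e-5, 0.1, 21            # diffusivity (m^2/s), length (m), nodes
        dx = L / (n - 1)
        dt = 0.4 * dx**2 / alpha               # respects the explicit stability limit
        T = np.full(n, 300.0)                  # initial temperature field (K)
        T[0], T[-1] = 400.0, 300.0             # fixed-temperature boundaries

        for _ in range(2000):
            T[1:-1] += alpha * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])

        print(T.round(1))                      # approaches the linear steady profile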

  19. Implementing of AMPX-II system for a univac computer neutron cross-section libraries

    International Nuclear Information System (INIS)

    Sancho, J.; Verdu, G.; Serradell, V.

    1984-01-01

    The AMPX-II system, developed at ORNL, is a modular set of computer programs for the generation and handling of several nuclear data libraries; processing starts from the ENDF/B library. This paper refers mainly to the modules related to neutron cross-section libraries: master, working, and weighted. These modules have recently been implemented on a UNIVAC 1100/60 computer at the Universidad Politecnica de Valencia (Spain). In order to run the programs on that machine it was necessary to introduce a number of modifications into their programming structure. The main difficulties encountered in this work and the need for verification of the new versions are also pointed out. We also present the results obtained from the execution of a set of small sample problems. (author)

  20. Tracheal intubation in patients with cervical spine immobilization: a comparison of the Airwayscope, LMA CTrach, and the Macintosh laryngoscopes.

    LENUS (Irish Health Repository)

    Malik, M A

    2009-05-01

    The purpose of this study was to evaluate the effectiveness of the Pentax AWS, and the LMA CTrach, in comparison with the Macintosh laryngoscope, when performing tracheal intubation in patients with neck immobilization using manual in-line axial cervical spine stabilization.

  1. Comparison of Macintosh, Truview EVO2, Glidescope, and Airwayscope laryngoscope use in patients with cervical spine immobilization.

    LENUS (Irish Health Repository)

    Malik, M A

    2008-11-01

    The purpose of this study was to evaluate the effectiveness of the Pentax AWS, Glidescope, and the Truview EVO2, in comparison with the Macintosh laryngoscope, when performing tracheal intubation in patients with neck immobilization using manual in-line axial cervical spine stabilization.

  2. Emergency Response Equipment and Related Training: Airborne Radiological Computer System (Model II)

    Energy Technology Data Exchange (ETDEWEB)

    David P. Colton

    2007-02-28

    The materials included in the Airborne Radiological Computer System, Model-II (ARCS-II) were assembled with several considerations in mind. First, the system was designed to measure and record airborne gamma radiation levels and the corresponding latitude and longitude coordinates, and to provide a first overview of the extent and severity of an accident's impact. Second, the portable system had to be light enough and durable enough that it could be mounted in an aircraft, ground vehicle, or watercraft. Third, the system must control the collection and storage of the data, as well as provide a real-time display of the data collection results to the operator. The notebook computer and color graphics printer components of the system would only be used for analyzing and plotting the data. In essence, the provided equipment is composed of an acquisition system and an analysis system. The data can be transferred from the acquisition system to the analysis system at the end of the data collection or at some other agreeable time.

  3. Emergency Response Equipment and Related Training: Airborne Radiological Computer System (Model II) user's manual

    International Nuclear Information System (INIS)

    David P. Colton

    2007-01-01

    The materials included in the Airborne Radiological Computer System, Model-II (ARCS-II) were assembled with several considerations in mind. First, the system was designed to measure and record airborne gamma radiation levels and the corresponding latitude and longitude coordinates, and to provide a first overview of the extent and severity of an accident's impact. Second, the portable system had to be light enough and durable enough that it could be mounted in an aircraft, ground vehicle, or watercraft. Third, the system must control the collection and storage of the data, as well as provide a real-time display of the data collection results to the operator. The notebook computer and color graphics printer components of the system would only be used for analyzing and plotting the data. In essence, the provided equipment is composed of an acquisition system and an analysis system. The data can be transferred from the acquisition system to the analysis system at the end of the data collection or at some other agreeable time

  4. Interactive graphics for the Macintosh: software review of FlexiGraphs.

    Science.gov (United States)

    Antonak, R F

    1990-01-01

    While this product is clearly unique, its usefulness to individuals outside small business environments is somewhat limited. FlexiGraphs is, however, a reasonable first attempt to design a microcomputer software package that controls data through interactive editing within a graph. Although the graphics capabilities of mainframe programs such as MINITAB (Ryan, Joiner, & Ryan, 1981) and the graphic manipulations available through exploratory data analysis (e.g., Velleman & Hoaglin, 1981) will not be surpassed anytime soon by this program, a researcher may want to add it to a software library containing other Macintosh statistics, drawing, and graphics programs, if only for its easy curve-fitting and line-smoothing options. I welcome the opportunity to review the enhanced "scientific" version of FlexiGraphs that the author of the program indicates is currently under development. An MS-DOS version of the program should be available within the year.

  5. Prescriptions for schedule II opioids and benzodiazepines increase after the introduction of computer-generated prescriptions.

    Science.gov (United States)

    McGerald, Genevieve; Dvorkin, Ronald; Levy, David; Lovell-Rose, Stephanie; Sharma, Adhi

    2009-06-01

    Prescriptions for controlled substances decrease when regulatory barriers are put in place. The converse has not been studied. The objective was to determine whether a less complicated prescription writing process is associated with a change in the prescribing patterns of controlled substances in the emergency department (ED). The authors conducted a retrospective nonconcurrent cohort study of all patients seen in an adult ED between April 19, 2005, and April 18, 2007, who were discharged with a prescription. Prior to April 19, 2006, a specialized prescription form stored in a locked cabinet was obtained from the nursing staff to write a prescription for benzodiazepines or Schedule II opioids. After April 19, 2006, New York State mandated that all prescriptions, regardless of schedule classification, be generated on a specialized bar-coded prescription form. The main outcome of the study was to compare the proportion of Schedule III-V opioids to Schedule II opioids and benzodiazepines prescribed in the ED before and after the introduction of a less cumbersome prescription writing process. Of the 26,638 charts reviewed, 2.1% of the total number of prescriptions generated were for a Schedule II controlled opioid before the new system was implemented compared to 13.6% after (odds ratio [OR] = 7.3, 95% confidence interval [CI] = 6.4 to 8.4). The corresponding percentages for Schedule III-V opioids were 29.9% to 18.1% (OR = 0.52, 95% CI = 0.49 to 0.55) and for benzodiazepines 1.4% to 3.9% (OR = 2.8, 95% CI = 2.4 to 3.4). Patients were more likely to receive a prescription for a Schedule II opioid or a benzodiazepine after a more streamlined computer-generated prescription writing process was introduced in this ED. (c) 2009 by the Society for Academic Emergency Medicine.
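
    The odds ratios reported above follow from a standard 2x2 table computation with a Wald confidence interval; a minimal Python sketch (the counts below are invented, chosen only to echo the reported proportions):

        import math

        def odds_ratio(a, b, c, d):
            """OR for exposed (a events, b non-events) vs unexposed (c, d),
            with a 95% Wald confidence interval on the log scale."""
            or_ = (a / b) / (c / d)
            se = math.sqrt(1/a + 1/b + 1/c + 1/d)          # SE of log(OR)
            lo, hi = (math.exp(math.log(or_) + z * se) for z in (-1.96, 1.96))
            return or_, lo, hi

        # Illustrative counts: Schedule II scripts after vs before the form change.
        print(odds_ratio(a=1360, b=8640, c=210, d=9790))   # OR close to 7.3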

  6. A Comparison of Macintosh and Airtraq Laryngoscopes for Endotracheal Intubation in Adult Patients With Cervical Spine Immobilization Using Manual In Line Axial Stabilization: A Prospective Randomized Study.

    Science.gov (United States)

    Vijayakumar, Vinodhadevi; Rao, Shwethapriya; Shetty, Nanda

    2016-10-01

    During cervical spine immobilization using Manual In Line Axial Stabilization (MILS), it is difficult to visualize the larynx by aligning the oro-pharyngo-laryngeal axes with the Macintosh laryngoscope. Theoretically, the Airtraq, an anatomically shaped blade with an endotracheal tube guide channel, offers an advantage over the Macintosh. We hypothesized that intubation would be easier and faster with the Airtraq than with the Macintosh laryngoscope. Ninety anesthetized adult patients with normal airways were intubated by experienced anesthesiologists after cervical immobilization with MILS, using either the Macintosh or the Airtraq. The primary outcomes compared were successful intubation and the degree of difficulty of intubation as assessed by the Intubation Difficulty Scale (IDS) score. The secondary outcomes compared were the duration of laryngoscopy and intubation, the degree of difficulty of intubation as assessed by a Numerical Rating Scale score, and soft tissue and dental trauma. All 90 patients were successfully intubated on the first attempt. Intubation as assessed by the IDS score was easy in the Airtraq group (84.44%), in contrast to slightly difficult in the Macintosh group (77.78%); the Numerical Rating Scale score was easy in both groups (Airtraq 91.12%; Macintosh 93.34%). The median (interquartile range [IQR]) time for laryngoscopy (12 s [IQR, 8 to 17.5] vs. 8 s [IQR, 6 to 12]) and the total duration of intubation (25 s [IQR, 20-33] vs. 22 s [IQR, 18-27.5]) were prolonged in the Airtraq group in comparison to the Macintosh group. In anesthetized adult patients with MILS, the Airtraq provides a success rate of intubation equal to the Macintosh, with a statistically significant (although clinically insignificant) longer duration of laryngoscopy and intubation. Intubation with the Airtraq was significantly easier than with the Macintosh as assessed by the IDS score.

  7. Formulation, computation and improvement of steady state security margins in power systems. Part II: Results

    International Nuclear Information System (INIS)

    Echavarren, F.M.; Lobato, E.; Rouco, L.; Gomez, T.

    2011-01-01

    A steady state security margin for a particular operating point can be defined as the distance from this initial point to the secure operating limits of the system. Four of the most used steady state security margins are the power flow feasibility margin, the contingency feasibility margin, the load margin to voltage collapse, and the total transfer capability between system areas. This is the second part of a two part paper. Part I has proposed a novel framework of a general model able to formulate, compute and improve any steady state security margin. In Part II the performance of the general model is validated by solving a variety of practical situations in modern real power systems. Actual examples of the Spanish power system will be used for this purpose. The same computation and improvement algorithms outlined in Part I have been applied for the four security margins considered in the study, outlining the convenience of defining a general framework valid for the four of them. The general model is used here in Part II to compute and improve: (a) the power flow feasibility margin (assessing the influence of the reactive power generation limits in the Spanish power system), (b) the contingency feasibility margin (assessing the influence of transmission and generation capacity in maintaining a correct voltage profile), (c) the load margin to voltage collapse (assessing the location and quantity of loads that must be shed in order to be far away from voltage collapse) and (d) the total transfer capability (assessing the export import pattern of electric power between different areas of the Spanish system). (author)

  8. Formulation, computation and improvement of steady state security margins in power systems. Part II: Results

    Energy Technology Data Exchange (ETDEWEB)

    Echavarren, F.M.; Lobato, E.; Rouco, L.; Gomez, T. [School of Engineering of Universidad Pontificia Comillas, C/Alberto Aguilera, 23, 28015 Madrid (Spain)

    2011-02-15

    A steady state security margin for a particular operating point can be defined as the distance from this initial point to the secure operating limits of the system. Four of the most used steady state security margins are the power flow feasibility margin, the contingency feasibility margin, the load margin to voltage collapse, and the total transfer capability between system areas. This is the second part of a two part paper. Part I has proposed a novel framework of a general model able to formulate, compute and improve any steady state security margin. In Part II the performance of the general model is validated by solving a variety of practical situations in modern real power systems. Actual examples of the Spanish power system will be used for this purpose. The same computation and improvement algorithms outlined in Part I have been applied for the four security margins considered in the study, outlining the convenience of defining a general framework valid for the four of them. The general model is used here in Part II to compute and improve: (a) the power flow feasibility margin (assessing the influence of the reactive power generation limits in the Spanish power system), (b) the contingency feasibility margin (assessing the influence of transmission and generation capacity in maintaining a correct voltage profile), (c) the load margin to voltage collapse (assessing the location and quantity of loads that must be shed in order to be far away from voltage collapse) and (d) the total transfer capability (assessing the export import pattern of electric power between different areas of the Spanish system). (author)
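
    In essence, a load margin of this kind is the largest scaling of demand for which the system equations remain solvable. A toy illustration, a two-bus feasibility test with bisection on the loading factor, not the paper's general formulation:

        def feasible(p_load, x_line=0.5):
            """Toy two-bus test: does the PV-curve equation V^4 - V^2 + (P*X)^2 = 0
            still have a real solution at this load level (source at 1.0 pu)?"""
            return 1.0 - 4.0 * (p_load * x_line) ** 2 >= 0.0

        def load_margin(p0, lo=1.0, hi=10.0, tol=1e-6):
            """Bisection on the load scaling factor: largest k with feasible(k * p0)."""
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                lo, hi = (mid, hi) if feasible(mid * p0) else (lo, mid)
            return lo

        # Analytic maximum load is 1/(2*X) = 1.0 pu, so k = 2.5 at p0 = 0.4.
        print(round(load_margin(p0=0.4), 4))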

  9. HAlign-II: efficient ultra-large multiple sequence alignment and phylogenetic tree reconstruction with distributed and parallel computing.

    Science.gov (United States)

    Wan, Shixiang; Zou, Quan

    2017-01-01

    Multiple sequence alignment (MSA) plays a key role in biological sequence analyses, especially in phylogenetic tree construction. The extreme increase in next-generation sequencing has resulted in a shortage of efficient ultra-large biological sequence alignment approaches able to cope with different sequence types. Distributed and parallel computing represents a crucial technique for accelerating ultra-large (e.g. files of more than 1 GB) sequence analyses. Based on HAlign and the Spark distributed computing system, we implement a highly cost-efficient and time-efficient HAlign-II tool to address ultra-large multiple biological sequence alignment and phylogenetic tree construction. Experiments on large-scale DNA and protein data sets, with files of more than 1 GB, showed that HAlign-II saves both time and space and outperforms current software tools. HAlign-II can efficiently carry out MSA and construct phylogenetic trees with ultra-large numbers of biological sequences, shows extremely high memory efficiency, and scales well with increases in computing resources. HAlign-II provides a user-friendly web server based on our distributed computing infrastructure. HAlign-II, with open-source code and datasets, is available at http://lab.malab.cn/soft/halign.
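
    The divide-and-parallelize strategy behind such tools can be illustrated at toy scale with Python's multiprocessing: partition the pairwise comparisons across workers (Hamming distance stands in here for a real alignment score):

        from itertools import combinations
        from multiprocessing import Pool

        def pair_distance(pair):
            a, b = pair
            return sum(x != y for x, y in zip(a, b))   # toy stand-in for alignment

        if __name__ == "__main__":
            seqs = ["ACGTACGT", "ACGTTCGT", "ACGAACGA", "TCGTACGT"]
            with Pool() as pool:
                dists = pool.map(pair_distance, combinations(seqs, 2))
            print(dists)   # one entry per sequence pair, computed across workers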

  10. TRACER-II: a complete computational model for mixing and propagation of vapor explosions

    Energy Technology Data Exchange (ETDEWEB)

    Bang, K.H. [School of Mechanical Engineering, Korea Maritime Univ., Pusan (Korea, Republic of); Park, I.G.; Park, G.C.

    1998-01-01

    A vapor explosion is a physical process in which very rapid energy transfer occurs between a hot liquid and a volatile, colder liquid when the two liquids come into sudden contact. For the analysis of potential impacts from such explosive events, a computer program, TRACER-II, has been developed, which contains a complete description of the mixing and propagation phases of vapor explosions. The model consists of fuel, fragmented fuel (debris), coolant liquid, and coolant vapor in two-dimensional Eulerian coordinates. The set of governing equations is solved numerically using the finite difference method. The results of numerical simulations of vapor explosions are discussed in comparison with recent experimental data from the FARO and KROTOS tests. When compared to selected FARO and KROTOS data, the fuel-coolant mixing and explosion propagation behavior agree reasonably with the data, although the results remain sensitive primarily to the melt breakup and fragmentation modeling. (author)

  11. User's manual for computer code RIBD-II, a fission product inventory code

    International Nuclear Information System (INIS)

    Marr, D.R.

    1975-01-01

    The computer code RIBD-II is used to calculate inventories, activities, decay powers, and energy releases for the fission products generated in a fuel irradiation. Changes from the earlier RIBD code are: the expansion to include up to 850 fission product isotopes, input in the user-oriented NAMELIST format, and run-time choice of fuels from an extensively enlarged library of nuclear data. The library that is included in the code package contains yield data for 818 fission product isotopes for each of fourteen different fissionable isotopes, together with fission product transmutation cross sections for fast and thermal systems. Calculational algorithms are little changed from those in RIBD. (U.S.)
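
    Inventory codes of this kind integrate coupled production and decay equations. A minimal sketch for a two-member decay chain, solved with a matrix exponential (the decay constants are illustrative, not RIBD-II's library data):

        import numpy as np
        from scipy.linalg import expm

        # dN/dt = A N for a chain: parent -> daughter -> (stable)
        lam1, lam2 = 2.9e-2, 1.1e-4            # decay constants (1/s), illustrative
        A = np.array([[-lam1, 0.0],
                      [ lam1, -lam2]])
        N0 = np.array([1.0e20, 0.0])           # initial atom counts

        N = expm(A * 3600.0) @ N0              # inventory after one hour
        activity = np.array([lam1, lam2]) * N  # decays per second per nuclide
        print(N, activity)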

  12. [Vision test program for ophthalmologists on Apple II, IIe and IIc computers].

    Science.gov (United States)

    Huber, C

    1985-03-01

    A microcomputer program for the Apple II family of computers on a monochrome and a color screen is described. The program draws most of the tests used by ophthalmologists and is offered as an alternative to a projector system. One advantage of the electronic generation of drawings is that true random orientation of Pflueger's E is possible. Tests are included for visual acuity (Pflueger's E, Landolt rings, numbers and children's drawings). Colored tests include a duochrome test, simple color vision tests, a fixation help with a musical background, a cobalt blue test and a Worth figure. In the astigmatic dial a mobile pointer helps to determine the axis. New tests can be programmed by the user and exchanged on disks among colleagues.
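
    The "true random orientation" the program exploits is easy to generate in software. A minimal sketch of such a testing loop, written here in Python rather than in the Apple II original, with an illustrative pass criterion:

        import random

        ORIENTATIONS = ("up", "down", "left", "right")

        def run_line(n=5, threshold=3, respond=None):
            """Present n tumbling-E optotypes in truly random orientations and
            count correct responses; the line is passed at >= threshold correct."""
            correct = 0
            for _ in range(n):
                shown = random.choice(ORIENTATIONS)   # random each time: no memorization
                answer = respond(shown) if respond else input(f"Orientation {ORIENTATIONS}? ")
                correct += (answer == shown)
            return correct >= threshold

        # Simulated patient who reads the optotype correctly 80% of the time.
        def flaky_reader(shown):
            return shown if random.random() < 0.8 else random.choice(ORIENTATIONS)

        print(run_line(respond=flaky_reader))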

  13. Battle Staff Training System II: Computer-Based Instruction Supporting the Force XXI Training Program

    National Research Council Canada - National Science Library

    Wampler, Richard

    1998-01-01

    This report documents the methodology and lessons learned in the development of the Innovative Tools and Techniques for Brigade and Below Staff Training II - Battle Staff Training System II (ITTBBST-BSTS II...

  14. Computer code PRECIP-II for the calculation of Zr-steam reaction

    International Nuclear Information System (INIS)

    Suzuki, Motoye; Kawasaki, Satoru; Furuta, Teruo

    1978-06-01

    The computer code PRECIP-II, a modification of S. Malang's SIMTRAN-I, was developed to calculate the Zr-steam reaction under LOCA conditions. The following were improved: 1. the treatment of boundary conditions at the alpha/beta phase interface during temperature decrease; 2. the method of time-mesh control; 3. the number of input-controllable parameters, and the output format. These improvements made physically reasonable calculations possible for an increased number of temperature history patterns, including the cladding temperature excursion assumed during a LOCA. Calculations were made along various transient temperature histories, with the parameters modified so that the numerical results for weight gain, oxide thickness and alpha-phase thickness in isothermal reactions fit the experimental data. The computed results were then compared with the corresponding experimental values, which revealed that most of the differences lie within +-10%. The effect of slow cooling on the ductility change of Zircaloy-4 was investigated with some of the oxidized specimens by a ring compression test; the effect is only slight. (auth.)

  15. Click! 101 Computer Activities and Art Projects for Kids and Grown-Ups.

    Science.gov (United States)

    Bundesen, Lynne; And Others

    This book presents 101 computer activities and projects geared toward children and adults. The activities for both personal computers (PCs) and Macintosh were developed on the Windows 95 computer operating system, but they are adaptable to non-Windows personal computers as well. The book is divided into two parts. The first part provides an…

  16. Computational studies of a paramagnetic planar dibenzotetraaza[14]annulene Ni(II) complex.

    Science.gov (United States)

    Rabaâ, Hassan; Khaledi, Hamid; Olmstead, Marilyn M; Sundholm, Dage

    2015-05-28

    A square-planar Ni(II) dibenzotetraaza[14]annulene complex substituted with two 3,3-dimethylindolenine groups in the meso positions has recently been synthesized and characterized experimentally. In the solid state, the Ni(II) complex forms linear π-interacting stacks with Ni···Ni separations of 3.448(2) Å. Measurements of the temperature dependence of the magnetic susceptibility revealed a drastic change in the magnetic properties at a temperature of 13 K, indicating a transition from low- to high-spin states. The molecular structures of the free-base ligand and of the lowest singlet and triplet states of the monomer and the dimer of the Ni complex have been studied computationally using density functional theory (DFT) and ab initio correlated levels of theory. In calculations at the second-order Møller-Plesset (MP2) perturbation theory level, a large energy of 260 kcal mol(-1) was obtained for the singlet-triplet splitting, suggesting that an alternative explanation of the observed magnetic properties is needed. The large energy splitting between the singlet and triplet states suggests that the observed change in the magnetism at very low temperatures is due to spin-orbit coupling effects originating from weak interactions between the fine-structure states of the Ni cations in the complex. The lowest electronic excitation energies of the dibenzotetraaza[14]annulene Ni(II) complex calculated at the time-dependent density functional theory (TDDFT) level are in good agreement with values deduced from the experimental UV-vis spectrum. Calculations at the second-order algebraic-diagrammatic construction (ADC(2)) level on the dimer of the meso-substituted 3,3-dimethylindolenine dibenzotetraaza[14]annulene Ni(II) complex yielded Stokes shifts of 85-100 nm for the lowest excited singlet states. Calculations of the strength of the magnetically induced ring current for the free-base 3,3-dimethylindolenine-substituted dibenzotetraaza[14]annulene show that the annulene
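
    The force of the argument that a 260 kcal/mol gap cannot explain a 13 K transition is easy to check numerically with a two-level Boltzmann model (a deliberate simplification, not the paper's spin-orbit treatment):

        import math

        def triplet_fraction(gap_kcal_per_mol, T):
            """Two-level model: thermal population of a triplet (g=3) above a singlet (g=1)."""
            R = 1.987e-3                     # gas constant, kcal/(mol*K)
            w = 3.0 * math.exp(-gap_kcal_per_mol / (R * T))
            return w / (1.0 + w)

        print(triplet_fraction(260.0, 13.0))    # ~0: the gap is far too large
        print(triplet_fraction(0.05, 13.0))     # appreciable only for a tiny gap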

  17. So, you are buying your first computer.

    Science.gov (United States)

    Ferrara-Love, R

    1999-06-01

    Buying your first computer need not be that complicated. The first thing needed is an understanding of what you want and need the computer for. By making a list of the various essentials, you will be on your way to purchasing that computer. Once that is completed, you will need an understanding of what each of the components of the computer is, how it works, and what options you have. This way, you will be better able to discuss your needs with the salesperson. The focus of this article is limited to personal computers or PCs (i.e., IBMs [Armonk, NY], IBM clones, Compaq [Houston, TX], Gateway [North Sioux City, SD], and so on). I am not including Macintosh or Apple [Cupertino, CA] computers in this discussion; most software is made exclusively for personal computers, or at least reaches the market for personal computers before becoming available in a Macintosh version.

  18. A comparison of the Glidescope, Pentax AWS, and Macintosh laryngoscopes when used by novice personnel: a manikin study.

    LENUS (Irish Health Repository)

    Malik, Muhammad A

    2009-11-01

    Direct laryngoscopic tracheal intubation is a potentially lifesaving procedure, but a difficult skill to acquire and maintain. The consequences of poorly performed intubation attempts are potentially severe. The Pentax AWS and the Glidescope are indirect laryngoscopes that may require less skill to use. We therefore hypothesized that AWS and Glidescope would prove superior to the Macintosh laryngoscope when used by novices in the normal and simulated difficult airway.

  19. A Computer-Aided Writing Program for Learning Disabled Adolescents.

    Science.gov (United States)

    Fais, Laurie; Wanderman, Richard

    The paper describes the application of a computer-assisted writing program in a special high school for learning disabled and dyslexic students and reports on a study of the program's effectiveness. Particular advantages of the Macintosh Computer for such a program are identified including use of the mouse pointing tool, graphic icons to identify…

  20. Computer code for the thermal-hydraulic analysis of ITU TRIGA Mark-II reactor

    International Nuclear Information System (INIS)

    Ustun, G.; Durmayaz, A.

    2002-01-01

    The Istanbul Technical University (ITU) TRIGA Mark-II reactor core consists of ninety vertical cylindrical elements located in five rings, sixty-nine of which are fuel elements. The reactor is operated and cooled by natural convection of the pool water, which is itself cooled and purified in external coolant circuits by forced convection. This characteristic leads to consideration of both natural and forced convection heat transfer in a 'porous-medium analysis'. The safety analysis of the reactor requires a thermal-hydraulic model to determine the thermal-hydraulic parameters in each mode of operation. In this study, a computer code called TRIGA-PM (TRIGA - Porous Medium) has been developed for the thermal-hydraulic analysis of the ITU TRIGA Mark-II reactor, to obtain velocity, pressure and temperature distributions in the reactor pool as a function of core design parameters and pool configuration. The code is a transient thermal-hydraulic code and requires geometric and physical modelling parameters. In the model, although the core is treated entirely as a porous medium, the rest of the reactor pool is considered partly as a continuum and partly as a porous medium. The COMMIX-1C code is used to benchmark TRIGA-PM. For normal operating conditions of the reactor, the estimates of TRIGA-PM are in good agreement with those of COMMIX-1C. After some further improvements, the code will be employed for the estimation of a LOCA scenario, which cannot be analyzed by COMMIX-1C or other multi-purpose codes, considering a break in one of the beam tubes of the reactor

  1. Endotracheal Intubation Using the Macintosh Laryngoscope or KingVision Video Laryngoscope during Uninterrupted Chest Compression

    Directory of Open Access Journals (Sweden)

    Ewelina Gaszynska

    2014-01-01

    Full Text Available Objective. Advanced airway management, i.e. endotracheal intubation (ETI), during CPR is more difficult than, for example, during anesthesia. However, new devices such as video laryngoscopes should help in such circumstances. The aim of this study was to assess the performance of the KingVision video laryngoscope in a manikin cardiopulmonary resuscitation (CPR) scenario. Methods. Thirty students enrolled in the third year of paramedic school took part in the study. The simulated CPR scenario was ETI using the standard laryngoscope with a Macintosh blade (MCL) and ETI using the KingVision video laryngoscope, both performed during uninterrupted chest compressions. The primary endpoints were the time needed for ETI and the success ratio. Results. The mean time required for intubation was similar for both laryngoscopes: 16.6 seconds (SD 5.11, median 15.64, range 7.9-27.9) versus 17.91 seconds (SD 5.6, median 16.28, range 10.6-28.6) for the MCL and KingVision, respectively (P=0.1888). The success rate on the first ETI attempt during CPR was also comparable between the evaluated laryngoscopes (P=0.9032). Conclusion. The KingVision video laryngoscope proved no better than the standard laryngoscope with a Macintosh blade when used for endotracheal intubation during CPR, neither shortening the time needed for ETI nor increasing the success ratio.

  2. Effect of Jigsaw II, Reading-Writing-Presentation, and Computer Animations on the Teaching of "Light" Unit

    Science.gov (United States)

    Koç, Yasemin; Yildiz, Emre; Çaliklar, Seyma; Simsek, Ümit

    2016-01-01

    The aim of this study is to determine the effect of Jigsaw II technique, reading-writing-presentation method, and computer animation on students' academic achievements, epistemological beliefs, attitudes towards science lesson, and the retention of knowledge in the "Light" unit covered in the 7th grade. The sample of the study consists…

  3. Computational analysis of neutronic parameters of CENM TRIGA Mark II research reactor

    International Nuclear Information System (INIS)

    El Younoussi, C.; El Bakkari, B.; Boulaich, Y.; Riyach, D.; Otmani, S.; Marrhich, I.; Badri, H.; Htet, A.; Nacir, B.; El Bardouni, T.; Boukhal, H.; Zoubair, M.; Ossama, M.; Chakir, E.

    2010-01-01

    The CENM TRIGA MARK II reactor is part of the National Center for Energy, Sciences and Nuclear Techniques (CNESTEN). It is a standard-design 2 MW, natural-convection-cooled reactor with a graphite reflector containing 4 beam tubes and a thermal column. The reactor has several applications in different fields such as industry, agriculture, medicine, training and education. In the present work a computational study has been carried out in the framework of neutronic parameter studies of the reactor. A detailed MCNP model that includes all elements of the core and the surrounding structures has been developed to calculate different parameters of the core (the effective multiplication factor and the reactivity experiments comprising control rod worths, excess reactivity and shutdown margin). Further calculations have been carried out to obtain the neutron flux profiles at different locations of the reactor core. The cross sections used are processed from the library provided with MCNP5 and based on ENDF/B-VII, with continuous dependence in energy and special treatment of thermal neutrons in lightweight materials. (author)

  4. Satzarten unterscheiden - Kann das der Computer? Syntaktische Explorationen anhand von COSMAS II

    Directory of Open Access Journals (Sweden)

    Näf, Anton

    2006-01-01

    Full Text Available Is the computer capable of recognizing different sentence types in a linguistic corpus such as COSMAS II (Mannheim), which has not previously been processed by a tagger or a parser? The answer is in fact no. However, the present article shows that under certain circumstances an automatic distinction is nevertheless possible. Making use of a procedure that we have called Anfragezuspitzung (literally: making a query pointed; encirclement of a grammatical phenomenon by a combination of several specific queries), and taking as a starting point philological prior knowledge gathered "by hand", it proves to be perfectly possible to arrive at a satisfactory result. Using the example of sentence types in German, in particular the distinction between interrogative and exclamatory sentences, we demonstrate that such a distinction can be carried out automatically with a high degree of accuracy, e.g. the distinction between War das eine gute Idee? (Was this a good idea?) and War das eine gute Idee! (What a good idea this was!).

  5. Investigation of mixed mode - I/II fracture problems - Part 1: computational and experimental analyses

    Directory of Open Access Journals (Sweden)

    O. Demir

    2016-01-01

    Full Text Available In this study, to properly investigate and understand the nature of fracture behavior under in-plane mixed mode (Mode-I/II) loading, three-dimensional fracture analyses and experiments on the compact tension shear (CTS) specimen are performed under different mixed-mode loading conditions. Al 7075-T651 aluminum machined from rolled plates in the L-T rolling direction (crack plane perpendicular to the rolling direction) is used in this study. Results from the finite element analyses, and the fracture loads and crack deflection angles obtained from the experiments, are presented. To simulate the real conditions of the experiments, contacts are defined between the contact surfaces of the loading devices, the specimen and the loading pins. Modeling, meshing and solution of the problem involving the whole assembly, i.e., loading devices, pins and specimen, with contact mechanics are performed using ANSYS. The CTS specimen is then analyzed separately using a submodeling approach, in which three-dimensional enriched finite elements are used in the FRAC3D solver to calculate the resulting stress intensity factors along the crack front. Having performed the detailed computational and experimental studies on the CTS specimen, a new specimen type, together with its loading device, is also proposed that has smaller dimensions compared to the regular CTS specimen. Experimental results for the new specimen are also presented.

  6. IUE Data Analysis Software for Personal Computers

    Science.gov (United States)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  7. C-MAC videolaryngoscope versus Macintosh laryngoscope for tracheal intubation: A systematic review and meta-analysis with trial sequential analysis.

    Science.gov (United States)

    Hoshijima, Hiroshi; Mihara, Takahiro; Maruyama, Koichi; Denawa, Yohei; Mizuta, Kentaro; Shiga, Toshiya; Nagasaka, Hiroshi

    2018-06-09

    The C-MAC laryngoscope (C-MAC) is a videolaryngoscope that uses a modified Macintosh blade. Although several anecdotal reports exist, it remains unclear whether the C-MAC is superior to the Macintosh laryngoscope for tracheal intubation in the adult population. Systematic review, meta-analysis. Operating room, intensive care unit. For inclusion in our analysis, studies had to be prospective randomised trials which compared the C-MAC with the Macintosh laryngoscope for tracheal intubation in the adult population. Data on success rates, intubation time, glottic visualisation and incidence of external laryngeal manipulations (ELM) during tracheal intubation were extracted from the identified studies. In subgroup analysis, we separated those parameters to assess the influence of the airway condition (normal or difficult) and laryngoscopists (novice or experienced). We conducted a trial sequential analysis (TSA). Sixteen articles with 18 trials met the inclusion criteria. The C-MAC provided better glottic visualisation compared to the Macintosh (RR, 1.08; 95% CI, 1.03-1.14). TSA corrected the CI to 1.01-1.19; thus, total sample size reached the required information size (RIS). Success rates and intubation time did not differ significantly between the laryngoscopes. TSA showed that total sample size reached the RIS for success rates. The TSA Z curve surpassed the futility boundary. The C-MAC required less ELM compared to the Macintosh (RR, 0.83; 95% CI, 0.72-0.96). TSA corrected the CI to 0.67-1.03; 52.3% of the RIS was achieved. In difficult airways, the C-MAC showed superior success rates, glottic visualisation, and less ELM compared to the Macintosh. Among experienced laryngoscopists, the C-MAC offered better glottic visualisation with less ELM than the Macintosh. The C-MAC provided better glottic visualisation and less ELM (GRADE: Very Low or Moderate), with improved success rates, glottic visualisation, and less ELM in difficult airways. Copyright © 2018 Elsevier
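
    Pooled effect estimates of this kind are typically inverse-variance combinations of per-trial log risk ratios; a minimal fixed-effect sketch (the trial counts are invented for illustration):

        import math

        def pooled_rr(trials):
            """Fixed-effect inverse-variance pooling of risk ratios.
            Each trial is (events_a, total_a, events_b, total_b)."""
            num = den = 0.0
            for a, na, b, nb in trials:
                log_rr = math.log((a / na) / (b / nb))
                var = 1/a - 1/na + 1/b - 1/nb          # variance of log RR
                num += log_rr / var
                den += 1.0 / var
            est, se = num / den, math.sqrt(1.0 / den)
            # returns (point estimate, lower 95% CI, upper 95% CI)
            return tuple(math.exp(est + z * se) for z in (0.0, -1.96, 1.96))

        # Illustrative: two trials of C-MAC vs Macintosh glottic visualisation.
        print(pooled_rr([(45, 50, 40, 50), (88, 100, 80, 100)]))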

  8. New Cu (II), Co(II) and Ni(II) complexes of chalcone derivatives: Synthesis, X-ray crystal structure, electrochemical properties and DFT computational studies

    Science.gov (United States)

    Tabti, Salima; Djedouani, Amel; Aggoun, Djouhra; Warad, Ismail; Rahmouni, Samra; Romdhane, Samir; Fouzi, Hosni

    2018-03-01

    The reaction of nickel(II), copper(II) and cobalt(II) with 4-hydroxy-3-[(2E)-3-(1H-indol-3-yl)prop-2-enoyl]-6-methyl-2H-pyran-2-one (HL) leads to a series of new complexes: Ni(L)2(NH3), Cu(L)2(DMF)2 and Co(L)2(H2O). The crystal structure of the Cu(L)2(DMF)2 complex has been determined by X-ray diffraction methods. The Cu(II) centre, lying on an inversion centre, is coordinated to six oxygen atoms forming an elongated octahedron. Additionally, the electrochemical behavior of the metal complexes was investigated by cyclic voltammetry at a glassy carbon electrode (GC) in CH3CN solutions, showing the quasi-reversible redox process ascribed to the reduction of the M(II)/M(I) couples. The X-ray single-crystal structure data of the complex matched excellently the optimized monomer structure of the desired compound; Hirshfeld surface analysis supported the intermolecular forces of the packed 3D crystal lattice network. The HOMO/LUMO energy levels and the global reactivity descriptor quantum parameters are also calculated. The electrophilic and nucleophilic positions on the complex surface are theoretically evaluated by molecular electrostatic potential and Mulliken atomic charge analysis.

  9. Oxidized calmodulin kinase II regulates conduction following myocardial infarction: a computational analysis.

    Directory of Open Access Journals (Sweden)

    Matthew D Christensen

    2009-12-01

    Full Text Available Calmodulin kinase II (CaMKII) mediates critical signaling pathways responsible for divergent functions in the heart, including calcium cycling, hypertrophy and apoptosis. Dysfunction in the CaMKII signaling pathway occurs in heart disease and is associated with increased susceptibility to life-threatening arrhythmia. Furthermore, CaMKII inhibition prevents cardiac arrhythmia and improves heart function following myocardial infarction. Recently, a novel mechanism for oxidative CaMKII activation was discovered in the heart. Here, we provide the first report of CaMKII oxidation state in a well-validated, large-animal model of heart disease. Specifically, we observe increased levels of oxidized CaMKII in the infarct border zone (BZ). These unexpected new data identify an alternative activation pathway for CaMKII in common cardiovascular disease. To study the role of oxidation-dependent CaMKII activation in creating a pro-arrhythmia substrate following myocardial infarction, we developed a new mathematical model of CaMKII activity including both the oxidative and the autophosphorylation activation pathways. Computer simulations using a multicellular mathematical model of the cardiac fiber demonstrate that enhanced CaMKII activity in the infarct BZ, due primarily to increased oxidation, is associated with reduced conduction velocity, increased effective refractory period, and increased susceptibility to formation of conduction block at the BZ margin, a prerequisite for reentry. Furthermore, our model predicts that CaMKII inhibition improves conduction and reduces refractoriness in the BZ, thereby reducing vulnerability to conduction block and reentry. These results identify a novel oxidation-dependent pathway for CaMKII activation in the infarct BZ that may be an effective therapeutic target for improving conduction and reducing heterogeneity in the infarcted heart.
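
    The fiber simulation rests on a reaction-diffusion cable model. A generic sketch, with FitzHugh-Nagumo kinetics standing in for the study's detailed ionic model, shows how a conduction velocity is extracted:

        import numpy as np

        # 1-D cable: dv/dt = D*d2v/dx2 + v - v^3/3 - w ;  dw/dt = eps*(v + a - b*w)
        n, D, dx, dt = 200, 1.0, 0.5, 0.01
        eps, a, b = 0.08, 0.7, 0.8
        v = -1.2 * np.ones(n)
        w = -0.6 * np.ones(n)
        v[:5] = 1.5                               # stimulate the left end

        crossed = {}                              # node -> first time v exceeds 0
        for step in range(20000):
            lap = np.zeros(n)
            lap[1:-1] = (v[2:] - 2*v[1:-1] + v[:-2]) / dx**2
            v += dt * (D*lap + v - v**3/3 - w)
            w += dt * eps * (v + a - b*w)
            for i in (50, 150):
                if i not in crossed and v[i] > 0:
                    crossed[i] = step * dt

        if 50 in crossed and 150 in crossed:
            cv = (150 - 50) * dx / (crossed[150] - crossed[50])
            print("conduction velocity ~", round(cv, 2), "space units / time unit")
        else:
            print("wave failed to propagate in this toy setup")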

  10. GPS-MBA: computational analysis of MHC class II epitopes in type 1 diabetes.

    Science.gov (United States)

    Cai, Ruikun; Liu, Zexian; Ren, Jian; Ma, Chuang; Gao, Tianshun; Zhou, Yanhong; Yang, Qing; Xue, Yu

    2012-01-01

    As a severe chronic metabolic disease and autoimmune disorder, type 1 diabetes (T1D) affects millions of people world-wide. Recent advances in antigen-based immunotherapy have provided a great opportunity for further treating T1D with a high degree of selectivity. It is reported that MHC class II I-A(g7) in the non-obese diabetic (NOD) mouse and human HLA-DQ8 are strongly linked to susceptibility to T1D. Thus, the identification of new I-A(g7) and HLA-DQ8 epitopes would be of great help to further experimental and biomedical manipulation efforts. In this study, a novel GPS-MBA (MHC Binding Analyzer) software package was developed for the prediction of I-A(g7) and HLA-DQ8 epitopes. Using experimentally identified epitopes as the training data sets, a previously developed GPS (Group-based Prediction System) algorithm was adopted and improved. By extensive evaluation and comparison, the GPS-MBA performance was found to be much better than other tools of this type. With this powerful tool, we predicted a number of potentially new I-A(g7) and HLA-DQ8 epitopes. Furthermore, we designed a T1D epitope database (TEDB) for all of the experimentally identified and predicted T1D-associated epitopes. Taken together, this computational prediction result and analysis provides a starting point for further experimental considerations, and GPS-MBA is demonstrated to be a useful tool for generating starting information for experimentalists. The GPS-MBA is freely accessible for academic researchers at: http://mba.biocuckoo.org.
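
    Group-based scoring of candidate peptides can be illustrated with a position-specific scoring matrix; the sketch below uses invented toy epitopes and add-one smoothing, not GPS-MBA's trained parameters:

        import math
        from collections import Counter

        AMINO = "ACDEFGHIKLMNPQRSTVWY"

        def build_pssm(epitopes):
            """Position-specific log-frequency matrix from equal-length training
            epitopes, smoothed over the 20 amino acids."""
            length = len(epitopes[0])
            cols = [Counter(e[i] for e in epitopes) for i in range(length)]
            return [{aa: math.log((c[aa] + 1) / (len(epitopes) + len(AMINO)))
                     for aa in AMINO} for c in cols]

        def score(peptide, pssm):
            return sum(col[aa] for col, aa in zip(pssm, peptide))

        training = ["AAYKLMNPQ", "AVYKIMNPQ", "AAYRLMNAQ"]        # toy 9-mer "epitopes"
        pssm = build_pssm(training)
        print(score("AAYKLMNPQ", pssm), score("WWWWWWWWW", pssm)) # known-like scores higher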

  11. Upper cervical spine movement during intubation: fluoroscopic comparison of the AirWay Scope, McCoy laryngoscope, and Macintosh laryngoscope.

    Science.gov (United States)

    Maruyama, K; Yamada, T; Kawakami, R; Kamata, T; Yokochi, M; Hara, K

    2008-01-01

    The AirWay Scope (AWS) is a new fibreoptic intubation device which allows visualization of the glottic structures without alignment of the oral, pharyngeal, and tracheal axes, and thus may be useful in patients with limited cervical spine (C-spine) movement. We fluoroscopically evaluated upper C-spine movement during intubation with the AWS or the Macintosh or McCoy laryngoscope. Forty-five patients with normal C-spines scheduled for elective surgery were randomly assigned to one of the three intubation devices. Movement of the upper C-spine was examined by measuring the angles formed by adjacent vertebrae during intubation. Time to intubation was also recorded. Median cumulative upper C-spine movement was 22.3 degrees, 32.3 degrees, and 36.5 degrees with the AWS, Macintosh laryngoscope, and McCoy laryngoscope, respectively. The AWS produced less movement of the C-spine at C1/C2 in comparison with the Macintosh or McCoy laryngoscope (P=0.012), and at C3/C4 in comparison with the McCoy laryngoscope (P=0.019). Intubation time was significantly longer in the AWS group than in the Macintosh group (P=0.03). Compared with the Macintosh or McCoy laryngoscope, the AWS produced less movement of the upper C-spine during intubation in patients with a normal C-spine.

  12. A randomised comparative study of the effect of Airtraq optical laryngoscope vs. Macintosh laryngoscope on intraocular pressure in non-ophthalmic surgery

    Directory of Open Access Journals (Sweden)

    Bikramjit Das

    2016-02-01

    Full Text Available BACKGROUND: We compared intraocular pressure changes following laryngoscopy and intubation with the conventional Macintosh blade and the Airtraq optical laryngoscope. METHODS: Ninety adult patients were randomly assigned to a study group or a control group. Study group (n = 45): the Airtraq laryngoscope was used for laryngoscopy. Control group (n = 45): the conventional Macintosh laryngoscope was used for laryngoscopy. Preoperative baseline intraocular pressure was measured with a Schiotz tonometer. Laryngoscopy was performed as per group protocol. Intraocular pressure and haemodynamic parameters were recorded just before insertion of the device and subsequently three times at one-minute intervals after insertion. RESULTS: Patient characteristics, baseline haemodynamic parameters and baseline intraocular pressure were comparable in the two groups. Following insertion of the endotracheal tube with the Macintosh laryngoscope, there was a statistically significant rise in heart rate and intraocular pressure compared to the Airtraq group. There was no significant change in MAP. Eight patients in the Macintosh group suffered tongue-lip-dental trauma during intubation, while only 2 patients received upper airway trauma in the Airtraq group. CONCLUSION: We conclude that the Airtraq laryngoscope, in comparison to the Macintosh laryngoscope, results in a significantly smaller rise in intraocular pressure and a clinically less marked haemodynamic response to laryngoscopy and intubation.

  13. Evaluation of Truview evo2® Laryngoscope In Anticipated Difficult Intubation-A Comparison To Macintosh Laryngoscope

    Directory of Open Access Journals (Sweden)

    Ishwar Singh

    2009-01-01

    Full Text Available The aim of the study was to assess and compare the laryngoscopic view obtained with the Truview evo2 laryngoscope with that of the Macintosh laryngoscope in patients with one or more predictors of difficult intubation (PDI), and to assess ease of intubation with the Truview evo2 in terms of the absolute time required. Patients scheduled for elective surgery requiring endotracheal intubation were initially assessed for three PDI parameters: the modified Mallampati test, thyro-mental distance and atlanto-occipital (AO) joint extension. Patients with cumulative PDI scores of 2 to 5 (on a scale of 0 to 8) were evaluated for Cormack & Lehane (CL) grading by Macintosh blade after standard induction. Cases with a CL grade of two or more were further evaluated with the Truview evo2 laryngoscope and corresponding CL grades were assigned. Intubation was attempted under Truview evo2 vision and the time required for each successful tracheal intubation (i.e. tracheal intubation completed within one minute) was noted. Fifty cases were studied in total. The CL grades assigned by Macintosh blade correlated well with the cumulative PDI scores assigned preoperatively, confirming their predictability. The Truview evo2 improved the laryngeal view in 92% of cases by one or more CL grades. Intubation with the Truview evo2 was possible in 88% of cases within the stipulated time of one minute, and the mean time of 28.6 seconds (SD 11.23) was reasonably quick. No significant complications, such as oro-pharyngeal trauma or an extreme pressor response to laryngoscopy, were noticed. To conclude, the Truview evo2 proved to be a better tool than the conventional laryngoscope in anticipated difficult situations.

  14. Comparative examinations of serum pepsinogen I, II and gastric area using computed radiography in the atrophic gastritis

    Energy Technology Data Exchange (ETDEWEB)

    Tatsu, Yoshimitsu; Ogura, Yasuharu; Yamazaki, Kouichi [Osaka Medical Coll., Takatsuki (Japan)] [and others]

    1995-11-01

    The relationship between serum PG I and PG II levels and the extent of atrophic gastritis was examined. The subjects were 64 patients (male: 32, female: 32, average age 51.9 years) with an established diagnosis of either atrophic gastritis or normal mucosa. In the gastric X-ray examination, Fuji Computed Radiography (FCR) was used to obtain clear-cut images of the gastric area. Concerning the serum PG I level, patients in the group with atrophic gastritis showed lower levels than those in the group with no atrophic change, but the variation was wide, and no definite tendency was seen in the relationship between atrophic change and serum PG I levels. Concerning the serum PG II level, as the atrophic change progressed, the serum PG II level tended to increase gradually. A significant reduction in the PG I/II ratio was seen in the group with atrophic changes (p<0.01) in comparison with the group with no atrophic changes, and the PG I/II value tended to decrease. In conclusion, as the relationship between atrophic change and serum PG levels showed wide variation, we consider it difficult to assess the presence and extent of atrophic gastritis by measuring serum PG levels. (author).

  15. Comparative examinations of serum pepsinogen I, II and gastric area using computed radiography in the atrophic gastritis

    International Nuclear Information System (INIS)

    Tatsu, Yoshimitsu; Ogura, Yasuharu; Yamazaki, Kouichi

    1995-01-01

    The relationship between serum PG I and PG II levels and the extent of atrophic gastritis was examined. The subjects were 64 patients (32 male, 32 female; mean age 51.9 years) with an established diagnosis of either atrophic gastritis or normal mucosa. In the X-ray gastric examination, Fuji Computed Radiography (FCR) was used to obtain clear-cut images of the gastric area. The atrophic gastritis group showed lower serum PG I levels than the group with no atrophic change, but the variation was wide and no definite trend was seen between atrophic change and serum PG I levels. The serum PG II level, by contrast, tended to increase gradually as the atrophic change progressed. A significant reduction in the PG I/II ratio was seen in the group with atrophic changes (p<0.01) in comparison with the group with no atrophic changes, and the PG I/II value tended to decrease with advancing atrophy. In conclusion, because the relationship between atrophic change and serum PG levels showed wide variation, we consider it difficult to assess the presence and extent of atrophic gastritis by measuring serum PG levels alone. (author)

  16. Computer-controlled neutron time-of-flight spectrometer. Part II

    International Nuclear Information System (INIS)

    Merriman, S.H.

    1979-12-01

    A time-of-flight spectrometer for neutron inelastic scattering research has been interfaced to a PDP-15/30 computer. The computer is used for experimental data acquisition and analysis and for apparatus control. This report was prepared to summarize the functions of the computer and to act as a users' guide to the software system.

  17. A DDC Bibliography on Computers in Information Sciences. Volume II. Information Sciences Series.

    Science.gov (United States)

    Defense Documentation Center, Alexandria, VA.

    The unclassified and unlimited bibliography compiles references dealing specifically with the role of computers in information sciences. The volume contains 239 annotated references grouped under three major headings: Artificial and Programming Languages, Computer Processing of Analog Data, and Computer Processing of Digital Data. The references…

  18. Density functionalized [RuII(NO)(Salen)(Cl)] complex: Computational photodynamics and in vitro anticancer facets.

    Science.gov (United States)

    Mir, Jan Mohammad; Jain, N; Jaget, P S; Maurya, R C

    2017-09-01

    Photodynamic therapy (PDT) is a treatment that uses photosensitizing agents to kill cancer cells. The scientific community has sought for decades to design an efficient PDT drug. In this context, the current report deals with the computational photodynamic behavior of a ruthenium(II) nitrosyl complex containing N,N'-salicylaldehyde-ethylenediimine (SalenH2), whose synthesis and X-ray crystallography are already known [Ref. 38,39]. The Gaussian 09W software package was employed to carry out the density functional theory (DFT) studies. The DFT calculations used the Becke-3-Lee-Yang-Parr (B3LYP) functional, with the Los Alamos National Laboratory 2 Double-Z (LanL2DZ) basis set and effective core potential for the Ru atom and the 6-31G(d,p) basis set for all other atoms. Both the ground and excited states of the complex were evaluated. Some known photosensitizers were compared with the target complex; phthalocyanine and porphyrin derivatives were the compounds selected for this comparative study. The effective photoactivity found is attributed to the presence of the ruthenium core in the model complex. In addition to the theoretical evaluation, in vitro anticancer assays against COLO-205 human cancer cells were also carried out on the complex. Particular emphasis was placed on extrapolating the DFT results to describe the ability of the target compound to release nitric oxide; a promising visible-light-triggered nitric oxide releasing capability was inferred. In vitro antiproliferative studies of [RuCl3(PPh3)3] and [Ru(NO)(Salen)(Cl)] have revealed the model complex to be an excellent anticancer agent: IC50 values of 40.031 mg/mL for the former and 9.74 mg/mL for the latter establish that the latter has the greater anticancer potency. Overall, the DFT-based structural elucidation and the efficiency of the NO, Ru and Salen co-ligands indicate promising drug delivery properties and a good candidacy for both chemotherapy and photodynamic therapy.
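
    The quoted level of theory can be sketched as follows. The example below uses PySCF rather than the Gaussian 09W package used in the paper, and the bare Ru-N-O fragment, its geometry, charge and spin are placeholder assumptions, not the actual Salen complex:

```python
# Sketch of the quoted level of theory in PySCF (the paper used Gaussian
# 09W): B3LYP with LanL2DZ + effective core potential on Ru and a Pople
# basis on the light atoms. The bare Ru-N-O fragment, its geometry,
# charge and spin are placeholders, not the actual Salen complex.
from pyscf import gto, dft

mol = gto.M(
    atom="""Ru 0.0 0.0 0.0
            N  0.0 0.0 1.75
            O  0.0 0.0 2.90""",              # placeholder geometry (Angstrom)
    charge=3, spin=0,                        # [Ru(NO)]3+ taken as closed shell
    basis={"Ru": "lanl2dz", "default": "6-31g**"},  # 6-31g** = 6-31G(d,p)
    ecp={"Ru": "lanl2dz"},                   # effective core potential on Ru
)

mf = dft.RKS(mol)                            # restricted Kohn-Sham DFT
mf.xc = "b3lyp"
e_ground = mf.kernel()                       # ground-state SCF energy (Hartree)
print("E(B3LYP) =", e_ground)
```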

  19. Desk-top publishing using IBM-compatible computers.

    Science.gov (United States)

    Grencis, P W

    1991-01-01

    This paper sets out to describe one Medical Illustration Department's experience of the introduction of computers for desk-top publishing. In this particular case, after careful consideration of all the options open, an IBM-compatible system was installed rather than the often popular choice of an Apple Macintosh.

  20. Development of a UNIX network compatible reactivity computer

    International Nuclear Information System (INIS)

    Sanchez, R.F.; Edwards, R.M.

    1996-01-01

    A state-of-the-art UNIX network compatible controller and UNIX host workstation with MATLAB/SIMULINK software were used to develop, implement, and validate a digital reactivity calculation. One objective of the development was to determine why the reactivity output of a Macintosh-based reactivity computer drifted intolerably.

  1. Cooperative Learning with a Computer in a Native Language Class.

    Science.gov (United States)

    Bennett, Ruth

    In a cooperative task, American Indian elementary students produced bilingual natural history dictionaries using a Macintosh computer. Students in grades 3 through 8 attended weekly, multi-graded bilingual classes in Hupa/English or Yurok/English, held at two public school field sites for training elementary teaching-credential candidates. Teams…

  2. Prompt Burst Energetics (PBE) experiment analyses using the SIMMER-II computer code

    International Nuclear Information System (INIS)

    Tomkins, J.L.; Hitchcock, J.T.; Young, M.F.

    1979-01-01

    Two of the Prompt Burst Energetics (PBE) in-pile experiments conducted at Sandia Laboratories, PBE-5S and PBE-SG2, have been investigated with SIMMER-II. These two tests used fresh uranium oxide and fresh uranium carbide pins, respectively, in stagnant sodium. The purpose of the analysis is to investigate the applicability of SIMMER-II to this type of experiment. Qualitative agreement with measured data is seen for PBE-5S. PBE-SG2 results agree somewhat less well but demonstrate SIMMER-II's potential for describing fuel-coolant interactions with further model development.

  3. The Effects of FreeSurfer Version, Workstation Type, and Macintosh Operating System Version on Anatomical Volume and Cortical Thickness Measurements

    OpenAIRE

    Gronenschild, Ed H. B. M.; Habets, Petra; Jacobs, Heidi I. L.; Mengelers, Ron; Rozendaal, Nico; van Os, Jim; Marcelis, Machteld

    2012-01-01

    FreeSurfer is a popular software package for measuring cortical thickness and the volume of neuroanatomical structures. However, little if anything is known about measurement reliability across various data processing conditions. Using a set of 30 anatomical T1-weighted 3T MRI scans, we investigated the effects of data processing variables such as FreeSurfer version (v4.3.1, v4.5.0, and v5.0.0), workstation (Macintosh and Hewlett-Packard), and Macintosh operating system version (OSX 10.5 and OSX 10.6). S...

  4. WE-B-BRD-01: Innovation in Radiation Therapy Planning II: Cloud Computing in RT

    International Nuclear Information System (INIS)

    Moore, K; Kagadis, G; Xing, L; McNutt, T

    2014-01-01

    As defined by the National Institute of Standards and Technology, cloud computing is “a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.” Despite the omnipresent role of computers in radiotherapy, cloud computing has yet to achieve widespread adoption in clinical or research applications, though the transition to such “on-demand” access is underway. As this transition proceeds, new opportunities for aggregate studies and efficient use of computational resources are set against new challenges in patient privacy protection, data integrity, and management of clinical informatics systems. In this Session, current and future applications of cloud computing and distributed computational resources will be discussed in the context of medical imaging, radiotherapy research, and clinical radiation oncology applications. Learning Objectives: 1. Understand basic concepts of cloud computing. 2. Understand how cloud computing could be used for medical imaging applications. 3. Understand how cloud computing could be employed for radiotherapy research. 4. Understand how clinical radiotherapy software applications would function in the cloud.

  5. WE-B-BRD-01: Innovation in Radiation Therapy Planning II: Cloud Computing in RT

    Energy Technology Data Exchange (ETDEWEB)

    Moore, K [University of California, San Diego, La Jolla, CA (United States); Kagadis, G [University Patras, Rion - Patras (Greece); Xing, L [Stanford University, Stanford, CA (United States); McNutt, T [Johns Hopkins University, Severna Park, MD (United States)

    2014-06-15

    As defined by the National Institute of Standards and Technology, cloud computing is “a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.” Despite the omnipresent role of computers in radiotherapy, cloud computing has yet to achieve widespread adoption in clinical or research applications, though the transition to such “on-demand” access is underway. As this transition proceeds, new opportunities for aggregate studies and efficient use of computational resources are set against new challenges in patient privacy protection, data integrity, and management of clinical informatics systems. In this Session, current and future applications of cloud computing and distributed computational resources will be discussed in the context of medical imaging, radiotherapy research, and clinical radiation oncology applications. Learning Objectives: 1. Understand basic concepts of cloud computing. 2. Understand how cloud computing could be used for medical imaging applications. 3. Understand how cloud computing could be employed for radiotherapy research. 4. Understand how clinical radiotherapy software applications would function in the cloud.

  6. An Automated High Aspect Ratio Mesher for Computational Fluid Dynamics, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Computational fluid dynamics (CFD) simulations are routinely used while designing, analyzing, and optimizing air- and spacecraft. An important component of CFD...

  7. A Comparison of Equality in Computer Algebra and Correctness in Mathematical Pedagogy (II)

    Science.gov (United States)

    Bradford, Russell; Davenport, James H.; Sangwin, Chris

    2010-01-01

    A perennial problem in computer-aided assessment is that "a right answer", pedagogically speaking, is not the same thing as "a mathematically correct expression", as verified by a computer algebra system, or indeed other techniques such as random evaluation. Paper I in this series considered the difference in cases where there was "the right…

  8. Neuro-evolutionary computing paradigm for Painlevé equation-II in nonlinear optics

    Science.gov (United States)

    Ahmad, Iftikhar; Ahmad, Sufyan; Awais, Muhammad; Ul Islam Ahmad, Siraj; Asif Zahoor Raja, Muhammad

    2018-05-01

    The aim of this study is to investigate the numerical treatment of the Painlevé equation-II arising in physical models of nonlinear optics through artificial intelligence procedures, by incorporating a single-layer structure of neural networks optimized with genetic algorithms, sequential quadratic programming and active set techniques. We constructed a mathematical model for the nonlinear Painlevé equation-II with the help of networks by defining an error-based cost function in the mean-square sense. The performance of the proposed technique is validated through statistical analyses by means of a one-way ANOVA test conducted on a dataset generated by a large number of independent runs.
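
    A minimal sketch of the idea described above: a single-hidden-layer tanh network serves as the trial solution of the Painlevé-II equation u'' = 2u³ + xu + α, and a mean-square residual cost over collocation points is minimized. SciPy's SLSQP stands in here for the paper's sequential quadratic programming step; the network size, α, the collocation grid and the omission of boundary terms are illustrative simplifications:

```python
# Sketch of the scheme described above: a single-hidden-layer tanh network
# as trial solution of the Painleve-II equation u'' = 2u^3 + x*u + alpha,
# trained by minimising a mean-square residual cost over collocation
# points. SciPy's SLSQP stands in for the paper's sequential quadratic
# programming step; network size, alpha and the grid are illustrative,
# and boundary terms are omitted for brevity.
import numpy as np
from scipy.optimize import minimize

alpha, n_neurons = 1.0, 5
x = np.linspace(-1.0, 1.0, 25)          # collocation points

def net(p, x):
    w = p[:n_neurons]
    b = p[n_neurons:2 * n_neurons]
    v = p[2 * n_neurons:]
    return np.tanh(np.outer(x, w) + b) @ v   # single hidden layer

def cost(p, h=1e-4):
    u = net(p, x)
    upp = (net(p, x + h) - 2 * u + net(p, x - h)) / h**2   # u'' (central diff.)
    return np.mean((upp - 2 * u**3 - x * u - alpha) ** 2)  # MSE of the residual

p0 = 0.1 * np.random.default_rng(0).standard_normal(3 * n_neurons)
res = minimize(cost, p0, method="SLSQP", options={"maxiter": 500})
print("final residual cost:", res.fun)
```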

  9. COXPRO-II: a computer program for calculating radiation and conduction heat transfer in irradiated fuel assemblies

    International Nuclear Information System (INIS)

    Rhodes, C.A.

    1984-12-01

    This report describes the computer program COXPRO-II, which was written for performing thermal analyses of irradiated fuel assemblies in a gaseous environment with no forced cooling. The heat transfer modes within the fuel pin bundle are radiation exchange among the fuel pin surfaces and conduction by the stagnant gas. The array of parallel cylindrical fuel pins may be enclosed by a metal wrapper or shroud. Heat is dissipated from the outer surface of the fuel pin assembly by radiation and convection. Both equilateral-triangle and square fuel pin arrays can be analyzed. Steady-state and unsteady-state conditions are included. Temperatures predicted by the COXPRO-II code have been validated by comparing them with experimental measurements. Temperature predictions compare favorably with temperature measurements in simulated, electrically heated pressurized water reactor (PWR) and liquid-metal fast breeder reactor (LMFBR) fuel assemblies. Temperature comparisons are also made on an actual irradiated Fast Flux Test Facility (FFTF) LMFBR fuel assembly.

  10. A report on intercomparison studies of computer programs which respectively model: i) radionuclide migration ii) equilibrium chemistry of groundwater

    International Nuclear Information System (INIS)

    Broyd, T.W.; McD Grant, M.; Cross, J.E.

    1985-01-01

    This report describes two intercomparison studies of computer programs which respectively model: (i) radionuclide migration and (ii) equilibrium chemistry of groundwaters. These studies were performed by running a series of test cases with each program and comparing the results obtained. The work forms part of the CEC MIRAGE project (MIgration of RAdionuclides in the GEosphere) and has been jointly funded by the CEC and the United Kingdom Department of the Environment. Presentations of the material contained herein were given at plenary meetings of the MIRAGE project in Brussels in March 1984 (migration) and March 1985 (equilibrium chemistry), respectively.

  11. PECITIS-II, a computer program to predict the performance of collapsible clad UO2 fuel elements

    International Nuclear Information System (INIS)

    Anand, A.K.; Anantharaman, K.; Sarda, V.

    1978-01-01

    The Indian power programme envisages the use of PHWRs, which use collapsible-clad UO2 fuel elements. A computer code, PECITIS-II, developed for the analysis of this type of fuel is described in detail. The sheath strain and fission gas pressure are evaluated by the code. The pellet-clad gap conductance is calculated with the Ross and Stoute model. The pellet thermal expansion is calculated by assuming a two-zone model, i.e. a plastic core surrounded by an elastic cracked annulus. (author)

  12. Hybrid Computational Model for High-Altitude Aeroassist Vehicles, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed effort addresses a need for accurate computational models to support aeroassist and entry vehicle system design over a broad range of flight conditions...

  13. Time Domain Terahertz Axial Computed Tomography Non Destructive Evaluation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — In this Phase 2 project, we propose to develop, construct, and deliver to NASA a computed axial tomography time-domain terahertz (CT TD-THz) non destructive...

  14. RADTRAN II: revised computer code to analyze transportation of radioactive material

    International Nuclear Information System (INIS)

    Taylor, J.M.; Daniel, S.L.

    1982-10-01

    A revised and updated version of the RADTRAN computer code is presented. This code can predict the radiological impacts associated with specific schemes of radioactive material shipments and mode-specific transport variables.

  15. Computational Tool for Coupled Simulation of Nonequilibrium Hypersonic Flows with Ablation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this SBIR project is to develop a predictive computational tool for the aerothermal environment around ablation-cooled hypersonic atmospheric entry...

  16. Upper limb muscular activity and perceived workload during laryngoscopy: comparison of Glidescope® and Macintosh laryngoscopy in manikin: an observational study.

    Science.gov (United States)

    Caldiroli, D; Molteni, F; Sommariva, A; Frittoli, S; Guanziroli, E; Cortellazzi, P; Orena, E F

    2014-03-01

    The interaction between operators and their working environment during laryngoscopy is poorly understood. Numerous studies have focused on the forces applied to the patient's airway during laryngoscopy, but only a few authors have addressed operator muscle activity and workload. We tested whether different devices (Glidescope® and Macintosh) use different muscles and how these differences affect the perceived workload. Ten staff anaesthetists performed three intubations with each device on a manikin. Surface electromyography was recorded for eight single muscles of the left upper limb. The NASA Task Load Index (TLX) was administered after each experimental session to evaluate perceived workload. A consistent reduction in muscular activation occurred with the Glidescope® compared with the Macintosh for all muscles tested (mean effect size d=3.28), and significant differences for the upper trapezius (P=0.002), anterior deltoid (P=0.001), posterior deltoid (P=0.000), and brachioradialis (P=0.001) were observed. The overall NASA-TLX workload score was significantly lower for the Glidescope® than for the Macintosh (P=0.006), and the factors of physical demand (P=0.008) and effort (P=0.006) decreased significantly. Greater muscular activity and workload were observed with the Macintosh laryngoscope. Augmented vision, and the postural adjustments associated with using the Glidescope®, may reduce activation of the operator's muscles and task workload.

  17. A comparison of tracheal intubation using the Airtraq or the Macintosh laryngoscope in routine airway management: A randomised, controlled clinical trial.

    LENUS (Irish Health Repository)

    Maharaj, C H

    2006-11-01

    The Airtraq laryngoscope is a novel single-use tracheal intubation device. We compared the Airtraq with the Macintosh laryngoscope in patients deemed at low risk for difficult intubation in a randomised, controlled clinical trial. Sixty consenting patients presenting for surgery requiring tracheal intubation were randomly allocated to undergo intubation with a Macintosh (n = 30) or Airtraq (n = 30) laryngoscope. All patients were intubated by one of four anaesthetists experienced in the use of both laryngoscopes. No significant differences in demographic or airway variables were observed between the groups. All patients but one, in the Macintosh group, were successfully intubated on the first attempt. There was no difference between groups in the duration of intubation attempts. In comparison with the Macintosh laryngoscope, the Airtraq resulted in modest improvements in the intubation difficulty score and in ease of use. Tracheal intubation with the Airtraq also produced smaller alterations in heart rate. These findings demonstrate the utility of the Airtraq laryngoscope for tracheal intubation in low-risk patients.

  18. The Implementation of Blended Learning Using Android-Based Tutorial Video in Computer Programming Course II

    Science.gov (United States)

    Huda, C.; Hudha, M. N.; Ain, N.; Nandiyanto, A. B. D.; Abdullah, A. G.; Widiaty, I.

    2018-01-01

    The computer programming course is theoretical in nature. Sufficient practice is necessary to facilitate conceptual understanding and encourage creativity in designing computer programs/animations. The development of a tutorial video for Android-based blended learning is needed to guide students. Using Android-based instructional material, students can learn independently anywhere and at any time. The tutorial video can facilitate students’ understanding of the concepts, materials, and procedures of program/animation making in detail. This study employed a Research and Development method adapting Thiagarajan’s 4D model. The developed Android-based instructional material and tutorial video were validated by experts in instructional media and experts in physics education. The expert validation results showed that the Android-based material was comprehensive and very feasible. The tutorial video was deemed feasible, receiving an average score of 92.9%. It was also revealed that students’ conceptual understanding, skills, and creativity in designing computer programs/animations improved significantly.

  19. Performance/Design Requirements and Detailed Technical Description for a Computer-Directed Training Subsystem for Integration into the Air Force Phase II Base Level System.

    Science.gov (United States)

    Butler, A. K.; And Others

    The performance/design requirements and a detailed technical description for a Computer-Directed Training Subsystem to be integrated into the Air Force Phase II Base Level System are described. The subsystem may be used for computer-assisted lesson construction and has presentation capability for on-the-job training for data automation, staff, and…

  20. IN SILICO EVALUATION OF ANGIOTENSIN II RECEPTOR ANTAGONIST’S PLASMA PROTEIN BINDING USING COMPUTED MOLECULAR DESCRIPTORS

    Directory of Open Access Journals (Sweden)

    Jadranka Odović

    2014-03-01

    Full Text Available The discovery of new pharmacologically active substances and drug modeling have led to the need to predict drug properties and ADME data. Angiotensin II receptor antagonists are a group of pharmaceuticals which modulate the renin-angiotensin-aldosterone system and today represent the most commonly prescribed anti-hypertensive drugs. The aim of this study was to compare different molecular properties of seven angiotensin II receptor antagonists/blockers (ARBs) (eprosartan, irbesartan, losartan, olmesartan, telmisartan, valsartan) with their plasma protein binding (PPB) data. Several ARB molecular descriptors were calculated using the Molinspiration Depiction Software package as well as the Virtual Computational Chemistry Laboratory (electronic descriptor PSA, constitutional parameter Mw, geometric descriptor Vol, lipophilicity descriptors logP, aqueous solubility data logS). Correlations between all collected descriptors and plasma protein binding data obtained from the relevant literature were established. In simple linear regression, poor correlations were obtained between PPB data and each calculated molecular descriptor. In the next stage of the study, multiple linear regression (MLR) was used to correlate PPB data with two different descriptors as independent variables. The best correlation (R2=0.70, P<0.05) was established between PPB data and molecular weight with the addition of volume values as independent variables. The possible application of computed molecular descriptors in evaluating drug protein binding can be of great importance in drug research.
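
    A minimal sketch of the MLR step described above, with PPB regressed on Mw and Vol; the descriptor and PPB values below are invented placeholders, not the paper's data:

```python
# Sketch of the MLR step described above: PPB regressed on molecular
# weight (Mw) and volume (Vol). All numbers are invented placeholders,
# not the paper's descriptors or PPB values.
import numpy as np

Mw  = np.array([424.5, 428.5, 422.9, 446.5, 514.6, 435.5])  # placeholder
Vol = np.array([366.0, 389.0, 370.0, 384.0, 462.0, 399.0])  # placeholder
PPB = np.array([98.0, 90.0, 98.7, 99.0, 99.5, 95.0])        # placeholder (%)

X = np.column_stack([np.ones_like(Mw), Mw, Vol])  # intercept + two descriptors
coef, *_ = np.linalg.lstsq(X, PPB, rcond=None)    # least-squares fit

pred = X @ coef
r2 = 1 - np.sum((PPB - pred) ** 2) / np.sum((PPB - PPB.mean()) ** 2)
print("coefficients:", coef, " R^2 =", round(r2, 3))
```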

  1. High performance parallel computing of flows in complex geometries: II. Applications

    International Nuclear Information System (INIS)

    Gourdain, N; Gicquel, L; Staffelbach, G; Vermorel, O; Duchaine, F; Boussuge, J-F; Poinsot, T

    2009-01-01

    Present regulations in terms of pollutant emissions, noise and economic constraints require new approaches and designs in the fields of energy supply and transportation. It is now well established that the next breakthrough will come from a better understanding of unsteady flow effects and from considering the entire system, not only isolated components. However, these aspects are still not well taken into account by numerical approaches, nor well understood, whatever the design stage considered. The main challenge lies in the computational requirements such complex systems impose if they are to be simulated on supercomputers. This paper shows how these challenges can be addressed by using parallel computing platforms for distinct elements of more complex systems, as encountered in aeronautical applications. Based on numerical simulations performed with modern aerodynamic and reactive flow solvers, this work underlines the value of high-performance computing for solving flow in complex industrial configurations such as aircraft, combustion chambers and turbomachines. Performance indicators related to parallel computing efficiency are presented, showing that establishing fair criteria is a difficult task for complex industrial applications. Examples of numerical simulations performed in industrial systems are also described, with particular attention to the computational time and the potential design improvements obtained with high-fidelity and multi-physics computing methods. These simulations use either unsteady Reynolds-averaged Navier-Stokes methods or large eddy simulation and deal with turbulent unsteady flows, such as coupled flow phenomena (thermo-acoustic instabilities, buffet, etc.). Some examples of the difficulties with grid generation and data analysis are also presented for these complex industrial applications.

  2. 3-D conformal radiation therapy - Part II: Computer-controlled 3-D treatment delivery

    International Nuclear Information System (INIS)

    Benedick, A.

    1997-01-01

    Purpose/Objective: This course will describe the use of computer-controlled treatment delivery techniques for treatment of patients with sophisticated conformal therapy. In particular, research and implementation issues related to clinical use of computer-controlled conformal radiation therapy (CCRT) techniques will be discussed. The possible/potential advantages of CCRT techniques will be highlighted using results from clinical 3-D planning studies. Materials and Methods: In recent years, 3-D treatment planning has been used to develop and implement 3-D conformal therapy treatment techniques, and studies based on these conformal treatments have begun to show the promise of conformal therapy. This work has been followed by the development of commercially-available multileaf collimator and computer control systems for treatment machines. Using these (and other) CCRT devices, various centers are beginning to clinically use complex computer-controlled treatments. Both research and clinical CCRT treatment techniques will be discussed in this presentation. General concepts and requirements for CCRT will be mentioned. Developmental and clinical experience with CCRT techniques from a number of centers will be utilized. Results: Treatment planning, treatment preparation and treatment delivery must be approached in an integrated fashion in order to clinically implement CCRT treatment techniques, and the entire process will be discussed. Various CCRT treatment methodologies will be reviewed from operational, dosimetric, and technical points of view. The discussion will concentrate on CCRT techniques which are likely to see rather wide dissemination over the next several years, including particularly the use of multileaf collimators (MLC), dynamic and segmental conformal therapy, conformal field shaping, and other related techniques. More advanced CCRT techniques, such as the use of individualized intensity modulation of beams or segments, and the use of computer

  3. A comparison of equality in computer algebra and correctness in mathematical pedagogy (II)

    OpenAIRE

    Bradford, Russell; Davenport, James H; Sangwin, C

    2010-01-01

    A perennial problem in computer-aided assessment is that “a right answer”, pedagogically speaking, is not the same thing as “a mathematically correct expression”, as verified by a computer algebra system, or indeed other techniques such as random evaluation. Paper I in this series considered the difference in cases where there was “the right answer”, typically calculus questions. Here we look at some other cases, notably in linear algebra, where there can be many “right answers”, but still th...

  4. Using Computers for Intervention and Remediation of Severely Reading-Impaired Children in a University Literacy Clinic.

    Science.gov (United States)

    Balajthy, Ernest; Reuber, Kristin; Damon, Corrine J.

    A study investigated software choices of graduate-level clinicians in a university reading clinic to determine computer use and effectiveness in literacy instruction. The clinic involved students of varying ability, ages 7-12, using 24 Power Macintosh computers equipped with "ClarisWorks," "Kid Pix," "Student Writing…

  5. Critical Analysis of Underground Coal Gasification Models. Part II: Kinetic and Computational Fluid Dynamics Models

    Directory of Open Access Journals (Sweden)

    Alina Żogała

    2014-01-01

    Originality/value: This paper presents the state of the art in the field of coal gasification modeling using kinetic and computational fluid dynamics approaches. The paper also presents the authors' own comparative analysis (concerned with mathematical formulation, input data and parameters, basic assumptions, obtained results, etc.) of the most important models of underground coal gasification.

  6. Parallel computing in experimental mechanics and optical measurement: A review (II)

    Science.gov (United States)

    Wang, Tianyi; Kemao, Qian

    2018-05-01

    With advantages such as non-destructiveness, high sensitivity and high accuracy, optical techniques have been successfully applied to the measurement of various important physical quantities in experimental mechanics (EM) and optical measurement (OM). However, in pursuit of higher image resolutions for higher accuracy, the computation burden of optical techniques has become much heavier. Therefore, in recent years, heterogeneous platforms composed of hardware such as CPUs and GPUs have been widely employed to accelerate these techniques due to their cost-effectiveness, short development cycles, easy portability, and high scalability. In this paper, we analyze various works by first illustrating their different architectures, followed by introducing their various parallel patterns for high-speed computation. Next, we review the effects of CPU and GPU parallel computing specifically in EM and OM applications in a broad scope, including digital image/volume correlation, fringe pattern analysis, tomography, hyperspectral imaging, computer-generated holograms, and integral imaging. In our survey, we have found that high parallelism can almost always be exploited in such applications for the development of high-performance systems.

  7. Human brain as the model of a new computer system. II

    Energy Technology Data Exchange (ETDEWEB)

    Holtz, K; Langheld, E

    1981-12-09

    For Pt. I see ibid., Vol. 29, No. 22, p. 13 (1981). The authors describe the self-generating system of connections of a self-teaching, no-program associative computer. The self-generating systems of connections are regarded as simulation models of the human brain and are compared with the brain structure. The system hardware comprises a microprocessor, PROM, memory, VDU and a keyboard unit.

  8. Computational models of music perception and cognition II: Domain-specific music processing

    Science.gov (United States)

    Purwins, Hendrik; Grachten, Maarten; Herrera, Perfecto; Hazan, Amaury; Marxer, Ricard; Serra, Xavier

    2008-09-01

    In Part I [Purwins H, Herrera P, Grachten M, Hazan A, Marxer R, Serra X. Computational models of music perception and cognition I: The perceptual and cognitive processing chain. Physics of Life Reviews 2008, in press, doi:10.1016/j.plrev.2008.03.004], we addressed the study of cognitive processes that underlie auditory perception of music, and their neural correlates. The aim of the present paper is to summarize empirical findings from music cognition research that are relevant to three prominent music theoretic domains: rhythm, melody, and tonality. Attention is paid to how cognitive processes like category formation, stimulus grouping, and expectation can account for the music theoretic key concepts in these domains, such as beat, meter, voice, consonance. We give an overview of computational models that have been proposed in the literature for a variety of music processing tasks related to rhythm, melody, and tonality. Although the present state-of-the-art in computational modeling of music cognition definitely provides valuable resources for testing specific hypotheses and theories, we observe the need for models that integrate the various aspects of music perception and cognition into a single framework. Such models should be able to account for aspects that until now have only rarely been addressed in computational models of music cognition, like the active nature of perception and the development of cognitive capacities from infancy to adulthood.

  9. Computer code ANISN multiplying media and shielding calculation II. Code description (input/output)

    International Nuclear Information System (INIS)

    Maiorino, J.R.

    1990-01-01

    The user manual of the ANISN computer code, describing the input and output subroutines, is presented. The ANISN code was developed to solve the one-dimensional transport equation for neutrons or gamma rays in slab, sphere or cylinder geometry with general anisotropic scattering. The solution technique is the discrete ordinates method. (M.C.K.)

  10. Computational tools and lattice design for the PEP-II B-Factory

    International Nuclear Information System (INIS)

    Cai Yunhai; Irwin, John; Nosochkov, Yuri; Yan, Yiton

    1997-01-01

    Several accelerator codes were used to design the PEP-II lattices, ranging from matrix-based codes, such as MAD and DIMAD, to symplectic-integrator codes, such as TRACY and DESPOT. In addition to element-by-element tracking, we constructed maps to determine aberration strengths. Furthermore, we have developed a fast and reliable method (nPB tracking) to track particles with a one-turn map. This new technique allows us to evaluate performance of the lattices on the entire tune-plane. Recently, we designed and implemented an object-oriented code in C++ called LEGO which integrates and expands upon TRACY and DESPOT

  11. Poisson/Superfish codes for personal computers

    International Nuclear Information System (INIS)

    Humphries, S.

    1992-01-01

    The Poisson/Superfish codes calculate static E or B fields in two dimensions and electromagnetic fields in resonant structures. New versions for 386/486 PCs and Macintosh computers have capabilities that exceed those of the mainframe versions. Notable improvements are interactive graphical post-processors, improved field calculation routines, and a new program for charged-particle orbit tracking. (author). 4 refs., 1 tab., figs

  12. COSA II Further benchmark exercises to compare geomechanical computer codes for salt

    International Nuclear Information System (INIS)

    Lowe, M.J.S.; Knowles, N.C.

    1989-01-01

    Project COSA (COmputer COdes COmparison for SAlt) was a benchmarking exercise involving the numerical modelling of the geomechanical behaviour of heated rock salt. Its main objective was to assess the current European capability to predict the geomechanical behaviour of salt, in the context of the disposal of heat-producing radioactive waste in salt formations. Twelve organisations participated in the exercise, in which their solutions to a number of benchmark problems were compared. The project was organised in two distinct phases: the first, from 1984 to 1986, concentrated on the verification of the computer codes; the second, from 1986 to 1988, progressed to validation, using three in-situ experiments at the Asse research facility in West Germany as a basis for comparison. This document reports the activities of the second phase of the project and presents the results, assessments and conclusions.

  13. Advances in Computational High-Resolution Mechanical Spectroscopy HRMS Part II: Resonant Frequency – Young's Modulus

    International Nuclear Information System (INIS)

    Majewski, M; Magalas, L B

    2012-01-01

    In this paper, we compare the values of the resonant frequency f₀ of free decaying oscillations computed according to the parametric OMI method (Optimization in Multiple Intervals) and nonparametric DFT-based (discrete Fourier transform) methods as a function of the sampling frequency. The analysis is carried out for free decaying signals embedded in experimental noise recorded for metallic samples in a low-frequency resonant mechanical spectrometer. The Yoshida (Y) method, the Agrez (A) method, and the new interpolated discrete Fourier transform (IpDFT) methods developed by the authors, that is, the Yoshida-Magalas (YM) and (YMC) methods, are carefully compared for the resonant frequency f₀ = 1.12345 Hz and the logarithmic decrement δ = 0.0005. Precise estimation of the resonant frequency (Young's modulus ∝ f₀²) under real experimental conditions, i.e., for exponentially damped harmonic signals embedded in experimental noise, is a complex task. In this work, various computing methods are analyzed as a function of the sampling frequency used to digitize free decaying oscillations. The importance of computing techniques for obtaining reliable and precise values of the resonant frequency (i.e., Young's modulus) in materials science is emphasized.
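
    A minimal sketch of interpolated-DFT frequency estimation for an exponentially damped oscillation, using the f₀ and δ values quoted above; this is generic parabolic interpolation on the log-magnitude spectrum, not the specific Y, A, YM or YMC algorithms compared in the paper:

```python
# Sketch of interpolated-DFT frequency estimation for an exponentially
# damped oscillation, using the f0 and delta quoted above. This is generic
# parabolic interpolation on the log-magnitude spectrum, not the specific
# Y, A, YM or YMC algorithms compared in the paper.
import numpy as np

fs, T = 50.0, 400.0                       # sampling rate (Hz), record length (s)
t = np.arange(0.0, T, 1.0 / fs)
f0, delta = 1.12345, 0.0005               # values quoted in the abstract
y = np.exp(-delta * f0 * t) * np.sin(2 * np.pi * f0 * t)

Y = np.abs(np.fft.rfft(y * np.hanning(len(y))))
k = int(np.argmax(Y))                     # index of the spectral peak

# Parabolic interpolation of the peak position on a log-magnitude scale
la, lb, lc = np.log(Y[k - 1]), np.log(Y[k]), np.log(Y[k + 1])
offset = (la - lc) / (2.0 * (la - 2.0 * lb + lc))
f_est = (k + offset) * fs / len(y)
print(f"estimated f0 = {f_est:.6f} Hz")
```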

  14. WISDOM-II: Screening against multiple targets implicated in malaria using computational grid infrastructures

    Directory of Open Access Journals (Sweden)

    Kenyon Colin

    2009-05-01

    Full Text Available Abstract. Background: Despite continuous efforts of the international community to reduce the impact of malaria on developing countries, no significant progress has been made in recent years and the discovery of new drugs is more than ever needed. Of the many proteins involved in the metabolic activities of the Plasmodium parasite, some are promising targets for rational drug discovery. Motivation: Recent years have witnessed the emergence of grids, which are highly distributed computing infrastructures particularly well suited to embarrassingly parallel computations like docking. In 2005, a first attempt at using grids for large-scale virtual screening focused on plasmepsins and ended in the identification of previously unknown scaffolds, which were confirmed in vitro to be active plasmepsin inhibitors. Following this success, a second deployment took place in the fall of 2006, focusing on one well-known target, dihydrofolate reductase (DHFR), and on a new promising one, glutathione-S-transferase. Methods: In silico drug design, especially vHTS, is a widely accepted technology in lead identification and lead optimization. This approach therefore builds upon the progress made in computational chemistry, to achieve more accurate in silico docking, and in information technology, to design and operate large-scale grid infrastructures. Results: On the computational side, a sustained infrastructure has been developed: docking at large scale, different strategies for result analysis, storing of the results on the fly into MySQL databases, and application of molecular dynamics refinement and MM-PBSA and MM-GBSA rescoring. The modeling results obtained are very promising. Based on the modeling results, in vitro assays are under way for all the targets against which screening was performed. Conclusion: The current paper describes this rational drug discovery activity at large scale, especially molecular docking using FlexX software
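
    The "embarrassingly parallel" structure mentioned above is easy to sketch: each (ligand, target) docking is an independent task. In the toy example below a local process pool stands in for the grid middleware, and score() is a placeholder for a real docking engine such as FlexX:

```python
# Sketch of the embarrassingly parallel structure of the screening: each
# (ligand, target) docking is an independent task. A local process pool
# stands in for the grid middleware; score() is a placeholder for a real
# docking engine such as FlexX.
from multiprocessing import Pool

ligands = [f"ligand_{i:04d}" for i in range(1000)]  # placeholder library
targets = ["DHFR", "GST"]                           # the two targets above

def score(task):
    ligand, target = task
    # Placeholder: a real deployment would invoke the docking engine here
    # and return its binding score for this ligand/target pair.
    return ligand, target, hash((ligand, target)) % 100

if __name__ == "__main__":
    tasks = [(lig, tgt) for lig in ligands for tgt in targets]
    with Pool() as pool:
        results = pool.map(score, tasks)            # one docking per task
    print(len(results), "dockings completed")
```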

  15. Computational tools and lattice design for the PEP-II B-Factory

    International Nuclear Information System (INIS)

    Cai, Y.; Irwin, J.; Nosochkov, Y.; Yan, Y.

    1997-01-01

    Several accelerator codes were used to design the PEP-II lattices, ranging from matrix-based codes, such as MAD and DIMAD, to symplectic-integrator codes, such as TRACY and DESPOT. In addition to element-by-element tracking, we constructed maps to determine aberration strengths. Furthermore, we have developed a fast and reliable method (nPB tracking) to track particles with a one-turn map. This new technique allows us to evaluate performance of the lattices on the entire tune-plane. Recently, we designed and implemented an object-oriented code in C++ called LEGO which integrates and expands upon TRACY and DESPOT. copyright 1997 American Institute of Physics

  16. Improved Linear Algebra Methods for Redshift Computation from Limited Spectrum Data - II

    Science.gov (United States)

    Foster, Leslie; Waagen, Alex; Aijaz, Nabella; Hurley, Michael; Luis, Apolo; Rinsky, Joel; Satyavolu, Chandrika; Gazis, Paul; Srivastava, Ashok; Way, Michael

    2008-01-01

    Given photometric broadband measurements of a galaxy, Gaussian processes may be used with a training set to solve the regression problem of approximating the redshift of this galaxy. However, in practice solving the traditional Gaussian processes equation is too slow and requires too much memory. We employed several methods to avoid this difficulty using algebraic manipulation and low-rank approximation, and were able to quickly approximate the redshifts in our testing data within 17 percent of the known true values using limited computational resources. The accuracy of one method, the V Formulation, is comparable to the accuracy of the best methods currently used for this problem.
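
    A minimal sketch of the low-rank idea: replace the full n×n kernel system with an m×m reduced system built from m inducing points. This is a generic Nyström-style (subset-of-regressors) construction on synthetic data, not the paper's specific V Formulation:

```python
# Sketch of low-rank kernel regression: the full n x n kernel system is
# replaced by an m x m system built from m inducing points (a generic
# Nystrom / subset-of-regressors construction, not the paper's specific
# "V Formulation"). Data are synthetic stand-ins for photometry/redshift.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, (2000, 5))                       # broadband inputs
y = X @ rng.normal(size=5) + 0.05 * rng.normal(size=2000)  # "redshifts"

def kernel(A, B, length=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length**2)                   # squared-exponential

m, lam = 100, 1e-3                                         # rank, regularisation
Z = X[rng.choice(len(X), m, replace=False)]                # inducing points
Knm, Kmm = kernel(X, Z), kernel(Z, Z)

# Solve the m x m reduced normal equations instead of the n x n system
w = np.linalg.solve(Knm.T @ Knm + lam * Kmm, Knm.T @ y)
pred = Knm @ w
print("rms error:", np.sqrt(np.mean((pred - y) ** 2)))
```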

  17. Computer virus information update CIAC-2301

    Energy Technology Data Exchange (ETDEWEB)

    Orvis, W.J.

    1994-01-15

    While CIAC periodically issues bulletins about specific computer viruses, these bulletins do not cover all the computer viruses that affect desktop computers. The purpose of this document is to identify most of the known viruses for the MS-DOS and Macintosh platforms and give an overview of the effects of each virus. The authors also include information on some Windows, Atari, and Amiga viruses. This document is revised periodically as new virus information becomes available. This document replaces all earlier versions of the CIAC Computer Virus Information Update. The date on the front cover indicates the date on which the information in this document was extracted from CIAC's virus database.

  18. Introduction to Xgrid: Cluster Computing for Everyone

    OpenAIRE

    Breen, Barbara J.; Lindner, John F.

    2010-01-01

    Xgrid is the first distributed computing architecture built into a desktop operating system. It allows you to run a single job across multiple computers at once. All you need is at least one Macintosh computer running Mac OS X v10.4 or later. (Mac OS X Server is not required.) We provide explicit instructions and example code to get you started, including examples of how to distribute your computing jobs, even if your initial cluster consists of just two old laptops in your basement.

  19. Intelligent Computer-Aided Instruction and Musical Performance Skills. CITE Report No. 18.

    Science.gov (United States)

    Baker, Michael

    This paper is a transcription from memory of a short talk that used overhead projector slides, with musical examples played on an Apple Macintosh computer and a Yamaha CX5 synthesizer. The slides appear in the text as reduced "icons" at the point where they would have been used in the talk. The paper concerns ways in which artificial intelligence…

  20. Computer studies of the evolution of planetary and satellite systems. II

    International Nuclear Information System (INIS)

    Barricelli, N.A.; Aashamar, K.

    1980-01-01

    This paper describes two computer experiments carried out with a CDC-Cyber 74 program for the computer simulation of a large number of objects in orbit about a central body, or primary. The first experiment was started with 125 planets, of which the two largest had coplanar orbits and masses comparable to those of Jupiter and Saturn, respectively; their semi-major axes and eccentricities were, however, much larger. The smaller planets had a distribution promoting the formation of an axial meeting area. The experiment gives information relevant to the question of the focusing of planetary orbits into a common plane and to the question of the formation and stability of an axial meeting area. Together with the second experiment, it also gives information about the development of commensurabilities (or resonances) with the largest planets. The second experiment started with 55 planets, none of them with a mass greater than about 20% of Jupiter's, but several of them with orbits close to a common plane. The aim of the experiment was to investigate whether successive captures followed by planetary fusion could lead to the formation of major planets comparable to Jupiter and Saturn, and in similar orbits. This experiment, too, gives information relevant to the commensurability problem. (Auth.)
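
    The basic machinery of such experiments, many small bodies advanced in orbit about a primary, can be sketched in a few lines. The example below uses a leapfrog integrator in units with G·M(primary) = 1 and, for brevity, omits the planet-planet forces that the actual study of course includes; all initial conditions are illustrative:

```python
# Minimal sketch of this kind of simulation: test planets advanced about a
# primary with a leapfrog (kick-drift-kick) integrator, in units where
# G*M(primary) = 1. Initial conditions are illustrative; planet-planet
# forces are omitted for brevity (the actual study of course includes them).
import numpy as np

rng = np.random.default_rng(2)
n, dt, steps = 20, 1e-3, 5000
r = rng.uniform(0.5, 2.0, (n, 3))
r[:, 2] *= 0.1                                          # nearly coplanar orbits
v = np.cross(np.array([0.0, 0.0, 1.0]), r)              # tangential directions
v *= (1.0 / np.linalg.norm(r, axis=1))[:, None] ** 1.5  # ~circular speeds

def accel(r):
    d = np.linalg.norm(r, axis=1)[:, None]
    return -r / d**3                                    # gravity of the primary

for _ in range(steps):
    v += 0.5 * dt * accel(r)                            # kick
    r += dt * v                                         # drift
    v += 0.5 * dt * accel(r)                            # kick
print("final mean orbital radius:", np.linalg.norm(r, axis=1).mean())
```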

  1. GIER: A Danish computer from 1961 with a role in the modern revolution of astronomy - II

    Science.gov (United States)

    Høg, Erik

    2018-04-01

    A Danish computer, GIER, from 1961 played a vital role in the development of a new method for astrometric measurement. This method, photon counting astrometry, ultimately led to two satellites with a significant role in the modern revolution of astronomy. A GIER was installed at the Hamburg Observatory in 1964 where it was used to implement the entirely new method for the measurement of stellar positions by means of a meridian circle, at that time the fundamental instrument of astrometry. An expedition to Perth in Western Australia with the instrument and the computer was a success. This method was also implemented in space in the first ever astrometric satellite Hipparcos launched by ESA in 1989. The Hipparcos results published in 1997 revolutionized astrometry with an impact in all branches of astronomy from the solar system and stellar structure to cosmic distances and the dynamics of the Milky Way. In turn, the results paved the way for a successor, the one million times more powerful Gaia astrometry satellite launched by ESA in 2013. Preparations for a Gaia successor in twenty years are making progress.

  2. LHCb computing in Run II and its evolution towards Run III

    CERN Document Server

    Falabella, Antonio

    2016-01-01

    This contribution reports on the experience of the LHCb computing team during LHC Run 2 and its preparation for Run 3. Furthermore, a brief introduction is given to LHCbDIRAC, i.e. the tool used to interface the experiment's distributed computing resources for its data processing and data management operations. Run 2, which started in 2015, has already seen several changes in the data processing workflows of the experiment, most notably the ability to align and calibrate the detector between two different stages of the data processing in the high level trigger farm, eliminating the need for a second-pass processing of the data offline. In addition, a fraction of the data is immediately reconstructed to its final physics format in the high level trigger, and only this format is exported from the experiment site for physics analysis. This concept has successfully been tested and will continue to be used for the rest of Run 2. Furthermore, the distributed data processing has been improved with new concepts and techn...

  3. Control of horizontal plasma position by feedforward-feedback system with digital computer in JIPP T-II tokamak

    International Nuclear Information System (INIS)

    Toi, K.; Sakurai, K.; Itoh, S.; Matsuura, K.; Tanahashi, S.

    1980-01-01

    In the resistive shell tokamak JIPP T-II, control of the horizontal plasma position is successfully carried out by solving the equilibrium equation for a thin resistive shell, in a large-aspect-ratio approximation, every 1.39 msec with a digital computer. The iron core effect is also taken into account by a simple term in the equation. The required strength of the vertical field is determined by a control demand composed of a "feedback" term with Proportional-Integral-Derivative correction (PID controller) and a "feedforward" term proportional to the plasma current. The experimental results show satisfactory agreement with the analysis of the control system. With this control system, the horizontal displacement has been suppressed to within 1 cm throughout a discharge for a plasma of 15 cm radius with high density and low q(a)-value, obtained by the second current rise and strong gas puffing. (author)
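
    A minimal sketch of the control law described above: a feedforward term proportional to plasma current plus PID feedback on the measured horizontal displacement, evaluated once per 1.39 ms cycle. The gains and the one-line toy plant are illustrative assumptions, not the JIPP T-II parameters:

```python
# Sketch of the control law above: vertical-field demand = feedforward
# term proportional to plasma current + PID feedback on the measured
# horizontal displacement, once per 1.39 ms cycle. Gains and the one-line
# toy plant are illustrative assumptions, not JIPP T-II parameters.
dt = 1.39e-3                              # control cycle from the abstract (s)
kp, ki, kd, kff = 5.0, 40.0, 1e-3, 0.5    # placeholder gains

x, prev_x, integral = 0.02, 0.02, 0.0     # 2 cm initial outward displacement
for _ in range(2000):                     # ~2.8 s of discharge
    ip = 1.0                              # normalised plasma current (placeholder)
    integral += x * dt
    derivative = (x - prev_x) / dt
    prev_x = x
    # feedforward + PID feedback gives the vertical-field demand
    bv = kff * ip + kp * x + ki * integral + kd * derivative
    # toy plant: outward drift balanced by the applied vertical field
    x += dt * (kff * ip - bv)
print("residual displacement (m):", round(x, 6))
```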

  4. CRAB-II: a computer program to predict hydraulics and scram dynamics of LMFBR control assemblies and its validation

    International Nuclear Information System (INIS)

    Carelli, M.D.; Baker, L.A.; Willis, J.M.; Engel, F.C.; Nee, D.Y.

    1982-01-01

    This paper presents an analytical method, the computer code CRAB-II, which calculates the hydraulics and scram dynamics of LMFBR control assemblies of the rod-bundle type, and its validation against prototypic data obtained for the Clinch River Breeder Reactor (CRBR) primary control assemblies. The physical-mathematical model of the code is presented, followed by a description of the testing of prototypic CRBR control assemblies in water and sodium to characterize, respectively, their hydraulic and scram dynamics behavior. Comparisons of code predictions against the experimental data are presented in detail; excellent agreement was found. Also reported are experimental data and empirical correlations for the friction factor of the absorber bundle over the entire flow range (laminar to turbulent), which represent an extension of the state of the art, since only friction factor correlations for fuel and blanket assemblies were previously reported in the open literature.

  5. Computer augmented modelling studies of Pb(II), Cd(II), Hg(II), Co(II), Ni(II), Cu(II) and Zn(II) complexes of L-glutamic acid in 1,2-propanediol–water mixtures

    Directory of Open Access Journals (Sweden)

    MAHESWARA RAO VEGI

    2008-12-01

    Full Text Available Chemical speciation of Pb(II), Cd(II), Hg(II), Co(II), Ni(II), Cu(II) and Zn(II) complexes of L-glutamic acid was studied at 303 K in 0–60 vol.% 1,2-propanediol–water mixtures, with the ionic strength maintained at 0.16 mol dm⁻³. The active forms of the ligand are LH3+, LH2 and LH–. The predominant detected species were ML, ML2, MLH, ML2H and ML2H2. The trend of the variation in the stability constants with changing dielectric constant of the medium is explained on the basis of the cation-stabilizing nature of the co-solvents, specific solvent–water interactions, charge dispersion and specific interactions of the co-solvent with the solute. The effect of systematic errors in the concentrations of the substances on the stability constants is in the order alkali >> acid > ligand > metal. The bioavailability and transportation of the metals are explained on the basis of distribution diagrams and stability constants.

  6. Calculation of DND-signals in case of fuel pin failures in KNK II with the computer code FICTION III

    International Nuclear Information System (INIS)

    Schmuck, I.

    1990-11-01

    In KNK II, two delayed neutron detectors are installed for the quick detection of fuel subassembly cladding failures. They record the release into the sodium of the precursors of the emitters of delayed neutrons. The computer code FICTION III calculates the expected delayed neutron signals for given fuel pin failures, where the user sets the boundary conditions interactively. Compared with FICTION II, the advances in FICTION III are the following: application of the data sets of 105 isotopes, distinction between thermal and fast neutron induced fission, partitioning of the sodium flow into two circuits, consideration of the specific fission rates in 10 fuel pin sections, and elaboration of the user's interaction possibilities for input/output. The capability of FICTION III is shown by means of two applications (the UNi test pin on position 100 and the third KNK fuel subassembly cladding failure). Further evaluations will address, among other things, the analysis of increased delayed neutron signals with regard to fault location and size.

  7. Two-loop renormalization in the standard model, part II. Renormalization procedures and computational techniques

    Energy Technology Data Exchange (ETDEWEB)

    Actis, S. [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Passarino, G. [Torino Univ. (Italy). Dipt. di Fisica Teorica; INFN, Sezione di Torino (Italy)

    2006-12-15

    In part I, general aspects of the renormalization of a spontaneously broken gauge theory were introduced. Here, in part II, two-loop renormalization is introduced and discussed within the context of the minimal Standard Model. This paper therefore deals with the transition from bare parameters and fields to renormalized ones. The full list of one- and two-loop counterterms is shown, and it is proven that, by a suitable extension of the formalism already introduced at the one-loop level, two-point functions suffice in renormalizing the model. The problem of overlapping ultraviolet divergencies is analyzed, and it is shown that all counterterms are local and of polynomial nature. The original program of 't Hooft and Veltman is at work. Finite parts are written in a way that allows for fast and reliable numerical integration, with all collinear logarithms extracted analytically. Finite renormalization, the transition from renormalized parameters to physical (pseudo-)observables, is discussed in part III, where numerical results, e.g. for the complex poles of the unstable gauge bosons, are shown. An attempt is made to define the running of the electromagnetic coupling constant at the two-loop level. (orig.)

  8. Lake Erie Water Level Study. Appendix A. Regulation. Volume II. Coordinated Basic Data and Computer Programs.

    Science.gov (United States)

    1981-07-01


  9. Patient size and x-ray technique factors in head computed tomography examinations. II. Image quality

    International Nuclear Information System (INIS)

    Huda, Walter; Lieberman, Kristin A.; Chang, Jack; Roskopf, Marsha L.

    2004-01-01

    We investigated how patient head characteristics, as well as the choice of x-ray technique factors, affect lesion contrast and noise values in computed tomography (CT) images. Head sizes and mean Hounsfield unit (HU) values were obtained from head CT images for five classes of patients ranging from newborns to adults. X-ray spectra with tube voltages ranging from 80 to 140 kV were used to compute the average photon energy and the energy fluence transmitted through the heads of patients of varying size. Image contrast, and the corresponding contrast-to-noise ratios (CNRs), were determined for lesions of fat, muscle, and iodine relative to a uniform water background. Maintaining a constant image CNR for each lesion, the energy imparted to the patient was also computed to identify the x-ray tube voltage that minimized the radiation dose. For adults, increasing the tube voltage from 80 to 140 kV changed the iodine HU from 2.62×10⁵ to 1.27×10⁵, the fat HU from -138 to -108, and the muscle HU from 37.1 to 33.0. Increasing the x-ray tube voltage from 80 to 140 kV increased the percentage energy fluence transmission by up to a factor of 2. For a fixed x-ray tube voltage, the percentage transmitted energy fluence in adults was more than a factor of 4 lower than for newborns. For adults, increasing the x-ray tube voltage from 80 to 140 kV improved the CNR for muscle lesions by 130%, for fat lesions by a factor of 2, and for iodine lesions by 25%. As the size of the patient increased from newborn to adult, lesion CNR was reduced by about a factor of 2. The mAs value can be reduced by 80% when scanning newborns while maintaining the same lesion CNR as for adults. Maintaining the CNR of an iodine lesion at a constant level, use of 140 kV increases the energy imparted to an adult patient by nearly a factor of 3.5 in comparison to 80 kV. For fat and muscle lesions, raising the x-ray tube voltage from 80 to 140 kV at a constant CNR increased the patient dose by 37% and 7
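
    Two of the quantities used above are easy to sketch: the lesion contrast-to-noise ratio, and the mAs rescaling that keeps CNR constant when image noise scales as 1/√mAs. The HU and noise figures below are illustrative (only the adult muscle HU of 37.1 comes from the abstract):

```python
# Sketch of lesion CNR and constant-CNR mAs rescaling. Noise scales as
# 1/sqrt(mAs), so CNR scales as sqrt(mAs). HU/noise values are
# illustrative; only the adult muscle HU of 37.1 comes from the abstract.
def cnr(hu_lesion, hu_background, noise_hu):
    return abs(hu_lesion - hu_background) / noise_hu

cnr_adult   = cnr(37.1, 0.0, 5.0)   # muscle lesion vs water, adult noise (placeholder)
cnr_newborn = cnr(37.1, 0.0, 2.2)   # smaller head -> more transmission, less noise

# Matching the adult CNR lets the newborn mAs drop by the squared ratio,
# consistent with the ~80% reduction reported above.
mas_fraction = (cnr_adult / cnr_newborn) ** 2
print(f"newborn mAs fraction at equal CNR: {mas_fraction:.2f}")
```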

  10. II - Detector simulation for the LHC and beyond : how to match computing resources and physics requirements

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    Detector simulation at the LHC is one of the most computing-intensive activities. In these lectures we will show how the physics requirements were met for the LHC experiments and extrapolate to future experiments (the FCC-hh case). At the LHC, detectors are complex, very precise and ambitious: this demands modern modelling tools for geometry and response. Events are busy and characterised by an unprecedented energy scale, with hundreds of particles to be traced and high energy showers to be accurately simulated. Furthermore, high luminosities imply many events in a bunch crossing and many bunch crossings to be considered at the same time. In addition, backgrounds not directly correlated with bunch crossings also have to be taken into account. The solutions chosen for ATLAS (a mixture of detailed simulation and fast simulation/parameterisation) will be described and CPU and memory figures will be given. An extrapolation to the FCC-hh case will be attempted, taking the calorimeter simulation as an example.

  11. Phosphorous vacancy nearest neighbor hopping induced instabilities in InP capacitors II. Computer simulation

    International Nuclear Information System (INIS)

    Juang, M.T.; Wager, J.F.; Van Vechten, J.A.

    1988-01-01

    Drain current drift in InP metal insulator semiconductor devices displays distinct activation energies and pre-exponential factors. The authors have given evidence that these result from two physical mechanisms: thermionic tunneling of electrons into native oxide traps and phosphorous vacancy nearest neighbor hopping (PVNNH). They here present a computer simulation of the effect of the PVNNH mechanism on flatband voltage shift vs. bias stress time measurements. The simulation is based on an analysis of the kinetics of the PVNNH defect reaction sequence, in which the electron concentration in the channel is related to the applied bias by a solution of the Poisson equation. The simulation demonstrates quantitatively that the temperature dependence of the flatband shift is associated with PVNNH for temperatures above room temperature

  12. Uranium accountability for ATR fuel fabrication: Part II. A computer simulation

    International Nuclear Information System (INIS)

    Dolan, C.A.; Nieschmidt, E.B.; Vegors, S.H. Jr.; Wagner, E.P. Jr.

    1977-08-01

    A stochastic computer model has been designed to simulate the material control system used during the production of fuel plates for the Advanced Test Reactor. Great care has been taken to ensure that this model follows the manufacturing and measuring processes actually used. The model is designed so that manufacturing process and measurement parameters are fed in as input; hence, changes in the manufacturing process and measurement procedures are easily simulated. Individual operations in the plant are described by program subroutines, and by varying the calling sequence of these subroutines, variations in the manufacturing process may be simulated. Using this model, values for MUF and LEMUF may be calculated for predetermined plant operating conditions. Furthermore, the effect on MUF and LEMUF produced by changing plant operating procedures and measurement techniques may also be examined. A sample calculation simulating one inventory period of the plant's operation is included
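    A minimal Monte Carlo sketch of this kind of material-balance simulation (hypothetical stream values and measurement uncertainties; the actual model drives one subroutine per plant operation): MUF is the book-minus-physical inventory difference, and LEMUF is its limit of error, conventionally about twice the propagated standard deviation.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_muf(n_runs, rel_sigma_meas=0.002):
    """Monte Carlo of one inventory period: every stream measured with noise."""
    begin = rng.normal(100.0, 100.0 * rel_sigma_meas, n_runs)
    receipts = rng.normal(50.0, 50.0 * rel_sigma_meas, n_runs)
    removals = rng.normal(40.0, 40.0 * rel_sigma_meas, n_runs)
    # True ending inventory is 100 + 50 - 40 = 110 kg, measured with noise.
    ending = rng.normal(110.0, 110.0 * rel_sigma_meas, n_runs)
    return begin + receipts - removals - ending

muf = simulate_muf(10_000)
lemuf = 2.0 * muf.std()        # limit of error on MUF, ~95% confidence
print(f"mean MUF = {muf.mean():+.4f} kg, LEMUF = {lemuf:.4f} kg")
```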

  13. EVOLVE : a Bridge between Probability, Set Oriented Numerics, and Evolutionary Computation II

    CERN Document Server

    Coello, Carlos; Tantar, Alexandru-Adrian; Tantar, Emilia; Bouvry, Pascal; Moral, Pierre; Legrand, Pierrick; EVOLVE 2012

    2013-01-01

    This book comprises a selection of papers from EVOLVE 2012, held in Mexico City, Mexico. The aim of EVOLVE is to build a bridge between probability, set oriented numerics and evolutionary computing, so as to identify new common and challenging research aspects. The conference is also intended to foster a growing interest in robust and efficient methods with a sound theoretical background. EVOLVE is intended to unify theory-inspired methods and cutting-edge techniques ensuring performance guarantee factors. By gathering researchers with different backgrounds, a unified view and vocabulary can emerge in which theoretical advancements may echo in different domains. In summary, EVOLVE focuses on challenging aspects arising at the passage from theory to new paradigms, and aims to provide a unified view while raising questions related to reliability, performance guarantees and modeling. The papers of EVOLVE 2012 contribute to this goal.

  14. Evaluation of a computer aided X-ray fluorographic system: Part II - image processing

    International Nuclear Information System (INIS)

    Burch, S.F.; Cocking, S.J.

    1981-12-01

    The TV imagery from a computer aided X-ray fluorographic system has been digitally processed with an I2S Model 70E image processor, controlled by a PDP 11/60 minicomputer. The image processor allowed valuable processing for detection of defects in cast components to be carried out at television frame rates. Summation of TV frames was used to reduce noise, and hence improve the thickness sensitivity of the system. A displaced differencing technique and interactive contrast enhancement were then used to improve the reliability of inspection by removing spurious blemishes and interference lines, while simultaneously enhancing the visibility of real defects. The times required for these operations are given, and the benefits provided for X-ray fluorography are illustrated by the results from inspection of aero engine castings. (author)
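    Both processing steps can be sketched in a few lines on synthetic data (the original work ran on dedicated hardware at TV frame rates): summing N frames improves the signal-to-noise ratio by a factor of sqrt(N), and subtracting a slightly displaced copy of the image suppresses slowly varying background while leaving local defects visible.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic radiograph: smooth thickness gradient plus a small local defect.
truth = np.fromfunction(lambda y, x: 100.0 + 0.2 * x, (64, 64))
truth[30:34, 30:34] -= 5.0

def acquire(n_frames, noise=8.0):
    """Average n noisy TV frames: noise falls as 1/sqrt(n_frames)."""
    frames = truth + rng.normal(0.0, noise, (n_frames, *truth.shape))
    return frames.mean(axis=0)

img = acquire(64)                 # 64-frame summation: noise reduced 8x

# Displaced differencing: subtract the image shifted by a few pixels.
# The smooth gradient becomes a constant offset; defect edges stand out.
shift = 3
diff = img[:, shift:] - img[:, :-shift]
print(f"background level in difference image ~ {diff[0, :20].mean():.2f}")
print(f"defect contrast in difference image  ~ {np.ptp(diff[30:34, 25:35]):.2f}")
```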

  15. Circular-detector array hybrid emission computed tomograph, HEADTOME-II

    Energy Technology Data Exchange (ETDEWEB)

    Uemura, Kazuo; Kanno, Iwao; Miura, Yuko; Miura, Syuichi [Research Inst. of Brain and Blood Vessels, Akita (Japan); Hattori, Hiroyuki; Hirose, Yoshiharu; Nagata, Takashi; Higashi, Yoshihumi

    1982-08-01

    The HEADTOME-II is a successor of the original hybrid ECT, the HEADTOME-I, which was built and used in the Research Institute of Brain and Blood Vessels, Akita. The new machine has three detector rings comprising 64 NaI crystals each; the axial length is 30 mm per ring, 100 mm in total for the three rings including 5 mm lead shields between the rings, and is long enough to examine most of the brain at a given time. The system sensitivity for positrons is 27.5 kcps/μCi/ml for intra-ring coincidence and 36.5 kcps for inter-ring coincidence. Spatial resolution is 10 mm FWHM at the center of the field of view. For single photon ECT studies, new ''Turbofan'' rotating collimators, high sensitivity (H.S.) and high resolution (H.R.), are adopted. The collimators for single photons and positrons can be selected by manual operation. The system sensitivity for 99mTc is 52.5 kcps/μCi/ml with the H.S. and 13.8 kcps with the H.R. collimator. Spatial resolution at the center of the field of view is 20 mm for H.S. and 11 mm for H.R., respectively. High quality images have been obtained with 99mTc, 81mKr, 111In and 11C. Regional cerebral blood flow studies by 133Xe inhalation or intravenous injection have been tried and good results obtained.

  16. A circular-detector array hybrid emission computed tomograph, HEADTOME-II

    International Nuclear Information System (INIS)

    Uemura, Kazuo; Kanno, Iwao; Miura, Yuko; Miura, Syuichi; Hattori, Hiroyuki; Hirose, Yoshiharu; Nagata, Takashi; Higashi, Yoshihumi.

    1982-01-01

    The HEADTOME-II is a successor of the original hybrid ECT, the HEADTOME-I, which was built and used in the Research Institute of Brain and Blood Vessels, Akita. The new machine has three detector rings comprising 64 NaI crystals each; the axial length is 30 mm per ring, 100 mm in total for the three rings including 5 mm lead shields between the rings, and is long enough to examine most of the brain at a time. The system sensitivity for positrons is 27.5 kcps/μCi/ml for intra-ring coincidence and 36.5 kcps for inter-ring coincidence. Spatial resolution is 10 mm FWHM at the center of the field of view. For single photon ECT studies, new ''Turbofan'' rotating collimators, high sensitivity (H.S.) and high resolution (H.R.), are adopted. The collimators for single photons and positrons can be selected by manual operation. The system sensitivity for 99mTc is 52.5 kcps/μCi/ml with the H.S. and 13.8 kcps with the H.R. collimator. Spatial resolution at the center of the field of view is 20 mm for H.S. and 11 mm for H.R., respectively. High quality images have been obtained with 99mTc, 81mKr, 111In and 11C. Regional cerebral blood flow studies by 133Xe inhalation or intravenous injection have been tried and good results obtained. (author)

  17. AIRDOS-II computer code for estimating radiation dose to man from airborne radionuclides in areas surrounding nuclear facilities

    International Nuclear Information System (INIS)

    Moore, R.E.

    1977-04-01

    The AIRDOS-II computer code estimates individual and population doses resulting from the simultaneous atmospheric release of as many as 36 radionuclides from a nuclear facility. This report describes the meteorological and environmental models used in the code, their computer implementation, and the applicability of the code to assessments of radiological impact. Atmospheric dispersion and surface deposition of released radionuclides are estimated as a function of direction and distance from a nuclear power plant or fuel-cycle facility, and doses to man through inhalation, air immersion, exposure to contaminated ground, food ingestion, and water immersion are estimated in the surrounding area. Annual doses are estimated for total body, GI tract, bone, thyroid, lungs, muscle, kidneys, liver, spleen, testes, and ovaries. Either the annual population doses (man-rems/year) or the highest annual individual doses in the assessment area (rems/year), whichever are applicable, are summarized in output tables in several ways--by nuclides, modes of exposure, and organs. The location of the highest individual doses for each reference organ estimated for the area is specified in the output data
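    The atmospheric dispersion step in codes of this family is classically a Gaussian plume model; a minimal ground-level, plume-centerline version (generic dispersion parameters chosen for illustration, not AIRDOS-II's actual sector-averaged coefficients):

```python
import numpy as np

def plume_concentration(q_release, u_wind, sigma_y, sigma_z, h_release=0.0):
    """Ground-level, plume-centerline Gaussian concentration chi.

    q_release : source strength (e.g. Ci/s), u_wind : mean wind speed (m/s),
    sigma_y, sigma_z : dispersion widths (m) evaluated at the downwind
    distance of interest, h_release : effective release height (m).
    """
    return (q_release / (np.pi * sigma_y * sigma_z * u_wind)
            * np.exp(-h_release**2 / (2.0 * sigma_z**2)))

# Illustrative: unit release rate, 3 m/s wind, dispersion widths typical of
# ~1 km downwind in neutral stability, 50 m effective stack height.
chi = plume_concentration(1.0, 3.0, sigma_y=70.0, sigma_z=30.0, h_release=50.0)
print(f"chi/Q ~ {chi:.2e} s/m^3")
# An inhalation dose would then be chi * breathing rate * dose conversion factor.
```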

  18. Effect of resin coating and occlusal loading on microleakage of Class II computer-aided design/computer-aided manufacturing fabricated ceramic restorations: a confocal microscopic study.

    Science.gov (United States)

    Kitayama, Shuzo; Nasser, Nasser A; Pilecki, Peter; Wilson, Ron F; Nikaido, Toru; Tagami, Junji; Watson, Timothy F; Foxton, Richard M

    2011-05-01

    To evaluate the effect of resin coating and occlusal loading on microleakage of Class II computer-aided design/computer-aided manufacturing (CAD/CAM) ceramic restorations. Molars were prepared for a mesio-occlusal-distal (MOD) inlay and were divided into two groups: non-coated (controls); and resin-coated, in which the cavity was coated with a combination of a dentin bonding system (Clearfil Protect Bond) and a flowable resin composite (Clearfil Majesty Flow). Ceramic inlays were fabricated using the CAD/CAM technique (CEREC 3) and cemented with resin cement (Clearfil Esthetic Cement). After 24 h of water storage, the restored teeth in each group were divided into two subgroups: unloaded, or loaded with an axial force of 80 N at a rate of 2.5 cycles/s for 250,000 cycles while stored in water. After immersion in 0.25% Rhodamine B solution, the teeth were sectioned bucco-lingually at the mesial and distal boxes. Tandem scanning confocal microscopy (TSM) was used for evaluation of microleakage. The locations of the measurements were assigned to the cavity walls and floor. Loading did not have a significant effect on microleakage in either the resin-coated or non-coated group. Resin coating significantly reduced microleakage regardless of loading. The cavity floor exhibited greater microleakage compared to the cavity wall. TSM observation also revealed that microleakage at the enamel surface was minimal regardless of resin coating. In contrast, non-coated dentin showed extensive leakage, whereas resin-coated dentin showed decreased leakage. Resin coating with a combination of a dentin-bonding system and a flowable resin composite may be indicated prior to impression-taking when restoring teeth with CAD/CAM ceramic inlays in order to reduce microleakage at the tooth-resin interface.

  19. Design procedure for pollutant loadings and impacts for highway stormwater runoff (Macintosh version) (for microcomputers). Software

    International Nuclear Information System (INIS)

    1990-01-01

    The interactive computer program provides a user-friendly personal-computer procedure for the calculations and guidance needed to estimate pollutant loadings and impacts from highway stormwater runoff, as presented in Publication FHWA-RD-88-006, Pollutant Loadings and Impacts from Highway Stormwater Runoff, Volume I: Design Procedure. The program evaluates the water quality impact of highway stormwater runoff on a lake or stream at a specific highway site, taking into account the necessary rainfall data and the geographic situation of the site. The evaluation considers whether the resulting water quality conditions can cause a problem, as indicated by violations of water quality criteria or objectives

  20. II - Template Metaprogramming for Massively Parallel Scientific Computing - Vectorization with Expression Templates

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    Large scale scientific computing raises questions on different levels, ranging from the formulation of the problems to the choice of the best algorithms and their implementation for a specific platform. There are similarities between these different topics that can be exploited by modern-style C++ template metaprogramming techniques to produce readable, maintainable and generic code. Traditional low-level code tends to be fast but platform-dependent, and it obfuscates the meaning of the algorithm. On the other hand, an object-oriented approach is nice to read, but may come with an inherent performance penalty. These lectures aim to present the basics of the Expression Template (ET) idiom, which allows us to keep the object-oriented approach without sacrificing performance. We will in particular show how to enhance ET to include SIMD vectorization. We will then introduce techniques for abstracting iteration, and introduce thread-level parallelism for use in heavy data-centric loads. We will show how to apply these methods i...
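    The ET idiom itself is C++-native, but its core idea, recording an elementwise expression as an object and evaluating it lazily in a single pass with no temporaries, can be illustrated in a few lines of Python (a toy analogue for exposition only; real ET implementations resolve the tree at compile time via templates):

```python
class Expr:
    """Node in a lazy elementwise expression tree."""
    def __add__(self, other):
        return BinOp(self, other, lambda a, b: a + b)
    def __mul__(self, other):
        return BinOp(self, other, lambda a, b: a * b)

class Vec(Expr):
    def __init__(self, data):
        self.data = list(data)
    def __getitem__(self, i):
        return self.data[i]
    def __len__(self):
        return len(self.data)

class BinOp(Expr):
    def __init__(self, lhs, rhs, op):
        self.lhs, self.rhs, self.op = lhs, rhs, op
    def __getitem__(self, i):
        # Evaluation happens here, element by element: no temporary vectors.
        return self.op(self.lhs[i], self.rhs[i])
    def __len__(self):
        return len(self.lhs)

x, y, z = Vec([1, 2, 3]), Vec([4, 5, 6]), Vec([7, 8, 9])
expr = x + y * z          # builds an expression tree, computes nothing yet
result = [expr[i] for i in range(len(expr))]   # one fused loop, one pass
print(result)             # [29, 42, 57]
```

    In C++ the same tree is encoded in the expression's type, so the per-element loop is fused and inlined by the compiler rather than interpreted at run time.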

  1. SIMMER-II: A computer program for LMFBR disrupted core analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bohl, W.R.; Luck, L.B.

    1990-06-01

    SIMMER-2 (Version 12) is a computer program to predict the coupled neutronic and fluid-dynamics behavior of liquid-metal fast reactors during core-disruptive accident transients. The modeling philosophy is based on the use of general, but approximate, physics to represent interactions of accident phenomena and regimes rather than a detailed representation of specialized situations. Reactor neutronic behavior is predicted by solving space (r,z), energy, and time-dependent neutron conservation equations (discrete ordinates transport or diffusion). The neutronics and the fluid dynamics are coupled via temperature- and background-dependent cross sections and the reactor power distribution. The fluid-dynamics calculation solves multicomponent, multiphase, multifield equations for mass, momentum, and energy conservation in (r,z) or (x,y) geometry. A structure field with nine density and five energy components; a liquid field with eight density and six energy components; and a vapor field with six density and one energy component are coupled by exchange functions representing a modified-dispersed flow regime with a zero-dimensional intra-cell structure model.
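    At a vastly simplified level, the neutronics step in such codes solves an eigenvalue problem for the flux and the multiplication factor; a toy one-group, one-dimensional diffusion solver using power iteration (generic slab constants for illustration, unrelated to SIMMER's actual cross sections) shows the structure:

```python
import numpy as np

# One-group slab reactor: -D phi'' + Sigma_a phi = (1/k) nuSigma_f phi,
# with phi = 0 at both edges (Dirichlet boundaries built into the matrix).
n_cells, width = 50, 100.0                    # mesh cells, slab width (cm)
h = width / n_cells
D, sigma_a, nu_sigma_f = 1.0, 0.01, 0.012     # illustrative constants

off = -D / h**2
A = (np.diag(np.full(n_cells, 2.0 * D / h**2 + sigma_a))
     + np.diag(np.full(n_cells - 1, off), 1)
     + np.diag(np.full(n_cells - 1, off), -1))

phi, k = np.ones(n_cells), 1.0
for _ in range(200):                          # power iteration on fission source
    phi_new = np.linalg.solve(A, nu_sigma_f * phi / k)
    k *= phi_new.sum() / phi.sum()            # rescale k by source-growth ratio
    phi = phi_new / np.linalg.norm(phi_new)   # normalise to avoid over/underflow
print(f"k_eff ~ {k:.4f}")   # approx. nuSigma_f / (Sigma_a + D * B^2) for this slab
```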

  2. Computer program for regional assessment of lung perfusion defect. Part II - verification of the algorithm

    International Nuclear Information System (INIS)

    Stefaniak, B.

    2002-01-01

    As described earlier, a dedicated computer program was developed for quantitative evaluation of regional lung perfusion defects visualized by pulmonary scintigraphy. The correctness of the basic assumptions accepted in constructing the algorithms, and of all program functions, needed to be checked before the program could enter clinical routine. The aim of this study was to verify the program using various software instruments and physical models. Evaluation of the proposed method was performed using software procedures, a physical lung phantom, and selected lung images. The reproducibility of the lung regions defined by the program was found to be excellent. No significant distortion of registered data was observed after ROI transformation into a circle and retransformation into the original shape. The obtained results comprised parametric presentation of activity defects as well as a set of numerical indices defining the extent and intensity of decreased count density. Among these indices, PD2 and DM* proved the most suitable for the above purposes. The results indicate that the algorithms used to construct the program were correct and suitable for the aim of the study. They enable the function under study to be presented graphically, with true imaging of activity distribution, and numerical indices defining the extent and intensity of activity defects to be calculated. (author)

  3. Cosmic reionization on computers. II. Reionization history and its back-reaction on early galaxies

    Energy Technology Data Exchange (ETDEWEB)

    Gnedin, Nickolay Y. [Particle Astrophysics Center, Fermi National Accelerator Laboratory, Batavia, IL 60510 (United States); Kaurov, Alexander A., E-mail: gnedin@fnal.gov, E-mail: kaurov@uchicago.edu [Department of Astronomy and Astrophysics, The University of Chicago, Chicago, IL 60637 (United States)

    2014-09-20

    We compare the results from several sets of cosmological simulations of cosmic reionization, produced under the Cosmic Reionization On Computers project, with existing observational data on the high-redshift Lyα forest and the abundance of Lyα emitters. We find good consistency with the observational measurements and previous simulation work. By virtue of having several independent realizations for each set of numerical parameters, we are able to explore the effect of cosmic variance on observable quantities. One unexpected conclusion we are forced into is that cosmic variance is unusually large at z > 6, with both our simulations and, most likely, observational measurements still not fully converged for even such basic quantities as the average Gunn-Peterson optical depth or the volume-weighted neutral fraction. We also find that reionization has little effect on the early galaxies or on global cosmic star formation history, because galaxies whose gas content is affected by photoionization contain no molecular (i.e., star-forming) gas in the first place. In particular, measurements of the faint end of the galaxy luminosity function by the James Webb Space Telescope are unlikely to provide a useful constraint on reionization.

  4. Development of computer code SIMPSEX for simulation of FBR fuel reprocessing flowsheets: II. additional benchmarking results

    International Nuclear Information System (INIS)

    Shekhar Kumar; Koganti, S.B.

    2003-07-01

    Benchmarking and application of the computer code SIMPSEX for high-plutonium FBR flowsheets was reported in an earlier report (IGC-234). Improvements and recompilation of the code (Version 4.01, March 2003) required re-validation with the existing benchmarks as well as additional benchmark flowsheets. Improvements in the high-Pu region (Pu(aq) > 30 g/L) resulted in better results in the 75% Pu flowsheet benchmark. Below 30 g/L Pu(aq) concentration, results were identical to those from the earlier version (SIMPSEX Version 3, compiled in 1999). In addition, 13 published flowsheets were taken as additional benchmarks. Eleven of these flowsheets cover a wide range of feed concentrations, and a few of them are β-γ-active runs with FBR fuels having a wide distribution of burnup and Pu ratios. A published total partitioning flowsheet using externally generated U(IV) was also simulated using SIMPSEX. SIMPSEX predictions were compared with published predictions from the conventional SEPHIS, PUMA, PUNE and PUBG codes, and were found to be comparable to or better than the results from those codes. In addition, recently reported UREX demo results along with AMUSE simulations are also compared with SIMPSEX predictions. Results of benchmarking SIMPSEX against these 14 benchmark flowsheets are discussed in this report. (author)

  5. SIMMER-II: A computer program for LMFBR disrupted core analysis

    International Nuclear Information System (INIS)

    Bohl, W.R.; Luck, L.B.

    1990-06-01

    SIMMER-2 (Version 12) is a computer program to predict the coupled neutronic and fluid-dynamics behavior of liquid-metal fast reactors during core-disruptive accident transients. The modeling philosophy is based on the use of general, but approximate, physics to represent interactions of accident phenomena and regimes rather than a detailed representation of specialized situations. Reactor neutronic behavior is predicted by solving space (r,z), energy, and time-dependent neutron conservation equations (discrete ordinates transport or diffusion). The neutronics and the fluid dynamics are coupled via temperature- and background-dependent cross sections and the reactor power distribution. The fluid-dynamics calculation solves multicomponent, multiphase, multifield equations for mass, momentum, and energy conservation in (r,z) or (x,y) geometry. A structure field with nine density and five energy components; a liquid field with eight density and six energy components; and a vapor field with six density and one energy component are coupled by exchange functions representing a modified-dispersed flow regime with a zero-dimensional intra-cell structure model

  6. Macintosh/LabVIEW based control and data acquisition system for a single photon counting fluorometer

    Science.gov (United States)

    Stryjewski, Wieslaw J.

    1991-08-01

    A flexible software system has been developed for controlling fluorescence decay measurements using the virtual instrument approach offered by LabVIEW. The time-correlated single photon counting instrument operates under computer control in both manual and automatic mode. Implementation time was short and the equipment is now easier to use, reducing the training time required for new investigators. It is not difficult to customize the front panel or adapt the program to a different instrument. We found LabVIEW much more convenient to use for this application than traditional, textual computer languages.

  7. Comparison of the Glidescope and Pentax AWS laryngoscopes to the Macintosh laryngoscope for use by advanced paramedics in easy and simulated difficult intubation.

    LENUS (Irish Health Repository)

    Nasim, Sajid

    2009-01-01

    BACKGROUND: Intubation of the trachea in the pre-hospital setting may be lifesaving in severely ill and injured patients. However, tracheal intubation is frequently difficult to perform in this challenging environment, is associated with a lower success rate, and failed tracheal intubation constitutes an important cause of morbidity. Novel indirect laryngoscopes, such as the Glidescope and the AWS laryngoscopes, may reduce this risk. METHODS: We compared the efficacy of these devices to the Macintosh laryngoscope when used by 25 Advanced Paramedics proficient in direct laryngoscopy, in a randomized, controlled, manikin study. Following brief didactic instruction with the Glidescope and the AWS laryngoscopes, each participant took turns performing laryngoscopy and intubation with each device, in an easy intubation scenario and following placement of a hard cervical collar, in a SimMan manikin. RESULTS: Both the Glidescope and the AWS performed better than the Macintosh, and demonstrated considerable promise in this context. The AWS had the fewest dental compressions in all three scenarios, and in the cervical spine immobilization scenario it required fewer maneuvers to optimize the view of the glottis. CONCLUSION: The Glidescope and AWS devices possess advantages over the conventional Macintosh laryngoscope when used by Advanced Paramedics in normal and simulated difficult intubation scenarios in this manikin study. Further studies are required to extend these findings to the clinical setting.

  8. Comparison of the Glidescope® and Pentax AWS® laryngoscopes to the Macintosh laryngoscope for use by Advanced Paramedics in easy and simulated difficult intubation

    Directory of Open Access Journals (Sweden)

    O' Donnell John

    2009-05-01

    Background: Intubation of the trachea in the pre-hospital setting may be lifesaving in severely ill and injured patients. However, tracheal intubation is frequently difficult to perform in this challenging environment, is associated with a lower success rate, and failed tracheal intubation constitutes an important cause of morbidity. Novel indirect laryngoscopes, such as the Glidescope® and the AWS® laryngoscopes, may reduce this risk. Methods: We compared the efficacy of these devices to the Macintosh laryngoscope when used by 25 Advanced Paramedics proficient in direct laryngoscopy, in a randomized, controlled, manikin study. Following brief didactic instruction with the Glidescope® and the AWS® laryngoscopes, each participant took turns performing laryngoscopy and intubation with each device, in an easy intubation scenario and following placement of a hard cervical collar, in a SimMan® manikin. Results: Both the Glidescope® and the AWS® performed better than the Macintosh, and demonstrated considerable promise in this context. The AWS® had the fewest dental compressions in all three scenarios, and in the cervical spine immobilization scenario it required fewer maneuvers to optimize the view of the glottis. Conclusion: The Glidescope® and AWS® devices possess advantages over the conventional Macintosh laryngoscope when used by Advanced Paramedics in normal and simulated difficult intubation scenarios in this manikin study. Further studies are required to extend these findings to the clinical setting.

  9. Domain interaction in rabbit muscle pyruvate kinase. II. Small angle neutron scattering and computer simulation.

    Science.gov (United States)

    Consler, T G; Uberbacher, E C; Bunick, G J; Liebman, M N; Lee, J C

    1988-02-25

    The effects of ligands on the structure of rabbit muscle pyruvate kinase were studied by small angle neutron scattering. The radius of gyration, RG, decreases by about 1 Å in the presence of the substrate phosphoenolpyruvate, but increases by about the same magnitude in the presence of the allosteric inhibitor phenylalanine. With increasing pH, or in the absence of Mg2+ and K+, the RG of pyruvate kinase increases. Hence, there is a 2-Å difference in RG between the two alternative conformations. Length distribution analysis indicates that, under all experimental conditions which increase the radius of gyration, there is a pronounced increase in the probability of interatomic distances between 80 and 110 Å. These small angle neutron scattering results indicate a "contraction" and "expansion" of the enzyme when it transforms between its active and inactive forms. Using the alpha-carbon coordinates of crystalline cat muscle pyruvate kinase, a length distribution profile was calculated, and it matches the scattering profile of the inactive form. These observations are expected, since the crystals were grown in the absence of divalent cations (Stuart, D. I., Levine, M., Muirhead, H., and Stammers, D. K. (1979) J. Mol. Biol. 134, 109-142). Hence, results from neutron scattering, x-ray crystallographic, and sedimentation studies (Oberfelder, R. W., Lee, L. L.-Y., and Lee, J.C. (1984) Biochemistry 23, 3813-3821) are fully consistent with each other. With the aid of computer modeling, the crystal structure has been manipulated in order to effect changes that are consistent with the conformational change described by the solution scattering data. The structural manipulation involves the rotation of the B domain relative to the A domain, leading to closure of the cleft between these domains. These manipulations resulted in the generation of new sets of atomic (C-alpha) coordinates, which were utilized in calculations, the result of which compared favorably with the
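    Both solution-scattering quantities used above, the radius of gyration and the length (pair-distance) distribution, are straightforward to compute from a set of C-alpha coordinates; a sketch with random coordinates standing in for the real structure:

```python
import numpy as np

rng = np.random.default_rng(2)
coords = rng.normal(0.0, 15.0, (500, 3))   # stand-in C-alpha coordinates (Å)

def radius_of_gyration(xyz):
    """Rg = root mean squared distance from the center of mass (equal weights)."""
    centered = xyz - xyz.mean(axis=0)
    return np.sqrt((centered**2).sum(axis=1).mean())

def length_distribution(xyz, bins=50):
    """Histogram of all pairwise distances, cf. the length distribution profile."""
    diff = xyz[:, None, :] - xyz[None, :, :]
    d = np.sqrt((diff**2).sum(axis=-1))
    iu = np.triu_indices(len(xyz), k=1)     # each pair counted once
    return np.histogram(d[iu], bins=bins, density=True)

rg = radius_of_gyration(coords)
p_r, bin_edges = length_distribution(coords)
print(f"Rg = {rg:.1f} Å")   # a ~1-2 Å shift in this value separates the two states
```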

  10. Web-based computational chemistry education with CHARMMing II: Coarse-grained protein folding.

    Directory of Open Access Journals (Sweden)

    Frank C Pickard

    2014-07-01

    A lesson utilizing a coarse-grained (CG) Gō-like model has been implemented into the CHARMM INterface and Graphics (CHARMMing) web portal (www.charmming.org) to the Chemistry at HARvard Macromolecular Mechanics (CHARMM) molecular simulation package. While widely used to model various biophysical processes, such as protein folding and aggregation, CG models can also serve as an educational tool, because they provide qualitative descriptions of complex biophysical phenomena at relatively cheap computational cost. As a proof of concept, this lesson demonstrates the construction of a CG model of a small globular protein, its simulation via Langevin dynamics, and the analysis of the resulting data. The lesson makes connections between modern molecular simulation techniques and topics commonly presented in an advanced undergraduate lecture on physical chemistry. It culminates in a straightforward analysis of a short dynamics trajectory of a small, fast-folding globular protein; we briefly describe the thermodynamic properties that can be calculated from this analysis. The assumptions inherent in the model and the data analysis are laid out in a clear, concise manner, and the techniques used are consistent with those employed by specialists in the field of CG modeling. One of the major tasks in building the Gō-like model is determining the relative strength of the nonbonded interactions between coarse-grained sites, and new functionality has been added to CHARMMing to facilitate this process. The implementation of these features into CHARMMing helps automate many of the tedious aspects of constructing a CG Gō model. The CG model builder and its accompanying lesson should be a valuable tool for chemistry students, teachers, and modelers in the field.
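    As an indication of what the simulation step involves (a single coordinate in a harmonic "native basin" standing in for the CG protein; the actual lesson uses CHARMM's Gō-model energies), an overdamped Langevin integrator looks like this:

```python
import numpy as np

rng = np.random.default_rng(3)

def langevin(n_steps, dt=0.01, gamma=1.0, kT=1.0, k_spring=1.0):
    """Overdamped Langevin: dx = (F/gamma) dt + sqrt(2 kT dt / gamma) N(0,1)."""
    x = np.empty(n_steps)
    x[0] = 0.0
    noise_amp = np.sqrt(2.0 * kT * dt / gamma)
    for i in range(1, n_steps):
        force = -k_spring * x[i - 1]          # harmonic restoring force
        x[i] = x[i - 1] + force / gamma * dt + noise_amp * rng.normal()
    return x

traj = langevin(100_000)
# Thermodynamic check: equipartition gives <x^2> = kT / k_spring = 1.
print(f"<x^2> = {np.mean(traj**2):.3f} (expect ~1.0)")
```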

  11. Learning and performance of tracheal intubation by novice personnel: a comparison of the Airtraq and Macintosh laryngoscope.

    LENUS (Irish Health Repository)

    Maharaj, C H

    2006-07-01

    Direct laryngoscopic tracheal intubation is taught to many healthcare professionals as it is a potentially lifesaving procedure. However, it is a difficult skill to acquire and maintain, and, of concern, the consequences of poorly performed intubation attempts are potentially serious. The Airtraq Laryngoscope is a novel intubation device which may possess advantages over conventional direct laryngoscopes for use by novice personnel. We conducted a prospective trial with 40 medical students who had no prior airway management experience. Following brief didactic instruction, each participant took turns in performing laryngoscopy and intubation using the Macintosh and Airtraq devices under direct supervision. Each student was allowed up to three attempts to intubate in three laryngoscopy scenarios using a Laerdal Intubation Trainer and one scenario in a Laerdal SimMan Manikin. They then performed tracheal intubation of the normal airway a second time to characterise the learning curve for each device. The Airtraq provided superior intubating conditions, resulting in greater success of intubation, particularly in the difficult laryngoscopy scenarios. In both easy and simulated difficult laryngoscopy scenarios, the Airtraq decreased the duration of intubation attempts, reduced the number of optimisation manoeuvres required, and reduced the potential for dental trauma. The Airtraq device showed a rapid learning curve and the students found it significantly easier to use. The Airtraq appears to be a superior device for novice personnel to acquire the skills of tracheal intubation.

  12. Comparison of the effects of Truview PCD™ video laryngoscopy and Macintosh blade direct laryngoscopy in geriatric patients.

    Science.gov (United States)

    Kurnaz, Muhammed M; Sarıtaş, Aykut

    2016-12-01

    To compare the effects of Truview PCD™ video laryngoscopy (TVL) and Macintosh blade direct laryngoscopy (MDL) on the hemodynamic responses observed during laryngoscopy and on orotracheal intubation conditions in geriatric patients. Randomized prospective study. Operating room. One hundred patients aged 65 years and older, in American Society of Anesthesiologists risk group I to III, underwent elective surgery under general anesthesia. The study was performed between January 2014 and February 2015 after institutional ethics committee approval. Patients were randomly allocated to 2 groups, namely, TVL and MDL. Hemodynamic parameters, modified Cormack-Lehane grade, intubation period, and preoperative examination findings (age, sex, American Society of Anesthesiologists status, modified Mallampati test score, and thyromental and sternomental distances) were evaluated. There were no statistically significant differences in hemodynamic responses (heart rates and mean arterial pressure) between the 2 groups (P>.05). The median intubation period in the TVL group was significantly longer than that observed in the MDL group (t=4.594; P<.05). The TVL system does not provide significant hemodynamic response sparing or shorter orotracheal intubation times when compared to MDL in geriatric patients.

  13. Computational analysis of neutronic parameters for TRIGA Mark-II research reactor using evaluated nuclear data libraries

    International Nuclear Information System (INIS)

    Uddin, M.N.; Sarker, M.M.; Khan, M.J.H.; Islam, S.M.A.

    2010-01-01

    The aim of this study is to analyze the neutronic parameters of the TRIGA Mark-II research reactor using the chain of NJOY-WIMS-CITATION computer codes based on the evaluated nuclear data libraries CENDL-2.2 and JEFF-3.1.1. The nuclear data processing code NJOY99.0 was employed to generate the 69-group WIMS library for the isotopes of the TRIGA core. The cell code WIMSD-5B was used to generate the cross sections in CITATION format, and the 3-dimensional diffusion code CITATION was then used to calculate the neutronic parameters of the TRIGA Mark-II research reactor. All analyses were performed using the 7-group macroscopic cross section library. The CITATION test runs using different cross section sets, based on different models applied in the WIMS calculations, showed a strong influence of those models on the final integral parameters. Some of the cells were specially treated with the PRIZE options available in WIMSD-5B to take into account the fine structure of the flux gradient in the fuel-reflector interface region. It was observed that two basic parameters, the effective multiplication factor k_eff and the thermal neutron flux, showed good agreement among the calculated results as well as with the measured values. The maximum power densities at the hot spot were 1.0446E02 W/cc and 1.0426E02 W/cc for the libraries CENDL-2.2 and JEFF-3.1.1, respectively. The calculated total peaking factors of 5.793 and 5.745 were compared to the original SAR value of 5.6325 as well as to the MCNP result. This analysis should help enhance the neutronic calculations and can also be used for further thermal-hydraulic studies of the TRIGA core.

  14. Control of horizontal plasma position by feedforward-feedback system with digital computer in the JIPP T-II tokamak

    International Nuclear Information System (INIS)

    Toi, Kazuo; Sakurai, Keiichi; Itoh, Satoshi; Matsuura, Kiyokata; Tanahashi, Shugo

    1980-01-01

    In the resistive shell tokamak JIPP T-II, control of the horizontal plasma position is successfully carried out by solving, every 1.39 ms with a digital computer, the equilibrium equation of a large-aspect-ratio tokamak plasma surrounded by a thin resistive shell with a skin time of 5.2 ms. The iron core effect is also taken into account by a simple term in the equation. The required strength of the vertical field is determined by a control demand composed of two parts: a ''feedback'' term given by the deviation of the plasma position from the desired one with proportional-integral-derivative correction (PID controller), and a ''feedforward'' term proportional to the plasma current. The experimental results in a quasi-constant phase of the plasma current are in good agreement with the stability analysis of the control system using the so-called Bode diagram, calculated on the assumption that the plasma current is independent of time. By this control system, the horizontal plasma displacement has been suppressed to within 1 cm from the initiation of the discharge to its termination in the high-density, low-q(a) plasma of 15 cm radius obtained by both strong gas puffing and a second current rise. (author)

  15. Control of horizontal plasma position by feedforward-feedback system with digital computer in the JIPP T-II tokamak

    International Nuclear Information System (INIS)

    Toi, K.; Itoh, S.; Sakurai, K.; Matsuura, K.; Tanahashi, S.

    1980-02-01

    In the resistive shell tokamak JIPP T-II, control of the horizontal plasma position is successfully carried out by solving, every 1.39 msec with a digital computer, the equilibrium equation of a large-aspect-ratio tokamak plasma surrounded by a thin resistive shell with a skin time of 5.2 msec. The iron core effect is also taken into account by a simple term in the equation. The required strength of the vertical field is determined by a control demand composed of two parts: a ''feedback'' term given by the deviation of the plasma position from the desired one with proportional-integral-derivative correction (PID controller), and a ''feedforward'' term proportional to the plasma current. The experimental results are in good agreement with the stability analysis of the control system using the so-called Bode diagram. By this control system, the horizontal displacement has been suppressed to within 1 cm from the initiation of the discharge to its termination in the high-density, low-q(a) plasma of 15 cm radius obtained by both strong gas puffing and a second current rise. (author)
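    The control law described in these two records, a feedforward term proportional to the plasma current plus PID feedback on the position error, has the following generic shape (a toy first-order plant with hand-picked gains; only the 1.39 ms control period is taken from the abstract):

```python
dt = 1.39e-3                        # control period from the abstract (s)
kp, ki, kd = 2.0, 40.0, 0.005       # illustrative PID gains
k_ff = 0.5                          # feedforward gain on plasma current
i_plasma = 1.0                      # normalised plasma current (toy value)

pos, integral, prev_err = 0.02, 0.0, 0.0   # start 2 cm off the target position
for _ in range(2000):
    err = pos - 0.0                 # outward displacement -> positive error
    integral += err * dt
    derivative = (err - prev_err) / dt
    prev_err = err
    # Vertical-field demand = feedforward(I_p) + PID(position error).
    b_v = k_ff * i_plasma + kp * err + ki * integral + kd * derivative
    # Toy first-order plant: constant outward drift, field pulls inward,
    # weak restoring term; none of these coefficients are JIPP T-II values.
    pos += dt * (0.5 - 1.0 * b_v - 2.0 * pos)

print(f"final displacement ~ {pos * 100:.2f} cm")
```

    Here the feedforward term cancels the steady drift exactly, so the PID part only has to remove the initial transient, which is the division of labour the abstract describes.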

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  18. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions for, the main challenges facing CMS computing. The lack of manpower is particul...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  1. The effects of FreeSurfer version, workstation type, and Macintosh operating system version on anatomical volume and cortical thickness measurements.

    Science.gov (United States)

    Gronenschild, Ed H B M; Habets, Petra; Jacobs, Heidi I L; Mengelers, Ron; Rozendaal, Nico; van Os, Jim; Marcelis, Machteld

    2012-01-01

    FreeSurfer is a popular software package for measuring the cortical thickness and volume of neuroanatomical structures. However, little if anything is known about measurement reliability across various data processing conditions. Using a set of 30 anatomical T1-weighted 3T MRI scans, we investigated the effects of data processing variables such as FreeSurfer version (v4.3.1, v4.5.0, and v5.0.0), workstation (Macintosh and Hewlett-Packard), and Macintosh operating system version (OSX 10.5 and OSX 10.6). Significant differences were revealed between FreeSurfer version v5.0.0 and the two earlier versions. These differences were on average 8.8 ± 6.6% (range 1.3-64.0%) for volume and 2.8 ± 1.3% (1.1-7.7%) for cortical thickness. Differences roughly a factor of two smaller were detected between Macintosh and Hewlett-Packard workstations and between OSX 10.5 and OSX 10.6. The observed differences are similar in magnitude to effect sizes reported in accuracy evaluations and neurodegenerative studies. The main conclusion is that, in the context of an ongoing study, users are discouraged from updating to a new major release of either FreeSurfer or the operating system, or from switching to a different type of workstation, without repeating the analysis; these results thus give quantitative support to successive recommendations stated by the FreeSurfer developers over the years. Moreover, in view of the large and significant cross-version differences, formal assessment of the accuracy of FreeSurfer is desirable.

  2. The effects of FreeSurfer version, workstation type, and Macintosh operating system version on anatomical volume and cortical thickness measurements.

    Directory of Open Access Journals (Sweden)

    Ed H B M Gronenschild

    FreeSurfer is a popular software package for measuring the cortical thickness and volume of neuroanatomical structures. However, little if anything is known about measurement reliability across various data processing conditions. Using a set of 30 anatomical T1-weighted 3T MRI scans, we investigated the effects of data processing variables such as FreeSurfer version (v4.3.1, v4.5.0, and v5.0.0), workstation (Macintosh and Hewlett-Packard), and Macintosh operating system version (OSX 10.5 and OSX 10.6). Significant differences were revealed between FreeSurfer version v5.0.0 and the two earlier versions. These differences were on average 8.8 ± 6.6% (range 1.3-64.0%) for volume and 2.8 ± 1.3% (1.1-7.7%) for cortical thickness. Differences roughly a factor of two smaller were detected between Macintosh and Hewlett-Packard workstations and between OSX 10.5 and OSX 10.6. The observed differences are similar in magnitude to effect sizes reported in accuracy evaluations and neurodegenerative studies. The main conclusion is that, in the context of an ongoing study, users are discouraged from updating to a new major release of either FreeSurfer or the operating system, or from switching to a different type of workstation, without repeating the analysis; these results thus give quantitative support to successive recommendations stated by the FreeSurfer developers over the years. Moreover, in view of the large and significant cross-version differences, formal assessment of the accuracy of FreeSurfer is desirable.

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  4. COMPUTING

    CERN Multimedia

    M. Kasemann and P. McBride; edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick and K. Lassila-Perini

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  5. User's instructions for ORCENT II: a digital computer program for the analysis of steam turbine cycles supplied by light-water-cooled reactors

    International Nuclear Information System (INIS)

    Fuller, L.C.

    1979-02-01

    The ORCENT-II digital computer program will perform calculations at valves-wide-open design conditions, maximum guaranteed rating conditions, and an approximation of part-load conditions for steam turbine cycles supplied with throttle steam characteristic of contemporary light-water reactors. Turbine performance calculations are based on a method published by the General Electric Company. Output includes all information normally shown on a turbine-cycle heat balance diagram. The program is written in FORTRAN IV for the IBM System 360 digital computers at the Oak Ridge National Laboratory

  6. Numerical analysis of resonances induced by s wave neutrons in transmission time-of-flight experiments with a computer IBM 7094 II

    International Nuclear Information System (INIS)

    Corge, Ch.

    1969-01-01

    Numerical analysis of transmission resonances induced by s-wave neutrons in time-of-flight experiments can be achieved in a fairly automatic way on an IBM 7094/II computer. The computations are carried out following a four-step scheme: 1 - experimental raw data are processed to obtain the resonant transmissions; 2 - values of the experimental quantities for each resonance are derived from the above transmissions; 3 - resonance parameters are determined using a least-squares method to solve the overdetermined system obtained by equating theoretical functions to the corresponding experimental values (four analysis methods are gathered in the same code); 4 - graphical control of the results is performed. (author) [fr
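    Step 3 of the scheme, least-squares extraction of resonance parameters from the measured transmission, can be illustrated with a modern equivalent (synthetic data and a bare single-level Breit-Wigner shape, with no Doppler broadening or resolution function, both of which a real analysis must handle):

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(4)

def transmission(E, E0, gamma, sigma0, n_thick=0.02):
    """T(E) = exp(-n * sigma(E)) with a single-level Breit-Wigner cross section."""
    sigma = sigma0 * (gamma / 2.0) ** 2 / ((E - E0) ** 2 + (gamma / 2.0) ** 2)
    return np.exp(-n_thick * sigma)

# Synthetic time-of-flight transmission data around a 10 eV resonance.
E = np.linspace(8.0, 12.0, 200)
data = transmission(E, 10.0, 0.4, 100.0) + rng.normal(0.0, 0.01, E.size)

# Least-squares solution of the overdetermined system (cf. step 3):
# many energy points constrain three unknown resonance parameters.
popt, pcov = curve_fit(transmission, E, data, p0=(9.8, 0.5, 80.0))
E0_fit, gamma_fit, sigma0_fit = popt
print(f"E0 = {E0_fit:.3f} eV, Gamma = {gamma_fit:.3f} eV, "
      f"sigma0 = {sigma0_fit:.1f} b")
```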

  7. User's instructions for ORCENT II: a digital computer program for the analysis of steam turbine cycles supplied by light-water-cooled reactors

    Energy Technology Data Exchange (ETDEWEB)

    Fuller, L.C.

    1979-02-01

    The ORCENT-II digital computer program will perform calculations at valves-wide-open design conditions, maximum guaranteed rating conditions, and an approximation of part-load conditions for steam turbine cycles supplied with throttle steam characteristic of contemporary light-water reactors. Turbine performance calculations are based on a method published by the General Electric Company. Output includes all information normally shown on a turbine-cycle heat balance diagram. The program is written in FORTRAN IV for the IBM System 360 digital computers at the Oak Ridge National Laboratory.

  8. Assignment of solid-state 13C and 1H NMR spectra of paramagnetic Ni(II) acetylacetonate complexes aided by first-principles computations

    DEFF Research Database (Denmark)

    Rouf, Syed Awais; Jakobsen, Vibe Boel; Mareš, Jiří

    2017-01-01

    Recent advances in computational methodology have allowed first-principles calculations of the nuclear shielding tensor for a series of paramagnetic nickel(II) acetylacetonate complexes, [Ni(acac)2L2] with L = H2O, D2O, NH3, ND3, and PMe2Ph, and have provided detailed insight into the origin of the par...

  9. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise, and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real data reconstruction and re-reconstructions, and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps needed to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenge (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier-0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier-0, processing a massive number of very large files with a high writing speed to tapes. Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier-1s: response time, data transfer rate and success rate for tape-to-buffer staging of files kept exclusively on tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. This successful campaign prepared the ground for the second phase of CCRC’08, in May. The Computing Software and Analysis challen...

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data-taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and to make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested in remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid-year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since the last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for site readiness. It also reviewed the policy under which Tier-2s are associated with Physics Groups. Such associations are decided twice per ye...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  16. Is computer aided detection (CAD) cost effective in screening mammography? A model based on the CADET II study

    Science.gov (United States)

    2011-01-01

    Background Single reading with computer aided detection (CAD) is an alternative to double reading for detecting cancer in screening mammograms. The aim of this study is to investigate whether the use of a single reader with CAD is more cost-effective than double reading. Methods Based on data from the CADET II study, the cost-effectiveness of single reading with CAD versus double reading was measured in terms of cost per cancer detected. The cost (Pound (£), year 2007/08) of single reading with CAD versus double reading was estimated assuming a health and social service perspective and a 7-year time horizon. As the equipment cost varies according to the unit size, a separate analysis was conducted for high, average and low volume screening units. One-way sensitivity analyses were performed by varying the reading time, equipment and assessment cost, recall rate and reader qualification. Results CAD is cost-increasing for all sizes of screening unit: the cost of CAD equipment, staff training and the higher assessment cost associated with CAD is greater than the saving in reading costs. The introduction of single reading with CAD, in place of double reading, would produce an additional cost of £227 and £253 per 1,000 women screened in high and average volume units respectively. In low volume screening units, the high cost of purchasing the equipment results in an additional cost of £590 per 1,000 women screened. One-way sensitivity analysis showed that the factors having the greatest effect on the cost-effectiveness of CAD with single reading compared with double reading were the reading time and the reader's professional qualification (radiologist versus advanced practitioner). Conclusions Without improvements in CAD effectiveness (e.g. a decrease in the recall rate) CAD is unlikely to be a cost-effective alternative to double reading for mammography screening in the UK. This study
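
    The figures above follow from simple incremental-cost arithmetic: the extra cost of CAD (equipment, training, higher assessment cost) minus the saving in reading costs. A minimal sketch of that bookkeeping follows; all monetary values are hypothetical placeholders chosen only to reproduce the reported order of magnitude, not the actual CADET II model inputs, which vary with unit volume.

      # Sketch of the incremental-cost arithmetic described above.
      # All monetary values are hypothetical placeholders.

      def incremental_cost_per_1000(cad_equipment, cad_training,
                                    extra_assessment, reading_saving):
          """Additional cost (GBP) per 1,000 women screened for single
          reading with CAD versus double reading."""
          return cad_equipment + cad_training + extra_assessment - reading_saving

      # Example: an average-volume unit (placeholder numbers chosen to
      # reproduce the reported +253 GBP per 1,000 women screened).
      print(incremental_cost_per_1000(cad_equipment=300, cad_training=50,
                                      extra_assessment=150, reading_saving=247))
      # -> 253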

  17. Evaluation of intubation using the Airtraq or Macintosh laryngoscope by anaesthetists in easy and simulated difficult laryngoscopy--a manikin study.

    LENUS (Irish Health Repository)

    Maharaj, C H

    2006-05-01

    The Airtraq Laryngoscope is a novel intubation device which allows visualisation of the vocal cords without alignment of the oral, pharyngeal and tracheal axes. We compared the Airtraq with the Macintosh laryngoscope in simulated easy and difficult laryngoscopy. Twenty-five anaesthetists were allowed up to three attempts to intubate the trachea in each of three laryngoscopy scenarios using a Laerdal Intubation Trainer followed by five scenarios using a Laerdal SimMan Manikin. Each anaesthetist then performed tracheal intubation of the normal airway a second time to characterise the learning curve. In the simulated easy laryngoscopy scenarios, there was no difference between the Airtraq and the Macintosh in success of tracheal intubation. The time taken to intubate at the end of the protocol was significantly lower using the Airtraq (9.5 (6.7) vs. 14.2 (7.4) s), demonstrating a rapid acquisition of skills. In the simulated difficult laryngoscopy scenarios, the Airtraq was more successful in achieving tracheal intubation, required less time to intubate successfully, caused less dental trauma, and was considered by the anaesthetists to be easier to use.

  18. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  19. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions The Heavy Ions group has been actively analysing data and preparing for conferences. Operations Office Figure 6: Transfers from all sites in the last 90 days. For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  20. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08 (CCRC’08), which is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  1. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier-1 and Tier-2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug-tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  2. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format, and running the samples through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites. Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month. Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week, with peaks close to 1.2 PB. Figure 3: The volume of data moved between CMS sites in the last six months. The tape utilisation was a focus for the operation teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  5. New fluorescent azo-Schiff base Cu(II) and Zn(II) metal chelates; spectral, structural, electrochemical, photoluminescence and computational studies

    Science.gov (United States)

    Purtas, Fatih; Sayin, Koray; Ceyhan, Gokhan; Kose, Muhammet; Kurtoglu, Mukerrem

    2017-06-01

    A new Schiff base containing an azo chromophore group, obtained by condensation of 2-hydroxy-4-[(E)-phenyldiazenyl]benzaldehyde with 3,4-dimethylaniline (HL), was used for the syntheses of new copper(II) and zinc(II) chelates, [Cu(L)2] and [Zn(L)2], which were characterized by physico-chemical and spectroscopic methods such as 1H and 13C NMR, IR, UV-Vis and elemental analyses. The solid-state structure of the ligand was characterized by a single-crystal X-ray diffraction study. The X-ray diffraction data were then used to calculate the harmonic oscillator model of aromaticity (HOMA) indexes for the rings, so as to investigate the enol-imine and keto-amine tautomeric forms in the solid state. The phenol ring C10-C15 shows a considerable deviation from aromaticity, with a HOMA value of 0.837, suggesting a shift towards the keto-amine tautomeric form in the solid state. The analytical data show that the metal-to-ligand ratio in the chelates is 1:2. Theoretical calculations of the possible isomers of the ligand and the two metal complexes were performed using the B3LYP method. Electrochemical and photoluminescence properties of the synthesized azo-Schiff bases were also investigated.
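
    The HOMA index quoted above is computed directly from ring bond lengths; a minimal sketch under the standard Krygowski parameterisation for C-C bonds follows (the bond lengths are illustrative, not the crystallographic values from this study).

      # HOMA = 1 - (alpha/n) * sum((R_opt - R_i)^2) over the n ring bonds.
      ALPHA_CC = 257.7   # normalisation constant for C-C bonds (Krygowski)
      R_OPT_CC = 1.388   # optimal aromatic C-C bond length, Angstrom

      def homa(bond_lengths, alpha=ALPHA_CC, r_opt=R_OPT_CC):
          n = len(bond_lengths)
          return 1.0 - (alpha / n) * sum((r_opt - r) ** 2 for r in bond_lengths)

      # Illustrative six-membered ring; HOMA approaches 1 for a perfectly
      # aromatic ring and drops as bond-length alternation grows.
      print(round(homa([1.39, 1.40, 1.38, 1.41, 1.37, 1.40]), 3))   # ~0.95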

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. A GlideInWMS installation and its components are now deployed at CERN, in addition to the GlideInWMS factory in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  7. Comparison of interradicular distances and cortical bone thickness in Thai patients with class I and class II skeletal patterns using cone-beam computed tomography

    International Nuclear Information System (INIS)

    Khumsarn, Nattida; Patanaporn, Virush; Janhom, Apirum; Jotikasthira, Dhirawat

    2016-01-01

    This study evaluated and compared interradicular distances and cortical bone thickness in Thai patients with Class I and Class II skeletal patterns, using cone-beam computed tomography (CBCT). Pretreatment CBCT images of 24 Thai orthodontic patients with Class I and Class II skeletal patterns were included in the study. Three measurements were chosen for investigation: the mesiodistal distance between the roots, the width of the buccolingual alveolar process, and buccal cortical bone thickness. All distances were recorded at five different levels from the cementoenamel junction (CEJ). Descriptive statistical analysis and t-tests were performed, with the significance level for all tests set at p<0.05. Patients with a Class II skeletal pattern showed significantly greater maxillary mesiodistal distances (between the first and second premolars) and widths of the buccolingual alveolar process (between the first and second molars) than Class I skeletal pattern patients at 10 mm above the CEJ. The maxillary buccal cortical bone thicknesses between the second premolar and first molar at 8 mm above the CEJ in Class II patients were likewise significantly greater than in Class I patients. Patients with a Class I skeletal pattern showed significantly wider mandibular buccolingual alveolar processes than did Class II patients (between the first and second molars) at 4, 6, and 8 mm below the CEJ. In both the maxilla and mandible, the mesiodistal distances, the width of the buccolingual alveolar process, and buccal cortical bone thickness tended to increase from the CEJ to the apex in both Class I and Class II skeletal patterns

  8. Comparison of interradicular distances and cortical bone thickness in Thai patients with class I and class II skeletal patterns using cone-beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Khumsarn, Nattida [Dental Division of Lamphun Hospital, Lamphun (Thailand); Patanaporn, Virush; Janhom, Apirum; Jotikasthira, Dhirawat [Faculty of Dentistry, Chiang Mai University, Chiang Mai (Thailand)

    2016-06-15

    This study evaluated and compared interradicular distances and cortical bone thickness in Thai patients with Class I and Class II skeletal patterns, using cone-beam computed tomography (CBCT). Pretreatment CBCT images of 24 Thai orthodontic patients with Class I and Class II skeletal patterns were included in the study. Three measurements were chosen for investigation: the mesiodistal distance between the roots, the width of the buccolingual alveolar process, and buccal cortical bone thickness. All distances were recorded at five different levels from the cementoenamel junction (CEJ). Descriptive statistical analysis and t-tests were performed, with the significance level for all tests set at p<0.05. Patients with a Class II skeletal pattern showed significantly greater maxillary mesiodistal distances (between the first and second premolars) and widths of the buccolingual alveolar process (between the first and second molars) than Class I skeletal pattern patients at 10 mm above the CEJ. The maxillary buccal cortical bone thicknesses between the second premolar and first molar at 8 mm above the CEJ in Class II patients were likewise significantly greater than in Class I patients. Patients with a Class I skeletal pattern showed significantly wider mandibular buccolingual alveolar processes than did Class II patients (between the first and second molars) at 4, 6, and 8 mm below the CEJ. In both the maxilla and mandible, the mesiodistal distances, the width of the buccolingual alveolar process, and buccal cortical bone thickness tended to increase from the CEJ to the apex in both Class I and Class II skeletal patterns.
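
    The comparison in both records above is an independent-samples t-test on each distance, at a 0.05 significance level; a minimal sketch follows (the measurement values are invented for illustration, not the study data).

      # Independent-samples t-test comparing a CBCT distance between the
      # Class I and Class II groups, as in the study design (alpha = 0.05).
      # The measurement values below are invented for illustration.
      from scipy import stats

      class_i  = [3.1, 2.9, 3.4, 3.0, 3.2, 2.8]   # mesiodistal distance, mm
      class_ii = [3.6, 3.8, 3.5, 3.9, 3.4, 3.7]

      t, p = stats.ttest_ind(class_i, class_ii)
      print(f"t = {t:.2f}, p = {p:.4f}, significant: {p < 0.05}")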

  9. [Use of personal computers by diplomates of anesthesiology in Japan].

    Science.gov (United States)

    Yamamoto, K; Ohmura, S; Tsubokawa, T; Kita, M; Kushida, Y; Kobayashi, T

    1999-04-01

    The use of personal computers by diplomates of the Japanese Board of Anesthesiology working in Japanese university hospitals was investigated. Unsigned questionnaires were returned by 232 diplomates from 18 anesthesia departments. The ages of responders ranged from the twenties to the sixties. Personal computer systems are used by 223 diplomates (96.1%), while nine (3.9%) do not use them. The computer systems used are: Apple Macintosh 77%, IBM-compatible PC 21% and UNIX 2%. Although 197 diplomates have e-mail addresses, only 162 of them actually send and receive e-mails. Diplomates in their fifties use e-mail most actively, and those in their sixties come second.

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning, increasing the number of sites that are available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  11. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Argonne National Lab. (ANL), Argonne, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); LeCompte, Tom [Argonne National Lab. (ANL), Argonne, IL (United States); Marshall, Zach [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Borgland, Anders [SLAC National Accelerator Lab., Menlo Park, CA (United States); Viren, Brett [Brookhaven National Lab. (BNL), Upton, NY (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Asai, Makoto [SLAC National Accelerator Lab., Menlo Park, CA (United States); Bauerdick, Lothar [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Finkel, Hal [Argonne National Lab. (ANL), Argonne, IL (United States); Gottlieb, Steve [Indiana Univ., Bloomington, IN (United States); Hoeche, Stefan [SLAC National Accelerator Lab., Menlo Park, CA (United States); Sheldon, Paul [Vanderbilt Univ., Nashville, TN (United States); Vay, Jean-Luc [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Elmer, Peter [Princeton Univ., NJ (United States); Kirby, Michael [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Patton, Simon [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Potekhin, Maxim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yanny, Brian [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Calafiura, Paolo [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dart, Eli [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gutsche, Oliver [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Izubuchi, Taku [Brookhaven National Lab. (BNL), Upton, NY (United States); Lyon, Adam [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Petravick, Don [Univ. of Illinois, Urbana-Champaign, IL (United States). National Center for Supercomputing Applications (NCSA)

    2015-10-29

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  12. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2015-10-28

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  13. INREM II: a computer implementation of recent models for estimating the dose equivalent to organs of man from an inhaled or ingested radionuclide

    International Nuclear Information System (INIS)

    Killough, G.G.; Dunning, D.E. Jr.; Pleasant, J.C.

    1978-01-01

    This report describes a computer code, INREM II, which calculates the internal radiation dose equivalent to organs of man resulting from the intake of a radionuclide by inhalation or ingestion. Deposition and removal of radioactivity from the respiratory tract are represented by the ICRP Task Group Lung Model. A four-segment catenary model of the GI tract is used to estimate movement of radioactive material that is ingested, or swallowed after being cleared from the respiratory tract. Retention of radioactivity in other organs is specified by linear combinations of decaying exponential functions. The formation and decay of radioactive daughters are treated explicitly, with each radionuclide species in the chain having its own uptake and retention parameters, as supplied by the user. The dose equivalent to a target organ is computed as the sum of contributions from each source organ in which radioactivity is assumed to be situated. This calculation utilizes a matrix of S-factors (rem/μCi-day) supplied by the user for the particular choice of source and target organs. Output permits the evaluation of crossfire components of the dose when penetrating radiations are present. INREM II is coded in FORTRAN IV and has been compiled and executed on an IBM-360 computer.
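
    The dose bookkeeping described above reduces to a weighted sum over source organs; a minimal sketch follows. The organ names, S-factors, and integrated activities are hypothetical stand-ins, since INREM II takes these as user-supplied input.

      # Dose equivalent to a target organ = sum over source organs of
      # S(target <- source) [rem per uCi-day] times the time-integrated
      # activity residing in that source organ [uCi-day].
      # All numbers below are hypothetical user-style inputs.

      s_factor = {                      # S(target <- source), rem/uCi-day
          ("lungs", "lungs"): 1.2e-4,
          ("lungs", "liver"): 3.0e-6,
          ("liver", "lungs"): 2.8e-6,
          ("liver", "liver"): 9.5e-5,
      }
      integrated_activity = {"lungs": 40.0, "liver": 12.0}   # uCi-day

      def dose_equivalent(target):
          return sum(s_factor[(target, source)] * u
                     for source, u in integrated_activity.items())

      for organ in ("lungs", "liver"):
          print(organ, dose_equivalent(organ), "rem")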

  14. [Morphological analysis of alveolar bone of anterior mandible in high-angle skeletal class II and class III malocclusions assessed with cone-beam computed tomography].

    Science.gov (United States)

    Ma, J; Jiang, J H

    2018-02-18

    To evaluate the difference in features of alveolar bone support under the lower anterior teeth between high-angle adults with skeletal class II malocclusions and high-angle adults presenting skeletal class III malocclusions, by using cone-beam computed tomography (CBCT). Patients who had taken CBCT images were selected from the Peking University School and Hospital of Stomatology between October 2015 and August 2017. The CBCT archives from 62 high-angle adult cases without orthodontic treatment were divided into two groups based on their sagittal jaw relationships: skeletal class II and skeletal class III. Vertical bone level (VBL), alveolar bone area (ABA), and the width of alveolar bone were measured respectively at 2 mm, 4 mm and 6 mm below the cemento-enamel junction (CEJ) level and at the apical level. After that, independent-samples t-tests were conducted for statistical comparisons. The ABA of the mandibular alveolar bone in the area of the lower anterior teeth was significantly smaller in the patients of skeletal class III than in those of skeletal class II, especially in terms of the apical ABA, the total ABA on the labial and lingual sides, and the ABA at 6 mm below the CEJ level on the lingual side (P<0.05). The width of the alveolar bone was likewise significantly smaller in the patients of skeletal class III than in those of skeletal class II, especially regarding the apical level on the labial and lingual sides and the levels of 4 mm and 6 mm below the CEJ on the lingual side (P<0.05). Overall, less alveolar bone support was found in the high-angle skeletal class III adult patients when compared with the sample of high-angle skeletal class II adult cases. We recommend orthodontists to be more cautious in the treatment of high-angle skeletal class III patients, and especially to pay attention to controlling the torque of the lower anterior teeth during forward and backward movement, in case the apical root might be resorbed or fenestration happen in the area of the lower anterior teeth.

  15. Effects of periodic boundary conditions on equilibrium properties of computer simulated fluids. II. Application to simple liquids

    International Nuclear Information System (INIS)

    Pratt, L.R.; Haan, S.W.

    1981-01-01

    The theory of the previous paper is used to predict anomalous size effects observed for computer-simulated liquid Ar. The theoretical results for the boundary-condition-induced anisotropy of two-particle correlations are found to be large, and in excellent agreement with the computer experimental data of Mandell for densities near the Ar triple-point density. The agreement is less good at higher densities.
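
    For context, pair separations in a periodic cubic simulation cell are conventionally formed with the minimum-image convention; the generic sketch below only illustrates the boundary-condition bookkeeping at issue and is not the theory of the paper.

      # Minimum-image separation of two particles in a cubic box of side L:
      # each Cartesian component is wrapped into [-L/2, L/2) before the
      # distance is formed.
      import math

      def minimum_image_distance(r1, r2, box):
          d = 0.0
          for a, b in zip(r1, r2):
              dx = b - a
              dx -= box * round(dx / box)   # wrap to the nearest image
              d += dx * dx
          return math.sqrt(d)

      print(minimum_image_distance((0.5, 0.5, 0.5), (9.8, 0.5, 0.5), box=10.0))
      # -> 0.7, not 9.3: the nearest periodic image is used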

  16. Alternative intubation techniques vs Macintosh laryngoscopy in patients with cervical spine immobilization: systematic review and meta-analysis of randomized controlled trials

    Science.gov (United States)

    Suppan, L.; Tramèr, M. R.; Niquille, M.; Grosgurin, O.; Marti, C.

    2016-01-01

    Background. Immobilization of the cervical spine worsens tracheal intubation conditions. Various intubation devices have been tested in this setting. Their relative usefulness remains unclear. Methods. We searched MEDLINE, EMBASE, and the Cochrane Library for randomized controlled trials comparing any intubation device with the Macintosh laryngoscope in human subjects with cervical spine immobilization. The primary outcome was the risk of tracheal intubation failure at the first attempt. Secondary outcomes were quality of glottis visualization, time until successful intubation, and risk of oropharyngeal complications. Results. Twenty-four trials (1866 patients) met inclusion criteria. With alternative intubation devices, the risk of intubation failure was lower compared with Macintosh laryngoscopy [risk ratio (RR) 0.53; 95% confidence interval (CI) 0.35–0.80]. Meta-analyses could be performed for five intubation devices (Airtraq, Airwayscope, C-Mac, Glidescope, and McGrath). The Airtraq was associated with a statistically significant reduction of the risk of intubation failure at the first attempt (RR 0.14; 95% CI 0.06–0.33), a higher rate of Cormack–Lehane grade 1 (RR 2.98; 95% CI 1.94–4.56), a reduction of time until successful intubation (weighted mean difference −10.1 s; 95% CI −3.2 to −17.0), and a reduction of oropharyngeal complications (RR 0.24; 95% CI 0.06–0.93). Other devices were associated with improved glottis visualization but no statistically significant differences in intubation failure or time to intubation compared with conventional laryngoscopy. Conclusions. In situations where the spine is immobilized, the Airtraq device reduces the risk of intubation failure. There is a lack of evidence for the usefulness of other intubation devices. PMID:26133898
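
    The pooled effect measures quoted above are risk ratios with 95% confidence intervals computed on the log scale; a minimal sketch of that arithmetic for a single two-arm comparison follows (the event counts are invented for illustration, not trial data).

      # Risk ratio and 95% CI for intubation failure, device vs Macintosh.
      # Counts are invented for illustration.
      import math

      def risk_ratio_ci(events_a, n_a, events_b, n_b, z=1.96):
          rr = (events_a / n_a) / (events_b / n_b)
          se = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
          lo = math.exp(math.log(rr) - z * se)
          hi = math.exp(math.log(rr) + z * se)
          return rr, lo, hi

      rr, lo, hi = risk_ratio_ci(events_a=6, n_a=100, events_b=24, n_b=100)
      print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")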

  17. Computational modeling for irrigated agriculture planning. Part II: risk analysis

    Directory of Open Access Journals (Sweden)

    João C. F. Borges Júnior

    2008-09-01

    Techniques for evaluating the risks arising from the uncertainties inherent to agricultural activity should accompany planning studies. Risk analysis can be carried out by simulation, using techniques such as the Monte Carlo method. This study was carried out to develop a computer program, called P-RISCO, for applying risk simulations to linear programming models, to apply it to a case study, and to test the results against the @RISK program. In the risk analysis it was observed that the mean of the output variable total net present value, U, was considerably lower than the maximum U value obtained from the linear programming model. It was also verified that the enterprise faces a considerable risk of water shortage in the month of April, which does not happen for the cropping pattern obtained by minimizing the irrigation requirement in the months of April in the four years. The scenario analysis indicated that the sale price of the passion fruit crop has a strong influence on the financial performance of the enterprise. The comparative analysis verified the equivalence of the P-RISCO and @RISK programs in executing the risk simulation for the considered scenario.
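
    The risk simulation described (Monte Carlo sampling of uncertain inputs, yielding a distribution for the total net present value U) can be sketched as follows; the toy NPV function and the input distributions are placeholders, not the P-RISCO model.

      # Monte Carlo risk simulation in the spirit of the approach above:
      # sample uncertain inputs, evaluate the model, and summarise the
      # distribution of the output U (total NPV). All inputs are toys.
      import random
      import statistics

      def toy_npv(price, yield_t, cost, rate=0.10, years=4):
          annual = price * yield_t - cost
          return sum(annual / (1 + rate) ** t for t in range(1, years + 1))

      random.seed(1)
      samples = [
          toy_npv(price=random.triangular(0.8, 1.6, 1.1),   # sale price
                  yield_t=random.gauss(20.0, 2.0),          # yield
                  cost=random.gauss(14.0, 1.0))             # annual cost
          for _ in range(10_000)
      ]
      print("mean U:", round(statistics.mean(samples), 2))
      print("P(U < 0):", sum(s < 0 for s in samples) / len(samples))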

  18. Evaluating Children's Learning Disabilities with an Apple II Personal Computer or Tempting Poor Learners with an Apple.

    Science.gov (United States)

    Sisson, Lee Hansen; And Others

    This paper describes the use of commercially-available software for the Apple Computer to augment diagnostic evaluations of learning disabled children and to enhance "learning to learn" strategies at the application/transfer level of learning. A short rationale discusses levels of evaluation and learning, using a model that synthesizes the ideas…

  19. Computer-monitored radionuclide tracking of three-dimensional mandibular movements. Part II: experimental setup and preliminary results - Posselt diagram

    Energy Technology Data Exchange (ETDEWEB)

    Salomon, J.A.; Waysenson, B.D.; Warshaw, B.D.

    1979-04-01

    This article describes a new method to track mandibular movements using a computer-assisted radionuclide kinematics technique. The usefulness of various image-enhancement techniques is discussed, and the reproduction of physiologic displacements is shown. Vertical, lateral, and protrusive envelopes of motion of a point on a tooth of a complete denture mounted on a semiadjustable articulator were measured. A demonstrative example of the validity of this approach is the reproduction of the motion of the dental point, which clearly traces out the Posselt diagram.

  20. Conventional multi-slice computed tomography (CT) and cone-beam CT (CBCT) for computer-aided implant placement. Part II: reliability of mucosa-supported stereolithographic guides.

    Science.gov (United States)

    Arisan, Volkan; Karabuda, Zihni Cüneyt; Pişkin, Bülent; Özdemir, Tayfun

    2013-12-01

    Deviations of implants that were placed by conventional computed tomography (CT)- or cone beam CT (CBCT)-derived mucosa-supported stereolithographic (SLA) surgical guides were analyzed in this study. Eleven patients were randomly scanned by a multi-slice CT (CT group) or a CBCT scanner (CBCT group). A total of 108 implants were planned on the software and placed using SLA guides. A new CT or CBCT scan was obtained and merged with the planning data to identify the deviations between the planned and placed implants. Results were analyzed by Mann-Whitney U test and multiple regressions (p < .05). Mean angular and linear deviations in the CT group were 3.30° (SD 0.36), and 0.75 (SD 0.32) and 0.80 mm (SD 0.35) at the implant shoulder and tip, respectively. In the CBCT group, mean angular and linear deviations were 3.47° (SD 0.37), and 0.81 (SD 0.32) and 0.87 mm (SD 0.32) at the implant shoulder and tip, respectively. No statistically significant differences were detected between the CT and CBCT groups (p = .169 and p = .551, p = .113 for angular and linear deviations, respectively). Implant placement via CT- or CBCT-derived mucosa-supported SLA guides yielded similar deviation values. Results should be confirmed on alternative CBCT scanners. © 2012 Wiley Periodicals, Inc.
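
    The group comparison reported above is a Mann-Whitney U test on the deviation values; a minimal sketch follows (the deviation values are invented for illustration, not the study data).

      # Mann-Whitney U test comparing angular deviations (degrees) of
      # implants placed with CT- vs CBCT-derived guides; values invented.
      from scipy import stats

      ct_group   = [3.1, 3.5, 2.9, 3.4, 3.2, 3.6, 3.0]
      cbct_group = [3.4, 3.7, 3.3, 3.8, 3.5, 3.2, 3.6]

      u, p = stats.mannwhitneyu(ct_group, cbct_group, alternative="two-sided")
      print(f"U = {u}, p = {p:.3f}")   # p > .05 -> no significant difference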

  1. Computational Investigation of the Influence of Halogen Atoms on the Photophysical Properties of Tetraphenylporphyrin and Its Zinc(II) Complexes.

    Science.gov (United States)

    De Simone, Bruna C; Mazzone, Gloria; Russo, Nino; Sicilia, Emilia; Toscano, Marirosa

    2018-03-15

    How the photophysical properties (absorption energies, singlet-triplet energy gaps and spin-orbit coupling contributions) of tetraphenylporphyrin (TPP) and its zinc(II) complexes (ZnTPP) can change due to the presence of an increasing number of heavy atoms in their molecular structures has been investigated by means of density functional theory and its time-dependent formulation. Results show that increasing the atomic mass of the substituted halogen strongly enhances the spin-orbit coupling values, allowing a more efficient singlet-triplet intersystem crossing. Different deactivation channels have been considered and rationalized on the basis of the El-Sayed and Kasha rules. Most of the studied compounds possess the appropriate properties to generate cytotoxic singlet molecular oxygen (1Δg) and, consequently, can be proposed as photosensitizers in photodynamic therapy.

  2. Molecular modeling and computational simulation of the photosystem-II reaction center to address isoproturon resistance in Phalaris minor.

    Science.gov (United States)

    Singh, Durg Vijay; Agarwal, Shikha; Kesharwani, Rajesh Kumar; Misra, Krishna

    2012-08-01

    Isoproturon is the only herbicide that can control Phalaris minor, a competitive weed of wheat that developed resistance in 1992. Resistance against isoproturon was reported to be due to a mutation in the psbA gene, which encodes the isoproturon-binding D1 protein. Previously in our laboratory, a triazole derivative of isoproturon (TDI) was synthesized and found to be active against both susceptible and resistant biotypes at 0.5 kg/ha, but showed poor specificity. In the present study, the susceptible D1((S)), resistant D1((R)) and D2 proteins of the PS-II reaction center of P. minor have been modeled and simulated, selecting the crystal structure of PS-II from Thermosynechococcus elongatus (2AXT.pdb) as template. Loop regions were refined, and the complete D1/D2 reaction center was simulated with GROMACS in a lipid (1-palmitoyl-2-oleoylglycero-3-phosphoglycerol, POPG) environment along with ligands and cofactor. Both the S and R models were energy-minimized using steepest descent, equilibrated with isotropic pressure coupling and temperature coupling using a Berendsen protocol, and subjected to 1,000 ps of MD simulation. As a result of the MD simulation, the best model obtained in the lipid environment had five chlorophylls, two plastoquinones, two pheophytins and a bicarbonate ion, along with the cofactor Fe and the oxygen-evolving center (OEC). The triazole derivative of isoproturon was used as the lead molecule for docking. The best conformation of TDI was chosen for receptor-based de novo ligand design. The in silico designed molecules were screened and, as a result, only those molecules that show higher docking and binding energies in comparison to isoproturon and its triazole derivative were proposed for synthesis, in order to obtain more potent, non-resistant and more selective TDI analogs.

  3. The Computational Fluid Dynamics Rupture Challenge 2013--Phase II: Variability of Hemodynamic Simulations in Two Intracranial Aneurysms.

    Science.gov (United States)

    Berg, Philipp; Roloff, Christoph; Beuing, Oliver; Voss, Samuel; Sugiyama, Shin-Ichiro; Aristokleous, Nicolas; Anayiotos, Andreas S; Ashton, Neil; Revell, Alistair; Bressloff, Neil W; Brown, Alistair G; Chung, Bong Jae; Cebral, Juan R; Copelli, Gabriele; Fu, Wenyu; Qiao, Aike; Geers, Arjan J; Hodis, Simona; Dragomir-Daescu, Dan; Nordahl, Emily; Bora Suzen, Yildirim; Owais Khan, Muhammad; Valen-Sendstad, Kristian; Kono, Kenichi; Menon, Prahlad G; Albal, Priti G; Mierka, Otto; Münster, Raphael; Morales, Hernán G; Bonnefous, Odile; Osman, Jan; Goubergrits, Leonid; Pallares, Jordi; Cito, Salvatore; Passalacqua, Alberto; Piskin, Senol; Pekkan, Kerem; Ramalho, Susana; Marques, Nelson; Sanchi, Stéphane; Schumacher, Kristopher R; Sturgeon, Jess; Švihlová, Helena; Hron, Jaroslav; Usera, Gabriel; Mendina, Mariana; Xiang, Jianping; Meng, Hui; Steinman, David A; Janiga, Gábor

    2015-12-01

    With the increased availability of computational resources, the past decade has seen a rise in the use of computational fluid dynamics (CFD) for medical applications. There has been an increase in the application of CFD to attempt to predict the rupture of intracranial aneurysms; however, while many hemodynamic parameters can be obtained from these computations, to date no consistent methodology for the prediction of rupture has been identified. One particular challenge for CFD is that many factors contribute to its accuracy; the mesh resolution and the spatial/temporal discretization alone can introduce considerable variation. This failure to identify the importance of these factors and to identify a methodology for the prediction of ruptures has limited the acceptance of CFD among physicians for rupture prediction. The International CFD Rupture Challenge 2013 seeks to comment on the sensitivity of these various CFD assumptions in predicting rupture by undertaking a comparison of the rupture and blood-flow predictions from a wide range of independent participants utilizing a range of CFD approaches. Twenty-six groups from 15 countries took part in the challenge. Participants were provided with surface models of two intracranial aneurysms and asked to carry out the corresponding hemodynamics simulations, free to choose their own mesh, solver, and temporal discretization. They were requested to submit velocity and pressure predictions along the centerline and on specified planes. The first phase of the challenge, described in a separate paper, was aimed at predicting which of the two aneurysms had previously ruptured and where the rupture site was located. The second phase, described in this paper, aims to assess the variability of the solutions and the sensitivity to the modeling assumptions. Participants were free to choose boundary conditions in the first phase, whereas they were prescribed in the second phase, but all other CFD modeling parameters were not

  4. Computational design of new molecular scaffolds for medicinal chemistry, part II: generalization of analog series-based scaffolds

    Science.gov (United States)

    Dimova, Dilyana; Stumpfe, Dagmar; Bajorath, Jürgen

    2018-01-01

    Aim: Extending and generalizing the computational concept of analog series-based (ASB) scaffolds. Materials & methods: Methodological modifications were introduced to further increase the coverage of analog series (ASs) and compounds by ASB scaffolds. From bioactive compounds, ASs were systematically extracted and second-generation ASB scaffolds isolated. Results: More than 20,000 second-generation ASB scaffolds with single or multiple substitution sites were extracted from active compounds, achieving more than 90% coverage of ASs. Conclusion: Generalization of the ASB scaffold approach has yielded a large knowledge base of scaffold-capturing compound series and target information. PMID:29379641

  5. SMACS - a system of computer programs for probabilistic seismic analysis of structures and subsystems. Volume II. Example problem

    International Nuclear Information System (INIS)

    Maslenikov, O.R.; Johnson, J.J.; Tiong, L.W.; Mraz, M.J.; Bumpus, S.; Gerhard, M.A.

    1985-03-01

    In this volume of the SMACS User's Manual an example problem is presented to demonstrate the type of problem that SMACS is capable of solving and to familiarize the user with the format of the various data files involved. This volume is organized into thirteen appendices, which follow a short description of the problem. Each appendix contains listings of the input and output files associated with each computer run that was necessary to solve the problem. In cases where one SMACS program uses data generated by another SMACS program, the data file is shown in the appendix for the program which generated it.

  6. Computer vision system approach in colour measurements of foods: Part II. validation of methodology with real foods

    Directory of Open Access Journals (Sweden)

    Fatih TARLAK

    2016-01-01

    The colour of food is one of the most important factors affecting consumers' purchasing decisions. Although there are many colour spaces, the most widely used colour space in the food industry is the L*a*b* colour space. Conventionally, the colour of foods is analysed with a colorimeter that measures small and non-representative areas of the food, and the measurements usually vary depending on the point where the measurement is taken. This has led to the development of alternative colour analysis techniques. In this work, a simple alternative method to measure the colour of foods, known as a "computer vision system", is presented and justified. With the aid of the computer vision system, foods that are homogeneous and uniform in colour and shape can be classified with regard to their colours in a fast, inexpensive and simple way. This system can also be used to distinguish defectives from non-defectives. Quality parameters of meat and dairy products can be monitored without any physical contact, which can cause contamination during sampling.
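
    A colour measurement of the kind described reduces to converting image pixels to L*a*b* and averaging over the food region; a minimal sketch with scikit-image follows (the file name and background threshold are placeholders).

      # Average L*a*b* colour of a food region in an image, computer-vision
      # style. 'sample.png' and the background threshold are placeholders.
      import numpy as np
      from skimage import io, color

      rgb = io.imread("sample.png")[:, :, :3] / 255.0   # drop alpha if present
      lab = color.rgb2lab(rgb)                          # sRGB -> CIE L*a*b*

      # Crude foreground mask: keep pixels that are not near-white background.
      mask = rgb.sum(axis=2) < 2.7

      L, a, b = (lab[:, :, k][mask].mean() for k in range(3))
      print(f"L* = {L:.1f}, a* = {a:.1f}, b* = {b:.1f}")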

  7. Barrier-free proton transfer in the valence anion of 2'-deoxyadenosine-5'-monophosphate. II. A computational study

    Science.gov (United States)

    Kobyłecka, Monika; Gu, Jiande; Rak, Janusz; Leszczynski, Jerzy

    2008-01-01

    The propensity of four representative conformations of 2'-deoxyadenosine-5'-monophosphate (5'-dAMPH) to bind an excess electron has been studied at the B3LYP/6-31++G(d,p) level. While isolated canonical adenine does not support stable valence anions in the gas phase, all considered neutral conformations of 5'-dAMPH form adiabatically stable anions. The type of an anionic 5'-dAMPH state, i.e., valence, dipole-bound, or mixed (valence/dipole-bound), depends on the internal hydrogen bond(s) pattern exhibited by a particular tautomer. The most stable anion results from electron attachment to the neutral syn-south conformer. The formation of this anion is associated with a barrier-free proton transfer triggered by electron attachment and the internal rotation around the C4'-C5' bond. The adiabatic electron affinity of the south-syn anion is 1.19 eV, while its vertical detachment energy is 1.89 eV. Our results are compared with the photoelectron spectrum (PES) of 5'-dAMPH- measured recently by Stokes et al. [J. Chem. Phys. 128, 044314 (2008)]. The computational VDE obtained for the most stable anionic structure matches well the experimental electron-binding-energy region of maximum intensity. A further understanding of DNA damage might require experimental and computational studies on systems in which purine nucleotides are engaged in hydrogen bonding.

  8. Mapping the extent of disease by multislice computed tomography, magnetic resonance imaging and sentinel node evaluation in stage I and II cervical carcinoma

    Directory of Open Access Journals (Sweden)

    Rajaram S

    2010-01-01

    Aims: (1) To map the extent of disease in women with stage I and II carcinoma cervix by multislice spiral computed tomography (CT), magnetic resonance imaging (MRI) and sentinel nodes. (2) To assess the accuracy of each modality individually and in conjunction with FIGO clinical staging. Design and Setting: Prospective, single-blind study. Departments of Obstetrics and Gynaecology, Radiodiagnosis, and Pathology, UCMS and GTBH, and Division of Radiological Imaging and Bioinformatics, INMAS, Delhi. Material and Method: The study was conducted on 25 women with cervical cancer FIGO stage I and II. Each woman underwent clinical staging, multislice spiral CT and MRI, which were compared to the gold-standard histopathology/cytology. The overall accuracy of each modality and the improvement of clinical staging by CT/MRI were noted. Sentinel nodes were evaluated by intracervical Patent Blue V dye injection. Statistical Analysis: Sensitivity, specificity, and positive and negative predictive values were calculated from 2×2 contingency tables. Results: The accuracy of staging by FIGO, CT and MRI was 68%, 52% and 80%, respectively. MRI and CT improved the overall accuracy of FIGO staging to 96% and 80%, respectively. Sentinel nodes were identified in 89% of patients with 91% accuracy. Conclusion: MRI emerges as the most valuable stand-alone modality, improving the accuracy of FIGO staging to 96%. Sentinel lymph-node evaluation appears promising in evaluating spread beyond the cervix.

  9. Nonlinear optical and G-Quadruplex DNA stabilization properties of novel mixed ligand copper(II) complexes and coordination polymers: Synthesis, structural characterization and computational studies

    Science.gov (United States)

    Rajasekhar, Bathula; Bodavarapu, Navya; Sridevi, M.; Thamizhselvi, G.; RizhaNazar, K.; Padmanaban, R.; Swu, Toka

    2018-03-01

    The present study reports the synthesis and the evaluation of the nonlinear optical properties and G-quadruplex DNA stabilization of five novel copper(II) mixed-ligand complexes. They were synthesized from a copper(II) salt, 2,5- and 2,3-pyridinedicarboxylic acid, diethylenetriamine and an amide-based ligand (AL). The crystal structures of these complexes were determined through X-ray diffraction and supported by ESI-MS, NMR, UV-Vis and FT-IR spectroscopic methods. Their nonlinear optical property was studied using the Gaussian09 computer program. For structural optimization and the nonlinear optical property, the density functional theory (DFT) based B3LYP method was used with the LANL2DZ basis set for the metal ion and 6-31G* for the C, H, N, O and Cl atoms. The present work reveals that the pre-polarized complex-2 showed a higher β value (29.59 × 10^-30 e.s.u.) than the neutral complex-1 (β = 0.276 × 10^-30 e.s.u.), which may be due to the greater advantage of polarizability. Complex-2 is expected to be a potential material for optoelectronic and photonic technologies. Docking studies using AutoDock Vina revealed that complex-2 has higher binding energy for both G-quadruplex DNA (-8.7 kcal/mol) and duplex DNA (-10.1 kcal/mol). It was also observed that structure plays an important role in binding efficiency.

  10. Calibration and GEANT4 Simulations of the Phase II Proton Compute Tomography (pCT) Range Stack Detector

    Energy Technology Data Exchange (ETDEWEB)

    Uzunyan, S. A. [Northern Illinois Univ., DeKalb, IL (United States); Blazey, G. [Northern Illinois Univ., DeKalb, IL (United States); Boi, S. [Northern Illinois Univ., DeKalb, IL (United States); Coutrakon, G. [Northern Illinois Univ., DeKalb, IL (United States); Dyshkant, A. [Northern Illinois Univ., DeKalb, IL (United States); Francis, K. [Northern Illinois Univ., DeKalb, IL (United States); Hedin, D. [Northern Illinois Univ., DeKalb, IL (United States); Johnson, E. [Northern Illinois Univ., DeKalb, IL (United States); Kalnins, J. [Northern Illinois Univ., DeKalb, IL (United States); Zutshi, V. [Northern Illinois Univ., DeKalb, IL (United States); Ford, R. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Rauch, J. E. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Rubinov, P. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Sellberg, G. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Wilson, P. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Naimuddin, M. [Univ. of Delhi, New Delhi (India)

    2015-12-29

    Northern Illinois University, in collaboration with Fermi National Accelerator Laboratory (FNAL) and Delhi University, has been designing and building a proton CT scanner for applications in proton treatment planning. The Phase II proton CT scanner consists of eight planes of tracking detectors with two X and two Y coordinate measurements both before and after the patient. In addition, a range stack detector consisting of a stack of thin scintillator tiles, arranged in twelve eight-tile frames, is used to determine the water equivalent path length (WEPL) of each track through the patient. The X-Y coordinates and WEPL are required input for image reconstruction software to find the relative (proton) stopping power (RSP) value of each voxel in the patient and generate a corresponding 3D image. In this Note we describe tests conducted in 2015 at the proton beam at the Central DuPage Hospital in Warrenville, IL, focusing on the range stack calibration procedure and comparisons with the GEANT4 range stack simulation.

  11. Computational modeling of elastic properties of carbon nanotube/polymer composites with interphase regions. Part II: Mechanical modeling

    KAUST Repository

    Han, Fei

    2014-01-01

    We present two modeling approaches for predicting the macroscopic elastic properties of carbon nanotube/polymer composites with thick interphase regions at the nanotube/matrix interface. The first model is based on local continuum mechanics; the second one is based on hybrid local/non-local continuum mechanics. The key computational issues, including the particular homogenization technique and the treatment of periodic boundary conditions in the non-local continuum model, are clarified. Both models are implemented through a three-dimensional geometric representation of the carbon nanotube network, which has been detailed in Part I. Numerical results are shown and compared for both models in order to test convergence and sensitivity to input parameters. It is found that both approaches provide similar results in terms of homogenized quantities but locally can lead to very different microscopic fields. © 2013 Elsevier B.V. All rights reserved.
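
    For orientation, the local continuum homogenization described above reduces to the standard volume-average relations over a representative volume V; this is generic textbook notation, not necessarily the paper's exact formulation:

        \bar{\boldsymbol{\sigma}} = \frac{1}{|V|} \int_V \boldsymbol{\sigma}(\mathbf{x})\, \mathrm{d}V, \qquad
        \bar{\boldsymbol{\varepsilon}} = \frac{1}{|V|} \int_V \boldsymbol{\varepsilon}(\mathbf{x})\, \mathrm{d}V, \qquad
        \bar{\boldsymbol{\sigma}} = \mathbb{C}^{\mathrm{hom}} : \bar{\boldsymbol{\varepsilon}}

    where the homogenized stiffness tensor is identified column by column by solving the representative-volume problem under independent macroscopic strain states with periodic boundary conditions.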

  12. Development of a computer program for the simulation of ice-bank system operation, part II: Verification

    Energy Technology Data Exchange (ETDEWEB)

    Grozdek, Marino; Halasz, Boris; Curko, Tonko [University of Zagreb, Faculty of Mechanical Engineering and Naval Architecture, Ivana Lucica 5, 10 000 Zagreb (Croatia)

    2010-12-15

    In order to verify the mathematical model of an ice-bank system developed for the purpose of predicting system performance, experimental measurements on the ice-bank system were performed. A static, indirect, cool thermal storage system with external ice-on-coil building/melting was considered. Cooling energy stored in the form of ice during the night is used for the rapid cooling of milk after pasteurization during the day. The ice-bank system was tested under real operating conditions to determine parameters such as the time-varying heat load imposed by the consumer, the refrigeration unit load, the storage capacity, and the supply water temperature to the load, and to find the charging and discharging characteristics of the storage. The experimentally obtained results were then compared to the computed ones. It was found that the calculated and experimentally obtained results are in good agreement as long as there is ice present in the silo. (author)

  13. Multiyear interactive computer almanac, 1800-2050

    CERN Document Server

    United States. Naval Observatory

    2005-01-01

    The Multiyear Interactive Computer Almanac (MICA Version 2.2.2) is a software system created by the U.S. Naval Observatory's Astronomical Applications Department that runs on modern versions of Windows and Macintosh computers, designed especially for astronomers, surveyors, meteorologists, navigators and others who regularly need accurate information on the positions, motions, and phenomena of celestial objects. MICA produces high-precision astronomical data in tabular form, tailored for the times and locations specified by the user. Unlike traditional almanacs, MICA computes these data in real time, eliminating the need for table look-ups and additional hand calculations. MICA tables can be saved as standard text files, enabling their use in other applications. Several important new features have been added to this edition of MICA, including: extended date coverage from 1800 to 2050; a redesigned user interface; a graphical sky map; a phenomena calculator (eclipses, transits, equinoxes, solstices, conjunctions, oppo...

  14. Use of TOUGH2 on small computers

    Energy Technology Data Exchange (ETDEWEB)

    Antunez, E.; Pruess, K.; Moridis, G. [Lawrence Berkeley Laboratory, CA (United States)

    1995-03-01

    TOUGH2/PC has been tested extensively on different PC platforms (486-33, 486-66, Pentium-90), with encouraging results. TOUGH2 performance has also been tested on other 32-bit computers, such as the Macintosh Quadra 800 and an IBM RISC 6000 workstation. Results obtained with these machines are compared with PC performance. PC results for 3-D geothermal reservoir models are discussed, including: (a) a Cartesian model; and (b) a geothermal reservoir model with 1,411 irregular grid blocks. Also discussed are the results of the TOUGH2-compiler performance tests conducted on small computer systems. Code modifications required to operate on 32-bit computers and the setup in each machine environment are described. It is concluded that in today's market PCs provide the best price/performance alternative for conducting TOUGH2 numerical simulations.

  15. Melanie II--a third-generation software package for analysis of two-dimensional electrophoresis images: I. Features and user interface.

    Science.gov (United States)

    Appel, R D; Palagi, P M; Walther, D; Vargas, J R; Sanchez, J C; Ravier, F; Pasquali, C; Hochstrasser, D F

    1997-12-01

    Although two-dimensional electrophoresis (2-DE) computer analysis software packages have existed ever since 2-DE technology was developed, it is only now that the hardware and software technology allows large-scale studies to be performed on low-cost personal computers or workstations, and that setting up a 2-DE computer analysis system in a small laboratory is no longer considered a luxury. After a first attempt in the seventies and early eighties to develop 2-DE analysis software systems on hardware that had poor or even no graphical capabilities, followed in the late eighties by a wave of innovative software developments that were possible thanks to new graphical interface standards such as X Windows, a third generation of 2-DE analysis software packages has now come to maturity. It can be run on a variety of low-cost, general-purpose personal computers, thus making the purchase of a 2-DE analysis system easily attainable for even the smallest laboratory that is involved in proteome research. Melanie II 2-D PAGE, developed at the University Hospital of Geneva, is such a third-generation software system for 2-DE analysis. Based on unique image processing algorithms, this user-friendly object-oriented software package runs on multiple platforms, including Unix, MS-Windows 95 and NT, and Power Macintosh. It provides efficient spot detection and quantitation, state-of-the-art image comparison, statistical data analysis facilities, and is Internet-ready. Linked to proteome databases such as those available on the World Wide Web, it represents a valuable tool for the "Virtual Lab" of the post-genome era.

  16. Acid-base properties of the N3 ruthenium(II) solar cell sensitizer: a combined experimental and computational analysis.

    Science.gov (United States)

    Pizzoli, Giuliano; Lobello, Maria Grazia; Carlotti, Benedetta; Elisei, Fausto; Nazeeruddin, Mohammad K; Vitillaro, Giuseppe; De Angelis, Filippo

    2012-10-14

    We report a combined spectrophotometric and computational investigation of the acid-base equilibria of the N3 solar cell sensitizer [Ru(dcbpyH(2))(2)(NCS)(2)] (dcbpyH(2) = 4,4'-dicarboxyl-2,2'-bipyridine) in aqueous/ethanol solutions. The absorption spectra of N3 recorded at various pH values were analyzed by Singular Value Decomposition techniques, followed by Global Fitting procedures, allowing us to identify four separate acid-base equilibria and their corresponding ground state pK(a) values. DFT/TDDFT calculations were performed for the N3 dye in solution, investigating the possible relevant species obtained by sequential deprotonation of the four dye carboxylic groups. TDDFT excited state calculations provided UV-vis absorption spectra which agree nicely with the experimental spectral shapes at various pH values. The calculated pK(a) values are also in good agreement with experimental data, to within 1 pK(a) unit. Based on the calculated energy differences, a tentative assignment of the N3 deprotonation pathway is reported.
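
    The Singular Value Decomposition step can be illustrated as follows: stacking the spectra recorded at different pH values into a matrix, the number of singular values above the noise floor estimates how many independent absorbing species (and hence acid-base equilibria) are present. The data below are random stand-ins, not the paper's spectra.

        import numpy as np

        rng = np.random.default_rng(0)
        n_ph, n_wl, n_species = 25, 300, 5          # assumed dimensions
        C = rng.random((n_ph, n_species))           # hidden concentration profiles
        S = rng.random((n_species, n_wl))           # hidden pure-component spectra
        A = C @ S + 1e-3 * rng.standard_normal((n_ph, n_wl))  # Beer-Lambert + noise

        # Singular values well above the noise floor count the species.
        s = np.linalg.svd(A, compute_uv=False)
        n_components = int(np.sum(s > 10 * s[-1]))  # crude, illustrative cutoff
        print(s[:8].round(3), "->", n_components, "significant components")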

  17. Image viewing station for MR and SPECT : using personal computer

    International Nuclear Information System (INIS)

    Yim, Byung Il; Jeong, Eun Kee; Suh, Jin Suck; Kim, Myeong Joon

    1996-01-01

    A macro language was programmed to analyze and process, on Macintosh personal computers, GE MR images digitally transferred from the MR main computer, with special interest in the interpretation of information such as patient data and imaging parameters under each image header. By this method, raw data (files) of certain patients may be digitally stored on a hard disk or CD-ROM, and quantitative analysis, interpretation and display become possible. Patients and images were randomly selected; 4.X MR images were transferred through FTP over the Ethernet network, while 5.X and SPECT images were transferred on floppy diskettes. To process the transferred images, NIH Image, a freely distributed software package for the Macintosh, was used together with its macro language to import images and translate header information. To identify the necessary information, a separate window named 'info-txt' was created for each image series. Mac LC, Centris 650, and PowerMac 6100/CD, 7100/CD and 8100/CD models with 256 colors and RAM over 8 Mbytes were used. Different versions of MR images and SPECT images were displayed simultaneously, and the 'info-txt' window showed all necessary information (name of the patient, unit number, date, TR, TE, FOV, etc.). Additional information (diagnosis, pathologic report, etc.) was stored in another text box in 'info-txt'. The size of the file for each image plane was about 149 Kbytes, and the images were stored in step-like file folders. 4.X and 5.X GE Signa 1.5T images were successfully processed with a Macintosh computer and NIH Image. This result may be applied to many fields, and there is hope of a broader area of application with the linkage of NIH Image and a database program

  18. Outpatient follow-up system using a personal computer for patients with hepatocellular carcinoma after surgery.

    Science.gov (United States)

    Itasaka, H; Matsumata, T; Taketomi, A; Yamamoto, K; Yanaga, K; Takenaka, K; Akazawa, K; Sugimachi, K

    1994-12-01

    A simple outpatient follow-up system was developed with a laptop personal computer to assist in the management of patients with hepatocellular carcinoma after hepatic resection. Since it is based on a non-relational database program and the graphical user interface of the Macintosh operating system, even those who are not computer specialists can use it. It is helpful for promptly recognizing the current status and problems of the patients, for diagnosing recurrences of the disease, and for preventing patients from being lost to follow-up. The portability of the computer also facilitates the use of these data everywhere, such as at clinical conferences and in laboratories.

  19. A visual interface to computer programs for linkage analysis.

    Science.gov (United States)

    Chapman, C J

    1990-06-01

    This paper describes a visual approach to the input of information about human families into computer databases, making use of the GEM graphic interface on the Atari ST. Similar approaches could be used on the Apple Macintosh or on the IBM PC AT (to which it has been transferred). For occasional users of pedigree analysis programs, this approach has considerable advantages in ease of use and accessibility. An example of such use might be the analysis of risk in families with Huntington disease using linked RFLPs. However, graphic interfaces do make much greater demands on the programmers of these systems.

  20. Interactive house investigation and radon diagnostics computer program

    International Nuclear Information System (INIS)

    Gillette, L.M.

    1990-01-01

    This paper reports on the interactive computer program called Dungeons and Radon, which was developed as part of the Environmental Protection Agency's (EPA's) Radon Contractor Proficiency (RCP) Program's Radon Technology for Mitigators (RTM) course, currently being offered at the Regional Radon Training Centers (RRTCs). The program was designed by Terry Brennan to be used in training radon mitigation contractors. The Macintosh-based program consists of a series of animated, sound- and voice-enhanced house scenes. The participants choose where and what to investigate, and where to perform diagnostic tests, in order to gather enough information to design a successful mitigation system

  1. Design of a digital beam attenuation system for computed tomography. Part II. Performance study and initial results

    International Nuclear Information System (INIS)

    Szczykutowicz, Timothy P.; Mistretta, Charles A.

    2013-01-01

    reduction of ≈4 times relative to flat field CT. The dynamic range for the DBA prototype was 3.7 compared to 84.2 for the flat field scan. Conclusions: Based on the results presented in this paper and the companion paper [T. Szczykutowicz and C. Mistretta, “Design of a digital beam attenuation system for computed tomography. Part I. System design and simulation framework,” Med. Phys. 40, 021905 (2013)], FFMCT (fluence field modulated CT) implemented via the DBA device seems feasible and should result in both a dose reduction and an improvement in image quality as judged by noise uniformity and scatter reduction. In addition, the dynamic range reduction achievable using the DBA may allow photon-counting imaging to become a clinical reality. This study may allow for yet another step to be taken in the field of patient-specific dose modulation.

  2. An Approach for a Synthetic CTL Vaccine Design against Zika Flavivirus Using Class I and Class II Epitopes Identified by Computer Modeling

    Directory of Open Access Journals (Sweden)

    Edecio Cunha-Neto

    2017-06-01

    The threat posed by severe congenital abnormalities related to Zika virus (ZKV) infection during pregnancy has turned development of a ZKV vaccine into an emergency. Recent work suggests that the cytotoxic T lymphocyte (CTL) response to infection is an important defense mechanism in response to ZKV. Here, we develop the rationale and strategy for a new approach to developing CTL vaccines for ZKV flavivirus infection. The proposed approach is based on recent studies using a protein structure computer model for HIV epitope selection designed to select epitopes for CTL attack optimized for viruses that exhibit antigenic drift. Because naturally processed and presented human ZKV T cell epitopes have not yet been described, we identified predicted class I peptide sequences on ZKV matching previously identified DNV (Dengue) class I epitopes and by using a Major Histocompatibility Complex (MHC) binding prediction tool. A subset of those met the criteria for optimal CD8+ attack based on physical chemistry parameters determined by analysis of the ZKV protein structure encoded in open-source Protein Data Bank (PDB) format files. We also identified candidate ZKV epitopes predicted to bind promiscuously to multiple HLA class II molecules that could provide help to the CTL responses. This work suggests that a CTL vaccine for ZKV may be possible even if ZKV exhibits significant antigenic drift. We have previously described a microsphere-based CTL vaccine platform capable of eliciting an immune response for class I epitopes in mice and are currently working toward in vivo testing of class I and class II epitope delivery directed against ZKV epitopes using the same microsphere-based vaccine.

  3. Extent of early ischemic changes on computed tomography (CT) before thrombolysis: prognostic value of the Alberta Stroke Program Early CT Score in ECASS II.

    Science.gov (United States)

    Dzialowski, Imanuel; Hill, Michael D; Coutts, Shelagh B; Demchuk, Andrew M; Kent, David M; Wunderlich, Olaf; von Kummer, Rüdiger

    2006-04-01

    The significance of early ischemic changes (EICs) on computed tomography (CT) for triaging patients for thrombolysis has been controversial. The Alberta Stroke Program Early CT Score (ASPECTS) semiquantitatively assesses EICs within the middle cerebral artery territory using a 10-point grading system. We hypothesized that dichotomized ASPECTS predicts response to intravenous thrombolysis and the incidence of secondary hemorrhage within 6 hours of stroke onset. Data from the European-Australian Acute Stroke Study (ECASS) II were used, in which 800 patients were randomized to recombinant tissue plasminogen activator (rt-PA) or placebo within 6 hours of symptom onset. We retrospectively assessed all baseline CT scans, dichotomized ASPECTS at 7, defined favorable outcome as a modified Rankin Scale score of 0 to 2 after 90 days, and secondary hemorrhage as parenchymal hematoma 1 (PH1) or PH2. We performed a multivariable logistic regression analysis and assessed for an interaction between rt-PA treatment and baseline ASPECTS. We scored ASPECTS >7 in 557 and ≤7 in 231 patients. There was no treatment-by-ASPECTS interaction with dichotomized ASPECTS (P=0.3). This also applied to the 0- to 3-hour and 3- to 6-hour cohorts. However, a treatment-by-ASPECTS effect modification was seen in predicting PH (P=0.043 for the interaction term), indicating a much higher likelihood of thrombolytic-related parenchymal hemorrhage in those with ASPECTS ≤7. In ECASS II, the effect of rt-PA on functional outcome is not influenced by baseline ASPECTS. Patients with low ASPECTS have a substantially increased risk of thrombolytic-related PH.
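
    The treatment-by-ASPECTS interaction test described above corresponds to a logistic regression with a product term. A minimal sketch follows, assuming a patient-level table with hypothetical column names (the ECASS II data themselves are not reproduced here):

        import pandas as pd
        import statsmodels.formula.api as smf

        # Assumed columns: favorable (1 = 90-day mRS 0-2), rtpa (1 = rt-PA,
        # 0 = placebo), aspects_le7 (1 = baseline ASPECTS <= 7).
        df = pd.read_csv("ecass2_like.csv")          # placeholder file name

        model = smf.logit("favorable ~ rtpa * aspects_le7", data=df).fit()
        print(model.summary())
        # The 'rtpa:aspects_le7' coefficient is the treatment-by-ASPECTS
        # interaction; a non-significant term (P=0.3 above) means no evidence
        # that baseline ASPECTS modifies the effect of rt-PA on outcome.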

  4. An Evaluation of Mandibular Dental and Basal Arch Dimensions in Class I and Class II Division 1 Adult Syrian Patients using Cone-beam Computed Tomography.

    Science.gov (United States)

    Al-Hilal, Layal H; Sultan, Kinda; Hajeer, Mohammad Y; Mahmoud, Ghiath; Wanli, Abdulrahman A

    2018-04-01

    Aim: The aims of this study were (1) to inspect any difference in mandibular arch widths between males and females in class I and class II division 1 malocclusions using cone-beam computed tomography (CBCT), (2) to compare the mandibular dental and basal widths between the two groups, and (3) to investigate any possible correlation between dental and basal arch widths in both groups. Materials and methods: The CBCT images of 68 patients aged between 18 and 25 years consisted of 34 class I (17 males and 17 females) and 34 class II division 1 (17 males and 17 females) patients who were recruited at the Department of Orthodontics, University of Damascus Dental School (Syria). Using OnDemand3D on axial views, facial axis points for dental measurements and basal bone center (BBC) points for basal measurements were identified on lower canines and first molars. Dental and basal intercanine width (ICW) and intermolar width (IMW) were measured. Results: Independent t-tests showed a statistically significant difference between males and females in several variables in both groups, and a statistically significant difference between the class I and class II division 1 groups in the basal ICW for both genders and in the dental ICW for females only (p < 0.05). In the class I group, Pearson's correlation coefficients between dental and basal measurements showed a strong correlation in the IMW for both genders (r > 0.73); in the class II division 1 group, there was a moderate correlation in females' IMW (r = 0.67). Class I patients had larger ICW than class II-1 patients in all measurements and narrower IMW than class II-1 patients in most measurements for both genders. There were moderate-to-strong correlations between dental and basal dimensions. BBC points might be landmarks that accurately represent the basal bone arch. Clinical significance: CBCT-based assessments of dental and basal arch dimensions provide a great opportunity to accurately evaluate these aspects, to enhance clinicians' decisions regarding proper tooth movements, and to achieve

  5. Apical Root Canal Morphology of Mesial Roots of Mandibular First Molar Teeth with Vertucci Type II Configuration by Means of Micro-Computed Tomography.

    Science.gov (United States)

    Keleş, Ali; Keskin, Cangül

    2017-03-01

    The aim of this study was to assess the features of the apical root canal anatomy and its relation to the level at which 2 separate root canals merge in the mesial roots of mandibular first molars with Vertucci type II canal configuration by using micro-computed tomography analysis. The anatomic features of the apical 3 mm of root canals in 83 mesial roots of mandibular first molar teeth were investigated by micro-computed tomography and software imaging according to the level at which the 2 separate root canals merge. The most apical slice where a visible root canal was detectable was recorded as the 0 level. Specimens in which the 2 root canals merged within 3 mm of the 0 level were assigned to group 1 (n = 37), whereas specimens in which the root canals merged 3-9 mm from the 0 level were assigned to group 2 (n = 46). Data were presented using descriptive statistics and Mann-Whitney U tests, with the significance level set at 5%. In all specimens the long oval type of cross-sectional shape increased from 50.9% at 1 mm to 80.5% at 3 mm. Group 1 presented significantly higher major diameter values compared with group 2 (P < .05), whereas no significant differences were detected in the other measured parameters (P > .05) between groups. Group 2 displayed significantly higher roundness values than group 1 (P < .05). A long oval cross section is more prevalent in the apical canal anatomy of roots in which the 2 canals merge within the apical 3 mm. Copyright © 2016 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  6. 3-dimensional magnetotelluric inversion including topography using deformed hexahedral edge finite elements and direct solvers parallelized on symmetric multiprocessor computers - Part II: direct data-space inverse solution

    Science.gov (United States)

    Kordy, M.; Wannamaker, P.; Maris, V.; Cherkaev, E.; Hill, G.

    2016-01-01

    Following the creation described in Part I of a deformable edge finite-element simulator for 3-D magnetotelluric (MT) responses using direct solvers, in Part II we develop an algorithm named HexMT for 3-D regularized inversion of MT data including topography. Direct solvers parallelized on large-RAM, symmetric multiprocessor (SMP) workstations are used also for the Gauss-Newton model update. By exploiting the data-space approach, the computational cost of the model update becomes much less in both time and computer memory than the cost of the forward simulation. In order to regularize using the second norm of the gradient, we factor the matrix related to the regularization term and apply its inverse to the Jacobian, which is done using the MKL PARDISO library. For dense matrix multiplication and factorization related to the model update, we use the PLASMA library which shows very good scalability across processor cores. A synthetic test inversion using a simple hill model shows that including topography can be important; in this case depression of the electric field by the hill can cause false conductors at depth or mask the presence of resistive structure. With a simple model of two buried bricks, a uniform spatial weighting for the norm of model smoothing recovered more accurate locations for the tomographic images compared to weightings which were a function of parameter Jacobians. We implement joint inversion for static distortion matrices tested using the Dublin Secret Model 2, for which we are able to reduce nRMS to ~1.1 while avoiding oscillatory convergence. Finally we test the code on field data by inverting full impedance and tipper MT responses collected around Mount St Helens in the Cascade volcanic chain. Among several prominent structures, the north-south trending, eruption-controlling shear zone is clearly imaged in the inversion.
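
    The saving from the data-space approach is visible in the two equivalent forms of the regularized Gauss-Newton normal equations (generic Occam-style notation; HexMT's exact expressions may differ). With M model parameters and N data, N << M, the model-space system

        \left( \lambda C_m^{-1} + J^{T} C_d^{-1} J \right) \delta m \;=\; J^{T} C_d^{-1} \hat{d}

    is M x M, whereas the equivalent data-space form

        \left( \lambda I + C_d^{-1/2} J C_m J^{T} C_d^{-1/2} \right) \beta \;=\; C_d^{-1/2} \hat{d}, \qquad \delta m \;=\; C_m J^{T} C_d^{-1/2} \beta

    is only N x N, which is why the model update costs much less than the forward simulation.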

  7. cobalt (ii), nickel (ii)

    African Journals Online (AJOL)

    DR. AMINU

    Department of Chemistry Bayero University, P. M. B. 3011, Kano, Nigeria. E-mail: hnuhu2000@yahoo.com. ABSTRACT. The manganese (II), cobalt (II), nickel (II) and .... water and common organic solvents, but are readily soluble in acetone. The molar conductance measurement [Table 3] of the complex compounds in.

  8. Criticality and safety parameter studies for upgrading 3 MW TRIGA MARK II research reactor and validation of generated cross section library and computational method

    International Nuclear Information System (INIS)

    Bhuiyan, S.I.; Mondal, M.A.W.; Sarker, M.M.; Rahman, M.; Shahdatullah, M.S.; Huda, M.Q.; Chakrroborty, T.K.; Khan, M.J.H.

    2000-01-01

    This study deals with the neutronic and thermal-hydraulic analysis of the 3 MW TRIGA MARK II research reactor in order to upgrade it to a higher flux. The upgrade will need a major reshuffling and reconfiguration of the current core. To reshuffle the current core configuration, the chain of NJOY94.10 - WIMSD-5A - CITATION - PARET - MCNP4B2 codes has been used for the overall analysis. The computational methods, tools and techniques, the customisation of cross-section libraries, various models for cells and super cells, and many associated utilities have been standardised and established/validated for the overall core analysis. Analyses using the 4-group and 7-group libraries of macroscopic cross sections generated from the 69-group WIMSD-5 library showed that a 7-group structure is more suitable for TRIGA calculations considering its LEU fuel composition. The MCNP calculations established that the CITATION calculations and the generated cross-section library are reasonably good for neutronic analysis of TRIGA reactors. Results obtained from PARET demonstrated that the flux upgrade will not cause the temperature limit on the fuel to be exceeded. Also, the maximum power density remains, by a substantial margin, below the level at which departure from nucleate boiling could occur. A possible core with two additional irradiation channels around the central thimble (CT) is projected, in which thermal fluxes almost identical to those in the CT are obtained. The reconfigured core also shows a 7.25% thermal flux increase in the Lazy Susan. (author)

  9. The study of time-dependent neutronics parameters of the 2MW TRIGA Mark II Moroccan research reactor using BUCAL1 computer code

    International Nuclear Information System (INIS)

    Bakkari, B. El; Nacir, B.; El Younoussi, C.; Boulaich, Y.; Riyach, I.; Otmani, S.; Marcih, I.; Elbadri, H.; El Bardouni, T; Merroun, O.; Boukhal, H.; Zoubair, M.; Htet, A.; Chakir, M.

    2010-01-01

    The 2-MW TRIGA MARK II research reactor at the Centre National de l'Energie, des Sciences et des Techniques Nucleaires (CNESTEN) achieved initial criticality on May 2, 2007 with 71 fuel elements. The reactor is designed to effectively implement various fields of basic nuclear research, manpower training, and production of radioisotopes for use in agriculture, industry and medicine. This work aims to study the time-dependent neutronics parameters of the TRIGA reactor for elaborating and planning an in-core fuel management strategy to maximize the utilization of the TRIGA fluxes, using a newly elaborated burnup computer code called BUCAL1. The code can be used to aid in the analysis, prediction, and optimization of fuel burnup performance in a nuclear reactor. It was developed to incorporate the neutron absorption tally/reaction information generated directly by the MCNP5 code in the calculation of fissioned or neutron-transmuted isotopes for multi-fueled regions. The use of the Monte Carlo method and the pointwise cross-section data characteristic of the MCNP code allows an accurate simulation of the neutron life cycle in the reactor and the integration of data over the entire energy spectrum, and thus a more accurate estimation of results than deterministic codes can provide. Also, for the purpose of this study, a full model of the TRIGA reactor was developed using the MCNP5 code. The validation of the MCNP model of the TRIGA reactor was made by benchmarking the reactivity experiments. (author)
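
    The inventory that a burnup module of this kind advances in time obeys, for each nuclide i in each fuel region, a depletion balance of the generic form below (not necessarily BUCAL1's exact formulation); the one-group reaction rates are precisely what the MCNP5 absorption tallies supply:

        \frac{\mathrm{d}N_i}{\mathrm{d}t} \;=\; \sum_{j \neq i} \left( \ell_{ji}\,\lambda_j + f_{ji}\,\sigma_j \phi \right) N_j \;-\; \left( \lambda_i + \sigma_i \phi \right) N_i

    where the lambda are decay constants, the sigma*phi terms are spectrum-averaged reaction rates per nucleus, and l_ji, f_ji are the decay and transmutation branching fractions from nuclide j to nuclide i.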

  10. Dynamic 123I-BMIPP single-photon emission computed tomography in patients with congestive heart failure: effect of angiotensin II type-1 receptor blockade.

    Science.gov (United States)

    Takeishi, Yasuchika; Minamihaba, Osamu; Yamauchi, Sou; Arimoto, Takanori; Hirono, Osamu; Takahashi, Hiroki; Akiyama, Hideyuki; Miyamoto, Takuya; Nitobe, Joji; Nozaki, Naoki; Tachibana, Hidetada; Okuyama, Masaki; Fukui, Akio; Kubota, Isao; Okada, Akio; Takahashi, Kazuei

    2004-04-01

    Heart failure is a major and growing public health problem with a high mortality rate. Although recent studies have demonstrated that a variety of metabolic and/or neurohumoral factors are involved in the progression of this syndrome, the precise mechanisms responsible for this complex condition are poorly understood. To examine 123I-beta-methyl-iodophenylpentadecanoic acid (BMIPP) kinetics in the early phase soon after tracer injection in patients with congestive heart failure (CHF), we performed dynamic single-photon emission computed tomography (SPECT). Twenty-six patients with CHF and eight control subjects were examined. Fifteen consecutive 2-min dynamic SPECT images were acquired over 30 min after injection. In the early phase after injection (0-4 min), a significant amount of radioactivity existed in the blood pool. After 6 min, the myocardial 123I-BMIPP image was clear, and thus the washout rate of 123I-BMIPP from 6 to 30 min was calculated. The washout rate of 123I-BMIPP from the myocardium was faster in patients with CHF than in the controls (8 +/- 4 vs. -5 +/- 3%, p < 0.05). Improved myocardial fatty acid metabolism may represent a new mechanism for the beneficial effects of angiotensin II receptor blockade on cardiac function and survival in patients with heart failure. 123I-BMIPP washout in the early phase obtained from dynamic SPECT may be a new marker for evaluating the severity of heart failure and the effects of medical treatment.

  11. Influence of clinical experience with the Macintosh laryngoscope on performance with the Pentax-AWS Airway Scope®, a rigid video-laryngoscope, by paramedics in Japan.

    Science.gov (United States)

    Ota, Kohei; Sadamori, Takuma; Kusunoki, Shinji; Otani, Tadatsugu; Tamura, Tomoko; Une, Kazunobu; Kida, Yoshiko; Itai, Junji; Iwasaki, Yasumasa; Hirohashi, Nobuyuki; Nakao, Masakazu; Tanigawa, Koichi

    2015-10-01

    We sought to establish the clinical utility of the Pentax-AWS Airway Scope® (AWS) when used by paramedics to intubate the trachea, and to evaluate whether their performance was influenced by previous clinical experience with the Macintosh laryngoscope (ML). Twenty paramedics attempted tracheal intubation using the AWS in five patients each in the operating room. We recorded the success rate, the number of intubation attempts, and the time for intubation and adverse events, and compared these based on the paramedics' previous clinical experience with the ML. Ten paramedics had no prior clinical experience of the ML (group A) and 10 had used it on more than 30 occasions (group B). The intubation success rate was 99 % (99/100). Notably, 96 % (47/49) of intubations were achieved on the first attempt by the inexperienced paramedics in group A, compared with 64 % (32/50) by the experienced paramedics in group B (p = 0.0001). The time to intubation (mean ± SD) was significantly shorter in group A than in group B (37 ± 24 vs. 48 ± 21 s, p = 0.002). There were marked variations in the times taken to intubate, but no apparent improvement as the intubators gained experience between their first and fifth cases. No complications were encountered in either group. We found that paramedics could achieve a high tracheal intubation success rate using the AWS independent of previous airway management experience. Better intubation performance with the AWS was observed in paramedics without clinical experience with the ML.

  12. Muscle activity during endotracheal intubation using 4 laryngoscopes (Macintosh laryngoscope, Intubrite, TruView Evo2 and King Vision) – A comparative study

    Directory of Open Access Journals (Sweden)

    Tomasz Gaszyński

    2016-04-01

    Background: Successful endotracheal intubation requires mental and, no less important, physical activity from the anesthesiologist, so the ergonomics of the devices used is important. The aim of our study was to compare 4 laryngoscopes with regard to the operator's activity of selected muscles of the upper limb, the operator's satisfaction with the devices used, and the operator's fatigue during intubation attempts. Material and Methods: The study included 13 anesthesiologists of similar seniority. To measure muscle activity, a MyoPlus 2 device with 2-channel surface electromyography (sEMG) was used. Participants' satisfaction with the studied devices was evaluated using a Visual Analog Scale. The operator's fatigue during intubation efforts was evaluated by means of the modified Borg scale. Results: The highest activity of all the studied muscles was observed for the Intubrite laryngoscope, followed by the Macintosh and the TruView Evo2, and the lowest for the King Vision video laryngoscope. A statistically significant difference was observed between the King Vision and the rest of the laryngoscopes (p < 0.05), but not among the remaining devices (p > 0.05). The shortest time of intubation was achieved using the standard Macintosh blade laryngoscope. The highest satisfaction was noted for the King Vision video laryngoscope, and the lowest for the TruView Evo2. The Intubrite was the most demanding in terms of workload in the participants' opinion, and the least demanding was the King Vision video laryngoscope. Conclusions: Muscle activity, namely the force used for intubation, is smallest when the King Vision video laryngoscope is used, with the highest satisfaction and lowest workload; the highest muscle activity was observed for the Intubrite laryngoscope, with the highest workload. Med Pr 2016;67(2):155–162

  13. Ability of paramedics to perform endotracheal intubation during continuous chest compressions: a randomized cadaver study comparing Pentax AWS and Macintosh laryngoscopes.

    Science.gov (United States)

    Truszewski, Zenon; Czyzewski, Lukasz; Smereka, Jacek; Krajewski, Paweł; Fudalej, Marcin; Madziala, Marcin; Szarpak, Lukasz

    2016-09-01

    The aim of the trial was to compare the time parameters for intubation with the use of the Macintosh (MAC) laryngoscope and the Pentax AWS-S100 videolaryngoscope (AWS; Pentax Corporation, Tokyo, Japan) with and without chest compression (CC) by paramedics during simulated cardiopulmonary resuscitation in a cadaver model. This was a randomized crossover cadaver trial. Thirty-five paramedics with no experience in videolaryngoscopy participated in the study. They performed intubation in two emergency scenarios: scenario A, normal airway without CC; scenario B, normal airway with continuous CC. The median time to first ventilation (TTFV) with the use of the AWS and the MAC was similar in scenario A: 25 (IQR, 22-27) seconds vs. 24 (IQR, 22.5-26) seconds (P=.072). A statistically significant difference in TTFV between the AWS and the MAC was noticed in scenario B (P=.011). In scenario A, first endotracheal intubation (ETI) attempt success was achieved in 97.1% with the AWS compared with 94.3% with the MAC (P=.43). In scenario B, the success rate after the first ETI attempt with the use of the different intubation methods varied and amounted to 88.6% vs. 77.1% for the AWS and the MAC, respectively (P=.002). The Pentax AWS offered a superior glottic view as compared with the MAC laryngoscope, which was associated with a higher intubation rate and a shorter intubation time during an uninterrupted CC scenario. However, in the scenario without CC, the results for the AWS and the MAC were comparable. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Quantitative evaluation of low-cost frame-grabber boards for personal computers.

    Science.gov (United States)

    Kofler, J M; Gray, J E; Fuelberth, J T; Taubel, J P

    1995-11-01

    Nine moderately priced frame-grabber boards for both Macintosh (Apple Computer, Cupertino, CA) and IBM-compatible computers were evaluated using a Society of Motion Picture and Television Engineers (SMPTE) pattern and a video signal generator for dynamic range, gray-scale reproducibility, and spatial integrity of the captured image. The degradation of the video information ranged from minor to severe. Some boards are of reasonable quality for applications in diagnostic imaging and education. However, price and quality are not necessarily directly related.

  15. Polymorphisms in the F8 gene and MHC-II variants as risk factors for the development of inhibitory anti-factor VIII antibodies during the treatment of hemophilia A: a computational assessment.

    Directory of Open Access Journals (Sweden)

    Gouri Shankar Pandey

    The development of neutralizing anti-drug antibodies to the Factor VIII protein therapeutic is currently the most significant impediment to the effective management of hemophilia A. Common non-synonymous single nucleotide polymorphisms (ns-SNPs) in the F8 gene occur as six haplotypes in the human population (denoted H1 to H6), of which H3 and H4 have been associated with an increased risk of developing anti-drug antibodies. There is evidence that the CD4+ T-cell response is essential for the development of anti-drug antibodies, and such a response requires the presentation of the peptides by the MHC class II (MHC-II) molecules of the patient. We measured the binding and half-life of peptide-MHC-II complexes using synthetic peptides from regions of the Factor VIII protein where ns-SNPs occur and showed that these wild type peptides form stable complexes with six common MHC-II alleles, representing 46.5% of the North American population. Next, we compared the affinities computed by NetMHCIIpan, a neural network-based algorithm for MHC-II peptide binding prediction, to the experimentally measured values and concluded that these are in good agreement (area under the ROC curve of 0.778 to 0.972 for the six MHC-II variants). Using a computational binding predictor, we were able to expand our analysis to (a) include all wild type peptides spanning each polymorphic position; and (b) consider more MHC-II variants, thus allowing for a better estimation of the risk for clinical manifestation of anti-drug antibodies in the entire population (or a specific sub-population). Analysis of these computational data confirmed that peptides which have the wild type sequence at positions where the polymorphisms associated with haplotypes H3, H4 and H5 occur bind MHC-II proteins significantly more than a negative control. Taken together, the experimental and computational results suggest that wild type peptides from polymorphic regions of FVIII constitute potential T-cell epitopes
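
    The reported agreement (area under the ROC curve of 0.778 to 0.972) can be reproduced in a few lines once measured binder labels and predicted affinities are in hand; the arrays below are invented stand-ins for one MHC-II allele, not the study's data:

        import numpy as np
        from sklearn.metrics import roc_auc_score

        # 1 = peptide measured as a stable MHC-II binder, 0 = non-binder.
        measured = np.array([1, 1, 0, 1, 0, 0, 1, 0, 1, 0])
        # Predicted binding strength (higher = stronger) for the same peptides.
        predicted = np.array([0.91, 0.75, 0.30, 0.62, 0.45,
                              0.12, 0.88, 0.51, 0.70, 0.22])

        print(f"area under ROC curve: {roc_auc_score(measured, predicted):.3f}")
        # Note: NetMHCIIpan reports IC50-like affinities where LOWER means
        # stronger binding; negate such scores (or use 1 - percentile rank)
        # so that higher values mean stronger predicted binding.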

  16. COMPUTER HARDWARE MARKING

    CERN Multimedia

    Groupe de protection des biens

    2000-01-01

    As part of the campaign to protect CERN property and for insurance reasons, all computer hardware belonging to the Organization must be marked with the words 'PROPRIETE CERN'. IT Division has recently introduced a new marking system that is both economical and easy to use. From now on all desktop hardware (PCs, Macintoshes, printers) issued by IT Division with a value equal to or exceeding 500 CHF will be marked using this new system. For equipment that is already installed but not yet marked, including UNIX workstations and X terminals, IT Division's Desktop Support Service offers the following services free of charge: equipment-marking wherever the Service is called out to perform other work (please submit all work requests to the IT Helpdesk on 78888 or helpdesk@cern.ch; for unavoidable operational reasons, the Desktop Support Service will only respond to marking requests when these coincide with requests for other work such as repairs, system upgrades, etc.); training of personnel designated by Division Leade...

  17. Experimental and Computational Studies of the Macrocyclic Effect of an Auxiliary Ligand on Electron and Proton Transfers Within Ternary Copper(II)-Histidine Complexes

    International Nuclear Information System (INIS)

    Song, Tao; Lam, Corey; Ng, Dominic C.; Orlova, G.; Laskin, Julia; Fang, De-Cai; Chu, Ivan K.

    2009-01-01

    The dissociation of [CuII(L)His]•2+ complexes [L = diethylenetriamine (dien) or 1,4,7-triazacyclononane (9-aneN3)] bears a strong resemblance to the previously reported behavior of [CuII(L)GGH]•2+ complexes. We have used low-energy collision-induced dissociation experiments and density functional theory (DFT) calculations at the B3LYP/6-31+G(d) level to study the macrocyclic effect of the auxiliary ligands on the formation of His•+ from prototypical [CuII(L)His]•2+ systems. DFT revealed that the relative energy barriers of the same electron transfer (ET) dissociation pathways of [CuII(9-aneN3)His]•2+ and [CuII(dien)His]•2+ are very similar, with the ET reactions of [CuII(9-aneN3)His]•2+ leading to the generation of two distinct His•+ species; in contrast, the proton transfer (PT) dissociation pathways of [CuII(9-aneN3)His]•2+ and [CuII(dien)His]•2+ differ considerably. The PT reactions of [CuII(9-aneN3)His]•2+ are associated with substantially higher barriers (>13 kcal/mol) than those of [CuII(dien)His]•2+. Thus, the sterically encumbered auxiliary 9-aneN3 ligand facilitates ET reactions while moderating PT reactions, allowing the formation of hitherto non-observable histidine radical cations.

  18. Quantitative coronary plaque analysis predicts high-risk plaque morphology on coronary computed tomography angiography: results from the ROMICAT II trial.

    Science.gov (United States)

    Liu, Ting; Maurovich-Horvat, Pál; Mayrhofer, Thomas; Puchner, Stefan B; Lu, Michael T; Ghemigian, Khristine; Kitslaar, Pieter H; Broersen, Alexander; Pursnani, Amit; Hoffmann, Udo; Ferencik, Maros

    2018-02-01

    Semi-automated software can provide quantitative assessment of atherosclerotic plaques on coronary CT angiography (CTA). The relationship between established qualitative high-risk plaque features and quantitative plaque measurements has not been studied. We analyzed the association between quantitative plaque measurements and qualitative high-risk plaque features on coronary CTA. We included 260 patients with plaque who underwent coronary CTA in the Rule Out Myocardial Infarction/Ischemia Using Computer Assisted Tomography (ROMICAT) II trial. Quantitative plaque assessment and qualitative plaque characterization were performed on a per coronary segment basis. Quantitative coronary plaque measurements included plaque volume, plaque burden, remodeling index, and diameter stenosis. In qualitative analysis, high-risk plaque was present if positive remodeling, low CT attenuation plaque, napkin-ring sign or spotty calcium were detected. Univariable and multivariable logistic regression analyses were performed to assess the association between quantitative and qualitative high-risk plaque assessment. Among 888 segments with coronary plaque, high-risk plaque was present in 391 (44.0%) segments by qualitative analysis. In quantitative analysis, segments with high-risk plaque had higher total plaque volume, low CT attenuation plaque volume, plaque burden and remodeling index. Quantitatively assessed low CT attenuation plaque volume (odds ratio 1.12 per 1 mm³, 95% CI 1.04-1.21), positive remodeling (odds ratio 1.25 per 0.1, 95% CI 1.10-1.41) and plaque burden (odds ratio 1.53 per 0.1, 95% CI 1.08-2.16) were associated with high-risk plaque. Quantitative coronary plaque characteristics (low CT attenuation plaque volume, positive remodeling and plaque burden) measured by semi-automated software correlated with qualitative assessment of high-risk plaque features.

  19. Alveolar bone thickness and lower incisor position in skeletal Class I and Class II malocclusions assessed with cone-beam computed tomography.

    Science.gov (United States)

    Baysal, Asli; Ucar, Faruk Izzet; Buyuk, Suleyman Kutalmis; Ozer, Torun; Uysal, Tancan

    2013-06-01

    To evaluate lower incisor position and bony support in patients with Class II average- and high-angle malocclusions and to compare them with patients presenting Class I malocclusions. CBCT records of 79 patients were divided into 2 groups according to sagittal jaw relationships: Class I and II. Each group was further divided into average- and high-angle subgroups. Six angular and 6 linear measurements were performed. Independent-samples t-tests, Kruskal-Wallis, and Dunn post-hoc tests were performed for statistical comparisons. Labial alveolar bone thickness was significantly higher in the Class I group compared to the Class II group (p = 0.003). Lingual alveolar bone angle (p = 0.004), lower incisor protrusion (p = 0.007) and proclination (p = 0.046) were greatest in Class II average-angle patients. Spongious bone was thinner (p = 0.016) and the root apex was closer to the labial cortex in the high-angle subgroups when compared to the Class II average-angle subgroup (p = 0.004). Mandibular anterior bony support and lower incisor position were different between average- and high-angle Class II patients. Clinicians should be aware that the range of lower incisor movement in high-angle Class II patients is limited compared to average-angle Class II patients.

  20. Comparison of the C-MAC video laryngoscope to the Macintosh laryngoscope for intubation of blunt trauma patients in the ED

    Directory of Open Access Journals (Sweden)

    Erkan Goksu

    2016-06-01

    Objectives: We aimed to compare the performance of the C-MAC video laryngoscope (C-MAC) with the Macintosh laryngoscope for intubation of blunt trauma patients in the ED. Material and methods: This was a prospective randomized study. The primary outcome measure was overall successful intubation. Secondary outcome measures were first-attempt successful intubation, Cormack–Lehane (CL) grade, and indicators of the reasons for unsuccessful intubation at the first attempt with each device. Adult patients who suffered blunt trauma and required intubation were randomized to video laryngoscopy with the C-MAC device or direct laryngoscopy (DL). Results: During a 17-month period, a total of 150 trauma intubations were performed using the C-MAC and DL. Baseline characteristics of patients were similar between the C-MAC and DL groups. Overall success for the C-MAC was 69/75 (92%; 95% CI 0.83 to 0.96), while for the DL it was 72/75 (96%; 95% CI 0.88 to 0.98). First-attempt success for the C-MAC was 47/75 (62.7%; 95% CI 0.51 to 0.72), while for the DL it was 44/75 patients (58.7%; 95% CI 0.47 to 0.69). The mean time to achieve successful intubation was 33.4 ± 2.5 s for the C-MAC versus 42.4 ± 5.1 s for the DL (p = 0.93). There was a statistically significant difference between the DL and the C-MAC in terms of visualizing the glottic opening and esophageal intubation, in favor of the C-MAC (p = 0.002 and p = 0.013, respectively). Discussion and conclusion: The overall success rates were similar. The C-MAC demonstrated an improved glottic view and a decrease in the esophageal intubation rate. Keywords: Airway management, Emergency medicine, Video laryngoscope

  1. Comparisons of the Pentax-AWS, Glidescope, and Macintosh Laryngoscopes for Intubation Performance during Mechanical Chest Compressions in Left Lateral Tilt: A Randomized Simulation Study of Maternal Cardiopulmonary Resuscitation

    Directory of Open Access Journals (Sweden)

    Sanghyun Lee

    2015-01-01

    Purpose. Rapid advanced airway management is important in maternal cardiopulmonary resuscitation (CPR). This study aimed to compare intubation performance among the Pentax-AWS (AWS), Glidescope (GVL), and Macintosh laryngoscope (MCL) during mechanical chest compression in 15° and 30° left lateral tilt. Methods. In 19 emergency physicians, a prospective randomized crossover study was conducted to examine the three laryngoscopes. Primary outcomes were the intubation time and the success rate for intubation. Results. The median intubation time using the AWS was shorter than that of the GVL and the MCL in both tilt degrees. The time to visualize the glottic view with the GVL and the AWS was significantly lower than that with the MCL (all P<0.05), whereas there was no significant difference between the two video laryngoscopes (in 15° tilt, P=1; in 30° tilt, P=0.71). The progression of the tracheal tube using the AWS was faster than that of the MCL and the GVL in both degrees (all P<0.001). Intubations using the AWS and the GVL showed a higher success rate than the Macintosh laryngoscope. Conclusions. The AWS could be an appropriate laryngoscope for airway management of pregnant women in tilt CPR, considering intubation time and success rate.

  2. Angiotensin-Converting Enzyme Inhibitors and Angiotensin II Receptor Blockers and Longitudinal Change in Percent Emphysema on Computed Tomography. The Multi-Ethnic Study of Atherosclerosis Lung Study

    Science.gov (United States)

    Parikh, Megha A.; Aaron, Carrie P.; Hoffman, Eric A.; Schwartz, Joseph E.; Madrigano, Jaime; Austin, John H. M.; Lovasi, Gina; Watson, Karol; Stukovsky, Karen Hinckley

    2017-01-01

    Rationale: Although emphysema on computed tomography (CT) is associated with increased morbidity and mortality in patients with and without spirometrically defined chronic obstructive pulmonary disease, no available medications target emphysema outside of alpha-1 antitrypsin deficiency. Transforming growth factor-β and endothelial dysfunction are implicated in emphysema pathogenesis, and angiotensin II receptor blockers (ARBs) inhibit transforming growth factor-β, improve endothelial function, and restore airspace architecture in murine models. Evidence in humans is, however, lacking. Objectives: To determine whether angiotensin-converting enzyme (ACE) inhibitor and ARB dose is associated with slowed progression of percent emphysema by CT. Methods: The Multi-Ethnic Study of Atherosclerosis researchers recruited participants ages 45-84 years from the general population from 2000 to 2002. Medication use was assessed by medication inventory. Percent emphysema was defined as the percentage of lung regions less than −950 Hounsfield units on CTs. Mixed-effects regression models were used to adjust for confounders. Results: Among 4,472 participants, 12% used an ACE inhibitor and 6% used an ARB at baseline. The median percent emphysema was 3.0% at baseline, and the rate of progression was 0.64 percentage points over a median of 9.3 years. Higher doses of ACE inhibitors or ARBs were independently associated with a slower change in percent emphysema (P = 0.03). Over 10 years, in contrast to a predicted mean increase in percent emphysema of 0.66 percentage points in those who did not take ARBs or ACE inhibitors, the predicted mean increase in participants who used maximum doses of ARBs or ACE inhibitors was 0.06 percentage points (P = 0.01). The findings were of greatest magnitude among former smokers. There was no evidence that ACE inhibitor or ARB dose was associated with decline in lung function. Conclusions: In a large population-based study, ACE
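
    The longitudinal analysis described above can be sketched with a linear mixed-effects model; the column names and input file below are hypothetical, and the published analysis adjusted for many covariates omitted here:

        import pandas as pd
        import statsmodels.formula.api as smf

        # Assumed long format: one row per participant per CT exam, with
        # pct_emph (percent emphysema), years (since baseline), dose
        # (standardized ACE-inhibitor/ARB dose) and id (participant).
        df = pd.read_csv("mesa_like.csv")            # placeholder file name

        # Random intercept and slope per participant; the 'years:dose' term
        # tests whether higher dose is associated with slower progression.
        m = smf.mixedlm("pct_emph ~ years * dose", df,
                        groups=df["id"], re_formula="~years").fit()
        print(m.summary())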

  3. Graphics gems V (Macintosh version)

    CERN Document Server

    Paeth, Alan W

    1995-01-01

    Graphics Gems V is the newest volume in The Graphics Gems Series. It is intended to provide the graphics community with a set of practical tools for implementing new ideas and techniques, and to offer working solutions to real programming problems. These tools are written by a wide variety of graphics programmers from industry, academia, and research. The books in the series have become essential, time-saving tools for many programmers. Latest collection of graphics tips in The Graphics Gems Series written by the leading programmers in the field. Contains over 50 new gems displaying some of t

  4. Computer-based nuclear radiation detection and instrumentation teaching laboratory system

    International Nuclear Information System (INIS)

    Ellis, W.H.; He, Q.

    1993-01-01

    The integration of computers into the University of Florida's Nuclear Engineering Sciences teaching laboratories is based on the innovative use of Macintosh II microcomputers, the IEEE-488 (GPIB) communication and control bus system and protocol, compatible modular nuclear instrumentation (NIM) and test equipment, LabVIEW graphics and applications software, and locally prepared, interactive, menu-driven, HyperCard-based multi-exercise laboratory instruction sets and procedures. Results thus far have been highly successful, with the majority of the laboratory exercises having been implemented

  5. ISORROPIA II: a computationally efficient thermodynamic equilibrium model for K+-Ca2+-Mg2+-NH4+-Na+-SO42−-NO3−-Cl−-H2O aerosols

    Directory of Open Access Journals (Sweden)

    C. Fountoukis

    2007-09-01

    This study presents ISORROPIA II, a thermodynamic equilibrium model for the K+–Ca2+–Mg2+–NH4+–Na+–SO42−–NO3−–Cl−–H2O aerosol system. A comprehensive evaluation of its performance is conducted against water uptake measurements for laboratory aerosol and predictions of the SCAPE2 thermodynamic module over a wide range of atmospherically relevant conditions. The two models agree well, to within 13% for aerosol water content and total PM mass, 16% for aerosol nitrate and 6% for aerosol chloride and ammonium. The largest discrepancies were found under conditions of low RH, primarily from differences in the treatment of water uptake and solid-state composition. In terms of computational speed, ISORROPIA II was more than an order of magnitude faster than SCAPE2, with robust and rapid convergence under all conditions. The addition of crustal species does not slow down the thermodynamic calculations (compared to the older ISORROPIA code) because of optimizations in the activity coefficient calculation algorithm. Based on its computational rigor and performance, ISORROPIA II appears to be a highly attractive alternative for use in large-scale air quality and atmospheric transport models.

  6. Copper(II) complex with 6-methylpyridine-2-carboxylic acid: Experimental and computational study on the XRD, FT-IR and UV-Vis spectra, refractive index, band gap and NLO parameters.

    Science.gov (United States)

    Altürk, Sümeyye; Avcı, Davut; Başoğlu, Adil; Tamer, Ömer; Atalay, Yusuf; Dege, Necmi

    2018-02-05

    The crystal structure of the synthesized copper(II) complex with 6-methylpyridine-2-carboxylic acid, [Cu(6-Mepic)2·H2O]·H2O, was determined by XRD, and the complex was further characterized by FT-IR and UV-Vis spectroscopic techniques. Furthermore, the geometry optimization and harmonic vibration frequencies for the Cu(II) complex were obtained by using density functional theory calculations at the HSEh1PBE/6-311G(d,p)/LanL2DZ level. Electronic absorption wavelengths were obtained by using the TD-DFT/HSEh1PBE/6-311G(d,p)/LanL2DZ level with the CPCM model, and major contributions were determined via the SWizard/Chemissian programs. Additionally, the refractive index, linear optical (LO) and nonlinear optical (NLO) parameters of the Cu(II) complex were calculated at the HSEh1PBE/6-311G(d,p) level. The small energy gap, found both experimentally and computationally, shows the charge transfer in the Cu(II) complex. Finally, the hyperconjugative interactions and intramolecular charge transfer (ICT) were studied by performing natural bond orbital (NBO) analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Single photon emission computed tomographic studies (SPECT) of hepatic arterial perfusion scintigraphy (HAPS) in patients with colorectal liver metastases: improved tumour targetting by microspheres with angiotensin II.

    Science.gov (United States)

    Goldberg, J A; Bradnam, M S; Kerr, D J; McKillop, J H; Bessent, R G; McArdle, C S; Willmott, N; George, W D

    1987-12-01

    As intra-arterial chemotherapy for liver metastases of colorectal origin becomes accepted, methods of further improving drug delivery to the tumour have been devised. Degradable microspheres have been shown to reduce regional blood flow by transient arteriolar capillary block, thereby improving uptake of a co-administered drug when injected into the hepatic artery. In our study of five patients, we combined hepatic arterial perfusion scintigraphy (HAPS) and SPECT to assess the localization of approximately 1 X 10(5) labelled microspheres of human serum albumin (99Tcm MSA) in tumour. In addition, in three patients, we assessed the effect of an intra-arterial infusion of the vasoactive agent angiotensin II during HAPS. Results were interpreted by comparing transaxial slices with corresponding slices of a tin colloid liver-spleen scan. Two of five patients showed good localization of 99Tcm MSA in tumour without an angiotensin II infusion. Of the three patients receiving angiotensin II, all showed good tumour targetting with the vasoconstrictor compared with only one of these three before its use. Thus, hepatic arterial infusion of angiotensin II greatly improves microsphere localization in tumour in some patients with colorectal liver metastases. This technique may be useful in the assessment of tumour targetting before and during locoregional therapy.

  8. Memorias Conferencia Internacional IEEE Mexico 1971, Sobre Sistemas, Redes Y Computadoras. Volumen I and Volumen II. (Proceedings of International Conference of IEEE Concerning Systems, Networks, and Computers. Volume I and Volume II.

    Science.gov (United States)

    Concheiro, A. Alonso, Ed.; And Others

    The following papers in English from this international conference may be of particular interest to those in the field of education. T. Nakahara, A. Tsukamota, and M. Matsumoto describe a computer-aided design technique for an economical urban cable television system. W. D. Wasson and R. K. Chitkara outline a recognition scheme based on analysis…

  9. Assessment of CREAMS [Chemicals, Runoff, and Erosion from Agricultural Management Systems] and ERHYM-II [Ekalaka Rangeland Hydrology and Yield Model] computer models for simulating soil water movement on the Idaho National Engineering Laboratory

    International Nuclear Information System (INIS)

    Laundre, J.W.

    1990-05-01

    The major goal of radioactive waste management is long-term containment of radioactive waste. Long-term containment depends on understanding water movement on, into, and through trench caps. Several computer simulation models are available for predicting water movement; of these, CREAMS (Chemicals, Runoff, and Erosion from Agricultural Management Systems) and ERHYM-II (Ekalaka Rangeland Hydrology and Yield Model) were tested for use on the Idaho National Engineering Laboratory (INEL). The models were calibrated, tested for sensitivity, and used to evaluate some basic trench cap designs. Each model was used to postdict soil moisture, evapotranspiration, and runoff of two watersheds for which such data were already available. Sensitivity of the models was tested by adjusting various input parameters from high to low values and then comparing model outputs to those generated from average values. Ten input parameters of the CREAMS model were tested for sensitivity. 17 refs., 23 figs., 20 tabs
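
    The sensitivity test described here (perturbing each input from high to low and comparing against a run at average values) is a standard one-at-a-time scheme. A minimal sketch with a toy stand-in model, not CREAMS or ERHYM-II themselves:

      def oat_sensitivity(model, baseline, spans):
          # One-at-a-time sensitivity: vary each parameter low/high while
          # holding the others at their baseline (average) values, and report
          # the relative change in the model output.
          base_out = model(**baseline)
          results = {}
          for name, (low, high) in spans.items():
              results[name] = [(model(**dict(baseline, **{name: v})) - base_out)
                               / base_out for v in (low, high)]
          return results

      # Toy stand-in for a runoff model
      runoff = lambda rainfall, curve_number: rainfall * curve_number / 100.0
      print(oat_sensitivity(runoff,
                            {"rainfall": 50.0, "curve_number": 75.0},
                            {"rainfall": (30.0, 70.0), "curve_number": (60.0, 90.0)}))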

  10. Copper (II)

    African Journals Online (AJOL)

    CLEMENT O BEWAJI

    Valine (2-amino-3-methylbutanoic acid) is a chemical compound containing ... stability constant (Kf) and Gibbs free energy change (ΔG, J mol−1) of [CuL2(H2O)2] ... synthesis and characterization of Co(II), Ni(II), Cu(II), and Zn(II) complexes with ...

  11. Evaluation of ETOG-3Q, ETOG-3, FLANGE-II, XLACS, NJOY and LINEAR/RECENT/GROUPIE computer codes concerning to the resonance contribution and background cross sections

    International Nuclear Information System (INIS)

    Anaf, J.; Chalhoub, E.S.

    1988-12-01

    The NJOY and LINEAR/RECENT/GROUPIE calculational procedures for the resolved and unresolved resonance contributions and background cross sections are evaluated. Elastic scattering, fission and capture multigroup cross sections generated by these codes and the previously validated ETOG-3Q, ETOG-3, FLANGE-II and XLACS are compared. Constant weighting function and zero Kelvin temperature are considered. Discrepancies are presented and analysed. (author) [pt

  12. Bibliography of mass spectroscopy literature for 1972 compiled by a computer method. Volume II. Key Word Out of Context (KWOC) Index

    International Nuclear Information System (INIS)

    Capellen, J.; Svec, H.J.; Sage, C.R.; Sun, R.

    1975-08-01

    This report covers the year 1972, and lists approximately 10,000 articles of interest to mass spectroscopists. This two-volume report consists of three sections. Vol. II contains the Key Word Out of Context Index (KWOC Index) section. The KWOC Index lists the key words, the reference numbers of the articles in which the key word appears, and the first 100 characters of the title
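
    A KWOC index of this kind is mechanical to construct: every significant title word becomes a key pointing to the article's reference number and a truncated title. A minimal sketch, with an assumed (tiny) stopword list:

      from collections import defaultdict

      STOPWORDS = {"of", "the", "and", "for", "a", "in", "by"}

      def kwoc_index(titles):
          # keyword -> list of (reference number, first 100 characters of title)
          index = defaultdict(list)
          for ref, title in enumerate(titles, start=1):
              for word in {w.strip(".,").lower() for w in title.split()}:
                  if word and word not in STOPWORDS:
                      index[word].append((ref, title[:100]))
          return dict(index)

      titles = ["Mass spectroscopy of uranium isotopes",
                "Isotope ratios by mass spectrometry"]
      for keyword, refs in sorted(kwoc_index(titles).items()):
          print(keyword, refs)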

  13. RELAP4/MOD5: a computer program for transient thermal-hydraulic analysis of nuclear reactors and related systems. User's manual. Volume II. Program implementation

    International Nuclear Information System (INIS)

    1976-09-01

    This portion of the RELAP4/MOD5 User's Manual presents the details of setting up and entering the reactor model to be evaluated. The input card format and arrangement are presented in depth, including not only cards for data but also those for editing and restarting. Problem initialization, including pressure distribution and energy balance, is discussed. A section entitled "User Guidelines" is included to provide modeling recommendations, analysis and verification techniques, and computational difficulty resolution. The section is concluded with a discussion of the computer output form and format

  14. Computer program for Scatchard analysis of protein:ligand interaction - use for determination of soluble and nuclear steroid receptor concentrations

    International Nuclear Information System (INIS)

    Leake, R.; Cowan, S.; Eason, R.

    1998-01-01

    Steroid receptor concentration may be determined routinely in biopsy samples of breast and endometrial cancer by the competition method. This method yields data for both the soluble and nuclear fractions of the tissue. The data are usually subject to Scatchard analysis. This Appendix describes a computer program written initially for a PDP-11. It has been modified for use with IBM, Apple Macintosh and BBC microcomputers. The nature of the correction for competition is described and examples of the printout are given. The program is flexible and its use for different receptors is explained. The program can be readily adapted to other assays in which Scatchard analysis is appropriate
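
    Scatchard analysis itself is a linear fit: plotting bound/free against bound gives a line with slope -1/Kd and y-intercept Bmax/Kd. A minimal sketch with hypothetical binding data, not the PDP-11 program described here:

      import numpy as np

      def scatchard(bound, free):
          # Fit B/F = (Bmax - B)/Kd; slope = -1/Kd, intercept = Bmax/Kd.
          bound, free = np.asarray(bound, float), np.asarray(free, float)
          slope, intercept = np.polyfit(bound, bound / free, 1)
          Kd = -1.0 / slope
          Bmax = intercept * Kd
          return Kd, Bmax

      # Hypothetical saturation-binding data (bound: fmol/mg, free: nM)
      bound = [18.0, 30.0, 42.0, 52.0]
      free = [0.5, 1.2, 2.5, 5.0]
      Kd, Bmax = scatchard(bound, free)
      print(f"Kd = {Kd:.2f} nM, Bmax = {Bmax:.1f} fmol/mg")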

  15. CASY: a dynamic simulation of the gas-cooled fast breeder reactor core auxiliary cooling system. Volume II. Example computer run

    Energy Technology Data Exchange (ETDEWEB)

    1979-09-01

    A listing of a CASY computer run is presented. It was initiated from a demand terminal and, therefore, contains the identification ST0952. This run also contains an INDEX listing of the subroutine UPDATE. The run includes a simulated scram transient at 30 seconds.

  17. Relative Effectiveness of Computer-Supported Jigsaw II, STAD and TAI Cooperative Learning Strategies on Performance, Attitude, and Retention of Secondary School Students in Physics

    Science.gov (United States)

    Gambari, Amosa Isiaka; Yusuf, Mudasiru Olalere

    2017-01-01

    This study investigated the relative effectiveness of computer-supported cooperative learning strategies on the performance, attitudes, and retention of secondary school students in physics. A purposive sampling technique was used to select four senior secondary schools from Minna, Nigeria. The students were allocated to one of four groups:…

  18. Computational science - ICCS 2008: 8th international conference, Kraków, Poland, June 23-25, 2008: Proceedings, part II

    NARCIS (Netherlands)

    Bubak, M.; van Albada, G.D.; Dongarra, J.; Sloot, P.M.A.

    2008-01-01

    The three-volume set LNCS 5101-5103 constitutes the refereed proceedings of the 8th International Conference on Computational Science, ICCS 2008, held in Krakow, Poland in June 2008. The 167 revised papers of the main conference track presented together with the abstracts of 7 keynote talks and the

  19. Computation Modeling of Limb-bud Dysmorphogenesis: Predicting Cellular Dynamics and Key Events in Developmental Toxicity with a Multicellular Systems Model (FutureToxII)

    Science.gov (United States)

    Congenital limb malformations are among the most frequent malformations in humans, occurring with a frequency of about 1 in 500 to 1 in 1000 live births. ToxCast is profiling the bioactivity of thousands of chemicals based on high-throughput screening (HTS) and computational methods that...

  20. Software Reviews. Programs Worth a Second Look.

    Science.gov (United States)

    Schneider, Roxanne; Eiser, Leslie

    1989-01-01

    Reviewed are three computer software packages for use in middle/high school classrooms. Included are "MacWrite II," a word-processing program for Macintosh computers; "Super Story Tree," a word-processing program for Apple and IBM computers; and "Math Blaster Mystery," for IBM, Apple, and Tandy computers. (CW)

  1. Modelo computacional para suporte à decisão em áreas irrigadas. Parte II: testes e aplicação Computer model for decision support in irrigated areas. Part II: tests and application

    Directory of Open Access Journals (Sweden)

    Paulo A. Ferreira

    2006-12-01

    Apresentou-se, na Parte I desta pesquisa, o desenvolvimento de um modelo computacional denominado MCID, para suporte à tomada de decisão quanto ao planejamento e manejo de projetos de irrigação e/ou drenagem. Objetivou-se, na Parte II, testar e aplicar o MCID. No teste comparativo com o programa DRAINMOD, espaçamentos entre drenos, obtidos com o MCID, foram ligeiramente maiores ou idênticos. Os espaçamentos advindos com o MCID e o DRAINMOD foram consideravelmente maiores que os obtidos por meio de metodologias tradicionais de dimensionamento de sistemas de drenagem. A produtividade relativa total, YRT, obtida com o MCID foi, em geral, inferior à conseguida com o DRAINMOD, devido a diferenças de metodologia ao se estimar a produtividade da cultura em resposta ao déficit hídrico. Na comparação com o programa CROPWAT, obtiveram-se resultados muito próximos para YRT e evapotranspiração real. O modelo desenvolvido foi aplicado para as condições do Projeto Jaíba, MG, para culturas perenes e anuais cultivadas em diferentes épocas. Os resultados dos testes e aplicações indicaram a potencialidade do MCID como ferramenta de apoio à decisão em projetos de irrigação e/ou drenagem. Part I of this research presented the development of a decision support model, called MCID, for planning and managing irrigation and/or drainage projects. Part II is aimed at testing and applying MCID. In a comparative test with the DRAINMOD model, drain spacings obtained with MCID were slightly larger or identical. The spacings obtained with MCID and DRAINMOD were considerably larger than those obtained through traditional methodologies of design of drainage systems. The relative crop yield (YRT) obtained with MCID was, in general, lower than the one obtained with DRAINMOD due to differences in the estimate of crop response to water deficit. In comparison with CROPWAT, very close results for YRT and for actual evapotranspiration were obtained. The

  2. Analysis methods of neutrons induced resonances in the transmission experiments by time-of-flight and automation of these methods on IBM 7094 II computer

    International Nuclear Information System (INIS)

    Corge, C.

    1967-01-01

    The analysis of neutron-induced resonances aims to determine the neutron characteristics leading to the excitation energies, the de-excitation probabilities by gamma radiation emission, by neutron emission or by fission, their spin, their parity... This document describes the methods developed, or adapted, the calculation schemes and the algorithms implemented to perform such analyses on a computer, from data obtained during time-of-flight experiments on the linear accelerator of Saclay. (A.L.B.)

  3. (II) complexes

    African Journals Online (AJOL)

    activities of Schiff base tin(II) complexes. Neelofar ... Conclusion: All synthesized Schiff bases and their tin(II) complexes showed high antimicrobial and ... Singh HL. Synthesis and characterization of tin(II) complexes of fluorinated Schiff bases derived from amino acids. Spectrochim Acta Part A: Molec Biomolec.

  4. Can Early Computed Tomography Angiography after Endovascular Aortic Aneurysm Repair Predict the Need for Reintervention in Patients with Type II Endoleak?

    Energy Technology Data Exchange (ETDEWEB)

    Dudeck, O., E-mail: oliver.dudeck@med.ovgu.de [University of Magdeburg, Department of Radiology and Nuclear Medicine (Germany); Schnapauff, D. [Charité Universitätsmedizin Berlin, Department of Radiology (Germany); Herzog, L.; Löwenthal, D.; Bulla, K.; Bulla, B. [University of Magdeburg, Department of Radiology and Nuclear Medicine (Germany); Halloul, Z.; Meyer, F. [University of Magdeburg, Department of General, Visceral and Vascular Surgery (Germany); Pech, M. [University of Magdeburg, Department of Radiology and Nuclear Medicine (Germany); Gebauer, B. [Charité Universitätsmedizin Berlin, Department of Radiology (Germany); Ricke, J. [University of Magdeburg, Department of Radiology and Nuclear Medicine (Germany)

    2015-02-15

    Purpose: This study was designed to identify parameters on CT angiography (CTA) of type II endoleaks following endovascular aortic aneurysm repair (EVAR) for abdominal aortic aneurysm (AAA) which can be used to predict the subsequent need for reinterventions. Methods: We retrospectively identified 62 patients with type II endoleak who underwent early CTA a mean of 3.7 ± 1.9 days after EVAR. On the basis of follow-up examinations (mean follow-up period 911 days; range 373–1,987 days), patients were stratified into two groups: those who did (n = 18) and those who did not (n = 44) require reintervention. CTA characteristics, such as AAA, endoleak and nidus dimensions, patency of the inferior mesenteric artery, number of aortic branch vessels, and the pattern of endoleak appearance, were recorded and correlated with the clinical outcome. Results: Univariate and receiver operating characteristic curve regression analyses revealed significant differences between the two groups for the endoleak volume (surveillance group: 1391.6 ± 1427.9 mm³; reintervention group: 3227.7 ± 2693.8 mm³; cutoff value of 2,386 mm³; p = 0.002), the endoleak diameter (13.6 ± 4.3 mm compared with 25.9 ± 9.6 mm; cutoff value of 19 mm; p < 0.0001), the number of aortic branch vessels (2.9 ± 1.2 compared with 4.2 ± 1.4 vessels; p = 0.001), as well as a “complex type” endoleak pattern (13.6 %, n = 6 compared with 44.4 %, n = 8; p = 0.02). Conclusions: Early CTA can predict the future need for reintervention in patients with type II endoleak. Therefore, treatment decisions should be based not on aneurysm enlargement alone but also on other imaging characteristics.
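
    Cutoff values such as the 19 mm endoleak diameter are conventionally read off the ROC curve by maximizing Youden's J (sensitivity + specificity - 1). A minimal sketch on hypothetical data, not the study's patient records:

      import numpy as np

      def youden_cutoff(values, events):
          # Return the threshold maximizing sensitivity + specificity - 1.
          values = np.asarray(values, float)
          labels = np.asarray(events, bool)
          best_cut, best_j = None, -1.0
          for cut in np.unique(values):
              pred = values >= cut
              sens = (pred & labels).sum() / labels.sum()
              spec = (~pred & ~labels).sum() / (~labels).sum()
              if sens + spec - 1 > best_j:
                  best_cut, best_j = cut, sens + spec - 1
          return best_cut, best_j

      # Hypothetical endoleak diameters (mm) and reintervention outcomes
      diameters = [10, 12, 14, 15, 20, 22, 26, 30]
      reintervention = [0, 0, 0, 0, 1, 0, 1, 1]
      print(youden_cutoff(diameters, reintervention))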

  5. Numerical analysis of resonances induced by s wave neutrons in transmission time-of-flight experiments with a computer IBM 7094 II; Methodes d'analyse des resonances induites par les neutrons s dans les experiences de transmission par temps de vol et automatisation de ces methodes sur ordinateur IBM 7094 II

    Energy Technology Data Exchange (ETDEWEB)

    Corge, Ch [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1969-01-01

    Numerical analysis of transmission resonances induced by s-wave neutrons in time-of-flight experiments can be achieved in a fairly automatic way on an IBM 7094/II computer. The computations involved are carried out following a four-step scheme: 1 - experimental raw data are processed to obtain the resonant transmissions, 2 - values of experimental quantities for each resonance are derived from the above transmissions, 3 - resonance parameters are determined using a least-squares method to solve the overdetermined system obtained by equating theoretical functions to the corresponding experimental values; four analysis methods are gathered in the same code, 4 - graphical control of the results is performed. (author) [French] L'automatisation, sur ordinateur IBM 7094/II, de l'analyse des resonances induites par les neutrons s dans les experiences de transmission par temps de vol a ete accomplie en la decomposant selon un schema articule en quatre phases: 1 - le traitement des donnees experimentales brutes pour obtenir les transmissions interfero-resonnantes, 2 - la determination des grandeurs d'analyse a partir des transmissions precedentes, 3 - l'analyse proprement dite des resonances dont les parametres sont obtenus par la resolution d'un systeme surabondant. Quatre methodes d'analyse sont groupees en un meme programme, 4 - la procedure de verification graphique. (auteur)
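
    Step 3 of the scheme is an ordinary overdetermined least-squares problem. A minimal sketch of that step with a toy design matrix; a real analysis would build the matrix from resonance shape functions evaluated at each time-of-flight channel:

      import numpy as np

      rng = np.random.default_rng(0)
      A = rng.normal(size=(50, 3))                  # 50 observations, 3 parameters
      x_true = np.array([1.0, -2.0, 0.5])
      b = A @ x_true + 0.01 * rng.normal(size=50)   # noisy "measurements"

      # Solve the overdetermined system A x = b in the least-squares sense
      x_fit, residuals, rank, _ = np.linalg.lstsq(A, b, rcond=None)
      print("fitted parameters:", x_fit)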

  7. Computer Center: Software Review.

    Science.gov (United States)

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with color-graphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  8. The Belle II Experiment

    CERN Document Server

    Kahn, J

    2017-01-01

    Set to begin data taking at the end of 2018, the Belle II experiment is the next-generation B-factory experiment hosted at KEK in Tsukuba, Japan. The experiment represents the cumulative effort from the collaboration of experimental and detector physics, computing, and software development. Taking everything learned from the previous Belle experiment, which ran from 1998 to 2010, Belle II aims to probe deeper than ever before into the field of heavy quark physics. By achieving an integrated luminosity of 50 ab−1 and accumulating 50 times more data than the previous experiment across its lifetime, along with a rewritten analysis framework, the Belle II experiment will push the high precision frontier of high energy physics. This paper will give an overview of the key components and development activities that make the Belle II experiment possible.

  9. FRESCO-II: A computer program for analysis of fission product release from spherical HTGR-fuel elements in irradiation and annealing experiments

    International Nuclear Information System (INIS)

    Krohn, H.; Finken, R.

    1983-06-01

    The modular computer code FRESCO has been developed to describe the mechanism of fission product release from an HTGR core under accident conditions. By changing some program modules it has been extended to also take into account transport phenomena (i.e. recoil) which occur only under reactor operating conditions and during irradiation experiments. For this report, the release of cesium and strontium from three HTGR fuel elements has been evaluated and compared with the experimental data. The results show that the measured release can be described by the considered models. (orig.) [de

  10. RELAP4/MOD5: a computer program for transient thermal-hydraulic analysis of nuclear reactors and related systems. User's manual. Volume II. Program implementation

    International Nuclear Information System (INIS)

    1976-06-01

    A discussion is presented of the use of the RELAP4/MOD5 computer program in simulating the thermal-hydraulic behavior of light-water reactor systems when subjected to postulated transients such as a LOCA, pump failure, or nuclear excursion. The volume is divided into main sections which cover: (1) program description, (2) input data, (3) problem initialization, (4) user guidelines, (5) output discussion, (6) source program description, (7) implementation requirements, (8) data files, (9) description of PLOTR4M, (10) description of STH20, (11) summary flowchart, (12) sample problems, (13) problem definition, and (14) problem input

  11. Relationships (II) of International Classification of High-resolution Computed Tomography for Occupational and Environmental Respiratory Diseases with ventilatory functions indices for parenchymal abnormalities.

    Science.gov (United States)

    Tamura, Taro; Suganuma, Narufumi; Hering, Kurt G; Vehmas, Tapio; Itoh, Harumi; Akira, Masanori; Takashima, Yoshihiro; Hirano, Harukazu; Kusaka, Yukinori

    2015-01-01

    The International Classification of High-Resolution Computed Tomography (HRCT) for Occupational and Environmental Respiratory Diseases (ICOERD) is used to screen and diagnose respiratory illnesses. Using univariate and multivariate analysis, we investigated the relationship between subject characteristics and parenchymal abnormalities according to ICOERD, and the results of ventilatory function tests (VFT). Thirty-five patients with and 27 controls without mineral-dust exposure underwent VFT and HRCT. We recorded all subjects' occupational history of mineral dust exposure and smoking history. Experts independently assessed HRCT using the ICOERD parenchymal abnormality (Items) grades for well-defined rounded opacities (RO), linear and/or irregular opacities (IR), and emphysema (EM). HRCT showed that 11 patients had RO; 15 patients, IR; and 19 patients, EM. According to the multiple regression model, age and height had significant associations with many ventilatory function indices such as vital capacity, forced vital capacity, and forced expiratory volume in 1 s (FEV1). The EM summed grades for the upper, middle, and lower zones of the right and left lungs also had significant associations with FEV1 and the maximum mid-expiratory flow rate. The results suggest the ICOERD notation is adequate, based on the good and significant multiple regression modeling of ventilatory function with the EM summed grades.

  12. Computational screening of Six Antigens for potential MHC class II restricted epitopes and evaluating its CD4+ T-Cell Responsiveness against Visceral Leishmaniasis

    Directory of Open Access Journals (Sweden)

    Manas Ranjan

    2017-12-01

    Visceral leishmaniasis is one of the most neglected tropical diseases, for which no vaccine exists. In spite of extensive efforts, no successful vaccine is available against this dreadful infectious disease. To support vaccine development, an immunoinformatics approach was applied to search for potential MHC class II restricted epitopes that can activate immune cells. Initially, a total of 37 epitopes derived from six stage-dependently overexpressed antigens were predicted, which were presented by at least 26 diverse MHC class II alleles including: DRB10101, DRB10301, DRB10401, DRB10404, DRB10405, DRB10701, DRB10802, DRB10901, DRB11101, DRB11302, DRB11501, DRB30101, DRB40101, DRB50101, DPA10103-DPB10401, DPA10103-DPB10201, DPA10201-DPB10101, DPA10103-DPB10301_DPB10401, DPA10301-DPB10402, DPA10201-DPB105021, DQA10102-DQB10602, DQA10401-DQB10402, DQA10501-QB10201, DQA10501-DQB10301, DQA10301-DQB10302 and DQA10101-DQB10501. Based on the population coverage analysis and HLA cross-presentation ability, six epitopes, namely FDLFLFSNGAVVWWG (P1), YPVYPFLASNAALLN (P2), VYPFLASNAALLNLI (P3), LALLIMLYALIATQF (P4), LIMLYALIATQFSDD (P5), IMLYALIATQFSDDA (P6), were selected for further analysis. Stimulation with synthetic peptide alone or as a cocktail triggered intracellular IFN-γ production. Moreover, specific IgG-class antibodies were detected in the serum of active VL cases against P1, P4, P and P6, in order to evaluate the peptide effect on the humoral immune response. Additionally, most of the peptides, except P2, were found to be non-inducers of CD4+ IL-10 in both active VL and treated VL subjects. Peptide immunogenicity was validated in BALB/c mice immunized with a cocktail of synthetic peptides emulsified in complete Freund's adjuvant/incomplete Freund's adjuvant. The immunized splenocytes induced strong spleen cell proliferation upon parasite re-stimulation. Furthermore, increased IFN-γ, IL-12, IL-17 and IL-22 production augmented with

  13. Conception of a course for professional training and education in the field of computer and mobile forensics: Part II: Android Forensics

    Science.gov (United States)

    Kröger, Knut; Creutzburg, Reiner

    2013-03-01

    The growth of Android in the mobile sector and the interest in investigating these devices from a forensic point of view have rapidly increased. Many companies have security problems with mobile devices in their own IT infrastructure. To respond to these incidents, it is important to have professionally trained staff. Furthermore, it is necessary to further train existing employees in the practical applications of mobile forensics, owing to the fact that many companies are entrusted with very sensitive data. Inspired by these facts, this paper - a continuation of a paper of January 2012 [1] which showed the conception of a course for professional training and education in the field of computer and mobile forensics - addresses training approaches and practical exercises to investigate Android mobile devices.

  14. A comparison of the COG and MCNP codes in computational neutron capture therapy modeling, Part II: gadolinium neutron capture therapy models and therapeutic effects.

    Science.gov (United States)

    Wangerin, K; Culbertson, C N; Jevremovic, T

    2005-08-01

    The goal of this study was to evaluate the COG Monte Carlo radiation transport code, developed and tested by Lawrence Livermore National Laboratory, for gadolinium neutron capture therapy (GdNCT) related modeling. The validity of the COG NCT model had been established previously; here the calculation was extended to analyze the effect of various gadolinium concentrations on the dose distribution and cell-kill effect of the GdNCT modality and to determine the optimum therapeutic conditions for treating brain cancers. The computational results were compared with those of the widely used MCNP code. The differences between the COG and MCNP predictions were generally small and suggest that the COG code can be applied to similar research problems in NCT. Results of this study also showed that a concentration of 100 ppm gadolinium in the tumor was most beneficial when using an epithermal neutron beam.

  15. Computational electrodynamics in material media with constraint-preservation, multidimensional Riemann solvers and sub-cell resolution - Part II, higher order FVTD schemes

    Science.gov (United States)

    Balsara, Dinshaw S.; Garain, Sudip; Taflove, Allen; Montecinos, Gino

    2018-02-01

    The Finite Difference Time Domain (FDTD) scheme has served the computational electrodynamics community very well and part of its success stems from its ability to satisfy the constraints in Maxwell's equations. Even so, in the previous paper of this series we were able to present a second order accurate Godunov scheme for computational electrodynamics (CED) which satisfied all the same constraints and simultaneously retained all the traditional advantages of Godunov schemes. In this paper we extend the Finite Volume Time Domain (FVTD) schemes for CED in material media to better than second order of accuracy. From the FDTD method, we retain a somewhat modified staggering strategy of primal variables which enables a very beneficial constraint-preservation for the electric displacement and magnetic induction vector fields. This is accomplished with constraint-preserving reconstruction methods which are extended in this paper to third and fourth orders of accuracy. The idea of one-dimensional upwinding from Godunov schemes has to be significantly modified to use the multidimensionally upwinded Riemann solvers developed by the first author. In this paper, we show how they can be used within the context of a higher order scheme for CED. We also report on advances in timestepping. We show how Runge-Kutta IMEX schemes can be adapted to CED even in the presence of stiff source terms brought on by large conductivities as well as strong spatial variations in permittivity and permeability. We also formulate very efficient ADER timestepping strategies to endow our method with sub-cell resolving capabilities. As a result, our method can be stiffly-stable and resolve significant sub-cell variation in the material properties within a zone. Moreover, we present ADER schemes that are applicable to all hyperbolic PDEs with stiff source terms and at all orders of accuracy. Our new ADER formulation offers a treatment of stiff source terms that is much more efficient than previous ADER
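
    For orientation, the staggering of primal variables that the authors retain from FDTD is already visible in the classic 1D Yee update, where E and H live on interleaved grids. A minimal sketch of that baseline scheme in vacuum and normalized units (this is plain second-order FDTD, not the higher-order FVTD method of the paper):

      import numpy as np

      n, steps, courant = 200, 400, 0.5
      E = np.zeros(n)                    # E on integer grid points
      H = np.zeros(n - 1)                # H staggered half a cell away
      E[n // 2] = 1.0                    # initial pulse

      for _ in range(steps):
          H += courant * (E[1:] - E[:-1])        # update H from the curl of E
          E[1:-1] += courant * (H[1:] - H[:-1])  # update E from the curl of H

      print("peak |E| after propagation:", np.abs(E).max())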

  16. A comparative evaluation of Cone Beam Computed Tomography (CBCT) and Multi-Slice CT (MSCT). Part II: On 3D model accuracy

    International Nuclear Information System (INIS)

    Liang Xin; Lambrichts, Ivo; Sun Yi; Denis, Kathleen; Hassan, Bassam; Li Limin; Pauwels, Ruben; Jacobs, Reinhilde

    2010-01-01

    Aim: The study aim was to compare the geometric accuracy of three-dimensional (3D) surface model reconstructions between five Cone Beam Computed Tomography (CBCT) scanners and one Multi-Slice CT (MSCT) system. Materials and methods: A dry human mandible was scanned with five CBCT systems (NewTom 3G, Accuitomo 3D, i-CAT, Galileos, Scanora 3D) and one MSCT scanner (Somatom Sensation 16), and a 3D surface bone model was created from each of the six systems. The reference (gold standard) 3D model was obtained with a high-resolution laser surface scanner. The 3D models from the six systems were compared with the gold standard using a point-based rigid registration algorithm. Results: The mean deviation from the gold standard for MSCT was 0.137 mm; for CBCT the deviations were 0.282, 0.225, 0.165, 0.386 and 0.206 mm for the i-CAT, Accuitomo, NewTom, Scanora and Galileos, respectively. Conclusion: The results show that the accuracy of CBCT 3D surface model reconstructions is somewhat lower than that of MSCT, but acceptable, relative to the gold standard.
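
    Point-based rigid registration of this kind is commonly done with the SVD-based Kabsch algorithm, after which the mean point-to-point deviation can be reported. A minimal sketch on toy corresponding points, not the study's surface models:

      import numpy as np

      def rigid_register(P, Q):
          # Kabsch: rotation R and translation t minimizing ||P R^T + t - Q||
          P, Q = np.asarray(P, float), np.asarray(Q, float)
          Pc, Qc = P - P.mean(0), Q - Q.mean(0)
          U, _, Vt = np.linalg.svd(Pc.T @ Qc)
          d = np.sign(np.linalg.det(U @ Vt))     # guard against reflections
          R = (U @ np.diag([1.0, 1.0, d]) @ Vt).T
          t = Q.mean(0) - R @ P.mean(0)
          return R, t

      # Toy corresponding points: the "model" is the gold standard shifted 0.1 mm
      P = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1.0]])
      Q = P + 0.1
      R, t = rigid_register(P, Q)
      aligned = P @ R.T + t
      print("mean deviation (mm):", np.linalg.norm(aligned - Q, axis=1).mean())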

  17. Estimation of energetic efficiency of heat supply in front of the aircraft at supersonic accelerated flight. Part II. Mathematical model of the trajectory boost part and computational results

    Science.gov (United States)

    Latypov, A. F.

    2009-03-01

    The fuel economy along the boost trajectory of an aerospace plane was estimated for the case of energy supply to the free stream. Initial and final flight velocities were given. A model of gliding flight above cold air in an infinite isobaric thermal wake was used. Fuel consumption was compared along optimal trajectories. The calculations were done for a combined power plant consisting of a ramjet and a liquid-propellant engine. An exergy model was constructed in the first part of the paper for estimating the ramjet thrust and specific impulse. To estimate the aerodynamic drag of the aircraft, a quadratic dependence on aerodynamic lift is used. The energy for flow heating is obtained at the expense of an equivalent decrease in the exergy of combustion products. Dependencies are obtained for the increase of the range coefficient of cruise flight at different Mach numbers. In the second part of the paper, a mathematical model is presented for the boost part of the flight trajectory of the flying vehicle, together with computational results on reducing the fuel expenses along the boost trajectory for a given value of the energy supplied in front of the aircraft.

  18. Launch Site Computer Simulation and its Application to Processes

    Science.gov (United States)

    Sham, Michael D.

    1995-01-01

    This paper provides an overview of computer simulation, the Lockheed developed STS Processing Model, and the application of computer simulation to a wide range of processes. The STS Processing Model is an icon driven model that uses commercial off the shelf software and a Macintosh personal computer. While it usually takes one year to process and launch 8 space shuttles, with the STS Processing Model this process is computer simulated in about 5 minutes. Facilities, orbiters, or ground support equipment can be added or deleted and the impact on launch rate, facility utilization, or other factors measured as desired. This same computer simulation technology can be used to simulate manufacturing, engineering, commercial, or business processes. The technology does not require an 'army' of software engineers to develop and operate, but instead can be used by the layman with only a minimal amount of training. Instead of making changes to a process and realizing the results after the fact, with computer simulation, changes can be made and processes perfected before they are implemented.

  20. Integrating Xgrid into the HENP distributed computing model

    Science.gov (United States)

    Hajdu, L.; Kocoloski, A.; Lauret, J.; Miller, M.

    2008-07-01

    Modern Macintosh computers feature Xgrid, a distributed computing architecture built directly into Apple's OS X operating system. While the approach is radically different from those generally expected by the Unix based Grid infrastructures (Open Science Grid, TeraGrid, EGEE), opportunistic computing on Xgrid is nonetheless a tempting and novel way to assemble a computing cluster with a minimum of additional configuration. In fact, it requires only the default operating system and authentication to a central controller from each node. OS X also implements arbitrarily extensible metadata, allowing an instantly updated file catalog to be stored as part of the filesystem itself. The low barrier to entry allows an Xgrid cluster to grow quickly and organically. This paper and presentation will detail the steps that can be taken to make such a cluster a viable resource for HENP research computing. We will further show how to provide to users a unified job submission framework by integrating Xgrid through the STAR Unified Meta-Scheduler (SUMS), making tasks and jobs submission effortlessly at reach for those users already using the tool for traditional Grid or local cluster job submission. We will discuss additional steps that can be taken to make an Xgrid cluster a full partner in grid computing initiatives, focusing on Open Science Grid integration. MIT's Xgrid system currently supports the work of multiple research groups in the Laboratory for Nuclear Science, and has become an important tool for generating simulations and conducting data analyses at the Massachusetts Institute of Technology.

  1. Computational analysis of neutronic parameters for TRIGA Mark-II research reactor using evaluated nuclear data libraries ENDF/B-VII.0 and JENDL-3.3

    International Nuclear Information System (INIS)

    Altaf, M.H.; Badrun, N.H.; Chowdhury, M.T.

    2015-01-01

    Highlights: • The SRAC-PIJ and SRAC-CITATION codes have been utilized to model the core. • Most of the simulated results show no significant differences from references. • Thermal peak flux varies a bit due to the up condition of TRIGA. • The ENDF/B-VII.0 and JENDL-3.3 libraries perform well for neutronics analysis of TRIGA. - Abstract: Important kinetic parameters such as the effective multiplication factor k_eff, excess reactivity, neutron flux and power distribution, and power peaking factors of the TRIGA Mark II research reactor in Bangladesh have been calculated using the comprehensive neutronics calculation code system SRAC 2006 with the evaluated nuclear data libraries ENDF/B-VII.0 and JENDL-3.3. In the code system, the PIJ code was employed to obtain cross sections of the core cells, followed by the integral calculation of neutronic parameters of the reactor conducted by the CITATION code. All the analyses were performed using the 7-group macroscopic cross section library. Results were compared to the experimental data, the safety analysis report (SAR) of the reactor provided by General Atomic, as well as to the values simulated by the numerically benchmarked MCNP4C, WIMS-CITATION and SRAC-CITATION codes. The maximum power densities at the hot spot were found to be 169.7 W/cc and 170.1 W/cc for the data libraries ENDF/B-VII.0 and JENDL-3.3, respectively. Similarly, the total peaking factors based on ENDF/B-VII.0 and JENDL-3.3 were calculated as 5.68 and 5.70, respectively, which were compared to the original SAR value of 5.63 as well as to the MCNP4C, WIMS-CITATION and SRAC-CITATION results. In most cases the calculated results demonstrate good agreement with our experiments and published works. This analysis therefore benchmarks the code system and will be helpful for further neutronics and thermal-hydraulics studies of the reactor
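
    Of the quantities listed, excess reactivity follows directly from the effective multiplication factor via rho = (k_eff - 1)/k_eff. A minimal sketch with assumed illustrative numbers, not values from this paper:

      # Excess reactivity from k_eff, in %dk/k and in dollars (rho / beta_eff)
      k_eff = 1.0705        # hypothetical fresh-core value
      beta_eff = 0.007      # assumed effective delayed-neutron fraction

      rho = (k_eff - 1.0) / k_eff
      print(f"excess reactivity: {100 * rho:.2f} %dk/k = {rho / beta_eff:.2f} $")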

  2. Computationally efficient analysis of particle transport and deposition in a human whole-lung-airway model. Part II: Dry powder inhaler application.

    Science.gov (United States)

    Kolanjiyil, Arun V; Kleinstreuer, Clement; Sadikot, Ruxana T

    2017-05-01

    Pulmonary drug delivery is becoming a favored route for administering drugs to treat both lung and systemic diseases. Examples of lung diseases include asthma, cystic fibrosis and chronic obstructive pulmonary disease (COPD), as well as acute respiratory distress syndrome (ARDS) and pulmonary fibrosis. Special respiratory drugs are administered to the lungs using an appropriate inhaler device. Next to the pressurized metered-dose inhaler (pMDI), the dry powder inhaler (DPI) is a frequently used device because of its good drug stability and minimal need for patient coordination. Specific DPI designs and operations greatly affect drug-aerosol formation and hence local lung deposition. Simulating the fluid-particle dynamics after use of a DPI allows for the assessment of drug-aerosol deposition and can also assist in improving the device configuration and operation. In Part I of this study a first-generation whole-lung-airway model (WLAM) was introduced and discussed to analyze particle transport and deposition in a human respiratory tract model. In the present Part II the drug aerosols are assumed to be injected into the lung airways from a DPI mouthpiece, forming the mouth inlet. The total as well as regional particle depositions in the WLAM, as inhaled from a DPI, were successfully compared with experimental data sets reported in the open literature. The validated modeling methodology was then employed to study the delivery of curcumin aerosols into lung airways using a commercial DPI. Curcumin has been implicated to possess high therapeutic potential as an antioxidant, anti-inflammatory and anti-cancer agent. However, the efficacy of curcumin treatment is limited because of the low bioavailability of curcumin when ingested. Hence, alternative drug administration techniques, e.g., using inhalable curcumin aerosols, are under investigation. Based on the present results, it can be concluded that use of a DPI leads to low lung deposition efficiencies because large amounts of

  3. Graphics gems II

    CERN Document Server

    Arvo, James

    1991-01-01

    Graphics Gems II is a collection of articles shared by a diverse group of people that reflect ideas and approaches in graphics programming which can benefit other computer graphics programmers.This volume presents techniques for doing well-known graphics operations faster or easier. The book contains chapters devoted to topics on two-dimensional and three-dimensional geometry and algorithms, image processing, frame buffer techniques, and ray tracing techniques. The radiosity approach, matrix techniques, and numerical and programming techniques are likewise discussed.Graphics artists and comput

  4. Comparative study of open and arthroscopic coracoid transfer for shoulder anterior instability (Latarjet)-computed tomography evaluation at a short term follow-up. Part II.

    Science.gov (United States)

    Kordasiewicz, Bartłomiej; Kicinski, Maciej; Małachowski, Konrad; Wieczorek, Janusz; Chaberek, Sławomir; Pomianowski, Stanisław

    2018-05-01

    The aim of this study was to evaluate and compare the radiological parameters after the arthroscopic and open Latarjet techniques via evaluation of computed tomography (CT) scans. Our hypothesis was that the radiological results after arthroscopic stabilisation remained in the proximity of those achieved after open stabilisation. CT scan evaluations of patients after a primary Latarjet procedure were analysed. Patients operated on between 2006 and 2011 using an open technique composed the OPEN group, and patients operated on arthroscopically between 2011 and 2013 composed the ARTHRO group. Forty-three out of 55 shoulders (78.2%) in OPEN and 62 out of 64 shoulders (95.3%) in ARTHRO were available for CT scan evaluation. The average age at surgery was 28 years in OPEN and 26 years in ARTHRO. The mean follow-up was 54.2 months in OPEN and 23.4 months in ARTHRO. CT scan evaluation was used to assess graft fusion and osteolysis. Bone block position and screw orientation were assessed in the axial and the sagittal views. Subscapularis muscle fatty infiltration was evaluated according to the Goutallier classification. The non-union rate was significantly higher in OPEN than in ARTHRO: 5 (11.9%) versus 1 (1.7%) (p < …). … in the OPEN group: five cases (11.9%) versus zero in ARTHRO (p < …). … in the OPEN group (p > 0.05). These results should be evaluated very carefully due to the significant difference in the follow-up of both groups. A significantly higher rate of partial graft osteolysis at the level of the superior screw was reported in ARTHRO, with 32 patients (53.3%) versus 10 (23.8%) in OPEN (p < …). … of patients in OPEN had the coracoid bone block in an acceptable position (between 4 mm medially and 2 mm laterally). In the sagittal plane, the bone block was in an acceptable position between 2 and 5 o'clock in 86.7% of patients in ARTHRO and 90.2% in OPEN (p > 0.05). However, in the position between 3 and 5 o'clock there were 56.7% of the grafts in ARTHRO versus 87.8% in OPEN (p < …). … in the OPEN group

  5. Computational Analysis of Nuclear Safety Parameters of 3 MW TRIGA Mark-II Research Reactor Based on Evaluated Nuclear Data Libraries JENDL-3.3 and ENDF/B-VII.0

    International Nuclear Information System (INIS)

    Khan, Jahirul Haque

    2013-01-01

    The objective of this study is to explain the main nuclear safety parameters of the 3 MW TRIGA Mark-II research reactor at AERE, Savar, Dhaka, Bangladesh from the viewpoint of reactor safety and of the reactor operator. The most important nuclear reactor physics safety parameters are the power distribution, power peaking factors, shutdown margin, control rod worth, excess reactivity and fuel temperature reactivity coefficient. These parameters are calculated using a chain of computer codes: SRAC-PIJ for cell calculations based on neutron transport theory, and SRAC-CITATION for core calculations based on the neutron diffusion equation. To achieve this objective the TRIGA model is developed with the 3-D diffusion code SRAC-CITATION, based on the group constants that come from the collision probability transport code SRAC-PIJ. In this study the evaluated nuclear data libraries JENDL-3.3 and ENDF/B-VII.0 are used. The calculated reactor physics parameters are compared to the safety analysis report (SAR) values as well as to earlier published MCNP results (a numerical benchmark). The calculated results show good agreement between the said libraries and, in most cases, reveal reasonable agreement with the SAR values (by General Atomic) as well as with the MCNP results. In addition, this analysis can be used as input for thermal-hydraulic calculations of the TRIGA fresh core in steady-state and pulse-mode operation. Power peaking factors, power distributions and temperature reactivity coefficients are among the most important reactor safety parameters for normal operation and transient safety analysis in research as well as in power reactors; they form the basis for technical specifications and limitations for reactor operation, such as loading pattern limitations for pulse operation (in TRIGA). Therefore, this analysis will be very important for developing the nuclear safety parameter data of the 3 MW TRIGA Mark-II

  6. Accelerators and Beams, multimedia computer-based training in accelerator physics

    International Nuclear Information System (INIS)

    Silbar, R.R.; Browman, A.A.; Mead, W.C.; Williams, R.A.

    1999-01-01

    We are developing a set of computer-based tutorials on accelerators and charged-particle beams under an SBIR grant from the DOE. These self-paced, interactive tutorials, available for Macintosh and Windows platforms, use multimedia techniques to enhance the user's rate of learning and length of retention of the material. They integrate interactive On-Screen Laboratories, hypertext, line drawings, photographs, two- and three-dimensional animations, video, and sound. They target a broad audience, from undergraduates or technicians to professionals. Presently, three modules have been published (Vectors, Forces, and Motion), a fourth (Dipole Magnets) has been submitted for review, and three more exist in prototype form (Quadrupoles, Matrix Transport, and Properties of Charged-Particle Beams). Participants in the poster session will have the opportunity to try out these modules on a laptop computer. copyright 1999 American Institute of Physics

  8. EASI graphics - Version II

    International Nuclear Information System (INIS)

    Allensworth, J.A.

    1984-04-01

    EASI (Estimate of Adversary Sequence Interruption) is an analytical technique for measuring the effectiveness of physical protection systems. EASI Graphics is a computer graphics extension of EASI which provides a capability for performing sensitivity and trade-off analyses of the parameters of a physical protection system. This document reports on the implementation of the Version II of EASI Graphics and illustrates its application with some examples. 5 references, 15 figures, 6 tables
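
    EASI-style calculations combine, for each sensing opportunity along the adversary path, the probability of first detection there with the probability that the response arrives before the adversary finishes. A heavily simplified sketch under those assumptions (not the actual EASI code or its exact formulation):

      import math

      def p_interruption(p_detect, p_comm, time_remaining, rft_mean, rft_sd):
          # p_detect[i]: detection probability at sensing point i
          # p_comm: probability an alarm is communicated to the response force
          # time_remaining[i]: adversary task time left after point i (s)
          # response force time modeled as normal(rft_mean, rft_sd)
          def p_response_in(t):
              return 0.5 * (1 + math.erf((t - rft_mean) / (rft_sd * math.sqrt(2))))

          p_not_detected_yet, total = 1.0, 0.0
          for pd, t in zip(p_detect, time_remaining):
              p_first_detection = p_not_detected_yet * pd * p_comm
              total += p_first_detection * p_response_in(t)
              p_not_detected_yet *= 1.0 - pd * p_comm
          return total

      # Hypothetical path with three detection points
      print(p_interruption([0.9, 0.5, 0.3], 0.95, [300, 120, 40], 150, 30))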

  9. Chemical speciation of Pb(II), Cd(II), Hg(II), Co(II), Ni(II), Cu(II) and Zn(II) binary complexes of L-methionine in 1,2-propanediol-water mixtures

    Directory of Open Access Journals (Sweden)

    M. Padma Latha

    2007-04-01

    Chemical speciation of Pb(II), Cd(II), Hg(II), Co(II), Ni(II), Cu(II) and Zn(II) complexes of L-methionine in 0.0-60% v/v 1,2-propanediol-water mixtures, maintaining an ionic strength of 0.16 M at 303 K, has been studied pH-metrically. The active forms of the ligand are LH2+, LH and L−. The predominant species detected are ML, MLH, ML2, ML2H, ML2H2 and MLOH. Models containing different numbers of species were refined using the computer program MINIQUAD 75. The best-fit chemical models were arrived at based on statistical parameters. The trend in the variation of complex stability constants with the change in dielectric constant of the medium is explained on the basis of electrostatic and non-electrostatic forces.
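
    Once the overall formation constants are refined (here by MINIQUAD 75), the speciation diagram follows from them directly: the fraction of each species at a given free-ligand concentration is its beta_n [L]^n term normalized by the sum over all species. A minimal sketch with hypothetical log beta values, not the paper's refined constants:

      import numpy as np

      def species_fractions(log_betas, free_L):
          # Fractions of M, ML, ML2, ... from overall formation constants
          # beta_n = [MLn] / ([M] [L]^n) at free-ligand concentration free_L.
          betas = 10.0 ** np.asarray(log_betas, float)
          n = np.arange(1, len(betas) + 1)
          terms = np.concatenate(([1.0], betas * free_L ** n))
          return terms / terms.sum()

      # Hypothetical log beta_1, log beta_2 for an M-L amino acid system
      print(species_fractions([8.0, 14.9], free_L=1e-7))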

  10. Parallelism in matrix computations

    CERN Document Server

    Gallopoulos, Efstratios; Sameh, Ahmed H

    2016-01-01

    This book is primarily intended as a research monograph that could also be used in graduate courses for the design of parallel algorithms in matrix computations. It assumes general but not extensive knowledge of numerical linear algebra, parallel architectures, and parallel programming paradigms. The book consists of four parts: (I) Basics; (II) Dense and Special Matrix Computations; (III) Sparse Matrix Computations; and (IV) Matrix Functions and Characteristics. Part I deals with parallel programming paradigms and fundamental kernels, including reordering schemes for sparse matrices. Part II is devoted to dense matrix computations such as parallel algorithms for solving linear systems, linear least squares, the symmetric algebraic eigenvalue problem, and the singular-value decomposition. It also deals with the development of parallel algorithms for special linear systems such as banded, Vandermonde, Toeplitz, and block Toeplitz systems. Part III addresses sparse matrix computations: (a) the development of pa…

  11. Desktop Social Science: Coming of Age.

    Science.gov (United States)

    Dwyer, David C.; And Others

    Beginning in 1985, Apple Computer, Inc. and several school districts began a collaboration to examine the impact of intensive computer use on instruction and learning in K-12 classrooms. This paper follows the development of a Macintosh II-based management and retrieval system for text data undertaken to store and retrieve oral reflections of…

  12. Visualizing Infrared (IR) Spectroscopy with Computer Animation

    Science.gov (United States)

    Abrams, Charles B.; Fine, Leonard W.

    1996-01-01

    IR Tutor, an interactive, animated infrared (IR) spectroscopy tutorial, has been developed for Macintosh and IBM-compatible computers. Using unique color animation, complicated vibrational modes can be introduced to beginning students. Rules governing the appearance of IR absorption bands become obvious because the vibrational modes can be visualized. Each peak in the IR spectrum is highlighted, and the animation of the corresponding normal mode can be shown. Students can study each spectrum stepwise, or click on any individual peak to see its assignment. Important regions of each spectrum can be expanded, and spectra can be overlaid for comparison. An introduction to the theory of IR spectroscopy is included, making the program a complete instructional package. Our own success in using this software for teaching and research in both academic and industrial environments will be described. IR Tutor consists of three sections: (1) The 'Introduction' is a review of basic principles of spectroscopy. (2) 'Theory' begins with the classical model of a simple diatomic molecule and is expanded to include larger molecules by introducing normal modes and group frequencies. (3) 'Interpretation' is the heart of the tutorial. Thirteen IR spectra are analyzed in detail, covering the most important functional groups. This section features color animation of each normal mode, full interactivity, overlay of related spectra, and expansion of important regions. This section can also be used as a reference.
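
    The frequencies such a tutorial animates come from the harmonic-oscillator model: the wavenumber of a diatomic stretch is (1/2 pi c) sqrt(k/mu). A minimal sketch for carbon monoxide, whose force constant is taken here as an assumed textbook value:

      import math

      k = 1855.0                          # force constant, N/m (assumed)
      m_C, m_O = 12.0, 15.995             # atomic masses, u
      u = 1.66054e-27                     # kg per atomic mass unit
      c = 2.99792458e10                   # speed of light, cm/s

      mu = (m_C * m_O) / (m_C + m_O) * u  # reduced mass, kg
      nu = math.sqrt(k / mu) / (2 * math.pi * c)
      print(f"predicted CO stretch: {nu:.0f} cm^-1")  # close to the observed ~2143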

  13. Predictive Models and Computational Toxicology (II IBAMTOX)

    Science.gov (United States)

    EPA’s ‘virtual embryo’ project is building an integrative systems biology framework for predictive models of developmental toxicity. One schema involves a knowledge-driven adverse outcome pathway (AOP) framework utilizing information from public databases, standardized ontologies...

  14. Computer-Controlled Force Generator, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — TDA Research, Inc. is developing a compact, low power, Next-Generation Exercise Device (NGRED) that can generate any force between 5 and 600 lbf. We use a closed...

  15. TBscore II

    DEFF Research Database (Denmark)

    Rudolf, Frauke; Lemvik, Grethe; Abate, Ebba

    2013-01-01

    Abstract Background: The TBscore, based on simple signs and symptoms, was introduced to predict unsuccessful outcome in tuberculosis patients on treatment. A recent inter-observer variation study showed profound variation in some variables. Further, some variables depend on a physician assessing them, making the score less applicable. The aim of the present study was to simplify the TBscore. Methods: Inter-observer variation assessment and exploratory factor analysis were combined to develop a simplified score, the TBscore II. To validate TBscore II we assessed the association between start

  16. Optical RISC computer

    Science.gov (United States)

    Guilfoyle, Peter S.; Stone, Richard V.; Hessenbruch, John M.; Zeise, Frederick F.

    1993-07-01

    A second generation digital optical computer (DOC II) has been developed which utilizes a RISC-based operating system as its host. This 32-bit, high-performance (12.8 GByte/sec) computing platform demonstrates a number of basic principles that are inherent to parallel free-space optical interconnects, such as speed (up to 10^12 bit operations per second) and low power (1.2 fJ per bit). Although DOC II is a general purpose machine, special purpose applications have been developed and are currently being evaluated on the optical platform.

  17. Pb II

    African Journals Online (AJOL)

    Windows User

    This investigation describes the use of non-living biomass of Aspergillus caespitosus for removal of ... Pb(II) production has exceeded 3.5 million tons per year. It has been used in the ... This biomass was selected after screening a wide range of microbes. .... prolonged, which proved better biopolymer in metal uptake (Gadd ...

  18. UIMX: A User Interface Management System For Scientific Computing With X Windows

    Science.gov (United States)

    Foody, Michael

    1989-09-01

    Applications with iconic user interfaces (for example, interfaces with pulldown menus, radio buttons, and scroll bars), such as those found on Apple's Macintosh computer and the IBM PC under Microsoft's Presentation Manager, have become very popular, and for good reason. They are much easier to use than applications with traditional keyboard-oriented interfaces, so training costs are much lower and just about anyone can use them. They are standardized between applications, so once you learn one application you are well along the way to learning another. Using one application reinforces the interface elements common to all of them, so you remember how to use them longer. Finally, for the developer, their support costs can be much lower because of their ease of use.

  19. Overview of the DIII-D program computer systems

    International Nuclear Information System (INIS)

    McHarg, B.B. Jr.

    1997-11-01

    Computer systems pervade every aspect of the DIII-D National Fusion Research program. This includes real-time systems acquiring experimental data from data acquisition hardware; CPU server systems performing short term and long term data analysis; desktop activities such as word processing, spreadsheets, and scientific paper publication; and systems providing mechanisms for remote collaboration. The DIII-D network ties all of these systems together and connects to the ESNET wide area network. This paper will give an overview of these systems, including their purposes and functionality and how they connect to other systems. Computer systems include seven different types of UNIX systems (HP-UX, REALIX, SunOS, Solaris, Digital UNIX, Ultrix, and IRIX), OpenVMS systems (both VAX and Alpha), Macintosh, Windows 95, and more recently Windows NT systems. Most of the network internally is ethernet with some use of FDDI. A T3 link connects to ESNET and thus to the Internet. Recent upgrades to the network have notably improved its efficiency, but the demand for bandwidth is ever increasing. By means of software and mechanisms still in development, computer systems at remote sites are playing an increasing role both in accessing and analyzing data and even participating in certain controlling aspects for the experiment. The advent of audio/video over the Internet is now presenting a new means for remote sites to participate in the DIII-D program.

  20. Comparison of the force applied on oral structures during intubation attempts by novice physicians between the Macintosh direct laryngoscope, Airway Scope and C-MAC PM: a high-fidelity simulator-based study.

    Science.gov (United States)

    Nakanishi, Taizo; Shiga, Takashi; Homma, Yosuke; Koyama, Yasuaki; Goto, Tadahiro

    2016-05-23

    We examined whether the use of Airway Scope (AWS) and C-MAC PM (C-MAC) decreased the force applied on oral structures during intubation attempts as compared with the force applied with the use of Macintosh direct laryngoscope (DL). Prospective cross-over study. A total of 35 novice physicians participated. We used 6 simulation scenarios based on the difficulty of intubation and intubation devices. Our primary outcome measures were the maximum force applied on the maxillary incisors and tongue during intubation attempts, measured by a high-fidelity simulator. The maximum force applied on maxillary incisors was higher with the use of the C-MAC than with the DL and AWS in the normal airway scenario (DL, 26 newtons (N); AWS, 18 N; C-MAC, 52 N; p<0.01) and the difficult airway scenario (DL, 42 N; AWS, 24 N; C-MAC, 68 N; p<0.01). In contrast, the maximum force applied on the tongue was higher with the use of the DL than with the AWS and C-MAC in both airway scenarios (DL, 16 N; AWS, 1 N; C-MAC, 7 N; p<0.01 in the normal airway scenario; DL, 12 N; AWS, 4 N; C-MAC, 7 N; p<0.01 in the difficult airway scenario). The use of C-MAC, compared with the DL and AWS, was associated with the higher maximum force applied on maxillary incisors during intubation attempts. In contrast, the use of video laryngoscopes was associated with the lower force applied on the tongue in both airway scenarios, compared with the DL. Our study was a simulation-based study, and further research on living patients would be warranted. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  1. TJ-II Library Manual

    International Nuclear Information System (INIS)

    Tribaldos, V.; Milligen, B.Ph., van; Lopez-Fraguas, A.

    1996-01-01

    This report contains a detailed description of the TJ-II library and its routines. The library is written in FORTRAN 77 language and is available in the CRAY J916 and DEC Alpha 8400 computers at CIEMAT. This document also contains some examples of its use. (Author)

  2. TJ-II Library Manual (Version 2)

    International Nuclear Information System (INIS)

    Tribaldos, V.; Milligen, B. Ph. van; Lopez-Fraguas, A.

    2001-01-01

    This is a manual for the TJ2 Numerical Library, which has been developed for making numerical computations of different TJ-II configurations. This manual is a new version of the earlier manual, CIEMAT report 806. (Author)

  3. Computer assisted spirometry.

    Science.gov (United States)

    Hansen, D J; Toy, V M; Deininger, R A; Collopy, T K

    1983-06-01

    Three of the most popular microcomputers, the TRS-80 Model I, the APPLE II+, and the IBM Personal Computer, were connected to a spirometer for data acquisition and analysis. Simple programs were written which allow the collection, analysis and storage of the data produced during spirometry. Three examples demonstrate the relative ease of automating spirometers.
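
    Although the original programs are not reproduced in the abstract, the kind of analysis they perform is easy to sketch. The fragment below is a modern Python illustration, not the original TRS-80/Apple/IBM code; the sampling rate and variable names are assumptions for the example. It integrates a sampled expiratory flow signal to obtain volume and derives FVC and FEV1.

        import numpy as np

        def spirometry_summary(flow_lps, fs_hz=100.0):
            # flow_lps: expiratory flow in litres/second for one forced manoeuvre
            # fs_hz:    spirometer sampling rate (assumed 100 Hz here)
            dt = 1.0 / fs_hz
            volume = np.cumsum(flow_lps) * dt          # integrate flow -> volume
            fvc = volume[-1]                           # forced vital capacity (L)
            n_1s = min(int(fs_hz), len(volume))        # samples in the first second
            fev1 = volume[n_1s - 1]                    # volume expired in 1 s (L)
            return fvc, fev1, fev1 / fvc               # FEV1/FVC is a standard index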

  4. Computer Series, 38.

    Science.gov (United States)

    Moore, John W., Ed.

    1983-01-01

    Discusses the numerical solution of the one-dimensional Schrodinger equation. A PASCAL computer program for the Apple II which performs the calculations is available from the authors. Also discusses quantization and perturbation theory using microcomputers, indicating benefits of adding a perturbation term to the harmonic oscillator as an…
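
    The PASCAL listing itself is not reproduced here, but the underlying method is standard. A minimal finite-difference version in Python (an illustrative reconstruction, not the authors' program) diagonalizes the discretized Hamiltonian; the grid and potential are arbitrary example choices, and a perturbation term can be added to the potential exactly as the article suggests.

        import numpy as np

        def schrodinger_1d(v, dx, hbar=1.0, m=1.0):
            # Solve -(hbar^2/2m) psi'' + V psi = E psi on a uniform grid
            # with psi = 0 at both ends; returns eigenvalues in ascending order.
            n = len(v)
            t = hbar**2 / (2.0 * m * dx**2)
            h = (np.diag(v + 2.0 * t)
                 + np.diag(-t * np.ones(n - 1), 1)
                 + np.diag(-t * np.ones(n - 1), -1))
            return np.linalg.eigh(h)

        # Harmonic oscillator check: eigenvalues should approach n + 1/2
        x = np.linspace(-10.0, 10.0, 1000)
        energies, _ = schrodinger_1d(0.5 * x**2, x[1] - x[0])
        print(energies[:3])   # ~ [0.5, 1.5, 2.5]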

  5. Small Diameter Bomb Increment II (SDB II)

    Science.gov (United States)

    2015-12-01

    Selected Acquisition Report (SAR), RCS: DD-A&T(Q&A)823-439, Small Diameter Bomb Increment II (SDB II), as of FY 2017 President's Budget. DoD Component: Air Force. Joint Participants: Department of the Navy. References: SAR Baseline (Production... Mission and Description: Small Diameter Bomb Increment II (SDB II) is a joint interest United States Air Force (USAF) and Department of the Navy

  6. Office 2011 for Macintosh The Missing Manual

    CERN Document Server

    Grover, Chris

    2010-01-01

    Office 2011 for Mac is easy to use, but to unleash its full power, you need to go beyond the basics. This entertaining guide not only gets you started with Word, Excel, PowerPoint, and the new Outlook for Mac, it also reveals lots of useful things you didn't know the software could do. Get crystal-clear explanations of the features you use most -- and plenty of power-user tips when you're ready for more. Take advantage of new tools. Navigate with the Ribbon, use SmartArt graphics, and work online with Office Web Apps.Create professional-looking documents. Use Word to craft beautiful reports,

  7. Improved selectivity for Pb(II) by sulfur, selenium and tellurium analogues of 1,8-anthraquinone-18-crown-5: synthesis, spectroscopy, X-ray crystallography and computational studies.

    Science.gov (United States)

    Mariappan, Kadarkaraisamy; Alaparthi, Madhubabu; Hoffman, Mariah; Rama, Myriam Alcantar; Balasubramanian, Vinothini; John, Danielle M; Sykes, Andrew G

    2015-07-14

    We report here a series of heteroatom-substituted macrocycles containing an anthraquinone moiety as a fluorescent signaling unit and a cyclic polyheteroether chain as the receptor. Sulfur, selenium, and tellurium derivatives of 1,8-anthraquinone-18-crown-5 (1) were synthesized by reacting sodium sulfide (Na2S), sodium selenide (Na2Se) and sodium telluride (Na2Te) with 1,8-bis(2-bromoethylethyleneoxy)anthracene-9,10-dione in a 1 : 1 ratio. The optical properties of the new compounds are examined and the sulfur and selenium analogues produce an intense green emission enhancement upon association with Pb(II) in acetonitrile. Selectivity for Pb(II) is markedly improved as compared to the oxygen analogue 1 which was also competitive for Ca(II) ion. UV-Visible and luminescence titrations reveal that 2 and 3 form 1 : 1 complexes with Pb(II), confirmed by single-crystal X-ray studies where Pb(II) is complexed within the macrocycle through coordinate covalent bonds to neighboring carbonyl, ether and heteroether donor atoms. Cyclic voltammetry of 2-8 showed classical, irreversible oxidation potentials for sulfur, selenium and tellurium heteroethers in addition to two one-electron reductions for the anthraquinone carbonyl groups. DFT calculations were also conducted on 1, 2, 3, 6, 6 + Pb(II) and 6 + Mg(II) to determine the trend in energies of the HOMO and the LUMO levels along the series.
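
    For reference, the 1:1 binding model behind such titration fits is standard textbook equilibrium algebra (stated here for orientation, not quoted from the paper). For M + L in equilibrium with ML, with formation constant K,

        K = \frac{[\mathrm{ML}]}{[\mathrm{M}]\,[\mathrm{L}]},
        \qquad
        [\mathrm{ML}] = \tfrac{1}{2}\!\left( [\mathrm{M}]_0 + [\mathrm{L}]_0 + \tfrac{1}{K}
                        - \sqrt{\left([\mathrm{M}]_0 + [\mathrm{L}]_0 + \tfrac{1}{K}\right)^{2}
                        - 4\,[\mathrm{M}]_0\,[\mathrm{L}]_0} \right),

    where [M]_0 and [L]_0 are total concentrations. The observed emission then grows in proportion to the bound fraction [ML]/[M]_0, which is what the UV-Visible and luminescence titrations fit to extract K.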

  8. Integrating Xgrid into the HENP distributed computing model

    Energy Technology Data Exchange (ETDEWEB)

    Hajdu, L; Lauret, J [Brookhaven National Laboratory, Upton, NY 11973 (United States); Kocoloski, A; Miller, M [Department of Physics, Massachusetts Institute of Technology, Cambridge, MA 02139 (United States)], E-mail: kocolosk@mit.edu

    2008-07-15

    Modern Macintosh computers feature Xgrid, a distributed computing architecture built directly into Apple's OS X operating system. While the approach is radically different from those generally expected by the Unix based Grid infrastructures (Open Science Grid, TeraGrid, EGEE), opportunistic computing on Xgrid is nonetheless a tempting and novel way to assemble a computing cluster with a minimum of additional configuration. In fact, it requires only the default operating system and authentication to a central controller from each node. OS X also implements arbitrarily extensible metadata, allowing an instantly updated file catalog to be stored as part of the filesystem itself. The low barrier to entry allows an Xgrid cluster to grow quickly and organically. This paper and presentation will detail the steps that can be taken to make such a cluster a viable resource for HENP research computing. We will further show how to provide users with a unified job submission framework by integrating Xgrid through the STAR Unified Meta-Scheduler (SUMS), making task and job submission effortlessly within reach for those users already using the tool for traditional Grid or local cluster job submission. We will discuss additional steps that can be taken to make an Xgrid cluster a full partner in grid computing initiatives, focusing on Open Science Grid integration. MIT's Xgrid system currently supports the work of multiple research groups in the Laboratory for Nuclear Science, and has become an important tool for generating simulations and conducting data analyses at the Massachusetts Institute of Technology.

  9. Tomo II

    OpenAIRE

    Llano Zapata, José Eusebio

    2015-01-01

    Historical, physical, critical and apologetic memoirs of South America, with some brief warnings and notices useful to those who, by order of His Majesty, should travel through and describe those vast regions. Plant Kingdom (Reino Vegetal), Volume II. By an anonymous American, in Cádiz, around the year 1757. "Dear Sir: I judge that the 20 articles of the book I sent to Your Grace will have led you to form the opinion merited by the fecundity of those countries in mineral productions. And being..."

  10. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  11. Project Final Report: HPC-Colony II

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Terry R [ORNL; Kale, Laxmikant V [University of Illinois, Urbana-Champaign; Moreira, Jose [IBM T. J. Watson Research Center

    2013-11-01

    This report recounts the HPC Colony II Project, which was a computer science effort funded by DOE's Advanced Scientific Computing Research office. The project included researchers from ORNL, IBM, and the University of Illinois at Urbana-Champaign. The topic of the effort was adaptive system software for extreme-scale parallel machines. A description of findings is included.

  12. 25 CFR 502.3 - Class II gaming.

    Science.gov (United States)

    2010-04-01

    25 CFR 502.3 (revised as of 2010-04-01): Class II gaming. National Indian Gaming Commission, Department of the Interior; General Provisions; Definitions of This Chapter. Class II gaming means: (a) Bingo or lotto (whether or not electronic, computer...

  13. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer group has been reorganized to take charge of the general-purpose computers DEC10 and VAX and of the computer network (Dataswitch, DECnet, IBM connections to GSI and IPP, preparation for Datex-P). (orig.)

  14. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  15. HEXAGA-II. A two-dimensional multi-group neutron diffusion programme for a uniform triangular mesh with arbitrary group scattering for the IBM/370-168 computer

    International Nuclear Information System (INIS)

    Woznicki, Z.

    1976-05-01

    This report presents the AGA two-sweep iterative methods, belonging to the family of factorization techniques, in their practical application in the HEXAGA-II two-dimensional programme to obtain the numerical solution of the multi-group, time-independent (real and/or adjoint) neutron diffusion equations for a fine uniform triangular mesh. An arbitrary group scattering model is permitted. The report, written for users, provides a description of the input and output. The use of HEXAGA-II is illustrated by two sample reactor problems. (orig.) [de
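
    HEXAGA-II itself is a large FORTRAN code on a triangular mesh; purely as a toy illustration of the class of problem it solves, the Python sketch below runs a power (source) iteration for a two-group diffusion eigenvalue problem on a uniform one-dimensional mesh. All cross-section values are invented for the example, and the geometry is a bare slab with zero-flux boundaries, not a hexagonal core.

        import numpy as np

        def two_group_keff(n=50, h=1.0, d=(1.4, 0.4), sig_a=(0.01, 0.08),
                           sig_s12=0.02, nu_sf=(0.005, 0.10)):
            # Bare slab, zero-flux boundaries; all fission neutrons born in group 1.
            def leakage_plus_removal(dg, removal):
                a = np.diag((2.0 * dg / h**2 + removal) * np.ones(n))
                a += np.diag(-dg / h**2 * np.ones(n - 1), 1)
                a += np.diag(-dg / h**2 * np.ones(n - 1), -1)
                return a
            a1 = leakage_plus_removal(d[0], sig_a[0] + sig_s12)  # removal incl. downscatter
            a2 = leakage_plus_removal(d[1], sig_a[1])
            phi1, phi2, k = np.ones(n), np.ones(n), 1.0
            for _ in range(500):
                src = (nu_sf[0] * phi1 + nu_sf[1] * phi2) / k    # scaled fission source
                phi1 = np.linalg.solve(a1, src)
                phi2 = np.linalg.solve(a2, sig_s12 * phi1)       # downscatter feeds group 2
                k_new = np.sum(nu_sf[0] * phi1 + nu_sf[1] * phi2) / np.sum(src)
                if abs(k_new - k) < 1e-8:
                    break
                k = k_new
            return k

        print(two_group_keff())   # effective multiplication factor for the toy slab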

  16. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  17. RTNS-II: present status

    International Nuclear Information System (INIS)

    Heikkinen, D.W.; Logan, C.M.

    1980-10-01

    The present status of the RTNS-II facility is described and typical operating parameters are given. A brief discussion is given of the methods used in production of the TiT2 targets as well as their performance and tritium handling at RTNS-II. The various types of non-interactive beam diagnostics presently in use at the neutron sources are outlined. The on-line computer system which provides a time history of an irradiation and records target performance is described. Examples are listed of several representative experimental programs which have been carried out thus far at RTNS-II. These include both active and passive experiments. Finally, several of the major improvements to the facility made since the beginning of the experimental program are given.

  18. Overhead Crane Computer Model

    Science.gov (United States)

    Enin, S. S.; Omelchenko, E. Y.; Fomin, N. V.; Beliy, A. V.

    2018-03-01

    The paper describes a computer model of an overhead crane system. The crane system consists of hoisting, trolley and crane mechanisms as well as a two-axis payload system. With the help of the differential equations of motion for these mechanisms, derived through Lagrange equations of the second kind, an overhead crane computer model can be built. The computer model was implemented in Matlab. Transients of coordinate, linear speed and motor torque for the trolley and crane mechanism systems were simulated, and transients of payload sway were obtained with respect to the vertical axis. The paper also presents a trajectory of the trolley mechanism operating simultaneously with the crane mechanism, as well as a two-axis payload trajectory. The designed computer model of an overhead crane is a useful means for studying positioning control and anti-sway control systems.
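
    A minimal version of such a model, reducing the system to a trolley with the payload as a planar pendulum and following the same Lagrange-equation route, can be sketched as follows. Masses, cable length and the force profile are arbitrary example values, and the hoisting and bridge axes are omitted for brevity; this is not the paper's Matlab model.

        import numpy as np
        from scipy.integrate import solve_ivp

        G = 9.81  # gravitational acceleration, m/s^2

        def crane_rhs(t, y, M=500.0, m=100.0, l=5.0, force=lambda t: 200.0):
            # Trolley (mass M, kg) driven by force(t) (N), payload (mass m, kg)
            # on a cable of fixed length l (m). State y = [x, x_dot, theta, theta_dot],
            # where theta is the payload sway angle from the vertical.
            x, xd, th, thd = y
            f = force(t)
            # Accelerations solved from the two Lagrange equations of the 2nd kind:
            xdd = (f + m * np.sin(th) * (l * thd**2 + G * np.cos(th))) \
                  / (M + m * np.sin(th)**2)
            thdd = -(xdd * np.cos(th) + G * np.sin(th)) / l
            return [xd, xdd, thd, thdd]

        sol = solve_ivp(crane_rhs, (0.0, 10.0), [0.0, 0.0, 0.0, 0.0], max_step=0.01)
        print(sol.y[0, -1], sol.y[2, -1])  # trolley position and sway angle at t = 10 s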

  19. Quantum computational webs

    International Nuclear Information System (INIS)

    Gross, D.; Eisert, J.

    2010-01-01

    We discuss the notion of quantum computational webs: These are quantum states universal for measurement-based computation, which can be built up from a collection of simple primitives. The primitive elements--reminiscent of building blocks in a construction kit--are (i) one-dimensional states (computational quantum wires) with the power to process one logical qubit and (ii) suitable couplings, which connect the wires to a computationally universal web. All elements are preparable by nearest-neighbor interactions in a single pass, of the kind accessible in a number of physical architectures. We provide a complete classification of qubit wires, a physically well-motivated class of universal resources that can be fully understood. Finally, we sketch possible realizations in superlattices and explore the power of coupling mechanisms based on Ising or exchange interactions.

  20. Cd(II), Cu(II)

    African Journals Online (AJOL)

    user

    Depending on the way goethite was pretreated with oxalic acid, affinity for Cd(II) varied ...... Effects and mechanisms of oxalate on Cd(II) adsorption on goethite at different ... precipitation, surfactant mediation, hydrothermal and micro-emulsion.

  1. FRM-II conversion revisited

    International Nuclear Information System (INIS)

    Glaser, A.; Pistner, C.; Liebert, W.

    2000-01-01

    The possibilities for a conversion of the currently constructed research reactor FRM-II have been extensively discussed at various RERTR meetings over the past years. In order to support the ongoing decision-making process in Germany, we prepared computer simulations providing extra information on the scientific usability of the converted reactor, based on designs proposed by ANL and TUM. The most important results of these calculations are presented and discussed. Special attention is thereby given to the specific German context. (author)

  2. TRANSWRAP II: problem definition manual

    International Nuclear Information System (INIS)

    Knittle, D.E.

    1981-02-01

    The TRANSWRAP II computer code, written in Fortran IV and described in this Problem Definition Manual, was developed to analytically predict the magnitude of pressure pulses from large-scale sodium-water reactions in LMFBR secondary systems. It is currently being used for the Clinch River Breeder Reactor Program. The code provides the options, flexibility and features necessary to consider any system configuration. The code methodology has been validated with the aid of extensive sodium-water reaction test programs.

  3. PANDA II

    International Nuclear Information System (INIS)

    Rabou, L.P.L.M.; Zwart, P.; Langedijk, G.J.; Mijnarends, P.E.

    1988-11-01

    A full account is given of the design, construction and operation of an experimental apparatus for the measurement of the angular correlation of positron-annihilation radiation in two dimensions (2D ACAR). The 2D ACAR technique is insensitive to the electronic mean free path and can be applied to pure metals as well as non-dilute alloys and compounds to obtain valuable information on the band structure and Fermi-surface geometry. The apparatus consists of two 30 x 30 cm² hybrid (high-density) multi-wire-chamber γ detectors at variable distances from 5 to 12 m at opposite sides of a variable-temperature cryostat which contains a 6.5 T superconducting magnet. The detectors, the coded centre-of-gravity position read-out method employed, the associated electronics and computer software are described in detail. Operational characteristics and results are presented. A net detection efficiency of 6% and an angular resolution of 0.21 x 0.31 mrad² (0.029 x 0.042 a.u.²) are reproducibly obtained. Parameters affecting the performance of the system are discussed. An improvement of the efficiency to over 10% by relatively simple measures is foreseen.

  4. Cu(II) AND Zn(II)

    African Journals Online (AJOL)

    Preferred Customer

    SYNTHESIS OF 2,2-DIMETHYL-4-PHENYL-[1,3]-DIOXOLANE USING ZEOLITE-ENCAPSULATED Co(II), Cu(II) AND Zn(II) COMPLEXES. B.P. Nethravathi1, K. Rama Krishna Reddy2 and K.N. Mahendra1*. 1Department of Chemistry, Bangalore University, Bangalore-560001, India. 2Department of Chemistry, Government ...

  5. Elizabeth II uus kunstigalerii

    Index Scriptorium Estoniae

    1999-01-01

    To mark the 50th anniversary of her accession to the throne, Elizabeth II will open a new art gallery at Buckingham Palace on 6 February 2002, built as a wing of the palace. Architect: John Simpson. The gallery will display works from Elizabeth II's art collection.

  6. Natural Computing in Computational Finance Volume 4

    CERN Document Server

    O’Neill, Michael; Maringer, Dietmar

    2012-01-01

    This book follows on from Natural Computing in Computational Finance  Volumes I, II and III.   As in the previous volumes of this series, the  book consists of a series of  chapters each of  which was selected following a rigorous, peer-reviewed, selection process.  The chapters illustrate the application of a range of cutting-edge natural  computing and agent-based methodologies in computational finance and economics.  The applications explored include  option model calibration, financial trend reversal detection, enhanced indexation, algorithmic trading,  corporate payout determination and agent-based modeling of liquidity costs, and trade strategy adaptation.  While describing cutting edge applications, the chapters are  written so that they are accessible to a wide audience. Hence, they should be of interest  to academics, students and practitioners in the fields of computational finance and  economics.  

  7. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  8. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design...... and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view......, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture....

  9. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down into three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  10. The evolution of computer technology

    CERN Document Server

    Kamar, Haq

    2018-01-01

    Today it seems that computers occupy every single space in life. This book traces the evolution of computers from their humble beginnings as simple calculators up to modern-day jack-of-all-trades devices like the iPhone. Readers will learn how computers evolved from humongous military-issue refrigerators to the spiffy, delicate, and intriguing devices that many modern people feel they can't live without anymore. Readers will also discover the historical significance of computers, and their pivotal roles in World War II, the Space Race, and the emergence of modern Western powers.

  11. Synthesis and characterisation of Cu(II), Ni(II), Mn(II), Zn(II) and VO(II ...

    Indian Academy of Sciences (India)

    Unknown

    Synthesis and characterisation of Cu(II), Ni(II), Mn(II), Zn(II) and VO(II) Schiff base complexes derived from o-phenylenediamine and acetoacetanilide. N RAMAN*, Y PITCHAIKANI RAJA and A KULANDAISAMY. Department of Chemistry, VHNSN College, Virudhunagar 626 001, India e-mail: ra_man@123india.com.

  12. A SURVEY ON UBIQUITOUS COMPUTING

    Directory of Open Access Journals (Sweden)

    Vishal Meshram

    2016-01-01

    This work presents a survey of ubiquitous computing research, the emerging domain that implements communication technologies into day-to-day life activities. The paper provides a classification of the research areas within the ubiquitous computing paradigm, presents common architecture principles of ubiquitous systems, and analyzes important aspects of context-aware ubiquitous systems. In addition, it presents a novel architecture for a ubiquitous computing system and a survey of the sensors needed for applications in ubiquitous computing. The goals of this work are three-fold: (i) serve as a guideline for researchers who are new to ubiquitous computing and want to contribute to this research area, (ii) provide a novel system architecture for ubiquitous computing systems, and (iii) provide further research directions for quality-of-service assurance in ubiquitous computing.

  13. Magazine Development: Creative Arts Magazines Can Take on More Creativity through Staff Innovation, Desktop Publishing.

    Science.gov (United States)

    Cutsinger, John

    1988-01-01

    Explains how a high school literary magazine staff accessed the journalism department's Apple Macintosh computers to typeset its publication. Provides examples of magazine layouts designed partially or completely with PageMaker software on a Macintosh. (MM)

  14. QDENSITY—A Mathematica quantum computer simulation

    Science.gov (United States)

    Juliá-Díaz, Bruno; Burdis, Joseph M.; Tabakin, Frank

    2009-03-01

    This Mathematica 6.0 package is a simulation of a Quantum Computer. The program provides a modular, instructive approach for generating the basic elements that make up a quantum circuit. The main emphasis is on using the density matrix, although an approach using state vectors is also implemented in the package. The package commands are defined in Qdensity.m which contains the tools needed in quantum circuits, e.g., multiqubit kets, projectors, gates, etc. New version program summary. Program title: QDENSITY 2.0. Catalogue identifier: ADXH_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXH_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 26 055. No. of bytes in distributed program, including test data, etc.: 227 540. Distribution format: tar.gz. Programming language: Mathematica 6.0. Operating system: Any which supports Mathematica; tested under Microsoft Windows XP, Macintosh OS X, and Linux FC4. Catalogue identifier of previous version: ADXH_v1_0. Journal reference of previous version: Comput. Phys. Comm. 174 (2006) 914. Classification: 4.15. Does the new version supersede the previous version?: Offers an alternative, more up to date, implementation. Nature of problem: Analysis and design of quantum circuits, quantum algorithms and quantum clusters. Solution method: A Mathematica package is provided which contains commands to create and analyze quantum circuits. Several Mathematica notebooks containing relevant examples: Teleportation, Shor's Algorithm and Grover's search are explained in detail. A tutorial, Tutorial.nb, is also enclosed. Reasons for new version: The package has been updated to make it fully compatible with Mathematica 6.0. Summary of revisions: The package has been updated to make it fully compatible with Mathematica 6.0. Running time: Most examples
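
    QDENSITY is a Mathematica package, but the density-matrix bookkeeping it automates is easy to demonstrate in any language. The numpy fragment below is an analogy, not the package's own syntax: it applies a Hadamard gate to a one-qubit density matrix and evaluates a projective measurement probability; the package's multiqubit kets, projectors and gates follow the same algebra at larger dimension.

        import numpy as np

        H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)   # Hadamard gate
        rho = np.array([[1.0, 0.0], [0.0, 0.0]], dtype=complex)  # |0><0|

        rho = H @ rho @ H.conj().T             # gate acting on a density matrix
        p0 = np.trace(np.diag([1.0, 0.0]) @ rho).real
        print(np.round(rho, 3), p0)            # uniform 0.5 entries; P(measure 0) = 0.5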

  15. Transuranic Computational Chemistry.

    Science.gov (United States)

    Kaltsoyannis, Nikolas

    2018-02-26

    Recent developments in the chemistry of the transuranic elements are surveyed, with particular emphasis on computational contributions. Examples are drawn from molecular coordination and organometallic chemistry, and from the study of extended solid systems. The role of the metal valence orbitals in covalent bonding is a particular focus, especially the consequences of the stabilization of the 5f orbitals as the actinide series is traversed. The fledgling chemistry of transuranic elements in the +II oxidation state is highlighted. Throughout, the symbiotic interplay of experimental and computational studies is emphasized; the extraordinary challenges of experimental transuranic chemistry afford computational chemistry a particularly valuable role at the frontier of the periodic table. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  17. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  18. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computer and electronic waste in any way they can. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  19. Non-invasive Heart Team assessment of multivessel coronary disease with coronary computed tomography angiography based on SYNTAX score II treatment recommendations: design and rationale of the randomised SYNTAX III Revolution trial

    NARCIS (Netherlands)

    Cavalcante, Rafael; Onuma, Yoshinobu; Sotomi, Yohei; Collet, Carlos; Thomsen, Brian; Rogers, Campbell; Zeng, Yaping; Tenekecioglu, Erhan; Asano, Taku; Miyasaki, Yosuke; Abdelghani, Mohammad; Morel, Marie-Angèle; Serruys, Patrick W.

    2017-01-01

    The aim of this study was to investigate whether a Heart Team decision-making process regarding the choice of revascularisation strategy based on non-invasive coronary multislice computed tomography angiography (MSCT) assessment of coronary artery disease (CAD) is equivalent to the standard-of-care

  20. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  1. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  2. Pervasive Computing

    NARCIS (Netherlands)

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  3. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  4. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Computation and today’s microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels... Both Spatial Computation and microkernels break away a relatively monolithic architecture into individual lightweight pieces, well specialized... for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  5. Virtual network computing: cross-platform remote display and collaboration software.

    Science.gov (United States)

    Konerding, D E

    1999-04-01

    VNC (Virtual Network Computing) is a computer program written to address the problem of cross-platform remote desktop/application display. VNC uses a client/server model in which an image of the desktop of the server is transmitted to the client and displayed. The client collects mouse and keyboard input from the user and transmits them back to the server. The VNC client and server can run on Windows 95/98/NT, MacOS, and Unix (including Linux) operating systems. VNC is multi-user on Unix machines (any number of servers can be run, and they are unrelated to the primary display of the computer), while it is effectively single-user on Macintosh and Windows machines (only one server can be run, displaying the contents of the primary display of the server). The VNC servers can be configured to allow more than one client to connect at one time, effectively allowing collaboration through the shared desktop. I describe the function of VNC, provide details of installation, describe how it achieves its goal, and evaluate the use of VNC for molecular modelling. VNC is an extremely useful tool for collaboration, instruction, software development, and debugging of graphical programs with remote users.
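
    The client/server exchange begins with a plain-text version handshake in the underlying RFB protocol, which makes a server easy to probe. The Python sketch below is a minimal illustration: the host and port are placeholders, and no authentication or framebuffer traffic is attempted.

        import socket

        # Probe a VNC server's RFB version handshake; "vnc.example.org" is a
        # placeholder host, and 5900 is the conventional first VNC display port.
        with socket.create_connection(("vnc.example.org", 5900), timeout=5) as s:
            version = s.recv(12)          # server sends e.g. b"RFB 003.008\n"
            print(version.decode("ascii").strip())
            s.sendall(version)            # echo it back to accept the same version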

  6. Computer simulation of a clinical magnet resonance tomography scanner for training purposes

    International Nuclear Information System (INIS)

    Hacklaender, T.; Mertens, H.; Cramer, B.M.

    2004-01-01

    Purpose: The idea for this project was born of the necessity to offer medical students an easy approach to the theoretical basics of magnetic resonance imaging. The aim was to simulate the features and functions of such a scanner on a commercially available computer by means of a computer program. Materials and Methods: The simulation was programmed in pure Java under the GNU General Public License and is freely available for any commercially available computer with a Windows, Macintosh or Linux operating system. The graphic user interface is oriented to a real scanner. In an external program, parameter images for the proton density and the relaxation times T1 and T2 are calculated on the basis of clinical examinations. From these, the image calculation is carried out in the simulation program pixel by pixel, on the basis of a pulse sequence chosen and modified by the user. The images can be stored and printed. In addition, it is possible to display and modify k-space images. Results: Seven classes of pulse sequences are implemented and up to 14 relevant sequence parameters, such as repetition time and echo time, can be altered. Aliasing and motion artifacts can be simulated. As the image calculation only takes a few seconds, interactive working is possible. (orig.)

  7. [Computer simulation of a clinical magnet resonance tomography scanner for training purposes].

    Science.gov (United States)

    Hackländer, T; Mertens, H; Cramer, B M

    2004-08-01

    The idea for this project was born of the necessity to offer medical students an easy approach to the theoretical basics of magnetic resonance imaging. The aim was to simulate the features and functions of such a scanner on a commercially available computer by means of a computer program. The simulation was programmed in pure Java under the GNU General Public License and is freely available for any commercially available computer with a Windows, Macintosh or Linux operating system. The graphic user interface is oriented to a real scanner. In an external program, parameter images for the proton density and the relaxation times T1 and T2 are calculated on the basis of clinical examinations. From these, the image calculation is carried out in the simulation program pixel by pixel, on the basis of a pulse sequence chosen and modified by the user. The images can be stored and printed. In addition, it is possible to display and modify k-space images. Seven classes of pulse sequences are implemented and up to 14 relevant sequence parameters, such as repetition time and echo time, can be altered. Aliasing and motion artifacts can be simulated. As the image calculation only takes a few seconds, interactive working is possible. The simulation has been used in university education for more than a year, successfully illustrating the dependence of the MR images on the measurement parameters. This should facilitate students' approach to understanding MR imaging in the future.
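
    The abstracts do not spell out the signal model, but a common pixelwise spin-echo equation reproduces the described dependence of the image on repetition time (TR) and echo time (TE). The sketch below is an assumed illustration, not code from the simulator.

        import numpy as np

        def spin_echo_image(pd, t1, t2, tr, te):
            # pd, t1, t2: proton-density and relaxation-time maps (same shape; ms)
            # tr, te:     repetition and echo times of the chosen pulse sequence (ms)
            return pd * (1.0 - np.exp(-tr / t1)) * np.exp(-te / t2)

    Varying the two sequence parameters changes the contrast exactly as students observe in the simulator: short TR with short TE emphasizes T1 differences, while long TR with long TE emphasizes T2 differences.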

  8. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed.Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn

  9. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  10. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into computers in future in order for their components to function. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  11. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  12. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  13. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses automatic electronic digital computers, symbolic language, Reverse Polish Notation, and the translation of Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  14. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  15. The Causal-Compositional Concept of Information—Part II: Information through Fairness: How Does the Relationship between Information, Fairness and Language Evolve, Stimulate the Development of (New) Computing Devices and Help to Move towards the Information Society

    Directory of Open Access Journals (Sweden)

    Gerhard Luhn

    2012-09-01

    We are moving towards the information society, and we need to overcome the discouraging perspective caused by the false belief that our thoughts (and thereby also our acting) represent a somehow externally existing world. Indeed, it is already a step forward to proclaim that there exists a somehow common world for all people. But if those internal forms of representation are primarily bound to the subject itself, then, consequently, anybody can argue for his or her view of the world as being the "right" one. Well, what is the exit strategy out of this dilemma? It is information; information as understood in its actual and potential dimension, in its identity of structure and meaning. Such an approach requires a more deeply elaborated conceptual framework. The goal of this study is to show that such a concept is glued together by the strong relationship between seemingly unrelated disciplines: physics, semantics (semiotics/cognition), computer science, and even poetry. The term information is nowadays discussed and elaborated in all those disciplines; hence, there is no shortcut, no way around it. The aim of this study is not merely to show that those strong relationships exist. We will see within the same horizon that, based on such a concept, new kinds of computing systems become possible. Nowadays energy consumption is becoming a major issue for computing systems. We will work towards an approach which enables new devices that consume a minimum amount of energy while maximizing performance at the same time. And within the same horizon it becomes possible to release the saved energy towards a new ethical spirit—towards the information society.

  16. Belle-II Experiment Network Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Asner, David [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Bell, Greg [ESnet; Carlson, Tim [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Cowley, David [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Dart, Eli [ESnet; Erwin, Brock [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Godang, Romulus [Univ. of South Alabama, Mobile, AL (United States); Hara, Takanori [High Energy Accelerator Research Organization (KEK), Tsukuba (Japan); Johnson, Jerry [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Johnson, Ron [Univ. of Washington, Seattle, WA (United States); Johnston, Bill [ESnet; Dam, Kerstin Kleese-van [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Kaneko, Toshiaki [High Energy Accelerator Research Organization (KEK), Tsukuba (Japan); Kubota, Yoshihiro [NII; Kuhr, Thomas [Karlsruhe Inst. of Technology (KIT) (Germany); McCoy, John [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Miyake, Hideki [High Energy Accelerator Research Organization (KEK), Tsukuba (Japan); Monga, Inder [ESnet; Nakamura, Motonori [NII; Piilonen, Leo [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Pordes, Ruth [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Ray, Douglas [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Russell, Richard [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Schram, Malachi [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Schroeder, Jim [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Sevior, Martin [Univ. of Melbourne (Australia); Singh, Surya [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Suzuki, Soh [High Energy Accelerator Research Organization (KEK), Tsukuba (Japan); Sasaki, Takashi [High Energy Accelerator Research Organization (KEK), Tsukuba (Japan); Williams, Jim [Indiana Univ., Bloomington, IN (United States)

    2013-05-28

    The Belle experiment, part of a broad-based search for new physics, is a collaboration of ~400 physicists from 55 institutions across four continents. The Belle detector is located at the KEKB accelerator in Tsukuba, Japan. The Belle detector was operated at the asymmetric electron-positron collider KEKB from 1999-2010. The detector accumulated more than 1 ab⁻¹ of integrated luminosity, corresponding to more than 2 PB of data near 10 GeV center-of-mass energy. Recently, KEK has initiated a $400 million accelerator upgrade to be called SuperKEKB, designed to produce instantaneous and integrated luminosity two orders of magnitude greater than KEKB. The new international collaboration at SuperKEKB is called Belle II. The first data from Belle II/SuperKEKB is expected in 2015. In October 2012, senior members of the Belle-II collaboration gathered at PNNL to discuss the computing and networking requirements of the Belle-II experiment with ESnet staff and other computing and networking experts. The day-and-a-half-long workshop characterized the instruments and facilities used in the experiment, the process of science for Belle-II, and the computing and networking equipment and configuration requirements to realize the full scientific potential of the collaboration's work.

  17. Computational biomechanics

    International Nuclear Information System (INIS)

    Ethier, C.R.

    2004-01-01

    Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)

  18. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    to understand the computer as a material like any other material we would use for design, like wood, aluminum, or plastic. That as soon as the computer forms a composition with other materials it becomes just as approachable and inspiring as other smart materials. I present a series of investigations of what...... Computational Composite, and Telltale). Through the investigations, I show how the computer can be understood as a material and how it partakes in a new strand of materials whose expressions come to be in context. I uncover some of their essential material properties and potential expressions. I develop a way...

  19. Positron Survival in Type II Supernovae

    Science.gov (United States)

    1989-05-01

    Appendix B: Computer Program and Flow Diagram. I. Introduction: Since the discovery of Supernova 1987A (a Type II supernova) in February of 1987... the fewer number of decays depositing energy within the supernova. The rate of this cooling is unknown because it is uncertain whether a pulsar was

  20. KAFEPA-II program users' manual and description

    International Nuclear Information System (INIS)

    Suk, H. C.; Hwang, W.; Kim, B. G.; Sim, K. S.; Heo, Y. H.; Byun, T. S.; Park, G. S.

    1992-04-01

    KAFEPA-II is a computer program for simulating the behaviour of UO2 fuel elements under normal operating conditions of a CANDU reactor. It computes the one-dimensional temperature distribution and thermal expansion of the fuel pellets. The amount of gas released during irradiation of the fuel is also computed. Thermal expansion and gas pressure inside the fuel element are then used to compute the strains and stresses in the sheath. This document is intended as a user's manual and description for KAFEPA-II. (Author)

  1. (II) COMPLEX COMPOUND

    African Journals Online (AJOL)

    user

    electrochemical sensors, as well as in various chromatographic ... were carried out using Jenway pH meter Model 3320 and a conductivity ... Figure 1: the proposed molecular structure of the copper (II) Schiff base complex. M = Cu (II) or Mn (II).

  2. and copper(II)

    Indian Academy of Sciences (India)

    Unknown

    (II) and copper(II)–zinc(II) complexes. SUBODH KUMAR1, R N PATEL1*, P V KHADIKAR1 and. K B PANDEYA2. 1 Department of Chemistry, APS University, Rewa 486 003, India. 2 CSJM University, Kanpur 208 016, India e-mail: (R N Patel) ...

  3. Computational, electrochemical, and spectroscopic studies of two mononuclear cobaloximes: the influence of an axial pyridine and solvent on the redox behaviour and evidence for pyridine coordination to cobalt(I) and cobalt(II) metal centres

    Science.gov (United States)

    Lawrence, Mark A. W.; Celestine, Michael J.; Artis, Edward T.; Joseph, Lorne S.; Esquivel, Deisy L.; Ledbetter, Abram J.; Cropek, Donald M.; Jarrett, William L.; Bayse, Craig A.; Brewer, Matthew I.; Holder, Alvin A.

    2018-01-01

    [Co(dmgBF2)2(H2O)2] 1 (where dmgBF2 = difluoroboryldimethylglyoximato) was used to synthesize [Co(dmgBF2)2(H2O)(py)]·0.5(CH3)2CO 2 (where py = pyridine) in acetone. The formulation of complex 2 was confirmed by elemental analysis, high resolution MS, and various spectroscopic techniques. The complex [Co(dmgBF2)2(solv)(py)] (where solv = solvent) was readily formed in situ upon the addition of pyridine to complex 1. A spectrophotometric titration involving complex 1 and pyridine proved the formation of such a species, with formation constants, log K = 5.5, 5.1, 5.0, 4.4, and 3.1 in 2-butanone, dichloromethane, acetone, 1,2-difluorobenzene/acetone (4 : 1, v/v), and acetonitrile, respectively, at 20 °C. In strongly coordinating solvents, such as acetonitrile, the lower magnitude of K along with cyclic voltammetry, NMR, and UV-visible spectroscopic measurements indicated extensive dissociation of the axial pyridine. In strongly coordinating solvents, [Co(dmgBF2)2(solv)(py)] can only be distinguished from [Co(dmgBF2)2(solv)2] upon addition of an excess of pyridine; however, in weakly coordinating solvents the distinctions were apparent without the need for excess pyridine. The coordination of pyridine to the cobalt(II) centre diminished the peak current at the Epc value of the CoI/0 redox couple, which was indicative of the relative position of the reaction equilibrium. Herein we report the first experimental and theoretical 59Co NMR spectroscopic data for the formation of Co(I) species of reduced cobaloximes in the presence and absence of py (and its derivatives) in CD3CN. From spectroelectrochemical studies, it was found that pyridine coordination to a cobalt(I) metal centre is more favourable than coordination to a cobalt(II) metal centre, as evidenced by the larger formation constant, log K = 4.6 versus 3.1, respectively, in acetonitrile at 20 °C. The electrosynthesis of hydrogen by complexes 1 and 2 in various solvents demonstrated the dramatic effects of the axial
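
    Formation constants of this kind come from fitting titration data to a binding model. The sketch below fits a generic 1:1 binding isotherm to synthetic absorbance data, generated here with the acetonitrile value log K ≈ 3.1 from the abstract as ground truth; it illustrates the fitting procedure, not the authors' actual analysis:

      import numpy as np
      from scipy.optimize import curve_fit

      # Generic 1:1 binding isotherm for a spectrophotometric titration with
      # the ligand in large excess:  A([L]) = (A0 + Ainf*K*[L]) / (1 + K*[L]).
      # The "data" are synthetic, with log K = 3.1 and a little noise.
      def isotherm(L, A0, Ainf, K):
          return (A0 + Ainf * K * L) / (1.0 + K * L)

      rng = np.random.default_rng(1)
      L = np.logspace(-5, -2, 12)                      # pyridine concentration, M
      A = isotherm(L, 0.10, 0.85, 10**3.1) + rng.normal(0, 0.003, L.size)

      (A0, Ainf, K), _ = curve_fit(isotherm, L, A, p0=(0.1, 0.8, 1.0e3))
      print(f"fitted log K = {np.log10(K):.2f}")       # should recover ~3.1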

  4. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using GPUs for general-purpose computing, things have evolved over the years, and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A newer emerging standard, OpenCL (Open Computing Language), tries to unify the different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We present the benefits of the CUDA programming model, and we compare the two main approaches, CUDA and AMD APP (Stream), with the new framework, OpenCL, which tries to unify the GPGPU computing models.
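
    One concrete way to see the CUDA thread/block programming model described here is through Python's numba package, which exposes CUDA kernels with the same grid/block launch semantics. This sketch assumes a CUDA-capable GPU and an installed numba; the paper itself works with the CUDA C API:

      import numpy as np
      from numba import cuda

      # Element-wise vector addition in the CUDA model: one thread per index.
      @cuda.jit
      def vec_add(a, b, out):
          i = cuda.grid(1)              # global thread index
          if i < out.size:              # guard against excess threads
              out[i] = a[i] + b[i]

      n = 1_000_000
      a = np.random.rand(n).astype(np.float32)
      b = np.random.rand(n).astype(np.float32)
      out = np.empty_like(a)

      threads_per_block = 256
      blocks = (n + threads_per_block - 1) // threads_per_block
      vec_add[blocks, threads_per_block](a, b, out)   # arrays copied to/from the GPU
      assert np.allclose(out, a + b)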

  5. Quantum Computing

    Indian Academy of Sciences (India)

    Quantum Computing – Building Blocks of a Quantum Computer. C S Vijay and Vishal Gupta. General Article, Resonance – Journal of Science Education, Volume 5, Issue 9, September 2000, pp 69-81.

  6. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  7. Quantum Computing

    Indian Academy of Sciences (India)

    In the first part of this article, we looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and at some practical difficulties in building such a device. Quantum Algorithms.

  8. Quantum computing

    OpenAIRE

    Burba, M.; Lapitskaya, T.

    2017-01-01

    This article gives an elementary introduction to quantum computing. It is a draft for a book chapter of the "Handbook of Nature-Inspired and Innovative Computing", Eds. A. Zomaya, G.J. Milburn, J. Dongarra, D. Bader, R. Brent, M. Eshaghian-Wilner, F. Seredynski (Springer, Berlin Heidelberg New York, 2006).

  9. Computational Pathology

    Science.gov (United States)

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  10. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production......), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  11. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...
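
    The diagonalization technique mentioned here is the engine behind the halting problem. A sketch of the standard argument in Python (an illustration of the technique, not the book's own presentation):

      # Diagonalization against a hypothetical halting oracle. halts(src, arg)
      # is assumed to decide whether program `src` halts on input `arg`; no
      # such total, correct function can exist, as this construction shows.
      def make_contrary(halts):
          def contrary(src):
              if halts(src, src):      # if the oracle says "src halts on itself"...
                  while True:          # ...then loop forever,
                      pass
              return None              # ...otherwise halt immediately.
          return contrary

      # Running contrary on its own source yields a contradiction:
      #   halts(contrary, contrary) == True   would make contrary(contrary) loop;
      #   halts(contrary, contrary) == False  would make contrary(contrary) halt.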

  12. Analysis methods of neutrons induced resonances in the transmission experiments by time-of-flight and automation of these methods on IBM 7094 II computer; Methode d'analyse des resonances induites par les neutrons dans les experiences de transmission par temps-de-vol et automatisation de ces methodes sur ordinateur IBM-7094 II

    Energy Technology Data Exchange (ETDEWEB)

    Corge, C

    1967-07-01

    The analysis of neutron-induced resonances aims to determine the neutron characteristics: the excitation energies; the de-excitation probabilities by gamma radiation emission, by neutron emission or by fission; their spin; their parity... This document describes the methods developed or adapted, the calculation schemes, and the algorithms implemented to perform such analyses on a computer, using data obtained during time-of-flight experiments on the linear accelerator of Saclay. (A.L.B.)

  13. Computational fluid dynamic simulation (CFD) for hydrogen emission in batteries rooms of new technologic safeguards system of nuclear power plant Vandellos II; Simulacion de dinamica de fluidos computacional (CFD) para la emision de hidrogeno en las salas de baterias de nuevo sistema de salvaguardias tecnologicas de C.N. Vandellos II

    Energy Technology Data Exchange (ETDEWEB)

    Aleman, A.; Arino, X; Colomer, C.

    2010-07-01

    CFD (Computational Fluid Dynamics) is a powerful tool used when traditional engineering methods are not sufficient to address the complexity of a problem and the construction of prototypes is to be avoided. Natural ventilation and transport of hydrogen gas is a problem for which there are no models, based on experimental data or analytical expressions, that can capture the complex behaviour of the fluid, but it can be addressed by the use of CFD. (Author). 3 Refs.

  14. Analysis methods of neutrons induced resonances in the transmission experiments by time-of-flight and automation of these methods on IBM 7094 II computer; Methode d'analyse des resonances induites par les neutrons dans les experiences de transmission par temps-de-vol et automatisation de ces methodes sur ordinateur IBM-7094 II

    Energy Technology Data Exchange (ETDEWEB)

    Corge, C

    1967-07-01

    The analysis of neutron-induced resonances aims to determine the neutron characteristics: the excitation energies; the de-excitation probabilities by gamma radiation emission, by neutron emission or by fission; their spin; their parity... This document describes the methods developed or adapted, the calculation schemes, and the algorithms implemented to perform such analyses on a computer, using data obtained during time-of-flight experiments on the linear accelerator of Saclay. (A.L.B.)

  15. MENO-II: An AI-Based Programming Tutor.

    Science.gov (United States)

    Soloway, Elliot; And Others

    This report examines the features and performance of the BUG-FINDing component of MENO-II, a computer-based tutor for beginning PASCAL programming students. A discussion of the use of artificial intelligence techniques is followed by a summary of the system status and objectives. The two main components of MENO-II are described, beginning with the…

  16. Pius II. a utrakvismus

    OpenAIRE

    Šimek, Milan

    2009-01-01

    Milan Šimek, Pius II. a utrakvismus (Pius II and Utraquism). Based on an analysis of the sources, the thesis describes and analyses the change in attitude of Enea Silvio Piccolomini (Pius II) toward utraquism. The conclusions stress that Pius II did not change that attitude, but simply did not succeed in quelling the utraquist movement, against the political background that finally led to a fatal dissension between the two leaders, King Jiří of Poděbrady and Pope Pius II.

  17. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber
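
    As a small, concrete instance of the path-planning heuristics mentioned above, the following sketch runs A* search on a toy occupancy grid (an illustration of the general technique, not code from the reviewed literature):

      import heapq

      # A* path planning on a small occupancy grid (0 = walkable, 1 = blocked),
      # one of the standard heuristics behind pedestrian navigation models.
      def astar(grid, start, goal):
          rows, cols = len(grid), len(grid[0])
          h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan distance
          frontier = [(h(start), 0, start, [start])]               # (f, g, node, path)
          seen = set()
          while frontier:
              _, g, node, path = heapq.heappop(frontier)
              if node == goal:
                  return path
              if node in seen:
                  continue
              seen.add(node)
              r, c = node
              for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                  if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                      nxt = (nr, nc)
                      heapq.heappush(frontier, (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
          return None   # no route exists

      grid = [[0, 0, 0, 0],
              [1, 1, 0, 1],
              [0, 0, 0, 0]]
      print(astar(grid, (0, 0), (2, 0)))   # route around the obstacle row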

  18. Data handling at EBR-II [Experimental Breeder Reactor II] for advanced diagnostics and control work

    International Nuclear Information System (INIS)

    Lindsay, R.W.; Schorzman, L.W.

    1988-01-01

    Improved control and diagnostics systems are being developed for nuclear and other applications. The Experimental Breeder Reactor II (EBR-II) Division of Argonne National Laboratory has embarked on a project to upgrade the EBR-II control and data handling systems. The nature of the work at EBR-II requires that reactor plant data be readily available for experimenters, and that the plant control systems be flexible to accommodate testing and development needs. In addition, operational concerns require that improved operator interfaces and computerized diagnostics be included in the reactor plant control system. The EBR-II systems have been upgraded to incorporate new data handling computers, new digital plant process controllers, and new displays and diagnostics are being developed and tested for permanent use. In addition, improved engineering surveillance will be possible with the new systems

  19. Complexes of cobalt(II), nickel(II), copper(II), zinc(II), cadmium(II) and dioxouranium(II) with thiophene-2-aldehydethiosemicarbazone

    International Nuclear Information System (INIS)

    Singh, Balwan; Misra, Harihar

    1986-01-01

    Metal complexes of thiosemicarbazides have been known for their pharmacological applications. Significant antitubercular, fungicidal and antiviral activities have been reported for thiosemicarbazides and their derivatives. The present study describes the synthesis and characterisation of complexes of Co(II), Cu(II), Zn(II), Cd(II) and UO2(II) with the thiosemicarbazone obtained by condensing thiophene-2-aldehyde with thiosemicarbazide. 17 refs., 2 tables. (author)

  20. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and right to affirmation in Russian education. Methods. The research is based on an analysis of foreign studies of the phenomenon of computational thinking and of the ways of its formation in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved with the development of computer hardware and software. The practice-oriented interpretation of computational thinking which is dominant among educators is described, along with some ways of its formation. It is shown that computational thinking is a metasubject result of general education as well as its tool. From the point of view of the author, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described, a process connected with the evolution of computer and information technologies and with the increasing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  1. A Method for Transferring Photoelectric Photometry Data from Apple II+ to IBM PC

    Science.gov (United States)

    Powell, Harry D.; Miller, James R.; Stephenson, Kipp

    1989-06-01

    A method is presented for transferring photoelectric photometry data files from an Apple II computer to an IBM PC computer in a form which is compatible with the AAVSO Photoelectric Photometry data collection process.

  2. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy
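
    As a taste of the serial-communication topic the book covers, the sketch below opens a serial port, sends a command, and reads a reply using the third-party pyserial package. The port name, baud rate, and device command are all hypothetical:

      import serial   # third-party pyserial package (an assumed dependency)

      # Open a serial port, send a command, and read one reply line. The port
      # name, baud rate, and the instrument command are all hypothetical.
      with serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=1.0) as port:
          port.write(b"READ?\r\n")         # hypothetical instrument command
          reply = port.readline()          # bytes up to newline, or b"" on timeout
          print(reply.decode(errors="replace").strip())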

  3. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.
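
    In the spirit of the book's Python examples (though this sketch is not taken from the book itself), the fast Fourier transform can recover the frequency of a sampled signal in a few lines:

      import numpy as np

      # Recover the frequency of a sampled sine wave with the FFT.
      fs = 1000.0                                  # sampling rate, Hz
      t = np.arange(0, 1.0, 1.0 / fs)
      signal = np.sin(2 * np.pi * 50.0 * t)        # 50 Hz tone

      spectrum = np.abs(np.fft.rfft(signal))
      freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
      print(f"dominant frequency: {freqs[spectrum.argmax()]:.1f} Hz")   # -> 50.0 Hz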

  4. Computational physics

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  5. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  6. Computational Viscoelasticity

    CERN Document Server

    Marques, Severino P C

    2012-01-01

    This text is a guide to solving problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on the codes' structure and use, data preparation, and output interpretation and verification. The first part of the book introduces the reader to the subject and provides the models, equations and notation to be used in the computational applications. The second part presents the most important computational techniques: the finite element formulation, the boundary element formulation, and the solution of viscoelastic problems with Abaqus.
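
    A minimal numerical example of the underlying physics, independent of the commercial codes the book covers: explicit-Euler integration of the stress relaxation of a single Maxwell element, checked against the exact exponential solution. The material parameters are illustrative:

      import math

      # Stress relaxation of a Maxwell element after a strain step eps0:
      #   d(sigma)/dt = -sigma / tau,  exact solution  sigma(t) = E*eps0*exp(-t/tau).
      E, tau, eps0 = 1.0e9, 10.0, 0.01     # modulus (Pa), relaxation time (s), strain
      dt, steps = 0.01, 5000
      sigma = E * eps0                     # stress right after the strain step
      for _ in range(steps):
          sigma += dt * (-sigma / tau)     # explicit-Euler update
      t = steps * dt
      exact = E * eps0 * math.exp(-t / tau)
      print(f"numeric {sigma:.4g} Pa vs exact {exact:.4g} Pa at t = {t:.0f} s")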

  7. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.
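
    The correlative pattern recognition mentioned above has a direct digital analogue: cross-correlation computed with FFTs, which is what an optical correlator performs in hardware. A sketch with synthetic data:

      import numpy as np

      # Locate a template inside an image via FFT-based cross-correlation.
      rng = np.random.default_rng(0)
      image = rng.random((128, 128))
      template = image[40:56, 70:86].copy()           # known 16x16 patch at (40, 70)

      # Embed the zero-mean template at the origin of a full-size array, then
      # correlate: peak of ifft2(F(image) * conj(F(template))) marks the match.
      t = np.zeros_like(image)
      t[:16, :16] = template - template.mean()
      corr = np.real(np.fft.ifft2(np.fft.fft2(image) * np.conj(np.fft.fft2(t))))
      peak = np.unravel_index(corr.argmax(), corr.shape)
      print(peak)                                     # -> (40, 70)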

  8. Computational physics

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  9. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary. (3) Philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster’s second-order...

  10. Pertuzumab and Erlotinib in Patients With Relapsed Non-Small Cell Lung Cancer: A Phase II Study Using 18F-Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography Imaging

    Science.gov (United States)

    Mileshkin, Linda; Townley, Peter; Gitlitz, Barbara; Eaton, Keith; Mitchell, Paul; Hicks, Rodney; Wood, Katie; Amler, Lucas; Fine, Bernard M.; Loecke, David; Pirzkall, Andrea

    2014-01-01

    Background. Combination blockade of human epidermal growth factor receptor (HER) family signaling may confer enhanced antitumor activity than single-agent blockade. We performed a single-arm study of pertuzumab, a monoclonal antibody that inhibits HER2 dimerization, and erlotinib in relapsed non-small cell lung cancer (NSCLC). Methods. Patients received pertuzumab (840-mg loading dose and 420-mg maintenance intravenously every 3 weeks) and erlotinib (150-mg or 100-mg dose orally, daily). The primary endpoint was response rate (RR) by 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) at day 56 in all patients and those with EGFR wild-type tumors. Results. Of 41 patients, 28 (68.3%) experienced treatment-related grade ≥3 adverse events, including pneumatosis intestinalis (3 patients), resulting in early cessation of enrollment. Tissue samples from 32 patients showed mutated EGFR status in 9 of 41 (22%) and wild-type EGFR in 23 of 41 (56%). The FDG-PET RR for patients with assessments at day 56 was 19.5% in all patients (n = 41) and 8.7% in patients with wild-type EGFR NSCLC (n = 23). Investigator-assessed computed tomography RR at day 56 was 12.2%. Conclusion. FDG-PET suggests that pertuzumab plus erlotinib is an active combination, but combination therapy was poorly tolerated, which limits its clinical applicability. More research is warranted to identify drug combinations that disrupt HER receptor signaling but that exhibit improved tolerability profiles. PMID:24457379

  11. Essentials of cloud computing

    CERN Document Server

    Chandrasekaran, K

    2014-01-01

    Foreword. Preface. Computing Paradigms: Learning Objectives; Preamble; High-Performance Computing; Parallel Computing; Distributed Computing; Cluster Computing; Grid Computing; Cloud Computing; Biocomputing; Mobile Computing; Quantum Computing; Optical Computing; Nanocomputing; Network Computing; Summary; Review Points; Review Questions; Further Reading. Cloud Computing Fundamentals: Learning Objectives; Preamble; Motivation for Cloud Computing; The Need for Cloud Computing; Defining Cloud Computing; NIST Definition of Cloud Computing; Cloud Computing Is a Service; Cloud Computing Is a Platform; 5-4-3 Principles of Cloud Computing; Five Essential Charact...

  12. Personal Computers.

    Science.gov (United States)

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  13. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...

  14. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers (1) an intermediary step between any theoretical construct and its targeted empirical space and (2) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project, we also seek to introduce to scholars of religion some...

  15. Computational Controversy

    NARCIS (Netherlands)

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have

  16. Grid Computing

    Indian Academy of Sciences (India)

    IAS Admin

    emergence of supercomputers led to the use of computer simulation as an .... Scientific and engineering applications (e.g., TeraGrid secure gateway). Collaborative ... Encryption, privacy, protection from malicious software. Physical Layer.

  17. Computer tomographs

    International Nuclear Information System (INIS)

    Niedzwiedzki, M.

    1982-01-01

    Physical foundations of, and developments in, transmission and emission computed tomography are presented. On the basis of the available literature and private communications, a comparison is made of the various transmission tomographs. A new technique of emission computed tomography (ECT), so far unknown in Poland, is described. An evaluation is made of two ECT methods, namely positron emission tomography and single-photon emission tomography. (author)

  18. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of state-of-the-art research in Computational Sustainability, as well as case studies of different application scenarios. These cover topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  19. Computing farms

    International Nuclear Information System (INIS)

    Yeh, G.P.

    2000-01-01

    High-energy physics, nuclear physics, space sciences, and many other fields have large challenges in computing. In recent years, PCs have achieved performance comparable to the high-end UNIX workstations, at a small fraction of the price. We review the development and broad applications of commodity PCs as the solution to CPU needs, and look forward to the important and exciting future of large-scale PC computing

  20. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles, and to a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials, with and without the presence of gases. Computational chemistry has application in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.
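
    As a trivial example of the reaction-rate quantities mentioned, the Arrhenius expression k = A·exp(−Ea/RT) can be evaluated directly; the pre-exponential factor and activation energy below are assumed illustrative values, not NASA data:

      import math

      # Arrhenius rate constant k = A * exp(-Ea / (R * T)).
      A  = 1.0e13          # pre-exponential factor, 1/s (assumed)
      Ea = 100.0e3         # activation energy, J/mol (assumed)
      R  = 8.314           # gas constant, J/(mol*K)

      for T in (300.0, 500.0, 1000.0):
          k = A * math.exp(-Ea / (R * T))
          print(f"T = {T:6.1f} K  k = {k:.3e} 1/s")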

  1. Accident analysis for PRC-II reactor

    International Nuclear Information System (INIS)

    Wei Yongren; Tang Gang; Wu Qing; Lu Yili; Liu Zhifeng

    1997-12-01

    The computer codes, calculation models, transient results, sensitivity research, design improvements, and safety evaluation used in the accident analysis for the PRC-II Reactor (the Second Pulsed Reactor in China) are introduced. The PRC-II Reactor is built in a large, populous city, so the public pays close attention to reactor safety. Consequently, some hypothetical accidents are analyzed, including an uncontrolled control-rod withdrawal at rated power, a pulse-rod ejection at rated power, and a loss-of-coolant accident. A calculation model that completely depicts the principle and process of each accident is established, and the relevant analysis code is developed. This work also includes comprehensively computing and analyzing the transients of each accident of the PRC-II Reactor, studying the influence of various sensitive parameters on reactor safety, and evaluating the function of the engineered safety features. Measures to alleviate the consequences of accidents are suggested and taken in the construction design of the PRC-II Reactor, and the safety properties of the reactor are comprehensively evaluated. A new advanced calculation model (True Core Uncovered Model) of the LOCA of the PRC-II Reactor and the relevant code (MCRLOCA) are first put forward.

  2. Quininium tetrachloridozinc(II

    Directory of Open Access Journals (Sweden)

    Li-Zhuang Chen

    2009-10-01

    Full Text Available The asymmetric unit of the title compound {systematic name: 2-[hydroxy(6-methoxyquinolin-1-ium-4-yl)methyl]-8-vinylquinuclidin-1-ium tetrachloridozinc(II)}, (C20H26N2O2)[ZnCl4], consists of a doubly protonated quininium cation and a tetrachloridozinc(II) anion. The Zn(II) ion is in a slightly distorted tetrahedral coordination environment. The crystal structure is stabilized by intermolecular N—H...Cl and O—H...Cl hydrogen bonds.

  3. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  4. Burkina Faso - BRIGHT II

    Data.gov (United States)

    Millennium Challenge Corporation — Millennium Challenge Corporation hired Mathematica Policy Research to conduct an independent evaluation of the BRIGHT II program. The three main research questions...

  5. Synchrotron power supply of TARN II

    International Nuclear Information System (INIS)

    Watanabe, Shin-ichi.

    1991-07-01

    The construction and performance of the synchrotron power supply of TARN II are described. The 1.1 GeV synchrotron-cooler TARN II has been constructed at the Institute for Nuclear Study, University of Tokyo. The power supply constructed for the dipole magnets is rated at 600 V and 2500 A, operated with a trapezoidal waveform at a repetition cycle of 0.1 Hz. A magnetic field stability within 10⁻³ and a tracking error of 10⁻⁴ have been attained with the aid of a computer control system. The first trial of synchrotron acceleration of a He2+ beam, up to 600 MeV, was carried out in April 1991. (author)

  6. Computer-based Astronomy Labs for Non-science Majors

    Science.gov (United States)

    Smith, A. B. E.; Murray, S. D.; Ward, R. A.

    1998-12-01

    We describe and demonstrate two laboratory exercises, Kepler's Third Law and Stellar Structure, which are being developed for use in an astronomy laboratory class aimed at non-science majors. The labs run with Microsoft's Excel 98 (Macintosh) or Excel 97 (Windows). They can be run in a classroom setting or in an independent learning environment. The intent of the labs is twofold; first and foremost, students learn the subject matter through a series of informational frames. Next, students enhance their understanding by applying their knowledge in lab procedures, while also gaining familiarity with the use and power of a widely-used software package and scientific tool. No mathematical knowledge beyond basic algebra is required to complete the labs or to understand the computations in the spreadsheets, although the students are exposed to the concepts of numerical integration. The labs are contained in Excel workbook files. In the files are multiple spreadsheets, which contain either a frame with information on how to run the lab, material on the subject, or one or more procedures. Excel's VBA macro language is used to automate the labs. The macros are accessed through button interfaces positioned on the spreadsheets. This is done intentionally so that students can focus on learning the subject matter and the basic spreadsheet features without having to learn advanced Excel features all at once. Students open the file and progress through the informational frames to the procedures. After each procedure, student comments and data are automatically recorded in a preformatted Lab Report spreadsheet. Once all procedures have been completed, the student is prompted for a filename in which to save their Lab Report. The lab reports can then be printed or emailed to the instructor. The files will have full worksheet and workbook protection, and will have a "redo" feature at the end of the lab for students who want to repeat a procedure.
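
    For comparison with the labs' Excel/VBA implementation, Kepler's third law in solar units takes only a few lines of Python; the semi-major axes are standard textbook values:

      # Kepler's third law in solar units: P^2 = a^3, with the period P in
      # years and the semi-major axis a in astronomical units.
      for name, a in (("Mercury", 0.387), ("Earth", 1.000), ("Jupiter", 5.204)):
          period = a ** 1.5
          print(f"{name:8s} a = {a:5.3f} AU  P = {period:6.3f} yr")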

  7. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from
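
    The entangled states at the heart of this review can be simulated directly with small matrices. The sketch below prepares a Bell state with a Hadamard and a CNOT gate and prints the measurement statistics (an illustration, not part of the review):

      import numpy as np

      # Build the Bell state (|00> + |11>)/sqrt(2) from |00> with H and CNOT.
      H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
      I = np.eye(2)
      CNOT = np.array([[1, 0, 0, 0],
                       [0, 1, 0, 0],
                       [0, 0, 0, 1],
                       [0, 0, 1, 0]])

      state = np.array([1, 0, 0, 0], dtype=float)     # |00>
      state = CNOT @ (np.kron(H, I) @ state)
      probs = state ** 2                              # measurement probabilities
      print(dict(zip(["00", "01", "10", "11"], probs.round(3))))
      # -> {'00': 0.5, '01': 0.0, '10': 0.0, '11': 0.5}: perfectly correlated outcomes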

  8. Quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  9. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study a problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has unbounded computing power. The thesis is based on two...... to a polynomial time black-box reduction, the complexity of adaptively secure VSS is the same as that of ordinary secret sharing (SS), where security is only required against a passive, static adversary. Previously, such a connection was only known for linear secret sharing and VSS schemes. We then show...... here and discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and the proofs of their security are given. [1] Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority...
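
    The secret-sharing primitive underlying VSS can be sketched in a few lines. Below is plain Shamir (t, n) sharing over a prime field, without the verifiability machinery the thesis is actually about; the prime and parameters are illustrative:

      import random

      # Shamir (t, n) secret sharing: a degree-(t-1) polynomial with the secret
      # as constant term; any t shares reconstruct it by Lagrange interpolation.
      P = 2**31 - 1   # a Mersenne prime defining the field

      def share(secret, t, n):
          coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
          f = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
          return [(x, f(x)) for x in range(1, n + 1)]

      def reconstruct(shares):
          secret = 0
          for i, (xi, yi) in enumerate(shares):
              num = den = 1
              for j, (xj, _) in enumerate(shares):
                  if i != j:
                      num = num * (-xj) % P
                      den = den * (xi - xj) % P
              secret = (secret + yi * num * pow(den, P - 2, P)) % P   # den^-1 mod P
          return secret

      shares = share(123456789, t=3, n=5)
      print(reconstruct(shares[:3]))   # any 3 of the 5 shares recover the secret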

  10. Scientific computing

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...

  11. Computational Psychiatry

    Science.gov (United States)

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically-realistic modeling bridging cellular and synaptic mechanisms with behavior, model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  12. cobalt(II), nickel(II)

    Indian Academy of Sciences (India)

    Unknown

    procedures. The supporting electrolyte, NaClO4, used in the voltammetric experiment was purchased from Sigma. IR spectra were recorded in KBr medium on .... (13⋅6). L = Schiff base ligand ... form of one broad band envelope. The electronic spectra of the Co(II) complex showed two spin-allowed transitions at 17856 and ...

  13. Computational artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to try to clarify the issue and in doing so revisits and reconsiders the notion of ‘computational artifact’.

  14. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  15. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope is hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  16. Computational Logistics

    DEFF Research Database (Denmark)

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  17. Computational Logistics

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  18. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  19. Computer busses

    CERN Document Server

    Buchanan, William

    2000-01-01

    As more and more equipment is interface- or 'bus'-driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for the particular application. There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed using a top-down approach, helpin

  20. Reconfigurable Computing

    CERN Document Server

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a comp