WorldWideScience

Sample records for macintosh ii computer

  1. Response time accuracy in Apple Macintosh computers.

    Science.gov (United States)

    Neath, Ian; Earle, Avery; Hallett, Darcy; Surprenant, Aimée M

    2011-06-01

    The accuracy and variability of response times (RTs) collected on stock Apple Macintosh computers using USB keyboards were assessed. A photodiode detected a change in the screen's luminosity and triggered a solenoid that pressed a key on the keyboard. The RTs collected in this way were reliable, but could be as much as 100 ms too long. The standard deviation of the measured RTs varied between 2.5 and 10 ms, and the distributions approximated a normal distribution. Surprisingly, two recent Apple-branded USB keyboards differed in accuracy by as much as 20 ms. The most accurate RTs were collected when an external CRT was used to display the stimuli and Psychtoolbox was able to synchronize presentation with the screen refresh. We conclude that RTs collected on stock iMacs can detect a difference as small as 5-10 ms under realistic conditions, and this dictates which types of research should or should not use these systems.
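    The practical question in the abstract is whether, given a constant keyboard delay and 2.5-10 ms of timing jitter, a small true RT difference still shows up in the sample means. A minimal simulation sketch (all parameter values below are assumptions drawn from the figures reported above, not from the study's data):

```python
import random
import statistics

random.seed(42)

def measured_rts(true_mean_ms, n, noise_sd_ms=10.0, offset_ms=100.0):
    """Simulate RTs as a stock setup might record them: a constant
    delay (up to ~100 ms per the abstract) plus Gaussian jitter
    (worst-case SD of 10 ms assumed here)."""
    return [true_mean_ms + offset_ms + random.gauss(0.0, noise_sd_ms)
            for _ in range(n)]

# Two conditions whose true mean RTs differ by 5 ms.
a = measured_rts(300.0, 2000)
b = measured_rts(305.0, 2000)

# The constant delay cancels in the comparison; the 5 ms
# difference survives the jitter once enough trials are averaged.
diff = statistics.mean(b) - statistics.mean(a)
print(round(diff, 1))
```

    The constant delay biases absolute RTs but not within-experiment differences, which is why a 5-10 ms effect can remain detectable despite a 100 ms offset.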

  2. Coping with Computer Viruses: General Discussion and Review of Symantec Anti-Virus for the Macintosh.

    Science.gov (United States)

    Primich, Tracy

    1992-01-01

    Discusses computer viruses that attack the Macintosh and describes Symantec AntiVirus for Macintosh (SAM), a commercial program designed to detect and eliminate viruses; sample screen displays are included. SAM, along with two public domain virus protection programs, is recommended for use in library settings. (four references) (MES)

  3. Macintosh Plus

    CERN Multimedia

    1986-01-01

    Apple introduced the Macintosh Plus on January 16, 1986. The Macintosh Plus has an 8 MHz 68000 processor and an internal 800K floppy disk drive, and it supports up to 4 MB of RAM. The Plus is a significant improvement over the previous compact Macs primarily due to the addition of the SCSI bus. Previous Macs did not have SCSI, making it more difficult to find a suitable external hard drive; such drives had to connect through the drive port, the printer port, or the modem port, and are considerably slower (as much as four times slower) than external SCSI hard drives. The Macintosh Plus is a very important computer in the history of Apple computers: it established many of the standards that Apple followed for over a decade.

  4. An Evaluation of Windows-Based Computer Forensics Application Software Running on a Macintosh

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2008-09-01

    Full Text Available The two most common computer forensics applications run exclusively on Microsoft Windows operating systems, yet contemporary computer forensics examinations frequently encounter one or more of the three most common operating system environments, namely Windows, OS X, or some form of UNIX or Linux. Additionally, government and private computer forensics laboratories frequently encounter budget constraints that limit their access to computer hardware. Currently, Macintosh computer systems are marketed with the ability to accommodate these three common operating system environments, including Windows XP in native and virtual environments. We performed a series of experiments to measure the functionality and performance of the two most commonly used Windows-based computer forensics applications on a Macintosh running Windows XP in native mode and in two virtual environments, relative to a similarly configured Dell personal computer. The research results are directly beneficial to practitioners, and the process illustrates effective pedagogy whereby students were engaged in applied research.

  5. System Software 7 Macintosh

    CERN Multimedia

    1991-01-01

    System 7 is a single-user, graphical user interface-based operating system for Macintosh computers and was part of the classic Mac OS line of operating systems. It was introduced on May 13, 1991, by Apple Computer. It succeeded System 6 and was the main Macintosh operating system until it was succeeded by Mac OS 8 in 1997. Features added with the System 7 release included virtual memory, personal file sharing, QuickTime, QuickDraw 3D, and an improved user interface. System 7 was the first major evolution of the Macintosh system, bringing a significantly improved user interface, better stability, and many new features such as the ability to run multiple applications at the same time. "System 7" is the last Macintosh operating system name to contain the word "system"; later Macintosh operating systems were called "Mac OS" (for Macintosh Operating System).

  6. Macintoshed Libraries 5. Fifth Edition.

    Science.gov (United States)

    Valauskas, Edward J., Ed.; Vaccaro, Bill, Ed.

    This annual collection contains 16 papers about the use of Macintosh computers in libraries which include: "New Horizons in Library Training: Using HyperCard for Computer-Based Staff Training" (Pauline S. Bayne and Joe C. Rader); "Get a Closet!" (Ron Berntson); "Current Periodicals: Subject Access the Mac Way"…

  7. Airtraq® versus Macintosh laryngoscope: A comparative study in tracheal intubation

    Science.gov (United States)

    Bhandari, Geeta; Shahi, K. S.; Asad, Mohammad; Bhakuni, Rajani

    2013-01-01

    Background: The curved laryngoscope blade described by Macintosh in 1943 remains the most widely used device to facilitate tracheal intubation. The Airtraq® (Prodol Meditec S.A, Vizcaya, Spain) is a new, single-use, indirect laryngoscope introduced into clinical practice in 2005. It has an exaggerated blade curvature with an internal arrangement of optical lenses and a mechanism to prevent fogging of the distal lens. A high-quality view of the glottis is provided without the need to align the oral, pharyngeal and tracheal axes. We evaluated the Airtraq and Macintosh laryngoscopes for success rate of tracheal intubation, overall duration of successful intubation, optimization maneuvers, POGO (percentage of glottic opening) score, and ease of intubation. Materials and Methods: Patients were randomly allocated by computer-generated random table to one of two groups of 40 patients each: group I (Airtraq) and group II (Macintosh). After induction of general anesthesia, tracheal intubation was attempted with the Airtraq or the Macintosh laryngoscope as per group. Primary end points were overall success rate of tracheal intubation, overall duration of successful tracheal intubation, optimization maneuvers, POGO score, and ease of intubation between the two groups. Results: Duration of successful intubation was shorter with the Airtraq (18.15 ±2.74 seconds) than with the Macintosh laryngoscope (32.72 ±8.31 seconds), P < 0.001. POGO score was also better in the Airtraq group (100% grade 1 versus 67.5% in the Macintosh group, P < 0.001). Intubation was rated easy in 97.5% of the Airtraq group versus 42.5% of the Macintosh group, P < 0.001. Conclusion: Both the Airtraq and Macintosh laryngoscopes are effective for tracheal intubation in normal airways. Duration of successful tracheal intubation was shorter in the Airtraq group, and the difference was statistically significant. PMID:25885839
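    The headline comparison above can be checked from the summary statistics alone. A sketch recomputing Welch's t statistic from the reported means and SDs (n = 40 per group; the helper function name is ours):

```python
import math

def welch_t(m1, sd1, n1, m2, sd2, n2):
    """Welch's t statistic for two independent groups,
    computed from summary statistics only."""
    return (m1 - m2) / math.sqrt(sd1**2 / n1 + sd2**2 / n2)

# Duration of successful intubation: Macintosh 32.72 +/- 8.31 s
# versus Airtraq 18.15 +/- 2.74 s, 40 patients per group.
t = welch_t(32.72, 8.31, 40, 18.15, 2.74, 40)
print(round(t, 1))  # about 10.5; consistent with the reported P < 0.001
```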

  8. Think Different: Applying the Old Macintosh Mantra to the Computability of the SUSY Auxiliary Field Problem

    CERN Document Server

    Calkins, Mathew; Gates, S. James; Golding, William M

    2015-01-01

    Starting with valise supermultiplets obtained from 0-branes plus field redefinitions, valise adinkra networks, and the "Garden Algebra," we discuss an architecture for algorithms that (starting from on-shell theories and, through a well-defined computation procedure), search for off-shell completions. We show in one dimension how to directly attack the notorious "off-shell auxiliary field" problem of supersymmetry with algorithms in the adinkra network-world formulation.

  9. Think different: applying the old macintosh mantra to the computability of the SUSY auxiliary field problem

    Energy Technology Data Exchange (ETDEWEB)

    Calkins, Mathew; Gates, D.E.A.; Gates, S. James Jr. [Center for String and Particle Theory, Department of Physics, University of Maryland,College Park, MD 20742-4111 (United States); Golding, William M. [Sensors and Electron Devices Directorate, US Army Research Laboratory,Adelphi, Maryland 20783 (United States)

    2015-04-13

    Starting with valise supermultiplets obtained from 0-branes plus field redefinitions, valise adinkra networks, and the “Garden Algebra,” we discuss an architecture for algorithms that (starting from on-shell theories and, through a well-defined computation procedure), search for off-shell completions. We show in one dimension how to directly attack the notorious “off-shell auxiliary field” problem of supersymmetry with algorithms in the adinkra network-world formulation.

  10. Think different: applying the old macintosh mantra to the computability of the SUSY auxiliary field problem

    Science.gov (United States)

    Calkins, Mathew; Gates, D. E. A.; Gates, S. James; Golding, William M.

    2015-04-01

    Starting with valise supermultiplets obtained from 0-branes plus field redefinitions, valise adinkra networks, and the "Garden Algebra," we discuss an architecture for algorithms that (starting from on-shell theories and, through a well-defined computation procedure), search for off-shell completions. We show in one dimension how to directly attack the notorious "off-shell auxiliary field" problem of supersymmetry with algorithms in the adinkra network-world formulation.

  11. Scientific Graphical Displays on the Macintosh

    Energy Technology Data Exchange (ETDEWEB)

    Grotch, S. [Lawrence Livermore National Lab., CA (United States)

    1994-11-15

    In many organizations scientists have ready access to more than one computer, often both a workstation (e.g., SUN, HP, SGI) as well as a Macintosh or other PC. The scientist commonly uses the workstation for "number-crunching" and data analysis whereas the Macintosh is relegated to either word processing or serves as a "dumb terminal" to a larger mainframe computer. In an informal poll of my colleagues, very few of them used their Macintoshes for either statistical analysis or for graphical data display. I believe that this state of affairs is particularly unfortunate because over the last few years both the computational capability, and even more so, the software availability for the Macintosh have become quite formidable. In some instances, very powerful tools are now available on the Macintosh that may not exist (or be far too costly) on the so-called "high end" workstations. Many scientists are simply unaware of the wealth of extremely useful, "off-the-shelf" software that already exists on the Macintosh for scientific graphical and statistical analysis.

  12. A user's guide to LUGSAN II. A computer program to calculate and archive lug and sway brace loads for aircraft-carried stores

    Energy Technology Data Exchange (ETDEWEB)

    Dunn, W.N. [Sandia National Labs., Albuquerque, NM (United States). Mechanical and Thermal Environments Dept.

    1998-03-01

    LUG and Sway brace ANalysis (LUGSAN) II is an analysis and database computer program that is designed to calculate store lug and sway brace loads for aircraft captive carriage. LUGSAN II combines the rigid body dynamics code, SWAY85, with a Macintosh HyperCard database to function both as an analysis and archival system. This report describes the LUGSAN II application program, which operates on the Macintosh System (HyperCard 2.2 or later) and includes function descriptions, layout examples, and sample sessions. Although this report is primarily a user's manual, a brief overview of the LUGSAN II computer code is included with suggested resources for programmers.

  13. Vectronic's Power Macintosh G3 (B & W)

    CERN Multimedia

    1999-01-01

    Apple introduced the Power Macintosh G3 Blue and White (B & W) on January 5, 1999. The Power Macintosh G3 line stayed in production until August 1999, and was replaced by the Power Macintosh G4, which used the same chassis. The Power Macintosh G3 originally cost between $1599 and $2900 depending on options. The three original Power Macintosh G3 models shipped with a 300 MHz, 350 MHz, or 400 MHz PowerPC 750 (G3) processor. Just pull on the small round handle on the side of the tower, and the entire side of the computer opens up. The G3's motherboard is mounted on that surface, giving you easy access for upgrading RAM or installed PCI cards. Apple added new ports (USB and the much-anticipated FireWire) that took the place of historic, and quickly becoming antiquated, Mac serial (printer and modem) ports. The Power Macintosh G3 has two USB (12 Mbps) ports, two FireWire (400 Mbps) ports, one 10/100BaseT Ethernet port, an RJ-11 jack for an optional 56K modem, a sound out and sound in jack, and one ADB (Apple D...

  14. Scientific statistics and graphics on the Macintosh

    Energy Technology Data Exchange (ETDEWEB)

    Grotch, S.L.

    1994-09-01

    In many organizations scientists have ready access to more than one computer, often both a workstation (e.g., SUN, HP, SGI) as well as a Macintosh or other PC. The scientist commonly uses the workstation for "number-crunching" and data analysis whereas the Macintosh is relegated to either word processing or serves as a "dumb terminal" to a larger mainframe computer. In an informal poll of the author's colleagues, very few of them used their Macintoshes for either statistical analysis or for graphical data display. The author believes that this state of affairs is particularly unfortunate because over the last few years both the computational capability, and even more so, the software availability for the Macintosh have become quite formidable. In some instances, very powerful tools are now available on the Macintosh that may not exist (or be far too costly) on the so-called "high end" workstations. Many scientists are simply unaware of the wealth of extremely useful, "off-the-shelf" software that already exists on the Macintosh for scientific graphical and statistical analysis. This paper is a very personal view illustrating several such software packages that have proved valuable in the author's own work in the analysis and display of climatic datasets. It is not meant to be an all-inclusive enumeration, nor is it an endorsement of these products as the "best" of their class. Rather, the author has found through extensive use that these few packages were generally capable of satisfying his particular needs for both statistical analysis and graphical data display. In the limited space available, the focus will be on some of the more novel features found to be of value.

  15. Simulation Modeling on the Macintosh using STELLA.

    Science.gov (United States)

    Costanza, Robert

    1987-01-01

    Describes a new software package for the Apple Macintosh computer which can be used to create elaborate simulation models in a fraction of the time usually required without using a programming language. Illustrates the use of the software which relates to water usage. (TW)

  16. Computer science II essentials

    CERN Document Server

    Raus, Randall

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As the name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams or doing homework, and will remain a lasting reference source for students, teachers, and professionals. Computer Science II includes organization of a computer, memory and input/output, coding, data structures, and program development. Also included is an overview of the most commonly

  17. Power Macintosh 7300/166

    CERN Multimedia

    1997-01-01

    The Power Macintosh 7300 was released in 1997 and used the same case as the Power Macintosh 7600. Its main advance was a faster processor. It also had a bigger hard drive (2 GB) and a faster CD-ROM drive (12x versus 8x). In exchange, Apple chose to remove the audiovisual connections that were present on all its predecessors in the 7x00 range.

  18. DNA based computers II

    CERN Document Server

    Landweber, Laura F; Baum, Eric B

    1998-01-01

    The fledgling field of DNA computers began in 1994 when Leonard Adleman surprised the scientific community by using DNA molecules, protein enzymes, and chemicals to solve an instance of a hard computational problem. This volume presents results from the second annual meeting on DNA computers held at Princeton only one and one-half years after Adleman's discovery. By drawing on the analogy between DNA computing and cutting-edge fields of biology (such as directed evolution), this volume highlights some of the exciting progress in the field and builds a strong foundation for the theory of molecular computation. DNA computing is a radically different approach to computing that brings together computer science and molecular biology in a way that is wholly distinct from other disciplines. This book outlines important advances in the field and offers comprehensive discussion on potential pitfalls and the general practicality of building DNA based computers.

  19. SAGE FOR MACINTOSH (MSAGE) VERSION 1.0 SOLVENT ALTERNATIVES GUIDE - USER'S GUIDE

    Science.gov (United States)

    The guide provides instructions for using the Solvent Alternatives Guide (SAGE) for Macintosh, version 1.0. The guide assumes that the user is familiar with the fundamentals of operating a Macintosh personal computer under the System 7.0 (or higher) operating system. SAGE for ...

  20. Hamlet on the Macintosh: An Experimental Seminar That Worked.

    Science.gov (United States)

    Strange, William C.

    1987-01-01

    Describes experimental college Shakespeare seminar that used Macintosh computers and software called ELIZA and ADVENTURE to develop character dialogs and adventure games based on Hamlet's characters and plots. Programming languages are examined, particularly their relationship to metaphor, and the use of computers in humanities is discussed. (LRW)

  2. Computing Borel's Regulator II

    CERN Document Server

    Choo, Zacky; Sánchez-García, Rubén J; Snaith, Victor P

    2009-01-01

    In our earlier article we described a power series formula for the Borel regulator evaluated on the odd-dimensional homology of the general linear group of a number field and, concentrating on dimension three for simplicity, described a computer algorithm which calculates the value to any chosen degree of accuracy. In this sequel we give an algorithm for the construction of the input homology classes and describe the results of one cyclotomic field computation.

  3. GRID Computing at Belle II

    CERN Document Server

    Bansal, Vikas

    2015-01-01

    The Belle II experiment at the SuperKEKB collider in Tsukuba, Japan, will start physics data taking in 2018 and will accumulate 50 ab$^{-1}$ of e$^{+}$e$^{-}$ collision data, about 50 times larger than the data set of the earlier Belle experiment. The computing requirements of Belle II are comparable to those of a Run I high-p$_T$ LHC experiment. Belle II computing will make full use of computing grids in North America, Asia, Europe, and Australia, and of high-speed networking. Results of an initial MC simulation campaign with 3 ab$^{-1}$ equivalent luminosity will be described.

  4. MacMath 92 a dynamical systems software package for the Macintosh

    CERN Document Server

    Hubbard, John H

    1993-01-01

    MacMath is a scientific toolkit for the Macintosh computer consisting of twelve graphics programs. It supports mathematical computation and experimentation in dynamical systems, both for differential equations and for iteration. The MacMath package was designed to accompany the textbooks Differential Equations: A Dynamical Systems Approach Part I & II. The text and software were developed for a junior-senior level course in applicable mathematics at Cornell University, in order to take advantage of excellent and easily accessible graphics. MacMath addresses differential equations and iteration with programs such as: analyzer, diffeq, phase plane, diffeq 3D views, numerical methods, periodic differential equations, cascade, 2D iteration, eigenfinder, jacobidraw, fourier, planets. These versatile programs greatly enhance the understanding of the mathematics in these topics. Qualitative analysis of the picture leads to quantitative results and even to new mathematics. This new edition includes the latest version of the Mac...

  5. Computer Bits: The Ideal Computer System for Your Center.

    Science.gov (United States)

    Brown, Dennis; Neugebauer, Roger

    1986-01-01

    Reviews five computer systems that can address the needs of a child care center: (1) Sperry PC IT with Bernoulli Box, (2) Compaq DeskPro 286, (3) Macintosh Plus, (4) Epson Equity II, and (5) Leading Edge Model "D." (HOD)

  6. LERC-SLAM - THE NASA LEWIS RESEARCH CENTER SATELLITE LINK ATTENUATION MODEL PROGRAM (MACINTOSH VERSION)

    Science.gov (United States)

    Manning, R. M.

    1994-01-01

    antenna required to establish a link with the satellite, the statistical parameters that characterize the rainrate process at the terminal site, the length of the propagation path within the potential rain region, and its projected length onto the local horizontal. The IBM PC version of LeRC-SLAM (LEW-14979) is written in Microsoft QuickBASIC for an IBM PC compatible computer with a monitor and printer capable of supporting an 80-column format. The IBM PC version is available on a 5.25 inch MS-DOS format diskette. The program requires about 30K RAM. The source code and executable are included. The Macintosh version of LeRC-SLAM (LEW-14977) is written in Microsoft Basic, Binary (b) v2.00 for Macintosh II series computers running MacOS. This version requires 400K RAM and is available on a 3.5 inch 800K Macintosh format diskette, which includes source code only. The Macintosh version was developed in 1987 and the IBM PC version was developed in 1989. IBM PC is a trademark of International Business Machines. MS-DOS is a registered trademark of Microsoft Corporation. Macintosh is a registered trademark of Apple Computer, Inc.

  7. Library Signage: Applications for the Apple Macintosh and MacPaint.

    Science.gov (United States)

    Diskin, Jill A.; FitzGerald, Patricia

    1984-01-01

    Describes specific applications of the Macintosh computer at Carnegie-Mellon University Libraries, where MacPaint was used as a flexible, easy to use, and powerful tool to produce informational, instructional, and promotional signage. Profiles of system hardware and software, an evaluation of the computer program MacPaint, and MacPaint signage…

  8. From Newton to Mandelbrot a primer in theoretical physics with fractals for the Macintosh

    CERN Document Server

    Stauffer, Dietrich

    1996-01-01

    From Newton to Mandelbrot: A Primer in Theoretical Physics with Fractals for the Macintosh takes the student on a tour of the most important landmarks of theoretical physics: classical, quantum, and statistical mechanics, relativity, electrodynamics, and, the most modern and exciting of all, the physics of fractals. The treatment is confined to the essentials of each area, and short computer programs, numerous problems, and beautiful color illustrations round off this unusual textbook. Ideally suited for a one-year course in theoretical physics, it will also prove useful in preparing and revising for exams. This edition is corrected and includes a new appendix on elementary particle physics, answers to all short questions, and a Macintosh diskette where a selection of executable programs exploring the fractal concept can be found. The Diskette: The program FRACTAL DIMENSION can be used on any 68030-, 68040-, or PowerPC-based Macintosh with 4 MB RAM and a 256-color display running System 6.7 - 7.5. Sierpinski gasket ...

  9. Macintosh Troubleshooting Pocket Guide for Mac OS

    CERN Document Server

    Lerner, David; Corporation, Tekserve

    2009-01-01

    The Macintosh Troubleshooting Pocket Guide covers the most common hardware and software problems users encounter. It's not just a book for Mac OS X (although it includes tips for OS X and Jaguar); it's for anyone who owns a Mac of any type: there are software tips going back as far as OS 6. This slim guide distills the answers to the urgent questions that Tekserve's employees answer every week into a handy guide that fits in your back pocket or alongside your keyboard.

  10. Office 2008 for Macintosh The Missing Manual

    CERN Document Server

    Elferdink, Jim

    2008-01-01

    Though Office 2008 has been improved to take advantage of the latest Mac OS X features, you don't get a single page of printed instructions to guide you through the changes. Office 2008 for Macintosh: The Missing Manual gives you the friendly and thorough introduction you need, whether you're a beginner who can't do more than point and click, or a power user who's ready for a few advanced techniques.

  11. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (MACINTOSH VERSION)

    Science.gov (United States)

    Culbert, C.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches the number of fields. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. 
The PC version and the Macintosh
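    The forward-chaining strategy described above (match conditions against known facts, fire rules, assert conclusions, repeat) can be illustrated with a naive sketch. This is not CLIPS syntax, and it omits the Rete optimization entirely; the rules and facts are invented for illustration:

```python
# Naive forward chaining: repeatedly fire any rule whose conditions
# are all satisfied by known facts, asserting its conclusion,
# until no new facts appear. (CLIPS uses the far more efficient
# Rete algorithm; this shows only the control strategy.)
rules = [
    ({"duck"}, "feathers"),
    ({"feathers", "flies"}, "bird"),
]
facts = {"duck", "flies"}

changed = True
while changed:
    changed = False
    for conditions, conclusion in rules:
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)
            changed = True

print(sorted(facts))  # ['bird', 'duck', 'feathers', 'flies']
```

    The loop reaches a fixed point once no rule can add a new fact; Rete gains its efficiency by remembering partial matches between iterations instead of rescanning every rule against every fact.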

  12. Office X for Macintosh the missing manual

    CERN Document Server

    Barber, Nan; Reynolds, David

    2002-01-01

    Mac OS X, Apple's super-advanced, Unix-based operating system, offers every desirable system-software feature known to humans. But without a compatible software library, the Mac of the future was doomed. Microsoft Office X for Macintosh is exactly the software suite most Mac fans were waiting for. Its four programs--Word, Excel, PowerPoint, and Entourage--have been completely overhauled to take advantage of the stunning looks and rock-like stability of Mac OS X. But this magnificent package comes without a single page of printed instructions. Fortunately, Pogue Press/O'Reilly is once again

  13. DEMAID - A DESIGN MANAGER'S AID FOR INTELLIGENT DECOMPOSITION (MACINTOSH VERSION)

    Science.gov (United States)

    Rogers, J. L.

    1994-01-01

    effects of an output with respect to a change in a particular input. The second method traces backward to determine what modules must be re-executed if the output of a module must be recomputed. DeMAID is available in three machine versions: a Macintosh version which is written in Symantec's Think C 3.01, a Sun version, and an SGI IRIS version, both of which are written in C language. The Macintosh version requires system software 6.0.2 or later and CLIPS 4.3. The source code for the Macintosh version will not compile under version 4.0 of Think C; however, a sample executable is provided on the distribution media. QuickDraw is required for plotting. The Sun version requires GKS 4.1 graphics libraries, OpenWindows 3, and CLIPS 4.3. The SGI IRIS version requires CLIPS 4.3. Since DeMAID is not compatible with CLIPS 5.0 or later, the source code for CLIPS 4.3 is included on the distribution media; however, the documentation for CLIPS 4.3 is not included in the documentation package for DeMAID. It is available from COSMIC separately as the documentation for MSC-21208. The standard distribution medium for the Macintosh version of DeMAID is a set of four 3.5 inch 800K Macintosh format diskettes. The standard distribution medium for the Sun version of DeMAID is a .25 inch streaming magnetic tape cartridge (QIC-24) in UNIX tar format. The standard distribution medium for the IRIS version is a .25 inch IRIX compatible streaming magnetic tape cartridge in UNIX tar format. All versions include sample input. DeMAID was originally developed for use on VAX VMS computers in 1989. The Macintosh version of DeMAID was released in 1991 and updated in 1992. The Sun version of DeMAID was released in 1992 and updated in 1993. The SGI IRIS version was released in 1993.

  14. A CLINICAL ASSESSMENT OF MACINTOSH BLADE, MILLER BLADE AND KING VISIONTM VIDEOLARYNGOSCOPE FOR LARYNGEAL EXPOSURE AND DIFFICULTY IN ENDOTRACHEAL INTUBATION

    Directory of Open Access Journals (Sweden)

    Apoorva Mahendera

    2016-03-01

    Full Text Available CONTEXT Previous studies suggest that glottic view is better achieved with straight blades, that tracheal intubation is easier with curved blades, and that videolaryngoscopes outperform conventional laryngoscopes. AIMS Comparison of conventional laryngoscopes (Macintosh blade and Miller blade) with a channelled videolaryngoscope (King VisionTM) with respect to laryngeal visualisation and difficulty in endotracheal intubation. SETTINGS AND DESIGN This prospective randomised comparative study was conducted at a tertiary care hospital (in ASA I and ASA II patients) after approval from the Institutional Ethics Committee. METHODS We compared the Macintosh, Miller, and King VisionTM videolaryngoscope for glottic visualisation and ease of tracheal intubation. Patients undergoing elective surgeries under general anaesthesia requiring endotracheal intubation were randomly divided into three groups (N=180). After induction of anaesthesia, laryngoscopy was performed and the trachea intubated. We recorded visualisation of the glottis (Cormack-Lehane grade, CL), ease of intubation, number of attempts, need to change blade, and need for external laryngeal manipulation. STATISTICAL ANALYSIS Demographic data, mandibular length, and Mallampati classification were compared using ANOVA, the Chi-square test, and the Kruskal-Wallis test, with P < 0.005 considered statistically significant. RESULTS CL grade 1 was most often observed in the King VisionTM VL group (90%), followed by the Miller (28.33%) and Macintosh (15%) groups. We found intubation to be easier (grade 1) with the King VisionTM VL (73.33%), followed by the Macintosh (38.33%) and Miller (1.67%) groups. External manipulation (BURP) was needed more frequently in the Miller group (71.67%), followed by the Macintosh (28.33%) and King VisionTM VL (6.67%) groups. All (100%) patients were intubated on the first attempt in the King VisionTM VL group, followed by the Macintosh (90%) and Miller (58.33%) groups. CONCLUSIONS In patients with normal airway

  15. 75 FR 64258 - Cloud Computing Forum & Workshop II

    Science.gov (United States)

    2010-10-19

    ... National Institute of Standards and Technology Cloud Computing Forum & Workshop II AGENCY: National... announces the Cloud Computing Forum & Workshop II to be held on November 4 and 5, 2010. This workshop will provide information on a Cloud Computing Roadmap Strategy as well as provide an updated status on NIST...

  16. Routine Use of Glidescope and Macintosh Laryngoscope by Trainee Anesthetists.

    Science.gov (United States)

    Aqil, Mansoor; Khan, Mueen Ullah; Hussain, Altaf; Khokhar, Rashid Saeed; Mansoor, Saara; Alzahrani, Tariq

    2016-04-01

    To compare intubating conditions, success rate, and ease of intubation by anesthesia trainees using the Glidescope Videolaryngoscope (GVL) compared to the Macintosh laryngoscope (MCL). Comparative study. King Khalid University Hospital, Riyadh, Saudi Arabia, from January 2012 to February 2015. Eighty adult patients of ASA I and II with normal airway, scheduled to undergo elective surgery requiring endotracheal (ET) intubation, were enrolled. Patients were randomly divided into 2 groups: GVL and MCL. All intubations were performed by trainee residents with more than 1 year of experience who had successfully performed more than 50 tracheal intubations with each device. Glottic view based on Cormack and Lehane's (C&L's) score and percentage of glottic opening (POGO) score, time to successful intubation, need for external pressure, and overall difficulty scores were compared between GVL and MCL. The view of the glottis based on C&L's classification was better (p < 0.001) and the POGO score was higher (88.25 ± 22.06 vs. 57.25 ± 29.26, p < 0.001) with GVL compared to MCL. Time to intubate was shorter (32.90 ± 8.69 vs. 41.33 ± 15.29 seconds, p = 0.004) and the overall difficulty score was lower (2.78 ± 1.39 vs. 4.85 ± 1.75, p < 0.001) using GVL compared to MCL. Residents found ET intubation to be faster and easier, with a superior glottic view, using GVL compared to MCL in patients with normal airway.

  17. DET/MPS - THE GSFC ENERGY BALANCE PROGRAM, DIRECT ENERGY TRANSFER/MULTIMISSION SPACECRAFT MODULAR POWER SYSTEM (MACINTOSH A/UX VERSION)

    Science.gov (United States)

    Jagielski, J. M.

    1994-01-01

    The DET/MPS programs model and simulate the Direct Energy Transfer and Multimission Spacecraft Modular Power System in order to aid both in design and in analysis of orbital energy balance. Typically, the DET power system has the solar array connected directly to the spacecraft bus, and the central building block of MPS is the Standard Power Regulator Unit. DET/MPS allows a minute-by-minute simulation of the power system's performance as it responds to various orbital parameters, focusing its output on solar array output and battery characteristics. While this package is limited in terms of orbital mechanics, it is sufficient to calculate eclipse and solar array data for circular or non-circular orbits. DET/MPS can be adjusted to run one or sequential orbits up to about one week of simulated time. These programs have been used on a variety of Goddard Space Flight Center spacecraft projects. DET/MPS is written in FORTRAN 77 with some VAX-type extensions. Any FORTRAN 77 compiler that includes VAX extensions should be able to compile and run the program with little or no modification. The compiler must at least support free-form (or tab-delineated) source format and 'do while ... end do' control structures. DET/MPS is available for three platforms: GSC-13374, for DEC VAX series computers running VMS, is available in DEC VAX Backup format on a 9-track 1600 BPI tape (standard distribution) or TK50 tape cartridge; GSC-13443, for UNIX-based computers, is available on a .25 inch streaming magnetic tape cartridge in UNIX tar format; and GSC-13444, for Macintosh computers running A/UX with either the NKR FORTRAN or Absoft MacFORTRAN II compilers, is available on a 3.5 inch 800K Macintosh format diskette. Source code and test data are supplied. The UNIX version of DET requires 90K of main memory for execution. DET/MPS was developed in 1990. A/UX and Macintosh are registered trademarks of Apple Computer, Inc. VMS, DEC VAX and TK50 are trademarks of Digital Equipment Corporation. 
UNIX is a
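    The minute-by-minute energy-balance bookkeeping described above can be illustrated with a toy sketch. This is our illustration, not DET/MPS code: all numbers, names, and the constant-load assumption are invented for the example. For a circular orbit, the array delivers power only outside eclipse, and the battery absorbs the surplus or covers the deficit.

```python
# Toy sketch of minute-by-minute orbital energy balance (illustrative values,
# not from DET/MPS): a circular orbit with a fixed eclipse fraction, a constant
# bus load, and a battery that charges on the sunlit side.

ORBIT_MIN = 90          # circular LEO orbit period, minutes
ECLIPSE_MIN = 35        # minutes of eclipse per orbit
P_ARRAY_W = 1200.0      # array output in sunlight, watts
P_LOAD_W = 700.0        # constant spacecraft bus load, watts
CAPACITY_WH = 1000.0    # battery capacity, watt-hours

def simulate(orbits=3, soc_wh=800.0):
    """Return battery state of charge (Wh) after each simulated minute."""
    history = []
    for minute in range(orbits * ORBIT_MIN):
        in_eclipse = (minute % ORBIT_MIN) < ECLIPSE_MIN
        p_array = 0.0 if in_eclipse else P_ARRAY_W
        net_w = p_array - P_LOAD_W               # surplus (+) or deficit (-)
        soc_wh = min(CAPACITY_WH, max(0.0, soc_wh + net_w / 60.0))
        history.append(soc_wh)
    return history

soc = simulate()
# energy balance: the battery must recover each orbit's eclipse discharge
print(min(soc), soc[-1])
```

The per-orbit balance is visible in the output: the state of charge dips by roughly 35 min × 700 W during each eclipse and recovers on the sunlit arc, which is the quantity an energy-balance study inspects orbit by orbit.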

  18. Computer Viruses: An Overview.

    Science.gov (United States)

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  19. A randomized controlled study to evaluate and compare Truview blade with Macintosh blade for laryngoscopy and intubation under general anesthesia

    Directory of Open Access Journals (Sweden)

    Ramesh T Timanaykar

    2011-01-01

    Full Text Available Background: The Truview EVO2™ laryngoscope is a recently introduced device with a unique blade that provides a magnified laryngeal view at a 42° anterior reflected angle. It facilitates visualization of the glottis without alignment of the oral, pharyngeal, and tracheal axes. We compared the view obtained at laryngoscopy, intubating conditions, and hemodynamic parameters of the Truview with the Macintosh blade. Materials and Methods: In a prospective, randomized and controlled manner, 200 patients of ASA I and II of either sex (20-50 years), presenting for surgery requiring tracheal intubation, were assigned to undergo intubation using a Truview or Macintosh laryngoscope. Visualization of the vocal cords, ease of intubation, time taken for intubation, number of attempts, and hemodynamic parameters were evaluated. Results: Truview provided better results for the laryngeal view using Cormack and Lehane grading, particularly in patients with higher airway Mallampati grading (P < 0.05). The time taken for intubation (33.06 ± 5.6 vs. 23.11 ± 5.7 seconds) was greater with the Truview than with the Macintosh blade (P < 0.01). The Percentage of Glottic Opening (POGO) score was significantly higher with the Truview (97.26 ± 8) than that observed with the Macintosh blade (83.70 ± 21.5). Hemodynamic parameters increased after tracheal intubation from pre-intubation values (P < 0.05) in both groups, but they were comparable between the groups. No postoperative adverse events were noted. Conclusion: Tracheal intubation using the Truview blade provided a consistently improved laryngeal view as compared to the Macintosh blade, without the need to align the oral, pharyngeal and tracheal axes, with an equal number of attempts for successful intubation and similar changes in hemodynamics. However, the time taken for intubation was greater with the Truview.

  20. Cinematica: a system for calibrated, Macintosh-driven displays from within Mathematica.

    Science.gov (United States)

    Solomon, J A; Watson, A B

    1996-01-01

    Cinematica is a minimal system for producing calibrated grayscale movies on an Apple Macintosh computer from within the Mathematica programming environment. It makes use of the ISR Video Attenuator and the VideoToolbox software library developed by Denis Pelli. By design, Cinematica provides a very low-level interface to the display routine. Display instructions take the form of a list of pairs (image index, colormap index). The philosophy is that programming is much easier in Mathematica than in C, so we reserve the complexity for Mathematica. A few simple examples are provided.
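    The display-instruction format described above — a list of (image index, colormap index) pairs — can be sketched as a data structure. This is a hypothetical Python illustration of the idea only; the names and structures are ours, not Cinematica's actual Mathematica/C interface.

```python
# Hypothetical sketch of the display-list idea (names are ours, not the
# package's API): a movie is a sequence of (image index, colormap index)
# pairs referencing preloaded images and colormaps, so the same image can be
# replayed under different lookup tables without re-sending pixel data.

images = {0: "blank", 1: "gabor_patch"}          # stand-ins for pixel arrays
colormaps = {0: "linear_gray", 1: "inverted_gray"}

# four-frame flicker: the same image shown under two different colormaps
display_list = [(1, 0), (1, 1), (1, 0), (1, 1)]

def play(display_list):
    """Resolve each (image index, colormap index) pair into a frame."""
    frames = []
    for img_ix, cmap_ix in display_list:
        frames.append((images[img_ix], colormaps[cmap_ix]))
    return frames

print(play(display_list)[:2])
```

The design point this illustrates is that colormap animation (swapping lookup tables per frame) is far cheaper than re-writing image memory, which is why a low-level display list indexes both independently.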

  1. LHCb Computing Resource usage in 2015 (II)

    CERN Document Server

    Bozzi, Concezio

    2016-01-01

    This document reports the usage of computing resources by the LHCb collaboration during the period January 1st – December 31st 2015. The data in the following sections have been compiled from the EGI Accounting portal: https://accounting.egi.eu. For LHCb specific information, the data are taken from the DIRAC Accounting at the LHCb DIRAC Web portal: http://lhcb-portal-dirac.cern.ch.

  2. Type II Quantum Computing Algorithm For Computational Fluid Dynamics

    Science.gov (United States)

    2006-03-01

    Ensemble measurement averages the measurement results of N identical quantum computers to obtain the magnitude of...the lattice Boltzmann equation. There are two methods of modeling this mesoscopic equation. The first approach is to directly simulate the

  3. Macintosh support is provided at the level of the Service Desk

    CERN Multimedia

    2011-01-01

    Since September 2010, Apple laptops and desktops running Mac OS have been recognized and supported at CERN by the IT department. The “Macintosh support” procedure therefore now follows the same ITIL*) schema as all other IT services, i.e.: All CERN users must address any request for support on Macintosh PCs to the Service Desk. The Service Desk will pass on questions or problems it cannot solve to the “IT 2nd level” support people, provided by the “computing support” contract managed by the IT department. Mac OS being officially supported by the IT department, 3rd level support is provided by CERN IT staff; they may give specialized expert assistance, within the scope described at the ITUM-2 presentation, for all incidents or requests which can be neither resolved nor fulfilled by the Service Desk (1st level) and the 2nd level support people. Therefore, users who have problems related to Mac OS should simply fill in the appropriate form from th...

  4. Parallel computations in linear algebra. II

    Energy Technology Data Exchange (ETDEWEB)

    Faddeeva, V.N.; Faddeev, D.K.

    1982-05-01

    For pt. I, see Kibernetika, vol. 13, no. 6, p. 28 (1977). Considerable effort was devoted in the surveyed period to automatic decomposition of sequential algorithms, or rather of procedures or subprograms written in the algorithmic languages ALGOL or FORTRAN. The authors do not consider this body of research; they only note that, on the one hand, the available linear algebra subprograms included in EISPACK provide convenient objects for testing various approaches to automatic construction of parallel programs and, on the other, an important stage in this activity is the development of methods for fast and efficient solution of linear recurrences, which reduce to solving systems of linear equations with a band-triangular matrix (in particular, of sufficiently small bandwidth). This article reflects the penetration of parallelism ideas into the computational methods of linear algebra in recent years. 74 references.
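    The reduction mentioned above can be made concrete: a first-order linear recurrence x[i] = a[i]·x[i-1] + b[i] is a lower-bidiagonal (band-triangular) linear system, and because the affine maps x → a·x + b compose associatively, the whole recurrence can be evaluated by a parallel prefix scan in O(log n) parallel steps. A minimal sketch (ours, not from the surveyed literature); the serial loop below applies the same associative operator a scan would use.

```python
# Sketch: the recurrence x[i] = a[i]*x[i-1] + b[i], with x[0] = b[0], viewed
# as composition of affine maps. `combine` is associative, so pairs (a, b)
# can be combined by a parallel prefix scan instead of this serial loop.

def combine(p, q):
    """Associative composition of affine maps: apply p first, then q."""
    a1, b1 = p
    a2, b2 = q
    return (a2 * a1, a2 * b1 + b2)

def solve_recurrence(a, b):
    """Serial evaluation of x[i] = a[i]*x[i-1] + b[i] via the scan operator."""
    acc = (a[0], b[0])          # a[0] is unused when x[0] = b[0]; set it to 0
    xs = [acc[1]]
    for ai, bi in zip(a[1:], b[1:]):
        acc = combine(acc, (ai, bi))
        xs.append(acc[1])       # b-component of the running composition = x[i]
    return xs

a = [0.0, 2.0, 0.5]
b = [1.0, 3.0, 4.0]
# x0 = 1, x1 = 2*1 + 3 = 5, x2 = 0.5*5 + 4 = 6.5
print(solve_recurrence(a, b))  # → [1.0, 5.0, 6.5]
```

Associativity of `combine` is exactly what lets the work be split across processors and recombined in a tree, which is the point the survey makes about fast recurrence solvers.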

  5. Computer simulation of a magnetohydrodynamic dynamo. II

    Science.gov (United States)

    Kageyama, Akira; Sato, Tetsuya; Complexity Simulation Group

    1995-05-01

    A computer simulation of a magnetohydrodynamic dynamo in a rapidly rotating spherical shell is performed. Extensive parameter runs are carried out with varying electrical resistivity. When the resistivity is sufficiently small, the total magnetic energy can grow more than ten times larger than the total kinetic energy of the convection motion, which is driven by an unlimited external energy source. When the resistivity is relatively large and the magnetic energy is comparable to or smaller than the kinetic energy, the convection motion maintains its well-organized structure. However, when the resistivity is small and the magnetic energy becomes larger than the kinetic energy, the convection motion becomes highly irregular. The magnetic field is organized in two ways. One is the concentration of the component parallel to the rotation axis, and the other is the concentration of the perpendicular component. The parallel component tends to be confined inside anticyclonic columnar convection cells, while the perpendicular component is confined outside the convection cells.

  6. A Visualization and Orthographic Drawing Test Using the Macintosh Computer.

    Science.gov (United States)

    Bertoline, Gary R.; Miller, Daniel C.

    1990-01-01

    Developed is an examination to determine a student's visualization capabilities. This examination has two versions, both developed to measure visualization ability, time to visualize, and reaction time. This test indicates the learner's ability to visualize complex three-dimensional objects from six principal views. Included are the results from a…

  7. An ADC test system based on the Macintosh computer

    Energy Technology Data Exchange (ETDEWEB)

    Gao, Z.; Spiriti, E.; Tortora, L. (INFN, Roma (Italy). Sezione Roma Sanita)

    1994-02-01

    DAΦNE is an approved INFN Φ-factory project to be realized at the Frascati National Laboratories of Italy. DAΦNE consists of the construction of a two-ring colliding beam Φ factory and a 510 MeV e⁺/e⁻

  8. Modeling fluid dynamics on type II quantum computers

    Science.gov (United States)

    Scoville, James; Weeks, David; Yepez, Jeffrey

    2006-03-01

    A quantum algorithm is presented for modeling the time evolution of density and flow fields governed by classical equations, such as the diffusion equation, the nonlinear Burgers equation, and the damped wave equation. The algorithm is intended to run on a type-II quantum computer, a parallel quantum computer consisting of a lattice of small type-I quantum computers undergoing unitary evolution and interacting via information interchanges represented by orthogonal matrices. Information is effectively transferred between adjacent quantum computers over classical communications channels because of controlled state demolition following local quantum mechanical qubit-qubit interactions within each quantum computer. The type-II quantum algorithm presented in this paper describes a methodology for generating quantum logic operations as a generalization of classical operations associated with finite-point group symmetries. The quantum mechanical evolution of multiple qubits within each node is described. A proof is presented that the parallel quantum system obeys a finite-difference quantum Boltzmann equation at the mesoscopic scale, leading in turn to various classical linear and nonlinear effective field theories at the macroscopic scale depending on the details of the local qubit-qubit interactions.
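    The mesoscopic-to-macroscopic idea — local collisions at each node followed by information interchange with neighbours, yielding a diffusion equation in the continuum limit — has a simple classical analogue. The following sketch (ours, not the paper's quantum algorithm) is a classical D1Q2 lattice-Boltzmann model whose density field spreads diffusively, standing in for the qubit-qubit interaction plus classical interchange described above.

```python
# Classical D1Q2 lattice-Boltzmann sketch (illustrative, not the paper's
# quantum algorithm): each site holds right-moving and left-moving
# populations; a local collision relaxes them toward equilibrium, then
# streaming exchanges information with neighbours. The density obeys a
# diffusion equation at the macroscopic scale.

N = 64
omega = 1.0  # relaxation rate toward local equilibrium

fp = [0.0] * N            # right-movers
fm = [0.0] * N            # left-movers
fp[N // 2] = 0.5          # initial density spike at the centre
fm[N // 2] = 0.5

def step(fp, fm):
    # collision: relax each population toward the local equilibrium rho/2
    rho = [p + m for p, m in zip(fp, fm)]
    fp = [p + omega * (r / 2 - p) for p, r in zip(fp, rho)]
    fm = [m + omega * (r / 2 - m) for m, r in zip(fm, rho)]
    # streaming: right-movers shift right, left-movers shift left (periodic)
    fp = [fp[-1]] + fp[:-1]
    fm = fm[1:] + fm[:1]
    return fp, fm

for _ in range(100):
    fp, fm = step(fp, fm)

rho = [p + m for p, m in zip(fp, fm)]
# total mass is conserved while the initial spike spreads out
print(round(sum(rho), 6), max(rho) < 0.5)
```

The collision is purely local and the streaming step only moves state between adjacent sites, mirroring the paper's division into on-node quantum evolution and classical nearest-neighbour communication.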

  9. MACSIGMA0 - MACINTOSH TOOL FOR ANALYZING JPL AIRSAR, ERS-1, JERS-1, AND MAGELLAN MIDR DATA

    Science.gov (United States)

    Norikane, L.

    1994-01-01

    MacSigma0 is an interactive tool for the Macintosh which allows you to display and make computations from radar data collected by the following sensors: the JPL AIRSAR, ERS-1, JERS-1, and Magellan. The JPL AIRSAR system is a multi-polarimetric airborne synthetic aperture radar developed and operated by the Jet Propulsion Laboratory. It includes the single-frequency L-band sensor mounted on the NASA CV990 aircraft and its replacement, the multi-frequency P-, L-, and C-band sensors mounted on the NASA DC-8. MacSigma0 works with data in the standard JPL AIRSAR output product format, the compressed Stokes matrix format. ERS-1 and JERS-1 are single-frequency, single-polarization spaceborne synthetic aperture radars launched by the European Space Agency and NASDA respectively. To be usable by MacSigma0, the data must have been processed at the Alaska SAR Facility and must be in the "low-resolution" format. Magellan is a spacecraft mission to map the surface of Venus with imaging radar. The project is managed by the Jet Propulsion Laboratory. The spacecraft carries a single-frequency, single-polarization synthetic aperture radar. MacSigma0 works with framelets of the standard MIDR CD-ROM data products. MacSigma0 provides four basic functions: synthesis of images (if necessary), statistical analysis of selected areas, analysis of corner reflectors as a calibration measure (if appropriate and possible), and informative mouse tracking. For instance, the JPL AIRSAR data can be used to synthesize a variety of images such as a total power image. The total power image displays the sum of the polarized and unpolarized components of the backscatter for each pixel. Other images which can be synthesized are HH, HV, VV, RL, RR, HHVV*, HHHV*, HVVV*, HHVV* phase and correlation coefficient images. For the complex and phase images, phase is displayed using color and magnitude is displayed using intensity. MacSigma0 can also be used to compute statistics from within a selected area. 
The

  10. A randomised trial to compare Truview PCD(®), C-MAC(®) and Macintosh laryngoscopes in paediatric airway management.

    Science.gov (United States)

    Singh, Ranju; Kumar, Nishant; Jain, Aruna

    2017-06-01

    To evaluate and compare the Truview PCD and C-MAC laryngoscopes to the standard Macintosh laryngoscope in paediatric patients. One hundred and fifty ASA I-II patients in the age group of 1-6 years (10-20 kg) scheduled for elective surgery were randomised into three equal groups for laryngoscopy and intubation with either the Truview PCD (Group T), C-MAC (Group C) or Macintosh (Group M) laryngoscope under general anaesthesia. Percentage of glottic opening (POGO) score, application of external laryngeal manoeuvre, time to intubation, number of attempts at intubation, failed intubations, episodes of desaturation and trauma caused were recorded and statistically analysed. A p value of <0.05 was considered significant. The POGO score was higher with the Truview PCD than with the C-MAC and Macintosh laryngoscopes (94.7 ± 12.9 / 82 ± 25.0 / 85.1 ± 17.1; p < 0.01). There were no failed attempts, episodes of desaturation or trauma in any of the patients. The mean intubation time was 19.2 s in group T, 12.3 s in group C and 10.7 s in group M, a statistically significant difference among the groups (p < 0.01). Eight out of 50 patients in group T, 21 out of 50 in group C and 19 out of 50 in group M needed OELM, also a significant difference among the groups (p < 0.01). CONCLUSION: Using the Truview PCD to assist intubation offers an excellent view of the glottic opening after OELM, and the mean time taken is less than 20 s. The Truview PCD tool is suitable for paediatric patients. Copyright © 2017. Published by Elsevier B.V.

  11. Computer simulation study of hexokinase II from Ehrlich ascites cells.

    Science.gov (United States)

    Garfinkel, L

    1975-02-21

    A study of the mechanism of hexokinase II from ascites cells and the effects of its binding to mitochondrial membranes has been carried out by computer simulation. This is based on experimental data of Kosow and Rose and of Gumaa and McLean, and the theoretical methods of Cleland. For the soluble enzyme the mechanism is random with ternary product-inhibition complexes; when bound to mitochondria, the mechanism becomes ordered-on, random-off, as the binding of ATP to the free enzyme becomes negligibly slow. The requirements of experimental data for mechanistic studies are discussed.

  12. Job monitoring on DIRAC for Belle II distributed computing

    Science.gov (United States)

    Kato, Yuji; Hayasaka, Kiyoshi; Hara, Takanori; Miyake, Hideki; Ueda, Ikuo

    2015-12-01

    We developed a monitoring system for Belle II distributed computing, which consists of active and passive methods. In this paper we describe the passive monitoring system, in which information stored in the DIRAC database is processed and visualized. We divide the DIRAC workload management flow into steps and store characteristic variables which indicate issues. These variables are chosen carefully based on our experience, and then visualized. As a result, we are able to detect issues effectively. Finally, we discuss future development towards automating log analysis, notification of issues, and disabling of problematic sites.

  13. Computer and Information Sciences II : 26th International Symposium on Computer and Information Sciences

    CERN Document Server

    Lent, Ricardo; Sakellari, Georgia

    2012-01-01

    Information technology is the enabling foundation for all human activity at the beginning of the 21st century, and advances in this area are crucial to all of us. These advances are taking place all over the world and can only be followed and perceived when researchers from all over the world assemble and exchange their ideas in conferences such as the one documented in this proceedings volume, the 26th International Symposium on Computer and Information Sciences, held at the Royal Society in London on 26th to 28th September 2011. Computer and Information Sciences II contains novel advances in the state of the art covering applied research in electrical and computer engineering and computer science, across the broad area of information technology. It provides access to the main innovative activities in research across the world, and points to the results obtained recently by some of the most active teams in both Europe and Asia.

  14. Does C-MAC® video laryngoscope improve the nasotracheal intubating conditions compared to Macintosh direct laryngoscope in paediatric patients posted for tonsillectomy surgeries?

    Science.gov (United States)

    Patil, Vinuta V; Subramanya, Bala H; Kiranchand, N; Bhaskar, S Bala; Dammur, Srinivasalu

    2016-01-01

    Background and Aims: The C-MAC® video laryngoscope (VL) with Macintosh blade has been found to improve the Cormack-Lehane (C-L) laryngoscopic view as well as intubating conditions for orotracheal intubation. However, studies on the performance of the C-MAC® VL for nasotracheal intubation (NTI) are very few in number. Hence, we compared laryngoscopy and intubating conditions between the Macintosh direct laryngoscope and the C-MAC® VL for NTI. Methods: Sixty American Society of Anesthesiologists Physical Status I, II patients, aged 8–18 years, posted for tonsillectomy surgeries under general anaesthesia with NTI were randomised into two groups. Patients in group 1 were intubated using the Macintosh direct laryngoscope and those in group 2 with the C-MAC® VL. C-L grading, time required for intubation, need for additional manoeuvres and haemodynamic changes during and after intubation were compared between the groups. Results: C-L grade 1 views were obtained in 26 and 29 patients in group 1 and group 2, respectively (86.7% vs. 96.7%). The remaining patients had C-L grade 2 views (13.3% vs. 3.3%). Duration of intubation was less than a minute in group 2 (93.3%). The need for additional manoeuvres (M1–M5) was greater in group 1 (97% vs. 77%). M1 (external manipulation) was needed more in group 2 compared to group 1 (53.3% vs. 30%). Magill's forceps alone (M4) and M4 with additional external manipulation (M5) were needed more in group 1 compared to group 2 (60% vs. 16%). Conclusion: The overall performance of the C-MAC® VL was better when compared to the conventional direct Macintosh laryngoscope during NTI in terms of glottis visualisation, intubation time and need for additional manoeuvres.

  15. A comparison of McCoy, TruView, and Macintosh laryngoscopes for tracheal intubation in patients with immobilized cervical spine

    Directory of Open Access Journals (Sweden)

    Neerja Bharti

    2014-01-01

    Full Text Available Background: Cervical spine immobilization results in a poor laryngeal view on direct laryngoscopy, leading to difficulty in intubation. This randomized prospective study was designed to compare the laryngeal view and ease of intubation with the Macintosh, McCoy, and TruView laryngoscopes in patients with immobilized cervical spine. Materials and Methods: 60 adult patients of ASA grade I-II with immobilized cervical spine undergoing elective cervical spine surgery were enrolled. Anesthesia was induced with propofol, fentanyl, and vecuronium and maintained with isoflurane and nitrous oxide in oxygen. The patients were randomly allocated into three groups to achieve tracheal intubation with the Macintosh, McCoy, or TruView laryngoscope. When the best possible view of the glottis was obtained, the Cormack-Lehane laryngoscopy grade and the percentage of glottic opening (POGO) score were assessed. Other measurements included the intubation time, the intubation difficulty score, and the intubation success rate. Hemodynamic parameters and any airway complications were also recorded. Results: The TruView reduced the intubation difficulty score and improved the Cormack and Lehane glottic view and the POGO score compared with the McCoy and Macintosh laryngoscopes. The first-attempt intubation success rate was also higher in the TruView laryngoscope group. However, there were no differences in the time required for successful intubation or in the overall success rates between the devices tested. No dental injury or hypoxia occurred with either device. Conclusion: The use of a TruView laryngoscope resulted in better glottis visualization, easier tracheal intubation, and a higher first-attempt success rate as compared to the Macintosh and McCoy laryngoscopes in patients with immobilized cervical spine.

  16. Endotracheal intubation in patients with cervical spine immobilization: a comparison of macintosh and airtraq laryngoscopes.

    LENUS (Irish Health Repository)

    Maharaj, Chrisen H

    2007-07-01

    The Airtraq laryngoscope (Prodol Ltd., Vizcaya, Spain) is a novel single-use tracheal intubation device. The authors compared ease of intubation with the Airtraq and Macintosh laryngoscopes in patients with cervical spine immobilization in a randomized, controlled clinical trial.

  17. Microcomputer Decisions for the 1990s [and] Apple's Macintosh: A Viable Choice.

    Science.gov (United States)

    Grosch, Audrey N.

    1989-01-01

    Discussion of the factors that should be considered when purchasing or upgrading a microcomputer focuses on the MS-DOS and OS/2 operating systems. Macintosh purchasing decisions are discussed in a sidebar. A glossary is provided. (CLB)

  18. Incidence and severity of postoperative sore throat: a randomized comparison of Glidescope with Macintosh laryngoscope.

    Science.gov (United States)

    Aqil, Mansoor; Khan, Mueen Ullah; Mansoor, Saara; Mansoor, Saad; Khokhar, Rashid Saeed; Narejo, Abdul Sattar

    2017-09-12

    Postoperative sore throat (POST) is a common problem following endotracheal (ET) intubation during general anesthesia. The objective was to compare the incidence and severity of POST during routine intubation with the Glidescope (GL) and the Macintosh laryngoscope (MCL). One hundred forty adult patients ASA I and II with normal airway, scheduled to undergo elective surgery under GA requiring ET intubation, were enrolled in this prospective randomized study and randomly divided into two groups, GL and MCL. Incidence and severity of POST were evaluated at 0, 6, 12 and 24 h after surgery. At 0 h, the incidence of POST was higher with MCL than GL (n = 41 vs. n = 22, P = 0.001), as it was at 6 h after surgery (n = 37 vs. n = 23, P = 0.017). Severity of POST was greater at 0, 6 and 12 h after surgery with MCL (P < 0.001, P = 0.001, P = 0.004 respectively). Routine use of the GL for ET tube placement results in a reduction in the incidence and severity of POST compared to the MCL. ClinicalTrials.gov NCT02848365. Retrospectively registered (date of registration: July 2016).

  19. SPLICER - A GENETIC ALGORITHM TOOL FOR SEARCH AND OPTIMIZATION, VERSION 1.0 (MACINTOSH VERSION)

    Science.gov (United States)

    Wang, L.

    1994-01-01

    representation scheme. The SPLICER tool provides representation libraries for binary strings and for permutations. These libraries contain functions for the definition, creation, and decoding of genetic strings, as well as multiple crossover and mutation operators. Furthermore, the SPLICER tool defines the appropriate interfaces to allow users to create new representation libraries. Fitness modules are the only component of the SPLICER system a user will normally need to create or alter to solve a particular problem. Fitness functions are defined and stored in interchangeable fitness modules which must be created using C language. Within a fitness module, a user can create a fitness (or scoring) function, set the initial values for various SPLICER control parameters (e.g., population size), create a function which graphically displays the best solutions as they are found, and provide descriptive information about the problem. The tool comes with several example fitness modules, while the process of developing a fitness module is fully discussed in the accompanying documentation. The user interface is event-driven and provides graphic output in windows. SPLICER is written in Think C for Apple Macintosh computers running System 6.0.3 or later and Sun series workstations running SunOS. The UNIX version is easily ported to other UNIX platforms and requires MIT's X Window System, Version 11 Revision 4 or 5, MIT's Athena Widget Set, and the Xw Widget Set. Example executables and source code are included for each machine version. The standard distribution media for the Macintosh version is a set of three 3.5 inch Macintosh format diskettes. The standard distribution medium for the UNIX version is a .25 inch streaming magnetic tape cartridge in UNIX tar format. For the UNIX version, alternate distribution media and formats are available upon request. SPLICER was developed in 1991.
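    The fitness-module concept described above — a fixed GA core into which the user plugs only a scoring function — can be sketched as follows. This is a hypothetical Python illustration of the idea; SPLICER's actual fitness modules are written in C against its own interface, and all names below are ours.

```python
import random

# Illustrative sketch of the "fitness module" idea (not SPLICER's C API):
# the GA core below is fixed, and the user supplies only a scoring function
# over binary strings, here the classic "one-max" problem.

def fitness_ones(bits):
    """User-supplied fitness module: count of 1-bits."""
    return sum(bits)

def run_ga(fitness, n_bits=20, pop_size=30, generations=60, p_mut=0.02, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():
            # binary tournament selection
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_bits)                       # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ (rng.random() < p_mut) for b in child]  # bit-flip mutation
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = run_ga(fitness_ones)
print(fitness_ones(best))
```

Swapping in a different problem means replacing only `fitness_ones`, exactly the separation the tool's fitness modules provide; representation libraries generalize the string encoding the same way.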


  1. [Computer simulation of endoscopic resection of the prostate].

    Science.gov (United States)

    Lardennois, B; Clément, T; Ziade, A; Brandt, B

    1990-01-01

    The number of urologists in need of training is increasing whereas the number of resections of the prostate is falling. Patients are less and less willing to have their procedure serve for young residents to learn the technique of endoscopic resection of the prostate. Despite teaching and videoendoscopy, this procedure remains difficult to learn. We decided to develop a simulator for endoscopic resection after having seen the remarkable model of endoscopic simulation developed by French gastroenterologists and presented at the computer science workshop of the AFU symposium. The HyperCard program, the LaserVision disk, and the CD-ROM project were selected as a good introduction to this challenging although not insurmountable problem. We had to adapt our goals to the resources of French urologists: the Macintosh II is the most sophisticated affordable computer. Computerization of the televised images of the endoscopic procedure and formalization of the gestures of the operator are required. Conventional image-synthesis programs for use with the Mac II are either very slow or very limited. Conventional simulation programs are highly mathematical. Computerized images take up considerable memory space, and large-capacity disks or optical disks are required. The urologic LaserVision disk, to which we contributed in 1986, contains few endoscopic images of the prostate but served as a basis for devising a methodology. Object programming with HyperCard and animated image programs for Macintosh computers will be the starting points for our project, which will benefit from the significant advances announced by Apple concerning color image file maintenance.

  2. Belle II grid computing: An overview of the distributed data management system.

    Science.gov (United States)

    Bansal, Vikas; Schram, Malachi; Belle Collaboration, II

    2017-01-01

    The Belle II experiment at the SuperKEKB collider in Tsukuba, Japan, will start physics data taking in 2018 and will accumulate 50 ab⁻¹ of e+e- collision data, about 50 times larger than the data set of the Belle experiment. The computing requirements of Belle II are comparable to those of a Run I LHC experiment. Computing at this scale requires efficient use of the compute grids in North America, Asia and Europe and will take advantage of upgrades to the high-speed global network. We present the architecture of data flow and data handling as a part of the Belle II computing infrastructure.

  3. Social Studies: Application Units. Course II, Teachers. Computer-Oriented Curriculum. REACT (Relevant Educational Applications of Computer Technology).

    Science.gov (United States)

    Tecnica Education Corp., San Carlos, CA.

    This book is one of a series in Course II of the Relevant Educational Applications of Computer Technology (REACT) Project. It is designed to point out to teachers two of the major applications of computers in the social sciences: simulation and data analysis. The first section contains a variety of simulation units organized under the following…

  4. PARIS II: Computer Aided Solvent Design for Pollution Prevention

    Science.gov (United States)

    This product is a summary of U.S. EPA researchers' work developing the solvent substitution software tool PARIS II (Program for Assisting the Replacement of Industrial Solvents, version 2.0). PARIS II finds less toxic solvents or solvent mixtures to replace more toxic solvents co...

  5. Multi-Rate Digital Control Systems with Simulation Applications. Volume II. Computer Algorithms

    Science.gov (United States)

    1980-09-01

    [Scanned-report front matter, largely illegible; recoverable: AFWAL-TR-80-3101, Volume II: Computer Algorithms, of "Multi-Rate Digital Control Systems with Simulation Applications", author Dennis G. J…] …additional options. The analytical basis for the computer algorithms is discussed in Ref. 12. However, to provide a complete description of the program, some…

  6. The Influence of Computer Training Platform on Subsequent Computer Preferences.

    Science.gov (United States)

    Pardamean, Bens; Slovaceks, Simeon

    1995-01-01

    Reports a study that examined the impact of an introductory college computer course on users' subsequent preferences in their choice of computer (IBM versus Macintosh). Surveys found a strong positive relationship between the type of computer students used in the course and their later use and purchasing preferences. (SM)

  7. Comparing insertion characteristics on nasogastric tube placement by using GlideScopeTM visualization vs. MacIntosh laryngoscope assistance in anaesthetized and intubated patients

    Directory of Open Access Journals (Sweden)

    Wan Hafsah Wan Ibadullah

    Background and objective: This was a prospective, randomized clinical study to compare the success rate of nasogastric tube insertion by using GlideScopeTM visualization versus direct MacIntosh laryngoscope assistance in anesthetized and intubated patients. Methods: Ninety-six ASA I or II patients, aged 18-70 years, were recruited and randomized into two groups using either technique. The time taken from insertion of the nasogastric tube into the nostril until the calculated length of tube had been inserted was recorded. The success rate of nasogastric tube insertion was evaluated in terms of successful insertion in the first attempt. Complications associated with the insertion techniques were recorded. Results: The results showed success rates of 74.5% in the GlideScopeTM Group as compared to 58.3% in the MacIntosh Group (p = 0.10). For the failed attempts, the nasogastric tube was successfully inserted in all cases using rescue techniques. The difference in duration of the first attempt between the two techniques was not statistically significant; Group A took 17.2 ± 9.3 s as compared to Group B, with a duration of 18.9 ± 13.0 s (p = 0.57). A total of 33 patients developed complications during insertion of the nasogastric tube, 39.4% in Group A and 60.6% in Group B (p = 0.15). The most common complications were coiling, followed by bleeding and kinking. Conclusion: This study showed that using the GlideScopeTM to facilitate nasogastric tube insertion was comparable to the use of the MacIntosh laryngoscope in terms of success rate of insertion and complications.

  8. Comparing insertion characteristics on nasogastric tube placement by using GlideScope™ visualization vs. MacIntosh laryngoscope assistance in anaesthetized and intubated patients.

    Science.gov (United States)

    Wan Ibadullah, Wan Hafsah; Yahya, Nurlia; Ghazali, Siti Salmah; Kamaruzaman, Esa; Yong, Liu Chian; Dan, Adnan; Md Zain, Jaafar

    2016-01-01

    This was a prospective, randomized clinical study to compare the success rate of nasogastric tube insertion by using GlideScope™ visualization versus direct MacIntosh laryngoscope assistance in anesthetized and intubated patients. Ninety-six ASA I or II patients, aged 18-70 years were recruited and randomized into two groups using either technique. The time taken from insertion of the nasogastric tube from the nostril until the calculated length of tube had been inserted was recorded. The success rate of nasogastric tube insertion was evaluated in terms of successful insertion in the first attempt. Complications associated with the insertion techniques were recorded. The results showed success rates of 74.5% in the GlideScope™ Group as compared to 58.3% in the MacIntosh Group (p=0.10). For the failed attempts, the nasogastric tube was successfully inserted in all cases using rescue techniques. The duration taken in the first attempt for both techniques was not statistically significant; Group A was 17.2±9.3s as compared to Group B, with a duration of 18.9±13.0s (p=0.57). A total of 33 patients developed complications during insertion of the nasogastric tube, 39.4% in Group A and 60.6% in Group B (p=0.15). The most common complications were coiling, followed by bleeding and kinking. This study showed that using the GlideScope™ to facilitate nasogastric tube insertion was comparable to the use of the MacIntosh laryngoscope in terms of success rate of insertion and complications. Copyright © 2015 Sociedade Brasileira de Anestesiologia. Published by Elsevier Editora Ltda. All rights reserved.

  9. COMPUTER PROGRAMMING TECHNIQUES FOR INTELLIGENCE ANALYST APPLICATION. VOLUME II.

    Science.gov (United States)

    (*COMPUTER PROGRAMMING, STATISTICAL PROCESSES), (*MAN MACHINE SYSTEMS, DISPLAY SYSTEMS), GRAPHICS, INFORMATION RETRIEVAL, DATA PROCESSING, SYSTEMS ENGINEERING, MILITARY INTELLIGENCE, CLASSIFICATION, AIR FORCE PERSONNEL.

  10. ACII 2009, Affective Computing & Intelligent Interaction : Proceedings Volume II

    NARCIS (Netherlands)

    Mühl, Christian; Heylen, Dirk; Nijholt, Anton

    2009-01-01

    These are the proceedings of ABCI 2009, Affective Brain Computer Interfaces, a workshop that was organized in conjunction with ACII 2009, the International Conference on Affective Computing and Intelligent Interaction, held in Amsterdam, The Netherlands, September 2009. The workshop took place on

  11. Evaluation of the Airtraq and Macintosh laryngoscopes in patients at increased risk for difficult tracheal intubation.

    LENUS (Irish Health Repository)

    Maharaj, C H

    2008-02-01

    The Airtraq, a novel single use indirect laryngoscope, has demonstrated promise in the normal and simulated difficult airway. We compared the ease of intubation using the Airtraq with the Macintosh laryngoscope, in patients at increased risk for difficult tracheal intubation, in a randomised, controlled clinical trial. Forty consenting patients presenting for surgery requiring tracheal intubation, who were deemed to possess at least three characteristics indicating an increased risk for difficulty in tracheal intubation, were randomly assigned to undergo tracheal intubation using a Macintosh (n = 20) or Airtraq (n = 20) laryngoscope. All patients were intubated by one of three anaesthetists experienced in the use of both laryngoscopes. Four patients were not successfully intubated with the Macintosh laryngoscope, but were intubated successfully with the Airtraq. The Airtraq reduced the duration of intubation attempts (mean (SD); 13.4 (6.3) vs 47.7 (8.5) s), the need for additional manoeuvres, and the intubation difficulty score (0.4 (0.8) vs 7.7 (3.0)). Tracheal intubation with the Airtraq also reduced the degree of haemodynamic stimulation and minor trauma compared to the Macintosh laryngoscope.

  12. Effects of Computer-Assisted Jigsaw II Cooperative Learning Strategy on Physics Achievement and Retention

    Science.gov (United States)

    Gambari, Isiaka Amosa; Yusuf, Mudasiru Olalere

    2016-01-01

    This study investigated the effects of a computer-assisted Jigsaw II cooperative strategy on physics achievement and retention. The study also examined how the moderating variable of achievement level affects students' performance in physics when Jigsaw II cooperative learning is used as an instructional strategy. Purposive sampling technique…

  13. Janus II: the new generation Special Purpose Computer for spin-system simulations

    Science.gov (United States)

    Perez-Gaviro, Sergio; Janus Collaboration

    2014-03-01

    We present Janus II, our second grand challenge in High Performance Computing for Computational Physics. This Special Purpose Computer, recently developed and commissioned by the Janus Collaboration, is based on a Field-Programmable Gate Array (FPGA) architecture. Janus II has been designed and developed as a multipurpose reprogrammable supercomputer and is optimized for speeding up Monte Carlo simulations of a wide class of spin glass models. It builds on and improves the experience of its predecessor, Janus, which has been successfully running physics simulations for the last 6 years. Janus II will make it possible to carry out Monte Carlo simulation campaigns that would take several centuries if performed on currently available computer systems.

  14. MacCAD (Macintosh Computer Aided Design), Computer Aided Design Tool for System Analysis.

    Science.gov (United States)

    1987-12-01


  15. Yeast ancestral genome reconstructions: the possibilities of computational methods II.

    Science.gov (United States)

    Chauve, Cedric; Gavranovic, Haris; Ouangraoua, Aida; Tannier, Eric

    2010-09-01

    Since the availability of assembled eukaryotic genomes, the first being a budding yeast, many computational methods for the reconstruction of ancestral karyotypes and gene orders have been developed. The difficulty has always been to assess their reliability, since we often lack good knowledge of the true ancestral genomes against which to compare their results, as well as of the evolutionary mechanisms needed to test them on realistic simulated data. In this study, we propose some measures of the reliability of several kinds of methods, and apply them to infer and analyse the architectures of two ancestral yeast genomes, based on the sequences of seven assembled extant ones. The pre-duplication common ancestor of S. cerevisiae and C. glabrata was inferred manually by Gordon et al. (PLoS Genet. 2009). We show why, in this case, a good convergence of the methods is explained by some properties of the data, and why the results are reliable. In another study, Jean et al. (J. Comput. Biol. 2009) proposed an ancestral architecture of the last common ancestor of S. kluyveri, K. thermotolerans, K. lactis, A. gossypii, and Z. rouxii, inferred by a computational method. In this case, we show that the dataset does not seem to contain enough information to infer a reliable architecture, and we construct a higher-resolution dataset which gives good confidence in a new ancestral configuration.

  16. Modernization of computer of plant Vandellos-II; Modernizacion del ordenador de planta de Vandello-II

    Energy Technology Data Exchange (ETDEWEB)

    Fuente Arias, E. de la

    2014-07-01

    The plant computer of the Vandellós II nuclear power plant, whose modernization will be carried out by Westinghouse, is a centralized system that monitors and supervises plant processes in real time and performs the calculations necessary for an efficient assessment of plant operation, without taking any action on the plant itself. Its main function is to provide current and historical information on the status of the plant, both in normal operation and under emergency conditions. (Author)

  17. A Technology Assessment of Personal Computers. Vol. II: Personal Computer Technology, Users, and Uses.

    Science.gov (United States)

    Nilles, Jack M.

    This volume reports on the initial phase of a technology assessment of personal computers. First, technological developments that will influence the rate of diffusion of personal computer technology among the general populace are examined. Then the probable market for personal computers is estimated and analyzed on a functional basis, segregating…

  18. Study on Effect of Gd (III) Speciation on Ca (II) Speciation in Human Blood Plasma by Computer Simulation

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Ca (II) speciation and the effect of Gd (III) speciation on Ca (II) speciation in human blood plasma were studied by computer simulation. [CaHCO3]+ is a predominant compound species of Ca (II). Gd (III) can compete with Ca (II) for biological molecules. The presence of Gd (III) results in an increase in the concentration of free Ca (II) and a decrease in the concentration of Ca (II) compounds.
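    The competition effect can be reproduced qualitatively with a toy 1:1 binding model. This is a sketch with illustrative, non-physiological constants and a single hypothetical ligand; the paper's plasma simulation involves many more ligands and species.

```python
def speciate(L_tot, metals):
    """Solve the free-ligand concentration for one ligand L shared by
    several metals M with 1:1 binding, M + L <-> ML, K = [ML]/([M][L]).

    `metals` maps a name to (total concentration, binding constant K).
    Returns (free_L, {name: free metal concentration}).
    """
    def bound_total(L_f):
        # Total ligand accounted for at a trial free-ligand value L_f.
        total = L_f
        for tot, K in metals.values():
            M_f = tot / (1.0 + K * L_f)   # mass balance for the metal
            total += K * M_f * L_f        # ligand tied up in the ML complex
        return total

    lo, hi = 0.0, L_tot                   # bound_total is increasing in L_f
    for _ in range(200):                  # bisection on the ligand balance
        mid = 0.5 * (lo + hi)
        if bound_total(mid) < L_tot:
            lo = mid
        else:
            hi = mid
    L_f = 0.5 * (lo + hi)
    free = {name: tot / (1.0 + K * L_f) for name, (tot, K) in metals.items()}
    return L_f, free

# Illustrative (not physiological) numbers: Gd(III) binds the ligand far
# more strongly than Ca(II), so adding Gd displaces Ca from ML and raises
# free Ca, the qualitative effect reported in the paper.
_, no_gd = speciate(1e-3, {"Ca": (1e-3, 1e4)})
_, with_gd = speciate(1e-3, {"Ca": (1e-3, 1e4), "Gd": (5e-4, 1e7)})
print(with_gd["Ca"] > no_gd["Ca"])  # True: free Ca rises in Gd's presence
```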

  19. CPU SIM: A Computer Simulator for Use in an Introductory Computer Organization-Architecture Class.

    Science.gov (United States)

    Skrein, Dale

    1994-01-01

    CPU SIM, an interactive low-level computer simulation package that runs on the Macintosh computer, is described. The program is designed for instructional use in the first or second year of undergraduate computer science, to teach various features of typical computer organization through hands-on exercises. (MSE)

  20. Photoisomerization among ring-open merocyanines. II. A computational study

    Science.gov (United States)

    Walter, Christof; Ruetzel, Stefan; Diekmann, Meike; Nuernberger, Patrick; Brixner, Tobias; Engels, Bernd

    2014-06-01

    The photochemical isomerization of the trans-trans-cis to the trans-trans-trans isomer of the merocyanine form of 6-nitro BIPS, which has been studied with femtosecond transient absorption spectroscopy [S. Ruetzel, M. Diekmann, P. Nuernberger, C. Walter, B. Engels, and T. Brixner, J. Chem. Phys. 140, 224310 (2014)], is investigated using time-dependent density functional theory in conjunction with polarizable continuum models. Benchmark calculations against SCS-ADC(2) evaluate the applicability of the CAM-B3LYP functional. Apart from a relaxed scan in the ground state with additional computation of the corresponding excitation energies, which produces the excited-state surface vertical to the ground-state isomerization coordinate, a relaxed scan in the S1 gives insight into the geometric changes orthogonal to the reaction coordinate and the fluorescence conditions. The shape of the potential energy surface (PES) along the reaction coordinate is found to be highly sensitive to solvation effects, with the method of solvation (linear response vs. state-specific) being critical. The shape of the PES as well as the computed harmonic frequencies in the S1 minima are in line with the experimental results and offer a straightforward interpretation.

  1. Fast Algorithm for Computing the Discrete Hartley Transform of Type-II

    Directory of Open Access Journals (Sweden)

    Mounir Taha Hamood

    2016-06-01

    The generalized discrete Hartley transforms (GDHTs) have proved to be an efficient alternative to the generalized discrete Fourier transforms (GDFTs) for real-valued data applications. In this paper, the development of a direct-computation radix-2 decimation-in-time (DIT) algorithm for the fast calculation of the GDHT of type II (DHT-II) is presented. The mathematical analysis and the implementation of the developed algorithm are derived, showing that this algorithm possesses a regular structure and can be implemented in place for efficient memory utilization. The performance of the proposed algorithm is analyzed and the computational complexity is calculated for different transform lengths. A comparison between this algorithm and existing DHT-II algorithms shows that it can be considered a good compromise between structural and computational complexities.
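    A direct evaluation of the transform may clarify what the fast algorithm computes. The sketch below assumes the common type-II kernel H[k] = sum_n x[n]·cas(2πk(n + 1/2)/N) with cas(θ) = cos(θ) + sin(θ); it is a naive O(N²) reference implementation, not the paper's O(N log N) radix-2 DIT algorithm.

```python
import math

def cas(t):
    # Hartley kernel: cas(t) = cos(t) + sin(t)
    return math.cos(t) + math.sin(t)

def dht2(x):
    """Direct O(N^2) discrete Hartley transform of type II:
    H[k] = sum_n x[n] * cas(2*pi*k*(n + 1/2) / N).
    A radix-2 DIT algorithm computes the same values in O(N log N)
    by recursively splitting even- and odd-indexed samples.
    """
    N = len(x)
    return [sum(x[n] * cas(2 * math.pi * k * (n + 0.5) / N)
                for n in range(N))
            for k in range(N)]

# A unit impulse at n = 0 transforms to cas(pi*k/N), a handy sanity check.
H = dht2([1.0, 0.0, 0.0, 0.0])
print([round(h, 6) for h in H])
```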

  2. Retention of tracheal intubation skills by novice personnel: a comparison of the Airtraq and Macintosh laryngoscopes.

    LENUS (Irish Health Repository)

    Maharaj, C H

    2007-03-01

    Direct laryngoscopic tracheal intubation is a potentially lifesaving manoeuvre, but it is a difficult skill to acquire and to maintain. These difficulties are exacerbated if the opportunities to utilise this skill are infrequent, and by the fact that the consequences of poorly performed intubation attempts may be severe. Novice users find the Airtraq laryngoscope easier to use than the conventional Macintosh laryngoscope. We therefore wished to determine whether novice users would have greater retention of intubation skills with the Airtraq rather than the Macintosh laryngoscope. Twenty medical students who had no prior airway management experience participated in this study. Following brief didactic instruction, each took turns performing laryngoscopy and intubation using the Macintosh and Airtraq devices in easy and simulated difficult laryngoscopy scenarios. The degree of success with each device, the time taken to perform intubation and the assistance required, and the potential for complications were then assessed. Six months later, the assessment process was repeated. No didactic instruction or practice attempts were provided on this latter occasion. Tracheal intubation skills declined markedly with both devices. However, the Airtraq continued to provide better intubating conditions, resulting in greater success of intubation, with fewer optimisation manoeuvres required, and reduced potential for dental trauma, particularly in the difficult laryngoscopy scenarios. The substantial decline in direct laryngoscopy skills over time emphasises the need for continued reinforcement of this complex skill.

  3. [McGRATH® MAC Is Useful to Learn Tracheal Intubation Using a Macintosh Laryngoscope].

    Science.gov (United States)

    Wakasugi, Keiko; Niyama, Yukitoshi; Kita, Asuka; Sonoda, Hajime; Yamakage, Michiaki

    2015-10-01

    Learning tracheal intubation using a Macintosh laryngoscope (McL) is important even though video laryngoscopes are becoming popular. The purpose of this study was to compare three devices, the McGRATH® MAC (MAC), the Airwayscope® (AWS), and the McL itself, as training devices for the intubation technique using the McL. In this prospective study, 60 nurses with no experience in tracheal intubation were randomly assigned to MAC, AWS, and McL groups (each group: n=20), and 10 practice intubations were carried out with each device. We compared the intubation time using the McL and the nurses' anatomical understanding of the larynx before and after the practice. The intubation time before the practice was comparable among the three groups, but the time after the practice was significantly shorter in the McL and MAC groups compared to the AWS group (P=0.001). The practice significantly improved anatomical understanding of the larynx in all groups, and the improvement was greater in the MAC and AWS groups compared with the McL group. The McGRATH® MAC may possess advantages over the Airwayscope® and the Macintosh laryngoscope as a training device for learning the intubation technique using the Macintosh laryngoscope and for understanding the anatomy of the larynx.

  4. Measurement of forces applied during Macintosh direct laryngoscopy compared with GlideScope® videolaryngoscopy.

    Science.gov (United States)

    Russell, T; Khan, S; Elman, J; Katznelson, R; Cooper, R M

    2012-06-01

    Laryngoscopy can induce stress responses that may be harmful in susceptible patients. We directly measured the force applied to the base of the tongue as a surrogate for the stress response. Force measurements were obtained using three FlexiForce® sensors (Tekscan Inc, Boston, MA, USA) attached along the concave surface of each laryngoscope blade. Twenty-four adult patients of ASA physical status 1-2 were studied. After induction of anaesthesia and neuromuscular blockade, laryngoscopy and tracheal intubation were performed using either a Macintosh or a GlideScope® (Verathon, Bothell, WA, USA) laryngoscope. Complete data were available for 23 patients. Compared with the Macintosh, we observed lower median (IQR [range]) peak force (9 (5-13 [3-25]) N vs 20 (14-28 [4-41]) N; p = 0.0001), average force (5 (3-7 [2-19]) N vs 11 (6-16 [1-24]) N; p = 0.0003) and impulse force (98 (42-151 [26-444]) Ns vs 150 (93-207 [17-509]) Ns; p = 0.017) with the GlideScope. Our study shows that the peak lifting force on the base of the tongue during laryngoscopy is less with the GlideScope videolaryngoscope compared with the Macintosh laryngoscope.

  5. Computer Aided Design of Polygalacturonase II from Aspergillus niger

    Directory of Open Access Journals (Sweden)

    Ibrahim Ali Noorbatcha

    2011-12-01

    Pectin is a complex polysaccharide found in the cell walls of plants, consisting mainly of esterified D-galacturonic acid residues in an α-(1→4) chain. In fruit juice production, pectin contributes to juice viscosity, thereby reducing juice yield and increasing filtration time. Polygalacturonase improves the juice production process by rapidly degrading pectin. In this project we designed a novel polygalacturonase enzyme using computer-aided design approaches. The three-dimensional structure of polygalacturonase was first modeled on the basis of the known crystal structure. The active site of the enzyme was identified by manual and automated docking methods. A Lamarckian genetic algorithm was used for automated docking, and the active site was validated by comparison with existing experimental data. This was followed by in silico mutations of the enzyme, and the automated docking process was repeated using the mutant enzymes. The strength of ligand binding inside the active site was evaluated by computing the binding score using the Potential of Mean Force (PMF) method. The in silico mutations R256Q and K258N were found to decrease the binding strength of the ligand at the active site, indicating lowered enzyme activity, which is consistent with the experimental results. Hence in silico mutations can be used to design new polygalacturonase enzymes with improved enzyme activity.

  6. Phase II Final Report Computer Optimization of Electron Guns

    Energy Technology Data Exchange (ETDEWEB)

    R. Lawrence Ives; Thuc Bui; Hien Tran; Michael Read; Adam Attarian; William Tallis

    2011-04-15

    This program implemented advanced computer optimization into an adaptive mesh, finite element, 3D, charged particle code. The routines can optimize electron gun performance to achieve a specified current, beam size, and perveance. It can also minimize beam ripple and electric field gradients. The magnetics optimization capability allows design of coil geometries and magnetic material configurations to achieve a specified axial magnetic field profile. The optimization control program, built into the charged particle code Beam Optics Analyzer (BOA) utilizes a 3D solid modeling package to modify geometry using design tables. Parameters within the graphical user interface (currents, voltages, etc.) can be directly modified within BOA. The program implemented advanced post processing capability for the optimization routines as well as the user. A Graphical User Interface allows the user to set up goal functions, select variables, establish ranges of variation, and define performance criteria. The optimization capability allowed development of a doubly convergent multiple beam gun that could not be designed using previous techniques.

  7. New variational bounds on convective transport. II. Computations and implications

    Science.gov (United States)

    Souza, Andre; Tobasco, Ian; Doering, Charles R.

    2016-11-01

    We study the maximal rate of scalar transport between parallel walls separated by distance h, by an incompressible fluid with scalar diffusion coefficient κ. Given a velocity field u with intensity measured by the Péclet number Pe = h⟨|u|²⟩^(1/2)/κ (where ⟨·⟩ denotes the space-time average), the challenge is to determine the largest enhancement of wall-to-wall scalar flux over purely diffusive transport, i.e., the Nusselt number Nu. Variational formulations of the problem are studied numerically and optimizing flow fields are computed over a range of Pe. Implications of this optimal wall-to-wall transport problem for the classical problem of Rayleigh-Bénard convection are discussed: the maximal scaling Nu ~ Pe^(2/3) corresponds, via the identity Pe² = Ra(Nu − 1), where Ra is the usual Rayleigh number, to Nu ~ Ra^(1/2) as Ra → ∞. Supported in part by National Science Foundation Graduate Research Fellowship DGE-0813964, awards OISE-0967140, PHY-1205219, DMS-1311833, and DMS-1515161, and the John Simon Guggenheim Memorial Foundation.
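    The claimed correspondence between the two scalings can be checked numerically. The sketch below solves Nu = c·Pe^(2/3) together with Pe² = Ra(Nu − 1) by fixed-point iteration, with an arbitrary illustrative prefactor c, and shows Nu/Ra^(1/2) approaching the constant c^(3/2) at large Ra.

```python
import math

def nusselt(Ra, c=1.0, iters=200):
    """Solve Nu = c * Pe^(2/3) subject to Pe^2 = Ra * (Nu - 1) by
    fixed-point iteration (c is an arbitrary illustrative prefactor;
    the iteration map contracts with rate ~1/3 near the fixed point)."""
    Nu = 2.0
    for _ in range(iters):
        Pe = math.sqrt(Ra * (Nu - 1.0))
        Nu = c * Pe ** (2.0 / 3.0)
    return Nu

# Nu / Ra^(1/2) tends to c^(3/2) as Ra grows, i.e. the maximal scaling
# Nu ~ Pe^(2/3) is equivalent to Nu ~ Ra^(1/2) for Rayleigh-Benard.
for Ra in (1e6, 1e9, 1e12):
    print(Ra, nusselt(Ra) / math.sqrt(Ra))
```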

  8. CPESIM II: A Computer System Simulation for Computer Performance Evaluation Use.

    Science.gov (United States)

    1983-12-01

    [Scanned program-listing header; recoverable: CPESIM II discrete-event file SIMF, version 1.0, dated 5 Dec 1983, author David L. Owen.] In the second phase, the actual simulation programs, SIMS and SIMF, are used along with CONDES and EVSTR as input to generate performance data, as shown in Figure 8. It is necessary to attach the network portion of the model (SIMS) and the discrete portion of the model (SIMF), as well as the CONDES

  9. Mononuclear nickel (II) and copper (II) coordination complexes supported by bispicen ligand derivatives: Experimental and computational studies

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Nirupama; Niklas, Jens; Poluektov, Oleg; Van Heuvelen, Katherine M.; Mukherjee, Anusree

    2017-01-01

    The synthesis, characterization, and density functional theory calculations of mononuclear Ni and Cu complexes supported by the N,N'-Dimethyl-N,N'-bis-(pyridine-2-ylmethyl)-1,2-diaminoethane ligand and its derivatives are reported. The complexes were characterized by X-ray crystallography as well as by UV-visible absorption spectroscopy and EPR spectroscopy. The solid-state structures of these coordination complexes revealed that the geometry of each complex depended on the identity of the metal center. Solution-phase characterization data are in accord with the solid-phase structures, indicating minimal structural changes in solution. Optical spectroscopy revealed that all of the complexes exhibit color owing to d-d transition bands in the visible region. Magnetic parameters obtained from EPR spectroscopy, together with other structural data, suggest that the Ni(II) complexes adopt a pseudo-octahedral geometry and the Cu(II) complexes a distorted square-pyramidal geometry. In order to understand in detail how ligand sterics and electronics affect complex topology, detailed computational studies were performed. The series of complexes reported in this article will add significant value to the field of coordination chemistry, as Ni(II) and Cu(II) complexes supported by tetradentate pyridyl-based ligands are rather scarce.

  10. A comparison of GlideScope videolaryngoscope with Macintosh laryngoscope for laryngeal views

    Institute of Scientific and Technical Information of China (English)

    LI Jin-bao; DENG Xiao-ming; WANG Xiao-lin; XIONG Yuan-chang; FAN Xiao-hua; LIU Yi; XU Hua; MA Yu; DU Jian-er; ZHAI Rong

    2007-01-01

    Objective: To describe the use of the GlideScope in comparison with direct laryngoscopy for elective surgical patients requiring tracheal intubation. Methods: Two hundred patients, ASA Ⅰ-Ⅱ, scheduled for elective surgery under general anesthesia requiring orotracheal intubation were selected. Information was collected identifying the patient demographics and airway assessment features (Mallampati oropharyngeal scale, thyromental distance and mouth opening). In a random crossover design, after induction of anesthesia and neuromuscular block, the laryngoscopes were inserted in turn, and the views of the glottis at laryngoscopy (Cormack and Lehane scores) were compared. The trachea was intubated using either the standard Macintosh laryngoscope or the GlideScope after the second grading at laryngoscopy was done. Complications associated with intubation were recorded. Results: There were 200 patients, including 107 males and 93 females, with mean age 52±13 years, height 164.8±11.3 cm, weight 64.0±11.5 kg, thyromental distance 6.9±1.1 cm, and mouth opening 5.7±0.5 cm. There was a significant association between the preoperative view of the oropharynx (Mallampati score) and the view of the glottis at laryngoscopy for both the direct Macintosh laryngoscope (P<0.001) and the GlideScope (P<0.001). Among the 200 patients, 106 had the same C&L grade with both devices, 91 of the remaining patients showed improvement in the C&L grade (P<0.001) with the GlideScope compared with the direct Macintosh laryngoscope, and 3 of the remaining patients showed a better view of the glottis (C&L grade) with the direct Macintosh laryngoscope (grade 1) than with the GlideScope (grade 2). There were no cases of failure to intubate. There were no cases of dental or mucosal injury in any patient. Conclusion: The GlideScope videolaryngoscope yielded a comparable or superior laryngeal view compared with the Macintosh laryngoscope. The new type of laryngoscope may have potential advantages for managing the difficult

  11. Computer aided systems human engineering: A hypermedia tool

    Science.gov (United States)

    Boff, Kenneth R.; Monk, Donald L.; Cody, William J.

    1992-01-01

    The Computer Aided Systems Human Engineering (CASHE) system, Version 1.0, is a multimedia ergonomics database on CD-ROM for the Apple Macintosh II computer, being developed for use by human system designers, educators, and researchers. It will initially be available on CD-ROM and will allow users to access ergonomics data and models stored electronically as text, graphics, and audio. The CASHE CD-ROM, Version 1.0 will contain the Boff and Lincoln (1988) Engineering Data Compendium, MIL-STD-1472D and a unique, interactive simulation capability, the Perception and Performance Prototyper. Its features also include a specialized data retrieval, scaling, and analysis capability and the state of the art in information retrieval, browsing, and navigation.

  12. THE IMPROVEMENT OF COMPUTER NETWORK PERFORMANCE WITH BANDWIDTH MANAGEMENT IN KEMURNIAN II SENIOR HIGH SCHOOL

    Directory of Open Access Journals (Sweden)

    Bayu Kanigoro

    2012-05-01

    Full Text Available This research describes the improvement of computer network performance through bandwidth management at Kemurnian II Senior High School. The main problem addressed is the absence of per-user bandwidth division: a user downloading data could absorb all of the available bandwidth, leaving none for other users. In addition, IP addresses had already been divided by room (computer, teacher, and administration rooms) to support the learning process, so a wireless network was needed. The method comprised on-site observation and interviews with the relevant parties at the school, analysis of the existing network, and the design of a new topology, including a wireless network and its configuration, with bandwidth separation and limiting configured on a MikroTik router. As a result, network traffic at Kemurnian II Senior High School can be shared evenly among users; IX and IIX traffic are separated, which improves the speed of network access at the school; and the wireless network was implemented. Keywords: Bandwidth Management; Wireless Network

  13. A comparison of the C-MAC video laryngoscope to the Macintosh direct laryngoscope for intubation in the emergency department.

    Science.gov (United States)

    Sakles, John C; Mosier, Jarrod; Chiu, Stephen; Cosentino, Mari; Kalin, Leah

    2012-12-01

    We determine the proportion of successful intubations with the C-MAC video laryngoscope (C-MAC) compared with the direct laryngoscope in emergency department (ED) intubations. This was a retrospective analysis of prospectively collected data entered into a continuous quality improvement database during a 28-month period in an academic ED. After each intubation, the operator completed a standardized data form evaluating multiple aspects of the intubation, including patient demographics, indication for intubation, device(s) used, reason for device selection, difficult airway characteristics, number of attempts, and outcome of each attempt. Intubation was considered ultimately successful if the endotracheal tube was correctly inserted into the trachea with the initial device. An attempt was defined as insertion of the device into the mouth regardless of whether there was an attempt to pass the tube. The primary outcome measure was ultimate success. Secondary outcome measures were first-attempt success, Cormack-Lehane view, and esophageal intubation. Multivariate logistic regression analyses, with the inclusion of a propensity score, were performed for the outcome variables ultimate success and first-attempt success. During the 28-month study period, 750 intubations were performed with either the C-MAC with a size 3 or 4 blade or a direct laryngoscope with a Macintosh size 3 or 4 blade. Of these, 255 were performed with the C-MAC as the initial device and 495 with a Macintosh direct laryngoscope as the initial device. The C-MAC resulted in successful intubation in 248 of 255 cases (97.3%; 95% confidence interval [CI] 94.4% to 98.9%). A direct laryngoscope resulted in successful intubation in 418 of 495 cases (84.4%; 95% CI 81.0% to 87.5%). In the multivariate regression model, with a propensity score included, the C-MAC was positively predictive of ultimate success (odds ratio 12.7; 95% CI 4.1 to 38.8) and first-attempt success (odds ratio 2.2; 95% CI 1.2 to 3.8). 
When
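
    The reported success proportions and confidence intervals can be reproduced from the raw counts (248/255 for the C-MAC, 418/495 for the direct laryngoscope). A minimal sketch using the Wilson score interval; the paper does not state which interval method its authors used, so small discrepancies in the upper bounds are expected:

    ```python
    from math import sqrt

    def wilson_ci(successes, n, z=1.96):
        """Wilson score 95% confidence interval for a binomial proportion."""
        p = successes / n
        denom = 1 + z * z / n
        centre = p + z * z / (2 * n)
        margin = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n))
        return (centre - margin) / denom, (centre + margin) / denom

    # C-MAC: 248/255 ultimately successful; Macintosh DL: 418/495
    for name, k, n in [("C-MAC", 248, 255), ("Macintosh DL", 418, 495)]:
        lo, hi = wilson_ci(k, n)
        print(f"{name}: {100 * k / n:.1f}% (95% CI {100 * lo:.1f}% to {100 * hi:.1f}%)")
    ```

    This recovers 97.3% and 84.4% with lower bounds matching the reported 94.4% and 81.0%; the upper bounds come out slightly below the published 98.9% and 87.5%, consistent with the paper having used an exact (Clopper-Pearson) interval.
    
    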

  14. Alkyl sulfonic acid hydrazides: Synthesis, characterization, computational studies and anticancer, antibacterial, anticarbonic anhydrase II (hCA II) activities

    Science.gov (United States)

    O. Ozdemir, Ummuhan; İlbiz, Firdevs; Balaban Gunduzalp, Ayla; Ozbek, Neslihan; Karagoz Genç, Zuhal; Hamurcu, Fatma; Tekin, Suat

    2015-11-01

    Methane sulfonic acid hydrazide, CH3SO2NHNH2 (1), ethane sulfonic acid hydrazide, CH3CH2SO2NHNH2 (2), propane sulfonic acid hydrazide, CH3CH2CH2SO2NHNH2 (3), and butane sulfonic acid hydrazide, CH3CH2CH2CH2SO2NHNH2 (4) have been synthesized as a homologous series and characterized by elemental analysis and spectrophotometric methods (1H-13C NMR, FT-IR, LC-MS). In order to gain insight into the structure of the compounds, we performed computational studies using the 6-311G(d,p) basis set with the B3LYP functional. The geometries of the sulfonic acid hydrazides were optimized with the DFT method in the Gaussian 09 program package. A conformational analysis of the compounds was performed using theoretical NMR calculations at the DFT/B3LYP/6-311++G(2d,2p) level of theory with the gauge-including atomic orbital (GIAO) approach. The anticancer activities of these compounds on the MCF-7 human breast cancer cell line were investigated by comparing IC50 values. The antibacterial activities of the synthesized compounds were studied against Gram-positive bacteria (Staphylococcus aureus ATCC 6538, Bacillus subtilis ATCC 6633, Bacillus cereus NRRL-B-3711, Enterococcus faecalis ATCC 29212) and Gram-negative bacteria (Escherichia coli ATCC 11230, Pseudomonas aeruginosa ATCC 15442, Klebsiella pneumoniae ATCC 70063) using the disc diffusion method. The inhibition activities of these compounds on the carbonic anhydrase II enzyme (hCA II) were investigated by comparing IC50 and Ki values. The biological activity screening shows that butane sulfonic acid hydrazide (4) is more active than the others against the tested breast cancer cell line MCF-7, the Gram-negative and Gram-positive bacteria, and the carbonic anhydrase II (hCA II) isoenzyme.

  15. Evaluation of Truview evo2 Laryngoscope In Anticipated Difficult Intubation - A Comparison To Macintosh Laryngoscope.

    Science.gov (United States)

    Singh, Ishwar; Khaund, Abhijit; Gupta, Abhishek

    2009-04-01

    The aim of the study was to assess and compare the laryngoscopic view obtained with the Truview evo2 laryngoscope with that of the Macintosh laryngoscope in patients with one or more predictors of difficult intubation (PDI), and to assess ease of intubation with the Truview evo2 in terms of absolute time requirement. Patients for elective surgery requiring endotracheal intubation were initially assessed for three PDI parameters: modified Mallampati test, thyromental distance, and atlanto-occipital (AO) joint extension. Patients with cumulative PDI scores of 2 to 5 (on a scale of 0 to 8) were evaluated for Cormack and Lehane (CL) grading with the Macintosh blade after standard induction. Cases with a CL grade of two or more were further evaluated with the Truview evo2 laryngoscope and corresponding CL grades were assigned. Intubation was attempted under Truview evo2 vision and the time required for each successful tracheal intubation (i.e., tracheal intubation completed within one minute) was noted. Fifty cases were studied. The CL grades assigned by the Macintosh blade correlated well with the cumulative PDI scores assigned preoperatively, confirming their predictability. The Truview evo2 improved the laryngeal view in 92% of cases by one or more CL grades. Intubation with the Truview evo2 was possible in 88% of cases within the stipulated time of one minute, and the mean time of 28.6 seconds (SD 11.23) was reasonably quick. No significant complications, such as oropharyngeal trauma or an extreme pressor response to laryngoscopy, were noticed. To conclude, the Truview evo2 proved to be a better tool than the conventional laryngoscope in anticipated difficult situations.

  16. Macros for Educational Research: Part II.

    Science.gov (United States)

    Woodrow, Janice E. J.

    1989-01-01

    Describes the design and operation of four software packages, or macros, written in the programming language of Microsoft's EXCEL for use on the Macintosh computer for data manipulation and presentation in educational research. Reordering tabulated data, reversing the scoring of tabulated data, and creating tables and graphs are explained.

  17. THE COMPARATIVE STUDY OF STANDARD MACINTOSH HANDLE VERSUS SHORT HANDLE FOR LARYNGOSCOPY AND INTUBATION IN OBSTETRIC PATIENTS FOR LOWER SEGMENT CESAREAN SECTION

    Directory of Open Access Journals (Sweden)

    Neeharika

    2014-09-01

    Full Text Available INTRODUCTION: The incidence of failed intubation is higher in obstetrics (1:280) than in other surgical patients (1:2230). The anatomical factors that place the pregnant patient at increased risk for airway complications and difficult intubation include pregnancy-induced generalized weight gain, particularly an increase in breast size, respiratory mucosal edema, and an increased risk of pulmonary aspiration. In the supine position, the enlarged breasts tend to fall back against the neck, which can interfere with insertion of the laryngoscope. The aim of our study was to assess the efficacy of a short-handle laryngoscope versus the standard Macintosh-handle laryngoscope for laryngoscopy and intubation in obstetric patients posted for lower segment cesarean section. PLAN OF STUDY: Randomized prospective study. ASA grade I and II full-term obstetric patients posted for elective or emergency LSCS were studied in two groups [Group I (n=20): standard Macintosh handle; Group II (n=20): short/stubby handle (Anesthetics make, India)]. Height and weight of the patients were recorded. The head, neck, and oral cavity of each patient were examined to rule out any obvious pathology and to exclude any anticipated difficult intubations. Examination of the airway included neck length, sternomental distance, thyromental distance, inter-incisor gap, chest circumference, and modified Mallampati grading. The observations noted during laryngoscopy were: number of attempts at insertion of the laryngoscope into the oral cavity, ease of insertion of the laryngoscope blade into the oral cavity, number of attempts for successful intubation, duration of laryngoscopy and intubation, and the perpendicular distance from the lower edge of the distal end of the laryngoscope handle to the patient's chest wall. OBSERVATIONS: The perpendicular distance was significantly higher in group II (16 cm) than in group I (13.6 cm). The time for laryngoscopy and intubation had a significant correlation with weight as well as chest circumference in

  18. COMPARISON BETWEEN MACINTOSH LARYNGOSCOPE AND MCGRATH VIDEO LARYNGOSCOPE FOR ENDOTRACHEAL INTUBATION IN NEUROSURGICAL PATIENTS

    Directory of Open Access Journals (Sweden)

    Aastha

    2016-03-01

    Full Text Available This study was done on sixty ASA 1 and 2 patients, without risk factors, undergoing elective surgery under general anaesthesia. The patients were allocated to two groups of 30 patients each. Direct laryngoscopy group (group 1) patients were intubated using a direct laryngoscope. Video laryngoscopy group (group 2) patients were intubated using the McGrath VLS. The distribution of patients according to age, sex, and weight was comparable (p>.001) between the two groups. The changes in heart rate, mean arterial pressure, and oxygen saturation after intubation were not significant (p>.001) between the two groups at the different time intervals. The number of attempts and the intubation time were significantly higher with the McGrath VLS than with the Macintosh laryngoscope. The increase in postoperative sore throat and hoarseness at 6 and 24 hours after the operation was significantly greater in group 1 than in group 2. From our study we conclude that the use of the McGrath video laryngoscope has no advantage over direct laryngoscopy in attenuating the cardiovascular responses attributed to tracheal intubation in patients with a normal airway. It is also associated with a greater number of attempts and a longer intubation time. However, with the use of a stylet the number of attempts can be reduced, although the stylet has its own complications. The VLS has a lower incidence of postoperative sore throat and hoarseness compared with Macintosh laryngoscopy.

  19. Tracheal intubation by inexperienced medical residents using the Airtraq and Macintosh laryngoscopes--a manikin study.

    LENUS (Irish Health Repository)

    Maharaj, Chrisen H

    2006-11-01

    The Airtraq laryngoscope is a novel intubation device that may possess advantages over conventional direct laryngoscopes when used by personnel who are infrequently required to perform tracheal intubation. We conducted a prospective study of 20 medical residents with little prior airway management experience. After brief didactic instruction, each participant took turns performing laryngoscopy and intubation using the Macintosh (Welch Allyn, NY) and Airtraq (Prodol Ltd., Vizcaya, Spain) devices, in 3 laryngoscopy scenarios in a Laerdal Intubation Trainer (Laerdal, Stavanger, Norway) and 1 scenario in a Laerdal SimMan manikin (Laerdal, Kent, UK). They then performed tracheal intubation of the normal airway a second time to characterize the learning curve. In all scenarios tested, the Airtraq decreased the duration of intubation attempts, reduced the number of optimization maneuvers required, and reduced the potential for dental trauma. The residents found the Airtraq easier to use in all scenarios compared with the Macintosh laryngoscope. The Airtraq may constitute a superior device for use by personnel infrequently required to perform tracheal intubation.

  20. Learning Curves of Macintosh Laryngoscope in Nurse Anesthetist Trainees Using Cumulative Sum Method

    Directory of Open Access Journals (Sweden)

    Panthila Rujirojindakul

    2014-01-01

    Full Text Available Background. Tracheal intubation is a potentially life-saving procedure. This skill is taught to many anesthetic healthcare professionals, including nurse anesthetists. Our goal was to evaluate the learning ability of nurse anesthetist trainees in their performance of orotracheal intubation with the Macintosh laryngoscope. Methods. Eleven nurse anesthetist trainees were enrolled in the study during the first three months of their training. All trainees attended formal lectures and at least one practice session with manikins, performing tracheal intubation under the supervision of anesthesiology staff. Learning curves for each nurse anesthetist trainee were constructed with the standard cumulative summation (cusum) method. Results. Tracheal intubation was attempted on 388 patients. Three hundred and six patients (78.9%) were successfully intubated on the trainees' first attempt and 17 patients (4.4%) on the second attempt. The mean ± SD number of orotracheal intubations per trainee was 35.5 ± 5.1 (range 30-47). Ten (90.9%) of the 11 trainees crossed the 20% acceptable failure rate line. A median of 22 procedures was required to achieve an 80% orotracheal intubation success rate. Conclusion. At least 22 procedures were required to reach an 80% success rate for orotracheal intubation using the Macintosh laryngoscope in inexperienced nurse anesthetist trainees.
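
    The cusum construction behind such learning curves can be illustrated with a short sketch. The 20% acceptable failure rate is taken from the abstract; the unacceptable failure rate (p1) and type I/II error rates below are conventional choices from the cusum literature, not values stated by the paper:

    ```python
    from math import log

    def cusum_curve(outcomes, p0=0.20, p1=0.40, alpha=0.10, beta=0.10):
        """Cumulative-sum learning curve for a sequence of binary outcomes.

        outcomes: 1 = failed intubation, 0 = successful intubation.
        p0/p1: acceptable and unacceptable failure rates.
        alpha/beta: type I and type II error rates for the decision boundaries.
        Returns the cusum trajectory plus the boundary heights h0 (competence)
        and h1 (inadequate performance).
        """
        P, Q = log(p1 / p0), log((1 - p0) / (1 - p1))
        s = Q / (P + Q)                          # decrement per success
        h0 = log((1 - alpha) / beta) / (P + Q)   # drop of h0 -> acceptably low failure rate
        h1 = log((1 - beta) / alpha) / (P + Q)   # rise of h1 -> unacceptably high failure rate
        traj, c = [], 0.0
        for failed in outcomes:
            c += (1 - s) if failed else -s
            traj.append(c)
        return traj, h0, h1
    ```

    A trainee is deemed to have reached the acceptable failure rate once the curve falls h0 units below its starting level; "crossing the 20% acceptable failure rate line" in the abstract refers to this event.
    
    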

  1. Conformational effects on the circular dichroism of Human Carbonic Anhydrase II: a multilevel computational study.

    Directory of Open Access Journals (Sweden)

    Tatyana G Karabencheva-Christova

    Full Text Available Circular dichroism (CD) spectroscopy is a powerful method for investigating conformational changes in proteins and therefore has numerous applications in structural and molecular biology. Here a computational investigation of the CD spectrum of Human Carbonic Anhydrase II (HCAII), with a main focus on the near-UV CD spectra of the wild-type enzyme and its seven tryptophan mutant forms, is presented and compared to experimental studies. Multilevel computational methods (molecular dynamics, semiempirical quantum mechanics, time-dependent density functional theory) were applied in order to gain insight into the mechanisms of interaction between the aromatic chromophores within the protein environment and to understand how the conformational flexibility of the protein influences these mechanisms. The analysis suggests that combining semiempirical CD calculations, crystal structures, and molecular dynamics (MD) could help achieve better agreement between computed and experimental protein spectra and provide unique insight into the dynamic nature of the mechanisms of chromophore interactions.

  2. wolfPAC: building a high-performance distributed computing network for phylogenetic analysis using 'obsolete' computational resources.

    Science.gov (United States)

    Reeves, Patrick A; Friedman, Philip H; Richards, Christopher M

    2005-01-01

    wolfPAC is an AppleScript-based software package that facilitates the use of numerous, remotely located Macintosh computers to perform computationally-intensive phylogenetic analyses using the popular application PAUP* (Phylogenetic Analysis Using Parsimony). It has been designed to utilise readily available, inexpensive processors and to encourage sharing of computational resources within the worldwide phylogenetics community.

  3. Tracheal intubation in patients with cervical spine immobilization: a comparison of the Airwayscope, LMA CTrach, and the Macintosh laryngoscopes.

    LENUS (Irish Health Repository)

    Malik, M A

    2009-05-01

    The purpose of this study was to evaluate the effectiveness of the Pentax AWS, and the LMA CTrach, in comparison with the Macintosh laryngoscope, when performing tracheal intubation in patients with neck immobilization using manual in-line axial cervical spine stabilization.

  4. Comparison of Macintosh, Truview EVO2, Glidescope, and Airwayscope laryngoscope use in patients with cervical spine immobilization.

    LENUS (Irish Health Repository)

    Malik, M A

    2008-11-01

    The purpose of this study was to evaluate the effectiveness of the Pentax AWS, Glidescope, and the Truview EVO2, in comparison with the Macintosh laryngoscope, when performing tracheal intubation in patients with neck immobilization using manual in-line axial cervical spine stabilization.

  5. GlideScope videolaryngoscope vs. Macintosh direct laryngoscope for intubation of morbidly obese patients: a randomized trial

    DEFF Research Database (Denmark)

    Andersen, L H; Rovsing, Marie Louise; Olsen, K S

    2011-01-01

    Morbidly obese patients are at increased risk of hypoxemia during tracheal intubation because of increased frequency of difficult and impossible intubation and a decreased apnea tolerance. In this study, intubation with the GlideScope videolaryngoscope (GS) was compared with the Macintosh direct...

  6. Radial electric field computations with DKES and neoclassical models in TJ-II stellarator

    Science.gov (United States)

    Martinell, Julio; Gutierrez-Tapia, Cesar; Lopez-Bruna, Daniel

    2015-11-01

    Radial electric fields arise from non-ambipolar transport in stellarator plasmas and play an important role in determining some improved confinement regimes. Calculating this electric field requires accounting for all particle fluxes that are not ambipolar, of which the most important contribution comes from neoclassical transport. Here we use particle fluxes obtained from kinetic-equation computations with the code DKES to evaluate radial electric field profiles for certain discharges of the heliac TJ-II. Experimental density and temperature profiles are used together with the diffusion coefficients obtained with DKES. A similar computation of the electric field is performed with three analytical neoclassical models that use an approximation for the magnetic geometry. The ambipolar electric field from the models is compared with the one given by DKES, and we find that they are all qualitatively similar. They are also compared with experimental measurements of the electric field obtained with HIBP. It is shown that, although the electric field is reasonably well reproduced by the neoclassical computations, especially in high-temperature regimes, the particle fluxes are not. Thus, neoclassical theory provides good Er estimates in TJ-II. Support from CONACyT 152905 and DGAPA IN109115 projects is acknowledged.
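
    The ambipolarity condition that fixes Er — equal ion and electron radial particle fluxes — amounts to a root-finding problem. A minimal sketch with toy flux models; the functional forms below are purely illustrative placeholders, not the DKES or analytic-model expressions:

    ```python
    import math

    def ambipolar_Er(flux_i, flux_e, lo=-30.0, hi=30.0, tol=1e-8):
        """Find the radial electric field (toy units) where the ion and electron
        particle fluxes balance, by bisection on f(Er) = flux_i(Er) - flux_e(Er).
        Assumes a single root is bracketed in [lo, hi]."""
        f = lambda Er: flux_i(Er) - flux_e(Er)
        assert f(lo) * f(hi) < 0, "no sign change: bracket a root first"
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if f(lo) * f(mid) <= 0:
                hi = mid
            else:
                lo = mid
        return 0.5 * (lo + hi)

    # Illustrative monotone flux models (not physical):
    flux_ion = lambda Er: math.exp(-0.1 * Er)        # ion flux falls with Er
    flux_electron = lambda Er: 0.8 + 0.05 * Er       # electron flux rises with Er
    Er_root = ambipolar_Er(flux_ion, flux_electron)  # the ambipolar "root"
    ```

    In practice the neoclassical flux-vs-Er curves can yield multiple roots (the familiar ion and electron branches); a production solver scans Er and classifies every sign change rather than assuming a single bracketed root.
    
    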

  7. Electronic structure of nickel(II) and zinc(II) borohydrides from spectroscopic measurements and computational modeling.

    Science.gov (United States)

    Desrochers, Patrick J; Sutton, Christopher A; Abrams, Micah L; Ye, Shengfa; Neese, Frank; Telser, Joshua; Ozarowski, Andrew; Krzystek, J

    2012-03-05

    The previously reported Ni(II) complex, Tp*Ni(κ(3)-BH(4)) (Tp* = hydrotris(3,5-dimethylpyrazolyl)borate anion), which has an S = 1 spin ground state, was studied by high-frequency and -field electron paramagnetic resonance (HFEPR) spectroscopy as a solid powder at low temperature, by UV-vis-NIR spectroscopy in the solid state and in solution at room temperature, and by paramagnetic (11)B NMR. HFEPR provided its spin Hamiltonian parameters: D = 1.91(1) cm(-1), E = 0.285(8) cm(-1), g = [2.170(4), 2.161(3), 2.133(3)]. Similar, but not identical parameters were obtained for its borodeuteride analogue. The previously unreported complex, Tp*Zn(κ(2)-BH(4)), was prepared, and IR and NMR spectroscopy allowed its comparison with analogous closed shell borohydride complexes. Ligand-field theory was used to model the electronic transitions in the Ni(II) complex successfully, although it was less successful at reproducing the zero-field splitting (zfs) parameters. Advanced computational methods, both density functional theory (DFT) and ab initio wave function based approaches, were applied to these Tp*MBH(4) complexes to better understand the interaction between these metals and borohydride ion. DFT successfully reproduced bonding geometries and vibrational behavior of the complexes, although it was less successful for the spin Hamiltonian parameters of the open shell Ni(II) complex. These were instead best described using ab initio methods. The origin of the zfs in Tp*Ni(κ(3)-BH(4)) is described and shows that the relatively small magnitude of D results from several spin-orbit coupling (SOC) interactions of large magnitude, but with opposite sign. Spin-spin coupling (SSC) is also shown to be significant, a point that is not always appreciated in transition metal complexes. Overall, a picture of bonding and electronic structure in open and closed shell late transition metal borohydrides is provided, which has implications for the use of these complexes in catalysis and

  8. HYDRA-II: A hydrothermal analysis computer code: Volume 3, Verification/validation assessments

    Energy Technology Data Exchange (ETDEWEB)

    McCann, R.A.; Lowery, P.S.

    1987-10-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume I - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. This volume, Volume III - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. This volume also documents comparisons between the results of simulations of single- and multiassembly storage systems and actual experimental data. 11 refs., 55 figs., 13 tabs.
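
    The verification strategy this volume documents — comparing a finite-difference solution against a known analytical solution — can be illustrated in miniature with steady 1-D heat conduction, whose exact solution between fixed-temperature ends is linear. This sketch is illustrative only and is unrelated to HYDRA-II's actual numerics:

    ```python
    def steady_conduction_1d(t_left, t_right, n=51, iters=20000):
        """Jacobi iteration for steady 1-D heat conduction (d2T/dx2 = 0)
        with fixed-temperature ends on a uniform n-point grid."""
        T = [0.0] * n
        T[0], T[-1] = t_left, t_right
        for _ in range(iters):
            # each interior node relaxes to the mean of its neighbors
            T = [T[0]] + [0.5 * (T[i - 1] + T[i + 1]) for i in range(1, n - 1)] + [T[-1]]
        return T

    T = steady_conduction_1d(0.0, 100.0)
    # verification: compare against the analytical (linear) profile T(x) = 100 x / L
    err = max(abs(T[i] - 100.0 * i / 50) for i in range(51))
    ```

    Driving the numerical error `err` below a tolerance against the exact solution is the single-problem analogue of the code-level assessments reported in this volume.
    
    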

  9. Comparison of the C-MAC video laryngoscope with direct Macintosh laryngoscopy in the emergency department.

    Science.gov (United States)

    Vassiliadis, John; Tzannes, Alex; Hitos, Kerry; Brimble, Jessica; Fogg, Toby

    2015-04-01

    To investigate the first-pass success rate, airway grade, and complications with the C-MAC video laryngoscope (VL) compared with standard direct laryngoscopy (DL) in two tertiary EDs. This was a retrospective analysis of prospectively collected data entered into an airway registry database in the EDs of Royal North Shore and St George Hospitals (SGH) over a 30-month period. Doctors had the choice of using either DL with a Macintosh or Miller blade or a C-MAC VL for the intubation. Six hundred and nineteen consecutive patients were recruited. There was no statistical difference between VL and DL in the grade of view obtained, P = 0.526. The chance of intubation success increased more than threefold with the C-MAC VL when DL yielded a grade III/IV view (109 cases in total) (OR = 3.06; 95% CI: 1.52-6.17; P = 0.002). This is the first observational study of airway management comparing the C-MAC VL with DL blades in an Australian ED population. Our findings revealed that although the C-MAC VL overall did not provide an enhanced view of the larynx over the Macintosh DL, it was superior to DL when the view was at least grade III. Currently we are unable to reliably predict the grade by any algorithm prior to intubation. Findings from this study suggest that the C-MAC VL should be considered as the first-line laryngoscope in all ED intubations, not just those predicted to be difficult. © 2015 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.

  10. Fe(II)-Polypyridines as Chromophores in Dye-Sensitized Solar Cells: A Computational Perspective.

    Science.gov (United States)

    Jakubikova, Elena; Bowman, David N

    2015-05-19

    Over the past two decades, dye-sensitized solar cells (DSSCs) have become a viable and relatively cheap alternative to conventional crystalline silicon-based systems. At the heart of a DSSC is a wide band gap semiconductor, typically a TiO2 nanoparticle network, sensitized with a visible light absorbing chromophore. Ru(II)-polypyridines are often utilized as chromophores thanks to their chemical stability, long-lived metal-to-ligand charge transfer (MLCT) excited states, tunable redox potentials, and near perfect quantum efficiency of interfacial electron transfer (IET) into TiO2. More recently, coordination compounds based on first row transition metals, such as Fe(II)-polypyridines, gained some attention as potential sensitizers in DSSCs due to their low cost and abundance. While such complexes can in principle sensitize TiO2, they do so very inefficiently since their photoactive MLCT states undergo intersystem crossing (ISC) into low-lying metal-centered states on a subpicosecond time scale. Competition between the ultrafast ISC events and IET upon initial excitation of Fe(II)-polypyridines is the main obstacle to their utilization in DSSCs. Suitability of Fe(II)-polypyridines to serve as sensitizers could therefore be improved by adjusting relative rates of the ISC and IET processes, with the goal of making the IET more competitive with ISC. Our research program in computational inorganic chemistry utilizes a variety of tools based on density functional theory (DFT), time-dependent density functional theory (TD-DFT) and quantum dynamics to investigate structure-property relationships in Fe(II)-polypyridines, specifically focusing on their function as chromophores. One of the difficult problems is the accurate determination of energy differences between electronic states with various spin multiplicities (i.e., (1)A, (1,3)MLCT, (3)T, (5)T) in the ISC cascade. We have shown that DFT is capable of predicting the trends in the energy ordering of these electronic

  11. Emergency Response Equipment and Related Training: Airborne Radiological Computer System (Model II)

    Energy Technology Data Exchange (ETDEWEB)

    David P. Colton

    2007-02-28

    The materials included in the Airborne Radiological Computer System, Model-II (ARCS-II) were assembled with several considerations in mind. First, the system was designed to measure and record airborne gamma radiation levels and the corresponding latitude and longitude coordinates, and to provide a first overview of the extent and severity of an accident's impact. Second, the portable system had to be light and durable enough to be mounted in an aircraft, ground vehicle, or watercraft. Third, the system must control the collection and storage of the data and provide a real-time display of the collection results to the operator. The notebook computer and color graphics printer components of the system are used only for analyzing and plotting the data. In essence, the provided equipment comprises an acquisition system and an analysis system. The data can be transferred from the acquisition system to the analysis system at the end of data collection or at some other agreeable time.

  12. HYDRA-II: A hydrothermal analysis computer code: Volume 2, User's manual

    Energy Technology Data Exchange (ETDEWEB)

    McCann, R.A.; Lowery, P.S.; Lessor, D.L.

    1987-09-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite-difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum incorporate directional porosities and permeabilities that are available to model solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated methods are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume 1 - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. This volume, Volume 2 - User's Manual, contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a sample problem. The final volume, Volume 3 - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. 6 refs.

  13. Adherence to PIOPED II investigators' recommendations for computed tomography pulmonary angiography.

    Science.gov (United States)

    Adams, Daniel M; Stevens, Scott M; Woller, Scott C; Evans, R Scott; Lloyd, James F; Snow, Gregory L; Allen, Todd L; Bledsoe, Joseph R; Brown, Lynette M; Blagev, Denitza P; Lovelace, Todd D; Shill, Talmage L; Conner, Karen E; Aston, Valerie T; Elliott, C Gregory

    2013-01-01

    Computed tomography (CT) pulmonary angiography use has increased dramatically, raising concerns for patient safety. Adherence to recommendations and guidelines may protect patients. We measured adherence to the recommendations of Prospective Investigation of Pulmonary Embolism Diagnosis (PIOPED II) investigators for evaluation of suspected pulmonary embolism and the rate of potential false-positive pulmonary embolism diagnoses when recommendations of PIOPED II investigators were not followed. We used a structured record review to identify 3500 consecutive CT pulmonary angiograms performed to investigate suspected pulmonary embolism in 2 urban emergency departments, calculating the revised Geneva score (RGS) to classify patients as "pulmonary embolism unlikely" (RGS≤10) or "pulmonary embolism likely" (RGS>10). CT pulmonary angiograms were concordant with PIOPED II investigator recommendations if pulmonary embolism was likely or pulmonary embolism was unlikely and a highly sensitive D-dimer test result was positive. We independently reviewed 482 CT pulmonary angiograms to measure the rate of potential false-positive pulmonary embolism diagnoses. A total of 1592 of 3500 CT pulmonary angiograms (45.5%) followed the recommendations of PIOPED II investigators. The remaining 1908 CT pulmonary angiograms were performed on patients with an RGS≤10 without a D-dimer test (n=1588) or after a negative D-dimer test result (n=320). The overall rate of pulmonary embolism was 9.7%. Potential false-positive diagnoses of pulmonary embolism occurred in 2 of 3 patients with an RGS≤10 and a negative D-dimer test result. Nonadherence to recommendations for CT pulmonary angiography is common and exposes patients to increased risks, including potential false-positive diagnoses of pulmonary embolism. Copyright © 2013 Elsevier Inc. All rights reserved.
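The adherence rule summarized above (a CT pulmonary angiogram is concordant with the PIOPED II investigators' recommendations when pulmonary embolism is likely, RGS > 10, or when it is unlikely and a highly sensitive D-dimer result was positive) can be sketched as a small classifier. This is a hypothetical encoding for illustration, not the study's actual software; the function name and argument names are assumptions.

```python
# Sketch of the PIOPED II concordance rule described in the abstract
# (hypothetical encoding; RGS = revised Geneva score).

def ctpa_concordant(rgs, d_dimer_positive=None):
    """Return True when ordering CTPA follows the recommendations:
    PE likely (RGS > 10), or PE unlikely with a positive highly
    sensitive D-dimer test."""
    if rgs > 10:                      # "pulmonary embolism likely"
        return True
    # "pulmonary embolism unlikely": concordant only after a positive D-dimer
    return d_dimer_positive is True
```

For example, a patient with RGS 6 and no D-dimer test would be counted among the non-adherent scans, matching the study's largest non-adherent subgroup.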

  14. Formulation, computation and improvement of steady state security margins in power systems. Part II: Results

    Energy Technology Data Exchange (ETDEWEB)

    Echavarren, F.M.; Lobato, E.; Rouco, L.; Gomez, T. [School of Engineering of Universidad Pontificia Comillas, C/Alberto Aguilera, 23, 28015 Madrid (Spain)

    2011-02-15

    A steady state security margin for a particular operating point can be defined as the distance from this initial point to the secure operating limits of the system. Four of the most used steady state security margins are the power flow feasibility margin, the contingency feasibility margin, the load margin to voltage collapse, and the total transfer capability between system areas. This is the second part of a two part paper. Part I has proposed a novel framework of a general model able to formulate, compute and improve any steady state security margin. In Part II the performance of the general model is validated by solving a variety of practical situations in modern real power systems. Actual examples of the Spanish power system will be used for this purpose. The same computation and improvement algorithms outlined in Part I have been applied for the four security margins considered in the study, outlining the convenience of defining a general framework valid for the four of them. The general model is used here in Part II to compute and improve: (a) the power flow feasibility margin (assessing the influence of the reactive power generation limits in the Spanish power system), (b) the contingency feasibility margin (assessing the influence of transmission and generation capacity in maintaining a correct voltage profile), (c) the load margin to voltage collapse (assessing the location and quantity of loads that must be shed in order to be far away from voltage collapse) and (d) the total transfer capability (assessing the export import pattern of electric power between different areas of the Spanish system). (author)
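As a toy illustration of one of these margins (not the paper's general model), the load margin to voltage collapse of a lossless two-bus system can be found by bisecting on a load scaling factor: for source voltage e behind reactance x feeding a unity-power-factor load p, the power-flow quadratic v^4 - e^2 v^2 + x^2 p^2 = 0 has a real solution only while p <= e^2/(2x). All names and values below are illustrative assumptions.

```python
# Toy "load margin to voltage collapse" for a lossless 2-bus system
# (illustration only; the paper's model handles full networks).

def feasible(p, e=1.0, x=0.1):
    """True while the 2-bus power-flow quadratic has a real solution."""
    return e**4 - 4.0 * x**2 * p**2 >= 0.0

def load_margin(p0, e=1.0, x=0.1, tol=1e-9):
    """Largest scaling factor lam such that lam * p0 is still solvable."""
    lo, hi = 1.0, 1e6                  # assume the base case is feasible
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if feasible(mid * p0, e, x):
            lo = mid
        else:
            hi = mid
    return lo

# With e = 1.0 pu and x = 0.1 pu the collapse point is p = e**2 / (2 * x) = 5.0,
# so a base load of 1.0 pu has a load margin of about 5.0.
margin = load_margin(1.0)
```

Continuation methods are used in practice instead of naive bisection, but the "distance from the operating point to the solvability boundary" idea is the same.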

  15. What computational chemistry and magnetic resonance reveal concerning the oxygen evolving centre in Photosystem II.

    Science.gov (United States)

    Terrett, Richard; Petrie, Simon; Stranger, Rob; Pace, Ron J

    2016-09-01

Density Functional Theory (DFT) computational studies of the Mn4/Ca Oxygen Evolving Complex (OEC) region of Photosystem II in the paramagnetic S2 and S3 states of the water oxidizing catalytic cycle are described. These build upon recent advances in computationally understanding the detailed S1 state OEC geometries, revealed by the recent high resolution Photosystem II crystal structures of Shen et al., at 1.90 Å and 1.95 Å (Petrie et al., 2015, Angew. Chem. Int. Ed., 54, 7120). The models feature a 'Low Oxidation Paradigm' assumption for the mean Mn oxidation states in the functional enzyme, with the mean oxidation levels being 3.0, 3.25 and 3.5 in S1, S2 and S3, respectively. These calculations are used to infer magnetic exchange interactions within the coupled OEC cluster, particularly in the Electron Paramagnetic Resonance (EPR)-visible S2 and S3 states. Detailed computational estimates of the intrinsic magnitudes and molecular orientations of the (55)Mn hyperfine tensors in the S2 state are presented. These parameters, together with the resultant spin projected hyperfine values, are compared with recent appropriate experimental EPR data (Continuous Wave (CW), Electron-Nuclear Double Resonance (ENDOR) and ELDOR (Electron-Electron Double Resonance)-Detected Nuclear Magnetic Resonance (EDNMR)) from the OEC. It is found that an effective Coupled Dimer magnetic organization of the four Mn in the OEC cluster in the S2 and S3 states is able to quantitatively rationalize the observed (55)Mn hyperfine data. This is consistent with structures we propose to represent the likely state of the OEC in the catalytically active form of the enzyme.

  16. Comparison of the Glidescope, CMAC, Storz DCI with the Macintosh laryngoscope during simulated difficult laryngoscopy: a manikin study

    Directory of Open Access Journals (Sweden)

    Healy David W

    2012-06-01

    Full Text Available Abstract Background Videolaryngoscopy presents a new approach for the management of the difficult and rescue airway. There is little available evidence to compare the performance features of these devices in true difficult laryngoscopy. Methods A prospective randomized crossover study was performed comparing the performance features of the Macintosh laryngoscope, Glidescope, Storz CMAC and Storz DCI videolaryngoscope. Thirty anesthesia providers attempted intubation with each of the 4 laryngoscopes in a high-fidelity difficult laryngoscopy manikin. The time to successful intubation (TTSI) was recorded for each device, along with the failure rate and the best view of the glottis obtained. Results Use of the Glidescope, CMAC and Storz videolaryngoscopes improved the view of the glottis compared with use of the Macintosh blade (GEE, p = 0.000, p = 0.002, p = 0.000 respectively). Use of the CMAC resulted in an improved view compared with use of the Storz VL (Fisher's, p = 0.05). Use of the Glidescope or Storz videolaryngoscope blade resulted in a longer TTSI compared with either the Macintosh (GLM, p = 0.000, p = 0.029 respectively) or CMAC blades (GLM, p = 0.000, p = 0.033 respectively). Conclusions Unsurprisingly, when used in a simulated difficult laryngoscopy, all the videolaryngoscopes resulted in a better view of the glottis than the Macintosh blade. However, interestingly, the CMAC was found to provide a better laryngoscopic view than the Storz DCI videolaryngoscope. Additionally, use of either the Glidescope or Storz DCI videolaryngoscope resulted in a prolonged time to successful intubation compared with use of the CMAC or Macintosh blade. The use of the CMAC during manikin-simulated difficult laryngoscopy combined the efficacy of attainment of laryngoscopic view with the expediency of successful intubation. Use of the Macintosh blade combined expedience with success, despite a limited laryngoscopic view.

  17. Results from percutaneous drainage of Hinchey stage II diverticulitis guided by computed tomography scan.

    Science.gov (United States)

    Durmishi, Y; Gervaz, P; Brandt, D; Bucher, P; Platon, A; Morel, P; Poletti, P A

    2006-07-01

    Percutaneous abscess drainage guided by computed tomography scan is considered the initial step in the management of patients presenting with Hinchey II diverticulitis. The rationale behind this approach is to manage the septic complication conservatively and to follow this later using elective sigmoidectomy with primary anastomosis. The clinical outcomes for Hinchey II patients who underwent percutaneous abscess drainage in our institution were reviewed. Drainage was considered a failure when signs of continuing sepsis developed, abscess or fistula recurred within 4 weeks of drainage, and emergency surgical resection with or without a colostomy had to be performed. A total of 34 patients (17 men and 17 women; median age, 71 years; range, 34-90 years) were considered for analysis. The median abscess size was 6 cm (range, 3-18 cm), and the median duration of drainage was 8 days (range, 1-18 days). Drainage was considered successful for 23 patients (67%). The causes of failure for the remaining 11 patients included continuing sepsis (n = 5), abscess recurrence (n = 5), and fistula formation (n = 1). Ten patients who failed percutaneous abscess drainage underwent an emergency Hartmann procedure, with a median delay of 14 days (range, 1-65 days) between drainage and surgery. Three patients in this group (33%) died in the immediate postoperative period. Among the 23 patients successfully drained, 12 underwent elective sigmoid resection with a primary anastomosis. The median delay between drainage and surgery was 101 days (range, 40-420 days). In this group, there were no anastomotic leaks and no mortality. Drainage of Hinchey II diverticulitis guided by computed scan was successful in two-thirds of the cases, and 35% of the patients eventually underwent a safe elective sigmoid resection with primary anastomosis. By contrast, failure of percutaneous abscess drainage to control sepsis is associated with a high mortality rate when an emergency resection is performed. 

  18. TRACER-II: a complete computational model for mixing and propagation of vapor explosions

    Energy Technology Data Exchange (ETDEWEB)

    Bang, K.H. [School of Mechanical Engineering, Korea Maritime Univ., Pusan (Korea, Republic of); Park, I.G.; Park, G.C.

    1998-01-01

    A vapor explosion is a physical process in which very rapid energy transfer occurs between a hot liquid and a volatile, colder liquid when the two liquids come into sudden contact. For the analysis of potential impacts from such explosive events, a computer program, TRACER-II, has been developed, which contains a complete description of the mixing and propagation phases of vapor explosions. The model consists of fuel, fragmented fuel (debris), coolant liquid, and coolant vapor in two-dimensional Eulerian coordinates. The set of governing equations is solved numerically using a finite-difference method. The results of this numerical simulation of vapor explosions are discussed in comparison with recent experimental data from the FARO and KROTOS tests. When compared to some selected FARO and KROTOS data, the fuel-coolant mixing and explosion propagation behavior agree reasonably well with the data, although the results are still sensitive primarily to the melt breakup and fragmentation modeling. (author)

  19. Evaluation of Truview evo2® Laryngoscope In Anticipated Difficult Intubation – A Comparison To Macintosh Laryngoscope

    OpenAIRE

    Ishwar Singh; Abhijit Khaund; Abhishek Gupta

    2009-01-01

    Summary The aim of the study was to assess and compare the laryngoscopic view obtained with the Truview evo2 laryngoscope with that of the Macintosh laryngoscope in patients with one or more predictors of difficult intubation (PDI). Moreover, ease of intubation with the Truview evo2, in terms of absolute time required, was also assessed. Patients scheduled for elective surgery requiring endotracheal intubation were initially assessed for three PDI parameters – modified Mallampati test, thyro-mental distance & Atlanto-occipital (...)

  20. [Comparison of the view of the glottic opening through Macintosh and AirTraq laryngoscopes in patients undergoing scheduled surgery].

    Science.gov (United States)

    López-Negrete, I Laso; Salinas Aguirre, U; Castrillo Villán, J L; Rodríguez Delgado, T; Colomino Alumbreros, J; Aguilera Celorrio, L

    2010-03-01

    The AirTraq laryngoscope is a new intubation device that may provide better viewing conditions than can be achieved with the traditional Macintosh device. This study compared the AirTraq and Macintosh views and assessed whether predictors of intubation difficulty are useful when the AirTraq laryngoscope is used. Prospective study of 215 ASA 1-3 patients over the age of 18 years who were to receive anesthesia with endotracheal intubation. Excluded were patients who required emergency surgery, who had a history of difficult intubation, or for whom ventilation was difficult during induction of anesthesia. In addition to the usual patient characteristics, we recorded thyromental distance, mouth opening, and Mallampati score. The Cormack-Lehane laryngoscopy grade was recorded for each device. A Cormack-Lehane grade of 1 or 2 was considered a good view. A grade of 3 or 4 was considered a poor view. The McNemar test was used to compare laryngoscopy grade between the 2 devices in each patient. The chi2 test was used to compare predictors of intubation difficulty. The Macintosh laryngoscope achieved a Cormack-Lehane grade of 1 in 65.3% of the patients, of 2 in 22.4%, of 3 in 11.3%, and of 4 in 1.4%. The AirTraq scope gave a Cormack-Lehane grade of 1 in 96.2%, of 2 in 3.3%, of 3 in 0.5%, and of 4 in 0%. The differences were statistically significant. None of the predictors was associated with a poor glottic view through the AirTraq device. Poor viewing conditions occurred less frequently when the AirTraq device was used. Intubation conditions were therefore better with the AirTraq than with the Macintosh device. The traditional predictors of difficult intubation do not seem to be relevant when the AirTraq device is to be used.
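The McNemar test used above for paired good/poor view comparisons depends only on the discordant pair counts. A minimal sketch with made-up counts (the function and the example numbers are illustrative, not the study's data):

```python
# McNemar test statistic for paired binary outcomes (illustration only).

def mcnemar_chi2(b, c):
    """McNemar statistic from the discordant pair counts:
    b = good view with device A only, c = good view with device B only."""
    return (b - c) ** 2 / (b + c)

# Hypothetical example: 40 patients graded good only with one device and
# 5 only with the other.
stat = mcnemar_chi2(40, 5)
```

A statistic above the chi-square cutoff of 3.84 (1 degree of freedom, alpha = 0.05) would indicate a significant paired difference between the two devices.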

  1. A comparison of the Glidescope, Pentax AWS, and Macintosh laryngoscopes when used by novice personnel: a manikin study.

    LENUS (Irish Health Repository)

    Malik, Muhammad A

    2009-11-01

    Direct laryngoscopic tracheal intubation is a potentially lifesaving procedure, but a difficult skill to acquire and maintain. The consequences of poorly performed intubation attempts are potentially severe. The Pentax AWS and the Glidescope are indirect laryngoscopes that may require less skill to use. We therefore hypothesized that AWS and Glidescope would prove superior to the Macintosh laryngoscope when used by novices in the normal and simulated difficult airway.

  2. Malocclusion Class II division 1 skeletal and dental relationships measured by cone-beam computed tomography.

    Science.gov (United States)

    Xu, Yiling; Oh, Heesoo; Lagravère, Manuel O

    2017-09-01

    The purpose of this study was to locate traditionally used landmarks in two-dimensional (2D) images and newly suggested ones in three-dimensional (3D) cone-beam computed tomography (CBCT) images, and to determine possible relationships between them to categorize patients with Class II-1 malocclusion. CBCTs from 30 patients diagnosed with Class II-1 malocclusion were obtained from the University of Alberta Graduate Orthodontic Program database. The reconstructed images were downloaded and visualized using the software platform AVIZO(®). Forty-two landmarks were chosen, and their coordinates were then obtained and analyzed using linear and angular measurements. Ten images were analyzed three times to determine the reliability and measurement error of each landmark using the intraclass correlation coefficient (ICC). Descriptive statistics were done using the SPSS statistical package to determine any relationships. ICC values were excellent for all landmarks in all axes, with the highest measurement error of 2 mm in the y-axis for the Gonion Left landmark. Linear and angular measurements were calculated using the coordinates of each landmark. Descriptive statistics showed that the linear and angular measurements used in the 2D images did not correlate well with the 3D images. The lowest standard deviation obtained was 0.6709 for S-GoR/N-Me, with a mean of 0.8016. The highest standard deviation was 20.20704 for ANS-InfraL, with a mean of 41.006. The traditional landmarks used for 2D malocclusion analysis show good reliability when transferred to 3D images. However, they did not reveal specific skeletal or dental patterns when trying to analyze 3D images for malocclusion. Thus, another technique should be considered when classifying 3D CBCT images for Class II-1 malocclusion. Copyright © 2017 CEO. Published by Elsevier Masson SAS. All rights reserved.
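Deriving linear and angular measurements from 3D landmark coordinates, as described above, reduces to Euclidean distances and angles between vectors. A minimal sketch with hypothetical coordinates (the landmark names and values are illustrative, not the study's data):

```python
# Linear and angular measurements from 3-D landmark coordinates
# (hypothetical points; illustration of the kind of analysis described).
import math

def distance(p, q):
    """Euclidean distance between two landmarks (same units as the scan)."""
    return math.dist(p, q)

def angle_deg(a, vertex, b):
    """Angle at `vertex` formed by landmarks a and b, in degrees."""
    u = [a[i] - vertex[i] for i in range(3)]
    v = [b[i] - vertex[i] for i in range(3)]
    dot = sum(ui * vi for ui, vi in zip(u, v))
    nu = math.sqrt(sum(ui * ui for ui in u))
    nv = math.sqrt(sum(vi * vi for vi in v))
    return math.degrees(math.acos(dot / (nu * nv)))

# hypothetical landmark coordinates in mm
nasion, sella, menton = (0.0, 0.0, 0.0), (0.0, 60.0, 10.0), (5.0, -90.0, 20.0)
d = distance(nasion, menton)
a = angle_deg(sella, nasion, menton)
```

Repeating such measurements over repeated landmarkings is what feeds the ICC reliability analysis the abstract reports.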

  3. Synthetic, crystallographic, and computational study of copper(II) complexes of ethylenediaminetetracarboxylate ligands.

    Science.gov (United States)

    Matović, Zoran D; Miletić, Vesna D; Ćendić, Marina; Meetsma, Auke; van Koningsbruggen, Petra J; Deeth, Robert J

    2013-02-04

    Copper(II) complexes of hexadentate ethylenediaminetetracarboxylic acid type ligands H(4)eda3p and H(4)eddadp (H(4)eda3p = ethylenediamine-N-acetic-N,N',N'-tri-3-propionic acid; H(4)eddadp = ethylenediamine-N,N'-diacetic-N,N'-di-3-propionic acid) have been prepared. An octahedral trans(O(6)) geometry (two propionate ligands coordinated in axial positions) has been established crystallographically for the Ba[Cu(eda3p)]·8H(2)O compound, while Ba[Cu(eddadp)]·8H(2)O is proposed to adopt a trans(O(5)) geometry (two axial acetates) on the basis of density functional theory calculations and comparisons of IR and UV-vis spectral data. Experimental and computed structural data correlating similar copper(II) chelate complexes have been used to better understand the isomerism and departure from regular octahedral geometry within the series. The in-plane O-Cu-N chelate angles show the smallest deviation from the ideal octahedral value of 90°, and hence the lowest strain, for the eddadp complex with two equatorial β-propionate rings. A linear dependence between tetragonality and the number of five-membered rings has been established. A natural bonding orbital analysis of the series of complexes is also presented.

  4. PROTEGE-II: computer support for development of intelligent systems from libraries of components.

    Science.gov (United States)

    Musen, M A; Gennari, J H; Eriksson, H; Tu, S W; Puerta, A R

    1995-01-01

    PROTEGE-II is a suite of tools that facilitates the development of intelligent systems. A tool called MAÎTRE allows system builders to create and refine abstract models (ontologies) of application domains. A tool called DASH takes as input a modified domain ontology and generates automatically a knowledge-acquisition tool that application specialists can use to enter the detailed content knowledge required to define particular applications. The domain-dependent knowledge entered into the knowledge-acquisition tool is used by assemblies of domain-independent problem-solving methods that provide the computational strategies required to solve particular application tasks. The result is an architecture that offers a divide-and-conquer approach that separates system-building tasks that require skill in domain analysis and modeling from those that require simple entry of content knowledge. At the same time, applications can be constructed from libraries of components--of both domain ontologies and domain-independent problem-solving methods--allowing the reuse of knowledge and facilitating ongoing system maintenance. We have used PROTEGE-II to construct a number of knowledge-based systems, including the reasoning components of T-Helper, which assists physicians in the protocol-based care of patients who have HIV infection.

  5. High Performance Computing Application: Solar Dynamo Model Project II, Corona and Heliosphere Component Initialization, Integration and Validation

    Science.gov (United States)

    2015-06-24

    AFRL-RD-PS-TR-2015-0028. ...allocate solar heating into any location of the corona. Its total contribution depended on the integration of the unsigned magnetic flux at 1 Rs.

  6. Design, simulation, and experimental verification of a computer model and enhanced position estimator for the NPS AUV II

    OpenAIRE

    Warner, David C.

    1991-01-01

    A full six-degree-of-freedom computer model of the Naval Postgraduate School Autonomous Underwater Vehicle (NPS AUV II) is developed. Hydrodynamic Coefficients are determined by geometric similarity with an existing swimmer delivery vehicle and analysis of initial open loop AUV II trials. Comparisons between simulated and experimental results demonstrate the validity of the model and the techniques used. A reduced order observer of lateral velocity was produced to provide an input for an enha...

  7. Comparison of the Laryngeal View during Tracheal Intubation Using Airtraq and Macintosh Laryngoscopes by Unskillful Anesthesiology Residents: A Clinical Study

    Directory of Open Access Journals (Sweden)

    Carlos Ferrando

    2011-01-01

    Background and Objective. The Airtraq laryngoscope (Prodol Meditec, Vizcaya, Spain) is a novel tracheal intubation device. Studies performed until now have compared the Airtraq with the Macintosh laryngoscope, concluding that it reduces intubation times and increases the success rate at the first intubation attempt, decreasing the Cormack-Lehane score. The aim of the study was to evaluate whether, in unskillful anesthesiology residents, the Airtraq improves the laryngeal view compared with the Macintosh laryngoscope, decreasing the Cormack-Lehane score. Methods. A prospective, randomized, crossover trial was carried out on 60 patients. Each patient was intubated using both devices by unskillful anesthesiology residents (fewer than two hundred intubations with the Macintosh laryngoscope and 10 intubations using the Airtraq). The Cormack-Lehane score, the success rate at the first intubation attempt, and the laryngoscopy and intubation times were compared. Results. The Airtraq significantly decreased the Cormack-Lehane score (P = 0.04). On the other hand, there were no differences in the times of laryngoscopy (P = 0.645; 95% CI −3.1, +4.8) and intubation (P = 0.62; 95% CI −6.1, +10.0) between the two devices. No relevant complications were found during the maneuvers of intubation using either device. Conclusions. The Airtraq is a useful laryngoscope for unskillful anesthesiology residents, improving the laryngeal view and, therefore, facilitating tracheal intubation.

  8. A comparison of the forces applied to a manikin during laryngoscopy with the GlideScope and Macintosh laryngoscopes.

    Science.gov (United States)

    Russell, T; Lee, C; Firat, M; Cooper, R M

    2011-11-01

    The force applied during laryngoscopy can cause local tissue trauma and can induce cardiovascular responses and cervical spine movement in susceptible patients. Previous studies have identified numerous operator and patient factors that influence the amount of force applied during intubation. There are few studies evaluating the effect of different laryngoscope blades and no study involving video laryngoscopes. In this study we measured the forces using two laryngoscopic techniques. Three FlexiForce Sensors (A201-25, Tekscan, Boston, MA, USA) were attached to the concave blade surface of a Macintosh and a GlideScope laryngoscope. Experienced anaesthetists performed Macintosh and GlideScope intubations on the Laerdal Airway Management Trainer manikin. Compared to Macintosh intubations, the GlideScope intubations had equal or superior views of the glottis with 55%, 58% and 66% lower median peak, average and impulse forces applied to the tongue base. The distal sensor registered the most force in both devices and the force distribution pattern was similar between the devices. The findings suggest that the GlideScope requires less force for similar or better laryngoscopic views, at least in a manikin model.

  9. Computed tomography assessment of temporomandibular joint position and dimensions in patients with class II division 1 and division 2 malocclusions

    Science.gov (United States)

    Ciger, Semra

    2017-01-01

    Background This study aimed to investigate and compare the positions and dimensions of the temporomandibular joint and its components, respectively, in patients with Class II division 1 and division 2 malocclusions. Material and Methods Computed tomography images of 14 patients with Class II division 1 and 14 patients with Class II division 2 malocclusion were included with a mean age of 11.4 ± 1.2 years. The following temporomandibular joint measurements were made with OsiriX medical imaging software program. From the sagittal images, the anterior, superior, and posterior joint spaces and the mandibular fossa depths were measured. From the axial images, the greatest anteroposterior and mediolateral diameters of the mandibular condyles, angles between the long axis of the mandibular condyle and midsagittal plane, and vertical distances from the geometric centers of the condyles to midsagittal plane were measured. The independent samples t-test was used for comparing the measurements between the two sides and between the Class II division 1 and 2 groups. Results No statistically significant differences were observed between the right and left temporomandibular joints; therefore, the data were pooled. There were statistically significant differences between the Class II division 1 and 2 groups with regard to mandibular fossa depth and anterior joint space measurements. Conclusions In Class II patients, the right and left temporomandibular joints were symmetrical. In the Class II division 1 group, the anterior joint space was wider than that in Class II division 2 group, and the mandibular fossa was deeper and wider in the Class II division 1 group. Key words:Temporomandibular joint, Class II malocclusion, Cone beam computed tomography. PMID:28298985
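The independent-samples t statistic used above to compare measurements between sides and between groups can be sketched in a few lines. A minimal pooled-variance (Student) version with made-up joint-space widths, not the study's data:

```python
# Two-sample Student t statistic with pooled variance (equal variances
# assumed); illustration only, with hypothetical measurements.
import math
from statistics import mean, variance

def t_statistic(x, y):
    """Independent-samples t statistic for two groups of measurements."""
    nx, ny = len(x), len(y)
    sp2 = ((nx - 1) * variance(x) + (ny - 1) * variance(y)) / (nx + ny - 2)
    return (mean(x) - mean(y)) / math.sqrt(sp2 * (1 / nx + 1 / ny))

# hypothetical anterior joint-space widths (mm) in two groups
t = t_statistic([2.1, 2.4, 2.2, 2.5], [1.8, 1.9, 2.0, 1.7])
```

The statistic is then compared against the t distribution with nx + ny − 2 degrees of freedom; statistics packages such as SPSS (used in the study) report the corresponding p-value directly.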

  10. Endotracheal Intubation Using the Macintosh Laryngoscope or KingVision Video Laryngoscope during Uninterrupted Chest Compression

    Directory of Open Access Journals (Sweden)

    Ewelina Gaszynska

    2014-01-01

    Objective. Advanced airway management, such as endotracheal intubation (ETI), during CPR is more difficult than, for example, during anesthesia. However, new devices such as video laryngoscopes should help in such circumstances. The aim of this study was to assess the performance of the KingVision video laryngoscope in a manikin cardiopulmonary resuscitation (CPR) scenario. Methods. Thirty students enrolled in the third year of paramedic school took part in the study. The simulated CPR scenario was ETI using the standard laryngoscope with a Macintosh blade (MCL) and ETI using the KingVision video laryngoscope, both performed during uninterrupted chest compressions. The primary endpoints were the time needed for ETI and the success ratio. Results. The mean time required for intubation was similar for both laryngoscopes: 16.6 seconds (SD 5.11; median 15.64; range 7.9–27.9) versus 17.91 seconds (SD 5.6; median 16.28; range 10.6–28.6) for the MCL and KingVision, respectively (P=0.1888). On the first attempt at ETI, the success rate during CPR was comparable between the evaluated laryngoscopes (P=0.9032). Conclusion. The KingVision video laryngoscope did not prove superior to the standard laryngoscope with a Macintosh blade for endotracheal intubation during CPR, either in shortening the time needed for ETI or in increasing the success ratio.

  11. The Relationship between the Interactive Computer Interview System and the "Praxis II" Principles of Learning and Teaching Test

    Science.gov (United States)

    Pruett, Sharon M.

    2012-01-01

    The objective of this study was to compare the relationships between the subtests of the Interactive Computer Interview System and the ETS "Praxis II" Principles of Learning and Teaching examination. In particular, this study compares scores on the ICIS instrument subtests to those gathered from the same classroom teachers on the…

  12. X-Ray Crystallographic Analysis, EPR Studies, and Computational Calculations of a Cu(II) Tetramic Acid Complex

    Science.gov (United States)

    Matiadis, Dimitrios; Tsironis, Dimitrios; Stefanou, Valentina; Igglessi–Markopoulou, Olga; McKee, Vickie; Sanakis, Yiannis; Lazarou, Katerina N.

    2017-01-01

    In this work we present a structural and spectroscopic analysis of a copper(II) N-acetyl-5-arylidene tetramic acid by using both experimental and computational techniques. The crystal structure of the Cu(II) complex was determined by single crystal X-ray diffraction and shows that the copper ion lies on a centre of symmetry, with each ligand ion coordinated to two copper ions, forming a 2D sheet. Moreover, the EPR spectroscopic properties of the Cu(II) tetramic acid complex were also explored and discussed. Finally, a computational approach was performed in order to obtain a detailed and precise insight of product structures and properties. It is hoped that this study can enrich the field of functional supramolecular systems, giving place to the formation of coordination-driven self-assembly architectures. PMID:28316540

  13. Using world wide web via netscape - a short guide for PEP-II/BABAR

    Energy Technology Data Exchange (ETDEWEB)

    Chan, A.; Nelson, J.

    1995-09-01

    This report discusses the following topics dealing with searching the Internet at the PEP-II storage ring facility: (1) what is the Internet, Mosaic and Netscape; (2) using URLs; (3) Netscape menus and buttons - what they do; (4) using bookmarks; (5) FTP through Netscape; (6) FTP through Fetch on the Macintosh; (7) installing Netscape; (8) configuring Netscape; and (9) references.

  14. Investigation of mixed mode - I/II fracture problems - Part 1: computational and experimental analyses

    Directory of Open Access Journals (Sweden)

    O. Demir

    2016-01-01

    In this study, to investigate and properly understand the nature of fracture behavior under in-plane mixed-mode (Mode-I/II) loading, three-dimensional fracture analyses and experiments on the compact tension shear (CTS) specimen are performed under different mixed-mode loading conditions. Al 7075-T651 aluminum machined from rolled plates in the L-T orientation (crack plane perpendicular to the rolling direction) is used in this study. Results from finite element analyses, along with the fracture loads and crack deflection angles obtained from the experiments, are presented. To simulate the real conditions of the experiments, contacts are defined between the contact surfaces of the loading devices, specimen, and loading pins. Modeling, meshing, and the solution of the problem involving the whole assembly, i.e., loading devices, pins, and the specimen, with contact mechanics are performed using ANSYS. The CTS specimen is then analyzed separately using a submodeling approach, in which three-dimensional enriched finite elements are used in the FRAC3D solver to calculate the resulting stress intensity factors along the crack front. Having performed the detailed computational and experimental studies on the CTS specimen, a new specimen type, together with its loading device, is also proposed that has smaller dimensions than the regular CTS specimen. Experimental results for the new specimen are also presented.

  15. A randomised comparative study of the effect of Airtraq optical laryngoscope vs. Macintosh laryngoscope on intraocular pressure in non-ophthalmic surgery

    Directory of Open Access Journals (Sweden)

    Bikramjit Das

    2016-02-01

    Full Text Available BACKGROUND: We compared intraocular pressure changes following laryngoscopy and intubation with the conventional Macintosh blade and the Airtraq optical laryngoscope. METHODS: Ninety adult patients were randomly assigned to a study group or a control group. In the study group (n = 45), the Airtraq laryngoscope was used for laryngoscopy; in the control group (n = 45), the conventional Macintosh laryngoscope was used. Preoperative baseline intraocular pressure was measured with a Schiotz tonometer. Laryngoscopy was performed as per group protocol. Intraocular pressure and haemodynamic parameters were recorded just before insertion of the device and subsequently three times at one-minute intervals after insertion. RESULTS: Patient characteristics, baseline haemodynamic parameters and baseline intraocular pressure were comparable between the two groups. Following insertion of the endotracheal tube with the Macintosh laryngoscope, there was a statistically significant rise in heart rate and intraocular pressure compared with the Airtraq group. There was no significant change in MAP. Eight patients in the Macintosh group sustained tongue-lip-dental trauma during intubation, while only 2 patients in the Airtraq group sustained upper airway trauma. CONCLUSION: We conclude that the Airtraq laryngoscope, in comparison to the Macintosh laryngoscope, results in a significantly smaller rise in intraocular pressure and a clinically less marked haemodynamic response to laryngoscopy and intubation.

  16. Computer simulation of effect of conditions on discharge-excited high power gas flow CO laser

    Science.gov (United States)

    Ochiai, Ryo; Iyoda, Mitsuhiro; Taniwaki, Manabu; Sato, Shunichi

    2017-01-01

    The authors have developed computer simulation codes to analyze the effect of operating conditions on the performance of a discharge-excited, high-power gas-flow CO laser. The effects of six different conditions can be analyzed. The simulation code, described and executed on Macintosh computers, consists of several modules that calculate the kinetic processes. The detailed conditions, kinetic processes, results and discussion are described in this paper.

  17. Evaluation of Truview evo2® Laryngoscope In Anticipated Difficult Intubation-A Comparison To Macintosh Laryngoscope

    Directory of Open Access Journals (Sweden)

    Ishwar Singh

    2009-01-01

    Full Text Available The aim of the study was to assess and compare the laryngoscopic view obtained with the Truview evo2 laryngoscope with that of the Macintosh laryngoscope in patients with one or more predictors of difficult intubation (PDI), and to assess ease of intubation with the Truview evo2 in terms of absolute time required. Patients scheduled for elective surgery requiring endotracheal intubation were initially assessed for three PDI parameters - modified Mallampati test, thyromental distance and atlanto-occipital (AO) joint extension. Patients with cumulative PDI scores of 2 to 5 (on a scale of 0 to 8) were evaluated for Cormack & Lehane (CL) grading with the Macintosh blade after standard induction. Cases with CL grade two or higher were further evaluated with the Truview evo2 laryngoscope and corresponding CL grades were assigned. Intubation was attempted under Truview evo2 vision and the time required for each successful tracheal intubation (i.e. tracheal intubation completed within one minute) was noted. Fifty cases were studied in total. The CL grades assigned by the Macintosh blade correlated well with the cumulative PDI scores assigned preoperatively, confirming their predictive value. The Truview evo2 improved the laryngeal view in 92% of cases by one or more CL grades. Intubation with the Truview evo2 was possible in 88% of cases within the stipulated time of one minute, and the mean time of 28.6 seconds (SD 11.23) was reasonably quick. No significant complications such as oropharyngeal trauma or an extreme pressor response to laryngoscopy were noticed. To conclude, the Truview evo2 proved to be a better tool than the conventional laryngoscope in anticipated difficult situations.

  18. Computational probes into the basis of silver ion chromatography. II. Silver(I)-olefin complexes

    NARCIS (Netherlands)

    Kaneti, J.; Smet, de L.C.P.M.; Boom, R.M.; Zuilhof, H.; Sudhölter, E.J.R.

    2002-01-01

    Alkene complexes of silver(I) are studied by four computational methodologies: ab initio RHF, MP2, and MP4 computations, and density functional B3LYP computations, with a variety of all-electron and effective core potential basis sets. Methodological studies indicate that MP2/SBK(d) computations can

  19. Oxidized calmodulin kinase II regulates conduction following myocardial infarction: a computational analysis.

    Directory of Open Access Journals (Sweden)

    Matthew D Christensen

    2009-12-01

    Full Text Available Calmodulin kinase II (CaMKII mediates critical signaling pathways responsible for divergent functions in the heart including calcium cycling, hypertrophy and apoptosis. Dysfunction in the CaMKII signaling pathway occurs in heart disease and is associated with increased susceptibility to life-threatening arrhythmia. Furthermore, CaMKII inhibition prevents cardiac arrhythmia and improves heart function following myocardial infarction. Recently, a novel mechanism for oxidative CaMKII activation was discovered in the heart. Here, we provide the first report of CaMKII oxidation state in a well-validated, large-animal model of heart disease. Specifically, we observe increased levels of oxidized CaMKII in the infarct border zone (BZ. These unexpected new data identify an alternative activation pathway for CaMKII in common cardiovascular disease. To study the role of oxidation-dependent CaMKII activation in creating a pro-arrhythmia substrate following myocardial infarction, we developed a new mathematical model of CaMKII activity including both oxidative and autophosphorylation activation pathways. Computer simulations using a multicellular mathematical model of the cardiac fiber demonstrate that enhanced CaMKII activity in the infarct BZ, due primarily to increased oxidation, is associated with reduced conduction velocity, increased effective refractory period, and increased susceptibility to formation of conduction block at the BZ margin, a prerequisite for reentry. Furthermore, our model predicts that CaMKII inhibition improves conduction and reduces refractoriness in the BZ, thereby reducing vulnerability to conduction block and reentry. These results identify a novel oxidation-dependent pathway for CaMKII activation in the infarct BZ that may be an effective therapeutic target for improving conduction and reducing heterogeneity in the infarcted heart.

  20. GPS-MBA: computational analysis of MHC class II epitopes in type 1 diabetes.

    Science.gov (United States)

    Cai, Ruikun; Liu, Zexian; Ren, Jian; Ma, Chuang; Gao, Tianshun; Zhou, Yanhong; Yang, Qing; Xue, Yu

    2012-01-01

    As a severe chronic metabolic disease and autoimmune disorder, type 1 diabetes (T1D) affects millions of people world-wide. Recent advances in antigen-based immunotherapy have provided a great opportunity for further treating T1D with a high degree of selectivity. It is reported that MHC class II I-A(g7) in the non-obese diabetic (NOD) mouse and human HLA-DQ8 are strongly linked to susceptibility to T1D. Thus, the identification of new I-A(g7) and HLA-DQ8 epitopes would be of great help to further experimental and biomedical manipulation efforts. In this study, a novel GPS-MBA (MHC Binding Analyzer) software package was developed for the prediction of I-A(g7) and HLA-DQ8 epitopes. Using experimentally identified epitopes as the training data sets, a previously developed GPS (Group-based Prediction System) algorithm was adopted and improved. By extensive evaluation and comparison, the GPS-MBA performance was found to be much better than other tools of this type. With this powerful tool, we predicted a number of potentially new I-A(g7) and HLA-DQ8 epitopes. Furthermore, we designed a T1D epitope database (TEDB) for all of the experimentally identified and predicted T1D-associated epitopes. Taken together, this computational prediction result and analysis provides a starting point for further experimental considerations, and GPS-MBA is demonstrated to be a useful tool for generating starting information for experimentalists. The GPS-MBA is freely accessible for academic researchers at: http://mba.biocuckoo.org.
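    GPS-MBA itself is distributed as a package, but the group-based scoring idea it builds on - score a candidate peptide by its aggregate similarity to a set of experimentally verified epitopes - can be sketched in a few lines. Everything below (the toy residue classes, similarity weights, and peptides) is a hypothetical illustration, not the actual GPS algorithm or its trained parameters.

```python
# Toy residue classes grouping amino acids by rough chemical character (illustrative only).
CLASSES = {
    "hydrophobic": set("AVLIMFWPG"),
    "polar": set("STCYNQ"),
    "positive": set("KRH"),
    "negative": set("DE"),
}

def residue_similarity(a, b):
    """+2 for identity, +1 for same chemical class, 0 otherwise (toy weights)."""
    if a == b:
        return 2
    for members in CLASSES.values():
        if a in members and b in members:
            return 1
    return 0

def group_score(candidate, known_epitopes):
    """Average positional similarity of a candidate peptide to a set of
    known epitopes of the same length: higher means more epitope-like."""
    scores = []
    for epitope in known_epitopes:
        assert len(epitope) == len(candidate)
        scores.append(sum(residue_similarity(a, b) for a, b in zip(candidate, epitope)))
    return sum(scores) / len(scores)
```

    With a small set of hypothetical training 9-mers, a candidate resembling them scores well above an unrelated peptide; a real predictor such as GPS-MBA additionally calibrates a score threshold against known positives and negatives.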

  1. Influencia de la Escuela de Oxford en el desarrollo de la Anestesiología Moderna en España: la huella de Robert Macintosh

    OpenAIRE

    Unzueta Merino, M. Carmen

    2014-01-01

    The aim of this study is to investigate how modern anaesthesia was introduced into Spain and to show that the Oxford School, personified by Robert Macintosh, had a decisive influence on that process. Following his visit to Spain in 1946, at the invitation of the Consejo Superior de Investigaciones Científicas, Macintosh, professor of Anaesthetics at Oxford, exerted considerable influence on the introduction and development of modern anaesthesia in Spain. During his stay he carried out multiple...

  2. IMPACT OF USING COMPUTER ASSISTED LEARNING IN II MBBS PHARMACOLOGY TEACHING - PERCEPTIONS OF STUDENTS IN A MEDICAL COLLEGE

    Directory of Open Access Journals (Sweden)

    Veena

    2015-10-01

    Full Text Available BACKGROUND: Animal experiments are essential, as per the II year MBBS practical syllabus, for learning basic concepts in Pharmacology. Owing to strict regulations and ethical issues surrounding the procurement and use of animals, a need was felt to design and develop computer-based simulation software as an alternative to animal use. Computer assisted learning (CAL) is a group learning technique, used offline or online, in which students interact with programmed instructional materials, or with the teacher through them. These integrated multimedia software packages act as animal simulators, providing an environment that closely mimics reality. OBJECTIVES: The aim of this study is to assess students' opinions of interactive computer assisted learning (CAL) in Pharmacology practical experiments. MATERIALS AND METHODS: This is an observational questionnaire-based study. Seventy-seven (77) II-year MBBS students at BGSGIMS attended the practicals and filled in a survey questionnaire on the outcomes, advantages and disadvantages of the CAL session using a 5-point Likert scale. RESULTS: More than 90% of II MBBS students found that CAL helped them achieve the learning objectives and that it enriches and personalizes the learning experience, at their own pace, within the time slot. CAL helped students recollect and apply theoretical knowledge of drugs in the practical session. CONCLUSION: Learning basic concepts in Pharmacology using CAL animal-simulation software as an educational tool was perceived positively by II MBBS students. The CAL programme, coupled with the application of theoretical knowledge of drugs in the practical classes, helped them fulfil the learning outcomes.

  3. Circulatory responses to nasotracheal intubation: comparison of GlideScope(R) videolaryngoscope and Macintosh direct laryngoscope

    Institute of Scientific and Technical Information of China (English)

    XUE Fu-shan; LI Xuan-ying; LIU Qian-jin; LIU He-ping; YANG Quan-yong; XU Ya-chao; LIAO Xu; LIU Yi

    2008-01-01

    Background The GlideScope videolaryngoscope (GSVL) has been shown to have no special advantage over the Macintosh direct laryngoscope (MDL) in attenuating the circulatory responses to orotracheal intubation, but no study has compared the circulatory responses to nasotracheal intubation (NTI) using the two devices. This prospective randomized clinical study was designed to determine whether there was a clinically relevant difference between the circulatory responses to NTI with the GSVL and the MDL. Methods Seventy-six adult patients were randomly allocated equally to the GSVL group and the MDL group. After induction of anesthesia, NTI was performed. Non-invasive blood pressure (BP) and heart rate (HR) were recorded before induction (baseline values) and immediately before intubation (post-induction values), at intubation and every minute for a further five minutes. During the observation, the times required to reach the maximum values of systolic BP (SBP) and HR, the times required for recovery of SBP and HR to post-induction values, and the incidence of SBP and HR percent changes >30% of baseline values were also noted. The product of HR and SBP, i.e. the rate-pressure product (RPP), and the areas under the SBP and HR vs. time curves (AUCSBP and AUCHR) were calculated. Results NTI with the GSVL resulted in significant increases in BP, HR and RPP compared with post-induction values, but these circulatory changes did not exceed baseline values. BPs at all measuring points, AUCSBP, maximum values of BP and the incidence of SBP percent increases >30% of baseline value did not differ significantly between groups. However, HR and RPP at intubation and their maximum values, AUCHR and the incidence of HR percent increases >30% of baseline value were significantly higher in the MDL group than in the GSVL group. The times required for recovery of SBP and HR to post-induction values were significantly longer in the MDL group than in the GSVL group. Conclusions The pressor response to

  4. Copper (II) diamino acid complexes: Quantum chemical computations regarding diastereomeric effects on the energy of complexation

    NARCIS (Netherlands)

    Zuilhof, H.; Morokuma, K.

    2003-01-01

    Quantum chemical calculations were used to rationalize the observed enantiodifferentiation in the complexation of alpha-amino acids to chiral Cu(II) complexes. Apart from Cu(II)-pi interactions and steric repulsions between the anchoring cholesteryl-Glu moiety and an aromatic amino acid R group, hyd

  5. Computational evaluation of unsaturated carbonitriles as neutral receptor model for beryllium(II) recognition.

    Science.gov (United States)

    Rosli, Ahmad Nazmi; Ahmad, Mohd Rais; Alias, Yatimah; Zain, Sharifuddin Md; Lee, Vannajan Sanghiran; Woi, Pei Meng

    2014-12-01

    Design of neutral receptor molecules (ionophores) for beryllium(II) using unsaturated carbonitrile models has been carried out via density functional theory, G3, and G4 calculations. The first part of this work focuses on gas-phase binding energies between beryllium(II) and 2-cyano butadiene (2-CN BD), 3-cyano propene (3-CN P), and simpler models with two separate fragments: acrylonitrile and ethylene. Interactions between beryllium(II) and the cyano nitrogen and terminal olefin in the models have been examined in terms of geometrical changes, distribution of charge over the entire π-system, and rehybridization of vinyl carbon orbitals. NMR shieldings and vibrational frequencies probed charge centers and the strength of interactions. The six-membered cyclic complexes have planar structures with the rehybridized carbon slightly out of plane (16° in 2-CN BD). G3 results show that in the 2-CN BD complex the participation of the vinyl carbon further stabilizes the cyclic adduct by 16.3 kcal mol(-1), whereas, in the simpler models, the interaction between beryllium(II) and acrylonitrile is favorable by 46.4 kcal mol(-1) compared with that of ethylene. The terminal vinyl carbon in 2-CN BD rehybridizes to sp(3) with a 7% increase in s character to allow interaction with beryllium(II). G4 calculations show that the Be(II)-2-CN BD complex is more strongly bound than those with Mg(II) and Ca(II) by 98.5 and 139.2 kcal mol(-1), respectively. The QST2 method shows that the cyclic and acyclic forms of the Be(II)-2-CN BD complex are separated by a 12.3 kcal mol(-1) barrier. Overlap population analysis reveals that Ca(II) can be discriminated based on its tendency to form an ionic interaction with the receptor models.

  6. Density Functionalized [Ru(II)(NO)(Salen)(Cl)] Complex: Computational Photodynamics and In Vitro Anticancer Facets.

    Science.gov (United States)

    Mir, Jan Mohammad; Jain, N; Jaget, P S; Maurya, R C

    2017-07-22

    Photodynamic therapy (PDT) is a treatment that uses photosensitizing agents to kill cancer cells. The scientific community has been eager for decades to design an efficient PDT drug. In this context, the current report deals with the computational photodynamic behavior of a ruthenium(II) nitrosyl complex containing N,N'-salicylidene-ethylenediimine (SalenH2), the synthesis and X-ray crystallography of which are already known [Ref. 36]. The Gaussian 09W software package was employed to carry out the density functional theory (DFT) studies. DFT calculations used the effective core potential method, with Becke-3-Lee-Yang-Parr (B3LYP)/Los Alamos National Laboratory 2 Double Z (LanL2DZ) for the Ru atom and B3LYP/6-31G(d,p) for all other atoms. Both the ground and excited states of the complex were evaluated. Some known photosensitizers were compared with the target complex; phthalocyanine and porphyrin derivatives were the compounds selected for this comparative study. The effective photoactivity is attributed to the presence of the ruthenium core in the model complex. In addition to the theoretical evaluation, in vitro anticancer assays against COLO-205 human cancer cells have also been carried out on the complex. Particular emphasis was placed on using DFT to describe the ability of the target compound to release nitric oxide; a promising visible-light-triggered nitric oxide releasing capability of the compound has been inferred. In vitro antiproliferative studies of [RuCl3(PPh3)3] and [Ru(NO)(Salen)(Cl)] have revealed the model complex to be an excellent anticancer agent. From IC50 values of 40.031 mg/mL for the former and 9.74 mg/mL for the latter, it is established that the latter bears greater anticancer potential. From the overall study, the DFT-based structural elucidation and the efficiency of the NO, Ru and Salen co-ligands have shown promising drug delivery properties and good candidacy for both chemotherapy as

  7. Modelling the effect of arbitrary P-T-t histories on argon diffusion in minerals using the MacArgon program for the Apple Macintosh

    Science.gov (United States)

    Lister, Gordon S.; Baldwin, Suzanne L.

    1996-03-01

    Argon diffusion in mineral grains has been numerically modelled using P-T-t histories that may be relevant to multiply metamorphosed orogenic terranes and for rocks that have resided at high ambient temperatures in the Earth's crust for long durations. The MacArgon program generates argon concentration profiles in minerals assuming argon loss occurs via volume diffusion. It can be run on an Apple Macintosh computer, with arbitrary P-T-t histories used as input. Finite-difference equations are used in the calculation of 40Ar* concentration profiles across individual diffusion domains. The associated MacSpectrometer generates model spectra after a P-T-t history has been specified. The form of model 40Ar/39Ar apparent age spectra suggests that considerable caution needs to be exercised in the use of the closure temperature concept and in the interpretation of the significance of plateaux observed in many 40Ar/39Ar apparent age spectra, particularly in cases involving metamorphic rocks, where complex P-T-t histories might apply. Although modelled spectra cannot be directly compared to experimentally determined 40Ar/39Ar age spectra, especially when hydrous phases are involved or in cases where loss of argon has not occurred via volume diffusion, they do provide insight into theoretically expected age spectra for samples that have experienced complex P-T-t histories. MacArgon can be obtained by e-mail from macargon@artemis.earth.monash.edu.au, with enquiries to gordon@artemis.earth.monash.edu.au.
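    The finite-difference approach mentioned in the abstract can be illustrated with a minimal 1D volume-diffusion sketch: argon diffuses out of a slab-shaped domain while temperature, and hence the Arrhenius diffusivity, follows a prescribed cooling history. This is an illustration of the numerical idea only, not MacArgon itself; the activation energy, frequency factor and cooling histories below are invented round numbers, and argon production during cooling is ignored.

```python
import numpy as np

R_GAS = 8.314  # J/(mol K)

def retained_fraction(t_cool_myr, T_start=900.0, T_end=300.0,
                      D0_over_a2=1e8, E=160e3, n=50):
    """Fraction of 40Ar* retained in a slab diffusion domain after cooling
    linearly from T_start to T_end (K) over t_cool_myr (Myr).
    D0_over_a2 is the Arrhenius pre-exponential factor divided by the squared
    domain size, in 1/Myr; E is the activation energy in J/mol (all invented)."""
    dx = 1.0 / n
    c = np.ones(n + 1)          # uniform initial 40Ar* concentration
    c[0] = c[-1] = 0.0          # argon escapes at the domain boundaries
    D_max = D0_over_a2 * np.exp(-E / (R_GAS * T_start))
    dt = 0.3 * dx**2 / D_max    # below the explicit-scheme stability limit
    steps = int(t_cool_myr / dt)
    for step in range(steps):
        T = T_start + (T_end - T_start) * (step / steps)   # linear cooling
        D = D0_over_a2 * np.exp(-E / (R_GAS * T))
        c[1:-1] += dt * D * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx**2
    return float(c.mean())      # spatial average ~ retained fraction
```

    Rapid cooling (say 10 Myr) quenches diffusion while the grain is still hot for only a short time and retains more argon than slow cooling (say 100 Myr) - the basic reason apparent ages depend on the P-T-t history rather than on a single closure temperature.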

  8. WE-B-BRD-01: Innovation in Radiation Therapy Planning II: Cloud Computing in RT

    Energy Technology Data Exchange (ETDEWEB)

    Moore, K [University of California, San Diego, La Jolla, CA (United States); Kagadis, G [University Patras, Rion - Patras (Greece); Xing, L [Stanford University, Stanford, CA (United States); McNutt, T [Johns Hopkins University, Severna Park, MD (United States)

    2014-06-15

    As defined by the National Institute of Standards and Technology, cloud computing is “a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.” Despite the omnipresent role of computers in radiotherapy, cloud computing has yet to achieve widespread adoption in clinical or research applications, though the transition to such “on-demand” access is underway. As this transition proceeds, new opportunities for aggregate studies and efficient use of computational resources are set against new challenges in patient privacy protection, data integrity, and management of clinical informatics systems. In this Session, current and future applications of cloud computing and distributed computational resources will be discussed in the context of medical imaging, radiotherapy research, and clinical radiation oncology applications. Learning Objectives: 1. Understand basic concepts of cloud computing. 2. Understand how cloud computing could be used for medical imaging applications. 3. Understand how cloud computing could be employed for radiotherapy research. 4. Understand how clinical radiotherapy software applications would function in the cloud.

  9. A comparison of tracheal intubation using the Airtraq or the Macintosh laryngoscope in routine airway management: A randomised, controlled clinical trial.

    LENUS (Irish Health Repository)

    Maharaj, C H

    2006-11-01

    The Airtraq laryngoscope is a novel single-use tracheal intubation device. We compared the Airtraq with the Macintosh laryngoscope in patients deemed at low risk for difficult intubation in a randomised, controlled clinical trial. Sixty consenting patients presenting for surgery requiring tracheal intubation were randomly allocated to undergo intubation using a Macintosh (n = 30) or Airtraq (n = 30) laryngoscope. All patients were intubated by one of four anaesthetists experienced in the use of both laryngoscopes. No significant differences in demographic or airway variables were observed between the groups. All patients but one, in the Macintosh group, were successfully intubated on the first attempt. There was no difference between groups in the duration of intubation attempts. In comparison to the Macintosh laryngoscope, the Airtraq resulted in modest improvements in the intubation difficulty score and in ease of use. Tracheal intubation with the Airtraq resulted in fewer alterations in heart rate. These findings demonstrate the utility of the Airtraq laryngoscope for tracheal intubation in low-risk patients.

  10. Comparison of the Airtraq® and Truview® laryngoscopes to the Macintosh laryngoscope for use by Advanced Paramedics in easy and simulated difficult intubation in manikins

    Directory of Open Access Journals (Sweden)

    O' Donnell John

    2009-02-01

    Full Text Available Abstract Background Paramedics are frequently required to perform tracheal intubation, a potentially life-saving manoeuvre in severely ill patients, in the prehospital setting. However, direct laryngoscopy is often more difficult in this environment, and failed tracheal intubation constitutes an important cause of morbidity. Novel indirect laryngoscopes, such as the Airtraq® and Truview® laryngoscopes may reduce this risk. Methods We compared the efficacy of these devices to the Macintosh laryngoscope when used by 21 Paramedics proficient in direct laryngoscopy, in a randomized, controlled, manikin study. Following brief didactic instruction with the Airtraq® and Truview® laryngoscopes, each participant took turns performing laryngoscopy and intubation with each device, in an easy intubation scenario and following placement of a hard cervical collar, in a SimMan® manikin. Results The Airtraq® reduced the number of optimization manoeuvres and reduced the potential for dental trauma when compared to the Macintosh, in both the normal and simulated difficult intubation scenarios. In contrast, the Truview® increased the duration of intubation attempts, and required a greater number of optimization manoeuvres, compared to both the Macintosh and Airtraq® devices. Conclusion The Airtraq® laryngoscope performed more favourably than the Macintosh and Truview® devices when used by Paramedics in this manikin study. Further studies are required to extend these findings to the clinical setting.

  11. A Randomized Comparison Simulating Face to Face Endotracheal Intubation of Pentax Airway Scope, C-MAC Video Laryngoscope, Glidescope Video Laryngoscope, and Macintosh Laryngoscope

    Directory of Open Access Journals (Sweden)

    Hyun Young Choi

    2015-01-01

    Full Text Available Objectives. Early airway management is very important for severely ill patients. This study aimed to investigate the efficacy of face-to-face intubation with four different types of laryngoscope: the Macintosh laryngoscope, Pentax airway scope (AWS), Glidescope video laryngoscope (GVL), and C-MAC video laryngoscope (C-MAC). Method. Ninety-five nurses and emergency medical technicians were trained to use the AWS, C-MAC, GVL and Macintosh laryngoscope with a standard airway-trainer manikin in face-to-face intubation. We compared VCET (vocal cord exposure time), tube pass time, first ventilation time, VCET to tube pass time, tube pass time to first ventilation time, and POGO (percentage of glottic opening) score. In addition, we compared success rates according to the number of attempts, and complications. Result. VCET was similar among all laryngoscopes and the POGO score was higher with the AWS. The AWS and Macintosh blade were faster than the GVL and C-MAC in total intubation time. The face-to-face intubation success rate was lower with the GVL than with the other laryngoscopes. Conclusion. The AWS and Macintosh were favorable laryngoscopes for face-to-face intubation; the GVL was at a disadvantage when performing face-to-face intubation.

  12. Nickel(II), copper(II) and zinc(II) metallo-intercalators: structural details of the DNA-binding by a combined experimental and computational investigation.

    Science.gov (United States)

    Lauria, Antonino; Bonsignore, Riccardo; Terenzi, Alessio; Spinello, Angelo; Giannici, Francesco; Longo, Alessandro; Almerico, Anna Maria; Barone, Giampaolo

    2014-04-28

    We present a thorough characterization of the interaction of novel nickel(II) (1), copper(II) (2) and zinc(II) (3) Schiff base complexes with native calf thymus DNA (ct-DNA), in buffered aqueous solution at pH 7.5. UV-vis absorption, circular dichroism (CD) and viscometry titrations provided clear evidence of the intercalative mechanism of the three square-planar metal complexes, allowing us to determine the intrinsic DNA-binding constants (K(b)), equal to 1.3 × 10(7), 2.9 × 10(6), and 6.2 × 10(5) M(-1) for 1, 2 and 3, respectively. Preferential affinity, of one order of magnitude, toward AT compared to GC base pair sequences was detected by UV-vis absorption titrations of 1 with [poly(dG-dC)]2 and [poly(dA-dT)]2. Structural details of the intercalation site of the three metal complexes within [dodeca(dA-dT)]2 were obtained by molecular dynamics (MD) simulations followed by density functional theory/molecular mechanics (DFT/MM) calculations. The calculations revealed that three major intermolecular interactions contribute to the strong affinity between DNA and the three metal complexes: (1) the electrostatic attraction between the two positively charged triethylammoniummethyl groups of the metal complexes and the negatively charged phosphate groups of the DNA backbone; (2) the intercalation of the naphthalene moiety within the four nitrogen bases of the intercalation site; (3) the metal coordination by exocyclic donor atoms of the bases, specifically the carbonyl oxygen and amine nitrogen atoms. Remarkably, the Gibbs formation free energy calculated for the intercalation complexes of 1, 2 and 3 with [dodeca(dA-dT)]2 in the implicit water solution is in agreement with the experimental Gibbs free energy values obtained from the DNA-binding constants as ΔG° = -RT ln(K(b)). In particular, the DNA-binding affinity trend, 1 > 2 > 3, is reproduced. Finally, the first shell coordination distances calculated for the intercalation complex 3/[dodeca(dA-dT)]2 are in
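    The thermodynamic conversion used above, ΔG° = -RT ln(Kb), is easy to reproduce from the three reported binding constants. The snippet below applies it assuming T = 298.15 K (the abstract does not state the temperature explicitly):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def binding_free_energy_kj(Kb, T=298.15):
    """Gibbs free energy of binding in kJ/mol from ΔG° = -RT ln(Kb)."""
    return -R * T * math.log(Kb) / 1000.0

# Intrinsic DNA-binding constants reported in the abstract (M^-1).
Kb = {"1 Ni(II)": 1.3e7, "2 Cu(II)": 2.9e6, "3 Zn(II)": 6.2e5}
dG = {name: binding_free_energy_kj(k) for name, k in Kb.items()}
# The stronger binder has the more negative ΔG°, reproducing the 1 > 2 > 3
# affinity trend (about -40.6, -36.9 and -33.1 kJ/mol respectively).
```

    A difference of roughly one order of magnitude in Kb corresponds to RT ln 10 ≈ 5.7 kJ/mol at room temperature, which is why the three complexes are separated by a few kJ/mol each.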

  13. A Comparison of Equality in Computer Algebra and Correctness in Mathematical Pedagogy (II)

    Science.gov (United States)

    Bradford, Russell; Davenport, James H.; Sangwin, Chris

    2010-01-01

    A perennial problem in computer-aided assessment is that "a right answer", pedagogically speaking, is not the same thing as "a mathematically correct expression", as verified by a computer algebra system, or indeed other techniques such as random evaluation. Paper I in this series considered the difference in cases where there was "the right…
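    The distinction drawn here between "a right answer" and "a mathematically correct expression" can be made concrete: a computer algebra system, or the random-evaluation technique the abstract mentions, judges x² - 1 and (x - 1)(x + 1) equal, even though only one may be the pedagogically expected form. Below is a minimal random-evaluation checker, an illustration only and not the authors' system:

```python
import random

def numerically_equivalent(f, g, trials=200, tol=1e-9, lo=-10.0, hi=10.0):
    """Heuristic equality test: evaluate both expressions at random points.
    Agreement at many points suggests (but does not prove) equivalence."""
    for _ in range(trials):
        x = random.uniform(lo, hi)
        fx, gx = f(x), g(x)
        if abs(fx - gx) > tol * max(1.0, abs(fx), abs(gx)):
            return False
    return True

expanded = lambda x: x**2 - 1           # one form of the expression
factored = lambda x: (x - 1) * (x + 1)  # a different form, same function
# numerically_equivalent(expanded, factored) returns True, yet a question
# asking students to factorize would accept only the second form - the gap
# between mathematical equality and pedagogical correctness.
```

    The checker is one-sided in the way the abstract implies: a False is near-certain evidence of inequality, while a True only says the two expressions agreed on the sampled points.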

  14. Experience of direct percutaneous sac injection in type II endoleak using cone beam computed tomography.

    Science.gov (United States)

    Park, Yoong-Seok; Do, Young Soo; Park, Hong Suk; Park, Kwang Bo; Kim, Dong-Ik

    2015-04-01

    Cone beam CT, usually used in the dental field, can easily obtain three-dimensional images using cone-beam-shaped ionizing radiation. Cone beam CT is very useful for direct percutaneous sac injection (DPSI), which requires very precise measurement to avoid puncture of the inferior vena cava or of vessels around the sac or stent graft. Here we describe two cases of DPSI using cone beam CT. In case 1, a 79-year-old male had widening of a preexisting type II endoleak after endovascular aneurysm repair (EVAR); however, transarterial embolization failed due to tortuous collateral branches of the lumbar arteries. In case 2, a 72-year-old female had symptomatic sac enlargement due to a type II endoleak after EVAR, but there was no route by which to approach the lumbar arteries. Therefore, we performed DPSI assisted by cone beam CT in cases 1 and 2. Six-month CT follow-up revealed no sign of sac enlargement by type II endoleak.

  15. Synthesis, crystal structure and luminescent properties of some Zn(II) Schiff base complexes: experimental and computational study.

    Science.gov (United States)

    Eltayeb, Naser Eltaher; Teoh, Siang Guan; Adnan, Rohana; Teh, Jeannie Bee-Jan; Fun, Hoong-Kun

    2011-07-01

    A series of Zn(II) Schiff base complexes I, II and III were synthesized by reaction of o-phenylenediamine with 3-methylsalicylaldehyde, 4-methylsalicylaldehyde and 5-methylsalicylaldehyde. These complexes were characterized using FT-IR, UV-Vis, diffuse-reflectance UV-Vis, elemental analysis and conductivity measurements. Complex III was characterized by single-crystal XRD; it crystallizes in the triclinic system, space group P-1, with lattice parameters a = 9.5444(2) Å, b = 11.9407(2) Å, c = 21.1732(3) Å, V = 2390.24(7) Å(3), D(c) = 1.408 Mg m(-3), Z = 4, F(000) = 1050, GOF = 0.981, R1 = 0.0502, wR2 = 0.1205. The luminescence of these complexes was investigated in DMF solution and in the solid state. A computational study of the electronic properties of complex III showed good agreement with the experimental data.

  16. Intelligent Computer-Aided Instruction and Musical Performance Skills. CITE Report No. 18.

    Science.gov (United States)

    Baker, Michael

    This paper is a transcription from memory of a short talk that used overhead projector slides, with musical examples played on an Apple Macintosh computer and a Yamaha CX5 synthesizer. The slides appear in the text as reduced "icons" at the point where they would have been used in the talk. The paper concerns ways in which artificial intelligence…

  17. Photospheric Magnitude Diagrams for Type II Supernovae: A Promising Tool to Compute Distances

    Science.gov (United States)

    Rodríguez, Ósmar; Clocchiatti, Alejandro; Hamuy, Mario

    2014-12-01

    We develop an empirical color-based standardization for Type II supernovae (SNe II), equivalent to the classical surface brightness method given in Wesselink. We calibrate this standardization using SNe II with host galaxy distances measured using Cepheids, and a well-constrained shock breakout epoch and extinction due to the host galaxy. We estimate the reddening with an analysis of the B - V versus V - I color-color curves, similar to that of Natali et al. With four SNe II meeting the above requirements, we build a photospheric magnitude versus color diagram (similar to an H-R diagram) with a dispersion of 0.29 mag. We also show that when using time since shock breakout instead of color as the independent variable, the same standardization gives a dispersion of 0.09 mag. Moreover, we show that the above time-based standardization corresponds to the generalization of the standardized candle method of Hamuy & Pinto for various epochs throughout the photospheric phase. To test the new tool, we construct Hubble diagrams for different subsamples of 50 low-redshift (cz 3000 km s-1) and with a well-constrained shock breakout epoch we obtain values of 68-69 km s-1 Mpc-1 for the Hubble constant and a mean intrinsic scatter of 0.12 mag or 6% in relative distances.
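    The standardized candle logic described above can be illustrated with a toy calculation: once a supernova's apparent magnitude is standardized to a common absolute magnitude, it yields a distance, and a sample of such distances and redshifts yields the Hubble constant. The following minimal sketch uses entirely synthetic data; the value of `M_STD`, the sample size and the scatter are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical sketch: recovering H0 from a Hubble diagram of standardized
# magnitudes. All numbers below are synthetic, not from the paper.
H0_TRUE = 68.5          # km/s/Mpc, value used to generate the mock sample
M_STD = -17.0           # assumed standardized absolute magnitude

rng = np.random.default_rng(0)
cz = rng.uniform(3000, 10000, 50)             # recession velocities, km/s
d_true = cz / H0_TRUE                         # Hubble-law distances, Mpc
mu = 5 * np.log10(d_true) + 25                # distance moduli
m_obs = M_STD + mu + rng.normal(0, 0.12, 50)  # magnitudes with 0.12 mag scatter

# Invert each magnitude to a distance, then estimate H0 as the mean of cz/d.
d_est = 10 ** ((m_obs - M_STD - 25) / 5)
H0_est = np.mean(cz / d_est)
print(round(H0_est, 1))                       # close to 68.5 within the scatter
```

    An intrinsic scatter of about 0.12 mag, as quoted above, corresponds to roughly 6% per-object distance errors, which average down over the sample.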

  18. Synthetic, Crystallographic, and Computational Study of Copper(II) Complexes of Ethylenediaminetetracarboxylate Ligands

    NARCIS (Netherlands)

    Matovic, Zoran D.; Miletic, Vesna D.; Cendic, Marina; Meetsma, Auke; van Koningsbruggen, Petra J.; Deeth, Robert J.; Matović, Zoran D.; Miletić, Vesna D.; Ćendić, Marina

    2013-01-01

    Copper(II) complexes of hexadentate ethylenediaminetetracarboxylic acid type ligands H(4)eda3p and H(4)eddadp (H(4)eda3p = ethylenediamine-N-acetic-N,N',N'-tri-3-propionic acid; H(4)eddadp = ethylenediamine-N,N'-diacetic-N,N'-di-3-propionic acid) have been prepared. An octahedral trans(O-6) geometry (two

  20. Photospheric Magnitude Diagrams for Type II Supernovae: A Promising Tool to Compute Distances

    CERN Document Server

    Rodríguez, Ósmar; Hamuy, Mario

    2014-01-01

    We develop an empirical color-based standardization for Type II supernovae (SNe II), equivalent to the classical surface brightness method given in Wesselink (1969). We calibrate it using SNe II with host galaxy distances measured with Cepheids, and with well-constrained shock breakout epochs and host galaxy extinctions. We estimate the reddening with an analysis of the B-V versus V-I color-color curves, similar to that of Natali et al. (1994). With four SNe II meeting the above requirements, we build a photospheric magnitude versus color diagram (similar to an HR diagram) with a dispersion of 0.29 mag. We also show that when using time since shock breakout instead of color as the independent variable, the same standardization gives a dispersion of 0.09 mag. Moreover, we show that the above time-based standardization corresponds to the generalization of the standardized candle method of Hamuy & Pinto (2002) for various epochs throughout the photospheric phase. To test the new tool, we construct Hubble diagr...

  1. Method for Determining Language Objectives and Criteria. Volume II. Methodological Tools: Computer Analysis, Data Collection Instruments.

    Science.gov (United States)

    1979-05-25

    This volume presents (1) methods for computer and hand analysis of numerical language performance data (with examples), and (2) samples of interview, observation, and survey instruments used in collecting language data. (Author)

  2. Computer virus information update CIAC-2301

    Energy Technology Data Exchange (ETDEWEB)

    Orvis, W.J.

    1994-01-15

    While CIAC periodically issues bulletins about specific computer viruses, these bulletins do not cover all the computer viruses that affect desktop computers. The purpose of this document is to identify most of the known viruses for the MS-DOS and Macintosh platforms and give an overview of the effects of each virus. The authors also include information on some Windows, Atari, and Amiga viruses. This document is revised periodically as new virus information becomes available. This document replaces all earlier versions of the CIAC Computer Virus Information Update. The date on the front cover indicates the date on which the information in this document was extracted from CIAC's Virus database.

  3. Obstacle detection algorithm of low computational cost for Guanay II AUV

    OpenAIRE

    Galarza Bogotá, Cesar Mauricio; Prat Tasias, Jordi; Gomáriz Castro, Spartacus

    2016-01-01

    Obstacle detection is one of the most important stages in an obstacle avoidance system. This work explains the operation of a strategy designed and implemented for overall object detection at low computational cost. The strategy is based on performing a spatial segmentation of the information obtained by the SONAR and determining the minimum distance between the SONAR (AUV) and the obstacle. Peer Reviewed
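    The segmentation-plus-minimum-distance idea can be sketched in a few lines. This is a hedged illustration, not the authors' implementation: the sector count, maximum range and echo format are assumptions made for the example.

```python
import math

# Hypothetical sketch: segment sonar returns into angular sectors and report
# the minimum obstacle distance per sector and overall.
def min_distances_by_sector(returns, n_sectors=8, max_range=50.0):
    """returns: list of (angle_rad, range_m) sonar echoes.
    Gives the per-sector minimum range; max_range means 'no echo seen'."""
    sector_min = [max_range] * n_sectors
    width = 2 * math.pi / n_sectors
    for angle, dist in returns:
        idx = int((angle % (2 * math.pi)) // width)
        if dist < sector_min[idx]:
            sector_min[idx] = dist
    return sector_min

echoes = [(0.1, 12.0), (0.2, 9.5), (2.0, 30.0), (3.5, 4.2)]
per_sector = min_distances_by_sector(echoes)
print(min(per_sector))  # 4.2 -> distance to the closest detected obstacle
```

    Keeping only one scalar per sector is what makes the strategy cheap: memory and per-echo work are constant regardless of how many raw returns the sonar produces.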

  5. A fast DNA sequence handling program for Apple II computer in BASIC and 6502 assembler.

    Science.gov (United States)

    Paolella, G

    1985-01-01

    A fast general purpose DNA handling program has been developed in BASIC and machine language. The program runs on the Apple II plus or on the Apple IIe microcomputer, without additional hardware except for disk drives and printer. The program allows file insertion and editing, translation into protein sequence, reverse translation, search for small strings and restriction enzyme sites. The homology may be shown either as a comparison of two sequences or through a matrix on screen. Two additional features are: (i) drawing restriction site maps on the printer; and (ii) simulating a gel electrophoresis of restriction fragments both on screen and on paper. All the operations are very fast. The more common tasks are carried out almost instantly; only more complex routines, like finding homology between large sequences or searching and sorting all the restriction sites in a long sequence require longer, but still quite acceptable, times (generally under 30 s).
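    Two of the tasks the abstract lists, translation into protein sequence and searching for restriction enzyme sites, reduce to simple string operations. The sketch below is a hypothetical modern illustration, not the original Apple II BASIC/assembler code; the tiny codon table and the EcoRI site are example inputs.

```python
# Illustrative sketch of two tasks mentioned above: translating DNA to protein
# and locating restriction sites (EcoRI's GAATTC shown as an example).
CODON = {"ATG": "M", "TTT": "F", "AAA": "K", "TAA": "*"}  # tiny demo table

def translate(dna):
    """Translate codon by codon; X marks codons missing from the demo table."""
    return "".join(CODON.get(dna[i:i + 3], "X") for i in range(0, len(dna) - 2, 3))

def find_sites(dna, site="GAATTC"):
    """Return 0-based positions where the recognition sequence occurs."""
    return [i for i in range(len(dna) - len(site) + 1) if dna[i:i + len(site)] == site]

seq = "ATGTTTGAATTCAAATAA"
print(translate(seq))   # MFXXK*
print(find_sites(seq))  # [6]
```

    A linear scan like `find_sites` is fast enough that, as the abstract notes, such searches feel instantaneous even on modest hardware.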

  6. Performance/Design Requirements and Detailed Technical Description for a Computer-Directed Training Subsystem for Integration into the Air Force Phase II Base Level System.

    Science.gov (United States)

    Butler, A. K.; And Others

    The performance/design requirements and a detailed technical description for a Computer-Directed Training Subsystem to be integrated into the Air Force Phase II Base Level System are described. The subsystem may be used for computer-assisted lesson construction and has presentation capability for on-the-job training for data automation, staff, and…

  7. Computing the Lagrangians of the Standard Model II. The Ghost Term

    Science.gov (United States)

    Selesnick, S. A.

    2016-08-01

    We follow up an earlier attempt to compute the Yang-Mills Lagrangian density from first principles. In that work, the Lagrangian density emerged replete with a Feynman-'t Hooft gauge fixing term. In this note we find that similar methods may be applied to produce the concomitant ghost term. Our methods are elementary and entirely and straightforwardly algebraic. Insofar as one of our first principles in the earlier computation was the Schwinger Action Principle, which is a differential version of the Feynman path integral, our computation here may be viewed as a differential version of the Faddeev-Popov functional integral approach to generating the ghost Lagrangian. As such, it avoids all measure theoretic difficulties and ambiguities, though at the price of generality.

  8. Computing the Lagrangians of the Standard Model II. The Ghost Term

    Science.gov (United States)

    Selesnick, S. A.

    2016-11-01

    We follow up an earlier attempt to compute the Yang-Mills Lagrangian density from first principles. In that work, the Lagrangian density emerged replete with a Feynman-'t Hooft gauge fixing term. In this note we find that similar methods may be applied to produce the concomitant ghost term. Our methods are elementary and entirely and straightforwardly algebraic. Insofar as one of our first principles in the earlier computation was the Schwinger Action Principle, which is a differential version of the Feynman path integral, our computation here may be viewed as a differential version of the Faddeev-Popov functional integral approach to generating the ghost Lagrangian. As such, it avoids all measure theoretic difficulties and ambiguities, though at the price of generality.

  9. Quantum Computing and Hidden Variables II: The Complexity of Sampling Histories

    CERN Document Server

    Aaronson, S

    2004-01-01

    This paper shows that, if we could examine the entire history of a hidden variable, then we could efficiently solve problems that are believed to be intractable even for quantum computers. In particular, under any hidden-variable theory satisfying a reasonable axiom called "indifference to the identity," we could solve the Graph Isomorphism and Approximate Shortest Vector problems in polynomial time, as well as an oracle problem that is known to require quantum exponential time. We could also search an N-item database using O(N^{1/3}) queries, as opposed to O(N^{1/2}) queries with Grover's search algorithm. On the other hand, the N^{1/3} bound is optimal, meaning that we could probably not solve NP-complete problems in polynomial time. We thus obtain the first good example of a model of computation that appears slightly more powerful than the quantum computing model.

  10. High performance parallel computing of flows in complex geometries: II. Applications

    Energy Technology Data Exchange (ETDEWEB)

    Gourdain, N; Gicquel, L; Staffelbach, G; Vermorel, O; Duchaine, F; Boussuge, J-F [Computational Fluid Dynamics Team, CERFACS, Toulouse, 31057 (France); Poinsot, T [Institut de Mecanique des Fluides de Toulouse, Toulouse, 31400 (France)], E-mail: Nicolas.gourdain@cerfacs.fr

    2009-01-01

    Present regulations in terms of pollutant emissions, noise and economic constraints require new approaches and designs in the fields of energy supply and transportation. It is now well established that the next breakthrough will come from a better understanding of unsteady flow effects and from considering the entire system, not only isolated components. However, these aspects are still not well captured by numerical approaches, nor well understood, whatever the design stage considered. The main challenge lies in the computational requirements incurred by such complex systems when they are simulated on supercomputers. This paper shows how these challenges can be addressed by using parallel computing platforms for distinct elements of more complex systems, as encountered in aeronautical applications. Based on numerical simulations performed with modern aerodynamic and reactive flow solvers, this work underlines the value of high-performance computing for solving flows in complex industrial configurations such as aircraft, combustion chambers and turbomachines. Performance indicators related to parallel computing efficiency are presented, showing that establishing fair criteria is a difficult task for complex industrial applications. Examples of numerical simulations performed in industrial systems are also described, with particular attention to the computational time and the potential design improvements obtained with high-fidelity and multi-physics computing methods. These simulations use either unsteady Reynolds-averaged Navier-Stokes methods or large eddy simulation and deal with turbulent unsteady flows, such as coupled flow phenomena (thermo-acoustic instabilities, buffet, etc.). Some examples of the difficulties with grid generation and data analysis are also presented for these complex industrial applications.

  11. Investigation on aerodynamic characteristics of baseline-II E-2 blended wing-body aircraft with canard via computational simulation

    Science.gov (United States)

    Nasir, Rizal E. M.; Ali, Zurriati; Kuntjoro, Wahyu; Wisnoe, Wirachman

    2012-06-01

    A previous wind tunnel test has proven the improved aerodynamic characteristics of the Baseline-II E-2 Blended Wing-Body (BWB) aircraft studied at Universiti Teknologi Mara. The E-2 is a version of the Baseline-II BWB with a modified outer wing and a larger canard, designed solely to achieve favourable longitudinal static stability during flight. This paper highlights some results from the current investigation of the said aircraft via computational fluid dynamics simulation as a means to validate the wind tunnel test results. The simulation is conducted with the standard one-equation Spalart-Allmaras turbulence model on a polyhedral mesh. The simulated flight conditions match those of the wind tunnel test. The simulation shows lift, drag and moment results near the values found in the wind tunnel test, but only within angles of attack where the lift change is linear. Beyond the linear region, clear differences between the computational simulation and wind tunnel test results are observed. It is recommended that a different type of mathematical model be used to simulate flight conditions beyond the linear lift region.

  12. IN SILICO EVALUATION OF ANGIOTENSIN II RECEPTOR ANTAGONIST’S PLASMA PROTEIN BINDING USING COMPUTED MOLECULAR DESCRIPTORS

    Directory of Open Access Journals (Sweden)

    Jadranka Odović

    2014-03-01

    The discovery of new pharmacologically active substances and the modeling of drugs have made it necessary to predict drug properties and ADME data. Angiotensin II receptor antagonists are a group of pharmaceuticals which modulate the renin-angiotensin-aldosterone system and today represent the most commonly prescribed anti-hypertensive drugs. The aim of this study was to compare different molecular properties of seven angiotensin II receptor antagonists / blockers (ARBs) (eprosartan, irbesartan, losartan, olmesartan, telmisartan, valsartan) with their plasma protein binding (PPB) data. Several molecular descriptors of the ARBs were calculated using the software package Molinspiration Depiction Software as well as the Virtual Computational Chemistry Laboratory (electronic descriptor - PSA; constitutional parameter - Mw; geometric descriptor - Vol; lipophilicity descriptors - logP values; aqueous solubility data - logS). Correlations were established between all collected descriptors and plasma protein binding data obtained from the relevant literature. Simple linear regression yielded poor correlations between PPB data and all calculated molecular descriptors. In the next stage of the study, multiple linear regression (MLR) was used to correlate PPB data with two different descriptors as independent variables. The best correlation (R2=0.70, P<0.05) was established between PPB data and molecular weight with the addition of volume values as independent variables. The possible application of computed molecular descriptors in evaluating drug protein binding can be of great importance in drug research.
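    The MLR step described above, regressing protein binding on two descriptors at once, can be sketched with an ordinary least-squares fit. The descriptor and PPB values below are invented placeholders for illustration, not the study's data.

```python
import numpy as np

# Hypothetical sketch of multiple linear regression of plasma protein binding
# (PPB) on two molecular descriptors (molecular weight Mw and volume Vol).
Mw  = np.array([424.5, 428.5, 422.9, 446.5, 514.6, 435.5, 520.6])  # invented
Vol = np.array([366.0, 389.0, 370.0, 382.0, 462.0, 399.0, 448.0])  # invented
PPB = np.array([98.0, 96.0, 98.7, 99.0, 99.5, 95.0, 98.0])         # invented, % bound

X = np.column_stack([np.ones_like(Mw), Mw, Vol])  # design matrix with intercept
beta, *_ = np.linalg.lstsq(X, PPB, rcond=None)    # least-squares coefficients

# Coefficient of determination R^2 for the two-descriptor model.
pred = X @ beta
ss_res = np.sum((PPB - pred) ** 2)
ss_tot = np.sum((PPB - PPB.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(beta.shape, 0.0 <= r2 <= 1.0)  # -> (3,) True
```

    With an intercept included, R2 always lies between 0 and 1, so the study's reported R2=0.70 can be read directly as the fraction of PPB variance explained by the two descriptors.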

  13. Renormalization and Computation II: Time Cut-off and the Halting Problem

    CERN Document Server

    Manin, Yuri I

    2009-01-01

    This is the second installment of the project initiated in [Ma3]. In the first part, I argued that both the philosophy and the technique of perturbative renormalization in quantum field theory could be meaningfully transplanted to the theory of computation, and sketched several contexts supporting this view. In this second part, I address some of the issues raised in [Ma3] and develop them in three contexts: a categorification of algorithmic computations; time cut-off and Anytime Algorithms; and finally, a Hopf algebra renormalization of the Halting Problem.

  14. Critical Analysis of Underground Coal Gasification Models. Part II: Kinetic and Computational Fluid Dynamics Models

    Directory of Open Access Journals (Sweden)

    Alina Żogała

    2014-01-01

    Originality/value: This paper presents the state of the art in the field of coal gasification modeling using kinetic and computational fluid dynamics approaches. The paper also presents the authors' own comparative analysis (concerned with mathematical formulation, input data and parameters, basic assumptions, obtained results, etc.) of the most important models of underground coal gasification.

  15. Computer Model of a "Sense of Humour". II. Realization in Neural Networks

    CERN Document Server

    Suslov, I M

    1992-01-01

    The computer realization of a "sense of humour" requires the creation of an algorithm for solving the "linguistic problem", i.e. the problem of recognizing a continuous sequence of polysemantic images. Such an algorithm may be realized in the Hopfield model of a neural network after proper modification.
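    For readers unfamiliar with the Hopfield model the abstract builds on, here is a minimal sketch of the standard version (Hebbian storage, asynchronous threshold updates); the paper's modification for polysemantic image sequences is not reproduced here, and the stored patterns are arbitrary examples.

```python
import numpy as np

# Minimal standard Hopfield network: store two patterns, recall from noise.
patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1, 1, 1, -1, -1, -1]])
n = patterns.shape[1]
W = sum(np.outer(p, p) for p in patterns).astype(float)  # Hebbian weights
np.fill_diagonal(W, 0)                                   # no self-connections

def recall(state, sweeps=20):
    """Asynchronous updates until (in practice) a stored pattern is reached."""
    s = state.copy()
    for _ in range(sweeps):
        for i in range(n):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

noisy = np.array([1, -1, 1, -1, 1, 1])  # first pattern with one flipped bit
print(recall(noisy).tolist())            # -> [1, -1, 1, -1, 1, -1]
```

    The network acts as an associative memory: a corrupted input relaxes to the nearest stored pattern, which is the property a modified Hopfield model can exploit to disambiguate polysemantic images in context.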

  16. Computational models of music perception and cognition II: Domain-specific music processing

    Science.gov (United States)

    Purwins, Hendrik; Grachten, Maarten; Herrera, Perfecto; Hazan, Amaury; Marxer, Ricard; Serra, Xavier

    2008-09-01

    In Part I [Purwins H, Herrera P, Grachten M, Hazan A, Marxer R, Serra X. Computational models of music perception and cognition I: The perceptual and cognitive processing chain. Physics of Life Reviews 2008, in press, doi:10.1016/j.plrev.2008.03.004], we addressed the study of cognitive processes that underlie auditory perception of music, and their neural correlates. The aim of the present paper is to summarize empirical findings from music cognition research that are relevant to three prominent music theoretic domains: rhythm, melody, and tonality. Attention is paid to how cognitive processes like category formation, stimulus grouping, and expectation can account for the music theoretic key concepts in these domains, such as beat, meter, voice, consonance. We give an overview of computational models that have been proposed in the literature for a variety of music processing tasks related to rhythm, melody, and tonality. Although the present state-of-the-art in computational modeling of music cognition definitely provides valuable resources for testing specific hypotheses and theories, we observe the need for models that integrate the various aspects of music perception and cognition into a single framework. Such models should be able to account for aspects that until now have only rarely been addressed in computational models of music cognition, like the active nature of perception and the development of cognitive capacities from infancy to adulthood.

  17. Conformal Wasserstein Distance: Comparing disk and sphere-type surfaces in polynomial time II, computational aspects

    CERN Document Server

    Lipman, Yaron; Daubechies, Ingrid

    2011-01-01

    This paper is a companion paper to [Lipman and Daubechies 2011]. We provide numerical procedures and algorithms for computing the alignment of and distance between two disk type surfaces. We furthermore generalize the framework to support sphere-type surfaces, prove a result connecting this distance to geodesic distortion, and provide convergence analysis on the discrete approximation to the arising mass-transportation problems.

  18. The Denver universal microspectroradiometer (DUM). II. Computer configuration and modular programming for radiometry.

    Science.gov (United States)

    Galbraith, W; Geyer, S B; David, G B

    1975-12-01

    This paper describes and discusses for microscopists and spectroscopists the choice of computer equipment and the design of programs used in the Denver Universal Microspectroradiometer (DUM). This instrument is an accurate computerized photon-counting microspectrophotometer, microspectrofluorimeter and microrefractometer. The computer is used to control the operation of the system, to acquire radiometric data of various kinds, and to reduce, analyse and output the data in a readily usable form. Since the radiometer was designed to carry out many kinds of measurements in a variety of micro- and macroscopic specimens, and since different methods of microscopy or spectroscopy have to be combined in various ways for the study of any one specimen, no single master program could efficiently fulfill all foreseeable requirements. Therefore, the programming developed is interactive, modular, hierarchical and hybrid. Modular interactive programming makes it possible for almost any kind of main program, applicable to almost any kind of measurement, to be assembled quickly from a collection of hierarchical subroutines. Main programs are short and composed mainly of Fortran statements calling subroutines; subroutines, in turn, automatically call other subroutines over many levels. The subroutines are independently written and optimized for maximum operational efficiency in the computer system used, or for maximum ease of transfer to other systems. This approach to programming enables someone unfamiliar with computer languages to operate the radiometric system from the console of the CRT terminal. The writing of new main programs, by linking groups of existing subroutines, requires only a minimum acquaintance with Fortran; only the writing and revision of subroutines requires programming experience. Differences and similarities in the method of computer operation between the present system and other computerized radiometers are briefly discussed.

  19. Computed tomography of the abdomen in Saanen goats: II. liver, spleen, abomasum, and intestine

    OpenAIRE

    2011-01-01

    This study describes the results of computed tomography (CT) of the liver, spleen, abomasum, small intestine and large intestine in 30 healthy Saanen goats. CT examination and anatomical slice preparation postmortem were performed as described in the first communication. After subjective evaluation of the CT images, various variables including the length/size, volume and density of the liver, spleen and gallbladder, the wall thickness of the abomasum, small intestine and large intestine and t...

  20. Simulation of Mental Disorders: II. Computer Models, Purposes and Future Directions.

    Science.gov (United States)

    Gold, Azgad; Dudai, Yadin

    2016-01-01

    The complexity of the human brain and the difficulties in identifying and dissecting the biological, social and contextual underpinnings of mental functions confound the study of the etiology and pathophysiology of mental disorders. Simulating mental disorders in animal models or in computer programs may contribute to the understanding of such disorders. In the companion paper (30), we discussed selected concepts and pragmatics pertaining to mental illness simulation in general, and then focused on issues pertaining to animal models of mental disease. In this paper, we focus on selected aspects of the merits and limitations of the use of large scale computer simulation in investigating mental disorders. We argue that at the current state of knowledge, the biological-phenomenological gap in understanding mental disorders markedly limits the ability to generate high-fidelity computational models of mental illness. We conclude that similarly to the animal model approach, brain simulation focusing on limited realistic objectives, such as mimicking the emergence of selected distinct attributes of specific mental symptoms in a virtual brain or parts thereof, may serve as a useful tool in exploring mental disorders.

  1. WISDOM-II: Screening against multiple targets implicated in malaria using computational grid infrastructures

    Directory of Open Access Journals (Sweden)

    Kenyon Colin

    2009-05-01

    Background: Despite continuous efforts of the international community to reduce the impact of malaria on developing countries, no significant progress has been made in recent years and the discovery of new drugs is more than ever needed. Out of the many proteins involved in the metabolic activities of the Plasmodium parasite, some are promising targets for rational drug discovery. Motivation: Recent years have witnessed the emergence of grids, which are highly distributed computing infrastructures particularly well fitted for embarrassingly parallel computations like docking. In 2005, a first attempt at using grids for large-scale virtual screening focused on plasmepsins and ended in the identification of previously unknown scaffolds, which were confirmed in vitro to be active plasmepsin inhibitors. Following this success, a second deployment took place in the fall of 2006, focusing on one well-known target, dihydrofolate reductase (DHFR), and on a new promising one, glutathione-S-transferase. Methods: In silico drug design, especially vHTS, is a widely accepted technology in lead identification and lead optimization. This approach therefore builds upon the progress made in computational chemistry, to achieve more accurate in silico docking, and in information technology, to design and operate large-scale grid infrastructures. Results: On the computational side, a sustained infrastructure has been developed: docking at large scale, different strategies for result analysis, on-the-fly storage of the results in MySQL databases, molecular dynamics refinement, and MM-PBSA and MM-GBSA rescoring. The modeling results obtained are very promising. Based on these results, in vitro assays are under way for all the targets against which screening was performed.
Conclusion: The current paper describes the rational drug discovery activity at large scale, especially molecular docking using FlexX software

  2. A Quantum Computational Semantics for Epistemic Logical Operators. Part II: Semantics

    Science.gov (United States)

    Beltrametti, Enrico; Dalla Chiara, Maria Luisa; Giuntini, Roberto; Leporini, Roberto; Sergioli, Giuseppe

    2014-10-01

    By using the abstract structures investigated in the first Part of this article, we develop a semantics for an epistemic language, which expresses sentences like "Alice knows that Bob does not understand that π is irrational". One is dealing with a holistic form of quantum computational semantics, where entanglement plays a fundamental role; thus, the meaning of a global expression determines the contextual meanings of its parts, but generally not the other way around. The epistemic situations represented in this semantics seem to reflect some characteristic limitations of the real processes of acquiring information. Since knowledge is not generally closed under logical consequence, the unpleasant phenomenon of logical omniscience is here avoided.

  3. Improved Linear Algebra Methods for Redshift Computation from Limited Spectrum Data - II

    Science.gov (United States)

    Foster, Leslie; Waagen, Alex; Aijaz, Nabella; Hurley, Michael; Luis, Apolo; Rinsky, Joel; Satyavolu, Chandrika; Gazis, Paul; Srivastava, Ashok; Way, Michael

    2008-01-01

    Given photometric broadband measurements of a galaxy, Gaussian processes may be used with a training set to solve the regression problem of approximating the redshift of this galaxy. However, in practice solving the traditional Gaussian processes equation is too slow and requires too much memory. We employed several methods to avoid this difficulty using algebraic manipulation and low-rank approximation, and were able to quickly approximate the redshifts in our testing data within 17 percent of the known true values using limited computational resources. The accuracy of one method, the V Formulation, is comparable to the accuracy of the best methods currently used for this problem.
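    The core trick the abstract describes, replacing the full n x n Gaussian-process solve with a low-rank one, can be illustrated with a generic subset-of-regressors approximation. This is not the paper's exact "V Formulation"; the kernel, data and inducing-point scheme below are illustrative assumptions on a toy 1-D problem.

```python
import numpy as np

# Generic low-rank GP regression sketch (subset-of-regressors style).
rng = np.random.default_rng(1)
X = rng.uniform(0, 5, (200, 1))                   # toy "photometric" inputs
y = np.sin(X[:, 0]) + rng.normal(0, 0.05, 200)    # toy "redshift" targets

def k(a, b, ell=1.0):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell**2)

m_idx = rng.choice(200, 20, replace=False)        # m << n inducing points
Xm = X[m_idx, 0]
Kmn = k(Xm, X[:, 0])                              # m x n cross-covariance
Kmm = k(Xm, Xm)
noise = 0.05 ** 2

# Solve an m x m system instead of the full n x n one: O(n m^2) cost.
A = noise * Kmm + Kmn @ Kmn.T
w = np.linalg.solve(A, Kmn @ y)

x_test = np.array([1.5])
pred = k(x_test, Xm) @ w
print(float(pred[0]))  # should lie near sin(1.5) ~ 0.997
```

    Reducing the linear algebra from n x n to m x m is what lets such methods run "using limited computational resources" on large photometric training sets.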

  4. Computer aided process of dimensional distortion determination of bounded plaster sandmix Part II

    OpenAIRE

    Pawlak, M.; Z. Niedźwiedzki

    2010-01-01

    A computer program allowing calculation of dimensional changes of mould made of cristobalite-gypsum composition in process of its heat treatment and preparation for molten metal casting is presented in this paper. The composition of the mixture and casting temperature to obtain cast of predetermined dimensions can be calculated using presented software. The base for program elaboration were the results of dilatometric test of bounded plaster sandmix composed of half hydrate α-CaSO4·0,5H2O of ...

  5. Computer aided process of dimensional distortion determination of bounded plaster sandmix Part II

    Directory of Open Access Journals (Sweden)

    M. Pawlak

    2010-01-01

    A computer program allowing calculation of the dimensional changes of a mould made of a cristobalite-gypsum composition during its heat treatment and preparation for molten metal casting is presented in this paper. The composition of the mixture and the casting temperature needed to obtain a cast of predetermined dimensions can be calculated using the presented software. The basis for the program's elaboration was the results of dilatometric tests of bounded plaster sandmix composed of the half-hydrate α-CaSO4·0,5H2O with various cristobalite ratios. The approximation was carried out in the temperature range 100÷700°C.
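    The approximation step mentioned above, fitting dilatometric data over 100-700°C, can be sketched with a simple polynomial fit. The expansion curve below is invented for illustration; the paper does not state which functional form was used.

```python
import numpy as np

# Hypothetical sketch: fit a polynomial to dilatometric expansion data over
# the 100-700 degC range. The data points are invented for illustration.
T = np.linspace(100, 700, 13)                     # temperature, degC
expansion = 0.002 * T - 1.5e-6 * T**2 + 0.05      # invented dL/L curve, %

coeffs = np.polyfit(T, expansion, 2)              # quadratic least-squares fit
fitted = np.polyval(coeffs, T)
print(np.max(np.abs(fitted - expansion)) < 1e-4)  # True: data are quadratic
```

    With such a fitted curve, the mould's dimensional change at any intermediate temperature can be interpolated rather than measured, which is what allows the software to compute the mixture and temperature for a cast of predetermined dimensions.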

  6. LHCb computing in Run II and its evolution towards Run III

    CERN Document Server

    Falabella, Antonio

    2016-01-01

    This contribution reports on the experience of the LHCb computing team during LHC Run 2 and its preparation for Run 3. Furthermore, a brief introduction is given to LHCbDIRAC, i.e. the tool to interface to the experiment's distributed computing resources for its data processing and data management operations. Run 2, which started in 2015, has already seen several changes in the data processing workflows of the experiment. Most notable is the ability to align and calibrate the detector between two different stages of the data processing in the high level trigger farm, eliminating the need for a second offline processing pass over the data. In addition, a fraction of the data is immediately reconstructed to its final physics format in the high level trigger, and only this format is exported from the experiment site for physics analysis. This concept has been successfully tested and will continue to be used for the rest of Run 2. Furthermore the distributed data processing has been improved with new concepts and techn...

  7. Comparison of the Glidescope and Pentax AWS laryngoscopes to the Macintosh laryngoscope for use by advanced paramedics in easy and simulated difficult intubation.

    LENUS (Irish Health Repository)

    Nasim, Sajid

    2009-01-01

    BACKGROUND: Intubation of the trachea in the pre-hospital setting may be lifesaving in severely ill and injured patients. However, tracheal intubation is frequently difficult to perform in this challenging environment, is associated with a lower success rate, and failed tracheal intubation constitutes an important cause of morbidity. Novel indirect laryngoscopes, such as the Glidescope and the AWS laryngoscopes, may reduce this risk. METHODS: We compared the efficacy of these devices to the Macintosh laryngoscope when used by 25 Advanced Paramedics proficient in direct laryngoscopy, in a randomized, controlled, manikin study. Following brief didactic instruction with the Glidescope and the AWS laryngoscopes, each participant took turns performing laryngoscopy and intubation with each device, in an easy intubation scenario and following placement of a hard cervical collar, in a SimMan manikin. RESULTS: Both the Glidescope and the AWS performed better than the Macintosh, and demonstrated considerable promise in this context. The AWS caused the fewest dental compressions in all three scenarios, and in the cervical spine immobilization scenario it required fewer maneuvers to optimize the view of the glottis. CONCLUSION: The Glidescope and AWS devices possess advantages over the conventional Macintosh laryngoscope when used by Advanced Paramedics in normal and simulated difficult intubation scenarios in this manikin study. Further studies are required to extend these findings to the clinical setting.

  8. Comparison of the Glidescope® and Pentax AWS® laryngoscopes to the Macintosh laryngoscope for use by Advanced Paramedics in easy and simulated difficult intubation

    Directory of Open Access Journals (Sweden)

    O' Donnell John

    2009-05-01

    Full Text Available Abstract Background Intubation of the trachea in the pre-hospital setting may be lifesaving in severely ill and injured patients. However, tracheal intubation is frequently difficult to perform in this challenging environment, is associated with a lower success rate, and failed tracheal intubation constitutes an important cause of morbidity. Novel indirect laryngoscopes, such as the Glidescope® and the AWS® laryngoscopes, may reduce this risk. Methods We compared the efficacy of these devices to the Macintosh laryngoscope when used by 25 Advanced Paramedics proficient in direct laryngoscopy, in a randomized, controlled, manikin study. Following brief didactic instruction with the Glidescope® and the AWS® laryngoscopes, each participant took turns performing laryngoscopy and intubation with each device, in an easy intubation scenario and following placement of a hard cervical collar, in a SimMan® manikin. Results Both the Glidescope® and the AWS® performed better than the Macintosh, and demonstrated considerable promise in this context. The AWS® caused the fewest dental compressions in all three scenarios, and in the cervical spine immobilization scenario it required fewer maneuvers to optimize the view of the glottis. Conclusion The Glidescope® and AWS® devices possess advantages over the conventional Macintosh laryngoscope when used by Advanced Paramedics in normal and simulated difficult intubation scenarios in this manikin study. Further studies are required to extend these findings to the clinical setting.

  9. Software to compute and conduct sequential Bayesian phase I or II dose-ranging clinical trials with stopping rules.

    Science.gov (United States)

    Zohar, Sarah; Latouche, Aurelien; Taconnet, Mathieu; Chevret, Sylvie

    2003-10-01

    The aim of dose-ranging phase I (resp. phase II) clinical trials is to rapidly identify the maximum tolerated dose (MTD) (resp. minimal effective dose (MED)) of a new drug or combination. For the conduct and analysis of such trials, Bayesian approaches such as the Continual Reassessment Method (CRM) have been proposed, based on a sequential design and analysis up to a completed fixed sample size. To optimize sample sizes, Zohar and Chevret have proposed stopping rules (Stat. Med. 20 (2001) 2827), the computation of which is not provided by available software. We present in this paper a user-friendly software package for the design and analysis of these Bayesian phase I (resp. phase II) dose-ranging Clinical Trials (BPCT). It allows the user to carry out the CRM, with or without stopping rules, from the planning of the trial, with the choice of model parameterization based on its operating characteristics, up to the sequential conduct and analysis of the trial, with estimation at stopping of the MTD (resp. MED) of the new drug or combination.
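The CRM update described above can be sketched numerically. The following is a minimal illustration of a one-parameter power-model CRM with the posterior computed by grid integration; the skeleton, prior standard deviation, and target toxicity rate are hypothetical choices for illustration, and this is not the BPCT software itself:

```python
import numpy as np

def crm_recommend(skeleton, doses_given, tox_observed, target=0.25, prior_sd=1.34):
    """One-parameter power-model CRM: psi_d(a) = p_d ** exp(a), a ~ N(0, prior_sd^2).
    The posterior over a is computed on a grid; the recommended dose is the one
    whose posterior mean toxicity probability is closest to the target rate."""
    a = np.linspace(-5.0, 5.0, 2001)
    da = a[1] - a[0]
    prior = np.exp(-0.5 * (a / prior_sd) ** 2)            # unnormalised N(0, sd^2)
    p_obs = np.asarray(skeleton)[doses_given][:, None] ** np.exp(a)
    y = np.asarray(tox_observed, dtype=float)[:, None]
    lik = np.prod(p_obs ** y * (1.0 - p_obs) ** (1.0 - y), axis=0)  # binomial likelihood
    post = prior * lik
    post /= post.sum() * da                                # normalise the density
    ptox = np.array([np.sum(p ** np.exp(a) * post) * da for p in skeleton])
    return int(np.argmin(np.abs(ptox - target))), ptox

# three patients so far: no toxicity at dose 0, one of two toxic at dose 1
skeleton = [0.05, 0.10, 0.20, 0.35, 0.50]
dose, ptox = crm_recommend(skeleton, doses_given=[0, 1, 1], tox_observed=[0, 0, 1])
```

A stopping rule in the spirit of Zohar and Chevret would then add a criterion on this posterior, for example stopping once the credible interval around the estimated MTD is sufficiently narrow.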

  10. Comparison of Shikani optical stylet and Macintosh laryngoscope for double-lumen endotracheal tube intubation

    Institute of Scientific and Technical Information of China (English)

    许挺; 李民; 郭向阳

    2015-01-01

    Objective: To compare the efficacy and safety of the Shikani (S) optical stylet and the Macintosh (M) laryngoscope for double-lumen endotracheal tube intubation. Methods: In the study, 60 patients undergoing elective thoracic surgery were randomly allocated to group S (n=30) and group M (n=30). After induction of general anesthesia, the patients in group S and group M were intubated with a double-lumen endotracheal tube (DLT) using the Shikani optical stylet (SOS) and the Macintosh laryngoscope, respectively. Intubation time, intubation attempts, cuff damage and oral mucosal or dental injury were recorded; blood pressure and heart rate at baseline (T0), at intubation onset (T1), and 1 minute (T2), 3 minutes (T3) and 5 minutes (T4) after intubation were also recorded; hoarseness and sore throat 24 hours after surgery were evaluated. Results: Intubation with the SOS was faster than with the Macintosh [(37.4 ± 9.7) s vs. (43.9 ± 13.7) s, P=0.039], and the first-attempt success rate (87% vs. 80%, P=0.488) did not differ between the groups. No tube cuff broke in either group. Fewer patients in group S than in group M suffered oral mucosal or dental injury (2 vs. 8, P=0.038). Blood pressure and heart rate at T0, T1, T2, T3 and T4 did not differ between the groups. The incidence of sore throat (7 vs. 10, P=0.390) and hoarseness (5 vs. 7, P=0.519) did not differ between the groups. Conclusion: Compared with the Macintosh laryngoscope, the SOS provides faster DLT intubation and causes less oral mucosal or dental injury.

  11. Blast Noise Prediction. Volume II. BNOISE 3.2 Computer Program Description and Program Listing.

    Science.gov (United States)

    1981-03-01


  12. Computer generated track and field scoring tables: II. Theoretical foundation and development of a model.

    Science.gov (United States)

    Purdy, J G

    1975-01-01

    An investigation is made into the subject of scoring tables for track and field with emphasis on the application of computers to calculate and output the tables. The resulting scoring tables represent an attempt to describe the effective quality of performance for track and field events. This paper is published in three parts. The first portion reviewed the historical development of scoring tables. This part concerns the theoretical foundation and development of a mathematical model. A set of underlying principles and construction guidelines are established as a basis for all scoring tables. In order to satisfy the goals, a model which includes an exponential term is developed. The concept of a zero offset is introduced as a boundary value for the low-level performances. The final part concerns an evaluation of the model and an analysis of the point scores for different events.
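To make the idea of an exponential scoring model with a zero offset concrete, here is a small illustrative scoring function for a timed event: a performance at the zero-offset time earns no points, and points grow exponentially as the time improves. The constants `a` and `b` are hypothetical, not Purdy's fitted parameters:

```python
import math

def score(t, t_zero, a=1000.0, b=2.5):
    """Illustrative exponential scoring curve with a zero offset: a performance
    at the offset time t_zero earns 0 points, and points grow exponentially
    as the time t improves (decreases)."""
    z = (t_zero - t) / t_zero          # fractional improvement over the zero offset
    return max(0.0, a * math.expm1(b * z))

# equal one-second improvements earn progressively more points
points = [score(t, t_zero=15.0) for t in (14.0, 12.0, 10.0)]
```

The convexity of the curve is the point of the exponential term: the same absolute improvement is worth more at the elite end of the scale than at the low end.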

  13. Computer technology of genogeographic analysis of a gene pool: II. Statistical transformation of maps

    Energy Technology Data Exchange (ETDEWEB)

    Balanovskaya, E.V.; Nurbaev, S.D.; Rychkov, Yu.G. [Vavilov Institute of General Genetics, Moscow (Russian Federation)

    1994-11-01

    Transformations of computer maps of geographic distribution of gene frequencies using basic mathematical statistical procedures are considered. These transformations are designated as statistical transformation of maps. Two transformation groups are considered: of one map separately and of a group of maps. Transformations possess a value beyond their use as intermediate stages of more complicated cartographical analysis: the resulting maps carry entirely new information on the geography of genes or a gene pool. This article considers three examples of obtaining new genetic profiles using statistical transformation algorithms. These profiles are of: (1) heterozygosity (of HLA-A, B, C loci in northeastern Eurasia); (2) disease risk (Rh-incompatibility of mother and child with simultaneous registration of Rh and ABO blood groups in Eastern Europe); (3) genetic distances (from own mean ethnic values for Belarus and from mean Russian values for the gene pool of Eastern Europe). 15 refs., 9 figs., 1 tab.
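The heterozygosity profile in example (1) is built, locus by locus, from the standard expected-heterozygosity statistic, H = 1 − Σ pᵢ². A minimal sketch (the allele frequencies below are made up for illustration):

```python
def expected_heterozygosity(freqs):
    """Expected heterozygosity for one locus: H = 1 - sum(p_i^2), i.e. the
    probability that two randomly drawn alleles differ. Frequencies sum to 1."""
    assert abs(sum(freqs) - 1.0) < 1e-9
    return 1.0 - sum(p * p for p in freqs)

# per-locus value; averaging over loci gives a per-population statistic
# that can then be interpolated into a map surface
h = expected_heterozygosity([0.55, 0.30, 0.15])
```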

  14. II - Detector simulation for the LHC and beyond : how to match computing resources and physics requirements

    CERN Document Server

    CERN. Geneva

    2016-01-01

    Detector simulation at the LHC is one of the most computing-intensive activities. In these lectures we will show how physics requirements were met for the LHC experiments and extrapolate to future experiments (the FCC-hh case). At the LHC, detectors are complex, very precise and ambitious: this calls for modern modeling tools for geometry and response. Events are busy and characterised by an unprecedented energy scale, with hundreds of particles to be traced and high energy showers to be accurately simulated. Furthermore, high luminosities imply many events in a bunch crossing and many bunch crossings to be considered at the same time. In addition, backgrounds not directly correlated with bunch crossings also have to be taken into account. The solutions chosen for ATLAS (a mixture of detailed simulation and fast simulation/parameterisation) will be described, and CPU and memory figures will be given. An extrapolation to the FCC-hh case will be attempted, taking the calorimeter simulation as an example.

  15. EVOLVE : a Bridge between Probability, Set Oriented Numerics, and Evolutionary Computation II

    CERN Document Server

    Coello, Carlos; Tantar, Alexandru-Adrian; Tantar, Emilia; Bouvry, Pascal; Moral, Pierre; Legrand, Pierrick; EVOLVE 2012

    2013-01-01

    This book comprises a selection of papers from EVOLVE 2012, held in Mexico City, Mexico. The aim of EVOLVE is to build a bridge between probability, set-oriented numerics and evolutionary computing, and to identify new common and challenging research aspects. The conference is also intended to foster a growing interest in robust and efficient methods with a sound theoretical background. EVOLVE seeks to unify theory-inspired methods and cutting-edge techniques that ensure performance guarantees. By gathering researchers with different backgrounds, a unified view and vocabulary can emerge in which theoretical advancements may echo across different domains. In summary, EVOLVE focuses on challenging aspects that arise in the passage from theory to new paradigms, and aims to provide a unified view while raising questions related to reliability, performance guarantees and modeling. The papers of EVOLVE 2012 contribute to this goal.

  16. Luminescent cyclometalated alkynylplatinum(II) complexes with a tridentate pyridine-based N-heterocyclic carbene ligand: synthesis, characterization, electrochemistry, photophysics, and computational studies.

    Science.gov (United States)

    Leung, Sammual Yu-Lut; Lam, Elizabeth Suk-Hang; Lam, Wai Han; Wong, Keith Man-Chung; Wong, Wing-Tak; Yam, Vivian Wing-Wah

    2013-07-29

    A new class of luminescent alkynylplatinum(II) complexes with a tridentate pyridine-based N-heterocyclic carbene (2,6-bis(1-butylimidazol-2-ylidenyl)pyridine) ligand, [Pt(II)(C^N^C)(C≡CR)][PF6], and their chloroplatinum(II) precursor complex, [Pt(II)(C^N^C)Cl][PF6], have been synthesized and characterized. One of the alkynylplatinum(II) complexes has also been structurally characterized by X-ray crystallography. The electrochemistry, electronic absorption and luminescence properties of the complexes have been studied. Nanosecond transient absorption (TA) spectroscopy has also been performed to probe the nature of the excited state. The origin of the absorption and emission properties has been supported by computational studies.

  17. Cosmic reionization on computers. II. Reionization history and its back-reaction on early galaxies

    Energy Technology Data Exchange (ETDEWEB)

    Gnedin, Nickolay Y. [Particle Astrophysics Center, Fermi National Accelerator Laboratory, Batavia, IL 60510 (United States); Kaurov, Alexander A., E-mail: gnedin@fnal.gov, E-mail: kaurov@uchicago.edu [Department of Astronomy and Astrophysics, The University of Chicago, Chicago, IL 60637 (United States)

    2014-09-20

    We compare the results from several sets of cosmological simulations of cosmic reionization, produced under the Cosmic Reionization On Computers project, with existing observational data on the high-redshift Lyα forest and the abundance of Lyα emitters. We find good consistency with the observational measurements and previous simulation work. By virtue of having several independent realizations for each set of numerical parameters, we are able to explore the effect of cosmic variance on observable quantities. One unexpected conclusion we are forced into is that cosmic variance is unusually large at z > 6, with both our simulations and, most likely, observational measurements still not fully converged for even such basic quantities as the average Gunn-Peterson optical depth or the volume-weighted neutral fraction. We also find that reionization has little effect on the early galaxies or on global cosmic star formation history, because galaxies whose gas content is affected by photoionization contain no molecular (i.e., star-forming) gas in the first place. In particular, measurements of the faint end of the galaxy luminosity function by the James Webb Space Telescope are unlikely to provide a useful constraint on reionization.
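The two quantities named above as convergence tests, the average Gunn-Peterson optical depth and the volume-weighted neutral fraction, have standard definitions that are easy to state in code. This sketch uses made-up pixel values and is not the paper's analysis pipeline:

```python
import numpy as np

def effective_tau(tau_pixels):
    """Effective Gunn-Peterson optical depth: average the transmitted flux
    exp(-tau) over pixels, then convert back, tau_eff = -ln<exp(-tau)>."""
    return float(-np.log(np.mean(np.exp(-np.asarray(tau_pixels)))))

def volume_weighted_neutral_fraction(x_hi):
    """Volume-weighted neutral fraction: plain mean over equal-volume cells."""
    return float(np.mean(x_hi))

tau_eff = effective_tau([0.1, 0.5, 3.0, 8.0])
```

Because the flux average is dominated by the most transparent pixels, the effective depth sits well below the arithmetic mean of the per-pixel depths, which is one reason rare transparent sightlines make these statistics so sensitive to cosmic variance.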

  18. SIMMER-II: A computer program for LMFBR disrupted core analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bohl, W.R.; Luck, L.B.

    1990-06-01

    SIMMER-II (Version 12) is a computer program to predict the coupled neutronic and fluid-dynamics behavior of liquid-metal fast reactors during core-disruptive accident transients. The modeling philosophy is based on the use of general, but approximate, physics to represent interactions of accident phenomena and regimes, rather than a detailed representation of specialized situations. Reactor neutronic behavior is predicted by solving space- (r,z), energy-, and time-dependent neutron conservation equations (discrete ordinates transport or diffusion). The neutronics and the fluid dynamics are coupled via temperature- and background-dependent cross sections and the reactor power distribution. The fluid-dynamics calculation solves multicomponent, multiphase, multifield equations for mass, momentum, and energy conservation in (r,z) or (x,y) geometry. A structure field with nine density and five energy components, a liquid field with eight density and six energy components, and a vapor field with six density and one energy component are coupled by exchange functions representing a modified dispersed flow regime with a zero-dimensional intra-cell structure model.

  19. Comparison of chemical and thermal protein denaturation by combination of computational and experimental approaches. II

    Science.gov (United States)

    Wang, Qian; Christiansen, Alexander; Samiotakis, Antonios; Wittung-Stafshede, Pernilla; Cheung, Margaret S.

    2011-11-01

    Chemical and thermal denaturation methods have been widely used to investigate folding processes of proteins in vitro. However, a molecular understanding of the relationship between these two perturbation methods is lacking. Here, we combined computational and experimental approaches to investigate denaturing effects on three structurally different proteins. We derived a linear relationship between thermal denaturation at temperature Tb and chemical denaturation at another temperature Tu using the stability change of a protein (ΔG). For this, we related the dependence of ΔG on temperature, in the Gibbs-Helmholtz equation, to that of ΔG on urea concentration in the linear extrapolation method, assuming that there is a temperature pair from the urea (Tu) and the aqueous (Tb) ensembles that produces the same protein structures. We tested this relationship on apoazurin, cytochrome c, and apoflavodoxin using coarse-grained molecular simulations. We found a linear correlation between the temperature for a particular structural ensemble in the absence of urea, Tb, and the temperature of the same structural ensemble at a specific urea concentration, Tu. The in silico results agreed with in vitro far-UV circular dichroism data on apoazurin and cytochrome c. We conclude that chemical and thermal unfolding processes correlate in terms of thermodynamics and structural ensembles at most conditions; however, deviations were found at high concentrations of denaturant.
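The linear relation between Tb and Tu described above follows from two standard expressions for the protein stability ΔG; the forms below are the textbook Gibbs-Helmholtz and linear-extrapolation-method equations, not necessarily the authors' exact parameterization:

```latex
% Temperature dependence of the stability (Gibbs-Helmholtz):
\Delta G(T) = \Delta H_m\!\left(1 - \frac{T}{T_m}\right)
            - \Delta C_p\!\left[\,T_m - T + T\ln\frac{T}{T_m}\right]

% Urea dependence at fixed temperature (linear extrapolation method, LEM):
\Delta G(T, c) = \Delta G(T, 0) - m\,c \qquad (c = \text{urea concentration})

% Assuming the same structural ensemble (hence the same stability) occurs at
% T_b without urea and at T_u with urea concentration c:
\Delta G(T_b, 0) = \Delta G(T_u, 0) - m\,c
```

Substituting the Gibbs-Helmholtz form into both sides of the last equality and solving for Tb as a function of Tu at fixed c yields the approximately linear Tb-Tu relationship that the simulations and circular dichroism data are used to test.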

  20. Full computation of massive AGB evolution. II. The role of mass loss and cross-sections

    CERN Document Server

    D'Antona, P V F

    2005-01-01

    In the course of a systematic exploration of the uncertainties associated with the input micro- and macro-physics in the modeling of the evolution of intermediate mass stars during their Asymptotic Giant Branch (AGB) phase, we focus on the role of the nuclear reaction rates and mass loss. We consider masses of 3 M⊙ and above, computing the full nucleosynthesis with hot bottom burning (HBB) for a network of 30 elements, using either the NACRE or the Cameron & Fowler (1988) cross-sections. The results differ in particular with respect to the Na23 nucleosynthesis (which is more efficient in the NACRE case) and the magnesium isotope ratios. For both choices, however, the CNO nucleosynthesis shows that the C+N+O is constant within a factor of two in our models employing a very efficient convection treatment. Different mass loss rates alter the physical conditions for HBB and the length of the AGB phase, changing indi...

  1. II - Template Metaprogramming for Massively Parallel Scientific Computing - Vectorization with Expression Templates

    CERN Document Server

    CERN. Geneva

    2016-01-01

    Large scale scientific computing raises questions on different levels, ranging from the formulation of the problems to the choice of the best algorithms and their implementation for a specific platform. There are similarities in these different topics that can be exploited by modern-style C++ template metaprogramming techniques to produce readable, maintainable and generic code. Traditional low-level code tends to be fast but platform-dependent, and it obfuscates the meaning of the algorithm. On the other hand, an object-oriented approach is nice to read, but may come with an inherent performance penalty. These lectures aim to present the basics of the Expression Template (ET) idiom, which allows us to keep the object-oriented approach without sacrificing performance. We will in particular show how to enhance ET to include SIMD vectorization. We will then introduce techniques for abstracting iteration, and introduce thread-level parallelism for use in heavy data-centric loads. We will show how to apply these methods i...

  2. Learning and performance of tracheal intubation by novice personnel: a comparison of the Airtraq and Macintosh laryngoscope.

    LENUS (Irish Health Repository)

    Maharaj, C H

    2006-07-01

    Direct laryngoscopic tracheal intubation is taught to many healthcare professionals as it is a potentially lifesaving procedure. However, it is a difficult skill to acquire and maintain, and, of concern, the consequences of poorly performed intubation attempts are potentially serious. The Airtraq Laryngoscope is a novel intubation device which may possess advantages over conventional direct laryngoscopes for use by novice personnel. We conducted a prospective trial with 40 medical students who had no prior airway management experience. Following brief didactic instruction, each participant took turns in performing laryngoscopy and intubation using the Macintosh and Airtraq devices under direct supervision. Each student was allowed up to three attempts to intubate in three laryngoscopy scenarios using a Laerdal Intubation Trainer and one scenario in a Laerdal SimMan Manikin. They then performed tracheal intubation of the normal airway a second time to characterise the learning curve for each device. The Airtraq provided superior intubating conditions, resulting in greater success of intubation, particularly in the difficult laryngoscopy scenarios. In both easy and simulated difficult laryngoscopy scenarios, the Airtraq decreased the duration of intubation attempts, reduced the number of optimisation manoeuvres required, and reduced the potential for dental trauma. The Airtraq device showed a rapid learning curve and the students found it significantly easier to use. The Airtraq appears to be a superior device for novice personnel to acquire the skills of tracheal intubation.

  3. Endotracheal intubation using the C-MAC® video laryngoscope or the Macintosh laryngoscope: a prospective, comparative study in the ICU.

    Science.gov (United States)

    Noppens, Ruediger R; Geimer, Stephanie; Eisel, Nicole; David, Matthias; Piepho, Tim

    2012-06-13

    Endotracheal intubation in the ICU is a challenging procedure and is frequently associated with life-threatening complications. The aim of this study was to investigate the effect of the C-MAC® video laryngoscope on laryngeal view and intubation success compared with direct laryngoscopy. In a single-center, prospective, comparative before-after study in an anesthetist-led surgical ICU of a tertiary university hospital, predictors of potentially difficult tracheal intubation, number of intubation attempts, success rate and glottic view were evaluated during a 2-year study period (first year, Macintosh laryngoscopy (ML); second year, C-MAC®). A total of 274 critically ill patients requiring endotracheal intubation were included; 113 intubations using ML and 117 intubations using the C-MAC® were assessed. In patients with at least one predictor for difficult intubation, the C-MAC® resulted in more successful intubations on the first attempt compared with ML (34/43, 79% vs. 21/38, 55%; P = 0.03). Visualization of the glottis with ML, graded using Cormack and Lehane (C&L), was more frequently rated as difficult (20%, C&L grade 3 and 4) compared with the C-MAC® (7%, C&L grade 3 and 4). The C-MAC® thus improved the first-attempt intubation success rate in patients with predictors for difficult intubation in the ICU setting. Video laryngoscopy seems to be a useful tool in the ICU, where potentially difficult endotracheal intubations regularly occur.

  4. Stray light in cone beam optical computed tomography: II. Reduction using a convergent light source.

    Science.gov (United States)

    Dekker, Kurtis H; Battista, Jerry J; Jordan, Kevin J

    2016-04-07

    Optical cone beam computed tomography (CBCT) using a broad beam and CCD camera is a fast method for densitometry of 3D optical gel dosimeters. However, diffuse light sources introduce considerable stray light into the imaging system, leading to underestimation of attenuation coefficients and non-uniformities in CT images unless corrections are applied to each projection image. In this study, the light source of a commercial optical CT scanner is replaced with a convergent cone beam source consisting of almost exclusively image forming primary rays. The convergent source is achieved using a small isotropic source and a Fresnel lens. To characterize stray light effects, full-field cone beam CT imaging is compared to fan beam CT (FBCT) using a 1 cm high fan beam aperture centered on the optic axis of the system. Attenuating liquids are scanned within a large 96 mm diameter uniform phantom and in a small 13.5 mm diameter finger phantom. For the uniform phantom, cone and fan beam CT attenuation coefficients agree within a maximum deviation of (1  ±  2)% between mean values over a wide range from 0.036 to 0.43 cm(-1). For the finger phantom, agreement is found with a maximum deviation of (4  ±  2)% between mean values over a range of 0.1-0.47 cm(-1). With the convergent source, artifacts associated with refractive index mismatch and vessel optical features are more pronounced. Further optimization of the source size to achieve a balance between quantitative accuracy and artifact reduction should enable practical, accurate 3D dosimetry, avoiding time consuming 3D scatter measurements.


  6. Web-based computational chemistry education with CHARMMing II: Coarse-grained protein folding.

    Science.gov (United States)

    Pickard, Frank C; Miller, Benjamin T; Schalk, Vinushka; Lerner, Michael G; Woodcock, H Lee; Brooks, Bernard R

    2014-07-01

    A lesson utilizing a coarse-grained (CG) Gō-like model has been implemented into the CHARMM INterface and Graphics (CHARMMing) web portal (www.charmming.org) to the Chemistry at HARvard Macromolecular Mechanics (CHARMM) molecular simulation package. While widely used to model various biophysical processes, such as protein folding and aggregation, CG models can also serve as an educational tool because they can provide qualitative descriptions of complex biophysical phenomena for a relatively cheap computational cost. As a proof of concept, this lesson demonstrates the construction of a CG model of a small globular protein, its simulation via Langevin dynamics, and the analysis of the resulting data. This lesson makes connections between modern molecular simulation techniques and topics commonly presented in an advanced undergraduate lecture on physical chemistry. It culminates in a straightforward analysis of a short dynamics trajectory of a small fast folding globular protein; we briefly describe the thermodynamic properties that can be calculated from this analysis. The assumptions inherent in the model and the data analysis are laid out in a clear, concise manner, and the techniques used are consistent with those employed by specialists in the field of CG modeling. One of the major tasks in building the Gō-like model is determining the relative strength of the nonbonded interactions between coarse-grained sites. New functionality has been added to CHARMMing to facilitate this process. The implementation of these features into CHARMMing helps automate many of the tedious aspects of constructing a CG Gō model. The CG model builder and its accompanying lesson should be a valuable tool to chemistry students, teachers, and modelers in the field.
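The simulation step of the lesson, Langevin dynamics of a coarse-grained model, can be illustrated with a minimal integrator. This is a generic overdamped Euler-Maruyama sketch for a single CG bead in a harmonic well, not CHARMM or CHARMMing code, and the parameters are arbitrary reduced units:

```python
import numpy as np

rng = np.random.default_rng(0)

def overdamped_langevin(x0, k=1.0, gamma=1.0, kT=1.0, dt=1e-3, steps=20000):
    """Euler-Maruyama integration of overdamped Langevin dynamics for one
    coarse-grained bead in a harmonic well U(x) = k x^2 / 2:
        dx = -(k / gamma) x dt + sqrt(2 kT / gamma) dW."""
    x = float(x0)
    traj = np.empty(steps)
    sigma = np.sqrt(2.0 * kT * dt / gamma)   # noise amplitude per step
    for i in range(steps):
        x += -(k / gamma) * x * dt + sigma * rng.standard_normal()
        traj[i] = x
    return traj

traj = overdamped_langevin(3.0)
# at equilibrium the position variance should approach kT / k for this well
```

A folding lesson replaces the harmonic force with the gradient of the Gō-like potential and tracks order parameters such as the fraction of native contacts, but the stochastic integration loop has the same shape.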

  7. Web-based computational chemistry education with CHARMMing II: Coarse-grained protein folding.

    Directory of Open Access Journals (Sweden)

    Frank C Pickard

    2014-07-01

    Full Text Available A lesson utilizing a coarse-grained (CG) Gō-like model has been implemented into the CHARMM INterface and Graphics (CHARMMing) web portal (www.charmming.org) to the Chemistry at HARvard Macromolecular Mechanics (CHARMM) molecular simulation package. While widely used to model various biophysical processes, such as protein folding and aggregation, CG models can also serve as an educational tool because they can provide qualitative descriptions of complex biophysical phenomena for a relatively cheap computational cost. As a proof of concept, this lesson demonstrates the construction of a CG model of a small globular protein, its simulation via Langevin dynamics, and the analysis of the resulting data. This lesson makes connections between modern molecular simulation techniques and topics commonly presented in an advanced undergraduate lecture on physical chemistry. It culminates in a straightforward analysis of a short dynamics trajectory of a small fast folding globular protein; we briefly describe the thermodynamic properties that can be calculated from this analysis. The assumptions inherent in the model and the data analysis are laid out in a clear, concise manner, and the techniques used are consistent with those employed by specialists in the field of CG modeling. One of the major tasks in building the Gō-like model is determining the relative strength of the nonbonded interactions between coarse-grained sites. New functionality has been added to CHARMMing to facilitate this process. The implementation of these features into CHARMMing helps automate many of the tedious aspects of constructing a CG Gō model. The CG model builder and its accompanying lesson should be a valuable tool to chemistry students, teachers, and modelers in the field.

  8. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production, and user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  11. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  12. Computational analysis of neutronic parameters for TRIGA Mark-II research reactor using evaluated nuclear data libraries

    Energy Technology Data Exchange (ETDEWEB)

    Uddin, M.N. [Department of Physics, Jahangirnagar University, Dhaka (Bangladesh); Sarker, M.M., E-mail: sarker_md@yahoo.co [Reactor Physics and Engineering Division, Institute of Nuclear Science and Technology, Atomic Energy Research Establishment, Ganakbari, Savar, GPO Box 3787, Dhaka-1000 (Bangladesh); Khan, M.J.H. [Reactor Physics and Engineering Division, Institute of Nuclear Science and Technology, Atomic Energy Research Establishment, Ganakbari, Savar, GPO Box 3787, Dhaka-1000 (Bangladesh); Islam, S.M.A. [Department of Physics, Jahangirnagar University, Dhaka (Bangladesh)

    2010-03-15

    The aim of this study is to analyze the neutronic parameters of the TRIGA Mark-II research reactor using the chain of NJOY-WIMS-CITATION computer codes based on the evaluated nuclear data libraries CENDL-2.2 and JEFF-3.1.1. The nuclear data processing code NJOY99.0 was employed to generate the 69-group WIMS library for the isotopes of the TRIGA core. The cell code WIMSD-5B was used to generate the cross sections in CITATION format, and then the 3-dimensional diffusion code CITATION was used to calculate the neutronic parameters of the TRIGA Mark-II research reactor. All the analyses were performed using the 7-group macroscopic cross section library. The CITATION test-runs using different cross section sets, based on different models applied in the WIMS calculations, showed a strong influence of those models on the final integral parameters. Some of the cells were specially treated with the PRIZE options available in WIMSD-5B to take into account the fine structure of the flux gradient in the fuel-reflector interface region. It was observed that, for two basic parameters, the effective multiplication factor k{sub eff} and the thermal neutron flux, the calculated results were in good agreement with each other as well as with the measured values. The maximum power densities at the hot spot were 1.0446E02 W/cc and 1.0426E02 W/cc for the libraries CENDL-2.2 and JEFF-3.1.1, respectively. The calculated total peaking factors of 5.793 and 5.745 were compared to the original SAR value of 5.6325 as well as the MCNP result. Consequently, this analysis should help enhance the neutronic calculations and can also be used for further thermal-hydraulic studies of the TRIGA core.
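As a quick sanity check on the numbers quoted in this abstract, the relative deviation of the two calculated total peaking factors from the SAR value of 5.6325 works out to roughly +2.8% and +2.0% (the input values below are copied from the abstract; the comparison itself is ours, not part of the record):

```python
# Total peaking factors quoted in the abstract, compared against the SAR value.
sar = 5.6325
calc = {"CENDL-2.2": 5.793, "JEFF-3.1.1": 5.745}
dev = {lib: 100.0 * (v - sar) / sar for lib, v in calc.items()}
for lib, d in dev.items():
    print(f"{lib}: {d:+.2f}% relative to SAR")
```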

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  14. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television as many long feared, but the computer: the ubiquitous portal of work and personal lives. At this point, the computer is so common we hardly notice it in our view. It is difficult to envision that, not that long ago, it was a gigantic, room-sized structure accessible only to a few, inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little more noted than a toaster. These dramati

  15. Response to a field of the D = 3 Ising spin glass with Janus and JanusII dedicated computers

    Science.gov (United States)

    Seoane, Beatriz; Janus Collaboration

    Using the Janus dedicated computer and its new generation, JanusII, we study the linear response to a field of the Edwards-Anderson model for times that cover twelve orders of magnitude. The fluctuation-dissipation relations are investigated for several values of tw. We observe that the violations of the fluctuation-dissipation theorem can be directly related to the P(q) measured in equilibrium at finite sizes, although a simple statics-dynamics dictionary L ↔ ξ(tw) is not enough to account for the behavior at large times. We show that the equivalence can be easily restored by taking into account the growth of ξ(t + tw). Interestingly, experimental measurements of the spin glass correlation length rely precisely on the response of a spin glass to a field, although a direct relation between the measured object and the real ξ has never been established. In this work, we mimic the experimental protocol with Janus data, which lets us relate the experimental ξ with the length extracted from the spatial correlation function. These results allow us for the first time to make a quantitative comparison between experiments and simulations, finding surprisingly good agreement with measurements in superspin glasses. This project has received funding from the European Union's Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie grant agreement No. 654971, the ERC grant CRIPHERASY (No. 247328) and from MINECO (Spain) (No. FIS2012-35719-C02).
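For context on the quantities mentioned above, the equilibrium fluctuation-dissipation theorem (FDT) and its standard off-equilibrium generalization, which is what links the measured response to the overlap distribution P(q), can be written as follows. These are textbook relations, not formulas reproduced from this record:

```latex
% Equilibrium FDT: response R and correlation C at temperature T
R(t,t_w) \;=\; \frac{1}{T}\,\frac{\partial C(t,t_w)}{\partial t_w}
% Integrated response (zero-field-cooled susceptibility) with
% fluctuation-dissipation ratio X(C):
\chi(t,t_w) \;=\; \int_{t_w}^{t} R(t,s)\,\mathrm{d}s
            \;=\; \frac{1}{T}\int_{C(t,t_w)}^{1} X(C')\,\mathrm{d}C'
% Static link: in the aging regime X is the cumulative overlap distribution
X(C) \;=\; \int_{0}^{C} P(q)\,\mathrm{d}q
```

Violations of the FDT appear as X(C) < 1, which is why measuring the response to a field probes the same physics as the equilibrium P(q).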

  16. Experimental and computational studies on 4-[(3,5-dimethyl-1H-pyrazol-1-yl)methoxy]phthalonitrile and synthesis and spectroscopic characterization of its novel phthalocyanine magnesium(II) and tin(II) metal complexes.

    Science.gov (United States)

    Akçay, Hakkı Türker; Bayrak, Rıza; Sahin, Ertan; Karaoğlu, Kaan; Demirbaş, Umit

    2013-10-01

    The molecular structure of the substituted phthalonitrile was analyzed crystallographically and compared with the optimized geometric structure. The structural properties of the compound, such as energy, vibrational frequencies, ground state transitions, (1)H and (13)C NMR chemical shifts, NBO analysis and hyperpolarizability, were computed by the DFT (Density Functional Theory) method and compared with experimental results. The novel Mg(II) and Sn(II) phthalocyanines were synthesized from the substituted phthalonitrile, and their aggregation behaviors were investigated in different solvents and at different concentrations in DMSO.

  17. Comparison of the Pentax Airwayscope, Glidescope Video Laryngoscope, and Macintosh Laryngoscope During Chest Compression According to Bed Height.

    Science.gov (United States)

    Kim, Wonhee; Lee, Yoonje; Kim, Changsun; Lim, Tae Ho; Oh, Jaehoon; Kang, Hyunggoo; Lee, Sanghyun

    2016-02-01

    We aimed to investigate whether bed height affects intubation performance in the setting of cardiopulmonary resuscitation, and which type of laryngoscope shows the best performance at each bed height. A randomized crossover manikin study was conducted. Twenty-one participants were enrolled and randomly allocated to 2 groups: group A (n = 10) and group B (n = 11). The participants underwent emergency endotracheal intubation (ETI) using the Airwayscope (AWS), Glidescope video laryngoscope, and Macintosh laryngoscope in random order while chest compression was performed. Each ETI was conducted at 2 levels of bed height (minimum bed height: 68.9 cm and maximum bed height: 101.3 cm). The primary outcomes were the time to intubation (TTI) and the success rate of ETI. The P value for statistical significance was set at 0.05, and at 0.017 in the post-hoc test. The success rate of ETI was always 100% regardless of the type of laryngoscope or the bed height. TTI was not significantly different between the 2 bed heights regardless of the type of laryngoscope (all P > 0.05). The time for the AWS was the shortest among the 3 laryngoscopes at both bed heights (13.7 ± 3.6 s at the minimum bed height and 13.4 ± 4.7 s at the maximum bed height) (all P < 0.017). The bed height, whether adjusted to the minimum or maximum setting, did not affect intubation performance. In addition, regardless of the bed height, the intubation time with the video laryngoscopes, especially the AWS, was significantly shorter than that with the direct laryngoscope during chest compression.

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  20. A comparison of the suction laryngoscope and the Macintosh laryngoscope in emergency medical technicians: a manikin model of severe airway haemorrhage.

    Science.gov (United States)

    Mitterlechner, T; Wipp, A; Herff, H; Wenzel, V; Strasak, A M; Felbinger, T W; Schmittinger, C A

    2012-01-01

    The use of a suction laryngoscope that enables simultaneous suction and laryngoscopy was evaluated. 34 emergency medical technicians intubated the trachea of a manikin with simulated upper airway haemorrhage using the suction laryngoscope and the Macintosh laryngoscope, in random order. When using the suction laryngoscope, the number of oesophageal intubations was lower (3/34 vs 11/34; p=0.021) and the time taken to intubation was shorter (mean (SD) 50 (15) vs 58 (27) s; p=0.041). In cases of airway haemorrhage, the use of the suction laryngoscope might be beneficial.

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  2. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  8. Retention of laryngoscopy skills in medical students: a randomised, cross-over study of the Macintosh, A.P. Advance(™) , C-MAC(®) and Airtraq(®) laryngoscopes.

    Science.gov (United States)

    Hunter, I; Ramanathan, V; Balasubramanian, P; Evans, D A; Hardman, J G; McCahon, R A

    2016-10-01

    In addition to being effective and easy to learn to use, the ideal laryngoscope should be associated with minimal reduction in skill performance during gaps in practice over time. We compared the time taken to intubate the trachea of a manikin by novice medical students immediately after training, and then after 1 month with no intervening practice. We designed a two-period, four-group, randomised, cross-over trial to compare the Macintosh, Venner(™) A.P. Advance(™) with difficult airway blade, C-MAC(®) with D-Blade and Airtraq(®) with wireless video-viewer. A bougie was used to aid intubation with the Macintosh and the C-MAC. After training, there was no significant difference in median (IQR [range]) intubation time using the videolaryngoscopes compared with the Macintosh, which took 30 (26.5-35 [12-118]) s. One month later, the intubation time was longer using the C-MAC (41 (29.5-52 [20-119]) s; p = 0.002) and the A.P. Advance (40 (28.5-57.5 [21-107]) s; p = 0.0003) compared with the Macintosh (27 (21-29 [16-90]) s); there was no difference using the Airtraq (27 (20.5-32.5 [15-94]) s; p = 0.258) compared with the Macintosh. While skill acquisition after a brief period of learning and practice was equal for each laryngoscope, performance levels differed after 1 month without practice. In particular, the consistency of performance using the C-MAC and A.P. Advance was worse compared with the Macintosh and the Airtraq. While the clinical significance of this is doubtful, we believe that reliable and consistent performance at laryngoscopy is desirable; for the devices that we tested, this requires regular practice.

  9. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identifying possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  11. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  12. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users; we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  14. COMPUTING

    CERN Document Server

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  16. User's instructions for ORCENT II: a digital computer program for the analysis of steam turbine cycles supplied by light-water-cooled reactors

    Energy Technology Data Exchange (ETDEWEB)

    Fuller, L.C.

    1979-02-01

    The ORCENT-II digital computer program will perform calculations at valves-wide-open design conditions, maximum guaranteed rating conditions, and an approximation of part-load conditions for steam turbine cycles supplied with throttle steam characteristic of contemporary light-water reactors. Turbine performance calculations are based on a method published by the General Electric Company. Output includes all information normally shown on a turbine-cycle heat balance diagram. The program is written in FORTRAN IV for the IBM System/360 digital computers at the Oak Ridge National Laboratory.

  17. Comparative evaluation of platelet-rich fibrin versus beta-tri-calcium phosphate in the treatment of Grade II mandibular furcation defects using cone-beam computed tomography

    Science.gov (United States)

    Siddiqui, Zeba Rahman; Jhingran, Rajesh; Bains, Vivek Kumar; Srivastava, Ruchi; Madan, Rohit; Rizvi, Iram

    2016-01-01

    Objective: The objective of the study was to evaluate clinically and radiographically the efficacy of platelet-rich fibrin (PRF) versus β-tri-calcium phosphate (β-TCP) in the treatment of Grade II mandibular furcation defects. Materials and Methods: Forty-five Grade II furcation defects in mandibular molars were assigned to open flap debridement (OFD) with PRF (Group I, n = 15), OFD with β-TCP (Group II, n = 15), or OFD alone (Group III, n = 15), and were analyzed for clinical parameters (probing pocket depth [PPD], vertical clinical attachment level [VCAL], horizontal clinical attachment level [HCAL], gingival recession, relative vertical height of furcation [r-VHF], and relative horizontal depth of furcation [r-HDF]) and radiographic parameters (horizontal depth of furcation [H-DOF], vertical height of furcation [V-HOF]) using cone-beam computed tomography (CBCT) at a 6-month interval. Results: For clinical parameters, reduction in PPD and gain in VCAL and HCAL were higher in Group II as compared to Group I. Change in r-VHF and r-HDF was greater in Group II as compared to Group I. Mean percentage clinical vertical defect fill was higher in Group II as compared to Group I (58.52% ± 11.68% vs. 53.24% ± 13.22%, respectively). On CBCT, mean change at 6 months for all parameters showed a nonsignificant difference between the two experimental groups. Mean change in V-HOF was higher in Group I as compared to Group II, but mean change in H-DOF and furcation width was greater in Group II as compared to Group I. Conclusion: For both experimental and control groups, there was statistically significant improvement at 6 months follow-up from baseline values. PMID:28042265

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS components are now also deployed at CERN, adding to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  19. Airway Management with Cervical Spine Immobilisation: A Comparison between the Macintosh Laryngoscope, Truview Evo2, and Totaltrack VLM Used by Novices—A Manikin Study

    Science.gov (United States)

    Gaszyński, Tomasz

    2016-01-01

    Airway management in patients with suspected cervical spine injury plays an important role in the pathway of care of trauma patients. The aim of this study was to evaluate three different airway devices during intubation of a patient with reduced cervical spine mobility. Forty students of the third year of emergency medicine studies participated in the study (F = 26, M = 14). The time required to obtain a view of the entry to the larynx and successful ventilation time were recorded. Cormack-Lehane laryngoscopic view and damage to the incisors were also assessed. All three airway devices were used by each student (a novice) in randomised order. The mean time required to obtain the entry-to-the-larynx view was shortest for the Macintosh laryngoscope, at 13.4 s (±2.14). Truview Evo2 had the shortest successful ventilation time, at 35.7 s (±9.27). The best view of the entry to the larynx was obtained with the Totaltrack VLM device. The Truview Evo2 and Totaltrack VLM may be an alternative to the classic Macintosh laryngoscope for intubation of trauma patients with suspected injury to the cervical spine. The new devices enable a better laryngoscopic view as well as less incisor damage during intubation. PMID:27034926

  20. Evaluation of intubation using the Airtraq or Macintosh laryngoscope by anaesthetists in easy and simulated difficult laryngoscopy--a manikin study.

    LENUS (Irish Health Repository)

    Maharaj, C H

    2006-05-01

    The Airtraq Laryngoscope is a novel intubation device which allows visualisation of the vocal cords without alignment of the oral, pharyngeal and tracheal axes. We compared the Airtraq with the Macintosh laryngoscope in simulated easy and difficult laryngoscopy. Twenty-five anaesthetists were allowed up to three attempts to intubate the trachea in each of three laryngoscopy scenarios using a Laerdal Intubation Trainer followed by five scenarios using a Laerdal SimMan Manikin. Each anaesthetist then performed tracheal intubation of the normal airway a second time to characterise the learning curve. In the simulated easy laryngoscopy scenarios, there was no difference between the Airtraq and the Macintosh in success of tracheal intubation. The time taken to intubate at the end of the protocol was significantly lower using the Airtraq (9.5 (6.7) vs. 14.2 (7.4) s), demonstrating a rapid acquisition of skills. In the simulated difficult laryngoscopy scenarios, the Airtraq was more successful in achieving tracheal intubation, required less time to intubate successfully, caused less dental trauma, and was considered by the anaesthetists to be easier to use.

  1. Comparison of the McGrath® Series 5 and GlideScope® Ranger with the Macintosh laryngoscope by paramedics

    Directory of Open Access Journals (Sweden)

    Werner Christian

    2011-01-01

    Full Text Available Abstract Background Out-of-hospital endotracheal intubation performed by paramedics using the Macintosh blade for direct laryngoscopy is associated with a high incidence of complications. The novel technique of video laryngoscopy has been shown to improve glottic view and intubation success in the operating room. The aim of this study was to compare glottic view, time of intubation and success rate of the McGrath® Series 5 and GlideScope® Ranger video laryngoscopes with the Macintosh laryngoscope by paramedics. Methods Thirty paramedics performed six intubations in a randomised order with all three laryngoscopes in an airway simulator with a normal airway. Subsequently, every participant performed one intubation attempt with each device in the same manikin with simulated cervical spine rigidity using a cervical collar. Glottic view, time until visualisation of the glottis and time until first ventilation were evaluated. Results Time until first ventilation was equivalent after three intubations in the first scenario. In the scenario with decreased cervical motion, the time until first ventilation was longer using the McGrath® compared to the GlideScope® and Macintosh devices. Conclusions The learning curve for video laryngoscopy in paramedics was steep in this study. However, these data do not support prehospital use of the McGrath® and GlideScope® devices by paramedics.

  2. Spectroscopic and computational characterization of CuII-OOR (R = H or cumyl) complexes bearing a Me6-tren ligand.

    Science.gov (United States)

    Choi, Yu Jin; Cho, Kyung-Bin; Kubo, Minoru; Ogura, Takashi; Karlin, Kenneth D; Cho, Jaeheung; Nam, Wonwoo

    2011-03-14

    A copper(II)-hydroperoxo complex, [Cu(Me(6)-tren)(OOH)](+) (2), and a copper(II)-cumylperoxo complex, [Cu(Me(6)-tren)(OOC(CH(3))(2)Ph)](+) (3), were synthesized by reacting [Cu(Me(6)-tren)(CH(3)CN)](2+) (1) with H(2)O(2) and cumyl-OOH, respectively, in the presence of triethylamine. These intermediates, 2 and 3, were successfully characterized by various physicochemical methods such as UV-vis, ESI-MS, resonance Raman and EPR spectroscopies, leading us to propose structures of the Cu(II)-OOR species with a trigonal-bipyramidal geometry. Density functional theory (DFT) calculations provided geometric and electronic configurations of 2 and 3, showing trigonal bipyramidal copper(II)-OOR geometries. These copper(II)-hydroperoxo and -cumylperoxo complexes were inactive in electrophilic and nucleophilic oxidation reactions.

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  4. Is computer aided detection (CAD) cost effective in screening mammography? A model based on the CADET II study

    Science.gov (United States)

    2011-01-01

    Background Single reading with computer aided detection (CAD) is an alternative to double reading for detecting cancer in screening mammograms. The aim of this study is to investigate whether the use of a single reader with CAD is more cost-effective than double reading. Methods Based on data from the CADET II study, the cost-effectiveness of single reading with CAD versus double reading was measured in terms of cost per cancer detected. Cost (Pound (£), year 2007/08) of single reading with CAD versus double reading was estimated assuming a health and social service perspective and a 7 year time horizon. As the equipment cost varies according to the unit size a separate analysis was conducted for high, average and low volume screening units. One-way sensitivity analyses were performed by varying the reading time, equipment and assessment cost, recall rate and reader qualification. Results CAD is cost increasing for all sizes of screening unit. The introduction of CAD is cost-increasing compared to double reading because the cost of CAD equipment, staff training and the higher assessment cost associated with CAD are greater than the saving in reading costs. The introduction of single reading with CAD, in place of double reading, would produce an additional cost of £227 and £253 per 1,000 women screened in high and average volume units respectively. In low volume screening units, the high cost of purchasing the equipment will result in an additional cost of £590 per 1,000 women screened. One-way sensitivity analysis showed that the factors having the greatest effect on the cost-effectiveness of CAD with single reading compared with double reading were the reading time and the reader's professional qualification (radiologist versus advanced practitioner). Conclusions Without improvements in CAD effectiveness (e.g. a decrease in the recall rate) CAD is unlikely to be a cost effective alternative to double reading for mammography screening in UK. This study
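    The incremental-cost figures above (£227, £253, and £590 per 1,000 women screened) follow from simple additive arithmetic: the added CAD costs minus the saving from dropping the second reader. A minimal sketch in Python; all input figures below are invented for illustration and are not the CADET II cost data:

```python
# Hypothetical sketch of the incremental-cost arithmetic described above.
# All per-1,000-women figures are illustrative assumptions, NOT study inputs.

def incremental_cost_per_1000(cad_equipment: float, training: float,
                              extra_assessment: float,
                              reading_saving: float) -> float:
    """Extra cost per 1,000 women screened of single reading with CAD
    versus double reading: added costs minus the saved reading cost."""
    return cad_equipment + training + extra_assessment - reading_saving

# Invented numbers, chosen only to show the structure of the calculation:
cost = incremental_cost_per_1000(cad_equipment=150.0, training=40.0,
                                 extra_assessment=87.0, reading_saving=50.0)
print(f"additional cost per 1,000 women screened: £{cost:.0f}")  # → £227
```

    A one-way sensitivity analysis of the kind the study describes amounts to re-running this calculation while varying one input (e.g. the equipment cost) and holding the others fixed.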

  5. Is computer aided detection (CAD cost effective in screening mammography? A model based on the CADET II study

    Directory of Open Access Journals (Sweden)

    Wallis Matthew G

    2011-01-01

    Full Text Available Abstract Background Single reading with computer aided detection (CAD) is an alternative to double reading for detecting cancer in screening mammograms. The aim of this study is to investigate whether the use of a single reader with CAD is more cost-effective than double reading. Methods Based on data from the CADET II study, the cost-effectiveness of single reading with CAD versus double reading was measured in terms of cost per cancer detected. Cost (Pound (£), year 2007/08) of single reading with CAD versus double reading was estimated assuming a health and social service perspective and a 7 year time horizon. As the equipment cost varies according to the unit size a separate analysis was conducted for high, average and low volume screening units. One-way sensitivity analyses were performed by varying the reading time, equipment and assessment cost, recall rate and reader qualification. Results CAD is cost increasing for all sizes of screening unit. The introduction of CAD is cost-increasing compared to double reading because the cost of CAD equipment, staff training and the higher assessment cost associated with CAD are greater than the saving in reading costs. The introduction of single reading with CAD, in place of double reading, would produce an additional cost of £227 and £253 per 1,000 women screened in high and average volume units respectively. In low volume screening units, the high cost of purchasing the equipment will result in an additional cost of £590 per 1,000 women screened. One-way sensitivity analysis showed that the factors having the greatest effect on the cost-effectiveness of CAD with single reading compared with double reading were the reading time and the reader's professional qualification (radiologist versus advanced practitioner). Conclusions Without improvements in CAD effectiveness (e.g. a decrease in the recall rate) CAD is unlikely to be a cost effective alternative to double reading for mammography screening

  6. Palladium(II) complex with thiazole containing tridentate ONN donor ligand: Synthesis, X-ray structure and DFT computation

    Science.gov (United States)

    Biswas, Sujan; Pramanik, Ajoy Kumar; Mondal, Tapan Kumar

    2015-05-01

    A new palladium(II) complex with 2-(2-thiazolyl)-4-methylphenol (TAC), having the general formula [Pd(TAC)Cl] (1), has been synthesized and characterized by various spectroscopic techniques. The single crystal X-ray structure shows a distorted square planar geometry around palladium(II). Cyclic voltammetric studies show ligand-based irreversible oxidation and reduction peaks. The electronic structure, redox properties and electronic excitations in the complex are interpreted by DFT and TDDFT calculations.

  7. New fluorescent azo-Schiff base Cu(II) and Zn(II) metal chelates; spectral, structural, electrochemical, photoluminescence and computational studies

    Science.gov (United States)

    Purtas, Fatih; Sayin, Koray; Ceyhan, Gokhan; Kose, Muhammet; Kurtoglu, Mukerrem

    2017-06-01

    A new Schiff base containing an azo chromophore group, obtained by condensation of 2-hydroxy-4-[(E)-phenyldiazenyl]benzaldehyde with 3,4-dimethylaniline (HL), was used for the syntheses of new copper(II) and zinc(II) chelates, [Cu(L)2] and [Zn(L)2], which were characterized by physico-chemical and spectroscopic methods such as 1H and 13C NMR, IR, UV-Vis and elemental analyses. The solid state structure of the ligand was characterized by a single crystal X-ray diffraction study. The X-ray diffraction data were then used to calculate the harmonic oscillator model of aromaticity (HOMA) indexes for the rings so as to investigate the enol-imine and keto-amine tautomeric forms in the solid state. The phenol ring C10-C15 shows a considerable deviation from aromaticity, with a HOMA value of 0.837, suggesting a shift towards the keto-amine tautomeric form in the solid state. The analytical data show that the metal-to-ligand ratio in the chelates is 1:2. Theoretical calculations of the possible isomers of the ligand and the two metal complexes were performed by using the B3LYP method. Electrochemical and photoluminescence properties of the synthesized azo-Schiff bases were also investigated.

  8. Computational analysis of the MCoTI-II plant defence knottin reveals a novel intermediate conformation that facilitates trypsin binding

    Science.gov (United States)

    Jones, Peter M.; George, Anthony M.

    2016-03-01

    MCoTI-I and II are plant defence proteins, potent trypsin inhibitors from the bitter gourd Momordica cochinchinensis. They are members of the Knottin Family, which display exceptional stability due to unique topology comprising three interlocked disulfide bridges. Knottins show promise as scaffolds for new drug development. A crystal structure of trypsin-bound MCoTI-II suggested that loop 1, which engages the trypsin active site, would show decreased dynamics in the bound state, an inference at odds with an NMR analysis of MCoTI-I, which revealed increased dynamics of loop 1 in the presence of trypsin. To investigate this question, we performed unrestrained MD simulations of trypsin-bound and free MCoTI-II. This analysis found that loop 1 of MCoTI-II is not more dynamic in the trypsin-bound state than in the free state. However, it revealed an intermediate conformation, transitional between the free and bound MCoTI-II states. The data suggest that MCoTI-II binding involves a process in which initial interaction with trypsin induces transitions between the free and intermediate conformations, and fluctuations between these states account for the increase in dynamics of loop 1 observed for trypsin-bound MCoTI-I. The MD analysis thus revealed new aspects of the inhibitors’ dynamics that may be of utility in drug design.

  9. Comparison of interradicular distances and cortical bone thickness in Thai patients with class I and class II skeletal patterns using cone-beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Khumsarn, Nattida [Dental Division of Lamphun Hospital, Lamphun (Thailand); Patanaporn, Virush; Janhom, Apirum; Jotikasthira, Dhirawat [Faculty of Dentistry, Chiang Mai University, Chiang Mai (Thailand)

    2016-06-15

    This study evaluated and compared interradicular distances and cortical bone thickness in Thai patients with Class I and Class II skeletal patterns, using cone-beam computed tomography (CBCT). Pretreatment CBCT images of 24 Thai orthodontic patients with Class I and Class II skeletal patterns were included in the study. Three measurements were chosen for investigation: the mesiodistal distance between the roots, the width of the buccolingual alveolar process, and buccal cortical bone thickness. All distances were recorded at five different levels from the cementoenamel junction (CEJ). Descriptive statistical analysis and t-tests were performed, with the significance level for all tests set at p<0.05. Patients with a Class II skeletal pattern showed significantly greater maxillary mesiodistal distances (between the first and second premolars) and widths of the buccolingual alveolar process (between the first and second molars) than Class I skeletal pattern patients at 10 mm above the CEJ. The maxillary buccal cortical bone thicknesses between the second premolar and first molar at 8 mm above the CEJ in Class II patients were likewise significantly greater than in Class I patients. Patients with a Class I skeletal pattern showed significantly wider mandibular buccolingual alveolar processes than did Class II patients (between the first and second molars) at 4, 6, and 8 mm below the CEJ. In both the maxilla and mandible, the mesiodistal distances, the width of the buccolingual alveolar process, and buccal cortical bone thickness tended to increase from the CEJ to the apex in both Class I and Class II skeletal patterns.
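    The group comparisons reported above rest on independent-samples t-tests with significance set at p < 0.05. The core statistic can be sketched in a few lines of pure Python; Welch's form (which does not assume equal variances) is used here, and the sample distances below are invented rather than taken from the study's CBCT measurements:

```python
# Minimal sketch of a two-sample comparison like those described above:
# Welch's t statistic for two independent groups (invented data).
import math

def welch_t(a, b):
    """Return Welch's t statistic for two independent samples."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)  # sample variance, ddof=1
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    return (ma - mb) / math.sqrt(va / na + vb / nb)

# Invented interradicular distances (mm) for two skeletal-pattern groups:
class_i  = [1.8, 2.1, 2.4]
class_ii = [2.8, 3.1, 3.4]
print(round(welch_t(class_i, class_ii), 3))  # → -4.082
```

    In practice the t value is then converted to a p-value from the t distribution (e.g. with scipy.stats.ttest_ind) and compared against the 0.05 threshold.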

  10. Comparison of interradicular distances and cortical bone thickness in Thai patients with Class I and Class II skeletal patterns using cone-beam computed tomography

    Science.gov (United States)

    Khumsarn, Nattida; Patanaporn, Virush; Jotikasthira, Dhirawat

    2016-01-01

    Purpose This study evaluated and compared interradicular distances and cortical bone thickness in Thai patients with Class I and Class II skeletal patterns, using cone-beam computed tomography (CBCT). Materials and Methods Pretreatment CBCT images of 24 Thai orthodontic patients with Class I and Class II skeletal patterns were included in the study. Three measurements were chosen for investigation: the mesiodistal distance between the roots, the width of the buccolingual alveolar process, and buccal cortical bone thickness. All distances were recorded at five different levels from the cementoenamel junction (CEJ). Descriptive statistical analysis and t-tests were performed, with the significance level for all tests set at p<0.05. Results Patients with a Class II skeletal pattern showed significantly greater maxillary mesiodistal distances (between the first and second premolars) and widths of the buccolingual alveolar process (between the first and second molars) than Class I skeletal pattern patients at 10 mm above the CEJ. The maxillary buccal cortical bone thicknesses between the second premolar and first molar at 8 mm above the CEJ in Class II patients were likewise significantly greater than in Class I patients. Patients with a Class I skeletal pattern showed significantly wider mandibular buccolingual alveolar processes than did Class II patients (between the first and second molars) at 4, 6, and 8 mm below the CEJ. Conclusion In both the maxilla and mandible, the mesiodistal distances, the width of the buccolingual alveolar process, and buccal cortical bone thickness tended to increase from the CEJ to the apex in both Class I and Class II skeletal patterns. PMID:27358819

  11. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2015-10-28

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  12. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    CERN Document Server

    Habib, Salman; LeCompte, Tom; Marshall, Zach; Borgland, Anders; Viren, Brett; Nugent, Peter; Asai, Makoto; Bauerdick, Lothar; Finkel, Hal; Gottlieb, Steve; Hoeche, Stefan; Sheldon, Paul; Vay, Jean-Luc; Elmer, Peter; Kirby, Michael; Patton, Simon; Potekhin, Maxim; Yanny, Brian; Calafiura, Paolo; Dart, Eli; Gutsche, Oliver; Izubuchi, Taku; Lyon, Adam; Petravick, Don

    2015-01-01

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  13. Synthesis, characterization, computational studies, antimicrobial activities and carbonic anhydrase inhibitor effects of 2-hydroxy acetophenone-N-methyl p-toluenesulfonylhydrazone and its Co(II), Pd(II), Pt(II) complexes

    Science.gov (United States)

    Özbek, Neslihan; Alyar, Saliha; Memmi, Burcu Koçak; Gündüzalp, Ayla Balaban; Bahçeci, Zafer; Alyar, Hamit

    2017-01-01

    2-Hydroxyacetophenone-N-methyl p-toluenesulfonylhydrazone (afptsmh), derived from p-toluenesulfonic acid 1-methylhydrazide (ptsmh), and its Co(II), Pd(II), Pt(II) complexes were synthesized for the first time. The synthesized compounds were characterized by spectroscopic methods (FT-IR, 1H-13C NMR, LC-MS, UV-vis), magnetic susceptibility and conductivity measurements. 1H and 13C shielding tensors for the crystal structure of the ligand were calculated with GIAO/DFT/B3LYP/6-311++G(d,p) methods in CDCl3. The vibrational band assignments were performed at the B3LYP/6-311++G(d,p) theory level combined with the scaled quantum mechanics force field (SQMFF) methodology. The antibacterial activities of the synthesized compounds were studied against some Gram-positive and Gram-negative bacteria by using microdilution and disc diffusion methods. In vitro enzyme inhibitory effects of the compounds were measured by UV-vis spectrophotometer. The enzyme activities against human carbonic anhydrase II (hCA II) were evaluated as IC50 (the half maximal inhibitory concentration) values. It was found that afptsmh and its metal complexes have inhibitory effects on the hCA II isoenzyme. General esterase activities were determined using alpha and beta naphthyl acetate substrates (α- and β-NAs) of Drosophila melanogaster (D. melanogaster). Activity results show that afptsmh does not strongly affect the bacteria strains and also shows poor inhibitory activity against the hCA II isoenzyme, whereas all complexes possess higher biological activities.

  14. Computational Analysis of Intra-Ventricular Flow Pattern Under Partial and Full Support of BJUT-II VAD

    Science.gov (United States)

    Zhang, Qi; Gao, Bin; Chang, Yu

    2017-01-01

    Background Partial support, as a novel support mode, has been widely applied in clinical practice and widely studied. However, the precise mechanism of partial support of LVAD in the intra-ventricular flow pattern is unclear. Material/Methods In this study, a patient-specific left ventricular geometric model was reconstructed based on CT data. The intra-ventricular flow pattern under 3 simulated conditions – “heart failure”, “partial support”, and “full support” – was simulated by using fluid-structure interaction (FSI). The blood flow pattern, wall shear stress (WSS), time-average wall shear stress (TAWSS), oscillatory shear index (OSI), and relative residence time (RRT) were calculated to evaluate the hemodynamic effects. Results The results demonstrate that the intra-ventricular flow pattern is significantly changed by the support level of BJUT-II VAD. The intra-ventricular vortex was enhanced under partial support and was eliminated under full support, and the high OSI and RRT regions changed from the septum wall to the cardiac apex. Conclusions In brief, the support level of the BJUT-II VAD has significant effects on the intra-ventricular flow pattern. The partial support mode of BJUT-II VAD can enhance the intra-ventricular vortex, while the distribution of high OSI and RRT moved from the septum wall to the cardiac apex. Hence, the partial support mode of BJUT-II VAD can provide more benefit for the intra-ventricular flow pattern. PMID:28239142

  15. Computational Analysis of Intra-Ventricular Flow Pattern Under Partial and Full Support of BJUT-II VAD.

    Science.gov (United States)

    Zhang, Qi; Gao, Bin; Chang, Yu

    2017-02-27

    BACKGROUND Partial support, as a novel support mode, has been widely applied in clinical practice and widely studied. However, the precise mechanism of partial support of LVAD in the intra-ventricular flow pattern is unclear. MATERIAL AND METHODS In this study, a patient-specific left ventricular geometric model was reconstructed based on CT data. The intra-ventricular flow pattern under 3 simulated conditions - "heart failure", "partial support", and "full support" - was simulated by using fluid-structure interaction (FSI). The blood flow pattern, wall shear stress (WSS), time-average wall shear stress (TAWSS), oscillatory shear index (OSI), and relative residence time (RRT) were calculated to evaluate the hemodynamic effects. RESULTS The results demonstrate that the intra-ventricular flow pattern is significantly changed by the support level of BJUT-II VAD. The intra-ventricular vortex was enhanced under partial support and was eliminated under full support, and the high OSI and RRT regions changed from the septum wall to the cardiac apex. CONCLUSIONS In brief, the support level of the BJUT-II VAD has significant effects on the intra-ventricular flow pattern. The partial support mode of BJUT-II VAD can enhance the intra-ventricular vortex, while the distribution of high OSI and RRT moved from the septum wall to the cardiac apex. Hence, the partial support mode of BJUT-II VAD can provide more benefit for the intra-ventricular flow pattern.

  16. Manual Indexes versus Computer-Aided Indexes: Comparing the Readers' Guide to Periodical Literature to InfoTrac II.

    Science.gov (United States)

    Reese, Carol

    1988-01-01

    The relative effectiveness of the CD-ROM information retrieval system, InfoTrac II, and the manual "Readers' Guide to Periodical Literature," was studied. Seventeen community college students were divided into two groups which researched the same questions either on CD-ROM or in the printed index. Results showed the "Readers'…

  17. Experimental and Computational Evidence for the Reduction Mechanisms of Aromatic N-oxides by Aqueous Fe(II)-Tiron Complex.

    Science.gov (United States)

    Chen, Yiling; Dong, Hao; Zhang, Huichun

    2016-01-05

    A combined experimental-theoretical approach was taken to elucidate the reduction mechanisms of five representative aromatic N-oxides (ANOs) by Fe(II)-tiron complex and to identify the rate-limiting step. Based on the possible types of complexes formed with the reductant, three groups of ANOs were studied: type I refers to those forming 5-membered ring complexes through the N and O atoms on the side chain; type II refers to those forming 6-membered ring complexes through the N-oxide O atom and the O atom on the side chain; and type III refers to complexation through the N-oxide O atom only. Density functional theory calculations suggested that the elementary reactions, including protonation, N-O bond cleavage, and the second electron transfer processes, are barrierless, indicating that the first electron transfer is rate-limiting. Consistent with the theoretical results, the experimental solvent isotope effect, KIE(H), for the reduction of quinoline N-oxide (a type III ANO) was obtained to be 1.072 ± 0.025, suggesting protonation was not involved in the rate-limiting step. The measured nitrogen kinetic isotope effect, KIE(N), for the reduction of pyridine N-oxide (a type III ANO) (1.022 ± 0.006) is in good agreement with the calculated KIE(N) for its first electron transfer (1.011-1.028), confirming that the first electron transfer is rate-limiting. Electrochemical cell experiments demonstrated that the electron transfer process can be facilitated significantly by type I complexation with FeL2(6-) (1:2 Fe(II)-tiron complex), to some extent by type II complexation with free Fe(II), but not by weak type III complexation.

  18. Teaching Inorganic Photophysics and Photochemistry with Three Ruthenium(II) Polypyridyl Complexes: A Computer-Based Exercise

    Science.gov (United States)

    Garino, Claudio; Terenzi, Alessio; Barone, Giampaolo; Salassa, Luca

    2016-01-01

    Among computational methods, DFT (density functional theory) and TD-DFT (time-dependent DFT) are widely used in research to describe, "inter alia," the optical properties of transition metal complexes. Inorganic/physical chemistry courses for undergraduate students treat such methods, but quite often only from the theoretical point of…

  19. Grid connected integrated community energy system. Phase II: final state 2 report. Cost benefit analysis, operating costs and computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    1978-03-22

    A grid-connected Integrated Community Energy System (ICES) with a coal-burning power plant located on the University of Minnesota campus is planned. The cost benefit analysis performed for this ICES, the cost accounting methods used, and a computer simulation of the operation of the power plant are described. (LCL)

  20. Multiyear interactive computer almanac, 1800-2050

    CERN Document Server

    United States. Naval Observatory

    2005-01-01

    The Multiyear Interactive Computer Almanac (MICA Version 2.2.2) is a software system created by the U.S. Naval Observatory's Astronomical Applications Department that runs on modern versions of Windows and Macintosh computers, designed especially for astronomers, surveyors, meteorologists, navigators and others who regularly need accurate information on the positions, motions, and phenomena of celestial objects. MICA produces high-precision astronomical data in tabular form, tailored for the times and locations specified by the user. Unlike traditional almanacs, MICA computes these data in real time, eliminating the need for table look-ups and additional hand calculations. MICA tables can be saved as standard text files, enabling their use in other applications. Several important new features have been added to this edition of MICA, including: extended date coverage from 1800 to 2050; a redesigned user interface; a graphical sky map; a phenomena calculator (eclipses, transits, equinoxes, solstices, conjunctions, oppo...

  1. Computational modeling for irrigated agriculture planning. Part II: risk analysis

    Directory of Open Access Journals (Sweden)

    João C. F. Borges Júnior

    2008-09-01

    Full Text Available Techniques for evaluating the risks arising from the uncertainties inherent in agricultural activity should accompany planning studies. Risk analysis can be carried out by simulation, using techniques such as the Monte Carlo method. This study was carried out to develop a computer program, called P-RISCO, for applying risk simulation to linear programming models, to apply it to a case study, and to test the results against the @RISK program. In the risk analysis it was observed that the mean of the output variable, total net present value (U), was considerably lower than the maximum U value obtained from the linear programming model. It was also verified that the enterprise faces considerable risk of water shortage in the month of April, which does not occur for the cropping pattern obtained by minimizing the irrigation requirement in April over the four years. The scenario analysis indicated that the sale price of the passion fruit crop strongly influences the financial performance of the enterprise. The comparative analysis verified the equivalence of the P-RISCO and @RISK programs in executing the risk simulation for the considered scenario.
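The Monte Carlo step described above can be sketched in a few lines: sample the uncertain sale price each trial and accumulate the discounted net revenue. The crop figures, price distribution, and discount rate below are hypothetical stand-ins, not values from P-RISCO or the case study:

```python
import random

def simulate_npv(price_mean, price_sd, yield_t, cost_t, rate, n_trials=10000, seed=42):
    """Monte Carlo sketch: draw a normally distributed crop price each year of
    each trial and accumulate the discounted net revenue over the horizon."""
    rng = random.Random(seed)
    npvs = []
    for _ in range(n_trials):
        npv = 0.0
        for t, (y, c) in enumerate(zip(yield_t, cost_t), start=1):
            price = rng.gauss(price_mean, price_sd)  # uncertain sale price
            npv += (price * y - c) / (1.0 + rate) ** t
        npvs.append(npv)
    return sum(npvs) / n_trials, min(npvs)

mean_npv, worst_npv = simulate_npv(
    price_mean=2.0, price_sd=0.5,          # price per kg (hypothetical)
    yield_t=[10000, 12000, 12000, 11000],  # kg/ha per year (hypothetical)
    cost_t=[15000, 9000, 9000, 9000],      # annual costs (hypothetical)
    rate=0.08)
```

As in the study, the simulated mean NPV sits below the deterministic optimum, and the lower tail of the distribution quantifies the downside risk.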

  2. Diagnostic Sensitivity of Multidetector-Row Spiral Computed Tomography Angiography in the Evaluation of Type-II Endoleaks and their Source: Comparison between Axial Scans and Reformatting Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Saba, L.; Pascalis, L.; Montisci, R.; Sanfilippo, R.; Mallarini, G. (Depts. of Radiology and Vascular Surgery, Azienda Ospedaliero-Universitaria di Cagliari, Polo di Monserrato, Monserrato, Cagliari (Italy))

    2008-07-15

    Background: After endovascular stent-graft placement, several complications may occur. Retrograde filling of the aneurysm (type-II endoleak) is the most common. Purpose: To evaluate the accuracy, image quality, and interobserver agreement of multidetector-row spiral computed tomography angiography (MDCTA) in the diagnosis of type-II endoleak, by using various types of reformatting techniques in comparison to regular axial images. Material and Methods: Twenty-four patients who had had endovascular repair of an infrarenal abdominal aortic aneurysm with stent graft were retrospectively studied. In 12 of 24 patients, a type-II endoleak was found. CT scans were obtained after intravenous administration of 130 ml of nonionic contrast material using a 4-6-ml/s flow rate. All patients were investigated with axial scans, multiplanar reconstruction (MPR), maximum intensity projection (MIP), shaded-surface display (SSD), and volume-rendering (VR) techniques. For each patient and for each reconstruction method, the image quality of the scans was scored as 0 for bad quality, 1 for poor quality, 2 for good quality, and 3 for excellent quality images. Two radiologists reviewed the CT images independently. Sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) were calculated for each reconstruction method, with the axial images as the reference method. Interobserver agreement and kappa value were also recorded. Results: MPR showed the highest sensitivity (83% and 67% for observers 1 and 2, respectively), PPV (91% and 80% for observers 1 and 2, respectively), and NPV (85% and 71% for observers 1 and 2, respectively), whereas VR showed the highest specificity (92% for both observer 1 and 2). Conclusion: Reformatting techniques provide good-quality images; nevertheless, their efficacy in the study of type-II endoleak was found to be suboptimal in comparison to regular axial images. The MPR technique is probably the best choice in conjunction
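The accuracy figures above follow from a standard 2x2 confusion matrix against the axial-image reference. A minimal sketch; the counts below are hypothetical, chosen only because they reproduce observer 1's reported MPR percentages with 12 endoleak-positive and 12 endoleak-negative patients:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion matrix,
    with the reference (axial-image) reading taken as ground truth."""
    sens = tp / (tp + fn)   # true positives among all reference-positives
    spec = tn / (tn + fp)   # true negatives among all reference-negatives
    ppv = tp / (tp + fp)    # positive predictive value
    npv = tn / (tn + fn)    # negative predictive value
    return sens, spec, ppv, npv

# Hypothetical counts for one reader and one reformatting technique:
sens, spec, ppv, npv = diagnostic_metrics(tp=10, fp=1, fn=2, tn=11)
# sens ~0.83, spec ~0.92, ppv ~0.91, npv ~0.85
```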

  3. Synthesis, structure determination, and spectroscopic/computational characterization of a series of Fe(II)-thiolate model complexes: implications for Fe-S bonding in superoxide reductases.

    Science.gov (United States)

    Fiedler, Adam T; Halfen, Heather L; Halfen, Jason A; Brunold, Thomas C

    2005-02-16

    A combined synthetic/spectroscopic/computational approach has been employed to prepare and characterize a series of Fe(II)-thiolate complexes that model the square-pyramidal [Fe(II)(N(His))(4)(S(Cys))] structure of the reduced active site of superoxide reductases (SORs), a class of enzymes that detoxify superoxide in air-sensitive organisms. The high-spin (S = 2) Fe(II) complexes [(Me(4)cyclam)Fe(SC(6)H(4)-p-OMe)]OTf (2) and [FeL]PF(6) (3) (where Me(4)cyclam = 1,4,8,11-tetramethylcyclam and L is the pentadentate monoanion of 1-thioethyl-4,8,11-trimethylcyclam) were synthesized and subjected to structural, magnetic, and electrochemical characterization. X-ray crystallographic studies confirm that 2 and 3 possess an N(4)S donor set similar to that found for the SOR active site and reveal molecular geometries intermediate between square pyramidal and trigonal bipyramidal for both complexes. Electronic absorption, magnetic circular dichroism (MCD), and variable-temperature variable-field MCD (VTVH-MCD) spectroscopies were utilized, in conjunction with density functional theory (DFT) and semiempirical INDO/S-CI calculations, to probe the ground and excited states of complexes 2 and 3, as well as the previously reported Fe(II) SOR model [(L(8)py(2))Fe(SC(6)H(4)-p-Me)]BF(4) (1) (where L(8)py(2) is a tetradentate pyridyl-appended diazacyclooctane macrocycle). These studies allow for a detailed interpretation of the S-->Fe(II) charge transfer transitions observed in the absorption and MCD spectra of complexes 1-3 and provide significant insights into the nature of Fe(II)-S bonding in complexes with axial thiolate ligation. Of the three models investigated, complex 3 exhibits an absorption spectrum that is particularly similar to the one reported for the reduced SOR enzyme (SOR(red)), suggesting that this model accurately mimics key elements of the electronic structure of the enzyme active site; namely, highly covalent Fe-S pi- and sigma-interactions. These spectral

  4. N-((5-chloropyridin-2-yl)carbamothioyl)furan-2-carboxamide and its Co(II), Ni(II) and Cu(II) complexes: Synthesis, characterization, DFT computations, thermal decomposition, antioxidant and antitumor activity

    Science.gov (United States)

    Yeşilkaynak, Tuncay; Özpınar, Celal; Emen, Fatih Mehmet; Ateş, Burhan; Kaya, Kerem

    2017-02-01

    N-((5-chloropyridin-2-yl)carbamothioyl)furan-2-carboxamide (HL: C11H8ClN3O2S) and its Co(II), Ni(II) and Cu(II) complexes have been synthesized and characterized by elemental analysis, FT-IR, 1H NMR and HR-MS methods. The HL was characterized by the single-crystal X-ray diffraction technique. It crystallizes in the monoclinic system with space group P21/c, Z = 4, and unit cell parameters a = 4.5437(5) Å, b = 22.4550(3) Å, c = 11.8947(14) Å. The ligand coordinates metal ions in a bidentate fashion and thus yields neutral complexes of the [ML2] type. The ML2 complex structures were optimized at the B97D/TZVP level, and the molecular orbitals of the HL ligand were calculated at the same level. Thermal decomposition of the complexes has been investigated by thermogravimetry. The complexes were screened for their anticancer and antioxidant activities: antioxidant activity was determined using the DPPH and ABTS assays, and anticancer activity was studied using the MTT assay in MCF-7 breast cancer cells.

  5. Computer-monitored radionuclide tracking of three-dimensional mandibular movements. Part II: experimental setup and preliminary results - Posselt diagram

    Energy Technology Data Exchange (ETDEWEB)

    Salomon, J.A.; Waysenson, B.D.; Warshaw, B.D.

    1979-04-01

    This article describes a new method to track mandibular movements using a computer-assisted radionuclide kinematics technique. The usefulness of various image-enhancement techniques is discussed, and the reproduction of physiologic displacements is shown. Vertical, lateral, and protrusive envelopes of motion of a point on a tooth of a complete denture mounted on a semiadjustable articulator were measured. A demonstrative example of the validity of this approach is the reproduction of the motion of the dental point, which clearly traces out the Posselt diagram.

  6. Proceedings of the International Conference on Stiff Computation, April 12-14, 1982, Park City, Utah. Volume II.

    Science.gov (United States)

    1982-01-01

    ABSTRACT: Henrici (1962) discussed the optimal Störmer-Cowell class of linear multistep methods for second-order differential equations; see also Stiefel and Bettis (1969), Lambert and Watson (1975), Hairer (1979), and Fatunla (1981). Introduction: Henrici [4] and Lambert [5] discussed the... References: 4. P. Henrici, Discrete Variable Methods in Ordinary Differential Equations (1962), John Wiley and Sons. 5. J.D. Lambert, Computational

  7. Atmospheric transfer of radiation above an inhomogeneous non-Lambertian reflective ground. II - Computational considerations and results

    Science.gov (United States)

    Diner, D. J.; Martonchik, J. V.

    1984-10-01

    The theoretical foundation for solution of the three-dimensional radiative transfer problem described in the preceding paper is reviewed. Practical considerations involved in implementing the Fourier transform/Gauss-Seidel method on a minicomputer are discussed, along with derivations of symmetry relations and approximations which can be used to enhance the computational efficiency. Model results for a surface whose albedo varies as a step function are presented and compared with published solutions obtained by using the Monte Carlo method.
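The Gauss-Seidel half of the Fourier transform/Gauss-Seidel method is the standard sweep in which each unknown is updated immediately from the newest available values. A generic sketch on a small diagonally dominant linear system, not the radiative transfer implementation itself:

```python
def gauss_seidel(A, b, iters=50):
    """Gauss-Seidel iteration for A x = b: sweep through the unknowns,
    updating each one in place from the most recent values of the others
    (the same scheme used iteratively on the scattering source term)."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - s) / A[i][i]
    return x

# Diagonally dominant test system (hypothetical); exact solution is (1, 1, 1):
A = [[4.0, 1.0, 0.0],
     [1.0, 4.0, 1.0],
     [0.0, 1.0, 4.0]]
b = [5.0, 6.0, 5.0]
x = gauss_seidel(A, b)
```

Because new values are used as soon as they are computed, Gauss-Seidel typically converges faster than Jacobi iteration on such systems, which is why it suits minicomputer-scale memory and time budgets.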

  8. Cavity QED and Quantum Computation in the Weak Coupling Regime II Complete Construction of the Controlled-Controlled NOT Gate

    CERN Document Server

    Fujii, K; Kato, R; Wada, Y; Fujii, Kazuyuki; Higashida, Kyoko; Kato, Ryosuke; Wada, Yukako

    2005-01-01

    In this paper we treat cavity QED quantum computation. Namely, we consider a model of quantum computation based on n laser-cooled atoms trapped linearly in a cavity, realized as the n-atom Tavis-Cummings Hamiltonian interacting with n external (laser) fields. We solve the Schrödinger equation of the model in the weak coupling regime to construct the controlled NOT gate in the case n=2, and the controlled-controlled NOT gate in the case n=3, by making use of several resonance conditions and the rotating wave approximations associated with them. We also present an idea for constructing general quantum circuits. The approach is more sophisticated than that of the paper [K. Fujii, K. Higashida, R. Kato and Y. Wada, Cavity QED and Quantum Computation in the Weak Coupling Regime, J. Opt. B: Quantum Semiclass. Opt. 6 (2004), 502]. Our method is not heuristic but completely mathematical, and its significant feature is the consistent use of Rabi oscillations.
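For reference, the target of the construction, the controlled-controlled NOT (Toffoli) gate, is the 8x8 permutation that flips the third qubit only when the first two are |1>, i.e. it swaps the basis states |110> and |111>. A quick sketch of the matrix, independent of the cavity QED realization:

```python
def ccnot():
    """Controlled-controlled NOT (Toffoli) as an 8x8 matrix: the identity
    with the rows for |110> (index 6) and |111> (index 7) swapped."""
    U = [[1.0 if i == j else 0.0 for j in range(8)] for i in range(8)]
    U[6], U[7] = U[7], U[6]  # swap |110> <-> |111>
    return U

def apply(U, state):
    """Apply an 8x8 matrix to an 8-component state vector."""
    return [sum(U[i][j] * state[j] for j in range(8)) for i in range(8)]

U = ccnot()
# |110> is mapped to |111>; every other basis state is left fixed.
out = apply(U, [0, 0, 0, 0, 0, 0, 1, 0])
```

The gate is its own inverse (applying it twice restores the input), which is what the test below checks.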

  9. Mono and binuclear ruthenium(II) complexes containing 5-chlorothiophene-2-carboxylic acid ligands: Spectroscopic analysis and computational studies

    Science.gov (United States)

    Swarnalatha, Kalaiyar; Kamalesu, Subramaniam; Subramanian, Ramasamy

    2016-11-01

    New ruthenium complexes I, II and III were synthesized using 5-chlorothiophene-2-carboxylic acid (5TPC) as the ligand, and the complexes were characterized by elemental analysis, FT-IR, 1H and 13C NMR, and mass spectrometric techniques. Photophysical and electrochemical studies were carried out, and the structures of the synthesized complexes were optimized using density functional theory (DFT). The molecular geometry, the highest occupied molecular orbital (HOMO) and lowest unoccupied molecular orbital (LUMO) energies, and the Mulliken atomic charges of the molecules were determined with the B3LYP method and the standard 6-311++G(d,p) basis set, starting from the optimized geometry. The complexes possess excellent stabilities, and their thermal decomposition temperatures are 185 °C, 180 °C and 200 °C, respectively, indicating that the metal complexes are suitable for the fabrication processes of optoelectronic devices.

  10. Parametric Investigation of Radome Analysis Methods. Volume II. Computer-Aided Radome Analysis Using Geometrical Optics and Lorentz Reciprocity.

    Science.gov (United States)

    1981-02-01

    Read and write TITLE according to the 18A4 format. Read input data using free-field format. Call RXMIT and compute a table of transmission coefficients versus the sine of the incidence angle; the first call to RXMIT builds the table.

  11. Computer vision system approach in colour measurements of foods: Part II. validation of methodology with real foods

    Directory of Open Access Journals (Sweden)

    Fatih TARLAK

    2016-01-01

    Full Text Available Abstract The colour of food is one of the most important factors affecting consumers’ purchasing decision. Although there are many colour spaces, the most widely used colour space in the food industry is L*a*b* colour space. Conventionally, the colour of foods is analysed with a colorimeter that measures small and non-representative areas of the food and the measurements usually vary depending on the point where the measurement is taken. This leads to the development of alternative colour analysis techniques. In this work, a simple and alternative method to measure the colour of foods known as “computer vision system” is presented and justified. With the aid of the computer vision system, foods that are homogenous and uniform in colour and shape could be classified with regard to their colours in a fast, inexpensive and simple way. This system could also be used to distinguish the defectives from the non-defectives. Quality parameters of meat and dairy products could be monitored without any physical contact, which causes contamination during sampling.
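A computer vision system of this kind typically maps camera RGB to L*a*b* via the intermediate XYZ space. The sketch below assumes sRGB input and the D65 white point with the standard published constants; it is a generic conversion, not the calibration used in the paper:

```python
def srgb_to_lab(r, g, b):
    """Convert 8-bit sRGB to CIELAB (D65 reference white), the colour
    space in which the food industry usually reports colour."""
    def linearize(c):
        # Undo the sRGB gamma encoding
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    rl, gl, bl = (linearize(v) for v in (r, g, b))
    # sRGB -> XYZ (D65) matrix
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    def f(t):
        # CIELAB transfer function with the linear segment near zero
        return t ** (1 / 3) if t > 0.008856 else 7.787 * t + 16 / 116
    # Normalize by the D65 white point, then apply the transfer function
    fx, fy, fz = f(x / 0.95047), f(y / 1.0), f(z / 1.08883)
    L = 116 * fy - 16
    a = 500 * (fx - fy)
    b_ = 200 * (fy - fz)
    return L, a, b_

L, a, b_ = srgb_to_lab(255, 255, 255)  # white: L* ~100, a* ~0, b* ~0
```

Applied per pixel and averaged over the segmented food region, this gives a whole-sample colour measurement instead of the small spot a colorimeter sees.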

  12. Barrier-free proton transfer in the valence anion of 2'-deoxyadenosine-5'-monophosphate. II. A computational study.

    Science.gov (United States)

    Kobyłecka, Monika; Gu, Jiande; Rak, Janusz; Leszczynski, Jerzy

    2008-01-28

    The propensity of four representative conformations of 2'-deoxyadenosine-5'-monophosphate (5'-dAMPH) to bind an excess electron has been studied at the B3LYP/6-31++G(d,p) level. While isolated canonical adenine does not support stable valence anions in the gas phase, all considered neutral conformations of 5'-dAMPH form adiabatically stable anions. The type of an anionic 5'-dAMPH state, i.e., valence, dipole-bound, or mixed (valence/dipole-bound), depends on the internal hydrogen bond(s) pattern exhibited by a particular tautomer. The most stable anion results from electron attachment to the neutral syn-south conformer. The formation of this anion is associated with a barrier-free proton transfer triggered by electron attachment and with internal rotation around the C4'-C5' bond. The adiabatic electron affinity of the a_south-syn anion is 1.19 eV, while its vertical detachment energy is 1.89 eV. Our results are compared with the photoelectron spectrum (PES) of 5'-dAMPH(-) measured recently by Stokes et al. [J. Chem. Phys. 128, 044314 (2008)]. The computational VDE obtained for the most stable anionic structure matches well the experimental electron binding energy region of maximum intensity. A further understanding of DNA damage might require experimental and computational studies of systems in which purine nucleotides are engaged in hydrogen bonding.
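For reference, the two energetic quantities compared above have the standard definitions (E denotes total electronic energy, R a geometry, "opt" an optimized structure):

```latex
\mathrm{AEA} = E_{\mathrm{neutral}}\!\left(R^{\mathrm{opt}}_{\mathrm{neutral}}\right) - E_{\mathrm{anion}}\!\left(R^{\mathrm{opt}}_{\mathrm{anion}}\right),
\qquad
\mathrm{VDE} = E_{\mathrm{neutral}}\!\left(R^{\mathrm{opt}}_{\mathrm{anion}}\right) - E_{\mathrm{anion}}\!\left(R^{\mathrm{opt}}_{\mathrm{anion}}\right)
```

Both are positive for an adiabatically stable anion, and VDE is at least as large as AEA because the neutral is evaluated at the anion's relaxed geometry; the reported 1.19 eV and 1.89 eV follow this ordering.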

  13. A computational study of structural and magnetic properties of bi- and trinuclear Cu(II) complexes with extremely long Cu-Cu distances

    Science.gov (United States)

    Baryshnikov, Gleb V.; Minaev, Boris F.; Baryshnikova, Alina T.; Ågren, Hans

    2017-07-01

    Three recently synthesized copper(II) complexes with aroylhydrazones of trifluoroacetic and benzenecarboxylic acids (Dalton Trans., 2013, 42, 16878) have been computationally investigated by density functional theory within the broken symmetry approximation, accounting for empirical dispersion corrections. A topological analysis of electron density distributions has been carried out using Bader's "quantum theory of atoms in molecules" formalism. The calculated values of spin-spin exchange for the studied dinuclear complexes indicate a very weak ferromagnetic coupling of the unpaired electrons, in good agreement with experimental data. At the same time, the trinuclear copper(II) complex possesses a low-spin doublet ground state with one ferromagnetic and two antiferromagnetic spin projections between the triangularly positioned Cu2+ ions. The estimated coupling constants for the spin-spin exchange in this trinuclear complex are in good agreement with experimental observations. The calculations support a mechanism of exchange coupling through the aromatic links in these strongly spin-separated systems.
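In the broken-symmetry DFT approach used above, the exchange coupling J of a pair Hamiltonian Ĥ = -2J Ŝ₁·Ŝ₂ is commonly estimated from the high-spin (HS) and broken-symmetry (BS) solutions, e.g. via the Yamaguchi formula (a generic reminder, not necessarily the exact estimator used in the paper):

```latex
J = \frac{E_{\mathrm{BS}} - E_{\mathrm{HS}}}{\langle \hat{S}^{2} \rangle_{\mathrm{HS}} - \langle \hat{S}^{2} \rangle_{\mathrm{BS}}}
```

With this sign convention, J > 0 (high-spin state below broken-symmetry) indicates ferromagnetic coupling, matching the weak ferromagnetic coupling reported for the dinuclear complexes.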

  14. Calibration and GEANT4 Simulations of the Phase II Proton Compute Tomography (pCT) Range Stack Detector

    Energy Technology Data Exchange (ETDEWEB)

    Uzunyan, S. A. [Northern Illinois Univ., DeKalb, IL (United States); Blazey, G. [Northern Illinois Univ., DeKalb, IL (United States); Boi, S. [Northern Illinois Univ., DeKalb, IL (United States); Coutrakon, G. [Northern Illinois Univ., DeKalb, IL (United States); Dyshkant, A. [Northern Illinois Univ., DeKalb, IL (United States); Francis, K. [Northern Illinois Univ., DeKalb, IL (United States); Hedin, D. [Northern Illinois Univ., DeKalb, IL (United States); Johnson, E. [Northern Illinois Univ., DeKalb, IL (United States); Kalnins, J. [Northern Illinois Univ., DeKalb, IL (United States); Zutshi, V. [Northern Illinois Univ., DeKalb, IL (United States); Ford, R. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Rauch, J. E. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Rubinov, P. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Sellberg, G. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Wilson, P. [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Naimuddin, M. [Univ. of Delhi, New Delhi (India)

    2015-12-29

    Northern Illinois University in collaboration with Fermi National Accelerator Laboratory (FNAL) and Delhi University has been designing and building a proton CT scanner for applications in proton treatment planning. The Phase II proton CT scanner consists of eight planes of tracking detectors with two X and two Y coordinate measurements both before and after the patient. In addition, a range stack detector consisting of a stack of thin scintillator tiles, arranged in twelve eight-tile frames, is used to determine the water equivalent path length (WEPL) of each track through the patient. The X-Y coordinates and WEPL are required input for image reconstruction software to find the relative (proton) stopping powers (RSP) value of each voxel in the patient and generate a corresponding 3D image. In this Note we describe tests conducted in 2015 at the proton beam at the Central DuPage Hospital in Warrenville, IL, focusing on the range stack calibration procedure and comparisons with the GEANT4 range stack simulation.
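The idea behind the WEPL measurement can be sketched simply: a proton stopping in a given tile has traversed that many scintillator tiles, each contributing its physical thickness times the scintillator's relative stopping power. The tile thickness and RSP below are hypothetical illustration values, not the detector's calibrated ones:

```python
def wepl_from_stack(tile_hit, t_scint_mm=3.2, rsp_scint=1.03):
    """Water-equivalent path length sketch: a proton stopping in 0-based
    tile `tile_hit` has crossed (tile_hit + 1) tiles, each worth its
    physical thickness times the scintillator's relative stopping power.
    Thickness and RSP here are hypothetical stand-ins."""
    return (tile_hit + 1) * t_scint_mm * rsp_scint

# Twelve frames of eight tiles give 96 tiles; a proton stopping midway:
wepl = wepl_from_stack(tile_hit=47)
```

In practice the mapping from stopping tile to WEPL is taken from the beam calibration described in the Note rather than from nominal thicknesses.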

  15. Synthesis, X-ray characterization and computational studies of Cu(II) complexes of N-pyrazolyl pyrimidine.

    Science.gov (United States)

    Cañellas, Pablo; Bauzá, Antonio; García-Raso, Angel; Fiol, Juan J; Deyà, Pere M; Molins, Elies; Mata, Ignasi; Frontera, Antonio

    2012-08-28

    In this manuscript we report the synthesis and X-ray characterization of several complexes of Cu(II) with a 2-(1H-pyrazol-1-yl)-pyrimidine (L) ligand. Complexes CuLCl(2) (1), [CuL(2)(H(2)O)(2)](NO(3))(2) (2) and [CuL(2)H(2)O](NO(3))(2) (3) are mononuclear systems and [CuL(NO(3))(2)](n) (4) is polymeric. In the solid state, complexes 2 and 3 are characterized by the presence of anion-π interactions that are relevant for the final 3D architecture and packing. In complexes 1 and 4, where the counterion is directly bonded to the metal, anion-π interactions are not observed. High level ab initio calculations (RI-MP2/def2-TZVP) have been used to evaluate the noncovalent interactions observed in the solid state and the interplay between them. We also demonstrate that the presence of anions above the aromatic ligand is not due only to strong electrostatic interactions between the counterparts.

  16. Anticancer activity and computational modeling of ternary copper (II) complexes with 3-indolecarboxylic acid and 1,10-phenanthroline.

    Science.gov (United States)

    Zhang, Zhen; Wang, Huiyun; Wang, Qibao; Yan, Maocai; Wang, Huannan; Bi, Caifeng; Sun, Shanshan; Fan, Yuhua

    2016-08-01

    Metal-containing compounds have been extensively studied for many years as potent proteasome inhibitors. The 20S proteasome, the main component of the ubiquitin proteasome pathway, is one of the excellent targets in anticancer drug development. We recently reported that several copper complexes were able to inhibit cancer-special proteasome and induce cell death in human cancer cells. However, the involved molecular mechanism is not known yet. We therefore synthesized three copper complexes and investigated their abilities on inhibiting proteasome activity and inducting apoptosis in human breast cancer cells. Furthermore, we employed molecular dockings to analyze the possible interaction between the synthetic copper complexes and the β5 subunit of proteasome which only reflects the chymotrypsin-like activity. Our results demonstrate that three Cu(II) complexes possess potent proteasome inhibition capability in a dose-dependent and time-dependent manner in MDA-MB-231 human breast cancer cells. They could bind to the β5 subunit of the 20S proteasome, which consequently cause deactivation of the proteasome and tumor cell death. The present study is significant for providing important theoretical basis for design and synthesis of anticancer drugs with low toxicity, high efficiency and high selectivity.

  17. Neutronics and thermal hydraulic analysis of TRIGA Mark II reactor using MCNPX and COOLOD-N2 computer code

    Science.gov (United States)

    Tiyapun, K.; Wetchagarun, S.

    2017-06-01

    A neutronic analysis of the TRIGA Mark II reactor has been performed. A detailed model of the reactor core was constructed, including standard fuel elements, fuel-follower control rods, and irradiation devices. Because the safety-related nuclear design rests on determining the criticality (keff), reactivity worth, excess reactivity, hot-rod power factor, and power peaking of the reactor, the MCNPX code was used to calculate these nuclear parameters for different core configuration designs. The thermal-hydraulic model was developed for steady state using COOLOD-N2, taking the nuclear parameters and power distribution from the MCNPX calculation as input. The objective of the thermal-hydraulic model is to determine the thermal safety margin and to ensure that fuel integrity is maintained during steady state as well as during abnormal conditions at full power. The hot-channel fuel centerline temperature, fuel surface temperature, cladding surface temperature, departure from nucleate boiling (DNB), and DNB ratio were determined. The good agreement between experimental data and simulation concerning reactor criticality demonstrates the reliability of the neutronic and thermal-hydraulic analysis methodology.
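The DNB ratio used as the thermal margin figure above has the standard definition (a generic reminder; COOLOD-N2 pairs it with a specific critical-heat-flux correlation):

```latex
\mathrm{DNBR}(z) = \frac{q''_{\mathrm{DNB}}(z)}{q''_{\mathrm{local}}(z)},
\qquad
\mathrm{MDNBR} = \min_{z} \mathrm{DNBR}(z)
```

where q''_DNB is the critical heat flux predicted by the correlation and q''_local the local hot-channel heat flux at axial position z; fuel integrity requires the minimum DNBR along the channel to stay above the design limit.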

  18. Calibration and GEANT4 Simulations of the Phase II Proton Compute Tomography (pCT) Range Stack Detector

    CERN Document Server

    Uzunyan, S A; Boi, S; Coutrakon, G; Dyshkant, A; Francis, K; Hedin, D; Johnson, E; Kalnins, J; Zutshi, V; Ford, R; Rauch, J E; Rubinov, P; Sellberg, G; Wilson, P; Naimuddin, M

    2016-01-01

    Northern Illinois University in collaboration with Fermi National Accelerator Laboratory (FNAL) and Delhi University has been designing and building a proton CT scanner for applications in proton treatment planning. The Phase II proton CT scanner consists of eight planes of tracking detectors with two X and two Y coordinate measurements both before and after the patient. In addition, a range stack detector consisting of a stack of thin scintillator tiles, arranged in twelve eight-tile frames, is used to determine the water equivalent path length (WEPL) of each track through the patient. The X-Y coordinates and WEPL are required input for image reconstruction software to find the relative (proton) stopping powers (RSP) value of each voxel in the patient and generate a corresponding 3D image. In this Note we describe tests conducted in 2015 at the proton beam at the Central DuPage Hospital in Warrenville, IL, focusing on the range stack calibration procedure and comparisons with the GEANT4 range stack simulati...

  19. Computational modeling of elastic properties of carbon nanotube/polymer composites with interphase regions. Part II: Mechanical modeling

    KAUST Repository

    Han, Fei

    2014-01-01

    We present two modeling approaches for predicting the macroscopic elastic properties of carbon nanotubes/polymer composites with thick interphase regions at the nanotube/matrix frontier. The first model is based on local continuum mechanics; the second one is based on hybrid local/non-local continuum mechanics. The key computational issues, including the peculiar homogenization technique and treatment of periodical boundary conditions in the non-local continuum model, are clarified. Both models are implemented through a three-dimensional geometric representation of the carbon nanotubes network, which has been detailed in Part I. Numerical results are shown and compared for both models in order to test convergence and sensitivity toward input parameters. It is found that both approaches provide similar results in terms of homogenized quantities but locally can lead to very different microscopic fields. © 2013 Elsevier B.V. All rights reserved.

  20. Computer Simulations to Study Diffraction Effects of Stacking Faults in Beta-SiC: II. Experimental Verification. 2; Experimental Verification

    Science.gov (United States)

    Pujar, Vijay V.; Cawley, James D.; Levine, S. (Technical Monitor)

    2000-01-01

    Earlier results from computer simulation studies suggest a correlation between the spatial distribution of stacking errors in the Beta-SiC structure and features observed in X-ray diffraction patterns of the material. Reported here are experimental results obtained from two types of nominally Beta-SiC specimens, which yield distinct XRD data. These samples were analyzed using high resolution transmission electron microscopy (HRTEM) and the stacking error distribution was directly determined. The HRTEM results compare well to those deduced by matching the XRD data with simulated spectra, confirming the hypothesis that the XRD data is indicative not only of the presence and density of stacking errors, but also that it can yield information regarding their distribution. In addition, the stacking error population in both specimens is related to their synthesis conditions and it appears that it is similar to the relation developed by others to explain the formation of the corresponding polytypes.

  1. Comparison of tracheal intubations using video intubationscope and Macintosh direct laryngoscope in patients with cervical spine immobilization

    Institute of Scientific and Technical Information of China (English)

    卢增停; 叶茜琳; 张康聪; 胡浩翔; 涂泽华

    2016-01-01

    Objective: To compare the clinical effectiveness and hemodynamic effects of the video intubationscope and the Macintosh direct laryngoscope for orotracheal intubation in patients with cervical spine immobilization. Methods: Sixty patients scheduled for elective surgery under general anesthesia with tracheal intubation, American Society of Anesthesiologists (ASA) physical status I or II, aged 19-68 years, were randomly divided into a video intubationscope group (group V) and a Macintosh direct laryngoscope group (group M), 30 patients per group. After routine intravenous induction of anesthesia, the head and neck were manually immobilized; orotracheal intubation was performed with the video intubationscope in group V and the Macintosh direct laryngoscope in group M. Glottic exposure time, Cormack-Lehane (C-L) grade, tube placement time, number of attempts, failures, first-attempt success rate, and overall success rate were recorded, as were mean arterial pressure (MAP), heart rate (HR), and intubation-related adverse reactions before induction (T0), before intubation (T1), at glottic exposure (T2), immediately after intubation (T3), and 1 min (T4) and 3 min (T5) after intubation. Results: Compared with group M, glottic exposure (C-L grade) was better in group V (P < 0.05). Compared with T1, MAP in group V showed no significant change at T2 (P > 0.05) and was significantly increased at T3-T5 (P < 0.05); compared with group M, MAP at T2-T4 in group V was significantly lower (P < 0.05). Compared with T1, HR in group V showed no significant change at T2-T5, whereas HR in group M was significantly increased at T2-T4 (P < 0.05) and was significantly higher than in group V at the same time points (P < 0.05). Conclusion: Compared with Macintosh direct laryngoscopy in patients with cervical spine immobilization, the video intubationscope provides a better view of the glottis, decreases the difficulty of intubation, increases the success rate of intubation, and causes fewer complications and less hemodynamic disturbance.

  2. [A comparison of the grade of laryngeal visualisation: the McCoy compared with the Macintosh and the Miller blade in adults].

    Science.gov (United States)

    Sakai, T; Konishi, A; Nishiyama, T; Higashizawa, T; Bito, H

    1998-08-01

    The effectiveness of the McCoy (McC) blade in visualizing the vocal cords during orotracheal intubation was compared with that of the Macintosh (Min) and Miller (Mil) blades. After institutional review board approval, 117 patients undergoing elective surgery under general anesthesia requiring tracheal intubation were studied. Five board-certified anesthesiologists attempted to visualize the vocal cords of each patient three times, once with each of the three types of laryngoscope, for a total of 351 intubation attempts. The view obtained at laryngoscopy with each blade was recorded as follows: Grade 1 if most of the glottis is visible; Grade 2 if only the posterior extremity of the glottis is visible; Grade 3 if no part of the glottis can be seen; Grade 4 if not even the epiglottis can be exposed. Eighty-two Grade 1 views were obtained with McC, 72 with Mil, and 47 with Min. Thirty-three Grade 2 views were obtained with McC, 36 with Min, and 24 with Mil. Two Grade 3 views were obtained with McC, 34 with Min, and 14 with Mil. Seven Grade 4 views were obtained with Mil. The grades of laryngeal visualization with McC were significantly lower than those with Min and Mil.

  3. Child endotracheal intubation with a Clarus Levitan fiberoptic stylet vs Macintosh laryngoscope during resuscitation performed by paramedics: a randomized crossover manikin trial.

    Science.gov (United States)

    Szarpak, Lukasz; Truszewski, Zenon; Czyzewski, Lukasz; Kurowski, Andrzej; Bogdanski, Lukasz; Zasko, Piotr

    2015-11-01

    The main cause of cardiac arrest in pediatric patients is respiratory failure. The aim was to test the ability of paramedics to intubate the trachea of a child by means of the standard Macintosh (MAC) laryngoscope vs the Clarus Levitan fiberoptic stylet (FPS) during 3 airway scenarios. This was a randomized crossover manikin study involving 89 paramedics. The participants performed tracheal intubations using the MAC laryngoscope and the Clarus Levitan FPS in 3 pediatric airway scenarios: scenario A, normal airway without chest compression (CC); scenario B, normal airway with CC; and scenario C, difficult airway with CC. In scenario A, the FPS achieved a numerically higher but not significantly different first-attempt success rate (97.8% vs 88.9%; P=.73) and time required to intubate (17 [interquartile range, 15-21] seconds vs 18 [interquartile range, 16-22] seconds; P=.67) when compared with MAC. In scenarios B and C, the results with FPS were significantly better than those with MAC (P<.05) for all analyzed variables. This study suggested that the FPS could be used as an option for airway management even by paramedics with little experience. Future studies should explore the efficacy of FPS in pediatric clinical emergency settings. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. Computer programming for nucleic acid studies. II. Total chemical shifts calculation of all protons of double-stranded helices.

    Science.gov (United States)

    Giessner-Prettre, C; Ribas Prado, F; Pullman, B; Kan, L; Kast, J R; Ts'o, P O

    1981-01-01

    A FORTRAN computer program called SHIFTS is described. With SHIFTS, one can calculate the NMR chemical shifts of the proton resonances of single- and double-stranded nucleic acids of known sequence and predetermined conformation. The program can handle RNA and DNA for an arbitrary sequence of a set of 4 out of the 6 base types A, U, G, C, I, and T. Data files for the geometrical parameters are available for the A-, A'-, B-, D-, and S-conformations. The positions of all the atoms are calculated using a modified version of the SEQ program [1]. Then, based on this defined geometry, three chemical shift effects exerted by the atoms of the neighboring nucleotides on the protons of each monomeric unit are calculated separately: the ring current shielding effect; the local atomic magnetic susceptibility effect (including both diamagnetic and paramagnetic terms); and the polarization, or electric field, effect. Results of the program are compared with experimental results for the (ApApGpCpUpU)2 helical duplex and with calculated results on this same helix based on model building of the A'-form and B-form and on a graphical procedure for evaluating the ring current effects.
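    The additive decomposition described in this record (total secondary shift = ring-current + susceptibility + electric-field terms, each summed over neighboring nucleotides) can be sketched in a few lines. The helper below and its contribution values are hypothetical illustrations, not output of SHIFTS:

```python
# Hypothetical sketch of the three-term shift decomposition; the numeric
# contributions are illustrative placeholders, not SHIFTS results.

def total_shift(ring_current, susceptibility, electric_field):
    """Total secondary shift (ppm) for one proton: sum of per-neighbor
    ring-current, local-susceptibility, and electric-field contributions."""
    return sum(ring_current) + sum(susceptibility) + sum(electric_field)

# e.g. contributions from two neighboring nucleotides (ppm):
delta = total_shift(ring_current=[-0.42, -0.11],
                    susceptibility=[0.05, 0.02],
                    electric_field=[-0.03, 0.01])
print(round(delta, 2))  # -0.48
```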

  5. Velocity autocorrelation by quantum simulations for direct parameter-free computations of the neutron cross sections. II. Liquid deuterium

    Science.gov (United States)

    Guarini, E.; Neumann, M.; Bafile, U.; Celli, M.; Colognesi, D.; Bellissima, S.; Farhi, E.; Calzavara, Y.

    2016-06-01

    Very recently we showed that quantum centroid molecular dynamics (CMD) simulations of the velocity autocorrelation function provide, through the Gaussian approximation (GA), an appropriate representation of the single-molecule dynamic structure factor of liquid H2, as witnessed by a straightforward absolute-scale agreement between calculated and experimental values of the total neutron cross section (TCS) at thermal and epithermal incident energies. Also, a proper quantum evaluation of the self-dynamics was found to guarantee, via the simple Sköld model, a suitable account of the distinct (intermolecular) contributions that influence the neutron TCS of para-H2 for low-energy neutrons (below 10 meV). The very different role of coherent nuclear scattering in D2 makes the neutron response from this liquid much more extensively determined by the collective dynamics, even above the cold neutron range. Here we show that the Sköld approximation maintains its effectiveness in producing the correct cross section values also in the deuterium case. This confirms that the true key point for reliable computational estimates of the neutron TCS of the hydrogen liquids is, together with a good knowledge of the static structure factor, the modeling of the self part, which must take into due account quantum delocalization effects on the translational single-molecule dynamics. We demonstrate that both CMD and ring polymer molecular dynamics (RPMD) simulations provide similar results for the velocity autocorrelation function of liquid D2 and, consequently, for the neutron double differential cross section and its integrals. This second investigation completes and reinforces the validity of the proposed quantum method for the prediction of the scattering law of these cryogenic liquids, so important for cold neutron production and related condensed matter research.

  6. Developing predictive approaches to characterize adaptive responses of the reproductive endocrine axis to aromatase inhibition: II. Computational modeling.

    Science.gov (United States)

    Breen, Miyuki; Villeneuve, Daniel L; Ankley, Gerald T; Bencic, David C; Breen, Michael S; Watanabe, Karen H; Lloyd, Alun L; Conolly, Rory B

    2013-06-01

    Endocrine-disrupting chemicals can affect reproduction and development in humans and wildlife. We developed a computational model of the hypothalamic-pituitary-gonadal (HPG) axis in female fathead minnows to predict dose-response and time-course (DRTC) behaviors for endocrine effects of the aromatase inhibitor, fadrozole (FAD). The model describes adaptive responses to endocrine stress involving regulated secretion of a generic gonadotropin (LH/FSH) from the hypothalamic-pituitary complex. For model development, we used plasma 17β-estradiol (E2) concentrations and ovarian cytochrome P450 (CYP) 19A aromatase mRNA data from two time-course experiments, each of which included both an exposure and a depuration phase, and plasma E2 data from a third 4-day study. Model parameters were estimated using E2 concentrations for 0, 0.5, and 3 µg/l FAD exposure concentrations, and good fits to these data were obtained. The model accurately predicted CYP19A mRNA fold changes for controls and three FAD doses (0, 0.5, and 3 µg/l) and plasma E2 dose response from the 4-day study. Comparing the model-predicted DRTC with experimental data provided insight into how the feedback control mechanisms in the HPG axis mediate these changes: specifically, adaptive changes in plasma E2 levels occurring during exposure and "overshoot" occurring postexposure. This study demonstrates the value of mechanistic modeling to examine and predict dynamic behaviors in perturbed systems. As this work progresses, we will obtain a refined understanding of how adaptive responses within the vertebrate HPG axis affect DRTC behaviors for aromatase inhibitors and other types of endocrine-active chemicals and apply that knowledge in support of risk assessments.
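    The adaptive compensation during exposure and post-exposure "overshoot" described above are generic consequences of negative feedback. A toy two-variable model (a minimal sketch under assumed rate constants, not the authors' HPG model) reproduces both behaviors: an inhibitor suppresses E2 synthesis, gonadotropin (LH) secretion rises when E2 falls, and removing the inhibitor leaves elevated LH that briefly drives E2 above baseline:

```python
# Toy negative-feedback loop (illustrative only): LH drives E2 production,
# E2 feeds back to suppress LH; an aromatase-inhibitor-like term reduces
# E2 synthesis during an exposure window. Forward-Euler integration.

def simulate(t_end=40.0, dt=0.01, exposure=(5.0, 20.0), inhibition=0.6):
    lh, e2 = 1.0, 1.0              # normalized levels; start at equilibrium
    out, t = [], 0.0
    while t < t_end:
        inhib = inhibition if exposure[0] <= t < exposure[1] else 0.0
        # gonadotropin rises when E2 is low (negative feedback)
        dlh = 0.5 * (1.0 / (1.0 + e2) - 0.5 * lh)
        # E2 production driven by LH, scaled down by aromatase inhibition
        de2 = 0.8 * (lh * (1.0 - inhib) - e2)
        lh += dlh * dt
        e2 += de2 * dt
        out.append((t, e2))
        t += dt
    return out

traj = simulate()
e2_pre = traj[int(4 / 0.01)][1]              # baseline before exposure
e2_during = traj[int(19 / 0.01)][1]          # suppressed but partly compensated
e2_post = max(e for t, e in traj if t > 20)  # post-exposure overshoot peak
```

    With these assumed constants, E2 is depressed during exposure (LH rises to compensate) and transiently exceeds baseline after the inhibitor is withdrawn, qualitatively matching the DRTC behavior the record describes.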

  7. Design of a digital beam attenuation system for computed tomography. Part II. Performance study and initial results

    Energy Technology Data Exchange (ETDEWEB)

    Szczykutowicz, Timothy P. [Department of Medical Physics, University of Wisconsin-Madison, 1111 Highland Avenue, Madison, Wisconsin 53705 (United States); Mistretta, Charles A. [Department of Medical Physics, University of Wisconsin-Madison, 1111 Highland Avenue, Madison, Wisconsin 53705 (United States); Department of Radiology, University of Wisconsin-Madison, 600 Highland Avenue, Madison, Wisconsin 53792 (United States); Department of Biomedical Engineering, University of Wisconsin-Madison, 1550 Engineering Drive, Madison, Wisconsin 53706 (United States)

    2013-02-15

    a SPR reduction of ≈4 times relative to flat field CT. The dynamic range for the DBA prototype was 3.7 compared to 84.2 for the flat field scan. Conclusions: Based on the results presented in this paper and the companion paper [T. Szczykutowicz and C. Mistretta, 'Design of a digital beam attenuation system for computed tomography. Part I. System design and simulation framework,' Med. Phys. 40, 021905 (2013)], FFMCT implemented via the DBA device seems feasible and should result in both a dose reduction and an improvement in image quality as judged by noise uniformity and scatter reduction. In addition, the dynamic range reduction achievable using the DBA may allow photon counting imaging to become a clinical reality. This study may allow for yet another step to be taken in the field of patient specific dose modulation.

  8. cobalt (ii), nickel (ii)

    African Journals Online (AJOL)

    DR. AMINU

    ABSTRACT. The manganese (II), cobalt (II), nickel (II) and copper (II) complexes of N, N' – ... temperature and coordinated water were determined ... indicating fairly stable complex compounds (Table 1). The complex compounds are insoluble [Table 2] in water and common organic solvents, but are readily soluble in ...

  9. COMPUTER HARDWARE MARKING

    CERN Multimedia

    Groupe de protection des biens

    2000-01-01

    As part of the campaign to protect CERN property and for insurance reasons, all computer hardware belonging to the Organization must be marked with the words 'PROPRIETE CERN'.IT Division has recently introduced a new marking system that is both economical and easy to use. From now on all desktop hardware (PCs, Macintoshes, printers) issued by IT Division with a value equal to or exceeding 500 CHF will be marked using this new system.For equipment that is already installed but not yet marked, including UNIX workstations and X terminals, IT Division's Desktop Support Service offers the following services free of charge:Equipment-marking wherever the Service is called out to perform other work (please submit all work requests to the IT Helpdesk on 78888 or helpdesk@cern.ch; for unavoidable operational reasons, the Desktop Support Service will only respond to marking requests when these coincide with requests for other work such as repairs, system upgrades, etc.);Training of personnel designated by Division Leade...

  10. An Approach for a Synthetic CTL Vaccine Design against Zika Flavivirus Using Class I and Class II Epitopes Identified by Computer Modeling

    Directory of Open Access Journals (Sweden)

    Edecio Cunha-Neto

    2017-06-01

    Full Text Available The threat posed by severe congenital abnormalities related to Zika virus (ZKV) infection during pregnancy has turned development of a ZKV vaccine into an emergency. Recent work suggests that the cytotoxic T lymphocyte (CTL) response to infection is an important defense mechanism in response to ZKV. Here, we develop the rationale and strategy for a new approach to developing CTL vaccines for ZKV flavivirus infection. The proposed approach is based on recent studies using a protein structure computer model for HIV epitope selection, designed to select epitopes for CTL attack optimized for viruses that exhibit antigenic drift. Because naturally processed and presented human ZKV T cell epitopes have not yet been described, we identified predicted class I peptide sequences on ZKV matching previously identified DNV (Dengue) class I epitopes and by using a Major Histocompatibility Complex (MHC) binding prediction tool. A subset of those met the criteria for optimal CD8+ attack based on physical chemistry parameters determined by analysis of the ZKV protein structure encoded in open-source Protein Data Bank (PDB) format files. We also identified candidate ZKV epitopes predicted to bind promiscuously to multiple HLA class II molecules that could provide help to the CTL responses. This work suggests that a CTL vaccine for ZKV may be possible even if ZKV exhibits significant antigenic drift. We have previously described a microsphere-based CTL vaccine platform capable of eliciting an immune response for class I epitopes in mice and are currently working toward in vivo testing of class I and class II epitope delivery directed against ZKV epitopes using the same microsphere-based vaccine.

  11. A Multicenter Evaluation of Utility of Chest Computed Tomography and Bone Scans in Liver Transplant Candidates With Stages I and II Hepatoma

    Science.gov (United States)

    Koneru, Baburao; Teperman, Lewis W.; Manzarbeitia, Cosme; Facciuto, Marcelo; Cho, Kyunghee; Reich, David; Sheiner, Patricia; Fisher, Adrian; Noto, Khristian; Goldenberg, Alec; Korogodsky, Maria; Campbell, Donna

    2005-01-01

    Objective: To determine utility of practice of chest computed tomography (CCT) and bone scan (BS) in patients with early-stage hepatoma evaluated for transplantation (LT). Summary Background Data: Consensus-based policy mandates routine CCT and BS in LT candidates with hepatoma. No data exist either to support or refute this policy. Methods: From January 1999 to December 2002, stages I and II hepatoma patients evaluated at 4 centers were included. Scan interpretation was positive, indeterminate, or negative. Outcomes of evaluation and transplantation were compared between groups based on scans. Total charges incurred were derived from mean of charges at the centers. Results: One hundred seventeen stages I and II patients were evaluated. None had positive scans, 78 had negative, 29 had at least 1 indeterminate, and 10 did not have 1 or both scans. Twelve patients were declined listing, 6 from progression of hepatoma but none from CCT or BS findings. Two listed patients were delisted for progression of the hepatoma. Proportion of patients listed, transplanted, clinical and pathologic stage of hepatoma, and recurrence after LT were similar in groups with negative and indeterminate scans. Indeterminate scans led to 6 invasive procedures, 1 patient died of complications of a mediastinal biopsy, and none of the 6 showed metastases. Charges of $2933 were generated per patient evaluated. Conclusions: Positive yield of routine CCT and BS in patients with hepatoma is very low despite substantial charges and potential complications. CCT and BS performed only when clinically indicated will be a more cost-effective and safer approach. PMID:15798464

  12. Unenhanced Cone Beam Computed Tomography and Fusion Imaging in Direct Percutaneous Sac Injection for Treatment of Type II Endoleak: Technical Note

    Energy Technology Data Exchange (ETDEWEB)

    Carrafiello, Gianpaolo, E-mail: gcarraf@gmail.com; Ierardi, Anna Maria [Insubria University, Interventional Radiology, Department of Radiology (Italy); Radaelli, Alessandro [Philips Healthcare (Netherlands); Marchi, Giuseppe De; Floridi, Chiara [Insubria University, Interventional Radiology, Department of Radiology (Italy); Piffaretti, Gabriele [University of Insubria, Vascular Surgery Department (Italy); Federico, Fontana [Insubria University, Interventional Radiology, Department of Radiology (Italy)

    2016-03-15

    Aim: To evaluate safety, feasibility, technical success, and clinical success of direct percutaneous sac injection (DPSI) for the treatment of type II endoleaks (T2EL) using anatomical landmarks on cone beam computed tomography (CBCT) and fusion imaging (FI). Materials and Methods: Eight patients with T2EL were treated with DPSI using CBCT as imaging guidance. Anatomical landmarks on unenhanced CBCT were used for referencing T2EL location in the first five patients, while FI between unenhanced CBCT and pre-procedural computed tomography angiography (CTA) was used in the remaining three patients. Embolization was performed with thrombin, glue, and ethylene–vinyl alcohol copolymer. Technical and clinical success, iodinated contrast utilization, procedural time, fluoroscopy time, and mean radiation dose were registered. Results: DPSI was technically successful in all patients: the needle was correctly positioned at the first attempt in six patients, while in two of the first five patients the needle was repositioned once. Neither minor nor major complications were registered. Average procedural time was 45 min and the average administered iodinated contrast was 13 ml. Mean radiation dose of the procedure was 60.43 Gy·cm² and mean fluoroscopy time was 18 min. Clinical success was achieved in all patients (mean follow-up of 36 months): no sign of T2EL was reported in seven patients until the last CT follow-up, while it persisted in one patient with stability of sac diameter. Conclusions: DPSI using unenhanced CBCT and FI is feasible and provides the interventional radiologist with an accurate and safe alternative to endovascular treatment with limited iodinated contrast utilization.

  13. 3-dimensional magnetotelluric inversion including topography using deformed hexahedral edge finite elements and direct solvers parallelized on symmetric multiprocessor computers - Part II: direct data-space inverse solution

    Science.gov (United States)

    Kordy, M.; Wannamaker, P.; Maris, V.; Cherkaev, E.; Hill, G.

    2016-01-01

    Following the creation described in Part I of a deformable edge finite-element simulator for 3-D magnetotelluric (MT) responses using direct solvers, in Part II we develop an algorithm named HexMT for 3-D regularized inversion of MT data including topography. Direct solvers parallelized on large-RAM, symmetric multiprocessor (SMP) workstations are used also for the Gauss-Newton model update. By exploiting the data-space approach, the computational cost of the model update becomes much less in both time and computer memory than the cost of the forward simulation. In order to regularize using the second norm of the gradient, we factor the matrix related to the regularization term and apply its inverse to the Jacobian, which is done using the MKL PARDISO library. For dense matrix multiplication and factorization related to the model update, we use the PLASMA library which shows very good scalability across processor cores. A synthetic test inversion using a simple hill model shows that including topography can be important; in this case depression of the electric field by the hill can cause false conductors at depth or mask the presence of resistive structure. With a simple model of two buried bricks, a uniform spatial weighting for the norm of model smoothing recovered more accurate locations for the tomographic images compared to weightings which were a function of parameter Jacobians. We implement joint inversion for static distortion matrices tested using the Dublin secret model 2, for which we are able to reduce nRMS to ~1.1 while avoiding oscillatory convergence. Finally we test the code on field data by inverting full impedance and tipper MT responses collected around Mount St Helens in the Cascade volcanic chain. Among several prominent structures, the north-south trending, eruption-controlling shear zone is clearly imaged in the inversion.

  14. Molecular diagnosis of mucopolysaccharidosis Type II (Hunter syndrome) by automated sequencing and computer-assisted interpretation: Toward mutation mapping of the Iduronate-2-sulfatase gene

    Energy Technology Data Exchange (ETDEWEB)

    Jonsson, J.J.; Aronovich, E.L.; Braun, S.E.; Whitley, C.B. [Univ. of Minnesota Medical School, Minneapolis, MN (United States)

    1995-03-01

    Virtually all mutations causing Hunter syndrome (mucopolysaccharidosis type II) are expected to be new mutations. Therefore, as a means of molecular diagnosis, we developed a rapid method to sequence the entire iduronate-2-sulfatase (IDS) coding region. PCR amplicons representing the IDS cDNA were sequenced with an automatic instrument, and output was analyzed by computer-assisted interpretation of tracings, using Staden programs on a Sun computer. Mutations were found in 10 of 11 patients studied. Unique missense mutations were identified in five patients: H229Y (685C→T, severe phenotype); P358R (1073C→G, severe); R468W (1402C→T, mild); P469H (1406C→A, mild); and Y523C (1568A→G, mild). Nonsense mutations were identified in two patients: R172X (514C→T, severe) and Q389X (1165C→T, severe). Two other patients with severe disease had insertions of 1 and 14 bp, in exons 3 and 6, respectively. In another patient with severe disease, the predominant (>95%) IDS message resulted from aberrant splicing, which skipped exon 3. In this last case, consensus sequences for splice sites in exon 3 were intact, but a 395C→G mutation was identified 24 bp upstream from the 3′ splice site of exon 3. This mutation created a cryptic 5′ splice site with a better consensus sequence for 5′ splice sites than the natural 5′ splice site of intron 3. A minor population of the IDS message was processed by using this cryptic splice site; however, no correctly spliced message was detected in leukocytes from this patient. The mutational topology of the IDS gene is presented. 46 refs., 6 figs., 2 tabs.

  15. Comparison of the GlideRite to the conventional malleable stylet for endotracheal intubation by the Macintosh laryngoscope: a simulation study using manikins

    Science.gov (United States)

    Kong, Yong Tack; Lee, Hyun Jung; Na, Ji Ung; Shin, Dong Hyuk; Han, Sang Kuk; Lee, Jeong Hun; Choi, Pil Cho

    2016-01-01

    Objective To compare the effectiveness of the GlideRite stylet with the conventional malleable stylet (CMS) in endotracheal intubation (ETI) by the Macintosh laryngoscope. Methods This study is a randomized, crossover, simulation study. Participants performed ETI using both the GlideRite stylet and the CMS in a normal airway model and a tongue edema model (simulated difficult airway resulting in lower percentage of glottic opening [POGO]). Results In both the normal and tongue edema models, all 36 participants successfully performed ETI with the two stylets on the first attempt. In the normal airway model, there was no difference in time required for ETI (TETI) or in ease of handling between the two stylets. In the tongue edema model, the TETI using the CMS increased as the POGO score decreased (POGO score was negatively correlated with TETI for the CMS, Spearman’s rho=-0.518, P=0.001); this difference was not seen with the GlideRite (rho=-0.208, P=0.224). The TETI was shorter with the GlideRite than with the CMS, however, this difference was not statistically significant (15.1 vs. 18.8 seconds, P=0.385). Ease of handling was superior with the GlideRite compared with the CMS (P=0.006). Conclusion Performance of the GlideRite and the CMS were not different in the normal airway model. However, in the simulated difficult airway model with a low POGO score, the GlideRite performed better than the CMS for direct laryngoscopic intubation.
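    The rank correlation reported above (Spearman's rho between POGO score and time to intubate) is computed from ranks rather than raw values. A self-contained sketch with hypothetical data (no ties assumed):

```python
# Spearman's rho via the classic rank-difference formula,
# rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1)), valid when there are no ties.

def spearman_rho(x, y):
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = float(rank)
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))

# hypothetical data: lower POGO tends to mean longer time-to-intubate
pogo = [90, 70, 50, 30, 10]   # percentage of glottic opening
teti = [14, 16, 17, 21, 25]   # time required for intubation (s)
print(spearman_rho(pogo, teti))  # -1.0 (perfectly monotone decreasing)
```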

  16. Muscle activity during endotracheal intubation using 4 laryngoscopes (Macintosh laryngoscope, Intubrite, TruView Evo2 and King Vision) – a comparative study

    Directory of Open Access Journals (Sweden)

    Tomasz Gaszyński

    2016-04-01

    Full Text Available Background: Successful endotracheal intubation requires mental and, no less important, physical activity from the anesthesiologist, so the ergonomics of the devices used is important. The aim of our study was to compare 4 laryngoscopes with regard to the operator's activity of selected muscles of the upper limb, the operator's satisfaction with the devices used, and the operator's fatigue during intubation attempts. Material and Methods: The study included 13 anesthesiologists of similar seniority. To measure muscle activity, a MyoPlus 2 device with 2-channel surface electromyography (sEMG) was used. Participants' satisfaction with the studied devices was evaluated using a Visual Analog Scale. Operator fatigue during intubation efforts was evaluated by means of the modified Borg scale. Results: The highest activity of all the studied muscles was observed for the Intubrite laryngoscope, followed by the Macintosh and TruView Evo2, and the lowest for the King Vision video laryngoscope. A statistically significant difference was observed between the King Vision and the rest of the laryngoscopes (p < 0.05). The shortest time of intubation was achieved using the standard Macintosh blade laryngoscope. The highest satisfaction was noted for the King Vision video laryngoscope, and the lowest for the TruView Evo2. In the participants' opinion, the Intubrite was the most demanding in terms of workload and the King Vision video laryngoscope the least demanding. Conclusions: Muscle activity, namely the force used for intubation, is smallest when the King Vision video laryngoscope is used, with the highest satisfaction and lowest workload; the highest muscle activity was observed for the Intubrite laryngoscope, with the highest workload. Med Pr 2016;67(2):155–162

  17. Comparison of Macintosh, McCoy and C-MAC D-Blade video laryngoscope intubation by prehospital emergency health workers: a simulation study.

    Science.gov (United States)

    Yildirim, Ahmet; Kiraz, Hasan A; Ağaoğlu, İbrahim; Akdur, Okhan

    2017-02-01

    The aim of this study is to evaluate the intubation success rates of emergency medical technicians using a Macintosh laryngoscope (ML), a McCoy laryngoscope (MCL), and a C-MAC D-Blade (CMDB) video laryngoscope on manikin models with immobilized cervical spines. This randomized crossover study included 40 EMTs with at least 2 years' active service in ambulances. All participating technicians completed intubations in three scenarios (a normal airway model, a rigid cervical collar model, and a manual in-line cervical stabilization model) with the three laryngoscopes. The scenario and laryngoscope order were determined randomly. We recorded the scenario, laryngoscope model, intubation time in seconds, tooth pressure, and intubation success on a previously prepared study form. Because of violations of parametric test assumptions, we performed Friedman tests to determine whether there was a significant change in the intubation success rate, duration of tracheal intubation, tooth pressure, and visual analog scale scores, and Wilcoxon tests to determine the significance of pairwise differences for multiple comparisons. An overall 5% type I error level was used to infer statistical significance; a p value of less than 0.05 was considered statistically significant. The CMDB and MCL success rates were significantly higher than the ML rates in all scenario models (p < 0.05). The CMDB intubation duration was significantly shorter than with ML and MCL in all models. CMDB and MCL may provide easier, faster intubation by prehospital emergency health care workers in patients with immobilized cervical spines.
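    The Friedman test used in this record compares k related treatments (here, laryngoscopes) measured on the same n subjects by ranking within each subject. A self-contained sketch of the test statistic with hypothetical timing data (no ties assumed):

```python
# Friedman chi-square statistic for k related samples:
# chi2 = 12 / (n*k*(k+1)) * sum(R_j^2) - 3*n*(k+1), df = k - 1.

def friedman_statistic(data):
    """data: n rows, each with k measurements (e.g. intubation times for
    ML, MCL, CMDB on one participant). Returns the chi-square statistic."""
    n, k = len(data), len(data[0])
    rank_sums = [0.0] * k
    for row in data:
        order = sorted(range(k), key=lambda j: row[j])  # rank within subject
        for rank, j in enumerate(order, start=1):
            rank_sums[j] += rank
    return 12.0 / (n * k * (k + 1)) * sum(r * r for r in rank_sums) - 3.0 * n * (k + 1)

# hypothetical intubation times (s) for 4 participants x 3 devices,
# with the third device consistently fastest
times = [[20, 18, 14], [25, 21, 16], [19, 17, 15], [22, 20, 13]]
print(friedman_statistic(times))  # 8.0, the maximum for n=4, k=3
```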

  18. A computational platform for robotized fluorescence microscopy (II): DNA damage, replication, checkpoint activation, and cell cycle progression by high-content high-resolution multiparameter image-cytometry.

    Science.gov (United States)

    Furia, Laura; Pelicci, Pier Giuseppe; Faretta, Mario

    2013-04-01

    Dissection of complex molecular-networks in rare cell populations is limited by current technologies that do not allow simultaneous quantification, high-resolution localization, and statistically robust analysis of multiple parameters. We have developed a novel computational platform (Automated Microscopy for Image CytOmetry, A.M.I.CO) for quantitative image-analysis of data from confocal or widefield robotized microscopes. We have applied this image-cytometry technology to the study of checkpoint activation in response to spontaneous DNA damage in nontransformed mammary cells. Cell-cycle profile and active DNA-replication were correlated to (i) Ki67, to monitor proliferation; (ii) phosphorylated histone H2AX (γH2AX) and 53BP1, as markers of DNA-damage response (DDR); and (iii) p53 and p21, as checkpoint-activation markers. Our data suggest the existence of cell-cycle modulated mechanisms involving different functions of γH2AX and 53BP1 in DDR, and of p53 and p21 in checkpoint activation and quiescence regulation during the cell-cycle. Quantitative analysis, event selection, and physical relocalization have been then employed to correlate protein expression at the population level with interactions between molecules, measured with Proximity Ligation Analysis, with unprecedented statistical relevance. Copyright © 2013 International Society for Advancement of Cytometry.

  19. Computational POM and DFT Evaluation of Experimental in-vitro Cancer Inhibition of Staurosporine-Ruthenium(II) Complexes: the Power Force of Organometallics in Drug Design.

    Science.gov (United States)

    Hadda, Taibi Ben; Genc, Zuhal K; Masand, Vijay H; Nebbache, Nadia; Warad, Ismail; Jodeh, Shehdeh; Genc, Murat; Mabkhot, Yahia N; Barakat, Assem; Salgado-Zamora, Hector

    2015-01-01

    A computational Petra/Osiris/Molinspiration/DFT (POM/DFT) based model has been developed for the identification of physico-chemical parameters governing the bioactivity of ruthenium-staurosporine complexes 2-4 containing an antitumoral kinase (TK) pharmacophore site. The four compounds 1-4 analyzed here were previously screened for their antitumor activity; compounds 2 and 4 are neutral, whereas analogue 3 is a monocation with a ruthenium(II) centre. The highest antitumor activity was obtained for compounds 3 and 4, which exhibited low IC(50) values (0.45 and 8 nM, respectively), superior to the staurosporine derivative (pyridocarbazole ligand 1, 150 × 10(3) nM). The IC(50) of 3 (0.45 nM) represents a 20,000-fold increase in activity compared to staurosporine derivative 1. The increase in bioactivity could be attributed to the existence of pi-charge transfer from metal-staurosporine to its (CO(δ−)–NH(δ+)) antitumor pharmacophore site.

  20. Computational studies on the excited states of luminescent platinum(II) alkynyl systems of tridentate pincer ligands in radiative and nonradiative processes.

    Science.gov (United States)

    Lam, Wai Han; Lam, Elizabeth Suk-Hang; Yam, Vivian Wing-Wah

    2013-10-09

    Platinum(II) alkynyl complexes of various tridentate pincer ligands, [Pt(trpy)(C≡CR)](+) (trpy = 2,2':6',2″-terpyridine), [Pt(R'-bzimpy)(C≡CR)](+) (R'-bzimpy = 2,6-bis(N-alkylbenzimidazol-2'-yl)pyridine and R' = alkyl), [Pt(R'-bzimb)(C≡CR)] (R'-bzimb = 1,3-bis(N-alkylbenzimidazol-2'-yl)benzene and R' = C4H9), have been found to possess rich photophysical properties. The emission in dilute solutions of [Pt(trpy)(C≡CR)](+) originated from a triplet alkynyl-to-tridentate pincer ligand-to-ligand charge transfer (LLCT) excited state, with mixing of a platinum-to-tridentate pincer ligand metal-to-ligand charge transfer (MLCT) excited state, while that of [Pt(R'-bzimb)(C≡CR)] originated from a triplet excited state of intraligand (IL) character of the tridentate ligand mixed with a platinum-to-tridentate ligand MLCT character. Interestingly, both emissions were observed in [Pt(R'-bzimpy)(C≡CR)](+) in some cases. In addition, [Pt(R'-bzimb)(C≡CR)] displayed a photoluminescence quantum yield higher than that of [Pt(R'-bzimpy)(C≡CR)](+). Computational studies have been performed on the representative complexes [Pt(trpy)(C≡CPh)](+) (1), [Pt(R'-bzimpy)(C≡CPh)](+) (2), and [Pt(R'-bzimb)(C≡CPh)] (3), where R' = CH3 and Ph = C6H5, to provide an in-depth understanding of the nature of their emissive origin as well as the radiative and nonradiative processes. In particular, the factors governing the ordering of the triplet excited states and radiative decay rate constants of the emissive state ((3)ES) have been examined. The potential energy profiles for the deactivation process from the (3)ES via triplet metal-centered ((3)MC) states have also been explored. This work reveals for the first time the potential energy profiles for the thermal deactivation pathway of square planar platinum(II) complexes.

  1. Polymorphisms in the F8 gene and MHC-II variants as risk factors for the development of inhibitory anti-factor VIII antibodies during the treatment of hemophilia A: a computational assessment.

    Directory of Open Access Journals (Sweden)

    Gouri Shankar Pandey

    Full Text Available The development of neutralizing anti-drug antibodies to the Factor VIII protein therapeutic is currently the most significant impediment to the effective management of hemophilia A. Common non-synonymous single nucleotide polymorphisms (ns-SNPs) in the F8 gene occur as six haplotypes in the human population (denoted H1 to H6), of which H3 and H4 have been associated with an increased risk of developing anti-drug antibodies. There is evidence that a CD4+ T-cell response is essential for the development of anti-drug antibodies, and such a response requires the presentation of the peptides by the MHC class II (MHC-II) molecules of the patient. We measured the binding and half-life of peptide-MHC-II complexes using synthetic peptides from regions of the Factor VIII protein where ns-SNPs occur and showed that these wild type peptides form stable complexes with six common MHC-II alleles, representing 46.5% of the North American population. Next, we compared the affinities computed by NetMHCIIpan, a neural network-based algorithm for MHC-II peptide binding prediction, to the experimentally measured values and concluded that these are in good agreement (area under the ROC curve of 0.778 to 0.972 for the six MHC-II variants). Using a computational binding predictor, we were able to expand our analysis to (a) include all wild type peptides spanning each polymorphic position; and (b) consider more MHC-II variants, thus allowing for a better estimation of the risk for clinical manifestation of anti-drug antibodies in the entire population (or a specific sub-population). Analysis of these computational data confirmed that peptides which have the wild type sequence at positions where the polymorphisms associated with haplotypes H3, H4 and H5 occur bind MHC-II proteins significantly more than a negative control. 
Taken together, the experimental and computational results suggest that wild type peptides from polymorphic regions of FVIII constitute potential T-cell epitopes
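The reported agreement between NetMHCIIpan predictions and measured affinities is summarized by the area under the ROC curve. As a minimal sketch of that metric, the snippet below computes AUC as the Mann-Whitney probability that a binder outranks a non-binder; the peptide scores and binder labels are invented for illustration, not taken from the study.

```python
# Sketch: comparing predicted vs. measured MHC-II binding with a ROC AUC,
# as in the abstract (AUC 0.778-0.972). Scores and labels are made up.

def roc_auc(scores, labels):
    """AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen binder scores higher than a randomly chosen non-binder."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical predicted binding scores and experimental binder calls (1 = stable complex)
predicted = [0.9, 0.8, 0.7, 0.4, 0.3, 0.1]
measured = [1, 1, 0, 1, 0, 0]
print(round(roc_auc(predicted, measured), 3))
```

An AUC of 1.0 would mean the predictor ranks every experimental binder above every non-binder; 0.5 is chance level.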

  2. Quantitative coronary plaque analysis predicts high-risk plaque morphology on coronary computed tomography angiography: results from the ROMICAT II trial.

    Science.gov (United States)

    Liu, Ting; Maurovich-Horvat, Pál; Mayrhofer, Thomas; Puchner, Stefan B; Lu, Michael T; Ghemigian, Khristine; Kitslaar, Pieter H; Broersen, Alexander; Pursnani, Amit; Hoffmann, Udo; Ferencik, Maros

    2017-08-12

    Semi-automated software can provide quantitative assessment of atherosclerotic plaques on coronary CT angiography (CTA). The relationship between established qualitative high-risk plaque features and quantitative plaque measurements has not been studied. We analyzed the association between quantitative plaque measurements and qualitative high-risk plaque features on coronary CTA. We included 260 patients with plaque who underwent coronary CTA in the Rule Out Myocardial Infarction/Ischemia Using Computer Assisted Tomography (ROMICAT) II trial. Quantitative plaque assessment and qualitative plaque characterization were performed on a per coronary segment basis. Quantitative coronary plaque measurements included plaque volume, plaque burden, remodeling index, and diameter stenosis. In qualitative analysis, high-risk plaque was present if positive remodeling, low CT attenuation plaque, napkin-ring sign or spotty calcium were detected. Univariable and multivariable logistic regression analyses were performed to assess the association between quantitative and qualitative high-risk plaque assessment. Among 888 segments with coronary plaque, high-risk plaque was present in 391 (44.0%) segments by qualitative analysis. In quantitative analysis, segments with high-risk plaque had higher total plaque volume, low CT attenuation plaque volume, plaque burden and remodeling index. Quantitatively assessed low CT attenuation plaque volume (odds ratio 1.12 per 1 mm(3), 95% CI 1.04-1.21), positive remodeling (odds ratio 1.25 per 0.1, 95% CI 1.10-1.41) and plaque burden (odds ratio 1.53 per 0.1, 95% CI 1.08-2.16) were associated with high-risk plaque. Quantitative coronary plaque characteristics (low CT attenuation plaque volume, positive remodeling and plaque burden) measured by semi-automated software correlated with qualitative assessment of high-risk plaque features.
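Odds ratios reported "per 1 mm(3)" or "per 0.1" arise from exponentiating a logistic-regression coefficient scaled by the chosen unit. The sketch below illustrates only that conversion; the coefficient is back-derived from the reported odds ratio of 1.12 and is not the actual ROMICAT II model fit.

```python
import math

# Sketch: how "odds ratio per unit" results come out of a logistic model.
# The coefficient below is chosen to match the reported OR, not fitted to data.

def odds_ratio(beta, unit=1.0):
    """Odds ratio for a `unit` increase of a predictor with coefficient beta."""
    return math.exp(beta * unit)

# Low CT attenuation plaque volume: OR 1.12 per 1 mm^3 implies beta = ln(1.12)
beta_lap = math.log(1.12)
print(round(odds_ratio(beta_lap, 1.0), 2))   # OR per 1 mm^3
print(round(odds_ratio(beta_lap, 10.0), 2))  # OR per 10 mm^3
```

Note that the per-unit scaling compounds multiplicatively: a 10 mm(3) increase corresponds to 1.12 raised to the tenth power, not 10 × 1.12.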

  3. Evaluation of Condylar Position after Orthognathic Surgery for Treatment of Class II Vertical Maxillary Excess and Mandibular Deficiency by Using Cone-Beam Computed Tomography

    Directory of Open Access Journals (Sweden)

    Reza Tabrizi

    2016-12-01

    Full Text Available Statement of the Problem: In orthognathic surgeries, proper condylar position is one of the most important factors in postoperative stability. Knowing the condylar movement after orthognathic surgery can help prevent postoperative instabilities. Purpose: The aim of this study was to evaluate the condylar positional changes after Le Fort I maxillary superior repositioning along with mandibular advancement by using cone beam computed tomography (CBCT). Materials and Method: This cross-sectional study was conducted on 22 subjects who had class II skeletal malocclusion along with vertical maxillary excess. Subjects underwent maxillary superior repositioning (Le Fort I osteotomy) along with mandibular advancement. The CBCT images were taken a couple of days before the surgery (T0), and one month (T1) and 9 months (T2) after the surgery. The condylar positions were determined from the most superior point of the condyle to three distances including the deepest point of the glenoid fossa, the most anterior-inferior point of the articular eminence, and the most superior point of the external auditory meatus in the sagittal plane. Results: The mean mandibular advancement was 4.33±2.1 mm and the mean maxillary superior repositioning was 4.66±0.3 mm. The condyles displaced inferiorly, anteriorly, and laterally between T0 and T1. They were repositioned approximately in the initial position in T2. No correlation was observed between the mandibular and maxillary movement and the condylar positions. Conclusion: The condyles displaced in the inferior-anterior-lateral position one month after the bilateral sagittal split osteotomy for mandibular advancement in combination with the maxillary Le Fort I superior repositioning. It seems that the condyles adapted approximately to their initial position nine months after the surgeries. Keywords: Mandible; Condyle; CBCT; Sagittal Osteotomy; Vertical Maxillary Excess

  4. Evaluation of Condylar Position after Orthognathic Surgery for Treatment of Class II Vertical Maxillary Excess and Mandibular Deficiency by Using Cone-Beam Computed Tomography

    Science.gov (United States)

    Tabrizi, Reza; Shahidi, Shoaleh; Bahramnejad, Emad; Arabion, Hamidreza

    2016-01-01

    Statement of the Problem: In orthognathic surgeries, proper condylar position is one of the most important factors in postoperative stability. Knowing the condylar movement after orthognathic surgery can help prevent postoperative instabilities. Purpose: The aim of this study was to evaluate the condylar positional changes after Le Fort I maxillary superior repositioning along with mandibular advancement by using cone beam computed tomography (CBCT). Materials and Method: This cross-sectional study was conducted on 22 subjects who had class II skeletal malocclusion along with vertical maxillary excess. Subjects underwent maxillary superior repositioning (Le Fort I osteotomy) along with mandibular advancement. The CBCT images were taken a couple of days before the surgery (T0), and one month (T1) and 9 months (T2) after the surgery. The condylar positions were determined from the most superior point of the condyle to three distances including the deepest point of the glenoid fossa, the most anterior-inferior point of the articular eminence, and the most superior point of the external auditory meatus in the sagittal plane. Results: The mean mandibular advancement was 4.33±2.1 mm and the mean maxillary superior repositioning was 4.66±0.3 mm. The condyles displaced inferiorly, anteriorly, and laterally between T0 and T1. They were repositioned approximately in the initial position in T2. No correlation was observed between the mandibular and maxillary movement and the condylar positions. Conclusion: The condyles displaced in the inferior-anterior-lateral position one month after the bilateral sagittal split osteotomy for mandibular advancement in combination with the maxillary Le Fort I superior repositioning. It seems that the condyles adapted approximately to their initial position nine months after the surgeries. PMID:27942547

  5. Comparisons of the Pentax-AWS, Glidescope, and Macintosh Laryngoscopes for Intubation Performance during Mechanical Chest Compressions in Left Lateral Tilt: A Randomized Simulation Study of Maternal Cardiopulmonary Resuscitation

    Directory of Open Access Journals (Sweden)

    Sanghyun Lee

    2015-01-01

    Full Text Available Purpose. Rapid advanced airway management is important in maternal cardiopulmonary resuscitation (CPR). This study aimed to compare intubation performance among the Pentax-AWS (AWS), Glidescope (GVL), and Macintosh laryngoscope (MCL) during mechanical chest compression in 15° and 30° left lateral tilt. Methods. In 19 emergency physicians, a prospective randomized crossover study was conducted to examine the three laryngoscopes. Primary outcomes were the intubation time and the success rate for intubation. Results. The median intubation time using the AWS was shorter than that of the GVL and MCL in both tilt degrees. The time to visualize the glottic view with the GVL and AWS was significantly lower than that of the MCL (all P<0.05), whereas there was no significant difference between the two video laryngoscopes (in 15° tilt, P=1; in 30° tilt, P=0.71). The progression of the tracheal tube using the AWS was faster than that of the MCL and GVL in both degrees (all P<0.001). Intubations using the AWS and GVL showed a higher success rate than those using the MCL. Conclusions. The AWS could be an appropriate laryngoscope for airway management of pregnant women in tilt CPR considering intubation time and success rate.

  6. Methionine-pyrene hybrid based fluorescent probe for trace level detection and estimation of Hg(II) in aqueous environmental samples: experimental and computational studies.

    Science.gov (United States)

    Banerjee, Arnab; Karak, Debasis; Sahana, Animesh; Guha, Subarna; Lohar, Sisir; Das, Debasis

    2011-02-15

    A new fluorescent, Hg(2+)-selective chemosensor, 4-methylsulfanyl-2-[(pyren-4-ylmethylene)-amino]butyric acid methyl ester (L, MP), was synthesized by blending methionine with pyrene. It was well characterized by different analytical techniques, viz. (1)H NMR, (13)C NMR, QTOF mass spectra, elemental analysis, FTIR and UV-vis spectroscopy. The reaction of this ligand with Hg(2+) was studied by steady state and time-resolved fluorescence spectroscopy. The Hg(2+) complexation process was confirmed by comparing FTIR, UV-vis, thermal, QTOF mass spectra and (1)H NMR data of the product with those of the free ligand. The composition (Hg(2+):L=1:1) of the Hg(2+) complex in solution was evaluated by the fluorescence titration method. Based on the chelation-assisted fluorescence quenching, a highly sensitive spectrofluorometric method was developed for the determination of trace amounts of Hg(2+) in water. The ligand had excitation and emission maxima at 360 nm and 455 nm, respectively. The fluorescence lifetimes of the ligand and its Hg(2+) complex were 1.54 ns and 0.72 ns, respectively. The binding constant of the ligand L with Hg(2+) was calculated using the Benesi-Hildebrand equation and was found to be 7.5630×10(4). The linear range of the method was from 0 to 16 μg L(-1) with a detection limit of 0.056 μg L(-1) for Hg(2+). The quantum yields of the ligand and its Hg(2+) complex were found to be 0.1206 and 0.0757, respectively. Both the ligand and its Hg(2+) complex have been studied computationally (ab initio Hartree-Fock method) to obtain their optimized structures and other related physical parameters, including bond lengths, bond angles, dipole moments, orbital interactions, etc. The binding sites of the ligand to the Hg(2+) ion as obtained from the theoretical calculations were well supported by (1)H NMR titration. The interference of foreign ions was negligible. This method has been successfully applied to the determination of mercury(II) in industrial waste water.
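The Benesi-Hildebrand treatment for a 1:1 complex turns quenching data into a straight line, 1/(F0−F) = 1/A + 1/(A·K·[M]), whose intercept-to-slope ratio gives the binding constant K. A minimal sketch follows, using synthetic titration points generated from an assumed K rather than the paper's data.

```python
# Sketch of a Benesi-Hildebrand estimate of a 1:1 binding constant from
# fluorescence quenching: fit a line to (1/[M], 1/(F0 - F)) and take
# K = intercept / slope. The titration data below are synthetic.

def benesi_hildebrand_K(F0, F_values, conc_values):
    """Least-squares line through (1/[M], 1/(F0-F)); returns K = intercept/slope."""
    xs = [1.0 / c for c in conc_values]
    ys = [1.0 / (F0 - F) for F in F_values]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return intercept / slope

# Synthetic data for an assumed K = 7.5e4 M^-1 (amplitude A and F0 invented):
# F0 - F = A / (1 + 1/(K*[M])), i.e. 1/(F0-F) = 1/A + 1/(A*K*[M])
K_true, A, F0 = 7.5e4, 50.0, 100.0
concs = [2e-6, 5e-6, 1e-5, 2e-5, 5e-5]
Fs = [F0 - A / (1 + 1 / (K_true * c)) for c in concs]
print(round(benesi_hildebrand_K(F0, Fs, concs)))
```

Because the synthetic points lie exactly on the line, the fit recovers the assumed K; real titration data would scatter around it.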

  7. Graphics gems V (Macintosh version)

    CERN Document Server

    Paeth, Alan W

    1995-01-01

    Graphics Gems V is the newest volume in The Graphics Gems Series. It is intended to provide the graphics community with a set of practical tools for implementing new ideas and techniques, and to offer working solutions to real programming problems. These tools are written by a wide variety of graphics programmers from industry, academia, and research. The books in the series have become essential, time-saving tools for many programmers.Latest collection of graphics tips in The Graphics Gems Series written by the leading programmers in the field.Contains over 50 new gems displaying some of t

  8. [Anaesthesiology in the Polish Armed Forces in the West during World War II].

    Science.gov (United States)

    Rutkiewicz, Aleksander; Duda, Izabela; Musioł, Ewa

    2011-01-01

    Until the outbreak of WW II, anaesthesiology, as a separate specialty, did not exist in Poland. After the fall of Poland, a large section of the Polish Armed Forces was evacuated to France and after that, to the UK, where Polish military physicians had a unique opportunity to obtain training in modern anaesthesia. The first regular courses were established at the University of Edinburgh. After WW II, doctor Stanisław Pokrzywnicki, a pioneer of Polish anaesthesiology, who was trained by Sir Robert Macintosh, and doctor Bolesław Rutkowski, an anaesthesiologist in London, returned to Poland and started regular services. This led to the registering of anaesthesiology as a separate specialty in 1951. In the article, the wartime and post-war stories of the first Polish anaesthesiologists are presented.

  9. ISORROPIA II: a computationally efficient thermodynamic equilibrium model for K+–Ca2+–Mg2+–NH4+–Na+–SO42−–NO3−–Cl−–H2O aerosols

    Directory of Open Access Journals (Sweden)

    C. Fountoukis

    2007-09-01

    Full Text Available This study presents ISORROPIA II, a thermodynamic equilibrium model for the K+–Ca2+–Mg2+–NH4+–Na+–SO42−–NO3−–Cl−–H2O aerosol system. A comprehensive evaluation of its performance is conducted against water uptake measurements for laboratory aerosol and predictions of the SCAPE2 thermodynamic module over a wide range of atmospherically relevant conditions. The two models agree well, to within 13% for aerosol water content and total PM mass, 16% for aerosol nitrate and 6% for aerosol chloride and ammonium. The largest discrepancies were found under conditions of low RH, primarily from differences in the treatment of water uptake and solid state composition. In terms of computational speed, ISORROPIA II was more than an order of magnitude faster than SCAPE2, with robust and rapid convergence under all conditions. The addition of crustal species does not slow down the thermodynamic calculations (compared to the older ISORROPIA code) because of optimizations in the activity coefficient calculation algorithm. Based on its computational rigor and performance, ISORROPIA II appears to be a highly attractive alternative for use in large scale air quality and atmospheric transport models.
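Statements such as "agree to within 13% for aerosol water content" reduce to a normalized error over paired model predictions. Below is a minimal sketch of one such metric, with invented values standing in for actual ISORROPIA II / SCAPE2 output.

```python
# Sketch of a model-agreement metric: mean absolute difference between
# paired predictions, normalized by the reference model's total.
# The aerosol water values below are illustrative only.

def normalized_mean_error(model_a, model_b):
    """Sum of |a - b| over the sum of the reference values, as a fraction."""
    diff = sum(abs(a - b) for a, b in zip(model_a, model_b))
    ref = sum(model_b)
    return diff / ref

water_model_a = [10.2, 8.1, 15.7, 4.9]  # hypothetical ug/m3 per test case
water_model_b = [11.0, 7.5, 14.0, 5.5]
print(round(100 * normalized_mean_error(water_model_a, water_model_b), 1))
```

The exact metric used in the evaluation is not specified in the abstract; this is simply one common way to express percent-level agreement between two models.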

  10. Synthesis, crystal structure, spectroscopic characterization and nonlinear optical properties of manganese (II) complex of picolinate: A combined experimental and computational study

    Science.gov (United States)

    Tamer, Ömer; Avcı, Davut; Atalay, Yusuf; Çoşut, Bünyemin; Zorlu, Yunus; Erkovan, Mustafa; Yerli, Yusuf

    2016-02-01

    A novel manganese (II) complex with picolinic acid (pyridine 2-carboxylic acid, Hpic), namely, [Mn(pic)2(H2O)2] was prepared and its crystal structure was fully characterized by using single crystal X-ray diffraction. Picolinate (pic) ligands were coordinated to the central manganese(II) ion as bidentate N,O-donors through the nitrogen atoms of pyridine rings and the oxygen atoms of carboxylate groups forming five-membered chelate rings. The spectroscopic characterization of Mn(II) complex was performed by the applications of FT-IR, Raman, UV-vis and EPR techniques. In order to support these studies, density functional theory (DFT) calculations were carried out by using B3LYP level. IR and Raman spectra were simulated at B3LYP level, and obtained results indicated that DFT calculations generally give compatible results to the experimental ones. The electronic structure of the Mn(II) complex was predicted using time dependent DFT (TD-DFT) method with polarizable continuum model (PCM). Molecular stability, hyperconjugative interactions, intramolecular charge transfer (ICT) and bond strength were investigated by applying natural bond orbital (NBO) analysis. Nonlinear optical properties of Mn(II) complex were investigated by the determining of molecular polarizability (α) and hyperpolarizability (β) parameters.

  11. Comparison of HC video-laryngoscope versus Macintosh laryngoscope for tracheal intubation

    Institute of Scientific and Technical Information of China (English)

    弓胜凯; 孙政; 樊肖冲; 吕慧敏; 储勤军; 张卫

    2013-01-01

    Objective: To compare the HC video-laryngoscope with the Macintosh laryngoscope for tracheal intubation. Methods: Sixty ASA Ⅰ or Ⅱ patients of both sexes, aged 18-64 yr, with body mass index 19-27 kg/m2 and Mallampati grade Ⅰ-Ⅱ, undergoing elective surgery, were randomly divided into 2 groups (n = 30 each): HC video-laryngoscope group (group H) and Macintosh laryngoscope group (group M). After induction of anesthesia, the patients underwent orotracheal intubation assisted by the HC video-laryngoscope in group H and by the Macintosh laryngoscope in group M. The glottic exposure time, intubation time, Cormack-Lehane grade (used to calculate the rate of satisfactory glottic exposure), number of cricoid pressure applications and intubation-related complications were recorded. Results: The rate of satisfactory glottic exposure was significantly higher and the number of cricoid pressure applications smaller in group H than in group M (P < 0.05). There was no significant difference in the glottic exposure time, intubation time or incidence of intubation-related complications between the two groups (P > 0.05). Conclusion: The efficacy of tracheal intubation guided by the HC video-laryngoscope is better than that guided by the Macintosh laryngoscope.

  12. Mixing Computations and Proofs

    Directory of Open Access Journals (Sweden)

    Michael Beeson

    2016-01-01

    Full Text Available We examine the relationship between proof and computation in mathematics, especially in formalized mathematics. We compare the various approaches to proofs with a significant computational component, including (i) verifying the algorithms, (ii) verifying the results of the unverified algorithms, and (iii) trusting an external computation.
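Approach (ii), verifying results rather than algorithms, is often far cheaper because a checker can be much simpler than the computation it checks. As a hedged illustration (the factoring example is ours, not the paper's), the sketch below accepts an untrusted integer factorization only after confirming it.

```python
# Sketch of result verification: instead of proving a factoring algorithm
# correct, independently check each result it emits. Trial division
# suffices for this small illustration.

def is_prime(n):
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def check_factorization(n, factors):
    """Accept the untrusted result only if the factors multiply to n
    and every claimed factor is prime."""
    prod = 1
    for f in factors:
        prod *= f
    return prod == n and all(is_prime(f) for f in factors)

print(check_factorization(91, [7, 13]))  # a correct untrusted result is accepted
print(check_factorization(91, [7, 14]))  # a wrong one is rejected
```

The checker's correctness is all that must be trusted (or formally verified); the algorithm that produced the factors can remain a black box.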

  13. MICRO-VERS. Micro-computer Software for the Vocational Education Reporting System. User's Guide and Reference Manual. Version 3.1. Apple II.

    Science.gov (United States)

    Illinois State Board of Education, Springfield. Dept. of Adult, Vocational and Technical Education.

    This manual is intended to accompany a software system for the Apple II microcomputer that is designed to aid local districts in completing vocational education enrollment claims and Vocational Education Data System (VEDS) reports. Part I, Introduction, gives a brief overview of the Microcomputer Vocational Education Reporting System (MICRO-VERS),…

  14. Air oxygenation chemistry of 4-TBC catalyzed by chloro bridged dinuclear copper(II) complexes of pyrazole based tridentate ligands: synthesis, structure, magnetic and computational studies.

    Science.gov (United States)

    Banerjee, Ishita; Samanta, Pabitra Narayan; Das, Kalyan Kumar; Ababei, Rodica; Kalisz, Marguerite; Girard, Adrien; Mathonière, Corine; Nethaji, M; Clérac, Rodolphe; Ali, Mahammad

    2013-02-07

    Four dinuclear bis(μ-Cl) bridged copper(II) complexes, [Cu(2)(μ-Cl)(2)(L(X))(2)](ClO(4))(2) (L(X) = N,N-bis[(3,5-dimethylpyrazole-1-yl)-methyl]benzylamine with X = H(1), OMe(2), Me(3) and Cl(4)), have been synthesized and characterized by the single crystal X-ray diffraction method. In these complexes, each copper(II) center is penta-coordinated with square-pyramidal geometry. In addition to the tridentate L(X) ligand, a chloride ion occupies the last position of the square plane. This chloride ion is also bonded to the neighboring Cu(II) site in its axial position, forming an SP-I dinuclear Cu(II) unit that exhibits small intramolecular ferromagnetic interactions, as supported by DFT calculations. The complexes 1-3 exhibit methylmonooxygenase (pMMO) behaviour and oxidise 4-tert-butylcatechol (4-TBCH(2)) with molecular oxygen in MeOH or MeCN to 4-tert-butyl-benzoquinone (4-TBQ) and 5-methoxy-4-tert-butyl-benzoquinone (5-MeO-4-TBQ) as the major products, along with 6,6'-Bu(t)-biphenyl-3,4,3',4'-tetraol and others as minor products. These products were further confirmed by ESI- and FAB-mass analyses. A tentative catalytic cycle has been framed based on the mass spectral analysis of the products and DFT calculations on the individual intermediates, which are energetically feasible.

  15. Memorias Conferencia Internacional IEEE Mexico 1971, Sobre Sistemas, Redes Y Computadoras. Volumen I and Volumen II. (Proceedings of International Conference of IEEE Concerning Systems, Networks, and Computers. Volume I and Volume II.)

    Science.gov (United States)

    Concheiro, A. Alonso, Ed.; And Others

    The following papers in English from this international conference may be of particular interest to those in the field of education. T. Nakahara, A. Tsukamota, and M. Matsumoto describe a computer-aided design technique for an economical urban cable television system. W. D. Wasson and R. K. Chitkara outline a recognition scheme based on analysis…

  17. Synthesis, characterization, crystal structure determination and computational study of a new Cu(II) complex of bis [2-{(E)-[2-chloroethyl)imino]methyl}phenolato)] copper(II) Schiff base complex

    Science.gov (United States)

    Grivani, Gholamhossein; Vakili, Mohammad; Khalaji, Aliakbar Dehno; Bruno, Giuseppe; Rudbari, Hadi Amiri; Taghavi, Maedeh

    2016-07-01

    The copper(II) Schiff base complex [CuL2] (1), HL = 2-{(E)-[2-chloroethyl)imino]methyl}phenol, has been synthesized and characterized by elemental (CHN) analysis, UV-Vis and FT-IR spectroscopy. The molecular structure of 1 was determined by the single crystal X-ray diffraction technique. The conformational analysis and molecular structures of CuL2 were investigated by means of density functional theory (DFT) calculations at the B3LYP/6-311G* level. An excellent agreement was observed between theoretical and experimental results. The Schiff base ligand HL acts as a chelating ligand and coordinates via one nitrogen atom and one oxygen atom to the metal center. The copper(II) center is coordinated by two nitrogen atoms and two oxygen atoms from two Schiff base ligands in an approximately square planar trans-[MN2O2] coordination geometry. Thermogravimetric analysis of CuL2 showed that it decomposed in five stages. In addition, the CuL2 complex thermally decomposed in air at 660 °C, and the XRD pattern of the obtained solid showed the formation of CuO nanoparticles with an average size of 34 nm.

  18. Studying an Eulerian Computer Model on Different High-performance Computer Platforms and Some Applications

    Science.gov (United States)

    Georgiev, K.; Zlatev, Z.

    2010-11-01

    The Danish Eulerian Model (DEM) is an Eulerian model for studying the transport of air pollutants on a large scale. Originally, the model was developed at the National Environmental Research Institute of Denmark. The model's computational domain covers Europe and neighbouring parts of the Atlantic Ocean, Asia and Africa. If the DEM is applied on fine grids, its discretization leads to a huge computational problem, which means that such a model must be run on high-performance computer architectures. The implementation and tuning of such a complex large-scale model on each different computer is a non-trivial task. Here, comparison results are presented from running this model on different kinds of vector computers (CRAY C92A, Fujitsu, etc.), parallel computers with distributed memory (IBM SP, CRAY T3E, Beowulf clusters, Macintosh G4 clusters, etc.), parallel computers with shared memory (SGI Origin, SUN, etc.) and parallel computers with two levels of parallelism (IBM SMP, IBM BlueGene/P, clusters of multiprocessor nodes, etc.). The main idea in the parallel version of DEM is a domain partitioning approach. The effective use of the cache and hierarchical memories of modern computers, as well as the performance, speed-ups and efficiency achieved, are discussed. The parallel code of DEM, created using the MPI standard library, appears to be highly portable and shows good efficiency and scalability on different kinds of vector and parallel computers. Some important applications of the computer model output are briefly presented.
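The domain partitioning idea behind the parallel DEM can be sketched as splitting the grid's rows into near-equal contiguous blocks, one per process. The plain-Python illustration below shows only the decomposition; the grid size and worker count are hypothetical, and the real code distributes the resulting subdomains via MPI.

```python
# Sketch of row-wise domain partitioning for a grid model: each of `nproc`
# workers gets a contiguous block of rows, with block sizes differing by
# at most one. Illustrative only; not the DEM's actual MPI code.

def partition_rows(nrows, nproc):
    """Return half-open (start, end) row ranges, one per process rank."""
    base, extra = divmod(nrows, nproc)
    ranges, start = [], 0
    for rank in range(nproc):
        size = base + (1 if rank < extra else 0)  # first `extra` ranks get one more row
        ranges.append((start, start + size))
        start += size
    return ranges

# e.g. a hypothetical 480-row grid split over 7 workers
print(partition_rows(480, 7))
```

In an MPI implementation each rank would compute only its own range and exchange halo rows with its neighbours after each transport step.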

  19. Assignment of solid-state 13C and 1H NMR spectra of paramagnetic Ni(II) acetylacetonate complexes aided by first-principles computations

    DEFF Research Database (Denmark)

    Rouf, Syed Awais; Jakobsen, Vibe Boel; Mareš, Jiří

    2017-01-01

    Recent advances in computational methodology have allowed first-principles calculations of the nuclear shielding tensor for a series of paramagnetic nickel(II) acetylacetonate complexes, [Ni(acac)2L2] with L = H2O, D2O, NH3, ND3, and PMe2Ph, providing detailed insight into the origin of the paramagnetic contributions to the total shift tensor. This was employed for the assignment of the solid-state 1,2H and 13C MAS NMR spectra of these compounds. The two major contributions to the isotropic shifts are the orbital (diamagnetic-like) and contact mechanisms. The orbital shielding, contact, as well as dipolar terms all contribute to the anisotropic component. The calculations suggest reassignment of the 13C methyl and carbonyl resonances in the acac ligand [Inorg. Chem. 53, 2014, 399], leading to isotropic paramagnetic shifts of δ(13C) ≈ 800–1100 ppm and ≈180–300 ppm for the methyl and carbonyl carbons, respectively.

  20. Mathematical and Computational Aspects of Multiscale Materials Modeling, Mathematics-Numerical analysis, Section II.A.a.3.4, Conference and symposia organization II.A.2.a

    Science.gov (United States)

    2015-02-04

    Report keywords: mathematical modeling; physical modeling; computational methods; conference organization. …researchers from France and the US working in the field of Mathematics and Mechanics. The Center has organized meetings every year over the past 7 years. …is similar to that used in relativity to account for space curvature, but its application to the mechanics of classical continua has not been explored.

  1. Computation Modeling of Limb-bud Dysmorphogenesis: Predicting Cellular Dynamics and Key Events in Developmental Toxicity with a Multicellular Systems Model (FutureToxII)

    Science.gov (United States)

    Congenital limb malformations are among the most frequent malformations in humans, occurring in about 1 in 500 to 1 in 1000 human live births. ToxCast is profiling the bioactivity of thousands of chemicals based on high-throughput screening (HTS) and computational methods that...

  2. A comparative evaluation of Cone Beam Computed Tomography (CBCT) and Multi-Slice CT (MSCT). Part II: On 3D model accuracy

    NARCIS (Netherlands)

    Liang, X.; Lambrichts, I.; Sun, Y.; Denis, K.; Hassan, B.; Li, L.; Pauwels, R.; Jacobs, R.

    2010-01-01

    Aim: The study aim was to compare the geometric accuracy of three-dimensional (3D) surface model reconstructions between five Cone Beam Computed Tomography (CBCT) scanners and one Multi-Slice CT (MSCT) system. Materials and methods: A dry human mandible was scanned with five CBCT systems (NewTom 3G,

  3. CASY: a dynamic simulation of the gas-cooled fast breeder reactor core auxiliary cooling system. Volume II. Example computer run

    Energy Technology Data Exchange (ETDEWEB)

    1979-09-01

    A listing of a CASY computer run is presented. It was initiated from a demand terminal and, therefore, contains the identification ST0952. This run also contains an INDEX listing of the subroutine UPDATE. The run includes a simulated scram transient at 30 seconds.

  4. Comparison of HC video-laryngoscope and Macintosh direct laryngoscope for tracheal intubation in pediatric patients

    Institute of Scientific and Technical Information of China (English)

    何伟; 黄梦朦; 刘铁帅; 张冰; 曾睿峰; 上官王宁; 连庆泉; 李军

    2014-01-01

    Objective: To compare the HC video-laryngoscope and the Macintosh direct laryngoscope for tracheal intubation in pediatric patients. Methods: One hundred and twenty pediatric patients, of ASA physical status Ⅰ or Ⅱ (Mallampati class Ⅰ or Ⅱ), aged 1-6 yr, scheduled for elective surgery under general anesthesia, were randomly divided into 2 groups (n = 60 each) using a random number table: HC video-laryngoscope group (group H1) and Macintosh direct laryngoscope group (group M1). Forty pediatric patients, aged 3-6 yr, of ASA physical status Ⅰ or Ⅱ (Mallampati class Ⅲ or Ⅳ), suspected as having a difficult airway and scheduled for elective surgery under general anesthesia, were randomly divided into 2 groups (n = 20 each) using a random number table: HC video-laryngoscope group (group H2) and Macintosh direct laryngoscope group (group M2). After induction of anesthesia, orotracheal intubation was carried out with the HC video-laryngoscope (groups H1 and H2) or with the Macintosh direct laryngoscope (groups M1 and M2). The exposure of the glottis was evaluated with the Cormack-Lehane classification. The intubation time, rate of successful intubation, and distance between upper and lower incisors when intubation was successful in the H2 and M2 groups were recorded. The development of damage to lips, teeth, gums and soft tissues of the throat during intubation and of hoarseness after operation was recorded. Results: Compared with group M1, no significant change was found in the intubation time, rate of successful intubation at first attempt and Cormack-Lehane grade, and the incidence of damage to lips, teeth, gums and soft tissues of the throat during intubation and of hoarseness after operation was significantly decreased in group H1. Compared with group M2, the intubation time was significantly shortened, the rate of successful intubation at first attempt was increased, the distance between upper and lower incisors when intubation was successful was reduced, the Cormack-Lehane grade was decreased, and the incidence of damage to lips

  5. [Interactive, multimedia learning system for the study of primary open angle glaucoma--system description from the viewpoint of the computer scientist].

    Science.gov (United States)

    Zenz, H; Maresch, H; Faulborn, J

    1994-01-01

A software package was developed for use on the Apple Macintosh, with the aid of which medical students can learn about primary open-angle glaucoma; at the same time, it is also highly suitable for supplementing lectures. The subject is subdivided into three sections: anatomy, pathology and clinical picture. All the possibilities of computer technology, for example controllable animation, video sequences and special graphic effects, are utilized. A special lexicon, a complex of questions and all the possible cross-references combine to make the learning system a very flexible and highly effective learning tool.

  6. Special Education Teacher Computer Literacy Training. Project STEEL. A Special Project To Develop and Implement a Computer-Based Special Teacher Education and Evaluation Laboratory. Volume II. Final Report.

    Science.gov (United States)

    Frick, Theodore W.; And Others

    The document is part of the final report on Project STEEL (Special Teacher Education and Evaluation Laboratory) intended to extend the utilization of technology in the training of preservice special education teachers. This volume focuses on the second of four project objectives, the development of a special education teacher computer literacy…

  7. Can early computed tomography angiography after endovascular aortic aneurysm repair predict the need for reintervention in patients with type II endoleak?

    Science.gov (United States)

    Dudeck, O; Schnapauff, D; Herzog, L; Löwenthal, D; Bulla, K; Bulla, B; Halloul, Z; Meyer, F; Pech, M; Gebauer, B; Ricke, J

    2015-02-01

    This study was designed to identify parameters on CT angiography (CTA) of type II endoleaks following endovascular aortic aneurysm repair (EVAR) for abdominal aortic aneurysm (AAA), which can be used to predict the subsequent need for reinterventions. We retrospectively identified 62 patients with type II endoleak who underwent early CTA in mean 3.7 ± 1.9 days after EVAR. On the basis of follow-up examinations (mean follow-up period 911 days; range, 373-1,987 days), patients were stratified into two groups: those who did (n = 18) and those who did not (n = 44) require reintervention. CTA characteristics, such as AAA, endoleak, as well as nidus dimensions, patency of the inferior mesenteric artery, number of aortic branch vessels, and the pattern of endoleak appearance, were recorded and correlated with the clinical outcome. Univariate and receiver operating characteristic curve regression analyses revealed significant differences between the two groups for the endoleak volume (surveillance group: 1391.6 ± 1427.9 mm(3); reintervention group: 3227.7 ± 2693.8 mm(3); cutoff value of 2,386 mm(3); p = 0.002), the endoleak diameter (13.6 ± 4.3 mm compared with 25.9 ± 9.6 mm; cutoff value of 19 mm; p < 0.0001), the number of aortic branch vessels (2.9 ± 1.2 compared with 4.2 ± 1.4 vessels; p = 0.001), as well as a "complex type" endoleak pattern (13.6 %, n = 6 compared with 44.4 %, n = 8; p = 0.02). Early CTA can predict the future need for reintervention in patients with type II endoleak. Therefore, treatment decision should be based not only on aneurysm enlargement alone but also on other imaging characteristics.
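The two ROC cutoffs reported above lend themselves to a simple screening rule. The sketch below is purely illustrative: the thresholds come from the abstract, but the function name and example inputs are hypothetical, and this is not a validated clinical tool.

```python
# Illustrative triage sketch using the cutoffs reported in the abstract:
# endoleak volume > 2,386 mm^3 or diameter > 19 mm flags a likely reintervention.
# Function name and example values are hypothetical.

def endoleak_flags(volume_mm3: float, diameter_mm: float) -> bool:
    """Return True if either reported CTA cutoff is exceeded."""
    VOLUME_CUTOFF_MM3 = 2386.0   # ROC cutoff for endoleak volume
    DIAMETER_CUTOFF_MM = 19.0    # ROC cutoff for endoleak diameter
    return volume_mm3 > VOLUME_CUTOFF_MM3 or diameter_mm > DIAMETER_CUTOFF_MM

# Group means quoted in the abstract:
print(endoleak_flags(1391.6, 13.6))   # surveillance-group means -> False
print(endoleak_flags(3227.7, 25.9))   # reintervention-group means -> True
```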

  8. Chiral mobile phase in ligand-exchange chromatography of amino acids: exploring the copper(II) salt anion effect with a computational approach.

    Science.gov (United States)

    Sardella, Roccaldo; Macchiarulo, Antonio; Carotti, Andrea; Ianni, Federica; Rubiño, Maria Eugenia García; Natalini, Benedetto

    2012-12-21

With the use of a chiral ligand-exchange chromatography (CLEC) system operating with O-benzyl-(S)-serine [(S)-OBS] [1,2] as the chiral mobile phase (CMP) additive to the eluent, the effect of the copper(II) anion type on retention (k) and separation (α) factors was evaluated by rationally changing the following experimental conditions: salt concentration and temperature. The CLEC-CMP analysis was carried out on ten amino acid racemates and with nine different cupric salts. While the group of analytes comprised both aliphatic (leucine, isoleucine, nor-leucine, proline, valine, nor-valine, and α-methyl-valine) and aromatic (1-aminoindan-1,5-dicarboxylic acid, phenylglycine, and tyrosine) species, representative organic (formate, methanesulfonate, and trifluoroacetate) and inorganic (bromide, chloride, fluoride, nitrate, perchlorate, and sulfate) Cu(II) salts were selected as the metal source in the eluent. This route of investigation was pursued with the aim of identifying analogies among the employed Cu(II) salts by observing the variation profile of the selected chromatographic parameters upon a change of the above experimental conditions. All the data were collected and analyzed through a statistical approach (PCA and k-means clustering) that revealed the presence of two behavioral classes of cupric salts, sharing the same variation profile for k and α values. Interestingly, this clustering can be explained in terms of ESP (electrostatic surface potential) balance (ESP(bal)) values, obtained by an ab initio calculation on the cupric salts. The results of this appraisal could aid the rational choice of the most suitable eluent system to succeed in the enantioseparation of difficult-to-resolve compounds, along with the eventual scale-up to a semi-preparative level.
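The clustering step described above can be sketched with a minimal two-group k-means on per-salt feature vectors built from the k and α variation profiles. The sketch below covers only the k-means step (not the PCA), in pure Python; the data points and helper name are hypothetical stand-ins, not the study's measurements.

```python
import random

def kmeans_2(points, iters=50, seed=0):
    """Minimal 2-means clustering on 2-D feature vectors (e.g. k/alpha profiles)."""
    rng = random.Random(seed)
    centers = rng.sample(points, 2)
    labels = [0] * len(points)
    for _ in range(iters):
        # assignment step: nearest center by squared Euclidean distance
        labels = [min((0, 1),
                      key=lambda c: (p[0] - centers[c][0])**2 + (p[1] - centers[c][1])**2)
                  for p in points]
        # update step: recompute each center as the mean of its members
        for c in (0, 1):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centers[c] = (sum(p[0] for p in members) / len(members),
                              sum(p[1] for p in members) / len(members))
    return labels

# Hypothetical (k, alpha) variation profiles for nine salts, two behavioral classes
profiles = [(1.0, 1.1), (1.1, 1.0), (0.9, 1.2),
            (3.0, 2.1), (3.2, 2.0), (2.9, 2.2), (3.1, 1.9),
            (1.05, 1.15), (3.05, 2.05)]
labels = kmeans_2(profiles)
print(labels)
```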

  9. Research and Development in Natural Language Understanding as Part of the Strategic Computing Program. Volume 3. A Guide to IRUS-II Application Development. Revision.

    Science.gov (United States)

    1990-10-01

Section 26 4.2.1 Variable Binding List 26 4.2.2 IRule Body 27 4.2.3 IRule Examples 28 4.3 Creating IRules 30 4.3.1 Using IRACQ 30 4.3.2 An IRACQ session...(representing a user's input) in addition contains fields for the parse tree, WML, discourse entities, and anaphor resolution information for that input...when more than one possible referent is found for an anaphor, the options are presented to the user, who indicates which to use. 2.2 IRUS-II Capabilities

  10. Prediction of Six-Degree-of-Freedom Store Separation Trajectories at Speeds up to the Critical Speed. Volume II. Users Manual for the Computer Programs

    Science.gov (United States)

    1974-10-01

NUMSTR(J) store number; different for each store and < 99. NSHAPE(J) shape number of store; < 99. SLTHC(J) length of store, feet. SRMAX(J) maximum radius...chord leading edge immediately above store, feet; positive below. SIC(J) store incidence angle measured relative to wing root chord...the j-th store. Table II-8 Concluded: SXL(K,J), XLEL(I), XSNC(J), XSNI(J), XWSOI(J), Y(I), YWSO(J), YSN(J), ZLEL(I), ZSN(J), ZWSO(J) array

  11. Computer Center: Software Review.

    Science.gov (United States)

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with color graphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  12. Interactions of the "piano-stool" [ruthenium(II)(η(6) -arene)(quinolone)Cl](+) complexes with water; DFT computational study.

    Science.gov (United States)

    Zábojníková, Tereza; Cajzl, Radim; Kljun, Jakob; Chval, Zdeněk; Turel, Iztok; Burda, Jaroslav V

    2016-07-15

Full optimizations of stationary points along the reaction coordinate for the hydration of several quinolone Ru(II) half-sandwich complexes were performed in a water environment using the B3PW91/6-31+G(d)/PCM/UAKS method. The role of diffuse functions (especially on oxygen) was found crucial for correct geometries along the reaction coordinate. Single-point (SP) calculations were performed at the B3LYP/6-311++G(2df,2pd)/DPCM/scaled-UAKS level. In the first part, two possible reaction mechanisms, associative and dissociative, were compared. It was found that the dissociative mechanism of the hydration process is kinetically slightly preferred. Another important conclusion concerns the reaction channels. It was found that substitution of the chloride ligand (abbreviated in the text as the dechlorination reaction) represents energetically and kinetically the most feasible pathway. In the second part, the same hydration reaction was explored to compare the reactivity of the Ru(II) complexes with several derivatives of nalidixic acid: cinoxacin, ofloxacin, and (thio)nalidixic acid. The hydration process is about four orders of magnitude faster in a basic solution compared to a neutral/acidic environment, with cinoxacin and nalidixic acid as the most reactive complexes in the former and latter environments, respectively. The explored hydration reaction is in all cases endergonic; nevertheless, the endergonicity is substantially lower (by ∼6 kcal/mol) in a basic environment. © 2016 Wiley Periodicals, Inc.
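The "four orders of magnitude" rate difference quoted above can be related to an activation free-energy gap through standard transition-state (Eyring) kinetics. The arithmetic sketch below is general background, not a calculation from the paper; the temperature of 298.15 K is an assumption.

```python
import math

R = 1.987e-3   # gas constant, kcal/(mol*K)
T = 298.15     # assumed temperature, K

def rate_ratio_to_ddG(ratio: float) -> float:
    """Activation free-energy difference implied by a rate ratio, k2/k1 = exp(ddG/RT)."""
    return R * T * math.log(ratio)

# A 10^4 rate ratio corresponds to roughly 5.5 kcal/mol at room temperature,
# comparable in scale to the ~6 kcal/mol endergonicity difference noted above.
print(round(rate_ratio_to_ddG(1.0e4), 2))  # -> 5.46 (kcal/mol)
```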

  13. Crystallographic and Computational Studies of a Class II MHC Complex with a Nonconforming Peptide: HLA-DRA/DRB3*0101

    Science.gov (United States)

    Parry, Christian S.; Gorski, Jack; Stern, Lawrence J.

    2003-03-01

The stable binding of processed foreign peptide to a class II major histocompatibility complex (MHC) molecule and its subsequent presentation to a T cell receptor is a central event in immune recognition and regulation. Polymorphic residues on the floor of the peptide binding site form pockets that anchor peptide side chains. These and other residues in the helical wall of the groove determine the specificity of each allele and define a motif. Allele-specific motifs allow the prediction of epitopes from the sequence of pathogens. There are, however, known epitopes that do not satisfy these motifs: anchor motifs are not adequate for predicting epitopes, as there are apparently major and minor motifs. We present crystallographic studies into the nature of the interactions that govern the binding of these so-called nonconforming peptides. We would like to understand the role of the P10 pocket and find out whether peptides that do not obey the consensus anchor motif bind in the canonical conformation observed in prior structures of class II MHC-peptide complexes. HLA-DRB3*0101 complexed with peptide crystallized in unit cell 92.10 x 92.10 x 248.30 (90, 90, 90), space group P41212, and the diffraction data are reliable to 2.2 Å. We are complementing our studies with long-timescale dynamical simulations to answer these questions, particularly the interplay of the anchor motifs in peptide binding, the range of protein and ligand conformations, and water hydration structures.

  14. Na(I)/Cu(I-II) heterometallic cages interconnected by unusual linear 2-coordinate OCN-Cu(I)-NCO links: synthesis, structural, magnetostructural correlation and computational studies.

    Science.gov (United States)

    Ray, Aurkie; Rosair, Georgina M; Rajeev, Ramanan; Sunoj, Raghavan B; Rentschler, Eva; Mitra, Samiran

    2009-11-21

A new Na(I)/Cu(I-II) heterometallic coordination complex [Cu(2)L(2)Na(NCO)(2)Cu](n) (1) with an unusual architecture has been synthesised. In 1, cyclic Na-O-Cu-O-Cu cages constructed by the tetradentate N(2)O(2) donor Schiff base ligand (H(2)L = N,N'-bis(2-hydroxyacetophenone)propylenediimine) are interconnected by a rare singly end-to-end bridged OCN-Cu(I)-NCO link, generating a 1D chain. The complex has been characterised by elemental, spectral and structural analysis. The cyclic voltammogram of 1 has been compared with those of analogous complexes. Cryomagnetic susceptibility studies indicate that the copper(II) centers in the cyclic Na-O-Cu-O-Cu cages are antiferromagnetically coupled with J = -13.8 cm(-1). Complex 1 is a new addition to a class of rare singly end-to-end cyanato-bridged copper(I) species, and interestingly the copper ions involved in the OCN-Cu(I)-NCO links possess a linear 2-coordinate geometry. Density functional theory calculations have been carried out to gain additional insights into the metal and ligand orbitals participating in this unusual structure.
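For a pair of antiferromagnetically coupled Cu(II) ions such as those in the Na-O-Cu-O-Cu cages, an exchange constant J of this kind is conventionally defined through the isotropic Heisenberg Hamiltonian, and the dimer susceptibility follows the standard Bleaney-Bowers expression. This is general background for S = 1/2 dimers; the abstract does not state which convention or fitting expression the authors used.

```latex
\hat{H} = -2J\,\hat{S}_1\cdot\hat{S}_2, \qquad S_1 = S_2 = \tfrac{1}{2}
\qquad\Longrightarrow\qquad
\chi_{\mathrm{dimer}} = \frac{2N g^2 \mu_B^2}{k_B T}
\left[\,3 + \exp\!\left(-\frac{2J}{k_B T}\right)\right]^{-1}
```

With J = -13.8 cm(-1) in this convention, the singlet-triplet gap is |2J| ≈ 27.6 cm(-1).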

  15. Computational photochemistry of the azobenzene scaffold of Sudan I and Orange II dyes: excited-state proton transfer and deactivation via conical intersections.

    Science.gov (United States)

    Guan, Pei-Jie; Cui, Ganglong; Fang, Qiu

    2015-03-16

We employed the complete active space self-consistent field (CASSCF) and its multistate second-order perturbation (MS-CASPT2) methods to explore the photochemical mechanism of 2-hydroxyazobenzene, the molecular scaffold of Sudan I and Orange II dyes. It was found that the excited-state intramolecular proton transfer (ESIPT) along the bright diabatic ¹ππ* state is barrierless and ultrafast. Along this diabatic ¹ππ* relaxation path, the system can jump to the dark ¹nπ* state via the ¹ππ*/¹nπ* crossing point. However, ESIPT in this dark state is largely inhibited owing to a sizeable barrier. We also found two deactivation channels that decay the ¹ππ* keto and ¹nπ* enol species to the ground state via two energetically accessible S1/S0 conical intersections. Finally, we encountered an interesting phenomenon in the excited-state hydrogen-bonding strength: it is reinforced in the ¹ππ* state, whereas it is reduced in the ¹nπ* state. The present work sets the stage for understanding the photophysics and photochemistry of Sudan I-IV, Orange II, Ponceau 2R, Ponceau 4R, and azo violet.

  16. Trenton ICES: demonstration of a grid connected integrated community energy system. Phase II. Volume 3. Preliminary design of ICES system and analysis of community ownership: computer printouts

    Energy Technology Data Exchange (ETDEWEB)

    1978-03-22

    This volume supplements Vol. 2 and consists entirely of computer printouts. The report consists of three parts: (1) hourly log of plant simulation based on 1982 ICES Community, with thermal storage, on-peak and off-peak electric generation, and 80% maximum kW trip-off; (2) same as (1) except without thermal storage; and (3) hourly load and demand profiles--1979, 1980, and 1982 ICES communities.

  17. Evaluation of Condylar Position after Orthognathic Surgery for Treatment of Class II Vertical Maxillary Excess and Mandibular Deficiency by Using Cone-Beam Computed Tomography

    OpenAIRE

    Reza Tabrizi; Shoaleh Shahidi; Dept. of Oral and Maxillofacial Radiology, Biomaterials Research Center, Shiraz University of Medical Sciences, Shiraz, Iran.; Hamidreza Arabion

    2016-01-01

    Statement of the Problem: In orthognathic surgeries, proper condylar position is one of the most important factors in postoperative stability. Knowing the condylar movement after orthognathic surgery can help preventing postoperative instabilities. Purpose: The aim of this study was to evaluate the condylar positional changes after Le Fort I maxillary superior repositioning along with mandibular advancement by using cone beam computed tomography (CBCT). Materials and Method: This cross...

  18. Numerical computation of spherical harmonics of arbitrary degree and order by extending exponent of floating point numbers: II first-, second-, and third-order derivatives

    Science.gov (United States)

    Fukushima, Toshio

    2012-11-01

We confirm that the first-, second-, and third-order derivatives of the fully-normalized Legendre polynomial (LP) and associated Legendre function (ALF) of arbitrary degree and order can be correctly evaluated by means of non-singular fixed-degree formulas (Bosch in Phys Chem Earth 25:655-659, 2000) in ordinary IEEE754 arithmetic when the values of the fully-normalized LP and ALF are obtained without underflow problems, e.g., using the extended range arithmetic we recently developed (Fukushima in J Geod 86:271-285, 2012). Also, we note the same correctness for the popular but singular fixed-order formulas unless (1) the order of differentiation is greater than the order of harmonics and (2) the point of evaluation is close to the poles. The new formulation using the fixed-order formulas runs with negligible extra computational time, i.e., a 3-5% increase in computational time per single ALF when compared with the standard algorithm without the exponent extension. This enables practical computation of low-order derivatives of spherical harmonics of arbitrary degree and order.
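The underflow problem and the extended-range remedy mentioned above can be illustrated with a toy pair representation that carries an auxiliary integer exponent alongside an IEEE754 double, so the double itself never underflows. This is a simplified sketch of the general "X-number" idea, not Fukushima's actual implementation; the sectorial seed recursion below is the standard one for fully-normalized ALFs.

```python
import math

# Toy extended-range number: value = f * 2**(960*i), with f renormalized so it
# stays in a moderate range and never under/overflows as a double.
BIG, BIGI = 2.0**960, 2.0**-960

def xnorm(f, i):
    """Renormalize so that |f| stays within [2**-480, 2**480)."""
    if f == 0.0:
        return 0.0, 0
    while abs(f) >= 2.0**480:
        f *= BIGI; i += 1
    while abs(f) < 2.0**-480:
        f *= BIG; i -= 1
    return f, i

def xmul(a, b):
    """Multiply two extended-range numbers (fa, ia) * (fb, ib)."""
    return xnorm(a[0] * b[0], a[1] + b[1])

def xlog10(x):
    """Decimal logarithm of the represented magnitude."""
    f, i = x
    return math.log10(abs(f)) + 960 * i * math.log10(2.0)

# Sectorial seed of the fully-normalized ALF: P_mm = u^m * prod_k sqrt((2k+1)/(2k)),
# with u = sin(theta). For small u and large m this underflows in plain doubles.
u, m = 0.01, 2000
plain, x = 1.0, (1.0, 0)
for k in range(1, m + 1):
    c = math.sqrt((2 * k + 1.0) / (2.0 * k)) * u
    plain *= c
    x = xmul(x, (c, 0))

print(plain)        # underflows to 0.0 in ordinary double arithmetic
print(xlog10(x))    # extended-range value survives (about -3999 in log10)
```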

  19. Modelo computacional para suporte à decisão em áreas irrigadas. Parte II: testes e aplicação [Computer model for decision support in irrigated areas. Part II: tests and application

    Directory of Open Access Journals (Sweden)

    Paulo A. Ferreira

    2006-12-01

    Full Text Available Part I of this research presented the development of a decision support model, called MCID, for planning and managing irrigation and/or drainage projects. Part II is aimed at testing and applying MCID. In a comparative test with the DRAINMOD model, drain spacings obtained with MCID were slightly larger or identical. The spacings obtained with MCID and DRAINMOD were considerably larger than those obtained through traditional methodologies for the design of drainage systems. The total relative crop yield (YRT) obtained with MCID was, in general, lower than that obtained with DRAINMOD, due to differences in the methodology for estimating crop yield in response to water deficit. In comparison with the CROPWAT program, very close results were obtained for YRT and actual evapotranspiration. The model was applied under the conditions of the Jaíba Project, MG, Brazil, for perennial and annual crops cultivated in different seasons. The results of the tests and applications indicated the potential of MCID as a decision support tool for irrigation and/or drainage projects.

  20. Self-focusing and filamentation of a laser beam within the paraxial stationary approximation. Part II: computer simulations; Autofocalisation et filamentation d'un faisceau laser dans le cadre de l'approximation paraxiale et stationnaire. Partie II: simulations numeriques

    Energy Technology Data Exchange (ETDEWEB)

    Blain, M.A.; Bonnaud, G.; Chiron, A.; Riazuelo, G.

    1996-02-01

    This report addresses the propagation of an intense laser beam in an unmagnetized plasma, which is relevant for both inertial confinement fusion (ICF) and ultra-high intensity (UHI) pulses. The duration and the irradiance of the laser pulses are, respectively, (0.1-10) nanoseconds and (10{sup 13}-10{sup 16}) W/cm{sup 2} in the ICF context, and (0.1-1) picosecond and in excess of 10{sup 18} W/cm{sup 2} in the UHI context. The nonlinear mechanisms for beam self-focusing and filamentation, induced by both the ponderomotive expelling of charged particles and the relativistic increase of the electron mass, are specifically studied. Part I deals with the theoretical aspects and part II is concerned with the results of two-dimensional simulations. The results have been obtained within the framework of the paraxial approximation and the stationary response of the plasma. The large set of scenarios that characterize the behavior of a Gaussian beam and a modulated beam is presented; a synthetic overview of the previous theoretical works is also provided. The interplay of two crossing beams is discussed. This report will help to improve the uniformity of the laser irradiation in the ICF context and to channel a very intense laser beam over large distances in the UHI context. (authors). 17 refs., 53 figs., 14 tabs.
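The paraxial stationary model referred to in the title is, in its generic form, a nonlinear Schrödinger-type envelope equation. The form below is reproduced as standard background; the report's own notation and normalizations may differ.

```latex
2ik_0\,\frac{\partial E}{\partial z} + \nabla_\perp^2 E
  + k_0^2\,\frac{\delta n\!\left(|E|^2\right)}{n_0}\,E = 0
```

Here E is the slowly varying transverse envelope of the beam, k_0 the carrier wavenumber, n_0 the background refractive index, and δn the intensity-dependent index perturbation, which in the regimes quoted above collects the ponderomotive and relativistic contributions that drive self-focusing and filamentation.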

  1. Can Early Computed Tomography Angiography after Endovascular Aortic Aneurysm Repair Predict the Need for Reintervention in Patients with Type II Endoleak?

    Energy Technology Data Exchange (ETDEWEB)

    Dudeck, O., E-mail: oliver.dudeck@med.ovgu.de [University of Magdeburg, Department of Radiology and Nuclear Medicine (Germany); Schnapauff, D. [Charité Universitätsmedizin Berlin, Department of Radiology (Germany); Herzog, L.; Löwenthal, D.; Bulla, K.; Bulla, B. [University of Magdeburg, Department of Radiology and Nuclear Medicine (Germany); Halloul, Z.; Meyer, F. [University of Magdeburg, Department of General, Visceral and Vascular Surgery (Germany); Pech, M. [University of Magdeburg, Department of Radiology and Nuclear Medicine (Germany); Gebauer, B. [Charité Universitätsmedizin Berlin, Department of Radiology (Germany); Ricke, J. [University of Magdeburg, Department of Radiology and Nuclear Medicine (Germany)

    2015-02-15

    Purpose: This study was designed to identify parameters on CT angiography (CTA) of type II endoleaks following endovascular aortic aneurysm repair (EVAR) for abdominal aortic aneurysm (AAA), which can be used to predict the subsequent need for reinterventions. Methods: We retrospectively identified 62 patients with type II endoleak who underwent early CTA in mean 3.7 ± 1.9 days after EVAR. On the basis of follow-up examinations (mean follow-up period 911 days; range, 373–1,987 days), patients were stratified into two groups: those who did (n = 18) and those who did not (n = 44) require reintervention. CTA characteristics, such as AAA, endoleak, as well as nidus dimensions, patency of the inferior mesenteric artery, number of aortic branch vessels, and the pattern of endoleak appearance, were recorded and correlated with the clinical outcome. Results: Univariate and receiver operating characteristic curve regression analyses revealed significant differences between the two groups for the endoleak volume (surveillance group: 1391.6 ± 1427.9 mm{sup 3}; reintervention group: 3227.7 ± 2693.8 mm{sup 3}; cutoff value of 2,386 mm{sup 3}; p = 0.002), the endoleak diameter (13.6 ± 4.3 mm compared with 25.9 ± 9.6 mm; cutoff value of 19 mm; p < 0.0001), the number of aortic branch vessels (2.9 ± 1.2 compared with 4.2 ± 1.4 vessels; p = 0.001), as well as a “complex type” endoleak pattern (13.6 %, n = 6 compared with 44.4 %, n = 8; p = 0.02). Conclusions: Early CTA can predict the future need for reintervention in patients with type II endoleak. Therefore, treatment decision should be based not only on aneurysm enlargement alone but also on other imaging characteristics.

  2. Zero-field splitting in pseudotetrahedral Co(II) complexes: a magnetic, high-frequency and -field EPR, and computational study.

    Science.gov (United States)

    Idešicová, Monika; Titiš, Ján; Krzystek, J; Boča, Roman

    2013-08-19

    Six pseudotetrahedral cobalt(II) complexes of the type [CoL2Cl2], with L = heterocyclic N-donor ligand, have been studied in parallel by magnetometry and high-frequency and -field electron paramagnetic resonance (HFEPR). HFEPR powder spectra were recorded in the 50 GHz < ν < 700 GHz range in a 17 T superconducting and a 25 T resistive magnet, which allowed construction of resonance field vs. frequency diagrams from which the fitting procedure yielded the S = 3/2 spin ground state Hamiltonian parameters. The sign of the axial anisotropy parameter D was determined unambiguously; the values range between -8 and +11 cm(-1) for the given series of complexes. These data agree well with the magnetometric analysis. Finally, quantum chemical ab initio calculations were performed on the whole series of complexes to probe the relationship between the magnetic anisotropy and the electronic and geometric structure.
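The spin Hamiltonian whose parameters are fitted from such field-frequency maps is, in the standard form used for S = 3/2 systems (shown here for reference; conventions follow common HFEPR practice rather than any equation reproduced in the abstract):

```latex
\hat{H} = D\left[\hat{S}_z^2 - \tfrac{1}{3}S(S+1)\right]
        + E\left(\hat{S}_x^2 - \hat{S}_y^2\right)
        + \mu_B\,\mathbf{B}\cdot\mathbf{g}\cdot\hat{\mathbf{S}}
```

For S = 3/2 the two Kramers doublets are split in zero field by Δ = 2√(D² + 3E²), which is why resolving the sign and magnitude of D in the -8 to +11 cm(-1) range requires the high frequencies and fields quoted above.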

  3. Bis(o-methylserotonin)-containing iridium(III) and ruthenium(II) complexes as new cellular imaging dyes: synthesis, applications, and photophysical and computational studies.

    Science.gov (United States)

    Núñez, Cristina; Silva López, Carlos; Faza, Olalla Nieto; Fernández-Lodeiro, Javier; Diniz, Mario; Bastida, Rufina; Capelo, Jose Luis; Lodeiro, Carlos

    2013-08-01

    We report the synthesis, characterization, and scope of a new versatile emissive molecular probe functionalized with a 1,10-phenanthroline moiety containing methylserotonin groups as binding sites for metal ion recognition, together with the evaluation of the in vitro imaging capability of its iridium(III) and ruthenium(II) complexes [Ir(ppy)2(N-N)](+) and [Ru(bpy)2(N-N)](2+), in which ppy is 2-phenylpyridine, bpy is 2,2'-bipyridine, and N-N is the 1,10-phenanthroline ligand functionalized with two methylserotonin groups. The uptake of these compounds by living freshwater fish (Carassius auratus) was studied by fluorescence microscopy, and the cytotoxicity of the ligand N-N and [Ru(bpy)2(N-N)](2+) in this species was also investigated.

  4. Synthesis of novel palladium(II) complexes with oxalic acid diamide derivatives and their interaction with nucleosides and proteins. Structural, solution, and computational study.

    Science.gov (United States)

    Mrkalić, Emina M; Jelić, Ratomir M; Klisurić, Olivera R; Matović, Zoran D

    2014-10-28

    Novel palladium complexes, KH[Pd(obap)]2·3H2O (3) with oxamido-N-aminopropyl-N'-benzoic acid and [Pd(apox)] (4) with N,N'-bis(3-aminopropyl)ethanediamide, were synthesized. Exhaustive synthetic, solution and structural studies of the two Pd(II) complexes are reported. The binary and ternary systems of the Pd(II) ion with H2apox or H3obap as primary ligands and nucleosides (Ado or Cyt) as secondary ligands are investigated in order to better understand their equilibrium chemistry. The relative stabilities of the ternary complexes are determined and compared with those of the corresponding binary complexes in terms of their Δlog K values. The species distribution of all complexes in solution is evaluated. Fluorescence spectroscopy data show that the fluorescence quenching of HSA is a result of the formation of the [PdL]-HSA complex. The structure of complex 3 is confirmed using X-ray crystallography. The results are compared to those obtained for palladium complexes of similar structures. Density functional theory (DFT) has been applied for modelling and energetic analysis purposes. The nature of the Pd-N(O) bond interaction is analyzed using NBO. Docking simulation experiments are reported in order to predict the most probable mechanism of pro-drug action. The following free binding energy order of the best scores from the [PdL]-DNA docking simulations was observed in the case of DNA alteration: cis-[Pt(NH3)2(H2O)2](2+) > [Pd(obap)] > [Pd(mda)]. For the ER and cytosolic stress mechanisms, the results of the docking simulations with the chaperones Grp78 and Hsc70 are promising for possible applications as potent protein inhibitors (Ki of [Pd(mda)]/GRP78 being ∼66 μM and Ki of [Pd(obap)]/HSC70 being 14.39 μM).
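The Δlog K criterion used above to compare ternary with binary stabilities is the standard one from coordination equilibrium chemistry (general definition, not an equation quoted from the paper):

```latex
\Delta\log K \;=\; \log K^{\mathrm{PdL}}_{\mathrm{PdL(N)}} \;-\; \log K^{\mathrm{Pd}}_{\mathrm{Pd(N)}}
```

where N denotes the nucleoside; a Δlog K less negative than the statistically expected value indicates extra stabilization of the ternary complex relative to the binary one.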

  5. Relationships (II) of International Classification of High-resolution Computed Tomography for Occupational and Environmental Respiratory Diseases with ventilatory functions indices for parenchymal abnormalities.

    Science.gov (United States)

    Tamura, Taro; Suganuma, Narufumi; Hering, Kurt G; Vehmas, Tapio; Itoh, Harumi; Akira, Masanori; Takashima, Yoshihiro; Hirano, Harukazu; Kusaka, Yukinori

    2015-01-01

    The International Classification of High-Resolution Computed Tomography (HRCT) for Occupational and Environmental Respiratory Diseases (ICOERD) is used to screen and diagnose respiratory illnesses. Using univariate and multivariate analysis, we investigated the relationship between subject characteristics and parenchymal abnormalities according to ICOERD, and the results of ventilatory function tests (VFT). Thirty-five patients with and 27 controls without mineral-dust exposure underwent VFT and HRCT. We recorded all subjects' occupational history of mineral dust exposure and smoking history. Experts independently assessed HRCT using the ICOERD parenchymal abnormality (Items) grades for well-defined rounded opacities (RO), linear and/or irregular opacities (IR), and emphysema (EM). High-resolution computed tomography showed that 11 patients had RO; 15 patients, IR; and 19 patients, EM. According to the multiple regression model, age and height had significant associations with many ventilatory function indices, such as vital capacity, forced vital capacity, and forced expiratory volume in 1 s (FEV1). The EM summed grades for the upper, middle, and lower zones of the right and left lungs also had significant associations with FEV1 and the maximum mid-expiratory flow rate. The results suggest that the ICOERD notation is adequate, based on the good and significant multiple regression modeling of ventilatory function with the EM summed grades.
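The multiple regression step described above can be sketched as ordinary least squares fitted via the normal equations. The sketch below is self-contained pure Python; all data values are hypothetical placeholders, not the study's measurements, and are generated from an assumed linear model so the fit is easy to check.

```python
# Minimal multiple linear regression (OLS via normal equations), of the kind
# used above: FEV1 modeled on age, height and summed EM grade.
# All numeric data below are hypothetical placeholders.

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            M[r] = [mr - f * mc for mr, mc in zip(M[r], M[c])]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][j] * x[j] for j in range(r + 1, n))) / M[r][r]
    return x

def ols(X, y):
    """Return OLS coefficients for y ~ X (first column of X should be 1s)."""
    n, k = len(X), len(X[0])
    XtX = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(k)] for a in range(k)]
    Xty = [sum(X[i][a] * y[i] for i in range(n)) for a in range(k)]
    return solve(XtX, Xty)

# columns: intercept, age (yr), height (cm), EM summed grade -- all hypothetical
X = [[1, 45, 170, 0], [1, 50, 165, 2], [1, 60, 160, 5],
     [1, 55, 175, 1], [1, 65, 158, 6], [1, 40, 180, 0]]
# hypothetical FEV1 (L) generated as 6.0 - 0.03*age + 0.0*height - 0.2*EM
y = [6.0 - 0.03 * a + 0.0 * h - 0.2 * e for _, a, h, e in X]
beta = ols(X, y)
print([round(b, 3) for b in beta])
```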

  6. Use of Transportable Radiation Detection Instruments to Assess Internal Contamination from Intakes of Radionuclides Part II: Calibration Factors and ICAT Computer Program.

    Science.gov (United States)

    Anigstein, Robert; Olsher, Richard H; Loomis, Donald A; Ansari, Armin

    2016-12-01

    The detonation of a radiological dispersion device or other radiological incidents could result in widespread releases of radioactive materials and intakes of radionuclides by affected individuals. Transportable radiation monitoring instruments could be used to measure radiation from gamma-emitting radionuclides in the body for triaging individuals and assigning priorities to their bioassay samples for in vitro assessments. The present study derived sets of calibration factors for four instruments: the Ludlum Model 44-2 gamma scintillator, a survey meter containing a 2.54 × 2.54-cm NaI(Tl) crystal; the Captus 3000 thyroid uptake probe, which contains a 5.08 × 5.08-cm NaI(Tl) crystal; the Transportable Portal Monitor Model TPM-903B, which contains two 3.81 × 7.62 × 182.9-cm polyvinyltoluene plastic scintillators; and a generic instrument, such as an ionization chamber, that measures exposure rates. The calibration factors enable these instruments to be used for assessing inhaled or ingested intakes of any of four radionuclides: (60)Co, (131)I, (137)Cs, and (192)Ir. The derivations used biokinetic models embodied in the DCAL computer software system developed by the Oak Ridge National Laboratory and Monte Carlo simulations using the MCNPX radiation transport code. The three physical instruments were represented by MCNP models that were developed previously. The affected individuals comprised children of five ages, who were represented by the revised Oak Ridge National Laboratory pediatric phantoms, and adult men and adult women, represented by the Adult Reference Computational Phantoms described in Publication 110 of the International Commission on Radiological Protection. These calibration factors can be used to calculate intakes; the intakes can be converted to committed doses by the use of tabulated dose coefficients. These calibration factors also constitute input data to the ICAT computer program, an interactive Microsoft Windows-based software package that estimates intakes of
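The chain from instrument reading to committed dose described above is simple arithmetic once a calibration factor and a dose coefficient are in hand. All numeric values in the sketch below are hypothetical placeholders for illustration; the actual calibration factors and dose coefficients are tabulated in the report and in ICRP publications, not reproduced here.

```python
# intake (Bq)        = net count rate (cps) / calibration factor (cps per Bq retained)
# committed dose (Sv) = intake (Bq) * dose coefficient (Sv per Bq)
# All numeric values below are hypothetical placeholders.

def intake_from_count_rate(net_cps: float, cal_factor_cps_per_bq: float) -> float:
    """Estimated intake implied by a net instrument reading."""
    return net_cps / cal_factor_cps_per_bq

def committed_dose(intake_bq: float, dose_coeff_sv_per_bq: float) -> float:
    """Committed dose from an intake via a tabulated dose coefficient."""
    return intake_bq * dose_coeff_sv_per_bq

net_cps = 120.0   # hypothetical net count rate above background
cal = 2.0e-3      # hypothetical calibration factor, cps/Bq
coeff = 4.6e-9    # hypothetical dose coefficient, Sv/Bq

intake = intake_from_count_rate(net_cps, cal)
dose_msv = committed_dose(intake, coeff) * 1.0e3   # convert Sv to mSv
print(intake, dose_msv)
```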

  7. Modeling and Multi-Objective Optimization of Engine Performance and Hydrocarbon Emissions via the Use of a Computer Aided Engineering Code and the NSGA-II Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Richard Fiifi Turkson

    2016-01-01

Full Text Available It is feared that the increasing population of vehicles in the world and the depletion of fossil-based fuel reserves could render transportation and other activities that rely on fossil fuels unsustainable in the long term. Concerns over environmental pollution issues, the high cost of fossil-based fuels and the increasing demand for fossil fuels have led to the search for environmentally friendly, cheaper and efficient fuels. In the search for these alternatives, liquefied petroleum gas (LPG) has been identified as one of the viable alternatives that could be used in place of gasoline in spark-ignition engines. The objective of the study was to present the modeling and multi-objective optimization of brake mean effective pressure and hydrocarbon emissions for a spark-ignition engine retrofitted to run on LPG. The use of a one-dimensional (1D) GT-Power™ model, together with Group Method of Data Handling (GMDH) neural networks, has been presented. The multi-objective optimization was implemented in MATLAB® using the non-dominated sorting genetic algorithm (NSGA-II). The modeling process generally achieved low mean squared errors for the models developed (0.0000032 in the case of the hydrocarbon emissions model), which was attributed to the collection of a larger training sample using the 1D engine model. The multi-objective optimization and subsequent decisions for optimal performance have also been presented.
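The core of NSGA-II is repeated non-dominated (Pareto) sorting of candidate solutions. A minimal sketch for two objectives that are both minimized (e.g. negated BMEP and HC emissions), with made-up objective values:

```python
# Minimal sketch of the Pareto (non-dominated) sorting step at the heart of
# NSGA-II. Objective vectors below are illustrative, not engine data.

def dominates(a, b):
    """a dominates b if a is no worse in every objective and better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the points not dominated by any other point."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

points = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0), (2.5, 2.5)]
print(pareto_front(points))  # (3.0, 4.0) is dominated by (2.0, 3.0)
```

NSGA-II then ranks successive fronts and uses crowding distance to keep the population spread along the trade-off surface.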

  8. Conception of a course for professional training and education in the field of computer and mobile forensics: Part II: Android Forensics

    Science.gov (United States)

    Kröger, Knut; Creutzburg, Reiner

    2013-03-01

The growth of Android in the mobile sector and the interest in investigating these devices from a forensic point of view have rapidly increased. Many companies have security problems with mobile devices in their own IT infrastructure. To respond to these incidents, it is important to have professionally trained staff. Furthermore, it is necessary to further train existing employees in the practical applications of mobile forensics, owing to the fact that many companies are entrusted with very sensitive data. Inspired by these facts, this paper - a continuation of a paper of January 2012 [1] which showed the conception of a course for professional training and education in the field of computer and mobile forensics - addresses training approaches and practical exercises to investigate Android mobile devices.

  9. A computer model allowing maintenance of large amounts of genetic variability in Mendelian populations. II. The balance of forces between linkage and random assortment.

    Science.gov (United States)

    Wills, C; Miller, C

    1976-02-01

It is shown, through theory and computer simulations of outbreeding Mendelian populations, that there may be conditions under which a balance is struck between two factors. The first is the advantage of random assortment, which will, when multilocus selection is for intermediate equilibrium values, lead to higher average heterozygosity than when linkage is introduced. There is some indication that random assortment is also advantageous when selection is toward a uniform distribution of equilibrium values. The second factor is the advantage of linkage between loci having positive epistatic interactions. When multilocus selection is for a bimodal distribution of equilibrium values, an early advantage of random assortment is replaced by a later disadvantage. Linkage disequilibrium, which in finite populations is increased only by random or selective sampling, may hinder the movement of alleles to their selective equilibria, thus leading to the advantage of random assortment. Some consequences of this approach to the structure of natural populations are discussed.

  10. Computed tomography of the abdomen of calves during the first 105 days of life: II. Liver, spleen, and small and large intestines.

    Science.gov (United States)

    Braun, U; Schnetzler, C; Augsburger, H; Müller, U; Dicht, S; Ohlerth, S

    2014-05-01

    Computed tomography (CT) findings of the liver, spleen and intestines of five healthy calves during six examinations in the first 105 days of life were compared with corresponding cadaver slices. The liver was located in the right hemiabdomen adjacent to the diaphragm and right abdominal wall. The caudal vena cava was seen dorsomedially and the portal vein further ventrally. The umbilical vein was seen running from the navel to the liver in all calves in the first scan and in four calves in the second scan. The spleen ran dorsoventrally adjacent to the costal part of the left abdominal wall and appeared sickle-shaped on transverse images. Differentiation of small and large intestines was only possible when the former contained fluid content and the latter gaseous content. The small intestine was in the left hemiabdomen dorsal to the abomasum and caudodorsal to the rumen at the first two examinations. Growth of the forestomachs caused displacement of the small intestine to the right and toward the ventral abdomen caudal to the liver and adjacent to the right abdominal wall. The large intestine was located caudodorsally, and the typical features of the spiral colon were apparent in the dorsal plane. The location of the caecum varied from dorsal to the spiral colon to adjacent to the right abdominal wall with the apex always pointing caudally. The rectum was easily identified in the pelvic region. The size, volume and density of the described organs throughout the study are shown in several tables.

  11. A comparative evaluation of Cone Beam Computed Tomography (CBCT) and Multi-Slice CT (MSCT). Part II: On 3D model accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Liang Xin, E-mail: Xin.Liang@med.kuleuven.b [Oral Imaging Centre, School of Dentistry, Oral Pathology and Maxillofacial Surgery, Faculty of Medicine, Catholic University of Leuven (Belgium); College of Stomatology, Dalian Medical University (China); Lambrichts, Ivo, E-mail: Ivo.Lambrichts@uhasselt.b [Department of Basic Medical Sciences, Histology and Electron Microscopy, Faculty of Medicine, University of Hasselt, Diepenbeek (Belgium); Sun Yi, E-mail: Sunyihello@hotmail.co [Oral Imaging Centre, School of Dentistry, Oral Pathology and Maxillofacial Surgery, Faculty of Medicine, Catholic University of Leuven (Belgium); Denis, Kathleen, E-mail: kathleen.denis@groept.b [Department of Industrial Sciences and Techology-Engineering (IWT), XIOS Hogeschool Limburg, Hasselt (Belgium); Hassan, Bassam, E-mail: b.hassan@acta.n [Department of Oral Radiology, Academic Centre for Dentistry Amsterdam (ACTA), Amsterdam (Netherlands); Li Limin, E-mail: Limin.Li@uz.kuleuven.b [Department of Paediatric Dentistry and Special Dental Care, School of Dentistry, Oral Pathology and Maxillofacial Surgery, Faculty of Medicine, Catholic University of Leuven (Belgium); Pauwels, Ruben, E-mail: Ruben.Pauwels@med.kuleuven.b [Oral Imaging Centre, School of Dentistry, Oral Pathology and Maxillofacial Surgery, Faculty of Medicine, Catholic University of Leuven (Belgium); Jacobs, Reinhilde, E-mail: Reinhilde.Jacobs@uz.kuleuven.b [Oral Imaging Centre, School of Dentistry, Oral Pathology and Maxillofacial Surgery, Faculty of Medicine, Catholic University of Leuven (Belgium)

    2010-08-15

Aim: The study aim was to compare the geometric accuracy of three-dimensional (3D) surface model reconstructions between five Cone Beam Computed Tomography (CBCT) scanners and one Multi-Slice CT (MSCT) system. Materials and methods: A dry human mandible was scanned with five CBCT systems (NewTom 3G, Accuitomo 3D, i-CAT, Galileos, Scanora 3D) and one MSCT scanner (Somatom Sensation 16). A 3D surface bone model was created from each of the six systems. The reference (gold standard) 3D model was obtained with a high resolution laser surface scanner. The 3D models from the six systems were compared with the gold standard using a point-based rigid registration algorithm. Results: The mean deviation from the gold standard for MSCT was 0.137 mm, and for CBCT the deviations were 0.282, 0.225, 0.165, 0.386 and 0.206 mm for the i-CAT, Accuitomo, NewTom, Scanora and Galileos, respectively. Conclusion: The results show that the accuracy of CBCT 3D surface model reconstructions, while somewhat lower than that of MSCT, remains acceptable relative to the gold standard.
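The deviation metric reported above is essentially a mean point-to-point distance between a test surface and the reference after rigid alignment. The study used full point-based rigid registration (rotation plus translation); for brevity this sketch aligns centroids (translation) only, and all points are made-up:

```python
# Sketch of a surface-deviation metric: mean distance between corresponding
# points of a test model and a reference model after centroid alignment.
# The study used full rigid registration; this is a translation-only sketch.

import math

def centroid(pts):
    n = len(pts)
    return tuple(sum(p[i] for p in pts) / n for i in range(3))

def mean_deviation(test, reference):
    """Mean distance between corresponding points after centroid alignment."""
    ct, cr = centroid(test), centroid(reference)
    total = 0.0
    for p, q in zip(test, reference):
        shifted = tuple(p[i] - ct[i] + cr[i] for i in range(3))
        total += math.dist(shifted, q)
    return total / len(test)

ref = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
test = [(0.1, 0.1, 0.1), (1.1, 0.1, 0.1), (0.1, 1.1, 0.1), (0.1, 0.1, 1.1)]
# A pure translation offset is removed entirely by the alignment:
print(f"mean deviation = {mean_deviation(test, ref):.3f} mm")
```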

  12. Benefits and risks of fish consumption Part II. RIBEPEIX, a computer program to optimize the balance between the intake of omega-3 fatty acids and chemical contaminants.

    Science.gov (United States)

    Domingo, José L; Bocio, Ana; Martí-Cid, Roser; Llobet, Juan M

    2007-02-12

In recent years, and based on the importance of fish as a part of a healthy diet, there has been a notable promotion of fish and seafood consumption. However, a number of recent studies have shown that fish may be a potential source of exposure to chemical pollutants, some of them with well known adverse effects on human health. Recently, we determined in 14 edible marine species the concentrations of eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA), as well as those of a number of chemical contaminants: Cd, Hg, Pb, polychlorinated dibenzo-p-dioxins and furans, polychlorinated biphenyls, hexachlorobenzene, polycyclic aromatic hydrocarbons, polychlorinated naphthalenes, polybrominated diphenylethers and polychlorinated diphenylethers. To quantitatively establish the intake of these pollutants (risks) versus that of EPA+DHA (benefits), we designed a simple computer program, RIBEPEIX. The concentrations of EPA, DHA, and the chemical pollutants were introduced into the program. Here we present how RIBEPEIX may be used as an easy tool to optimize fish consumption: most suitable species, frequency of consumption, and size of meals. RIBEPEIX can be useful not only for professionals (cardiologists, general physicians, nutritionists, toxicologists, etc.), but also for the general population. It is available at: .
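The benefit-risk bookkeeping a tool like RIBEPEIX performs reduces to intake arithmetic: nutrient and contaminant intakes scale with concentration, meal size and meal frequency, and contaminant intakes are compared against a tolerable weekly intake (TWI). All numbers below are illustrative, not data from the study:

```python
# Sketch of weekly benefit (EPA+DHA) vs risk (contaminant) intake arithmetic.
# Concentrations, TWI and body weight are hypothetical examples.

def weekly_intake(conc_per_g: float, meal_g: float, meals_per_week: float) -> float:
    """Intake per week = concentration x meal size x meals per week."""
    return conc_per_g * meal_g * meals_per_week

# Hypothetical fish species: EPA+DHA in mg/g, methylmercury in ng/g wet weight.
epa_dha = weekly_intake(conc_per_g=9.0, meal_g=100.0, meals_per_week=3)   # mg/week
mehg = weekly_intake(conc_per_g=120.0, meal_g=100.0, meals_per_week=3)    # ng/week

mehg_twi_ng = 1.3e3 * 70  # illustrative TWI of 1.3 ug/kg bw/week for a 70 kg adult
print(f"EPA+DHA: {epa_dha:.0f} mg/week; MeHg at {100 * mehg / mehg_twi_ng:.0f}% of TWI")
```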

  13. Transverse effects of rapid maxillary expansion in Class II malocclusion patients: a Cone-Beam Computed Tomography study

    Directory of Open Access Journals (Sweden)

    Carolina Baratieri

    2010-10-01

Full Text Available OBJECTIVE: The aim of this study was to evaluate by Cone-Beam Computed Tomography (CBCT) the transverse responses, immediately and after the retention period, to rapid maxillary expansion (RME) in Class II malocclusion patients. METHODS: Seventeen children (mean initial age of 10.36 years), with Class II malocclusion and skeletally constricted maxilla, underwent Haas' protocol for RME. CBCT scans were taken before treatment (T1), at the end of the active expansion phase (T2) and after the retention period of six months (T3). The scans were managed in Dolphin software, where landmarks were marked and measured on a coronal slice passing through the upper first molar. The paired Student's t-test was used to identify significant differences (p<0.05) between T2 and T1, T3 and T2, and T3 and T1. RESULTS: Immediately after RME, the mean increase in maxillary basal, alveolar and dental width was 1.95 mm, 4.30 mm and 6.89 mm, respectively. This was accompanied by buccal inclination of the right (7.31°) and left (6.46°) first molars. At the end of the retention period, the entire transverse dimension increase was maintained and the dentoalveolar inclination resumed
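The paired Student's t-test used above compares within-patient differences between time points: the statistic is the mean difference divided by its standard error. A self-contained sketch with made-up widths in mm (not study data):

```python
# Sketch of the paired t-test applied to repeated measurements (e.g. T1 vs T2).
# The t statistic is then compared against the t distribution with n-1 df.

import math

def paired_t(before, after):
    """Return (t statistic, degrees of freedom) for paired samples."""
    d = [b - a for a, b in zip(before, after)]
    n = len(d)
    mean = sum(d) / n
    var = sum((x - mean) ** 2 for x in d) / (n - 1)  # sample variance of differences
    return mean / math.sqrt(var / n), n - 1

t1 = [30.1, 31.4, 29.8, 30.9, 31.0]  # hypothetical widths at T1, mm
t2 = [34.2, 35.0, 33.9, 34.8, 35.1]  # hypothetical widths at T2, mm
t, df = paired_t(t1, t2)
print(f"t = {t:.2f} with {df} degrees of freedom")
```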

  14. User manual for AQUASTOR: a computer model for cost analysis of aquifer thermal-energy storage coupled with district-heating or cooling systems. Volume II. Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Huber, H.D.; Brown, D.R.; Reilly, R.W.

    1982-04-01

A computer model called AQUASTOR was developed for calculating the cost of district heating (cooling) using thermal energy supplied by an aquifer thermal energy storage (ATES) system. The AQUASTOR model can simulate ATES district heating systems using stored hot water or ATES district cooling systems using stored chilled water. AQUASTOR simulates the complete ATES district heating (cooling) system, which consists of two principal parts: the ATES supply system and the district heating (cooling) distribution system. The supply system submodel calculates the life-cycle cost of thermal energy supplied to the distribution system by simulating the technical design and cash flows for the exploration, development, and operation of the ATES supply system. The distribution system submodel calculates the life-cycle cost of heat (chill) delivered by the distribution system to the end-users by simulating the technical design and cash flows for the construction and operation of the distribution system. The model combines the technical characteristics of the supply system and the technical characteristics of the distribution system with financial and tax conditions for the entities operating the two systems into one techno-economic model. This provides the flexibility to individually or collectively evaluate the impact of different economic and technical parameters, assumptions, and uncertainties on the cost of providing district heating (cooling) with an ATES system. This volume contains all the appendices, including supply and distribution system cost equations and models, descriptions of predefined residential districts, key equations for the cooling degree-hour methodology, a listing of the sample case output, and Appendix H, which contains the indices for supply input parameters, distribution input parameters, and AQUASTOR subroutines.
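The life-cycle cost idea behind such models can be sketched as a levelized cost: discounted lifetime costs divided by discounted lifetime energy delivered. All inputs below are illustrative, not AQUASTOR data:

```python
# Sketch of a levelized (life-cycle) cost of delivered heat: present value of
# all costs divided by present value of all delivered energy. Inputs are
# hypothetical round numbers, not AQUASTOR parameters.

def levelized_cost(capital, annual_om, annual_energy_gj, years, discount_rate):
    """Cost per GJ = PV of capital and O&M costs / PV of delivered energy."""
    pv_cost = capital
    pv_energy = 0.0
    for t in range(1, years + 1):
        f = (1 + discount_rate) ** -t   # discount factor for year t
        pv_cost += annual_om * f
        pv_energy += annual_energy_gj * f
    return pv_cost / pv_energy

cost = levelized_cost(capital=2.0e6, annual_om=5.0e4,
                      annual_energy_gj=4.0e4, years=20, discount_rate=0.05)
print(f"levelized cost = {cost:.2f} $/GJ")
```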

  15. Unified QSAR & network-based computational chemistry approach to antimicrobials. II. Multiple distance and triadic census analysis of antiparasitic drugs complex networks.

    Science.gov (United States)

    Prado-Prado, Francisco J; Ubeira, Florencio M; Borges, Fernanda; González-Díaz, Humberto

    2010-01-15

In the previous work, we reported a multitarget Quantitative Structure-Activity Relationship (mt-QSAR) model to predict drug activity against different fungal species. This mt-QSAR allowed us to construct a drug-drug multispecies Complex Network (msCN) to investigate drug-drug similarity (González-Díaz and Prado-Prado, J Comput Chem 2008, 29, 656). However, important methodological points remained unclear, such as follows: (1) the accuracy of the methods when applied to other problems; (2) the effect of the distance type used to construct the msCN; (3) how to perform the inverse procedure to study species-species similarity with multidrug resistance CNs (mdrCN); and (4) the implications and necessary steps to perform a substructural Triadic Census Analysis (TCA) of the msCN. To continue the present series with another important problem, we developed here a mt-QSAR model for more than 700 drugs tested in the literature against different parasites (predicting antiparasitic drugs). The data were processed by Linear Discriminant Analysis (LDA) and the model correctly classifies 93.62% (1160 out of 1239 cases) in training. The model validation was carried out by means of external predicting series; the model classified 573 out of 607, that is, 94.4% of cases. Next, we carried out the first comparative study of the topology of six different drug-drug msCNs based on six different distances such as Euclidean, Chebyshev, Manhattan, etc. Furthermore, we compared the selected drug-drug msCN and species-species mdsCN with random networks. We also introduced here the inverse methodology to construct species-species msCN based on a mt-QSAR model. Last, we reported the first substructural analysis of drug-drug msCN using the Triadic Census Analysis (TCA) algorithm. Copyright 2009 Wiley Periodicals, Inc.
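The effect of the distance type on the resulting network can be illustrated directly: each metric yields different pairwise distances over predicted-activity vectors, and an edge is drawn when the distance falls below a cutoff. Drug names, vectors and the cutoff below are all hypothetical:

```python
# Sketch of building a drug-drug similarity network under different distances
# (Euclidean, Manhattan, Chebyshev). Activity vectors are made-up examples.

def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def manhattan(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))

def chebyshev(a, b):
    return max(abs(x - y) for x, y in zip(a, b))

def edges(vectors, dist, cutoff):
    """Edge list: connect drugs whose activity profiles are closer than cutoff."""
    names = sorted(vectors)
    return [(u, v) for i, u in enumerate(names) for v in names[i + 1:]
            if dist(vectors[u], vectors[v]) < cutoff]

drugs = {"A": (0.9, 0.1, 0.8), "B": (0.8, 0.2, 0.7), "C": (0.1, 0.9, 0.2)}
for d in (euclidean, manhattan, chebyshev):
    print(d.__name__, edges(drugs, d, cutoff=0.5))
```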

  16. Visualizing Infrared (IR) Spectroscopy with Computer Animation

    Science.gov (United States)

    Abrams, Charles B.; Fine, Leonard W.

    1996-01-01

    IR Tutor, an interactive, animated infrared (IR) spectroscopy tutorial has been developed for Macintosh and IBM-compatible computers. Using unique color animation, complicated vibrational modes can be introduced to beginning students. Rules governing the appearance of IR absorption bands become obvious because the vibrational modes can be visualized. Each peak in the IR spectrum is highlighted, and the animation of the corresponding normal mode can be shown. Students can study each spectrum stepwise, or click on any individual peak to see its assignment. Important regions of each spectrum can be expanded and spectra can be overlaid for comparison. An introduction to the theory of IR spectroscopy is included, making the program a complete instructional package. Our own success in using this software for teaching and research in both academic and industrial environments will be described. IR Tutor consists of three sections: (1) The 'Introduction' is a review of basic principles of spectroscopy. (2) 'Theory' begins with the classical model of a simple diatomic molecule and is expanded to include larger molecules by introducing normal modes and group frequencies. (3) 'Interpretation' is the heart of the tutorial. Thirteen IR spectra are analyzed in detail, covering the most important functional groups. This section features color animation of each normal mode, full interactivity, overlay of related spectra, and expansion of important regions. This section can also be used as a reference.
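The classical diatomic model introduced in the 'Theory' section predicts the vibrational wavenumber from a force constant and reduced mass. A sketch using textbook-style values for CO (the force constant here is an illustrative literature-style value, not from the tutorial):

```python
# Sketch of the classical diatomic oscillator: wavenumber = (1 / 2 pi c) sqrt(k / mu),
# evaluated for CO with an illustrative force constant of ~1857 N/m.

import math

C = 2.99792458e10  # speed of light, cm/s

def wavenumber_cm1(k_n_per_m: float, m1_amu: float, m2_amu: float) -> float:
    """Harmonic vibrational wavenumber in cm^-1 from force constant and masses."""
    amu = 1.66053906660e-27  # kg per atomic mass unit
    mu = m1_amu * m2_amu / (m1_amu + m2_amu) * amu  # reduced mass, kg
    return math.sqrt(k_n_per_m / mu) / (2 * math.pi * C)

print(f"CO stretch = {wavenumber_cm1(1857.0, 12.0, 15.995):.0f} cm^-1")
```

The result lands near the familiar CO stretching band around 2143 cm⁻¹, which is the kind of group-frequency reasoning the tutorial animates.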

  17. Influence of the solvent on the self-assembly of a modified amyloid beta peptide fragment. II. NMR and computer simulation investigation.

    Science.gov (United States)

    Hamley, I W; Nutt, D R; Brown, G D; Miravet, J F; Escuder, B; Rodríguez-Llansola, F

    2010-01-21

    The conformation of a model peptide AAKLVFF based on a fragment of the amyloid beta peptide Abeta16-20, KLVFF, is investigated in methanol and water via solution NMR experiments and molecular dynamics computer simulations. In previous work, we have shown that AAKLVFF forms peptide nanotubes in methanol and twisted fibrils in water. Chemical shift measurements were used to investigate the solubility of the peptide as a function of concentration in methanol and water. This enabled the determination of critical aggregation concentrations. The solubility was lower in water. In dilute solution, diffusion coefficients revealed the presence of intermediate aggregates in concentrated solution, coexisting with NMR-silent larger aggregates, presumed to be beta-sheets. In water, diffusion coefficients did not change appreciably with concentration, indicating the presence mainly of monomers, coexisting with larger aggregates in more concentrated solution. Concentration-dependent chemical shift measurements indicated a folded conformation for the monomers/intermediate aggregates in dilute methanol, with unfolding at higher concentration. In water, an antiparallel arrangement of strands was indicated by certain ROESY peak correlations. The temperature-dependent solubility of AAKLVFF in methanol was well described by a van't Hoff analysis, providing a solubilization enthalpy and entropy. This pointed to the importance of solvophobic interactions in the self-assembly process. Molecular dynamics simulations constrained by NOE values from NMR suggested disordered reverse turn structures for the monomer, with an antiparallel twisted conformation for dimers. To model the beta-sheet structures formed at higher concentration, possible model arrangements of strands into beta-sheets with parallel and antiparallel configurations and different stacking sequences were used as the basis for MD simulations; two particular arrangements of antiparallel beta-sheets were found to be stable, one
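The van't Hoff analysis mentioned above fits ln(solubility) against 1/T; the slope gives -ΔH/R and the intercept ΔS/R. A sketch on synthetic data generated to obey a chosen enthalpy and entropy (not the peptide's actual values):

```python
# Sketch of a van't Hoff fit: least-squares line through ln(s) vs 1/T recovers
# the solubilization enthalpy and entropy. Data are synthetic, not from AAKLVFF.

import math

R = 8.314  # gas constant, J/(mol K)

def vant_hoff(temps_k, solubilities):
    """Return (dH, dS) in J/mol and J/(mol K) from a ln(s) vs 1/T fit."""
    xs = [1.0 / t for t in temps_k]
    ys = [math.log(s) for s in solubilities]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return -slope * R, intercept * R

# Synthetic solubilities obeying dH = +20 kJ/mol, dS = +50 J/(mol K):
temps = [283.0, 293.0, 303.0, 313.0]
sols = [math.exp(-20000.0 / (R * t) + 50.0 / R) for t in temps]
dH, dS = vant_hoff(temps, sols)
print(f"dH = {dH / 1000:.1f} kJ/mol, dS = {dS:.1f} J/(mol K)")
```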

  18. Parallelism in matrix computations

    CERN Document Server

    Gallopoulos, Efstratios; Sameh, Ahmed H

    2016-01-01

This book is primarily intended as a research monograph that could also be used in graduate courses for the design of parallel algorithms in matrix computations. It assumes general but not extensive knowledge of numerical linear algebra, parallel architectures, and parallel programming paradigms. The book consists of four parts: (I) Basics; (II) Dense and Special Matrix Computations; (III) Sparse Matrix Computations; and (IV) Matrix functions and characteristics. Part I deals with parallel programming paradigms and fundamental kernels, including reordering schemes for sparse matrices. Part II is devoted to dense matrix computations such as parallel algorithms for solving linear systems, linear least squares, the symmetric algebraic eigenvalue problem, and the singular-value decomposition. It also deals with the development of parallel algorithms for special linear systems such as banded, Vandermonde, Toeplitz, and block Toeplitz systems. Part III addresses sparse matrix computations: (a) the development of pa...
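The special-structure solvers covered in Part II exploit matrix shape to beat dense complexity. A classic sequential example is the Thomas algorithm for tridiagonal (banded) systems, O(n) instead of O(n³); the parallel variants in such books (e.g. cyclic reduction) build on the same structure:

```python
# Sketch of the Thomas algorithm for a tridiagonal system A x = d, where
# a, b, c are the sub-, main and super-diagonals (a[0] and c[-1] unused).

def thomas(a, b, c, d):
    """Solve a tridiagonal system in O(n) by forward sweep + back substitution."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Discrete Laplacian-style system with known solution [1, 2, 3, 4]:
a = [0.0, -1.0, -1.0, -1.0]
b = [2.0, 2.0, 2.0, 2.0]
c = [-1.0, -1.0, -1.0, 0.0]
d = [0.0, 0.0, 0.0, 5.0]
print(thomas(a, b, c, d))
```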

  19. Synthesis, structural characterization, antibacterial activity and computational studies of new cobalt (II) complexes with 1,1,3,3-tetrakis (3,5-dimethyl-1-pyrazolyl)propane ligand

    Science.gov (United States)

    Beheshti, Azizolla; Safaeiyan, Forough; Hashemi, Faeze; Motamedi, Hossein; Mayer, Peter; Bruno, Giuseppe; Rudbari, Hadi Amiri

    2016-11-01

Two new mono- and dinuclear Co(II) complexes, namely [Co(tdmpp)Cl2]2·H2O (1) and [Co2(tdmpp)Cl4] (2) (where tdmpp = 1,1,3,3-tetrakis(3,5-dimethyl-1-pyrazolyl)propane), were prepared by one-pot reactions in methanol as a solvent. These compounds have been characterized by single crystal X-ray diffraction, elemental analysis, infrared spectroscopy, antibacterial activity and computational studies. In both complexes, the Co(II) atom is tetrahedrally coordinated by two N atoms from one of the chelating bidentate bis(3,5-dimethylpyrazolyl)methane units of the tdmpp ligand and two Cl as terminal ligands. In these structures, the neighboring [Co(tdmpp)Cl2]2·H2O (1) and [Co2(tdmpp)Cl4] (2) molecules are joined together by intermolecular C–H⋯Cl hydrogen bonds to form a 1D chain structure. As a consequence of the intermolecular C–H⋯π interactions, these chains are further linked to generate a two-dimensional non-covalently bonded structure. The in vitro antibacterial activity studies of the free tdmpp ligand and compounds 1 and 2 show that the ability of these compounds to inhibit growth of the tested bacteria increases progressively from tdmpp to the dinuclear complex 2. Molecular-docking investigations between five standard antibiotics, the free tdmpp ligand, the title complexes and five biological macromolecule enzymes (receptors) were carried out using the AutoDock Vina scoring function. The results of the docking studies confirmed that the metal complexes are more active than the free ligand. This is consistent with the results obtained by the antibacterial activities of these compounds.

  20. Comparison of the effects of the HC video laryngoscope and the conventional Macintosh laryngoscope for guiding tracheal intubation under anesthesia

    Institute of Scientific and Technical Information of China (English)

    向璟

    2014-01-01

Objective: To compare the effectiveness of the HC video laryngoscope and the conventional Macintosh laryngoscope for guiding tracheal intubation. Methods: One hundred patients with simulated cervical spine immobilization were randomly divided into an observation group and a control group of 50 patients each. The control group was intubated with a conventional Macintosh laryngoscope and the observation group with the HC video laryngoscope. Intubation time, ease-of-exposure score, cardiovascular response during intubation, glottic exposure grade and intubation failure rate were recorded and compared between the two methods. Results: Intubation time in the observation group was significantly shorter than in the control group [(22.1 ± 8.5) s vs (55.3 ± 9.0) s], and ease of exposure and glottic exposure grading were also significantly better (P<0.05). Conclusions: For patients with cervical spine immobilization, the HC video laryngoscope guides tracheal intubation better than the Macintosh laryngoscope, improving intubation efficiency, success rate and glottic exposure, with few cardiovascular reactions.

  1. Graphics gems II

    CERN Document Server

    Arvo, James

    1991-01-01

Graphics Gems II is a collection of articles shared by a diverse group of people that reflect ideas and approaches in graphics programming which can benefit other computer graphics programmers. This volume presents techniques for doing well-known graphics operations faster or easier. The book contains chapters devoted to topics on two-dimensional and three-dimensional geometry and algorithms, image processing, frame buffer techniques, and ray tracing techniques. The radiosity approach, matrix techniques, and numerical and programming techniques are likewise discussed. Graphics artists and comput

  2. Potentiometric study of atenolol as hypertension drug with Co(II), Ni(II), Cu(II) and Zn(II) transition metal ions in aqueous solution

    Directory of Open Access Journals (Sweden)

    Abdulbaset A. Zaid

    2015-01-01

Full Text Available Binary and ternary complexes of Co(II), Ni(II), Cu(II) and Zn(II) with atenolol as hypertension drug and glycine have been determined pH-metrically at room temperature and 0.01 M ionic strength (NaClO4) in aqueous solution. The formation of various possible species has been evaluated by a computer program and discussed in terms of various relative stability parameters.

  3. Chemical speciation of Pb(II), Cd(II), Hg(II), Co(II), Ni(II), Cu(II) and Zn(II) binary complexes of L-methionine in 1,2-propanediol-water mixtures

    Directory of Open Access Journals (Sweden)

    M. Padma Latha

    2007-04-01

Full Text Available Chemical speciation of Pb(II), Cd(II), Hg(II), Co(II), Ni(II), Cu(II) and Zn(II) complexes of L-methionine in 0.0-60% v/v 1,2-propanediol-water mixtures maintaining an ionic strength of 0.16 M at 303 K has been studied pH-metrically. The active forms of the ligand are LH2+, LH and L-. The predominant species detected are ML, MLH, ML2, ML2H, ML2H2 and MLOH. Models containing different numbers of species were refined by using the computer program MINIQUAD 75. The best-fit chemical models were arrived at based on statistical parameters. The trend in variation of complex stability constants with change in the dielectric constant of the medium is explained on the basis of electrostatic and non-electrostatic forces.
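Refined stability constants translate directly into a species distribution. For M + L ⇌ ML (β₁) and M + 2L ⇌ ML₂ (β₂), the fraction of metal in each species follows from the free-ligand concentration; the log β values below are illustrative, not refined MINIQUAD results:

```python
# Sketch of a metal speciation diagram: fractions of M, ML and ML2 as a
# function of free ligand concentration, given overall formation constants.
# log(beta) values are hypothetical placeholders.

def species_fractions(log_beta1: float, log_beta2: float, free_l: float):
    """Return fractions (M, ML, ML2) of total metal at a given free [L]."""
    b1, b2 = 10.0 ** log_beta1, 10.0 ** log_beta2
    denom = 1.0 + b1 * free_l + b2 * free_l ** 2
    return 1.0 / denom, b1 * free_l / denom, b2 * free_l ** 2 / denom

for free_l in (1e-6, 1e-4, 1e-2):
    fm, fml, fml2 = species_fractions(log_beta1=4.0, log_beta2=7.0, free_l=free_l)
    print(f"[L] = {free_l:.0e}: M {fm:.2f}, ML {fml:.2f}, ML2 {fml2:.2f}")
```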

  4. Pb II

    African Journals Online (AJOL)

    Windows User

    ISSN 1684–5315 ©2012 Academic Journals ... Exposure to Pb above permissible limit (50 ppb in water) .... taken and analyzed for residual metal concentration determination. ..... loss in Pb(II) sorption capacity up to five cycles of reuse of.

  5. Container II

    OpenAIRE

    Baraklianou, Stella

    2016-01-01

Container II, self-published artists' book. The book was made on the occasion of the artist's residency at the Banff Arts Centre, in Alberta, Canada. Container II is a performative piece; it worked in conjunction with the photographic installation "Stage Set: Cool Tone" (photographic floor installation, reclaimed wood, frames, 130x145 cm, 2016). The photographic installation was also part of the artist's residency titled "New Materiality" at the Banff Arts Centre. Limited E...

  6. High-risk Plaque Detected on Coronary Computed Tomography Angiography Predicts Acute Coronary Syndrome Independent of Significant Stenosis in Patients with Acute Chest Pain – Results from ROMICAT II Trial

    Science.gov (United States)

    Puchner, Stefan B.; Liu, Ting; Mayrhofer, Thomas; Truong, Quynh A.; Lee, Hang; Fleg, Jerome L.; Nagurney, John T.; Udelson, James E.; Hoffmann, Udo; Ferencik, Maros

    2014-01-01

Background: To determine whether high-risk plaque as detected by coronary computed tomography angiography (CTA) permits improved early diagnosis of acute coronary syndrome (ACS) independent of the presence of significant CAD in acute chest pain patients. Objectives: The primary aim was to determine whether high-risk plaque features, as detected by CTA in the emergency department, may improve diagnostic certainty of ACS independent and incremental to the presence of significant CAD and clinical risk assessment in patients with acute chest pain but without objective evidence of myocardial ischemia or myocardial infarction. Methods: We included patients randomized to the CCTA arm of the ROMICAT II trial. Readers assessed coronary CTA qualitatively for the presence of non-obstructive CAD (1-49% stenosis), significant CAD (≥50% or ≥70% stenosis), and the presence of at least 1 of the high-risk plaque features (positive remodeling, low CT attenuation plaque, napkin-ring sign, spotty calcium). Results: Among the patients with coronary CTA of diagnostic image quality (mean age 53.9±8.0 years, 52.8% men), 37 had ACS (7.8%; MI n=5, UAP n=32). CAD was present in 262 (55.5%) patients [non-obstructive CAD 217 (46.0%) patients, significant CAD with ≥50% stenosis 45 (9.5%) patients]. High-risk plaques were more frequent in patients with ACS and remained a significant predictor of ACS (OR 8.9, 95% CI 1.8-43.3, p=0.006) after adjusting for ≥50% stenosis (OR 38.6, 95% CI 14.2-104.7, p<0.001). Conclusions: In patients presenting to the ED with acute chest pain but negative initial electrocardiogram and troponin, the presence of high-risk plaque on coronary CTA increases the likelihood of ACS independent of significant CAD and clinical risk assessment (age, gender, and number of cardiovascular risk factors). PMID:25125300
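The odds-ratio arithmetic behind results like "OR 8.9, 95% CI 1.8-43.3" is a 2×2 table with a log-odds (Woolf) confidence interval. The counts below are made-up, not ROMICAT II data:

```python
# Sketch of an odds ratio with a Woolf (log-odds) 95% confidence interval
# from a 2x2 table: exposure = high-risk plaque, outcome = ACS. Counts are
# illustrative, not trial data.

import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a = exposed with outcome, b = exposed without, c = unexposed with, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(a=20, b=80, c=5, d=195)
print(f"OR = {or_:.1f}, 95% CI {lo:.1f}-{hi:.1f}")
```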

  7. Computer Series, 38.

    Science.gov (United States)

    Moore, John W., Ed.

    1983-01-01

Discusses numerical solution of the one-dimensional Schrodinger equation. A PASCAL computer program for the Apple II which performs the calculations is available from the authors. Also discusses quantization and perturbation theory using microcomputers, indicating benefits of using the addition of a perturbation term to harmonic oscillator as an…
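The numerical approach the column describes can be sketched (here in Python rather than the column's PASCAL) with a shooting method: integrate the time-independent equation -ψ'' + V(x)ψ = Eψ from the left and bisect on E until ψ vanishes at the right boundary. For V(x) = x² in these scaled units the exact eigenvalues are 1, 3, 5, …:

```python
# Shooting-method sketch for the 1D Schrodinger equation with V(x) = x^2
# (scaled units): bisect on E until the integrated psi vanishes at x_max.

def shoot(E, V, x_min=-6.0, x_max=6.0, n=2400):
    """Integrate psi'' = (V(x) - E) psi from the left; return psi at x_max."""
    h = (x_max - x_min) / n
    psi_prev, psi = 0.0, 1e-6   # psi(x_min) = 0, tiny positive seed
    x = x_min + h
    for _ in range(n - 1):
        psi_next = 2 * psi - psi_prev + h * h * (V(x) - E) * psi
        psi_prev, psi = psi, psi_next
        x += h
    return psi

def ground_state(V, lo=0.0, hi=2.0, iters=60):
    """Bisect on E: the boundary value changes sign as E crosses an eigenvalue."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if shoot(lo, V) * shoot(mid, V) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

E0 = ground_state(V=lambda x: x * x)
print(f"ground-state energy = {E0:.4f} (exact: 1)")
```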

  8. TBscore II

    DEFF Research Database (Denmark)

    Rudolf, Frauke; Lemvik, Grethe; Abate, Ebba;

    2013-01-01

Abstract Background: The TBscore, based on simple signs and symptoms, was introduced to predict unsuccessful outcome in tuberculosis patients on treatment. A recent inter-observer variation study showed profound variation in some variables. Further, some variables depend on a physician assessing them, making the score less applicable. The aim of the present study was to simplify the TBscore. Methods: Inter-observer variation assessment and exploratory factor analysis were combined to develop a simplified score, the TBscore II. To validate TBscore II we assessed the association between start...

  9. Office 2011 for Macintosh The Missing Manual

    CERN Document Server

    Grover, Chris

    2010-01-01

    Office 2011 for Mac is easy to use, but to unleash its full power, you need to go beyond the basics. This entertaining guide not only gets you started with Word, Excel, PowerPoint, and the new Outlook for Mac, it also reveals lots of useful things you didn't know the software could do. Get crystal-clear explanations on the features you use most -- and plenty of power-user tips when you're ready for more. Take advantage of new tools. Navigate with the Ribbon, use SmartArt graphics, and work online with Office Web Apps.Create professional-looking documents. Use Word to craft beautiful reports,

  10. Research on Security Issues and Protection Measures for FTP Servers on the IIS6 Platform

    Institute of Scientific and Technical Information of China (English)

    安晓瑞

    2012-01-01

    IIS6 is the Web server component of Windows Server. It is commonly used to build Web sites, and only rarely to build FTP sites offering upload and download services. After a period of study, however, the FTP server bundled with IIS6 proves no less capable than dedicated FTP server software; moreover, because its user management is integrated with Windows user management, it integrates better with Windows and can be made more secure. This article analyzes in detail the security issues encountered when building an FTP server with IIS6 and the corresponding protection measures.

  11. Computer Reader for the Blind

    Science.gov (United States)

    1990-01-01

    Optacon II uses the same basic technique of converting printed information into a tactile image as did the Optacon. Optacon II can also be connected directly to a personal computer, which opens up a new range of job opportunities for the blind. Optacon II is not limited to reading printed words; it can convert any graphic image viewed by the camera. Optacon II demands extensive training for blind operators. TSI provides 60-hour training courses at its Mountain View headquarters and at training centers around the world. TeleSensory discontinued production of the Optacon as of December 1996.

  12. Improved selectivity for Pb(II) by sulfur, selenium and tellurium analogues of 1,8-anthraquinone-18-crown-5: synthesis, spectroscopy, X-ray crystallography and computational studies.

    Science.gov (United States)

    Mariappan, Kadarkaraisamy; Alaparthi, Madhubabu; Hoffman, Mariah; Rama, Myriam Alcantar; Balasubramanian, Vinothini; John, Danielle M; Sykes, Andrew G

    2015-07-14

    We report here a series of heteroatom-substituted macrocycles containing an anthraquinone moiety as a fluorescent signaling unit and a cyclic polyheteroether chain as the receptor. Sulfur, selenium, and tellurium derivatives of 1,8-anthraquinone-18-crown-5 (1) were synthesized by reacting sodium sulfide (Na2S), sodium selenide (Na2Se) and sodium telluride (Na2Te) with 1,8-bis(2-bromoethylethyleneoxy)anthracene-9,10-dione in a 1 : 1 ratio. The optical properties of the new compounds are examined and the sulfur and selenium analogues produce an intense green emission enhancement upon association with Pb(II) in acetonitrile. Selectivity for Pb(II) is markedly improved as compared to the oxygen analogue 1 which was also competitive for Ca(II) ion. UV-Visible and luminescence titrations reveal that 2 and 3 form 1 : 1 complexes with Pb(II), confirmed by single-crystal X-ray studies where Pb(II) is complexed within the macrocycle through coordinate covalent bonds to neighboring carbonyl, ether and heteroether donor atoms. Cyclic voltammetry of 2-8 showed classical, irreversible oxidation potentials for sulfur, selenium and tellurium heteroethers in addition to two one-electron reductions for the anthraquinone carbonyl groups. DFT calculations were also conducted on 1, 2, 3, 6, 6 + Pb(II) and 6 + Mg(II) to determine the trend in energies of the HOMO and the LUMO levels along the series.
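For a 1:1 host-guest complex like those the titrations establish here, the bound fraction follows from the standard quadratic solution of the binding equilibrium. The Python sketch below uses illustrative concentrations and an illustrative association constant, not the paper's fitted values:

```python
# Bound fraction for a 1:1 host-guest equilibrium H + G <-> HG, the stoichiometry
# the UV-Vis/luminescence titrations establish for the Pb(II) complexes.
# Exact quadratic solution; h0, g0 and K below are illustrative only.
import math

def complex_conc(h0, g0, K):
    """Equilibrium [HG] given total host h0, total guest g0, association constant K."""
    s = h0 + g0 + 1.0 / K
    return (s - math.sqrt(s * s - 4.0 * h0 * g0)) / 2.0

h0 = 1e-5                        # macrocycle concentration, M (illustrative)
K = 1e5                          # association constant, M^-1 (illustrative)
for g0 in (0.5e-5, 1e-5, 2e-5):  # Pb(II) added
    print(f"{g0:.1e} M Pb(II): {complex_conc(h0, g0, K) / h0:.0%} of host bound")
```

Fitting K to a titration series is the inverse problem: adjust K until the computed bound fractions reproduce the observed emission enhancement.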

  13. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  14. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  15. A Nuclear Reactions Primer with Computers.

    Science.gov (United States)

    Calle, Carlos I.; Roach, Jennifer A.

    1987-01-01

    Described is a microcomputer software program NUCLEAR REACTIONS designed for college level students and in use at Sweet Briar College (Sweet Briar, VA). The program is written in Microsoft Basic Version 2.1 for the Apple Macintosh Microcomputer. It introduces two conservation principles: (1) conservation of charge; and (2) conservation of nucleon…
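The two conservation principles the program introduces are easy to mechanize. The sketch below (a hypothetical Python analogue, not the Microsoft Basic original) represents each species as an (A, Z) pair and checks that nucleon number and charge balance:

```python
# Balance check for nuclear reactions: total nucleon number A and total charge Z
# must match on both sides. Species are (A, Z) pairs; a Python analogue of the
# checks the BASIC program performs, not the original code.
def totals(side):
    return (sum(a for a, z in side), sum(z for a, z in side))

def balanced(reactants, products):
    """True if both conservation principles hold."""
    return totals(reactants) == totals(products)

N14, He4, O17, H1 = (14, 7), (4, 2), (17, 8), (1, 1)
print(balanced([N14, He4], [O17, H1]))   # Rutherford's 14N + 4He -> 17O + 1H: True
```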

  16. Felipe II

    Directory of Open Access Journals (Sweden)

    Carlos Restrepo Canal

    1962-04-01

    Full Text Available Como parte de la monumental Historia de España que bajo la prestante y acertadísima dirección de don Ramón Menéndez Pidal se comenzó a dar a la prensa desde 1954 por la Editorial Espasa Calpe S. A., aparecieron en 1958 dos tomos dedicados al reinado de Felipe II; aquella época en que el imperio español alcanzó su unidad peninsular juntamente con el dilatado poderío que le constituyó en la primera potencia de Europa.

  17. Project Final Report: HPC-Colony II

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Terry R [ORNL; Kale, Laxmikant V [University of Illinois, Urbana-Champaign; Moreira, Jose [IBM T. J. Watson Research Center

    2013-11-01

    This report recounts the HPC Colony II Project which was a computer science effort funded by DOE's Advanced Scientific Computing Research office. The project included researchers from ORNL, IBM, and the University of Illinois at Urbana-Champaign. The topic of the effort was adaptive system software for extreme scale parallel machines. A description of findings is included.

  18. Computer Virus

    Institute of Scientific and Technical Information of China (English)

    高振桥

    2002-01-01

    If you work with a computer, it is certain that you cannot avoid dealing with at least one computer virus. But how much do you know about it? Well, actually, a computer virus is not a biological one like those that cause illnesses in people. It is a kind of computer program

  19. Grid Computing

    Indian Academy of Sciences (India)

    2016-05-01

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers on demand. In this article, we describe the grid computing model and enumerate the major differences between grid and cloud computing.

  20. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  1. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design...... and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view......, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture....

  2. Natural Computing in Computational Finance Volume 4

    CERN Document Server

    O’Neill, Michael; Maringer, Dietmar

    2012-01-01

    This book follows on from Natural Computing in Computational Finance Volumes I, II and III. As in the previous volumes of this series, the book consists of a series of chapters, each of which was selected following a rigorous, peer-reviewed selection process. The chapters illustrate the application of a range of cutting-edge natural computing and agent-based methodologies in computational finance and economics. The applications explored include option model calibration, financial trend reversal detection, enhanced indexation, algorithmic trading, corporate payout determination and agent-based modeling of liquidity costs, and trade strategy adaptation. While describing cutting edge applications, the chapters are written so that they are accessible to a wide audience. Hence, they should be of interest to academics, students and practitioners in the fields of computational finance and economics.

  3. Computational chemistry

    OpenAIRE

    2000-01-01

    Computational chemistry has come of age. With significant strides in computer hardware and software over the last few decades, computational chemistry has achieved full partnership with theory and experiment as a tool for understanding and predicting the behavior of a broad range of chemical, physical, and biological phenomena. The Nobel Prize award to John Pople and Walter Kohn in 1998 highlighted the importance of these advances in computational chemistry. With massively parallel computers ...

  4. Computer Assisted Cytologic Assessment,

    Science.gov (United States)

    1983-12-27

    [OCR fragment of a DTIC report form for "Computer Assisted Cytologic Assessment" (Bryant, Michael L.; Hollinger, Jeffrey O.). Recoverable content: a citation to "Exfoliative Cytology in Oral Cancer Detection," Oral Surg 17: 327-330, and Table I, which categorizes cells into Classes I-III according to nuclear:cytoplasmic ratio.]

  5. QDENSITY—A Mathematica quantum computer simulation

    Science.gov (United States)

    Juliá-Díaz, Bruno; Burdis, Joseph M.; Tabakin, Frank

    2009-03-01

    This Mathematica 6.0 package is a simulation of a Quantum Computer. The program provides a modular, instructive approach for generating the basic elements that make up a quantum circuit. The main emphasis is on using the density matrix, although an approach using state vectors is also implemented in the package. The package commands are defined in Qdensity.m which contains the tools needed in quantum circuits, e.g., multiqubit kets, projectors, gates, etc. New version program summary: Program title: QDENSITY 2.0; Catalogue identifier: ADXH_v2_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXH_v2_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html; No. of lines in distributed program, including test data, etc.: 26 055; No. of bytes in distributed program, including test data, etc.: 227 540; Distribution format: tar.gz; Programming language: Mathematica 6.0; Operating system: Any which supports Mathematica; tested under Microsoft Windows XP, Macintosh OS X, and Linux FC4; Catalogue identifier of previous version: ADXH_v1_0; Journal reference of previous version: Comput. Phys. Comm. 174 (2006) 914; Classification: 4.15; Does the new version supersede the previous version?: Offers an alternative, more up to date, implementation. Nature of problem: Analysis and design of quantum circuits, quantum algorithms and quantum clusters. Solution method: A Mathematica package is provided which contains commands to create and analyze quantum circuits. Several Mathematica notebooks containing relevant examples: Teleportation, Shor's Algorithm and Grover's search are explained in detail. A tutorial, Tutorial.nb is also enclosed.
Reasons for new version: The package has been updated to make it fully compatible with Mathematica 6.0 Summary of revisions: The package has been updated to make it fully compatible with Mathematica 6.0 Running time: Most examples
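QDENSITY's central object is the density matrix, updated under a gate U as ρ → UρU†. A minimal Python sketch of that update (an illustrative analogue of the package's approach, not the Mathematica package itself):

```python
# Density-matrix update rho -> U rho U-dagger, the operation at the heart of a
# density-matrix quantum-circuit simulator. Pure-Python 2x2 sketch.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def dagger(A):
    """Conjugate transpose."""
    return [[A[j][i].conjugate() for j in range(len(A))] for i in range(len(A[0]))]

def apply_gate(rho, U):
    return matmul(matmul(U, rho), dagger(U))

h = 1 / 2 ** 0.5
H = [[h, h], [h, -h]]               # Hadamard gate
rho0 = [[1.0, 0.0], [0.0, 0.0]]     # pure state |0><0|

rho1 = apply_gate(rho0, H)
probs = [rho1[i][i] for i in range(2)]   # diagonal entries = outcome probabilities
print(probs)                        # ~[0.5, 0.5]: an equal superposition
```

Multi-qubit circuits tensor such gates together and chain the same update, which is what the package's projectors and gate commands automate.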

  6. Duality Computing in Quantum Computers

    Institute of Scientific and Technical Information of China (English)

    LONG Gui-Lu; LIU Yang

    2008-01-01

    In this letter, we propose a duality computing mode, which resembles the particle-wave duality property observed when a quantum system such as a quantum computer passes through a double-slit. In this mode, computing operations are not necessarily unitary. The duality mode provides a natural link between classical computing and quantum computing. In addition, the duality mode provides a new tool for quantum algorithm design.

  7. Computational manufacturing

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper presents a general framework for computational manufacturing. The methodology of computational manufacturing aims at integrating computational geometry, machining principle, sensor information fusion, optimization, computational intelligence and virtual prototyping to solve problems of the modeling, reasoning, control, planning and scheduling of manufacturing processes and systems. There are three typical problems in computational manufacturing, i.e., scheduling (time-domain), geometric reasoning (space-domain) and decision- making (interaction between time-domain and space-domain). Some theoretical fundamentals of computational manufacturing are also discussed.

  8. Contextual Computing

    CERN Document Server

    Porzel, Robert

    2011-01-01

    This book uses the latest in knowledge representation and human-computer interaction to address the problem of contextual computing in artificial intelligence. It uses high-level context to solve some challenging problems in natural language understanding.

  9. Computer Algebra.

    Science.gov (United States)

    Pavelle, Richard; And Others

    1981-01-01

    Describes the nature and use of computer algebra and its applications to various physical sciences. Includes diagrams illustrating, among others, a computer algebra system and flow chart of operation of the Euclidean algorithm. (SK)

  10. Computational dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Siebert, B.R.L.; Thomas, R.H.

    1996-01-01

    The paper presents a definition of the term "Computational Dosimetry" that is interpreted as the sub-discipline of computational physics which is devoted to radiation metrology. It is shown that computational dosimetry is more than a mere collection of computational methods. Computational simulations directed at basic understanding and modelling are important tools provided by computational dosimetry, while another very important application is the support that it can give to the design, optimization and analysis of experiments. However, the primary task of computational dosimetry is to reduce the variance in the determination of absorbed dose (and its related quantities), for example in the disciplines of radiological protection and radiation therapy. In this paper emphasis is given to the discussion of potential pitfalls in the applications of computational dosimetry and recommendations are given for their avoidance. The need for comparison of calculated and experimental data whenever possible is strongly stressed.

  11. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours......The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours...

  12. Quantum computing

    OpenAIRE

    Li, Shu-Shen; Long, Gui-lu; Bai, Feng-Shan; Feng, Song-Lin; Zheng, Hou-Zhi

    2001-01-01

    Quantum computing is a quickly growing research field. This article introduces the basic concepts of quantum computing, recent developments in quantum searching, and decoherence in a possible quantum dot realization.

  13. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals, as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way they can. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  14. Second-order adjoint sensitivity analysis methodology (2nd-ASAM) for computing exactly and efficiently first- and second-order sensitivities in large-scale linear systems: II. Illustrative application to a paradigm particle diffusion problem

    Science.gov (United States)

    Cacuci, Dan G.

    2015-03-01

    This work presents an illustrative application of the second-order adjoint sensitivity analysis methodology (2nd-ASAM) to a paradigm neutron diffusion problem, which is sufficiently simple to admit an exact solution, thereby making transparent the underlying mathematical derivations. The general theory underlying 2nd-ASAM indicates that, for a physical system comprising Nα parameters, the computation of all of the first- and second-order response sensitivities requires (per response) at most (2Nα + 1) "large-scale" computations using the first-level and, respectively, second-level adjoint sensitivity systems (1st-LASS and 2nd-LASS). Very importantly, however, the illustrative application presented in this work shows that the actual number of adjoint computations needed for computing all of the first- and second-order response sensitivities may be significantly less than (2Nα + 1) per response. For this illustrative problem, four "large-scale" adjoint computations sufficed for the complete and exact computations of all 4 first- and 10 distinct second-order derivatives. Furthermore, the construction and solution of the 2nd-LASS requires very little additional effort beyond the construction of the adjoint sensitivity system needed for computing the first-order sensitivities. Very significantly, only the sources on the right-sides of the diffusion (differential) operator needed to be modified; the left-side of the differential equations (and hence the "solver" in large-scale practical applications) remained unchanged. All of the first-order relative response sensitivities to the model parameters have significantly large values, of order unity. Also importantly, most of the second-order relative sensitivities are just as large, and some even up to twice as large as the first-order sensitivities. In the illustrative example presented in this work, the second-order sensitivities contribute little to the response variances and covariances. However, they have the
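The economy claimed above (a handful of adjoint solves yielding many sensitivities) can be seen already at first order. In the toy Python sketch below (not the paper's diffusion problem), a single adjoint solve of Aᵀλ = c gives dR/db_i = λ_i for every source component of the linear system Ax = b with response R = cᵀx:

```python
# Adjoint trick for a 2x2 linear system A x = b with scalar response R = c.x:
# solving the single adjoint system A^T lam = c yields dR/db_i = lam_i for all i.
# Illustrative toy problem, not the paper's neutron diffusion example.
def solve2(A, b):
    """Cramer's rule for a 2x2 system."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    return [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
            (A[0][0] * b[1] - A[1][0] * b[0]) / det]

A = [[4.0, 1.0], [2.0, 3.0]]
b = [1.0, 2.0]
c = [1.0, 1.0]                      # response R = x0 + x1

At = [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]   # transpose of A
lam = solve2(At, c)                 # one adjoint solve gives every dR/db_i

# Cross-check dR/db0 with a finite difference (illustrative verification)
eps = 1e-6
fd = (sum(solve2(A, [b[0] + eps, b[1]])) - sum(solve2(A, b))) / eps
print(lam[0], fd)                   # the adjoint and finite-difference values agree
```

The 2nd-ASAM extends this idea: second-level adjoint systems reuse the same operator (only the right-hand sides change), which is why the second-order sensitivities come at little extra solver cost.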

  15. Computable models

    CERN Document Server

    Turner, Raymond

    2009-01-01

    Computational models can be found everywhere in present day science and engineering. In providing a logical framework and foundation for the specification and design of specification languages, Raymond Turner uses this framework to introduce and study computable models. In doing so he presents the first systematic attempt to provide computational models with a logical foundation. Computable models have wide-ranging applications from programming language semantics and specification languages, through to knowledge representation languages and formalism for natural language semantics. They are al

  16. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot en...... cybernetics and Maturana and Varela’s theory of autopoiesis, which are both erroneously taken to support info-computationalism....

  17. Computing fundamentals introduction to computers

    CERN Document Server

    Wempen, Faithe

    2014-01-01

    The absolute beginner's guide to learning basic computer skills Computing Fundamentals, Introduction to Computers gets you up to speed on basic computing skills, showing you everything you need to know to conquer entry-level computing courses. Written by a Microsoft Office Master Instructor, this useful guide walks you step-by-step through the most important concepts and skills you need to be proficient on the computer, using nontechnical, easy-to-understand language. You'll start at the very beginning, getting acquainted with the actual, physical machine, then progress through the most common

  18. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  19. Computational Complexity

    Directory of Open Access Journals (Sweden)

    J. A. Tenreiro Machado

    2017-02-01

    Full Text Available Complex systems (CS involve many elements that interact at different scales in time and space. The challenges in modeling CS led to the development of novel computational tools with applications in a wide range of scientific areas. The computational problems posed by CS exhibit intrinsic difficulties that are a major concern in Computational Complexity Theory. [...

  20. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    of the new microprocessors and network technologies. However, the understanding of the computer represented within this program poses a challenge for the intentions of the program. The computer is understood as a multitude of invisible intelligent information devices which confines the computer as a tool...

  1. Distributed Computing.

    Science.gov (United States)

    Ryland, Jane N.

    1988-01-01

    The microcomputer revolution, in which small and large computers have gained tremendously in capability, has created a distributed computing environment. This circumstance presents administrators with the opportunities and the dilemmas of choosing appropriate computing resources for each situation. (Author/MSE)

  2. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  3. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot en...

  4. Computer Ease.

    Science.gov (United States)

    Drenning, Susan; Getz, Lou

    1992-01-01

    Computer Ease is an intergenerational program designed to put an Ohio elementary school's computer lab, software library, staff, and students at the disposal of older adults desiring to become computer literate. Three 90-minute instructional sessions allow seniors to experience 1-to-1 high-tech instruction by enthusiastic, nonthreatening…

  5. The Computing Grids

    Energy Technology Data Exchange (ETDEWEB)

    Govoni, P. [Universita and INFN Milano-Bicocca (Italy)

    2009-12-15

    Since the beginning of the millennium, High Energy Physics research institutions like CERN and INFN pioneered several projects aimed at exploiting the synergy among computing power, storage and network resources, and creating an infrastructure of distributed computing on a worldwide scale. In the year 2000, after the Monarch project [(http://monarc.web.cern.ch/MONARC/)], DataGrid started [(http://eu-datagrid.web.cern.ch/eu-datagrid/)] aimed at providing High Energy Physics with the computing power needed for the LHC enterprise. This program evolved into the EU DataGrid project, that implemented the first actual prototype of a Grid middleware running on a testbed environment. The next step consisted in the application to the LHC experiments, with the LCG project [(http://lcg.web.cern.ch/LCG/)], in turn followed by the EGEE [(http://www.eu-egee.org/)] and EGEE II programs.

  6. A computational mechanistic investigation of hydrogen production in water using the [Rh(III)(dmbpy)2Cl2](+)/[Ru(II)(bpy)3](2+)/ascorbic acid photocatalytic system.

    Science.gov (United States)

    Kayanuma, Megumi; Stoll, Thibaut; Daniel, Chantal; Odobel, Fabrice; Fortage, Jérôme; Deronzier, Alain; Collomb, Marie-Noëlle

    2015-04-28

    We recently reported an efficient molecular homogeneous photocatalytic system for hydrogen (H2) production in water combining [Rh(III)(dmbpy)2Cl2](+) (dmbpy = 4,4'-dimethyl-2,2'-bipyridine) as a H2 evolving catalyst, [Ru(II)(bpy)3](2+) (bpy = 2,2'-bipyridine) as a photosensitizer and ascorbic acid as a sacrificial electron donor (Chem. - Eur. J., 2013, 19, 781). Herein, the possible rhodium intermediates and mechanistic pathways for H2 production with this system were investigated at the DFT/B3LYP level of theory and the most probable reaction pathways were proposed. The calculations confirmed that the initial step of the mechanism is a reductive quenching of the excited state of the Ru photosensitizer by ascorbate, affording the reduced [Ru(II)(bpy)2(bpy˙(-))](+) form, which is capable, in turn, of reducing the Rh(III) catalyst to the distorted square planar [Rh(I)(dmbpy)2](+) species. This two-electron reduction by [Ru(II)(bpy)2(bpy˙(-))](+) is sequential and occurs according to an ECEC mechanism which involves the release of one chloride after each one-electron reduction step of the Rh catalyst. The mechanism of disproportionation of the intermediate Rh(II) species, though much less thermodynamically favoured, cannot be entirely ruled out, since it could also be favoured from a kinetic point of view. The Rh(I) catalyst reacts with H3O(+) to generate the hexa-coordinated hydride [Rh(III)(H)(dmbpy)2(X)](n+) (X = Cl(-) or H2O), as the key intermediate for H2 release. The DFT study also revealed that the real source of protons for the hydride formation as well as the subsequent step of H2 evolution is H3O(+) rather than ascorbic acid, even if the latter does govern the pH of the aqueous solution. Besides, the calculations have shown that H2 is preferentially released through a heterolytic mechanism by reaction of the Rh(III)(H) hydride with H3O(+); the homolytic pathway, involving the reaction of two Rh(III)(H) hydrides, is clearly less favoured. In parallel to this

  7. Human Computation

    CERN Document Server

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  8. Computer science

    CERN Document Server

    Blum, Edward K

    2011-01-01

    Computer Science: The Hardware, Software and Heart of It focuses on the deeper aspects of the two recognized subdivisions of Computer Science, Software and Hardware. These subdivisions are shown to be closely interrelated as a result of the stored-program concept. Computer Science: The Hardware, Software and Heart of It includes certain classical theoretical computer science topics such as Unsolvability (e.g. the halting problem) and Undecidability (e.g. Godel's incompleteness theorem) that treat problems that exist under the Church-Turing thesis of computation. These problem topics explain in

  9. Computer Science Research: Computation Directorate

    Energy Technology Data Exchange (ETDEWEB)

    Durst, M.J. (ed.); Grupe, K.F. (ed.)

    1988-01-01

    This report contains short papers in the following areas: large-scale scientific computation; parallel computing; general-purpose numerical algorithms; distributed operating systems and networks; knowledge-based systems; and technology information systems.

  10. Density functional computations of the cyclopropanation of ethene catalyzed by iron(II) carbene complexes Cp(CO)(L)Fe=CHR, L = CO, PMe3; R = Me, OMe, Ph, CO2Me

    Science.gov (United States)

    Wang, Fen; Meng, Qingxi; Li, Ming

    Density functional theory has been used to study the Fe-catalyzed cyclopropanation of Fe-carbene complexes with ethene. All the intermediates and transition states were optimized completely at the B3LYP/6-31+G(d,p) level. Calculation results confirm that the cyclopropanation of Fe-carbene complexes with ethene involves the two reaction paths I and II. In the reaction path I, the double bond of ethene attacks directly on the carbene carbon of Fe-carbene complexes to generate the cyclopropane. In the reaction path II, ethene substitution for PMe3 or CO in the Fe-carbene complexes leads to the complexes M2; and the attack of one carbon of ethene on the carbene carbon results in the complexes M3 with a Fe-C-C-C four-membered ring, and then generates the cyclopropane via the elimination reaction. For Fe-carbene complexes A, C, D, E, and H, the main reaction mode is the reaction path I; for Fe-carbene complexes B, F, and G, the main reaction mode is the reaction path II.

  11. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  12. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  13. Elizabeth II's new art gallery

    Index Scriptorium Estoniae

    1999-01-01

    To mark the 50th anniversary of her accession to the throne, Elizabeth II will open a new art gallery at Buckingham Palace on 6 February 2002, to be built as a wing of the palace. Architect: John Simpson. On Elizabeth II's art collection.

  15. RELAP4/MOD5: a computer program for transient thermal-hydraulic analysis of nuclear reactors and related systems. User's manual. Volume II. Program implementation. [PWR and BWR

    Energy Technology Data Exchange (ETDEWEB)

    None

    1976-09-01

    This portion of the RELAP4/MOD5 User's Manual presents the details of setting up and entering the reactor model to be evaluated. The input card format and arrangement is presented in depth, including not only cards for data but also those for editing and restarting. Problem initialization including pressure distribution and energy balance is discussed. A section entitled "User Guidelines" is included to provide modeling recommendations, analysis and verification techniques, and computational difficulty resolution. The section is concluded with a discussion of the computer output form and format.

  16. Computer Literacy: Teaching Computer Ethics.

    Science.gov (United States)

    Troutner, Joanne

    1986-01-01

    Suggests learning activities for teaching computer ethics in three areas: (1) equal access; (2) computer crime; and (3) privacy. Topics include computer time, advertising, class enrollments, copyright law, sabotage ("worms"), the Privacy Act of 1974 and the Freedom of Information Act of 1966. (JM)

  17. Foreign Language Teaching and the Computer.

    Science.gov (United States)

    Garrett, Nina, Ed.; Hart, Robert S., Ed.

    1986-01-01

    "Morgens geht Fritz zur Schule" is a software tutorial and drill program, designed for use with the Apple II computer, which provides practice with German prepositions for students in a beginning German language course. (CB)

  18. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  19. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease with which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, the hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  20. Quantum Computing

    CERN Document Server

    Steane, A M

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarise not just quantum computing, but the whole subject of quantum information theory. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, the review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the EPR experiment described. The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from classical information theory, and, arguably, quantum from classical physics. Basic quantum information ideas are described, including key distribution, teleportation, data compression, quantum error correction, the universal quantum computer and qua...
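    Two of the classical ingredients the review opens with, Shannon entropy and error-correcting codes, can be made concrete in a few lines (a minimal illustrative sketch, not code from the review; the function names are my own):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def decode_repetition(bits):
    """Majority-vote decoding of the 3-bit repetition code, the simplest
    error-correcting code: any single flipped bit is corrected."""
    return 1 if sum(bits) >= 2 else 0

fair = shannon_entropy([0.5, 0.5])        # a fair coin: exactly 1 bit per toss
biased = shannon_entropy([0.9, 0.1])      # a biased coin carries less information
corrected = decode_repetition([1, 0, 1])  # sent 111, middle bit flipped in transit
print(fair, round(biased, 3), corrected)
```

    Quantum error correction generalises the repetition idea to states that cannot be copied, which is where the review picks up.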

  1. Computational Study of Metal-Dinitrogen Keggin-Type Polyoxometalate Complexes [PW11O39M(II)(N2)](5-) (M = Ru, Os, Re, Ir): Bonding Nature and Dinitrogen Splitting.

    Science.gov (United States)

    Liu, Chun-Guang; Liu, Shuang; Zheng, Ting

    2015-08-17

    Molecular geometry, electronic structure, and metal-dinitrogen bonding nature of a series of metal-dinitrogen derivatives of Keggin-type polyoxometalates (POMs) [PW11O39M(II)(N2)](5-) (M = Ru, Os, Re, Ir) have been studied by using a density functional theory (DFT) method with the M06L functional. Among these Keggin-type POM complexes, Os- and Re-substituted POM complexes are the most active for N2 adsorption with considerable adsorption energy. The electronic structure analysis shows that Os(II) and Re(II) centers in their metal-dinitrogen POM complexes possess π(2)xzπ(2)yzπ(2)xy and π(2)xzπ(2)yzπ(1)xy configurations, respectively. DFT-M06L calculations show that the possible synthesis routes proposed in this work for the Ru-, Os-, and Re-dinitrogen POM complexes are thermodynamically feasible under various solvent environments. Meanwhile, the Re-dinitrogen POM complex was assessed for the direct cleavage of the dinitrogen molecule. In the reaction mechanism, a dimeric Keggin-type POM derivative of rhenium could represent the intermediate which undergoes N-N bond scission. The calculated free energy barrier (ΔG(⧧)) for a transition state with a zigzag conformation is 16.05 kcal mol(-1) in tetrahydrofuran, which is a moderate barrier for the cleavage of the N-N bond when compared with the literature values. In conclusion, regarding the direct cleavage of the dinitrogen molecule, the findings would be very useful to guide the search for a potential N2 cleavage compound into totally inorganic POM fields.

  2. Computer Virus

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Computer viruses are small software programs that are designed to spread from one computer to another and to interfere with computer operation. A virus might delete data on your computer, use your e-mail program to spread itself to other computers, or even erase everything on your hard disk. Viruses are most easily spread by attachments in e-mail messages or instant messaging messages. That is why it is essential that you never

  3. Fog computing

    OpenAIRE

    Poplštein, Karel

    2016-01-01

    The purpose of this bachelor's thesis is to address fog computing, a technology that emerged as a possible solution for the requirements of the internet of things and that aims to lower latency and network bandwidth usage by moving a substantial part of computing operations to the network edge. The thesis identifies advantages as well as potential threats and analyses possible solutions to these problems, proceeding to a comparison of cloud and fog computing and specifying areas of use for both of them. Finally...

  4. Comparison between GlideScope and Macintosh laryngoscope for double-lumen endobronchial tube intubation in patients with difficult glottis exposure

    Institute of Scientific and Technical Information of China (English)

    承耀中; 孙莉; 武小勇; 丁超; 郑春京; 赵桂军

    2011-01-01

    Objective To compare the use of the GlideScope and the conventional Macintosh laryngoscope in patients with difficult glottis exposure during surgery for malignant chest tumors. Methods Forty Mallampati grade III and IV patients undergoing surgery for malignant chest tumors were recruited to our randomized controlled trial. Group G (n = 20) had endobronchial intubation performed using the GlideScope and Group M (n = 20) underwent endobronchial intubation using a Macintosh laryngoscope. The best laryngeal view, difficulty of tracheal intubation, time taken for successful endobronchial intubation, and manoeuvres needed to aid tracheal intubation were recorded. Results The median Cormack and Lehane grade was significantly better in Group G than in Group M. Group G had a significantly shorter endobronchial intubation time than Group M [(mean 51.3 ± SD 23.4) s vs (mean 66.2 ± SD 26.6) s, P < 0.05]. Conclusion The GlideScope improved the laryngeal view and decreased the time for endobronchial intubation as compared with the Macintosh laryngoscope in patients with difficult glottis exposure. The GlideScope may be a good alternative for managing the difficult airway.%Objective To evaluate the feasibility and clinical value of the GlideScope video laryngoscope, compared with a conventional laryngoscope, for double-lumen endobronchial intubation in chest tumor patients with Mallampati grade III-IV airways. Methods Forty patients with esophageal or lung cancer assessed preoperatively as Mallampati grade III-IV were randomly divided into Group G and Group M (n = 20 each) and intubated with the GlideScope video laryngoscope or a conventional direct laryngoscope, respectively; the number of first-attempt successes, the number requiring cricoid pressure, the degree of glottis exposure, and the intubation time were recorded. Results Glottis exposure (Cormack & Lehane grade): in Group M, 2 cases were grade I, 1 grade II, 12 grade III and 5 grade IV; exposure was markedly improved in Group G, with 16 grade I, 4 grade II and no grade III or IV cases. First-attempt success rates were 90.0% in Group G and 50.0% in Group M, significantly higher in Group G (P < 0.05); laryngeal pressure was required in 2 cases in Group G versus 18 in Group M.

  5. Biological computation

    CERN Document Server

    Lamm, Ehud

    2011-01-01

    Contents: Introduction and Biological Background; Biological Computation; The Influence of Biology on Mathematics: Historical Examples; Biological Introduction; Models and Simulations; Cellular Automata: Biological Background; The Game of Life; General Definition of Cellular Automata; One-Dimensional Automata; Examples of Cellular Automata; Comparison with a Continuous Mathematical Model; Computational Universality; Self-Replication; Pseudo Code; Evolutionary Computation: Evolutionary Biology and Evolutionary Computation; Genetic Algorithms; Example Applications; Analysis of the Behavior of Genetic Algorithms; Lamarckian Evolution; Genet
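    The cellular-automata chapters listed above are easy to demonstrate: an elementary one-dimensional automaton in Wolfram's rule numbering (an illustrative sketch, not code from the book; Rule 90 is chosen because it unfolds into the familiar Sierpinski pattern):

```python
def step(cells, rule):
    """One synchronous update of an elementary (1-D, two-state) cellular
    automaton; `rule` is the Wolfram rule number, wrapping at the edges."""
    n = len(cells)
    out = []
    for i in range(n):
        left, me, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        idx = (left << 2) | (me << 1) | right  # neighbourhood as a 3-bit index
        out.append((rule >> idx) & 1)
    return out

row = [0] * 15
row[7] = 1  # a single live cell in the middle
for _ in range(5):
    print("".join(".#"[c] for c in row))
    row = step(row, rule=90)  # Rule 90 draws a Sierpinski triangle
```

    The same `step` function runs any of the 256 elementary rules, including the computationally universal Rule 110 mentioned in the book's universality chapter.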

  6. Cloud Computing

    CERN Document Server

    Mirashe, Shivaji P

    2010-01-01

    Computing as you know it is about to change: your applications and documents are going to move from the desktop into the cloud. I'm talking about cloud computing, where applications and files are hosted on a "cloud" consisting of thousands of computers and servers, all linked together and accessible via the Internet. With cloud computing, everything you do is now web based instead of being desktop based. You can access all your programs and documents from any computer that's connected to the Internet. How will cloud computing change the way you work? For one thing, you're no longer tied to a single computer. You can take your work anywhere because it's always accessible via the web. In addition, cloud computing facilitates group collaboration, as all group members can access the same programs and documents from wherever they happen to be located. Cloud computing might sound far-fetched, but chances are you're already using some cloud applications. If you're using a web-based email program, such as Gmail or Ho...

  7. An Application of Programming and Mathematics: Writing a Computer Graphing Program.

    Science.gov (United States)

    Waits, Bert; Demana, Franklin

    1988-01-01

    Suggests computer graphing as a topic for computer programing. Reviews Apple II computer graphics information and gives suggestions for writing the programs. Presents equations to help place information onto the screen with proper coordinates. (MVL)
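    The heart of such equations is a window-to-viewport mapping from graph coordinates to pixel coordinates. A sketch below uses the Apple II hi-res resolution of 280x192; the window bounds are illustrative defaults, and the article's actual equations may differ:

```python
def to_screen(x, y, xmin=-10, xmax=10, ymin=-10, ymax=10,
              width=280, height=192):
    """Map a graph point (x, y) to pixel (column, row). 280x192 is the
    Apple II hi-res resolution; the window bounds are illustrative.
    Screen rows grow downward, so the y axis must be flipped."""
    col = round((x - xmin) / (xmax - xmin) * (width - 1))
    row = round((ymax - y) / (ymax - ymin) * (height - 1))
    return col, row

# Corners of the window land on the corners of the screen:
print(to_screen(-10, -10), to_screen(10, 10))
```

    Plotting a function then amounts to evaluating it at each column's x value and lighting the resulting pixel.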

  8. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Since the first idea of using GPUs for general-purpose computing, things have evolved over the years, and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general-purpose computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (STREAM), and the new framework, OpenCL, that tries to unify the GPGPU computing models.
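    Real CUDA and OpenCL kernels are written in C-like languages; purely as a sketch of the data-parallel model described above (one logical thread per output element), the kernel-launch pattern can be mimicked in plain Python (the names here are illustrative, not part of either API):

```python
def saxpy_kernel(i, a, x, y, out):
    """The body one 'thread' executes: compute a single output element,
    as a CUDA kernel would for its global thread index i."""
    out[i] = a * x[i] + y[i]

def launch(kernel, n, *args):
    """Stand-in for a grid launch: run the kernel once per index.
    On a GPU these n invocations execute concurrently across threads."""
    for i in range(n):
        kernel(i, *args)

x = [1.0, 2.0, 3.0]
y = [10.0, 20.0, 30.0]
out = [0.0] * len(x)
launch(saxpy_kernel, len(x), 2.0, x, y, out)
print(out)  # each element computed independently: a*x[i] + y[i]
```

    Because every element is computed independently, the loop in `launch` is exactly what a GPU parallelizes across thousands of threads.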

  9. Computational Sustainability

    OpenAIRE

    Eaton, Eric; University of Pennsylvania; Gomes, Carla P.; Cornell University; Williams, Brian; Massachusetts Institute of Technology

    2014-01-01

    Computational sustainability problems, which exist in dynamic environments with high amounts of uncertainty, provide a variety of unique challenges to artificial intelligence research and the opportunity for significant impact upon our collective future. This editorial provides an overview of artificial intelligence for computational sustainability, and introduces this special issue of AI Magazine.

  10. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  11. Grid Computing

    Science.gov (United States)

    Foster, Ian

    2001-08-01

    The term "Grid Computing" refers to the use, for computational purposes, of emerging distributed Grid infrastructures: that is, network and middleware services designed to provide on-demand and high-performance access to all important computational resources within an organization or community. Grid computing promises to enable both evolutionary and revolutionary changes in the practice of computational science and engineering based on new application modalities such as high-speed distributed analysis of large datasets, collaborative engineering and visualization, desktop access to computation via "science portals," rapid parameter studies and Monte Carlo simulations that use all available resources within an organization, and online analysis of data from scientific instruments. In this article, I examine the status of Grid computing circa 2000, briefly reviewing some relevant history, outlining major current Grid research and development activities, and pointing out likely directions for future work. I also present a number of case studies, selected to illustrate the potential of Grid computing in various areas of science.

  12. Computational Deception

    NARCIS (Netherlands)

    Nijholt, Antinus; Acosta, P.S.; Cravo, P.

    2010-01-01

    In the future our daily life interactions with other people, with computers, robots and smart environments will be recorded and interpreted by computers or embedded intelligence in environments, furniture, robots, displays, and wearables. These sensors record our activities, our behaviour, and our

  13. Computational Science

    Institute of Scientific and Technical Information of China (English)

    K. Li

    2007-01-01

    Computer science is the discipline that anchors the computer industry, which has been improving processor performance, communication bandwidth and storage capacity on the so-called "Moore's law" curve, i.e. at the rate of doubling every 18 to 24 months, during the past decades.

  14. Granular Computing

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    The basic ideas and principles of granular computing (GrC) have been studied explicitly or implicitly in many fields in isolation. With the recent renewed and fast growing interest, it is time to extract the commonality from a diversity of fields and to study systematically and formally the domain independent principles of granular computing in a unified model. A framework of granular computing can be established by applying its own principles. We examine such a framework from two perspectives, granular computing as structured thinking and structured problem solving. From the philosophical perspective or the conceptual level, granular computing focuses on structured thinking based on multiple levels of granularity. The implementation of such a philosophy in the application level deals with structured problem solving.

  15. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study a problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has an unbounded computing power. The thesis is based on two... an impossibility result indicating that a similar equivalence does not hold for Multiparty Computation (MPC): we show that even if protocols are given black-box access for free to an idealized secret sharing scheme secure for the access structure in question, it is not possible to handle all relevant access... [2] ...on Theory of Computing, pages 25-334, May 2000. [3] Tal Rabin and Michael Ben-Or. Verifiable secret sharing and multiparty protocols with honest majority (extended abstract). In Proceedings of the Twenty-First Annual ACM Symposium on Theory of Computing, pages 73-85, Seattle, Washington, 15-17 May 1989.
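    A sketch of the plain (non-verifiable) secret-sharing primitive that VSS builds on, Shamir sharing over a prime field; VSS additionally lets players check that their shares are consistent, which this sketch omits:

```python
import random

P = 2_147_483_647  # a Mersenne prime; all arithmetic is modulo P

def share(secret, k, n, seed=0):
    """Split `secret` into n shares such that any k of them reconstruct it:
    evaluate a random degree-(k-1) polynomial with constant term `secret`."""
    rng = random.Random(seed)
    coeffs = [secret] + [rng.randrange(P) for _ in range(k - 1)]
    def poly(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

shares = share(123456, k=3, n=5)
print(reconstruct(shares[:3]))  # any 3 of the 5 shares recover the secret
```

    Fewer than k shares reveal nothing about the secret, which is the information-theoretic security property the thesis's unbounded adversary model requires.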

  16. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber
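    Among the computational machinery listed, path-planning is the easiest to make concrete: breadth-first search over a walkable grid, a minimal stand-in for the navigation and way-finding heuristics discussed (the grid and endpoints are illustrative, not from the paper):

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search on a 4-connected grid of 0 = walkable,
    1 = blocked; returns the list of cells on a shortest path."""
    queue = deque([(start, [start])])
    seen = {start}
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < len(grid) and 0 <= nc < len(grid[0])
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None

grid = [[0, 0, 0],   # 1s mark an obstacle, e.g. a building footprint
        [1, 1, 0],
        [0, 0, 0]]
path = shortest_path(grid, (0, 0), (2, 0))
print(len(path) - 1)  # number of steps around the obstacle
```

    Pedestrian simulators typically swap BFS for heuristic search such as A* and layer steering and collision avoidance on top, but the planning skeleton is the same.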

  17. A physically-derived nonquasi-static model of ferroelectric amplifiers for computer-aided device simulation - Part II: The ferroelectric common-source and common-gate amplifiers

    Science.gov (United States)

    Sayyah, Rana; Hunt, Mitchell; Ho, Fat D.

    2013-08-01

    In this paper, Part II of the authors' paper [1], the physically-derived nonquasi-static model presented in [1] is applied to the ferroelectric common-source and common-gate amplifiers. The model is based on the method of partitioned channel and ferroelectric layers and is valid in accumulation, depletion, and all three cases of inversion: weak, moderate, and strong. The equations of this model are based on the standard MOSFET equations that have been adapted to include the ferroelectric properties. The model code is written in MATLAB and outputs voltage plots with respect to time. The accuracy and effectiveness of the model are verified by two test cases, where the modeled results are compared to empirically-derived oscilloscope plots.

  18. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, actively discussed in the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and its right to a place in Russian education. Methods. The research is based on an analysis of foreign studies of the phenomenon of computational thinking and of the ways it is formed in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved with the development of computer hardware and software. The practice-oriented interpretation of computational thinking dominant among educators is described, along with some ways of forming it. It is shown that computational thinking is a metasubject result of general education as well as one of its tools. From the point of view of the author, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational-thinking schemes as metasubject results of learning. The dynamics of the development of this concept are described, a process connected with the evolution of computer and information technologies and with the growing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. A new metasubject result of education associated with

  19. Preparation, spectrochemical, and computational analysis of L-carnosine (2-[(3-aminopropanoyl)amino]-3-(1H-imidazol-5-yl)propanoic acid) and its ruthenium (II) coordination complexes in aqueous solution.

    Science.gov (United States)

    Branham, Michael Lee; Singh, Parvesh; Bisetty, Krishna; Sabela, Myalo; Govender, Thirumala

    2011-12-09

    This study reports the synthesis and characterization of novel ruthenium (II) complexes with the polydentate dipeptide, L-carnosine (2-[(3-aminopropanoyl)amino]-3-(1H-imidazol-5-yl)propanoic acid). Mixed-ligand complexes with the general composition [ML(p)(Cl)(q)(H₂O)(r)]·xH₂O (M = Ru(II); L = L-carnosine; p = 3 - q; r = 0-1; and x = 1-3) were prepared by refluxing aqueous solutions of the ligand with equimolar amounts of ruthenium chloride (black-alpha form) at 60 °C for 36 h. Physical properties of the complexes were characterized by elemental analysis, DSC/TGA, and cyclic voltammetry. The molecular structures of the complexes were elucidated using UV-Vis, ATR-IR, and heteronuclear NMR spectroscopy, then confirmed by density functional theory (DFT) calculations at the B3LYP/LANL2DZ level. Two-dimensional NMR experiments (¹H COSY, ¹³C gHMBC, and ¹⁵N gHMBC) were also conducted for the assignment of chemical shifts and calculation of relative coordination-induced shifts (RCIS) by the complex formed. According to our results, the most probable coordination geometries of ruthenium in these compounds involve nitrogen (N1) from the imidazole ring and an oxygen atom from the carboxylic acid group of the ligand as donor atoms. Additional thermogravimetric and electrochemical data suggest that while the tetrahedral-monomer or octahedral-dimer are both possible structures of the formed complexes, the metal in either structure occurs in the ²⁺ oxidation state. Resulting RCIS values indicate that the amide-carbonyl, and the amino-terminus of the dipeptide are not involved in chelation and these observations correlate well with theoretical shift predictions by DFT.

  20. Crystallographic and Computational Analysis of the Barrel Part of the PsbO Protein of Photosystem II: Carboxylate-Water Clusters as Putative Proton Transfer Relays and Structural Switches.

    Science.gov (United States)

    Bommer, Martin; Bondar, Ana-Nicoleta; Zouni, Athina; Dobbek, Holger; Dau, Holger

    2016-08-23

    In all organisms that employ oxygenic photosynthesis, the membrane-extrinsic PsbO protein is a functionally important component of photosystem II. To study the previously proposed proton antenna function of carboxylate clusters at the protein-water interface, we combined crystallography and simulations of a truncated cyanobacterial (Thermosynechococcus elongatus) PsbO without peripheral loops. We expressed the PsbO β-barrel heterologously and determined crystal structures at resolutions of 1.15-1.5 Å at 100 K at various pH values and at 297 K and pH 6. (1) Approximately half of the 177 surface waters identified at 100 K are resolved at 297 K, suggesting significant occupancy of specific water sites at room temperature, and loss of resolvable occupancy for other sites. (2) Within a loop region specific to cyanobacterial PsbO, three residues and four waters coordinating a calcium ion are well ordered even at 297 K; the ligation differs for manganese. (3) The crystal structures show water-carboxylate clusters that could facilitate fast Grotthuss-type proton transfer along the protein surface and/or store protons. (4) Two carboxylate side chains, which are part of a structural motif interrupting two β-strands and connecting PsbO to photosystem II, are within hydrogen bonding distance at pH 6 (100 K). Simulations indicate coupling between protein structure and carboxylate protonation. The crystal structure determined at 100 K and pH 10 indicates broken hydrogen bonding between the carboxylates and local structural change. At pH 6 and 297 K, both conformations were present in the crystal, suggesting conformational dynamics in the functionally relevant pH regime. Taken together, crystallography and molecular dynamics underline a possible mechanism for pH-dependent structural switching.

  1. Computational analysis of Amsacrine resistance in human topoisomerase II alpha mutants (R487K and E571K) using homology modeling, docking and all-atom molecular dynamics simulation in explicit solvent.

    Science.gov (United States)

    Sader, Safaa; Wu, Chun

    2017-03-01

    Amsacrine is an effective topoisomerase II enzyme inhibitor in acute lymphatic leukemia. Previous experimental studies have successfully identified two important mutations (R487K and E571K) conferring 100-fold and 25-fold resistance to Amsacrine, respectively. Although the reduction of the ligand-DNA-protein cleavage ternary complex has been widely thought to be the major cause of drug resistance, the detailed energetic, structural and dynamic mechanisms remain elusive. In this study, we constructed a homology model of human topoisomerase II alpha (hTop2α) docked with Amsacrine, based on the crystal structure of human Top2β in complex with etoposide. This wild-type complex was used to build the ternary complexes of the R487K and E571K mutants. Three 500 ns molecular dynamics simulations were performed on the wild-type and two mutant complex systems. Detailed energetic, structural and dynamic analyses were performed on the simulation data. Our binding data indicated a significant impairment of Amsacrine binding energy in the two mutants compared with the wild type. The order of weakening (R487K>E571K) was in agreement with the order of experimental drug resistance fold (R487K>E571K). Our binding energy decomposition further indicated that weakening of the ligand-protein interaction rather than the ligand-DNA interaction was the major contributor to the binding energy difference between R487K and E571K. In addition, key residues contributing to the binding energy (ΔG) or the decrease of the binding energy (ΔΔG) were identified through the energy decomposition analysis. The change in ligand binding pose and the dynamics of protein, DNA and ligand upon the mutations were thoroughly analyzed and discussed. Deciphering the molecular basis of drug resistance is crucial to overcoming drug resistance using rational drug design. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Comparison of computer optometer, Orbscan II and comprehensive optometry in measurement of astigmatism

    Institute of Scientific and Technical Information of China (English)

    李凯; 王育良; 吴静; 张玉娟

    2010-01-01

    Objective: To analyze the differences and relationships between the cylinder power and astigmatism axis measured with the Orbscan II anterior-segment analysis system and an automated computer optometer, and those obtained by comprehensive optometry, in myopic astigmatism patients before excimer laser surgery. Methods: 130 patients (250 eyes) with myopic astigmatism were examined and compared using four methods: the Orbscan II anterior-segment analysis system, computer optometry with undilated pupils, computer optometry after mydriasis, and comprehensive optometry. Results: The axes measured by comprehensive optometry, small-pupil computer optometry and mydriatic computer optometry were essentially consistent, with no statistically significant difference (P>0.05), whereas the axes measured by Orbscan II differed significantly from those of comprehensive optometry (P<0.01); the astigmatism powers measured by the three methods (comprehensive optometry, small-pupil and mydriatic computer optometry) differed significantly (P<0.01). Conclusion: Corneal topography reflects the refractive status of the cornea, and the computer optometer determines the astigmatism axis with relatively high accuracy, but clinical examination should still take comprehensive optometry before and after cycloplegia as the standard for determining myopic astigmatism and its axis.

  3. Preparation, Spectrochemical, and Computational Analysis of L-Carnosine (2-[(3-Aminopropanoyl)amino]-3-(1H-imidazol-5-yl)propanoic Acid) and Its Ruthenium(II) Coordination Complexes in Aqueous Solution

    Directory of Open Access Journals (Sweden)

    Myalo Sabela

    2011-12-01

    Full Text Available This study reports the synthesis and characterization of novel ruthenium(II) complexes with the polydentate dipeptide L-carnosine (2-[(3-aminopropanoyl)amino]-3-(1H-imidazol-5-yl)propanoic acid). Mixed-ligand complexes with the general composition [MLp(Cl)q(H2O)r]·xH2O (M = Ru(II); L = L-carnosine; p = 3 − q; r = 0–1; and x = 1–3) were prepared by refluxing aqueous solutions of the ligand with equimolar amounts of ruthenium chloride (black-alpha form) at 60 °C for 36 h. Physical properties of the complexes were characterized by elemental analysis, DSC/TGA, and cyclic voltammetry. The molecular structures of the complexes were elucidated using UV-Vis, ATR-IR, and heteronuclear NMR spectroscopy, then confirmed by density functional theory (DFT) calculations at the B3LYP/LANL2DZ level. Two-dimensional NMR experiments (1H COSY, 13C gHMBC, and 15N gHMBC) were also conducted for the assignment of chemical shifts and the calculation of relative coordination-induced shifts (RCIS) caused by complex formation. According to our results, the most probable coordination geometries of ruthenium in these compounds involve nitrogen (N1) from the imidazole ring and an oxygen atom from the carboxylic acid group of the ligand as donor atoms. Additional thermogravimetric and electrochemical data suggest that while the tetrahedral monomer and the octahedral dimer are both possible structures of the formed complexes, the metal in either structure occurs in the (2+) oxidation state. The resulting RCIS values indicate that the amide carbonyl and the amino terminus of the dipeptide are not involved in chelation, and these observations correlate well with theoretical shift predictions by DFT.

  4. The Causal-Compositional Concept of Information—Part II: Information through Fairness: How Does the Relationship between Information, Fairness and Language Evolve, Stimulate the Development of (New) Computing Devices and Help to Move towards the Information Society

    Directory of Open Access Journals (Sweden)

    Gerhard Luhn

    2012-09-01

    Full Text Available We are moving towards the information society, and we need to overcome the discouraging perspective caused by the false belief that our thoughts (and thereby also our actions) represent a somehow externally existing world. Indeed, it is already a step forward to proclaim that there exists a somehow common world for all people. But if those internal forms of representation are primarily bound to the subject itself, then, consequently, anybody can argue for his or her view of the world as being the “right” one. What, then, is the exit strategy out of this dilemma? It is information; information as understood in its actual and potential dimensions, in its identity of structure and meaning. Such an approach requires a more deeply elaborated conceptual framework. The goal of this study is to show that such a concept is glued together by the strong relationship between seemingly unrelated disciplines: physics, semantics (semiotics/cognition), computer science, and even poetry. The term “information” is nowadays discussed and elaborated in all those disciplines; hence, there is no shortcut, no way around it. The aim of this study is not only to show that those strong relationships exist. We will also see, within the same horizon, that new kinds of computing systems become possible based on such a concept. Nowadays, energy consumption is becoming a major issue for computing systems. We work towards an approach that enables new devices consuming a minimum amount of energy while maximizing performance at the same time. And within the same horizon it becomes possible to release the saved energy towards a new ethical spirit—towards the information society.

  5. Chromatin computation.

    Directory of Open Access Journals (Sweden)

    Barbara Bryant

    Full Text Available In living cells, DNA is packaged along with protein and RNA into chromatin. Chemical modifications to nucleotides and histone proteins are added, removed and recognized by multi-functional molecular complexes. Here I define a new computational model, in which chromatin modifications are information units that can be written onto a one-dimensional string of nucleosomes, analogous to the symbols written onto cells of a Turing machine tape, and chromatin-modifying complexes are modeled as read-write rules that operate on a finite set of adjacent nucleosomes. I illustrate the use of this "chromatin computer" to solve an instance of the Hamiltonian path problem. I prove that chromatin computers are computationally universal--and therefore more powerful than the logic circuits often used to model transcription factor control of gene expression. Features of biological chromatin provide a rich instruction set for efficient computation of nontrivial algorithms in biological time scales. Modeling chromatin as a computer shifts how we think about chromatin function, suggests new approaches to medical intervention, and lays the groundwork for the engineering of a new class of biological computing machines.
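
The read-write-rule model described above can be illustrated with a toy simulation: a one-dimensional tape of nucleosome marks and rules that rewrite windows of adjacent nucleosomes. The marks, the rule format, and the single "spreading" rule below are invented for illustration; they are not taken from the paper:

```python
# Toy "chromatin computer": chromatin modifications are symbols on a 1-D
# string of nucleosomes, and a chromatin-modifying complex is modeled as a
# read-write rule over a small window of adjacent nucleosomes.

def apply_rules(tape, rules, max_steps=100):
    """Repeatedly rewrite the first matching window until no rule fires."""
    tape = list(tape)
    for _ in range(max_steps):
        fired = False
        for pattern, replacement in rules:
            k = len(pattern)
            for i in range(len(tape) - k + 1):
                if tape[i:i + k] == list(pattern):
                    tape[i:i + k] = list(replacement)
                    fired = True
                    break
            if fired:
                break
        if not fired:
            break
    return tape

# A hypothetical "spreading" rule: an active mark (A) converts a neighbouring
# unmodified nucleosome (U) -- crudely analogous to heterochromatin spreading.
rules = [(("A", "U"), ("A", "A"))]
print(apply_rules(["A", "U", "U", "U"], rules))   # ['A', 'A', 'A', 'A']
```

Richer rule sets over larger windows give the universality the abstract claims; this sketch only shows the write-onto-the-tape mechanic.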

  6. Conjunctival impression cytology in computer users.

    Science.gov (United States)

    Kumar, S; Bansal, R; Khare, A; Malik, K P S; Malik, V K; Jain, K; Jain, C

    2013-01-01

    It is known that computer users develop the features of dry eye. To study the cytological changes in the conjunctiva, conjunctival impression cytology was performed in computer users and a control group. Fifteen eyes of computer users who had used computers for more than one year and ten eyes of an age- and sex-matched control group (those who had not used computers) were studied. Conjunctival impression cytology (CIC) results in the control group were of stage 0 and stage I, while the computer-user group showed CIC results between stage II and stage IV. Among the computer users, the majority (>90%) showed stage III and stage IV changes. We found that those who used computers daily for long hours developed more CIC changes than those who worked at the computer for a shorter daily duration. © NEPjOPH.

  7. Computing methods

    CERN Document Server

    Berezin, I S

    1965-01-01

    Computing Methods, Volume 2 is a five-chapter text that presents numerical methods for solving sets of several mathematical equations. This volume covers the computation of sets of linear algebraic equations, high-degree and transcendental equations, numerical methods for finding eigenvalues, and approximate methods for solving ordinary differential equations, partial differential equations and integral equations. The book is intended as a textbook for students in mechanical-mathematical and physics-mathematical faculties specializing in computer mathematics and persons interested in the

  8. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  9. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  10. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.

  11. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research...... concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1.) an intermediary step between any theoretical construct...... and its targeted empirical space and 2.) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project we also seek to introduce to scholars of religion some...

  12. COMPUTERS HAZARDS

    Directory of Open Access Journals (Sweden)

    Andrzej Augustynek

    2007-01-01

    Full Text Available In June 2006, over 12.6 million Polish users of the Web were registered. On average, each of them spent 21 hours and 37 minutes monthly browsing the Web. That is why the psychological aspects of computer utilization have become an urgent research subject. The results of research into the development of the Polish information society, carried out at the AGH University of Science and Technology under the leadership of Leslaw H. Haber from 2000 until the present time, indicate the emergence of dynamic changes in the ways computers are used and in their circumstances. One of the interesting regularities has been the inversely proportional relation between the level of computer skills and the frequency of Web utilization. It has been found that in 2005, compared to 2000, the following changes occurred:- A significant drop in the number of students who never used computers and the Web;- A remarkable increase in computer knowledge and skills (particularly pronounced in the case of first-year students);- A decreasing gap in computer skills between students of the first and third years, and between male and female students;- A declining popularity of computer games. It has also been demonstrated that the hazard of computer screen addiction was highest in the case of unemployed youth outside the school system. As much as 12% of this group of young people were addicted to computers. The abundant leisure time that these youths enjoyed induced them to excessive utilization of the Web. Polish housewives are another population group at risk of addiction to the Web. The growing duration of Web chats carried out by ever younger users has been another matter of concern. Since the phenomenon of computer addiction is relatively new, no specific therapy methods have been developed. In general, the therapy applied to computer addiction syndrome is similar to the techniques applied in cases of alcohol or gambling addiction. 
Individual and group

  13. Quantum Computers

    Science.gov (United States)

    2010-03-04

    efficient or less costly than their classical counterparts. A large-scale quantum computer is certainly an extremely ambitious goal, appearing to us...outperform the largest classical supercomputers in solving some specific problems important for data encryption. In the long term, another application...which the quantum computer depends, causing the quantum mechanically destructive process known as decoherence. Decoherence comes in several forms

  14. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of the state of the art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  15. Small Diameter Bomb Increment II (SDB II)

    Science.gov (United States)

    2013-12-01

    been further delays to the F-35 System Development and Demonstration (SDD) program. As a result, the SDB II integration will be accomplished as a...follow-on integration to the F-35 SDD. SDB II OT&E on the F-35 will not be completed by the FRP threshold of October 2019, thus delaying the FRP decision

  16. Computational oncology.

    Science.gov (United States)

    Lefor, Alan T

    2011-08-01

    Oncology research has traditionally been conducted using techniques from the biological sciences. The new field of computational oncology has forged a new relationship between the physical sciences and oncology to further advance research. By applying physics and mathematics to oncologic problems, new insights will emerge into the pathogenesis and treatment of malignancies. One major area of investigation in computational oncology centers around the acquisition and analysis of data, using improved computing hardware and software. Large databases of cellular pathways are being analyzed to understand the interrelationship among complex biological processes. Computer-aided detection is being applied to the analysis of routine imaging data including mammography and chest imaging to improve the accuracy and detection rate for population screening. The second major area of investigation uses computers to construct sophisticated mathematical models of individual cancer cells as well as larger systems using partial differential equations. These models are further refined with clinically available information to more accurately reflect living systems. One of the major obstacles in the partnership between physical scientists and the oncology community is communications. Standard ways to convey information must be developed. Future progress in computational oncology will depend on close collaboration between clinicians and investigators to further the understanding of cancer using these new approaches.

  17. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials, with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  18. Computer-based Astronomy Labs for Non-science Majors

    Science.gov (United States)

    Smith, A. B. E.; Murray, S. D.; Ward, R. A.

    1998-12-01

    We describe and demonstrate two laboratory exercises, Kepler's Third Law and Stellar Structure, which are being developed for use in an astronomy laboratory class aimed at non-science majors. The labs run with Microsoft's Excel 98 (Macintosh) or Excel 97 (Windows). They can be run in a classroom setting or in an independent learning environment. The intent of the labs is twofold; first and foremost, students learn the subject matter through a series of informational frames. Next, students enhance their understanding by applying their knowledge in lab procedures, while also gaining familiarity with the use and power of a widely-used software package and scientific tool. No mathematical knowledge beyond basic algebra is required to complete the labs or to understand the computations in the spreadsheets, although the students are exposed to the concepts of numerical integration. The labs are contained in Excel workbook files. In the files are multiple spreadsheets, which contain either a frame with information on how to run the lab, material on the subject, or one or more procedures. Excel's VBA macro language is used to automate the labs. The macros are accessed through button interfaces positioned on the spreadsheets. This is done intentionally so that students can focus on learning the subject matter and the basic spreadsheet features without having to learn advanced Excel features all at once. Students open the file and progress through the informational frames to the procedures. After each procedure, student comments and data are automatically recorded in a preformatted Lab Report spreadsheet. Once all procedures have been completed, the student is prompted for a filename in which to save their Lab Report. The lab reports can then be printed or emailed to the instructor. The files will have full worksheet and workbook protection, and will have a "redo" feature at the end of the lab for students who want to repeat a procedure.

  19. Belle-II Experiment Network Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Asner, David [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Bell, Greg [ESnet; Carlson, Tim [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Cowley, David [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Dart, Eli [ESnet; Erwin, Brock [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Godang, Romulus [Univ. of South Alabama, Mobile, AL (United States); Hara, Takanori [High Energy Accelerator Research Organization (KEK), Tsukuba (Japan); Johnson, Jerry [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Johnson, Ron [Univ. of Washington, Seattle, WA (United States); Johnston, Bill [ESnet; Dam, Kerstin Kleese-van [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Kaneko, Toshiaki [High Energy Accelerator Research Organization (KEK), Tsukuba (Japan); Kubota, Yoshihiro [NII; Kuhr, Thomas [Karlsruhe Inst. of Technology (KIT) (Germany); McCoy, John [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Miyake, Hideki [High Energy Accelerator Research Organization (KEK), Tsukuba (Japan); Monga, Inder [ESnet; Nakamura, Motonori [NII; Piilonen, Leo [Virginia Polytechnic Inst. and State Univ. (Virginia Tech), Blacksburg, VA (United States); Pordes, Ruth [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Ray, Douglas [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Russell, Richard [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Schram, Malachi [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Schroeder, Jim [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Sevior, Martin [Univ. 
of Melbourne (Australia); Singh, Surya [Pacific Northwest National Laboratory (PNNL), Richland, WA (United States); Suzuki, Soh [High Energy Accelerator Research Organization (KEK), Tsukuba (Japan); Sasaki, Takashi [High Energy Accelerator Research Organization (KEK), Tsukuba (Japan); Williams, Jim [Indiana Univ., Bloomington, IN (United States)

    2013-05-28

    The Belle experiment, part of a broad-based search for new physics, is a collaboration of ~400 physicists from 55 institutions across four continents. The Belle detector is located at the KEKB accelerator in Tsukuba, Japan. The Belle detector was operated at the asymmetric electron-positron collider KEKB from 1999-2010. The detector accumulated more than 1 ab-1 of integrated luminosity, corresponding to more than 2 PB of data near 10 GeV center-of-mass energy. Recently, KEK has initiated a $400 million accelerator upgrade to be called SuperKEKB, designed to produce instantaneous and integrated luminosity two orders of magnitude greater than KEKB. The new international collaboration at SuperKEKB is called Belle II. The first data from Belle II/SuperKEKB is expected in 2015. In October 2012, senior members of the Belle-II collaboration gathered at PNNL to discuss the computing and networking requirements of the Belle-II experiment with ESnet staff and other computing and networking experts. The day-and-a-half-long workshop characterized the instruments and facilities used in the experiment, the process of science for Belle-II, and the computing and networking equipment and configuration requirements to realize the full scientific potential of the collaboration's work.

  20. [RuII(NO+)]3+-core complexes with 4-methyl-pyrimidine and ethyl-isonicotinate: synthesis, X-ray structure, spectroscopy, and computational and NO-release studies upon UVA irradiation.

    Science.gov (United States)

    Tamasi, Gabriella; Curci, Matia; Berrettini, Francesco; Justice, Nicholas; Sega, Alessandro; Chiasserini, Luisa; Cini, Renzo

    2008-10-01

    The reaction of RuCl3(NO)·H2O with 4-methylpyrimidine (MePYM) and ethyl isonicotinate (EINT) in absolute ethanol at 40-55 °C afforded crystalline trans-[RuCl3(NO)L2] complexes. Structural studies via X-ray diffraction and spectroscopic methods (NMR, IR, UV-visible (UV-Vis)) revealed that the molecular structures have the two L ligands in trans (axial) positions, with the chloride anions and the NO+ cation as equatorial ligands; a pyrimidine...pyrimidine pairing pattern via two weak C-H...N interactions occurs. The molecular structure of the EINT derivative was inferred from spectroscopy and computations. Under irradiation at 366 nm, several solutions of the title compounds deliver NO via first-order processes. Visible light (420-700 nm) does not produce significant NO release from CH2Cl2 and CH3CN solutions within 24 h.

  1. Optimizing the structure of Tetracyanoplatinate(II)

    DEFF Research Database (Denmark)

    Dohn, Asmus Ougaard; Møller, Klaus Braagaard; Sauer, Stephan P. A.

    2013-01-01

    The geometry of tetracyanoplatinate(II) (TCP) has been optimized with density functional theory (DFT) calculations in order to compare different computational strategies. Two approximate scalar relativistic methods, i.e. the scalar zeroth-order regular approximation (ZORA) and non-relativistic ca...

  2. Generating Analog IC Layouts with LAYGEN II

    CERN Document Server

    Martins, Ricardo M F; Horta, Nuno C G

    2013-01-01

    This book presents an innovative methodology for the automatic generation of analog integrated circuits (ICs) layout, based on template descriptions and on evolutionary computational techniques. A design automation tool, LAYGEN II, was implemented to validate the proposed approach giving special emphasis to reusability of expert design knowledge and to efficiency on retargeting operations.

  3. Analysis methods for neutron-induced resonances in time-of-flight transmission experiments, and automation of these methods on an IBM 7094 II computer

    Energy Technology Data Exchange (ETDEWEB)

    Corge, C

    1967-07-01

    The analysis of neutron-induced resonances aims to determine the characteristics of the neutron resonances: the excitation energies, the probabilities of de-excitation by gamma-ray emission, neutron emission or fission, their spins, their parities... This document describes the methods developed or adapted, the calculation schemes, and the algorithms implemented to perform such analyses on a computer, using data obtained during time-of-flight experiments on the Saclay linear accelerator. (A.L.B.)

  4. Computational fluid dynamics (CFD) simulation of hydrogen emission in the battery rooms of the new technological safeguards system of the Vandellos II nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Aleman, A.; Arino, X; Colomer, C.

    2010-07-01

    CFD (Computational Fluid Dynamics) technology is a powerful tool used when traditional engineering methods cannot address the complexity of a problem and the construction of prototypes is to be avoided. Natural ventilation and the transport of hydrogen gas is a problem for which no models based on experimental data or analytical expressions can capture the complex behaviour of the fluid, but it can be addressed by the use of CFD. (Author). 3 Refs.

  5. Radix-3 Algorithm for Realization of Type-II Discrete Sine Transform

    Directory of Open Access Journals (Sweden)

    M. N. Murty

    2015-06-01

    Full Text Available In this paper, a radix-3 algorithm for the computation of the type-II discrete sine transform (DST-II) of length N = 3^m (m = 1, 2, ...) is presented. The DST-II of length N can be realized from three DST-II sequences, each of length N/3. A block diagram for the computation of the radix-3 DST-II algorithm is given. A signal flow graph for the DST-II of length N = 3^2 is shown to clarify the proposed algorithm.
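
A direct O(N²) implementation of the DST-II can serve as a correctness baseline when developing a fast radix-3 version. The unnormalized kernel convention below is one common choice and is an assumption here, since the paper's exact normalization is not reproduced in this record:

```python
# Reference (naive, O(N^2)) type-II DST, using one common unnormalized
# convention: X[k] = sum_n x[n] * sin(pi*(k+1)*(2n+1)/(2N)).
# A fast radix-3 algorithm like the one in the paper would split a
# length-N transform (N = 3**m) into three length-N/3 transforms;
# this direct version is only a baseline against which to test it.
import math

def dst2(x):
    N = len(x)
    return [sum(x[n] * math.sin(math.pi * (k + 1) * (2 * n + 1) / (2 * N))
                for n in range(N))
            for k in range(N)]

# Sanity check at N = 1: X[0] = x[0] * sin(pi/2) = x[0].
print(dst2([5.0]))   # [5.0]
```

Any radix-3 implementation restricted to N = 3^m inputs can then be validated by comparing its output against this definition on random vectors.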

  6. Chiari II malformation. Pt. 4

    Energy Technology Data Exchange (ETDEWEB)

    Naidich, T.P.; McLone, D.G.; Fulling, K.H.

    1983-08-01

    Computed tomography successfully delineates the multiple components of the Chiari II malformation at the craniocervical junction, the hindbrain, and the cervical spinal cord. These include a wide foramen magnum and upper cervical spinal canal; incomplete fusion of the posterior arches of C1 and the lower cervical vertebrae; cascading protrusions of the vermis, fourth ventricle, medulla, and cervical cord into the spinal canal; cervicomedullary "kinking"; anterior displacement and sequential sagittal compression of each protrusion by the protrusions posterior to it; compression of all protrusions by the posterior lip of the foramen magnum and the posterior arch of C1; and associated cervical hydromyelia, cervical diastematomyelia, and cervical arachnoid cysts.

  7. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  8. Factor II deficiency

    Science.gov (United States)

    ... if one or more of these factors are missing or are not functioning like they should. Factor II is one such coagulation factor. Factor II deficiency runs in families (inherited) and is very rare. Both parents must ...

  9. On the Lewis acidic character of bis(salicylaldiminato)zinc(ii) Schiff-base complexes: a computational and experimental investigation on a series of compounds varying the bridging diimine.

    Science.gov (United States)

    Forte, Giuseppe; Oliveri, Ivan Pietro; Consiglio, Giuseppe; Failla, Salvatore; Di Bella, Santo

    2017-03-20

    This contribution explores the effect of the 1,2-diimine bridge on the Lewis acidic character of a series of bis(salicylaldiminato)zinc(ii), ZnL, Schiff-base complexes. The structures of the monomeric and dimeric ZnL complexes, and of the 1 : 1 adducts with pyridine, ZnL·py, are fully optimized by means of DFT calculations. The Gibbs free energy for the dimerization of the ZnL complexes and for the formation of the ZnL·py adducts is evaluated by accurate composite calculations; it accounts for their spontaneous dimerization and for the greater stability of the ZnL·py adducts with respect to the dimers. Calculated binding constants for the formation of ZnL·py adducts are in excellent agreement with experimentally derived values, allowing a relative Lewis acidity scale to be established within this series. While the complex derived from the non-conjugated ethylenediamine shows the lowest Lewis acidity, the complex derived from diaminomaleonitrile is the strongest Lewis acidic species. These findings are in good agreement with the greater catalytic activity observed for ZnL Schiff-base complexes derived from conjugated 1,2-diamines in comparison with their non-conjugated analogues. In both the ZnL dimers and the ZnL·py adducts, the geometry of the coordination sphere appears to be a relevant feature in assessing their relative stability. Thus, while the quasi-planarity of the ZnL monomers of the conjugated diimines is an unfavourable feature in the dimerization process, it is an important aspect in stabilizing ZnL·py adducts in a nearly perfect square-pyramidal coordination. These features are relevant to the sensing and catalytic properties of these complexes.

  10. A novel azo-aldehyde and its Ni(II) chelate; synthesis, characterization, crystal structure and computational studies of 2-hydroxy-5-{(E)-[4-(propan-2-yl)phenyl]diazenyl}benzaldehyde

    Science.gov (United States)

    Eren, Tuğba; Kose, Muhammet; Sayin, Koray; McKee, Vickie; Kurtoglu, Mukerrem

    2014-05-01

A novel azo-salicylaldehyde, 2-hydroxy-5-{(E)-[4-(propan-2-yl)phenyl]diazenyl}benzaldehyde, and its Ni(II) chelate were obtained and characterized by analytical and spectral techniques. The molecular structure of the azo-chromophore-containing azo-aldehyde was determined by single-crystal X-ray crystallography. The X-ray data show that the compound crystallizes in the orthorhombic Pbca space group with unit cell parameters a = 11.2706(9), b = 8.3993(7), c = 28.667(2) Å, V = 2713.7(4) Å3 and Z = 8. There is a strong phenol-aldehyde (OH⋯O) hydrogen bond forming an S(6) hydrogen bonding motif in the structure. There is also a weaker inter-molecular phenol-aldehyde (OH⋯O) hydrogen bond resulting in a dimeric structure and generating a D22(4) hydrogen bonding motif. Hydrogen-bonded dimers are linked by π-π interactions within the structure. The azo-aldehyde ligand behaved as a bidentate ligand, coordinating through the nitrogen atom of the azomethine group and the oxygen atom of the phenolic hydroxyl group. Additionally, optimized structures of the three possible tautomers of the compound were obtained using the B3LYP method with 6-311++G(d,p), 6-31G and 3-21G basis sets in the gas phase; the B3LYP/6-311++G(d,p) level was found to give the best results. The electronic spectra of the compounds in the 200-800 nm range were obtained in three organic solvents.
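
The reported cell volume can be checked directly from the cell lengths: in the orthorhombic system all cell angles are 90°, so the volume is simply V = a·b·c.

```python
# Orthorhombic unit cell: alpha = beta = gamma = 90 degrees, so V = a * b * c.
a, b, c = 11.2706, 8.3993, 28.667  # cell lengths from the abstract, in angstroms
V = a * b * c  # agrees with the reported 2713.7(4) A^3 within its uncertainty
```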

  11. Inducing magnetic communication in caged dinuclear Co(II) systems.

    Science.gov (United States)

    Caballero-Jiménez, Judith; Habib, Fatemah; Ramírez-Rosales, Daniel; Grande-Aztatzi, Rafael; Merino, Gabriel; Korobkov, Ilia; Singh, Mukesh Kumar; Rajaraman, Gopalan; Reyes-Ortega, Yasmi; Murugesu, Muralee

    2015-05-14

The synthesis and the structural, electronic and magnetic characterization of five dinuclear Co(II) azacryptand compounds (1-5) bridged through different ions are reported. The magnetic exchange interactions (2J values) obtained from theoretical computations show that variation of the intermetallic angles and distances leads to antiferromagnetic behaviour. Magneto-structural correlations show a trend in which Co(II)-bridge-Co(II) angles closer to 180° favour a more efficient superexchange pathway, leading to stronger antiferromagnetic interactions.

  12. Multiple, simultaneous, independent gradients for a versatile multidimensional liquid chromatography. Part II: Application 2: Computer controlled pH gradients in the presence of urea provide improved separation of proteins: Stability influenced anion and cation exchange chromatography.

    Science.gov (United States)

    Hirsh, Allen G; Tsonev, Latchezar I

    2017-04-28

This paper details the use of a method of creating controlled pH gradients (pISep) to improve the separation of protein isoforms on ion exchange (IEX) stationary phases in the presence of various isocratic levels of urea. The pISep technology enables the development of computer-controlled pH gradients on both cationic (CEX) and anionic (AEX) IEX stationary phases over the very wide pH range from 2 to 12. In pISep, titration curves generated by proportional mixing of the acidic and basic pISep working buffers, alone or in the presence of non-buffering solutes such as the neutral salt NaCl (0-1 M), polar organics such as urea (0-8 M) or acetonitrile (0-80 vol%), can be fitted with high fidelity using high-order polynomials. This, in turn, allows construction of a mathematical manifold, %A (% acidic pISep buffer) vs. pH vs. [non-buffering solute], permitting precise computer control of pH and of the non-buffering solute concentration, and allowing the formation of dual, uncoupled liquid chromatographic (LC) gradients of arbitrary shape (Hirsh and Tsonev, 2012 [1]). The separation of protein isoforms examined in this paper by means of such pH gradients in the presence of urea demonstrates the fractionation power of a true single-step two-dimensional liquid chromatography, which we denote Stability-Influenced Ion Exchange Chromatography (SIIEX). We present evidence that SIIEX is capable of increasing the resolution of protein isoforms that are difficult to separate by ordinary pH-gradient IEX, and of potentially simplifying the development of laboratory and production purification strategies involving on-column simultaneous pH and urea unfolding or refolding of targeted proteins. We model some of the physics implied by the dynamics of the observed protein fractionations as a function of both urea concentration and pH, assuming that urea-induced native-state unfolding competes with native-state electrostatic binding to the IEX stationary phase. Implications for in vivo protein
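
The buffer-control idea described here — fit the measured titration curve with a high-order polynomial, then invert it numerically to find the mixing ratio that delivers a target pH — can be sketched as follows. The titration data below are synthetic (a smooth sigmoid stand-in); real pISep curves are measured experimentally.

```python
import numpy as np

# Synthetic stand-in for a measured titration curve: pH as a function of
# %A, the percentage of the acidic working buffer in the mix.
pct_acidic = np.linspace(0.0, 100.0, 21)
ph = 12.0 - 10.0 / (1.0 + np.exp(-(pct_acidic - 50.0) / 12.0))

# High-order polynomial fit of pH vs. %A, as described for pISep.
# Fit in a scaled variable so the degree-7 fit stays well conditioned.
x = pct_acidic / 100.0
fitted = np.poly1d(np.polyfit(x, ph, deg=7))

# Numerical inversion of the fitted curve: which %A delivers a target pH?
grid = np.linspace(0.0, 1.0, 10001)
pct_for_ph7 = 100.0 * grid[np.argmin(np.abs(fitted(grid) - 7.0))]
```

With a second fitted surface in the solute dimension (urea or NaCl), the same inversion yields the %A and solute settings for any point on the %A vs. pH vs. [solute] manifold, which is what permits the dual, uncoupled gradients.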

  13. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

The second half of the 20th century has been characterized by an explosive development in information technology (Maney, Hamm, & O'Brien, 2011). Processing power, storage capacity and network bandwidth have increased exponentially, resulting in new possibilities and shifting IT paradigms. In step with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production...

  14. Quantum computers.

    Science.gov (United States)

    Ladd, T D; Jelezko, F; Laflamme, R; Nakamura, Y; Monroe, C; O'Brien, J L

    2010-03-04

    Over the past several decades, quantum information science has emerged to seek answers to the question: can we gain some advantage by storing, transmitting and processing information encoded in systems that exhibit unique quantum properties? Today it is understood that the answer is yes, and many research groups around the world are working towards the highly ambitious technological goal of building a quantum computer, which would dramatically improve computational power for particular tasks. A number of physical systems, spanning much of modern physics, are being developed for quantum computation. However, it remains unclear which technology, if any, will ultimately prove successful. Here we describe the latest developments for each of the leading approaches and explain the major challenges for the future.

  15. Computational mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Raboin, P J

    1998-01-01

The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance, as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable in driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) on "Springback Predictability" and with the Federal Aviation Administration (FAA) on the "Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris." In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.

  16. Computational Artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to clarify the issue and in doing so revisits and reconsiders the notion of ‘computational artifact’.

  17. Distributed computing

    CERN Document Server

    Van Renesse, R

    1991-01-01

    This series will start with an introduction to distributed computing systems. Distributed computing paradigms will be presented followed by a discussion on how several important contemporary distributed operating systems use these paradigms. Topics will include processing paradigms, storage paradigms, scalability and robustness. Throughout the course everything will be illustrated by modern distributed systems notably the Amoeba distributed operating system of the Free University in Amsterdam and the Plan 9 operating system of AT&T Bell Laboratories. Plan 9 is partly designed and implemented by Ken Thompson, the main person behind the successful UNIX operating system.

  18. Computer busses

    CERN Document Server

    Buchanan, William

    2000-01-01

As more and more equipment is interface- or 'bus'-driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important, both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for the particular application. There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed using a top-down approach, helping...

  19. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope is hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  1. Reconfigurable Computing

    CERN Document Server

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a comp

  2. COMPUTATIONAL THINKING

    OpenAIRE

    Evgeniy K. Khenner

    2016-01-01

Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and right to affirmation in Russian education. Methods. The research is based on the analysis of foreign studies of the phenomenon of computational thinking and the ways of its formation in the process of education;...

  3. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  4. Computer immunology.

    Science.gov (United States)

    Forrest, Stephanie; Beauchemin, Catherine

    2007-04-01

    This review describes a body of work on computational immune systems that behave analogously to the natural immune system. These artificial immune systems (AIS) simulate the behavior of the natural immune system and in some cases have been used to solve practical engineering problems such as computer security. AIS have several strengths that can complement wet lab immunology. It is easier to conduct simulation experiments and to vary experimental conditions, for example, to rule out hypotheses; it is easier to isolate a single mechanism to test hypotheses about how it functions; agent-based models of the immune system can integrate data from several different experiments into a single in silico experimental system.

  5. Computer systems

    Science.gov (United States)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  6. Computer viruses

    Science.gov (United States)

    Denning, Peter J.

    1988-01-01

The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs, and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.

  7. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  8. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered set of skills rather than one single skill. Skills acquisition at these layers can be tailored to the specific needs of students. The work presented here builds upon experience from courses for such students from the Humanities in which programming is taught as a tool for other purposes. Results...

  9. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches to problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). The book illustrates Riemannian computing theory through applications in computer vision, machine learning, and robotics, with emphasis on algorithmic advances that will allow re-application in other...

  10. Comparisons Between Ionic Diffusion and Electromigration in Moving Boundary System and Isotachophoresis Formed by Strong Electrolytes at Steady State:II. Computer-aided Analyses of Experimental Data 1

    Institute of Scientific and Technical Information of China (English)

    Li Renzhi; Xu Hongbin; Cao Chengxi

    1999-01-01

In this paper, with the computer-aided analyses and the mathematical expressions defined in the accompanying report, the authors compare the fluxes of ionic diffusion and electromigration in a moving boundary system (MBS) and in isotachophoresis (ITP) formed by strong electrolytes at steady state. Firstly, the results show that the ratios of the ionic diffusion flux to the ionic electromigration flux are, in general, less than 10 percent in a moving boundary at the steady state of MBS (or ITP) formed by different strong electrolytes, and the ratios in a stationary (concentration) boundary at steady state are all less than 1 percent. Thus, the flux of ionic diffusion is small and can be approximately neglected in comparison with that of ionic electromigration at the steady state of MBS (or ITP). This result, coupled with the experiments and theoretical derivations by McInnes, Longsworth and Cowperthwaite about 60 years ago, shows the validity of the usual assumption that ionic diffusion may be omitted in MBS or ITP. Secondly, the ratios for a stationary boundary (in the range from 0.01% to 1%) are much smaller than those for a moving boundary (generally in the range from 1% to 10%); these results imply that the gradual widening of a stationary boundary is caused not only by ionic diffusion but also by other factors, such as the absence of a 'self-regulating effect' in a stationary boundary. Thirdly, the results are significant for biomedicine, for example drug release by diffusion or by an electric field.
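
The order of magnitude of the quoted flux ratios can be estimated from the Nernst-Planck description of a single ionic species; notably, the diffusion coefficient cancels out of the ratio. A rough sketch with illustrative numbers (not taken from the paper):

```python
F = 96485.0  # Faraday constant, C/mol
R = 8.314    # gas constant, J/(mol*K)
T = 298.15   # temperature, K
z = 1        # ion charge number

# Nernst-Planck fluxes: J_diff = -D*dc/dx and J_mig = -(z*F/(R*T))*D*c*dphi/dx,
# so |J_diff / J_mig| = (R*T / (|z|*F)) * |dc/dx| / (c * |E|), independent of D.
def flux_ratio(dcdx, c, E):
    """Ratio of diffusive to electromigration flux (SI units: mol/m^4, mol/m^3, V/m)."""
    return (R * T / (abs(z) * F)) * abs(dcdx) / (c * abs(E))

# A 10 mol/m^3 (10 mM) concentration step spread over 1 mm, in a 1 kV/m field:
ratio = flux_ratio(dcdx=10.0 / 1e-3, c=10.0, E=1e3)  # a few percent
```

For these illustrative values the ratio comes out at roughly 2-3 percent, consistent with the abstract's finding that diffusion contributes less than 10 percent of the flux in a moving boundary.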

  11. A novel method for measuring and monitoring monobloc distraction osteogenesis using three-dimensional computed tomography rendered images with the "biporion-dorsum sellae" plane. Part II: comparison of measurements before and after distraction.

    Science.gov (United States)

    Salyer, Kenneth E; Por, Yong-Chen; Genecov, David G; Barcelo, Carlos Raul

    2008-03-01

    The aim was to assess the stability of monobloc distraction osteogenesis using three-dimensional computed tomographic (CT) scan volume-rendered images with the "biporion-dorsum sellae" plane. This was a prospective study of patients undergoing monobloc internal distraction osteogenesis at the International Craniofacial Institute, Dallas, TX. Measurements were made of the perpendicular distance of 8 skeletal facial points to the static "biporion-dorsum sellae" plane. The statistical analyses were performed with the paired-samples t test in SPSS. Three male patients were included in the study. Of these patients, 2 had Apert syndrome (A, B) and 1 had Carpenter syndrome (C). The mean age was 73 (range 30-112) months, and the mean follow up was 14 (range 8-12) months. The consolidation period was 17, 23, and 28 weeks in each patient, respectively. In patient A, the paired-samples t test of matched points was P = 0.022. Further analysis of the three-dimensional lateral profile revealed an obvious relapse, and predistractor removal CT scans (at 17 weeks) also showed deficient bone growth across the distraction gaps at the anterior cranial fossae and the temporal bones. In contrast, patients B and C showed a stable outcome after distraction and after removal of distraction devices. On analysis of the predistractor removal three-dimensional CT scans (23 and 28 weeks, respectively), there was more bone growth across the distraction gaps at the anterior cranial fossa and temporal bones. The "biporion-dorsum sellae" plane was used to assess the results of monobloc distraction osteogenesis. Relapse was associated with inadequate bone growth across the anterior cranial fossa and temporal bone. The findings seem to point the way for an increased consolidation period and more detailed examination of the CT scans before removal of internal distraction devices.

  12. Computational biology

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann’s early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved t...

  13. Computational Physics.

    Science.gov (United States)

    Borcherds, P. H.

    1986-01-01

    Describes an optional course in "computational physics" offered at the University of Birmingham. Includes an introduction to numerical methods and presents exercises involving fast-Fourier transforms, non-linear least-squares, Monte Carlo methods, and the three-body problem. Recommends adding laboratory work into the course in the…

  14. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

One of the major challenges in today's post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these derivatives have been determined using bump-and-revalue, but the increasing magnitude of these computations...

  16. Computational Logistics

    DEFF Research Database (Denmark)

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  18. Computing News

    CERN Multimedia

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...

  19. Computational trigonometry

    Energy Technology Data Exchange (ETDEWEB)

Gustafson, K. [Univ. of Colorado, Boulder, CO (United States)]

    1994-12-31

By means of the author's earlier theory of antieigenvalues and antieigenvectors, a new computational approach to iterative methods is presented. This enables an explicit trigonometric understanding of iterative convergence and provides new insights into the sharpness of error bounds. Direct applications to Gradient descent, Conjugate gradient, GCR(k), Orthomin, CGN, GMRES, CGS, and other matrix iterative schemes will be given.
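
Gustafson's first antieigenvalue has a closed form for a symmetric positive definite matrix A: mu_1 = 2*sqrt(lambda_min*lambda_max)/(lambda_min + lambda_max), the cosine of the largest angle through which A can turn a vector. The corresponding sine, (lambda_max - lambda_min)/(lambda_max + lambda_min), is exactly the classical contraction factor in the steepest-descent error bound, which is the trigonometric reading of convergence the abstract refers to. A small sketch:

```python
import numpy as np

def first_antieigenvalue(A: np.ndarray) -> float:
    """cos(phi(A)) = 2*sqrt(l_min*l_max)/(l_min + l_max) for SPD A."""
    eigvals = np.linalg.eigvalsh(A)  # eigenvalues in ascending order
    l_min, l_max = eigvals[0], eigvals[-1]
    return 2.0 * np.sqrt(l_min * l_max) / (l_min + l_max)

A = np.diag([1.0, 4.0])              # illustrative SPD matrix
cos_phi = first_antieigenvalue(A)    # 2*2/5 = 0.8
sin_phi = (4.0 - 1.0) / (4.0 + 1.0)  # 0.6, the steepest-descent rate bound
# cos^2 + sin^2 = 1, as the trigonometric interpretation requires.
```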

  20. [Grid computing

    CERN Multimedia

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).