WorldWideScience

Sample records for program digital computer

  1. A computer program for planimetric analysis of digitized images

    DEFF Research Database (Denmark)

    Lynnerup, N; Lynnerup, O; Homøe, P

    1992-01-01

Planimetrical measurements are made to calculate the area of an entity. By digitizing the entity the planimetrical measurements may be done by computer. This computer program was developed in conjunction with a research project involving measurement of the pneumatized cell system of the temporal bones as seen on X-rays. By placing the X-rays on a digitizer tablet and tracing the outline of the cell system, the area was calculated by the program. The calculated data and traced images could be stored and printed. The program is written in BASIC; necessary hardware is an IBM-compatible personal computer, a digitizer tablet and a printer.
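
The abstract describes tracing an outline on a digitizer tablet and having the program compute the enclosed area. A minimal sketch of that planimetric calculation (not the BASIC original, which is unavailable) applies the shoelace formula to the digitized outline points:

```python
def polygon_area(points):
    """Area enclosed by a traced outline, via the shoelace formula.

    `points` is a list of (x, y) digitizer coordinates in tracing order;
    the outline is treated as closed (last point connects to the first).
    """
    area2 = 0.0
    n = len(points)
    for i in range(n):
        x0, y0 = points[i]
        x1, y1 = points[(i + 1) % n]
        area2 += x0 * y1 - x1 * y0
    return abs(area2) / 2.0

# A 2 x 3 rectangle traced counter-clockwise:
print(polygon_area([(0, 0), (2, 0), (2, 3), (0, 3)]))  # 6.0
```

A real tracing would also apply a calibration factor to convert digitizer units into physical units such as mm².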

  2. The engineering design integration (EDIN) system. [digital computer program complex]

    Science.gov (United States)

    Glatt, C. R.; Hirsch, G. N.; Alford, G. E.; Colquitt, W. N.; Reiners, S. J.

    1974-01-01

    A digital computer program complex for the evaluation of aerospace vehicle preliminary designs is described. The system consists of a Univac 1100 series computer and peripherals using the Exec 8 operating system, a set of demand access terminals of the alphanumeric and graphics types, and a library of independent computer programs. Modification of the partial run streams, data base maintenance and construction, and control of program sequencing are provided by a data manipulation program called the DLG processor. The executive control of library program execution is performed by the Univac Exec 8 operating system through a user established run stream. A combination of demand and batch operations is employed in the evaluation of preliminary designs. Applications accomplished with the EDIN system are described.

  3. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  4. Gender Digital Divide and Challenges in Undergraduate Computer Science Programs

    Science.gov (United States)

    Stoilescu, Dorian; McDougall, Douglas

    2011-01-01

    Previous research revealed a reduced number of female students registered in computer science studies. In addition, the female students feel isolated, have reduced confidence, and underperform. This article explores differences between female and male students in undergraduate computer science programs in a mid-size university in Ontario. Based on…

  5. The Design of an Undergraduate Degree Program in Computer & Digital Forensics

    Directory of Open Access Journals (Sweden)

    Gary C. Kessler

    2006-09-01

    Full Text Available Champlain College formally started an undergraduate degree program in Computer & Digital Forensics in 2003. The underlying goals were that the program be multidisciplinary, bringing together the law, computer technology, and the basics of digital investigations; be available as an online and on-campus offering; and have a process-oriented focus. The success of this program has largely been due to working closely with practitioners, maintaining activity in events related to both industry and academia, and remaining flexible enough to respond to ever-changing needs. This paper provides an overview of how this program was conceived, developed, and implemented; its evolution over time; and current and planned initiatives.

  6. REEFER: a digital computer program for the simulation of high energy electron tubes. [Reefer]

    Energy Technology Data Exchange (ETDEWEB)

    Boers, J.E.

    1976-11-01

    A digital computer program for the simulation of very high-energy electron and ion beams is described. The program includes space-charge effects through the solution of Poisson's equation and magnetic effects (both induced and applied) through the relativistic trajectory equations. Relaxation techniques are employed while alternately computing electric fields and trajectories. Execution time is generally less than 15 minutes on a CDC 6600 digital computer. Either space-charge-limited or field-emission sources may be simulated. The input data is described in detail and an example data set is included.

  7. Digital computers in action

    CERN Document Server

    Booth, A D

    1965-01-01

    Digital Computers in Action is an introduction to the basics of digital computers as well as their programming and various applications in fields such as mathematics, science, engineering, economics, medicine, and law. Other topics include engineering automation, process control, special purpose games-playing devices, machine translation and mechanized linguistics, and information retrieval. This book consists of 14 chapters and begins by discussing the history of computers, from the idea of performing complex arithmetical calculations to the emergence of a modern view of the structure of a ge

  8. A digital computer program for the dynamic interaction simulation of controls and structure (DISCOS), volume 1

    Science.gov (United States)

    Bodley, C. S.; Devers, A. D.; Park, A. C.; Frisch, H. P.

    1978-01-01

    A theoretical development and associated digital computer program system for the dynamic simulation and stability analysis of passive and actively controlled spacecraft are presented. The dynamic system (spacecraft) is modeled as an assembly of rigid and/or flexible bodies not necessarily in a topological tree configuration. The computer program system is used to investigate total system dynamic characteristics, including interaction effects between rigid and/or flexible bodies, control systems, and a wide range of environmental loadings. In addition, the program system is used for designing attitude control systems and for evaluating total dynamic system performance, including time domain response and frequency domain stability analyses.

  9. SNOW: a digital computer program for the simulation of ion beam devices

    Energy Technology Data Exchange (ETDEWEB)

    Boers, J.E.

    1980-08-01

    A digital computer program, SNOW, has been developed for the simulation of dense ion beams. The program simulates the plasma expansion cup (but not the plasma source itself), the acceleration region, and a drift space with neutralization if desired. The ion beam is simulated by computing representative trajectories through the device. The potentials are simulated on a large rectangular matrix array which is solved by iterative techniques. Poisson's equation is solved at each point within the configuration using space-charge densities computed from the ion trajectories combined with background electron and/or ion distributions. The simulation methods are described in some detail along with examples of both axially-symmetric and rectangular beams. A detailed description of the input data is presented.
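
SNOW's own solver is not listed in the abstract; a generic outline of the iterative technique it names, relaxation of Poisson's equation on a rectangular grid, might look as follows (the grid size, units, and grounded boundaries are illustrative assumptions, and the trajectory-derived charge densities are replaced by a fixed charge):

```python
import numpy as np

def relax_poisson(rho, h, n_iter=2000):
    """Jacobi relaxation for Poisson's equation d2phi/dx2 + d2phi/dy2 = -rho
    on a rectangular grid with spacing h and grounded (phi = 0) boundaries.

    A generic sketch of the iterative technique named in the abstract, not
    SNOW's actual solver; units and boundary handling are simplified.
    """
    phi = np.zeros_like(rho)
    for _ in range(n_iter):
        phi[1:-1, 1:-1] = 0.25 * (phi[:-2, 1:-1] + phi[2:, 1:-1]
                                  + phi[1:-1, :-2] + phi[1:-1, 2:]
                                  + h * h * rho[1:-1, 1:-1])
    return phi

# Point charge at the centre of a grounded box:
rho = np.zeros((17, 17))
rho[8, 8] = 1.0
phi = relax_poisson(rho, h=1.0)
print(phi[8, 8] > phi[8, 4] > 0.0)  # True: potential peaks at the charge
```

In a beam code this solve would alternate with trajectory integration, the new trajectories supplying updated space-charge densities for the next relaxation pass, as the abstract describes.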

  10. An introduction to digital computing

    CERN Document Server

    George, F H

    2014-01-01

    An Introduction to Digital Computing provides information pertinent to the fundamental aspects of digital computing. This book represents a major step towards the universal availability of programmed material. Organized into four chapters, this book begins with an overview of the fundamental workings of the computer, including the way it handles simple arithmetic problems. This text then provides a brief survey of the basic features of a typical computer that is divided into three sections, namely, the input and output system, the memory system for data storage, and a processing system. Other c

  11. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  12. Microprocessing Computer Technician, Digital and Microprocessor Technician Program. Post-Graduate 5th Year.

    Science.gov (United States)

    Carangelo, Pasquale R.; Janeczek, Anthony J.

    Materials are provided for a two-semester digital and microprocessor technician postgraduate program. Prerequisites stated for the program include a background in DC and AC theory, solid state devices, basic circuit fundamentals, and basic math. A chronology of major topics and a listing of course objectives appear first. Theory outlines for each…

  13. Comparison of direct digital mammography, computed radiography, and film-screen in the French national breast cancer screening program.

    Science.gov (United States)

    Séradour, Brigitte; Heid, Patrice; Estève, Jacques

    2014-01-01

    The purpose of this article was to compare the performance of digital mammography using hardcopy image reading against film-screen mammography in a French national routine population-based screening program with a decentralized organization. The French context offered the opportunity to examine separately computed radiography and direct digital mammography performances in a large cohort. The study includes 23,423 direct digital mammography, 73,320 computed radiography, and 65,514 film-screen mammography examinations performed by 123 facilities in Bouches du Rhône, France, for women 50-74 years old between 2008 and 2010. We compared abnormal mammography findings rate, cancer detection rate, and tumor characteristics among the technologies. Abnormal finding rates were higher for direct digital mammography (7.78% vs 6.11% for film-screen mammography and 5.34% for computed radiography), particularly in younger women and in denser breasts. Cancer detection rates were also higher for direct digital mammography (0.71% vs 0.66% for film-screen mammography and 0.55% for computed radiography). The contrast between detection rates was stronger for ductal carcinoma in situ. Breast density was the main factor explaining the differences in detection rates. For direct digital mammography only, the detection rate was clearly higher in dense breasts whatever the age (odds ratio, 2.20). Tumor characteristics did not differ among the technologies except for grade: the proportion of high-grade tumors was larger for direct digital mammography for both invasive and in situ tumors. Direct digital mammography has a higher detection rate than film-screen mammography in dense breasts and for tumors of high grade. This latter association warrants further study to measure the impact of technology on the efficacy of screening. The data indicate that computed radiography detects fewer tumors than film-screen mammography in most instances.

  14. DIGITAL ERA: UTILIZE OF CLOUD COMPUTING TECHNOLOGY IN DIGITAL LIBRARY

    OpenAIRE

    T. RAGHUNADHA REDDY

    2012-01-01

    With the purpose of applying cloud computing to the digital library, the paper initially describes cloud computing and analyzes the current status of cloud computing in digital libraries. It then proposes an architecture for cloud computing in the digital library and summarizes its applications. Finally, the author brings out future improvements in the digital library using cloud computing technology.

  15. Computer enhanced digital angiography

    Energy Technology Data Exchange (ETDEWEB)

    Vas, R.; Diamond, G.A.; Levisman, J.A.; Nakano, F.H.; Neidorf, B.S.; Rose, R.M.; Whiting, J.S.; Forrester, J.S.

    1982-05-01

    A new computer image enhancement technique was employed on cardiac images of 10 dogs and 7 patients to demonstrate the feasibility of an on-line automatic delineation of the left ventricular endocardial silhouette with a peripheral venous injection of contrast material while simultaneously reducing the x-ray dosage. This technique employs a very fast analog-to-digital conversion system capable of digitizing on-line video frames. By storing and continuously updating the first 30 video frames and then subtracting each incoming frame from this memory, most of the background is eliminated leaving only the contrast filled ventricle. Using calibrated densitometric measurements, we found that iodine concentrations in the human left ventricle following venous injection of 40 ml Renografin-76 (25 ml/s), peaked at 4.3 +/- 0.3 mg/ml (mean +/- SD) compared to 14.8 +/- 0.8 mg/ml following direct injection of 40 ml at 13 ml/s (p less than 0.001). The computer enhanced venous-injected images had an optical contrast 14 times greater than that of the unenhanced direct left ventriculogram. This increase in optical contrast provided unambiguous subjective definition of the endocardial borders. This technique is applicable to both central and peripheral contrast injection whereby high quality images can be obtained at approximately 98% reduction in radiation (5 mA, 65-85 kV), allowing performance of serial studies.
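
The mask-subtraction step described above (store and average the first 30 video frames, then subtract that memory from each incoming frame so only the contrast-filled ventricle remains) can be sketched as follows; the frame format and the fixed-average variant are assumptions, since the original continuously updated analog-to-digital hardware is not specified:

```python
import numpy as np

def subtract_background(frames, n_mask=30):
    """Digital subtraction sketch: average the first `n_mask` frames into a
    background mask, then subtract the mask from every later frame.

    `frames` is a sequence of 2-D arrays (digitized video frames).
    """
    frames = np.asarray(frames, dtype=float)
    mask = frames[:n_mask].mean(axis=0)
    return frames[n_mask:] - mask

# Static background of 100 plus a 'contrast' blob of 50 in the last frame:
frames = [np.full((4, 4), 100.0) for _ in range(31)]
frames[30][1:3, 1:3] += 50.0
diff = subtract_background(frames)
print(diff[0].max(), diff[0].min())  # 50.0 0.0
```

Averaging the mask over many frames also suppresses video noise, which is part of why the enhanced venous images gained so much optical contrast over the raw frames.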

  16. Close Reading and Slow ProgrammingComputer Code as Digital Scholarly Edition

    NARCIS (Netherlands)

    van Zundert, Joris J.

    2016-01-01

    Currently most digital scholarly editions are representational digital documentary editions, largely along the lines described by Elena Pierazzo (2015). Alternative theoretical perspectives take less document centric and more process analytical oriented approaches. Textual scholars have, for

  17. Digital design and computer architecture

    CERN Document Server

    Harris, David

    2010-01-01

    Digital Design and Computer Architecture is designed for courses that combine digital logic design with computer organization/architecture or that teach these subjects as a two-course sequence. Digital Design and Computer Architecture begins with a modern approach by rigorously covering the fundamentals of digital logic design and then introducing Hardware Description Languages (HDLs). Featuring examples of the two most widely-used HDLs, VHDL and Verilog, the first half of the text prepares the reader for what follows in the second: the design of a MIPS Processor. By the end of D

  18. User's instructions for ORCENT II: a digital computer program for the analysis of steam turbine cycles supplied by light-water-cooled reactors

    Energy Technology Data Exchange (ETDEWEB)

    Fuller, L.C.

    1979-02-01

    The ORCENT-II digital computer program will perform calculations at valves-wide-open design conditions, maximum guaranteed rating conditions, and an approximation of part-load conditions for steam turbine cycles supplied with throttle steam characteristic of contemporary light-water reactors. Turbine performance calculations are based on a method published by the General Electric Company. Output includes all information normally shown on a turbine-cycle heat balance diagram. The program is written in FORTRAN IV for the IBM System 360 digital computers at the Oak Ridge National Laboratory.

  19. Basic EMC technology advancement for C(3) systems: SHIELD. Volume 4B. A digital computer program for computing crosstalk between shielded cables

    Science.gov (United States)

    Paul, C. R.

    1982-11-01

    This report contains the description and verification of a digital computer program, SHIELD, to be used in the prediction of crosstalk in transmission lines consisting of unshielded wires and/or shielded cables. The line may be above a ground plane (Type 1) or within an overall, circular, cylindrical shield which may be solid or braided and a wire (the shielded wire) located concentrically on the axis of the shield. All wires may be stranded and all conductors are treated as imperfect conductors; that is, their per-unit-length impedances are nonzero. Through-braid coupling for braided shields as well as diffusion for both types are included in the model. The shielded cables may have exposed sections at either end (pigtail sections) in which the shielded wire is not covered by the shield. Over these pigtail sections, a pigtail wire, parallel to the shielded wire, connects the shield to the reference conductor at that end via either a short circuit or an open circuit. These pigtail sections are included in the representation to simulate the common practice of terminating a shielded cable in a connector via these pigtail wires. The pigtail sections may be of different lengths. The program is written in FORTRAN IV and should be implementable on a wide range of digital computers.

  20. Digital Competition Game to Improve Programming Skills

    National Research Council Canada - National Science Library

    Julián Moreno

    2012-01-01

      The aim of this paper is to describe a digital game with an educational purpose in the subject of computer programming, which enables students to reinforce and improve their abilities on the concepts...

  1. Computer Programs.

    Science.gov (United States)

    Anderson, Tiffoni

    This module provides information on development and use of a Material Safety Data Sheet (MSDS) software program that seeks to link literacy skills education, safety training, and human-centered design. Section 1 discusses the development of the software program that helps workers understand the MSDSs that accompany the chemicals with which they…

  2. An Interactive Graphics Program for Investigating Digital Signal Processing.

    Science.gov (United States)

    Miller, Billy K.; And Others

    1983-01-01

    Describes development of an interactive computer graphics program for use in teaching digital signal processing. The program allows students to interactively configure digital systems on a monitor display and observe their system's performance by means of digital plots on the system's outputs. A sample program run is included. (JN)

  3. Digital computer structure and design

    CERN Document Server

    Townsend, R

    2014-01-01

    Digital Computer Structure and Design, Second Edition discusses switching theory, counters, sequential circuits, number representation, and arithmetic functions. The book also describes computer memories, the processor, the data flow system of the processor, the processor control system, and the input-output system. Switching theory, which is purely a mathematical concept, centers on the properties of interconnected networks of "gates." The theory deals with binary functions of 1 and 0 which can change instantaneously from one to the other without intermediate values. The binary number system is

  4. An Interactive Program on Digitizing Historical Seismograms

    Science.gov (United States)

    Xu, Y.; Xu, T.

    2013-12-01

    Retrieving information from historical seismograms is of great importance since they are considered the unique sources that provide quantitative information about historical earthquakes. Modern techniques of seismology require digital forms of seismograms, which are essentially sequences of time-amplitude pairs. Historical seismograms, however, after being scanned into computers, are two-dimensional arrays in which each element contains the grayscale or RGB value of the corresponding pixel. The problem of digitizing historical seismograms, that is, converting them to digital seismograms, can thus be formulated as an inverse problem: generating sequences of time-amplitude pairs from two-dimensional arrays. This problem has infinitely many solutions. The algorithm presented for automatic digitization uses several features of seismograms, including continuity and smoothness of the seismic traces, as prior information, and assumes that the amplitude is a single-valued function of time. An interactive program based on the algorithm is also presented. The program is developed using the Matlab GUI and offers both automatic and manual digitization; users can easily switch between the two modes and try different combinations to obtain optimal results. Several examples illustrate digitizing seismograms with the program, including a photographic record and a wide-angle reflection/refraction seismogram; a figure (redrawn using Golden Software Surfer for a high-resolution image) shows the result of automatic digitization and the result after manual correction.
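
The core of the inverse problem, under the stated assumption that amplitude is a single-valued function of time, can be illustrated with a toy extraction that takes the darkest pixel in each column of the scan as the trace amplitude; the published algorithm's continuity and smoothness priors and its interactive correction are omitted here:

```python
import numpy as np

def digitize_trace(image):
    """Minimal single-valued-trace extraction from a grayscale scan.

    `image` is a 2-D array (0 = black ink, 255 = white paper). Each pixel
    column is one time sample; the darkest pixel in it is the amplitude.
    """
    image = np.asarray(image)
    rows = image.argmin(axis=0)          # darkest row per column
    times = np.arange(image.shape[1])    # one sample per column
    return times, rows

# Tiny synthetic scan: white paper with a dark trace, one pixel per column.
img = np.full((5, 4), 255)
for t, r in enumerate([2, 1, 3, 2]):
    img[r, t] = 0
times, amps = digitize_trace(img)
print(amps.tolist())  # [2, 1, 3, 2]
```

On real scans this naive column rule breaks wherever traces cross, overlap, or fade, which is exactly where the priors and the manual mode of the published program matter.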

  5. Flight Trainer Digital Computer Study

    Science.gov (United States)

    1951-03-21

    Speed and accuracy appear to depend on increased size of units. On the other hand, digital computers are still in their infancy and all... programming for one computation cycle. The program diagrams are divided into numbered frames, the first number of which specifies the...

  6. ICASE Computer Science Program

    Science.gov (United States)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  7. Augmenting digital displays with computation

    Science.gov (United States)

    Liu, Jing

    As we inevitably step deeper and deeper into a world connected via the Internet, more and more information will be exchanged digitally. Displays are the interface between digital information and each individual. Naturally, one fundamental goal of displays is to reproduce information as realistically as possible since humans still care a lot about what happens in the real world. Human eyes are the receiving end of such information exchange; therefore it is impossible to study displays without studying the human visual system. In fact, the design of displays is rather closely coupled with what human eyes are capable of perceiving. For example, we are less interested in building displays that emit light in the invisible spectrum. This dissertation explores how we can augment displays with computation, which takes both display hardware and the human visual system into consideration. Four novel projects on display technologies are included in this dissertation: First, we propose a software-based approach to driving multiview autostereoscopic displays. Our display algorithm can dynamically assign views to hardware display zones based on multiple observers' current head positions, substantially reducing crosstalk and stereo inversion. Second, we present a dense projector array that creates a seamless 3D viewing experience for multiple viewers. We smoothly interpolate the set of viewer heights and distances on a per-vertex basis across the array's field of view, reducing image distortion, crosstalk, and artifacts from tracking errors. Third, we propose a method for high dynamic range display calibration that takes into account the variation of the chrominance error over luminance. We propose a data structure for enabling efficient representation and querying of the calibration function, which also allows user-guided balancing between memory consumption and the amount of computation. Fourth, we present user studies that demonstrate that the ~60 Hz critical flicker fusion

  8. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We identify a number of common features in programming that seem conspicuously absent from the literature on biomolecular computing; to partially redress this absence, we introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal: all computable functions can be computed (in natural ways...

  9. Programming in biomolecular computation

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2010-01-01

    Our goal is to provide a top-down approach to biomolecular computation. In spite of widespread discussion about connections between biology and computation, one question seems notable by its absence: Where are the programs? We introduce a model of computation that is evidently programmable, by programs reminiscent of low-level computer machine code; and at the same time biologically plausible: its functioning is defined by a single and relatively small set of chemical-like reaction rules. Further properties: the model is stored-program: programs are the same as data, so programs are not only executable, but are also compilable and interpretable. It is universal in a strong sense: a universal algorithm exists, that is able to execute any program, and is not asymptotically inefficient. A prototype model has been implemented (for now in silico on a conventional computer). This work opens new perspectives on just how computation may be specified at the biological level.

  10. Flexible Animation Computer Program

    Science.gov (United States)

    Stallcup, Scott S.

    1990-01-01

    FLEXAN (Flexible Animation), computer program animating structural dynamics on Evans and Sutherland PS300-series graphics workstation with VAX/VMS host computer. Typical application is animation of spacecraft undergoing structural stresses caused by thermal and vibrational effects. Displays distortions in shape of spacecraft. Program displays single natural mode of vibration, mode history, or any general deformation of flexible structure. Written in FORTRAN 77.

  11. Electronic digital computers their use in science and engineering

    CERN Document Server

    Alt, Franz L

    1958-01-01

    Electronic Digital Computers: Their Use in Science and Engineering describes the principles underlying computer design and operation. This book describes the various applications of computers, the stages involved in using them, and their limitations. The machine is composed of the hardware which is run by a program. This text describes the use of magnetic drum for storage of data and some computing. The functions and components of the computer include automatic control, memory, input of instructions by using punched cards, and output from resulting information. Computers operate by using numbe

  12. Computing fundamentals digital literacy edition

    CERN Document Server

    Wempen, Faithe

    2014-01-01

    Computing Fundamentals has been tailor made to help you get up to speed on your computing basics and help you get proficient in entry-level computing skills. Covering all the key topics, it starts at the beginning and takes you through basic set-up so that you'll be competent on a computer in no time. You'll cover: Computer Basics & Hardware; Software; Introduction to Windows 7; Microsoft Office; Word Processing with Microsoft Word 2010; Creating Spreadsheets with Microsoft Excel; Creating Presentation Graphics with PowerPoint; Connectivity and Communication; Web Basics; and Network and Internet Privacy and Security.

  13. Computer Security Assistance Program

    Science.gov (United States)

    1997-09-01

    COMPUTER SECURITY ASSISTANCE PROGRAM. OPR: HQ AFCA/SYS (CMSgt Hogan). Certified by: HQ USAF/SCXX (Lt Col Francis X. McGovern). Pages: 5. Distribution: F. This instruction implements Air Force Policy Directive (AFPD) 33-2, Information Protection, and establishes the Air Force Computer Security Assistance... Air Force single point of contact for reporting and handling computer security incidents and vulnerabilities, including AFCERT advisories and Defense...

  14. Digitized adiabatic quantum computing with a superconducting circuit.

    Science.gov (United States)

    Barends, R; Shabani, A; Lamata, L; Kelly, J; Mezzacapo, A; Las Heras, U; Babbush, R; Fowler, A G; Campbell, B; Chen, Yu; Chen, Z; Chiaro, B; Dunsworth, A; Jeffrey, E; Lucero, E; Megrant, A; Mutus, J Y; Neeley, M; Neill, C; O'Malley, P J J; Quintana, C; Roushan, P; Sank, D; Vainsencher, A; Wenner, J; White, T C; Solano, E; Neven, H; Martinis, John M

    2016-06-09

    Quantum mechanics can help to solve complex problems in physics and chemistry, provided they can be programmed in a physical device. In adiabatic quantum computing, a system is slowly evolved from the ground state of a simple initial Hamiltonian to a final Hamiltonian that encodes a computational problem. The appeal of this approach lies in the combination of simplicity and generality; in principle, any problem can be encoded. In practice, applications are restricted by limited connectivity, available interactions and noise. A complementary approach is digital quantum computing, which enables the construction of arbitrary interactions and is compatible with error correction, but uses quantum circuit algorithms that are problem-specific. Here we combine the advantages of both approaches by implementing digitized adiabatic quantum computing in a superconducting system. We tomographically probe the system during the digitized evolution and explore the scaling of errors with system size. We then let the full system find the solution to random instances of the one-dimensional Ising problem as well as problem Hamiltonians that involve more complex interactions. This digital quantum simulation of the adiabatic algorithm consists of up to nine qubits and up to 1,000 quantum logic gates. The demonstration of digitized adiabatic quantum computing in the solid state opens a path to synthesizing long-range correlations and solving complex computational problems. When combined with fault-tolerance, our approach becomes a general-purpose algorithm that is scalable.
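
The digitization idea, discretizing a continuous adiabatic sweep into a sequence of short evolutions, can be illustrated classically for a toy two-qubit Ising instance. This sketch simulates each time slice exactly with numpy rather than compiling it into the problem-specific gate sequences used on the superconducting chip; the instance, step count, and anneal time are illustrative choices:

```python
import numpy as np

I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

# Toy instance: driver H0 = -(X1 + X2), problem Hamiltonian Hp = -Z1 Z2.
H0 = -(np.kron(X, I2) + np.kron(I2, X))
Hp = -np.kron(Z, Z)

def step(H, dt, psi):
    """Apply exp(-i H dt) exactly (H is Hermitian) via eigendecomposition."""
    w, V = np.linalg.eigh(H)
    return V @ (np.exp(-1j * w * dt) * (V.conj().T @ psi))

# Digitized sweep: slice the anneal H(s) = (1 - s) H0 + s Hp into n steps.
psi = np.full(4, 0.5, dtype=complex)   # |++>, the ground state of H0
n_steps, T = 100, 20.0                 # number of slices, total anneal time
for k in range(n_steps):
    s = (k + 0.5) / n_steps
    psi = step((1 - s) * H0 + s * Hp, T / n_steps, psi)

# Weight on the Ising ground space spanned by |00> and |11>:
p_ground = abs(psi[0])**2 + abs(psi[3])**2
print(p_ground > 0.9)  # a slow enough sweep ends near the ground space
```

Shrinking T or the number of slices degrades p_ground, which is the classical analogue of the error scaling the experiment probes on hardware.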

  15. Digital computer operation of a nuclear reactor

    Science.gov (United States)

    Colley, R.W.

    1982-06-29

    A method is described for the safe operation of a complex system, such as a nuclear reactor, using a digital computer. The computer is supplied with a data base containing a list of the safe states of the reactor and a list of operating instructions for achieving a safe state. When the actual state of the reactor does not correspond to a listed safe state, the computer selects operating instructions to return the reactor to a safe state.
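
The control scheme as described reduces to a table lookup: compare the measured state against the stored safe states and, on a mismatch, fetch the instructions that drive the system back to safety. A minimal sketch follows; the state names and instructions are invented for illustration and are not taken from the patent:

```python
# Hypothetical safe-state table and recovery instructions.
SAFE_STATES = {"shutdown", "low_power", "full_power"}
RECOVERY = {
    "overpower": ["insert control rods", "verify low_power reached"],
    "coolant_low": ["start auxiliary pumps", "reduce power"],
}

def supervise(state):
    """Return the operating instructions for the measured plant state."""
    if state in SAFE_STATES:
        return []                          # already safe: nothing to do
    return RECOVERY.get(state, ["scram"])  # unknown state: default action

print(supervise("full_power"))  # []
print(supervise("overpower"))   # ['insert control rods', 'verify low_power reached']
```

The default branch matters: a supervisory program must map every unlisted state to some conservative action rather than doing nothing.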

  16. Digital image processing mathematical and computational methods

    CERN Document Server

    Blackledge, J M

    2005-01-01

    This authoritative text (the second part of a complete MSc course) provides mathematical methods required to describe images, image formation and different imaging systems, coupled with the principle techniques used for processing digital images. It is based on a course for postgraduates reading physics, electronic engineering, telecommunications engineering, information technology and computer science. This book relates the methods of processing and interpreting digital images to the 'physics' of imaging systems. Case studies reinforce the methods discussed, with examples of current research

  17. Will the digital computer transform classical mathematics?

    Science.gov (United States)

    Rotman, Brian

    2003-08-15

    Mathematics and machines have influenced each other for millennia. The advent of the digital computer introduced a powerfully new element that promises to transform the relation between them. This paper outlines the thesis that the effect of the digital computer on mathematics, already widespread, is likely to be radical and far-reaching. To articulate this claim, an abstract model of doing mathematics is introduced based on a triad of actors of which one, the 'agent', corresponds to the function performed by the computer. The model is used to frame two sorts of transformation. The first is pragmatic and involves the alterations and progressive colonization of the content and methods of enquiry of various mathematical fields brought about by digital methods. The second is conceptual and concerns a fundamental antagonism between the infinity enshrined in classical mathematics and physics (continuity, real numbers, asymptotic definitions) and the inherently real and material limit of processes associated with digital computation. An example which lies in the intersection of classical mathematics and computer science, the P=NP problem, is analysed in the light of this latter issue.

  18. Digital Immersive Virtual Environments and Instructional Computing

    Science.gov (United States)

    Blascovich, Jim; Beall, Andrew C.

    2010-01-01

    This article reviews theory and research relevant to the development of digital immersive virtual environment-based instructional computing systems. The review is organized within the context of a multidimensional model of social influence and interaction within virtual environments that models the interaction of four theoretical factors: theory…

  19. An interactive program on digitizing historical seismograms

    Science.gov (United States)

    Xu, Yihe; Xu, Tao

    2014-02-01

Retrieving information from analog seismograms is of great importance, since they are the only sources that provide quantitative information about historical earthquakes. We present an algorithm for automatic digitization of seismograms, formulated as an inversion problem, implemented in an interactive program with a Matlab® GUI. The program integrates automatic digitization with manual digitization; users can easily switch between the two modalities and carry out different combinations for optimal results. Several examples of applying the interactive program are given to illustrate the merits of the method.
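The core extraction step behind automatic seismogram digitization can be illustrated with a toy: treat the scanned record as a grayscale array and pick the darkest pixel in each time column. This is only a sketch of the basic idea; the program described above formulates digitization as an inversion problem and adds interactive manual correction.

```python
import numpy as np

# Toy trace digitization: in each column of a grayscale scan
# (0 = black ink, 255 = paper), take the row with the darkest pixel.

def digitize_trace(image):
    """Return, for each column, the row index of the darkest pixel."""
    return np.argmin(image, axis=0)

# Synthetic "seismogram": a white page with a sine-shaped dark trace
h, w = 60, 100
page = np.full((h, w), 255.0)
rows = (30 + 20 * np.sin(np.linspace(0, 4 * np.pi, w))).astype(int)
page[rows, np.arange(w)] = 0.0

trace = digitize_trace(page)
print(np.array_equal(trace, rows))
```

Real records break this naive picker wherever traces overlap or the ink fades, which is exactly where an inversion formulation and manual intervention earn their keep.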

  20. Analysis of computational vulnerabilities in digital repositories

    Directory of Open Access Journals (Sweden)

    Valdete Fernandes Belarmino

    2015-04-01

Full Text Available Objective. Demonstrates the results of research that aimed to analyze the computational vulnerabilities of digital repositories in public universities. Argues the relevance of information in contemporary societies as an invaluable resource, emphasizing scientific information as an essential element of scientific progress. Characterizes the emergence of digital repositories and highlights their use in academic environments to preserve, promote, disseminate and encourage scientific production. Describes the main software for the construction of digital repositories. Method. The investigation identified and analyzed the vulnerabilities to which digital repositories are exposed by running penetration tests, discriminating the levels of risk and the types of vulnerabilities. Results. From a sample of 30 repositories, we could examine 20 and identified that 5% of the repositories have critical vulnerabilities, 85% high, 25% medium and 100% low. Conclusions. This demonstrates the necessity of actions for these environments that promote informational security, minimizing the incidence of external and/or internal attacks.

  1. Computer Assisted Parallel Program Generation

    CERN Document Server

    Kawata, Shigeo

    2015-01-01

Parallel computation is widely employed in scientific research, engineering activities and product development. Parallel program writing itself is not always a simple task, depending on the problem solved. Large-scale scientific computing, huge data analyses and precise visualizations, for example, would require parallel computations, and parallel computing needs parallelization techniques. In this chapter a parallel program generation support is discussed, and a computer-assisted parallel program generation system, P-NCAS, is introduced. Computer-assisted problem solving is one of the key methods to promote innovations in science and engineering, and contributes to enriching our society and our life toward a programming-free environment in computing science. Problem solving environment (PSE) research activities started to enhance the programming power in the 1970's. The P-NCAS is one of the PSEs; the PSE concept provides an integrated human-friendly computational software and hardware system to solve a target ...

  2. Portable Digital Radiography and Computed Tomography Manual

    Energy Technology Data Exchange (ETDEWEB)

    2007-11-01

This user manual describes the function and use of the portable digital radiography and computed tomography (DRCT) scanner. The manual gives a general overview of x-ray imaging systems along with a description of the DRCT system. An inventory of all the system components, organized by shipping container, is also included. In addition, detailed, step-by-step procedures are provided for all of the exercises necessary for a novice user to successfully collect digital radiographs and tomographic images of an object, including instructions on system assembly, detector calibration, and system alignment. There is also a short section covering the limited system care and maintenance needs. Descriptions of the included software packages, the DRCT Digital Imager used for system operation, and the DRCT Image Processing Interface used for image viewing and tomographic data reconstruction are given in the appendixes. The appendixes also include a cheat sheet for more experienced users, a listing of known system problems and how to mitigate them, and an inventory check-off sheet suitable for copying and including with the machine for shipment purposes.

  3. Linguistics in the digital humanities: (computational) corpus linguistics

    OpenAIRE

    Kim Ebensgaard Jensen

    2014-01-01

Corpus linguistics has been closely intertwined with digital technology since the introduction of university computer mainframes in the 1960s. Making use of both digitized data in the form of the language corpus and computational methods of analysis involving concordancers and statistics software, corpus linguistics arguably has a place in the digital humanities. Still, it remains obscure and figures only sporadically in the literature on the digital humanities. This article provides an overview ...

  4. Shade matching assisted by digital photography and computer software.

    Science.gov (United States)

    Schropp, Lars

    2009-04-01

    To evaluate the efficacy of digital photographs and graphic computer software for color matching compared to conventional visual matching. The shade of a tab from a shade guide (Vita 3D-Master Guide) placed in a phantom head was matched to a second guide of the same type by nine observers. This was done for twelve selected shade tabs (tests). The shade-matching procedure was performed visually in a simulated clinic environment and with digital photographs, and the time spent for both procedures was recorded. An alternative arrangement of the shade tabs was used in the digital photographs. In addition, a graphic software program was used for color analysis. Hue, chroma, and lightness values of the test tab and all tabs of the second guide were derived from the digital photographs. According to the CIE L*C*h* color system, the color differences between the test tab and tabs of the second guide were calculated. The shade guide tab that deviated least from the test tab was determined to be the match. Shade matching performance by means of graphic software was compared with the two visual methods and tested by Chi-square tests (alpha= 0.05). Eight of twelve test tabs (67%) were matched correctly by the computer software method. This was significantly better (p < 0.02) than the performance of the visual shade matching methods conducted in the simulated clinic (32% correct match) and with photographs (28% correct match). No correlation between time consumption for the visual shade matching methods and frequency of correct match was observed. Shade matching assisted by digital photographs and computer software was significantly more reliable than by conventional visual methods.
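The software matching step described above — compute CIE color differences between the test tab and every tab of the second guide, then pick the smallest — can be sketched as follows. The tab names echo the Vita 3D-Master style, but all L*C*h* values are invented for illustration.

```python
import math

# Sketch of color-difference-based shade matching: convert L*C*h* to
# L*a*b*, compute the Euclidean (delta E) distance, and keep the guide
# tab that deviates least from the test tab. All values are invented.

def delta_e(lch1, lch2):
    """CIE color difference between two (L*, C*, h-degrees) triples."""
    def to_lab(L, C, h):
        return L, C * math.cos(math.radians(h)), C * math.sin(math.radians(h))
    L1, a1, b1 = to_lab(*lch1)
    L2, a2, b2 = to_lab(*lch2)
    return math.sqrt((L1 - L2)**2 + (a1 - a2)**2 + (b1 - b2)**2)

# Hypothetical guide tabs: name -> (L*, C*, h)
guide = {"2M2": (74.0, 18.5, 85.0),
         "3M2": (70.5, 20.1, 84.0),
         "3L1.5": (71.0, 17.0, 88.0)}
test_tab = (70.8, 19.8, 84.2)

best = min(guide, key=lambda tab: delta_e(test_tab, guide[tab]))
print(best)
```

Choosing the minimum delta E is exactly the "deviated least" criterion the study used to declare a match.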

  5. Building Capacity Through Hands-on Computational Internships to Assure Reproducible Results and Implementation of Digital Documentation in the ICERT REU Program

    Science.gov (United States)

    Gomez, R.; Gentle, J.

    2015-12-01

Modern data pipelines and computational processes require that meticulous methodologies be applied in order to ensure that the source data, algorithms, and results are properly curated, managed and retained while remaining discoverable, accessible, and reproducible. Given the complexity of understanding the scientific problem domain being researched, combined with the overhead of learning to use advanced computing technologies, it becomes paramount that the next generation of scientists and researchers learn to embrace best practices. The Integrative Computational Education and Research Traineeship (ICERT) is a National Science Foundation (NSF) Research Experience for Undergraduates (REU) Site at the Texas Advanced Computing Center (TACC). During Summer 2015, two ICERT interns joined the 3DDY project. 3DDY converts geospatial datasets into file types that can take advantage of new formats, such as natural user interfaces, interactive visualization, and 3D printing. Mentored by TACC researchers for ten weeks, students with no previous background in computational science learned to use scripts to build the first prototype of the 3DDY application, and leveraged Wrangler, the newest high performance computing (HPC) resource at TACC. Test datasets for quadrangles in central Texas were used to assemble the 3DDY workflow and code. Test files were successfully converted into the stereolithography (STL) format, which is amenable for use with 3D printers. Test files and the scripts were documented and shared using the Figshare site, while metadata was documented for the 3DDY application using OntoSoft. These efforts validated a straightforward set of workflows to transform geospatial data and established the first prototype version of 3DDY. Adding the data and software management procedures helped students realize a broader set of tangible results (e.g., Figshare entries), better document their progress, and record the final state of their work for the research group and community.

  6. Narrowing the Digital Divide: "Head Start Teachers Develop Proficiency in Computer Technology"

    Science.gov (United States)

    Chen, Jie-Qi; Price, Valerie

    2006-01-01

    The Digital Divide originates with inequalities in children's access to computers. It is deepened by disparities in teacher readiness to use computers for educational purposes. This article describes a computer training program designed to help Head Start teachers develop attitudes, skills, and practices that maximize the educational benefits…

  7. Computer programs as accounting object

    Directory of Open Access Journals (Sweden)

    I.V. Perviy

    2015-03-01

Full Text Available Existing approaches to the regulation of accounting for software as one of the types of intangible assets have been considered. The features and current state of the legal protection of computer programs have been analyzed. The reasons for the need to use patent law as a means of legal protection of individual elements of computer programs have been discovered. The influence of the legal aspects of the use of computer programs on their reflection in accounting under national legislation has been analyzed. The possible options for the transfer of rights from copyright owners of computer programs, which should be considered during the creation of a software accounting system at the enterprise, have been analyzed. The characteristics of computer software as an intangible asset under the current law have been identified and analyzed. The general economic characteristics of computer programs as one of the types of intangible assets have been grounded. The main distinguishing features of software compared to other types of intellectual property have been allocated.

  8. A programming approach to computability

    CERN Document Server

    Kfoury, A J; Arbib, Michael A

    1982-01-01

Computability theory is at the heart of theoretical computer science. Yet, ironically, many of its basic results were discovered by mathematical logicians prior to the development of the first stored-program computer. As a result, many texts on computability theory strike today's computer science students as far removed from their concerns. To remedy this, we base our approach to computability on the language of while-programs, a lean subset of PASCAL, and postpone consideration of such classic models as Turing machines, string-rewriting systems, and μ-recursive functions till the final chapter. Moreover, we balance the presentation of unsolvability results such as the unsolvability of the Halting Problem with a presentation of the positive results of modern programming methodology, including the use of proof rules, and the denotational semantics of programs. Computer science seeks to provide a scientific basis for the study of information processing, the solution of problems by algorithms, and the design ...

  9. Digital architecture, wearable computers and providing affinity

    DEFF Research Database (Denmark)

    Guglielmi, Michel; Johannesen, Hanne Louise

    2005-01-01

This paper aims at the tendency to create space that fosters and supports communication, emotion and experience. Traditionally, architecture has been a static, physical solution, of course with vivid concepts, but with new technology one could propose supple solutions that recognize architecture as the setting for the events of experience. Contemporary architecture is a meta-space residing in almost any thinkable field, striving to blur boundaries between art, architecture, design and urbanity and to break down the distinction between the material and the user or inhabitant. The presentation for this paper will, through research, a workshop and participation in a Cumulus competition, focus on the exploration of boundaries between digital architecture, performative space and wearable computers. Our design method in general focuses on the interplay between the performing body and the environment ...

  10. Digital Da Vinci computers in the arts and sciences

    CERN Document Server

    Lee, Newton

    2014-01-01

    Explores polymathic education through unconventional and creative applications of computer science in the arts and sciences Examines the use of visual computation, 3d printing, social robotics and computer modeling for computational art creation and design Includes contributions from leading researchers and practitioners in computer science, architecture and digital media

  11. Computer Program Newsletter No. 7

    Energy Technology Data Exchange (ETDEWEB)

    Magnuson, W.G. Jr.

    1982-09-01

    This issue of the Computer Program Newsletter updates an earlier newsletter (Number 2, September 1979) and focuses on electrical network analysis computer programs. In particular, five network analysis programs (SCEPTRE, SPICE2, NET2, CALAHAN, and EMTP) will be described. The objective of this newsletter will be to provide a very brief description of the input syntax and semantics for each program, highlight their strong and weak points, illustrate how the programs are run at Lawrence Livermore National Laboratory using the Octopus computer network, and present examples of input for each of the programs to illustrate some of the features of each program. In a sense, this newsletter can be used as a quick reference guide to the programs.

  12. NASA's computer science research program

    Science.gov (United States)

    Larsen, R. L.

    1983-01-01

    Following a major assessment of NASA's computing technology needs, a new program of computer science research has been initiated by the Agency. The program includes work in concurrent processing, management of large scale scientific databases, software engineering, reliable computing, and artificial intelligence. The program is driven by applications requirements in computational fluid dynamics, image processing, sensor data management, real-time mission control and autonomous systems. It consists of university research, in-house NASA research, and NASA's Research Institute for Advanced Computer Science (RIACS) and Institute for Computer Applications in Science and Engineering (ICASE). The overall goal is to provide the technical foundation within NASA to exploit advancing computing technology in aerospace applications.

  13. Designing computer programs

    CERN Document Server

    Haigh, Jim

    1994-01-01

    This is a book for students at every level who are learning to program for the first time - and for the considerable number who learned how to program but were never taught to structure their programs. The author presents a simple set of guidelines that show the programmer how to design in a manageable structure from the outset. The method is suitable for most languages, and is based on the widely used 'JSP' method, to which the student may easily progress if it is needed at a later stage.Most language specific texts contain very little if any information on design, whilst books on des

  14. Computer Program NIKE

    DEFF Research Database (Denmark)

    Spanget-Larsen, Jens

    2014-01-01

    FORTRAN source code for program NIKE (PC version of QCPE 343). Sample input and output for two model chemical reactions are appended: I. Three consecutive monomolecular reactions, II. A simple chain mechanism...

  15. Computing Hypercrossed Complex Pairings in Digital Images

    Directory of Open Access Journals (Sweden)

    Simge Öztunç

    2013-01-01

Full Text Available We consider an additive group structure on digital images and introduce the commutator in digital images. Then we calculate the hypercrossed complex pairings, which generate a normal subgroup, in dimension 2 and in dimension 3 by using 8-adjacency and 26-adjacency.

  16. Cloud computing and digital media fundamentals, techniques, and applications

    CERN Document Server

    Li, Kuan-Ching; Shih, Timothy K

    2014-01-01

    Cloud Computing and Digital Media: Fundamentals, Techniques, and Applications presents the fundamentals of cloud and media infrastructure, novel technologies that integrate digital media with cloud computing, and real-world applications that exemplify the potential of cloud computing for next-generation digital media. It brings together technologies for media/data communication, elastic media/data storage, security, authentication, cross-network media/data fusion, interdevice media interaction/reaction, data centers, PaaS, SaaS, and more.The book covers resource optimization for multimedia clo

  17. Line-Editor Computer Program

    Science.gov (United States)

    Scott, Peter J.

    1989-01-01

    ZED editing program for DEC VAX computer simple, powerful line editor for text, program source code, and nonbinary data. Excels in processing of text by use of procedure files. Also features versatile search qualifiers, global changes, conditionals, online help, hexadecimal mode, space compression, looping, logical combinations of search strings, journaling, visible control characters, and automatic detabbing. Users of Cambridge implementation devised such ZED procedures as chess games, calculators, and programs for evaluating pi. Written entirely in C.

  18. The psychology of computer programming

    CERN Document Server

    Weinberg, Gerald Marvin

    1998-01-01

    This landmark 1971 classic is reprinted with a new preface, chapter-by-chapter commentary, and straight-from-the-heart observations on topics that affect the professional life of programmers. Long regarded as one of the first books to pioneer a people-oriented approach to computing, The Psychology of Computer Programming endures as a penetrating analysis of the intelligence, skill, teamwork, and problem-solving power of the computer programmer. Finding the chapters strikingly relevant to today's issues in programming, Gerald M. Weinberg adds new insights and highlights the similarities and differences between now and then. Using a conversational style that invites the reader to join him, Weinberg reunites with some of his most insightful writings on the human side of software engineering. Topics include egoless programming, intelligence, psychological measurement, personality factors, motivation, training, social problems on large projects, problem-solving ability, programming language design, team formati...

  19. Functional Programming in Computer Science

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Loren James [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Davis, Marion Kei [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-01-19

    We explore functional programming through a 16-week internship at Los Alamos National Laboratory. Functional programming is a branch of computer science that has exploded in popularity over the past decade due to its high-level syntax, ease of parallelization, and abundant applications. First, we summarize functional programming by listing the advantages of functional programming languages over the usual imperative languages, and we introduce the concept of parsing. Second, we discuss the importance of lambda calculus in the theory of functional programming. Lambda calculus was invented by Alonzo Church in the 1930s to formalize the concept of effective computability, and every functional language is essentially some implementation of lambda calculus. Finally, we display the lasting products of the internship: additions to a compiler and runtime system for the pure functional language STG, including both a set of tests that indicate the validity of updates to the compiler and a compiler pass that checks for illegal instances of duplicate names.
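The claim above that every functional language is essentially an implementation of lambda calculus can be made concrete with Church numerals, which encode the number n as "apply f n times" using nothing but single-argument functions. This is a standard textbook construction, not code from the internship itself.

```python
# Church numerals: arithmetic built entirely from one-argument lambdas,
# illustrating how lambda calculus formalizes effective computability.

zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(n):
    """Decode a Church numeral by counting applications of f."""
    return n(lambda k: k + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))   # Church-encoded 2 + 3
```

Everything here is a closure of one argument, which is the sense in which a pure functional language "is" lambda calculus with conveniences layered on top.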

  20. Light Water Reactor Sustainability Program. Digital Architecture Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, Kenneth [Idaho National Lab. (INL), Idaho Falls, ID (United States); Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-03-01

The Digital Architecture effort is a part of the Department of Energy (DOE) sponsored Light-Water Reactor Sustainability (LWRS) Program conducted at Idaho National Laboratory (INL). The LWRS program is performed in close collaboration with industry research and development (R&D) programs that provide the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants (NPPs). One of the primary missions of the LWRS program is to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. Therefore, a major objective of the LWRS program is the development of a seamless digital environment for plant operations and support by integrating information from plant systems with plant processes for nuclear workers through an array of interconnected technologies. In order to get the most benefit from the advanced technology suggested by the different research activities in the LWRS program, nuclear utilities need a digital architecture in place to support the technology. A digital architecture can be defined as a collection of information technology (IT) capabilities needed to support and integrate a wide spectrum of real-time digital capabilities for nuclear power plant performance improvements. It is not hard to imagine that many processes within the plant can be largely improved, from both a system and a human performance perspective, by utilizing a plant-wide (or near plant-wide) wireless network. For example, a plant-wide wireless network allows real-time plant status information to be easily accessed in the control room, field workers' computer-based procedures can be updated based on the real-time plant status, and status on ongoing procedures can be incorporated into smart schedules in the outage command center to allow for more accurate planning of critical tasks. The goal

  1. A Management-Based CIPP Evaluation of a Northern New Jersey School District's Digital Backpack Program

    Science.gov (United States)

    Bachenheimer, Barry A.

    2011-01-01

    The purpose of this study was to evaluate the Digital Backpack program in a Northern New Jersey School District using the CIPP Management-Based Evaluation model as a framework. The Stufflebeam (1971) CIPP model is an acronym for Context, Input, Process, and Product Evaluation. A "Digital Backpack" is a rolling computer bag given to K-12…

  2. Sonic boom research. [computer program

    Science.gov (United States)

    Zakkay, V.; Ting, L.

    1976-01-01

A computer program for the CDC 6600 is developed for nonlinear sonic boom analysis, including the asymmetric effect of lift near the vertical plane of symmetry. The program is written in FORTRAN IV. It carries out the numerical integration of the nonlinear governing equations from input data at a finite distance from the airplane configuration at flight altitude to yield the pressure signature at the ground. The required input data and the format for the output are described. A complete program listing and a sample calculation are given.

  3. Digital Potentiometer for Hybrid Computer EAI 680-PDP-8/I

    DEFF Research Database (Denmark)

    Højberg, Kristian Søe; Olsen, Jens V.

    1974-01-01

In this article a description is given of a 12 bit digital potentiometer for hybrid computer application. The system is composed of standard building blocks. Emphasis is laid on the development problems met and the problem solutions developed.

  4. Description of a digital computer simulation of an Annular Momentum Control Device (AMCD) laboratory test model

    Science.gov (United States)

    Woolley, C. T.; Groom, N. J.

    1981-01-01

    A description of a digital computer simulation of an Annular Momentum Control Device (AMCD) laboratory model is presented. The AMCD is a momentum exchange device which is under development as an advanced control effector for spacecraft attitude control systems. The digital computer simulation of this device incorporates the following models: six degree of freedom rigid body dynamics; rim warp; controller dynamics; nonlinear distributed element axial bearings; as well as power driver and power supply current limits. An annotated FORTRAN IV source code listing of the computer program is included.

  5. Risk-Assessment Computer Program

    Science.gov (United States)

    Dias, William C.; Mittman, David S.

    1993-01-01

RISK D/C is prototype computer program assisting in attempts to do program risk modeling for Space Exploration Initiative (SEI) architectures proposed in Synthesis Group Report. Risk assessment performed with respect to risk events, probabilities, and severities of potential results. Enables ranking, with respect to effectiveness, of risk-mitigation strategies proposed for exploration program architecture. Allows for fact that risk assessment in early phases of planning is subjective. Although specific to SEI in present form, also used as software framework for development of risk-assessment programs for other specific uses. Developed for Macintosh(TM) series computer. Requires HyperCard(TM) 2.0 or later, as well as 2 Mb of random-access memory and System 6.0.8 or later.

  6. Educational Impact of Digital Visualization Tools on Digital Character Production Computer Science Courses

    Science.gov (United States)

    van Langeveld, Mark Christensen

    2009-01-01

    Digital character production courses have traditionally been taught in art departments. The digital character production course at the University of Utah is centered, drawing uniformly from art and engineering disciplines. Its design has evolved to include a synergy of computer science, functional art and human anatomy. It gives students an…

  7. Elliptical Orbit Performance Computer Program

    Science.gov (United States)

    Myler, T.

    1984-01-01

    Elliptical Orbit Performance (ELOPE) computer program for analyzing orbital performance of space boosters uses orbit insertion data obtained from trajectory simulation to generate parametric data on apogee and perigee altitudes as function of payload data. Data used to generate presentation plots that display elliptical orbit performance capability of space booster.

  8. Computer Aided Teaching of Digital Signal Processing.

    Science.gov (United States)

    Castro, Ian P.

    1990-01-01

    Describes a microcomputer-based software package developed at the University of Surrey for teaching digital signal processing to undergraduate science and engineering students. Menu-driven software capabilities are explained, including demonstration of qualitative concepts and experimentation with quantitative data, and examples are given of…

  9. Intra- and interobserver reliability analysis of digital radiographic measurements for pediatric orthopedic parameters using a novel PACS integrated computer software program.

    Science.gov (United States)

    Segev, Eitan; Hemo, Yoram; Wientroub, Shlomo; Ovadia, Dror; Fishkin, Michael; Steinberg, David M; Hayek, Shlomo

    2010-08-01

    The between-observer reliability of repeated anatomic assessments in pediatric orthopedics relies on the precise definition of bony landmarks for measuring angles, indexes, and lengths of joints, limbs, and spine. We have analyzed intra- and interobserver reliability with a new digital measurement system (TraumaCad Wizard™). Five pediatric orthopedic surgeons measured 50 digital radiographs on three separate days using the TraumaCad system. There were 10 anterior-posterior (AP) pelvic views from developmental dysplasia of the hip (DDH) patients, 10 AP pelvic views from cerebral palsy (CP) patients, 10 AP standing view of the lower limb radiographs from leg length discrepancy (LLD) patients, and 10 AP and 10 lateral spine X-rays from scoliosis patients. All standing view of the lower limb radiographs were calibrated by the software to allow for accurate length measurements, using as reference a 1-inch metal ball placed at the level of the bone. Each observer performed 540 measurements (totaling 2,700). We estimated intra- and interobserver standard deviations for measurements in all categories by specialists and nonspecialists. The intraclass correlation coefficient (ICC) summarized the overall accuracy and precision of the measurement process relative to subject variation. We examined whether the relative accuracy of a measurement is adversely affected by the number of bony landmarks required for making the measurement. The overall ICC was >0.74 for 13 out of 18 measurements. Accuracy of the acetabular index for DDH was greater than for CP and relatively low for the center-edge angle in CP. Accuracy for bone length was better than for joint angulations in LLD and for the Cobb angle in AP views compared to lateral views for scoliosis. There were no clinically important biases, and most of the differences between specialists and nonspecialists were nonsignificant. The correlation between the results according to the number of bony landmarks that needed to be

  10. Edge enhancement of computed tomograms by digital unsharp masking.

    Science.gov (United States)

    Winter, J

    1980-04-01

Edge-enhanced images can be produced on existing commercial computed tomographic equipment by a method called "digital unsharp masking" without any expense or computer software development. This technique permits display of anatomic areas having an extremely wide range of densities, while making edge detail more apparent.
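The unsharp-masking idea behind this technique, adding back a scaled difference between the original image and a blurred copy, can be sketched in a few lines. This is a generic illustration on a toy grayscale image (a list of lists of pixel values), not the article's CT-specific procedure:

```python
# Minimal sketch of digital unsharp masking on a toy grayscale image.
# sharpened = original + amount * (original - blurred)

def box_blur(img, radius=1):
    """Blur by averaging each pixel with its neighbors (clamped at edges)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out

def unsharp_mask(img, amount=1.0):
    """Add back the high-frequency residual to emphasize edges."""
    blurred = box_blur(img)
    return [[img[y][x] + amount * (img[y][x] - blurred[y][x])
             for x in range(len(img[0]))] for y in range(len(img))]
```

Flat regions are left unchanged (the residual is zero there), while values on either side of an edge overshoot, which is exactly what makes edge detail more apparent.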

  11. Developing Digital Immigrants' Computer Literacy: The Case of Unemployed Women

    Science.gov (United States)

    Ktoridou, Despo; Eteokleous-Grigoriou, Nikleia

    2011-01-01

    Purpose: The purpose of this study is to evaluate the effectiveness of a 40-hour computer course for beginners provided to a group of unemployed women learners with no/minimum computer literacy skills who can be characterized as digital immigrants. The aim of the study is to identify participants' perceptions and experiences regarding technology,…

  12. How to optimize radiological images captured from digital cameras, using the Adobe Photoshop 6.0 program.

    Science.gov (United States)

    Chalazonitis, A N; Koumarianos, D; Tzovara, J; Chronopoulos, P

    2003-06-01

    Over the past decade, the technology that permits images to be digitized and the reduction in the cost of digital equipment allows quick digital transfer of any conventional radiological film. Images then can be transferred to a personal computer, and several software programs are available that can manipulate their digital appearance. In this article, the fundamentals of digital imaging are discussed, as well as the wide variety of optional adjustments that the Adobe Photoshop 6.0 (Adobe Systems, San Jose, CA) program can offer to present radiological images with satisfactory digital imaging quality.

  13. [Analog gamma camera digitalization computer system].

    Science.gov (United States)

    Rojas, G M; Quintana, J C; Jer, J; Astudillo, S; Arenas, L; Araya, H

    2004-01-01

Digitalization of analogue gamma camera systems, using special acquisition boards in microcomputers and appropriate software for the acquisition and processing of nuclear medicine images, is described in detail. Integrated microcomputer systems interconnected by means of a Local Area Network (LAN) and connected to several gamma cameras have been implemented using specialized acquisition boards. The PIP (Portable Image Processing) software was installed on each microcomputer to acquire and preprocess the nuclear medicine images. A specialized image processing software package has been designed and developed for these purposes. This software allows each nuclear medicine exam to be processed in a semiautomatic procedure and the results to be recorded on radiological film. A stable, flexible and inexpensive system which makes it possible to digitize, visualize, process, and print nuclear medicine images obtained from analogue gamma cameras was implemented in the Nuclear Medicine Division. Such a system yields higher quality images than those obtained with analogue cameras while keeping operating costs considerably lower (filming: 24.6%, fixing: 48.2%, developing: 26%). Analogue gamma camera systems can thus be digitalized economically. This system makes it possible to obtain nuclear medicine images of optimal clinical quality, to increase acquisition and processing efficiency, and to reduce the steps involved in each exam.

  14. Digital painting as a specific area of computer graphics

    OpenAIRE

    Vyhnánková, Iva

    2013-01-01

TITLE: Digital painting as a specific area of computer graphics AUTHOR: Iva Vyhnánková DEPARTMENT: Department of Information Technology and Education SUPERVISOR: Mgr. Tomáš Jeřábek ABSTRACT: Digital painting brings new possibilities to the field of arts. These possibilities consist not only of general advantages of software usage such as undo and easy modifications, but also of a number of new tools, which can simulate traditional media to a certain degree. The work deals with digital painting ...

  15. Handwritten Digits Recognition Using Neural Computing

    Directory of Open Access Journals (Sweden)

    Călin Enăchescu

    2009-12-01

Full Text Available In this paper we present a method for the recognition of handwritten digits and a practical implementation of this method for real-time recognition. A theoretical framework for the neural networks used to classify the handwritten digits is also presented. The classification task is performed using a Convolutional Neural Network (CNN). A CNN is a special type of multi-layer neural network, trained here with an optimized version of the back-propagation learning algorithm. CNNs are designed to recognize visual patterns directly from pixel images with minimal preprocessing, and are capable of recognizing patterns with extreme variability (such as handwritten characters), and with robustness to distortions and simple geometric transformations. The main contributions of this paper are the original methods for increasing the efficiency of the learning algorithm by preprocessing the images before the learning process, and a method for increasing precision and performance in real-time applications by removing non-useful information from the background. By combining these strategies we have obtained an accuracy of 96.76%, using as training set the NIST (National Institute of Standards and Technology) database.
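The core operation of such a network, the 2-D convolution of a kernel over pixel values, can be illustrated with a minimal sketch (pure Python, single channel, "valid" mode with no padding or stride). This is a generic illustration of the operation, not the paper's implementation:

```python
# Minimal 2-D convolution as used in a CNN layer: the kernel slides over
# the image and each output value is a weighted sum of the pixels under it.

def conv2d(image, kernel):
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for y in range(ih - kh + 1):          # "valid" mode: no padding
        row = []
        for x in range(iw - kw + 1):
            acc = 0.0
            for ky in range(kh):
                for kx in range(kw):
                    acc += image[y + ky][x + kx] * kernel[ky][kx]
            row.append(acc)
        out.append(row)
    return out
```

A real CNN stacks many such kernels (with learned weights), interleaved with nonlinearities and pooling, which is what gives it robustness to distortions.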

  16. Computational intelligence in digital forensics forensic investigation and applications

    CERN Document Server

    Choo, Yun-Huoy; Abraham, Ajith; Srihari, Sargur

    2014-01-01

Computational intelligence techniques have been widely explored in various domains, including forensics. Forensic analysis encompasses the study of pattern analysis that answers questions of interest in security, medical, legal, and genetic studies, among others. However, forensic analysis is usually performed through laboratory experiments, which are expensive in both cost and time. Therefore, this book seeks to explore the progress and advancement of computational intelligence techniques in different focus areas of forensic studies, and aims to build a stronger connection between computer scientists and forensic field experts.   This book, Computational Intelligence in Digital Forensics: Forensic Investigation and Applications, is the first volume in the Intelligent Systems Reference Library series. The book presents original research results and innovative applications of computational intelligence in digital forensics. This edited volume contains seventeen chapters and presents the latest state-of-the-art advancement ...

  17. BLAST: a digital computer program for the dynamic simulation of the high temperature gas cooled reactor reheater-steam generator module

    Energy Technology Data Exchange (ETDEWEB)

    Hedrick, R.A.; Cleveland, J.C.

    1976-06-24

BLAST simulates the high temperature gas cooled reactor reheater-steam generator module with a multi-node, fixed boundary, homogeneous flow model. The time dependent conservation of energy, mass, and momentum equations are solved by an implicit integration technique. The code contains equation of state formulations for both helium and water as well as heat transfer and friction factor correlations. Normal operational transients and more severe transients such as those resulting in low and/or reverse flow can be simulated. The code calculates helium and water temperature, pressure, flow rate, and tube bulk and wall temperatures at various points within the reheater-steam generator module during the transients. BLAST predictions will be compared with dynamic test results obtained from the Fort St. Vrain reactor owned by Public Service of Colorado, and, based on these comparisons, appropriate improvements will be made in BLAST. BLAST is written in FORTRAN IV for the IBM 360/91 computer at the Oak Ridge National Laboratory.

  18. Leaders in Computing Changing the digital world

    CERN Document Server

    IT, BCS -The Chartered Institute for; Booch, Grady; Torvalds, Linus; Wozniak, Steve; Cerf, Vint; Spärck Jones, Karen; Berners-Lee, Tim; Wales, Jimmy; Shirley, Stephanie

    2011-01-01

    This collection of interviews provides a fascinating insight into the thoughts and ideas of influential figures from the world of IT and computing, such as Sir Tim Berners-Lee, Donald Knuth, Linus Torvalds, Jimmy Wales and Steve Wozniak. It gives an excellent overview of important developments in this diverse field over recent years.

  19. Digital Geometry Algorithms Theoretical Foundations and Applications to Computational Imaging

    CERN Document Server

    Barneva, Reneta

    2012-01-01

    Digital geometry emerged as an independent discipline in the second half of the last century. It deals with geometric properties of digital objects and is developed with the unambiguous goal to provide rigorous theoretical foundations for devising new advanced approaches and algorithms for various problems of visual computing. Different aspects of digital geometry have been addressed in the literature. This book is the first one that explicitly focuses on the presentation of the most important digital geometry algorithms. Each chapter provides a brief survey on a major research area related to the general volume theme, description and analysis of related fundamental algorithms, as well as new original contributions by the authors. Every chapter contains a section in which interesting open problems are addressed.

  20. Computer Education in Dental Laboratory Technology Programs.

    Science.gov (United States)

    Rogers, William A.; Hawkins, Robert Ross

    1991-01-01

    A 1990 survey of 37 dental technology programs investigated 3 areas of computer use: current and anticipated general computer education courses; incorporation of computer applications into technology and management courses; and faculty use of the computer. Most programs are beginning to expand use of technology. (MSE)

  1. Digital doorway computer literacy through unassisted learning in South Africa

    CSIR Research Space (South Africa)

    Smith, R

    2006-02-01

    Full Text Available The Digital Doorway is a joint project between the Department of Science and Technology (DST) and the Meraka Institute, with a vision of making a fundamental difference to computer literacy and associated skills in Africa. Underpinning this project...

  2. The ongoing Digitalization of an Introductory Programming Course

    DEFF Research Database (Denmark)

    Nørmark, Kurt

    2016-01-01

This paper is about the ongoing digitalization of a C programming course. The paper describes our considerations about the use of video resources, as well as other digital learning resources. In particular, we discuss the ongoing transition from using a number of supplementary videos (in a tradit...

  3. Letter from the Guest Editor: Digital Rhetoric, Digital Literacy, Computers, and Composition.

    Science.gov (United States)

    Handa, Carolyn

    2001-01-01

Describes how this special issue collects a group of thought-provoking essays to encourage both writing teachers and the members of the more specialized field of Computers and Writing to consider questions of digital rhetoric and literacy in their many applications to subjects thought about every day: pedagogy, professional writing, hypertext…

  4. Computer technology and computer programming research and strategies

    CERN Document Server

    Antonakos, James L

    2011-01-01

    Covering a broad range of new topics in computer technology and programming, this volume discusses encryption techniques, SQL generation, Web 2.0 technologies, and visual sensor networks. It also examines reconfigurable computing, video streaming, animation techniques, and more. Readers will learn about an educational tool and game to help students learn computer programming. The book also explores a new medical technology paradigm centered on wireless technology and cloud computing designed to overcome the problems of increasing health technology costs.

  5. Use of the computer program in a cloud computing

    Directory of Open Access Journals (Sweden)

    Radovanović Sanja

    2013-01-01

Full Text Available Cloud computing represents a specific form of networking, in which a computer program simulates the operation of one or more server computers. In terms of copyright, all technological processes that take place within cloud computing are covered by the notion of copying computer programs, and the exclusive right of reproduction. However, this right suffers some limitations in order to allow normal use of the computer program by users. Given that cloud computing is a virtualized network, the issue of normal use of the computer program requires putting all aspects of permitted copying into the context of a specific computing environment and the specific processes within the cloud. In this sense, the paper points out that the user of a computer program in cloud computing needs to obtain the consent of the right holder for any act which he undertakes using the program. In other words, copyright applies at full scale in cloud computing, and thus so does the freedom of contract (in the case of this particular restriction as well).

  6. Scaling the Digital Divide: Home Computer Technology and Student Achievement

    OpenAIRE

    Vigdor, Jacob L.; Helen F. Ladd

    2010-01-01

    Does differential access to computer technology at home compound the educational disparities between rich and poor? Would a program of government provision of computers to early secondary school students reduce these disparities? We use administrative data on North Carolina public school students to corroborate earlier surveys that document broad racial and socioeconomic gaps in home computer access and use. Using within-student variation in home computer access, and across-ZIP code variation...

  7. Gamma spectra pictures using a digital plotter. Program MONO; Representación de espectros directos mediante un trazado digital. Programa MONO

    Energy Technology Data Exchange (ETDEWEB)

    Los Arcos, J. M.

    1978-07-01

The program MONO has been written for a CALCOMP-936 digital plotter operating off-line with a UNIVAC 1106 computer, to obtain graphic representations of single gamma spectra stored on magnetic tape. It allows plotting the whole spectrum or only a part of it, as well as drawing a given spectrum in the same picture as the previous one or in a different one. Ten representation scales are available, and up to nine comment lines can be written on a graphic. (Author) 4 refs.

  8. Scenario-Based Digital Forensics Challenges in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Erik Miranda Lopez

    2016-10-01

Full Text Available The aim of digital forensics is to extract information to answer the 5Ws (Why, When, Where, What, and Who) from the data extracted from the evidence. In order to achieve this, most digital forensic processes assume absolute control of digital evidence. However, in a cloud environment forensic investigation, this is not always possible. Additionally, the unique characteristics of cloud computing create new technical, legal and architectural challenges when conducting a forensic investigation. We propose a hypothetical scenario to uncover and explain the challenges forensic practitioners face during cloud investigations. Additionally, we also provide solutions to address the challenges. Our hypothetical case scenario has shown that, in the long run, better live forensic tools, development of new methods tailored for cloud investigations and new procedures and standards are indeed needed. Furthermore, we have come to the conclusion that the biggest challenge in forensic investigations is not technical but legal.

  9. Personal Computer Transport Analysis Program

    Science.gov (United States)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
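The "solution vector" construction described above, a list of components ordered by their inlet dependency on other components, amounts to a topological sort of the component graph. A minimal sketch, with a hypothetical three-component loop (names and dependencies are invented; an acyclic graph is assumed):

```python
# Sketch of building a solution vector: each component is placed after
# every component that feeds its inlet, so a single pass through the
# vector updates components in a consistent upstream-to-downstream order.

def build_solution_vector(inlet_deps):
    """Topological sort; inlet_deps maps a component to the components
    feeding its inlet. Assumes the dependency graph has no cycles."""
    ordered, visited = [], set()

    def visit(comp):
        if comp in visited:
            return
        visited.add(comp)
        for upstream in inlet_deps.get(comp, []):
            visit(upstream)          # place upstream components first
        ordered.append(comp)

    for comp in inlet_deps:
        visit(comp)
    return ordered

# Hypothetical loop: a tank feeds a tube, which feeds a heat exchanger.
deps = {"heat_exchanger": ["tube"], "tube": ["tank"], "tank": []}
vector = build_solution_vector(deps)
```

At every time step the simulation would then walk `vector` in order, so each component sees up-to-date inlet conditions from its upstream neighbor.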

  10. Hydraulic logic gates: building a digital water computer

    Science.gov (United States)

    Taberlet, Nicolas; Marsal, Quentin; Ferrand, Jérémy; Plihon, Nicolas

    2018-03-01

In this article, we propose an easy-to-build hydraulic machine which serves as a digital binary computer. We first explain how an elementary adder can be built from test tubes and pipes (a cup filled with water representing a 1, and an empty cup a 0). Using a siphon and a slow drain, the proposed setup combines AND and XOR logical gates in a single device which can add two binary digits. We then show how these elementary units can be combined to construct a full 4-bit adder. The sequencing of the computation is discussed and a water clock can be incorporated so that the machine can run without any exterior intervention.
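The gate logic described above can be checked in software: each adder unit computes XOR for the sum cup and AND for the carry cup, and chaining units yields the 4-bit adder. This is a sketch of the logic only, not of the hydraulics:

```python
# Half adder: one siphon/drain unit computing sum (XOR) and carry (AND).
def half_adder(a, b):
    return a ^ b, a & b              # (sum, carry)

# Full adder: two half adders plus an OR to merge the carries.
def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2               # (sum, carry_out)

def add_4bit(a_bits, b_bits):
    """Add two 4-bit numbers given as lists of bits, least significant first."""
    result, carry = [], 0
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    return result + [carry]          # fifth bit is the final carry out
```

For example, 5 + 6 (`[1,0,1,0]` + `[0,1,1,0]`, least significant bit first) yields 11.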

  11. Debugging a high performance computing program

    Science.gov (United States)

    Gooding, Thomas M.

    2013-08-20

    Methods, apparatus, and computer program products are disclosed for debugging a high performance computing program by gathering lists of addresses of calling instructions for a plurality of threads of execution of the program, assigning the threads to groups in dependence upon the addresses, and displaying the groups to identify defective threads.
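The grouping step described above can be sketched as follows: threads are keyed by the addresses of their calling instructions, so a thread stuck at an unusual address falls into its own small group. Thread IDs and addresses here are invented for illustration:

```python
# Group threads by their calling-instruction addresses: most threads of a
# healthy SPMD program sit at the same place, so a defective thread shows
# up as a small (often singleton) group.
from collections import defaultdict

def group_threads(call_addresses):
    """Map each distinct tuple of calling addresses to the thread IDs at it."""
    groups = defaultdict(list)
    for thread_id, addresses in call_addresses.items():
        groups[tuple(addresses)].append(thread_id)
    return dict(groups)

# Invented example: three threads wait at the same barrier, one is stuck.
stacks = {
    0: [0x4005A0, 0x4007C8],
    1: [0x4005A0, 0x4007C8],
    2: [0x4005A0, 0x4007C8],
    3: [0x4005A0, 0x400911],   # outlier: the suspect thread
}
groups = group_threads(stacks)
```

Displaying the groups (rather than thousands of individual stacks) is what makes the defective threads easy to spot.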

  12. Computer Assistance for Writing Interactive Programs: TICS

    Science.gov (United States)

    Kaplow, Ray; And Others

    1973-01-01

A description of an on-line and interactive programming system (TICS - Teacher-Interactive-Computer-System), which is aimed at facilitating the authoring of interactive, instructional computer programs by persons who are experts on the subject matter being addressed, but not necessarily programmers. (Author)

  13. The NASA computer science research program plan

    Science.gov (United States)

    1983-01-01

A taxonomy of computer science is included, and the state of the art of each of the major computer science categories is summarized. A functional breakdown of NASA programs under Aeronautics R and D, space R and T, and institutional support is also included. These areas were assessed against the computer science categories. Concurrent processing, highly reliable computing, and information management are identified.

  14. Digital image processing using parallel computing based on CUDA technology

    Science.gov (United States)

    Skirnevskiy, I. P.; Pustovit, A. V.; Abdrashitova, M. O.

    2017-01-01

This article describes the expediency of using a graphics processing unit (GPU) for big data processing in the context of digital image processing. It provides a short description of parallel computing technology and its usage in different areas, a definition of image noise, and a brief overview of some noise removal algorithms. It also describes some basic requirements that should be met by a noise removal algorithm intended for computed tomography projections. It provides a comparison of performance with and without using a GPU, as well as with different percentages of CPU and GPU usage.

  15. Digital Earth Initiative: A Joint Interagency Program

    Science.gov (United States)

    Halem, Milton

    1999-01-01

    The Digital Earth is a virtual representation of our planet that enables a person to explore and interact with the vast amounts of natural and cultural information gathered about the Earth. The Digital Earth comprises data interfaces and standards enabling access to geo-referenced data from remote sensing, cartographic, demographic, medical, and other sources to respond to questions posed by the user. In a recent address at the California Science Center in Los Angeles, Vice President Al Gore articulated a Digital Earth Vision. That vision spoke to developing a multi-resolution, three-dimensional representation of the planet, into which we can roam and zoom into vast quantities of embedded geo-referenced data. The vision was not limited to moving through space but also allowing travel over a time-line, which can be set for days, years, centuries, or even geological epochs. As prototypes become available, it would also be possible to interact with the Digital Earth in multiple places around the country with access to high-speed networks and at a more limited level of access over the Internet. NASA was asked by the Vice President to lead an interagency initiative that would take steps to bring this vision to the public. This talk describes the start-up and plans of the Digital Earth Interagency Working Group in the formulation of its charter, an architecture reference model for Digital Earth, public/private partnerships, cooperative agreement notices, Digital Earth prototypes, and testbeds. Animations employing technologies for virtual roaming and zooming through multi-resolution satellite data set as prototype systems will be presented along with examples of potential user scenarios. Plans for engaging academia and industry in implementing the Digital Earth initiative will be discussed.

  16. High performance computing and communications program

    Science.gov (United States)

    Holcomb, Lee

    1992-01-01

    A review of the High Performance Computing and Communications (HPCC) program is provided in vugraph format. The goals and objectives of this federal program are as follows: extend U.S. leadership in high performance computing and computer communications; disseminate the technologies to speed innovation and to serve national goals; and spur gains in industrial competitiveness by making high performance computing integral to design and production.

  17. What do reversible programs compute?

    DEFF Research Database (Denmark)

    Axelsen, Holger Bock; Glück, Robert

    2011-01-01

...be the starting point of a computational theory of reversible computing. We provide a novel semantics-based approach to such a theory, using reversible Turing machines (RTMs) as the underlying computation model. We show that the RTMs can compute exactly all injective, computable functions. We find that the RTMs are not strictly classically universal, but that they support another notion of universality; we call this RTM-universality. Thus, even though the RTMs are sub-universal in the classical sense, they are powerful enough as to include a self-interpreter. Lifting this to other computation models, we propose r...

  18. Integer programming theory, applications, and computations

    CERN Document Server

    Taha, Hamdy A

    1975-01-01

    Integer Programming: Theory, Applications, and Computations provides information pertinent to the theory, applications, and computations of integer programming. This book presents the computational advantages of the various techniques of integer programming.Organized into eight chapters, this book begins with an overview of the general categorization of integer applications and explains the three fundamental techniques of integer programming. This text then explores the concept of implicit enumeration, which is general in a sense that it is applicable to any well-defined binary program. Other
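The implicit-enumeration idea mentioned above, exploring the tree of 0/1 variable fixings while pruning branches that cannot improve the incumbent, can be sketched on a small binary program, here a 0/1 knapsack with invented data:

```python
# Implicit enumeration (branch-and-bound) for a 0/1 knapsack:
# maximize sum(values[i] * x[i]) s.t. sum(weights[i] * x[i]) <= capacity,
# x[i] in {0, 1}. Branches are pruned by infeasibility or by an
# optimistic bound that cannot beat the incumbent.

def solve_knapsack(values, weights, capacity):
    n = len(values)
    best = [0]  # incumbent objective (the all-zero solution is feasible)

    def bound(i, value):
        # Optimistic bound: take every remaining item, ignoring capacity.
        return value + sum(values[i:])

    def branch(i, value, weight):
        if weight > capacity:
            return                        # infeasible: prune
        if bound(i, value) <= best[0]:
            return                        # cannot beat incumbent: prune
        if i == n:
            best[0] = value               # complete assignment: new incumbent
            return
        branch(i + 1, value + values[i], weight + weights[i])  # x_i = 1
        branch(i + 1, value, weight)                           # x_i = 0

    branch(0, 0, 0)
    return best[0]
```

The "implicit" part is that most of the 2^n assignments are never visited: whole subtrees disappear as soon as a partial assignment is infeasible or its bound falls below the incumbent.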

  19. Design of All Digital Flight Program Training Desktop Application System

    Directory of Open Access Journals (Sweden)

    Li Yu

    2017-01-01

Full Text Available The all-digital flight program training desktop application system has simple operating requirements. It lets aircrew combine the learning of theory closely with operational training, improving training efficiency and effectiveness. This paper studies the application field and design requirements of flight program training systems. Based on a Windows desktop application, the design concept and system architecture of the all-digital flight program training system are put forward. Flight characteristics, key airborne systems and the aircraft cockpit are simulated. Finally, by comparing the system with a flight training simulator and a specific scripted program training system, the characteristics and advantages of the training system are analyzed.

  20. Mastering cloud computing foundations and applications programming

    CERN Document Server

    Buyya, Rajkumar; Selvi, SThamarai

    2013-01-01

    Mastering Cloud Computing is designed for undergraduate students learning to develop cloud computing applications. Tomorrow's applications won't live on a single computer but will be deployed from and reside on a virtual server, accessible anywhere, any time. Tomorrow's application developers need to understand the requirements of building apps for these virtual systems, including concurrent programming, high-performance computing, and data-intensive systems. The book introduces the principles of distributed and parallel computing underlying cloud architectures and specifical

  1. Computational proximity excursions in the topology of digital images

    CERN Document Server

    Peters, James F

    2016-01-01

This book introduces computational proximity (CP) as an algorithmic approach to finding nonempty sets of points that are either close to each other or far apart. Typically in computational proximity, the book starts with some form of proximity space (topological space equipped with a proximity relation) that has an inherent geometry. In CP, two types of near sets are considered, namely, spatially near sets and descriptively near sets. It is shown that connectedness, boundedness, mesh nerves, convexity, shapes and shape theory are principal topics in the study of nearness and separation of physical as well as abstract sets. CP has a hefty visual content. Applications of CP in computer vision, multimedia, brain activity, biology, social networks, and cosmology are included. The book has been derived from the lectures of the author in a graduate course on the topology of digital images taught over the past several years. Many of the students have provided important insights and valuable suggestions. The topics in ...

  2. On Verified Numerical Computations in Convex Programming

    OpenAIRE

    Jansson, Christian

    2009-01-01

    This survey contains recent developments for computing verified results of convex constrained optimization problems, with emphasis on applications. Especially, we consider the computation of verified error bounds for non-smooth convex conic optimization in the framework of functional analysis, for linear programming, and for semidefinite programming. A discussion of important problem transformations to special types of convex problems and convex relaxations is included...

  3. Viking image processing. [digital stereo imagery and computer mosaicking

    Science.gov (United States)

    Green, W. B.

    1977-01-01

    The paper discusses the camera systems capable of recording black and white and color imagery developed for the Viking Lander imaging experiment. Each Viking Lander image consisted of a matrix of numbers with 512 rows and an arbitrary number of columns up to a maximum of about 9,000. Various techniques were used in the processing of the Viking Lander images, including: (1) digital geometric transformation, (2) the processing of stereo imagery to produce three-dimensional terrain maps, and (3) computer mosaicking of distinct processed images. A series of Viking Lander images is included.

  4. Computer image processing - The Viking experience. [digital enhancement techniques

    Science.gov (United States)

    Green, W. B.

    1977-01-01

    Computer processing of digital imagery from the Viking mission to Mars is discussed, with attention given to subjective enhancement and quantitative processing. Contrast stretching and high-pass filtering techniques of subjective enhancement are described; algorithms developed to determine optimal stretch and filtering parameters are also mentioned. In addition, geometric transformations to rectify the distortion of shapes in the field of view and to alter the apparent viewpoint of the image are considered. Perhaps the most difficult problem in quantitative processing of Viking imagery was the production of accurate color representations of Orbiter and Lander camera images.

  5. Digital dice computational solutions to practical probability problems

    CERN Document Server

    Nahin, Paul J

    2013-01-01

    Some probability problems are so difficult that they stump the smartest mathematicians. But even the hardest of these problems can often be solved with a computer and a Monte Carlo simulation, in which a random-number generator simulates a physical process, such as a million rolls of a pair of dice. This is what Digital Dice is all about: how to get numerical answers to difficult probability problems without having to solve complicated mathematical equations. Popular-math writer Paul Nahin challenges readers to solve twenty-one difficult but fun problems, from determining the
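The book's approach can be illustrated with the kind of example it mentions: estimate a dice probability by simulating many rolls rather than solving it analytically. A minimal sketch, here for the probability that a pair of fair dice sums to 7 (exact answer 6/36 = 1/6):

```python
# Monte Carlo estimate: the random-number generator stands in for the
# physical process of rolling a pair of dice many times.
import random

def estimate_prob_sum7(rolls=1_000_000, seed=42):
    """Fraction of simulated rolls in which two fair dice sum to 7."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(rolls)
               if rng.randint(1, 6) + rng.randint(1, 6) == 7)
    return hits / rolls
```

With enough rolls the estimate converges on 1/6, and the same few lines adapt to problems with no tractable closed form, which is the book's point.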

  6. Coding Metamaterials, Digital Metamaterials and Programming Metamaterials

    OpenAIRE

    Cui, Tie Jun; Qi, Mei Qing; Wan, Xiang; Zhao, Jie; Cheng, Qiang

    2014-01-01

    As artificial structures, metamaterials are usually described by macroscopic effective medium parameters, which are named as "analog metamaterials". Here, we propose "digital metamaterials" in two steps. Firstly, we present "coding metamaterials" that are composed of only two kinds of unit cells with 0 and {\\pi} phase responses, which we name as "0" and "1" elements. By coding "0" and "1" elements with controlled sequences (i.e., 1-bit coding), we can manipulate electromagnetic (EM) waves and...
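The 1-bit coding scheme described above can be illustrated with a simplified 1-D array model: each "0" element contributes 0 phase, each "1" element contributes π, and the scattered field is the coherent sum over elements. This is an illustrative toy model, not the paper's full-wave treatment:

```python
# Toy 1-D model of a coding metasurface: bit "0" -> 0 phase response,
# bit "1" -> pi phase response; elements spaced in fractions of a wavelength.
import cmath
import math

def array_factor(code, theta, spacing_wl=0.5):
    """Magnitude of the far-field sum for a coding sequence at angle theta."""
    total = 0 + 0j
    for n, bit in enumerate(code):
        element_phase = math.pi if bit == "1" else 0.0
        path_phase = 2 * math.pi * spacing_wl * n * math.sin(theta)
        total += cmath.exp(1j * (element_phase + path_phase))
    return abs(total)
```

With half-wavelength spacing, the uniform code "0000" radiates strongly broadside (theta = 0), while the alternating code "0101" cancels broadside and redirects the energy, which is the sense in which controlled coding sequences manipulate EM waves.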

  7. Folding Digital Mapping into a Traditional Field Camp Program

    Science.gov (United States)

    Kelley, D. F.

    2011-12-01

Louisiana State University runs a field camp with a permanent fixed base that has operated continually since 1928 in the Front Range just to the south of Colorado Springs, CO. The field camp program, which offers a 6-credit-hour course in Field Geology, follows a very traditional structure. The first week is spent collecting data for the construction of a detailed stratigraphic column of the local geology. The second week is spent learning the skills of geologic mapping, while the third applies these skills to a more geologically complicated mapping area. The final three weeks of the field camp program are spent studying and mapping igneous and metamorphic rocks as well as conducting a regional stratigraphic correlation exercise. Historically there has been a lack of technology involved in this program. All mapping has been done in the field without the use of any digital equipment and all products have been made in the office without the use of computers. In the summer of 2011 the use of GPS units and GIS software was introduced to the program. The exercise that was chosen for this incorporation of technology was one in which metamorphic rocks are mapped within Golden Gate Canyon State Park in Colorado. This same mapping exercise was carried out during the 2010 field camp session with no GPS or GIS use. The students in both groups had similar geologic backgrounds, similar grade point averages, and similar overall performances at field camp. However, the group that used digital mapping techniques mapped the field area more quickly and reportedly with greater ease. Additionally, the students who used GPS and GIS included more detailed rock descriptions with their final maps, indicating that they spent less time in the field focusing on mapping contacts between units. The outcome was a better overall product. The use of GPS units also indirectly caused the students to produce better field maps. In addition to greater ease in mapping, the use of GIS software to

  8. [Profile, competencies and digital fluency of nurses in the Professional Improvement Program].

    Science.gov (United States)

    Tanabe, Lyvia Pini; Kobayashi, Rika Miyahara

    2013-08-01

    A descriptive exploratory study was conducted in the city of São Paulo to identify the profile, competencies and digital fluency of nurses in the Professional Improvement Program in handling technology at work. The population, composed of 60 nurses in the program, answered a questionnaire with data about profile, digital fluency and professional competencies. The participants were found to be: 95.0% female, 61.7% between 23 and 25 years old, 75.0% from public schools, 58.3% enrolled in cardiovascular nursing, 98.3% had contact with computing resources during graduation, 100.0% had a computer at home, 86.7% accessed the internet daily, 96.7% used Messenger and 58.3% had an intermediate level of knowledge and skill in computing. Professional competencies required for technology management referred to knowing how to be innovative, creative, and updated to identify and manage software and to use technological resources.

  9. Design of a fault tolerant airborne digital computer. Volume 2: Computational requirements and technology

    Science.gov (United States)

    Ratner, R. S.; Shapiro, E. B.; Zeidler, H. M.; Wahlstrom, S. E.; Clark, C. B.; Goldberg, J.

    1973-01-01

    This final report summarizes the work on the design of a fault tolerant digital computer for aircraft. Volume 2 is composed of two parts. Part 1 is concerned with the computational requirements associated with an advanced commercial aircraft. Part 2 reviews the technology that will be available for the implementation of the computer in the 1975-1985 period. With regard to the computational task, 26 computations have been categorized according to computational load, memory requirements, criticality, permitted down-time, and the need to save data in order to effect a roll-back. The technology part stresses the impact of large-scale integration (LSI) on the realization of logic and memory. Module interconnection possibilities were also considered so as to minimize fault propagation.

  10. Computer vision and digital imaging technology in melanoma detection.

    Science.gov (United States)

    Voigt, Holger; Classen, Richarda

    2002-08-01

    With today's treatment options, melanoma cure rates can be improved only if the diagnosis is made early enough to allow for curative surgery. Since clinical signs of malignancy in a pigmented lesion are often ambiguous and even dermatology experts may misdiagnose melanoma, diagnostic tools and procedures have been developed to assist the clinician in the diagnostic workup. Epiluminescence microscopy or dermatoscopy is widely used to inspect the melanin reticulum at the epidermo-dermal junction zone for signs indicative of early tumor growth. With the help of computer technology, digital dermatoscopy systems have entered the diagnostic arena capable of accurately assessing skin surface features modeled along the ABCD criteria already used for clinical assessment of pigmented skin lesions. Today's technically refined computer-based systems provide sophisticated functionalities for automated feature extraction and lesion assessment for quantitative analysis, and may also be used for education and training purposes. Copyright 2002, Elsevier Science (USA). All rights reserved.

  11. Accuracy of a computed tomography scanning procedure to manufacture digital models.

    NARCIS (Netherlands)

    Darroudi, A.M.; Kuijpers-Jagtman, A.M.; Ongkosuwito, E.M.; Suttorp, C.M.; Bronkhorst, E.M.; Breuning, K.H.

    2017-01-01

    INTRODUCTION: Accurate articulation of the digital dental casts is crucial in orthodontic diagnosis and treatment planning. We aimed to determine the accuracy of manufacturing digital dental casts from computed tomography scanning of plaster casts regarding linear dimensions and interarch

  12. Structured Parallel Programming Patterns for Efficient Computation

    CERN Document Server

    McCool, Michael; Robison, Arch

    2012-01-01

    Programming is now parallel programming. Much as structured programming revolutionized traditional serial programming decades ago, a new kind of structured programming, based on patterns, is relevant to parallel programming today. Parallel computing experts and industry insiders Michael McCool, Arch Robison, and James Reinders describe how to design and implement maintainable and efficient parallel algorithms using a pattern-based approach. They present both theory and practice, and give detailed concrete examples using multiple programming models. Examples are primarily given using two of th

  13. Computer Programs in Marine Science

    Science.gov (United States)

    1976-04-01

    between two locations. Requires subroutines COS, SIN, ARCOS. Author - Ralph Johnson. Oceanographic Services Branch. Copy on file at NODC National... STEREOGRAPHIC PROJECTION 65 FORTRAN CDC 3800; PIE SCATTERING COMPUTATIONS 41 FORTRAN CDC 3600; PLOTS TRACK AND DATA PROFILE TRACK 47 FORTRAN CDC 3800

  14. 10 CFR 73.54 - Protection of digital computer and communication systems and networks.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 2 2010-01-01 2010-01-01 false Protection of digital computer and communication systems... computer and communication systems and networks. By November 23, 2009 each licensee currently licensed to... provide high assurance that digital computer and communication systems and networks are adequately...

  15. FORTRAN computer program for seismic risk analysis

    Science.gov (United States)

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
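
    The core hazard computation such a program performs combines source activity rates with an attenuation function and its scatter. A minimal sketch, in which the attenuation coefficients and the source list are invented purely for illustration (McGuire's program uses published attenuation functions and a far more complete treatment):

```python
import math

def ln_median_pga(m, r, c0=-3.5, c1=0.9, c2=1.2, c3=10.0):
    # Illustrative attenuation law: ln(PGA) vs. magnitude m and distance r (km).
    # Coefficients are invented for demonstration, not from a published relation.
    return c0 + c1 * m - c2 * math.log(r + c3)

def prob_exceed(a, m, r, sigma=0.6):
    # P(PGA > a) for one event, assuming lognormal scatter about the median.
    z = (math.log(a) - ln_median_pga(m, r)) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def annual_exceedance_rate(a, sources):
    # Sum event rates weighted by exceedance probability over all point sources.
    # Each source is a tuple (annual_rate, magnitude, distance_km).
    return sum(rate * prob_exceed(a, m, r) for rate, m, r in sources)
```

    Summing rate-weighted exceedance probabilities over sources gives the annual rate at which the ground-motion level `a` is exceeded at the site; the full hazard curve is this quantity evaluated over a grid of levels.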

  16. Computer programming and architecture the VAX

    CERN Document Server

    Levy, Henry

    2014-01-01

    Takes a unique systems approach to programming and architecture of the VAX. Using the VAX as a detailed example, the first half of this book offers a complete course in assembly language programming. The second half describes higher-level systems issues in computer architecture. Highlights include the VAX assembler and debugger, other modern architectures such as RISCs, multiprocessing and parallel computing, microprogramming, caches and translation buffers, and an appendix on the Berkeley UNIX assembler.

  17. The Dynamic Geometrisation of Computer Programming

    Science.gov (United States)

    Sinclair, Nathalie; Patterson, Margaret

    2018-01-01

    The goal of this paper is to explore dynamic geometry environments (DGE) as a type of computer programming language. Using projects created by secondary students in one particular DGE, we analyse the extent to which the various aspects of computational thinking--including both ways of doing things and particular concepts--were evident in their…

  18. Computer Assisted Programmed Instruction and Cognitive ...

    African Journals Online (AJOL)


    Computer Assisted Programmed Instruction and Cognitive Preference Style as Determinants of Achievement of Secondary School Physics Students. Sotayo, M. A. O., Federal College of Education, Osiele, Abeokuta, Nigeria. Abstract: The study probes into the effect of Computer Assisted Instruction and Cognitive preference.

  19. NASA High-End Computing Program Website

    Science.gov (United States)

    Cohen, Jarrett S.

    2008-01-01

    If you are a NASA-sponsored scientist or engineer, computing time is available to you at the High-End Computing (HEC) Program's NASA Advanced Supercomputing (NAS) Facility and NASA Center for Computational Sciences (NCCS). The Science Mission Directorate will select from requests submitted to the e-Books online system for awards beginning on May 1. Current projects set to expire on April 30 must have a request in e-Books to be considered for renewal.

  20. Migrating Home Computer Audio Waveforms to Digital Objects: A Case Study on Digital Archaeology

    Directory of Open Access Journals (Sweden)

    Mark Guttenbrunner

    2011-03-01

    Full Text Available Rescuing data from inaccessible or damaged storage media for the purpose of preserving the digital data for the long term is one of the dimensions of digital archaeology. With the current pace of technological development, any system can become obsolete in a matter of years and hence the data stored in a specific storage media might not be accessible anymore due to the unavailability of the system to access the media. In order to preserve digital records residing in such storage media, it is necessary to extract the data stored in those media by some means. One early storage medium for home computers in the 1980s was audio tape. The first home computer systems allowed the use of standard cassette players to record and replay data. Audio cassettes are more durable than old home computers when properly stored. Devices playing this medium (i.e. tape recorders) can be found in working condition or can be repaired, as they are usually made out of standard components. By re-engineering the format of the waveform and the file formats, the data on such media can then be extracted from a digitised audio stream and migrated to a non-obsolete format. In this paper we present a case study on extracting the data stored on an audio tape by an early home computer system, namely the Philips Videopac+ G7400. The original data formats were re-engineered and an application was written to support the migration of the data stored on tapes without using the original system. This eliminates the necessity of keeping an obsolete system alive for enabling access to the data on the storage media meant for this system. Two different methods to interpret the data and eliminate possible errors in the tape were implemented and evaluated on original tapes, which were recorded 20 years ago. Results show that with some error correction methods, parts of the tapes are still readable even without the original system.
It also implies that it is easier to build solutions while original
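
    The extraction step described above, recovering bits from a digitised audio stream, can be sketched generically. The rising-edge detection and the "short pulse = 1" mapping below are assumptions chosen for illustration; the actual Videopac+ G7400 tape encoding had to be re-engineered from the recorded waveforms, as the authors describe.

```python
def zero_crossing_periods(samples):
    # Lengths (in samples) between successive rising zero crossings
    # of a digitised audio waveform.
    periods, last = [], None
    for i in range(1, len(samples)):
        if samples[i - 1] < 0 <= samples[i]:  # rising zero crossing
            if last is not None:
                periods.append(i - last)
            last = i
    return periods

def periods_to_bits(periods, threshold):
    # Pulse-length decoding: periods shorter than the threshold -> 1, longer -> 0.
    # The threshold and the bit mapping are assumptions; a real format must be
    # re-engineered from the waveform itself.
    return [1 if p < threshold else 0 for p in periods]
```

    On a synthetic waveform with two short cycles followed by a long one, the decoder yields the bit pattern 1, 1, 0; real tapes additionally need the error-correction passes the paper evaluates.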

  1. Computer-Aided Corrosion Program Management

    Science.gov (United States)

    MacDowell, Louis

    2010-01-01

    This viewgraph presentation reviews Computer-Aided Corrosion Program Management at John F. Kennedy Space Center. The contents include: 1) Corrosion at the Kennedy Space Center (KSC); 2) Requirements and Objectives; 3) Program Description, Background and History; 4) Approach and Implementation; 5) Challenges; 6) Lessons Learned; 7) Successes and Benefits; and 8) Summary and Conclusions.

  2. IDRC program to focus on digital innovations | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2016-03-17

    Mar 17, 2016 ... The Networked Economies program will support research that helps developing countries use digital innovations to create inclusive economic opportunities and advance democracy. More particularly ...

  3. Teaching Ethical Copyright Behavior: Assessing the Effects of a University-Sponsored Computing Ethics Program

    Science.gov (United States)

    Siemens, Jennifer Christie; Kopp, Steven W.

    2006-01-01

    Universities have become sensitized to the potential for students' illegal downloading of copyrighted materials. Education has been advocated as one way to curb downloading of copyrighted digital content. This study investigates the effectiveness of a university-sponsored computing ethics education program. The program positively influenced…

  4. Developing computer training programs for blood bankers.

    Science.gov (United States)

    Eisenbrey, L

    1992-01-01

    Two surveys were conducted in July 1991 to gather information about computer training currently performed within American Red Cross Blood Services Regions. One survey was completed by computer trainers from software developer-vendors and regional centers. The second survey was directed to the trainees, to determine their perception of the computer training. The surveys identified the major concepts, length of training, evaluations, and methods of instruction used. Strengths and weaknesses of training programs were highlighted by trainee respondents. Using the survey information and other sources, recommendations (including those concerning which computer skills and tasks should be covered) are made that can be used as guidelines for developing comprehensive computer training programs at any blood bank or blood center.

  5. Integration of Digital Dentistry into a Predoctoral Implant Program: Program Description, Rationale, and Utilization Trends.

    Science.gov (United States)

    Afshari, Fatemeh S; Sukotjo, Cortino; Alfaro, Maria F; McCombs, Jeri; Campbell, Stephen D; Knoernschild, Kent L; Yuan, Judy Chia-Chun

    2017-08-01

    A recently revised predoctoral implant curriculum at the University of Illinois at Chicago College of Dentistry integrated digital dentistry into both the preclinical dental implant course and clinical activities. Traditionally, competence in the didactic and clinical parts of predoctoral education in single tooth implant restorations has emphasized the analog impression technique and subsequent mounting of soft tissue working casts. However, computer-aided design/computer-aided manufacturing (CAD/CAM) implant restorations can play a significant role in predoctoral dental education utilizing digital technologies. The goal of the curriculum expansion is to transition from analog to partially digital and, finally, complete digital workflow. The aim of this article is to describe the specific components, implementation, and rationale for the new digitally integrated implant curriculum and present short-term clinical utilization trends.

  6. Page 1 Cathode-Ray Display of Digital Computer Outputs 127 at a ...

    Indian Academy of Sciences (India)

    Cathode-Ray Display of Digital Computer Outputs: at a negative bias of Vb/2 which keeps it cut off, ... characters, the necessary digital information being wired in. Having a mumetal core, each transformer has a ... in the counter and the response time of the Digital to Analog converter; carry delay should thus be minimised,

  7. Three Dimensional Digital Sieving of Asphalt Mixture Based on X-ray Computed Tomography

    Directory of Open Access Journals (Sweden)

    Chichun Hu

    2017-07-01

    Full Text Available In order to perform three-dimensional digital sieving based on X-ray computed tomography images, the definition of digital sieve size (DSS was proposed, which was defined as the minimum length of the minimum bounding squares of all possible orthographic projections of an aggregate. The corresponding program was developed to reconstruct aggregate structure and to obtain DSS. Laboratory experiments consisting of epoxy-filled aggregate specimens were conducted to investigate the difference between mechanical sieve analysis and the digital sieving technique. It was suggested that concave surface of aggregate was the possible reason for the disparity between DSS and mechanical sieve size. A comparison between DSS and equivalent diameter was also performed. Moreover, the digital sieving technique was adopted to evaluate the gradation of stone mastic asphalt mixtures. The results showed that the closest proximity of the laboratory gradation curve was achieved by calibrated DSS, among gradation curves based on calibrated DSS, un-calibrated DSS and equivalent diameter.
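
    The DSS definition above (the minimum, over all orthographic projections, of the side of the projection's bounding square) can be approximated with a brute-force search over sampled projection directions. This is a coarse sketch of the definition only: it uses axis-aligned bounding squares in each projection plane, whereas the published program reconstructs aggregates from CT voxels and searches orientations far more exhaustively.

```python
import math

def bounding_square_side(points2d):
    # Side of the axis-aligned bounding square of a set of 2-D points.
    xs = [p[0] for p in points2d]
    ys = [p[1] for p in points2d]
    return max(max(xs) - min(xs), max(ys) - min(ys))

def digital_sieve_size(points3d, n_angles=24):
    # Approximate DSS: minimise the bounding-square side over a grid of
    # projection directions given by polar angle theta and azimuth phi.
    best = float("inf")
    for i in range(n_angles):
        for j in range(n_angles):
            theta = math.pi * i / n_angles
            phi = 2.0 * math.pi * j / n_angles
            # Orthonormal basis (u, v) of the plane normal to direction (theta, phi).
            u = (-math.sin(phi), math.cos(phi), 0.0)
            v = (math.cos(theta) * math.cos(phi),
                 math.cos(theta) * math.sin(phi),
                 -math.sin(theta))
            proj = [(x * u[0] + y * u[1] + z * u[2],
                     x * v[0] + y * v[1] + z * v[2]) for x, y, z in points3d]
            best = min(best, bounding_square_side(proj))
    return best
```

    For a cube of side 2, every projection has a bounding square of side at least 2, and the face-on projection achieves exactly 2, so the search recovers the mechanical sieve size of that shape.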

  8. What Makes the Digital "Special"? The Research Program in Digital Collections at the National Library of Wales

    Science.gov (United States)

    Cusworth, Andrew; Hughes, Lorna M.; James, Rhian; Roberts, Owain; Roderick, Gareth Lloyd

    2015-01-01

    This article introduces some of the digital projects currently in development at the National Library of Wales as part of its Research Program in Digital Collections. These projects include the digital representation of the Library's Kyffin Williams art collection, musical collections, and probate collection, and of materials collected by the…

  9. The Computational Physics Program of the national MFE Computer Center

    Energy Technology Data Exchange (ETDEWEB)

    Mirin, A.A.

    1989-01-01

    Since June 1974, the MFE Computer Center has been engaged in a significant computational physics effort. The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The Computational Physics Group has been involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to tokamaks and compact toroids. A third area is the investigation of kinetic instabilities using a 3-D particle code; this work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence have been under examination, with the hope of being able to explain anomalous transport. Also, we are collaborating in an international effort to evaluate fully three-dimensional linear stability of toroidal devices. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers. A summary of these programs is included in this paper. 6 tabs.

  10. An introduction to Python and computer programming

    CERN Document Server

    Zhang, Yue

    2015-01-01

    This book introduces the Python programming language and fundamental concepts in algorithms and computing. Its target audience includes students and engineers with little or no background in programming, who need to master a practical programming language and learn the basic thinking in computer science/programming. The main contents come from lecture notes for engineering students from all disciplines, and have received high ratings. The materials and their ordering have been adjusted repeatedly according to classroom reception. Compared to alternative textbooks in the market, this book introduces the underlying Python implementation of number, string, list, tuple, dict, function, class, instance and module objects in a consistent and easy-to-understand way, making assignment, function definition, function call, mutability and binding environments understandable inside-out. By abstracting the implementation mechanisms, this book builds a solid understanding of the Python programming language.
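
    The mutability and binding-environment distinction that the abstract highlights can be shown in a few lines. This is a generic illustration of the concept, not an excerpt from the book:

```python
def append_in_place(seq, item):
    # list.append mutates the list object itself, so every name
    # bound to that object observes the change.
    seq.append(item)
    return seq

def rebind_local(seq, item):
    # '+' builds a new list and rebinds only the local name 'seq';
    # the caller's binding is left untouched.
    seq = seq + [item]
    return seq
```

    Calling `append_in_place(a, 3)` changes the caller's list `a`, while `rebind_local(c, 3)` returns a new list and leaves `c` unchanged, which is exactly the assignment-versus-mutation distinction described above.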

  11. A digital patient for computer-aided prosthesis design.

    Science.gov (United States)

    Colombo, Giorgio; Facoetti, Giancarlo; Rizzi, Caterina

    2013-04-06

    This article concerns the design of lower limb prostheses, both below- and above-knee. It describes a new computer-based design framework and a digital model of the patient around which the prosthesis is designed and tested in a completely virtual environment. The virtual model of the patient is the backbone of the whole system, and it is based on a biomechanical general-purpose model customized with the patient's characteristics (e.g. anthropometric measures). The software platform adopts computer-aided and knowledge-guided approaches with the goal of replacing the current development process, mainly hand made, with a virtual one. It provides the prosthetist with a set of tools to design, configure and test the prosthesis and comprehends two main environments: the prosthesis modelling laboratory and the virtual testing laboratory. The first permits the three-dimensional model of the prosthesis to be configured and generated, while the second allows the prosthetist to virtually set up the artificial leg and simulate the patient's postures and movements, validating its functionality and configuration. General architecture and modelling/simulation tools for the platform are described, as well as the main aspects and results of the experimentation.

  12. Exploration of operator method digital optical computers for application to NASA

    Science.gov (United States)

    1990-01-01

    Digital optical computer design has been focused primarily on parallel (single point-to-point interconnection) implementation. This architecture is compared to currently developing VHSIC systems. Using demonstrated multichannel acousto-optic devices, a figure of merit can be formulated. The focus is on a figure of merit termed the Gate Interconnect Bandwidth Product (GIBP). Conventional parallel optical digital computer architecture demonstrates only marginal competitiveness at best when compared to projected semiconductor implementations. Global, analog global, quasi-digital, and full digital interconnects are briefly examined as alternatives to parallel digital computer architecture. Digital optical computing is becoming a very tough competitor to semiconductor technology, since it can support a very high degree of three-dimensional interconnect density and high degrees of Fan-In without capacitive loading effects at very low power consumption levels.

  13. 78 FR 47015 - Software Requirement Specifications for Digital Computer Software Used in Safety Systems of...

    Science.gov (United States)

    2013-08-02

    ... COMMISSION Software Requirement Specifications for Digital Computer Software Used in Safety Systems of... 1 of RG 1.172, ``Software Requirement Specifications for Digital Computer Software used in Safety... (IEEE) Standard (Std.) 830-1998, ``IEEE Recommended Practice for Software Requirements Specifications...

  14. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Science.gov (United States)

    2012-08-22

    ... COMMISSION Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in... issuing for public comment draft regulatory guide (DG), DG-1209, ``Software Requirement Specifications for Digital Computer Software and Complex Electronics used in Safety Systems of Nuclear Power Plants.'' The DG...

  15. Digital Diagnosis and Treatment Program for Maxillofacial Fractures: A Retrospective Analysis of 626 Cases.

    Science.gov (United States)

    Zeng, Wei; Lian, Xiaotian; Chen, Gang; Ju, Rui; Tian, Weidong; Tang, Wei

    2017-12-09

    The purpose of this study was to evaluate the accuracy of the digital diagnosis and treatment program for maxillofacial fractures. The data of 626 patients with maxillofacial fractures were analyzed retrospectively from January 2010 to August 2016. These patients were divided into 2 groups. In the experimental group, preoperative planning was conducted and transferred to patients with guiding templates and navigation according to the digital diagnosis and treatment program for maxillofacial fractures. In the control group, postsurgical planning was performed instead of preoperative planning. To assess the accuracy of the digital diagnosis and treatment program for maxillofacial fractures, preoperative planning and postoperative computed tomographic models were superimposed and imported to dedicated software (Geomagic Studio 13.0, Geomagic, Inc, Research Triangle Park, NC) to calculate the difference between the 2 models in the 2 groups. Results of the experimental group showed that the mean error between the preoperative planning model and the postoperative model ranged from 0.65 to 0.97 mm (average, 0.89 mm). For the control group, the mean error was 0.78 to 1.45 mm (average, 1.01 mm). Thus, the mean error of the experimental group was statistically lower than that of the control group (P < .05), indicating that the digital diagnosis and treatment program for maxillofacial fractures was more accurate. Aided by the digital diagnosis and treatment program, the accuracy for maxillofacial fractures was notably improved. To facilitate the application and promotion of digital technology, further modification of the complete digital diagnosis and treatment pathway for maxillofacial fractures is highly desired. Copyright © 2017. Published by Elsevier Inc.
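
    The error metric reported above, the deviation between superimposed preoperative planning and postoperative CT models, can be approximated for registered point-cloud exports by a symmetric mean nearest-neighbour distance. This brute-force sketch merely stands in for Geomagic Studio's deviation analysis and is not the software's actual algorithm:

```python
import math

def _mean_nn(a, b):
    # Mean, over points in a, of the distance to the nearest point in b.
    return sum(min(math.dist(p, q) for q in b) for p in a) / len(a)

def mean_surface_error(model_a, model_b):
    # Symmetric mean nearest-neighbour distance between two already-registered
    # 3-D point clouds (e.g. vertices sampled from the two superimposed models).
    return 0.5 * (_mean_nn(model_a, model_b) + _mean_nn(model_b, model_a))
```

    Two identical clouds give an error of zero, and two planes 1 mm apart give 1 mm, matching the millimetre-scale mean errors the study reports.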

  16. An interactive program for digitization of seabed photographs

    Digital Repository Service at National Institute of Oceanography (India)

    Ramprasad, T.; Sharma, R.

    A program for digitization of seabed photographs to compute coverage and abundance of polymetallic nodules is developed. Since the objects in the seabed photograph are partially covered by a thin sediment layer, the automatic scanning devices may...

  17. Digital computer control of servomotor angular position | Mullisa ...

    African Journals Online (AJOL)

    The paper discusses the design and simulation methodology of digital control systems for the benefit of the interested practicing engineer. A lead-type digital controller for a 2nd order system and a lead-lag type digital controller for a 3rd order system are designed. The simulations show that the design methods are ...

  18. Computer program compatible with a laser nephelometer

    Science.gov (United States)

    Paroskie, R. M.; Blau, H. H., Jr.; Blinn, J. C., III

    1975-01-01

    The laser nephelometer data system was updated to provide magnetic tape recording of data, and real time or near real time processing of data to provide particle size distribution and liquid water content. Digital circuits were provided to interface the laser nephelometer to a Data General Nova 1200 minicomputer. Communications are via a teletypewriter. A dual Linc Magnetic Tape System is used for program storage and data recording. Operational programs utilize the Data General Real-Time Operating System (RTOS) and the ERT AIRMAP Real-Time Operating System (ARTS). The programs provide for acquiring data from the laser nephelometer, acquiring data from auxiliary sources, keeping time, performing real time calculations, recording data and communicating with the teletypewriter.

  19. Role of logic programming in computer studies

    Directory of Open Access Journals (Sweden)

    Nicolae PELIN

    2016-09-01

    Full Text Available The paper analyses the opinions of a number of scholars and specialists on the importance and role of logic programming in the methodology of studying computer science, on the philosophy behind logic programs and interpreters, and on the burden that is lifted from the programmer when a logic interpreter is available. The presented material is meant, according to the author, to help the reader more easily understand this many-sided problem.

  20. Program computes turbine steam rates and properties

    Energy Technology Data Exchange (ETDEWEB)

    Ganapathy, V. (ABCO Industries, Inc., Abilene, TX (US))

    1988-11-01

    BASIC computer program quickly evaluates steam properties and rates during expansion in a steam turbine. Engineers involved in cogeneration projects and power plant studies often need to calculate the steam properties during expansion in a steam turbine to evaluate the theoretical and actual steam rates and hence, the electrical power output. With the help of this program written in BASIC, one can quickly evaluate all the pertinent data. Correlations used for steam property evaluation are also presented.
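
    The core steam-rate calculation such a program performs can be sketched from the standard definitions (3412.14 Btu per kWh; in practice the enthalpies come from steam-table correlations, which are not reproduced here). The function names and the generator-loss simplification are illustrative assumptions, not Ganapathy's BASIC listing:

```python
def theoretical_steam_rate(h_inlet, h_exit_isentropic):
    # Theoretical steam rate in lb/kWh from the isentropic enthalpy drop
    # (enthalpies in Btu/lb); 3412.14 Btu = 1 kWh.
    return 3412.14 / (h_inlet - h_exit_isentropic)

def actual_steam_rate(h_inlet, h_exit_isentropic, turbine_efficiency):
    # Actual steam rate for a given isentropic turbine efficiency.
    return theoretical_steam_rate(h_inlet, h_exit_isentropic) / turbine_efficiency

def power_output_kw(steam_flow_lb_per_h, h_inlet, h_exit_isentropic, turbine_efficiency):
    # Electrical output implied by a steam flow, ignoring generator losses.
    return steam_flow_lb_per_h / actual_steam_rate(
        h_inlet, h_exit_isentropic, turbine_efficiency)
```

    For example, a 300 Btu/lb isentropic drop gives a theoretical steam rate of about 11.4 lb/kWh; dividing the steam flow by the actual steam rate then yields the power output.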

  1. A CAD (Classroom Assessment Design) of a Computer Programming Course

    Science.gov (United States)

    Hawi, Nazir S.

    2012-01-01

    This paper presents a CAD (classroom assessment design) of an entry-level undergraduate computer programming course "Computer Programming I". CAD has been the product of a long experience in teaching computer programming courses including teaching "Computer Programming I" 22 times. Each semester, CAD is evaluated and modified…

  2. Nanoelectromechanical Switches for Low-Power Digital Computing

    Directory of Open Access Journals (Sweden)

    Alexis Peschot

    2015-08-01

    Full Text Available The need for more energy-efficient solid-state switches beyond complementary metal-oxide-semiconductor (CMOS transistors has become a major concern as the power consumption of electronic integrated circuits (ICs steadily increases with technology scaling. Nano-Electro-Mechanical (NEM relays control current flow by nanometer-scale motion to make or break physical contact between electrodes, and offer advantages over transistors for low-power digital logic applications: virtually zero leakage current for negligible static power consumption; the ability to operate with very small voltage signals for low dynamic power consumption; and robustness against harsh environments such as extreme temperatures. Therefore, NEM logic switches (relays have been investigated by several research groups during the past decade. Circuit simulations calibrated to experimental data indicate that scaled relay technology can overcome the energy-efficiency limit of CMOS technology. This paper reviews recent progress toward this goal, providing an overview of the different relay designs and experimental results achieved by various research groups, as well as of relay-based IC design principles. Remaining challenges for realizing the promise of nano-mechanical computing, and ongoing efforts to address these, are discussed.

  3. Introduction to programming multiple-processor computers

    Energy Technology Data Exchange (ETDEWEB)

    Hicks, H.R.; Lynch, V.E.

    1985-04-01

    FORTRAN applications programs can be executed on multiprocessor computers in either a unitasking (traditional) or multitasking form. The latter allows a single job to use more than one processor simultaneously, with a consequent reduction in wall-clock time and, perhaps, the cost of the calculation. An introduction to programming in this environment is presented. The concepts of synchronization and data sharing using EVENTS and LOCKS are illustrated with examples. The strategy of strong synchronization and the use of synchronization templates are proposed. We emphasize that incorrect multitasking programs can produce irreproducible results, which makes debugging more difficult.
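
    The LOCK and EVENT synchronization concepts described above can be illustrated with Python's threading primitives. This is a conceptual analogue only, not the FORTRAN multitasking interface the report uses:

```python
import threading

counter = 0
counter_lock = threading.Lock()   # LOCK: serializes access to the shared counter
done = threading.Event()          # EVENT: signals that all workers have finished

def worker(n_iters):
    # Without the lock, concurrent read-modify-write updates could interleave
    # and produce irreproducible results, as the abstract warns.
    global counter
    for _ in range(n_iters):
        with counter_lock:
            counter += 1

def run_workers(n_threads=4, n_iters=1000):
    # Strong synchronization: start all tasks, then join them all before
    # reading the shared result and posting the EVENT.
    global counter
    counter = 0
    threads = [threading.Thread(target=worker, args=(n_iters,))
               for _ in range(n_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    done.set()
    return counter
```

    With the lock in place every run yields the same deterministic total, which is precisely the reproducibility property that incorrect multitasking programs lose.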

  4. Computer Assisted Programmed Instruction and Cognitive ...

    African Journals Online (AJOL)

    The achievement of students in the application learning mode was also significantly higher than that of students in the recall and principle modes, respectively. There was no significant interaction effect between Cognitive Preference Style and Computer Assisted Programmed Instruction. The implications of the results for the stakeholders were ...

  5. Computer program package for PIXE spectra evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Kajfosz, J. [Institute of Nuclear Physics, Cracow (Poland)

    1992-12-31

    The computer programs described here were developed for calculating the concentrations of elements in samples analysed by the PIXE (Proton Induced X-ray Emission) method from the X-ray spectra obtained in those analyses. (author). 10 refs, 2 figs.

  6. Computer programming students head to Tokyo

    OpenAIRE

    Crumbley, Liz

    2007-01-01

    "The Milk's Gone Bad," a team of three undergraduate students from the Virginia Tech College of Engineering, will compete in the World Finals of the Association of Computing Machinery International Collegiate Programming Contest (ACM-ICPC) March 12-16 in Tokyo, Japan.

  7. Computer Program Re-layers Engineering Drawings

    Science.gov (United States)

    Crosby, Dewey C., III

    1990-01-01

    RULCHK computer program aids in structuring layers of information pertaining to part or assembly designed with software described in article "Software for Drawing Design Details Concurrently" (MFS-28444). Checks and optionally updates structure of layers for part. Enables designer to construct model and annotate its documentation without burden of manually layering part to conform to standards at design time.

  8. Data systems and computer science programs: Overview

    Science.gov (United States)

    Smith, Paul H.; Hunter, Paul

    1991-01-01

    An external review of the Integrated Technology Plan for the Civil Space Program is presented. The topics are presented in viewgraph form and include the following: onboard memory and storage technology; advanced flight computers; special purpose flight processors; onboard networking and testbeds; information archive, access, and retrieval; visualization; neural networks; software engineering; and flight control and operations.

  9. Quantitative evaluation of anatomical noise in chest digital tomosynthesis, digital radiography, and computed tomography

    Science.gov (United States)

    Lee, D.; Choi, S.; Lee, H.; Kim, D.; Choi, S.; Kim, H.-J.

    2017-04-01

    Lung cancer is currently the worldwide leading cause of death from cancer. Thus, detection of lung cancer at its early stages is critical for improving the survival rate of patients. Chest digital tomosynthesis (CDT) is a recently developed imaging modality, combining many advantages of digital radiography (DR) and computed tomography (CT). This method has the potential to be widely used in the clinical setting. In this study, we introduce a developed CDT R/F system and compare its image quality with those of DR and CT, especially with respect to anatomical noise and lung nodule conspicuity, for LUNGMAN phantoms. The developed CDT R/F system consists of a CsI scintillator flat panel detector, an X-ray tube, and tomosynthesis data acquisition geometry. For CDT R/F imaging, 41 projections were acquired at different angles, over the ± 20° angular range, in a linear translation geometry. To evaluate the clinical effectiveness of the CDT R/F system, the acquired images were compared with CT (Philips Brilliance CT 64, Philips Healthcare, U.S.) and DR (ADR-M, LISTEM, Korea) phantom images in terms of the anatomical noise power spectrum (aNPS). DR images exhibited low conspicuity for small lung nodules, while CDT R/F and CT exhibited relatively high sensitivity for all lung nodule sizes. The aNPS of the CDT R/F system was better than that of DR because it mitigates the overlap of anatomical structures. In conclusion, the developed CDT R/F system is likely to contribute to early diagnosis of lung cancer, while requiring a relatively low patient dose, compared with CT.
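
    The anatomical noise power spectrum the study uses can be estimated as the squared Fourier magnitude of mean-subtracted image regions, averaged over ROIs. Normalization conventions vary between studies; this sketch is a generic illustration of the quantity, not the paper's exact protocol, and the function name is hypothetical:

```python
import numpy as np

def nps(rois, pixel_pitch):
    """2-D noise power spectrum from equal-size 2-D ROIs: the squared FFT
    magnitude of each mean-subtracted ROI, averaged and normalized."""
    rois = [r - r.mean() for r in rois]           # remove the DC (mean) term
    ny, nx = rois[0].shape
    acc = sum(np.abs(np.fft.fft2(r)) ** 2 for r in rois)
    return acc * pixel_pitch ** 2 / (len(rois) * nx * ny)
```

    A handy sanity check: for white noise of variance σ², the spectrum is flat with mean σ²·(pixel pitch)².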

  10. Comparison of computer-aided detection of clustered microcalcifications in digital mammography and digital breast tomosynthesis

    Science.gov (United States)

    Samala, Ravi K.; Chan, Heang-Ping; Lu, Yao; Hadjiiski, Lubomir; Wei, Jun; Helvie, Mark

    2015-03-01

    Digital breast tomosynthesis (DBT) has the potential to replace digital mammography (DM) for breast cancer screening. An effective computer-aided detection (CAD) system for microcalcification clusters (MCs) on DBT will facilitate the transition. In this study, we collected a data set with corresponding DBT and DM for the same breasts. DBT was acquired with IRB approval and informed consent using a GE GEN2 DBT prototype system. The DM acquired with a GE Essential system for the patient's clinical care was collected retrospectively from patient files. DM-based CAD (CAD-DM) and DBT-based CAD (CAD-DBT) were previously developed by our group. The major differences between the CAD systems include: (a) CAD-DBT uses two parallel processes whereas CAD-DM uses a single process for enhancing MCs and removing the structured background, (b) CAD-DBT has additional processing steps to reduce the false positives (FPs), including ranking of candidates of cluster seeds and cluster members and the use of adaptive CNR and size thresholds at clustering and FP reduction, (c) CAD-DM uses a convolutional neural network (CNN) and linear discriminant analysis (LDA) to differentiate true microcalcifications from FPs based on their morphological and CNN features. The performance difference is assessed by FROC analysis using a test set (100 views with MCs and 74 views without MCs) independent of their respective training sets. At sensitivities of 70% and 80%, CAD-DBT achieved FP rates of 0.78 and 1.57 per view compared to 0.66 and 2.10 per image for CAD-DM. JAFROC showed no significant difference between MC detection on DM and DBT by the two CAD systems.

  11. Defining a Standard for Reporting Digital Evidence Items in Computer Forensic Tools

    Science.gov (United States)

    Bariki, Hamda; Hashmi, Mariam; Baggili, Ibrahim

    Due to the lack of standards in reporting digital evidence items, investigators are facing difficulties in efficiently presenting their findings. This paper proposes a standard for digital evidence to be used in reports that are generated using computer forensic software tools. The authors focused on developing a standard for digital evidence items by surveying various digital forensic tools while keeping in mind the legal integrity of digital evidence items. Additionally, an online questionnaire was used to gain the opinion of knowledgeable and experienced stakeholders in the digital forensics domain. Based on the findings, the authors propose a standard for digital evidence items that includes data about the case, the evidence source, the evidence item, and the chain of custody. The research results enabled the authors to create a defined XML schema for digital evidence items.
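
    The four parts of the proposed standard (case data, evidence source, evidence item, chain of custody) can be illustrated as a single XML record. The element and attribute names below are illustrative assumptions, not the schema defined in the paper:

```python
import xml.etree.ElementTree as ET

# Build one hypothetical evidence-item record with the four parts the
# proposed standard covers. All names and values are placeholders.

record = ET.Element("evidence_item")
ET.SubElement(record, "case", id="2011-042", investigator="Examiner One")
ET.SubElement(record, "source", device="suspect laptop HDD", acquired="2011-03-01")
ET.SubElement(record, "item", name="mail.pst", sha1="(hash value)")
custody = ET.SubElement(record, "chain_of_custody")
ET.SubElement(custody, "transfer", custodian="examiner-1",
              timestamp="2011-03-01T09:00:00Z")
print(ET.tostring(record, encoding="unicode"))
```

    In practice such a record would be validated against the published XML schema before inclusion in a forensic report.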

  12. Employing subgoals in computer programming education

    Science.gov (United States)

    Margulieux, Lauren E.; Catrambone, Richard; Guzdial, Mark

    2016-01-01

    The rapid integration of technology into our professional and personal lives has left many education systems ill-equipped to deal with the influx of people seeking computing education. To improve computing education, we are applying techniques that have been developed for other procedural fields. The present study applied such a technique, subgoal labeled worked examples, to explore whether it would improve programming instruction. The first two experiments, conducted in a laboratory, suggest that the intervention improves undergraduate learners' problem-solving performance and affects how learners approach problem-solving. The third experiment demonstrates that the intervention has similar, and perhaps stronger, effects in an online learning environment with in-service K-12 teachers who want to become qualified to teach computing courses. By implementing this subgoal intervention as a tool for educators to teach themselves and their students, education systems could improve computing education and better prepare learners for an increasingly technical world.

  13. Computer Presentation Programs and Teaching Research Methodologies

    Directory of Open Access Journals (Sweden)

    Vahid Motamedi

    2015-05-01

    Full Text Available Supplementing traditional chalk and board instruction with computer delivery has been viewed positively by students who have reported increased understanding and more interaction with the instructor when computer presentations are used in the classroom. Some problems that contribute to student errors while taking class notes, such as mistranscribed numbers on the board and illegible instructor handwriting, can be resolved by careful construction of computer presentations. The use of computer presentation programs promises to increase the effectiveness of learning by making content more readily available, by reducing the cost and effort of producing quality content, and by allowing content to be more easily shared. This paper describes how these problems can be overcome by using presentation packages for instruction.

  14. New Mexico district work-effort analysis computer program

    Science.gov (United States)

    Hiss, W.L.; Trantolo, A.P.; Sparks, J.L.

    1972-01-01

    The computer program (CAN 2) described in this report is one of several related programs used in the New Mexico District cost-analysis system. The work-effort information used in these programs is accumulated and entered to the nearest hour on forms completed by each employee. Tabulating cards are punched directly from these forms after visual examinations for errors are made. Reports containing detailed work-effort data itemized by employee within each project and account and by account and project for each employee are prepared for both current-month and year-to-date periods by the CAN 2 computer program. An option allowing preparation of reports for a specified 3-month period is provided. The total number of hours worked on each account and project and a grand total of hours worked in the New Mexico District is computed and presented in a summary report for each period. Work effort not chargeable directly to individual projects or accounts is considered as overhead and can be apportioned to the individual accounts and projects on the basis of the ratio of the total hours of work effort for the individual accounts or projects to the total New Mexico District work effort at the option of the user. The hours of work performed by a particular section, such as General Investigations or Surface Water, are prorated and charged to the projects or accounts within the particular section. A number of surveillance or buffer accounts are employed to account for the hours worked on special events or on those parts of large projects or accounts that require a more detailed analysis. Any part of the New Mexico District operation can be separated and analyzed in detail by establishing an appropriate buffer account. With the exception of statements associated with word size, the computer program is written in FORTRAN IV in a relatively low and standard language level to facilitate its use on different digital computers. The program has been run only on a Control Data Corporation
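
    The ratio-based apportionment of overhead described above is simple arithmetic: each project receives overhead in proportion to its share of total direct hours. A minimal sketch (a hypothetical re-implementation, not the original CAN 2 FORTRAN):

```python
def prorate_overhead(direct_hours, overhead_hours):
    """Apportion overhead hours to projects in proportion to each
    project's share of total direct hours."""
    total = sum(direct_hours.values())
    return {proj: hours + overhead_hours * hours / total
            for proj, hours in direct_hours.items()}

alloc = prorate_overhead({"proj_a": 120.0, "proj_b": 80.0}, 50.0)
print(alloc)   # proj_a: 120 + 50*120/200 = 150.0; proj_b: 80 + 50*80/200 = 100.0
```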

  15. Breast cancer screening results 5 years after introduction of digital mammography in a population-based screening program.

    NARCIS (Netherlands)

    Karssemeijer, N.; Bluekens, A.M.; Beijerinck, D.; Deurenberg, J.J.; Beekman, M.; Visser, R.; Engen, R. van; Bartels-Kortland, A.; Broeders, M.J.M.

    2009-01-01

    PURPOSE: To compare full-field digital mammography (FFDM) using computer-aided diagnosis (CAD) with screen-film mammography (SFM) in a population-based breast cancer screening program for initial and subsequent screening examinations. MATERIALS AND METHODS: The study was approved by the regional

  16. Foresters' Metric Conversions program (version 1.0). [Computer program

    Science.gov (United States)

    Jefferson A. Palmer

    1999-01-01

    The conversion of scientific measurements has become commonplace in the fields of engineering, research, and forestry. Foresters' Metric Conversions is a Windows-based computer program that quickly converts user-defined measurements from English to metric and from metric to English. Foresters' Metric Conversions was derived from the publication "Metric...

  17. OK computer? Digital community archaeologies in practice (Internet Archaeology 40)

    Directory of Open Access Journals (Sweden)

    Seren Griffiths

    2015-12-01

    Full Text Available The articles in this section of Internet Archaeology came out of a Theoretical Archaeology Group session at Manchester University in 2014. The session was motivated to explore issues associated with 'digital public archaeology' (DPA). The articles presented here deal with a number of themes which arise when doing digital public archaeology.

  18. Construction of a Digital Learning Environment Based on Cloud Computing

    Science.gov (United States)

    Ding, Jihong; Xiong, Caiping; Liu, Huazhong

    2015-01-01

    Constructing the digital learning environment for ubiquitous learning and asynchronous distributed learning has opened up immense amounts of concrete research. However, current digital learning environments do not fully fulfill the expectations on supporting interactive group learning, shared understanding and social construction of knowledge.…

  19. Breast Cancer: Computer-aided Detection with Digital Breast Tomosynthesis.

    Science.gov (United States)

    Morra, Lia; Sacchetto, Daniela; Durando, Manuela; Agliozzo, Silvano; Carbonaro, Luca Alessandro; Delsanto, Silvia; Pesce, Barbara; Persano, Diego; Mariscotti, Giovanna; Marra, Vincenzo; Fonio, Paolo; Bert, Alberto

    2015-10-01

    To evaluate a commercial tomosynthesis computer-aided detection (CAD) system in an independent, multicenter dataset. Diagnostic and screening tomosynthesis mammographic examinations (n = 175; craniocaudal and mediolateral oblique) were randomly selected from a previous institutional review board-approved trial. All subjects gave informed consent. Examinations were performed in three centers and included 123 patients, with 132 biopsy-proven screening-detected cancers, and 52 examinations with negative results at 1-year follow-up. One hundred eleven lesions were masses and/or microcalcifications (72 masses, 22 microcalcifications, 17 masses with microcalcifications) and 21 were architectural distortions. Lesions were annotated by radiologists who were aware of all available reports. CAD performance was assessed as per-lesion sensitivity and false-positive results per volume in patients with negative results. The CAD system showed a per-lesion sensitivity of 89% (99 of 111; 95% confidence interval: 81%, 94%) with 2.7 ± 1.8 false positives per view; it detected 62 of 72 masses, 20 of 22 microcalcification clusters, and 17 of 17 masses with microcalcifications. Overall, 37 of 39 microcalcification clusters (95% sensitivity, 95% confidence interval: 81%, 99%) and 79 of 89 masses (89% sensitivity, 95% confidence interval: 80%, 94%) were detected with the CAD system. On average, 0.5 false positives per view were microcalcification clusters, 2.1 were masses, and 0.1 were masses with microcalcifications. A digital breast tomosynthesis CAD system can allow detection of a large percentage (89%, 99 of 111) of breast cancers manifesting as masses and microcalcification clusters, with an acceptable false-positive rate (2.7 per breast view). Further studies with larger datasets acquired with equipment from multiple vendors are needed to replicate the findings and to study the interaction of radiologists and CAD systems. (©) RSNA, 2015.

  20. GAP: A computer program for gene assembly

    Energy Technology Data Exchange (ETDEWEB)

    Einstein, J.R.; Uberbacher, E.C.; Guan, X.; Mural, R.J.; Mann, R.C.

    1991-09-01

    A computer program, GAP (Gene Assembly Program), has been written to assemble and score hypothetical genes, given a DNA sequence containing the gene, and the outputs of several other programs which analyze the sequence. These programs include the coding-recognition and splice-junction-recognition modules developed in this laboratory. GAP is a prototype of a planned system in which it will be integrated with an expert system and rule base. Initial tests of GAP have been carried out with four sequences, the exons of which have been determined by biochemical methods. The highest-scoring hypothetical genes for each of the four sequences had percent correct splice junctions ranging from 50 to 100% (average 81%) and percent correct bases ranging from 92 to 100% (average 96%). 9 refs., 1 tab.

  1. Linear programming phase unwrapping for dual-wavelength digital holography.

    Science.gov (United States)

    Wang, Zhaomin; Jiao, Jiannan; Qu, Weijuan; Yang, Fang; Li, Hongru; Tian, Ailing; Asundi, Anand

    2017-01-20

    A linear programming phase unwrapping method in dual-wavelength digital holography is proposed and verified experimentally. The proposed method uses the square of height difference as a convergence standard and theoretically gives the boundary condition in a searching process. A simulation was performed by unwrapping step structures at different levels of Gaussian noise. As a result, our method is capable of recovering the discontinuities accurately. It is robust and straightforward. In the experiment, a microelectromechanical systems sample and a cylindrical lens were measured separately. The testing results were in good agreement with true values. Moreover, the proposed method is applicable not only in digital holography but also in other dual-wavelength interferometric techniques.
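
    The paper's linear-programming refinement is not detailed in the abstract, but the dual-wavelength principle it builds on is standard: two wavelengths yield a synthetic (beat) wavelength that extends the unambiguous height range. A minimal sketch of that standard step, with assumed wavelengths (not values from the paper):

```python
import numpy as np

l1, l2 = 532e-9, 633e-9                    # assumed wavelengths (m)
beat = l1 * l2 / abs(l1 - l2)              # synthetic wavelength, about 3.33 um

def height_from_phases(phi1, phi2):
    """Height from the wrapped phases at the two wavelengths
    (reflection geometry: h = delta_phi * beat / (4*pi))."""
    dphi = np.mod(phi1 - phi2, 2 * np.pi)  # difference phase, wrapped to [0, 2*pi)
    return dphi * beat / (4 * np.pi)

# A 1 um step exceeds either single wavelength's unambiguous range
# but is recovered from the synthetic wavelength without unwrapping.
h_true = 1.0e-6
phi1 = np.mod(4 * np.pi * h_true / l1, 2 * np.pi)
phi2 = np.mod(4 * np.pi * h_true / l2, 2 * np.pi)
print(height_from_phases(phi1, phi2))      # close to 1e-6 m
```

    The proposed method then bounds and refines this estimate via linear programming, using the square of the height difference as its convergence criterion.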

  2. The computational physics program of the National MFE Computer Center

    Energy Technology Data Exchange (ETDEWEB)

    Mirin, A.A.

    1988-01-01

    The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The computational physics group is involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to compact toroids. A third major area is the investigation of kinetic instabilities using a 3-D particle code. This work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence are being examined. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers.

  3. The Digital Fingerprinting Analysis Concerning Google Calendar under Ubiquitous Mobile Computing Era

    Directory of Open Access Journals (Sweden)

    Hai-Cheng Chu

    2015-04-01

    Full Text Available Internet Communication Technologies (ICTs) are making progress day by day, driven by the relentless need to utilize them for everything from leisure to business. This inevitable trend has dramatically changed contemporary digital behavior in all aspects. Undoubtedly, digital fingerprints will at some point be unwarily left at crime scenes, creating digital information security incidents. On the other hand, private-sector corporations and governments are at risk of leaks of confidential digital information. Some digital fingerprints are volatile by nature. Alternatively, once the power of computing devices is no longer sustained, these digital traces could disappear forever. Due to the pervasive usage of Google Calendar and the Safari browser among network communities, digital fingerprints can be disclosed if forensics is carried out in a sound manner, and could be admitted in a court of law as probative evidence concerning certain cybercrime incidents.

  4. COMPUTER PROGRAMMING AND ROBOTICS IN BASIC EDUCATION

    Directory of Open Access Journals (Sweden)

    José Manuel Cabrera Delgado

    2015-12-01

    Full Text Available This article aims to give an overview of the process of including computer programming and robotics in the basic-education curriculum of several European countries, including Spain. For this purpose, the cases of Estonia and France, two European Union countries that can be considered pioneers in implementing such teaching, are briefly analyzed. In relation to Spain, some of the initiatives currently implemented by the Autonomous Communities are also analyzed.

  5. Scientific Computing in the CH Programming Language

    Directory of Open Access Journals (Sweden)

    Harry H. Cheng

    1993-01-01

    Full Text Available We have developed a general-purpose block-structured interpretive programming language. The syntax and semantics of this language, called CH, are similar to C. CH retains most features of C from the scientific computing point of view. In this paper, the extension of C to CH for numerical computation of real numbers will be described. Metanumbers of −0.0, 0.0, Inf, −Inf, and NaN are introduced in CH. Through these metanumbers, the power of the IEEE 754 arithmetic standard is easily available to the programmer. These metanumbers are extended to commonly used mathematical functions in the spirit of the IEEE 754 standard and ANSI C. Rules for manipulating these metanumbers in I/O; in arithmetic, relational, and logic operations; and in built-in polymorphic mathematical functions are defined. The capabilities of bitwise, assignment, address and indirection, increment and decrement, as well as type conversion operations in ANSI C are extended in CH. In this paper, mainly the new linguistic features of CH in comparison to C will be described. Example programs written in CH with metanumbers and polymorphic mathematical functions will demonstrate the capabilities of CH in scientific computing.
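
    The metanumber behaviors described above come from the IEEE 754 standard itself, so they can be illustrated in any language with standard-conforming floats. A cross-language sketch in Python (not CH):

```python
import math

# IEEE 754 special values: Inf, NaN, and signed zero.
inf, nan, nzero = float("inf"), float("nan"), -0.0

print(1.0 / inf)                 # 0.0: a finite number divided by Inf is zero
print(inf + 1.0)                 # inf: Inf absorbs finite addends
print(nan == nan)                # False: NaN compares unequal even to itself
print(nzero == 0.0)              # True: -0.0 compares equal to +0.0 ...
print(math.copysign(1.0, nzero)) # -1.0: ... yet its sign bit is preserved
print(math.atan2(0.0, -1.0))     # 3.141592653589793: functions honor signed zero
print(math.atan2(-0.0, -1.0))    # -3.141592653589793
```

    CH's contribution is making these values first-class in I/O, operators, and its polymorphic mathematical functions, rather than leaving them as implementation accidents.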

  6. Integration of digital dental casts in cone-beam computed tomography scans

    NARCIS (Netherlands)

    Rangel, F.A.; Maal, T.J.J.; Berge, S.J.; Kuijpers-Jagtman, A.M.

    2012-01-01

    Cone-beam computed tomography (CBCT) is widely used in maxillofacial surgery. The CBCT image of the dental arches, however, is of insufficient quality to use in digital planning of orthognathic surgery. Several authors have described methods to integrate digital dental casts into CBCT scans, but all

  7. News from the Library: A one-stop-shop for computing literature: ACM Digital Library

    CERN Multimedia

    CERN Library

    2011-01-01

    The Association for Computing Machinery, ACM, is the world’s largest educational and scientific computing society. Among others, the ACM provides the computing field's premier Digital Library and serves its members and the computing profession with leading-edge publications, conferences, and career resources.   ACM Digital Library is available to the CERN community. The most popular journal here at CERN is Communications of the ACM. However, the collection offers access to a series of other valuable important academic journals such as Journal of the ACM and even fulltext of a series of classical books. In addition, users have access to the ACM Guide to Computing Literature, the most comprehensive bibliographic database focusing on computing, integrated with ACM’s full-text articles and including features such as ACM Author Profile Pages - which provides bibliographic and bibliometric data for over 1,000,000 authors in the field. ACM Digital Library is an excellent com...

  8. A generalized approach to computer synthesis of digital holograms

    Science.gov (United States)

    Hopper, W. A.

    1973-01-01

    A hologram is constructed by taking a number of digitized sample points and blending them together to form a "continuous" picture. The new system selects a better set of sample points, resulting in an improved hologram from the same amount of information.

  9. NASA High Performance Computing and Communications program

    Science.gov (United States)

    Holcomb, Lee; Smith, Paul; Hunter, Paul

    1994-01-01

    The National Aeronautics and Space Administration's HPCC program is part of a new Presidential initiative aimed at producing a 1000-fold increase in supercomputing speed and a 100-fold improvement in available communications capability by 1997. As more advanced technologies are developed under the HPCC program, they will be used to solve NASA's 'Grand Challenge' problems, which include improving the design and simulation of advanced aerospace vehicles, allowing people at remote locations to communicate more effectively and share information, increasing scientists' abilities to model the Earth's climate and forecast global environmental trends, and improving the development of advanced spacecraft. NASA's HPCC program is organized into three projects which are unique to the agency's mission: the Computational Aerosciences (CAS) project, the Earth and Space Sciences (ESS) project, and the Remote Exploration and Experimentation (REE) project. An additional project, the Basic Research and Human Resources (BRHR) project, exists to promote long term research in computer science and engineering and to increase the pool of trained personnel in a variety of scientific disciplines. This document presents an overview of the objectives and organization of these projects, as well as summaries of early accomplishments and the significance, status, and plans for individual research and development programs within each project. Areas of emphasis include benchmarking, testbeds, software and simulation methods.

  10. Reducing the Digital Divide among Children Who Received Desktop or Hybrid Computers for the Home

    Directory of Open Access Journals (Sweden)

    Gila Cohen Zilka

    2016-06-01

    Full Text Available Researchers and policy makers have been exploring ways to reduce the digital divide. Parameters commonly used to examine the digital divide worldwide, as well as in this study, are: (a) the digital divide in the accessibility and mobility of the ICT infrastructure and of the content infrastructure (e.g., sites used in school); and (b) the digital divide in literacy skills. In the present study we examined the degree of effectiveness of receiving a desktop or hybrid computer for the home in reducing the digital divide among children of low socio-economic status aged 8-12 from various localities across Israel. The sample consisted of 1,248 respondents assessed in two measurements. As part of the mixed-method study, 128 children were also interviewed. Findings indicate that after the children received desktop or hybrid computers, changes occurred in their frequency of access, mobility, and computer literacy. Differences were found between the groups: hybrid computers reduce disparities and promote work with the computer and surfing the Internet more than do desktop computers. Narrowing the digital divide for this age group has many implications for the acquisition of skills and study habits, and consequently, for the realization of individual potential. The children spoke about self improvement as a result of exposure to the digital environment, about a sense of empowerment and of improvement in their advantage in the social fabric. Many children expressed a desire to continue their education and expand their knowledge of computer applications, the use of software, of games, and more. Therefore, if there is no computer in the home and it is necessary to decide between a desktop and a hybrid computer, a hybrid computer is preferable.

  11. Light Water Reactor Sustainability Program: Digital Architecture Project Plan

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, Ken [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-09-01

    There are many technologies available to the nuclear power industry to improve efficiency in plant work activities. These range from new control room technologies to those for mobile field workers. They can make a positive impact on a wide range of performance objectives – increase in productivity, human error reduction, validation of results, accurate transfer of data, and elimination of repetitive tasks. It is expected that the industry will increasingly turn to these technologies to achieve operational efficiencies and lower costs. At the same time, this will help utilities manage a looming staffing problem as the inevitable retirement wave of the more seasoned workers affects both staffing levels and knowledge retention. A barrier to this wide-scale implementation of new technologies for operational efficiency is the lack of a comprehensive digital architecture that can support the real-time information exchanges needed to achieve the desired operational efficiencies. This project will define an advanced digital architecture that will accommodate the entire range of system, process, and plant worker activity to enable the highest degree of integration, thereby creating maximum efficiency and productivity. This pilot project will consider a range of open standards that are suitable for the various data and communication requirements of a seamless digital environment. It will map these standards into an overall architecture to support the II&C developments of this research program.

  12. User's manual for computer program BASEPLOT

    Science.gov (United States)

    Sanders, Curtis L.

    2002-01-01

    The checking and reviewing of daily records of streamflow within the U.S. Geological Survey is traditionally accomplished by hand-plotting and mentally collating tables of data. The process is time consuming, difficult to standardize, and subject to errors in computation, data entry, and logic. In addition, the presentation of flow data on the internet requires more timely and accurate computation of daily flow records. BASEPLOT was developed for checking and review of primary streamflow records within the U.S. Geological Survey. Use of BASEPLOT enables users to (1) provide efficiencies during the record checking and review process, (2) improve quality control, (3) achieve uniformity of checking and review techniques of simple stage-discharge relations, and (4) provide a tool for teaching streamflow computation techniques. The BASEPLOT program produces tables of quality control checks and produces plots of rating curves and discharge measurements; variable shift (V-shift) diagrams; and V-shifts converted to stage-discharge plots, using data stored in the U.S. Geological Survey Automatic Data Processing System database. In addition, the program plots unit-value hydrographs that show unit-value stages, shifts, and datum corrections; input shifts, datum corrections, and effective dates; discharge measurements; effective dates for rating tables; and numeric quality control checks. Checklist/tutorial forms are provided for reviewers to ensure completeness of review and standardize the review process. The program was written for the U.S. Geological Survey SUN computer using the Statistical Analysis System (SAS) software produced by SAS Institute, Incorporated.

  13. A research program in empirical computer science

    Science.gov (United States)

    Knight, J. C.

    1991-01-01

    During the grant reporting period our primary activities have been to begin preparation for the establishment of a research program in experimental computer science. The focus of research in this program will be safety-critical systems. Many questions that arise in the effort to improve software dependability can only be addressed empirically. For example, there is no way to predict the performance of the various proposed approaches to building fault-tolerant software. Performance models, though valuable, are parameterized and cannot be used to make quantitative predictions without experimental determination of underlying distributions. In the past, experimentation has been able to shed some light on the practical benefits and limitations of software fault tolerance. It is common, also, for experimentation to reveal new questions or new aspects of problems that were previously unknown. A good example is the Consistent Comparison Problem that was revealed by experimentation and subsequently studied in depth. The result was a clear understanding of a previously unknown problem with software fault tolerance. The purpose of a research program in empirical computer science is to perform controlled experiments in the area of real-time, embedded control systems. The goal of the various experiments will be to determine better approaches to the construction of the software for computing systems that have to be relied upon. As such it will validate research concepts from other sources, provide new research results, and facilitate the transition of research results from concepts to practical procedures that can be applied with low risk to NASA flight projects. The target of experimentation will be the production software development activities undertaken by any organization prepared to contribute to the research program. Experimental goals, procedures, data analysis and result reporting will be performed for the most part by the University of Virginia.

  14. Gender Differences in the Use of Computers, Programming, and Peer Interactions in Computer Science Classrooms

    Science.gov (United States)

    Stoilescu, Dorian; Egodawatte, Gunawardena

    2010-01-01

    Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new…

  15. [Clinical analysis of 12 cases of orthognathic surgery with digital computer-assisted technique].

    Science.gov (United States)

    Tan, Xin-ying; Hu, Min; Liu, Chang-kui; Liu, Hua-wei; Liu, San-xia; Tao, Ye

    2014-06-01

    This study investigated the effect of digital computer-assisted techniques in orthognathic surgery. Twelve patients with jaw malformation were treated in our department from January 2008 to December 2011. With the help of CT and three-dimensional reconstruction techniques, the 12 patients underwent surgical treatment and the results were evaluated after surgery. The digital computer-assisted technique could clearly show the status of the jaw deformity and assist virtual surgery. After surgery, all patients were satisfied with the results. Digital orthognathic surgery can improve the predictability of the surgical procedure, facilitate communication with patients, shorten operative time, and reduce patients' pain.

  16. Computer Program Recognizes Patterns in Time-Series Data

    Science.gov (United States)

    Hand, Charles

    2003-01-01

    A computer program recognizes selected patterns in time-series data, such as digitized samples of seismic or electrophysiological signals. The program implements an artificial neural network (ANN) and a set of N clocks for the purpose of determining whether N or more instances of a certain waveform, W, occur within a given time interval, T. The ANN must be trained to recognize W in the incoming stream of data. The first time the ANN recognizes W, it sets clock 1 to count down from T to zero; the second time it recognizes W, it sets clock 2 to count down from T to zero, and so forth through the Nth instance. On the N + 1st instance, the cycle is repeated, starting with clock 1. If any clock has not reached zero when it is reset, then N instances of W have been detected within time T, and the program so indicates. The program can readily be encoded in a field-programmable gate array or an application-specific integrated circuit that could be used, for example, to detect electroencephalographic or electrocardiographic waveforms indicative of epileptic seizures or heart attacks, respectively.

  17. Increasing the computational efficiency of digital cross correlation by a vectorization method

    Science.gov (United States)

    Chang, Ching-Yuan; Ma, Chien-Ching

    2017-08-01

    This study presents a vectorization method for use in MATLAB programming aimed at increasing the computational efficiency of digital cross correlation in sound and images, resulting in speedups of 6.387 and 36.044 times compared with the looped expressions. This work bridges the gap between matrix operations and loop iteration, preserving flexibility and efficiency in program testing. This paper uses numerical simulation to verify the speedup of the proposed vectorization method, as well as experiments to measure the quantitative transient displacement response subjected to dynamic impact loading. The experiment involved the use of a high-speed camera as well as a fiber optic system to measure the transient displacement of a cantilever beam under impact from a steel ball. Experimental measurement data obtained from the two methods are in excellent agreement in both the time and frequency domains, with discrepancies of only 0.68%. Numerical and experimental results demonstrate the efficacy of the proposed vectorization method with regard to computational speed in signal processing and high precision in the correlation algorithm. We also present the source code with which to build MATLAB-executable functions on Windows as well as Linux platforms, and provide a series of examples to demonstrate the application of the proposed vectorization method.
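
    The core idea, replacing an explicit correlation loop with a single vectorized library call, can be illustrated outside MATLAB as well. A minimal NumPy sketch (illustrative only; this is not the paper's code):

```python
import numpy as np

def xcorr_loop(a, b):
    """Cross-correlation of two equal-length signals via explicit loops
    (the slow, looped form)."""
    n = len(a)
    out = np.zeros(2 * n - 1)
    for lag in range(-(n - 1), n):
        s = 0.0
        for i in range(n):
            j = i + lag
            if 0 <= j < n:
                s += a[i] * b[j]
        out[lag + n - 1] = s
    return out

def xcorr_vec(a, b):
    """Same result from one vectorized call (argument order chosen so
    the lag ordering matches the loop above)."""
    return np.correlate(b, a, mode="full")

a = np.array([1.0, 2.0])
b = np.array([3.0, 4.0])
# xcorr_loop(a, b) and xcorr_vec(a, b) both give [6., 11., 4.]
```

The vectorized form delegates the double loop to optimized compiled code, which is the source of the speedups the paper reports.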

  18. Digital Workflow for Computer-Guided Implant Surgery in Edentulous Patients: A Case Report.

    Science.gov (United States)

    Oh, Ji-Hyeon; An, Xueyin; Jeong, Seung-Mi; Choi, Byung-Ho

    2017-12-01

    The purpose of this article was to describe a fully digital workflow used to perform computer-guided flapless implant placement in an edentulous patient without the use of conventional impressions, models, or a radiographic guide. Digital data for the workflow were acquired using an intraoral scanner and cone-beam computed tomography (CBCT). The image fusion of the intraoral scan data and CBCT data was performed by matching resin markers placed in the patient's mouth. The definitive digital data were used to design a prosthetically driven implant position, surgical template, and computer-aided design and computer-aided manufacturing fabricated fixed dental prosthesis. The authors believe this is the first published case describing such a technique in computer-guided flapless implant surgery for edentulous patients. Copyright © 2017 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  19. Factors Influencing Women's Attitudes towards Computers in a Computer Literacy Training Program

    Science.gov (United States)

    Chang, Sung-Lu; Shieh, Ruey S.; Liu, Eric Zhi-Feng; Yu, Pao-Ta

    2012-01-01

    In the "Digital Divide" research, adult women have generally been found to be the weakest group when compared with others. There is thus a need to provide this particular group with computer literacy training, and to give them opportunities to learn about using computers. In such training, women not only need to learn computer skills,…

  20. Computer analysis, using a digitizer, of ataxic mouse gait due to drugs.

    Science.gov (United States)

    Steinberg, H; Sykes, E A; McBride, A; Terry, P; Robinson, K; Tillotson, H

    1989-04-01

    A simple objective method using the irregularity of the spacing of rats' footprints to determine drug-induced locomotor ataxia has been adapted for mice and for computer analysis, by means of a digitizer-based program. Results obtained by the usual manual method of measuring and analyzing the records are compared with results of the computerized method. The computer method improves speed, and perhaps accuracy, of measurement and analysis, especially with large numbers of records, although manual scoring gives satisfactory results and remains essential for unusual records. Inter-observer agreement of the computerized method was high, and there was good agreement between measurements and subjective ratings of ataxia. The use of footprints to measure ataxia, with or without computer aid, is recommended as a routine test in laboratory evaluation of psychoactive drugs. Other uses discussed include determining changes in different characteristics of gait such as step width and step length in animal and human subjects affected not only by drugs, but also by movement disorders such as Parkinsonism.
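
    The irregularity measure at the heart of this method can be sketched in a few lines (Python for illustration; the coefficient-of-variation index and the coordinates below are hypothetical stand-ins, not the paper's exact scoring):

```python
import statistics

def step_lengths(footprints):
    """Distances between successive digitized footprint coordinates (x, y)."""
    return [((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
            for (x1, y1), (x2, y2) in zip(footprints, footprints[1:])]

def irregularity(footprints):
    """Coefficient of variation of step length: 0 for perfectly regular
    gait, larger for more irregular (ataxic) spacing."""
    steps = step_lengths(footprints)
    return statistics.stdev(steps) / statistics.mean(steps)

regular = [(i * 5.0, 0.0) for i in range(6)]           # evenly spaced prints
wobbly = [(0, 0), (4, 1), (10, -1), (13, 2), (20, 0), (23, -2)]
# irregularity(regular) == 0.0; irregularity(wobbly) > 0
```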

  1. Interactive Computing and Graphics in Undergraduate Digital Signal Processing. Microcomputing Working Paper Series F 84-9.

    Science.gov (United States)

    Onaral, Banu; And Others

    This report describes the development of a Drexel University electrical and computer engineering course on digital filter design that used interactive computing and graphics, and was one of three courses in a senior-level sequence on digital signal processing (DSP). Interactive and digital analysis/design routines and the interconnection of these…

  2. STEW A Nonlinear Data Modeling Computer Program

    CERN Document Server

    Chen, H

    2000-01-01

    A nonlinear data modeling computer program, STEW, employing the Levenberg-Marquardt algorithm, has been developed to model the experimental ²³⁹Pu(n,f) and ²³⁵U(n,f) cross sections. This report presents results of the modeling of the ²³⁹Pu(n,f) and ²³⁵U(n,f) cross-section data. The calculation of the fission transmission coefficient is based on the double-humped-fission-barrier model of Bjornholm and Lynn. Incident neutron energies of up to 5 MeV are considered.
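
    The Levenberg-Marquardt idea underlying such a fitting program can be sketched as follows (Python/NumPy for illustration; the exponential model is a hypothetical stand-in for a cross-section curve, and none of this is STEW's actual code):

```python
import numpy as np

def levenberg_marquardt(f, jac, p0, x, y, iters=100, lam=1e-3):
    """Minimal Levenberg-Marquardt loop for nonlinear least squares.
    f(x, p) returns model values; jac(x, p) returns the Jacobian."""
    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        r = y - f(x, p)                       # residual vector
        J = jac(x, p)
        A = J.T @ J + lam * np.eye(len(p))    # damped normal equations
        step = np.linalg.solve(A, J.T @ r)
        if np.sum((y - f(x, p + step)) ** 2) < np.sum(r ** 2):
            p, lam = p + step, lam * 0.5      # accept: lean toward Gauss-Newton
        else:
            lam *= 2.0                        # reject: lean toward gradient descent
    return p

# Hypothetical smooth model standing in for a cross-section shape.
def model(x, p):
    return p[0] * np.exp(-p[1] * x) + p[2]

def model_jac(x, p):
    e = np.exp(-p[1] * x)
    return np.stack([e, -p[0] * x * e, np.ones_like(x)], axis=1)

x = np.linspace(0.1, 5.0, 50)                 # incident energies, MeV
y = model(x, np.array([2.0, 0.8, 0.5]))       # noise-free synthetic data
p_fit = levenberg_marquardt(model, model_jac, [1.0, 1.0, 0.0], x, y)
```

The damping parameter interpolates between Gauss-Newton steps (small lam) and short gradient-descent steps (large lam), which is what makes the method robust far from the solution.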

  3. Computer programming: Science, art, or both?

    Science.gov (United States)

    Gum, Sandra Trent

    The purpose of this study was to determine if spatial intelligence contributes to a student's success in a computer science major or if mathematical-logical intelligence is sufficient data on which to base a prediction of success. The study was performed at a small university. The sample consisted of 15 computer science (CS) majors, enrolled in a computer science class, and 15 non-CS-majors, enrolled in a statistics class. Seven of the CS-majors were considered advanced and seven were considered less advanced. The independent measures were: the mathematics and the English scores from the ACT/SAT (CS-majors); a questionnaire to obtain personal information; the major area of study, which compared CS-majors to all other majors; and the number of completed computer science classes (CS-majors) to determine advanced and less advanced CS-majors. The dependent measures were: a multiple intelligence inventory for adults to determine perception of intelligences; the GEFT to determine field independence; the Card Rotations Test to determine spatial orientation ability; the Maze Tracing Speed Test to determine spatial scanning ability; and the Surface Development Test to determine visualization ability. The visualization measure correlated positively and significantly with the GEFT. The year in college correlated positively and significantly with the GEFT and visualization measure for CS-majors and correlated negatively for non-CS-majors. Although non-CS-majors scored higher on the spatial orientation measure, CS-majors scored significantly higher on the spatial scanning measure. The year in college correlated negatively with many of the measures and perceptions of intelligences among both groups; however, there were more significant negative correlations among non-CS-majors. Results indicated that experience in computer programming may increase field independence, visualization ability, and spatial scanning ability while decreasing spatial orientation ability.

  4. Computer Aided Detection of Breast Masses in Digital Tomosynthesis

    Science.gov (United States)

    2008-06-01

    median filter. (c) Gabor filter: Another promising filter for image denoising and texture analysis is the Gabor filter (34). This type of multichannel ...retrieval in mammography: Using texture features for correlation with BI-RADS categories," Proceedings of the 6th International Workshop on Digital

  5. 75 FR 7370 - Closed Captioning of Video Programming; Closed Captioning Requirements for Digital Television...

    Science.gov (United States)

    2010-02-19

    ... Television Receivers AGENCY: Federal Communications Commission. ACTION: Final rule; announcement of effective... Closed Captioning of Video Programming; Closed Captioning Requirements for Digital Television Receivers...

  6. Mental Computation or Standard Algorithm? Children's Strategy Choices on Multi-Digit Subtractions

    Science.gov (United States)

    Torbeyns, Joke; Verschaffel, Lieven

    2016-01-01

    This study analyzed children's use of mental computation strategies and the standard algorithm on multi-digit subtractions. Fifty-eight Flemish 4th graders of varying mathematical achievement level were individually offered subtractions that either stimulated the use of mental computation strategies or the standard algorithm in one choice and two…

  7. Elderly online: effects of a digital inclusion program in cognitive performance.

    Science.gov (United States)

    Ordonez, Tiago Nascimento; Yassuda, Mônica Sanches; Cachioni, Meire

    2011-01-01

    There is little empirical data about the impact of digital inclusion on cognition among older adults. This paper aimed at investigating the effects of a digital inclusion program in the cognitive performance of older individuals who participated in a computer learning workshop named "Idosos On-Line" (Elderly Online). Forty-two aged individuals participated in the research study: 22 completed the computer training workshop and 20 constituted the control group. All subjects answered a sociodemographic questionnaire and completed the Addenbrooke's cognitive examination, revised (ACE-R), which examines five cognitive domains: orientation and attention, memory, verbal fluency, language, and visuo-spatial skills. It was noted that the experimental group's cognitive performance significantly improved after the program, particularly in the language and memory domains, when compared to the control group. These findings suggest that the acquisition of new knowledge and the use of a new tool, that makes it possible to access the Internet, may bring gains to cognition. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  8. Noninferiority and Equivalence Evaluation of Clinical Performance among Computed Radiography, Film, and Digitized Film for Telemammography Services

    Directory of Open Access Journals (Sweden)

    Antonio J. Salazar

    2016-01-01

    Full Text Available Objective. The aim of this study was to evaluate and compare the clinical performance of different alternatives to implement low-cost screening telemammography. We compared computed radiography, film printed images, and digitized films produced with a specialized film digitizer and a digital camera. Material and Methods. The ethics committee of our institution approved this study. We assessed the equivalence of the clinical performance of observers for cancer detection. The factorial design included 70 screening patients, four technological alternatives, and cases interpreted by seven radiologists, for a total of 1,960 observations. The variables evaluated were the positive predictive value (PPV, accuracy, sensitivity, specificity, and the area under the receiver operating characteristic curves (AUC. Results. The mean values for the observed variables were as follows: accuracy ranged from 0.77 to 0.82, the PPV ranged from 0.67 to 0.68, sensitivity ranged from 0.64 to 0.74, specificity ranged from 0.87 to 0.90, and the AUC ranged from 0.87 to 0.90. With an equivalence margin of 0.1, all alternatives were equivalent for all variables. Conclusion. Our findings suggest that telemammography screening programs may be provided to underserved populations at a low cost, using a film digitizer or a digital camera.
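
    The equivalence evaluation at a margin of 0.1 can be illustrated with a two one-sided tests (TOST) sketch (Python; a normal-approximation version with hypothetical data, not the study's exact statistical procedure):

```python
from statistics import NormalDist, mean, stdev
from math import sqrt

def tost_equivalence(diffs, margin=0.1, alpha=0.05):
    """Two one-sided tests (TOST), normal approximation: the mean paired
    difference is declared equivalent to zero when its (1 - 2*alpha)
    confidence interval lies entirely within (-margin, +margin)."""
    m = mean(diffs)
    se = stdev(diffs) / sqrt(len(diffs))
    z = NormalDist().inv_cdf(1 - alpha)   # one-sided critical value
    return -margin < m - z * se and m + z * se < margin

# Hypothetical per-case accuracy differences between two modalities
diffs = [0.01, -0.02, 0.00, 0.03, -0.01, 0.02, 0.01, -0.03, 0.00, 0.01]
# small mean difference with a tight CI -> equivalent at margin 0.1
```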

  9. 77 FR 50722 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2012-08-22

    ... COMMISSION Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants... regulatory guide (DG), DG-1208, ``Software Unit Testing for Digital Computer Software used in Safety Systems... revision endorses, with clarifications, the enhanced consensus practices for testing of computer software...

  10. Numerical analysis of stiffened shells of revolution. Volume 3: Users' manual for STARS-2B, 2V, shell theory automated for rotational structures, 2 (buckling, vibrations), digital computer programs

    Science.gov (United States)

    Svalbonas, V.

    1973-01-01

    The User's manual for the shell theory automated for rotational structures (STARS) 2B and 2V (buckling, vibrations) is presented. Several features of the program are: (1) arbitrary branching of the shell meridians, (2) arbitrary boundary conditions, (3) minimum input requirements to describe a complex, practical shell of revolution structure, and (4) accurate analysis capability using a minimum number of degrees of freedom.

  11. Digital image correlation in experimental mechanics and image registration in computer vision: Similarities, differences and complements

    Science.gov (United States)

    Wang, Zhaoyang; Kieu, Hien; Nguyen, Hieu; Le, Minh

    2015-02-01

    Digital image correlation and image registration or matching are among the most widely used techniques in the fields of experimental mechanics and computer vision, respectively. Despite their applications in separate fields, both techniques primarily involve detecting the same physical points in two or more images. In this paper, a brief technical comparison of the two techniques is given, and their similarities, differences, and complementary aspects are presented. It is shown that some concepts from the image registration or matching technique can be applied to the digital image correlation technique to substantially enhance its performance, which can help broaden the applications of digital image correlation in scientific research and engineering practice.
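
    The shared core of DIC subset matching and image registration, locating the same physical points via a correlation score, can be sketched as follows (Python/NumPy; a brute-force zero-normalized cross-correlation search with synthetic data, purely illustrative):

```python
import numpy as np

def ncc(template, patch):
    """Zero-normalized cross-correlation: the similarity score at the
    heart of both DIC subset matching and image registration."""
    t = template - template.mean()
    p = patch - patch.mean()
    return float((t * p).sum() / (np.linalg.norm(t) * np.linalg.norm(p)))

def match(image, template):
    """Brute-force search for the template's best-matching location."""
    th, tw = template.shape
    best, best_yx = -2.0, (0, 0)
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            score = ncc(template, image[y:y + th, x:x + tw])
            if score > best:
                best, best_yx = score, (y, x)
    return best_yx, best

rng = np.random.default_rng(0)
image = rng.random((30, 30))
template = image[12:20, 5:13].copy()   # a subset cut from the image
# match(image, template) recovers location (12, 5) with score ~ 1.0
```

Practical DIC and registration codes refine this integer-pixel search to sub-pixel accuracy and use much faster search strategies, but the matching criterion is the same.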

  12. Ceramic Prototypes – Design, Computation, and Digital Fabrication

    Directory of Open Access Journals (Sweden)

    M. Bechthold

    2016-12-01

    Research in ceramic material systems at Harvard University has introduced a range of novel applications which combine digital manufacturing technologies and robotics with imaginative design and engineering methods. Prototypes showcase the new performative qualities of ceramics and the integration of this material in today's construction culture. Work ranges from daylight control systems to structural applications and a robotic tile placement system. Emphasis is on integrating novel technologies with tried and true manufacturing methods. The paper describes two distinct studies: one on 3D printing of ceramics, the other on structural use of large-format thin tiles.

  13. Computer Aided Detection of Breast Masses in Digital Tomosynthesis

    National Research Council Canada - National Science Library

    Singh, Swatee; Lo, Joseph

    2008-01-01

    The purpose of this study was to investigate feasibility of computer-aided detection of masses and calcification clusters in breast tomosynthesis images and obtain reliable estimates of sensitivity...

  14. Digital library programs for libraries and archives developing, managing, and sustaining unique digital collections

    CERN Document Server

    Purcell, D

    2016-01-01

    Equally valuable for LIS students just learning about the digital landscape, information professionals taking their first steps to create digital content, and organizations who already have well-established digital credentials, Purcell's book outlines methods applicable and scalable to many different types and sizes of libraries and archives.

  15. Dose and risk evaluation in digital mammography using computer modeling

    Energy Technology Data Exchange (ETDEWEB)

    Correa, Samanda Cristine Arruda; Souza, Edmilson Monteiro de, E-mail: scorrea@nuclear.ufrj.b, E-mail: emonteiro@nuclear.ufrj.b [Centro Universitario Estadual da Zona Oeste (CCMAT/UEZO), Rio de Janeiro, RJ (Brazil); Silva, Humberto de Oliveira, E-mail: hbetorj@gmail.co [Universidade Federal do Rio de Janeiro IF/UFRJ, RJ (Brazil). Inst. de Fisica; Silva, Ademir Xavier da; Lopes, Ricardo Tadeu; Magalhaes, Sarah Braga, E-mail: ademir@nuclear.ufrj.b, E-mail: ricardo@lin.ufrj.b, E-mail: smagalhaes@nuclear.ufrj.b [Coordenacao dos Programas de Pos-Graduacao de Engenharia (COPPE/UFRJ), RJ (Brazil). Programa de Engenharia Nuclear

    2010-07-01

    Digital mammography has been introduced in several countries in the last years. The new technology requires new optimising methods considering, for instance, the increased possibility of changing the absorbed dose, mainly in modern mammographic systems that allow the operator to choose the beam quality by varying the tube voltage and the filter and target materials. In this work, the Monte Carlo code MCNPX is used to investigate how the average glandular dose varies with tube voltage (23-32 kV) and anode-filter combination (Mo-Mo, Mo-Rh and Rh-Rh) in digital mammographic examinations. Furthermore, the risk of breast cancer incidence attributable to mammography exams was estimated using the Biological Effects of Ionizing Radiations (BEIR) VII Committee Report. The results show that the risk of breast cancer incidence in women younger than 30 years of age tends to decrease significantly using the Rh-Rh anode-filter combination and higher tube voltage. For women older than 50 years of age, the variation of tube voltage and anode-filter combination did not influence the risk values considerably. (author)

  16. Genetic program based data mining to reverse engineer digital logic

    Science.gov (United States)

    Smith, James F., III; Nguyen, Thanh Vu H.

    2006-04-01

    A data mining based procedure for automated reverse engineering and defect discovery has been developed. The data mining algorithm for reverse engineering uses a genetic program (GP) as a data mining function. A genetic program is an algorithm based on the theory of evolution that automatically evolves populations of computer programs or mathematical expressions, eventually selecting one that is optimal in the sense it maximizes a measure of effectiveness, referred to as a fitness function. The system to be reverse engineered is typically a sensor. Design documents for the sensor are not available and conditions prevent the sensor from being taken apart. The sensor is used to create a database of input signals and output measurements. Rules about the likely design properties of the sensor are collected from experts. The rules are used to create a fitness function for the genetic program. Genetic program based data mining is then conducted. This procedure incorporates not only the experts' rules into the fitness function, but also the information in the database. The information extracted through this process is the internal design specifications of the sensor. Uncertainty related to the input-output database and the expert based rule set can significantly alter the reverse engineering results. Significant experimental and theoretical results related to GP based data mining for reverse engineering will be provided. Methods of quantifying uncertainty and its effects will be presented. Finally methods for reducing the uncertainty will be examined.

  17. Near real-time digital holographic microscope based on GPU parallel computing

    Science.gov (United States)

    Zhu, Gang; Zhao, Zhixiong; Wang, Huarui; Yang, Yan

    2018-01-01

    A transmission near real-time digital holographic microscope with in-line and off-axis light paths is presented, in which parallel computing based on the compute unified device architecture (CUDA) is combined with digital holographic microscopy. Other holographic microscopes must implement reconstruction in multiple focal planes and are therefore time-consuming; with CUDA-based parallel computing, the reconstruction speed of the near real-time digital holographic microscope can be greatly improved, making it especially suitable for measurements of particle fields at micrometer and nanometer scales. Simulations and experiments show that the proposed transmission digital holographic microscope can accurately measure and display the velocity of a particle field at micrometer scale, with an average velocity error lower than 10%. With the graphics processing unit (GPU), the computing time for 100 reconstruction planes (512×512 grids) is under 120 ms, while it is 4.9 s using the traditional CPU-based reconstruction method: a 40-fold speedup. In other words, it can handle holograms at 8.3 frames per second, realizing near real-time measurement and display of the particle velocity field. Real-time three-dimensional reconstruction of the particle velocity field is expected to be achieved by further optimization of software and hardware.

  18. Human operator identification model and related computer programs

    Science.gov (United States)

    Kessler, K. M.; Mohr, J. N.

    1978-01-01

    Four computer programs which provide computational assistance in the analysis of man/machine systems are reported. The programs are: (1) Modified Transfer Function Program (TF); (2) Time Varying Response Program (TVSR); (3) Optimal Simulation Program (TVOPT); and (4) Linear Identification Program (SCIDNT). The TF program converts the time domain state variable system representation to the frequency domain transfer function system representation. The TVSR program computes time histories of the input/output responses of the human operator model. The TVOPT program is an optimal simulation program and is similar to TVSR in that it produces time histories of system states associated with an operator-in-the-loop system. The differences between the two programs are presented. The SCIDNT program is an open loop identification code which operates on the simulated data from TVOPT (or TVSR) or real operator data from motion simulators.

  19. Polygonal Approximation of Digital Curves Using Evolutionary Programming

    Directory of Open Access Journals (Sweden)

    Raul E. Sanchez-Yanez

    2012-03-01

    This paper proposes an Evolutionary Programming (EP) approach to solve the polygonal approximation of digital curves. The solution provided by the method consists of a sequence of straight line segments to be applied as Advance and Rotate motion primitives of a 2D Cartesian robot. The proposed approach automatically finds the number of segments and the starting and ending points of each of them. We have tested our approach on a test set of digital curves that exhibits two main qualitative features, openness and straightness, in different degrees. We show that our method obtains good results for approximating the curves in the test set. We present both quantitative and qualitative results of these tests.
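
    The evolutionary search for segment breakpoints can be sketched as follows (Python; a toy EP loop with a hypothetical fitness weighting, not the authors' implementation):

```python
import math
import random

def seg_error(curve, i, j):
    """Max distance from points curve[i..j] to the chord curve[i]-curve[j]."""
    (x1, y1), (x2, y2) = curve[i], curve[j]
    length = math.hypot(x2 - x1, y2 - y1) or 1.0
    return max(abs((y2 - y1) * x - (x2 - x1) * y + x2 * y1 - y2 * x1) / length
               for x, y in curve[i:j + 1])

def fitness(curve, breaks):
    """Approximation error plus a (hypothetical) penalty per segment."""
    pts = [0] + sorted(breaks) + [len(curve) - 1]
    return (max(seg_error(curve, a, b) for a, b in zip(pts, pts[1:]))
            + 0.05 * len(breaks))

def mutate(parent, rng, hi):
    """Shift each breakpoint by -1/0/+1, clamped; reject collapses."""
    child = sorted({min(max(b + rng.choice((-1, 0, 1)), 1), hi) for b in parent})
    return child if len(child) == len(parent) else parent

def evolve(curve, pop_size=20, gens=200, seed=1):
    """Toy EP loop: each parent proposes a mutated child and the fitter
    of the two survives."""
    rng = random.Random(seed)
    hi = len(curve) - 2
    pop = [sorted(rng.sample(range(1, hi + 1), 2)) for _ in range(pop_size)]
    for _ in range(gens):
        for k, parent in enumerate(pop):
            child = mutate(parent, rng, hi)
            if fitness(curve, child) < fitness(curve, parent):
                pop[k] = child
    return min(pop, key=lambda b: fitness(curve, b))

# An open digital curve with two true corner points (indices 2 and 4).
curve = [(0, 0), (1, 0), (2, 0), (3, 1), (4, 2), (5, 2), (6, 2)]
best = evolve(curve)   # expected to settle on breakpoints near [2, 4]
```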

  20. Experiences with Efficient Methodologies for Teaching Computer Programming to Geoscientists

    Science.gov (United States)

    Jacobs, Christian T.; Gorman, Gerard J.; Rees, Huw E.; Craig, Lorraine E.

    2016-01-01

    Computer programming was once thought of as a skill required only by professional software developers. But today, given the ubiquitous nature of computation and data science, it is quickly becoming necessary for all scientists and engineers to have at least a basic knowledge of how to program. Teaching how to program, particularly to those students…

  1. Computer Programs for Characteristic Modes of Bodies of Revolution

    Science.gov (United States)

    Computer programs are given for calculating the characteristic currents and characteristic gain patterns of conducting bodies of revolution. Also given are computer programs for using these characteristic currents in aperture radiation and plane-wave scattering problems. Plot programs for use with

  2. 01010000 01001100 01000001 01011001: Play Elements in Computer Programming

    Science.gov (United States)

    Breslin, Samantha

    2013-01-01

    This article explores the role of play in human interaction with computers in the context of computer programming. The author considers many facets of programming including the literary practice of coding, the abstract design of programs, and more mundane activities such as testing, debugging, and hacking. She discusses how these incorporate the…

  3. ROUTES: a computer program for preliminary route location.

    Science.gov (United States)

    S.E. Reutebuch

    1988-01-01

    An analytical description of the ROUTES computer program is presented. ROUTES is part of the integrated preliminary harvest- and transportation-planning software package, PLANS. The ROUTES computer program is useful where grade and sideslope limitations are important in determining routes for vehicular travel. With the program, planners can rapidly identify route...

  4. How Computers Change Things: Literacy and the Digitized Word.

    Science.gov (United States)

    Edwards, Bruce L., Jr.

    1991-01-01

    Asserts that, although computers pose no threat to reading and writing as modes of learning, knowing, and telling, they represent an attack on the Western tradition of textuality. Argues that instructors are needed whose literacy connects them with the orality of the past and bridges their present experience to the textuality of the future. (PRA)

  5. Digital Dating and Virtual Relating: Conceptualizing Computer Mediated Romantic Relationships.

    Science.gov (United States)

    Merkle, Erich R.; Richardson, Rhonda A.

    2000-01-01

    Examines the culture and history of the Internet that have contributed to the recent emergence of a subset of romantic interpersonal relationships known as computer mediated relationships. Considers the differences between the characteristics of face-to-face relationships and online relationships. Discusses implications of findings on clinical…

  6. NRC Class 1E Digital Computer System Guidelines

    Science.gov (United States)

    1993-05-01

    then be "proved" that the vessel cannot be at high temperature state and normal temperature state at the same time. The question whether high, normal...3 of Dependability of critical computer systems. Elsevier Applied Science, 1988. [18] J. W. Duran and S. C. Ntafos, "A report on random testing," in

  7. Digital event recorder capable of simple computations and with ...

    African Journals Online (AJOL)

    An event recorder which can summate and display stored data is described. This instrument can be used to record behavioural events or sequences in the laboratory or the field and produces a punched tape record which may be read by a computer, without need for an interface. Its ability to perform simple calculations for ...

  8. Simulation and Digitization of a Gas Electron Multiplier Detector Using Geant4 and an Object-Oriented Digitization Program

    Science.gov (United States)

    McMullen, Timothy; Liyanage, Nilanga; Xiong, Weizhi; Zhao, Zhiwen

    2017-01-01

    Our research has focused on simulating the response of a Gas Electron Multiplier (GEM) detector using computational methods. GEM detectors provide a cost effective solution for radiation detection in high rate environments. A detailed simulation of GEM detector response to radiation is essential for the successful adaptation of these detectors to different applications. Using Geant4 Monte Carlo (GEMC), a wrapper around Geant4 which has been successfully used to simulate the Solenoidal Large Intensity Device (SoLID) at Jefferson Lab, we are developing a simulation of a GEM chamber similar to the detectors currently used in our lab. We are also refining an object-oriented digitization program, which translates energy deposition information from GEMC into electronic readout that resembles the readout from our physical detectors. We have run the simulation with beta particles produced by the simulated decay of a 90Sr source, as well as with a simulated bremsstrahlung spectrum. Comparison of the simulation data with real GEM data taken under similar conditions is used to refine the simulation parameters. Comparisons between results from the simulations and results from detector tests will be presented.

  9. Programs=data=first-class citizens in a computational world

    DEFF Research Database (Denmark)

    Jones, Neil; Simonsen, Jakob Grue

    2012-01-01

    From a programming perspective, Alan Turing's epochal 1936 paper on computable functions introduced several new concepts, including what is today known as self-interpreters and programs as data, and invented a great many now-common programming techniques. We begin by reviewing Turing's contribution...... from a programming perspective; and then systematize and mention some of the many ways that later developments in models of computation (MOCs) have interacted with computability theory and programming language research. Next, we describe the ‘blob’ MOC: a recent stored-program computational model...... without pointers. In the blob model, programs are truly first-class citizens, capable of being automatically compiled, or interpreted, or executed directly. Further, the blob model appears closer to being physically realizable than earlier computation models. In part, this is due to strong finiteness...

  10. Development of Window-based program for analysis and visualization of two-dimensional stress field in digital photoelasticity

    Directory of Open Access Journals (Sweden)

    Pichet Pinit

    2009-07-01

    Full Text Available This paper describes the development of a Windows-based framework for analyzing and visualizing two-dimensional stress fields in digital photoelasticity. The program is implemented as stand-alone software. It contains two main parts, a computational part and a visual part, supplemented with several image-processing functions. The computational method used for retrieval of the photoelastic parameters (the isoclinic and isochromatic parameters) is the phase stepping method. The visualization presents the results to the user as a gray-scale or color map of these parameters, which is convenient for physical interpretation. With the Windows-based framework, additional modules, either computational or visualization, can simply be added to the program.
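
    The phase stepping method mentioned above admits a very compact core calculation. The sketch below shows the generic four-step (90-degree) phase-shifting formula applied to a single synthetic pixel; it illustrates the technique only, is not the paper's implementation, and all values are invented.

```python
import math

def wrapped_phase(i1, i2, i3, i4):
    """Recover the wrapped phase from four intensities captured at
    phase steps of 0, 90, 180, and 270 degrees:
        I_k = A + B*cos(phi + k*pi/2)."""
    return math.atan2(i4 - i2, i1 - i3)

# Synthetic pixel: background A = 100, modulation B = 50, true phase 0.7 rad.
A, B, phi = 100.0, 50.0, 0.7
frames = [A + B * math.cos(phi + k * math.pi / 2) for k in range(4)]
print(round(wrapped_phase(*frames), 6))  # prints 0.7
```

    In practice the same arctangent is evaluated pixel by pixel over whole images, after which the wrapped values are unwrapped to obtain continuous isoclinic and isochromatic maps.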

  11. Advanced Methods for the Computer-Aided Diagnosis of Lesions in Digital Mammograms

    Science.gov (United States)

    2000-07-01

    classification of mammographic mass lesions. Radiology 213: 200, 1999. Nishikawa R, Giger ML, Yarusso L, Kupinski M, Baehr A, Venta L: Computer-aided...detection of mass lesions in digital mammography using radial gradient index filtering. Radiology 213: 229, 1999. Maloney M, Huo Z, Giger ML, Venta L...Nishikawa R, Huo Z, Jiang Y, Venta L, Doi K: Computer-aided diagnosis (CAD) in breast imaging. Radiology 213: 507, 1999. Final Report DAMD 17-96-1-6058

  12. Architecture and Initial Development of a Digital Library Platform for Computable Knowledge Objects for Health.

    Science.gov (United States)

    Flynn, Allen J; Bahulekar, Namita; Boisvert, Peter; Lagoze, Carl; Meng, George; Rampton, James; Friedman, Charles P

    2017-01-01

    Throughout the world, biomedical knowledge is routinely generated and shared through primary and secondary scientific publications. However, there is too much latency between publication of knowledge and its routine use in practice. To address this latency, what is actionable in scientific publications can be encoded to make it computable. We have created a purpose-built digital library platform to hold, manage, and share actionable, computable knowledge for health called the Knowledge Grid Library. Here we present it with its system architecture.

  13. Stochastic linear programming models, theory, and computation

    CERN Document Server

    Kall, Peter

    2011-01-01

    This new edition of Stochastic Linear Programming: Models, Theory and Computation has been brought completely up to date, either dealing with or at least referring to new material on models and methods, including DEA with stochastic outputs modeled via constraints on special risk functions (generalizing chance constraints, ICC’s and CVaR constraints), material on Sharpe-ratio, and Asset Liability Management models involving CVaR in a multi-stage setup. To facilitate use as a text, exercises are included throughout the book, and web access is provided to a student version of the authors’ SLP-IOR software. Additionally, the authors have updated the Guide to Available Software, and they have included newer algorithms and modeling systems for SLP. The book is thus suitable as a text for advanced courses in stochastic optimization, and as a reference to the field. From Reviews of the First Edition: "The book presents a comprehensive study of stochastic linear optimization problems and their applications. … T...

  14. Macroevolution simulated with autonomously replicating computer programs.

    Science.gov (United States)

    Yedid, Gabriel; Bell, Graham

    The process of adaptation occurs on two timescales. In the short term, natural selection merely sorts the variation already present in a population, whereas in the longer term genotypes quite different from any that were initially present evolve through the cumulation of new mutations. The first process is described by the mathematical theory of population genetics. However, this theory begins by defining a fixed set of genotypes and cannot provide a satisfactory analysis of the second process because it does not permit any genuinely new type to arise. The evolutionary outcome of selection acting on novel variation arising over long periods is therefore difficult to predict. The classical problem of this kind is whether 'replaying the tape of life' would invariably lead to the familiar organisms of the modern biota. Here we study the long-term behaviour of populations of autonomously replicating computer programs and find that the same type, introduced into the same simple environment, evolves on any given occasion along a unique trajectory towards one of many well-adapted end points.

  15. Basis And Application Of The CARES/LIFE Computer Program

    Science.gov (United States)

    Nemeth, Noel N.; Janosik, Lesley A.; Gyekenyesi, John P.; Powers, Lynn M.

    1996-01-01

    Report discusses physical and mathematical basis of Ceramics Analysis and Reliability Evaluation of Structures LIFE prediction (CARES/LIFE) computer program, described in "Program for Evaluation of Reliability of Ceramic Parts" (LEW-16018).

  16. Positioning Continuing Education Computer Programs for the Corporate Market.

    Science.gov (United States)

    Tilney, Ceil

    1993-01-01

    Summarizes the findings of the market assessment phase of Bellevue Community College's evaluation of its continuing education computer training program. Indicates that marketing efforts must stress program quality and software training to help overcome strong antiacademic client sentiment. (MGB)

  17. Digital fluoroscopic excretory urography, digital fluoroscopic urethrography, helical computed tomography, and cystoscopy in 24 dogs with suspected ureteral ectopia.

    Science.gov (United States)

    Samii, Valerie F; McLoughlin, Mary A; Mattoon, John S; Drost, Wm Tod; Chew, Dennis J; DiBartola, Stephen P; Hoshaw-Woodard, Stacy

    2004-01-01

    The purpose of this study was to determine the diagnostic utility of helical computed tomography (CT) for the diagnosis of ectopic ureters in the dog and to compare these findings with those of digital fluoroscopic excretory urography and digital fluoroscopic urethrography. Ureteral ectopia was confirmed or disproved based on findings from cystoscopy and exploratory surgery or postmortem examination. Of 24 dogs (20 female, 4 male) evaluated, 17 had ureteral ectopia. Digital fluoroscopic excretory urography and CT correctly identified ureteral ectopic status and site of ureteral ectopia (P < .05). Urethrography did not reliably detect ureteral ectopia. No false-positive diagnoses of ureteral ectopia were made in any of the imaging studies. Cystoscopic findings significantly agreed with findings during surgery in determining ureteral ectopic status and ectopic ureter site. One false-positive cystoscopic diagnosis of unilateral ureteral ectopia was made in a male dog. Kappa statistics showed better agreement between CT and both cystoscopy and surgical or postmortem examination findings with regard to presence and site of ureteral ectopia compared with other imaging techniques. CT was more useful than other established diagnostic imaging techniques for diagnosing canine ureteral ectopia.

  18. 78 FR 47014 - Configuration Management Plans for Digital Computer Software Used in Safety Systems of Nuclear...

    Science.gov (United States)

    2013-08-02

    ... (IEEE) Standard 828-2005, ``IEEE Standard for Software Configuration Management Plans,'' issued in 2005... RG 1.169 endorses IEEE Std. 828-2005, ``IEEE Standard for Software Configuration Management Plans... COMMISSION Configuration Management Plans for Digital Computer Software Used in Safety Systems of Nuclear...

  19. Definition and trade-off study of reconfigurable airborne digital computer system organizations

    Science.gov (United States)

    Conn, R. B.

    1974-01-01

    A highly reliable, fault-tolerant, reconfigurable computer system for aircraft applications was developed. The development and application of reliability and fault-tolerance assessment techniques are described. Particular emphasis is placed on the needs of an all-digital, fly-by-wire control system appropriate for a passenger-carrying airplane.

  20. Design Principles for "Thriving in Our Digital World": A High School Computer Science Course

    Science.gov (United States)

    Veletsianos, George; Beth, Bradley; Lin, Calvin; Russell, Gregory

    2016-01-01

    "Thriving in Our Digital World" is a technology-enhanced dual enrollment course introducing high school students to computer science through project- and problem-based learning. This article describes the evolution of the course and five lessons learned during the design, development, implementation, and iteration of the course from its…

  1. 77 FR 50727 - Configuration Management Plans for Digital Computer Software Used in Safety Systems of Nuclear...

    Science.gov (United States)

    2012-08-22

    ... revision 2 of RG 1.168, ``Verification, Validation, Reviews, and Audits for Digital Computer Software used... NRC-2012- 0195. You may submit comments by any of the following methods: Federal Rulemaking Web Site... possesses and are publicly available, by any of the following methods: Federal Rulemaking Web Site: Go to...

  2. Examining the Relationship between Digital Game Preferences and Computational Thinking Skills

    Science.gov (United States)

    Yildiz, Hatice Durak; Yilmaz, Fatma Gizem Karaoglan; Yilmaz, Ramazan

    2017-01-01

    The purpose of this study is to identify whether computational thinking skills among secondary school students differ depending on the type of digital games they play. The participants of this study were 202 secondary school students at 5th, 6th, 7th and 8th grades during 2016-2017 academic year. Correlational survey method was used during this…

  3. Modeling and Analysis of Power Processing Systems. [use of a digital computer for designing power plants

    Science.gov (United States)

    Fegley, K. A.; Hayden, J. H.; Rehmann, D. W.

    1974-01-01

    The feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems is investigated. It is shown that a digital computer may be used in an interactive mode for the design, modeling, analysis, and comparison of power processing systems.

  4. Digital-computer normal shock position and restart control of a Mach 2.5 axisymmetric mixed-compression inlet

    Science.gov (United States)

    Neiner, G. H.; Cole, G. L.; Arpasi, D. J.

    1972-01-01

    Digital computer control of a mixed-compression inlet is discussed. The inlet was terminated with a choked orifice at the compressor face station to dynamically simulate a turbojet engine. Inlet diffuser exit airflow disturbances were used. A digital version of a previously tested analog control system was used for both normal shock and restart control. Digital computer algorithms were derived using z-transform and finite difference methods. Using a sample rate of 1000 samples per second, the digital normal shock and restart controls essentially duplicated the inlet analog computer control results. At a sample rate of 100 samples per second, the control system performed adequately but was less stable.
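
    The finite-difference derivation described above can be illustrated with a toy discrete control law: replacing a continuous integral term with a running sum yields a sampled-data PI controller. The gains and sample period below are hypothetical and unrelated to the actual inlet controller.

```python
def discrete_pi(errors, kp, ki, ts):
    """Discrete PI control law obtained by backward-difference integration:
    u[n] = kp*e[n] + ki*ts*sum(e[0..n])."""
    out, integral = [], 0.0
    for e in errors:
        integral += e * ts
        out.append(kp * e + ki * integral)
    return out

# Constant unit error at 1000 samples per second: the integral term ramps.
u = discrete_pi([1.0] * 5, kp=2.0, ki=100.0, ts=0.001)
print(round(u[-1], 6))  # prints 2.5
```

    Coarser sampling makes the running-sum approximation of the integral less accurate, which is consistent with the degraded stability the abstract reports at 100 samples per second.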

  5. 78 FR 47011 - Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2013-08-02

    ... COMMISSION Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants..., ``Software Unit Testing for Digital Computer Software Used in Safety Systems of Nuclear Power Plants.'' This... (ANSI/IEEE) Standard (Std.) 1008-1987, ``IEEE Standard for Software Unit Testing'' with the...

  6. 77 FR 50720 - Test Documentation for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2012-08-22

    ... COMMISSION Test Documentation for Digital Computer Software Used in Safety Systems of Nuclear Power Plants... Nuclear Power Plants.'' The DG-1207 is proposed Revision 1 of RG 1.170, dated September 1997. This... for Digital Computer Software Used in Safety Systems of Nuclear Power Plants'' is temporarily...

  7. 78 FR 47805 - Test Documentation for Digital Computer Software Used in Safety Systems of Nuclear Power Plants

    Science.gov (United States)

    2013-08-06

    ... COMMISSION Test Documentation for Digital Computer Software Used in Safety Systems of Nuclear Power Plants..., ``Test Documentation for Digital Computer Software Used in Safety Systems of Nuclear Power Plants.'' This... quality in the software used in safety systems of nuclear power plants. ADDRESSES: Please refer to Docket...

  8. KNET - DISTRIBUTED COMPUTING AND/OR DATA TRANSFER PROGRAM

    Science.gov (United States)

    Hui, J.

    1994-01-01

    KNET facilitates distributed computing between a UNIX compatible local host and a remote host which may or may not be UNIX compatible. It is capable of automatic remote login. That is, it performs on the user's behalf the chore of handling host selection, user name, and password to the designated host. Once the login has been successfully completed, the user may interactively communicate with the remote host. Data output from the remote host may be directed to the local screen, to a local file, and/or to a local process. Conversely, data input from the keyboard, a local file, or a local process may be directed to the remote host. KNET takes advantage of the multitasking and terminal mode control features of the UNIX operating system. A parent process is used as the upper layer for interfacing with the local user. A child process is used for a lower layer for interfacing with the remote host computer, and optionally one or more child processes can be used for the remote data output. Output may be directed to the screen and/or to the local processes under the control of a data pipe switch. In order for KNET to operate, the local and remote hosts must observe a common communications protocol. KNET is written in ANSI standard C-language for computers running UNIX. It has been successfully implemented on several Sun series computers and a DECstation 3100 and used to run programs remotely on VAX VMS and UNIX based computers. It requires 100K of RAM under SunOS and 120K of RAM under DEC RISC ULTRIX. An electronic copy of the documentation is provided on the distribution medium. The standard distribution medium for KNET is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. KNET was developed in 1991 and is a copyrighted work with all copyright vested in NASA. UNIX is a registered trademark of AT&T Bell Laboratories. Sun and SunOS are trademarks of Sun Microsystems, Inc. DECstation, VAX, VMS, and
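
    The parent/child layering with a data pipe that the abstract describes can be sketched in miniature: a parent process spawns a child and reads the child's output through a pipe. This is a generic illustration of the pattern (here in Python), not KNET's C implementation.

```python
import subprocess
import sys

# Parent handles the "user" side; the child stands in for the remote-host
# layer; a pipe carries the child's output back to the parent.
child = subprocess.Popen(
    [sys.executable, "-c", "print('hello from the child process')"],
    stdout=subprocess.PIPE, text=True)
line = child.stdout.read().strip()
child.wait()
print(line)  # prints: hello from the child process
```

    KNET's "data pipe switch" generalizes this by letting the parent route such pipe output to the screen, a file, or further local processes.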

  9. Digital Media and Technology in Afterschool Programs, Libraries, and Museums

    Science.gov (United States)

    Herr-Stephenson, Becky; Rhoten, Diana; Perkel, Dan; Sims, Christo

    2011-01-01

    Digital media and technology have become culturally and economically powerful parts of contemporary middle-class American childhoods. Immersed in various forms of digital media as well as mobile and Web-based technologies, young people today appear to develop knowledge and skills through participation in media. This MacArthur Report examines the…

  10. Fourier optics through the looking glass of digital computers

    Science.gov (United States)

    Yaroslavsky, Leonid P.

    2011-10-01

    Optical transforms are represented in computers by their discrete versions. In particular, Fourier optics is represented through the Discrete Fourier Transform (DFT) and the Discrete Cosine Transform (DCT). Being discrete representations of the optical Fourier transform, these transforms feature a number of peculiarities that cast new light on such fundamental properties of the Fourier transform as the sampling theorem and the uncertainty principle. In this paper, we formulate the Discrete Sampling Theorem and the discrete uncertainty principle, demonstrate that discrete signals can be both bandlimited in the DFT or DCT domain and have strictly limited support in the signal domain, and present examples of such "bandlimited/space-limited" signals that remain so for arbitrarily large numbers of samples.
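
    What "bandlimited in the DFT domain" means can be demonstrated with a toy transform. The naive O(N^2) DFT below is written for clarity and is not the paper's code: a sampled cosine whose frequency lands exactly on a DFT bin has nonzero support on only two coefficients.

```python
import cmath
import math

def dft(x):
    """Naive O(N^2) Discrete Fourier Transform of a real sequence."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

# A cosine at exactly 3 cycles per N samples occupies only bins 3 and N-3.
N = 16
x = [math.cos(2 * math.pi * 3 * t / N) for t in range(N)]
support = [k for k, c in enumerate(dft(x)) if abs(c) > 1e-9]
print(support)  # prints [3, 13]
```

    A frequency that falls between bins would instead leak energy into every coefficient, which is one of the discrete peculiarities the paper examines.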

  11. Success Factors and Strategic Planning: Rebuilding an Academic Library Digitization Program

    OpenAIRE

    Cory Lampert; Jason Vaughan

    2009-01-01

    This paper discusses a dual approach of case study and research survey to investigate the complex factors in sustaining academic library digitization programs. The case study involves the background of the University of Nevada, Las Vegas (UNLV) Libraries’ digitization program and elaborates on the authors’ efforts to gain staff support for this program. A related survey was administered to all Association of Research Libraries (ARL) members, seeking to collect baseline data on their digital c...

  12. Adaptation and Feasibility Study of a Digital Health Program to Prevent Diabetes among Low-Income Patients: Results from a Partnership between a Digital Health Company and an Academic Research Team

    Directory of Open Access Journals (Sweden)

    Valy Fontil

    2016-01-01

    Full Text Available Background. The feasibility of digital health programs to prevent and manage diabetes in low-income patients has not been adequately explored. Methods. Researchers collaborated with a digital health company to adapt a diabetes prevention program for low-income prediabetes patients at a large safety net clinic. We conducted focus groups to assess patient perspectives, revised lessons for improved readability and cultural relevance to low-income and Hispanic patients, conducted a feasibility study of the adapted program in English- and Spanish-speaking cohorts, and implemented real-time adaptations to the program for commercial use and for a larger trial in multiple safety net clinics. Results. The majority of focus group participants were receptive to the program. We modified the curriculum to a 5th-grade reading level and adapted content based on patient feedback. In the feasibility study, 54% of eligible contacted patients expressed interest in enrolling (n=23). Although some participants' computer access and literacy made registration challenging, they were highly satisfied and engaged (80% logged in at least once per week). Conclusions. Underserved prediabetic patients displayed high engagement and satisfaction with a digital diabetes prevention program despite lower digital literacy skills. The collaboration between researchers and a digital health company enabled iterative improvements in technology implementation to address challenges in low-income populations.

  13. Digital books.

    Science.gov (United States)

    Wink, Diane M

    2011-01-01

    In this bimonthly series, the author examines how nurse educators can use the Internet and Web-based computer technologies such as search, communication, and collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. This article describes digital books.

  14. The Outlook for Computer Professions: 1985 Rewrites the Program.

    Science.gov (United States)

    Drake, Larry

    1986-01-01

    The author states that graduates of junior college programs who learn COBOL will continue to find jobs, but employers will increasingly seek college graduates when filling positions for computer programmers and systems analysts. Areas of growth for computer applications (services, military, data communications, and artificial intelligence) are…

  15. Basic BASIC; An Introduction to Computer Programming in BASIC Language.

    Science.gov (United States)

    Coan, James S.

    With the increasing availability of computer access through remote terminals and time sharing, more and more schools and colleges are able to introduce programing to substantial numbers of students. This book is an attempt to incorporate computer programming, using BASIC language, and the teaching of mathematics. The general approach of the book…

  16. A Computer Program for Short Circuit Analysis of Electric Power ...

    African Journals Online (AJOL)

    This paper described the mathematical basis and computational framework of a computer program developed for short circuit studies of electric power systems. The Short Circuit Analysis Program (SCAP) is to be used to assess the composite effects of unbalanced and balanced faults on the overall reliability of electric ...

  17. Software survey: VOSviewer, a computer program for bibliometric mapping

    NARCIS (Netherlands)

    N.J.P. van Eck (Nees Jan); L. Waltman (Ludo)

    2010-01-01

    We present VOSviewer, a freely available computer program that we have developed for constructing and viewing bibliometric maps. Unlike most computer programs that are used for bibliometric mapping, VOSviewer pays special attention to the graphical representation of bibliometric maps.

  18. Near-Surface Seismic Velocity Data: A Computer Program For ...

    African Journals Online (AJOL)

    A computer program (NESURVELANA) has been developed in the Visual Basic programming language to carry out near-surface velocity analysis. The method of analysis used includes: algorithm design and Visual Basic code generation for plotting arrival time (ms) against geophone depth (m) employing the ...
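
    The core of such a velocity analysis is a least-squares slope of geophone depth against first-arrival time. The sketch below (in Python rather than the program's Visual Basic) uses invented survey numbers purely for illustration.

```python
def fit_velocity(times_ms, depths_m):
    """Least-squares slope of depth (m) versus arrival time (ms);
    multiplying the slope by 1000 converts it to a velocity in m/s."""
    n = len(times_ms)
    mt = sum(times_ms) / n
    md = sum(depths_m) / n
    slope = (sum((t - mt) * (d - md) for t, d in zip(times_ms, depths_m))
             / sum((t - mt) ** 2 for t in times_ms))
    return slope * 1000.0

# Hypothetical uphole survey: first arrivals every 4 ms per 6 m of depth.
times = [0.0, 4.0, 8.0, 12.0, 16.0]
depths = [0.0, 6.0, 12.0, 18.0, 24.0]
print(fit_velocity(times, depths))  # prints 1500.0
```

    Plotting the same points, as the program does, lets the analyst spot depth intervals where the slope (and hence the interval velocity) changes.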

  19. Case Studies of Liberal Arts Computer Science Programs

    Science.gov (United States)

    Baldwin, D.; Brady, A.; Danyluk, A.; Adams, J.; Lawrence, A.

    2010-01-01

    Many undergraduate liberal arts institutions offer computer science majors. This article illustrates how quality computer science programs can be realized in a wide variety of liberal arts settings by describing and contrasting the actual programs at five liberal arts colleges: Williams College, Kalamazoo College, the State University of New York…

  20. Programming language for computations in the Interkosmos program

    Science.gov (United States)

    Schmidt, K.

    1975-01-01

    The programming system for Intercosmos data processing, based on the structural programming theory, which considers a program as an ordered set of standardized elementary parts, from which the user programs are automatically generated, is described. The programs are comprised of several modules, which are briefly summarized. The general structure of the programming system is presented in a block diagram. A programming control language developed to formulate the problem quickly and completely is presented along with basic symbols which are characteristic of the Intercosmos programming system.

  1. Computer-aided photometric analysis of dynamic digital bioluminescent images

    Science.gov (United States)

    Gorski, Zbigniew; Bembnista, T.; Floryszak-Wieczorek, J.; Domanski, Marek; Slawinski, Janusz

    2003-04-01

    The paper deals with the photometric and morphologic analysis of bioluminescent images obtained by registering light radiated directly from some plant objects. The registration of images obtained from ultra-weak light sources by the single photon counting (SPC) technique is the subject of this work. The radiation is registered with a 16-bit charge-coupled device (CCD) camera "Night Owl" together with WinLight EG&G Berthold software. Additional application-specific software has been developed in order to deal with objects that change during the exposure time. The advantages of the elaborated set of easily configurable tools, named FCT, for computer-aided photometric and morphologic analysis of numerous series of quantitatively imperfect chemiluminescent images are described. Instructions on how to use these tools are given and exemplified with several algorithms for the transformation of an image library. Using the proposed FCT set, automatic photometric and morphologic analysis reveals the information hidden within series of chemiluminescent images reflecting defensive processes in poinsettia (Euphorbia pulcherrima Willd) leaves affected by the pathogenic fungus Botrytis cinerea.

  2. Method and computer program product for maintenance and modernization backlogging

    Science.gov (United States)

    Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M

    2013-02-19

    According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.
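
    The additive relationship stated in the abstract reduces to a one-line function; the dollar figures below are invented for illustration.

```python
def future_facility_conditions(maintenance_cost, modernization_factor,
                               backlog_factor):
    """Future facility conditions as the sum of the three
    time-period-specific terms described in the patent abstract."""
    return maintenance_cost + modernization_factor + backlog_factor

# Hypothetical figures for one time period, in dollars.
print(future_facility_conditions(120000.0, 35000.0, 15000.0))  # prints 170000.0
```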

  3. An Experimental Investigation of Computer Program Development Approaches and Computer Programming Metrics.

    Science.gov (United States)

    1979-12-01

    Approved for public release; distribution unlimited. Technical Report TR-853, December 1979: An Experimental Investigation of Computer Program Development Approaches and Computer Programming Metrics. The remainder of the scanned report-documentation page is illegible.

  4. Designing Educational Games for Computer Programming: A Holistic Framework

    Science.gov (United States)

    Malliarakis, Christos; Satratzemi, Maya; Xinogalos, Stelios

    2014-01-01

    Computer science is continuously evolving during the past decades. This has also brought forth new knowledge that should be incorporated and new learning strategies must be adopted for the successful teaching of all sub-domains. For example, computer programming is a vital knowledge area within computer science with constantly changing curriculum…

  5. Seventy Years of Computing in the Nuclear Weapons Program

    Energy Technology Data Exchange (ETDEWEB)

    Archer, Billy Joe [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-05-30

    Los Alamos has continuously been on the forefront of scientific computing since it helped found the field. This talk will explore the rich history of computing in the Los Alamos weapons program. The current status of computing will be discussed, as will the expectations for the near future.

  6. 76 FR 52350 - Vehicular Digital Multimedia Evidence Recording System (VDMERS) Standard, Certification Program...

    Science.gov (United States)

    2011-08-22

    ... of Justice Programs Vehicular Digital Multimedia Evidence Recording System (VDMERS) Standard...) will make available to the general public three draft documents related to Vehicular Digital Multimedia Evidence Recording Systems (VDMERSs) used by law enforcement agencies: 1. Draft VDMERS Standard for Law...

  7. Programming Not Required: Skills and Knowledge for the Digital Library Environment

    Science.gov (United States)

    Howard, Katherine

    2010-01-01

    Education for Library and Information professionals in managing the digital environment has been a key topic for discussion within the LIS environment for some time. However, before designing and implementing a program for digital library education, it is prudent to ensure that the skills and knowledge required to work in this environment are…

  8. Analysis of the nebulosities near T Tauri using digital computer image processing

    Science.gov (United States)

    Lorre, J. J.

    1975-01-01

    Direct plates of T Tauri taken with the 120-inch (3 m) and 36-inch (91 cm) Lick reflectors were digitized and analyzed using digital-computer image-processing techniques. Luminous emission protrusions extending to as far as 13-sec from T Tauri in position angles 170 deg, 210 deg, and 330 deg are shown. These features are variable and may contain a layered structure. The complex reflection nebula west of T Tauri (NGC 1555) appears to be an illuminated portion of a much larger dark nebula whose variability is due to obscuring material near the star.

  9. A Methodology for Teaching Computer Programming: first year students’ perspective

    OpenAIRE

    Bassey Isong

    2014-01-01

    The teaching of computer programming is one of the greatest challenges that has persisted for years in Computer Science Education. A particular case is the computer programming course for beginners. While traditional objectivist, lecture-based approaches do not actively engage students in achieving their learning outcomes, we believe that integrating cutting-edge processes and practices such as the agile method into teaching approaches will provide leverage. Agile software development has gained...

  10. Computer Programs for Plotting Spot-Beam Coverages from an Earth-Synchronous Satellite and Earth-Station Antenna Elevation Angle Contours. Memorandum Number 72/4.

    Science.gov (United States)

    Stagl, Thomas W.; Singh, Jai P.

    Computer programs prepared in connection with a project on Application of Communication Satellites to Educational Development (see EM 010 449) are described and listed in this memorandum. First, the data tape containing a digitized map of the world which was used for the programs is described. Then the first program, WORLDMAP, which plots the tape…

  11. Attitude, Gender and Achievement in Computer Programming

    Science.gov (United States)

    Baser, Mustafa

    2013-01-01

    The aim of this research was to explore the relationship among students' attitudes toward programming, gender and academic achievement in programming. The scale used for measuring students' attitudes toward programming was developed by the researcher and consisted of 35 five-point Likert type items in four subscales. The scale was administered to…

  12. Wrist Hypothermia Related to Continuous Work with a Computer Mouse: A Digital Infrared Imaging Pilot Study

    Directory of Open Access Journals (Sweden)

    Jelena Reste

    2015-08-01

Full Text Available Computer work is characterized by sedentary static workload with low-intensity energy metabolism. The aim of our study was to evaluate the dynamics of skin surface temperature in the hand during prolonged computer mouse work under different ergonomic setups. Digital infrared imaging of the right forearm and wrist was performed during three hours of continuous computer work (measured at the start and every 15 minutes thereafter) in a laboratory with controlled ambient conditions. Four people participated in the study. Three different ergonomic computer mouse setups were tested on three different days (horizontal computer mouse without mouse pad; horizontal computer mouse with mouse pad and padded wrist support; vertical computer mouse without mouse pad). The study revealed a strong, statistically significant negative correlation between the temperature of the dorsal surface of the wrist and time spent working with a computer mouse. Hand skin temperature decreased markedly after one hour of continuous computer mouse work. Vertical computer mouse work preserved more stable and higher wrist temperatures (>30 °C), while continuous use of a horizontal mouse for more than two hours caused extremely low temperatures (<28 °C) in distal parts of the hand. The preliminary observational findings indicate a significant effect of the duration and ergonomics of computer mouse work on the development of hand hypothermia.

  13. Digitization

    DEFF Research Database (Denmark)

    Finnemann, Niels Ole

    2014-01-01

    Processes of digitization have for years represented a major trend in the developments of modern society but have only recently been related to processes of mediatization. The purpose of this article is to look into the relation between the concepts of mediatization and digitization and to clarify...... what a concept of digital media might add to the understanding of processes of mediatization and what the concept of mediatization might add to the understanding of digital media. It is argued that digital media open an array of new trajectories in human communication, trajectories which were...... not anticipated in previous conceptualizations of media and mediatization. If digital media are to be included, the concept of mediatization has to be revised and new parameters are to be built into the concept of media. At the same time it is argued that the concept of mediatization still provides a variety...

  14. Digital Video Assignments: Focusing a New Lens on Teacher Preparation Programs

    Science.gov (United States)

    Fiorentino, Leah Holland

    2004-01-01

    Since NCATE requires teacher preparation programs to integrate technology, this article begins the dialogue for sharing start-up strategies. This article shows how one program (Adelphi University) has integrated digital video assignments into the teacher preparation program. It is hoped that this will spur interest and effort from other teacher…

  15. Digital Bridge or Digital Divide? A Case Study Review of the Implementation of the "Computers for Pupils Programme" in a Birmingham Secondary School

    Science.gov (United States)

    Morris, Jonathan Padraig

    2011-01-01

    Attempts to bridge the Digital Divide have seen vast investment in Information Communication Technology in schools. In the United Kingdom, the Computers for Pupils initiative has invested 60 million British Pounds of funds to help some of the most disadvantaged secondary school pupils by putting a computer in their home. This paper charts and…

  16. Tangible computer programming: Exploring the use of emerging technology in classrooms and science museums

    Science.gov (United States)

    Horn, Michael S.

    In considering ways to improve the use of digital technology in educational settings, it is helpful to look beyond desktop computers and conventional modes of interaction and consider the flood of emerging technologies that already play a prominent role in the everyday lives of children. In this dissertation, I will present a research project that builds on tangible user interface (TUI) technology to support computer programming and robotics activities in education settings. In particular, I will describe the design and implementation of a novel tangible computer programming language called Tern. I will also describe an evaluation of Tern's use in both formal and informal educational settings--as part of an interactive exhibit on robotics and computer programming called Robot Park on display at the Boston Museum of Science; and as part of a curriculum unit piloted in several kindergarten classrooms in the greater Boston area. In both cases, Tern allows children to create simple computer programs to control a robot. However, rather than using a keyboard or mouse to write programs on a computer screen, children instead use Tern to construct physical algorithmic structures using a collection of interlocking wooden blocks. The goal of this work is not to propose that tangible programming languages are general purpose tools that should replace existing graphical programming languages; rather, I will present evidence to support the argument that tangible programming begins to make sense when one considers the contexts and constraints of specific educational settings. Moreover, in these settings tangible languages can compensate for some of the shortcomings of graphical and text-based systems that have limited their use.

  17. Generic Assessment Rubrics for Computer Programming Courses

    Science.gov (United States)

    Mustapha, Aida; Samsudin, Noor Azah; Arbaiy, Nurieze; Mohammed, Rozlini; Hamid, Isredza Rahmi

    2016-01-01

In programming, one problem can usually be solved using different logics and constructs while still producing the same output. Students sometimes get marked down inappropriately if their solutions do not follow the answer scheme. In addition, lab exercises and programming assignments are not necessarily graded by the instructors but most of the time…

  18. Instructional Uses of the Computer: Program Force

    Science.gov (United States)

    Ostrander, P.

    1975-01-01

    Describes a program which simulates motion in two dimensions of a point mass subject to a force which is a function of position, velocity, or time. Sample applications are noted and a source of a complete list of applications and programs is given. (GH)

  19. Introduction of handheld computing to a family practice residency program.

    Science.gov (United States)

    Rao, Goutham

    2002-01-01

Handheld computers are valuable practice tools. It is important for residency programs to introduce their trainees and faculty to this technology. This article describes a formal strategy to introduce handheld computing to a family practice residency program. Objectives were selected for the handheld computer training program that reflected skills physicians would find useful in practice. TRGpro handheld computers preloaded with a suite of medical reference programs, a medical calculator, and a database program were supplied to participants. Training consisted of four 1-hour modules, each with a written evaluation quiz. Participants completed a self-assessment questionnaire after the program to determine their ability to meet each objective. Sixty of the 62 participants successfully completed the training program. The mean composite score on quizzes was 36 of 40 (90%), with no significant differences by level of residency training. The mean self-rating of participants across all objectives was 3.31 of 4.00. Third-year residents had higher mean self-ratings than others (mean of group, 3.62). Participants were very comfortable with practical skills, such as using drug reference software, and less comfortable with theory, such as knowing the different types of handheld computers available. Structured training is a successful strategy for introducing handheld computing to a residency program.

  20. Diagnostic accuracy of Cone Beam Computed Tomography, conventional and digital radiographs in detecting interproximal caries.

    Science.gov (United States)

    Safi, Y; Shamloo Mahmoudi, N; Aghdasi, M M; Eslami Manouchehri, M; Rahimian, R; Valizadeh, S; Vasegh, Z; Azizi, Z

    2015-01-01

Various imaging methods are currently available for the detection of proximal caries. Several recent studies have attempted to determine the diagnostic accuracy of the available modalities, but they have shown variable results. Aim: This study was carried out to assess the diagnostic accuracy of cone-beam computed tomography (CBCT), conventional radiographs and the indirect digital system in the detection of interproximal caries. Materials and Method: In this in vitro study, forty-two extracted non-cavitated, unrestored human molar and premolar teeth were placed in blocks with their proximal surfaces in contact. They were then evaluated by CBCT, conventional radiographs and the indirect digital system for the detection of interproximal caries. Four oral and maxillofacial radiologists used a 4-point scale to assess the images for the presence or absence of proximal caries. Caries depth was determined by histological examination. The gathered data were analyzed in SPSS using weighted kappa and the Friedman test. Results: The accuracy of the indirect digital system was somewhat better than that of the conventional system. The accuracy of the indirect digital system was also better than that of the cone beam system, and this difference was statistically significant. Conclusion: The digital system was better than CBCT in the detection of proximal caries. Conventional radiography fell between the two other systems, without a statistically significant difference in caries detection. Thus, CBCT is not advised for detecting proximal caries because of its higher radiation dose.

  1. Delay-based reservoir computing: noise effects in a combined analog and digital implementation.

    Science.gov (United States)

    Soriano, Miguel C; Ortín, Silvia; Keuninckx, Lars; Appeltant, Lennert; Danckaert, Jan; Pesquera, Luis; van der Sande, Guy

    2015-02-01

    Reservoir computing is a paradigm in machine learning whose processing capabilities rely on the dynamical behavior of recurrent neural networks. We present a mixed analog and digital implementation of this concept with a nonlinear analog electronic circuit as a main computational unit. In our approach, the reservoir network can be replaced by a single nonlinear element with delay via time-multiplexing. We analyze the influence of noise on the performance of the system for two benchmark tasks: 1) a classification problem and 2) a chaotic time-series prediction task. Special attention is given to the role of quantization noise, which is studied by varying the resolution in the conversion interface between the analog and digital worlds.
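The time-multiplexing idea described above can be sketched in a few lines: a single nonlinearity is reused for many "virtual nodes" by masking each input sample, and the readout is a linear regression over the collected node states. The sketch below is illustrative only (the tanh nonlinearity, the binary mask, and the scalings `eta` and `gamma` are assumptions, not the authors' circuit); the optional `n_bits` argument mimics the quantization noise of the analog-to-digital conversion interface studied in the paper.

```python
import numpy as np

def delay_reservoir(u, n_virtual=50, eta=0.5, gamma=0.05, seed=0, n_bits=None):
    """Time-multiplexed reservoir built from a single tanh nonlinearity.

    Each input sample u[t] is spread over n_virtual 'virtual nodes' by a
    random +/-1 mask; node i at time t feeds back on its own value from
    the previous delay-loop round trip. If n_bits is given, the states
    are quantised to emulate an n_bit analog-digital conversion stage.
    """
    rng = np.random.default_rng(seed)
    mask = rng.choice([-1.0, 1.0], size=n_virtual)
    x = np.zeros(n_virtual)
    states = np.empty((len(u), n_virtual))
    for t, ut in enumerate(u):
        for i in range(n_virtual):
            x[i] = np.tanh(eta * x[i] + gamma * mask[i] * ut)
        if n_bits is not None:                      # quantisation noise
            levels = 2 ** n_bits - 1
            x = np.round((x + 1) / 2 * levels) / levels * 2 - 1
        states[t] = x
    return states

# linear readout trained by ridge regression on a toy one-step memory task
u = np.random.default_rng(1).uniform(-1, 1, 500)
target = np.roll(u, 1)                 # recall the previous input sample
S = delay_reservoir(u)
w = np.linalg.solve(S.T @ S + 1e-6 * np.eye(S.shape[1]), S.T @ target)
pred = S @ w
```

Lowering `n_bits` coarsens the states and degrades the readout, which is the quantization-noise effect the paper investigates by varying the converter resolution.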

  2. Digital Monsters, Binary Aliens – Computer Viruses, Capitalism and the Flow of Information

    Directory of Open Access Journals (Sweden)

    Jussi Parikka

    2005-01-01

    Full Text Available This article deals with articulations of digital accidents, focusing especially on how the computer virus has been signified as a problem for national security, international commerce and the individual user. However, at the same time as viruses have since the 1980s been constructed as malicious software threatening the very basics of the network society, they have been captured as part of the consumer capitalist system, exemplified e.g. in the rise of anti-virus industry. Thus, the article argues that capitalism itself is viral, functioning through a constant reshifting of its limits. Capitalism proceeds per se via these accidents and disruptions that it at the same time constructs as its enemies. In this sense, computer viruses and worms can be understood as the general accidents of digital capitalist culture.

  3. Teacher Training Programs for Computer Education and Computer Assisted Education in Turkey

    Science.gov (United States)

    Usun, Salih

    2007-01-01

    The aim of this descriptive study is to review the applications and problems on the teacher training programs for computer education and computer assisted education (CAE) in Turkey. The study, firstly, introduces some applications and major problems on using instructional media and computers in developing countries and instructional technology…

  4. A Collaborative Digital Pathology System for Multi-Touch Mobile and Desktop Computing Platforms

    KAUST Repository

    Jeong, W.

    2013-06-13

Collaborative slide image viewing systems are becoming increasingly important in pathology applications such as telepathology and E-learning. Despite rapid advances in computing and imaging technology, current digital pathology systems have limited performance with respect to remote viewing of whole slide images on desktop or mobile computing devices. In this paper we present a novel digital pathology client-server system that supports collaborative viewing of multi-plane whole slide images over standard networks using multi-touch-enabled clients. Our system is built upon a standard HTTP web server and a MySQL database to allow multiple clients to exchange image and metadata concurrently. We introduce a domain-specific image-stack compression method that leverages real-time hardware decoding on mobile devices. It adaptively encodes image stacks in a decorrelated colour space to achieve extremely low bitrates (0.8 bpp) with very low loss of image quality. We evaluate the image quality of our compression method and the performance of our system for diagnosis with an in-depth user study. © 2013 The Eurographics Association and John Wiley & Sons Ltd.

  5. Automatic analysis of digitized TV-images by a Computer-driven Optical Microscope

    CERN Document Server

    Rosa, G; Grella, G; Romano, G

    1997-01-01

    New methods of image analysis and three-dimensional pattern recognition were developed in order to perform the automatic scan of nuclear emulsion pellicles. An optical microscope, with a motorized stage, was equipped with a CCD camera and an image digitizer, and interfaced to a personal computer. Selected software routines inspired the design of a dedicated hardware processor. Fast operation, high efficiency and accuracy were achieved. First applications to high-energy physics experiments are reported.

  6. Digital Records Forensics: A New Science and Academic Program for Forensic Readiness

    Directory of Open Access Journals (Sweden)

    Luciana Duranti

    2010-06-01

Full Text Available This paper introduces the Digital Records Forensics project, a research endeavour located at the University of British Columbia in Canada and aimed at the development of a new science resulting from the integration of digital forensics with diplomatics, archival science, information science and the law of evidence, and of an interdisciplinary graduate degree program, called Digital Records Forensics Studies, directed to professionals working for law enforcement agencies, legal firms, courts, and all kinds of institutions and businesses that require their services. The program anticipates the need for organizations to become “forensically ready,” defined by John Tan as “maximizing the ability of an environment to collect credible digital evidence while minimizing the cost of an incident response” (Tan, 2001). The paper argues the need for such a program, describes its nature and content, and proposes ways of delivering it.

  7. Advanced Methods for the Computer-Aided Diagnosis of Lesions in Digital Mammograms

    National Research Council Canada - National Science Library

    Giger, Maryellen Lissak

    1999-01-01

    The objective of the proposed research is to develop computer-aided diagnosis methods for use in mammography in order to increase the diagnostic accuracy of radiologists and to aid in mammographic screening programs...

  8. Design of a fault tolerant airborne digital computer. Volume 1: Architecture

    Science.gov (United States)

    Wensley, J. H.; Levitt, K. N.; Green, M. W.; Goldberg, J.; Neumann, P. G.

    1973-01-01

This volume is concerned with the architecture of a fault tolerant digital computer for an advanced commercial aircraft. All of the computations of the aircraft, including those presently carried out by analogue techniques, are to be carried out in this digital computer. Among the important qualities of the computer are the following: (1) The capacity is to be matched to the aircraft environment. (2) The reliability is to be selectively matched to the criticality and deadline requirements of each of the computations. (3) The system is to be readily expandable and contractible. (4) The design is to be appropriate to post-1975 technology. Three candidate architectures are discussed and assessed in terms of the above qualities. Of the three candidates, a newly conceived architecture, Software Implemented Fault Tolerance (SIFT), provides the best match to the above qualities. In addition SIFT is particularly simple and believable. The other candidates, Bus Checker System (BUCS), also newly conceived in this project, and the Hopkins multiprocessor are potentially more efficient than SIFT in the use of redundancy, but otherwise are not as attractive.

  9. DNAGEL: a computer program for determining DNA fragment sizes using a small computer equipped with a graphics tablet.

    Science.gov (United States)

    Kieser, T

    1984-01-01

    The program DNAGEL is used to determine the size of DNA fragments run on agarose or polyacrylamide gels. The positions of the bands are read from gel photographs by means of a digitizer. Standard curves are calculated by the method of Southern (1979). The bands, as they are measured, are reproduced on the screen so that erroneous input can be recognized and corrected immediately. Similarly the estimated fragment sizes are printed in a table in the same relative positions as the bands on the gel. This makes it especially easy to relate fragment sizes with the bands on the gel picture. As an additional function the calculated positions of bands can be displayed on the screen. The program DNAGEL is written in APPLESOFT BASIC, suitable for APPLE II computers with 48K memory connected to a monitor, printer and a HOUSTON graphics tablet. PMID:6320102
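The core computation of a program like this — mapping digitized band positions to fragment sizes through a standard curve from the marker lane — can be sketched briefly. Southern's (1979) method fits a reciprocal relation between size and mobility; for brevity the sketch below substitutes the simpler log-linear interpolation, and the marker distances are hypothetical digitizer readings, not values from the paper.

```python
import numpy as np

def fragment_sizes(std_dist, std_bp, sample_dist):
    """Estimate DNA fragment sizes from gel migration distances.

    std_dist / std_bp: migration distances and known sizes (bp) of the
    standard (marker) lane. DNAGEL fits Southern's (1979) reciprocal
    relation; this sketch interpolates log(size) against distance,
    which is adequate over the central range of a gel.
    """
    std_dist = np.asarray(std_dist, dtype=float)
    log_bp = np.log(np.asarray(std_bp, dtype=float))
    # np.interp needs ascending x; distance increases as size decreases
    order = np.argsort(std_dist)
    return np.exp(np.interp(sample_dist, std_dist[order], log_bp[order]))

# hypothetical marker lane (distances in digitizer units) and one sample band
sizes = fragment_sizes([10, 20, 30, 40], [10000, 5000, 2000, 1000], [25])
```

A band migrating midway between the 5000 bp and 2000 bp markers is sized at their geometric mean, reflecting the roughly log-linear size/distance relationship.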

  10. Computer Aided Design in Digital Human Modeling for Human Computer Interaction in Ergonomic Assessment: A Review

    OpenAIRE

    Suman Mukhopadhyay , Sanjib Kumar Das and Tania Chakraborty

    2012-01-01

Research in Human-Computer Interaction (HCI) has been enormously successful in the area of computer-aided ergonomics and human-centric design. A perfect fit for people has always been a target of product design. Designers traditionally used anthropometric dimensions for 3D product design, which created many fitting problems when dealing with the complexities of human body shapes. Computer-aided design (CAD), also known as computer-aided design and drafting (CADD), is the computer technology used fo...

  11. Intelligent physical blocks for introducing computer programming in developing countries

    CSIR Research Space (South Africa)

    Smith, Adrew C

    2007-05-01

    Full Text Available This paper reports on the evaluation of a novel affordable system that incorporates intelligent physical blocks to introduce illiterate children in developing countries to the logical thinking process required in computer programming. Both...

  12. Scaling the Digital Divide: Home Computer Technology and Student Achievement. Working Paper 48

    Science.gov (United States)

    Vigdor, Jacob L.; Ladd, Helen F.

    2010-01-01

    Does differential access to computer technology at home compound the educational disparities between rich and poor? Would a program of government provision of computers to early secondary school students reduce these disparities? The authors use administrative data on North Carolina public school students to corroborate earlier surveys that…

  13. Scaling the Digital Divide: Home Computer Technology and Student Achievement. NBER Working Paper No. 16078

    Science.gov (United States)

    Vigdor, Jacob L.; Ladd, Helen F.

    2010-01-01

    Does differential access to computer technology at home compound the educational disparities between rich and poor? Would a program of government provision of computers to early secondary school students reduce these disparities? We use administrative data on North Carolina public school students to corroborate earlier surveys that document broad…

  14. Advanced wellbore thermal simulator GEOTEMP2. Appendix. Computer program listing

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, R.F.

    1982-02-01

This appendix gives the program listing of GEOTEMP2 with comments and discussion to make the program organization more understandable. It is divided into an introduction and four main blocks of code: main program, program initiation, wellbore flow, and wellbore heat transfer. The purpose and use of each subprogram are discussed and the program listing is given. Flowcharts are included to clarify code organization where needed. GEOTEMP2 was written in FORTRAN IV. Efforts have been made to keep the programming as conventional as possible so that GEOTEMP2 will run without modification on most computers.

  15. 3D computer-aided detection for digital breast tomosynthesis: Comparison with 2D computer-aided detection for digital mammography in the detection of calcifications

    Energy Technology Data Exchange (ETDEWEB)

    Chu, A Jung; Cho, Nariya; Chang, Jung Min; Kim, Won Hwa; Lee, Su Hyun; Song, Sung Eun; Shin, Sung Ui; Moon, Woo Kyung [Dept. of Radiology, Seoul National University College of Medicine, Seoul National University Hospital, Seoul (Korea, Republic of)

    2017-08-15

To retrospectively evaluate the performance of 3D computer-aided detection (CAD) for digital breast tomosynthesis (DBT) in the detection of calcifications in comparison with 2D CAD for digital mammography (DM). Between 2012 and 2013, both 3D CAD and 2D CAD systems were retrospectively applied to the calcification data set including 69 calcifications (31 malignant calcifications and 38 benign calcifications) and the normal data set including 20 bilateral normal mammograms. Each data set consisted of paired DBT and DM images. Sensitivities for the detection of malignant calcifications were calculated from the calcification data set. False-positive mark rates were calculated from the normal data set. They were compared between the two systems. Sensitivities of 3D CAD [100% (31/31) at levels 2, 1, and 0] were the same as those of the 2D CAD system [100% (31/31) at levels 2 and 1] (p = 1.0, respectively). The mean number of false-positive marks per view with 3D CAD was higher than that with 2D CAD at level 2 (0.52 marks ± 0.91 vs. 0.07 marks ± 0.26, p = 0.009). 3D CAD for DBT showed sensitivity equivalent to that of 2D CAD for DM in the detection of calcifications, albeit with a higher false-positive mark rate.

  16. Architecture and Programming Models for High Performance Intensive Computation

    Science.gov (United States)

    2016-06-29

AFRL-AFOSR-VA-TR-2016-0230, Architecture and Programming Models for High Performance Intensive Computation, XiaoMing Li, University of Delaware; final report under grant FA9550-13-1-0213. The project focused on developing an efficient system architecture and software tools for building and running Dynamic Data Driven Application Systems (DDDAS). The foremost

  17. Computing the Line Index of Balance Using Integer Programming Optimisation

    OpenAIRE

    Aref, Samin; Andrew J. Mason; Wilson, Mark C.

    2017-01-01

An important measure of a signed graph is the line index of balance, which has several applications in many fields. However, this graph-theoretic measure was underused for decades because of the inherent complexity of its computation, which is closely related to solving NP-hard graph optimisation problems like MAXCUT. We develop new quadratic and linear programming models to compute the line index of balance exactly. Using the Gurobi integer programming optimisation solver, we evaluate the line...
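For intuition about the quantity being optimised: the line index of balance equals the minimum number of "frustrated" edges over all bipartitions of the vertices (a positive edge is frustrated when it crosses the cut, a negative edge when it does not). The brute-force reference implementation below is a naive sketch for tiny graphs only, not the paper's quadratic/linear programming models, which scale far further.

```python
from itertools import product

def line_index_of_balance(n, signed_edges):
    """Exact line index of balance by brute force over vertex bipartitions.

    signed_edges: iterable of (u, v, sign) with sign in {+1, -1}.
    An edge is 'frustrated' under a bipartition if it is positive and
    crosses the cut, or negative and lies within one side. The line
    index is the minimum frustrated-edge count over all bipartitions.
    Exponential in n -- a reference implementation, not a practical one.
    """
    best = len(signed_edges)
    for colours in product([0, 1], repeat=n - 1):
        c = (0,) + colours              # fix vertex 0's side (symmetry)
        frustrated = sum(
            1 for u, v, s in signed_edges
            if (s > 0) == (c[u] != c[v])
        )
        best = min(best, frustrated)
    return best

# a triangle with one negative edge is unbalanced: one edge must change
print(line_index_of_balance(3, [(0, 1, 1), (1, 2, 1), (0, 2, -1)]))  # → 1
```

A balanced graph (every cycle has an even number of negative edges) yields a line index of 0, since some bipartition leaves no edge frustrated.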

  18. Software survey: VOSviewer, a computer program for bibliometric mapping

    OpenAIRE

    van Eck, Nees Jan; Waltman, Ludo

    2010-01-01

We present VOSviewer, a freely available computer program that we have developed for constructing and viewing bibliometric maps. Unlike most computer programs that are used for bibliometric mapping, VOSviewer pays special attention to the graphical representation of bibliometric maps. The functionality of VOSviewer is especially useful for displaying large bibliometric maps in an easy-to-interpret way. The paper consists of three parts. In the first part, an overview of VOSviewer'...

  19. Newnes circuit calculations pocket book with computer programs

    CERN Document Server

    Davies, Thomas J

    2013-01-01

    Newnes Circuit Calculations Pocket Book: With Computer Programs presents equations, examples, and problems in circuit calculations. The text includes 300 computer programs that help solve the problems presented. The book is comprised of 20 chapters that tackle different aspects of circuit calculation. The coverage of the text includes dc voltage, dc circuits, and network theorems. The book also covers oscillators, phasors, and transformers. The text will be useful to electrical engineers and other professionals whose work involves electronic circuitry.

  20. Contributions to computational stereology and parallel programming

    DEFF Research Database (Denmark)

    Rasmusson, Allan

Stereology is the science of inferring 3D structure from 2D section planes, and it is used in a multitude of disciplines including bioscience, material science and more. At its core is the use of random systematic sampling and geometrical probes, which allow valid statistical inference...... between computer science and stereology, we try to overcome these problems by developing new virtual stereological probes and virtual tissue sections. A concrete result is the development of a new virtual 3D probe, the spatial rotator, which was found to have lower variance than the widely used planar...... simulator and a memory-efficient GPU implementation of connected components labeling. This was furthermore extended to produce signed distance fields and Voronoi diagrams, all with real-time performance. It has been realized during the course of the project that many disciplines within computer science...
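The connected-components step mentioned in this abstract is easy to sketch on the CPU. The classic two-pass union-find labeling below is a simplified stand-in for the thesis's memory-efficient GPU implementation (the 4-connectivity, the path-halving `find`, and the example image are illustrative choices, not details from the thesis).

```python
import numpy as np

def label_components(img):
    """4-connected components of a binary image via two-pass union-find."""
    h, w = img.shape
    parent = {}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    labels = np.zeros((h, w), dtype=int)
    next_label = 1
    for y in range(h):
        for x in range(w):
            if not img[y, x]:
                continue
            left = labels[y, x - 1] if x > 0 and img[y, x - 1] else 0
            up = labels[y - 1, x] if y > 0 and img[y - 1, x] else 0
            if not left and not up:         # new provisional label
                labels[y, x] = next_label
                parent[next_label] = next_label
                next_label += 1
            else:
                lab = min(l for l in (left, up) if l)
                labels[y, x] = lab
                for other in (left, up):    # record label equivalences
                    if other:
                        parent[find(other)] = find(lab)
    # second pass: flatten equivalences to representative labels
    for y in range(h):
        for x in range(w):
            if labels[y, x]:
                labels[y, x] = find(labels[y, x])
    return labels

# a U-shaped region whose arms only meet in the bottom row: one component
labels = label_components(np.array([[1, 0, 1], [1, 1, 1]]))
```

GPU variants replace the sequential union-find with iterative label propagation or atomic merges, but the equivalence-flattening idea is the same.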

  1. 77 FR 38610 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2012-06-28

    ... education at the time of the parent or guardian's death. Beginning July 1, 2010, students who are otherwise... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF EDUCATION Privacy Act of 1974; Computer Matching Program AGENCY: Department of Education. ACTION: Notice--Computer...

  2. Computing, Information, and Communications Technology (CICT) Program Overview

    Science.gov (United States)

    VanDalsem, William R.

    2003-01-01

    The Computing, Information and Communications Technology (CICT) Program's goal is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communication technologies

  3. 78 FR 1275 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2013-01-08

    ...: Notice--computer matching between the Office of Personnel Management and the Social Security... matching program with the Social Security Administration (SSA). DATES: OPM will file a report of the..., as amended, regulates the use of computer matching by Federal agencies when records in a system of...

  4. 77 FR 74518 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2012-12-14

    ...: Notice--computer matching between the Office of Personnel Management and the Social Security... Personnel Management (OPM) is publishing notice of its new computer matching program with the Social... matching by Federal agencies when records in a system of records are matched with other Federal, State, or...

  5. A computer program for analysis of fuelwood harvesting costs

    Science.gov (United States)

    George B. Harpole; Giuseppe Rensi

    1985-01-01

The fuelwood harvesting computer program (FHP) is written in FORTRAN 60 and designed to select a collection of harvest units and systems from among alternatives to satisfy specified energy requirements at the lowest cost per million Btu as recovered in a boiler, or per thousand pounds of H2O evaporative capacity in kiln drying. Computed energy costs are used as a...

  6. Computer Programming with Infants and Juniors.

    Science.gov (United States)

    Hind, Jim

    1984-01-01

    The article argues that even extremely young children can be taught to program microcomputers from their very first contact. A teaching strategy is proposed, having more in common with the teaching of language than with the more traditional didactic-reinforcement cycle commonly employed in the text books. (Author/CL)

  7. Programming Languages for Distributed Computing Systems

    NARCIS (Netherlands)

    Bal, H.E.; Steiner, J.G.; Tanenbaum, A.S.

    1989-01-01

    When distributed systems first appeared, they were programmed in traditional sequential languages, usually with the addition of a few library procedures for sending and receiving messages. As distributed applications became more commonplace and more sophisticated, this ad hoc approach became less

  8. Computer viruses

    Science.gov (United States)

    Denning, Peter J.

    1988-01-01

The worm, Trojan horse, bacterium, and virus are destructive programs that attack information stored in a computer's memory. Virus programs, which propagate by incorporating copies of themselves into other programs, are a growing menace in the late-1980s world of unprotected, networked workstations and personal computers. Limited immunity is offered by memory protection hardware, digitally authenticated object programs, and antibody programs that kill specific viruses. Additional immunity can be gained from the practice of digital hygiene, primarily the refusal to use software from untrusted sources. Full immunity requires attention in a social dimension, the accountability of programmers.

  9. Software usage in unsupervised digital doorway computing environments in disadvantaged South African communities: Focusing on youthful users

    CSIR Research Space (South Africa)

    Gush, K

    2011-01-01

    Digital Doorways provide computing infrastructure in low-income communities in South Africa. The unsupervised DD terminals offer various software applications, from entertainment through educational resources to research material, encouraging...

  10. EZLP: An Interactive Computer Program for Solving Linear Programming Problems. Final Report.

    Science.gov (United States)

    Jarvis, John J.; And Others

    Designed for student use in solving linear programming problems, the interactive computer program described (EZLP) permits the student to input the linear programming model in exactly the same manner in which it would be written on paper. This report includes a brief review of the development of EZLP; narrative descriptions of program features,…

  11. Design and implementation of the modified signed digit multiplication routine on a ternary optical computer.

    Science.gov (United States)

    Xu, Qun; Wang, Xianchao; Xu, Chao

    2017-06-01

    Multiplication on traditional electronic computers suffers from limited calculating accuracy and long computation delays. To overcome these problems, the modified signed digit (MSD) multiplication routine is established based on the MSD system and the carry-free adder. Also, its parallel algorithm and optimization techniques are studied in detail. With the help of a ternary optical computer's characteristics, the structured data processor is designed especially for the multiplication routine. Several ternary optical operators are constructed to perform M transformations and summations in parallel, which accelerates the iterative process of multiplication. In particular, the routine allocates data bits of the ternary optical processor based on the digits of the multiplication input, so the accuracy of the calculation results can always satisfy the users. Finally, the routine is verified by simulation experiments, and the results are in full compliance with expectations. Compared with an electronic computer, the MSD multiplication routine is not only good at dealing with large-value data and high-precision arithmetic, but also maintains lower power consumption and shorter calculation delays.
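    The routine above targets a ternary optical processor, but its digit-recoding core can be illustrated in ordinary software. The sketch below is an illustration of the MSD idea only, not the authors' routine: integers are recoded into a canonical MSD form (the non-adjacent form, with digits in {-1, 0, 1}) and multiplied by accumulating one shifted partial product per nonzero digit.

```python
def to_naf(n):
    """Recode an integer into non-adjacent form, a canonical modified
    signed-digit (MSD) representation with digits in {-1, 0, 1},
    least-significant digit first."""
    digits = []
    while n != 0:
        if n % 2:
            d = 2 - (n % 4)     # choose d in {-1, 1} so that (n - d) % 4 == 0
            n -= d
        else:
            d = 0
        digits.append(d)
        n //= 2
    return digits

def from_digits(digits):
    """Evaluate an MSD digit list back to an integer."""
    return sum(d << i for i, d in enumerate(digits))

def msd_multiply(a, b):
    """Multiply via one shifted partial product per nonzero MSD digit of b.
    On an MSD machine the partial products would be combined by the
    carry-free adder; here they are summed as plain integers."""
    total = 0
    for i, d in enumerate(to_naf(b)):
        if d:
            total += d * (a << i)
    return total
```

Because at most half of the NAF digits are nonzero, fewer partial products arise on average than in plain binary shift-and-add; the carry-free combination of those partial products is the step the optical hardware parallelizes.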

  12. Bridging the digital divide through the integration of computer and information technology in science education: An action research study

    Science.gov (United States)

    Brown, Gail Laverne

    The presence of a digital divide, computer and information technology integration effectiveness, and barriers to continued usage of computer and information technology were investigated. Thirty-four African American and Caucasian American students (17 males and 17 females) in grades 9-11 from 2 Georgia high school science classes were exposed to 30 hours of hands-on computer and information technology skills. The purpose of the exposure was to improve students' computer and information technology skills. Pre-study and post-study skills surveys, and structured interviews were used to compare race, gender, income, grade-level, and age differences with respect to computer usage. A paired t-test and McNemar test determined mean differences between student pre-study and post-study perceived skills levels. The results were consistent with findings of the National Telecommunications and Information Administration (2000) that indicated the presence of a digital divide and digital inclusion. Caucasian American participants were found to have more at-home computer and Internet access than African American participants, indicating that there is a digital divide by ethnicity. Caucasian American females were found to have more computer and Internet access, which was an indication of digital inclusion. Sophomores had more at-home computer access and Internet access than other levels, indicating digital inclusion. Students receiving regular meals had more computer and Internet access than students receiving free/reduced meals. Older students had more computer and Internet access than younger students. African American males had been using computer and information technology the longest, which is an indication of inclusion. The paired t-test and McNemar test revealed significant perceived student increases in all skills levels. Interviews did not reveal any barriers to continued usage of the computer and information technology skills.

  13. [Guided and computer-assisted implant surgery and prosthetic: The continuous digital workflow].

    Science.gov (United States)

    Pascual, D; Vaysse, J

    2016-02-01

    New continuous digital workflow protocols of guided and computer-assisted implant surgery improve accuracy of implant positioning. The design of the future prosthesis is based on the available prosthetic space, gingival height and occlusal relationship with the opposing and adjacent teeth. The implant position and length depend on volume, density and bone quality, gingival height, tooth-implant and implant-implant distances, implant parallelism, axis and type of the future prosthesis. The crown modeled on the software will therefore serve as a guide to the future implant axis and not the reverse. The guide is made by 3D printing. The software determines the surgical protocol with the drilling sequences. The unitary or plural prosthesis, modeled on the software and built before surgery, is loaded directly after implant placing, if needed. These protocols allow for a full continuity of the digital workflow. The software provides the surgeon and the dental technician total freedom in the design of the prosthetic-surgical guide and the position of the implants. The prosthetic project, occlusal and aesthetic, taking the bony and surgical constraints into account, is optimized. The implant surgery is simplified and becomes less "stressful" for the patient and the surgeon. Guided and computer-assisted surgery with continuous digital workflow is becoming the technique of choice to improve the accuracy and quality of implant rehabilitation. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  14. From Clover to computer. Towards programmed anaesthesia?

    Science.gov (United States)

    Mapleson, W W

    1979-02-01

    The control of depth of anaesthesia has been viewed as a control-system problem the solution of which can involve both feedback and feedforward techniques. The nature of the problem in Clover's day and the solutions he found have been examined. A similar analysis has been made in respect of the modern anaesthetist. Finally, the way in which computers may aid the anaesthetist in his task has been illustrated by reference to various attempts reported from around the world and, in particular, by describing the development in Cardiff of a system which should produce, in the brain of the patient, any tension of an inhaled anaesthetic which the anaesthetist chooses to specify.

  15. TimeSet: A computer program that accesses five atomic time services on two continents

    Science.gov (United States)

    Petrakis, P. L.

    1993-01-01

    TimeSet is a shareware program for accessing digital time services by telephone. At its initial release, it was capable of capturing time signals only from the U.S. Naval Observatory to set a computer's clock. Later the ability to synchronize with the National Institute of Standards and Technology was added. Now, in Version 7.10, TimeSet is able to access three additional telephone time services in Europe - in Sweden, Austria, and Italy - making a total of five official services addressable by the program. A companion program, TimeGen, allows yet another source of telephone time data strings for callers equipped with TimeSet version 7.10. TimeGen synthesizes UTC time data strings in the Naval Observatory's format from an accurately set and maintained DOS computer clock, and transmits them to callers. This allows an unlimited number of 'freelance' time generating stations to be created. Timesetting from TimeGen is made feasible by the advent of Becker's RighTime, a shareware program that learns the drift characteristics of a computer's clock and continuously applies a correction to keep it accurate, and also brings 0.01-second resolution to the DOS clock. With clock regulation by RighTime and periodic update calls by the TimeGen station to an official time source via TimeSet, TimeGen offers the same degree of accuracy within the resolution of the computer clock as any official atomic time source.

  16. MC6800 cross-assembler for the PDP-8/E digital computer. [M68CA

    Energy Technology Data Exchange (ETDEWEB)

    Sand, R.J.

    1978-08-01

    A cross-assembler was developed to assemble Motorola MC6800 microprocessor programs on a Digital Equipment Corporation PDP-8/E minicomputer. This cross-assembler runs in 8K of core under the OS/8 operating system. The advantages of using the cross-assembler are the large user symbol table and the convenience and speed of program development. User's instructions for the cross-assembler are given. The design of the cross-assembler and examples of its use are described. 12 figures.

  17. MULGRES: a computer program for stepwise multiple regression analysis

    Science.gov (United States)

    A. Jeff Martin

    1971-01-01

    MULGRES is a computer program source deck that is designed for multiple regression analysis employing the technique of stepwise deletion in the search for most significant variables. The features of the program, along with inputs and outputs, are briefly described, with a note on machine compatibility.
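    As a hedged illustration of what such a stepwise-deletion program does (a sketch of backward elimination, not MULGRES's FORTRAN code; a simple residual-sum-of-squares threshold stands in for its actual significance test), consider:

```python
def solve(A, b):
    """Solve A x = b by Gauss-Jordan elimination with partial pivoting
    (A is assumed nonsingular for this sketch)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        p = M[col][col]
        M[col] = [v / p for v in M[col]]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col]
                M[r] = [v - f * w for v, w in zip(M[r], M[col])]
    return [M[i][n] for i in range(n)]

def rss(X, y, cols):
    """Residual sum of squares of an OLS fit on the given predictor columns."""
    Xs = [[row[c] for c in cols] for row in X]
    k = len(cols)
    XtX = [[sum(r[i] * r[j] for r in Xs) for j in range(k)] for i in range(k)]
    Xty = [sum(r[i] * yi for r, yi in zip(Xs, y)) for i in range(k)]
    beta = solve(XtX, Xty)
    return sum((yi - sum(b * x for b, x in zip(beta, r))) ** 2
               for r, yi in zip(Xs, y))

def stepwise_delete(X, y, cols, tol=1e-6):
    """Backward elimination: repeatedly drop the predictor whose removal
    increases the RSS the least, while that increase stays below tol."""
    cols = list(cols)
    while len(cols) > 1:
        base = rss(X, y, cols)
        cost, worst = min((rss(X, y, [c for c in cols if c != d]) - base, d)
                          for d in cols)
        if cost > tol:
            break
        cols.remove(worst)
    return cols
```

On data where y depends on only some of the predictors, the loop prunes the irrelevant columns first and stops once every surviving variable carries real explanatory weight.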

  18. A Successful Course of Study in Computer Programming

    Science.gov (United States)

    Seeger, David H.

    1977-01-01

    Three keys to the successful development of the program of the computer programming department of the Technical Institute of Oklahoma State University are discussed: Community involvement, faculty/administration commitment to the basic principles of technical career education, and availability of appropriate equipment for student use. (HD)

  19. A Research Program in Computer Technology

    Science.gov (United States)

    1979-01-01

    [Abstract not indexed; the captured text consists of fragments of the report's reference list and a staff roster for its ARPANET TENEX service section. Recoverable citations: Donzeau-Gouge, V., G. Kahn, and B. Lang, A Complete Machine-Checked Definition of a Simple Programming Language Using Denotational Semantics, IRIA Laboria, Technical Report 330, October 1978; Donzeau-Gouge, V., G. Kahn, and B. Lang, Formal Definition of Ada, Honeywell.]

  20. Programs=data=first-class citizens in a computational world.

    Science.gov (United States)

    Jones, Neil D; Simonsen, Jakob Grue

    2012-07-28

    From a programming perspective, Alan Turing's epochal 1936 paper on computable functions introduced several new concepts, including what is today known as self-interpreters and programs as data, and invented a great many now-common programming techniques. We begin by reviewing Turing's contribution from a programming perspective; and then systematize and mention some of the many ways that later developments in models of computation (MOCs) have interacted with computability theory and programming language research. Next, we describe the 'blob' MOC: a recent stored-program computational model without pointers. In the blob model, programs are truly first-class citizens, capable of being automatically compiled, or interpreted, or executed directly. Further, the blob model appears closer to being physically realizable than earlier computation models. In part, this is due to strong finiteness owing to early binding in the program; and a strong adjacency property: the active instruction is always adjacent to the piece of data on which it operates. The model is Turing complete in a strong sense: a universal interpretation algorithm exists that is able to run any program in a natural way and without arcane data encodings. Next, some of the best known among the numerous existing MOCs are described, and we develop a list of traits an 'ideal' MOC should possess from our perspective. We make no attempt to consider all models put forth since Turing's 1936 paper, and the selection of models covered concerns only models with discrete, atomic computation steps. The next step is to classify the selected models by qualitative rather than quantitative features. Finally, we describe how the blob model differs from an 'ideal' MOC, and identify some natural next steps to achieve such a model.

  1. Integrating computer programs for engineering analysis and design

    Science.gov (United States)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  2. A Tangible Programming Tool for Children to Cultivate Computational Thinking

    Science.gov (United States)

    Wang, Danli; Liu, Zhen

    2014-01-01

    Game and creation are activities which have good potential for computational thinking skills. In this paper we present T-Maze, an economical tangible programming tool for children aged 5–9 to build computer programs in maze games by placing wooden blocks. Through the use of computer vision technology, T-Maze provides a live programming interface with real-time graphical and voice feedback. We conducted a user study with 7 children using T-Maze to play two levels of maze-escape games and create their own mazes. The results show that T-Maze is not only easy to use, but also has the potential to help children cultivate computational thinking like abstraction, problem decomposition, and creativity. PMID:24719575

  3. On Computational Power of Quantum Read-Once Branching Programs

    Directory of Open Access Journals (Sweden)

    Farid Ablayev

    2011-03-01

    In this paper we review our current results concerning the computational power of quantum read-once branching programs. First of all, based on the circuit presentation of quantum branching programs and our variant of the quantum fingerprinting technique, we show that any Boolean function with a linear polynomial presentation can be computed by a quantum read-once branching program using a relatively small (usually logarithmic in the size of the input) number of qubits. Then we show that the described class of Boolean functions is closed under polynomial projections.

  4. COED Transactions, Vol. X, No. 5, May 1978. STAGEF, A Program to Compute the Internal Variables of an Operating Distillation Column.

    Science.gov (United States)

    Marcovitz, Alan B., Ed.

    A digital computer program, STAGEF, designed for use with the distillation experiments in a typical undergraduate Chemical Engineering laboratory in Unit Operations is explained. The program enables the student to determine the rate of liquid overflow and vapor boil-up which leaves each tray within the distillation column. The student may also…

  5. Standard practice for digital imaging and communication nondestructive evaluation (DICONDE) for computed radiography (CR) test methods

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice facilitates the interoperability of computed radiography (CR) imaging and data acquisition equipment by specifying image data transfer and archival storage methods in commonly accepted terms. This practice is intended to be used in conjunction with Practice E2339 on Digital Imaging and Communication in Nondestructive Evaluation (DICONDE). Practice E2339 defines an industrial adaptation of the NEMA Standards Publication titled Digital Imaging and Communications in Medicine (DICOM, see http://medical.nema.org), an international standard for image data acquisition, review, storage and archival storage. The goal of Practice E2339, commonly referred to as DICONDE, is to provide a standard that facilitates the display and analysis of NDE results on any system conforming to the DICONDE standard. Toward that end, Practice E2339 provides a data dictionary and a set of information modules that are applicable to all NDE modalities. This practice supplements Practice E2339 by providing information objec...

  6. The prevalence of encoded digital trace evidence in the nonfile space of computer media.

    Science.gov (United States)

    Garfinkel, Simson L

    2014-09-01

    Forensically significant digital trace evidence is frequently present in sectors of digital media not associated with allocated or deleted files. Modern digital forensic tools generally do not decompress such data unless a specific file with a recognized file type is first identified, potentially resulting in missed evidence. Email addresses are encoded differently in different file formats. As a result, trace evidence can be categorized as Plain in File (PF), Encoded in File (EF), Plain Not in File (PNF), or Encoded Not in File (ENF). The tool bulk_extractor finds all of these formats, but other forensic tools do not. A study of 961 storage devices purchased on the secondary market shows that 474 contained encoded email addresses that were not in files (ENF). Different encoding formats are the result of different application programs that processed different kinds of digital trace evidence. Specific encoding formats explored include BASE64, GZIP, PDF, HIBER, and ZIP. Published 2014. This article is a U.S. Government work and is in the public domain in the USA. Journal of Forensic Sciences published by Wiley Periodicals, Inc. on behalf of American Academy of Forensic Sciences.
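    As a toy illustration of the Encoded Not in File (ENF) category (a hedged sketch, not bulk_extractor's implementation), the following scans a raw buffer for email addresses both in plain text and inside embedded GZIP streams located by their magic bytes:

```python
import re
import zlib

EMAIL = re.compile(rb"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def carve_emails(buf):
    """Return (plain, encoded): addresses visible directly in the buffer,
    and addresses only visible after inflating embedded gzip streams."""
    plain = set(EMAIL.findall(buf))
    encoded = set()
    pos = 0
    while True:
        pos = buf.find(b"\x1f\x8b\x08", pos)        # gzip magic + deflate method
        if pos < 0:
            break
        try:
            inflater = zlib.decompressobj(wbits=31)  # 31 = expect gzip wrapper
            data = inflater.decompress(buf[pos:])    # trailing bytes tolerated
            encoded |= set(EMAIL.findall(data)) - plain
        except zlib.error:
            pass                                     # false-positive magic; skip
        pos += 1
    return plain, encoded
```

A production carver, as the study describes, would also handle BASE64 runs, ZIP local-file entries, hibernation-file (HIBER) compression, and PDF streams.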

  7. Injecting Artificial Memory Errors Into a Running Computer Program

    Science.gov (United States)

    Bornstein, Benjamin J.; Granat, Robert A.; Wagstaff, Kiri L.

    2008-01-01

    Single-event upsets (SEUs) or bitflips are computer memory errors caused by radiation. BITFLIPS (Basic Instrumentation Tool for Fault Localized Injection of Probabilistic SEUs) is a computer program that deliberately injects SEUs into another computer program, while the latter is running, for the purpose of evaluating the fault tolerance of that program. BITFLIPS was written as a plug-in extension of the open-source Valgrind debugging and profiling software. BITFLIPS can inject SEUs into any program that can be run on the Linux operating system, without needing to modify the program's source code. Further, if access to the original program source code is available, BITFLIPS offers fine-grained control over exactly when and which areas of memory (as specified via program variables) will be subjected to SEUs. The rate of injection of SEUs is controlled by specifying either a fault probability or a fault rate based on memory size and radiation exposure time, in units of SEUs per byte per second. BITFLIPS can also log each SEU that it injects and, if program source code is available, report the magnitude of effect of the SEU on a floating-point value or other program variable.
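    BITFLIPS operates on a live process through Valgrind; the standalone sketch below (hypothetical code, not part of BITFLIPS) only illustrates the injection policy itself: flip each bit of a memory image independently with a given fault probability and log every injected SEU.

```python
import random

def inject_seus(image, p_bit, seed=None):
    """Return a corrupted copy of a memory image plus a log of injected
    single-event upsets, flipping each bit independently with
    probability p_bit."""
    rng = random.Random(seed)          # seeded for reproducible campaigns
    out = bytearray(image)
    log = []
    for i in range(len(out)):
        for bit in range(8):
            if rng.random() < p_bit:
                out[i] ^= 1 << bit
                log.append((i, bit))   # record the location of each SEU
    return out, log
```

In a real campaign, p_bit would be derived from memory size and radiation exposure time, as the abstract notes, rather than chosen arbitrarily.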

  8. SEISPRHO: An interactive computer program for processing and interpretation of high-resolution seismic reflection profiles

    Science.gov (United States)

    Gasperini, Luca; Stanghellini, Giuseppe

    2009-07-01

    SEISPRHO is an interactive computer program for processing and interpreting high-resolution seismic reflection profiles developed using the Delphi/Kylix multiplatform programming environment. For this reason, it is available under Windows™ and Linux™ operating systems. The program allows the users to handle SEG-Y data files (and other non-standard formats) carrying out a processing sequence over the data to obtain, as a final result, bitmap images of seismic sections. Some basic algorithms are implemented, including filtering and deconvolution. However, the main feature of SEISPRHO is its interactive graphic interface, which provides the user with several tools for interpreting the data, such as reflector picking and map digitizing. Moreover, the program allows importing and geo-referencing maps and seismic profiles in the form of digital images. Trace-by-trace analysis of seismic signal and sea-bottom reflectivity is also implemented, as well as other special functions such as compilation of time-slice maps from close-spaced grids of seismic lines. SEISPRHO is distributed as public domain software for non-commercial purposes by the Marine Geology division of the Istituto di Scienze Marine (ISMAR-CNR). This paper is an introduction to the program and a preliminary guide to the users.
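    The abstract does not detail SEISPRHO's file reader, so as a minimal hedged illustration of what "handling SEG-Y data files" entails, the sketch below parses three fields of the 400-byte binary header that follows the 3200-byte textual header (byte offsets per the SEG-Y rev. 1 standard, not SEISPRHO's code):

```python
import struct

def read_segy_binary_header(raw):
    """Extract sample interval, samples per trace, and data format code:
    big-endian 16-bit integers at file byte offsets 3217-3218,
    3221-3222, and 3225-3226 (SEG-Y rev. 1)."""
    dt_us, = struct.unpack(">H", raw[3216:3218])
    nsamp, = struct.unpack(">H", raw[3220:3222])
    fmt, = struct.unpack(">H", raw[3224:3226])
    return {"sample_interval_us": dt_us,
            "samples_per_trace": nsamp,
            "format_code": fmt}
```

Trace data then follows as repeated 240-byte trace headers plus sample blocks, whose layout the format code determines.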

  9. Computational generation of high-quality digital halftones (grey/colour patterns

    Directory of Open Access Journals (Sweden)

    Alfonsas Misevičius

    2013-09-01

    The purpose of this paper is to describe the computational algorithmic generation of high-quality digital halftones (grey/colour patterns). At the beginning, the formal model for generation of digital halftones, the so-called grey pattern problem (GPP), is introduced. Then, the heuristic algorithm for the solution of the grey pattern problem is discussed. Although the algorithm employed does not guarantee the optimality of the solutions found, superior-quality, near-optimal (and in some cases probably optimal) solutions can be achieved within reasonable computation time. Further, we provide the results of extensive computational experiments with the newly proposed, extra-large instance (data set) of the GPP, which is the main contribution of this work. As confirmation of the quality of the solutions produced, we also give visual representations of several fine-looking halftone patterns, and the reader can judge the perfection of the images obtained.
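    The paper's GPP heuristic itself is not reproduced here; as a baseline against which such optimized patterns are judged, classic (non-optimizing) ordered dithering can be sketched: threshold the grey level against a tiled Bayer matrix.

```python
def bayer(n):
    """Build the 2**n x 2**n Bayer threshold matrix by the standard
    recurrence M' = [[4M, 4M+2], [4M+3, 4M+1]]."""
    m = [[0]]
    for _ in range(n):
        top = [[4 * v for v in row] + [4 * v + 2 for v in row] for row in m]
        bottom = [[4 * v + 3 for v in row] + [4 * v + 1 for v in row] for row in m]
        m = top + bottom
    return m

def halftone(grey, size, order=2):
    """Render a constant grey level in [0, 1] as a size x size binary
    pattern by thresholding against the tiled Bayer matrix."""
    m = bayer(order)
    k = len(m)
    levels = k * k
    return [[1 if grey * levels > m[r % k][c % k] else 0
             for c in range(size)]
            for r in range(size)]
```

Ordered dithering spreads the black cells evenly but mechanically; the GPP formulation instead searches combinatorially for the cell arrangement that minimizes a non-uniformity objective.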

  10. Success Factors and Strategic Planning: Rebuilding an Academic Library Digitization Program

    Directory of Open Access Journals (Sweden)

    Cory Lampert

    2009-09-01

    This paper discusses a dual approach of case study and research survey to investigate the complex factors in sustaining academic library digitization programs. The case study involves the background of the University of Nevada, Las Vegas (UNLV) Libraries’ digitization program and elaborates on the authors’ efforts to gain staff support for this program. A related survey was administered to all Association of Research Libraries (ARL) members, seeking to collect baseline data on their digital collections, understand their respective administrative frameworks, and gather feedback on both negative obstacles and positive inputs affecting their success. Results from the survey, combined with the authors’ local experience, point to several potential success factors including staff skill sets, funding, and strategic planning.

  11. [Interest of computer-based cognitive behavioral stress management. Feasability of the Seren@ctif program].

    Science.gov (United States)

    Servant, D; Rougegrez, L; Barasino, O; Demarty, A-L; Duhamel, A; Vaiva, G

    2016-10-01

    Cognitive-behavioural stress management programs have been studied in many countries. Many reports have shown beyond a doubt their efficacy in reducing perceived stress and anxiety symptoms and in improving patients' quality of life. Considering the very large number of people who could benefit from such programs but are unable to access them, self-help programs have been offered. First presented as books (bibliotherapy), these programs were then enriched by computing and digital supports. Regrettably, stress management programs based on cognitive behavioural therapy (CBT), whether delivered face-to-face or on digital supports, have been little evaluated in France. To our knowledge, the Seren@ctif program is the first French-language self-help stress management program offered on a digital support. We conducted a feasibility study of this program on 10 patients meeting the diagnosis of adjustment disorder with anxiety according to the DSM-IV criteria. The program includes 5 weekly sessions that the patient follows in our unit from a web site. The patient benefits from minimal contact with a medical staff member before and after every session. From the first session, a USB key is supplied to the patient containing videos, audio files, a self-help book portfolio in the form of an e-guide, and log books with the exercises to be performed between the 5 sessions of the program. The patient is encouraged to practice 20 minutes of exercises 5 or 6 days per week. The program's feasibility was assessed with a standard satisfaction scale. Anxiety symptomatology was quantified with the Spielberger State-Trait Anxiety Inventory (STAI-Y-S). After the scheduled 5 weeks, good results were found in terms of acceptability and attractiveness. The average score on the satisfaction survey was at least 4 out of 5 for each item. The mean score on the STAI-State decreased from 53.4 (SD: 8.29) to 44.2 (SD: 7.73) following the

  12. Marketing digital, de la publicidad online a la publicidad programática = Digital marketing, from online advertising to programmatic advertising

    OpenAIRE

    Panera Gallego, Sergio

    2017-01-01

    Digital marketing is continuously advancing and evolving by leaps and bounds, developing all kinds of new techniques, technologies and formats, or simply improving what already exists to date. In this evolutionary spiral a new way of doing advertising has emerged, known as programmatic advertising, which is the focus of study throughout this work. As a consequence of the appearance of this new and technological advertising system, there have also...

  13. Material Programming: a Design Practice for Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna; Boer, Laurens; Tsaknaki, Vasiliki

    2016-01-01

    In this paper we propose the notion of material programming as a future design practice for computational composites. Material programming would be a way for the interaction designer to better explore the dynamic potential of computational materials at hand and, through that familiarity, be able to compose more sophisticated and complex temporal forms in their designs. The contribution of the paper is an analysis of qualities that we find a material programming practice would and should support: designs grounded in material properties and experiences, embodied programming practice, real-time on-site explorations, and finally a reasonable level of complexity in couplings between input and output. We propose material programming knowing that the technology and materials are not entirely ready to support this practice yet; however, we are certain they will be and that the interaction design community...

  14. An open-source computational and data resource to analyze digital maps of immunopeptidomes

    Energy Technology Data Exchange (ETDEWEB)

    Caron, Etienne; Espona, Lucia; Kowalewski, Daniel J.; Schuster, Heiko; Ternette, Nicola; Alpizar, Adan; Schittenhelm, Ralf B.; Ramarathinam, Sri Harsha; Lindestam-Arlehamn, Cecilia S.; Koh, Ching Chiek; Gillet, Ludovic; Rabsteyn, Armin; Navarro, Pedro; Kim, Sangtae; Lam, Henry; Sturm, Theo; Marcilla, Miguel; Sette, Alessandro; Campbell, David; Deutsch, Eric W.; Moritz, Robert L.; Purcell, Anthony; Rammensee, Hans-Georg; Stevanovic, Stevan; Aebersold, Ruedi

    2015-07-08

    We present a novel proteomics-based workflow and an open source data and computational resource for reproducibly identifying and quantifying HLA-associated peptides at high throughput. The provided resources support the generation of HLA allele-specific peptide assay libraries consisting of consensus fragment ion spectra and the analysis of quantitative digital maps of HLA peptidomes generated by SWATH mass spectrometry (MS). This is the first community-based study towards the development of a robust platform for the reproducible and quantitative measurement of HLA peptidomes, an essential step towards the design of efficient immunotherapies.

  15. COMPUTER-AIDED SYNTHESIS OF DIGITAL CONTROLLERS BASED ON THE DISCRETE TRANSFER FUNCTION OF THE CONTROL OBJECTS

    Directory of Open Access Journals (Sweden)

    A. G. Stryzhniou

    2013-01-01

    The paper presents discretization methods for control objects' transfer functions, which are used in MATLAB, including zero- and first-order extrapolators, bilinear Tustin approximation and Tustin approximation with frequency prewarping. The MATLAB program which automates the process of determining discrete transfer functions of various control objects from their continuous models and calculates the digital controllers is developed. Discrete transfer functions and digital controllers for control objects of the second and third order are obtained programmatically. Digital modeling is applied to verify the operability of the control objects and the automatic control systems with different digital controllers.
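    As a worked instance of the Tustin (bilinear) approximation that such a program automates (a hand-derived sketch for a first-order lag, not the paper's MATLAB code), substituting s -> (2/T)(z - 1)/(z + 1) into G(s) = 1/(tau*s + 1) gives the discrete filter below:

```python
def tustin_first_order(tau, T):
    """Discretize G(s) = 1/(tau*s + 1) by the bilinear (Tustin) rule.
    With k = 2*tau/T, H(z) = (b0 + b1*z^-1) / (1 + a1*z^-1), where
    b0 = b1 = 1/(k + 1) and a1 = (1 - k)/(k + 1)."""
    k = 2.0 * tau / T
    b0 = b1 = 1.0 / (k + 1.0)
    a1 = (1.0 - k) / (k + 1.0)
    return (b0, b1), a1

def step_response(tau, T, n):
    """Simulate the unit-step response of the discretized filter."""
    (b0, b1), a1 = tustin_first_order(tau, T)
    y = []
    u_prev = y_prev = 0.0
    for _ in range(n):
        y_k = b0 * 1.0 + b1 * u_prev - a1 * y_prev
        y.append(y_k)
        u_prev, y_prev = 1.0, y_k
    return y
```

The discrete filter keeps unit DC gain, so its step response tracks the continuous response 1 - exp(-t/tau); frequency prewarping would additionally pin the match at one chosen frequency.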

  16. On the Development of Digital Forensics Curriculum

    Directory of Open Access Journals (Sweden)

    Manghui Tu

    2012-09-01

    Computer Crime and computer related incidents continue their prevalence and frequency and result in loss of billions of dollars. To fight against those crimes and frauds, it is urgent to develop digital forensics education programs to train a suitable workforce to efficiently and effectively investigate crimes and frauds. However, there is no standard to guide the design of digital forensics curriculum for an academic program. In this research, we investigate the research works on digital forensics curriculum design and existing education programs. Both digital forensics educators and practitioners were surveyed and the results are analyzed to determine what industry and law enforcement need. Based on the survey results and what the industry certificate programs cover, we identified topics that are desired to be covered in digital forensics courses. Finally, we propose six digital forensics courses and their topics that can be offered in both undergraduate and graduate digital forensics programs.

  17. Comparison of Micro-Computed Tomography and Digital Intraoral Radiography to Determine the Accuracy of Digital Radiographic Measurements of Mandibular Molar Teeth in Dogs.

    Science.gov (United States)

    Marron, Louise; Rawlinson, Jennifer; McGilvray, Kirk; Prytherch, Ben

    2017-12-01

    The purpose of this study was to compare root and root canal width measurements between digital intraoral radiography (IOR) and micro-computed tomography (μCT). The accuracy of IOR measurements of canine mandibular molars was scrutinized to assess feasibility of developing a model to estimate animal age based on dentinal thickness. Thirty-nine canine mandibular first molars were imaged using μCT and IOR. For each tooth, the root and root canal width of the mesial and distal roots were measured by a single observer at 3 marked sites on μCT and IOR. Two different software programs were used to measure the radiographs. The radiograph measurements were compared to each other and to the μCT measurements. The μCT images were considered the anatomic reference standard for structural representation. The data collected demonstrated IOR bias and variability throughout all measurement sites, with some sites being more affected than others. Neither IOR system produced unbiased measurements that closely reflected the μCT measurements consistently. The overall lack of agreement between measurements demonstrated the difficulties in developing a standardized protocol for measuring root and root canal width for the first molar teeth in dogs. Developing a protocol to accurately measure and compare μCT and IOR measurements is challenging. Designing a measurement system that would allow for universal application to age dogs would require continued research utilizing a standardized approach to overcome the limitations identified in this article.

  18. On the Development of Digital Forensics Curriculum

    OpenAIRE

    Manghui Tu; Dianxiang Xu; Cristian Balan; Kyle Cronin

    2012-01-01

    Computer crime and computer-related incidents continue to grow in prevalence and frequency, resulting in losses of billions of dollars. To fight these crimes and frauds, it is urgent to develop digital forensics education programs that train a workforce able to investigate crimes and frauds efficiently and effectively. However, there is no standard to guide the design of a digital forensics curriculum for an academic program. In this research, we investigate the research works on digital for...

  19. Web Program for Development of GUIs for Cluster Computers

    Science.gov (United States)

    Czikmantory, Akos; Cwik, Thomas; Klimeck, Gerhard; Hua, Hook; Oyafuso, Fabiano; Vinyard, Edward

    2003-01-01

    WIGLAF (a Web Interface Generator and Legacy Application Facade) is a computer program that provides a Web-based, distributed, graphical-user-interface (GUI) framework that can be adapted to any of a broad range of application programs, written in any programming language, that are executed remotely on any cluster computer system. WIGLAF enables the rapid development of a GUI for controlling and monitoring a specific application program running on the cluster and for transferring data to and from the application program. The only prerequisite for the execution of WIGLAF is a Web-browser program on a user's personal computer connected with the cluster via the Internet. WIGLAF has a client/server architecture: The server component is executed on the cluster system, where it controls the application program and serves data to the client component. The client component is an applet that runs in the Web browser. WIGLAF utilizes the Extensible Markup Language to hold all data associated with the application software, Java to enable platform-independent execution on the cluster system and the display of a GUI generator through the browser, and the Java Remote Method Invocation software package to provide simple, effective client/server networking.
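The record above describes WIGLAF's approach of holding all application data in XML and generating a GUI from it. The following is a minimal, hypothetical sketch of that idea in Python (the actual system uses Java and Java RMI): an invented XML description of a legacy application's parameters is parsed and turned into a simple HTML form of the kind a browser client could render. All element names, attributes, and the example application are illustrative assumptions, not WIGLAF's real schema.

```python
# Hypothetical sketch of an XML-driven GUI generator: an XML document
# describes a legacy application's parameters, and the generator turns it
# into a browser-renderable form (plain HTML here). The schema is invented.
import xml.etree.ElementTree as ET

APP_XML = """
<application name="solver">
  <param name="iterations" type="int" default="100"/>
  <param name="tolerance" type="float" default="1e-6"/>
</application>
"""

def parse_params(xml_text):
    """Return a list of (name, type, default) tuples from the description."""
    root = ET.fromstring(xml_text)
    return [(p.get("name"), p.get("type"), p.get("default"))
            for p in root.findall("param")]

def render_form(xml_text):
    """Emit one HTML <input> per parameter, labelled by name."""
    rows = [f'<label>{n}<input name="{n}" value="{d}"></label>'
            for n, t, d in parse_params(xml_text)]
    return "<form>" + "".join(rows) + "</form>"

print(render_form(APP_XML))
```

Because the GUI description lives in data rather than code, the same generator serves any application program, which is the portability property the record emphasizes.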

  20. An introduction to NASA's advanced computing program: Integrated computing systems in advanced multichip modules

    Science.gov (United States)

    Fang, Wai-Chi; Alkalai, Leon

    1996-01-01

    Recent changes within NASA's space exploration program favor the design, implementation, and operation of low cost, lightweight, small and micro spacecraft with multiple launches per year. In order to meet the future needs of these missions with regard to the use of spacecraft microelectronics, NASA's advanced flight computing (AFC) program is currently considering industrial cooperation and advanced packaging architectures. In relation to this, the AFC program is reviewed, considering the design and implementation of NASA's AFC multichip module.

  1. Application of computer intensive data analysis methods to the analysis of digital images and spatial data

    DEFF Research Database (Denmark)

    Windfeld, Kristian

    1992-01-01

    Computer-intensive methods for data analysis in a traditional setting has developed rapidly in the last decade. The application of and adaption of some of these methods to the analysis of multivariate digital images and spatial data are explored, evaluated and compared to well established classical...... into the projection pursuit is presented. Examples from remote sensing are given. The ACE algorithm for computing non-linear transformations for maximizing correlation is extended and applied to obtain a non-linear transformation that maximizes autocorrelation or 'signal' in a multivariate image....... This is a generalization of the minimum /maximum autocorrelation factors (MAF's) which is a linear method. The non-linear method is compared to the linear method when analyzing a multivariate TM image from Greenland. The ACE method is shown to give a more detailed decomposition of the image than the MAF-transformation...

  2. Computer aided analysis of digitized dental stone replicas by dental CAD/CAM technology.

    Science.gov (United States)

    Persson, Anna S K; Andersson, Matts; Odén, Agneta; Sandborgh-Englund, Gunilla

    2008-08-01

    To determine the reproducibility of digitized dental stone replicas compared to the master model and the reliability of the computer aided analysis. Four master dies, prepared for complete crowns, were fabricated in presintered yttria-stabilized tetragonal zirconia (Y-TZP). Eight vinyl polysiloxane impressions (PROVIL novo; Heraeus Kulzer) were taken of each die and stone replicas were poured in type IV stone (Vel-Mix Stone; Kerr). The master dies and the stone replicas were digitized in a touch-probe scanner (Procera Forte; Nobel Biocare AB) to create triangulated surface models. The point cloud from the first of the repeated digitizations of each master die was used as the CAD reference model (CRM). Discrepancies between the points in the triangulated surface models and the corresponding CRM were measured by matching software (CopyCAD 6.504 SP2; Delcam Plc). The distribution of the discrepancies was analyzed and presented in color difference maps. The precision of the measuring method, presented as the repeatability coefficient, ranged between 7 and 16 microm (entire surface), whereas the analysis of the stone replicas revealed a precision (repeatability coefficient) ranging from 19 to 26 microm. The accuracy of the replicas relative to the masters (the mean discrepancy) ranged from 0.5 to 2.0 microm (95% confidence interval 1.5-2.9 microm). The greatest precision of the measurement was seen on the jacket surface of the die. The size of the stone replicas varied, and the repeatability coefficient was on average 15 microm (2-25 microm) greater for the replica-to-master alignment than for the repeated digitizations of the master.
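The record above reports precision as a repeatability coefficient. The paper's exact formula is not given here, so the sketch below uses one common convention (the Bland-Altman definition RC = 1.96 * sqrt(2) * s_w, where s_w is the within-subject standard deviation pooled over repeated measurements); the measurement data are invented for illustration.

```python
# Illustrative computation of a repeatability coefficient from repeated
# measurements. Uses the common Bland-Altman convention
# RC = 1.96 * sqrt(2) * s_w; the paper's exact formula is not stated in
# the abstract, and the numbers below are made up.
import math

def within_subject_sd(repeats):
    """repeats: list of per-subject lists of repeated measurements (microns)."""
    ss, dof = 0.0, 0
    for r in repeats:
        mean = sum(r) / len(r)
        ss += sum((x - mean) ** 2 for x in r)
        dof += len(r) - 1
    return math.sqrt(ss / dof)

def repeatability_coefficient(repeats):
    return 1.96 * math.sqrt(2) * within_subject_sd(repeats)

# Three dies, each digitized three times (hypothetical discrepancies, microns)
measurements = [[5.1, 5.9, 5.6], [7.2, 6.8, 7.5], [4.9, 5.3, 5.0]]
print(f"RC = {repeatability_coefficient(measurements):.1f} microns")
```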

  3. Computers Take Flight: A History of NASA's Pioneering Digital Fly-By-Wire Project

    Science.gov (United States)

    Tomayko, James E.

    2000-01-01

    An overview of the NASA F-8 Fly-By-Wire project is presented. The project made two significant contributions to the new technology: (1) a solid design base of techniques that work and those that do not, and (2) credible evidence of good flying qualities and of the ability of such a system to tolerate real faults and to continue operation without degradation. In 1972 the F-8C aircraft used in the program became the first digital fly-by-wire aircraft to operate without a mechanical backup system.

  4. A Genetic Programming Approach to Geometrical Digital Content Modeling in Web Oriented Applications

    Directory of Open Access Journals (Sweden)

    Dragos PALAGHITA

    2011-01-01

    Full Text Available The paper presents the advantages of using genetic techniques in web oriented problems. The specific area of genetic programming applications that the paper approaches is content modeling. The analyzed digital content is formed through the accumulation of targeted, geometrically structured entities that have specific characteristics and behavior. The accumulated digital content is analyzed and specific features are extracted in order to develop an analysis system through the use of genetic programming. An experiment is presented which evolves a model based on specific features of each geometrically structured entity in the digital content base. The results show promising expectations, with a low error rate that provides fair approximations of the analyzed geometrically structured entities.

  5. CRECTJ: a computer program for compilation of evaluated nuclear data

    Energy Technology Data Exchange (ETDEWEB)

    Nakagawa, Tsuneo [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1999-09-01

    In order to compile evaluated nuclear data in the ENDF format, the computer program CRECTJ has been developed. CRECTJ has two versions: CRECTJ5 treats data in the ENDF/B-IV and ENDF/B-V formats, and CRECTJ6 treats data in the ENDF-6 format. These programs have been frequently used to produce the Japanese Evaluated Nuclear Data Library (JENDL). This report describes the input data and gives examples of CRECTJ. (author)

  6. Synopsis of a computer program designed to interface a personal computer with the fast data acquisition system of a time-of-flight mass spectrometer

    Science.gov (United States)

    Bechtel, R. D.; Mateos, M. A.; Lincoln, K. A.

    1988-01-01

    Briefly described are the essential features of a computer program designed to interface a personal computer with the fast, digital data acquisition system of a time-of-flight mass spectrometer. The instrumentation was developed to provide a time-resolved analysis of individual vapor pulses produced by the incidence of a pulsed laser beam on an ablative material. The high repetition rate spectrometer coupled to a fast transient recorder captures complete mass spectra every 20 to 35 microsecs, thereby providing the time resolution needed for the study of this sort of transient event. The program enables the computer to record the large amount of data generated by the system in short time intervals, and it provides the operator the immediate option of presenting the spectral data in several different formats. Furthermore, the system does this with a high degree of automation, including the tasks of mass labeling the spectra and logging pertinent instrumental parameters.

  7. A method to incorporate the effect of beam quality on image noise in a digitally reconstructed radiograph (DRR) based computer simulation for optimisation of digital radiography.

    Science.gov (United States)

    Moore, Craig S; Wood, Tim J; Saunderson, John R; Beavis, Andrew W

    2017-09-01

    The use of computer simulated digital x-radiographs for optimisation purposes has become widespread in recent years. To make these optimisation investigations effective, it is vital that simulated radiographs contain accurate anatomical and system noise. Computer algorithms that simulate radiographs based solely on the incident detector x-ray intensity ('dose') have been reported extensively in the literature. However, while it has been established for digital mammography that x-ray beam quality is an important factor when modelling noise in simulated images, there are no such studies for diagnostic imaging of the chest, abdomen and pelvis. This study investigates the influence of beam quality on image noise in a digital radiography (DR) imaging system and incorporates these effects into a digitally reconstructed radiograph (DRR) computer simulator. Image noise was measured on a real DR imaging system as a function of dose (absorbed energy) over a range of clinically relevant beam qualities. Simulated 'absorbed energy' and 'beam quality' DRRs were then created for each patient and tube voltage under investigation. Simulated noise images, corrected for dose and beam quality, were subsequently produced from the absorbed energy and beam quality DRRs, using the measured noise, absorbed energy and beam quality relationships. The noise images were superimposed onto the noiseless absorbed energy DRRs to create the final images. Signal-to-noise measurements in simulated chest, abdomen and spine images were within 10% of the corresponding measurements in real images. This compares favourably with our previous algorithm, in which images corrected for dose only were all within 20%.
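The record above describes correcting simulated noise for both dose and beam quality before superimposing it on a noiseless DRR. A minimal sketch of that idea follows; the power-law noise model, the beam-quality scaling, and every coefficient are invented placeholders standing in for the relationships the authors measured on their real detector.

```python
# Sketch of the noise-correction idea: the noise standard deviation is
# modelled as a function of absorbed energy ("dose") and a beam-quality
# factor, both of which would come from measurements on a real detector.
# The functional form and all coefficients here are invented placeholders.
import random

def noise_sigma(dose, kvp, a=12.0, b=-0.5, quality_ref=80.0, c=0.2):
    """Quantum-noise-like term a*dose**b, scaled by a beam-quality factor."""
    quality_factor = 1.0 + c * (kvp - quality_ref) / quality_ref
    return a * dose ** b * quality_factor

def add_noise(pixel_doses, kvp, seed=0):
    """Superimpose Gaussian noise, pixel by pixel, onto a noiseless DRR."""
    rng = random.Random(seed)
    return [d + rng.gauss(0.0, noise_sigma(d, kvp)) for d in pixel_doses]

noisy = add_noise([100.0, 400.0, 1600.0], kvp=120)
```

With this shape, noise falls as dose rises and grows with tube voltage, which is the qualitative behaviour the dose-plus-beam-quality correction is meant to capture.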

  8. Department of Energy Mathematical, Information, and Computational Sciences Division: High Performance Computing and Communications Program

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-11-01

    This document is intended to serve two purposes. Its first purpose is that of a program status report of the considerable progress that the Department of Energy (DOE) has made since 1993, the time of the last such report (DOE/ER-0536, The DOE Program in HPCC), toward achieving the goals of the High Performance Computing and Communications (HPCC) Program. The second purpose is that of a summary report of the many research programs administered by the Mathematical, Information, and Computational Sciences (MICS) Division of the Office of Energy Research under the auspices of the HPCC Program and to provide, wherever relevant, easy access to pertinent information about MICS-Division activities via universal resource locators (URLs) on the World Wide Web (WWW).

  9. Department of Energy: MICS (Mathematical Information, and Computational Sciences Division). High performance computing and communications program

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-06-01

    This document is intended to serve two purposes. Its first purpose is that of a program status report of the considerable progress that the Department of Energy (DOE) has made since 1993, the time of the last such report (DOE/ER-0536, "The DOE Program in HPCC"), toward achieving the goals of the High Performance Computing and Communications (HPCC) Program. The second purpose is that of a summary report of the many research programs administered by the Mathematical, Information, and Computational Sciences (MICS) Division of the Office of Energy Research under the auspices of the HPCC Program and to provide, wherever relevant, easy access to pertinent information about MICS-Division activities via universal resource locators (URLs) on the World Wide Web (WWW). The information pointed to by the URL is updated frequently, and the interested reader is urged to access the WWW for the latest information.

  10. Putting Multiliteracies into Practice: Digital Storytelling for Multilingual Adolescents in a Summer Program

    Science.gov (United States)

    Angay-Crowder, Tuba; Choi, Jayoung; Yi, Youngjoo

    2013-01-01

    In this article we demonstrate how we created a context in which digital storytelling was designed and implemented to teach multilingual middle school students in the summer program sponsored by a local nonprofit organization, the Latin American Association, in a city in the southeastern United States. While implementing the notion of…

  11. Re-Articulating the Mission and Work of the Writing Program with Digital Video

    Science.gov (United States)

    Kopp, Drew; Stevens, Sharon McKenzie

    2010-01-01

    In this webtext, we discuss one powerful way that writing program administrators (WPAs) can start to reshape their basic rhetorical situation, potentially shifting the underlying premises that external audiences bring to discussions about writing instruction. We argue that digital video, when used strategically, is a particularly valuable medium…

  12. We Interrupt This Program: Media Theorist Douglas Rushkoff Has Second Thoughts about Our Digital Practices

    Science.gov (United States)

    Rushkoff, Douglas

    2011-01-01

    When asked what Facebook is for, kids will say that it's there to help them make friends. The kids the author celebrated in his early books as "digital natives," capable of seeing through all efforts of big media and marketing, have actually proven less able to discern the integrity of the sources they read and the intentions of the programs they…

  13. Computer Program Plagiarism Detection: The Limits of the Halstead Metric.

    Science.gov (United States)

    Berghel, H. L.; Sallach, David L.

    1985-01-01

    Discusses two alternative metrics to detect computer software plagiarism: the Halstead metric drawn from the software science discipline and an ad hoc method drawn from program grading experience and identified by factor analysis. Possible explanations as to why the ad hoc method is more useful in identical-task environments are considered.…

  14. Computer Programming with Early Elementary Students with Down Syndrome

    Science.gov (United States)

    Taylor, Matthew S.; Vasquez, Eleazar; Donehower, Claire

    2017-01-01

    Students of all ages and abilities must be given the opportunity to learn academic skills that can shape future opportunities and careers. Researchers in the mid-1970s and 1980s began teaching young students the processes of computer programming using basic coding skills and limited technology. As technology became more personalized and easily…

  15. 77 FR 56824 - Privacy Act of 1974; Computer Matching Program

    Science.gov (United States)

    2012-09-14

    ... information contained in the USCIS database is referred to as the Verification Information System (VIS), which... records entitled ``Verification Information System Records Notice (DHS-2007-0010).'' Where there is a... Information: Privacy Act of 1974; Computer Matching Program between the U.S. Department of Education and the...

  16. Computers for All Children: A Handbook for Program Design.

    Science.gov (United States)

    Sharp, Pamela; Crist-Whitzel, Janet

    One of three publications of the Research on Equitable Access to Technology (REAT) project, this practitioner's handbook is designed to assist educators in the design and implementation of computer instruction programs for underserved groups of students, including low-income, minority, low-achieving, limited-English speaking, female, and rural…

  17. Tpetra, and the Use of Generic Programming in Scientific Computing

    Directory of Open Access Journals (Sweden)

    C.G. Baker

    2012-01-01

    Full Text Available We present Tpetra, a Trilinos package for parallel linear algebra primitives implementing the Petra object model. We describe Tpetra's design, based on generic programming via C++ templated types and template metaprogramming. We discuss some benefits of this approach in the context of scientific computing, with illustrations consisting of code and notable empirical results.

  18. What's New in Software? Integrated Computer Programs and Daily Living.

    Science.gov (United States)

    Hedley, Carolyn N.

    1989-01-01

    Various kinds of electronic information media can now be integrated to plan educational programs, through use of computer videodiscs, hypercards, and hypertexts. Discussed are the components of integrative technology, including audio technology, video technology, and electronic text and graphics, and possibilities for interfacing the various…

  19. Intellectual Property Law and the Protection of Computer Programs.

    Science.gov (United States)

    Lomio, J. Paul

    1990-01-01

    Briefly reviews the laws pertaining to copyrights, patents, and trade secrets, and discusses how each of these may be applied to the protection of computer programs. The comparative merits and limitations of each category of law are discussed and recent court decisions are summarized. (CLB)

  20. Learning Computer Programming: Implementing a Fractal in a Turing Machine

    Science.gov (United States)

    Pereira, Hernane B. de B.; Zebende, Gilney F.; Moret, Marcelo A.

    2010-01-01

    It is common to start a course on computer programming logic by teaching the algorithm concept from the point of view of natural languages, but in a schematic way. In this sense we note that the students have difficulties in understanding and implementation of the problems proposed by the teacher. The main idea of this paper is to show that the…

  1. Individual Differences in Learning Computer Programming: A Social Cognitive Approach

    Science.gov (United States)

    Akar, Sacide Guzin Mazman; Altun, Arif

    2017-01-01

    The purpose of this study is to investigate and conceptualize the ranks of importance of social cognitive variables on university students' computer programming performances. Spatial ability, working memory, self-efficacy, gender, prior knowledge and the universities students attend were taken as variables to be analyzed. The study has been…

  2. A Domain-Specific Programming Language for Secure Multiparty Computation

    DEFF Research Database (Denmark)

    Nielsen, Janus Dam; Schwartzbach, Michael Ignatieff

    2007-01-01

    We present a domain-specific programming language for Secure Multiparty Computation (SMC). Information is a resource of vital importance and considerable economic value to individuals, public administration, and private companies. This means that the confidentiality of information is crucial, but...... application development. The language is implemented in a prototype compiler that generates Java code exploiting a distributed cryptographic runtime....

  3. A Research Program in Computer Technology. 1987 Annual Technical Report

    Science.gov (United States)

    1990-07-01

    mathematical approach to computational network design," in E. E. Swartzlander (ed.), Systolic Signal Processing Systems, chapter 1, Marcel Dekker, 1987...Intention-Based Diagnosis of Novice Programming Errors, Morgan Kaufmann, Los Altos, California, 1986. 32. Johnson, W. L., and E. Soloway, " PROUST

  4. An Analysis on Distance Education Computer Programming Students' Attitudes Regarding Programming and Their Self-Efficacy for Programming

    Science.gov (United States)

    Ozyurt, Ozcan

    2015-01-01

    This study aims to analyze the attitudes of students studying computer programming through the distance education regarding programming, and their self-efficacy for programming and the relation between these two factors. The study is conducted with 104 students being thought with distance education in a university in the north region of Turkey in…

  5. Computing Programs for Determining Traffic Flows from Roundabouts

    Science.gov (United States)

    Boroiu, A. A.; Tabacu, I.; Ene, A.; Neagu, E.; Boroiu, A.

    2017-10-01

    For modelling road traffic at the level of a road network it is necessary to specify the flows of all traffic currents at each intersection. These data can be obtained by direct measurement at traffic-light intersections, but in the case of a roundabout this is not possible directly, and neither the literature nor traffic modelling software offers a way to solve this issue. Two sets of formulas are proposed by which all traffic flows in roundabouts with 3 or 4 arms are calculated from the streams that can be measured. The objective of this paper is to develop computational programs that operate with these formulas. For each of the two sets of analytical relations, a computational program was developed in the Java programming language. The obtained results fully confirm the applicability of the calculation programs. The final stage in capitalizing on these programs will be to publish them as HTML web pages, so that they can be accessed and used on the Internet. The achievements presented in this paper are an important step toward providing a necessary tool for traffic modelling, because these computational programs can be easily integrated into specialized software.
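The record above does not reproduce the paper's formulas, but the idea of recovering turning movements from measurable streams can be illustrated with a conservation-based sketch. For a 3-arm roundabout with arms 1, 2, 3 in circulation order and exit i just before entry i, exactly one movement passes in front of each entry, so measuring the entry flows E[i] and the circulating flows C[i] determines all six turning flows. This derivation is one plausible reconstruction, not necessarily the paper's.

```python
# A conservation-based sketch for a 3-arm roundabout (arms 1, 2, 3 in
# circulation order, exit i just before entry i). Measurable inputs:
# entry flows E[i] and circulating flows C[i] in front of each entry.
# This is one plausible derivation, not the formulas proposed in the paper.

def turning_flows(E, C):
    """Return q[(i, j)] = flow entering at arm i and exiting at arm j."""
    q = {}
    # Only one movement passes in front of each entry:
    q[(3, 2)] = C[1]
    q[(1, 3)] = C[2]
    q[(2, 1)] = C[3]
    # The remaining movements follow from entry-flow conservation:
    q[(1, 2)] = E[1] - C[2]
    q[(2, 3)] = E[2] - C[3]
    q[(3, 1)] = E[3] - C[1]
    return q

# Check by inversion: build E and C from invented known movements (veh/h),
# then recover them.
true_q = {(1, 2): 50, (1, 3): 30, (2, 1): 40, (2, 3): 20, (3, 1): 60, (3, 2): 10}
E = {1: true_q[(1, 2)] + true_q[(1, 3)],
     2: true_q[(2, 1)] + true_q[(2, 3)],
     3: true_q[(3, 1)] + true_q[(3, 2)]}
C = {1: true_q[(3, 2)], 2: true_q[(1, 3)], 3: true_q[(2, 1)]}
assert turning_flows(E, C) == true_q
```

The round-trip check (build measurable streams from known flows, then invert) is the natural way to validate any such formula set before deploying it.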

  6. Computer programs for eddy-current defect studies

    Energy Technology Data Exchange (ETDEWEB)

    Pate, J. R.; Dodd, C. V. [Oak Ridge National Lab., TN (USA)

    1990-06-01

    Several computer programs to aid in the design of eddy-current tests and probes have been written. The programs, written in Fortran, deal in various ways with the response to defects exhibited by four types of probes: the pancake probe, the reflection probe, the circumferential boreside probe, and the circumferential encircling probe. Programs are included which calculate the impedance or voltage change in a coil due to a defect, which calculate and plot the defect sensitivity factor of a coil, and which invert calculated or experimental readings to obtain the size of a defect. The theory upon which the programs are based is the Burrows point defect theory, and thus the calculations of the programs will be more accurate for small defects. 6 refs., 21 figs.

  7. Introductory Computer Programming Course Teaching Improvement Using Immersion Language, Extreme Programming, and Education Theories

    Science.gov (United States)

    Velez-Rubio, Miguel

    2013-01-01

    Teaching computer programming to freshmen students in Computer Sciences and other Information Technology areas has been identified as a complex activity. Different approaches have been studied looking for the best one that could help to improve this teaching process. A proposed approach was implemented which is based in the language immersion…

  8. Modeling of rolling element bearing mechanics. Computer program user's manual

    Science.gov (United States)

    Greenhill, Lyn M.; Merchant, David H.

    1994-10-01

    This report provides the user's manual for the Rolling Element Bearing Analysis System (REBANS) analysis code, which determines the quasistatic response to external loads or displacements of three types of high-speed rolling element bearings: angular contact ball bearings, duplex angular contact ball bearings, and cylindrical roller bearings. The model includes the effects of bearing ring and support structure flexibility. The system comprises two main programs: the Preprocessor for Bearing Analysis (PREBAN), which creates the input files for the main analysis program, and Flexibility Enhanced Rolling Element Bearing Analysis (FEREBA), the main analysis program. This report addresses input instructions for and features of the computer codes. A companion report addresses the theoretical basis for the computer codes. REBANS extends the capabilities of the SHABERTH (Shaft and Bearing Thermal Analysis) code to include race and housing flexibility, including such effects as dead band and preload springs.

  9. Portable Computer Technology (PCT) Research and Development Program Phase 2

    Science.gov (United States)

    Castillo, Michael; McGuire, Kenyon; Sorgi, Alan

    1995-01-01

    This project report focused on: (1) design and development of two Advanced Portable Workstation 2 (APW 2) units; these units incorporate advanced technology features such as a low power Pentium processor, a high resolution color display, National Television Standards Committee (NTSC) video handling capabilities, a Personal Computer Memory Card International Association (PCMCIA) interface, and Small Computer System Interface (SCSI) and ethernet interfaces; (2) use of these units to integrate and demonstrate advanced wireless network and portable video capabilities; and (3) qualification of the APW 2 systems for use in specific experiments aboard the Mir Space Station. A major objective of the PCT Phase 2 program was to help guide future choices in computing platforms and techniques for meeting National Aeronautics and Space Administration (NASA) mission objectives, with a focus on the development of optimal configurations of computing hardware, software applications, and network technologies for use on NASA missions.

  10. Solutions manual and computer programs for physical and computational aspects of convective heat transfer

    CERN Document Server

    Cebeci, Tuncer

    1989-01-01

    This book is designed to accompany Physical and Computational Aspects of Convective Heat Transfer by T. Cebeci and P. Bradshaw, and contains solutions to the exercises and computer programs for the numerical methods contained in that book. Physical and Computational Aspects of Convective Heat Transfer begins with a thorough discussion of the physical aspects of convective heat transfer and presents in some detail the partial differential equations governing the transport of thermal energy in various types of flows. The book is intended for senior undergraduate and graduate students of aeronautical, chemical, civil and mechanical engineering. It can also serve as a reference for the practitioner.

  11. A computationally efficient depression-filling algorithm for digital elevation models, applied to proglacial lake drainage

    Science.gov (United States)

    Berends, Constantijn J.; van de Wal, Roderik S. W.

    2016-12-01

    Many processes govern the deglaciation of ice sheets. One of the processes that is usually ignored is the calving of ice in lakes that temporarily surround the ice sheet. In order to capture this process a "flood-fill algorithm" is needed. Here we present and evaluate several optimizations to a standard flood-fill algorithm in terms of computational efficiency. As an example, we determine the land-ocean mask for a 1 km resolution digital elevation model (DEM) of North America and Greenland, a geographical area of roughly 7000 by 5000 km (roughly 35 million elements), about half of which is covered by ocean. Determining the land-ocean mask with our improved flood-fill algorithm reduces computation time by 90 % relative to using a standard stack-based flood-fill algorithm. This implies that it is now feasible to include the calving of ice in lakes as a dynamical process inside an ice-sheet model. We demonstrate this by using bedrock elevation, ice thickness and geoid perturbation fields from the output of a coupled ice-sheet-sea-level equation model at 30 000 years before present and determine the extent of Lake Agassiz, using both the standard and improved versions of the flood-fill algorithm. We show that several optimizations to the flood-fill algorithm used for filling a depression up to a water level, which is not defined beforehand, decrease the computation time by up to 99 %. The resulting reduction in computation time allows determination of the extent and volume of depressions in a DEM over large geographical grids or repeatedly over long periods of time, where computation time might otherwise be a limiting factor. The algorithm can be used for all glaciological and hydrological models, which need to trace the evolution over time of lakes or drainage basins in general.
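The record above builds on a standard stack-based flood-fill for deriving a land-ocean mask from a DEM. A minimal sketch of that baseline algorithm follows, on a toy elevation grid; the seed location and sea level are illustrative choices, and the paper's optimizations (which cut computation time by up to 99%) are not reproduced here.

```python
# Minimal stack-based flood-fill for a land-ocean mask: starting from a
# boundary seed at or below sea level, mark every 4-connected cell whose
# elevation is at or below sea level as ocean. This is the baseline
# algorithm the paper optimizes, shown on a toy DEM (elevations in metres).

def ocean_mask(dem, seed, sea_level=0.0):
    rows, cols = len(dem), len(dem[0])
    mask = [[False] * cols for _ in range(rows)]
    stack = [seed]
    while stack:
        r, c = stack.pop()
        if not (0 <= r < rows and 0 <= c < cols):
            continue  # off the grid
        if mask[r][c] or dem[r][c] > sea_level:
            continue  # already flooded, or dry land
        mask[r][c] = True
        stack.extend([(r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)])
    return mask

dem = [[-5.0, -3.0, 12.0,  4.0],
       [-4.0, 10.0, 11.0, -2.0],   # the -2.0 depression is landlocked
       [-1.0,  9.0,  8.0,  7.0]]
mask = ocean_mask(dem, seed=(0, 0))
```

Note that the landlocked -2.0 m cell stays unflooded even though it lies below sea level; such disconnected depressions are exactly the proglacial-lake candidates whose calving the paper wants to include.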

  12. Landmark identification errors on cone-beam computed tomography-derived cephalograms and conventional digital cephalograms.

    Science.gov (United States)

    Chang, Zwei-Chieng; Hu, Fu-Chang; Lai, Eddie; Yao, Chung-Chen; Chen, Mu-Hsiung; Chen, Yi-Jane

    2011-12-01

    In this study, we investigated the landmark identification errors on cone-beam computed tomography (CBCT)-derived cephalograms and conventional digital cephalograms. Twenty patients who had both a CBCT-derived cephalogram and a conventional digital cephalogram were recruited. Twenty commonly used lateral cephalometric landmarks and 2 fiducial points were identified on each cephalogram by 11 observers at 2 time points. The mean positions of the landmarks identified by all observers were used as the best estimate to calculate the landmark identification errors. In addition to univariate analysis, regression analysis of landmark identification errors was conducted for identifying the predicting variables of the observed landmark identification errors. To properly handle the multilayer correlations among the clustered observations, a marginal multiple linear regression model was fitted to our correlated data by using the well-known generalized estimating equations method. In addition to image modality, many variables potentially affecting landmark identification errors were considered, including location and characteristics of the landmark, seniority of the observer, and patient information (sex, age, metallic dental restorations, and facial asymmetry). Image modality was not the significant variable in the final generalized estimating equations model. The regression coefficient estimates of the significant landmarks for the overall identification error ranged from -0.99 (Or) to 1.42 mm (Ba). The difficulty of identifying landmarks on structural images with multiple overlapping--eg, Or, U1R, L1R, Po, Ba, UMo, and LMo--increased the identification error by 1.17 mm. In the CBCT modality, the identification errors significantly decreased at Ba (-0.76 mm). The overall landmark identification errors on CBCT-derived cephalograms were comparable to those on conventional digital cephalograms, and Ba was more reliable on CBCT-derived cephalograms. Copyright © 2011 American

  13. Computer-aided detection in direct digital full-field mammography: initial results

    Energy Technology Data Exchange (ETDEWEB)

    Baum, F.; Fischer, U.; Obenauer, S.; Grabbe, E. [Department of Radiology, Georg-August-Universitaet Goettingen, Robert-Koch-Strasse 40, 37075 Goettingen (Germany)

    2002-12-01

For the first time, full-field digital mammography (FFDM) allows computer-aided detection (CAD) analysis of directly acquired digital image data. The purpose of this study was to evaluate a CAD system in patients with histologically correlated breast cancer depicted with FFDM. Sixty-three cases of histologically proven breast cancer detected with FFDM (Senographe 2000D, GE Medical Systems, Buc, France) were analyzed using a CAD system (Image Checker V2.3, R2 Technology, Los Altos, Calif.). Fourteen of these malignancies were characterized as microcalcifications, 37 as masses, and 12 as both. The mammographic findings were categorized as BI-RADS 3 (n=5), BI-RADS 4 (n=17), and BI-RADS 5 (n=40). The sensitivity for malignant lesions and the rate of false-positive marks per image were calculated, and the 95% confidence interval (CI) of the sensitivity was estimated. The sensitivity of the R2 CAD system for breast cancer seen on FFDM was 89% for microcalcifications (95% CI: 70%-98%) and 81% for masses (95% CI: 67%-91%). As expected, the detection rate was higher in lesions categorized as BI-RADS 5 (37 of 40) than in lesions categorized as BI-RADS 4 (11 of 17). In the group categorized as BI-RADS 3 the detection rate was 4 of 5 lesions; however, this group was very small. The rate of false-positive marks was 0.35 microcalcification marks/image and 0.26 mass marks/image, for an overall rate of 0.61 false-positive marks per image. CAD based on FFDM provides an optimized work flow. Results are equivalent to those reported for CAD analysis of secondarily digitized image data. Sensitivity is acceptable for microcalcifications but low for masses, and the number of false-positive marks per image should be reduced. (orig.)
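The reported sensitivities and confidence intervals are simple binomial estimates. A hedged sketch follows; the Wilson score interval is one common choice (the abstract does not state which method the authors used), and the counts are illustrative:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion,
    e.g. detected lesions / total lesions."""
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - half, centre + half

# Illustrative counts, not the study's exact per-category numbers:
lo, hi = wilson_ci(39, 44)
```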

  14. Applied Cryptography Using Chaos Function for Fast Digital Logic-Based Systems in Ubiquitous Computing

    Directory of Open Access Journals (Sweden)

    Piyush Kumar Shukla

    2015-03-01

Full Text Available Recently, chaotic dynamics-based data encryption techniques for wired and wireless networks have become a topic of active research in computer science and network security, with applications in areas such as robotic systems and secure communication. The main aim of deploying a chaos-based cryptosystem is to provide encryption with several advantages over traditional algorithms, such as high security, speed, and reasonable computational overhead and power requirements. These challenges have motivated researchers to explore novel chaos-based data encryption techniques built on digital logic for hiding information in fast, secure communication networks. This work provides an overview of how traditional data encryption techniques have been revised and improved to achieve good performance in a secure communication network environment. A comprehensive survey of existing chaos-based data encryption techniques and their application areas is presented. The comparative tables can be used as a guideline for selecting an encryption technique suitable for the application at hand. Based on the limitations of the existing techniques, an adaptive chaos-based data encryption framework for secure communication is proposed for future research.
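As a toy illustration of the class of ciphers surveyed here, the sketch below XORs the plaintext with a keystream generated by iterating the logistic map. The parameters are arbitrary, and plain logistic-map stream ciphers are known to be cryptographically weak; this only makes the idea concrete:

```python
def logistic_keystream(x0, r, n):
    """Generate n pseudo-random bytes by iterating the logistic map
    x -> r*x*(1-x) and quantizing each chaotic iterate to one byte."""
    x = x0
    out = []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

def chaos_xor(data, x0=0.3141592, r=3.99):
    """XOR data with the chaotic keystream; applying it twice decrypts,
    since (b ^ k) ^ k == b."""
    ks = logistic_keystream(x0, r, len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

msg = b"ubiquitous computing"
ct = chaos_xor(msg)   # encrypt
pt = chaos_xor(ct)    # decrypt with the same (x0, r) "key"
```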

  15. GPU-accelerated compressed-sensing (CS) image reconstruction in chest digital tomosynthesis (CDT) using CUDA programming

    Science.gov (United States)

    Choi, Sunghoon; Lee, Haenghwa; Lee, Donghoon; Choi, Seungyeon; Shin, Jungwook; Jang, Woojin; Seo, Chang-Woo; Kim, Hee-Joung

    2017-03-01

A compressed-sensing (CS) technique has rapidly been applied in the medical imaging field for retrieving volumetric data from highly under-sampled projections. Among its many variant forms, the CS technique based on a total-variation (TV) regularization strategy shows fairly reasonable results in cone-beam geometry. In this study, we implemented the TV-based CS image reconstruction strategy in our prototype chest digital tomosynthesis (CDT) R/F system. Because of the time-consuming iterative nature of solving the cost function, we took advantage of parallel computing on graphics processing units (GPUs) using compute unified device architecture (CUDA) programming to accelerate our algorithm. To benchmark the algorithmic performance of our proposed CS algorithm, conventional filtered back-projection (FBP) and simultaneous algebraic reconstruction technique (SART) reconstruction schemes were also studied. The results indicated that CS produced better contrast-to-noise ratios (CNRs) in the physical phantom images (Teflon region-of-interest) by factors of 3.91 and 1.93 compared with FBP and SART images, respectively. The resulting human chest phantom images, including lung nodules with different diameters, also showed better visual appearance with CS. Our proposed GPU-accelerated CS reconstruction scheme produced volumetric data up to 80 times faster than CPU programming. The total elapsed time for producing 50 coronal planes with a 1024×1024 image matrix from 41 projection views was 216.74 seconds for the proposed CS algorithm with our GPU programming, which fits within a clinically feasible time (~3 min). Consequently, our results demonstrated that the proposed CS method shows potential for additional dose reduction in digital tomosynthesis with reasonable image quality obtained in a fast time.
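The TV-regularized objective at the heart of the method can be illustrated on a toy 1-D denoising problem. This sub-gradient descent sketch is a stand-in for the full cone-beam CS reconstruction, not the authors' GPU implementation:

```python
def tv(x):
    """Total variation of a 1-D signal."""
    return sum(abs(x[i + 1] - x[i]) for i in range(len(x) - 1))

def tv_denoise(y, lam=0.5, step=0.1, iters=200):
    """Sub-gradient descent on 0.5*||x - y||^2 + lam*TV(x), a toy 1-D
    analogue of the TV-regularized CS objective."""
    sgn = lambda v: (v > 0) - (v < 0)
    x = list(y)
    for _ in range(iters):
        g = [x[i] - y[i] for i in range(len(x))]   # data-fidelity gradient
        for i in range(len(x) - 1):                # TV sub-gradient
            s = sgn(x[i + 1] - x[i])
            g[i] -= lam * s
            g[i + 1] += lam * s
        x = [x[i] - step * g[i] for i in range(len(x))]
    return x

noisy = [0.0, 0.1, -0.1, 1.1, 0.9, 1.0]
clean = tv_denoise(noisy)   # flattens within segments, keeps the edge
```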

  16. My Program Is Ok--Am I? Computing Freshmen's Experiences of Doing Programming Assignments

    Science.gov (United States)

    Kinnunen, Paivi; Simon, Beth

    2012-01-01

    This article provides insight into how computing majors experience the process of doing programming assignments in their first programming course. This grounded theory study sheds light on the various processes and contexts through which students constantly assess their self-efficacy as a programmer. The data consists of a series of four…

  17. Motivating Programming: Using Storytelling to Make Computer Programming Attractive to Middle School Girls

    Science.gov (United States)

    2006-11-01

In Generic Alice, there are two common default positions for characters and, consequently, two...balance or generating the nth Fibonacci number. Often, students write programs in introductory computer science using professional programming...handle without fundamentally changing the common control structures found in general-purpose languages. Consequently, when a student moves from one of

  18. Computer Programming Games and Gender Oriented Cultural Forms

    Science.gov (United States)

    AlSulaiman, Sarah Abdulmalik

I present the design and evaluation of two games designed to help elementary and middle school students learn computer programming concepts. The first game was designed to be "gender neutral", aligning with what might be described as a consensus opinion on best practices for computational learning environments. The second game, based on the cultural form of dress-up dolls, was deliberately designed to appeal to females. I recruited 70 participants in an international two-phase study to investigate the relationship between games, gender, attitudes towards computer programming, and learning. My findings suggest that while the two games were equally effective in terms of learning outcomes, I saw differences in motivation between players of the two games. Specifically, participants who reported a preference for female-oriented games were more motivated to learn about computer programming when they played a game that they perceived as designed for females. In addition, I describe how the two games seemed to encourage different types of social activity between players in a classroom setting. Based on these results, I reflect on the strategy of exclusively designing games and activities as "gender neutral", and suggest that employing cultural forms, including gendered ones, may help create a more productive experience for learners.

  19. Computer-aided detection of breast carcinoma in standard mammographic projections with digital mammography

    Energy Technology Data Exchange (ETDEWEB)

    Destounis, Stamatia [Elizabeth Wende Breast Care, LLC, Rochester, NY (United States); University of Rochester, School of Medicine and Dentistry, Rochester, NY (United States); Hanson, Sarah; Morgan, Renee; Murphy, Philip; Somerville, Patricia; Seifert, Posy; Andolina, Valerie; Arieno, Andrea; Skolny, Melissa; Logan-Young, Wende [Elizabeth Wende Breast Care, LLC, Rochester, NY (United States)

    2009-06-15

A retrospective evaluation of the ability of computer-aided detection (CAD) to identify breast carcinoma in standard mammographic projections. Forty-five biopsy-proven lesions in 44 patients imaged digitally with CAD applied at examination were reviewed. Forty-four screening BIRADS® category 1 digital mammography examinations were randomly identified to serve as a comparative normal/control population. Data included patient age; BIRADS® breast density; lesion type, size, and visibility; number, type, and location of CAD marks per image; CAD ability to mark lesions; and needle core and surgical pathologic correlation. A CAD lesion/case sensitivity of 87% (n=39) was found, with an image sensitivity of 69% (n=31) for the mediolateral oblique view and 78% (n=35) for the craniocaudal view. The average false-positive rate in the 44 normal screening cases was 2.0 (range 1-8), based on 88 reported false-positive CAD marks in those 44 normal screening exams. Ninety-eight percent (n=44) of lesions proceeded to excision; initial pathology was upgraded at surgical excision from in situ to invasive disease in 24% (n=9) of lesions. CAD demonstrated potential to detect mammographically visible cancers in standard projections for all lesion types. (orig.)
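The summary statistics in this abstract reduce to simple ratios; using the reported counts (39 of 45 lesions marked; 88 false-positive marks across 44 normal exams), the helper below is illustrative only:

```python
def cad_summary(lesions_marked, total_lesions, fp_marks, normal_exams):
    """Per-lesion sensitivity and false-positive marks per normal case."""
    return lesions_marked / total_lesions, fp_marks / normal_exams

# Counts reported in the abstract:
sens, fp_rate = cad_summary(39, 45, 88, 44)
```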

20. Accuracy of digital periapical radiography and cone-beam computed tomography in detecting external root resorption

    Energy Technology Data Exchange (ETDEWEB)

    Creanga, Adriana Gabriela [Division of Dental Diagnostic Science, Rutgers School of Dental Medicine, Newark (United States); Geha, Hassem; Sankar, Vidya; Mcmahan, Clyde Alex; Noujeim, Marcel [University of Texas Health Science Center San Antonio, San Antonio (United States); Teixeira, Fabrico B. [Dept. of Endodontics, University of Iowa, Iowa City (United States)

    2015-09-15

The purpose of this study was to evaluate and compare the efficacy of cone-beam computed tomography (CBCT) and digital intraoral radiography in diagnosing simulated small external root resorption cavities. Cavities were drilled in 159 roots using a small spherical bur at different root levels and on all surfaces. The teeth were imaged both with intraoral digital radiography using image plates and with CBCT. Two sets of intraoral images were acquired per tooth: orthogonal (PA), the conventional periapical radiograph, and mesioangulated (SET). Four readers were asked to rate their confidence level in detecting and locating the lesions. Receiver operating characteristic (ROC) analysis was performed to assess the accuracy of each modality in detecting the presence of lesions, the affected surface, and the affected level. Analysis of variance was used to compare the results, and kappa analysis was used to evaluate interobserver agreement. A significant difference in the area under the ROC curves was found among the three modalities (P=0.0002), with CBCT (0.81) having a significantly higher value than PA (0.71) or SET (0.71). PA was slightly more accurate than SET, but the difference was not statistically significant. CBCT was also superior in locating the affected surface and level. CBCT has already proven its superiority in detecting multiple dental conditions, and this study shows it to likewise be superior in detecting and locating incipient external root resorption.
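The area under an ROC curve, the figure of merit compared across the three modalities above, equals the probability that a randomly chosen lesion-present case is rated higher than a randomly chosen lesion-free one. A minimal sketch with made-up reader scores:

```python
def auc(pos_scores, neg_scores):
    """Area under the ROC curve via its rank interpretation: the
    probability that a random positive outscores a random negative
    (ties count half)."""
    wins = ties = 0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(pos_scores) * len(neg_scores))
```

For example, perfectly separated scores give an AUC of 1.0, and chance-level scoring gives 0.5.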

  1. Abstracts of digital computer code packages. Assembled by the Radiation Shielding Information Center. [Radiation transport codes

    Energy Technology Data Exchange (ETDEWEB)

    McGill, B.; Maskewitz, B.F.; Anthony, C.M.; Comolander, H.E.; Hendrickson, H.R.

    1976-01-01

The term "code package" is used to describe a miscellaneous grouping of materials which, when interpreted in connection with a digital computer, enables the scientist-user to solve technical problems in the area for which the material was designed. In general, a "code package" consists of written material (reports, instructions, flow charts, listings of data, and other useful material) and IBM card decks (or, more often, a reel of magnetic tape) on which the source decks, sample problem input (including libraries of data), and the BCD/EBCDIC output listing from the sample problem are written. In addition to the main code, any available auxiliary routines are also included. The abstract format was chosen to give a potential code user several criteria for deciding whether or not he wishes to request the code package. (RWR)

  2. The digital computer as a metaphor for the perfect laboratory experiment: Loophole-free Bell experiments

    Science.gov (United States)

    De Raedt, Hans; Michielsen, Kristel; Hess, Karl

    2016-12-01

    Using Einstein-Podolsky-Rosen-Bohm experiments as an example, we demonstrate that the combination of a digital computer and algorithms, as a metaphor for a perfect laboratory experiment, provides solutions to problems of the foundations of physics. Employing discrete-event simulation, we present a counterexample to John Bell's remarkable "proof" that any theory of physics, which is both Einstein-local and "realistic" (counterfactually definite), results in a strong upper bound to the correlations that are being measured in Einstein-Podolsky-Rosen-Bohm experiments. Our counterexample, which is free of the so-called detection-, coincidence-, memory-, and contextuality loophole, violates this upper bound and fully agrees with the predictions of quantum theory for Einstein-Podolsky-Rosen-Bohm experiments.
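The upper bound in question is the CHSH form of Bell's inequality, |S| ≤ 2 for local-realistic models, while quantum mechanics predicts |S| = 2√2 at the standard analyzer angles. The quantum prediction the simulations must reproduce can be checked directly:

```python
import math

def quantum_E(a, b):
    """Singlet-state correlation of spin measurements at analyzer angles a, b."""
    return -math.cos(a - b)

def chsh_S(E, a1, a2, b1, b2):
    """CHSH combination of the four correlation values."""
    return E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

# Standard angle choice that maximizes |S| for the quantum prediction:
S = chsh_S(quantum_E, 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
# |S| = 2*sqrt(2) > 2: the quantum value exceeds the local-realist bound.
```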

  3. Design Navigation Computer System Based on Double Digital Signal Process and FPGA

    Directory of Open Access Journals (Sweden)

    Jie Yan

    2013-03-01

Full Text Available The article describes the design and implementation of an integrated navigation embedded computer system based on a double DSP and an FPGA. In the system, the TMS320C6727 (C6727) and TMS320C6713 (C6713) digital signal processors (DSPs) produced by TI are used as the core processing chips. The C6727 is responsible for de-noising the original inertial measurement unit (IMU) signal and sending the IMU data to the C6713. The C6713 is responsible for collecting the IMU and GNSS data, running the navigation algorithm, and sending the navigation information to other devices. The I/O interface, timing control, data buffering, and address bus decoding modules are implemented in the FPGA. This design improves the system's real-time performance and reliability.

  4. Computer-Guided Deep Brain Stimulation Programming for Parkinson's Disease.

    Science.gov (United States)

    Heldman, Dustin A; Pulliam, Christopher L; Urrea Mendoza, Enrique; Gartner, Maureen; Giuffrida, Joseph P; Montgomery, Erwin B; Espay, Alberto J; Revilla, Fredy J

    2016-02-01

Pilot study to evaluate computer-guided deep brain stimulation (DBS) programming designed to optimize stimulation settings using objective motion sensor-based motor assessments. Seven subjects (five males; 54-71 years) with Parkinson's disease (PD) and recently implanted DBS systems participated in this pilot study. Within two months of lead implantation, each subject returned to the clinic to undergo computer-guided programming and parameter selection. A motion sensor was placed on the index finger of the more affected hand. Software guided a monopolar survey during which monopolar stimulation on each contact was iteratively increased, followed by an automated assessment of tremor and bradykinesia. After completing assessments at each setting, a software algorithm determined stimulation settings designed to minimize symptom severities, side effects, and battery usage. Optimal DBS settings were chosen based on the average severity of motor symptoms measured by the motion sensor. Settings chosen by the software algorithm identified a therapeutic window and improved tremor and bradykinesia by an average of 35.7% compared with baseline in the "off" state. Computer-guided DBS programming identified stimulation parameters that significantly improved tremor and bradykinesia with minimal clinician involvement. Automated motion sensor-based mapping is worthy of further investigation and may one day serve to extend programming to populations without access to specialized DBS centers. © 2015 International Neuromodulation Society.
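The parameter selection described, iterating over stimulation settings and keeping the one with the lowest sensor-measured severity, can be sketched as an exhaustive search. The severity function below is a hypothetical stand-in for the motion-sensor score, not the study's algorithm:

```python
def program_dbs(contacts, amplitudes, severity):
    """Monopolar survey: score every (contact, amplitude) setting with
    the sensor-based severity function and keep the lowest-scoring one."""
    best = None
    for c in contacts:
        for a in amplitudes:
            s = severity(c, a)
            if best is None or s < best[0]:
                best = (s, c, a)
    return best

# Hypothetical severity surface: best response at contact 1, 2.0 mA.
def severity(contact, amplitude):
    return (contact - 1) ** 2 + (amplitude - 2.0) ** 2

best = program_dbs(range(4), [1.0, 2.0, 3.0], severity)
```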

  5. Method for performance comparison of DDC algorithms and application of this method to selected cases by means of the computer program OPTAL

    Energy Technology Data Exchange (ETDEWEB)

    Patzelt, W.; Salaba, M.

    1975-07-01

Characteristics and test setups are defined for the comparison of direct digital control algorithms for single-variable control. Under the test setup (process, process controller, signal), the control algorithms to be compared are characterized by the quality measure, sensitivity measure, and costs. The defined characteristics under the defined test setups are investigated for four types of algorithms, designed with respect to deadbeat response, minimal integrated squared error, quantity optimum (PID structure), and quantity optimum with compensation of dead time by a process model (PID structure). A digital computer program, "OPTAL", for the design and simulation of direct digital control is used. (2 figures, 10 tables) (auth)
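Of the algorithm families compared, the PID-structure controllers are the most familiar. A minimal positional discrete PID, illustrative only and unrelated to the OPTAL implementation, looks like:

```python
def make_pid(kp, ki, kd, dt):
    """Positional discrete PID controller: u = kp*e + ki*sum(e)*dt + kd*de/dt."""
    state = {"i": 0.0, "e_prev": 0.0}
    def step(e):
        state["i"] += e * dt                    # rectangular integration
        d = (e - state["e_prev"]) / dt          # backward difference
        state["e_prev"] = e
        return kp * e + ki * state["i"] + kd * d
    return step

pid = make_pid(kp=2.0, ki=1.0, kd=0.0, dt=1.0)
u1 = pid(1.0)   # control output for a unit error
u2 = pid(0.0)   # the integral term persists after the error vanishes
```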

  6. Engineering and programming manual: Two-dimensional kinetic reference computer program (TDK)

    Science.gov (United States)

    Nickerson, G. R.; Dang, L. D.; Coats, D. E.

    1985-01-01

    The Two Dimensional Kinetics (TDK) computer program is a primary tool in applying the JANNAF liquid rocket thrust chamber performance prediction methodology. The development of a methodology that includes all aspects of rocket engine performance from analytical calculation to test measurements, that is physically accurate and consistent, and that serves as an industry and government reference is presented. Recent interest in rocket engines that operate at high expansion ratio, such as most Orbit Transfer Vehicle (OTV) engine designs, has required an extension of the analytical methods used by the TDK computer program. Thus, the version of TDK that is described in this manual is in many respects different from the 1973 version of the program. This new material reflects the new capabilities of the TDK computer program, the most important of which are described.

  7. Center for Programming Models for Scalable Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    John Mellor-Crummey

    2008-02-29

Rice University's achievements as part of the Center for Programming Models for Scalable Parallel Computing include: (1) design and implementation of cafc, the first multi-platform CAF compiler for distributed and shared-memory machines, (2) performance studies of the efficiency of programs written using the CAF and UPC programming models, (3) a novel technique to analyze explicitly-parallel SPMD programs that facilitates optimization, (4) design, implementation, and evaluation of new language features for CAF, including communication topologies, multi-version variables, and distributed multithreading to simplify development of high-performance codes in CAF, and (5) a synchronization strength reduction transformation for automatically replacing barrier-based synchronization with more efficient point-to-point synchronization. The prototype Co-array Fortran compiler cafc developed in this project is available as open source software from http://www.hipersoft.rice.edu/caf.

  8. Techniques for Engaging Students in an Online Computer Programming Course

    Directory of Open Access Journals (Sweden)

    Eman M. El-Sheikh

    2009-02-01

    Full Text Available Many institutions of higher education are significantly expanding their online program and course offerings to deal with the rapidly increasing demand for flexible educational alternatives. One of the main challenges that faculty who teach online courses face is determining how to engage students in an online environment. Teaching computer programming effectively requires demonstration of programming techniques, examples, and environments, and interaction with the students, making online delivery even more challenging. This paper describes efforts to engage students in an online introductory programming course at our institution. The tools and methods used to promote student engagement in the course are described, in addition to the lessons learned from the design and delivery of the online course and opportunities for future work.

  9. Digital Computer Transient Models of Three-Phase Inverter Systems Under Normal and Fault Conditions

    Science.gov (United States)

    Gawish, Said Abdelhamid Atiya

In many industrial applications, variable speed drives of electrical machines are needed. This speed control can be met either by dc or ac machines. The ac machines have several distinct advantages compared to dc machines due to the absence of commutators; therefore, a variable-voltage, variable-frequency power supply is normally required for speed control of ac machines. This power supply can be obtained by a dc link converter system that consists of a rectifier and inverter. In this dissertation the waveforms and transient response of three-phase forced-commutated inverters are simulated on a digital computer from basic circuit theory. Both the voltage source inverter (VSI) and the current source inverter (CSI) are simulated using thyristors with real characteristics. The simulation is further modified to give three-phase currents with adjustable frequency to be used in adjustable speed induction motor drives or the starting of synchronous motors from rest. The digital simulation of Gate Turn-Off (GTO) thyristor inverters feeding an induction motor is presented and can allow for step frequency changes for the study of adjustable speed induction motor drives. A naturally-commutated three-phase inverter using thyristors with real characteristics was also simulated to study VSIs and CSIs. The interactions between the load parameters and the inverter circuit parameters are investigated. The parameters studied include the ratio of dc voltage to amplitude of ac voltage, the ratio of smoothing inductance to load inductance, and the triggering angle (alpha). Since the naturally-commutated CSI system is widely used in power applications, it was investigated under different types of faults occurring both on the line and in the inverter circuit. These faults include three-phase short circuits, thyristor failures, line-to-line faults, false triggering, and open circuits.
A digital computer was used to simulate these faults and the system response because it is difficult to obtain the
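The idealized 180°-conduction voltage source inverter underlying such simulations can be sketched without thyristor-level detail: each leg output is a square wave and the three legs are displaced by 120°. This is a drastic simplification of the dissertation's device-level models:

```python
import math

def six_step_phase_voltages(theta):
    """Idealized 180-degree-conduction VSI: each leg connects its phase
    to +Vdc/2 or -Vdc/2 for half a cycle (Vdc = 1 here); the three legs
    are displaced by 120 degrees."""
    def leg(t):
        return 0.5 if (t % (2 * math.pi)) < math.pi else -0.5
    return (leg(theta),
            leg(theta - 2 * math.pi / 3),
            leg(theta + 2 * math.pi / 3))

va, vb, vc = six_step_phase_voltages(0.0)
```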

  10. Final Report: Center for Programming Models for Scalable Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [William Marsh Rice University

    2011-09-13

    As part of the Center for Programming Models for Scalable Parallel Computing, Rice University collaborated with project partners in the design, development and deployment of language, compiler, and runtime support for parallel programming models to support application development for the “leadership-class” computer systems at DOE national laboratories. Work over the course of this project has focused on the design, implementation, and evaluation of a second-generation version of Coarray Fortran. Research and development efforts of the project have focused on the CAF 2.0 language, compiler, runtime system, and supporting infrastructure. This has involved working with the teams that provide infrastructure for CAF that we rely on, implementing new language and runtime features, producing an open source compiler that enabled us to evaluate our ideas, and evaluating our design and implementation through the use of benchmarks. The report details the research, development, findings, and conclusions from this work.

  11. Qualification of a computer program for drill string dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Stone, C.M.; Carne, T.G.; Caskey, B.C.

    1985-01-01

    A four point plan for the qualification of the GEODYN drill string dynamics computer program is described. The qualification plan investigates both modal response and transient response of a short drill string subjected to simulated cutting loads applied through a polycrystalline diamond compact (PDC) bit. The experimentally based qualification shows that the analytical techniques included in Phase 1 GEODYN correctly simulate the dynamic response of the bit-drill string system. 6 refs., 8 figs.

  12. Introduction to "Interactive models of computation and program behaviour"

    OpenAIRE

    Curien, Pierre-Louis

    2009-01-01

Since the mid-eighties of the last century, a fruitful interplay between computer scientists and mathematicians has led to much progress in the understanding of programming languages, and has given new impulse to areas of mathematics such as proof theory and category theory. The volume of which this text is an introduction contains three contributions: Categorical semantics of linear logic, by P.-A. Melliès; Realizability in classical logic, by J.-L. Krivine; and Abstract machines for dialogue gam...

  13. PET computer programs for use with the 88-inch cyclotron

    Energy Technology Data Exchange (ETDEWEB)

    Gough, R.A.; Chlosta, L.

    1981-06-01

    This report describes in detail several offline programs written for the PET computer which provide an efficient data management system to assist with the operation of the 88-Inch Cyclotron. This function includes the capability to predict settings for all cyclotron and beam line parameters for all beams within the present operating domain of the facility. The establishment of a data base for operational records is also described from which various aspects of the operating history can be projected.

  14. Designing, programming, and optimizing a (small) quantum computer

    Science.gov (United States)

    Svore, Krysta

    In 1982, Richard Feynman proposed to use a computer founded on the laws of quantum physics to simulate physical systems. In the more than thirty years since, quantum computers have shown promise to solve problems in number theory, chemistry, and materials science that would otherwise take longer than the lifetime of the universe to solve on an exascale classical machine. The practical realization of a quantum computer requires understanding and manipulating subtle quantum states while experimentally controlling quantum interference. It also requires an end-to-end software architecture for programming, optimizing, and implementing a quantum algorithm on the quantum device hardware. In this talk, we will introduce recent advances in connecting abstract theory to present-day real-world applications through software. We will highlight recent advancement of quantum algorithms and the challenges in ultimately performing a scalable solution on a quantum device.

  15. Computationally intensive econometrics using a distributed matrix-programming language.

    Science.gov (United States)

    Doornik, Jurgen A; Hendry, David F; Shephard, Neil

    2002-06-15

    This paper reviews the need for powerful computing facilities in econometrics, focusing on concrete problems which arise in financial economics and in macroeconomics. We argue that the profession is being held back by the lack of easy-to-use generic software which is able to exploit the availability of cheap clusters of distributed computers. Our response is to extend, in a number of directions, the well-known matrix-programming interpreted language Ox developed by the first author. We note three possible levels of extensions: (i) Ox with parallelization explicit in the Ox code; (ii) Ox with a parallelized run-time library; and (iii) Ox with a parallelized interpreter. This paper studies and implements the first case, emphasizing the need for deterministic computing in science. We give examples in the context of financial economics and time-series modelling.
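The paper's emphasis on deterministic computing under parallelization, its case (i) of parallelism explicit in user code, comes down to giving each task its own seeded random stream so the result is independent of worker scheduling. A Python sketch of the idea (the paper itself extends the Ox language):

```python
import random

def _chunk(args):
    seed, n = args
    rng = random.Random(seed)        # independent, reproducible stream per task
    return sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
               for _ in range(n))

def estimate_pi(n_tasks=4, n_per_task=25_000, base_seed=42):
    """Monte Carlo pi with one seeded RNG per task. The `map` call can be
    swapped for multiprocessing.Pool.map and the answer stays identical,
    because no task depends on shared RNG state or scheduling order."""
    jobs = [(base_seed + i, n_per_task) for i in range(n_tasks)]
    hits = sum(map(_chunk, jobs))
    return 4.0 * hits / (n_tasks * n_per_task)

est = estimate_pi()
```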

  16. The ACP (Advanced Computer Program) multiprocessor system at Fermilab

    Energy Technology Data Exchange (ETDEWEB)

    Nash, T.; Areti, H.; Atac, R.; Biel, J.; Case, G.; Cook, A.; Fischler, M.; Gaines, I.; Hance, R.; Husby, D.

    1986-09-01

The Advanced Computer Program at Fermilab has developed a multiprocessor system which is easy to use and uniquely cost effective for many high energy physics problems. The system is based on single board computers which cost under $2000 each to build, including 2 Mbytes of on-board memory. These standard VME modules each run experiment reconstruction code in Fortran at speeds approaching that of a VAX 11/780. Two versions have been developed: one uses Motorola's 68020 32-bit microprocessor, the other runs with AT&T's 32100. Both include the corresponding floating point coprocessor chip. The first system, when fully configured, uses 70 each of the two types of processors. A 53-processor system has been operated for several months with essentially no down time by computer operators in the Fermilab Computer Center, performing at nearly the capacity of 6 CDC Cyber 175 mainframe computers. The VME crates in which the processing "nodes" sit are connected via a high speed "Branch Bus" to one or more MicroVAX computers which act as hosts, handling system resource management and all I/O in offline applications. An interface from Fastbus to the Branch Bus has been developed for online use which has been tested error-free at 20 Mbytes/sec for 48 hours. ACP hardware modules are now available commercially. A major package of software, including a simulator that runs on any VAX, has been developed. It allows easy migration of existing programs to this multiprocessor environment. This paper describes the ACP Multiprocessor System and early experience with it at Fermilab and elsewhere.

  17. Engineering integrated digital circuits with allosteric ribozymes for scaling up molecular computation and diagnostics.

    Science.gov (United States)

    Penchovsky, Robert

    2012-10-19

    Here we describe molecular implementations of integrated digital circuits, including a three-input AND logic gate, a two-input multiplexer, and 1-to-2 decoder using allosteric ribozymes. Furthermore, we demonstrate a multiplexer-decoder circuit. The ribozymes are designed to seek-and-destroy specific RNAs with a certain length by a fully computerized procedure. The algorithm can accurately predict one base substitution that alters the ribozyme's logic function. The ability to sense the length of RNA molecules enables single ribozymes to be used as platforms for multiple interactions. These ribozymes can work as integrated circuits with the functionality of up to five logic gates. The ribozyme design is universal since the allosteric and substrate domains can be altered to sense different RNAs. In addition, the ribozymes can specifically cleave RNA molecules with triplet-repeat expansions observed in genetic disorders such as oculopharyngeal muscular dystrophy. Therefore, the designer ribozymes can be employed for scaling up computing and diagnostic networks in the fields of molecular computing and diagnostics and RNA synthetic biology.
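Viewed abstractly, the ribozyme circuits compute ordinary Boolean functions. The gates and the multiplexer-decoder cascade can be modeled as follows, with all biochemical detail omitted:

```python
def and3(a, b, c):
    """Three-input AND: output 1 only if all three effector inputs are present."""
    return a and b and c

def mux2(sel, d0, d1):
    """Two-input multiplexer: routes d1 when sel is 1, otherwise d0."""
    return d1 if sel else d0

def decoder_1to2(sel, d):
    """1-to-2 decoder: drives output line `sel` with input d, the other with 0."""
    return (d if not sel else 0, d if sel else 0)

def mux_decoder(sel_in, d0, d1, sel_out):
    """The multiplexer-decoder cascade, modeled as plain Boolean logic."""
    return decoder_1to2(sel_out, mux2(sel_in, d0, d1))
```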

  18. Digital mammography: computer-assisted diagnosis method for mass detection with multiorientation and multiresolution wavelet transforms.

    Science.gov (United States)

    Li, L; Qian, W; Clarke, L P

    1997-11-01

    The authors evaluated a modular computer-assisted diagnosis (CAD) method for mass detection that uses computation of features in three domains (gray level, morphology, and directional texture). Their objectives were to improve the sensitivity of detection and reduce the false-positive (FP) detection rate. The directional wavelet transform (DWT) method, which uses both multiorientation and multiresolution wavelet transforms to improve image preprocessing and segmentation of suspicious areas and to extract both morphologic and directional texture features, was evaluated with a previously reported image database containing 50 normal and 45 abnormal digitized screen-film mammograms. The mammograms contained all mass types and included 16 minimal cancers. This method was compared with the Markov random field (MRF) method to avoid issues related to case selection criteria. Free-response receiver operating characteristic curves were compared for both DWT and MRF methods. For the DWT method, the sensitivity was 98% and the FP detection rate was 1.8 FP findings per image. For the MRF method, the sensitivity was 90% and the FP detection rate was 2.0 FP findings per image. The CAD method applied to the full mammographic image is automatic and independent of mass type. The segmentation of masses as performed with this method may potentially allow visual interpretation according to American College of Radiology criteria.

  19. DANCER: a program for digital anatomical reconstruction of gene expression data

    Science.gov (United States)

    Kankainen, Matti; Wong, Garry

    2003-01-01

    A digital anatomy construction (DANCER) program was developed for gene expression data. DANCER can be used to reconstruct anatomical images from in situ hybridization images, microarray or other gene expression data. The program fills regions of a drawn figure with the corresponding values from a gene expression data set. The output of the program presents the expression levels of a particular gene in a particular region relative to other regions. The program was tested with values from experimental in situ hybridization autoradiographs and from a microarray experiment. Reconstruction of in situ hybridization data from adult rat brain made by DANCER corresponded well with the original autoradiograph. Reconstruction of microarray data from adult mouse brains provided images that reflect actual expression levels. This program should help to provide visualization and interpretation of data derived from gene expression experiments. DANCER may be freely downloaded. PMID:14576332

  20. User's manual for the generalized computer program system. Open-channel flow and sedimentation, TABS-2. Main text

    Science.gov (United States)

    Thomas, W. A.; McAnally, W. H., Jr.

    1985-07-01

    TABS-2 is a generalized numerical modeling system for open-channel flows, sedimentation, and constituent transport. It consists of more than 40 computer programs to perform modeling and related tasks. The major modeling components--RMA-2V, STUDH, and RMA-4--calculate two-dimensional, depth-averaged flows, sedimentation, and dispersive transport, respectively. The other programs in the system perform digitizing, mesh generation, data management, graphical display, output analysis, and model interfacing tasks. Utilities include file management and automatic generation of computer job control instructions. TABS-2 has been applied to a variety of waterways, including rivers, estuaries, bays, and marshes. It is designed for use by engineers and scientists who may not have a rigorous computer background. Use of the various components is described in Appendices A-O. The bound version of the report does not include the appendices. A looseleaf form with Appendices A-O is distributed to system users.

  1. 77 FR 50724 - Developing Software Life Cycle Processes for Digital Computer Software Used in Safety Systems of...

    Science.gov (United States)

    2012-08-22

    ... revision 2 of RG 1.168, ``Verification, Validation, Reviews, and Audits for Digital Computer Software used... NRC-2012- 0195. You may submit comments by any of the following methods: Federal Rulemaking Web Site... possesses and are publicly available, by any of the following methods: Federal Rulemaking Web Site: Go to...

  2. Natural Resources Research Program: Catalog of Computer Programs for Project Management.

    Science.gov (United States)

    project management. These include programs developed for use on a microcomputer, as well as those which run on a host computer but are accessed by a terminal in a field office. A one-page description of each program contains the title; preparing agency; abstract; a summary of the data inputs and outputs; equipment, disk, and memory requirements; operating system and programming language; and a contact for further information. The programs described in this publication are not limited to those available within the Corps, but also include those available from other

  3. E-Rate Program Seen as Too Lean for a Digital Era

    Science.gov (United States)

    Klein, Alyson

    2013-01-01

    As school districts strive to put more technology into schools to support 1-to-1 computing initiatives and prepare for the common-core online assessments, the federal E-rate program is in danger of becoming as outdated and insufficient as a sputtering dial-up connection in a Wi-Fi world. While the program can boast great success since its…

  4. Towards a Serious Game to Help Students Learn Computer Programming

    Directory of Open Access Journals (Sweden)

    Mathieu Muratet

    2009-01-01

    Full Text Available Video games are part of our culture like TV, movies, and books. We believe that this kind of software can be used to increase students' interest in computer science. Video games with goals other than entertainment, known as serious games, are present today in several fields such as education, government, health, defence, industry, civil security, and science. This paper presents a study around a serious game dedicated to strengthening programming skills. Real-time strategy, a popular game genre, seems to be the most suitable kind of game to support such a serious game. From programming-teaching features to video game characteristics, we define a teaching organisation to test whether a serious game can be adapted to learning programming.

  5. A new approach to develop computer-aided detection schemes of digital mammograms

    Science.gov (United States)

    Tan, Maxine; Qian, Wei; Pu, Jiantao; Liu, Hong; Zheng, Bin

    2015-01-01

    The purpose of this study is to develop a new global mammographic image feature analysis based computer-aided detection (CAD) scheme and evaluate its performance in detecting positive screening mammography examinations. A dataset that includes images acquired from 1896 full-field digital mammography (FFDM) screening examinations was used in this study. Among them, 812 cases were positive for cancer and 1084 were negative or benign. After segmenting the breast area, a computerized scheme was applied to compute 92 global mammographic tissue density based features on each of four mammograms of the craniocaudal (CC) and mediolateral oblique (MLO) views. After adding three existing popular risk factors (woman’s age, subjectively rated mammographic density, and family breast cancer history) into the initial feature pool, we applied a Sequential Forward Floating Selection (SFFS) feature selection algorithm to select relevant features from the bilateral CC and MLO view images separately. The selected CC and MLO view image features were used to train two artificial neural networks (ANNs). The results were then fused by a third ANN to build a two-stage classifier to predict the likelihood of the FFDM screening examination being positive. CAD performance was tested using a ten-fold cross-validation method. The computed area under the receiver operating characteristic curve was AUC=0.779±0.025 and the odds ratio monotonically increased from 1 to 31.55 as CAD-generated detection scores increased. The study demonstrated that this new global image feature based CAD scheme had a relatively higher discriminatory power to cue the FFDM examinations with high risk of being positive, which may provide a new CAD-cueing method to assist radiologists in reading and interpreting screening mammograms. PMID:25984710
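
    The Sequential Forward Floating Selection step described above can be sketched as follows. This is a generic SFFS skeleton with a caller-supplied scoring criterion, shown here on a toy score; the study itself scores feature subsets with trained classifiers, which is not reproduced here.

```python
# A sketch of Sequential Forward Floating Selection (SFFS): greedily add
# the feature that most improves a scoring criterion, then conditionally
# drop features whenever doing so strictly beats the best subset found
# so far at that smaller size.

def sffs(features, score, k):
    """Return a subset of `features` of size `k` that maximizes `score`."""
    selected = []
    best = {}  # subset size -> (subset, score)
    while len(selected) < k:
        # Forward step: add the single best remaining feature.
        remaining = [f for f in features if f not in selected]
        selected.append(max(remaining, key=lambda f: score(selected + [f])))
        n, s = len(selected), score(selected)
        if n not in best or s > best[n][1]:
            best[n] = (list(selected), s)
        # Floating (conditional backward) step: remove a feature while
        # that strictly improves on the best subset of the reduced size.
        while len(selected) > 2:
            worst = max(selected,
                        key=lambda f: score([g for g in selected if g != f]))
            reduced = [g for g in selected if g != worst]
            if score(reduced) > best[len(reduced)][1]:
                best[len(reduced)] = (reduced, score(reduced))
                selected = list(reduced)
            else:
                break
    return best[k][0]
```

    The strict-improvement condition on the backward step is what keeps the forward/backward alternation from oscillating forever.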

  6. A new approach to develop computer-aided detection schemes of digital mammograms

    Science.gov (United States)

    Tan, Maxine; Qian, Wei; Pu, Jiantao; Liu, Hong; Zheng, Bin

    2015-06-01

    The purpose of this study is to develop a new global mammographic image feature analysis based computer-aided detection (CAD) scheme and evaluate its performance in detecting positive screening mammography examinations. A dataset that includes images acquired from 1896 full-field digital mammography (FFDM) screening examinations was used in this study. Among them, 812 cases were positive for cancer and 1084 were negative or benign. After segmenting the breast area, a computerized scheme was applied to compute 92 global mammographic tissue density based features on each of four mammograms of the craniocaudal (CC) and mediolateral oblique (MLO) views. After adding three existing popular risk factors (woman’s age, subjectively rated mammographic density, and family breast cancer history) into the initial feature pool, we applied a sequential forward floating selection feature selection algorithm to select relevant features from the bilateral CC and MLO view images separately. The selected CC and MLO view image features were used to train two artificial neural networks (ANNs). The results were then fused by a third ANN to build a two-stage classifier to predict the likelihood of the FFDM screening examination being positive. CAD performance was tested using a ten-fold cross-validation method. The computed area under the receiver operating characteristic curve was AUC = 0.779 ± 0.025 and the odds ratio monotonically increased from 1 to 31.55 as CAD-generated detection scores increased. The study demonstrated that this new global image feature based CAD scheme had a relatively higher discriminatory power to cue the FFDM examinations with high risk of being positive, which may provide a new CAD-cueing method to assist radiologists in reading and interpreting screening mammograms.

  7. Computer aided method for colour calibration and analysis of digital rock photographs

    Directory of Open Access Journals (Sweden)

    Matic Potočnik

    2015-12-01

    Full Text Available The methods used in geology to determine colour and colour coverage are expensive, time consuming, and/or subjective. Estimates of colour coverage can only be approximate, since they are based on rough comparison-based measuring etalons and subjective estimation, which depends upon the skill and experience of the person performing it. We present a method which accelerates, simplifies, and objectifies these tasks using a computer application. It automatically calibrates the colours of a digital photo and enables the user to read colour values and coverage, even after returning from field work. Colour identification is based on the Munsell colour system. For the purposes of colour calibration we use the X-Rite ColorChecker Passport colour chart placed onto the photographed scene. Our computer application detects the ColorChecker colour chart and finds a colour space transformation to calibrate the colour in the photo. The user can then use the application to read colours within selected points or regions of the photo. The results of the computerised colour calibration were compared to the reference values of the ColorChecker chart. The values deviate slightly from the exact values, but the deviation is around the limit of human capability for visual comparison. We devised an experiment comparing the precision of the computerised colour analysis with manual colour analysis performed on a variety of rock samples by geology students using the Munsell Rock-color Chart. The analysis showed that the precision of manual comparative identification on multicoloured samples is somewhat problematic, since the choice of representative colours and observation points for a certain part of a sample is subjective. The computer-based method has the edge in verifiability and repeatability of the analysis, since the application allows the original photo to be saved with colour calibration, along with the tagging of colour-analysed points and regions.
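
    One common way to realize the calibration step described above is to fit an affine colour-correction transform by least squares, mapping the RGB values measured on the chart patches in the photo to the chart's known reference values. The sketch below shows that idea; the application's actual transform model may differ.

```python
import numpy as np

# Fit a 3x3 colour matrix plus offset by least squares, from chart patches
# measured in the photo to the chart's published reference values.

def fit_colour_correction(measured, reference):
    """measured, reference: (N, 3) arrays of patch RGB values."""
    A = np.hstack([measured, np.ones((len(measured), 1))])  # affine term
    M, *_ = np.linalg.lstsq(A, reference, rcond=None)       # (4, 3) matrix
    return M

def apply_colour_correction(M, rgb):
    """Apply the fitted transform to one or more RGB triples."""
    rgb = np.atleast_2d(rgb)
    A = np.hstack([rgb, np.ones((len(rgb), 1))])
    return A @ M
```

    Once `M` is fitted from the 24 ColorChecker patches, every pixel (or user-selected region) in the photo can be corrected with `apply_colour_correction` before the Munsell lookup.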

  8. Comparison of distortion in the mandible using digital panoramic radiography and Cone Beam Computed Tomography

    Directory of Open Access Journals (Sweden)

    Cek Dara Manja

    2017-08-01

    The conclusion of this study is that the average distortion in the mandible is smaller with CBCT than with digital panoramic radiographs, which means that CBCT is more accurate than digital panoramic radiography.

  9. Integration of digital dental casts in cone beam computed tomography scans-a clinical validation study.

    Science.gov (United States)

    Rangel, Frits A; Maal, Thomas J J; de Koning, Martien J J; Bronkhorst, Ewald M; Bergé, Stefaan J; Kuijpers-Jagtman, Anne Marie

    2017-09-20

    Images derived from cone beam computed tomography (CBCT) scans lack detailed information on the dentition and interocclusal relationships needed for proper surgical planning and production of surgical splints. To get a proper representation of the dentition, integration of a digital dental model into the CBCT scan is necessary. The aim of this study was to validate a simplified protocol to integrate digital dental models into CBCT scans using only one scan. Conventional protocol A used one combined upper and lower impression and two CBCT scans. The new protocol B included placement of ten markers on the gingiva, one CBCT scan, and two separate impressions of the upper and lower dentition. Twenty consecutive patients, scheduled for mandibular advancement surgery, were included. To validate protocol B, 3-dimensional reconstructions were made, which were compared by calculating the mean intersurface distances obtained with both protocols. The mean distance for all patients is 0.39 mm for the upper jaw and 0.30 mm for the lower jaw. For ten out of 20 patients, all distances were less than 1 mm; for the other ten, all distances were less than 2 mm. Mean distances of 0.39 and 0.30 mm are clinically acceptable and comparable to other studies; therefore, this new protocol is clinically accurate. It is also less time consuming, gives less radiation exposure for the patient, and has a lower risk of positional errors of the impressions compared to other integration protocols.
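
    The mean intersurface distance used to compare the two protocols can be sketched as a symmetric mean nearest-neighbour distance between the two reconstructed surfaces, shown here on raw vertex clouds with a brute-force search. The study's exact metric (e.g. point-to-triangle distance on the mesh) may differ.

```python
import numpy as np

# Symmetric mean intersurface distance between two surfaces, approximated
# on their vertex clouds with brute-force nearest-neighbour search.

def mean_surface_distance(a, b):
    """a: (N, 3) and b: (M, 3) arrays of surface points, e.g. in mm."""
    d_ab = np.min(np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2), axis=1)
    d_ba = np.min(np.linalg.norm(b[:, None, :] - a[None, :, :], axis=2), axis=1)
    return (d_ab.mean() + d_ba.mean()) / 2.0
```

    The brute-force pairwise matrix is O(N·M) in memory; for full dental meshes a spatial index (k-d tree) would replace it, but the metric is the same.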

  10. Literary drafts, genetic criticism and computational technology. The Beckett Digital Manuscript Project

    NARCIS (Netherlands)

    Sichani, Anna-Maria

    2017-01-01

    This article addresses the Beckett Digital Manuscript Project, an evolving project, currently comprising a series of digital genetic editions of Samuel Beckett’s bilingual literary drafts and a digital library. Following the genetic school of editing, the project’s goal is to explore and represent

  11. Reducing the Digital Divide among Children Who Received Desktop or Hybrid Computers for the Home

    Science.gov (United States)

    Zilka, Gila Cohen

    2016-01-01

    Researchers and policy makers have been exploring ways to reduce the digital divide. Parameters commonly used to examine the digital divide worldwide, as well as in this study, are: (a) the digital divide in the accessibility and mobility of the ICT infrastructure and of the content infrastructure (e.g., sites used in school); and (b) the digital…

  12. HAL/SM language specification. [programming languages and computer programming for space shuttles

    Science.gov (United States)

    Williams, G. P. W., Jr.; Ross, C.

    1975-01-01

    A programming language is presented for the flight software of the NASA Space Shuttle program. It is intended to satisfy virtually all of the flight software requirements of the space shuttle. To achieve this, it incorporates a wide range of features, including applications-oriented data types and organizations, real time control mechanisms, and constructs for systems programming tasks. It is a higher order language designed to allow programmers, analysts, and engineers to communicate with the computer in a form approximating natural mathematical expression. Parts of the English language are combined with standard notation to provide a tool that readily encourages programming without demanding computer hardware expertise. Block diagrams and flow charts are included. The semantics of the language is discussed.

  13. The Analysis of Heterogeneous Text Documents with the Help of the Computer Program NUD*IST

    Directory of Open Access Journals (Sweden)

    Christine Plaß

    2000-12-01

    Full Text Available On the basis of a current research project we discuss the use of the computer program NUD*IST for the analysis and archiving of qualitative documents. Our project examines the social evaluation of spectacular criminal offenses and we identify, digitize and analyze documents from the entire 20th century. Since public and scientific discourses are examined, the data of the project are extraordinarily heterogeneous: scientific publications, court records, newspaper reports, and administrative documents. We want to show how to transfer general questions into a systematic categorization with the assistance of NUD*IST. Apart from the functions, possibilities and limitations of the application of NUD*IST, concrete work procedures and difficulties encountered are described. URN: urn:nbn:de:0114-fqs0003211

  14. Social Support and “Playing Around”: An Examination of How Older Adults Acquire Digital Literacy With Tablet Computers

    Science.gov (United States)

    Tsai, Hsin-yi Sandy; Shillair, Ruth; Cotten, Shelia R.

    2017-01-01

    This study examines how older adults learn to use tablet computers. Learning to use new technologies can help older adults to be included in today’s digital society. However, learning to use new technologies is not always easy, especially for older adults. This study focuses on how older adults learn to use a specific technology, tablet computers, and the role that social support plays in this process. Data for this project are from 21 in-depth interviews with individuals who own tablet computers. We examine how older adults engage with tablet devices and increase their digital literacy. The findings suggest that, for older adults to start to use tablets, social support plays an important role. In addition, a key way that many participants report gaining expertise with the technology is through “playing around” with the tablets. Suggestions for how to help older adults learn to use new technologies are detailed. PMID:26491029

  15. Social Support and "Playing Around": An Examination of How Older Adults Acquire Digital Literacy With Tablet Computers.

    Science.gov (United States)

    Tsai, Hsin-Yi Sandy; Shillair, Ruth; Cotten, Shelia R

    2017-01-01

    This study examines how older adults learn to use tablet computers. Learning to use new technologies can help older adults to be included in today's digital society. However, learning to use new technologies is not always easy, especially for older adults. This study focuses on how older adults learn to use a specific technology, tablet computers, and the role that social support plays in this process. Data for this project are from 21 in-depth interviews with individuals who own tablet computers. We examine how older adults engage with tablet devices and increase their digital literacy. The findings suggest that, for older adults to start to use tablets, social support plays an important role. In addition, a key way that many participants report gaining expertise with the technology is through "playing around" with the tablets. Suggestions for how to help older adults learn to use new technologies are detailed. © The Author(s) 2015.

  16. Evaluation of the use of digital study models in postgraduate orthodontic programs in the United States and Canada.

    Science.gov (United States)

    Shastry, Shruti; Park, Jae Hyun

    2014-01-01

    To investigate the extent, experience, and trends associated with digital model use, as well as the advantages of using a particular study model type (digital or plaster) in postgraduate orthodontic programs in the United States and Canada. An electronic survey consisting of 14 questions was sent to 72 program directors or chairpersons of accredited orthodontic postgraduate programs in the United States and Canada. Fifty-one responded for a 71% response rate. Sixty-five percent of the schools use plaster study models compared with 35% that use digital models. The most common advantages of plaster models were a three-dimensional feel and the ability for them to be mounted on an articulator. The most common advantages of digital models were the ease of storage and retrieval, and the residents' exposure to new technology. About one third of the plaster model users reported that they wanted to switch to digital models in the future, with 12% planning to do so within 1 year. Based on our study, 35% of accredited orthodontic postgraduate programs in the United States and Canada are using digital study models in most cases treated in their programs, and the trend is for increased digital model use in the future.

  17. [An algol program for the computation of empiric regressions].

    Science.gov (United States)

    Peil, J; Schmerling, S

    1977-01-01

    An explanation is given of the meaning of empirical regression and of the domain of application of this biomathematical-statistical procedure. It can be helpful in handling measurement data and in a first stage of data processing, especially when there is a large amount of data. An empirical regression can provide the basis for a functional relationship analysis by giving hints for the choice of empirical mathematical functions. This is useful and necessary in cases where the measured values show greater dispersion and one wants an analytical expression for the course of the measured points. The appendix presents a program listing of the ALGOL program for empirical regression. Detailed remarks are made in the text concerning the program structure, the data input and output, and the program control parameters, to enable biological or medical users to adapt the program to their special problems without the help of a mathematician, and without deeper knowledge of mathematics or detailed insight into the computational aspects of data processing.
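
    An empirical regression in the sense described above can be sketched in a few lines: group the x values into class intervals and take the mean of y within each class, giving a point-wise estimate of the regression curve without assuming a functional form. (A sketch in Python rather than the paper's ALGOL.)

```python
# Empirical regression: class-interval means of y over equal-width x bins.

def empirical_regression(xs, ys, n_bins):
    """Return (bin centers, mean of y per bin); None where a bin is empty."""
    lo, hi = min(xs), max(xs)
    width = (hi - lo) / n_bins
    sums = [0.0] * n_bins
    counts = [0] * n_bins
    for x, y in zip(xs, ys):
        i = min(int((x - lo) / width), n_bins - 1)  # clamp x == hi
        sums[i] += y
        counts[i] += 1
    centers = [lo + (i + 0.5) * width for i in range(n_bins)]
    means = [s / c if c else None for s, c in zip(sums, counts)]
    return centers, means
```

    Plotting the bin means against the bin centers suggests candidate function families (linear, exponential, logistic, ...) for a subsequent functional relationship analysis.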

  18. A Computer Program for Assessing Nuclear Safety Culture Impact

    Energy Technology Data Exchange (ETDEWEB)

    Han, Kiyoon; Jae, Moosung [Hanyang Univ., Seoul (Korea, Republic of)

    2014-10-15

    Through several NPP accidents, including Fukushima Daiichi in 2011 and Chernobyl in 1986, a lack of safety culture was identified as one of the root causes. Due to its latent influence on safety performance, safety culture has become an important issue in safety research. Most studies describe how to evaluate the state of an organization's safety culture; however, they do not include the possibility that an accident occurs due to a lack of safety culture. Because of that, a methodology for evaluating the impact of safety culture on NPP safety is required. In this study, a methodology for assessing safety culture impact is suggested and a computer program is developed for its application. The SCII model is a new methodology for assessing safety culture impact quantitatively by using a PSA model. The developed program visualizes the SCIs and the SCIIs. It might contribute to comparing the level of safety culture among NPPs as well as to improving the safety management of NPPs.

  19. Frequency Domain Computer Programs for Prediction and Analysis of Rail Vehicle Dynamics : Volume 1. Technical Report

    Science.gov (United States)

    1975-12-01

    Frequency domain computer programs developed or acquired by TSC for the analysis of rail vehicle dynamics are described in two volumes. Volume I defines the general analytical capabilities required for computer programs applicable to single rail vehi...

  20. Ethics of prevention: an interactive computer-tailored program.

    Science.gov (United States)

    Van Hooren, Rob H; Van den Borne, Bart W; Curfs, Leopold M G; Widdershoven, Guy A M

    2007-01-01

    This article describes the contents of an interactive computer-tailored program. The program is based on previous studies of the practice of care for persons with Prader-Willi syndrome. This genetic condition is associated with a constant overeating behaviour with the risk of obesity. The aim of the program is to start a process of awareness, reflection, and discussion by caregivers who are confronted with the moral dilemma of respect for autonomy versus restricting overeating behaviour. The program focuses on values (such as health and well-being) that are relevant to caregivers in daily practice. Furthermore, the focus is on various ways of interaction with the client. Caregivers were expected to focus mainly on health, and on both paternalistic and interpretive/deliberative forms of interaction. Sixteen professionals and 12 parents pilot-tested the program contents. With a pre-test, responses on one central case were collected for tailored feedback; with a post-test, the effects of the program were measured. Significant correlations were found between the values of autonomy and consultation and between autonomy and well-being. In contrast to our expectations respondents valued all categories (autonomy, consultation, health, well-being, and liveability for others) as equally important in the pre-test. No significant changes in scores were found between pre- and post-test. The open answers and remarks of participants support the program contents. Participants' responses support previous research findings, advocating a concept of autonomy in terms of positive freedom, through support by others. The promotion of the client's self-understanding and self-development is central in this concept.

  1. High Performance Computing - Power Application Programming Interface Specification.

    Energy Technology Data Exchange (ETDEWEB)

    Laros, James H.,; Kelly, Suzanne M.; Pedretti, Kevin; Grant, Ryan; Olivier, Stephen Lecler; Levenhagen, Michael J.; DeBonis, David

    2014-08-01

    Measuring and controlling the power and energy consumption of high performance computing systems by various components in the software stack is an active research area [13, 3, 5, 10, 4, 21, 19, 16, 7, 17, 20, 18, 11, 1, 6, 14, 12]. Implementations in lower level software layers are beginning to emerge in some production systems, which is very welcome. To be most effective, a portable interface to measurement and control features would significantly facilitate participation by all levels of the software stack. We present a proposal for a standard power Application Programming Interface (API) that endeavors to cover the entire software space, from generic hardware interfaces to the input from the computer facility manager.

  2. Horizon 2020 program for industry 4.0: Towards a digital model of quality

    Directory of Open Access Journals (Sweden)

    Majstorović Vidosav D.

    2015-01-01

    Full Text Available The Manufuture Vision 2020 program was developed as a Strategic Research Agenda and Road Maps, as a basis for EU research within Horizon 2020. This document is a response to the challenges of global competitiveness and sustainable development, calling on EU industry to shift toward innovative production that creates additional value for the customer and uses ICT technologies. Somewhat later, a further document was published, the Manufuture Strategic Research Agenda 2020-2030 and Roadmaps, to support structural changes in industry, which should be oriented toward high automation, effectiveness, and flexibility. Technological development is driven by the potential for technological advancement along the process chain, from raw materials to the product's removal from use, through continuous improvement of the additional value for the customer. These Manufuture documents represent proactive action and collaborative research for 40 industrial sectors. Key areas of this research are the factory of the future (Factory of the Future, FoF) and digital manufacturing (Digital Manufacturing), as well as digital quality.

  3. Building Computer-Based Experiments in Psychology without Programming Skills.

    Science.gov (United States)

    Ruisoto, Pablo; Bellido, Alberto; Ruiz, Javier; Juanes, Juan A

    2016-06-01

    Research in Psychology usually requires building and running experiments. Although this task has traditionally required scripting, recent computer tools based on graphical interfaces offer new opportunities in this field for researchers without programming skills. The purpose of this study is to illustrate and provide a comparative overview of two of the main free open source "point and click" software packages for building and running experiments in Psychology: PsychoPy and OpenSesame. Recommendations for their potential use are further discussed.

  4. Structural mode significance using INCA. [Interactive Controls Analysis computer program

    Science.gov (United States)

    Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.

    1990-01-01

    Structural finite element models are often too large to be used in the design and analysis of control systems. Model reduction techniques must be applied to reduce the structural model to manageable size. In the past, engineers either performed the model order reduction by hand or used distinct computer programs to retrieve the data, to perform the significance analysis and to reduce the order of the model. To expedite this process, the latest version of INCA has been expanded to include an interactive graphical structural mode significance and model order reduction capability.
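
    A common mode significance measure for this kind of model order reduction ranks each second-order modal channel by its approximate DC gain and keeps only the most significant modes; INCA's actual significance criterion may differ, so the sketch below is a generic illustration.

```python
import numpy as np

# Rank structural modes by the approximate DC gain |b_i * c_i| / w_i^2 of
# each modal channel and keep the n_keep most significant ones. This is a
# generic significance measure, not necessarily the one INCA implements.

def select_modes(omegas, b, c, n_keep):
    """omegas: modal frequencies (rad/s); b, c: modal input/output gains."""
    dc_gain = np.abs(b * c) / omegas**2
    order = np.argsort(dc_gain)[::-1]   # most significant first
    return np.sort(order[:n_keep])      # indices of retained modes
```

    The retained mode indices then define the reduced-order structural model handed to the control design.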

  5. Computational Hydrodynamics: How Portable and Scalable Are Heterogeneous Programming Paradigms?

    DEFF Research Database (Denmark)

    Pawlak, Wojciech; Glimberg, Stefan Lemvig; Engsig-Karup, Allan Peter

    New many-core era applications at the interface of mathematics and computer science adopt modern parallel programming paradigms and expose parallelism through proper algorithms. We present new performance results for a novel massively parallel free surface wave model suitable for advanced…-device system sizes from desktops to large HPC systems such as superclusters and in the cloud. The numerical efficiency is evaluated on heterogeneous devices like multi-core CPUs, GPUs, and Xeon Phi coprocessors to test…

  6. Three-Dimensional Computer-Aided Detection of Microcalcification Clusters in Digital Breast Tomosynthesis

    Directory of Open Access Journals (Sweden)

    Ji-wook Jeong

    2016-01-01

    Full Text Available We propose a computer-aided detection (CADe) algorithm for microcalcification (MC) clusters in reconstructed digital breast tomosynthesis (DBT) images. The algorithm consists of prescreening, MC detection, clustering, and false-positive (FP) reduction steps. The DBT images containing MC-like objects were enhanced by a multiscale Hessian-based three-dimensional (3D) objectness response function, and a connected-component segmentation method was applied to extract the cluster seed objects as potential clustering centers of MCs. Secondly, a signal-to-noise ratio (SNR) enhanced image was also generated to detect the individual MC candidates and prescreen the MC-like objects. Each cluster seed candidate was prescreened by counting the individual MC candidates near the seed object according to several microcalcification clustering criteria. As a second step, we introduced bounding boxes for the accepted seed candidates, clustered all the overlapping cubes, and examined them. After the FP reduction step, the average number of FPs per case was estimated to be 2.47 per DBT volume, with a sensitivity of 83.3%.
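
    The seed-prescreening idea above (accept a cluster seed only if enough individual MC candidates lie near it) can be sketched as follows. The `radius` and `min_neighbours` thresholds here are hypothetical placeholders, not the paper's clustering criteria.

```python
import math

# Accept a cluster seed when at least `min_neighbours` individual MC
# candidates lie within `radius` of it (thresholds are illustrative).

def cluster_seeds(seeds, candidates, radius=5.0, min_neighbours=3):
    """seeds, candidates: iterables of 3D points (x, y, z) in voxel units."""
    accepted = []
    for s in seeds:
        n = sum(1 for c in candidates if math.dist(s, c) <= radius)
        if n >= min_neighbours:
            accepted.append(s)
    return accepted
```

    In the full pipeline each accepted seed would then get a bounding box, and overlapping boxes would be merged into one reported cluster.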

  7. Novel fully integrated computer system for custom footwear: from 3D digitization to manufacturing

    Science.gov (United States)

    Houle, Pascal-Simon; Beaulieu, Eric; Liu, Zhaoheng

    1998-03-01

    This paper presents a recently developed custom footwear system, which integrates 3D digitization technology, range image fusion techniques, a 3D graphical environment for corrective actions, parametric curved surface representation and computer numerical control (CNC) machining. In this system, a support designed with the help of biomechanics experts stabilizes the foot in a correct, neutral position. The foot surface is then captured by a 3D camera using active ranging techniques. Software using a library of documented foot pathologies suggests corrective actions on the orthosis. Three kinds of deformation can be applied. The first method maps pad surfaces previously scanned with the 3D scanner onto the foot surface to locally modify its shape. The second is the construction of B-spline surfaces, manipulating control points and modifying knot vectors in a 3D graphical environment to build the desired deformation. The last is a manual electronic 3D pen, which may take different shapes and sizes and carries adjustable 'pressure' information. All applied deformations must respect G1 surface continuity, which ensures that the surface can accommodate a foot. Once the surface modification process is completed, the resulting data are sent to manufacturing software for CNC machining.
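The B-spline manipulation described above rests on standard spline evaluation: moving a control point or a knot reshapes the curve locally. The de Boor evaluation below is a generic sketch of that machinery for a clamped cubic curve; it is not the footwear system's actual surface code, whose representation is not published.

```python
def de_boor(k, x, t, c, p):
    """Evaluate a degree-p B-spline curve at parameter x, where k is the
    knot span index with t[k] <= x < t[k+1], t the knot vector and c the
    control points (scalars here; 3D points work the same way)."""
    d = [c[j + k - p] for j in range(p + 1)]
    for r in range(1, p + 1):
        for j in range(p, r - 1, -1):
            alpha = (x - t[j + k - p]) / (t[j + 1 + k - r] - t[j + k - p])
            d[j] = (1.0 - alpha) * d[j - 1] + alpha * d[j]
    return d[p]

# Clamped cubic: end knots repeated with full multiplicity, so the curve
# interpolates the first and last control points.
knots = [0, 0, 0, 0, 1, 1, 1, 1]
ctrl = [0.0, 0.0, 3.0, 3.0]
midpoint = de_boor(3, 0.5, knots, ctrl, 3)  # symmetric control polygon
```

Editing `ctrl` and re-evaluating is exactly the "manipulating control points" interaction the paper describes, extended to a tensor-product surface in the real system.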

  8. Selective tuberculosis incidence estimation by digital computer information technologies in the MS Excel system

    Directory of Open Access Journals (Sweden)

    G. I. Ilnitsky

    2014-01-01

    Full Text Available The incidence of tuberculosis was estimated in different age groups, applying digital computer information tracking technologies. For this, the author used the annual reporting forms stipulated by the Ministry of Health of Ukraine, the results of his own observations, and the data accumulated in an MS Excel information bank. The initial positions were formed in terms of the epidemiological indicators of Ukraine and the Lvov Region during a 10-year period (2000-2009), which was, owing to different initial characteristics, divided into Step 1 (2000-2004), in which the tuberculosis epidemic situation progressively deteriorated, and Step 2 (2005-2009), in which morbidity was relatively stabilized. The results were processed using the MS Excel statistical and mathematical functions, both parametric and nonparametric, to establish correlations when estimating the changes in epidemic parameters. The findings among the general population lead to the conclusion that the mean tuberculosis morbidity in Ukraine was much greater than that in the Lvov Region, irrespective of age. At the same time, the morbidity rate in the foci of tuberculosis infection rose among children, adolescents, and adults alike, which provides a rationale for better implementation of therapeutic and preventive measures.

  9. Digital mammography with synchrotron radiation: characterization of a novel computed radiography system

    Science.gov (United States)

    Trivellato, S.; Vandenbroucke, D.; Arfelli, F.; Bessem, M.; Fedon, C.; Longo, R.; Tromba, G.; Taibi, A.

    2015-08-01

    Breast X-ray imaging is an active research field aiming to define dedicated equipment, with specialized X-ray sources and efficient detectors, that improves image quality at an equal or even lower patient dose. The Needle Imaging Plate HM5.0, produced by Agfa, has been characterized using synchrotron radiation to assess the performance of this novel imaging chain in comparison to conventional mammographic equipment. The detection performance was first assessed in terms of Detective Quantum Efficiency (DQE); its computation showed that the DQE curves are very close to typical results for digital radiography systems. Image threshold contrast was then evaluated using the CDMAM phantom. The analysis was completed with a scoring of visible details in radiographs of the TORMAM phantom. The characterization thus confirms that monochromaticity yields equal image quality at a lower glandular dose and that phase-contrast effects increase the detectability of anatomical structures. Finally, a preliminary evaluation of clinical images showed a clear improvement in image quality thanks to the phase-contrast contribution and to the detector performance.

  10. Readiness for Delivering Digital Health at Scale: Lessons From a Longitudinal Qualitative Evaluation of a National Digital Health Innovation Program in the United Kingdom.

    Science.gov (United States)

    Lennon, Marilyn R; Bouamrane, Matt-Mouley; Devlin, Alison M; O'Connor, Siobhan; O'Donnell, Catherine; Chetty, Ula; Agbakoba, Ruth; Bikker, Annemieke; Grieve, Eleanor; Finch, Tracy; Watson, Nicholas; Wyke, Sally; Mair, Frances S

    2017-02-16

    Digital health has the potential to support care delivery for chronic illness. Despite positive evidence from localized implementations, new technologies have proven slow to become accepted, integrated, and routinized at scale. The aim of our study was to examine barriers and facilitators to implementation of digital health at scale through the evaluation of a £37m national digital health program: "Delivering Assisted Living Lifestyles at Scale" (dallas), which ran from 2012-2015. The study was a longitudinal, qualitative, multi-stakeholder implementation study. The methods included interviews (n=125) with key implementers, focus groups with consumers and patients (n=7), project meetings (n=12), field work or observation in the communities (n=16), health professional survey responses (n=48), and cross-program documentary evidence on implementation (n=215). We used normalization process theory (NPT), a sociological theory, and a longitudinal (3-year) qualitative framework analysis approach. This work did not study a single intervention or population. Instead, we evaluated the processes of designing and delivering digital health, and our outcomes were the identified barriers and facilitators to delivering and mainstreaming services and products within the mixed-sector digital health ecosystem. We identified three main levels of issues influencing readiness for digital health: macro (market, infrastructure, policy), meso (organizational), and micro (professional or public). Factors hindering implementation included: lack of information technology (IT) infrastructure, uncertainty around information governance, lack of incentives to prioritize interoperability, lack of precedence on accountability within the commercial sector, and a market perceived as difficult to navigate. Factors enabling implementation were: clinical endorsement, champions who promoted digital health, and public and professional willingness. Although there is receptiveness to digital health

  11. Activity computer program for calculating ion irradiation activation

    Science.gov (United States)

    Palmer, Ben; Connolly, Brian; Read, Mark

    2017-07-01

    A computer program, Activity, was developed to predict the activity and gamma lines of materials irradiated with an ion beam. It uses the TENDL (Koning and Rochman, 2012) [1] proton reaction cross section database, the Stopping and Range of Ions in Matter (SRIM) (Biersack et al., 2010) code, a Nuclear Data Services (NDS) radioactive decay database (Sonzogni, 2006) [2] and an ENDF gamma decay database (Herman and Chadwick, 2006) [3]. An extended version of Bateman's equation is used to calculate the activity at time t, and this equation is solved analytically, with the option to also solve by numeric inverse Laplace Transform as a failsafe. The program outputs the expected activity and gamma lines of the activated material.
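The extended Bateman solution the Activity program evaluates has, for a simple linear decay chain with distinct decay constants, the classic closed form. The sketch below computes the atom count and activity of the last chain member; branching ratios, the ion-beam production term, and the numeric inverse-Laplace fallback mentioned above are beyond this minimal version.

```python
import math

def bateman_number(N0, lambdas, t):
    """Atoms of the last member of a linear decay chain at time t, given
    N0 atoms of the first member at t = 0 and distinct decay constants
    `lambdas` (one per chain member, in 1/s)."""
    n = len(lambdas)
    coeff = N0
    for lam in lambdas[:-1]:
        coeff *= lam
    total = 0.0
    for i in range(n):
        denom = 1.0
        for j in range(n):
            if j != i:
                denom *= lambdas[j] - lambdas[i]
        total += math.exp(-lambdas[i] * t) / denom
    return coeff * total

def activity(N0, lambdas, t):
    """Activity (decays/s) of the last chain member at time t."""
    return lambdas[-1] * bateman_number(N0, lambdas, t)
```

For a single nuclide this collapses to N0·e^(−λt); for two members it gives the familiar parent-daughter growth curve λ1·N0·(e^(−λ1t) − e^(−λ2t))/(λ2 − λ1).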

  12. Programming a massively parallel, computation universal system: static behavior

    Energy Technology Data Exchange (ETDEWEB)

    Lapedes, A.; Farber, R.

    1986-01-01

    In previous work by the authors, the "optimum finding" properties of Hopfield neural nets were applied to the nets themselves to create a "neural compiler." This was done in such a way that the problem of programming the attractors of one neural net (called the Slave net) was expressed as an optimization problem that was in turn solved by a second neural net (the Master net). In this series of papers that approach is extended to programming nets that contain interneurons (sometimes called "hidden neurons"), and thus deals with nets capable of universal computation. 22 refs.
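For readers unfamiliar with the attractor dynamics being programmed here, a toy Hopfield net makes the idea concrete: a pattern stored with the Hebbian rule becomes an attractor, and a corrupted state relaxes back to it. Dimensions and the single stored pattern below are illustrative, not from the paper.

```python
def hopfield_step(W, s):
    """One synchronous update: s_i <- sign(sum_j W[i][j] * s_j)."""
    n = len(s)
    return [1 if sum(W[i][j] * s[j] for j in range(n)) >= 0 else -1
            for i in range(n)]

# Store one +/-1 pattern with the Hebbian rule (zero diagonal).
pattern = [1, -1, 1, -1, 1, -1]
n = len(pattern)
W = [[0 if i == j else pattern[i] * pattern[j] / n for j in range(n)]
     for i in range(n)]

# Corrupt one bit; a single update restores the stored attractor.
noisy = pattern[:]
noisy[0] = -noisy[0]
recovered = hopfield_step(W, noisy)
```

Programming the attractors, as the paper does, means choosing W so that the desired states sit at the minima of the net's energy function rather than deriving W from example patterns.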

  13. Accelerated Strategic Computing Initiative (ASCI) Program Plan [FY2000

    Energy Technology Data Exchange (ETDEWEB)

    None

    2000-01-01

    In August 1995, the United States took a significant step to reduce the nuclear danger. The decision to pursue a zero-yield Comprehensive Test Ban Treaty will allow greater control over the proliferation of nuclear weapons and will halt the growth of new nuclear systems. This step is only possible because of the Stockpile Stewardship Program, which provides an alternative means of ensuring the safety, performance, and reliability of the United States' enduring stockpile. At the heart of the Stockpile Stewardship Program is ASCI, which will create the high-confidence simulation capabilities needed to integrate fundamental science, experiments, and archival data into the stewardship of the actual weapons in the stockpile. ASCI will also serve to drive the development of simulation as a national resource by working closely with the computer industry and with universities.

  14. A Computer Program for a Canonical Problem in Underwater Shock

    Directory of Open Access Journals (Sweden)

    Thomas L. Geers

    1994-01-01

    Full Text Available Finite-element/boundary-element codes are widely used to analyze the response of marine structures to underwater explosions. An important step in verifying the correctness and accuracy of such codes is the comparison of code-generated results for canonical problems with corresponding analytical or semianalytical results. At the present time, such comparisons rely on hardcopy results presented in technical journals and reports. This article describes a computer program available from SAVIAC that produces user-selected numerical results for a step-wave-excited spherical shell submerged in and (optionally) filled with an acoustic fluid. The method of solution employed in the program is based on classical expansion of the field quantities in generalized Fourier series in the meridional coordinate. Convergence of the series is enhanced by judicious application of modified Cesàro summation and partial closed-form solution.
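Cesàro summation, invoked above as a convergence aid, replaces the sequence of partial sums by its running averages, which tames oscillatory series. A plain (C,1) sketch follows; the program's modified Cesàro weights are not reproduced here.

```python
def cesaro_c1(terms):
    """(C,1) Cesàro mean of the partial sums of `terms` (non-empty)."""
    partial = 0.0
    running = 0.0
    count = 0
    for a in terms:
        partial += a          # ordinary partial sum
        running += partial    # accumulate partial sums
        count += 1
    return running / count    # average of the partial sums
```

On Grandi's series 1 − 1 + 1 − 1 + …, whose partial sums oscillate between 1 and 0 and never converge, the Cesàro means settle at 1/2, the series' classical Cesàro sum.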

  15. The Effects of a Robot Game Environment on Computer Programming Education for Elementary School Students

    Science.gov (United States)

    Shim, Jaekwoun; Kwon, Daiyoung; Lee, Wongyu

    2017-01-01

    In the past, computer programming was perceived as a task only carried out by computer scientists; in the 21st century, however, computer programming is viewed as a critical and necessary skill that everyone should learn. In order to improve teaching of problem-solving abilities in a computing environment, extensive research is being done on…

  16. Easy-to-use application programs for decay heat and delayed neutron calculations on personal computers

    Energy Technology Data Exchange (ETDEWEB)

    Oyamatsu, Kazuhiro [Nagoya Univ. (Japan)

    1998-03-01

    Application programs for personal computers are developed to calculate the decay heat power and delayed neutron activity from fission products. The main programs can be used on any computer, from personal computers to mainframes, because their sources are written in Fortran. These programs have user-friendly interfaces so that they can be used easily, not only for research activities but also for educational purposes. (author)

  17. A minimax technique for time-domain design of preset digital equalizers using linear programming

    Science.gov (United States)

    Vaughn, G. L.; Houts, R. C.

    1975-01-01

    A linear programming technique is presented for the design of a preset finite-impulse response (FIR) digital filter to equalize the intersymbol interference (ISI) present in a baseband channel with known impulse response. A minimax technique is used which minimizes the maximum absolute error between the actual received waveform and a specified raised-cosine waveform. Transversal and frequency-sampling FIR digital filters are compared as to the accuracy of the approximation, the resultant ISI and the transmitted energy required. The transversal designs typically have slightly better waveform accuracy for a given distortion; however, the frequency-sampling equalizer uses fewer multipliers and requires less transmitted energy. A restricted transversal design is shown to use the least number of multipliers at the cost of a significant increase in energy and loss of waveform accuracy at the receiver.
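The minimax design above is a standard linear program: with equalizer taps h and an auxiliary bound t, minimize t subject to |(c ∗ h)_k − d_k| ≤ t, where c is the channel impulse response and d the sampled raised-cosine target. The sketch below only builds the LP matrices, which any LP solver (e.g. scipy.optimize.linprog) can then minimize; the channel and target in the test are illustrative, not from the paper.

```python
def convolution_matrix(c, num_taps):
    """Rows index output sample k, columns tap j; entry is c[k - j]."""
    n_out = len(c) + num_taps - 1
    return [[c[k - j] if 0 <= k - j < len(c) else 0.0
             for j in range(num_taps)] for k in range(n_out)]

def minimax_lp(c, d, num_taps):
    """Build (obj, A_ub, b_ub) for: min t s.t. -t <= (Ch)_k - d_k <= t.
    Decision variables are [h_0, ..., h_{num_taps-1}, t]."""
    C = convolution_matrix(c, num_taps)
    obj = [0.0] * num_taps + [1.0]  # cost vector: minimize t only
    A_ub, b_ub = [], []
    for row, dk in zip(C, d):
        A_ub.append(row + [-1.0])                 #  (Ch)_k - t <= d_k
        b_ub.append(dk)
        A_ub.append([-x for x in row] + [-1.0])   # -(Ch)_k - t <= -d_k
        b_ub.append(-dk)
    return obj, A_ub, b_ub
```

The optimal t is exactly the minimax error the paper reports; transversal vs. frequency-sampling structures change how C is formed, not the LP itself.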

  18. Validation of a digital videodensitometric program analysis for measurement of left ventricular ejection fraction

    Energy Technology Data Exchange (ETDEWEB)

    Gaux, J.C.; Angel, C.Y.; Pernes, J.M.; Raynaud, A.; Brenot, Ph.; Vuthien, H.; Letienne, G.

    1987-07-01

    Left ventricular (LV) function was studied in 30 patients using digital subtraction angiography by the intravenous approach. Each ventriculogram was processed with a specific videodensitometric analysis to determine LV ejection fraction. The program was verified in an experimental set-up consisting of nine latex balloons filled with contrast medium. Its validation was established by comparing videodensitometric results with classical results supplied by geometric methods. A good correlation was obtained (r = 0.9449) and, furthermore, with experimental models, videodensitometric analysis seemed to be more accurate than geometric analysis. Digital videodensitometry appears to be a valuable and accurate method for quantifying LV function, and a promising technique for determination of real volumes.
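Videodensitometric ejection fraction follows from contrast counts rather than geometric volumes: with background-corrected density proportional to chamber volume, EF = (ED − ES)/ED. A minimal sketch, assuming that proportionality; the paper's calibration and background-correction details are not public.

```python
def ejection_fraction(ed_counts, es_counts, background=0.0):
    """EF from summed densitometric counts at end-diastole (ED) and
    end-systole (ES); counts are assumed proportional to LV volume."""
    ed = ed_counts - background
    es = es_counts - background
    return (ed - es) / ed
```

Because only the ratio matters, no absolute volume calibration is needed, which is precisely the appeal of the densitometric approach over geometric models.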

  19. Research on the digital education resources of sharing pattern in independent colleges based on cloud computing environment

    Science.gov (United States)

    Xiong, Ting; He, Zhiwen

    2017-06-01

    Cloud computing was first proposed by Google in the United States; based on Internet data centers, it provides a standard, open approach to shared network services. With the rapid development of higher education in China, the educational resources provided by colleges and universities fall far short of actual teaching needs. Cloud computing, which uses Internet technology to provide sharing methods, has therefore become an important means of sharing digital education resources in current higher education. Based on a cloud computing environment, this paper analyzes the existing problems in sharing digital educational resources among the independent colleges of Jiangxi Province. Drawing on the cloud computing characteristics of mass storage, efficient operation, and low cost, the author explores the design of a sharing model for the digital educational resources of higher education in independent colleges. Finally, the design of the shared model is put into practical application.

  20. Program Predicts Time Courses of Human/Computer Interactions

    Science.gov (United States)

    Vera, Alonso; Howes, Andrew

    2005-01-01

    CPM X is a computer program that predicts sequences of, and amounts of time taken by, routine actions performed by a skilled person performing a task. Unlike programs that simulate the interaction of the person with the task environment, CPM X predicts the time course of events as consequences of encoded constraints on human behavior. The constraints determine which cognitive and environmental processes can occur simultaneously and which have sequential dependencies. The input to CPM X comprises (1) a description of a task and strategy in a hierarchical description language and (2) a description of architectural constraints in the form of rules governing interactions of fundamental cognitive, perceptual, and motor operations. The output of CPM X is a Program Evaluation Review Technique (PERT) chart that presents a schedule of predicted cognitive, motor, and perceptual operators interacting with a task environment. The CPM X program allows direct, a priori prediction of skilled user performance on complex human-machine systems, providing a way to assess critical interfaces before they are deployed in mission contexts.
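The PERT schedule CPM X emits is, at its core, a longest-path computation over operators with sequential dependencies: an operator's earliest finish is its duration after all prerequisites complete, and the predicted task time is the maximum over all operators. A sketch on a toy operator graph; the operator names and durations are illustrative, not CPM X's actual operator set.

```python
from functools import lru_cache

# Hypothetical operators (ms) and their sequential dependencies.
durations = {"perceive": 100, "cognize": 50, "move-hand": 200, "press-key": 70}
prereqs = {"cognize": ["perceive"],
           "move-hand": ["cognize"],
           "press-key": ["cognize", "move-hand"]}

@lru_cache(maxsize=None)
def earliest_finish(op):
    """Earliest finish time: own duration after all prerequisites end."""
    start = max((earliest_finish(p) for p in prereqs.get(op, [])), default=0)
    return start + durations[op]

# Predicted completion time of the whole routine task.
makespan = max(earliest_finish(op) for op in durations)
```

Operators not on the longest (critical) path can slide without changing the prediction, which is how such models identify which cognitive, perceptual, and motor steps actually bound performance.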

  1. Analog to digital converter for two-dimensional radiant energy array computers

    Science.gov (United States)

    Shaefer, D. H.; Strong, J. P., III (Inventor)

    1977-01-01

    The analog to digital converter stage derives a bit array of digital radiant energy signals representative of the amplitudes of an input radiant energy analog signal array and derives an output radiant energy analog signal array to serve as an input to succeeding stages. The converter stage includes a digital radiant energy array device which registers those array positions at which the analog array reaches a predetermined threshold level. A scaling device amplifies the radiant signal levels of the input array and the digital array so that the radiant energy signal level carried by the digital array corresponds to the threshold level. An adder device adds the signals of the scaled input and digital arrays at corresponding array positions to form the output analog array.
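Stripped of the optical-array mechanics, the stage-by-stage scheme in this patent abstract — threshold to get a bit, combine and scale to form a residue for the next stage — is the classic pipelined ADC recurrence. A scalar sketch for a normalized input in [0, 1); the per-stage signal combination here is a generic textbook form, not the patented optical arrangement.

```python
def adc_stages(x, n_bits, threshold=0.5):
    """Extract n_bits binary digits from x in [0, 1): each stage compares
    against the threshold, removes the detected level if the bit is set,
    and scales the residue by 2 before passing it to the next stage."""
    bits = []
    for _ in range(n_bits):
        bit = 1 if x >= threshold else 0
        bits.append(bit)
        x = 2.0 * (x - threshold * bit)  # residue feeds the next stage
    return bits
```

Each pass peels off one bit of the binary expansion, so the array device in the patent effectively computes one such pass in parallel over every element of the 2D input.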

  2. Digital System e-Prognostics for Critical Aircraft Computer Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Impact Technologies, in cooperation with Raytheon, proposes to develop and demonstrate an innovative prognostics approach for aircraft digital electronics. The...

  3. Automated Breast Density Computation in Digital Mammography and Digital Breast Tomosynthesis: Influence on Mean Glandular Dose and BIRADS Density Categorization.

    Science.gov (United States)

    Castillo-García, Maria; Chevalier, Margarita; Garayoa, Julia; Rodriguez-Ruiz, Alejandro; García-Pinto, Diego; Valverde, Julio

    2017-07-01

    The study aimed to compare the breast density estimates from two algorithms on full-field digital mammography (FFDM) and digital breast tomosynthesis (DBT) and to analyze the clinical implications. We selected 561 FFDM and DBT examinations from patients without breast pathologies. Two versions of a commercial software (Quantra 2D and Quantra 3D) calculated the volumetric breast density automatically in FFDM and DBT, respectively. Other parameters such as area breast density and total breast volume were evaluated. We compared the results from both algorithms using the Mann-Whitney U non-parametric test and the Spearman's rank coefficient for data correlation analysis. Mean glandular dose (MGD) was calculated following the methodology proposed by Dance et al. Measurements with both algorithms are well correlated (r ≥ 0.77). However, there are statistically significant differences between the medians; breast density median values from FFDM are, respectively, 8% and 77% higher than the DBT estimations. Both algorithms classify 35% and 55% of breasts into BIRADS (Breast Imaging-Reporting and Data System) b and c categories, respectively. There are no significant differences between the MGD calculated using the breast density from each algorithm. DBT delivers higher MGD than FFDM, with a lower difference (5%) for breasts in the BIRADS d category. MGD is, on average, 6% higher than values obtained with the breast glandularity proposed by Dance et al. Breast density measurements from both algorithms lead to equivalent BIRADS classification and MGD values, hence showing no difference in clinical outcomes. The median MGD values of FFDM and DBT examinations are similar for dense breasts (BIRADS d category). Published by Elsevier Inc.
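The Spearman rank coefficient used for the correlation analysis above has, for tie-free data, the closed form ρ = 1 − 6Σd²/(n(n² − 1)), where d is the per-case rank difference. A minimal sketch, assuming no ties; the study's real data would require tie-corrected ranks.

```python
def spearman_rho(x, y):
    """Spearman rank correlation for tie-free paired samples."""
    n = len(x)

    def ranks(v):
        order = sorted(range(n), key=lambda i: v[i])
        r = [0] * n
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1.0 - 6.0 * d2 / (n * (n * n - 1))
```

Because it operates on ranks, the coefficient captures the monotone agreement between the FFDM and DBT density estimates even when their scales differ, which is exactly the situation the study reports.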

  4. Computer programs for forward and inverse modeling of acoustic and electromagnetic data

    Science.gov (United States)

    Ellefsen, Karl J.

    2011-01-01

    A suite of computer programs was developed by U.S. Geological Survey personnel for forward and inverse modeling of acoustic and electromagnetic data. This report describes the computer resources that are needed to execute the programs, the installation of the programs, the program designs, some tests of their accuracy, and some suggested improvements.

  5. Technical Note: Guidelines for the digital computation of 2D and 3D enamel thickness in hominoid teeth.

    Science.gov (United States)

    Benazzi, Stefano; Panetta, Daniele; Fornai, Cinzia; Toussaint, Michel; Gruppioni, Giorgio; Hublin, Jean-Jacques

    2014-02-01

    The study of enamel thickness has received considerable attention in regard to the taxonomic, phylogenetic and dietary assessment of human and non-human primates. Recent developments based on two-dimensional (2D) and three-dimensional (3D) digital techniques have facilitated accurate analyses, preserving the original object from invasive procedures. Various digital protocols have been proposed. These include several procedures based on manual handling of the virtual models and technical shortcomings, which prevent other scholars from confidently reproducing the entire digital protocol. There is a compelling need for standard, reproducible, and well-tailored protocols for the digital analysis of 2D and 3D dental enamel thickness. In this contribution we provide essential guidelines for the digital computation of 2D and 3D enamel thickness in hominoid molars, premolars, canines and incisors. We modify previous techniques suggested for 2D analysis and we develop a new approach for 3D analysis that can also be applied to premolars and anterior teeth. For each tooth class, the cervical line should be considered as the fundamental morphological feature both to isolate the crown from the root (for 3D analysis) and to define the direction of the cross-sections (for 2D analysis). Copyright © 2013 Wiley Periodicals, Inc.
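In 2D protocols of this kind, average enamel thickness is conventionally computed as the enamel cap area in the cross-section divided by the length of the enamel-dentine junction (EDJ). A generic sketch over digitized outline points follows; the landmark definitions and section orientation must come from the paper's actual protocol.

```python
import math

def polygon_area(pts):
    """Shoelace area of a closed polygon given as (x, y) vertices."""
    n = len(pts)
    s = sum(pts[i][0] * pts[(i + 1) % n][1]
            - pts[(i + 1) % n][0] * pts[i][1] for i in range(n))
    return abs(s) / 2.0

def polyline_length(pts):
    """Length of an open polyline, e.g. a digitized EDJ trace."""
    return sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))

def average_enamel_thickness(enamel_outline, edj_trace):
    """2D AET: enamel cap area divided by EDJ length."""
    return polygon_area(enamel_outline) / polyline_length(edj_trace)
```

The 3D analogue divides enamel volume by EDJ surface area, computed on the crown isolated at the cervical line as the guidelines specify.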

  6. [Computer-aided method and rapid prototyping for the personalized fabrication of a silicone bandage digital prosthesis].

    Science.gov (United States)

    Ventura Ferreira, Nuno; Leal, Nuno; Correia Sá, Inês; Reis, Ana; Marques, Marisa

    2014-01-01

    The fabrication of digital prostheses has acquired growing importance, not only because it allows the patient to overcome psychosocial trauma but also because it promotes grip functionality. An application of three-dimensional computer-aided design technologies to the production of passive prostheses is presented through the clinical case of a fifth-finger amputee following bilateral hand replantation. Three-dimensional computerized tomography was used to collect anthropometric images of the hands. Computer-aided design techniques were used to develop the digital file-based prosthesis from the reconstruction images by inverting and superimposing the contralateral finger images. The rapid prototyping manufacturing method was used to produce a silicone bandage prosthesis prototype. This approach replaces the traditional manual method with a virtual method that is the basis for the optimization of a high-speed, accurate and innovative process.

  7. The comparative study on diagnostic validity of cerebral aneurysm by computed tomography angiography versus digital subtraction angiography after subarachnoid hemorrhage

    Directory of Open Access Journals (Sweden)

    Masih Saboori

    2011-01-01

    Full Text Available Background: In order to declare the preoperative diagnostic value of brain aneurysms, two radiological modalities, computed tomographic angiography and digital subtraction angiography, were compared. Methods: In this descriptive analytic study, the diagnostic value of computed tomographic angiography (CTA) was compared with digital subtraction angiography (DSA). Sensitivity, specificity, and positive and negative predictive values were calculated and compared between the two modalities. All data were analyzed with SPSS software, version 16. Results: The mean age of patients was 49.5 ± 9.13 years; 57.9% of subjects were female. CTA showed 89% sensitivity and 100% specificity, whereas DSA demonstrated 74% sensitivity and 100% specificity. The positive predictive value of both methods was 100%, but the negative predictive values of CTA and DSA were 85% and 69%, respectively. Conclusions: Based on our data, CTA is a valuable diagnostic modality for detection of brain aneurysm and subarachnoid hemorrhage.
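The validity indices compared above come straight from the 2×2 confusion table. A small sketch; the counts in the test are illustrative, not reconstructed from the study.

```python
def diagnostic_indices(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV and NPV from confusion-table counts:
    tp/fp = true/false positives, tn/fn = true/false negatives."""
    return {
        "sensitivity": tp / (tp + fn),  # detected among diseased
        "specificity": tn / (tn + fp),  # cleared among healthy
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }
```

Note how a specificity of 100% (no false positives) forces PPV to 100% regardless of sensitivity, which is exactly the pattern both CTA and DSA show in this study.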

  8. Threshold evaluation data revision and computer program enhancement. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1979-02-27

    The Threshold Evaluation System was developed to assist the Division of Buildings and Community Systems of the Department of Energy in performing preliminary evaluation of projects being considered for funding. In addition, the evaluation has been applied to ongoing projects, because information obtained through RD&D may alter the expected benefits and costs of a project, making it necessary to reevaluate project funding. The system evaluates each project according to its expected energy savings and costs. A number of public and private sector criteria are calculated, upon which comparisons between projects may be based. A summary of the methodology is given in Appendix B. The purpose of this task is to upgrade both the quality of the data used for input to the system and the usefulness and efficiency of the computer program used to perform the analysis. The modifications required to produce a better, more consistent set of data are described in Section 2. Program changes that have had a significant impact on the methodology are discussed in Section 3, while those that affected only the computer code are presented as a system flow diagram and program listing in Appendix C. These improvements in the project evaluation methodology and data will provide BCS with a more efficient and comprehensive management tool. The direction of future work will be toward integrating this system with a large-scale system (at ORNL) so that information used by both systems may be stored in a common data base. A discussion of this and other unresolved problems is given in Section 4.
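A screening criterion of the kind described — expected energy savings weighed against project costs — typically reduces to a discounted net-benefit figure. A generic sketch, in which the discount rate, horizon, and cash flows are illustrative assumptions rather than the system's actual criteria:

```python
def net_present_value(rate, cash_flows):
    """NPV of yearly cash flows; cash_flows[0] occurs at year 0."""
    return sum(cf / (1.0 + rate) ** year
               for year, cf in enumerate(cash_flows))

def passes_threshold(rate, cost, yearly_savings, years, threshold=0.0):
    """Does the project's discounted net benefit clear the threshold?"""
    flows = [-cost] + [yearly_savings] * years
    return net_present_value(rate, flows) > threshold
```

Public- and private-sector criteria differ mainly in the discount rate and threshold applied to the same cash-flow stream, which is why the system computes several criteria from one project description.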

  9. Frequency Domain Computer Programs for Prediction and Analysis of Rail Vehicle Dynamics : Volume 2. Appendixes

    Science.gov (United States)

    1975-12-01

    Frequency domain computer programs developed or acquired by TSC for the analysis of rail vehicle dynamics are described in two volumes. Volume 2 contains program listings including subroutines for the four TSC frequency domain programs described in V...

  10. Second Annual AEC Scientific Computer Information Exhange Meeting. Proceedings of the technical program theme: computer graphics

    Energy Technology Data Exchange (ETDEWEB)

    Peskin,A.M.; Shimamoto, Y.

    1974-01-01

    The topic of computer graphics serves well to illustrate that AEC affiliated scientific computing installations are well represented in the forefront of computing science activities. The participant response to the technical program was overwhelming--both in number of contributions and quality of the work described. Session I, entitled Advanced Systems, contains presentations describing systems that contain features not generally found in graphics facilities. These features can be roughly classified as extensions of standard two-dimensional monochromatic imaging to higher dimensions including color and time as well as multidimensional metrics. Session II presents seven diverse applications ranging from high energy physics to medicine. Session III describes a number of important developments in establishing facilities, techniques and enhancements in the computer graphics area. Although an attempt was made to schedule as many of these worthwhile presentations as possible, it appeared impossible to do so given the scheduling constraints of the meeting. A number of prospective presenters 'came to the rescue' by graciously withdrawing from the sessions. Some of their abstracts have been included in the Proceedings.

  11. Digital imaging primer

    CERN Document Server

    Parkin, Alan

    2016-01-01

    Digital Imaging targets everyone with an interest in digital imaging, be they professional or private, who uses even quite modest equipment such as a PC, digital camera and scanner, a graphics editor such as Paint, and an inkjet printer. Uniquely, it is intended to fill the gap between highly technical texts for academics (with access to expensive equipment) and superficial introductions for amateurs. The four-part treatment spans theory, technology, programs and practice. Theory covers integer arithmetic, additive and subtractive color, greyscales, computational geometry, and a new presentation of discrete Fourier analysis; Technology considers bitmap file structures, scanners, digital cameras, graphic editors, and inkjet printers; Programs develops several processing tools for use in conjunction with a standard Paint graphics editor and supplementary processing tools; Practice discusses 1-bit, greyscale, 4-bit, 8-bit, and 24-bit images for the practice section. Relevant QBASIC code is supplied in an accompa...

  12. Multi-Language Programming Environments for High Performance Java Computing

    Directory of Open Access Journals (Sweden)

    Vladimir Getov

    1999-01-01

    Full Text Available Recent developments in processor capabilities, software tools, programming languages and programming paradigms have brought about new approaches to high performance computing. A steadfast component of this dynamic evolution has been the scientific community’s reliance on established scientific packages. As a consequence, programmers of high‐performance applications are reluctant to embrace evolving languages such as Java. This paper describes the Java‐to‐C Interface (JCI) tool which provides application programmers wishing to use Java with immediate accessibility to existing scientific packages. The JCI tool also facilitates rapid development and reuse of existing code. These benefits are provided at minimal cost to the programmer. While beneficial to the programmer, the additional advantages of mixed‐language programming in terms of application performance and portability are addressed in detail within the context of this paper. In addition, we discuss how the JCI tool is complementing other ongoing projects such as IBM’s High‐Performance Compiler for Java (HPCJ) and IceT’s metacomputing environment.

  13. Computer Says No: An Analysis of Three Digital Food Education Resources

    Science.gov (United States)

    Gard, Michael; Enright, Eimear

    2016-01-01

    What kind of thing will food education become in digitised classrooms? Drawn from a broader research project concerned with the "e turn" in school health and physical education, this paper analyses three examples of digital food education (DEF). This is done by considering the role of digital technology in changing--or not…

  14. Designing Computer-Based Learning Contents: Influence of Digital Zoom on Attention

    Science.gov (United States)

    Glaser, Manuela; Lengyel, Dominik; Toulouse, Catherine; Schwan, Stephan

    2017-01-01

    In the present study, we investigated the role of digital zoom as a tool for directing attention while looking at visual learning material. In particular, we analyzed whether minimal digital zoom functions similarly to a rhetorical device by cueing mental zooming of attention accordingly. Participants were presented either static film clips, film…

  15. A Computer-Assisted Learning Model Based on the Digital Game Exponential Reward System

    Science.gov (United States)

    Moon, Man-Ki; Jahng, Surng-Gahb; Kim, Tae-Yong

    2011-01-01

    The aim of this research was to construct a motivational model which would stimulate voluntary and proactive learning using digital game methods offering players more freedom and control. The theoretical framework of this research lays the foundation for a pedagogical learning model based on digital games. We analyzed the game reward system, which…

  16. UmUTracker: A versatile MATLAB program for automated particle tracking of 2D light microscopy or 3D digital holography data

    Science.gov (United States)

    Zhang, Hanqing; Stangner, Tim; Wiklund, Krister; Rodriguez, Alvaro; Andersson, Magnus

    2017-10-01

    We present a versatile and fast MATLAB program (UmUTracker) that automatically detects and tracks particles by analyzing video sequences acquired by either light microscopy or digital in-line holographic microscopy. Our program detects the 2D lateral positions of particles with an algorithm based on the isosceles triangle transform, and reconstructs their 3D axial positions by a fast implementation of the Rayleigh-Sommerfeld model using a radial intensity profile. To validate the accuracy and performance of our program, we first track the 2D position of polystyrene particles using bright field and digital holographic microscopy. Second, we determine the 3D particle position by analyzing synthetic and experimentally acquired holograms. Finally, to highlight the full program features, we profile the microfluidic flow in a 100 μm high flow chamber. This result agrees with computational fluid dynamic simulations. On a regular desktop computer UmUTracker can detect, analyze, and track multiple particles at 5 frames per second for a template size of 201 × 201 in a 1024 × 1024 image. To enhance usability and to make it easy to implement new functions we used object-oriented programming. UmUTracker is suitable for studies related to: particle dynamics, cell localization, colloids and microfluidic flow measurement. Program Files doi: http://dx.doi.org/10.17632/fkprs4s6xp.1 Licensing provisions: Creative Commons by 4.0 (CC by 4.0) Programming language: MATLAB Nature of problem: 3D multi-particle tracking is a common technique in physics, chemistry and biology. However, in terms of accuracy, reliable particle tracking is a challenging task since results depend on sample illumination, particle overlap, motion blur and noise from recording sensors. Additionally, the computational performance is also an issue if, for example, a computationally expensive process is executed, such as axial particle position reconstruction from digital holographic microscopy data. Versatile

  17. From Cards To Digital Games

    DEFF Research Database (Denmark)

    Valente, Andrea; Marchetti, Emanuela

    2017-01-01

    This study is based on an iterative, participatory design investigation that we are conducting in order to create digital games that could be flexibly re-designed by players, without requiring programming knowledge. In particular we focus on digital game development, both design and implementation, for primary school pupils and their teachers. We propose a scenario where digital game development is mediated by tinkering with paper prototypes similar to board games. We address the problems of making sense and expressing rules of a digital game without programming. Analysis of our latest participatory workshop offers evidence that a board game can work as a tangible model of the computation happening in a digital game. Children understand the practice of designing games mainly as manipulation of features and behaviors of the visual elements of a game. We attempt at looking beyond visual programming...

  18. Digital Art and Design

    OpenAIRE

    Khaldoun A. A. BESOUL; Al Salaimeh, Safwan; Khaled BATIHA

    2007-01-01

    The desire to create unique things and give free rein to one's imagination served as a powerful impetus to the development of digital art and design software. The more common computers became, the wider the variety of professional software that was developed. Nowadays creators and computer designers are receiving more and more new and advanced programs that allow their ideas to become virtual reality. This research paper looks at the history of the development of graphic editors from the simple...

  19. Introduction to Focus Issue: Intrinsic and Designed Computation: Information Processing in Dynamical Systems-Beyond the Digital Hegemony

    Science.gov (United States)

    Crutchfield, James P.; Ditto, William L.; Sinha, Sudeshna

    2010-09-01

    How dynamical systems store and process information is a fundamental question that touches a remarkably wide set of contemporary issues: from the breakdown of Moore's scaling laws—that predicted the inexorable improvement in digital circuitry—to basic philosophical problems of pattern in the natural world. It is a question that also returns one to the earliest days of the foundations of dynamical systems theory, probability theory, mathematical logic, communication theory, and theoretical computer science. We introduce the broad and rather eclectic set of articles in this Focus Issue that highlights a range of current challenges in computing and dynamical systems.

  1. Pigmented Skin Lesion Biopsies After Computer-Aided Multispectral Digital Skin Lesion Analysis.

    Science.gov (United States)

    Winkelmann, Richard R; Tucker, Natalie; White, Richard; Rigel, Darrell S

    2015-11-01

    The incidence of melanoma has been rising over the past century. With 37% of patients presenting to their primary care physician with at least 1 skin problem, primary care physicians and other nondermatologist practitioners have substantial opportunity to make an impact at the forefront of the disease process. New diagnostic aids have been developed to augment physician analysis of suspicious pigmented skin lesions (PSLs). To determine the effects of computer-aided multispectral digital skin lesion analysis (MSDSLA) on dermatologists' and nondermatologist clinicians' decisions to biopsy suspicious PSLs after clinical and dermatoscopic evaluation. Participants were shown 6 images of PSLs. For each PSL, participants were asked 3 times if they would biopsy the lesion: first after reviewing a clinical image of the PSL, again after reviewing a high-resolution dermatoscopic image, and again after reviewing MSDSLA probability findings. An answer was right if a melanoma or high-risk lesion was selected for biopsy or a low-risk lesion was not selected for biopsy. An answer was wrong if a melanoma or high-risk lesion was not selected for biopsy or a low-risk lesion was selected for biopsy. Clinicians' decisions to biopsy were evaluated using χ² analysis for proportions. Data were analyzed from a total of 212 participants, 177 of whom were dermatologists. Overall, sensitivity of clinical image review was 63%; dermatoscopic image review, 5%; and MSDSLA, 83%. Specificity of clinical image review was 59%; dermatoscopic image review, 40%; and MSDSLA, 76%. Biopsy decision accuracy was 61% after review of clinical images, 52% after review of dermatoscopic images, and 80% after review of MSDSLA findings. The number of lesions participants indicated that they would biopsy increased significantly, from 52% after reviewing clinical images to 63% after reviewing dermatoscopic images (P…

  2. Computer-aided detection system applied to full-field digital mammograms

    Energy Technology Data Exchange (ETDEWEB)

    Vega Bolivar, Alfonso; Sanchez Gomez, Sonia; Merino, Paula; Alonso-Bartolome, Pilar; Ortega Garcia, Estrella (Dept. of Radiology, Univ. Marques of Valdecilla Hospital, Santander (Spain)), e-mail: avegab@telefonica.net; Munoz Cacho, Pedro (Dept. of Statistics, Univ. Marques of Valdecilla Hospital, Santander (Spain)); Hoffmeister, Jeffrey W. (iCAD, Inc., Nashua, NH (United States))

    2010-12-15

    Background: Although mammography remains the mainstay for breast cancer screening, it is an imperfect examination with a sensitivity of 75-92% for breast cancer. Computer-aided detection (CAD) has been developed to improve mammographic detection of breast cancer. Purpose: To retrospectively estimate CAD sensitivity and false-positive rate with full-field digital mammograms (FFDMs). Material and Methods: CAD was used to evaluate 151 cases of ductal carcinoma in situ (DCIS) (n=48) and invasive breast cancer (n=103) detected with FFDM. Retrospectively, CAD sensitivity was estimated based on breast density, mammographic presentation, histopathology type, and lesion size. CAD false-positive rate was estimated with screening FFDMs from 200 women. Results: CAD detected 93% (141/151) of cancer cases: 97% (28/29) in fatty breasts, 94% (81/86) in breasts containing scattered fibroglandular densities, 90% (28/31) in heterogeneously dense breasts, and 80% (4/5) in extremely dense breasts. CAD detected 98% (54/55) of cancers manifesting as calcifications, 89% (74/83) as masses, and 100% (13/13) as mixed masses and calcifications. CAD detected 92% (73/79) of invasive ductal carcinomas, 89% (8/9) of invasive lobular carcinomas, 93% (14/15) of other invasive carcinomas, and 96% (46/48) of DCIS. CAD sensitivity for cancers 1-10 mm was 87% (47/54); 11-20 mm, 99% (70/71); 21-30 mm, 86% (12/14); and larger than 30 mm, 100% (12/12). The CAD false-positive rate was 2.5 marks per case. Conclusion: CAD with FFDM showed a high sensitivity in identifying cancers manifesting as calcifications or masses. CAD sensitivity was maintained in small lesions (1-20 mm) and invasive lobular carcinomas, which have lower mammographic sensitivity
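
    The reported percentages follow directly from the detection counts given in the abstract; recomputing a few of them is a quick consistency check:

```python
# Recompute CAD sensitivities from the abstract's detection counts.
def pct(detected, total):
    return round(100 * detected / total)

overall        = pct(141, 151)  # all cancers
fatty          = pct(28, 29)    # fatty breasts
calcifications = pct(54, 55)    # cancers seen as calcifications
small_lesions  = pct(47, 54)    # lesions 1-10 mm
```

    Each value matches the percentage quoted in the abstract (93%, 97%, 98%, and 87% respectively).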

  3. The Digital Astronaut Project Computational Bone Remodeling Model (Beta Version) Bone Summit Summary Report

    Science.gov (United States)

    Pennline, James; Mulugeta, Lealem

    2013-01-01

    Under the conditions of microgravity, astronauts lose bone mass at a rate of 1% to 2% a month, particularly in the lower extremities such as the proximal femur [1-3]. The most commonly used countermeasure against bone loss in microgravity has been prescribed exercise [4]. However, data has shown that existing exercise countermeasures are not as effective as desired for preventing bone loss in long duration, 4 to 6 months, spaceflight [1,3,5,6]. This spaceflight related bone loss may cause early onset of osteoporosis to place the astronauts at greater risk of fracture later in their lives. Consequently, NASA seeks to have improved understanding of the mechanisms of bone demineralization in microgravity in order to appropriately quantify this risk, and to establish appropriate countermeasures [7]. In this light, NASA's Digital Astronaut Project (DAP) is working with the NASA Bone Discipline Lead to implement well-validated computational models to help predict and assess bone loss during spaceflight, and enhance exercise countermeasure development. More specifically, computational modeling is proposed as a way to augment bone research and exercise countermeasure development to target weight-bearing skeletal sites that are most susceptible to bone loss in microgravity, and thus at higher risk for fracture. Given that hip fractures can be debilitating, the initial model development focused on the femoral neck. Future efforts will focus on including other key load bearing bone sites such as the greater trochanter, lower lumbar, proximal femur and calcaneus. The DAP has currently established an initial model (Beta Version) of bone loss due to skeletal unloading in femoral neck region. The model calculates changes in mineralized volume fraction of bone in this segment and relates it to changes in bone mineral density (vBMD) measured by Quantitative Computed Tomography (QCT). 
The model is governed by equations describing changes in bone volume fraction (BVF), and rates of
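
    The abstract stops short of stating the governing equations. As a purely generic illustration of the kind of relationship involved (NOT the DAP model itself), a first-order relaxation of bone volume fraction toward an unloaded set point reproduces loss rates of roughly 1-2% per month; every parameter value below is an assumption chosen for the illustration.

```python
import math

# Generic first-order unloading model (NOT the DAP governing equations):
# BVF relaxes from a loaded set point toward an unloaded one.
BVF0, BVF_UNLOADED, TAU_DAYS = 0.85, 0.70, 300.0   # assumed values

def bvf(t_days):
    """Bone volume fraction after t_days of skeletal unloading."""
    return BVF_UNLOADED + (BVF0 - BVF_UNLOADED) * math.exp(-t_days / TAU_DAYS)

def vbmd(bvf_value, tissue_density=1.2):
    """Relate volume fraction to volumetric BMD (g/cm^3; assumed density)."""
    return bvf_value * tissue_density
```

    With these assumed constants, the first 30 days of unloading lose about 1.7% of the initial bone volume fraction, in the 1-2%/month range quoted above.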

  4. VICAR-DIGITAL image processing system

    Science.gov (United States)

    Billingsley, F.; Bressler, S.; Friden, H.; Morecroft, J.; Nathan, R.; Rindfleisch, T.; Selzer, R.

    1969-01-01

    Computer program corrects various photometric, geometric and frequency response distortions in pictures. The program converts pictures into an array of elements, with each element's optical density quantized to a numerical value. The translated picture is recorded on magnetic tape in digital form for subsequent processing and enhancement by computer.

  5. Their Portfolios, Our Role: Examining a Community College Teacher Education Digital Portfolio Program from the Students' Perspective

    Science.gov (United States)

    Plaisir, Jean Y.; Hachey, Alyse C.; Theilheimer, Rachel

    2011-01-01

    In the Fall of 2006, our large, urban community college implemented digital portfolio development for all of the preservice early childhood educators registered in the infant-toddler and preschool-early elementary programs. Three years after implementation of the program, we conducted survey research to assess our students' perceptions of their…

  6. Aether: Leveraging Linear Programming For Optimal Cloud Computing In Genomics.

    Science.gov (United States)

    Luber, Jacob M; Tierney, Braden T; Cofer, Evan M; Patel, Chirag J; Kostic, Aleksandar D

    2017-12-08

    Across biology we are seeing rapid developments in scale of data production without a corresponding increase in data analysis capabilities. Here, we present Aether (http://aether.kosticlab.org), an intuitive, easy-to-use, cost-effective, and scalable framework that uses linear programming (LP) to optimally bid on and deploy combinations of underutilized cloud computing resources. Our approach simultaneously minimizes the cost of data analysis and provides an easy transition from users' existing HPC pipelines. Data utilized are available at https://pubs.broadinstitute.org/diabimmune and with EBI SRA accession ERP005989. Source code is available at (https://github.com/kosticlab/aether). Examples, documentation, and a tutorial are available at (http://aether.kosticlab.org). chirag_patel@hms.harvard.edu and aleksandar.kostic@joslin.harvard.edu. Supplementary data are available at Bioinformatics online.
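
    The core idea, choosing a mix of machine types that meets a pipeline's resource needs at minimum cost, is a small linear program. The sketch below uses scipy.optimize.linprog on invented instance types and prices; it illustrates the approach, not Aether's actual formulation (which also handles bidding on underutilized spot resources).

```python
from scipy.optimize import linprog

# Hypothetical instance types: (vCPUs, GiB RAM, $/hour).
instances = {"small": (4, 16, 0.10), "medium": (8, 32, 0.25), "large": (16, 64, 0.38)}
need_cpu, need_mem = 64, 256                  # pipeline requirements

names = list(instances)
cost = [instances[n][2] for n in names]
# linprog minimizes c@x subject to A_ub@x <= b_ub, so the ">= requirement"
# constraints are negated; x >= 0 is the default bound.
A_ub = [[-instances[n][0] for n in names],    # -vCPUs supplied
        [-instances[n][1] for n in names]]    # -RAM supplied
b_ub = [-need_cpu, -need_mem]

res = linprog(cost, A_ub=A_ub, b_ub=b_ub)
plan = dict(zip(names, res.x))                # fractional counts (LP relaxation)
```

    In practice instance counts are integers, so a deployment tool would solve the integer program or round the relaxation; the LP value is a lower bound on cost.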

  7. Interactive, Computer-Based Training Program for Radiological Workers

    Energy Technology Data Exchange (ETDEWEB)

    Trinoskey, P.A.; Camacho, P.I.; Wells, L.

    2000-01-18

    Lawrence Livermore National Laboratory (LLNL) is redesigning its Computer-Based Training (CBT) program for radiological workers. The redesign represents a major effort to produce a single, highly interactive and flexible CBT program that will meet the training needs of a wide range of radiological workers--from researchers and x-ray operators to individuals working in tritium, uranium, plutonium, and accelerator facilities. The new CBT program addresses the broad diversity of backgrounds found at a national laboratory. When a training audience is homogeneous in terms of education level and type of work performed, it is difficult to duplicate the effectiveness of a flexible, technically competent instructor who can tailor a course to the express needs and concerns of a course's participants. Unfortunately, such homogeneity is rare. At LLNL, they have a diverse workforce engaged in a wide range of radiological activities, from the fairly common to the quite exotic. As a result, the Laboratory must offer a wide variety of radiological worker courses. These include a general contamination-control course in addition to radioactive-material-handling courses for both low-level laboratory (i.e., bench-top) activities as well as high-level work in tritium, uranium, and plutonium facilities. They also offer training courses for employees who work with radiation-generating devices--x-ray, accelerator, and E-beam operators, for instance. However, even with the number and variety of courses the Laboratory offers, they are constrained by the diversity of backgrounds (i.e., knowledge and experience) of those to be trained. Moreover, time constraints often preclude in-depth coverage of site- and/or task-specific details. In response to this situation, several years ago LLNL began moving toward computer-based training for radiological workers. Today, that CBT effort includes a general radiological safety course developed by the Department of Energy's Hanford facility and

  8. The Julia programming language: the future of scientific computing

    Science.gov (United States)

    Gibson, John

    2017-11-01

    Julia is an innovative new open-source programming language for high-level, high-performance numerical computing. Julia combines the general-purpose breadth and extensibility of Python, the ease-of-use and numeric focus of Matlab, the speed of C and Fortran, and the metaprogramming power of Lisp. Julia uses type inference and just-in-time compilation to compile high-level user code to machine code on the fly. A rich set of numeric types and extensive numerical libraries are built-in. As a result, Julia is competitive with Matlab for interactive graphical exploration and with C and Fortran for high-performance computing. This talk interactively demonstrates Julia's numerical features and benchmarks Julia against C, C++, Fortran, Matlab, and Python on a spectral time-stepping algorithm for a 1d nonlinear partial differential equation. The Julia code is nearly as compact as Matlab and nearly as fast as Fortran. This material is based upon work supported by the National Science Foundation under Grant No. 1554149.
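
    The benchmark described, spectral time-stepping of a 1-D nonlinear PDE, can be sketched compactly. Since the abstract does not name the equation, the version below (written in Python/NumPy rather than Julia) integrates viscous Burgers' equation u_t + u u_x = ν u_xx with Fourier-spectral derivatives and classical RK4 as a stand-in:

```python
import numpy as np

def step_burgers(u, k, nu, dt):
    """One RK4 step of u_t = -u u_x + nu u_xx, derivatives via FFT."""
    def rhs(v):
        vhat = np.fft.fft(v)
        vx = np.real(np.fft.ifft(1j * k * vhat))
        vxx = np.real(np.fft.ifft(-(k**2) * vhat))
        return -v * vx + nu * vxx
    k1 = rhs(u)
    k2 = rhs(u + 0.5 * dt * k1)
    k3 = rhs(u + 0.5 * dt * k2)
    k4 = rhs(u + dt * k3)
    return u + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

n, L, nu, dt = 64, 2 * np.pi, 0.1, 1e-3
x = np.linspace(0.0, L, n, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)   # integer wavenumbers for L = 2*pi
u = np.sin(x)
for _ in range(1000):                        # integrate to t = 1
    u = step_burgers(u, k, nu, dt)
```

    The per-step cost is dominated by the FFTs, which is what makes this a useful cross-language benchmark: the same few lines translate almost verbatim into Julia, MATLAB, or Fortran.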

  9. Digital Forensics

    OpenAIRE

    Garfinkel, Simson L.

    2013-01-01

    A reprint from American Scientist the magazine of Sigma Xi, The Scientific Research Society Since the 1980s, computers have had increasing roles in all aspects of human life—including an involvement in criminal acts. This development has led to the rise of digital forensics, the uncovering and examination of evidence located on all things electronic with digital storage, including computers, cell phones, and networks. Digital forensics researchers and practitione...

  10. A malaria diagnostic tool based on computer vision screening and visualization of Plasmodium falciparum candidate areas in digitized blood smears.

    Science.gov (United States)

    Linder, Nina; Turkki, Riku; Walliander, Margarita; Mårtensson, Andreas; Diwan, Vinod; Rahtu, Esa; Pietikäinen, Matti; Lundin, Mikael; Lundin, Johan

    2014-01-01

    Microscopy is the gold standard for diagnosis of malaria, however, manual evaluation of blood films is highly dependent on skilled personnel in a time-consuming, error-prone and repetitive process. In this study we propose a method using computer vision detection and visualization of only the diagnostically most relevant sample regions in digitized blood smears. Giemsa-stained thin blood films with P. falciparum ring-stage trophozoites (n = 27) and uninfected controls (n = 20) were digitally scanned with an oil immersion objective (0.1 µm/pixel) to capture approximately 50,000 erythrocytes per sample. Parasite candidate regions were identified based on color and object size, followed by extraction of image features (local binary patterns, local contrast and Scale-invariant feature transform descriptors) used as input to a support vector machine classifier. The classifier was trained on digital slides from ten patients and validated on six samples. The diagnostic accuracy was tested on 31 samples (19 infected and 12 controls). From each digitized area of a blood smear, a panel with the 128 most probable parasite candidate regions was generated. Two expert microscopists were asked to visually inspect the panel on a tablet computer and to judge whether the patient was infected with P. falciparum. The method achieved a diagnostic sensitivity and specificity of 95% and 100% as well as 90% and 100% for the two readers respectively using the diagnostic tool. Parasitemia was separately calculated by the automated system and the correlation coefficient between manual and automated parasitemia counts was 0.97. We developed a decision support system for detecting malaria parasites using a computer vision algorithm combined with visualization of sample areas with the highest probability of malaria infection. The system provides a novel method for blood smear screening with a significantly reduced need for visual examination and has a potential to increase the
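
    The classification stage, image features fed to a support vector machine, can be sketched with a minimal linear SVM trained by sub-gradient descent on the hinge loss. This is a Pegasos-style stand-in for the paper's classifier; the features and labels in the test are synthetic, not LBP/SIFT descriptors from real smears.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, lr=0.05, epochs=100):
    """Minimal hinge-loss linear SVM; labels y must be in {-1, +1}."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) < 1.0:           # margin violated: update
                w = (1.0 - lr * lam) * w + lr * yi * xi
                b += lr * yi
            else:                                 # margin satisfied: only shrink
                w = (1.0 - lr * lam) * w
    return w, b

def predict(X, w, b):
    return np.sign(X @ w + b)
```

    In the study itself, each candidate region contributes a feature vector (local binary patterns, local contrast, SIFT descriptors), and regions the classifier scores as likely parasites are ranked into the 128-region panel shown to the microscopists.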

  11. A malaria diagnostic tool based on computer vision screening and visualization of Plasmodium falciparum candidate areas in digitized blood smears.

    Directory of Open Access Journals (Sweden)

    Nina Linder

    Full Text Available INTRODUCTION: Microscopy is the gold standard for diagnosis of malaria, however, manual evaluation of blood films is highly dependent on skilled personnel in a time-consuming, error-prone and repetitive process. In this study we propose a method using computer vision detection and visualization of only the diagnostically most relevant sample regions in digitized blood smears. METHODS: Giemsa-stained thin blood films with P. falciparum ring-stage trophozoites (n = 27) and uninfected controls (n = 20) were digitally scanned with an oil immersion objective (0.1 µm/pixel) to capture approximately 50,000 erythrocytes per sample. Parasite candidate regions were identified based on color and object size, followed by extraction of image features (local binary patterns, local contrast and Scale-invariant feature transform descriptors) used as input to a support vector machine classifier. The classifier was trained on digital slides from ten patients and validated on six samples. RESULTS: The diagnostic accuracy was tested on 31 samples (19 infected and 12 controls). From each digitized area of a blood smear, a panel with the 128 most probable parasite candidate regions was generated. Two expert microscopists were asked to visually inspect the panel on a tablet computer and to judge whether the patient was infected with P. falciparum. The method achieved a diagnostic sensitivity and specificity of 95% and 100% as well as 90% and 100% for the two readers respectively using the diagnostic tool. Parasitemia was separately calculated by the automated system and the correlation coefficient between manual and automated parasitemia counts was 0.97. CONCLUSION: We developed a decision support system for detecting malaria parasites using a computer vision algorithm combined with visualization of sample areas with the highest probability of malaria infection. The system provides a novel method for blood smear screening with a significantly reduced need for

  12. Is the "Net Generation" Ready for Digital Citizenship? Perspectives from the IEA International Computer and Information Literacy Study 2013. Policy Brief No. 6

    Science.gov (United States)

    Watkins, Ryan; Engel, Laura C.; Hastedt, Dirk

    2015-01-01

    The rise of digital information and communication technologies (ICT) has made the acquisition of computer and information literacy (CIL) a leading factor in creating an engaged, informed, and employable citizenry. However, are young people, often described as "digital natives" or the "net generation," developing the necessary…

  13. Validity, reliability, and reproducibility of linear measurements on digital models obtained from intraoral and cone-beam computed tomography scans of alginate impressions

    NARCIS (Netherlands)

    Wiranto, Matthew G.; Engelbrecht, W. Petrie; Nolthenius, Heleen E. Tutein; van der Meer, W. Joerd; Ren, Yijin

    INTRODUCTION: Digital 3-dimensional models are widely used for orthodontic diagnosis. The aim of this study was to assess the validity, reliability, and reproducibility of digital models obtained from the Lava Chairside Oral scanner (3M ESPE, Seefeld, Germany) and cone-beam computed tomography scans

  14. Does Performance in Digital Reading Relate to Computer Game Playing? A Study of Factor Structure and Gender Patterns in 15-Year-Olds' Reading Literacy Performance

    Science.gov (United States)

    Rasmusson, Maria; Åberg-Bengtsson, Lisbeth

    2015-01-01

    Data from a Swedish PISA-sample were used (1) to identify a digital reading factor, (2) to investigate gender differences in this factor (if found), and (3) to explore how computer game playing might relate to digital reading performance and gender. The analyses were conducted with structural equation modeling techniques. In addition to an overall…

  15. USNA DIGITAL FORENSICS LAB

    Data.gov (United States)

    Federal Laboratory Consortium — To enable Digital Forensics and Computer Security research and educational opportunities across majors and departments. Lab Mission: Establish and maintain a Digital...

  16. A program for reading DNA sequence gels using a small computer equipped with a graphics tablet.

    Science.gov (United States)

    Lautenberger, J A

    1982-01-01

    A program has been written in BASIC that allows DNA sequence gels to be read by a Tektronix model 4052 computer equipped with a graphics tablet. Sequences from each gel are stored on tape for later transfer to a larger computer where they are melded into a complete overall sequence. The program should be adaptable to other small computers. PMID:7063401
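
    Although the original is written in BASIC for a Tektronix 4052, the gel-reading logic itself is simple: each of the four lanes corresponds to one base, and a band's vertical position encodes fragment length, so merging the digitized band coordinates by position yields the sequence. A toy Python illustration (the lane layout and coordinates are invented, not from the paper):

```python
# Toy sequencing-gel reader: `bands` maps each base's lane to the
# digitized y-coordinates of its bands (position encodes fragment length).
def call_sequence(bands):
    calls = [(y, base) for base, ys in bands.items() for y in ys]
    return "".join(base for _, base in sorted(calls))  # merge lanes by position

# Four lanes with invented coordinates:
sequence = call_sequence({"A": [1.0, 5.0], "C": [2.0], "G": [3.0], "T": [4.0]})
```

    The original program's role was to capture those coordinates from the graphics tablet as the operator touched each band, then store the per-gel reads for later merging on a larger computer.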

  17. A Study of the Programming Languages Used in Information Systems and in Computer Science Curricula

    Science.gov (United States)

    Russell, Jack; Russell, Barbara; Pollacia, Lissa F.; Tastle, William J.

    2010-01-01

    This paper researches the computer languages taught in the first, second and third programming courses in Computer Information Systems (CIS), Management Information Systems (MIS or IS) curricula as well as in Computer Science (CS) and Information Technology (IT) curricula. Instructors teaching the first course in programming within a four year…

  18. Application of digital diagnostic impression, virtual planning, and computer-guided implant surgery for a CAD/CAM-fabricated, implant-supported fixed dental prosthesis: a clinical report.

    Science.gov (United States)

    Stapleton, Brandon M; Lin, Wei-Shao; Ntounis, Athanasios; Harris, Bryan T; Morton, Dean

    2014-09-01

    This clinical report demonstrated the use of an implant-supported fixed dental prosthesis fabricated with a contemporary digital approach. The digital diagnostic data acquisition was completed with a digital diagnostic impression with an intraoral scanner and cone-beam computed tomography with a prefabricated universal radiographic template to design a virtual prosthetically driven implant surgical plan. A surgical template fabricated with computer-aided design and computer-aided manufacturing (CAD/CAM) was used to perform computer-guided implant surgery. The definitive digital data were then used to design the definitive CAD/CAM-fabricated fixed dental prosthesis. Copyright © 2014 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  19. TRECII: a computer program for transportation risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Franklin, A.L.

    1980-05-01

    A risk-based fault tree analysis method has been developed at the Pacific Northwest Laboratory (PNL) for analysis of nuclear fuel cycle operations. This methodology was developed for the Department of Energy (DOE) as a risk analysis tool for evaluating high level waste management systems. A computer package consisting of three programs was written at that time to assist in the performance of risk assessment: ACORN (draws fault trees), MFAULT (analyzes fault trees), and RAFT (calculates risk). This methodology evaluates release consequences and estimates the frequency of occurrence of these consequences. This document describes an additional risk calculating code which can be used in conjunction with two of the three codes for transportation risk assessment. TRECII modifies the definition of risk used in RAFT (prob. x release) to accommodate release consequences in terms of fatalities. Throughout this report risk shall be defined as probability times consequences (fatalities are one possible health effect consequence). This methodology has been applied to a variety of energy material transportation systems. Typically the material shipped has been radioactive, although some adaptation to fossil fuels has occurred. The approach is normally applied to truck or train transport systems with some adaptation to pipelines and aircraft. TRECII is designed to be used primarily in conjunction with MFAULT; however, with a moderate amount of effort by the user, it can be implemented independent of the risk analysis package developed at PNL. Code description and user instructions necessary for the implementation of the TRECII program are provided.
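
    The risk definition the report adopts, probability times consequences summed over release scenarios, reduces to a short calculation. The scenario names, probabilities, and fatality estimates below are invented for illustration and are not taken from TRECII:

```python
# TRECII-style risk: probability x consequences (fatalities), summed over
# release scenarios. All values are illustrative, not from the report.
scenarios = [
    # (description, probability per shipment, expected fatalities)
    ("crash with fire",    1.0e-6, 5.0),
    ("crash without fire", 4.0e-6, 0.5),
    ("handling drop",      2.0e-5, 0.01),
]
risk = sum(p * c for _, p, c in scenarios)  # expected fatalities per shipment
```

    In the actual toolchain, the scenario frequencies would come from MFAULT's fault-tree analysis rather than being entered by hand.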

  20. Artifacts found during quality assurance testing of computed radiography and digital radiography detectors

    National Research Council Canada - National Science Library

    Honey, Ian D; Mackenzie, Alistair

    2009-01-01

    ...) and integrated digital radiographic X-ray imaging detectors are presented. The images presented are all either flat field or test object images and show artifacts previously either undescribed in the existing literature or meriting further comment...

  1. Digital manufacturing, Industry 4.0, cloud computing and the Internet of Things: Brazilian contextualization and reality

    OpenAIRE

    Wagner Cardoso; Walther Azzolini Junior; Jéssica Fernanda Bertosse; Edson Bassi; Emanuel Soares Ponciano

    2017-01-01

    The digital era represents significant changes in the design of IT projects with an emphasis on digital infrastructure, especially in terms of investment and professional qualification, which requires, in Brazil, the creation of specific lines of financing by government development agencies. The creation of demonstration platforms could be an effective initiative to stimulate the dissemination of the concept and the establishment of partnerships between customers and suppliers of new technolo...

  2. Computer-aided design and manufacture of hyrax devices: Can we really go digital?

    DEFF Research Database (Denmark)

    Graf, Simon; Cornelis, Marie; Gameiro, Gustavo Hauber

    2017-01-01

    Highlights - New possibilities of CAD/CAM technologies in orthodontics are illustrated. - A hyrax appliance can be produced with a full digital workflow. - Finite element analysis showed that the design and material delivered the needed strength.

  3. Digital Technology: the Effect of Connected World to Computer Ethic and Family

    OpenAIRE

    Benfano Soewito; Sani Muhamad Isa

    2015-01-01

    The development of digital technology such as smartphones, tablets and other gadgets has grown very rapidly in the last decade, as has the development of mobile applications for those mobile systems and smartphones. Unfortunately, those applications often do not specify an age range for their users. This is a real problem in the world of digital technology and software development: it is not yet known whether an application is suitable for children or not. Nowadays, parents are faced wi...

  4. A Hybrid Soft-computing Method for Image Analysis of Digital Plantar Scanners

    OpenAIRE

    Razjouyan, Javad; Khayat, Omid; Siahi, Mehdi; Mansouri, Ali Alizadeh

    2013-01-01

    Digital foot scanners have been developed in recent years to provide anthropometrists with a digital image of the insole showing pressure distribution and anthropometric information. In this paper, a hybrid algorithm combining a gray level spatial correlation (GLSC) histogram and Shanbhag entropy is presented for the analysis of scanned foot images. An evolutionary algorithm is also employed to find the optimum parameters of the GLSC and the transform function of the membership values. Resulting binary images as the thres...
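The paper's exact GLSC-plus-Shanbhag-entropy pipeline is not reproduced in this record. As an illustrative stand-in, the closely related Kapur entropy-thresholding idea — pick the gray level that maximizes the summed Shannon entropies of the background and foreground histograms — can be sketched in a few lines (the `kapur_threshold` helper is hypothetical, not code from the paper):

```python
import numpy as np

def kapur_threshold(hist):
    """Return the gray level that maximizes the sum of background and
    foreground Shannon entropies (Kapur's entropy thresholding)."""
    p = hist.astype(float) / hist.sum()
    c = np.cumsum(p)
    best_t, best_h = 0, -np.inf
    for t in range(1, len(p) - 1):
        w0, w1 = c[t], 1.0 - c[t]
        if w0 <= 0 or w1 <= 0:
            continue  # one class would be empty at this split
        p0 = p[: t + 1] / w0          # normalized background histogram
        p1 = p[t + 1 :] / w1          # normalized foreground histogram
        h0 = -np.sum(p0[p0 > 0] * np.log(p0[p0 > 0]))
        h1 = -np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return best_t
```

On a bimodal histogram the maximizing threshold falls between the two modes, which is what turns a grayscale scan into the binary foot/background image the abstract mentions.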

  5. Identification of the Procedural Accidents During Root Canal Preparation Using Digital Intraoral Radiography and Cone Beam Computed Tomography

    Directory of Open Access Journals (Sweden)

    Csinszka K.-Ivácson A.-

    2016-09-01

    Crown or root perforation, ledge formation, fractured instruments and perforation of the roots are the most important accidents that occur during endodontic therapy. Our objective was to evaluate the value of digital intraoral periapical radiographs compared to cone beam computed tomography (CBCT) images used to diagnose some procedural accidents. Material and methods: Eleven extracted molars were used in this study. A total of 18 perforations and 13 ledges were created artificially, and 10 instruments were fractured in the root canals. Digital intraoral periapical radiographs from two angles and CBCT scans were made with the teeth fixed in position. The images were evaluated and the number of detected accidents was stated in percentages. Statistical analysis was performed using the chi-square test. Results: On digital periapical radiographs the evaluators identified 12 (66.66%) perforations, 10 (100%) separated instruments and 10 (76.9%) created ledges. The CBCT scans made possible the recognition of 17 (94.66%) perforations, 9 (90%) separated instruments and 13 (100%) ledges. The total number of recognized procedural accidents showed significant differences between the two groups (p<0.05). Conclusion: Digital periapical radiographs are the most common imaging modalities used during endodontic treatments. However, CBCT allows a better identification of procedural accidents.

  6. Execution Models for Mapping Programs onto Distributed Memory Parallel Computers

    Science.gov (United States)

    1992-03-01

    Execution models for mapping programs onto distributed memory parallel computers. Alan Sussman, Contract No. NAS1-18605, March 1992, Institute for Computer Applications in Science and Engineering, NASA Langley Research Center, Hampton, VA 23665. A related PhD thesis (Carnegie Mellon University, September 1991) is also available as a technical report.

  7. Multithreaded transactions in scientific computing: New versions of a computer program for kinematical calculations of RHEED intensity oscillations

    Science.gov (United States)

    Brzuszek, Marcin; Daniluk, Andrzej

    2006-11-01

    Writing a concurrent program can be more difficult than writing a sequential program. The programmer needs to think about synchronisation, race conditions and shared variables. Transactions help reduce the inconvenience of using threads. A transaction is an abstraction which allows programmers to group a sequence of actions on the program into a logical, higher-level computation unit. This paper presents multithreaded versions of the GROWTH program, which allow the layer coverages during the growth of thin epitaxial films, and the corresponding RHEED intensities, to be calculated according to the kinematical approximation. The presented programs also contain graphical user interfaces, which enable displaying program data at run-time. New version program summary: Titles of programs: GROWTHGr, GROWTH06. Catalogue identifier: ADVL_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADVL_v2_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Catalogue identifier of previous version: ADVL. Does the new version supersede the original program: No. Computer for which the new version is designed and others on which it has been tested: Pentium-based PC. Operating systems or monitors under which the new version has been tested: Windows 9x, XP, NT. Programming language used: Object Pascal. Memory required to execute with typical data: More than 1 MB. Number of bits in a word: 64 bits. Number of processors used: 1. No. of lines in distributed program, including test data, etc.: 20 931. Number of bytes in distributed program, including test data, etc.: 1 311 268. Distribution format: tar.gz. Nature of physical problem: The programs compute the RHEED intensities during the growth of thin epitaxial structures prepared using molecular beam epitaxy (MBE). The computations are based on the use of kinematical diffraction theory [P.I. Cohen, G.S. Petrich, P.R. Pukite, G.J. Whaley, A.S. Arrott, Surf. Sci. 216 (1989) 222. [1

  8. Computer-implemented land use classification with pattern recognition software and ERTS digital data. [Mississippi coastal plains

    Science.gov (United States)

    Joyce, A. T.

    1974-01-01

    Significant progress has been made in the classification of surface conditions (land uses) with computer-implemented techniques based on the use of ERTS digital data and pattern recognition software. The supervised technique presently used at the NASA Earth Resources Laboratory is based on maximum likelihood ratioing with a digital table look-up approach to classification. After classification, colors are assigned to the various surface conditions (land uses) classified, and the color-coded classification is film recorded on either positive or negative 9 1/2 in. film at the scale desired. Prints of the film strips are then mosaicked and photographed to produce a land use map in the format desired. Computer extraction of statistical information is performed to show the extent of each surface condition (land use) within any given land unit that can be identified in the image. Evaluations of the product indicate that classification accuracy is well within the limits for use by land resource managers and administrators. Classifications performed with digital data acquired during different seasons indicate that the combination of two or more classifications offers even better accuracy.
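The maximum-likelihood ratioing mentioned above can be illustrated with a minimal per-pixel Gaussian classifier: each pixel's band vector is assigned to the class whose multivariate normal model gives the highest log-likelihood. This is a generic sketch under stated assumptions, not the NASA Earth Resources Laboratory code; `ml_classify` is a hypothetical helper name:

```python
import numpy as np

def ml_classify(pixels, means, covs):
    """Assign each pixel (row = vector of band values) to the class whose
    multivariate Gaussian yields the highest log-likelihood."""
    scores = []
    for mu, cov in zip(means, covs):
        inv = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        d = pixels - mu
        # log-likelihood up to an additive constant shared by all classes
        ll = -0.5 * (logdet + np.einsum("ij,jk,ik->i", d, inv, d))
        scores.append(ll)
    return np.argmax(np.stack(scores), axis=0)
```

The table look-up approach in the abstract amounts to precomputing these class decisions for quantized band values, so each pixel is classified by a single indexed read instead of a likelihood evaluation.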

  9. Design of the Digital Sky Survey DA and online system---A case history in the use of computer aided tools for data acquisition system design

    Energy Technology Data Exchange (ETDEWEB)

    Petravick, D.; Berman, E.; Nicinski, T.; Rechenmacher, R.; Oleynik, G.; Pordes, R.; Stoughton, C.

    1991-06-01

    As part of its expanding astrophysics program, Fermilab is participating in the Digital Sky Survey (DSS). Fermilab is part of a collaboration involving the University of Chicago, Princeton University, and the Institute for Advanced Study (at Princeton). The main DSS results will be a photometric imaging survey and a redshift survey of galaxies and color-selected quasars over π steradians of the Northern Galactic Cap. This paper focuses on our use of Computer Aided Software Engineering (CASE) in specifying the data system for DSS. Extensions to "standard" methodologies were necessary to compensate for tool shortcomings and to improve communication amongst the collaboration members. One such important extension was the incorporation of CASE information into the specification document. 7 refs.

  10. Middle School Teachers' Perceptions of Computer-Assisted Reading Intervention Programs

    Science.gov (United States)

    Bippert, Kelli; Harmon, Janis

    2017-01-01

    Middle schools often turn to computer-assisted reading intervention programs to improve student reading. The questions guiding this study are (a) in what ways are computer-assisted reading intervention programs utilized, and (b) what are teachers' perceptions about these intervention programs? Nineteen secondary reading teachers were interviewed…

  11. Design and Curriculum Considerations for a Computer Graphics Program in the Arts.

    Science.gov (United States)

    Leeman, Ruedy W.

    This history and state-of-the-art review of computer graphics describes computer graphics programs and proposed programs at Sheridan College (Canada), the Rhode Island School of Design, the University of Oregon, Northern Illinois University, and Ohio State University. These programs are discussed in terms of their philosophy, curriculum, student…

  12. Case Study: Creation of a Degree Program in Computer Security. White Paper.

    Science.gov (United States)

    Belon, Barbara; Wright, Marie

    This paper reports on research into the field of computer security, and undergraduate degrees offered in that field. Research described in the paper reveals only one computer security program at the associate's degree level in the entire country. That program, at Texas State Technical College in Waco, is a 71-credit-hour program leading to an…

  13. The DOE Program in HPCC: High-Performance Computing and Communications.

    Science.gov (United States)

    Department of Energy, Washington, DC. Office of Energy Research.

    This document reports to Congress on the progress that the Department of Energy has made in 1992 toward achieving the goals of the High Performance Computing and Communications (HPCC) program. Its second purpose is to provide a picture of the many programs administered by the Office of Scientific Computing under the auspices of the HPCC program.…

  14. Children Learning Computer Programming: Experiments with Languages, Curricula and Programmable Devices. Technical Report No. 250.

    Science.gov (United States)

    Weyer, S. A.; Cannara, A. B.

    An experiment was conducted to study how children, aged 10-15 years, learn concepts relevant to computer programing and how they learn modern programing languages. The implicit educational goal was to teach thinking strategies through the medium of programing concepts and their applications. The computer languages Simper and Logo were chosen…

  15. Computer-aided detection of clustered microcalcifications in digital breast tomosynthesis: a 3D approach.

    Science.gov (United States)

    Sahiner, Berkman; Chan, Heang-Ping; Hadjiiski, Lubomir M; Helvie, Mark A; Wei, Jun; Zhou, Chuan; Lu, Yao

    2012-01-01

    To design a computer-aided detection (CADe) system for clustered microcalcifications in reconstructed digital breast tomosynthesis (DBT) volumes and to perform a preliminary evaluation of the CADe system. IRB approval and informed consent were obtained in this study. A data set of two-view DBT of 72 breasts containing microcalcification clusters was collected from 72 subjects who were scheduled to undergo breast biopsy. Based on tissue sampling results, 17 cases had breast cancer and 55 were benign. A separate data set of two-view DBT of 38 breasts free of clustered microcalcifications from 38 subjects was collected to independently estimate the number of false-positives (FPs) generated by the CADe system. A radiologist experienced in breast imaging marked the biopsied cluster of microcalcifications with a 3D bounding box using all available clinical and imaging information. A CADe system was designed to detect microcalcification clusters in the reconstructed volume. The system consisted of prescreening, clustering, and false-positive reduction stages. In the prescreening stage, the conspicuity of microcalcification-like objects was increased by an enhancement-modulated 3D calcification response function. An iterative thresholding and 3D object growing method was used to detect cluster seed objects, which were used as potential centers of microcalcification clusters. In the cluster detection stage, microcalcification candidates were identified using a second iterative thresholding procedure, which was applied to the signal-to-noise ratio (SNR) enhanced image voxels with a positive calcification response. Starting with each cluster seed object as the initial cluster center, a dynamic clustering algorithm formed a cluster candidate by including microcalcification candidates within a 3D neighborhood of the cluster seed object that satisfied the clustering criteria. 
The number, size, and SNR of the microcalcifications in a cluster candidate and the cluster shape were
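The seed-centered dynamic clustering stage described above can be approximated, in a much-simplified form, by greedily absorbing candidate objects that fall within a fixed radius of a running cluster centroid. The sketch below is an illustration only (the real system applies SNR-based and shape criteria in reconstructed DBT volumes); `grow_clusters` and its fixed-radius criterion are assumptions, not the authors' algorithm:

```python
import numpy as np

def grow_clusters(candidates, seeds, radius):
    """Greedy stand-in for dynamic clustering: starting from each seed,
    absorb candidate points within `radius` of the current cluster
    centroid, updating the centroid after every addition."""
    clusters = []
    cand = [np.asarray(c, dtype=float) for c in candidates]
    for s in seeds:
        center = np.asarray(s, dtype=float)
        members = []
        changed = True
        while changed:
            changed = False
            for c in cand:
                if any(c is m for m in members):
                    continue  # already absorbed into this cluster
                if np.linalg.norm(c - center) <= radius:
                    members.append(c)
                    center = np.mean(members, axis=0)  # re-center
                    changed = True
        clusters.append(members)
    return clusters
```

Updating the centroid as members join is what makes the clustering "dynamic": a cluster can drift toward a dense group of microcalcification candidates rather than staying anchored at the initial seed position.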

  16. Detection of various anatomic patterns of root canals in mandibular incisors using digital periapical radiography, 3 cone-beam computed tomographic scanners, and micro-computed tomographic imaging.

    Science.gov (United States)

    Paes da Silva Ramos Fernandes, Luciana Maria; Rice, Dwight; Ordinola-Zapata, Ronald; Alvares Capelozza, Ana Lucia; Bramante, Clovis Monteiro; Jaramillo, David; Christensen, Heidi

    2014-01-01

    The purpose of this study was to compare the accuracy of digital periapical (PA) radiography and 3 cone-beam computed tomographic (CBCT) scanners in the identification of various internal anatomic patterns in mandibular incisors. Forty mandibular incisors were scanned using micro-computed tomographic imaging as the gold standard to establish the internal anatomic pattern. The number of root canals and internal patterns were classified into type I (single canal, n = 12), type Ia (single oval canal, n = 12), and type III (2 canals, n = 16). The teeth were placed in a human mandible, and digital PA radiography and 3 CBCT scans (Kodak 9000 3D [Carestream Health, Rochester, NY], Veraviewepocs 3De [J Morita MFG Corp, Kyoto, Japan], NewTom 5G [QR Srl, Verona, Italy]) were performed. Two blinded examiners classified each tooth's anatomic pattern, which were then compared with the micro-computed tomographic determinations. Considering type I and type Ia, which both presented with 1 root canal, there was a high degree of accuracy for all methods used (P > .05). The same result was found for type III. When identifying the shape of single canals (type I), CBCT imaging was more accurate compared with PA radiography. Concerning oval canals (type Ia), there was a significant difference between PA radiography and NewTom CBCT (PA radiography = 44%, NewTom = 88%). However, there were no significant differences between the 3 CBCT units. Double-exposure digital PA radiography for mandibular incisors is sufficient for the identification of the number of root canals. All CBCT devices showed improved accuracy in the identification of single root canal anatomy when a narrow canal was present. However, the identification of oval canals was improved only with the NewTom CBCT device. Copyright © 2014 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  17. Standard practice for digital imaging and communication in nondestructive evaluation (DICONDE) for X-ray computed tomography (CT) test methods

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice facilitates the interoperability of X-ray computed tomography (CT) imaging equipment by specifying image data transfer and archival storage methods in commonly accepted terms. This document is intended to be used in conjunction with Practice E2339 on Digital Imaging and Communication in Nondestructive Evaluation (DICONDE). Practice E2339 defines an industrial adaptation of the NEMA Standards Publication titled Digital Imaging and Communications in Medicine (DICOM, see http://medical.nema.org), an international standard for image data acquisition, review, storage and archival storage. The goal of Practice E2339, commonly referred to as DICONDE, is to provide a standard that facilitates the display and analysis of NDE test results on any system conforming to the DICONDE standard. Toward that end, Practice E2339 provides a data dictionary and a set of information modules that are applicable to all NDE modalities. This practice supplements Practice E2339 by providing information object definitio...

  18. Uses of Computed Tomography in the NASA Materials Science Program

    Science.gov (United States)

    Engel, H. Peter; Gillies, Donald C.; Curreri, Peter (Technical Monitor)

    2002-01-01

    Computed Tomography (CT) has proved to be of inestimable use in providing a rapid evaluation of a variety of samples from Mechanics of Granular Materials (MGM) to electronic materials (Ge-Si alloys) to space grown materials such as meteorites. The system at Kennedy Space Center (KSC), because of its convenient geographical location, is ideal for examining samples immediately after returning to Earth. It also has the advantage of the choice of fluxes, and in particular the use of a radioactive cobalt source, which is basically monochromatic. This permits a reasonable measurement of density to be made from which chemical composition can be determined. Due to the current dearth of long duration space grown materials, the CT instrument has been used to characterize materials in preparation for flight, to determine thermal expansion values, and to examine long duration space grown materials, i.e. meteorites. The work will first describe the establishment of the protocol for obtaining the optimum density readings for any material. This will include both the effects of the hardware or instrumental parameters that can be controlled, and the techniques used to process the CT data. Examples will be given of the compositional variation along single crystals of germanium-silicon alloys. Density variation with temperature has been measured in preparation for future materials science experiments; this involved the fabrication and installation of a single zone furnace incorporating a heat pipe to ensure high temperature uniformity. At the time of writing, the thermal expansion of lead has been measured from room temperature to 900 C. Three methods are available. Digital radiography enables length changes to be determined. Prior to melting, the sample is smaller than the container and the diameter change can be measured. Most critical, however, is the density change in the solid, through the melting region, and in the liquid state.
These data are needed for engineering purposes to aid

  19. Debugging the Program. Computer Equity Strategies for the Classroom Teacher.

    Science.gov (United States)

    Wolfe, Leslie R.; And Others

    Designed to provide classroom teachers with activities to enhance computer equity for female students, this kit is divided into four sections which present excerpts from four other publications: (1) "The Neuter Computer: Computers for Boys and Girls" (Jo Schuchat Sanders and Antonia Stone for the Computer Equity Training Project, Women's…

  20. A Financial Technology Entrepreneurship Program for Computer Science Students

    Science.gov (United States)

    Lawler, James P.; Joseph, Anthony

    2011-01-01

    Education in entrepreneurship is becoming a critical area of curricula for computer science students. Few schools of computer science have a concentration in entrepreneurship in the computing curricula. The paper presents Technology Entrepreneurship in the curricula at a leading school of computer science and information systems, in which students…

  1. A new kind of science - or a not-so-new kind of computer program?

    CERN Multimedia

    Naiditch, D

    2003-01-01

    Critical discussion of Stephen Wolfram's theory that "scientists give up their unwieldy equations and instead employ the types of computational rules used in cellular automata (CA) and related computer programs" (2 pages).
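The "computational rules used in cellular automata" that the review refers to are simple lookup tables: each cell's next state is read off from the bits of a rule number, indexed by its 3-cell neighborhood. An elementary CA step (here Wolfram's Rule 30, with wraparound) can be written in a few lines; this is an illustrative sketch, not code from the article:

```python
def ca_step(cells, rule=30):
    """One step of an elementary cellular automaton. Each cell's next
    state is the bit of `rule` selected by the 3-cell neighborhood
    (left, self, right), with periodic boundary conditions."""
    n = len(cells)
    return [
        (rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]
```

Iterating `ca_step` from a single live cell produces the familiar Rule 30 triangle whose apparent randomness motivates Wolfram's claims.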

  2. Computation of vibration mode elastic-rigid and effective weight coefficients from finite-element computer program output

    Science.gov (United States)

    Levy, R.

    1991-01-01

    Post-processing algorithms are given to compute the vibratory elastic-rigid coupling matrices and the modal contributions to the rigid-body mass matrices and to the effective modal inertias and masses. Recomputation of the elastic-rigid coupling matrices for a change in origin is also described. A computational example is included. The algorithms can all be executed by using standard finite-element program eigenvalue analysis output with no changes to existing code or source programs.
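The post-processing described above reduces to a few matrix products on standard eigenvalue-analysis output. A minimal sketch, assuming mode shapes `Phi` (one column per mode), mass matrix `M`, and a rigid-body displacement vector `r`; the function name `effective_modal_masses` is hypothetical, but the formulas (generalized mass, elastic-rigid coupling term, participation factor, effective mass) are the standard ones:

```python
import numpy as np

def effective_modal_masses(M, Phi, r):
    """Participation factors and effective modal masses from
    finite-element eigenvalue output. Over a complete mode set the
    effective masses sum to the total mass r.T @ M @ r."""
    gen = np.einsum("ik,ij,jk->k", Phi, M, Phi)  # generalized masses: diag(Phi.T M Phi)
    L = Phi.T @ M @ r                            # elastic-rigid coupling terms
    gamma = L / gen                              # modal participation factors
    return gamma, L**2 / gen                     # effective modal masses
```

Because only `M`, `Phi`, and `r` are needed, this can be run on existing eigenvalue-analysis output with no change to the finite-element source program, which is the point the abstract makes.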

  3. Possible means of legal protection for computer programs and perspectives of future development

    OpenAIRE

    Toufar, Pavel

    2008-01-01

    This thesis focuses on the possible legal protection of a computer program, as well as on the legal nature of a computer program as an intangible asset. Both copyright protection (the standard and worldwide-accepted means of protection) and the other possibilities, i.e. patent protection and protection based on provisions regulating unfair competition, are discussed. Each means of protection is assessed based on its usability in relation to the computer program, taking the overall impac...

  4. Playable Serious Games for Studying and Programming Computational STEM and Informatics Applications of Distributed and Parallel Computer Architectures

    Science.gov (United States)

    Amenyo, John-Thones

    2012-01-01

    Carefully engineered playable games can serve as vehicles for students and practitioners to learn and explore the programming of advanced computer architectures to execute applications, such as high performance computing (HPC) and complex, inter-networked, distributed systems. The article presents families of playable games that are grounded in…

  5. Proceedings of seventh symposium on sharing of computer programs and technology in nuclear medicine, computer assisted data processing

    Energy Technology Data Exchange (ETDEWEB)

    Howard, B.Y.; McClain, W.J.; Landay, M. (comps.)

    1977-01-01

    The Council on Computers (CC) of the Society of Nuclear Medicine (SNM) annually publishes the Proceedings of its Symposium on the Sharing of Computer Programs and Technology in Nuclear Medicine. This is the seventh such volume and has been organized by topic, with the exception of the invited papers and the discussion following them. An index arranged by author and by subject is included.

  6. Understanding digital storytelling: individual ‘voice’ and community-building in youth media programs

    Directory of Open Access Journals (Sweden)

    Dr Craig Campbell

    2010-06-01

    Digital storytelling (DST) has been widely used as a means of empowerment for marginalised voices across community-based projects worldwide. This paper discusses uses but also limitations of the practice in the context of a Melbourne-based youth media program for ‘youth at risk’ called YouthWorx. Based on our ongoing, long-term ethnographic research, we explore the cultural production of digital stories as a co-creative process that exposes a range of controversies to do with the politics of ‘voice’, genre’s communicative potential and ethical considerations. Concrete examples from YouthWorx’s pedagogical work serve to illustrate the values of self-expression (‘voice’), critical reflection and collaboration that form part of broader social transformations generated by these creative practices. The critique of DST practice offered here connects with existing studies concerned with the socially contextualised processes of media education, and the theoretical shift beyond ‘the right to speak’ towards ‘the right to be understood’ (Husband, 2009). The paper recommends more analytical attention be paid to the dynamic social process of learning (of media, interpersonal competencies and community-building) extending beyond the immediate DST situation, rather than narrowing the focus to end-result atomised media products.

  7. Understanding digital storytelling: individual ‘voice’ and community-building in youth media programs

    Directory of Open Access Journals (Sweden)

    Aneta Podkalicka

    2010-11-01

    Digital storytelling (DST) has been widely used as a means of empowerment for marginalised voices across community-based projects worldwide. This paper discusses uses but also limitations of the practice in the context of a Melbourne-based youth media program for ‘youth at risk’ called YouthWorx. Based on our ongoing, long-term ethnographic research, we explore the cultural production of digital stories as a co-creative process that exposes a range of controversies to do with the politics of ‘voice’, genre’s communicative potential and ethical considerations. Concrete examples from YouthWorx’s pedagogical work serve to illustrate the values of self-expression (‘voice’), critical reflection and collaboration that form part of broader social transformations generated by these creative practices. The critique of DST practice offered here connects with existing studies concerned with the socially contextualised processes of media education, and the theoretical shift beyond ‘the right to speak’ towards ‘the right to be understood’ (Husband, 2009). The paper recommends more analytical attention be paid to the dynamic social process of learning (of media, interpersonal competencies and community-building) extending beyond the immediate DST situation, rather than narrowing the focus to end-result atomised media products.

  8. When technology became language: the origins of the linguistic conception of computer programming, 1950-1960.

    Science.gov (United States)

    Nofre, David; Priestley, Mark; Alberts, Gerard

    2014-01-01

    Language is one of the central metaphors around which the discipline of computer science has been built. The language metaphor entered modern computing as part of a cybernetic discourse, but during the second half of the 1950s acquired a more abstract meaning, closely related to the formal languages of logic and linguistics. The article argues that this transformation was related to the appearance of the commercial computer in the mid-1950s. Managers of computing installations and specialists on computer programming in academic computer centers, confronted with an increasing variety of machines, called for the creation of "common" or "universal languages" to enable the migration of computer code from machine to machine. Finally, the article shows how the idea of a universal language was a decisive step in the emergence of programming languages, in the recognition of computer programming as a proper field of knowledge, and eventually in the way we think of the computer.

  9. Thinking processes used by high-performing students in a computer programming task

    Directory of Open Access Journals (Sweden)

    Marietjie Havenga

    2011-07-01

    Computer programmers must be able to understand programming source code and write programs that execute complex tasks to solve real-world problems. This article is a transdisciplinary study at the intersection of computer programming, education and psychology. It outlines the role of mental processes in the process of programming and indicates how successful thinking processes can support computer science students in writing correct and well-defined programs. A mixed methods approach was used to better understand the thinking activities and programming processes of participating students. Data collection involved both computer programs and students’ reflective thinking processes recorded in their journals. This enabled analysis of psychological dimensions of participants’ thinking processes and their problem-solving activities as they considered a programming problem. Findings indicate that the cognitive, reflective and psychological processes used by high-performing programmers contributed to their success in solving a complex programming problem. Based on the thinking processes of high performers, we propose a model of integrated thinking processes, which can support computer programming students. Keywords: computer programming, education, mixed methods research, thinking processes. Disciplines: computer programming, education, psychology.

  10. AutoStitcher: An Automated Program for Efficient and Robust Reconstruction of Digitized Whole Histological Sections from Tissue Fragments

    Science.gov (United States)

    Penzias, Gregory; Janowczyk, Andrew; Singanamalli, Asha; Rusu, Mirabela; Shih, Natalie; Feldman, Michael; Stricker, Phillip D.; Delprado, Warick; Tiwari, Sarita; Böhm, Maret; Haynes, Anne-Maree; Ponsky, Lee; Viswanath, Satish; Madabhushi, Anant

    2016-07-01

    In applications involving large tissue specimens that have been sectioned into smaller tissue fragments, manual reconstruction of a “pseudo whole-mount” histological section (PWMHS) can facilitate (a) pathological disease annotation, and (b) image registration and correlation with radiological images. We have previously presented a program called HistoStitcher, which allows for more efficient manual reconstruction than general purpose image editing tools (such as Photoshop). However HistoStitcher is still manual and hence can be laborious and subjective, especially when doing large cohort studies. In this work we present AutoStitcher, a novel automated algorithm for reconstructing PWMHSs from digitized tissue fragments. AutoStitcher reconstructs (“stitches”) a PWMHS from a set of 4 fragments by optimizing a novel cost function that is domain-inspired to ensure (i) alignment of similar tissue regions, and (ii) contiguity of the prostate boundary. The algorithm achieves computational efficiency by performing reconstruction in a multi-resolution hierarchy. Automated PWMHS reconstruction results (via AutoStitcher) were quantitatively and qualitatively compared to manual reconstructions obtained via HistoStitcher for 113 prostate pathology sections. Distances between corresponding fiducials placed on each of the automated and manual reconstruction results were between 2.7%-3.2%, reflecting their excellent visual similarity.

  11. Embedded protostars in the dust, ice, and gas in time (DIGIT) Herschel key program

    DEFF Research Database (Denmark)

    Green, Joel D.; Evans II, Neal J.; Jørgensen, Jes Kristian

    2013-01-01

    We present 50-210 um spectral scans of 30 Class 0/I protostellar sources, obtained with Herschel-PACS, and 0.5-1000 um SEDs, as part of the Dust, Ice, and Gas in Time (DIGIT) Key Program. Some sources exhibit up to 75 H2O lines ranging in excitation energy from 100-2000 K, 12 transitions of OH......, but not with the time-averaged outflow rate derived from low-J CO maps. [C II] emission is in general not local to the source. The sample Lbol increased by 1.25 (1.06) and Tbol decreased to 0.96 (0.96) of mean (median) values with the inclusion of the Herschel data. Most CO rotational diagrams are characterized by two...

  12. A socioeconomic related 'digital divide' exists in how, not if, young people use computers

    National Research Council Canada - National Science Library

    Courtenay Harris; Leon Straker; Clare Pollock

    2017-01-01

    .... All participants had computer access at school and 98.9% at home. Neighbourhood SES was related to computer use, IT activities, playing musical instruments, and participating in vigorous physical activity...

  13. Accuracy of Digital Radiography and Cone Beam Computed Tomography on Periapical Radiolucency Detection in Endodontically Treated Teeth

    Directory of Open Access Journals (Sweden)

    Tadas Venskutonis

    2014-07-01

    Objectives: The aim of the present study was to compare the accuracy of intraoral digital periapical radiography and cone beam computed tomography in the detection of periapical radiolucencies in endodontically treated teeth. Material and Methods: Radiographic images (cone beam computed tomography [CBCT] scans and digital periapical radiography [PR] images) from 60 patients, acquired from September 2008 to July 2013, were retrieved from databases of the Department of Oral Diseases, Lithuanian University of Health Sciences. Twenty patients met the inclusion criteria and were selected for further evaluation. Results: In 20 patients (42.4 [SD 12.1] years; 65% men and 35% women), a total of 35 endodontically treated teeth (1.75 [SD 0.91]; 27 in the maxilla and 8 in the mandible) were evaluated. Overall, a statistically significant difference was observed between the number of periapical lesions detected in the CBCT (n = 42) and radiographic (n = 24) examinations (P < 0.05). In molar teeth, CBCT identified a significantly higher number of periapical lesions than the radiographic method (P < 0.05). There were significant differences between CBCT and PR in the mean number of lesions identified per tooth (1.2 vs 0.66, P = 0.03), the number of teeth with lesions (0.71 vs 0.46, P = 0.03), and the number of lesions identified per canal (0.57 vs 0.33, P = 0.005). Taking CBCT as the “gold standard” in lesion detection, with its sensitivity, specificity, and accuracy scored as 1, the corresponding values for PR were 0.57, 1, and 0.76, respectively. Conclusions: Within the limitations of the present study, it can be concluded that cone beam computed tomography scans were more accurate than digital periapical radiographs for detecting periapical radiolucencies in endodontically treated teeth. The difference was more pronounced in molar teeth.
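
    The sensitivity, specificity, and accuracy figures reported for PR against the CBCT “gold standard” follow from the standard confusion-matrix definitions, sketched below. The function `diagnostic_metrics` and the example data are illustrative, not the study's data.

```python
def diagnostic_metrics(reference, test):
    """Per-site sensitivity, specificity, and accuracy of a test modality
    against a reference ("gold standard") modality. Both inputs are
    sequences of booleans: lesion present (True) or absent (False)."""
    pairs = list(zip(reference, test))
    tp = sum(1 for r, t in pairs if r and t)          # true positives
    tn = sum(1 for r, t in pairs if not r and not t)  # true negatives
    fp = sum(1 for r, t in pairs if not r and t)      # false positives
    fn = sum(1 for r, t in pairs if r and not t)      # false negatives
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    accuracy = (tp + tn) / len(pairs)
    return sensitivity, specificity, accuracy

# Hypothetical example: 10 sites, 7 lesions on the reference modality,
# of which the test modality detects 4, with no false positives.
reference = [True] * 7 + [False] * 3
test = [True] * 4 + [False] * 6
sens, spec, acc = diagnostic_metrics(reference, test)
```

    With this toy data the test modality misses lesions but never invents them, giving a pattern like the one reported: moderate sensitivity, perfect specificity.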

  14. The use of linear programming techniques to design optimal digital filters for pulse shaping and channel equalization

    Science.gov (United States)

    Houts, R. C.; Burlage, D. W.

    1972-01-01

    A time domain technique is developed to design finite-duration impulse response digital filters using linear programming. Two related applications of this technique in data transmission systems are considered. The first is the design of pulse shaping digital filters to generate or detect signaling waveforms transmitted over bandlimited channels that are assumed to have ideal low pass or bandpass characteristics. The second is the design of digital filters to be used as preset equalizers in cascade with channels that have known impulse response characteristics. Example designs are presented which illustrate that excellent waveforms can be generated with frequency-sampling filters and the ease with which digital transversal filters can be designed for preset equalization.
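
    The minimax criterion behind such time-domain designs maps naturally onto a linear program: minimize the worst-case deviation delta of a linear-phase filter's amplitude response from an ideal low-pass target over a frequency grid. The sketch below is a generic formulation, not the authors' method; `minimax_lowpass` and its parameters are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

def minimax_lowpass(M, wp, ws, grid=200):
    """Design a length-(2M+1) symmetric (linear-phase) FIR lowpass filter by
    linear programming: minimize the worst-case deviation delta from the ideal
    response (1 in the passband [0, wp], 0 in the stopband [ws, pi]).
    LP variables are [c0..cM, delta], with A(w) = c0 + 2*sum c_n cos(n w)."""
    w = np.concatenate([np.linspace(0, wp, grid), np.linspace(ws, np.pi, grid)])
    d = np.concatenate([np.ones(grid), np.zeros(grid)])
    n = np.arange(M + 1)
    T = np.cos(np.outer(w, n))          # cosine basis evaluated on the grid
    T[:, 1:] *= 2.0
    # Constraints:  T c - d <= delta   and   -(T c - d) <= delta
    A_ub = np.vstack([np.hstack([T, -np.ones((len(w), 1))]),
                      np.hstack([-T, -np.ones((len(w), 1))])])
    b_ub = np.concatenate([d, -d])
    cost = np.zeros(M + 2)
    cost[-1] = 1.0                      # objective: minimize delta only
    res = linprog(cost, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] * (M + 1) + [(0, None)])
    c, delta = res.x[:-1], res.x[-1]
    h = np.concatenate([c[:0:-1], c])   # symmetric impulse response h[-M..M]
    return h, delta

h, delta = minimax_lowpass(M=10, wp=0.4 * np.pi, ws=0.6 * np.pi)
```

    Because both the deviation constraints and the objective are linear in the filter coefficients, standard LP solvers apply directly, which is the key observation exploited by such design techniques.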

  15. Automation and schema acquisition in learning elementary computer programming : implications for the design of practice

    NARCIS (Netherlands)

    van Merrienboer, Jeroen J.G.; van Merrienboer, J.J.G.; Paas, Fred G.W.C.

    1990-01-01

    Two complementary processes may be distinguished in learning a complex cognitive skill such as computer programming. First, automation offers task-specific procedures that may directly control programming behavior; second, schema acquisition offers cognitive structures that provide analogies in new

  16. Evaluation of diagnostic accuracy of conventional and digital periapical radiography, panoramic radiography, and cone-beam computed tomography in the assessment of alveolar bone loss.

    Science.gov (United States)

    Takeshita, Wilton Mitsunari; Vessoni Iwaki, Lilian Cristina; Da Silva, Mariliani Chicarelli; Tonin, Renata Hernandes

    2014-07-01

    To evaluate the diagnostic accuracy of different radiographic methods in the assessment of proximal alveolar bone loss (ABL). ABL, the distance between the cement-enamel junction and the alveolar bone crest, was measured in 70 mandibular human teeth - directly on the mandibles (control), using conventional periapical radiography with film holders (Rinn XCP and Han-Shin), digital periapical radiography with a complementary metal-oxide semiconductor sensor, conventional panoramic radiography, and cone-beam computed tomography (CBCT). Three programs were used to measure ABL on the images: Image Tool 3.0 (University of Texas Health Sciences Center, San Antonio, Texas, USA), Kodak Imaging 6.1 (Kodak Dental Imaging 6.1, Carestream Health(®), Rochester, NY, USA), and i-CAT Vision 1.6.20. Statistical analysis used ANOVA and Tukey's test at a 5% significance level. The tomographic images showed the highest means, whereas the lowest were found for periapical with Han-Shin. Controls differed from periapical with Han-Shin (P periapical with Rinn XCP (P = 0.0066), periapical with Han-Shin (P periapical (P = 0.0027). Conventional periapicals with film holders differed from each other (P = 0.0007). Digital periapical differed from conventional periapical with Han-Shin (P = 0.0004). Conventional periapical with Han-Shin film holder was the only method that differed from the controls. CBCT had the closest means to the controls.

  17. WE-AB-303-09: Rapid Projection Computations for On-Board Digital Tomosynthesis in Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Iliopoulos, AS; Sun, X [Duke University, Durham, NC (United States); Pitsianis, N [Aristotle University of Thessaloniki (Greece); Duke University, Durham, NC (United States); Yin, FF; Ren, L [Duke University Medical Center, Durham, NC (United States)

    2015-06-15

    Purpose: To facilitate fast and accurate iterative volumetric image reconstruction from limited-angle on-board projections. Methods: Intrafraction motion hinders the clinical applicability of modern radiotherapy techniques such as lung stereotactic body radiation therapy (SBRT). The LIVE system may impact clinical practice by recovering volumetric information via Digital Tomosynthesis (DTS), thus requiring little time and radiation dose for image acquisition during treatment. The DTS is estimated as a deformation of a prior CT via iterative registration with on-board images; this shifts the challenge to the computational domain, owing largely to repeated projection computations across iterations. We address this issue by composing efficient digital projection operators from their constituent parts. This allows us to separate the static (projection geometry) and dynamic (volume/image data) parts of projection operations by means of pre-computations, enabling fast on-board processing, while also relaxing constraints on the underlying numerical models (e.g., regridding interpolation kernels). Further decoupling the projectors into simpler ones ensures the incurred memory overhead remains low, within the capacity of a single GPU. These operators depend only on the treatment plan and may be reused across iterations and patients. The dynamic processing load is kept to a minimum and maps well to the GPU computational model. Results: We have integrated efficient, pre-computable modules for volumetric ray-casting and FDK-based back-projection with the LIVE processing pipeline. Our results show a 60x acceleration of the DTS computations, compared to the previous version, using a single GPU; presently, reconstruction is attained within a couple of minutes. The present implementation allows for significant flexibility in terms of the numerical and operational projection model; we are investigating the benefit of further optimizations and accurate digital projection sub
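
    The static/dynamic split the abstract describes (pre-computing geometry-dependent operators once, then reusing them every iteration) can be sketched with a toy parallel-beam projector. The nearest-neighbor ray sampling below is a deliberate simplification and not the LIVE system's operators.

```python
import numpy as np
from scipy import sparse

def precompute_projector(n, angles):
    """Pre-compute a sparse projection matrix for an n x n image and a set of
    parallel-beam angles (a toy stand-in for the treatment-plan geometry).
    Each row sums the pixels nearest to one ray; built once, reused forever."""
    rows, cols, vals = [], [], []
    centre = (n - 1) / 2.0
    row = 0
    for theta in angles:
        ct, st = np.cos(theta), np.sin(theta)
        for offset in range(n):                            # one detector bin per ray
            t = offset - centre
            for s in np.linspace(-centre, centre, 2 * n):  # sample along the ray
                x = int(round(centre + t * ct - s * st))
                y = int(round(centre + t * st + s * ct))
                if 0 <= x < n and 0 <= y < n:
                    rows.append(row); cols.append(y * n + x); vals.append(1.0)
            row += 1
    return sparse.csr_matrix((vals, (rows, cols)), shape=(row, n * n))

A = precompute_projector(16, [0.0, np.pi / 2])   # static part: built once per geometry
p = A @ np.ones(16 * 16)                         # dynamic part: cheap per-iteration product
```

    All the expensive geometric work (ray tracing, bounds checks) happens once; each iteration then reduces to sparse matrix-vector products for forward projection (`A @ volume`) and back-projection (`A.T @ residual`), which is the essence of the pre-computation argument.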

  18. Low-cost digital image processing on a university mainframe computer. [considerations in selecting and/or designing instructional systems

    Science.gov (United States)

    Williams, T. H. L.

    1981-01-01

    The advantages and limitations of using university mainframe computers in digital image processing instruction are listed. Aspects to be considered when designing software for this purpose include not only the general audience, but also the capabilities of the system regarding the size of the image/subimage, preprocessing and enhancement functions, geometric correction and registration techniques, classification strategy, classification algorithm, multitemporal analysis, and ancillary data and geographic information systems. The user/software/hardware interaction as well as acquisition and operating costs must also be considered.

  19. Digital computer study of nuclear reactor thermal transients during startup of 60-kWe Brayton power conversion system

    Science.gov (United States)

    Jefferies, K. S.; Tew, R. C.

    1974-01-01

    A digital computer study was made of reactor thermal transients during startup of the Brayton power conversion loop of a 60-kWe reactor Brayton power system. A startup procedure requiring the least Brayton system complication was tried first; this procedure caused violations of design limits on key reactor variables. Several modifications of this procedure were then found which caused no design limit violations. These modifications involved: (1) using a slower rate of increase in gas flow; (2) increasing the initial reactor power level to make the reactor respond faster; and (3) appropriate reactor control drum manipulation during the startup transient.
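
    The qualitative finding that a slower gas-flow ramp softens the reactor thermal transient can be reproduced with a toy lumped-parameter model. This is an illustrative sketch, not the study's reactor simulation: the constant heating rate `P`, time constant `tau`, and temperatures are all hypothetical.

```python
def max_cooling_rate(ramp_seconds, t_end=600.0, dt=0.05,
                     T0=900.0, T_sink=400.0, tau=50.0, P=2.0):
    """Toy first-order model: outlet temperature T (K) is heated at a constant
    rate P (K/s) and cooled by gas flow that ramps linearly from zero to full
    flow over ramp_seconds. Integrated with explicit Euler; returns the
    steepest cooling rate (K/s) seen during the startup transient."""
    T, t, worst = T0, 0.0, 0.0
    while t < t_end:
        flow = min(1.0, t / ramp_seconds)     # linear flow ramp, 0..1
        dTdt = P - flow * (T - T_sink) / tau  # heating minus flow-dependent cooling
        worst = min(worst, dTdt)              # track the most negative slope
        T += dTdt * dt
        t += dt
    return worst

# A 30 s ramp produces a steeper thermal transient than a 300 s ramp:
fast, slow = max_cooling_rate(30.0), max_cooling_rate(300.0)
```

    With the fast ramp the cooling term reaches full strength while the core is still hot, so the temperature drops sharply; with the slow ramp the temperature decays gradually as the flow builds, which is the mechanism behind the study's first modification.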

  20. The Application of Visual Basic Computer Programming Language to Simulate Numerical Iterations

    Directory of Open Access Journals (Sweden)

    Abdulkadir Baba HASSAN

    2006-06-01

    This paper examines the application of the Visual Basic computer programming language to simulate numerical iterations, the merits of Visual Basic as a programming language, and the difficulties faced when solving numerical iterations analytically. The paper encourages the use of computer programming methods for the execution of numerical iterations and finally develops a reliable solution, using the Visual Basic package to write a program for some selected iteration problems.
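
    Although the paper works in Visual Basic, the kind of numerical iteration it automates is language-agnostic; below is a minimal Newton-Raphson sketch (illustrative only, not the paper's program).

```python
def newton(f, df, x0, tol=1e-10, max_iter=100):
    """Newton-Raphson iteration: repeatedly improve x by x - f(x)/f'(x)
    until the correction step is smaller than tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("iteration did not converge")

# Example: root of f(x) = x**2 - 2, i.e. the square root of 2.
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=1.0)
```

    Programming the loop, rather than iterating by hand, is exactly the convenience the paper argues for: the computer repeats the update rule until the stopping criterion is met.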