WorldWideScience

Sample records for hx-20 notebook computer

  1. Field microcomputerized multichannel γ ray spectrometer based on notebook computer

    International Nuclear Information System (INIS)

    Jia Wenyi; Wei Biao; Zhou Rongsheng; Li Guodong; Tang Hong

    1996-01-01

    Currently, field γ-ray spectrometry cannot rapidly measure the full γ-ray spectrum. A field microcomputerized multichannel γ-ray spectrometer based on a notebook computer is therefore described, with which the full γ-ray spectrum can be measured rapidly in the field.

  2. seismo-live: Training in Computational Seismology using Jupyter Notebooks

    Science.gov (United States)

    Igel, H.; Krischer, L.; van Driel, M.; Tape, C.

    2016-12-01

    Practical training in computational methodologies is still underrepresented in Earth science curricula despite the increasing use of sometimes highly sophisticated simulation technologies in research projects. At the same time, well-engineered community codes make it easy to obtain simulation-based results, with the danger that the inherent traps of numerical solutions are not well understood. It is our belief that training with highly simplified numerical solutions (here, to the equations describing elastic wave propagation) built from carefully chosen elementary ingredients of simulation technology (e.g., finite differencing, function interpolation, spectral derivatives, numerical integration) could substantially improve this situation. For this purpose we have initiated a community platform (www.seismo-live.org) where Python-based Jupyter notebooks can be accessed and run without any downloads or local software installations. The increasingly popular Jupyter notebooks allow combining markup language, graphics, and equations with interactive, executable Python code. We demonstrate the potential with training notebooks for the finite-difference method, pseudospectral methods, finite/spectral element methods, and the finite-volume and discontinuous Galerkin methods. The platform already includes general Python training, an introduction to the ObsPy library for seismology, as well as seismic data processing and noise analysis. Submission of Jupyter notebooks on general seismology is encouraged. The platform can be used for complementary teaching in Earth science courses on compute-intensive research areas.
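    The elementary ingredients mentioned in the abstract can be illustrated with a short, self-contained sketch of the kind of exercise such training notebooks contain (this example is illustrative, not taken from seismo-live): the 1D acoustic wave equation solved with a second-order finite-difference scheme, including the CFL stability check such courses emphasize.

```python
# Illustrative 1D acoustic wave propagation with finite differences.
# All grid and medium values are invented for the sketch.
nx, nt = 200, 400                 # spatial points, time steps
dx, dt, c = 1.0, 0.5e-3, 1000.0   # spacing [m], step [s], velocity [m/s]

# CFL stability criterion for this explicit scheme: c*dt/dx <= 1
eps = c * dt / dx
assert eps <= 1.0, "unstable: reduce dt or increase dx"

p_old = [0.0] * nx                # pressure at previous time step
p = [0.0] * nx                    # pressure at current time step
p[nx // 2] = 1.0                  # impulsive source in the domain centre

for _ in range(nt):
    p_new = [0.0] * nx
    for i in range(1, nx - 1):
        # centred second derivative in space, leapfrog in time
        d2p = p[i + 1] - 2.0 * p[i] + p[i - 1]
        p_new[i] = 2.0 * p[i] - p_old[i] + eps ** 2 * d2p
    p_old, p = p, p_new
```

    A typical notebook exercise then asks the student to deliberately violate the CFL criterion and watch the solution blow up.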

  3. An ergonomic evaluation comparing desktop, notebook, and subnotebook computers.

    Science.gov (United States)

    Szeto, Grace P; Lee, Raymond

    2002-04-01

    To evaluate and compare the postures and movements of the cervical and upper thoracic spine, typing performance, and workstation ergonomic factors when using desktop, notebook, and subnotebook computers. Repeated-measures design. A motion analysis laboratory with an electromagnetic tracking device. A convenience sample of 21 university students aged 20 to 24 years with no history of neck or shoulder discomfort. Each subject performed a standardized typing task using each of the 3 computers. Measurements during the typing task were taken at set intervals. The cervical and thoracic spine adopted a more flexed posture when the smaller computers were used. There were significantly greater neck movements with the desktop computer than with the notebook and subnotebook computers. The viewing distances adopted by the subjects decreased as computer size decreased. Typing performance and subjective ratings of keyboard difficulty also differed significantly among the 3 types of computers. Computer users need to consider spinal posture and the potential risk of developing musculoskeletal discomfort when choosing computers. Copyright 2002 by the American Congress of Rehabilitation Medicine and the American Academy of Physical Medicine and Rehabilitation

  4. Longitudinal Study of Factors Impacting the Implementation of Notebook Computer Based CAD Instruction

    Science.gov (United States)

    Goosen, Richard F.

    2009-01-01

    This study provides information for higher education leaders that have conducted or are considering conducting Computer Aided Design (CAD) instruction using student-owned notebook computers. Survey data were collected during the first 8 years of a pilot program requiring engineering technology students at a four-year public university to acquire a notebook…

  5. 35-We polymer electrolyte membrane fuel cell system for notebook computer using a compact fuel processor

    Science.gov (United States)

    Son, In-Hyuk; Shin, Woo-Cheol; Lee, Yong-Kul; Lee, Sung-Chul; Ahn, Jin-Gu; Han, Sang-Il; Kweon, Ho-Jin; Kim, Ju-Yong; Kim, Moon-Chan; Park, Jun-Yong

    A polymer electrolyte membrane fuel cell (PEMFC) system is developed to power a notebook computer. The system consists of a compact methanol-reforming system with a CO preferential oxidation unit, a 16-cell PEMFC stack, and a control unit for the management of the system with a d.c.-d.c. converter. The compact fuel-processor system (260 cm³) generates about 1.2 L min⁻¹ of reformate, which corresponds to 35 We, with a low CO concentration (<30 ppm, typically 0 ppm), and is thus proven capable of being targeted at notebook computers.

  6. 35-We polymer electrolyte membrane fuel cell system for notebook computer using a compact fuel processor

    Energy Technology Data Exchange (ETDEWEB)

    Son, In-Hyuk; Shin, Woo-Cheol; Lee, Sung-Chul; Ahn, Jin-Gu; Han, Sang-Il; Kweon, Ho-Jin; Kim, Ju-Yong; Park, Jun-Yong [Energy 1 Group, Energy Laboratory at Corporate R and D Center in Samsung SDI Co., Ltd., 575, Shin-dong, Yeongtong-gu, Suwon-si, Gyeonggi-do 443-731 (Korea); Lee, Yong-Kul [Department of Chemical Engineering, Dankook University, Youngin 448-701 (Korea); Kim, Moon-Chan [Department of Environmental Engineering, Chongju University, Chongju 360-764 (Korea)

    2008-10-15

    A polymer electrolyte membrane fuel cell (PEMFC) system is developed to power a notebook computer. The system consists of a compact methanol-reforming system with a CO preferential oxidation unit, a 16-cell PEMFC stack, and a control unit for the management of the system with a d.c.-d.c. converter. The compact fuel-processor system (260 cm³) generates about 1.2 L min⁻¹ of reformate, which corresponds to 35 We, with a low CO concentration (<30 ppm, typically 0 ppm), and is thus proven capable of being targeted at notebook computers. (author)

  7. 75 FR 8400 - In the Matter of Certain Notebook Computer Products and Components Thereof; Notice of Investigation

    Science.gov (United States)

    2010-02-24

    ... INTERNATIONAL TRADE COMMISSION [Inv. No. 337-TA-705] In the Matter of Certain Notebook Computer... United States after importation of certain notebook computer products and components thereof by reason of... an industry in the United States exists as required by subsection (a)(2) of section 337. The...

  8. Exploding the Black Box: Personal Computing, the Notebook Battery Crisis, and Postindustrial Systems Thinking.

    Science.gov (United States)

    Eisler, Matthew N

    Historians of science and technology have generally ignored the role of power sources in the development of consumer electronics. In this they have followed the predilections of historical actors. Research, development, and manufacturing of batteries have historically occurred at a social and intellectual distance from the research, development, and manufacturing of the devices they power. Nevertheless, power-source technoscience should properly be understood as an allied yet estranged field of electronics. The separation between the fields has had important consequences for the design and manufacturing of mobile consumer electronics. This paper explores these dynamics in the co-construction of notebook batteries and computers. In so doing, it challenges the assumptions of historians and of industrial engineers and planners about the nature of computer systems in particular and the development of technological systems in general. The co-construction of notebook computers and batteries, and the occasional catastrophic failure of their compatibility, challenges systems thinking more generally.

  9. Usability Evaluation of Notebook Computers and Cellular Telephones Among Users with Visual and Upper Extremity Disabilities

    OpenAIRE

    Mooney, Aaron Michael

    2002-01-01

    Information appliances such as notebook computers and cellular telephones are becoming integral to the lives of many. These devices facilitate a variety of communication tasks, and are used for employment, education, and entertainment. Those with disabilities, however, have limited access to these devices, due in part to product designs that do not consider their special needs. A usability evaluation can help identify the needs and difficulties those with disabilities have when using a pro...

  10. Idea Notebook: Wilderness Food Planning in the Computer Age.

    Science.gov (United States)

    Drury, Jack K.

    1986-01-01

    Explains the use of a computer as a planning and teaching tool in wilderness trip food planning. Details use of master food list and spreadsheet software such as VisiCalc to provide shopping lists for food purchasing, cost analysis, and diet analysis. (NEC)
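    The spreadsheet logic the article describes is simple enough to sketch in a few lines of code. The items, quantities, and prices below are invented for illustration, not taken from the article's master food list; the point is the scaling of a per-person, per-day master list into a shopping list with a cost total.

```python
# Hypothetical master food list, in the style of a VisiCalc sheet:
# item -> (ounces per person per day, dollars per ounce).
master_list = {
    "oatmeal": (3.0, 0.10),
    "rice":    (4.0, 0.08),
    "cheese":  (2.0, 0.35),
}

def shopping_list(people, days):
    """Scale the master list to the trip and total the cost."""
    order = {}
    total = 0.0
    for item, (oz_per_person_day, cost_per_oz) in master_list.items():
        ounces = oz_per_person_day * people * days
        order[item] = ounces
        total += ounces * cost_per_oz
    return order, round(total, 2)

order, cost = shopping_list(people=6, days=5)
```

    For a party of six on a five-day trip this yields, for example, 4.0 * 6 * 5 = 120 ounces of rice; a diet analysis would extend the per-item tuples with nutrient columns and total those the same way.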

  11. A Laboratory Notebook System

    OpenAIRE

    Schreiber, Andreas

    2012-01-01

    Many scientists use a laboratory notebook when conducting experiments. The scientist documents each step, whether taken during the experiment or afterwards when processing data. With computerized research systems, acquired data increase in volume and become more elaborate. This increases the need to migrate from originally paper-based notebooks to electronic notebooks with data storage, computational features, and reliable electronic documentation. This talk describes a laboratory notebook bas...

  12. Ergonomic guidelines for using notebook personal computers. Technical Committee on Human-Computer Interaction, International Ergonomics Association.

    Science.gov (United States)

    Saito, S; Piccoli, B; Smith, M J; Sotoyama, M; Sweitzer, G; Villanueva, M B; Yoshitake, R

    2000-10-01

    In the 1980s, the visual display terminal (VDT) was introduced in workplaces of many countries. Soon thereafter, an upsurge in reported cases of related health problems, such as musculoskeletal disorders and eyestrain, was seen. Recently, the flat-panel display or notebook personal computer (PC) has become the most remarkable feature in modern workplaces with VDTs and even in homes. A proactive approach must be taken to avert foreseeable ergonomic and occupational health problems arising from the use of this new technology. Because of its distinct physical and optical characteristics, the ergonomic requirements for notebook PCs, in terms of machine layout, workstation design, and lighting conditions, among others, should differ from those for CRT-based computers. The Japan Ergonomics Society (JES) technical committee came up with a set of guidelines for notebook PC use following exploratory discussions on its ergonomic aspects. To keep in stride with this development, the Technical Committee on Human-Computer Interaction, under the auspices of the International Ergonomics Association, worked towards the international issuance of the guidelines. This paper unveils the result of this collaborative effort.

  13. Revision of the European Ecolabel Criteria for Personal, Notebook and Tablet Computers TECHNICAL REPORT Summary of the final criteria proposals

    OpenAIRE

    DODD NICHOLAS; VIDAL ABARCA GARRIDO CANDELA; WOLF Oliver; GRAULICH Kathrin; BUNKE Dirk; GROSS Rita; LIU Ran; MANHART Andreas; PRAKASH Siddharth

    2015-01-01

    This technical report provides the background information for the revision of the EU Ecolabel criteria for Personal and Notebook Computers. The study has been carried out by the Joint Research Centre with technical support from the Oeko-Institut. The work has been developed for the European Commission's Directorate-General for the Environment. The main purpose of this report is to provide a summary of the technical background and rationale for each criterion proposal. This document is compl...

  14. IPython notebook essentials

    CERN Document Server

    Martins, L Felipe

    2014-01-01

    If you are a professional, student, or educator who wants to learn to use IPython Notebook as a tool for technical and scientific computing, visualization, and data analysis, this is the book for you. This book will prove valuable for anyone who needs to do computations in an agile environment.

  15. Delivering health information about self-medication to older adults: use of touchscreen-equipped notebook computers.

    Science.gov (United States)

    Neafsey, P J; Strickler, Z; Shellman, J; Padula, A T

    2001-11-01

    Preventing Drug Interactions in Active Older Adults is an educational intervention to prevent prescription and over-the-counter (OTC) drug and alcohol interactions in active, community-living older adults. The objectives of the program are to increase older adults' knowledge of potential interactions of prescription medications with OTC drugs and alcohol and to increase their confidence (self-efficacy) about how to avoid such interactions. An interactive multimedia computer software program (Personal Education Program or PEP) was designed for the learning styles and psychomotor skills of older adults. Focus groups of older adults evaluated PEP components in a formative manner during development. The program content dealing with antacids, calcium supplements, and acid reducers was pilot tested with 60 older adults recruited from local senior centers. Participants used the PEP on notebook computers equipped with infrared-sensitive touchscreens. Users of PEP had greater knowledge and self-efficacy scores than controls. Participants indicated a high degree of satisfaction with the PEP and reported their intent to make specific changes in self-medication behaviors.

  16. The Invention Notebook Challenge

    Science.gov (United States)

    Roman, Harry T.

    2018-01-01

    Like scientists who keep lab notebooks detailing their experiments, inventors keep invention notebooks that chronologically detail the inception, development, and refinement of their inventions. These notebooks are legal documents that can help prove one inventor's precedence over another. Scenarios like these are very real, as the author has had…

  17. ELECTRONIC RESEARCH NOTEBOOKS

    Science.gov (United States)

    The paper details the public availability of electronic notebooks (ENs) and an example of a system in use within a research laboratory in the Office of Research and Development. Research notebooks contain intellectual property which must be guarded until it can be disseminated wit...

  18. Keeping a Laboratory Notebook.

    Science.gov (United States)

    Eisenberg, Anne

    1982-01-01

    Since the keeping of good records is essential in the chemistry laboratory, general guidelines for maintaining a laboratory notebook are provided. Includes rationale for having entries documented or witnessed. (Author/JN)

  19. IPython/Jupyter Notebooks

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Jupyter notebooks are pretty amazing. They can run code and keep visualizations, equations, and formatted text together in one place. With notebooks it's extremely easy to produce and share results in a comprehensible format, which makes them the perfect tool for data analysis. I'll give a sneak peek at their wide range of uses and at what we are doing at Indico to help their adoption at CERN.
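    Part of what makes notebooks so easy to share is that, under the hood, a Jupyter notebook is just a JSON file in the nbformat layout. The sketch below writes a minimal two-cell notebook using only the standard library; real tooling would normally go through the nbformat package instead, and the cell contents here are invented for illustration.

```python
import json

# Minimal nbformat-4 document: top-level metadata plus a list of cells.
notebook = {
    "nbformat": 4,
    "nbformat_minor": 4,
    "metadata": {"kernelspec": {"name": "python3",
                                "display_name": "Python 3",
                                "language": "python"}},
    "cells": [
        {"cell_type": "markdown",          # rendered text, math included
         "metadata": {},
         "source": ["# Energy of a signal\n", "$E = \\sum_i x_i^2$"]},
        {"cell_type": "code",              # executable cell
         "metadata": {},
         "execution_count": None,          # not yet run
         "outputs": [],
         "source": ["x = [1, 2, 3]\n", "sum(v * v for v in x)"]},
    ],
}

with open("minimal.ipynb", "w") as f:
    json.dump(notebook, f, indent=1)
```

    Opening `minimal.ipynb` in Jupyter shows one rendered Markdown cell and one runnable code cell; everything else notebooks do is layered on this structure.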

  20. The Electronic Notebook Ontology

    OpenAIRE

    Chalk, Stuart

    2016-01-01

    Science is rapidly being brought into the electronic realm and electronic laboratory notebooks (ELN) are a big part of this activity. The representation of the scientific process in the context of an ELN is an important component to making the data recorded in ELNs semantically integrated. This presentation will outline initial developments of an Electronic Notebook Ontology (ENO) that will help tie together the ExptML ontology, HCLS Community Profile data descriptions, and the VIVO-ISF ontol...

  1. Hibernate A Developer's Notebook

    CERN Document Server

    Elliott, James

    2004-01-01

    Do you enjoy writing software, except for the database code? Hibernate: A Developer's Notebook is for you. Database experts may enjoy fiddling with SQL, but you don't have to--the rest of the application is the fun part. And even database experts dread the tedious plumbing and typographical spaghetti needed to put their SQL into a Java program. Hibernate: A Developer's Notebook shows you how to use Hibernate to automate persistence: you write natural Java objects and some simple configuration files, and Hibernate automates all the interaction between your objects and the database. You don't

  2. Rethinking Laboratory Notebooks

    DEFF Research Database (Denmark)

    Klokmose, Clemens Nylandsted; Zander, Pär-Ola

    2010-01-01

    We take the digitalization of laboratory work practice as a challenging design domain to explore. There are obvious drawbacks to using paper instead of ICT in the collaborative writing that takes place in laboratory notebooks; yet paper persists as the most common solution. The ultimate aim...... with our study is to produce design-relevant knowledge that can envisage an ICT solution that keeps as many advantages of paper as possible, but with the strengths of electronic laboratory notebooks as well. Rather than assuming that users are technophobic and unable to appropriate state-of-the-art software...

  3. Writing the Laboratory Notebook.

    Science.gov (United States)

    Kanare, Howard M.

    The purpose of this book is to teach the principles of proper scientific notekeeping. The principles presented in this book are goals for which working scientists must strive. Chapter 1, "The Reasons for Notekeeping," is an overview of the process of keeping a laboratory notebook. Chapter 2, "The Hardware of Notekeeping," is intended especially…

  4. Lowering the barriers to computational modeling of Earth's surface: coupling Jupyter Notebooks with Landlab, HydroShare, and CyberGIS for research and education.

    Science.gov (United States)

    Bandaragoda, C.; Castronova, A. M.; Phuong, J.; Istanbulluoglu, E.; Strauch, R. L.; Nudurupati, S. S.; Tarboton, D. G.; Wang, S. W.; Yin, D.; Barnhart, K. R.; Tucker, G. E.; Hutton, E.; Hobley, D. E. J.; Gasparini, N. M.; Adams, J. M.

    2017-12-01

    The ability to test hypotheses about hydrology, geomorphology, and atmospheric processes is invaluable to research in the era of big data. Although community resources are available, there remain significant educational, logistical, and time-investment barriers to their use. Knowledge infrastructure is an emerging intellectual framework for understanding how people create, share, and distribute knowledge, activities that have been dramatically transformed by Internet technologies. In addition to the technical and social components of a cyberinfrastructure system, knowledge infrastructure considers the educational, institutional, and open-source governance components required to advance knowledge. We are designing an infrastructure environment that lowers common barriers to reproducing modeling experiments for earth-surface investigation. Landlab is an open-source modeling toolkit for building, coupling, and exploring two-dimensional numerical models. HydroShare is an online collaborative environment for sharing hydrologic data and models. CyberGIS-Jupyter is an innovative cyberGIS framework for achieving data-intensive, reproducible, and scalable geospatial analytics using the Jupyter Notebook, based on ROGER, the first cyberGIS supercomputer, so that models can be elastically reproduced through cloud-computing approaches. Our team of geomorphologists, hydrologists, and computer geoscientists has created a new infrastructure environment that combines these three pieces of software to enable knowledge discovery. Through this novel integration, any user can interactively execute and explore their shared data and model resources. Landlab on HydroShare with CyberGIS-Jupyter supports the modeling continuum from fully developed modeling applications, through prototyping of new science tools, to hands-on research demonstrations for training workshops and classroom applications. Computational geospatial models based on big data and high-performance computing can now be more efficiently

  5. A virtual laboratory notebook for simulation models.

    Science.gov (United States)

    Winfield, A J

    1998-01-01

    In this paper we describe how we have adopted the laboratory notebook as a metaphor for interacting with computer simulation models. This 'virtual' notebook stores the simulation output and meta-data (which are used to record the scientist's interactions with the simulation). The meta-data stored consist of annotations (equivalent to marginal notes in a laboratory notebook), a history tree, and a log of user interactions. The history tree structure records when in 'simulation' time, and from what starting point in the tree, changes are made to the parameters by the user. Typically these changes define a new run of the simulation model (which is represented as a new branch of the history tree). The tree shows the structure of the changes made to the simulation, and the log is required to keep the order in which the changes occurred. Together they form a record of the kind you would normally find in a laboratory notebook. The history tree is plotted in simulation parameter space. This shows the scientist's interactions with the simulation visually and allows direct manipulation of the parameter information presented, which in turn is used to control the state of the simulation directly. The interactions with the system are graphical and usually involve directly selecting or dragging data markers and other graphical control devices around in parameter space. If the graphical manipulators do not provide precise enough control, textual manipulation is still available, allowing numerical values to be entered by hand. By providing rich interactions with the visual view of the history tree, the Virtual Laboratory Notebook gives the user complex and novel ways of interacting with biological computer simulation models.
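    The history-tree and log mechanism the abstract describes can be sketched in a few lines of code. The class, parameter names, and values below are invented for illustration, not taken from the paper's implementation.

```python
# Each node records the full parameter state at a point in simulation
# time; changing parameters from an earlier node creates a new branch,
# while a flat log preserves the order in which changes occurred.
class HistoryNode:
    def __init__(self, sim_time, params, parent=None):
        self.sim_time = sim_time      # when in 'simulation' time
        self.params = dict(params)    # full parameter state at this node
        self.parent = parent
        self.children = []

    def branch(self, sim_time, **changes):
        """Change parameters starting from this node: a new run."""
        child = HistoryNode(sim_time, {**self.params, **changes},
                            parent=self)
        self.children.append(child)
        return child

log = []  # ordered record of user interactions

root = HistoryNode(0.0, {"growth_rate": 0.1, "mortality": 0.02})
log.append(("create", root))

runA = root.branch(5.0, growth_rate=0.2)   # branch 1: raise growth rate
log.append(("branch", runA))
runB = root.branch(5.0, mortality=0.05)    # branch 2: from the same node
log.append(("branch", runB))
```

    The tree (`root` with two children) shows the structure of the changes; the log alone recovers their order, which the tree does not encode.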

  6. Optimising JS visualisation for notebooks

    CERN Document Server

    She, Harry

    2017-01-01

    In many of the notebooks created and used at CERN, important components are those related to graphical visualization. These include, but are not limited to, graphs and histograms of data and events from physical detectors. However, the information used to display the ROOT graphical primitives is overly comprehensive and hence causes the storage requirements of the notebooks to grow drastically as the number of graphs and plots increases. Analysis has been performed to trim redundant information from the notebooks, as well as to provide insight into browsing ROOT objects within a notebook.

  7. GES DISC Data Recipes in Jupyter Notebooks

    Science.gov (United States)

    Li, A.; Banavige, B.; Garimella, K.; Rice, J.; Shen, S.; Liu, Z.

    2017-12-01

    The Earth Science Data and Information System (ESDIS) Project manages twelve Distributed Active Archive Centers (DAACs), which are geographically dispersed across the United States. The DAACs are responsible for ingesting, processing, archiving, and distributing Earth science data produced from various sources (satellites, aircraft, field measurements, etc.). In response to projections of an exponential increase in data production, there has been a recent effort to prototype various DAAC activities in the cloud computing environment. This, in turn, led to the creation of an initiative, called the Cloud Analysis Toolkit to Enable Earth Science (CATEES), to develop a Python software package to transition Earth science data processing to the cloud. This project, in particular, supports CATEES and has two primary goals: one, transition data recipes created by the Goddard Earth Sciences Data and Information Services Center (GES DISC) DAAC into an interactive and educational environment using Jupyter Notebooks; two, acclimate Earth scientists to cloud computing. To accomplish these goals, we create Jupyter Notebooks to compartmentalize the different steps of data analysis and help users obtain and parse data from the command line. We also develop a Docker container, comprising Jupyter Notebooks, Python library dependencies, and command-line tools, and configure it into an easy-to-deploy package. The end result is an end-to-end product that simulates the use case of end users working in the cloud computing environment.

  8. Process notebook for aquatic ecosystem simulation

    International Nuclear Information System (INIS)

    Swartzman, G.; Smith, E.; McKenzie, D.; Haar, B.; Fickeisen, D.

    1980-01-01

    This notebook contains a detailed comparison of 14 models of fish growth, energetics, population dynamics, and feeding. It is a basic document for the evaluation of these models' usefulness for impact assessment. Model equations are categorized into 18 subprocesses comprising the major processes of consumption, predation, metabolic processes, growth, fecundity, and mortality. The model equations are compared in a standard notation, and the equation rationales are considered and put into a historical framework with historical precedence charts. Model parameters are computed in standard units, and data sources and techniques used for parameter estimation are identified. A translator compares the standard notation with the notation used in the models. The major contribution of this work is that, for the first time, fish models are arrayed with their assumptions laid bare and their parameter values compared, allowing elucidation of model differences and evaluation of model behavior and data needs by using the process notebook as a base for further simulation comparison.
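    As a flavour of the kind of growth equation such a comparison tabulates, here is the standard von Bertalanffy growth curve in code. The parameter values are illustrative, not taken from any of the 14 models in the notebook.

```python
import math

def von_bertalanffy_length(t, l_inf, k, t0):
    """Von Bertalanffy growth: L(t) = L_inf * (1 - exp(-k * (t - t0)))."""
    return l_inf * (1.0 - math.exp(-k * (t - t0)))

# Illustrative parameters in standard units: asymptotic length [cm],
# growth coefficient [1/yr], hypothetical age at zero length [yr].
l_inf, k, t0 = 60.0, 0.3, -0.5

length_at_2yr = von_bertalanffy_length(2.0, l_inf, k, t0)
```

    Comparing models then amounts to fitting each formulation's parameters in the same standard units and charting how the predicted curves diverge.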

  9. Analysis of Samsung notebook strategy

    OpenAIRE

    Xu, Rui

    2009-01-01

    Against this fast-growing background, the notebook industry draws a great deal of attention from IT companies. It is expected that by 2010, notebook sales will overtake desktop sales. Based on SWOT and SPACE analyses, we recommend that Samsung improve from two perspectives: first, set up a clear and detailed marketing goal and plan, and ensure that these strategies are enforced; secondly, manage the distribution channels effectively and efficiently. There are two trends Samsu...

  10. Python for signal processing featuring IPython notebooks

    CERN Document Server

    Unpingco, José

    2013-01-01

    This book covers the fundamental concepts of signal processing, illustrated with Python code and made available via IPython Notebooks, which are live, interactive, browser-based documents that allow one to change parameters, redraw plots, and tinker with the ideas presented in the text. Everything in the text is computable in this format and thereby invites readers to "experiment and learn" as they read. The book focuses on the core, fundamental principles of signal processing. The code corresponding to this book uses the core functionality of the scientific Python toolchain that should remai
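    In the spirit of the book's "experiment and learn" format, a core signal-processing computation can be sketched with only the standard library (the book itself builds on the scientific Python toolchain): a naive discrete Fourier transform of a sampled sine, whose magnitude spectrum peaks at the sine's frequency bin.

```python
import cmath
import math

n = 64          # number of samples in the window
f = 5           # sine frequency in cycles per window
x = [math.sin(2.0 * math.pi * f * i / n) for i in range(n)]

def dft(signal):
    """Naive O(n^2) discrete Fourier transform."""
    m = len(signal)
    return [sum(signal[t] * cmath.exp(-2j * math.pi * k * t / m)
                for t in range(m))
            for k in range(m)]

spectrum = [abs(v) for v in dft(x)]
peak_bin = max(range(n), key=spectrum.__getitem__)
# For a real-valued sine the energy sits at bins +f and n-f,
# each with magnitude n/2.
```

    In a notebook one would then change `f` or `n` and redraw the spectrum to see the peak move, which is exactly the interactive tinkering the book encourages.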

  11. Doing physics with scientific notebook a problem solving approach

    CERN Document Server

    Gallant, Joseph

    2012-01-01

    The goal of this book is to teach undergraduate students how to use Scientific Notebook (SNB) to solve physics problems. SNB software combines word processing and mathematics in standard notation with the power of symbolic computation. As its name implies, SNB can be used as a notebook in which students set up a math or science problem, write and solve equations, and analyze and discuss their results. Written by a physics teacher with over 20 years' experience, this text includes topics that have educational value, fit within the typical physics curriculum, and show the benefits of using SNB.

  12. Jupyter Notebooks for Earth Sciences: An Interactive Training Platform for Seismology

    Science.gov (United States)

    Igel, H.; Chow, B.; Donner, S.; Krischer, L.; van Driel, M.; Tape, C.

    2017-12-01

    We have initiated a community platform (http://www.seismo-live.org) where Python-based Jupyter notebooks (https://jupyter.org) can be accessed and run without downloads or local software installations. The increasingly popular Jupyter notebooks allow the combination of markup language, graphics, and equations with interactive, executable Python code examples. Jupyter notebooks are a powerful and easy-to-grasp tool for students developing entire projects, for scientists collaborating and efficiently exchanging evolving workflows, and for trainers developing practical material. Utilizing the tmpnb project (https://github.com/jupyter/tmpnb), we link the power of Jupyter notebooks with an underlying server, such that notebooks can be run from anywhere, even on smartphones. We demonstrate the potential with notebooks for 1) learning the programming language Python, 2) basic signal processing, 3) an introduction to the ObsPy library (https://obspy.org) for seismology, 4) seismic noise analysis, 5) an entire suite of notebooks for computational seismology (the finite-difference method, pseudospectral methods, finite/spectral element methods, the finite-volume and discontinuous Galerkin methods, Instaseis), 6) rotational seismology, 7) making results in papers fully reproducible, 8) a rate-and-state friction toolkit, and 9) glacial seismology. The platform is run as a community project on GitHub. Submission of complementary Jupyter notebooks is encouraged. Extensions in the near future include linear(-ized) and nonlinear inverse problems.

  13. seismo-live: Training in Seismology using Jupyter Notebooks

    Science.gov (United States)

    Igel, Heiner; Krischer, Lion; van Driel, Martin; Tape, Carl

    2017-04-01

    Practical training in computational methodologies is still underrepresented in Earth science curricula despite the increasing use of sometimes highly sophisticated simulation and data-processing technologies in research projects. At the same time, well-engineered community codes make it easy to obtain results, with the danger that the inherent traps of black-box solutions are not well understood. For this purpose we have initiated a community platform (www.seismo-live.org) where Python-based Jupyter notebooks can be accessed and run without downloads or local software installations. The increasingly popular Jupyter notebooks allow combining markup language, graphics, and equations with interactive, executable Python code. The platform already includes general Python training, an introduction to the ObsPy library for seismology, as well as seismic data processing, noise analysis, and a variety of forward solvers for seismic wave propagation. In addition, an example shows how Jupyter notebooks can be used to increase the reproducibility of published results. Submission of Jupyter notebooks on general seismology is encouraged. The platform can be used for complementary teaching in Earth science courses on compute-intensive research areas. We present recent developments and new features.

  14. Seismo-Live: Training in Seismology with Jupyter Notebooks

    Science.gov (United States)

    Krischer, Lion; Tape, Carl; Igel, Heiner

    2016-04-01

    Seismological training tends to occur within the isolation of a particular institution, with a limited set of tools (codes, libraries) that are often not transferable outside it. Here, we propose to overcome these limitations with a community-driven library of Jupyter notebooks dedicated to training on any aspect of seismology, for purposes of education and outreach, on-site or archived tutorials for codes, classroom instruction, and research. A Jupyter notebook (jupyter.org) is an open-source interactive computational environment that allows combining code execution, rich text, mathematics, and plotting. It can be considered a platform that supports reproducible research, as all inputs and outputs may be stored. Text, external graphics, and equations can be handled using Markdown (incl. LaTeX) format. Jupyter notebooks run in standard web browsers, can be easily exchanged in text format, or converted to other documents (e.g. PDF, slide shows). They provide an ideal format for practical training in seismology. A pilot platform was set up with a dedicated server such that the Jupyter notebooks can be run in any browser (PC, notepad, smartphone). We show the functionalities of the Seismo-Live platform with examples from computational seismology, seismic data access and processing using the ObsPy library, seismic inverse problems, and others. The current examples all use the Python programming language, but any free language can be used. Potentially, such community platforms could be integrated with the EPOS-IT infrastructure and extended to other fields of Earth sciences.

  15. Effect of two Notebook stands on work posture and productivity

    NARCIS (Netherlands)

    Könemann, R.; Kuijt-Evers, L.F.M.; Lingen, P. van; Sauvage, S.; Hallbeck, S.

    2009-01-01

    The aim of this study was to investigate the effect of using a notebook stand on the physical load of working with a notebook in a home environment. Sixteen subjects evaluated notebook work by performing three different tasks with each of two notebook stands and without a stand.

  16. Integration of ROOT Notebooks as a Web-based ATLAS Analysis tool for public data releases and outreach

    CERN Document Server

    Banda, Tea; CERN. Geneva. EP Department

    2016-01-01

    The project consists of the initial development of ROOT notebooks for a Z boson analysis in the C++ programming language that will allow students and researchers to perform fast and very useful data analysis, using ATLAS public data and Monte Carlo simulations. Several tools are considered: the ROOT Data Analysis Framework, Jupyter Notebook technology, and the CERN ROOT computing service SWAN.

  17. LDUA software custodian's notebook

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1998-01-01

    This plan describes the activities to be performed and controls to be applied to the process of specifying, obtaining, and qualifying the control and data acquisition software for the Light Duty Utility Arm (LDUA) System. It serves the purpose of a software quality assurance plan, a verification and validation plan, and a configuration management plan. This plan applies to all software that is an integral part of the LDUA control and data acquisition system, that is, software that is installed in the computers that are part of the LDUA system as it is deployed in the field. This plan applies to the entire development process, including: requirements; design; implementation; and operations and maintenance. This plan does not apply to any software that is not integral with the LDUA system. This plan has been prepared in accordance with WHC-CM-6-1 Engineering Practices, EP-2.1; WHC-CM-3-10 Software Practices; and WHC-CM-4-2, QR 19.0, Software Quality Assurance Requirements

  18. Notebooks as didactic tool in design education

    NARCIS (Netherlands)

    Van den Toorn, M.W.M.; Have, R.

    2012-01-01

    Notebooks are an important didactic tool both for students and teaching staff. The idea of notebooks is that daily work and thinking are reflected in notes, drawings, sketches, and diagrams. Keeping track of the content of daily work can give an idea of the evolution and development of ideas.

  19. HUMAN RELATIONS LABORATORY TRAINING STUDENT NOTEBOOK.

    Science.gov (United States)

    Springport High School, MI.

    The major objective of this notebook is to help those students interested in taking part in the Springport High School human relations training laboratories to better understand themselves, society, and human emotions so that they may develop socially and emotionally. The subject matter of the notebook is divided into four major areas--(1)…

  20. Laboratory Notebooks in the Science Classroom

    Science.gov (United States)

    Roberson, Christine; Lankford, Deanna

    2010-01-01

    Lab notebooks provide students with authentic science experiences as they become active, practicing scientists. Teachers gain insight into students' understanding of science content and processes, while students create a lasting personal resource. This article provides high school science teachers with guidelines for implementing lab notebooks in…

  1. Categorization of notebooks in current information systems

    OpenAIRE

    Kubeš, Adam

    2009-01-01

    This bachelor's thesis focuses on the categorization of notebooks within the ICT of present-day information systems. It extends and fills in my previous semester project, "Notebooks", written for the course Technical Means and Infrastructure of IS. The thesis is divided into two main sections -- theoretical and practical. The theoretical section is devoted to the description of present types of computers and the categorization of notebooks in ICT, their use, contributions, limitations, energy management and synchroniz...

  2. Assessing the significance of Heidegger's Black Notebooks

    Directory of Open Access Journals (Sweden)

    J. Malpas

    2018-03-01

    Full Text Available The publication of Heidegger's Black Notebooks (Schwarze Hefte has provoked a storm of controversy. Much of this has centred on the pro-Nazi and anti-Semitic comments the volumes contain. But these aspects of the Notebooks are perhaps the least surprising and important. This essay offers a summary overview of the issues to which the Notebooks give rise, at the same time as it also aims to provide a preliminary assessment of their overall significance, especially in relation to what they show about the nature and development of Heidegger's thinking from the early 1930s to the late 1940s.

  3. Integration of TMVA Output into Jupyter notebooks

    CERN Document Server

    Saliji, Albulena

    2016-01-01

    The purpose of this report is to describe the work that I have been doing during these past eight weeks as a Summer Student at CERN. The task which was assigned to me had to do with the integration of TMVA Output into Jupyter notebooks. In order to integrate the TMVA Output into the Jupyter notebook, first, improvement of the TMVA Output in the terminal was required. Once the output was improved, it needed to be transformed into HTML output and at the end it would be possible to integrate that output into the Jupyter notebook.

  4. Jupyter Notebooks as tools for interactive learning of Concepts in Structural Geology and efficient grading of exercises.

    Science.gov (United States)

    Niederau, Jan; Wellmann, Florian; Maersch, Jannik; Urai, Janos

    2017-04-01

    Programming is increasingly recognised as an important skill for geoscientists; however, the hurdle to jump into programming can be high for students with little or no experience. We present here teaching concepts based on Jupyter notebooks that combine, in an intuitive way, formatted instruction text with code cells in a single environment. This integration allows for exposure to programming on several levels: from a completely interactive presentation of content, where students require no or very limited programming experience, to highly complex geoscientific computations. We therefore consider these notebooks an ideal medium for presenting computational content to students in the geosciences. We show here how we use these notebooks to develop digital documents in Python for undergraduate students, who can then learn about basic concepts in structural geology via self-assessment. Such notebooks cover concepts such as the stress tensor, the strain ellipse, and the Mohr circle. Students can interactively change parameters, e.g. by using sliders, and immediately see the results. They can further experiment and extend the notebook by writing their own code within it. Jupyter notebooks for teaching purposes can be provided ready-to-use via online services; that is, students do not need to install additional software on their devices in order to work with the notebooks. We also use Jupyter notebooks for automatic grading of programming assignments in multiple lectures. An implemented workflow facilitates the generation and distribution of assignments, as well as the final grading. Compared to previous methods with a high percentage of repetitive manual grading, the implemented workflow proves to be much more time efficient.
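
    The Mohr-circle notebooks described here ultimately drive a short computation. A hedged sketch (the stress values are arbitrary, and the function name is ours, not from the authors' notebooks) of the 2D relations a slider would control:

```python
# Principal stresses and maximum shear stress for a 2D (plane) stress
# state: the quantities a Mohr-circle notebook lets students explore
# interactively. Sign convention and values are illustrative.
import math

def mohr_2d(sxx, syy, sxy):
    """Return (sigma_1, sigma_2, tau_max) for a plane stress state."""
    center = (sxx + syy) / 2.0                    # centre of the Mohr circle
    radius = math.hypot((sxx - syy) / 2.0, sxy)   # circle radius = tau_max
    return center + radius, center - radius, radius

s1, s2, tmax = mohr_2d(sxx=50.0, syy=10.0, sxy=15.0)
print(s1, s2, tmax)  # → 55.0 5.0 25.0
```

    In a notebook, binding `sxx`, `syy`, and `sxy` to sliders and replotting the circle on each change gives exactly the "change parameters, immediately see the results" behaviour the abstract describes.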

  5. Who profits from innovation in global value chains? A study of the iPod and notebook PCs

    OpenAIRE

    Jason Dedrick; Kenneth L. Kraemer; Greg Linden

    2010-01-01

    This article analyzes the distribution of financial value from innovation in the global supply chains of iPods and notebook computers. We find that Apple has captured a great deal of value from the innovation embodied in the iPod, while notebook makers capture a more modest share of the value from PC innovation. In order to understand these differences, we employ concepts from theories of innovation and industrial organization, finding significant roles for industry evolution, complementary a...

  6. Indexing Laboratory Notebooks in a Chemical R&D Environment

    Science.gov (United States)

    Mendenhall, Donna M.

    1978-01-01

    A method of preparing computerized subject and author indexes for research and development laboratory notebooks is described. Wiswesser Line Notation is used as the subject entry capable of listing specifically and unambiguously the compounds described in the notebooks. (Author)

  7. SCHOOL NOTEBOOK, A TOOL FOR SUPERVISION

    Directory of Open Access Journals (Sweden)

    Karmele Totoricagüena Barandica

    2016-12-01

    Full Text Available When we first decided to present a paper about the School Notebook, a digital tool created to register inspection interventions in education centers, we were encouraged by the rising interest in and awareness of this tool, which has been implemented by the Inspectorates of Education of the Basque Country. Our main goal is to present the School Notebook in its current context, considering its origins and understanding that, through the continuous teamwork to be carried out by future inspectors, the notebook will continue evolving into a more complex and advanced tool. In conclusion, the present paper aims to reflect on and analyze certain elements to open new doors for the inspectorate. We therefore present a model we have created to work within our Autonomous Community, considering its virtues and potential. We also point out some key elements we should improve towards excellence and efficiency in our professional careers.

  8. The Laboratory Notebook as a Research and Development Record

    Science.gov (United States)

    Bailey, Martha J.

    1972-01-01

    The literature concerning laboratory notebooks is reviewed. A procedure is described for administering laboratory notebooks. Outlined is an indexing system which provides a method for retrieving information by laboratory notebook number, by name, and by general subjects. The indexing scheme is estimated to be adequate for collections up to 5,000…

  9. Digitizing and Securing Archived Laboratory Notebooks

    Science.gov (United States)

    Caporizzo, Marilyn

    2008-01-01

    The Information Group at Millipore has been successfully using a digital rights management tool to secure the email distribution of archived laboratory notebooks. Millipore is a life science leader providing cutting-edge technologies, tools, and services for bioscience research and biopharmaceutical manufacturing. Consisting of four full-time…

  10. Electronic lab notebooks: can they replace paper?

    Science.gov (United States)

    Kanza, Samantha; Willoughby, Cerys; Gibbins, Nicholas; Whitby, Richard; Frey, Jeremy Graham; Erjavec, Jana; Zupančič, Klemen; Hren, Matjaž; Kovač, Katarina

    2017-05-24

    Despite the increasingly digital nature of society there are some areas of research that remain firmly rooted in the past; in this case the laboratory notebook, the last remaining paper component of an experiment. Countless electronic laboratory notebooks (ELNs) have been created in an attempt to digitise record keeping processes in the lab, but none of them have become a 'key player' in the ELN market, due to the many adoption barriers that have been identified in previous research and further explored in the user studies presented here. The main issues identified are the cost of the current available ELNs, their ease of use (or lack of it) and their accessibility issues across different devices and operating systems. Evidence suggests that whilst scientists willingly make use of generic notebooking software, spreadsheets and other general office and scientific tools to aid their work, current ELNs are lacking in the required functionality to meet the needs of the researchers. In this paper we present our extensive research and user study results to propose an ELN built upon a pre-existing cloud notebook platform that makes use of accessible popular scientific software and semantic web technologies to help overcome the identified barriers to adoption.

  11. The status of electronic laboratory notebooks for chemistry and biology.

    Science.gov (United States)

    Taylor, Keith T

    2006-05-01

    Documenting an experiment in a way that ensures that the record can act as evidence to support a patent claim, or to demonstrate compliance with the US Food and Drug Administration's (FDA's) predicate rules, puts demands on an electronic laboratory notebook (ELN) that are not trivial. The 1996 General Agreement on Tariffs and Trade (GATT) allowed notebook records that were generated outside of the US to be used to claim precedence in US patent claims. This agreement spurred interest in the development of ELNs in Europe. The pharmaceutical research process became dependent on computer systems during the latter part of the 1990s, and this also led to a wider interest in ELNs. More recently, the FDA began to encourage submissions in an all-electronic form, leading to great interest in the use of ELNs in development and manufacturing. As a result of these influences, the pharmaceutical industry is now actively pursuing ELN evaluations and implementations. This article describes some of the early efforts and the recent drivers for ELN adoption. The state of the ELN market in 2005 is also described.

  12. Validation of practical notebook Morphophysiology IV

    Directory of Open Access Journals (Sweden)

    Rafael Capote Martínez

    2012-03-01

    Full Text Available Since the implementation of Morphophysiology in the 2007-2008 academic year, students have shown low achievement and motivation in individual and independent study. Most of them do not possess the general intellectual skills or the autonomy in self-directed work needed to assimilate the great independence required by this new learning model. Therefore, it was decided to introduce a new instructional medium, the practical notebook Morphophysiology IV, an orientation guide for the individual and independent study of students that at the same time improves the management of the educational process; its usefulness is what we intend to validate. A total of 345 students (94.8%) from different specialties, including Physical Culture, were surveyed. The survey used a question-and-answer questionnaire technique, with the closed questions first, combining direct and indirect questions, some with filters. The study was conducted using a comparison test of proportions between independent samples, using the Microstat statistical system, with a significance level of α = 0.05 (P <0.05). 97.33% and 95.6% of students classified the practical notebook, based on its teaching assignments, as necessary and useful, respectively. 67.84% of students suggested that the effectiveness of the practical notebook is achieved when it is used in a coordinated manner combining the students' personal effort, reflections in study groups or teams, and the guiding, facilitating role of the professor, which requires developing participatory techniques. It is concluded that the practical notebook, acting as a guiding methodological tool worked through teaching assignments by students during individual and independent study, represents an effective instrument for the learning process, contributing significantly to the improvement of the curricular discipline Morphophysiology.

  13. Teaching Radiology Physics Interactively with Scientific Notebook Software.

    Science.gov (United States)

    Richardson, Michael L; Amini, Behrang

    2018-06-01

    The goal of this study is to demonstrate how the teaching of radiology physics can be enhanced with the use of interactive scientific notebook software. We used the scientific notebook software known as Project Jupyter, which is free, open-source, and available for the Macintosh, Windows, and Linux operating systems. We have created a scientific notebook that demonstrates multiple interactive teaching modules we have written for our residents using the Jupyter notebook system. Scientific notebook software allows educators to create teaching modules in a form that combines text, graphics, images, data, interactive calculations, and image analysis within a single document. These notebooks can be used to build interactive teaching modules, which can help explain complex topics in imaging physics to residents. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
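
    A typical interactive module of the kind described might expose the inverse square law for a point source. A minimal sketch (the function name and values are illustrative, not taken from the authors' notebooks):

```python
# Inverse square law for radiation from a point source, a staple of
# radiology physics teaching: doubling the distance quarters the
# intensity. Values below are illustrative only.

def intensity_at(i0, d0, d):
    """Intensity at distance d, given intensity i0 measured at distance d0."""
    return i0 * (d0 / d) ** 2

# An exposure rate of 4.0 (arbitrary units) at 1 m falls to 1.0 at 2 m.
print(intensity_at(4.0, 1.0, 2.0))  # → 1.0
```

    In a Jupyter notebook, a distance slider bound to such a function lets residents see the rapid fall-off interactively rather than as a static formula.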

  14. Interacting with Petabytes of Earth Science Data using Jupyter Notebooks, IPython Widgets and Google Earth Engine

    Science.gov (United States)

    Erickson, T. A.; Granger, B.; Grout, J.; Corlay, S.

    2017-12-01

    The volume of Earth science data gathered from satellites, aircraft, drones, and field instruments continues to increase. For many scientific questions in the Earth sciences, managing this large volume of data is a barrier to progress, as it is difficult to explore and analyze large volumes of data using the traditional paradigm of downloading datasets to a local computer for analysis. Furthermore, methods for communicating Earth science algorithms that operate on large datasets in an easily understandable and reproducible way are needed. Here we describe a system for developing, interacting, and sharing well-documented Earth Science algorithms that combines existing software components: Jupyter Notebook: An open-source, web-based environment that supports documents that combine code and computational results with text narrative, mathematics, images, and other media. These notebooks provide an environment for interactive exploration of data and development of well documented algorithms. Jupyter Widgets / ipyleaflet: An architecture for creating interactive user interface controls (such as sliders, text boxes, etc.) in Jupyter Notebooks that communicate with Python code. This architecture includes a default set of UI controls (sliders, dropboxes, etc.) as well as APIs for building custom UI controls. The ipyleaflet project is one example that offers a custom interactive map control that allows a user to display and manipulate geographic data within the Jupyter Notebook. Google Earth Engine: A cloud-based geospatial analysis platform that provides access to petabytes of Earth science data via a Python API. The combination of Jupyter Notebooks, Jupyter Widgets, ipyleaflet, and Google Earth Engine makes it possible to explore and analyze massive Earth science datasets via a web browser, in an environment suitable for interactive exploration, teaching, and sharing. 
Using these environments can make Earth science analyses easier to understand and reproducible, which may
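
    The widget-to-code pattern the record describes can be reduced to a small sketch: a pure Python function that sliders would drive. The `ipywidgets.interact` call is shown only as a comment, since the widget machinery needs a running notebook; NDVI is a standard vegetation index, though this snippet is not taken from the system described:

```python
# A pure function that Jupyter widgets (e.g. one slider per band value)
# could drive interactively. NDVI = (NIR - red) / (NIR + red), the
# normalized difference vegetation index for a single pixel.

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from reflectance values."""
    return (nir - red) / (nir + red)

# Inside a notebook, ipywidgets would wire sliders to this function, e.g.:
#   from ipywidgets import interact
#   interact(ndvi, nir=(0.0, 1.0, 0.01), red=(0.0, 1.0, 0.01))

print(round(ndvi(nir=0.6, red=0.2), 3))  # → 0.5
```

    With Earth Engine in the loop, the same slider values would instead parameterize a server-side computation over the full image collection, keeping the petabyte-scale data in the cloud.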

  15. Cloud-based Jupyter Notebooks for Water Data Analysis

    Science.gov (United States)

    Castronova, A. M.; Brazil, L.; Seul, M.

    2017-12-01

    The development and adoption of technologies by the water science community to improve our ability to openly collaborate and share workflows will have a transformative impact on how we address the challenges associated with collaborative and reproducible scientific research. Jupyter notebooks offer one solution by providing an open-source platform for creating metadata-rich toolchains for modeling and data analysis applications. Adoption of this technology within the water sciences, coupled with publicly available datasets from agencies such as USGS, NASA, and EPA enables researchers to easily prototype and execute data intensive toolchains. Moreover, implementing this software stack in a cloud-based environment extends its native functionality to provide researchers a mechanism to build and execute toolchains that are too large or computationally demanding for typical desktop computers. Additionally, this cloud-based solution enables scientists to disseminate data processing routines alongside journal publications in an effort to support reproducibility. For example, these data collection and analysis toolchains can be shared, archived, and published using the HydroShare platform or downloaded and executed locally to reproduce scientific analysis. This work presents the design and implementation of a cloud-based Jupyter environment and its application for collecting, aggregating, and munging various datasets in a transparent, sharable, and self-documented manner. The goals of this work are to establish a free and open source platform for domain scientists to (1) conduct data intensive and computationally intensive collaborative research, (2) utilize high performance libraries, models, and routines within a pre-configured cloud environment, and (3) enable dissemination of research products. 
This presentation will discuss recent efforts towards achieving these goals, and describe the architectural design of the notebook server in an effort to support collaborative

  16. Methods and Strategies: Digital Notebooks for Digital Natives

    Science.gov (United States)

    Miller, Bridget; Martin, Christie

    2016-01-01

    The idea of notebooking is not new in the science classroom. Since the mid-1970s, writing has been found to facilitate students' critical thinking and learning across a variety of content areas. For science educators, notebooks have become an essential tool for supporting students' scientific inquiry in and across concepts. Scientific notebooks…

  17. Proper laboratory notebook practices: protecting your intellectual property.

    Science.gov (United States)

    Nickla, Jason T; Boehm, Matthew B

    2011-03-01

    A laboratory notebook contains a wealth of knowledge that can be critical for establishing evidence in support of intellectual property rights and for refuting claims of research misconduct. The proper type, organization, use, maintenance, and storage of laboratory notebooks should be a priority for everyone at research institutions. Failure to properly document research activities can lead to serious problems, including the loss of valuable patent rights. Consequences of improper laboratory notebook practices can be harsh; numerous examples are described in court cases and journal articles, indicating a need for research institutions to develop strict policies on the proper use and storage of research documentation.

  18. IFRI's notebooks. Energy, development and security

    International Nuclear Information System (INIS)

    Finon, D.; Jacquet, P.

    1999-01-01

    Today, the concept of energy security has been greatly modified by worldwide trade, market deregulation, technical progress, and the contestation of nuclear power. This notebook is the synthesis of a colloquium jointly organized on December 16, 1997 by the IFRI and the Institute of Economy and Energy Policy (IEPE) with the support of the French delegation of strategic affairs of the defense ministry. It analyzes the evolution of energy markets with 2030 prospects, and stresses the role of the Middle East and the stakes of the economic development and energy policy of China. Finally, it addresses the goals and modalities of French and European energy policies. (J.S.)

  19. The Impact of Notebooking on Teacher Candidates’ Construction of Knowledge

    Directory of Open Access Journals (Sweden)

    Jennifer A. MOHR

    2014-07-01

    Full Text Available Teacher education programs must adapt to changing science education reform movements that identify notebooking as an effective means to increase children's science process skills and content knowledge. This study addresses the question, “What are the structures and thinking processes that teacher candidates utilize when writing in notebooks?” Specifically, how do they express their thoughts during an observation-based prompt writing experience in an undergraduate, integrated science and mathematics methods course? Sixteen teacher candidates at a Midwestern university in the United States completed an eight-week assignment during the spring 2012 semester using notebooks. Results indicate the participants could be placed into three distinct categories of processing and formatting the notebooks, which are described in detail with supporting examples.

  20. Electronic Laboratory Notebook on Web2py Framework

    Directory of Open Access Journals (Sweden)

    2010-09-01

    Full Text Available Proper experimental record-keeping is an important cornerstone in research and development for the purpose of auditing. The gold standard of record-keeping is based on the judicious use of physical, permanent notebooks. However, advances in technology have resulted in large amounts of electronic records, making it virtually impossible to maintain a full set of records in physical notebooks. Electronic laboratory notebook systems aim to meet the stringency of keeping records electronically. This manuscript describes CyNote, an electronic laboratory notebook system that is compliant with 21 CFR Part 11 controls on electronic records, the requirements set by the US Food and Drug Administration for electronic records. CyNote is implemented on the web2py framework and adheres to the architectural paradigm of model-view-controller (MVC), allowing extension modules to be built for CyNote. CyNote is available at http://cynote.sf.net.

  1. Smart Electronic Laboratory Notebooks for the NIST Research Environment.

    Science.gov (United States)

    Gates, Richard S; McLean, Mark J; Osborn, William A

    2015-01-01

    Laboratory notebooks have been a staple of scientific research for centuries for organizing and documenting ideas and experiments. Modern laboratories are increasingly reliant on electronic data collection and analysis, so it seems inevitable that the digital revolution should come to the ordinary laboratory notebook. The most important aspect of this transition is to make the shift as comfortable and intuitive as possible, so that the creative process that is the hallmark of scientific investigation and engineering achievement is maintained, and ideally enhanced. The smart electronic laboratory notebooks described in this paper represent a paradigm shift from the old pen and paper style notebooks and provide a host of powerful operational and documentation capabilities in an intuitive format that is available anywhere at any time.

  2. Logs, blogs and pods: smart electronic laboratory notebooks

    OpenAIRE

    Frey, Jeremy G.

    2009-01-01

    The Southampton experiences in developing a semantic electronic laboratory notebook for synthetic organic chemistry and a web 2.0 style laboratory Blog Book are introduced and discussed in the context of the Smart Laboratory.

  3. Electronic Engineering Notebook: A software environment for research execution, documentation and dissemination

    Science.gov (United States)

    Moerder, Dan

    1994-01-01

    The Electronic Engineering Notebook (EEN) consists of a free-form research notebook, implemented in a commercial package for distributed hypermedia, which includes utilities for graphics capture, formatting and display of LaTeX constructs, and interfaces to the host operating system. The latter capability consists of an information computer-aided software engineering (CASE) tool and a means to associate executable scripts with source objects. The EEN runs on Sun and HP workstations. In day-to-day use, the EEN can be used in much the same manner as the research notes most researchers keep during the development of projects. Graphics can be pasted in, equations can be entered via LaTeX, etc. In addition, the fact that the EEN is hypermedia permits easy management of 'context'; e.g., derivations and data can contain easily formed links to other supporting derivations and data. The CASE tool also permits the development and maintenance of source code directly in the notebook, with access to its derivations and data.

  4. The effect of vocabulary notebooks on vocabulary acquisition

    OpenAIRE

    Bozkurt, Neval

    2007-01-01

    Ankara : The Department of Teaching English as a Foreign Language, Bilkent University, 2007. Thesis (Master's) -- Bilkent University, 2007. Includes bibliographical references leaves 82-87 This study investigated the effectiveness of vocabulary notebooks on vocabulary acquisition, and the attitudes of teachers and learners towards keeping vocabulary notebooks. The study was conducted with the participation of 60 pre-intermediate level students, divided into one treatment ...

  5. Electronic laboratory notebook: the academic point of view.

    Science.gov (United States)

    Rudolphi, Felix; Goossen, Lukas J

    2012-02-27

    Based on a requirement analysis and alternative design considerations, a platform-independent electronic laboratory notebook (ELN) has been developed that specifically targets academic users. Its intuitive design and numerous productivity features motivate chemical researchers and students to record their data electronically. The data are stored in a highly structured form that offers substantial benefits over laboratory notebooks written on paper with regard to data retrieval, data mining, and exchange of results.

  6. Interactive model evaluation tool based on IPython notebook

    Science.gov (United States)

    Balemans, Sophie; Van Hoey, Stijn; Nopens, Ingmar; Seuntjes, Piet

    2015-04-01

    remaining parameter sets. As such, by interactively changing the settings and interpreting the graph, the user gains insight in the model structural behaviour. Moreover, a more deliberate choice of objective function and periods of high information content can be identified. The environment is written in an IPython notebook and uses the available interactive functions provided by the IPython community. As such, the power of the IPython notebook as a development environment for scientific computing is illustrated (Shen, 2014).

  7. A universal open-source Electronic Laboratory Notebook.

    Science.gov (United States)

    Voegele, Catherine; Bouchereau, Baptiste; Robinot, Nivonirina; McKay, James; Damiecki, Philippe; Alteyrac, Lucile

    2013-07-01

    Laboratory notebooks remain crucial to the activities of research communities. With the increase in electronic data generated within both wet and dry analytical laboratories, and with new technologies providing more efficient means of communication, Electronic Laboratory Notebooks (ELNs) offer record keeping equivalent to paper-based laboratory notebooks (PLNs). They additionally allow more efficient mechanisms for data sharing and retrieval, which explains the growing number of commercial ELNs available, varying in size and scope but all increasingly accepted and used by the scientific community. As the International Agency for Research on Cancer (IARC) already has a LIMS and a Biobank Management System for laboratory workflows and sample management, respectively, we have developed a free multidisciplinary ELN specifically dedicated to work notes that is flexible enough to accommodate different types of data. Information on the installation of our freeware ELN, with source code and customizations, is detailed in the supplementary data. Supplementary data are available at Bioinformatics online.

  8. Understanding NASA surface missions with the PDS Analyst's Notebook

    Science.gov (United States)

    Stein, T.

    2011-10-01

    Planetary data archives of surface missions contain data from numerous hosted instruments. Because of the nondeterministic nature of surface missions, it is not possible to assess the data without understanding the context in which they were collected. The PDS Analyst's Notebook (http://an.rsl.wustl.edu) provides access to Mars Exploration Rover (MER) [1] and Mars Phoenix Lander [2] data archives by integrating sequence information, engineering and science data, observation planning and targeting, and documentation into web-accessible pages to facilitate "mission replay." In addition, Lunar Apollo surface mission data archives and LCROSS mission data are available in the Analyst's Notebook concept, and a Notebook is planned for Mars Science Laboratory (MSL) mission.

  9. Users' Manual for Research: Translating Head Start Findings Into Action (Expanded Notebook Version).

    Science.gov (United States)

    Grotberg, Edith H.; Fowler, Austine

    This users' manual, intended for use with a Project Head Start teacher training notebook, describes the purpose, development and field testing of the training materials and suggests procedures for using the notebook as a resource in teacher training sessions. The training notebook to which the users' manual refers is based on 11 questions in the…

  10. SimpleITK Image-Analysis Notebooks: a Collaborative Environment for Education and Reproducible Research.

    Science.gov (United States)

    Yaniv, Ziv; Lowekamp, Bradley C; Johnson, Hans J; Beare, Richard

    2018-06-01

    Modern scientific endeavors increasingly require team collaborations to construct and interpret complex computational workflows. This work describes an image-analysis environment that supports the use of computational tools that facilitate reproducible research and support scientists with varying levels of software development skills. The Jupyter notebook web application is the basis of an environment that enables flexible, well-documented, and reproducible workflows via literate programming. Image-analysis software development is made accessible to scientists with varying levels of programming experience via the use of the SimpleITK toolkit, a simplified interface to the Insight Segmentation and Registration Toolkit. Additional features of the development environment include user-friendly data sharing using online data repositories and a testing framework that facilitates code maintenance. SimpleITK provides a large number of examples illustrating educational and research-oriented image-analysis workflows for free download from GitHub under an Apache 2.0 license: github.com/InsightSoftwareConsortium/SimpleITK-Notebooks.
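    SimpleITK itself is not sketched here; as a purely illustrative stand-in for the kind of smoothing filter the toolkit wraps, the following pure-Python snippet applies a 3x3 mean filter to a small 2-D grid (the function name and toy data are hypothetical, not taken from the SimpleITK notebooks):

```python
def mean_filter_3x3(img):
    """Smooth a 2-D grid with a 3x3 mean filter (edge pixels clamped)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = sum(vals) / 9.0
    return out

# A single bright pixel is spread over its neighborhood by the filter.
noisy = [[0, 0, 0], [0, 9, 0], [0, 0, 0]]
smooth = mean_filter_3x3(noisy)
```

    In a real SimpleITK notebook this operation would be a one-line call on a `SimpleITK.Image` object rather than hand-written loops.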

  11. The Computerized Laboratory Notebook concept for genetic toxicology experimentation and testing.

    Science.gov (United States)

    Strauss, G H; Stanford, W L; Berkowitz, S J

    1989-03-01

    We describe a microcomputer system utilizing the Computerized Laboratory Notebook (CLN) concept developed in our laboratory for the purpose of automating the Battery of Leukocyte Tests (BLT). The BLT was designed to evaluate blood specimens for toxic, immunotoxic, and genotoxic effects after in vivo exposure to putative mutagens. A system was developed with the advantages of low cost, limited spatial requirements, ease of use for personnel inexperienced with computers, and applicability to specific testing yet flexibility for experimentation. This system eliminates cumbersome record keeping and repetitive analysis inherent in genetic toxicology bioassays. Statistical analysis of the vast quantity of data produced by the BLT would not be feasible without a central database. Our central database is maintained by an integrated package which we have adapted to develop the CLN. The clonal assay of lymphocyte mutagenesis (CALM) section of the CLN is demonstrated. PC-Slaves expand the microcomputer to multiple workstations so that our computerized notebook can be used next to a hood while other work is done in an office and instrument room simultaneously. Communication with peripheral instruments is an indispensable part of many laboratory operations, and we present a representative program, written to acquire and analyze CALM data, for communicating with both a liquid scintillation counter and an ELISA plate reader. In conclusion we discuss how our computer system could easily be adapted to the needs of other laboratories.
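    The abstract's pairing of instrument acquisition with a central database can be roughly sketched as follows; the export format, table layout, and field names below are hypothetical stand-ins, not the CLN's actual design, and Python's built-in sqlite3 stands in for the integrated package used in the paper:

```python
import sqlite3

def parse_plate_line(line):
    """Parse a hypothetical ELISA-reader export line of the form 'well,absorbance'."""
    well, od = line.strip().split(",")
    return well, float(od)

# In-memory stand-in for the CLN's central database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE elisa (well TEXT, od REAL)")
for raw in ["A1,0.132", "A2,1.875", "A3,0.044"]:
    db.execute("INSERT INTO elisa VALUES (?, ?)", parse_plate_line(raw))

# Central storage makes repetitive screening queries a one-liner.
high = [w for (w,) in db.execute("SELECT well FROM elisa WHERE od > 1.0")]
```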

  12. Using Vocabulary Notebooks for Vocabulary Acquisition and Teaching

    Science.gov (United States)

    Dubiner, Deborah

    2017-01-01

    Vocabulary knowledge is recognized as an essential element for second language acquisition and reading comprehension. One known way to encourage and support vocabulary development amongst second language learners is keeping a vocabulary notebook. The primary purpose of the present study was to document two aspects of student teachers' own…

  13. Digital Science Notebooks: Perspectives from an Elementary Classroom Teacher

    Science.gov (United States)

    Paek, Seungoh; Fulton, Lori A.

    2017-01-01

    This study investigates how tablet-based note-taking applications can be integrated into elementary science classes as digital science notebooks. A teacher with 20 students in Grades 4-5 from a public charter school in Hawaii participated in the study. The participating science teacher introduced a tablet-based note taking application (TNA) to her…

  14. Laboratory E-Notebooks: A Learning Object-Based Repository

    Science.gov (United States)

    Abari, Ilior; Pierre, Samuel; Saliah-Hassane, Hamadou

    2006-01-01

    During distributed virtual laboratory experiment sessions, a major problem is to be able to collect, store, manage and share heterogeneous data (intermediate results, analysis, annotations, etc) manipulated simultaneously by geographically distributed teammates composing a virtual team. The electronic notebook is a possible response to this…

  15. Identifying Non-Volatile Data Storage Areas: Unique Notebook Identification Information as Digital Evidence

    Directory of Open Access Journals (Sweden)

    Nikica Budimir

    2007-03-01

    The research reported in this paper introduces new techniques to aid in the identification of recovered notebook computers so they may be returned to their rightful owners. We identify non-volatile data storage areas as a means of safely storing computer identification information. A forensic proof-of-concept tool was designed to test the feasibility of several storage locations identified within this work to hold the data needed to uniquely identify a computer. The tool was used to create and extract identifying information in order to assess whether the non-volatile storage locations are valid storage areas capable of holding and preserving the data created within them. While the format of the information used to identify the machine itself is important, this research discusses only the insertion, storage, and retention of such information.
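    The paper leaves the identification format open; as a hedged illustration only, one way to derive a stable identifier from hardware component strings (the field names here are invented for the example) is to hash a canonical serialization:

```python
import hashlib

def machine_fingerprint(components):
    """Derive a stable identifier from hardware component strings.

    The fields are hypothetical; the point is that sorting the keys
    makes the fingerprint independent of dictionary insertion order.
    """
    canonical = "|".join(f"{k}={components[k]}" for k in sorted(components))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

fp = machine_fingerprint({"board_serial": "BSN-0042", "disk_serial": "WD-55"})
```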

  16. A review of electronic laboratory notebooks available in the market today.

    Science.gov (United States)

    Rubacha, Michael; Rattan, Anil K; Hosselet, Stephen C

    2011-02-01

    Electronic laboratory notebooks are becoming an increasingly popular tool for research and routine laboratories as a way to optimize workflow and minimize cost while realizing time-saving benefits. The number and variety of available solutions are quickly increasing, making selection of the right notebook a cumbersome process. To allay some of the strain associated with an exhaustive search through notebook technologies, this paper details key features of a pool of 35 electronic notebooks available today. The review classifies these notebooks into five categories based on market audience: notebooks suited to a Quality environment fall within the Quality Assurance/Quality Control pool; notebooks suited to specialized tasks in Biology or Chemistry fall within the Biology or Chemistry pools, respectively; notebooks suitable for general science functions fall under either the Research and Development or the Multidiscipline pool; and notebooks designed for the full spectrum from stringent Quality laboratories to free-form research laboratories fall within the Multidiscipline pool. The guidelines put forth in this paper eliminate the need to perform an exhaustive search for a suitable notebook.

  17. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    Science.gov (United States)

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information-analysis routines, which often include incompatible proprietary software, introducing limitations based on financial cost, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to be displayed locally or remotely in a web browser and saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system for investigating gut-brain communication, for example in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike-train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase the efficiency, transparency, and collaborative potential of neurophysiological research.
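    As a minimal sketch of the kind of spike-detection step such notebooks document (the threshold and trace below are toy data, not the musk shrew recordings), an upward threshold-crossing detector can be written as:

```python
def detect_spikes(trace, threshold):
    """Return sample indices where the signal crosses the threshold upward."""
    return [i for i in range(1, len(trace))
            if trace[i - 1] < threshold <= trace[i]]

# Toy voltage trace with two threshold crossings.
trace = [0.1, 0.2, 3.5, 0.3, 0.1, 4.2, 4.8, 0.2]
spikes = detect_spikes(trace, 3.0)
```

    Real spike sorting adds waveform extraction and clustering on top of a detection step like this one.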

  18. Application of miniature heat pipe for notebook PC cooling

    Energy Technology Data Exchange (ETDEWEB)

    Moon, S.H.; Hwang, G.; Choy, T.G. [Electronics and Telecommunications Research Institute, Taejeon (Korea)

    2001-06-01

    A miniature heat pipe (MHP) with a woven-wire wick was used to cool the CPU of a notebook PC. The pipe, with a circular cross-section, was pressed and bent to package the MHP into the very limited, compact packaging space of a notebook PC. The cross-sectional area of the pipe is reduced by about 30% when an MHP of 4 mm diameter is pressed to a thickness of 2 mm. In the present study, a performance test was carried out to examine how operating performance varies with pressed thickness and to assess the heat-dissipation capacity of an MHP cooling module packaged in a notebook PC. A new wick type was considered to overcome the low heat-transfer limit that arises when the MHP is pressed into a thin plate. The performance tests with varying pressing thickness show the limiting thickness to be in the range of 2 mm to 2.5 mm. When the wall thickness was reduced from 0.4 mm to 0.25 mm to minimize the conductive thermal resistance through the heat-pipe wall, the heat-transfer limit and thermal resistance of the MHP improved by about 10%. Moreover, the thermal resistance and heat-transfer limit of the MHP with the central wick type are higher than those of MHPs with existing wick types. The performance tests of MHP cooling modules with woven-wire wicks for notebook PC cooling demonstrate stability as a cooling system, since the processor junction temperature T_j satisfies the demand condition of 0 to 100 deg.C under a CPU heat load of 11.5 W. (author). 6 refs., 7 figs.
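    The junction-temperature check cited in the abstract follows the series thermal-resistance model T_j = T_amb + Q x R. The ambient temperature and total resistance below are illustrative placeholders, since the paper's measured values are not quoted here:

```python
def junction_temp(t_ambient, q_watts, r_total):
    """Series thermal-resistance model: T_j = T_amb + Q * R (degrees C)."""
    return t_ambient + q_watts * r_total

# Illustrative numbers: 35 degC ambient, 11.5 W CPU load (from the abstract),
# and an assumed 4.0 K/W total module resistance.
t_j = junction_temp(t_ambient=35.0, q_watts=11.5, r_total=4.0)
ok = 0.0 <= t_j <= 100.0  # the demand condition cited in the abstract
```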

  19. The use of notebooks in mathematics instruction. What is manageable? What should be avoided? A field report after 10 years of CAS-application

    OpenAIRE

    Hofbauer, Peter

    2012-01-01

    Computer Algebra Systems (CAS) have been changing the requirements of mathematics instruction for many years. Since the tendency to use CAS in mathematics instruction has been rising for decades and reports have often been positive, the implementation of notebook classes seems to be the logical next step for computer-supported mathematics instruction. Experiences made with the use of CAS in PC rooms can be transferred directly into the classroom. Hence the use of CAS is no...

  20. Cloud hosting of the IPython Notebook to Provide Collaborative Research Environments for Big Data Analysis

    Science.gov (United States)

    Kershaw, Philip; Lawrence, Bryan; Gomez-Dans, Jose; Holt, John

    2015-04-01

    We explore how the popular IPython Notebook computing system can be hosted on a cloud platform to provide a flexible virtual research environment for Earth Observation data processing and analysis, and how this approach can be expanded more broadly into a generic SaaS (Software as a Service) offering for the environmental sciences. OPTIRAD (OPTImisation environment for joint retrieval of multi-sensor RADiances) is a project funded by the European Space Agency to develop a collaborative research environment for Data Assimilation of Earth Observation products for land-surface applications. Data Assimilation provides a powerful means to combine multiple sources of data and derive new products for this application domain. To be most effective, it requires close collaboration between specialists in this field, land-surface modellers, and end users of the data generated. A goal of OPTIRAD, then, is to develop a collaborative research environment to engender shared working. Another significant challenge is that of data volume and complexity. Study of the land surface requires high spatial and temporal resolutions, a relatively large number of variables, and the application of algorithms which are computationally expensive. These problems can be addressed by applying parallel processing techniques on specialist compute clusters. However, scientific users are often deterred by the time investment required to port their codes to these environments. Even when porting is successfully achieved, the result may be difficult to change or update, which runs counter to the scientific process of continuous experimentation, analysis, and validation. The IPython Notebook provides users with a web-based interface to multiple interactive shells for the Python programming language. Code, documentation, and graphical content can be saved and shared, making it directly applicable to OPTIRAD's requirements for a shared working environment. Given the web interface, it can readily be made into a hosted

  1. Cooling performance of a notebook PC mounted with heat spreader

    Energy Technology Data Exchange (ETDEWEB)

    Noh, H.K. [Electronics and Telecommunications Research Institute, Taejeon (Korea); Lim, K.B. [Hanbat National University, Taejeon (Korea); Park, M.H. [Korea Power Engineering Company (Korea)

    2001-06-01

    A parametric study to investigate the cooling performance of a notebook PC mounted with a heat spreader has been performed numerically. Two cases, air-blowing and air-exhaust at the inlet, were tested. The cooling effects of parameters such as inlet velocity, heat-spreader material, and CPU power were simulated for both cases. Cooling performance in the air-blowing case was better than in the air-exhaust case. (author). 9 refs., 7 figs., 5 tabs.

  2. More than Data: Using Interactive Science Notebooks to Engage Students in Science and Engineering

    Science.gov (United States)

    Mason, Kevin; Bohl, Heather

    2017-01-01

    A traditional science notebook is an official record of a scientist's research. Even in today's digital world, it is still common practice for scientists to record their experimental procedures, data, analysis, results, notes, and other thoughts on the right pages of a bound notebook in permanent ink with nothing written on the left side or back…

  3. An Alternative Approach to Assessing Laboratory and Field Notebooks: The Data Retrieval Test

    Science.gov (United States)

    Bedford, Hilary; Bedford, Alan; Thomas, Judith; Ashton, Paul

    2010-01-01

    Marking field and laboratory notebooks can be a time-consuming and tedious task. This article describes a system whereby the contents of students' notebooks are assessed by testing the students on what they have included and their understanding of what has been done. It also tests the quality of the students' notes: detailed, organised notes…

  4. Laboratory Activity on Sample Handling and Maintaining a Laboratory Notebook through Simple pH Measurements

    Science.gov (United States)

    Erdmann, Mitzy A.; March, Joe L.

    2016-01-01

    Sample handling and laboratory notebook maintenance are necessary skills but can seem abstract if not presented to students in context. An introductory exercise focusing on proper sample handling, data collection and laboratory notebook keeping for the general chemistry laboratory was developed to emphasize the importance of keeping an accurate…

  5. SoS Notebook: An Interactive Multi-Language Data Analysis Environment.

    Science.gov (United States)

    Peng, Bo; Wang, Gao; Ma, Jun; Leong, Man Chong; Wakefield, Chris; Melott, James; Chiu, Yulun; Du, Di; Weinstein, John N

    2018-05-22

    Complex bioinformatic data-analysis workflows involving multiple scripts in different languages can be difficult to consolidate, share, and reproduce. An environment that streamlines the entire process of data collection, analysis, visualization, and reporting for such multi-language analyses has been lacking. We developed Script of Scripts (SoS) Notebook, a web-based notebook environment that allows the use of multiple scripting languages in a single notebook, with data flowing freely within and across languages. SoS Notebook enables researchers to perform sophisticated bioinformatic analyses using the most suitable tools for different parts of the workflow, without the limitations of a particular language or the complications of cross-language communication. SoS Notebook is hosted at http://vatlab.github.io/SoS/ and is distributed under a BSD license. bpeng@mdanderson.org.

  6. Reproducible Bioconductor workflows using browser-based interactive notebooks and containers.

    Science.gov (United States)

    Almugbel, Reem; Hung, Ling-Hong; Hu, Jiaming; Almutairy, Abeer; Ortogero, Nicole; Tamta, Yashaswi; Yeung, Ka Yee

    2018-01-01

    Bioinformatics publications typically include complex software workflows that are difficult to describe in a manuscript. We describe and demonstrate the use of interactive software notebooks to document and distribute bioinformatics research. We provide a user-friendly tool, BiocImageBuilder, that allows users to easily distribute their bioinformatics protocols through interactive notebooks uploaded to either a GitHub repository or a private server. We present four different interactive Jupyter notebooks using R and Bioconductor workflows to infer differential gene expression, analyze cross-platform datasets, process RNA-seq data and KinomeScan data. These interactive notebooks are available on GitHub. The analytical results can be viewed in a browser. Most importantly, the software contents can be executed and modified. This is accomplished using Binder, which runs the notebook inside software containers, thus avoiding the need to install any software and ensuring reproducibility. All the notebooks were produced using custom files generated by BiocImageBuilder. BiocImageBuilder facilitates the publication of workflows with a point-and-click user interface. We demonstrate that interactive notebooks can be used to disseminate a wide range of bioinformatics analyses. The use of software containers to mirror the original software environment ensures reproducibility of results. Parameters and code can be dynamically modified, allowing for robust verification of published results and encouraging rapid adoption of new methods. Given the increasing complexity of bioinformatics workflows, we anticipate that these interactive software notebooks will become as necessary for documenting software methods as traditional laboratory notebooks have been for documenting bench protocols, and as ubiquitous.

  7. TECHNICAL BASIS DOCUMENT FOR AT-POWER SIGNIFICANCE DETERMINATION PROCESS (SDP) NOTEBOOKS

    International Nuclear Information System (INIS)

    AZARM, M.A.; SAMANTA, P.K.; MARTINEZ-GURIDI, G.; HIGGINS, J.

    2004-01-01

    To support the assessment of inspection findings as part of the risk-informed inspection in the United States Nuclear Regulatory Commission's (USNRC's) Reactor Oversight Process (ROP), risk inspection notebooks, also called significance determination process (SDP) notebooks, have been developed for each of the operating plants in the United States. These notebooks serve as a tool for assessing the risk significance of inspection findings while providing an engineering understanding of that significance. Plant-specific notebooks are developed to capture plant-specific features, characteristics, and analyses that influence the risk profile of the plant. At the same time, the notebooks follow a consistent set of assumptions and guidelines to assure consistent treatment of inspection findings across plants. To achieve these objectives, the notebooks are designed to provide specific information that is unique both in the manner in which the information is presented and in the way the screening risk assessment is carried out using it. The unique features of the SDP notebooks, the approaches used to present information for the assessment of inspection findings, and the assumptions used for consistent modeling across different plants, with due credit to plant-specific features and analyses, form the technical basis of the SDP notebooks. In this document, the unique features and the technical basis for the notebooks are presented. The types of information included and the reasoning for including that information are discussed. The rules and basis for developing the worksheets used by inspectors in the assessment of inspection findings are presented. The approach to modeling plants' responses to different initiating events and the specific assumptions and considerations used for each reactor type are also discussed.

  8. Female Identity in Doris Lessing’s The Golden Notebook

    Directory of Open Access Journals (Sweden)

    Heba Mohamed Abd El Aziz

    2018-02-01

    In the realm of art in general and literature in particular, the presence of Doris Lessing cannot be denied as one of the most influential English novelists of the 1960s. Doris Lessing is a writer concerned with the representation of women's identity in the West. In her renowned novel The Golden Notebook, Lessing aims at showcasing women's identity in Europe and every aspect related to it: their psychology, political lives, relation to men and children, their place in a male-dominated society, and their frequent attempts to escape from social and political oppression. The aim of this paper is to present a truthful account of female identity from a feminist point of view.

  9. Challenges and opportunities in understanding microbial communities with metagenome assembly (accompanied by IPython Notebook tutorial)

    Science.gov (United States)

    Howe, Adina; Chain, Patrick S. G.

    2015-01-01

    Metagenomic investigations hold great promise for informing the genetics, physiology, and ecology of environmental microorganisms. Current challenges for metagenomic analysis relate to our ability to connect the dots between sequencing reads, their population of origin, and the functions they encode. Assembly-based methods reduce dataset size by extending overlapping reads into larger contiguous sequences (contigs), providing contextual information for genetic sequences that does not rely on existing references. These methods, however, tend to be computationally intensive and are challenged by sequencing errors as well as genomic repeats. While numerous tools have been developed based on these methodological concepts, they present confounding choices and training requirements to metagenomic investigators. To help make assembly tools accessible, this review also includes an IPython Notebook metagenomic assembly tutorial. The tutorial has instructions for execution on any operating system using Amazon Elastic Compute Cloud and guides users through downloading, assembling, and mapping reads to contigs of a mock microbiome metagenome. Despite its challenges, metagenomic analysis has already revealed novel insights into many environments on Earth. As software, training, and data continue to emerge, metagenomic data access and its discoveries will continue to grow. PMID:26217314
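    Assembly by extending overlapping reads, as described above, can be illustrated with a toy greedy overlap-merge in pure Python (a didactic sketch only; real assemblers handle sequencing errors, repeats, and reverse complements that this ignores):

```python
def overlap(a, b, min_len=3):
    """Length of the longest suffix of a that matches a prefix of b."""
    for n in range(min(len(a), len(b)), min_len - 1, -1):
        if a[-n:] == b[:n]:
            return n
    return 0

def greedy_assemble(reads):
    """Repeatedly merge the pair of distinct reads with the largest overlap.

    Assumes all reads are unique strings; stops when no pair overlaps.
    """
    reads = list(reads)
    while len(reads) > 1:
        n, a, b = max(((overlap(x, y), x, y)
                       for x in reads for y in reads if x != y),
                      key=lambda t: t[0])
        if n == 0:
            break
        reads.remove(a)
        reads.remove(b)
        reads.append(a + b[n:])  # extend a with the non-overlapping tail of b
    return reads

contigs = greedy_assemble(["ATTAGACC", "GACCTTGC", "TTGCAAT"])
```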

  10. Challenges and opportunities in understanding microbial communities with metagenome assembly (accompanied by IPython Notebook tutorial

    Directory of Open Access Journals (Sweden)

    Adina Howe

    2015-07-01

    Metagenomic investigations hold great promise for informing the genetics, physiology, and ecology of environmental microorganisms. Current challenges for metagenomic analysis relate to our ability to connect the dots between sequencing reads, their population of origin, and the functions they encode. Assembly-based methods reduce dataset size by extending overlapping reads into larger contiguous sequences (contigs), providing contextual information for genetic sequences that does not rely on existing references. These methods, however, tend to be computationally intensive and are challenged by sequencing errors as well as genomic repeats. While numerous tools have been developed based on these methodological concepts, they present confounding choices and training requirements to metagenomic investigators. To help make assembly tools accessible, this review also includes an IPython Notebook metagenomic assembly tutorial. The tutorial has instructions for execution on any operating system using Amazon Elastic Compute Cloud and guides users through downloading, assembling, and mapping reads to contigs of a mock microbiome metagenome. Despite its challenges, metagenomic analysis has already revealed novel insights into many environments on Earth. As software, training, and data continue to emerge, metagenomic data access and its discoveries will continue to grow.

  11. Developing and Validating a Science Notebook Rubric for Fifth-Grade Non-Mainstream Students

    Science.gov (United States)

    Huerta, Margarita; Lara-Alecio, Rafael; Tong, Fuhui; Irby, Beverly J.

    2014-07-01

    We present the development and validation of a science notebook rubric intended to measure the academic language and conceptual understanding of non-mainstream students, specifically fifth-grade male and female economically disadvantaged Hispanic English language learner (ELL) and African-American or Hispanic native English-speaking students. The science notebook rubric is based on two main constructs: academic language and conceptual understanding. The constructs are grounded in second-language acquisition theory and theories of writing and conceptual understanding. We established content validity and calculated reliability measures using G theory and percent agreement (for comparison) with a sample of approximately 144 unique science notebook entries and 432 data points. Results reveal sufficient reliability estimates, indicating that the instrument is promising for use in future research studies including science notebooks in classrooms with populations of economically disadvantaged Hispanic ELL and African-American or Hispanic native English-speaking students.
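    Percent agreement, one of the reliability measures mentioned above, is simply the fraction of scored items on which two raters coincide; a minimal sketch with invented rubric scores:

```python
def percent_agreement(rater_a, rater_b):
    """Fraction of items on which two raters assign the same score."""
    matches = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return matches / len(rater_a)

# Hypothetical rubric scores (1-4) for eight notebook entries.
a = [3, 2, 4, 4, 1, 3, 2, 4]
b = [3, 2, 3, 4, 1, 3, 2, 2]
agreement = percent_agreement(a, b)
```

    Unlike G theory, this index ignores chance agreement and rater variance components, which is why the authors report both.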

  12. Mathematica: A System of Computer Programs

    OpenAIRE

    Maiti, Santanu K.

    2006-01-01

    Starting from the basic level of Mathematica, we illustrate here how to use a Mathematica notebook and write a program in the notebook. Next, we investigate in detail the linking of external programs with Mathematica, the so-called MathLink operation. Using this technique we can run very tedious jobs quite efficiently, and the operations become extremely fast. Sometimes it is quite desirable to run jobs in the background of a computer, which can take a considerable amount of time to finish, ...

  13. Data Mining to Capture User-Experience: A Case Study in Notebook Product Appearance Design

    OpenAIRE

    Rhoann Kerh; Chen-Fu Chien; Kuo-Yi Lin

    2014-01-01

    In an era of a rapidly growing notebook market, consumer-electronics manufacturers face a highly dynamic and competitive environment. In particular, product appearance is the first element by which a user distinguishes a product from those of other brands. A notebook product should differ in its appearance to engage users and contribute to the user experience (UX). UX evaluation considers various product concepts to find the design that meets user needs; in addition, it helps the designer to further u...

  14. Implementation of an electronic laboratory notebook to accelerate data review in bioanalysis.

    Science.gov (United States)

    Shoup, Ronald E; Beato, Brian D; Pisek, April; White, Jessica; Branstrator, Laurel; Bousum, Abby; Roach, Jasmine; Grever, Tim

    2013-07-01

    Electronic laboratory notebooks increase opportunities for collaboration and information exchange when compared with paper records. Depending on the degree of implementation, a laboratory- or enterprise-wide system can unify the collection, review and dissemination of data to improve laboratory efficiency and productivity. The advantages of an electronic laboratory notebook for speeding data review in bioanalysis are discussed, through the use of validated templates and organizational constructs to block errors in real-time and reduce manual audit tasks.
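    Template validation of the kind described, blocking out-of-range entries in real time, can be sketched as a field-range check; the field names and acceptance ranges below are hypothetical, not the authors' validated templates:

```python
def validate_entry(entry, template):
    """Return a list of violations of the template's field rules.

    template maps field name -> (low, high) acceptance range.
    """
    errors = []
    for field, (lo_v, hi_v) in template.items():
        value = entry.get(field)
        if value is None:
            errors.append(f"{field}: missing")
        elif not (lo_v <= value <= hi_v):
            errors.append(f"{field}: {value} outside [{lo_v}, {hi_v}]")
    return errors

# Hypothetical acceptance ranges for a bioanalytical run.
template = {"recovery_pct": (85.0, 115.0), "cv_pct": (0.0, 15.0)}
errs = validate_entry({"recovery_pct": 120.3, "cv_pct": 4.1}, template)
```

    Blocking the save when `errs` is non-empty is what replaces the manual audit step the abstract mentions.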

  15. Lab notebooks as scientific communication: Investigating development from undergraduate courses to graduate research

    Directory of Open Access Journals (Sweden)

    Jacob T. Stanley

    2016-09-01

    In experimental physics, lab notebooks play an essential role in the research process. For all of the ubiquity of lab notebooks, little formal attention has been paid to addressing what is considered “best practice” for scientific documentation and how researchers come to learn these practices in experimental physics. Using interviews with practicing researchers, namely, physics graduate students, we explore the different experiences researchers had in learning how to effectively use a notebook for scientific documentation. We find that very few of those interviewed thought that their undergraduate lab classes successfully taught them the benefit of maintaining a lab notebook. Most described training in lab notebook use as either ineffective or outright missing from their undergraduate lab course experience. Furthermore, a large majority of those interviewed explained that they did not receive any formal training in maintaining a lab notebook during their graduate school experience and received little to no feedback from their advisors on these records. Many of the interviewees describe learning the purpose of, and how to maintain, these kinds of lab records only after having a period of trial and error, having already started doing research in their graduate program. Despite the central role of scientific documentation in the research enterprise, these physics graduate students did not gain skills in documentation through formal instruction, but rather through informal hands-on practice.

  16. Using Evernote as an electronic lab notebook in a translational science laboratory.

    Science.gov (United States)

    Walsh, Emily; Cho, Ilseung

    2013-06-01

    Electronic laboratory notebooks (ELNs) offer significant advantages over traditional paper laboratory notebooks (PLNs), yet most research labs today continue to use paper documentation. While biopharmaceutical companies represent the largest portion of ELN users, government and academic labs trail far behind in their usage. Our lab, a translational science laboratory at New York University School of Medicine (NYUSoM), wanted to determine if an ELN could effectively replace PLNs in an academic research setting. Over 6 months, we used the program Evernote to record all routine experimental information. We also surveyed students working in research laboratories at NYUSoM on the relative advantages and limitations of ELNs and PLNs and discovered that electronic and paper notebook users alike reported the inability to freehand into a notebook as a limitation when using electronic methods. Using Evernote, we found that the numerous advantages of ELNs greatly outweighed the inability to freehand directly into a notebook. We also used imported snapshots and drawing program add-ons to obviate the need for freehanding. Thus, we found that using Evernote as an ELN not only effectively replaces PLNs in an academic research setting but also provides users with a wealth of other advantages over traditional paper notebooks.

  17. Direct Simple Shear Test Data Analysis using Jupyter Notebooks on DesignSafe-CI

    Science.gov (United States)

    Eslami, M.; Esteva, M.; Brandenberg, S. J.

    2017-12-01

    Due to the large number of files and their complex structure, managing data generated during natural hazards experiments requires scalable and specialized tools. DesignSafe-CI (https://www.designsafe-ci.org/) is a web-based research platform that provides computational tools to analyze, curate, and publish critical data for natural hazards research, making those data understandable and reusable. We present a use case from a series of Direct Simple Shear (DSS) experiments in which we used DS-CI to post-process, visualize, publish, and enable further analysis of the data. Current practice in geotechnical design against earthquakes relies on the soil's plasticity index (PI) to assess liquefaction susceptibility and cyclic softening triggering procedures, although quite divergent recommendations on appropriate plasticity levels can be found in the literature for these purposes. A series of cyclic and monotonic direct simple shear experiments was conducted on three low-plasticity fine-grained mixtures at the same plasticity index to examine the effectiveness of the PI in characterizing these types of materials. Results revealed that the plasticity index is an insufficient indicator of the cyclic behavior of low-plasticity fine-grained soils, and corrections for pore fluid chemistry and clay mineralogy may be necessary for future liquefaction susceptibility and cyclic softening assessment procedures. Each monotonic or cyclic experiment contains two stages, consolidation and shear, which include time series of load, displacement, and corresponding stresses and strains, as well as equivalent excess pore-water pressure. Using the DS-CI curation pipeline we categorized the data to display and describe the experiment's structure and the files corresponding to each stage of the experiments. Two separate notebooks in Python 3 were created using the Jupyter application available in DS-CI.
A data plotter aids visualizing the experimental data in relation to the sensor from which it was
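    The post-processing step described above, converting raw load and displacement channels into the stresses and strains stored alongside them, can be sketched as follows. This is an illustrative fragment, not code from the published notebooks; the specimen geometry constants are invented for the example:

```python
import numpy as np

# Hypothetical specimen geometry (illustrative values, not from the dataset)
AREA_M2 = 0.00456   # specimen cross-sectional area, m^2
HEIGHT_M = 0.02     # specimen height, m

def to_stress_strain(load_n, disp_m):
    """Convert raw DSS load (N) and displacement (m) time series into
    shear stress (kPa) and shear strain (%)."""
    stress_kpa = np.asarray(load_n, float) / AREA_M2 / 1000.0
    strain_pct = np.asarray(disp_m, float) / HEIGHT_M * 100.0
    return stress_kpa, strain_pct

# Two sample readings from a hypothetical shear stage
stress, strain = to_stress_strain([45.6, 91.2], [0.0002, 0.0004])
```

With these assumed constants, a 45.6 N load maps to 10 kPa of shear stress and a 0.2 mm displacement to 1% shear strain.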

  18. Electronic laboratory notebooks in a public–private partnership

    Directory of Open Access Journals (Sweden)

    Lea A.I. Vaas

    2016-09-01

    Full Text Available This report shares experience gained during the selection, implementation and maintenance phases of an electronic laboratory notebook (ELN) in a public–private partnership project and comments on users' feedback. In particular, we address which time constraints for roll-out of an ELN exist in granted projects and which benefits and/or restrictions come with out-of-the-box solutions. We discuss several options for the implementation of support functions and potential advantages of open access solutions. Connected to that, we identified willingness and a vivid culture of data sharing as the major factor determining the success or failure of collaborative research activities. The feedback from users turned out to be the only avenue for driving technical improvements, but it also proved highly effective. Based on these experiences, we describe best practices for future projects on the implementation and support of an ELN serving a diverse, multidisciplinary user group based in academia, NGOs, and/or for-profit corporations located in multiple time zones.

  19. Development of High Performance Cooling Modules in Notebook PC's

    Science.gov (United States)

    Tanahashi, Kosei

    The CPU power consumption in Notebook PCs is increasing every year. Video chips and HDDs are also continually using more power for higher performance. In addition, since miniaturization is desired, the mounting of components is becoming more and more dense. Accordingly, the cooling mechanisms are increasingly important. The cooling modules have to dissipate larger amounts of heat in the same environmental conditions. Therefore, high-capacity cooling capabilities are needed, while low costs and high reliability must be retained. Available cooling methods include air or water cooling systems and the heat conduction method. The air cooling system transmits heat via a cooling fan, often in combination with a heat pipe. The water cooling system uses water to carry heat to the back of the display, which offers a comparatively large cooling area. The heat conduction method transfers heat to the case by thermal conduction. This article describes the development of new and comparatively efficient cooling devices offering low cost and high reliability for the air cooling system. As one of the development techniques, the heat resistance and performance are measured for various parts and layouts. Each cooling system is evaluated in the same measurement environment. With regard to the fans, an optimal shape of the fan blades to maximize air flow is found by using CFD simulation, and prototypes were built and tested.

  20. First steps towards semantic descriptions of electronic laboratory notebook records.

    Science.gov (United States)

    Coles, Simon J; Frey, Jeremy G; Bird, Colin L; Whitby, Richard J; Day, Aileen E

    2013-12-20

    In order to exploit the vast body of currently inaccessible chemical information held in Electronic Laboratory Notebooks (ELNs) it is necessary not only to make it available but also to develop protocols for discovery, access and ultimately automatic processing. An aim of the Dial-a-Molecule Grand Challenge Network is to be able to draw on the body of accumulated chemical knowledge in order to predict or optimize the outcome of reactions. Accordingly, the Network convened a working group comprising informaticians, software developers and stakeholders from industry and academia to develop protocols and mechanisms to access and process ELN records. The work presented here constitutes the first stage of this process by proposing a tiered metadata system of knowledge, information and processing where each in turn addresses (a) discovery, indexing and citation, (b) context and access to additional information, and (c) content access and manipulation. A compact set of metadata terms, called the elnItemManifest, has been derived and caters for the knowledge layer of this model. The elnItemManifest has been encoded as an XML schema and some use cases are presented to demonstrate the potential of this approach.

  1. Integration of ROOT Notebooks as an ATLAS analysis web-based tool in outreach and public data release

    CERN Document Server

    Sanchez, Arturo; The ATLAS collaboration

    2016-01-01

    The integration of the ROOT data analysis framework with the Jupyter Notebook technology presents incredible potential for the enhancement and expansion of educational and training programs: starting from university students in their early years, passing to new ATLAS PhD students and postdoctoral researchers, up to senior analysers and professors who want to renew their contact with data analysis or to include a friendlier yet very powerful open source tool in the classroom. Such tools have already been tested in several environments, and a fully web-based integration together with Open Access Data repositories brings the possibility of going a step further in the ATLAS quest for integration between several CERN projects in the field of education and training, developing new computing solutions along the way.

  2. Integration of ROOT notebook as an ATLAS analysis web-based tool in outreach and public data release projects

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00237353; The ATLAS collaboration

    2017-01-01

    Integration of the ROOT data analysis framework with the Jupyter Notebook technology presents the potential of enhancement and expansion of educational and training programs. It can be beneficial for university students in their early years, new PhD students and post-doctoral researchers, as well as for senior researchers and teachers who want to refresh their data analysis skills or to introduce a more friendly and yet very powerful open source tool in the classroom. Such tools have already been tested in several environments. A fully web-based integration of the tools and the Open Access Data repositories brings the possibility to go a step forward in the ATLAS quest of making use of several CERN projects in the field of education and training, developing new computing solutions along the way.

  3. PDS MSL Analyst's Notebook: Supporting Active Rover Missions and Adding Value to Planetary Data Archives

    Science.gov (United States)

    Stein, Thomas

    Planetary data archives of surface missions contain data from numerous hosted instruments. Because of the nondeterministic nature of surface missions, it is not possible to assess the data without understanding the context in which they were collected. The PDS Analyst’s Notebook (http://an.rsl.wustl.edu) provides access to Mars Science Laboratory (MSL) data archives by integrating sequence information, engineering and science data, observation planning and targeting, and documentation into web-accessible pages to facilitate “mission replay.” In addition, Mars Exploration Rover (MER), Mars Phoenix Lander, Lunar Apollo surface mission, and LCROSS mission data are available in the Analyst’s Notebook concept, and a Notebook is planned for the InSight mission. The MSL Analyst’s Notebook contains data, documentation, and support files for the Curiosity rover. The inputs are incorporated on a daily basis into a science team version of the Notebook. The public version of the Analyst’s Notebook comprises peer-reviewed, released data and is updated coincident with PDS data releases as defined in mission archive plans. The data are provided by the instrument teams and are supported by documentation describing data format, content, and calibration. Both operations and science data products are included. The operations versions are generated to support mission planning and operations on a daily basis. They are geared toward researchers working on machine vision and engineering operations. Science versions of observations from some instruments are provided for those interested in radiometric and photometric analyses. Both data set documentation and sol (i.e., Mars day) documents are included in the Notebook. The sol documents are the mission manager and documentarian reports that provide a view into science operations: insight into why and how particular observations were made. Data set documents contain detailed information regarding the mission, spacecraft

  4. A pocket guide to electronic laboratory notebooks in the academic life sciences.

    Science.gov (United States)

    Dirnagl, Ulrich; Przesdzing, Ingo

    2016-01-01

    Every professional doing active research in the life sciences is required to keep a laboratory notebook. However, while science has changed dramatically over the last centuries, laboratory notebooks have remained essentially unchanged since pre-modern science. We argue that the implementation of electronic laboratory notebooks (eLN) in academic research is overdue, and we provide researchers and their institutions with the background and practical knowledge to select and initiate the implementation of an eLN in their laboratories. In addition, we present data from a survey of biomedical researchers and technicians regarding which hypothetical features and functionalities they hope to see implemented in an eLN, and which ones they regard as less important. We also present data on the acceptance and satisfaction of those who have recently switched from a paper laboratory notebook to an eLN. We thus provide answers to the following questions: What does an electronic laboratory notebook afford a biomedical researcher, what does it require, and how should one go about implementing it?

  5. On Darwin's 'metaphysical notebooks'. II: "Metaphysics" and final cause.

    Science.gov (United States)

    Calabi, L

    2001-01-01

    The first part of this paper was published in Rivista di Biologia/Biology Forum 94 (2001). In the second part below, an examination is made of the meaning of the term Metaphysics in some passages of the Darwinian Notebooks for the years 1836-1844. Metaphysics no longer defines a field of philosophical enquiry mainly concerning being and essence after the manner of Aristotle; it now refers to a kind of philosophy of mind after the manner of J. Locke's criticism of the Hypokeimenon. However, Aristotle's Metaphysics also encompasses a treatment of the idea of causes, and of final cause in particular, in the explanation of events, and especially in the explanation of natural phenomena. The criticism of the idea of final cause in the interpretation of the world of life is one of Darwin's foundational acts in his early years. When conceiving his Système du monde in the last years of the 18th century, Laplace could think that God is a hypothesis not really needed by science, as we are told. For the knowledge of organic nature to attain the status of science, it remained to be shown that science, certain of the exemplariness of Newton's Principles as much as cautious before the mystery of life, did not need the hypothesis of final ends in order to understand and explain the productions of living nature: not only in the form of that final cause (the First Cause, the Vera Causa) in which Natural Theology still rested, but also in the form of nature's inner finality which still moulded Whewell's Kantian philosophy. Such a demonstration is a very important subject in Darwin's early enquiries, where he criticises finalism as a projection of self-conceiving Man, likely inherited from a knowing of causality in nuce to be found also in animals.

  6. Empowering Middle School Teachers with Portable Computers.

    Science.gov (United States)

    Weast, Jerry D.; And Others

    1993-01-01

    A Sioux Falls (South Dakota) project that supplied middle school teachers with Macintosh computers and training to use them showed gratifying results. Easy access to portable notebook computers made teachers more active computer users, increased teacher interaction and collaboration, enhanced teacher productivity regarding management tasks and…

  7. Teaching numerical methods with IPython notebooks and inquiry-based learning

    KAUST Repository

    Ketcheson, David I.

    2014-01-01

    A course in numerical methods should teach both the mathematical theory of numerical analysis and the craft of implementing numerical algorithms. The IPython notebook provides a single medium in which mathematics, explanations, executable code, and visualizations can be combined, and with which the student can interact in order to learn both the theory and the craft of numerical methods. The use of notebooks also lends itself naturally to inquiry-based learning methods. I discuss the motivation and practice of teaching a course based on the use of IPython notebooks and inquiry-based learning, including some specific practical aspects. The discussion is based on my experience teaching a Masters-level course in numerical analysis at King Abdullah University of Science and Technology (KAUST), but is intended to be useful for those who teach at other levels or in industry.
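    As an illustration of the kind of exercise such a notebook might contain (a sketch of our own, not material from the course itself), a student can compare finite-difference approximations of a derivative and observe their convergence orders empirically:

```python
import numpy as np

def forward_diff(f, x, h):
    # First-order accurate forward difference
    return (f(x + h) - f(x)) / h

def central_diff(f, x, h):
    # Second-order accurate central difference
    return (f(x + h) - f(x - h)) / (2 * h)

# In a notebook, one would vary h and plot the error on a log-log scale
# to see the slopes 1 (forward) and 2 (central) emerge.
x0 = 1.0
exact = np.cos(x0)  # derivative of sin at x0
errors = {h: (abs(forward_diff(np.sin, x0, h) - exact),
              abs(central_diff(np.sin, x0, h) - exact))
          for h in (1e-1, 1e-2, 1e-3)}
```

Shrinking h by a factor of ten reduces the forward-difference error roughly tenfold but the central-difference error roughly a hundredfold, which is exactly the kind of theory-meets-experiment observation the notebook medium supports.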

  8. DEVOTION IN NICHOLAS SPARKS’ THE NOTEBOOK (1996): AN INDIVIDUAL PSYCHOLOGICAL APPROACH

    Directory of Open Access Journals (Sweden)

    Yuli Andria Fajarini

    2017-08-01

    Full Text Available The study described the devotion of Noah Calhoun, the main character in Nicholas Sparks’ The Notebook. It focused on the novel's structural elements and on Noah's devotion in dealing with inferiority feeling and compensation, striving for superiority, fictional finalism, style of life, social interest, and creative self, explored through an individual psychological approach. This was qualitative research whose primary data source was the novel The Notebook, written by Nicholas Sparks in 1996; the secondary data were other related sources. The data were collected through library research. The results showed that, based on individual psychological analysis, the major character, Noah Calhoun, is psychologically affected. Noah fights hard to win his true love and shows her his devotion. He dedicates all of his life to Allie. Keywords: Devotion, The Notebook, Individual Psychological Approach.

  9. Developing an Audiovisual Notebook as a Self-Learning Tool in Histology: Perceptions of Teachers and Students

    Science.gov (United States)

    Campos-Sánchez, Antonio; López-Núñez, Juan-Antonio; Scionti, Giuseppe; Garzón, Ingrid; González-Andrades, Miguel; Alaminos, Miguel; Sola, Tomás

    2014-01-01

    Videos can be used as didactic tools for self-learning under several circumstances, including those cases in which students are responsible for the development of this resource as an audiovisual notebook. We compared students' and teachers' perceptions regarding the main features that an audiovisual notebook should include. Four…

  10. Une lecture, mille et une reflexions: Cahier de reflexion sur le processus de lecture. (One Reading, 1001 Reflections: Notebook of Reflection on the Reading Process.)

    Science.gov (United States)

    Alberta Dept. of Education, Edmonton.

    This activity notebook is intended to help French-speaking students in Alberta, Canada, develop reflective reading practices. Following an introduction and information (with graphics) on the notebook's organization, the notebook is divided into three sections of reading strategies: the first section contains three activities, the second section…

  11. SMART NOTEBOOK AS AN ICT WAY FОR DEVELOPMENT OF RESEARCH COMPETENCE

    Directory of Open Access Journals (Sweden)

    Svitlana V. Vasylenko

    2014-05-01

    Full Text Available The article discusses the benefits to the educational process, in both general and higher education, of developing students' research competence through information and communication technology training. These technologies are used in many areas of activity, including updating the content of education, implementing distance learning, and introducing new forms of collaboration. Attention is focused on the features of using SMART Notebook software for organizing the learning process in the form of interactive sessions; basic arguments are given for using SMART Notebook to create an author's educational resources and for orienting teachers toward constructing their own innovative methodical systems.

  12. MARTIN HEIDEGGER’S BLACK NOTEBOOKS AND POLITICAL ECONOMY OF CONTEMPORARY PHILOSOPHICAL CRITIQUE

    Directory of Open Access Journals (Sweden)

    A. O. Karpenko

    2016-06-01

    Full Text Available The purpose of the study is to determine the key strategies of philosophical criticism of Heidegger’s Black Notebooks, pursued through the following tasks: (1) to identify the body of texts that represent the discourse of philosophical criticism of Heidegger's notes; (2) to reveal the typological features of the different strategies of interpreting the Black Notebooks; (3) to reconstruct the thematic horizon of Heidegger studies opened up by the discussion of the published notes. The methodology combines elements of discourse analysis with traditional methods of historical and philosophical criticism. Scientific novelty is expressed in the following: (1) the philosophical discourse of the Notebooks' reception includes texts of narrowly specialized character (the Gatherings collection of articles), as well as reflections of key philosophers (A. Badiou, J.-L. Nancy); (2) the basic strategies in the philosophical critique of the Black Notebooks convey the overall structure of the discourse of interpretation of Heidegger's legacy, distributed between apologetics and ideological criticism; (3) the Black Notebooks have exacerbated the problem of the architectonics of the Gesamtausgabe and formed the textual basis for the study of the period of "silence" in Heidegger's philosophical life. Conclusions. The discourse of philosophical critique of Heidegger's notes provides evidence of the ideological charge of philosophizing and justifies socially oriented approaches of historical and philosophical studies that examine philosophizing as a special cultural practice, not as a form of sublime creativity.

  13. notebooks, looking for a very specific diary entry that he had made ...

    African Journals Online (AJOL)

    notebooks, looking for a very specific diary entry that he had made some years before. As he searches, the audience (or in this case, the reader) is shown glimpses from Fugard's actual diary entries since there is a significant overlap between dramatist and character. As Fugard (and Fourie) confirms in the foreword to the.

  14. Measuring and Comparing Academic Language Development and Conceptual Understanding via Science Notebooks

    Science.gov (United States)

    Huerta, Margarita; Tong, Fuhui; Irby, Beverly J.; Lara-Alecio, Rafael

    2016-01-01

    The authors of this quantitative study measured and compared the academic language development and conceptual understanding of fifth-grade economically disadvantaged English language learners (ELL), former ELLs, and native English-speaking (ES) students as reflected in their science notebook scores. Using an instrument they developed, the authors…

  15. Using M@th Desktop Notebooks and Palettes in the Classroom

    Science.gov (United States)

    Simonovits, Reinhard

    2011-01-01

    This article explains the didactical design of M@th Desktop (MD), a teaching and learning software application for high schools and universities. The use of two types of MD resources is illustrated: notebooks and palettes, focusing on the topic of exponential functions. The handling of MD in a blended learning approach and the impact on the…

  16. Writing Material in Chemical Physics Research: The Laboratory Notebook as Locus of Technical and Textual Integration

    Science.gov (United States)

    Wickman, Chad

    2010-01-01

    This article, drawing on ethnographic study in a chemical physics research facility, explores how notebooks are used and produced in the conduct of laboratory science. Data include written field notes of laboratory activity; visual documentation of "in situ" writing processes; analysis of inscriptions, texts, and material artifacts produced in the…

  17. Impact of the implementation of a well-designed electronic laboratory notebook on bioanalytical laboratory function.

    Science.gov (United States)

    Zeng, Jianing; Hillman, Mark; Arnold, Mark

    2011-07-01

    This paper shares experiences of the Bristol-Myers Squibb Company during the design, validation and implementation of an electronic laboratory notebook (ELN) into the GLP/regulated bioanalytical analysis area, as well as addresses the impact on bioanalytical laboratory functions with the implementation of the electronic notebook. Some of the key points covered are: knowledge management - the project-based electronic notebook takes full advantage of the available technology that focuses on data organization and sharing so that scientific data generated by individual scientists became department knowledge; bioanalytical workflows in the ELN - the custom-built workflows that include data entry templates, validated calculation processes, integration with laboratory information management systems/laboratory instruments, and reporting capability improve the data quality and overall workflow efficiency; regulatory compliance - carefully designed notebook reviewing processes, cross referencing of distributed information, audit trail and software validation reduce compliance risks. By taking into consideration both data generation and project documentation needs, a well-designed ELN can deliver significant improvements in laboratory efficiency, work productivity, and regulatory compliance.

  18. A Performance Evaluation of a Notebook PC under a High Dose-Rate Gamma Ray Irradiation Test

    Directory of Open Access Journals (Sweden)

    Jai Wan Cho

    2014-01-01

    Full Text Available We describe the performance of a notebook PC under a high dose-rate gamma ray irradiation test. A notebook PC, which is small and lightweight, is generally used as the control unit of a robot system and loaded onto the robot body. According to TEPCO’s CAMS (containment atmospheric monitoring system) data, the gamma ray dose rate before and after a hydrogen explosion in reactor units 1–3 of the Fukushima nuclear power plant was more than 150 Gy/h. To use a notebook PC as the control unit of a robot system entering a reactor building to mitigate the severe accident situation of a nuclear power plant, the performance of the notebook PC under such intense gamma-irradiation fields should be evaluated. Under a similar dose-rate (150 Gy/h) gamma ray environment, the performances of different notebook PCs were evaluated. In addition, a simple method for the performance evaluation of a notebook PC under a high dose-rate gamma ray irradiation test is proposed. Three notebook PCs were tested to verify the method proposed in this paper.

  19. Will the leading firm continue to dominate the market in the Taiwan notebook industry?

    Science.gov (United States)

    Chu, Hsiao-Ping; Yeh, Ming-Liang; Sher, Peter J.; Chiu, Yi-Chia

    2007-09-01

    This study investigates whether the market share leader in the notebook industry in Taiwan is likely to maintain its dominant position. Market share data are used to investigate the intensity of competitiveness in the industry, and data on the gap in market shares are employed to elucidate the dominance of the leading firm in Taiwan's notebook industry during the 1998-2004 period. The newly developed Panel SURADF tests advanced by Breuer et al. [Misleading inferences from panel unit root tests with an illustration from purchasing power parity, Rev. Int. Econ. 9 (3) (2001) 482-493] are employed to determine whether the market share gap is stationary or not. Unlike other panel-based unit root tests which are joint tests of a unit root for all members of a panel and are incapable of determining the mix of I(0) and I(1) series in a panel setting, the Panel SURADF tests have the advantage of being able to investigate a separate unit root null hypothesis for each individual panel member and are, therefore, able to identify how many and which series in a panel are stationary processes. The empirical results from several panel-based unit root tests substantiate that the market shares of the firms studied here are non-stationary, indicating that Taiwan's notebook industry is highly competitive; however, Breuer et al.'s [12] Panel SURADF tests unequivocally show that only Compal is stationary with respect to market share gap. In terms of sales volume, Compal is the second largest firm in the notebook industry in Taiwan, and the results indicate that it alone has the opportunity to become the market share leader in the notebook industry.
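    The Panel SURADF procedure itself is specialized, but the underlying idea, testing each individual series for a unit root, can be sketched with an ordinary Dickey-Fuller regression. The fragment below is an illustration on synthetic series, not the Taiwanese market share data, and it implements only the plain (augmentation-free) Dickey-Fuller t-statistic:

```python
import numpy as np

def dickey_fuller_stat(y):
    """t-statistic of rho in the regression dy_t = alpha + rho * y_{t-1} + e_t.
    Strongly negative values reject the unit-root null (series is stationary);
    values near zero are consistent with a non-stationary random walk."""
    y = np.asarray(y, float)
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - 2)          # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)           # OLS covariance of beta
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(0)
e = rng.standard_normal(1000)
random_walk = np.cumsum(e)      # unit root: statistic stays near zero
ar1 = np.zeros(1000)            # stationary AR(1) with phi = 0.5
for t in range(1, 1000):
    ar1[t] = 0.5 * ar1[t - 1] + e[t]
```

On these synthetic series the stationary AR(1) yields a strongly negative statistic, while the random walk does not, which mirrors the stationary-versus-non-stationary distinction the panel tests draw between Compal and the other firms.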

  20. A suite of Mathematica notebooks for the analysis of protein main chain 15N NMR relaxation data

    International Nuclear Information System (INIS)

    Spyracopoulos, Leo

    2006-01-01

    A suite of Mathematica notebooks has been designed to ease the analysis of protein main chain 15N NMR relaxation data collected at a single magnetic field strength. Individual notebooks were developed to perform the following tasks: nonlinear fitting of 15N-T1 and -T2 relaxation decays to a two-parameter exponential decay, calculation of the principal components of the inertia tensor from protein structural coordinates, nonlinear optimization of the principal components and orientation of the axially symmetric rotational diffusion tensor, model-free analysis of 15N-T1, -T2, and {1H}-15N NOE data, and reduced spectral density analysis of the relaxation data. The principal features of the notebooks include the use of a minimal number of input files, integrated notebook data management, ease of use, cross-platform compatibility, automatic visualization of results and generation of high-quality graphics, and output of analyses in text format.
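    The first task listed, fitting a relaxation decay to a two-parameter exponential, can be sketched outside Mathematica as well. The fragment below is an illustration on synthetic, noise-free data; it uses a log-linear least-squares fit, which is a simplification of the true nonlinear fit the notebooks perform:

```python
import numpy as np

def fit_exponential_decay(t, intensity):
    """Fit I(t) = I0 * exp(-t / T) by linear least squares on log(I).
    Returns (I0, T). Valid for positive, low-noise data; real relaxation
    fits use nonlinear least squares to weight the noise correctly."""
    t = np.asarray(t, float)
    slope, intercept = np.polyfit(t, np.log(intensity), 1)
    return np.exp(intercept), -1.0 / slope

t = np.linspace(0.0, 2.0, 20)          # relaxation delays (s)
signal = 150.0 * np.exp(-t / 0.45)     # synthetic decay, T = 450 ms
I0, T_fit = fit_exponential_decay(t, signal)
```

On this synthetic decay the fit recovers the generating parameters (I0 = 150, T = 0.45 s) to numerical precision.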

  1. Requirement analysis for an electronic laboratory notebook for sustainable data management in biomedical research.

    Science.gov (United States)

    Menzel, Julia; Weil, Philipp; Bittihn, Philip; Hornung, Daniel; Mathieu, Nadine; Demiroglu, Sara Y

    2013-01-01

    Sustainable data management in biomedical research requires documentation of metadata for all experiments and results. Scientists usually document research data and metadata in laboratory paper notebooks. An electronic laboratory notebook (ELN) can keep metadata linked to research data resulting in a better understanding of the research results, meaning a scientific benefit [1]. Besides other challenges [2], the biggest hurdles for introducing an ELN seem to be usability, file formats, and data entry mechanisms [3] and that many ELNs are assigned to specific research fields such as biology, chemistry, or physics [4]. We aimed to identify requirements for the introduction of ELN software in a biomedical collaborative research center [5] consisting of different scientific fields and to find software fulfilling most of these requirements.

  2. Einstein's `Zürich Notebook' and his Journey to General Relativity

    OpenAIRE

    Straumann, Norbert

    2011-01-01

    On the basis of his `Zürich Notebook' I shall describe a particularly fruitful phase in Einstein's struggle on the way to general relativity. These research notes are an extremely illuminating source for understanding Einstein's main physical arguments and conceptual difficulties that delayed his discovery of general relativity by about three years. Together with the `Entwurf' theory in collaboration with Marcel Grossmann, these notes also show that the final theory was missed late in 191...

  3. IntuiScript a new digital notebook for learning writing in elementary schools: 1st observations

    OpenAIRE

    Girard , Nathalie; Simonnet , Damien; Anquetil , Eric

    2017-01-01

    International audience; IntuiScript is an innovative project that aims to design a digital notebook dedicated to handwriting learning at primary schools. One of the main goals is to provide children with real-time feedback to make them more autonomous. This feedback is produced by automatically analysing their drawing, and this online analysis makes it possible to adapt the pedagogical scenario to each child according to his own difficulties. The IntuiScript project complies with a us...

  4. OpenLabNotes - An Electronic Laboratory Notebook Extension for OpenLabFramework

    OpenAIRE

    List, M.; Franz, M.; Tan, O.; Mollenhauer, J.; Baumbach, J.

    2015-01-01

    Electronic laboratory notebooks (ELNs) are more accessible and reliable than their paper-based alternatives and thus find widespread adoption. While a large number of commercial products is available, small- to mid-sized laboratories often cannot afford the costs or are concerned about the longevity of the providers. Turning towards free alternatives, however, raises questions about data protection, which are not sufficiently addressed by available solutions. To serve as legal documents, ELNs...

  5. INTERACTION BEHAVIOUR LEADING TO COMFORT IN SERVICE ENCOUNTER OF NOTEBOOK PERIPHERAL SERVICE CENTER BUSINESS

    OpenAIRE

    Dr. Wachyudi.N.*

    2018-01-01

    This study aims to determine the effect of interaction behaviour that elicits a sense of comfort for customers in the service encounter of the notebook peripheral business, and to investigate the mediating role of comfort on overall service quality, customer satisfaction, word of mouth and repurchase intention. Based on 250 valid responses collected from a survey questionnaire, structural equation modeling (SEM) was used to examine the research model. The findings showed that all hypotheses on the r...

  6. Development of a prediction model on the acceptance of electronic laboratory notebooks in academic environments.

    Science.gov (United States)

    Kloeckner, Frederik; Farkas, Robert; Franken, Tobias; Schmitz-Rode, Thomas

    2014-04-01

    Documentation of research data plays a key role in the biomedical engineering innovation process. It makes an important contribution to the protection of intellectual property, the traceability of results, and the fulfilment of regulatory requirements. Given the increasing digitalization of laboratories, an electronic alternative to the commonly used paper-bound notebooks could contribute to more sophisticated documentation. However, compared to industrial environments, the use of electronic laboratory notebooks is not widespread in academic laboratories, and little is known about the acceptance of an electronic documentation system or the underlying reasons for this. This paper therefore aims to establish a prediction model for scientists' potential preference for and acceptance of either paper-based or electronic documentation. The underlying data for the analysis originate from an online survey of 101 scientists in industrial, academic and clinical environments. Various parameters were analyzed to identify crucial factors for system preference using binary logistic regression. The analysis showed a significant dependency between documentation system preference and the supposed workload associated with the documentation system, allowing a scientist's preference for a paper-based or electronic laboratory notebook to be predicted before implementation.
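    The binary logistic regression used in this abstract can be illustrated with a minimal sketch. Everything below is hypothetical: the data are synthetic and the single predictor (a "supposed workload" score) merely stands in for the survey parameters; it does not reproduce the study's model or data.

    ```python
    import math
    import random

    def fit_logistic(xs, ys, lr=0.1, epochs=2000):
        """Fit a one-predictor binary logistic regression by gradient descent."""
        w, b = 0.0, 0.0
        n = len(xs)
        for _ in range(epochs):
            gw = gb = 0.0
            for x, y in zip(xs, ys):
                p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # predicted probability
                gw += (p - y) * x
                gb += (p - y)
            w -= lr * gw / n  # average-gradient step
            b -= lr * gb / n
        return w, b

    def predict_prob(w, b, x):
        """Probability of preferring the electronic system given predictor x."""
        return 1.0 / (1.0 + math.exp(-(w * x + b)))

    # Synthetic survey: workload score 0-10; preference flips near 5, with noise.
    random.seed(0)
    workload = [random.uniform(0, 10) for _ in range(200)]
    prefers_electronic = [1 if x > 5 + random.gauss(0, 1.5) else 0 for x in workload]

    w, b = fit_logistic(workload, prefers_electronic)
    ```

    Fitting yields a positive weight on the workload score, so the sketch predicts a higher probability of preferring the electronic system as the score grows; the real study would instead feed in its surveyed parameters.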

  7. Evaluating Engagement Models for a Citizen Science Project: Lessons Learned From Four Years of Nature's Notebook

    Science.gov (United States)

    Crimmins, T. M.; Rosemartin, A.

    2012-12-01

    The success of citizen science programs hinges on their abilities to recruit and maintain active participants. The USA National Phenology Network's plant and animal phenology observation program, Nature's Notebook, has been active since 2009. This program engages thousands of citizen scientists in tracking plant and animal life cycle activity over the course of the year. We embarked on an evaluation of the various observer recruitment and retention tactics that we have employed over the ~4-year life of this program to better inform future outreach efforts specific to Nature's Notebook and for the broader citizen science community. Participants in Nature's Notebook may become engaged via one of three pathways: individuals may join Nature's Notebook directly, they may be invited to join through a USA-NPN partner organization, or they may engage through a group with local, site-based leadership. The level and type of recruitment tactics, training, and retention efforts that are employed vary markedly among these three models. In this evaluation, we compared the efficacy of these three engagement models using several metrics: number of individuals recruited, number of individuals that go on to submit at least one data point, retention rates over time, duration of activity, and quantity of data points submitted. We also qualitatively considered the differences in the costs the three models require to support. In terms of recruitment, direct engagement yielded 20-100 times more registrants than the other two models. In contrast, rates of participation were highest for site-based leadership (>35%, versus 20-30% for direct engagement; rates for partner organizations were highly variable due to small sample sizes). 
Individuals participating through partners with site-based leadership showed a much higher rate of retention (41% of participants remained active for two or more years) than those participating directly in Nature's Notebook (27% of participants remained active for two or more years).

  8. IPython: components for interactive and parallel computing across disciplines. (Invited)

    Science.gov (United States)

    Perez, F.; Bussonnier, M.; Frederic, J. D.; Froehle, B. M.; Granger, B. E.; Ivanov, P.; Kluyver, T.; Patterson, E.; Ragan-Kelley, B.; Sailer, Z.

    2013-12-01

    Scientific computing is an inherently exploratory activity that requires constantly cycling between code, data and results, each time adjusting the computations as new insights and questions arise. To support such a workflow, good interactive environments are critical. The IPython project (http://ipython.org) provides a rich architecture for interactive computing with: 1. Terminal-based and graphical interactive consoles. 2. A web-based Notebook system with support for code, text, mathematical expressions, inline plots and other rich media. 3. Easy-to-use, high-performance tools for parallel computing. Despite its roots in Python, the IPython architecture is designed in a language-agnostic way to facilitate interactive computing in any language. This allows users to mix Python with Julia, R, Octave, Ruby, Perl, Bash and more, as well as to develop native clients in other languages that reuse the IPython clients. In this talk, I will show how IPython supports all stages in the lifecycle of a scientific idea: 1. Individual exploration. 2. Collaborative development. 3. Production runs with parallel resources. 4. Publication. 5. Education. In particular, the IPython Notebook provides an environment for "literate computing" with a tight integration of narrative and computation (including parallel computing). These notebooks are stored in a JSON-based document format that provides an "executable paper": notebooks can be version controlled, exported to HTML or PDF for publication, and used for teaching.
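    The JSON-based document format mentioned in this abstract can be sketched with nothing but the standard library. The notebook below is a hand-built minimal example in the spirit of the nbformat-4 schema; the cell contents are invented for illustration, and a notebook saved by IPython itself would carry additional metadata.

    ```python
    import json

    # Minimal notebook document: top-level metadata plus a list of cells.
    notebook = {
        "nbformat": 4,
        "nbformat_minor": 5,
        "metadata": {"kernelspec": {"name": "python3", "display_name": "Python 3"}},
        "cells": [
            {   # narrative cell: text with inline math
                "cell_type": "markdown",
                "metadata": {},
                "source": ["# Analysis\n", "Narrative with $E = mc^2$ inline.\n"],
            },
            {   # code cell: source plus captured output
                "cell_type": "code",
                "execution_count": 1,
                "metadata": {},
                "source": ["print(2 + 2)\n"],
                "outputs": [{"output_type": "stream", "name": "stdout", "text": ["4\n"]}],
            },
        ],
    }

    # Plain JSON round-trips losslessly, which is what makes notebooks
    # version-controllable and convertible to HTML or PDF by external tools.
    roundtrip = json.loads(json.dumps(notebook, indent=1))
    code_cells = [c for c in roundtrip["cells"] if c["cell_type"] == "code"]
    ```

    Because code, narrative and outputs live side by side in one plain-text document, diffing and version control work on notebooks just as they do on source files.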

  9. A pocket guide to electronic laboratory notebooks in the academic life sciences [version 1; referees: 3 approved]

    Directory of Open Access Journals (Sweden)

    Ulrich Dirnagl

    2016-01-01

    Full Text Available Every professional doing active research in the life sciences is required to keep a laboratory notebook. However, while science has changed dramatically over the last centuries, laboratory notebooks have remained essentially unchanged since pre-modern science. We argue that the implementation of electronic laboratory notebooks (eLN) in academic research is overdue, and we provide researchers and their institutions with the background and practical knowledge to select and initiate the implementation of an eLN in their laboratories. In addition, we present data from surveying biomedical researchers and technicians regarding which hypothetical features and functionalities they hope to see implemented in an eLN, and which ones they regard as less important. We also present data on acceptance and satisfaction among those who have recently switched from a paper laboratory notebook to an eLN. We thus provide answers to the following questions: What does an electronic laboratory notebook afford a biomedical researcher, what does it require, and how should one go about implementing it?

  10. «La bellezza è un sentimento istintivo» ("Beauty is an instinctive feeling"). The Aesthetic in the Darwinian Notebooks

    Directory of Open Access Journals (Sweden)

    Lorenzo Bartalesi

    2012-12-01

    Full Text Available Since Charles Darwin, the theoretical framework of evolutionary aesthetics has been sexual selection. Recent debate focuses attention particularly on the criterion of female choice. The aim of this article is to sketch a Darwinian approach to aesthetics complementary to the one that Darwin himself presents in The Descent of Man (1871). A series of notes in Darwin's notebooks, traditionally known as the "Metaphysical Enquiries", will constitute the point of departure for a hypothetical reconstruction of the evolutionary history of aesthetic

  11. Communication of the monitoring and evaluation process through the use of storyboards and story notebooks.

    Science.gov (United States)

    Lewis, L C; Honea, S H; Kanter, D F; Haney, P E

    1993-10-01

    In preparation for the 1993 Joint Commission on Accreditation of Health Care Organizations (JCAHO) survey, Audie L. Murphy Memorial Veterans Hospital Nursing Service was faced with determining the best approach to presenting their Total Quality Improvement/Total Quality Management (TQI/TQM) process. Nursing Service management and staff, Quality Improvement Clinicians, and medical staff used the Storyboard concept and the accompanying Story Notebooks to organize and to communicate their TQI/TQM process and findings. This concept was extremely beneficial, enabling staff to successfully present the multidisciplinary TQI/TQM data to the JCAHO surveyors.

  12. Personalised Medical Reference to General Practice Notebook (GPnotebook) - an evolutionary tale

    Directory of Open Access Journals (Sweden)

    James McMorran

    2002-09-01

    What has happened to this resource now? This brief paper outlines how the developers of the reference resource have improved on the design and content of the medical database. The reference resource is now an Internet-based resource called General Practice Notebook (www.gpnotebook.co.uk), currently attracting 5000 to 9000 page views per day and containing over 30 000 index terms in a complex web structure of over 60 000 links. This paper describes the evolutionary process that has occurred over the last decade.

  13. Understand your Algorithm: Drill Down to Sample Visualizations in Jupyter Notebooks

    Science.gov (United States)

    Mapes, B. E.; Ho, Y.; Cheedela, S. K.; McWhirter, J.

    2017-12-01

    Statistics are the currency of climate dynamics, but the space of all possible algorithms is fathomless - especially for 4-dimensional weather-resolving data that many "impact" variables depend on. Algorithms are designed on data samples, but how do you know whether they measure what you expect when turned loose on Big Data? We will introduce the year-1 prototype of a 3-year scientist-led, NSF-supported, Unidata-quality software stack called DRILSDOWN (https://brianmapes.github.io/EarthCube-DRILSDOWN/) for automatically extracting, integrating, and visualizing multivariate 4D data samples. Based on a customizable "IDV bundle" of data sources, fields and displays supplied by the user, the system will teleport its space-time coordinates to fetch Cases of Interest (edge cases, typical cases, etc.) from large aggregated repositories. These standard displays can serve as backdrops to overlay with your value-added fields (such as derived quantities stored on a user's local disk). Fields can be readily pulled out of the visualization object for further processing in Python. The hope is that algorithms successfully tested in this visualization space will then be lifted out and added to automatic processing toolchains, lending confidence in the next round of processing, to seek the next Cases of Interest, in light of a user's statistical measures of "Interest". To log the scientific work done in this vein, the visualizations are wrapped in IPython-based Jupyter notebooks for rich, human-readable documentation (indeed, quasi-publication with formatted text, LaTeX math, etc.). Such notebooks are readable and executable, with digital replicability and provenance built in. The entire digital object of a case study can be stored in a repository, where libraries of these Case Study Notebooks can be examined in a browser. 
Model data (the session topic) are of course especially convenient for this system, but observations of all sorts can also be brought in, overlain, and differenced or

  14. Exact distributions of two-sample rank statistics and block rank statistics using computer algebra

    NARCIS (Netherlands)

    Wiel, van de M.A.

    1998-01-01

    We derive generating functions for various rank statistics and we use computer algebra to compute the exact null distribution of these statistics. We present various techniques for reducing time and memory space used by the computations. We use the results to write Mathematica notebooks for
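    The generating-function technique described in this abstract can be sketched in Python rather than Mathematica (an illustrative stand-in, not the authors' notebooks). For the two-sample rank-sum statistic, the number of ways to choose m of the ranks 1..m+n with sum s is the coefficient of z^m x^s in the product of (1 + z x^r) over r = 1..m+n, which the dynamic program below expands:

    ```python
    from math import comb

    def rank_sum_distribution(m, n):
        """Exact null distribution of the two-sample rank-sum statistic.

        Returns a dict mapping each possible rank sum s to the number of
        m-subsets of the ranks 1..m+n with that sum, i.e. the coefficients
        of z**m in prod_{r=1}^{m+n} (1 + z * x**r).
        """
        N = m + n
        # table[k][s] = number of k-subsets of the ranks seen so far with sum s
        table = [{} for _ in range(m + 1)]
        table[0][0] = 1
        for r in range(1, N + 1):
            # multiply in the factor (1 + z * x**r); descend in k so each
            # rank is used at most once (0/1-knapsack update order)
            for k in range(min(m, r), 0, -1):
                for s, c in table[k - 1].items():
                    table[k][s + r] = table[k].get(s + r, 0) + c
        return table[m]

    # Example: both samples of size 2, i.e. 2-subsets of the ranks {1, 2, 3, 4}.
    dist = rank_sum_distribution(2, 2)   # {3: 1, 4: 1, 5: 2, 6: 1, 7: 1}
    total = sum(dist.values())           # equals C(4, 2) = 6
    ```

    Dividing each count by comb(m + n, m) gives the exact null probabilities, from which exact p-values follow without any normal approximation.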

  15. Analysis and Implementation of an Electronic Laboratory Notebook in a Biomedical Research Institute.

    Science.gov (United States)

    Guerrero, Santiago; Dujardin, Gwendal; Cabrera-Andrade, Alejandro; Paz-Y-Miño, César; Indacochea, Alberto; Inglés-Ferrándiz, Marta; Nadimpalli, Hima Priyanka; Collu, Nicola; Dublanche, Yann; De Mingo, Ismael; Camargo, David

    2016-01-01

    Electronic laboratory notebooks (ELNs) will probably replace paper laboratory notebooks (PLNs) in academic research due to their advantages in data recording, sharing and security. Despite several reports describing technical characteristics of ELNs and their advantages over PLNs, no study has directly tested ELN performance among researchers. In addition, the usage of tablet-based devices or wearable technology as ELN complements has never been explored in the field. To implement an ELN in our biomedical research institute, here we first present a technical comparison of six ELNs using 42 parameters. Based on this, we chose two ELNs, which were tested by 28 scientists for a 3-month period and by 80 students via hands-on practical exercises. Second, we provide two survey-based studies aimed to compare these two ELNs (PerkinElmer Elements and Microsoft OneNote) and to analyze the use of tablet-based devices. We finally explore the advantages of using wearable technology as ELN tools. Among the ELNs tested, we found that OneNote presents almost all parameters evaluated (39/42) and both surveyed groups preferred OneNote as an ELN solution. In addition, 80% of the surveyed scientists reported that tablet-based devices improved the use of ELNs in different respects. We also describe the advantages of using the OneNote application for Apple Watch as an ELN wearable complement. This work defines essential features of ELNs that could be used to improve ELN implementation and software development.

  16. Radioactivity on the experimental notebook of Mme. Curie. Collection of Meisei University Library

    International Nuclear Information System (INIS)

    Mori, Chizuo; Inoue, Kazumasa; Chiwa, Kiyoshi; Miyahara, Junji

    2005-01-01

    The contamination with radioactive material of the notebook written by Marie Curie, held in the library of Meisei University, Tokyo, was studied using an Imaging Plate to obtain distribution images of the contamination on the front and back covers, a Si detector to obtain the energy spectrum of α-particles from the cover, and an HPGe detector to obtain the γ-ray spectrum. The distribution images showed that the contamination appeared mainly in the areas held with the hands, and even the edge of the notebook was contaminated. The many dots of contamination may imply that there was powdery contamination in her surroundings. Energy spectra of α-particles and γ-rays showed that most of the nuclei were ²²⁶Ra and its daughters. The radioactivity level at the most intensely contaminated part was just below the permissible surface-contamination level for α-nuclides in Japan, 4 Bq/cm². The number of pages written each month over 15 years was also examined in order to picture the circumstances of the time, and some remarks are given with reference to her biographies, which include material on a Japanese researcher. (author)

  17. Analysis and Implementation of an Electronic Laboratory Notebook in a Biomedical Research Institute.

    Directory of Open Access Journals (Sweden)

    Santiago Guerrero

    Full Text Available Electronic laboratory notebooks (ELNs) will probably replace paper laboratory notebooks (PLNs) in academic research due to their advantages in data recording, sharing and security. Despite several reports describing technical characteristics of ELNs and their advantages over PLNs, no study has directly tested ELN performance among researchers. In addition, the usage of tablet-based devices or wearable technology as ELN complements has never been explored in the field. To implement an ELN in our biomedical research institute, here we first present a technical comparison of six ELNs using 42 parameters. Based on this, we chose two ELNs, which were tested by 28 scientists for a 3-month period and by 80 students via hands-on practical exercises. Second, we provide two survey-based studies aimed to compare these two ELNs (PerkinElmer Elements and Microsoft OneNote) and to analyze the use of tablet-based devices. We finally explore the advantages of using wearable technology as ELN tools. Among the ELNs tested, we found that OneNote presents almost all parameters evaluated (39/42) and both surveyed groups preferred OneNote as an ELN solution. In addition, 80% of the surveyed scientists reported that tablet-based devices improved the use of ELNs in different respects. We also describe the advantages of using the OneNote application for Apple Watch as an ELN wearable complement. This work defines essential features of ELNs that could be used to improve ELN implementation and software development.

  18. NeuroScholar’s Electronic Laboratory Notebook and Its Application to Neuroendocrinology

    Science.gov (United States)

    Khan, Arshad M.; Hahn, Joel D.; Cheng, Wei-Cheng; Watts, Alan G.; Burns, Gully A. P. C.

    2015-01-01

    Scientists continually relate information from the published literature to their current research. The challenge of this essential and time-consuming activity increases as the body of scientific literature continues to grow. In an attempt to lessen the challenge, we have developed an Electronic Laboratory Notebook (ELN) application. Our ELN functions as a component of another application we have developed, an open-source knowledge management system for the neuroscientific literature called NeuroScholar (http://www.neuroscholar.org/). Scanned notebook pages, images, and data files are entered into the ELN, where they can be annotated, organized, and linked to similarly annotated excerpts from the published literature within NeuroScholar. Associations between these knowledge constructs are created within a dynamic node-and-edge user interface to produce an interactive, adaptable knowledge base. We demonstrate the ELN's utility by using it to organize data and literature related to our studies of the neuroendocrine hypothalamic paraventricular nucleus (PVH). We also discuss how the ELN could be applied to model other neuroendocrine systems; as an example we look at the role of PVH stressor-responsive neurons in the context of their involvement in the suppression of reproductive function. We present this application to the community as open-source software and invite contributions to its development. PMID:16845166

  19. Implementation and use of cloud-based electronic lab notebook in a bioprocess engineering teaching laboratory.

    Science.gov (United States)

    Riley, Erin M; Hattaway, Holly Z; Felse, P Arthur

    2017-01-01

    Electronic lab notebooks (ELNs) are better equipped than paper lab notebooks (PLNs) to handle present-day life science and engineering experiments that generate large data sets and require high levels of data integrity. However, limited training and a lack of a workforce with ELN knowledge have restricted the use of ELNs in academic and industry research laboratories, which still rely on cumbersome PLNs for recordkeeping. We used LabArchives, a cloud-based ELN, in our bioprocess engineering lab course to train students in electronic record keeping, good documentation practices (GDPs), and data integrity. Implementation of the ELN in the bioprocess engineering lab course, an analysis of user experiences, and our development actions to improve ELN training are presented here. The ELN improved pedagogy and learning outcomes of the lab course through streamlined workflow, quick data recording and archiving, and enhanced data sharing and collaboration. It also enabled superior data integrity, simplified information exchange, and allowed real-time and remote monitoring of experiments. Several attributes related to positive user experiences of the ELN improved between the two subsequent years in which it was offered. Student responses also indicate that ELNs are better than PLNs for compliance. We demonstrated that an ELN can be successfully implemented in a lab course with significant benefits to pedagogy, GDP training, and data integrity. The methods and processes presented here for ELN implementation can be adapted to many types of laboratory experiments.

  20. NeuroScholar's electronic laboratory notebook and its application to neuroendocrinology.

    Science.gov (United States)

    Khan, Arshad M; Hahn, Joel D; Cheng, Wei-Cheng; Watts, Alan G; Burns, Gully A P C

    2006-01-01

    Scientists continually relate information from the published literature to their current research. The challenge of this essential and time-consuming activity increases as the body of scientific literature continues to grow. In an attempt to lessen the challenge, we have developed an Electronic Laboratory Notebook (ELN) application. Our ELN functions as a component of another application we have developed, an open-source knowledge management system for the neuroscientific literature called NeuroScholar (http://www.neuroscholar.org/). Scanned notebook pages, images, and data files are entered into the ELN, where they can be annotated, organized, and linked to similarly annotated excerpts from the published literature within NeuroScholar. Associations between these knowledge constructs are created within a dynamic node-and-edge user interface to produce an interactive, adaptable knowledge base. We demonstrate the ELN's utility by using it to organize data and literature related to our studies of the neuroendocrine hypothalamic paraventricular nucleus (PVH). We also discuss how the ELN could be applied to model other neuroendocrine systems; as an example we look at the role of PVH stressor-responsive neurons in the context of their involvement in the suppression of reproductive function. We present this application to the community as open-source software and invite contributions to its development.

  1. Psychiatrist's Notebook.

    Science.gov (United States)

    Gifted Child Today (GCT), 1988

    1988-01-01

    A child psychiatrist offers a brief introduction to learning disabilities: their causes, common signals of learning disabilities, student assessment to clarify the existence of a learning disability, and treatment with special educational services or medication. (JDD)

  2. Using an ePortfolio System as an Electronic Laboratory Notebook in Undergraduate Biochemistry and Molecular Biology Practical Classes

    Science.gov (United States)

    Johnston, Jill; Kant, Sashi; Gysbers, Vanessa; Hancock, Dale; Denyer, Gareth

    2014-01-01

    Despite many apparent advantages, including security, back-up, remote access, workflow, and data management, the use of electronic laboratory notebooks (ELNs) in the modern research laboratory is still developing. This presents a challenge to instructors who want to give undergraduate students an introduction to the kinds of data curation and…

  3. Writing on the Bus: Using Athletic Team Notebooks and Journals to Advance Learning and Performance in Sports

    Science.gov (United States)

    Kent, Richard

    2012-01-01

    "Writing on the Bus" showcases the what, how, and why of using athletic team notebooks and journals. The book guides coaches and athletes, from elementary school through college, in analyzing games while thinking deeply about motivation, goal setting, and communication in order to optimize performance. Filled with lesson plans, writing activities,…

  4. El Universo a Sus Pies: Actividades y Recursos para Astronomia (Universe at Your Fingertips: An Astronomy Activity and Resource Notebook).

    Science.gov (United States)

    Fraknoi, Andrew, Ed.; Schatz, Dennis, Ed.

    The goal of this resource notebook is to provide activities selected by astronomers and classroom teachers, comprehensive resource lists and bibliographies, background material on astronomical topics, and teaching ideas from experienced astronomy educators. Activities are grouped into several major areas of study in astronomy including lunar…

  5. Bring Your Own Device: A Digital Notebook for Undergraduate Biochemistry Laboratory Using a Free, Cross-Platform Application

    Science.gov (United States)

    Van Dyke, Aaron R.; Smith-Carpenter, Jillian

    2017-01-01

    The majority of undergraduates own a smartphone, yet fewer than half view it as a valuable learning technology. Consequently, a digital laboratory notebook (DLN) was developed for an upper-division undergraduate biochemistry laboratory course using the free mobile application Evernote. The cloud-based DLN capitalized on the unique features of…

  6. Enrico Fermi's Discovery of Neutron-Induced Artificial Radioactivity: The Recovery of His First Laboratory Notebook

    Science.gov (United States)

    Acocella, Giovanni; Guerra, Francesco; Robotti, Nadia

    We give a short description of the discovery of the first experimental notebook of Enrico Fermi (1901-1954), covering his research during March and April of 1934 on neutron-induced artificial radioactivity, and we point out its relevance for a proper historical and conceptual understanding of that research.

  7. Developing an audiovisual notebook as a self-learning tool in histology: perceptions of teachers and students.

    Science.gov (United States)

    Campos-Sánchez, Antonio; López-Núñez, Juan-Antonio; Scionti, Giuseppe; Garzón, Ingrid; González-Andrades, Miguel; Alaminos, Miguel; Sola, Tomás

    2014-01-01

    Videos can be used as didactic tools for self-learning under several circumstances, including those cases in which students are responsible for the development of this resource as an audiovisual notebook. We compared students' and teachers' perceptions regarding the main features that an audiovisual notebook should include. Four questionnaires with items about information, images, text and music, and filmmaking were used to investigate students' (n = 115) and teachers' perceptions (n = 28) regarding the development of a video focused on a histological technique. The results show that both students and teachers significantly prioritize informative components, images and filmmaking more than text and music. The scores were significantly higher for teachers than for students for all four components analyzed. The highest scores were given to items related to practical and medically oriented elements, and the lowest values were given to theoretical and complementary elements. For most items, there were no differences between genders. A strong positive correlation was found between the scores given to each item by teachers and students. These results show that both students' and teachers' perceptions tend to coincide for most items, and suggest that audiovisual notebooks developed by students would emphasize the same items as those perceived by teachers to be the most relevant. Further, these findings suggest that the use of video as an audiovisual learning notebook would not only preserve the curricular objectives but would also offer the advantages of self-learning processes. © 2013 American Association of Anatomists.

  8. Going paperless: implementing an electronic laboratory notebook in a bioanalytical laboratory.

    Science.gov (United States)

    Beato, Brian; Pisek, April; White, Jessica; Grever, Timothy; Engel, Brian; Pugh, Michael; Schneider, Michael; Carel, Barbara; Branstrator, Laurel; Shoup, Ronald

    2011-07-01

    AIT Bioscience, a bioanalytical CRO, implemented a highly configurable, Oracle-based electronic laboratory notebook (ELN) from IDBS called E-WorkBook Suite (EWBS). This ELN provides a high degree of connectivity with other databases, including Watson LIMS. Significant planning and training, along with considerable design effort and template validation for dozens of laboratory workflows were required prior to EWBS being viable for either R&D or regulated work. Once implemented, EWBS greatly reduced the need for traditional quality review upon experiment completion. Numerous real-time error checks occur automatically when conducting EWBS experiments, preventing the majority of laboratory errors by pointing them out while there is still time to correct any issues. Auditing and reviewing EWBS data are very efficient, because all data are forever securely (and even remotely) accessible, provided a reviewer has appropriate credentials. Use of EWBS significantly increases both data quality and laboratory efficiency.

  9. From Notebook to Novel and from Diary to Dante: Reading Robert Dessaix’s Night Letters

    Directory of Open Access Journals (Sweden)

    Roberta Trapè

    2009-06-01

    Full Text Available This paper has developed out of a larger work in progress, which focuses on representations of Italy in contemporary Australian fiction and non-fiction prose. This larger project aims to add to an established body of work on travel writing by considering Australian texts that describe Australian travel in Italy, Italian people and Italian places. In this paper, I will specifically focus on the representations of Italy in Robert Dessaix's novel Night Letters (1996). My paper will explore the relationship between the writer's actual journey in Italy and that of the creative work's main character. The novel offers the protagonist's account in the form of letters, which describe his travel from Switzerland across Northern Italy to Venice. I will begin by briefly outlining the Italian itinerary followed by Dessaix that would eventually inspire the novel. I will then explore the relationship between Dessaix's notebooks recording his two journeys in Italy and the literary accomplishment of Night Letters. My aim is to show ways in which an itinerary becomes a story, a complex narrative. Reference will be made to factual accounts and descriptions in the author's own diaries with an analysis of their generative role as key sources for the fictional work. This will be done through a close reading of particular passages, in the diaries and in the novel, concerning the same event. A comparative analysis of the notebooks and Night Letters can show that Dessaix's diary entries relating to Italian places are woven into the fictional fabric of the ‘night letters’ according to a unifying principle.

  10. Office of Geologic Repositories program baseline procedures notebook (OGR/B-1)

    International Nuclear Information System (INIS)

    1986-06-01

    Baseline management is typically applied to aid in the internal control of a program by providing consistent programmatic direction, control, and surveillance to an evolving system development. This fundamental concept of internal program control involves the establishment of a baseline to serve as a point of departure for consistent technical program coordination and to control subsequent changes from that baseline. The existence of a program-authorized baseline ensures that all participants are working to the same ground rules. Baseline management also ensures that, once the baseline is defined, changes are assessed and approved by a process which ensures adequate consideration of overall program impact. Baseline management also includes the consideration of exemptions from the baseline. The process of baseline management continues through all the phases of an evolving system development program. As the Program proceeds, there will be a progressive increase in the data contained in the baseline documentation. Baseline management has been selected as a management technique to aid in the internal control of the Office of Geologic Repositories (OGR) program. Specifically, an OGR Program Baseline, including technical and programmatic requirements, is used for program control of the four Mined Geologic Disposal System field projects, i.e., the Basalt Waste Isolation Project, the Nevada Nuclear Waste Storage Investigation, the Salt Repository Project and the Crystalline Repository Project. This OGR Program Baseline Procedures Notebook provides a description of the baseline management concept, establishes the OGR Program baseline itself, and provides procedures to be followed for controlling changes to that baseline. The notebook has a controlled distribution and will be updated as required.

  11. The Jupyter/IPython architecture: a unified view of computational research, from interactive exploration to communication and publication.

    Science.gov (United States)

    Ragan-Kelley, M.; Perez, F.; Granger, B.; Kluyver, T.; Ivanov, P.; Frederic, J.; Bussonnier, M.

    2014-12-01

    IPython has provided terminal-based tools for interactive computing in Python since 2001. The notebook document format and multi-process architecture introduced in 2011 have expanded the applicable scope of IPython into teaching, presenting, and sharing computational work, in addition to interactive exploration. The new architecture also allows users to work in any language, with implementations in Python, R, Julia, Haskell, and several other languages. The language-agnostic parts of IPython have been renamed to Jupyter, to better capture the notion that a cross-language design can encapsulate commonalities present in computational research regardless of the programming language being used. This architecture offers components such as the web-based Notebook interface, which supports rich documents that combine code and computational results with text narratives, mathematics, images, video and any media that a modern browser can display. This interface can be used not only in research, but also for publication and education, as notebooks can be converted to a variety of output formats, including HTML and PDF. Recent developments in the Jupyter project include a multi-user environment for hosting notebooks for a class or research group, live collaboration on notebooks via Google Docs, and better support for languages other than Python.
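
    The notebook document format mentioned above is an ordinary JSON file. As an illustrative sketch, the helper below assembles a two-cell document following the published nbformat 4 layout (the top-level and cell field names come from that format; the cell contents are invented):

    ```python
    import json

    def make_notebook(cells):
        """Assemble a minimal notebook document in the nbformat 4 JSON layout."""
        nb_cells = []
        for kind, source in cells:
            cell = {"cell_type": kind, "metadata": {},
                    "source": source.splitlines(keepends=True)}
            if kind == "code":
                # Code cells additionally carry an execution counter and outputs.
                cell["execution_count"] = None
                cell["outputs"] = []
            nb_cells.append(cell)
        return {"nbformat": 4, "nbformat_minor": 5, "metadata": {}, "cells": nb_cells}

    nb = make_notebook([
        ("markdown", "# A narrative heading"),
        ("code", "print(2 + 2)"),
    ])
    serialized = json.dumps(nb)  # the on-disk form consumed by the Notebook server
    ```

    A file with this structure, saved with an `.ipynb` extension, is what tools such as nbconvert turn into HTML or PDF.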

  12. National Bureau of Standards health physics radioactive material shipment survey, packaging, and labelling program under ICAO/IATA and DOT regulations

    International Nuclear Information System (INIS)

    Sharp, D.R.; Slaback, L.A.

    1984-01-01

    NBS routinely ships many radionuclides in small to moderate activities, with many shipments containing mixtures of radionuclides in a variety of combinations. The ICAO/IATA shipping regulations (and the new DOT regulations modelled on them) specify individual shipping parameters for every radionuclide. As a result, quality control in the shipment of these radioactive packages has become difficult to maintain. The authors have developed a computer program that will guide a Health Physics technician through package surveys and give exact packaging and labelling instructions. The program is a 27 kilobyte user-friendly BASIC program that runs on an Epson HX-20 notebook computer with a microcassette drive and 16 kilobyte memory expansion unit. This small computer is more manageable than the regulation books it replaces and will be used in routine radioactive shipments.
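
    The core of such a survey-and-labelling program is a threshold table mapping survey readings to a label category. The sketch below (in Python rather than the original BASIC) uses the commonly cited DOT limits from 49 CFR 172.403; it is illustrative only, not a compliance tool, and thresholds should be verified against the current regulations:

    ```python
    def label_category(surface_mrem_h, transport_index):
        """Pick a radioactive-material package label from survey readings.

        surface_mrem_h: maximum dose rate at the package surface (mrem/h).
        transport_index: dose rate at 1 m from the surface (mrem/h).
        Thresholds follow the commonly cited 49 CFR 172.403 limits.
        """
        if surface_mrem_h <= 0.5 and transport_index == 0:
            return "White-I"
        if surface_mrem_h <= 50 and transport_index <= 1:
            return "Yellow-II"
        if surface_mrem_h <= 200 and transport_index <= 10:
            return "Yellow-III"
        return "exclusive use required"
    ```

    A technician-facing program would wrap this in prompts for the survey readings and then print the matching packaging and labelling instructions.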

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  14. Integration of ROOT Notebooks as a Web-based ATLAS Analysis tool for Public Data Releases and Outreach

    CERN Document Server

    Abah, Anthony

    2016-01-01

    The project worked on the development of a physics analysis and its software under the ROOT framework and Jupyter notebooks for the ATLAS Outreach and Naples teams. This analysis was created in the context of the release of data and Monte Carlo samples by the ATLAS collaboration. The project focuses on the enhancement of the recent opendata.atlas.cern web platform to be used as an educational resource for university students and new researchers. The generated analysis structure and tutorials will be used to extend the participation of students from other locations around the world. We conclude the project with the creation of a complete notebook implementing the so-called W analysis in the C++ language for the mentioned platform.

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  16. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  18. Mining Tasks from the Web Anchor Text Graph: MSR Notebook Paper for the TREC 2015 Tasks Track

    Science.gov (United States)

    2015-11-20

    Mining Tasks from the Web Anchor Text Graph: MSR Notebook Paper for the TREC 2015 Tasks Track. Paul N. Bennett, Microsoft Research, Redmond, USA. ...anchor text graph has proven useful in the general realm of query reformulation [2], we sought to quantify the value of extracting key phrases from ...anchor text in the broader setting of the task understanding track. Given a query, our approach considers a simple method for identifying a relevant
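
    The idea of mining key phrases from anchor text can be sketched very simply: count how often each anchor phrase links to pages judged relevant to the query, then keep the most frequent phrases. The toy implementation below is an illustration of that general approach, not the paper's actual method; the edge list and page identifiers are invented:

    ```python
    from collections import Counter

    def key_phrases(anchor_edges, relevant_pages, top_k=2):
        """Rank anchor-text phrases by how often they point at relevant pages.

        anchor_edges: iterable of (anchor_phrase, target_page) pairs -- a toy
        stand-in for the web anchor-text graph.
        relevant_pages: set of pages judged relevant to the query.
        """
        counts = Counter(phrase for phrase, page in anchor_edges
                         if page in relevant_pages)
        return [phrase for phrase, _ in counts.most_common(top_k)]

    edges = [("jupyter docs", "p1"), ("install jupyter", "p1"),
             ("jupyter docs", "p2"), ("cat videos", "p3")]
    top = key_phrases(edges, {"p1", "p2"}, top_k=1)  # ["jupyter docs"]
    ```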

  19. An Interactive and Comprehensive Working Environment for High-Energy Physics Software with Python and Jupyter Notebooks

    Science.gov (United States)

    Braun, N.; Hauth, T.; Pulvermacher, C.; Ritter, M.

    2017-10-01

    Today’s analyses for high-energy physics (HEP) experiments involve processing a large amount of data with highly specialized algorithms. The contemporary workflow from recorded data to final results is based on the execution of small scripts - often written in Python or ROOT macros which call complex compiled algorithms in the background - to perform fitting procedures and generate plots. During recent years interactive programming environments, such as Jupyter, became popular. Jupyter allows the development of Python-based applications, so-called notebooks, which bundle code, documentation and results, e.g. plots. Advantages over classical script-based approaches are the ability to recompute only parts of the analysis code, which allows for fast and iterative development, and a web-based user frontend, which can be hosted centrally and requires only a browser on the user side. In our novel approach, Python and Jupyter are tightly integrated into the Belle II Analysis Software Framework (basf2), currently being developed for the Belle II experiment in Japan. This allows code to be developed in Jupyter notebooks for every aspect of the event simulation, reconstruction and analysis chain. These interactive notebooks can be hosted as a centralized web service via jupyterhub with docker and used by all scientists of the Belle II Collaboration. Because of its generality and encapsulation, the setup can easily be scaled to large installations.
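
    Hosting notebooks centrally via JupyterHub with Docker, as described, typically comes down to a short configuration file. The fragment below is a minimal sketch: the spawner class and trait names come from the public JupyterHub/DockerSpawner projects, while the image and network names are placeholders, not the actual Belle II deployment.

    ```python
    # jupyterhub_config.py -- minimal sketch of a centrally hosted hub.
    # Each user gets a per-user Docker container running the analysis image.
    c.JupyterHub.spawner_class = "dockerspawner.DockerSpawner"
    c.DockerSpawner.image = "example/basf2-notebook:latest"  # placeholder image name
    c.DockerSpawner.network_name = "jupyterhub-net"          # placeholder network
    c.DockerSpawner.remove = True  # discard the container when the session ends
    c.JupyterHub.hub_ip = "0.0.0.0"
    ```

    With a configuration along these lines, every collaborator reaches the same software environment through a browser, which is the encapsulation property the abstract highlights.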

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, not least by using opportunistic resources like the San Diego Supercomputer Center to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  1. Fast Deployment on the Cloud of Integrated Postgres, API and a Jupyter Notebook for Geospatial Collaboration

    Science.gov (United States)

    Fatland, R.; Tan, A.; Arendt, A. A.

    2016-12-01

    We describe a Python-based implementation of a PostgreSQL database accessed through an Application Programming Interface (API) hosted on the Amazon Web Services public cloud. The data is geospatial and concerns hydrological model results in the glaciated catchment basins of southcentral and southeast Alaska. This implementation, however, is intended to be generalized to other forms of geophysical data, particularly data that is intended to be shared across a collaborative team or publicly. An example (moderate-size) dataset is provided together with the code base and a complete installation tutorial on GitHub. An enthusiastic scientist with some familiarity with software installation can replicate the example system in two hours. This installation includes database, API, a test Client and a supporting Jupyter Notebook, specifically oriented towards Python 3 and markup text to comprise an executable paper. The installation 'on the cloud' often engenders discussion and consideration of cloud cost and safety. By treating the process as somewhat "cookbook" we hope to first demonstrate the feasibility of the proposition. A discussion of cost and data security is provided in this presentation and in the accompanying tutorial/documentation. This geospatial data system case study is part of a larger effort at the University of Washington to enable research teams to take advantage of the public cloud to meet challenges in data management and analysis.
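
    A notebook client for such a system usually does two things: compose a REST query against the API and aggregate the JSON records it returns. The sketch below illustrates that pattern; the endpoint URL, path, and parameter names are hypothetical placeholders (the real API lives in the project's GitHub repository), not the actual interface:

    ```python
    from urllib.parse import urlencode

    API_BASE = "https://api.example.org/hydrology"  # placeholder endpoint

    def build_query(basin, start, end):
        """Compose the REST query a notebook client would issue
        (hypothetical path and parameter names)."""
        params = urlencode({"basin": basin, "start": start, "end": end})
        return f"{API_BASE}/runoff?{params}"

    def total_runoff(records):
        """Aggregate model output returned by the API as JSON records."""
        return sum(r["runoff_m3"] for r in records)

    url = build_query("wolverine", "2016-05-01", "2016-09-30")
    ```

    Keeping the client this thin is what makes the accompanying Jupyter Notebook readable as an "executable paper": the query, the aggregation, and the narrative sit side by side.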

  2. Focused campaign increases activity among participants in Nature's Notebook, a citizen science project

    Science.gov (United States)

    Crimmins, Theresa M.; Weltzin, Jake F.; Rosemartin, Alyssa H.; Surina, Echo M.; Marsh, Lee; Denny, Ellen G.

    2014-01-01

    Citizen science projects, which engage non-professional scientists in one or more stages of scientific research, have been gaining popularity; yet maintaining participants’ activity level over time remains a challenge. The objective of this study was to evaluate the potential for a short-term, focused campaign to increase participant activity in a national-scale citizen science program. The campaign that we implemented was designed to answer a compelling scientific question. We invited participants in the phenology-observing program, Nature’s Notebook, to track trees throughout the spring of 2012, to ascertain whether the season arrived as early as the anomalous spring of 2010. Consisting of a series of six electronic newsletters and costing our office slightly more than 1 week of staff resources, our effort was successful; compared with previous years, the number of observations collected in the region where the campaign was run increased by 184%, the number of participants submitting observations increased by 116%, and the number of trees registered increased by 110%. In comparison, these respective metrics grew by 25, 55, and 44%, over previous years, in the southeastern quadrant of the United States, where no such campaign was carried out. The campaign approach we describe here is a model that could be adapted by a wide variety of programs to increase engagement and thereby positively influence participant retention.
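
    The growth metrics quoted above are ordinary percent increases over the previous period, with the no-campaign region serving as a rough control. The snippet below shows the arithmetic on invented counts (the paper reports growth rates, not the raw totals used here):

    ```python
    def pct_increase(before, after):
        """Percent growth of an activity metric relative to the previous period."""
        return 100.0 * (after - before) / before

    # Illustrative counts only: chosen so the campaign region reproduces the
    # reported 184% growth and the control region the reported 25%.
    campaign_growth = pct_increase(1000, 2840)  # 184.0
    control_growth = pct_increase(1000, 1250)   # 25.0
    net_effect = campaign_growth - control_growth
    ```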

  3. OpenLabNotes – An Electronic Laboratory Notebook Extension for OpenLabFramework

    Directory of Open Access Journals (Sweden)

    List Markus

    2015-09-01

    Full Text Available Electronic laboratory notebooks (ELNs) are more accessible and reliable than their paper-based alternatives and thus find widespread adoption. While a large number of commercial products are available, small- to mid-sized laboratories often cannot afford the costs or are concerned about the longevity of the providers. Turning towards free alternatives, however, raises questions about data protection, which are not sufficiently addressed by available solutions. To serve as legal documents, ELNs must prevent scientific fraud through technical means such as digital signatures. It would also be advantageous if an ELN were integrated with a laboratory information management system to allow for a comprehensive documentation of experimental work, including the location of samples that were used in a particular experiment. Here, we present OpenLabNotes, which adds state-of-the-art ELN capabilities to OpenLabFramework, a powerful and flexible laboratory information management system. In contrast to comparable solutions, it allows users to protect their intellectual property by offering data protection with digital signatures. OpenLabNotes effectively closes the gap between research documentation and sample management, thus making OpenLabFramework more attractive for laboratories that seek to increase productivity through electronic data management.
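
    The fraud-prevention idea is that every notebook entry is stored together with a cryptographic tag that breaks if the entry is later altered. The sketch below is deliberately simplified: it uses a keyed HMAC from the Python standard library as a stand-in for the asymmetric digital signatures OpenLabNotes actually uses, where a per-user private key makes authorship verifiable by third parties.

    ```python
    import hashlib
    import hmac

    def sign_entry(secret_key: bytes, entry_text: str) -> str:
        """Attach an integrity tag to an ELN entry.

        Simplified sketch: a keyed HMAC stands in for a real digital
        signature; any later edit to the entry invalidates the tag.
        """
        return hmac.new(secret_key, entry_text.encode(), hashlib.sha256).hexdigest()

    def verify_entry(secret_key: bytes, entry_text: str, tag: str) -> bool:
        """Check that an entry still matches the tag recorded at signing time."""
        return hmac.compare_digest(sign_entry(secret_key, entry_text), tag)
    ```

    A tampered entry fails verification, which is exactly the property that lets a signed ELN record serve as a legal document.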

  4. Differences in typing forces, muscle activity, comfort, and typing performance among virtual, notebook, and desktop keyboards.

    Science.gov (United States)

    Kim, Jeong Ho; Aulck, Lovenoor; Bartha, Michael C; Harper, Christy A; Johnson, Peter W

    2014-11-01

    The present study investigated whether there were physical exposure and typing productivity differences between a virtual keyboard with no tactile feedback and two conventional keyboards where key travel and tactile feedback are provided by mechanical switches under the keys. The key size and layout were the same across all the keyboards. Typing forces, finger and shoulder muscle activity, self-reported comfort, and typing productivity were measured from 19 subjects while typing on a virtual (0 mm key travel), notebook (1.8 mm key travel), and desktop keyboard (4 mm key travel). When typing on the virtual keyboard, subjects typed with less force and with lower finger muscle activity; however, these reductions came at the expense of a 60% reduction in typing productivity. For longer typing sessions, or when typing productivity is at a premium, conventional keyboards with tactile feedback may be the more suitable interface. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  5. Chemotion ELN: an Open Source electronic lab notebook for chemists in academia.

    Science.gov (United States)

    Tremouilhac, Pierre; Nguyen, An; Huang, Yu-Chieh; Kotov, Serhii; Lütjohann, Dominic Sebastian; Hübsch, Florian; Jung, Nicole; Bräse, Stefan

    2017-09-25

    The development of an electronic lab notebook (ELN) for researchers working in the field of chemical sciences is presented. The web-based application is available as Open Source software that offers modern solutions for chemical researchers. The Chemotion ELN is equipped with the basic functionalities necessary for the acquisition and processing of chemical data, in particular work with molecular structures and calculations based on molecular properties. The ELN supports planning, description, storage, and management for the routine work of organic chemists. It also provides tools for communicating and sharing the recorded research data among colleagues. Meeting the requirements of a state-of-the-art research infrastructure, the ELN allows the search for molecules and reactions not only within the user's data but also in conventional external sources as provided by SciFinder and PubChem. The presented development addresses the growing dependence of scientific activity on the availability of digital information by providing Open Source instruments to record and reuse research data. The current version of the ELN has been in use for over half a year in our chemistry research group, where it serves as a common infrastructure for chemistry research and enables chemistry researchers to build their own databases of digital information as a prerequisite for the detailed, systematic investigation and evaluation of chemical reactions and mechanisms.

  6. Nature's Notebook Provides Phenology Observations for NASA Juniper Phenology and Pollen Transport Project

    Science.gov (United States)

    Luval, J. C.; Crimmins, T. M.; Sprigg, W. A.; Levetin, E.; Huete, A.; Nickovic, S.; Prasad, A.; Vukovic, A.; VandeWater, P. K.; Budge, A. M.; et al.

    2014-01-01

    Phenology Network has been established to provide nationwide observations of vegetation phenology. However, as the Network is still in the early phases of establishment and growth, the density of observers is not yet adequate to sufficiently document the phenology variability over large regions. Hence a combination of satellite data and ground observations can provide optimal information regarding Juniperus spp. pollen phenology. MODIS data were used to observe Juniperus spp. pollen phenology. The MODIS surface reflectance product provided information on Juniperus spp. cone formation and cone density. Ground-based observational records of pollen release timing and quantities were used as verification. Approximately 10,818 records of juniper phenology for male cone formation in Juniperus ashei, J. monosperma, J. scopulorum, and J. pinchotti were reported by Nature's Notebook observers in 2013. These observations provided valuable information for the analysis of satellite images for developing the pollen concentration masks for input into the PREAM (Pollen REgional Atmospheric Model) pollen transport model. The combination of satellite data and ground observations allowed us to improve our confidence in predicting pollen release and spread, thereby improving asthma and allergy alerts.

  7. Metadata capture in an electronic notebook: How to make it as simple as possible?

    Directory of Open Access Journals (Sweden)

    Menzel, Julia

    2015-09-01

    Full Text Available In the last few years, electronic laboratory notebooks (ELNs) have become popular. ELNs offer the great possibility to capture metadata automatically. Due to the high documentation effort, metadata documentation is often neglected in science. To close the gap between good data documentation and the high documentation effort it demands of scientists, a first user-friendly solution to capture metadata in an easy way was developed. At first, different protocols for the Western Blot were collected within the Collaborative Research Center 1002 and analyzed. Together with existing metadata standards identified in a literature search, a first version of the metadata scheme was developed. Secondly, the metadata scheme was customized for future users, including the implementation of default values for automated metadata documentation. Twelve protocols for the Western Blot were used to construct one standard protocol with ten different experimental steps. Three already existing metadata standards were used as models to construct the first version of the metadata scheme, consisting of 133 data fields in ten experimental steps. Through a revision with future users, the final metadata scheme was shortened to 90 items in three experimental steps. Using individualized default values, 51.1% of the metadata can be captured with preset values in the ELN. This lowers the data documentation effort. At the same time, researchers could benefit from providing standardized metadata for data sharing and re-use.
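
    The "default values" idea reduces to representing the metadata scheme as a mapping from field name to default, where fields without a default must still be typed by hand. The sketch below computes the share of fields an ELN can pre-fill; the field names are invented for illustration (the paper's final scheme has 90 items):

    ```python
    def default_coverage(scheme):
        """Fraction of metadata fields an ELN can pre-fill from defaults.

        scheme: mapping of field name -> default value (None = user must type it).
        """
        filled = sum(1 for value in scheme.values() if value is not None)
        return filled / len(scheme)

    # Hypothetical Western Blot fields, not the paper's actual 90-item scheme.
    scheme = {
        "antibody": None,               # varies per experiment
        "blocking_buffer": "5% milk",   # lab default
        "transfer_time_min": 60,        # lab default
        "gel_percent": None,            # varies per experiment
    }
    coverage = default_coverage(scheme)  # 0.5
    ```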

  8. OpenLabNotes--An Electronic Laboratory Notebook Extension for OpenLabFramework.

    Science.gov (United States)

    List, Markus; Franz, Michael; Tan, Qihua; Mollenhauer, Jan; Baumbach, Jan

    2015-10-06

    Electronic laboratory notebooks (ELNs) are more accessible and reliable than their paper-based alternatives and thus find widespread adoption. While a large number of commercial products are available, small- to mid-sized laboratories often cannot afford the costs or are concerned about the longevity of the providers. Turning towards free alternatives, however, raises questions about data protection, which are not sufficiently addressed by available solutions. To serve as legal documents, ELNs must prevent scientific fraud through technical means such as digital signatures. It would also be advantageous if an ELN were integrated with a laboratory information management system to allow for a comprehensive documentation of experimental work, including the location of samples that were used in a particular experiment. Here, we present OpenLabNotes, which adds state-of-the-art ELN capabilities to OpenLabFramework, a powerful and flexible laboratory information management system. In contrast to comparable solutions, it allows users to protect their intellectual property by offering data protection with digital signatures. OpenLabNotes effectively closes the gap between research documentation and sample management, thus making OpenLabFramework more attractive for laboratories that seek to increase productivity through electronic data management.

  9. Service Integration to Enhance Research Data Management: RSpace Electronic Laboratory Notebook Case Study

    Directory of Open Access Journals (Sweden)

    Stuart Macdonald

    2015-02-01

    Full Text Available Research Data Management (RDM) provides a framework that supports researchers and their data throughout the course of their research and is increasingly regarded as one of the essential areas of responsible conduct of research. New tools and infrastructures make possible the generation of large volumes of digital research data in a myriad of formats. This facilitates new ways to analyse, share and reuse these outputs, with libraries, IT services and other service units within academic institutions working together with the research community to develop RDM infrastructures to curate and preserve these types of research outputs and make them re-usable for future generations. Working on the principle that a rationalised and continuous flow of data between systems and across institutional boundaries is one of the core goals of information management, this paper will highlight service integration via Electronic Laboratory Notebooks (ELNs), which streamline research data workflows, result in efficiency gains for researchers, research administrators and other stakeholders, and ultimately enhance the RDM process.

  10. OpenLabNotes - An Electronic Laboratory Notebook Extension for OpenLabFramework.

    Science.gov (United States)

    List, Markus; Franz, Michael; Tan, Qihua; Mollenhauer, Jan; Baumbach, Jan

    2015-09-01

    Electronic laboratory notebooks (ELNs) are more accessible and reliable than their paper-based alternatives and thus find widespread adoption. While a large number of commercial products are available, small- to mid-sized laboratories often cannot afford the costs or are concerned about the longevity of the providers. Turning towards free alternatives, however, raises questions about data protection, which are not sufficiently addressed by available solutions. To serve as legal documents, ELNs must prevent scientific fraud through technical means such as digital signatures. It would also be advantageous if an ELN were integrated with a laboratory information management system to allow for a comprehensive documentation of experimental work, including the location of samples that were used in a particular experiment. Here, we present OpenLabNotes, which adds state-of-the-art ELN capabilities to OpenLabFramework, a powerful and flexible laboratory information management system. In contrast to comparable solutions, it allows users to protect their intellectual property by offering data protection with digital signatures. OpenLabNotes effectively closes the gap between research documentation and sample management, thus making OpenLabFramework more attractive for laboratories that seek to increase productivity through electronic data management.

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  12. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  13. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  17. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes. Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier-1s: response time, data transfer rate and success rate for Tape-to-Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data-taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and to make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid-year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and noted the clear improvements made in site readiness. It also reviewed the policy under which Tier-2s are associated with Physics Groups. Such associations are decided twice per ye...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  20. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  1. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  3. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the role of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  4. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format, and running the samples through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are being completed and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility in the use of resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites. Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month. Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week, with peaks at close to 1.2 PB. Figure 3: The volume of data moved between CMS sites in the last six months. Tape utilisation was a focus for the operations teams, with frequent deletion campaigns of deprecated 7 TeV MC GEN-SIM samples moved to INVALID datasets, which could be cleaned up...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, and we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently. Operations Office Figure 2: Number of events per month, for 2012. Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  7. On Darwin's 'metaphysical notebooks'. I: teleology and the project of a theory.

    Science.gov (United States)

    Calabi, L

    2001-01-01

    Huxley's essay On the Reception of the 'Origin of Species' brings us close to the issue of cause and of why- and how-questions in the understanding of the living world. The present contribution, which is divided into two parts, reviews the problem of Teleology as conceived by Huxley and re-examines Darwin as the author who revealed the existence of a 'foundations problem' in the explanation of an entire realm of nature, i.e., the problem of explaining such a realm in terms of its own, specific legality, or iuxta sua propria principia. In the first part the enquiry is mainly focused on the secularization of natural history after Paley; in the second part it is mainly focused on the desubjectivization of the inquiry into natural history after Erasmus Darwin and Lamarck. The second part will be published in the next issue of Rivista di Biologia/Biology Forum. In the first part below an analysis is made of Notebooks M and N. The author disputes the correctness of conceiving them only as the works where Darwin envisages the 'metaphysical' themes later to become the subject of The Expression of the Emotions. He suggests conceiving of them also as the works where Darwin defines the terms of the general project of his own, peculiar evolutionary theory. The author then outlines the intellectual progress of Darwin from the inosculation to the transmutation hypotheses. Darwin's reading of Malthus appears to be analytically decisive, because it offers him the vantage point to attack the metaphysical and theological citadels on the morphological side. Darwin is thus able to re-consider Erasmus' comprehensive zoonomic project, by displacing it, however, from the old idea of the scala naturae to the new one of the "coral of life", and by emphasising the distinction between "the fittest" and "the best" vs. the tradition of Natural Theology.

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. GlideInWMS and its components are now also deployed at CERN, in addition to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  9. L’impossibile fuga. Soggetto migrante e dinamiche identitarie in The Peruvian notebooks di Braulio Muñoz

    Directory of Open Access Journals (Sweden)

    Rodja Bernardoni

    2011-06-01

    Full Text Available The article aims to examine the novel The Peruvian Notebooks by the Peruvian writer Braulio Muñoz. We proceed by analyzing it through the ideas of the migrant subject and migrant discourse developed by the critic Antonio Cornejo Polar. Starting from the instability and fragmentation of these symbolic and discursive realities, we try to point out how the text investigates and represents the alienating dynamics that, in the migration experience, mark the process of social and identitarian redefinition.

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning, increasing the number of sites that are available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a fourfold increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  11. From documents to datasets: A MediaWiki-based method of annotating and extracting species observations in century-old field notebooks.

    Science.gov (United States)

    Thomer, Andrea; Vaidya, Gaurav; Guralnick, Robert; Bloom, David; Russell, Laura

    2012-01-01

    Part diary, part scientific record, biological field notebooks often contain details necessary to understanding the location and environmental conditions existent during collecting events. Despite their clear value for (and recent use in) global change studies, the text-mining outputs from field notebooks have been idiosyncratic to specific research projects, and impossible to discover or re-use. Best practices and workflows for digitization, transcription, extraction, and integration with other sources are nascent or non-existent. In this paper, we demonstrate a workflow to generate structured outputs while also maintaining links to the original texts. The first step in this workflow was to place already digitized and transcribed field notebooks from the University of Colorado Museum of Natural History founder, Junius Henderson, on Wikisource, an open text transcription platform. Next, we created Wikisource templates to document places, dates, and taxa to facilitate annotation and wiki-linking. We then requested help from the public, through social media tools, to take advantage of volunteer efforts and energy. After three notebooks were fully annotated, content was converted into XML and annotations were extracted and cross-walked into Darwin Core compliant record sets. Finally, these recordsets were vetted, to provide valid taxon names, via a process we call "taxonomic referencing." The result is identification and mobilization of 1,068 observations from three of Henderson's thirteen notebooks and a publishable Darwin Core record set for use in other analyses. Although challenges remain, this work demonstrates a feasible approach to unlock observations from field notebooks that enhances their discovery and interoperability without losing the narrative context from which those observations are drawn."Compose your notes as if you were writing a letter to someone a century in the future."Perrine and Patton (2011).
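    The annotation-and-extraction step described above (wiki templates marking places, dates, and taxa, later cross-walked into Darwin Core fields) can be sketched as follows. This is a minimal illustration, not the project's actual pipeline: the template names (`taxon`, `date`, `place`) and the field mapping are assumptions chosen for the example.

    ```python
    import re

    # A hypothetical transcribed notebook passage annotated with wiki templates;
    # the real Wikisource templates and content may differ.
    PAGE = (
        "{{date|1905-06-12}} Walked along {{place|Boulder Creek}}; "
        "saw two {{taxon|Cyanocitta stelleri}} near the willows."
    )

    # Matches {{name|value}} for the three assumed template names.
    TEMPLATE_RE = re.compile(r"\{\{(taxon|date|place)\|([^}]+)\}\}")

    def extract_darwin_core(text):
        """Cross-walk template annotations into a Darwin Core-style record.

        The mapping below is an illustrative assumption, not the project's schema.
        """
        fields = {"taxon": "scientificName", "date": "eventDate", "place": "locality"}
        record = {}
        for name, value in TEMPLATE_RE.findall(text):
            record[fields[name]] = value.strip()
        # Keep the narrative context by retaining the full sentence,
        # with template markup stripped down to its values.
        record["occurrenceRemarks"] = TEMPLATE_RE.sub(r"\2", text)
        return record

    print(extract_darwin_core(PAGE))
    ```

    Retaining the cleaned narrative alongside the structured fields mirrors the paper's goal of producing machine-readable records without losing the link to the original text.
    
    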

  12. Computing with Mathematica

    CERN Document Server

    Hoft, Margret H

    2002-01-01

    Computing with Mathematica, 2nd edition is engaging and interactive. It is designed to teach readers how to use Mathematica efficiently for solving problems arising in fields such as mathematics, computer science, physics, and engineering. The text moves from simple to complex, often following a specific example on a number of different levels. This gradual increase in complexity allows readers to steadily build their competence without being overwhelmed. The 2nd edition of this acclaimed book features:* An enclosed CD for Mac and Windows that contains the entire text as a collection of Mathematica notebooks* Substantive real-world examples* Challenging exercises, moving from simple to complex* A collection of interactive projects from a variety of applications "I really think this is an almost perfect text." -Stephen Brick, University of South Alabama

  13. Researcher-driven Campaigns Engage Nature's Notebook Participants in Scientific Data Collection

    Science.gov (United States)

    Crimmins, Theresa M.; Elmore, Andrew J.; Huete, Alfredo; Keller, Stephen; Levetin, Estelle; Luvall, Jeffrey; Meyers, Orrin; Stylinski, Cathlyn D.; Van De Water, Peter K.; Vukovic, Ana

    2013-01-01

    One of the many benefits of citizen science projects is the capacity they hold for facilitating data collection on a grand scale and thereby enabling scientists to answer questions they would otherwise not have been able to address. Nature's Notebook, the plant and animal phenology observing program of the USA National Phenology Network (USA-NPN) suitable for scientists and non-scientists alike, offers scientifically vetted data-collection protocols, infrastructure, and mechanisms to quickly reach out to hundreds to thousands of potential contributors. The USA-NPN has recently partnered with several research teams to engage participants in contributing to specific studies. In one example, a team of scientists from NASA, the New Mexico Department of Health, and universities in Arizona, New Mexico, Oklahoma, and California are using juniper phenology observations submitted by Nature's Notebook participants to improve predictions of pollen release and inform asthma and allergy alerts. In a second effort, researchers from the University of Maryland Center for Environmental Science are engaging Nature's Notebook participants in tracking leafing phenophases of poplars across the U.S. These observations will be compared to information acquired via satellite imagery and used to determine geographic areas where the tree species are most and least adapted to predicted climate change. Researchers in these partnerships receive benefits primarily in the form of ground observations. Launched in 2010, the juniper pollen effort has engaged participants in several western states and has yielded thousands of observations that can play a role in model ground validation. Periodic evaluation of these observations has prompted the team to improve and enhance the materials that participants receive, in an effort to boost data quality. The poplar project is formally launching in spring of 2013 and will run for three years; preliminary findings from 2013 will be presented. 
Participants in these

  14. The lost notebook of Enrico Fermi the true story of the discovery of neutron-induced radioactivity

    CERN Document Server

    Guerra, Francesco

    2018-01-01

    This book tells the curious story of an unexpected finding that sheds light on a crucial moment in the development of physics: the discovery of artificial radioactivity induced by neutrons. The finding in question is a notebook, clearly written in Fermi's handwriting, which records the frenzied days and nights that Fermi spent experimenting alone, driven by his theoretical ideas on beta decay. The notebook was found by the authors while browsing through documents left by Oscar D'Agostino, the chemist among Fermi's group. From Fermi's notes, they reconstruct with skill and expertise the detailed timeline of the critical days leading up to his vital discovery. While much is already known about the road that led Fermi to his important result, this is the first time that it has been possible to reconstruct precisely when and how the initial evidence of neutron-induced decay was obtained. In relating this fascinating story, the book will be of great interest not only to those with a passion for the history of scie...

  15. Lessons Learned from the First Two Years of Nature's Notebook, the USA National Phenology Network's Plant and Animal Observation Program

    Science.gov (United States)

    Crimmins, T. M.; Rosemartin, A.; Denny, E. G.; Weltzin, J. F.; Marsh, L.

    2010-12-01

    Nature’s Notebook is the USA National Phenology Network’s (USA-NPN) national-scale plant and animal phenology observation program. The program was launched in March 2009 focusing only on plants; 2010 saw the addition of animals and the name and identity “Nature’s Notebook.” Over these two years, we have learned much about how to effectively recruit, train, and retain participants. We have engaged several thousand participants and can report a retention rate, reflected in the number of registered individuals that report observations, of approximately 25%. In 2009, participants reported observations on 133 species of plants on an average of nine days of the year, resulting in over 151,000 records in the USA-NPN phenology database. Results for the 2010 growing season are still being reported. Some of our most valuable lessons learned have been gleaned from communications with our observers. Through an informal survey, participants indicated that they would like to see more regular and consistent communications from USA-NPN program staff; clear, concise, and readily available training materials; mechanisms to keep them engaged and continuing to participate; and quick turn-around on data summaries. We are using this feedback to shape our program into the future. Another key observation we’ve made about our program is the value of locally and regionally-based efforts to implement Nature’s Notebook; some of our most committed observers are participating through partner programs such as the University of California-Santa Barbara Phenology Stewardship Program, Arbor Day Foundation, and the Great Sunflower Project. Future plans include reaching out to more partner organizations and improving our support for locally-based implementations of the Nature’s Notebook program. We have also recognized that the means for reaching and retaining potential participants in Nature’s Notebook vary greatly across generations. As the majority of our participants to

  16. A Best Practices Notebook for Disaster Risk Reduction and Climate Change Adaptation: Guidance and Insights for Policy and Practice from the CATALYST Project

    NARCIS (Netherlands)

    Hare, M.; Bers, van C.; Mysiak, J.; Calliari, E.; Haque, A.; Warner, K.; Yuzva, K.; Zissener, M.; Jaspers, A.M.J.; Timmerman, J.G.

    2014-01-01

    This publication, A Best Practices Notebook for Disaster Risk Reduction and Climate Change Adaptation: Guidance and Insights for Policy and Practice from the CATALYST Project is one of two main CATALYST knowledge products that focus on the transformative approaches and measures that can support

  17. Cuadernos de Autoformacion en Participacion Social. Principios y Valores. Volumen 1 (Self Instructional Notebooks on Social Participation. Principles and Values. Volume 1).

    Science.gov (United States)

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The series "Self-instructional Notes on Social Participation" is a six-volume series intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…

  18. Cuadernos de Autoformacion en Participacion Social. Para que y para quienes. Primera Edicion (Self-Informational Notebooks on Social Participation. For What and for Whom)? First Edition.

    Science.gov (United States)

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The series "Self-instructional Notes on Social Participation" is a six-volume series intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…

  19. Cuadernos de Autoformacion en Participacion Social: Proyectos del INEA. Volumen 3. Primera Edicion (Self-Instructional Notebooks on Social Participation: INEA Projects. Volume 3. First Edition).

    Science.gov (United States)

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The series "Self-instructional Notes on Social Participation" is a six volume series intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…

  20. Cuadernos de Autoformacion en Participacion Social: Normatividad. Volumen 5. Primera Edicion (Self-Instructional Notebooks on Social Participation: Legal Issues. Volume 5. First Edition).

    Science.gov (United States)

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The series "Self-instructional Notes on Social Participation" is a six volume series intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…

  1. Cuadernos de Autoformacion en Participacion Social: Orientaciones Practicas. Volumen 4. Primera Edicion (Self-Instructional Notebooks on Social Participation: Practical Orientations. Volume 4. First Edition).

    Science.gov (United States)

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The series "Self-instructional Notes on Social Participation" is a six volume series intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…

  2. Cuadernos de Autoformacion en Participacion Social: Metodologia. Volumen 2. Primera Edicion (Self-Instructional Notebooks on Social Participation: Methodology. Volume 2. First Edition).

    Science.gov (United States)

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The series "Self-instructional Notes on Social Participation" is a six-volume series intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…

  3. Promoting healthy computer use among middle school students: a pilot school-based health promotion program.

    Science.gov (United States)

    Ciccarelli, Marina; Portsmouth, Linda; Harris, Courtenay; Jacobs, Karen

    2012-01-01

    Introduction of notebook computers in many schools has become integral to learning. This has increased students' screen-based exposure and the potential risks to physical and visual health. Unhealthy computing behaviours include frequent and long durations of exposure; awkward postures due to inappropriate furniture and workstation layout; and ignoring computer-related discomfort. This article describes the framework for a planned school-based health promotion program to encourage healthy computing behaviours among middle school students. The planned program uses a community-based participatory research approach. Students in Year 7 in 2011 at a co-educational middle school, their parents, and teachers have been recruited. Baseline data were collected on students' knowledge of computer ergonomics, current notebook exposure, and attitudes towards healthy computing behaviours, and on teachers' self-perceived competence to promote healthy notebook use among students and what education they wanted. The health promotion program is being developed by an inter-professional team in collaboration with students, teachers and parents to embed concepts of ergonomics education in relevant school activities and school culture. End-of-year changes in reported and observed student computing behaviours will be used to determine the effectiveness of the program. Building a body of evidence regarding the physical health benefits to students from this school-based ergonomics program can guide policy development on the healthy use of computers within children's educational environments.

  4. N. S. LESKOV’S NOTEBOOK WITH EXTRACTS FROM “PROLOGUE” (THE EXPERIENCE OF TEXTUAL COMMENTS)

    Directory of Open Access Journals (Sweden)

    Inna N. Mineeva

    2016-03-01

    Full Text Available The article, for the first time, provides a detailed textual commentary on N. S. Leskov's notebook with extracts from "Prologue". The extant literary materials include extracts and abstracts from the early printed Prologue, fiction and historical literature of the 19th century, letters of European and Russian scholars and authors (Pushkin A., Tolstoy L., Pigault-Lebrun, Sher I.) devoted to doctrine matters and religious aspects, and a description and analysis of anthropologic categories. The autograph is evidence of the spiritual search and creative experiments of the writer. In the books the writer found endorsement both of his own ideas and of those that required further inner understanding, questioning and emotional upheaval. Meanwhile, studying the history, structure and contents of Prologue in the 1880s, Leskov gained an exceptional existential and creative experience. The greater part of the notebook shows the writer's study of various examples of repentance, atonement, the sudden rebirth of a sinner, active love, the benefits of obedience, the miracle of movement of a saint in space, and the phenomenon of the manifestation of supernatural power and its intervention in the life of a man (God, the Holy Spirit, Angels, etc.). While working with Prologue texts Leskov enunciated some principles of their artistic processing (quoting "crisis", "turning", unusual fragments in the Church Slavonic language, emphasizing key situations by changing the name, specifying the narration, acronyms, graphic intonation). General trends in the understanding of the Prologue source (ideological, imaginative, plot-compositional, stylistic) identified in the notebook are subsequently transformed by the author in a series of "Byzantine Legends", where they receive additional semantic and functional load.

  5. Fear and Fascination in the Big City: Rilke's Use of Georg Simmel in The Notebooks of Malte Laurids Brigge

    Directory of Open Access Journals (Sweden)

    Neil H. Donahue

    1992-06-01

    Full Text Available This essay examines Rainer Maria Rilke's The Notebooks of Malte Laurids Brigge (1910) as one corner in a triangle of reciprocal influence and affinity in early twentieth-century modernity consisting of Rilke, the sociologist Georg Simmel, and the art theorist Wilhelm Worringer. In the notes, this essay documents the biographical relations among the three, but in its text it demonstrates through textual analysis how Rilke's descriptions of Malte in Paris enact Simmel's categories of psychological response for man in the metropolis, as delineated in his essay "The Metropolis and Mental Life" (1903). Rilke's descriptions of Malte's attempts to overcome his fears of the metropolis coincide then with Worringer's thesis in his Abstraction and Empathy (1908) on the psychological origins of abstract art and with Joseph Frank's later elaboration of that thesis into an aesthetics of spatial form.

  6. Good enough practices in scientific computing.

    Science.gov (United States)

    Wilson, Greg; Bryan, Jennifer; Cranston, Karen; Kitzes, Justin; Nederbragt, Lex; Teal, Tracy K

    2017-06-01

    Computers are now essential in all branches of science, but most researchers are never taught the equivalent of basic lab skills for research computing. As a result, data can get lost, analyses can take much longer than necessary, and researchers are limited in how effectively they can work with software and data. Computing workflows need to follow the same practices as lab projects and notebooks, with organized data, documented steps, and the project structured for reproducibility, but researchers new to computing often don't know where to start. This paper presents a set of good computing practices that every researcher can adopt, regardless of their current level of computational skill. These practices, which encompass data management, programming, collaborating with colleagues, organizing projects, tracking work, and writing manuscripts, are drawn from a wide variety of published sources, from our daily lives, and from our work with volunteer organizations that have delivered workshops to over 11,000 people since 2010.
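The project organization these practices recommend (a predictable split of documentation, data, source, and results, plus top-level README and CITATION files) can be sketched as a minimal directory skeleton. The project name `msd-analysis` and the file contents below are hypothetical illustrations, not prescribed by the paper:

```python
from pathlib import Path

# Minimal sketch of a "good enough" project layout: documentation,
# raw data, source code, and generated results kept in separate,
# predictably named directories. "msd-analysis" is a made-up name.
root = Path("msd-analysis")
for sub in ["doc", "data", "src", "results"]:
    (root / sub).mkdir(parents=True, exist_ok=True)

(root / "README.md").write_text("What the project is and how to run it.\n")
(root / "CITATION.md").write_text("How to cite this work.\n")
(root / "data" / "README.md").write_text("Provenance of each raw data file.\n")
(root / "src" / "runall.py").write_text("# Driver script documenting every step\n")

print(sorted(p.name for p in root.iterdir()))
# → ['CITATION.md', 'README.md', 'data', 'doc', 'results', 'src']
```

The point of the fixed layout is that a collaborator (or a future you) can find the data, the code, and the results without asking.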

  7. Graphics supercomputer for computational fluid dynamics research

    Science.gov (United States)

    Liaw, Goang S.

    1994-11-01

    The objective of this project is to purchase a state-of-the-art graphics supercomputer to improve the Computational Fluid Dynamics (CFD) research capability at Alabama A & M University (AAMU) and to support Air Force research projects. A cutting-edge graphics supercomputer system, Onyx VTX, from Silicon Graphics Computer Systems (SGI), was purchased and installed. Other equipment, including a desktop personal computer (a PC-486 DX2 with a built-in 10-BaseT Ethernet card), a 10-BaseT hub, an Apple Laser Printer Select 360, and a notebook computer from Zenith, was also purchased. A reading room was converted into a research computer lab by adding furniture and an air conditioning unit in order to provide an appropriate working environment for researchers and the purchased equipment. All the purchased equipment was successfully installed and is fully functional. Several research projects, including two existing Air Force projects, are being performed using these facilities.

  8. THE EFFECT OF BRAND IMAGE, PRODUCT KNOWLEDGE AND PRODUCT QUALITY ON PURCHASE INTENTION OF NOTEBOOK WITH DISCOUNT PRICE AS MODERATING VARIABLE

    OpenAIRE

    Erida, Erida; Rangkuti, Ari Sonang

    2017-01-01

    The purposes of this study are: (1) to explain the effect of brand image, product knowledge and product quality on purchase intention for the Asus Notebook, and (2) to explain the ability of a discount price to moderate the effect of brand image, product knowledge and product quality on purchase intention. The research was conducted through a survey, with data collected by observation, interviews, and questionnaires delivered to 99 respondents. The research results show that brand image, product knowledge a...

  9. Making sense of monitoring data using Jupyter Notebooks: a case study of dissolved oxygen dynamics across a fresh-estuarine gradient

    Science.gov (United States)

    Nelson, N.; Munoz-Carpena, R.

    2016-12-01

    In the presented exercise, students (advanced undergraduate to graduate) explore dissolved oxygen (DO) dynamics at three locations along a fresh-estuarine gradient of the Lower St. Johns River, FL (USA). Spatiotemporal DO trends along this gradient vary as a function of (1) tidal influence, and (2) biotic productivity (phytoplankton photosynthesis and community respiration). This combination of influences produces distinct DO behavior across each of the three hydrologically connected sites. Through analysis of high-frequency monitoring data, students are encouraged to think critically about the roles of physical and biological drivers of DO, and how the relative importance of these factors can vary among different locations within a single tidal waterbody. Data from each of the three locations along the river are downloaded with CUAHSI HydroClient, and analysis is performed with a Python-enabled Jupyter Notebook that has been specifically created for this assignment. Jupyter Notebooks include annotated code organized into blocks that are executed one at a time; this format is amenable to classroom teaching, and provides an approachable introduction to Python for inexperienced coders. The outputs from each code block (i.e. graphs, tables) are produced within the Jupyter Notebook, thus allowing students to directly interact with the code. Expected student learning outcomes include increased spatial reasoning, as well as greater understanding of DO cycling, spatiotemporal variability in tidal systems, and challenges associated with collecting and evaluating large data sets. Specific technical learning outcomes include coding in Python for data management and analysis using Jupyter Notebooks. This assignment and associated materials are open-access and available on the Science Education Resource Center website.
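The kind of code block such a notebook would contain can be sketched in a few lines of Python. The 30-minute DO series below is a synthetic stand-in (a diurnal sine plus noise), not the assignment's actual HydroClient data; the daily statistics illustrate the amplitude of the diurnal DO swing that students would use as evidence of biotic control:

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for a week of 30-minute DO measurements;
# the real assignment pulls its data via CUAHSI HydroClient.
idx = pd.date_range("2016-07-01", periods=48 * 7, freq="30min")
rng = np.random.default_rng(0)
hours = idx.hour + idx.minute / 60
# Diurnal cycle: photosynthesis raises DO by day, respiration lowers it at night.
do_mgl = 6 + 2 * np.sin(2 * np.pi * (hours - 6) / 24) + rng.normal(0, 0.3, len(idx))
do = pd.Series(do_mgl, index=idx, name="DO_mgL")

# Daily min/max/mean and range: a large daily range points to
# biological (rather than purely physical) control of DO.
daily = do.resample("D").agg(["min", "max", "mean"])
daily["range"] = daily["max"] - daily["min"]
print(daily.round(2))
```

In the classroom version, each such block would be followed by a plot and a prompt asking students to compare the three monitoring sites.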

  10. Requirements analysis notebook for the flight data systems definition in the Real-Time Systems Engineering Laboratory (RSEL)

    Science.gov (United States)

    Wray, Richard B.

    1991-01-01

    A hybrid requirements analysis methodology was developed, based on the practices actually used in developing a Space Generic Open Avionics Architecture. During the development of this avionics architecture, a method of analysis able to effectively define the requirements for this space avionics architecture was developed. In this methodology, external interfaces and relationships are defined, a static analysis resulting in a static avionics model was developed, operating concepts for simulating the requirements were put together, and a dynamic analysis of the execution needs for the dynamic model operation was planned. The systems engineering approach was used to perform a top down modified structured analysis of a generic space avionics system and to convert actual program results into generic requirements. CASE tools were used to model the analyzed system and automatically generate specifications describing the model's requirements. Lessons learned in the use of CASE tools, the architecture, and the design of the Space Generic Avionics model were established, and a methodology notebook was prepared for NASA. The weaknesses of standard real-time methodologies for practicing systems engineering, such as Structured Analysis and Object Oriented Analysis, were identified.

  12. Computational seismology a practical introduction

    CERN Document Server

    Igel, Heiner

    2016-01-01

    This volume is an introductory text to a range of numerical methods used today to simulate time-dependent processes in Earth science, physics, engineering, and many other fields. The physical problem of elastic wave propagation in 1D serves as a model system with which the various numerical methods are introduced and compared. The theoretical background is presented with substantial graphical material supporting the concepts. The results can be reproduced with the supplementary electronic material provided as Python codes embedded in Jupyter notebooks. The volume starts with a primer on the physics of elastic wave propagation, and a chapter on the fundamentals of parallel programming, computational grids, mesh generation, and hardware models. The core of the volume is the presentation of numerical solutions of the wave equation with six different methods: (1) the finite-difference method; (2) the pseudospectral method (Fourier and Chebyshev); (3) the linear finite-element method; (4) the spectral-element meth...
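The first of the six methods, the finite-difference method, can be illustrated with a minimal sketch for the 1D wave equation with fixed boundaries. This is a generic textbook scheme written for this summary, not code taken from the volume's supplementary Jupyter notebooks:

```python
import numpy as np

# Second-order finite-difference scheme for the 1D wave equation
# u_tt = c^2 u_xx in a homogeneous medium with fixed (u = 0) boundaries.
nx, nt = 200, 400
dx, c = 1.0, 1.0
dt = 0.5 * dx / c                     # CFL number 0.5: stable for this scheme
x = np.arange(nx) * dx

u = np.exp(-0.1 * (x - x[nx // 2]) ** 2)   # initial Gaussian displacement
u_old = u.copy()                            # zero initial velocity

for _ in range(nt):
    u_new = np.zeros(nx)                    # endpoints stay 0 (fixed boundaries)
    u_new[1:-1] = (2 * u[1:-1] - u_old[1:-1]
                   + (c * dt / dx) ** 2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
    u, u_old = u_new, u                     # advance one time step

print(f"max |u| after {nt} steps: {np.abs(u).max():.3f}")
```

The initial pulse splits into two half-amplitude waves that reflect (inverted) at the fixed ends; violating the CFL condition above is the classic way such a scheme blows up, which is exactly the kind of trap the volume discusses.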

  13. Using lab notebooks to examine students' engagement in modeling in an upper-division electronics lab course

    Science.gov (United States)

    Stanley, Jacob T.; Su, Weifeng; Lewandowski, H. J.

    2017-12-01

    We demonstrate how students' use of modeling can be examined and assessed using student notebooks collected from an upper-division electronics lab course. The use of models is a ubiquitous practice in undergraduate physics education, but the process of constructing, testing, and refining these models is much less common. We focus our attention on a lab course that has been transformed to engage students in this modeling process during lab activities. The design of the lab activities was guided by a framework that captures the different components of model-based reasoning, called the Modeling Framework for Experimental Physics. We demonstrate how this framework can be used to assess students' written work and to identify how students' model-based reasoning differed from activity to activity. Broadly speaking, we were able to identify the different steps of students' model-based reasoning and assess the completeness of their reasoning. Varying degrees of scaffolding present across the activities had an impact on how thoroughly students would engage in the full modeling process, with more scaffolded activities resulting in more thorough engagement with the process. Finally, we identified that the step in the process with which students had the most difficulty was the comparison between their interpreted data and their model prediction. Students did not use sufficiently sophisticated criteria in evaluating such comparisons, which had the effect of halting the modeling process. This may indicate that in order to engage students further in using model-based reasoning during lab activities, the instructor needs to provide further scaffolding for how students make these types of experimental comparisons. This is an important design consideration for other such courses attempting to incorporate modeling as a learning goal.

  14. ROLE OF COMPUTER ORIENTED LABORATORY TRAINING COURSE IN PHYSICS FOR DEVELOPMENT OF KEY COMPETENCES OF FUTURE ENGINEERS

    Directory of Open Access Journals (Sweden)

    Iryna Slipukhina

    2014-06-01

    Full Text Available In the article, the features of the core competences formed in the course of studying Physics at a Technical University are described. Some features and examples of the use of computer-oriented laboratory work for the formation of the technological competences of engineering students are highlighted. Possible elements of an interactive content notebook integrated with software for the analysis of experimental data are defined.

  15. Uma análise dos atributos importantes no processo de decisão de compra de notebooks utilizando análise fatorial e escalonamento multidimensional.

    Directory of Open Access Journals (Sweden)

    Valter Afonso Vieira

    2006-12-01

    Full Text Available Identifying the attributes that matter in the consumer decision process is an arduous task for marketing professionals, and many market segments require this type of research. In this context, this article aims to identify the attributes consumers consider important when buying a notebook computer. To this end, an exploratory qualitative study was carried out through in-depth interviews with IT professionals and with potential notebook buyers. After content analysis, the results revealed 42 attributes considered in the purchase. In a second stage, a quantitative survey was conducted with a snowball sample of 131 respondents. After exploratory factor analysis, five dimensions were identified, corresponding to the attributes most important to the purchase decision process. The dimensions were labelled pleasure and benefit, device characteristics, performance, caution, and operational. Finally, conclusions and directions for future research are presented and discussed.

  16. a N-D Virtual Notebook about the Basilica of S. Ambrogio in Milan: Information Modeling for the Communication of Historical Phases Subtraction Process

    Science.gov (United States)

    Stanga, C.; Spinelli, C.; Brumana, R.; Oreni, D.; Valente, R.; Banfi, F.

    2017-08-01

    This essay describes the combination of 3D solutions and software techniques with traditional studies and research in order to achieve an integrated digital documentation linking performed surveys, collected data, and historical research. The approach of this study is based on the comparison of survey data with historical research, and on interpretations deduced from a cross-check between the two sources. The case study is the Basilica of S. Ambrogio in Milan, one of the greatest monuments in the city, a pillar of Christianity and of the history of architecture. It is characterized by a complex stratification of phases of restoration and transformation. Rediscovering the great richness of the traditional architectural notebook, which collected surveys and data, this research aims to realize a virtual notebook, based on a 3D model that supports the dissemination of the collected information; it can potentially be understood and accessed by anyone through the development of a mobile app. The 3D model was used to explore the different historical phases, from the most recent layers to the oldest ones, through a virtual subtraction process following the methods of the Archaeology of Architecture. Its components can be imported into parametric software and recognized in both their morphological and typological aspects. The model is based on the concept of LoD and ReverseLoD in order to fit the accuracy required by each step of the research.

  18. The Solar Energy Notebook.

    Science.gov (United States)

    Rankins, William H., III; Wilson, David A.

    This publication is a handbook for the do-it-yourselfer or anyone else interested in solar space and water heating. Described are methods for calculating sun angles, available energy, heating requirements, and solar heat storage. Also described are collector and system designs with mention of some design problems to avoid. Climatological data for…

  19. Cuadernos de Autoformacion en Participacion Social: Educacion con la comunidad. Volumen 6. Primera Edicion (Self-Instructional Notebooks on Social Participation: Education with the Community. Volume 6. First Edition).

    Science.gov (United States)

    Instituto Nacional para la Educacion de los Adultos, Mexico City (Mexico).

    The series "Self-instructional Notes on Social Participation" is a six-volume series intended as teaching aids for adult educators. The theoretical, methodological, informative and practical elements of this series will assist professionals in their work and help them achieve greater success. The specific purpose of each notebook is…

  20. Generating scientific documentation for computational experiments using provenance

    NARCIS (Netherlands)

    Wibisono, A.; Bloem, P.; de Vries, G.K.D.; Groth, P.; Belloum, A.; Bubak, M.; Ludäscher, B.; Plale, B.

    2015-01-01

    Electronic notebooks are a common mechanism for scientists to document and investigate their work. With the advent of tools such as IPython Notebooks and Knitr, these notebooks allow code and data to be mixed together and published online. However, these approaches assume that all work is done in the

  1. Visual ergonomic aspects of glare on computer displays: glossy screens and angular dependence

    Science.gov (United States)

    Brunnström, Kjell; Andrén, Börje; Konstantinides, Zacharias; Nordström, Lukas

    2007-02-01

    Recently, flat-panel computer displays and notebook computers designed with a so-called glare panel, i.e. a highly glossy screen, have emerged on the market. The shiny look of the display appeals to customers, and there are arguments that contrast, colour saturation, etc. improve with a glare panel. LCD displays often suffer from angular-dependent picture quality. This has been made even more pronounced by the introduction of Prism Light Guide plates into displays for notebook computers. The TCO label is the leading labelling system for computer displays. Currently about 50% of all computer displays on the market are certified according to the TCO requirements. The requirements are periodically updated to keep up with technical development and the latest research in e.g. visual ergonomics. The gloss level of the screen and its angular dependence have recently been investigated in user studies. A study of the effect of highly glossy screens compared to matt screens has been performed. The results show a slight advantage for the glossy screen when no disturbing reflections are present, although the difference was not statistically significant. When disturbing reflections are present, the advantage turns into a larger disadvantage, and this difference is statistically significant. Another study, of angular dependence, has also been performed. The results indicate a linear relationship between picture quality and the centre luminance of the screen.

  2. A portable, automated, inexpensive mass and balance calibration system

    International Nuclear Information System (INIS)

    Maxwell, S.L. III; Clark, J.P.

    1987-01-01

    Reliable mass measurements are essential for a nuclear production facility or process control laboratory. DOE Order 5630.2 requires that traceable standards be used to calibrate and monitor equipment used for nuclear material measurements. To ensure the reliability of mass measurements and to comply with DOE traceability requirements, a portable, automated mass and balance calibration system is used at the Savannah River Plant. Automation is achieved using an EPSON HX-20 notebook computer, which can be operated via RS232C interfacing to electronic balances or function with manual data entry if computer interfacing is not feasible. This economical, comprehensive, user-friendly system has three main functions in a mass measurement control program (MMCP): balance certification, calibration of mass standards, and daily measurement of traceable standards. The balance certification program tests for accuracy, precision, sensitivity, linearity, and corner-loading against specific requirements. The mass calibration program allows rapid calibration of inexpensive mass standards traceable to certified Class S standards. This MMCP permits daily measurement of traceable standards to monitor the reliability of balances during routine use. The automated system verifies balance calibration, stores results for future use, and provides a printed control chart of the stored data. Another feature of the system permits three different weighing routines that accommodate varying degrees of reliability in routine weighing operations.
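The RS-232C link described in this record would today amount to a few lines of serial I/O plus a parser for the balance's ASCII responses. The response format and the `SI` command below follow a common electronic-balance convention assumed for illustration; they are not documented behavior of the actual SRP equipment, and `pyserial` is a present-day substitute for the HX-20's BASIC serial routines:

```python
import re

def parse_balance_line(line: str) -> tuple[float, str, bool]:
    """Parse a typical electronic-balance ASCII response, e.g.
    'S S   123.456 g' (stable reading) or 'S D   123.401 g' (still
    drifting). Returns (value, unit, stable). Format is an assumed
    convention, not a documented protocol of the SRP system."""
    m = re.match(r"S\s+([SD])\s+([-+]?\d+\.\d+)\s+(\w+)", line.strip())
    if not m:
        raise ValueError(f"unrecognized balance response: {line!r}")
    flag, value, unit = m.groups()
    return float(value), unit, flag == "S"

# With real hardware one would wrap the parser around a serial port,
# e.g. with pyserial (assumed installed):
#   import serial
#   with serial.Serial("/dev/ttyS0", 9600, timeout=2) as port:
#       port.write(b"SI\r\n")      # request an immediate reading
#       value, unit, stable = parse_balance_line(port.readline().decode())
print(parse_balance_line("S S   123.456 g"))   # → (123.456, 'g', True)
```

Distinguishing stable from drifting readings in software is what lets such a system automate the repeat-weighing routines the record describes.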

  4. Academic writing in a corpus of 4th grade science notebooks: An analysis of student language use and adult expectations of the genres of school science

    Science.gov (United States)

    Esquinca, Alberto

    This is a study of language use in the context of an inquiry-based science curriculum, in which conceptual understanding ratings are used to split texts into groups of "successful" and "unsuccessful" texts; "successful" texts could be expected to include known features of science language. The data sources are 420 texts generated by students in 14 classrooms from three school districts, culled from a prior study on the effectiveness of science notebooks for assessing understanding, together with the aforementioned ratings. In science notebooks, students write in the process of learning (here, a unit on electricity). The analytical framework is systemic functional linguistics (Halliday and Matthiessen, 2004; Eggins, 2004), specifically the concepts of genre, register and nominalization. Genre classification involves an analysis of the purpose and register features of the text (Schleppegrell, 2004). The use of features of the scientific academic register, namely the use of relational processes and nominalization (Halliday and Martin, 1993), requires transitivity analysis and noun analysis. Transitivity analysis, consisting of the identification of the process type, is conducted on 4737 ranking clauses. A manual count of each noun used in the corpus allows for a typology of nouns. Four school science genres are found: procedures, procedural recounts, reports and explanations. Most texts (85.4%) are factual, and 14.1% are classified as explanations, the analytical genre. Logistic regression analysis indicates that there is no significant probability that texts classified as explanations are placed in the group of "successful" texts. In addition, material process clauses predominate in the corpus, followed by relational process clauses. Results of a logistic regression analysis indicate that there is a significant probability (Chi square = 15.23) that such texts are placed in the group of "successful" texts. In addition, 59.5% of 6511 nouns are references to physical materials, followed by references to
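The logistic regression step this record describes (a linguistic feature predicting membership in the "successful" group) can be sketched as follows. The data are synthetic stand-ins: the predictor values and effect size are invented for illustration and are not drawn from the study's corpus:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in: one predictor per text (proportion of relational
# process clauses) and a binary "successful" rating. The assumed positive
# effect mirrors the direction of the study's reported result, not its data.
rng = np.random.default_rng(42)
n = 420                                    # the corpus size reported above
relational = rng.uniform(0, 0.5, n)
logit = -2 + 8 * relational                # invented effect size
successful = rng.random(n) < 1 / (1 + np.exp(-logit))

model = LogisticRegression().fit(relational.reshape(-1, 1), successful)
print(f"coefficient: {model.coef_[0][0]:.2f}")  # positive => feature predicts success
```

A chi-square or likelihood-ratio test on the fitted model is what yields significance statistics of the kind quoted in the abstract.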

  5. Science Notebooks for the 21st Century. Going Digital Provides Opportunities to Learn "with" Technology Rather than "from" Technology

    Science.gov (United States)

    Fulton, Lori; Paek, Seungoh; Taoka, Mari

    2017-01-01

    Students of today are digital natives who for the most part come to school with experiences that may surpass those of their teachers. They use tablet computers and other devices in their personal lives and are eager to use them in the classroom. For teachers, this means they must integrate technology in ways that allow their students to learn with…

  6. Laboratory Sequence in Computational Methods for Introductory Chemistry

    Science.gov (United States)

    Cody, Jason A.; Wiser, Dawn C.

    2003-07-01

    A four-exercise laboratory sequence for introductory chemistry integrating hands-on, student-centered experience with computer modeling has been designed and implemented. The progression builds from exploration of molecular shapes to intermolecular forces and the impact of those forces on chemical separations made with gas chromatography and distillation. The sequence ends with an exploration of molecular orbitals. The students use the computers as a tool; they build the molecules, submit the calculations, and interpret the results. Because of the construction of the sequence and its placement spanning the semester break, good laboratory notebook practices are reinforced and the continuity of course content and methods between semesters is emphasized. The inclusion of these techniques in the first year of chemistry has had a positive impact on student perceptions and student learning.

  7. Epigrafía de Clunia (Burgos en los Cuadernos de Excavación de Blas Taracena = Clunian Epigraphy in Blas Taracena’s Notebooks

    Directory of Open Access Journals (Sweden)

    Javier Del Hoyo Calleja

    2015-03-01

    Full Text Available Blas Taracena undertook several excavation campaigns at Clunia during the first half of the 1930s. His results were never published except for one article dated 1946, focused on the architectural aspects of the structure called house no. 1. However, in the personal notebooks he kept day by day, still unpublished, he gave a full account of the discoveries as they were made. Besides three inscriptions from those notebooks, previously only partially edited, we present two unpublished small altars (árulas) held in the collections of the Museum of Burgos, also the fruit of Taracena's work.

  8. Bioinformatics process management: information flow via a computational journal

    Directory of Open Access Journals (Sweden)

    Lushington Gerald

    2007-12-01

    Full Text Available Abstract This paper presents the Bioinformatics Computational Journal (BCJ), a framework for conducting and managing computational experiments in bioinformatics and computational biology. These experiments often involve series of computations, data searches, filters, and annotations which can benefit from a structured environment. Systems to manage computational experiments exist, ranging from libraries with standard data models to elaborate schemes to chain together input and output between applications. Yet, although such frameworks are available, their use is not widespread; ad hoc scripts are often required to bind applications together. The BCJ explores another solution to this problem through a computer-based environment suitable for on-site use, which builds on the traditional laboratory notebook paradigm. It provides an intuitive, extensible environment designed for expressive composition of applications. Extensive features facilitate sharing data, computational methods, and entire experiments. By focusing on the bioinformatics and computational biology domain, the scope of the computational framework was narrowed, permitting us to implement a capable set of features for this domain. This report discusses the features determined critical by our system and other projects, along with design issues. We illustrate the use of our implementation of the BCJ on two domain-specific examples.

  9. Automated spike preparation system for Isotope Dilution Mass Spectrometry (IDMS)

    International Nuclear Information System (INIS)

    Maxwell, S.L. III; Clark, J.P.

    1990-01-01

    Isotope Dilution Mass Spectrometry (IDMS) is a method frequently employed to measure dissolved, irradiated nuclear materials. A known quantity of a unique isotope of the element to be measured (referred to as the "spike") is added to the solution containing the analyte. The resulting solution is chemically purified and then analyzed by mass spectrometry. By measuring the magnitude of the response for each isotope and the response for the unique spike, then relating this to the known quantity of the spike, the quantity of the nuclear material can be determined. An automated spike preparation system was developed at the Savannah River Site (SRS) to dispense spikes for use in IDMS analytical methods. Prior to this development, technicians weighed each individual spike manually to achieve the accuracy required. This procedure was time-consuming and subjected the master stock solution to evaporation. The new system employs a high-precision SMI Model 300 Unipump dispenser interfaced with an electronic balance and a portable Epson HX-20 notebook computer to automate spike preparation.
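The quantitative step the abstract describes can be sketched in a few lines. The function below is an illustrative simplification, not the SRS implementation: it assumes the spike isotope is essentially absent from the sample and the analyte's reference isotope is absent from the spike, so the measured isotope ratio scales the known spike amount directly.

```python
def idms_amount(spike_amount_mol, measured_ratio):
    """Simplified single-spike isotope dilution.

    spike_amount_mol -- known amount of the unique spike isotope (mol)
    measured_ratio   -- mass-spectrometer signal ratio:
                        analyte isotope / spike isotope

    Assumes no isotopic overlap between spike and sample, so the
    analyte amount is just the ratio times the spike amount.
    """
    return measured_ratio * spike_amount_mol

# Example: a 1.0e-6 mol spike and a measured ratio of 2.5
# implies 2.5e-6 mol of the analyte isotope in the aliquot.
analyte = idms_amount(1.0e-6, 2.5)
```

In a real IDMS measurement the ratio would also be corrected for the isotopic composition of both spike and sample; the point here is only the proportionality to the known spike quantity.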

  10. Integration of computer technology into the medical curriculum: the King's experience

    Directory of Open Access Journals (Sweden)

    Vickie Aitken

    1997-12-01

    Full Text Available Recently, there have been major changes in the requirements of medical education which have set the scene for the revision of medical curricula (Towle, 1991; GMC, 1993). As part of the new curriculum at King's, the opportunity has been taken to integrate computer technology into the course through Computer-Assisted Learning (CAL), and to train graduates in core IT skills. Although the use of computers in the medical curriculum has up to now been limited, recent studies have shown encouraging steps forward (see Boelen, 1995). One area where there has been particular interest is the use of notebook computers to allow students increased access to IT facilities (Maulitz et al., 1996).

  11. Computer assisted instruction in the general chemistry laboratory

    Science.gov (United States)

    Pate, Jerry C.

    This dissertation examines current applications concerning the use of computer technology to enhance instruction in the general chemistry laboratory. The dissertation critiques widely-used educational software, and explores examples of multimedia presentations such as those used in beginning chemistry laboratory courses at undergraduate and community colleges. The dissertation describes a prototype compact disc (CD) used to (a) introduce the general chemistry laboratory, (b) familiarize students with using chemistry laboratory equipment, (c) introduce laboratory safety practices, and (d) provide approved techniques for maintaining a laboratory notebook. Upon completing the CD portion of the pre-lab, students are linked to individual self-help (WebCT) quizzes covering the information provided on the CD. The CD is designed to improve student understanding of basic concepts, techniques, and procedures used in the general chemistry laboratory.

  12. Open Data, Jupyter Notebooks and Geospatial Data Standards Combined - Opening up large volumes of marine and climate data to other communities

    Science.gov (United States)

    Clements, O.; Siemen, S.; Wagemann, J.

    2017-12-01

    The EU-funded EarthServer-2 project aims to offer on-demand access to large volumes of environmental data (Earth Observation, Marine, Climate and Planetary data) via the Web Coverage Service interface standard defined by the Open Geospatial Consortium. Providing access to data via OGC web services (e.g. WCS and WMS) has the potential to open up services to a wider audience, especially to users outside the respective communities. WCS 2.0 in particular, with its processing extension, the Web Coverage Processing Service (WCPS), is highly beneficial for making large data volumes accessible to non-expert communities. Users do not have to deal with custom community data formats, such as GRIB for the meteorological community, but can directly access the data in a format they are more familiar with, such as NetCDF, JSON or CSV. Data requests can further be integrated directly into custom processing routines, and users are no longer required to download gigabytes of data. WCS supports trim (reduction of data extent) and slice (reduction of data dimension) operations on multi-dimensional data, giving users very flexible on-demand access to the data. WCPS allows the user to craft queries to run on the data using a text-based query language similar to SQL. These queries can be very powerful, e.g. condensing a three-dimensional data cube into its two-dimensional mean; however, the more processing-intensive the operation, the more complex the query becomes. As part of the EarthServer-2 project, we developed a Python library that helps users generate complex WCPS queries from Python, a programming language they are more familiar with. The interactive presentation aims to give practical examples of how users can benefit from two specific WCS services from the Marine and Climate communities. Use cases from the two communities will show different approaches to taking advantage of a Web Coverage (Processing) Service. The entire content is available as Jupyter Notebooks, as they prove to be a highly beneficial tool.
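As a concrete illustration of the kind of query generation the abstract mentions, the helpers below assemble a WCPS request that condenses a coverage to its mean. The function names, coverage id and server URL are hypothetical, not the EarthServer-2 library's actual API; the query text itself follows the OGC WCPS language.

```python
from urllib.parse import urlencode

def wcps_mean_query(coverage_id):
    # WCPS query text: iterate over one coverage and reduce it with avg(),
    # condensing the whole (possibly 3-D) cube to a single mean value.
    return f"for c in ({coverage_id}) return avg(c)"

def wcps_request_url(endpoint, query):
    # A WCPS query is typically submitted as the 'query' KVP parameter
    # of an HTTP GET against the server's ProcessCoverages endpoint.
    return endpoint + "?" + urlencode({"query": query})

q = wcps_mean_query("AvgLandTemp")                       # hypothetical coverage id
url = wcps_request_url("https://example.org/rasdaman/ows", q)  # hypothetical endpoint
```

The appeal of wrapping this in Python is that the trim/slice subsetting and reduction expressions can be composed programmatically instead of hand-written as strings.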

  13. The AtChem On-line model and Electronic Laboratory Notebook (ELN): A free community modelling tool with provenance capture

    Science.gov (United States)

    Young, J. C.; Boronska, K.; Martin, C. J.; Rickard, A. R.; Vázquez Moreno, M.; Pilling, M. J.; Haji, M. H.; Dew, P. M.; Lau, L. M.; Jimack, P. K.

    2010-12-01

    AtChem On-line is a simple-to-use zero-dimensional box modelling toolkit, developed for use by laboratory, field and chamber scientists. Any set of chemical reactions can be simulated, in particular the whole Master Chemical Mechanism (MCM) or any subset of it. Parameters and initial data can be provided through a self-explanatory web form, and the resulting model is compiled and run on a dedicated server. The core part of the toolkit, providing a robust solver for thousands of chemical reactions, is written in Fortran and uses the SUNDIALS CVODE libraries. Chemical systems can be constrained at multiple, user-determined timescales; this has enabled studies of radical chemistry at one-minute timescales. AtChem On-line is free to use and requires no installation; a web browser, a text editor and any compression software are all the user needs. CPU and storage are provided by the server (input and output data are saved indefinitely). An off-line version is also being developed, which will provide batch processing, an advanced graphical user interface and post-processing tools, for example Rate of Production Analysis (ROPA) and chain-length analysis. The source code is freely available for advanced users wishing to adapt and run the program locally. Data management, dissemination and archiving are essential in all areas of science. In order to do this in an efficient and transparent way, there is a critical need to capture high-quality metadata/provenance for modelling activities. An Electronic Laboratory Notebook (ELN) has been developed in parallel with AtChem On-line as part of the EC EUROCHAMP-2 project. In order to use controlled chamber experiments to evaluate the MCM, we need to be able to archive, track and search information on all associated chamber model runs, so that they can be used in subsequent mechanism development. Therefore it would be extremely useful if experiment and model metadata/provenance could be easily and automatically stored electronically.
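The essence of a zero-dimensional box model like the one the abstract describes can be shown with a toy two-species system. The sketch below time-steps the single reaction A → B with a forward-Euler update; AtChem itself uses the stiff CVODE solver over thousands of reactions, so this is only a conceptual illustration.

```python
def box_model(a0, k, dt, steps):
    """Integrate the reaction A -> B with rate constant k (s^-1)
    in a well-mixed box using forward Euler."""
    a, b = a0, 0.0
    for _ in range(steps):
        loss = k * a * dt   # amount of A converted during this step
        a -= loss
        b += loss
    return a, b

# 10 s of simulated time at k = 1 s^-1: A decays to roughly exp(-10)
# of its initial value, and total mass A + B is conserved by construction.
a, b = box_model(1.0, 1.0, 0.001, 10_000)
```

Atmospheric mechanisms are stiff (rate constants spanning many orders of magnitude), which is exactly why a production tool delegates this loop to an implicit solver such as CVODE.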

  14. Entre lo impreso y lo manuscrito: viaje por España de la mano de un manual y un cuaderno escolar = Between the printed and the manuscript: a journey through Spain by way of a textbook and a school notebook

    Directory of Open Access Journals (Sweden)

    Elena Fernández Gómez

    2017-06-01

    Full Text Available (ES) Los manuales y cuadernos escolares constituyen una de las fuentes para reconstruir la cultura escrita infantil y conocer tanto el contexto de producción y difusión de la misma como los usos, funciones y significados que los niños y niñas atribuyen a la escritura y a la lectura. A través del análisis comparativo de un manual y de un cuaderno escolar de los años 40, en este artículo nos aproximaremos a la escuela del primer franquismo y a la importancia que en el seno de la misma se le dio a la Geografía y a la Historia. (EN) School textbooks and notebooks are among the main sources for reconstructing children's written culture and for understanding both the context of its production and dissemination and the uses, functions and meanings that children attribute to writing and reading. Through the comparative analysis of a textbook and a school notebook from the 1940s, in this article we approach the school of the early Franco period and the importance it gave to Geography and History.

  15. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    Science.gov (United States)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for storing, computing on and analyzing data. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark, a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user cloud computing infrastructure for hosting GISpark. Virtual storage systems such as HDFS, Ceph and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools for other domains with spatial properties.

  16. The state of ergonomics for mobile computing technology.

    Science.gov (United States)

    Dennerlein, Jack T

    2015-01-01

    Because mobile computing technologies, such as notebook computers, smart mobile phones, and tablet computers, afford users many different configurations through their intended mobility, there is concern about their effects on musculoskeletal pain and a need for usage recommendations. Therefore, the main goal of this paper is to determine which best practices surrounding the use of mobile computing devices can be gleaned from current field and laboratory studies of such devices. An expert review was completed. Field studies have documented the various user configurations, often including non-neutral postures, that users adopt when using mobile technology, along with some evidence suggesting that longer duration of use is associated with more discomfort. It is therefore prudent for users to take advantage of their mobility and not get stuck in any given posture for too long. The use of accessories such as appropriate cases or riser stands, as well as external keyboards and pointing devices, can also improve postures and comfort. Overall, the state of ergonomics for mobile technology is a work in progress and there are more research questions to be addressed.

  17. An Interactive Web-Based Analysis Framework for Remote Sensing Cloud Computing

    Science.gov (United States)

    Wang, X. Z.; Zhang, H. M.; Zhao, J. H.; Lin, Q. H.; Zhou, Y. C.; Li, J. H.

    2015-07-01

    Spatiotemporal data, especially remote sensing data, are widely used in ecological, geographical, agricultural, and military research and applications. With the development of remote sensing technology, more and more remote sensing data are accumulated and stored in the cloud. An effective way for cloud users to access and analyse these massive spatiotemporal data from web clients has become an urgent issue. In this paper, we propose a new scalable, interactive and web-based cloud computing solution for massive remote sensing data analysis. We build a spatiotemporal analysis platform to provide the end user with a safe and convenient way to access massive remote sensing data stored in the cloud. The lightweight cloud storage system used to store public data and users' private data is constructed on an open-source distributed file system. In it, massive remote sensing data are stored as public data, while intermediate and input data are stored as private data. The elastic, scalable, and flexible cloud computing environment is built using Docker, an open-source lightweight cloud computing container technology for the Linux operating system. In the Docker container, open-source software such as IPython, NumPy, GDAL, and GRASS GIS is deployed. Users can write scripts in IPython Notebook web pages through the web browser to process data, and the scripts are submitted to the IPython kernel for execution. By comparing the performance of remote sensing data analysis tasks executed in Docker containers, KVM virtual machines and physical machines, we conclude that the cloud computing environment built with Docker makes the greatest use of the host system resources and can handle more concurrent spatiotemporal computing tasks. Docker provides resource isolation for I/O, CPU, and memory, which offers a security guarantee when processing remote sensing data in the IPython Notebook.
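To give a flavor of the notebook scripts such a platform hosts, the snippet below computes NDVI per pixel. A real notebook in this environment would read bands with GDAL and vectorize the arithmetic with NumPy; this dependency-free sketch with plain lists shows only the arithmetic itself.

```python
def ndvi(nir_band, red_band):
    """Per-pixel Normalized Difference Vegetation Index:
    (NIR - red) / (NIR + red), with 0.0 where both bands are 0."""
    out = []
    for nir, red in zip(nir_band, red_band):
        total = nir + red
        out.append((nir - red) / total if total else 0.0)
    return out

# Healthy vegetation reflects strongly in NIR, so NDVI approaches 1;
# bare surfaces with equal reflectance in both bands give 0.
values = ndvi([0.5, 0.3], [0.1, 0.3])
```

The same loop expressed as `(nir - red) / (nir + red)` on NumPy arrays is what makes the containerized NumPy/GDAL stack worthwhile at raster scale.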

  18. Perl Testing A Developer's Notebook

    CERN Document Server

    Langworth, Ian

    2005-01-01

    Is there any sexier topic in software development than software testing? That is, besides game programming, 3D graphics, audio, high-performance clustering, cool websites, et cetera? Okay, so software testing is low on the list. And that's unfortunate, because good software testing can increase your productivity, improve your designs, raise your quality, ease your maintenance burdens, and help to satisfy your customers, coworkers, and managers. Perl has a strong history of automated tests. A very early release of Perl 1.0 included a comprehensive test suite, and it's only improved from there.

  19. The Electric Company Writers' Notebook.

    Science.gov (United States)

    Children's Television Workshop, New York, NY.

    This handbook outlines the curriculum objectives for the children's television program, "The Electric Company." The first portion of the text delineates strategies for teaching symbol/sound analysis, including units on blends, letter groups, and word structure. A second section addresses strategies for reading for meaning, including…

  20. Evidence-based guidelines for the wise use of computers by children: physical development guidelines.

    Science.gov (United States)

    Straker, L; Maslen, B; Burgess-Limerick, R; Johnson, P; Dennerlein, J

    2010-04-01

    Computer use by children is common and there is concern over the potential impact of this exposure on child physical development. Recently principles for child-specific evidence-based guidelines for wise use of computers have been published and these included one concerning the facilitation of appropriate physical development. This paper reviews the evidence and presents detailed guidelines for this principle. The guidelines include encouraging a mix of sedentary and whole body movement tasks, encouraging reasonable postures during computing tasks through workstation, chair, desk, display and input device selection and adjustment and special issues regarding notebook computer use and carriage, computing skills and responding to discomfort. The evidence limitations highlight opportunities for future research. The guidelines themselves can inform parents and teachers, equipment designers and suppliers and form the basis of content for teaching children the wise use of computers. STATEMENT OF RELEVANCE: Many children use computers and computer-use habits formed in childhood may track into adulthood. Therefore child-computer interaction needs to be carefully managed. These guidelines inform those responsible for children to assist in the wise use of computers.

  1. Implementing and Operating Computer Graphics in the Contemporary Chemistry Education

    Directory of Open Access Journals (Sweden)

    Olga Popovska

    2017-11-01

    Full Text Available Technology plays a crucial role in modern teaching, providing both educators and students with fundamental theoretical insights as well as supporting the interpretation of experimental data. In the long term it gives students a clear stake in their learning processes. Advancing education, furthermore, largely depends on providing valuable experiences and tools through digital and computer literacy. The computer's benefits are no exception in chemistry as a science. The major part of the computer revolution in the chemistry laboratory involves the use of images, diagrams, molecular models, graphs and specialized chemistry programs. In this sense, the teacher can provide more interactive classes and numerous dynamic teaching methods along with advanced technology. All things considered, the aim of this article is to implement interactive teaching methods in chemistry subjects using chemistry computer graphics. A group of students (n = 30) at the age of 18–20 were tested using methods such as brainstorming, demonstration, working in pairs, and writing laboratory notebooks. The results showed that demonstration is the most acceptable interactive method (95%). This article is expected to be of high value to teachers and researchers of chemistry implementing interactive methods and operating computer graphics.

  2. Optical Computing

    OpenAIRE

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  3. Computer group

    International Nuclear Information System (INIS)

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

    The computer group has been reorganized to take charge of the general-purpose computers DEC10 and VAX and the computer network (Dataswitch, DECnet, IBM connections to GSI and IPP, preparation for Datex-P). (orig.)

  4. Computer Engineers.

    Science.gov (United States)

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  5. Computer Music

    Science.gov (United States)

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  6. Battlefield awareness computers: the engine of battlefield digitization

    Science.gov (United States)

    Ho, Jackson; Chamseddine, Ahmad

    1997-06-01

    solution, Computing Devices is planning to develop a notebook-sized military computer designed for space-limited vehicle-mounted applications, as well as a high-performance portable workstation equipped with a 19-inch, full-color, ultra-high-resolution and high-brightness active matrix liquid crystal display (AMLCD) targeting command post and tactical operations center (TOC) applications. Together with the wearable computers Computing Devices developed at the Minneapolis facility for dismounted soldiers, Computing Devices will have a complete suite of interoperable battlefield awareness computers spanning the entire spectrum of battlefield digitization operating environments. Although this paper's primary focus is on a second-generation 'combat ready' battlefield awareness computer, the V3+, this paper also briefly discusses the extension of the V3+ architecture to address the needs of embedded and command post applications.

  7. Analog computing

    CERN Document Server

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  8. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer's point of view, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture.

  9. Digitized molecular diagnostics: reading disk-based bioassays with standard computer drives.

    Science.gov (United States)

    Li, Yunchao; Ou, Lily M L; Yu, Hua-Zhong

    2008-11-01

    We report herein a digital signal readout protocol for screening disk-based bioassays with standard optical drives of ordinary desktop/notebook computers. Three different types of biochemical recognition reactions (biotin-streptavidin binding, DNA hybridization, and protein-protein interaction) were performed directly on a compact disk in a line array format with the help of microfluidic channel plates. Being well-correlated with the optical darkness of the binding sites (after signal enhancement by gold nanoparticle-promoted autometallography), the reading error levels of prerecorded audio files can serve as a quantitative measure of biochemical interaction. This novel readout protocol is about 1 order of magnitude more sensitive than fluorescence labeling/scanning and has the capability of examining multiplex microassays on the same disk. Because no modification to either hardware or software is needed, it promises a platform technology for rapid, low-cost, and high-throughput point-of-care biomedical diagnostics.

  10. Thermal performance of cooling system for a laptop computer using a boiling enhancement microstructure

    International Nuclear Information System (INIS)

    Cho, N. H.; Jeong, W. Y.; Park, S. H.

    2008-01-01

    The increasing heat generation rates in the CPUs of notebook computers motivate research on cooling technologies with low thermal resistance. This paper develops a closed-loop two-phase cooling system using a micropump to circulate a dielectric liquid (PF5060). The cooling system consists of an evaporator containing a boiling enhancement microstructure connected to a condenser with mini fans providing external forced convection. The cooling system is characterized by a parametric study which determines the effects of the volume fill ratio of coolant, the presence of a boiling enhancement microstructure and pump flow rates on the thermal performance of the closed loop. Experimental data show the optimal parameter values, which can dissipate 33.9 W with a film heater maintained at 95 °C.
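The abstract's headline numbers translate into an overall thermal resistance figure via R = ΔT/Q. The ambient temperature below is an assumed 25 °C (the abstract does not state it), so the result is only indicative of the loop's junction-to-ambient performance.

```python
def thermal_resistance(t_hot_c, t_ambient_c, power_w):
    """Overall thermal resistance (K/W) of a cooling path that
    dissipates power_w while the source is held at t_hot_c."""
    return (t_hot_c - t_ambient_c) / power_w

# 95 degC heater, assumed 25 degC ambient, 33.9 W dissipated:
# about 2.1 K/W for the whole evaporator-condenser loop.
r = thermal_resistance(95.0, 25.0, 33.9)
```

A lower R at the same power budget means a cooler CPU, which is the metric the boiling enhancement microstructure is designed to improve.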

  11. Thermal performance of cooling system for a laptop computer using a boiling enhancement microstructure

    Energy Technology Data Exchange (ETDEWEB)

    Cho, N. H.; Jeong, W. Y.; Park, S. H. [Kumoh National Institute of Technology, Gumi (Korea, Republic of)

    2008-07-01

    The increasing heat generation rates in the CPUs of notebook computers motivate research on cooling technologies with low thermal resistance. This paper develops a closed-loop two-phase cooling system using a micropump to circulate a dielectric liquid (PF5060). The cooling system consists of an evaporator containing a boiling enhancement microstructure connected to a condenser with mini fans providing external forced convection. The cooling system is characterized by a parametric study which determines the effects of the volume fill ratio of coolant, the presence of a boiling enhancement microstructure and pump flow rates on the thermal performance of the closed loop. Experimental data show the optimal parameter values, which can dissipate 33.9 W with a film heater maintained at 95 °C.

  12. Quantum Computing

    OpenAIRE

    Scarani, Valerio

    1998-01-01

    The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down to three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  13. Computer program for diagnostic X-ray exposure conversion

    International Nuclear Information System (INIS)

    Lewis, S.L.

    1984-01-01

    Presented is a computer program designed to convert any given set of diagnostic X-ray exposure factors sequentially into another, yielding either an equivalent photographic density or one increased or decreased by a specifiable proportion. In addition to the means to manipulate a set of exposure factors, the facility to print hard (paper) copy is included, enabling the results to be pasted into a notebook and used at any time. This program was originally written as an investigative exercise to examine the potential use of computers for practical radiographic purposes as conventionally encountered. At the same time, its possible use as an educational tool was borne in mind. To these ends, the current version of this program may be used as a means whereby exposure factors used in a diagnostic department are altered to suit a particular requirement, or may be used in the school as a mathematical model to describe the behaviour of exposure factors under manipulation without patient exposure. (author)
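The abstract does not give the program's conversion formulas, but a common textbook approximation makes the idea concrete: at fixed distance, received exposure varies roughly as mAs × kV⁴, so holding density constant when the kV changes means rescaling the mAs by (kV_old / kV_new)⁴. The sketch below implements that approximation, not the author's actual program.

```python
def equivalent_mas(mas_old, kv_old, kv_new):
    """mAs needed at kv_new to keep photographic density roughly
    constant, using the exposure ~ mAs * kV**4 approximation."""
    return mas_old * (kv_old / kv_new) ** 4

# The classic "15% rule": raising kV by about 15% (70 -> 81) lets
# you roughly halve the mAs for the same film density.
new_mas = equivalent_mas(20.0, 70.0, 81.0)
```

A chained converter like the one described would apply such factor adjustments sequentially (kV, mAs, distance, screen speed) and print the resulting technique chart.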

  14. A computational approach to climate science education with CLIMLAB

    Science.gov (United States)

    Rose, B. E. J.

    2017-12-01

    CLIMLAB is a Python-based software toolkit for interactive, process-oriented climate modeling for use in education and research. It is motivated by the need for simpler tools and more reproducible workflows with which to "fill in the gaps" between blackboard-level theory and the results of comprehensive climate models. With CLIMLAB you can interactively mix and match physical model components, or combine simpler process models together into a more comprehensive model. I use CLIMLAB in the classroom to put models in the hands of students (undergraduate and graduate), and emphasize a hierarchical, process-oriented approach to understanding the key emergent properties of the climate system. CLIMLAB is equally a tool for climate research, where the same needs exist for more robust, process-based understanding and reproducible computational results. I will give an overview of CLIMLAB and an update on recent developments, including: a full-featured, well-documented, interactive implementation of a widely used radiation model (RRTM); packaging with conda-forge for compiler-free (and hassle-free!) installation on Mac, Windows and Linux; interfacing with xarray for I/O and graphics with gridded model data; and a rich and growing collection of examples and self-computing lecture notes in Jupyter notebook format.
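The kind of blackboard-level model such a toolkit connects to comprehensive GCMs can be written in a dozen lines. Below is a standard zero-dimensional energy balance model, C·dT/dt = (1 − α)Q − (A + B·T), with conventional textbook parameter values; this is an independent sketch, not CLIMLAB's API or its defaults.

```python
def ebm_equilibrium(q=341.3, albedo=0.3, a=210.0, b=2.0,
                    c=4.0e8, dt=3.15e7, steps=200, t0=0.0):
    """Time-step a zero-dimensional energy balance model.

    q      -- global-mean insolation (W/m^2)
    albedo -- planetary albedo
    a, b   -- linearized OLR = a + b*T (W/m^2, T in degC)
    c      -- heat capacity (J/m^2/K), dt -- step (s, ~1 year)
    """
    t = t0
    for _ in range(steps):
        asr = (1.0 - albedo) * q     # absorbed solar radiation
        olr = a + b * t              # outgoing longwave radiation
        t += dt * (asr - olr) / c    # forward-Euler update
    return t

# Converges to the analytic equilibrium ((1 - albedo)*q - a) / b,
# about 14.5 degC with these parameter values.
t_eq = ebm_equilibrium()
```

In a process-oriented course, students perturb one term at a time, e.g. raising the albedo or lowering A as a greenhouse proxy, and watch the equilibrium shift.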

  15. Computational Medicine

    DEFF Research Database (Denmark)

    Nygaard, Jens Vinge

    2017-01-01

    The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  16. Grid Computing

    Indian Academy of Sciences (India)

    A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  17. Green Computing

    Directory of Open Access Journals (Sweden)

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  18. Quantum computers and quantum computations

    International Nuclear Information System (INIS)

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  19. Quantum Computing for Computer Architects

    CERN Document Server

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  20. Pervasive Computing

    NARCIS (Netherlands)

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as Internet of Things (IoT) and Ubiquitous Computing (Ubicomp) which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  1. Computational vision

    CERN Document Server

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  2. Spatial Computation

    Science.gov (United States)

    2003-12-01

    Computation and today’s microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels... Both Spatial Computation and microkernels break away a relatively monolithic architecture into individual lightweight pieces, well specialized... for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  3. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed. Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn

  4. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  5. Quantum computation

    International Nuclear Information System (INIS)

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into future computers in order to give their components functionality. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  6. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  7. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  8. Enhancing an appointment diary on a pocket computer for use by people after brain injury.

    Science.gov (United States)

    Wright, P; Rogers, N; Hall, C; Wilson, B; Evans, J; Emslie, H

    2001-12-01

    People with memory loss resulting from brain injury benefit from purpose-designed memory aids such as appointment diaries on pocket computers. The present study explores the effects of extending the range of memory aids and including games. For 2 months, 12 people who had sustained brain injury were loaned a pocket computer containing three purpose-designed memory aids: diary, notebook and to-do list. A month later they were given another computer with the same memory aids but a different method of text entry (physical keyboard or touch-screen keyboard). Machine order was counterbalanced across participants. Assessment was by interviews during the loan periods, rating scales, performance tests and computer log files. All participants could use the memory aids and ten people (83%) found them very useful. Correlations among the three memory aids were not significant, suggesting individual variation in how they were used. Games did not increase use of the memory aids, nor did loan of the preferred pocket computer (with physical keyboard). Significantly more diary entries were made by people who had previously used other memory aids, suggesting that a better understanding of how to use a range of memory aids could benefit some people with brain injury.

  9. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  10. Symbolic computation of the Hartree-Fock energy from a chiral EFT three-nucleon interaction at N2LO

    International Nuclear Information System (INIS)

    Gebremariam, B.; Bogner, S.K.; Duguet, T.

    2010-01-01

    We present the first of a two-part Mathematica notebook collection that implements a symbolic approach for the application of the density matrix expansion (DME) to the Hartree-Fock (HF) energy from a chiral effective field theory (EFT) three-nucleon interaction at N2LO. The final output from the notebooks is a Skyrme-like energy density functional that provides a quasi-local approximation to the non-local HF energy. In this paper, we discuss the derivation of the HF energy and its simplification in terms of the scalar/vector-isoscalar/isovector parts of the one-body density matrix. Furthermore, a set of steps is described and illustrated on how to extend the approach to other three-nucleon interactions. Program summary: Program title: SymbHFNNN; Catalogue identifier: AEGC_v1_0; Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEGC_v1_0.html; Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland; Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html; No. of lines in distributed program, including test data, etc.: 96 666; No. of bytes in distributed program, including test data, etc.: 378 083; Distribution format: tar.gz; Programming language: Mathematica 7.1; Computer: Any computer running Mathematica 6.0 and later versions; Operating system: Windows XP, Linux/Unix; RAM: 256 Mb; Classification: 5, 17.16, 17.22; Nature of problem: The calculation of the HF energy from the chiral EFT three-nucleon interaction at N2LO involves tremendous spin-isospin algebra. The problem is compounded by the need to eventually obtain a quasi-local approximation to the HF energy, which requires the HF energy to be expressed in terms of scalar/vector-isoscalar/isovector parts of the one-body density matrix. The Mathematica notebooks discussed in this paper solve the latter issue.
Solution method: The HF energy from the chiral EFT three-nucleon interaction at N 2 LO is cast into a form suitable for an automatic

  11. Organic Computing

    CERN Document Server

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  12. Computational biomechanics

    International Nuclear Information System (INIS)

    Ethier, C.R.

    2004-01-01

    Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)

  13. Computational Composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.

    to understand the computer as a material like any other material we would use for design, like wood, aluminum, or plastic. That as soon as the computer forms a composition with other materials it becomes just as approachable and inspiring as other smart materials. I present a series of investigations of what...... Computational Composite, and Telltale). Through the investigations, I show how the computer can be understood as a material and how it partakes in a new strand of materials whose expressions come to be in context. I uncover some of their essential material properties and potential expressions. I develop a way...

  14. Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling

    Science.gov (United States)

    Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.

    2017-12-01

    Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.
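    The five digital objects enumerated above suggest what a minimal reproducibility manifest might record for a model run. The sketch below is generic Python under stated assumptions: the field names are illustrative and are not HydroShare's actual resource-metadata schema, and content hashes stand in for its provenance tracking.

```python
# Minimal sketch of tracking the five digital objects listed above for a
# reproducible model run. Field names are illustrative, not HydroShare's
# resource-metadata schema; hashes let others verify that published
# inputs and outputs are the ones a run actually used.

import hashlib
import json

def sha256(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def make_manifest(raw, scripts, inputs, results, code, deps):
    """Bundle the five tracked object classes (plus dependencies) with hashes."""
    return {
        "raw_datasets":       {k: sha256(v) for k, v in raw.items()},
        "processing_scripts": {k: sha256(v) for k, v in scripts.items()},
        "model_inputs":       {k: sha256(v) for k, v in inputs.items()},
        "model_results":      {k: sha256(v) for k, v in results.items()},
        "model_code":         {k: sha256(v) for k, v in code.items()},
        "dependencies":       deps,   # e.g. pinned package versions
    }

if __name__ == "__main__":
    manifest = make_manifest(
        raw={"gauge.csv": b"date,flow\n2017-01-01,3.2\n"},     # invented example data
        scripts={"clean.py": b"# drop missing rows ..."},
        inputs={"forcing.nc": b"<binary>"},
        results={"discharge.csv": b"date,sim\n2017-01-01,3.0\n"},
        code={"run_model.sh": b"./model forcing.nc"},
        deps={"python": "3.10"},
    )
    print(json.dumps(sorted(manifest), indent=None))
```

    A manifest of this shape, stored alongside the resources themselves, is one way to make a published run independently verifiable.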

  15. GPGPU COMPUTING

    Directory of Open Access Journals (Sweden)

    BOGDAN OANCEA

    2012-05-01

    Full Text Available Since the first idea of using GPUs for general purpose computing, things have evolved over the years and now there are several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we will focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We will present the benefits of the CUDA programming model. We will also compare the two main approaches, CUDA and AMD APP (STREAM), and the new framework, OpenCL, that tries to unify the GPGPU computing models.
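    The data-based parallelism that CUDA and OpenCL kernels express can be illustrated without a GPU. Below is a plain-Python sketch of SAXPY (y = a·x + y) with one logical "thread" per output element; real CUDA code would be written in C/C++, and Python threads give no actual speedup for CPU-bound work, so this shows only the decomposition of the problem, not the performance.

```python
# The data-parallel pattern a CUDA/OpenCL kernel expresses: every output
# element is computed by an independent logical thread. Here SAXPY,
# emulated with a CPU thread pool (an illustration of the programming
# model only -- no performance claim is intended).

from concurrent.futures import ThreadPoolExecutor

def saxpy(a, x, y, n_workers=4):
    out = [0.0] * len(x)

    def kernel(i):                     # body of the per-element "kernel"
        out[i] = a * x[i] + y[i]

    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        list(pool.map(kernel, range(len(x))))   # one logical thread per index
    return out

if __name__ == "__main__":
    x = [1.0, 2.0, 3.0, 4.0]
    y = [10.0, 20.0, 30.0, 40.0]
    print(saxpy(2.0, x, y))            # [12.0, 24.0, 36.0, 48.0]
```

    On a GPU the same decomposition runs as thousands of hardware threads, which is where the speedups discussed in the paper come from.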

  16. Quantum Computing

    Indian Academy of Sciences (India)

    Quantum Computing – Building Blocks of a Quantum Computer. C S Vijay and Vishal Gupta. General Article, Resonance – Journal of Science Education, Volume 5, Issue 9, September 2000, pp 69-81.

  17. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  18. Quantum Computing

    Indian Academy of Sciences (India)

    In the first part of this article, we had looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and some practical difficulties in building such a device. Quantum Algorithms.

  19. Quantum computing

    OpenAIRE

    Burba, M.; Lapitskaya, T.

    2017-01-01

    This article gives an elementary introduction to quantum computing. It is a draft for a book chapter of the "Handbook of Nature-Inspired and Innovative Computing", Eds. A. Zomaya, G.J. Milburn, J. Dongarra, D. Bader, R. Brent, M. Eshaghian-Wilner, F. Seredynski (Springer, Berlin Heidelberg New York, 2006).

  20. Computational Pathology

    Science.gov (United States)

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  1. Cloud Computing

    DEFF Research Database (Denmark)

    Krogh, Simon

    2013-01-01

    with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production......), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  2. Computability theory

    CERN Document Server

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...

  3. Computational Streetscapes

    Directory of Open Access Journals (Sweden)

    Paul M. Torrens

    2016-09-01

    Full Text Available Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber
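    Of the computational ingredients listed in the review, path-planning heuristics are among the easiest to make concrete. Below is a generic textbook A* search on a small occupancy grid, the kind of routine a simulated pedestrian might use to navigate a streetscape; it is a sketch, not code from any system the review covers, and the grid layout is invented.

```python
# A path-planning heuristic made concrete: A* search on a small occupancy
# grid (0 = walkable, 1 = blocked), using a Manhattan-distance heuristic.
# A generic textbook sketch for illustration.

import heapq

def astar(grid, start, goal):
    """Shortest 4-connected path between two cells, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # admissible heuristic
    frontier = [(h(start), 0, start, [start])]               # (f, cost, cell, path)
    seen = set()
    while frontier:
        _, cost, pos, path = heapq.heappop(frontier)
        if pos == goal:
            return path
        if pos in seen:
            continue
        seen.add(pos)
        r, c = pos
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nxt = (nr, nc)
                heapq.heappush(frontier,
                               (cost + 1 + h(nxt), cost + 1, nxt, path + [nxt]))
    return None  # no route exists

if __name__ == "__main__":
    street = [[0, 0, 0],
              [1, 1, 0],   # a blocked row: the pedestrian must detour
              [0, 0, 0]]
    print(astar(street, (0, 0), (2, 0)))
```

    Simulation frameworks layer steering, kinematics, and collision avoidance (also named above) on top of a coarse planner like this one.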

  4. Chalk and computers

    DEFF Research Database (Denmark)

    Rasmussen, Lisa Rosén

    Since 1970 school books have first been supplemented by photocopies and later PDF files and the use of Internet sites. Chalkboards have been replaced by Smart Boards and notebooks by laptops and IPADS. Digital media has made its way into the classroom and into everyday school life. This has been highly connected to technological innovation that across the period has inspired hope as well as fear in teachers, pupils and parents. I take my starting point in the changing teaching aids of everyday school life to analyse how the technological development has been dealt with in the Danish school in the period 1970-2011. I wish to discuss how the analysis can benefit from a focus on the parallel introduction of thoughts concerning children’s culture, the competent child and the linkage of ‘Play & learn’. Looking at everyday life I also aim at discussing how the introduction of the new teaching aids has...

  5. SciServer Compute brings Analysis to Big Data in the Cloud

    Science.gov (United States)

    Raddick, Jordan; Medvedev, Dmitry; Lemson, Gerard; Souter, Barbara

    2016-06-01

    SciServer Compute uses Jupyter Notebooks running within server-side Docker containers attached to big data collections to bring advanced analysis to big data "in the cloud." SciServer Compute is a component in the SciServer Big-Data ecosystem under development at JHU, which will provide a stable, reproducible, sharable virtual research environment. SciServer builds on the popular CasJobs and SkyServer systems that made the Sloan Digital Sky Survey (SDSS) archive one of the most-used astronomical instruments. SciServer extends those systems with server-side computational capabilities and very large scratch storage space, and further extends their functions to a range of other scientific disciplines. Although big datasets like SDSS have revolutionized astronomy research, for further analysis, users are still restricted to downloading the selected data sets locally - but increasing data sizes make this local approach impractical. Instead, researchers need online tools that are co-located with data in a virtual research environment, enabling them to bring their analysis to the data. SciServer supports this using the popular Jupyter notebooks, which allow users to write their own Python and R scripts and execute them on the server with the data (extensions to Matlab and other languages are planned). We have written special-purpose libraries that enable querying the databases and other persistent datasets. Intermediate results can be stored in large scratch space (hundreds of TBs) and analyzed directly from within Python or R with state-of-the-art visualization and machine learning libraries. Users can store science-ready results in their permanent allocation on SciDrive, a Dropbox-like system for sharing and publishing files. Communication between the various components of the SciServer system is managed through SciServer's new Single Sign-on Portal. We have created a number of demos to illustrate the capabilities of SciServer Compute, including Python and R scripts

  6. AN INTERACTIVE WEB-BASED ANALYSIS FRAMEWORK FOR REMOTE SENSING CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    X. Z. Wang

    2015-07-01

    Full Text Available Spatiotemporal data, especially remote sensing data, are widely used in ecological, geographical, agricultural, and military research and applications. With the development of remote sensing technology, more and more remote sensing data are accumulated and stored in the cloud. An effective way for cloud users to access and analyse these massive spatiotemporal data in web clients has become an urgent need. In this paper, we propose a new scalable, interactive, web-based cloud computing solution for massive remote sensing data analysis. We build a spatiotemporal analysis platform to provide end-users with a safe and convenient way to access massive remote sensing data stored in the cloud. The lightweight cloud storage system used to store public data and users’ private data is constructed on an open-source distributed file system: massive remote sensing data are stored as public data, while intermediate and input data are stored as private data. The elastic, scalable, and flexible cloud computing environment is built using Docker, an open-source lightweight container technology for the Linux operating system. In the Docker container, open-source software such as IPython, NumPy, GDAL, and GRASS GIS is deployed. Users can write scripts in the IPython Notebook web page through the web browser to process data, and the scripts are submitted to the IPython kernel for execution. By comparing the performance of remote sensing data analysis tasks executed in Docker containers, KVM virtual machines, and physical machines respectively, we conclude that the cloud computing environment built with Docker makes the greatest use of the host system resources and can handle more concurrent spatiotemporal computing tasks.
    Docker technology provides resource isolation mechanisms for I/O, CPU, and memory, which offers a security guarantee when processing remote sensing data in the IPython Notebook.

  7. Heat-driven liquid metal cooling device for the thermal management of a computer chip

    Energy Technology Data Exchange (ETDEWEB)

    Ma Kunquan; Liu Jing [Cryogenic Laboratory, PO Box 2711, Technical Institute of Physics and Chemistry, Chinese Academy of Sciences, Beijing 100080 (China)

    2007-08-07

    The tremendous heat generated in a computer chip or very large scale integrated circuit raises many challenging issues to be solved. Recently, liquid metal with a low melting point was established as the most conductive coolant for efficiently cooling a computer chip. Here, by making full use of two merits of the liquid metal, superior heat transfer performance and electromagnetic drivability, we demonstrate for the first time a liquid-cooling concept for the thermal management of a computer chip that uses waste heat to power a thermoelectric generator (TEG) and thus drive the flow of the liquid metal. Such a device consumes no external net energy, which makes it a self-supporting and completely silent liquid-cooling module. Experiments on devices driven by one- or two-stage TEGs show that a dramatic temperature drop on the simulated chip is realized without the aid of any fans. The higher the heat load, the larger the temperature decrease produced by the cooling device. Further, the two TEGs generate a larger current if a copper plate is sandwiched between them to enhance heat dissipation there. This new method is expected to be significant in the future thermal management of desktop and notebook computers, where both efficient cooling and extremely low energy consumption are major concerns.
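The energy chain of the device, waste heat to TEG voltage to electromagnetic pumping of the liquid metal, can be illustrated with a back-of-envelope calculation. Every parameter value below is an assumption chosen only for illustration, not a figure from the paper; the formulas are the standard Seebeck relation and the pressure head of a DC conduction electromagnetic pump.

```python
# Illustrative estimate of the self-powered liquid metal cooling loop.
# All numbers are assumed, not taken from the paper.
n_couples = 127      # thermocouple pairs in the TEG module (assumed)
seebeck = 200e-6     # Seebeck coefficient per couple, V/K (assumed)
delta_t = 30.0       # temperature difference across the TEG, K (assumed)
r_teg = 3.0          # TEG internal resistance, ohm (assumed)
r_pump = 3.0         # pump circuit resistance, ohm (assumed)
b_field = 0.5        # magnetic flux density in the pump channel, T (assumed)
channel_h = 2e-3     # pump channel height, m (assumed)

v_open = n_couples * seebeck * delta_t        # Seebeck open-circuit voltage, V
current = v_open / (r_teg + r_pump)           # current through the pump, A
delta_p = b_field * current / channel_h       # DC conduction pump head, Pa

print(f"{v_open:.3f} V, {current:.3f} A, {delta_p:.1f} Pa")
```

Even such modest fractions of a volt can circulate a liquid metal: its high electrical conductivity lets a small Lorentz force drive the loop with no moving parts at all.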

  8. Heat-driven liquid metal cooling device for the thermal management of a computer chip

    International Nuclear Information System (INIS)

    Ma Kunquan; Liu Jing

    2007-01-01

    The tremendous heat generated in a computer chip or very large scale integrated circuit raises many challenging issues to be solved. Recently, liquid metal with a low melting point was established as the most conductive coolant for efficiently cooling a computer chip. Here, by making full use of two merits of the liquid metal, superior heat transfer performance and electromagnetic drivability, we demonstrate for the first time a liquid-cooling concept for the thermal management of a computer chip that uses waste heat to power a thermoelectric generator (TEG) and thus drive the flow of the liquid metal. Such a device consumes no external net energy, which makes it a self-supporting and completely silent liquid-cooling module. Experiments on devices driven by one- or two-stage TEGs show that a dramatic temperature drop on the simulated chip is realized without the aid of any fans. The higher the heat load, the larger the temperature decrease produced by the cooling device. Further, the two TEGs generate a larger current if a copper plate is sandwiched between them to enhance heat dissipation there. This new method is expected to be significant in the future thermal management of desktop and notebook computers, where both efficient cooling and extremely low energy consumption are major concerns.

  9. ClimateSpark: An in-memory distributed computing framework for big climate data analytics

    Science.gov (United States)

    Hu, Fei; Yang, Chaowei; Schnase, John L.; Duffy, Daniel Q.; Xu, Mengchao; Bowen, Michael K.; Lee, Tsengdar; Song, Weiwei

    2018-06-01

    The unprecedented growth of climate data creates new opportunities for climate studies, and yet big climate data pose a grand challenge to climatologists seeking to manage and analyze them efficiently. The complexity of climate data content and analytical algorithms increases the difficulty of implementing algorithms on high performance computing systems. This paper proposes an in-memory, distributed computing framework, ClimateSpark, to facilitate complex big data analytics and time-consuming computational tasks. A chunked data structure improves parallel I/O efficiency, while a spatiotemporal index is built over the chunks to avoid unnecessary data reading and preprocessing. An integrated, multi-dimensional, array-based data model (ClimateRDD) and ETL operations are developed to address the variety of big climate data by integrating the processing components of the climate data lifecycle. ClimateSpark utilizes Spark SQL and Apache Zeppelin to provide a web portal that facilitates the interaction among climatologists, climate data, analytic operations, and computing resources (e.g., via SQL queries and Scala/Python notebooks). Experimental results show that ClimateSpark conducts different spatiotemporal data queries/analytics with high efficiency and data locality. ClimateSpark is easily adaptable to other big multi-dimensional, array-based datasets in various geoscience domains.
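The chunk index at the heart of this design can be sketched in a few lines of plain Python. This is a conceptual illustration of the idea, not ClimateSpark code (the real index lives alongside Spark and HDFS): chunks of a gridded variable are keyed by time and by coarse latitude/longitude bands, so a query touches only the chunks it overlaps instead of scanning the whole archive.

```python
from collections import defaultdict

CHUNK_DEG = 10  # each chunk covers a 10 x 10 degree tile (assumed chunking)

def chunk_key(year, lat, lon):
    """Spatiotemporal key: one year, one coarse lat band, one coarse lon band."""
    return (year, int(lat // CHUNK_DEG), int(lon // CHUNK_DEG))

index = defaultdict(list)  # key -> ids of stored chunks

def register_chunk(year, lat, lon, chunk_id):
    index[chunk_key(year, lat, lon)].append(chunk_id)

def query(year, lat_min, lat_max, lon_min, lon_max):
    """Return ids of chunks overlapping the query box, skipping the rest."""
    hits = []
    for lat_band in range(int(lat_min // CHUNK_DEG), int(lat_max // CHUNK_DEG) + 1):
        for lon_band in range(int(lon_min // CHUNK_DEG), int(lon_max // CHUNK_DEG) + 1):
            hits.extend(index[(year, lat_band, lon_band)])
    return hits

register_chunk(2000, 35.0, -100.0, "chunk-A")   # central North America
register_chunk(2000, 62.0, 10.0, "chunk-B")     # Scandinavia
print(query(2000, 30, 39, -109, -91))           # only chunk-A is read: ['chunk-A']
```

The same keying discipline is what yields data locality: tasks can be scheduled on the nodes that already hold the matching chunks.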

  10. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility, and right to a place in Russian education. Methods. The research is based on an analysis of foreign studies of the phenomenon of computational thinking and of the ways it is formed in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analysed from the point of view of intuitive understanding and of its scientific and applied aspects. It is shown how computational thinking has evolved with the development of computer hardware and software. The practice-oriented interpretation of computational thinking that is dominant among educators is described, along with some ways of forming it. It is shown that computational thinking is both a metasubject result of general education and a tool of that education. In the author's view, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described, a process connected with the evolution of computer and information technologies and the growing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  11. QDENSITY—A Mathematica quantum computer simulation

    Science.gov (United States)

    Juliá-Díaz, Bruno; Burdis, Joseph M.; Tabakin, Frank

    2009-03-01

    This Mathematica 6.0 package is a simulation of a Quantum Computer. The program provides a modular, instructive approach for generating the basic elements that make up a quantum circuit. The main emphasis is on using the density matrix, although an approach using state vectors is also implemented in the package. The package commands are defined in Qdensity.m which contains the tools needed in quantum circuits, e.g., multiqubit kets, projectors, gates, etc. New version program summaryProgram title: QDENSITY 2.0 Catalogue identifier: ADXH_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADXH_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 26 055 No. of bytes in distributed program, including test data, etc.: 227 540 Distribution format: tar.gz Programming language: Mathematica 6.0 Operating system: Any which supports Mathematica; tested under Microsoft Windows XP, Macintosh OS X, and Linux FC4 Catalogue identifier of previous version: ADXH_v1_0 Journal reference of previous version: Comput. Phys. Comm. 174 (2006) 914 Classification: 4.15 Does the new version supersede the previous version?: Offers an alternative, more up to date, implementation Nature of problem: Analysis and design of quantum circuits, quantum algorithms and quantum clusters. Solution method: A Mathematica package is provided which contains commands to create and analyze quantum circuits. Several Mathematica notebooks containing relevant examples: Teleportation, Shor's Algorithm and Grover's search are explained in detail. A tutorial, Tutorial.nb is also enclosed. 
Reasons for new version: The package has been updated to make it fully compatible with Mathematica 6.0 Summary of revisions: The package has been updated to make it fully compatible with Mathematica 6.0 Running time: Most examples
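The density-matrix approach the package is built around can be illustrated outside Mathematica. The NumPy sketch below is not part of QDENSITY; it only shows the core rule the package implements, that a gate U transforms a state as ρ → UρU†, with measurement probabilities read off the diagonal of ρ.

```python
import numpy as np

# One-qubit circuit in the density-matrix picture: apply a Hadamard to |0><0|.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)        # Hadamard gate
rho0 = np.array([[1, 0], [0, 0]], dtype=complex)    # density matrix |0><0|

rho1 = H @ rho0 @ H.conj().T                        # rho -> U rho U^dagger

probs = np.real(np.diag(rho1))   # computational-basis measurement probabilities
print(np.round(probs, 3))        # [0.5 0.5]: both outcomes equally likely
```

The same rule extends to multiqubit kets, projectors, and gates via tensor products, which is exactly the toolbox Qdensity.m provides.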

  12. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  13. Computational physics

    CERN Document Server

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.
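One of the techniques the book names, numerical quadrature, fits in a dozen lines of Python. This sketch (composite Simpson's rule, a standard method rather than code from the book) integrates sin x over [0, π], whose exact value is 2:

```python
import math

def simpson(f, a, b, n=100):
    """Composite Simpson's rule on n subintervals (n is forced even)."""
    if n % 2:
        n += 1
    h = (b - a) / n
    total = f(a) + f(b)
    for i in range(1, n):
        total += f(a + i * h) * (4 if i % 2 else 2)
    return total * h / 3

approx = simpson(math.sin, 0.0, math.pi)
print(abs(approx - 2.0) < 1e-6)   # True: the rule converges as O(h^4)
```

Finite-difference derivatives and the fast Fourier transform, the book's other staples, follow the same pattern of replacing a continuous operation with arithmetic on sampled values.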

  14. Computational physics

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  15. Cloud Computing

    CERN Document Server

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  16. Computational Viscoelasticity

    CERN Document Server

    Marques, Severino P C

    2012-01-01

    This text is a guide to solving problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on the codes' structure and use, data preparation, and output interpretation and verification. The first part of the book introduces the reader to the subject and provides the models, equations, and notation to be used in the computational applications. The second part presents the most important computational techniques, finite element formulation and boundary element formulation, and the solution of viscoelastic problems with Abaqus.

  17. Optical computing.

    Science.gov (United States)

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.
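The correlative pattern recognition mentioned here has a direct digital analogue: the peak of a cross-correlation marks where a template matches. An optical correlator performs this with lenses at the speed of light; the NumPy sketch below (an illustration, not tied to any system in the record) performs the same computation via FFTs on a synthetic one-dimensional signal.

```python
import numpy as np

# Matched filtering: locate a known template buried in a noisy signal.
rng = np.random.default_rng(1)
signal = rng.normal(0.0, 0.1, 256)                 # background noise
template = np.array([1.0, 2.0, 3.0, 2.0, 1.0])     # pattern to find
offset = 100
signal[offset:offset + len(template)] += template  # embed the pattern

# Circular cross-correlation computed in the frequency domain.
n = len(signal)
corr = np.fft.irfft(np.fft.rfft(signal) * np.conj(np.fft.rfft(template, n)), n)
print(int(np.argmax(corr)))   # 100: the correlation peak recovers the offset
```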

  18. Computational physics

    International Nuclear Information System (INIS)

    Anon.

    1987-01-01

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October

  19. Phenomenological Computation?

    DEFF Research Database (Denmark)

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) Its basic concept of natural computing has neither been defined theoretically nor implemented practically. (2) It cannot encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary. (3) Philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster’s second-order...

  20. Essentials of cloud computing

    CERN Document Server

    Chandrasekaran, K

    2014-01-01

    Foreword; Preface; Computing Paradigms; Learning Objectives; Preamble; High-Performance Computing; Parallel Computing; Distributed Computing; Cluster Computing; Grid Computing; Cloud Computing; Biocomputing; Mobile Computing; Quantum Computing; Optical Computing; Nanocomputing; Network Computing; Summary; Review Points; Review Questions; Further Reading; Cloud Computing Fundamentals; Learning Objectives; Preamble; Motivation for Cloud Computing; The Need for Cloud Computing; Defining Cloud Computing; NIST Definition of Cloud Computing; Cloud Computing Is a Service; Cloud Computing Is a Platform; 5-4-3 Principles of Cloud Computing; Five Essential Charact

  1. Personal Computers.

    Science.gov (United States)

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  2. Computational Literacy

    DEFF Research Database (Denmark)

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered...

  3. Computing Religion

    DEFF Research Database (Denmark)

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers (1) an intermediary step between any theoretical construct and its targeted empirical space and (2) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project, we also seek to introduce to scholars of religion some...

  4. Computational Controversy

    NARCIS (Netherlands)

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have

  5. Grid Computing

    Indian Academy of Sciences (India)

    IAS Admin

    emergence of supercomputers led to the use of computer simulation as an .... Scientific and engineering applications (e.g., Tera grid secure gate way). Collaborative ... Encryption, privacy, protection from malicious software. Physical Layer.

  6. Development of a research prototype computer 'Wearables' that one can wear on his or her body; Minitsukeru computer 'Wearables' kenkyuyo shisakuki wo kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-02-01

    A prototype of a wearable computer, 'Wearables', has been developed that is still smaller than the present notebook PC, can be worn on the body for use at any time and from anywhere, and aims at becoming part of the social infrastructure. It is based on the company's portable PC, the Libretto, with the keyboard and liquid crystal display panel removed. To replace these functions, a voice-input microphone and various head-mounted (glasses-type) displays for viewing images are connected. Infrared interfaces and wireless (radio) data communication serve as the means of information exchange between the prototype computer and its surroundings. A wireless desk area network (DAN) technology that can dynamically structure a network between multiple computers enables smooth communication with external environments, while noise-robust voice recognition realizes keyboard-free operation that imposes no mental stress on users. The wearable computer aims not merely at being worn, but at providing a new perceptual ability, letting users see and hear what they could not perceive directly before, that is, realizing digital sensation. With such computers, a society can be built in which people live comfortably and safely, sustaining conversation between users and computers and interaction between the surrounding environment and the social infrastructure, with individual privacy and information security taken into consideration. The company is working with the Massachusetts Institute of Technology (MIT) on research and development of the wearable computer, on how it can be utilized, and on the basic technologies that will be required in the future. (translated by NEDO)

  7. Computer tomographs

    International Nuclear Information System (INIS)

    Niedzwiedzki, M.

    1982-01-01

    The physical foundations of, and developments in, transmission and emission computed tomography are presented. On the basis of the available literature and private communications, the various transmission tomographs are compared. A technique of emission computed tomography (ECT), new to Poland, is described, and two ECT methods, positron emission tomography and single photon emission tomography, are evaluated. (author)

  8. Computational sustainability

    CERN Document Server

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of the state of the art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  9. Computing farms

    International Nuclear Information System (INIS)

    Yeh, G.P.

    2000-01-01

    High-energy physics, nuclear physics, space sciences, and many other fields have large challenges in computing. In recent years, PCs have achieved performance comparable to the high-end UNIX workstations, at a small fraction of the price. We review the development and broad applications of commodity PCs as the solution to CPU needs, and look forward to the important and exciting future of large-scale PC computing

  10. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials, with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  11. Computational creativity

    Directory of Open Access Journals (Sweden)

    López de Mántaras Badia, Ramon

    2013-12-01

    New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  12. Efficient scatter model for simulation of ultrasound images from computed tomography data

    Science.gov (United States)

    D'Amato, J. P.; Lo Vercio, L.; Rubi, P.; Fernandez Vera, E.; Barbuzza, R.; Del Fresno, M.; Larrabide, I.

    2015-12-01

    Background and motivation: Real-time ultrasound simulation refers to the process of computationally creating fully synthetic ultrasound images instantly. Given the high value of specialized low-cost training for healthcare professionals, there is growing interest in this technology and in the development of high-fidelity systems that simulate the acquisition of echographic images. The objective is to create an efficient and reproducible simulator that can run on either notebooks or desktops using low-cost devices. Materials and methods: We present an interactive ultrasound simulator based on CT data. The simulator uses ray-casting and provides real-time interaction capabilities. The simulation of scattering coherent with the transducer position in real time is also introduced. Such noise is produced using a simplified model of multiplicative noise and convolution with point spread functions (PSF) tailored for this purpose. Results: The computational efficiency of scattering-map generation was revised, with improved performance. This allowed a more efficient simulation of coherent scattering in the synthetic echographic images while providing highly realistic results. We describe quality and performance metrics used to validate these results; a performance of up to 55 fps was achieved. Conclusion: The proposed technique for real-time scattering modeling provides realistic yet computationally efficient scatter distributions. The error between the original image and the simulated scattering image was compared for the proposed method and the state of the art, showing negligible differences in its distribution.
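The scatter model described, multiplicative noise followed by convolution with a PSF, can be sketched with NumPy. This is a simplified illustration of the stated approach, not the authors' code; the Rayleigh noise statistics and the Gaussian PSF below are assumptions standing in for the paper's tailored PSFs.

```python
import numpy as np

# Speckle synthesis: echogenicity map (derived from CT in the real simulator)
# times multiplicative noise, then convolution with a point spread function.
rng = np.random.default_rng(42)
echogenicity = np.ones((128, 128))        # stand-in for the CT-derived map
noise = rng.rayleigh(scale=1.0, size=echogenicity.shape)
scatter = echogenicity * noise            # multiplicative noise model

# Separable Gaussian PSF, applied by FFT convolution.
x = np.arange(-8, 9)
psf_1d = np.exp(-x**2 / (2 * 2.0**2))
psf = np.outer(psf_1d, psf_1d)
psf /= psf.sum()

speckle = np.real(np.fft.ifft2(np.fft.fft2(scatter) * np.fft.fft2(psf, scatter.shape)))
print(speckle.std() < scatter.std())   # True: the PSF correlates neighbouring pixels
```

Because both steps are per-pixel products and FFTs, the texture can be regenerated every frame, which is what makes scattering coherent with the transducer position feasible in real time.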

  13. x-y-recording in transmission electron microscopy. A versatile and inexpensive interface to personal computers with application to stereology.

    Science.gov (United States)

    Rickmann, M; Siklós, L; Joó, F; Wolff, J R

    1990-09-01

    An interface for IBM XT/AT-compatible computers is described which has been designed to read the actual specimen stage position of electron microscopes. The complete system consists of (i) optical incremental encoders attached to the x- and y-stage drivers of the microscope, (ii) two keypads for operator input, (iii) an interface card fitted to the bus of the personal computer, (iv) a standard configuration IBM XT (or compatible) personal computer, optionally equipped with (v) an HP Graphic Language controllable colour plotter. The small size of the encoders and their connection to the stage drivers by simple ribbed belts allow easy adaptation of the system to most electron microscopes. Operation of the interface card itself is supported by any high-level language available for personal computers. Owing to the modular concept of these languages, the system can be customized to various applications, and no computer expertise is needed for actual operation. The present configuration offers an inexpensive attachment which covers a wide range of applications, from a simple notebook to high-resolution (200-nm) mapping of tissue. Since section coordinates can be processed in real time, stereological estimates can be derived directly "on microscope". This is exemplified by an application in which particle numbers were determined by the disector method.
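The coordinate read-out at the core of such a system reduces to a linear scaling from encoder counts to stage travel. The sketch below is hypothetical: the calibration constants are assumed for illustration and are not the paper's values.

```python
# Converting incremental-encoder counts to specimen-stage coordinates.
# Calibration constants are assumed for illustration, not the paper's values.
COUNTS_PER_REV = 2000   # encoder counts per revolution (assumed)
BELT_RATIO = 1.0        # ribbed-belt ratio, encoder : stage driver (assumed)
UM_PER_REV = 250.0      # stage travel per driver revolution, um (assumed)

def stage_position_um(counts_x, counts_y):
    """Map raw quadrature counts on both axes to (x, y) in micrometres."""
    scale = UM_PER_REV * BELT_RATIO / COUNTS_PER_REV   # micrometres per count
    return counts_x * scale, counts_y * scale

print(stage_position_um(1600, -400))   # (200.0, -50.0)
```

Because the host program only reads a count register through the interface card, any high-level language can apply this scaling and log positions in real time for stereological procedures such as the disector.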

  14. Quantum computing

    International Nuclear Information System (INIS)

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. 
The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from
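The review's classical starting point, Shannon's theory, is easy to make concrete: the entropy H(X) = -Σ p·log2 p gives the minimum average number of bits per symbol. A short Python illustration (standard textbook material, not code from the review):

```python
import math

def shannon_entropy(probs):
    """H(X) = -sum(p * log2 p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))             # 1.0 bit: a fair coin
print(shannon_entropy([0.25] * 4))             # 2.0 bits: two fair coins
print(round(shannon_entropy([0.9, 0.1]), 3))   # 0.469: a biased coin says less
```

Quantum information theory generalizes exactly this quantity to the von Neumann entropy of a density matrix, which is one place where entanglement enters as a resource.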

  15. Quantum computing

    Energy Technology Data Exchange (ETDEWEB)

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. 
The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from
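
    The classical ingredients this review starts from, Shannon entropy and error-correcting codes, can be illustrated in a few lines. The sketch below is illustrative only (the 3-fold repetition code and all names are my choices, not taken from the review): it computes the entropy of a fair bit and corrects a single bit flip by majority vote.

```python
import math
from collections import Counter

def shannon_entropy(probs):
    """Shannon entropy H = -sum p*log2(p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def encode_repetition(bits, n=3):
    """Encode each bit by repeating it n times (a trivial error-correcting code)."""
    return [b for b in bits for _ in range(n)]

def decode_repetition(coded, n=3):
    """Majority-vote decode: corrects up to (n-1)//2 flipped bits per block."""
    out = []
    for i in range(0, len(coded), n):
        block = coded[i:i + n]
        out.append(Counter(block).most_common(1)[0][0])
    return out

# entropy of a fair bit is exactly 1 bit
h = shannon_entropy([0.5, 0.5])

# a single bit flip inside a block is corrected by majority vote
sent = [1, 0, 1]
coded = encode_repetition(sent)
coded[1] ^= 1                      # channel noise flips one bit
recovered = decode_repetition(coded)
```

    The same repetition idea, applied to qubits, underlies the quantum error-correcting codes the review goes on to discuss, although the quantum case must correct phase errors as well as bit flips.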

  16. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study the problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has unbounded computing power. The thesis is based on two... ...to a polynomial time black-box reduction, the complexity of adaptively secure VSS is the same as that of ordinary secret sharing (SS), where security is only required against a passive, static adversary. Previously, such a connection was only known for linear secret sharing and VSS schemes. We then show... ...here and discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and the proofs of their security are given. [1] Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority...
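
    For readers unfamiliar with the ordinary secret sharing (SS) that the thesis relates VSS to, the sketch below shows Shamir's classic threshold scheme: plain, non-verifiable sharing secure only against a passive adversary, not one of the thesis's protocols. The prime and parameters are illustrative choices.

```python
import random

P = 2**61 - 1  # a Mersenne prime; all arithmetic is mod P

def share(secret, t, n):
    """Split `secret` into n shares; any t of them reconstruct it (Shamir)."""
    coeffs = [secret] + [random.randrange(P) for _ in range(t - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):      # Horner evaluation of the polynomial
            acc = (acc * x + c) % P
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, -1, P) is the modular inverse (Python 3.8+)
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = share(123456789, t=3, n=5)
recovered = reconstruct(shares[:3])   # any 3 of the 5 shares suffice
```

    A verifiable scheme additionally lets the players check that the dealer distributed consistent shares, which is exactly the gap between SS and VSS that the reduction above addresses.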

  17. Scientific computing

    CERN Document Server

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...
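
    Two of the book's recurring questions, what accuracy is attainable and how to compare competing methods, can be illustrated with a tiny experiment (my example, not taken from the book): measuring double-precision machine epsilon and comparing the error of forward versus central differences.

```python
import math

def machine_epsilon():
    """Smallest eps with 1 + eps != 1 in double precision (found by halving)."""
    eps = 1.0
    while 1.0 + eps / 2 != 1.0:
        eps /= 2
    return eps

def forward_diff(f, x, h):
    return (f(x + h) - f(x)) / h          # O(h) truncation error

def central_diff(f, x, h):
    return (f(x + h) - f(x - h)) / (2*h)  # O(h^2) truncation error

eps = machine_epsilon()            # 2**-52 for IEEE 754 doubles
x, h = 1.0, 1e-5
err_fwd = abs(forward_diff(math.sin, x, h) - math.cos(x))
err_cen = abs(central_diff(math.sin, x, h) - math.cos(x))
```

    At the same step size the central difference is several orders of magnitude more accurate, the kind of comparison of competing methods the book teaches readers to make systematically.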

  18. Computational Psychiatry

    Science.gov (United States)

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and how their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically-realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941
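
    As a minimal illustration of the circuit modeling the Perspective discusses (generic textbook material, not an example from the article), a leaky integrate-and-fire neuron shows how a threshold mechanism turns continuous input into discrete spikes. All parameter values here are illustrative.

```python
def simulate_lif(i_ext, t_max=0.5, dt=1e-4, tau=0.02, v_rest=-0.065,
                 v_thresh=-0.050, v_reset=-0.065, r_m=1e7):
    """Leaky integrate-and-fire neuron: tau dV/dt = -(V - v_rest) + R*I.
    Returns spike times (s) for a constant input current i_ext (A),
    using simple forward-Euler integration."""
    v, spikes = v_rest, []
    steps = int(t_max / dt)
    for k in range(steps):
        dv = (-(v - v_rest) + r_m * i_ext) * (dt / tau)
        v += dv
        if v >= v_thresh:          # threshold crossing: emit spike, reset
            spikes.append(k * dt)
            v = v_reset
    return spikes

quiet = simulate_lif(i_ext=1.0e-9)   # subthreshold drive: no spikes
firing = simulate_lif(i_ext=2.0e-9)  # suprathreshold drive: regular spiking
```

    Doubling the input current moves the steady-state voltage across threshold and switches the cell from silence to regular firing, a toy version of the cellular-to-behavioral bridging the article advocates.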

  19. Computational artifacts

    DEFF Research Database (Denmark)

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to clarify the issue and in doing so revisits and reconsiders the notion of 'computational artifact'.

  20. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security. Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec
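
    The SQL injection attack mentioned above, and its standard defense, can be shown in a few lines. This sketch uses Python's sqlite3 module and an invented one-table database; it is an illustration, not an example from the book.

```python
import sqlite3

# In-memory demo database with one user table (illustrative only).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, secret TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def lookup_unsafe(name):
    """String concatenation: attacker-controlled input becomes SQL."""
    return db.execute(
        "SELECT secret FROM users WHERE name = '" + name + "'").fetchall()

def lookup_safe(name):
    """Parameterized query: the driver treats the input strictly as data."""
    return db.execute(
        "SELECT secret FROM users WHERE name = ?", (name,)).fetchall()

payload = "' OR '1'='1"                 # classic injection payload
leaked = lookup_unsafe(payload)         # the OR clause matches every row
safe = lookup_safe(payload)             # returns nothing: no such user name
```

    The unsafe version leaks every row because the payload rewrites the WHERE clause; the parameterized version cannot be rewritten by input, which is why it is the universally recommended defense.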

  1. Cloud Computing

    CERN Document Server

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope is hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  2. Computational Logistics

    DEFF Research Database (Denmark)

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.

  4. Computational engineering

    CERN Document Server

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  5. Computer busses

    CERN Document Server

    Buchanan, William

    2000-01-01

    As more and more equipment is interface or 'bus' driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for the particular application. There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed using a top-down approach, helpin

  6. Reconfigurable Computing

    CERN Document Server

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a comp

  7. Riemannian computing in computer vision

    CERN Document Server

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches in problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). Illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics. Emphasis on algorithmic advances that will allow re-application in other...
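
    One of the simplest Riemannian computations of the kind surveyed here is the geodesic distance between two rotation matrices: the angle of the relative rotation. A minimal sketch (illustrative, not taken from the book), using plain nested lists so it is fully self-contained:

```python
import math

def rot_z(theta):
    """3x3 rotation about the z-axis (row-major nested lists)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def geodesic_distance(r1, r2):
    """Riemannian distance on SO(3): the angle of the relative rotation,
    theta = arccos((trace(R1^T R2) - 1) / 2).
    sum_ij R1[i][j]*R2[i][j] is exactly trace(R1^T R2)."""
    trace = sum(r1[i][j] * r2[i][j] for i in range(3) for j in range(3))
    # clamp for floating-point safety before arccos
    return math.acos(max(-1.0, min(1.0, (trace - 1.0) / 2.0)))

d = geodesic_distance(rot_z(0.2), rot_z(0.9))   # rotations 0.7 rad apart
```

    Averaging or comparing camera orientations with this distance, rather than with the Euclidean distance between matrix entries, is the basic move that the Riemannian framework generalizes to the other manifolds listed above.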

  8. Statistical Computing

    Indian Academy of Sciences (India)

    Sudhakar Kunte. Elements of statistical computing are discussed in this series, including inference and finite population sampling. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.
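
    The column's own methods for drawing random numbers are not reproduced in this excerpt, but the classic textbook approach is a linear congruential generator. The constants below are the widely used Numerical Recipes choices; everything else is illustrative.

```python
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential generator:
    x_{k+1} = (a*x_k + c) mod m, scaled into [0, 1)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

gen = lcg(seed=42)
sample = [next(gen) for _ in range(10000)]
mean = sum(sample) / len(sample)   # should be close to 0.5 for a uniform draw
```

    A simulated coin toss for the fielding decision is then just `next(gen) < 0.5`, and the fairness question the column raises becomes a question about the quality of the generator.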

  9. Computational biology

    DEFF Research Database (Denmark)

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann’s early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved t...

  10. Computing News

    CERN Multimedia

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently-completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of cpu power and tens of PetaBytes of data storage. PCs today are about 20-30 SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...

  11. Quantum Computation

    Indian Academy of Sciences (India)

    Quantum Computation - Particle and Wave Aspects of Algorithms. Apoorva Patel. General Article, Resonance – Journal of Science Education, Volume 16, Issue 9, September 2011, pp. 821-835.

  12. Cloud computing.

    Science.gov (United States)

    Wink, Diane M

    2012-01-01

    In this bimonthly series, the author examines how nurse educators can use Internet and Web-based technologies such as search, communication, and collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. This article describes how cloud computing can be used in nursing education.

  13. Computer Recreations.

    Science.gov (United States)

    Dewdney, A. K.

    1988-01-01

    Describes the creation of the computer program "BOUNCE," designed to simulate a weighted piston coming into equilibrium with a cloud of bouncing balls. The model follows the ideal gas law. Utilizes the critical event technique to create the model. Discusses another program, "BOOM," which simulates a chain reaction. (CW)
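
    The "critical event technique" mentioned above advances a simulation from collision to collision rather than in fixed time steps. A one-dimensional sketch of the idea follows (illustrative; this is not the BOUNCE program itself):

```python
def next_wall_hit(x, v, length):
    """Time until a ball at position x with velocity v hits a wall of the
    box [0, length] (the 'critical event'); assumes v != 0."""
    return (length - x) / v if v > 0 else -x / v

def simulate(x, v, length, t_total):
    """Advance exactly from collision to collision instead of in fixed
    time steps; each wall hit reverses the velocity (elastic bounce)."""
    t, bounces = 0.0, 0
    while True:
        dt = next_wall_hit(x, v, length)
        if t + dt > t_total:
            x += v * (t_total - t)     # drift to the final time, no event
            return x, bounces
        t += dt
        x = length if v > 0 else 0.0   # land exactly on the wall
        v = -v
        bounces += 1

pos, bounces = simulate(x=0.25, v=1.0, length=1.0, t_total=10.0)
```

    Because collision times are computed exactly, the method never misses an event the way a coarse fixed-step simulation can, which is what makes it attractive for models like a piston pressed by many bouncing balls.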

  14. [Grid computing

    CERN Multimedia

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).

  15. Computational Finance

    DEFF Research Database (Denmark)

    Rasmussen, Lykke

    One of the major challenges in today's post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these derivatives have been determined using bump-and-revalue, but due to the increasing magnitude of these computations does...

  16. Optical Computing

    Indian Academy of Sciences (India)

    Optical computing technology is, in general, developing in two directions. One approach is ... current support in many places, with private companies as well as governments in several countries encouraging such research work. For example, much ... which enables more information to be carried and data to be processed.

  17. Um Rio para estudante ver: engenhosidades na produção de cadernos escolares - A Rio de Janeiro to be seen by students: ingenuity in producing school notebooks

    Directory of Open Access Journals (Sweden)

    Ana Chrystina Venancio Mignot, Roberta Lopes da Veiga

    2011-03-01

    Abstract: Analyzing the intentions that guided the production and commercialization of the Rio Collection, published by Casa Cruz in partnership with Tilibra, the largest manufacturer of school notebooks in the country, to commemorate the stationer's 110 years of existence, entails discussing the expansion and development of the notebook industry that resulted from the modernization of the printing sector. To that end, the main research sources are the calls for entries and press articles about the painting contest that gave rise to the collection, some printed materials addressed to retailers of school supplies, and magazines and catalogues from various companies in the sector, since these make it possible to understand both the conceptions held of students and the concerns that inform and shape the production of school notebooks, which cease to be seen as mere supports for school writing and become objects of consumer desire. The choice of cover images is part of the strategy for winning over this privileged consumer: soap-opera actors, famous singers, cartoons, film characters and football players who appeal to the majority. With the Rio Collection, Casa Cruz stands out by departing from the themes that dominate notebook production. By highlighting landscapes of the Marvelous City, it also reveals some of its conceptions of, and expectations about, the consumer. On covers signed by Rio artists depicting tourist sights and monuments, the stationer circulates the image it would like to perpetuate: a city without violence, fear or exclusion; a city for students to see, love and preserve. Keywords: school notebooks, production, commercialization. A RIO DE JANEIRO TO BE SEEN BY STUDENTS: INGENUITY IN PRODUCING SCHOOL NOTEBOOKS Abstract Any attempt to analyze the intentions which have guided the production and commercialization of Rio Collection, edited by Casa Cruz

  18. NEW SCIENCE OF LEARNING: COGNITION, COMPUTERS AND COLLABORATION IN EDUCATION

    Directory of Open Access Journals (Sweden)

    Reviewed by Onur DONMEZ

    2011-01-01

    Information and Communication Technologies (ICTs) have pervaded and changed much of our lives on both individual and societal scales. PCs, notebooks, tablets, cell phones, RSS feeds, emails, podcasts, tweets and social networks are all technologies we are familiar with, and we are using them intensively in our daily lives. It is safe to say that our lives are becoming more and more digitized day by day. We have already invented a bunch of terms to refer to the effects of these technologies on our lives. Digital nomads, grasshopper minds, millennium learners, digital natives, information age, knowledge building, knowledge society and network society are all terms invented to refer to societal changes motivated by ICTs. New opportunities provided by ICTs are also shaping the skill and quality demands of the next age. Individuals have to match these qualities if they want to earn their rightful places in tomorrow's world. Education is, of course, the sole light to guide them in their transformation into tomorrow's individual. One question arises, however: "Are today's educational paradigms and practices ready to confront such a challenge?" There is a coherent and strong opinion among educators that the answer is "NO". "Today's students think and process information fundamentally differently from their predecessors" (Prensky, 2001). And education has to keep pace with these students and their needs. But how? Khine & Saleh managed to gather distinguished colleagues around this question in their book titled "New Science of Learning: Cognition, Computers and Collaboration". The book is composed of 29 chapters within three major topics: cognition, computers and collaboration.

  19. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    We develop some parts of the frame theory in Banach spaces from the point of view of Computable Analysis. We define a computable M-basis and use it to construct a computable Banach space of scalar-valued sequences. Computable Xd frames and computable Banach frames are also defined, and computable versions of sufficient conditions for their existence are obtained.
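
    For orientation, the Hilbert-space special case of the frame condition, A·||x||^2 <= sum_k |<x, f_k>|^2 <= B·||x||^2, can be checked numerically for a concrete finite frame. This sketch uses the three-vector "Mercedes-Benz" tight frame in R^2, a standard textbook example and not one taken from the paper; the Banach-space frames the paper studies generalize this condition.

```python
import math

# The 'Mercedes-Benz' frame: three unit vectors in R^2 at 120-degree angles.
frame = [(math.cos(2*math.pi*k/3), math.sin(2*math.pi*k/3)) for k in range(3)]

def frame_energy(x):
    """Sum of squared frame coefficients, sum_k |<x, f_k>|^2."""
    return sum((x[0]*f[0] + x[1]*f[1])**2 for f in frame)

def norm_sq(x):
    return x[0]**2 + x[1]**2

# For a tight frame the bounds coincide (A = B = 3/2 here), so the
# ratio frame_energy(x) / ||x||^2 is constant for every nonzero x.
x = (0.3, -1.7)
ratio = frame_energy(x) / norm_sq(x)
```

    Tightness means reconstruction from the coefficients needs no separate dual frame, only a rescaling by 2/3.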

  20. Algebraic computing

    International Nuclear Information System (INIS)

    MacCallum, M.A.H.

    1990-01-01

    The implementation of a new computer algebra system is time consuming: designers of general purpose algebra systems usually say it takes about 50 man-years to create a mature and fully functional system. Hence the range of available systems and their capabilities changes little between one general relativity meeting and the next, despite which there have been significant changes in the period since the last report. The introductory remarks aim to give a brief survey of the capabilities of the principal available systems and to highlight one or two trends. A reference to the most recent full survey of computer algebra in relativity and brief descriptions of Maple, REDUCE, SHEEP and other applications are given. (author)
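
    At its core, a computer algebra system performs exact symbolic arithmetic rather than floating-point evaluation. A toy sketch of that idea (nothing like the scale of Maple, REDUCE or SHEEP; all names are my own) multiplies sparse polynomials with exact integer coefficients:

```python
from collections import defaultdict

def poly_mul(p, q):
    """Multiply two sparse polynomials given as {exponent: coefficient}
    dicts, using exact integer arithmetic, the kind of exact symbolic
    operation a computer algebra system is built around."""
    out = defaultdict(int)
    for e1, c1 in p.items():
        for e2, c2 in q.items():
            out[e1 + e2] += c1 * c2
    return {e: c for e, c in out.items() if c != 0}

# (x + 1)^2 = x^2 + 2x + 1, computed exactly rather than numerically
x_plus_1 = {1: 1, 0: 1}
square = poly_mul(x_plus_1, x_plus_1)
```

    Scaling this idea up to multivariate polynomials, tensors and simplification rules is what consumes the 50 man-years quoted above.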

  1. Computational Controversy

    OpenAIRE

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have appeared, using new data sources such as Wikipedia, which help us now better understand these phenomena. However, compared to what social sciences have discovered about such debates, the existing computati...

  2. Computed tomography

    International Nuclear Information System (INIS)

    Andre, M.; Resnick, D.

    1988-01-01

    Computed tomography (CT) has matured into a reliable and prominent tool for study of the musculoskeletal system. When it was introduced in 1973, it was unique in many ways and posed a challenge to interpretation. It is in these unique features, however, that its advantages lie in comparison with conventional techniques. These advantages will be described in a spectrum of important applications in orthopedics and rheumatology

  3. Computed radiography

    International Nuclear Information System (INIS)

    Pupchek, G.

    2004-01-01

    Computed radiography (CR) is an image acquisition process that is used to create digital, 2-dimensional radiographs. CR employs a photostimulable phosphor-based imaging plate, replacing the standard x-ray film and intensifying screen combination. Conventional radiographic exposure equipment is used with no modification required to the existing system. CR can transform an analog x-ray department into a digital one and eliminates the need for chemicals, water, darkrooms and film processor headaches. (author)

  4. Computational universes

    International Nuclear Information System (INIS)

    Svozil, Karl

    2005-01-01

    Suspicions that the world might be some sort of a machine or algorithm existing 'in the mind' of some symbolic number cruncher have lingered from antiquity. Although popular at times, the most radical forms of this idea never reached mainstream. Modern developments in physics and computer science have lent support to the thesis, but empirical evidence is needed before it can begin to replace our contemporary world view

  5. Conversion of IVA Human Computer Model to EVA Use and Evaluation and Comparison of the Result to Existing EVA Models

    Science.gov (United States)

    Hamilton, George S.; Williams, Jermaine C.

    1998-01-01

    This paper describes the methods, rationale, and comparative results of the conversion of an intravehicular (IVA) 3D human computer model (HCM) to extravehicular (EVA) use and compares the converted model to an existing model on another computer platform. The task of accurately modeling a spacesuited human figure in software is daunting: the suit restricts the human's joint range of motion (ROM) and does not have joints collocated with human joints. The modeling of the variety of materials needed to construct a space suit (e. g. metal bearings, rigid fiberglass torso, flexible cloth limbs and rubber coated gloves) attached to a human figure is currently out of reach of desktop computer hardware and software. Therefore a simplified approach was taken. The HCM's body parts were enlarged and the joint ROM was restricted to match the existing spacesuit model. This basic approach could be used to model other restrictive environments in industry such as chemical or fire protective clothing. In summary, the approach provides a moderate fidelity, usable tool which will run on current notebook computers.
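
    The basic move described above, restricting joint range of motion to match the suit, amounts to clamping each joint angle into a reduced interval. A minimal sketch with invented joint names and limits (not the paper's data):

```python
def clamp_joint_angles(pose, rom_limits):
    """Restrict each joint angle (degrees) to a reduced range of motion,
    the simplification used to make an unsuited human model move like a
    suited one. Joint names and limits are illustrative only."""
    return {joint: max(lo, min(hi, angle))
            for joint, angle in pose.items()
            for lo, hi in [rom_limits[joint]]}

# hypothetical suited ROM limits, tighter than bare-body ranges
rom = {"elbow": (0.0, 110.0), "shoulder_pitch": (-45.0, 90.0)}
pose = {"elbow": 150.0, "shoulder_pitch": -60.0}
suited = clamp_joint_angles(pose, rom)
```

    The same clamping approach would carry over to the other restrictive garments the paper mentions, such as chemical or fire protective clothing.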

  6. Advanced display object selection methods for enhancing user-computer productivity

    Science.gov (United States)

    Osga, Glenn A.

    1993-01-01

    The User-Interface Technology Branch at NCCOSC RDT&E Division has been conducting a series of studies to address the suitability of commercial off-the-shelf (COTS) graphic user-interface (GUI) methods for efficiency and performance in critical naval combat systems. This paper presents an advanced selection algorithm and method developed to increase user performance when making selections on tactical displays. The method has also been applied with considerable success to a variety of cursor and pointing tasks. Typical GUIs allow user selection by: (1) moving a cursor with a pointing device such as a mouse, trackball, joystick, touchscreen; and (2) placing the cursor on the object. Examples of GUI objects are the buttons, icons, folders, scroll bars, etc. used in many personal computer and workstation applications. This paper presents an improved method of selection and the theoretical basis for the significant performance gains achieved with various input devices tested. The method is applicable to all GUI styles and display sizes, and is particularly useful for selections on small screens such as notebook computers. Considering the amount of work-hours spent pointing and clicking across all styles of available graphic user-interfaces, the cost/benefit in applying this method to graphic user-interfaces is substantial, with the potential for increasing productivity across thousands of users and applications.
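
    The paper's own algorithm is not reproduced in this abstract, but a common family it belongs to, snap-to or area-cursor selection, which enlarges the effective size of small targets, can be sketched as follows (target names and coordinates are invented):

```python
import math

def nearest_target(cursor, targets, max_radius):
    """Select the target closest to the cursor if it lies within
    max_radius, a simple stand-in for snap-to / area-cursor schemes
    that enlarge the effective size of small on-screen objects."""
    best, best_d = None, max_radius
    for name, (tx, ty) in targets.items():
        d = math.hypot(tx - cursor[0], ty - cursor[1])
        if d <= best_d:
            best, best_d = name, d
    return best

targets = {"ship": (100, 120), "sub": (300, 80), "buoy": (305, 90)}
hit = nearest_target((298, 84), targets, max_radius=20)    # nearest: "sub"
miss = nearest_target((500, 500), targets, max_radius=20)  # nothing in range
```

    By letting a click land near, rather than exactly on, a small symbol, such schemes cut pointing time on dense tactical displays and small notebook screens alike.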

  7. Emergency Response Equipment and Related Training: Airborne Radiological Computer System (Model II)

    Energy Technology Data Exchange (ETDEWEB)

    David P. Colton

    2007-02-28

    The materials included in the Airborne Radiological Computer System, Model-II (ARCS-II) were assembled with several considerations in mind. First, the system was designed to measure and record the airborne gamma radiation levels and the corresponding latitude and longitude coordinates, and to provide a first overview look of the extent and severity of an accident's impact. Second, the portable system had to be light enough and durable enough that it could be mounted in an aircraft, ground vehicle, or watercraft. Third, the system must control the collection and storage of the data, as well as provide a real-time display of the data collection results to the operator. The notebook computer and color graphics printer components of the system would only be used for analyzing and plotting the data. In essence, the provided equipment is composed of an acquisition system and an analysis system. The data can be transferred from the acquisition system to the analysis system at the end of the data collection or at some other agreeable time.
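
    The acquisition-to-analysis hand-off described above suggests a simple record-plus-export structure. The field names and CSV format below are illustrative assumptions, not the actual ARCS-II data layout:

```python
import csv
import io
from dataclasses import dataclass, asdict

@dataclass
class GammaReading:
    """One airborne survey sample: a dose rate tagged with position and
    time. (Field names are hypothetical, not the ARCS-II record format.)"""
    timestamp: str
    latitude: float
    longitude: float
    dose_rate_uR_h: float

def export_csv(readings):
    """Serialize readings for hand-off from acquisition to analysis."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(asdict(readings[0])))
    writer.writeheader()
    for r in readings:
        writer.writerow(asdict(r))
    return buf.getvalue()

log = [GammaReading("2007-02-28T14:05:00Z", 36.17, -115.14, 12.4),
       GammaReading("2007-02-28T14:05:05Z", 36.18, -115.15, 13.1)]
csv_text = export_csv(log)
```

    A flat, position-tagged format like this is what lets the analysis system later plot the extent and severity of an accident's impact directly from the acquisition log.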

  8. Emergency Response Equipment and Related Training: Airborne Radiological Computer System (Model II) user's manual

    International Nuclear Information System (INIS)

    David P. Colton

    2007-01-01

    The materials included in the Airborne Radiological Computer System, Model-II (ARCS-II) were assembled with several considerations in mind. First, the system was designed to measure and record the airborne gamma radiation levels and the corresponding latitude and longitude coordinates, and to provide a first overview look of the extent and severity of an accident's impact. Second, the portable system had to be light enough and durable enough that it could be mounted in an aircraft, ground vehicle, or watercraft. Third, the system must control the collection and storage of the data, as well as provide a real-time display of the data collection results to the operator. The notebook computer and color graphics printer components of the system would only be used for analyzing and plotting the data. In essence, the provided equipment is composed of an acquisition system and an analysis system. The data can be transferred from the acquisition system to the analysis system at the end of the data collection or at some other agreeable time

  9. Customizable computing

    CERN Document Server

    Chen, Yu-Ting; Gill, Michael; Reinman, Glenn; Xiao, Bingjun

    2015-01-01

    Since the end of Dennard scaling in the early 2000s, improving the energy efficiency of computation has been the main concern of the research community and industry. The large energy efficiency gap between general-purpose processors and application-specific integrated circuits (ASICs) motivates the exploration of customizable architectures, where one can adapt the architecture to the workload. In this Synthesis lecture, we present an overview and introduction of the recent developments on energy-efficient customizable architectures, including customizable cores and accelerators, on-chip memory

  10. The Art and Science of Notebooks

    Science.gov (United States)

    Porter, Keri; Yokoi, Craig; Yee, Bertina

    2011-01-01

    Along with inquiry-based teaching, exploring the elements of art can guide students to view and represent objects realistically. Understanding line, shape, color, value, form, space, and texture helps bridge the gap between what students actually observe and what their preconceived ideas about the object may be. This type of explicit instruction…

  11. DELL : 2:1 Convertible Notebook

    OpenAIRE

    Nilsen, Aurora B.

    2017-01-01

    Bachelor thesis in International Marketing from the University of Technology, Sydney, Australia, 2017. After conducting an internal and external analysis based on primary and secondary research, the conclusion was that Dell is a strong brand in the commercial market. However, the research also concluded that Dell is facing three key issues: the lines between the private and enterprise markets have become increasingly blurred as Microsoft and its Surface Pro are targeting small b...

  12. Engineer's Notebook--A Design Assessment Tool

    Science.gov (United States)

    Kelley, Todd R.

    2011-01-01

    As technology education continues to consider a move toward an engineering design focus as proposed by various leaders in technology education, it will be necessary to employ new pedagogical approaches. Hill (2006) provided some new perspectives regarding pedagogical approaches for technology education with an engineering design focus. One…

  13. Chautauqua notebook: appropriate technology on radio

    Energy Technology Data Exchange (ETDEWEB)

    Renz, B.

    1981-01-01

    Experiences in establishing and maintaining a regional call-in information-exchange radio show (Chautauqua) on energy conservation, appropriate technology, renewable energy sources, and self-reliance are discussed. Information is presented on: appropriate technology; the Chautauqua concept; topics discussed; research performed; guests; interviewing tips; types of listeners; program features; where to find help; promotion and publicity; the technical and engineering aspects; the budget and funding; and station policies. (MCW)

  14. Idea Notebook. Quick Activities for Every Teacher.

    Science.gov (United States)

    Meagher, Judy; And Others

    1996-01-01

    Presents suggestions for elementary-level teachers to use at the beginning of the school year, including meet the teacher activities, back-to-school parades, a welcome bulletin board, bereavement coping skills, creative science, math manipulatives, social studies activities, and creative story writing. (SM)

  15. From the notebooks of a troubled inventor

    Science.gov (United States)

    Iddan, Gavriel J.

    2006-02-01

    Contemporary inventors and investors are faced with new challenges and difficulties that have not been experienced by past generations of inventors. This basically is the result of the exponential increase in the amount of knowledge and the tough global competition. A magic formula for success does not exist, but following some basic rules to be discussed here, will greatly increase the inventor / entrepreneur's chance for success. Two examples, the Video capsule and a 3D imaging Camera that are based on the author's past inventions are described and analyzed to demonstrate some of the rules.

  16. Idea Notebook: Recycling with an Educational Purpose.

    Science.gov (United States)

    Gerth, Tom; Wilson, David A.

    1986-01-01

    Four students at St. Louis University High School developed a project to clean up the environment while saving energy and natural resources. Aluminum and steel cans were recycled and the money was used to buy and plant trees. Students learned about recycling, organization, money management, and improving the environment. (JMM)

  17. Tech Notebook: E-Reference Tools.

    Science.gov (United States)

    Sassman, Charlotte

    1999-01-01

    Two resources to help students conduct research are the new CD-ROM versions of the Encarta Encyclopedia Deluxe 2000 (for strong readers, grades 5-8) and the Year 2000 Grolier Multimedia Encyclopedia (for strong readers, grades 3 and up). By supplementing traditional text-based information with specialized features (e.g., virtual tours and video…

  18. Hanford Site Waste Storage Tank Information Notebook

    International Nuclear Information System (INIS)

    Husa, E.I.; Raymond, R.E.; Welty, R.K.; Griffith, S.M.; Hanlon, B.M.; Rios, R.R.; Vermeulen, N.J.

    1993-07-01

    This report provides summary data on the radioactive waste stored in underground tanks in the 200 East and West Areas at the Hanford Site. The summary data covers each of the existing 161 Series 100 underground waste storage tanks (500,000 gallons and larger). It also contains information on the design and construction of these tanks. The information in this report is derived from existing reports that document the status of the tanks and their materials. This report also contains interior surface photographs of each of the 54 Watch List tanks, which are those tanks identified as Priority I Hanford Site Tank Farm Safety Issues in accordance with Public Law 101-510, Section 3137.

  19. Computed tomography

    International Nuclear Information System (INIS)

    Wells, P.; Davis, J.; Morgan, M.

    1994-01-01

    X-ray or gamma-ray transmission computed tomography (CT) is a powerful non-destructive evaluation (NDE) technique that produces two-dimensional cross-sectional images of an object without the need to physically section it. CT is also known by the acronym CAT, for computerised axial tomography. This review article presents a brief historical perspective on CT, its current status and the underlying physics. The mathematical fundamentals of computed tomography are developed for the simplest transmission CT modality. A description of CT scanner instrumentation is provided with an emphasis on radiation sources and systems. Examples of CT images are shown indicating the range of materials that can be scanned and the spatial and contrast resolutions that may be achieved. Attention is also given to the occurrence, interpretation and minimisation of various image artefacts that may arise. A final brief section is devoted to the principles and potential of a range of more recently developed tomographic modalities including diffraction CT, positron emission CT and seismic tomography. 57 refs., 2 tabs., 14 figs
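The mathematical fundamentals of transmission CT summarized above start from the Beer-Lambert law, I = I0·exp(−Σ μᵢΔx); taking the logarithm of each measurement recovers the line integral of attenuation that reconstruction algorithms invert. A minimal sketch (illustrative only; function names are assumptions, not from the article):

```python
import math

def transmitted_intensity(i0: float, mus: list[float], dx: float) -> float:
    """Intensity after one ray crosses voxels with linear attenuation
    coefficients `mus` (per unit length), each of thickness `dx`:
    I = I0 * exp(-sum(mu_i * dx))  (Beer-Lambert law)."""
    return i0 * math.exp(-sum(mu * dx for mu in mus))

def projection_value(i0: float, i: float) -> float:
    """Projection datum used by CT reconstruction: the line integral of
    attenuation recovered from the measurement, p = ln(I0 / I)."""
    return math.log(i0 / i)
```

The logarithm turns multiplicative attenuation into the additive line integrals (samples of the Radon transform) on which methods such as filtered backprojection operate.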

  20. Computing Services and Assured Computing

    Science.gov (United States)

    2006-05-01

    fighters’ ability to execute the mission.” We run IT systems that: provide medical care; pay the warfighters; manage maintenance ... users • 1,400 applications • 18 facilities • 180 software vendors • 18,000+ copies of executive software products • virtually every type of mainframe and ...

  1. Computational neuroscience

    CERN Document Server

    Blackwell, Kim L

    2014-01-01

    Progress in Molecular Biology and Translational Science provides a forum for discussion of new discoveries, approaches, and ideas in molecular biology. It contains contributions from leaders in their fields and abundant references. This volume brings together different aspects of, and approaches to, molecular and multi-scale modeling, with applications to a diverse range of neurological diseases. Mathematical and computational modeling offers a powerful approach for examining the interaction between molecular pathways and ionic channels in producing neuron electrical activity. It is well accepted that non-linear interactions among diverse ionic channels can produce unexpected neuron behavior and hinder a deep understanding of how ion channel mutations bring about abnormal behavior and disease. Interactions with the diverse signaling pathways activated by G protein coupled receptors or calcium influx adds an additional level of complexity. Modeling is an approach to integrate myriad data sources into a cohesiv...

  2. Social Computing

    CERN Multimedia

    CERN. Geneva

    2011-01-01

    The past decade has witnessed a momentous transformation in the way people interact with each other. Content is now co-produced, shared, classified, and rated by millions of people, while attention has become the ephemeral and valuable resource that everyone seeks to acquire. This talk will describe how social attention determines the production and consumption of content within both the scientific community and social media, how its dynamics can be used to predict the future and the role that social media plays in setting the public agenda. About the speaker Bernardo Huberman is a Senior HP Fellow and Director of the Social Computing Lab at Hewlett Packard Laboratories. He received his Ph.D. in Physics from the University of Pennsylvania, and is currently a Consulting Professor in the Department of Applied Physics at Stanford University. He originally worked in condensed matter physics, ranging from superionic conductors to two-dimensional superfluids, and made contributions to the theory of critical p...

  3. computer networks

    Directory of Open Access Journals (Sweden)

    N. U. Ahmed

    2002-01-01

    Full Text Available In this paper, we construct a new dynamic model for the Token Bucket (TB) algorithm used in computer networks and use a systems approach for its analysis. This model is then augmented by adding a dynamic model for a multiplexor at an access node where the TB exercises a policing function. In the model, traffic policing, multiplexing and network utilization are formally defined. Based on the model, we study such issues as quality of service (QoS), traffic sizing and network dimensioning. We also propose an algorithm using feedback control to improve QoS and network utilization. Applying MPEG video traces as the input traffic to the model, we verify the usefulness and effectiveness of our model.
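Independently of the paper's dynamic-systems formulation, the token bucket policing function it models can be sketched in a few lines (an illustrative sketch; class and parameter names are my own):

```python
class TokenBucket:
    """Token bucket policer: tokens accrue at `rate` per second up to
    `capacity` (the burst size). A packet of `size` tokens conforms if
    enough tokens are available; non-conforming packets are policed."""

    def __init__(self, rate: float, capacity: float):
        self.rate = rate            # token fill rate (tokens/second)
        self.capacity = capacity    # bucket depth (max burst)
        self.tokens = capacity      # bucket starts full
        self.last = 0.0             # time of last update (seconds)

    def conforms(self, size: float, now: float) -> bool:
        # Refill tokens for the elapsed interval, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if size <= self.tokens:
            self.tokens -= size     # admit the packet
            return True
        return False                # police (drop or mark) the packet
```

A source that stays within `rate` on average, with bursts no larger than `capacity`, never sees a drop; sustained excess traffic is policed as soon as the bucket empties.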

  4. Computer Tree

    Directory of Open Access Journals (Sweden)

    Onur AĞAOĞLU

    2014-12-01

    Full Text Available It is crucial that gifted and talented students be supported by different educational methods suited to their interests and skills. The science and arts centres (gifted centres) provide the Supportive Education Program for these students with an interdisciplinary perspective. In line with the program, an ICT lesson entitled “Computer Tree” serves to identify learner readiness levels and define the basic conceptual framework. A language teacher also contributes to the process, since the lesson draws on the creative function of basic linguistic skills. The teaching technique is applied with students aged 9-11. The lesson introduces an evaluation process covering the basic knowledge, skills, and interests of the target group. Furthermore, it includes an observation process by way of peer assessment. The lesson is considered a good sample of planning for any subject, for the unpredicted convergence of visual and technical abilities with linguistic abilities.

  5. Computed tomography

    International Nuclear Information System (INIS)

    Boyd, D.P.

    1989-01-01

    This paper reports on computed tomographic (CT) scanning, which has improved computer-assisted imaging modalities for radiologic diagnosis. The advantage of this modality is its ability to image thin cross-sectional planes of the body, thus uncovering density information in three dimensions without tissue superposition problems. Because this enables vastly superior imaging of soft tissues in the brain and body, CT scanning was immediately successful and continues to grow in importance as improvements are made in speed, resolution, and cost efficiency. CT scanners are used for general purposes, and the more advanced machines are generally preferred in large hospitals, where volume and variety of usage justify the cost. For imaging in the abdomen, a scanner with rapid speed is preferred because peristalsis, involuntary motion of the diaphragm, and even cardiac motion are present and can significantly degrade image quality. An operator's console provides control of the scanner, immediate review of images, and multiformat hardcopy production. A second console is reserved for the radiologist to read images and perform the several types of image analysis that are available. Since CT images contain quantitative information in terms of density values and contours of organs, quantitation of volumes, areas, and masses is possible. This is accomplished with region-of-interest methods, which involve the electronic outlining of the selected region on the television display monitor with a trackball-controlled cursor. In addition, various image-processing options, such as edge enhancement (for viewing fine details of edges) or smoothing filters (for enhancing the detectability of low-contrast lesions), are useful tools.
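The region-of-interest quantitation described above amounts to averaging CT numbers over the outlined region and multiplying the pixel count by the pixel area. A minimal sketch (illustrative only; function and parameter names are assumptions, not scanner console code):

```python
def roi_stats(image, mask, pixel_area):
    """Mean density and area of a region of interest.

    `image` is a 2-D list of CT numbers, `mask` a same-shaped 2-D list
    of booleans outlining the ROI, and `pixel_area` the physical area
    of one pixel. Returns (mean CT number, ROI area)."""
    vals = [image[r][c]
            for r in range(len(image))
            for c in range(len(image[0]))
            if mask[r][c]]
    area = len(vals) * pixel_area
    mean = sum(vals) / len(vals) if vals else 0.0
    return mean, area
```

Volume and mass follow the same pattern: summing ROI areas over contiguous slices (times slice thickness) gives volume, and weighting each pixel by a density calibrated from its CT number gives mass.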

  6. Cloud Computing: The Future of Computing

    OpenAIRE

    Aggarwal, Kanika

    2013-01-01

    Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is to assign computation to a great number of distributed computers, rather than local computer ...

  7. Development of a research prototype computer 'Wearables' that one can wear on his or her body. Minitsukeru computer 'Wearables' kenkyuyo shisakuki wo kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    1999-02-01

    Development has been made on a prototype of a wearable computer, 'Wearables', that makes the present notebook PC still smaller, can be worn on the body for use at any time and from anywhere, and aims at realizing a social infrastructure. Using the company's portable PC, Libretto, as the base, the keyboard and the liquid crystal display panel were removed. To replace these functions, a voice-input microphone and various types of head-mounted displays (glasses type), worn on the head to view images, are connected. Provided as the means of information communication between the prototype computer and outside environments are an infrared interface and a data communication function using wireless (radio) communications. The wireless desk area network (DAN) technology, which can dynamically structure a network among multiple computers, has realized smooth communications with external environments. The voice recognition technology, which works efficiently against noise, has realized keyboard-free operation that imposes no mental stress on users. The wearable computer aims not only at users simply wearing it, but also at providing a new perceptual ability for what could not be seen or heard directly to date, that is, realizing digital sensation. With the computer, a society will be structured in which people can live comfortably and safely, maintaining conversations between users and computers, and interactions between the surrounding environment and the social infrastructures, with protection of individual privacy and information security taken into consideration. The company is working with the Massachusetts Institute of Technology (MIT) on research and development of the wearable computer, including how it can be utilized and the basic technologies that will be required in the future. (translated by NEDO)

  8. Computer Refurbishment

    International Nuclear Information System (INIS)

    Ichiyen, Norman; Chan, Dominic; Thompson, Paul

    2004-01-01

    The major activity for the 18-month refurbishment outage at the Point Lepreau Generating Station is the replacement of all 380 fuel channel and calandria tube assemblies and the lower portion of the connecting feeder pipes. New Brunswick Power would also take advantage of this outage to conduct a number of repairs, replacements, inspections and upgrades (such as rewinding or replacing the generator, replacement of shutdown system trip computers, replacement of certain valves and expansion joints, inspection of systems not normally accessible, etc.). This would allow for an additional 25 to 30 years of operation. Among the systems to be replaced are the PDCs for both shutdown systems. Assessments have been completed for both the SDS1 and SDS2 PDCs, and it has been decided to replace the SDS2 PDCs with the same hardware and software approach that has been used successfully for the Wolsong 2, 3, and 4 and the Qinshan 1 and 2 SDS2 PDCs. For SDS1, it has been decided to use the same software development methodology that was used successfully for Wolsong and Qinshan, called the I A, and to use a new hardware platform in order to ensure successful operation over the 25-30 year station operating life. The selected supplier is Triconex, which uses a triple modular redundant architecture that will enhance the robustness and fault tolerance of the design with respect to equipment failures.

  9. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available Computed tomography (CT) of the sinuses ... Computed tomography, more commonly known ...

  10. Illustrated computer tomography

    International Nuclear Information System (INIS)

    Takahashi, S.

    1983-01-01

    This book provides the following information: basic aspects of computed tomography; atlas of computed tomography of the normal adult; clinical application of computed tomography; and radiotherapy planning and computed tomography

  11. Analog and hybrid computing

    CERN Document Server

    Hyndman, D E

    2013-01-01

    Analog and Hybrid Computing focuses on the operations of analog and hybrid computers. The book first outlines the history of computing devices that influenced the creation of analog and digital computers. The types of problems to be solved on computers, computing systems, and digital computers are discussed. The text looks at the theory and operation of electronic analog computers, including linear and non-linear computing units and use of analog computers as operational amplifiers. The monograph examines the preparation of problems to be deciphered on computers. Flow diagrams, methods of ampl

  12. Cloud Computing Fundamentals

    Science.gov (United States)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  13. The kids got game: Computer/video games, gender and learning outcomes in science classrooms

    Science.gov (United States)

    Anderson, Janice Lyn

    In recent years educators have begun to explore how to purposively design computer/video games to support student learning. This interest in video games has arisen in part because educational video games appear to have the potential to improve student motivation and interest in technology, and to engage students in learning through the use of a familiar medium (Squire, 2005; Shaffer, 2006; Gee, 2005). The purpose of this dissertation research is to specifically address the issue of student learning through the use of educational computer/video games. Using the Quest Atlantis computer game, this study involved a mixed-model research strategy that allowed for both broad understandings of classroom practices and specific analysis of outcomes through the themes that emerged from the case studies of the gendered groups using the game. Specifically, this study examined how fifth-grade students' learning about science concepts, such as water quality and ecosystems, unfolds over time as they participate in the Quest Atlantis computer game. Data sources included classroom observations and video, pre- and post-written assessments, pre- and post-student content interviews, student field notebooks, field reports and the field notes of the researcher. To make sense of how student learning unfolded, video was analyzed using a framework of interaction analysis and small-group interactions (Jordan & Henderson, 1995; Webb, 1995). These coded units were then examined with respect to student artifacts and assessments, and patterns of learning trajectories were analyzed. The analysis revealed that, overall, student learning outcomes improved from pre- to post-assessments for all students. While there were no observable gendered differences with respect to the test scores and content interviews, there were gendered differences with respect to game play. Implications for game design, the use of external scaffolds, games as tools for learning, and gendered findings are discussed.

  14. Unconventional Quantum Computing Devices

    OpenAIRE

    Lloyd, Seth

    2000-01-01

    This paper investigates a variety of unconventional quantum computation devices, including fermionic quantum computers and computers that exploit nonlinear quantum mechanics. It is shown that unconventional quantum computing devices can in principle compute some quantities more rapidly than `conventional' quantum computers.

  15. Computing handbook computer science and software engineering

    CERN Document Server

    Gonzalez, Teofilo; Tucker, Allen

    2014-01-01

    Overview of Computer Science: Structure and Organization of Computing (Peter J. Denning); Computational Thinking (Valerie Barr). Algorithms and Complexity: Data Structures (Mark Weiss); Basic Techniques for Design and Analysis of Algorithms (Edward Reingold); Graph and Network Algorithms (Samir Khuller and Balaji Raghavachari); Computational Geometry (Marc van Kreveld); Complexity Theory (Eric Allender, Michael Loui, and Kenneth Regan); Formal Models and Computability (Tao Jiang, Ming Li, and Bala...

  16. Machine learning in computational biology to accelerate high-throughput protein expression.

    Science.gov (United States)

    Sastry, Anand; Monk, Jonathan; Tegel, Hanna; Uhlen, Mathias; Palsson, Bernhard O; Rockberg, Johan; Brunk, Elizabeth

    2017-08-15

    The Human Protein Atlas (HPA) enables the simultaneous characterization of thousands of proteins across various tissues to pinpoint their spatial location in the human body. This has been achieved through transcriptomics and high-throughput immunohistochemistry-based approaches, where over 40 000 unique human protein fragments have been expressed in E. coli. These datasets enable quantitative tracking of entire cellular proteomes and present new avenues for understanding molecular-level properties influencing expression and solubility. Combining computational biology and machine learning identifies protein properties that hinder the HPA high-throughput antibody production pipeline. We predict protein expression and solubility with accuracies of 70% and 80%, respectively, based on a subset of key properties (aromaticity, hydropathy and isoelectric point). We guide the selection of protein fragments based on these characteristics to optimize high-throughput experimentation. We present the machine learning workflow as a series of IPython notebooks hosted on GitHub (https://github.com/SBRG/Protein_ML). The workflow can be used as a template for analysis of further expression and solubility datasets. ebrunk@ucsd.edu or johanr@biotech.kth.se. Supplementary data are available at Bioinformatics online.
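Two of the three key properties the authors use (aromaticity and hydropathy) can be computed directly from a protein sequence. The sketch below uses the standard Kyte-Doolittle hydropathy scale; it is illustrative only and is not code from the Protein_ML repository:

```python
# Kyte-Doolittle hydropathy values for the 20 standard amino acids.
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5,
      "Q": -3.5, "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5,
      "L": 3.8, "K": -3.9, "M": 1.9, "F": 2.8, "P": -1.6,
      "S": -0.8, "T": -0.7, "W": -0.9, "Y": -1.3, "V": 4.2}

def aromaticity(seq: str) -> float:
    """Fraction of aromatic residues (Phe, Trp, Tyr) in the sequence."""
    return sum(seq.count(a) for a in "FWY") / len(seq)

def gravy(seq: str) -> float:
    """Grand average of hydropathy: mean Kyte-Doolittle score."""
    return sum(KD[a] for a in seq) / len(seq)
```

Features like these, computed per fragment, form the input matrix for the classifiers that predict expression and solubility.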

  17. BioSPICE: access to the most current computational tools for biologists.

    Science.gov (United States)

    Garvey, Thomas D; Lincoln, Patrick; Pedersen, Charles John; Martin, David; Johnson, Mark

    2003-01-01

    The goal of the BioSPICE program is to create a framework that provides biologists access to the most current computational tools. At the program midpoint, the BioSPICE member community has produced a software system that comprises contributions from approximately 20 participating laboratories, integrated under the BioSPICE Dashboard, together with a methodology for continued software integration. Central among these contributed software modules is the BioSPICE Dashboard, a graphical environment that combines Open Agent Architecture and NetBeans software technologies in a coherent, biologist-friendly user interface. The current Dashboard permits data sources, models, simulation engines, and output displays provided by different investigators and running on different machines to work together across a distributed, heterogeneous network. Among several other features, the Dashboard enables users to create graphical workflows by configuring and connecting available BioSPICE components. Anticipated future enhancements to BioSPICE include a notebook capability that will permit researchers to browse and compile data to support model building, a biological model repository, and tools to support the development, control, and data reduction of wet-lab experiments. In addition to the BioSPICE software products, a project website supports information exchange and community building.

  18. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relative high cost of performing these computations on commercially available general purpose computers, a cost high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  19. Applied Parallel Computing Industrial Computation and Optimization

    DEFF Research Database (Denmark)

    Madsen, Kaj; NA NA NA Olesen, Dorte

    Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96).

  20. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  1. BONFIRE: benchmarking computers and computer networks

    OpenAIRE

    Bouckaert, Stefan; Vanhie-Van Gerwen, Jono; Moerman, Ingrid; Phillips, Stephen; Wilander, Jerker

    2011-01-01

    The benchmarking concept is not new in the field of computing or computer networking. With “benchmarking tools”, one usually refers to a program or set of programs, used to evaluate the performance of a solution under certain reference conditions, relative to the performance of another solution. Since the 1970s, benchmarking techniques have been used to measure the performance of computers and computer networks. Benchmarking of applications and virtual machines in an Infrastructure-as-a-Servi...

  2. Democratizing Computer Science

    Science.gov (United States)

    Margolis, Jane; Goode, Joanna; Ryoo, Jean J.

    2015-01-01

    Computer science programs are too often identified with a narrow stratum of the student population, often white or Asian boys who have access to computers at home. But because computers play such a huge role in our world today, all students can benefit from the study of computer science and the opportunity to build skills related to computing. The…

  3. Computing at Stanford.

    Science.gov (United States)

    Feigenbaum, Edward A.; Nielsen, Norman R.

    1969-01-01

    This article provides a current status report on the computing and computer science activities at Stanford University, focusing on the Computer Science Department, the Stanford Computation Center, the recently established regional computing network, and the Institute for Mathematical Studies in the Social Sciences. Also considered are such topics…

  4. O caderno de uma professora-aluna e as propostas para o ensino da aritmética na escola ativa (Minas Gerais, década de 1930 - A teacher’s notebook and the proposals for teaching arithmetic in active school (Minas Gerais, 1930

    Directory of Open Access Journals (Sweden)

    Diogo Alves de Faria Reis

    2014-01-01

    Full Text Available The article focuses on the Methodology of Arithmetic notebook of Imene Guimarães, a student of professor Alda Lodi (1898-2002) in the second class of the Escola de Aperfeiçoamento de Minas Gerais. Alda Lodi was part of the group of teachers sent by the Minas Gerais government to Teachers College, in the United States, to prepare to train practicing primary school teachers in the context of the educational reforms of 1927-1928. Considering the relevance, potentialities and limits of school notebooks as a source for the history of education, the records of this 1932 notebook are studied and compared with other materials, in search of an initial understanding of the modes of appropriation of the proposals for teaching arithmetic at the moment of adherence to the ideas of the active school in Minas Gerais. Keywords: school notebooks, methodology of arithmetic, Escola de Aperfeiçoamento de Minas Gerais, Alda Lodi, history of Brazilian mathematics education.

  5. Soft computing in computer and information science

    CERN Document Server

    Fray, Imed; Pejaś, Jerzy

    2015-01-01

    This book presents a carefully selected and reviewed collection of papers presented during the 19th Advanced Computer Systems conference ACS-2014. The Advanced Computer Systems conference concentrated from its beginning on methods and algorithms of artificial intelligence. Later years brought new areas of interest concerning technical informatics related to soft computing and some more technological aspects of computer science such as multimedia and computer graphics, software engineering, web systems, information security and safety or project management. These topics are represented in the present book under the categories Artificial Intelligence, Design of Information and Multimedia Systems, Information Technology Security and Software Technologies.

  6. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  7. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... When the image slices are reassembled by computer software, the result is a very detailed multidimensional view ...

  8. Computers: Instruments of Change.

    Science.gov (United States)

    Barkume, Megan

    1993-01-01

    Discusses the impact of computers in the home, the school, and the workplace. Looks at changes in computer use by occupations and by industry. Provides information on new job titles in computer occupations. (JOW)

  9. DNA computing models

    CERN Document Server

    Ignatova, Zoya; Zimmermann, Karl-Heinz

    2008-01-01

    In this excellent text, the reader is given a comprehensive introduction to the field of DNA computing. The book emphasizes computational methods to tackle central problems of DNA computing, such as controlling living cells, building patterns, and generating nanomachines.

  10. Distributed multiscale computing

    NARCIS (Netherlands)

    Borgdorff, J.

    2014-01-01

    Multiscale models combine knowledge, data, and hypotheses from different scales. Simulating a multiscale model often requires extensive computation. This thesis evaluates distributing these computations, an approach termed distributed multiscale computing (DMC). First, the process of multiscale

  11. Computational Modeling | Bioenergy | NREL

    Science.gov (United States)

    Computational Modeling. NREL uses computational modeling to study plant cell walls, which are the source of biofuels and biomaterials; our modeling investigates their properties. Quantum Mechanical Models: NREL studies chemical and electronic properties and processes to reduce conversion barriers.

  12. Computer Viruses: An Overview.

    Science.gov (United States)

    Marmion, Dan

    1990-01-01

    Discusses the early history and current proliferation of computer viruses that occur on Macintosh and DOS personal computers, mentions virus detection programs, and offers suggestions for how libraries can protect themselves and their users from damage by computer viruses. (LRW)

  13. Computer Virus and Trends

    OpenAIRE

    Tutut Handayani; Soenarto Usna, Drs., MMSI

    2004-01-01

    Since its first appearance in the mid-1980s, the computer virus has invited controversies that continue to this day. Along with the development of computer-systems technology, computer viruses have found new ways to spread through a variety of existing communication media. This paper discusses several topics related to computer viruses, namely: the definition and history of computer viruses; the basics of computer viruses; the current state of computer viruses; and ...

  14. Plasticity: modeling & computation

    National Research Council Canada - National Science Library

    Borja, Ronaldo Israel

    2013-01-01

    .... "Plasticity Modeling & Computation" is a textbook written specifically for students who want to learn the theoretical, mathematical, and computational aspects of inelastic deformation in solids...

  15. Cloud Computing Quality

    Directory of Open Access Journals (Sweden)

    Anamaria Şiclovan

    2013-02-01

    Full Text Available Cloud computing was, and will remain, a new way of providing Internet and computing services. This approach builds on many existing services, such as the Internet, grid computing, and Web services. Cloud computing as a system aims to provide on-demand services at a more acceptable price and infrastructure. It is precisely the transition from the computer as a product to a service delivered online to consumers. This paper describes the quality of cloud computing services, analyzing the advantages and characteristics they offer. It is a theoretical paper. Keywords: Cloud computing, QoS, quality of cloud computing

  16. Computer hardware fault administration

    Science.gov (United States)

    Archer, Charles J.; Megerian, Mark G.; Ratterman, Joseph D.; Smith, Brian E.

    2010-09-14

    Computer hardware fault administration carried out in a parallel computer, where the parallel computer includes a plurality of compute nodes. The compute nodes are coupled for data communications by at least two independent data communications networks, where each data communications network includes data communications links connected to the compute nodes. Typical embodiments carry out hardware fault administration by identifying a location of a defective link in the first data communications network of the parallel computer and routing communications data around the defective link through the second data communications network of the parallel computer.
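The fault-administration idea described above, locating a defective link in the first network and routing communications around it through the second, can be sketched in a few lines. This is a hedged illustration only: the node numbering, topologies, and breadth-first routing below are assumptions for the sake of the example, not the actual design in the record.

```python
from collections import deque

def bfs_path(links, src, dst, bad=frozenset()):
    """Breadth-first search for a path from src to dst, skipping any
    link (given as a frozenset node pair) listed in `bad`."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        for nxt in links.get(node, ()):
            if nxt not in seen and frozenset((node, nxt)) not in bad:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no fault-free route in this network

# Two independent data communications networks over the same four
# compute nodes (hypothetical topologies):
network_1 = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}   # a simple chain
network_2 = {0: [2], 2: [0, 3], 3: [2, 1], 1: [3]}   # different wiring

defective = frozenset((1, 2))                        # fault located in network 1

route = bfs_path(network_1, 0, 3, bad={defective})   # chain is broken -> None
if route is None:
    route = bfs_path(network_2, 0, 3)                # fall back to network 2
```

With the link (1, 2) marked defective, the first network has no route from node 0 to node 3, so traffic is carried by the second network instead.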

  17. Computer jargon explained

    CERN Document Server

    Enticknap, Nicholas

    2014-01-01

    Computer Jargon Explained is a feature in Computer Weekly publications that discusses 68 of the most commonly used technical computing terms. The book explains what the terms mean and why the terms are important to computer professionals. The text also discusses how the terms relate to the trends and developments that are driving the information technology industry. Computer jargon irritates non-computer people and in turn causes problems for computer people. The technology and the industry are changing so rapidly that it is very hard even for professionals to keep up to date. Computer people do not

  18. Computers and data processing

    CERN Document Server

    Deitel, Harvey M

    1985-01-01

    Computers and Data Processing provides information pertinent to the advances in the computer field. This book covers a variety of topics, including the computer hardware, computer programs or software, and computer applications systems.Organized into five parts encompassing 19 chapters, this book begins with an overview of some of the fundamental computing concepts. This text then explores the evolution of modern computing systems from the earliest mechanical calculating devices to microchips. Other chapters consider how computers present their results and explain the storage and retrieval of

  19. Computers in nuclear medicine

    International Nuclear Information System (INIS)

    Giannone, Carlos A.

    1999-01-01

    This chapter covers the capture and display of images on computers; the hardware and software used, including personal computers, networks, and workstations. The use of special filters determines image quality.

  20. Advances in unconventional computing

    CERN Document Server

    2017-01-01

    Unconventional computing is a niche for interdisciplinary science, a cross-breed of computer science, physics, mathematics, chemistry, electronic engineering, biology, material science and nanotechnology. The aims of this book are to uncover and exploit principles and mechanisms of information processing in, and functional properties of, physical, chemical and living systems, in order to develop efficient algorithms, design optimal architectures and manufacture working prototypes of future and emergent computing devices. This first volume presents theoretical foundations of future and emergent computing paradigms and architectures. The topics covered are computability, (non-)universality and complexity of computation; physics of computation, analog and quantum computing; reversible and asynchronous devices; cellular automata and other mathematical machines; P-systems and cellular computing; infinity and spatial computation; chemical and reservoir computing. The book is an encyclopedia, the first ever complete autho...

  1. Computability and unsolvability

    CERN Document Server

    Davis, Martin

    1985-01-01

    ""A clearly written, well-presented survey of an intriguing subject."" - Scientific American. Classic text considers general theory of computability, computable functions, operations on computable functions, Turing machines self-applied, unsolvable decision problems, applications of general theory, mathematical logic, Kleene hierarchy, computable functionals, classification of unsolvable decision problems and more.

  2. Mathematics for computer graphics

    CERN Document Server

    Vince, John

    2006-01-01

    Helps you understand the mathematical ideas used in computer animation, virtual reality, CAD, and other areas of computer graphics. This work also helps you to rediscover the mathematical techniques required to solve problems and design computer programs for computer graphic applications

  3. Computations and interaction

    NARCIS (Netherlands)

    Baeten, J.C.M.; Luttik, S.P.; Tilburg, van P.J.A.; Natarajan, R.; Ojo, A.

    2011-01-01

    We enhance the notion of a computation of the classical theory of computing with the notion of interaction. In this way, we enhance a Turing machine as a model of computation to a Reactive Turing Machine that is an abstract model of a computer as it is used nowadays, always interacting with the user

  4. Symbiotic Cognitive Computing

    OpenAIRE

    Farrell, Robert G.; Lenchner, Jonathan; Kephart, Jeffrey O.; Webb, Alan M.; Muller, Michael J.; Erickson, Thomas D.; Melville, David O.; Bellamy, Rachel K.E.; Gruen, Daniel M.; Connell, Jonathan H.; Soroker, Danny; Aaron, Andy; Trewin, Shari M.; Ashoori, Maryam; Ellis, Jason B.

    2016-01-01

    IBM Research is engaged in a research program in symbiotic cognitive computing to investigate how to embed cognitive computing in physical spaces. This article proposes 5 key principles of symbiotic cognitive computing.  We describe how these principles are applied in a particular symbiotic cognitive computing environment and in an illustrative application.  

  5. Computer scientist looks at reliability computations

    International Nuclear Information System (INIS)

    Rosenthal, A.

    1975-01-01

    Results from the theory of computational complexity are applied to reliability computations on fault trees and networks. A well known class of problems which almost certainly have no fast solution algorithms is presented. It is shown that even approximately computing the reliability of many systems is difficult enough to be in this class. In the face of this result, which indicates that for general systems the computation time will be exponential in the size of the system, decomposition techniques which can greatly reduce the effective size of a wide variety of realistic systems are explored
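The decomposition idea mentioned at the end can be illustrated with the simplest case: when a system splits into independent series and parallel modules, exact reliability follows from a few multiplications instead of an enumeration that grows exponentially with system size. The component reliabilities below are made-up values for illustration, not data from the record.

```python
def series(*ps):
    """A series system works only if every component works:
    R = p1 * p2 * ... * pn."""
    r = 1.0
    for p in ps:
        r *= p
    return r

def parallel(*ps):
    """A parallel (redundant) system fails only if every component fails:
    R = 1 - (1-p1)(1-p2)...(1-pn)."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

# Decompose, then combine: two redundant pumps (reliability 0.9 each)
# feeding a single valve (reliability 0.99).
pumps = parallel(0.9, 0.9)     # 1 - 0.1 * 0.1 = 0.99
system = series(pumps, 0.99)   # 0.99 * 0.99 = 0.9801
```

Each reduction replaces a module with a single equivalent component, which is exactly how decomposition shrinks the effective size of a system before any expensive general-purpose computation is attempted.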

  6. Roadmap to greener computing

    CERN Document Server

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directi

  7. Brief: Managing computing technology

    International Nuclear Information System (INIS)

    Startzman, R.A.

    1994-01-01

    While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950's, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically probably is the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies

  8. Computer mathematics for programmers

    CERN Document Server

    Abney, Darrell H; Sibrel, Donald W

    1985-01-01

    Computer Mathematics for Programmers presents the Mathematics that is essential to the computer programmer.The book is comprised of 10 chapters. The first chapter introduces several computer number systems. Chapter 2 shows how to perform arithmetic operations using the number systems introduced in Chapter 1. The third chapter covers the way numbers are stored in computers, how the computer performs arithmetic on real numbers and integers, and how round-off errors are generated in computer programs. Chapter 4 details the use of algorithms and flowcharting as problem-solving tools for computer p
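The topics of the first three chapters, number systems and round-off error, can be demonstrated in miniature. This is a free-standing sketch, not an example taken from the book.

```python
from decimal import Decimal

# Number-system conversions: the same value in binary, hex, and octal.
n = 0b101101                  # binary 101101 is decimal 45
as_hex = format(n, 'x')       # '2d'
as_octal = format(n, 'o')     # '55'

# Round-off error: 0.1 has no finite binary representation, so ten
# floating-point additions do not sum exactly to 1.0.
total = sum(0.1 for _ in range(10))
binary_exact = (total == 1.0)          # False on IEEE-754 doubles

# Exact decimal arithmetic avoids this particular error.
decimal_total = sum(Decimal('0.1') for _ in range(10))
decimal_exact = (decimal_total == 1)   # True
```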

  9. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C³P), a five-year project that focused on answering the question: can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high-performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  10. The digital computer

    CERN Document Server

    Parton, K C

    2014-01-01

    The Digital Computer focuses on the principles, methodologies, and applications of the digital computer. The publication takes a look at the basic concepts involved in using a digital computer, simple autocode examples, and examples of working advanced design programs. Discussions focus on transformer design synthesis program, machine design analysis program, solution of standard quadratic equations, harmonic analysis, elementary wage calculation, and scientific calculations. The manuscript then examines commercial and automatic programming, how computers work, and the components of a computer

  11. Cloud computing for radiologists

    OpenAIRE

    Amit T Kharat; Amjad Safvi; S S Thind; Amarjit Singh

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as...

  12. Toward Cloud Computing Evolution

    OpenAIRE

    Susanto, Heru; Almunawar, Mohammad Nabil; Kang, Chen Chin

    2012-01-01

    Information Technology (IT) has shaped the success of organizations, giving them a solid foundation that increases both their level of efficiency and their productivity. The computing industry is witnessing a paradigm shift in the way computing is performed worldwide. There is a growing awareness among consumers and enterprises of accessing their IT resources extensively through a "utility" model known as "cloud computing." Cloud computing was initially rooted in distributed grid-based computing. ...

  13. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer.This book discusses the algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  14. A powerful way of cooling computer chip using liquid metal with low melting point as the cooling fluid

    Energy Technology Data Exchange (ETDEWEB)

    Li Teng; Lv Yong-Gang [Chinese Academy of Sciences, Beijing (China). Cryogenic Lab.; Chinese Academy of Sciences, Beijing (China). Graduate School; Liu Jing; Zhou Yi-Xin [Chinese Academy of Sciences, Beijing (China). Cryogenic Lab.

    2006-12-15

    With the improvement of computational speed, thermal management has become a serious concern in computer systems. CPU chips are squeezed into tighter and tighter spaces, with no more room for heat to escape. Total power-dissipation levels now reside at about 110 W, and peak power densities are reaching 400-500 W/mm² and are still steadily climbing. As a result, higher performance and greater reliability are extremely tough to attain. Since the standard conduction and forced-air convection techniques are no longer able to provide adequate cooling for sophisticated electronic systems, new solutions are being explored: liquid cooling, thermoelectric cooling, heat pipes, and vapor chambers. In this paper, we investigated a novel method to significantly lower the chip temperature using a liquid metal with a low melting point as the cooling fluid. Liquid gallium was adopted to test the feasibility of this cooling approach, owing to its low melting point of 29.7 °C, high thermal conductivity, and high heat capacity. A series of experiments with different flow rates and heat-dissipation rates were performed. The cooling capacity and reliability of the liquid metal were compared with those of water cooling, and very attractive results were obtained. Finally, a general criterion was introduced to evaluate the cooling-performance difference between liquid-metal cooling and water cooling. The results indicate that the temperature of the computer chip can be significantly reduced with increasing flow rate of liquid gallium, which suggests that an even higher power-dissipation density can be achieved with a large flow of liquid gallium and a large heat-dissipation area. The concept discussed in this paper is expected to provide a powerful cooling strategy for notebook PCs, desktop PCs and large computers. It can also be extended to wider areas involving thermal management at high heat-generation rates. (orig.)
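The flow-rate dependence described above follows from the steady-state energy balance Q = ṁ·c_p·ΔT, so the coolant temperature rise is ΔT = Q / (ρ·V̇·c_p). The sketch below is a rough back-of-the-envelope comparison at the quoted 110 W; the flow rate and the fluid property values are approximate assumptions, not the paper's measured data.

```python
def coolant_temperature_rise(power_w, flow_lpm, density, specific_heat):
    """Steady-state energy balance Q = m_dot * c_p * dT, solved for dT.
    density in kg/m^3, specific_heat in J/(kg K), flow in L/min."""
    v_dot = flow_lpm / 1000.0 / 60.0            # L/min -> m^3/s
    m_dot = density * v_dot                      # kg/s
    return power_w / (m_dot * specific_heat)    # K

# Approximate room-temperature properties (assumed values):
dt_gallium = coolant_temperature_rise(110, 0.5, 6095, 370)   # ~5.9 K
dt_water   = coolant_temperature_rise(110, 0.5, 998, 4180)   # ~3.2 K
```

Note that by this per-volume heat-capacity metric water is actually competitive; the paper attributes gallium's advantage chiefly to its much higher thermal conductivity, which improves heat transfer from the chip into the coolant.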

  15. Development traumatic brain injury computer user interface for disaster area in Indonesia supported by emergency broadband access network.

    Science.gov (United States)

    Sutiono, Agung Budi; Suwa, Hirohiko; Ohta, Toshizumi; Arifin, Muh Zafrullah; Kitamura, Yohei; Yoshida, Kazunari; Merdika, Daduk; Qiantori, Andri; Iskandar

    2012-12-01

    Disasters bring negative impacts on the environment and human life. One of the common causes of critical condition is traumatic brain injury (TBI), namely epidural (EDH) and subdural (SDH) hematoma, caused by hard objects falling during an earthquake. We proposed and analyzed the user response, namely of neurosurgeons, general doctors/surgeons, and nurses, when they interacted with the TBI computer interface. The communication system was supported by TBI web-based applications using an emergency broadband access network with a tethered balloon, and was simulated in a field trial to evaluate the coverage area. The interface consisted of demography data and multiple tabs for anamnesis, treatment, follow-up, and teleconference interfaces. The interface allows neurosurgeons, surgeons/general doctors, and nurses to enter EDH and SDH patients' data while referring them during the emergency simulation, and was evaluated based on the time needed and the users' understanding. The average time needed on a Lenovo T500 notebook using a mouse was 8-10 min for neurosurgeons, 12-15 min for surgeons/general doctors, and 15-19 min for nurses. Using a ThinkPad X201 Tablet, the time needed for data entry was 5-7 min for neurosurgeons, 7-10 min for surgeons/general doctors, and 12-16 min for nurses. We observed that the time difference depended on the computer type and the user's literacy qualification, as well as their understanding of traumatic brain injury, particularly for the nurses. In conclusion, there are five data classifications for a simple TBI GUI, namely: 1) demography, 2) specific anamnesis for EDH and SDH, 3) treatment actions and medication for TBI, 4) follow-up data display, and 5) teleneurosurgery for streaming-video consultation. The tablet PC in particular was more convenient and faster for data entry than the computer mouse and touchpad.
Emergency broadband access network using a tethered balloon could be employed to cover the communication systems in
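The five data classifications listed in the conclusion could be organized as a single record type. The sketch below is purely hypothetical: the field names and types are assumptions for illustration, not the study's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class TBIRecord:
    """One patient record, mirroring the five categories described:
    demography, anamnesis, treatment, follow-up, teleconference."""
    demography: dict                                  # 1) patient identity data
    anamnesis: dict                                   # 2) EDH/SDH-specific history
    treatment: list = field(default_factory=list)     # 3) actions and medication
    follow_up: list = field(default_factory=list)     # 4) follow-up data to display
    teleconference_url: str = ""                      # 5) streaming consultation link

record = TBIRecord(demography={"age": 34, "sex": "M"},
                   anamnesis={"diagnosis": "EDH"})
```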

  16. Synthetic Computation: Chaos Computing, Logical Stochastic Resonance, and Adaptive Computing

    Science.gov (United States)

    Kia, Behnam; Murali, K.; Jahed Motlagh, Mohammad-Reza; Sinha, Sudeshna; Ditto, William L.

    Nonlinearity and chaos can exhibit numerous behaviors and patterns, and one can select different patterns from this rich library. In this paper we focus on synthetic computing, a field that engineers and synthesizes nonlinear systems to obtain computation. We explain the importance of nonlinearity, and describe how nonlinear systems can be engineered to perform computation. More specifically, we provide an overview of chaos computing, a field that manually programs chaotic systems to build different types of digital functions. We also briefly describe logical stochastic resonance (LSR), and then extend the approach of LSR to realize combinational digital logic systems via suitable concatenation of existing logical stochastic resonance blocks. Finally we demonstrate how a chaotic system can be engineered and mated with different machine learning techniques, such as artificial neural networks, random searching, and genetic algorithms, to design different autonomous systems that can adapt and respond to environmental conditions.
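A minimal flavor of chaos computing, not the authors' implementation, is to encode binary inputs as perturbations of a chaotic map's initial state, iterate once, and threshold the result. With the illustrative parameters below (x0, delta, and the threshold are assumptions chosen for a clean demonstration), one iterate of the logistic map realizes an XOR gate.

```python
def logistic(x, r=4.0):
    """One iterate of the chaotic logistic map x -> r*x*(1-x)."""
    return r * x * (1.0 - x)

def chaotic_gate(i1, i2, x0=0.25, delta=0.25, threshold=0.75):
    """Encode two binary inputs as perturbations of the initial state,
    iterate the map once, and threshold the output.  With these
    illustrative parameters the gate computes XOR."""
    x = x0 + delta * i1 + delta * i2
    return 1 if logistic(x) > threshold else 0

truth = {(a, b): chaotic_gate(a, b) for a in (0, 1) for b in (0, 1)}
```

Changing x0, delta, or the threshold selects a different pattern from the map's "library" and hence a different logic function, which is the programmability the abstract refers to.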

  17. Future Computer Requirements for Computational Aerodynamics

    Science.gov (United States)

    1978-01-01

    Recent advances in computational aerodynamics are discussed as well as motivations for and potential benefits of a National Aerodynamic Simulation Facility having the capability to solve fluid dynamic equations at speeds two to three orders of magnitude faster than presently possible with general computers. Two contracted efforts to define processor architectures for such a facility are summarized.

  18. Computers and Computation. Readings from Scientific American.

    Science.gov (United States)

    Fenichel, Robert R.; Weizenbaum, Joseph

    A collection of articles from "Scientific American" magazine has been put together at this time because the current period in computer science is one of consolidation rather than innovation. A few years ago, computer science was moving so swiftly that even the professional journals were more archival than informative; but today it is…

  19. Know Your Personal Computer Introduction to Computers

    Indian Academy of Sciences (India)

    Siddhartha Kumar Ghoshal. Series Article. Resonance – Journal of Science Education, Volume 1, Issue 1, January 1996, pp. 48-55.

  20. Heterotic computing: exploiting hybrid computational devices.

    Science.gov (United States)

    Kendon, Viv; Sebald, Angelika; Stepney, Susan

    2015-07-28

    Current computational theory deals almost exclusively with single models: classical, neural, analogue, quantum, etc. In practice, researchers use ad hoc combinations, realizing only recently that they can be fundamentally more powerful than the individual parts. A Theo Murphy meeting brought together theorists and practitioners of various types of computing, to engage in combining the individual strengths to produce powerful new heterotic devices. 'Heterotic computing' is defined as a combination of two or more computational systems such that they provide an advantage over either substrate used separately. This post-meeting collection of articles provides a wide-ranging survey of the state of the art in diverse computational paradigms, together with reflections on their future combination into powerful and practical applications. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  1. Cloud Computing for radiologists.

    Science.gov (United States)

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  2. Cloud Computing for radiologists

    International Nuclear Information System (INIS)

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future

  3. Cloud computing for radiologists

    Directory of Open Access Journals (Sweden)

    Amit T Kharat

    2012-01-01

    Full Text Available Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software, hardware, on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  4. Review of quantum computation

    International Nuclear Information System (INIS)

    Lloyd, S.

    1992-01-01

    Digital computers are machines that can be programmed to perform logical and arithmetical operations. Contemporary digital computers are "universal," in the sense that a program that runs on one computer can, if properly compiled, run on any other computer that has access to enough memory space and time. Any one universal computer can simulate the operation of any other; and the set of tasks that any such machine can perform is common to all universal machines. Since Bennett's discovery that computation can be carried out in a non-dissipative fashion, a number of Hamiltonian quantum-mechanical systems have been proposed whose time-evolutions over discrete intervals are equivalent to those of specific universal computers. The first quantum-mechanical treatment of computers was given by Benioff, who exhibited a Hamiltonian system with a basis whose members corresponded to the logical states of a Turing machine. In order to make the Hamiltonian local, in the sense that its structure depended only on the part of the computation being performed at that time, Benioff found it necessary to make the Hamiltonian time-dependent. Feynman discovered a way to make the computational Hamiltonian both local and time-independent by incorporating the direction of computation in the initial condition. In Feynman's quantum computer, the program is a carefully prepared wave packet that propagates through different computational states. Deutsch presented a quantum computer that exploits the possibility of existing in a superposition of computational states to perform tasks that a classical computer cannot, such as generating purely random numbers, and carrying out superpositions of computations as a method of parallel processing. In this paper, we show that such computers, by virtue of their common function, possess a common form for their quantum dynamics
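Deutsch's point that a quantum computer can exist in a superposition of computational states can be made concrete with a two-amplitude toy model. This is a sketch with real amplitudes only, ignoring complex phases; it is not taken from the paper under review.

```python
import math

# A qubit is a unit vector (a, b): measurement yields 0 with
# probability a**2 and 1 with probability b**2 (real amplitudes here).
zero = (1.0, 0.0)

def hadamard(q):
    """Apply the Hadamard gate, which maps a basis state to an equal
    superposition of both basis states."""
    a, b = q
    s = 1.0 / math.sqrt(2.0)
    return (s * (a + b), s * (a - b))

plus = hadamard(zero)                  # equal superposition of |0> and |1>
probs = (plus[0] ** 2, plus[1] ** 2)   # 50/50 measurement statistics
back = hadamard(plus)                  # H is its own inverse: back to |0>
```

The intermediate state `plus` is genuinely both computational states at once, which is the resource Deutsch's random-number generation and parallel-computation examples exploit.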

  5. Computers for imagemaking

    CERN Document Server

    Clark, D

    1981-01-01

    Computers for Image-Making tells the computer non-expert everything they need to know about computer animation. In the hands of expert computer engineers, computer picture-drawing systems have, since the earliest days of computing, produced interesting and useful images. As a result of major technological developments since then, it no longer requires an expert's skill to draw pictures; anyone can do it, provided they know how to use the appropriate machinery. This collection of specially commissioned articles reflects the diversity of user applications in this expanding field

  6. Computer Lexis and Terminology

    Directory of Open Access Journals (Sweden)

    Gintautas Grigas

    2011-04-01

    Full Text Available The computer has become a widely used tool in everyday work and at home. Every computer user sees texts on the screen containing many words that name new concepts. Those words come from the terminology used by specialists. A common vocabulary shared by computer terminology and the lexis of everyday language comes into existence. The article deals with the part of computer terminology that passes into everyday usage and with the influence of ordinary language on computer terminology. The relation between English and Lithuanian computer terminology and the construction and pronunciation of acronyms are discussed as well.

  7. Computations in plasma physics

    International Nuclear Information System (INIS)

    Cohen, B.I.; Killeen, J.

    1984-01-01

    A review of computer applications in plasma physics is presented. The contribution of computers to the investigation of magnetic and inertial confinement of a plasma and of charged-particle beam propagation is described. Typical uses of computers for the simulation and control of laboratory and cosmic plasma experiments and for data accumulation in these experiments are considered. Basic computational methods applied in plasma physics are discussed. Future trends in computer utilization in plasma research are considered in terms of the increasing role of microprocessors and high-speed data plotters and the need for more powerful computers

  8. Quantum computer science

    CERN Document Server

    Lanzagorta, Marco

    2009-01-01

    In this text we present a technical overview of the emerging field of quantum computation along with new research results by the authors. What distinguishes our presentation from that of others is our focus on the relationship between quantum computation and computer science. Specifically, our emphasis is on the computational model of quantum computing rather than on the engineering issues associated with its physical implementation. We adopt this approach for the same reason that a book on computer programming doesn't cover the theory and physical realization of semiconductors. Another distin

  9. Explorations in quantum computing

    CERN Document Server

    Williams, Colin P

    2011-01-01

    By the year 2020, the basic memory components of a computer will be the size of individual atoms. At such scales, the current theory of computation will become invalid. "Quantum computing" is reinventing the foundations of computer science and information theory in a way that is consistent with quantum physics - the most accurate model of reality currently known. Remarkably, this theory predicts that quantum computers can perform certain tasks breathtakingly faster than classical computers -- and, better yet, can accomplish mind-boggling feats such as teleporting information, breaking suppos

  10. Physics vs. computer science

    International Nuclear Information System (INIS)

    Pike, R.

    1982-01-01

    With computers becoming more frequently used in theoretical and experimental physics, physicists can no longer afford to be ignorant of the basic techniques and results of computer science. Computing principles belong in a physicist's tool box, along with experimental methods and applied mathematics, and the easiest way to educate physicists in computing is to provide, as part of the undergraduate curriculum, a computing course designed specifically for physicists. As well, the working physicist should interact with computer scientists, giving them challenging problems in return for their expertise. (orig.)

  11. Polymorphous computing fabric

    Science.gov (United States)

    Wolinski, Christophe Czeslaw [Los Alamos, NM; Gokhale, Maya B [Los Alamos, NM; McCabe, Kevin Peter [Los Alamos, NM

    2011-01-18

    Fabric-based computing systems and methods are disclosed. A fabric-based computing system can include a polymorphous computing fabric that can be customized on a per application basis and a host processor in communication with said polymorphous computing fabric. The polymorphous computing fabric includes a cellular architecture that can be highly parameterized to enable a customized synthesis of fabric instances for a variety of enhanced application performances thereof. A global memory concept can also be included that provides the host processor random access to all variables and instructions associated with the polymorphous computing fabric.

  12. Computer ray tracing speeds.

    Science.gov (United States)

    Robb, P; Pawlowski, B

    1990-05-01

    The results of measuring the ray trace speed and compilation speed of thirty-nine computers in fifty-seven configurations, ranging from personal computers to super computers, are described. A correlation of ray trace speed has been made with the LINPACK benchmark which allows the ray trace speed to be estimated using LINPACK performance data. The results indicate that the latest generation of workstations, using CPUs based on RISC (Reduced Instruction Set Computer) technology, are as fast or faster than mainframe computers in compute-bound situations.
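
    The correlation described above can be sketched in a few lines. The benchmark figures below are hypothetical placeholders (the paper's 57 measured configurations are not reproduced here), and the proportional model is only an assumed functional form:

    ```python
    import numpy as np

    # Hypothetical (LINPACK MFLOPS, ray-trace speed) pairs standing in for
    # the paper's measured configurations -- not actual benchmark results.
    linpack = np.array([0.5, 2.0, 8.0, 25.0, 60.0])
    rays_per_s = np.array([40.0, 170.0, 640.0, 2100.0, 4900.0])

    # Least-squares fit of a proportional model: rays/s ~ k * MFLOPS
    k = (linpack @ rays_per_s) / (linpack @ linpack)

    # Estimate the ray-trace speed of a machine from its LINPACK score alone
    print(round(k * 15.0))
    ```

    Once `k` is fitted, any machine's LINPACK score yields an estimate of its ray-trace speed, which is the use of the correlation the abstract describes.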

  13. Computing networks from cluster to cloud computing

    CERN Document Server

    Vicat-Blanc, Pascale; Guillier, Romaric; Soudan, Sebastien

    2013-01-01

    "Computing Networks" explores the core of the new distributed computing infrastructures we are using today:  the networking systems of clusters, grids and clouds. It helps network designers and distributed-application developers and users to better understand the technologies, specificities, constraints and benefits of these different infrastructures' communication systems. Cloud Computing will give the possibility for millions of users to process data anytime, anywhere, while being eco-friendly. In order to deliver this emerging traffic in a timely, cost-efficient, energy-efficient, and

  14. Computing Nash equilibria through computational intelligence methods

    Science.gov (United States)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, and differential evolution, in computing Nash equilibria of finite strategic games as global minima of a real-valued, nonnegative function. An issue of particular interest is to detect more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
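
    The reformulation used in this line of work, finding Nash equilibria as global minima of a real-valued, nonnegative function, can be illustrated with a minimal sketch. The game (matching pennies) and the naive random multistart search are illustrative stand-ins for the paper's test games and evolutionary optimizers:

    ```python
    import numpy as np

    def nash_gap(x, y, A, B):
        """Nonnegative deviation-gain function: zero exactly at a Nash equilibrium."""
        gain1 = np.max(A @ y) - x @ A @ y   # player 1's best unilateral improvement
        gain2 = np.max(x @ B) - x @ B @ y   # player 2's best unilateral improvement
        return gain1 + gain2

    A = np.array([[1.0, -1.0], [-1.0, 1.0]])   # matching pennies, player 1 payoffs
    B = -A                                      # zero-sum game

    eq = np.array([0.5, 0.5])                   # the unique mixed equilibrium
    print(nash_gap(eq, eq, A, B))               # 0.0 at the equilibrium

    # Naive random multistart over the strategy space (a stand-in for the
    # evolutionary methods studied in the paper)
    rng = np.random.default_rng(0)
    best = min(nash_gap(np.array([p, 1 - p]), np.array([q, 1 - q]), A, B)
               for p, q in rng.random((2000, 2)))
    print(best < 0.2)                           # True: the minimum approaches zero
    ```

    Because `nash_gap` is zero exactly at equilibria and positive elsewhere, any global minimizer, evolutionary or otherwise, can be pointed at it; deflection then suppresses already-found minima so further equilibria can be detected.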

  15. Reversible computing fundamentals, quantum computing, and applications

    CERN Document Server

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique.Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergr

  16. Computing in high energy physics

    Energy Technology Data Exchange (ETDEWEB)

    Watase, Yoshiyuki

    1991-09-15

    The increasingly important role played by computing and computers in high energy physics is displayed in the 'Computing in High Energy Physics' series of conferences, bringing together experts in different aspects of computing - physicists, computer scientists, and vendors.

  17. Searching with Quantum Computers

    OpenAIRE

    Grover, Lov K.

    2000-01-01

    This article introduces quantum computation by analogy with probabilistic computation. A basic description of the quantum search algorithm is given by representing the algorithm as a C program in a novel way.
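
    The algorithm the article describes can be illustrated with a classical simulation of Grover's amplitude amplification (a Python sketch rather than the article's C program; the database size and marked index here are arbitrary choices):

    ```python
    import numpy as np

    N, marked = 16, 3                                # 16-item database, one marked item
    state = np.full(N, 1 / np.sqrt(N))               # uniform superposition over all items
    iterations = int(round(np.pi / 4 * np.sqrt(N)))  # optimal count ~ (pi/4) * sqrt(N)

    for _ in range(iterations):
        state[marked] *= -1                          # oracle: flip the marked amplitude
        state = 2 * state.mean() - state             # diffusion: inversion about the mean

    print(state[marked] ** 2 > 0.9)                  # True: marked item is now very likely
    ```

    After about (pi/4)·sqrt(N) iterations the probability of measuring the marked item is close to 1, versus the ~N/2 queries a classical probabilistic search needs on average.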

  18. Book Review: Computational Topology

    DEFF Research Database (Denmark)

    Raussen, Martin

    2011-01-01

    Computational Topology by Herbert Edelsbrunner and John L. Harer. American Mathematical Society, 2010 - ISBN 978-0-8218-4925-5

  19. Essential numerical computer methods

    CERN Document Server

    Johnson, Michael L

    2010-01-01

    The use of computers and computational methods has become ubiquitous in biological and biomedical research. During the last 2 decades most basic algorithms have not changed, but what has is the huge increase in computer speed and ease of use, along with the corresponding orders of magnitude decrease in cost. A general perception exists that the only applications of computers and computer methods in biological and biomedical research are either basic statistical analysis or the searching of DNA sequence data bases. While these are important applications they only scratch the surface of the current and potential applications of computers and computer methods in biomedical research. The various chapters within this volume include a wide variety of applications that extend far beyond this limited perception. As part of the Reliable Lab Solutions series, Essential Numerical Computer Methods brings together chapters from volumes 210, 240, 321, 383, 384, 454, and 467 of Methods in Enzymology. These chapters provide ...

  20. Know Your Personal Computer

    Indian Academy of Sciences (India)


  1. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  2. SSCL computer planning

    International Nuclear Information System (INIS)

    Price, L.E.

    1990-01-01

    The SSC Laboratory is in the process of planning the acquisition of a substantial computing system to support the design of detectors. Advice has been sought from users and computer experts in several stages. This paper discuss this process

  3. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... images. These images can be viewed on a computer monitor, printed on film or transferred to a ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ...

  4. Computational Science Facility (CSF)

    Data.gov (United States)

    Federal Laboratory Consortium — PNNL Institutional Computing (PIC) is focused on meeting DOE's mission needs and is part of PNNL's overarching research computing strategy. PIC supports large-scale...

  5. Quantum Computer Science

    Science.gov (United States)

    Mermin, N. David

    2007-08-01

    Preface; 1. Cbits and Qbits; 2. General features and some simple examples; 3. Breaking RSA encryption with a quantum computer; 4. Searching with a quantum computer; 5. Quantum error correction; 6. Protocols that use just a few Qbits; Appendices; Index.

  6. Computer Vision Syndrome.

    Science.gov (United States)

    Randolph, Susan A

    2017-07-01

    With the increased use of electronic devices with visual displays, computer vision syndrome is becoming a major public health issue. Improving the visual status of workers using computers results in greater productivity in the workplace and improved visual comfort.

  7. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... are the limitations of CT of the Sinuses? What is CT (Computed Tomography) of the Sinuses? Computed ... nasal cavity by small openings. top of page What are some common uses of the procedure? CT ...

  8. Computer Technology Directory.

    Science.gov (United States)

    Exceptional Parent, 1990

    1990-01-01

    This directory lists approximately 300 commercial vendors that offer computer hardware, software, and communication aids for children with disabilities. The company listings indicate computer compatibility and specific disabilities served by their products. (JDD)

  9. My Computer Is Learning.

    Science.gov (United States)

    Good, Ron

    1986-01-01

    Describes instructional uses of computer programs found in David Heiserman's book "Projects in Machine Intelligence for Your Home Computer." The programs feature "creatures" of various colors that move around within a rectangular white border. (JN)

  10. What is Computed Tomography?

    Science.gov (United States)

    ... Imaging Medical X-ray Imaging What is Computed Tomography? Share Tweet Linkedin Pin it More sharing options ... Chest X ray Image back to top Computed Tomography (CT) Although also based on the variable absorption ...

  11. Joint Computing Facility

    Data.gov (United States)

    Federal Laboratory Consortium — Raised Floor Computer Space for High Performance ComputingThe ERDC Information Technology Laboratory (ITL) provides a robust system of IT facilities to develop and...

  12. Computing for Belle

    CERN Multimedia

    CERN. Geneva

    2004-01-01

    2s-1, 10 times as much as we obtain now. This presentation describes Belle's efficient computing operations, struggles to manage large amounts of raw and physics data, and plans for Belle computing for Super KEKB/Belle.

  13. Computational Continuum Mechanics

    CERN Document Server

    Shabana, Ahmed A

    2011-01-01

    This text presents the theory of continuum mechanics using computational methods. Ideal for students and researchers, the second edition features a new chapter on computational geometry and finite element analysis.

  14. Applications of computer algebra

    CERN Document Server

    1985-01-01

    Today, certain computer software systems exist which surpass the computational ability of researchers when their mathematical techniques are applied to many areas of science and engineering. These computer systems can perform a large portion of the calculations seen in mathematical analysis. Despite this massive power, thousands of people use these systems as a routine resource for everyday calculations. These software programs are commonly called "Computer Algebra" systems. They have names such as MACSYMA, MAPLE, muMATH, REDUCE and SMP. They are receiving credit as a computational aid with increasing regularity in articles in the scientific and engineering literature. When most people think about computers and scientific research these days, they imagine a machine grinding away, processing numbers arithmetically. It is not generally realized that, for a number of years, computers have been performing non-numeric computations. This means, for example, that one inputs an equation and obtains a closed for...

  15. ICASE Computer Science Program

    Science.gov (United States)

    1985-01-01

    The Institute for Computer Applications in Science and Engineering computer science program is discussed in outline form. Information is given on such topics as problem decomposition, algorithm development, programming languages, and parallel architectures.

  16. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... ring, called a gantry. The computer workstation that processes the imaging information is located in a separate ... follows a spiral path. A special computer program processes this large volume of data to create two- ...

  17. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... ring, called a gantry. The computer workstation that processes the imaging information is located in a separate ... follows a spiral path. A special computer program processes this large volume of data to create two- ...

  18. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... Stroke Brain Tumors Computer Tomography (CT) Safety During Pregnancy Head and Neck Cancer X-ray, Interventional Radiology and Nuclear Medicine Radiation Safety Images related to Computed Tomography (CT) - ...

  19. Intimacy and Computer Communication.

    Science.gov (United States)

    Robson, Dave; Robson, Maggie

    1998-01-01

    Addresses the relationship between intimacy and communication that is based on computer technology. Discusses definitions of intimacy and the nature of intimate conversations that use computers as a communications medium. Explores implications for counseling. (MKA)

  20. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... other in a ring, called a gantry. The computer workstation that processes the imaging information is located ... ray beam follows a spiral path. A special computer program processes this large volume of data to ...

  1. Cognitive Computing for Security.

    Energy Technology Data Exchange (ETDEWEB)

    Debenedictis, Erik [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Rothganger, Fredrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Aimone, James Bradley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Marinella, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Evans, Brian Robert [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Warrender, Christina E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mickel, Patrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    Final report for Cognitive Computing for Security LDRD 165613. It reports on the development of a hybrid general-purpose/neuromorphic computer architecture, with an emphasis on potential implementation with memristors.

  2. Computation: A New Open Access Journal of Computational Chemistry, Computational Biology and Computational Engineering

    OpenAIRE

    Karlheinz Schwarz; Rainer Breitling; Christian Allen

    2013-01-01

    Computation (ISSN 2079-3197; http://www.mdpi.com/journal/computation) is an international scientific open access journal focusing on fundamental work in the field of computational science and engineering. Computational science has become essential in many research areas by contributing to solving complex problems in fundamental science all the way to engineering. The very broad range of application domains suggests structuring this journal into three sections, which are briefly characterized ...

  3. Nanoelectronics: Metrology and Computation

    International Nuclear Information System (INIS)

    Lundstrom, Mark; Clark, Jason V.; Klimeck, Gerhard; Raman, Arvind

    2007-01-01

    Research in nanoelectronics poses new challenges for metrology, but advances in theory, simulation and computing and networking technology provide new opportunities to couple simulation and metrology. This paper begins with a brief overview of current work in computational nanoelectronics. Three examples of how computation can assist metrology will then be discussed. The paper concludes with a discussion of how cyberinfrastructure can help connect computing and metrology using the nanoHUB (www.nanoHUB.org) as a specific example

  4. Foundations of Neuromorphic Computing

    Science.gov (United States)

    2013-05-01

    Final technical report (May 2013; reporting period Sep 2009 - Sep 2012; approved for public release) on the foundations of neuromorphic computing. Topics include two sensing paradigms, few sensors with complex computation versus many sensors with simple computation, and the challenges of nano-enabled neuromorphic chips.

  5. Approximation and Computation

    CERN Document Server

    Gautschi, Walter; Rassias, Themistocles M

    2011-01-01

    Approximation theory and numerical analysis are central to the creation of accurate computer simulations and mathematical models. Research in these areas can influence the computational techniques used in a variety of mathematical and computational sciences. This collection of contributed chapters, dedicated to renowned mathematician Gradimir V. Milovanović, represents the recent work of experts in the fields of approximation theory and numerical analysis. These invited contributions describe new trends in these important areas of research including theoretic developments, new computational alg

  6. Computed tomography for radiographers

    International Nuclear Information System (INIS)

    Brooker, M.

    1986-01-01

    Computed tomography is regarded by many as a complicated union of sophisticated x-ray equipment and computer technology. This book overcomes these complexities. The rigid technicalities of the machinery and the clinical aspects of computed tomography are discussed including the preparation of patients, both physically and mentally, for scanning. Furthermore, the author also explains how to set up and run a computed tomography department, including advice on how the room should be designed

  7. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  8. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  9. Quantum mechanics and computation

    International Nuclear Information System (INIS)

    Cirac Sasturain, J. I.

    2000-01-01

    We review how some of the basic principles of Quantum Mechanics can be used in the field of computation. In particular, we explain why a quantum computer can perform certain tasks in a much more efficient way than the computers we have available nowadays. We give the requirements for a quantum system to be able to implement a quantum computer and illustrate these requirements in some particular physical situations. (Author) 16 refs

  10. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  11. COMPUTER GAMES AND EDUCATION

    OpenAIRE

    Sukhov, Anton

    2018-01-01

    This paper is devoted to the study of the educational resources and possibilities of modern computer games. The “internal” educational aspects of computer games include an educational mechanism (a separate or integrated “tutorial”) and the representation of a real or even fantastic educational process within virtual worlds. The “external” dimension represents the educational opportunities of computer games for personal and professional development in different genres of computer games (various transport, so...

  12. Man and computer

    International Nuclear Information System (INIS)

    Fischbach, K.F.

    1981-01-01

    The discussion of the cultural and sociological consequences of computer evolution is hindered by human prejudice. For example, the sentence 'a computer is at best as intelligent as its programmer' veils actual developments. The theoretical limits of computer intelligence are the limits of intelligence in general. Modern computer systems replace not only human labour but also human decision making, and thereby human responsibility. The historical situation is unique: human head-work is being automated and man is losing function. (orig.) [de

  13. Computational physics an introduction

    CERN Document Server

    Vesely, Franz J

    1994-01-01

    Author Franz J. Vesely offers students an introductory text on computational physics, providing them with the important basic numerical/computational techniques. His unique text sets itself apart from others by focusing on specific problems of computational physics. The author also provides a selection of modern fields of research. Students will benefit from the appendixes which offer a short description of some properties of computing and machines and outline the technique of 'Fast Fourier Transformation.'

  14. Computing environment logbook

    Science.gov (United States)

    Osbourn, Gordon C; Bouchard, Ann M

    2012-09-18

    A computing environment logbook logs events occurring within a computing environment. The events are displayed as a history of past events within the logbook of the computing environment. The logbook provides search functionality to search through the history of past events to find one or more selected past events, and further, enables an undo of the one or more selected past events.
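
    The mechanism described (log events, display and search the history, undo selected past events) can be sketched as follows; the `Event`/`Logbook` names and the undo-callback design are illustrative assumptions, not the patented implementation:

    ```python
    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Event:
        name: str
        undo: Callable[[], None]   # callback that reverses this event's effect

    class Logbook:
        """Log events in a computing environment; search and undo past events."""
        def __init__(self):
            self.history: List[Event] = []

        def log(self, event: Event):
            self.history.append(event)

        def search(self, text: str):
            return [e for e in self.history if text in e.name]

        def undo_events(self, events):
            for e in reversed(events):   # undo newest-first
                e.undo()
                self.history.remove(e)

    # Example: a toy environment whose variable assignments are logged with undo actions
    env = {}
    book = Logbook()

    def set_var(key, value):
        old = env.get(key)
        env[key] = value
        undo = (lambda: env.pop(key)) if old is None else (lambda: env.update({key: old}))
        book.log(Event(f"set {key}", undo))

    set_var("x", 1)
    set_var("y", 2)
    book.undo_events(book.search("x"))   # roll back every event mentioning "x"
    print(env)                           # {'y': 2}
    ```

    Storing an undo callback with each logged event is one way to make arbitrary selected past events reversible without replaying the whole history.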

  15. The Computer Revolution.

    Science.gov (United States)

    Berkeley, Edmund C.

    "The Computer Revolution", a part of the "Second Industrial Revolution", is examined with reference to the social consequences of computers. The subject is introduced in an opening section which discusses the revolution in the handling of information and the history, powers, uses, and working s of computers. A second section examines in detail the…

  16. Advances in physiological computing

    CERN Document Server

    Fairclough, Stephen H

    2014-01-01

    This edited collection will provide an overview of the field of physiological computing, i.e. the use of physiological signals as input for computer control. It will cover a breadth of current research, from brain-computer interfaces to telemedicine.

  17. Physics of quantum computation

    International Nuclear Information System (INIS)

    Belokurov, V.V.; Khrustalev, O.A.; Sadovnichij, V.A.; Timofeevskaya, O.D.

    2003-01-01

    In the paper, the modern status of the theory of quantum computation is considered. The fundamental principles of quantum computers and their basic notions such as quantum processors and computational basis states of the quantum Turing machine as well as the quantum Fourier transform are discussed. Some possible experimental realizations on the basis of NMR methods are given

  18. Quantum walk computation

    International Nuclear Information System (INIS)

    Kendon, Viv

    2014-01-01

    Quantum versions of random walks have diverse applications that are motivating experimental implementations as well as theoretical studies. Recent results showing quantum walks are “universal for quantum computation” relate to algorithms, to be run on quantum computers. We consider whether an experimental implementation of a quantum walk could provide useful computation before we have a universal quantum computer

  19. The Challenge of Computers.

    Science.gov (United States)

    Leger, Guy

    Computers may change teachers' lifestyles, teaching styles, and perhaps even their personal values. A brief survey of the history of computers demonstrates the incredible pace at which computer technology is moving ahead. The cost and size of microchips will continue to decline dramatically over the next 20 years, while the capability and variety…

  20. Visitor's Computer Guidelines | CTIO

    Science.gov (United States)


  1. Medical Computational Thinking

    DEFF Research Database (Denmark)

    Musaeus, Peter; Tatar, Deborah Gail; Rosen, Michael A.

    2017-01-01

    Computational thinking (CT) in medicine means deliberating when to pursue computer-mediated solutions to medical problems and evaluating when such solutions are worth pursuing in order to assist in medical decision making. Teaching computational thinking (CT) at medical school should be aligned...

  2. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... Physician Resources Professions Site Index A-Z Computed Tomography (CT) - Head Computed tomography (CT) of the head uses special x-ray ... What is CT Scanning of the Head? Computed tomography, more commonly known as a CT or CAT ...

  3. Emission computed tomography

    International Nuclear Information System (INIS)

    Ott, R.J.

    1986-01-01

    Emission Computed Tomography is a technique used for producing single or multiple cross-sectional images of the distribution of radionuclide labelled agents in vivo. The techniques of Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) are described with particular regard to the function of the detectors used to produce images and the computer techniques used to build up images. (UK)

  4. Computed Tomography (CT) -- Sinuses

    Medline Plus

    Full Text Available ... Physician Resources Professions Site Index A-Z Computed Tomography (CT) - Sinuses Computed tomography (CT) of the sinuses uses special x-ray equipment ... story here Images × Image Gallery Patient undergoing computed tomography (CT) scan. View full size with caption Pediatric Content ...

  5. Computed Tomography (CT) -- Head

    Medline Plus

    Full Text Available ... Physician Resources Professions Site Index A-Z Computed Tomography (CT) - Head Computed tomography (CT) of the head uses special x-ray equipment ... story here Images × Image Gallery Patient undergoing computed tomography (CT) scan. View full size with caption Pediatric Content ...

  6. Beyond the Computer Literacy.

    Science.gov (United States)

    Streibel, Michael J.; Garhart, Casey

    1985-01-01

    Describes the approach taken in an education computing course for pre- and in-service teachers. Outlines the basic operational, analytical, and evaluation skills that are emphasized in the course, suggesting that these skills go beyond the attainment of computer literacy and can assist in the effective use of computers. (ML)

  7. Computer algebra applications

    International Nuclear Information System (INIS)

    Calmet, J.

    1982-01-01

    A survey of applications based either on fundamental algorithms in computer algebra or on the use of a computer algebra system is presented. Recent work in biology, chemistry, physics, mathematics and computer science is discussed. In particular, applications in high energy physics (quantum electrodynamics), celestial mechanics and general relativity are reviewed. (Auth.)

  8. Computer-assisted instruction

    NARCIS (Netherlands)

    Voogt, J.; Fisser, P.; Wright, J.D.

    2015-01-01

    Since the early days of computer technology in education in the 1960s, it was claimed that computers can assist instructional practice and hence improve student learning. Since then computer technology has developed, and its potential for education has increased. In this article, we first discuss

  9. Designing with computational intelligence

    CERN Document Server

    Lopes, Heitor; Mourelle, Luiza

    2017-01-01

    This book discusses a number of real-world applications of computational intelligence approaches. Using various examples, it demonstrates that computational intelligence has become a consolidated methodology for automatically creating new competitive solutions to complex real-world problems. It also presents a concise and efficient synthesis of different systems using computationally intelligent techniques.

  10. A new computing principle

    International Nuclear Information System (INIS)

    Fatmi, H.A.; Resconi, G.

    1988-01-01

    In 1954 while reviewing the theory of communication and cybernetics the late Professor Dennis Gabor presented a new mathematical principle for the design of advanced computers. During our work on these computers it was found that the Gabor formulation can be further advanced to include more recent developments in Lie algebras and geometric probability, giving rise to a new computing principle

  11. Computers and Information Flow.

    Science.gov (United States)

    Patrick, R. L.

    This paper is designed to fill the need for an easily understood introduction to the computing and data processing field for the layman who has, or can expect to have, some contact with it. Information provided includes the unique terminology and jargon of the field, the various types of computers and the scope of computational capabilities, and…

  12. Computer naratology: narrative templates in computer games

    OpenAIRE

    Praks, Vítězslav

    2009-01-01

    Relations and interactions between literature and computer games are examined. The study contains a theoretical analysis of the game as an aesthetic artefact. To play a game means to leave the practical world for the sake of a fictional one. Artistic communication has more in common with game communication than with ordinary, practical communication. Game studies can therefore help us understand basic concepts of artistic communication (game rules - poetic rules, game world - fiction, function in game - meaning in art). Compute...

  13. Neural Computation and the Computational Theory of Cognition

    Science.gov (United States)

    Piccinini, Gualtiero; Bahar, Sonya

    2013-01-01

    We begin by distinguishing computationalism from a number of other theses that are sometimes conflated with it. We also distinguish between several important kinds of computation: computation in a generic sense, digital computation, and analog computation. Then, we defend a weak version of computationalism--neural processes are computations in the…

  14. Quantum computing and spintronics

    International Nuclear Information System (INIS)

    Kantser, V.

    2007-01-01

    Attempts to build a computer that operates according to the laws of quantum mechanics have led to the concepts of quantum computing algorithms and hardware. In this review, after some general considerations concerning quantum information science and a set of basic requirements for any quantum computer proposal, we highlight recent developments that point the way to quantum computing based on solid-state nanostructures. One major direction of research on the way to quantum computing is to exploit the spin (in addition to the orbital) degree of freedom of the electron, giving birth to the field of spintronics. We address some semiconductor approaches based on spin-orbit coupling in semiconductor nanostructures. (authors)

  15. Theory of computation

    CERN Document Server

    Tourlakis, George

    2012-01-01

    Learn the skills and acquire the intuition to assess the theoretical limitations of computer programming Offering an accessible approach to the topic, Theory of Computation focuses on the metatheory of computing and the theoretical boundaries between what various computational models can do and not do—from the most general model, the URM (Unbounded Register Machines), to the finite automaton. A wealth of programming-like examples and easy-to-follow explanations build the general theory gradually, which guides readers through the modeling and mathematical analysis of computational pheno

  16. Computer Security Handbook

    CERN Document Server

    Bosworth, Seymour; Whyne, Eric

    2012-01-01

    The classic and authoritative reference in the field of computer security, now completely updated and revised With the continued presence of large-scale computers; the proliferation of desktop, laptop, and handheld computers; and the vast international networks that interconnect them, the nature and extent of threats to computer security have grown enormously. Now in its fifth edition, Computer Security Handbook continues to provide authoritative guidance to identify and to eliminate these threats where possible, as well as to lessen any losses attributable to them. With seventy-seven chapter

  17. Secure cloud computing

    CERN Document Server

    Jajodia, Sushil; Samarati, Pierangela; Singhal, Anoop; Swarup, Vipin; Wang, Cliff

    2014-01-01

    This book presents a range of cloud computing security challenges and promising solution paths. The first two chapters focus on practical considerations of cloud computing. In Chapter 1, Chandramouli, Iorga, and Chokani describe the evolution of cloud computing and the current state of practice, followed by the challenges of cryptographic key management in the cloud. In Chapter 2, Chen and Sion present a dollar cost model of cloud computing and explore the economic viability of cloud computing with and without security mechanisms involving cryptographic mechanisms. The next two chapters addres

  18. Scalable optical quantum computer

    Energy Technology Data Exchange (ETDEWEB)

    Manykin, E A; Mel'nichenko, E V [Institute for Superconductivity and Solid-State Physics, Russian Research Centre 'Kurchatov Institute', Moscow (Russian Federation)]

    2014-12-31

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr{sup 3+}, regularly located in the lattice of the orthosilicate (Y{sub 2}SiO{sub 5}) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)

  19. Computing meaning v.4

    CERN Document Server

    Bunt, Harry; Pulman, Stephen

    2013-01-01

    This book is a collection of papers by leading researchers in computational semantics. It presents a state-of-the-art overview of recent and current research in computational semantics, including descriptions of new methods for constructing and improving resources for semantic computation, such as WordNet, VerbNet, and semantically annotated corpora. It also presents new statistical methods in semantic computation, such as the application of distributional semantics in the compositional calculation of sentence meanings. Computing the meaning of sentences, texts, and spoken or texted dialogue i

  20. Scalable optical quantum computer

    International Nuclear Information System (INIS)

    Manykin, E A; Mel'nichenko, E V

    2014-01-01

    A way of designing a scalable optical quantum computer based on the photon echo effect is proposed. Individual rare earth ions Pr³⁺, regularly located in the lattice of the orthosilicate (Y₂SiO₅) crystal, are suggested to be used as optical qubits. Operations with qubits are performed using coherent and incoherent laser pulses. The operation protocol includes both the method of measurement-based quantum computations and the technique of optical computations. Modern hybrid photon echo protocols, which provide a sufficient quantum efficiency when reading recorded states, are considered as most promising for quantum computations and communications. (quantum computer)