WorldWideScience

Sample records for integrated library software

  1. Flight Software Math Library

    Science.gov (United States)

    McComas, David

    2013-01-01

    The flight software (FSW) math library is a collection of reusable math components that provides typical math utilities required by spacecraft flight software. These utilities are intended to increase flight software quality, reusability, and maintainability by providing a set of consistent, well-documented, and tested math utilities. The library depends only on ANSI C, so it is easily ported. Prior to this library, each mission typically created its own math utilities using ideas and code from previous missions. Part of the reason for this is that math libraries can be written with different strategies in areas like error handling, parameter order, naming conventions, etc. Changing the utilities for each mission introduces risks and costs: the obvious risks and costs are that the utilities must be coded and revalidated, while the hidden risks and costs arise in miscommunication between engineers, since these utilities must be understood by both the flight software engineers and other subsystem engineers (primarily guidance, navigation, and control). The FSW math library is part of a larger goal to produce a library of reusable Guidance, Navigation and Control (GN&C) FSW components, and such a library cannot be created unless a standardized math basis is established first. This library solves the standardization problem by defining a common feature set and establishing policies for the library's design, which allows the library to be maintained with the same strategy used in its initial development and thereby supports a library of reusable GN&C FSW components. The FSW math library is written in C for an embedded software environment, which places restrictions on the language features the library can use. Another advantage of the FSW math library is that it can be used in the FSW as well as in other environments, such as the GN&C analysts' simulators. This helps communication between the teams because they can use the same utilities with the same feature set and syntax.

  2. Open source software and libraries

    OpenAIRE

    Randhawa, Sukhwinder

    2008-01-01

    Open source software is software that users have the ability to run, copy, distribute, study, change, share and improve for any purpose. Open source library software does not require the initial cost of commercial software and enables libraries to have greater control over their working environment. Library professionals should be aware of the advantages of open source software and should be involved in its development. They should have basic knowledge about the selection, installation and main...

  3. National Software Reference Library (NSRL)

    Science.gov (United States)

    National Software Reference Library (NSRL) (PC database for purchase) A collaboration of the National Institute of Standards and Technology (NIST), the National Institute of Justice (NIJ), the Federal Bureau of Investigation (FBI), the Defense Computer Forensics Laboratory (DCFL), the U.S. Customs Service, software vendors, and state and local law enforcement organizations, the NSRL is a tool to assist in fighting crime involving computers.
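
    For context, the NSRL is distributed as a Reference Data Set (RDS) of cryptographic hashes of known software files, which examiners use to filter known files out of seized media. A minimal sketch of that filtering step follows; the paths and the tiny hash set are illustrative, and the real RDS ships in its own distribution format.

    ```python
    # Filter out files whose SHA-1 appears in a known-file hash set,
    # keeping only the files worth examining further.
    import hashlib
    from pathlib import Path

    known_hashes = {
        "da39a3ee5e6b4b0d3255bfef95601890afd80709",  # SHA-1 of the empty file
    }

    def sha1_of(path: Path) -> str:
        h = hashlib.sha1()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 16), b""):
                h.update(chunk)
        return h.hexdigest()

    def unknown_files(root: str):
        # Yield only files NOT in the known set.
        for p in Path(root).rglob("*"):
            if p.is_file() and sha1_of(p) not in known_hashes:
                yield p

    for p in unknown_files("."):
        print(p)
    ```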

  4. ARC Code TI: CFD Utility Software Library

    Data.gov (United States)

    National Aeronautics and Space Administration — The CFD Utility Software Library consists of nearly 30 libraries of Fortran 90 and 77 subroutines and almost 100 applications built on those libraries. Many of the...

  5. Software for Library Management: Selection and Evaluation.

    Science.gov (United States)

    Notowitz, Carol I.

    1987-01-01

    This discussion of library software packages includes guidelines for library automation with microcomputers; criteria to aid in software selection; comparison of some features of available acquisitions, circulation and overdues software; references for software reviews; additional information on microsoftware; and a directory of producers and…

  6. Software problems in library automation in India

    OpenAIRE

    Francis, A. T.

    1998-01-01

    Important software problems faced by library professionals in India are analysed, and various compatibility and suitability issues in the selection of library software are pointed out. The paper also notes that these problems have affected the progress of computerisation of libraries. Up-to-date and detailed information on the software available in India can prevent several issues that may arise in the course of computerisation. An agency/mechanism to continuously evaluate the software may be ...

  7. Integrated circuit cell library

    Science.gov (United States)

    Whitaker, Sterling R. (Inventor); Miles, Lowell H. (Inventor)

    2005-01-01

    According to the invention, an ASIC cell library for use in creation of custom integrated circuits is disclosed. The ASIC cell library includes some first cells and some second cells. Each of the second cells includes two or more kernel cells. The ASIC cell library is at least 5% comprised of second cells. In various embodiments, the ASIC cell library could be 10% or more, 20% or more, 30% or more, 40% or more, 50% or more, 60% or more, 70% or more, 80% or more, 90% or more, or 95% or more comprised of second cells.

  8. E-GRANTHALAYA: LIBRARY INFORMATION SCIENCE OPEN SOURCE AUTOMATION SOFTWARE: AN OVERVIEW

    OpenAIRE

    Umaiyorubagam, R.; JohnAnish, R; Jeyapragash, B

    2015-01-01

    The paper describes the availability of free library software online. Open source software is available in three categories: library automation software, digital library software, and integrated library packages. The paper discusses these aspects in detail.

  9. Practical open source software for libraries

    CERN Document Server

    Engard, Nicole

    2010-01-01

    Open source refers to an application whose source code is made available for use or modification as users see fit. This means libraries gain more flexibility and freedom than with software purchased with license restrictions. Both the open source community and the library world live by the same rules and principles. Practical Open Source Software for Libraries explains the facts and dispels myths about open source. Chapters introduce librarians to open source and what it means for libraries. The reader is provided with links to a toolbox full of freely available open source products to use in

  10. How to Evaluate Integrated Library Automation Systems.

    Science.gov (United States)

    Powell, James R.; Slach, June E.

    1985-01-01

    This paper describes methodology used in compiling a list of candidate integrated library automation systems at a corporate technical library. Priorities for automation, identification of candidate systems, the filtering process, information for suppliers, software and hardware considerations, on-site evaluations, and final system selection are…

  11. Afghanistan Digital Library Initiative: Revitalizing an Integrated Library System

    Directory of Open Access Journals (Sweden)

    Yan HAN

    2007-12-01

    This paper describes an Afghanistan digital library initiative of building an integrated library system (ILS) for Afghanistan universities and colleges based on open-source software. As one of the goals of the Afghan eQuality Digital Libraries Alliance, the authors applied a systems analysis approach, evaluated different open-source ILSs, and customized the selected software to accommodate users' needs. Improvements include Arabic and Persian language support, user interface changes, call number label printing, and ISBN-13 support. To our knowledge, this ILS is the first at a large academic library running on open-source software.

  12. Opening up Library Automation Software

    Science.gov (United States)

    Breeding, Marshall

    2009-01-01

    Throughout the history of library automation, the author has seen a steady advancement toward more open systems. In the early days of library automation, when proprietary systems dominated, the need for standards was paramount since other means of inter-operability and data exchange weren't possible. Today's focus on Application Programming…

  13. The Elusive Cost of Library Software

    Science.gov (United States)

    Breeding, Marshall

    2009-01-01

    Software pricing is not a straightforward issue, since each procurement involves a special business arrangement between a library and its chosen vendor. The author thinks that it is reasonable to scale the cost of a product to such factors as the size of the library, the complexity of the installation, the number of simultaneous users, or the…

  14. ABCD, an Open Source Software for Modern Libraries

    Directory of Open Access Journals (Sweden)

    Sangeeta Namdev Dhamdhere

    2011-12-01

    Nowadays, librarians are using various kinds of open source software for different purposes such as library automation, digitization, institutional repositories, and content management. ABCD, an acronym for Automatisación de Bibliotécas y Centros de Documentación, is one such package. It caters to almost all present needs of modern libraries of any size. It offers a solution to library automation with ISBD as well as local formats. It has excellent indexing and retrieval features based on UNESCO's ISIS technology, a web OPAC, and a library portal with integrated meta-search and a content management system to manage online as well as offline digital resources and physical documents and media.

  15. An open-source thermodynamic software library

    DEFF Research Database (Denmark)

    Ritschel, Tobias Kasper Skovborg; Gaspar, Jozsef; Capolei, Andrea

    This is a technical report which accompanies the article "An open-source thermodynamic software library", which describes an efficient Matlab and C implementation for evaluation of thermodynamic properties. In this technical report we present the model equations, which are also presented in the paper, together with a full set of first- and second-order derivatives with respect to temperature and pressure and, where applicable, with respect to mole numbers. The library is based on parameters and correlations from the DIPPR database and the Peng-Robinson and the Soave-Redlich-Kwong equations of state.
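
    For context, the Peng-Robinson equation of state mentioned above expresses pressure explicitly in temperature and molar volume. The sketch below uses the standard textbook form with its usual constants; it is a generic illustration, not the library's own API.

    ```python
    # Peng-Robinson pressure P(T, v) for a pure component.
    from math import sqrt

    R = 8.314462618  # universal gas constant, J/(mol K)

    def peng_robinson_pressure(T, v, Tc, Pc, omega):
        a = 0.45724 * R**2 * Tc**2 / Pc
        b = 0.07780 * R * Tc / Pc
        kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
        alpha = (1.0 + kappa * (1.0 - sqrt(T / Tc))) ** 2
        return R * T / (v - b) - a * alpha / (v * v + 2.0 * b * v - b * b)

    # Methane (Tc = 190.56 K, Pc = 4.599 MPa, omega = 0.011) at 300 K:
    print(peng_robinson_pressure(300.0, 0.024, 190.56, 4.599e6, 0.011))  # ~1e5 Pa
    ```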

  16. Adoption of open source digital library software packages: a survey

    OpenAIRE

    Jose, Sanjo

    2007-01-01

    Open source digital library packages are gaining popularity nowadays. To build a digital library under economical constraints, open source software is preferable. This paper tries to identify the extent of adoption of open source digital library software packages in various organizations through an online survey. It presents the findings of the survey.

  17. Selection and Management of Open Source Software in Libraries

    OpenAIRE

    Vimal Kumar, V.

    2007-01-01

    Open source software was a revolutionary concept among computer programmers and users. To a certain extent, open source solutions can provide an alternative to costly commercial software. Open source software is software that users have the ability to run, copy, distribute, study, change, share and improve for any purpose. Open source library software does not require the initial cost of commercial software and enables libraries to have greater control over their working environmen...

  18. Integral comparison of library performance

    International Nuclear Information System (INIS)

    Mori, Takamasa

    2006-01-01

    The 2003-2004 activities of the Reactor Integral Test WG under the Subcommittee on Reactor Constants of the Japanese Nuclear Data Committee are presented. During this period, the WG carried out integral tests of JENDL-3.3, ENDF/B-VI and JEF-2.2 (JEFF-3.0) for reactor applications. Some results of integral tests for other recent libraries, JEFF-3.1 and ENDF/B-VII, are also presented. (author)

  19. Cuba: Multidimensional numerical integration library

    Science.gov (United States)

    Hahn, Thomas

    2016-08-01

    The Cuba library offers four independent routines for multidimensional numerical integration: Vegas, Suave, Divonne, and Cuhre. The four algorithms work by very different methods, yet all can integrate vector integrands and have very similar Fortran, C/C++, and Mathematica interfaces. Their invocation is very similar, making it easy to cross-check results by substituting one method for another. For further safeguarding, the output is supplemented by a chi-square probability which quantifies the reliability of the error estimate.
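
    The cross-checking workflow described above can be illustrated generically. The sketch below is plain Python/NumPy rather than Cuba's actual Fortran, C/C++, or Mathematica interface: the same integrand is handed to two independent methods, and agreement within the reported error supports the reliability of the estimate, in the spirit of Cuba's cross-checking advice.

    ```python
    import numpy as np

    def integrand(x):
        # Hypothetical test integrand on the unit cube; x has shape (n, ndim).
        return np.exp(-np.sum(x**2, axis=-1))

    def mc_integrate(f, ndim, n=200_000, seed=0):
        # Plain Monte Carlo: mean over uniform samples, with a standard error.
        vals = f(np.random.default_rng(seed).random((n, ndim)))
        return vals.mean(), vals.std(ddof=1) / np.sqrt(n)

    def midpoint_integrate(f, ndim, n=60):
        # Deterministic midpoint rule on a tensor grid, used as a cross-check.
        pts = (np.arange(n) + 0.5) / n
        grid = np.stack(np.meshgrid(*([pts] * ndim), indexing="ij"), axis=-1)
        return f(grid.reshape(-1, ndim)).mean()

    est, err = mc_integrate(integrand, 3)
    ref = midpoint_integrate(integrand, 3)
    # Agreement within ~err suggests the error estimate can be trusted.
    print(f"MC: {est:.5f} +/- {err:.5f}   midpoint: {ref:.5f}")
    ```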

  20. Advanced Transport Operating System (ATOPS) utility library software description

    Science.gov (United States)

    Clinedinst, Winston C.; Slominski, Christopher J.; Dickson, Richard W.; Wolverton, David A.

    1993-01-01

    The individual software processes used in the flight computers on-board the Advanced Transport Operating System (ATOPS) aircraft have many common functional elements. A library of commonly used software modules was created for general use among the processes. The library includes modules for mathematical computations, data formatting, system database interfacing, and condition handling. The modules available in the library and their associated calling requirements are described.

  1. Interface-based software integration

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-07-01

    Enterprise architecture frameworks define the goals of enterprise architecture in order to make business processes and IT operations more effective and to reduce the risk of future investments. These frameworks offer different architecture development methods that help in building enterprise architecture. In practice, the larger organizations become, the larger their enterprise architecture and IT become, which leads to increasingly complex enterprise architecture development and maintenance. Application software architecture is one type of architecture that, along with business architecture, data architecture and technology architecture, composes enterprise architecture. From the perspective of integration, enterprise architecture can be considered a system of interaction between multiple instances of application software. Therefore, effective software integration is a very important basis for the future success of the enterprise architecture in question. This article presents interface-based integration practice in order to help simplify the process of building such a software integration system. The main goal of interface-based software integration is to solve problems that may arise with software integration requirements and with developing software integration architecture.
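
    A minimal sketch of the interface-based idea follows (the names are hypothetical, not taken from the article): consumers are written against a declared contract rather than a concrete system, so a new system integrates by implementing the contract while existing consumers remain untouched.

    ```python
    from abc import ABC, abstractmethod

    class CustomerDirectory(ABC):
        """Integration contract: consumers depend only on this interface."""
        @abstractmethod
        def lookup(self, customer_id: str) -> dict: ...

    class CrmAdapter(CustomerDirectory):
        # Adapts one concrete back-end system to the shared interface.
        def lookup(self, customer_id: str) -> dict:
            return {"id": customer_id, "source": "CRM"}

    class LegacyDbAdapter(CustomerDirectory):
        # A second system integrates by implementing the same contract.
        def lookup(self, customer_id: str) -> dict:
            return {"id": customer_id, "source": "legacy DB"}

    def billing_report(directory: CustomerDirectory, customer_id: str) -> str:
        # The consumer never references a concrete system.
        return "billing from " + directory.lookup(customer_id)["source"]

    print(billing_report(CrmAdapter(), "42"))
    print(billing_report(LegacyDbAdapter(), "42"))
    ```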

  2. A Padawan Programmer's Guide to Developing Software Libraries

    DEFF Research Database (Denmark)

    Yurkovich, James T.; Yurkovich, Benjamin J.; Dräger, Andreas

    2017-01-01

    With the rapid adoption of computational tools in the life sciences, scientists are taking on the challenge of developing their own software libraries and releasing them for public use. This trend is being accelerated by popular technologies and platforms, such as GitHub, Jupyter, R/Shiny, that make it easier to develop scientific software and by open-source licenses that make it easier to release software. But how do you build a software library that people will use? And what characteristics do the best libraries have that make them enduringly popular? Here, we provide a reference guide, based on our own experiences, for developing software libraries along with real-world examples to help provide context for scientists who are learning about these concepts for the first time. While we can only scratch the surface of these topics, we hope that this article will act as a guide...

  3. Integrating existing software toolkits into VO system

    Science.gov (United States)

    Cui, Chenzhou; Zhao, Yong-Heng; Wang, Xiaoqian; Sang, Jian; Luo, Ze

    2004-09-01

    Virtual Observatory (VO) is a collection of interoperating data archives and software tools. Taking advantage of the latest information technologies, it aims to provide a data-intensive online research environment for astronomers all around the world. A large number of high-quality astronomical software packages and libraries are powerful and easy to use, and have been widely used by astronomers for many years. Integrating those toolkits into the VO system is a necessary and important task for VO developers. The VO architecture depends greatly on Grid and Web services, so the general VO integration route is "Java Ready - Grid Ready - VO Ready". In the paper, we discuss the importance of VO integration for existing toolkits and possible solutions. We introduce two efforts in this field from the China-VO project, "gImageMagick" and "Galactic abundance gradients statistical research under grid environment". We also discuss what additional work should be done to convert Grid services to VO services.
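
    As a conceptual sketch of the toolkit-wrapping step, an existing command-line tool can be exposed behind a service endpoint. The toy HTTP wrapper below merely stands in for the Grid/Web-service standards that real VO services use, with the `echo` command playing the role of a legacy toolkit.

    ```python
    # Toy service wrapper around a command-line tool (illustrative only).
    import subprocess
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import parse_qs, urlparse

    class ToolHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            args = parse_qs(urlparse(self.path).query)
            # Delegate to the wrapped toolkit; `echo` stands in for it here.
            out = subprocess.run(["echo", args.get("msg", [""])[0]],
                                 capture_output=True, text=True).stdout
            self.send_response(200)
            self.end_headers()
            self.wfile.write(out.encode())

    if __name__ == "__main__":
        # GET http://localhost:8000/?msg=hello returns the tool's output.
        HTTPServer(("localhost", 8000), ToolHandler).serve_forever()
    ```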

  4. Application Reuse Library for Software, Requirements, and Guidelines

    Science.gov (United States)

    Malin, Jane T.; Thronesbery, Carroll

    1994-01-01

    Better designs are needed for expert systems and other operations automation software, for more reliable, usable and effective human support. A prototype computer-aided Application Reuse Library shows feasibility of supporting concurrent development and improvement of advanced software by users, analysts, software developers, and human-computer interaction experts. Such a library expedites development of quality software, by providing working, documented examples, which support understanding, modification and reuse of requirements as well as code. It explicitly documents and implicitly embodies design guidelines, standards and conventions. The Application Reuse Library provides application modules with Demo-and-Tester elements. Developers and users can evaluate applicability of a library module and test modifications, by running it interactively. Sub-modules provide application code and displays and controls. The library supports software modification and reuse, by providing alternative versions of application and display functionality. Information about human support and display requirements is provided, so that modifications will conform to guidelines. The library supports entry of new application modules from developers throughout an organization. Example library modules include a timer, some buttons and special fonts, and a real-time data interface program. The library prototype is implemented in the object-oriented G2 environment for developing real-time expert systems.

  5. Digital Preservation in Open-Source Digital Library Software

    Science.gov (United States)

    Madalli, Devika P.; Barve, Sunita; Amin, Saiful

    2012-01-01

    Digital archives and digital library projects are being initiated all over the world for materials of different formats and domains. To organize, store, and retrieve digital content, many libraries as well as archiving centers are using either proprietary or open-source software. While it is accepted that print media can survive for centuries with…

  6. Integrated Library System (ILS): An Option for Library and ...

    African Journals Online (AJOL)

    The purpose of this paper is to review the capability of an integrated library system as an option for managing library information resources, using the Koha ILS deployed at Babcock University as a case study. The paper is a review of the functions and performance of Koha. It also includes personal experiences of librarians and the ...

  7. Robotic Software Integration Using MARIE

    Directory of Open Access Journals (Sweden)

    Carle Côté

    2006-03-01

    This paper presents MARIE, a middleware framework oriented towards developing and integrating new and existing software for robotic systems. By using a generic communication framework, MARIE aims to create a flexible distributed component system that allows robotics developers to share software programs and algorithms, and to design prototypes rapidly based on their own integration needs. The use of MARIE is illustrated with the design of a socially interactive autonomous mobile robot platform capable of map building, localization, navigation, task scheduling, sound source localization, tracking and separation, speech recognition and generation, visual tracking, message reading and graphical interaction using a touch screen interface.

  8. Software attribute visualization for high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Pollock, G.M.

    1998-03-01

    This report documents a prototype tool developed to investigate the use of visualization and virtual reality technologies for improving software surety confidence. The tool is utilized within the execution phase of the software life cycle. It provides a capability to monitor an executing program against prespecified requirements constraints provided in a program written in the requirements specification language SAGE. The resulting Software Attribute Visual Analysis Tool (SAVAnT) also provides a technique to assess the completeness of a software specification.

  9. Adoption of open source software in library management: an ...

    African Journals Online (AJOL)

    This literature- and practical-knowledge-based opinion paper explored the Koha software, bringing to light its applicability in the library. The various modules contained in the software (Circulation, Patrons, Advanced Search/OPAC, Cataloguing, Serials, Acquisitions, Reports and Tools) and the various specific services that ...

  10. Software Library for Bruker TopSpin NMR Data Files

    Energy Technology Data Exchange (ETDEWEB)

    2016-10-14

    A software library for parsing and manipulating frequency-domain data files that have been processed using the Bruker TopSpin NMR software package. In the context of NMR, the term "processed" indicates that the end-user of the Bruker TopSpin NMR software package has (a) Fourier transformed the raw, time-domain data (the Free Induction Decay) into the frequency-domain and (b) has extracted the list of NMR peaks.

  11. Proposed method for selecting library automation software

    OpenAIRE

    Café, Lígia; Santos, Christophe Dos; Macedo, Flávia

    2001-01-01

    Presents a method for the evaluation and selection of library automation software. It consists of assigning criteria and statistical calculations to a list drawn up for the selection and evaluation of this type of software. The method is intended to serve as a decision-support instrument in the process of choosing the software best suited to the needs of each institution. This work was motivated by a demand from the Instituto Brasileiro de Informação em Ciência e Tecnologia (IBICT) ...

  12. An Integrated Library System: Preliminary Considerations.

    Science.gov (United States)

    Neroda, Edward

    Noting difficulties experienced by small to medium sized colleges in acquiring integrated library computer systems, this position paper outlines issues related to the subject with the intention of increasing familiarity and interest in integrated library systems. The report includes: a brief review of technological advances as they relate to…

  13. A Padawan Programmer's Guide to Developing Software Libraries.

    Science.gov (United States)

    Yurkovich, James T; Yurkovich, Benjamin J; Dräger, Andreas; Palsson, Bernhard O; King, Zachary A

    2017-11-22

    With the rapid adoption of computational tools in the life sciences, scientists are taking on the challenge of developing their own software libraries and releasing them for public use. This trend is being accelerated by popular technologies and platforms, such as GitHub, Jupyter, R/Shiny, that make it easier to develop scientific software and by open-source licenses that make it easier to release software. But how do you build a software library that people will use? And what characteristics do the best libraries have that make them enduringly popular? Here, we provide a reference guide, based on our own experiences, for developing software libraries along with real-world examples to help provide context for scientists who are learning about these concepts for the first time. While we can only scratch the surface of these topics, we hope that this article will act as a guide for scientists who want to write great software that is built to last. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Software for pipeline integrity administration

    Energy Technology Data Exchange (ETDEWEB)

    Soula, Gerardo; Perona, Lucas Fernandez [Gie SA., Buenos Aires (Argentina); Martinich, Carlos [Refinaria do Norte S. A. (REFINOR), Tartagal, Provincia de Salta (Argentina)

    2009-07-01

    Software for pipeline integrity management was developed. It handles geographical information and a PODS (Pipeline Open Data Standard) database simultaneously, in a simple and reliable way. The premises for the design were the following: didactic, geo-referenced, multiple reference systems. Program capabilities: (1) PODS+GIS: the PODS database on which the software is based is completely integrated with the GIS module. (2) Management of different kinds of information: it manages information on facilities, repairs, interventions, physical inspections, geographical characteristics, compliance with regulations, training, offline events and operation measures, handles O and M information, and supports bulk import of specific data and studies. It also assures the integrity of the loaded information. (3) Right-of-way survey: it verifies class location and ROW occupation, identifies sensitive areas, and manages landowners. (4) Risk analysis: performed qualitatively from the entered data, allowing the user to identify the riskiest stretches of the system. Results from the risk analysis, as well as data and queries against the database, can be exported to standard formats. (author)

  15. The simulation library of the Belle II software system

    Science.gov (United States)

    Kim, D. Y.; Ritter, M.; Bilka, T.; Bobrov, A.; Casarosa, G.; Chilikin, K.; Ferber, T.; Godang, R.; Jaegle, I.; Kandra, J.; Kodys, P.; Kuhr, T.; Kvasnicka, P.; Nakayama, H.; Piilonen, L.; Pulvermacher, C.; Santelj, L.; Schwenker, B.; Sibidanov, A.; Soloviev, Y.; Starič, M.; Uglov, T.

    2017-10-01

    SuperKEKB, the next-generation B factory, has been constructed in Japan as an upgrade of KEKB. This brand-new e+ e- collider is expected to deliver a very large data set for the Belle II experiment, 50 times larger than the previous Belle sample. Both the triggered physics event rate and the background event rate will be at least 10 times higher than before, creating a challenging data-taking environment for the Belle II detector. The software system of the Belle II experiment is designed to handle this ambitious plan. A full detector simulation library, which is part of the Belle II software system, is based on Geant4 and has been tested thoroughly. Recently the library was upgraded to Geant4 version 10.1. The library behaves as expected and is actively used in producing Monte Carlo data sets for various studies. In this paper, we explain the structure of the simulation library and the various interfaces to other packages, including geometry and beam background simulation.

  16. Model-integrating software components engineering flexible software systems

    CERN Document Server

    Derakhshanmanesh, Mahdi

    2015-01-01

    In his study, Mahdi Derakhshanmanesh builds on the state of the art in modeling by proposing to integrate models into running software at the component level without translating them to code. Such so-called model-integrating software exploits all advantages of models: models implicitly support a good separation of concerns, they are self-documenting and thus improve understandability and maintainability, and, in contrast to model-driven approaches, there is no longer a synchronization problem between the models and the code generated from them. Using model-integrating components, software will be

  17. 37 CFR 201.24 - Warning of copyright for software lending by nonprofit libraries.

    Science.gov (United States)

    2010-07-01

    ... software lending by nonprofit libraries. 201.24 Section 201.24 Patents, Trademarks, and Copyrights... copyright for software lending by nonprofit libraries. (a) Definition. A Warning of Copyright for Software... packaging that contains the computer program which is lent by a nonprofit library for nonprofit purposes. (b...

  18. What makes computational open source software libraries successful?

    Science.gov (United States)

    Bangerth, Wolfgang; Heister, Timo

    2013-01-01

    Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. In particular, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects.

  19. What makes computational open source software libraries successful?

    International Nuclear Information System (INIS)

    Bangerth, Wolfgang; Heister, Timo

    2013-01-01

    Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. In particular, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects. (paper)

  20. The fast azimuthal integration Python library: pyFAI.

    Science.gov (United States)

    Ashiotis, Giannis; Deschildre, Aurore; Nawaz, Zubair; Wright, Jonathan P; Karkoulis, Dimitrios; Picca, Frédéric Emmanuel; Kieffer, Jérôme

    2015-04-01

    pyFAI is an open-source software package designed to perform azimuthal integration and, correspondingly, two-dimensional regrouping on area-detector frames for small- and wide-angle X-ray scattering experiments. It is written in Python (with binary submodules for improved performance), a language widely accepted and used by the scientific community today, which enables users to easily incorporate the pyFAI library into their processing pipeline. This article focuses on recent work, especially the ease of calibration, its accuracy and the execution speed for integration.
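
    As a usage illustration, the typical pyFAI workflow loads the detector geometry produced by calibration and regroups a 2-D detector frame into a 1-D scattering curve. The sketch below assumes the companion fabio package for image I/O; the file names are placeholders, and call details may vary between pyFAI versions.

    ```python
    # Minimal pyFAI 1-D azimuthal integration; "calibration.poni" and
    # "frame.edf" are placeholder file names.
    import fabio   # image I/O package commonly used alongside pyFAI
    import pyFAI

    ai = pyFAI.load("calibration.poni")    # detector geometry from calibration
    frame = fabio.open("frame.edf").data   # 2-D area-detector image

    # Regroup the 2-D frame into a 1-D curve I(q), with q in 1/nm.
    q, intensity = ai.integrate1d(frame, npt=1000, unit="q_nm^-1")
    print(q[:5], intensity[:5])
    ```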

  1. CGNS Mid-Level Software Library and Users Guide

    Science.gov (United States)

    Poirier, Diane; Smith, Charles A. (Technical Monitor)

    1998-01-01

    The "CFD General Notation System" (CGNS) consists of a collection of conventions, and conforming software, for the storage and retrieval of Computational Fluid Dynamics (CFD) data. It facilitates the exchange of data between sites and applications, and helps stabilize the archiving of aerodynamic data. This effort was initiated in order to streamline the procedures in exchanging data and software between NASA and its customers, but the goal is to develop CGNS into a National Standard for the exchange of aerodynamic data. The CGNS development team is comprised of members from Boeing Commercial Airplane Group, NASA-Ames, NASA-Langley, NASA-Lewis, McDonnell-Douglas Corporation (now Boeing-St. Louis), Air Force-Wright Lab., and ICEM-CFD Engineering. The elements of CGNS address all activities associated with the storage of data on external media and its movement to and from application programs. These elements include: - The Advanced Data Format (ADF) Database manager, consisting of both a file format specification and its I/O software, which handles the actual reading and writing of data from and to external storage media; - The Standard Interface Data Structures (SIDS), which specify the intellectual content of CFD data and the conventions governing naming and terminology; - The SIDS-to-ADF File Mapping conventions, which specify the exact location where the CFD data defined by the SIDS is to be stored within the ADF file(s); and - The CGNS Mid-level Library, which provides CFD-knowledgeable routines suitable for direct installation into application codes. The CGNS Mid-level Library was designed to ease the implementation of CGNS by providing developers with a collection of handy I/O functions. Since knowledge of the ADF core is not required to use this library, it will greatly facilitate the task of interfacing with CGNS. There are currently 48 user callable functions that comprise the Mid-level library and are described in the Users Guide. The library is written in

  2. CRISPR library designer (CLD): software for multispecies design of single guide RNA libraries.

    Science.gov (United States)

    Heigwer, Florian; Zhan, Tianzuo; Breinig, Marco; Winter, Jan; Brügemann, Dirk; Leible, Svenja; Boutros, Michael

    2016-03-24

    Genetic screens using CRISPR/Cas9 are a powerful method for the functional analysis of genomes. Here we describe CRISPR library designer (CLD), an integrated bioinformatics application for the design of custom single guide RNA (sgRNA) libraries for all organisms with annotated genomes. CLD is suitable for the design of libraries using modified CRISPR enzymes and targeting non-coding regions. To demonstrate its utility, we perform a pooled screen for modulators of the TNF-related apoptosis inducing ligand (TRAIL) pathway using a custom library of 12,471 sgRNAs. CLD predicts a high fraction of functional sgRNAs and is publicly available at https://github.com/boutroslab/cld.

  3. Automation Challenges of the 80's: What to Do until Your Integrated Library System Arrives.

    Science.gov (United States)

    Allan, Ferne C.; Shields, Joyce M.

    1986-01-01

    A medium-sized aerospace library has developed interim solutions to automation needs by using software and equipment that were available in-house in preparation for an expected integrated library system. Automated processes include authors' file of items authored by employees, journal routing (including routing slips), statistics, journal…

  4. Integrated Power, Avionics, and Software (IPAS) Flexible Systems Integration

    Data.gov (United States)

    National Aeronautics and Space Administration — The Integrated Power, Avionics, and Software (IPAS) facility is a flexible, multi-mission hardware and software design environment. This project will develop a...

  5. The Astrophysics Source Code Library: Supporting software publication and citation

    Science.gov (United States)

    Allen, Alice; Teuben, Peter

    2018-01-01

    The Astrophysics Source Code Library (ASCL, ascl.net), established in 1999, is a free online registry for source codes used in research that has appeared in, or been submitted to, peer-reviewed publications. The ASCL is indexed by the SAO/NASA Astrophysics Data System (ADS) and Web of Science and is citable by using the unique ascl ID assigned to each code. In addition to registering codes, the ASCL can house archive files for download and assign them DOIs. The ASCL advocates for software citation on par with article citation, participates in multidisciplinary events such as Force11, OpenCon, and the annual Workshop on Sustainable Software for Science, works with journal publishers, and organizes Special Sessions and Birds of a Feather meetings at national and international conferences such as Astronomical Data Analysis Software and Systems (ADASS), European Week of Astronomy and Space Science, and AAS meetings. In this presentation, I will discuss some of the challenges of gathering credit for publishing software and ideas and efforts from other disciplines that may be useful to astronomy.

  6. Open Source Opens Doors: Repurposing Library Software to Facilitate Faculty Research and Collaboration

    Directory of Open Access Journals (Sweden)

    Sandra L. Stump

    2013-10-01

    Asked to convert a faculty-created Microsoft Word document of biblical references found within popular films into a searchable database for scholars, the Albright College library staff helped create a multi-access database called Bible in the Reel World. The database relied on student workers for inputting data, used MARC standard formatting for future portability, and encouraged interactive feedback, enabling scholars to submit comments and suggest additional films and references. Using the open source integrated library system Koha, MarcEdit software, and free record exporting from IMDb, library staff created a fully-searchable database for researchers and scholars to examine the use of scripture in popular film.

  7. Computer Software: Copyright and Licensing Considerations for Schools and Libraries. ERIC Digest.

    Science.gov (United States)

    Reed, Mary Hutchings

    This digest notes that the terms and conditions of computer software package license agreements control the use of software in schools and libraries, and examines the implications of computer software license agreements for classroom use and for library lending policies. Guidelines are provided for interpreting the Copyright Act, and insuring the…

  8. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability, concurrent with and integrated with statistics, through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). * Incorporates more than 1,000 engaging problems with answers * Includes more than 300 solved examples * Uses varied problem solving methods

  9. Advanced Data Format (ADF) Software Library and Users Guide

    Science.gov (United States)

    Smith, Matthew; Smith, Charles A. (Technical Monitor)

    1998-01-01

    The "CFD General Notation System" (CGNS) consists of a collection of conventions, and conforming software, for the storage and retrieval of Computational Fluid Dynamics (CFD) data. It facilitates the exchange of data between sites and applications, and helps stabilize the archiving of aerodynamic data. This effort was initiated in order to streamline the procedures in exchanging data and software between NASA and its customers, but the goal is to develop CGNS into a National Standard for the exchange of aerodynamic data. The CGNS development team is comprised of members from Boeing Commercial. Airplane Group, NASA-Ames, NASA-Langley, NASA-Lewis, McDonnell-Douglas Corporation (now Boeing-St. Louis), Air Force-Wright Lab., and ICEM-CFD Engineering. The elements of CGNS address all activities associated with the storage of data on external media and its movement to and from application programs. These elements include: 1) The Advanced Data Format (ADF) Database manager, consisting of both a file format specification and its 1/0 software, which handles the actual reading and writing of data from and to external storage media; 2) The Standard Interface Data Structures (SIDS), which specify the intellectual content of CFD data and the conventions governing naming and terminology; 3) The SIDS-to-ADF File Mapping conventions, which specify the exact location where the CFD data defined by the SIDS is to be stored within the ADF file(s); and 4) The CGNS Mid-level Library, which provides CFD-knowledgeable routines suitable for direct installation into application codes. The ADF is a generic database manager with minimal intrinsic capability. It was written for the purpose of storing large numerical datasets in an efficient, platform independent manner. To be effective, it must be used in conjunction with external agreements on how the data will be organized within the ADF database such defined by the SIDS. There are currently 34 user callable functions that comprise the ADF

  10. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction.

    Science.gov (United States)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O; Ellis, Heidi J C; Gryk, Michael R

    2015-07-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  11. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert [UConn Health, Department of Molecular Biology and Biophysics (United States); Martyn, Timothy O. [Rensselaer at Hartford, Department of Engineering and Science (United States); Ellis, Heidi J. C. [Western New England College, Department of Computer Science and Information Technology (United States); Gryk, Michael R., E-mail: gryk@uchc.edu [UConn Health, Department of Molecular Biology and Biophysics (United States)

    2015-07-15

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  12. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction

    International Nuclear Information System (INIS)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O.; Ellis, Heidi J. C.; Gryk, Michael R.

    2015-01-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses

  13. An Integrated Library System on the CERN Document Server

    CERN Document Server

    Rodrigues Silvestre, Joaquim Jorge; Le Meur, Jean-Yves; Šimko, Tibor

    2010-01-01

    CERN, the European Organization for Nuclear Research, is one of the largest research centres worldwide, responsible for several discoveries in physics as well as in computer science. The CERN Document Server, also known as CDS Invenio, is software developed at CERN which aims to provide a set of tools for managing digital libraries. In order to improve the functionalities of CDS Invenio, a new module called BibCirculation was developed to manage books (and other items) from the CERN library, working as an Integrated Library System. This thesis shows the steps taken to achieve the several goals of this project, explaining, among other aspects, the process of integration with other existing modules as well as the way information about books is associated with the metadata from CDS Invenio. A detailed explanation of the entire implementation and testing process is also given. Finally, the conclusions of this project and ideas for future development are presented.

  14. Software extension and integration with type classes

    DEFF Research Database (Denmark)

    Lämmel, Ralf; Ostermann, Klaus

    2006-01-01

    ...expressiveness, by using the language concept of type classes, as it is available in the functional programming language Haskell. A detailed comparison with related work shows that type classes provide a powerful framework in which solutions to known software extension and integration problems can be provided. We also pinpoint several limitations of type classes in this context...

  15. Readiness of librarians in public libraries towards integration of ...

    African Journals Online (AJOL)

    The study is designed to examine the level of preparedness of librarians in Nigerian public libraries towards integrating social media into the provision of library and information services (LIS). The survey research method was adopted, using public libraries in the south-east geo-political zone of Nigeria. The population of the study ...

  16. OhioLINK: Implementing Integrated Library Services across Institutional Boundaries.

    Science.gov (United States)

    Hawks, Carol Pitts

    1995-01-01

    Discusses the implementation of the OhioLINK (Ohio Library and Information Network) system, an integrated library system linking 23 public and private academic institutions and the Ohio State Library. Topics include a history of OhioLINK; organizational structure; decision-making procedures; public relations strategies; cooperative circulation;…

  17. Integrating total quality management in a library setting

    CERN Document Server

    Jurow, Susan

    2013-01-01

    Improve the delivery of library services by implementing total quality management (TQM), a system of continuous improvement employing participative management and centered on the needs of customers. Although TQM was originally designed for and successfully applied in business and manufacturing settings, this groundbreaking volume introduces strategies for translating TQM principles from the profit-based manufacturing sector to the library setting. Integrating Total Quality Management in a Library Setting shows librarians how to improve library services by implementing strategies such as employ

  18. Integrating interface slicing into software engineering processes

    Science.gov (United States)

    Beck, Jon

    1993-01-01

    Interface slicing is a tool which was developed to facilitate software engineering. As previously presented, it was described in terms of its techniques and mechanisms. Here, the integration of interface slicing into specific software engineering activities is considered by discussing a number of potential applications of interface slicing, applications which specifically address the problems, issues, or concerns raised in a previous project. Because a complete interface slicer is still under development, these applications must be phrased in the future tense. Nonetheless, the interface slicing techniques which were presented can be implemented using current compiler and static analysis technology. Whether implemented as a standalone tool or as a module in an integrated development or reverse engineering environment, they require analysis no more complex than that required for current system development environments. By contrast, conventional slicing is a methodology which, while showing much promise and intuitive appeal, has yet to be fully implemented in a production language environment despite 12 years of development.
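
    As a complement to the discussion above, an interface slice can be viewed as the transitive closure, over a module's internal call graph, of the procedures a client actually imports. A minimal sketch of that closure computation follows; the call graph and names are hypothetical, not the authors' tool.

    ```python
    # Compute an interface slice: the procedures a client needs,
    # plus everything those procedures call, transitively.
    calls = {
        "open_file":  ["check_path", "alloc_buffer"],
        "read_line":  ["alloc_buffer", "decode"],
        "write_line": ["encode", "flush"],
        "check_path": [], "alloc_buffer": [], "decode": [],
        "encode": [], "flush": [],
    }

    def interface_slice(roots, graph):
        needed, stack = set(), list(roots)
        while stack:
            proc = stack.pop()
            if proc not in needed:
                needed.add(proc)
                stack.extend(graph.get(proc, ()))
        return needed

    # A client that only reads files pulls in just the read-side procedures.
    print(sorted(interface_slice({"open_file", "read_line"}, calls)))
    ```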

  19. Integration of software for scenario exploration

    International Nuclear Information System (INIS)

    Oyamada, Kiyoshi; Ikeda, Takao

    1999-03-01

    The scenario exploration methodology using shadow models is a variation of the environmental simulation method. A key aspect of scenario exploration is the use of shadow models, which do not correspond to any specific assumptions on physical processes; instead, they abstract the general features relevant to the effects on nuclide transport so that the benefit of the simulation approach can be maximized. In developing the shadow models, all the modelling options that have not yet been ruled out by the experts are kept and parametrized in a very general framework. This, in turn, enables one to treat the various types of uncertainty in performance assessment, i.e., scenario uncertainty, conceptual model uncertainty, mathematical model uncertainty and parameter uncertainty, in a common framework of uncertainty/sensitivity analysis. The objective of the current study is to review and modify the tools, which have been developed separately and hence are not fully consistent with one another, and to integrate them into a unified methodology and software. The tasks for this are: (1) modification and integration of tools for scenario exploration of nuclide transport in the EBS and the near-field host rock; (2) verification of the modified and integrated software; (3) installation of the software at JNC. (author)

  20. Lightning talk slide for "SLACKHA: Software Library for Accelerating Chemical Kinetics on Hybrid Architectures"

    OpenAIRE

    Niemeyer, Kyle; Sung, Chih-Jen

    2018-01-01

    Lightning talk slide describing the "SLACKHA: Software Library for Accelerating Chemical Kinetics on Hybrid Architectures" project at the 2018 NSF SI2 PI meeting: https://si2-pi-community.github.io/2018-meeting/

  1. Asset management: integrated software optimizes production performance

    Energy Technology Data Exchange (ETDEWEB)

    Polczer, S.

    1998-06-01

    Two new multi-dimensional databases, which expand the "row and column" concept of spreadsheets into multiple categories of data called dimensions, are described. These integrated software packages provide the foundation for industry players such as Poco Petroleum Ltd and Numac Energy Inc to gain a competitive advantage by overhauling their respective data collection and retrieval systems to allow for timely cost analysis and financial reporting. Energy Warehouse, an on-line analytical processing product marketed by SysGold Ltd, is one of the software products described. It gathers various sources of information, allows advanced searches and generates reports previously unavailable in other conventional financial accounting systems. The second product discussed, the Canadian Upstream Energy System (CUES), is an on-line analytical processing system developed by Oracle Corporation and Calgary-based Applied Terravision Systems (ATS) Inc. CUES combines Oracle's universal data server and software development tools with ATS's upstream financial, land, geotechnical and production applications. The software also allows for optimization of facilities, analysis of production efficiencies and comparison of performance against industry standards.

  2. A Methodology for Integrating Maintainability Using Software Metrics

    OpenAIRE

    Lewis, John A.; Henry, Sallie M.

    1989-01-01

    Maintainability must be integrated into software early in the development process. But for practical use, the techniques used must be as unobtrusive to the existing software development process as possible. This paper defines a methodology for integrating maintainability into large-scale software and describes an experiment which implemented the methodology into a major commercial software development environment.

  3. User and system considerations for the TCSTEK software library

    International Nuclear Information System (INIS)

    Gray, W.H.

    1979-08-01

    This report documents the idiosyncrasies of the Tektronix PLOT 10 Terminal Control System level 3.3 software as it currently exists on the ORNL Fusion Energy Division DECsystem-10 computer. It is intended to serve as a reference for future Terminal Control System updates in order that continuity between releases of Terminal Control System PLOT 10 software may be maintained

  4. A Public Domain Software Library for Reading and Language Arts.

    Science.gov (United States)

    Balajthy, Ernest

    A three-year project carried out by the Microcomputers and Reading Committee of the New Jersey Reading Association involved the collection, improvement, and distribution of free microcomputer software (public domain programs) designed to deal with reading and writing skills. Acknowledging that this free software is not without limitations (poor…

  5. User and system considerations for the TCSTEK software library

    Energy Technology Data Exchange (ETDEWEB)

    Gray, W.H.

    1979-08-01

    This report documents the idiosyncrasies of the Tektronix PLOT 10 Terminal Control System level 3.3 software as it currently exists on the ORNL Fusion Energy Division DECsystem-10 computer. It is intended to serve as a reference for future Terminal Control System updates in order that continuity between releases of Terminal Control System PLOT 10 software may be maintained.

  6. Software Quality Assurance and Verification for the MPACT Library Generation Process

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Yuxuan [Univ. of Michigan, Ann Arbor, MI (United States); Williams, Mark L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wiarda, Dorothea [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Clarno, Kevin T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Kim, Kang Seog [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Celik, Cihangir [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-05-01

    This report fulfills the requirements for the Consortium for the Advanced Simulation of Light-Water Reactors (CASL) milestone L2:RTM.P14.02, “SQA and Verification for MPACT Library Generation,” by documenting the current status of the software quality, verification, and acceptance testing of nuclear data libraries for MPACT. It provides a brief overview of the library generation process, from general-purpose evaluated nuclear data files (ENDF/B) to a problem-dependent cross section library for modeling of light-water reactors (LWRs). The software quality assurance (SQA) programs associated with each of the software packages used to generate the nuclear data libraries are discussed; specific tests within the SCALE/AMPX and VERA/XSTools repositories are described. The methods and associated tests to verify the quality of the library during the generation process are described in detail. The library generation process has been automated to a degree that (1) ensures it can be run without user intervention and (2) ensures the library can be reproduced. Finally, the acceptance testing process that will be performed by representatives from the Radiation Transport Methods (RTM) Focus Area prior to the production library’s release is described in detail.

  7. Directory of Library Automation Software, Systems, and Services. 1998 Edition.

    Science.gov (United States)

    Cibbarelli, Pamela R., Ed.; Cibbarelli, Shawn E., Ed.

    This book includes basic information to locate and compare available options for library automation based on various criteria such as hardware requirements, operating systems, components and applications, and price, and provides the necessary contact information to allow further investigation. The major part of the directory lists 211 software…

  8. Application software packages for library operations and services in ...

    African Journals Online (AJOL)

    Purposive sampling was used to select 218 subjects from five federal universities where library automation has begun. Analyses and discussions were made. Findings revealed that all the universities studied use KOHA, Virtua, E-LIB, DSpace and Greenstone to manage their digital information resources.

  9. Asset management -- Integrated software optimizes production performance

    International Nuclear Information System (INIS)

    Polczer, S.

    1998-01-01

    Developments in data collection and retrieval systems to allow timely cost analysis, financial reporting and production management are discussed. One of the most important new OLAP (on-line analytical processing) products is Energy Warehouse which gathers field information from various sources, allows advanced searches, and generates reports previously unavailable in other conventional financial accounting systems. Another OLAP-based system, the Canadian Upstream Energy System (CUES), was developed by the Oracle Corporation and the Calgary-based Applied Terravision Systems (ATS) Inc. CUES combines Oracle's universal data server software development tools with ATS's upstream financial, land, geotechnical and production applications. ATS also developed a product called IDPMARS (Integrated Daily Production Management Accounting Reporting System). It interfaces with CUES to link working interests, government royalties, administration, facility charges, lifting costs, transportation tooling, and customers by integrating field data collection systems with financial accounting.

  10. Asset management -- Integrated software optimizes production performance

    Energy Technology Data Exchange (ETDEWEB)

    Polczer, S.

    1998-10-01

    Developments in data collection and retrieval systems to allow timely cost analysis, financial reporting and production management are discussed. One of the most important new OLAP (on-line analytical processing) products is Energy Warehouse which gathers field information from various sources, allows advanced searches, and generates reports previously unavailable in other conventional financial accounting systems. Another OLAP-based system, the Canadian Upstream Energy System (CUES), was developed by the Oracle Corporation and the Calgary-based Applied Terravision Systems (ATS) Inc. CUES combines Oracle's universal data server software development tools with ATS's upstream financial, land, geotechnical and production applications. ATS also developed a product called IDPMARS (Integrated Daily Production Management Accounting Reporting System). It interfaces with CUES to link working interests, government royalties, administration, facility charges, lifting costs, transportation tooling, and customers by integrating field data collection systems with financial accounting.

  11. An Assessment of the Library Application Software Packages in ...

    African Journals Online (AJOL)

    The study examined the adopted softwares' security and compatibility/capabilities ... The study found that most application packages available in the Nigerian automation marketplace are effective since they ...

  12. Demographic Variables as Factors Influencing Accessibility and Utilisation of Library Software by Undergraduates in Two Private Universities in Nigeria

    Science.gov (United States)

    Tolulope, Akano

    2017-01-01

    Libraries before the 21st century carried out daily routine library tasks such as cataloguing and classification, acquisition, reference services, etc. using manual procedures only, but the advent of Information Technology has transformed these routine tasks, such that libraries can now automate their activities by deploying library software in…

  13. CSIR's new integrated electronic library information-system

    CSIR Research Space (South Africa)

    Michie, A

    1995-08-01

    Full Text Available The CSIR has developed a CDROM-based electronic library information system which provides the ability to reproduce and search for published information and colour brochures on the computer screen. The system integrates this information with online...

  14. The Event-Driven Software Library for YARP—With Algorithms and iCub Applications

    Directory of Open Access Journals (Sweden)

    Arren Glover

    2018-01-01

    Full Text Available Event-driven (ED) cameras are an emerging technology that samples the visual signal based on changes in the signal magnitude, rather than at a fixed rate over time. The change in paradigm results in a camera with lower latency that uses less power, requires less bandwidth, and has a higher dynamic range. Such cameras offer many potential advantages for on-line, autonomous robots; however, the sensor data do not directly integrate with current “image-based” frameworks and software libraries. The iCub robot uses Yet Another Robot Platform (YARP) as middleware to provide modular processing and connectivity to sensors and actuators. This paper introduces a library that incorporates an event-based framework into the YARP architecture, allowing event cameras to be used with the iCub (and other YARP-based robots). We describe the philosophy and methods for structuring events to facilitate processing, while maintaining low latency and real-time operation. We also describe several processing modules made available open-source, and three example demonstrations that can be run on the neuromorphic iCub.
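
    The change-based sampling paradigm described above can be sketched in a few lines. This is a conceptual illustration only, not the event-driven YARP API: an "event" is emitted for a pixel only when its brightness changes by more than a threshold, rather than every pixel being sampled at a fixed frame rate.

      from collections import namedtuple
      import numpy as np

      Event = namedtuple("Event", "timestamp x y polarity")

      def events_from_frames(prev, curr, timestamp, threshold=15):
          """Compare two grayscale frames and return change events."""
          diff = curr.astype(int) - prev.astype(int)
          ys, xs = np.nonzero(np.abs(diff) > threshold)
          return [Event(timestamp, int(x), int(y), 1 if diff[y, x] > 0 else -1)
                  for y, x in zip(ys, xs)]

      prev = np.zeros((4, 4), dtype=np.uint8)
      curr = prev.copy()
      curr[1, 2] = 200                       # a single pixel brightens
      print(events_from_frames(prev, curr, timestamp=0.001))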

  15. The Vendors' Corner: Biblio-Techniques' Library and Information System (BLIS).

    Science.gov (United States)

    Library Software Review, 1984

    1984-01-01

    Describes online catalog and integrated library computer system designed to enhance Washington Library Network's software. Highlights include system components; implementation options; system features (integrated library functions, database design, system management facilities); support services (installation and training, software maintenance and…

  16. Integrated modeling of software cost and quality

    International Nuclear Information System (INIS)

    Rone, K.Y.; Olson, K.M.

    1994-01-01

    In modeling the cost and quality of software systems, the relationship between cost and quality must be considered. This explicit relationship is dictated by the criticality of the software being developed. The balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and the developers with respect to the processes being employed.

  17. Integrating digital topology in image-processing libraries.

    Science.gov (United States)

    Lamy, Julien

    2007-01-01

    This paper describes a method to integrate digital topology information in image-processing libraries. This additional information allows a library user to write algorithms that respect topological constraints, for example, a seed fill or a skeletonization algorithm. As digital topology is absent from most image-processing libraries, such constraints cannot otherwise be fulfilled. We describe and give code samples for all the structures necessary for this integration, and show a use case in the form of a homotopic thinning filter inside ITK. The obtained filter can be up to a hundred times as fast as ITK's thinning filter and works for any image dimension. This paper mainly deals with integration within ITK, but can be adapted with only minor modifications to other image-processing libraries.
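
    The topological constraint mentioned above boils down to a "simple point" test: a pixel may be removed (e.g., during thinning) only if its removal does not change the number of foreground or background components locally. A rough 2D sketch under (8, 4) connectivity follows; it illustrates the idea only and is not the paper's actual ITK code.

      import numpy as np
      from scipy import ndimage

      EIGHT = np.ones((3, 3), dtype=int)               # 8-connectivity structure
      FOUR = ndimage.generate_binary_structure(2, 1)   # 4-connectivity structure

      def is_simple(img, y, x):
          """Test whether foreground pixel (y, x) is 'simple' under (8, 4)
          connectivity; assumes (y, x) is not on the image border."""
          win = img[y-1:y+2, x-1:x+2].astype(bool)
          fg = win.copy()
          fg[1, 1] = False                             # puncture the centre
          _, n_fg = ndimage.label(fg, structure=EIGHT)
          lab_bg, _ = ndimage.label(~win, structure=FOUR)
          # Background components 4-adjacent to the centre pixel.
          adj_bg = {lab_bg[0, 1], lab_bg[1, 0], lab_bg[1, 2], lab_bg[2, 1]} - {0}
          return n_fg == 1 and len(adj_bg) == 1

      line = np.array([[0, 0, 0], [1, 1, 0], [0, 0, 0]])
      vee = np.array([[1, 0, 1], [0, 1, 0], [0, 0, 0]])
      print(is_simple(line, 1, 1))   # True: removing an endpoint keeps topology
      print(is_simple(vee, 1, 1))    # False: removal disconnects the diagonals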

  18. Caliko: An Inverse Kinematics Software Library Implementation of the FABRIK Algorithm

    OpenAIRE

    Lansley, Alastair; Vamplew, Peter; Smith, Philip; Foale, Cameron

    2016-01-01

    The Caliko library is an implementation of the FABRIK (Forward And Backward Reaching Inverse Kinematics) algorithm written in Java. The inverse kinematics (IK) algorithm is implemented in both 2D and 3D, and incorporates a variety of joint constraints as well as the ability to connect multiple IK chains together in a hierarchy. The library allows for the simple creation and solving of multiple IK chains as well as visualisation of these solutions. It is licensed under the MIT software license...

  19. Tailor-made training for digital library software

    CERN Multimedia

    Joannah Caborn Wengler

    2012-01-01

    Six librarians and IT engineers from Senegal, Mali, Burkina Faso, Ivory Coast and Morocco are currently spending several weeks at CERN as a follow-up to the 5-day CERN-UNESCO Digital Libraries School held in Dakar, Senegal, last year. During their stay, they are honing their mastery of CERN’s Invenio digital library management platform in order to put it to a variety of uses once they return home.   From left to right: Essaid Ait Allal (Morocco), Guillaume Nikiema (Burkina Faso), Eric Guedegbe (Senegal), Fama Diagne Sene Ndiaye (Senegal), Abdrahamane Anne (Mali) and Jens VIGEN (CERN).  Cécile Coulibaly (Ivory Coast), who was also taking part in the training programme, is not in the picture. “We plan to use Invenio to build a portal for all African university dissertations to make them accessible to the global academic community. We need a system which can harvest data from various existing platforms, then convert the bibliographic records and make them...

  20. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Argonne National Lab. (ANL), Argonne, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); LeCompte, Tom [Argonne National Lab. (ANL), Argonne, IL (United States); Marshall, Zach [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Borgland, Anders [SLAC National Accelerator Lab., Menlo Park, CA (United States); Viren, Brett [Brookhaven National Lab. (BNL), Upton, NY (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Asai, Makato [SLAC National Accelerator Lab., Menlo Park, CA (United States); Bauerdick, Lothar [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Finkel, Hal [Argonne National Lab. (ANL), Argonne, IL (United States); Gottlieb, Steve [Indiana Univ., Bloomington, IN (United States); Hoeche, Stefan [SLAC National Accelerator Lab., Menlo Park, CA (United States); Sheldon, Paul [Vanderbilt Univ., Nashville, TN (United States); Vay, Jean-Luc [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Elmer, Peter [Princeton Univ., NJ (United States); Kirby, Michael [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Patton, Simon [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Potekhin, Maxim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yanny, Brian [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Calafiura, Paolo [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dart, Eli [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gutsche, Oliver [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Izubuchi, Taku [Brookhaven National Lab. (BNL), Upton, NY (United States); Lyon, Adam [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Petravick, Don [Univ. of Illinois, Urbana-Champaign, IL (United States). National Center for Supercomputing Applications (NCSA)

    2015-10-29

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  1. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Energy Technology Data Exchange (ETDEWEB)

    Habib, Salman [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2015-10-28

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  2. A virtualized software based on the NVIDIA cuFFT library for image denoising: performance analysis

    DEFF Research Database (Denmark)

    Galletti, Ardelio; Marcellino, Livia; Montella, Raffaele

    2017-01-01

    Generic Virtualization Service (GVirtuS) is a new solution for enabling GPGPU on Virtual Machines or low-powered devices. This paper focuses on the performance analysis that can be obtained using GPGPU-virtualized software. Recently, GVirtuS has been extended in order to support CUDA ... ancillary libraries with good results. Here, our aim is to analyze the applicability of this powerful tool to a real problem, which uses the NVIDIA cuFFT library. As a case study we consider a simple denoising algorithm, implementing a virtualized GPU-parallel software based on the convolution theorem...

  3. Evaluating Open Source Software for Use in Library Initiatives: A Case Study Involving Electronic Publishing

    Science.gov (United States)

    Samuels, Ruth Gallegos; Griffy, Henry

    2012-01-01

    This article discusses best practices for evaluating open source software for use in library projects, based on the authors' experience evaluating electronic publishing solutions. First, it presents a brief review of the literature, emphasizing the need to evaluate open source solutions carefully in order to minimize Total Cost of Ownership. Next,…

  4. Software, Copyright, and Site-License Agreements: Publishers' Perspective of Library Practice.

    Science.gov (United States)

    Happer, Stephanie K.

    Thirty-one academic publishers of stand-alone software and book/disk packages were surveyed to determine whether publishers have addressed the copyright issues inherent in circulating these packages within the library environment. Twenty-two questionnaires were returned, providing a 71% return rate. There were 18 usable questionnaires. Publishers…

  5. On the Prospects and Concerns of Integrating Open Source Software Environment in Software Engineering Education

    Science.gov (United States)

    Kamthan, Pankaj

    2007-01-01

    Open Source Software (OSS) has introduced a new dimension in software community. As the development and use of OSS becomes prominent, the question of its integration in education arises. In this paper, the following practices fundamental to projects and processes in software engineering are examined from an OSS perspective: project management;…

  6. Integration of fragment screening and library design.

    Science.gov (United States)

    Siegal, Gregg; Ab, Eiso; Schultz, Jan

    2007-12-01

    With more than 10 years of practical experience and theoretical analysis, fragment-based drug discovery (FBDD) has entered the mainstream of the pharmaceutical and biotech industries. An array of biophysical techniques has been used to detect the weak interaction between a fragment and the target. Each technique presents its own requirements regarding the fragment collection and the target; therefore, in order to optimize the potential of FBDD, the nature of the target should be a driving factor for simultaneous development of both the library and the screening technology. A roadmap is now available to guide fragment-to-lead evolution when structural information is available. The next challenge is to apply FBDD to targets for which high-resolution structural information is not available.

  7. Evolution of the 'Trick' Dynamic Software Executive and Model Libraries for Reusable Flight Software, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — In response to a need for cost-effective small satellite missions, Odyssey Space Research is proposing the development of a common flight software executive and a...

  8. Integrating Behaviour in Software Models: An Event Coordination Notation

    DEFF Research Database (Denmark)

    Kindler, Ekkart

    2011-01-01

    One of the main problems in model-based software engineering is modelling behaviour in such a way that the behaviour models can be easily integrated with each other, with the structural software models and with pre-existing software. In this paper, we propose an event coordination notation (ECNO)...

  9. Integrating Usability Evaluations into the Software Development Process

    DEFF Research Database (Denmark)

    Lizano, Fulvio

    This thesis addresses the integration of usability evaluations into the software development process. The integration here is contextualized in terms of how to include usability evaluation as an activity in the software development lifecycle. Even though usability evaluations are considered relevant and strategic human–computer interaction (HCI) activities in the software development process, there are obstacles that limit the complete, effective and efficient integration of this kind of testing into the software development process. Two main obstacles are the cost of usability evaluations and the software developers' resistance to accepting users' opinions regarding the lack of usability in their software systems. The 'cost obstacle' refers to the constraint of conducting usability evaluations in the software process due to the significant amount of resources required by this type of testing. Some...

  10. Requirements Engineering for Software Integrity and Safety

    Science.gov (United States)

    Leveson, Nancy G.

    2002-01-01

    Requirements flaws are the most common cause of errors and software-related accidents in operational software. Most aerospace firms list requirements as one of their most important outstanding software development problems, and all of the recent NASA spacecraft losses related to software (including the highly publicized Mars Program failures) can be traced to requirements flaws. In light of these facts, it is surprising that relatively little research is devoted to requirements in contrast with other software engineering topics. The research proposed built on our previous work, including both criteria for determining whether a requirements specification is acceptably complete and a new approach to structuring system specifications called Intent Specifications. This grant was to fund basic research on how these ideas could be extended to leverage innovative approaches to the problems of (1) reducing the impact of changing requirements, (2) finding requirements specification flaws early through formal and informal analysis, and (3) avoiding common flaws entirely through appropriate requirements specification language design.

  11. Producing software by integration: challenges and research directions (keynote)

    OpenAIRE

    Inverardi , Paola; Autili , Marco; Di Ruscio , Davide; Pelliccione , Patrizio; Tivoli , Massimo

    2013-01-01

    Software is increasingly produced according to a certain goal and by integrating existing software produced by third parties, typically black-box, and often provided without machine-readable documentation. This implies that development processes of the near future will have to explicitly deal with an inherent incompleteness of information about existing software, notably about its behaviour. Therefore, on one side a software producer will less and less know the precise behav...

  12. Caliko: An Inverse Kinematics Software Library Implementation of the FABRIK Algorithm

    Directory of Open Access Journals (Sweden)

    Alastair Lansley

    2016-09-01

    Full Text Available The Caliko library is an implementation of the FABRIK (Forward And Backward Reaching Inverse Kinematics) algorithm written in Java. The inverse kinematics (IK) algorithm is implemented in both 2D and 3D, and incorporates a variety of joint constraints as well as the ability to connect multiple IK chains together in a hierarchy. The library allows for the simple creation and solving of multiple IK chains as well as visualisation of these solutions. It is licensed under the MIT software license and the source code is freely available for use and modification at: https://github.com/feduni/caliko
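
    The FABRIK iteration itself is short enough to sketch. The following 2D toy solver (plain Python/NumPy, not the Caliko Java API, and without Caliko's joint constraints) alternates a backward pass, which pins the end effector to the target, with a forward pass, which re-anchors the chain at its base, preserving bone lengths throughout.

      import numpy as np

      def fabrik(joints, target, tol=1e-4, max_iter=100):
          joints = [np.asarray(j, dtype=float) for j in joints]
          lengths = [np.linalg.norm(joints[i + 1] - joints[i])
                     for i in range(len(joints) - 1)]
          base = joints[0].copy()
          target = np.asarray(target, dtype=float)
          for _ in range(max_iter):
              # Backward pass: fix the end effector on the target.
              joints[-1] = target.copy()
              for i in range(len(joints) - 2, -1, -1):
                  d = joints[i] - joints[i + 1]
                  joints[i] = joints[i + 1] + d / np.linalg.norm(d) * lengths[i]
              # Forward pass: re-anchor the chain at the base.
              joints[0] = base.copy()
              for i in range(len(joints) - 1):
                  d = joints[i + 1] - joints[i]
                  joints[i + 1] = joints[i] + d / np.linalg.norm(d) * lengths[i]
              if np.linalg.norm(joints[-1] - target) < tol:
                  break
          return joints

      # Two-bone chain reaching for a point within its reach.
      print(fabrik([(0, 0), (1, 0), (2, 0)], target=(1.2, 1.2)))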

  13. The user's manual of 'Manyo Library' data reduction software framework at MLF, J-PARC

    International Nuclear Information System (INIS)

    Inamura, Yasuhiro; Nakatani, Takeshi; Ito, Takayoshi; Suzuki, Jiro

    2016-06-01

    Manyo Library is a software framework for developing analysis software for neutron scattering data produced at MLF, J-PARC. This software framework is required to work on many instruments in MLF and to include base functions applicable to various scientific purposes at beam lines. The framework mainly consists of data containers, which can store 1-, 2- and 3-dimensional axis data for neutron scattering. The data containers provide functions for the four arithmetic operations with error propagation between containers, for storing the meta-data about measurements, and for reading and writing text files. Analysis codes are constructed from the various analysis operators defined in Manyo Library, which execute functions on given data containers and output the results. On the other hand, the main interface for instrument scientists and users must be easy and interactive for handling data containers and functions and for developing new analysis codes; we therefore chose Python as the user interface. Since Manyo Library is built in C++, we introduced into the framework the technology to call C++ functions from the Python environment. As a result, we have already developed a large amount of software for data reduction, analysis and visualization, which is widely used at beam lines in MLF. This document is a manual for beginners using this framework. (author)
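
    The core container behaviour described above, element-wise arithmetic with error propagation, can be sketched briefly. The class below is a conceptual Python stand-in; the real Manyo containers are C++ objects exposed to Python, so the names and signatures here are illustrative only.

      import numpy as np

      class ErrorContainer:
          """Toy 1-D data container: axis, intensities and one-sigma errors."""
          def __init__(self, x, y, err):
              self.x = np.asarray(x, dtype=float)     # axis (e.g. time-of-flight)
              self.y = np.asarray(y, dtype=float)     # intensities
              self.err = np.asarray(err, dtype=float) # one-sigma errors

          def __add__(self, other):
              # Intensities add; independent errors combine in quadrature.
              return ErrorContainer(self.x, self.y + other.y,
                                    np.hypot(self.err, other.err))

      a = ErrorContainer([1, 2, 3], [10, 20, 30], [3.2, 4.5, 5.5])
      b = ErrorContainer([1, 2, 3], [5, 5, 5], [2.2, 2.2, 2.2])
      c = a + b
      print(c.y, c.err)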

  14. An Integrated System for Managing the Andalusian Parliament's Digital Library

    Science.gov (United States)

    de Campos, Luis M.; Fernandez-Luna, Juan M.; Huete, Juan F.; Martin-Dancausa, Carlos J.; Tagua-Jimenez, Antonio; Tur-Vigil, Carmen

    2009-01-01

    Purpose: The purpose of this paper is to present an overview of the reorganisation of the Andalusian Parliament's digital library to improve the electronic representation and access of its official corpus by taking advantage of a document's internal organisation. Video recordings of the parliamentary sessions have also been integrated with their…

  15. Integrating R and Java for Enhancing Interactivity of Algorithmic Data Analysis Software Solutions

    Directory of Open Access Journals (Sweden)

    Titus Felix FURTUNĂ

    2016-06-01

    Full Text Available Conceiving software solutions for statistical processing and algorithmic data analysis involves handling diverse data, fetched from various sources and in different formats, and presenting the results in a suggestive, tailorable manner. Our ongoing research aims to design programming techniques for integrating the R development environment with the Java programming language for interoperability at source-code level. The goal is to combine the intensive data-processing capabilities of the R programming language, along with its multitude of statistical function libraries, with the flexibility offered by the Java programming language and platform in terms of graphical user interfaces and mathematical function libraries. Both development environments are multiplatform-oriented and can complement each other through interoperability. R is a comprehensive and concise programming language, benefiting from a continuously expanding and evolving set of packages for statistical analysis developed by the open-source community. While it is a very efficient environment for statistical data processing, the R platform lacks support for developing user-friendly, interactive graphical user interfaces (GUIs). Java, on the other hand, is a high-level object-oriented programming language which supports designing and developing performant and interactive frameworks for general-purpose software solutions, through the Java Foundation Classes, JavaFX and various graphical libraries. In this paper we treat both aspects of integration and interoperability: integrating Java code into R applications, and bringing R processing sequences into Java-driven software solutions. Our research has been conducted focusing on case studies concerning pattern recognition and cluster analysis.

  16. The Environment for Application Software Integration and Execution (EASIE), version 1.0. Volume 2: Program integration guide

    Science.gov (United States)

    Jones, Kennie H.; Randall, Donald P.; Stallcup, Scott S.; Rowell, Lawrence F.

    1988-01-01

    The Environment for Application Software Integration and Execution, EASIE, provides a methodology and a set of software utility programs to ease the task of coordinating engineering design and analysis codes. EASIE was designed to meet the needs of conceptual design engineers who face the task of integrating many stand-alone engineering analysis programs. Using EASIE, programs are integrated through a relational database management system. In Volume 2, a SYSTEM LIBRARY PROCESSOR is used to construct a DATA DICTIONARY describing all relations defined in the database, and a TEMPLATE LIBRARY. A TEMPLATE is a description of all subsets of relations (including conditional selection criteria and sorting specifications) to be accessed as input or output for a given application. Together, these form the SYSTEM LIBRARY, which is used to automatically produce the database schema, FORTRAN subroutines to retrieve/store data from/to the database, and instructions to a generic REVIEWER program providing review/modification of data for a given template. Automation of these functions eliminates much of the tedious, error-prone work required by the usual approach to database integration.

  17. Implementing a modeling software for animated protein-complex interactions using a physics simulation library.

    Science.gov (United States)

    Ueno, Yutaka; Ito, Shuntaro; Konagaya, Akihiko

    2014-12-01

    To better understand the behaviors and structural dynamics of proteins within a cell, novel software tools are being developed that can create molecular animations based on the findings of structural biology. This study describes a method, developed from our prototypes, to detect collisions and examine the soft-body dynamics of molecular models. The code was implemented with a software development toolkit for rigid-body dynamics simulation and a three-dimensional graphics library. The essential functions of the target software system include the basic molecular modeling environment, collision detection in the molecular models, and physical simulations of the movement of the model. Taking advantage of recent software technologies such as physics simulation modules and an interpreted scripting language, the functions required for accurate and meaningful molecular animation were implemented efficiently.

  18. Integrated library system in the library of Japan Atomic Energy Research Institute

    International Nuclear Information System (INIS)

    Yonezawa, Minoru; Mineo, Yukinobu; Itabashi, Keizo

    1987-01-01

    An integrated library system has been developed using a stand-alone mini-computer in the Japan Atomic Energy Research Institute library. The system consists of three subsystems for serials control, book acquisition and circulation control. The serials control subsystem deals with subscription, acquisition, claiming and inquiry of journals, and has been operating since the beginning of 1985. The book acquisition subsystem, started in April 1986, deals with accounting and cataloguing of books. The circulation control subsystem deals with circulation, statistics compilation, book inventory and retrieval, and has been operating since April 1987. This system contributes greatly not only to reducing the circulation work load but also to promoting library services. However, circulation processing should be made more convenient for materials whose catalogue information is not stored in the computer, and the appropriate maximum number of books retrieved also has to be reconsidered. (author)

  19. A real-time MPEG software decoder using a portable message-passing library

    Energy Technology Data Exchange (ETDEWEB)

    Kwong, Man Kam; Tang, P.T. Peter; Lin, Biquan

    1995-12-31

    We present a real-time MPEG software decoder that uses message-passing libraries such as MPL, p4 and MPI. The parallel MPEG decoder currently runs on the IBM SP system but can be easily ported to other parallel machines. This paper discusses our parallel MPEG decoding algorithm as well as the parallel programming environment under which it runs. Several technical issues are discussed, including balancing of decoding speed, memory limitations, I/O capacities, and optimization of MPEG decoding components. This project shows that a real-time portable software MPEG decoder is feasible on a general-purpose parallel machine.
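
    A master/worker decomposition of the kind described can be sketched with mpi4py, a Python stand-in for the MPL/p4/MPI C libraries the paper targets. Here decode_gop() is a placeholder for the real bitstream decoding work, and the number of groups-of-pictures is hypothetical; run with, e.g., mpiexec -n 4.

      from mpi4py import MPI

      def decode_gop(gop_id):
          # Placeholder for real MPEG group-of-pictures decoding.
          return f"frames of GOP {gop_id}"

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()
      num_gops = 16   # hypothetical number of groups-of-pictures in the stream

      if rank == 0:
          # Master: collect decoded groups-of-pictures for display ordering.
          results = {}
          for _ in range(num_gops):
              gop_id, frames = comm.recv(source=MPI.ANY_SOURCE)
              results[gop_id] = frames
          print("decoded", len(results), "GOPs")
      else:
          # Workers: decode a static round-robin share of the stream.
          for gop_id in range(rank - 1, num_gops, size - 1):
              comm.send((gop_id, decode_gop(gop_id)), dest=0)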

  20. New software library of geometrical primitives for modelling of solids used in Monte Carlo detector simulations

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    We present our effort to create a new software library of geometrical primitives, which are used for solid modelling in Monte Carlo detector simulations. We plan to replace and unify the current geometrical primitive classes in the CERN software projects Geant4 and ROOT with this library. Each solid is represented by a C++ class with methods suited to measuring the distance of a particle from the surface of a solid and to determining whether the particle is located inside, outside or on the surface of the solid. We use a numerical tolerance to determine whether a particle is located on the surface. The class methods also contain basic support for visualization. We use dedicated test suites to validate the shape codes. These also include special performance and numerical-value comparison tests, to help analyse possible candidate class methods and to verify that our new implementation proposals were designed and implemented properly. Currently, bridge classes are u...
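
    The interface described, distance to surface plus an inside/outside/surface query with numerical tolerance, can be illustrated with a toy solid. The real library is C++, so this Python sphere is only a conceptual analogue with invented method names.

      import math

      INSIDE, SURFACE, OUTSIDE = "inside", "surface", "outside"

      class Sphere:
          """Toy solid centred at the origin, with a surface tolerance band."""
          def __init__(self, radius, tolerance=1e-9):
              self.radius = radius
              self.tolerance = tolerance

          def safety_from_surface(self, p):
              """Unsigned distance from point p to the sphere surface."""
              return abs(math.dist(p, (0.0, 0.0, 0.0)) - self.radius)

          def locate(self, p):
              r = math.dist(p, (0.0, 0.0, 0.0))
              if abs(r - self.radius) <= self.tolerance:
                  return SURFACE
              return INSIDE if r < self.radius else OUTSIDE

      s = Sphere(1.0)
      print(s.locate((0.0, 0.0, 0.5)),      # inside
            s.locate((1.0, 0.0, 0.0)),      # on the surface (within tolerance)
            s.safety_from_surface((2.0, 0.0, 0.0)))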

  1. An integrated framework for software vulnerability detection ...

    Indian Academy of Sciences (India)

    Manoj Kumar

    2017-07-15

    Fragmentary abstract (garbled in extraction): the paper proposes an intelligent framework for software vulnerability detection, modelled on the autonomic nervous system of the human body and drawing on techniques such as artificial neural networks, genetic algorithms and grey ...

  2. CHECWORKS integrated software for corrosion control

    International Nuclear Information System (INIS)

    Schefski, C.; Pietralik; Hazelton, T.

    1997-01-01

    CHECWORKS, a comprehensive software package for managing Flow-Accelerated Corrosion (FAC, also called erosion-corrosion and flow-assisted corrosion) concerns, is expanding to include other systems and other aspects of corrosion control in CANDU reactors. This paper will outline CHECWORKS applications at various CANDU stations and further plans for CHECWORKS to become a code for comprehensive corrosion control management. (author)

  3. Software Ergonomics of Iranian Digital Library Software’s: An Accessibility-Centered Survey

    Directory of Open Access Journals (Sweden)

    Saeideh Jahanghiri

    2016-06-01

    Full Text Available Purpose: The purpose of this study is to evaluate the accessibility features of Iranian Digital Library Softwares (IDLSs). Method/Approach: This is applied research, carried out as a heuristic survey. The statistical population of the study comprises five digital library software packages: Azarakhsh, Nosa, Papyrus, Parvanpajooh and Payam. The researcher-made criteria list is based on ISO 9241-171 and was prepared through a Delphi method. Descriptive statistical techniques, together with the Friedman test and the SAW decision-making method, were used for data analysis. Findings: The results showed that IDLSs have made no notable effort to address accessibility features; their accessibility is obtained solely through the operating system and platform on which the software runs. That is why input accessibility features, which are provided through the OS, rank first among the accessibility features. There are statistically significant differences between IDLSs in their support for accessibility features. Originality/Value: This study, which surveys the accessibility features of IDLSs, is one of the first to address software accessibility in Iran, and it can play an important role in introducing disabled users' needs to software developers and digital collection builders.
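
    The SAW (Simple Additive Weighting) step mentioned in the method can be sketched as follows; the package names are taken from the abstract, but every score and weight below is an invented placeholder rather than the study's data.

      import numpy as np

      packages = ["Azarakhsh", "Nosa", "Papyrus", "Parvanpajooh", "Payam"]
      criteria_weights = np.array([0.5, 0.3, 0.2])   # must sum to 1

      # Benefit-type criterion scores, one row per package (placeholders).
      scores = np.array([[7, 5, 6],
                         [6, 6, 4],
                         [5, 4, 5],
                         [4, 6, 3],
                         [6, 3, 6]], dtype=float)

      normalized = scores / scores.max(axis=0)       # benefit criteria: x / max
      saw_score = normalized @ criteria_weights      # weighted sum per package
      for name, s in sorted(zip(packages, saw_score), key=lambda p: -p[1]):
          print(f"{name:13s} {s:.3f}")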

  4. Characterizing and Modeling the Cost of Rework in a Library of Reusable Software Components

    Science.gov (United States)

    Basili, Victor R.; Condon, Steven E.; ElEmam, Khaled; Hendrick, Robert B.; Melo, Walcelio

    1997-01-01

    In this paper we characterize and model the cost of rework in a Component Factory (CF) organization. A CF is responsible for developing and packaging reusable software components. Data was collected on corrective maintenance activities for the Generalized Support Software reuse asset library located at the Flight Dynamics Division of NASA's GSFC. We then constructed a predictive model of the cost of rework using the C4.5 system for generating a logical classification model. The predictor variables for the model are measures of internal software product attributes. The model demonstrates good prediction accuracy, and can be used by managers to allocate resources for corrective maintenance activities. Furthermore, we used the model to generate proscriptive coding guidelines to improve programming practices so that the cost of rework can be reduced in the future. The general approach we have used is applicable to other environments.
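
    The modelling idea, a logical classification model that predicts a rework-cost class from internal product metrics, can be sketched with a decision tree. scikit-learn's CART implementation stands in for the C4.5 system here, and the metric names and values are invented placeholders, not the GSFC data.

      from sklearn.tree import DecisionTreeClassifier, export_text

      # Invented component metrics: [size_sloc, cyclomatic_complexity, fan_out]
      X = [[120, 4, 2], [950, 28, 11], [300, 9, 4],
           [1400, 35, 16], [210, 6, 3], [800, 22, 9]]
      y = ["low", "high", "low", "high", "low", "high"]   # rework-cost class

      model = DecisionTreeClassifier(max_depth=2).fit(X, y)
      print(export_text(model, feature_names=["sloc", "complexity", "fan_out"]))
      print(model.predict([[500, 15, 6]]))                # classify a new component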

  5. Automation and Networking of Public Libraries in India Using the E-Granthalaya Software from the National Informatics Centre

    Science.gov (United States)

    Matoria, Ram Kumar; Upadhyay, P. K.; Moni, Madaswamy

    2007-01-01

    Purpose: To describe the development of the library management system, e-Granthalaya, for public libraries in India. This is an initiative of the Indian government's National Informatics Centre (NIC). The paper outlines the challenges and the potential of a full-scale deployment of this software at a national level. Design/methodology/approach:…

  6. Reverse Engineering in Data Integration Software

    Directory of Open Access Journals (Sweden)

    Vlad DIACONITA

    2013-05-01

    Full Text Available Integrated applications are complex solutions that help build better consolidated and standardized systems from existing (usually transactional) systems. Their complexity is determined by the economic processes they implement, the amount of data employed (millions of records grouped in hundreds of tables, databases of hundreds of GB) and the number of users [11]. Oracle, once mainly known for its database and e-business solutions, has been constantly expanding its product portfolio, providing solutions for SOA, BPA, warehousing, Big Data and cloud computing. In this article I will review the facilities and the power of using a dedicated integration tool in an environment with multiple data sources and a target data mart.

  7. Integration and validation of a data grid software

    Science.gov (United States)

    Carenton-Madiec, Nicolas; Berger, Katharina; Cofino, Antonio

    2014-05-01

    The Earth System Grid Federation (ESGF) Peer-to-Peer (P2P) is a software infrastructure for the management, dissemination, and analysis of model output and observational data. The ESGF grid is composed of several types of nodes with different roles. About 40 data nodes host model outputs and datasets using THREDDS catalogs. About 25 compute nodes offer remote visualization and analysis tools. About 15 index nodes crawl data-node catalogs and implement faceted and federated search in a web interface. About 15 identity-provider nodes manage accounts, authentication and authorization. Here we present an actual-size test federation spread across different institutes in different countries, together with a Python test suite that was started in December 2013. The first objective of the test suite is to provide a simple tool that helps to test and validate a single data node and its closest index, compute and identity-provider peers. The next objective will be to run this test suite on every data node of the federation and therefore test and validate every single node of the whole federation. The suite already uses the nosetests, requests, myproxy-logon, subprocess, selenium and fabric Python libraries in order to test both web front ends, back ends and security services. The goal of this project is to improve the quality of deliverables in a small development team context. Developers are widely spread around the world, working collaboratively and without hierarchy. This working context highlighted the need for a federated integration, test and validation process.
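
    A single node check of the kind the suite performs might look like the sketch below, using the requests library named above. The host name and catalog path are hypothetical, and the real suite layers nosetests, authentication and browser-driven checks on top of such probes.

      import requests

      DATA_NODE = "https://esgf-data.example.org"   # hypothetical node URL

      def check_thredds_catalog(node):
          """A healthy data node should serve its THREDDS catalog with HTTP 200."""
          r = requests.get(node + "/thredds/catalog.html", timeout=30)
          assert r.status_code == 200, f"catalog unreachable: HTTP {r.status_code}"
          return r.elapsed.total_seconds()

      if __name__ == "__main__":
          print("catalog responded in", check_thredds_catalog(DATA_NODE), "s")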

  8. Integrated software tool automates MOV diagnosis

    International Nuclear Information System (INIS)

    Joshi, B.D.; Upadhyaya, B.R.

    1996-01-01

    This article reports that researchers at the University of Tennessee have developed digital signal processing software that takes the guesswork out of motor current signature analysis (MCSA). The federal testing regulations for motor-operated valves (MOV) used in nuclear power plants have recently come under critical scrutiny by the Nuclear Regulatory Commission (NRC) and the American Society of Mechanical Engineers (ASME). New ASME testing specifications mandate that all valves performing a safety function are to be tested -- not just ASME Code 1, 2 and 3 valves. The NRC will likely endorse the ASME regulations in the near future. Because of these changes, several utility companies have voluntarily expanded the scope of their in-service testing programs for MOVs, in spite of the additional expense.

  9. Peak fitting and identification software library for high resolution gamma-ray spectra

    International Nuclear Information System (INIS)

    Uher, Josef; Roach, Greg; Tickner, James

    2010-01-01

    A new gamma-ray spectral analysis software package is under development in our laboratory. It can be operated as a stand-alone program or called as a software library from Java, C, C++ and MATLAB™ environments. It provides an advanced graphical user interface for data acquisition, spectral analysis and radioisotope identification. The code uses a peak-fitting function that includes peak asymmetry, Compton continuum and flexible background terms. Peak fitting function parameters can be calibrated as functions of energy. Each parameter can be constrained to improve fitting of overlapping peaks. All of these features can be adjusted by the user. To assist with peak identification, the code can automatically measure half-lives of single or multiple overlapping peaks from a time series of spectra. It implements library-based peak identification, with options for restricting the search based on radioisotope half-lives and reaction types. The software also improves the reliability of isotope identification by utilizing Monte-Carlo simulation results.
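
    The basic fitting step, a peak shape plus background terms adjusted by least squares, can be illustrated with SciPy. The model below is a plain Gaussian on a linear background; the package described adds asymmetry, Compton continuum terms and parameter constraints beyond this sketch, and the spectrum here is simulated rather than measured.

      import numpy as np
      from scipy.optimize import curve_fit

      def peak_model(E, area, centroid, sigma, b0, b1):
          # Gaussian peak (unit area scaled by 'area') on a linear background.
          gauss = (area / (sigma * np.sqrt(2 * np.pi))
                   * np.exp(-(E - centroid) ** 2 / (2 * sigma ** 2)))
          return gauss + b0 + b1 * E

      E = np.linspace(640, 684, 120)                       # energy bins (keV)
      truth = peak_model(E, 5e3, 661.7, 1.2, 40.0, -0.02)  # Cs-137-like peak
      rng = np.random.default_rng(0)
      counts = rng.poisson(truth)                          # simulated spectrum

      popt, pcov = curve_fit(peak_model, E, counts,
                             p0=[4e3, 660.0, 1.0, 30.0, 0.0])
      print("centroid = %.2f +/- %.2f keV" % (popt[1], np.sqrt(pcov[1, 1])))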

  10. Thermonuclear Reaction Rate Libraries and Software Tools for Nuclear Astrophysics Research

    International Nuclear Information System (INIS)

    Smith, Michael S.; Cyburt, Richard; Schatz, Hendrik; Smith, Karl; Warren, Scott; Ferguson, Ryan; Wiescher, Michael; Lingerfelt, Eric; Buckner, Kim; Nesaraja, Caroline D.

    2008-01-01

    Thermonuclear reaction rates are a crucial input for simulating a wide variety of astrophysical environments. A new collaboration has been formed to ensure that astrophysical modelers have access to reaction rates based on the most recent experimental and theoretical nuclear physics information. To reach this goal, a new version of the REACLIB library has been created by the Joint Institute for Nuclear Astrophysics (JINA), now available online at http://www.nscl.msu.edu/~nero/db. A complementary effort is the development of software tools in the Computational Infrastructure for Nuclear Astrophysics, online at nucastrodata.org, to streamline, manage, and access the workflow of the reaction evaluations from their initiation to peer review to incorporation into the library. Details of these new projects will be described.

  11. An Integrated Nonlinear Analysis library - (INA) for solar system plasma turbulence

    Science.gov (United States)

    Munteanu, Costel; Kovacs, Peter; Echim, Marius; Koppan, Andras

    2014-05-01

    We present an integrated software library dedicated to the analysis of time series recorded in space and adapted to investigate turbulence, intermittency and multifractals. The library is written in MATLAB and provides a graphical user interface (GUI) customized for the analysis of space physics data available online, such as: Coordinated Data Analysis Web (CDAWeb), Automated Multi Dataset Analysis system (AMDA), Planetary Science Archive (PSA), World Data Center Kyoto (WDC), Ulysses Final Archive (UFA) and Cluster Active Archive (CAA). Three main modules are already implemented in INA: the Power Spectral Density (PSD) Analysis, the Wavelet and Intermittency Analysis and the Probability Density Functions (PDF) Analysis. The layered structure of the software allows the user to easily switch between different modules/methods while retaining the same time interval for the analysis. The wavelet analysis module includes algorithms to compute and analyse the PSD, the Scalogram, the Local Intermittency Measure (LIM) or the Flatness parameter. The PDF analysis module includes algorithms for computing the PDFs for a range of scales and parameters fully customizable by the user; it also computes the Flatness parameter and enables fast comparison with standard PDF profiles like, for instance, the Gaussian PDF. The library has already been tested on Cluster and Venus Express data and we will show relevant examples. Research supported by the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 313038/STORM, and a grant of the Romanian Ministry of National Education, CNCS UEFISCDI, project number PN-II-ID PCE-2012-4-0418.
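
    The PSD module's core computation can be sketched with SciPy's Welch estimator together with a flatness (kurtosis) estimate. INA itself is a MATLAB GUI, so this Python fragment on a toy series is only a conceptual stand-in.

      import numpy as np
      from scipy.signal import welch
      from scipy.stats import kurtosis

      fs = 32.0                                   # sampling rate, Hz (assumed)
      t = np.arange(0, 600, 1 / fs)
      rng = np.random.default_rng(1)
      series = np.sin(2 * np.pi * 0.5 * t) + rng.normal(0, 1, t.size)  # toy data

      f, psd = welch(series, fs=fs, nperseg=1024)  # Welch power spectral density
      flatness = kurtosis(series, fisher=False)    # ~3 for a Gaussian signal
      print("spectral peak at %.2f Hz, flatness %.2f" % (f[np.argmax(psd)], flatness))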

  12. Integrating and Managing Bim in GIS, Software Review

    Science.gov (United States)

    El Meouche, R.; Rezoug, M.; Hijazi, I.

    2013-08-01

    Since the advent of Computer-Aided Design (CAD) and Geographical Information System (GIS) tools, project participants have been increasingly leveraging these tools throughout the different phases of a civil infrastructure project. In recent years the number of GIS software packages that provide tools to integrate building information in a geographic context has risen sharply. More and more GIS packages have added tools for this purpose, and other software projects are regularly extending these tools. However, each package has its own strengths, weaknesses and intended purpose. This paper provides a thorough review to investigate the software capabilities and clarify their purposes. For this study, Autodesk Revit 2012, a BIM editor, was used to create BIMs. In the first step, three building models were created; the resulting models were converted to BIM format, and the GIS software was then used to integrate them. For the evaluation of the software, general characteristics were studied, such as the user interface, the formats supported (import/export), and the way building information is imported.

  13. IM (Integrity Management) software must show flexibility to local codes

    Energy Technology Data Exchange (ETDEWEB)

    Brors, Markus [ROSEN Technology and Research Center GmbH (Germany); Diggory, Ian [Macaw Engineering Ltd., Northumberland (United Kingdom)

    2009-07-01

    There are many internationally recognized codes and standards, such as API 1160 and ASME B31.8S, which help pipeline operators to manage and maintain the integrity of their pipeline networks. However, operators in many countries still use local codes that often reflect the history of pipeline developments in their region and are based on direct experience and research on their pipelines. As pipeline companies come under increasing regulatory and financial pressures to maintain the integrity of their networks, it is important that operators using regional codes are able to benchmark their integrity management schemes against these international standards. Any comprehensive Pipeline Integrity Management System (PIMS) software package should therefore not only incorporate industry standards for pipeline integrity assessment but also be capable of implementing regional codes for comparison purposes. This paper describes the challenges and benefits of incorporating one such set of regional pipeline standards into ROSEN Asset Integrity Management Software (ROAIMS). (author)

  14. BioSig: the free and open source software library for biomedical signal processing.

    Science.gov (United States)

    Vidaurre, Carmen; Sander, Tilmann H; Schlögl, Alois

    2011-01-01

    BioSig is an open source software library for biomedical signal processing. The aim of the BioSig project is to foster research in biomedical signal processing by providing free and open source software tools for many different application areas. Some of the areas where BioSig can be employed are neuroinformatics, brain-computer interfaces, neurophysiology, psychology, cardiovascular systems, and sleep research. Moreover, the analysis of biosignals such as the electroencephalogram (EEG), electrocorticogram (ECoG), electrocardiogram (ECG), electrooculogram (EOG), electromyogram (EMG), or respiration signals is a very relevant element of the BioSig project. Specifically, BioSig provides solutions for data acquisition, artifact processing, quality control, feature extraction, classification, modeling, and data visualization, to name a few. In this paper, we highlight several methods to help students and researchers to work more efficiently with biomedical signals.

  15. Exploring the organizational impact of software-as-a-Service on software vendors the role of organizational integration in software-as-a-Service development and operation

    CERN Document Server

    Stuckenberg, Sebastian

    2014-01-01

    Software-as-a-Service has gained momentum as a software delivery and pricing model within the software industry. Existing practices of software vendors are challenged by a potential paradigm shift. This book analyzes the implications of Software-as-a-Service on software vendors using a business model and value chain perspective. The analysis of qualitative data from software vendors highlights the role of organizational integration within software vendors. By providing insights regarding the impact of Software-as-a-Service on organizational structures and processes of software vendors, this st

  16. SITEGI Project: Applying Geotechnologies to Road Inspection. Sensor Integration and software processing

    Directory of Open Access Journals (Sweden)

    J. Martínez-Sánchez

    2013-10-01

    Full Text Available Infrastructure management represents a critical economic milestone. The current decision-making process in infrastructure rehabilitation is essentially based on qualitative parameters obtained from visual inspections and is subject to the ability of technicians. In order to increase both efficiency and productivity in infrastructure management, this work addresses the integration of different instrumentation and sensors in a mobile mapping vehicle. This vehicle allows the continuous recording of quantitative data suitable for roadside inspection. The geometric integration and synchronization of these sensors is achieved through hardware and/or software strategies that permit the georeferencing of the data obtained with each sensor. In addition, visualization software for simpler data management was implemented using the Qt framework, the PCL library and C++. As a result, the developed system supports decision-making in road inspection, providing quantitative information suitable for sophisticated analysis systems.
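
    The georeferencing step described, attaching a position to each sensor reading, is in essence timestamp interpolation along the vehicle's GPS track. A minimal sketch follows; all timestamps and coordinates are invented for illustration.

      import numpy as np

      gps_t = np.array([0.0, 1.0, 2.0, 3.0])            # GPS fix times (s)
      gps_lat = np.array([43.360, 43.361, 43.362, 43.363])
      gps_lon = np.array([-8.410, -8.409, -8.408, -8.407])

      sensor_t = np.array([0.4, 1.7, 2.2])              # e.g. camera shutter times

      # Linear interpolation of vehicle position at each sensor timestamp.
      lat = np.interp(sensor_t, gps_t, gps_lat)
      lon = np.interp(sensor_t, gps_t, gps_lon)
      for t, la, lo in zip(sensor_t, lat, lon):
          print(f"t={t:.1f}s -> ({la:.5f}, {lo:.5f})")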

  17. Integrated FASTBUS, VME and CAMAC diagnostic software at Fermilab

    International Nuclear Information System (INIS)

    Anderson, J.; Forster, R.; Franzen, J.; Wilcer, N.

    1992-10-01

    A fully integrated system for the diagnosis and repair of data acquisition hardware in FASTBUS, VME and CAMAC is described. A short cost/benefit analysis of using a distributed network of personal computers for diagnosis is presented. The SPUDS (Single Platform Uniting Diagnostic Software) software package developed at Fermilab by the authors is introduced. Examples of how SPUDS is currently used in the Fermilab equipment repair facility, as an evaluation tool and for field diagnostics are given.

  18. First Invenio Workshop: CERN’s digital library software ten years on

    CERN Multimedia

    Joannah Caborn Wengler

    2012-01-01

    To mark the release of Invenio 1.0, the first User Group Workshop was held last week, with more than 40 developers, system administrators and librarians from 14 different countries attending. The participants were able to catch up on developments in CERN’s digital library software and get a glimpse of where it's going next.   “This is the first time we’ve held such a big workshop,” explains Jean-Yves Le Meur, head of Digital Library Services. “There was a lot of demand for an event like this, and bringing out version 1.0 of Invenio was an obvious time to do it.” Ask him what’s new in version 1.0 and he opens his eyes wide. “There’s so much, it’s hard to summarise. One key improvement is that the code and the database are stable, well tested and optimised, which makes the software more efficient.” The participants, on the other hand, highlight new features coming up in Invenio...

  19. End-user satisfaction analysis on library management system unnes using technology acceptance model towards national standard of integrated library

    Science.gov (United States)

    Hardyanto, W.; Purwinarko, A.; Adhi, M. A.

    2018-03-01

    The library, which is the gateway to the university, should be supported by an adequate information system in order to provide excellent and optimal service to every user. The library management system, in place since 2009, needs to be re-evaluated so that it can meet the needs of operators and Unnes users in particular, and users from outside Unnes in general. This study aims to evaluate and improve the existing library management system to produce a system that is accountable and able to meet the needs of end users, as well as to produce a library management system that is integrated with Unnes. The research delivers an evaluation report based on the Technology Acceptance Model (TAM) approach and a library management system integrated with the national standard.

  20. SU-F-J-72: A Clinical Usable Integrated Contouring Quality Evaluation Software for Radiotherapy

    International Nuclear Information System (INIS)

    Jiang, S; Dolly, S; Cai, B; Mutic, S; Li, H

    2016-01-01

    Purpose: To introduce the Auto Contour Evaluation (ACE) software, a clinically usable, user-friendly, efficient, all-in-one toolbox for automatically identifying common contouring errors in radiotherapy treatment planning using supervised machine-learning techniques. Methods: ACE is developed in C# using the Microsoft .Net framework and Windows Presentation Foundation (WPF) for elegant GUI design and smooth GUI transition animations, through the integration of graphics engines and high dots-per-inch (DPI) settings on modern high-resolution monitors. The industry-standard software design pattern, Model-View-ViewModel (MVVM), was chosen as the major architecture of ACE for a neat coding structure, deep modularization, easy maintainability and seamless communication with other clinical software. ACE consists of (1) a patient data importing module integrated with the clinical patient database server, (2) a module that simultaneously displays 2D DICOM images and RT structures, (3) a 3D RT structure visualization module using the Visualization Toolkit (VTK) library, and (4) a contour evaluation module that uses supervised pattern-recognition algorithms to detect contouring errors and display the detection results. ACE relies on supervised learning algorithms to handle all image-processing and data-processing jobs. Implementations of the related algorithms are powered by the Accord.Net scientific computing library for better efficiency and effectiveness. Results: ACE can take a patient's CT images and RT structures from commercial treatment planning software via direct user input or from the patient database. All functionalities, including 2D and 3D image visualization and RT contour error detection, have been demonstrated with real clinical patient cases. Conclusion: ACE implements supervised learning algorithms and combines image-processing and graphical visualization modules for RT contour verification. ACE has great potential for automated radiotherapy contouring quality verification.

  1. SU-F-J-72: A Clinical Usable Integrated Contouring Quality Evaluation Software for Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, S; Dolly, S; Cai, B; Mutic, S; Li, H [Washington University School of Medicine, Saint Louis, MO (United States)]

    2016-06-15

    Purpose: To introduce the Auto Contour Evaluation (ACE) software, a clinically usable, user-friendly, efficient, all-in-one toolbox for automatically identifying common contouring errors in radiotherapy treatment planning using supervised machine learning techniques. Methods: ACE is developed in C# using the Microsoft .NET framework and Windows Presentation Foundation (WPF) for elegant GUI design and smooth GUI transition animations, through the integration of graphics engines and high dots-per-inch (DPI) settings on modern high-resolution monitors. The industry-standard software design pattern, the Model-View-ViewModel (MVVM) pattern, was chosen as the major architecture of ACE for a clean coding structure, deep modularization, easy maintainability and seamless communication with other clinical software. ACE consists of 1) a patient data importing module integrated with the clinical patient database server, 2) a module that simultaneously displays 2D DICOM images and RT structures, 3) a 3D RT structure visualization module using the Visualization Toolkit (VTK) library and 4) a contour evaluation module using supervised pattern recognition algorithms to detect contouring errors and display the detection results. ACE relies on supervised learning algorithms to handle all image processing and data processing jobs. Implementations of the related algorithms are powered by the Accord.NET scientific computing library for better efficiency and effectiveness. Results: ACE can take a patient’s CT images and RT structures from commercial treatment planning software via direct user input or from the patient database. All functionalities, including 2D and 3D image visualization and RT contour error detection, have been demonstrated with real clinical patient cases. Conclusion: ACE implements supervised learning algorithms and combines image processing and graphical visualization modules for RT contour verification. ACE has great potential for automated radiotherapy contouring quality verification.
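
    The two records above describe ACE's error-detection module only at the architectural level. As a rough illustration of the underlying idea, supervised classification of contours from simple geometric features, here is a minimal Python/scikit-learn sketch; ACE itself is written in C# against Accord.NET, and the features, training data and labels below are invented for illustration:

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      def noisy_circle(radius, noise, n=64, rng=None):
          """A closed 2D contour: a circle with Gaussian perturbation."""
          t = np.linspace(0, 2 * np.pi, n, endpoint=False)
          pts = np.c_[radius * np.cos(t), radius * np.sin(t)]
          return pts + rng.normal(scale=noise, size=pts.shape)

      def contour_features(pts):
          """Toy features: shoelace area, perimeter, radial spread."""
          x, y = pts[:, 0], pts[:, 1]
          area = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
          edges = np.diff(pts, axis=0, append=pts[:1])  # wrap around to close
          perimeter = np.linalg.norm(edges, axis=1).sum()
          spread = np.std(np.linalg.norm(pts - pts.mean(axis=0), axis=1))
          return [area, perimeter, spread]

      rng = np.random.default_rng(0)
      # Smooth contours labelled 0 (acceptable), ragged ones labelled 1 (error).
      contours = [noisy_circle(25, 0.3, rng=rng) for _ in range(20)] \
               + [noisy_circle(25, 4.0, rng=rng) for _ in range(20)]
      labels = [0] * 20 + [1] * 20
      clf = RandomForestClassifier(n_estimators=50, random_state=0)
      clf.fit([contour_features(c) for c in contours], labels)

      # Classify an unseen, clearly ragged contour (expected label: 1).
      print(clf.predict([contour_features(noisy_circle(25, 3.5, rng=rng))]))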

  2. A Roadmap to Continuous Integration for ATLAS Software Development

    Science.gov (United States)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The ATLAS software infrastructure supports the efforts of more than 1000 developers working on a code base of 2200 packages with 4 million lines of C++ and 1.4 million lines of Python code. The ATLAS offline code management system is a powerful, flexible framework for processing requests for new package versions, probing code changes in the Nightly Build System, migrating to new platforms and compilers, deploying production releases for worldwide access, and supporting physicists with tools and interfaces for efficient software use. It maintains a multi-stream, parallel development environment with about 70 multi-platform branches of nightly releases and provides vast opportunities for testing new packages, verifying patches to existing software, and migrating to new platforms and compilers. The system's evolution is currently aimed at the adoption of modern continuous integration (CI) practices, focused on building nightly releases early and often, with rigorous unit and integration testing. This paper describes the CI incorporation program for the ATLAS software infrastructure. It brings modern open source tools such as Jenkins and GitLab into the ATLAS Nightly System, rationalizes hardware resource allocation and administrative operations, and provides improved feedback and the means for developers to fix broken builds promptly. Once adopted, ATLAS CI practices will improve and accelerate innovation cycles and result in increased confidence in new software deployments. The paper reports the status of Jenkins integration with the ATLAS Nightly System as well as short- and long-term plans for the incorporation of CI practices.
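
    As a purely illustrative companion to the record above, the following Python sketch shows the shape of a nightly build-and-test matrix that builds early and often and fails fast per branch and platform; the branch names, platforms and shell scripts are placeholders, not the actual ATLAS or Jenkins tooling:

      import itertools
      import subprocess

      BRANCHES = ["main", "devel"]                     # hypothetical branches
      PLATFORMS = ["x86_64-gcc11", "x86_64-clang14"]   # hypothetical targets

      def run(cmd):
          """Run one shell step; report failure instead of raising."""
          result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
          return result.returncode == 0, result.stdout + result.stderr

      for branch, platform in itertools.product(BRANCHES, PLATFORMS):
          ok = True
          for step in (f"./build.sh {branch} {platform}",          # compile
                       f"./run_unit_tests.sh {platform}",          # fast tests
                       f"./run_integration_tests.sh {platform}"):  # slow tests
              passed, log = run(step)
              if not passed:
                  print(f"[BROKEN] {branch}/{platform}: {step}")
                  ok = False
                  break          # fail fast so developers get prompt feedback
          if ok:
              print(f"[OK] {branch}/{platform}")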

  3. Extending IM beyond the Reference Desk: A Case Study on the Integration of Chat Reference and Library-Wide Instant Messaging Network

    Directory of Open Access Journals (Sweden)

    Ian Chan

    2012-09-01

    Full Text Available Openfire is an open source IM network and a single unified application that meets the needs of both chat reference and internal communications. In Fall 2009, the California State University San Marcos (CSUSM) Library began using Openfire and other Jive Software instant messaging technologies to simultaneously improve its existing IM-integrated chat reference software and implement an internal IM network. This case study describes the chat reference and internal communications environment at the CSUSM Library and the selection, implementation, and evaluation of Openfire. In addition, the authors discuss the benefits of deploying an integrated instant messaging and chat reference network.

  4. A roadmap to continuous integration for ATLAS software development

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00132984; The ATLAS collaboration; Elmsheuser, Johannes; Obreshkov, Emil; Krasznahorkay, Attila

    2017-01-01

    The ATLAS software infrastructure supports the efforts of more than 1000 developers working on a code base of 2200 packages with 4 million lines of C++ and 1.4 million lines of Python code. The ATLAS offline code management system is a powerful, flexible framework for processing requests for new package versions, probing code changes in the Nightly Build System, migrating to new platforms and compilers, deploying production releases for worldwide access, and supporting physicists with tools and interfaces for efficient software use. It maintains a multi-stream, parallel development environment with about 70 multi-platform branches of nightly releases and provides vast opportunities for testing new packages, verifying patches to existing software, and migrating to new platforms and compilers. The system's evolution is currently aimed at the adoption of modern continuous integration (CI) practices, focused on building nightly releases early and often, with rigorous unit and integration testing. This paper describes the CI ...

  5. A Roadmap to Continuous Integration for ATLAS Software Development

    CERN Document Server

    Elmsheuser, Johannes; The ATLAS collaboration; Obreshkov, Emil; Undrus, Alexander

    2016-01-01

    The ATLAS software infrastructure supports the efforts of more than 1000 developers working on a code base of 2200 packages with 4 million lines of C++ and 1.4 million lines of Python code. The ATLAS offline code management system is a powerful, flexible framework for processing requests for new package versions, probing code changes in the Nightly Build System, migrating to new platforms and compilers, deploying production releases for worldwide access, and supporting physicists with tools and interfaces for efficient software use. It maintains a multi-stream, parallel development environment with about 70 multi-platform branches of nightly releases and provides vast opportunities for testing new packages, verifying patches to existing software, and migrating to new platforms and compilers. The system's evolution is currently aimed at the adoption of modern continuous integration (CI) practices, focused on building nightly releases early and often, with rigorous unit and integration testing. This presentation describes t...

  6. A Posteriori Integration of University CAPE Software Developments

    DEFF Research Database (Denmark)

    Tolksdorf, Gregor; Fillinger, Sandra; Wozny, Guenter

    2015-01-01

    This contribution deals with the mutual integration of existing CAPE software products developed at different universities in Germany, Denmark, and Italy. After the motivation, MOSAIC is presented as the bridge building the connection between the modelling tool ICAS-MoT and the numerical processin...

  7. Software for the occupational health and safety integrated management system

    International Nuclear Information System (INIS)

    Vătăsescu, Mihaela

    2015-01-01

    This paper presents the design and production of software for the Occupational Health and Safety Integrated Management System, with a view to rapidly drawing up the system documents in the field of occupational health and safety

  8. Software for the occupational health and safety integrated management system

    Energy Technology Data Exchange (ETDEWEB)

    Vătăsescu, Mihaela [University Politehnica Timisoara, Department of Engineering and Management, 5 Revolutiei street, 331128 Hunedoara (Romania)]

    2015-03-10

    This paper presents the design and production of software for the Occupational Health and Safety Integrated Management System, with a view to rapidly drawing up the system documents in the field of occupational health and safety.

  9. Integrated management software files action in case of fire

    International Nuclear Information System (INIS)

    Moreno-Ventas Garcia, V.; Gimeno Serrano, F.

    2010-01-01

    Proper management of emergencies is a challenge for which it is essential to be prepared. The integrated software for managing action files in case of fire, together with rapid access to information, makes this application essential for effectively handling any fire emergency at a nuclear facility.

  10. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analysis of core photos and images; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board; the product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software, which features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  11. USE OF SOFTWARES FOR POSTURE ASSESSMENT: INTEGRATIVE REVIEW

    Directory of Open Access Journals (Sweden)

    Edyla Maria Porto de Freitas Camelo

    2015-09-01

    Full Text Available To carry out an integrative literature review of the postural analysis software packages available today. This is an integrative-narrative review, qualitative and methodological in nature, performed during April-July 2014. As inclusion criteria, the articles should be bibliographical or original research and available with full access. First, we identified the keywords commonly used in the health field for software related to postural assessment, namely "posture", "software", and "postural assessment". The search was narrowed to publication dates from 2002 to 2014. From the information acquired from the articles and from the software developers, information on 12 programs that assist postural evaluation was obtained - Alcimage, All Body Scan 3D, Aplob, APPID, Biotonix, Corporis Pro, Fisimetrix, Fisiometer Posturograma, Physical Fisio, Physio Easy, Posture Print and SAPO. However, only one tool has more extensive information and studies, namely SAPO. Many postural analysis software packages are available on the internet today; however, they differ widely in the answers they can provide and are still not widespread as research tools.

  12. SU-G-BRB-02: An Open-Source Software Analysis Library for Linear Accelerator Quality Assurance

    International Nuclear Information System (INIS)

    Kerns, J; Yaldo, D

    2016-01-01

    Purpose: Routine linac quality assurance (QA) tests have become complex enough to require automation of most test analyses. A new data analysis software library was built that allows physicists to automate routine linear accelerator quality assurance tests. The package is open source, code tested, and benchmarked. Methods: Images and data were generated on a TrueBeam linac for the following routine QA tests: VMAT, starshot, CBCT, machine logs, Winston Lutz, and picket fence. The analysis library was built using the general programming language Python. Each test was analyzed with the library algorithms and compared to manual measurements taken at the time of acquisition. Results: VMAT QA results agreed within 0.1% between the library and manual measurements. Machine logs (dynalogs & trajectory logs) were successfully parsed; mechanical axis positions were verified for accuracy and MLC fluence agreed well with EPID measurements. CBCT QA measurements were within 10 HU and 0.2mm where applicable. Winston Lutz isocenter size measurements were within 0.2mm of TrueBeam’s Machine Performance Check. Starshot analysis was within 0.2mm of the Winston Lutz results for the same conditions. Picket fence images with and without a known error showed that the library was capable of detecting MLC offsets within 0.02mm. Conclusion: A new routine QA software library has been benchmarked and is available for use by the community. The library is open-source and extensible for use in larger systems.

  13. SU-G-BRB-02: An Open-Source Software Analysis Library for Linear Accelerator Quality Assurance

    Energy Technology Data Exchange (ETDEWEB)

    Kerns, J [UT MD Anderson Cancer Center, Houston, TX (United States)]; Yaldo, D [Advocate Health Care, Park Ridge, IL (United States)]

    2016-06-15

    Purpose: Routine linac quality assurance (QA) tests have become complex enough to require automation of most test analyses. A new data analysis software library was built that allows physicists to automate routine linear accelerator quality assurance tests. The package is open source, code tested, and benchmarked. Methods: Images and data were generated on a TrueBeam linac for the following routine QA tests: VMAT, starshot, CBCT, machine logs, Winston Lutz, and picket fence. The analysis library was built using the general programming language Python. Each test was analyzed with the library algorithms and compared to manual measurements taken at the time of acquisition. Results: VMAT QA results agreed within 0.1% between the library and manual measurements. Machine logs (dynalogs & trajectory logs) were successfully parsed; mechanical axis positions were verified for accuracy and MLC fluence agreed well with EPID measurements. CBCT QA measurements were within 10 HU and 0.2mm where applicable. Winston Lutz isocenter size measurements were within 0.2mm of TrueBeam’s Machine Performance Check. Starshot analysis was within 0.2mm of the Winston Lutz results for the same conditions. Picket fence images with and without a known error showed that the library was capable of detecting MLC offsets within 0.02mm. Conclusion: A new routine QA software library has been benchmarked and is available for use by the community. The library is open-source and extensible for use in larger systems.
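
    The two records above do not spell out how physicists drive such a library from code. The sketch below shows a plausible usage pattern in Python; the module name qa_library, the class names, the attributes and the parameters are hypothetical stand-ins for illustration, not the package's documented API:

      # Hypothetical usage sketch of an open-source linac QA analysis library.
      # All names below are invented; consult the real package's documentation.
      from qa_library import PicketFence, StarShot   # hypothetical imports

      pf = PicketFence("picket_fence.dcm")   # EPID image of the picket-fence test
      pf.analyze(tolerance=0.5)              # flag MLC leaf offsets > 0.5 mm
      print(pf.results())                    # pass/fail summary and max offset

      star = StarShot("starshot.tif")        # film/EPID starshot image
      star.analyze()
      print(star.wobble_diameter)            # isocenter wobble circle, in mm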

  14. Shoestring Digital Library: If Existing Digital Library Software Doesn't Suit Your Needs, Create Your Own

    Science.gov (United States)

    Weber, Jonathan

    2006-01-01

    Creating a digital library might seem like a task best left to a large research collection with a vast staff and generous budget. However, tools for successfully creating digital libraries are getting easier to use all the time. The explosion of people creating content for the web has led to the availability of many high-quality applications and…

  15. Debris Examination Using Ballistic and Radar Integrated Software

    Science.gov (United States)

    Griffith, Anthony; Schottel, Matthew; Lee, David; Scully, Robert; Hamilton, Joseph; Kent, Brian; Thomas, Christopher; Benson, Jonathan; Branch, Eric; Hardman, Paul

    2012-01-01

    The Debris Examination Using Ballistic and Radar Integrated Software (DEBRIS) program was developed to provide rapid and accurate analysis of debris observed by the NASA Debris Radar (NDR). This software provides a greatly improved analysis capacity over earlier manual processes, allowing for up to four times as much data to be analyzed by one-quarter of the personnel required by earlier methods. There are two applications that comprise the DEBRIS system: the Automated Radar Debris Examination Tool (ARDENT) and the primary DEBRIS tool.

  16. Collaboration in Global Software Engineering Based on Process Description Integration

    Science.gov (United States)

    Klein, Harald; Rausch, Andreas; Fischer, Edward

    Globalization is one of the big trends in software development. Development projects need a variety of different resources with appropriate expert knowledge to be successful. More and more of these resources are nowadays obtained from specialized organizations and countries all over the world, varying in development approaches, processes, and culture. As seen with early outsourcing attempts, collaboration may fail due to these differences. Hence, the major challenge in global software engineering is to streamline collaborating organizations towards a successful conjoint development. Based on typical collaboration scenarios, this paper presents a structured approach to integrate processes in a comprehensible way.

  17. Construction, database integration, and application of an Oenothera EST library.

    Science.gov (United States)

    Mrácek, Jaroslav; Greiner, Stephan; Cho, Won Kyong; Rauwolf, Uwe; Braun, Martha; Umate, Pavan; Altstätter, Johannes; Stoppel, Rhea; Mlcochová, Lada; Silber, Martina V; Volz, Stefanie M; White, Sarah; Selmeier, Renate; Rudd, Stephen; Herrmann, Reinhold G; Meurer, Jörg

    2006-09-01

    Coevolution of cellular genetic compartments is a fundamental aspect of eukaryotic genome evolution that becomes apparent in serious developmental disturbances after interspecific organelle exchanges. The genus Oenothera represents a unique resource, at present the only one available, for studying the role of the compartmentalized plant genome in the diversification of populations and speciation processes. An integrated approach involving cDNA cloning, EST sequencing, and bioinformatic data mining was chosen, using Oenothera elata with the genetic constitution nuclear genome AA with plastome type I. The Gene Ontology system grouped 1621 unique gene products into 17 different functional categories. Application of arrays generated from a selected fraction of ESTs revealed significantly differing expression profiles among closely related Oenothera species possessing the potential to generate fertile and incompatible plastid/nuclear hybrids (hybrid bleaching). Furthermore, the EST library provides a valuable source of PCR-based polymorphic molecular markers that are instrumental for genotyping and molecular mapping approaches.

  18. Integration of Simulink Models with Component-based Software Models

    Directory of Open Access Journals (Sweden)

    MARIAN, N.

    2008-06-01

    Full Text Available Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics of abstract system descriptions. Usually, in mechatronic systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, plant and controller behavior is simulated using graphical blocks to represent mathematical and logical constructs and process flow; then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behavior as a means of computation, communication and constraints, using computational blocks and aggregates for both discrete and continuous behaviour, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to meet the demands for more functionality at ever lower prices and under opposing constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework, developed by the software engineering group of the Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has to be analyzed. One way of doing that is to integrate, in wrapper files, the model back into Simulink S-functions and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behaviour, and the transformation of the software system into the S
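
    To make the notion of a "safe translation of a restricted set of blocks" concrete, here is a small, hypothetical Python sketch, not the paper's actual tooling: each supported block type maps to a component factory, and anything outside the restricted set is rejected at translation time:

      # Illustrative sketch only: a restricted block set translated into
      # executable components; unsupported blocks fail loudly, which is what
      # makes the translation "safe" for the restricted set.
      def gain(k):
          return lambda u: k * u

      def sum_block(signs):
          return lambda *inputs: sum(s * u for s, u in zip(signs, inputs))

      def unit_delay(initial=0.0):
          state = {"x": initial}
          def step(u):              # discrete-time: emit old state, store input
              y = state["x"]
              state["x"] = u
              return y
          return step

      SUPPORTED = {"Gain": gain, "Sum": sum_block, "UnitDelay": unit_delay}

      def translate(block_type, *params):
          if block_type not in SUPPORTED:
              raise ValueError(f"block '{block_type}' is outside the restricted set")
          return SUPPORTED[block_type](*params)

      # Compose a trivial discrete controller: y[n] = 2 * u[n-1]
      delay = translate("UnitDelay")
      amp = translate("Gain", 2.0)
      for u in (1.0, 3.0, 5.0):
          print(amp(delay(u)))      # prints 0.0, 2.0, 6.0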

  19. Special Features of the Advanced Loans Module of the ABCD Integrated Library System

    Science.gov (United States)

    de Smet, Egbert

    2011-01-01

    Purpose: The "advanced loans" module of the relatively new library software, ABCD, is an addition to the normal loans module and it offers a "generic transaction decision-making engine" functionality. The module requires extra installation effort and parameterisation, so this article aims to explain to the many potentially interested libraries,…

  20. Managing Risks in Distributed Software Projects: An Integrative Framework

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Boeg, Jesper

    2009-01-01

    Software projects are increasingly geographically distributed with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. While risk management has been adopted with success to address other challenges within software development, there are currently no frameworks available for managing risks related to geographical distribution. On this background, we systematically review the literature on geographically distributed software projects. Based on the review, we synthesize what we know about risks and risk resolution techniques into an integrative framework for managing risks in distributed contexts. Subsequent implementation of a Web-based tool helped us refine the framework based on empirical evaluation of its practical usefulness. We conclude by discussing implications for both research and practice.

  1. Smart Libraries: Integrating Communications Channels and Information Sources.

    Science.gov (United States)

    Webb, T. D.; Jensen, E. A.

    Noting that the changing nature of information delivery has established immediacy as the new basis for modern library service, this paper describes the new facilities design and floor plan for the library of Kapiolani Community College of the University of Hawaii. The new library was carefully designed so that students can move progressively from…

  2. An integrated infrastructure in support of software development

    International Nuclear Information System (INIS)

    Antonelli, S; Bencivenni, M; De Girolamo, D; Giacomini, F; Longo, S; Manzali, M; Veraldi, R; Zani, S

    2014-01-01

    This paper describes the design and the current state of implementation of an infrastructure made available to software developers within the Italian National Institute for Nuclear Physics (INFN) to support and facilitate their daily activity. The infrastructure integrates several tools, each providing a well-identified function: project management, version control system, continuous integration, dynamic provisioning of virtual machines, efficiency improvement, knowledge base. When applicable, access to the services is based on the INFN-wide Authentication and Authorization Infrastructure. The system is being installed and progressively made available to INFN users belonging to tens of sites and laboratories and will represent a solid foundation for the software development efforts of the many experiments and projects that see the involvement of the Institute. The infrastructure will be beneficial especially for small- and medium-size collaborations, which often cannot afford the resources, in particular in terms of know-how, needed to set up such services.

  3. Integrating commercial software in accelerator control- case study

    International Nuclear Information System (INIS)

    Pace, Alberto

    1994-01-01

    Using existing commercial software is the dream of any control system engineer, as the resulting development cost reduction can reach an order of magnitude. This dream often vanishes when the requirement appears for a uniform and consistent architecture across a wide number of components and applications. This makes it difficult to integrate several commercial packages, which often impose different user interface and communication standards. This paper describes the approach and standards chosen for the CERN ISOLDE control system, which have allowed several commercial packages to be integrated in the system as they are, permitting the software development cost to be reduced to a minimum. (author). 10 refs., 2 tabs., 9 figs

  4. Business Intelligence Applied to the ALMA Software Integration Process

    Science.gov (United States)

    Zambrano, M.; Recabarren, C.; González, V.; Hoffstadt, A.; Soto, R.; Shen, T.-C.

    2012-09-01

    Software quality assurance and planning for an astronomy project is a complex task, especially if it is a distributed collaborative project such as ALMA, where the development centers are spread across the globe. When you execute a software project, there is much valuable information about the process itself that you may be able to collect. One way to receive this input is via an issue tracking system that gathers the problem reports relating to software bugs captured during testing of the software, during integration of the different components, or, even worse, problems occurring in production. Usually little time is spent analyzing them, but with some multidimensional processing you can extract valuable information that may help with long-term planning and resource allocation. We present an analysis of the information collected at ALMA from a collection of key unbiased indicators. We describe the extraction, transformation and load process and how the data were processed. The main goal is to assess the software process and obtain insights from this information.
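
    A toy Python example of the kind of multidimensional rollup the record alludes to, counting problem reports by phase and component; the field names and records are invented, and a real system would pull them from the issue tracker:

      from collections import Counter

      # Invented problem reports; a real pipeline would load these from the
      # issue tracking system during the "extract" step.
      reports = [
          {"phase": "integration", "component": "correlator"},
          {"phase": "integration", "component": "archive"},
          {"phase": "production",  "component": "correlator"},
          {"phase": "testing",     "component": "archive"},
          {"phase": "production",  "component": "correlator"},
      ]

      # Rollup across two dimensions: (phase, component) -> report count.
      rollup = Counter((r["phase"], r["component"]) for r in reports)
      for (phase, component), n in rollup.most_common():
          print(f"{phase:12s} {component:12s} {n}")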

  5. Distributed software framework and continuous integration in hydroinformatics systems

    Science.gov (United States)

    Zhou, Jianzhong; Zhang, Wei; Xie, Mengfei; Lu, Chengwei; Chen, Xiao

    2017-08-01

    When encountering multiple and complicated models, multi-source structured and unstructured data, and complex requirements analysis, the platform design and integration of hydroinformatics systems become a challenge. To properly solve these problems, we describe a distributed software framework and its continuous integration process for hydroinformatics systems. This distributed framework mainly consists of a server cluster for models, a distributed database, GIS (Geographic Information System) servers, a master node and clients. Based on it, a GIS-based decision support system for the joint regulation of water quantity and water quality of a group of lakes in Wuhan, China is established.

  6. Integrated software system for low level waste management

    International Nuclear Information System (INIS)

    Worku, G.

    1995-01-01

    In the continually changing and uncertain world of low level waste management, many generators in the US are faced with the prospect of having to store their waste on site for the indefinite future. This consequently increases the set of tasks performed by the generators in the areas of packaging, characterizing, classifying, screening (if a set of acceptance criteria applies), and managing the inventory for the duration of onsite storage. When disposal sites become available, it is expected that the work will require re-evaluating the waste packages, including possible re-processing, re-packaging, or re-classifying in preparation for shipment for disposal under the regulatory requirements of the time. In this day and age, when there is wide use of computers and computer literacy is at high levels, an important waste management tool would be an integrated software system that aids waste management personnel in conducting these tasks quickly and accurately. It has become evident that such an integrated radwaste management software system offers great benefits to radwaste generators both in the US and other countries. This paper discusses one such approach to integrated radwaste management utilizing some globally accepted radiological assessment software applications

  7. CyberGIS software: a synthetic review and integration roadmap

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Shaowen [University of Illinois, Urbana-Champaign]; Anselin, Luc [Arizona State University]; Bhaduri, Budhendra L [ORNL]; Cosby, Christopher [University Navstar Consortium, Boulder, CO]; Goodchild, Michael [University of California, Santa Barbara]; Liu, Yan [University of Illinois, Urbana-Champaign]; Nygers, Timothy L. [University of Washington, Seattle]

    2013-01-01

    CyberGIS, defined as cyberinfrastructure-based geographic information systems (GIS), has emerged as a new generation of GIS representing an important research direction for both cyberinfrastructure and geographic information science. This study introduces a 5-year effort funded by the US National Science Foundation to advance the science and applications of CyberGIS, particularly for enabling the analysis of big spatial data, computationally intensive spatial analysis and modeling (SAM), and collaborative geospatial problem-solving and decision-making, simultaneously conducted by a large number of users. Several fundamental research questions are raised and addressed, while a set of CyberGIS challenges and opportunities are identified from scientific perspectives. The study reviews several key CyberGIS software tools that are used to elucidate a vision and roadmap for CyberGIS software research. The roadmap focuses on software integration and the synthesis of cyberinfrastructure, GIS, and SAM by defining several key integration dimensions and strategies. CyberGIS, based on this holistic integration roadmap, exhibits the following key characteristics: high-performance and scalable, open and distributed, collaborative, service-oriented, user-centric, and community-driven. As a major result of the roadmap, two key CyberGIS modalities, gateway and toolkit, combined with a community-driven and participatory approach, have laid a solid foundation for achieving scientific breakthroughs across many geospatial communities that would otherwise be impossible.

  8. Key issues regarding digital libraries evaluation and integration

    CERN Document Server

    Shen, Rao; Fox, Edward A

    2013-01-01

    This is the second book based on the 5S (Societies, Scenarios, Spaces, Structures, Streams) approach to digital libraries (DLs). Leveraging the first volume, on Theoretical Foundations, we focus on the key issues of evaluation and integration. These cross-cutting issues serve as a bridge for those interested in DLs, connecting the introduction and formal discussion in the first book with the coverage of key technologies in the third book, and of illustrative applications in the fourth book. These two topics have central importance in the DL field, allowing it to be treated scientifically as well as practically. In the scholarly world, we only really understand something if we know how to measure and evaluate it. In the Internet era of distributed information systems, we can only be practical at scale if we integrate across both systems and their associated content. Evaluation of DLs must take place at multiple levels, so we can address the different entities and their associated measures. Thus, for digital obj...

  9. Applications of a Case Library of Technology Integration Stories for Teachers

    Science.gov (United States)

    Wang, Feng-Kwei; Jonassen, David H.; Strobel, Johannes; Cernusca, Dawn

    2003-01-01

    Stories are the most natural form of communication and learning among humans. In this paper, we describe how we have designed and implemented a case library of technology integration stories to support pre-service and in-service teachers learning how to integrate technologies into their teaching. The case library was built using the artificial…

  10. Semi-automated software service integration in virtual organisations

    Science.gov (United States)

    Afsarmanesh, Hamideh; Sargolzaei, Mahdi; Shadi, Mahdieh

    2015-08-01

    To enhance their business opportunities, organisations involved in many service industries are increasingly active in the pursuit of both online provision of their business services (BSs) and collaboration with others. Collaborative Networks (CNs) in the service industry sector, however, face many challenges related to the sharing and integration of their collections of provided BSs and the corresponding software services. Therefore, the topic of service interoperability, for which this article introduces a framework, is gaining momentum in research on supporting CNs. It contributes to the generation of formal, machine-readable specifications for business processes, aimed at providing their unambiguous definitions, as needed for developing their equivalent software services. The framework provides a model and implementation architecture for the discovery and composition of shared services, to support the semi-automated development of integrated value-added services. In support of service discovery, a main contribution of this research is the formal representation of services' behaviour and the application of desired service behaviour specified by users for automated matchmaking with other existing services. Furthermore, to support service integration, mechanisms are developed for automated selection of the most suitable service(s) according to a number of service quality aspects. Two scenario cases are presented, which exemplify several specific features related to service discovery and service integration aspects.
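
    A much-simplified Python sketch of the two steps described above, behaviour-based discovery followed by quality-based selection; the data model, the subsequence check and the scoring rule are all invented for illustration:

      # Invented service registry: each entry advertises an operation sequence
      # (its behaviour) and some quality-of-service figures.
      services = [
          {"name": "PayFast",   "behaviour": ["quote", "pay", "confirm"],
           "quality": {"availability": 0.99, "latency_ms": 120}},
          {"name": "PaySafe",   "behaviour": ["quote", "pay", "confirm", "refund"],
           "quality": {"availability": 0.97, "latency_ms": 80}},
          {"name": "QuoteOnly", "behaviour": ["quote"],
           "quality": {"availability": 0.99, "latency_ms": 40}},
      ]

      def matches(required, provided):
          """Discovery step: the provided behaviour must contain the required
          operations in order (a subsequence check)."""
          it = iter(provided)
          return all(op in it for op in required)

      def score(q):
          """Selection step: collapse quality aspects into one number."""
          return q["availability"] - q["latency_ms"] / 1000.0

      required = ["quote", "pay", "confirm"]
      candidates = [s for s in services if matches(required, s["behaviour"])]
      best = max(candidates, key=lambda s: score(s["quality"]))
      print(best["name"])   # PaySafe: both payment services match the required
                            # behaviour, but its combined quality score is higher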

  11. Library/vendor relationships

    CERN Document Server

    Brooks, Sam

    2014-01-01

    A view of the mutual dependence between libraries and vendors. As technology advances, libraries are forced to reach beyond their own resources to find effective ways to maintain accuracy and superior service levels. Vendors provide databases and integrated library systems that perform those functions for profit. Library/Vendor Relationships examines the increasing cooperation in which libraries and the vendors that provide system infrastructure and software find they must participate. Expert contributors provide insights from all sides of this unique collaboration, offering

  12. Software for the Integration of Multiomics Experiments in Bioconductor.

    Science.gov (United States)

    Ramos, Marcel; Schiffer, Lucas; Re, Angela; Azhar, Rimsha; Basunia, Azfar; Rodriguez, Carmen; Chan, Tiffany; Chapman, Phil; Davis, Sean R; Gomez-Cabrero, David; Culhane, Aedin C; Haibe-Kains, Benjamin; Hansen, Kasper D; Kodali, Hanish; Louis, Marie S; Mer, Arvind S; Riester, Markus; Morgan, Martin; Carey, Vince; Waldron, Levi

    2017-11-01

    Multiomics experiments are increasingly commonplace in biomedical research and add layers of complexity to experimental design, data integration, and analysis. R and Bioconductor provide a generic framework for statistical analysis and visualization, as well as specialized data classes for a variety of high-throughput data types, but methods are lacking for integrative analysis of multiomics experiments. The MultiAssayExperiment software package, implemented in R and leveraging Bioconductor software and design principles, provides for the coordinated representation of, storage of, and operation on multiple diverse genomics data. We provide the unrestricted multiple 'omics data for each cancer tissue in The Cancer Genome Atlas as ready-to-analyze MultiAssayExperiment objects and demonstrate in these and other datasets how the software simplifies data representation, statistical analysis, and visualization. The MultiAssayExperiment Bioconductor package reduces major obstacles to efficient, scalable, and reproducible statistical analysis of multiomics data and enhances data science applications of multiple omics datasets. Cancer Res; 77(21); e39-42. ©2017 American Association for Cancer Research.

  13. [Development of integrated support software for clinical nutrition].

    Science.gov (United States)

    Siquier Homar, Pedro; Pinteño Blanco, Manel; Calleja Hernández, Miguel Ángel; Fernández Cortés, Francisco; Martínez Sotelo, Jesús

    2015-09-01

    Objective: to develop an integrated computer software application for specialized nutritional support, integrated in the electronic clinical record, which automatically detects, at an early stage, those patients who are undernourished or at risk of developing undernourishment, determining points of opportunity for improvement and evaluation of the results. Methods: the quality standards published by the Nutrition Work Group of the Spanish Society of Hospital Pharmacy (SEFH) and the recommendations by the Pharmacy Group of the Spanish Society of Parenteral and Enteral Nutrition (SENPE) have been taken into account. According to these quality standards, nutritional support has to include the following healthcare stages or sub-processes: nutritional screening, nutritional assessment, plan for nutritional care, prescription, preparation and administration. Results: this software makes it possible to conduct, in an automated way, a specific nutritional assessment for patients at nutritional risk, implementing, if necessary, a nutritional treatment plan, conducting follow-up and traceability of the outcomes derived from the implementation of improvement actions, and quantifying to what extent our practice is close to the established standard. Conclusions: this software makes it possible to standardize specialized nutritional support from a multidisciplinary point of view, introducing the concept of quality control per process, and including the patient as the main customer. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  14. Development of integrated support software for clinical nutrition

    Directory of Open Access Journals (Sweden)

    Pedro Siquier Homar

    2015-09-01

    Full Text Available Objectives: to develop an integrated computer software application for specialized nutritional support, integrated in the electronic clinical record, which automatically detects, at an early stage, those patients who are undernourished or at risk of developing undernourishment, determining points of opportunity for improvement and evaluation of the results. Methods: the quality standards published by the Nutrition Work Group of the Spanish Society of Hospital Pharmacy (SEFH) and the recommendations by the Pharmacy Group of the Spanish Society of Parenteral and Enteral Nutrition (SENPE) have been taken into account. According to these quality standards, nutritional support has to include the following healthcare stages or sub-processes: nutritional screening, nutritional assessment, plan for nutritional care, prescription, preparation and administration. Results: this software makes it possible to conduct, in an automated way, a specific nutritional assessment for patients at nutritional risk, implementing, if necessary, a nutritional treatment plan, conducting follow-up and traceability of the outcomes derived from the implementation of improvement actions, and quantifying to what extent our practice is close to the established standard. Conclusions: this software makes it possible to standardize specialized nutritional support from a multidisciplinary point of view, introducing the concept of quality control per process, and including the patient as the main customer.
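
    As a toy illustration of the screening sub-process named in these records, the Python sketch below flags patients at nutritional risk from a few record fields; the field names, thresholds and risk rule are invented, not the paper's actual criteria:

      from dataclasses import dataclass

      @dataclass
      class Patient:
          name: str
          bmi: float
          weight_loss_pct_3m: float   # weight lost in the last 3 months, %
          reduced_intake: bool        # markedly reduced dietary intake

      def at_nutritional_risk(p: Patient) -> bool:
          # Invented screening rule, loosely in the style of common tools.
          return p.bmi < 18.5 or p.weight_loss_pct_3m > 5 or p.reduced_intake

      ward = [Patient("A", 17.9, 2.0, False),
              Patient("B", 24.0, 1.0, False),
              Patient("C", 22.5, 7.5, True)]

      for p in ward:
          if at_nutritional_risk(p):   # patients A and C are flagged
              print(f"{p.name}: start nutritional assessment and care plan")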

  15. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae; Top, Søren

    2008-01-01

    A software component-based system aims to organize system architecture and behavior as a means of computation, communication and constraints, using computational blocks and aggregates for both discrete and continuous behaviour, different interconnection and execution disciplines for event-based and time-based controllers, and so on, to meet the demands for more functionality at ever lower prices and under opposing constraints. Once specified, the software model has to be analyzed. One way of doing that is to integrate in wrapper files the model back into Simulink S-functions, and use its extensive simulation features, thus allowing an early exploration of the possible design choices over multiple disciplines. The paper describes a safe translation of a restricted set of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behaviour, and the transformation of the software system into the S-functions. The general aim of this work is the improvement of multi-disciplinary development of embedded systems with the focus on the relation...

  16. Large scale continuous integration and delivery : Making great software better and faster

    NARCIS (Netherlands)

    Stahl, Daniel

    2017-01-01

    Since the inception of continuous integration, and later continuous delivery, the methods of producing software in the industry have changed dramatically over the last two decades. Automated, rapid and frequent compilation, integration, testing, analysis, packaging and delivery of new software

  17. Integrating manufacturing softwares for intelligent planning execution: a CIIMPLEX perspective

    Science.gov (United States)

    Chu, Bei Tseng B.; Tolone, William J.; Wilhelm, Robert G.; Hegedus, M.; Fesko, J.; Finin, T.; Peng, Yun; Jones, Chris H.; Long, Junshen; Matthews, Mike; Mayfield, J.; Shimp, J.; Su, S.

    1997-01-01

    Recent developments have made it possible to interoperate complex business applications at much lower costs. Application interoperation, along with business process re-engineering, can result in significant savings by eliminating work created by disconnected business processes due to isolated business applications. However, we believe much greater productivity benefits can be achieved by facilitating timely decision-making that utilizes information from multiple enterprise perspectives. The CIIMPLEX enterprise integration architecture is designed to enable such productivity gains by helping people carry out integrated enterprise scenarios. An enterprise scenario is typically triggered by some external event. The goal of an enterprise scenario is to make the right decisions considering the full context of the problem. Enterprise scenarios are difficult for people to carry out because of the interdependencies among various actions. One can easily be overwhelmed by the large amount of information. We propose the use of software agents to help gather relevant information and present it in the appropriate context of an enterprise scenario. The CIIMPLEX enterprise integration architecture is based on the FAIME methodology for application interoperation and plug-and-play. It also explores the use of software agents in application plug-and-play.

  18. Digital Library and Digital Reference Service: Integration and Mutual Complementarity

    Science.gov (United States)

    Liu, Jia

    2008-01-01

    Both the digital library and the digital reference service were invented and have been developed under the networked environment. Among their intersections, the fundamental thing is their symbiotic interest--serving the user in a more efficient way. The article starts by discussing the digital library and its service and the digital reference…

  19. Multimedia in German Libraries--Aspects of Cooperation and Integration.

    Science.gov (United States)

    Cremer, Monika

    This paper on multimedia in German libraries begins with an introduction to multimedia. Initiatives of the federal government and in the Laender (federal states) are then described, including: a 1997 symposium organized by the university library of Goettingen that presented several multimedia models developed in universities; the multimedia…

  20. Integrated risk assessment for spent fuel transportation using developed software

    International Nuclear Information System (INIS)

    Yun, Mi Rae; Christian, Robby; Kim, Bo Gyung; Almomani, Belal; Ham, Jae Hyun; Kang, Gook Hyun; Lee, Sang hoon

    2016-01-01

    As on-site spent fuel storage facilities reach the limits of their capacity, spent fuel needs to be transported elsewhere. In this research, the risks of two transportation methods, maritime transportation and on-site transportation, as well as of an interim storage facility, were analyzed. Easier, integrated risk assessment for spent fuel transportation will be possible by applying this software. Risk assessment for spent fuel transportation had not previously been researched, and this work presents a case for analysis. By using this analysis method and the developed software, regulators can gain insights for spent fuel transportation. For example, they can restrict specific regions to prevent ocean accidents; they can arrange spent fuel in the interim storage facility so as to avoid the most risky region, which has a high risk from aircraft engine shaft impact; and they can apply soft material to the floor for a specific stage of on-site transportation. In this software, because we targeted Korea, we needed to use Korean reference data. However, there were few Korean reference data; in particular, there were no food chain data for the Korean ocean. MARINRAD uses a steady-state food chain model, but this is far from reality. Therefore, to obtain realistic Korean reference data, a dynamic food chain model for the Korean ocean needs to be developed.

  1. Integrated risk assessment for spent fuel transportation using developed software

    Energy Technology Data Exchange (ETDEWEB)

    Yun, Mi Rae; Christian, Robby; Kim, Bo Gyung; Almomani, Belal; Ham, Jae Hyun; Kang, Gook Hyun [KAIST, Daejeon (Korea, Republic of)]; Lee, Sang hoon [Keimyung University, Daegu (Korea, Republic of)]

    2016-05-15

    As on-site spent fuel storage facilities reach the limits of their capacity, spent fuel needs to be transported elsewhere. In this research, the risks of two transportation methods, maritime transportation and on-site transportation, as well as of an interim storage facility, were analyzed. Easier, integrated risk assessment for spent fuel transportation will be possible by applying this software. Risk assessment for spent fuel transportation had not previously been researched, and this work presents a case for analysis. By using this analysis method and the developed software, regulators can gain insights for spent fuel transportation. For example, they can restrict specific regions to prevent ocean accidents; they can arrange spent fuel in the interim storage facility so as to avoid the most risky region, which has a high risk from aircraft engine shaft impact; and they can apply soft material to the floor for a specific stage of on-site transportation. In this software, because we targeted Korea, we needed to use Korean reference data. However, there were few Korean reference data; in particular, there were no food chain data for the Korean ocean. MARINRAD uses a steady-state food chain model, but this is far from reality. Therefore, to obtain realistic Korean reference data, a dynamic food chain model for the Korean ocean needs to be developed.

  2. Pegasys: software for executing and integrating analyses of biological sequences

    Directory of Open Access Journals (Sweden)

    Lett Drew

    2004-04-01

    Full Text Available Abstract Background: We present Pegasys, a flexible, modular and customizable software system that facilitates the execution and data integration of heterogeneous biological sequence analysis tools. Results: The Pegasys system includes numerous tools for pair-wise and multiple sequence alignment, ab initio gene prediction, RNA gene detection, masking repetitive sequences in genomic DNA, as well as filters for database formatting and processing raw output from various analysis tools. We introduce a novel data structure for creating workflows of sequence analyses and a unified data model to store its results. The software allows users to dynamically create analysis workflows at run-time by manipulating a graphical user interface. All non-serially-dependent analyses are executed in parallel on a compute cluster for efficiency of data generation. The uniform data model and backend relational database management system of Pegasys allow results of the heterogeneous programs included in the workflow to be integrated and exported into General Feature Format for further analyses in GFF-dependent tools, or GAME XML for import into the Apollo genome editor. The modularity of the design allows new tools to be added to the system with little programmer overhead. The database application programming interface allows programmatic access to the data stored in the backend through SQL queries. Conclusions: The Pegasys system enables biologists and bioinformaticians to create and manage sequence analysis workflows. The software is released under the Open Source GNU General Public License. All source code and documentation is available for download at http://bioinformatics.ubc.ca/pegasys/.
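
    A minimal Python sketch of the workflow idea described above: analyses form a dependency graph, non-serially-dependent steps are eligible to run in parallel, and every result is normalised to one record shape for later GFF export. The step names and the run() body are placeholders; Pegasys itself is a full client-server system, not this toy:

      from graphlib import TopologicalSorter   # Python 3.9+

      # Each analysis step lists the steps whose outputs it depends on.
      workflow = {
          "repeat_masking": [],
          "gene_prediction": ["repeat_masking"],
          "blast_search": ["repeat_masking"],
          "merge_to_gff": ["gene_prediction", "blast_search"],
      }

      def run(step):
          # Placeholder for invoking the real tool and normalising its
          # output into a uniform feature record (e.g. for GFF export).
          print(f"running {step}")
          return {"step": step, "features": []}

      results = {}
      for step in TopologicalSorter(workflow).static_order():
          results[step] = run(step)   # gene_prediction and blast_search are
                                      # independent and could run in parallel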

  3. TimeBench: a data model and software library for visual analytics of time-oriented data.

    Science.gov (United States)

    Rind, Alexander; Lammarsch, Tim; Aigner, Wolfgang; Alsallakh, Bilal; Miksch, Silvia

    2013-12-01

    Time-oriented data play an essential role in many Visual Analytics scenarios such as extracting medical insights from collections of electronic health records or identifying emerging problems and vulnerabilities in network traffic. However, many software libraries for Visual Analytics treat time as a flat numerical data type and insufficiently tackle the complexity of the time domain such as calendar granularities and intervals. Therefore, developers of advanced Visual Analytics designs need to implement temporal foundations in their application code over and over again. We present TimeBench, a software library that provides foundational data structures and algorithms for time-oriented data in Visual Analytics. Its expressiveness and developer accessibility have been evaluated through application examples demonstrating a variety of challenges with time-oriented data and long-term developer studies conducted in the scope of research and student projects.
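
    The record's central claim, that flat numeric time misses calendar granularities, is easy to demonstrate. The pandas sketch below (an illustration in Python, not TimeBench's Java API) groups the same four timestamps under two granularities and gets two different pictures:

      import pandas as pd

      events = pd.Series(1, index=pd.to_datetime([
          "2013-01-31 23:50", "2013-02-01 00:10",
          "2013-02-15 12:00", "2013-03-01 08:00",
      ]))

      # Day granularity: four separate days (empty days show up as 0).
      print(events.resample("D").sum())
      # Month granularity: the first two events land in different months
      # even though they are only 20 minutes apart.
      print(events.resample("MS").sum())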

  4. Intercomparison of derived integral data from evaluated data libraries of the actinides

    International Nuclear Information System (INIS)

    Paviotti Corcuera, R.

    1988-12-01

    Resonance integrals and fission spectrum averaged cross-sections are calculated for the actinides from all recent major evaluated libraries. Whenever possible the results are compared against measurements. It is found that the experimental data are scarce and that there exist considerable differences between experimentally measured data and those derived from the evaluated libraries. (author). 93 refs and tabs
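
    For reference, the two derived quantities compared in this report have standard definitions; the cadmium cutoff E_c (about 0.5 eV) and the fission spectrum \chi(E) below are the conventional choices, stated as an aid to the reader rather than taken from the record itself:

      % Dilute resonance integral above the cutoff E_c, and
      % fission-spectrum-averaged cross-section with spectrum \chi(E):
      I_0 = \int_{E_c}^{\infty} \sigma(E)\,\frac{\mathrm{d}E}{E},
      \qquad
      \bar{\sigma}_{\chi} =
        \frac{\int_{0}^{\infty} \sigma(E)\,\chi(E)\,\mathrm{d}E}
             {\int_{0}^{\infty} \chi(E)\,\mathrm{d}E}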

  5. The Integrated Library System of the 1990s: The OhioLINK Experience.

    Science.gov (United States)

    Hawks, Carol Pitts

    1992-01-01

    Discussion of integrated library systems focuses on the development of the Ohio Library and Information Network (OhioLINK). Capabilities of eight existing systems are described, including catalog creation and maintenance; the online public access catalog (OPAC); circulation, interlibrary loan, and document delivery; acquisitions and serials…

  6. Planning an Integrated On-Line Library system (IOLS)

    Science.gov (United States)

    1989-03-01

    [The record's abstract is OCR residue from the report's list of figures and vendor directory; little prose survives. Recoverable items include the figure captions "Logical Workflow for Circulation of Library Materials" and "Detail of Circulation of Library Materials", and vendor directory entries such as Data Research Associates, Inc. (ATLAS), Digital Equipment Corp., Eyring Library Systems (CARL), and Tandem Systems.]

  7. Library Operations Policies and Procedures, Volume 2. Central Archive for Reusable Defense Software (CARDS)

    Science.gov (United States)

    1994-02-28

    use and customize those policies and procedures applicable to the implementor's situation. It is not the intent of this manual to restrict the library... improvements. Release Manager: The Release Manager provides franchisees with media copies of existing libraries, as needed. Security... implementors, and potential library franchisees. Security Team: The Security Team assists the Security Officer with security analysis. Team members are

  8. Framework programmable platform for the advanced software development workstation. Integration mechanism design document

    Science.gov (United States)

    Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Reddy, Uday; Ackley, Keith; Futrell, Mike

    1991-01-01

    The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by this model, the system development framework will take advantage of an integrated operating environment to effectively automate the management of the software development process, so that costly mistakes during the development phase can be eliminated.

  9. GeoDeepDive: Towards a Machine Reading-Ready Digital Library and Information Integration Resource

    Science.gov (United States)

    Husson, J. M.; Peters, S. E.; Livny, M.; Ross, I.

    2015-12-01

    Recent developments in machine reading and learning approaches to text and data mining hold considerable promise for accelerating the pace and quality of literature-based data synthesis, but these advances have outpaced even basic levels of access to the published literature. For many geoscience domains, particularly those based on physical samples and field-based descriptions, this limitation is significant. Here we describe a general infrastructure to support published-literature-based machine reading and learning approaches to information integration and knowledge base creation. This infrastructure supports rate-controlled automated fetching of original documents, along with full bibliographic citation metadata, from remote servers; the secure storage of original documents; and the utilization of considerable high-throughput computing resources for the pre-processing of these documents by optical character recognition, natural language parsing, and other document annotation and parsing software tools. New tools and versions of existing tools can be automatically deployed against original documents when they are made available. The products of these tools (text/XML files) are managed by MongoDB and are available for use in data extraction applications. Basic search and discovery functionality is provided by ElasticSearch, which is used to identify documents of potential relevance to a given data extraction task. Relevant files derived from the original documents are then combined into basic starting points for application building; these starting points are kept up-to-date as new relevant documents are incorporated into the digital library. Currently, our digital library contains more than 360K documents supplied by Elsevier and the USGS, and we are actively seeking additional content providers. By focusing on building a dependable infrastructure to support the retrieval, storage, and pre-processing of published content, we are establishing a foundation for
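
    A hedged sketch of the "search and discovery" step described above, using the Python Elasticsearch client to pull candidate documents for a data-extraction task; the endpoint, index name and field names are invented, not GeoDeepDive's actual schema, and a running server is assumed:

      # Requires the elasticsearch-py client (8.x-style keyword arguments).
      from elasticsearch import Elasticsearch

      es = Elasticsearch("http://localhost:9200")   # hypothetical endpoint
      hits = es.search(
          index="documents",                        # hypothetical index name
          query={"match": {"text": "stromatolite carbonate"}},
          size=10,
      )
      # Print identifiers and titles of candidate documents for extraction.
      for h in hits["hits"]["hits"]:
          print(h["_id"], h["_source"].get("title"))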

  10. Re-engineering software systems in the Department of Defense using integrated computer aided software engineering tools

    OpenAIRE

    Jennings, Charles A.

    1992-01-01

    Approved for public release; distribution is unlimited. The Department of Defense (DoD) is plagued with severe cost overruns and delays in developing software systems. Existing software within the DoD, some of it developed 15 to 20 years ago, requires continual maintenance and modification. Major difficulties arise with maintaining older systems due to cryptic source code and a lack of adequate documentation. To remedy this situation, the DoD is pursuing the integrated computer aided software engi...

  11. Software features and applications in process design, integration and operation

    Energy Technology Data Exchange (ETDEWEB)

    Dhole, V. [Aspen Tech Limited, Warrington (United Kingdom)

    1999-02-01

    Process engineering technologies and tools have evolved rapidly over the last twenty years. Process simulation/modeling, advanced process control, on-line optimisation, production planning and supply chain management are some examples of technologies that have rapidly matured from early commercial prototypes and concepts to established tools with significant impact on the profitability of the process industry today. Process Synthesis or Process Integration (PI), in comparison, has yet to create its impact and still remains largely the domain of a few expert users. One of the key reasons why PI has not taken off is that PI tools have not become integral components of standard process engineering environments. Over the last 15 years AspenTech has grown from a small process simulation tool provider to a large multinational company providing a complete suite of process engineering technologies and services covering process design, operation, planning and supply chain management. Throughout this period, AspenTech has acquired experience in rapidly evolving technologies from their early prototype stage to mature products and services. The paper outlines AspenTech's strategy of integrating PI with other more established process design and operational improvement technologies. The paper illustrates the key elements of AspenTech's strategy via examples of software development initiatives and services projects. The paper also outlines AspenTech's future vision of the role of PI in process engineering. (au)

  12. Software for the computerized neutron data library of the SOKRATOR system

    International Nuclear Information System (INIS)

    Kolesov, V.E.; Krivtsov, A.S.; Solov'ev, N.A.

    1976-01-01

    When preparing data for nuclear reactor and shield computations using the evaluated nuclear data library, it is necessary to have a set of special service programs to maintain the library itself. In this paper the structure of this set is discussed and a brief description of some of the programs is presented

  13. Experiences Using an Open Source Software Library to Teach Computer Vision Subjects

    Science.gov (United States)

    Cazorla, Miguel; Viejo, Diego

    2015-01-01

    Machine vision is an important subject in computer science and engineering degrees. For laboratory experimentation, it is desirable to have a complete and easy-to-use tool. In this work we present a Java library, oriented to teaching computer vision. We have designed and built the library from scratch with emphasis on readability and…

  14. Computer software design description for the integrated control and data acquisition system LDUA system

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1998-01-01

    This Computer Software Design Description (CSDD) document provides the overview of the software design for all the software that is part of the integrated control and data acquisition system of the Light Duty Utility Arm System (LDUA). It describes the major software components and how they interface. It also references the documents that contain the detailed design description of the components

  15. Propulsion/flight control integration technology (PROFIT) software system definition

    Science.gov (United States)

    Carlin, C. M.; Hastings, W. J.

    1978-01-01

    The Propulsion Flight Control Integration Technology (PROFIT) program is designed to develop a flying testbed dedicated to controls research. The control software for PROFIT is defined. Maximum flexibility, needed for long term use of the flight facility, is achieved through a modular design. The Host program processes inputs from the telemetry uplink, aircraft central computer, cockpit computer control and plant sensors to form an input data base for use by the control algorithms. The control algorithms, programmed as application modules, process the input data to generate an output data base. The Host program formats the data for output to the telemetry downlink, the cockpit computer control, and the control effectors. Two application modules are defined - the bill of materials F-100 engine control and the bill of materials F-15 inlet control.

  16. Integrated software system for improving medical equipment management.

    Science.gov (United States)

    Bliznakov, Z; Pappous, G; Bliznakova, K; Pallikarakis, N

    2003-01-01

    The evolution of biomedical technology has led to an extraordinary use of medical devices in health care delivery. During the last decade, clinical engineering departments (CEDs) turned toward computerization and application of specific software systems for medical equipment management in order to improve their services and monitor outcomes. Recently, much emphasis has been given to patient safety. Through its Medical Device Directives, the European Union has required all member nations to use a vigilance system to prevent the reoccurrence of adverse events that could lead to injuries or death of patients or personnel as a result of equipment malfunction or improper use. The World Health Organization also has made this issue a high priority and has prepared a number of actions and recommendations. In the present work, a new integrated, Windows-oriented system is proposed, addressing all tasks of CEDs but also offering a global approach to their management needs, including vigilance. The system architecture is based on a star model, consisting of a central core module and peripheral units. Its development has been based on the integration of 3 software modules, each one addressing specific predefined tasks. The main features of this system include equipment acquisition and replacement management, inventory archiving and monitoring, follow up on scheduled maintenance, corrective maintenance, user training, data analysis, and reports. It also incorporates vigilance monitoring and information exchange for adverse events, together with a specific application for quality-control procedures. The system offers clinical engineers the ability to monitor and evaluate the quality and cost-effectiveness of the service provided by means of quality and cost indicators. Particular emphasis has been placed on the use of harmonized standards with regard to medical device nomenclature and classification. The system's practical applications have been demonstrated through a pilot…

  17. User Manual for XnWlup2.0, A Software to Visualize Nuclear Data for Thermal Reactors in WIMS-D Libraries

    International Nuclear Information System (INIS)

    Thiyagarajan, T.K.; Ganesan, S.; Jagannathan, V.; Karthikeyan, R.

    2002-10-01

    A project to prepare an exhaustive handbook of WIMS-D cross sections for thermal reactor applications, comparing different WIMS-D compatible nuclear data libraries originating from various countries, has been successfully implemented. Computer software called XnWlup2.0, with a graphical user interface for MS Windows, has been developed at BARC. This report summarizes the salient features of this new software for the users of WIMS-D libraries. Several sample outputs produced by the software are presented to illustrate its power for routine use in reactor physics analyses. (author)

  18. Beyond Open Source Software: Solving Common Library Problems Using the Open Source Hardware Arduino Platform

    Directory of Open Access Journals (Sweden)

    Jonathan Younker

    2013-06-01

    Using open source hardware platforms like the Arduino, libraries have the ability to quickly and inexpensively prototype custom hardware solutions to common library problems. The authors present the Arduino environment, what it is, what it does, and how it was used at the James A. Gibson Library at Brock University to create a production portable barcode-scanning utility for in-house use statistics collection as well as a prototype for a service desk statistics tabulation program’s hardware interface.
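
    A host-side counterpart to such a barcode-scanning utility might be as simple as the following sketch, which logs timestamped scans arriving over a serial port. The port name, baud rate, and one-barcode-per-line protocol are assumptions, and the pyserial package is required; this is not the authors' actual code.

        import csv
        import datetime
        import serial  # pyserial

        # port name and baud rate are placeholders for the actual device settings
        with serial.Serial("/dev/ttyACM0", 9600, timeout=5) as port, \
                open("scans.csv", "a", newline="") as log:
            writer = csv.writer(log)
            while True:
                barcode = port.readline().decode("ascii", errors="ignore").strip()
                if barcode:  # assumed: one scanned barcode per line
                    writer.writerow([datetime.datetime.now().isoformat(), barcode])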

  19. An Integrated Customer Knowledge Management Framework for Academic Libraries

    Science.gov (United States)

    Daneshgar, Farhad; Parirokh, Mehri

    2012-01-01

    The ability of academic libraries to produce timely and effective responses to various environmental changes constitutes a major challenge for them to enhance their survival rate and maintain growth in competitive environments. This article provides a conceptual model as an analytical tool for both improving current services and creating…

  20. Integrating Digital Humanities into the Library and Information Science Curriculum

    Science.gov (United States)

    Moazeni, Sarah Leila

    2015-01-01

    Digital Humanities (DH) is a hot topic, in demand and on the rise. This article begins with excerpts from job listings posted to the American Library Association's job list over a two-month span in spring 2015, which seem to indicate that DH is an increasingly important competency and interest for academic librarians who perform…

  1. User's guide to the CALVEC software library: a computer program for emulation of CALCOMP graphics on a Versatec printer/plotter

    International Nuclear Information System (INIS)

    Gray, W.H.

    1978-08-01

    This document describes a set of FORTRAN subroutines collectively called the CALVEC subprogram library. The purpose of the CALVEC software library is the emulation of CALCOMP pen and ink graphics on a DECsystem 10. A user level interface with CALVEC software allows standard CALCOMP subprogram calls to produce a VECtor file, SEGMNT.VEC. This vector file may subsequently be postprocessed into an image in a variety of ways

  2. User's guide to the CALVEC software library: a computer program for emulation of CALCOMP graphics on a versatec printer/plotter

    International Nuclear Information System (INIS)

    Gray, W.H.

    1979-03-01

    This document describes a set of FORTRAN subroutines collectively called the CALVEC subprogram library. The purpose of the CALVEC software library is the emulation of CALCOMP pen and ink graphics on a DECsystem 10. A user level interface with CALVEC software allows standard CALCOMP subprogram calls to produce a VECtor file, FOR24.VEC. This vector file may subsequently be postprocessed into an image in a variety of ways

  3. Software Testbed for Developing and Evaluating Integrated Autonomous Subsystems

    Science.gov (United States)

    Ong, James; Remolina, Emilio; Prompt, Axel; Robinson, Peter; Sweet, Adam; Nishikawa, David

    2015-01-01

    To implement fault tolerant autonomy in future space systems, it will be necessary to integrate planning, adaptive control, and state estimation subsystems. However, integrating these subsystems is difficult, time-consuming, and error-prone. This paper describes Intelliface/ADAPT, a software testbed that helps researchers develop and test alternative strategies for integrating planning, execution, and diagnosis subsystems more quickly and easily. The testbed's architecture, graphical data displays, and implementations of the integrated subsystems support easy plug and play of alternate components to support research and development in fault-tolerant control of autonomous vehicles and operations support systems. Intelliface/ADAPT controls NASA's Advanced Diagnostics and Prognostics Testbed (ADAPT), which comprises batteries, electrical loads (fans, pumps, and lights), relays, circuit breakers, inverters, and sensors. During plan execution, an experimenter can inject faults into the ADAPT testbed by tripping circuit breakers, changing fan speed settings, and closing valves to restrict fluid flow. The diagnostic subsystem, based on NASA's Hybrid Diagnosis Engine (HyDE), detects and isolates these faults to determine the new state of the plant, ADAPT. Intelliface/ADAPT then updates its model of the ADAPT system's resources and determines whether the current plan can be executed using the reduced resources. If not, the planning subsystem generates a new plan that reschedules tasks, reconfigures ADAPT, and reassigns the use of ADAPT resources as needed to work around the fault. The resource model, planning domain model, and planning goals are expressed using NASA's Action Notation Modeling Language (ANML). Parts of the ANML model are generated automatically, and other parts are constructed by hand using the Planning Model Integrated Development Environment, a visual Eclipse-based IDE that accelerates ANML model development. Because native ANML planners are currently…

  4. Dietary intake assessment using integrated sensors and software

    Science.gov (United States)

    Shang, Junqing; Pepin, Eric; Johnson, Eric; Hazel, David; Teredesai, Ankur; Kristal, Alan; Mamishev, Alexander

    2012-02-01

    The area of dietary assessment is becoming increasingly important as obesity rates soar, but valid measurement of the food intake in free-living persons is extraordinarily challenging. Traditional paper-based dietary assessment methods have limitations due to bias, user burden and cost, and therefore improved methods are needed to address important hypotheses related to diet and health. In this paper, we will describe the progress of our mobile Diet Data Recorder System (DDRS), where an electronic device is used for objective measurement of dietary intake in real time and at moderate cost. The DDRS consists of (1) a mobile device that integrates a smartphone and an integrated laser package, (2) software on the smartphone for data collection and laser control, (3) an algorithm to process acquired data for food volume estimation, which is the largest source of error in calculating dietary intake, and (4) database and interface for data storage and management. The estimated food volume, together with direct entries of food questionnaires and voice recordings, could provide dietitians and nutritional epidemiologists with more complete food description and more accurate food portion sizes. In this paper, we will describe the system design of DDRS and initial results of dietary assessment.

  5. Virtual reality devices integration in scientific visualization software in the VtkVRPN framework

    International Nuclear Information System (INIS)

    Journe, G.; Guilbaud, C.

    2005-01-01

    High-quality scientific visualization software relies on ergonomic navigation and exploration, which are essential for performing efficient data analysis. To help solve this issue, management of virtual reality devices has been developed inside the CEA 'VtkVRPN' framework. This framework is based on VTK, a 3D graphical library, and VRPN, a virtual reality devices management library. This document describes the developments done during a post-graduate training course. (authors)
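
    For readers unfamiliar with VTK, the minimal pipeline below (using VTK's Python bindings) shows the kind of rendering infrastructure such a framework builds on; it is a generic VTK example, not code from VtkVRPN itself.

        import vtk

        cone = vtk.vtkConeSource()                 # a trivial data source
        mapper = vtk.vtkPolyDataMapper()
        mapper.SetInputConnection(cone.GetOutputPort())
        actor = vtk.vtkActor()
        actor.SetMapper(mapper)

        renderer = vtk.vtkRenderer()
        renderer.AddActor(actor)
        window = vtk.vtkRenderWindow()
        window.AddRenderer(renderer)
        interactor = vtk.vtkRenderWindowInteractor()
        interactor.SetRenderWindow(window)         # navigation devices hook in at this level

        window.Render()
        interactor.Start()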

  6. The Digital Library for Earth System Education: A Community Integrator

    Science.gov (United States)

    Marlino, M. R.; Pandya, R. E.

    2003-12-01

    The rapid changes in the geoscience research environment have prompted educators to request support for their efforts to reform geoscience educational practices. DLESE, the Digital Library for Earth System Education, responds to this request by providing a single point of access to high-quality educational resources for teaching about the Earth as a system. DLESE is supported by the National Science Foundation and is an operational library used by tens of thousands of educators every month. DLESE resources include a variety of media formats, from text-based lesson plans to highly-sophisticated tools for interactive three-dimensional visualization of authentic scientific data. The DLESE community is particularly interested in partnering with scientific researchers to ensure that the tools of practicing scientists become widely available to geoscience educators. Two emerging large-scale scientific efforts, the GEON project and EarthScope, provide compelling illustrations of the potential of these partnerships. Both are cutting-edge, cross-disciplinary projects that use digital tools in a distributed environment to support scientific investigation. Both have also made a deep commitment to use these same tools to support geoscience education, and both are including DLESE as part of that commitment. Our interactive presentation will allow users to discover a variety of educational resources and communication services within the library. We will highlight those library resources and services that take particular advantage of the digital media to support new modes of learning and teaching. For example, annotation tools allow educators to add tips on the most effective way to use a specific resource. Data services will help educators find and use real-time data to illustrate geoscience phenomena. Multi-dimensional visualization tools allow students to interact with authentic scientific data in an inquiry-based learning environment. DLESE will continue to actively collaborate…

  7. Outcomes from the First Wingman Software in the Loop Integration Event: January 2017

    Science.gov (United States)

    2017-06-28

    ARL-TN-0830, June 2017, US Army Research Laboratory. Outcomes from the First Wingman Software-in-the-Loop Integration Event: January 2017. Dates covered: January 2017–September 2017.

  8. Library

    OpenAIRE

    Dulaney, Ronald E. Jr.

    1997-01-01

    This study began with the desire to design a public town library of the future and became a search for an inkling of what is essential to Architecture. It is murky and full of contradictions. It asks more than it proposes, and the traces of its windings are better ordered through collage than logical synthesis. This study is neither a thesis nor a synthesis. When drawing out the measure of this study it may be beneficial to state what it attempts to place at the ...

  9. Libraries as a means for the integration of immigrant populations in Ávila province

    Directory of Open Access Journals (Sweden)

    María Jesús Romera Iruela

    2014-11-01

    Public libraries can be considered suitable agents and places for the integration of the immigrant population. Therefore, services and programs should be developed in order to give an adequate answer to their needs. This article provides research findings on the information needs of immigrants in Ávila province. These needs are detected from three different perspectives: public libraries, educational centers, and immigrant associations. The data were collected, using ad-hoc questionnaires, in the districts and municipalities with the largest immigrant presence. Several intercultural actions directed at the integration of this population are suggested by this research, amongst others a Web site with thematic information and the design of cooperative programs on intercultural education between public libraries and school libraries, involving both parents of immigrant students and immigrant associations

  10. The use of software agents and distributed objects to integrate enterprises: Compatible or competing technologies?

    Energy Technology Data Exchange (ETDEWEB)

    Pancerella, C.M.

    1998-04-01

    Distributed object and software agent technologies are two integration methods for connecting enterprises. The two technologies have overlapping goals--interoperability and architectural support for integrating software components--though to date little or no integration of the two technologies has been made at the enterprise level. The primary difference between these two technologies is that distributed object technologies focus on the problems inherent in connecting distributed heterogeneous systems whereas software agent technologies focus on the problems involved with coordination and knowledge exchange across domain boundaries. This paper addresses the integration of these technologies in support of enterprise integration across organizational and geographic boundaries. The authors discuss enterprise integration issues, review their experiences with both technologies, and make recommendations for future work. Neither technology is a panacea. Good software engineering techniques must be applied to integrate an enterprise because scalability and a distributed software development team are realities.

  11. Data Portal for the Library of Integrated Network-based Cellular Signatures (LINCS) program: integrated access to diverse large-scale cellular perturbation response data

    Science.gov (United States)

    Koleti, Amar; Terryn, Raymond; Stathias, Vasileios; Chung, Caty; Cooper, Daniel J; Turner, John P; Vidović, Dušica; Forlin, Michele; Kelley, Tanya T; D’Urso, Alessandro; Allen, Bryce K; Torre, Denis; Jagodnik, Kathleen M; Wang, Lily; Jenkins, Sherry L; Mader, Christopher; Niu, Wen; Fazel, Mehdi; Mahi, Naim; Pilarczyk, Marcin; Clark, Nicholas; Shamsaei, Behrouz; Meller, Jarek; Vasiliauskas, Juozas; Reichard, John; Medvedovic, Mario; Ma’ayan, Avi; Pillai, Ajay

    2018-01-01

    The Library of Integrated Network-based Cellular Signatures (LINCS) program is a national consortium funded by the NIH to generate a diverse and extensive reference library of cell-based perturbation-response signatures, along with novel data analytics tools to improve our understanding of human diseases at the systems level. In contrast to other large-scale data generation efforts, LINCS Data and Signature Generation Centers (DSGCs) employ a wide range of assay technologies cataloging diverse cellular responses. Integration of, and unified access to LINCS data has therefore been particularly challenging. The Big Data to Knowledge (BD2K) LINCS Data Coordination and Integration Center (DCIC) has developed data standards specifications, data processing pipelines, and a suite of end-user software tools to integrate and annotate LINCS-generated data, to make LINCS signatures searchable and usable for different types of users. Here, we describe the LINCS Data Portal (LDP) (http://lincsportal.ccs.miami.edu/), a unified web interface to access datasets generated by the LINCS DSGCs, and its underlying database, LINCS Data Registry (LDR). LINCS data served on the LDP contains extensive metadata and curated annotations. We highlight the features of the LDP user interface that is designed to enable search, browsing, exploration, download and analysis of LINCS data and related curated content. PMID:29140462
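
    Programmatic access to a portal like the LDP would typically look like the hedged sketch below. The endpoint path, query parameters, and response fields are hypothetical placeholders; only the portal base URL comes from the abstract.

        import requests

        BASE = "http://lincsportal.ccs.miami.edu"              # from the abstract
        resp = requests.get(BASE + "/api/datasets",            # hypothetical endpoint
                            params={"query": "kinase", "limit": 5},  # hypothetical parameters
                            timeout=30)
        resp.raise_for_status()
        for dataset in resp.json().get("results", []):         # hypothetical response field
            print(dataset.get("id"), dataset.get("title"))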

  12. CCLab--a multi-objective genetic algorithm based combinatorial library design software and an application for histone deacetylase inhibitor design.

    Science.gov (United States)

    Fang, Guanghua; Xue, Mengzhu; Su, Mingbo; Hu, Dingyu; Li, Yanlian; Xiong, Bing; Ma, Lanping; Meng, Tao; Chen, Yuelei; Li, Jingya; Li, Jia; Shen, Jingkang

    2012-07-15

    The introduction of multi-objective optimization has dramatically changed virtual combinatorial library design, which can now consider many objectives simultaneously, such as synthesis cost and drug-likeness, and thus may increase the positive rates of biologically active compounds. Here we describe a software package called CCLab (Combinatorial Chemistry Laboratory) for combinatorial library design based on the multi-objective genetic algorithm. Tests of the convergence ability and of the ratio of re-taken building blocks from the reference library were conducted to assess the software in silico, and then it was applied to a real case of designing a 5×6 HDAC inhibitor library. Sixteen compounds in the resulting library were synthesized, and the histone deacetylase (HDAC) enzymatic assays proved that 14 compounds showed inhibitory ratios of more than 50% against the 3 tested HDAC enzymes at a concentration of 20 μg/mL, with IC(50) values of 3 compounds comparable to SAHA. These results demonstrated that the CCLab software could enhance the hit rates of the designed library and would be beneficial for medicinal chemists in designing focused libraries in drug development (the software can be downloaded at: http://202.127.30.184:8080/drugdesign.html). Copyright © 2012 Elsevier Ltd. All rights reserved.
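
    The multi-objective selection idea at the core of such software can be illustrated with a small Pareto-dominance sketch. The objectives and scores below are invented placeholders, not CCLab's actual scoring functions.

        from typing import Dict, List, Sequence

        def dominates(a: Sequence[float], b: Sequence[float]) -> bool:
            # a dominates b if it is at least as good on every objective
            # (larger is better) and strictly better on at least one
            return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

        def pareto_front(scored: Dict[str, Sequence[float]]) -> List[str]:
            # keep the candidate libraries no other candidate dominates
            return [name for name, s in scored.items()
                    if not any(dominates(t, s) for other, t in scored.items() if other != name)]

        # hypothetical (activity, drug-likeness, negated cost) scores for three libraries
        candidates = {"lib_A": (0.8, 0.6, -0.3), "lib_B": (0.7, 0.9, -0.2), "lib_C": (0.6, 0.5, -0.9)}
        print(pareto_front(candidates))  # ['lib_A', 'lib_B']; lib_C is dominated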

  13. The Software Architecture for Performing Scientific Computation with the JLAPACK Libraries in ScalaLab

    Directory of Open Access Journals (Sweden)

    Stergios Papadimitriou

    2012-01-01

    Although LAPACK is a powerful library, its use is difficult. JLAPACK, a Java translation obtained automatically from the Fortran LAPACK sources, retains exactly the same difficult-to-use interface as the LAPACK routines. The MTJ library implements an object-oriented Java interface to JLAPACK that hides many complicated details. ScalaLab exploits the flexibility of the Scala language to present an even more friendly and convenient interface to the powerful but complicated JLAPACK library. The article describes the interfacing of the low-level JLAPACK routines within the ScalaLab environment. This is performed rather easily by exploiting well-suited features of the Scala language. The paper also demonstrates the convenience of using JLAPACK routines for linear algebra operations from within ScalaLab.

  14. High-integrity software, computation and the scientific method

    International Nuclear Information System (INIS)

    Hatton, L.

    2012-01-01

    Computation rightly occupies a central role in modern science. Datasets are enormous and the processing implications of some algorithms are equally staggering. With the continuing difficulties in quantifying the results of complex computations, it is of increasing importance to understand the role of computation in the essentially Popperian scientific method. In this paper, some of the problems with computation, for example the long-term unquantifiable presence of undiscovered defects, problems with programming languages, and process issues, will be explored with numerous examples. One of the aims of the paper is to understand the implications of trying to produce high-integrity software and the limitations which still exist. Unfortunately, Computer Science itself suffers from an inability to be suitably critical of its practices and has operated in a largely measurement-free vacuum since its earliest days. Within computer science itself this has not been so damaging, in that it simply leads to unconstrained creativity and a rapid turnover of new technologies. In the applied sciences, however, which have to depend on computational results, such unquantifiability significantly undermines trust. It is time this particular demon was put to rest. (author)

  15. Integrated Software Environment for Pressurized Thermal Shock Analysis

    Directory of Open Access Journals (Sweden)

    Dino Araneo

    2011-01-01

    The present paper describes the main features, and an application to a real Nuclear Power Plant (NPP), of an Integrated Software Environment (in the following referred to as “platform”) developed at the University of Pisa (UNIPI) to perform Pressurized Thermal Shock (PTS) analysis. The platform is written in Java for portability and it implements all the steps foreseen in the methodology developed at UNIPI for the deterministic analysis of PTS scenarios. The methodology starts with the thermal hydraulic analysis of the NPP with a system code (such as Relap5-3D and Cathare2) during a selected transient scenario. The results so obtained are then processed to provide boundary conditions for the next step, that is, a CFD calculation. Once the system pressure and the RPV wall temperature are known, the stresses inside the RPV wall can be calculated by means of a Finite Element (FE) code. The last step of the methodology is the Fracture Mechanics (FM) analysis, using weight functions, aimed at evaluating the stress intensity factor (KI) at the crack tip, to be compared with the critical stress intensity factor KIc. The platform automates all these steps foreseen in the methodology once the user specifies a number of boundary conditions at the beginning of the simulation.

  16. Endnote Referencing Software: Importing references from an Ebsco database, attaching full text, organising your Endnote library

    OpenAIRE

    Turner, Susan

    2017-01-01

    This video demonstrates importing bibliographic references from EBSCO Discovery Service; the same method can be used for all EBSCO databases. The video also demonstrates how to attach full-text files to the references and how to organise your references within the EndNote library using groups.

  17. The Value of Social Software in School Library Instruction, Communication, and Collaboration

    Science.gov (United States)

    Summers, Laura L.

    2009-01-01

    As budget cuts loom in school districts across the nation, school librarians are expected to show artifacts and share data to cement their credibility as instructional leaders, since according to Zmuda (2007) and many others, the effectiveness of the school library media program must be measured by what students learn as a result of their…

  18. Why Do Staff of Joint-Use Libraries Sometimes Fail to Integrate? Investigating Cultures and Ethics in a Public-Tertiary Joint-Use Library

    Science.gov (United States)

    Calvert, Philip James

    2010-01-01

    Joint-use libraries have identified staff integration as a problem. Using focus groups, this project investigated the culture, professional ethics, and attitudes of staff in a public-tertiary joint-use library in Auckland, New Zealand. Findings show some difference in organizational cultures, but more variation at the lower level of roles and…

  19. Adding Cross-Platform Support to a High-Throughput Software Stack and Exploration of Vectorization Libraries

    CERN Document Server

    AUTHOR|(CDS)2258962

    This master's thesis was written at the LHCb experiment at CERN. It is part of the initiative for improving software in view of the upcoming upgrade in 2021, which will significantly increase the amount of acquired data. This thesis consists of two parts. The first part is about the exploration of different vectorization libraries and their usefulness for the LHCb collaboration. The second part is about adding cross-platform support to the LHCb software stack. Here, the LHCb stack is successfully ported to ARM (aarch64) and its performance is analyzed. At the end of the thesis, the port to PowerPC (ppc64le) awaits the performance analysis. The main goal of porting the stack is the cost-performance evaluation for the different platforms, to get the most cost-efficient hardware for the new server farm for the upgrade. For this, selected vectorization libraries are extended to support the PowerPC and ARM platforms. And though the same compiler is used, platform-specific changes to the compilation flags are required. In...

  20. How to Use Open Source Software to Manage a Library System

    OpenAIRE

    Sumithchandra, Pandula

    2009-01-01

    Open source is an approach to the design, development, and distribution of software, offering practical accessibility to a software's source code. Some consider open source as one of various possible design approaches, while others consider it a critical strategic element of their operations. Before open source became widely adopted, developers and producers used a variety of phrases to describe the concept; the term open source gained popularity with the rise of the Internet, which provided ...

  1. Experiences with Integrating Simulation into a Software Engineering Curriculum

    Science.gov (United States)

    Bollin, Andreas; Hochmuller, Elke; Mittermeir, Roland; Samuelis, Ladislav

    2012-01-01

    Software Engineering education must account for a broad spectrum of knowledge and skills software engineers will be required to apply throughout their professional life. Covering all the topics in depth within a university setting is infeasible due to curricular constraints as well as the inherent differences between educational…

  2. Integrated analysis software for bulk power system stability

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, T; Nagao, T; Takahashi, K [Central Research Inst. of Electric Power Industry, Tokyo (Japan)]

    1994-12-31

    This paper presents three software packages developed by the Central Research Institute of Electric Power Industry (CRIEPI) for bulk power network analysis, together with the user support system that arranges the large volumes of data necessary for these packages easily and with high reliability. (author) 3 refs., 7 figs., 2 tabs.

  3. Direct Integration: Training Software Developers to Conduct Usability Evaluations

    DEFF Research Database (Denmark)

    Skov, Mikael B.; Stage, Jan

    2008-01-01

    Many improvements of the interplay between usability evaluation and software development rely either on better methods for conducting usability evaluations or on better formats for presenting evaluation results in ways that are useful for software designers and developers. Both approaches involve... a complete division of work between developers and evaluators, which is an undesirable complexity for many software development projects. This paper takes a different approach by exploring to what extent software developers and designers can be trained to carry out their own usability evaluations. The paper... is based on an empirical study where 36 teams with a total of 234 first-year university students on software development and design educations were trained in a simple approach for user-based website usability testing that was taught in a 40-hour course. This approach supported them in planning, conducting...

  4. Avaliação comparativa do software Pergamum entre usuários de uma biblioteca pública e de uma biblioteca universitária / Comparative evaluation of Pergamum software between users of a public and a university library

    Directory of Open Access Journals (Sweden)

    Josué Sales Barbosa

    2012-07-01

    Presents a comparative evaluation of the library management software used by two public institutions, one a university library and the other the public library of the state of Minas Gerais. To this end, a user study was carried out using questionnaires in two distinct stages of the research. The discussion draws on the approach models of Brenda Dervin and Carol Kuhlthau. Keywords: Study of users. University library. Public library. Software evaluation.

  5. Method for critical software event execution reliability in high integrity software

    Energy Technology Data Exchange (ETDEWEB)

    Kidd, M.E. [Sandia National Labs., Albuquerque, NM (United States)

    1997-11-01

    This report contains viewgraphs on a method called SEER, which provides a high level of confidence that critical software-driven event execution sequences faithfully execute in the face of transient computer architecture failures, in both normal and abnormal operating environments.

  6. Object-Oriented Technology-Based Software Library for Operations of Water Reclamation Centers

    Science.gov (United States)

    Otani, Tetsuo; Shimada, Takehiro; Yoshida, Norio; Abe, Wataru

    SCADA systems in water reclamation centers have been constructed from hardware and software that each manufacturer produced according to its own design. Although this approach used to be effective in realizing real-time, reliable execution, it is an obstacle to reducing the costs of system construction and maintenance. A promising solution to this problem is to define specifications that can be used in common. In terms of software, the information model approach has been adopted in SCADA systems in other fields, such as telecommunications and power systems. An information model is a piece of software specification that describes a physical or logical object to be monitored. In this paper, we propose information models for the operation of water reclamation centers, which have not existed before. In addition, we show the feasibility of the information models in terms of common use and processing performance.
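
    The notion of an information model, a vendor-neutral software description of a monitored physical object, can be sketched as follows. The attributes and alarm rule are illustrative assumptions, not the authors' proposed models.

        from dataclasses import dataclass, field

        @dataclass
        class PumpModel:
            """Vendor-neutral description of a monitored pump (hypothetical)."""
            name: str
            running: bool = False
            flow_m3_per_h: float = 0.0
            alarms: list = field(default_factory=list)

            def update(self, running: bool, flow: float) -> None:
                self.running = running
                self.flow_m3_per_h = flow
                if running and flow <= 0.0:            # illustrative alarm rule
                    self.alarms.append("dry-run suspected")

        pump = PumpModel("RAS pump 1")
        pump.update(running=True, flow=0.0)
        print(pump.alarms)  # ['dry-run suspected']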

  7. GfaPy: a flexible and extensible software library for handling sequence graphs in Python.

    Science.gov (United States)

    Gonnella, Giorgio; Kurtz, Stefan

    2017-10-01

    GFA 1 and GFA 2 are recently defined formats for representing sequence graphs, such as assembly, variation or splicing graphs. The formats are adopted by several software tools. Here, we present GfaPy, a software package for creating, parsing and editing GFA graphs using the programming language Python. GfaPy supports GFA 1 and GFA 2 using the same interface, and allows for interconversion between both formats. The software package provides a simple interface for custom record types, which is an important new feature of GFA 2 (compared to GFA 1). This enables new applications of the format. GfaPy is available open source at https://github.com/ggonnella/gfapy and installable via pip. Contact: gonnella@zbh.uni-hamburg.de. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
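
    A minimal usage sketch, assuming an interface along the lines of the published package; the exact calls should be checked against the documentation at https://github.com/ggonnella/gfapy.

        import gfapy  # pip install gfapy

        g = gfapy.Gfa(version="gfa1")
        g.add_line("S\tseg1\tACGTACGT")           # segment records
        g.add_line("S\tseg2\tTTGCA")
        g.add_line("L\tseg1\t+\tseg2\t+\t3M")     # a link (overlap) between them

        print(len(g.segments), "segments")        # parsed records become objects
        print(str(g))                             # serialize back to GFA text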

  8. Survey of Integration Cost-Adoption between Digital Library Systems in Iran

    Directory of Open Access Journals (Sweden)

    Mehdi Alipour-Hafezi

    2013-03-01

    The main goal of this article was to identify cost elements in the syntactic integration of digital library systems in Iran, at three levels of integration: content, technical, and organizational. Sub-goals included identifying the current situation of information systems from the viewpoints of data storage, needed standard outputs, and interoperability in Iran, as well as a suitable integration model for Iranian digital libraries. The analytical survey method was used in this research. The research population included 11 digital library systems used in Iranian digital libraries. A researcher-made questionnaire was used to gather information, because no standard data collection tool existed. Findings demonstrated that cost-adoption elements in syntactic interoperability should be sought on three levels: content, technical, and organizational. Findings also showed that organizational-level elements were the most important, and that the high cost of adoption was related to libraries and their parent organizations.

  9. Clinical data miner: an electronic case report form system with integrated data preprocessing and machine-learning libraries supporting clinical diagnostic model research.

    Science.gov (United States)

    Installé, Arnaud Jf; Van den Bosch, Thierry; De Moor, Bart; Timmerman, Dirk

    2014-10-20

    Using machine-learning techniques, clinical diagnostic model research extracts diagnostic models from patient data. Traditionally, patient data are often collected using electronic Case Report Form (eCRF) systems, while mathematical software is used for analyzing these data using machine-learning techniques. Due to the lack of integration between eCRF systems and mathematical software, extracting diagnostic models is a complex, error-prone process. Moreover, due to the complexity of this process, it is usually only performed once, after a predetermined number of data points have been collected, without insight into the predictive performance of the resulting models. The objective of the Clinical Data Miner (CDM) software framework study is to offer an eCRF system with integrated data preprocessing and machine-learning libraries, improving the efficiency of the clinical diagnostic model research workflow, and to enable optimization of patient inclusion numbers through study performance monitoring. The CDM software framework was developed using a test-driven development (TDD) approach, to ensure high software quality. Architecturally, CDM's design is split over a number of modules, to ensure future extendability. The TDD approach has enabled us to deliver high software quality. CDM's eCRF Web interface is in active use by the studies of the International Endometrial Tumor Analysis consortium, with over 4000 enrolled patients, and more studies planned. Additionally, a derived user interface has been used in six separate interrater agreement studies. CDM's integrated data preprocessing and machine-learning libraries simplify some otherwise manual and error-prone steps in the clinical diagnostic model research workflow. Furthermore, CDM's libraries provide study coordinators with a method to monitor a study's predictive performance as patient inclusions increase. To our knowledge, CDM is the only eCRF system integrating data preprocessing and machine-learning libraries.
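
    The study-performance monitoring described above amounts to tracking a learning curve as patient inclusions grow. Below is a generic sketch with scikit-learn, using synthetic stand-in data rather than CDM's integrated libraries.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(400, 10))                        # stand-in patient features
        y = (X[:, 0] + rng.normal(size=400) > 0).astype(int)  # stand-in diagnosis labels

        for n in (50, 100, 200, 400):                         # growing inclusion counts
            auc = cross_val_score(LogisticRegression(), X[:n], y[:n],
                                  cv=5, scoring="roc_auc").mean()
            print(f"{n:4d} patients: mean AUC = {auc:.3f}")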

  10. Software-Programmed Optical Networking with Integrated NFV Service Provisioning

    DEFF Research Database (Denmark)

    Mehmeri, Victor; Wang, Xi; Basu, Shrutarshi

    2017-01-01

    We showcase demonstrations of “program & compile” styled optical networking as well as open platforms & standards based NFV service provisioning using a proof-of-concept implementation of the Software-Programmed Networking Operating System (SPN OS).

  11. An Integrated Online Library System as a Node in a Local Area Network: The Mitre Experience.

    Science.gov (United States)

    Kidwell, Mary Coyle

    1987-01-01

    Discusses the Mitre Corporation's implementation of OCLC's LS/2000 integrated library system using a local area network (LAN). LAN issues--requirements, equipment, reliability, growth, security, and traffic--are covered in general and as they relate to Mitre. Installation of the LAN/system interface and benefits and drawbacks of using a LAN for…

  12. The design of a web – based integrated library system with internet ...

    African Journals Online (AJOL)

    Developing countries like Nigeria face a series of challenges in developing, managing and securing an Integrated Library System (ILS) in the tertiary institutions and secondary schools where they are most needed. ILSs face issues such as lack of interactivity, speed, cost, unavailability of experienced users and programmers and ...

  13. Implementation of an Integrated Information Management System at the National Library of Wales: A Case Study

    Science.gov (United States)

    Evans, Manon Foster; Thomas, Sian

    2007-01-01

    Purpose: This paper aims to describe the experiences of the National Library of Wales in implementing an integrated information management system. Design/methodology/approach: Discusses the stages involved in the procurement process, data migration and general system implementation. Findings: Emphasises the need for a well-prepared yet flexible…

  14. Comparison of integral cross section values of several cross section libraries in the SAND-II format

    International Nuclear Information System (INIS)

    Zijp, W.L.; Nolthenius, H.J.

    1978-01-01

    A comparison of some integral cross section values for several cross section libraries in the SAND-II format is presented. The integral cross section values are calculated with the aid of the spectrum functions for a Watt fission spectrum, a 1/E spectrum and a Maxwellian spectrum. The libraries considered here are CCC-112B, ENDF/B-IV, DETAN74, LAPENAS and CESNEF; all five use the SAND-II format. (author)

  15. Digital Libraries: The Challenge of Integrating Instagram with a Taxonomy for Content Management

    OpenAIRE

    Simona Ibba; Filippo Eros Pani

    2016-01-01

    Interoperability and social implications are two current challenges in the digital library (DL) context. To resolve the problem of interoperability, our work aims to find a relationship between the main metadata schemas. In particular, we want to formalize knowledge through the creation of a metadata taxonomy built with the analysis and the integration of existing schemas associated with DLs. We developed a method to integrate and combine Instagram metadata and hashtags. The final result is a ...

  16. Continuous integration and quality control for scientific software

    Science.gov (United States)

    Neidhardt, A.; Ettl, M.; Brisken, W.; Dassing, R.

    2013-08-01

    Modern software has to be stable, portable, fast and reliable. This is becoming more and more important for scientific software as well. But it requires a sophisticated way to inspect, check and evaluate the quality of source code with a suitable, automated infrastructure. A centralized server with a software repository and a version control system is one essential part, to manage the code base and to control the different development versions. While each project can be compiled separately, the whole code base can also be compiled with one central “Makefile”. This is used to create automated, nightly builds. Additionally, all sources are inspected automatically with static code analysis and inspection tools, which check well-known error situations, memory and resource leaks, performance issues, or style issues. In combination with an automatic documentation generator it is possible to create the developer documentation directly from the code and the inline comments. All reports and generated information are presented as HTML pages on a Web server. Because this environment increased the stability and quality of the software of the Geodetic Observatory Wettzell tremendously, it is now also available to scientific communities. The developer group of the DiFX software correlator project is already a regular user.
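
    The nightly-build idea described above can be sketched as a small driver script. The repository path, report directory, and the choice of cppcheck as the static analysis tool are assumptions for illustration, not the observatory's actual setup.

        import subprocess
        from pathlib import Path

        REPO = Path("/srv/builds/checkout")    # hypothetical checkout location
        REPORTS = Path("/srv/www/reports")     # hypothetical web report directory
        REPORTS.mkdir(parents=True, exist_ok=True)

        def step(cmd, logname):
            """Run one build step and keep its output for the HTML report."""
            result = subprocess.run(cmd, cwd=REPO, capture_output=True, text=True)
            (REPORTS / logname).write_text(result.stdout + result.stderr)
            return result.returncode

        step(["git", "pull"], "update.log")                    # refresh sources
        step(["make", "-j4"], "build.log")                     # central Makefile build
        step(["cppcheck", "--enable=all", "."], "static.log")  # static analysis pass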

  17. Planning for the integration of the digital library, clinical decision support, and evidence at the point of care.

    Science.gov (United States)

    Schwartz, Linda Matula; Iobst, Barbara

    2008-01-01

    Integrating knowledge-based resources at the point of care is an important opportunity for hospital library involvement. In the progression of an IAIMS planning grant, the digital library is recognized as pivotal to the success of information domain integration throughout the institution. The planning process, data collection, and evolution of the planning project are discussed.

  18. Integrating Design Decision Management with Model-based Software Development

    DEFF Research Database (Denmark)

    Könemann, Patrick

    Design decisions are continuously made during the development of software systems and are important artifacts for design documentation. Dedicated decision management systems are often used to capture such design knowledge. Most such systems are, however, separated from the design artifacts... of the system. In model-based software development, where design models are used to develop a software system, the outcomes of many design decisions have a big impact on design models. The realization of design decisions is often manual and tedious work on design models. Moreover, keeping design models consistent..., or by ignoring the causes. This substitutes manual reviews to some extent. The concepts, implemented in a tool, have been validated with design patterns, refactorings, and domain level tests that comprise a replay of a real project. This proves the applicability of the solution to realistic examples...

  19. Integration of Simulink Models with Component-based Software Models

    DEFF Research Database (Denmark)

    Marian, Nicolae

    2008-01-01

    Model-based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model-based design involves the use of multiple models that represent different views of a system, having different semantics... of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical... constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has...

  20. Automatic attenuator upgrade for a Siemens D500 diffractometer via a generic software library to overcome hardware limitations

    International Nuclear Information System (INIS)

    Mayr, Sina; Randau, Christian; Kreuzpaintner, Wolfgang

    2017-01-01

    Proxy software was developed which allows the Siemens D500 x-ray diffractometer to be upgraded with add-ons that have never been officially available for it. For demonstration, we designed and integrated an automatic attenuator option and demonstrated the feasibility of our upgrade path by typical comparative x-ray measurements, which would usually saturate the x-ray detector if no attenuator is used.

  1. Automatic attenuator upgrade for a Siemens D500 diffractometer via a generic software library to overcome hardware limitations

    Energy Technology Data Exchange (ETDEWEB)

    Mayr, Sina, E-mail: sina.mayr@frm2.tum.de [Technische Universität München, Physik-Department E21, James-Franck-Str. 1, 85748 Garching (Germany); Randau, Christian [Georg-August-Universität Göttingen, Fakultät für Geowissenschaften und Geologie, Abteilung Isotopengeologie Außenstelle MLZ (FRM II), Lichtenbergstr. 1, 85748 Garching (Germany); Kreuzpaintner, Wolfgang [Technische Universität München, Physik-Department E21, James-Franck-Str. 1, 85748 Garching (Germany)

    2017-05-21

    Proxy software was developed which allows the Siemens D500 x-ray diffractometer to be upgraded with add-ons that have never been officially available for it. For demonstration, we designed and integrated an automatic attenuator option and demonstrated the feasibility of our upgrade path by typical comparative x-ray measurements, which would usually saturate the x-ray detector if no attenuator is used.

  2. Integrating Dynamic Mathematics Software into Cooperative Learning Environments in Mathematics

    Science.gov (United States)

    Zengin, Yilmaz; Tatar, Enver

    2017-01-01

    The aim of this study was to evaluate the implementation of the cooperative learning model supported with dynamic mathematics software (DMS), that is a reflection of constructivist learning theory in the classroom environment, in the teaching of mathematics. For this purpose, a workshop was conducted with the volunteer teachers on the…

  3. Concrete containment integrity software: Procedure manual and guidelines

    International Nuclear Information System (INIS)

    Dameron, R.A.; Dunham, R.S.; Rashid, Y.R.

    1990-06-01

    This report is an executive summary describing the concrete containment analysis methodology and software that was developed in the EPRI-sponsored research to predict the overpressure behavior and leakage of concrete containments. A set of guidelines has been developed for performing reliable 2D axisymmetric concrete containment analysis with a cracking concrete constitutive model developed by ANATECH. The software package developed during this research phase is designed for use in conjunction with ABAQUS-EPGEN; it provides the concrete model and automates axisymmetric grid preparation, and rebar generation for 2D and 3D grids. The software offers the option of generating pre-programmed axisymmetric grids that can be tailored to a specific containment by input of a few geometry parameters. The goal of simplified axisymmetric analysis within the framework of the containment leakage prediction methodology is to compute global liner strain histories at various locations within the containment. A simplified approach for generating peak liner strains at structural discontinuities as function of the global liner strains has been presented in a separate leakage criteria document; the curves for strain magnification factors and liner stress triaxiality factors found in that document are intended to be applied to the global liner strain histories developed through global 2D analysis. This report summarizes the procedures for global 2D analysis and gives an overview of the constitutive model and the special purpose concrete containment analysis software developed in this research phase. 8 refs., 10 figs

  4. Library Computing

    Science.gov (United States)

    Library Computing, 1985

    1985-01-01

    Special supplement to "Library Journal" and "School Library Journal" covers topics of interest to school, public, academic, and special libraries planning for automation: microcomputer use, readings in automation, online searching, databases of microcomputer software, public access to microcomputers, circulation, creating a…

  5. Integrating open-source software applications to build molecular dynamics systems.

    Science.gov (United States)

    Allen, Bruce M; Predecki, Paul K; Kumosa, Maciej

    2014-04-05

    Three open-source applications, NanoEngineer-1, packmol, and mis2lmp, are integrated using an open-source file format to quickly create molecular dynamics (MD) cells for simulation. The three software applications collectively make up the open-source software (OSS) suite known as MD Studio (MDS). The software is validated through software engineering practices and is verified through simulation of the diglycidyl ether of bisphenol-a and isophorone diamine (DGEBA/IPD) system. Multiple simulations are run using the MDS software to create MD cells, and the data generated are used to calculate density, bulk modulus, and glass transition temperature of the DGEBA/IPD system. Simulation results compare well with published experimental and numerical results. The MDS software prototype confirms that OSS applications can be analyzed against real-world research requirements and integrated to create a new capability. Copyright © 2014 Wiley Periodicals, Inc.
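
    Of the three integrated tools, packmol is the one that packs molecules into a starting simulation cell. A script can drive it as sketched below, following packmol's documented input format; the structure filenames and molecule counts are placeholders, not the authors' actual inputs.

        import subprocess

        # packmol job description in its documented input format;
        # dgeba.pdb / ipd.pdb are placeholder structure files
        packmol_input = """
        tolerance 2.0
        filetype pdb
        output mixture.pdb
        structure dgeba.pdb
          number 100
          inside cube 0. 0. 0. 60.
        end structure
        structure ipd.pdb
          number 50
          inside cube 0. 0. 0. 60.
        end structure
        """

        # packmol reads the job description from standard input
        subprocess.run(["packmol"], input=packmol_input, text=True, check=True)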

  6. Integration of the MUSE Software Pipeline into the Astro-WISE System

    NARCIS (Netherlands)

    Pizagno, J.; Streicher, O.; Vriend, W.-J.; Ballester, P.; Egret, D.; Lorente, N.P.F.

    We discuss the current state of integrating the Multi Unit Spectroscopic Explorer (hereafter: MUSE) software pipeline (Weilbacher et al. 2006) into the Astro-WISE system (Valentijn et al. 2007a; Vriend et al. 2012). MUSE is a future integral-field spectrograph for the VLT, consisting of 24 Integral

  7. Optimal integration and test plans for software releases of lithographic systems

    NARCIS (Netherlands)

    Boumen, R.; Jong, de I.S.M.; Mortel - Fronczak, van de J.M.; Rooda, J.E.

    2007-01-01

    This paper describes a method to determine the optimal integration and test plan for embedded systems software releases. The method consists of four steps: 1) describe the integration and test problem in an integration and test model, which is introduced in this paper, 2) determine possible test

  8. Integrated Software Development System/Higher Order Software Conceptual Description (ISDS/HOS)

    Science.gov (United States)

    1976-11-01

    Only fragments of this report survive extraction: front-matter headings ("Structured Flowchart Conventions", "Design Diagram Notation") and the publisher address (Higher Order Software, Inc., 843 Massachusetts Avenue, Cambridge, Massachusetts); a passage noting that HIPO diagrams reference other HIPO diagrams as well as non-HIPO documentation, such as flowcharts or decision tables, of the associated process steps; and a requirement that the syntax be easy to learn and prompt the novice to avoid classic beginner errors, with desirable editing capabilities.

  9. On integrating modeling software for application to total-system performance assessment

    International Nuclear Information System (INIS)

    Lewis, L.C.; Wilson, M.L.

    1994-05-01

    We examine the processes and methods used to facilitate collaboration in software development between two organizations at separate locations -- Lawrence Livermore National Laboratory (LLNL) in California and Sandia National Laboratories (SNL) in New Mexico. Our software development process integrated the efforts of these two laboratories. Software developed at LLNL to model corrosion and failure of waste packages and subsequent releases of radionuclides was incorporated as a source term into SNL's computer models for fluid flow and radionuclide transport through the geosphere

  10. State of the Art : Integrated Management of Requirements in Model-Based Software Engineering

    OpenAIRE

    Thörn, Christer

    2006-01-01

    This report describes the background and future of research concerning integrated management of requirements in model-based software engineering. The focus is on describing the relevant topics and existing theoretical backgrounds that form the basis for the research. The report describes the fundamental difficulties of requirements engineering for software projects, and proposes that the results and methods of models in software engineering can help alleviate those problems. Taking inspiration...

  11. Public Access to Digital Material; A Call to Researchers: Digital Libraries Need Collaboration across Disciplines; Greenstone: Open-Source Digital Library Software; Retrieval Issues for the Colorado Digitization Project's Heritage Database; Report on the 5th European Conference on Digital Libraries, ECDL 2001; Report on the First Joint Conference on Digital Libraries.

    Science.gov (United States)

    Kahle, Brewster; Prelinger, Rick; Jackson, Mary E.; Boyack, Kevin W.; Wylie, Brian N.; Davidson, George S.; Witten, Ian H.; Bainbridge, David; Boddie, Stefan J.; Garrison, William A.; Cunningham, Sally Jo; Borgman, Christine L.; Hessel, Heather

    2001-01-01

    These six articles discuss various issues relating to digital libraries. Highlights include public access to digital materials; intellectual property concerns; the need for collaboration across disciplines; Greenstone software for construction and presentation of digital information collections; the Colorado Digitization Project; and conferences…

  12. Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.

    Science.gov (United States)

    Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed

    2015-02-01

    Evaluating and selecting software packages that meet the requirements of an organization are difficult aspects of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and the functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed and a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were ranked on a set of metric outcomes using integrated Analytic Hierarchy Process (AHP) and TOPSIS. The experimental results showed that GNUmed and OpenEMR provide a better basis on ranking scores than the other open-source EMR software packages. Copyright © 2014 Elsevier Inc. All rights reserved.
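
    The TOPSIS half of the ranking step described above is mechanical once a weighted decision matrix exists. A minimal sketch in Python; the package scores and criteria weights below are invented for illustration (in the study the weights would come from the AHP stage):

        import numpy as np

        # Rows = candidate EMR packages, columns = evaluation criteria.
        X = np.array([[7.0, 8.0, 6.0],
                      [9.0, 6.0, 7.0],
                      [6.0, 7.0, 9.0]])
        w = np.array([0.5, 0.3, 0.2])            # hypothetical AHP weights
        benefit = np.array([True, True, True])   # all benefit-type criteria

        R = X / np.linalg.norm(X, axis=0)        # vector-normalize columns
        V = R * w                                # weighted normalized matrix

        ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
        anti = np.where(benefit, V.min(axis=0), V.max(axis=0))

        d_best = np.linalg.norm(V - ideal, axis=1)
        d_worst = np.linalg.norm(V - anti, axis=1)
        closeness = d_worst / (d_best + d_worst)
        print(np.argsort(-closeness))            # package ranking, best first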

  13. Comparison of integral cross section values of several cross section libraries in the SAND-II format

    International Nuclear Information System (INIS)

    Zijp, W.L.; Nolthenius, H.J.

    1976-09-01

    A comparison of some integral cross-section values for several cross-section libraries in the SAND-II format is presented. The integral cross-section values are calculated with the aid of the spectrum functions for a Watt fission spectrum, a 1/E spectrum and a Maxwellian spectrum. The libraries considered here are CCC-112B, ENDF/B-IV, DETAN74, LAPENAS and CESNEF; all five use the SAND-II format. Discrepancies between cross-sections in the different libraries are indicated but not discussed
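
    The quantity being compared is a spectrum-averaged cross section: each library's group-wise cross section folded with a spectrum function and normalized by the spectrum integral. A rough Python sketch with a flat placeholder cross section and the common Watt fission spectrum form chi(E) ~ exp(-E/0.965) * sinh(sqrt(2.29*E)), E in MeV; a coarse 100-group grid stands in for the 640-group SAND-II structure:

        import numpy as np

        E = np.logspace(-4, np.log10(20.0), 101)   # group boundaries (MeV)
        sigma = np.full(100, 0.1)                  # placeholder sigma (barn)

        Emid = np.sqrt(E[:-1] * E[1:])             # log-midpoint energies
        dE = np.diff(E)
        chi = np.exp(-Emid / 0.965) * np.sinh(np.sqrt(2.29 * Emid))

        # Integral (spectrum-averaged) cross section for this library;
        # repeating this per library exposes the discrepancies discussed.
        sigma_int = np.sum(sigma * chi * dE) / np.sum(chi * dE)
        print(sigma_int)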

  14. Integrating HCI Specialists into Open Source Software Development Projects

    Science.gov (United States)

    Hedberg, Henrik; Iivari, Netta

    Typical open source software (OSS) development projects are organized around technically talented developers, whose communication is based on technical aspects and source code. Decision-making power is gained through proven competence and activity in the project, and non-technical end-user opinions are too often neglected. In addition, human-computer interaction (HCI) specialists have encountered difficulties in trying to participate in OSS projects, because there seems to be no clear authority and responsibility for them. In this paper, based on HCI and OSS literature, we introduce an extended OSS development project organization model that adds a new level of communication and roles for attending to the human aspects of software. The proposed model makes the existence of HCI specialists visible in the projects, and promotes interaction between developers and the HCI specialists in the course of a project.

  15. A framework to integrate software behavior into dynamic probabilistic risk assessment

    International Nuclear Information System (INIS)

    Zhu Dongfeng; Mosleh, Ali; Smidts, Carol

    2007-01-01

    Software plays an increasingly important role in modern safety-critical systems. Although research has been done to integrate software into the classical probabilistic risk assessment (PRA) framework, current PRA practice overwhelmingly neglects the contribution of software to system risk. Dynamic probabilistic risk assessment (DPRA) is considered to be the next generation of PRA techniques. DPRA is a set of methods and techniques in which simulation models that represent the behavior of the elements of a system are exercised in order to identify risks and vulnerabilities of the system. The fact remains, however, that modeling software for use in the DPRA framework is also quite complex, and very little has been done to address the question directly and comprehensively. This paper develops a methodology to integrate software contributions in the DPRA environment. The framework includes a software representation and an approach to incorporate the software representation into the DPRA environment SimPRA. The software representation is based on multi-level objects, and the paper also proposes a framework to simulate the multi-level objects in the simulation-based DPRA environment. This is a new methodology to address the state explosion problem in the DPRA environment. This study is the first systematic effort to integrate software risk contributions into DPRA environments

  16. RAGE Reusable Game Software Components and Their Integration into Serious Game Engines

    NARCIS (Netherlands)

    Van der Vegt, Wim; Nyamsuren, Enkhbold; Westera, Wim

    2016-01-01

    This paper presents and validates a methodology for integrating reusable software components in diverse game engines. While conforming to the RAGE component-based architecture described elsewhere, the paper explains how the interactions and data exchange processes between a reusable software

  17. COMSY - A software tool for PLIM + PLEX with integrated risk-informed approaches

    International Nuclear Information System (INIS)

    Zander, A.; Nopper, H.; Roessner, R.

    2004-01-01

    The majority of mechanical components and structures in a thermal power plant are designed to experience a service life which is far above the intended design life. In most cases, only a small percentage of mechanical components are subject to significant degradation which may affect the integrity or the function of the component. If plant life extension (PLEX) is considered as an option, a plant specific PLIM strategy needs to be developed. One of the most important tasks of such a PLIM strategy is to identify those components which (i) are relevant for the safety and/or availability of the plant and (ii) experience elevated degradation due to their operating and design conditions. For these components special life management strategies need to be established to reliably monitor their condition. FRAMATOME ANP GmbH has developed the software tool COMSY, which is designed to efficiently support a plant-wide lifetime management strategy for static mechanical components, providing the basis for plant life extension (PLEX) activities. The objective is the economical and safe operation of power plants over their design lifetime - and beyond. The tool provides the capability to establish a program guided technical documentation of the plant by utilizing a virtual plant data model. The software integrates engineering analysis functions and comprehensive material libraries to perform a lifetime analysis for various degradation mechanisms typically experienced in power plants (e.g. flow-accelerated corrosion, intergranular stress corrosion cracking, strain-induced cracking, material fatigue, cavitation erosion, droplet impingement erosion, pitting, etc.). A risk-based prioritization serves to focus inspection activities on safety or availability relevant locations, where a degradation potential exists. Trending functions support the comparison of the as-measured condition with the predicted progress of degradation while making allowance for measurement tolerances.

  18. LearnWeb 2.0. Integrating Social Software for Lifelong Learning.

    NARCIS (Netherlands)

    Marenzi, Ivana; Demidova, Elena; Nejdl, Wolfgang

    2008-01-01

    Marenzi, I., Demidova, E., & Nejdl, W. (2008). LearnWeb 2.0. Integrating Social Software for Lifelong Learning. Proceedings of the ED-Media 2008. World Conference on Educational Multimedia, Hypermedia & Telecommunications. June, 30 - July, 4, 2008, Austria, Vienna.

  19. PiCO QL: A software library for runtime interactive queries on program data

    Science.gov (United States)

    Fragkoulis, Marios; Spinellis, Diomidis; Louridas, Panos

    PiCO QL is an open source C/C++ software whose scientific scope is real-time interactive analysis of in-memory data through SQL queries. It exposes a relational view of a system's or application's data structures, which is queryable through SQL. While the application or system is executing, users can input queries through a web-based interface or issue web service requests. Queries execute on the live data structures through the respective relational views. PiCO QL makes a good candidate for ad-hoc data analysis in applications and for diagnostics in systems settings. Applications of PiCO QL include the Linux kernel, the Valgrind instrumentation framework, a GIS application, a virtual real-time observatory of stellar objects, and a source code analyser.
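
    The core idea, exposing live in-memory data structures through relational views queryable in SQL, can be illustrated loosely with Python's in-memory SQLite. This is a conceptual analogy only; PiCO QL itself generates its relational views from C/C++ data structure definitions and serves queries over its web interface:

        import sqlite3

        # "Live program data": task records inside a running application.
        tasks = [(1, "io-worker", 3), (2, "scheduler", 0), (3, "logger", 7)]

        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE task (id INTEGER, name TEXT, pending INTEGER)")
        db.executemany("INSERT INTO task VALUES (?, ?, ?)", tasks)

        # Ad-hoc diagnostic query against the relational view of program state.
        for (name,) in db.execute("SELECT name FROM task WHERE pending > 2"):
            print(name)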

  20. PiCO QL: A software library for runtime interactive queries on program data

    Directory of Open Access Journals (Sweden)

    Marios Fragkoulis

    2016-01-01

    Full Text Available PiCO QL is an open source C/C++ software whose scientific scope is real-time interactive analysis of in-memory data through SQL queries. It exposes a relational view of a system's or application's data structures, which is queryable through SQL. While the application or system is executing, users can input queries through a web-based interface or issue web service requests. Queries execute on the live data structures through the respective relational views. PiCO QL makes a good candidate for ad-hoc data analysis in applications and for diagnostics in systems settings. Applications of PiCO QL include the Linux kernel, the Valgrind instrumentation framework, a GIS application, a virtual real-time observatory of stellar objects, and a source code analyser.

  1. Integrated management software for action sheets in case of fire; Software de gestion integral de fichas de actuacion en caso de incendio

    Energy Technology Data Exchange (ETDEWEB)

    Moreno-Ventas Garcia, V.; Gimeno Serrano, F.

    2010-07-01

    The proper management of emergencies is a challenge for which it is essential to be prepared. The integrated software for action sheets in case of fire, together with rapid access to information, makes this application a must for effectively handling any fire emergency at a nuclear facility.

  2. The Usage of Programming Software “The Library of Electronic Visual Aids “Algebra 7-9” During Algebra Learning in 7-9 Forms.

    Directory of Open Access Journals (Sweden)

    V.A. Kreknin

    2008-06-01

    Full Text Available The programming software “The Library of Electronic Visual Aids “Algebra 7-9” for secondary institutions was developed for the computer support of algebra classes in the 7-9 forms of secondary school. The present article describes its basic characteristics, features and possibilities.

  3. Sensitivity Analysis for Design Optimization Integrated Software Tools, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective of this proposed project is to provide a new set of sensitivity analysis theory and codes, the Sensitivity Analysis for Design Optimization Integrated...

  4. Hermeneutics framework: integration of design rationale and optimizing software modules

    NARCIS (Netherlands)

    Aksit, Mehmet; Malakuti Khah Olun Abadi, Somayeh

    To tackle the evolution challenges of adaptive systems, this paper argues for the necessity of hermeneutic approaches that help to avoid too early elimination of design alternatives. This visionary paper proposes the Hermeneutics Framework, which computationally integrates a design rationale

  5. Command Center Library Model Document. Comprehensive Approach to Reusable Defense Software (CARDS)

    Science.gov (United States)

    1992-05-31

    Only fragments of this document survive extraction: a heading ("3.7.16.1 FrameMaker") and passages noting that FrameMaker is a Commercial Off The Shelf (COTS) component facilitating WYSIWYG creation of formatted reports with embedded graphics; that it is an advanced publishing tool integrating word processing with functionality for specifying document layout; and that library holdings for the FrameMaker component include product evaluation reports in ASCII and PostScript formats and an online product assessment in the model.

  6. Servicing HEP experiments with a complete set of ready integrated and configured common software components

    International Nuclear Information System (INIS)

    Roiser, Stefan; Gaspar, Ana; Perrin, Yves; Kruzelecki, Karol

    2010-01-01

    The LCG Applications Area at CERN provides basic software components for the LHC experiments such as ROOT, POOL, COOL which are developed in house and also a set of 'external' software packages (70) which are needed in addition such as Python, Boost, Qt, CLHEP, etc. These packages target many different areas of HEP computing such as data persistency, math, simulation, grid computing, databases, graphics, etc. Other packages provide tools for documentation, debugging, scripting languages and compilers. All these packages are provided in a consistent manner on different compilers, architectures and operating systems. The Software Process and Infrastructure project (SPI) [1] is responsible for the continuous testing, coordination, release and deployment of these software packages. The main driving force for the actions carried out by SPI is the needs of the LHC experiments, but other HEP experiments could also profit from the set of consistent libraries provided and receive a stable and well tested foundation to build their experiment software frameworks. This presentation will first provide a brief description of the tools and services provided for the coordination, testing, release, deployment and presentation of LCG/AA software packages and then focus on a second set of tools provided for outside LHC experiments to deploy a stable set of HEP related software packages both as binary distribution or from source.

  7. Servicing HEP experiments with a complete set of ready integrated and configured common software components

    Energy Technology Data Exchange (ETDEWEB)

    Roiser, Stefan; Gaspar, Ana; Perrin, Yves [CERN, CH-1211 Geneva 23, PH Department, SFT Group (Switzerland); Kruzelecki, Karol, E-mail: stefan.roiser@cern.c, E-mail: ana.gaspar@cern.c, E-mail: yves.perrin@cern.c, E-mail: karol.kruzelecki@cern.c [CERN, CH-1211 Geneva 23, PH Department, LBC Group (Switzerland)

    2010-04-01

    The LCG Applications Area at CERN provides basic software components for the LHC experiments such as ROOT, POOL, COOL which are developed in house and also a set of 'external' software packages (70) which are needed in addition such as Python, Boost, Qt, CLHEP, etc. These packages target many different areas of HEP computing such as data persistency, math, simulation, grid computing, databases, graphics, etc. Other packages provide tools for documentation, debugging, scripting languages and compilers. All these packages are provided in a consistent manner on different compilers, architectures and operating systems. The Software Process and Infrastructure project (SPI) [1] is responsible for the continuous testing, coordination, release and deployment of these software packages. The main driving force for the actions carried out by SPI is the needs of the LHC experiments, but other HEP experiments could also profit from the set of consistent libraries provided and receive a stable and well tested foundation to build their experiment software frameworks. This presentation will first provide a brief description of the tools and services provided for the coordination, testing, release, deployment and presentation of LCG/AA software packages and then focus on a second set of tools provided for outside LHC experiments to deploy a stable set of HEP related software packages both as binary distribution or from source.

  8. Software library of meteorological routines for air quality models; Libreria de software de procedimientos meteorologicos para modelos de dispersion de contaminantes

    Energy Technology Data Exchange (ETDEWEB)

    Galindo Garcia, Ivan Francisco

    1999-04-01

    Air quality models are an essential tool for most air pollution studies. The models require, however, certain meteorological information about the model domain. Some of the required meteorological parameters can be measured directly, but others must be estimated from available measured data. Therefore, a set of procedures, routines and computational programs is required to obtain all the meteorological and micrometeorological input data. The objective of this study is the identification and implementation of several relationships and methods for determining all the meteorological parameters required as input data by US-EPA recommended air pollution models. To accomplish this, a study of air pollution models was conducted, focusing particularly on the models' meteorological input data. The meteorological stations of the Servicio Meteorologico Nacional (SMN) were also analyzed to establish the type and quality of the meteorological data they produce; the routines and methods developed were based particularly on the data produced by SMN stations. The routines were organized into a software library, which allows one to build the specific meteorological processor needed, independently of the model used. The methods were validated against data obtained from an advanced meteorological station owned and operated by the Electrical Research Institute (Instituto de Investigaciones Electricas, IIE). The validation results show that estimating the parameters required by air pollution models from routinely available data from Mexican meteorological stations is feasible, and therefore lets us take full advantage of air pollution models. As an application example of the software library developed, the building of a meteorological processor for a specific air pollution model (CALPUFF) is described. The big advantage the library represents is evident from this example.
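
    One example of the kind of routine such a library collects: extrapolating a measured wind speed to another height with the power-law profile u(z) = u_ref * (z/z_ref)**p, a standard micrometeorological relationship used to supply stack-height winds to dispersion models. The exponent value below is a hypothetical neutral-stability, rural choice; in practice the library would select it from stability class and surface roughness:

        def wind_at_height(u_ref, z_ref, z, p=0.15):
            """Power-law extrapolation of wind speed (m/s) to height z (m)."""
            return u_ref * (z / z_ref) ** p

        # Wind measured at 10 m, needed at a 60 m stack height.
        print(round(wind_at_height(u_ref=3.2, z_ref=10.0, z=60.0), 2))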

  9. A MODEL FOR INTEGRATED SOFTWARE TO IMPROVE COMMUNICATION POLICY IN DENTAL TECHNICAL LABS

    Directory of Open Access Journals (Sweden)

    Minko M. Milev

    2017-06-01

    Full Text Available Introduction: Integrated marketing communications (IMC) are all kinds of communications between organisations and customers, partners, other organisations and society. Aim: To develop and present an integrated software model which can improve the effectiveness of communications in dental technical services. Material and Methods: The model of integrated software is based on recommendations from a total of 700 respondents (students of dental technology, dental physicians, dental technicians and patients of dental technical laboratories in Northeastern Bulgaria). Results and Discussion: We present the benefits of the future integrated software for improving the communication policy of the dental technical laboratory, meeting the need for fast cooperation and a well-built communication network between dental physicians, dental technicians, patients and students. Conclusion: The use of integrated communications could be a powerful unified approach to improving the communication policy between all players in the market of dental technical services.

  10. Digital Libraries: The Challenge of Integrating Instagram with a Taxonomy for Content Management

    Directory of Open Access Journals (Sweden)

    Simona Ibba

    2016-05-01

    Full Text Available Interoperability and social implications are two current challenges in the digital library (DL) context. To resolve the problem of interoperability, our work aims to find relationships between the main metadata schemas. In particular, we want to formalize knowledge through the creation of a metadata taxonomy built from the analysis and integration of existing schemas associated with DLs. We developed a method to integrate and combine Instagram metadata and hashtags. The final result is a taxonomy which provides innovative metadata for the classification of resources, such as Instagram images and user-generated content, that play a primary role in the context of modern DLs. Instagram's ability to geolocate the photos posted by users allows us to identify the most relevant and interesting content for a specific user type in a specific location, and to improve the access, visibility and searchability of library content.

  11. Integration of a Robotic Arm with the Surgical Assistant Workstation Software Framework

    NARCIS (Netherlands)

    Young, J.; Elhawary, H.; Popovic, A.

    2012-01-01

    We have integrated the Philips Research robot arm with the Johns Hopkins University cisst library, an open-source platform for computer-assisted surgical intervention. The development of a Matlab to C++ wrapper to abstract away servo-level details facilitates the rapid development of a component-based framework with “plug and play” features.

  12. Integrating environmental component models. Development of a software framework

    NARCIS (Netherlands)

    Schmitz, O.

    2014-01-01

    Integrated models consist of interacting component models that represent various natural and social systems. They are important tools to improve our understanding of environmental systems, to evaluate cause–effect relationships of human–natural interactions, and to forecast the behaviour of

  13. NeuroMatic: An Integrated Open-Source Software Toolkit for Acquisition, Analysis and Simulation of Electrophysiological Data

    Science.gov (United States)

    Rothman, Jason S.; Silver, R. Angus

    2018-01-01

    Acquisition, analysis and simulation of electrophysiological properties of the nervous system require multiple software packages. This makes it difficult to conserve experimental metadata and track the analysis performed. It also complicates certain experimental approaches such as online analysis. To address this, we developed NeuroMatic, an open-source software toolkit that performs data acquisition (episodic, continuous and triggered recordings), data analysis (spike rasters, spontaneous event detection, curve fitting, stationarity) and simulations (stochastic synaptic transmission, synaptic short-term plasticity, integrate-and-fire and Hodgkin-Huxley-like single-compartment models). The merging of a wide range of tools into a single package facilitates a more integrated style of research, from the development of online analysis functions during data acquisition, to the simulation of synaptic conductance trains during dynamic-clamp experiments. Moreover, NeuroMatic has the advantage of working within Igor Pro, a platform-independent environment that includes an extensive library of built-in functions, a history window for reviewing the user's workflow and the ability to produce publication-quality graphics. Since its original release, NeuroMatic has been used in a wide range of scientific studies and its user base has grown considerably. NeuroMatic version 3.0 can be found at http://www.neuromatic.thinkrandom.com and https://github.com/SilverLabUCL/NeuroMatic. PMID:29670519
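
    Of the model classes listed above, the integrate-and-fire neuron is the simplest to reproduce outside Igor Pro. A minimal leaky integrate-and-fire sketch in Python (parameter values are illustrative, not taken from NeuroMatic):

        # Leaky integrate-and-fire: dv/dt = (v_rest - v + R*I) / tau,
        # with a reset whenever v crosses threshold.
        dt, T = 1e-4, 0.2                    # time step and duration (s)
        tau, R = 0.02, 1e8                   # membrane tau (s), resistance (ohm)
        v_rest, v_thresh, v_reset = -0.070, -0.050, -0.070   # volts
        I = 2.5e-10                          # constant input current (A)

        v, spike_times = v_rest, []
        for step in range(int(T / dt)):
            v += dt / tau * (v_rest - v + R * I)
            if v >= v_thresh:                # threshold crossing -> spike
                spike_times.append(step * dt)
                v = v_reset
        print(len(spike_times), "spikes")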

  14. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  15. Experience with Intel's Many Integrated Core Architecture in ATLAS Software

    CERN Document Server

    Fleischmann, S; The ATLAS collaboration; Lavrijsen, W; Neumann, M; Vitillo, R

    2014-01-01

    Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks. This should make it possible to develop for both throughput and latency devices using a single code base.

  16. Experience with Intel's Many Integrated Core Architecture in ATLAS Software

    CERN Document Server

    Fleischmann, S; The ATLAS collaboration; Lavrijsen, W; Neumann, M; Vitillo, R

    2013-01-01

    Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks. This should make it possible to develop for both throughput and latency devices using a single code base.

  17. Quality control of next-generation sequencing library through an integrative digital microfluidic platform.

    Science.gov (United States)

    Thaitrong, Numrin; Kim, Hanyoup; Renzi, Ronald F; Bartsch, Michael S; Meagher, Robert J; Patel, Kamlesh D

    2012-12-01

    We have developed an automated quality control (QC) platform for next-generation sequencing (NGS) library characterization by integrating a droplet-based digital microfluidic (DMF) system with a capillary-based reagent delivery unit and a quantitative CE module. Using an in-plane capillary-DMF interface, a prepared sample droplet was actuated into position between the ground electrode and the inlet of the separation capillary to complete the circuit for an electrokinetic injection. Using a DNA ladder as an internal standard, the CE module with a compact LIF detector was capable of detecting dsDNA in the range of 5-100 pg/μL, suitable for the amount of DNA required by the Illumina Genome Analyzer sequencing platform. This DMF-CE platform consumes tenfold less sample volume than the current Agilent BioAnalyzer QC technique, preserving precious sample while providing necessary sensitivity and accuracy for optimal sequencing performance. The ability of this microfluidic system to validate NGS library preparation was demonstrated by examining the effects of limited-cycle PCR amplification on the size distribution and the yield of Illumina-compatible libraries, demonstrating that as few as ten cycles of PCR bias the size distribution of the library toward undesirable larger fragments. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
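
    The quantitative step described above reduces to calibrating detector response against known concentrations and inverting that calibration for the library sample. A hedged Python sketch with invented numbers (the actual platform calibrates against the co-injected DNA ladder):

        import numpy as np

        conc = np.array([5.0, 25.0, 50.0, 100.0])   # ladder standards (pg/uL)
        area = np.array([0.9, 4.6, 9.3, 18.1])      # measured LIF peak areas

        slope, intercept = np.polyfit(conc, area, 1)  # linear response fit

        sample_area = 6.8                             # NGS library peak area
        sample_conc = (sample_area - intercept) / slope
        print(f"library concentration ~ {sample_conc:.1f} pg/uL")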

  18. A method for establishing integrity in software-based systems

    International Nuclear Information System (INIS)

    Staple, B.D.; Berg, R.S.; Dalton, L.J.

    1997-01-01

    In this paper, the authors present a digital system requirements specification method that has demonstrated a potential for improving the completeness of requirements while reducing ambiguity. It assists with making proper digital system design decisions, including the defense against specific digital system failure modes. It also helps define the technical rationale for all of the component and interface requirements. This approach is a procedural method that abstracts key features that are expanded in a partitioning that identifies and characterizes hazards and safety system function requirements. The key system features are subjected to a hierarchy that progressively defines their detailed characteristics and components. This process produces a set of requirements specifications for the system and all of its components. Based on application to nuclear power plants, the approach described here uses two ordered domains: plant safety followed by safety system integrity. Plant safety refers to those systems defined to meet the safety goals for the protection of the public. Safety system integrity refers to systems defined to ensure that the system can meet the safety goals. Within each domain, a systematic process is used to identify hazards and define the corresponding means of defense and mitigation. In both domains, the approach and structure are focused on the completeness of information and eliminating ambiguities in the generation of safety system requirements that will achieve the plant safety goals

  19. Automating the Small Library.

    Science.gov (United States)

    Skapura, Robert

    1987-01-01

    Discusses the use of microcomputers for automating school libraries, both for entire systems and for specific library tasks. Highlights include available library management software, newsletters that evaluate software, constructing an evaluation matrix, steps to consider in library automation, and a brief discussion of computerized card catalogs.…

  20. Development, Validation and Integration of the ATLAS Trigger System Software in Run 2

    CERN Document Server

    Keyes, Robert; The ATLAS collaboration

    2016-01-01

    The trigger system of the ATLAS detector at the LHC is a combination of hardware, firmware and software, associated to various sub-detectors that must seamlessly cooperate in order to select 1 collision of interest out of every 40,000 delivered by the LHC every millisecond. This talk will discuss the challenges, workflow and organization of the ongoing trigger software development, validation and deployment. This development, from the top level integration and configuration to the individual components responsible for each sub system, is done to ensure that the most up to date algorithms are used to optimize the performance of the experiment. This optimization hinges on the reliability and predictability of the software performance, which is why validation is of the utmost importance. The software adheres to a hierarchical release structure, with newly validated releases propagating upwards. Integration tests are carried out on a daily basis to ensure that the releases deployed to the online trigger farm duri...

  1. ZZ DOSCROS, Neutron Cross-Section Library for Spectra Unfolding and Integral Parameter Evaluation

    International Nuclear Information System (INIS)

    Zijp, Willem L.; Nolthenius, Henk J.; Rieffe, Henk Ch.

    1987-01-01

    1 - Description of problem or function: Format: SAND-II; Number of groups: 640 fine-group cross section values; Nuclides: Li, B, F, Na, Mg, Al, S, Sc, Ti, Cr, Mn, Fe, Co, Ni, Cu, Zn, As, Br, Nb, Mo, Rh, Pd, Ag, In, Sb, I, Cs, La, Eu, Sm, Dy, Lu, Ta, W, Re, Au, Th, U, Np, Pu. Origin: ENDF/B-V mainly, ENDF/B-IV, INDL/V. In combination with the DAMSIG81 library, this library forms a convenient source of evaluated energy-dependent cross section sets which may be used in the determination of neutron spectra by means of adjustment (or unfolding) procedures, or for the determination of integral parameters (such as the damage-to-activation ratio) useful in characterising neutron spectra. The energy-dependent fine-group cross section data are presented in a 640-group structure of the SAND-II type. This group structure has 45 energy groups per energy decade below 1 MeV and a group width of 100 keV above 1 MeV. The total energy span of this group structure is from 10^-10 MeV to 20 MeV. The library has the SAND-II format, which implies that a special part of the library has to contain cover cross section data sets. These cross section data sets are required in the SAND-II program for taking into account the influence of special detector surroundings which may be used during an irradiation. 2 - Method of solution: The selection of the reactions from the evaluated nuclear data libraries was determined by various properties of the reactions for neutron metrology. For this reason all the well-known reactions of the ENDF/B-V dosimetry file are included, but these data are supplemented with cross section sets for less well known metrology reactions which may become of interest
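
    The 640-group structure described above can be reconstructed directly from the text: 45 equal-lethargy groups per decade over the ten decades from 10^-10 MeV to 1 MeV, then 100 keV-wide groups up to 20 MeV. A small Python check (assuming equal-lethargy spacing within each decade, as is standard for SAND-II):

        import numpy as np

        low = 10.0 ** np.linspace(-10, 0, 10 * 45 + 1)  # 450 groups below 1 MeV
        high = np.arange(1.1, 20.0 + 1e-9, 0.1)         # 190 groups above 1 MeV
        boundaries = np.concatenate([low, high])

        print(len(boundaries) - 1)                      # -> 640 groups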

  2. Software Uncertainty in Integrated Environmental Modelling: the role of Semantics and Open Science

    Science.gov (United States)

    de Rigo, Daniele

    2013-04-01

    Computational aspects increasingly shape environmental sciences [1]. Actually, transdisciplinary modelling of complex and uncertain environmental systems is challenging computational science (CS) and also the science-policy interface [2-7]. Large spatial-scale problems falling within this category - i.e. wide-scale transdisciplinary modelling for environment (WSTMe) [8-10] - often deal with factors (a) for which deep-uncertainty [2,11-13] may prevent usual statistical analysis of modelled quantities and need different ways for providing policy-making with science-based support. Here, practical recommendations are proposed for tempering a peculiar - not infrequently underestimated - source of uncertainty. Software errors in complex WSTMe may subtly affect the outcomes with possible consequences even on collective environmental decision-making. Semantic transparency in CS [2,8,10,14,15] and free software [16,17] are discussed as possible mitigations (b). Software uncertainty, black-boxes and free software. Integrated natural resources modelling and management (INRMM) [29] frequently exploits chains of nontrivial data-transformation models (D-TM), each of them affected by uncertainties and errors. Those D-TM chains may be packaged as monolithic specialized models, maybe only accessible as black-box executables (if accessible at all) [50]. For end-users, black-boxes merely transform inputs into the final outputs, relying on classical peer-reviewed publications for describing the internal mechanism. While software tautologically plays a vital role in CS, it is often neglected in favour of more theoretical aspects. This paradox has been provocatively described as "the invisibility of software in published science. Almost all published papers required some coding, but almost none mention software, let alone include or link to source code" [51]. Recently, this primacy of theory over reality [52-54] has been challenged by new emerging hybrid approaches [55] and by the

  3. Integration testing through reusing representative unit test cases for high-confidence medical software.

    Science.gov (United States)

    Shin, Youngsul; Choi, Yunja; Lee, Woo Jin

    2013-06-01

    As medical software is getting larger, more complex, and more connected with other devices, finding faults in integrated software modules gets more difficult and time consuming. Existing integration testing typically takes a black-box approach, which treats the target software as a black box and selects test cases without considering the internal behavior of each software module. Though it could be cost-effective, this black-box approach cannot thoroughly test interaction behavior among integrated modules and might leave critical faults undetected, which should not happen in safety-critical systems such as medical software. This work anticipates that information on internal behavior is necessary even for integration testing to define thorough test cases for critical software, and proposes a new integration testing method that reuses test cases used for unit testing. The goal is to provide a cost-effective method to detect subtle interaction faults at the integration testing phase by reusing the knowledge obtained from the unit testing phase. The suggested approach notes that the test cases for unit testing include knowledge of the internal behavior of each unit, and extracts test cases for integration testing from the unit test cases for a given test criterion. The extracted representative test cases are connected with functions under test using the state domain, and a single test sequence to cover the test cases is produced. By means of reusing unit test cases, the tester has effective test cases to examine diverse execution paths and find interaction faults without analyzing complex modules. The produced test sequence can have test coverage as high as the unit testing coverage, and its length is close to the length of optimal test sequences. Copyright © 2013 Elsevier Ltd. All rights reserved.
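
    A much-simplified sketch of the selection idea above: keep one representative unit test case per state-domain partition, then chain the representatives into a single integration test sequence. All names and cases are hypothetical; the paper's method additionally ties the sequence to a given coverage criterion:

        # Unit test cases tagged with the module state they exercise.
        unit_tests = [
            ("idle",    {"input": 0, "expect": "off"}),
            ("idle",    {"input": 1, "expect": "warmup"}),
            ("running", {"input": 5, "expect": "alarm"}),
            ("running", {"input": 2, "expect": "steady"}),
        ]

        representatives = {}
        for state, case in unit_tests:       # first case seen per state
            representatives.setdefault(state, case)

        # One covering sequence over the state domain.
        test_sequence = [representatives[s] for s in ("idle", "running")]
        print(test_sequence)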

  4. An integrated computer design environment for the development of micro-computer critical software

    International Nuclear Information System (INIS)

    De Agostino, E.; Massari, V.

    1986-01-01

    The paper deals with the development of micro-computer software for nuclear safety systems. More specifically, it describes experimental work in the field of software development methodologies to be used for the implementation of micro-computer based safety systems. An investigation of the technological improvements provided by state-of-the-art integrated packages for micro-based systems development has been carried out. The work aimed to assess a suitable automated tools environment for the whole software life-cycle. The main safety functions of a nuclear power reactor, such as DNBR and kW/ft, have been implemented in a host-target approach. A prototype test-bed microsystem has been implemented to run the safety functions in order to derive a concrete evaluation of the feasibility of critical software according to the new technological trends of ''Software Factories''. (author)

  5. User-driven integrated software lives: ``Paleomag'' paleomagnetics analysis on the Macintosh

    Science.gov (United States)

    Jones, Craig H.

    2002-12-01

    "PaleoMag," a paleomagnetics analysis package originally developed for the Macintosh operating system in 1988, allows examination of demagnetization of individual samples and analysis of directional data from collections of samples. Prior to recent reinvigorated development of the software for both Macintosh and Windows, it was widely used despite not running properly on machines and operating systems sold after 1995. This somewhat surprising situation demonstrates that there is a continued need for integrated analysis software within the earth sciences, in addition to well-developed scripting and batch-mode software. One distinct advantage of software like PaleoMag is in the ability to combine quality control with analysis within a unique graphical environment. Because such demands are frequent within the earth sciences, means of nurturing the development of similar software should be found.

  6. TU-C-17A-03: An Integrated Contour Evaluation Software Tool Using Supervised Pattern Recognition for Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Chen, H; Tan, J; Kavanaugh, J; Dolly, S; Gay, H; Thorstad, W; Anastasio, M; Altman, M; Mutic, S; Li, H [Washington University School of Medicine, Saint Louis, MO (United States)

    2014-06-15

    Purpose: Radiotherapy (RT) contours delineated either manually or semiautomatically require verification before clinical usage. Manual evaluation is very time consuming. A new integrated software tool using supervised pattern contour recognition was thus developed to facilitate this process. Methods: The contouring tool was developed using an object-oriented programming language, C#, and application programming interfaces, e.g. the Visualization Toolkit (VTK). The C# language served as the tool design basis. The Accord.Net scientific computing libraries were utilized for the required statistical data processing and pattern recognition, while VTK was used to build and render 3-D mesh models of critical RT structures in real time with 360° visualization. Principal component analysis (PCA) was used for system self-updating of the geometry variations of normal structures, based on physician-approved RT contours as a training dataset. An in-house supervised PCA-based contour recognition method was used for automatically evaluating contour normality/abnormality. The function for reporting the contour evaluation results was implemented using C# and the Windows Form Designer. Results: The software input was RT simulation images and RT structures from commercial clinical treatment planning systems. Several abilities were demonstrated: automatic assessment of RT contours, file loading/saving of various modality medical images and RT contours, and generation/visualization of 3-D images and anatomical models. Moreover, it supported the 360° rendering of the RT structures in a multi-slice view, which allows physicians to visually check and edit abnormally contoured structures. Conclusion: This new software integrates the supervised learning framework with image processing and graphical visualization modules for RT contour verification. This tool has great potential for facilitating treatment planning with the assistance of an automatic contour evaluation module in avoiding
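
    The PCA-based normality check described above can be sketched as reconstruction-error anomaly detection: learn the principal modes of physician-approved contours, then flag a new contour whose reconstruction error is unusually large. A hedged Python sketch, with random placeholder data standing in for fixed-length contour feature vectors:

        import numpy as np

        rng = np.random.default_rng(0)
        training = rng.normal(size=(40, 60))     # 40 approved contours

        mean = training.mean(axis=0)
        U, S, Vt = np.linalg.svd(training - mean, full_matrices=False)
        modes = Vt[:5]                           # keep 5 principal modes

        def recon_error(contour):
            c = contour - mean
            return np.linalg.norm(c - modes.T @ (modes @ c))

        errors = [recon_error(t) for t in training]
        threshold = np.mean(errors) + 3 * np.std(errors)

        new_contour = rng.normal(size=60) * 3    # a deviant contour
        print(recon_error(new_contour) > threshold)  # True -> flag for review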

  7. Multimedia Bootcamp: a health sciences library provides basic training to promote faculty technology integration.

    Science.gov (United States)

    Ramsey, Ellen C

    2006-04-25

    Recent research has shown a backlash against the enthusiastic promotion of technological solutions as replacements for traditional educational content delivery. Many institutions, including the University of Virginia, have committed staff and resources to supporting state-of-the-art, showpiece educational technology projects. However, the Claude Moore Health Sciences Library has taken the approach of helping Health Sciences faculty be more comfortable using technology in incremental ways for instruction and research presentations. In July 2004, to raise awareness of self-service multimedia resources for instructional and professional development needs, the Library conducted a "Multimedia Bootcamp" for nine Health Sciences faculty and fellows. Case study. Program stewardship by a single Library faculty member contributed to the delivery of an integrated learning experience. The amount of time required to attend the sessions and complete homework was the maximum fellows had to devote to such pursuits. The benefit of introducing technology unfamiliar to most fellows allowed program instructors to start everyone at the same baseline while not appearing to pass judgment on the technology literacy skills of faculty. The combination of wrapping the program in the trappings of a fellowship and selecting fellows who could commit to a majority of scheduled sessions yielded strong commitment from participants as evidenced by high attendance and a 100% rate of assignment completion. Response rates to follow-up evaluation requests, as well as continued use of Media Studio resources and Library expertise for projects begun or conceived during Bootcamp, bode well for the long-term success of this program. An incremental approach to integrating technology with current practices in instruction and presentation provided a supportive yet energizing environment for Health Sciences faculty. Keys to this program were its faculty focus, traditional hands-on instruction, unrestricted

  8. Software quality assurance plan for the National Ignition Facility integrated computer control system

    International Nuclear Information System (INIS)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project

  9. FEATUREOUS: AN INTEGRATED ENVIRONMENT FOR FEATURE-CENTRIC ANALYSIS AND MODIFICATION OF OBJECT-ORIENTED SOFTWARE

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2011-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand the implementations of user-observable program features and their respective interdependencies. As feature-centric program understanding and modification are essential during software maintenance and evolution, this situation needs to change. In this paper, we present Featureous, an integrated development environment built on top of the NetBeans IDE that facilitates feature-centric analysis of object-oriented software. Our integrated development environment encompasses a lightweight feature location mechanism, a number of reusable analytical views, and necessary APIs for supporting future extensions. The base of the integrated development environment is a conceptual framework comprising three complementary dimensions of comprehension: perspective, abstraction...

  10. Integrated Payload Data Handling Systems Using Software Partitioning

    Science.gov (United States)

    Taylor, Alun; Hann, Mark; Wishart, Alex

    2015-09-01

    An integrated Payload Data Handling System (I-PDHS) is one in which multiple instruments share a central payload processor for their on-board data processing tasks. This offers a number of advantages over the conventional decentralised architecture. Savings in payload mass and power can be realised because the total processing resource is matched to the requirements, as opposed to the decentralised architecture, where the processing resource is in effect the sum of all the applications. Overall development cost can be reduced using a common processor. At the individual instrument level, the potential benefits include a standardised application development environment and the opportunity to run the instrument data handling application on a fully redundant and more powerful processing platform [1]. This paper describes a joint program by SCISYS UK Limited, Airbus Defence and Space, Imperial College London and RAL Space to implement a realistic demonstration of an I-PDHS using engineering models of flight instruments (a magnetometer and camera) and a laboratory demonstrator of a central payload processor which is functionally representative of a flight design. The objective is to raise the Technology Readiness Level of the centralised data processing technique by addressing the key areas of task partitioning to prevent fault propagation and the use of a common development process for the instrument applications. The project is supported by a UK Space Agency grant awarded under the National Space Technology Program SpaceCITI scheme.

  11. The dynamic of modern software development project management and the software crisis of quality. An integrated system dynamics approach towards software quality improvement

    OpenAIRE

    Nasirikaljahi, Armindokht

    2012-01-01

    The software industry is plagued by cost-overruns, delays, poor customer satisfaction and quality issues that are costing clients and customers world-wide billions of dollars each year. The phenomenon has been coined "The Software Crisis", and it poses a huge challenge for software project management. This thesis addresses one of the core issues of the software crisis, namely software quality. The challenges of software quality are central for understanding the other symptoms of the software crisis. Th...

  12. Assessment of the integration capability of system architectures from a complex and distributed software systems perspective

    Science.gov (United States)

    Leuchter, S.; Reinert, F.; Müller, W.

    2014-06-01

    Procurement and design of system architectures capable of network centric operations demand an assessment scheme in order to compare different alternative realizations. In this contribution an assessment method for system architectures targeted at the C4ISR domain is presented. The method addresses the integration capability of software systems from a complex and distributed software system perspective, focusing on communication, interfaces and software. The aim is to evaluate the capability to integrate a system or its functions within a system-of-systems network. This method uses approaches from software architecture quality assessment and applies them at the system architecture level. It features a specific goal tree of several dimensions that are relevant for enterprise integration. These dimensions have to be weighed against each other and totalized using methods from normative decision theory in order to reflect the intention of the particular enterprise integration effort. The indicators and measurements for many of the considered quality features rely on a model-based view of systems, networks, and the enterprise. That means it is applicable to system-of-systems specifications based on enterprise architectural frameworks relying on defined meta-models or domain ontologies for defining views and viewpoints. In the defense context we use the NATO Architecture Framework (NAF) to ground the respective system models. The proposed assessment method allows evaluating and comparing competing system designs regarding their future integration potential. It is a contribution to the system-of-systems engineering methodology.
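
    The weighting-and-totalization step described above is a conventional weighted sum once the goal-tree dimensions are scored. A minimal Python sketch with hypothetical dimension names, weights, and scores:

        weights = {"communication": 3.0, "interfaces": 2.0, "software": 1.0}
        total_w = sum(weights.values())

        candidates = {
            "arch_A": {"communication": 0.8, "interfaces": 0.6, "software": 0.7},
            "arch_B": {"communication": 0.6, "interfaces": 0.9, "software": 0.8},
        }

        # Normalized-weight totalization per candidate architecture.
        for name, scores in candidates.items():
            total = sum(weights[d] / total_w * scores[d] for d in weights)
            print(name, round(total, 3))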

  13. Integration of a Robotic Arm with the Surgical Assistant Workstation Software Framework

    OpenAIRE

    Young, J.; Elhawary, H.; Popovic, A.

    2012-01-01

    We have integrated the Philips Research robot arm with the Johns Hopkins University cisst library, an open-source platform for computer-assisted surgical intervention. The development of a Matlab to C++ wrapper to abstract away servo-level details facilitates the rapid development of a component-based framework with “plug and play” features. This allows the user to easily exchange the robot with an alternative manipulator while maintaining the same overall functionality.

  14. An integrated software testing framework for FGA-based controllers in nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Jae Yeob; Kim, Eun Sub; Yoo, Jun Beom; Lee, Young Jun; Choi, Jong Gyun

    2016-01-01

    Field-programmable gate arrays (FPGAs) have received much attention from the nuclear industry as an alternative platform to programmable logic controllers for digital instrumentation and control. The software aspect of FPGA development consists of several steps of synthesis and refinement, and also requires verification activities, such as simulations that are performed individually at each step. This study proposed an integrated software-testing framework for simulating all artifacts of the FPGA software development simultaneously and evaluating whether all artifacts work correctly using common oracle programs. This method also generates a massive number of meaningful simulation scenarios that reflect reactor shutdown logics. The experiment, which was performed on two FPGA software implementations, showed that it can dramatically save both time and costs
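
    The common-oracle idea above can be sketched as running every development artifact on the same generated scenarios and checking each against one reference program. The Python functions below are hypothetical stand-ins for the simulated FPGA artifacts and a shutdown-logic oracle:

        def oracle(temp, pressure):            # reference shutdown logic
            return temp > 310 or pressure > 155

        def artifact_rtl(temp, pressure):      # stand-in for RTL simulation
            return temp > 310 or pressure > 155

        def artifact_netlist(temp, pressure):  # stand-in for netlist simulation
            return temp > 310 or pressure > 155

        scenarios = [(300, 150), (312, 150), (300, 160), (315, 158)]
        for t, p in scenarios:
            expected = oracle(t, p)
            assert artifact_rtl(t, p) == expected
            assert artifact_netlist(t, p) == expected
        print("all artifacts agree with the oracle")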

  15. Using MDA for integration of heterogeneous components in software supply chains

    NARCIS (Netherlands)

    Hartmann, Johan Herman; Keren, Mila; Matsinger, Aart; Rubin, Julia; Trew, Tim; Yatzkar-Haham, Tali

    2013-01-01

    Software product lines are increasingly built using components from specialized suppliers. A company that is in the middle of a supply chain has to integrate components from its suppliers and offer (partially configured) products to its customers. To satisfy both the variability required by each

  16. On Integrating Student Empirical Software Engineering Studies with Research and Teaching Goals

    NARCIS (Netherlands)

    Galster, Matthias; Tofan, Dan; Avgeriou, Paris

    2012-01-01

    Background: Many empirical software engineering studies use students as subjects and are conducted as part of university courses. Aim: We aim at reporting our experiences with using guidelines for integrating empirical studies with our research and teaching goals. Method: We document our experience

  17. Integrating communication protocol selection with partitioning in hardware/software codesign

    DEFF Research Database (Denmark)

    Knudsen, Peter Voigt; Madsen, Jan

    1998-01-01

    frequencies of system components such as buses, CPUs, ASICs, software code size, hardware area, and component prices. A distinct feature of the model is the modeling of driver processing of data (packing, splitting, compression, etc.) and its impact on communication throughput. The integration...

  18. The Rapid Integration and Test Environment - A Process for Achieving Software Test Acceptance

    OpenAIRE

    Jack, Rick

    2010-01-01

    Proceedings Paper (for Acquisition Research Program) Approved for public release; distribution unlimited. The Rapid Integration and Test Environment (RITE) initiative, implemented by the Program Executive Office, Command, Control, Communications, Computers and Intelligence, Command and Control Program Office (PMW-150), was born of necessity. Existing processes for requirements definition and management, as well as those for software development, did not consistently deliver high-qualit...

  19. The impact of continuous integration on other software development practices: a large-scale empirical study

    NARCIS (Netherlands)

    Zhao, Y.; Serebrenik, A.; Zhou, Y.; Filkov, V.; Vasilescu, B.N.

    2017-01-01

    Continuous Integration (CI) has become a disruptive innovation in software development: with proper tool support and adoption, positive effects have been demonstrated for pull request throughput and scaling up of project sizes. As with any other innovation, adopting CI implies adapting existing practices

  20. An Integrated Platform for Dynamic Software Updating and its Application in Self-* systems

    DEFF Research Database (Denmark)

    Gregersen, Allan Raundahl; Jørgensen, Bo Nørregaard; Hadaytullah

    2012-01-01

    Practical dynamic updating of modern Java applications requires tool support to become an integral part of the software development and maintenance lifecycle. In this paper we present Javeleon, an easy-to-use tool for dynamic updates of Java applications. To support integration with specific...... frameworks, component systems and application servers, Javeleon currently provides tight integration with the NetBeans Platform, facilitating dynamic updating for applications built on top of the NetBeans Platform in an unconstrained manner. Javeleon supports state-preserving unanticipated runtime evolution...

  1. An intelligent and integrated V and V environment design for NPP I and C software systems

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Son Han Seong; Seong, Poong Hyun

    2001-01-01

    Nuclear power plants (NPPs) are safety-critical systems. Since nuclear instrumentation and control (I and C) systems, including the plant protection system, play the role of the plant's brain, they directly influence the safety and operation of an NPP. Software V and V should therefore be performed for software-based safety-critical systems; it is also very important in the technical aspect because of problems concerning license acquisition. In this work, an intelligent and integrated V and V environment supporting the automation of V and V was designed. The environment consists of an intelligent controller part, a components part, an interface part, and a GUI part. These parts were integrated systematically, while keeping their own independent functions.

  2. Power, Avionics and Software - Phase 1.0:. [Subsystem Integration Test Report

    Science.gov (United States)

    Ivancic, William D.; Sands, Obed S.; Bakula, Casey J.; Oldham, Daniel R.; Wright, Ted; Bradish, Martin A.; Klebau, Joseph M.

    2014-01-01

    This report describes Power, Avionics and Software (PAS) 1.0 subsystem integration testing and test results that occurred in August and September of 2013. This report covers the capabilities of each PAS assembly to meet integration test objectives for non-safety critical, non-flight, non-human-rated hardware and software development. This test report is the outcome of the first integration of the PAS subsystem and is meant to provide data for subsequent designs, development and testing of the future PAS subsystems. The two main objectives were to assess the ability of the PAS assemblies to exchange messages and to perform audio testing of both inbound and outbound channels. This report describes each test performed, defines the test, the data, and provides conclusions and recommendations.

  3. Status of software for PGNAA bulk analysis by the Monte Carlo - Library Least-Squares (MCLLS) approach

    International Nuclear Information System (INIS)

    Gardner, R.P.; Zhang, W.; Metwally, W.A.

    2005-01-01

    The Center for Engineering Applications of Radioisotopes (CEAR) has been working for about ten years on the Monte Carlo - Library Least-Squares (MCLLS) approach for treating the nonlinear inverse analysis problem in PGNAA bulk analysis. This approach consists essentially of using Monte Carlo simulation to generate the libraries of all the elements to be analyzed plus any other required libraries. These libraries are then used in the linear Library Least-Squares (LLS) approach with unknown sample spectra to analyze for all elements in the sample. The other libraries include all sources of background, which include: (1) gamma-rays emitted by the neutron source, (2) prompt gamma-rays produced in the analyzer construction materials, (3) natural gamma-rays from K-40 and the uranium and thorium decay chains, and (4) prompt and decay gamma-rays produced in the NaI detector by neutron activation. A number of unforeseen problems have arisen in pursuing this approach, including: (1) the neutron activation of the most common detector (NaI) used in bulk analysis PGNAA systems, (2) the nonlinearity of this detector, and (3) difficulties in obtaining detector response functions for this (and other) detectors. These problems have been addressed by CEAR recently and have either been solved or are almost solved at the present time. Development of Monte Carlo simulation for all of the libraries has been finished except for the prompt gamma-ray library from the activation of the NaI detector. Treatment of the coincidence schemes for Na and particularly I must first be determined to complete the Monte Carlo simulation of this last library. (author)
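
    For illustration, the linear Library Least-Squares step described above reduces to a non-negative least-squares problem: the unknown spectrum is modeled as a non-negative combination of the element and background library spectra. The following minimal Python sketch uses synthetic data; the component names, channel count and noise level are illustrative assumptions, not CEAR's actual implementation.

        # Minimal sketch of the linear Library Least-Squares (LLS) step,
        # assuming pre-computed library spectra; all data are illustrative.
        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(0)
        n_channels = 512

        # Hypothetical library spectra (rows: e.g. Fe, Si, background).
        libraries = rng.random((3, n_channels))

        # Synthesize an "unknown" sample spectrum as a known mixture plus noise.
        true_fractions = np.array([0.6, 0.3, 0.1])
        sample = true_fractions @ libraries + 0.01 * rng.random(n_channels)

        # Solve sample ~= libraries.T @ x for x >= 0 (amounts cannot be negative).
        x, residual = nnls(libraries.T, sample)
        print("estimated amounts:", x, "residual norm:", residual)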

  4. Westinghouse integrated protection system. An overview of the software design and maintenance features

    International Nuclear Information System (INIS)

    Gibson, R.J.

    1995-01-01

    The Westinghouse Integrated Protection System was designed with the goal of providing a system which can be easily verified, validated, and maintained. The software design and structure promote the ease of translation from functional requirements to applications function software while also improving the ability to verify and maintain the applications function software. The use of independent, reusable, common functions software modules focuses the design, verification, and validation of the software and reduces the likelihood of errors occurring during the application and maintenance of the software. The simple continuous loop method of operation used throughout the IPS provides a standard deterministic method of operation. The IPS design also incorporates the use of embedded self-diagnostics to perform continuous hardware oriented tests of the system and the use of an independent subsystem to automatically perform a functional test of the system. Maintenance interfaces also exist to readily identify and locate faults as well as providing other maintenance capabilities. These testing and maintenance features enhance the overall reliability and availability of the system. (orig.) (2 refs., 2 figs.)

  5. Integrated software health management for aerospace guidance, navigation, and control systems: A probabilistic reasoning approach

    Science.gov (United States)

    Mbaya, Timmy

    Embedded Aerospace Systems have to perform safety and mission critical operations in a real-time environment where timing and functional correctness are extremely important. Guidance, Navigation, and Control (GN&C) systems substantially rely on complex software interfacing with hardware in real-time; any faults in software or hardware, or their interaction could result in fatal consequences. Integrated Software Health Management (ISWHM) provides an approach for detection and diagnosis of software failures while the software is in operation. The ISWHM approach is based on probabilistic modeling of software and hardware sensors using a Bayesian network. To meet memory and timing constraints of real-time embedded execution, the Bayesian network is compiled into an Arithmetic Circuit, which is used for on-line monitoring. This type of system monitoring, using an ISWHM, provides automated reasoning capabilities that compute diagnoses in a timely manner when failures occur. This reasoning capability enables time-critical mitigating decisions and relieves the human agent from the time-consuming and arduous task of foraging through a multitude of isolated---and often contradictory---diagnosis data. For the purpose of demonstrating the relevance of ISWHM, modeling and reasoning is performed on a simple simulated aerospace system running on a real-time operating system emulator, the OSEK/Trampoline platform. Models for a small satellite and an F-16 fighter jet GN&C (Guidance, Navigation, and Control) system have been implemented. Analysis of the ISWHM is then performed by injecting faults and analyzing the ISWHM's diagnoses.
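
    To make the probabilistic reasoning step concrete, the sketch below computes a fault posterior for a single health sensor by direct enumeration of a two-node Bayesian network. All probabilities are invented for illustration; the actual ISWHM models are compiled into arithmetic circuits for real-time use, which this sketch does not attempt.

        # Minimal sketch of Bayesian health diagnosis by direct enumeration;
        # probabilities are illustrative, not taken from the ISWHM models.
        prior_fault = 0.01                 # P(software fault)
        p_alarm_given_fault = 0.95         # sensor fires when a fault is present
        p_alarm_given_ok = 0.05            # false-alarm rate

        def posterior_fault(alarm: bool) -> float:
            """P(fault | alarm observation) via Bayes' rule."""
            like_fault = p_alarm_given_fault if alarm else 1 - p_alarm_given_fault
            like_ok = p_alarm_given_ok if alarm else 1 - p_alarm_given_ok
            num = like_fault * prior_fault
            return num / (num + like_ok * (1 - prior_fault))

        print(posterior_fault(alarm=True))   # ~0.16: one alarm is weak evidence
        print(posterior_fault(alarm=False))  # ~0.0005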

  6. CDApps: integrated software for experimental planning and data processing at beamline B23, Diamond Light Source.

    Science.gov (United States)

    Hussain, Rohanah; Benning, Kristian; Javorfi, Tamas; Longo, Edoardo; Rudd, Timothy R; Pulford, Bill; Siligardi, Giuliano

    2015-03-01

    The B23 Circular Dichroism beamline at Diamond Light Source has been operational since 2009 and has seen visits from more than 200 user groups, who have generated large amounts of data. Based on the experience of overseeing the users' progress at B23, four key areas requiring the most assistance are identified: planning of experiments and note-keeping; designing titration experiments; processing and analysis of the collected data; and production of experimental reports. To streamline these processes, an integrated software package has been developed and made available for the users. This article summarizes the main features of the software.

  7. GRO/EGRET data analysis software: An integrated system of custom and commercial software using standard interfaces

    Science.gov (United States)

    Laubenthal, N. A.; Bertsch, D.; Lal, N.; Etienne, A.; Mcdonald, L.; Mattox, J.; Sreekumar, P.; Nolan, P.; Fierro, J.

    1992-01-01

    The Energetic Gamma Ray Telescope Experiment (EGRET) on the Compton Gamma Ray Observatory has been in orbit for more than a year and is being used to map the full sky for gamma rays in a wide energy range from 30 to 20,000 MeV. Already these measurements have resulted in a wide range of exciting new information on quasars, pulsars, galactic sources, and diffuse gamma ray emission. The central part of the analysis is done with sky maps that typically cover an 80 x 80 degree section of the sky for an exposure time of several days. Specific software developed for this program generates the counts, exposure, and intensity maps. The analysis is done on a network of UNIX based workstations and takes full advantage of a custom-built user interface called X-dialog. The maps that are generated are stored in the FITS format for a collection of energies. These, along with similar diffuse emission background maps generated from a model calculation, serve as input to a maximum likelihood program that produces maps of likelihood with optional contours that are used to evaluate regions for sources. Likelihood also evaluates the background corrected intensity at each location for each energy interval from which spectra can be generated. Being in a standard FITS format permits all of the maps to be easily accessed by the full complement of tools available in several commercial astronomical analysis systems. In the EGRET case, IDL is used to produce graphics plots in two and three dimensions and to quickly implement any special evaluation that might be desired. Other custom-built software, such as the spectral and pulsar analyses, take advantage of the XView toolkit for display and Postscript output for the color hard copy. This poster paper outlines the data flow and provides examples of the user interfaces and output products. It stresses the advantages that are derived from the integration of the specific instrument-unique software and powerful commercial tools for graphics and

  8. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes an evaluation method for the faultless function of large scale integration (LSI) and very large scale integration (VLSI) circuits. The article provides a comparative analysis of the factors that determine the faultlessness of integrated circuits, an analysis of existing methods, and a model for evaluating the faultless function of LSI and VLSI circuits. The main part describes a proposed algorithm and program for analysing the fault rate in LSI and VLSI circuits.

  9. RSYST: From nuclear reactor calculations towards a highly sophisticated scientific software integration environment

    International Nuclear Information System (INIS)

    Noack, M.; Seybold, J.; Ruehle, R.

    1996-01-01

    The software environment RSYST was originally used to solve problems of reactor physics. The consideration of advanced scientific simulation requirements and the strict application of modern software design principles led to a system which is well suited to solving problems in various complex scientific problem domains. Starting with a review of the early days of RSYST, we describe its evolution, driven by the need for a software environment which combines the advantages of a high-performance database system with the capability to integrate sophisticated scientific-technical applications. The RSYST architecture is presented and the data modelling capabilities are described. To demonstrate the powerful possibilities and flexibility of the RSYST environment, we describe a wide range of RSYST applications, e.g., mechanical simulations of multibody systems, which are used in biomechanical research, civil engineering and robotics. In addition, a hypermedia system which is used for scientific-technical training and documentation is presented. (orig.) [de]

  10. Integrated software environment dedicated for implementation of control systems based on PLC controllers

    Directory of Open Access Journals (Sweden)

    Szymon SURMA

    2007-01-01

    Full Text Available Industrial processes' control systems based on PLC controllers play today a very important role in all fields of transport, including sea transport. Construction of control systems is a field of engineering which has been continuously evolving towards maximum simplification of the system design path. Up to now, the time needed for system construction from design to commissioning had to be divided into a few stages. A mistake made in an earlier stage meant that, in most cases, the subsequent stages had to be restarted. Available debugging systems allow defect detection at an early stage of project implementation. The paper presents the general characteristics of integrated software for the implementation of complex control systems. The issues related to the use of the software for programming the visualisation environment and the control computer, the selection of transmission medium and transmission protocol, as well as PLC controllers' configuration, software and control, have been analysed.

  11. Software quality assurance plan for the National Ignition Facility integrated computer control system

    Energy Technology Data Exchange (ETDEWEB)

    Woodruff, J.

    1996-11-01

    Quality achievement is the responsibility of the line organizations of the National Ignition Facility (NIF) Project. This Software Quality Assurance Plan (SQAP) applies to the activities of the Integrated Computer Control System (ICCS) organization and its subcontractors. The Plan describes the activities implemented by the ICCS section to achieve quality in the NIF Project's controls software and implements the NIF Quality Assurance Program Plan (QAPP, NIF-95-499, L-15958-2) and the Department of Energy's (DOE's) Order 5700.6C. This SQAP governs the quality affecting activities associated with developing and deploying all control system software during the life cycle of the NIF Project.

  12. A combined approach of AHP and TOPSIS methods applied in the field of integrated software systems

    Science.gov (United States)

    Berdie, A. D.; Osaci, M.; Muscalagiu, I.; Barz, C.

    2017-05-01

    Adopting the most appropriate technology for developing applications on an integrated software system for enterprises may result in great savings in both cost and hours of work. This paper proposes a research study for the determination of a hierarchy between three SAP (System Applications and Products in Data Processing) technologies. The technologies Web Dynpro (WD), Floorplan Manager (FPM) and CRM WebClient UI (CRM WCUI) are evaluated against multiple criteria in terms of the performance obtained through the implementation of the same web business application. To establish the hierarchy, a multi-criteria analysis model that combines the AHP (Analytic Hierarchy Process) and the TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) methods was proposed. This model was built with the help of the SuperDecision software. This software is based on the AHP method and determines the weights for the selected sets of criteria. The TOPSIS method was used to obtain the final ranking and the technology hierarchy.
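
    As an illustration of the combined method, the sketch below derives criteria weights with AHP's principal-eigenvector step and then ranks three alternatives with TOPSIS. The pairwise-comparison matrix and decision matrix are fabricated; they merely stand in for the SAP technology scores used in the study.

        # Minimal sketch of AHP weighting followed by TOPSIS ranking;
        # all matrices are illustrative.
        import numpy as np

        # AHP: criteria weights from the principal eigenvector of a
        # pairwise comparison matrix.
        pairwise = np.array([[1, 3, 5],
                             [1/3, 1, 2],
                             [1/5, 1/2, 1]], dtype=float)
        eigvals, eigvecs = np.linalg.eig(pairwise)
        w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
        weights = w / w.sum()

        # TOPSIS: rank three alternatives against three benefit criteria.
        decision = np.array([[7, 9, 6],    # e.g. Web Dynpro
                             [8, 7, 8],    # e.g. FPM
                             [9, 6, 7]],   # e.g. CRM WCUI
                            dtype=float)
        norm = decision / np.linalg.norm(decision, axis=0)
        v = norm * weights
        ideal, anti = v.max(axis=0), v.min(axis=0)
        d_pos = np.linalg.norm(v - ideal, axis=1)
        d_neg = np.linalg.norm(v - anti, axis=1)
        closeness = d_neg / (d_pos + d_neg)
        print("ranking (best first):", np.argsort(-closeness))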

  13. ImatraNMR: Novel software for batch integration and analysis of quantitative NMR spectra

    Science.gov (United States)

    Mäkelä, A. V.; Heikkilä, O.; Kilpeläinen, I.; Heikkinen, S.

    2011-08-01

    Quantitative NMR spectroscopy is a useful and important tool for the analysis of various mixtures. Recently, in addition to traditional quantitative 1D 1H and 13C NMR methods, a variety of pulse sequences aimed at quantitative or semiquantitative analysis have been developed. To obtain usable results from quantitative spectra, they must be processed and analyzed with suitable software. Currently, there are many processing packages available from spectrometer manufacturers and third-party developers, and most of them are capable of analyzing and integrating quantitative spectra. However, they are mainly aimed at processing single or few spectra, and are slow and difficult to use when large numbers of spectra and signals are being analyzed, even when using pre-saved integration areas or custom scripting features. In this article, we present a novel software tool, ImatraNMR, designed for batch analysis of quantitative spectra. In addition to the capability of analyzing a large number of spectra, it provides results in text and CSV formats, allowing further data analysis using spreadsheet programs or general analysis programs such as Matlab. The software is written in Java, and thus it should run on any platform capable of providing Java Runtime Environment version 1.6 or newer; however, it has currently been tested only with Windows and Linux (Ubuntu 10.04). The software is free for non-commercial use, and is provided with source code upon request.
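
    The kind of batch processing ImatraNMR automates can be pictured with the short sketch below, which integrates fixed ppm regions across many spectra and writes a CSV table. The region names, ranges and synthetic spectra are assumptions for illustration; the real software is considerably more capable.

        # Minimal sketch of batch signal integration over fixed regions with
        # CSV output; file names, regions and spectra are assumptions.
        import csv
        import numpy as np

        regions = {"signal_A": (3.10, 3.30), "signal_B": (1.15, 1.35)}  # ppm

        def integrate(ppm, intensity, lo, hi):
            mask = (ppm >= lo) & (ppm <= hi)
            return float(np.trapz(intensity[mask], ppm[mask]))

        # Pretend we have many spectra; here two synthetic ones.
        spectra = {f"spectrum_{i}": (np.linspace(0, 10, 4096),
                                     np.random.default_rng(i).random(4096))
                   for i in range(2)}

        with open("integrals.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["spectrum"] + list(regions))
            for name, (ppm, intensity) in spectra.items():
                writer.writerow([name] + [integrate(ppm, intensity, lo, hi)
                                          for lo, hi in regions.values()])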

  14. A Comparison of Various Software Development Methodologies: Feasibility and Methods of Integration

    Directory of Open Access Journals (Sweden)

    Samir Abou El-Seoud

    2016-12-01

    Full Text Available System development methodologies that have been used in academic and commercial environments during the last two decades have advantages and disadvantages. Researchers have tried to identify the objectives, scope, etc. of the methodologies by following different approaches. Each approach has its limitations, specific interests, coverage, etc. In this paper, we perform a comparative study of those methodologies which are popular and commonly used in banking and commercial environments. In our study we tried to determine the objectives, scope, tools and other features of the methodologies. We also tried to determine how, and to what extent, the methodologies incorporate facilities such as project management, cost-benefit analysis, documentation, etc. One of the most important aspects of our study was how to integrate the methodologies and develop a global methodology which covers the complete span of the software development life cycle. A prototype system which integrates the selected methodologies has been developed. The developed system helps analysts and designers choose suitable tools or obtain guidelines on what to do in a particular situation. The prototype system was tested during the development of software for an ATM “Auto Teller Machine” by selecting and applying the SASD methodology during software development. This resulted in the development of a high-quality and well-documented software system.

  15. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools

  16. Kedalion: NASA's Adaptable and Agile Hardware/Software Integration and Test Lab

    Science.gov (United States)

    Mangieri, Mark L.; Vice, Jason

    2011-01-01

    NASA's Kedalion engineering analysis lab at Johnson Space Center is on the forefront of validating and using many contemporary avionics hardware/software development and integration techniques, which represent new paradigms to heritage NASA culture. Kedalion has validated many of the Orion hardware/software engineering techniques borrowed from the adjacent commercial aircraft avionics solution space, with the intention to build upon such techniques to better align with today's aerospace market. Using agile techniques, commercial products, early rapid prototyping, in-house expertise and tools, and customer collaboration, Kedalion has demonstrated that cost effective contemporary paradigms hold the promise to serve future NASA endeavors within a diverse range of system domains. Kedalion provides a readily adaptable solution for medium/large scale integration projects. The Kedalion lab is currently serving as an in-line resource for the project and the Multipurpose Crew Vehicle (MPCV) program.

  17. SALOME. A software integration platform for multi-physics, pre-processing and visualisation

    International Nuclear Information System (INIS)

    Bergeaud, Vincent; Lefebvre, Vincent

    2010-01-01

    In order to ease the development of applications integrating simulation codes, CAD modelers and post-processing tools, CEA and EDF R and D have invested in the SALOME platform, a tool dedicated to the environment of scientific codes. The platform comes in the shape of a toolbox which offers functionalities for CAD, meshing, code coupling, visualization and GUI development. These tools can be combined to create integrated applications that make the scientific codes easier to use and well interfaced with their environment, be it other codes, CAD and meshing tools or visualization software. Many projects in CEA and EDF R and D now use SALOME, bringing technical coherence to the software suites of our institutions. (author)

  18. Looking Toward the Future of Library Technology. The Systems Librarian

    Science.gov (United States)

    Breeding, Marshall

    2005-01-01

    This article discusses trends in five areas relating to software developed for libraries, and based on these trends, the author's predictions for developments that might play out in the next few years. The author's predictions, based on his own empirical observations, include: (1) the integrated library system (ILS) will be reintegrated; (2) the…

  19. Development, validation and integration of the ATLAS Trigger System software in Run 2

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00377077; The ATLAS collaboration

    2017-01-01

    The trigger system of the ATLAS detector at the LHC is a combination of hardware, firmware, and software, associated to various sub-detectors that must seamlessly cooperate in order to select one collision of interest out of every 40,000 delivered by the LHC every millisecond. These proceedings discuss the challenges, organization and work flow of the ongoing trigger software development, validation, and deployment. The goal of this development is to ensure that the most up-to-date algorithms are used to optimize the performance of the experiment. The goal of the validation is to ensure the reliability and predictability of the software performance. Integration tests are carried out to ensure that the software deployed to the online trigger farm during data-taking runs as desired. Trigger software is validated by emulating online conditions using a benchmark run and mimicking the reconstruction that occurs during normal data-taking. This exercise is computationally demanding and thus runs on the ATLAS high performance computing grid with high priority.

  20. Development, Validation and Integration of the ATLAS Trigger System Software in Run 2

    Science.gov (United States)

    Keyes, Robert; ATLAS Collaboration

    2017-10-01

    The trigger system of the ATLAS detector at the LHC is a combination of hardware, firmware, and software, associated to various sub-detectors that must seamlessly cooperate in order to select one collision of interest out of every 40,000 delivered by the LHC every millisecond. These proceedings discuss the challenges, organization and work flow of the ongoing trigger software development, validation, and deployment. The goal of this development is to ensure that the most up-to-date algorithms are used to optimize the performance of the experiment. The goal of the validation is to ensure the reliability and predictability of the software performance. Integration tests are carried out to ensure that the software deployed to the online trigger farm during data-taking runs as desired. Trigger software is validated by emulating online conditions using a benchmark run and mimicking the reconstruction that occurs during normal data-taking. This exercise is computationally demanding and thus runs on the ATLAS high performance computing grid with high priority. Performance metrics ranging from low-level memory and CPU requirements, to distributions and efficiencies of high-level physics quantities are visualized and validated by a range of experts. This is a multifaceted critical task that ties together many aspects of the experimental effort and thus directly influences the overall performance of the ATLAS experiment.

  1. An integrated development framework for rapid development of platform-independent and reusable satellite on-board software

    Science.gov (United States)

    Ziemke, Claas; Kuwahara, Toshinori; Kossev, Ivan

    2011-09-01

    Even in the field of small satellites, the on-board data handling subsystem has become complex and powerful. With the introduction of powerful CPUs and the availability of considerable amounts of memory on board a small satellite, it has become possible to utilize the flexibility and power of contemporary platform-independent real-time operating systems. The non-commercial sector in particular, such as university institutes and community projects like AMSAT or SSETI, is characterized by an inherent lack of financial as well as manpower resources. The opportunity to utilize such real-time operating systems will contribute significantly to achieving a successful mission. Nevertheless, the on-board software of a satellite is much more than just an operating system. It has to fulfill a multitude of functional requirements, such as: telecommand interpretation and execution, execution of control loops, generation of telemetry data and frames, failure detection isolation and recovery, communication with peripherals, and so on. Most of the aforementioned tasks are of a generic nature and have to be conducted on any satellite with only minor modifications. A general set of functional requirements, as well as a protocol for communication, is defined in the ESA ECSS-E-70-41A standard "Telemetry and telecommand packet utilization". This standard not only defines the communication protocol of the satellite-ground link, but also defines a set of so-called services which have to be available on board every compliant satellite and which are of a generic nature. In this paper, a platform-independent and reusable framework is described which implements not only the ECSS-E-70-41A standard but also functionalities for interprocess communication, scheduling and a multitude of tasks commonly performed on board a satellite. By making use of the capabilities of the high-level programming language C/C++, the powerful open source library BOOST, the real-time operating system RTEMS and
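
    As a small illustration of the packet handling such a framework must provide, the sketch below decodes the 6-byte CCSDS space packet primary header on which the ECSS-E-70-41A packet utilization standard builds. The example bytes are fabricated, and real telecommand processing would additionally parse the PUS secondary header and verify checksums.

        # Minimal sketch of decoding a CCSDS space packet primary header;
        # the packet bytes are fabricated.
        import struct

        def parse_primary_header(raw: bytes) -> dict:
            word0, seq, length = struct.unpack(">HHH", raw[:6])
            return {
                "version":      (word0 >> 13) & 0x7,
                "type":         (word0 >> 12) & 0x1,   # 1 = telecommand
                "sec_hdr_flag": (word0 >> 11) & 0x1,
                "apid":          word0 & 0x7FF,
                "seq_flags":    (seq >> 14) & 0x3,
                "seq_count":     seq & 0x3FFF,
                "data_length":   length + 1,           # field stores length - 1
            }

        packet = bytes([0x18, 0x2A, 0xC0, 0x01, 0x00, 0x05]) + b"\x00" * 6
        print(parse_primary_header(packet))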

  2. Molecular radiotherapy: The NUKFIT software for calculating the time-integrated activity coefficient

    Energy Technology Data Exchange (ETDEWEB)

    Kletting, P.; Schimmel, S.; Luster, M. [Klinik für Nuklearmedizin, Universität Ulm, Ulm 89081 (Germany); Kestler, H. A. [Research Group Bioinformatics and Systems Biology, Institut für Neuroinformatik, Universität Ulm, Ulm 89081 (Germany); Hänscheid, H.; Fernández, M.; Lassmann, M. [Klinik für Nuklearmedizin, Universität Würzburg, Würzburg 97080 (Germany); Bröer, J. H.; Nosske, D. [Bundesamt für Strahlenschutz, Fachbereich Strahlenschutz und Gesundheit, Oberschleißheim 85764 (Germany); Glatting, G. [Medical Radiation Physics/Radiation Protection, Medical Faculty Mannheim, Heidelberg University, Mannheim 68167 (Germany)

    2013-10-15

    Purpose: Calculation of the time-integrated activity coefficient (residence time) is a crucial step in dosimetry for molecular radiotherapy. However, available software is deficient in that it is either not tailored for the use in molecular radiotherapy and/or does not include all required estimation methods. The aim of this work was therefore the development and programming of an algorithm which allows for an objective and reproducible determination of the time-integrated activity coefficient and its standard error. Methods: The algorithm includes the selection of a set of fitting functions from predefined sums of exponentials and the choice of an error model for the used data. To estimate the values of the adjustable parameters an objective function, depending on the data, the parameters of the error model, the fitting function and (if required and available) Bayesian information, is minimized. To increase reproducibility and user-friendliness the starting values are automatically determined using a combination of curve stripping and random search. Visual inspection, the coefficient of determination, the standard error of the fitted parameters, and the correlation matrix are provided to evaluate the quality of the fit. The functions which are most supported by the data are determined using the corrected Akaike information criterion. The time-integrated activity coefficient is estimated by analytically integrating the fitted functions. Its standard error is determined assuming Gaussian error propagation. The software was implemented using MATLAB. Results: To validate the proper implementation of the objective function and the fit functions, the results of NUKFIT and SAAM numerical, a commercially available software tool, were compared. The automatic search for starting values was successfully tested for reproducibility. The quality criteria applied in conjunction with the Akaike information criterion allowed the selection of suitable functions. Function fit
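
    The core computation can be illustrated briefly: fit a sum of exponentials to time-activity data, then integrate the fitted function analytically, since the integral of Σ A_i exp(-λ_i t) from zero to infinity is Σ A_i/λ_i. The sketch below uses fabricated data and a fixed bi-exponential; NUKFIT's model selection (corrected Akaike information criterion), error models and automatic starting-value search are omitted.

        # Minimal sketch of a bi-exponential fit and its analytic integral;
        # data and starting values are illustrative.
        import numpy as np
        from scipy.optimize import curve_fit

        t = np.array([1.0, 4.0, 24.0, 48.0, 96.0])    # h after administration
        a = np.array([0.80, 0.65, 0.35, 0.20, 0.07])  # fraction of injected activity

        def biexp(t, a1, l1, a2, l2):
            return a1 * np.exp(-l1 * t) + a2 * np.exp(-l2 * t)

        p, _ = curve_fit(biexp, t, a, p0=[0.5, 0.5, 0.5, 0.01],
                         bounds=(0, np.inf))
        a1, l1, a2, l2 = p

        # Time-integrated activity coefficient: integral of the fit, 0 to infinity.
        tiac = a1 / l1 + a2 / l2
        print(f"time-integrated activity coefficient ~ {tiac:.1f} h")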

  3. The EGSE science software of the IBIS instrument on-board INTEGRAL satellite

    International Nuclear Information System (INIS)

    La Rosa, Giovanni; Fazio, Giacomo; Segreto, Alberto; Gianotti, Fulvio; Stephen, John; Trifoglio, Massimo

    2000-01-01

    IBIS (Imager on Board INTEGRAL Satellite) is one of the key instruments on board the INTEGRAL satellite, the follow-up mission to the high energy missions CGRO and Granat. The EGSE of IBIS is composed of a Satellite Interface Simulator, a Control Station and a Science Station. The solutions adopted for the architectural design of the software running on the Science Station are described here. Some preliminary results are used to show the science functionality that allowed the instrument behavior to be understood throughout the test and calibration campaigns of the Engineering Model of IBIS.

  4. Integrating optical, mechanical, and test software (with applications to freeform optics)

    Science.gov (United States)

    Genberg, Victor; Michels, Gregory; Myer, Brian

    2017-10-01

    Optical systems must perform under environmental conditions including thermal and mechanical loading. To predict the performance in the field, integrated analysis combining optical and mechanical software is required. Freeform and conformal optics offer many new opportunities for optical design. The unconventional geometries can lead to unconventional, and therefore unintuitive, mechanical behavior. Finite element (FE) analysis offers the ability to predict the deformations of freeform optics under various environments and load conditions. To understand the impact on optical performance, the deformations must be brought into optical analysis codes. This paper discusses several issues related to the integrated optomechanical analysis of freeform optics.

  5. NuSEE: an integrated environment of software specification and V and V for PLC based safety-critical systems

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Seong, Poong Hyun; Yoo, Jun Beom; Cha, Sung Deok; Youn, Cheong; Han, Hyun Chul

    2006-01-01

    As the use of digital systems becomes more prevalent, adequate techniques for software specification and analysis have become increasingly important in Nuclear Power Plant (NPP) safety-critical systems. Additionally, the importance of software Verification and Validation (V and V) based on adequate specification has received greater emphasis in view of improving software quality. For thorough V and V of safety-critical systems, V and V should be performed throughout the software lifecycle. However, systematic V and V is difficult as it involves many manual-oriented tasks. Tool support is needed in order to more conveniently perform software V and V. In response, we developed four kinds of Computer Aided Software Engineering (CASE) tools to support system specification for a formal-based analysis according to the software lifecycle. In this work, we achieved optimized integration of each tool. The toolset, NuSEE, is an integrated environment for software specification and V and V for PLC based safety-critical systems. In accordance with the software lifecycle, NuSEE consists of NuSISRT for the concept phase, NuSRS for the requirements phase, NuSDS for the design phase and NuSCM for configuration management. It is believed that after further development our integrated environment will be a unique and promising software specification and analysis toolset that will support the entire software lifecycle for the development of PLC based NPP safety-critical systems

  6. Molecular radiotherapy: the NUKFIT software for calculating the time-integrated activity coefficient.

    Science.gov (United States)

    Kletting, P; Schimmel, S; Kestler, H A; Hänscheid, H; Luster, M; Fernández, M; Bröer, J H; Nosske, D; Lassmann, M; Glatting, G

    2013-10-01

    Calculation of the time-integrated activity coefficient (residence time) is a crucial step in dosimetry for molecular radiotherapy. However, available software is deficient in that it is either not tailored for the use in molecular radiotherapy and/or does not include all required estimation methods. The aim of this work was therefore the development and programming of an algorithm which allows for an objective and reproducible determination of the time-integrated activity coefficient and its standard error. The algorithm includes the selection of a set of fitting functions from predefined sums of exponentials and the choice of an error model for the used data. To estimate the values of the adjustable parameters an objective function, depending on the data, the parameters of the error model, the fitting function and (if required and available) Bayesian information, is minimized. To increase reproducibility and user-friendliness the starting values are automatically determined using a combination of curve stripping and random search. Visual inspection, the coefficient of determination, the standard error of the fitted parameters, and the correlation matrix are provided to evaluate the quality of the fit. The functions which are most supported by the data are determined using the corrected Akaike information criterion. The time-integrated activity coefficient is estimated by analytically integrating the fitted functions. Its standard error is determined assuming Gaussian error propagation. The software was implemented using MATLAB. To validate the proper implementation of the objective function and the fit functions, the results of NUKFIT and SAAM numerical, a commercially available software tool, were compared. The automatic search for starting values was successfully tested for reproducibility. The quality criteria applied in conjunction with the Akaike information criterion allowed the selection of suitable functions. Function fit parameters and their standard

  7. Synthesis of a drug-like focused library of trisubstituted pyrrolidines using integrated flow chemistry and batch methods.

    Science.gov (United States)

    Baumann, Marcus; Baxendale, Ian R; Kuratli, Christoph; Ley, Steven V; Martin, Rainer E; Schneider, Josef

    2011-07-11

    A combination of flow and batch chemistries has been successfully applied to the assembly of a series of trisubstituted drug-like pyrrolidines. This study demonstrates the efficient preparation of a focused library of these pharmaceutically important structures using microreactor technologies, as well as classical parallel synthesis techniques, and thus exemplifies the impact of integrating innovative enabling tools within the drug discovery process.

  8. Practical and effective management of libraries integrating case studies, general management theory and self-understanding

    CERN Document Server

    Moniz, Jr, Richard

    2010-01-01

    Aimed at library science students and librarians with newly assigned administrative duties, the book is about improving one's thinking and decision making in the role of library manager. Most librarians get very little exposure to management issues prior to finding themselves in a management role. Furthermore, most library science students do not expect that they will need to understand management, yet they quickly find that there is a need to understand this perspective to be effective at almost any library job. Effective library management is about having some tools to make decisions (such as

  9. Omics Informatics: From Scattered Individual Software Tools to Integrated Workflow Management Systems.

    Science.gov (United States)

    Ma, Tianle; Zhang, Aidong

    2017-01-01

    Omic data analyses pose great informatics challenges. As an emerging subfield of bioinformatics, omics informatics focuses on analyzing multi-omic data efficiently and effectively, and is gaining momentum. There are two underlying trends in the expansion of omics informatics landscape: the explosion of scattered individual omics informatics tools with each of which focuses on a specific task in both single- and multi- omic settings, and the fast-evolving integrated software platforms such as workflow management systems that can assemble multiple tools into pipelines and streamline integrative analysis for complicated tasks. In this survey, we give a holistic view of omics informatics, from scattered individual informatics tools to integrated workflow management systems. We not only outline the landscape and challenges of omics informatics, but also sample a number of widely used and cutting-edge algorithms in omics data analysis to give readers a fine-grained view. We survey various workflow management systems (WMSs), classify them into three levels of WMSs from simple software toolkits to integrated multi-omic analytical platforms, and point out the emerging needs for developing intelligent workflow management systems. We also discuss the challenges, strategies and some existing work in systematic evaluation of omics informatics tools. We conclude by providing future perspectives of emerging fields and new frontiers in omics informatics.
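
    The essence of a workflow management system, assembling individual tools into a dependency-ordered pipeline, can be sketched in a few lines. The step names and functions below are placeholders; production WMSs add provenance tracking, containerized execution and scheduling.

        # Minimal sketch of a dependency-ordered analysis pipeline;
        # steps are placeholders for real tools.
        from graphlib import TopologicalSorter

        def normalize(): print("normalizing expression matrix")
        def diff_expr(): print("testing differential expression")
        def enrich():    print("running pathway enrichment")
        def report():    print("writing integrated report")

        steps = {"normalize": (normalize, []),
                 "diff_expr": (diff_expr, ["normalize"]),
                 "enrich":    (enrich, ["diff_expr"]),
                 "report":    (report, ["diff_expr", "enrich"])}

        order = TopologicalSorter({k: set(v[1]) for k, v in steps.items()})
        for name in order.static_order():
            steps[name][0]()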

  10. Development of SRS.php, a Simple Object Access Protocol-based library for data acquisition from integrated biological databases.

    Science.gov (United States)

    Barbosa-Silva, A; Pafilis, E; Ortega, J M; Schneider, R

    2007-12-11

    Data integration has become an important task for biological database providers. The current model for data exchange among different sources simplifies the manner that distinct information is accessed by users. The evolution of data representation from HTML to XML enabled programs, instead of humans, to interact with biological databases. We present here SRS.php, a PHP library that can interact with the data integration Sequence Retrieval System (SRS). The library has been written using SOAP definitions, and permits the programmatic communication through webservices with the SRS. The interactions are possible by invoking the methods described in WSDL by exchanging XML messages. The current functions available in the library have been built to access specific data stored in any of the 90 different databases (such as UNIPROT, KEGG and GO) using the same query syntax format. The inclusion of the described functions in the source of scripts written in PHP enables them as webservice clients to the SRS server. The functions permit one to query the whole content of any SRS database, to list specific records in these databases, to get specific fields from the records, and to link any record among any pair of linked databases. The case study presented exemplifies the library usage to retrieve information regarding registries of a Plant Defense Mechanisms database. The Plant Defense Mechanisms database is currently being developed, and the proposal of SRS.php library usage is to enable the data acquisition for the further warehousing tasks related to its setup and maintenance.
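
    SRS.php itself is a PHP library, but the same SOAP/WSDL interaction pattern can be sketched from Python with the zeep library. The endpoint URL and the operation name below are hypothetical stand-ins; the real method and argument names are those declared in the SRS WSDL.

        # Minimal sketch of calling a WSDL-described webservice from Python;
        # the URL and operation name are hypothetical, not the real SRS API.
        from zeep import Client

        client = Client("http://example.org/srs?wsdl")  # hypothetical WSDL
        # Operation and argument names depend entirely on the service's WSDL.
        result = client.service.getEntries(database="UNIPROT", query="P53_HUMAN")
        print(result)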

  11. Integrated management tool for controls software problems, requests and project tasking at SLAC

    International Nuclear Information System (INIS)

    Rogind, D.; Allen, W.; Colocho, W.; DeContreras, G.; Gordon, J.; Pandey, P.; Shoaee, H.

    2012-01-01

    The Accelerator Directorate (AD) Instrumentation and Controls (ICD) Software (SW) Department at SLAC, with its service center model, continuously receives engineering requests to design, build and support controls for accelerator systems lab-wide. Each customer request can vary in complexity from a small software engineering change to a major enhancement. SLAC's Accelerator Improvement Projects (AIPs), along with DOE Construction projects, also contribute heavily to the workload. The various customer requests and projects, paired with the ongoing operational maintenance and problem reports, place a demand on the department that consistently exceeds the capacity of available resources. A centralized repository - comprised of all requests, project tasks, and problems - available to physicists, operators, managers, and engineers alike, is essential to capture, communicate, prioritize, assign, schedule, track, and finally, commission all work components. The Software Department has recently integrated request/project tasking into SLAC's custom online problem tracking 'Comprehensive Accelerator Tool for Enhancing Reliability' (CATER) tool. This paper discusses the newly implemented software request management tool - the workload it helps to track, its structure, features, reports, work-flow and its many usages. (authors)

  12. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    Ure, James; Chen, Haofeng; Tipping, David

    2014-01-01

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  13. Integrated software package for nuclear material safeguards in a MOX fuel fabrication facility

    International Nuclear Information System (INIS)

    Schreiber, H.J.; Piana, M.; Moussalli, G.; Saukkonen, H.

    2000-01-01

    Since computerized data processing was introduced to Safeguards at large bulk handling facilities, a large number of individual software applications have been developed for nuclear material Safeguards implementation. Facility inventory and flow data are provided in computerized format for performing stratification, sample size calculation and selection of samples for destructive and non-destructive assay. Data are collected from nuclear measurement systems running in attended or unattended mode and, more recently, from remote monitoring systems. Data sets from various sources have to be evaluated for Safeguards purposes, such as raw data, processed data and conclusions drawn from data evaluation results. They are reported in computerized format at the International Atomic Energy Agency headquarters, and feedback from the Agency's mainframe computer system is used to prepare and support Safeguards inspection activities. The integration of all such data originating from various sources cannot be ensured without the existence of a common data format and a database system. This paper describes the fundamental relations between data streams, individual data processing tools, data evaluation results and the requirements for an integrated software solution to facilitate nuclear material Safeguards at a bulk handling facility. The paper also explains the basis for designing a software package to manage data streams from various data sources and for incorporating diverse data processing tools that until now have been used independently of each other and under different computer operating systems. (author)

  14. Design of energy efficient optical networks with software enabled integrated control plane

    DEFF Research Database (Denmark)

    Wang, Jiayuan; Yan, Ying; Dittmann, Lars

    2015-01-01

    energy consumption by proposing a new integrated control plane structure utilising Software Defined Networking technologies. The integrated control plane increases the efficiencies of exchanging control information across different network domains, while introducing new possibilities to the routing...... methods and the control over quality of service (QoS). The structure is defined as an overlay generalised multi-protocol label switching (GMPLS) control model. With the defined structure, the integrated control plane is able to gather information from different domains (i.e. optical core network......'s) routing behaviours. With the flexibility of the routing structure, results show that the energy efficiency of the network can be improved without compromising the QoS for delay/blocking sensitive services....

  15. A COTS RF Optical Software Defined Radio for the Integrated Radio and Optical Communications Test Bed

    Science.gov (United States)

    Nappier, Jennifer M.; Zeleznikar, Daniel J.; Wroblewski, Adam C.; Tokars, Roger P.; Schoenholz, Bryan L.; Lantz, Nicholas C.

    2016-01-01

    The Integrated Radio and Optical Communications (iROC) project at the National Aeronautics and Space Administration (NASA) is investigating the merits of a hybrid radio frequency (RF) and optical communication system for deep space missions. In an effort to demonstrate the feasibility and advantages of a hybrid RF/Optical software defined radio (SDR), a laboratory prototype was assembled from primarily commercial-off-the-shelf (COTS) hardware components. This COTS platform has been used to demonstrate simultaneous transmission of the radio and optical communications waveforms through to the physical layer (telescope and antenna). This paper details the hardware and software used in the platform and various measures of its performance. A laboratory optical receiver platform has also been assembled in order to demonstrate hybrid free space links in combination with the transmitter.

  16. Integrated State Estimation and Contingency Analysis Software Implementation using High Performance Computing Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt R.; Rice, Mark J.; Huang, Zhenyu

    2015-12-31

    Power system simulation tools are traditionally developed in sequential mode and codes are optimized for single core computing only. However, the increasing complexity in the power grid models requires more intensive computation. The traditional simulation tools will soon not be able to meet the grid operation requirements. Therefore, power system simulation tools need to evolve accordingly to provide faster and better results for grid operations. This paper presents an integrated state estimation and contingency analysis software implementation using high performance computing techniques. The software is able to solve large size state estimation problems within one second and achieve a near-linear speedup of 9,800 with 10,000 cores for contingency analysis application. The performance evaluation is presented to show its effectiveness.
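
    Contingency analysis parallelizes naturally because each outage case is solved independently, which is what makes near-linear speedup plausible. The sketch below distributes placeholder cases over worker processes; the power-flow solve and the violation test are fabricated stand-ins for the paper's actual solver.

        # Minimal sketch of task-parallel contingency analysis; the per-case
        # "solve" is a placeholder, not the solver used in the paper.
        from multiprocessing import Pool

        def solve_contingency(case_id):
            # Placeholder for an N-1 power-flow solution of one outage case.
            violated = (case_id % 17 == 0)  # fabricated result
            return case_id, violated

        if __name__ == "__main__":
            cases = range(1000)
            with Pool(processes=8) as pool:
                results = pool.map(solve_contingency, cases)
            print(sum(v for _, v in results), "cases with violations")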

  17. IEEE Computer Society/Software Engineering Institute Watts S. Humphrey Software Process Achievement Award 2016: Raytheon Integrated Defense Systems Design for Six Sigma Team

    Science.gov (United States)

    2017-04-01

    Raytheon Integrated Defense Systems (IDS) is one of five businesses within Raytheon Company, which operates worldwide with $23 billion in sales for 2015. The Raytheon IDS Design for Six Sigma (DFSS) team has developed and implemented numerous leading-edge improvement and optimization methodologies resulting in … our software systems. In this section, we explain the first methodology, the application of statistical test optimization (STO) using Design of Experiments.

  18. Increasing quality and managing complexity in neuroinformatics software development with continuous integration

    Directory of Open Access Journals (Sweden)

    Yury V. Zaytsev

    2013-01-01

    Full Text Available High quality neuroscience research requires accurate, reliable and well maintained neuroinformatics applications. As software projects become larger, offering more functionality and developing a denser web of interdependence between their component parts, we need more sophisticated methods to manage their complexity. If complexity is allowed to get out of hand, either the quality of the software or the speed of development suffer, and in many cases both. To address this issue, here we develop a scalable, low-cost and open source solution for continuous integration (CI), a technique which ensures the quality of changes to the code base during the development procedure, rather than relying on a pre-release integration phase. We demonstrate that a CI-based workflow, due to rapid feedback about code integration problems and tracking of code health measures, enabled substantial increases in productivity for a major neuroinformatics project and additional benefits for three further projects. Beyond the scope of the current study, we identify multiple areas in which CI can be employed to further increase the quality of neuroinformatics projects by improving development practices and incorporating appropriate development tools. Finally, we discuss what measures can be taken to lower the barrier for developers of neuroinformatics applications to adopt this useful technique.

  19. Increasing quality and managing complexity in neuroinformatics software development with continuous integration.

    Science.gov (United States)

    Zaytsev, Yury V; Morrison, Abigail

    2012-01-01

    High quality neuroscience research requires accurate, reliable and well maintained neuroinformatics applications. As software projects become larger, offering more functionality and developing a denser web of interdependence between their component parts, we need more sophisticated methods to manage their complexity. If complexity is allowed to get out of hand, either the quality of the software or the speed of development suffer, and in many cases both. To address this issue, here we develop a scalable, low-cost and open source solution for continuous integration (CI), a technique which ensures the quality of changes to the code base during the development procedure, rather than relying on a pre-release integration phase. We demonstrate that a CI-based workflow, due to rapid feedback about code integration problems and tracking of code health measures, enabled substantial increases in productivity for a major neuroinformatics project and additional benefits for three further projects. Beyond the scope of the current study, we identify multiple areas in which CI can be employed to further increase the quality of neuroinformatics projects by improving development practices and incorporating appropriate development tools. Finally, we discuss what measures can be taken to lower the barrier for developers of neuroinformatics applications to adopt this useful technique.

  20. Implementation of INLISlite (Integrated Library System) at the Kantor Perpustakaan dan Arsip Daerah (District Library and Archives Office) of Pangkep Regency, South Sulawesi

    Directory of Open Access Journals (Sweden)

    Abdul Hamid

    2015-12-01

    Full Text Available Information and communication technology (ICT) has undoubtedly been applied in all fields of life. It provides a variety of services depending on user needs. The Kantor Perpustakaan dan Arsip Daerah (KPAD, the District Library and Archives Office) of Pangkep has been implementing ICT to enhance its services to the citizens of Pangkep. INLISlite was chosen as its automated library system. The current study investigates how INLISlite has been implemented in the library, what challenges the library faced, and the floating library services of KPAD. The data were collected through representative informants. The study found that the use of INLISlite needs to be further optimized, that the main challenges were human resources and budget, and that the floating library has played an important role in educating the citizens of Pangkep's islands.

  1. A Development Framework for Software Security in Nuclear Safety Systems: Integrating Secure Development and System Security Activities

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jaekwan; Suh, Yongsuk [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-02-15

    The protection of nuclear safety software is essential in that a failure can result in significant economic loss and physical damage to the public. However, software security has often been ignored in nuclear safety software development. To enforce security considerations, nuclear regulatory bodies have recently issued and revised security regulations for nuclear computer-based systems. Complying with these security requirements is a great challenge for nuclear developers, yet there is still no clear software development process covering security activities. This paper proposes an integrated development process suitable for the secure development requirements and system security requirements described by various regulatory bodies. It provides a three-stage framework with eight security activities as the software development process. The detailed descriptions are useful for software developers and licensees seeking to understand the regulatory requirements and to establish a detailed activity plan for software design and engineering.

  2. Development of a new model to predict indoor daylighting: Integration in CODYRUN software and validation

    Energy Technology Data Exchange (ETDEWEB)

    Fakra, A.H., E-mail: fakra@univ-reunion.f [Physics and Mathematical Engineering Laboratory for Energy and Environment (PIMENT), University of La Reunion, 117 rue du General Ailleret, 97430 Le Tampon (French Overseas Dpt.), Reunion (France); Miranville, F.; Boyer, H.; Guichard, S. [Physics and Mathematical Engineering Laboratory for Energy and Environment (PIMENT), University of La Reunion, 117 rue du General Ailleret, 97430 Le Tampon (French Overseas Dpt.), Reunion (France)

    2011-07-15

    Research highlights: → This study presents a new model capable of simulating indoor daylighting. → The model was integrated into research software called CODYRUN. → The code was validated against a large number of test cases. -- Abstract: Many models exist in the scientific literature for determining indoor daylighting values. They fall into three categories: numerical, simplified and empirical models. Nevertheless, none of these categories is convenient for every application. Indeed, numerical models require long calculation times; the conditions of use of simplified models are limited; and experimental models demand not only substantial financial resources but also perfect control of the experimental devices (e.g. scale models) and of the climatic characteristics of the location (e.g. for in situ experiments). In this article, a new model based on a combination of multiple simplified models is established, with the objective of improving this category of model. The originality of our paper lies in the coupling of several simplified models for indoor daylighting calculations. The accuracy of the simulation code, introduced into the CODYRUN software to correctly simulate indoor illuminance, is then verified. CODYRUN is a numerical building simulation code developed in the Physics and Mathematical Engineering Laboratory for Energy and Environment (PIMENT) at the University of Reunion. Initially dedicated to thermal, airflow and hydrous phenomena in buildings, the software has been extended to compute indoor daylighting. New models and algorithms, which rely on a semi-detailed approach, are presented in this paper. To validate the accuracy of the integrated models, many test cases have been considered: analytical tests, inter-software comparisons and experimental comparisons. In order to prove the accuracy of the new model - which can properly simulate the illuminance - a
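
    For readers unfamiliar with the "simplified model" category discussed above, the sketch below illustrates one classical simplified model, the daylight-factor method: interior illuminance is estimated as a fixed percentage of exterior horizontal illuminance, with the percentage built up from several components. The component values are illustrative and are not taken from CODYRUN.

```python
"""Illustrative daylight-factor calculation (not CODYRUN's actual model).
The daylight factor DF is the sum of the sky component (SC), the
externally reflected component (ERC) and the internally reflected
component (IRC), expressed as a percentage of exterior illuminance."""


def daylight_factor(sc: float, erc: float, irc: float) -> float:
    return sc + erc + irc


def interior_illuminance(df_percent: float, exterior_lux: float) -> float:
    # Interior illuminance under an overcast sky, in lux.
    return df_percent / 100.0 * exterior_lux


if __name__ == "__main__":
    df = daylight_factor(sc=1.8, erc=0.3, irc=0.6)   # illustrative values
    e_in = interior_illuminance(df, exterior_lux=10_000.0)
    print(f"DF = {df:.1f}% -> interior illuminance = {e_in:.0f} lx")
```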

  3. Insect Pests and Integrated Pest Management in Museums, Libraries and Historic Buildings.

    Science.gov (United States)

    Querner, Pascal

    2015-06-16

    Insect pests are responsible for substantial damage to museum objects, historic books and buildings such as palaces or historic houses. Various wood-boring beetles (Anobium punctatum, Hylotrupes bajulus, Lyctus sp. or introduced species), the biscuit beetle (Stegobium paniceum), the cigarette beetle (Lasioderma serricorne), various dermestids (Attagenus sp., Anthrenus sp., Dermestes sp., Trogoderma sp.), moths such as the webbing clothes moth (Tineola bisselliella), silverfish (Lepisma saccharina) and booklice (Psocoptera) can damage materials, objects or building parts. They are the most common pests found in collections in central Europe, but most of them are distributed all over the world. In tropical countries, termites, cockroaches and other insect pests are also found; they cause even greater damage to wood and paper or are a common nuisance in buildings. In this short review, an introduction to Integrated Pest Management (IPM) in museums is given, covering the most valuable collections, preventive measures, monitoring in museums, the staff responsible for IPM, and chemical-free treatment methods. In the second part of the paper, the most important insect pests occurring in museums, archives, libraries and historic buildings in central Europe are discussed, with a description of the materials and object types that are most often infested and damaged. Some information on their phenology and biology is highlighted, as it can be used in the IPM concept against them.

  4. Algorithm integration using ADL (Algorithm Development Library) for improving CrIMSS EDR science product quality

    Science.gov (United States)

    Das, B.; Wilson, M.; Divakarla, M. G.; Chen, W.; Barnet, C.; Wolf, W.

    2013-05-01

    Algorithm Development Library (ADL) is a framework that mimics the operational IDPS (Interface Data Processing Segment) system currently used to process data from instruments aboard the Suomi National Polar-orbiting Partnership (S-NPP) satellite, which was launched successfully in October 2011. The Cross-track Infrared and Microwave Sounder Suite (CrIMSS) consists of the Advanced Technology Microwave Sounder (ATMS) and Cross-track Infrared Sounder (CrIS) instruments on board S-NPP. These instruments will also be on board JPSS (the Joint Polar Satellite System), scheduled for launch in early 2017. The primary products of the CrIMSS Environmental Data Record (EDR) include global atmospheric vertical temperature, moisture, and pressure profiles (AVTP, AVMP and AVPP) and the ozone Intermediate Product (IP) derived from CrIS radiances. Several algorithm updates have recently been proposed by CrIMSS scientists, including fixes to the handling of forward modeling errors, a more conservative identification of clear scenes, indexing corrections for daytime products, and relaxed constraints between surface temperature and air temperature for daytime land scenes. We have integrated these improvements into the ADL framework. This work compares the results of the ADL emulation of the future IDPS system, incorporating all the suggested algorithm updates, with the current official processing results, using qualitative and quantitative evaluations. The results show that these algorithm updates improve science product quality.

  5. Integrative characterization of germ cell-specific genes from mouse spermatocyte UniGene library

    Directory of Open Access Journals (Sweden)

    Eddy Edward M

    2007-07-01

    Full Text Available Background: The primary regulator of spermatogenesis, a highly ordered and tightly regulated developmental process, is an intrinsic genetic program involving male germ cell-specific genes. Results: We analyzed the mouse spermatocyte UniGene library containing 2155 gene-oriented transcript clusters. We predict that 11% of these genes are testis-specific, and we systematically identified 24 authentic genes specifically and abundantly expressed in the testis via in silico and in vitro approaches. Northern blot analysis disclosed various transcript characteristics, such as expression level, size and the presence of isoforms. Expression analysis revealed developmentally regulated and stage-specific expression patterns in all of the genes. We further analyzed the genes at the protein and cellular levels. Transfection assays performed using GC-2 cells provided information on the cellular characteristics of the gene products. In addition, antibodies were generated against proteins encoded by some of the genes to facilitate their identification and characterization in spermatogenic cells and sperm. Our data suggest that a number of the gene products are implicated in transcriptional regulation, nuclear integrity, sperm structure and motility, and fertilization. In particular, we found for the first time that Mm.333010, predicted to contain a trypsin-like serine protease domain, is a sperm acrosomal protein. Conclusion: We identify 24 authentic genes with spermatogenic cell-specific expression and provide comprehensive information about these genes. Our findings establish a new basis for future investigation into molecular mechanisms underlying male reproduction.

  6. CASSys: an integrated software-system for the interactive analysis of ChIP-seq data

    Directory of Open Access Journals (Sweden)

    Alawi Malik

    2011-06-01

    Full Text Available The mapping of DNA-protein interactions is crucial for a full understanding of transcriptional regulation. Chromatin immunoprecipitation followed by massively parallel sequencing (ChIP-seq) has become the standard technique for analyzing these interactions on a genome-wide scale. We have developed a software system called CASSys (ChIP-seq data Analysis Software System) spanning all steps of ChIP-seq data analysis. It supersedes the laborious application of several single command line tools. CASSys provides functionality ranging from quality assessment and control of short reads, over the mapping of reads against a reference genome (read mapping) and the detection of enriched regions (peak detection), to various follow-up analyses. The latter are accessible via a state-of-the-art web interface and can be performed interactively by the user. The follow-up analyses allow for flexible user-defined association of putative interaction sites with genes, visualization of their genomic context with an integrated genome browser, the detection of putative binding motifs, the identification of over-represented Gene Ontology terms, pathway analysis and the visualization of interaction networks. The system is client-server based, accessible via a web browser and does not require any software installation on the client side. To demonstrate CASSys's functionality we used the system for the complete data analysis of a publicly available ChIP-seq study that investigated the role of the transcription factor estrogen receptor-α in breast cancer cells.
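
    The workflow CASSys supersedes is the manual chaining of single command-line tools; a minimal driver script of that kind is sketched below. The tool names (fastqc, bowtie2, macs2) are common choices assumed to be installed, not necessarily the tools CASSys wraps, and all file names are placeholders.

```python
"""Sketch of the kind of pipeline CASSys automates: quality control,
read mapping and peak detection chained from one driver script.
Tools and paths are illustrative assumptions, not CASSys internals."""
import subprocess


def run(cmd: list[str]) -> None:
    print("running:", " ".join(cmd))
    subprocess.run(cmd, check=True)


def chipseq_pipeline(reads: str, genome_index: str, out_prefix: str) -> None:
    run(["fastqc", reads])                                   # quality assessment
    run(["bowtie2", "-x", genome_index, "-U", reads,
         "-S", f"{out_prefix}.sam"])                         # read mapping
    run(["macs2", "callpeak", "-t", f"{out_prefix}.sam",
         "-n", out_prefix])                                  # peak detection


if __name__ == "__main__":
    chipseq_pipeline("sample.fastq", "hg19", "er_alpha_rep1")
```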

  7. Integrating Multimedia ICT Software in Language Curriculum: Students’ Perception, Use, and Effectiveness

    Directory of Open Access Journals (Sweden)

    Nikolai Penner

    2014-03-01

    Full Text Available Information and Communication Technologies (ICT) constitute an integral part of the teaching and learning environment in present-day educational institutions and play an increasingly important role in the modern second language classroom. In this study, an online language learning tool, Tell Me More (TMM), was introduced as a supplementary tool in first- and second-year university French and German language classes. At the end of the academic year, the students completed a questionnaire exploring their TMM usage behaviour and perception of the software. The survey also addressed aspects of the respondents' readiness for self-directed language learning. The data were then imported into SPSS and underwent statistical analysis. The results of the study show that (1) relatively few of today's university students are open to the idea of voluntarily using ICT for independent language practice; (2) grade, price, and availability of alternative means of language practice are the most important factors affecting the students' decision to purchase and use ICT software; and (3) there is a relationship between the students' decision to buy and use ICT software and their readiness for self-directed learning.
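
    Finding (3) is an association between two categorical variables, which is commonly tested with a chi-square test of independence. The sketch below shows such a test in Python with SciPy; the contingency table is invented for illustration and does not reproduce the study's data.

```python
"""Illustrative test of the association in point (3): decision to
buy/use the software versus readiness for self-directed learning.
The contingency table below is made up for demonstration only."""
from scipy.stats import chi2_contingency

# rows: bought/used TMM (yes, no); columns: self-directed readiness (high, low)
table = [[18, 7],
         [21, 40]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
```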

  8. Experience Supporting the Integration of LHC Experiments Software Framework with the LCG Middleware

    CERN Document Server

    Santinelli, Roberto

    2006-01-01

    The LHC experiments are currently preparing for data acquisition in 2007 and, because of the large amount of computing and storage resources required, have decided to embrace the grid paradigm. The LHC Computing Grid (LCG) provides and operates a computing infrastructure suitable for data handling, Monte Carlo production and analysis. While LCG offers a set of high level services, intended to be generic enough to accommodate the needs of different Virtual Organizations, the LHC experiments' software frameworks and applications are very specific and focused on their computing and data models. The LCG Experiment Integration Support (EIS) team works in close contact with the experiments, the middleware developers and the LCG certification and operations teams to integrate the underlying grid middleware with the experiment-specific components. This strategic position between the experiments and the middleware suppliers allows the EIS team to play a key role in communication between the customers and the service provi...

  9. Integration of XNAT/PACS, DICOM, and research software for automated multi-modal image analysis

    Science.gov (United States)

    Gao, Yurui; Burns, Scott S.; Lauzon, Carolyn B.; Fong, Andrew E.; James, Terry A.; Lubar, Joel F.; Thatcher, Robert W.; Twillie, David A.; Wirt, Michael D.; Zola, Marc A.; Logan, Bret W.; Anderson, Adam W.; Landman, Bennett A.

    2013-03-01

    Traumatic brain injury (TBI) is an increasingly important public health concern. While there are several promising avenues of intervention, clinical assessments are relatively coarse and comparative quantitative analysis is an emerging field. Imaging data provide potentially useful information for evaluating TBI across functional, structural, and microstructural phenotypes. Integration and management of disparate data types are major obstacles. In a multi-institution collaboration, we are collecting electroencephalography (EEG), structural MRI, diffusion tensor MRI (DTI), and single photon emission computed tomography (SPECT) from a large cohort of US Army service members exposed to mild or moderate TBI who are undergoing experimental treatment. We have constructed a robust informatics backbone for this project centered on the DICOM standard and eXtensible Neuroimaging Archive Toolkit (XNAT) server. Herein, we discuss (1) optimization of data transmission, validation and storage, (2) quality assurance and workflow management, and (3) integration of high performance computing with research software.

  10. Plagiarism Awareness among Students: Assessing Integration of Ethics Theory into Library Instruction

    Science.gov (United States)

    Strittmatter, Connie; Bratton, Virginia K.

    2014-01-01

    The library literature on plagiarism instruction focuses on students' understanding of what plagiarism is and is not. This study evaluates the effect of library instruction from a broader perspective by examining the pre- and posttest (instruction) levels of students' perceptions toward plagiarism ethics. Eighty-six students completed a pre- and…
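
    A pre/posttest design of this kind is typically analyzed with a paired test on the matched scores. The following Python sketch shows a paired t-test with SciPy; the perception scores are invented for illustration and are not the study's data.

```python
"""Sketch of the pre/post comparison design described above, using a
paired t-test on illustrative (invented-for-demo) perception scores."""
from scipy.stats import ttest_rel

pre = [3.1, 2.8, 3.5, 2.9, 3.0, 3.3]    # perception scores before instruction
post = [3.9, 3.4, 3.6, 3.5, 3.8, 3.7]   # scores after library instruction

t_stat, p_value = ttest_rel(pre, post)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```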

  11. Software Configuration Management Plan for the K West Basin Integrated Water Treatment System (IWTS) - Project A.9

    International Nuclear Information System (INIS)

    GREEN, J.W.

    2000-01-01

    This document provides a configuration control plan for the software associated with the operation and control of the Integrated Water Treatment System (IWTS). It establishes requirements for ensuring configuration item identification, configuration control, configuration status accounting, defect reporting and resolution of computer software. It is written to comply with HNF-SD-SNF-CM-001, Spent Nuclear Fuel Configuration Management Plan (Forehand 1998), and HNF-PRO-309, Computer Software Quality Assurance Requirements, and applicable sections of administrative procedure CM-6-037-00, SNF Project Process Automation Software and Equipment.

  12. Extension of the AMBER molecular dynamics software to Intel's Many Integrated Core (MIC) architecture

    Science.gov (United States)

    Needham, Perri J.; Bhuiyan, Ashraf; Walker, Ross C.

    2016-04-01

    We present an implementation of explicit solvent particle mesh Ewald (PME) classical molecular dynamics (MD) within the PMEMD molecular dynamics engine, which forms part of the AMBER v14 MD software package, that makes use of Intel Xeon Phi coprocessors by offloading portions of the PME direct summation and neighbor list build to the coprocessor. We refer to this implementation as pmemd MIC offload and in this paper present the technical details of the algorithm, including basic models for MPI and OpenMP configuration, and analyze the resultant performance. The algorithm provides the best performance improvement for large systems (>400,000 atoms), achieving a ∼35% performance improvement for satellite tobacco mosaic virus (1,067,095 atoms) when two Intel E5-2697 v2 processors (2 × 12 cores, 30M cache, 2.7 GHz) are coupled to an Intel Xeon Phi coprocessor (Model 7120P, 1.238/1.333 GHz, 61 cores). The implementation utilizes a two-fold decomposition strategy: spatial decomposition using an MPI library and thread-based decomposition using OpenMP. We also present compiler optimization settings that improve performance on Intel Xeon processors while retaining simulation accuracy.

  13. Cytoscape: a software environment for integrated models of biomolecular interaction networks.

    Science.gov (United States)

    Shannon, Paul; Markiel, Andrew; Ozier, Owen; Baliga, Nitin S; Wang, Jonathan T; Ramage, Daniel; Amin, Nada; Schwikowski, Benno; Ideker, Trey

    2003-11-01

    Cytoscape is an open source software project for integrating biomolecular interaction networks with high-throughput expression data and other molecular states into a unified conceptual framework. Although applicable to any system of molecular components and interactions, Cytoscape is most powerful when used in conjunction with large databases of protein-protein, protein-DNA, and genetic interactions that are increasingly available for humans and model organisms. Cytoscape's software Core provides basic functionality to layout and query the network; to visually integrate the network with expression profiles, phenotypes, and other molecular states; and to link the network to databases of functional annotations. The Core is extensible through a straightforward plug-in architecture, allowing rapid development of additional computational analyses and features. Several case studies of Cytoscape plug-ins are surveyed, including a search for interaction pathways correlating with changes in gene expression, a study of protein complexes involved in cellular recovery to DNA damage, inference of a combined physical/functional interaction network for Halobacterium, and an interface to detailed stochastic/kinetic gene regulatory models.
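
    Independently of Cytoscape's own API, its core idea (attaching molecular-state values such as expression data to network nodes, then querying the annotated graph) can be sketched in a few lines with networkx. The genes, edges and expression values below are illustrative only.

```python
"""Conceptual sketch (not Cytoscape's API) of overlaying expression
data on an interaction network and querying the annotated graph.
Nodes, edges and values are illustrative."""
import networkx as nx

g = nx.Graph()
g.add_edges_from([("GAL4", "GAL80"), ("GAL4", "GAL1"), ("GAL80", "GAL3")])

# Attach a molecular state (log2 fold change) to each node.
expression = {"GAL4": 1.8, "GAL80": -0.6, "GAL1": 2.4, "GAL3": 0.1}
nx.set_node_attributes(g, expression, name="log2_fold_change")

# Query the network: neighbors of GAL4 that are strongly induced.
induced = [n for n in g.neighbors("GAL4")
           if g.nodes[n]["log2_fold_change"] > 1.0]
print("induced GAL4 neighbors:", induced)
```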

  14. Design and Implementation of Integrated Software Research and Community Service at State Polytechnic of Manado

    Science.gov (United States)

    Saroinsong, T.; A. S Kondoj, M.; Kandiyoh, G.; Pontoh, G.

    2018-01-01

    The State Polytechnic of Manado (Polimdo) is one of the reliable institutions in North Sulawesi and was the first to implement ISO 9001. However, its institutional accreditation has not yet been satisfactory, which means there is still much to prepare in order to achieve the expected target. One of the criteria in institutional accreditation assessment concerns research and community service activities, in accordance with standard seven. The systems for documenting data related to research and community service activities are not well integrated or well documented across the existing work units. As a consequence, gathering information on these activities and their results to support institutional accreditation is still inefficient. This study aims to build integrated software spanning all work units in Polimdo to provide documentation and data synchronization in support of institutional accreditation activities and reporting in accordance with standard seven, specifically the submission of research and community service proposals. The software will be developed using the RUP method, with analysis based on data flow diagrams and entity-relationship modeling (ERM), so that the result of this research is documentation and synchronization of the data and information on research and community service activities that can be used in preparing report documents for institutional accreditation.

  15. Integrated software system for seismic evaluation of nuclear power plant structures

    International Nuclear Information System (INIS)

    Xu, J.; Graves, H.L.

    1993-01-01

    The computer software CARES (Computer Analysis for Rapid Evaluation of Structures) was developed by the Brookhaven National Laboratory for the U.S. Nuclear Regulatory Commission. It represents an effort to utilize established numerical methodologies commonly employed by industry for structural safety evaluations of nuclear power plant facilities and incorporates them into an integrated computer software package operated on personal computers. CARES was developed with the objective of including all aspects of seismic performance evaluation of nuclear power structures. It can be used to evaluate the validity and accuracy of analysis methodologies used for structural safety evaluations of nuclear power plants by various utilities. CARES has a modular format, each module performing a specific type of analysis. The seismic module integrates all the steps of a complete seismic analysis into a single package with many user-friendly features such as interactivity and quick turnaround. Linear structural theory and pseudo-linear convolution theory are utilized as the bases for the development with a special emphasis on the nuclear regulatory requirements for structural safety of nuclear plants. The organization of the seismic module is arranged in eight options, each performing a specific step of the analysis with most of the input/output interfacing processed by the general manager. Finally, CARES provides comprehensive post-processing capability for displaying results graphically or in tabular form so that direct comparisons can be easily made. (author)

  16. Primer3_masker: integrating masking of template sequence with primer design software.

    Science.gov (United States)

    Kõressaar, Triinu; Lepamets, Maarja; Kaplinski, Lauris; Raime, Kairi; Andreson, Reidar; Remm, Maido

    2018-06-01

    Designing PCR primers for amplifying regions of eukaryotic genomes is a complicated task because the genomes contain a large number of repeat sequences and other regions unsuitable for amplification by PCR. We have developed a novel k-mer based masking method that uses a statistical model to detect and mask failure-prone regions on the DNA template prior to primer design. We implemented the software as a standalone software primer3_masker and integrated it into the primer design program Primer3. The standalone version of primer3_masker is implemented in C. The source code is freely available at https://github.com/bioinfo-ut/primer3_masker/ (standalone version for Linux and macOS) and at https://github.com/primer3-org/primer3/ (integrated version). Primer3 web application that allows masking sequences of 196 animal and plant genomes is available at http://primer3.ut.ee/. maido.remm@ut.ee. Supplementary data are available at Bioinformatics online.
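
    A minimal sketch of k-mer based soft-masking is shown below: template positions covered by overly frequent k-mers are lowercased so that a downstream primer design step can avoid them. The thresholds and counting scheme are illustrative and far simpler than primer3_masker's statistical model.

```python
"""Sketch of k-mer based soft-masking (the idea behind primer3_masker,
not its actual statistical model): positions covered by k-mers that
are too frequent in a k-mer count table are lowercased."""


def mask_template(seq: str, kmer_counts: dict[str, int],
                  k: int = 4, max_count: int = 10) -> str:
    masked = list(seq)
    for i in range(len(seq) - k + 1):
        kmer = seq[i:i + k].upper()
        if kmer_counts.get(kmer, 0) > max_count:
            for j in range(i, i + k):          # soft-mask the whole window
                masked[j] = masked[j].lower()
    return "".join(masked)


if __name__ == "__main__":
    counts = {"ATAT": 250, "TATA": 250}        # illustrative repeat k-mers
    print(mask_template("GCGATATATATAGCG", counts))  # -> GCGatatatataGCG
```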

  17. The NIH Library of Integrated Network-Based Cellular Signatures (LINCS) Program | Informatics Technology for Cancer Research (ITCR)

    Science.gov (United States)

    By generating and making public data that indicates how cells respond to various genetic and environmental stressors, the LINCS project will help us gain a more detailed understanding of cell pathways and aid efforts to develop therapies that might restore perturbed pathways and networks to their normal states. The LINCS website is a source of information for the research community and general public about the LINCS project. This website along with the LINCS Data Portal contains details about the assays, cell types, and perturbagens that are currently part of the library, as well as links to participating sites, data releases from the sites, and software that can be used for analyzing the data.

  18. An integrated environment of software development and V and V for PLC based safety-critical systems

    International Nuclear Information System (INIS)

    Koo, Seo Ryong

    2005-02-01

    To develop and implement a safety-critical system, the requirements of the system must be analyzed thoroughly during the phases of the software development life cycle, because a single error in the requirements can generate serious software faults. We therefore propose an Integrated Environment (IE) approach for requirements, which enables easy inspection by combining requirement traceability with the effective use of a formal method. For the V and V tasks of the requirements phase, our approach uses software inspection, requirement traceability, and formal specification with structural decomposition. Software inspection and the analysis of requirements traceability are the most effective methods of software V and V. Although formal methods are also considered an effective V and V activity, they are difficult to use properly in nuclear fields, as well as in other fields, because of their mathematical nature. We also propose another Integrated Environment (IE) for the design and implementation of safety-critical systems. In this study, a nuclear FED-style design specification and analysis (NuFDS) approach was proposed for PLC based safety-critical systems. The NuFDS approach is suggested in a straightforward manner for the effective and formal specification and analysis of software designs. Accordingly, the proposed NuFDS approach comprises one technique for specifying the software design and another for analyzing the software design. In addition, with the NuFDS approach, we can analyze the safety of software on the basis of fault tree synthesis. To analyze the design phase more effectively, we propose a technique of fault tree synthesis, along with a universal fault tree template for the architecture modules of nuclear software. Various tools are needed to make software V and V more convenient. We therefore developed four kinds of computer-aided software engineering tools that can be used in accordance with the software's life cycle to

  19. Evolution of the Research Libraries Information Network.

    Science.gov (United States)

    Richards, David; Lerche, Carol

    1989-01-01

    Discusses current RLIN (Research Libraries Information Network) communications technology and motivations for change. Goals, topology, hardware, software, protocol, terminal wiring, and deployment are considered. Sidebars provide a diagram of the current RLIN communications technology and describe the integrated RLIN network. (one reference)…

  20. Integrating Testing into Software Engineering Courses Supported by a Collaborative Learning Environment

    Science.gov (United States)

    Clarke, Peter J.; Davis, Debra; King, Tariq M.; Pava, Jairo; Jones, Edward L.

    2014-01-01

    As software becomes more ubiquitous and complex, the cost of software bugs continues to grow at a staggering rate. To remedy this situation, there needs to be major improvement in the knowledge and application of software validation techniques. Although there are several software validation techniques, software testing continues to be one of the…

  1. DiSCaMB: a software library for aspherical atom model X-ray scattering factor calculations with CPUs and GPUs.

    Science.gov (United States)

    Chodkiewicz, Michał L; Migacz, Szymon; Rudnicki, Witold; Makal, Anna; Kalinowski, Jarosław A; Moriarty, Nigel W; Grosse-Kunstleve, Ralf W; Afonine, Pavel V; Adams, Paul D; Dominiak, Paulina Maria

    2018-02-01

    It has been recently established that the accuracy of structural parameters from X-ray refinement of crystal structures can be improved by using a bank of aspherical pseudoatoms instead of the classical spherical model of atomic form factors. This comes, however, at the cost of increased complexity of the underlying calculations. In order to facilitate the adoption of this more advanced electron density model by the broader community of crystallographers, a new software implementation called DiSCaMB , 'densities in structural chemistry and molecular biology', has been developed. It addresses the challenge of providing for high performance on modern computing architectures. With parallelization options for both multi-core processors and graphics processing units (using CUDA), the library features calculation of X-ray scattering factors and their derivatives with respect to structural parameters, gives access to intermediate steps of the scattering factor calculations (thus allowing for experimentation with modifications of the underlying electron density model), and provides tools for basic structural crystallographic operations. Permissively (MIT) licensed, DiSCaMB is an open-source C++ library that can be embedded in both academic and commercial tools for X-ray structure refinement.

  2. MoFi: A Software Tool for Annotating Glycoprotein Mass Spectra by Integrating Hybrid Data from the Intact Protein and Glycopeptide Level.

    Science.gov (United States)

    Skala, Wolfgang; Wohlschlager, Therese; Senn, Stefan; Huber, Gabriel E; Huber, Christian G

    2018-04-18

    Hybrid mass spectrometry (MS) is an emerging technique for characterizing glycoproteins, which typically display pronounced microheterogeneity. Since hybrid MS combines information from different experimental levels, it crucially depends on computational methods. Here, we describe a novel software tool, MoFi, which integrates hybrid MS data to assign glycans and other post-translational modifications (PTMs) in deconvoluted mass spectra of intact proteins. Its two-stage search algorithm first assigns monosaccharide/PTM compositions to each peak and then compiles a hierarchical list of glycan combinations compatible with these compositions. Importantly, the program only includes those combinations which are supported by a glycan library as derived from glycopeptide or released glycan analysis. By applying MoFi to mass spectra of rituximab, ado-trastuzumab emtansine, and recombinant human erythropoietin, we demonstrate how integration of bottom-up data may be used to refine information collected at the intact protein level. Accordingly, our software reveals that a single mass frequently can be explained by a considerable number of glycoforms. Yet, it simultaneously ranks proteoforms according to their probability, based on a score which is calculated from relative glycan abundances. Notably, glycoforms that comprise identical glycans may nevertheless differ in score if those glycans occupy different sites. Hence, MoFi exposes different layers of complexity that are present in the annotation of a glycoprotein mass spectrum.
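
    The two-stage search can be illustrated with a toy version: given a peak's mass offset from the unmodified protein, stage 1 enumerates candidate modification compositions, and stage 2 keeps only combinations built from the glycan library. The library, masses and site count below are assumptions for the example, not MoFi's data or algorithm.

```python
"""Toy version of the two-stage idea behind MoFi (not its algorithm):
match a peak's mass offset against combinations of library glycans.
Masses (Da) and library contents are illustrative only."""
from itertools import combinations_with_replacement

GLYCAN_MASSES = {"G0F": 1444.53, "G1F": 1606.59, "G2F": 1768.64}  # assumed library


def annotate_peak(offset: float, n_sites: int = 2, tol: float = 0.5):
    """Return glycan combinations (one glycan per site) matching the offset."""
    hits = []
    for combo in combinations_with_replacement(GLYCAN_MASSES, n_sites):
        mass = sum(GLYCAN_MASSES[g] for g in combo)
        if abs(mass - offset) <= tol:
            hits.append((combo, mass))
    return hits


if __name__ == "__main__":
    # offset = observed intact mass minus the unmodified protein mass
    for combo, mass in annotate_peak(3051.12):
        print(combo, f"{mass:.2f} Da")    # -> ('G0F', 'G1F') 3051.12 Da
```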

  3. Statistics for library and information services a primer for using open source R software for accessibility and visualization

    CERN Document Server

    Friedman, Alon

    2016-01-01

    Statistics for Library and Information Services, written for non-statisticians, provides logical, user-friendly, and step-by-step instructions to make statistics more accessible for students and professionals in the field of Information Science. It emphasizes concepts of statistical theory and data collection methodologies, but also extends to the topics of visualization creation and display, so that the reader will be able to better conduct statistical analysis and communicate his/her findings. The book is tailored for information science students and professionals. It has specific examples of datasets, scripts, design modules, data repositories, homework assignments, and a glossary lexicon that matches the field of Information Science. The textbook provides a visual road map that is customized specifically for Information Science instructors, students, and professionals regarding statistics and visualization. Each chapter in the book includes full-color illustrations on how to use R for the statistical ...

  4. IClinfMRI Software for Integrating Functional MRI Techniques in Presurgical Mapping and Clinical Studies.

    Science.gov (United States)

    Hsu, Ai-Ling; Hou, Ping; Johnson, Jason M; Wu, Changwei W; Noll, Kyle R; Prabhu, Sujit S; Ferguson, Sherise D; Kumar, Vinodh A; Schomer, Donald F; Hazle, John D; Chen, Jyh-Horng; Liu, Ho-Ling

    2018-01-01

    Task-evoked and resting-state (rs) functional magnetic resonance imaging (fMRI) techniques have been applied to the clinical management of neurological diseases, exemplified by presurgical localization of eloquent cortex, to assist neurosurgeons in maximizing resection while preserving brain functions. In addition, recent studies have recommended incorporating cerebrovascular reactivity (CVR) imaging into clinical fMRI to evaluate the risk of lesion-induced neurovascular uncoupling (NVU). Although each of these imaging techniques possesses its own advantage for presurgical mapping, specialized clinical software that integrates the three complementary techniques and promptly outputs the analyzed results to radiology and surgical navigation systems in a clinical format is still lacking. We developed the Integrated fMRI for Clinical Research (IClinfMRI) software to facilitate these needs. Beyond the independent processing of task-fMRI, rs-fMRI, and CVR mapping, IClinfMRI encompasses three unique functions: (1) supporting interactive rs-fMRI mapping while visualizing task-fMRI results (or results from published meta-analyses) as a guidance map, (2) indicating/visualizing the NVU potential on analyzed fMRI maps, and (3) exporting these advanced mapping results in a Digital Imaging and Communications in Medicine (DICOM) format that is ready for export to a picture archiving and communication system (PACS) and a surgical navigation system. In summary, IClinfMRI has the merits of efficiently translating and integrating state-of-the-art imaging techniques for presurgical functional mapping and clinical fMRI studies.

  5. An integrated and accessible sample data library for Mars sample return science

    Science.gov (United States)

    Tuite, M. L., Jr.; Williford, K. H.

    2015-12-01

    Over the course of the next decade or more, many thousands of geological samples will be collected and analyzed in a variety of ways by researchers at the Jet Propulsion Laboratory (California Institute of Technology) in order to facilitate discovery and contextualize observations made of Mars rocks both in situ and here on Earth if samples are eventually returned. Integration of data from multiple analyses of samples including petrography, thin section and SEM imaging, isotope and organic geochemistry, XRF, XRD, and Raman spectrometry is a challenge and a potential obstacle to discoveries that require supporting lines of evidence. We report the development of a web-accessible repository, the Sample Data Library (SDL), for the sample-based data that are generated by the laboratories and instruments that comprise JPL's Center for Analysis of Returned Samples (CARS) in order to facilitate collaborative interpretation of potential biosignatures in Mars-analog geological samples. The SDL is constructed using low-cost, open-standards-based Amazon Web Services (AWS), including web-accessible storage, relational database services, and a virtual web server. The data structure is sample-centered with a shared registry for assigning unique identifiers to all samples including International Geo-Sample Numbers. Both raw and derived data produced by instruments and post-processing workflows are automatically uploaded to online storage and linked via the unique identifiers. Through the web interface, users are able to find all the analyses associated with a single sample or search across features shared by multiple samples, sample localities, and analysis types. Planned features include more sophisticated search and analytical interfaces as well as data discoverability through NSF's EarthCube program.
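
    The sample-centered structure described above can be illustrated with a small relational schema in which every analysis row carries the sample's unique identifier, so that all results for one sample are retrieved with a single join. The schema, identifiers and URIs below are illustrative assumptions, not the SDL's actual design.

```python
"""Illustrative sample-centered schema (not the actual SDL schema):
each analysis links back to a sample via a shared unique identifier."""
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE sample   (igsn TEXT PRIMARY KEY, locality TEXT);
    CREATE TABLE analysis (id INTEGER PRIMARY KEY,
                           igsn TEXT REFERENCES sample(igsn),
                           method TEXT, result_uri TEXT);
""")
db.execute("INSERT INTO sample VALUES ('IEJPL0001', 'Mars-analog outcrop A')")
db.executemany("INSERT INTO analysis (igsn, method, result_uri) VALUES (?, ?, ?)",
               [("IEJPL0001", "XRD",   "s3://sdl/raw/xrd/0001.xml"),
                ("IEJPL0001", "Raman", "s3://sdl/raw/raman/0001.csv")])

# All analyses associated with a single sample:
for row in db.execute("""SELECT s.igsn, a.method, a.result_uri
                         FROM sample s JOIN analysis a USING (igsn)"""):
    print(row)
```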

  6. Integrated Web-Based Immersive Exploration of the Coordinated Canyon Experiment Data using Open Source STOQS Software

    Science.gov (United States)

    McCann, M. P.; Gwiazda, R.; O'Reilly, T. C.; Maier, K. L.; Lundsten, E. M.; Parsons, D. R.; Paull, C. K.

    2017-12-01

    The Coordinated Canyon Experiment (CCE) in Monterey Submarine Canyon has produced a wealth of oceanographic measurements whose analysis will improve understanding of turbidity current processes. Exploration of this data set, consisting of over 60 parameters from 15 platforms, is facilitated by using the open source Spatial Temporal Oceanographic Query System (STOQS) software (https://github.com/stoqs/stoqs). The Monterey Bay Aquarium Research Institute (MBARI) originally developed STOQS to help manage and visualize upper water column oceanographic measurements, but the generality of its data model permits effective use for any kind of spatial/temporal measurement data. STOQS consists of a PostgreSQL database and server-side Python/Django software; the client-side is jQuery JavaScript supporting AJAX requests to update a single page web application. The User Interface (UI) is optimized to provide a quick overview of data in spatial and temporal dimensions, as well as in parameter, platform, and data value space. A user may zoom into any feature of interest and select it, initiating a filter operation that updates the UI with an overview of all the data in the new filtered selection. When details are desired, radio buttons and checkboxes are selected to generate a number of different types of visualizations. These include color-filled temporal section and line plots, parameter-parameter plots, 2D map plots, and interactive 3D spatial visualizations. The Extensible 3D (X3D) standard and X3DOM JavaScript library provide the technology for presenting animated 3D data directly within the web browser. Most of the oceanographic measurements from the CCE (e.g. mooring mounted ADCP and CTD data) are easily visualized using established methods. However, unified integration and multiparameter display of several concurrently deployed sensors across a network of platforms is a challenge we hope to solve. Moreover, STOQS also allows display of data from a new instrument - the

  7. VisIVO: A Library and Integrated Tools for Large Astrophysical Dataset Exploration

    Science.gov (United States)

    Becciani, U.; Costa, A.; Ersotelos, N.; Krokos, M.; Massimino, P.; Petta, C.; Vitello, F.

    2012-09-01

    VisIVO provides an integrated suite of tools and services that can be used in many scientific fields. VisIVO's development started in the Virtual Observatory framework. VisIVO allows users to meaningfully visualize highly complex, large-scale datasets and create movies of these visualizations based on distributed infrastructures. VisIVO supports high-performance, multi-dimensional visualization of large-scale astrophysical datasets. Users can rapidly obtain meaningful visualizations while preserving full and intuitive control of the relevant parameters. VisIVO consists of VisIVO Desktop, a stand-alone application for interactive visualization on standard PCs; VisIVO Server, a platform for high performance visualization; VisIVO Web, a custom designed web portal; VisIVOSmartphone, an application to exploit the VisIVO Server functionality; and the latest VisIVO feature: the VisIVO Library, which allows a job running on a computational system (grid, HPC, etc.) to produce movies directly from the code's internal data arrays without the need to produce intermediate files. This is particularly important when running on large computational facilities, where the user wants to look at the results during the data production phase. For example, in grid computing facilities, images can be produced directly in the grid catalogue while the user code is running in a system that cannot be directly accessed by the user (a worker node). The deployment of VisIVO on the DG and gLite is carried out with the support of the EDGI and EGI-InSPIRE projects. Depending on the structure and size of the datasets under consideration, the data exploration process can take several hours of CPU time for creating customized views, and the production of movies can potentially last several days. For this reason an MPI parallel version of VisIVO could play a fundamental role in increasing performance; e.g. it could be automatically deployed on nodes that are MPI aware. A central concept in our development is thus to

  8. Learning Boost C++ libraries

    CERN Document Server

    Mukherjee, Arindam

    2015-01-01

    If you are a C++ programmer who has never used Boost libraries before, this book will get you up to speed with using them. Whether you are developing new C++ software or maintaining existing code written using Boost libraries, this hands-on introduction will help you decide on the right library and techniques to solve your practical programming problems.

  9. XplorSeq: a software environment for integrated management and phylogenetic analysis of metagenomic sequence data.

    Science.gov (United States)

    Frank, Daniel N

    2008-10-07

    Advances in automated DNA sequencing technology have accelerated the generation of metagenomic DNA sequences, especially environmental ribosomal RNA gene (rDNA) sequences. As the scale of rDNA-based studies of microbial ecology has expanded, need has arisen for software that is capable of managing, annotating, and analyzing the plethora of diverse data accumulated in these projects. XplorSeq is a software package that facilitates the compilation, management and phylogenetic analysis of DNA sequences. XplorSeq was developed for, but is not limited to, high-throughput analysis of environmental rRNA gene sequences. XplorSeq integrates and extends several commonly used UNIX-based analysis tools by use of a Macintosh OS X-based graphical user interface (GUI). Through this GUI, users may perform basic sequence import and assembly steps (base-calling, vector/primer trimming, contig assembly), perform BLAST (Basic Local Alignment Search Tool) searches of NCBI and local databases, create multiple sequence alignments, build phylogenetic trees, assemble Operational Taxonomic Units, estimate biodiversity indices, and summarize data in a variety of formats. Furthermore, sequences may be annotated with user-specified meta-data, which then can be used to sort data and organize analyses and reports. A document-based architecture permits parallel analysis of sequence data from multiple clones or amplicons, with sequences and other data stored in a single file. XplorSeq should benefit researchers who are engaged in analyses of environmental sequence data, especially those with little experience using bioinformatics software. Although XplorSeq was developed for management of rDNA sequence data, it can be applied to most any sequencing project. The application is available free of charge for non-commercial use at http://vent.colorado.edu/phyloware.
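
    As an example of the biodiversity estimation step, the sketch below computes the Shannon index from OTU abundance counts. The counts are invented for illustration, and the formula is the standard one, not code from XplorSeq.

```python
"""Illustrative calculation of one biodiversity index XplorSeq can
estimate, the Shannon index H', from OTU abundance counts."""
import math


def shannon_index(counts: list[int]) -> float:
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)


otu_counts = [120, 45, 30, 9, 3]    # clones per OTU (illustrative)
print(f"H' = {shannon_index(otu_counts):.3f}")
```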

  10. Virtual pools for interactive analysis and software development through an integrated Cloud environment

    International Nuclear Information System (INIS)

    Grandi, C; Italiano, A; Salomoni, D; Melcarne, A K Calabrese

    2011-01-01

    WNoDeS, an acronym for Worker Nodes on Demand Service, is software developed at CNAF-Tier1, the National Computing Centre of the Italian Institute for Nuclear Physics (INFN) located in Bologna. WNoDeS provides on-demand, integrated access to both Grid and Cloud resources through virtualization technologies. Besides the traditional use of computing resources in batch mode, users need interactive and local access to a number of systems. WNoDeS can dynamically provide such computers by instantiating Virtual Machines according to users' requirements (computing, storage and network resources), through either the Open Cloud Computing Interface API or a web console. Interactive use is usually limited to activities in user space, i.e. where the machine configuration is not modified. In other instances the activity concerns the development and testing of services and thus implies modification of the system configuration (and, therefore, root access to the resource). The former use case is a simple extension of the WNoDeS approach, where the resource is provided in interactive mode. The latter implies saving the virtual image at the end of each user session so that it can be presented to the user at subsequent requests. This work describes how the LHC experiments at INFN-Bologna are testing and making use of these dynamically created ad-hoc machines via WNoDeS to support flexible, interactive analysis and software development at the INFN Tier-1 Computing Centre.

  11. Transformation as a Design Process and Runtime Architecture for High Integrity Software

    Energy Technology Data Exchange (ETDEWEB)

    Bespalko, S.J.; Winter, V.L.

    1999-04-05

    We have discussed two aspects of creating high integrity software that greatly benefit from the availability of transformation technology, which in this case is manifested by the requirement for a sophisticated backtracking parser. First, because of the potential for correctly manipulating programs via small changes, an automated non-procedural transformation system can be a valuable tool for constructing high assurance software. Second, modeling the process of translating data into information as a (perhaps context-dependent) grammar leads to an efficient, compact implementation. From a practical perspective, the transformation process should begin in the domain language in which a problem is initially expressed. Thus, in order for a transformation system to be practical, it must be flexible with respect to domain-specific languages. We have argued that transformation applied to specifications results in a highly reliable system. We also attempted to briefly demonstrate that transformation technology applied to the runtime environment will result in a safe and secure system. We thus believe that sophisticated multi-lookahead backtracking parsing technology is central to the task of being in a position to demonstrate the existence of HIS.
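
    A toy example of the kind of backtracking parser the authors require is sketched below: a recursive-descent parser that tries the longer production first and falls back to the shorter one when it fails. The grammar is invented for the example.

```python
"""Toy recursive-descent parser with backtracking.
Grammar (invented for the example): S -> 'a' S 'a' | 'a'
Backtracking is required because both alternatives start with 'a'."""


def parse_s(text: str, pos: int) -> int | None:
    """Return the position after a successful parse of S, else None."""
    # Alternative 1: 'a' S 'a'  (try the longer production first)
    if pos < len(text) and text[pos] == "a":
        mid = parse_s(text, pos + 1)
        if mid is not None and mid < len(text) and text[mid] == "a":
            return mid + 1
        # fall through: backtrack and try the second alternative
    # Alternative 2: 'a'
    if pos < len(text) and text[pos] == "a":
        return pos + 1
    return None


for s in ["a", "aaa", "aa"]:
    end = parse_s(s, 0)
    print(s, "accepted" if end == len(s) else "rejected")
```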

  12. A software tool for the integrated risk assessment of spent fuel transportation and storage

    Energy Technology Data Exchange (ETDEWEB)

    Yun, Mi Rae; Almomani, Belal; Ham, Jae Hyun; Kang, Hyun Gook [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of); Christian, Robby [Dept. of Mechanical, Aerospace, and Nuclear Engineering, Rensselaer Polytechnic Institute, Troy, NY (United States); Kim, Bo Gyung [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of); Lee, Sang Hoon [Dept. of Mechanical and Automotive Engineering, Keimyung University, Daegu (Korea, Republic of)

    2017-06-15

    When temporary spent fuel storage pools at nuclear power plants reach their capacity limit, the spent fuel must be moved to an alternative storage facility. However, radioactive materials must be handled and stored carefully to avoid severe consequences to the environment. In this study, the risks of three potential accident scenarios (i.e., maritime transportation, an aircraft crashing into an interim storage facility, and on-site transportation) associated with the spent fuel transportation process were analyzed using a probabilistic approach. For each scenario, the probabilities and the consequences were calculated separately to assess the risks: the probabilities were calculated using existing data and statistical models, and the consequences were calculated using computation models. Risk assessment software was developed to conveniently integrate the three scenarios. The risks were analyzed using the developed software according to the shipment route, building characteristics, and spent fuel handling environment. As a result of the risk analysis with varying accident conditions, transportation and storage strategies with relatively low risk were developed for regulators and licensees. The focus of this study was the risk assessment methodology; however, the applied model and input data have some uncertainties. Further research to reduce these uncertainties will improve the accuracy of this model.
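
    At its simplest, the probabilistic framing described above reduces to a frequency-weighted sum of scenario consequences. The sketch below illustrates that aggregation; all probabilities and consequence values are placeholders, not results from the study.

```python
"""Minimal sketch of probabilistic risk aggregation: overall risk is
the probability-weighted sum of scenario consequences. All numbers
below are placeholders, not values from the study."""

scenarios = {
    # name: (probability per shipment campaign, consequence in person-Sv)
    "maritime transportation":   (1.0e-6, 40.0),
    "aircraft crash on storage": (5.0e-8, 300.0),
    "on-site transportation":    (2.0e-5, 2.0),
}

total_risk = sum(p * c for p, c in scenarios.values())
for name, (p, c) in scenarios.items():
    print(f"{name:28s} risk = {p * c:.2e} person-Sv")
print(f"{'total':28s} risk = {total_risk:.2e} person-Sv")
```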

  13. A software tool for the integrated risk assessment of spent fuel transportation and storage

    International Nuclear Information System (INIS)

    Yun, Mi Rae; Almomani, Belal; Ham, Jae Hyun; Kang, Hyun Gook; Christian, Robby; Kim, Bo Gyung; Lee, Sang Hoon

    2017-01-01

    When temporary spent fuel storage pools at nuclear power plants reach their capacity limit, the spent fuel must be moved to an alternative storage facility. However, radioactive materials must be handled and stored carefully to avoid severe consequences to the environment. In this study, the risks of three potential accident scenarios (i.e., maritime transportation, an aircraft crashing into an interim storage facility, and on-site transportation) associated with the spent fuel transportation process were analyzed using a probabilistic approach. For each scenario, the probabilities and the consequences were calculated separately to assess the risks: the probabilities were calculated using existing data and statistical models, and the consequences were calculated using computation models. Risk assessment software was developed to conveniently integrate the three scenarios. The risks were analyzed using the developed software according to the shipment route, building characteristics, and spent fuel handling environment. As a result of the risk analysis with varying accident conditions, transportation and storage strategies with relatively low risk were developed for regulators and licensees. The focus of this study was the risk assessment methodology; however, the applied model and input data have some uncertainties. Further research to reduce these uncertainties will improve the accuracy of this model.

  14. GMATA: An Integrated Software Package for Genome-Scale SSR Mining, Marker Development and Viewing.

    Science.gov (United States)

    Wang, Xuewen; Wang, Le

    2016-01-01

    Simple sequence repeats (SSRs), also referred to as microsatellites, are highly variable tandem DNA repeats that are widely used as genetic markers. The increasing availability of whole-genome and transcript sequences provides information resources for SSR marker development. However, efficient software is required to identify and display SSR information along with other gene features at a genome scale. We developed a novel software package, the Genome-wide Microsatellite Analyzing Tool Package (GMATA), which integrates SSR mining, statistical analysis and plotting, marker design, polymorphism screening and marker transferability assessment, and enables the simultaneous display of SSR markers with other genome features. GMATA applies novel strategies for SSR analysis and primer design in large genomes, which allows it to calculate faster and provide more accurate results than existing tools. Our package is also capable of processing DNA sequences of any size on a standard computer. GMATA is user friendly, requires only mouse clicks or typed inputs on the command line, and is executable on multiple computing platforms. We demonstrated the application of GMATA in plant genomes and reveal a novel distribution pattern of SSRs in 15 grass genomes. The most abundant motifs are the GA/TC dimer, the A/T monomer and the GCG/CGC trimer, rather than motifs rich in G/C content. We also revealed that SSR count scales linearly with chromosome length in fully assembled grass genomes. GMATA represents a powerful application tool that facilitates genomic sequence analyses. GMATA is freely available at http://sourceforge.net/projects/gmata/?source=navbar.
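
    SSR mining itself can be illustrated with a compact regular-expression search for motifs of 1 to 6 bp repeated in tandem, shown below. The repeat-count and length thresholds are illustrative; GMATA's own strategies and defaults may differ.

```python
"""Sketch of regex-based SSR (microsatellite) detection, the core task
GMATA performs (GMATA's implementation and thresholds may differ)."""
import re

# A 1-6 bp motif followed by at least two more tandem copies (>= 3 total).
SSR_PATTERN = re.compile(r"(([ACGT]{1,6}?)\2{2,})")


def find_ssrs(seq: str, min_len: int = 10):
    for match in SSR_PATTERN.finditer(seq.upper()):
        repeat, motif = match.group(1), match.group(2)
        if len(repeat) >= min_len:
            yield match.start(), motif, len(repeat) // len(motif)


if __name__ == "__main__":
    demo = "CCTAGAGAGAGAGAGATTTGCGCGCGCGCAAT"
    for pos, motif, copies in find_ssrs(demo):
        print(f"pos {pos}: ({motif}){copies}")   # -> (AG)6 and (GC)5
```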

  15. An Integrated Software Suite for Surface-based Analyses of Cerebral Cortex

    Science.gov (United States)

    Van Essen, David C.; Drury, Heather A.; Dickson, James; Harwell, John; Hanlon, Donna; Anderson, Charles H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database. PMID:11522765

  16. An integrated software suite for surface-based analyses of cerebral cortex

    Science.gov (United States)

    Van Essen, D. C.; Drury, H. A.; Dickson, J.; Harwell, J.; Hanlon, D.; Anderson, C. H.

    2001-01-01

    The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database.

  17. Enterprise Reference Library

    Science.gov (United States)

    Bickham, Grandin; Saile, Lynn; Havelka, Jacque; Fitts, Mary

    2011-01-01

    Introduction: Johnson Space Center (JSC) offers two extensive libraries that contain journals, research literature and electronic resources. Searching capabilities are available to those individuals residing onsite or through a librarian's search. Many individuals have rich collections of references, but no mechanisms exist to share reference libraries across researchers, projects, or directorates. Likewise, information regarding which references are provided to which individuals is not available, resulting in duplicate requests, redundant labor costs and associated copying fees. In addition, this tends to limit collaboration between colleagues and promotes the establishment of individual, unshared silos of information. The Integrated Medical Model (IMM) team has utilized a centralized reference management tool during the development, test, and operational phases of this project. The Enterprise Reference Library project expands the capabilities developed for IMM to address the above issues and enhance collaboration across JSC. Method: After significant market analysis for a multi-user reference management tool, no available commercial tool was found to meet this need, so a software program was built around a commercial tool, Reference Manager 12 by The Thomson Corporation. A use case approach guided the requirements development phase. The premise of the design is that individuals use their own reference management software and export to SharePoint when their library is incorporated into the Enterprise Reference Library. This results in a searchable user-specific library application. An accompanying share folder warehouses the electronic full-text articles, which allows the global user community to access full-text articles. Discussion: An enterprise reference library solution can provide a multidisciplinary collection of full-text articles. This approach improves efficiency in obtaining and storing reference material while greatly reducing labor, purchasing and

  18. BioModels.net Web Services, a free and integrated toolkit for computational modelling software.

    Science.gov (United States)

    Li, Chen; Courtot, Mélanie; Le Novère, Nicolas; Laibe, Camille

    2010-05-01

    Exchanging and sharing scientific results are essential for researchers in the field of computational modelling. BioModels.net defines agreed-upon standards for model curation. A fundamental one, MIRIAM (Minimum Information Requested in the Annotation of Models), standardises the annotation and curation process of quantitative models in biology. To support this standard, MIRIAM Resources maintains a set of standard data types for annotating models, and provides services for manipulating these annotations. Furthermore, BioModels.net creates controlled vocabularies, such as SBO (Systems Biology Ontology) which strictly indexes, defines and links terms used in Systems Biology. Finally, BioModels Database provides a free, centralised, publicly accessible database for storing, searching and retrieving curated and annotated computational models. Each resource provides a web interface to submit, search, retrieve and display its data. In addition, the BioModels.net team provides a set of Web Services which allows the community to programmatically access the resources. A user is then able to perform remote queries, such as retrieving a model and resolving all its MIRIAM Annotations, as well as getting the details about the associated SBO terms. These web services use established standards. Communications rely on SOAP (Simple Object Access Protocol) messages and the available queries are described in a WSDL (Web Services Description Language) file. Several libraries are provided in order to simplify the development of client software. BioModels.net Web Services take researchers one step further towards simulating and understanding biological systems in their entirety, by allowing them to retrieve biological models into their own tools, combine queries in workflows and efficiently analyse models.
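
    As an illustration of the programmatic access described above, the sketch below queries a SOAP/WSDL service from Python using the zeep library. The WSDL URL and the operation name getModelById are illustrative placeholders, not the documented BioModels.net API.

```python
# Minimal sketch of calling a SOAP/WSDL web service from Python.
# The WSDL URL and operation name are placeholders, not the real API.
from zeep import Client

WSDL_URL = "https://example.org/biomodels/services?wsdl"  # placeholder
client = Client(WSDL_URL)  # zeep builds callable proxies from the WSDL

# Hypothetical operation: fetch a curated model by its identifier.
sbml_xml = client.service.getModelById("BIOMD0000000001")
print(sbml_xml[:200])  # first few characters of the returned document
```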

  19. Open source projects as incubators of innovation: From niche phenomenon to integral part of the software industry

    OpenAIRE

    Schrape, Jan-Felix

    2017-01-01

    Over the last 20 years, open source development has become an integral part of the software industry and a key component of the innovation strategies of all major IT providers. Against this backdrop, this paper seeks to develop a systematic overview of open source communities and their socio-economic contexts. I begin with a reconstruction of the genesis of open source software projects and their changing relationships to established IT companies. This is followed by the identification of f...

  20. Web-Based Software Integration For Dissemination Of Archival Images: The Frontiers Of Science Website

    Directory of Open Access Journals (Sweden)

    Gary Browne

    2011-07-01

    Full Text Available The Frontiers of Science illustrated comic strip of 'science fact' ran from 1961 to 1982, syndicated worldwide through over 600 newspapers. The Rare Books and Special Collections Library at the University of Sydney, in association with Sydney eScholarship, digitized all 939 strips. We aimed to create a website that could disseminate these comic strips to scholars, enthusiasts and the general public. We wanted to enable users to search and browse through the images simply and effectively, with an intuitive and novel viewing platform. Time and resource constraints dictated the use of (mostly open source) code modules wherever possible and the integration and customisation of a range of web-based applications, code snippets and technologies (DSpace, eXtensible Text Framework (XTF), OmniFormat, jQuery Tools, Thickbox and Zoomify), stylistically pulled together using CSS. This approach allowed for a rapid development cycle (6 weeks) to deliver the site on time as well as provide us with a framework for similar projects.

  1. News from the Library

    CERN Multimedia

    CERN Library

    2010-01-01

    A third of the world’s current literature in electrical engineering is available on your CERN desktop. Looking for a technical standard on software reviews and audits? Is it referred to as "IEEE color books"? Want to download and read NOW the latest version of IEEE 802? Whenever a need for a technical standard or specification arises in your activity, the Library is here to serve you. For IEEE standards it is particularly easy; the whole collection is available for immediate download. Indeed, since 2007, the CERN Library offers readers online access to the complete IEEE Electronic Library (Institute of Electrical and Electronics Engineers). This licence gives unlimited online access to all IEEE and IET journals and proceedings, starting from the first issue. But not everyone knows that this resource also gives access to all current IEEE standards as well as a selection of archival ones. The Library is now working on the integration of a selection of these standards in our onlin...

  2. Millennials in action: a student-guided effort in curriculum-integration of library skills.

    Science.gov (United States)

    Brower, Stewart

    2004-01-01

    By working in tandem with the Coordinator of Information Management Education (IME) at the University at Buffalo Health Sciences Library, students serving on the School of Pharmacy and Pharmaceutical Sciences Curriculum Committee helped map out a three-year plan for training in library and information literacy skills. Through meetings and e-mail exchanges with the student representatives, the IME Coordinator developed a series of specific course-related instruction and assessment opportunities which would cover tertiary resources, bibliographic searching, evidence-based pharmacy, and advanced information skills.

  3. INTEGRATIVE PROPERTIES OF LIBRARY FUNCTIONS: IMPLEMENTATION IN THE EDUCATIONAL ELECTRONIC ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Т. Л. Бірюкова

    2017-02-01

    In our opinion, distance education programmes should be used primarily to build interaction between educational institutions and libraries in the electronic environment. To this end, a 'Library-Education' interaction system was created and introduced into the teaching of subjects at the Documentation and Information Activities department of Odessa National Polytechnic University. Student groups work directly in an information institution, which helps them apply theoretical material in practice, adapt to the professional environment, and complete their full production and pre-diploma internships.

  4. The Use of BookshelF in Teaching Students of Information and Library Management.

    Science.gov (United States)

    Rowley, Jenny E.; Fisher, Shelagh

    1992-01-01

    Describes how BookshelF software, a library management system, has been integrated into the library and information management curriculum at Manchester Polytechnic (England), and discusses the learning objectives that have been achieved. The use of BookshelF modules for ordering, circulation control, cataloging, OPAC (online public access…

  5. Integrating Visualization Applications, such as ParaView, into HEP Software Frameworks for In-situ Event Displays

    Science.gov (United States)

    Lyon, A. L.; Kowalkowski, J. B.; Jones, C. D.

    2017-10-01

    ParaView is a high performance visualization application not widely used in High Energy Physics (HEP). It is a long-standing open source project led by Kitware and involves several Department of Energy (DOE) and Department of Defense (DOD) laboratories. Furthermore, it has been adopted by many DOE supercomputing centers and other sites. ParaView is unique in speed and efficiency by using state-of-the-art techniques developed by the academic visualization community that are often not found in applications written by the HEP community. In-situ visualization of events, where event details are visualized during processing/analysis, is a common task for experiment software frameworks. Kitware supplies Catalyst, a library that enables scientific software to serve visualization objects to client ParaView viewers, yielding a real-time event display. The connection of ParaView to the Fermilab art framework is described and the capabilities it brings are discussed.
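
    The in-situ pattern described here, visualizing events while they are being processed rather than afterwards, can be sketched schematically as below. All names are illustrative stand-ins; this is not the art or Catalyst API.

```python
# Schematic sketch of in-situ visualization: the framework hands event
# data to a rendering hook during processing instead of writing files
# for later viewing. Every name here is an invented placeholder.

def make_coprocessor(render, stride=10):
    """Wrap a rendering callback so it runs every `stride` events."""
    def coprocess(event, step):
        if step % stride == 0:      # only visualize selected events
            render(event)           # e.g. serve geometry to a live viewer
    return coprocess

def render_event(event):
    # A real adaptor would convert tracks/hits into visualization
    # objects and push them to the connected viewer.
    print(f"serving event {event['id']} with {len(event['hits'])} hits")

coprocess = make_coprocessor(render_event)
for step in range(100):             # stand-in for the event loop
    event = {"id": step, "hits": list(range(step % 7))}
    coprocess(event, step)
```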

  6. Integrating Visualization Applications, such as ParaView, into HEP Software Frameworks for In-situ Event Displays

    Energy Technology Data Exchange (ETDEWEB)

    Lyon, A. L. [Fermilab; Kowalkowski, J. B. [Fermilab; Jones, C. D. [Fermilab

    2017-11-22

    ParaView is a high performance visualization application not widely used in High Energy Physics (HEP). It is a long-standing open source project led by Kitware and involves several Department of Energy (DOE) and Department of Defense (DOD) laboratories. Furthermore, it has been adopted by many DOE supercomputing centers and other sites. ParaView is unique in speed and efficiency by using state-of-the-art techniques developed by the academic visualization community that are often not found in applications written by the HEP community. In-situ visualization of events, where event details are visualized during processing/analysis, is a common task for experiment software frameworks. Kitware supplies Catalyst, a library that enables scientific software to serve visualization objects to client ParaView viewers, yielding a real-time event display. The connection of ParaView to the Fermilab art framework is described and the capabilities it brings are discussed.

  7. AN Integrated Bibliographic Information System: Concept and Application for Resource Sharing in Special Libraries

    Science.gov (United States)

    1987-05-01

    workload (beyond that of, say, an equivalent academic or corporate technical library) for the Defense Department libraries. Figure 9 illustrates the range...summer. The hardware configuration for the system is as follows: Digital Equipment Corporation VAX 11/750 central processor with 6 megabytes of real

  8. A model library for simulation and benchmarking of integrated urban wastewater systems

    DEFF Research Database (Denmark)

    Saagi, R.; Flores Alsina, Xavier; Kroll, J. S.

    2017-01-01

    This paper presents a freely distributed, open-source toolbox to predict the behaviour of urban wastewater systems (UWS). The proposed library is used to develop a system-wide Benchmark Simulation Model (BSM-UWS) for evaluating (local/global) control strategies in urban wastewater systems (UWS...

  9. Project iPad: Investigating Tablet Integration in Learning and Libraries at Ryerson University

    Science.gov (United States)

    Eichenlaub, Naomi; Gabel, Laine; Jakubek, Dan; McCarthy, Graham; Wang, Weina

    2011-01-01

    The year 2010 saw a major revolution in tablet technology with the introduction of the Apple iPad. Curious about the potential of this new technology for libraries, a group of librarians at Ryerson University in Toronto seized an opportunity to investigate the emerging role of the tablet in the daily academic lives of students. The group found…

  10. Integrated SCM/PDM/CRM and delivery of software products to 160.000 customers

    NARCIS (Netherlands)

    R.L. Jansen (Remy); G.C. Ballintijn (Gerco); S. Brinkkemper; A. van Nieuwland

    2004-01-01

    textabstractThe release and deployment of enterprise application software is a potentially complex task for software vendors. This complexity can unfortunately result in a significant amount of work and risk. This paper presents a case study of a product software vendor that tries to reduce this

  11. Integrating a comprehensive DNA barcode reference library with a global map of yews (Taxus L.) for forensic identification.

    Science.gov (United States)

    Liu, Jie; Milne, Richard I; Möller, Michael; Zhu, Guang-Fu; Ye, Lin-Jiang; Luo, Ya-Huang; Yang, Jun-Bo; Wambulwa, Moses Cheloti; Wang, Chun-Neng; Li, De-Zhu; Gao, Lian-Ming

    2018-05-22

    Rapid and accurate identification of endangered species is a critical component of bio-surveillance and conservation management, and potentially policing illegal trades. However, this is often not possible using traditional taxonomy, especially where only small or pre-processed parts of plants are available. Reliable identification can be achieved via a comprehensive DNA barcode reference library, accompanied by precise distribution data. However, these require extensive sampling at spatial and taxonomic scales, which has rarely been achieved for cosmopolitan taxa. Here we construct a comprehensive DNA barcode reference library, and generate distribution maps using species distribution modeling (SDM), for all 15 Taxus species worldwide. We find that trnL-trnF is the ideal barcode for Taxus: it can distinguish all Taxus species, and in combination with ITS identify hybrids. Among five analysis methods tested, NJ was the most effective. Among 4151 individuals screened for trnL-trnF, 73 haplotypes were detected, all species-specific and some population private. Taxonomical, geographical and genetic dimensions of sampling strategy were all found to affect the comprehensiveness of the resulting DNA barcode library. Maps from SDM showed that most species had allopatric distributions, except three in the Sino-Himalayan region. Using the barcode library and distribution map data, two unknown forensic samples were identified to species (and in one case, population) level, and another was determined as a putative interspecific hybrid. This integrated species identification system for Taxus can be used for bio-surveillance, conservation management and to monitor and prosecute illegal trade. Similar identification systems are recommended for other IUCN- and CITES-listed taxa. This article is protected by copyright. All rights reserved.
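
    The identification step itself, matching a query barcode against a library of species-specific haplotypes, can be sketched as a toy nearest-haplotype lookup. The sequences and species assignments below are invented placeholders, not data from the study.

```python
# Toy sketch of barcode-based identification: compare a query trnL-trnF
# sequence against a reference library of species-specific haplotypes.
# All sequences below are invented for illustration.

REFERENCE = {  # haplotype -> species (species-specific, as in the study)
    "ATTGCGTA": "Taxus baccata",
    "ATTGCCTA": "Taxus wallichiana",
    "ATAGCGTA": "Taxus cuspidata",
}

def hamming(a: str, b: str) -> int:
    """Number of mismatching positions between equal-length sequences."""
    return sum(x != y for x, y in zip(a, b))

def identify(query: str):
    """Return the species whose reference haplotype is closest."""
    best = min(REFERENCE, key=lambda h: hamming(query, h))
    return REFERENCE[best], hamming(query, best)

species, dist = identify("ATTGCGTA")
print(species, dist)   # exact match -> distance 0
```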

  12. Integrated graphical user interface for the back-end software sub-system

    International Nuclear Information System (INIS)

    Badescu, E.; Caprini, M.

    2001-01-01

    The ATLAS data acquisition and Event Filter prototype '-1' project was intended to produce a prototype system for evaluating candidate technologies and architectures for the final ATLAS DAQ system on the LHC accelerator at CERN. Within the prototype project, the back-end sub-system encompasses the software for configuring, controlling and monitoring the data acquisition (DAQ). The back-end sub-system includes core components and detector integration components. One of the detector integration components is the Integrated Graphical User Interface (IGUI), which is intended to give a view of the status of the DAQ system and its sub-systems (Dataflow, Event Filter and Back-end) and to allow the user (a general user such as a shift operator at a test beam, or an expert who needs to control and debug the DAQ system) to control its operation. The IGUI is intended to be a Status Display and a Control Interface too, so there are three groups of functional requirements: display requirements (the information to be displayed); control requirements (the actions the IGUI shall perform on the DAQ components); general requirements, applying to the general functionality of the IGUI. The constraint requirements include requirements related to access control (shift operator or expert user). The quality requirements are related to portability to different platforms. The IGUI has to interact with many components in a distributed environment. The following design guidelines have been considered in order to fulfil the requirements: use a modular design that makes it easy to integrate different sub-systems; use the Java language for portability and powerful graphical features; use CORBA interfaces for communication with other components. The actual implementation of the Back-end software components uses Inter-Language Unification (ILU) for inter-process communication. Different methods of access of Java applications to ILU C++ servers have been evaluated (native methods, ILU Java support

  13. Virtual Library Design Document; TOPICAL

    International Nuclear Information System (INIS)

    M. A. deLamare

    2001-01-01

    The objective of this document is to establish a design for the virtual library user and administrative layers that complies with the requirements of the virtual library software specification and subordinate module specification.

  14. Quantitative Assessment of Free Flap Viability with CEUS Using an Integrated Perfusion Software.

    Science.gov (United States)

    Geis, S; Klein, S; Prantl, L; Dolderer, J; Lamby, P; Jung, E-M

    2015-12-01

    New treatment strategies in oncology and trauma surgery lead to an increasing demand for soft tissue reconstruction with free tissue transfer. In previous studies, CEUS was proven to detect early flap failure. The aim of this study was to detect and quantify vascular disturbances after free flap transplantation using a fast integrated perfusion software tool. From 2011 to 2013, 33 patients were examined by one experienced radiologist using CEUS after a bolus injection of 1-2.4 ml of SonoVue®. Flap perfusion was analysed qualitatively regarding contrast defects or delayed wash-in. Additionally, an integrated semi-quantitative analysis using time-intensity curve analysis (TIC) was performed. TIC analysis of the transplant was conducted on a centimetre-by-centimetre basis up to a penetration depth of 4 cm. The 2 perfusion parameters "Time to PEAK" and "Area under the Curve" were compared in patients without complications vs. patients with minor complications or complete flap loss to identify significant differences. TtoPk is given in seconds (s) and Area is given in relative units (rU). Results: A regular postoperative process was observed in 26 (79%) patients. In contrast, 5 (15%) patients with partial superficial flap necrosis, 1 patient (3%) with complete flap loss and 1 patient (3%) with haematoma were observed. TtoPk revealed no significant differences, whereas Area revealed significantly lower perfusion values in the corresponding areas in patients with complications. The critical threshold for sufficient flap perfusion was set below 150 rU. In conclusion, CEUS is a mobile and cost-effective option for quantifying tissue perfusion and can even be used almost without any restrictions in multi-morbid patients with renal and hepatic failure. © Georg Thieme Verlag KG Stuttgart · New York.
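
    The two semi-quantitative parameters can be read off a time-intensity curve as in the sketch below. The curve is synthetic; only the parameter definitions (time of maximum intensity, area under the curve in relative units) follow the abstract.

```python
# Sketch of the two TIC perfusion parameters: "Time to PEAK" (TtoPk, s)
# and "Area under the Curve" (rU). The time-intensity curve is synthetic.
import numpy as np

t = np.linspace(0, 60, 241)                   # seconds after bolus injection
tic = 200 * np.exp(-((t - 18) / 9) ** 2)      # synthetic wash-in/wash-out curve

time_to_peak = t[np.argmax(tic)]              # TtoPk: time of maximum intensity
# Trapezoid rule written out explicitly (portable across numpy versions).
area = np.sum(0.5 * (tic[1:] + tic[:-1]) * np.diff(t))

print(f"TtoPk = {time_to_peak:.1f} s, Area = {area:.0f} rU")
```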

  15. SpecOp: Optimal Extraction Software for Integral Field Unit Spectrographs

    Science.gov (United States)

    McCarron, Adam; Ciardullo, Robin; Eracleous, Michael

    2018-01-01

    The Hobby-Eberly Telescope’s new low resolution integral field spectrographs, LRS2-B and LRS2-R, each cover a 12”x6” area on the sky with 280 fibers and generate spectra with resolutions between R=1100 and R=1900. To extract 1-D spectra from the instrument’s 3D data cubes, a program is needed that is flexible enough to work for a wide variety of targets, including continuum point sources, emission line sources, and compact sources embedded in complex backgrounds. We therefore introduce SpecOp, a user-friendly python program for optimally extracting spectra from integral-field unit spectrographs. As input, SpecOp takes a sky-subtracted data cube consisting of images at each wavelength increment set by the instrument’s spectral resolution, and an error file for each count measurement. All of these files are generated by the current LRS2 reduction pipeline. The program then collapses the cube in the image plane using the optimal extraction algorithm detailed by Keith Horne (1986). The various user-selected options include the fraction of the total signal enclosed in a contour-defined region, the wavelength range to analyze, and the precision of the spatial profile calculation. SpecOp can output the weighted counts and errors at each wavelength in various table formats using python’s astropy package. We outline the algorithm used for extraction and explain how the software can be used to easily obtain high-quality 1-D spectra. We demonstrate the utility of the program by applying it to spectra of a variety of quasars and AGNs. In some of these targets, we extract the spectrum of a nuclear point source that is superposed on a spatially extended galaxy.
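
    The core of the Horne (1986) weighting that SpecOp applies can be sketched as follows, assuming a sky-subtracted cube, a matching variance cube and a normalized spatial profile; the array shapes and toy inputs are illustrative, not the SpecOp interface.

```python
# Sketch of Horne (1986) optimal extraction applied to an IFU cube:
# at each wavelength, spaxels are combined with weights P/V, where P is
# the normalized spatial profile and V the per-spaxel variance.
import numpy as np

def optimal_extract(cube, var, profile):
    """cube, var: (n_wave, ny, nx) arrays; profile: (ny, nx)."""
    P = profile / profile.sum()                 # normalize spatial profile
    num = np.sum(P * cube / var, axis=(1, 2))   # sum over spaxels of P*D/V
    den = np.sum(P ** 2 / var, axis=(1, 2))     # sum over spaxels of P^2/V
    return num / den, np.sqrt(1.0 / den)        # optimal flux, 1-sigma error

rng = np.random.default_rng(0)
cube = rng.normal(10.0, 1.0, size=(100, 6, 12))  # toy sky-subtracted cube
var = np.ones_like(cube)                          # toy per-spaxel variances
profile = np.ones((6, 12))                        # toy flat spatial profile
flux, err = optimal_extract(cube, var, profile)
print(flux[:3], err[:3])
```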

  16. HARVESTING, INTEGRATING AND DISTRIBUTING LARGE OPEN GEOSPATIAL DATASETS USING FREE AND OPEN-SOURCE SOFTWARE

    Directory of Open Access Journals (Sweden)

    R. Oliveira

    2016-06-01

    Full Text Available Federal, State and Local government agencies in the USA are investing heavily in the dissemination of Open Data sets produced by each of them. The main driver behind this thrust is to increase agencies' transparency and accountability, as well as to improve citizens' awareness. However, not all Open Data sets are easy to access and integrate with other Open Data sets available even from the same agency. The City and County of Denver Open Data Portal distributes several types of geospatial datasets, one of which is the city parcels layer containing 224,256 records. Although this data layer contains many pieces of information, it is incomplete for some custom purposes. Open-source software was used to first collect data from diverse City of Denver Open Data sets, then upload them to a repository in the Cloud, where they were processed using a PostgreSQL installation on the Cloud and Python scripts. Our method was able to extract non-spatial information from a 'not-ready-to-download' source that could then be combined with the initial data set to enhance its potential use.
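
    A minimal sketch of the harvest-and-load pattern described above: pull an open data set over HTTP and insert selected fields into a cloud PostgreSQL instance. The URL, credentials and column names are placeholders.

```python
# Sketch of harvesting an open data set and loading it into PostgreSQL.
# Endpoint, credentials and field names are invented placeholders.
import requests
import psycopg2

rows = requests.get("https://example.org/opendata/parcels.json",
                    timeout=60).json()

conn = psycopg2.connect(host="db.example.org", dbname="opendata",
                        user="etl", password="secret")
with conn, conn.cursor() as cur:                 # commits on success
    cur.execute("""CREATE TABLE IF NOT EXISTS parcels
                   (parcel_id text PRIMARY KEY, owner text)""")
    for r in rows:
        cur.execute("INSERT INTO parcels VALUES (%s, %s) "
                    "ON CONFLICT (parcel_id) DO NOTHING",
                    (r["parcel_id"], r["owner"]))
conn.close()
```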

  17. Experience with Intel's many integrated core architecture in ATLAS software

    International Nuclear Information System (INIS)

    Fleischmann, S; Neumann, M; Kama, S; Lavrijsen, W; Vitillo, R

    2014-01-01

    Intel recently released the first commercial boards of its Many Integrated Core (MIC) Architecture. MIC is Intel's solution for the domain of throughput computing, currently dominated by general purpose programming on graphics processors (GPGPU). MIC allows the use of the more familiar x86 programming model and supports standard technologies such as OpenMP, MPI, and Intel's Threading Building Blocks (TBB). This should make it possible to develop for both throughput and latency devices using a single code base. In ATLAS Software, track reconstruction has been shown to be a good candidate for throughput computing on GPGPU devices. In addition, the newly proposed offline parallel event-processing framework, GaudiHive, uses TBB for task scheduling. The MIC is thus, in principle, a good fit for this domain. In this paper, we report our experiences of porting to and optimizing ATLAS tracking algorithms for the MIC, comparing the programmability and relative cost/performance of the MIC against those of current GPGPUs and latency-optimized CPUs.

  18. Optical Beam Deflection Based AFM with Integrated Hardware and Software Platform for an Undergraduate Engineering Laboratory

    Directory of Open Access Journals (Sweden)

    Siu Hong Loh

    2017-02-01

    Full Text Available Atomic force microscopy (AFM) has been used extensively in nanoscience research since its invention. Recently, many teaching laboratories in colleges, undergraduate institutions, and even high schools incorporate AFM as an effective teaching tool for nanoscience education. This paper presents an optical beam deflection (OBD) based atomic force microscope, designed specifically for the undergraduate engineering laboratory as a teaching instrument. An electronic module for signal conditioning was built with components that are commonly available in an undergraduate electronic laboratory. In addition to off-the-shelf mechanical parts and optics, the design of custom-built mechanical parts was kept as simple as possible. Hence, the overall cost for the setup is greatly reduced. The AFM controller was developed using National Instruments Educational Laboratory Virtual Instrumentation Suite (NI ELVIS), an integrated hardware and software platform which can be programmed in LabVIEW. A simple yet effective control algorithm for scanning and feedback control was developed. Despite the use of an educational platform and low-cost components from the undergraduate laboratory, the developed AFM is capable of performing imaging in constant-force mode with submicron resolution and at reasonable scanning speed (approximately 18 min per image). Therefore, the AFM is suitable to be used as an educational tool for nanoscience. Moreover, the construction of the system can be a valuable educational experience for electronic and mechanical engineering students.

  19. Marketing in the Special Library Environment.

    Science.gov (United States)

    Powers, Janet E.

    1995-01-01

    Special libraries developed in response to a need for quick access to specific information. Integrated marketing in special libraries focuses the library toward strategic planning and offers the opportunity to develop more effective library services. (Author/AEF)

  20. Infrastructure for Multiphysics Software Integration in High Performance Computing-Aided Science and Engineering

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Michael T. [Illinois Rocstar LLC, Champaign, IL (United States); Safdari, Masoud [Illinois Rocstar LLC, Champaign, IL (United States); Kress, Jessica E. [Illinois Rocstar LLC, Champaign, IL (United States); Anderson, Michael J. [Illinois Rocstar LLC, Champaign, IL (United States); Horvath, Samantha [Illinois Rocstar LLC, Champaign, IL (United States); Brandyberry, Mark D. [Illinois Rocstar LLC, Champaign, IL (United States); Kim, Woohyun [Illinois Rocstar LLC, Champaign, IL (United States); Sarwal, Neil [Illinois Rocstar LLC, Champaign, IL (United States); Weisberg, Brian [Illinois Rocstar LLC, Champaign, IL (United States)

    2016-10-15

    The project described in this report constructed and exercised an innovative multiphysics coupling toolkit called the Illinois Rocstar MultiPhysics Application Coupling Toolkit (IMPACT). IMPACT is an open source, flexible, natively parallel infrastructure for coupling multiple uniphysics simulation codes into multiphysics computational systems. IMPACT works with codes written in several high-performance-computing (HPC) programming languages, and is designed from the beginning for HPC multiphysics code development. It is designed to be minimally invasive to the individual physics codes being integrated, and has few requirements on those physics codes for integration. The goal of IMPACT is to provide the support needed to enable coupling existing tools together in unique and innovative ways to produce powerful new multiphysics technologies without extensive modification and rewrite of the physics packages being integrated. There are three major outcomes from this project: 1) construction, testing, application, and open-source release of the IMPACT infrastructure, 2) production of example open-source multiphysics tools using IMPACT, and 3) identification and engagement of interested organizations in the tools and applications resulting from the project. This last outcome represents the incipient development of a user community and application ecosystem being built using IMPACT. Multiphysics coupling standardization can only come from organizations working together to define needs and processes that span the space of necessary multiphysics outcomes, which Illinois Rocstar plans to continue driving toward. The IMPACT system, including source code, documentation, and test problems, is now available through the public GitHub.org system to anyone interested in multiphysics code coupling. Many of the basic documents explaining use and architecture of IMPACT are also attached as appendices to this document. Online HTML documentation is available through the GitHub site.

  1. The dynamics of software development project management: An integrative systems dynamic perspective

    Science.gov (United States)

    Vandervelde, W. E.; Abdel-Hamid, T.

    1984-01-01

    Rather than continuing to focus on software development projects per se, the system dynamics modeling approach outlined is extended to investigate a broader set of issues pertaining to the software development organization. Rather than trace the life cycle(s) of one or more software projects, the focus is on the operations of a software development department as a continuous stream of software products are developed, placed into operation, and maintained. A number of research questions are "ripe" for investigation, including: (1) the efficacy of different organizational structures in different software development environments, (2) personnel turnover, (3) the impact of management approaches such as management by objectives, and (4) the organizational/environmental determinants of productivity.

  2. A Fuzzy Approach for Integrated Measure of Object-Oriented Software Testability

    OpenAIRE

    Vandana Gupta; K. K. Aggarwal; Yogesh Singh

    2005-01-01

    For large software systems, the testing phase seems to have a profound effect on the overall acceptability and quality of the final product. The success of this activity can be judged by measuring the testability of the software. A good measure of testability helps to better manage the testing effort and time. Different object-oriented metrics are used in the measurement of object-oriented testability, but none of them alone is sufficient to give an overall reflection of software testabi...

  3. Behavior Tracking Software Enhancement and Integration of a Feedback Module, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Horizon Performance designed a Behavior Tracking Software System to collect crew member behavior throughout a mission, giving NASA the capability to monitor...

  4. Virtual reality devices integration in scientific visualization software in the VtkVRPN framework; Integration de peripheriques de realite virtuelle dans des applications de visualisation scientifique au sein de la plate-forme VtkVRPN

    Energy Technology Data Exchange (ETDEWEB)

    Journe, G.; Guilbaud, C

    2005-07-01

    High-quality scientific visualization software relies on ergonomic navigation and exploration, which are essential for efficient data analysis. To help solve this issue, management of virtual reality devices has been developed inside the CEA 'VtkVRPN' framework. This framework is based on VTK, a 3D graphical library, and VRPN, a virtual reality device management library. This document describes the developments done during a post-graduate training course. (authors)

  5. Anaphe - OO Libraries and Tools for Data Analysis

    CERN Document Server

    Couet, O; Molnar, Z; Moscicki, J T; Pfeiffer, A; Sang, M

    2001-01-01

    The Anaphe project is an ongoing effort to provide an Object Oriented software environment for data analysis in HENP experiments. A range of commercial and public domain libraries is used to cover basic functionalities; on top of these libraries a set of HENP-specific C++ class libraries for histogram management, fitting, plotting and ntuple-like data analysis has been developed. In order to comply with the user requirements for a command-line driven tool, we have chosen to use a scripting language (Python) as the front-end for a data analysis tool. The loose coupling provided by the consequent use of (AIDA compliant) Abstract Interfaces for each component in combination with the use of shared libraries for their implementation provides an easy integration of existing libraries into modern scripting languages thus allowing for rapid application development. This integration is simplified even further using a specialised toolkit (SWIG) to create "shadow classes" for the Python language, which map the definitio...

  6. Public Libraries in Norway Help Non-Western Immigrant Women to Integrate into Society. A Review of: Audunson, R., Essmat, S., & Aabø, S. (2011). Public libraries: A meeting place for immigrant women? Library & Information Science Research, 33(3), 220-227. doi: 10.1016/j.lisr.2011.01.003

    Directory of Open Access Journals (Sweden)

    Kathryn Oxborrow

    2012-03-01

    Full Text Available Objectives – To discover the ways in which the public library was used by immigrant women, with a particular focus on the library as a meeting place. Design – Semi-structured qualitative interviews conducted in the participants’ native languages. Setting – Public libraries in Norway. Participants lived in one of two cities, both with a population of approximately 40,000 and a somewhat lower number of immigrants than the national average. Subjects – Nine non-western women who had immigrated to Norway between 8 months and 17 years prior to the study. Three women each came from Iran, Kurdistan and Afghanistan. All identified themselves as public library users. Methods – Participants were interviewed in their native languages and the qualitative results were analyzed in accordance with the theoretical framework set out by the authors. The main areas of focus were the role of the library in the generation of social capital, and the library as a high intensive versus low intensive meeting place. Main Results – Participants used public libraries in various ways. In the initial stages of life in a new country they were used to observe and learn about the majority culture and language. They were also used as a safe place to openly grieve and provide comfort among close friends without fear of being seen by other fellow countrymen. Over time, participants came to use the library space in more traditional ways such as for information, social, and professional needs. The study also revealed that using public libraries built trust in the institution of libraries and librarians as employees. Conclusions – The public library plays a key role in the generation of social capital, both in terms of integrating into the majority culture through observation and spontaneous interactions (bridging social capital) and connecting with others from participants’ home cultures (bonding social capital), for example through the provision of social space and

  7. Validation of an integrated software for the detection of rapid eye movement sleep behavior disorder.

    Science.gov (United States)

    Frauscher, Birgit; Gabelia, David; Biermayr, Marlene; Stefani, Ambra; Hackner, Heinz; Mitterling, Thomas; Poewe, Werner; Högl, Birgit

    2014-10-01

    Rapid eye movement sleep without atonia (RWA) is the polysomnographic hallmark of REM sleep behavior disorder (RBD). To partially overcome the disadvantages of manual RWA scoring, which is time consuming but essential for the accurate diagnosis of RBD, we aimed to validate software specifically developed and integrated with polysomnography for RWA detection against the gold standard of manual RWA quantification. Academic referral center sleep laboratory. Polysomnographic recordings of 20 patients with RBD and 60 healthy volunteers were analyzed. N/A. Motor activity during REM sleep was quantified manually and computer assisted (with and without artifact detection) according to Sleep Innsbruck Barcelona (SINBAR) criteria for the mentalis ("any," phasic, tonic electromyographic [EMG] activity) and the flexor digitorum superficialis (FDS) muscle (phasic EMG activity). Computer-derived indices (with and without artifact correction) for "any," phasic, tonic mentalis EMG activity, phasic FDS EMG activity, and the SINBAR index ("any" mentalis + phasic FDS) correlated well with the manually derived indices (all Spearman rhos 0.66-0.98). In contrast with computerized scoring alone, computerized scoring plus manual artifact correction (median duration 5.4 min) led to a significant reduction of false positives for "any" mentalis (40%), phasic mentalis (40.6%), and the SINBAR index (41.2%). Quantification of tonic mentalis and phasic FDS EMG activity was not influenced by artifact correction. The computer algorithm used here appears to be a promising tool for REM sleep behavior disorder detection in both research and clinical routine. A short check for plausibility of automatic detection should be a basic prerequisite for this and all other available computer algorithms. © 2014 Associated Professional Sleep Societies, LLC.
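
    A highly simplified sketch of what automated RWA scoring of this kind involves is given below: rectified EMG is split into short mini-epochs, each scored active when its amplitude exceeds a multiple of the relaxed baseline. The 3-s mini-epoch length and the threshold factor are common conventions in the RWA literature, not parameters taken from this study.

```python
# Simplified sketch of automated RWA scoring. Epoch length and
# threshold follow common RWA-literature conventions (assumptions).
import numpy as np

def rwa_index(emg, fs, epoch_s=3.0, k=2.0, baseline=None):
    """Fraction of REM mini-epochs showing EMG activity."""
    x = np.abs(emg)                               # rectified EMG signal
    base = baseline if baseline is not None else np.median(x)
    n = int(epoch_s * fs)
    epochs = x[: len(x) // n * n].reshape(-1, n)  # consecutive mini-epochs
    active = epochs.mean(axis=1) > k * base       # amplitude criterion
    return active.mean()

fs = 256
emg = np.random.default_rng(1).normal(0, 1, fs * 300)  # 5 min toy REM EMG
print(f"{100 * rwa_index(emg, fs):.1f}% of mini-epochs active")
```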

  8. Digital Aquifer - Integrating modeling, technical, software and policy aspects to develop a groundwater management tool

    Science.gov (United States)

    Tirupathi, S.; McKenna, S. A.; Fleming, K.; Wambua, M.; Waweru, P.; Ondula, E.

    2016-12-01

    Groundwater management has traditionally been viewed as a matter of long-term policy measures to ensure that the water resource is sustainable. IBM Research, in association with the World Bank, extended this traditional analysis to include real-time groundwater management by building a context-aware water rights management and permitting system. As part of this effort, one of the primary objectives was to develop a groundwater flow model that can give policy makers a visual overview of the current groundwater distribution. In addition, the system helps the policy makers simulate a range of scenarios and check the sustainability of the groundwater resource in a given region. The system also enables a license provider to check the effect of the introduction of a new well on the existing wells in the domain as well as on the groundwater resource in general. This process simplifies how an engineer will determine if a new well should be approved. Distance to the nearest well neighbors and the maximum decreases in water levels of nearby wells are continually assessed and presented as evidence for an engineer to make the final judgment on approving the permit. The system also facilitates updated insights on the amount of groundwater left in an area and provides advice on how water fees should be structured to balance conservation and economic development goals. In this talk, we will discuss the concept of the Digital Aquifer and the challenges in integrating modeling, technical and software aspects to develop a management system that helps policy makers and license providers with a robust decision-making tool. We will concentrate on the groundwater model, developed using the analytic element method, which plays a very important role in the decision-making aspects. Finally, the efficiency of this system and methodology is shown through a case study in Laguna Province, Philippines, which was done in collaboration with the National Water Resource Board, Philippines and the World Bank.

  9. Integrating Free Computer Software in Chemistry and Biochemistry Instruction: An International Collaboration

    Science.gov (United States)

    Cedeno, David L.; Jones, Marjorie A.; Friesen, Jon A.; Wirtz, Mark W.; Rios, Luz Amalia; Ocampo, Gonzalo Taborda

    2010-01-01

    At the Universidad de Caldas, Manizales, Colombia, we used their new computer facilities to introduce chemistry graduate students to biochemical database mining and quantum chemistry calculations using freeware. These hands-on workshops allowed the students a strong introduction to easily accessible software and how to use this software to begin…

  10. Integrated massively parallel sequencing of 15 autosomal STRs and Amelogenin using a simplified library preparation approach.

    Science.gov (United States)

    Xue, Jian; Wu, Riga; Pan, Yajiao; Wang, Shunxia; Qu, Baowang; Qin, Ying; Shi, Yuequn; Zhang, Chuchu; Li, Ran; Zhang, Liyan; Zhou, Cheng; Sun, Hongyu

    2018-04-02

    Massively parallel sequencing (MPS) technologies, also termed as next-generation sequencing (NGS), are becoming increasingly popular in study of short tandem repeats (STR). However, current library preparation methods are usually based on ligation or two-round PCR that requires more steps, making it time-consuming (about 2 days), laborious and expensive. In this study, a 16-plex STR typing system was designed with fusion primer strategy based on the Ion Torrent S5 XL platform which could effectively resolve the above challenges for forensic DNA database-type samples (bloodstains, saliva stains, etc.). The efficiency of this system was tested in 253 Han Chinese participants. The libraries were prepared without DNA isolation and adapter ligation, and the whole process only required approximately 5 h. The proportion of thoroughly genotyped samples in which all the 16 loci were successfully genotyped was 86% (220/256). Of the samples, 99.7% showed 100% concordance between NGS-based STR typing and capillary electrophoresis (CE)-based STR typing. The inconsistency might have been caused by off-ladder alleles and mutations in primer binding sites. Overall, this panel enabled the large-scale genotyping of the DNA samples with controlled quality and quantity because it is a simple, operation-friendly process flow that saves labor, time and costs. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Methodology and software to detect viral integration site hot-spots

    Science.gov (United States)

    2011-01-01

    Background: Modern gene therapy methods have limited control over where a therapeutic viral vector inserts into the host genome. Vector integration can activate local gene expression, which can cause cancer if the vector inserts near an oncogene. Viral integration hot-spots or 'common insertion sites' (CIS) are scrutinized to evaluate and predict patient safety. CIS are typically defined by a minimum density of insertions (such as 2-4 within a 30-100 kb region), which unfortunately depends on the total number of observed VIS. This is problematic for comparing hot-spot distributions across data sets and patients, where the VIS numbers may vary. Results: We develop two new methods for defining hot-spots that are relatively independent of data set size. Both methods operate on distributions of VIS across consecutive 1 Mb 'bins' of the genome. The first method 'z-threshold' tallies the number of VIS per bin, converts these counts to z-scores, and applies a threshold to define high density bins. The second method 'BCP' applies a Bayesian change-point model to the z-scores to define hot-spots. The novel hot-spot methods are compared with a conventional CIS method using simulated data sets and data sets from five published human studies, including the X-linked ALD (adrenoleukodystrophy), CGD (chronic granulomatous disease) and SCID-X1 (X-linked severe combined immunodeficiency) trials. The BCP analysis of the human X-linked ALD data for two patients separately (774 and 1627 VIS) and combined (2401 VIS) resulted in 5-6 hot-spots covering 0.17-0.251% of the genome and containing 5.56-7.74% of the total VIS. In comparison, the CIS analysis resulted in 12-110 hot-spots covering 0.018-0.246% of the genome and containing 5.81-22.7% of the VIS, corresponding to a greater number of hot-spots as the data set size increased. Our hot-spot methods enable one to evaluate the extent of VIS clustering, and formally compare data sets in terms of hot-spot overlap. Finally, we show that the
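
    The z-threshold method described above translates directly into a few lines of numpy: count VIS per 1 Mb bin, convert the counts to z-scores, and keep bins above a cutoff. The cutoff value of 3.0 below is an assumed choice, not a value from the paper.

```python
# Sketch of the paper's 'z-threshold' hot-spot method. The z cutoff
# is an assumption; the binning and z-scoring follow the abstract.
import numpy as np

def z_threshold_hotspots(vis_positions, chrom_len, bin_size=1_000_000,
                         z_cut=3.0):
    """Return indices of 1 Mb bins whose VIS-count z-score exceeds z_cut."""
    n_bins = int(np.ceil(chrom_len / bin_size))
    counts = np.bincount(np.asarray(vis_positions) // bin_size,
                         minlength=n_bins)          # VIS per bin
    z = (counts - counts.mean()) / counts.std()     # counts -> z-scores
    return np.flatnonzero(z > z_cut)                # high-density bins

rng = np.random.default_rng(2)
sites = rng.integers(0, 50_000_000, 800).tolist()   # background VIS
sites += [12_345_000 + i * 1_000 for i in range(60)]  # planted cluster
print(z_threshold_hotspots(sites, 50_000_000))        # -> bin 12
```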

  12. An Exploration of Retrieval-Enhancing Methods for Integrated Search in a Digital Library

    DEFF Research Database (Denmark)

    Sørensen, Diana Ransgaard; Bogers, Toine; Larsen, Birger

    2012-01-01

    Integrated search is defined as searching across different document types and representations simultaneously, with the goal of presenting the user with a single ranked result list containing the optimal mix of document types. In this paper, we compare various approaches to integrating three diffe...

  13. Capability Maturity Model Integration (CMMISM), Version 1.1 CMMISM for Systems Engineering, Software Engineering, Integrated Product and Process Development, and Supplier Sourcing (CMMI-SE/SW/IPPD/SS, V1.1). Staged Representation

    National Research Council Canada - National Science Library

    2002-01-01

    .... Concepts covered by this model include systems engineering, software engineering, integrated product and process development, and supplier sourcing as well as traditional CMM concepts such as process...

  14. Development of a high efficiency integration system and promoter library for rapid modification of Pseudomonas putida KT2440

    Energy Technology Data Exchange (ETDEWEB)

    Elmore, Joshua R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Biosciences Division; Furches, Anna [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Biosciences Division; Wolff, Gara N. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Biosciences Division; Gorday, Kent [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Biosciences Division; Guss, Adam M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Biosciences Division

    2017-04-15

    Pseudomonas putida strains are highly robust bacteria known for their ability to efficiently utilize a variety of carbon sources, including aliphatic and aromatic hydrocarbons. Recently, P. putida has been engineered to valorize the lignin stream of a lignocellulosic biomass pretreatment process. Nonetheless, when compared to platform organisms such as Escherichia coli, the toolkit for engineering P. putida is underdeveloped. Heterologous gene expression in particular is problematic. Plasmid instability and copy number variance provide challenges for replicative plasmids, while use of homologous recombination for insertion of DNA into the chromosome is slow and laborious. Furthermore, heterologous expression efforts to date typically rely on overexpression of exogenous pathways using a handful of poorly characterized promoters. In order to improve the P. putida toolkit, we developed a rapid genome integration system using the site-specific recombinase from bacteriophage Bxb1 to enable rapid, high efficiency integration of DNA into the P. putida chromosome. We also developed a library of synthetic promoters with various UP elements, -35 sequences, and -10 sequences, as well as different ribosomal binding sites. We tested these promoters using a fluorescent reporter gene, mNeonGreen, to characterize the strength of each promoter, and identified UP element-promoter-ribosomal binding site combinations capable of driving a ~150-fold range of protein expression levels. One additional integrating vector was developed that confers more robust kanamycin resistance when integrated at single copy into the chromosome. These genome integration and reporter systems are extensible for testing other genetic parts, such as examining terminator strength, and will allow rapid integration of heterologous pathways for metabolic engineering.

  15. Alida – Advanced Library for Integrated Development of Data Analysis Applications

    Directory of Open Access Journals (Sweden)

    Stefan Posch

    2017-03-01

    Full Text Available Data analysis procedures can often be modeled as a set of manipulation operations applied to input data and resulting in transformed intermediate and result data. The Java library Alida provides an advanced development framework to support programmers in developing data analysis applications adhering to such a scheme. The main intention of Alida is to foster re-usability by offering well-defined, unified, modular APIs and execution procedures for operators, and to ease development by releasing developers from tedious tasks. Alida features automatic generation of handy graphical and command line user interfaces, a built-in graphical editor for workflow design, and automatic documentation of analysis pipelines. Alida is available from its project webpage http://www.informatik.uni-halle.de/alida, on GitHub and via our Maven server.

  16. Climate information for public health: the role of the IRI climate data library in an integrated knowledge system.

    Science.gov (United States)

    del Corral, John; Blumenthal, M Benno; Mantilla, Gilma; Ceccato, Pietro; Connor, Stephen J; Thomson, Madeleine C

    2012-09-01

    Public health professionals are increasingly concerned about the potential impact of climate variability and change on health outcomes. Protecting public health from the vagaries of climate requires new working relationships between the public health sector and the providers of climate data and information. The Climate Information for Public Health Action initiative at the International Research Institute for Climate and Society (IRI) is designed to increase the public health community's capacity to understand, use and demand appropriate climate data and climate information to mitigate the public health impacts of the climate. Significant challenges to building the capacity of health professionals to use climate information in research and decision-making include the difficulties experienced by many in accessing relevant, timely, quality-controlled data and information in formats that can be readily incorporated into specific analyses with other data sources. We present here the capacities of the IRI climate data library and show how we have used it to build an integrated knowledge system that supports the use of climate and environmental information in climate-sensitive, health-related decision-making. Initiated as an aid facilitating exploratory data analysis for climate scientists, the IRI climate data library has emerged as a powerful tool for interdisciplinary researchers focused on topics related to climate impacts on society, including health.

  17. Integrated conception of hardware/software mixed systems used in nuclear instrumentation

    International Nuclear Information System (INIS)

    Dias, Ailton F.; Sorel, Yves; Akil, Mohamed

    2002-01-01

    Hardware/software codesign carries out the design of systems composed of a hardware portion, with specific components, and a software portion, with a microprocessor-based architecture. This paper describes the Algorithm Architecture Adequation (AAA) design methodology, originally oriented to programmable multicomponent architectures, its extension to reconfigurable circuits, and its application to the design and development of nuclear instrumentation systems composed of programmable and configurable circuits. The AAA methodology uses a unified model, based on graph theory, to describe the algorithm, the architecture and the implementation. The great advantage of the AAA methodology is the use of the same model from specification to implementation of hardware/software systems, reducing complexity and design time. (author)

  18. ElectroMagnetoEncephalography software: overview and integration with other EEG/MEG toolboxes.

    Science.gov (United States)

    Peyk, Peter; De Cesarei, Andrea; Junghöfer, Markus

    2011-01-01

    EMEGS (electromagnetic encephalography software) is a MATLAB toolbox designed to provide novice as well as expert users in the field of neuroscience with a variety of functions to perform analysis of EEG and MEG data. The software consists of a set of graphical interfaces devoted to preprocessing, analysis, and visualization of electromagnetic data. Moreover, it can be extended using a plug-in interface. Here, an overview of the capabilities of the toolbox is provided, together with a simple tutorial for both a standard ERP analysis and a time-frequency analysis. Latest features and future directions of the software development are presented in the final section.

  19. Toward an Integrated BAC Library Resource for Genome Sequencing and Analysis; FINAL

    International Nuclear Information System (INIS)

    Simon, M. I.; Kim, U.-J.

    2002-01-01

    We developed a great deal of expertise in building large BAC libraries from a variety of DNA sources including humans, mice, corn, microorganisms, worms, and Arabidopsis. We greatly improved the technology for screening these libraries rapidly and for selecting appropriate BACs and mapping BACs to develop large overlapping contigs. We became involved in supplying BACs and BAC contigs to a variety of sequencing and mapping projects, and we began to collaborate with Drs. Adams and Venter at TIGR and with Dr. Leroy Hood and his group at the University of Washington to provide BACs for end sequencing and for mapping and sequencing of large fragments of chromosome 16. Together with Dr. Ian Dunham and his co-workers at the Sanger Center, we completed the mapping and they completed the sequencing of the first human chromosome, chromosome 22. This was published in Nature in 1999, and our BAC contigs made a major contribution to this sequencing effort. Drs. Shizuya and Ding invented an automated, highly accurate BAC mapping technique. We also developed long-term collaborations with Dr. Uli Weier at UCSF in the design of BAC probes for the characterization of human tumors and specific chromosome deletions and breakpoints. Finally, the contribution of our work to the human genome project has been recognized in the publication, by both the international consortium and the NIH, of a draft sequence of the human genome in Nature last year. Dr. Shizuya was acknowledged in the authorship of that landmark paper. Dr. Simon was also an author on the Venter/Adams Celera project sequencing the human genome that was published in Science last year.

  20. Software-defined networking control plane for seamless integration of multiple silicon photonic switches in Datacom networks.

    Science.gov (United States)

    Shen, Yiwen; Hattink, Maarten H N; Samadi, Payman; Cheng, Qixiang; Hu, Ziyiz; Gazman, Alexander; Bergman, Keren

    2018-04-16

    Silicon photonics based switches offer an effective option for the delivery of dynamic bandwidth for future large-scale Datacom systems while maintaining scalable energy efficiency. The integration of a silicon photonics-based optical switching fabric within electronic Datacom architectures requires novel network topologies and arbitration strategies to effectively manage the active elements in the network. We present a scalable software-defined networking control plane to integrate silicon photonic based switches with conventional Ethernet or InfiniBand networks. Our software-defined control plane manages both electronic packet switches and multiple silicon photonic switches for simultaneous packet and circuit switching. We built an experimental Dragonfly network testbed with 16 electronic packet switches and 2 silicon photonic switches to evaluate our control plane. Observed latencies occupied by each step of the switching procedure demonstrate a total of 344 µs control plane latency for data-center and high performance computing platforms.
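
    The packet-versus-circuit decision such a hybrid control plane makes can be sketched schematically as below. The bandwidth threshold and the device interfaces are illustrative assumptions, not the control plane's actual API.

```python
# Schematic sketch of a hybrid control-plane decision: small flows stay
# on the electronic packet fabric, large ones trigger an optical circuit
# through a silicon photonic switch. All names/values are illustrative.

CIRCUIT_THRESHOLD_GBPS = 10        # assumed cutoff for circuit setup

class PhotonicSwitch:               # stand-in for a photonic switch agent
    def connect(self, src, dst):
        print(f"optical circuit established: {src} -> {dst}")

class PacketFabric:                 # stand-in for the packet network
    def install_route(self, src, dst):
        print(f"packet route installed: {src} -> {dst}")

def route_flow(flow, photonic, packet):
    """Send heavy flows over an optical circuit, light ones over packets."""
    if flow["demand_gbps"] >= CIRCUIT_THRESHOLD_GBPS:
        photonic.connect(flow["src"], flow["dst"])
        return "circuit"
    packet.install_route(flow["src"], flow["dst"])
    return "packet"

print(route_flow({"src": "rack1", "dst": "rack5", "demand_gbps": 40},
                 PhotonicSwitch(), PacketFabric()))
```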

  1. An Online Library Catalogue.

    Science.gov (United States)

    Alloro, Giovanna; Ugolini, Donatella

    1992-01-01

    Describes the implementation of an online catalog in the library of the National Institute for Cancer Research and the Clinical and Experimental Oncology Institute of the University of Genoa. Topics addressed include automation of various library functions, software features, database management, training, and user response. (10 references) (MES)

  2. FUNPACK-2, Subroutine Library, Bessel Function, Elliptical Integrals, Min-max Approximation

    International Nuclear Information System (INIS)

    Cody, W.J.; Garbow, Burton S.

    1975-01-01

    1 - Description of problem or function: FUNPACK is a collection of FORTRAN subroutines to evaluate certain special functions. The individual subroutines are (Identification/Description): NATSI0 (F2I0) Bessel function I0; NATSI1 (F2I1) Bessel function I1; NATSJ0 (F2J0) Bessel function J0; NATSJ1 (F2J1) Bessel function J1; NATSK0 (F2K0) Bessel function K0; NATSK1 (F2K1) Bessel function K1; NATSBESY (F2BY) Bessel function Y_nu; DAW (F1DW) Dawson's integral; DELIPK (F1EK) complete elliptic integral of the first kind; DELIPE (F1EE) complete elliptic integral of the second kind; DEI (F1EI) exponential integrals; NATSPSI (F2PS) psi (logarithmic derivative of the gamma function); MONERR (F1MO) error monitoring package. 2 - Method of solution: FUNPACK uses evaluation of min-max approximations.
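
    For readers who want modern equivalents of these routines, the same special functions are available in scipy.special; the sketch below maps each FUNPACK name to a scipy call in the comments.

```python
# Modern scipy.special equivalents of the FUNPACK routines listed above.
from scipy import special

x = 0.8
print(special.i0(x), special.i1(x))              # NATSI0, NATSI1
print(special.j0(x), special.j1(x))              # NATSJ0, NATSJ1
print(special.k0(x), special.k1(x))              # NATSK0, NATSK1
print(special.yv(0.5, x))                        # NATSBESY (Bessel Y_nu)
print(special.dawsn(x))                          # DAW (Dawson's integral)
print(special.ellipk(0.5), special.ellipe(0.5))  # DELIPK, DELIPE
print(special.expi(x))                           # DEI (exponential integral)
print(special.psi(x))                            # NATSPSI (digamma)
```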

  3. ACHIEVING HIGH INTEGRITY OF PROCESS-CONTROL SOFTWARE BY GRAPHICAL DESIGN AND FORMAL VERIFICATION

    NARCIS (Netherlands)

    HALANG, WA; Kramer, B.J.

    The International Electrotechnical Commission is currently standardising four compatible languages for designing and implementing programmable logic controllers (PLCs). The language family includes a diagrammatic notation that supports the idea of software ICs to encourage graphical design

  4. Z-Plant material information tracking system (ZMITS) software development and integration project management plan

    International Nuclear Information System (INIS)

    IBSEN, T.G.

    1999-01-01

    This document plans for software and interface development governing the implementation of ZMITS and other supporting systems necessary to manage information for material stabilization needs of the Project Hanford Management Contract (PHMC)

  5. Increasing Open Source Software Integration on the Department of Defense Unclassified Desktop

    National Research Council Canada - National Science Library

    Schearer, Steven A

    2008-01-01

    .... While some of this expenditure goes to fund special-purpose military software, much of it is absorbed by license fees for computer operating systems and general-purpose office automation applications...

  6. Integrated Syntactic/Semantic XML Data Validation with a Reusable Software Component

    Science.gov (United States)

    Golikov, Steven

    2013-01-01

    Data integration is a critical component of enterprise system integration, and XML data validation is the foundation for sound data integration of XML-based information systems. Since B2B e-commerce relies on data validation as one of the critical components for enterprise integration, it is imperative for financial industries and e-commerce…

  7. A model library for dynamic transport and fate of micropollutants in integrated urban wastewater and stormwater systems

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Benedetti, Lorenzo; Gevaert, Veerle

    2014-01-01

    The increasing efforts in reducing the emission of micropollutants (MP) into the natural aquatic environment require the development of modelling tools to support the decision making process. This article presents a library of dynamic modelling tools for estimating MP fluxes within the Integrated Urban Wastewater and Stormwater system (IUWS – including drainage network, stormwater treatment units, wastewater treatment plants, sludge treatment, and the receiving water body). The models are developed by considering the high temporal variability of the processes taking place in the IUWS, providing a basis … by using substance inherent properties, following an approach commonly used in large-scale MP multimedia fate and transport models. The chosen level of complexity ensures a low data requirement and minimizes the need for field measurements. Next to a synthesis of model applications, a didactic example …

  8. Scrum2Kanban: Integrating Kanban and Scrum in a University Software Engineering Capstone Course

    OpenAIRE

    Matthies, Christoph

    2018-01-01

    Using university capstone courses to teach agile software development methodologies has become commonplace, as agile methods have gained support in professional software development. This usually means students are introduced to and work with the currently most popular agile methodology: Scrum. However, as the agile methods employed in the industry change and are adapted to different contexts, university courses must follow suit. A prime example of this is the Kanban method, which has recently…

  9. LibraryH3lp: A New Flexible Chat Reference System

    Directory of Open Access Journals (Sweden)

    Pam Sessoms

    2008-09-01

    Full Text Available LibraryH3lp is an integrated IM and web chat system designed specifically for Virtual Reference services in libraries. The software was designed for, and is currently used by, a night-time chat reference collaboration between several large academic libraries. LibraryH3lp is designed for the workflow of chat reference, supporting multiple simultaneous operators and routing to queues of operators in a particular service area. It also supports web page embeddable chat 'widgets', as well as simultaneous gateways to multiple IM protocols. This article discusses the motivation for the development of the software, and provides an overview of LibraryH3lp's features and technical architecture. Parts of LibraryH3lp are available as open source. The complete application is available as a low-cost hosted service, and will eventually be available to be licensed for local hosting.
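
    The routing-to-queues workflow described above can be pictured with a small, hypothetical sketch (not LibraryH3lp's actual code): a service-area queue hands each incoming chat to the next free operator, or parks it until one becomes available.

```python
# Illustrative chat-to-operator routing for one service-area queue.
from collections import deque

class ServiceQueue:
    def __init__(self, name):
        self.name = name
        self.free_operators = deque()
        self.waiting_chats = deque()

    def operator_available(self, op):
        if self.waiting_chats:
            chat = self.waiting_chats.popleft()
            print(f"{self.name}: {chat} -> {op}")
        else:
            self.free_operators.append(op)

    def incoming_chat(self, chat):
        if self.free_operators:
            op = self.free_operators.popleft()
            print(f"{self.name}: {chat} -> {op}")
        else:
            self.waiting_chats.append(chat)
            print(f"{self.name}: {chat} queued")

ref = ServiceQueue("reference")
ref.operator_available("op-alice")
ref.incoming_chat("patron-1")   # routed to op-alice
ref.incoming_chat("patron-2")   # queued until an operator frees up
ref.operator_available("op-bob")
```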

  10. Design of software platform based on linux operating system for γ-spectrometry instrument

    International Nuclear Information System (INIS)

    Hong Tianqi; Zhou Chen; Zhang Yongjin

    2008-01-01

    This paper describes the design of a γ-spectrometry instrument software platform based on the S3C2410A processor with an ARM920T core. Emphasis is placed on analyzing the integrated application of the embedded Linux operating system, the YAFFS file system, and the Qt/Embedded GUI development library. It presents a new software platform for portable γ-measurement instruments. (authors)

  11. Integral test for Np237 and Am241 cross sections in JENDL, ENDF and JEF libraries

    International Nuclear Information System (INIS)

    Iwasaki, Tomohiko; Unesaki, Hironobu; Kitada, Takanori

    2002-01-01

    Experiments using the Kyoto University Critical Assembly (KUCA) were performed to measure the capture and fission reaction rates of 237Np and 241Am. A back-to-back fission chamber was employed for the measurement of the fission rates of 237Np and 241Am relative to 235U. The capture rate of 237Np relative to 197Au was measured using the activation method. Eleven cores, whose spectra were varied systematically, were mocked up for the present measurements; five of the eleven were used for the fission reaction rate measurement. The experiment was analyzed using the Monte Carlo code MVP, the transport code TWOTRAN and the diffusion code CITATION with libraries based on JENDL3.2, ENDF/B-VI and JEF2.2. As a result, for 237Np, JENDL3.2 showed good agreement for both capture and fission. However, for the fission rate of 241Am, JENDL3.2 underestimates by 15-20%. On the other hand, ENDF/B-VI and JEF2.2 show different C/Es for 237Np and 241Am. (author)

  12. INVENIO Integrated Digital Library Conference CERN workshop on Innovations in Scholarly Communication (OAI5)

    CERN Multimedia

    2007-01-01

    CERN has long been committed to the free dissemination of scientific research results and theories. Towards this end, CERN's own institutional repository, the CERN Document Server (CDS), offers access to CERN works and to all related scholarly literature in the HEP domain. Hosting over 500 document collections containing more than 900,000 records, CDS provides access to anything from preprints and articles to multimedia information such as photographs, movies, posters and brochures. The software that powers this service, CDS Invenio, is distributed freely under the GNU GPL and is currently used in approximately 15 institutions worldwide. In this poster session, we explain the use of CDS Invenio to manage a repository of scientific literature. We outline some of the issues faced during the lifecycle of a document, from acquisition, processing and indexing to dissemination. In particular, we focus on the features and technology developed to meet the complexities of managing scientific information in the LHC era ...

  13. The phylogenetic likelihood library.

    Science.gov (United States)

    Flouri, T; Izquierdo-Carrasco, F; Darriba, D; Aberer, A J; Nguyen, L-T; Minh, B Q; Von Haeseler, A; Stamatakis, A

    2015-03-01

    We introduce the Phylogenetic Likelihood Library (PLL), a highly optimized application programming interface for developing likelihood-based phylogenetic inference and postanalysis software. The PLL implements appropriate data structures and functions that allow users to quickly implement common, error-prone, and labor-intensive tasks, such as likelihood calculations, model parameter as well as branch length optimization, and tree space exploration. The highly optimized and parallelized implementation of the phylogenetic likelihood function and a thorough documentation provide a framework for rapid development of scalable parallel phylogenetic software. By example of two likelihood-based phylogenetic codes we show that the PLL improves the sequential performance of current software by a factor of 2-10 while requiring only 1 month of programming time for integration. We show that, when numerical scaling for preventing floating point underflow is enabled, the double precision likelihood calculations in the PLL are up to 1.9 times faster than those in BEAGLE. On an empirical DNA dataset with 2000 taxa the AVX version of PLL is 4 times faster than BEAGLE (scaling enabled and required). The PLL is available at http://www.libpll.org under the GNU General Public License (GPL). © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
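
    The numerical-scaling remark above can be made concrete with a small, generic sketch (not PLL code): multiplying thousands of per-site likelihoods underflows double precision, which is why implementations accumulate logarithms or rescale partial likelihoods and track the scaling factors.

```python
# Why likelihood libraries need numerical scaling: invented per-site values.
import numpy as np

rng = np.random.default_rng(0)
site_likelihoods = rng.uniform(1e-4, 1e-2, size=5000)  # per-site likelihoods

naive = np.prod(site_likelihoods)             # underflows to exactly 0.0
log_like = np.sum(np.log(site_likelihoods))   # stable log-likelihood

print(f"naive product:  {naive}")             # 0.0 -- underflow
print(f"log-likelihood: {log_like:.2f}")      # finite and usable
```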

  14. SWATH Mass Spectrometry Performance Using Extended Peptide MS/MS Assay Libraries*

    Science.gov (United States)

    Wu, Jemma X.; Song, Xiaomin; Pascovici, Dana; Zaw, Thiri; Care, Natasha; Krisp, Christoph; Molloy, Mark P.

    2016-01-01

    The use of data-independent acquisition methods such as SWATH for mass spectrometry based proteomics is usually performed with peptide MS/MS assay libraries which enable identification and quantitation of peptide peak areas. Reference assay libraries can be generated locally through information dependent acquisition, or obtained from community data repositories for commonly studied organisms. However, there have been no studies performed to systematically evaluate how locally generated or repository-based assay libraries affect SWATH performance for proteomic studies. To undertake this analysis, we developed a software workflow, SwathXtend, which generates extended peptide assay libraries by integration with a local seed library and delivers statistical analysis of SWATH-quantitative comparisons. We designed test samples using peptides from a yeast extract spiked into peptides from human K562 cell lysates at three different ratios to simulate protein abundance change comparisons. SWATH-MS performance was assessed using local and external assay libraries of varying complexities and proteome compositions. These experiments demonstrated that local seed libraries integrated with external assay libraries achieve better performance than local assay libraries alone, in terms of the number of identified peptides and proteins and the specificity to detect differentially abundant proteins. Our findings show that the performance of extended assay libraries is influenced by the MS/MS feature similarity of the seed and external libraries, while statistical analysis using multiple testing corrections increases the statistical rigor needed when searching against large extended assay libraries. PMID:27161445
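
    As an aside on the multiple-testing statement above, the sketch below shows a generic Benjamini-Hochberg false-discovery-rate adjustment of per-protein p-values; it illustrates the statistical step, not SwathXtend's own implementation, and the p-values are invented.

```python
# Generic Benjamini-Hochberg FDR adjustment.
import numpy as np

def benjamini_hochberg(pvals):
    """Return BH-adjusted p-values (q-values) for an array of p-values."""
    p = np.asarray(pvals, dtype=float)
    n = p.size
    order = np.argsort(p)
    ranked = p[order] * n / np.arange(1, n + 1)
    # enforce monotonicity from the largest rank downwards
    ranked = np.minimum.accumulate(ranked[::-1])[::-1]
    q = np.empty(n)
    q[order] = np.clip(ranked, 0, 1)
    return q

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205]
print(benjamini_hochberg(pvals))
```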

  15. Hardware Interface Description for the Integrated Power, Avionics, and Software (iPAS) Space Telecommunications Radio System (STRS) Radio

    Science.gov (United States)

    Shalkhauser, Mary Jo W.; Roche, Rigoberto

    2017-01-01

    The Space Telecommunications Radio System (STRS) provides a common, consistent framework for software defined radios (SDRs) to abstract the application software from the radio platform hardware. The STRS standard aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. To promote the use of the STRS architecture for future NASA advanced exploration missions, NASA Glenn Research Center (GRC) developed an STRS-compliant SDR on a radio platform used by the Advanced Exploration Systems program at the Johnson Space Center (JSC) in their Integrated Power, Avionics, and Software (iPAS) laboratory. The iPAS STRS Radio was implemented on the Reconfigurable, Intelligently-Adaptive Communication System (RIACS) platform, currently being used for radio development at JSC. The platform consists of a Xilinx ML605 Virtex-6 FPGA board, an Analog Devices FMCOMMS1-EBZ RF transceiver board, and an Embedded PC (Axiomtek eBox 620-110-FL) running the Ubuntu 12.04 operating system. Figure 1 shows the RIACS platform hardware. The result of this development is a very low-cost, STRS-compliant platform that can be used for waveform development for multiple applications. The purpose of this document is to describe how to develop a new waveform using the RIACS platform, the Very High Speed Integrated Circuits (VHSIC) Hardware Description Language (VHDL) FPGA wrapper code, and the STRS implementation on the Axiomtek processor.

  16. Waveform Developer's Guide for the Integrated Power, Avionics, and Software (iPAS) Space Telecommunications Radio System (STRS) Radio

    Science.gov (United States)

    Shalkhauser, Mary Jo W.; Roche, Rigoberto

    2017-01-01

    The Space Telecommunications Radio System (STRS) provides a common, consistent framework for software defined radios (SDRs) to abstract the application software from the radio platform hardware. The STRS standard aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. To promote the use of the STRS architecture for future NASA advanced exploration missions, NASA Glenn Research Center (GRC) developed an STRS-compliant SDR on a radio platform used by the Advanced Exploration Systems program at the Johnson Space Center (JSC) in their Integrated Power, Avionics, and Software (iPAS) laboratory. The iPAS STRS Radio was implemented on the Reconfigurable, Intelligently-Adaptive Communication System (RIACS) platform, currently being used for radio development at JSC. The platform consists of a Xilinx™ ML605 Virtex™-6 FPGA board, an Analog Devices FMCOMMS1-EBZ RF transceiver board, and an Embedded PC (Axiomtek™ eBox 620-110-FL) running the Ubuntu 12.04 operating system. The result of this development is a very low-cost, STRS-compliant platform that can be used for waveform development for multiple applications. The purpose of this document is to describe how to develop a new waveform using the RIACS platform, the Very High Speed Integrated Circuits (VHSIC) Hardware Description Language (VHDL) FPGA wrapper code, and the STRS implementation on the Axiomtek processor.

  17. Integrating semantic web and software agents : Exchanging RIF and BDI rules

    NARCIS (Netherlands)

    Gong, Y.; Overbeek, S.J.

    2011-01-01

    Software agents and rules are both used for creating flexibility. Exchanging rules between Semantic Web and agents can ensure consistency in rules and support easy updating and changing of rules. The Rule Interchange Format (RIF) is a new W3C recommendation Semantic Web standard for exchanging rules

  18. An integrated approach for requirement selection and scheduling in software release planning

    NARCIS (Netherlands)

    Li, C.; van den Akker, Marjan; Brinkkemper, Sjaak; Diepen, Guido

    2010-01-01

    It is essential for product software companies to decide which requirements should be included in the next release and to make an appropriate time plan of the development project. Compared to the extensive research done on requirement selection, very little research has been performed on time

  19. TypingSuite: Integrated Software for Presenting Stimuli, and Collecting and Analyzing Typing Data

    Science.gov (United States)

    Mazerolle, Erin L.; Marchand, Yannick

    2015-01-01

    Research into typing patterns has broad applications in both psycholinguistics and biometrics (i.e., improving security of computer access via each user's unique typing patterns). We present a new software package, TypingSuite, which can be used for presenting visual and auditory stimuli, collecting typing data, and summarizing and analyzing the…

  20. High Technology Systems with Low Technology Failures: Some Experiences with Rockets on Software Quality and Integration

    Science.gov (United States)

    Craig, Larry G.

    2010-01-01

    This slide presentation reviews three software failures and how each contributed to or caused the failure of a launch or of payload insertion into orbit. To avoid such systematic failures in the future, failure mitigation strategies are suggested.

  1. Architecture-Driven Integration of Modeling Languages for the Design of Software-Intensive Systems

    NARCIS (Netherlands)

    Dos Santos Soares, M.

    2010-01-01

    In the research that led to this thesis a multi-disciplinary approach, combining Traffic Engineering and Software Engineering, was used. Traffic engineers come up with new control strategies and algorithms for improving traffic. Once new solutions are defined from a Traffic Engineering point of

  2. Integrated Biological Warfare Technology Platform (IBWTP). Intelligent Software Supporting Situation Awareness, Response, and Operations

    Science.gov (United States)

    2007-01-01

    … phases of the technology, QLI used a common software development and maintenance environment, called the Quantum Leap Uber Build System (QLUBS). … may be used in internal tools and applications. Using internally developed code on internal applications ("eating your own dogfood") provides …

  3. Featured Library: Parrish Library

    OpenAIRE

    Kirkwood, Hal P, Jr

    2015-01-01

    The Roland G. Parrish Library of Management & Economics is located within the Krannert School of Management at Purdue University. Between 2005 and 2007, work was completed on a white paper that focused on a student-centered vision for the Management & Economics Library. The next step was a massive collection reduction and a re-envisioning of both the services and the space of the library. Thus began a 3-phase renovation from a 2-floor standard, collection-focused library into a single-floor, 18,000s...

  4. Integrated Development and Maintenance of Software Products to Support Efficient Updating of Customer Configurations: A Case Study in Mass Market ERP Software

    NARCIS (Netherlands)

    Jansen, S.R.L.; Brinkkemper, S.; Ballintijn, G.; Nieuwland, Arco van

    2006-01-01

    The maintenance of enterprise application software at a customer site is a potentially complex task for software vendors. This complexity can unfortunately result in a significant amount of work and risk. This paper presents a case study of a product software vendor that tries to reduce this

  5. A COTS RF/Optical Software Defined Radio for the Integrated Radio and Optical Communications Test Bed

    Science.gov (United States)

    Nappier, Jennifer M.; Zeleznikar, Daniel J.; Wroblewski, Adam C.; Tokars, Roger P.; Schoenholz, Bryan L.; Lantz, Nicholas C.

    2017-01-01

    The Integrated Radio and Optical Communications (iROC) project at the National Aeronautics and Space Administration (NASA) is investigating the merits of a hybrid radio frequency (RF) and optical communication system for deep space missions. In an effort to demonstrate the feasibility and advantages of a hybrid RF/Optical software defined radio (SDR), a laboratory prototype was assembled from primarily commercial-off-the-shelf (COTS) hardware components. This COTS platform has been used to demonstrate simultaneous transmission of the radio and optical communications waveforms through to the physical layer (telescope and antenna). This paper details the hardware and software used in the platform and various measures of its performance. A laboratory optical receiver platform has also been assembled in order to demonstrate hybrid free space links in combination with the transmitter.

  6. Patricia Knapp’s Landmark Project to Develop a Plan of Curriculum-Integrated Library Instruction. A review of: Knapp, P. B. (1966). The Monteith College library experiment. New York, NY: Scarecrow Press.

    Directory of Open Access Journals (Sweden)

    Carol D. Howe

    2011-03-01

    … assessment. Knapp outlined three levels of assessment. Investigators would assess the appropriateness of individual assignments through interviews and questionnaires collected from faculty and students, as well as completed student assignments. Knapp outlined two ways to assess library competence. First, Monteith faculty members would assess literature reviews in their subject specialties written by second-semester seniors. Next, faculty from other Wayne State colleges would review papers from both Monteith and non-Monteith students to comparatively assess the students’ use of sources. Knapp proposed that faculty judgment would be the most valuable measure of the relationship between library competence and overall academic success. Knapp was prepared to implement her plan of instruction using all of her findings, but her proposal to move into phase two of the project was rejected by both the Office of Education, whose members cited economic reasons, and the Council on Library Resources, whose members were not satisfied that faculty were invested in the idea of curriculum-integrated library instruction (Worrell, 2002).

  7. Functional modelling for integration of human-software-hardware in complex physical systems

    International Nuclear Information System (INIS)

    Modarres, M.

    1996-01-01

    A framework is described for characterizing complex physical systems composed of human-software-hardware interactions in terms of their functions. It is argued that such a framework is domain-general, so that functional primitives provide a language more general than most other modeling methods, such as mathematical simulation. The characteristics and types of functional models are described. Examples of uses of the framework in modeling physical systems composed of human, software and hardware elements (hereafter referred to simply as physical systems) are presented. It is concluded that a function-centered model of a physical system provides a capability for generating a high-level simulation of the system for intelligent diagnostic, control or other similar applications.

  8. An approach to integrated design based componet software and OLE-technology

    DEFF Research Database (Denmark)

    Bagger-Petersen, Susanne C; Emborg, Jørgen; Andersen, Tom

    1996-01-01

    The paper reports on a prototype developed to demonstrate the (dis)abilities of the OLE standard to integrate different design applications with a CAD system.

  9. Viewport: An object-oriented approach to integrate workstation software for tile and stack mode display

    OpenAIRE

    Ghosh, Srinka; Andriole, Katherine P.; Avrin, David E.

    1997-01-01

    Diagnostic workstation design has migrated towards display presentation in one of two modes: tiled images or stacked images. It is our impression that the workstation setup or configuration in each of these two modes is rather distinct. We sought to establish a commonality to simplify software design, and to enable a single descriptor method to facilitate folder manager development of “hanging” protocols. All current workstation designs use a combination of “off-screen” and “on-screen” memory...

  10. A Reference Software Architecture to Support Unmanned Aircraft Integration in the National Airspace System

    Science.gov (United States)

    2012-07-01

    … that provides data and software services to enable a set of Unmanned Aircraft (UA) platforms to operate in a wide range of air domains, which may … implemented by MIT Lincoln Laboratory in the form of a Sense and Avoid (SAA) testbed that provides some of the core services. This paper describes the general architecture and an SAA testbed implementation …

  11. Integrated navigation and control software system for MRI-guided robotic prostate interventions.

    Science.gov (United States)

    Tokuda, Junichi; Fischer, Gregory S; DiMaio, Simon P; Gobbi, David G; Csoma, Csaba; Mewes, Philip W; Fichtinger, Gabor; Tempany, Clare M; Hata, Nobuhiko

    2010-01-01

    A software system to provide intuitive navigation for MRI-guided robotic transperineal prostate therapy is presented. In the system, the robot control unit, the MRI scanner, and the open-source navigation software are connected together via Ethernet to exchange commands, coordinates, and images using an open network communication protocol, OpenIGTLink. The system has six states called "workphases" that provide the necessary synchronization of all components during each stage of the clinical workflow, and the user interface guides the operator linearly through these workphases. On top of this framework, the software provides the following features for needle guidance: interactive target planning; 3D image visualization with current needle position; treatment monitoring through real-time MR images of needle trajectories in the prostate. These features are supported by calibration of robot and image coordinates by fiducial-based registration. Performance tests show that the registration error of the system was 2.6 mm within the prostate volume. Registered real-time 2D images were displayed 1.97 s after the image location is specified. Copyright 2009 Elsevier Ltd. All rights reserved.
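
    The fiducial-based registration step can be illustrated with a generic rigid-registration sketch (the standard SVD/Kabsch solution, assumed here rather than taken from the paper's code): solve for the rotation and translation that best map robot-frame fiducials onto image-frame fiducials, then report the residual error.

```python
# Generic least-squares rigid registration of matched fiducial points.
import numpy as np

def register_rigid(src, dst):
    """Rigid transform (R, t) mapping src points onto dst (both Nx3)."""
    src_c, dst_c = src.mean(0), dst.mean(0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1, 1, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

robot = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], float)
image = robot + [1.0, -2.0, 0.5]             # pure translation for the demo
R, t = register_rigid(robot, image)
residual = np.linalg.norm((robot @ R.T + t) - image, axis=1).mean()
print(f"mean fiducial registration error: {residual:.3f} (same units as input)")
```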

  12. LipiDex: An Integrated Software Package for High-Confidence Lipid Identification.

    Science.gov (United States)

    Hutchins, Paul D; Russell, Jason D; Coon, Joshua J

    2018-04-17

    State-of-the-art proteomics software routinely quantifies thousands of peptides per experiment with minimal need for manual validation or processing of data. For the emerging field of discovery lipidomics via liquid chromatography-tandem mass spectrometry (LC-MS/MS), comparably mature informatics tools do not exist. Here, we introduce LipiDex, a freely available software suite that unifies and automates all stages of lipid identification, reducing hands-on processing time from hours to minutes for even the most expansive datasets. LipiDex utilizes flexible in silico fragmentation templates and lipid-optimized MS/MS spectral matching routines to confidently identify and track hundreds of lipid species and unknown compounds from diverse sample matrices. Unique spectral and chromatographic peak purity algorithms accurately quantify co-isolation and co-elution of isobaric lipids, generating identifications that match the structural resolution afforded by the LC-MS/MS experiment. During final data filtering, ionization artifacts are removed to significantly reduce dataset redundancy. LipiDex interfaces with several LC-MS/MS software packages, enabling robust lipid identification to be readily incorporated into pre-existing data workflows. Copyright © 2018 Elsevier Inc. All rights reserved.
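
    The spectral-matching step can be pictured with a generic sketch: a normalized dot product (cosine) between an acquired MS/MS spectrum and an in-silico template over binned m/z values. The bin width and peak lists are invented; this is not LipiDex's actual scoring code.

```python
# Generic cosine similarity between two MS/MS peak lists after m/z binning.
import numpy as np

def spectral_cosine(mz_a, int_a, mz_b, int_b, bin_width=0.01):
    """Cosine similarity between two peak lists over shared m/z bins."""
    lo = min(mz_a.min(), mz_b.min())
    hi = max(mz_a.max(), mz_b.max()) + bin_width
    bins = np.arange(lo, hi, bin_width)
    va, _ = np.histogram(mz_a, bins=bins, weights=int_a)
    vb, _ = np.histogram(mz_b, bins=bins, weights=int_b)
    denom = np.linalg.norm(va) * np.linalg.norm(vb)
    return float(va @ vb / denom) if denom else 0.0

acquired = (np.array([184.07, 496.34, 522.36]), np.array([100.0, 35.0, 60.0]))
template = (np.array([184.07, 496.34, 524.37]), np.array([100.0, 40.0, 55.0]))
print(f"spectral match score: {spectral_cosine(*acquired, *template):.3f}")
```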

  13. Integrated navigation and control software system for MRI-guided robotic prostate interventions

    Science.gov (United States)

    Tokuda, Junichi; Fischer, Gregory S.; DiMaio, Simon P.; Gobbi, David G.; Csoma, Csaba; Mewes, Philip W.; Fichtinger, Gabor; Tempany, Clare M.; Hata, Nobuhiko

    2010-01-01

    A software system to provide intuitive navigation for MRI-guided robotic transperineal prostate therapy is presented. In the system, the robot control unit, the MRI scanner, and the open-source navigation software are connected together via Ethernet to exchange commands, coordinates, and images using an open network communication protocol, OpenIGTLink. The system has six states called “workphases” that provide the necessary synchronization of all components during each stage of the clinical workflow, and the user interface guides the operator linearly through these workphases. On top of this framework, the software provides the following features for needle guidance: interactive target planning; 3D image visualization with current needle position; treatment monitoring through real-time MR images of needle trajectories in the prostate. These features are supported by calibration of robot and image coordinates by fiducial-based registration. Performance tests show that the registration error of the system was 2.6 mm within the prostate volume. Registered real-time 2D images were displayed 1.97 s after the image location is specified. PMID:19699057

  14. Haptic/graphic rehabilitation: integrating a robot into a virtual environment library and applying it to stroke therapy.

    Science.gov (United States)

    Sharp, Ian; Patton, James; Listenberger, Molly; Case, Emily

    2011-08-08

    Recent research that tests interactive devices for prolonged therapy practice has revealed new prospects for robotics combined with graphical and other forms of biofeedback. Previous human-robot interactive systems have required different software commands to be implemented for each robot, leading to unnecessary development overhead each time a new system becomes available. For example, when a haptic/graphic virtual reality environment has been coded for one specific robot to provide haptic feedback, that robot could not be traded for another without recoding the program. However, recent efforts in the open source community have proposed a wrapper class approach that can elicit nearly identical responses regardless of the robot used. The result can enable researchers across the globe to perform similar experiments using shared code, so that modular "switching out" of one robot for another would not affect development time. In this paper, we outline the successful creation and implementation of a wrapper class for one robot into the open-source H3DAPI, which integrates the software commands most commonly used by all robots.
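
    The wrapper-class idea reads roughly as follows in a hypothetical sketch (class and method names are invented, not the H3DAPI's actual API): the environment codes against one abstract interface, and each robot adapts its native calls and units behind it.

```python
# Illustrative wrapper (adapter) classes behind one common robot interface.
from abc import ABC, abstractmethod

class RobotWrapper(ABC):
    """Uniform interface the virtual environment codes against."""
    @abstractmethod
    def read_end_effector(self) -> tuple: ...
    @abstractmethod
    def command_force(self, fx: float, fy: float, fz: float) -> None: ...

class VendorARobot(RobotWrapper):
    def read_end_effector(self):          # vendor A's native units: meters
        return (0.10, 0.02, 0.00)
    def command_force(self, fx, fy, fz):
        print(f"A.set_force({fx}, {fy}, {fz})")

class VendorBRobot(RobotWrapper):
    def read_end_effector(self):          # vendor B reports millimeters
        x, y, z = (100.0, 20.0, 0.0)
        return (x / 1000, y / 1000, z / 1000)   # normalize to meters
    def command_force(self, fx, fy, fz):
        print(f"B.apply_wrench(({fx}, {fy}, {fz}))")

def render_haptics(robot: RobotWrapper):
    """Environment code is identical regardless of the robot plugged in."""
    x, y, z = robot.read_end_effector()
    robot.command_force(-50 * x, -50 * y, -50 * z)  # simple spring field

for robot in (VendorARobot(), VendorBRobot()):
    render_haptics(robot)
```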

  15. An Optical Receiver Post Processing System for the Integrated Radio and Optical Communications Software Defined Radio Test Bed

    Science.gov (United States)

    Nappier, Jennifer M.; Tokars, Roger P.; Wroblewski, Adam C.

    2016-01-01

    The Integrated Radio and Optical Communications (iROC) project at the National Aeronautics and Space Administration's (NASA) Glenn Research Center is investigating the feasibility of a hybrid radio frequency (RF) and optical communication system for future deep space missions. As a part of this investigation, a test bed for a radio frequency (RF) and optical software defined radio (SDR) has been built. Receivers and modems for the NASA deep space optical waveform are not commercially available so a custom ground optical receiver system has been built. This paper documents the ground optical receiver, which is used in order to test the RF and optical SDR in a free space optical communications link.

  16. An Optical Receiver Post-Processing System for the Integrated Radio and Optical Communications Software Defined Radio Test Bed

    Science.gov (United States)

    Nappier, Jennifer M.; Tokars, Roger P.; Wroblewski, Adam C.

    2016-01-01

    The Integrated Radio and Optical Communications (iROC) project at the National Aeronautics and Space Administration's (NASA) Glenn Research Center is investigating the feasibility of a hybrid radio frequency (RF) and optical communication system for future deep space missions. As a part of this investigation, a test bed for a radio frequency (RF) and optical software defined radio (SDR) has been built. Receivers and modems for the NASA deep space optical waveform are not commercially available so a custom ground optical receiver system has been built. This paper documents the ground optical receiver, which is used in order to test the RF and optical SDR in a free space optical communications link.

  17. Evaluation of System-Integrated Smart Grid Devices using Software- and Hardware-in-the-Loop

    Energy Technology Data Exchange (ETDEWEB)

    Lundstrom, Blake; Chakraborty, Sudipta; Lauss, Georg; Brundlinger, Roland; Conklin, Russell

    2016-12-12

    This paper presents a concise description of state-of-the-art real-time simulation-based testing methods and demonstrates how they can be used independently and/or in combination as an integrated development and validation approach for smart grid DERs and systems. A three-part case study demonstrating the application of this integrated approach at the different stages of development and validation of a system-integrated smart photovoltaic (PV) inverter is also presented. Laboratory testing results and perspectives from two international research laboratories are included in the case study.

  18. Requirement Volatility, Standardization and Knowledge Integration in Software Projects: An Empirical Analysis on Outsourced IS Development Projects

    Directory of Open Access Journals (Sweden)

    Rajesri Govindaraju

    2015-08-01

    Full Text Available Information systems development (ISD) projects are highly complex, with different groups of people having to collaborate and exchange their knowledge. Considering the intensity of knowledge exchange that takes place in outsourced ISD projects, in this study a conceptual model was developed to examine the influence of four antecedents (standardization, requirement volatility, internal integration, and external integration) on two dependent variables (process performance and product performance). Data were collected from 46 software companies in four big cities in Indonesia. The collected data were examined to verify the proposed theoretical model using the partial least squares structural equation modeling (PLS-SEM) technique. The results show that process performance is significantly influenced by internal integration and standardization, while product performance is significantly influenced by external integration and requirement volatility. This study contributes to a better understanding of how knowledge integration can be managed in outsourced ISD projects in view of increasing their success.

  19. Software for noise measurements

    International Nuclear Information System (INIS)

    Zyryanov, V.A.

    1987-01-01

    The CURS program library, comprising 38 FORTRAN programs designed for processing discrete experimental data in the form of random or deterministic periodic processes, is described. The library is based on a modular construction principle, which allows sets of programs to be assembled from it to solve tasks related to NPP operation and to develop special software.

  20. Public libraries in the library regions in the year 2009

    Directory of Open Access Journals (Sweden)

    Milena Bon

    2011-01-01

    Full Text Available Purpose: Regional public libraries were initiated in 2003 to connect the professional activities of libraries within regional networks and to ensure coordinated library development in a region, in cooperation with the Library System Development Centre at the National and University Library, which performs a coordinating role. The article analyses the performance of public libraries and their integration into regional library networks in order to determine how well developed the conditions for public library performance are. Methodology/approach: Statistical data for the year 2009 were the basis for an overview of the library activities of ten library regions with regard to applicable legislation and library standards. The level of regional library activity is compared to the socio-economic situation of the statistical regions, representing a new approach to the presentation of Slovenian public libraries' development. Results: Absolute values indicate better development of the nine libraries in the central Slovenia region, while relative values offer a quite different picture: the four libraries in the region of Nova Gorica prove the most highly developed. Research limitation: The research is limited to the year 2009 and basic statistical analysis. Originality/practical implications: The findings of the analysis are useful for public libraries planning their development strategy within a region, and for financial bodies providing adequate financing for library activities in a specific region. The basic condition for successful public library performance is the even and harmonized development of performance conditions, as recommended by library standards.

  1. Use of free software for learning the definite integral

    OpenAIRE

    Medina, Mabel Azucena; Rubio, Héctor Eduardo

    2013-01-01

    The experience of a teaching unit is described within the framework of Brousseau's theory and of Teaching for Understanding. The Generative Topic is the definite integral. The Understanding Goals are the definition of the definite integral and the methods for evaluating the definite integral. The Understanding Performances are autonomous activities for evaluating definite integrals. The purpose of this activity is for students to understand that they can approximately calculate an integral…

  2. WIMS-D library update

    International Nuclear Information System (INIS)

    2007-05-01

    WIMS-D (Winfrith Improved Multigroup Scheme-D) is the name of a family of software packages for reactor lattice calculations and is one of the few reactor lattice codes in the public domain and available on noncommercial terms. WIMSD-5B has recently been released from the OECD Nuclear Energy Agency Data Bank, and features major improvements in machine portability, as well as incorporating a few minor corrections. This version supersedes WIMS-D/4, which was released by the Winfrith Technology Centre in the United Kingdom for IBM machines and has been adapted for various other computer platforms in different laboratories. The main weakness of the WIMS-D package is the multigroup constants library, which is based on very old data. The relatively good performance of WIMS-D is attributed to a series of empirical adjustments to the multigroup data. However, the adjustments are not always justified on the basis of more accurate and recent experimental measurements. Following the release of new and revised evaluated nuclear data files, it was felt that the performance of WIMS-D could be improved by updating the associated library. The WIMS-D Library Update Project (WLUP) was initiated in the early 1990s with the support of the IAEA. This project consisted of voluntary contributions from a large number of participants. Several benchmarks for testing the library were identified and analysed, the WIMSR module of the NJOY code system was upgraded and the author of NJOY accepted the proposed updates for the official code system distribution. A detailed parametric study was performed to investigate the effects of various data processing input options on the integral results. In addition, the data processing methods for the main reactor materials were optimized. Several partially updated libraries were produced for testing purposes. The final stage of the WLUP was organized as a coordinated research project (CRP) in order to speed up completion of the fully updated library

  3. Sighten Final Technical Report DEEE0006690 Deploying an integrated and comprehensive solar financing software platform

    Energy Technology Data Exchange (ETDEWEB)

    O' Leary, Conlan [Sighten, Inc., San Francisco, CA (United States)

    2017-10-15

    Over the project, Sighten built a comprehensive software-as-a-service (SaaS) platform to automate and streamline the residential solar financing workflow. Before the project period, significant time and money were spent by companies on front-end tools related to system design and proposal creation, but comparatively few resources were available to support the many back-end calculations and data management processes that underpin third-party financing. Without a tool like Sighten, the solar financing process involved passing information about the homeowner prospect between separate tools for system design and financing, and then later to reporting tools including Microsoft Excel, CRM software, in-house software, outside software, and offline, manual processes. Passing data between tools and attempting to connect disparate systems results in inefficiency and inaccuracy for the industry. Sighten was built to consolidate all financial and solar-related calculations in a single software platform. It significantly improves upon the accuracy of these calculations and exposes sophisticated new analysis tools, resulting in a rigorous, efficient and cost-effective toolset for scaling residential solar. Widely deploying a platform like Sighten's significantly and immediately impacts the residential solar space in several important ways: 1) standardizing and improving the quality of all quantitative calculations involved in the residential financing process, most notably project finance, system production and reporting calculations; 2) representing a true step change in terms of reporting and analysis capabilities by maintaining more accurate data and exposing sophisticated tools around simulation, tranching, and financial reporting, among others, to all stakeholders in the space; 3) allowing a broader group of developers/installers/finance companies to access the capital markets by providing an out-of-the-box toolset that handles the execution of running investor capital through a

  4. An integrated PCR colony hybridization approach to screen cDNA libraries for full-length coding sequences.

    Science.gov (United States)

    Pollier, Jacob; González-Guzmán, Miguel; Ardiles-Diaz, Wilson; Geelen, Danny; Goossens, Alain

    2011-01-01

    cDNA-Amplified Fragment Length Polymorphism (cDNA-AFLP) is a commonly used technique for genome-wide expression analysis that does not require prior sequence knowledge. Typically, quantitative expression data and sequence information are obtained for a large number of differentially expressed gene tags. However, most of the gene tags do not correspond to full-length (FL) coding sequences, which is a prerequisite for subsequent functional analysis. A medium-throughput screening strategy, based on the integration of polymerase chain reaction (PCR) and colony hybridization, was developed that allows parallel screening of a cDNA library for FL clones corresponding to incomplete cDNAs. The method was applied to screen for the FL open reading frames of a selection of 163 cDNA-AFLP tags from three different medicinal plants, leading to the identification of 109 (67%) FL clones. Furthermore, the protocol allows the use of multiple probes in a single hybridization event, thus significantly increasing the throughput when screening for rare transcripts. The presented strategy offers an efficient method for the conversion of incomplete expressed sequence tags (ESTs), such as cDNA-AFLP tags, to FL coding sequences.

  5. Viewport: an object-oriented approach to integrate workstation software for tile and stack mode display.

    Science.gov (United States)

    Ghosh, S; Andriole, K P; Avrin, D E

    1997-08-01

    Diagnostic workstation design has migrated towards display presentation in one of two modes: tiled images or stacked images. It is our impression that the workstation setup or configuration in each of these two modes is rather distinct. We sought to establish a commonality to simplify software design, and to enable a single descriptor method to facilitate folder manager development of "hanging" protocols. All current workstation designs use a combination of "off-screen" and "on-screen" memory whether or not they use a dedicated display subsystem, or merely a video board. Most diagnostic workstations also have two or more monitors. Our central concept is that of a "logical" viewport that can be smaller than, the same size as, or larger than a single monitor. Each port "views" an image data sequence loaded into offscreen memory. Each viewport can display one or more images in sequence in a one-on-one or traditionally tiled presentation. Viewports can be assigned to the available monitor "real estate" in any manner that fits. For example, a single sequence computed tomography (CT) study could be displayed across all monitors in a tiled appearance by assigning a single large viewport to the monitors. At the other extreme, a multisequence magnetic resonance (MR) study could be compared with a similar previous study by assigning four viewports to each monitor, single image display per viewport, and assigning four of the sequences of the current study to the left monitor viewports, and four of the earlier study to the right monitor viewports. Ergonomic controls activate scrolling through the off-screen image sequence data. Workstation folder manager hanging protocols could then specify viewports, number of images per viewport, and the automatic assignment of appropriately named sequences of current and previous studies to the viewports on a radiologist-specific basis. Furthermore, software development is simplified by common base objects and methods of the tile and stack
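
    The logical-viewport concept described above lends itself to a compact sketch (all names invented, not the workstation's actual code): a viewport owns an image sequence, a 1x1 tile grid corresponds to stack mode, a larger grid to tile mode, and scrolling advances through off-screen data.

```python
# Illustrative "logical viewport" covering both stack and tile display modes.
from dataclasses import dataclass

@dataclass
class Viewport:
    name: str
    images: list                # off-screen image sequence
    rows: int = 1               # 1x1 -> stack mode; larger grid -> tile mode
    cols: int = 1
    cursor: int = 0             # index of the first image currently shown

    def visible(self):
        n = self.rows * self.cols
        return self.images[self.cursor:self.cursor + n]

    def scroll(self, step=1):
        n = self.rows * self.cols
        self.cursor = max(0, min(self.cursor + step * n, len(self.images) - 1))

ct = [f"CT-{i:03d}" for i in range(120)]
stack = Viewport("left-monitor", ct)                   # one-on-one display
tiled = Viewport("right-monitor", ct, rows=2, cols=2)  # 4-image tiling
stack.scroll(); tiled.scroll()
print(stack.visible())   # ['CT-001']
print(tiled.visible())   # ['CT-004', 'CT-005', 'CT-006', 'CT-007']
```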

  6. High integrity software for nuclear power plants: Candidate guidelines, technical basis and research needs. Executive summary: Volume 1

    International Nuclear Information System (INIS)

    Seth, S.; Bail, W.; Cleaves, D.; Cohen, H.; Hybertson, D.; Schaefer, C.; Stark, G.; Ta, A.; Ulery, B.

    1995-06-01

    The work documented in this report was performed in support of the US Nuclear Regulatory Commission to examine the technical basis for candidate guidelines that could be considered in reviewing and evaluating high integrity computer software used in the safety systems of nuclear power plants. The framework for the work consisted of the following software development and assurance activities: requirements specification; design; coding; verification and validation, including static analysis and dynamic testing; safety analysis; operation and maintenance; configuration management; quality assurance; and planning and management. Each activity (framework element) was subdivided into technical areas (framework subelements). The report describes the development of approximately 200 candidate guidelines that span the entire range of software life-cycle activities; the assessment of the technical basis for those candidate guidelines; and the identification, categorization and prioritization of research needs for improving the technical basis. The report has two volumes: Volume 1, Executive Summary, includes an overview of the framework and of each framework element, the complete set of candidate guidelines, the results of the assessment of the technical basis for each candidate guideline, and a discussion of research needs that support the regulatory function; Volume 2 is the main report

  7. Hardware/software co-design and optimization for cyberphysical integration in digital microfluidic biochips

    CERN Document Server

    Luo, Yan; Ho, Tsung-Yi

    2015-01-01

    This book describes a comprehensive framework for hardware/software co-design, optimization, and use of robust, low-cost, and cyberphysical digital microfluidic systems. Readers with a background in electronic design automation will find this book to be a valuable reference for leveraging conventional VLSI CAD techniques for emerging technologies, e.g., biochips or bioMEMS. Readers from the circuit/system design community will benefit from methods presented to extend design and testing techniques from microelectronics to mixed-technology microsystems. For readers from the microfluidics domain,

  8. An Integrated Software Development Framework for PLC and FPGA based Digital I and Cs

    International Nuclear Information System (INIS)

    Yoo, Jun Beom; Kim, Eui Sub; Lee, Dong Ah; Choi, Jong Gyun

    2014-01-01

    NuDE 2.0 (Nuclear Development Environment) is a model-based software development environment for safety-critical digital systems in nuclear power plants. It makes it possible to develop PLC-based systems as well as FPGA-based systems simultaneously from the same requirement or design specifications. The case study showed that NuDE 2.0 can be adopted as an effective method of bridging the gap between existing PLC-based and upcoming FPGA-based developments, as well as a means of gaining diversity.

  9. An Integrated Software Development Framework for PLC and FPGA based Digital I and Cs

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Jun Beom; Kim, Eui Sub; Lee, Dong Ah [Konkuk University, Seoul (Korea, Republic of); Choi, Jong Gyun [KAERI, Daejeon (Korea, Republic of)

    2014-08-15

    NuDE 2.0 (Nuclear Development Environment) is a model-based software development environment for safety-critical digital systems in nuclear power plants. It makes it possible to develop PLC-based systems as well as FPGA-based systems simultaneously from the same requirement or design specifications. The case study showed that NuDE 2.0 can be adopted as an effective method of bridging the gap between existing PLC-based and upcoming FPGA-based developments, as well as a means of gaining diversity.

  10. The integration of Workload Management Systems for the ProtoDUNE Software and Computing cluster

    CERN Document Server

    Oniciuc, Oriana-Maria

    2017-01-01

    The protoDUNE experimental program is designed to test and validate the technologies for DUNE. All of the many elements in the chain of data acquisition, storage, distribution and processing are critically important for deriving physics results from the data. To achieve this, a software stack has been chosen to implement automatic propagation of configurations across all the nodes in the NP cluster. This report presents the architecture of the system and the operations through which the cluster's features can be scaled.

  11. Hardware and software architecture for the integration of the new EC waves launcher in FTU control system

    Energy Technology Data Exchange (ETDEWEB)

    Boncagni, L. [Associazione EURATOM-ENEA sulla Fusione – ENEA, Via Enrico Fermi, 45 00045 Frascati (RM) (Italy); Centioli, C., E-mail: cristina.centioli@enea.it [Associazione EURATOM-ENEA sulla Fusione – ENEA, Via Enrico Fermi, 45 00045 Frascati (RM) (Italy); Galperti, C.; Alessi, E.; Granucci, G. [Associazione EURATOM-ENEA-CNR sulla Fusione – IFP-CNR, Via Roberto Cozzi, 53 20125 Milano (Italy); Grosso, L.A. [Associazione EURATOM-ENEA sulla Fusione – ENEA, Via Enrico Fermi, 45 00045 Frascati (RM) (Italy); Marchetto, C. [Associazione EURATOM-ENEA-CNR sulla Fusione – IFP-CNR, Via Roberto Cozzi, 53 20125 Milano (Italy); Napolitano, M. [Associazione EURATOM-ENEA sulla Fusione – ENEA, Via Enrico Fermi, 45 00045 Frascati (RM) (Italy); Nowak, S. [Associazione EURATOM-ENEA-CNR sulla Fusione – IFP-CNR, Via Roberto Cozzi, 53 20125 Milano (Italy); Panella, M. [Associazione EURATOM-ENEA sulla Fusione – ENEA, Via Enrico Fermi, 45 00045 Frascati (RM) (Italy); Sozzi, C. [Associazione EURATOM-ENEA-CNR sulla Fusione – IFP-CNR, Via Roberto Cozzi, 53 20125 Milano (Italy); Tilia, B.; Vitale, V. [Associazione EURATOM-ENEA sulla Fusione – ENEA, Via Enrico Fermi, 45 00045 Frascati (RM) (Italy)

    2013-10-15

    Highlights: ► The integration of a new ECRH launcher into FTU's legacy control system is reported. ► Fast control has been developed with a three-node RT cluster within the MARTe framework. ► Slow control was implemented with a Simatic S7 PLC and an EPICS IOC-CA application. ► The first results have assessed the feasibility of the launcher control architecture. -- Abstract: The role of high power electron cyclotron (EC) waves in controlling magnetohydrodynamic (MHD) instabilities in tokamaks has been assessed in several experiments, exploiting the physical effects induced by resonant heating and current drive. Recently a new EC launcher, whose main goal is controlling tearing modes and possibly preventing their onset, is being implemented on FTU. So far most of the components of the launcher control strategy have been realized and successfully tested in plasma experiments. Nevertheless, the operation of the new launcher must be completely integrated with that of the existing launcher and with the FTU control system. This work deals with this final step, proposing a hardware and software architecture that implements up-to-date technologies to achieve a modular and effective control strategy well integrated into a legacy system. The slow control system of the new EC launcher is based on a Siemens S7 Programmable Logic Controller (PLC), integrated into the FTU control system supervisor through an EPICS input output controller (IOC) and an in-house developed Channel Access client application that creates an abstraction layer decoupling the IOC and the PLC from the FTU Supervisor software. This architecture could enable a smooth migration to an EPICS-only supervisory control system. The real-time component of the control system is based on the open source MARTe framework relying on a Linux real-time cluster, devoted to the detection of MHD instabilities and the calculation of the injection angles and the time reference for the radiofrequency power enable commands for the EC launcher.

  12. Experiences from the formal specification of the integration platform and the synthesis of SDT with the software bus

    International Nuclear Information System (INIS)

    Thunem, Harald; Mohn, Peter; Sandmark, Haakon; Stoelen, Ketil

    1999-04-01

    The three-year programme 1997-1999 for the OECD Halden Reactor Project (HRP) identifies the need to gain experience from applying formal techniques in real-life system developments. This motivated the initiation of the HRP research activity Integration of Formal Specification in the Development of HAMMLAB 2000 (INT-FS). The principal objective was to experiment with formal techniques in system developments at the HRP; in particular, system developments connected to HAMMLAB 2000 - the computerised laboratory for man-machine-interaction experiments currently under construction. It was hoped that this experimentation with formal techniques would result in a better understanding of how such techniques should be utilised in a more industrial setting. Another objective was to obtain more knowledge of the practical effects and consequences of an increased level of formalization. This report summarises experiences, results and conclusions from a pre-study addressing INT-FS related issues connected to the development of the HAMMLAB 2000 Integration Platform (IP). The report starts by giving a brief overview of the IP. Then it describes and summarises experiences from the formalization of a top-level requirements specification for the IP. Finally, it discusses various approaches for the integration of applications generated automatically through the CASE-tool SDT with the Software Bus on which the communication within HAMMLAB 2000 will be based. The report concludes that the selected formalisms and tools are well-suited to describe IP-like systems. It also concludes that the integration of SDT applications with the Software Bus will not be a major obstacle, and finally that a monitoring component for the IP is well-suited for development within INT-FS. (author) (ml)

  13. Supervision Software for the Integration of the Beam Interlock System with the CERN Accelerator Complex

    CERN Document Server

    Audrain, M; Dragu, M; Fuchsberger, K; Garnier, JC; Gorzawski, AA; Koza, M; Krol, K; Moscatelli, A; Puccio, B; Stamos, K; Zerlauth, M

    2014-01-01

    The Accelerator complex at the European Organisation for Nuclear Research (CERN) is composed of many systems which are required to function in a valid state to ensure safe beam operation. One key component of machine protection, the Beam Interlock System (BIS), was designed to interface critical systems around the accelerator chain, provide fast and reliable transmission of beam dump requests and trigger beam extraction in case of malfunctioning of equipment systems or beam losses. Numerous upgrades of accelerator and controls components during the Long Shutdown 1 (LS1) are followed by subsequent software updates that need to be thoroughly validated before the restart of beam operation in 2015. In parallel, the ongoing deployments of the BIS hardware in the PS booster (PSB) and the future LINAC4 give rise to new requirements for the related controls and monitoring software due to their fast cycle times. This paper describes the current status and ongoing work as well as the long-term vision for the integratio...

  14. Development of Soil Compaction Analysis Software (SCAN Integrating a Low Cost GPS Receiver and Compactometer

    Directory of Open Access Journals (Sweden)

    Dongha Lee

    2012-02-01

    Full Text Available Software for soil compaction analysis (SCAN) has been developed for evaluating compaction states using data from GPS as well as from a compactometer attached to the roller. SCAN is distinguished from previous software for intelligent compaction (IC) in that it can use the results from various types of GPS positioning methods, and it also has an optimal structure for remotely managing the large amounts of data gathered from numerous rollers. For this, several methods were developed: (1) improving the accuracy of a low-cost GPS receiver's positioning results; (2) modeling the trajectory of a moving roller using the GPS receiver's results and linking it with the data from the compactometer; and (3) extracting information regarding the compaction state of the ground from the modeled trajectory, using spatial analysis methods. SCAN was verified through various field compaction tests, and it has been confirmed that it can be a very effective tool for evaluating field compaction states.
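
    The trajectory-linking step in (2) can be sketched generically (invented data, not SCAN's code): interpolate the GPS track at each compactometer timestamp so that every compaction value is assigned a position.

```python
# Geo-referencing compactometer readings by interpolating the GPS trajectory.
import numpy as np

gps_t = np.array([0.0, 1.0, 2.0, 3.0])            # s, GPS epochs
gps_e = np.array([0.0, 1.5, 3.1, 4.4])            # easting, m
gps_n = np.array([0.0, 0.1, 0.3, 0.2])            # northing, m

cmv_t = np.array([0.25, 0.75, 1.25, 1.75, 2.25])  # compactometer timestamps
cmv = np.array([22.0, 25.5, 28.0, 30.2, 31.0])    # compaction meter values

# position of the roller drum at each compactometer sample
e = np.interp(cmv_t, gps_t, gps_e)
n = np.interp(cmv_t, gps_t, gps_n)

for ti, ei, ni, v in zip(cmv_t, e, n, cmv):
    print(f"t={ti:4.2f}s  E={ei:4.2f}m  N={ni:4.2f}m  CMV={v}")
```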

  15. ParseCNV integrative copy number variation association software with quality tracking.

    Science.gov (United States)

    Glessner, Joseph T; Li, Jin; Hakonarson, Hakon

    2013-03-01

    A number of copy number variation (CNV) calling algorithms exist; however, comprehensive software tools for CNV association studies are lacking. We describe ParseCNV, unique software that takes CNV calls and creates probe-based statistics for CNV occurrence in both case-control designs and in family-based studies, addressing both de novo and inheritance events, which are then summarized based on CNV regions (CNVRs). CNVRs are defined in a dynamic manner to allow for complex CNV overlap while maintaining precise association regions. Using this approach, we avoid the failure-to-converge and non-monotonic curve-fitting weaknesses of programs such as CNVtools and CNVassoc; and although PLINK is easy to use, it only provides combined CNV-state probe-based statistics, not state-specific CNVRs. Existing CNV association methods do not provide any quality-tracking information to filter confident associations, a key issue which is fully addressed by ParseCNV. In addition, uncertainty in the CNV calls underlying CNV associations is evaluated to verify significant results, including CNV overlap profiles, genomic context, the number of probes supporting the CNV and single-probe intensities. When the optimal quality control parameters recommended by ParseCNV are followed, 90% of CNVs validate by polymerase chain reaction, an often problematic stage because of inadequate review of significant associations. ParseCNV is freely available at http://parsecnv.sourceforge.net.
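
    The CNVR construction described here is essentially interval merging followed by a per-region association test. A toy sketch of that idea follows; it is not ParseCNV's actual implementation, and the data layout and cohort sizes are invented.

    from scipy.stats import fisher_exact

    # (chrom, start, end, is_case) CNV calls; toy data in an assumed format
    calls = [("1", 100, 500, True), ("1", 400, 900, True),
             ("1", 450, 800, False), ("2", 10, 60, True)]

    def merge_cnvrs(calls):
        """Union overlapping calls on the same chromosome into CNVRs."""
        regions = []
        for chrom, start, end, is_case in sorted(calls):
            if regions and regions[-1][0] == chrom and start <= regions[-1][2]:
                regions[-1][2] = max(regions[-1][2], end)
                regions[-1][3].append(is_case)
            else:
                regions.append([chrom, start, end, [is_case]])
        return regions

    n_cases, n_controls = 2, 2   # cohort sizes (assumed)
    for chrom, start, end, flags in merge_cnvrs(calls):
        a = sum(flags)           # case calls in this region (toy proxy;
        b = len(flags) - a       # ParseCNV actually works per probe)
        _, p = fisher_exact([[a, n_cases - a], [b, n_controls - b]])
        print(f"CNVR {chrom}:{start}-{end}  cases={a} controls={b} p={p:.3f}")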

  16. Application of integrated photovoltaic technologies on building roofs using GIS software

    Directory of Open Access Journals (Sweden)

    Stefano Bonesso

    2013-08-01

    Full Text Available To plan the deployment of renewable energy technologies across a territory, in particular solar technologies, GIS (geographic information system) software can be used to analyse and represent geo-referenced data. Potential of photovoltaic technologies on buildings' roofs using geographic information systems (GIS) - In order to plan the diffusion of renewable energy technologies, geographic information systems (GIS) can be useful. In this study, photovoltaic technologies in urban environments were examined, considering the shadows of the urban context and of the territory's orography, evaluated with GIS (ESRI ArcGIS). The results for potential photovoltaic installations strongly depend on input data, but roof data are not always accurate. The aim of this work is to define a tool to improve the results of a GIS simulation at the urban scale. To validate the procedure, the results were compared with data monitored by the PERSIL project. The analysis was based on the use of geographic information systems, laser scanner data (LiDAR), and software for 3D reconstruction of the buildings.
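
    The rooftop potential estimate reduces to scaling roof area and irradiance by a shading factor (here assumed to come from a LiDAR-based 3D city model) and by module losses. A back-of-envelope sketch with generic assumed coefficients, not values from the study:

    def pv_yield_kwh(roof_area_m2, irradiance_kwh_m2_yr, shading_factor,
                     panel_efficiency=0.15, performance_ratio=0.75):
        """Annual PV energy for one roof.

        shading_factor: fraction of irradiance not lost to shadows (0..1),
        derived in the study from the 3D reconstruction of the buildings.
        """
        return (roof_area_m2 * irradiance_kwh_m2_yr * shading_factor
                * panel_efficiency * performance_ratio)

    # Example: 80 m2 roof, 1400 kWh/m2/yr irradiance, 10% shading loss
    print(f"{pv_yield_kwh(80, 1400, 0.9):.0f} kWh/yr")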

  17. Integrated Reliability Estimation of a Nuclear Maintenance Robot including a Software

    Energy Technology Data Exchange (ETDEWEB)

    Eom, Heung Seop; Kim, Jae Hee; Jeong, Kyung Min [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2011-10-15

    Conventional reliability estimation techniques such as Fault Tree Analysis (FTA), Reliability Block Diagram (RBD), Markov Model, and Event Tree Analysis (ETA) have been widely used and approved in some industries. However, there are limitations when we use them for complicated robot systems that include software, such as intelligent reactor inspection robots. Therefore an expert's judgment plays an important role in estimating the reliability of a complicated system in practice, because experts can deal with diverse evidence related to the reliability and then perform an inference based on it. The method proposed in this paper combines qualitative and quantitative evidence and performs an inference like experts do. Furthermore, it does the work in a formal and quantitative way, unlike human experts, by the benefits of Bayesian Nets (BNs).
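
    A toy sketch of the BN idea: a qualitative context variable (expert judgment about mission complexity) is combined with quantitative failure rates by enumeration. The structure and numbers are illustrative assumptions, not the paper's model.

    P_hw_fail = 0.01                       # hardware prior from FTA/RBD-style data
    P_sw_fail = {True: 0.20, False: 0.02}  # software failure prob. given mission complexity
    P_complex = 0.5                        # expert's prior that the mission is complex

    def p_system_fail():
        total = 0.0
        for complex_mission in (True, False):
            p_ctx = P_complex if complex_mission else 1 - P_complex
            p_sw = P_sw_fail[complex_mission]
            # System fails if hardware OR software fails (assumed series logic)
            p_fail = 1 - (1 - P_hw_fail) * (1 - p_sw)
            total += p_ctx * p_fail
        return total

    print(f"P(system failure) = {p_system_fail():.4f}")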

  18. Integration of control and building performance simulation software by run-time coupling

    NARCIS (Netherlands)

    Yahiaoui, A.; Hensen, J.L.M.; Soethout, L.L.

    2003-01-01

    This paper presents the background, approach and initial results of a project, which aims to achieve better integrated building and systems control modeling in building performance simulation by runtime coupling of distributed computer programs. This paper focuses on one of the essential steps

  19. Monitoring single-cell gene regulation under dynamically controllable conditions with integrated microfluidics and software

    NARCIS (Netherlands)

    Kaiser, Matthias; Jug, Florian; Julou, Thomas; Deshpande, S.R.; Pfohl, Thomas; Silander, Olin K.; Myers, Gene; Van Nimwegen, Erik

    2018-01-01

    Much is still not understood about how gene regulatory interactions control cell fate decisions in single cells, in part due to the difficulty of directly observing gene regulatory processes in vivo. We introduce here a novel integrated setup consisting of a microfluidic chip and accompanying

  20. Integrating Multimedia ICT Software in Language Curriculum: Students' Perception, Use, and Effectiveness

    Science.gov (United States)

    Penner, Nikolai; Grodek, Elzbieta

    2014-01-01

    Information and Communication Technologies (ICT) constitute an integral part of the teaching and learning environment in present-day educational institutions and play an increasingly important role in the modern second language classroom. In this study, an online language learning tool "Tell Me More" (TMM) has been introduced as a…

  1. Validation study of SRAC2006 code system based on evaluated nuclear data libraries for TRIGA calculations by benchmarking integral parameters of TRX and BAPL lattices of thermal reactors

    International Nuclear Information System (INIS)

    Khan, M.J.H.; Sarker, M.M.; Islam, S.M.A.

    2013-01-01

    Highlights: ► To validate the SRAC2006 code system for TRIGA neutronics calculations. ► TRX and BAPL lattices are treated as standard benchmarks for this purpose. ► To compare the calculated results with experiment as well as MCNP values in this study. ► The study demonstrates a good agreement with the experiment and the MCNP results. ► Thus, this analysis reflects the validation study of the SRAC2006 code system. - Abstract: The goal of this study is to present the validation study of the SRAC2006 code system based on the evaluated nuclear data libraries ENDF/B-VII.0 and JENDL-3.3 for neutronics analysis of the TRIGA Mark-II Research Reactor at AERE, Bangladesh. This study is achieved through the analysis of integral parameters of TRX and BAPL benchmark lattices of thermal reactors. In integral measurements, the thermal reactor lattices TRX-1, TRX-2, BAPL-UO2-1, BAPL-UO2-2 and BAPL-UO2-3 are treated as standard benchmarks for validating/testing the SRAC2006 code system as well as nuclear data libraries. The integral parameters of the said lattices are calculated using the collision probability transport code PIJ of the SRAC2006 code system at room temperature (20 °C) based on the above libraries. The calculated integral parameters are compared to the measured values as well as to MCNP values based on the Chinese evaluated nuclear data library CENDL-3.0. It was found that in most cases the values of the integral parameters demonstrate a good agreement with the experiment and the MCNP results. In addition, the group constants in SRAC format for the TRX and BAPL lattices in the fast and thermal energy ranges, respectively, were compared between the above libraries, and it was found that the group constants are nearly identical, with very insignificant differences. Therefore, this analysis reflects the validation study of the SRAC2006 code system based on the evaluated nuclear data libraries JENDL-3.3 and ENDF/B-VII.0 and can also serve as a basis for further neutronics calculations
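
    The validation step amounts to forming calculated-to-experiment (C/E) ratios for each integral parameter. A minimal sketch with placeholder numbers (not the paper's results; the parameter names follow common TRX benchmark usage):

    measured   = {"rho28": 1.320, "delta25": 0.0987, "delta28": 0.0946, "CR": 0.797}
    calculated = {"rho28": 1.335, "delta25": 0.0991, "delta28": 0.0952, "CR": 0.801}

    for name in measured:
        ce = calculated[name] / measured[name]   # C/E close to 1.0 = good agreement
        print(f"{name}: C/E = {ce:.3f} ({(ce - 1) * 100:+.1f}%)")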

  2. MOIRA Software Framework - Integrated User-friendly Shell for The Environmental Decision Support Systems

    International Nuclear Information System (INIS)

    Hofman, Dmitry; Nordlinder, Sture

    2003-01-01

    MOIRA DSS is a model-based computerised system for the identification of optimal remedial strategies to restore radionuclide-contaminated fresh water environments. Examples of the questions which a decision-maker could address to the system are 'Is lake liming effective in reducing the radiocesium uptake by fish?', 'Can control of catchment run-off be an effective measure against further redistribution of radionuclides by a river?', and 'Is sediment removal worthwhile to reduce further contamination of the aquatic environment?'. The MOIRA system can help the decision-maker avoid implementing inappropriate and expensive countermeasures. MOIRA gives the possibility to predict the effects of implementing different types of countermeasures and to evaluate both the 'ecological' and 'social' effects of the countermeasures. The decision support process using MOIRA DSS can be subdivided into the following steps: definition of the site-specific environmental and socio-economic parameters using GIS-based data (unknown site-specific data can be estimated using GIS-based models, default data for the socio-economic parameters, or data provided directly by the user); provision of data about the fallout of the radionuclides; definition of the time interval for which the prognosis will be made; definition of the alternative countermeasure strategies; evaluation of the consequences of implementing the user-defined strategies and the 'no actions' strategy using predictive models; ranking of strategies using the Multi-Attribute Analysis (MAA) module; and preparation of recommendations in the form of a report. This process requires the use of several computerised tools such as predictive models, multi-attribute analysis software, a geographical information system, and a database. The MOIRA software framework could be used as the basis for creating a wide range of user-friendly and easy-to-learn decision support systems. It can also provide the advanced graphical user interface and data checking system for the
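
    The ranking step can be pictured as a weighted-sum multi-attribute score over candidate strategies. A minimal sketch with invented attributes and weights; MOIRA's actual MAA method may differ:

    strategies = {
        "no action":        {"dose_reduction": 0.0, "social_impact": 0.0, "cost": 0.0},
        "lake liming":      {"dose_reduction": 0.6, "social_impact": 0.2, "cost": 0.3},
        "sediment removal": {"dose_reduction": 0.8, "social_impact": 0.5, "cost": 0.9},
    }
    # Positive weight rewards an attribute, negative weight penalises it.
    weights = {"dose_reduction": 0.5, "social_impact": -0.2, "cost": -0.3}

    def score(attrs):
        return sum(weights[k] * v for k, v in attrs.items())

    for name, attrs in sorted(strategies.items(), key=lambda s: -score(s[1])):
        print(f"{name:16s} score = {score(attrs):+.2f}")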

  3. Library Automation and Networking in India: Problems and Prospects.

    Science.gov (United States)

    Vyas, S. D.

    1997-01-01

    Examines the information infrastructure and the impact of information technology in India. Highlights include attempts toward automation; library networking at the national and local level; descriptions of four major networks; library software; and constraints of networking in academic libraries. (LRW)

  4. COMSY- A Software Tool For Aging And Plant Life Management With An Integrated Documentation Tool

    International Nuclear Information System (INIS)

    Baier, Roman; Zander, Andre

    2008-01-01

    For aging and plant life management, the integrity of mechanical components and structures is one of the key objectives. In order to ensure this integrity it is essential to implement comprehensive aging management. This should be applied to all safety-relevant mechanical systems or components, civil structures, electrical systems as well as instrumentation and control (I and C). The following aspects should be covered: - Identification and assessment of relevant degradation mechanisms; - Verification and evaluation of the quality status of all safety-relevant systems, structures and components (SSCs); - Verification and modernization of I and C and electrical systems; - Reliable and up-to-date documentation. To support this task, AREVA NP GmbH has developed the computer program COMSY, which draws on more than 30 years of research activities and operational experience. The program provides the option to perform a plant-wide screening for identifying system areas which are sensitive to specific degradation mechanisms. Another feature is the administration and evaluation of NDE measurements from different techniques. An integrated documentation tool makes document management and maintenance fast, reliable and independent of individual staff members. (authors)

  5. Software integration for automated stability analysis and design optimization of a bearingless rotor blade

    Science.gov (United States)

    Gunduz, Mustafa Emre

    Many government agencies and corporations around the world have found the unique capabilities of rotorcraft indispensable. Incorporating such capabilities into rotorcraft design poses extra challenges because it is a complicated multidisciplinary process. The concept of applying several disciplines to the design and optimization processes may not be new, but it does not currently seem to be widely accepted in industry. The reason for this might be the lack of well-known tools for realizing a complete multidisciplinary design and analysis of a product. This study proposes a method that enables engineers in some design disciplines to perform a fairly detailed analysis and optimization of a design using commercially available software as well as codes developed at Georgia Tech. The ultimate goal is that, when the system is set up properly, the CAD model of the design, including all subsystems, will be updated automatically as soon as a new part or assembly is added to the design, or when an analysis and/or an optimization is performed and the geometry needs to be modified. Designers and engineers will be involved only in checking the latest design for errors or adding/removing features. Such a design process will take dramatically less time to complete; therefore, it should reduce development time and costs. The optimization method is demonstrated on an existing helicopter rotor originally designed in the 1960s. The rotor is already an effective design with novel features. However, application of the optimization principles together with high-speed computing resulted in an even better design. The objective function to be minimized is related to the vibrations of the rotor system under gusty wind conditions. The design parameters are all continuous variables. Optimization is performed in a number of steps. First, the most crucial design variables of the objective function are identified. With these variables, the Latin Hypercube Sampling method is used

  6. From institutional merger integration to institutional strategic transformation: A case study of the strategic management paradigm at Shanghai Library

    Institute of Scientific and Technical Information of China (English)

    CHEN Chao

    2010-01-01

    This article attempts to apply the strategic management theory to the subsequent shaping up of a readjusted strategic development policy for Shanghai Library after its merger with the Institute of Scientific and Technological Information of Shanghai (ISTIS) in 1995. It also tries to analyze and explicate such an empirical implementation of the institutional reintegration process through strategic management at Shanghai Metropolitan Library. By doing so, it aims to present an objective case study of activities based on the strategic management paradigm at a major Chinese metropolitan public library.

  8. Software Quality Assurance in Software Projects: A Study of Pakistan

    OpenAIRE

    Faisal Shafique Butt; Sundus Shaukat; M. Wasif Nisar; Ehsan Ullah Munir; Muhammad Waseem; Kashif Ayyub

    2013-01-01

    Software quality is a specific property which tells what kind of standard a software product should meet. In a software project, quality is a key factor in the success or decline of a software organization. Much research has been done regarding software quality. Software organizations follow standards introduced by Capability Maturity Model Integration (CMMI) to achieve good-quality software. Quality is divided into three main layers, which are Software Quality Assurance (SQA), Software Qu...

  9. Integration of Web Technologies in Software Applications. Is Web 2.0 a Solution?

    Directory of Open Access Journals (Sweden)

    Cezar Liviu CERVINSCHI

    2010-12-01

    Full Text Available Starting from the idea that Web 2.0 represents “the era of the dynamic web”, the paper proposes to provide arguments (demonstrated by physical results) regarding the question at the foundation of this article. Based on the findings, we can affirm that Web 2.0 is a solution for building powerful and robust software, since the Internet has become more than just a simple presence on the user's desktop that provides easy access to information, services, entertainment, online transactions, e-commerce, e-learning and so on; basically every kind of human or institutional interaction can happen online. This paper studies the impact of two of these branches upon the user – e-commerce and e-testing. The statistical reports were made on different sets of people, while the conclusions are the result of detailed research and study of the applications' behaviour in the actual operating environment.

  10. OPTiM: Optical projection tomography integrated microscope using open-source hardware and software.

    Science.gov (United States)

    Watson, Thomas; Andrews, Natalie; Davis, Samuel; Bugeon, Laurence; Dallman, Margaret D; McGinty, James

    2017-01-01

    We describe the implementation of an OPT plate to perform optical projection tomography (OPT) on a commercial wide-field inverted microscope, using our open-source hardware and software. The OPT plate includes a tilt adjustment for alignment and a stepper motor for sample rotation as required by standard projection tomography. Depending on magnification requirements, three methods of performing OPT are detailed using this adaptor plate: a conventional direct OPT method requiring only the addition of a limiting aperture behind the objective lens; an external optical-relay method allowing conventional OPT to be performed at magnifications >4x; a remote focal scanning and region-of-interest method for improved spatial resolution OPT (up to ~1.6 μm). All three methods use the microscope's existing incoherent light source (i.e. arc-lamp) and all of its inherent functionality is maintained for day-to-day use. OPT acquisitions are performed on in vivo zebrafish embryos to demonstrate the implementations' viability.

  11. Managing the CMS Online Software integrity through development and production cycles

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    The Data Acquisition system of the Compact Muon Solenoid experiment at CERN is a distributed system made of several different network technologies and computers to collect data from more than 600 custom detector Front-End Drivers. It assembles events at a rate of 100 kHz, transporting event data at an aggregate throughput of 100 GByte/s. The architecture takes advantage of the latest developments in the computing industry. For data concentration, 10/40 Gbit Ethernet technologies are used, while a 56 Gbps Infiniband FDR CLOS network has been chosen for the event builder with a throughput of ~4 Tbps. The CMS Online Software (CMSOS) infrastructure is a complex product created specifically for the development of large distributed data acquisition systems, as well as of all application components needed to achieve the CMS data acquisition task. It is designed to benefit from different networking technologies and from the parallelism available on processing platforms such as multi-core or multi-processor systems. It provides platform i...

  12. Remote access to information sources in National and university library: development of service

    Directory of Open Access Journals (Sweden)

    Gorazd Vodeb

    2006-01-01

    Full Text Available The National and University Library established remote access to information sources in September 2004. The article describes the implementation and development of the service. The Library wanted to offer information sources to users wherever and whenever they would need them. The main evaluation criteria for software selection were, first, integration with the existing authentication system and, second, no need for intervention on the user side. The EZproxy software from Useful Utilities was chosen. A key step in the implementation was establishing communication between the EZproxy software and the COBISS library automation system. The Library needed to obtain licence agreements from publishers. A promotion campaign aimed to notify a large number of users. Initially, only users of the National & University Library were able to use the service; other users and the libraries of Ljubljana University requested authentication with the credentials of their own libraries. The remote access service was therefore developed further to enable authentication for other libraries. We needed to establish an authentication and authorisation system and also to upgrade and install the communication command procedure on different servers. Data about service usage are presented.

  13. Research Library

    Science.gov (United States)

    Los Alamos National Laboratory Research Library: delivering essential knowledge services for national security sciences since 1947.

  14. Development and testing of an interface between measurement logging system and automation software DIAdem of National Instruments(NI) and water/steam material property library LibIF97

    International Nuclear Information System (INIS)

    Pietruske, H.; Schaffrath, A.

    2002-08-01

    The Institute of Safety Research (IfS) of the Forschungszentrum Rossendorf (FZR) e.V. is constructing a new large-scale multipurpose test facility, TOPFLOW (Transient Two Phase Flow Test Facility), which will probably be put into operation within the next two months. For an effective evaluation of the start-up experiments and the acceptance trials with the vendors, FZR has started preparing automated tools for the measurement data logging and automation software DIAdem, which is distributed by National Instruments (NI). In a first step, an interface was developed for coupling the water/steam material property library LibIF97 of the University of Applied Sciences Zittau/Goerlitz. This report describes the programming of the General Control Interface (GPI) and its coupling with DIAdem. Additionally, the capability of this coupling in connection with autosequences for data evaluation was investigated. Furthermore, effective methods for TOPFLOW data evaluation were demonstrated and tested on a concrete example. Currently no TOPFLOW data are available; therefore one selected NOKO experiment was evaluated and first practical experiences were collected. Although this example is easily understandable and clearly laid out, it contains every step necessary for the TOPFLOW data evaluation: the opening of files, the determination of water/steam material properties with the Dynamic-Link-Library LibIF97.dll, the linkage of different data channels, and the generation of layouts for graphics and reports. The tools presented in this report are an important step towards the evaluation of the experimental data of TOPFLOW. They will now be adapted for the assessment of the acceptance trials, and the automated software sequences for the first scientific tests are being developed. (orig.) [de]
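
    In a scripting environment, coupling to a property DLL of this kind typically looks like the following heavily hedged Python/ctypes sketch. The export name h_pt, its argument order and its units are hypothetical, so the actual LibIF97 interface documentation must be consulted before use.

    import ctypes

    lib = ctypes.CDLL("LibIF97.dll")            # load the property library

    # Hypothetical export: specific enthalpy h(p, T) of water/steam
    lib.h_pt.restype = ctypes.c_double
    lib.h_pt.argtypes = [ctypes.c_double, ctypes.c_double]

    p_mpa, t_kelvin = 7.0, 558.15
    h = lib.h_pt(p_mpa, t_kelvin)               # kJ/kg, assumed
    print(f"h({p_mpa} MPa, {t_kelvin} K) = {h:.1f} kJ/kg")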

  15. Performance evaluation of multi-stratum resources integrated resilience for software defined inter-data center interconnect.

    Science.gov (United States)

    Yang, Hui; Zhang, Jie; Zhao, Yongli; Ji, Yuefeng; Wu, Jialin; Lin, Yi; Han, Jianrui; Lee, Young

    2015-05-18

    Inter-data center interconnect with IP over elastic optical network (EON) is a promising scenario to meet the high burstiness and high-bandwidth requirements of data center services. In our previous work, we implemented multi-stratum resources integration among IP networks, optical networks and application stratum resources to accommodate data center services. This study extends that work to consider service resilience in the case of an edge optical node failure. We propose a novel multi-stratum resources integrated resilience (MSRIR) architecture for services in software-defined inter-data center interconnects based on IP over EON. A global resources integrated resilience (GRIR) algorithm is introduced based on the proposed architecture. MSRIR can enable cross-stratum optimization, provide resilience using the resources of multiple stratums, and enhance data center service resilience responsiveness to dynamic end-to-end service demands. The overall feasibility and efficiency of the proposed architecture is experimentally verified on the control plane of our OpenFlow-based enhanced SDN (eSDN) testbed. The performance of the GRIR algorithm under a heavy traffic load scenario is also quantitatively evaluated based on the MSRIR architecture in terms of path blocking probability, resilience latency and resource utilization, compared with other resilience algorithms.

  16. Army-NASA aircrew/aircraft integration program (A3I) software detailed design document, phase 3

    Science.gov (United States)

    Banda, Carolyn; Chiu, Alex; Helms, Gretchen; Hsieh, Tehming; Lui, Andrew; Murray, Jerry; Shankar, Renuka

    1990-01-01

    The capabilities and design approach of the MIDAS (Man-machine Integration Design and Analysis System) computer-aided engineering (CAE) workstation under development by the Army-NASA Aircrew/Aircraft Integration Program is detailed. This workstation uses graphic, symbolic, and numeric prototyping tools and human performance models as part of an integrated design/analysis environment for crewstation human engineering. Developed incrementally, the requirements and design for Phase 3 (Dec. 1987 to Jun. 1989) are described. Software tools/models developed or significantly modified during this phase included: an interactive 3-D graphic cockpit design editor; multiple-perspective graphic views to observe simulation scenarios; symbolic methods to model the mission decomposition, equipment functions, pilot tasking and loading, as well as control the simulation; a 3-D dynamic anthropometric model; an intermachine communications package; and a training assessment component. These components were successfully used during Phase 3 to demonstrate the complex interactions and human engineering findings involved with a proposed cockpit communications design change in a simulated AH-64A Apache helicopter/mission that maps to empirical data from a similar study and AH-1 Cobra flight test.

  17. Hardware and Software Integration in Project Development of Automated Controller System Using LABVIEW FPGA

    International Nuclear Information System (INIS)

    Mohd Khairulezwan Abd Manan; Mohd Sabri Minhat; Izhar Abu Hussin

    2014-01-01

    The Field-Programmable Gate Array (FPGA) is a semiconductor device that can be programmed after manufacturing. Instead of being restricted to any predetermined hardware function, an FPGA allows the user to program product features and functions, adapt to new standards, and reconfigure hardware for specific applications even after the product has been installed in the field; hence the name field-programmable. This project developed a control system using LabVIEW FPGA, which is easier to work with because programs are built by dragging and dropping icons that are then integrated with the hardware inputs and outputs. (author)

  18. Present status of an integrated software system for HASP (Human Acts Simulation Program)

    International Nuclear Information System (INIS)

    Otani, Takayuki; Ebihara, Ken-ichi; Kambayashi, Shaw; Kume, Etsuo; Higuchi, Kenji; Fujii, Minoru; Akimoto, Masayuki

    1994-01-01

    In the Human Acts Simulation Program (HASP), human acts to be carried out by a human-shaped intelligent robot in a nuclear power plant are simulated by computers. The major purpose of HASP is to develop basic and underlying design technologies for intelligent and automated power plants. The objective of this paper is to show the present status of HASP, with particular emphasis on activities targeted at the integration of the developed subsystems to simulate the important capabilities of the intelligent robot, such as planning, robot dynamics, and so on. (author)

  19. Pennsylvania Academic Libraries and Student Retention and Graduation: A Preliminary Investigation with Confusing Results

    Directory of Open Access Journals (Sweden)

    Gregory A. Crawford

    2014-11-01

    Full Text Available This study examined the relationships between specific institutional financial variables and two library-related variables on graduation and retention rates for colleges and universities through correlations and multiple regression analysis. The analyses used data for Pennsylvania colleges and universities extracted from the Integrated Postsecondary Education Data System (IPEDS) and the Academic Libraries Survey (ALS). All analyses were run using IBM SPSS software. The correlations showed that both library expenses per student and library use per student were significantly correlated with both graduation and retention rates. In contrast, the multiple regression results showed that neither library budgets nor library use had significant effects on either graduation rates or retention rates. As would be expected, instructional expenses per student had the highest correlation with both graduation and retention and also yielded the strongest coefficient in the resulting regression equations.
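
    The analysis pattern (correlations followed by a multiple regression of graduation rate on per-student variables) can be reproduced in a few lines; the numbers below are toy stand-ins for the IPEDS/ALS extract, and statsmodels stands in for SPSS:

    import numpy as np
    import statsmodels.api as sm

    # columns: library $/student, instruction $/student, library uses/student
    X = np.array([[310.0,  9100.0, 22.0], [140.0,  5400.0,  9.0],
                  [520.0, 12800.0, 31.0], [205.0,  7300.0, 14.0],
                  [405.0, 10100.0, 25.0]])
    grad_rate = np.array([61.0, 42.0, 74.0, 50.0, 66.0])

    model = sm.OLS(grad_rate, sm.add_constant(X)).fit()
    print(model.params)    # intercept + one coefficient per predictor
    print(model.pvalues)   # with collinear predictors, none may be significant,
                           # echoing the article's "confusing" regression results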

  20. Parallel Synthesis and Biocatalytic Amplification of Marine-Inspired Libraries: An Integrated Approach Toward Discovering New Chemotherapeutics

    Science.gov (United States)

    2007-09-01

    synthesis and biocatalysis. We will use a combination of highly efficient chemistry and biocatalysis to prepare a library of small organic molecules whose...potential for the synthesis of diverse alkaloids. KEY RESEARCH ACCOMPLISHMENTS OF THIS REPORTING PERIOD: Cyclopentenone libraries were screened for...growth and proliferation of cancer cells. The newer approach is to use the tools of chemical synthesis to create large collections (“libraries”)

  1. Applying integrated software to optimize corporate production performance: a case study at Suncor

    International Nuclear Information System (INIS)

    Masse, L.P.; Rhynes, P.

    1997-01-01

    The feasibility of, and need for, a central database of basic well data for use in the petroleum industry in order to enhance production performance were discussed. Suncor developed a central database of well data as the foundation for a future systems architecture for its own use. The perceived, current and future benefits of such a system were described. Suncor identified the need for a corporate repository which is accessible to multiple applications and provides the opportunity to upgrade the system to new technology that will benefit from integration. The objective was to document existing data sets, identify what additional data would be useful and document existing processes around this well data. The integrated set of data is supplied by multiple vendors and includes public land data, production budget, public well data, forecasting, economics, drilling, procurement system, fixed assets, maintenance, land administration, field data capture, production accounting and financial accounting. In addition to being able to access current well data, significant added value is expected from the proactive communication within the departments, and the additional time available for analysis and decisions as opposed to searching for data and comparing sources. 4 figs

  2. FIGENIX: Intelligent automation of genomic annotation: expertise integration in a new software platform

    Directory of Open Access Journals (Sweden)

    Pontarotti Pierre

    2005-08-01

    Full Text Available Abstract Background Two of the main objectives of the genomic and post-genomic era are to structurally and functionally annotate genomes, which consists of detecting genes' position and structure and inferring their function (as well as other features of genomes). Structural and functional annotation both require the complex chaining of numerous different software tools, algorithms and methods under the supervision of a biologist. The automation of these pipelines is necessary to manage the huge amounts of data released by sequencing projects. Several pipelines already automate some of this complex chaining but still necessitate an important contribution from biologists for supervising and controlling the results at various steps. Results Here we propose an innovative automated platform, FIGENIX, which includes an expert system capable of substituting for human expertise at several key steps. FIGENIX currently automates complex pipelines of structural and functional annotation under the supervision of the expert system (which allows, for example, making key decisions, checking intermediate results or refining the dataset). The quality of the results produced by FIGENIX is comparable to those obtained by expert biologists, with a drastic gain in terms of time costs and avoidance of errors due to the human manipulation of data. Conclusion The core engine and expert system of the FIGENIX platform currently handle complex annotation processes of broad interest for the genomic community. They could be easily adapted to new or more specialized pipelines, such as the annotation of miRNAs, the classification of complex multigenic families, annotation of regulatory elements and other genomic features of interest.
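
    The expert-system supervision FIGENIX describes can be pictured as rule checks between chained annotation steps. An abstract sketch with invented step names and thresholds, not FIGENIX's actual rules:

    def gene_prediction(seq):
        return {"exons": 3, "score": 0.62}          # stub annotation step

    def homology_search(prediction):
        return {"hits": 12, "best_evalue": 1e-30}   # stub annotation step

    RULES = {
        "gene_prediction": lambda r: r["score"] >= 0.5,
        "homology_search": lambda r: r["best_evalue"] <= 1e-5,
    }

    def check(step, result):
        """Accept the intermediate result or flag it, as a human expert would."""
        if RULES[step](result):
            print(f"{step}: accepted")
        else:
            print(f"{step}: rejected; refine dataset and rerun")

    prediction = gene_prediction("ATGGCC")
    check("gene_prediction", prediction)
    check("homology_search", homology_search(prediction))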

  3. An integrated model for reliability estimation of digital nuclear protection system based on fault tree and software control flow methodologies

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Seong, Poong Hyun

    2000-01-01

    In the nuclear industry, the difficulty of proving the reliabilities of digital systems prohibits their widespread use in various nuclear applications such as plant protection systems. Even though there exist a few models which are used to estimate the reliabilities of digital systems, we develop a new integrated model which is more realistic than the existing models. We divide the process of estimating the reliability of a digital system into two phases, a high-level phase and a low-level phase, and the boundary between the two phases is the reliabilities of the subsystems. We apply the software control flow method to the low-level phase and fault tree analysis to the high-level phase. The application of the model to the Dynamic Safety System (DSS) shows that the estimated reliability of the system is quite reasonable and realistic.
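
    A toy sketch of the two-phase scheme: a control-flow-weighted software reliability feeds, together with a hardware failure probability, into a top-level OR gate of the fault tree. All numbers and the tree shape are illustrative assumptions, not the paper's model.

    def software_reliability(module_reliabilities, path_probs):
        """Low-level phase: reliability over weighted control-flow paths.

        Each path is a sequence of modules executed in series; paths are
        weighted by their execution probability.
        """
        total = 0.0
        for path, prob in path_probs.items():
            r = 1.0
            for module in path:
                r *= module_reliabilities[module]
            total += prob * r
        return total

    modules = {"input": 0.999, "logic": 0.995, "output": 0.998}
    paths = {("input", "logic", "output"): 0.9, ("input", "output"): 0.1}
    r_sw = software_reliability(modules, paths)

    # High-level phase: fault tree "system fails if software OR hardware fails"
    p_sw_fail, p_hw_fail = 1 - r_sw, 1e-3
    p_sys_fail = 1 - (1 - p_sw_fail) * (1 - p_hw_fail)   # OR gate
    print(f"software reliability = {r_sw:.5f}, system failure prob = {p_sys_fail:.5f}")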

  5. Mainstreaming the New Library.

    Science.gov (United States)

    Keeler, Elizabeth

    1982-01-01

    This discussion of methods of integrating the corporate library into the mainstream of affairs highlights three major elements of the process: marketing, production, and advertising. Professionalism and the information seeking behavior of clients are noted. Five references are provided. (EJS)

  6. DIMP: an interoperable solution for software integration and product data exchange

    Science.gov (United States)

    Wang, Xi Vincent; Xu, Xun William

    2012-08-01

    Today, globalisation has become one of the main trends of manufacturing business, and it has led to a world-wide decentralisation of resources not only amongst individual departments within one company but also amongst business partners. However, despite development and improvement over the last few decades, difficulties in information exchange and sharing still exist in heterogeneous application environments. This article is divided into two parts. In the first part, related research work and integration solutions are reviewed and discussed. The second part introduces a collaborative environment called the distributed interoperable manufacturing platform (DIMP), which is based on a module-based, service-oriented architecture (SOA). In the platform, the STEP-NC data model is used to facilitate data exchange among heterogeneous CAD/CAM/CNC systems.

  7. Free and open source software at CERN: integration of drivers in the Linux kernel

    International Nuclear Information System (INIS)

    Gonzalez Cobas, J.D.; Iglesias Gonsalvez, S.; Howard Lewis, J.; Serrano, J.; Vanga, M.; Cota, E.G.; Rubini, A.; Vaga, F.

    2012-01-01

    Most device drivers written for accelerator control systems suffer from a severe lack of portability due to the ad hoc nature of the code, often embodying intimate knowledge of the particular machine it is deployed in. In this paper we challenge this practice by arguing for the opposite approach: development in the open, which in our case translates into the integration of our code within the Linux kernel. We make our case by describing the upstream merge effort for the tsi148 driver, a critical (and complex) component of the control system. The encouraging results from this effort have led us to follow the same approach with two more ambitious projects, currently in the works: Linux support for the upcoming FMC boards and a new I/O subsystem. (authors)

  8. MODFLOW-OWHM v2: The next generation of fully integrated hydrologic simulation software

    Science.gov (United States)

    Boyce, S. E.; Hanson, R. T.; Ferguson, I. M.; Reimann, T.; Henson, W.; Mehl, S.; Leake, S.; Maddock, T.

    2016-12-01

    The One-Water Hydrologic Flow Model (One-Water) is a MODFLOW-based integrated hydrologic flow model designed for the analysis of a broad range of conjunctive-use and climate-related issues. One-Water fully links the movement and use of groundwater, surface water, and imported water for consumption by agriculture and natural vegetation on the landscape, and for potable and other uses within a supply-and-demand framework. One-Water includes linkages for deformation-, flow-, and head-dependent flows; additional observation and parameter options for higher-order calibrations; and redesigned code for facilitation of self-updating models and faster simulation run times. The next version of One-Water, currently under development, will include a new surface-water operations module that simulates dynamic reservoir operations, a new sustainability analysis package that facilitates the estimation and simulation of reduced storage depletion and captured discharge, a conduit-flow process for karst aquifers and leaky pipe networks, a soil zone process that adds an enhanced infiltration process, interflow, deep percolation and soil moisture, and a new subsidence and aquifer compaction package. It will also include enhancements to local grid refinement, and additional features to facilitate easier model updates, faster execution, better error messages, and more integration/cross communication between the traditional MODFLOW packages. By retaining and tracking the water within the hydrosphere, One-Water accounts for "all of the water everywhere and all of the time." This philosophy provides more confidence in the water accounting by the scientific community and provides the public a foundation needed to address wider classes of problems. Ultimately, more complex questions are being asked about water resources, so they require a more complete answer about conjunctive-use and climate-related issues.

  9. Large scale software building with CMake in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00218447; The ATLAS collaboration; Elmsheuser, Johannes; Obreshkov, Emil; Undrus, Alexander

    2017-01-01

    The offline software of the ATLAS experiment at the LHC (Large Hadron Collider) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector trigger system to select LHC collision events during data taking. ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the mentioned software packages. This also makes it possible to develop and test new and modifi...

  10. Large Scale Software Building with CMake in ATLAS

    CERN Document Server

    Elmsheuser, Johannes; The ATLAS collaboration; Obreshkov, Emil; Undrus, Alexander

    2016-01-01

    The offline software of the ATLAS experiment at the LHC (Large Hadron Collider) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector trigger system to select LHC collision events during data taking. ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the mentioned software packages. This also makes it possible to develop and test new and modifi...

  11. Fostering successful scientific software communities

    Science.gov (United States)

    Bangerth, W.; Heister, T.; Hwang, L.; Kellogg, L. H.

    2016-12-01

    Developing sustainable open source software packages for the sciences appears at first to be primarily a technical challenge: How can one create stable and robust algorithms, appropriate software designs, sufficient documentation, quality assurance strategies such as continuous integration and test suites, or backward compatibility approaches that yield high-quality software usable not only by the authors, but also by the broader community of scientists? However, our experience from almost two decades of leading the development of the deal.II software library (http://www.dealii.org, a widely-used finite element package) and the ASPECT code (http://aspect.dealii.org, used to simulate convection in the Earth's mantle) has taught us that the technical aspects are not the most difficult ones in scientific open source software. Rather, it is the social challenge of building and maintaining a community of users and developers interested in answering questions on user forums, contributing code, and jointly finding solutions to common technical and non-technical challenges. These problems are posed in an environment where project leaders typically have no resources to reward the majority of contributors, where very few people are specifically paid for the work they do on the project, and with frequent turnover of contributors as project members rotate into and out of jobs. In particular, much software work is done by graduate students who may become fluent enough in a software package only a year or two before they leave academia. We will discuss strategies we have found do and do not work in maintaining and growing communities around the scientific software projects we lead. Specifically, we will discuss the management style necessary to keep contributors engaged, ways to give credit where credit is due, and structuring documentation to decrease reliance on forums and thereby allow user communities to grow without straining those who answer questions.

  12. Integrated Design Software Predicts the Creep Life of Monolithic Ceramic Components

    Science.gov (United States)

    1996-01-01

    Significant improvements in propulsion and power generation for the next century will require revolutionary advances in high-temperature materials and structural design. Advanced ceramics are candidate materials for these elevated-temperature applications. As design protocols emerge for these material systems, designers must be aware of several innate features, including the degrading ability of ceramics to carry sustained load. Usually, time-dependent failure in ceramics occurs because of two different, delayed-failure mechanisms: slow crack growth and creep rupture. Slow crack growth initiates at a preexisting flaw and continues until a critical crack length is reached, causing catastrophic failure. Creep rupture, on the other hand, occurs because of bulk damage in the material: void nucleation and coalescence that eventually leads to macrocracks which then propagate to failure. Successful application of advanced ceramics depends on proper characterization of material behavior and the use of an appropriate design methodology. The life of a ceramic component can be predicted with the NASA Lewis Research Center's Ceramics Analysis and Reliability Evaluation of Structures (CARES) integrated design programs. CARES/CREEP determines the expected life of a component under creep conditions, and CARES/LIFE predicts the component life due to fast fracture and subcritical crack growth. The previously developed CARES/LIFE program has been used in numerous industrial and Government applications.
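
    The fast-fracture statistics underlying CARES-type analysis are commonly expressed with a two-parameter Weibull distribution. A minimal sketch with generic parameters; these are not CARES data, and the real programs perform volume-integrated, multiaxial calculations:

    import math

    def weibull_failure_prob(stress_mpa, sigma0_mpa=400.0, m=10.0):
        """Two-parameter Weibull probability of fast fracture at a given stress."""
        return 1.0 - math.exp(-((stress_mpa / sigma0_mpa) ** m))

    for s in (200, 300, 400, 500):
        print(f"{s} MPa -> Pf = {weibull_failure_prob(s):.3f}")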

  13. AeroADL: applying the integration of the Suomi-NPP science algorithms with the Algorithm Development Library to the calibration and validation task

    Science.gov (United States)

    Houchin, J. S.

    2014-09-01

    A common problem for the off-line validation of the calibration algorithms and algorithm coefficients is being able to run science data through the exact same software used for on-line calibration of that data. The Joint Polar Satellite System (JPSS) program solved part of this problem by making the Algorithm Development Library (ADL) available, which allows the operational algorithm code to be compiled and run on a desktop Linux workstation using flat file input and output. However, this solved only part of the problem, as the toolkit and methods to initiate the processing of data through the algorithms were geared specifically toward the algorithm developer, not the calibration analyst. In algorithm development mode, a limited number of sets of test data are staged for the algorithm once, and then run through the algorithm over and over as the software is developed and debugged. In calibration analyst mode, we are continually running new data sets through the algorithm, which requires significant effort to stage each of those data sets for the algorithm without additional tools. AeroADL solves this second problem by providing a set of scripts that wrap the ADL tools, providing both efficient means to stage and process an input data set, to override static calibration coefficient look-up-tables (LUT) with experimental versions of those tables, and to manage a library containing multiple versions of each of the static LUT files in such a way that the correct set of LUTs required for each algorithm are automatically provided to the algorithm without analyst effort. Using AeroADL, The Aerospace Corporation's analyst team has demonstrated the ability to quickly and efficiently perform analysis tasks for both the VIIRS and OMPS sensors with minimal training on the software tools.

  14. Social Sensors (S2ensors): A Kind of Hardware-Software-Integrated Mediators for Social Manufacturing Systems Under Mass Individualization

    Science.gov (United States)

    Ding, Kai; Jiang, Ping-Yu

    2017-09-01

    Currently, little work has been devoted to mediators and tools for multi-role production interactions in the mass individualization environment. This paper proposes a kind of hardware-software-integrated mediator called social sensors (S2ensors) to facilitate production interactions among customers, manufacturers, and other stakeholders in social manufacturing systems (SMS). The concept, classification, operational logic, and formalization of S2ensors are clarified. S2ensors collect objective data from physical sensors and subjective data from user input in mobile Apps, merge them into meaningful information for decision-making, and finally feed the decisions back for reaction and execution. Then, an S2ensors-Cloud platform is discussed to integrate different S2ensors to work for SMSs in an autonomous way. A demonstrative case is studied by developing a prototype system, and the results show that S2ensors and the S2ensors-Cloud platform can assist multi-role stakeholders to interact and collaborate on production tasks. It reveals mediator-enabled mechanisms and methods for production interactions among stakeholders in SMS.

  15. DYNA3D, INGRID, and TAURUS: an integrated, interactive software system for crashworthiness engineering

    International Nuclear Information System (INIS)

    Benson, D.J.; Hallquist, J.O.; Stillman, D.W.

    1985-04-01

    Crashworthiness engineering has always been a high priority at Lawrence Livermore National Laboratory because of its role in the safe transport of radioactive material for the nuclear power industry and military. As a result, the authors have developed an integrated, interactive set of finite element programs for crashworthiness analysis. The heart of the system is DYNA3D, an explicit, fully vectorized, large-deformation structural dynamics code. DYNA3D has the following four capabilities that are critical for the efficient and accurate analysis of crashes: (1) fully nonlinear solid, shell, and beam elements for representing a structure, (2) a broad range of constitutive models for representing the materials, (3) sophisticated contact algorithms for the impact interactions, and (4) a rigid body capability to represent the bodies away from the impact zones at a greatly reduced cost without sacrificing any accuracy in the momentum calculations. To generate the large and complex data files for DYNA3D, INGRID, a general-purpose mesh generator, is used. It runs on everything from IBM PCs to Crays, and can generate 1000 nodes/minute on a PC. With its efficient hidden-line algorithms and many options for specifying geometry, INGRID also doubles as a geometric modeller. TAURUS, an interactive postprocessor, is used to display DYNA3D output. In addition to the standard monochrome hidden-line display, time history plotting, and contouring, TAURUS generates interactive colour displays on 8-colour video screens by plotting colour bands, superimposed on the mesh, which indicate the value of the state variables. For higher-quality colour output, graphic output files may be sent to the DICOMED film recorders. We have found that colour is every bit as important as hidden-line removal in aiding the analyst in understanding the results. In this paper the basic methodologies of the programs are presented along with several crashworthiness calculations.
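
    The explicit time integration at the heart of codes like DYNA3D can be illustrated on a single degree of freedom; a real crash code applies the same update per node, with contact and material forces supplying the acceleration. The parameters here are arbitrary:

    m, k, dt = 1.0, 100.0, 0.01    # mass, stiffness, step (stable for dt < 2/omega)
    u, v = 0.1, 0.0                # initial displacement and velocity

    for step in range(5):
        a = -k * u / m             # internal force -> acceleration
        v += a * dt                # leapfrog-style velocity update
        u += v * dt                # displacement update
        print(f"t={step * dt + dt:.2f}s  u={u:+.4f}  v={v:+.4f}")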

  16. "Usability of data integration and visualization software for multidisciplinary pediatric intensive care: a human factors approach to assessing technology".

    Science.gov (United States)

    Lin, Ying Ling; Guerguerian, Anne-Marie; Tomasi, Jessica; Laussen, Peter; Trbovich, Patricia

    2017-08-14

    established or derived. Usability issues, observed through contextual use, provided directions for tangible design improvements of data integration software that may lessen use errors and promote safe use. Data-driven decision making can benefit from iterative interface redesign involving clinician-users in simulated environments. This study is a first step in understanding how software can support clinicians' decision making with integrated continuous monitoring data. Importantly, testing of similar platforms by all the different disciplines who may become clinician users is a fundamental step necessary to understand the impact on clinical outcomes of decision aids.

  17. Library resources on the Internet

    Science.gov (United States)

    Buchanan, Nancy L.

    1995-07-01

    Library resources are prevalent on the Internet. Library catalogs, electronic books, electronic periodicals, periodical indexes, reference sources, and U.S. Government documents are available by telnet, Gopher, World Wide Web, and FTP. Comparatively few copyrighted library resources are available freely on the Internet. Internet implementations of library resources can add useful features, such as full-text searching. There are discussion lists, Gophers, and World Wide Web pages to help users keep up with new resources and changes to existing ones. The future will bring more library resources, more types of library resources, and more integrated implementations of such resources to the Internet.

  18. Adventure Code Camp: Library Mobile Design in the Backcountry

    Directory of Open Access Journals (Sweden)

    David Ward

    2014-09-01

    Full Text Available This article presents a case study exploring the use of a student Coding Camp as a bottom-up mobile design process to generate library mobile apps. A code camp sources student programmer talent and ideas for designing software services and features.  This case study reviews process, outcomes, and next steps in mobile web app coding camps. It concludes by offering implications for services design beyond the local camp presented in this study. By understanding how patrons expect to integrate library services and resources into their use of mobile devices, librarians can better design the user experience for this environment.

  19. Software Engineering Improvement Plan

    Science.gov (United States)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  20. Software To Go: A Catalog of Software Available for Loan.

    Science.gov (United States)

    Kurlychek, Ken, Comp.

    This catalog lists the holdings of the Software To Go software lending library and clearinghouse for programs and agencies serving students or clients who are deaf or hard of hearing. An introduction describes the clearinghouse and its collection of software, much of it commercial and copyrighted material, for Apple, Macintosh, and IBM (MS-DOS)…