WorldWideScience

Sample records for source visualization tool

  1. Large Data Visualization with Open-Source Tools

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Visualization and post-processing of large data have become increasingly challenging and require more and more tools to support the diversity of data to be processed. In this seminar, we will present a suite of open-source tools supported and developed by Kitware to perform large-scale data visualization and analysis. In particular, we will present ParaView, an open-source tool for parallel visualization of massive datasets, the Visualization Toolkit (VTK), an open-source toolkit for scientific visualization, and Tangelohub, a suite of tools for large data analytics. About the speaker: Julien Jomier directs Kitware's European subsidiary in Lyon, France, where he focuses on European business development. Julien works on a variety of projects in the areas of parallel and distributed computing, mobile computing, image processing, and visualization. He is one of the developers of the Insight Toolkit (ITK), the Visualization Toolkit (VTK), and ParaView. Julien is also leading the CDash project, an open-source co...
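
    The ParaView workflow mentioned in the seminar can be driven from Python; the following minimal sketch (an illustration, assuming ParaView's pvpython interpreter, with the built-in Wavelet source standing in for a real dataset) contours a scalar field and saves a screenshot.

    ```python
    # Minimal pvpython sketch (assumption: run with ParaView's pvpython interpreter).
    from paraview.simple import Wavelet, Contour, Show, Render, SaveScreenshot

    # Built-in synthetic data source shipped with ParaView (produces the 'RTData' array).
    wavelet = Wavelet()

    # Extract an isosurface from the scalar field.
    contour = Contour(Input=wavelet)
    contour.ContourBy = ['POINTS', 'RTData']
    contour.Isosurfaces = [157.0]

    Show(contour)                   # add the filter output to the render view
    Render()                        # draw the scene
    SaveScreenshot('contour.png')   # write an image of the current view
    ```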

  2. Applying open source data visualization tools to standard based medical data.

    Science.gov (United States)

    Kopanitsa, Georgy; Taranik, Maxim

    2014-01-01

    Presentation of medical data in personal health records (PHRs) requires flexible, platform-independent tools to ensure easy access to the information. The differing backgrounds of patients, especially elderly people, call for simple graphical presentation of the data. Data in PHRs can be collected from heterogeneous sources. The application of standards-based medical data allows the development of generic visualization methods. Focusing on the deployment of open source tools, in this paper we applied JavaScript libraries to create data presentations for standards-based medical data.

  3. An open source GIS tool to quantify the visual impact of wind turbines and photovoltaic panels

    International Nuclear Information System (INIS)

    Minelli, Annalisa; Marchesini, Ivan; Taylor, Faith E.; De Rosa, Pierluigi; Casagrande, Luca; Cenci, Michele

    2014-01-01

    Although there are clear economic and environmental incentives for producing energy from solar and wind power, there can be local opposition to their installation due to their impact upon the landscape. To date, no international guidelines exist to guide quantitative visual impact assessment of these facilities, making the planning process somewhat subjective. In this paper we demonstrate the development of a method and an Open Source GIS tool to quantitatively assess the visual impact of these facilities using line-of-sight techniques. The methods here build upon previous studies by (i) more accurately representing the shape of energy-producing facilities, (ii) taking into account the distortion of the perceived shape and size of facilities caused by the location of the observer, (iii) calculating the possible obscuring of facilities caused by terrain morphology and (iv) allowing the combination of various facilities to more accurately represent the landscape. The tool has been applied to real and synthetic case studies and compared to recently published results from other models, and demonstrates an improvement in accuracy of the calculated visual impact of facilities. The tool is named r.wind.sun and is freely available from GRASS GIS AddOns. - Highlights: • We develop a tool to quantify wind turbine and photovoltaic panel visual impact. • The tool is freely available to download and edit as a module of GRASS GIS. • The tool takes into account visual distortion of the shape and size of objects. • The accuracy of calculation of visual impact is improved over previous methods.
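
    The line-of-sight computation that r.wind.sun builds on is available in GRASS GIS itself; as a hedged illustration of that generic building block (not the authors' module, whose options are not reproduced here), the sketch below runs the standard r.viewshed module from a GRASS Python session with hypothetical map names and coordinates.

    ```python
    # Sketch of the generic line-of-sight step in GRASS GIS (assumes it is run
    # inside a GRASS session; 'dem' and the turbine coordinates are hypothetical).
    import grass.script as gs

    # Compute which cells of the terrain can see a hypothetical turbine hub,
    # taking terrain morphology into account (the obscuring effect noted above).
    gs.run_command(
        'r.viewshed',
        input='dem',                        # elevation raster (hypothetical map name)
        output='turbine_visibility',        # cells from which the turbine hub is visible
        coordinates=(768500.0, 4786200.0),  # hypothetical turbine position
        observer_elevation=80.0,            # hub height above the terrain at that point
        target_elevation=1.75,              # eye height of a person elsewhere in the landscape
        max_distance=20000,                 # limit the analysis to 20 km
        overwrite=True,
    )
    ```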

  4. An open source GIS tool to quantify the visual impact of wind turbines and photovoltaic panels

    Energy Technology Data Exchange (ETDEWEB)

    Minelli, Annalisa, E-mail: Annalisa.Minelli@univ-brest.fr [Insitute Universitaire Européen de la Mer, Université de la Bretagne Occidentale, Rue Dumont D' Urville, 29280 Plouzané (France); Marchesini, Ivan, E-mail: Ivan.Marchesini@irpi.cnr.it [National Research Council (CNR), Research Insitute for Geo-hydrological Protection (IRPI), Strada della Madonna Alta 126, 06125 Perugia (Italy); Taylor, Faith E., E-mail: Faith.Taylor@kcl.ac.uk [Earth and Environmental Dynamics Research Group, Department of Geography, King' s College London, Strand, London WC2R 2LS (United Kingdom); De Rosa, Pierluigi, E-mail: Pierluigi.Derosa@unipg.it [Physics and Geology Department, University of Perugia, Via Zefferino Faina 4, 06123 Perugia (Italy); Casagrande, Luca, E-mail: Luca.Casagrande@gfosservices.it [Gfosservices S.A., Open Source GIS-WebGIS Solutions, Spatial Data Infrastructures, Planning and Counseling, Via F.lli Cairoli 24, 06127 Perugia (Italy); Cenci, Michele, E-mail: mcenci@regione.umbria.it [Servizio Energia qualità dell' ambiente, rifiuti, attività estrattive, Regione Umbia, Corso Vannucci 96, 06121 Perugia (Italy)

    2014-11-15

    Although there are clear economic and environmental incentives for producing energy from solar and wind power, there can be local opposition to their installation due to their impact upon the landscape. To date, no international guidelines exist to guide quantitative visual impact assessment of these facilities, making the planning process somewhat subjective. In this paper we demonstrate the development of a method and an Open Source GIS tool to quantitatively assess the visual impact of these facilities using line-of-sight techniques. The methods here build upon previous studies by (i) more accurately representing the shape of energy-producing facilities, (ii) taking into account the distortion of the perceived shape and size of facilities caused by the location of the observer, (iii) calculating the possible obscuring of facilities caused by terrain morphology and (iv) allowing the combination of various facilities to more accurately represent the landscape. The tool has been applied to real and synthetic case studies and compared to recently published results from other models, and demonstrates an improvement in accuracy of the calculated visual impact of facilities. The tool is named r.wind.sun and is freely available from GRASS GIS AddOns. - Highlights: • We develop a tool to quantify wind turbine and photovoltaic panel visual impact. • The tool is freely available to download and edit as a module of GRASS GIS. • The tool takes into account visual distortion of the shape and size of objects. • The accuracy of calculation of visual impact is improved over previous methods.

  5. Online characterization of planetary surfaces: PlanetServer, an open-source analysis and visualization tool

    Science.gov (United States)

    Marco Figuera, R.; Pham Huu, B.; Rossi, A. P.; Minin, M.; Flahaut, J.; Halder, A.

    2018-01-01

    The lack of open-source tools for hyperspectral data visualization and analysis creates a demand for new tools. In this paper we present the new PlanetServer, a set of tools comprising a web Geographic Information System (GIS) and a recently developed Python Application Programming Interface (API) capable of visualizing and analyzing a wide variety of hyperspectral data from different planetary bodies. Current open-source WebGIS tools are evaluated in order to give an overview and to contextualize how PlanetServer can help in these matters. The web client and the datasets available in PlanetServer are described in detail, as are the Python API and the rationale for its development. Two examples of mineral characterization of different hydrosilicates, such as chlorites, prehnites and kaolinites, in the Nili Fossae area on Mars are presented. As the results compare favorably with previous literature on hyperspectral analysis and visualization, we suggest using the PlanetServer approach for such investigations.

  6. Text mining and visualization case studies using open-source tools

    CERN Document Server

    Chisholm, Andrew

    2016-01-01

    Text Mining and Visualization: Case Studies Using Open-Source Tools provides an introduction to text mining using some of the most popular and powerful open-source tools: KNIME, RapidMiner, Weka, R, and Python. The contributors-all highly experienced with text mining and open-source software-explain how text data are gathered and processed from a wide variety of sources, including books, server access logs, websites, social media sites, and message boards. Each chapter presents a case study that you can follow as part of a step-by-step, reproducible example. You can also easily apply and extend the techniques to other problems. All the examples are available on a supplementary website. The book shows you how to exploit your text data, offering successful application examples and blueprints for you to tackle your text mining tasks and benefit from open and freely available tools. It gets you up to date on the latest and most powerful tools, the data mining process, and specific text mining activities.

  7. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    Science.gov (United States)

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    The C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture system data. However, few software packages can visualize and modify the entirety of the data in a C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis system using the standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (MATLAB, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
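
    A minimal sketch of the high-level bindings described above, based on the documented BTK Python API (the C3D file name and the marker label are hypothetical examples):

    ```python
    # Reading a C3D acquisition with the BTK Python bindings (sketch; the file
    # name and the marker label 'LASI' are hypothetical examples).
    import btk

    reader = btk.btkAcquisitionFileReader()
    reader.SetFilename('gait_trial.c3d')
    reader.Update()
    acq = reader.GetOutput()

    print('Point frequency (Hz):', acq.GetPointFrequency())
    print('Number of markers:   ', acq.GetPointNumber())

    # Access one marker trajectory as an (n_frames x 3) array of coordinates.
    lasi = acq.GetPoint('LASI')
    print(lasi.GetValues()[:5])
    ```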

  8. EEGNET: An Open Source Tool for Analyzing and Visualizing M/EEG Connectome.

    Science.gov (United States)

    Hassan, Mahmoud; Shamas, Mohamad; Khalil, Mohamad; El Falou, Wassim; Wendling, Fabrice

    2015-01-01

    The brain is a large-scale complex network often referred to as the "connectome". Exploring the dynamic behavior of the connectome is a challenging issue, as both excellent temporal and spatial resolution are required. In this context, magneto/electroencephalography (M/EEG) are effective neuroimaging techniques allowing for analysis of the dynamics of functional brain networks at scalp level and/or at the level of reconstructed sources. However, a tool that can cover all the processing steps of identifying brain networks from M/EEG data is still missing. In this paper, we report a novel software package, called EEGNET, running under MATLAB (MathWorks, Inc.) and allowing for analysis and visualization of functional brain networks from M/EEG recordings. EEGNET is developed to analyze networks either at the level of scalp electrodes or at the level of reconstructed cortical sources. It includes i) basic steps in preprocessing M/EEG signals, ii) the solution of the inverse problem to localize/reconstruct the cortical sources, iii) the computation of functional connectivity among signals collected at surface electrodes and/or time courses of reconstructed sources, and iv) the computation of network measures based on graph theory analysis. EEGNET is the only tool that combines M/EEG functional connectivity analysis with the computation of network measures derived from graph theory. The first version of EEGNET is easy to use, flexible and user friendly. EEGNET is an open source tool and can be freely downloaded from this webpage: https://sites.google.com/site/eegnetworks/.
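
    EEGNET itself runs under MATLAB; purely as an illustration of steps iii) and iv) above, the following Python sketch computes a correlation-based connectivity matrix from synthetic source time courses and derives simple graph measures with NetworkX (sizes, threshold and the choice of correlation as connectivity estimator are arbitrary assumptions).

    ```python
    # Illustrative pipeline: functional connectivity -> graph-theory measures.
    # (Not EEGNET's code; synthetic data, arbitrary threshold.)
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(0)
    n_sources, n_samples = 20, 2000
    signals = rng.standard_normal((n_sources, n_samples))  # stand-in source time courses

    # Step iii): functional connectivity as absolute Pearson correlation.
    conn = np.abs(np.corrcoef(signals))
    np.fill_diagonal(conn, 0.0)

    # Step iv): threshold the matrix into a graph and compute network measures.
    threshold = 0.05
    adjacency = (conn > threshold).astype(int)
    G = nx.from_numpy_array(adjacency)

    print('density:         ', nx.density(G))
    print('mean clustering: ', nx.average_clustering(G))
    print('degree of node 0:', G.degree[0])
    ```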

  9. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    Science.gov (United States)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    The BYMUR software aims to provide an easy-to-use open source tool for both computing multi-risk and managing/visualizing/comparing all the inputs (e.g. hazard, fragilities and exposure) as well as the corresponding results (e.g. risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products provided by the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by the Italian Ministry of Education, Universities and Research (MIUR), which aims to (i) provide a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) apply the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will be able to separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and will use them to estimate both the single- and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing a dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and feature-rich Python scientific modules (NumPy, Matplotlib, SciPy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for
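
    As a loose illustration of how hazard, fragility and exposure can be combined into a risk estimate with the NumPy/Matplotlib stack mentioned above (a toy single-hazard calculation with made-up curves, not the ByMuR methodology):

    ```python
    # Toy single-hazard risk computation with the scientific Python stack.
    # (Illustrative only; hazard curve, fragility and exposure values are made up.)
    import numpy as np
    import matplotlib.pyplot as plt

    intensity = np.linspace(0.0, 1.0, 50)                        # hazard intensity measure
    annual_exceedance = 0.05 * np.exp(-4.0 * intensity)          # toy hazard curve
    fragility = 1.0 / (1.0 + np.exp(-12.0 * (intensity - 0.5)))  # toy fragility curve
    exposure = 1.0e6                                             # toy exposed value

    # Expected annual loss: weight damage probability by the hazard density.
    hazard_density = -np.gradient(annual_exceedance, intensity)
    step = intensity[1] - intensity[0]
    expected_annual_loss = exposure * np.sum(fragility * hazard_density) * step
    print('Expected annual loss:', expected_annual_loss)

    plt.plot(intensity, annual_exceedance, label='hazard curve')
    plt.plot(intensity, fragility, label='fragility curve')
    plt.xlabel('intensity measure')
    plt.legend()
    plt.savefig('toy_risk_inputs.png')
    ```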

  10. Big Data Visualization Tools

    OpenAIRE

    Bikakis, Nikos

    2018-01-01

    Data visualization is the presentation of data in a pictorial or graphical format, and a data visualization tool is the software that generates this presentation. Data visualization provides users with intuitive means to interactively explore and analyze data, enabling them to effectively identify interesting patterns and infer correlations and causalities, and it supports sense-making activities.

  11. Rocker: Open source, easy-to-use tool for AUC and enrichment calculations and ROC visualization.

    Science.gov (United States)

    Lätti, Sakari; Niinivehmas, Sanna; Pentikäinen, Olli T

    2016-01-01

    The receiver operating characteristic (ROC) curve, together with the calculation of the area under the curve (AUC), is a useful tool for evaluating performance on biomedical and chemoinformatics data. For example, in virtual drug screening ROC curves are very often used to visualize how efficiently the applied method separates active ligands from inactive molecules. Unfortunately, most of the available tools for ROC analysis are implemented in commercially available software packages, or are plugins in statistical software, which are not always the easiest to use. Here, we present Rocker, a simple ROC curve visualization tool that can be used for the generation of publication-quality images. Rocker also includes automatic calculation of the AUC for the ROC curve and of the Boltzmann-enhanced discrimination of ROC (BEDROC) metric. Furthermore, in virtual screening campaigns it is often important to understand the early enrichment of active ligand identification; for this, Rocker offers an automated calculation routine. To enable further development of Rocker, it is freely available (MIT-GPL license) for use and modification from our website (http://www.jyu.fi/rocker).
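
    Rocker ships as a stand-alone tool; as a hedged sketch of the underlying calculations (ROC curve, AUC and a simple early-enrichment factor), the following uses scikit-learn on synthetic screening scores rather than Rocker's own code.

    ```python
    # ROC curve, AUC and a simple early-enrichment factor for a virtual screen.
    # (Illustrative sketch with synthetic scores, not Rocker's implementation.)
    import numpy as np
    from sklearn.metrics import roc_curve, auc

    rng = np.random.default_rng(1)
    labels = np.concatenate([np.ones(50), np.zeros(950)])   # 50 actives, 950 decoys
    scores = np.concatenate([rng.normal(1.0, 1.0, 50),      # actives score higher on average
                             rng.normal(0.0, 1.0, 950)])

    fpr, tpr, _ = roc_curve(labels, scores)
    print('AUC:', auc(fpr, tpr))

    # Early enrichment: fraction of actives recovered in the top 1% of the ranked
    # list, divided by the fraction expected at random.
    top = int(0.01 * len(scores))
    order = np.argsort(scores)[::-1]
    ef1 = labels[order][:top].sum() / labels.sum() / 0.01
    print('EF(1%):', ef1)
    ```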

  12. CMS tracker visualization tools

    CERN Document Server

    Zito, G; Osborne, I; Regano, A

    2005-01-01

    This document will review the design considerations, implementations and performance of the CMS Tracker Visualization tools. In view of the great complexity of this sub-detector (more than 50 million channels organized in 16540 modules, each of which is a complete detector), the standard CMS visualization tools (IGUANA and IGUANACMS), which provide basic 3D capabilities and integration within the CMS framework, respectively, have been complemented with additional 2D graphics objects. Based on the experience acquired using this software to debug and understand both hardware and software during the construction phase, we propose possible future improvements to cope with online monitoring and event analysis during data taking.

  13. CMS tracker visualization tools

    Energy Technology Data Exchange (ETDEWEB)

    Mennea, M.S. [Dipartimento Interateneo di Fisica ' Michelangelo Merlin' e INFN sezione di Bari, Via Amendola 173 - 70126 Bari (Italy); Osborne, I. [Northeastern University, 360 Huntington Avenue, Boston, MA 02115 (United States); Regano, A. [Dipartimento Interateneo di Fisica ' Michelangelo Merlin' e INFN sezione di Bari, Via Amendola 173 - 70126 Bari (Italy); Zito, G. [Dipartimento Interateneo di Fisica ' Michelangelo Merlin' e INFN sezione di Bari, Via Amendola 173 - 70126 Bari (Italy)]. E-mail: giuseppe.zito@ba.infn.it

    2005-08-21

    This document will review the design considerations, implementations and performance of the CMS Tracker Visualization tools. In view of the great complexity of this sub-detector (more than 50 million channels organized in 16540 modules, each of which is a complete detector), the standard CMS visualization tools (IGUANA and IGUANACMS), which provide basic 3D capabilities and integration within the CMS framework, respectively, have been complemented with additional 2D graphics objects. Based on the experience acquired using this software to debug and understand both hardware and software during the construction phase, we propose possible future improvements to cope with online monitoring and event analysis during data taking.

  14. CMS tracker visualization tools

    International Nuclear Information System (INIS)

    Mennea, M.S.; Osborne, I.; Regano, A.; Zito, G.

    2005-01-01

    This document will review the design considerations, implementations and performance of the CMS Tracker Visualization tools. In view of the great complexity of this sub-detector (more than 50 million channels organized in 16540 modules, each of which is a complete detector), the standard CMS visualization tools (IGUANA and IGUANACMS), which provide basic 3D capabilities and integration within the CMS framework, respectively, have been complemented with additional 2D graphics objects. Based on the experience acquired using this software to debug and understand both hardware and software during the construction phase, we propose possible future improvements to cope with online monitoring and event analysis during data taking.

  15. Helioviewer.org: An Open-source Tool for Visualizing Solar Data

    Science.gov (United States)

    Hughitt, V. Keith; Ireland, J.; Schmiedel, P.; Dimitoglou, G.; Mueller, D.; Fleck, B.

    2009-05-01

    As the amount of solar data available to scientists continues to increase at faster and faster rates, it is important that there exist simple tools for navigating these data quickly with a minimal amount of effort. By combining heterogeneous solar physics datatypes such as full-disk images and coronagraph images, along with feature and event information, Helioviewer offers a simple and intuitive way to browse multiple datasets simultaneously. Images are stored in a repository using the JPEG 2000 format and tiled dynamically upon a client's request. By tiling images and serving only the portions of the image requested, it is possible for the client to work with very large images without having to fetch all of the data at once. Currently, Helioviewer enables users to browse the entire SOHO data archive, updated hourly, as well as feature/event catalog data from eight different catalogs, including active region, flare, coronal mass ejection and type II radio burst data. In addition to a focus on intercommunication with other virtual observatories and browsers (VSO, HEK, etc.), Helioviewer will offer a number of externally available application programming interfaces (APIs) to enable easy third-party use, adoption and extension. Future functionality will include: support for additional data sources including TRACE, SDO and STEREO, dynamic movie generation, a navigable timeline of recorded solar events, social annotation, and basic client-side image processing.
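
    The tiling scheme described above (serving only the tiles that cover the requested view) reduces to a few lines of arithmetic; the tile size and coordinates below are assumptions for illustration, not Helioviewer's exact parameters.

    ```python
    # Which fixed-size tiles cover a requested viewport of a large image?
    # (Generic tiling arithmetic; tile size and coordinates are illustrative.)
    TILE = 512  # tile edge length in pixels (assumed)

    def tiles_for_viewport(x0, y0, width, height, tile=TILE):
        """Return (column, row) indices of every tile intersecting the viewport."""
        first_col, last_col = x0 // tile, (x0 + width - 1) // tile
        first_row, last_row = y0 // tile, (y0 + height - 1) // tile
        return [(c, r)
                for r in range(first_row, last_row + 1)
                for c in range(first_col, last_col + 1)]

    # A 1024x768 viewport positioned inside a large full-disk image:
    print(tiles_for_viewport(1400, 900, 1024, 768))
    ```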

  16. ThManager: An Open Source Tool for Creating and Visualizing SKOS

    Directory of Open Access Journals (Sweden)

    Javier Lacasta

    2007-09-01

    Knowledge organization systems denote formally represented knowledge that is used within the context of digital libraries to improve data sharing and information retrieval. To increase their use, and to reuse them when possible, it is vital to manage them adequately and to provide them in a standard interchange format. The Simple Knowledge Organization System (SKOS) seems to be the most promising representation for the type of knowledge models used in digital libraries, but there is a lack of tools that are able to properly manage it. This work presents a tool that fills this gap, facilitating their use in different environments and using SKOS as an interchange format.
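
    ThManager is a desktop application; purely to illustrate SKOS as an interchange format, the sketch below reads a hypothetical SKOS thesaurus with Python's rdflib and lists its concepts and labels.

    ```python
    # Reading a SKOS vocabulary with rdflib (illustrative; 'thesaurus.ttl' is hypothetical).
    from rdflib import Graph
    from rdflib.namespace import RDF, SKOS

    g = Graph()
    g.parse('thesaurus.ttl', format='turtle')

    # Enumerate concepts with their preferred labels and broader links.
    for concept in g.subjects(RDF.type, SKOS.Concept):
        label = g.value(concept, SKOS.prefLabel)
        broader = list(g.objects(concept, SKOS.broader))
        print(label, '->', len(broader), 'broader concept(s)')
    ```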

  17. New target prediction and visualization tools incorporating open source molecular fingerprints for TB Mobile 2.0.

    Science.gov (United States)

    Clark, Alex M; Sarker, Malabika; Ekins, Sean

    2014-01-01

    We recently developed a freely available mobile app (TB Mobile) for both iOS and Android platforms that displays Mycobacterium tuberculosis (Mtb) active molecule structures and their targets with links to associated data. The app was developed to make target information available to as large an audience as possible. We now report a major update of the iOS version of the app. This includes enhancements that use an implementation of ECFP_6 fingerprints that we have made open source. Using these fingerprints, the user can propose compounds with possible anti-TB activity, and view the compounds within a cluster landscape. Proposed compounds can also be compared to existing target data, using a naïve Bayesian scoring system to rank probable targets. We have curated an additional 60 new compounds and their targets for Mtb and added these to the original set of 745 compounds. We have also curated 20 further compounds (many without targets in TB Mobile) to evaluate this version of the app with 805 compounds and associated targets. TB Mobile can now manage a small collection of compounds that can be imported from external sources, or exported by various means such as email or app-to-app inter-process communication. This means that TB Mobile can be used as a node within a growing ecosystem of mobile apps for cheminformatics. It can also cluster compounds and use internal algorithms to help identify potential targets based on molecular similarity. TB Mobile represents a valuable dataset, data-visualization aid and target prediction tool.
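
    TB Mobile's ECFP_6 implementation is its own; as an analogous open-source sketch in Python, RDKit's Morgan fingerprints with radius 3 play the same role, and a Tanimoto comparison against an annotated compound is the simplest molecular-similarity step behind such target suggestions (the SMILES strings are arbitrary examples).

    ```python
    # ECFP_6-style fingerprints and similarity with RDKit (an analogous open-source
    # implementation, not TB Mobile's own code; example SMILES are arbitrary).
    from rdkit import Chem, DataStructs
    from rdkit.Chem import AllChem

    def ecfp6(smiles, n_bits=2048):
        mol = Chem.MolFromSmiles(smiles)
        # Morgan fingerprint with radius 3 corresponds to ECFP_6 (diameter 6).
        return AllChem.GetMorganFingerprintAsBitVect(mol, 3, nBits=n_bits)

    query = ecfp6('CC1=NC2=CC=CC=C2N1')        # proposed compound (example)
    reference = ecfp6('CC1=CC=CC=C1NC(=O)C')   # annotated compound (example)

    print('Tanimoto similarity:', DataStructs.TanimotoSimilarity(query, reference))
    ```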

  18. iCAVE: an open source tool for visualizing biomolecular networks in 3D, stereoscopic 3D and immersive 3D.

    Science.gov (United States)

    Liluashvili, Vaja; Kalayci, Selim; Fluder, Eugene; Wilson, Manda; Gabow, Aaron; Gümüs, Zeynep H

    2017-08-01

    Visualizations of biomolecular networks assist in systems-level data exploration in many cellular processes. Data generated from high-throughput experiments increasingly inform these networks, yet current tools do not adequately scale with the concomitant increase in their size and complexity. We present an open source software platform, interactome-CAVE (iCAVE), for visualizing large and complex biomolecular interaction networks in 3D. Users can explore networks (i) in 3D using a desktop, (ii) in stereoscopic 3D using 3D-vision glasses and a desktop, or (iii) in immersive 3D within a CAVE environment. iCAVE introduces 3D extensions of known 2D network layout, clustering, and edge-bundling algorithms, as well as new 3D network layout algorithms. Furthermore, users can simultaneously query several built-in databases within iCAVE for network generation or visualize their own networks (e.g., disease, drug, protein, metabolite). iCAVE has a modular structure that allows rapid development by the addition of algorithms, datasets, or features without affecting other parts of the code. Overall, iCAVE is the first freely available open source tool that enables 3D (optionally stereoscopic or immersive) visualizations of complex, dense, or multi-layered biomolecular networks. While primarily designed for researchers utilizing biomolecular networks, iCAVE can assist researchers in any field. © The Authors 2017. Published by Oxford University Press.
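
    iCAVE is a dedicated desktop application; as a rough illustration of the idea of a 3D network layout (not iCAVE's own algorithms), the sketch below computes a 3D spring layout with NetworkX and renders it with Matplotlib.

    ```python
    # 3D force-directed layout of a small interaction network (illustrative sketch,
    # not iCAVE's own layout algorithms).
    import networkx as nx
    import matplotlib.pyplot as plt

    G = nx.barabasi_albert_graph(60, 2, seed=3)   # stand-in "interaction network"
    pos = nx.spring_layout(G, dim=3, seed=3)      # 3D spring (force-directed) layout

    fig = plt.figure()
    ax = fig.add_subplot(projection='3d')
    xs, ys, zs = zip(*(pos[n] for n in G.nodes))
    ax.scatter(xs, ys, zs, s=20)
    for u, v in G.edges:
        ax.plot(*zip(pos[u], pos[v]), linewidth=0.5, color='grey')
    ax.set_axis_off()
    plt.savefig('network_3d.png')
    ```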

  19. Python tools for Visual Studio

    CERN Document Server

    Wang, Cathy

    2014-01-01

    This is a hands-on guide that provides exemplary coverage of all the features and concepts related to PTVS. The book is intended for developers who are aiming to enhance their productivity in Python projects with the automation tools that Visual Studio provides for the .NET community. Some basic knowledge of Python programming is essential.

  20. Open-source web-enabled data management, analyses, and visualization of very large data in geosciences using Jupyter, Apache Spark, and community tools

    Science.gov (United States)

    Chaudhary, A.

    2017-12-01

    Current simulation models and sensors are producing high-resolution, high-velocity data in the geosciences domain. Knowledge discovery from these complex and large datasets requires tools that are capable of handling very large data and providing interactive data analytics features to researchers. To this end, Kitware and its collaborators are producing the open-source tools GeoNotebook, GeoJS, Gaia, and Minerva for the geosciences, which use hardware-accelerated graphics and advancements in parallel and distributed processing (Celery and Apache Spark) and can be loosely coupled to solve real-world use cases. GeoNotebook (https://github.com/OpenGeoscience/geonotebook) is co-developed by Kitware and NASA-Ames and is an extension to the Jupyter Notebook. It provides interactive visualization and Python-based analysis of geospatial data and, depending on the backend (KTile or GeoPySpark), can handle data sizes from hundreds of gigabytes to terabytes. GeoNotebook uses GeoJS (https://github.com/OpenGeoscience/geojs) to render very large geospatial data on the map using the WebGL and Canvas2D APIs. GeoJS is more than just a GIS library, as users can create scientific plots such as vector and contour plots and can embed InfoVis plots using D3.js. GeoJS aims for high-performance visualization and interactive data exploration of scientific and geospatial location-aware datasets and supports features such as Point, Line, and Polygon, and advanced features such as Pixelmap, Contour, Heatmap, and Choropleth. Another of our open-source tools, Minerva (https://github.com/kitware/minerva), is a geospatial application built on top of the open-source web-based data management system Girder (https://github.com/girder/girder), which provides the ability to access data from HDFS or Amazon S3 buckets and provides capabilities to perform visualization and analyses on geosciences data in a web environment using GDAL and GeoPandas wrapped in a unified API provided by Gaia (https

  1. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    Science.gov (United States)

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to be displayed locally or remotely in an internet browser and saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system for investigating gut-brain communication, for example in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase the efficiency, transparency, and collaborative potential of neurophysiological research. Copyright © 2016 the American Physiological Society.
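
    As a small, hedged example of the kind of notebook cell such a workflow is built from (a synthetic voltage trace and an arbitrary threshold, not the authors' analysis code):

    ```python
    # A typical Jupyter notebook cell: threshold-based spike detection on a
    # synthetic extracellular trace, plotted inline (illustrative only).
    import numpy as np
    import matplotlib.pyplot as plt

    fs = 20_000                                   # sampling rate (Hz), assumed
    t = np.arange(0, 2.0, 1.0 / fs)               # two seconds of data
    rng = np.random.default_rng(7)
    trace = rng.normal(0.0, 10.0, t.size)         # background noise (µV)
    for st in rng.uniform(0.0, 1.95, 40):         # add stereotyped negative spikes
        i = int(st * fs)
        trace[i:i + 20] += -80.0 * np.exp(-np.arange(20) / 5.0)

    threshold = -4.5 * np.median(np.abs(trace)) / 0.6745   # robust noise estimate
    crossings = np.flatnonzero((trace[1:] < threshold) & (trace[:-1] >= threshold))
    spikes = crossings + 1

    plt.plot(t, trace, linewidth=0.5)
    plt.plot(t[spikes], trace[spikes], 'r.', label=f'{spikes.size} spikes')
    plt.xlabel('time (s)'); plt.ylabel('voltage (µV)'); plt.legend()
    plt.show()
    ```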

  2. Methods and tools for big data visualization

    OpenAIRE

    Zubova, Jelena; Kurasova, Olga

    2015-01-01

    In this paper, methods and tools for big data visualization have been investigated. Challenges faced by big data analysis and visualization have been identified. Technologies for big data analysis have been discussed. A review of methods and tools for big data visualization has been carried out. The functionalities of the tools have been demonstrated with examples in order to highlight their advantages and disadvantages.

  3. Visualization of Broadband Sound Sources

    Directory of Open Access Journals (Sweden)

    Sukhanov Dmitry

    2016-01-01

    In this paper, a method for imaging wideband audio sources is proposed, based on 2D microphone array measurements of the sound field taken simultaneously at all microphones. The designed microphone array consists of 160 microphones and allows signals to be digitized at a frequency of 7200 Hz. The measured signals are processed using a special algorithm that makes it possible to obtain a flat image of wideband sound sources. It is shown experimentally that the visualization does not depend on the waveform, but is determined by the bandwidth. The developed system can visualize sources with a resolution of up to 10 cm.
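
    The paper's reconstruction algorithm is not reproduced here; as a generic illustration of acoustic imaging with a planar microphone array, the sketch below maps source power over a grid with conventional narrowband delay-and-sum beamforming (array geometry, frequency and distances are arbitrary assumptions).

    ```python
    # Conventional narrowband delay-and-sum beamforming over a 2D scan grid
    # (generic illustration, not the authors' wideband algorithm).
    import numpy as np

    c = 343.0                      # speed of sound (m/s)
    f = 2000.0                     # analysis frequency (Hz), arbitrary
    k = 2 * np.pi * f / c

    # 8 x 8 planar array in the z = 0 plane, 5 cm spacing (arbitrary geometry).
    mx, my = np.meshgrid(np.arange(8) * 0.05, np.arange(8) * 0.05)
    mics = np.column_stack([mx.ravel(), my.ravel(), np.zeros(64)])

    # Simulate one monochromatic source 1 m in front of the array.
    src = np.array([0.20, 0.15, 1.0])
    d_src = np.linalg.norm(mics - src, axis=1)
    p = np.exp(-1j * k * d_src) / d_src          # complex microphone pressures

    # Scan a plane at z = 1 m and form the delay-and-sum power map.
    xs = ys = np.linspace(-0.1, 0.5, 61)
    power = np.zeros((ys.size, xs.size))
    for iy, y in enumerate(ys):
        for ix, x in enumerate(xs):
            d = np.linalg.norm(mics - np.array([x, y, 1.0]), axis=1)
            # Compensate the propagation phase towards this focus point and sum.
            power[iy, ix] = np.abs(np.sum(p * np.exp(1j * k * d))) ** 2

    peak = np.unravel_index(np.argmax(power), power.shape)
    print('estimated source position:', xs[peak[1]], ys[peak[0]])
    ```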

  4. Visualization of Broadband Sound Sources

    OpenAIRE

    Sukhanov Dmitry; Erzakova Nadezhda

    2016-01-01

    In this paper, a method for imaging wideband audio sources is proposed, based on 2D microphone array measurements of the sound field taken simultaneously at all microphones. The designed microphone array consists of 160 microphones and allows signals to be digitized at a frequency of 7200 Hz. The measured signals are processed using a special algorithm that makes it possible to obtain a flat image of wideband sound sources. It is shown experimentally that the visualization does not depend on the...

  5. Visual intelligence Microsoft tools and techniques for visualizing data

    CERN Document Server

    Stacey, Mark; Jorgensen, Adam

    2013-01-01

    Go beyond design concepts and learn to build state-of-the-art visualizations The visualization experts at Microsoft's Pragmatic Works have created a full-color, step-by-step guide to building specific types of visualizations. The book thoroughly covers the Microsoft toolset for data analysis and visualization, including Excel, and explores best practices for choosing a data visualization design, selecting tools from the Microsoft stack, and building a dynamic data visualization from start to finish. You'll examine different types of visualizations, their strengths and weaknesses, a

  6. Survey of Network Visualization Tools

    Science.gov (United States)

    2007-12-01

    Excerpt from the survey's tool listings: AlgoCOMs Network supports programming languages such as Java, C#, Delphi and Visual Basic, as well as Visual Basic for Applications (VBA). Availability: commercially available, cost $101... Application monitoring: constantly watch the health of mission-critical applications (MS SQL, MS Exchange, MS IIS, Active Directory).

  7. Visualization Tools for Teaching Computer Security

    Science.gov (United States)

    Yuan, Xiaohong; Vega, Percy; Qadah, Yaseen; Archer, Ricky; Yu, Huiming; Xu, Jinsheng

    2010-01-01

    Using animated visualization tools has been an important teaching approach in computer science education. We have developed three visualization and animation tools that demonstrate various information security concepts and actively engage learners. The information security concepts illustrated include: packet sniffer and related computer network…

  8. Integrated Data Visualization and Virtual Reality Tool

    Science.gov (United States)

    Dryer, David A.

    1998-01-01

    The Integrated Data Visualization and Virtual Reality Tool (IDVVRT) Phase II effort was for the design and development of an innovative Data Visualization Environment Tool (DVET) for NASA engineers and scientists, enabling them to visualize complex multidimensional and multivariate data in a virtual environment. The objectives of the project were to: (1) demonstrate the transfer and manipulation of standard engineering data in a virtual world; (2) demonstrate the effects of design changes using finite element analysis tools; and (3) determine the training and engineering design and analysis effectiveness of the visualization system.

  9. Bloom: A Relationship Visualization Tool for Complex Networks

    Directory of Open Access Journals (Sweden)

    Frank Horsfall

    2010-07-01

    Faced with an ever-increasing capacity to collect and store data, organizations must find a way to make sense of it to their advantage. Methods are required to simplify the data so that it can inform strategic decisions and help solve problems. Visualization tools are becoming increasingly popular since they can display complex relationships in a simple, visual format. This article describes Bloom, a project at Carleton University to develop an open source visualization tool for complex networks and business ecosystems. It provides an overview of the visualization technology used in the project and demonstrates its potential impact through a case study using real-world data.

  10. Visual Arts as a Tool for Phenomenology

    Directory of Open Access Journals (Sweden)

    Anna S. CohenMiller

    2017-12-01

    In this article I explain the process and benefits of using visual arts as a tool within a transcendental phenomenological study. I present and discuss drawings created and described by four participants over the course of twelve interviews. Findings suggest the utility of visual arts methods within the phenomenological toolset to encourage participant voice through easing communication and facilitating understanding.

  11. chimeraviz: a tool for visualizing chimeric RNA.

    Science.gov (United States)

    Lågstad, Stian; Zhao, Sen; Hoff, Andreas M; Johannessen, Bjarne; Lingjærde, Ole Christian; Skotheim, Rolf I

    2017-09-15

    Advances in high-throughput RNA sequencing have enabled more efficient detection of fusion transcripts, but the technology and associated software used for fusion detection from sequencing data often yield a high false discovery rate. Good prioritization of the results is important, and this can be helped by a visualization framework that automatically integrates RNA data with known genomic features. Here we present chimeraviz, a Bioconductor package that automates the creation of chimeric RNA visualizations. The package supports input from nine different fusion-finder tools: deFuse, EricScript, InFusion, JAFFA, FusionCatcher, FusionMap, PRADA, SOAPfuse and STAR-FUSION. chimeraviz is an R package available via Bioconductor (https://bioconductor.org/packages/release/bioc/html/chimeraviz.html) under Artistic-2.0. Source code and support are available at GitHub (https://github.com/stianlagstad/chimeraviz). rolf.i.skotheim@rr-research.no. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  12. Standalone visualization tool for three-dimensional DRAGON geometrical models

    International Nuclear Information System (INIS)

    Lukomski, A.; McIntee, B.; Moule, D.; Nichita, E.

    2008-01-01

    DRAGON is a neutron transport and depletion code able to solve one-, two- and three-dimensional problems. To date, DRAGON provides two visualization modules, able to represent two- and three-dimensional geometries, respectively. The two-dimensional visualization module generates a PostScript file, while the three-dimensional visualization module generates a MATLAB M-file with instructions for drawing the tracks in the DRAGON TRACKING data structure, which implicitly provide a representation of the geometry. The current work introduces a new, standalone tool based on the open-source Visualization Toolkit (VTK) software package, which allows the visualization of three-dimensional geometrical models by reading the DRAGON GEOMETRY data structure and generating an axonometric image that can be manipulated interactively by the user. (author)
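
    The standalone viewer follows the standard VTK rendering pipeline (source/reader, mapper, actor, renderer); the minimal Python sketch below shows that generic pattern with a placeholder geometry source rather than the DRAGON GEOMETRY reader.

    ```python
    # Minimal VTK rendering pipeline in Python (generic pattern; a cylinder source
    # stands in for geometry built from the DRAGON GEOMETRY data structure).
    import vtk

    cylinder = vtk.vtkCylinderSource()        # placeholder geometry
    cylinder.SetResolution(32)

    mapper = vtk.vtkPolyDataMapper()
    mapper.SetInputConnection(cylinder.GetOutputPort())

    actor = vtk.vtkActor()
    actor.SetMapper(mapper)

    renderer = vtk.vtkRenderer()
    renderer.AddActor(actor)
    renderer.SetBackground(0.1, 0.1, 0.2)

    window = vtk.vtkRenderWindow()
    window.AddRenderer(renderer)

    interactor = vtk.vtkRenderWindowInteractor()
    interactor.SetRenderWindow(window)

    window.Render()
    interactor.Start()                        # interactive manipulation by the user
    ```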

  13. A Flexible Visualization Tool for Rapid Access to EFIT Results

    International Nuclear Information System (INIS)

    Zhang Ruirui; Xiao Bingjia; Luo Zhengping

    2014-01-01

    This paper introduces the design and implementation of an interactive tool, the EASTViewer, for the visualization of plasma equilibrium reconstruction results for EAST (the Experimental Advanced Superconducting Tokamak). To keep the tool operating-system independent, Python, combined with the PyGTK toolkit, is used as the programming language. Using a modular design, the EASTViewer provides a unified interface with great flexibility. It can easily access numerous data sources, either local data files or an MDSplus tree, and with pre-defined configuration files it can be extended to other tokamaks. The EASTViewer has been used as the major tool for visualizing equilibrium data since the second EAST campaign in 2008, and it has been verified that the EASTViewer features a user-friendly interface, easy access to numerous data sources, and cross-platform support. (fusion engineering)

  14. An interactive visualization tool for mobile objects

    Science.gov (United States)

    Kobayashi, Tetsuo

    Recent advancements in mobile devices---such as Global Positioning System (GPS), cellular phones, car navigation system, and radio-frequency identification (RFID)---have greatly influenced the nature and volume of data about individual-based movement in space and time. Due to the prevalence of mobile devices, vast amounts of mobile objects data are being produced and stored in databases, overwhelming the capacity of traditional spatial analytical methods. There is a growing need for discovering unexpected patterns, trends, and relationships that are hidden in the massive mobile objects data. Geographic visualization (GVis) and knowledge discovery in databases (KDD) are two major research fields that are associated with knowledge discovery and construction. Their major research challenges are the integration of GVis and KDD, enhancing the ability to handle large volume mobile objects data, and high interactivity between the computer and users of GVis and KDD tools. This dissertation proposes a visualization toolkit to enable highly interactive visual data exploration for mobile objects datasets. Vector algebraic representation and online analytical processing (OLAP) are utilized for managing and querying the mobile object data to accomplish high interactivity of the visualization tool. In addition, reconstructing trajectories at user-defined levels of temporal granularity with time aggregation methods allows exploration of the individual objects at different levels of movement generality. At a given level of generality, individual paths can be combined into synthetic summary paths based on three similarity measures, namely, locational similarity, directional similarity, and geometric similarity functions. A visualization toolkit based on the space-time cube concept exploits these functionalities to create a user-interactive environment for exploring mobile objects data. Furthermore, the characteristics of visualized trajectories are exported to be utilized for data

  15. Visualizing Debugging Activity in Source Code Repositories

    OpenAIRE

    Voinea, Lucian; Telea, Alexandru

    2007-01-01

    We present the use of the CVSgrab visualization tool for understanding the debugging activity in the Mozilla project. We show how to display the distribution of different bug types over the project structure, locate project components which undergo heavy debugging activity, and get insight into the evolution of bugs over time.

  16. Visualizing Debugging Activity in Source Code Repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    2007-01-01

    We present the use of the CVSgrab visualization tool for understanding the debugging activity in the Mozilla project. We show how to display the distribution of different bug types over the project structure, locate project components which undergo heavy debugging activity, and get insight into the evolution of bugs over time.

  17. Genovar: a detection and visualization tool for genomic variants.

    Science.gov (United States)

    Jung, Kwang Su; Moon, Sanghoon; Kim, Young Jin; Kim, Bong-Jo; Park, Kiejung

    2012-05-08

    Along with single nucleotide polymorphisms (SNPs), copy number variation (CNV) is considered an important source of genetic variation associated with disease susceptibility. Despite the importance of CNV, the tools currently available for its analysis often produce false positive results due to limitations such as low resolution of array platforms, platform specificity, and the type of CNV. To resolve this problem, spurious signals must be separated from true signals by visual inspection. None of the previously reported CNV analysis tools support this function and the simultaneous visualization of comparative genomic hybridization arrays (aCGH) and sequence alignment. The purpose of the present study was to develop a useful program for the efficient detection and visualization of CNV regions that enables the manual exclusion of erroneous signals. A JAVA-based stand-alone program called Genovar was developed. To ascertain whether a detected CNV region is a novel variant, Genovar compares the detected CNV regions with previously reported CNV regions using the Database of Genomic Variants (DGV, http://projects.tcag.ca/variation) and the Single Nucleotide Polymorphism Database (dbSNP). The current version of Genovar is capable of visualizing genomic data from sources such as the aCGH data file and sequence alignment format files. Genovar is freely accessible and provides a user-friendly graphic user interface (GUI) to facilitate the detection of CNV regions. The program also provides comprehensive information to help in the elimination of spurious signals by visual inspection, making Genovar a valuable tool for reducing false positive CNV results. http://genovar.sourceforge.net/.

  18. Tool-Based Curricula and Visual Learning

    Directory of Open Access Journals (Sweden)

    Dragica Vasileska

    2013-12-01

    In the last twenty years nanotechnology has revolutionized the world of information theory, computers and other important disciplines, such as medicine, where it has contributed significantly to the creation of more sophisticated diagnostic tools. Therefore, it is important for people working in nanotechnology to better understand basic concepts in order to be more creative and productive. To further foster the progress of nanotechnology in the USA, the National Science Foundation has created the Network for Computational Nanotechnology (NCN), and the dissemination of all the information from member and non-member participants of the NCN is enabled by the community website www.nanoHUB.org. nanoHUB's signature service is online simulation, which enables the operation of sophisticated research and educational simulation engines with a common browser. No software installation or local computing power is needed. The simulation tools as well as nano-concepts are augmented by educational materials, assignments, and tool-based curricula, which are assemblies of tools that help students excel in a particular area. As elaborated later in the text, it is the visual mode of learning that we are exploiting in achieving faster and better results with students that go through simulation tool-based curricula. There are several tool-based curricula already developed on the nanoHUB and undergoing further development, out of which five are directly related to nanoelectronics. They are: ABACUS – device simulation module; ACUTE – Computational Electronics module; ANTSY – bending toolkit; and AQME – quantum mechanics module. The methodology behind tool-based curricula is discussed in detail. Then, the current status of each module is presented, including user statistics and student learning indicators. A particular simulation tool is explored further to demonstrate the ease with which students can grasp information. Representative of ABACUS is the PN-Junction Lab; representative of AQME is the PCPBT tool; and

  19. Visualization and analysis of atomistic simulation data with OVITO–the Open Visualization Tool

    International Nuclear Information System (INIS)

    Stukowski, Alexander

    2010-01-01

    The Open Visualization Tool (OVITO) is a new 3D visualization software package designed for post-processing atomistic data obtained from molecular dynamics or Monte Carlo simulations. Unique analysis, editing and animation functions are integrated into its easy-to-use graphical user interface. The software is written in object-oriented C++, controllable via Python scripts and easily extendable through a plug-in interface. It is distributed as open-source software and can be downloaded from the website http://ovito.sourceforge.net/
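
    OVITO's scripting interface has since matured into the ovito Python module; the following is a hedged sketch based on that documented interface (the dump file name and the choice of modifier are examples, not prescribed by the paper).

    ```python
    # Post-processing an atomistic trajectory with OVITO's Python module
    # (a sketch based on the documented scripting interface; the dump file
    # name is a hypothetical example).
    from ovito.io import import_file
    from ovito.modifiers import CommonNeighborAnalysisModifier

    pipeline = import_file('dump.lammpstrj')                      # load MD trajectory
    pipeline.modifiers.append(CommonNeighborAnalysisModifier())   # classify local structure

    data = pipeline.compute(0)                                    # evaluate frame 0
    print('number of particles:', data.particles.count)
    print('FCC atoms:', data.attributes['CommonNeighborAnalysis.counts.FCC'])
    ```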

  20. IViPP: A Tool for Visualization in Particle Physics

    Science.gov (United States)

    Tran, Hieu; Skiba, Elizabeth; Baldwin, Doug

    2011-10-01

    Experiments and simulations in physics generate a lot of data; visualization is helpful to prepare those data for analysis. IViPP (Interactive Visualizations in Particle Physics) is an interactive computer program that visualizes results of particle physics simulations or experiments. IViPP can handle data from different simulators, such as SRIM or MCNP. It can display relevant geometry and measured scalar data; it can do simple selection from the visualized data. In order to be an effective visualization tool, IViPP must have a software architecture that can flexibly adapt to new data sources and display styles. It must be able to display complicated geometry and measured data with a high dynamic range. We therefore organize it in a highly modular structure, develop libraries to describe geometry algorithmically, use rendering algorithms running on the GPU to display 3-D geometry at interactive rates, and represent scalar values in a visual form of scientific notation that shows both mantissa and exponent. This work was supported in part by the US Department of Energy through the Laboratory for Laser Energetics (LLE), with special thanks to Craig Sangster at LLE.

  1. Visualization tool for human-machine interface designers

    Science.gov (United States)

    Prevost, Michael P.; Banda, Carolyn P.

    1991-06-01

    As modern human-machine systems continue to grow in capabilities and complexity, system operators are faced with integrating and managing increased quantities of information. Since many information components are highly related to each other, optimizing the spatial and temporal aspects of presenting information to the operator has become a formidable task for the human-machine interface (HMI) designer. The authors describe a tool in an early stage of development, the Information Source Layout Editor (ISLE). This tool is to be used for information presentation design and analysis; it uses human factors guidelines to assist the HMI designer in the spatial layout of the information required by machine operators to perform their tasks effectively. These human factors guidelines address such areas as the functional and physical relatedness of information sources. By representing these relationships with metaphors such as spring tension, attractors, and repellers, the tool can help designers visualize the complex constraint space and interacting effects of moving displays to various alternate locations. The tool contains techniques for visualizing the relative 'goodness' of a configuration, as well as mechanisms such as optimization vectors to provide guidance toward a more optimal design. Also available is a rule-based design checker to determine compliance with selected human factors guidelines.

  2. Intrusion Detection using Open Source Tools

    OpenAIRE

    Jack TIMOFTE

    2008-01-01

    We have witnessed in the recent years that open source tools have gained popularity among all types of users, from individuals or small businesses to large organizations and enterprises. In this paper we will present three open source IDS tools: OSSEC, Prelude and SNORT.

  3. 5D Task Analysis Visualization Tool Phase II, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  4. 5D Task Analysis Visualization Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  5. IIS--Integrated Interactome System: a web-based platform for the annotation, analysis and visualization of protein-metabolite-gene-drug interactions by integrating a variety of data sources and tools.

    Science.gov (United States)

    Carazzolle, Marcelo Falsarella; de Carvalho, Lucas Miguel; Slepicka, Hugo Henrique; Vidal, Ramon Oliveira; Pereira, Gonçalo Amarante Guimarães; Kobarg, Jörg; Meirelles, Gabriela Vaz

    2014-01-01

    High-throughput screening of physical, genetic and chemical-genetic interactions brings important perspectives in the Systems Biology field, as the analysis of these interactions provides new insights into protein/gene function, cellular metabolic variations and the validation of therapeutic targets and drug design. However, such analysis depends on a pipeline connecting different tools that can automatically integrate data from diverse sources and result in a more comprehensive dataset that can be properly interpreted. We describe here the Integrated Interactome System (IIS), an integrative platform with a web-based interface for the annotation, analysis and visualization of the interaction profiles of proteins/genes, metabolites and drugs of interest. IIS works in four connected modules: (i) Submission module, which receives raw data derived from Sanger sequencing (e.g. two-hybrid system); (ii) Search module, which enables the user to search for the processed reads to be assembled into contigs/singlets, or for lists of proteins/genes, metabolites and drugs of interest, and add them to the project; (iii) Annotation module, which assigns annotations from several databases for the contigs/singlets or lists of proteins/genes, generating tables with automatic annotation that can be manually curated; and (iv) Interactome module, which maps the contigs/singlets or the uploaded lists to entries in our integrated database, building networks that gather newly identified interactions, protein and metabolite expression/concentration levels, subcellular localization and computed topological metrics, GO biological processes and KEGG pathways enrichment. This module generates an XGMML file that can be imported into Cytoscape or be visualized directly on the web. We have developed IIS by integrating diverse databases in response to the need for appropriate tools for a systematic analysis of physical, genetic and chemical-genetic interactions. IIS was validated with yeast two

  6. Visualizing Cloud Properties and Satellite Imagery: A Tool for Visualization and Information Integration

    Science.gov (United States)

    Chee, T.; Nguyen, L.; Smith, W. L., Jr.; Spangenberg, D.; Palikonda, R.; Bedka, K. M.; Minnis, P.; Thieman, M. M.; Nordeen, M.

    2017-12-01

    Providing public access to research products, including cloud macro- and microphysical properties and satellite imagery, is a key concern for the NASA Langley Research Center Cloud and Radiation Group. This work describes a web-based visualization tool and API that allow end users to easily create customized cloud product and satellite imagery, ground site data and satellite ground track information that are generated dynamically. The tool has two uses: one to visualize the dynamically created imagery and the other to provide access to the dynamically generated imagery directly at a later time. Internally, we leverage our practical experience with large, scalable application practices to develop a system with strong potential for scalability as well as the ability to be deployed on the cloud. We build upon the NASA Langley Cloud and Radiation Group's experience with making real-time and historical satellite cloud product information, satellite imagery, ground site data and satellite track information accessible and easily searchable. This tool is the culmination of our prior experience with dynamic imagery generation and provides a way to build a "mash-up" of dynamically generated imagery and related kinds of information that are visualized together to add value to disparate but related information. In support of NASA strategic goals, our group aims to make as much scientific knowledge, observations and products as possible available to the citizen science, research and interested communities, as well as for automated systems to acquire the same information for data mining or other analytic purposes. This tool and the underlying APIs provide a valuable research tool to a wide audience, both as a standalone research tool and as an easily accessed data source that can easily be mined or used with existing tools.

  7. Development of Visualization Tools for ZPPR-15 Analysis

    International Nuclear Information System (INIS)

    Lee, Min Jae; Kim, Sang Ji

    2014-01-01

    ZPPR-15 cores consist of various drawer masters that have great heterogeneity. In order to build a proper homogenization strategy, the geometry of the drawer masters should be carefully analyzed with a visualization. Additionally, a visualization of the drawer masters and the core configuration is necessary for minimizing human error during the input processing. For this purpose, visualization tools for the ZPPR-15 analysis have been developed based on a Perl script. In the following section, the implementation of the visualization tools will be described and various visualization samples for both drawer masters and ZPPR-15 cores will be demonstrated. Visualization tools for drawer masters and the core configuration were successfully developed for the ZPPR-15 analysis. The visualization tools are expected to be useful for understanding the ZPPR-15 experiments and for finding deterministic models of ZPPR-15. It turned out that generating VTK files is straightforward, and the use of VTK files becomes powerful with the aid of the VisIt program.
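
    As an illustration of how little scripting is needed to emit a legacy-format VTK file that VisIt or ParaView can open (the original tools used Perl; this analogous sketch is in Python with made-up drawer composition data):

    ```python
    # Write a minimal legacy-format VTK file (STRUCTURED_POINTS) that VisIt or
    # ParaView can open. Analogous to the Perl-based generation described above;
    # the 4 x 4 x 1 "drawer" composition values are made up for illustration.
    import numpy as np

    nx, ny, nz = 4, 4, 1
    values = np.arange(nx * ny * nz, dtype=float)   # stand-in material indices

    with open('drawer.vtk', 'w') as f:
        f.write('# vtk DataFile Version 3.0\n')
        f.write('ZPPR drawer sketch\n')
        f.write('ASCII\n')
        f.write('DATASET STRUCTURED_POINTS\n')
        f.write(f'DIMENSIONS {nx} {ny} {nz}\n')
        f.write('ORIGIN 0 0 0\n')
        f.write('SPACING 5.0 5.0 5.0\n')            # cell pitch (arbitrary units)
        f.write(f'POINT_DATA {nx * ny * nz}\n')
        f.write('SCALARS material float 1\n')
        f.write('LOOKUP_TABLE default\n')
        f.write(' '.join(str(v) for v in values) + '\n')
    ```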

  8. Introducing Product Lines through Open Source Tools

    OpenAIRE

    Haugen, Øystein

    2008-01-01

    We present an approach to introducing product lines to companies that lowers their initial risk by applying open source tools and providing a smooth learning curve into the use and creation of domain-specific modeling combined with standardized variability modeling.

  9. SNPversity: a web-based tool for visualizing diversity

    Science.gov (United States)

    Schott, David A; Vinnakota, Abhinav G; Portwood, John L; Andorf, Carson M

    2018-01-01

    Abstract Many stand-alone desktop software suites exist to visualize single nucleotide polymorphism (SNP) diversity, but web-based software that can be easily implemented and used for biological databases is absent. SNPversity was created to answer this need by building an open-source visualization tool that can be implemented on a Unix-like machine and served through a web browser, making it accessible worldwide. SNPversity consists of an HDF5 database back-end for SNPs, a data exchange layer powered by TASSEL libraries that represent data in JSON format, and an interface layer using PHP to visualize SNP information. SNPversity displays data in real-time through a web browser in grids that are color-coded according to a given SNP’s allelic status and mutational state. SNPversity is currently available at MaizeGDB, the maize community’s database, and will soon be available at GrainGenes, the clade-oriented database for Triticeae and Avena species, including wheat, barley, rye, and oat. The code and documentation are available on GitHub and are free to the public. We expect that the tool will be highly useful for other biological databases with a similar need to display SNP diversity through their web interfaces. Database URL: https://www.maizegdb.org/snpversity PMID:29688387
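
    As a conceptual sketch of the storage-and-serve pattern described above (SNPversity's real stack is the TASSEL Java libraries plus PHP, and its HDF5 layout is not reproduced here), the following Python snippet stores a toy genotype matrix in HDF5 and returns one SNP column as JSON:

        # Conceptual sketch only: SNPversity's real stack is TASSEL (Java) plus PHP, and its
        # HDF5 layout is not reproduced here. This shows the general pattern of storing a
        # genotype matrix in HDF5 and serving one SNP column as JSON.
        import json

        import h5py
        import numpy as np

        samples = ["line1", "line2", "line3"]
        snps = ["snp_001", "snp_002"]
        # toy coding: 0 = homozygous reference, 1 = heterozygous, 2 = homozygous alternate
        calls = np.array([[0, 2], [1, 0], [2, 2]], dtype=np.int8)

        with h5py.File("toy_snps.h5", "w") as f:
            f.create_dataset("genotypes", data=calls)
            f.create_dataset("samples", data=np.array(samples, dtype="S"))
            f.create_dataset("snps", data=np.array(snps, dtype="S"))

        with h5py.File("toy_snps.h5", "r") as f:
            snp_names = [s.decode() for s in f["snps"][...]]
            column = f["genotypes"][:, snp_names.index("snp_002")].tolist()
            payload = {"snp": "snp_002",
                       "calls": dict(zip([s.decode() for s in f["samples"][...]], column))}

        print(json.dumps(payload, indent=2))  # what a web front end might render as a grid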

  10. The Visual Code Navigator : An Interactive Toolset for Source Code Investigation

    NARCIS (Netherlands)

    Lommerse, Gerard; Nossin, Freek; Voinea, Lucian; Telea, Alexandru

    2005-01-01

    We present the Visual Code Navigator, a set of three interrelated visual tools that we developed for exploring large source code software projects from three different perspectives, or views: The syntactic view shows the syntactic constructs in the source code. The symbol view shows the objects a

  11. uVis: A Formula-Based Visualization Tool

    DEFF Research Database (Denmark)

    Pantazos, Kostas; Xu, Shangjin; Kuhail, Mohammad Amin

    Several tools use programming approaches for developing advanced visualizations. Others can, with a few steps, create simple visualizations with built-in patterns, and users with limited IT experience can use them. However, it is programming-intensive and time-consuming to create and customize...... these visualizations. We introduce uVis, a tool that allows users with advanced spreadsheet-like IT knowledge and basic database understanding to create simple as well as advanced visualizations. These users construct visualizations by combining building blocks (i.e. controls, shapes). They specify spreadsheet...

  12. Visual illusion of tool use recalibrates tactile perception

    Science.gov (United States)

    Miller, Luke E.; Longo, Matthew R.; Saygin, Ayse P.

    2018-01-01

    Brief use of a tool recalibrates multisensory representations of the user’s body, a phenomenon called tool embodiment. Despite two decades of research, little is known about its boundary conditions. It has been widely argued that embodiment requires active tool use, suggesting a critical role for somatosensory and motor feedback. The present study used a visual illusion to cast doubt on this view. We used a mirror-based setup to induce a visual experience of tool use with an arm that was in fact stationary. Following illusory tool use, tactile perception was recalibrated on this stationary arm, and with equal magnitude as physical use. Recalibration was not found following illusory passive tool holding, and could not be accounted for by sensory conflict or general interhemispheric plasticity. These results suggest visual tool-use signals play a critical role in driving tool embodiment. PMID:28196765

  13. Visual Decision Support Tool for Supporting Asset ...

    Science.gov (United States)

    Abstract: Managing urban water infrastructures faces the challenge of jointly dealing with assets of diverse types, useful life, cost, ages and condition. Service quality and sustainability require sound long-term planning, well aligned with tactical and operational planning and management. In summary, the objective of an integrated approach to infrastructure asset management is to assist utilities answer the following questions: • Who are we at present? • What service do we deliver? • What do we own? • Where do we want to be in the long-term? • How do we get there? The AWARE-P approach (www.aware-p.org) offers a coherent methodological framework and a valuable portfolio of software tools. It is designed to assist water supply and wastewater utility decision-makers in their analyses and planning processes. It is based on a Plan-Do-Check-Act process and is in accordance with the key principles of the International Standards Organization (ISO) 55000 standards on asset management. It is compatible with, and complementary to WERF’s SIMPLE framework. The software assists in strategic, tactical, and operational planning, through a non-intrusive, web-based, collaborative environment where objectives and metrics drive IAM planning. It is aimed at industry professionals and managers, as well as at the consultants and technical experts that support them. It is easy to use and maximizes the value of information from multiple existing data sources, both in da

  14. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh

    , called ForSyDe. ForSyDe is available under the open Source approach, which allows small and medium enterprises (SME) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system level modeling of a simple industrial use case, and we...

  15. Iterating between Tools to Create and Edit Visualizations.

    Science.gov (United States)

    Bigelow, Alex; Drucker, Steven; Fisher, Danyel; Meyer, Miriah

    2017-01-01

    A common workflow for visualization designers begins with a generative tool, like D3 or Processing, to create the initial visualization; and proceeds to a drawing tool, like Adobe Illustrator or Inkscape, for editing and cleaning. Unfortunately, this is typically a one-way process: once a visualization is exported from the generative tool into a drawing tool, it is difficult to make further, data-driven changes. In this paper, we propose a bridge model to allow designers to bring their work back from the drawing tool to re-edit in the generative tool. Our key insight is to recast this iteration challenge as a merge problem, similar to when two people are editing a document and the changes between them need to be reconciled. We also present a specific instantiation of this model, a tool called Hanpuku, which bridges between D3 scripts and Illustrator. We show several examples of visualizations that are iteratively created using Hanpuku in order to illustrate the flexibility of the approach. We further describe several hypothetical tools that bridge between other visualization tools to emphasize the generality of the model.
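
    The merge idea can be illustrated with a toy reconciliation of per-element attributes, where a data-driven regeneration and a manual edit are combined and manual edits win on conflict. This is only a sketch of the concept, not Hanpuku's actual algorithm, and the element and attribute names are invented:

        # Toy illustration of the merge idea (not Hanpuku's actual algorithm): reconcile
        # per-element attributes between a data-driven regeneration and a manually edited
        # state, keeping whichever side changed and letting manual edits win on conflict.
        base = {"bar1": {"height": 40, "fill": "steelblue"},
                "bar2": {"height": 70, "fill": "steelblue"}}

        generated = {"bar1": {"height": 55, "fill": "steelblue"},   # new data changed a height
                     "bar2": {"height": 70, "fill": "steelblue"},
                     "bar3": {"height": 20, "fill": "steelblue"}}   # new data added an element

        edited = {"bar1": {"height": 40, "fill": "tomato"},         # designer recolored the bars
                  "bar2": {"height": 70, "fill": "tomato"}}

        def merge(base_state, gen_state, edit_state):
            merged = {}
            for elem in set(gen_state) | set(edit_state):
                attrs = {}
                for key in set(gen_state.get(elem, {})) | set(edit_state.get(elem, {})):
                    b = base_state.get(elem, {}).get(key)
                    g = gen_state.get(elem, {}).get(key, b)
                    e = edit_state.get(elem, {}).get(key, b)
                    attrs[key] = e if e != b else g   # manual edit wins when it differs from base
                merged[elem] = attrs
            return merged

        # bar1 keeps the manual recolor but takes the new data-driven height; bar3 is added
        print(merge(base, generated, edited))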

  16. Open source tools for fluorescent imaging.

    Science.gov (United States)

    Hamilton, Nicholas A

    2012-01-01

    As microscopy becomes increasingly automated and imaging expands in the spatial and time dimensions, quantitative analysis tools for fluorescent imaging are becoming critical to remove both bottlenecks in throughput as well as fully extract and exploit the information contained in the imaging. In recent years there has been a flurry of activity in the development of bio-image analysis tools and methods with the result that there are now many high-quality, well-documented, and well-supported open source bio-image analysis projects with large user bases that cover essentially every aspect from image capture to publication. These open source solutions are now providing a viable alternative to commercial solutions. More importantly, they are forming an interoperable and interconnected network of tools that allow data and analysis methods to be shared between many of the major projects. Just as researchers build on, transmit, and verify knowledge through publication, open source analysis methods and software are creating a foundation that can be built upon, transmitted, and verified. Here we describe many of the major projects, their capabilities, and features. We also give an overview of the current state of open source software for fluorescent microscopy analysis and the many reasons to use and develop open source methods. Copyright © 2012 Elsevier Inc. All rights reserved.

  17. Visualization tools for insurance risk processes

    OpenAIRE

    Krzysztof Burnecki; Rafal Weron

    2006-01-01

    This chapter focuses on risk processes, which are perhaps the most suitable of all insurance objects for computer visualization. At the same time, risk processes are basic instruments for any non-life actuary – they are vital for calculating the amount of loss that an insurance company may incur.
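
    A classical object of this kind is the Cramér-Lundberg risk process R(t) = u + c·t − S(t), where S(t) is a compound Poisson sum of claims. The following Python sketch, which is not the chapter's own code, simulates and plots one trajectory:

        # Minimal sketch (not the chapter's own code): simulate one path of the classical
        # risk process R(t) = u + c*t - S(t), where S(t) is a compound Poisson sum of
        # exponentially distributed claims, and plot the reserve just after each claim.
        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(0)
        u, c = 10.0, 1.2              # initial capital and premium rate
        lam, mean_claim = 1.0, 1.0    # claim arrival intensity and mean claim size
        horizon = 50.0

        t, claims = 0.0, [(0.0, 0.0)]
        while True:
            t += rng.exponential(1.0 / lam)          # waiting time to the next claim
            if t > horizon:
                break
            claims.append((t, rng.exponential(mean_claim)))

        times = np.array([ti for ti, _ in claims])
        reserve = u + c * times - np.cumsum([x for _, x in claims])

        plt.step(times, reserve, where="post")
        plt.xlabel("time")
        plt.ylabel("reserve R(t)")
        plt.title("Simulated Cramer-Lundberg risk process")
        plt.show()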

  18. Next generation tools for genomic data generation, distribution, and visualization

    Directory of Open Access Journals (Sweden)

    Nix David A

    2010-09-01

    Full Text Available Abstract Background With the rapidly falling cost and availability of high throughput sequencing and microarray technologies, the bottleneck for effectively using genomic analysis in the laboratory and clinic is shifting to one of effectively managing, analyzing, and sharing genomic data. Results Here we present three open-source, platform independent, software tools for generating, analyzing, distributing, and visualizing genomic data. These include a next generation sequencing/microarray LIMS and analysis project center (GNomEx); an application for annotating and programmatically distributing genomic data using the community vetted DAS/2 data exchange protocol (GenoPub); and a standalone Java Swing application (GWrap) that makes cutting edge command line analysis tools available to those who prefer graphical user interfaces. Both GNomEx and GenoPub use the rich client Flex/Flash web browser interface to interact with Java classes and a relational database on a remote server. Both employ a public-private user-group security model enabling controlled distribution of patient and unpublished data alongside public resources. As such, they function as genomic data repositories that can be accessed manually or programmatically through DAS/2-enabled client applications such as the Integrated Genome Browser. Conclusions These tools have gained wide use in our core facilities, research laboratories and clinics and are freely available for non-profit use. See http://sourceforge.net/projects/gnomex/, http://sourceforge.net/projects/genoviz/, and http://sourceforge.net/projects/useq.

  19. Next generation tools for genomic data generation, distribution, and visualization.

    Science.gov (United States)

    Nix, David A; Di Sera, Tonya L; Dalley, Brian K; Milash, Brett A; Cundick, Robert M; Quinn, Kevin S; Courdy, Samir J

    2010-09-09

    With the rapidly falling cost and availability of high throughput sequencing and microarray technologies, the bottleneck for effectively using genomic analysis in the laboratory and clinic is shifting to one of effectively managing, analyzing, and sharing genomic data. Here we present three open-source, platform independent, software tools for generating, analyzing, distributing, and visualizing genomic data. These include a next generation sequencing/microarray LIMS and analysis project center (GNomEx); an application for annotating and programmatically distributing genomic data using the community vetted DAS/2 data exchange protocol (GenoPub); and a standalone Java Swing application (GWrap) that makes cutting edge command line analysis tools available to those who prefer graphical user interfaces. Both GNomEx and GenoPub use the rich client Flex/Flash web browser interface to interact with Java classes and a relational database on a remote server. Both employ a public-private user-group security model enabling controlled distribution of patient and unpublished data alongside public resources. As such, they function as genomic data repositories that can be accessed manually or programmatically through DAS/2-enabled client applications such as the Integrated Genome Browser. These tools have gained wide use in our core facilities, research laboratories and clinics and are freely available for non-profit use. See http://sourceforge.net/projects/gnomex/, http://sourceforge.net/projects/genoviz/, and http://sourceforge.net/projects/useq.

  20. VisTool: A user interface and visualization development system

    DEFF Research Database (Denmark)

    Xu, Shangjin

    system – to simplify user interface development. VisTool allows user interface development without real programming. With VisTool a designer assembles visual objects (e.g. textboxes, ellipse, etc.) to visualize database contents. In VisTool, visual properties (e.g. color, position, etc.) can be formulas...... programming. However, in Software Engineering, software engineers who develop user interfaces do not follow it. In many cases, it is desirable to use graphical presentations, because a graphical presentation gives a better overview than text forms, and can improve task efficiency and user satisfaction....... However, it is more difficult to follow the classical usability approach for graphical presentation development. These difficulties result from the fact that designers cannot implement user interface with interactions and real data. We developed VisTool – a user interface and visualization development...

  1. TacTool: a tactile rapid prototyping tool for visual interfaces

    NARCIS (Netherlands)

    Keyson, D.V.; Tang, H.K.; Anzai, Y.; Ogawa, K.; Mori, H.

    1995-01-01

    This paper describes the TacTool development tool and input device for designing and evaluating visual user interfaces with tactile feedback. TacTool is currently supported by the IPO trackball with force feedback in the x and y directions. The tool is designed to enable both the designer and the

  2. Interactive visualization tools for the structural biologist.

    Science.gov (United States)

    Porebski, Benjamin T; Ho, Bosco K; Buckle, Ashley M

    2013-10-01

    In structural biology, management of a large number of Protein Data Bank (PDB) files and raw X-ray diffraction images often presents a major organizational problem. Existing software packages that manipulate these file types were not designed for these kinds of file-management tasks. This is typically encountered when browsing through a folder of hundreds of X-ray images, with the aim of rapidly inspecting the diffraction quality of a data set. To solve this problem, a useful functionality of the Macintosh operating system (OSX) has been exploited that allows custom visualization plugins to be attached to certain file types. Software plugins have been developed for diffraction images and PDB files, which in many scenarios can save considerable time and effort. The direct visualization of diffraction images and PDB structures in the file browser can be used to identify key files of interest simply by scrolling through a list of files.

  3. Plasma sources for EUV lithography exposure tools

    International Nuclear Information System (INIS)

    Banine, Vadim; Moors, Roel

    2004-01-01

    The source is an integral part of an extreme ultraviolet lithography (EUVL) tool. Such a source, as well as the EUVL tool, has to fulfil extremely high demands, both technical and cost-oriented. The EUVL tool operates at a wavelength in the range 13-14 nm, which requires a major re-thinking of state-of-the-art lithography systems operating in the DUV range. The light production mechanism changes from conventional lamps and lasers to relatively high temperature emitting plasmas. The light transport, mainly refractive for DUV, should become reflective for EUV. The source specifications are derived from the customer requirements for the complete tool, which are: throughput, cost of ownership (CoO) and imaging quality. The EUVL system is considered as a follow up of the existing DUV based lithography technology and, while improving the feature resolution, it has to maintain high wafer throughput performance, which is driven by the overall CoO picture. This in turn puts quite high requirements on the collectable in-band power produced by an EUV source. Tighter critical dimension (CD) control requirements, driven by the improved feature resolution, together with reflective-optics restrictions, necessitate pulse-to-pulse repeatability, spatial stability control and repetition rates that are substantially better than those of current optical systems. Altogether the following aspects of the source specification will be addressed: the operating wavelength, the EUV power, the hot spot size, the collectable angle, the repetition rate, the pulse-to-pulse repeatability and the debris-induced lifetime of components

  4. Tools for Visualizing HIV in Cure Research.

    Science.gov (United States)

    Niessl, Julia; Baxter, Amy E; Kaufmann, Daniel E

    2018-02-01

    The long-lived HIV reservoir remains a major obstacle for an HIV cure. Current techniques to analyze this reservoir are generally population-based. We highlight recent developments in methods visualizing HIV, which offer a different, complementary view, and provide indispensable information for cure strategy development. Recent advances in fluorescence in situ hybridization techniques enabled key developments in reservoir visualization. Flow cytometric detection of HIV mRNAs, concurrently with proteins, provides a high-throughput approach to study the reservoir on a single-cell level. On a tissue level, key spatial information can be obtained detecting viral RNA and DNA in situ by fluorescence microscopy. At total-body level, advancements in non-invasive immuno-positron emission tomography (PET) detection of HIV proteins may allow an encompassing view of HIV reservoir sites. HIV imaging approaches provide important, complementary information regarding the size, phenotype, and localization of the HIV reservoir. Visualizing the reservoir may contribute to the design, assessment, and monitoring of HIV cure strategies in vitro and in vivo.

  5. Business intelligence tools for radiology: creating a prototype model using open-source tools.

    Science.gov (United States)

    Prevedello, Luciano M; Andriole, Katherine P; Hanson, Richard; Kelly, Pauline; Khorasani, Ramin

    2010-04-01

    Digital radiology departments could benefit from the ability to integrate and visualize data (e.g. information reflecting complex workflow states) from all of their imaging and information management systems in one composite presentation view. Leveraging data warehousing tools developed in the business world may be one way to achieve this capability. Taken as a whole, the practice of managing the information available in such a data repository is known as Business Intelligence (BI). This paper describes the concepts used in Business Intelligence, their importance to modern radiology, and the steps used in the creation of a prototype model of a data warehouse for BI using open-source tools.

  6. Open-source tools for data mining.

    Science.gov (United States)

    Zupan, Blaz; Demsar, Janez

    2008-03-01

    With a growing volume of biomedical databases and repositories, the need to develop a set of tools to address their analysis and support knowledge discovery is becoming acute. The data mining community has developed a substantial set of techniques for computational treatment of these data. In this article, we discuss the evolution of open-source toolboxes that data mining researchers and enthusiasts have developed over the span of a few decades and review several currently available open-source data mining suites. The approaches we review are diverse in data mining methods and user interfaces and also demonstrate that the field and its tools are ready to be fully exploited in biomedical research.

  7. SnopViz, an interactive snow profile visualization tool

    Science.gov (United States)

    Fierz, Charles; Egger, Thomas; Gerber, Matthias; Bavay, Mathias; Techel, Frank

    2016-04-01

    SnopViz is a visualization tool for both simulation outputs of the snow-cover model SNOWPACK and observed snow profiles. It has been designed to fulfil the needs of operational services (Swiss Avalanche Warning Service, Avalanche Canada) as well as offer the flexibility required to satisfy the specific needs of researchers. This JavaScript application runs on any modern browser and does not require an active Internet connection. The open source code is available for download from models.slf.ch where examples can also be run. Both the SnopViz library and the SnopViz User Interface will become a full replacement of the current research visualization tool SN_GUI for SNOWPACK. The SnopViz library is a stand-alone application that parses the provided input files, for example, a single snow profile (CAAML file format) or multiple snow profiles as output by SNOWPACK (PRO file format). A plugin architecture allows for handling JSON objects (JavaScript Object Notation) as well and plugins for other file formats may be added easily. The outputs are provided either as vector graphics (SVG) or JSON objects. The SnopViz User Interface (UI) is a browser based stand-alone interface. It runs in every modern browser, including IE, and allows user interaction with the graphs. SVG, the XML based standard for vector graphics, was chosen because of its easy interaction with JS and a good software support (Adobe Illustrator, Inkscape) to manipulate graphs outside SnopViz for publication purposes. SnopViz provides new visualization for SNOWPACK timeline output as well as time series input and output. The actual output format for SNOWPACK timelines was retained while time series are read from SMET files, a file format used in conjunction with the open source data handling code MeteoIO. Finally, SnopViz is able to render single snow profiles, either observed or modelled, that are provided as CAAML-file. This file format (caaml.org/Schemas/V5.0/Profiles/SnowProfileIACS) is an international

  8. Classifying Desirable Features of Software Visualization Tools for Corrective Maintenance

    NARCIS (Netherlands)

    Sensalire, Mariam; Ogao, Patrick; Telea, Alexandru

    2008-01-01

    We provide an evaluation of 15 software visualization tools applicable to corrective maintenance. The tasks supported as well as the techniques used are presented and graded based on the support level. By analyzing user acceptance of current tools, we aim to help developers to select what to

  9. HMMEditor: a visual editing tool for profile hidden Markov model

    Directory of Open Access Journals (Sweden)

    Cheng Jianlin

    2008-03-01

    Full Text Available Abstract Background Profile Hidden Markov Model (HMM) is a powerful statistical model to represent a family of DNA, RNA, and protein sequences. Profile HMM has been widely used in bioinformatics research such as sequence alignment, gene structure prediction, motif identification, protein structure prediction, and biological database search. However, few comprehensive, visual editing tools for profile HMM are publicly available. Results We develop a visual editor for profile Hidden Markov Models (HMMEditor). HMMEditor can visualize the profile HMM architecture, transition probabilities, and emission probabilities. Moreover, it provides functions to edit and save HMM and parameters. Furthermore, HMMEditor allows users to align a sequence against the profile HMM and to visualize the corresponding Viterbi path. Conclusion HMMEditor provides a set of unique functions to visualize and edit a profile HMM. It is a useful tool for biological sequence analysis and modeling. Both HMMEditor software and web service are freely available.
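
    To make the Viterbi path mentioned above concrete, here is a generic Viterbi decoder for a tiny two-state HMM in Python. It is only a hedged illustration of the kind of state path HMMEditor visualizes; a real profile HMM additionally has match, insert and delete states, which are omitted here.

        # Generic Viterbi decoding for a tiny two-state HMM: a hedged illustration of the
        # kind of state path HMMEditor visualizes. A real profile HMM additionally has
        # match, insert and delete states, which are omitted here.
        import math

        states = ["M1", "M2"]
        start = {"M1": 0.6, "M2": 0.4}
        trans = {"M1": {"M1": 0.7, "M2": 0.3}, "M2": {"M1": 0.4, "M2": 0.6}}
        emit = {"M1": {"A": 0.5, "C": 0.2, "G": 0.2, "T": 0.1},
                "M2": {"A": 0.1, "C": 0.4, "G": 0.4, "T": 0.1}}

        def viterbi(seq):
            # dp[i][s] = (best log-probability of emitting seq[:i+1] and ending in s, backpointer)
            dp = [{s: (math.log(start[s] * emit[s][seq[0]]), None) for s in states}]
            for ch in seq[1:]:
                row = {}
                for s in states:
                    prev = max(states, key=lambda p: dp[-1][p][0] + math.log(trans[p][s]))
                    row[s] = (dp[-1][prev][0] + math.log(trans[prev][s] * emit[s][ch]), prev)
                dp.append(row)
            last = max(states, key=lambda s: dp[-1][s][0])
            path = [last]
            for row in reversed(dp[1:]):             # trace the backpointers
                path.append(row[path[-1]][1])
            return list(reversed(path))

        print(viterbi("ACGT"))   # prints the most probable state path for the toy model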

  10. A Visualization-Based Tutoring Tool for Engineering Education

    Science.gov (United States)

    Nguyen, Tang-Hung; Khoo, I.-Hung

    2010-06-01

    In engineering disciplines, students usually have a hard time visualizing different aspects of engineering analysis and design, which inherently are too complex or abstract to fully understand without the aid of visual explanations or visualizations. For example, when learning the materials and sequences of a construction process, students need to visualize how all components of a constructed facility are assembled. Such visualization cannot be achieved in a textbook or a traditional lecturing environment. In this paper, the authors present the development of computer tutoring software, in which different visualization tools including video clips, 3-dimensional models, drawings, and pictures/photos, together with complementary texts, are used to assist students in deeply understanding and effectively mastering the material. The paper also discusses the implementation and the effectiveness evaluation of the proposed tutoring software, which was used to teach a construction engineering management course offered at California State University, Long Beach.

  11. Development of in-situ visualization tool for PIC simulation

    International Nuclear Information System (INIS)

    Ohno, Nobuaki; Ohtani, Hiroaki

    2014-01-01

    As the capability of supercomputers improves, the sizes of simulations and their output data also become larger and larger. Visualization is usually carried out on a researcher's PC with interactive visualization software after the computer simulation has been performed. However, the data size is now becoming too large for this approach. A promising answer is in-situ visualization, in which a simulation code is coupled with the visualization code and visualization is performed together with the simulation on the same supercomputer. We developed an in-situ visualization tool for particle-in-cell (PIC) simulation, provided as a Fortran module. We coupled it with a PIC simulation code, tested the coupled code on the Plasma Simulator supercomputer, and confirmed that it works. (author)

  12. Visual Impairment Screening Assessment (VISA) tool: pilot validation.

    Science.gov (United States)

    Rowe, Fiona J; Hepworth, Lauren R; Hanna, Kerry L; Howard, Claire

    2018-03-06

    To report and evaluate a new Vision Impairment Screening Assessment (VISA) tool intended for use by the stroke team to improve identification of visual impairment in stroke survivors. Prospective case cohort comparative study. Stroke units at two secondary care hospitals and one tertiary centre. 116 stroke survivors were screened, 62 by naïve and 54 by non-naïve screeners. Both the VISA screening tool and the comprehensive specialist vision assessment measured case history, visual acuity, eye alignment, eye movements, visual field and visual inattention. Full completion of VISA tool and specialist vision assessment was achieved for 89 stroke survivors. Missing data for one or more sections typically related to patient's inability to complete the assessment. Sensitivity and specificity of the VISA screening tool were 90.24% and 85.29%, respectively; the positive and negative predictive values were 93.67% and 78.36%, respectively. Overall agreement was significant; k=0.736. Lowest agreement was found for screening of eye movement and visual inattention deficits. This early validation of the VISA screening tool shows promise in improving detection accuracy for clinicians involved in stroke care who are not specialists in vision problems and lack formal eye training, with potential to lead to more prompt referral with fewer false positives and negatives. Pilot validation indicates acceptability of the VISA tool for screening of visual impairment in stroke survivors. Sensitivity and specificity were high indicating the potential accuracy of the VISA tool for screening purposes. Results of this study have guided the revision of the VISA screening tool ahead of full clinical validation. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  13. Participation and 3D Visualization Tools

    DEFF Research Database (Denmark)

    Mullins, Michael; Jensen, Mikkel Holm; Henriksen, Sune

    2004-01-01

    With a departure point in a workshop held at the VR Media Lab at Aalborg University, this paper deals with aspects of public participation and the use of 3D visualisation tools. The workshop grew from a desire to involve a broad collaboration between the many actors in the city through using new...... perceptions of architectural representation in urban design where 3D visualisation techniques are used. It is the authors' general finding that, while 3D visualisation media have the potential to increase understanding of virtual space for the lay public, as well as for professionals, the lay public require

  14. Visualization and Quality Control Web Tools for CERES Products

    Science.gov (United States)

    Mitrescu, C.; Doelling, D. R.

    2017-12-01

    The NASA CERES project continues to provide the scientific communities with a wide variety of satellite-derived data products such as observed TOA broadband shortwave and longwave fluxes, computed TOA and surface fluxes, as well as cloud, aerosol, and other atmospheric parameters. They encompass a wide range of temporal and spatial resolutions, suited to specific applications. CERES data is used mostly by climate modeling communities but also by a wide variety of educational institutions. To better serve our users, a web-based Ordering and Visualization Tool (OVT) was developed using open source software such as Eclipse, Java, JavaScript, OpenLayers, Flot, Google Maps, Python, and others. Due to increased demand by our own scientists, we also implemented a series of specialized functions to be used in the process of CERES Data Quality Control (QC) such as 1- and 2-D histograms, anomalies and differences, temporal and spatial averaging, side-by-side parameter comparison, and others that made the process of QC far easier and faster, but more importantly far more portable. The integration of ground-site observed surface fluxes further helps the CERES project QC the CERES computed surface fluxes. An overview of the CERES OVT basic functions using open source software, as well as future steps in expanding its capabilities, will be presented at the meeting.

  15. Visualizing data mining results with the Brede tools

    Directory of Open Access Journals (Sweden)

    Finn A Nielsen

    2009-07-01

    Full Text Available A few neuroinformatics databases now exist that record results from neuroimaging studies in the form of brain coordinates in stereotaxic space. The Brede Toolbox was originally developed to extract, analyze and visualize data from one of them, the BrainMap database. Since then the Brede Toolbox has expanded and now includes its own database with coordinates along with ontologies for brain regions and functions: the Brede Database. With Brede Toolbox and Database combined we set up automated workflows for extraction of data, mass meta-analytic data mining and visualizations. Most of the Web presence of the Brede Database is established by a single script executing a workflow involving these steps together with a final generation of Web pages with embedded visualizations and links to interactive three-dimensional models in the Virtual Reality Modeling Language. Apart from the Brede tools I briefly review alternative visualization tools and methods for Internet-based visualization and information visualization as well as portals for visualization tools.

  16. EUV sources for the alpha-tools

    Science.gov (United States)

    Pankert, Joseph; Apetz, Rolf; Bergmann, Klaus; Damen, Marcel; Derra, Günther; Franken, Oliver; Janssen, Maurice; Jonkers, Jeroen; Klein, Jürgen; Kraus, Helmar; Krücken, Thomas; List, Andreas; Loeken, Micheal; Mader, Arnaud; Metzmacher, Christof; Neff, Willi; Probst, Sven; Prümmer, Ralph; Rosier, Oliver; Schwabe, Stefan; Seiwert, Stefan; Siemons, Guido; Vaudrevange, Dominik; Wagemann, Dirk; Weber, Achim; Zink, Peter; Zitzen, Oliver

    2006-03-01

    In this paper, we report on the recent progress of the Philips Extreme UV source. The Philips source concept is based on a discharge plasma ignited in a Sn vapor plume that is ablated by a laser pulse. Using rotating electrodes covered with a regenerating tin surface, the problems of electrode erosion and power scaling are fundamentally solved. Most of the work of the past year has been dedicated to developing a lamp system that operates very reliably and stably under full scanner remote control. Topics addressed were the development of the scanner interface, a dose control system, thermo-mechanical design, positional stability of the source, tin handling, and many more. The resulting EUV source, the Philips NovaTin(R) source, can operate at more than 10kW electrical input power and delivers 200W in-band EUV into 2π continuously. The source is very small, so nearly 100% of the EUV radiation can be collected within etendue limits. The lamp system is fully automated and can operate unattended under full scanner remote control. 500 million shots of continuous operation without interruption have been realized; electrode lifetime is at least 2 billion shots. Three sources are currently being prepared; two of them will be integrated into the first EUV Alpha Demonstration tools of ASML. The debris problem was reduced to a level which is readily acceptable for scanner operation. First, a considerable reduction of the Sn emission of the source has been realized. The debris mitigation system is based on a two-step concept using a foil trap based stage and a chemical cleaning stage. Both steps were improved considerably. A collector lifetime of 1 billion shots is achieved; after this operating time a cleaning would be applied. The cleaning step has been verified to work with tolerable Sn residues. From the experimental results, a total collector lifetime of more than 10 billion shots can be expected.

  17. VBioindex: A Visual Tool to Estimate Biodiversity

    Directory of Open Access Journals (Sweden)

    Dong Su Yu

    2015-09-01

    Full Text Available Biological diversity, also known as biodiversity, is an important criterion for measuring the value of an ecosystem. As biodiversity is closely related to human welfare and quality of life, many efforts to restore and maintain the biodiversity of species have been made by government agencies and non-governmental organizations, thereby drawing a substantial amount of international attention. In the fields of biological research, biodiversity is widely measured using traditional statistical indices such as the Shannon-Wiener index, species richness, evenness, and relative dominance of species. However, some biologists and ecologists have difficulty using these indices because they require advanced mathematical knowledge and computational techniques. Therefore, we developed VBioindex, a user-friendly program that is capable of measuring the Shannon-Wiener index, species richness, evenness, and relative dominance. VBioindex provides an easy-to-use interface and visually represents the results in the form of a simple chart; in addition, it offers functions for long-term investigation of datasets using time-series analyses.
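
    The indices VBioindex reports are standard quantities, so they can be re-computed directly from a table of species counts. The sketch below is a plain Python illustration with toy numbers, not VBioindex's own code; note that the base of the logarithm in H' is a convention, and natural log is used here.

        # Plain re-computation of the standard indices from a toy table of species counts
        # (not VBioindex's own code). Natural log is used for the Shannon-Wiener index.
        import math

        counts = {"species_a": 50, "species_b": 30, "species_c": 15, "species_d": 5}

        total = sum(counts.values())
        proportions = {sp: n / total for sp, n in counts.items()}

        richness = len(counts)                                          # S, number of species
        shannon = -sum(p * math.log(p) for p in proportions.values())   # H'
        evenness = shannon / math.log(richness)                         # Pielou's J' = H'/ln(S)

        print(f"richness S = {richness}")
        print(f"Shannon-Wiener H' = {shannon:.3f}")
        print(f"evenness J' = {evenness:.3f}")
        for sp, p in proportions.items():
            print(f"relative dominance of {sp}: {100 * p:.1f}%")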

  18. Funding Sources for Visually Impaired Students in Higher Education.

    Science.gov (United States)

    Traber, M.

    1987-01-01

    Financial aid sources available to visually handicapped students for postsecondary educational, vocational, or technical programs are outlined. Sources include national and state blindness agencies, colleges and universities, state vocational rehabilitation agencies, and the federal government. (Author/JDD)

  19. Porcupine: A visual pipeline tool for neuroimaging analysis.

    Directory of Open Access Journals (Sweden)

    Tim van Mourik

    2018-05-01

    Full Text Available The field of neuroimaging is rapidly adopting a more reproducible approach to data acquisition and analysis. Data structures and formats are being standardised and data analyses are getting more automated. However, as data analysis becomes more complicated, researchers often have to write longer analysis scripts, spanning different tools across multiple programming languages. This makes it more difficult to share or recreate code, reducing the reproducibility of the analysis. We present a tool, Porcupine, that constructs one's analysis visually and automatically produces analysis code. The graphical representation improves understanding of the performed analysis, while retaining the flexibility of modifying the produced code manually to custom needs. Not only does Porcupine produce the analysis code, it also creates a shareable environment for running the code in the form of a Docker image. Together, this forms a reproducible way of constructing, visualising and sharing one's analysis. Currently, Porcupine links to Nipype functionalities, which in turn accesses most standard neuroimaging analysis tools. Our goal is to release researchers from the constraints of specific implementation details, thereby freeing them to think about novel and creative ways to solve a given problem. Porcupine improves the overview researchers have of their processing pipelines, and facilitates both the development and communication of their work. This will reduce the threshold at which less expert users can generate reusable pipelines. With Porcupine, we bridge the gap between a conceptual and an implementational level of analysis and make it easier for researchers to create reproducible and shareable science. We provide a wide range of examples and documentation, as well as installer files for all platforms on our website: https://timvanmourik.github.io/Porcupine. Porcupine is free, open source, and released under the GNU General Public License v3.0.
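
    The core idea of generating analysis code from a visually constructed graph can be sketched generically: topologically sort the pipeline nodes and emit one line of code per node. The node names and the emitted calls below are invented placeholders; Porcupine's real output targets Nipype and is far richer.

        # Concept sketch only: node names and the emitted calls are invented placeholders,
        # not Porcupine's Nipype output. The point is the pattern: topologically sort the
        # visually constructed graph, then emit one line of analysis code per node.
        from graphlib import TopologicalSorter

        # node -> the nodes it depends on
        pipeline = {
            "motion_correct": set(),
            "smooth": {"motion_correct"},
            "glm": {"smooth"},
            "threshold": {"glm"},
        }

        templates = {
            "motion_correct": "data = motion_correct(load('raw.nii'))",
            "smooth": "data = smooth(data, fwhm=6)",
            "glm": "stats = fit_glm(data, design='design.tsv')",
            "threshold": "save(threshold(stats, z=3.1), 'zmap.nii')",
        }

        order = TopologicalSorter(pipeline).static_order()
        script = "\n".join(templates[node] for node in order)
        print(script)   # the generated "analysis code" for this toy graph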

  20. VMEXT: A Visualization Tool for Mathematical Expression Trees

    OpenAIRE

    Schubotz, Moritz; Meuschke, Norman; Hepp, Thomas; Cohl, Howard S.; Gipp, Bela

    2017-01-01

    Mathematical expressions can be represented as a tree consisting of terminal symbols, such as identifiers or numbers (leaf nodes), and functions or operators (non-leaf nodes). Expression trees are an important mechanism for storing and processing mathematical expressions as well as the most frequently used visualization of the structure of mathematical expressions. Typically, researchers and practitioners manually visualize expression trees using general-purpose tools. This approach is labori...

  1. New tools to aid in scientific computing and visualization

    International Nuclear Information System (INIS)

    Wallace, M.G.; Christian-Frear, T.L.

    1992-01-01

    In this paper, two computer programs are described which aid in the pre- and post-processing of computer generated data. CoMeT (Computational Mechanics Toolkit) is a customizable, interactive, graphical, menu-driven program that provides the analyst with a consistent user-friendly interface to analysis codes. Trans Vol (Transparent Volume Visualization) is a specialized tool for the scientific three-dimensional visualization of complex solids by the technique of volume rendering. Both tools are described in basic detail along with an application example concerning the simulation of contaminant migration from an underground nuclear repository

  2. CTViz: A tool for the visualization of transport in nanocomposites.

    Science.gov (United States)

    Beach, Benjamin; Brown, Joshua; Tarlton, Taylor; Derosa, Pedro A

    2016-05-01

    A visualization tool (CTViz) for charge transport processes in 3-D hybrid materials (nanocomposites) was developed, inspired by the need for a graphical application to assist in code debugging and data presentation of an existing in-house code. As the simulation code grew, troubleshooting problems grew increasingly difficult without an effective way to visualize 3-D samples and charge transport in those samples. CTViz is able to produce publication and presentation quality visuals of the simulation box, as well as static and animated visuals of the paths of individual carriers through the sample. CTViz was designed to provide a high degree of flexibility in the visualization of the data. A feature that characterizes this tool is the use of shade and transparency levels to highlight important details in the morphology or in the transport paths by hiding or dimming elements of little relevance to the current view. This is fundamental for the visualization of 3-D systems with complex structures. The code presented here provides these required capabilities, but has gone beyond the original design and could be used as is or easily adapted for the visualization of other particulate transport where transport occurs on discrete paths. Copyright © 2016 Elsevier Inc. All rights reserved.
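
    The kind of view described above, a carrier path threading a 3-D sample with dimmed surrounding particles, can be approximated with a few lines of matplotlib. This is not CTViz itself, just a hedged sketch with random toy data:

        # Not CTViz itself: a minimal matplotlib sketch of the kind of view it produces,
        # a random-walk "carrier path" drawn inside a unit box with dimmed filler particles.
        import numpy as np
        import matplotlib.pyplot as plt
        from mpl_toolkits.mplot3d import Axes3D  # noqa: F401 (needed on older matplotlib)

        rng = np.random.default_rng(1)
        steps = rng.normal(scale=0.05, size=(200, 3))
        path = np.clip(np.cumsum(steps, axis=0) + 0.5, 0, 1)   # keep the walk inside the box

        fig = plt.figure()
        ax = fig.add_subplot(projection="3d")
        ax.plot(path[:, 0], path[:, 1], path[:, 2], color="crimson", linewidth=1.5)

        fillers = rng.random((150, 3))                          # low alpha keeps the path visible
        ax.scatter(fillers[:, 0], fillers[:, 1], fillers[:, 2], s=10, alpha=0.15, color="gray")

        ax.set_xlim(0, 1); ax.set_ylim(0, 1); ax.set_zlim(0, 1)
        ax.set_title("Carrier path through a toy nanocomposite sample")
        plt.show()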

  3. ASCI visualization tool evaluation, Version 2.0

    Energy Technology Data Exchange (ETDEWEB)

    Kegelmeyer, P. [ed.] [Sandia National Labs., Livermore, CA (United States). Center for Computational Engineering

    1997-04-01

    The charter of the ASCI Visualization Common Tools subgroup was to investigate and evaluate 3D scientific visualization tools. As part of that effort, a Tri-Lab evaluation effort was launched in February of 1996. The first step was to agree on a thoroughly documented list of 32 features against which all tool candidates would be evaluated. These evaluation criteria were both gleaned from a user survey and determined from informed extrapolation into the future, particularly as concerns the 3D nature and extremely large size of ASCI data sets. The second step was to winnow a field of 41 candidate tools down to 11. The selection principle was to be as inclusive as practical, retaining every tool that seemed to hold any promise of fulfilling all of ASCI's visualization needs. These 11 tools were then closely investigated by volunteer evaluators distributed across LANL, LLNL, and SNL. This report contains the results of those evaluations, as well as a discussion of the evaluation philosophy and criteria.

  4. Interactive Data Visualization for HIV Cohorts: Leveraging Data Exchange Standards to Share and Reuse Research Tools.

    Directory of Open Access Journals (Sweden)

    Meridith Blevins

    Full Text Available To develop and disseminate tools for interactive visualization of HIV cohort data. If a picture is worth a thousand words, then an interactive video, composed of a long string of pictures, can produce an even richer presentation of HIV population dynamics. We developed an HIV cohort data visualization tool using open-source software (R statistical language). The tool requires that the data structure conform to the HIV Cohort Data Exchange Protocol (HICDEP), and our implementation utilized Caribbean, Central and South America network (CCASAnet) data. This tool currently presents patient-level data in three classes of plots: (1) Longitudinal plots showing changes in measurements viewed alongside event probability curves allowing for simultaneous inspection of outcomes by relevant patient classes. (2) Bubble plots showing changes in indicators over time allowing for observation of group level dynamics. (3) Heat maps of levels of indicators changing over time allowing for observation of spatial-temporal dynamics. Examples of each class of plot are given using CCASAnet data investigating trends in CD4 count and AIDS at antiretroviral therapy (ART) initiation, CD4 trajectories after ART initiation, and mortality. We invite researchers interested in this data visualization effort to use these tools and to suggest new classes of data visualization. We aim to contribute additional shareable tools in the spirit of open scientific collaboration and hope that these tools further the participation in open data standards like HICDEP by the HIV research community.

  5. Human Factors Evaluation of Advanced Electric Power Grid Visualization Tools

    Energy Technology Data Exchange (ETDEWEB)

    Greitzer, Frank L.; Dauenhauer, Peter M.; Wierks, Tamara G.; Podmore, Robin

    2009-04-01

    This report describes initial human factors evaluation of four visualization tools (Graphical Contingency Analysis, Force Directed Graphs, Phasor State Estimator and Mode Meter/ Mode Shapes) developed by PNNL, and proposed test plans that may be implemented to evaluate their utility in scenario-based experiments.

  6. Visualization tool. 3DAVS and polarization-type VR system

    International Nuclear Information System (INIS)

    Takeda, Yasuhiro; Ueshima, Yutaka

    2003-01-01

    In the visualization of simulation data across advanced research fields, the results reported or presented are still mostly limited to still pictures or 2-dimensional animations, despite the recent abundance of visualization software. With the recent progress of computational environments, however, ever more complicated phenomena can be computed, and the results increasingly need to be presented in a comprehensible and intelligible form. This calls for animation rather than still pictures, and for 3-dimensional display (virtual reality) rather than 2-dimensional display. In this report, two visualization tools, 3DAVS and a polarization-type VR system, are described as methods for presenting data after visualization processing. (author)

  7. Visualizing spikes in source-space

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Duez, Lene; Scherg, Michael

    2016-01-01

    OBJECTIVE: Reviewing magnetoencephalography (MEG) recordings is time-consuming: signals from the 306 MEG-sensors are typically reviewed in six arrays of 51 sensors each, which means browsing each recording six times in order to evaluate all signals. A novel method of reconstructing the MEG...... signals in source-space was developed using a source-montage of 29 brain-regions and two spatial components to remove magnetocardiographic (MKG) artefacts. Our objective was to evaluate the accuracy of reviewing MEG in source-space. METHODS: In 60 consecutive patients with epilepsy, we prospectively...... evaluated the accuracy of reviewing the MEG signals in source-space as compared to the classical method of reviewing them in sensor-space. RESULTS: All 46 spike-clusters identified in sensor-space were also identified in source-space. Two additional spike-clusters were identified in source-space. As 29

  8. Haptic over visual information in the distribution of visual attention after tool-use in near and far space.

    Science.gov (United States)

    Park, George D; Reed, Catherine L

    2015-10-01

    Despite attentional prioritization for grasping space near the hands, tool-use appears to transfer attentional bias to the tool's end/functional part. The contributions of haptic and visual inputs to attentional distribution along a tool were investigated as a function of tool-use in near (Experiment 1) and far (Experiment 2) space. Visual attention was assessed with a 50/50, go/no-go, target discrimination task, while a tool was held next to targets appearing near the tool-occupied hand or tool-end. Target response times (RTs) and sensitivity (d-prime) were measured at target locations, before and after functional tool practice for three conditions: (1) open-tool: tool-end visible (visual + haptic inputs), (2) hidden-tool: tool-end visually obscured (haptic input only), and (3) short-tool: stick missing tool's length/end (control condition: hand occupied but no visual/haptic input). In near space, both open- and hidden-tool groups showed a tool-end, attentional bias (faster RTs toward tool-end) before practice; after practice, RTs near the hand improved. In far space, the open-tool group showed no bias before practice; after practice, target RTs near the tool-end improved. However, the hidden-tool group showed a consistent tool-end bias despite practice. Lack of short-tool group results suggested that hidden-tool group results were specific to haptic inputs. In conclusion, (1) allocation of visual attention along a tool due to tool practice differs in near and far space, and (2) visual attention is drawn toward the tool's end even when visually obscured, suggesting haptic input provides sufficient information for directing attention along the tool.
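
    Sensitivity (d-prime) in a go/no-go task is a standard signal-detection quantity computed from hit and false-alarm rates. The snippet below shows the usual calculation with a log-linear correction and illustrative counts; it is not the authors' analysis code or data.

        # Standard signal-detection computation of d-prime from a go/no-go block
        # (illustrative counts, not the authors' data or analysis code).
        from statistics import NormalDist

        def d_prime(hits, misses, false_alarms, correct_rejections):
            # log-linear correction keeps the rates away from 0 and 1
            hit_rate = (hits + 0.5) / (hits + misses + 1)
            fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
            z = NormalDist().inv_cdf
            return z(hit_rate) - z(fa_rate)

        print(round(d_prime(hits=45, misses=5, false_alarms=8, correct_rejections=42), 2))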

  9. Magnetic source localization of early visual mismatch response

    NARCIS (Netherlands)

    Susac, A.; Heslenfeld, D.J.; Huonker, R.; Supek, S.

    2014-01-01

    Previous studies have reported a visual analogue of the auditory mismatch negativity (MMN) response that is based on sensory memory. The neural generators and attention dependence of the visual MMN (vMMN) still remain unclear. We used magnetoencephalography (MEG) and spatio-temporal source

  10. An Open Source Tool to Test Interoperability

    Science.gov (United States)

    Bermudez, L. E.

    2012-12-01

    Scientists interact with information at various levels, from gathering of the raw observed data to accessing portrayed, processed quality control data. Geoinformatics tools help scientists with the acquisition, storage, processing, dissemination and presentation of geospatial information. Most of the interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, encodings of messages and managing of errors. Testing of these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards. Following standards increases interoperability between components while reducing the time needed to develop new software. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within the Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is a Java open source facility, available on SourceForge, that can be run via the command line, deployed in a web servlet container or integrated in a developer's environment via Maven. The TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services and XML instances against Schemas and Schematron-based assertions of any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test has more than 400 test assertions. Some of these assertions include conformance of HTTP responses, conformance of GML-encoded data; proper values for elements and attributes in the XML; and correct error responses. This presentation will provide an overview of TEAM Engine, introduction of how to test via the OGC Testing web site and
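
    To give a flavour of the kind of assertion such a test makes, the sketch below sends a WFS 1.0.0 GetCapabilities request and checks that the response is well-formed XML with the expected root element. TEAM Engine itself expresses tests in CTL/XML rather than Python, and the endpoint URL here is a placeholder.

        # Hedged example of the kind of assertion a compliance test makes (TEAM Engine
        # itself expresses tests in CTL/XML, not Python). The endpoint URL is a
        # placeholder; substitute a real WFS instance before running.
        import urllib.request
        import xml.etree.ElementTree as ET

        ENDPOINT = "https://example.org/wfs"   # placeholder
        url = ENDPOINT + "?service=WFS&version=1.0.0&request=GetCapabilities"

        with urllib.request.urlopen(url, timeout=30) as resp:
            assert resp.status == 200, f"unexpected HTTP status {resp.status}"
            body = resp.read()

        root = ET.fromstring(body)             # must be well-formed XML
        local_name = root.tag.split("}")[-1]   # strip any namespace prefix
        assert local_name == "WFS_Capabilities", f"unexpected root element: {root.tag}"
        print("GetCapabilities response looks like a WFS capabilities document")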

  11. Abstractocyte: A Visual Tool for Exploring Nanoscale Astroglial Cells

    KAUST Repository

    Mohammed, Haneen; Al-Awami, Ali K.; Beyer, Johanna; Cali, Corrado; Magistretti, Pierre J.; Pfister, Hanspeter; Hadwiger, Markus

    2017-01-01

    This paper presents Abstractocyte, a system for the visual analysis of astrocytes and their relation to neurons, in nanoscale volumes of brain tissue. Astrocytes are glial cells, i.e., non-neuronal cells that support neurons and the nervous system. The study of astrocytes has immense potential for understanding brain function. However, their complex and widely-branching structure requires high-resolution electron microscopy imaging and makes visualization and analysis challenging. Furthermore, the structure and function of astrocytes is very different from neurons, and therefore requires the development of new visualization and analysis tools. With Abstractocyte, biologists can explore the morphology of astrocytes using various visual abstraction levels, while simultaneously analyzing neighboring neurons and their connectivity. We define a novel, conceptual 2D abstraction space for jointly visualizing astrocytes and neurons. Neuroscientists can choose a specific joint visualization as a point in this space. Interactively moving this point allows them to smoothly transition between different abstraction levels in an intuitive manner. In contrast to simply switching between different visualizations, this preserves the visual context and correlations throughout the transition. Users can smoothly navigate from concrete, highly-detailed 3D views to simplified and abstracted 2D views. In addition to investigating astrocytes, neurons, and their relationships, we enable the interactive analysis of the distribution of glycogen, which is of high importance to neuroscientists. We describe the design of Abstractocyte, and present three case studies in which neuroscientists have successfully used our system to assess astrocytic coverage of synapses, glycogen distribution in relation to synapses, and astrocytic-mitochondria coverage.

  12. Abstractocyte: A Visual Tool for Exploring Nanoscale Astroglial Cells

    KAUST Repository

    Mohammed, Haneen

    2017-08-28

    This paper presents Abstractocyte, a system for the visual analysis of astrocytes and their relation to neurons, in nanoscale volumes of brain tissue. Astrocytes are glial cells, i.e., non-neuronal cells that support neurons and the nervous system. The study of astrocytes has immense potential for understanding brain function. However, their complex and widely-branching structure requires high-resolution electron microscopy imaging and makes visualization and analysis challenging. Furthermore, the structure and function of astrocytes is very different from neurons, and therefore requires the development of new visualization and analysis tools. With Abstractocyte, biologists can explore the morphology of astrocytes using various visual abstraction levels, while simultaneously analyzing neighboring neurons and their connectivity. We define a novel, conceptual 2D abstraction space for jointly visualizing astrocytes and neurons. Neuroscientists can choose a specific joint visualization as a point in this space. Interactively moving this point allows them to smoothly transition between different abstraction levels in an intuitive manner. In contrast to simply switching between different visualizations, this preserves the visual context and correlations throughout the transition. Users can smoothly navigate from concrete, highly-detailed 3D views to simplified and abstracted 2D views. In addition to investigating astrocytes, neurons, and their relationships, we enable the interactive analysis of the distribution of glycogen, which is of high importance to neuroscientists. We describe the design of Abstractocyte, and present three case studies in which neuroscientists have successfully used our system to assess astrocytic coverage of synapses, glycogen distribution in relation to synapses, and astrocytic-mitochondria coverage.

  13. Scientific Visualization Tools for Enhancement of Undergraduate Research

    Science.gov (United States)

    Rodriguez, W. J.; Chaudhury, S. R.

    2001-05-01

    Undergraduate research projects that utilize remote sensing satellite instrument data to investigate atmospheric phenomena pose many challenges. A significant challenge is processing large amounts of multi-dimensional data. Remote sensing data initially requires mining; filtering of undesirable spectral, instrumental, or environmental features; and subsequently sorting and reformatting to files for easy and quick access. The data must then be transformed according to the needs of the investigation(s) and displayed for interpretation. These multidimensional datasets require views that can range from two-dimensional plots to multivariable-multidimensional scientific visualizations with animations. Science undergraduate students generally find these data processing tasks daunting. Generally, researchers are required to fully understand the intricacies of the dataset and write computer programs or rely on commercially available software, which may not be trivial to use. In the time that undergraduate researchers have available for their research projects, learning the data formats, programming languages, and/or visualization packages is impractical. When dealing with large multi-dimensional data sets appropriate Scientific Visualization tools are imperative in allowing students to have a meaningful and pleasant research experience, while producing valuable scientific research results. The BEST Lab at Norfolk State University has been creating tools for multivariable-multidimensional analysis of Earth Science data. EzSAGE and SAGE4D have been developed to sort, analyze and visualize SAGE II (Stratospheric Aerosol and Gas Experiment) data with ease. Three- and four-dimensional visualizations in interactive environments can be produced. EzSAGE provides atmospheric slices in three-dimensions where the researcher can change the scales in the three-dimensions, color tables and degree of smoothing interactively to focus on particular phenomena. SAGE4D provides a navigable

  14. MindSeer: a portable and extensible tool for visualization of structural and functional neuroimaging data

    Directory of Open Access Journals (Sweden)

    Brinkley James F

    2007-10-01

    Full Text Available Abstract Background Three-dimensional (3-D visualization of multimodality neuroimaging data provides a powerful technique for viewing the relationship between structure and function. A number of applications are available that include some aspect of 3-D visualization, including both free and commercial products. These applications range from highly specific programs for a single modality, to general purpose toolkits that include many image processing functions in addition to visualization. However, few if any of these combine both stand-alone and remote multi-modality visualization in an open source, portable and extensible tool that is easy to install and use, yet can be included as a component of a larger information system. Results We have developed a new open source multimodality 3-D visualization application, called MindSeer, that has these features: integrated and interactive 3-D volume and surface visualization, Java and Java3D for true cross-platform portability, one-click installation and startup, integrated data management to help organize large studies, extensibility through plugins, transparent remote visualization, and the ability to be integrated into larger information management systems. We describe the design and implementation of the system, as well as several case studies that demonstrate its utility. These case studies are available as tutorials or demos on the associated website: http://sig.biostr.washington.edu/projects/MindSeer. Conclusion MindSeer provides a powerful visualization tool for multimodality neuroimaging data. Its architecture and unique features also allow it to be extended into other visualization domains within biomedicine.

  15. Living Color Frame System: PC graphics tool for data visualization

    Science.gov (United States)

    Truong, Long V.

    1993-01-01

    Living Color Frame System (LCFS) is a personal computer software tool for generating real-time graphics applications. It is highly applicable for a wide range of data visualization in virtual environment applications. Engineers often use computer graphics to enhance the interpretation of data under observation. These graphics become more complicated when 'run time' animations are required, such as found in many typical modern artificial intelligence and expert systems. Living Color Frame System solves many of these real-time graphics problems.

  16. VRML and Collaborative Environments: New Tools for Networked Visualization

    Science.gov (United States)

    Crutcher, R. M.; Plante, R. L.; Rajlich, P.

    We present two new applications that engage the network as a tool for astronomical research and/or education. The first is a VRML server which allows users over the Web to interactively create three-dimensional visualizations of FITS images contained in the NCSA Astronomy Digital Image Library (ADIL). The server's Web interface allows users to select images from the ADIL, fill in processing parameters, and create renderings featuring isosurfaces, slices, contours, and annotations; the often extensive computations are carried out on an NCSA SGI supercomputer server without the user having an individual account on the system. The user can then download the 3D visualizations as VRML files, which may be rotated and manipulated locally on virtually any class of computer. The second application is the ADILBrowser, a part of the NCSA Horizon Image Data Browser Java package. ADILBrowser allows a group of participants to browse images from the ADIL within a collaborative session. The collaborative environment is provided by the NCSA Habanero package which includes text and audio chat tools and a white board. The ADILBrowser is just an example of a collaborative tool that can be built with the Horizon and Habanero packages. The classes provided by these packages can be assembled to create custom collaborative applications that visualize data either from local disk or from anywhere on the network.

  17. Tools and procedures for visualization of proteins and other biomolecules.

    Science.gov (United States)

    Pan, Lurong; Aller, Stephen G

    2015-04-01

    Protein, peptides, and nucleic acids are biomolecules that drive biological processes in living organisms. An enormous amount of structural data for a large number of these biomolecules has been described with atomic precision in the form of structural "snapshots" that are freely available in public repositories. These snapshots can help explain how the biomolecules function, the nature of interactions between multi-molecular complexes, and even how small-molecule drugs can modulate the biomolecules for clinical benefits. Furthermore, these structural snapshots serve as inputs for sophisticated computer simulations to turn the biomolecules into moving, "breathing" molecular machines for understanding their dynamic properties in real-time computer simulations. In order for the researcher to take advantage of such a wealth of structural data, it is necessary to gain competency in the use of computer molecular visualization tools for exploring the structures and visualizing three-dimensional spatial representations. Here, we present protocols for using two common visualization tools--the Web-based Jmol and the stand-alone PyMOL package--as well as a few examples of other popular tools. Copyright © 2015 John Wiley & Sons, Inc.

  18. STRING 3: An Advanced Groundwater Flow Visualization Tool

    Science.gov (United States)

    Schröder, Simon; Michel, Isabel; Biedert, Tim; Gräfe, Marius; Seidel, Torsten; König, Christoph

    2016-04-01

    The visualization of 3D groundwater flow is a challenging task. Previous versions of our software STRING [1] focused solely on intuitive visualization of complex flow scenarios for non-professional audiences. STRING, developed by Fraunhofer ITWM (Kaiserslautern, Germany) and delta h Ingenieurgesellschaft mbH (Witten, Germany), provides the necessary means for visualization of both 2D and 3D data on planar and curved surfaces. In this contribution we discuss how to extend this approach to a full 3D tool and the challenges involved, in continuation of Michel et al. [2]. This elevates STRING from a post-production to an exploration tool for experts. In STRING, moving pathlets provide an intuition of the velocity and direction of both steady-state and transient flows. The visualization concept is based on the Lagrangian view of the flow. To capture every detail of the flow, an advanced method for intelligent, time-dependent seeding is used, building on the Finite Pointset Method (FPM) developed by Fraunhofer ITWM. Lifting our visualization approach from 2D into 3D poses many new challenges. With the implementation of a seeding strategy for 3D, one of the major problems has already been solved (see Schröder et al. [3]). As pathlets only provide an overview of the velocity field, other means are required for the visualization of additional flow properties. We suggest the use of Direct Volume Rendering and isosurfaces for scalar features. In this regard we were able to develop an efficient approach for combining the raytraced rendering of the volume with regular OpenGL geometries. This is achieved through the use of Depth Peeling or A-Buffers for the rendering of transparent geometries. Animation of pathlets requires a strict boundary of the simulation domain. Hence, STRING needs to extract the boundary, even from unstructured data, if it is not provided. In 3D we additionally need a good visualization of the boundary itself. For this, the silhouette based on the angle of

  19. Visual color matching system based on RGB LED light source

    Science.gov (United States)

    Sun, Lei; Huang, Qingmei; Feng, Chen; Li, Wei; Wang, Chaofeng

    2018-01-01

    In order to study the properties and performance of LEDs as RGB primary light sources for color mixture in visual psychophysical experiments, and to find out how LED light sources differ from traditional light sources, a visual color matching experiment system based on LED light sources as RGB primaries has been built. By simulating the traditional experiment of metameric color matching in the CIE 1931 RGB color system, it can be used in visual color matching experiments to obtain a set of spectral tristimulus values, commonly called color-matching functions (CMFs). This system consists of three parts: a monochromatic light part using a blazed grating, a light mixing part where the summation of the three LED illuminations is to be visually matched with a monochromatic illumination, and a visual observation part. The three narrow-band LEDs used have dominant wavelengths of 640 nm (red), 522 nm (green) and 458 nm (blue), respectively, and their intensities can be controlled independently. After calibration of the wavelength and luminance of the LED sources with a spectrophotometer, a series of visual color matching experiments was carried out by 5 observers. The results are compared with those from the CIE 1931 RGB color system, and have been used to compute an average locus for the spectral colors in the color triangle, with white at the center. It is shown that the use of LEDs is feasible and has the advantages of easy control, good stability and low cost.
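
    As a conceptual illustration of metameric matching with three primaries (not the experimental rig described above; all tristimulus values below are hypothetical), the LED intensities that match a target are the solution of a 3x3 linear system, since the tristimulus values of a mixture are linear in the primaries' intensities:

        import numpy as np

        # Hypothetical (X, Y, Z) tristimulus values of each LED primary at unit drive.
        primaries = np.array([
            [0.45, 0.22, 0.00],   # red LED (~640 nm)
            [0.18, 0.65, 0.08],   # green LED (~522 nm)
            [0.16, 0.07, 0.85],   # blue LED (~458 nm)
        ]).T                      # columns = primaries, rows = X, Y, Z

        target = np.array([0.30, 0.40, 0.25])   # hypothetical monochromatic target

        # Mixture tristimulus = primaries @ weights, so matching means solving for weights.
        weights = np.linalg.solve(primaries, target)
        print("required LED intensities (R, G, B):", np.round(weights, 3))

        # A negative weight means the target lies outside the LED gamut: that primary
        # would have to be added to the target side, which is the origin of the
        # negative lobes in measured color-matching functions.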

  20. Statistical and Visualization Data Mining Tools for Foundry Production

    Directory of Open Access Journals (Sweden)

    M. Perzyk

    2007-07-01

    Full Text Available In recent years, a rapid development of a new, interdisciplinary knowledge area called data mining has been observed. Its main task is extracting useful information from previously collected large amounts of data. The main possibilities and potential applications of data mining in the manufacturing industry are characterized. The main types of data mining techniques are briefly discussed, including statistical, artificial intelligence, database and visualization tools. The statistical and visualization methods are presented in more detail, showing their general possibilities and advantages as well as characteristic examples of applications in foundry production. Results of the author's research are presented, aimed at validation of selected statistical tools which can be easily and effectively used in the manufacturing industry. A performance analysis of ANOVA- and contingency-table-based methods, dedicated to determining the most significant process parameters as well as to detecting possible interactions among them, has been made. Several numerical tests have been performed using simulated data sets with assumed hidden relationships, as well as some real data related to the strength of ductile cast iron collected in a foundry. It is concluded that the statistical methods offer relatively easy and fairly reliable tools for extracting that type of knowledge about foundry manufacturing processes. However, further research is needed, aimed at explaining some imperfections of the investigated tools as well as assessing their validity for more complex tasks.
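
    As an illustration of the two statistical techniques named above (not the author's own implementation, and using purely hypothetical foundry data), the following Python sketch ranks a process parameter by one-way ANOVA and checks two categorical parameters for dependence with a contingency table:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        # Hypothetical tensile-strength samples grouped by three levels of one
        # process parameter (e.g. a pouring-temperature class).
        low    = rng.normal(400.0, 20.0, 30)
        medium = rng.normal(420.0, 20.0, 30)
        high   = rng.normal(430.0, 20.0, 30)

        f_stat, p_anova = stats.f_oneway(low, medium, high)
        print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.4f}")   # small p -> significant parameter

        # Hypothetical counts of defective/sound castings for two levels of a second
        # categorical parameter; the chi-square test flags a possible dependence.
        table = np.array([[30, 10],
                          [12, 28]])
        chi2, p_chi2, dof, expected = stats.chi2_contingency(table)
        print(f"Chi-square: chi2 = {chi2:.2f}, dof = {dof}, p = {p_chi2:.4f}")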

  1. The Exercise: An Exercise Generator Tool for the SOURCe Project

    Science.gov (United States)

    Kakoyianni-Doa, Fryni; Tziafa, Eleni; Naskos, Athanasios

    2016-01-01

    The Exercise, an Exercise generator in the SOURCe project, is a tool that complements the properties and functionalities of the SOURCe project, which includes the search engine for the Searchable Online French-Greek parallel corpus for the UniveRsity of Cyprus (SOURCe) (Kakoyianni-Doa & Tziafa, 2013), the PENCIL (an alignment tool)…

  2. 3D Immersive Visualization: An Educational Tool in Geosciences

    Science.gov (United States)

    Pérez-Campos, N.; Cárdenas-Soto, M.; Juárez-Casas, M.; Castrejón-Pineda, R.

    2007-05-01

    3D immersive visualization is an innovative tool currently used in various disciplines, such as medicine, architecture, engineering, video games, etc. Recently, the Universidad Nacional Autónoma de México (UNAM) mounted a visualization theater (Ixtli) with leading edge technology, for academic and research purposes that require immersive 3D tools for a better understanding of the concepts involved. The Division of Engineering in Earth Sciences of the School of Engineering, UNAM, is running a project focused on visualization of geoscience data. Its objective is to incorporate educational material in geoscience courses in order to support and improve the teaching-learning process, especially for topics that students are known to find difficult. As part of the project, professors and students are trained in visualization techniques, then their data are adapted and visualized in Ixtli as part of a class or a seminar, where all the attendants can interact, not only among each other but also with the object under study. As part of our results, we present specific examples used in basic geophysics courses, such as interpreted seismic cubes, seismic-wave propagation models, and structural models from bathymetric, gravimetric and seismological data; as well as examples from ongoing applied projects, such as a modeled SH upward wave, the occurrence of an earthquake cluster in 1999 in the Popocatepetl volcano, and a risk atlas from Delegación Alvaro Obregón in Mexico City. All these examples, plus those to come, constitute a library for students and professors willing to explore another dimension of the teaching-learning process. Furthermore, this experience can be enhanced by rich discussions and interactions via videoconference with other universities and researchers.

  3. Intervene: a tool for intersection and visualization of multiple gene or genomic region sets.

    Science.gov (United States)

    Khan, Aziz; Mathelier, Anthony

    2017-05-31

    A common task for scientists is comparing lists of genes or genomic regions derived from high-throughput sequencing experiments. While several tools exist to intersect and visualize sets of genes, similar tools dedicated to the visualization of genomic region sets are currently limited. To address this gap, we have developed the Intervene tool, which provides an easy and automated interface for the effective intersection and visualization of genomic region or list sets, thus facilitating their analysis and interpretation. Intervene contains three modules: venn to generate Venn diagrams of up to six sets, upset to generate UpSet plots of multiple sets, and pairwise to compute and visualize intersections of multiple sets as clustered heat maps. Intervene, and its interactive web ShinyApp companion, generate publication-quality figures for the interpretation of genomic region and list sets. Intervene and its web application companion provide an easy command line and an interactive web interface to compute intersections of multiple genomic and list sets. They have the capacity to plot intersections using easy-to-interpret visual approaches. Intervene is developed and designed to meet the needs of both computer scientists and biologists. The source code is freely available at https://bitbucket.org/CBGR/intervene, with the web application available at https://asntech.shinyapps.io/intervene.
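
    The following Python sketch is an illustration only, not Intervene's source code or command-line interface: it computes the kind of pairwise intersection matrix that the pairwise module renders as a clustered heat map, with toy region sets standing in for BED files:

        def overlaps(a, b):
            """True if two (chrom, start, end) regions overlap."""
            return a[0] == b[0] and a[1] < b[2] and b[1] < a[2]

        def intersection_count(set_a, set_b):
            """Number of regions in set_a overlapping at least one region of set_b."""
            return sum(any(overlaps(r, s) for s in set_b) for r in set_a)

        region_sets = {                                  # toy stand-ins for BED files
            "TF_A": {("chr1", 100, 200), ("chr1", 500, 600), ("chr2", 50, 150)},
            "TF_B": {("chr1", 150, 250), ("chr2", 400, 500)},
            "TF_C": {("chr1", 580, 650), ("chr2", 60, 120)},
        }

        names = list(region_sets)
        matrix = [[intersection_count(region_sets[a], region_sets[b]) for b in names]
                  for a in names]
        for name, row in zip(names, matrix):
            print(name, row)          # this matrix is what a clustered heat map displays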

  4. VisIt: An End-User Tool for Visualizing and Analyzing Very Large Data

    Energy Technology Data Exchange (ETDEWEB)

    Childs, Hank; Brugger, Eric; Whitlock, Brad; Meredith, Jeremy; Ahern, Sean; Pugmire, David; Biagas, Kathleen; Miller, Mark; Weber, Gunther H.; Krishnan, Hari; Fogal, Thomas; Sanderson, Allen; Garth, Christoph; Bethel, E. Wes; Camp, David; Ruebel, Oliver; Durant, Marc; Favre, Jean; Navratil, Paul

    2012-11-01

    VisIt is a popular open source tool for visualizing and analyzing big data. It owes its success to its foci of increasing data understanding, large data support, and providing a robust and usable product, as well as its underlying design that fits today's supercomputing landscape. This report, which draws heavily from an earlier publication at the SciDAC Conference in 2011, describes the VisIt project and its accomplishments.

  5. Visual Data Comm: A Tool for Visualizing Data Communication in the Multi Sector Planner Study

    Science.gov (United States)

    Lee, Hwasoo Eric

    2010-01-01

    Data comm is a new technology proposed for the future air transport system as a potential tool to provide comprehensive data connectivity. It is a key enabler for managing 4D trajectories digitally, potentially resulting in improved flight times and increased throughput. Future concepts with data comm integration have been tested in a number of human-in-the-loop studies, but analyzing the results has proven to be particularly challenging because the future traffic environment in which data comm is fully enabled assumes high traffic density, resulting in data sets with large amounts of information. This paper describes the motivation, design, and current and potential future applications of Visual Data Comm (VDC), a tool for visualizing data developed in Java using the Processing library, a package designed for interactive visualization programming. This paper includes an example of an application of VDC to data from the most recent Multi Sector Planner study, conducted at NASA's Airspace Operations Laboratory in 2009, in which VDC was used to visualize and interpret data comm activities.

  6. Three-dimensional visualization of ensemble weather forecasts – Part 1: The visualization tool Met.3D (version 1.0)

    Directory of Open Access Journals (Sweden)

    M. Rautenhaus

    2015-07-01

    Full Text Available We present "Met.3D", a new open-source tool for the interactive three-dimensional (3-D) visualization of numerical ensemble weather predictions. The tool has been developed to support weather forecasting during aircraft-based atmospheric field campaigns; however, it is applicable to further forecasting, research and teaching activities. Our work approaches challenging topics related to the visual analysis of numerical atmospheric model output – 3-D visualization, ensemble visualization and how both can be used in a meaningful way suited to weather forecasting. Met.3D builds a bridge from proven 2-D visualization methods commonly used in meteorology to 3-D visualization by combining both visualization types in a 3-D context. We address the issue of spatial perception in the 3-D view and present approaches to using the ensemble to allow the user to assess forecast uncertainty. Interactivity is key to our approach. Met.3D uses modern graphics technology to achieve interactive visualization on standard consumer hardware. The tool supports forecast data from the European Centre for Medium Range Weather Forecasts (ECMWF) and can operate directly on ECMWF hybrid sigma-pressure level grids. We describe the employed visualization algorithms, and analyse the impact of the ECMWF grid topology on computing 3-D ensemble statistical quantities. Our techniques are demonstrated with examples from the T-NAWDEX-Falcon 2012 (THORPEX – North Atlantic Waveguide and Downstream Impact Experiment) campaign.
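
    A small sketch, independent of Met.3D itself and using toy coefficient values, of why the hybrid grid matters for ensemble statistics: pressure on a hybrid sigma-pressure model level follows p_k = a_k + b_k * p_s, so it varies per grid column and has to be computed before level-consistent ensemble means and spreads are formed:

        import numpy as np

        # Toy hybrid coefficients for 3 model levels and a 2 x 2 surface-pressure field.
        a_k = np.array([0.0, 2000.0, 5000.0])        # Pa
        b_k = np.array([0.2, 0.5, 0.9])              # dimensionless
        p_surf = np.full((2, 2), 101325.0)           # Pa

        # Pressure at every model level and grid column: shape (level, y, x).
        pressure = a_k[:, None, None] + b_k[:, None, None] * p_surf[None, :, :]

        # Hypothetical ensemble of a forecast variable: shape (member, level, y, x).
        temperature = 280.0 + np.random.default_rng(1).normal(0.0, 1.5, (4, 3, 2, 2))

        # Grid-point ensemble statistics of the kind used for uncertainty display.
        ens_mean = temperature.mean(axis=0)
        ens_std = temperature.std(axis=0, ddof=1)
        print(pressure[:, 0, 0], ens_mean[:, 0, 0], ens_std[:, 0, 0])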

  7. A survey of open source tools for business intelligence

    DEFF Research Database (Denmark)

    Thomsen, Christian; Pedersen, Torben Bach

    2005-01-01

    The industrial use of open source Business Intelligence (BI) tools is not yet common. It is therefore of interest to explore which possibilities are available for open source BI and compare the tools. In this survey paper, we consider the capabilities of a number of open source tools for BI....... In the paper, we consider three Extract-Transform-Load (ETL) tools, three On-Line Analytical Processing (OLAP) servers, two OLAP clients, and four database management systems (DBMSs). Further, we describe the licenses that the products are released under. It is argued that the ETL tools are still not very...

  8. Open Source Next Generation Visualization Software for Interplanetary Missions

    Science.gov (United States)

    Trimble, Jay; Rinker, George

    2016-01-01

    Mission control is evolving quickly, driven by the requirements of new missions, and enabled by modern computing capabilities. Distributed operations, access to data anywhere, data visualization for spacecraft analysis that spans multiple data sources, flexible reconfiguration to support multiple missions, and operator use cases are driving the need for new capabilities. NASA's Advanced Multi-Mission Operations System (AMMOS), Ames Research Center (ARC) and the Jet Propulsion Laboratory (JPL) are collaborating to build a new generation of mission operations software for visualization, to enable mission control anywhere, on the desktop, tablet and phone. The software is built on an open source platform that is open for contributions (http://nasa.github.io/openmct).

  9. Allen Brain Atlas-Driven Visualizations: a web-based gene expression energy visualization tool.

    Science.gov (United States)

    Zaldivar, Andrew; Krichmar, Jeffrey L

    2014-01-01

    The Allen Brain Atlas-Driven Visualizations (ABADV) is a publicly accessible web-based tool created to retrieve and visualize expression energy data from the Allen Brain Atlas (ABA) across multiple genes and brain structures. Though the ABA offers their own search engine and software for researchers to view their growing collection of online public data sets, including extensive gene expression and neuroanatomical data from human and mouse brain, many of their tools limit the amount of genes and brain structures researchers can view at once. To complement their work, ABADV generates multiple pie charts, bar charts and heat maps of expression energy values for any given set of genes and brain structures. Such a suite of free and easy-to-understand visualizations allows for easy comparison of gene expression across multiple brain areas. In addition, each visualization links back to the ABA so researchers may view a summary of the experimental detail. ABADV is currently supported on modern web browsers and is compatible with expression energy data from the Allen Mouse Brain Atlas in situ hybridization data. By creating this web application, researchers can immediately obtain and survey numerous amounts of expression energy data from the ABA, which they can then use to supplement their work or perform meta-analysis. In the future, we hope to enable ABADV across multiple data resources.

  10. Allen Brain Atlas-Driven Visualizations: A Web-Based Gene Expression Energy Visualization Tool

    Directory of Open Access Journals (Sweden)

    Andrew eZaldivar

    2014-05-01

    Full Text Available The Allen Brain Atlas-Driven Visualizations (ABADV) is a publicly accessible web-based tool created to retrieve and visualize expression energy data from the Allen Brain Atlas (ABA) across multiple genes and brain structures. Though the ABA offers their own search engine and software for researchers to view their growing collection of online public data sets, including extensive gene expression and neuroanatomical data from human and mouse brain, many of their tools limit the amount of genes and brain structures researchers can view at once. To complement their work, ABADV generates multiple pie charts, bar charts and heat maps of expression energy values for any given set of genes and brain structures. Such a suite of free and easy-to-understand visualizations allows for easy comparison of gene expression across multiple brain areas. In addition, each visualization links back to the ABA so researchers may view a summary of the experimental detail. ABADV is currently supported on modern web browsers and is compatible with expression energy data from the Allen Mouse Brain Atlas in situ hybridization data. By creating this web application, researchers can immediately obtain and survey numerous amounts of expression energy data from the ABA, which they can then use to supplement their work or perform meta-analysis. In the future, we hope to enable ABADV across multiple data resources.

  11. IVisTMSA: Interactive Visual Tools for Multiple Sequence Alignments.

    Science.gov (United States)

    Pervez, Muhammad Tariq; Babar, Masroor Ellahi; Nadeem, Asif; Aslam, Naeem; Naveed, Nasir; Ahmad, Sarfraz; Muhammad, Shah; Qadri, Salman; Shahid, Muhammad; Hussain, Tanveer; Javed, Maryam

    2015-01-01

    IVisTMSA is a software package of seven graphical tools for multiple sequence alignments. MSApad is an editing and analysis tool. It can load 409% more data than Jalview, STRAP, CINEMA, and Base-by-Base. MSA comparator allows the user to visualize consistent and inconsistent regions of reference and test alignments of more than 21 MB in size in less than 12 seconds. MSA comparator is 5,200% and more than 40% more efficient than the BAliBASE C program and FastSP, respectively. The MSA reconstruction tool provides graphical user interfaces for four popular aligners and allows the user to load several sequence files at a time. The FASTA generator converts seven formats of alignments of unlimited size into FASTA format in a few seconds. The MSA ID calculator computes the identity matrix of more than 11,000 sequences with a sequence length of 2,696 base pairs in less than 100 seconds. The Tree and Distance Matrix calculation tools generate a phylogenetic tree and a distance matrix, respectively, using neighbor joining, % identity and the BLOSUM 62 matrix.
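
    For readers unfamiliar with identity matrices for alignments, the following minimal Python sketch (not IVisTMSA code; the sequences are hypothetical) shows the pairwise percent-identity computation such a tool performs:

        aligned = {                       # hypothetical aligned sequences of equal length
            "seq1": "ATG-CTGA",
            "seq2": "ATGACTGA",
            "seq3": "ATG-CTCA",
        }

        def percent_identity(a, b):
            compared = matches = 0
            for x, y in zip(a, b):
                if x == "-" and y == "-":
                    continue              # skip columns gapped in both sequences
                compared += 1
                matches += (x == y)
            return 100.0 * matches / compared if compared else 0.0

        names = list(aligned)
        for i in names:
            row = [round(percent_identity(aligned[i], aligned[j]), 1) for j in names]
            print(i, row)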

  12. A Survey of Open Source Tools for Business Intelligence

    DEFF Research Database (Denmark)

    Thomsen, Christian; Pedersen, Torben Bach

    2009-01-01

    The industrial use of open source Business Intelligence (BI) tools is becoming more common, but is still not as widespread as for other types of software. It is therefore of interest to explore which possibilities are available for open source BI and compare the tools. In this survey paper, we co...

  13. Open Source for Knowledge and Learning Management: Strategies beyond Tools

    Science.gov (United States)

    Lytras, Miltiadis, Ed.; Naeve, Ambjorn, Ed.

    2007-01-01

    In the last years, knowledge and learning management have made a significant impact on the IT research community. "Open Source for Knowledge and Learning Management: Strategies Beyond Tools" presents learning and knowledge management from a point of view where the basic tools and applications are provided by open source technologies.…

  14. Helioviewer: A Web 2.0 Tool for Visualizing Heterogeneous Heliophysics Data

    Science.gov (United States)

    Hughitt, V. K.; Ireland, J.; Lynch, M. J.; Schmeidel, P.; Dimitoglou, G.; Müeller, D.; Fleck, B.

    2008-12-01

    Solar physics datasets are becoming larger, richer, more numerous and more distributed. Feature/event catalogs (describing objects of interest in the original data) are becoming important tools in navigating these data. In the wake of this increasing influx of data and catalogs there has been a growing need for highly sophisticated tools for accessing and visualizing this wealth of information. Helioviewer is a novel tool for integrating and visualizing disparate sources of solar and Heliophysics data. Taking advantage of the newly available power of modern web application frameworks, Helioviewer merges image and feature catalog data, and provides for Heliophysics data a familiar interface not unlike Google Maps or MapQuest. In addition to streamlining the process of combining heterogeneous Heliophysics datatypes such as full-disk images and coronagraphs, the inclusion of visual representations of automated and human-annotated features provides the user with an integrated and intuitive view of how different factors may be interacting on the Sun. Currently, Helioviewer offers images from The Extreme ultraviolet Imaging Telescope (EIT), The Large Angle and Spectrometric COronagraph experiment (LASCO) and the Michelson Doppler Imager (MDI) instruments onboard The Solar and Heliospheric Observatory (SOHO), as well as The Transition Region and Coronal Explorer (TRACE). Helioviewer also incorporates feature/event information from the LASCO CME List, NOAA Active Regions, CACTus CME and Type II Radio Bursts feature/event catalogs. The project is undergoing continuous development with many more data sources and additional functionality planned for the near future.

  15. VisBOL: Web-Based Tools for Synthetic Biology Design Visualization.

    Science.gov (United States)

    McLaughlin, James Alastair; Pocock, Matthew; Mısırlı, Göksel; Madsen, Curtis; Wipat, Anil

    2016-08-19

    VisBOL is a Web-based application that allows the rendering of genetic circuit designs, enabling synthetic biologists to visually convey designs in SBOL visual format. VisBOL designs can be exported to formats including PNG and SVG images to be embedded in Web pages, presentations and publications. The VisBOL tool enables the automated generation of visualizations from designs specified using the Synthetic Biology Open Language (SBOL) version 2.0, as well as a range of well-known bioinformatics formats including GenBank and Pigeoncad notation. VisBOL is provided both as a user accessible Web site and as an open-source (BSD) JavaScript library that can be used to embed diagrams within other content and software.

  16. Open Source Approach to Project Management Tools

    Directory of Open Access Journals (Sweden)

    Romeo MARGEA

    2011-01-01

    Full Text Available Managing large projects involving different groups of people and complex tasks can be challenging. The solution is to use project management software, which allows more efficient management of projects. However, well-known project management systems can be costly and may require expensive custom servers. Even if free software is not as complex as Microsoft Project, it is worth noting that not all projects need all the features, amenities and power of such systems. There are free and open source software alternatives that meet the needs of most projects and that allow Web access from different platforms and locations. A starting point in adopting OSS in-house is finding and identifying existing open source solutions. In this paper we present an overview of Open Source Project Management Software (OSPMS), based on articles, reviews, books and developers’ web sites, covering those that seem to be the most popular software in this category.

  17. Tools and Methods for Visualization of Mesoscale Ocean Eddies

    Science.gov (United States)

    Bemis, K. G.; Liu, L.; Silver, D.; Kang, D.; Curchitser, E.

    2017-12-01

    Mesoscale ocean eddies form in the Gulf Stream and transport heat and nutrients across the ocean basin. The internal structure of these three-dimensional eddies and the kinematics with which they move are critical to a full understanding of their transport capacity. A series of visualization tools have been developed to extract, characterize, and track ocean eddies from 3D modeling results, to visually show the ocean eddy story by applying various illustrative visualization techniques, and to interactively view results stored on a server from a conventional browser. In this work, we apply a feature-based method to track instances of ocean eddies through the time steps of a high-resolution multidecadal regional ocean model and generate a series of eddy paths which reflect the life cycle of individual eddy instances. The basic method uses the Okubo-Weiss parameter to define eddy cores but could be adapted to alternative specifications of an eddy. Stored results include pixel-lists for each eddy instance, tracking metadata for eddy paths, and physical and geometric properties. In the simplest view, isosurfaces are used to display eddies along an eddy path. Individual eddies can then be selected and viewed independently or an eddy path can be viewed in the context of all eddy paths (longer than a specified duration) and the ocean basin. To tell the story of mesoscale ocean eddies, we combined illustrative visualization techniques, including visual effectiveness enhancement, focus+context, and smart visibility, with the extracted volume features to explore eddy characteristics at multiple scales from ocean basin to individual eddy. An evaluation by domain experts indicates that combining our feature-based techniques with illustrative visualization techniques provides insight into the role eddies play in ocean circulation. A web-based GUI is under development to facilitate easy viewing of stored results. The GUI provides the user control to choose amongst available
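
    A hedged sketch of the core-detection step described above (not the authors' implementation; the velocity field and threshold are hypothetical): the Okubo-Weiss parameter W = s_n^2 + s_s^2 - omega^2 is computed from horizontal velocity gradients, and strongly negative values are flagged as candidate eddy cores:

        import numpy as np

        # Hypothetical gridded velocities u(y, x), v(y, x) on a uniform 1 km grid.
        rng = np.random.default_rng(2)
        u = rng.normal(0.0, 0.1, (50, 50))
        v = rng.normal(0.0, 0.1, (50, 50))
        dx = dy = 1000.0                                  # metres

        du_dy, du_dx = np.gradient(u, dy, dx)
        dv_dy, dv_dx = np.gradient(v, dy, dx)

        normal_strain = du_dx - dv_dy
        shear_strain  = dv_dx + du_dy
        vorticity     = dv_dx - du_dy

        W = normal_strain**2 + shear_strain**2 - vorticity**2

        # One common convention: eddy cores where W is well below zero, e.g.
        # W < -0.2 * std(W); connected regions of this mask give the pixel lists
        # that can then be tracked through time steps.
        core_mask = W < -0.2 * W.std()
        print("candidate eddy-core points:", int(core_mask.sum()))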

  18. Tracking PACS usage with open source tools.

    Science.gov (United States)

    French, Todd L; Langer, Steve G

    2011-08-01

    A typical choice faced by Picture Archiving and Communication System (PACS) administrators is deciding how many PACS workstations are needed and where they should be sited. Oftentimes, the social consequences of having too few are severe enough to encourage oversupply and underutilization. This is costly, at best in terms of hardware and electricity, and at worst (depending on the PACS licensing and support model) in capital costs and maintenance fees. The PACS administrator needs tools to accurately assess the use to which her fleet is being subjected, and thus make informed choices before buying more workstations. Lacking a vended solution for this challenge, we developed our own.

  19. Tools for visualization of phosphoinositides in the cell nucleus.

    Science.gov (United States)

    Kalasova, Ilona; Fáberová, Veronika; Kalendová, Alžběta; Yildirim, Sukriye; Uličná, Lívia; Venit, Tomáš; Hozák, Pavel

    2016-04-01

    Phosphoinositides (PIs) are glycerol-based phospholipids containing hydrophilic inositol ring. The inositol ring is mono-, bis-, or tris-phosphorylated yielding seven PIs members. Ample evidence shows that PIs localize both to the cytoplasm and to the nucleus. However, tools for direct visualization of nuclear PIs are limited and many studies thus employ indirect approaches, such as staining of their metabolic enzymes. Since localization and mobility of PIs differ from their metabolic enzymes, these approaches may result in incomplete data. In this paper, we tested commercially available PIs antibodies by light microscopy on fixed cells, tested their specificity using protein-lipid overlay assay and blocking assay, and compared their staining patterns. Additionally, we prepared recombinant PIs-binding domains and tested them on both fixed and live cells by light microscopy. The results provide a useful overview of usability of the tools tested and stress that the selection of adequate tools is critical. Knowing the localization of individual PIs in various functional compartments should enable us to better understand the roles of PIs in the cell nucleus.

  20. Visual attention: Linking prefrontal sources to neuronal and behavioral correlates.

    Science.gov (United States)

    Clark, Kelsey; Squire, Ryan Fox; Merrikhi, Yaser; Noudoost, Behrad

    2015-09-01

    Attention is a means of flexibly selecting and enhancing a subset of sensory input based on the current behavioral goals. Numerous signatures of attention have been identified throughout the brain, and now experimenters are seeking to determine which of these signatures are causally related to the behavioral benefits of attention, and the source of these modulations within the brain. Here, we review the neural signatures of attention throughout the brain, their theoretical benefits for visual processing, and their experimental correlations with behavioral performance. We discuss the importance of measuring cue benefits as a way to distinguish between impairments on an attention task, which may instead be visual or motor impairments, and true attentional deficits. We examine evidence for various areas proposed as sources of attentional modulation within the brain, with a focus on the prefrontal cortex. Lastly, we look at studies that aim to link sources of attention to its neuronal signatures elsewhere in the brain. Copyright © 2015. Published by Elsevier Ltd.

  1. An Earthquake Information Service with Free and Open Source Tools

    Science.gov (United States)

    Schroeder, M.; Stender, V.; Jüngling, S.

    2015-12-01

    At the GFZ German Research Centre for Geosciences in Potsdam, the working group Earthquakes and Volcano Physics examines the spatiotemporal behavior of earthquakes. In this context, the hazards of volcanic eruptions and tsunamis are also explored. The aim is to collect related information after the occurrence of such extreme events and make it available to science, and partly to the public, as quickly as possible. The overall objective of this research is to reduce the geological risks that emanate from such natural hazards. In order to meet these objectives, to get a quick overview of the seismicity of a particular region and to compare the situation with historical events, a comprehensive visualization was desired. Based on the web-accessible data from the well-known GFZ GEOFON network, a user-friendly web mapping application was realized. Further, this web service integrates historical and current earthquake information from the USGS earthquake database, as well as more historical events from various other catalogues such as Pacheco and the International Seismological Centre (ISC). This compilation of sources is unique in the Earth sciences. Additionally, information about historical and current occurrences of volcanic eruptions and tsunamis is also retrievable. Another special feature of the application is the ability to restrict the displayed time range via a time-shifting tool: users can interactively vary the visualization by moving the time slider. Furthermore, the application was built using modern JavaScript libraries, which enables it to run on displays and devices of all sizes. Our contribution will present the making of, the architecture behind, and a few examples of the look and feel of this application.

  2. Building Eclectic Personal Learning Landscapes with Open Source Tools

    NARCIS (Netherlands)

    Kalz, Marco

    2008-01-01

    Kalz, M. (2005). Building Eclectic Personal Learning Landscapes with Open Source Tools. In F. de Vries, G. Attwell, R. Elferink & A. Tödt (Eds.), Open Source for Education in Europe. Research & Practice (= Proceedings of the Open Source for Education in Europe Conference) (pp. 163-168). 2005,

  3. Building Eclectic Personal Learning Landscapes with Open Source Tools

    OpenAIRE

    Kalz, Marco

    2008-01-01

    Kalz, M. (2005). Building Eclectic Personal Learning Landscapes with Open Source Tools. In F. de Vries, G. Attwell, R. Elferink & A. Tödt (Eds.), Open Source for Education in Europe. Research & Practice (= Proceedings of the Open Source for Education in Europe Conference) (pp. 163-168). 2005, Heerlen, The Netherlands.

  4. How Formal Dynamic Verification Tools Facilitate Novel Concurrency Visualizations

    Science.gov (United States)

    Aananthakrishnan, Sriram; Delisi, Michael; Vakkalanka, Sarvani; Vo, Anh; Gopalakrishnan, Ganesh; Kirby, Robert M.; Thakur, Rajeev

    With the exploding scale of concurrency, presenting valuable pieces of information collected by formal verification tools intuitively and graphically can greatly enhance concurrent system debugging. Traditional MPI program debuggers present trace views of MPI program executions. Such views are redundant, often containing equivalent traces that permute independent MPI calls. In our ISP formal dynamic verifier for MPI programs, we present a collection of alternate views made possible by the use of formal dynamic verification. Some of ISP’s views help pinpoint errors, some facilitate discerning errors by eliminating redundancy, while others help users understand the program better by displaying concurrent event orderings that must be respected by all MPI implementations, in the form of completes-before graphs. In this paper, we describe ISP’s graphical user interface (GUI) capabilities in all these areas, which are currently supported by a portable Java-based GUI, a Microsoft Visual Studio GUI, and an Eclipse-based GUI whose development is in progress.

  5. Visualization and interaction tools for aerial photograph mosaics

    Science.gov (United States)

    Fernandes, João Pedro; Fonseca, Alexandra; Pereira, Luís; Faria, Adriano; Figueira, Helder; Henriques, Inês; Garção, Rita; Câmara, António

    1997-05-01

    This paper describes the development of a digital spatial library based on mosaics of digital orthophotos, called Interactive Portugal, that will enable users both to retrieve geospatial information existing in the Portuguese National System for Geographic Information World Wide Web server, and to develop local databases connected to the main system. A set of navigation, interaction, and visualization tools are proposed and discussed. They include sketching, dynamic sketching, and navigation capabilities over the digital orthophotos mosaics. Main applications of this digital spatial library are pointed out and discussed, namely for education, professional, and tourism markets. Future developments are considered. These developments are related to user reactions, technological advancements, and projects that also aim at delivering and exploring digital imagery on the World Wide Web. Future capabilities for site selection and change detection are also considered.

  6. Enhancing Nuclear Newcomer Training with 3D Visualization Learning Tools

    International Nuclear Information System (INIS)

    Gagnon, V.

    2016-01-01

    Full text: While the nuclear power industry is trying to reinforce its safety and regain public support post-Fukushima, it is also faced with a very real challenge that affects its day-to-day activities: a rapidly aging workforce. Statistics show that close to 40% of the current nuclear power industry workforce will retire within the next five years. For newcomer countries, the challenge is even greater, as they have to develop a completely new workforce. The workforce replacement effort introduces nuclear newcomers of a new generation with different backgrounds and affinities. Major lifestyle differences between the two generations of workers result, amongst other things, in different learning habits and needs for this new breed of learners. Interactivity, high visual content and quick access to information are now necessary to achieve a high level of retention. To enhance existing training programmes, or to support the establishment of new training programmes in newcomer countries, L-3 MAPPS has devised learning tools focused on the “Practice-by-Doing” principle. L-3 MAPPS has coupled 3D computer visualization with high-fidelity simulation to bring real-time, simulation-driven animated components and systems, allowing immersive and participatory, individual or classroom learning. (author)

  7. EUV source development for high-volume chip manufacturing tools

    Science.gov (United States)

    Stamm, Uwe; Yoshioka, Masaki; Kleinschmidt, Jürgen; Ziener, Christian; Schriever, Guido; Schürmann, Max C.; Hergenhan, Guido; Borisov, Vladimir M.

    2007-03-01

    Xenon-fueled gas discharge produced plasma (DPP) sources were integrated into Micro Exposure Tools already in 2004. Operation of these tools in a research environment gave early learning for the development of EUV sources for Alpha and Beta-Tools. Further experiments with these sources were performed for a basic understanding of EUV source technology and its limits, especially the achievable power and reliability. The intermediate focus power of Alpha-Tool sources under development has been measured at values above 10 W. Debris mitigation schemes were successfully integrated into the sources, leading to reasonable collector mirror lifetimes with a target of 10 billion pulses due to the effective debris flux reduction. Source collector mirrors, which withstand the radiation and temperature load of Xenon-fueled sources, have been developed in cooperation with MediaLario Technologies to support intermediate focus power well above 10 W. To fulfill the requirements for High Volume chip Manufacturing (HVM) applications, a new concept for HVM EUV sources with higher efficiency has been developed at XTREME technologies. The discharge produced plasma (DPP) source concept combines the use of rotating disk electrodes (RDE) with laser-excited droplet targets. This source concept is called the laser-assisted droplet RDE source. The fuel of these sources has been selected to be tin. The conversion efficiency achieved with the laser-assisted droplet RDE source is 2-3x higher than with Xenon. Very high pulse energies well above 200 mJ / 2π sr have been measured with first prototypes of the laser-assisted droplet RDE source. If it is possible to maintain these high pulse energies at higher repetition rates, a 10 kHz EUV source could deliver 2000 W / 2π sr. According to the first experimental data, the new concept is expected to be scalable to an intermediate focus power on the 300 W level.
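
    The quoted 2000 W figure is simply pulse energy multiplied by repetition rate, a back-of-the-envelope check on the numbers given above:

        P = E_{\mathrm{pulse}} \cdot f_{\mathrm{rep}}
          = 0.2\,\mathrm{J}/(2\pi\,\mathrm{sr}) \times 10^{4}\,\mathrm{Hz}
          = 2000\,\mathrm{W}/(2\pi\,\mathrm{sr})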

  8. Mapping as a visual health communication tool: promises and dilemmas.

    Science.gov (United States)

    Parrott, Roxanne; Hopfer, Suellen; Ghetian, Christie; Lengerich, Eugene

    2007-01-01

    In the era of evidence-based public health promotion and planning, the use of maps as a form of evidence to communicate about the multiple determinants of cancer is on the rise. Geographic information systems and mapping technologies make future proliferation of this strategy likely. Yet disease maps as a communication form remain largely unexamined. This content analysis considers the presence of multivariate information, credibility cues, and the communication function of publicly accessible maps for cancer control activities. Thirty-six state comprehensive cancer control plans were publicly available in July 2005 and were reviewed for the presence of maps. Fourteen of the 36 state cancer plans (39%) contained map images (N = 59 static maps). A continuum of map interactivity was observed, with 10 states having interactive mapping tools available to query and map cancer information. Four states had both cancer plans with map images and interactive mapping tools available to the public on their Web sites. Of the 14 state cancer plans that depicted map images, two displayed multivariate data in a single map. Nine of the 10 states with interactive mapping capability offered the option to display multivariate health risk messages. The most frequent content category mapped was cancer incidence and mortality, with stage at diagnosis infrequently available. The most frequent communication function served by the maps reviewed was redundancy, as maps repeated information contained in textual forms. The social and ethical implications for communicating about cancer through the use of visual geographic representations are discussed.

  9. Open source Modeling and optimization tools for Planning

    Energy Technology Data Exchange (ETDEWEB)

    Peles, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in the state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  10. Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT)

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Silva, Claudio [New York Univ. (NYU), NY (United States). Computer Science and Engineering Dept.

    2013-09-30

    For the past three years, a large analysis and visualization effort—funded by the Department of Energy’s Office of Biological and Environmental Research (BER), the National Aeronautics and Space Administration (NASA), and the National Oceanic and Atmospheric Administration (NOAA)—has brought together a wide variety of industry-standard scientific computing libraries and applications to create Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) to serve the global climate simulation and observational research communities. To support interactive analysis and visualization, all components connect through a provenance application–programming interface to capture meaningful history and workflow. Components can be loosely coupled into the framework for fast integration or tightly coupled for greater system functionality and communication with other components. The overarching goal of UV-CDAT is to provide a new paradigm for access to and analysis of massive, distributed scientific data collections by leveraging distributed data architectures located throughout the world. The UV-CDAT framework addresses challenges in analysis and visualization and incorporates new opportunities, including parallelism for better efficiency, higher speed, and more accurate scientific inferences. Today, it provides more than 600 users access to more analysis and visualization products than any other single source.

  11. Deploying web-based visual exploration tools on the grid

    Energy Technology Data Exchange (ETDEWEB)

    Jankun-Kelly, T.J.; Kreylos, Oliver; Shalf, John; Ma, Kwan-Liu; Hamann, Bernd; Joy, Kenneth; Bethel, E. Wes

    2002-02-01

    We discuss a web-based portal for the exploration, encapsulation, and dissemination of visualization results over the Grid. This portal integrates three components: an interface client for structured visualization exploration, a visualization web application to manage the generation and capture of the visualization results, and a centralized portal application server to access and manage grid resources. We demonstrate the usefulness of the developed system using an example for Adaptive Mesh Refinement (AMR) data visualization.

  12. Visual tool for estimating the fractal dimension of images

    Science.gov (United States)

    Grossu, I. V.; Besliu, C.; Rusu, M. V.; Jipa, Al.; Bordeianu, C. C.; Felea, D.

    2009-10-01

    This work presents a new Visual Basic 6.0 application for estimating the fractal dimension of images, based on an optimized version of the box-counting algorithm. In an attempt to separate the real information from "noise", we also considered the family of all band-pass filters with the same bandwidth (specified as a parameter); the fractal dimension can thus be represented as a function of the pixel color code. The program was used for the study of cracks in paintings, as an additional tool which can help the critic decide whether an artistic work is original or not. Program summary: Program title: Fractal Analysis v01. Catalogue identifier: AEEG_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEG_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 29 690. No. of bytes in distributed program, including test data, etc.: 4 967 319. Distribution format: tar.gz. Programming language: MS Visual Basic 6.0. Computer: PC. Operating system: MS Windows 98 or later. RAM: 30M. Classification: 14. Nature of problem: estimating the fractal dimension of images. Solution method: optimized implementation of the box-counting algorithm; use of a band-pass filter for separating the real information from "noise"; user-friendly graphical interface. Restrictions: although various file types can be used, the application was mainly conceived for the 8-bit grayscale Windows bitmap file format. Running time: in a first approximation, the algorithm is linear.
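
    As a minimal sketch of the box-counting technique on which the application is based (written in Python rather than Visual Basic, with a hypothetical test image, and independent of the distributed program):

        import numpy as np

        def box_count(image, size):
            """Number of size x size boxes containing at least one foreground pixel."""
            h, w = image.shape
            count = 0
            for y in range(0, h, size):
                for x in range(0, w, size):
                    if image[y:y + size, x:x + size].any():
                        count += 1
            return count

        # Hypothetical binary test image: a straight line, expected dimension ~ 1.
        img = np.zeros((256, 256), dtype=bool)
        np.fill_diagonal(img, True)

        sizes = np.array([2, 4, 8, 16, 32, 64])
        counts = np.array([box_count(img, s) for s in sizes])

        # The fractal dimension estimate is the slope of log N(s) versus log(1/s).
        slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
        print("estimated fractal dimension:", round(float(slope), 2))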

  13. Open Source and Proprietary Project Management Tools for SMEs.

    Directory of Open Access Journals (Sweden)

    Veronika Abramova

    2017-05-01

    Full Text Available The growth in size and increasing difficulty of project management have promoted the development of different tools that facilitate project management and track project schedules, resources and overall progress. These tools offer a variety of features, from task and time management up to integrated CRM (Customer Relationship Management) and ERP (Enterprise Resource Planning) modules. Currently, a large number of project management software packages are available to assist project teams during the entire project lifecycle. We present the main differences between open source and proprietary project management tools and how these could be important for SMEs, describing the key features and how they can assist the project manager and the development team. In this paper, we analyse four open-source project management tools: OpenProject, ProjectLibre, Redmine and LibrePlan, and four proprietary tools: Bitrix24, JIRA, Microsoft Project and Asana.

  14. EcoBrowser: a web-based tool for visualizing transcriptome data of Escherichia coli

    Directory of Open Access Journals (Sweden)

    Jia Peng

    2011-10-01

    Full Text Available Abstract Background Escherichia coli has been extensively studied as a prokaryotic model organism whose whole genome was determined in 1997. However, it is difficult to identify all the gene products involved in diverse functions by using whole genome sequences alone. High-resolution transcriptome mapping using tiling arrays has proved effective in improving the annotation of transcript units and discovering new transcripts of ncRNAs. While abundant tiling array data have been generated, a need has emerged for appropriate visualization tools to accommodate and integrate multiple sources of data. Findings EcoBrowser is a web-based tool for visualizing genome annotations and transcriptome data of E. coli. Important tiling array data of E. coli from different experimental platforms are collected and processed for query. An AJAX-based genome browser is embedded for visualization. Thus, genome annotations can be compared with transcript profiling and genome occupancy profiling from independent experiments, which will be helpful in discovering new transcripts, including novel mRNAs and ncRNAs, generating a detailed description of the transcription unit architecture, and further providing clues for the investigation of prokaryotic transcriptional regulation, which has proved to be far more complex than previously thought. Conclusions With the help of EcoBrowser, users can get a systemic view from both the vertical and parallel perspectives, as well as inspiration for the design of new experiments which will expand our understanding of the regulation mechanism.

  15. A low complexity visualization tool that helps to perform complex systems analysis

    International Nuclear Information System (INIS)

    Beiro, M G; Alvarez-Hamelin, J I; Busch, J R

    2008-01-01

    In this paper, we present an extension of large network visualization (LaNet-vi), a tool to visualize large scale networks using the k-core decomposition. One of the new features is how vertices compute their angular position. While in the earlier version this was done using shell clusters, in this version we use the angular coordinate of vertices in higher k-shells, and arrange the highest shell according to a clique decomposition. The time complexity goes from O(n√n) to O(n) under bounds on a heavy-tailed degree distribution. The tool also performs a k-core connectivity analysis, highlighting vertices that are not k-connected; e.g., this property is useful to measure robustness or quality of service (QoS) capabilities in communication networks. Finally, the current version of LaNet-vi can draw labels and all the edges using transparencies, yielding an accurate visualization. Based on the obtained figure, it is possible to distinguish different sources and types of complex networks at a glance, in a sort of 'network iris-print'.

  16. A low complexity visualization tool that helps to perform complex systems analysis

    Science.gov (United States)

    Beiró, M. G.; Alvarez-Hamelin, J. I.; Busch, J. R.

    2008-12-01

    In this paper, we present an extension of large network visualization (LaNet-vi), a tool to visualize large scale networks using the k-core decomposition. One of the new features is how vertices compute their angular position. While in the previous version this was done using shell clusters, in this version we use the angular coordinate of vertices in higher k-shells, and arrange the highest shell according to a cliques decomposition. The time complexity goes from O(n√n) to O(n) under bounds on a heavy-tailed degree distribution. The tool also performs a k-core-connectivity analysis, highlighting vertices that are not k-connected; e.g. this property is useful to measure robustness or quality of service (QoS) capabilities in communication networks. Finally, the current version of LaNet-vi can draw labels and all the edges using transparencies, yielding an accurate visualization. Based on the obtained figure, it is possible to distinguish different sources and types of complex networks at a glance, in a sort of 'network iris-print'.

  17. Open source tools for ATR development and performance evaluation

    Science.gov (United States)

    Baumann, James M.; Dilsavor, Ronald L.; Stubbles, James; Mossing, John C.

    2002-07-01

    Early in almost every engineering project, a decision must be made about tools; should I buy off-the-shelf tools or should I develop my own. Either choice can involve significant cost and risk. Off-the-shelf tools may be readily available, but they can be expensive to purchase and to maintain licenses, and may not be flexible enough to satisfy all project requirements. On the other hand, developing new tools permits great flexibility, but it can be time- (and budget-) consuming, and the end product still may not work as intended. Open source software has the advantages of both approaches without many of the pitfalls. This paper examines the concept of open source software, including its history, unique culture, and informal yet closely followed conventions. These characteristics influence the quality and quantity of software available, and ultimately its suitability for serious ATR development work. We give an example where Python, an open source scripting language, and OpenEV, a viewing and analysis tool for geospatial data, have been incorporated into ATR performance evaluation projects. While this case highlights the successful use of open source tools, we also offer important insight into risks associated with this approach.

  18. Neural sources of visual working memory maintenance in human parietal and ventral extrastriate visual cortex.

    Science.gov (United States)

    Becke, Andreas; Müller, Notger; Vellage, Anne; Schoenfeld, Mircea Ariel; Hopf, Jens-Max

    2015-04-15

    Maintaining information in visual working memory is reliably indexed by the contralateral delay activity (CDA) - a sustained modulation of the event-related potential (ERP) with a topographical maximum over posterior scalp regions contralateral to the memorized input. Based on scalp topography, it is hypothesized that the CDA reflects neural activity in the parietal cortex, but the precise cortical origin of the underlying electrical activity has never been determined. Here we combine ERP recordings with magnetoencephalography-based source localization to characterize the cortical current sources generating the CDA. Observers performed a cued delayed match-to-sample task where either the color or the relative position of colored dots had to be maintained in memory. A detailed source-localization analysis of the magnetic activity in the retention interval revealed that the magnetic analog of the CDA (mCDA) is generated by current sources in the parietal cortex. Importantly, we find that the mCDA also receives contributions from current sources in the ventral extrastriate cortex that display a time-course similar to the parietal sources. On the basis of the magnetic responses, forward modeling of ERP data reveals that the ventral sources have non-optimal projections and that these sources are therefore concealed in the ERP by overlapping fields with parietal projections. The present observations indicate that visual working memory maintenance, as indexed by the CDA, involves the parietal cortical regions as well as the ventral extrastriate regions, which code the sensory representation of the memorized content. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Adding tools to the open source toolbox: The Internet

    Science.gov (United States)

    Porth, Tricia

    1994-01-01

    The Internet offers researchers additional sources of information not easily available from traditional sources such as print volumes or commercial data bases. Internet tools such as e-mail and file transfer protocol (ftp) speed up the way researchers communicate and transmit data. Mosaic, one of the newest additions to the Internet toolbox, allows users to combine tools such as ftp, gopher, wide area information server, and the world wide web with multimedia capabilities. Mosaic has quickly become a popular means of making information available on the Internet because it is versatile and easily customizable.

  20. GAViT: Genome Assembly Visualization Tool for Short Read Data

    Energy Technology Data Exchange (ETDEWEB)

    Syed, Aijazuddin; Shapiro, Harris; Tu, Hank; Pangilinan, Jasmyn; Trong, Stephan

    2008-03-14

    It is a challenging job for genome analysts to accurately debug, troubleshoot, and validate genome assembly results. Genome analysts rely on visualization tools to help validate and troubleshoot assembly results, including such problems as mis-assemblies, low-quality regions, and repeats. Short read data adds further complexity and makes it extremely challenging for the visualization tools to scale and to view all needed assembly information. As a result, there is a need for a visualization tool that can scale to display assembly data from the new sequencing technologies. We present Genome Assembly Visualization Tool (GAViT), a highly scalable and interactive assembly visualization tool developed at the DOE Joint Genome Institute (JGI).

  1. KENO3D Visualization Tool for KENO V.a and KENO-VI Geometry Models

    International Nuclear Information System (INIS)

    Horwedel, J.E.; Bowman, S.M.

    2000-01-01

    Criticality safety analyses often require detailed modeling of complex geometries. Effective visualization tools can enhance checking the accuracy of these models. This report describes the KENO3D visualization tool developed at the Oak Ridge National Laboratory (ORNL) to provide visualization of KENO V.a and KENO-VI criticality safety models. The development of KENO3D is part of the current efforts to enhance the SCALE (Standardized Computer Analyses for Licensing Evaluations) computer software system

  2. Open source tools for large-scale neuroscience.

    Science.gov (United States)

    Freeman, Jeremy

    2015-06-01

    New technologies for monitoring and manipulating the nervous system promise exciting biology but pose challenges for analysis and computation. Solutions can be found in the form of modern approaches to distributed computing, machine learning, and interactive visualization. But embracing these new technologies will require a cultural shift: away from independent efforts and proprietary methods and toward an open source and collaborative neuroscience. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.

  3. A Survey of Open Source Tools for Business Intelligence

    DEFF Research Database (Denmark)

    Thomsen, Christian; Pedersen, Torben Bach

    The industrial use of open source Business Intelligence (BI) tools is becoming more common, but is still not as widespread as for other types of software. It is therefore of interest to explore which possibilities are available for open source BI and compare the tools. In this survey paper, we consider the capabilities of a number of open source tools for BI. In the paper, we consider a number of Extract-Transform-Load (ETL) tools, database management systems (DBMSs), On-Line Analytical Processing (OLAP) servers, and OLAP clients. We find that, unlike the situation a few years ago, there now...

  4. Virtual Reality: A Tool for Cartographic Visualization | Quaye-Ballard ...

    African Journals Online (AJOL)

    Visualization methods in the analysis of geographical datasets are based on static models, which restrict the visual analysis capabilities. The use of virtual reality, which provides a three-dimensional (3D) perspective, gives the user the ability to change viewpoints and models dynamically, overcoming the static limitations of ...

  5. The Value of Open Source Software Tools in Qualitative Research

    Science.gov (United States)

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  6. E-Sourcing platforms as reasonable marketing tools for suppliers

    OpenAIRE

    Göbl, Martin; Greiter, Thomas

    2014-01-01

    Research questions: E-sourcing platforms offer purchasing organisations often easy access to a high number of relevant suppliers, their goods and services and the according prices. For the suppliers, e-sourcing platforms are a good and easy possibility to present their products and services to the relevant buyers and to get in contact with potential customers. Subject of this research will be the question, whether e-sourcing platforms are also a reasonable marketing tool for suppliers in or...

  7. Users’ perception of visual aesthetics and usefulness of a web-based educational tool

    OpenAIRE

    Sánchez Franco, Manuel Jesús; Villarejo Ramos, Ángel Francisco; Peral Peral, Begoña; Buitrago Esquinas, Eva María; Roldán Salgueiro, José Luis

    2013-01-01

    As a result of our research we have become increasingly aware of the relevance of visual design in understanding learners’ attitudes towards the use of virtual tools. Likewise, perceived usefulness is an essential antecedent of the cumulative impressions of, and preferences for, such tools. Therefore, the aim of this study is to investigate the main effects of visual design and usefulness on learning and productivity in the domain of web-based educational tools. Structural Equation M...

  8. Synchrotron light sources: A powerful tool for science and technology

    International Nuclear Information System (INIS)

    Schlachter, F.; Robinson, A.

    1996-01-01

    A new generation of synchrotron light sources is producing extremely bright beams of vacuum-ultraviolet and x-ray radiation, powerful new tools for research in a wide variety of basic and applied sciences. Spectromicroscopy using high spectral and spatial resolution is a new way of seeing, offering many opportunities in the study of matter. Development of a new light source provides the country or region of the world in which the light source is located many new opportunities: a focal point for research in many scientific and technological areas, a means of upgrading the technology infrastructure of the country, a means of training students, and a potential service to industry. A light source for Southeast Asia would thus be a major resource for many years. Scientists and engineers from light sources around the world look forward to providing assistance to make this a reality in Southeast Asia

  9. Synchrotron light sources: A powerful tool for science and technology

    International Nuclear Information System (INIS)

    Schlachter, F.; Robinson, A.

    1996-01-01

    A new generation of synchrotron light sources is producing extremely bright beams of vacuum-ultraviolet and x-ray radiation, powerful new tools for research in a wide variety of basic and applied sciences. Spectromicroscopy using high spectral and spatial resolution is a new way of seeing, offering many opportunities in the study of matter. Development of a new light source provides the country or region of the world in which the light source is located many new opportunities: a focal point for research in many scientific and technological areas, a means of upgrading the technology infrastructure of the country, a means of training students, and a potential service to industry. A light source for Southeast Asia would thus be a major resource for many years. Scientists and engineers from light sources around the world look forward to providing assistance to make this a reality in Southeast Asia

  10. Visualization tool for three-dimensional plasma velocity distributions (ISEE_3D) as a plug-in for SPEDAS

    Science.gov (United States)

    Keika, Kunihiro; Miyoshi, Yoshizumi; Machida, Shinobu; Ieda, Akimasa; Seki, Kanako; Hori, Tomoaki; Miyashita, Yukinaga; Shoji, Masafumi; Shinohara, Iku; Angelopoulos, Vassilis; Lewis, Jim W.; Flores, Aaron

    2017-12-01

    This paper introduces ISEE_3D, an interactive visualization tool for three-dimensional plasma velocity distribution functions, developed by the Institute for Space-Earth Environmental Research, Nagoya University, Japan. The tool provides a variety of methods to visualize the distribution function of space plasma: scatter, volume, and isosurface modes. The tool also has a wide range of functions, such as displaying magnetic field vectors and two-dimensional slices of distributions to facilitate extensive analysis. The coordinate transformation to the magnetic field coordinates is also implemented in the tool. The source codes of the tool are written as scripts of a widely used data analysis software language, Interactive Data Language, which has been widespread in the field of space physics and solar physics. The current version of the tool can be used for data files of the plasma distribution function from the Geotail satellite mission, which are publicly accessible through the Data Archives and Transmission System of the Institute of Space and Astronautical Science (ISAS)/Japan Aerospace Exploration Agency (JAXA). The tool is also available in the Space Physics Environment Data Analysis Software to visualize plasma data from the Magnetospheric Multiscale and the Time History of Events and Macroscale Interactions during Substorms missions. The tool is planned to be applied to data from other missions, such as Arase (ERG) and Van Allen Probes after replacing or adding data loading plug-ins. This visualization tool helps scientists understand the dynamics of space plasma better, particularly in the regions where the magnetohydrodynamic approximation is not valid, for example, the Earth's inner magnetosphere, magnetopause, bow shock, and plasma sheet.

  11. Decision support tool for diagnosing the source of variation

    Science.gov (United States)

    Masood, Ibrahim; Azrul Azhad Haizan, Mohamad; Norbaya Jumali, Siti; Ghazali, Farah Najihah Mohd; Razali, Hazlin Syafinaz Md; Shahir Yahya, Mohd; Azlan, Mohd Azwir bin

    2017-08-01

    Identifying the source of unnatural variation (SOV) in a manufacturing process is essential for quality control. Shewhart control chart patterns (CCPs) are commonly used to monitor SOV. However, proper interpretation of a CCP and its associated SOV requires a highly skilled industrial practitioner, and a lack of process engineering knowledge can lead to erroneous corrective action. The objective of this study is to design the operating procedures of a computerized decision support tool (DST) for process diagnosis. The DST is embedded in a CCP recognition scheme. The design methodology involves analysis of the relationships between geometrical features, manufacturing processes and CCPs. The DST contains information about CCPs, their possible root-cause errors and descriptions of SOV phenomena such as process deterioration due to tool bluntness, tool offsetting, loading error, and changes in material hardness. The DST will help industrial practitioners troubleshoot effectively.
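
    As a sketch of the control-chart arithmetic such a DST sits on top of, the following computes 3-sigma limits for a Shewhart individuals chart from the average moving range and flags out-of-control points; the data and thresholds are invented for illustration and are not taken from the paper.

```python
# Illustrative Shewhart individuals (I-MR) chart: estimates sigma from the
# average moving range and flags points outside the 3-sigma limits.
# Data are placeholders, not the authors' CCP recognition scheme.
import numpy as np

x = np.array([10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 9.9, 10.0, 13.5, 10.1])

center = x.mean()
mr_bar = np.abs(np.diff(x)).mean()      # average moving range of consecutive points
sigma_hat = mr_bar / 1.128              # d2 constant for subgroups of size 2
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

violations = np.where((x > ucl) | (x < lcl))[0]
print(f"CL={center:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}  out-of-control at indices {violations}")
```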

  12. ELATE: an open-source online application for analysis and visualization of elastic tensors

    International Nuclear Information System (INIS)

    Gaillac, Romain; Coudert, François-Xavier; Pullumbi, Pluton

    2016-01-01

    We report on the implementation of a tool for the analysis of second-order elastic stiffness tensors, provided with both an open-source Python module and a standalone online application allowing the visualization of anisotropic mechanical properties. After describing the software features, how we compute the conventional elastic constants and how we represent them graphically, we explain our technical choices for the implementation. In particular, we focus on why a Python module is used to generate the HTML web page with embedded Javascript for dynamical plots. (paper)
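
    For orientation, the Voigt-average moduli that this kind of elastic-tensor analysis reports can be computed directly from a 6x6 stiffness matrix; the sketch below uses generic, silicon-like constants and plain numpy rather than ELATE's own module.

```python
# Voigt-average bulk and shear moduli from a 6x6 stiffness matrix C (GPa)
# in Voigt notation. Example values are silicon-like, for illustration only;
# this is not ELATE's code or API.
import numpy as np

C11, C12, C44 = 165.7, 63.9, 79.6
C = np.zeros((6, 6))
C[:3, :3] = C12
np.fill_diagonal(C[:3, :3], C11)
C[3, 3] = C[4, 4] = C[5, 5] = C44

K_voigt = (C[0, 0] + C[1, 1] + C[2, 2] + 2 * (C[0, 1] + C[0, 2] + C[1, 2])) / 9.0
G_voigt = (C[0, 0] + C[1, 1] + C[2, 2] - C[0, 1] - C[0, 2] - C[1, 2]
           + 3 * (C[3, 3] + C[4, 4] + C[5, 5])) / 15.0

# Isotropic-average Young's modulus and Poisson ratio from K and G
E = 9 * K_voigt * G_voigt / (3 * K_voigt + G_voigt)
nu = (3 * K_voigt - 2 * G_voigt) / (2 * (3 * K_voigt + G_voigt))
print(f"K_V={K_voigt:.1f} GPa  G_V={G_voigt:.1f} GPa  E={E:.1f} GPa  nu={nu:.3f}")
```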

  13. MEG/EEG source reconstruction, statistical evaluation, and visualization with NUTMEG.

    Science.gov (United States)

    Dalal, Sarang S; Zumer, Johanna M; Guggisberg, Adrian G; Trumpis, Michael; Wong, Daniel D E; Sekihara, Kensuke; Nagarajan, Srikantan S

    2011-01-01

    NUTMEG is a source analysis toolbox geared towards cognitive neuroscience researchers using MEG and EEG, including intracranial recordings. Evoked and unaveraged data can be imported to the toolbox for source analysis in either the time or time-frequency domains. NUTMEG offers several variants of adaptive beamformers, probabilistic reconstruction algorithms, as well as minimum-norm techniques to generate functional maps of spatiotemporal neural source activity. Lead fields can be calculated from single and overlapping sphere head models or imported from other software. Group averages and statistics can be calculated as well. In addition to data analysis tools, NUTMEG provides a unique and intuitive graphical interface for visualization of results. Source analyses can be superimposed onto a structural MRI or headshape to provide a convenient visual correspondence to anatomy. These results can also be navigated interactively, with the spatial maps and source time series or spectrogram linked accordingly. Animations can be generated to view the evolution of neural activity over time. NUTMEG can also display brain renderings and perform spatial normalization of functional maps using SPM's engine. As a MATLAB package, the end user may easily link with other toolboxes or add customized functions.
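
    As a pointer to what an adaptive (minimum-variance) beamformer computes, the sketch below forms LCMV weights for a single source location from a lead field and a regularized sensor covariance; shapes and data are synthetic and the code is not NUTMEG's implementation.

```python
# Minimal LCMV beamformer sketch (illustrative, not NUTMEG source code):
# weights W for one source location with lead field L (channels x 3)
# and sensor covariance C (channels x channels).
import numpy as np

rng = np.random.default_rng(0)
n_channels = 64
L = rng.standard_normal((n_channels, 3))          # lead field for 3 dipole orientations
data = rng.standard_normal((n_channels, 1000))    # stand-in for MEG/EEG recordings
C = np.cov(data) + 1e-3 * np.eye(n_channels)      # regularized covariance

Cinv = np.linalg.inv(C)
W = np.linalg.inv(L.T @ Cinv @ L) @ (L.T @ Cinv)  # 3 x channels weight matrix

source_timecourse = W @ data                      # 3 orientation components over time
power = np.trace(W @ C @ W.T)                     # beamformer output power at this location
print(source_timecourse.shape, f"power={power:.3f}")
```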

  14. Dynamic visualizations as tools for supporting cosmological literacy

    Science.gov (United States)

    Buck, Zoe Elizabeth

    My dissertation research is designed to improve access to STEM content through the development of cosmology visualizations that support all learners as they engage in cosmological sense-making. To better understand how to design visualizations that work toward breaking cycles of power and access in the sciences, I orient my work to the following "meta-question": How might educators use visualizations to support diverse ways of knowing and learning in order to expand access to cosmology, and to science? In this dissertation, I address this meta-question from a pragmatic epistemological perspective, through a sociocultural lens, following three lines of inquiry: experimental methods (Creswell, 2003) with a focus on basic visualization design, activity analysis (Wells, 1996; Ash, 2001; Rahm, 2012) with a focus on culturally and linguistically diverse learners, and case study (Creswell, 2000) with a focus on expansive learning at a planetarium (Engestrom, 2001; Ash, 2014). My research questions are as follows, each of which corresponds to a self-contained course of inquiry with its own design, data, analysis and results: 1) Can mediational cues like color affect the way learners interpret the content in a cosmology visualization? 2) How do cosmology visualizations support cosmological sense-making for diverse students? 3) What are the shared objects of dynamic networks of activity around visualization production and use in a large, urban planetarium and how do they affect learning? The result is a mixed-methods design (Sweetman, Badiee & Creswell, 2010) where both qualitative and quantitative data are used when appropriate to address my research goals. In the introduction I begin by establishing a theoretical framework for understanding visualizations within cultural historical activity theory (CHAT) and situating the chapters that follow within that framework. I also introduce the concept of cosmological literacy, which I define as the set of conceptual, semiotic and

  15. VITA-6.2: Advanced visual tool for information management

    International Nuclear Information System (INIS)

    Jacobson, Z.; Truong, Q.S.; Houston, B.; Taylor, V.; Herber, N.; El Gebaly, A.

    2007-01-01

    Visual Interface for Text Analysis (VITA), our combined user interface and meta-search engine software application, improves the quality and speed at which intelligence analysts can explore novel massive text corpora via innovations that facilitate user contextual awareness. (author)

  16. Abstractocyte: A Visual Tool for Exploring Nanoscale Astroglial Cells

    KAUST Repository

    Mohammed, Haneen

    2017-01-01

    This thesis presents the design and implementation of Abstractocyte, a system for the visual analysis of astrocytes, and their relation to neurons, in nanoscale volumes of brain tissue. Astrocytes are glial cells, i.e., non-neuronal cells

  17. 2014 Earth System Grid Federation and Ultrascale Visualization Climate Data Analysis Tools Conference Report

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-01-27

    The climate and weather data science community met December 9–11, 2014, in Livermore, California, for the fourth annual Earth System Grid Federation (ESGF) and Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Face-to-Face (F2F) Conference, hosted by the Department of Energy, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, the European Infrastructure for the European Network of Earth System Modelling, and the Australian Department of Education. Both ESGF and UV-CDAT remain global collaborations committed to developing a new generation of open-source software infrastructure that provides distributed access and analysis to simulated and observed data from the climate and weather communities. The tools and infrastructure created under these international multi-agency collaborations are critical to understanding extreme weather conditions and long-term climate change. In addition, the F2F conference fosters a stronger climate and weather data science community and facilitates a stronger federated software infrastructure. The 2014 F2F conference detailed the progress of ESGF, UV-CDAT, and other community efforts over the year and set new priorities and requirements for existing and impending national and international community projects, such as the Coupled Model Intercomparison Project Phase Six. Specifically discussed at the conference were project capabilities and enhancement needs for data distribution, analysis, visualization, hardware and network infrastructure, standards, and resources.

  18. π Scope: python based scientific workbench with visualization tool for MDSplus data

    Science.gov (United States)

    Shiraiwa, S.

    2014-10-01

    π Scope is a python based scientific data analysis and visualization tool constructed on wxPython and Matplotlib. Although it is designed to be a generic tool, the primary motivation for developing the new software is 1) to provide an updated tool to browse MDSplus data, with functionalities beyond dwscope and jScope, and 2) to provide a universal foundation to construct interface tools to perform computer simulation and modeling for Alcator C-Mod. It provides many features to visualize MDSplus data during tokamak experiments including overplotting different signals and discharges, various plot types (line, contour, image, etc.), in-panel data analysis using python scripts, and publication quality graphics generation. Additionally, the logic to produce multi-panel plots is designed to be backward compatible with dwscope, enabling smooth migration for dwscope users. πScope uses multi-threading to reduce data transfer latency, and its object-oriented design makes it easy to modify and expand while the open source nature allows portability. A built-in tree data browser allows a user to approach the data structure both from a GUI and a script, enabling relatively complex data analysis workflow to be built quickly. As an example, an IDL-based interface to perform GENRAY/CQL3D simulations was ported on πScope, thus allowing LHCD simulation to be run between-shot using C-Mod experimental profiles. This workflow is being used to generate a large database to develop a LHCD actuator model for the plasma control system. Supported by USDoE Award DE-FC02-99ER54512.
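
    The dwscope-style multi-panel, overplotted displays described here can be approximated with plain Matplotlib; the sketch below uses synthetic signals and hypothetical shot numbers rather than the πScope or MDSplus APIs.

```python
# Illustrative dwscope-style multi-panel plot with two discharges overplotted
# per panel, using synthetic signals (not the piScope/MDSplus API).
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0, 2.0, 500)            # time base in seconds
shots = {1234: 1.0, 1235: 0.8}          # hypothetical shot numbers -> scale factor

fig, axes = plt.subplots(2, 1, sharex=True, figsize=(6, 5))
for shot, scale in shots.items():
    axes[0].plot(t, scale * np.exp(-t) * np.sin(10 * t), label=f"shot {shot}")
    axes[1].plot(t, scale * (1 - np.exp(-3 * t)), label=f"shot {shot}")

axes[0].set_ylabel("signal A [a.u.]")
axes[1].set_ylabel("signal B [a.u.]")
axes[1].set_xlabel("time [s]")
for ax in axes:
    ax.legend(loc="best")
fig.tight_layout()
plt.show()
```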

  19. Open source intelligence: A tool to combat illicit trafficking

    Energy Technology Data Exchange (ETDEWEB)

    Sjoeberg, J [Swedish Armed Forces HQ, Stockholm (Sweden)

    2001-10-01

    The purpose of my presentation is to provide some thoughts on Open Sources and how Open Sources can be used as tools for detecting illicit trafficking and proliferation. To fulfill this purpose I would like to deal with the following points during my presentation: What is Open Source? How can it be defined? - Different sources - Methods. Open Source information can be defined as publicly available information as well as other unclassified information that has limited public distribution or access to it. It comes in print, electronic or oral form. It can be found distributed either to the mass public by print or electronic media or to a much more limited customer base like companies, experts or specialists of some kind, including the so-called gray literature. Open Source information is not a single source but a multi-source. Thus, the term Open Source does not say anything about the information itself; it only refers to whether the information is classified or not.

  20. Open source intelligence: A tool to combat illicit trafficking

    International Nuclear Information System (INIS)

    Sjoeberg, J.

    2001-01-01

    The purpose of my presentation is to provide some thoughts on Open Sources and how Open Sources can be used as tools for detecting illicit trafficking and proliferation. To fulfill this purpose I would like to deal with the following points during my presentation: What is Open Source? How can it be defined? - Different sources - Methods. Open Source information can be defined as publicly available information as well as other unclassified information that has limited public distribution or access to it. It comes in print, electronic or oral form. It can be found distributed either to the mass public by print or electronic media or to a much more limited customer base like companies, experts or specialists of some kind, including the so-called gray literature. Open Source information is not a single source but a multi-source. Thus, the term Open Source does not say anything about the information itself; it only refers to whether the information is classified or not.

  1. Development of a Carbon Sequestration Visualization Tool using Google Earth Pro

    Science.gov (United States)

    Keating, G. N.; Greene, M. K.

    2008-12-01

    The Big Sky Carbon Sequestration Partnership seeks to prepare organizations throughout the western United States for a possible carbon-constrained economy. Through the development of CO2 capture and subsurface sequestration technology, the Partnership is working to enable the region to cleanly utilize its abundant fossil energy resources. The intent of the Los Alamos National Laboratory Big Sky Visualization tool is to allow geochemists, geologists, geophysicists, project managers, and other project members to view, identify, and query the data collected from CO2 injection tests using a single data source platform, a mission to which Google Earth Pro is uniquely and ideally suited. The visualization framework enables fusion of data from disparate sources and allows investigators to fully explore spatial and temporal trends in CO2 fate and transport within a reservoir. 3-D subsurface wells are projected above ground in Google Earth as the KML anchor points for the presentation of various surface and subsurface data. This solution is the most integrative and cost-effective possible for the variety of users in the Big Sky community.
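
    A minimal example of the kind of KML anchor point Google Earth Pro consumes is sketched below; the well name and coordinates are invented placeholders, not Big Sky project data.

```python
# Writes a minimal KML placemark for a (hypothetical) injection well so it can
# be loaded as an anchor point in Google Earth. Coordinates are invented.
kml = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Placemark>
    <name>Injection well W-1 (example)</name>
    <description>CO2 injection test site - illustrative placeholder</description>
    <Point>
      <coordinates>-110.5000,45.7000,0</coordinates>
    </Point>
  </Placemark>
</kml>
"""

with open("well_w1.kml", "w", encoding="utf-8") as f:
    f.write(kml)
```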

  2. AR4VI: AR as an Accessibility Tool for People with Visual Impairments

    OpenAIRE

    Coughlan, James M.; Miele, Joshua

    2017-01-01

    Although AR technology has been largely dominated by visual media, a number of AR tools using both visual and auditory feedback have been developed specifically to assist people with low vision or blindness – an application domain that we term Augmented Reality for Visual Impairment (AR4VI). We describe two AR4VI tools developed at Smith-Kettlewell, as well as a number of pre-existing examples. We emphasize that AR4VI is a powerful tool with the potential to remove or significantly reduce a r...

  3. Abstractocyte: A Visual Tool for Exploring Nanoscale Astroglial Cells

    KAUST Repository

    Mohammed, Haneen

    2017-06-12

    This thesis presents the design and implementation of Abstractocyte, a system for the visual analysis of astrocytes, and their relation to neurons, in nanoscale volumes of brain tissue. Astrocytes are glial cells, i.e., non-neuronal cells that support neurons and the nervous system. Even though glial cells make up around 50 percent of all cells in the mammalian brain, so far they have been far less studied than neurons. Nevertheless, the study of astrocytes has immense potential for understanding brain function. However, the complex and widely-branching structure of astrocytes requires high-resolution electron microscopy imaging and makes visualization and analysis challenging. Using Abstractocyte, biologists can explore the morphology of astrocytes at various visual abstraction levels, while simultaneously analyzing neighboring neurons and their connectivity. We define a novel, conceptual 2D abstraction space for jointly visualizing astrocytes and neurons. Neuroscientists can choose a joint visualization as a specific point in that 2D abstraction space. Dragging this point allows them to smoothly transition between different abstraction levels in an intuitive manner. We describe the design of Abstractocyte, and present three case studies in which neuroscientists have successfully used our system to assess astrocytic coverage of synapses, glycogen distribution in relation to synapses, and astrocytic-mitochondria coverage.

  4. Windows Developer Power Tools Turbocharge Windows development with more than 170 free and open source tools

    CERN Document Server

    Avery, James

    2007-01-01

    Software developers need to work harder and harder to bring value to their development process in order to build high quality applications and remain competitive. Developers can accomplish this by improving their productivity, quickly solving problems, and writing better code. A wealth of open source and free software tools are available for developers who want to improve the way they create, build, deploy, and use software. Tools, components, and frameworks exist to help developers at every point in the development process. Windows Developer Power Tools offers an encyclopedic guide to m

  5. Visualization and analytics tools for infectious disease epidemiology: a systematic review.

    Science.gov (United States)

    Carroll, Lauren N; Au, Alan P; Detwiler, Landon Todd; Fu, Tsung-Chieh; Painter, Ian S; Abernethy, Neil F

    2014-10-01

    A myriad of new tools and algorithms have been developed to help public health professionals analyze and visualize the complex data used in infectious disease control. To better understand approaches to meet these users' information needs, we conducted a systematic literature review focused on the landscape of infectious disease visualization tools for public health professionals, with a special emphasis on geographic information systems (GIS), molecular epidemiology, and social network analysis. The objectives of this review are to: (1) identify public health user needs and preferences for infectious disease information visualization tools; (2) identify existing infectious disease information visualization tools and characterize their architecture and features; (3) identify commonalities among approaches applied to different data types; and (4) describe tool usability evaluation efforts and barriers to the adoption of such tools. We identified articles published in English from January 1, 1980 to June 30, 2013 from five bibliographic databases. Articles with a primary focus on infectious disease visualization tools, needs of public health users, or usability of information visualizations were included in the review. A total of 88 articles met our inclusion criteria. Users were found to have diverse needs, preferences and uses for infectious disease visualization tools, and the existing tools are correspondingly diverse. The architecture of the tools was inconsistently described, and few tools in the review discussed the incorporation of usability studies or plans for dissemination. Many studies identified concerns regarding data sharing, confidentiality and quality. Existing tools offer a range of features and functions that allow users to explore, analyze, and visualize their data, but the tools are often for siloed applications. Commonly cited barriers to widespread adoption included lack of organizational support, access issues, and misconceptions about tool

  6. Visual DMDX: A web-based authoring tool for DMDX, a Windows display program with millisecond accuracy.

    Science.gov (United States)

    Garaizar, Pablo; Reips, Ulf-Dietrich

    2015-09-01

    DMDX is a software package for the experimental control and timing of stimulus display for Microsoft Windows systems. DMDX is reliable, flexible, millisecond accurate, and can be downloaded free of charge; therefore it has become very popular among experimental researchers. However, setting up a DMDX-based experiment is burdensome because of its command-based interface. Further, DMDX relies on RTF files in which parts of the stimuli, design, and procedure of an experiment are defined in a complicated (DMASTR-compatible) syntax. Other experiment software, such as E-Prime, Psychopy, and WEXTOR, became successful as a result of integrated visual authoring tools. Such an intuitive interface was lacking for DMDX. We therefore created and present here Visual DMDX (http://visualdmdx.com/), a HTML5-based web interface to set up experiments and export them to DMDX item files format in RTF. Visual DMDX offers most of the features available from the rich DMDX/DMASTR syntax, and it is a useful tool to support researchers who are new to DMDX. Both old and modern versions of DMDX syntax are supported. Further, with Visual DMDX, we go beyond DMDX by having added export to JSON (a versatile web format), easy backup, and a preview option for experiments. In two examples, one experiment each on lexical decision making and affective priming, we explain in a step-by-step fashion how to create experiments using Visual DMDX. We release Visual DMDX under an open-source license to foster collaboration in its continuous improvement.

  7. Video games as a tool to train visual skills.

    Science.gov (United States)

    Achtman, R L; Green, C S; Bavelier, D

    2008-01-01

    Adult brain plasticity, although possible, is often difficult to elicit. Training regimens in adults can produce specific improvements on the trained task without leading to general enhancements that would improve quality of life. This paper considers the case of playing action video games as a way to induce widespread enhancement in vision. We review the range of visual skills altered by action video game playing as well as the game components important in promoting visual plasticity. Further, we discuss what these results might mean in terms of rehabilitation for different patient populations.

  8. Video games as a tool to train visual skills

    Science.gov (United States)

    Achtman, R.L.; Green, C.S.; Bavelier, D.

    2010-01-01

    Purpose Adult brain plasticity, although possible, is often difficult to elicit. Training regimens in adults can produce specific improvements on the trained task without leading to general enhancements that would improve quality of life. This paper considers the case of playing action video games as a way to induce widespread enhancement in vision. Conclusions We review the range of visual skills altered by action video game playing as well as the game components important in promoting visual plasticity. Further, we discuss what these results might mean in terms of rehabilitation for different patient populations. PMID:18997318

  9. Visualizing data mining results with the Brede tools

    DEFF Research Database (Denmark)

    Nielsen, Finn Årup

    2009-01-01

    A few neuroinformatics databases now exist that record results from neuroimaging studies in the form of brain coordinates in stereotaxic space. The Brede Toolbox was originally developed to extract, analyze and visualize data from one of them --- the BrainMap database. Since then the Brede Toolbox has expanded and now includes its own database with coordinates along with ontologies for brain regions and functions: The Brede Database. With Brede Toolbox and Database combined we setup automated workflows for extraction of data, mass meta-analytic data mining and visualizations. Most of the Web...

  10. A common source of attention for auditory and visual tracking.

    Science.gov (United States)

    Fougnie, Daryl; Cockhren, Jurnell; Marois, René

    2018-05-01

    Tasks that require tracking visual information reveal the severe limitations of our capacity to attend to multiple objects that vary in time and space. Although these limitations have been extensively characterized in the visual domain, very little is known about tracking information in other sensory domains. Does tracking auditory information exhibit characteristics similar to those of tracking visual information, and to what extent do these two tracking tasks draw on the same attention resources? We addressed these questions by asking participants to perform either single or dual tracking tasks from the same (visual-visual) or different (visual-auditory) perceptual modalities, with the difficulty of the tracking tasks being manipulated across trials. The results revealed that performing two concurrent tracking tasks, whether they were in the same or different modalities, affected tracking performance as compared to performing each task alone (concurrence costs). Moreover, increasing task difficulty also led to increased costs in both the single-task and dual-task conditions (load-dependent costs). The comparison of concurrence costs between visual-visual and visual-auditory dual-task performance revealed slightly greater interference when two visual tracking tasks were paired. Interestingly, however, increasing task difficulty led to equivalent costs for visual-visual and visual-auditory pairings. We concluded that visual and auditory tracking draw largely, though not exclusively, on common central attentional resources.

  11. Mesoscale brain explorer, a flexible python-based image analysis and visualization tool.

    Science.gov (United States)

    Haupt, Dirk; Vanni, Matthieu P; Bolanos, Federico; Mitelut, Catalin; LeDue, Jeffrey M; Murphy, Tim H

    2017-07-01

    Imaging of mesoscale brain activity is used to map interactions between brain regions. This work has benefited from the pioneering studies of Grinvald et al., who employed optical methods to image brain function by exploiting the properties of intrinsic optical signals and small molecule voltage-sensitive dyes. Mesoscale interareal brain imaging techniques have been advanced by cell targeted and selective recombinant indicators of neuronal activity. Spontaneous resting state activity is often collected during mesoscale imaging to provide the basis for mapping of connectivity relationships using correlation. However, the information content of mesoscale datasets is vast and is only superficially presented in manuscripts given the need to constrain measurements to a fixed set of frequencies, regions of interest, and other parameters. We describe a new open source tool written in python, termed mesoscale brain explorer (MBE), which provides an interface to process and explore these large datasets. The platform supports automated image processing pipelines with the ability to assess multiple trials and combine data from different animals. The tool provides functions for temporal filtering, averaging, and visualization of functional connectivity relations using time-dependent correlation. Here, we describe the tool and show applications, where previously published datasets were reanalyzed using MBE.
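
    The correlation-based connectivity mapping described here reduces to correlating a seed time course with every pixel of the image stack; the following numpy sketch uses synthetic data and is not MBE's implementation.

```python
# Seed-based correlation map over a synthetic mesoscale image stack
# (frames x height x width); illustrative only, not MBE's pipeline.
import numpy as np

rng = np.random.default_rng(0)
frames, h, w = 600, 64, 64
stack = rng.standard_normal((frames, h, w))
stack[:, 10:20, 10:20] += rng.standard_normal((frames, 1, 1))  # correlated patch

seed = stack[:, 15, 15]                     # seed pixel time course
flat = stack.reshape(frames, -1)

# Pearson correlation of the seed with every pixel
flat_c = flat - flat.mean(axis=0)
seed_c = seed - seed.mean()
corr = (flat_c * seed_c[:, None]).sum(axis=0) / (
    np.linalg.norm(flat_c, axis=0) * np.linalg.norm(seed_c)
)
corr_map = corr.reshape(h, w)
print("max |r| outside the seeded patch:", np.abs(corr_map[30:, 30:]).max())
```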

  12. Computer-Assisted Visual Search/Decision Aids as a Training Tool for Mammography

    National Research Council Canada - National Science Library

    Nodine, Calvin

    2000-01-01

    The primary goal of the project is to develop a computer-assisted visual search (CAVS) mammography training tool that will improve the perceptual and cognitive skills of trainees leading to mammographic expertise...

  13. Computer-Assisted Visual Search/Decision Aids as a Training Tool for Mammography

    National Research Council Canada - National Science Library

    Nodine, Calvin

    1999-01-01

    The primary goal of the project is to develop a computer-assisted visual search (CAVS) mammography training tool that will improve the perceptual and cognitive skills of trainees leading to mammographic expertise...

  14. Computer-Assisted Visual Search/Decision Aids as a Training Tool for Mammography

    National Research Council Canada - National Science Library

    Nodine, Calvin

    1998-01-01

    The primary goal of the project is to develop a computer-assisted visual search (CAVS) mammography training tool that will improve the perceptual and cognitive skills of trainees leading to mammographic expertise...

  15. Development of software tools for 4-D visualization and quantitative analysis of PHITS simulation results

    International Nuclear Information System (INIS)

    Furutaka, Kazuyoshi

    2015-02-01

    A suite of software tools has been developed to facilitate the development of apparatus using the radiation transport simulation code PHITS by enabling 4D visualization (3D space and time) and quantitative analysis of so-called die-away plots. To deliver usable tools as soon as possible, existing software was utilized as much as possible; ParaView will be used for the 4D visualization of the results, whereas die-away plots will be analysed with the ROOT toolkit, using a tool named “diana”. To enable 4D visualization in ParaView, a group of tools (angel2vtk, DispDCAS1, CamPos) has been developed to convert the data into a format that ParaView can read and to ease the visualization. (author)
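
    To illustrate the kind of ParaView-readable output such a format conversion produces, the sketch below writes a small scalar field as a legacy-format VTK structured-points file; it is a generic example, not the angel2vtk converter itself.

```python
# Writes a small scalar field as a legacy VTK STRUCTURED_POINTS file that
# ParaView can open. Generic sketch, not the angel2vtk converter.
import numpy as np

nx, ny, nz = 10, 10, 10
x, y, z = np.meshgrid(np.arange(nx), np.arange(ny), np.arange(nz), indexing="ij")
field = np.exp(-((x - 5) ** 2 + (y - 5) ** 2 + (z - 5) ** 2) / 10.0)  # toy scalar field

with open("field.vtk", "w") as f:
    f.write("# vtk DataFile Version 3.0\n")
    f.write("toy scalar field\n")
    f.write("ASCII\n")
    f.write("DATASET STRUCTURED_POINTS\n")
    f.write(f"DIMENSIONS {nx} {ny} {nz}\n")
    f.write("ORIGIN 0 0 0\n")
    f.write("SPACING 1 1 1\n")
    f.write(f"POINT_DATA {nx * ny * nz}\n")
    f.write("SCALARS field float 1\n")
    f.write("LOOKUP_TABLE default\n")
    # legacy VTK expects x varying fastest, so write in column-major order
    np.savetxt(f, field.ravel(order="F"), fmt="%.6f")
```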

  16. iVCJ: A tool for Interactive Visualization of high explosives CJ states

    Energy Technology Data Exchange (ETDEWEB)

    Wooten, Hasani Omar [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Aslam, Tariq Dennis [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Whitley, Von Howard [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-12-12

    A graphical user interface (GUI) tool has been developed that facilitates the visualization and analysis of the Chapman-Jouguet (CJ) state of high-explosive gaseous products using the Jones-Wilkins-Lee equation of state.
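
    For context, the standard Jones-Wilkins-Lee (JWL) product equation of state has the form P(V, E) = A(1 - ω/(R1·V))·exp(-R1·V) + B(1 - ω/(R2·V))·exp(-R2·V) + ωE/V, with V the relative volume; the sketch below evaluates it with invented parameters, not iVCJ's data or code.

```python
# Standard JWL product equation of state for detonation products:
#   P(V, E) = A(1 - w/(R1*V))exp(-R1*V) + B(1 - w/(R2*V))exp(-R2*V) + w*E/V
# with V the relative volume and E the energy per unit initial volume.
# Parameter values below are illustrative placeholders, not iVCJ data.
import math

def jwl_pressure(V, E, A, B, R1, R2, omega):
    """Return pressure (same units as A, B, E) at relative volume V."""
    return (A * (1.0 - omega / (R1 * V)) * math.exp(-R1 * V)
            + B * (1.0 - omega / (R2 * V)) * math.exp(-R2 * V)
            + omega * E / V)

# Hypothetical parameter set (A, B, E in GPa-compatible units):
print(jwl_pressure(V=2.2, E=8.5, A=520.0, B=7.7, R1=4.5, R2=1.1, omega=0.34))
```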

  17. VCS: Tool for Visualizing Copy Number Variation and Single Nucleotide Polymorphism

    Directory of Open Access Journals (Sweden)

    HyoYoung Kim

    2014-12-01

    Full Text Available Copy number variation (CNV) and single nucleotide polymorphism (SNP) data are useful genetic resources to aid in understanding complex phenotypes or disease susceptibility. Although thousands of CNVs and SNPs are currently available in the public databases, they are somewhat difficult to use for analyses without visualization tools. We developed a web-based tool called the VCS (visualization of CNV or SNP) to visualize detected CNVs or SNPs. The VCS tool can assist in easily interpreting the biological meaning of the numerical values of CNVs and SNPs. The VCS provides six visualization tools: (i) the enrichment of genome contents in CNV; (ii) the physical distribution of CNVs or SNPs on chromosomes; (iii) the distribution of log2 ratios of CNVs with criteria of interest; (iv) the number of CNVs or SNPs per binning unit; (v) the distribution of homozygosity of SNP genotypes; and (vi) a cytomap of genes within CNV or SNP regions.
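
    The log2 ratios summarized by such a tool are simply log2 of test versus reference copy number (or read depth); a minimal numpy sketch with invented depths follows.

```python
# Illustrative log2-ratio computation for CNV calls from test vs. reference
# read depths (values invented; not data handled by VCS).
import numpy as np

test_depth = np.array([30, 62, 28, 15, 31])   # test sample depth per region
ref_depth = np.array([31, 30, 29, 30, 32])    # reference depth per region

log2_ratio = np.log2(test_depth / ref_depth)
for i, r in enumerate(log2_ratio):
    state = "gain" if r > 0.3 else "loss" if r < -0.3 else "neutral"  # simple thresholds
    print(f"region {i}: log2 ratio = {r:+.2f} ({state})")
```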

  18. System Sketch: A Visualization Tool to Improve Community Decision Making

    Science.gov (United States)

    Making decisions in coastal and estuarine management requires a comprehensive understanding of the linkages between environmental, social, and economic systems. SystemSketch is a web-based scoping tool designed to assist resource managers in characterizing their systems, explorin...

  19. The connectome viewer toolkit: an open source framework to manage, analyze, and visualize connectomes.

    Science.gov (United States)

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit - a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/
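
    To illustrate the kind of network analysis such a toolkit enables, the sketch below loads a synthetic weighted connectivity matrix into networkx and computes a few standard graph measures; it does not use the Connectome File Format library or the Connectome Viewer API.

```python
# Generic connectome-style network analysis on a synthetic weighted
# connectivity matrix (not the Connectome Viewer Toolkit / cfflib API).
import numpy as np
import networkx as nx

rng = np.random.default_rng(1)
n_regions = 20
W = rng.random((n_regions, n_regions))
W = (W + W.T) / 2                  # symmetrize
W[W < 0.7] = 0                     # keep only strong connections
np.fill_diagonal(W, 0)

G = nx.from_numpy_array(W)         # weighted undirected graph

print("edges:", G.number_of_edges())
print("mean clustering:", nx.average_clustering(G, weight="weight"))
print("weighted degree of region 0:", G.degree(0, weight="weight"))
```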

  20. The Connectome Viewer Toolkit: an open source framework to manage, analyze and visualize connectomes

    Directory of Open Access Journals (Sweden)

    Stephan eGerhard

    2011-06-01

    Full Text Available Abstract Advanced neuroinformatics tools are required for methods of connectome mapping, analysis and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration and sharing. We have designed and implemented the Connectome Viewer Toolkit --- a set of free and extensible open-source neuroimaging tools written in Python. The key components of the toolkit are as follows: 1. The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. 2. The Connectome File Format Library enables management and sharing of connectome files. 3. The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/.

  1. Users’ perception of visual design and usefulness of a web-based educational tool

    OpenAIRE

    Sánchez Franco, Manuel Jesús; Villarejo Ramos, Ángel Francisco; Peral Peral, Begoña; Buitrago Esquinas, Eva María; Roldán Salgueiro, José Luis

    2012-01-01

    Our research has become increasingly aware of the relevance of visual design in understanding learners’ attitudes towards the use of virtual tools. Likewise, perceived usefulness is an essential antecedent of the cumulative impressions of and preferences for them. Therefore, the aim of this study is to investigate the main effects of visual design and usefulness on learning and productivity in the domain of web-based educational tools. A Structural Equation Modelling, specifically Partial Lea...

  2. From a Gloss to a Learning Tool: Does Visual Aids Enhance Better Sentence Comprehension?

    Science.gov (United States)

    Sato, Takeshi; Suzuki, Akio

    2012-01-01

    The aim of this study is to optimize CALL environments as a learning tool rather than a gloss, focusing on the learning of polysemous words which refer to spatial relationship between objects. A lot of research has already been conducted to examine the efficacy of visual glosses while reading L2 texts and has reported that visual glosses can be…

  3. Procedures and Tools Used by Teachers When Completing Functional Vision Assessments with Children with Visual Impairments

    Science.gov (United States)

    Kaiser, Justin T.; Herzberg, Tina S.

    2017-01-01

    Introduction: This study analyzed survey responses from 314 teachers of students with visual impairments regarding the tools and procedures used in completing functional vision assessments (FVAs). Methods: Teachers of students with visual impairments in the United States and Canada completed an online survey during spring 2016. Results: The…

  4. Plasma diagnostic tools for optimizing negative hydrogen ion sources

    International Nuclear Information System (INIS)

    Fantz, U.; Falter, H.D.; Franzen, P.; Speth, E.; Hemsworth, R.; Boilson, D.; Krylov, A.

    2006-01-01

    The powerful diagnostic tool of optical emission spectroscopy is used to measure the plasma parameters in negative hydrogen ion sources based on the surface mechanism. Results for electron temperature, electron density, atomic-to-molecular hydrogen density ratio, and gas temperature are presented for two types of sources, a rf source and an arc source, which are currently under development for a neutral beam heating system of ITER. The amount of cesium in the plasma volume is obtained from cesium radiation: the Cs neutral density is five to ten orders of magnitude lower than the hydrogen density and the Cs ion density is two to three orders of magnitude lower than the electron density in front of the grid. It is shown that monitoring of cesium lines is very useful for monitoring the cesium balance in the source. From a line-ratio method negative ion densities are determined. In a well-conditioned source the negative ion density is of the same order of magnitude as the electron density and correlates with extracted current densities

  5. Consensus Coding as a Tool in Visual Appearance Research

    Directory of Open Access Journals (Sweden)

    D R Simmons

    2011-04-01

    Full Text Available A common problem in visual appearance research is how to quantitatively characterise the visual appearance of a region of an image which is categorised by human observers in the same way. An example of this is scarring in medical images (Ayoub et al, 2010, The Cleft-Palate Craniofacial Journal, in press). We have argued that “scarriness” is itself a visual appearance descriptor which summarises the distinctive combination of colour, texture and shape information which allows us to distinguish scarred from non-scarred tissue (Simmons et al, ECVP 2009). Other potential descriptors for other image classes would be “metallic”, “natural”, or “liquid”. Having developed an automatic algorithm to locate scars in medical images, we then tested “ground truth” by asking untrained observers to draw around the region of scarring. The shape and size of the scar on the image was defined by building a contour plot of the agreement between observers' outlines and thresholding at the point above which 50% of the observers agreed: a consensus coding scheme. Based on the variability in the amount of overlap between the scar as defined by the algorithm and the consensus scar of the observers, we have concluded that the algorithm does not completely capture the putative appearance descriptor “scarriness”. A simultaneous analysis of qualitative descriptions of the scarring by the observers revealed that image features other than those encoded by the algorithm (colour and texture) might be important, such as scar boundary shape. This approach to visual appearance research in medical imaging has potential applications in other application areas, such as botany, geology and archaeology.
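
    The consensus-coding step described here amounts to averaging the observers' binary outlines and keeping pixels on which at least 50% of observers agree; a minimal numpy sketch with synthetic masks follows.

```python
# Consensus region from multiple observers' binary masks: keep pixels that
# at least 50% of observers included. Masks here are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_observers, h, w = 8, 100, 100

masks = np.zeros((n_observers, h, w), dtype=bool)
for m in masks:                     # each observer outlines roughly the same region
    r0, c0 = 40 + rng.integers(-3, 4), 40 + rng.integers(-3, 4)
    m[r0:r0 + 20, c0:c0 + 20] = True

agreement = masks.mean(axis=0)      # fraction of observers including each pixel
consensus = agreement >= 0.5        # threshold at 50% agreement

print("consensus area (pixels):", int(consensus.sum()))
```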

  6. Visual tools and languages: Directions for the '90s

    Energy Technology Data Exchange (ETDEWEB)

    Glinert, E.P. (Rensselaer Polytechnic Inst., Troy, NY (United States). Dept. of Computer Science); Blattner, M.M. (Lawrence Livermore National Lab., CA (United States)); Frerking, C.J. (California Univ., Davis, CA (United States))

    1991-01-01

    We identify and discuss three domains where we believe that innovative application of visual programming languages is likely to make a significant impact in the near term: concurrent computing, computer-based assistance for people with disabilities, and the multimedia/multimodal environments of tomorrow in which it will be possible to hear and physically interact with information as well as see it. 33 refs., 3 figs.

  7. Visual tools and languages: Directions for the '90s

    International Nuclear Information System (INIS)

    Glinert, E.P.; Frerking, C.J.

    1991-01-01

    We identify and discuss three domains where we believe that innovative application of visual programming languages is likely to make a significant impact in the near term: concurrent computing, computer-based assistance for people with disabilities, and the multimedia/multimodal environments of tomorrow in which it will be possible to hear and physically interact with information as well as see it. 33 refs., 3 figs

  8. The Tools, Approaches and Applications of Visual Literacy in the Visual Arts Department of Cross River University of Technology, Calabar, Nigeria

    Science.gov (United States)

    Ecoma, Victor

    2016-01-01

    The paper reflects upon the tools, approaches and applications of visual literacy in the Visual Arts Department of Cross River University of Technology, Calabar, Nigeria. The objective of the discourse is to examine how the visual arts training and practice equip students with skills in visual literacy through methods of production, materials and…

  9. Writing in the air: A visualization tool for written languages.

    Directory of Open Access Journals (Sweden)

    Yoshihiro Itaguchi

    Full Text Available The present study investigated interactions between cognitive processes and finger actions called "kusho," meaning "air-writing" in Japanese. Kanji-culture individuals often employ kusho behavior in which they move their fingers as a substitute for a pen to write mostly done when they are trying to recall the shape of a Kanji character or the spelling of an English word. To further examine the visualization role of kusho behavior on cognitive processing, we conducted a Kanji construction task in which a stimulus (i.e., sub-parts to be constructed was simultaneously presented. In addition, we conducted a Kanji vocabulary test to reveal the relation between the kusho benefit and vocabulary size. The experiment provided two sets of novel findings. First, executing kusho behavior improved task performance (correct responses as long as the participants watched their finger movements while solving the task. This result supports the idea that visual feedback of kusho behavior helps cognitive processing for the task. Second, task performance was positively correlated with the vocabulary score when stimuli were presented for a relatively long time, whereas the kusho benefits and vocabulary score were not correlated regardless of stimulus-presentation time. These results imply that a longer stimulus-presentation could allow participants to utilize their lexical resources for solving the task. The current findings together support the visualization role of kusho behavior, adding experimental evidence supporting the view that there are interactions between cognition and motor behavior.

  10. Rapid development of medical imaging tools with open-source libraries.

    Science.gov (United States)

    Caban, Jesus J; Joshi, Alark; Nagy, Paul

    2007-11-01

    Rapid prototyping is an important element in researching new imaging analysis techniques and developing custom medical applications. In the last ten years, the open source community and the number of open source libraries and freely available frameworks for biomedical research have grown significantly. Many of their offerings are now considered standards in medical image analysis, computer-aided diagnosis, and medical visualization. A cursory review of the peer-reviewed literature in imaging informatics (indeed, in almost any information technology-dependent scientific discipline) indicates the current reliance on open source libraries to accelerate development and validation of processes and techniques. In this survey paper, we review and compare a few of the most successful open source libraries and frameworks for medical application development. Our dual intentions are to provide evidence that these approaches already constitute a vital and essential part of medical image analysis, diagnosis, and visualization and to motivate the reader to use open source libraries and software for rapid prototyping of medical applications and tools.

  11. Three-D Google Earth based geospatial visualization tool for the smart grid distribution

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, K. [Enterprise Horizons, Fremont, CA (United States)]

    2009-07-01

    Smart grids can be used to liberalize markets, ensure reliability and reduce the environmental footprint of electric utilities. This presentation discussed a geo-spatial visualization tool for smart grid distribution. The visualization tool can be used to visualize transmission lines and substations, and it is capable of displaying millions of topographical components. The tool was designed to track and monitor the health of assets and to increase awareness of vulnerabilities, vegetation, and regional demographics. The tool is also capable of identifying potential issues before a rolling blackout situation as well as anticipating islanding spike events. The visualization can be segmented by population and industrial belts, and the tool is able to provide diagnostics on power factor turbulence for congestion bottlenecks. When used for transmission line and substation siting, the tool can provide terrain feasibility analyses and environmental impact analyses. Weather-based demand forecasting can be used to determine critical customers impacted by potential outages. CAD drawings can be used to visualize assets in virtual reality and can be linked to consumer indexing and smart metering initiatives. It was concluded that the web-based tool can also be used for workforce and dispatch management. tabs., figs.

  12. Visual Temporal Logic as a Rapid Prototyping Tool

    DEFF Research Database (Denmark)

    Fränzle, Martin; Lüth, Karsten

    2001-01-01

    Within this survey article, we explain real-time symbolic timing diagrams and the ICOS tool-box supporting timing-diagram-based requirements capture and rapid prototyping. Real-time symbolic timing diagrams are a full-fledged metric-time temporal logic, but with a graphical syntax reminiscent of the informal timing diagrams widely used in electrical engineering. ICOS integrates a variety of tools, ranging from graphical specification editors over tautology checking and counterexample generation to code generators emitting C or VHDL, thus bridging the gap from formal specification to rapid prototype...

  13. MATH: A Scientific Tool for Numerical Methods Calculation and Visualization

    Directory of Open Access Journals (Sweden)

    Henrich Glaser-Opitz

    2016-02-01

    Full Text Available MATH is an easy-to-use application for various numerical-method calculations, with a graphical user interface and an integrated plotting tool. It is written in Qt, with extensive use of the Qwt library for plotting and of the GSL and MuParser libraries for numerics and expression parsing. It can be found at http://sourceforge.net/projects/nummath. MATH is a convenient tool for use in the education process because it can show every important step of the solution process, helping students understand how a result is obtained. MATH also enables fast comparison of the speed and precision of similar methods.
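
    As an illustration only (none of this code comes from MATH, which is a Qt/C++ application), the kind of step-by-step presentation the record describes can be sketched for one classic numerical method, the bisection method, in a few lines of Python; the test function and tolerances below are arbitrary.

```python
# Minimal sketch, not part of MATH: bisection with per-step output, mirroring
# the tool's emphasis on showing every important step of the solution process.
def bisect(f, a, b, tol=1e-8, max_iter=100):
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for i in range(max_iter):
        m = 0.5 * (a + b)
        fm = f(m)
        print(f"step {i:2d}: a={a:.10f}  b={b:.10f}  m={m:.10f}  f(m)={fm:+.3e}")
        if abs(fm) < tol or 0.5 * (b - a) < tol:
            return m
        if fa * fm < 0:
            b, fb = m, fm
        else:
            a, fa = m, fm
    return 0.5 * (a + b)

if __name__ == "__main__":
    # Arbitrary example: a root of x^3 - 2x - 5 on the interval [2, 3].
    print("approximate root:", bisect(lambda x: x**3 - 2*x - 5, 2.0, 3.0))
```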

  14. Real-time analysis, visualization, and steering of microtomography experiments at photon sources

    International Nuclear Information System (INIS)

    Laszeski, G. von; Insley, J.A.; Foster, I.; Bresnahan, J.; Kesselman, C.; Su, M.; Thiebaux, M.; Rivers, M.L.; Wang, S.; Tieman, B.; McNulty, I.

    2000-01-01

    A new generation of specialized scientific instruments called synchrotron light sources allows the imaging of materials at very fine scales. However, in contrast to a traditional microscope, interactive use has not previously been possible because of the large amounts of data generated and the considerable computation required to translate these data into a useful image. The authors describe a new software architecture that uses high-speed networks and supercomputers to enable quasi-real-time and hence interactive analysis of synchrotron light source data. This architecture uses technologies provided by the Globus computational grid toolkit to allow dynamic creation of a reconstruction pipeline that transfers data from a synchrotron source beamline to a preprocessing station, next to a parallel reconstruction system, and then to multiple visualization stations. Collaborative analysis tools allow multiple users to control data visualization. As a result, local and remote scientists can see and discuss preliminary results just minutes after data collection starts. The implications for more efficient use of this scarce resource and for more effective science appear tremendous

  15. Sources of global climate data and visualization portals

    Science.gov (United States)

    Douglas, David C.

    2014-01-01

    Climate is integral to the geophysical foundation upon which ecosystems are structured. Knowledge about mechanistic linkages between the geophysical and biological environments is essential for understanding how global warming may reshape contemporary ecosystems and ecosystem services. Numerous global data sources spanning several decades are available that document key geophysical metrics such as temperature and precipitation, and metrics of primary biological production such as vegetation phenology and ocean phytoplankton. This paper provides an internet directory to portals for visualizing or servers for downloading many of the more commonly used global datasets, as well as a description of how to write simple computer code to efficiently retrieve these data. The data are broadly useful for quantifying relationships between climate, habitat availability, and lower-trophic-level habitat quality - especially in Arctic regions where strong seasonality is accompanied by intrinsically high year-to-year variability. If defensible linkages between the geophysical (climate) and the biological environment can be established, general circulation model (GCM) projections of future climate conditions can be used to infer future biological responses. Robustness of this approach is, however, complicated by the number of direct, indirect, or interacting linkages involved. For example, response of a predator species to climate change will be influenced by the responses of its prey and competitors, and so forth throughout a trophic web. The complexities of ecological systems warrant sensible and parsimonious approaches for assessing and establishing the role of natural climate variability in order to substantiate inferences about the potential effects of global warming.
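
    The record mentions simple code for retrieving these datasets efficiently. A minimal, hypothetical Python sketch of such a retrieval script is shown below; the server URL and file names are placeholders rather than addresses given in the paper.

```python
# Hypothetical sketch of scripted retrieval of gridded climate files over HTTP.
# The URL and file names are placeholders; substitute the portal documented in the paper.
import os
import urllib.request

BASE_URL = "https://example.org/climate-data"                 # placeholder server
FILES = ["air_temperature_2013.nc", "precipitation_2013.nc"]  # placeholder file names

def fetch(name, out_dir="data"):
    os.makedirs(out_dir, exist_ok=True)
    target = os.path.join(out_dir, name)
    if os.path.exists(target):          # skip files already on disk
        return target
    urllib.request.urlretrieve(f"{BASE_URL}/{name}", target)
    return target

if __name__ == "__main__":
    for name in FILES:
        try:
            print("retrieved", fetch(name))
        except OSError as err:          # the placeholder URL will not resolve
            print("skipped", name, "-", err)
```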

  16. Footprints: A Visual Search Tool that Supports Discovery and Coverage Tracking.

    Science.gov (United States)

    Isaacs, Ellen; Domico, Kelly; Ahern, Shane; Bart, Eugene; Singhal, Mudita

    2014-12-01

    Searching a large document collection to learn about a broad subject involves the iterative process of figuring out what to ask, filtering the results, identifying useful documents, and deciding when one has covered enough material to stop searching. We are calling this activity "discoverage," discovery of relevant material and tracking coverage of that material. We built a visual analytic tool called Footprints that uses multiple coordinated visualizations to help users navigate through the discoverage process. To support discovery, Footprints displays topics extracted from documents that provide an overview of the search space and are used to construct searches visuospatially. Footprints allows users to triage their search results by assigning a status to each document (To Read, Read, Useful), and those status markings are shown on interactive histograms depicting the user's coverage through the documents across dates, sources, and topics. Coverage histograms help users notice biases in their search and fill any gaps in their analytic process. To create Footprints, we used a highly iterative, user-centered approach in which we conducted many evaluations during both the design and implementation stages and continually modified the design in response to feedback.

  17. Spectacle and SpecViz: New Spectral Analysis and Visualization Tools

    Science.gov (United States)

    Earl, Nicholas; Peeples, Molly; JDADF Developers

    2018-01-01

    A new era of spectroscopic exploration of our universe is being ushered in with advances in instrumentation and next-generation space telescopes. The advent of new spectroscopic instruments has highlighted a pressing need for tools scientists can use to analyze and explore these new data. We have developed Spectacle, a software package for analyzing both synthetic spectra from hydrodynamic simulations and real COS data, with an aim of characterizing the behavior of the circumgalactic medium. It allows easy reduction of spectral data and analytic line generation capabilities. Currently, the package is focused on automatic determination of absorption regions and line identification with custom line list support, simultaneous line fitting using Voigt profiles via least-squares or MCMC methods, and multi-component modeling of blended features. Non-parametric measurements, such as equivalent widths, delta v90, and full width at half maximum, are available. Spectacle also provides the ability to compose compound models used to generate synthetic spectra, allowing the user to define various LSF kernels and uncertainties and to specify sampling. We also present updates to the visualization tool SpecViz, developed in conjunction with the JWST data analysis tools development team, to aid in the exploration of spectral data. SpecViz is an open source, Python-based spectral 1-D interactive visualization and analysis application built around high-performance interactive plotting. It supports handling general and instrument-specific data and includes advanced tool-sets for filtering and detrending one-dimensional data, along with the ability to isolate absorption regions using slicing and to manipulate spectral features via spectral arithmetic. Multi-component modeling is also possible using a flexible model fitting tool-set that supports custom models to be used with various fitting routines. It also features robust user extensions such as custom data loaders and support for user
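
    The non-parametric measurements mentioned in the record are easy to illustrate. The sketch below is only a rough stand-in for Spectacle's machinery: it uses a Gaussian absorption profile instead of a Voigt profile, synthetic wavelengths, and SciPy's generic least-squares fitter, then integrates the normalized flux deficit to obtain an equivalent width.

```python
# Illustrative only: fit a single absorption line on a normalized spectrum and
# measure its equivalent width. A Gaussian stands in for the Voigt profiles
# used by Spectacle; all wavelengths and depths below are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def absorption(lam, depth, center, sigma):
    return 1.0 - depth * np.exp(-0.5 * ((lam - center) / sigma) ** 2)

rng = np.random.default_rng(0)
lam = np.linspace(1548.0, 1552.0, 400)            # synthetic wavelength grid (Angstrom)
flux = absorption(lam, 0.6, 1550.0, 0.25) + rng.normal(0, 0.01, lam.size)

popt, _ = curve_fit(absorption, lam, flux, p0=(0.5, 1550.0, 0.3))
model = absorption(lam, *popt)

# Equivalent width: integral of (1 - F/F_continuum) over wavelength (continuum = 1).
dlam = lam[1] - lam[0]
ew = float(np.sum(1.0 - model) * dlam)
print(f"depth={popt[0]:.3f} center={popt[1]:.3f} sigma={popt[2]:.3f} EW={ew:.3f} A")
```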

  18. JVM: Java Visual Mapping tool for next generation sequencing read.

    Science.gov (United States)

    Yang, Ye; Liu, Juan

    2015-01-01

    We developed a program, JVM (Java Visual Mapping), for mapping next generation sequencing reads to a reference sequence. The program is implemented in Java and is designed to handle the millions of short reads generated by the Illumina sequencing technology. It employs a seed index strategy and octal encoding operations for sequence alignments. JVM is useful for DNA-Seq and RNA-Seq when dealing with single-end resequencing. JVM is a desktop application, which supports read volumes from 1 MB to 10 GB.
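
    The seed index strategy named in the record is a standard short-read mapping idea: index fixed-length words of the reference, then look up a read's seed to find candidate placements. The toy Python sketch below illustrates only that idea, not the JVM implementation (which is written in Java and additionally uses octal encoding of bases).

```python
# Toy seed-index mapper: index k-mers of the reference, then place reads by
# looking up their leading seed. Illustration of the strategy only, not JVM.
from collections import defaultdict

def build_seed_index(reference, k=11):
    index = defaultdict(list)
    for i in range(len(reference) - k + 1):
        index[reference[i:i + k]].append(i)
    return index

def candidate_positions(read, index, k=11):
    # Reference offsets where the read's first k-mer occurs.
    return list(index.get(read[:k], []))

if __name__ == "__main__":
    ref = "ACGTACGTTAGCTAGCTAGGATCCACGTACGTTAGC"
    idx = build_seed_index(ref, k=5)
    read = "TAGCTAGGATC"
    print("candidate offsets:", candidate_positions(read, idx, k=5))
```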

  19. AR4VI: AR as an Accessibility Tool for People with Visual Impairments.

    Science.gov (United States)

    Coughlan, James M; Miele, Joshua

    2017-10-01

    Although AR technology has been largely dominated by visual media, a number of AR tools using both visual and auditory feedback have been developed specifically to assist people with low vision or blindness - an application domain that we term Augmented Reality for Visual Impairment (AR4VI). We describe two AR4VI tools developed at Smith-Kettlewell, as well as a number of pre-existing examples. We emphasize that AR4VI is a powerful tool with the potential to remove or significantly reduce a range of accessibility barriers. Rather than being restricted to use by people with visual impairments, AR4VI is a compelling universal design approach offering benefits for mainstream applications as well.

  20. Commissioning software tools at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Emery, L.

    1995-01-01

    A software tool-oriented approach has been adopted in the commissioning of the Advanced Photon Source (APS) at Argonne National Laboratory, particularly in the commissioning of the Positron Accumulator Ring (PAR). The general philosophy is to decompose a complicated procedure involving measurement, data processing, and control into a series of simpler steps, each accomplished by a generic toolkit program. The implementation is greatly facilitated by adopting the SDDS (self-describing data set protocol), which comes with its own toolkit. The combined toolkit has made accelerator physics measurements easier. For instance, the measurement of the optical functions of the PAR and the beamlines connected to it have been largely automated. Complicated measurements are feasible with a combination of tools running independently

  1. Experimental Evaluation of Electric Power Grid Visualization Tools in the EIOC

    Energy Technology Data Exchange (ETDEWEB)

    Greitzer, Frank L.; Dauenhauer, Peter M.; Wierks, Tamara G.; Podmore, Robin; Dalton, Angela C.

    2009-12-01

    The present study follows an initial human factors evaluation of four electric power grid visualization tools and reports on an empirical evaluation of two of the four tools: Graphical Contingency Analysis, and Phasor State Estimator. The evaluation was conducted within specific experimental studies designed to measure the impact on decision making performance.

  2. The DiaCog: A Prototype Tool for Visualizing Online Dialog Games' Interactions

    Science.gov (United States)

    Yengin, Ilker; Lazarevic, Bojan

    2014-01-01

    This paper proposes and explains the design of a prototype learning tool named the DiaCog. The DiaCog visualizes dialog interactions within an online dialog game by using dynamically created cognitive maps. As a purposefully designed tool for enhancing learning effectiveness, the DiaCog might be applicable to dialogs at discussion boards within a…

  3. An Excel®-based visualization tool of 2-D soil gas concentration profiles in petroleum vapor intrusion.

    Science.gov (United States)

    Verginelli, Iason; Yao, Yijun; Suuberg, Eric M

    2016-01-01

    In this study we present a petroleum vapor intrusion tool implemented in Microsoft® Excel® using Visual Basic for Applications (VBA) and integrated within a graphical interface. The latter helps users easily visualize two-dimensional soil gas concentration profiles and indoor concentrations as a function of site-specific conditions such as source strength and depth, biodegradation reaction rate constant, soil characteristics and building features. This tool is based on a two-dimensional explicit analytical model that combines steady-state diffusion-dominated vapor transport in a homogeneous soil with a piecewise first-order aerobic biodegradation model, in which the rate is limited by oxygen availability. As recommended in the recently released United States Environmental Protection Agency's final Petroleum Vapor Intrusion guidance, a sensitivity analysis and a simplified Monte Carlo uncertainty analysis are also included in the spreadsheet.
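
    The tool's two-dimensional model is not reproduced here, but its main ingredients, diffusion-dominated transport and first-order biodegradation, can be illustrated with a much simpler one-dimensional analogue. The Python sketch below solves D c'' - k c = 0 between the ground surface and a vapor source at depth L; all parameter values are illustrative placeholders, not ones from the paper.

```python
# Greatly simplified 1-D analogue (not the tool's 2-D model): steady-state
# diffusion with first-order biodegradation, D*c'' - k*c = 0, with c = 0 at
# the ground surface (z = 0) and c = c_source at the vapor source depth z = L.
# All parameter values are illustrative placeholders.
import numpy as np

def profile(z, c_source, L, D, k):
    """Concentration at depth z below the ground surface, 0 <= z <= L."""
    a = np.sqrt(k / D)
    return c_source * np.sinh(a * z) / np.sinh(a * L)

if __name__ == "__main__":
    L = 3.0          # source depth below the surface, m
    D = 8.0e-7       # effective diffusion coefficient, m^2/s
    k = 1.0e-5       # first-order biodegradation rate, 1/s
    c_src = 100.0    # source vapor concentration, arbitrary units
    for z in np.linspace(0.0, L, 7):
        print(f"z = {z:4.2f} m  c = {profile(z, c_src, L, D, k):8.3f}")
```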

  4. The visual in sport history: approaches, methodologies and sources

    OpenAIRE

    Huggins, Mike

    2015-01-01

    Historians of sport now increasingly accept that visual inquiry offers another dimension to social and cultural research into sport and its history. It is complex and its boundaries are rapidly evolving. This overview offers a justification for placing more emphasis on visual approaches and an introduction to the study and interpretation of visual culture in relation to the history of sport. It stresses the importance of adopting a critical approach and the need to be reflective about that cr...

  5. VarB Plus: An Integrated Tool for Visualization of Genome Variation Datasets

    KAUST Repository

    Hidayah, Lailatul

    2012-07-01

    Research on genomic sequences has been improving significantly as more advanced technology for sequencing has been developed. This opens enormous opportunities for sequence analysis. Various analytical tools have been built for purposes such as sequence assembly, read alignments, genome browsing, comparative genomics, and visualization. From the visualization perspective, there is an increasing trend towards use of large-scale computation. However, more than power is required to produce an informative image. This is a challenge that we address by providing several ways of representing biological data in order to advance the inference endeavors of biologists. This thesis focuses on visualization of variations found in genomic sequences. We develop several visualization functions and embed them in an existing variation visualization tool as extensions. The tool we improved is named VarB, hence the nomenclature for our enhancement is VarB Plus. To the best of our knowledge, besides VarB, there is no tool that provides the capability of dynamic visualization of genome variation datasets as well as statistical analysis. Dynamic visualization allows users to toggle different parameters on and off and see the results on the fly. The statistical analysis includes Fixation Index, Relative Variant Density, and Tajima’s D. Hence we focused our efforts on this tool. The scope of our work includes plots of per-base genome coverage, Principal Coordinate Analysis (PCoA), integration with a read alignment viewer named LookSeq, and visualization of geo-biological data. In addition to description of embedded functionalities, significance, and limitations, future improvements are discussed. The result is four extensions embedded successfully in the original tool, which is built on the Qt framework in C++. Hence it is portable to numerous platforms. Our extensions have shown acceptable execution time in a beta testing with various high-volume published datasets, as well as positive

  6. Development and Evaluation of Secure Socket Layer Visualization Tool with Packet Capturing Function

    Directory of Open Access Journals (Sweden)

    Arai Masayuki

    2015-01-01

    Full Text Available Secure Socket Layer (SSL) has become a fundamental technology that secures browser-processed personal details sent to the server. As a result, communication and computer engineers are advised to learn the protocol. However, understanding SSL is very difficult because of its intricate communication procedure. To solve this problem, we developed a visualization tool for understanding SSL. This paper describes the design, implementation methods, and evaluation of the tool. The evaluation results show that the visualization tool is effective for learning SSL.
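
    A few lines of code make the handshake that the tool visualizes more concrete. The Python sketch below is independent of the tool described above; the host name is an example placeholder. It completes a TLS handshake with the standard-library ssl module and prints the negotiated protocol version, cipher suite, and peer certificate fields.

```python
# Sketch: complete a TLS handshake and print what was negotiated.
# The host below is an example placeholder; any HTTPS server will do.
import socket
import ssl

HOST, PORT = "www.example.org", 443

context = ssl.create_default_context()
with socket.create_connection((HOST, PORT), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=HOST) as tls:
        print("protocol :", tls.version())      # e.g. 'TLSv1.3'
        print("cipher   :", tls.cipher())       # (name, protocol, secret bits)
        cert = tls.getpeercert()
        print("subject  :", cert.get("subject"))
        print("notAfter :", cert.get("notAfter"))
```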

  7. Exploration of Metagenome Assemblies with an Interactive Visualization Tool

    Energy Technology Data Exchange (ETDEWEB)

    Cantor, Michael; Nordberg, Henrik; Smirnova, Tatyana; Andersen, Evan; Tringe, Susannah; Hess, Matthias; Dubchak, Inna

    2014-07-09

    Metagenomics, one of the fastest growing areas of modern genomic science, is the genetic profiling of the entire community of microbial organisms present in an environmental sample. Elviz is a web-based tool for the interactive exploration of metagenome assemblies. Elviz can be used with publicly available data sets from the Joint Genome Institute or with custom user-loaded assemblies. Elviz is available at genome.jgi.doe.gov/viz

  8. Property Integration: Componentless Design Techniques and Visualization Tools

    DEFF Research Database (Denmark)

    El-Halwagi, Mahmoud M; Glasgow, I.M.; Eden, Mario Richard

    2004-01-01

    Property integration is defined as a functionality-based, holistic approach to the allocation and manipulation of streams and processing units, which is based on tracking, adjusting, assigning, and matching functionalities throughout the process. Revised lever arm rules are devised to allow optimal allocation while maintaining intra- and interstream conservation of the property-based clusters. The property integration problem is mapped into the cluster domain. This dual problem is solved in terms of clusters and then mapped to the primal problem in the property domain. Several new rules are derived for graphical techniques, particularly systematic rules and visualization techniques for the identification of optimal mixing of streams and their allocation to units. Furthermore, a derivation of the correspondence between clustering arms and fractional contribution of streams is presented. This correspondence

  9. Transcript structure and domain display: a customizable transcript visualization tool.

    Science.gov (United States)

    Watanabe, Kenneth A; Ma, Kaiwang; Homayouni, Arielle; Rushton, Paul J; Shen, Qingxi J

    2016-07-01

    Transcript Structure and Domain Display (TSDD) is a publicly available, web-based program that provides publication-quality images of transcript structures and domains. TSDD is capable of producing transcript structures from GFF/GFF3 and BED files. Alternatively, the GFF files of several model organisms have been pre-loaded so that users only need to enter the locus IDs of the transcripts to be displayed. Visualization of transcripts provides many benefits to researchers, ranging from evolutionary analysis of DNA-binding domains to predictive function modeling. TSDD is freely available for non-commercial users at http://shenlab.sols.unlv.edu/shenlab/software/TSD/transcript_display.html; contact: jeffery.shen@unlv.nevada.edu. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
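
    As a rough illustration of the kind of input TSDD consumes (this is not TSDD's own code), the sketch below parses exon features for each transcript from GFF3-formatted lines into (start, end) intervals that a drawing routine could render.

```python
# Illustrative GFF3 parsing sketch (not TSDD itself): collect exon intervals
# per transcript so a plotting routine could draw the gene structure.
from collections import defaultdict

def exons_by_transcript(gff3_lines):
    exons = defaultdict(list)
    for line in gff3_lines:
        if not line.strip() or line.startswith("#"):
            continue
        cols = line.rstrip("\n").split("\t")
        if len(cols) < 9 or cols[2] != "exon":
            continue
        start, end = int(cols[3]), int(cols[4])
        attrs = dict(kv.split("=", 1) for kv in cols[8].split(";") if "=" in kv)
        exons[attrs.get("Parent", "unknown")].append((start, end))
    return {tid: sorted(iv) for tid, iv in exons.items()}

if __name__ == "__main__":
    sample = [
        "chr1\tdemo\texon\t100\t200\t.\t+\t.\tParent=tx1",
        "chr1\tdemo\texon\t300\t450\t.\t+\t.\tParent=tx1",
    ]
    print(exons_by_transcript(sample))
```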

  10. Visible Earthquakes: a web-based tool for visualizing and modeling InSAR earthquake data

    Science.gov (United States)

    Funning, G. J.; Cockett, R.

    2012-12-01

    InSAR (Interferometric Synthetic Aperture Radar) is a technique for measuring the deformation of the ground using satellite radar data. One of the principal applications of this method is in the study of earthquakes; in the past 20 years over 70 earthquakes have been studied in this way, and forthcoming satellite missions promise to enable the routine and timely study of events in the future. Despite the utility of the technique and its widespread adoption by the research community, InSAR does not feature in the teaching curricula of most university geoscience departments. This is, we believe, due to a lack of accessibility to software and data. Existing tools for the visualization and modeling of interferograms are often research-oriented, command line-based and/or prohibitively expensive. Here we present a new web-based interactive tool for comparing real InSAR data with simple elastic models. The overall design of this tool was focused on ease of access and use. This tool should allow interested nonspecialists to gain a feel for the use of such data and greatly facilitate integration of InSAR into upper division geoscience courses, giving students practice in comparing actual data to modeled results. The tool, provisionally named 'Visible Earthquakes', uses web-based technologies to instantly render the displacement field that would be observable using InSAR for a given fault location, geometry, orientation, and slip. The user can adjust these 'source parameters' using a simple, clickable interface, and see how these affect the resulting model interferogram. By visually matching the model interferogram to a real earthquake interferogram (processed separately and included in the web tool) a user can produce their own estimates of the earthquake's source parameters. Once satisfied with the fit of their models, users can submit their results and see how they compare with the distribution of all other contributed earthquake models, as well as the mean and median

  11. Do Bedside Visual Tools Improve Patient and Caregiver Satisfaction? A Systematic Review of the Literature.

    Science.gov (United States)

    Goyal, Anupama A; Tur, Komalpreet; Mann, Jason; Townsend, Whitney; Flanders, Scott A; Chopra, Vineet

    2017-11-01

    Although common, the impact of low-cost bedside visual tools, such as whiteboards, on patient care is unclear. To systematically review the literature and assess the influence of bedside visual tools on patient satisfaction. Medline, Embase, SCOPUS, Web of Science, CINAHL, and CENTRAL. Studies of adult or pediatric hospitalized patients reporting physician identification, understanding of provider roles, patient-provider communication, and satisfaction with care from the use of visual tools were included. Outcomes were categorized as positive, negative, or neutral based on survey responses for identification, communication, and satisfaction. Two reviewers screened studies, extracted data, and assessed the risk of study bias. Sixteen studies met the inclusion criteria. Visual tools included whiteboards (n = 4), physician pictures (n = 7), whiteboard and picture (n = 1), electronic medical record-based patient portals (n = 3), and formatted notepads (n = 1). Tools improved patients' identification of providers (13/13 studies). The impact on understanding the providers' roles was largely positive (8/10 studies). Visual tools improved patient-provider communication (4/5 studies) and satisfaction (6/8 studies). In adults, satisfaction varied between positive with the use of whiteboards (2/5 studies) and neutral with pictures (1/5 studies). Satisfaction related to pictures in pediatric patients was either positive (1/3 studies) or neutral (1/3 studies). Differences in tool format (individual pictures vs handouts with pictures of all providers) and study design (randomized vs cohort) may explain variable outcomes. The use of bedside visual tools appears to improve patient recognition of providers and patient-provider communication. Future studies that include better design and outcome assessment are necessary before widespread use can be recommended. © 2017 Society of Hospital Medicine

  12. Efficient Delivery and Visualization of Long Time-Series Datasets Using Das2 Tools

    Science.gov (United States)

    Piker, C.; Granroth, L.; Faden, J.; Kurth, W. S.

    2017-12-01

    For over 14 years the University of Iowa Radio and Plasma Wave Group has utilized a network-transparent data streaming and visualization system for most daily data review and collaboration activities. This system, called Das2, was originally designed in support of the Cassini Radio and Plasma Wave Science (RPWS) investigation, but is now relied on for daily review and analysis of Voyager, Polar, Cluster, Mars Express, Juno and other mission results. In light of current efforts to promote automatic data distribution in space physics it seems prudent to provide an overview of our open source Das2 programs and interface definitions to the wider community and to recount lessons learned. This submission will provide an overview of the interfaces that define the system, describe the relationship between the Das2 effort and Autoplot, and examine the handling of Cassini RPWS Wideband waveforms and dynamic spectra as examples of dealing with long time-series data sets. In addition, the advantages and limitations of the current Das2 tool set will be discussed, as well as lessons learned that are applicable to other data sharing initiatives. Finally, plans for future developments are outlined, including improved catalogs to support 'no-software' data sources and redundant multi-server failover, as well as new adapters for CSV (Comma Separated Values) and JSON (Javascript Object Notation) output to support Cassini closeout and the HAPI (Heliophysics Application Programming Interface) initiative.
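
    The Das2 stream format itself is not reproduced here, but the general shape of the CSV adapter mentioned above can be sketched generically: parse "time, value" rows into arrays suitable for plotting. The column names and sample values below are invented for illustration.

```python
# Generic sketch of a CSV time-series adapter (not the Das2 stream format):
# parse "ISO-8601 time, value" rows into lists suitable for plotting.
import csv
import io
from datetime import datetime

SAMPLE = """time,amplitude
2017-01-01T00:00:00,1.2
2017-01-01T00:00:01,1.5
2017-01-01T00:00:02,0.9
"""

def read_series(text):
    times, values = [], []
    for row in csv.DictReader(io.StringIO(text)):
        times.append(datetime.fromisoformat(row["time"]))
        values.append(float(row["amplitude"]))
    return times, values

if __name__ == "__main__":
    t, v = read_series(SAMPLE)
    print(len(t), "samples from", t[0], "to", t[-1])
```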

  13. Seeing the Point: Using Visual Sources to Understand the Arguments for Women's Suffrage

    Science.gov (United States)

    Card, Jane

    2011-01-01

    Visual sources, Jane Card argues, are a powerful resource for historical learning but using them in the classroom requires careful thought and planning. Card here shares how she has used visual source material in order to teach her students about the women's suffrage movement. In particular, Card shows how a chain of questions that moves from the…

  14. Web-based Data Exploration, Exploitation and Visualization Tools for Satellite Sensor VIS/IR Calibration Applications

    Science.gov (United States)

    Gopalan, A.; Doelling, D. R.; Scarino, B. R.; Chee, T.; Haney, C.; Bhatt, R.

    2016-12-01

    The CERES calibration group at NASA/LaRC has developed and deployed a suite of online data exploration and visualization tools targeted towards a range of spaceborne VIS/IR imager calibration applications for the Earth Science community. These web-based tools are driven by the open-source R (Language for Statistical Computing and Visualization) with a web interface for the user to customize the results according to their application. The tool contains a library of geostationary and sun-synchronous imager spectral response functions (SRF), incoming solar spectra, SCIAMACHY and Hyperion Earth-reflected visible hyper-spectral data, and IASI IR hyper-spectral data. The suite of six specific web-based tools was designed to provide critical information necessary for sensor cross-calibration. One of the challenges of sensor cross-calibration is accounting for spectral band differences, which may introduce biases if not handled properly. The spectral band adjustment factors (SBAF) are a function of the Earth target, the atmospheric and cloud conditions or scene type, and the angular conditions under which the sensor radiance pairs are obtained. The SBAF will need to be customized for each inter-calibration target and sensor pair. The advantages of having a community open-source tool are: 1) only one archive of SCIAMACHY, Hyperion, and IASI datasets, on the order of 50 TB, needs to be maintained; 2) the framework allows easy incorporation of new satellite SRFs and hyper-spectral datasets and associated coincident atmospheric and cloud properties, such as PW; 3) web tool or SBAF algorithm improvements or suggestions, once incorporated, benefit the community at large; and 4) the customization effort is on the user rather than on the host. In this paper we discuss each of these tools in detail and explore the variety of advanced options that can be used to constrain the results, along with specific use cases that highlight the value added by these datasets.
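
    The spectral band adjustment factor itself is straightforward to express in code. The sketch below is a simplification with synthetic spectra and Gaussian SRFs, not the CERES/LaRC web tool: it band-averages hyper-spectral reflectance through two sensors' SRFs and regresses the resulting pairs through the origin to obtain an SBAF-style scale factor.

```python
# Simplified SBAF sketch with synthetic data (not the CERES/LaRC web tool):
# band-average hyper-spectral reflectance through two SRFs, then regress the
# resulting band-value pairs to get a scale factor between the two sensors.
import numpy as np

rng = np.random.default_rng(1)
wl = np.linspace(0.4, 1.0, 601)        # wavelength grid, micrometers (uniform)
dwl = wl[1] - wl[0]

def gaussian_srf(center, width):
    srf = np.exp(-0.5 * ((wl - center) / width) ** 2)
    return srf / (srf.sum() * dwl)     # normalize so the SRF integrates to 1

srf_a = gaussian_srf(0.65, 0.03)       # hypothetical sensor A red band
srf_b = gaussian_srf(0.66, 0.05)       # hypothetical sensor B red band

# Synthetic stand-in for hyper-spectral footprint reflectance (50 scenes).
spectra = 0.2 + 0.1 * np.sin(8 * wl)[None, :] + 0.02 * rng.normal(size=(50, wl.size))

band_a = (spectra * srf_a).sum(axis=1) * dwl   # band-averaged values, sensor A
band_b = (spectra * srf_b).sum(axis=1) * dwl   # band-averaged values, sensor B

sbaf = np.sum(band_a * band_b) / np.sum(band_b ** 2)   # force-through-origin fit
print(f"SBAF (A relative to B): {sbaf:.4f}")
```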

  15. A Virtual Reality Visualization Tool for Neuron Tracing.

    Science.gov (United States)

    Usher, Will; Klacansky, Pavol; Federer, Frederick; Bremer, Peer-Timo; Knoll, Aaron; Yarch, Jeff; Angelucci, Alessandra; Pascucci, Valerio

    2018-01-01

    Tracing neurons in large-scale microscopy data is crucial to establishing a wiring diagram of the brain, which is needed to understand how neural circuits in the brain process information and generate behavior. Automatic techniques often fail for large and complex datasets, and connectomics researchers may spend weeks or months manually tracing neurons using 2D image stacks. We present a design study of a new virtual reality (VR) system, developed in collaboration with trained neuroanatomists, to trace neurons in microscope scans of the visual cortex of primates. We hypothesize that using consumer-grade VR technology to interact with neurons directly in 3D will help neuroscientists better resolve complex cases and enable them to trace neurons faster and with less physical and mental strain. We discuss both the design process and technical challenges in developing an interactive system to navigate and manipulate terabyte-sized image volumes in VR. Using a number of different datasets, we demonstrate that, compared to widely used commercial software, consumer-grade VR presents a promising alternative for scientists.

  16. Microbial source tracking: a tool for identifying sources of microbial contamination in the food chain.

    Science.gov (United States)

    Fu, Ling-Lin; Li, Jian-Rong

    2014-01-01

    The ability to trace fecal indicators and food-borne pathogens to the point of origin has major ramifications for food industry, food regulatory agencies, and public health. Such information would enable food producers and processors to better understand sources of contamination and thereby take corrective actions to prevent transmission. Microbial source tracking (MST), which currently is largely focused on determining sources of fecal contamination in waterways, is also providing the scientific community tools for tracking both fecal bacteria and food-borne pathogens contamination in the food chain. Approaches to MST are commonly classified as library-dependent methods (LDMs) or library-independent methods (LIMs). These tools will have widespread applications, including the use for regulatory compliance, pollution remediation, and risk assessment. These tools will reduce the incidence of illness associated with food and water. Our aim in this review is to highlight the use of molecular MST methods in application to understanding the source and transmission of food-borne pathogens. Moreover, the future directions of MST research are also discussed.

  17. Visual Impairment/lntracranial Pressure Risk Clinical Care Data Tools

    Science.gov (United States)

    Van Baalen, Mary; Mason, Sara S.; Taiym, Wafa; Wear, Mary L.; Moynihan, Shannan; Alexander, David; Hart, Steve; Tarver, William

    2014-01-01

    Prior to 2010, several ISS crewmembers returned from spaceflight with changes to their vision, ranging from a mild hyperopic shift to frank disc edema. As a result, NASA expanded clinical vision testing to include more comprehensive medical imaging, including Optical Coherence Tomography and 3 Tesla Brain and Orbit MRIs. The Space and Clinical Operations (SCO) Division developed a clinical practice guideline that classified individuals based on their symptoms and diagnoses to facilitate clinical care. For the purposes of clinical surveillance, this classification was applied retrospectively to all crewmembers who had sufficient testing for classification. This classification is also a tool that has been leveraged for researchers to identify potential risk factors. In March 2014, driven in part by a more comprehensive understanding of the imaging data and increased imaging capability on orbit, the SCO Division revised their clinical care guidance to outline in-flight care and increase post-flight follow up. The new clinical guidance does not include a classification scheme

  18. Multiband Study of Radio Sources of the RCR Catalogue with Virtual Observatory Tools

    Directory of Open Access Journals (Sweden)

    Zhelenkova O. P.

    2012-09-01

    Full Text Available We present early results of our multiband study of the RATAN Cold Revised (RCR) catalogue obtained from seven cycles of the "Cold" survey carried out with the RATAN-600 radio telescope at 7.6 cm in 1980-1999, at the declination of the SS 433 source. We used the 2MASS and LAS UKIDSS infrared surveys, the DSS-II and SDSS DR7 optical surveys, as well as the USNO-B1 and GSC-II catalogues, and the VLSS, TXS, NVSS, FIRST and GB6 radio surveys to accumulate information about the sources. For radio sources that have no detectable optical candidate in optical or infrared catalogues, we additionally looked through images in several bands from the SDSS, LAS UKIDSS, DPOSS, and 2MASS surveys and also used co-added frames in different bands. We reliably identified 76% of the radio sources of the RCR catalogue. We used the ALADIN and SAOImage DS9 scripting capabilities, the interoperability services of ALADIN and TOPCAT, and also other Virtual Observatory (VO) tools and resources, such as CASJobs, NED, Vizier, and WSA, for effective data access, visualization and analysis. Without VO tools it would have been problematic to perform our study.

  19. jSPyDB, an open source database-independent tool for data management

    Science.gov (United States)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-12-01

    Nowadays, the number of commercial tools available for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only with a specific kind of database, and they are platform-dependent and very CPU- and memory-consuming. jSPyDB is a free web-based tool written using Python and Javascript. It relies on jQuery and Python libraries, and is intended to provide a simple handler to different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install and configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a back-end server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users are allowed to create their customized views for better data visualization. In this way, we optimize the performance of database servers by avoiding short connections and concurrent sessions. In addition, security is enforced since we do not provide users with the possibility to directly execute any SQL statement.

  20. jSPyDB, an open source database-independent tool for data management

    International Nuclear Information System (INIS)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-01-01

    Nowadays, the number of commercial tools available for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only with a specific kind of database, and they are platform-dependent and very CPU- and memory-consuming. jSPyDB is a free web-based tool written using Python and Javascript. It relies on jQuery and Python libraries, and is intended to provide a simple handler to different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install and configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a back-end server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users are allowed to create their customized views for better data visualization. In this way, we optimize the performance of database servers by avoiding short connections and concurrent sessions. In addition, security is enforced since we do not provide users with the possibility to directly execute any SQL statement.
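
    The back-end role described in these two records, reflecting whatever schema a database exposes and handing rows to the browser as JSON, can be sketched with SQLAlchemy alone. The snippet below is not jSPyDB's code, the SQLite URL is a placeholder for whichever database the server is configured to reach, and it assumes SQLAlchemy 1.4 or later.

```python
# Sketch of the back-end role described above (not jSPyDB's actual code):
# reflect an existing database with SQLAlchemy and serialize rows to JSON
# for a browser front end. The SQLite URL is a placeholder.
import json
from sqlalchemy import create_engine, MetaData, select

engine = create_engine("sqlite:///example.db")   # placeholder database
metadata = MetaData()
metadata.reflect(bind=engine)                    # discover tables at runtime

def table_as_json(name, limit=20):
    table = metadata.tables[name]
    with engine.connect() as conn:
        rows = conn.execute(select(table).limit(limit)).mappings().all()
    return json.dumps([dict(r) for r in rows], default=str)

if __name__ == "__main__":
    for name in metadata.tables:
        print(name, table_as_json(name))
```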

  1. Effect of different illumination sources on reading and visual performance

    Directory of Open Access Journals (Sweden)

    Male Shiva Ram

    2018-01-01

    Conclusion: This study demonstrates the influence of illumination on reading rate. There were no significant differences between males and females under the different illuminations; however, males preferred CFL and females preferred FLUO for faster reading and visual comfort. Interestingly, neither group preferred LED or TUNG. Although LED is energy-efficient, visual performance under it is poor; it is uncomfortable for prolonged reading and causes early symptoms of fatigue.

  2. VSEARCH: a versatile open source tool for metagenomics.

    Science.gov (United States)

    Rognes, Torbjørn; Flouri, Tomáš; Nichols, Ben; Quince, Christopher; Mahé, Frédéric

    2016-01-01

    VSEARCH is an open source and free of charge multithreaded 64-bit tool for processing and preparing metagenomics, genomics and population genomics nucleotide sequence data. It is designed as an alternative to the widely used USEARCH tool (Edgar, 2010) for which the source code is not publicly available, algorithm details are only rudimentarily described, and only a memory-confined 32-bit version is freely available for academic use. When searching nucleotide sequences, VSEARCH uses a fast heuristic based on words shared by the query and target sequences in order to quickly identify similar sequences; a similar strategy is probably used in USEARCH. VSEARCH then performs optimal global sequence alignment of the query against potential target sequences, using full dynamic programming instead of the seed-and-extend heuristic used by USEARCH. Pairwise alignments are computed in parallel using vectorisation and multiple threads. VSEARCH includes most commands for analysing nucleotide sequences available in USEARCH version 7 and several of those available in USEARCH version 8, including searching (exact or based on global alignment), clustering by similarity (using length pre-sorting, abundance pre-sorting or a user-defined order), chimera detection (reference-based or de novo), dereplication (full length or prefix), pairwise alignment, reverse complementation, sorting, and subsampling. VSEARCH also includes commands for FASTQ file processing, i.e., format detection, filtering, read quality statistics, and merging of paired reads. Furthermore, VSEARCH extends functionality with several new commands and improvements, including shuffling, rereplication, masking of low-complexity sequences with the well-known DUST algorithm, a choice among different similarity definitions, and FASTQ file format conversion. VSEARCH is here shown to be more accurate than USEARCH when performing searching, clustering, chimera detection and subsampling, while on a par with USEARCH for paired
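
    The "optimal global sequence alignment ... using full dynamic programming" that distinguishes VSEARCH from seed-and-extend heuristics is the classic Needleman-Wunsch recurrence. The compact Python sketch below computes only the global alignment score, with illustrative match/mismatch/gap scores; VSEARCH itself implements this in vectorised, multithreaded C++.

```python
# Needleman-Wunsch global alignment score, the recurrence VSEARCH evaluates
# (plain Python here; scores below are illustrative, not VSEARCH defaults).
def global_alignment_score(a, b, match=2, mismatch=-4, gap=-2):
    n, m = len(a), len(b)
    # prev[j] holds the best score aligning a[:i-1] with b[:j]
    prev = [j * gap for j in range(m + 1)]
    for i in range(1, n + 1):
        curr = [i * gap] + [0] * m
        for j in range(1, m + 1):
            diag = prev[j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            curr[j] = max(diag, prev[j] + gap, curr[j - 1] + gap)
        prev = curr
    return prev[m]

if __name__ == "__main__":
    print(global_alignment_score("ACGTTGCA", "ACGTGGCA"))
```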

  3. VSEARCH: a versatile open source tool for metagenomics

    Directory of Open Access Journals (Sweden)

    Torbjørn Rognes

    2016-10-01

    Full Text Available Background VSEARCH is an open source and free of charge multithreaded 64-bit tool for processing and preparing metagenomics, genomics and population genomics nucleotide sequence data. It is designed as an alternative to the widely used USEARCH tool (Edgar, 2010), for which the source code is not publicly available, algorithm details are only rudimentarily described, and only a memory-confined 32-bit version is freely available for academic use. Methods When searching nucleotide sequences, VSEARCH uses a fast heuristic based on words shared by the query and target sequences in order to quickly identify similar sequences; a similar strategy is probably used in USEARCH. VSEARCH then performs optimal global sequence alignment of the query against potential target sequences, using full dynamic programming instead of the seed-and-extend heuristic used by USEARCH. Pairwise alignments are computed in parallel using vectorisation and multiple threads. Results VSEARCH includes most commands for analysing nucleotide sequences available in USEARCH version 7 and several of those available in USEARCH version 8, including searching (exact or based on global alignment), clustering by similarity (using length pre-sorting, abundance pre-sorting or a user-defined order), chimera detection (reference-based or de novo), dereplication (full length or prefix), pairwise alignment, reverse complementation, sorting, and subsampling. VSEARCH also includes commands for FASTQ file processing, i.e., format detection, filtering, read quality statistics, and merging of paired reads. Furthermore, VSEARCH extends functionality with several new commands and improvements, including shuffling, rereplication, masking of low-complexity sequences with the well-known DUST algorithm, a choice among different similarity definitions, and FASTQ file format conversion. VSEARCH is here shown to be more accurate than USEARCH when performing searching, clustering, chimera detection and subsampling

  4. Experiences of graduate students: Using Cabri as a visualization tool in math education

    Directory of Open Access Journals (Sweden)

    Çiğdem Gül

    2014-12-01

    Full Text Available Through the use of graphic calculators and dynamic software running on computers and mobile devices, students can learn complex algebraic concepts. The purpose of this study is to investigate the experiences of graduate students using Cabri as a visualization tool in math education. A qualitative case study design was used. The participants were five graduate students enrolled in the non-thesis math program of a university located in the Black Sea region. As a dynamic learning tool, Cabri provided participants with an environment in which they visually discovered geometry. It was concluded that dynamic learning tools like Cabri have huge potential for visually teaching the challenging concepts that students struggle to imagine. Further research should investigate plans for integrating the use of dynamic learning software into the math curriculum

  5. Visual Representation in GENESIS as a tool for Physical Modeling, Sound Synthesis and Musical Composition

    OpenAIRE

    Villeneuve, Jérôme; Cadoz, Claude; Castagné, Nicolas

    2015-01-01

    The motivation of this paper is to highlight the importance of visual representations for artists when modeling and simulating mass-interaction physical networks in the context of sound synthesis and musical composition. GENESIS is a musician-oriented software environment for sound synthesis and musical composition. However, despite this orientation, a substantial amount of effort has been put into building a rich variety of tools based on static or dynamic visual representations of models an...

  6. Applying Dataflow Architecture and Visualization Tools to In Vitro Pharmacology Data Automation.

    Science.gov (United States)

    Pechter, David; Xu, Serena; Kurtz, Marc; Williams, Steven; Sonatore, Lisa; Villafania, Artjohn; Agrawal, Sony

    2016-12-01

    The pace and complexity of modern drug discovery places ever-increasing demands on scientists for data analysis and interpretation. Data flow programming and modern visualization tools address these demands directly. Three different requirements-one for allosteric modulator analysis, one for a specialized clotting analysis, and one for enzyme global progress curve analysis-are reviewed, and their execution in a combined data flow/visualization environment is outlined. © 2016 Society for Laboratory Automation and Screening.

  7. An interactive visualization tool for multi-channel confocal microscopy data in neurobiology research

    KAUST Repository

    Yong Wan,

    2009-11-01

    Confocal microscopy is widely used in neurobiology for studying the three-dimensional structure of the nervous system. Confocal image data are often multi-channel, with each channel resulting from a different fluorescent dye or fluorescent protein; one channel may have dense data while another has sparse data; and there are often structures at several spatial scales: subneuronal domains, neurons, and large groups of neurons (brain regions). Even qualitative analysis can therefore require visualization using techniques and parameters fine-tuned to a particular dataset. Despite the plethora of volume rendering techniques that have been available for many years, the techniques standardly used in neurobiological research are somewhat rudimentary, such as looking at image slices or maximal intensity projections. Thus there is a real demand from neurobiologists, and biologists in general, for a flexible visualization tool that allows interactive visualization of multi-channel confocal data, with rapid fine-tuning of parameters to reveal the three-dimensional relationships of structures of interest. Together with neurobiologists, we have designed such a tool, choosing visualization methods to suit the characteristics of confocal data and a typical biologist's workflow. We use interactive volume rendering with intuitive settings for multidimensional transfer functions, multiple render modes and multi-views for multi-channel volume data, and embedding of polygon data into volume data for rendering and editing. As an example, we apply this tool to visualize confocal microscopy datasets of the developing zebrafish visual system.
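
    Maximal intensity projection, cited above as the rudimentary standard, is simple to state precisely; the NumPy sketch below applies it to a synthetic two-channel volume (the interactive tool instead uses full volume rendering with per-channel transfer functions).

```python
# Maximal intensity projection (MIP) of a multi-channel confocal volume along
# the z axis, the baseline technique mentioned above. Data here are synthetic.
import numpy as np

rng = np.random.default_rng(7)
# volume[channel, z, y, x]: two channels, e.g. a dense and a sparse stain.
volume = rng.random((2, 64, 256, 256)).astype(np.float32)

mip = volume.max(axis=1)            # collapse z: shape (2, 256, 256)

# Simple two-channel composite: channel 0 -> green, channel 1 -> magenta.
rgb = np.zeros(mip.shape[1:] + (3,), dtype=np.float32)
rgb[..., 1] = mip[0]                # green
rgb[..., 0] = rgb[..., 2] = mip[1]  # magenta
print("MIP shape:", mip.shape, "composite shape:", rgb.shape)
```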

  8. Using Open Source Tools to Create a Mobile Optimized, Crowdsourced Translation Tool

    Directory of Open Access Journals (Sweden)

    Evviva Weinraub Lajoie

    2014-04-01

    Full Text Available In late 2012, OSU Libraries and Press partnered with Maria's Libraries, an NGO in Rural Kenya, to provide users the ability to crowdsource translations of folk tales and existing children's books into a variety of African languages, sub-languages, and dialects. Together, these two organizations have been creating a mobile optimized platform using open source libraries such as Wink Toolkit (a library which provides mobile-friendly interaction from a website) and Globalize3 to allow for multiple translations of database entries in a Ruby on Rails application. Research regarding successes of similar tools has been utilized in providing a consistent user interface. The OSU Libraries & Press team delivered a proof-of-concept tool that has the opportunity to promote technology exploration, improve early childhood literacy, change the way we approach foreign language learning, and to provide opportunities for cost-effective, multi-language publishing.

  9. Development of a visual tool to analyze interactions in forums in an e-learning environment

    Directory of Open Access Journals (Sweden)

    Cláudio Filipe Tereso

    2016-12-01

    Full Text Available This article presents VAFAE – Forum Access Visualization on a Distance Learning Environment, a web tool that visually maps Universidade Aberta's (UAb) students' interaction with a course available on the e-learning platform. Raw data are extracted from the log files and then transformed to obtain the necessary format. Next, different visualization techniques are applied with the aim of improving and streamlining the presentation of the underlying information. More specifically, VAFAE aims at helping teachers to better understand the level and quality of the students' interaction with the modules of the learning units in UAb's distance learning environment.

  10. Gestió de factures electròniques amb .NET (Visual Studio Tools for Office)

    OpenAIRE

    Gimeno Capín, Pablo

    2008-01-01

    Creation of electronic invoice management software developed on this technological platform, with express indication of the use of the VSTO (Visual Studio Tools for Office) tools in their latest version.

  11. Measurement Tools for the Immersive Visualization Environment: Steps Toward the Virtual Laboratory.

    Science.gov (United States)

    Hagedorn, John G; Dunkers, Joy P; Satterfield, Steven G; Peskin, Adele P; Kelso, John T; Terrill, Judith E

    2007-01-01

    This paper describes a set of tools for performing measurements of objects in a virtual reality based immersive visualization environment. These tools enable the use of the immersive environment as an instrument for extracting quantitative information from data representations that hitherto had been used solely for qualitative examination. We provide, within the virtual environment, ways for the user to analyze and interact with the quantitative data generated. We describe results generated by these methods to obtain dimensional descriptors of tissue engineered medical products. We regard this toolbox as our first step in the implementation of a virtual measurement laboratory within an immersive visualization environment.

  12. The Systems Biology Research Tool: evolvable open-source software

    Directory of Open Access Journals (Sweden)

    Wright Jeremiah

    2008-06-01

    Full Text Available Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system-level. Results We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.

  13. A web-based data visualization tool for the MIMIC-II database.

    Science.gov (United States)

    Lee, Joon; Ribey, Evan; Wallace, James R

    2016-02-04

    Although MIMIC-II, a public intensive care database, has been recognized as an invaluable resource for many medical researchers worldwide, becoming a proficient MIMIC-II researcher requires knowledge of SQL programming and an understanding of the MIMIC-II database schema. These are challenging requirements especially for health researchers and clinicians who may have limited computer proficiency. In order to overcome this challenge, our objective was to create an interactive, web-based MIMIC-II data visualization tool that first-time MIMIC-II users can easily use to explore the database. The tool offers two main features: Explore and Compare. The Explore feature enables the user to select a patient cohort within MIMIC-II and visualize the distributions of various administrative, demographic, and clinical variables within the selected cohort. The Compare feature enables the user to select two patient cohorts and visually compare them with respect to a variety of variables. The tool is also helpful to experienced MIMIC-II researchers who can use it to substantially accelerate the cumbersome and time-consuming steps of writing SQL queries and manually visualizing extracted data. Any interested researcher can use the MIMIC-II data visualization tool for free to quickly and conveniently conduct a preliminary investigation on MIMIC-II with a few mouse clicks. Researchers can also use the tool to learn the characteristics of the MIMIC-II patients. Since it is still impossible to conduct multivariable regression inside the tool, future work includes adding analytics capabilities. Also, the next version of the tool will aim to utilize MIMIC-III which contains more data.
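
    The Explore feature described above amounts to cohort selection followed by distribution summaries. As a rough stand-in only, the pandas sketch below performs the same step on an invented extract; the column names and values are hypothetical placeholders, not the actual MIMIC-II schema.

```python
# Rough stand-in for the Explore feature: select a cohort and summarize a
# variable's distribution with pandas. The columns and values below are
# hypothetical placeholders, not the actual MIMIC-II schema.
import pandas as pd

# Hypothetical extract, e.g. previously dumped from the database to CSV.
admissions = pd.DataFrame({
    "age":      [67, 54, 81, 72, 45, 63],
    "icu_type": ["MICU", "SICU", "MICU", "CCU", "MICU", "SICU"],
    "los_days": [3.2, 5.1, 8.4, 2.0, 1.5, 6.7],
})

cohort = admissions[(admissions["icu_type"] == "MICU") & (admissions["age"] >= 60)]
print(cohort["los_days"].describe())            # distribution summary for the cohort
print(cohort["los_days"].value_counts(bins=3))  # coarse histogram, text form
```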

  14. A Review of Pathway-Based Analysis Tools That Visualize Genetic Variants

    Directory of Open Access Journals (Sweden)

    Elisa Cirillo

    2017-11-01

    Full Text Available Pathway analysis is a powerful method for data analysis in genomics, most often applied to gene expression analysis. It is also promising for single-nucleotide polymorphism (SNP) data analysis, such as genome-wide association study data, because it allows the interpretation of variants with respect to the biological processes in which the affected genes and proteins are involved. Such analyses support an interactive evaluation of the possible effects of variations on function, regulation or interaction of gene products. Current pathway analysis software often does not support data visualization of variants in pathways as an alternate method to interpret genetic association results, and specific statistical methods for pathway analysis of SNP data are not combined with these visualization features. In this review, we first describe the visualization options of the tools that were identified by a literature review, in order to provide insight for improvements in this developing field. Tool evaluation was performed using a computational epistatic dataset of gene–gene interactions for obesity risk. Next, we report the necessity to include in these tools statistical methods designed for the pathway-based analysis with SNP data, expressly aiming to define features for more comprehensive pathway-based analysis tools. We conclude by recognizing that pathway analysis of genetic variations data requires a sophisticated combination of the most useful and informative visual aspects of the various tools evaluated.

  15. Early visual analysis tool using magnetoencephalography for treatment and recovery of neuronal dysfunction.

    Science.gov (United States)

    Rasheed, Waqas; Neoh, Yee Yik; Bin Hamid, Nor Hisham; Reza, Faruque; Idris, Zamzuri; Tang, Tong Boon

    2017-10-01

    Functional neuroimaging modalities play an important role in deciding the diagnosis and course of treatment of neuronal dysfunction and degeneration. This article presents an analytical tool with visualization by exploiting the strengths of the MEG (magnetoencephalographic) neuroimaging technique. The tool automates MEG data import (in tSSS format), channel information extraction, time/frequency decomposition, and circular graph visualization (connectogram) for simple result inspection. For advanced users, the tool also provides magnitude squared coherence (MSC) values allowing personalized threshold levels, and the computation of default model from MEG data of control population. Default model obtained from healthy population data serves as a useful benchmark to diagnose and monitor neuronal recovery during treatment. The proposed tool further provides optional labels with international 10-10 system nomenclature in order to facilitate comparison studies with EEG (electroencephalography) sensor space. Potential applications in epilepsy and traumatic brain injury studies are also discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
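
    The tool's connectograms are thresholded on magnitude squared coherence (MSC) values. The following generic SciPy sketch shows how an MSC value between two sensor channels can be computed and thresholded; it is illustrative only, uses synthetic signals, and is not the tool's own pipeline.

```python
# Generic illustration of magnitude squared coherence (MSC) between two
# sensor signals, as used for connectogram thresholding; not the tool's code.
import numpy as np
from scipy.signal import coherence

fs = 1000.0                       # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)

# Two synthetic channels sharing a 10 Hz component plus independent noise.
shared = np.sin(2 * np.pi * 10 * t)
ch1 = shared + 0.5 * rng.standard_normal(t.size)
ch2 = shared + 0.5 * rng.standard_normal(t.size)

f, msc = coherence(ch1, ch2, fs=fs, nperseg=1024)
print("MSC near 10 Hz:", msc[np.argmin(np.abs(f - 10))])

# A personalized threshold (e.g. 0.7) would keep only strongly coupled pairs.
edges_kept = msc > 0.7
print("frequency bins above threshold:", int(edges_kept.sum()))
```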

  16. A Survey of Visualization Tools Assessed for Anomaly-Based Intrusion Detection Analysis

    Science.gov (United States)

    2014-04-01

    ... includes Complex System SCILAB Toolbox, GraphViz, Igraph, NetDraw, Network Workbench, OpenDX, Prefuse, Sci² Tool, and Visualization Toolkit (VTK). [The remainder of this record is a fragment of the report's tool-capability comparison table (columns: Name, Web Sites, Strengths, Weaknesses).]

  17. A Visualization Tool for Integrating Research Results at an Underground Mine

    Science.gov (United States)

    Boltz, S.; Macdonald, B. D.; Orr, T.; Johnson, W.; Benton, D. J.

    2016-12-01

    Researchers with the National Institute for Occupational Safety and Health are conducting research at a deep, underground metal mine in Idaho to develop improvements in ground control technologies that reduce the effects of dynamic loading on mine workings, thereby decreasing the risk to miners. This research is multifaceted and includes: photogrammetry, microseismic monitoring, geotechnical instrumentation, and numerical modeling. When managing research involving such a wide range of data, understanding how the data relate to each other and to the mining activity quickly becomes a daunting task. In an effort to combine this diverse research data into a single, easy-to-use system, a three-dimensional visualization tool was developed. The tool was created using the Unity3d video gaming engine and includes the mine development entries, production stopes, important geologic structures, and user-input research data. The tool provides the user with a first-person, interactive experience where they are able to walk through the mine as well as navigate the rock mass surrounding the mine to view and interpret the imported data in the context of the mine and as a function of time. The tool was developed using data from a single mine; however, it is intended to be a generic tool that can be easily extended to other mines. For example, a similar visualization tool is being developed for an underground coal mine in Colorado. The ultimate goal is for NIOSH researchers and mine personnel to be able to use the visualization tool to identify trends that may not otherwise be apparent when viewing the data separately. This presentation highlights the features and capabilities of the mine visualization tool and explains how it may be used to more effectively interpret data and reduce the risk of ground fall hazards to underground miners.

  18. The SCEC/UseIT Intern Program: Creating Open-Source Visualization Software Using Diverse Resources

    Science.gov (United States)

    Francoeur, H.; Callaghan, S.; Perry, S.; Jordan, T.

    2004-12-01

    The Southern California Earthquake Center undergraduate IT intern program (SCEC UseIT) conducts IT research to benefit collaborative earth science research. Through this program, interns have developed real-time, interactive, 3D visualization software using open-source tools. Dubbed LA3D, a distribution of this software is now in use by the seismic community. LA3D enables the user to interactively view Southern California datasets and models of importance to earthquake scientists, such as faults, earthquakes, fault blocks, digital elevation models, and seismic hazard maps. LA3D is now being extended to support visualizations anywhere on the planet. The new software, called SCEC-VIDEO (Virtual Interactive Display of Earth Objects), makes use of a modular, plugin-based software architecture which supports easy development and integration of new data sets. Currently SCEC-VIDEO is in beta testing, with a full open-source release slated for the future. Both LA3D and SCEC-VIDEO were developed using a wide variety of software technologies. These, which included relational databases, web services, software management technologies, and 3-D graphics in Java, were necessary to integrate the heterogeneous array of data sources which comprise our software. Currently the interns are working to integrate new technologies and larger data sets to increase software functionality and value. In addition, both LA3D and SCEC-VIDEO allow the user to script and create movies. Thus program interns with computer science backgrounds have been writing software while interns with other interests, such as cinema, geology, and education, have been making movies that have proved of great use in scientific talks, media interviews, and education. Thus, SCEC UseIT incorporates a wide variety of scientific and human resources to create products of value to the scientific and outreach communities. The program plans to continue with its interdisciplinary approach, increasing the relevance of the

  19. Sleep: An Open-Source Python Software for Visualization, Analysis, and Staging of Sleep Data.

    Science.gov (United States)

    Combrisson, Etienne; Vallat, Raphael; Eichenlaub, Jean-Baptiste; O'Reilly, Christian; Lajnef, Tarek; Guillot, Aymeric; Ruby, Perrine M; Jerbi, Karim

    2017-01-01

    We introduce Sleep, a new Python open-source graphical user interface (GUI) dedicated to visualization, scoring and analyses of sleep data. Among its most prominent features are: (1) Dynamic display of polysomnographic data, spectrogram, hypnogram and topographic maps with several customizable parameters, (2) Implementation of several automatic detection of sleep features such as spindles, K-complexes, slow waves, and rapid eye movements (REM), (3) Implementation of practical signal processing tools such as re-referencing or filtering, and (4) Display of main descriptive statistics including publication-ready tables and figures. The software package supports loading and reading raw EEG data from standard file formats such as European Data Format, in addition to a range of commercial data formats. Most importantly, Sleep is built on top of the VisPy library, which provides GPU-based fast and high-level visualization. As a result, it is capable of efficiently handling and displaying large sleep datasets. Sleep is freely available (http://visbrain.org/sleep) and comes with sample datasets and an extensive documentation. Novel functionalities will continue to be added and open-science community efforts are expected to enhance the capacities of this module.
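
    Sleep ships automatic detectors for features such as spindles. The sketch below is a deliberately simplified, generic spindle detector (sigma band-pass plus an envelope threshold) run on synthetic data; it is not the algorithm implemented in Sleep.

```python
# Simplified, generic sleep-spindle detector (band-pass 11-16 Hz, envelope
# threshold); illustrative only, NOT the detection algorithm shipped in Sleep.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def detect_spindles(eeg, sf, low=11.0, high=16.0, thresh_sd=3.0):
    """Flag samples whose sigma-band envelope exceeds mean + thresh_sd * std."""
    b, a = butter(4, [low / (sf / 2), high / (sf / 2)], btype="band")
    sigma = filtfilt(b, a, eeg)
    envelope = np.abs(hilbert(sigma))
    return envelope > envelope.mean() + thresh_sd * envelope.std()

# Synthetic example: 30 s of noise with a 2 s burst of 13 Hz activity.
sf = 200.0
t = np.arange(0, 30, 1 / sf)
eeg = np.random.randn(t.size)
burst = (t > 10) & (t < 12)
eeg[burst] += 3 * np.sin(2 * np.pi * 13 * t[burst])

mask = detect_spindles(eeg, sf)
print("Fraction of samples flagged:", mask.mean())
```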

  20. Sleep: An Open-Source Python Software for Visualization, Analysis, and Staging of Sleep Data

    Directory of Open Access Journals (Sweden)

    Etienne Combrisson

    2017-09-01

    Full Text Available We introduce Sleep, a new Python open-source graphical user interface (GUI) dedicated to visualization, scoring and analyses of sleep data. Among its most prominent features are: (1) Dynamic display of polysomnographic data, spectrogram, hypnogram and topographic maps with several customizable parameters, (2) Implementation of several automatic detection of sleep features such as spindles, K-complexes, slow waves, and rapid eye movements (REM), (3) Implementation of practical signal processing tools such as re-referencing or filtering, and (4) Display of main descriptive statistics including publication-ready tables and figures. The software package supports loading and reading raw EEG data from standard file formats such as European Data Format, in addition to a range of commercial data formats. Most importantly, Sleep is built on top of the VisPy library, which provides GPU-based fast and high-level visualization. As a result, it is capable of efficiently handling and displaying large sleep datasets. Sleep is freely available (http://visbrain.org/sleep) and comes with sample datasets and an extensive documentation. Novel functionalities will continue to be added and open-science community efforts are expected to enhance the capacities of this module.

  1. Developing an Interactive Data Visualization Tool to Assess the Impact of Decision Support on Clinical Operations.

    Science.gov (United States)

    Huber, Timothy C; Krishnaraj, Arun; Monaghan, Dayna; Gaskin, Cree M

    2018-05-18

    Due to mandates from recent legislation, clinical decision support (CDS) software is being adopted by radiology practices across the country. This software provides imaging study decision support for referring providers at the point of order entry. CDS systems produce a large volume of data, providing opportunities for research and quality improvement. In order to better visualize and analyze trends in this data, an interactive data visualization dashboard was created using a commercially available data visualization platform. Following the integration of a commercially available clinical decision support product into the electronic health record, a dashboard was created using a commercially available data visualization platform (Tableau, Seattle, WA). Data generated by the CDS were exported from the data warehouse, where they were stored, into the platform. This allowed for real-time visualization of the data generated by the decision support software. The creation of the dashboard allowed the output from the CDS platform to be more easily analyzed and facilitated hypothesis generation. Integrating data visualization tools into clinical decision support tools allows for easier data analysis and can streamline research and quality improvement efforts.

  2. Using a Self-Administered Visual Basic Software Tool To Teach Psychological Concepts.

    Science.gov (United States)

    Strang, Harold R.; Sullivan, Amie K.; Schoeny, Zahrl G.

    2002-01-01

    Introduces LearningLinks, a Visual Basic software tool that allows teachers to create individualized learning modules that use constructivist and behavioral learning principles. Describes field testing of undergraduates at the University of Virginia that tested a module designed to improve understanding of the psychological concepts of…

  3. The Film as Visual Aided Learning Tool in Classroom Management Course

    Science.gov (United States)

    Altinay Gazi, Zehra; Altinay Aksal, Fahriye

    2011-01-01

    This research aims to investigate the impact of the visual aided learning on pre-service teachers' co-construction of subject matter knowledge in teaching practice. The study revealed the examination of film as an active cognizing and learning tool in classroom management course within teacher education programme. Within the framework of action…

  4. Visual Tools for Eliciting Connections and Cohesiveness in Mixed Methods Research

    Science.gov (United States)

    Murawska, Jaclyn M.; Walker, David A.

    2017-01-01

    In this commentary, we offer a set of visual tools that can assist education researchers, especially those in the field of mathematics, in developing cohesiveness from a mixed methods perspective, commencing at a study's research questions and literature review, through its data collection and analysis, and finally to its results. This expounds…

  5. Visualization: A Tool for Enhancing Students' Concept Images of Basic Object-Oriented Concepts

    Science.gov (United States)

    Cetin, Ibrahim

    2013-01-01

    The purpose of this study was twofold: to investigate students' concept images about class, object, and their relationship and to help them enhance their learning of these notions with a visualization tool. Fifty-six second-year university students participated in the study. To investigate his/her concept images, the researcher developed a survey…

  6. Comparative Study of Load Testing Tools: Apache JMeter, HP LoadRunner, Microsoft Visual Studio (TFS), Siege

    Directory of Open Access Journals (Sweden)

    Rabiya Abbas

    2017-12-01

    Full Text Available Software testing is the process of verifying and validating the user's requirements, and it is an ongoing activity throughout software development. Software testing is commonly characterized into three main types. In black-box testing, the tester has no knowledge of the internal logic and design of the system. In white-box testing, the tester knows the internal logic of the code. In grey-box testing, the tester has partial knowledge of the internal structure and working of the system; this is commonly used for integration testing. Load testing helps us to analyze the performance of the system under heavy load or under zero load, and is achieved with the help of a load testing tool. The intention of this research is to carry out a comparison of four load testing tools, i.e. Apache JMeter, LoadRunner, Microsoft Visual Studio (TFS), and Siege, based on certain criteria, i.e. test script generation, result reports, application support, plug-in support, and cost. The main focus is to study these load testing tools and identify which tool is better and more efficient. We assume this comparison can help in selecting the most appropriate tool and motivates the use of open source load testing tools.
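
    As a minimal illustration of what the compared tools automate (concurrent virtual users plus response-time statistics), the hedged Python sketch below drives a placeholder endpoint with a thread pool. It is not a script format of JMeter, LoadRunner, TFS or Siege, and the target URL is an assumption.

```python
# Minimal, generic load-generation sketch showing what dedicated tools such as
# JMeter or Siege automate: concurrent requests plus response-time statistics.
# The target URL is a placeholder; this is not JMeter/LoadRunner/TFS/Siege code.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

import requests  # third-party HTTP client, assumed installed

TARGET = "http://localhost:8000/"   # placeholder endpoint
N_USERS = 20                        # concurrent "virtual users"
N_REQUESTS = 200

def hit(_):
    start = time.perf_counter()
    r = requests.get(TARGET, timeout=10)
    return time.perf_counter() - start, r.status_code

with ThreadPoolExecutor(max_workers=N_USERS) as pool:
    results = list(pool.map(hit, range(N_REQUESTS)))

latencies = [latency for latency, _ in results]
errors = sum(1 for _, code in results if code >= 400)
print(f"median latency: {statistics.median(latencies):.3f} s")
print(f"95th percentile: {statistics.quantiles(latencies, n=20)[-1]:.3f} s")
print(f"error responses: {errors}/{N_REQUESTS}")
```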

  7. Independent sources of anisotropy in visual orientation representation: a visual and a cognitive oblique effect.

    Science.gov (United States)

    Balikou, Panagiota; Gourtzelidis, Pavlos; Mantas, Asimakis; Moutoussis, Konstantinos; Evdokimidis, Ioannis; Smyrnis, Nikolaos

    2015-11-01

    The representation of visual orientation is more accurate for cardinal orientations compared to oblique, and this anisotropy has been hypothesized to reflect a low-level visual process (visual, "class 1" oblique effect). The reproduction of directional and orientation information also leads to a mean error away from cardinal orientations or directions. This anisotropy has been hypothesized to reflect a high-level cognitive process of space categorization (cognitive, "class 2," oblique effect). This space categorization process would be more prominent when the visual representation of orientation degrades such as in the case of working memory with increasing cognitive load, leading to increasing magnitude of the "class 2" oblique effect, while the "class 1" oblique effect would remain unchanged. Two experiments were performed in which an array of orientation stimuli (1-4 items) was presented and then subjects had to realign a probe stimulus within the previously presented array. In the first experiment, the delay between stimulus presentation and probe varied, while in the second experiment, the stimulus presentation time varied. The variable error was larger for oblique compared to cardinal orientations in both experiments reproducing the visual "class 1" oblique effect. The mean error also reproduced the tendency away from cardinal and toward the oblique orientations in both experiments (cognitive "class 2" oblique effect). The accuracy or the reproduced orientation degraded (increasing variable error) and the cognitive "class 2" oblique effect increased with increasing memory load (number of items) in both experiments and presentation time in the second experiment. In contrast, the visual "class 1" oblique effect was not significantly modulated by any one of these experimental factors. These results confirmed the theoretical predictions for the two anisotropies in visual orientation reproduction and provided support for models proposing the categorization of

  8. Scoping Study of Machine Learning Techniques for Visualization and Analysis of Multi-source Data in Nuclear Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Yonggang

    2018-05-07

    In implementation of nuclear safeguards, many different techniques are being used to monitor operation of nuclear facilities and safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagers, digital seals to open source search and reports of onsite inspections/verifications. Each technique measures one or more unique properties related to nuclear materials or operation processes. Because these data sets have no or loose correlations, it could be beneficial to analyze the data sets together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning based multi-modality analysis could be effective tools in such integrated analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential values in nuclear safeguards.

  9. Open source tools for standardized privacy protection of medical images

    Science.gov (United States)

    Lien, Chung-Yueh; Onken, Michael; Eichelberg, Marco; Kao, Tsair; Hein, Andreas

    2011-03-01

    In addition to the primary care context, medical images are often useful for research projects and community healthcare networks, so-called "secondary use". Patient privacy becomes an issue in such scenarios since the disclosure of personal health information (PHI) has to be prevented in a sharing environment. In general, most PHIs should be completely removed from the images according to the respective privacy regulations, but some basic and alleviated data is usually required for accurate image interpretation. Our objective is to utilize and enhance these specifications in order to provide reliable software implementations for de- and re-identification of medical images suitable for online and offline delivery. DICOM (Digital Imaging and Communications in Medicine) images are de-identified by replacing PHI-specific information with values still being reasonable for imaging diagnosis and patient indexing. In this paper, this approach is evaluated based on a prototype implementation built on top of the open source framework DCMTK (DICOM Toolkit) utilizing standardized de- and re-identification mechanisms. A set of tools has been developed for DICOM de-identification that meets privacy requirements of an offline and online sharing environment and fully relies on standard-based methods.
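
    The implementation described is built on the DCMTK C++ toolkit; purely as an illustration of the de-identification idea, the hedged Python sketch below replaces a few PHI attributes with pydicom. The attribute list is illustrative and far from a complete de-identification profile, and the file names are hypothetical.

```python
# Hedged sketch of basic DICOM de-identification in Python with pydicom.
# The paper's tools are built on DCMTK (C++); this only illustrates the idea
# of replacing PHI attributes, and the tag list is far from a complete profile.
import pydicom

PHI_REPLACEMENTS = {
    "PatientName": "ANON^PATIENT",
    "PatientID": "ANON0001",
    "PatientBirthDate": "",        # cleared; the year could be kept if needed
    "InstitutionName": "REMOVED",
}

def deidentify(in_path, out_path):
    ds = pydicom.dcmread(in_path)
    for keyword, replacement in PHI_REPLACEMENTS.items():
        if keyword in ds:
            setattr(ds, keyword, replacement)
    ds.remove_private_tags()       # private tags often carry extra PHI
    ds.save_as(out_path)

# deidentify("ct_original.dcm", "ct_deidentified.dcm")  # hypothetical files
```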

  10. An integrated audio-visual impact tool for wind turbine installations

    International Nuclear Information System (INIS)

    Lymberopoulos, N.; Belessis, M.; Wood, M.; Voutsinas, S.

    1996-01-01

    An integrated software tool was developed for the design of wind parks that takes into account their visual and audio impact. The application is built on a powerful hardware platform and is fully operated through a graphic user interface. The topography, the wind turbines and the daylight conditions are realised digitally. The wind park can be animated in real time and the user can take virtual walks in it while the set-up of the park can be altered interactively. In parallel, the wind speed levels on the terrain, the emitted noise intensity, the annual energy output and the cash flow can be estimated at any stage of the session and prompt the user for rearrangements. The tool has been used to visually simulate existing wind parks in St. Breok, UK and Andros Island, Greece. The results lead to the conclusion that such a tool can assist in the public acceptance and licensing procedures of wind parks. (author)

  11. Using Visual Simulation Tools And Learning Outcomes-Based Curriculum To Help Transportation Engineering Students And Practitioners To Better Understand And Design Traffic Signal Control Systems

    Science.gov (United States)

    2012-06-01

    Visual simulation tools for conveying complex concepts have become useful in education as well as in research. This report describes a project that developed curriculum and visualization tools to train transportation engineering studen...

  12. Demonstrating High-Accuracy Orbital Access Using Open-Source Tools

    Science.gov (United States)

    Gilbertson, Christian; Welch, Bryan

    2017-01-01

    Orbit propagation is fundamental to almost every space-based analysis. Currently, many system analysts use commercial software to predict the future positions of orbiting satellites. This is one of many capabilities that can be replicated, with great accuracy, without using expensive, proprietary software. NASA's SCaN (Space Communication and Navigation) Center for Engineering, Networks, Integration, and Communications (SCENIC) project plans to provide its analysis capabilities using a combination of internal and open-source software, allowing for a much greater measure of customization and flexibility, while reducing recurring software license costs. MATLAB and the open-source Orbit Determination Toolbox created by Goddard Space Flight Center (GSFC) were utilized to develop tools with the capability to propagate orbits, perform line-of-sight (LOS) availability analyses, and visualize the results. The developed programs are modular and can be applied for mission planning and viability analysis in a variety of Solar System applications. The tools can perform 2- and N-body orbit propagation, find inter-satellite and satellite-to-ground-station LOS access (accounting for intermediate oblate spheroid body blocking, geometric restrictions of the antenna field-of-view (FOV), and relativistic corrections), and create animations of planetary movement, satellite orbits, and LOS accesses. The code is the basis for SCENIC's broad analysis capabilities including dynamic link analysis, dilution-of-precision navigation analysis, and orbital availability calculations.
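
    As a small illustration of the simplest capability mentioned, the sketch below propagates a two-body orbit with SciPy for an assumed circular low Earth orbit. It is generic Python, not the MATLAB/Orbit Determination Toolbox code described in the abstract.

```python
# Minimal two-body orbit propagation sketch with SciPy; illustrative only,
# not the MATLAB/Orbit Determination Toolbox tools described in the abstract.
import numpy as np
from scipy.integrate import solve_ivp

MU_EARTH = 398600.4418  # km^3/s^2

def two_body(t, state):
    r = state[:3]
    v = state[3:]
    a = -MU_EARTH * r / np.linalg.norm(r) ** 3
    return np.concatenate((v, a))

# Roughly a 500 km circular LEO (assumed initial conditions).
r0 = np.array([6878.0, 0.0, 0.0])            # km
v0 = np.array([0.0, 7.6127, 0.0])            # km/s
t_span = (0.0, 2 * 5677.0)                   # about two orbital periods, s

sol = solve_ivp(two_body, t_span, np.concatenate((r0, v0)),
                rtol=1e-9, atol=1e-9)
print("Final position (km):", np.round(sol.y[:3, -1], 1))
```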

  13. Using data visualization tools to support degradation assessment in nuclear piping

    International Nuclear Information System (INIS)

    Jyrkama, M.I.; Pandey, M.D.

    2012-01-01

    Nuclear utilities collect a vast amount of in-service inspection data as part of periodic inspection plans and the detailed assessment and monitoring of various degradation mechanisms, such as fretting, corrosion, and creep. In many cases, the focus is primarily on ensuring that the observed minimum or maximum values are within the acceptable regulatory limits, while the rest of the (often costly) surveillance data remains unused and unanalyzed. The objective of this study is to illustrate how data visualization tools can be used effectively to analyze and consider all of the in-service inspection data, and hence provide valuable support for the degradation assessment in nuclear piping. The 2D and 3D visualization tools discussed in this paper were developed mainly in the context of flow accelerated corrosion (FAC) assessment in feeder piping, where the complex pipe geometries and flow conditions have a significant impact on the ultrasonic (UT) wall thickness measurements. The visualization of eddy current inspection results from the assessment of pitting corrosion of steam generator tubing will also be discussed briefly. The visualization tools provide a more comprehensive view of the degree and extent of degradation, and hence directly support the planning of future inspection of critical components by identifying key locations and areas for detailed monitoring. The results furthermore increase the confidence and reliability of fitness-for-service (FFS) assessments and life cycle management (LCM) planning decisions with respect to component repair or replacement. (author)

  14. Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-05-19

    A partnership across government, academic, and private sectors has created a novel system that enables climate researchers to solve current and emerging data analysis and visualization challenges. The Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) software project utilizes the Python application programming interface (API) combined with C/C++/Fortran implementations for performance-critical software that offers the best compromise between "scalability" and “ease-of-use.” The UV-CDAT system is highly extensible and customizable for high-performance interactive and batch visualization and analysis for climate science and other disciplines of geosciences. For complex, climate data-intensive computing, UV-CDAT’s inclusive framework supports Message Passing Interface (MPI) parallelism as well as taskfarming and other forms of parallelism. More specifically, the UV-CDAT framework supports the execution of Python scripts running in parallel using the MPI executable commands and leverages Department of Energy (DOE)-funded general-purpose, scalable parallel visualization tools such as ParaView and VisIt. This is the first system to be successfully designed in this way and with these features. The climate community leverages these tools and others, in support of a parallel client-server paradigm, allowing extreme-scale, server-side computing for maximum possible speed-up.
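
    The MPI-parallel Python pattern described can be illustrated with a minimal mpi4py sketch in which each rank analyzes a subset of time steps and the results are reduced at rank 0. This is generic MPI code under assumed step counts, not UV-CDAT's own API.

```python
# Minimal mpi4py task-farming sketch: each rank processes a subset of time
# steps and results are reduced at rank 0. Generic MPI Python, not UV-CDAT.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

n_timesteps = 120                            # hypothetical number of model steps
my_steps = range(rank, n_timesteps, size)    # round-robin work assignment

# Stand-in for reading and analyzing one climate field per time step.
local_means = np.array([np.sin(step / 10.0) for step in my_steps])
local_sum = np.array([local_means.sum(), float(local_means.size)])

global_sum = np.zeros(2)
comm.Reduce(local_sum, global_sum, op=MPI.SUM, root=0)

if rank == 0:
    print("global mean over all steps:", global_sum[0] / global_sum[1])

# Run with e.g.:  mpiexec -n 4 python analyze_parallel.py
```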

  15. Parallel analysis tools and new visualization techniques for ultra-large climate data set

    Energy Technology Data Exchange (ETDEWEB)

    Middleton, Don [National Center for Atmospheric Research, Boulder, CO (United States); Haley, Mary [National Center for Atmospheric Research, Boulder, CO (United States)

    2014-12-10

    ParVis was a project funded under LAB 10-05: “Earth System Modeling: Advanced Scientific Visualization of Ultra-Large Climate Data Sets”. Argonne was the lead lab with partners at PNNL, SNL, NCAR and UC-Davis. This report covers progress from January 1st, 2013 through Dec 1st, 2014. Two previous reports covered the period from Summer, 2010, through September 2011 and October 2011 through December 2012, respectively. While the project was originally planned to end on April 30, 2013, personnel and priority changes allowed many of the institutions to continue work through FY14 using existing funds. A primary focus of ParVis was introducing parallelism to climate model analysis to greatly reduce the time-to-visualization for ultra-large climate data sets. Work in the first two years was conducted on two tracks with different time horizons: one track to provide immediate help to climate scientists already struggling to apply their analysis to existing large data sets and another focused on building a new data-parallel library and tool for climate analysis and visualization that will give the field a platform for performing analysis and visualization on ultra-large datasets for the foreseeable future. In the final 2 years of the project, we focused mostly on the new data-parallel library and associated tools for climate analysis and visualization.

  16. XCluSim: a visual analytics tool for interactively comparing multiple clustering results of bioinformatics data

    Science.gov (United States)

    2015-01-01

    Background Though cluster analysis has become a routine analytic task for bioinformatics research, it is still arduous for researchers to assess the quality of a clustering result. To select the best clustering method and its parameters for a dataset, researchers have to run multiple clustering algorithms and compare them. However, such a comparison task with multiple clustering results is cognitively demanding and laborious. Results In this paper, we present XCluSim, a visual analytics tool that enables users to interactively compare multiple clustering results based on the Visual Information Seeking Mantra. We build a taxonomy for categorizing existing techniques of clustering results visualization in terms of the Gestalt principles of grouping. Using the taxonomy, we choose the most appropriate interactive visualizations for presenting individual clustering results from different types of clustering algorithms. The efficacy of XCluSim is shown through case studies with a bioinformatician. Conclusions Compared to other relevant tools, XCluSim enables users to compare multiple clustering results in a more scalable manner. Moreover, XCluSim supports diverse clustering algorithms and dedicated visualizations and interactions for different types of clustering results, allowing more effective exploration of details on demand. Through case studies with a bioinformatics researcher, we received positive feedback on the functionalities of XCluSim, including its ability to help identify stably clustered items across multiple clustering results. PMID:26328893
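
    A numerical core of such comparisons can be illustrated with the adjusted Rand index between pairs of clustering results, as in the generic scikit-learn sketch below; XCluSim layers interactive visual analytics on top of this kind of pairwise comparison, and this is not its implementation.

```python
# Generic sketch of comparing multiple clustering results numerically with the
# adjusted Rand index (scikit-learn); not XCluSim's implementation.
import itertools
from sklearn.cluster import KMeans, AgglomerativeClustering
from sklearn.datasets import make_blobs
from sklearn.metrics import adjusted_rand_score

X, _ = make_blobs(n_samples=300, centers=4, random_state=0)

results = {
    "kmeans_k3": KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X),
    "kmeans_k4": KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X),
    "ward_k4": AgglomerativeClustering(n_clusters=4).fit_predict(X),
}

for (name_a, labels_a), (name_b, labels_b) in itertools.combinations(results.items(), 2):
    ari = adjusted_rand_score(labels_a, labels_b)
    print(f"{name_a} vs {name_b}: ARI = {ari:.2f}")
```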

  17. iRaster: a novel information visualization tool to explore spatiotemporal patterns in multiple spike trains.

    Science.gov (United States)

    Somerville, J; Stuart, L; Sernagor, E; Borisyuk, R

    2010-12-15

    Over the last few years, simultaneous recordings of multiple spike trains have become widely used by neuroscientists. Therefore, it is important to develop new tools for analysing multiple spike trains in order to gain new insight into the function of neural systems. This paper describes how techniques from the field of visual analytics can be used to reveal specific patterns of neural activity. An interactive raster plot called iRaster has been developed. This software incorporates a selection of statistical procedures for visualization and flexible manipulations with multiple spike trains. For example, there are several procedures for the re-ordering of spike trains which can be used to unmask activity propagation, spiking synchronization, and many other important features of multiple spike train activity. Additionally, iRaster includes a rate representation of neural activity, a combined representation of rate and spikes, spike train removal and time interval removal. Furthermore, it provides multiple coordinated views, time and spike train zooming windows, a fisheye lens distortion, and dissemination facilities. iRaster is a user friendly, interactive, flexible tool which supports a broad range of visual representations. This tool has been successfully used to analyse both synthetic and experimentally recorded datasets. In this paper, the main features of iRaster are described and its performance and effectiveness are demonstrated using various types of data including experimental multi-electrode array recordings from the ganglion cell layer in mouse retina. iRaster is part of an ongoing research project called VISA (Visualization of Inter-Spike Associations) at the Visualization Lab in the University of Plymouth. The overall aim of the VISA project is to provide neuroscientists with the ability to freely explore and analyse their data. The software is freely available from the Visualization Lab website (see www.plymouth.ac.uk/infovis). Copyright © 2010
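
    A basic raster plot with one simple re-ordering step can be sketched with matplotlib's eventplot, as below on synthetic spike trains; iRaster itself adds the interactive manipulations described, and this is not its code.

```python
# Generic raster-plot sketch (with one simple spike-train re-ordering step)
# using matplotlib's eventplot; iRaster provides far richer interaction.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)

# 30 synthetic spike trains, 10 s long, with varying firing rates.
rates = rng.uniform(1, 20, size=30)                       # spikes per second
trains = [np.sort(rng.uniform(0, 10, size=rng.poisson(10 * r)))
          for r in rates]

# Re-order trains by spike count, one of the manipulations iRaster supports.
order = np.argsort([len(t) for t in trains])
reordered = [trains[i] for i in order]

plt.eventplot(reordered, linelengths=0.8, colors="black")
plt.xlabel("Time (s)")
plt.ylabel("Spike train (sorted by rate)")
plt.title("Synthetic multiple spike trains")
plt.show()
```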

  18. Vizic: A Jupyter-based interactive visualization tool for astronomical catalogs

    Science.gov (United States)

    Yu, W.; Carrasco Kind, M.; Brunner, R. J.

    2017-07-01

    The ever-growing datasets in observational astronomy have challenged scientists in many aspects, including an efficient and interactive data exploration and visualization. Many tools have been developed to confront this challenge. However, they usually focus on displaying the actual images or focus on visualizing patterns within catalogs in a predefined way. In this paper we introduce Vizic, a Python visualization library that builds the connection between images and catalogs through an interactive map of the sky region. Vizic visualizes catalog data over a custom background canvas using the shape, size and orientation of each object in the catalog. The displayed objects in the map are highly interactive and customizable compared to those in the observation images. These objects can be filtered or colored by their property values, such as redshift and magnitude. They also can be sub-selected using a lasso-like tool for further analysis using standard Python functions and everything is done from inside a Jupyter notebook. Furthermore, Vizic allows custom overlays to be appended dynamically on top of the sky map. We have initially implemented several overlays, namely, Voronoi, Delaunay, Minimum Spanning Tree and HEALPix grid layer, which are helpful for visualizing large-scale structure. All these overlays can be generated, added or removed interactively with just one line of code. The catalog data is stored in a non-relational database, and the interfaces have been developed in JavaScript and Python to work within Jupyter Notebook, which allows the creation of customizable widgets and user-generated scripts to analyze and plot the data selected/displayed in the interactive map. This unique design makes Vizic a very powerful and flexible interactive analysis tool. Vizic can be adopted in a variety of exercises, for example, data inspection, clustering analysis, galaxy alignment studies, outlier identification or just large-scale visualizations.
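
    Two of the overlays mentioned, Delaunay triangulation and the minimum spanning tree, can be computed for a toy catalog with plain SciPy, as in the hedged sketch below; the field coordinates are invented and this is not the Vizic widget API.

```python
# Generic sketch of two of the overlays mentioned (Delaunay triangulation and
# minimum spanning tree) for a toy sky catalog; plain SciPy, not Vizic's API.
import numpy as np
from scipy.spatial import Delaunay, distance_matrix
from scipy.sparse.csgraph import minimum_spanning_tree

rng = np.random.default_rng(42)
ra = rng.uniform(150.0, 151.0, size=200)    # deg, hypothetical field
dec = rng.uniform(2.0, 3.0, size=200)       # deg
points = np.column_stack((ra, dec))

tri = Delaunay(points)
print("Delaunay triangles:", tri.simplices.shape[0])

mst = minimum_spanning_tree(distance_matrix(points, points))
print("MST edges:", mst.nnz, " total length (deg):", round(float(mst.sum()), 3))
```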

  19. Collaboratively Conceived, Designed and Implemented: Matching Visualization Tools with Geoscience Data Collections and Geoscience Data Collections with Visualization Tools via the ToolMatch Service.

    Science.gov (United States)

    Hoebelheinrich, N. J.; Lynnes, C.; West, P.; Ferritto, M.

    2014-12-01

    Two problems common to many geoscience domains are the difficulties in finding tools to work with a given dataset collection, and conversely, the difficulties in finding data for a known tool. A collaborative team from the Earth Science Information Partnership (ESIP) has gotten together to design and create a web service, called ToolMatch, to address these problems. The team began their efforts by defining an initial, relatively simple conceptual model that addressed the two use cases briefly described above. The conceptual model is expressed as an ontology using OWL (Web Ontology Language) and DCterms (Dublin Core Terms), and utilizing standard ontologies such as DOAP (Description of a Project), FOAF (Friend of a Friend), SKOS (Simple Knowledge Organization System) and DCAT (Data Catalog Vocabulary). The ToolMatch service will be taking advantage of various Semantic Web and Web standards, such as OpenSearch, RESTful web services, SWRL (Semantic Web Rule Language) and SPARQL (Simple Protocol and RDF Query Language). The first version of the ToolMatch service was deployed in early fall 2014. While more complete testing is required, a number of communities besides ESIP member organizations have expressed interest in collaborating to create, test and use the service and incorporate it into their own web pages, tools and/or services, including the USGS Data Catalog service, DataONE, the Deep Carbon Observatory, Virtual Solar Terrestrial Observatory (VSTO), and the U.S. Global Change Research Program. In this session, presenters will discuss the inception and development of the ToolMatch service, the collaborative process used to design, refine, and test the service, and future plans for the service.
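
    The sketch below gives a hedged impression of the kind of RDF triples and SPARQL query the ToolMatch idea implies ("which tools can work with this data collection?"). The namespace, class and property names are hypothetical placeholders, not the published ToolMatch ontology.

```python
# Hedged sketch of the kind of RDF data and SPARQL query the ToolMatch idea
# implies. The namespace, classes and properties below are HYPOTHETICAL,
# not the published ToolMatch ontology.
from rdflib import Graph, Namespace, RDF

TM = Namespace("http://example.org/toolmatch#")   # placeholder namespace
g = Graph()

g.add((TM.Panoply, RDF.type, TM.Tool))
g.add((TM.Panoply, TM.canRead, TM.NetCDF))
g.add((TM.MyDataset, RDF.type, TM.DataCollection))
g.add((TM.MyDataset, TM.distributedAs, TM.NetCDF))

query = """
PREFIX tm: <http://example.org/toolmatch#>
SELECT ?tool WHERE {
  ?tool a tm:Tool ;
        tm:canRead ?format .
  tm:MyDataset tm:distributedAs ?format .
}
"""
for row in g.query(query):
    print("Matching tool:", row.tool)
```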

  20. MONGKIE: an integrated tool for network analysis and visualization for multi-omics data.

    Science.gov (United States)

    Jang, Yeongjun; Yu, Namhee; Seo, Jihae; Kim, Sun; Lee, Sanghyuk

    2016-03-18

    Network-based integrative analysis is a powerful technique for extracting biological insights from multilayered omics data such as somatic mutations, copy number variations, and gene expression data. However, integrated analysis of multi-omics data is quite complicated and can hardly be done in an automated way. Thus, a powerful interactive visual mining tool supporting diverse analysis algorithms for identification of driver genes and regulatory modules is much needed. Here, we present a software platform that integrates network visualization with omics data analysis tools seamlessly. The visualization unit supports various options for displaying multi-omics data as well as unique network models for describing sophisticated biological networks such as complex biomolecular reactions. In addition, we implemented diverse in-house algorithms for network analysis including network clustering and over-representation analysis. Novel functions include facile definition and optimized visualization of subgroups, comparison of a series of data sets in an identical network by data-to-visual mapping and subsequent overlaying function, and management of custom interaction networks. Utility of MONGKIE for network-based visual data mining of multi-omics data was demonstrated by analysis of the TCGA glioblastoma data. MONGKIE was developed in Java based on the NetBeans plugin architecture, thus being OS-independent with intrinsic support of module extension by third-party developers. We believe that MONGKIE would be a valuable addition to network analysis software by supporting many unique features and visualization options, especially for analysing multi-omics data sets in cancer and other diseases.
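
    Over-representation analysis of a gene set against a pathway is typically based on the hypergeometric test; the short sketch below shows that statistic with made-up gene counts. It is a generic illustration, not MONGKIE's Java implementation.

```python
# Generic over-representation analysis (ORA) sketch with the hypergeometric
# test, the statistic typically behind such module/pathway enrichment checks;
# not MONGKIE's Java implementation, and the gene counts are made up.
from scipy.stats import hypergeom

N = 20000      # genes in the background (assumed)
K = 150        # genes annotated to the pathway of interest (assumed)
n = 400        # genes in the user's cluster or mutated gene list (assumed)
k = 12         # overlap between the cluster and the pathway (assumed)

# P(X >= k) for X ~ Hypergeometric(N, K, n)
p_value = hypergeom.sf(k - 1, N, K, n)
print(f"enrichment p-value: {p_value:.3e}")
```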

  1. Visualization of PRADS Output Data Using Open-source Visualization Tools For Improved Log Analysis

    OpenAIRE

    Desta, Dawit Hailu

    2014-01-01

    The ever-growing complexity of network traffic has brought new threats and vulnerabilities that can affect our day-to-day activities. This has led to a high demand for network monitoring and detection systems to tackle the emerging threats. Consequently, the inspection and assessment of security incidents has become a daily activity for network and system administrators. Network analysts need to be aware of every network activity, the status of the network system and the network assets in...

  2. Tools for Trade Analysis and Open Source Information Monitoring for Non-proliferation

    International Nuclear Information System (INIS)

    Cojazzi, G.G.M.; Versino, C.; Wolfart, E.; Renda, G.; Janssens, W.A.M.; )

    2015-01-01

    The new state-level approach being proposed by the IAEA envisions an objective-based and information-driven safeguards approach utilizing all relevant information to improve the effectiveness and efficiency of safeguards. To this end, the IAEA also makes use of open source information, here broadly defined as any information that is neither classified nor proprietary. It includes, but is not limited to: media sources, government and non-governmental reports and analyses, commercial data, and scientific/technical literature, including trade data. Within the EC support programme to the IAEA, JRC has surveyed and catalogued open sources on import-export customs trade data and developed tools for supporting the use of the related databases in safeguards. The JRC software The Big Table (TBT) supports, inter alia: a) the search through a collection of reference documents relevant to trade analysis (legal/regulatory documents, technical handbooks); b) the selection of items of interest to specific verifications and c) the mapping of these items to customs commodities searchable in trade databases. In the field of open source monitoring, JRC is developing and operating a "Nuclear Security Media Monitor" (NSMM), which is a web-based multilingual news aggregation system that automatically collects news articles from pre-defined web sites. NSMM is a domain-specific version of the general JRC-Europe Media Monitor (EMM). NSMM has been established within the EC support programme with the aim of streamlining IAEA's process of open source information monitoring. In the first part, the paper will recall the trade data sources relevant for non-proliferation and will then illustrate the main features of TBT, recently coupled with the IAEA Physical Model, and new visualization techniques applied to trade data. In the second part it will present the main aspects of the NSMM, also by illustrating some of the uses made of it at JRC. (author)

  3. ALPHACAL: A new user-friendly tool for the calibration of alpha-particle sources.

    Science.gov (United States)

    Timón, A Fernández; Vargas, M Jurado; Gallardo, P Álvarez; Sánchez-Oro, J; Peralta, L

    2018-05-01

    In this work, we present and describe the program ALPHACAL, specifically developed for the calibration of alpha-particle sources. It is therefore more user-friendly and less time-consuming than multipurpose codes developed for a wide range of applications. The program is based on the recently developed code AlfaMC, which simulates specifically the transport of alpha particles. Both cylindrical and point sources mounted on the surface of polished backings can be simulated, as is the convention in experimental measurements of alpha-particle sources. In addition to the efficiency calculation and determination of the backscattering coefficient, some additional tools are available to the user, such as visualization of the energy spectrum, use of energy cut-off or low-energy tail corrections. ALPHACAL has been implemented in C++ using the Qt library, so it is available for Windows, macOS and Linux platforms. It is free and can be provided upon request to the authors. Copyright © 2018 Elsevier Ltd. All rights reserved.
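
    A much-simplified sketch of only the geometric part of such an efficiency calculation (a point source facing a coaxial circular detector, with assumed dimensions) is given below; ALPHACAL/AlfaMC additionally transport the alpha particles, including backscattering from the backing, which this toy Monte Carlo ignores.

```python
# Much-simplified Monte Carlo sketch of the GEOMETRIC detection efficiency of
# a point alpha source facing a coaxial circular detector. ALPHACAL/AlfaMC also
# transport the particles (energy loss, backscattering); this does not.
import numpy as np

rng = np.random.default_rng(7)

d = 5.0          # source-detector distance in mm (assumed)
r_det = 10.0     # detector radius in mm (assumed)
n = 1_000_000    # simulated emissions

# Isotropic emission into the upper hemisphere.
cos_theta = rng.uniform(0.0, 1.0, n)
phi = rng.uniform(0.0, 2.0 * np.pi, n)
sin_theta = np.sqrt(1.0 - cos_theta**2)

# A trajectory hits the detector plane inside the radius if
# d * tan(theta) <= r_det, written here without division.
hits = d * sin_theta <= r_det * cos_theta

eff_mc = 0.5 * hits.mean()                      # efficiency per 4*pi emission
eff_exact = 0.5 * (1.0 - d / np.hypot(d, r_det))
print(f"Monte Carlo efficiency: {eff_mc:.4f}  analytical: {eff_exact:.4f}")
```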

  4. Development of Tool Representations in the Dorsal and Ventral Visual Object Processing Pathways

    Science.gov (United States)

    Kersey, Alyssa J.; Clark, Tyia S.; Lussier, Courtney A.; Mahon, Bradford Z.; Cantlon, Jessica F.

    2016-01-01

    Tools represent a special class of objects, because they are processed across both the dorsal and ventral visual object processing pathways. Three core regions are known to be involved in tool processing: the left posterior middle temporal gyrus, the medial fusiform gyrus (bilaterally), and the left inferior parietal lobule. A critical and relatively unexplored issue concerns whether, in development, tool preferences emerge at the same time and to a similar degree across all regions of the tool-processing network. To test this issue, we used functional magnetic resonance imaging to measure the neural amplitude, peak location, and the dispersion of tool-related neural responses in the youngest sample of children tested to date in this domain (ages 4–8 years). We show that children recruit overlapping regions of the adult tool-processing network and also exhibit similar patterns of co-activation across the network to adults. The amplitude and co-activation data show that the core components of the tool-processing network are established by age 4. Our findings on the distributions of peak location and dispersion of activation indicate that the tool network undergoes refinement between ages 4 and 8 years. PMID:26108614

  5. Tools for 3D scientific visualization in computational aerodynamics at NASA Ames Research Center

    International Nuclear Information System (INIS)

    Bancroft, G.; Plessel, T.; Merritt, F.; Watson, V.

    1989-01-01

    Hardware, software, and techniques used by the Fluid Dynamics Division (NASA) for performing visualization of computational aerodynamics, which can be applied to the visualization of flow fields from computer simulations of fluid dynamics about the Space Shuttle, are discussed. Three visualization techniques applied, post-processing, tracking, and steering, are described, as well as the post-processing software packages used, PLOT3D, SURF (Surface Modeller), GAS (Graphical Animation System), and FAST (Flow Analysis software Toolkit). Using post-processing methods a flow simulation was executed on a supercomputer and, after the simulation was complete, the results were processed for viewing. It is shown that the high-resolution, high-performance three-dimensional workstation combined with specially developed display and animation software provides a good tool for analyzing flow field solutions obtained from supercomputers. 7 refs

  6. Open source engineering and sustainability tools for the built environment

    NARCIS (Netherlands)

    Coenders, J.L.

    2013-01-01

    This paper presents two novel open source software developments for design and engineering in the built environment. The first development, called “sustainability-open” [1], aims at providing open source design, analysis and assessment software source code for (environmental) performance of

  7. A new web-based tool for data visualization in MDSplus

    Energy Technology Data Exchange (ETDEWEB)

    Manduchi, G., E-mail: gabriele.manduchi@igi.cnr.it [Consorzio RFX, Euratom-ENEA Association, Corso Stati Uniti 4, Padova 35127 (Italy); Fredian, T.; Stillerman, J. [Massachusetts Institute of Technology, 175 Albany Street, Cambridge, MA 02139 (United States)

    2014-05-15

    Highlights: • The paper describes a new web-based data visualization tool for MDSplus. • It describes the experience gained with the previous data visualization tools. • It describes the used technologies for web data access and visualization. • It describes the current architecture of the tool and the new foreseen features. - Abstract: The Java tool jScope has been widely used for years to display acquired waveform in MDSplus. The choice of the Java programming language for its implementation has been successful for several reasons among which the fact that Java supports a multiplatform environment and it is well suited for graphics and the management of network communication. jScope can be used both as a local and remote application. In the latter case, data are acquired via TCP/IP communication using the mdsip protocol. Exporting data in this way however introduces several security problems due to the necessity of opening firewall holes for the user ports. For this reason, and also due to the fact that JavaScript is becoming a widely used language for web applications, a new tool written in JavaScript and called WebScope has been developed for the visualization of MDSplus data in web browsers. Data communication is now achieved via http protocol using Asynchronous JavaScript and XML (AJAX) technology. At the server side, data access is carried out by a Python module that interacts with the web server via Web Server Gateway Interface (WSGI). When a data item, described by an MDSplus expression, is requested by the web browser for visualization, it is returned as a binary message and then handled by callback JavaScript functions activated by the web browser. Scalable Vector Graphics (SVG) technology is used to handle graphics within the web browser and to carry out the same interactive data visualization provided by jScope. In addition to mouse events, touch events are supported to provide interactivity also on touch screens. In this way, waveforms can be
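
    The server-side pattern described (a Python module behind WSGI returning binary waveform messages) can be sketched with the standard library as below; the expression evaluation is faked with NumPy and this is not the actual WebScope/MDSplus code.

```python
# Minimal, generic WSGI application returning a binary waveform message, to
# illustrate the server-side pattern described; it is NOT the real WebScope /
# MDSplus code, and the "expression evaluation" below is faked with NumPy.
import struct
from urllib.parse import parse_qs
from wsgiref.simple_server import make_server

import numpy as np

def fake_evaluate(expression):
    """Stand-in for evaluating an MDSplus expression on the server."""
    t = np.linspace(0.0, 1.0, 1000)
    return t, np.sin(2 * np.pi * 5 * t)

def application(environ, start_response):
    params = parse_qs(environ.get("QUERY_STRING", ""))
    expr = params.get("expr", ["_default_"])[0]
    t, y = fake_evaluate(expr)

    # Simple binary layout: int32 sample count, then float64 times and values.
    payload = struct.pack("<i", int(t.size)) + t.tobytes() + y.tobytes()
    start_response("200 OK", [("Content-Type", "application/octet-stream"),
                              ("Content-Length", str(len(payload)))])
    return [payload]

if __name__ == "__main__":
    with make_server("", 8000, application) as httpd:   # hypothetical port
        httpd.serve_forever()
```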

  8. A new web-based tool for data visualization in MDSplus

    International Nuclear Information System (INIS)

    Manduchi, G.; Fredian, T.; Stillerman, J.

    2014-01-01

    Highlights: • The paper describes a new web-based data visualization tool for MDSplus. • It describes the experience gained with the previous data visualization tools. • It describes the used technologies for web data access and visualization. • It describes the current architecture of the tool and the new foreseen features. - Abstract: The Java tool jScope has been widely used for years to display acquired waveform in MDSplus. The choice of the Java programming language for its implementation has been successful for several reasons among which the fact that Java supports a multiplatform environment and it is well suited for graphics and the management of network communication. jScope can be used both as a local and remote application. In the latter case, data are acquired via TCP/IP communication using the mdsip protocol. Exporting data in this way however introduces several security problems due to the necessity of opening firewall holes for the user ports. For this reason, and also due to the fact that JavaScript is becoming a widely used language for web applications, a new tool written in JavaScript and called WebScope has been developed for the visualization of MDSplus data in web browsers. Data communication is now achieved via http protocol using Asynchronous JavaScript and XML (AJAX) technology. At the server side, data access is carried out by a Python module that interacts with the web server via Web Server Gateway Interface (WSGI). When a data item, described by an MDSplus expression, is requested by the web browser for visualization, it is returned as a binary message and then handled by callback JavaScript functions activated by the web browser. Scalable Vector Graphics (SVG) technology is used to handle graphics within the web browser and to carry out the same interactive data visualization provided by jScope. In addition to mouse events, touch events are supported to provide interactivity also on touch screens. In this way, waveforms can be

  9. EvolView, an online tool for visualizing, annotating and managing phylogenetic trees.

    Science.gov (United States)

    Zhang, Huangkai; Gao, Shenghan; Lercher, Martin J; Hu, Songnian; Chen, Wei-Hua

    2012-07-01

    EvolView is a web application for visualizing, annotating and managing phylogenetic trees. First, EvolView is a phylogenetic tree viewer and customization tool; it visualizes trees in various formats, customizes them through built-in functions that can link information from external datasets, and exports the customized results to publication-ready figures. Second, EvolView is a tree and dataset management tool: users can easily organize related trees into distinct projects, add new datasets to trees and edit and manage existing trees and datasets. To make EvolView easy to use, it is equipped with an intuitive user interface. With a free account, users can save data and manipulations on the EvolView server. EvolView is freely available at: http://www.evolgenius.info/evolview.html.

  10. GenomeCAT: a versatile tool for the analysis and integrative visualization of DNA copy number variants.

    Science.gov (United States)

    Tebel, Katrin; Boldt, Vivien; Steininger, Anne; Port, Matthias; Ebert, Grit; Ullmann, Reinhard

    2017-01-06

    The analysis of DNA copy number variants (CNV) has increasing impact in the field of genetic diagnostics and research. However, the interpretation of CNV data derived from high resolution array CGH or NGS platforms is complicated by the considerable variability of the human genome. Therefore, tools for multidimensional data analysis and comparison of patient cohorts are needed to assist in the discrimination of clinically relevant CNVs from others. We developed GenomeCAT, a standalone Java application for the analysis and integrative visualization of CNVs. GenomeCAT is composed of three modules dedicated to the inspection of single cases, comparative analysis of multidimensional data and group comparisons aiming at the identification of recurrent aberrations in patients sharing the same phenotype, respectively. Its flexible import options ease the comparative analysis of one's own results derived from microarray or NGS platforms with data from literature or public depositories. Multidimensional data obtained from different experiment types can be merged into a common data matrix to enable common visualization and analysis. All results are stored in the integrated MySQL database, but can also be exported as tab delimited files for further statistical calculations in external programs. GenomeCAT offers a broad spectrum of visualization and analysis tools that assist in the evaluation of CNVs in the context of other experiment data and annotations. The use of GenomeCAT does not require any specialized computer skills. The various R packages implemented for data analysis are fully integrated into GenomeCAT's graphical user interface and the installation process is supported by a wizard. The flexibility in terms of data import and export in combination with the ability to create a common data matrix makes the program also well suited as an interface between genomic data from heterogeneous sources and external software tools. Due to the modular architecture the functionality of

  11. New Tools for Sea Ice Data Analysis and Visualization: NSIDC's Arctic Sea Ice News and Analysis

    Science.gov (United States)

    Vizcarra, N.; Stroeve, J.; Beam, K.; Beitler, J.; Brandt, M.; Kovarik, J.; Savoie, M. H.; Skaug, M.; Stafford, T.

    2017-12-01

    Arctic sea ice has long been recognized as a sensitive climate indicator and has undergone a dramatic decline over the past thirty years. Antarctic sea ice continues to be an intriguing and active field of research. The National Snow and Ice Data Center's Arctic Sea Ice News & Analysis (ASINA) offers researchers and the public a transparent view of sea ice data and analysis. We have released a new set of tools for sea ice analysis and visualization. In addition to Charctic, our interactive sea ice extent graph, the new Sea Ice Data and Analysis Tools page provides access to Arctic and Antarctic sea ice data organized in seven different data workbooks, updated daily or monthly. An interactive tool lets scientists, or the public, quickly compare changes in ice extent and location. Another tool allows users to map trends, anomalies, and means for user-defined time periods. Animations of September Arctic and Antarctic monthly average sea ice extent and concentration may also be accessed from this page. Our tools help the NSIDC scientists monitor and understand sea ice conditions in near real time. They also allow the public to easily interact with and explore sea ice data. Technical innovations in our data center helped NSIDC quickly build these tools and more easily maintain them. The tools were made publicly accessible to meet the desire from the public and members of the media to access the numbers and calculations that power our visualizations and analysis. This poster explores these tools and how other researchers, the media, and the general public are using them.

  12. GeneWiz browser: An Interactive Tool for Visualizing Sequenced Chromosomes

    DEFF Research Database (Denmark)

    Hallin, Peter Fischer; Stærfeldt, Hans Henrik; Rotenberg, Eva

    2009-01-01

    The tool offers improved readability and increased functionality compared to other browsers. It allows the user to select the display of various genomic features, color settings and data ranges. Custom numerical data can be added to the plot, allowing, for example, visualization of gene expression and regulation data. Furthermore, standard atlases are pre-generated for all prokaryotic genomes available in GenBank, providing a fast overview of all available genomes, including recently deposited genome sequences. The tool is available online from http://www.cbs.dtu.dk/services/gwBrowser. Supplemental material, including interactive atlases, is available online at http://www.cbs.dtu.dk/services/gwBrowser/suppl/.

  13. VISUAL TOOLS FOR CROWDSOURCING DATA VALIDATION WITHIN THE GLOBELAND30 GEOPORTAL

    Directory of Open Access Journals (Sweden)

    E. Chuprikova

    2016-06-01

    Full Text Available This research aims to investigate the role of visualization of the user generated data that can empower the geoportal of GlobeLand30 produced by NGCC (National Geomatics Center of China. The focus is set on the development of a concept of tools that can extend the Geo-tagging functionality and make use of it for different target groups. The anticipated tools should improve the continuous data validation, updating and efficient use of the remotely-sensed data distributed within GlobeLand30.

  14. Visual Tools for Crowdsourcing Data Validation Within the GLOBELAND30 Geoportal

    Science.gov (United States)

    Chuprikova, E.; Wu, H.; Murphy, C. E.; Meng, L.

    2016-06-01

    This research aims to investigate the role of visualization of the user generated data that can empower the geoportal of GlobeLand30 produced by NGCC (National Geomatics Center of China). The focus is set on the development of a concept of tools that can extend the Geo-tagging functionality and make use of it for different target groups. The anticipated tools should improve the continuous data validation, updating and efficient use of the remotely-sensed data distributed within GlobeLand30.

  15. Noise Source Visualization Using a Digital Voice Recorder and Low-Cost Sensors

    Directory of Open Access Journals (Sweden)

    Yong Thung Cho

    2018-04-01

    Full Text Available Accurate sound visualization of noise sources is required for optimal noise control. Typically, noise measurement systems require microphones, an analog-digital converter, cables, a data acquisition system, etc., which may not be affordable for potential users. Also, many such systems are not highly portable and may not be convenient for travel. Handheld personal electronic devices such as smartphones and digital voice recorders with relatively lower costs and higher performance have become widely available recently. Even though such devices are highly portable, directly implementing them for noise measurement may lead to erroneous results since such equipment was originally designed for voice recording. In this study, external microphones were connected to a digital voice recorder to conduct measurements and the input received was processed for noise visualization. In this way, a low-cost, compact sound visualization system was designed and introduced to visualize two actual noise sources with different characteristics for verification: an enclosed loudspeaker and a small air compressor. Reasonable accuracy of noise visualization for these two sources was shown over a relatively wide frequency range. This very affordable and compact sound visualization system can be used for many actual noise visualization applications in addition to educational purposes.

  16. Noise Source Visualization Using a Digital Voice Recorder and Low-Cost Sensors.

    Science.gov (United States)

    Cho, Yong Thung

    2018-04-03

    Accurate sound visualization of noise sources is required for optimal noise control. Typically, noise measurement systems require microphones, an analog-digital converter, cables, a data acquisition system, etc., which may not be affordable for potential users. Also, many such systems are not highly portable and may not be convenient for travel. Handheld personal electronic devices such as smartphones and digital voice recorders with relatively lower costs and higher performance have become widely available recently. Even though such devices are highly portable, directly implementing them for noise measurement may lead to erroneous results since such equipment was originally designed for voice recording. In this study, external microphones were connected to a digital voice recorder to conduct measurements and the input received was processed for noise visualization. In this way, a low-cost, compact sound visualization system was designed and introduced to visualize two actual noise sources with different characteristics for verification: an enclosed loudspeaker and a small air compressor. Reasonable accuracy of noise visualization for these two sources was shown over a relatively wide frequency range. This very affordable and compact sound visualization system can be used for many actual noise visualization applications in addition to educational purposes.
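
    The measurement-to-map step can be illustrated with a deliberately simplified sketch (not the authors' processing chain): an uncalibrated level is estimated from each recorded WAV file and the levels are interpolated over the measurement plane. The file names and microphone positions are assumptions.

        # Estimate a relative level per microphone position from mono WAV
        # recordings and draw a simple interpolated noise map.
        import numpy as np
        import matplotlib.pyplot as plt
        from scipy.io import wavfile

        positions = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (0.1, 0.1)]  # metres
        spl_db = []
        for i, _ in enumerate(positions):
            rate, samples = wavfile.read(f"mic_{i}.wav")   # mono file assumed
            x = samples.astype(np.float64)
            x -= x.mean()                                  # remove DC offset
            rms = np.sqrt(np.mean(x ** 2))
            spl_db.append(20 * np.log10(rms + 1e-12))      # relative (uncalibrated) dB

        xs, ys = zip(*positions)
        plt.tricontourf(xs, ys, spl_db, levels=10)
        plt.colorbar(label="relative level (dB)")
        plt.xlabel("x (m)")
        plt.ylabel("y (m)")
        plt.show()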

  17. Intuitive Visualization of Transient Flow: Towards a Full 3D Tool

    Science.gov (United States)

    Michel, Isabel; Schröder, Simon; Seidel, Torsten; König, Christoph

    2015-04-01

    Visualization of geoscientific data is a challenging task, especially when targeting a non-professional audience. In particular, the graphical presentation of transient vector data can be a significant problem. With STRING, Fraunhofer ITWM (Kaiserslautern, Germany), in collaboration with delta h Ingenieurgesellschaft mbH (Witten, Germany), developed commercial software for the intuitive 2D visualization of 3D flow problems. Through the intuitive character of the visualization, experts can more easily convey their findings to non-professional audiences. In STRING, pathlets moving with the flow provide an intuition of velocity and direction of both steady-state and transient flow fields. The visualization concept is based on the Lagrangian view of the flow, which means that the pathlets' movement is along the direction given by pathlines. In order to capture every detail of the flow, an advanced method for intelligent, time-dependent seeding of the pathlets is implemented, based on ideas of the Finite Pointset Method (FPM) originally conceived at and continuously developed by Fraunhofer ITWM. Furthermore, the same method removes pathlets during the visualization to avoid visual clutter. Additional scalar flow attributes, for example concentration or potential, can either be mapped directly to the pathlets or displayed in the background of the pathlets on the 2D visualization plane. The extensive capabilities of STRING are demonstrated with the help of different applications in groundwater modeling. We will discuss the strengths and current restrictions of STRING, which have surfaced during daily use of the software, for example by delta h. Although the software focuses on the graphical presentation of flow data for non-professional audiences, its intuitive visualization has also proven useful to experts when investigating details of flow fields. Due to the popular reception of STRING and its limitation to 2D, the need arises for the extension to a full 3D tool.
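
    The pathlet idea can be made concrete with a deliberately simplified sketch that is not STRING's FPM-based implementation: particles are advected with explicit Euler steps through an analytic velocity field, pathlets leaving the domain are removed, and new ones are reseeded to keep the density roughly constant.

        # Minimal Lagrangian pathlet advection with removal and reseeding.
        import numpy as np

        rng = np.random.default_rng(0)

        def velocity(p):
            # analytic test field: rigid rotation around the origin
            x, y = p[:, 0], p[:, 1]
            return np.column_stack((-y, x))

        n, dt = 200, 0.01
        pts = rng.uniform(-1.0, 1.0, size=(n, 2))       # initial seeding

        for step in range(500):
            pts = pts + dt * velocity(pts)              # Euler advection step
            inside = np.all(np.abs(pts) <= 1.0, axis=1)
            pts = pts[inside]                           # drop pathlets that left
            missing = n - len(pts)
            if missing > 0:                             # reseed to keep density
                pts = np.vstack((pts, rng.uniform(-1.0, 1.0, size=(missing, 2))))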

  18. πScope: Python based scientific workbench with MDSplus data visualization tool

    Energy Technology Data Exchange (ETDEWEB)

    Shiraiwa, S., E-mail: shiraiwa@PSFC.MIT.EDU; Fredian, T.; Hillairet, J.; Stillerman, J.

    2016-11-15

    Highlights: • πScope provides great enhancements in MDSplus data visualization. • πScope provides a single platform for both data browsing and complicated analysis. • πScope is scriptable and easily expandable due to its object-oriented design. • πScope is written in python and available from http://piscope.psfc.mit.edu/. - Abstract: A newly developed python-based scientific data analysis and visualization tool, πScope (http://piscope.psfc.mit.edu), is reported. The primary motivation is 1) to provide an updated tool to browse MDSplus data beyond the existing dwscope/jScope and 2) to realize a universal foundation on which to construct interface tools for performing computer modeling from experimental data. To visualize MDSplus data, πScope has many features including overplotting different signals and discharges, generating various plot types (line, contour, image, etc.), performing in-panel data analysis using python scripts, and producing publication quality graphics. The logic to generate multi-panel plots is designed to be backward compatible with dwscope, enabling smooth migration for users. πScope uses multi-threading in data loading, and is easy to modify and expand due to its object-oriented design. Furthermore, a user can access the data structure both from a GUI and a script, enabling relatively complex data analysis workflows to be built quickly on πScope.
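
    The overplotting workflow can be sketched conceptually as follows; fetch_signal is a hypothetical placeholder for site-specific MDSplus access and is not part of πScope's API, and the node name is an assumption.

        # Overplot the same signal node for several discharges.
        import numpy as np
        import matplotlib.pyplot as plt

        def fetch_signal(shot, node):
            # placeholder: return (time, data) for the given shot and node
            t = np.linspace(0.0, 1.0, 1000)
            return t, np.sin(2 * np.pi * (3 + shot % 5) * t)

        for shot in (101, 102, 103):
            t, y = fetch_signal(shot, r"\ip")   # hypothetical node name
            plt.plot(t, y, label=f"shot {shot}")

        plt.xlabel("time (s)")
        plt.legend()
        plt.show()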

  19. πScope: Python based scientific workbench with MDSplus data visualization tool

    International Nuclear Information System (INIS)

    Shiraiwa, S.; Fredian, T.; Hillairet, J.; Stillerman, J.

    2016-01-01

    Highlights: • πScope provides great enhancements in MDSplus data visualization. • πScope provides a single platform for both data browsing and complicated analysis. • πScope is scriptable and easily expandable due to its object-oriented design. • πScope is written in python and available from http://piscope.psfc.mit.edu/. - Abstract: A newly developed python-based scientific data analysis and visualization tool, πScope (http://piscope.psfc.mit.edu), is reported. The primary motivation is 1) to provide an updated tool to browse MDSplus data beyond the existing dwscope/jScope and 2) to realize a universal foundation on which to construct interface tools for performing computer modeling from experimental data. To visualize MDSplus data, πScope has many features including overplotting different signals and discharges, generating various plot types (line, contour, image, etc.), performing in-panel data analysis using python scripts, and producing publication quality graphics. The logic to generate multi-panel plots is designed to be backward compatible with dwscope, enabling smooth migration for users. πScope uses multi-threading in data loading, and is easy to modify and expand due to its object-oriented design. Furthermore, a user can access the data structure both from a GUI and a script, enabling relatively complex data analysis workflows to be built quickly on πScope.

  20. A new tool for virtual scientific and autostereoscopic visualization of EAST

    International Nuclear Information System (INIS)

    Li, Dan; Xiao, B.J.; Xia, J.Y.; Wang, K.R.; Chen, S.L.; Luo, W.L.

    2016-01-01

    Highlights: • The 3D effect of the virtual EAST has been improved and data visualization has been realized in the ASEAST system. • Interaction behaviors are provided so that users can retrieve information from the database. • The system integrates data acquisition, data visualization and model visualization. • Qt libraries are adopted to realize a cross-platform and impressive graphical interface. • In order to manage the models, a web-based model manager system is constructed. - Abstract: The Experimental Advanced Superconducting Tokamak (EAST) device began operation in 2006. EAST visualization work has received more and more attention for simulating its running state and inner structure. The VEAST system had been developed to display the 3D model of the EAST facility and some diagnostic data based on Java3D. Compared with the VEAST system, a new system named autostereoscopic scientific EAST (ASEAST) has been developed to improve the 3D effect of the virtual EAST and to visualize the experimental data. It uses a client/server (C/S) structure in combination with OpenGL, VTK (Visualization Toolkit), an open-source software system for 3D computer graphics and visualization, and the Qt5 libraries for the graphical user interface (GUI). ASEAST can be used to access information on EAST and its physical properties. In addition, as a general system, ASEAST supports a wide variety of 3D formats. The visualization result can be output in the format corresponding to the input. In order to improve the rendering speed, we used the classic QEM algorithm to simplify the models in the preprocessing stage. As for the 3D effect, a user survey revealed that the system has a good 3D effect.
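
    The QEM preprocessing step mentioned above can be approximated with VTK's quadric-decimation filter; this is a hedged sketch rather than ASEAST code, and the input file name is an assumption.

        # Simplify a triangle mesh with VTK's quadric-error-metric decimation.
        import vtk

        reader = vtk.vtkSTLReader()
        reader.SetFileName("component.stl")
        reader.Update()

        decimate = vtk.vtkQuadricDecimation()
        decimate.SetInputConnection(reader.GetOutputPort())
        decimate.SetTargetReduction(0.9)   # remove roughly 90% of the triangles
        decimate.Update()

        before = reader.GetOutput().GetNumberOfPolys()
        after = decimate.GetOutput().GetNumberOfPolys()
        print(f"triangles: {before} -> {after}")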

  1. A new tool for virtual scientific and autostereoscopic visualization of EAST

    Energy Technology Data Exchange (ETDEWEB)

    Li, Dan, E-mail: lidan@ipp.ac.cn [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei, Anhui (China); Xiao, B.J.; Xia, J.Y.; Wang, K.R. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei, Anhui (China); University of Science and Technology of China, Hefei, Anhui (China); Chen, S.L. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei, Anhui (China); Luo, W.L. [709th Research lnstitute, China Shipbuilding lndustry Corporation, Wuhan, Hubei (China)

    2016-11-15

    Highlights: • The 3D effect of the virtual EAST has been improved and data visualization has been realized in the ASEAST system. • Interaction behaviors are provided so that users can retrieve information from the database. • The system integrates data acquisition, data visualization and model visualization. • Qt libraries are adopted to realize a cross-platform and impressive graphical interface. • In order to manage the models, a web-based model manager system is constructed. - Abstract: The Experimental Advanced Superconducting Tokamak (EAST) device began operation in 2006. EAST visualization work has received more and more attention for simulating its running state and inner structure. The VEAST system had been developed to display the 3D model of the EAST facility and some diagnostic data based on Java3D. Compared with the VEAST system, a new system named autostereoscopic scientific EAST (ASEAST) has been developed to improve the 3D effect of the virtual EAST and to visualize the experimental data. It uses a client/server (C/S) structure in combination with OpenGL, VTK (Visualization Toolkit), an open-source software system for 3D computer graphics and visualization, and the Qt5 libraries for the graphical user interface (GUI). ASEAST can be used to access information on EAST and its physical properties. In addition, as a general system, ASEAST supports a wide variety of 3D formats. The visualization result can be output in the format corresponding to the input. In order to improve the rendering speed, we used the classic QEM algorithm to simplify the models in the preprocessing stage. As for the 3D effect, a user survey revealed that the system has a good 3D effect.

  2. Towards a New Generation of Time-Series Visualization Tools in the ESA Heliophysics Science Archives

    Science.gov (United States)

    Perez, H.; Martinez, B.; Cook, J. P.; Herment, D.; Fernandez, M.; De Teodoro, P.; Arnaud, M.; Middleton, H. R.; Osuna, P.; Arviset, C.

    2017-12-01

    During the last decades, a varied set of Heliophysics missions has allowed the scientific community to gain better knowledge of the solar atmosphere and activity. The remote sensing images of missions such as SOHO have paved the way for Helio-based spatial data visualization software such as JHelioViewer/Helioviewer. On the other hand, the huge amount of in-situ measurements provided by other missions such as Cluster provides a wide base for plot visualization software whose reach is still far from being fully exploited. The Heliophysics Science Archives within the ESAC Science Data Center (ESDC) already provide a first generation of tools for time-series visualization focusing on each mission's needs: visualization of quicklook plots, cross-calibration time series, pre-generated/on-demand multi-plot stacks (Cluster), basic plot zoom in/out options (Ulysses) and easy navigation through the plots in time (Ulysses, Cluster, ISS-Solaces). However, the needs evolve, and scientists involved in new missions require, among other improvements, the plotting of multi-variable data, heat-map stacks, interactive synchronization and axis variable selection. The new Heliophysics archives (such as Solar Orbiter) and the evolution of existing ones (Cluster) intend to address these new challenges. This paper provides an overview of the different approaches to visualizing time series followed within the ESA Heliophysics Archives and their foreseen evolution.

  3. Preoperative automatic visual behavioural analysis as a tool for intraocular lens choice in cataract surgery

    Directory of Open Access Journals (Sweden)

    Heloisa Neumann Nogueira

    2015-04-01

    Full Text Available Purpose: Cataract is the main cause of blindness, affecting 18 million people worldwide, with the highest incidence in the population above 50 years of age. Low visual acuity caused by cataract may have a negative impact on patient quality of life. The current treatment is surgery in order to replace the natural lens with an artificial intraocular lens (IOL, which can be mono- or multifocal. However, due to potential side effects, IOLs must be carefully chosen to ensure higher patient satisfaction. Thus, studies on the visual behavior of these patients may be an important tool to determine the best type of IOL implantation. This study proposed an anamnestic add-on for optimizing the choice of IOL. Methods: We used a camera that automatically takes pictures, documenting the patient's visual routine in order to obtain additional information about the frequency of distant, intermediate, and near viewing. Results: The results indicated an estimated frequency percentage, suggesting that visual analysis of routine photographic records of a patient with cataract may be useful for understanding gaze behaviour and for choosing a visual management strategy after cataract surgery, simultaneously stimulating interest in customized IOL manufacturing according to individual needs.

  4. Visualization of neutron flux and power distributions in TRIGA Mark II reactor as an educational tool

    International Nuclear Information System (INIS)

    Snoj, Luka; Ravnik, Matjaz; Lengar, Igor

    2008-01-01

    Modern Monte Carlo computer codes (e.g. MCNP) for neutron transport allow the calculation of detailed neutron flux and power distributions in complex geometries with a resolution of ∼1 mm. Moreover, they enable the calculation of individual particle tracks, scattering and absorption events. With the use of advanced software for 3D visualization (e.g. Amira, Voxler, etc.), one can create and present neutron flux and power distributions in a 'user-friendly' way convenient for educational purposes. One can view the axial, radial or any other spatial distribution of the neutron flux and power in a nuclear reactor from various perspectives and in various modalities of presentation. By visualizing the distribution of scattering and absorption events and individual particle tracks, one can visualize neutron transport parameters (mean free path, diffusion length, macroscopic cross section, up-scattering, thermalization, etc.) from an elementary point of view. Most people remember better if they visualize the processes. Therefore, the representation of the reactor and neutron transport parameters is a convenient modern educational tool for (nuclear power plant) operators, nuclear engineers, students and specialists involved in reactor operation and design. The visualization of neutron flux and power distributions in the Jozef Stefan Institute TRIGA Mark II research reactor is treated in the paper. The distributions are calculated with the MCNP computer code and presented using Amira and Voxler software. The results in the form of figures are presented in the paper together with comments qualitatively explaining the figures. (authors)
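
    The kind of figure described above can be reproduced in miniature with synthetic data (this is neither MCNP output nor the authors' Amira/Voxler workflow): a 3D flux array is sliced at the mid-plane and shown as a 2D map.

        # Plot an axial mid-plane slice of a synthetic 3D neutron flux field.
        import numpy as np
        import matplotlib.pyplot as plt

        z, y, x = np.mgrid[-1:1:50j, -1:1:50j, -1:1:50j]
        flux = np.exp(-(x**2 + y**2 + z**2) / 0.2)       # peaked at the centre

        mid = flux.shape[0] // 2
        plt.imshow(flux[mid], origin="lower", extent=[-1, 1, -1, 1])
        plt.colorbar(label="relative neutron flux")
        plt.xlabel("x (arb. units)")
        plt.ylabel("y (arb. units)")
        plt.title("axial mid-plane slice")
        plt.show()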

  5. Evaluation of a visual risk communication tool: effects on knowledge and perception of blood transfusion risk.

    Science.gov (United States)

    Lee, D H; Mehta, M D

    2003-06-01

    Effective risk communication in transfusion medicine is important for health-care consumers, but understanding the numerical magnitude of risks can be difficult. The objective of this study was to determine the effect of a visual risk communication tool on the knowledge and perception of transfusion risk. Laypeople were randomly assigned to receive transfusion risk information with either a written or a visual presentation format for communicating and comparing the probabilities of transfusion risks relative to other hazards. Knowledge of transfusion risk was ascertained with a multiple-choice quiz and risk perception was ascertained by psychometric scaling and principal components analysis. Two-hundred subjects were recruited and randomly assigned. Risk communication with both written and visual presentation formats increased knowledge of transfusion risk and decreased the perceived dread and severity of transfusion risk. Neither format changed the perceived knowledge and control of transfusion risk, nor the perceived benefit of transfusion. No differences in knowledge or risk perception outcomes were detected between the groups randomly assigned to written or visual presentation formats. Risk communication that incorporates risk comparisons in either written or visual presentation formats can improve knowledge and reduce the perception of transfusion risk in laypeople.

  6. Open Source and Proprietary Project Management Tools for SMEs.

    OpenAIRE

    Veronika Abramova; Francisco Pires; Jorge Bernardino

    2017-01-01

    The growth in project size and the increasing difficulty of project management have promoted the development of different tools that facilitate project management and track project schedule, resources and overall progress. These tools offer a variety of features, from task and time management up to integrated CRM (Customer Relationship Management) and ERP (Enterprise Resource Planning) modules. Currently, a large number of project management software packages are available to assist project teams during t...

  7. SieveSifter: a web-based tool for visualizing the sieve analyses of HIV-1 vaccine efficacy trials.

    Science.gov (United States)

    Fiore-Gartland, Andrew; Kullman, Nicholas; deCamp, Allan C; Clenaghan, Graham; Yang, Wayne; Magaret, Craig A; Edlefsen, Paul T; Gilbert, Peter B

    2017-08-01

    Analysis of HIV-1 virions from participants infected in a randomized controlled preventive HIV-1 vaccine efficacy trial can help elucidate mechanisms of partial protection. By comparing the genetic sequence of viruses from vaccine and placebo recipients to the sequence of the vaccine itself, a technique called 'sieve analysis', one can identify functional specificities of vaccine-induced immune responses. We have created an interactive web-based visualization and data access tool for exploring the results of sieve analyses performed on four major preventive HIV-1 vaccine efficacy trials: (i) the HIV Vaccine Trial Network (HVTN) 502/Step trial, (ii) the RV144/Thai trial, (iii) the HVTN 503/Phambili trial and (iv) the HVTN 505 trial. The tool acts simultaneously as a platform for rapid reinterpretation of sieve effects and as a portal for organizing and sharing the viral sequence data. Access to these valuable datasets also enables the development of novel methodology for future sieve analyses. Visualization: http://sieve.fredhutch.org/viz . Source code: https://github.com/nkullman/SIEVE . Data API: http://sieve.fredhutch.org/data . agartlan@fredhutch.org. © The Author(s) 2017. Published by Oxford University Press.

  8. On the road to a stronger public health workforce: visual tools to address complex challenges.

    Science.gov (United States)

    Drehobl, Patricia; Stover, Beth H; Koo, Denise

    2014-11-01

    The public health workforce is vital to protecting the health and safety of the public, yet for years, state and local governmental public health agencies have reported substantial workforce losses and other challenges to the workforce that threaten the public's health. These challenges are complex, often involve multiple influencing or related causal factors, and demand comprehensive solutions. However, proposed solutions often focus on selected factors and might be fragmented rather than comprehensive. This paper describes approaches to characterizing the situation more comprehensively and includes two visual tools: (1) a fishbone, or Ishikawa, diagram that depicts multiple factors affecting the public health workforce; and (2) a roadmap that displays key elements-goals and strategies-to strengthen the public health workforce, thus moving from the problems depicted in the fishbone toward solutions. The visual tools aid thinking about ways to strengthen the public health workforce through collective solutions and to help leverage resources and build on each other's work. The strategic roadmap is intended to serve as a dynamic tool for partnership, prioritization, and gap assessment. These tools reflect and support CDC's commitment to working with partners on the highest priorities for strengthening the workforce to improve the public's health. Published by Elsevier Inc.

  9. The development of a visualization tool for displaying analysis and test results

    International Nuclear Information System (INIS)

    Uncapher, W.L.; Ammerman, D.J.; Ludwigsen, J.S.; Wix, S.D.

    1995-01-01

    The evaluation and certification of packages for transportation of radioactive materials is performed by analysis, testing, or a combination of both. Within the last few years, many transport packages that were certified have used a combination of analysis and testing. The ability to combine and display both kinds of data with interactive graphical tools allows a faster and more complete understanding of the response of the package to these environments. Sandia National Laboratories has developed an initial version of a visualization tool that allows the comparison and display of test and of analytical data as part of a Department of Energy-sponsored program to support advanced analytical techniques and test methodologies. The capability of the tool extends to both mechanical (structural) and thermal data

  10. GO(vis), a gene ontology visualization tool based on multi-dimensional values.

    Science.gov (United States)

    Ning, Zi; Jiang, Zhenran

    2010-05-01

    Most gene product similarity measurements concentrate on the information content of Gene Ontology (GO) terms or use a path-based similarity between GO terms, which may ignore other important information contained in the structure of the ontology. In our study, we integrate different GO similarity measurement approaches to analyze the functional relationship of genes and gene products with a new triangle-based visualization tool called GO(Vis). The purpose of this tool is to demonstrate the effect of three important information factors when measuring the similarity between gene products. One advantage of this tool is that the importance ratio of each factor can be adjusted to meet different measurement requirements according to biological knowledge. The experimental results demonstrate that GO(Vis) can effectively display diagrams of the functional relationships of gene products.

  11. MARs Tools for Interactive ANalysis (MARTIAN): Google Maps Tools for Visual Exploration of Geophysical Modeling on Mars

    Science.gov (United States)

    Dimitrova, L. L.; Haines, M.; Holt, W. E.; Schultz, R. A.; Richard, G.; Haines, A. J.

    2006-12-01

    Interactive maps of surface-breaking faults and stress models on Mars provide important tools to engage undergraduate students, educators, and scientists with current geological and geophysical research. We have developed a map based on the Google Maps API, an Internet-based tool combining DHTML and AJAX which allows very large maps to be viewed over the World Wide Web. Typically, small portions of the maps are downloaded as needed, rather than the entire image at once. This set-up enables relatively fast access for users with low bandwidth. Furthermore, Google Maps provides an extensible interactive interface, making it ideal for visualizing multiple data sets at the user's choice. The Google Maps API works primarily with data referenced to latitudes and longitudes, which is then mapped in Mercator projection only. We have developed utilities for general cylindrical coordinate systems by converting these coordinates into the equivalent Mercator projection coordinates before including them on the map. The MARTIAN project is available at http://rock.geo.sunysb.edu/~holt/Mars/MARTIAN/. We begin with an introduction to the Martian surface using a topography model. Faults from several datasets are classified by type (extension vs. compression) and by time epoch. Deviatoric stresses due to gravitational potential energy differences, calculated from the topography and crustal thickness, can be overlain. Several quantitative measures for the fit of the stress field to the faults are also included. We provide introductory text and exercises spanning a range of topics: how faults are identified, what stress is and how it relates to faults, what gravitational potential energy is and how variations in it produce stress, how the models are created, and how these models can be evaluated and interpreted. The MARTIAN tool is used at Stony Brook University in GEO 310: Introduction to Geophysics, a class geared towards junior and senior geosciences majors. Although this project is in its
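
    The coordinate conversion mentioned above amounts to the standard spherical Mercator formulas; a minimal sketch (assuming a spherical body of radius R, here the Mars mean radius) is:

        # Convert latitude/longitude (degrees) to Mercator map coordinates.
        import math

        def to_mercator(lat_deg, lon_deg, radius=3389.5e3):  # Mars mean radius, m
            lam = math.radians(lon_deg)
            phi = math.radians(lat_deg)
            x = radius * lam
            y = radius * math.log(math.tan(math.pi / 4 + phi / 2))  # undefined at the poles
            return x, y

        print(to_mercator(30.0, 45.0))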

  12. Open source tracking and analysis of adult Drosophila locomotion in Buridan's paradigm with and without visual targets.

    Directory of Open Access Journals (Sweden)

    Julien Colomb

    Full Text Available BACKGROUND: Insects have been among the most widely used model systems for studying the control of locomotion by nervous systems. In Drosophila, we implemented a simple test for locomotion: in Buridan's paradigm, flies walk back and forth between two inaccessible visual targets [1]. Until today, the lack of easily accessible tools for tracking the fly position and analyzing its trajectory has probably contributed to the slow acceptance of Buridan's paradigm. METHODOLOGY/PRINCIPAL FINDINGS: We present here a package of open source software designed to track a single animal walking in a homogeneous environment (Buritrack) and to analyze its trajectory. The Centroid Trajectory Analysis (CeTrAn) software is coded in the open source statistics project R. It extracts eleven metrics and includes correlation analyses and a Principal Components Analysis (PCA). It was designed to be easily customized to personal requirements. In combination with inexpensive hardware, these tools can readily be used for teaching and research purposes. We demonstrate the capabilities of our package by measuring the locomotor behavior of adult Drosophila melanogaster (whose wings were clipped), either in the presence or in the absence of visual targets, and comparing the latter to different computer-generated data. The analysis of the trajectories confirms that flies are centrophobic and shows that inaccessible visual targets can alter the orientation of the flies without changing their overall patterns of activity. CONCLUSIONS/SIGNIFICANCE: Using computer-generated data, the analysis software was tested, and chance values for some metrics (as well as chance values for their correlations) were set. Our results prompt the hypothesis that fixation behavior is observed only if negative phototaxis can overcome the propensity of the flies to avoid the center of the platform. Together with our companion paper, we provide new tools to promote Open Science as well as the collection and
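
    CeTrAn itself is written in R; purely as an illustration of the kind of metrics it extracts, two of them (total distance walked and median distance from the platform centre, a simple index of centrophobism) can be computed from an x,y trajectory as follows, using a synthetic random walk:

        # Two example trajectory metrics from an (N, 2) array of x,y positions.
        import numpy as np

        def trajectory_metrics(xy):
            steps = np.diff(xy, axis=0)
            total_distance = np.sum(np.hypot(steps[:, 0], steps[:, 1]))
            dist_from_centre = np.hypot(xy[:, 0], xy[:, 1])
            return total_distance, np.median(dist_from_centre)

        rng = np.random.default_rng(1)
        xy = np.cumsum(rng.normal(scale=0.5, size=(1000, 2)), axis=0)
        print(trajectory_metrics(xy))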

  13. Web-Based Tools for Data Visualization and Decision Support for South Asia

    Science.gov (United States)

    Jones, N.; Nelson, J.; Pulla, S. T.; Ames, D. P.; Souffront, M.; David, C. H.; Zaitchik, B. F.; Gatlin, P. N.; Matin, M. A.

    2017-12-01

    The objective of the NASA SERVIR project is to assist developing countries in using information provided by Earth observing satellites to assess and manage climate risks, land use, and water resources. We present a collection of web apps that integrate earth observations and in situ data to facilitate deployment of data and water resources models as decision-making tools in support of this effort. The interactive nature of web apps makes this an excellent medium for creating decision support tools that harness cutting edge modeling techniques. Thin-client apps hosted in a cloud portal eliminate the need for the decision makers to procure and maintain the high performance hardware required by the models, deal with issues related to software installation and platform incompatibilities, or monitor and install software updates, a problem that is exacerbated for many of the regional SERVIR hubs where both financial and technical capacity may be limited. All that is needed to use the system is an Internet connection and a web browser. We take advantage of these technologies to develop tools which can be centrally maintained but openly accessible. Advanced mapping and visualization make results intuitive and the derived information actionable. We also take advantage of the emerging standards for sharing water information across the web using the OGC- and WMO-approved WaterML standards. This makes our tools interoperable and extensible via application programming interfaces (APIs) so that tools and data from other projects can both consume and share the tools developed in our project. Our approach enables the integration of multiple types of data and models, thus facilitating collaboration between science teams in SERVIR. The apps developed thus far by our team process time-varying netCDF files from Earth observations and large-scale computer simulations and allow visualization and exploration via raster animation and extraction of time series at selected points and/or regions.
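
    The netCDF point-extraction step described above can be sketched with xarray; the file, variable and coordinate names below are assumptions, not the project's actual data products.

        # Extract a time series at a point of interest and resample it monthly.
        import xarray as xr

        ds = xr.open_dataset("model_output.nc")

        # nearest-neighbour extraction at a chosen latitude/longitude
        series = ds["precipitation"].sel(lat=27.7, lon=85.3, method="nearest")

        # monthly means, saved as CSV for a web app to consume
        monthly = series.resample(time="1MS").mean()
        monthly.to_dataframe().to_csv("point_timeseries.csv")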

  14. Using McIDAS-V data analysis and visualization software as an educational tool for understanding the atmosphere

    Science.gov (United States)

    Achtor, T. H.; Rink, T.

    2010-12-01

    The University of Wisconsin’s Space Science and Engineering Center (SSEC) has been at the forefront of developing data analysis and visualization tools for environmental satellites and other geophysical data. The fifth generation of the Man-computer Interactive Data Access System (McIDAS-V) is Java-based, open-source, freely available software that operates on Linux, Macintosh and Windows systems. The software tools provide powerful new data manipulation and visualization capabilities that work with geophysical data in research, operational and educational environments. McIDAS-V provides unique capabilities to support innovative techniques for evaluating research results, teaching and training. McIDAS-V is based on three powerful software elements. VisAD is a Java library for building interactive, collaborative, 4-dimensional visualization and analysis tools. The Integrated Data Viewer (IDV) is a reference application based on the VisAD system and developed by the Unidata program that demonstrates the flexibility that is needed in this evolving environment, using a modern, object-oriented software design approach. The third tool, HYDRA, allows users to build, display and interrogate multi- and hyperspectral environmental satellite data in powerful ways. The McIDAS-V software is being used for training and education in several settings. The McIDAS User Group provides training workshops at its annual meeting. Numerous online tutorials with training data sets have been developed to aid users in learning simple and more complex operations in McIDAS-V; all are available online. In a University of Wisconsin-Madison undergraduate course in Radar and Satellite Meteorology, McIDAS-V is used to create and deliver laboratory exercises using case-study and real-time data. At the high school level, McIDAS-V is used in several exercises in our annual Summer Workshop in Earth and Atmospheric Sciences to provide young scientists the opportunity to examine data with friendly and

  15. Beam simulation tools for GEANT4 (and neutrino source applications)

    International Nuclear Information System (INIS)

    Elvira, V. Daniel, E-mail: daniel@fnal.gov; Lebrun, Paul; Spentzouris, Panagiotis

    2002-01-01

    Geant4 is a toolkit developed by a collaboration of physicists and computer professionals in the high energy physics field for the simulation of the passage of particles through matter. The motivation for the development of the Beam Tools is to extend the Geant4 applications to accelerator physics. Although there are many computer programs for beam physics simulations, Geant4 is ideal for modeling a beam passing through material or a system with a beam line integrated with a complex detector. There are many examples in the current international high energy physics programs, such as studies related to a future Neutrino Factory, a Linear Collider, and a Very Large Hadron Collider.

  16. High Fidelity Tool for Noise Source Identification, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Thorough understanding of airframe and propulsion aerodynamic noise sources and the subsequent acoustic propagation to the farfield is necessary to the design and...

  17. Digital administrative maps – A tool for visualization of epidemiological data

    Directory of Open Access Journals (Sweden)

    Ewa Niewiadomska

    2013-08-01

    Full Text Available Background: The aim of the study is to present methods for the visualization of epidemiological data using digital contour maps that take into account the administrative division of Poland. Materials and Methods: The possibilities of visualizing epidemiological data in a geographical arrangement, limited to the administrative levels of the country, voivodeships and poviats (counties), are presented. They are crucial for the process of identifying and undertaking adequate prophylactic activities directed towards decreasing the risk and improving the population's health. This paper presents tools and techniques available in the Geographic Information System ArcGIS and the statistical software package R. Results: The work includes our own data reflecting: 1) the values of specific mortality rates due to respiratory diseases, Poland, 2010, based on the Central Statistical Office data, using the R statistical software package; 2) the averaged registered incidence rates of sarcoidosis in 2006-2010 for the population aged 19+ in the Silesian voivodeship, using the Geographic Information System ArcGIS; and 3) the number of children with diagnosed respiratory diseases in the city of Legnica in 2009, taking into account their place of residence, using layered maps in the Geographic Information System ArcGIS. Conclusions: The tools presented and described in this paper make it possible to visualize the results of research, to increase the attractiveness of courses for students, as well as to enhance the skills and competence of students and course participants. Med Pr 2013;64(4):533–539
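
    A minimal sketch of such a choropleth in Python (the paper itself uses ArcGIS and R; the shapefile and column names below are assumptions) could look like this:

        # Draw a choropleth of a rate indicator over administrative units.
        import geopandas as gpd
        import matplotlib.pyplot as plt

        units = gpd.read_file("powiaty.shp")          # administrative boundaries
        units.plot(column="mortality_rate", legend=True, cmap="OrRd",
                   edgecolor="black", linewidth=0.2)
        plt.title("Mortality rate by county")
        plt.show()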

  18. A Hyperbolic Ontology Visualization Tool for Model Application Programming Interface Documentation

    Science.gov (United States)

    Hyman, Cody

    2011-01-01

    Spacecraft modeling, a critically important part of validating planned spacecraft activities, is currently carried out using a time-consuming method of mission-to-mission model implementations and integration. A current project in early development, Integrated Spacecraft Analysis (ISCA), aims to remedy this hindrance by providing reusable architectures and reducing time spent integrating models with planning and sequencing tools. The principal objective of this internship was to develop a user interface for an experimental ontology-based structure visualization of navigation and attitude control system modeling software. To satisfy this, a number of tree and graph visualization tools were researched and a Java-based hyperbolic graph viewer was selected for experimental adaptation. Early results show promise in the ability to organize and display large amounts of spacecraft model documentation efficiently and effectively through a web browser. This viewer serves as a conceptual implementation for future development, but trials with both ISCA developers and end users should be performed to truly evaluate the effectiveness of continued development of such visualizations.

  19. Survivability as a Tool for Evaluating Open Source Software

    Science.gov (United States)

    2015-06-01

    tremendously successful in certain applications such as the Mozilla Firefox web browser and the Apache web server [10]. Open source software is often ... source versions (such as Internet Explorer compared to Mozilla Firefox), which typically conclude that vulnerabilities are, in fact, much more ...

  20. Integrating Philips' extreme UV source in the alpha-tools

    Science.gov (United States)

    Pankert, Joseph; Apetz, Rolf; Bergmann, Klaus; Derra, Guenther; Janssen, Maurice; Jonkers, Jeroen; Klein, Jurgen; Kruecken, Thomas; List, Andreas; Loeken, Michael; Metzmacher, Christof; Neff, Willi; Probst, Sven; Prummer, Ralph; Rosier, Oliver; Seiwert, Stefan; Siemons, Guido; Vaudrevange, Dominik; Wagemann, Dirk; Weber, Achim; Zink, Peter; Zitzen, Oliver

    2005-05-01

    The paper describes recent progress in the development of the Philips EUV source. Progress has been realized on many frontiers: Integration of the source into a scanner has primarily been studied on the Xe source because it has a high degree of maturity. We report on integration with a collector, the associated collector lifetime and optical characteristics. A collector lifetime in excess of 1 bln shots could be demonstrated. Next, an active dose control system was developed and tested on the Xe lamp. The resulting dose stability data are below 0.2% for an exposure window of 100 pulses. The second part of the paper reports on progress in the development of the Philips Sn source. First, the details of the concept are described. It is based on a laser-triggered vacuum arc, which is an extension with respect to previous designs. The source is equipped with rotating electrodes that are covered with a Sn film that is constantly regenerated. Hence, by the very design of the source, it is scalable to very high power levels, and moreover it has fundamentally solved the notorious problem of electrode erosion. Power values of 260 W in 2π sr are reported, along with stable, long-life operation of the lamp. The paper also addresses the problem of debris generation and mitigation for the Sn source. The problem is attacked by a combined strategy of protecting the collector by traditional means (e.g. fields, foil traps) and of designing the gas atmosphere according to the principles of the well-known halogen cycles in incandescent lamps. These principles have been studied in the lighting industry for decades and rely on the excessively high vapor pressures of metal halides. Transferred to the Sn source, this approach allows tin residues that would otherwise irreversibly deposit on the collector to be pumped away.

  1. Evolview v2: an online visualization and management tool for customized and annotated phylogenetic trees.

    Science.gov (United States)

    He, Zilong; Zhang, Huangkai; Gao, Shenghan; Lercher, Martin J; Chen, Wei-Hua; Hu, Songnian

    2016-07-08

    Evolview is an online visualization and management tool for customized and annotated phylogenetic trees. It allows users to visualize phylogenetic trees in various formats, customize the trees through built-in functions and user-supplied datasets and export the customization results to publication-ready figures. Its 'dataset system' contains not only the data to be visualized on the tree, but also 'modifiers' that control various aspects of the graphical annotation. Evolview is a single-page application (like Gmail); its carefully designed interface allows users to upload, visualize, manipulate and manage trees and datasets all in a single webpage. Developments since the last public release include a modern dataset editor with keyword highlighting functionality, seven newly added types of annotation datasets, collaboration support that allows users to share their trees and datasets and various improvements of the web interface and performance. In addition, we included eleven new 'Demo' trees to demonstrate the basic functionalities of Evolview, and five new 'Showcase' trees inspired by publications to showcase the power of Evolview in producing publication-ready figures. Evolview is freely available at: http://www.evolgenius.info/evolview/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  2. Methods and apparatus for safely handling radioactive sources in measuring-while-drilling tools

    International Nuclear Information System (INIS)

    Wraight, P.D.

    1989-01-01

    This patent describes a method for removing a chemical radioactive source from an MWD tool which is coupled in a drill string supported by a drilling rig while a borehole is drilled and which includes logging means for measuring formation characteristics in response to irradiation of the adjacent formations by the radioactive source during the drilling operation. The steps of the method are: halting the drilling operation and then removing the drill string from the borehole, moving the MWD tool to a work station at the surface where the source is at a safe working distance from the drilling rig and will be accessible by way of one end of the MWD tool; positioning a radiation shield at a location adjacent to the one end of the MWD tool, where the shield is ready for receiving the source as it is moved away from the other end of the MWD tool, and then moving the source away from the other end of the MWD tool so as to enclose the source within the shield; and, once the source is enclosed within the shield, removing the shield together with the enclosed source from the MWD tool for transferring the enclosed source to another work station.

  3. A Model-Driven Visualization Tool for Use with Model-Based Systems Engineering Projects

    Science.gov (United States)

    Trase, Kathryn; Fink, Eric

    2014-01-01

    Model-Based Systems Engineering (MBSE) promotes increased consistency between a system's design and its design documentation through the use of an object-oriented system model. The creation of this system model facilitates data presentation by providing a mechanism from which information can be extracted by automated manipulation of model content. Existing MBSE tools enable model creation, but are often too complex for the unfamiliar model viewer to easily use. These tools do not yet provide many opportunities for easing into the development and use of a system model when system design documentation already exists. This study creates a Systems Modeling Language (SysML) Document Traceability Framework (SDTF) for integrating design documentation with a system model, and develops an Interactive Visualization Engine for SysML Tools (InVEST) that exports consistent, clear, and concise views of SysML model data. These exported views are each meaningful to a variety of project stakeholders with differing subjects of concern and depth of technical involvement. InVEST allows a model user to generate multiple views and reports from an MBSE model, including wiki pages and interactive visualizations of data. System data can also be filtered to present only the information relevant to the particular stakeholder, resulting in a view that is consistent with both the larger system model and other model views. Viewing the relationships between system artifacts and documentation, and filtering through data to see specialized views, improves the value of the system as a whole, as data becomes information.

  4. Nanobodies as Versatile Tools to Understand, Diagnose, Visualize and Treat Cancer

    Directory of Open Access Journals (Sweden)

    Isabel Van Audenhove

    2016-06-01

    Full Text Available Since their discovery, nanobodies have been used extensively in the fields of research, diagnostics and therapy. These antigen-binding fragments, originating from camelid heavy-chain antibodies, possess unusual hallmarks in terms of (small) size, stability, solubility and specificity, hence allowing cost-effective production and sometimes outperforming monoclonal antibodies. In this review, we evaluate the current status of nanobodies to study, diagnose, visualize or inhibit cancer-specific proteins and processes. Nanobodies are highly adaptable tools for cancer research as they enable specific modulation of targets, enzymatic and non-enzymatic proteins alike. Molecular imaging studies benefit from the rapid, homogeneous tumor accumulation of nanobodies and their fast blood clearance, permitting previously unattainable fast tumor visualization. Moreover, they are endowed with considerable therapeutic potential as inhibitors of receptor-ligand pairs and deliverers of drugs or drug-loaded nanoparticles towards tumors. More in vivo and clinical studies are, however, eagerly awaited to unleash their full potential.

  5. New Abstraction Networks and a New Visualization Tool in Support of Auditing the SNOMED CT Content

    Science.gov (United States)

    Geller, James; Ochs, Christopher; Perl, Yehoshua; Xu, Junchuan

    2012-01-01

    Medical terminologies are large and complex. Frequently, errors are hidden in this complexity. Our objective is to find such errors, which can be aided by deriving abstraction networks from a large terminology. Abstraction networks preserve important features but eliminate many minor details, which are often not useful for identifying errors. Providing visualizations for such abstraction networks aids auditors by allowing them to quickly focus on elements of interest within a terminology. Previously we introduced area taxonomies and partial area taxonomies for SNOMED CT. In this paper, two advanced, novel kinds of abstraction networks, the relationship-constrained partial area subtaxonomy and the root-constrained partial area subtaxonomy are defined and their benefits are demonstrated. We also describe BLUSNO, an innovative software tool for quickly generating and visualizing these SNOMED CT abstraction networks. BLUSNO is a dynamic, interactive system that provides quick access to well organized information about SNOMED CT. PMID:23304293

  6. ANLIZE: a molecular mechanics force field visualization tool and its application to 18-crown-6.

    Science.gov (United States)

    Stolworthy, L D; Shirts, R B

    1997-03-01

    We describe a software tool that allows one to visualize and analyze the importance of each individual steric interaction in a molecular mechanics force field. ANLIZE is presently implemented for the Dreiding force field for use with the Cerius2 software package, but could be implemented in any molecular mechanics package with a graphical user interface. ANLIZE calculates individual interactions in the force field, sorts them by size, and displays them in several ways from a menu of choices. This allows the user to scan through selected interactions to visualize which interactions are the primary determinants of preferred conformations. The features of ANLIZE are illustrated using 18-crown-6 as an example, and the factors governing conformational preference in 18-crown-6 are demonstrated. Users of molecular mechanics packages are encouraged to demand this functionality from commercial software producers.
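
    The calculate-sort-display idea can be illustrated generically (this is not ANLIZE and not the Dreiding force field; the coordinates and parameters below are made-up, single-atom-type values):

        # Compute individual pairwise Lennard-Jones energies and list the
        # largest contributions first, as an auditing aid.
        import itertools
        import numpy as np

        coords = np.array([[0.0, 0.0, 0.0],
                           [3.8, 0.0, 0.0],
                           [0.0, 4.2, 0.0],
                           [3.5, 3.5, 0.5]])      # angstroms
        epsilon, sigma = 0.1, 3.4                 # kcal/mol, angstroms (assumed)

        def lj(r):
            s6 = (sigma / r) ** 6
            return 4.0 * epsilon * (s6 ** 2 - s6)

        pairs = []
        for i, j in itertools.combinations(range(len(coords)), 2):
            r = np.linalg.norm(coords[i] - coords[j])
            pairs.append(((i, j), lj(r)))

        for (i, j), e in sorted(pairs, key=lambda p: abs(p[1]), reverse=True):
            print(f"atoms {i}-{j}: {e:+.4f} kcal/mol")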

  7. Oral Development for LSP via Open Source Tools

    Directory of Open Access Journals (Sweden)

    Alejandro Curado Fuentes

    2015-11-01

    Full Text Available For the development of oral abilities in LSP, few computer-based teaching and learning resources have actually focused intensively on web-based listening and speaking. Many more focus on reading, writing, vocabulary and grammar activities. Our aim in this paper is to approach oral communication in the online environment of Moodle by striving to make it suitable for a learning project which incorporates oral skills. The paper describes a blended process in which both individual and collaborative learning strategies can be combined and exploited through the implementation of specific tools and resources which may go hand in hand with traditional face-to-face conversational classes. The challenge with this new perspective is, ultimately, to provide effective tools for oral LSP development in an apparently writing-skill-focused medium.

  8. PACOM: A Versatile Tool for Integrating, Filtering, Visualizing, and Comparing Multiple Large Mass Spectrometry Proteomics Data Sets.

    Science.gov (United States)

    Martínez-Bartolomé, Salvador; Medina-Aunon, J Alberto; López-García, Miguel Ángel; González-Tejedo, Carmen; Prieto, Gorka; Navajas, Rosana; Salazar-Donate, Emilio; Fernández-Costa, Carolina; Yates, John R; Albar, Juan Pablo

    2018-04-06

    Mass-spectrometry-based proteomics has evolved into a high-throughput technology in which numerous large-scale data sets are generated from diverse analytical platforms. Furthermore, several scientific journals and funding agencies have emphasized the storage of proteomics data in public repositories to facilitate its evaluation, inspection, and reanalysis. (1) As a consequence, public proteomics data repositories are growing rapidly. However, tools are needed to integrate multiple proteomics data sets to compare different experimental features or to perform quality control analysis. Here, we present a new Java stand-alone tool, Proteomics Assay COMparator (PACOM), that is able to import, combine, and simultaneously compare numerous proteomics experiments to check the integrity of the proteomic data as well as verify data quality. With PACOM, the user can detect sources of error that may have been introduced in any step of a proteomics workflow and that influence the final results. Data sets can be easily compared and integrated, and data quality and reproducibility can be visually assessed through a rich set of graphical representations of proteomics data features as well as a wide variety of data filters. Its flexibility and easy-to-use interface make PACOM a unique tool for daily use in a proteomics laboratory. PACOM is available at https://github.com/smdb21/pacom.

  9. The sensory timecourses associated with conscious visual item memory and source memory.

    Science.gov (United States)

    Thakral, Preston P; Slotnick, Scott D

    2015-09-01

    Previous event-related potential (ERP) findings have suggested that during visual item and source memory, nonconscious and conscious sensory (occipital-temporal) activity onsets may be restricted to early (0-800 ms) and late (800-1600 ms) temporal epochs, respectively. In an ERP experiment, we tested this hypothesis by separately assessing whether the onset of conscious sensory activity was restricted to the late epoch during source (location) memory and item (shape) memory. We found that conscious sensory activity had a late (>800 ms) onset during source memory and an early (<800 ms) onset during item memory. In a follow-up fMRI experiment, conscious sensory activity was localized to BA17, BA18, and BA19. Of primary importance, the distinct source memory and item memory ERP onsets contradict the hypothesis that there is a fixed temporal boundary separating nonconscious and conscious processing during all forms of visual conscious retrieval. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. The pyPHaz software, an interactive tool to analyze and visualize results from probabilistic hazard assessments

    Science.gov (United States)

    Tonini, Roberto; Selva, Jacopo; Costa, Antonio; Sandri, Laura

    2014-05-01

    Probabilistic Hazard Assessment (PHA) is becoming an essential tool for risk mitigation policies, since it allows the hazard due to hazardous phenomena to be quantified and, unlike the deterministic approach, it accounts for both aleatory and epistemic uncertainties. On the other hand, one of the main disadvantages of PHA methods is that their results are not easy to understand and interpret by people who are not specialists in probabilistic tools. For scientists, this leads to the issue of providing tools that can be easily used and understood by decision makers (i.e., risk managers or local authorities). The work presented here addresses the problem of simplifying the transfer between scientific knowledge and land protection policies, by providing an interface between scientists, who produce PHA results, and decision makers, who use PHA results for risk analyses. In this framework we present pyPHaz, an open tool developed and designed to visualize and analyze PHA results due to one or more phenomena affecting a specific area of interest. The software implementation has been fully developed with the free and open-source Python programming language and some featured Python-based libraries and modules. The pyPHaz tool allows visualizing the Hazard Curves (HC) calculated in a selected target area together with different levels of uncertainty (mean and percentiles) on maps that can be interactively created and modified by the user, thanks to a dedicated Graphical User Interface (GUI). Moreover, the tool can be used to compare the results of different PHA models and to merge them, by creating ensemble models. The pyPHaz software has been designed with the features of storing and accessing all the data through a MySQL database and of being able to read as input the XML-based standard file formats defined in the frame of GEM (Global Earthquake Model). This format model is easy to extend to any other kind of hazard, as will be shown in the applications
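
    The hazard-curve statistics mentioned above (mean, percentiles, ensemble) can be sketched with synthetic curves; the model weights and curve shapes below are assumptions for illustration only.

        # Mean, percentile and weighted-ensemble hazard curves across models.
        import numpy as np

        intensity = np.linspace(0.0, 10.0, 11)
        # one exceedance-probability curve per alternative model (synthetic)
        curves = np.array([np.exp(-intensity / s) for s in (1.5, 2.0, 3.0)])
        weights = np.array([0.2, 0.5, 0.3])       # assumed model weights

        mean_curve = curves.mean(axis=0)
        p16, p84 = np.percentile(curves, [16, 84], axis=0)
        ensemble = np.average(curves, axis=0, weights=weights)

        for x, m, e, lo, hi in zip(intensity, mean_curve, ensemble, p16, p84):
            print(f"I={x:4.1f}  mean={m:.3f}  ensemble={e:.3f}  16-84%=[{lo:.3f}, {hi:.3f}]")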

  11. Regulatory inspection: a powerful tool to control industrial radioactive sources

    International Nuclear Information System (INIS)

    Silva, F.C.A. da; Leocadio, J.C.; Ramalho, A.T.

    2008-01-01

    An important contribution to Brazilian development, especially to the quality control of products, is the use of radiation sources by conventional industries. There are in Brazil roughly 3,000 radioactive sources spread out among 950 industries. The main industrial practices involved are: industrial radiography, industrial irradiators, industrial accelerators, petroleum well logging and nuclear gauges. More than 1,800 Radiation Protection Officers (RPOs) have been qualified to work in these practices. This work gives a brief description of the safety control over industrial radioactive installations performed by the Brazilian Regulatory Authority, i.e. the National Commission of Nuclear Energy (CNEN). This paper also describes the national system for radiation safety inspections, the regulation infrastructure and the national inventory of industrial installations. The inspections are based on specific indicators, and their periodicity depends on the risk and type of installation. The present work discusses some relevant aspects that must be considered during the inspections in order to make them more efficient in controlling the sources. One of these aspects is the evaluation of the storage place for the sources, a very important parameter for preventing future risky situations. (author)

  12. [Education and open-source software: more than free tools].

    NARCIS (Netherlands)

    Bakker, de G.M.

    2008-01-01

    More and more open source software is being used in education, as Gijs de Bakker* also notes. As a rule, one should not read too much into this. 'It is usually no more than a smart way to obtain free and generally good-quality software.' As far as De Bakker is concerned, a

  13. Development and Validation of a Standardized Tool for Prioritization of Information Sources.

    Science.gov (United States)

    Akwar, Holy; Kloeze, Harold; Mukhi, Shamir

    2016-01-01

    To validate the utility and effectiveness of a standardized tool for the prioritization of information sources for early detection of diseases. The tool was developed with input from diverse public health experts garnered through a survey. Ten raters used the tool to evaluate ten information sources, and reliability among raters was computed. The Proc Mixed procedure with a random effect statement and SAS Macros were used to compute the multiple raters' Fleiss Kappa agreement and Kendall's Coefficient of Concordance. The ten disparate information sources evaluated obtained the following composite scores: ProMed 91%; WAHID 90%; Eurosurv 87%; MediSys 85%; SciDaily 84%; EurekAl 83%; CSHB 78%; GermTrax 75%; Google 74%; and CBC 70%. A Fleiss Kappa agreement of 50.7% was obtained for the ten information sources and 72.5% for a sub-set of five sources rated, which is substantial agreement, validating the utility and effectiveness of the tool. This study validated the utility and effectiveness of a standardized criteria tool developed to prioritize information sources. The new tool was used to identify five information sources suited for use by the KIWI system in the CEZD-IIR project to improve surveillance of infectious diseases. The tool can be generalized to situations where prioritization of numerous information sources is necessary.
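
    The abstract computes multi-rater agreement with SAS; as a rough illustration of the same statistic, here is a minimal Fleiss' kappa sketch in Python with made-up rating counts. It is not the SAS macro the authors used, and the variable names and data are hypothetical.

```python
import numpy as np

def fleiss_kappa(counts):
    """counts[i, j] = number of raters assigning subject i to category j."""
    counts = np.asarray(counts, dtype=float)
    n_subjects, _ = counts.shape
    n_raters = counts[0].sum()           # assumes the same number of raters per subject
    p_j = counts.sum(axis=0) / (n_subjects * n_raters)      # overall category proportions
    P_i = (np.square(counts).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))
    P_bar, P_e = P_i.mean(), np.square(p_j).sum()
    return (P_bar - P_e) / (1 - P_e)

# Hypothetical example: 5 information sources rated by 10 raters into 3 score bands.
ratings = [[7, 2, 1],
           [6, 3, 1],
           [8, 1, 1],
           [5, 4, 1],
           [9, 1, 0]]
print(f"Fleiss' kappa = {fleiss_kappa(ratings):.3f}")
```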

  14. Vapor Intrusion Estimation Tool for Unsaturated Zone Contaminant Sources. User’s Guide

    Science.gov (United States)

    2016-08-30

    estimation process when applying the tool. The tool described here is focused on vapor-phase diffusion from the current vadose zone source, and is not...from the current defined vadose zone source). The estimated soil gas contaminant concentration obtained from the pre-modeled scenarios for a building...need a full site-specific numerical model to assess the impacts beyond the current vadose zone source.

  15. [The Performance Analysis for Lighting Sources in Highway Tunnel Based on Visual Function].

    Science.gov (United States)

    Yang, Yong; Han, Wen-yuan; Yan, Ming; Jiang, Hai-feng; Zhu, Li-wei

    2015-10-01

    Under the condition of mesopic vision, the spectral luminous efficiency function is represented by a family of curves whose peak wavelength and intensity are affected by the light spectrum, the background brightness and other factors. The effect of a light source on lighting visibility therefore cannot be characterized by a single optical parameter. In this experiment, the reaction time of visual cognition is used as the evaluation index. Visual cognition was tested with the visual function method under different speeds and luminous environments. The light sources included high pressure sodium, an electrodeless fluorescent lamp and white LEDs with three different color temperatures (the color temperatures range from 1 958 to 5 537 K). The background brightness values are those used for the basic section of highway tunnel illumination and for general outdoor illumination, ranging between 1 and 5 cd·m(-2); all values are within the scope of mesopic vision. The test results show that, under the same speed and luminance conditions, the reaction time of visual cognition corresponding to a high color temperature light source is shorter than that corresponding to a low color temperature, and the reaction time corresponding to a visual target at high speed is shorter than that at low speed. At the end moment, however, the visual angle of the target in the observer's visual field corresponding to low speed was larger than that corresponding to high speed. Based on the MOVE model, the equivalent luminance of human mesopic vision was calculated for the different emission spectra and background brightnesses formed by the tested lighting sources. Compared with the photopic vision result, the coefficient of variation (CV) of the reaction-time curve corresponding to the equivalent brightness of mesopic vision is smaller. Under the condition of mesopic vision, the discrepancy between the equivalent brightness of different lighting sources and photopic vision is one of the main reasons for causing the

  16. GOrilla: a tool for discovery and visualization of enriched GO terms in ranked gene lists

    Directory of Open Access Journals (Sweden)

    Steinfeld Israel

    2009-02-01

    Full Text Available Abstract Background Since the inception of the GO annotation project, a variety of tools have been developed that support exploring and searching the GO database. In particular, a variety of tools that perform GO enrichment analysis are currently available. Most of these tools require as input a target set of genes and a background set and seek enrichment in the target set compared to the background set. A few tools also exist that support analyzing ranked lists. The latter typically rely on simulations or on union-bound correction for assigning statistical significance to the results. Results GOrilla is a web-based application that identifies enriched GO terms in ranked lists of genes, without requiring the user to provide explicit target and background sets. This is particularly useful in many typical cases where genomic data may be naturally represented as a ranked list of genes (e.g. by level of expression or of differential expression). GOrilla employs a flexible threshold statistical approach to discover GO terms that are significantly enriched at the top of a ranked gene list. Building on a complete theoretical characterization of the underlying distribution, called mHG, GOrilla computes an exact p-value for the observed enrichment, taking threshold multiple testing into account without the need for simulations. This enables rigorous statistical analysis of thousands of genes and thousands of GO terms in a matter of seconds. The output of the enrichment analysis is visualized as a hierarchical structure, providing a clear view of the relations between enriched GO terms. Conclusion GOrilla is an efficient GO analysis tool with unique features that make it a useful addition to the existing repertoire of GO enrichment tools. GOrilla's unique features and advantages over other threshold-free enrichment tools include rigorous statistics, fast running time and an effective graphical representation. GOrilla is publicly available at: http://cbl-gorilla.cs.technion.ac.il
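
    The flexible-threshold idea behind the mHG statistic can be sketched in a few lines: scan every prefix of the ranked list, compute a hypergeometric tail probability for the enrichment seen so far, and keep the minimum. The sketch below is a simplified illustration only (it omits GOrilla's exact correction for the multiple thresholds tested) and uses a made-up membership vector.

```python
from scipy.stats import hypergeom

def min_hypergeometric(is_member):
    """is_member: ranked list of 0/1 flags marking genes annotated with a GO term."""
    N, B = len(is_member), sum(is_member)    # total genes, total annotated genes
    best_p, b = 1.0, 0
    for n, flag in enumerate(is_member, start=1):
        b += flag                            # annotated genes in the top-n prefix
        # P(X >= b) for X ~ Hypergeometric(population N, B successes, n draws)
        p = hypergeom.sf(b - 1, N, B, n)
        best_p = min(best_p, p)
    return best_p

ranked_flags = [1, 1, 0, 1, 0, 0, 1, 0, 0, 0]   # hypothetical ranked gene list
print(f"mHG score (uncorrected) = {min_hypergeometric(ranked_flags):.4g}")
```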

  17. Improvement of visual debugging tool. Shortening the elapsed time for getting data and adding new functions to compare/combine a set of visualized data

    International Nuclear Information System (INIS)

    Matsuda, Katsuyuki; Takemiya, Hiroshi

    2001-03-01

    The visual debugging tool 'vdebug', which was designed for debugging programs for scientific computing, has been improved in the following two points: (1) shortening the elapsed time required to get appropriate data to visualize; (2) adding new functions that enable comparing and/or combining sets of visualized data originating from two or more different programs. As for shortening the elapsed time for getting data, with the improved version of 'vdebug' the elapsed time was shortened by over a hundred times with dbx and pdbx on the SX-4 and by over ten times with ndb on the SR2201. As for the new functions to compare/combine visualized data, it was confirmed that we could easily check the consistency between the computational results obtained in each calculational step on two different computers: SP and ONYX. In this report, we illustrate how the tool 'vdebug' has been improved with an example. (author)
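
    The kind of cross-platform consistency check described above can be approximated outside 'vdebug' with a few lines of numpy, assuming each program dumps the field for a given step to a plain text file. The file names and tolerance below are hypothetical.

```python
import numpy as np

# Hypothetical dumps of the same field computed at the same step on two machines.
field_sp = np.loadtxt("step042_sp.txt")
field_onyx = np.loadtxt("step042_onyx.txt")

# Element-wise comparison within a relative tolerance, plus the largest deviation.
consistent = np.allclose(field_sp, field_onyx, rtol=1e-6, atol=0.0)
max_diff = np.max(np.abs(field_sp - field_onyx))

print(f"consistent within rtol=1e-6: {consistent}, max |diff| = {max_diff:.3e}")
```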

  18. The Systems Biology Research Tool: evolvable open-source software

    OpenAIRE

    Wright, J; Wagner, A

    2008-01-01

    Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system-level. Results We introduce a free, easy-to-use, open-source, integrated software platform calle...

  19. Data visualization, bar naked: A free tool for creating interactive graphics.

    Science.gov (United States)

    Weissgerber, Tracey L; Savic, Marko; Winham, Stacey J; Stanisavljevic, Dejana; Garovic, Vesna D; Milic, Natasa M

    2017-12-15

    Although bar graphs are designed for categorical data, they are routinely used to present continuous data in studies that have small sample sizes. This presentation is problematic, as many data distributions can lead to the same bar graph, and the actual data may suggest different conclusions from the summary statistics. To address this problem, many journals have implemented new policies that require authors to show the data distribution. This paper introduces a free, web-based tool for creating an interactive alternative to the bar graph (http://statistika.mfub.bg.ac.rs/interactive-dotplot/). This tool allows authors with no programming expertise to create customized interactive graphics, including univariate scatterplots, box plots, and violin plots, for comparing values of a continuous variable across different study groups. Individual data points may be overlaid on the graphs. Additional features facilitate visualization of subgroups or clusters of non-independent data. A second tool enables authors to create interactive graphics from data obtained with repeated independent experiments (http://statistika.mfub.bg.ac.rs/interactive-repeated-experiments-dotplot/). These tools are designed to encourage exploration and critical evaluation of the data behind the summary statistics and may be valuable for promoting transparency, reproducibility, and open science in basic biomedical research. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.
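
    For readers who want a similar graphic offline, a minimal matplotlib sketch of a jittered univariate scatterplot with an overlaid box plot might look like the following. The group data are made up, and this is not the web tool's own code.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
groups = {"control": rng.normal(5.0, 1.0, 12), "treated": rng.normal(6.2, 1.2, 12)}

fig, ax = plt.subplots()
for i, (name, values) in enumerate(groups.items(), start=1):
    jitter = rng.uniform(-0.08, 0.08, size=values.size)   # spread points horizontally
    ax.scatter(np.full(values.size, i) + jitter, values, alpha=0.7, label=name)
ax.boxplot(list(groups.values()), positions=range(1, len(groups) + 1), showfliers=False)
ax.set_xticks(range(1, len(groups) + 1))
ax.set_xticklabels(groups.keys())
ax.set_ylabel("measured value")
plt.show()
```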

  20. [Visual cues as a therapeutic tool in Parkinson's disease. A systematic review].

    Science.gov (United States)

    Muñoz-Hellín, Elena; Cano-de-la-Cuerda, Roberto; Miangolarra-Page, Juan Carlos

    2013-01-01

    Sensory stimuli or sensory cues are being used as a therapeutic tool for improving gait disorders in Parkinson's disease patients, but most studies seem to focus on auditory stimuli. The aim of this study was to conduct a systematic review regarding the use of visual cues for gait disorders, dual tasks during gait, freezing and the incidence of falls in patients with Parkinson's disease, in order to obtain therapeutic implications. We conducted a systematic review in the main databases, such as the Cochrane Database of Systematic Reviews, TripDataBase, PubMed, Ovid MEDLINE, Ovid EMBASE and the Physiotherapy Evidence Database, covering 2005 to 2012, according to the recommendations of the Consolidated Standards of Reporting Trials, evaluating the quality of the papers included with the Downs & Black Quality Index. 21 articles were finally included in this systematic review (with a total of 892 participants), with variable methodological quality, achieving an average of 17.27 points on the Downs and Black Quality Index (range: 11-21). Visual cues produce improvements in temporal-spatial gait parameters and turning execution, reducing the appearance of freezing and falls in Parkinson's disease patients. Visual cues also appear to benefit dual tasks during gait, reducing the interference of the second task. Further studies are needed to determine the preferred type of stimuli for each stage of the disease. Copyright © 2012 SEGG. Published by Elsevier Espana. All rights reserved.

  1. LabKey Server NAb: A tool for analyzing, visualizing and sharing results from neutralizing antibody assays

    Directory of Open Access Journals (Sweden)

    Gao Hongmei

    2011-05-01

    Full Text Available Abstract Background Multiple types of assays allow sensitive detection of virus-specific neutralizing antibodies. For example, the extent of antibody neutralization of HIV-1, SIV and SHIV can be measured in the TZM-bl cell line through the degree of luciferase reporter gene expression after infection. In the past, neutralization curves and titers for this standard assay have been calculated using an Excel macro. Updating all instances of such a macro with new techniques can be unwieldy and introduce non-uniformity across multi-lab teams. Using Excel also poses challenges in centrally storing, sharing and associating raw data files and results. Results We present LabKey Server's NAb tool for organizing, analyzing and securely sharing data, files and results for neutralizing antibody (NAb assays, including the luciferase-based TZM-bl NAb assay. The customizable tool supports high-throughput experiments and includes a graphical plate template designer, allowing researchers to quickly adapt calculations to new plate layouts. The tool calculates the percent neutralization for each serum dilution based on luminescence measurements, fits a range of neutralization curves to titration results and uses these curves to estimate the neutralizing antibody titers for benchmark dilutions. Results, curve visualizations and raw data files are stored in a database and shared through a secure, web-based interface. NAb results can be integrated with other data sources based on sample identifiers. It is simple to make results public after publication by updating folder security settings. Conclusions Standardized tools for analyzing, archiving and sharing assay results can improve the reproducibility, comparability and reliability of results obtained across many labs. LabKey Server and its NAb tool are freely available as open source software at http://www.labkey.com under the Apache 2.0 license. Many members of the HIV research community can also access the Lab
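
    As an illustration of the arithmetic the NAb tool automates, the sketch below computes percent neutralization from luminescence readings and fits a four-parameter logistic curve to estimate the dilution giving 50% neutralization. The readings, control values and fitting choices are hypothetical and this is not LabKey code.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical plate readings (relative luminescence units).
virus_only = 95000.0          # virus control: no serum, full infection
cell_only = 1200.0            # cell control: no virus, background signal
dilutions = np.array([20, 60, 180, 540, 1620, 4860], dtype=float)
sample_rlu = np.array([4000, 9000, 26000, 55000, 80000, 92000], dtype=float)

# Percent neutralization relative to the controls.
neut = 100.0 * (virus_only - sample_rlu) / (virus_only - cell_only)

def four_pl(x, bottom, top, ic50, slope):
    """Four-parameter logistic (Hill-type) neutralization curve."""
    return bottom + (top - bottom) / (1.0 + (x / ic50) ** slope)

params, _ = curve_fit(four_pl, dilutions, neut, p0=[0.0, 100.0, 500.0, 1.0], maxfev=10000)
print(f"estimated 50% neutralization titer (dilution): {params[2]:.0f}")
```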

  2. Web tools for effective retrieval, visualization, and evaluation of cardiology medical images and records

    Science.gov (United States)

    Masseroli, Marco; Pinciroli, Francesco

    2000-12-01

    To provide easy retrieval, integration and evaluation of multimodal cardiology images and data in a web browser environment, distributed application technologies and Java programming were used to implement a client-server architecture based on software agents. The server side manages secure connections and queries to heterogeneous remote databases and file systems containing patient personal and clinical data. The client side is a Java applet running in a web browser that provides a friendly medical user interface to perform queries on patient and medical test data and to integrate and properly visualize the various query results. A set of tools based on the Java Advanced Imaging API enables processing and analysis of the retrieved cardiology images and quantification of their features in different regions of interest. The platform independence of Java technology makes the developed prototype easy to manage in a centralized form and to provide at each site where an intranet or internet connection is available. By giving healthcare providers effective tools for querying, visualizing and evaluating comprehensively cardiology medical images and records in all locations where they may need them (i.e. emergency, operating theaters, wards, or even outpatient clinics), the developed prototype represents an important aid in providing more efficient diagnoses and medical treatments.

  3. VisualUrText: A Text Analytics Tool for Unstructured Textual Data

    Science.gov (United States)

    Zainol, Zuraini; Jaymes, Mohd T. H.; Nohuddin, Puteri N. E.

    2018-05-01

    The growing amount of unstructured text on the Internet is tremendous. Text repositories come from Web 2.0, business intelligence and social networking applications. It is also believed that 80-90% of future data growth will be in the form of unstructured text databases that may potentially contain interesting patterns and trends. Text Mining is a well known technique for discovering interesting patterns and trends, which are non-trivial knowledge, from massive unstructured text data. Text Mining covers multidisciplinary fields involving information retrieval (IR), text analysis, natural language processing (NLP), data mining, machine learning, statistics and computational linguistics. This paper discusses the development of a text analytics tool that is proficient in extracting, processing and analyzing unstructured text data and visualizing the cleaned text data in multiple forms such as a Document Term Matrix (DTM), Frequency Graph, Network Analysis Graph, Word Cloud and Dendrogram. This tool, VisualUrText, is developed to assist students and researchers in extracting interesting patterns and trends in document analyses.
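
    The Document Term Matrix mentioned above is easy to illustrate with scikit-learn; the tiny corpus below is made up and the snippet is not part of VisualUrText.

```python
from sklearn.feature_extraction.text import CountVectorizer

# Hypothetical mini-corpus of three "documents".
corpus = [
    "open source tools help visualize data",
    "text mining finds patterns in text data",
    "visualize patterns with open tools",
]

vectorizer = CountVectorizer()
dtm = vectorizer.fit_transform(corpus)          # sparse document-term matrix

print(vectorizer.get_feature_names_out())       # the term vocabulary
print(dtm.toarray())                            # rows = documents, columns = term counts
```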

  4. KENO3D visualization tool for KENO V.a geometry models

    International Nuclear Information System (INIS)

    Bowman, S.M.; Horwedel, J.E.

    1999-01-01

    The standardized computer analyses for licensing evaluations (SCALE) computer software system developed at Oak Ridge National Laboratory (ORNL) is widely used and accepted around the world for criticality safety analyses. SCALE includes the well-known KENO V.a three-dimensional Monte Carlo criticality computer code. Criticality safety analyses often require detailed modeling of complex geometries. Checking the accuracy of these models can be enhanced by effective visualization tools. To address this need, ORNL has recently developed a powerful state-of-the-art visualization tool called KENO3D that enables KENO V.a users to interactively display their three-dimensional geometry models. The interactive options include the following: (1) displaying shaded or wireframe images; (2) showing standard views, such as top view, side view, front view, and isometric three-dimensional view; (3) rotating the model; (4) zooming in on selected locations; (5) selecting parts of the model to display; (6) editing colors and displaying legends; (7) displaying properties of any unit in the model; (8) creating cutaway views; (9) removing units from the model; and (10) printing the image or saving it to common graphics formats

  5. The effectiveness of dental health education tools for visually impaired students in Bukit Mertajam

    Science.gov (United States)

    Shahabudin, Saadiah; Hashim, Hasnah; Omar, Maizurah

    2016-12-01

    Oral health is a vital component of overall health. It is important for adults and children alike; however, it is even more crucial for children with special needs, as they have limited ability to perform oral health practices. Disabled children deserve the same opportunity for oral health as normal children. Unfortunately, oral health care is among the most unattended health needs of disabled children. This study aimed to assess the effectiveness of dental health education tools for visually impaired students in two schools in Bukit Mertajam, Penang. The project utilized dental health education tools consisting of an oral health module (printed in braille for the blind and in an 18 px font for the partially blind) and an audio narration of the module, which were prepared and content-validated by an expert panel. Baseline plaque scores of 38 subjects aged 6-17 years were determined by a trained dental staff nurse. The module was then administered to the subjects, facilitated by the teachers. Post-intervention plaque scores were recorded again after one month. The pre- and post-intervention data of the students with visual impairment were analyzed using the Wilcoxon Signed Ranks Test with a pre-specified significance level. We recommend that further studies be conducted on a bigger sample.

  6. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data.

    Science.gov (United States)

    Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D

    2009-03-17

    Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the

  7. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data

    Directory of Open Access Journals (Sweden)

    Anderson Gordon A

    2009-03-01

    Full Text Available Abstract Background Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. Results With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to
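
    One core idea behind the deisotoping step described above, determining an ion's charge state from the spacing of its isotopic peaks (adjacent isotopes are roughly 1.003/z apart in m/z), can be sketched in a few lines. The peak list below is made up and this is not Decon2LS code.

```python
# Adjacent isotopic peaks of the same species are separated by roughly
# 1.003 / z in m/z, so the observed spacing reveals the charge state z.
ISOTOPE_SPACING = 1.00235   # average mass difference between peptide isotopes (Da)
PROTON_MASS = 1.00728       # Da

def charge_from_spacing(mz_peaks):
    """Estimate charge state from consecutive isotopic peak positions (m/z)."""
    spacings = [b - a for a, b in zip(mz_peaks, mz_peaks[1:])]
    mean_spacing = sum(spacings) / len(spacings)
    return round(ISOTOPE_SPACING / mean_spacing)

# Hypothetical isotopic envelope of a doubly charged peptide ion.
peaks = [650.300, 650.801, 651.302, 651.803]
z = charge_from_spacing(peaks)
monoisotopic_mass = (peaks[0] - PROTON_MASS) * z    # remove proton mass, uncharge
print(f"charge = {z}+, monoisotopic mass ≈ {monoisotopic_mass:.3f} Da")
```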

  8. An Ecological Visual Exploration Tool to Support the Analysis of Visual Processing Pathways in Children with Autism Spectrum Disorders

    Directory of Open Access Journals (Sweden)

    Dario Cazzato

    2017-12-01

    Full Text Available Recent improvements in the field of assistive technologies have led to innovative solutions aiming at increasing the capabilities of people with disability, helping them in daily activities with applications that span from cognitive impairments to developmental disabilities. In particular, in the case of Autism Spectrum Disorder (ASD), the need to obtain active feedback in order to subsequently extract meaningful data becomes of fundamental importance. In this work, a study about the possibility of understanding the visual exploration in children with ASD is presented. In order to obtain an automatic evaluation, an algorithm for free gaze estimation (i.e., without constraints, and without using additional hardware, infrared (IR) light sources or other intrusive methods) is employed. Furthermore, no initial calibration is required. It allows the user to freely rotate the head in the field of view of the sensor, and it is insensitive to the presence of eyeglasses, hats or particular hairstyles. These relaxations of the constraints make this technique particularly suitable for use in the critical context of autism, where the child is certainly not inclined to employ invasive devices, nor to collaborate during calibration procedures. The evaluation of children's gaze trajectories through the proposed solution is presented for the purpose of an Early Start Denver Model (ESDM) program built on the child's spontaneous interests and game choice, delivered in a natural setting.

  9. Validity of the growth model of the 'computerized visual perception assessment tool for Chinese characters structures'.

    Science.gov (United States)

    Wu, Huey-Min; Li, Cheng-Hsaun; Kuo, Bor-Chen; Yang, Yu-Mao; Lin, Chin-Kai; Wan, Wei-Hsiang

    2017-08-01

    Morphological awareness is the foundation for the important developmental skills involved with vocabulary, as well as understanding the meaning of words, orthographic knowledge, reading, and writing. Visual perception of the spatial, two-dimensional positions of radicals in Chinese characters' morphology is very important in identifying Chinese characters. The important predictive variables of spatial and visual perception in Chinese character identification were investigated with a growth model in this research. The assessment tool is the "Computerized Visual Perception Assessment Tool for Chinese Characters Structures" developed by this study. There are two constructs, basic stroke and character structure. In the basic stroke construct, there are three subtests of one, two, and more than three strokes. In the character structure construct, there are three subtests of single-component characters, horizontal-compound characters, and vertical-compound characters. This study used purposive sampling. In the first year, 551 children 4-6 years old participated in the study and were monitored for one year. In the second year, 388 children remained in the study, a successful follow-up rate of 70.4%. This study used a two-wave cross-lagged panel design to validate the growth model of the basic stroke and the character structure. There was significant correlation between the basic stroke and the character structure at different time points. The abilities in the basic stroke and in the character structure developed steadily over time for preschool children. Children's knowledge of the basic stroke effectively predicted their knowledge of the basic stroke and the character structure. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Visual operations management tools applied to the oil pipelines and terminals standardization process: the experience of TRANSPETRO

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Maria Fatima Ludovico de [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio/ITUC), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico; Santiago, Adilson; Ribeiro, Kassandra Senra; Arruda, Daniela Mendonca [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper describes the process by which visual operations management (VOM) tools were implemented, concerning standards and operational procedures, in TRANSPETRO's Oil Pipelines and Terminals Unit. It provides: a brief literature review of visual operations management tools applied to total quality management and standardization processes; a discussion of the assumptions from the second level of VOM (visual standards) upon which TRANSPETRO's oil pipelines and terminals business processes and operational procedures are based; and a description of the VOM implementation process, involving more than 100 employees, with one illustrative example of 'Quick Guides' for right-of-way management activities. Finally, it discusses the potential impacts and benefits of using VOM tools in the current practices of TRANSPETRO's Oil Pipelines and Terminals Unit, reinforcing the importance of such visual guides as vital to implementing regional and corporate procedures, focusing on the main operational processes. (author)

  12. Neural Monkey: An Open-source Tool for Sequence Learning

    Directory of Open Access Journals (Sweden)

    Helcl Jindřich

    2017-04-01

    Full Text Available In this paper, we announce the development of Neural Monkey – an open-source neural machine translation (NMT and general sequence-to-sequence learning system built over the TensorFlow machine learning library. The system provides a high-level API tailored for fast prototyping of complex architectures with multiple sequence encoders and decoders. Models’ overall architecture is specified in easy-to-read configuration files. The long-term goal of the Neural Monkey project is to create and maintain a growing collection of implementations of recently proposed components or methods, and therefore it is designed to be easily extensible. Trained models can be deployed either for batch data processing or as a web service. In the presented paper, we describe the design of the system and introduce the reader to running experiments using Neural Monkey.

  13. Specvis: Free and open-source software for visual field examination.

    Science.gov (United States)

    Dzwiniel, Piotr; Gola, Mateusz; Wójcik-Gryciuk, Anna; Waleszczyk, Wioletta J

    2017-01-01

    Visual field impairment affects more than 100 million people globally. However, due to the lack of access to appropriate ophthalmic healthcare in undeveloped regions, as a result of the associated costs and expertise required, this number may be an underestimate. Improved access to affordable diagnostic software designed for visual field examination could slow the progression of diseases such as glaucoma by allowing for early diagnosis and intervention. We have developed Specvis, a free and open-source application written in the Java programming language that can run on any personal computer, to meet this requirement (http://www.specvis.pl/). Specvis was tested on glaucomatous, retinitis pigmentosa and stroke patients, and the results were compared to results obtained using the Medmont M700 Automated Static Perimeter. The application was also tested for inter-test intrapersonal variability. The results from both validation studies indicated low inter-test intrapersonal variability and suitable reliability for a fast and simple assessment of visual field impairment. Specvis easily identifies visual field areas of zero sensitivity and allows for evaluation of sensitivity levels throughout the visual field. Thus, Specvis is a new, reliable application that can be successfully used for visual field examination and can fill the gap between confrontation and perimetry tests. The main advantages of Specvis over existing methods are its availability (free), affordability (runs on any personal computer), and reliability (comparable to high-cost solutions).

  14. FREEWAT: an HORIZON 2020 project to build open source tools for water management.

    Science.gov (United States)

    Rossetto, Rudy; Borsi, Iacopo; Foglia, Laura

    2015-04-01

    tools for better producing feasibility and management plans; (ii) a set of activities devoted to fixing bugs and to providing a well-integrated interface for the different tools implemented. Further capabilities to be integrated are: a dedicated module for water management and planning that will help to manage and aggregate all the distributed data coming from the simulation scenarios; a module for calibration, uncertainty and sensitivity analysis; a module for solute transport in the unsaturated zone; a module for crop growth and water requirements in agriculture; tools for dealing with groundwater quality issues; and tools for the analysis, interpretation and visualization of hydrogeological data. Through creating a common environment among water researchers/professionals, policy makers and implementers, FREEWAT's main impact will be on enhancing science-based and participatory approaches and evidence-based decision making in water resource management, hence producing relevant and appropriate outcomes for policy implementation. The Consortium is constituted by partners from various water sectors in 10 EU countries, plus Turkey and Ukraine. Synergies with the UNESCO HOPE initiative on free and open source software in water management greatly boost the value of the project. Broad stakeholder involvement is expected to guarantee results dissemination and exploitation. Acknowledgements This paper is presented within the framework of the project FREEWAT, which has received funding from the European Union's Horizon 2020 research and innovation programme under Grant Agreement n. 642224. References MARSOL (2014). Demonstrating Managed Aquifer Recharge as a Solution to Water Scarcity and Drought, www.marsol.eu [accessed 4 January 2015]. Rossetto, R., Borsi, I., Schifani, C., Bonari, E., Mogorovich, P. & Primicerio, M. (2013) - SID&GRID: integrating hydrological modeling in a GIS environment hydroinformatics system for the management of the water resource. Rendiconti Online Societa

  15. PRIDE Inspector Toolsuite: Moving Toward a Universal Visualization Tool for Proteomics Data Standard Formats and Quality Assessment of ProteomeXchange Datasets.

    Science.gov (United States)

    Perez-Riverol, Yasset; Xu, Qing-Wei; Wang, Rui; Uszkoreit, Julian; Griss, Johannes; Sanchez, Aniel; Reisinger, Florian; Csordas, Attila; Ternent, Tobias; Del-Toro, Noemi; Dianes, Jose A; Eisenacher, Martin; Hermjakob, Henning; Vizcaíno, Juan Antonio

    2016-01-01

    The original PRIDE Inspector tool was developed as an open source standalone tool to enable the visualization and validation of mass-spectrometry (MS)-based proteomics data before data submission or already publicly available in the Proteomics Identifications (PRIDE) database. The initial implementation of the tool focused on visualizing PRIDE data by supporting the PRIDE XML format and direct access to private (password-protected) and public experiments in PRIDE. The ProteomeXchange (PX) Consortium has been set up to enable a better integration of existing public proteomics repositories, maximizing its benefit to the scientific community through the implementation of standard submission and dissemination pipelines. Within the Consortium, PRIDE is focused on supporting submissions of tandem MS data. The increasing use and popularity of the new Proteomics Standards Initiative (PSI) data standards such as mzIdentML and mzTab, and the diversity of workflows supported by the PX resources, prompted us to design and implement a new suite of algorithms and libraries that would build upon the success of the original PRIDE Inspector and would enable users to visualize and validate PX "complete" submissions. The PRIDE Inspector Toolsuite supports the handling and visualization of different experimental output files, ranging from spectra (mzML, mzXML, and the most popular peak list formats) and peptide and protein identification results (mzIdentML, PRIDE XML, mzTab) to quantification data (mzTab, PRIDE XML), using a modular and extensible set of open-source, cross-platform libraries. We believe that the PRIDE Inspector Toolsuite represents a milestone in the visualization and quality assessment of proteomics data. It is freely available at http://github.com/PRIDE-Toolsuite/. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  16. Open source tools and toolkits for bioinformatics: significance, and where are we?

    Science.gov (United States)

    Stajich, Jason E; Lapp, Hilmar

    2006-09-01

    This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.

  17. Tools for Teaching Mathematical Functions and Geometric Figures to Tactile Visualization through a Braille Printer for Visual Impairment People

    Directory of Open Access Journals (Sweden)

    Lorena León

    2016-04-01

    Full Text Available In this article, we show the features and facilities offered by two new computer programs developed for the treatment and generation of geometric figures and mathematical functions through a Braille printer, designed for visually impaired people. The programs are fully accessible: users with total visual impairment can communicate with the systems via shortcut keys and the speech synthesizer. The system sends sound messages that accompany the user throughout the process of generating geometric figures or performing mathematical treatment. Finally, a tactile visualization is displayed as the result to the person with visual impairment, so that they can complete their geometry and mathematics studies.

  18. Acceptance and practicability of a visual communication tool in smoking cessation counselling: a randomised controlled trial.

    Science.gov (United States)

    Neuner-Jehle, Stefan; Knecht, Marianne I; Stey-Steurer, Claudia; Senn, Oliver

    2013-12-01

    Smoking cessation advice is important for reducing the worldwide burden of disease resulting from tobacco smoking. Appropriate risk communication formats improve the success of counselling interventions in primary care. To test the feasibility and acceptance of a smoking cessation counselling tool with different cardiovascular risk communication formats, including graphs, in comparison with the International Primary Care Respiratory Group (IPCRG) 'quit smoking assistance' tool. GPs were randomised into an intervention group (using our communication tool in addition to the IPCRG sheet) and a control group (using the IPCRG sheet only). We asked participants for socioeconomic data, smoking patterns, understanding of information, motivation, acceptance and feasibility, and measured the duration and frequency of counselling sessions. Twenty-five GPs performed 2.8 counselling sessions per month in the intervention group and 1.7 in the control group (p=0.3) with 114 patients. The median duration of a session was 10 mins (control group 11 mins, p=0.09 for the difference). Median patients' motivation for smoking cessation was 7 on a 10-point visual analogue scale, with no significant difference before and after the intervention (p=0.2) or between groups (p=0.73 before and p=0.15 after the intervention). Median patients' ratings of motivation, self-confidence, understanding of information, and satisfaction with the counselling were 3-5 on a 5-point Likert scale, similar to GPs' ratings of acceptance and feasibility, with no significant difference between groups. Among Swiss GPs and patients, both our innovative communication tool and the IPCRG tool were well accepted, and both merit further dissemination and application in research.

  19. Evaluating role of interactive visualization tool in improving students' conceptual understanding of chemical equilibrium

    Science.gov (United States)

    Sampath Kumar, Bharath

    The purpose of this study is to examine the role of partnering a visualization tool, such as simulation, with the development of students' concrete conceptual understanding of chemical equilibrium. Students find chemistry concepts abstract, especially at the microscopic level. Chemical equilibrium is one such topic. While research studies have explored the effectiveness of low-tech instructional strategies such as analogies, jigsaw, cooperative learning, and using modeling blocks, fewer studies have explored the use of visualization tools such as simulations in the context of dynamic chemical equilibrium. Research studies have identified key reasons behind misconceptions, such as the lack of a systematic understanding of foundational chemistry concepts, failure to recognize that the system is dynamic, solving numerical problems on chemical equilibrium in an algorithmic fashion, erroneous application of Le Chatelier's principle (LCP), etc. Kress et al. (2001) suggested that external representation in the form of visualization is more than a tool for learning, because it enables learners to make meanings or express their ideas which cannot be readily done through a verbal representation alone. A mixed method study design was used for data collection. The qualitative portion of the study is aimed at understanding the change in students' mental models before and after the intervention. A quantitative instrument was developed based on common areas of misconception identified by research studies. A pilot study was conducted prior to the actual study to obtain feedback from students on the quantitative instrument and the simulation. Participants for the pilot study were sampled from a single general chemistry class. Following the pilot study, the research study was conducted with a total of 27 students (N=15 in the experimental group and N=12 in the control group). Prior to participating in the study, students had completed their midterm test on the topic of chemical equilibrium. Qualitative

  20. Three-Dimensional Online Visualization and Engagement Tools for the Geosciences

    Science.gov (United States)

    Cockett, R.; Moran, T.; Pidlisecky, A.

    2013-12-01

    Educational tools often sacrifice interactivity in favour of scalability so they can reach more users. This compromise leads to tools that may be viewed as second tier when compared to more engaging activities performed in a laboratory; however, the resources required to deliver laboratory exercises in a scalable way are often impractical. Geoscience education is well situated to benefit from interactive online learning tools that allow users to work in a 3D environment. Visible Geology (http://3ptscience.com/visiblegeology) is an innovative web-based application designed to enable visualization of geologic structures and processes through the use of interactive 3D models. The platform allows users to conceptualize difficult, yet important geologic principles in a scientifically accurate manner by developing unique geologic models. The environment allows students to interactively practice their visualization and interpretation skills by creating and interacting with their own models and terrains. Visible Geology has been designed from a user-centric perspective, resulting in a simple and intuitive interface. The platform directs students to build their own geologic models by adding beds and creating geologic events such as tilting, folding, or faulting. The level of ownership and interactivity encourages engagement, leading learners to discover geologic relationships on their own, in the context of guided assignments. In January 2013, an interactive geologic history assignment was developed for a 700-student introductory geology class at The University of British Columbia. The assignment required students to distinguish the relative age of geologic events to construct a geologic history. Traditionally this type of exercise has been taught through the use of simple geologic cross-sections showing crosscutting relationships; from these cross-sections students infer the relative age of geologic events. In contrast, the Visible Geology assignment offers students a unique

  1. Anaerobes as Sources of Bioactive Compounds and Health Promoting Tools.

    Science.gov (United States)

    Mamo, Gashaw

    Aerobic microorganisms have been sources of medicinal agents for several decades and an impressive variety of drugs have been isolated from their cultures, studied and formulated to treat or prevent diseases. On the other hand, anaerobes, which are believed to be the oldest life forms on earth and evolved remarkably diverse physiological functions, have largely been neglected as sources of bioactive compounds. However, results obtained from the limited research done so far show that anaerobes are capable of producing a range of interesting bioactive compounds that can promote human health. In fact, some of these bioactive compounds are found to be novel in their structure and/or mode of action. Anaerobes play health-promoting roles through their bioactive products as well as application of whole cells. The bioactive compounds produced by these microorganisms include antimicrobial agents and substances such as immunomodulators and vitamins. Bacteriocins produced by anaerobes have been in use as preservatives for about 40 years. Because these substances are effective at low concentrations, encounter relatively less resistance from bacteria and are safe to use, there is a growing interest in these antimicrobial agents. Moreover, several antibiotics have been reported from the cultures of anaerobes. Closthioamide and andrimid produced by Clostridium cellulolyticum and Pantoea agglomerans, respectively, are examples of novel antibiotics of anaerobe origin. The discovery of such novel bioactive compounds is expected to encourage further studies which can potentially lead to tapping of the antibiotic production potential of this fascinating group of microorganisms. Anaerobes are widely used in preparation of fermented foods and beverages. During the fermentation processes, these organisms produce a number of bioactive compounds including anticancer, antihypertensive and antioxidant substances. The well-known health promoting effect of fermented food is mostly due to these

  2. Development and validation of an open source quantification tool for DSC-MRI studies.

    Science.gov (United States)

    Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J

    2015-03-01

    This work presents the development of an open source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open source tools implemented on open platforms that allow external developers to implement their own quantification methods easily and without the need to pay for a development license. This quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, in such a way that the addition of new methods can be done without breaking any of the existing functionalities. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package. The resulting perfusion parameters were then compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma-fitting is used, an excellent agreement with the tool used as a gold standard was obtained (R(2)>0.8 and values are within the 95% CI limits in Bland-Altman plots). An open source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated against a clinical software package. It works as an ImageJ plugin and the source code has been published with an open source license. Copyright © 2015 Elsevier Ltd. All rights reserved.
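
    A core step in DSC quantification, converting the dynamic signal into a contrast-agent concentration curve and integrating it for a relative cerebral blood volume estimate, can be sketched as follows. The signal is synthetic, the echo time is hypothetical, no gamma-variate fitting is applied, and this is not the plugin's code.

```python
import numpy as np

TE = 0.030                        # echo time in seconds (hypothetical)
t = np.linspace(0, 60, 121)       # acquisition times in seconds
s0 = 100.0                        # pre-bolus baseline signal

# Synthetic "true" relaxation-rate change: a Gaussian-shaped bolus passage.
true_delta_r2s = 25.0 * np.exp(-((t - 25.0) / 6.0) ** 2)
signal = s0 * np.exp(-TE * true_delta_r2s)          # simulated DSC signal drop

# Quantification step: recover delta_R2*(t) = -ln(S(t)/S0) / TE from the signal,
# then integrate it; relative CBV is proportional to the area under this curve.
delta_r2s = -np.log(signal / s0) / TE
rel_cbv = float(np.sum(0.5 * (delta_r2s[1:] + delta_r2s[:-1]) * np.diff(t)))  # trapezoidal rule
print(f"relative CBV (arbitrary units): {rel_cbv:.1f}")
```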

  3. Different visual exploration of tool-related gestures in left hemisphere brain damaged patients is associated with poor gestural imitation.

    Science.gov (United States)

    Vanbellingen, Tim; Schumacher, Rahel; Eggenberger, Noëmi; Hopfner, Simone; Cazzoli, Dario; Preisig, Basil C; Bertschi, Manuel; Nyffeler, Thomas; Gutbrod, Klemens; Bassetti, Claudio L; Bohlhalter, Stephan; Müri, René M

    2015-05-01

    According to the direct matching hypothesis, perceived movements automatically activate existing motor components through matching of the perceived gesture and its execution. The aim of the present study was to test the direct matching hypothesis by assessing whether visual exploration behavior correlates with deficits in gestural imitation in left hemisphere damaged (LHD) patients. Eighteen LHD patients and twenty healthy control subjects took part in the study. Gesture imitation performance was measured by the test for upper limb apraxia (TULIA). Visual exploration behavior was measured by an infrared eye-tracking system. Short videos including forty gestures (20 meaningless and 20 communicative gestures) were presented. Cumulative fixation duration was measured in different regions of interest (ROIs), namely the face, the gesturing hand, the body, and the surrounding environment. Compared to healthy subjects, patients fixated significantly less on the ROIs comprising the face and the gesturing hand during the exploration of emblematic and tool-related gestures. Moreover, visual exploration of tool-related gestures significantly correlated with tool-related imitation as measured by TULIA in LHD patients. Patients and controls did not differ in the visual exploration of meaningless gestures, and no significant relationships were found between visual exploration behavior and the imitation of emblematic and meaningless gestures in TULIA. The present study thus suggests that altered visual exploration may lead to disturbed imitation of tool-related gestures, however not of emblematic and meaningless gestures. Consequently, our findings partially support the direct matching hypothesis. Copyright © 2015 Elsevier Ltd. All rights reserved.
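
    Cumulative fixation duration per region of interest, the eye-tracking measure used above, reduces to a simple accumulation over fixations. The rectangles, fixation list and field layout below are made up for illustration.

```python
# Hypothetical ROIs as axis-aligned rectangles: (x_min, y_min, x_max, y_max) in pixels.
rois = {
    "face": (300, 50, 500, 250),
    "gesturing_hand": (550, 300, 750, 500),
}

# Hypothetical fixations: (x, y, duration_ms) from an eye tracker.
fixations = [(320, 120, 340), (600, 420, 510), (100, 600, 220), (480, 200, 150)]

def cumulative_fixation_durations(fixations, rois):
    totals = {name: 0 for name in rois}
    totals["environment"] = 0                      # anything outside the defined ROIs
    for x, y, dur in fixations:
        for name, (x0, y0, x1, y1) in rois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += dur
                break
        else:
            totals["environment"] += dur
    return totals

print(cumulative_fixation_durations(fixations, rois))
```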

  4. ProteoLens: a visual analytic tool for multi-scale database-driven biological network data mining.

    Science.gov (United States)

    Huan, Tianxiao; Sivachenko, Andrey Y; Harrison, Scott H; Chen, Jake Y

    2008-08-12

    New systems biology studies require researchers to understand how interplay among myriads of biomolecular entities is orchestrated in order to achieve high-level cellular and physiological functions. Many software tools have been developed in the past decade to help researchers visually navigate large networks of biomolecular interactions with built-in template-based query capabilities. To further advance researchers' ability to interrogate global physiological states of cells through multi-scale visual network explorations, new visualization software tools still need to be developed to empower the analysis. A robust visual data analysis platform driven by database management systems to perform bi-directional data processing-to-visualizations with declarative querying capabilities is needed. We developed ProteoLens as a Java-based visual analytic software tool for creating, annotating and exploring multi-scale biological networks. It supports direct database connectivity to either Oracle or PostgreSQL database tables/views, on which SQL statements using both Data Definition Language (DDL) and Data Manipulation Language (DML) may be specified. The robust query languages embedded directly within the visualization software help users to bring their network data into a visualization context for annotation and exploration. ProteoLens supports graph/network represented data in standard Graph Modeling Language (GML) formats, and this enables interoperation with a wide range of other visual layout tools. The architectural design of ProteoLens enables the de-coupling of complex network data visualization tasks into two distinct phases: 1) creating network data association rules, which are mapping rules between network node IDs or edge IDs and data attributes such as functional annotations, expression levels, scores, synonyms, descriptions, etc.; 2) applying network data association rules to build the network and perform the visual annotation of graph nodes and edges
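
    The "data association rule" idea, mapping node identifiers to data attributes before rendering, can be mimicked outside ProteoLens with networkx. The toy graph and expression values below are hypothetical, and ProteoLens itself works against Oracle/PostgreSQL tables rather than Python dictionaries.

```python
import networkx as nx

# Toy protein-interaction network.
g = nx.Graph()
g.add_edges_from([("TP53", "MDM2"), ("TP53", "BRCA1"), ("BRCA1", "RAD51")])

# "Association rule": map node IDs to data attributes (here, made-up expression levels).
expression = {"TP53": 2.4, "MDM2": 0.7, "BRCA1": 1.9, "RAD51": 1.1}
nx.set_node_attributes(g, expression, name="expression")

# The annotated attributes can then drive visual properties such as node size or color.
for node, data in g.nodes(data=True):
    print(f"{node}: expression={data['expression']}")
```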

  5. Visi—A VTK- and QT-Based Open-Source Project for Scientific Data Visualization

    Science.gov (United States)

    Li, Yiming; Chen, Cheng-Kai

    2009-03-01

    In this paper, we present an open-source project, Visi, for high-dimensional engineering and scientific data visualization. Visi provides a state-of-the-art interactive user interface and graphics kernels based upon Qt (a cross-platform GUI toolkit) and VTK (an object-oriented visualization library). When Visi is initialized, a preliminary window is activated by Qt, and the VTK kernel is simultaneously embedded into the window, where the graphics resources are allocated. Visualization is driven through an interactive interface, so that the data are rendered according to the user's preferences. The developed framework possesses high flexibility and extensibility for advanced functions (e.g., object combination) and further applications. Application of Visi to data visualization in various fields, such as protein structures in bioinformatics, 3D semiconductor transistors, and interconnects of very-large-scale integration (VLSI) layouts, is also illustrated to show the performance of Visi. The developed open-source project is available on our project website [1].
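
    For readers unfamiliar with the VTK rendering kernel that such a tool embeds behind its Qt front end, the following minimal sketch (Python bindings, not Visi's own code) shows the standard source-mapper-actor-renderer chain that allocates the graphics resources mentioned above; the sphere source is just a placeholder dataset.

```python
# Minimal VTK pipeline sketch illustrating the kind of rendering kernel a
# Qt-based front end such as Visi embeds in its main window.
import vtk

source = vtk.vtkSphereSource()          # example geometry (placeholder data)
source.SetThetaResolution(32)
source.SetPhiResolution(32)

mapper = vtk.vtkPolyDataMapper()        # maps polygonal data to graphics primitives
mapper.SetInputConnection(source.GetOutputPort())

actor = vtk.vtkActor()                  # represents the object in the scene
actor.SetMapper(mapper)

renderer = vtk.vtkRenderer()
renderer.AddActor(actor)

window = vtk.vtkRenderWindow()          # in Visi this window would be embedded in Qt
window.AddRenderer(renderer)

interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)

window.Render()
interactor.Start()
```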

  6. Applying Open Source Game Engine for Building Visual Simulation Training System of Fire Fighting

    Science.gov (United States)

    Yuan, Diping; Jin, Xuesheng; Zhang, Jin; Han, Dong

    There's a growing need for fire departments to adopt a safe and fair method of training to ensure that the firefighting commander is in a position to manage a fire incident. Visual simulation training systems, with their ability to replicate and interact with virtual fire scenarios through the use of computer graphics or VR, have become an effective and efficient method for fire ground education. This paper describes the system architecture and functions of a visual simulation training system for fighting fires at oil storage facilities, which adopts Delta3D, an open-source game and simulation engine, to provide realistic 3D views. It shows that using open-source technology provides not only commercial-level 3D effects but also a great reduction in cost.

  7. Feature Usage Explorer: Usage Monitoring and Visualization Tool in HTML5 Based Applications

    Directory of Open Access Journals (Sweden)

    Sarunas Marciuska

    2013-10-01

    Full Text Available Feature Usage Explorer is a JavaScript library, which automatically detects features in HTML5 based applications and monitors their usage. The collected information can be visualized in a Feature Usage Diagram, which is automatically generated from an input JSON file. Currently, users of Feature Usage Explorer have to design their own tool in order to generate the JSON file from the collected usage information. This design choice keeps the library from constraining the user's choice of preferred data storage. Feature Usage Explorer can be reused in any HTML5 based application where an understanding of how users interact with the system is required (i.e. user experience and usability studies, human-computer interaction, or requirement prioritization).
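
    As noted above, users currently write their own converter from collected usage information to the JSON input of the Feature Usage Diagram. The sketch below shows one possible converter in Python; the event names and the flat feature/count schema are hypothetical, not a format documented by the library.

```python
# Hedged sketch: aggregate hypothetical raw usage events into a JSON file of
# feature counts.  The schema (feature name + count) is an assumption, not
# the library's documented input format.
import json
from collections import Counter

# Hypothetical raw usage events collected from an HTML5 application
events = ["search", "export", "search", "login", "search", "export"]

usage = Counter(events)
feature_usage = [{"feature": name, "count": count} for name, count in usage.items()]

with open("feature_usage.json", "w") as fh:
    json.dump(feature_usage, fh, indent=2)
```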

  8. Vortex filament method as a tool for computational visualization of quantum turbulence

    Science.gov (United States)

    Hänninen, Risto; Baggaley, Andrew W.

    2014-01-01

    The vortex filament model has become a standard and powerful tool to visualize the motion of quantized vortices in helium superfluids. In this article, we present an overview of the method and highlight its impact in aiding our understanding of quantum turbulence, particularly superfluid helium. We present an analysis of the structure and arrangement of quantized vortices. Our results are in agreement with previous studies showing that under certain conditions, vortices form coherent bundles, which allows for classical vortex stretching, giving quantum turbulence a classical nature. We also offer an explanation for the differences between the observed properties of counterflow and pure superflow turbulence in a pipe. Finally, we suggest a mechanism for the generation of coherent structures in the presence of normal fluid shear. PMID:24704873
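
    The core numerical ingredient of the vortex filament model is the Biot-Savart integral evaluated over discretized vortex lines. The sketch below is a minimal, desingularized Python illustration of that sum for a single vortex ring; it is not the authors' code, and the discretization and core cutoff are deliberately crude.

```python
# Sketch of the desingularized Biot-Savart sum at the heart of the vortex
# filament method: velocity induced at a point r by a closed vortex line
# discretized into straight segments.
import numpy as np

KAPPA = 9.97e-8   # quantum of circulation in superfluid 4He (m^2/s)

def biot_savart_velocity(r, filament, core_radius=1e-10):
    """Velocity at point r induced by a closed filament given as an (N, 3) array."""
    v = np.zeros(3)
    n = len(filament)
    for i in range(n):
        s1 = filament[i]
        s2 = filament[(i + 1) % n]          # closed loop: wrap around
        ds = s2 - s1
        mid = 0.5 * (s1 + s2)
        rel = r - mid
        dist = np.linalg.norm(rel)
        if dist < core_radius:              # crude desingularization near the core
            continue
        v += KAPPA / (4 * np.pi) * np.cross(ds, rel) / dist**3
    return v

# Example: a vortex ring of radius 1 mm discretized into 100 segments
theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
ring = np.column_stack([1e-3 * np.cos(theta), 1e-3 * np.sin(theta), np.zeros_like(theta)])
print(biot_savart_velocity(np.array([0.0, 0.0, 0.0]), ring))
```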

  9. Data Visualization and Analysis Tools for the Global Precipitation Measurement (GPM) Validation Network

    Science.gov (United States)

    Morris, Kenneth R.; Schwaller, Mathew

    2010-01-01

    The Validation Network (VN) prototype for the Global Precipitation Measurement (GPM) Mission compares data from the Tropical Rainfall Measuring Mission (TRMM) satellite Precipitation Radar (PR) to similar measurements from U.S. and international operational weather radars. This prototype is a major component of the GPM Ground Validation System (GVS). The VN provides a means for the precipitation measurement community to identify and resolve significant discrepancies between the ground radar (GR) observations and similar satellite observations. The VN prototype is based on research results and computer code described by Anagnostou et al. (2001), Bolen and Chandrasekar (2000), and Liao et al. (2001), and has previously been described by Morris, et al. (2007). Morris and Schwaller (2009) describe the PR-GR volume-matching algorithm used to create the VN match-up data set used for the comparisons. This paper describes software tools that have been developed for visualization and statistical analysis of the original and volume matched PR and GR data.

  10. Usability Evaluation of the Spatial OLAP Visualization and Analysis Tool (SOVAT).

    Science.gov (United States)

    Scotch, Matthew; Parmanto, Bambang; Monaco, Valerie

    2007-02-01

    Increasingly sophisticated technologies, such as On-Line Analytical Processing (OLAP) and Geospatial Information Systems (GIS), are being leveraged for conducting community health assessments (CHA). Little is known about the usability of OLAP and GIS interfaces with respect to CHA. We conducted an iterative usability evaluation of the Spatial OLAP Visualization and Analysis Tool (SOVAT), a software application that combines OLAP and GIS. A total of nine graduate students and six community health researchers were asked to think-aloud while completing five CHA questions using SOVAT. The sessions were analyzed after every three participants and changes to the interface were made based on the findings. Measures included elapsed time, answers provided, erroneous actions, and satisfaction. Traditional OLAP interface features were poorly understood by participants and combined OLAP-GIS features needed to be better emphasized. The results suggest that the changes made to the SOVAT interface resulted in increases in both usability and user satisfaction.

  11. TROVE: A User-friendly Tool for Visualizing and Analyzing Cancer Hallmarks in Signaling Networks.

    Science.gov (United States)

    Chua, Huey Eng; Bhowmick, Sourav S; Zheng, Jie

    2017-09-22

    Cancer hallmarks, a concept that seeks to explain the complexity of cancer initiation and development, provide a new perspective of studying cancer signaling which could lead to a greater understanding of this complex disease. However, to the best of our knowledge, there is currently a lack of tools that support such hallmark-based study of the cancer signaling network, thereby impeding the gain of knowledge in this area. We present TROVE, user-friendly software that facilitates hallmark annotation, visualization and analysis in cancer signaling networks. In particular, TROVE facilitates hallmark analysis specific to particular cancer types. Available under the Eclipse Public License from: https://sites.google.com/site/cosbyntu/softwares/trove and https://github.com/trove2017/Trove. hechua@ntu.edu.sg or assourav@ntu.edu.sg. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  12. GENEASE: Real time bioinformatics tool for multi-omics and disease ontology exploration, analysis and visualization.

    Science.gov (United States)

    Ghandikota, Sudhir; Hershey, Gurjit K Khurana; Mersha, Tesfaye B

    2018-03-24

    Advances in high-throughput sequencing technologies have made it possible to generate multiple omics data at an unprecedented rate and scale. The accumulation of these omics data far outpaces the rate at which biologists can mine them and generate new hypotheses to test experimentally. There is an urgent need to develop powerful tools to efficiently and effectively search and filter these resources to address specific post-GWAS functional genomics questions. However, to date, these resources are scattered across several databases and often lack a unified portal for data annotation and analytics. In addition, existing tools to analyze and visualize these databases are highly fragmented, forcing researchers to access multiple applications and perform manual interventions for each gene or variant in an ad hoc fashion until all the questions are answered. In this study, we present GENEASE, a web-based one-stop bioinformatics tool designed not only to query and explore multi-omics and phenotype databases (e.g., GTEx, ClinVar, dbGaP, GWAS Catalog, ENCODE, Roadmap Epigenomics, KEGG, Reactome, Gene and Phenotype Ontology) in a single web interface but also to perform seamless post genome-wide association downstream functional and overlap analysis for non-coding regulatory variants. GENEASE accesses over 50 different databases in the public domain, including model organism-specific databases, to facilitate gene/variant and disease exploration, enrichment and overlap analysis in real time. It is a user-friendly tool with a point-and-click interface containing links for support information including a user manual and examples. GENEASE can be accessed freely at http://research.cchmc.org/mershalab/genease_new/login.html. Tesfaye.Mersha@cchmc.org, Sudhir.Ghandikota@cchmc.org. Supplementary data are available at Bioinformatics online.

  13. Noodles: a tool for visualization of numerical weather model ensemble uncertainty.

    Science.gov (United States)

    Sanyal, Jibonananda; Zhang, Song; Dyer, Jamie; Mercer, Andrew; Amburn, Philip; Moorhead, Robert J

    2010-01-01

    Numerical weather prediction ensembles are routinely used for operational weather forecasting. The members of these ensembles are individual simulations with either slightly perturbed initial conditions or different model parameterizations, or occasionally both. Multi-member ensemble output is usually large, multivariate, and challenging to interpret interactively. Forecast meteorologists are interested in understanding the uncertainties associated with numerical weather prediction; specifically variability between the ensemble members. Currently, visualization of ensemble members is mostly accomplished through spaghetti plots of a single mid-troposphere pressure surface height contour. In order to explore new uncertainty visualization methods, the Weather Research and Forecasting (WRF) model was used to create a 48-hour, 18 member parameterization ensemble of the 13 March 1993 "Superstorm". A tool was designed to interactively explore the ensemble uncertainty of three important weather variables: water-vapor mixing ratio, perturbation potential temperature, and perturbation pressure. Uncertainty was quantified using individual ensemble member standard deviation, inter-quartile range, and the width of the 95% confidence interval. Bootstrapping was employed to overcome the dependence on normality in the uncertainty metrics. A coordinated view of ribbon and glyph-based uncertainty visualization, spaghetti plots, iso-pressure colormaps, and data transect plots was provided to two meteorologists for expert evaluation. They found it useful in assessing uncertainty in the data, especially in finding outliers in the ensemble run and therefore avoiding the WRF parameterizations that lead to these outliers. Additionally, the meteorologists could identify spatial regions where the uncertainty was significantly high, allowing for identification of poorly simulated storm environments and physical interpretation of these model issues.
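
    The uncertainty metrics listed above (standard deviation, inter-quartile range, and the width of the 95% confidence interval, with bootstrapping to avoid normality assumptions) can be sketched in a few lines of Python. The ensemble below is random data standing in for the 18 WRF members; this is an illustration of the metrics, not the Noodles implementation.

```python
# Per-grid-point ensemble uncertainty metrics with a simple bootstrap over
# ensemble members (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
members = rng.normal(size=(18, 64, 64))   # hypothetical 18-member field

std_dev = members.std(axis=0)
iqr = np.percentile(members, 75, axis=0) - np.percentile(members, 25, axis=0)

def bootstrap_ci_width(data, n_boot=500, alpha=0.05):
    """Width of the bootstrap (1 - alpha) confidence interval of the ensemble mean."""
    n = data.shape[0]
    boot_means = np.empty((n_boot,) + data.shape[1:])
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)           # resample members with replacement
        boot_means[b] = data[idx].mean(axis=0)
    lo = np.percentile(boot_means, 100 * alpha / 2, axis=0)
    hi = np.percentile(boot_means, 100 * (1 - alpha / 2), axis=0)
    return hi - lo

ci_width = bootstrap_ci_width(members)
print(std_dev.mean(), iqr.mean(), ci_width.mean())
```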

  14. DisEpi: Compact Visualization as a Tool for Applied Epidemiological Research.

    Science.gov (United States)

    Benis, Arriel; Hoshen, Moshe

    2017-01-01

    Outcomes research and evidence-based medical practice are being positively impacted by the proliferation of healthcare databases. Modern epidemiologic studies require complex data comprehension. A new tool, DisEpi, facilitates visual exploration of epidemiological data supporting Public Health Knowledge Discovery. It provides domain experts with a compact visualization of information at the population level. In this study, DisEpi is applied to Attention-Deficit/Hyperactivity Disorder (ADHD) patients within Clalit Health Services, analyzing the socio-demographic and ADHD filled-prescription data between 2006 and 2016 of 1,605,800 children aged 6 to 17 years. DisEpi's goals are to facilitate the identification of (1) links between attributes and/or events, (2) changes in these relationships over time, and (3) clusters of population attributes showing similar trends. DisEpi combines hierarchical clustering graphics and a heatmap where color shades reflect disease time-trends. In the ADHD context, DisEpi allowed the domain expert to visually analyze a snapshot summary of data mining results. Accordingly, the domain expert was able to efficiently identify that (1) relatively younger children, and particularly the youngest children in a class, are treated more often, (2) medication incidence increased between 2006 and 2011 but then stabilized, and (3) progression rates of medication incidence are different for each of the 3 main discovered clusters (profiles) of treated children. DisEpi delivered results similar to those previously published which used classical statistical approaches. DisEpi requires minimal preparation and fewer iterations, generating results in a user-friendly format for the domain expert. DisEpi will be wrapped as a package containing the end-to-end discovery process. Optionally, it may provide automated annotation using calendar events (such as policy changes or media interests), which can improve discovery efficiency, interpretation, and policy implementation.
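
    The combination of hierarchical clustering and a time-trend heatmap described above can be sketched as follows. The incidence matrix is random, the age groups and years are placeholders, and the code is a generic scipy/matplotlib illustration rather than DisEpi itself.

```python
# Cluster groups with similar time trends, then show a heatmap with rows
# reordered by the clustering (illustrative data only).
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, dendrogram, fcluster

rng = np.random.default_rng(1)
years = np.arange(2006, 2017)
groups = [f"age {a}" for a in range(6, 18)]
incidence = rng.random((len(groups), len(years)))      # hypothetical trends

link = linkage(incidence, method="ward")
order = dendrogram(link, no_plot=True)["leaves"]       # row order from the dendrogram
clusters = fcluster(link, t=3, criterion="maxclust")   # e.g., 3 main profiles
print(dict(zip(groups, clusters)))

fig, ax = plt.subplots()
im = ax.imshow(incidence[order], aspect="auto", cmap="viridis")
ax.set_xticks(range(len(years)), labels=years, rotation=90)
ax.set_yticks(range(len(groups)), labels=[groups[i] for i in order])
fig.colorbar(im, ax=ax, label="incidence")
plt.tight_layout()
plt.show()
```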

  15. Using Open Source Tools to Create a Mobile Optimized, Crowdsourced Translation Tool

    OpenAIRE

    Evviva Weinraub Lajoie; Trey Terrell; Susan McEvoy; Eva Kaplan; Ariel Schwartz; Esther Ajambo

    2014-01-01

    In late 2012, OSU Libraries and Press partnered with Maria's Libraries, an NGO in Rural Kenya, to provide users the ability to crowdsource translations of folk tales and existing children's books into a variety of African languages, sub-languages, and dialects. Together, these two organizations have been creating a mobile optimized platform using open source libraries such as Wink Toolkit (a library which provides mobile-friendly interaction from a website) and Globalize3 to allow for multipl...

  16. Planetary SUrface Portal (PSUP): a tool for easy visualization and analysis of Martian surface

    Science.gov (United States)

    Poulet, Francois; Quantin-Nataf, Cathy; Ballans, Hervé; Lozac'h, Loic; Audouard, Joachim; Carter, John; Dassas, karin; Malapert, Jean-Christophe; Marmo, Chiara; Poulleau, Gilles; Riu, Lucie; Séjourné, antoine

    2016-10-01

    PSUP comprises two software application platforms for working with raster, vector, DTM, and hyper-spectral data acquired by various space instruments analyzing the surface of Mars from orbit. The first platform of PSUP is MarsSI (Martian surface data processing Information System, http://emars.univ-lyon1.fr). It provides data analysis functionalities to select and download ready-to-use products or to process data through specific and validated pipelines. To date, MarsSI handles CTX, HiRISE and CRISM data of the NASA/MRO mission, HRSC and OMEGA data of the ESA/MEx mission and THEMIS data of the NASA/ODY mission (Lozac'h et al., EPSC 2015). The second part of PSUP is also open to the scientific community and can be visited at http://psup.ias.u-psud.fr/. This web-based user interface provides access to many data products for Mars: image footprints and rasters from the MarsSI tool; compositional maps from OMEGA and TES; albedo and thermal inertia from OMEGA and TES; mosaics from THEMIS, Viking, and CTX; and high-level specific products (defined as catalogues) such as hydrated mineral sites derived from CRISM and OMEGA data, central peak mineralogy,… In addition, OMEGA C channel data cubes corrected for atmospheric and aerosol contributions can be downloaded. The architecture of PSUP data management and visualization is based on SITools2 and MIZAR, two generic tools developed by a joint effort between CNES and scientific laboratories. SITools2 provides a self-manageable data access layer deployed on the PSUP data, while MIZAR is a 3D browser application for discovering and visualizing geospatial data. Further developments, including the addition of high-level products of Mars (regional geological maps, new global compositional maps,…), are foreseen. Ultimately, PSUP will be adapted to other planetary surfaces and space missions in which the French research institutes are involved.

  17. sbml-diff: A Tool for Visually Comparing SBML Models in Synthetic Biology.

    Science.gov (United States)

    Scott-Brown, James; Papachristodoulou, Antonis

    2017-07-21

    We present sbml-diff, a tool that is able to read a model of a biochemical reaction network in SBML format and produce a range of diagrams showing different levels of detail. Each diagram type can be used to visualize a single model or to visually compare two or more models. The default view depicts species as ellipses, reactions as rectangles, rules as parallelograms, and events as diamonds. A cartoon view replaces the symbols used for reactions on the basis of the associated Systems Biology Ontology terms. An abstract view represents species as ellipses and draws edges between them to indicate whether a species increases or decreases the production or degradation of another species. sbml-diff is freely licensed under the three-clause BSD license and can be downloaded from https://github.com/jamesscottbrown/sbml-diff and used as a python package called from other software, as a free-standing command-line application, or online using the form at http://sysos.eng.ox.ac.uk/tebio/upload.

  18. Images as tools. On visual epistemic practices in the biological sciences.

    Science.gov (United States)

    Samuel, Nina

    2013-06-01

    Contemporary visual epistemic practices in the biological sciences raise new questions of how to transform aniconic data measurements into images, and of how the process of an imaging technique may change the material it is 'depicting'. This case-oriented study investigates microscopic imagery, which is used by systems and synthetic biologists alike. The core argument is developed around the analysis of two recent methods, developed between 2003 and 2006: localization microscopy and photo-induced cell death. Far from functioning merely as illustrations of work done by other means, images can be regarded as tools for discovery in their own right and as objects of investigation. Both methods deploy different constellations of intended and unintended interactions between visual appearance and underlying biological materiality. To characterize these new ways of interaction, the article introduces the notions of 'operational images' and 'operational agency'. Despite all their novelty, operational images are still subject to conventions of seeing and depicting: phenomena emerging with the new method of localization microscopy have to be designed according to image traditions of older, conventional fluorescence microscopy to function properly as devices for communication between physicists and biologists. The article emerged from a laboratory study based on interviews conducted with researchers from the Kirchhoff-Institute for Physics and the German Cancer Research Center (DKFZ) at Bioquant, Heidelberg, in 2011. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. A Software Tool to Visualize Verbal Protocols to Enhance Strategic and Metacognitive Abilities in Basic Programming

    Directory of Open Access Journals (Sweden)

    Carlos A. Arévalo

    2011-07-01

    Full Text Available Learning to program is difficult for many first year undergraduate students. Instructional strategies of traditional programming courses tend to focus on syntactic issues, assigning practice exercises using the presentation-examples-practice formula, and showing the verbal and visual explanation of a teacher during the “step by step” process of writing a computer program. Cognitive literature regarding the mental processes involved in programming suggests that the explicit teaching of certain aspects, such as mental models, strategic knowledge and metacognitive abilities, addresses critical issues of how to write and assemble the pieces of a computer program. Verbal protocols are often used in software engineering as a technique to record the short-term cognitive processes of a user or expert in evaluation or problem-solving scenarios. We argue that verbal protocols can be used as a mechanism to explicitly show the strategic and metacognitive process of an instructor when writing a program. In this paper we present an information system prototype developed to store and visualize worked examples derived from transcribed verbal protocols during the process of writing introductory level programs. Empirical data comparing the grades obtained by two groups of novice programming students, analyzed using ANOVA, indicate a statistically significant positive difference in performance for the group using the tool, even though these results cannot yet be extrapolated to the general population, given the reported limitations of this study.

  20. BAT: An open-source, web-based audio events annotation tool

    OpenAIRE

    Blai Meléndez-Catalan, Emilio Molina, Emilia Gómez

    2017-01-01

    In this paper we present BAT (BMAT Annotation Tool), an open-source, web-based tool for the manual annotation of events in audio recordings developed at BMAT (Barcelona Music and Audio Technologies). The main feature of the tool is that it provides an easy way to annotate the salience of simultaneous sound sources. Additionally, it allows users to define multiple ontologies to adapt to multiple tasks and offers the possibility of cross-annotating audio data. Moreover, it is easy to install and deploy...

  1. Autonomous Micro-Air-Vehicle Control Based on Visual Sensing for Odor Source Localization

    Directory of Open Access Journals (Sweden)

    Kenzo Kurotsuchi

    2017-07-01

    Full Text Available In this paper, we propose a novel control method for autonomous odor-source localization using visual and odor sensing by micro air vehicles (MAVs). Our method is based on biomimetics, which enables highly autonomous localization. Our method does not need any instruction signals, not even global positioning system (GPS) signals. An experimenter simply blows a whistle, and the MAV will then start to hover, seek an odor source, and keep hovering near the source. The GPS-signal-free control based on visual sensing enables indoor/underground use. Moreover, the MAV is lightweight (85 grams) and does not cause harm to others even if it accidentally falls. Experiments conducted in the real world successfully demonstrated odor source localization using the MAV with a bio-inspired searching method. The distance error of the localization was 63 cm, more accurate than the target distance of 120 cm for individual identification. These localization experiments are a first step toward a proof of concept for a danger warning system that could enable a safer and more secure society.

  2. Visual Basic, Excel-based fish population modeling tool - The pallid sturgeon example

    Science.gov (United States)

    Moran, Edward H.; Wildhaber, Mark L.; Green, Nicholas S.; Albers, Janice L.

    2016-02-10

    The model presented in this report is a spreadsheet-based model using Visual Basic for Applications within Microsoft Excel (http://dx.doi.org/10.5066/F7057D0Z) prepared in cooperation with the U.S. Army Corps of Engineers and U.S. Fish and Wildlife Service. It uses the same model structure and, initially, parameters as used by Wildhaber and others (2015) for pallid sturgeon. The difference between the model structure used for this report and that used by Wildhaber and others (2015) is that variance is not partitioned. For the model of this report, all variance is applied at the iteration and time-step levels of the model. Wildhaber and others (2015) partition variance into parameter variance (uncertainty about the value of a parameter itself) applied at the iteration level and temporal variance (uncertainty caused by random environmental fluctuations with time) applied at the time-step level. They included implicit individual variance (uncertainty caused by differences between individuals) within the time-step level. The interface developed for the model of this report is designed to allow the user the flexibility to change population model structure and parameter values and uncertainty separately for every component of the model. This flexibility makes the modeling tool potentially applicable to any fish species; however, the flexibility inherent in this modeling tool makes it possible for the user to obtain spurious outputs. The value and reliability of the model outputs are only as good as the model inputs. Using this modeling tool with improper or inaccurate parameter values, or for species for which the structure of the model is inappropriate, could lead to untenable management decisions. By facilitating fish population modeling, this modeling tool allows the user to evaluate a range of management options and implications. The goal of this modeling tool is to be a user-friendly modeling tool for developing fish population models useful to natural resource
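
    The distinction drawn above between variance applied at the iteration level and variance applied at the time-step level can be illustrated with a toy stochastic projection. The Python sketch below is not the Excel/VBA model, and its survival parameters are invented; it only shows where the two kinds of random draws enter.

```python
# Toy population projection: one random draw per replicate (iteration-level
# variability) and a new draw each year (time-step-level variability).
# Parameter values are hypothetical, not pallid sturgeon estimates.
import numpy as np

rng = np.random.default_rng(42)

def project_population(n0=1000, years=20, iterations=500):
    results = np.empty((iterations, years + 1))
    for it in range(iterations):
        # Iteration-level variability: each replicate gets its own mean survival
        mean_survival = rng.normal(0.80, 0.05)
        n = float(n0)
        results[it, 0] = n
        for t in range(1, years + 1):
            # Time-step-level variability: yearly fluctuation around that mean
            survival = np.clip(rng.normal(mean_survival, 0.03), 0.0, 1.0)
            n = n * survival
            results[it, t] = n
    return results

traj = project_population()
print("median population after 20 years:", np.median(traj[:, -1]))
```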

  3. PANDA-view: An easy-to-use tool for statistical analysis and visualization of quantitative proteomics data.

    Science.gov (United States)

    Chang, Cheng; Xu, Kaikun; Guo, Chaoping; Wang, Jinxia; Yan, Qi; Zhang, Jian; He, Fuchu; Zhu, Yunping

    2018-05-22

    Compared with the numerous software tools developed for identification and quantification of -omics data, there remains a lack of suitable tools for both downstream analysis and data visualization. To help researchers better understand the biological meanings in their -omics data, we present an easy-to-use tool, named PANDA-view, for both statistical analysis and visualization of quantitative proteomics data and other -omics data. PANDA-view contains various kinds of analysis methods such as normalization, missing value imputation, statistical tests, clustering and principal component analysis, as well as the most commonly-used data visualization methods including an interactive volcano plot. Additionally, it provides user-friendly interfaces for protein-peptide-spectrum representation of the quantitative proteomics data. PANDA-view is freely available at https://sourceforge.net/projects/panda-view/. 1987ccpacer@163.com and zhuyunping@gmail.com. Supplementary data are available at Bioinformatics online.
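
    Among the visualization methods mentioned, the interactive volcano plot is the most distinctive; the sketch below shows how such a plot is typically constructed from log2 fold changes and t-test p-values. The data are simulated and the code is a generic Python illustration, not PANDA-view's implementation.

```python
# Volcano plot from hypothetical quantitative proteomics results:
# log2 fold change on x, -log10 p-value on y, significant points highlighted.
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

rng = np.random.default_rng(7)
n_proteins = 2000
group_a = rng.normal(0, 1, size=(n_proteins, 3))        # hypothetical log2 intensities
group_b = group_a + rng.normal(0.1, 0.5, size=(n_proteins, 3))

log2_fc = group_b.mean(axis=1) - group_a.mean(axis=1)
p_values = stats.ttest_ind(group_b, group_a, axis=1).pvalue

significant = (np.abs(log2_fc) > 1) & (p_values < 0.05)

plt.scatter(log2_fc, -np.log10(p_values), s=5, c=np.where(significant, "red", "grey"))
plt.xlabel("log2 fold change")
plt.ylabel("-log10 p-value")
plt.title("Volcano plot (hypothetical data)")
plt.show()
```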

  4. A Tool for Interactive Data Visualization: Application to Over 10,000 Brain Imaging and Phantom MRI Data Sets

    OpenAIRE

    Panta, Sandeep R.; Wang, Runtang; Fries, Jill; Kalyanam, Ravi; Speer, Nicole; Banich, Marie; Kiehl, Kent; King, Margaret; Milham, Michael; Wager, Tor D.; Turner, Jessica A.; Plis, Sergey M.; Calhoun, Vince D.

    2016-01-01

    In this paper we propose a web-based approach for quick visualization of big data from brain magnetic resonance imaging (MRI) scans using a combination of an automated image capture and processing system, nonlinear embedding, and interactive data visualization tools. We draw upon thousands of MRI scans captured via the COllaborative Imaging and Neuroinformatics Suite (COINS). We then interface the output of several analysis pipelines based on structural and functional data to a t-distributed ...
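
    The nonlinear embedding referred to above (the abstract is truncated at "t-distributed", presumably t-distributed stochastic neighbor embedding, t-SNE) can be sketched generically as follows. The feature matrix is random, standing in for per-scan quality or morphometry measures; this is not the COINS pipeline itself.

```python
# Generic t-SNE embedding of high-dimensional, image-derived feature vectors
# for 2D interactive inspection (illustrative data only).
import numpy as np
from sklearn.manifold import TSNE
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)
features = rng.normal(size=(500, 50))     # hypothetical per-scan feature vectors

embedding = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(features)

plt.scatter(embedding[:, 0], embedding[:, 1], s=8)
plt.xlabel("t-SNE 1")
plt.ylabel("t-SNE 2")
plt.show()
```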

  5. The Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT): Data Analysis and Visualization for Geoscience Data

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Doutriaux, Charles [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Patchett, John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Sean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Shipman, Galen [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Miller, Ross [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Steed, Chad [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Krishnan, Harinarayan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Silva, Claudio [NYU Polytechnic School of Engineering, New York, NY (United States); Chaudhary, Aashish [Kitware, Inc., Clifton Park, NY (United States); Bremer, Peer-Timo [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pugmire, David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Childs, Hank [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Prabhat, Mr. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Geveci, Berk [Kitware, Inc., Clifton Park, NY (United States); Bauer, Andrew [Kitware, Inc., Clifton Park, NY (United States); Pletzer, Alexander [Tech-X Corp., Boulder, CO (United States); Poco, Jorge [NYU Polytechnic School of Engineering, New York, NY (United States); Ellqvist, Tommy [NYU Polytechnic School of Engineering, New York, NY (United States); Santos, Emanuele [Federal Univ. of Ceara, Fortaleza (Brazil); Potter, Gerald [NASA Johnson Space Center, Houston, TX (United States); Smith, Brian [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Maxwell, Thomas [NASA Johnson Space Center, Houston, TX (United States); Kindig, David [Tech-X Corp., Boulder, CO (United States); Koop, David [NYU Polytechnic School of Engineering, New York, NY (United States)

    2013-05-01

    To support interactive visualization and analysis of complex, large-scale climate data sets, UV-CDAT integrates a powerful set of scientific computing libraries and applications to foster more efficient knowledge discovery. Connected through a provenance framework, the UV-CDAT components can be loosely coupled for fast integration or tightly coupled for greater functionality and communication with other components. This framework addresses many challenges in the interactive visual analysis of distributed large-scale data for the climate community.

  6. 3D Visualization Tools to Support Soil Management In Relation to Sustainable Agriculture and Ecosystem Services

    Science.gov (United States)

    Wang, Chen

    2017-04-01

    Visualization tools [1][2][6] have been used increasingly as part of information, consultation, and collaboration in relation to issues of global significance. Visualization techniques can be used in a variety of different settings, depending on their association with specific types of decision. Initially, they can be used to improve awareness of the local community and landscape, either individually or in groups [5]. They can also be used to communicate different aspects of change, such as digital soil mapping, ecosystem services and climate change [7][8]. A prototype 3D model was developed to present the Tarland catchment in North East Scotland, which includes 1:25000 soil map data and 1:50000 land capability for agriculture (LCA) data [4]. The model was used to identify issues arising between the growing interest in soil monitoring and management and the potential effects on existing soil characteristics. An online model was also created which can capture user/stakeholder comments associated with soil features. In addition, when people are located physically within the real-world bounds of the current soil management scenario, they can use Augmented Reality to see the scenario overlaid on their immediate surroundings. Models representing alternative soil use and management were used in the virtual landscape theatre (VLT) [3] with electronic voting designed to elicit public aspirations and concerns regarding future soil uses, and to develop scenarios driven by local input. Preliminary findings suggest positive audience responses to the relevance of the inclusion of soil data within a scene when considering questions regarding the impact of land-use change, such as woodland, agricultural land and open spaces. A future development is the use of the prototype virtual environment in a preference survey of scenarios of changes in land use, and in stakeholder consultations on such changes.

  7. Gas discharge visualization: an imaging and modeling tool for medical biometrics.

    Science.gov (United States)

    Kostyuk, Nataliya; Cole, Phyadragren; Meghanathan, Natarajan; Isokpehi, Raphael D; Cohly, Hari H P

    2011-01-01

    The need for automated identification of a disease makes the issue of medical biometrics very current in our society. Not all biometric tools available provide real-time feedback. We introduce gas discharge visualization (GDV) technique as one of the biometric tools that have the potential to identify deviations from the normal functional state at early stages and in real time. GDV is a nonintrusive technique to capture the physiological and psychoemotional status of a person and the functional status of different organs and organ systems through the electrophotonic emissions of fingertips placed on the surface of an impulse analyzer. This paper first introduces biometrics and its different types and then specifically focuses on medical biometrics and the potential applications of GDV in medical biometrics. We also present our previous experience with GDV in the research regarding autism and the potential use of GDV in combination with computer science for the potential development of biological pattern/biomarker for different kinds of health abnormalities including cancer and mental diseases.

  8. Listening to the solar eclipse with an educational tool for the blind and visually impaired

    Science.gov (United States)

    Bieryla, Allyson; Diaz-Merced, Wanda; Davis, Daniel; Hart, Robert

    2018-01-01

    The Great American Solar Eclipse took place on August 21, 2017 and swept through 14 of the United States. This was a highly publicized event and much of the world took notice. We live in a time where everything is accessible via the internet as it is happening. Many people, even those outside of the eclipse path, wanted to experience the event in real time. We built a device, using an Arduino-compatible microcontroller, that converts sunlight to sound so that the blind and visually impaired community could experience the eclipse live with the rest of the world. The device has a high dynamic range light sensor and an audio output that connects to a webcam and a computer. The event was successfully streamed to YouTube from Jackson Hole, Wyoming, and people from all around the world connected to listen as the sun was temporarily dimmed by the passing moon. This device is inexpensive to reproduce (< $40 per device) and can be used as a teaching tool in a lab or classroom setting. Students can learn to build and write code for these devices as well. This is a tool with great potential for human development.

  9. AceTree: a tool for visual analysis of Caenorhabditis elegans embryogenesis

    Directory of Open Access Journals (Sweden)

    Araya Carlos L

    2006-06-01

    Full Text Available Abstract Background The invariant lineage of the nematode Caenorhabditis elegans has potential as a powerful tool for the description of mutant phenotypes and gene expression patterns. We previously described procedures for the imaging and automatic extraction of the cell lineage from C. elegans embryos. That method uses time-lapse confocal imaging of a strain expressing histone-GFP fusions; a software package, StarryNite, then processes the thousands of images and produces output files that describe the location and lineage relationship of each nucleus at each time point. Results We have developed a companion software package, AceTree, which links the images and the annotations using tree representations of the lineage. This facilitates curation and editing of the lineage. AceTree also contains powerful visualization and interpretive tools, such as space filling models and tree-based expression patterning, that can be used to extract biological significance from the data. Conclusion By pairing a fast lineaging program written in C with a user interface program written in Java, we have produced a powerful software suite for exploring embryonic development.

  10. CAGO: a software tool for dynamic visual comparison and correlation measurement of genome organization.

    Directory of Open Access Journals (Sweden)

    Yi-Feng Chang

    Full Text Available CAGO (Comparative Analysis of Genome Organization) is developed to address two critical shortcomings of conventional genome atlas plotters: lack of dynamic exploratory functions and absence of signal analysis for genomic properties. With dynamic exploratory functions, users can directly manipulate chromosome tracks of a genome atlas and intuitively identify distinct genomic signals by visual comparison. Signal analysis of genomic properties can further detect inconspicuous patterns from noisy genomic properties and calculate correlations between genomic properties across various genomes. To implement dynamic exploratory functions, CAGO presents each genome atlas in Scalable Vector Graphics (SVG) format and allows users to interact with it using an SVG viewer through JavaScript. Signal analysis functions are implemented using the R statistical software and the discrete wavelet transformation package waveslim. CAGO is not only a plotter for generating complex genome atlases, but also a platform for exploring genome atlases with dynamic exploratory functions for visual comparison and with signal analysis for comparing genomic properties across multiple organisms. The web-based application of CAGO, its source code, user guides, video demos, and live examples are publicly available and can be accessed at http://cbs.ym.edu.tw/cago.
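
    The signal-analysis step mentioned above relies on a discrete wavelet transform (waveslim in R). The following sketch performs an analogous decomposition in Python with PyWavelets on a simulated genomic-property signal; the signal and wavelet choice are illustrative only.

```python
# Multi-level discrete wavelet decomposition of a hypothetical genomic-property
# signal (e.g., GC content along a chromosome), then a crude reconstruction
# from the approximation coefficients only.
import numpy as np
import pywt

rng = np.random.default_rng(5)
gc_content = 0.5 + 0.05 * np.sin(np.linspace(0, 20 * np.pi, 4096)) \
             + rng.normal(0, 0.02, 4096)            # hypothetical noisy signal

coeffs = pywt.wavedec(gc_content, wavelet="db4", level=5)

# Keep only the coarse approximation to reveal the underlying trend
denoised = pywt.waverec([coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]],
                        wavelet="db4")
print(len(coeffs), denoised.shape)
```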

  11. iELVis: An open source MATLAB toolbox for localizing and visualizing human intracranial electrode data.

    Science.gov (United States)

    Groppe, David M; Bickel, Stephan; Dykstra, Andrew R; Wang, Xiuyuan; Mégevand, Pierre; Mercier, Manuel R; Lado, Fred A; Mehta, Ashesh D; Honey, Christopher J

    2017-04-01

    Intracranial electrical recordings (iEEG) and brain stimulation (iEBS) are invaluable human neuroscience methodologies. However, the value of such data is often unrealized as many laboratories lack tools for localizing electrodes relative to anatomy. To remedy this, we have developed a MATLAB toolbox for intracranial electrode localization and visualization, iELVis. NEW METHOD: iELVis uses existing tools (BioImage Suite, FSL, and FreeSurfer) for preimplant magnetic resonance imaging (MRI) segmentation, neuroimaging coregistration, and manual identification of electrodes in postimplant neuroimaging. Subsequently, iELVis implements methods for correcting electrode locations for postimplant brain shift with millimeter-scale accuracy and provides interactive visualization on 3D surfaces or in 2D slices with optional functional neuroimaging overlays. iELVis also localizes electrodes relative to FreeSurfer-based atlases and can combine data across subjects via the FreeSurfer average brain. It takes 30-60min of user time and 12-24h of computer time to localize and visualize electrodes from one brain. We demonstrate iELVis's functionality by showing that three methods for mapping primary hand somatosensory cortex (iEEG, iEBS, and functional MRI) provide highly concordant results. COMPARISON WITH EXISTING METHODS: iELVis is the first public software for electrode localization that corrects for brain shift, maps electrodes to an average brain, and supports neuroimaging overlays. Moreover, its interactive visualizations are powerful and its tutorial material is extensive. iELVis promises to speed the progress and enhance the robustness of intracranial electrode research. The software and extensive tutorial materials are freely available as part of the EpiSurg software project: https://github.com/episurg/episurg. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. SmartR: an open-source platform for interactive visual analytics for translational research data.

    Science.gov (United States)

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-07-15

    In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types such as clinical, pre-clinical or OMICS data, combined with strong visual analytical capabilities, will significantly accelerate scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART that not only equips the platform with several dynamic visual analytical workflows but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js or AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR. reinhard.schneider@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  13. An image-guided radiotherapy decision support framework incorporating a Bayesian network and visualization tool.

    Science.gov (United States)

    Hargrave, Catriona; Deegan, Timothy; Bednarz, Tomasz; Poulsen, Michael; Harden, Fiona; Mengersen, Kerrie

    2018-05-17

    To describe a Bayesian network (BN) and complementary visualization tool that aim to support decision-making during online cone-beam computed tomography (CBCT)-based image-guided radiotherapy (IGRT) for prostate cancer patients. The BN was created to represent relationships between observed prostate, proximal seminal vesicle (PSV), bladder and rectum volume variations, an image feature alignment score (FAS_TV_OAR), delivered dose, and treatment plan compliance (TPC). Variables influencing tumor volume (TV) targeting accuracy such as intrafraction motion, and contouring and couch shift errors were also represented. A score of overall TPC (FAS_global) and factors such as image quality were used to inform the BN output node providing advice about proceeding with treatment. The BN was quantified using conditional probabilities generated from published studies, FAS_TV_OAR/global modeling, and a survey of IGRT decision-making practices. A new IGRT visualization tool (IGRT_REV), in the form of Mollweide projection plots, was developed to provide a global summary of residual errors after online CBCT-planning CT registration. Sensitivity and scenario analyses were undertaken to evaluate the performance of the BN and the relative influence of the network variables on TPC and the decision to proceed with treatment. The IGRT_REV plots were evaluated in conjunction with the BN scenario testing, using additional test data generated from retrospective CBCT-planning CT soft-tissue registrations for 13/36 patients whose data were used in the FAS_TV_OAR/global modeling. Modeling of the TV targeting errors resulted in a very low probability of corrected distances between the CBCT and planning CT prostate or PSV volumes being within their thresholds. Strength of influence evaluation with and without the BN TV targeting error nodes indicated that rectum- and bladder-related network variables had the highest relative importance. When the TV targeting error nodes were excluded
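
    To make the role of the BN concrete, the following toy sketch (plain Python, with an entirely hypothetical structure reduced to three nodes and invented probabilities) shows the kind of computation such a network performs: marginalizing over treatment plan compliance to obtain a probability of proceeding, given a global alignment score.

```python
# Hand-coded three-node illustration: FAS_global -> TPC -> proceed.
# All conditional probability values below are hypothetical.
p_tpc_given_fas = {"good": 0.95, "moderate": 0.70, "poor": 0.30}   # P(TPC | FAS_global)
p_proceed_given_tpc = {True: 0.98, False: 0.15}                    # P(proceed | TPC)

def prob_proceed(fas_global):
    """Marginalize over TPC: P(proceed | FAS_global)."""
    p_tpc = p_tpc_given_fas[fas_global]
    return p_tpc * p_proceed_given_tpc[True] + (1 - p_tpc) * p_proceed_given_tpc[False]

for score in ("good", "moderate", "poor"):
    print(score, round(prob_proceed(score), 3))
```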

  14. Visualization in simulation tools: requirements and a tool specification to support the teaching of dynamic biological processes.

    Science.gov (United States)

    Jørgensen, Katarina M; Haddow, Pauline C

    2011-08-01

    Simulation tools are playing an increasingly important role behind advances in the field of systems biology. However, the current generation of biological science students has either little or no experience with such tools. As such, this educational glitch is limiting both the potential use of such tools as well as the potential for tighter cooperation between the designers and users. Although some simulation tool producers encourage their use in teaching, little attempt has hitherto been made to analyze and discuss their suitability as an educational tool for noncomputing science students. In general, today's simulation tools assume that the user has a stronger mathematical and computing background than that which is found in most biological science curricula, thus making the introduction of such tools a considerable pedagogical challenge. This paper provides an evaluation of the pedagogical attributes of existing simulation tools for cell signal transduction based on Cognitive Load theory. Further, design recommendations for an improved educational simulation tool are provided. The study is based on simulation tools for cell signal transduction. However, the discussions are relevant to a broader biological simulation tool set.

  15. Image Processing Tools for Improved Visualization and Analysis of Remotely Sensed Images for Agriculture and Forest Classifications

    OpenAIRE

    SINHA G. R.

    2017-01-01

    This paper suggests Image Processing tools for improved visualization and better analysis of remotely sensed images. There are methods already available in literature for the purpose but the most important challenge among the limitations is lack of robustness. We propose an optimal method for image enhancement of the images using fuzzy based approaches and few optimization tools. The segmentation images subsequently obtained after de-noising will be classified into distinct information and th...

  16. A novel tool to predict food intake: the Visual Meal Creator.

    Science.gov (United States)

    Holliday, Adrian; Batey, Chris; Eves, Frank F; Blannin, Andrew K

    2014-08-01

    Subjective appetite is commonly measured using an abstract visual analogue scale (VAS) technique, which provides no direct information about desired portion size or food choice. The purpose of this investigation was to develop and validate a user-friendly tool - the Visual Meal Creator (VIMEC) - that would allow for independent, repeated measures of subjective appetite and provide a prediction of food intake. Twelve participants experienced dietary control over a 5-hour period to manipulate hunger state on three occasions (small breakfast (SB) vs. large breakfast (LB) vs. large breakfast + snacks (LB+S)). Appetite measures were obtained every 60 minutes using the VIMEC and VAS. At 4.5 hours, participants were presented with an ad libitum test meal, from which energy intake (EI) was measured. The efficacy of the VIMEC was assessed by its ability to detect expected patterns of appetite and its strength as a predictor of energy intake. Day-to-day reproducibility and test-retest repeatability were assessed. Between- and within-condition differences in VAS and VIMEC scores (represented as mm and kcal of the "created" meal, respectively) were significantly correlated with one another throughout. Between- and within-condition changes in appetite scores obtained with the VIMEC exhibited a stronger correlation with EI at the test meal than those obtained with VAS. Pearson correlation coefficients for within-condition comparisons were 0.951, 0.914 and 0.875 (all p < 0.001) for SB, LB and LB+S respectively. Correlation coefficients for between-condition differences in VIMEC and EI were 0.273, 0.940 (p < 0.001) and 0.525 (p < 0.05) for SB - LB+S, SB - LB and LB - LB+S respectively. The VIMEC exhibited a similar degree of reproducibility to VAS. These findings suggest that the VIMEC is a stronger predictor of energy intake than the VAS. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Self and Peer Assessment and Dominance During Group Work Using Online Visual Tools

    Directory of Open Access Journals (Sweden)

    Ed Lester

    2010-11-01

    Full Text Available An experiment undertaken with engineering undergraduate students at the University of Nottingham involved 26 groups of three being filmed during a study using a virtual-reality-based problem-solving exercise. After the exercise, each individual filled in a questionnaire relating to the exercise which allowed them to score themselves and their peers for contribution and overall grade. The comparing of video evidence with perceived contributions made it possible to observe patterns of behaviour based on temperament dominance. This ‘dominance’ was based on two simple parameters extracted from an electronic version of the Myers-Briggs test: first, the time taken to complete the study, called ‘decisiveness’, and secondly, the degree of Extroversion/Introversion. The more decisive subjects received higher marks from their peers, despite the absence of any video evidence that they had actually contributed more than their peers. The most dominant extroverts appear to ‘do more’ with respect to the physical operation of the mouse/keyboard and interaction with the visual simulation during the virtual-reality exercise. However, there was no link with these simple temperament measures with the degree of enjoyment of the tasks, which appeared to be highly consistent. The authors do not argue that visual-media tools, such as the virtual-reality environment described in this article, might offer solutions to problems associated with group work in engineering, but rather that information regarding the character traits of the participants may help to create more effective teams and to help understand the inter-personal dynamics within teams undertaking such tasks.

  18. Determining the sources of fine-grained sediment using the Sediment Source Assessment Tool (Sed_SAT)

    Science.gov (United States)

    Gorman Sanisaca, Lillian E.; Gellis, Allen C.; Lorenz, David L.

    2017-07-27

    A sound understanding of the sources contributing to instream sediment flux in a watershed is important when developing total maximum daily load (TMDL) management strategies designed to reduce suspended sediment in streams. Sediment fingerprinting and sediment budget approaches are two techniques that, when used jointly, can qualify and quantify the major sources of sediment in a given watershed. The sediment fingerprinting approach uses trace element concentrations from samples in known potential source areas to determine a clear signature of each potential source. A mixing model is then used to determine the relative source contribution to the target suspended sediment samples. The computational steps required to apportion sediment for each target sample are quite involved and time intensive, a problem the Sediment Source Assessment Tool (Sed_SAT) addresses. Sed_SAT is a user-friendly statistical model that guides the user through the necessary steps in order to quantify the relative contributions of sediment sources in a given watershed. The model is written using the statistical software R (R Core Team, 2016b) and utilizes Microsoft Access® as a user interface but requires no prior knowledge of R or Microsoft Access® to run the model successfully. Sed_SAT identifies outliers, corrects for differences in size and organic content in the source samples relative to the target samples, evaluates the conservative behavior of tracers used in fingerprinting by applying a “Bracket Test,” identifies tracers with the highest discriminatory power, and provides robust error analysis through a Monte Carlo simulation following the mixing model. Quantifying sediment source contributions using the sediment fingerprinting approach provides local, State, and Federal land management agencies with important information needed to implement effective strategies to reduce sediment. Sed_SAT is designed to assist these agencies in applying the sediment fingerprinting
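
    The mixing model and Monte Carlo error analysis described above can be illustrated conceptually as follows. This Python sketch is not Sed_SAT (which is written in R); the tracer concentrations are invented and the error model is deliberately simple.

```python
# Conceptual sediment-fingerprinting mixing model: find non-negative source
# weights summing to 1 that best reproduce the target tracer signature, then
# propagate tracer uncertainty with a small Monte Carlo loop.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(11)

# Hypothetical mean tracer concentrations (rows: sources, cols: tracers)
sources = np.array([[12.0, 3.5, 140.0],    # e.g., cropland
                    [ 8.0, 5.0, 200.0],    # e.g., stream banks
                    [15.0, 2.0,  90.0]])   # e.g., forest
target = np.array([11.0, 3.8, 150.0])      # suspended-sediment sample

def unmix(sources, target):
    """Non-negative source weights summing to 1 that best explain the target."""
    n = sources.shape[0]
    objective = lambda w: np.sum(((w @ sources - target) / target) ** 2)
    res = minimize(objective, x0=np.full(n, 1.0 / n),
                   bounds=[(0.0, 1.0)] * n,
                   constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
    return res.x

# Monte Carlo: perturb the source means to propagate tracer uncertainty
draws = np.array([unmix(sources * rng.normal(1.0, 0.05, size=sources.shape), target)
                  for _ in range(500)])
print("mean contributions:", draws.mean(axis=0).round(2))
```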

  19. Health figures: an open source JavaScript library for health data visualization.

    Science.gov (United States)

    Ledesma, Andres; Al-Musawi, Mohammed; Nieminen, Hannu

    2016-03-22

    The way we look at data has a great impact on how we can understand it, particularly when the data is related to health and wellness. Due to the increased use of self-tracking devices and the ongoing shift towards preventive medicine, better understanding of our health data is an important part of improving the general welfare of the citizens. Electronic Health Records, self-tracking devices and mobile applications provide a rich variety of data but it often becomes difficult to understand. We implemented the hFigures library inspired on the hGraph visualization with additional improvements. The purpose of the library is to provide a visual representation of the evolution of health measurements in a complete and useful manner. We researched the usefulness and usability of the library by building an application for health data visualization in a health coaching program. We performed a user evaluation with Heuristic Evaluation, Controlled User Testing and Usability Questionnaires. In the Heuristics Evaluation the average response was 6.3 out of 7 points and the Cognitive Walkthrough done by usability experts indicated no design or mismatch errors. In the CSUQ usability test the system obtained an average score of 6.13 out of 7, and in the ASQ usability test the overall satisfaction score was 6.64 out of 7. We developed hFigures, an open source library for visualizing a complete, accurate and normalized graphical representation of health data. The idea is based on the concept of the hGraph but it provides additional key features, including a comparison of multiple health measurements over time. We conducted a usability evaluation of the library as a key component of an application for health and wellness monitoring. The results indicate that the data visualization library was helpful in assisting users in understanding health data and its evolution over time.

  20. ggCyto: Next Generation Open-Source Visualization Software for Cytometry.

    Science.gov (United States)

    Van, Phu; Jiang, Wenxin; Gottardo, Raphael; Finak, Greg

    2018-06-01

    Open source software for computational cytometry has gained in popularity over the past few years. Efforts such as FlowCAP, the Lyoplate and Euroflow projects have highlighted the importance of efforts to standardize both experimental and computational aspects of cytometry data analysis. The R/BioConductor platform hosts the largest collection of open source cytometry software covering all aspects of data analysis and providing infrastructure to represent and analyze cytometry data with all relevant experimental, gating, and cell population annotations enabling fully reproducible data analysis. Data visualization frameworks to support this infrastructure have lagged behind. ggCyto is a new open-source BioConductor software package for cytometry data visualization built on ggplot2 that enables ggplot-like functionality with the core BioConductor flow cytometry data structures. Amongst its features are the ability to transform data and axes on-the-fly using cytometry-specific transformations, plot faceting by experimental meta-data variables, and partial matching of channel, marker and cell populations names to the contents of the BioConductor cytometry data structures. We demonstrate the salient features of the package using publicly available cytometry data with complete reproducible examples in a supplementary material vignette. https://bioconductor.org/packages/devel/bioc/html/ggcyto.html. gfinak@fredhutch.org. Supplementary data are available at Bioinformatics online and at http://rglab.org/ggcyto/.

  1. RETRANS - A tool to verify the functional equivalence of automatically generated source code with its specification

    International Nuclear Information System (INIS)

    Miedl, H.

    1998-01-01

    According to the relevant technical standards (e.g. IEC 880), it is necessary to verify each step in the development process of safety-critical software. This also holds for the verification of automatically generated source code. To avoid human errors during this verification step and to limit the cost and effort, a tool should be used that is developed independently of the code generator. For this purpose, ISTec has developed the tool RETRANS, which demonstrates the functional equivalence of automatically generated source code with its underlying specification. (author)

  2. PRI-CAT: a web-tool for the analysis, storage and visualization of plant ChIP-seq experiments.

    NARCIS (Netherlands)

    Muino, J.M.; Hoogstraat, M.; Ham, van R.C.H.J.; Dijk, van A.D.J.

    2011-01-01

    Although several tools for the analysis of ChIP-seq data have been published recently, there is a growing demand, in particular in the plant research community, for computational resources with which such data can be processed, analyzed, stored, visualized and integrated within a single,

  3. The Effect of Using a Visual Representation Tool in a Teaching-Learning Sequence for Teaching Newton's Third Law

    Science.gov (United States)

    Savinainen, Antti; Mäkynen, Asko; Nieminen, Pasi; Viiri, Jouni

    2017-01-01

    This paper presents a research-based teaching-learning sequence (TLS) that focuses on the notion of interaction in teaching Newton's third law (N3 law) which is, as earlier studies have shown, a challenging topic for students to learn. The TLS made systematic use of a visual representation tool--an interaction diagram (ID)--highlighting…

  4. Oceans 2.0: Interactive tools for the Visualization of Multi-dimensional Ocean Sensor Data

    Science.gov (United States)

    Biffard, B.; Valenzuela, M.; Conley, P.; MacArthur, M.; Tredger, S.; Guillemot, E.; Pirenne, B.

    2016-12-01

    Ocean Networks Canada (ONC) operates ocean observatories on all three of Canada's coasts. The instruments produce 280 gigabytes of data per day, with half a petabyte archived so far. In 2015, 13 terabytes were downloaded by over 500 users from across the world. ONC's data management system is referred to as "Oceans 2.0" owing to its interactive, participative features. A key element of Oceans 2.0 is real-time data acquisition and processing: custom device drivers implement the input-output protocol of each instrument. Automatic parsing and calibration takes place on the fly, followed by event detection and quality control. All raw data are stored in a file archive, while the processed data are copied to fast databases. Interactive access to processed data is provided through data download and visualization/quick-look features that are adapted to diverse data types (scalar, acoustic, video, multi-dimensional, etc.). Data may be post-processed or re-processed to add features or analyses, correct errors, update calibrations, etc. A robust storage structure has been developed consisting of an extensive file system and a NoSQL database (Cassandra). Cassandra is a node-based open source distributed database management system. It is scalable and offers improved performance for big data. A key feature is data summarization. The system has also been integrated with web services and an ERDDAP OPeNDAP server, capable of serving scalar and multidimensional data from Cassandra for fixed or mobile devices. A complex data viewer has been developed making use of the big data capability to interactively display live or historic echo sounder and acoustic Doppler current profiler data, where users can scroll, apply processing filters and zoom through gigabytes of data with simple interactions. This new technology brings scientists one step closer to a comprehensive, web-based data analysis environment in which visual assessment, filtering, event detection and annotation can be integrated.
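
    The data summarization mentioned above can be illustrated with a small, hypothetical Python sketch (this is not ONC's Oceans 2.0 or Cassandra code): raw scalar samples are reduced to per-bucket minimum/mean/maximum values so a viewer can plot long time spans cheaply and only fetch raw samples when zoomed in. The sample data and bucket size are invented.

        # Hypothetical sketch of scalar data summarization for fast interactive viewing.
        # Raw (timestamp, value) samples are reduced to per-bucket min/mean/max.
        from collections import defaultdict
        from statistics import mean

        def summarize(samples, bucket_seconds=3600):
            """samples: iterable of (unix_timestamp, value); returns {bucket_start: (min, mean, max)}."""
            buckets = defaultdict(list)
            for ts, value in samples:
                buckets[int(ts // bucket_seconds) * bucket_seconds].append(value)
            return {start: (min(v), mean(v), max(v)) for start, v in sorted(buckets.items())}

        # Example: one day of fake 1 Hz temperature readings summarized to hourly statistics.
        raw = [(t, 4.0 + 0.001 * (t % 7000)) for t in range(0, 86400)]
        hourly = summarize(raw)
        print(len(hourly), "buckets;", hourly[0])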

  5. Public data and open source tools for multi-assay genomic investigation of disease.

    Science.gov (United States)

    Kannan, Lavanya; Ramos, Marcel; Re, Angela; El-Hachem, Nehme; Safikhani, Zhaleh; Gendoo, Deena M A; Davis, Sean; Gomez-Cabrero, David; Castelo, Robert; Hansen, Kasper D; Carey, Vincent J; Morgan, Martin; Culhane, Aedín C; Haibe-Kains, Benjamin; Waldron, Levi

    2016-07-01

    Molecular interrogation of a biological sample through DNA sequencing, RNA and microRNA profiling, proteomics and other assays, has the potential to provide a systems level approach to predicting treatment response and disease progression, and to developing precision therapies. Large publicly funded projects have generated extensive and freely available multi-assay data resources; however, bioinformatic and statistical methods for the analysis of such experiments are still nascent. We review multi-assay genomic data resources in the areas of clinical oncology, pharmacogenomics and other perturbation experiments, population genomics and regulatory genomics and other areas, and tools for data acquisition. Finally, we review bioinformatic tools that are explicitly geared toward integrative genomic data visualization and analysis. This review provides starting points for accessing publicly available data and tools to support development of needed integrative methods. © The Author 2015. Published by Oxford University Press.

  6. BioJS: an open source JavaScript framework for biological data visualization.

    Science.gov (United States)

    Gómez, John; García, Leyla J; Salazar, Gustavo A; Villaveces, Jose; Gore, Swanand; García, Alexander; Martín, Maria J; Launay, Guillaume; Alcántara, Rafael; Del-Toro, Noemi; Dumousseau, Marine; Orchard, Sandra; Velankar, Sameer; Hermjakob, Henning; Zong, Chenggong; Ping, Peipei; Corpas, Manuel; Jiménez, Rafael C

    2013-04-15

    BioJS is an open-source project whose main objective is the visualization of biological data in JavaScript. BioJS provides an easy-to-use consistent framework for bioinformatics application programmers. It follows a community-driven standard specification that includes a collection of components purposely designed to require a very simple configuration and installation. In addition to the programming framework, BioJS provides a centralized repository of components available for reutilization by the bioinformatics community. http://code.google.com/p/biojs/. Supplementary data are available at Bioinformatics online.

  7. MyView2, a new visualization software tool for analysis of LHD data

    International Nuclear Information System (INIS)

    Moon, Chanho; Yoshinuma, Mikirou; Emoto, Masahiko; Ida, Katsumi

    2016-01-01

    The Large Helical Device (LHD) at the National Institute for Fusion Science (NIFS) is the world's largest superconducting helical fusion device, providing a scientific research center to elucidate important physics topics such as plasma transport and turbulence dynamics. Furthermore, many types of advanced diagnostic devices are used to measure the confinement plasma characteristics, and these valuable physical data are recorded for the more than 131,000 discharges in the LHD database. However, even though so much physical data has been registered, investigating the experimental data remains difficult. In order to improve the efficiency of investigating plasma physics in LHD, we have developed a new data visualization software, MyView2, which consists of Python-based modules that can be easily set up and updated. MyView2 provides immediate access to experimental results, cross-shot analysis, and a collaboration point for scientific research. In particular, the MyView2 software has a portable structure that makes LHD experimental data viewable on both on-site and off-site web servers, a capability not previously available in any general-use tool. We also discuss the benefits of using the MyView2 software for in-depth analysis of LHD experimental data.

  8. E-infocenter, a visual tool for project management in educational robotics using web technologies

    Directory of Open Access Journals (Sweden)

    Kathia Pittí Patiño

    2012-07-01

    Full Text Available Internet applications and educational robotics are technologies characterized by their relative novelty and motivating character, and they are an ideal setting for the application of active teaching methods. Project-based learning is considered one of the most attractive of these teaching methods. In our approach, robotics projects make use of a Web 2.0 collaborative environment, which serves as a support tool for the students. In this way they can develop many skills that are easily transferable to the labour market. In this paper, an online visual tool called E-infocenter is described. We show the selection process, the design and the implementation of this project management tool. The tool was used for the first time over six weeks in the “Vehicles LEGO NXT” workshop, an activity for children aged between 8 and 15 years. The benefits perceived by the participants were at the level of project management, emotional engagement and collaboration.

  9. Enhancing interdisciplinary collaboration and decisionmaking with J-Earth: an open source data sharing, visualization and GIS analysis platform

    Science.gov (United States)

    Prashad, L. C.; Christensen, P. R.; Fink, J. H.; Anwar, S.; Dickenshied, S.; Engle, E.; Noss, D.

    2010-12-01

    local to global levels. J-Earth is a Geographic Information System (GIS) that provides analytical tools for visualizing high-resolution and hyperspectral remote sensing imagery along with numeric and vector data. J-Earth is part of the JMARS (Java Mission-planning and Analysis for Remote Sensing) suite of tools, which were first created to target NASA instruments on Mars and Lunar missions. Data can currently be incorporated in J-Earth at a scale of over 32,000 pixels per degree. Among other GIS functions, users can analyze trends along a transect line, or across vector regions, over multiple stacked numerical data layers and export their results. Open source tools like J-Earth are not only generally free or low-cost to users but also provide the opportunity for users to contribute direction, functionality, and data standards to these projects. The flexible nature of open source projects often facilitates the incorporation of unique and emerging data sources, such as mobile phone data, sensor networks, crowdsourced inputs, and social networking. The J-Earth team plans to incorporate data sources such as these with the feedback and participation of the user community.

  10. Sinistrals are rarely ‘right’: evidence from tool-affordance processing in visual half-field paradigms

    Directory of Open Access Journals (Sweden)

    Bartosz Michałowski

    2015-03-01

    Full Text Available Although current neuroscience and behavioral studies provide substantial understanding of tool representations (e.g., the processing of tool-related affordances) in the human brain, most of this knowledge is limited to right-handed individuals with typical organization of cognitive and manual skills. Therefore, any insights from these lines of research may be of little value in rehabilitation of patients with atypical laterality of praxis and/or hand dominance. To fill this gap, we tested perceptual processing of man-made objects in 18 healthy left-handers who were likely to show greater incidence of right-sided or bilateral (atypical) lateralization of functions. In the two experiments reported here, participants performed a tool vs. non-tool categorization task. In Exp. 1, target and distracter objects were presented for 200 ms in the left (LVF) or right (RVF) visual field, followed by 200 ms masks. In Exp. 2, the centrally presented targets were preceded by masked primes of 35 ms duration, again presented in the LVF or RVF. Based on results from both studies, i.e., response times to correctly discriminated stimuli irrespective of their category, participants were divided into two groups showing privileged processing in either the left (N = 9) or right (N = 9) visual field. In Exp. 1, only individuals with RVF advantage showed significantly faster categorization of tools in their dominant visual field, whereas those with LVF advantage revealed merely a trend towards such an effect. In Exp. 2, when targets were preceded by identical primes, the ‘atypical’ group showed significantly facilitated categorization of non-tools, whereas the ‘typical’ group demonstrated a trend towards faster categorization of tools. These results indicate that in subjects with atypically organized cognitive skills, tool-related processes are not just mirror reversed. Thus, our outcomes call for particular caution in neurorehabilitation directed at left

  11. Pulseq-Graphical Programming Interface: Open source visual environment for prototyping pulse sequences and integrated magnetic resonance imaging algorithm development.

    Science.gov (United States)

    Ravi, Keerthi Sravan; Potdar, Sneha; Poojar, Pavan; Reddy, Ashok Kumar; Kroboth, Stefan; Nielsen, Jon-Fredrik; Zaitsev, Maxim; Venkatesan, Ramesh; Geethanath, Sairam

    2018-03-11

    To provide a single open-source platform for comprehensive MR algorithm development inclusive of simulations, pulse sequence design and deployment, reconstruction, and image analysis. We integrated the "Pulseq" platform for vendor-independent pulse programming with Graphical Programming Interface (GPI), a scientific development environment based on Python. Our integrated platform, Pulseq-GPI, permits sequences to be defined visually and exported to the Pulseq file format for execution on an MR scanner. For comparison, Pulseq files using either MATLAB only ("MATLAB-Pulseq") or Python only ("Python-Pulseq") were generated. We demonstrated three fundamental sequences on a 1.5 T scanner. Execution times of the three variants of implementation were compared on two operating systems. In vitro phantom images indicate equivalence with the vendor-supplied implementations and MATLAB-Pulseq. The examples demonstrated in this work illustrate the unifying capability of Pulseq-GPI. The execution times of all the three implementations were fast (a few seconds). The software is capable of user-interface based development and/or command line programming. The tool demonstrated here, Pulseq-GPI, integrates the open-source simulation, reconstruction and analysis capabilities of GPI Lab with the pulse sequence design and deployment features of Pulseq. Current and future work includes providing an ISMRMRD interface and incorporating Specific Absorption Rate and Peripheral Nerve Stimulation computations. Copyright © 2018 Elsevier Inc. All rights reserved.

  12. CellNetVis: a web tool for visualization of biological networks using force-directed layout constrained by cellular components.

    Science.gov (United States)

    Heberle, Henry; Carazzolle, Marcelo Falsarella; Telles, Guilherme P; Meirelles, Gabriela Vaz; Minghim, Rosane

    2017-09-13

    The advent of "omics" science has brought new perspectives in contemporary biology through the high-throughput analyses of molecular interactions, providing new clues in protein/gene function and in the organization of biological pathways. Biomolecular interaction networks, or graphs, are simple abstract representations where the components of a cell (e.g. proteins, metabolites etc.) are represented by nodes and their interactions are represented by edges. An appropriate visualization of data is crucial for understanding such networks, since pathways are related to functions that occur in specific regions of the cell. The force-directed layout is an important and widely used technique to draw networks according to their topologies. Placing the networks into cellular compartments helps to quickly identify where network elements are located and, more specifically, concentrated. Currently, only a few tools provide the capability of visually organizing networks by cellular compartments. Most of them cannot handle large and dense networks. Even for small networks with hundreds of nodes the available tools are not able to reposition the network while the user is interacting, limiting the visual exploration capability. Here we propose CellNetVis, a web tool to easily display biological networks in a cell diagram employing a constrained force-directed layout algorithm. The tool is freely available and open-source. It was originally designed for networks generated by the Integrated Interactome System and can be used with networks from other databases, like InnateDB. CellNetVis has been shown to be applicable for dynamic investigation of complex networks over a consistent representation of a cell on the Web, with capabilities not matched elsewhere.
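
    The constrained force-directed layout described above can be illustrated with a hedged Python sketch (CellNetVis itself is a web tool and this is not its implementation): nodes repel each other, edges act as springs, and after every iteration each node is clamped into the bounding box of its assigned cellular compartment. The compartment boxes, edges and node assignments below are made up.

        # Hedged sketch of a compartment-constrained force-directed layout.
        import numpy as np

        def constrained_layout(n_nodes, edges, compartment_of, boxes, iters=200, seed=0):
            rng = np.random.default_rng(seed)
            pos = rng.uniform(0.0, 1.0, size=(n_nodes, 2))
            for _ in range(iters):
                disp = np.zeros_like(pos)
                # Repulsion between every pair of nodes.
                for i in range(n_nodes):
                    d = pos[i] - pos
                    dist2 = (d ** 2).sum(axis=1) + 1e-9
                    disp[i] += (d / dist2[:, None]).sum(axis=0) * 0.0005
                # Spring attraction along edges.
                for a, b in edges:
                    d = pos[a] - pos[b]
                    disp[a] -= 0.05 * d
                    disp[b] += 0.05 * d
                pos += disp
                # Constraint step: clamp each node into its compartment's box.
                for i in range(n_nodes):
                    xmin, ymin, xmax, ymax = boxes[compartment_of[i]]
                    pos[i, 0] = min(max(pos[i, 0], xmin), xmax)
                    pos[i, 1] = min(max(pos[i, 1], ymin), ymax)
            return pos

        boxes = {"nucleus": (0.4, 0.4, 0.6, 0.6), "cytoplasm": (0.1, 0.1, 0.9, 0.9)}
        edges = [(0, 1), (1, 2), (2, 3)]
        compartment_of = ["nucleus", "nucleus", "cytoplasm", "cytoplasm"]
        print(constrained_layout(4, edges, compartment_of, boxes))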

  13. Measuring temporal summation in visual detection with a single-photon source.

    Science.gov (United States)

    Holmes, Rebecca; Victora, Michelle; Wang, Ranxiao Frances; Kwiat, Paul G

    2017-11-01

    Temporal summation is an important feature of the visual system which combines visual signals that arrive at different times. Previous research estimated complete summation to last for 100 ms for stimuli judged "just detectable." We measured the full range of temporal summation for much weaker stimuli using a new paradigm and a novel light source, developed in the field of quantum optics, for generating small numbers of photons with precise timing characteristics and reduced variance in photon number. Dark-adapted participants judged whether a light was presented to the left or right of their fixation in each trial. In Experiment 1, stimuli contained a stream of photons delivered at a constant rate while the duration was systematically varied. Accuracy should increase with duration as long as the later photons can be integrated with the preceding ones into a single signal. The temporal integration window was estimated as the point at which performance no longer improved, and was found to be 650 ms on average. In Experiment 2, the duration of the visual stimuli was kept short (100 ms or less) and the number of photons was varied to explore the efficiency of summation over the integration window compared to Experiment 1. There was some indication that temporal summation remains efficient over the integration window, although there is variation between individuals. The relatively long integration window measured in this study may be relevant to studies of the absolute visual threshold, i.e., tests of single-photon vision, where "single" photons should be separated by more than the integration window to avoid summation. Copyright © 2017 Elsevier Ltd. All rights reserved.
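
    The logic of the Experiment 1 analysis described above can be sketched in a few hypothetical lines of Python (this is not the authors' analysis code, and the accuracy values are invented): accuracy in the two-alternative task is measured at increasing stimulus durations, and the integration window is estimated as the duration beyond which performance stops improving.

        # Hypothetical sketch of estimating a temporal integration window from
        # accuracy versus stimulus duration.
        import numpy as np

        durations_ms = np.array([50, 100, 200, 400, 650, 900, 1200])
        accuracy     = np.array([0.55, 0.60, 0.68, 0.76, 0.82, 0.82, 0.83])  # invented 2AFC proportion correct

        def integration_window(durations, acc, tol=0.01):
            """Return the shortest duration after which accuracy no longer improves by more than tol."""
            for i in range(1, len(durations)):
                if np.all(acc[i:] - acc[i - 1] <= tol):
                    return durations[i - 1]
            return durations[-1]

        print("estimated integration window:", integration_window(durations_ms, accuracy), "ms")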

  14. Total organic carbon, an important tool in an holistic approach to hydrocarbon source fingerprinting

    Energy Technology Data Exchange (ETDEWEB)

    Boehm, P.D.; Burns, W.A.; Page, D.S.; Bence, A.E.; Mankiewicz, P.J.; Brown, J.S.; Douglas, G.S. [Battelle Member Inst., Waltham, MA (United States)

    2002-07-01

    The identification and allocation of multiple hydrocarbon sources in marine sediments is best achieved using an holistic approach. Total organic carbon (TOC) is one important tool that can constrain the contributions of specific sources and rule out incorrect source allocations in cases where inputs are dominated by fossil organic carbon. In a study of the benthic sediments from Prince William Sound (PWS) and the Gulf of Alaska (GOA), we find excellent agreement between measured TOC and TOC calculated from hydrocarbon fingerprint matches of polycyclic aromatic hydrocarbons (PAH) and chemical biomarkers. Confirmation by two such independent source indicators (TOC and fingerprint matches) provides evidence that source allocations determined by the fingerprint matches are robust and that the major TOC sources have been correctly identified. Fingerprint matches quantify the hydrocarbon contributions of various sources to the benthic sediments and the degree of hydrocarbon winnowing by waves and currents. TOC contents are then calculated using source allocation results from fingerprint matches and the TOCs of contributing sources. Comparisons of the actual sediment TOC values and those calculated from source allocation support our earlier published findings that the natural petrogenic hydrocarbon background in sediments in this area comes from eroding Tertiary shales and associated oil seeps along the northern GOA coast and exclude thermally mature area coals from being important contributors to the PWS background due to their high TOC content.
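
    The TOC consistency check described above can be expressed as a simple mixing calculation; the hedged Python sketch below uses illustrative source fractions and end-member TOC values, not the published PWS/GOA numbers.

        # Hedged sketch of the TOC consistency check: the TOC implied by
        # fingerprint-based source allocation is sum_i f_i * TOC_i and is compared
        # with the measured sediment TOC.
        def calculated_toc(source_fractions, source_toc):
            assert abs(sum(source_fractions.values()) - 1.0) < 1e-6, "fractions must sum to 1"
            return sum(f * source_toc[name] for name, f in source_fractions.items())

        source_toc = {"tertiary_shale": 1.2, "seep_oil": 80.0, "area_coal": 60.0, "marine_background": 0.8}  # wt% TOC, illustrative
        fractions  = {"tertiary_shale": 0.90, "seep_oil": 0.02, "area_coal": 0.00, "marine_background": 0.08}

        measured_toc = 2.6  # wt%, illustrative
        calc = calculated_toc(fractions, source_toc)
        print(f"calculated TOC = {calc:.2f} wt%, measured TOC = {measured_toc:.2f} wt%")
        # A large disagreement (e.g. if high-TOC coal were assigned a big share)
        # would flag an incorrect source allocation.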

  15. Visualization of NO2 emission sources using temporal and spatial pattern analysis in Asia

    Science.gov (United States)

    Schütt, A. M. N.; Kuhlmann, G.; Zhu, Y.; Lipkowitsch, I.; Wenig, M.

    2016-12-01

    Nitrogen dioxide (NO2) is an indicator for population density and level of development, but the contributions of the different emission sources to the overall concentrations remain mostly unknown. In order to allocate fractions of OMI NO2 to emission types, we investigate several temporal cycles and regional patterns. Our analysis is based on daily maps of tropospheric NO2 vertical column densities (VCDs) from the Ozone Monitoring Instrument (OMI). The data set is mapped to a high-resolution grid by a histopolation algorithm. This algorithm is based on a continuous parabolic spline, producing more realistic smooth distributions while reproducing the measured OMI values when integrating over ground pixel areas. In the resulting sequence of zoom-in maps, we analyze weekly and annual cycles for cities, countryside and highways in China, Japan and the Republic of Korea, look for patterns and trends, and compare the derived results to emission sources in Central Europe and North America. Because of increased heating in winter compared to summer and more traffic during the week than on Sundays, we can separate traffic, heating and power plant contributions and visualize maps of the different sources. We also look into the influence of emission control measures during big events like the Olympic Games 2008 and the World Expo 2010 as a possibility to confirm our classification of NO2 emission sources.
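
    The weekly-cycle part of such an analysis can be illustrated with a hypothetical Python sketch (not the authors' processing chain; the daily values are invented): daily NO2 columns for one grid cell are grouped by weekday, and a weekday-to-Sunday ratio helps separate traffic-like behaviour from sources without a weekly cycle, such as power plants.

        # Hypothetical sketch of extracting a weekly cycle from daily tropospheric
        # NO2 vertical column densities for one grid cell.
        from datetime import date, timedelta
        from collections import defaultdict
        from statistics import mean

        def weekly_cycle(daily_vcd):
            """daily_vcd: {datetime.date: NO2 VCD}; returns mean VCD per weekday (0=Mon .. 6=Sun)."""
            by_weekday = defaultdict(list)
            for day, vcd in daily_vcd.items():
                by_weekday[day.weekday()].append(vcd)
            return {wd: mean(v) for wd, v in sorted(by_weekday.items())}

        start = date(2010, 1, 1)
        fake_series = {start + timedelta(d): 8.0 - 2.5 * ((start + timedelta(d)).weekday() == 6)
                       for d in range(365)}  # invented columns (1e15 molec/cm2), lower on Sundays
        cycle = weekly_cycle(fake_series)
        print("weekday/Sunday ratio:", round(mean(cycle[w] for w in range(5)) / cycle[6], 2))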

  16. An open-source optimization tool for solar home systems: A case study in Namibia

    International Nuclear Information System (INIS)

    Campana, Pietro Elia; Holmberg, Aksel; Pettersson, Oscar; Klintenberg, Patrik; Hangula, Abraham; Araoz, Fabian Benavente; Zhang, Yang; Stridh, Bengt; Yan, Jinyue

    2016-01-01

    Highlights: • An open-source optimization tool for solar home systems (SHSs) design is developed. • The optimization tool is written in MS Excel-VBA. • The optimization tool is validated against commercial and open-source software. • The optimization tool has the potential to improve future SHS installations. - Abstract: Solar home systems (SHSs) represent a viable technical solution for providing electricity to households and improving living conditions in areas not reached by the national grid or local grids. For this reason, several rural electrification programmes in developing countries, including Namibia, have relied on SHSs to electrify rural off-grid communities. However, the limited technical know-how of service providers, often resulting in over- or under-sized SHSs, is an issue that has to be solved to avoid dissatisfaction among SHS users. The solution presented here is to develop open-source software that service providers can use to optimally design SHS components based on the specific electricity requirements of the end-user. The aim of this study is to develop and validate an optimization model, written in MS Excel-VBA, which calculates the optimal capacities of SHS components guaranteeing minimum cost and maximum system reliability. The results obtained with the developed tool showed good agreement with a commercial software package and a computational code used in research activities. When the developed optimization tool was applied to existing systems, the results showed that several components were incorrectly sized. The tool thus has the potential to improve future SHS installations, contributing to increased satisfaction of end-users.
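
    The kind of sizing search such a tool performs can be sketched in Python (the published tool is written in MS Excel-VBA; this hedged sketch only illustrates the approach, and the cost figures, irradiation profile and load profile are invented placeholders): enumerate PV and battery capacities, simulate an hourly energy balance, and keep the cheapest combination whose loss-of-load probability stays below a target.

        # Hedged sketch of solar home system sizing by enumeration.
        import itertools

        def loss_of_load_probability(pv_kw, batt_kwh, pv_hourly_kwh_per_kw, load_hourly_kwh):
            soc, unmet_hours = batt_kwh, 0
            for pv_unit, load in zip(pv_hourly_kwh_per_kw, load_hourly_kwh):
                soc = min(batt_kwh, soc + pv_kw * pv_unit - load)
                if soc < 0:
                    unmet_hours += 1
                    soc = 0.0
            return unmet_hours / len(load_hourly_kwh)

        # One illustrative day repeated over a year: simple day/night PV and evening load.
        pv_profile   = ([0.0] * 7 + [0.1, 0.3, 0.5, 0.6, 0.6, 0.6, 0.5, 0.3, 0.1] + [0.0] * 8) * 365
        load_profile = ([0.02] * 18 + [0.15, 0.20, 0.20, 0.15, 0.05, 0.02]) * 365

        best = None
        for pv_kw, batt_kwh in itertools.product([0.1, 0.2, 0.3, 0.4], [0.5, 1.0, 1.5, 2.0]):
            cost = 900 * pv_kw + 250 * batt_kwh  # illustrative USD/kWp and USD/kWh
            if loss_of_load_probability(pv_kw, batt_kwh, pv_profile, load_profile) <= 0.05:
                if best is None or cost < best[0]:
                    best = (cost, pv_kw, batt_kwh)
        print("cheapest feasible design (cost, kWp, kWh):", best)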

  17. Development of Environmental Decision Support System: Unifying Cross-Discipline Data Access Through Open Source Tools

    Science.gov (United States)

    Freeman, S.; Darmenova, K.; Higgins, G. J.; Apling, D.

    2012-12-01

    A common theme when it comes to accessing climate and environmental datasets is that it can be difficult to answer the five basic questions: Who, What, When, Where, and Why. Sometimes even the act of locating a data set or determining how it was generated can prove difficult. It is even more challenging for non-scientific individuals such as planners and policy makers who need to access and include such information in their work. Our Environmental Decision Support System (EDSS) attempts to address this issue by integrating several open source packages to create a simple yet robust web application for conglomerating, searching, viewing, and downloading environmental information for scientists and decision makers alike. The system is comprised of several open source components, each playing an important role in the EDSS. The Geoportal web application provides an intuitive interface for searching and managing metadata ingested from data sets/data sources. The GeoServer and ncWMS web applications provide overlays and information for visual presentations of the data through web mapping services (WMS) by ingesting ESRI shapefiles, NetCDF, and HDF files. Users of the EDSS can browse the catalog of available products, enter a simple search string, or even constrain searches by temporal and spatial extents. Combined with a custom visualization web application, the EDSS provides a simple yet efficient means for users to not only access and manipulate climate and environmental data, but also trace the data source and the analytical methods used in the final decision aid products.
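
    As a hypothetical illustration of how a client might pull a rendered overlay from a GeoServer/ncWMS-style stack, the Python sketch below builds a standard OGC WMS 1.1.1 GetMap request; the base URL and layer name are placeholders, while the query parameters are the standard GetMap fields.

        # Hypothetical sketch of a WMS GetMap request against a placeholder endpoint.
        from urllib.parse import urlencode

        def getmap_url(base_url, layer, bbox, width=512, height=512, fmt="image/png"):
            params = {
                "service": "WMS", "version": "1.1.1", "request": "GetMap",
                "layers": layer, "styles": "",
                "srs": "EPSG:4326", "bbox": ",".join(str(v) for v in bbox),
                "width": width, "height": height, "format": fmt, "transparent": "true",
            }
            return f"{base_url}?{urlencode(params)}"

        url = getmap_url("https://example.org/geoserver/wms",   # placeholder endpoint
                         "climate:temperature_anomaly",         # placeholder layer name
                         bbox=(-125.0, 24.0, -66.0, 50.0))
        print(url)
        # urllib.request.urlopen(url).read() would fetch the PNG if the endpoint existed.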

  18. An Open-Source Tool Set Enabling Analog-Digital-Software Co-Design

    Directory of Open Access Journals (Sweden)

    Michelle Collins

    2016-02-01

    Full Text Available This paper presents an analog-digital hardware-software co-design environment for simulating and programming reconfigurable systems. The tool supports simulation and design, as well as experimental measurements after compilation to configurable systems, within the same integrated design tool framework. High-level software in Scilab/Xcos (open-source programs similar to MATLAB/Simulink) converts the high-level block description by the user to blif format (sci2blif), which acts as an input to the modified VPR tool, including the code vpr2swcs, encoding the specific platform through specific architecture files and resulting in a targetable switch list for the resulting configurable analog–digital system. The resulting tool uses an analog and mixed-signal library of components, giving users and future researchers access to the basic analog operations/computations that are possible.

  19. HydroDesktop: An Open Source GIS-Based Platform for Hydrologic Data Discovery, Visualization, and Analysis

    Science.gov (United States)

    Ames, D. P.; Kadlec, J.; Cao, Y.; Grover, D.; Horsburgh, J. S.; Whiteaker, T.; Goodall, J. L.; Valentine, D. W.

    2010-12-01

    A growing number of hydrologic information servers are being deployed by government agencies, university networks, and individual researchers using the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS). The CUAHSI HIS Project has developed a standard software stack, called HydroServer, for publishing hydrologic observations data. It includes the Observations Data Model (ODM) database and Water Data Service web services, which together enable publication of data on the Internet in a standard format called Water Markup Language (WaterML). Metadata describing available datasets hosted on these servers is compiled within a central metadata catalog called HIS Central at the San Diego Supercomputer Center and is searchable through a set of predefined web services based queries. Together, these servers and central catalog service comprise a federated HIS of a scale and comprehensiveness never previously available. This presentation will briefly review/introduce the CUAHSI HIS system with special focus on a new HIS software tool called "HydroDesktop" and the open source software development web portal, www.HydroDesktop.org, which supports community development and maintenance of the software. HydroDesktop is a client-side, desktop software application that acts as a search and discovery tool for exploring the distributed network of HydroServers, downloading specific data series, visualizing and summarizing data series and exporting these to formats needed for analysis by external software. HydroDesktop is based on the open source DotSpatial GIS developer toolkit which provides it with map-based data interaction and visualization, and a plug-in interface that can be used by third party developers and researchers to easily extend the software using Microsoft .NET programming languages. HydroDesktop plug-ins that are presently available or currently under development within the project and by third party

  20. ThinkHazard!: an open-source, global tool for understanding hazard information

    Science.gov (United States)

    Fraser, Stuart; Jongman, Brenden; Simpson, Alanna; Nunez, Ariel; Deparday, Vivien; Saito, Keiko; Murnane, Richard; Balog, Simone

    2016-04-01

    Rapid and simple access to added-value natural hazard and disaster risk information is a key issue for various stakeholders of the development and disaster risk management (DRM) domains. Accessing available data often requires specialist knowledge of heterogeneous data, which are often highly technical and can be difficult for non-specialists in DRM to find and exploit. Thus, availability, accessibility and processing of these information sources are crucial issues, and an important reason why many development projects suffer significant impacts from natural hazards. The World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR) is currently developing a new open-source tool to address this knowledge gap: ThinkHazard! The main aim of the ThinkHazard! project is to develop an analytical tool dedicated to facilitating improvements in knowledge and understanding of natural hazards among non-specialists in DRM. It also aims at providing users with relevant guidance and information on handling the threats posed by the natural hazards present in a chosen location. Furthermore, all aspects of this tool will be open and transparent, in order to give users enough information to understand its operational principles. In this presentation, we will explain the technical approach behind the tool, which translates state-of-the-art probabilistic natural hazard data into understandable hazard classifications and practical recommendations. We will also demonstrate the functionality of the tool, and discuss limitations from a scientific as well as an operational perspective.

  1. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

    Background: The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results: We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions: The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools

  2. High-volume image quality assessment systems: tuning performance with an interactive data visualization tool

    Science.gov (United States)

    Bresnahan, Patricia A.; Pukinskis, Madeleine; Wiggins, Michael

    1999-03-01

    Image quality assessment systems differ greatly with respect to the number and types of images they need to evaluate, and their overall architectures. Managers of these systems, however, all need to be able to tune and evaluate system performance, requirements that are often overlooked or under-designed during project planning. Performance tuning tools allow users to define acceptable quality standards for image features and attributes by adjusting parameter settings. Performance analysis tools allow users to evaluate and/or predict how well a system performs in a given parameter state. While image assessment algorithms are becoming quite sophisticated, duplicating or surpassing the human decision making process in their speed and reliability, they often require a greater investment in 'training' or fine tuning of parameters in order to achieve optimum performance. This process may involve the analysis of hundreds or thousands of images, generating a large database of files and statistics that can be difficult to sort through and interpret. Compounding the difficulty is the fact that personnel charged with tuning and maintaining the production system may not have the statistical or analytical background required for the task. Meanwhile, hardware innovations have greatly increased the volume of images that can be handled in a given time frame, magnifying the consequences of running a production site with an inadequately tuned system. In this paper, some general requirements for a performance evaluation and tuning data visualization system are discussed. A custom engineered solution to the tuning and evaluation problem is then presented, developed within the context of a high volume image quality assessment, data entry, OCR, and image archival system. A key factor influencing the design of the system was the context-dependent definition of image quality, as perceived by a human interpreter. This led to the development of a five-level, hierarchical approach to image quality

  3. An Open-Source Label Atlas Correction Tool and Preliminary Results on Huntingtons Disease Whole-Brain MRI Atlases.

    Science.gov (United States)

    Forbes, Jessica L; Kim, Regina E Y; Paulsen, Jane S; Johnson, Hans J

    2016-01-01

    The creation of high-quality medical imaging reference atlas datasets with consistent dense anatomical region labels is a challenging task. Reference atlases have many uses in medical image applications and are essential components of atlas-based segmentation tools commonly used for producing personalized anatomical measurements for individual subjects. The process of manual identification of anatomical regions by experts is regarded as a so-called gold standard; however, it is usually impractical because of the labor-intensive costs. Further, as the number of regions of interest increases, these manually created atlases often contain many small, inconsistently labeled or disconnected regions that need to be identified and corrected. This project proposes an efficient process to drastically reduce the time necessary for manual revision in order to improve atlas label quality. We introduce the LabelAtlasEditor tool, a SimpleITK-based open-source label atlas correction tool distributed within the image visualization software 3D Slicer. LabelAtlasEditor incorporates several 3D Slicer widgets into one consistent interface and provides label-specific correction tools, allowing for rapid identification, navigation, and modification of the small, disconnected erroneous labels within an atlas. The technical details for the implementation and performance of LabelAtlasEditor are demonstrated using an application of improving a set of 20 Huntington's Disease-specific multi-modal brain atlases. Additionally, we present the advantages and limitations of automatic atlas correction. After the correction of atlas inconsistencies and small, disconnected regions, the number of unidentified voxels for each dataset was reduced on average by 68.48%.
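
    One operation such a tool needs, finding small disconnected islands of a given label so they can be flagged for correction, can be sketched with standard scientific Python (this is a hedged illustration, not LabelAtlasEditor's SimpleITK/3D Slicer code; the atlas array is synthetic).

        # Hedged sketch: flag small, disconnected components of one label in a 3D atlas.
        import numpy as np
        from scipy import ndimage

        atlas = np.zeros((40, 40, 40), dtype=np.int32)
        atlas[5:20, 5:20, 5:20] = 7          # main body of label 7
        atlas[30, 30, 30] = 7                # stray disconnected voxel of label 7

        def small_islands(label_volume, label, min_voxels=10):
            """Return (component_id, size) for connected components of `label` smaller than min_voxels."""
            components, n = ndimage.label(label_volume == label)
            sizes = ndimage.sum(np.ones_like(components), components, index=range(1, n + 1))
            return [(cid, int(sz)) for cid, sz in enumerate(sizes, start=1) if sz < min_voxels]

        print("suspicious islands of label 7:", small_islands(atlas, 7))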

  4. Methods and tools to evaluate the availability of renewable energy sources

    International Nuclear Information System (INIS)

    Angelis-Dimakis, Athanasios; Kartalidis, Avraam; Biberacher, Markus; Gadocha, Sabine; Dominguez, Javier; Pinedo, Irene; Fiorese, Giulia; Gnansounou, Edgard; Panichelli, Luis; Guariso, Giorgio; Robba, Michela

    2011-01-01

    The recent statements of both the European Union and the US Presidency pushed in the direction of using renewable forms of energy, in order to act against climate changes induced by the growing concentration of carbon dioxide in the atmosphere. In this paper, a survey regarding methods and tools presently available to determine potential and exploitable energy in the most important renewable sectors (i.e., solar, wind, wave, biomass and geothermal energy) is presented. Moreover, challenges for each renewable resource are highlighted as well as the available tools that can help in evaluating the use of a mix of different sources. (author)

  5. Visual Analysis as a design and decision-making tool in the development of a quarry

    Science.gov (United States)

    Randall Boyd Fitzgerald

    1979-01-01

    In order to obtain local and state government approvals, an environmental impact analysis of the mining and reclamation of a proposed hard rock quarry was required. High visibility of the proposed mining area from the adjacent community required a visual impact analysis in the planning and design of the project. The Visual Analysis defined design criteria for the...

  6. Improving Readability of an Evaluation Tool for Low-Income Clients Using Visual Information Processing Theories

    Science.gov (United States)

    Townsend, Marilyn S.; Sylva, Kathryn; Martin, Anna; Metz, Diane; Wooten-Swanson, Patti

    2008-01-01

    Literacy is an issue for many low-income audiences. Using visual information processing theories, the goal was improving readability of a food behavior checklist and ultimately improving its ability to accurately capture existing changes in dietary behaviors. Using group interviews, low-income clients (n = 18) evaluated 4 visual styles. The text…

  7. A nursing home staff tool for the indoor visual environment : the content validity

    NARCIS (Netherlands)

    Sinoo, M.M.; Kort, H.S.M.; Loomans, M.G.L.C.; Schols, J.M.G.A.

    2016-01-01

    In the Netherlands, over 40% of nursing home residents are estimated to have visual impairments. This results in the loss of basic visual abilities. The nursing home environment fits more or less to residents’ activities and social participation. This is referred to as environmental fit. To raise

  8. A nursing home staff tool for the indoor visual environment: The content validity

    NARCIS (Netherlands)

    Marcel G.L.C. Loomans; Dr. H.S.M. Kort; Marianne M. Sinoo; Jos M.G.A Schols

    2016-01-01

    In the Netherlands, over 40% of nursing home residents are estimated to have visual impairments. This results in the loss of basic visual abilities. The nursing home environment fits more or less to residents’ activities and social participation. This is referred to as environmental fit. To raise

  9. Book4All: A Tool to Make an e-Book More Accessible to Students with Vision/Visual-Impairments

    Science.gov (United States)

    Calabrò, Antonello; Contini, Elia; Leporini, Barbara

    Empowering people who are blind or otherwise visually impaired includes ensuring that products and electronic materials incorporate a broad range of accessibility features and work well with screen readers and other assistive technology devices. This is particularly important for students with vision impairments. Unfortunately, authors and publishers often do not include specific criteria when preparing the contents. Consequently, e-books can be inadequate for blind and low vision users, especially for students. In this paper we describe a semi-automatic tool developed to support operators who adapt e-documents for visually impaired students. The proposed tool can be used to convert a PDF e-book into a more suitable accessible and usable format readable on desktop computer or on mobile devices.

  10. Visual dynamic e-module as a tool to fulfill informational needs and care continuum for diabetic patients

    Directory of Open Access Journals (Sweden)

    Mohan Shinde

    2015-01-01

    Full Text Available Introduction: Diabetes can be envisaged as a lifelong phenomenon carrying considerable odds of multisystemic involvement over the duration of the disease. The probabilities of the occurrence of these events are influenced by the adopted lifestyle. Hence, information about the disease and lifestyle modification is vital from the perspective of prognosis. This study attempts to explore the potential of a "visual dynamic tool" for imparting knowledge and, consequently, the understanding acquired by diabetic patients. Objectives: To appraise the effectiveness of a constructed visual dynamic module (encompassing the various dimensions related to and affected by diabetes) by capturing the opinions, perceptions, and experiences of the diabetic patients who underwent intervention through the module. Materials and Methods: A visual e-module with dynamically imposed and animated images in the vernacular (Hindi) was prepared. This module was presented to the diabetic patients in a logical sequence over three consecutive days. All the diabetic patients who underwent this intervention were interviewed in depth in order to ascertain the effectiveness of the module. These interviews were analyzed by thematic and framework analyses. Result: The visual module was perceived by the diabetic patients as a visually engaging tool for receiving, connecting, and synthesizing information about diabetes. They reported that it was easy to connect with the images and described the received information as comprehensive. Conclusion: Initial evidence suggests that the visual e-module is an effective and efficient tool for knowledge management in diabetes. This issue may be further explored in diverse academic and clinical settings to gather more evidence of efficacy.

  11. Data Visualization with Flash Builder Designing RIA and AIR Applications with Remote Data Sources

    CERN Document Server

    Rocchi, Cesare

    2011-01-01

    Design and create functional applications that interact with remote data sources. You get a thorough introduction to the latest Flash Builder tools, learning how you can use the built-in wizards, MXML or pure ActionScript 3 to build information-rich applications for the browser or AIR applications. Hands-on tutorials guide you through each iteration, including building user interaction, charting, incorporating audio and video, and customizing the UI; and a code repository provides re-usable code that you can modify and deploy in your own applications.

  12. NASA World Wind, Open Source 4D Geospatial Visualization Platform: *.NET & Java*

    Science.gov (United States)

    Hogan, P.; Coughlan, J.

    2006-12-01

    NASA World Wind has only one goal, to provide the maximum opportunity for geospatial information to be experienced, be it education, science, research, business, or government. The benefits to understanding for information delivered in the context of its 4D virtual reality are extraordinary. The NASA World Wind visualization platform is open source and therefore lends itself well to being extended to service *any* requirements, be they proprietary and commercial or simply available. Data accessibility is highly optimized using standard formats including internationally certified open standards (W*S). Although proprietary applications can be built based on World Wind, and proprietary data delivered that leverage World Wind, there is nothing proprietary about the visualization platform itself or the multiple planetary data sets readily available, including global animations of live weather. NASA World Wind is being used by NASA research teams as well as being a formal part of high school and university curriculum. The National Guard uses World Wind for emergency response activities and State governments have incorporated high resolution imagery for GIS management as well as for their cross-agency emergency response activities. The U.S. federal government uses NASA World Wind for a myriad of GIS and security-related issues (NSA, NGA, DOE, FAA, etc.).

  13. NASA World Wind, Open Source 4D Geospatial Visualization Platform: *.NET & Java* for EDUCATION

    Science.gov (United States)

    Hogan, P.; Kuehnel, F.

    2006-12-01

    NASA World Wind has only one goal, to provide the maximum opportunity for geospatial information to be experienced, be it education, science, research, business, or government. The benefits to understanding for information delivered in the context of its 4D virtual reality are extraordinary. The NASA World Wind visualization platform is open source and therefore lends itself well to being extended to service *any* requirements, be they proprietary and commercial or simply available. Data accessibility is highly optimized using standard formats including internationally certified open standards (W*S). Although proprietary applications can be built based on World Wind, and proprietary data delivered that leverage World Wind, there is nothing proprietary about the visualization platform itself or the multiple planetary data sets readily available, including global animations of live weather. NASA World Wind is being used by NASA research teams as well as being a formal part of high school and university curriculum. The National Guard uses World Wind for emergency response activities and State governments have incorporated high resolution imagery for GIS management as well as for their cross-agency emergency response activities. The U.S. federal government uses NASA World Wind for a myriad of GIS and security-related issues (NSA, NGA, DOE, FAA, etc.).

  14. Vision Egg: an open-source library for realtime visual stimulus generation

    Directory of Open Access Journals (Sweden)

    Andrew D Straw

    2008-11-01

    Full Text Available Modern computer hardware makes it possible to produce visual stimuli in ways not previously possible. Arbitrary scenes, from traditional sinusoidal gratings to naturalistic 3D scenes, can now be specified on a frame-by-frame basis in realtime. I have developed a programming library called the Vision Egg that aims to make it easy to take advantage of these innovations. The Vision Egg is a free, open-source library making use of OpenGL and written in the high-level language Python with extensions in C. Careful attention has been paid to the issues of luminance and temporal calibration, and several techniques for interfacing with input devices such as mice, movement tracking systems, and digital triggers are discussed. Together, these make the Vision Egg suitable for many psychophysical, electrophysiological, and behavioral experiments. This software is available for free download at http://www.visionegg.org/.
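
    What frame-by-frame stimulus specification means in practice can be illustrated with a hedged NumPy sketch (plain arrays, not the actual Vision Egg/OpenGL API): a drifting sinusoidal grating is recomputed for every frame at a nominal 60 Hz refresh. The spatial frequency, drift rate and image size are arbitrary choices.

        # Hedged sketch of frame-by-frame generation of a drifting sinusoidal grating.
        import numpy as np

        def grating_frame(t, size=256, cycles_per_image=8.0, drift_hz=2.0, contrast=1.0):
            """Return one luminance frame (values in 0..1) of a horizontally drifting grating at time t seconds."""
            x = np.linspace(0.0, 1.0, size)
            phase = 2.0 * np.pi * drift_hz * t
            row = 0.5 + 0.5 * contrast * np.sin(2.0 * np.pi * cycles_per_image * x + phase)
            return np.tile(row, (size, 1))

        refresh_hz = 60.0
        frames = [grating_frame(i / refresh_hz) for i in range(int(refresh_hz))]  # one second of frames
        print(len(frames), frames[0].shape, frames[0].min(), frames[0].max())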

  15. An Open-Source Data Storage and Visualization Back End for Experimental Data

    DEFF Research Database (Denmark)

    Nielsen, Kenneth; Andersen, Thomas; Jensen, Robert

    2014-01-01

    In this article, a flexible free and open-source software system for data logging and presentation will be described. The system is highly modular and adaptable and can be used in any laboratory in which continuous and/or ad hoc measurements require centralized storage. The data stored consist both of specific measurements and of continuously logged system parameters. The latter is crucial to a variety of automation and surveillance features, and three cases of such features are described: monitoring system health, getting status... and to interfere with the experiment if needed. A presentation component for the data back end has furthermore been written that enables live visualization of data on any device capable of displaying Web pages. The system consists of three parts: data-logging clients, a data server, and a data presentation Web site. The logging of data from independent clients leads to high...

  16. WiseView: Visualizing motion and variability of faint WISE sources

    Science.gov (United States)

    Caselden, Dan; Westin, Paul, III; Meisner, Aaron; Kuchner, Marc; Colin, Guillaume

    2018-06-01

    WiseView renders image blinks of Wide-field Infrared Survey Explorer (WISE) coadds spanning a multi-year time baseline in a browser. The software allows for easy visual identification of motion and variability for sources far beyond the single-frame detection limit, a key threshold not surmounted by many studies. WiseView transparently gathers small image cutouts drawn from many terabytes of unWISE coadds, facilitating access to this large and unique dataset. Users need only input the coordinates of interest and can interactively tune parameters including the image stretch, colormap and blink rate. WiseView was developed in the context of the Backyard Worlds: Planet 9 citizen science project, and has enabled hundreds of brown dwarf candidate discoveries by citizen scientists and professional astronomers.

  17. Phylo-mLogo: an interactive and hierarchical multiple-logo visualization tool for alignment of many sequences

    Directory of Open Access Journals (Sweden)

    Lee DT

    2007-02-01

    Full Text Available Background: When aligning several hundreds or thousands of sequences, such as epidemic virus sequences or homologous/orthologous sequences of some big gene families, to reconstruct the epidemiological history or their phylogenies, how to analyze and visualize the alignment results of many sequences has become a new challenge for computational biologists. Although there are several tools available for visualization of very long sequence alignments, few of them are applicable to the alignments of many sequences. Results: A multiple-logo alignment visualization tool, called Phylo-mLogo, is presented in this paper. Phylo-mLogo calculates the variabilities and homogeneities of alignment sequences by base frequencies or entropies. Different from the traditional representations of sequence logos, Phylo-mLogo not only displays the global logo patterns of the whole alignment of multiple sequences, but also demonstrates their local homologous logos for each clade hierarchically. In addition, Phylo-mLogo also allows the user to focus only on the analysis of some important, structurally or functionally constrained sites in the alignment selected by the user or by built-in automatic calculation. Conclusion: With Phylo-mLogo, the user can symbolically and hierarchically visualize hundreds of aligned sequences simultaneously and easily check the changes of their amino acid sites when analyzing many homologous/orthologous or influenza virus sequences. More information on Phylo-mLogo can be found at URL http://biocomp.iis.sinica.edu.tw/phylomlogo.
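
    The per-column variability measure mentioned above (entropy of residue frequencies in an alignment column) can be sketched in a few lines of Python; this hedged example is not Phylo-mLogo's own code, and the toy alignment is invented.

        # Hedged sketch: Shannon entropy of each column of a small alignment.
        import math
        from collections import Counter

        alignment = ["ATGCCA", "ATGACA", "TTGCCA", "ATGCCT"]  # invented toy alignment

        def column_entropy(column):
            counts = Counter(column)
            total = sum(counts.values())
            return -sum((c / total) * math.log2(c / total) for c in counts.values())

        for i in range(len(alignment[0])):
            col = [seq[i] for seq in alignment]
            print(f"column {i}: {''.join(col)}  entropy = {column_entropy(col):.2f} bits")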

  18. Tracing catchment fine sediment sources using the new SIFT (SedIment Fingerprinting Tool) open source software.

    Science.gov (United States)

    Pulley, S; Collins, A L

    2018-09-01

    The mitigation of diffuse sediment pollution requires reliable provenance information so that measures can be targeted. Sediment source fingerprinting represents one approach for supporting these needs, but recent methodological developments have resulted in an increasing complexity of data processing methods rendering the approach less accessible to non-specialists. A comprehensive new software programme (SIFT; SedIment Fingerprinting Tool) has therefore been developed which guides the user through critical data analysis decisions and automates all calculations. Multiple source group configurations and composite fingerprints are identified and tested using multiple methods of uncertainty analysis. This aims to explore the sediment provenance information provided by the tracers more comprehensively than a single model, and allows for model configurations with high uncertainties to be rejected. This paper provides an overview of its application to an agricultural catchment in the UK to determine if the approach used can provide a reduction in uncertainty and increase in precision. Five source group classifications were used; three formed using a k-means cluster analysis containing 2, 3 and 4 clusters, and two a-priori groups based upon catchment geology. Three different composite fingerprints were used for each classification and bi-plots, range tests, tracer variability ratios and virtual mixtures tested the reliability of each model configuration. Some model configurations performed poorly when apportioning the composition of virtual mixtures, and different model configurations could produce different sediment provenance results despite using composite fingerprints able to discriminate robustly between the source groups. Despite this uncertainty, dominant sediment sources were identified, and those in close proximity to each sediment sampling location were found to be of greatest importance. This new software, by integrating recent methodological developments in
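
    The un-mixing step at the heart of sediment source fingerprinting can be illustrated with a hedged Python sketch (not the SIFT software itself): estimate the source proportions that, when mixed, best reproduce the tracer concentrations measured in a sediment sample, subject to the proportions being non-negative and summing to one. The source-group tracer means and sample concentrations below are invented.

        # Hedged sketch of a constrained mixing-model un-mixing of sediment tracers.
        import numpy as np
        from scipy.optimize import minimize

        source_means = np.array([   # rows: source groups, columns: tracers (invented values)
            [12.0,  3.1, 150.0],    # e.g. cultivated topsoil
            [ 6.5,  1.2, 310.0],    # e.g. channel banks / subsoil
            [20.0,  5.8,  90.0],    # e.g. road-derived material
        ])
        sample = np.array([10.0, 2.6, 210.0])

        def objective(p):
            mixed = p @ source_means
            return np.sum(((mixed - sample) / sample) ** 2)   # relative squared error

        n = source_means.shape[0]
        res = minimize(objective, x0=np.full(n, 1.0 / n),
                       bounds=[(0.0, 1.0)] * n,
                       constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1.0}])
        print("estimated source proportions:", np.round(res.x, 3))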

  19. Identifying Sources of Clinical Conflict: A Tool for Practice and Training in Bioethics Mediation.

    Science.gov (United States)

    Bergman, Edward J

    2015-01-01

    Bioethics mediators manage a wide range of clinical conflict emanating from diverse sources. Parties to clinical conflict are often not fully aware of, nor willing to express, the true nature and scope of their conflict. As such, a significant task of the bioethics mediator is to help define that conflict. The ability to assess and apply the tools necessary for an effective mediation process can be facilitated by each mediator's creation of a personal compendium of sources that generate clinical conflict, to provide an orientation for the successful management of complex dilemmatic cases. Copyright 2015 The Journal of Clinical Ethics. All rights reserved.

  20. Which energy mix for the UK (United Kingdom)? An evolutive descriptive mapping with the integrated GAIA (graphical analysis for interactive aid)–AHP (analytic hierarchy process) visualization tool

    International Nuclear Information System (INIS)

    Ishizaka, Alessio; Siraj, Sajid; Nemery, Philippe

    2016-01-01

    Although Multi-Criteria Decision Making methods have been extensively used in energy planning, their descriptive use has rarely been considered. In this paper, we add an evolutionary description phase as an extension to the AHP (analytic hierarchy process) method that helps policy makers to gain insights into their decision problems. The proposed extension has been implemented in open-source software that allows users to visualize differences of opinion within a decision process, as well as the evolution of preferences over time. The method was tested in a two-phase experiment to understand the evolution of opinions on energy sources. Participants were asked to provide their preferences for different energy sources for the next twenty years for the United Kingdom. They were first asked to compare the options intuitively without using any structured approach, and then were given three months to compare the same set of options after collecting detailed information on the technical, economic, environmental and social impacts created by each of the selected energy sources. The proposed visualization method allows us to quickly discover the preference directions, as well as the changes in preferences from the first to the second phase. The proposed tool can help policy makers better understand energy planning problems, leading towards better planning and decisions in the energy sector. - Highlights: • We introduce a descriptive visual analysis tool for the analytic hierarchy process. • The method has been implemented as an open-source preference elicitation tool. • We analyse user preferences in the energy sector using this method. • The tool also provides a way to visualize temporal preference changes. • The main negative temporal shift in the ranking was found for nuclear energy.
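
    For readers unfamiliar with the AHP step that the described tool extends, the sketch below (not the authors' software; the judgement matrix is invented) derives priority weights and a consistency index from a pairwise comparison matrix via its principal eigenvector:

      import numpy as np

      def ahp_weights(pairwise):
          """Priority weights and consistency index from a reciprocal judgement matrix."""
          vals, vecs = np.linalg.eig(pairwise)
          k = np.argmax(vals.real)                 # principal eigenvalue
          w = np.abs(vecs[:, k].real)
          w /= w.sum()
          n = pairwise.shape[0]
          ci = (vals.real[k] - n) / (n - 1)        # consistency index
          return w, ci

      # Hypothetical judgements for three energy options (e.g. wind, nuclear, gas).
      A = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])
      weights, ci = ahp_weights(A)
      print(weights, ci)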

  1. Visual exploration and analysis of ionospheric scintillation monitoring data: The ISMR Query Tool

    Science.gov (United States)

    Vani, Bruno César; Shimabukuro, Milton Hirokazu; Galera Monico, João Francisco

    2017-07-01

    Ionospheric Scintillations are rapid variations in the phase and/or amplitude of a radio signal as it passes through ionospheric plasma irregularities. The ionosphere is a specific layer of the Earth's atmosphere located approximately between 50 km and 1000 km above the Earth's surface. As Global Navigation Satellite Systems (GNSS) - such as GPS, Galileo, BDS and GLONASS - use radio signals, these variations degrade their positioning service quality. Due to its location, Brazil is one of the places most affected by scintillation in the world. For that reason, ionosphere monitoring stations have been deployed over Brazilian territory since 2011 through cooperative projects between several institutions in Europe and Brazil. Such monitoring stations compose a network that generates a large amount of monitoring data every day. GNSS receivers deployed at these stations - named Ionospheric Scintillation Monitor Receivers (ISMR) - provide scintillation indices and related signal metrics for available satellites dedicated to satellite-based navigation and positioning services. With this monitoring infrastructure, more than ten million observation values are generated and stored every day. Extracting the relevant information from this huge amount of data was a hard process and required the expertise of computer scientists and geoscientists. This paper describes the concepts, design and aspects related to the implementation of the software that has been supporting research on ISMR data - the so-called ISMR Query Tool. Usability and other aspects are also presented via examples of application. This web-based software has been designed and developed to provide insights into the huge amount of ISMR data fetched every day on an integrated platform. The software applies and adapts time series mining and information visualization techniques to extend the possibilities of exploring and analyzing ISMR data. The software is available to the scientific community through the
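
    The scintillation indices themselves come from the ISMR receivers, but as a pointer to what is being monitored, the amplitude scintillation index S4 is commonly defined as the normalized standard deviation of detrended signal intensity over a short interval (often 60 s); a small illustrative calculation, not part of the ISMR Query Tool:

      import numpy as np

      def s4_index(intensity):
          i = np.asarray(intensity, dtype=float)
          return np.sqrt((np.mean(i**2) - np.mean(i)**2) / np.mean(i)**2)

      rng = np.random.default_rng(0)
      quiet = 1.0 + 0.02 * rng.standard_normal(600)      # calm ionosphere
      disturbed = 1.0 + 0.40 * rng.standard_normal(600)  # strong scintillation
      print(s4_index(quiet), s4_index(disturbed))        # roughly 0.02 vs 0.4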

  2. Analysis and Visualization Tool for Targeted Amplicon Bisulfite Sequencing on Ion Torrent Sequencers.

    Directory of Open Access Journals (Sweden)

    Stephan Pabinger

    Full Text Available Targeted sequencing of PCR amplicons generated from bisulfite deaminated DNA is a flexible, cost-effective way to study methylation of a sample at single CpG resolution and perform subsequent multi-target, multi-sample comparisons. Currently, no platform specific protocol, support, or analysis solution is provided to perform targeted bisulfite sequencing on a Personal Genome Machine (PGM). Here, we present a novel tool, called TABSAT, for analyzing targeted bisulfite sequencing data generated on Ion Torrent sequencers. The workflow starts with raw sequencing data, performs quality assessment, and uses a tailored version of Bismark to map the reads to a reference genome. The pipeline visualizes results as lollipop plots and is able to deduce specific methylation-patterns present in a sample. The obtained profiles are then summarized and compared between samples. In order to assess the performance of the targeted bisulfite sequencing workflow, 48 samples were used to generate 53 different Bisulfite-Sequencing PCR amplicons from each sample, resulting in 2,544 amplicon targets. We obtained a mean coverage of 282X using 1,196,822 aligned reads. Next, we compared the sequencing results of these targets to the methylation level of the corresponding sites on an Illumina 450k methylation chip. The calculated average Pearson correlation coefficient of 0.91 confirms the sequencing results with one of the industry-leading CpG methylation platforms and shows that targeted amplicon bisulfite sequencing provides an accurate and cost-efficient method for DNA methylation studies, e.g., to provide platform-independent confirmation of Illumina Infinium 450k methylation data. TABSAT offers a novel way to analyze data generated by Ion Torrent instruments and can also be used with data from the Illumina MiSeq platform. It can be easily accessed via the Platomics platform, which offers a web-based graphical user interface along with sample and parameter storage
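
    The comparison reported above can be sketched as follows (not TABSAT code; the read counts and array beta values are invented): per-CpG methylation levels are derived from bisulfite read counts and correlated with array beta values using the Pearson coefficient.

      import numpy as np
      from scipy.stats import pearsonr

      methylated_reads   = np.array([90, 12, 55, 70,  5])
      unmethylated_reads = np.array([10, 88, 45, 30, 95])
      seq_level = methylated_reads / (methylated_reads + unmethylated_reads)

      array_beta = np.array([0.88, 0.15, 0.52, 0.71, 0.07])   # hypothetical 450k betas
      r, p = pearsonr(seq_level, array_beta)
      print(f"Pearson r = {r:.2f} (p = {p:.3g})")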

  3. An open source GIS-based tool to integrate the fragmentation mechanism in rockfall propagation

    Science.gov (United States)

    Matas, Gerard; Lantada, Nieves; Gili, Josep A.; Corominas, Jordi

    2015-04-01

    Rockfalls are frequent instability processes in road cuts, open pit mines and quarries, steep slopes and cliffs. Even though the stability of rock slopes can be determined using analytical approaches, the assessment of large rock cliffs requires simplifying assumptions due to the difficulty of working with a large number of joints and the scatter of both their orientations and strength parameters. The attitude and persistence of joints within the rock mass define the size of kinematically unstable rock volumes. Furthermore, the rock block will eventually split into several fragments during its propagation downhill due to its impacts with the ground surface. Knowledge of the size, energy, trajectory… of each block resulting from fragmentation is critical in determining the vulnerability of buildings and protection structures. The objective of this contribution is to present a simple, open source tool to simulate the fragmentation mechanism in rockfall propagation models and in the calculation of impact energies. This tool includes common modes of motion for falling boulders based on the previous literature. The final tool is being implemented in a GIS (Geographic Information System) using open source Python programming. The tool under development will be simple, modular, compatible with any GIS environment, open source, and able to model rockfall phenomena correctly. It could be used in any area susceptible to rockfalls after a prior adjustment of the parameters. After the adjustment of the model parameters to a given area, a simulation can be performed to obtain maps of kinetic energy, frequency, stopping density and passing heights. This GIS-based tool and the analysis of fragmentation laws using data collected from recent rockfalls are being developed within the RockRisk Project (2014-2016). This project is funded by the Spanish Ministerio de Economía y Competitividad and entitled "Rockfalls in cliffs: risk quantification and its prevention" (BIA2013-42582-P).

  4. VisPortal: Deploying grid-enabled visualization tools through a web-portal interface

    Energy Technology Data Exchange (ETDEWEB)

    Bethel, Wes; Siegerist, Cristina; Shalf, John; Shetty, Praveenkumar; Jankun-Kelly, T.J.; Kreylos, Oliver; Ma, Kwan-Liu

    2003-06-09

    The LBNL/NERSC Visportal effort explores ways to deliver advanced Remote/Distributed Visualization (RDV) capabilities through a Grid-enabled web-portal interface. The effort focuses on latency-tolerant distributed visualization algorithms, GUI designs that are more appropriate for the capabilities of web interfaces, and refactoring parallel-distributed applications to work in an N-tiered component deployment strategy. Most importantly, our aim is to leverage commercially-supported technology as much as possible in order to create a deployable, supportable, and hence viable platform for delivering grid-based visualization services to collaboratory users.

  5. The magnetic source imaging of pattern reversal stimuli of various visual fields

    International Nuclear Information System (INIS)

    Zhang Shuqian; Ye Yufang; Sun Jilin; Wu Jie; Jia Xiuchuan; Li Sumin; Wu Jing; Zhao Huadong; Liu Lianxiang; Wu Yujin

    2006-01-01

    Objective: To characterize the visual evoked fields of normal volunteers for full-field, vertical half-field and quadrant-field pattern-reversal stimulation, and to locate their dipoles by magnetoencephalography. Methods: The visual evoked fields for full-field, vertical half-field and quadrant-field stimulation were recorded in 13 subjects. The latency, dipole strength and dipole locations on the x, y and z axes were analyzed. The exact locations of the dipoles were determined by overlaying them on MR images. Results: The isocontour map of M100 for full-field stimulation demonstrated two separate sources. The two M100 dipoles had the same peak latency but different strengths. For vertical half-field and quadrant-field stimulation, the M100 evoked magnetic fields were distributed contralateral to the stimulated side. The M100 dipoles for lower quadrant-field stimulation were located significantly higher on the z-axis than those for upper quadrant-field stimulation. The Z value median of the left upper quadrant was 49.6 (35.1-72.8) mm and that of the left lower quadrant was 53.5 (44.8-76.3) mm; the difference between the two left quadrant medians, 3.9 mm, was significant (P<0.05). The Z value median of the right upper quadrant was 40.0 (34.8-44.6) mm and that of the right lower quadrant was 53.8 (40.6-61.3) mm; the difference between the two right quadrant medians, 13.8 mm, was also significant (P<0.05). Although the visual evoked field waveforms and dipole locations demonstrated large intra- and inter-individual variations, the M100 dipole was mainly located in Brodmann area 17, which includes the superior lingual gyrus, posterior cuneus-lingual gyrus and inferior cuneus gyrus. Conclusion: The M100 of the visual evoked fields elicited by pattern-reversal stimulation is mainly generated by neurons of the striate cortex contralateral to the stimulated side, at the lateral bottom of the calcarine fissure. (authors)

  6. WorldWide Telescope: A Newly Open Source Astronomy Visualization System

    Science.gov (United States)

    Fay, Jonathan; Roberts, Douglas A.

    2016-01-01

    After eight years of development by Microsoft Research, WorldWide Telescope (WWT) was made an open source project at the end of June 2015. WWT was motivated by the desire to put new surveys of objects, such as the Sloan Digital Sky Survey, in the context of the night sky. The development of WWT under Microsoft started with the creation of a Windows desktop client that is widely used in various education, outreach and research projects. Using this, users can explore the data built into WWT as well as data that is loaded in. Beyond exploration, WWT can be used to create tours that present various datasets in a narrative format. In the past two years, the team developed a collection of web controls, including an HTML5 web client, which contains much of the functionality of the Windows desktop client. The project under Microsoft has deep connections with several user communities, such as education through the WWT Ambassadors program, http://wwtambassadors.org/, and with planetariums and museums such as the Adler Planetarium. WWT can also support research, including its use to visualize the Bones of the Milky Way, and there are rich connections between WWT and the Astrophysics Data System (ADS, http://labs.adsabs.harvard.edu/adsabs/). One important new research connection is the use of WWT to create dynamic and potentially interactive supplements to journal articles, which were created in 2015. Now WWT is an open source, community-led project. The source code is available on GitHub (https://github.com/WorldWideTelescope). There is significant developer documentation on the website (http://worldwidetelescope.org/Developers/) and an extensive developer workshop (http://wwtworkshops.org/?tribe_events=wwt-developer-workshop) took place in the fall of 2015. Now that WWT is open source, anyone with an interest in the project can be a contributor. As important as helping out with coding, the project needs people interested in documentation, testing, training and other roles.

  7. iPhone Open Application Development Write Native Applications Using the Open Source Tool Chain

    CERN Document Server

    Zdziarski, Jonathan

    2008-01-01

    Developers everywhere are eager to create applications for the iPhone, and many of them prefer the open source, community-developed tool chain to Apple's own toolkit. This new edition of iPhone Open Application Development covers the latest version of the open toolkit -- now updated for Apple's iPhone 2.x software and iPhone 3G -- and explains in clear language how to create applications using Objective-C and the iPhone API.

  8. A Benchmarking Analysis of Open-Source Business Intelligence Tools in Healthcare Environments

    Directory of Open Access Journals (Sweden)

    Andreia Brandão

    2016-10-01

    Full Text Available In recent years, a wide range of Business Intelligence (BI) technologies have been applied to different areas in order to support the decision-making process. BI enables the extraction of knowledge from the data stored. The healthcare industry is no exception, and so BI applications have been under investigation across multiple units of different institutions. Thus, in this article, we intend to analyze some open-source/free BI tools on the market and their applicability in the clinical sphere, taking into consideration the general characteristics of the clinical environment. For this purpose, six BI tools were selected, analyzed, and tested in a practical environment. Then, a comparison metric and a ranking were defined for the tested applications in order to choose the one that best applies to the extraction of useful knowledge and clinical data in a healthcare environment. Finally, a pervasive BI platform was developed using a real case in order to prove the tool's viability.

  9. Interactive, open source, travel time scenario modelling: tools to facilitate participation in health service access analysis.

    Science.gov (United States)

    Fisher, Rohan; Lassa, Jonatan

    2017-04-18

    Modelling travel time to services has become a common public health tool for planning service provision but the usefulness of these analyses is constrained by the availability of accurate input data and limitations inherent in the assumptions and parameterisation. This is particularly an issue in the developing world where access to basic data is limited and travel is often complex and multi-modal. Improving the accuracy and relevance in this context requires greater accessibility to, and flexibility in, travel time modelling tools to facilitate the incorporation of local knowledge and the rapid exploration of multiple travel scenarios. The aim of this work was to develop simple open source, adaptable, interactive travel time modelling tools to allow greater access to and participation in service access analysis. Described are three interconnected applications designed to reduce some of the barriers to the more wide-spread use of GIS analysis of service access and allow for complex spatial and temporal variations in service availability. These applications are an open source GIS tool-kit and two geo-simulation models. The development of these tools was guided by health service issues from a developing world context but they present a general approach to enabling greater access to and flexibility in health access modelling. The tools demonstrate a method that substantially simplifies the process for conducting travel time assessments and demonstrate a dynamic, interactive approach in an open source GIS format. In addition this paper provides examples from empirical experience where these tools have informed better policy and planning. Travel and health service access is complex and cannot be reduced to a few static modeled outputs. The approaches described in this paper use a unique set of tools to explore this complexity, promote discussion and build understanding with the goal of producing better planning outcomes. The accessible, flexible, interactive and
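
    The core raster calculation behind such travel-time modelling can be sketched in a few lines (a simplified stand-in for the tool-kit above, not its actual code): accumulate travel time from service locations across a grid of per-cell crossing times with Dijkstra's algorithm.

      import heapq
      import math

      def travel_time(cost, sources):
          """cost[r][c] = minutes to cross a cell; sources = list of (row, col) service points."""
          rows, cols = len(cost), len(cost[0])
          best = [[math.inf] * cols for _ in range(rows)]
          heap = []
          for r, c in sources:
              best[r][c] = 0.0
              heapq.heappush(heap, (0.0, r, c))
          while heap:
              t, r, c = heapq.heappop(heap)
              if t > best[r][c]:
                  continue
              for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                  nr, nc = r + dr, c + dc
                  if 0 <= nr < rows and 0 <= nc < cols:
                      nt = t + 0.5 * (cost[r][c] + cost[nr][nc])
                      if nt < best[nr][nc]:
                          best[nr][nc] = nt
                          heapq.heappush(heap, (nt, nr, nc))
          return best

      walking_minutes = [[1, 1, 5, 5],
                         [1, 2, 5, 1],
                         [1, 1, 1, 1]]
      clinic_cells = [(0, 0)]
      for row in travel_time(walking_minutes, clinic_cells):
          print(["%.1f" % t for t in row])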

  10. Visualizing the Geography of the Diseases of China: Western Disease Maps from Analytical Tools to Tools of Empire, Sovereignty, and Public Health Propaganda, 1878-1929.

    Science.gov (United States)

    Hanson, Marta

    2017-09-01

    Argument This article analyzes for the first time the earliest western maps of diseases in China spanning fifty years from the late 1870s to the end of the 1920s. The 24 featured disease maps present a visual history of the major transformations in modern medicine from medical geography to laboratory medicine wrought on Chinese soil. These medical transformations occurred within new political formations from the Qing dynasty (1644-1911) to colonialism in East Asia (Hong Kong, Taiwan, Manchuria, Korea) and hypercolonialism within China (Tianjin, Shanghai, Amoy) as well as the new Republican Chinese nation state (1912-49). As a subgenre of persuasive graphics, physicians marshaled disease maps for various rhetorical functions within these different political contexts. Disease maps in China changed from being mostly analytical tools to functioning as tools of empire, national sovereignty, and public health propaganda legitimating new medical concepts, public health interventions, and political structures governing over human and non-human populations.

  11. Windmill Noise Annoyance, Visual Aesthetics, and Attitudes towards Renewable Energy Sources

    Science.gov (United States)

    Klæboe, Ronny; Sundfør, Hanne Beate

    2016-01-01

    A small focused socio-acoustic after-study of annoyance from a windmill park was undertaken after local health officials demanded a health impact study to look into neighborhood complaints. The windmill park consists of 31 turbines and is located in the South of Norway where it affects 179 dwellings. Simple exposure-effect relationships indicate stronger reactions to windmills and wind turbine noise than shown internationally, with the caveat that the sample size is small (n = 90) and responses are colored by the existing local conflict. Pulsating swishing sounds and turbine engine hum are the main causes of noise annoyance. About 60 per cent of those who participated in the survey were of the opinion that windmills degrade the landscape aesthetically, and were far from convinced that land-based windmills are desirable as a renewable energy source (hydropower is an important alternative source of renewables in Norway). Attitudes play an important role in addition to visual aesthetics in determining the acceptance of windmills and the resulting noise annoyance. To compare results from different wind turbine noise studies it seems necessary to assess the impact of important modifying factors. PMID:27455301

  12. Windmill Noise Annoyance, Visual Aesthetics, and Attitudes towards Renewable Energy Sources

    Directory of Open Access Journals (Sweden)

    Ronny Klæboe

    2016-07-01

    Full Text Available A small focused socio-acoustic after-study of annoyance from a windmill park was undertaken after local health officials demanded a health impact study to look into neighborhood complaints. The windmill park consists of 31 turbines and is located in the South of Norway where it affects 179 dwellings. Simple exposure-effect relationships indicate stronger reactions to windmills and wind turbine noise than shown internationally, with the caveat that the sample size is small (n = 90) and responses are colored by the existing local conflict. Pulsating swishing sounds and turbine engine hum are the main causes of noise annoyance. About 60 per cent of those who participated in the survey were of the opinion that windmills degrade the landscape aesthetically, and were far from convinced that land-based windmills are desirable as a renewable energy source (hydropower is an important alternative source of renewables in Norway). Attitudes play an important role in addition to visual aesthetics in determining the acceptance of windmills and the resulting noise annoyance. To compare results from different wind turbine noise studies it seems necessary to assess the impact of important modifying factors.

  13. An integrated, open-source set of tools for urban vulnerability monitoring from Earth observation data

    Science.gov (United States)

    De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel

    2015-04-01

    Aim: The paper introduces an integrated set of open-source tools designed to process medium and high-resolution imagery with the aim of extracting vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extent of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information (e.g. OrfeoToolbox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. Therefore, it is of significant importance to provide end-users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms is a combination of low-level image processing and geospatial information handling tools along with high-level workflows. In particular, two main products are released under the GPL license: the source code, oriented towards developers, and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance is guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: Given the lack of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs for already available models like the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform can guarantee support and enlarge the community of end-users. Keywords: Vulnerability monitoring, remote sensing, optical imagery, open-source software tools References [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30th January 2014

  14. The European source term code ESTER - basic ideas and tools for coupling of ATHLET and ESTER

    International Nuclear Information System (INIS)

    Schmidt, F.; Schuch, A.; Hinkelmann, M.

    1993-04-01

    During 1990 and 1991, the French software house CISI and IKE of the University of Stuttgart developed, in the frame of the Shared Cost Action Reactor Safety, the informatic structure of the European Source TERm Evaluation System (ESTER). This work made tools available which allow both code development and code application in the area of severe core accident research to be unified on a European basis. The behaviour of reactor cores is determined by thermal hydraulic conditions. Therefore, for the development of ESTER it was important to investigate how to integrate thermal hydraulic code systems with ESTER applications. This report describes the basic ideas of ESTER and improvements of the ESTER tools in view of a possible coupling of the thermal hydraulic code system ATHLET and ESTER. Owing to the work performed during this project, the ESTER tools became the most modern informatic tools presently available in the area of severe accident research. A sample application is given which demonstrates the use of the new tools. (orig.) [de

  15. Interactive web visualization tools to the results interpretation of a seismic risk study aimed at the emergency levels definition

    Science.gov (United States)

    Rivas-Medina, A.; Gutierrez, V.; Gaspar-Escribano, J. M.; Benito, B.

    2009-04-01

    Results of a seismic risk assessment study are often applied and interpreted by users unspecialised in the topic or lacking a scientific background. In this context, the availability of tools that help translate essentially scientific content to broader audiences (such as decision makers or civil defence officials), as well as represent and manage results in a user-friendly fashion, is of indubitable value. One such tool is the visualization tool VISOR-RISNA, a web tool developed within the RISNA project (financed by the Emergency Agency of Navarre, Spain) for regional seismic risk assessment of Navarre and the subsequent development of emergency plans. The RISNA study included seismic hazard evaluation, geotechnical characterization of soils, incorporation of site effects into expected ground motions, vulnerability distribution assessment and estimation of expected damage distributions for a 10% probability of exceedance in 50 years. The main goal of RISNA was the identification of higher-risk areas on which to focus future detailed, local-scale risk studies and the corresponding urban emergency plans. A geographic information system was used to combine different information layers, generate tables of results and represent maps with partial and final results. The visualization tool VISOR-RISNA is intended to facilitate the interpretation and representation of the collection of results, with the ultimate purpose of defining actuation plans. A number of criteria for defining actuation priorities are proposed in this work. They are based on combinations of risk parameters resulting from the risk study (such as expected ground motion, expected damage and exposed population), as determined by risk assessment specialists. Although the values that these parameters take are a result of the risk study, their distribution in several classes depends on the intervals defined by decision makers or civil defence officials. These criteria provide a ranking of
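
    Purely as an illustration of how classed risk parameters might be combined into an actuation ranking (the classes, weights and zone values below are invented and are not the criteria defined in the RISNA study):

      def priority_score(ground_motion_class, damage_class, population_class,
                         weights=(1.0, 1.5, 1.0)):
          """Each class is an integer from 1 (low) to 4 (high); higher score = act first."""
          w_g, w_d, w_p = weights
          return w_g * ground_motion_class + w_d * damage_class + w_p * population_class

      zones = {
          "Zone A": (3, 4, 2),
          "Zone B": (2, 2, 4),
          "Zone C": (4, 3, 1),
      }
      ranking = sorted(zones, key=lambda z: priority_score(*zones[z]), reverse=True)
      print(ranking)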

  16. 'tomo_display' and 'vol_tools': IDL VM Packages for Tomography Data Reconstruction, Processing, and Visualization

    Science.gov (United States)

    Rivers, M. L.; Gualda, G. A.

    2009-05-01

    One of the challenges in tomography is the availability of suitable software for image processing and analysis in 3D. We present here 'tomo_display' and 'vol_tools', two packages created in IDL that enable reconstruction, processing, and visualization of tomographic data. They complement in many ways the capabilities offered by Blob3D (Ketcham 2005 - Geosphere, 1: 32-41, DOI: 10.1130/GES00001.1) and, in combination, allow users without programming knowledge to perform all steps necessary to obtain qualitative and quantitative information using tomographic data. The package 'tomo_display' was created and is maintained by Mark Rivers. It allows the user to: (1) preprocess and reconstruct parallel beam tomographic data, including removal of anomalous pixels, ring artifact reduction, and automated determination of the rotation center, (2) visualization of both raw and reconstructed data, either as individual frames, or as a series of sequential frames. The package 'vol_tools' consists of a series of small programs created and maintained by Guilherme Gualda to perform specific tasks not included in other packages. Existing modules include simple tools for cropping volumes, generating histograms of intensity, sample volume measurement (useful for porous samples like pumice), and computation of volume differences (for differential absorption tomography). The module 'vol_animate' can be used to generate 3D animations using rendered isosurfaces around objects. Both packages use the same NetCDF format '.volume' files created using code written by Mark Rivers. Currently, only 16-bit integer volumes are created and read by the packages, but floating point and 8-bit data can easily be stored in the NetCDF format as well. A simple GUI to convert sequences of tiffs into '.volume' files is available within 'vol_tools'. Both 'tomo_display' and 'vol_tools' include options to (1) generate onscreen output that allows for dynamic visualization in 3D, (2) save sequences of tiffs to disk
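
    Two of the simpler 'vol_tools'-style operations, intensity histograms and volume differences, can be sketched on in-memory arrays (reading the NetCDF '.volume' files themselves is omitted here, since their internal variable layout is not assumed):

      import numpy as np

      rng = np.random.default_rng(1)
      vol_a = rng.integers(0, 60000, size=(64, 64, 64)).astype(np.uint16)
      vol_b = vol_a.copy()
      vol_b[20:40, 20:40, 20:40] += 500            # mimic differential absorption

      # Intensity histogram, e.g. for choosing a segmentation threshold.
      hist, edges = np.histogram(vol_a, bins=256)

      # Voxel-wise volume difference, as in differential absorption tomography.
      diff = vol_b.astype(np.int32) - vol_a.astype(np.int32)
      print(hist[:5], int(diff.max()))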

  17. SNPexp - A web tool for calculating and visualizing correlation between HapMap genotypes and gene expression levels

    Directory of Open Access Journals (Sweden)

    Franke Andre

    2010-12-01

    Full Text Available Abstract Background Expression levels for 47294 transcripts in lymphoblastoid cell lines from all 270 HapMap phase II individuals, and genotypes (both HapMap phase II and III) of 3.96 million single nucleotide polymorphisms (SNPs) in the same individuals are publicly available. We aimed to generate a user-friendly web based tool for visualization of the correlation between SNP genotypes within a specified genomic region and a gene of interest, which is also well-known as an expression quantitative trait locus (eQTL) analysis. Results SNPexp is implemented as a server-side script, and publicly available on this website: http://tinyurl.com/snpexp. Correlation between genotype and transcript expression levels are calculated by performing linear regression and the Wald test as implemented in PLINK and visualized using the UCSC Genome Browser. Validation of SNPexp using previously published eQTLs yielded comparable results. Conclusions SNPexp provides a convenient and platform-independent way to calculate and visualize the correlation between HapMap genotypes within a specified genetic region anywhere in the genome and gene expression levels. This allows for investigation of both cis and trans effects. The web interface and utilization of publicly available and widely used software resources makes it an attractive supplement to more advanced bioinformatic tools. For the advanced user the program can be used on a local computer on custom datasets.
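
    A minimal stand-in for the per-SNP regression that SNPexp delegates to PLINK (the genotype dosages and expression values below are invented; the real tool works with HapMap data and the Wald test as implemented in PLINK):

      import numpy as np
      from scipy.stats import linregress

      genotypes  = np.array([0, 0, 1, 1, 1, 2, 2, 0, 1, 2])   # minor-allele dosage per individual
      expression = np.array([5.1, 4.8, 5.9, 6.2, 5.7, 7.1, 6.8, 5.0, 6.0, 7.3])

      fit = linregress(genotypes, expression)
      print(f"beta = {fit.slope:.2f}, p = {fit.pvalue:.3g}, r^2 = {fit.rvalue**2:.2f}")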

  18. Monitoring CMS tracker construction and data quality using a Grid/Web service based on a visualization tool

    CERN Document Server

    Zito, Giuseppe; Regano, A

    2004-01-01

    The complexity of the CMS tracker (more than 50 million channels to monitor), now under construction in ten laboratories worldwide with hundreds of interested people, will require new tools for monitoring both the hardware and the software. In our approach we use both visualization tools and Grid services to make this monitoring possible. The use of visualization enables us to represent all those millions of channels at once on a single computer screen. The Grid will make it possible to get enough data and computing power to check every channel, and also to reach experts everywhere in the world, allowing the early discovery of problems. We report here on a first prototype developed using the Grid environment already available in CMS, i.e. LCG2. This prototype consists of a Java client which implements the GUI for tracker visualization and two data servers connected to the tracker construction database and to Grid catalogs of event datasets. All the communication between client and servers is done using ...

  19. Visualizing the Cardiac Cycle: A Useful Tool to Promote Student Understanding

    Directory of Open Access Journals (Sweden)

    Ivan Shun Ho

    2011-03-01

    Full Text Available The cardiac cycle is an important concept presented in human anatomy and physiology courses. At Kingsborough Community College, all Allied Health majors taking Anatomy & Physiology must understand the cardiac cycle to grasp more advanced concepts. Contemporary textbooks illustrate the cardiac cycle's concurrent events via linear models with overlapping line segments as physiological readouts. This presentation is appropriate for reference, but in the interactive classroom the promotion of understanding through clear, concise visual cues is essential. Muzio and Pilchman created a diagram to summarize events of the cardiac cycle. After discussions with one of the authors, I modified the diagram to aid visualization of the cycle and emphasize it as a repetitive, continuous process. A flow diagram presenting the portions of the cycle individually and progressively was also constructed. Three labeled phases are derived from the diagram, based on grouped events occurring at different points. The simple, compartmentalized, cyclical diagram presented here promotes understanding of the cardiac cycle visually.

  20. Visual operations management tools in oil pipelines and terminals standardization processes

    Energy Technology Data Exchange (ETDEWEB)

    De Ludovico Almeida, Maria Fatima [Pontifical Catholic University of Rio de Janeiro (Brazil); Santiago, Adilson; Senra Ribeiro, Kassandra; Mendonca Arruda, Daniela [Petrobras Transporte (Brazil)

    2010-07-01

    Visual operations management (VOM) takes advantage of visual cues to communicate information, simplify processes and improve the quality and safety of operations. Because of heightened competition, the importance of standardization and quality management processes has become more evident for pipeline companies. Petrobras Transporte's marine terminal units have been working over the last few years to be recognized as a reference in the activities they pursue. This is based on Petrobras Transporte's strategic plan 2020, which foresees, amongst other things, the specialization of the technical workforce, operational safety excellence, capital discipline, customer satisfaction, the search for new technologies and markets, and the rendering of new services. To achieve these goals, the Marine Terminals standardization program must be adhered to. Focusing on communication and adoption of standards and procedures, this paper describes how visual guides were conceived and implemented within Petrobras Transporte to enable operators and technicians to meet operational, environmental and occupational health and safety requirements.

  1. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    Science.gov (United States)

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  2. Visual Indicators on Vaccine Boxes as Early Warning Tools to Identify Potential Freeze Damage.

    Science.gov (United States)

    Angoff, Ronald; Wood, Jillian; Chernock, Maria C; Tipping, Diane

    2015-07-01

    The aim of this study was to determine whether the use of visual freeze indicators on vaccines would assist health care providers in identifying vaccines that may have been exposed to potentially damaging temperatures. Twenty-seven sites in Connecticut involved in the Vaccine for Children Program participated. In addition to standard procedures, visual freeze indicators (FREEZEmarker® L; Temptime Corporation, Morris Plains, NJ) were affixed to each box of vaccine that required refrigeration but must not be frozen. Temperatures were monitored twice daily. During the 24 weeks, all 27 sites experienced triggered visual freeze indicator events in 40 of the 45 refrigerators. A total of 66 triggered freeze indicator events occurred in all 4 types of refrigerators used. Only 1 of the freeze events was identified by a temperature-monitoring device. Temperatures recorded on vaccine data logs before freeze indicator events were within the 35°F to 46°F (2°C to 8°C) range in all but 1 instance. A total of 46,954 doses of freeze-sensitive vaccine were stored at the time of a visual freeze indicator event. Triggered visual freeze indicators were found on boxes containing 6566 doses (14.0% of total doses). Of all doses stored, 14,323 doses (30.5%) were of highly freeze-sensitive vaccine; 1789 of these doses (12.5%) had triggered indicators on the boxes. Visual freeze indicators are useful in the early identification of freeze events involving vaccines. Consideration should be given to including these devices as a component of the temperature-monitoring system for vaccines.

  3. A new energy analysis tool for ground source heat pump systems

    Energy Technology Data Exchange (ETDEWEB)

    Michopoulos, A.; Kyriakis, N. [Process Equipment Design Laboratory, Mechanical Engineering Department, Aristotle University of Thessaloniki, POB 487, 541 24 Thessaloniki (Greece)

    2009-09-15

    A new tool, suitable for energy analysis of vertical ground source heat pump systems, is presented. The tool is based on analytical equations describing the heat exchanged with the ground, developed in the Matlab® environment. The time step of the simulation can be freely chosen by the user (e.g. 1, 2 h etc.) and the calculation time required is very short. The heating and cooling loads of the building, at the aforementioned time step, are needed as input, along with the thermophysical properties of the soil and of the ground heat exchanger, the operation characteristic curves of the system's heat pumps and the basic ground source heat exchanger dimensions. The results include the electricity consumption of the system and the heat absorbed from or rejected to the ground. The efficiency of the tool is verified through comparison with actual electricity consumption data collected from an existing large-scale ground-coupled heat pump installation over a three-year period. (author)
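
    A hedged sketch of the kind of time-step balance such a tool performs (the analytical ground heat exchanger equations and the real heat pump characteristic curves are not reproduced; the loads and the COP curve below are invented):

      import numpy as np

      hours = 24
      heating_load = np.full(hours, 8.0)             # kW of building demand in each hour
      ground_temperature = 12.0                      # degC, assumed source temperature

      def cop_heating(t_source):
          """Hypothetical characteristic curve: COP rises with source temperature."""
          return 3.5 + 0.08 * (t_source - 10.0)

      cop = cop_heating(ground_temperature)
      electricity = heating_load / cop               # kWh consumed per hourly step
      heat_from_ground = heating_load - electricity  # energy balance on the ground side
      print(electricity.sum(), heat_from_ground.sum())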

  4. Effects of Various Sketching Tools on Visual Thinking in Idea Development

    Science.gov (United States)

    Chu, Po Ying; Hung, Hsiu Yen; Wu, Chih Fu; Liu, Yen Te

    2017-01-01

    Due to the wide application of digital tools and the improvement in interactive technologies, design thinking might change in digital world comparing to that in traditional design process. This study aims to explore the difference of design thinking between three kinds of sketching tools, i.e. hand-sketch, tablet, and pen-input display, by means…

  5. DeltaSA tool for source apportionment benchmarking, description and sensitivity analysis

    Science.gov (United States)

    Pernigotti, D.; Belis, C. A.

    2018-05-01

    DeltaSA is an R-package and a Java on-line tool developed at the EC-Joint Research Centre to assist and benchmark source apportionment applications. Its key functionalities support two critical tasks in this kind of study: the assignment of a factor to a source in factor analytical models (source identification) and the model performance evaluation. The source identification is based on the similarity between a given factor and source chemical profiles from public databases. The model performance evaluation is based on statistical indicators used to compare model output with reference values generated in intercomparison exercises. The reference values are calculated as the ensemble average of the results reported by participants that have passed a set of testing criteria based on chemical profiles and time series similarity. In this study, a sensitivity analysis of the model performance criteria is accomplished using the results of a synthetic dataset where "a priori" references are available. The consensus modulated standard deviation punc gives the best choice for the model performance evaluation when a conservative approach is adopted.
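
    The source-identification step, matching a factor profile against reference source profiles, can be illustrated as follows (not the DeltaSA code; the profiles are invented and the Pearson distance is only one possible similarity measure):

      import numpy as np

      def pearson_distance(a, b):
          return 1.0 - np.corrcoef(a, b)[0, 1]

      factor = np.array([0.30, 0.05, 0.20, 0.10, 0.35])        # chemical species fractions
      reference_profiles = {                                    # hypothetical database entries
          "traffic":         np.array([0.28, 0.07, 0.22, 0.08, 0.35]),
          "biomass burning": np.array([0.05, 0.40, 0.10, 0.30, 0.15]),
          "sea salt":        np.array([0.02, 0.03, 0.05, 0.60, 0.30]),
      }
      scores = {name: pearson_distance(factor, profile)
                for name, profile in reference_profiles.items()}
      print(min(scores, key=scores.get), scores)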

  6. A spectroscopic tool for identifying sources of origin for materials of military interest

    Science.gov (United States)

    Miziolek, Andrzej W.; De Lucia, Frank C.

    2014-05-01

    There is a need to identify the source of origin for many items of military interest, including ammunition and weapons that may be circulated and traded in illicit markets. Both fieldable systems (man-portable or handheld) as well as benchtop systems in field and home base laboratories are desired for screening and attribution purposes. Laser Induced Breakdown Spectroscopy (LIBS) continues to show significant capability as a promising new tool for materials identification, matching, and provenance. With the use of the broadband, high resolution spectrometer systems, the LIBS devices can not only determine the elemental inventory of the sample, but they are also capable of elemental fingerprinting to signify sources of origin of various materials. We present the results of an initial study to differentiate and match spent cartridges from different manufacturers and countries. We have found that using Partial Least Squares Discriminant Analysis (PLS-DA) we are able to achieve on average 93.3% True Positives and 5.3% False Positives. These results add to the large body of publications that have demonstrated that LIBS is a particularly suitable tool for source of origin determinations.
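
    One common way to set up a PLS-DA classifier of spectra is sketched below with synthetic data (the authors' preprocessing, spectral ranges and class structure are not known, so everything here is illustrative): PLS regression against one-hot class labels, with the predicted class taken as the argmax.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(2)
      n_per_class, n_channels = 20, 100
      # Synthetic "spectra" for three manufacturers, each with its own emission line.
      X = np.vstack([rng.normal(0, 1, (n_per_class, n_channels))
                     + 5 * np.eye(n_channels)[10 * (k + 1)]
                     for k in range(3)])
      y = np.repeat(np.arange(3), n_per_class)
      Y = np.eye(3)[y]                                  # one-hot encoding of the classes

      pls = PLSRegression(n_components=5).fit(X, Y)
      predicted = pls.predict(X).argmax(axis=1)
      print("training accuracy:", (predicted == y).mean())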

  7. EpiTools: An Open-Source Image Analysis Toolkit for Quantifying Epithelial Growth Dynamics.

    Science.gov (United States)

    Heller, Davide; Hoppe, Andreas; Restrepo, Simon; Gatti, Lorenzo; Tournier, Alexander L; Tapon, Nicolas; Basler, Konrad; Mao, Yanlan

    2016-01-11

    Epithelia grow and undergo extensive rearrangements to achieve their final size and shape. Imaging the dynamics of tissue growth and morphogenesis is now possible with advances in time-lapse microscopy, but a true understanding of their complexities is limited by the automated image analysis tools available to extract quantitative data. To overcome such limitations, we have designed a new open-source image analysis toolkit called EpiTools. It provides user-friendly graphical user interfaces for accurately segmenting and tracking the contours of cell membrane signals obtained from 4D confocal imaging. It is designed for a broad audience, especially biologists with no computer-science background. Quantitative data extraction is integrated into a larger bioimaging platform, Icy, to increase the visibility and usability of our tools. We demonstrate the usefulness of EpiTools by analyzing Drosophila wing imaginal disc growth, revealing previously overlooked properties of this dynamic tissue, such as the patterns of cellular rearrangements. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  8. jSPyDB, an open source database-independent tool for data management

    CERN Document Server

    Pierro, Giuseppe Antonio

    2010-01-01

    Nowadays, the number of commercial tools available for accessing Databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only with a specific kind of database, they are platform-dependent and very CPU and memory consuming. jSPyDB is a free web based tool written using Python and Javascript. It relies on jQuery and python libraries, and is intended to provide a simple handler to different Database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install, and to configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a backend server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. ...
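
    The database-agnostic, server-side idea can be shown in miniature with SQLAlchemy (a sketch, not jSPyDB itself; the SQLite file and table here are invented, and only the connection URL would change for another database technology):

      from sqlalchemy import create_engine, inspect, text

      engine = create_engine("sqlite:///example.db")   # e.g. postgresql://... or mysql://...

      with engine.begin() as conn:                     # throwaway demo data
          conn.execute(text("CREATE TABLE IF NOT EXISTS runs (id INTEGER, status TEXT)"))
          conn.execute(text("INSERT INTO runs VALUES (1, 'good'), (2, 'bad')"))

      with engine.connect() as conn:
          for table in inspect(engine).get_table_names():
              rows = conn.execute(text(f"SELECT * FROM {table} LIMIT 5")).fetchall()
              print(table, rows)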

  9. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database

    International Nuclear Information System (INIS)

    Quock, D.E.R.; Cianciarulo, M.B.

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.

  10. ProteoWizard: open source software for rapid proteomics tools development.

    Science.gov (United States)

    Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag

    2008-11-01

    The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access, and performs standard proteomics and LCMS dataset computations. The library contains readers and writers of the mzML data format, which has been written using modern C++ techniques and design principles and supports a variety of platforms with native compilers. The software has been specifically released under the Apache v2 license to ensure it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e. GCC on Linux, MSVC on Windows and XCode on OSX) is available for download free of charge, at http://proteowizard.sourceforge.net. This website also provides code examples, and documentation. It is our hope the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution and further development are strongly encouraged.

  11. EPIPOI: A user-friendly analytical tool for the extraction and visualization of temporal parameters from epidemiological time series

    Directory of Open Access Journals (Sweden)

    Alonso Wladimir J

    2012-11-01

    Full Text Available Abstract Background There is an increasing need for processing and understanding relevant information generated by the systematic collection of public health data over time. However, the analysis of those time series usually requires advanced modeling techniques, which are not necessarily mastered by staff, technicians and researchers working on public health and epidemiology. Here a user-friendly tool, EPIPOI, is presented that facilitates the exploration and extraction of parameters describing trends, seasonality and anomalies that characterize epidemiological processes. It also enables the inspection of those parameters across geographic regions. Although the visual exploration and extraction of relevant parameters from time series data is crucial in epidemiological research, until now it had been largely restricted to specialists. Methods EPIPOI is freely available software developed in Matlab (The Mathworks Inc.) that runs both on PC and Mac computers. Its friendly interface guides users intuitively through useful comparative analyses including the comparison of spatial patterns in temporal parameters. Results EPIPOI is able to handle complex analyses in an accessible way. A prototype has already been used to assist researchers in a variety of contexts from didactic use in public health workshops to the main analytical tool in published research. Conclusions EPIPOI can assist public health officials and students to explore time series data using a broad range of sophisticated analytical and visualization tools. It also provides an analytical environment where even advanced users can benefit by enabling a higher degree of control over model assumptions, such as those associated with detecting disease outbreaks and pandemics.
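
    A compact example of the trend-plus-seasonality extraction that such a tool exposes through its interface (assumptions: monthly counts and a single annual harmonic; this is not EPIPOI's Matlab code):

      import numpy as np

      rng = np.random.default_rng(3)
      t = np.arange(120)                               # ten years of monthly observations
      series = 50 + 0.1 * t + 20 * np.cos(2 * np.pi * (t - 1) / 12) + rng.normal(0, 3, t.size)

      X = np.column_stack([np.ones_like(t), t,
                           np.cos(2 * np.pi * t / 12), np.sin(2 * np.pi * t / 12)])
      coef, *_ = np.linalg.lstsq(X, series, rcond=None)
      intercept, trend, a, b = coef
      amplitude = np.hypot(a, b)                       # seasonal amplitude
      peak_month = (np.arctan2(b, a) * 12 / (2 * np.pi)) % 12
      print(f"trend = {trend:.2f}/month, amplitude = {amplitude:.1f}, peak near month {peak_month:.1f}")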

  12. SimilarityExplorer: A visual inter-comparison tool for multifaceted climate data

    Science.gov (United States)

    J. Poco; A. Dasgupta; Y. Wei; W. Hargrove; C. Schwalm; R. Cook; E. Bertini; C. Silva

    2014-01-01

    Inter-comparison and similarity analysis to gauge consensus among multiple simulation models is a critical visualization problem for understanding climate change patterns. Climate models, specifically, Terrestrial Biosphere Models (TBM) represent time and space variable ecosystem processes, for example, simulations of photosynthesis and respiration, using algorithms...

  13. Student profiling on university co-curriculum activities using data visualization tools

    Science.gov (United States)

    Jamil, Jastini Mohd.; Shaharanee, Izwan Nizal Mohd

    2017-11-01

    Co-curricular activities play a vital role in the development of a holistic student. The co-curriculum can be described as an extension of the formal learning experiences in a course or academic program. There are many co-curricular activities, such as students' participation in sports, volunteerism, leadership, entrepreneurship, uniform bodies, the student council, and other social events. The number of students involved in co-curricular activities is large, creating an enormous volume of data, including their demographic facts, academic performance and co-curriculum types. The task of discovering and analyzing this information becomes increasingly difficult and hard to comprehend. Data visualization offers a better way of handling large volumes of information. An understanding of these various co-curricular activities and their effect on student performance is essential. Visualizing this information can help the related stakeholders become aware of hidden and interesting patterns in the large amounts of student data they hold. The main objective of this study is to provide a clearer understanding of the different trends hidden in the student co-curriculum activity data in relation to students' activities and academic performances. Data visualization software was used to help visualize the data extracted from the database.

  14. Efficient analysis using custom interactive visualization tools at a Superfund site

    International Nuclear Information System (INIS)

    Williams, G.; Durham, L.

    1992-01-01

    Custom visualization analysis programs were developed and used to analyze contaminant transport calculations from a three-dimensional numerical groundwater flow model developed for a Department of Energy Superfund site. The site hydrogeology, which is highly heterogeneous, includes both fractured limestone and dolomite and alluvium deposits. Three-dimensional interactive visualization techniques were used to understand and analyze the three-dimensional, double-porosity modeling results. A graphical, object-oriented programming environment was applied to efficiently develop custom visualization programs in a coarse-grained data structure language. Comparisons were made, using the results from the three-dimensional, finite-difference model, between traditional two-dimensional analyses (contour and vector plots) and interactive three-dimensional techniques. Subjective comparison areas include the accuracy of analysis, the ability to understand the results of three-dimensional contaminant transport simulation, and the capability to transmit the results of the analysis to the project management. In addition, a quantitative comparison was made on the time required to develop a thorough analysis of the modeling results. The conclusions from the comparative study showed that the visualization analysis provided an increased awareness of the contaminant transport mechanisms, provided new insights into contaminant migration, and resulted in a significant time savings

  15. A Visual Encapsulation of Adlerian Theory: A Tool for Teaching and Learning.

    Science.gov (United States)

    Osborn, Cynthia J.

    2001-01-01

    A visual diagram is presented in this article to illustrate 6 key concepts of Adlerian theory discussed in corresponding narrative format. It is proposed that in an age of multimedia learning, a pictorial reference can enhance the teaching and learning of Adlerian theory, representing a commitment to humanistic education. (Contains 18 references.)…

  16. Efficient analysis using custom interactive visualization tools at a Superfund site

    Energy Technology Data Exchange (ETDEWEB)

    Williams, G. [Northwestern Univ., Evanston, IL (United States); Durham, L. [Argonne National Lab., IL (United States)

    1992-12-01

    Custom visualization analysis programs were developed and used to analyze contaminant transport calculations from a three-dimensional numerical groundwater flow model developed for a Department of Energy Superfund site. The site hydrogeology, which is highly heterogeneous, includes both fractured limestone and dolomite and alluvium deposits. Three-dimensional interactive visualization techniques were used to understand and analyze the three-dimensional, double-porosity modeling results. A graphical, object-oriented programming environment was applied to efficiently develop custom visualization programs in a coarse-grained data structure language. Comparisons were made, using the results from the three-dimensional, finite-difference model, between traditional two-dimensional analyses (contour and vector plots) and interactive three-dimensional techniques. Subjective comparison areas include the accuracy of analysis, the ability to understand the results of three-dimensional contaminant transport simulation, and the capability to transmit the results of the analysis to the project management. In addition, a quantitative comparison was made on the time required to develop a thorough analysis of the modeling results. The conclusions from the comparative study showed that the visualization analysis provided an increased awareness of the contaminant transport mechanisms, provided new insights into contaminant migration, and resulted in a significant time savings.

  17. Using a free software tool for the visualization of complicated electromagnetic fields

    International Nuclear Information System (INIS)

    Murello, A; Milotti, E

    2014-01-01

    Here, we show how a readily available and free scientific visualization program—ParaView—can be used to display electric fields in interesting situations. We give a few examples and specify the individual steps that lead to highly educational representations of the fields. (paper)
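
    One simple way (not necessarily the authors' set-up) to feed a computed field into ParaView is to sample it on a grid with NumPy and write a CSV file, which ParaView can load and display through its Table To Points and Glyph filters; here, the field of two opposite point charges, up to a constant factor:

      import numpy as np

      charges = [(+1.0, np.array([-0.5, 0.0, 0.0])),   # (charge, position), arbitrary units
                 (-1.0, np.array([+0.5, 0.0, 0.0]))]

      axis = np.linspace(-2.0, 2.0, 21)
      X, Y, Z = np.meshgrid(axis, axis, [0.0], indexing="ij")
      points = np.stack([X.ravel(), Y.ravel(), Z.ravel()], axis=1)

      E = np.zeros_like(points)
      for q, pos in charges:
          r = points - pos
          d = np.linalg.norm(r, axis=1, keepdims=True)
          E += q * r / d**3                            # Coulomb field up to a constant

      np.savetxt("dipole_field.csv", np.hstack([points, E]),
                 delimiter=",", header="x,y,z,Ex,Ey,Ez", comments="")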

  18. SlicerAstro : A 3-D interactive visual analytics tool for HI data

    NARCIS (Netherlands)

    Punzo, D.; van der Hulst, J. M.; Roerdink, J. B. T. M.; Fillion-Robin, J. C.; Yu, L.

    SKA precursors are capable of detecting hundreds of galaxies in HI in a single 12 h pointing. In deeper surveys one will probe more easily faint HI structures, typically located in the vicinity of galaxies, such as tails, filaments, and extraplanar gas. The importance of interactive visualization in

  19. Knowledge Visualizations: A Tool to Achieve Optimized Operational Decision Making and Data Integration

    Science.gov (United States)

    2015-06-01

    based upon a pyramid of feedback loops, FFIRs, or PIRs. Reports, in response to FFIRs and PIRs, forward information up the chain of command as a… Communication, The American University, Cairo, Egypt. Keim, D. A., Mansmann, F., Schneidewind, J., & Ziegler, H. (2006). Challenges in visual data…

  20. 42: An Open-Source Simulation Tool for Study and Design of Spacecraft Attitude Control Systems

    Science.gov (United States)

    Stoneking, Eric

    2018-01-01

    Simulation is an important tool in the analysis and design of spacecraft attitude control systems. The speaker will discuss the simulation tool, called simply 42, that he has developed over the years to support his own work as an engineer in the Attitude Control Systems Engineering Branch at NASA Goddard Space Flight Center. 42 was intended from the outset to be high-fidelity and powerful, but also fast and easy to use. 42 is publicly available as open source since 2014. The speaker will describe some of 42's models and features, and discuss its applicability to studies ranging from early concept studies through the design cycle, integration, and operations. He will outline 42's architecture and share some thoughts on simulation development as a long-term project.

  1. An Open-Source Web-Based Tool for Resource-Agnostic Interactive Translation Prediction

    Directory of Open Access Journals (Sweden)

    Daniel Torregrosa

    2014-09-01

    Full Text Available We present a web-based open-source tool for interactive translation prediction (ITP) and describe its underlying architecture. ITP systems assist human translators by making context-based computer-generated suggestions as they type. Most of the ITP systems in the literature are strongly coupled with a statistical machine translation system that is conveniently adapted to provide the suggestions. Our system, however, follows a resource-agnostic approach and suggestions are obtained from any unmodified black-box bilingual resource. This paper reviews our ITP method and describes the architecture of Forecat, a web tool, partly based on the recent technology of web components, that eases the use of our ITP approach in any web application requiring this kind of translation assistance. We also evaluate the performance of our method when using an unmodified Moses-based statistical machine translation system as the bilingual resource.

  2. WannierTools: An open-source software package for novel topological materials

    Science.gov (United States)

    Wu, QuanSheng; Zhang, ShengNan; Song, Hai-Feng; Troyer, Matthias; Soluyanov, Alexey A.

    2018-03-01

    We present an open-source software package, WannierTools, a tool for the investigation of novel topological materials. The code works in the tight-binding framework, which can be generated by another software package, Wannier90 (Mostofi et al., 2008). It can help to classify the topological phase of a given material by calculating the Wilson loop, and can obtain the surface-state spectrum, which is detected in angle-resolved photoemission spectroscopy (ARPES) and scanning tunneling microscopy (STM) experiments. It also identifies the positions of Weyl/Dirac points and nodal-line structures, calculates the Berry phase around a closed momentum loop, and computes the Berry curvature in a part of the Brillouin zone (BZ).
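
    For readers who want to see the Berry-phase-around-a-loop calculation in its simplest form (this is an independent illustration, not WannierTools' API; the two-band Dirac Hamiltonian and loop radius are assumptions of the example), the discrete Wilson-loop product below returns a phase of magnitude close to pi for a loop enclosing the band-touching point.

      import numpy as np

      # Two-band model H(k) = kx*sigma_x + ky*sigma_y with a band touching at k = 0.
      sx = np.array([[0, 1], [1, 0]], dtype=complex)
      sy = np.array([[0, -1j], [1j, 0]], dtype=complex)

      def lower_band_state(kx, ky):
          """Eigenvector of the lower band at (kx, ky); eigh sorts eigenvalues ascending."""
          _, v = np.linalg.eigh(kx * sx + ky * sy)
          return v[:, 0]

      # Discretize a closed circular loop in k-space around the touching point.
      nk, radius = 200, 0.5
      angles = np.linspace(0.0, 2.0 * np.pi, nk, endpoint=False)
      states = [lower_band_state(radius * np.cos(a), radius * np.sin(a)) for a in angles]

      # Gauge-invariant Wilson loop: product of overlaps between neighbouring states.
      w = 1.0 + 0.0j
      for j in range(nk):
          w *= np.vdot(states[j], states[(j + 1) % nk])

      berry_phase = -np.angle(w)              # expected magnitude: pi (mod 2*pi)
      print(f"Berry phase around the loop: {berry_phase:.4f} rad")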

  3. A simple quality assurance test tool for the visual verification of light and radiation field congruence using an electronic portal imaging device and computed radiography

    Directory of Open Access Journals (Sweden)

    Njeh Christopher F

    2012-03-01

    Full Text Available Abstract Background The radiation field on most megavoltage radiation therapy units is shown by a light field projected through the collimator by a light source mounted inside the collimator. The light field is traditionally used for patient alignment. Hence it is imperative that the light field is congruent with the radiation field. Method A simple quality assurance tool has been designed for a rapid and simple test of light and radiation field congruence using an electronic portal imaging device (EPID) or computed radiography (CR). We tested this QA tool using Varian PortalVision and Elekta iViewGT EPID systems and a Kodak CR system. Results Both the single and double exposure techniques were evaluated, with the double exposure technique providing better visualization of the light-radiation field markers. Light and radiation field congruence could be detected to within 1 mm. This satisfies the American Association of Physicists in Medicine Task Group 142 report recommendation of a 2 mm tolerance. Conclusion The QA tool can be used with either an EPID or CR to provide a simple and rapid method to verify light and radiation field congruence.

  4. A simple quality assurance test tool for the visual verification of light and radiation field congruence using an electronic portal imaging device and computed radiography

    International Nuclear Information System (INIS)

    Njeh, Christopher F; Caroprese, Blas; Desai, Pushkar

    2012-01-01

    The radiation field on most megavoltage radiation therapy units is shown by a light field projected through the collimator by a light source mounted inside the collimator. The light field is traditionally used for patient alignment. Hence it is imperative that the light field is congruent with the radiation field. A simple quality assurance tool has been designed for a rapid and simple test of light and radiation field congruence using an electronic portal imaging device (EPID) or computed radiography (CR). We tested this QA tool using Varian PortalVision and Elekta iViewGT EPID systems and a Kodak CR system. Both the single and double exposure techniques were evaluated, with the double exposure technique providing better visualization of the light-radiation field markers. Light and radiation field congruence could be detected to within 1 mm. This satisfies the American Association of Physicists in Medicine Task Group 142 report recommendation of a 2 mm tolerance. The QA tool can be used with either an EPID or CR to provide a simple and rapid method to verify light and radiation field congruence.

  5. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    Energy Technology Data Exchange (ETDEWEB)

    Etmektzoglou, A; Mishra, P; Svatos, M [Varian Medical Systems, Palo Alto, CA (United States)]

    2015-06-15

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment also allows for parameterization of trajectories, thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline, all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes, such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly
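
    The Developer Mode XML schema itself is not reproduced in the abstract, so the sketch below only illustrates the parameterization idea it describes: a circular "virtual isocenter"-style trajectory defined by a few variables and tabulated against MU, written to a CSV file for inspection; all column names and numeric values are hypothetical.

      import csv
      import numpy as np

      # Hypothetical parameterization: a circular couch excursion synchronized with
      # gantry rotation, sampled against monitor units (MU). Not the actual XML schema.
      total_mu, n_pts = 100.0, 181
      radius_cm, gantry_start_deg, gantry_stop_deg = 5.0, -179.0, 179.0

      mu = np.linspace(0.0, total_mu, n_pts)
      gantry = np.linspace(gantry_start_deg, gantry_stop_deg, n_pts)
      phase = np.linspace(0.0, 2.0 * np.pi, n_pts)
      couch_lat = radius_cm * np.cos(phase)       # cm, lateral couch offset
      couch_lng = radius_cm * np.sin(phase)       # cm, longitudinal couch offset

      with open("circle_trajectory.csv", "w", newline="") as f:
          writer = csv.writer(f)
          writer.writerow(["MU", "gantry_deg", "couch_lat_cm", "couch_lng_cm"])
          for row in zip(mu, gantry, couch_lat, couch_lng):
              writer.writerow([f"{v:.3f}" for v in row])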

  6. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    International Nuclear Information System (INIS)

    Etmektzoglou, A; Mishra, P; Svatos, M

    2015-01-01

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment also allows for parameterization of trajectories, thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline, all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes, such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly

  7. Genome sequencing of bacteria: sequencing, de novo assembly and rapid analysis using open source tools.

    Science.gov (United States)

    Kisand, Veljo; Lettieri, Teresa

    2013-04-01

    De novo genome sequencing of previously uncharacterized microorganisms has the potential to open up new frontiers in microbial genomics by providing insight into both functional capabilities and biodiversity. Until recently, Roche 454 pyrosequencing was the NGS method of choice for de novo assembly because it generates hundreds of thousands of long reads. Tools for processing NGS data are increasingly free and open source and are often adopted for both their high quality and their role in promoting academic freedom. The error rate of pyrosequencing the Alcanivorax borkumensis genome was such that thousands of insertions and deletions were artificially introduced into the finished genome. Despite a high coverage (~30 fold), it did not allow the reference genome to be fully mapped. Reads from regions with errors had low quality, low coverage, or were missing. The main defect of the reference mapping was the introduction of artificial indels into contigs through lower than 100% consensus and disrupting gene calling due to artificial stop codons. No assembler was able to perform de novo assembly comparable to reference mapping. Automated annotation tools performed similarly on reference mapped and de novo draft genomes, and annotated most CDSs in the de novo assembled draft genomes. Free and open source software (FOSS) tools for assembly and annotation of NGS data are being developed rapidly to provide accurate results with less computational effort. Usability is not a high priority, and these tools currently do not allow the data to be processed without manual intervention. Despite this, genome assemblers now readily assemble medium short reads into long contigs (>97-98% genome coverage). A notable gap in pyrosequencing technology is the quality of base pair calling and conflicting base pairs between single reads at the same nucleotide position. Regardless, using draft whole genomes that are not finished and remain fragmented into tens of contigs allows one to characterize

  8. Pika: A snow science simulation tool built using the open-source framework MOOSE

    Science.gov (United States)

    Slaughter, A.; Johnson, M.

    2017-12-01

    The Department of Energy (DOE) is currently investing millions of dollars annually into various modeling and simulation tools for all aspects of nuclear energy. An important part of this effort includes developing applications based on the open-source Multiphysics Object Oriented Simulation Environment (MOOSE; mooseframework.org) from Idaho National Laboratory (INL). Thanks to the efforts of the DOE and outside collaborators, MOOSE currently contains a large set of physics modules, including phase-field, level set, heat conduction, tensor mechanics, Navier-Stokes, fracture and crack propagation (via the extended finite-element method), flow in porous media, and others. The heat conduction, tensor mechanics, and phase-field modules, in particular, are well-suited for snow science problems. Pika--an open-source MOOSE-based application--is capable of simulating both 3D, coupled nonlinear continuum heat transfer and large-deformation mechanics applications (such as settlement) and phase-field based micro-structure applications. Additionally, these types of problems may be coupled tightly in a single solve or across length and time scales using a loosely coupled Picard iteration approach. In addition to the wide range of physics capabilities, MOOSE-based applications also inherit an extensible testing framework, graphical user interface, and documentation system; tools that allow MOOSE and other applications to adhere to nuclear software quality standards. The snow science community can learn from the nuclear industry and harness the existing effort to build simulation tools that are open, modular, and share a common framework. In particular, MOOSE-based multiphysics solvers are inherently parallel, dimension agnostic, adaptive in time and space, fully coupled, and capable of interacting with other applications. The snow science community should build on existing tools to enable collaboration between researchers and practitioners throughout the world, and advance the

  9. Top-Down Control of Visual Alpha Oscillations: Sources of Control Signals and Their Mechanisms of Action

    Science.gov (United States)

    Wang, Chao; Rajagovindan, Rajasimhan; Han, Sahng-Min; Ding, Mingzhou

    2016-01-01

    Alpha oscillations (8–12 Hz) are thought to inversely correlate with cortical excitability. Goal-oriented modulation of alpha has been studied extensively. In visual spatial attention, alpha over the region of visual cortex corresponding to the attended location decreases, signifying increased excitability to facilitate the processing of impending stimuli. In contrast, in retention of verbal working memory, alpha over visual cortex increases, signifying decreased excitability to gate out stimulus input to protect the information held online from sensory interference. According to the prevailing model, this goal-oriented biasing of sensory cortex is effected by top-down control signals from frontal and parietal cortices. The present study tests and substantiates this hypothesis by (a) identifying the signals that mediate the top-down biasing influence, (b) examining whether the cortical areas issuing these signals are task-specific or task-independent, and (c) establishing the possible mechanism of the biasing action. High-density human EEG data were recorded in two experimental paradigms: a trial-by-trial cued visual spatial attention task and a modified Sternberg working memory task. Applying Granger causality to both sensor-level and source-level data we report the following findings. In covert visual spatial attention, the regions exerting top-down control over visual activity are lateralized to the right hemisphere, with the dipoles located at the right frontal eye field (FEF) and the right inferior frontal gyrus (IFG) being the main sources of top-down influences. During retention of verbal working memory, the regions exerting top-down control over visual activity are lateralized to the left hemisphere, with the dipoles located at the left middle frontal gyrus (MFG) being the main source of top-down influences. In both experiments, top-down influences are mediated by alpha oscillations, and the biasing effect is likely achieved via an inhibition
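
    A pairwise Granger-causality test of the kind applied above can be sketched on synthetic signals with the statsmodels package (the package choice, lag order and noise model are assumptions of this example; the study's own source-level pipeline is not described in enough detail here to reproduce).

      import numpy as np
      from statsmodels.tsa.stattools import grangercausalitytests

      rng = np.random.default_rng(0)
      n = 2000

      # Synthetic example: x drives y with a two-sample delay, plus noise.
      x = rng.standard_normal(n)
      y = np.zeros(n)
      for t in range(2, n):
          y[t] = 0.6 * y[t - 1] + 0.8 * x[t - 2] + 0.2 * rng.standard_normal()

      # Test whether the second column (x) Granger-causes the first column (y).
      data = np.column_stack([y, x])
      results = grangercausalitytests(data, maxlag=4)

      for lag, res in results.items():
          fstat, pval = res[0]["ssr_ftest"][:2]
          print(f"lag {lag}: F = {fstat:.1f}, p = {pval:.3g}")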

  10. A new open-source Python-based Space Weather data access, visualization, and analysis toolkit

    Science.gov (United States)

    de Larquier, S.; Ribeiro, A.; Frissell, N. A.; Spaleta, J.; Kunduri, B.; Thomas, E. G.; Ruohoniemi, J.; Baker, J. B.

    2013-12-01

    Space weather research relies heavily on combining and comparing data from multiple observational platforms. Current frameworks exist to aggregate some of the data sources, most based on file downloads via web or FTP interfaces. Empirical models are mostly Fortran-based and lack interfaces with more useful scripting languages. In an effort to improve data and model access, the SuperDARN community has been developing a Python-based Space Science Data Visualization Toolkit (DaViTpy). At the center of this development was a redesign of how our data (from 30 years of SuperDARN radars) was made available. Several access solutions are now wrapped into one convenient Python interface which probes local directories, a new remote NoSQL database, and an FTP server to retrieve the requested data based on availability. Motivated by the efficiency of this interface and the inherent need for data from multiple instruments, we implemented similar modules for other space science datasets (POES, OMNI, Kp, AE...), and also included fundamental empirical models with Python interfaces to enhance data analysis (IRI, HWM, MSIS...). All these modules and more are gathered in a single convenient toolkit, which is collaboratively developed and distributed using Github and continues to grow. While still in its early stages, we expect this toolkit will facilitate multi-instrument space weather research and improve scientific productivity.
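
    The following sketch is not the DaViTpy API; using hypothetical function names, it only illustrates the access pattern described above, in which a single call probes a local directory, then a remote database, then an FTP server, and returns data from the first source that has it.

      import os
      from datetime import datetime

      def _from_local_cache(path):
          """Return cached data if the file exists locally, else None."""
          return open(path, "rb").read() if os.path.exists(path) else None

      def _from_remote_database(instrument, day):
          """Placeholder for a remote (e.g. NoSQL) database query; returns None here."""
          return None

      def _from_ftp_server(instrument, day):
          """Placeholder for an FTP download; returns None here."""
          return None

      def fetch(instrument, day, cache_dir="cache"):
          """Hypothetical one-call interface: try local cache, then database, then FTP."""
          local_path = os.path.join(cache_dir, f"{instrument}_{day:%Y%m%d}.dat")
          for source in (
              lambda: _from_local_cache(local_path),
              lambda: _from_remote_database(instrument, day),
              lambda: _from_ftp_server(instrument, day),
          ):
              data = source()
              if data is not None:
                  return data
          raise FileNotFoundError(f"No data found for {instrument} on {day:%Y-%m-%d}")

      # Example call (will raise unless one of the placeholder sources is filled in).
      # fetch("superdarn_bks", datetime(2013, 5, 1))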

  11. Developmental improvements in the resolution and capacity of visual working memory share a common source

    Science.gov (United States)

    Simmering, Vanessa R.; Miller, Hilary E.

    2016-01-01

    The nature of visual working memory (VWM) representations is currently a source of debate between characterizations as slot-like versus a flexibly-divided pool of resources. Recently, a dynamic neural field model has been proposed as an alternative account that focuses more on the processes by which VWM representations are formed, maintained, and used in service of behavior. This dynamic model has explained developmental increases in VWM capacity and resolution through strengthening excitatory and inhibitory connections. Simulations of developmental improvements in VWM resolution suggest that one important change is the accuracy of comparisons between items held in memory and new inputs. Thus, the ability to detect changes is a critical component of developmental improvements in VWM performance across tasks, leading to the prediction that capacity and resolution should correlate during childhood. Comparing 5- to 8-year-old children’s performance across color discrimination and change detection tasks revealed the predicted correlation between estimates of VWM capacity and resolution, supporting the hypothesis that increasing connectivity underlies improvements in VWM during childhood. These results demonstrate the importance of formalizing the processes that support the use of VWM, rather than focusing solely on the nature of representations. We conclude by considering our results in the broader context of VWM development. PMID:27329264

  12. Open source tools for the information theoretic analysis of neural data

    Directory of Open Access Journals (Sweden)

    Robin A. A Ince

    2010-05-01

    Full Text Available The recent and rapid development of open-source software tools for the analysis of neurophysiological datasets consisting of multiple simultaneous recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and integrate the information obtained at different spatial and temporal scales. In this Review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in Matlab and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.
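
    As a compact illustration of the core quantity these toolboxes estimate (their actual interfaces are not shown here), the sketch below computes the plug-in mutual information, in bits, between a stimulus and a discretized response from a joint count table; the count table is invented for the example.

      import numpy as np

      def mutual_information_bits(counts):
          """Plug-in mutual information I(S;R) in bits from a joint count table
          with stimuli on the rows and response bins on the columns."""
          p_sr = counts / counts.sum()
          p_s = p_sr.sum(axis=1, keepdims=True)
          p_r = p_sr.sum(axis=0, keepdims=True)
          nz = p_sr > 0                                   # skip log(0) terms
          return float(np.sum(p_sr[nz] * np.log2(p_sr[nz] / (p_s @ p_r)[nz])))

      # Hypothetical experiment: 2 stimuli x 3 spike-count bins.
      counts = np.array([[30, 15, 5],
                         [5, 15, 30]])
      print(f"I(S;R) = {mutual_information_bits(counts):.3f} bits")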

  13. CellProfiler and KNIME: open source tools for high content screening.

    Science.gov (United States)

    Stöter, Martin; Niederlein, Antje; Barsacchi, Rico; Meyenhofer, Felix; Brandl, Holger; Bickle, Marc

    2013-01-01

    High content screening (HCS) has established itself in the world of the pharmaceutical industry as an essential tool for drug discovery and drug development. HCS is currently starting to enter the academic world and might become a widely used technology. Given the diversity of problems tackled in academic research, HCS could experience some profound changes in the future, mainly with more imaging modalities and smart microscopes being developed. Two limitations on the establishment of HCS in academia are flexibility and cost. Flexibility is important to be able to adapt the HCS setup to accommodate the multiple different assays typical of academia. Many cost factors cannot be avoided, but the costs of the software packages necessary to analyze large datasets can be reduced by using Open Source software. We present and discuss the Open Source software packages CellProfiler for image analysis and KNIME for data analysis and data mining, which provide software solutions that increase flexibility and keep costs low.

  14. Open source tools for the information theoretic analysis of neural data.

    Science.gov (United States)

    Ince, Robin A A; Mazzoni, Alberto; Petersen, Rasmus S; Panzeri, Stefano

    2010-01-01

    The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons, and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in MATLAB and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.

  15. Baseliner: An open-source, interactive tool for processing sap flux data from thermal dissipation probes

    Directory of Open Access Journals (Sweden)

    A. Christopher Oishi

    2016-01-01

    Full Text Available Estimating transpiration from woody plants using thermal dissipation sap flux sensors requires careful data processing. Currently, researchers accomplish this using spreadsheets, or by personally writing scripts for statistical software programs (e.g., R, SAS. We developed the Baseliner software to help establish a standardized protocol for processing sap flux data. Baseliner enables users to QA/QC data and process data using a combination of automated steps, visualization, and manual editing. Data processing requires establishing a zero-flow reference value, or “baseline”, which varies among sensors and with time. Since no set of algorithms currently exists to reliably QA/QC and estimate the zero-flow baseline, Baseliner provides a graphical user interface to allow visual inspection and manipulation of data. Data are first automatically processed using a set of user defined parameters. The user can then view the data for additional, manual QA/QC and baseline identification using mouse and keyboard commands. The open-source software allows for user customization of data processing algorithms as improved methods are developed.
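
    A minimal sketch of the conversion that follows once a zero-flow baseline has been identified (Baseliner's own algorithms and interface are not reproduced); it assumes the widely used Granier thermal-dissipation calibration, whose coefficients are an assumption of this example rather than something stated in the abstract.

      import numpy as np

      def sap_flux_density(dT, dT_max, a=118.99e-6, b=1.231):
          """Convert a thermal-dissipation probe signal to sap flux density.

          dT     : measured temperature difference between heated and reference probes
          dT_max : zero-flow baseline value of dT (e.g. a pre-dawn maximum)
          a, b   : Granier-style calibration coefficients (assumed defaults, m3 m-2 s-1)
          """
          k = (dT_max - dT) / dT                   # dimensionless flow index
          return a * np.clip(k, 0.0, None) ** b    # negative k treated as zero flow

      # Hypothetical half-hourly readings (degrees C) and a baseline of 10.0 C.
      dT = np.array([10.0, 9.6, 8.9, 8.2, 9.1, 9.9])
      print(sap_flux_density(dT, dT_max=10.0))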

  16. The design of a visual history tool to help users refind information within a website

    OpenAIRE

    Do, TV; Ruddle, RA

    2012-01-01

    On the WWW users frequently revisit information they have previously seen, but "keeping found things found" is difficult when the information has not been visited frequently or recently, even if a user knows which website contained the information. This paper describes the design of a tool to help users refind information within a given website. The tool encodes data about a user's interest in webpages (measured by dwell time), the frequency and recency of visits, and navigational association...

  17. Visual-Haptic Integration: Cue Weights are Varied Appropriately, to Account for Changes in Haptic Reliability Introduced by Using a Tool

    OpenAIRE

    Chie Takahashi; Simon J Watt

    2011-01-01

    Tools such as pliers systematically change the relationship between an object's size and the hand opening required to grasp it. Previous work suggests the brain takes this into account, integrating visual and haptic size information that refers to the same object, independent of the similarity of the ‘raw’ visual and haptic signals (Takahashi et al., VSS 2009). Variations in tool geometry also affect the reliability (precision) of haptic size estimates, however, because they alter the change ...
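
    The reliability-weighted (maximum-likelihood) cue-combination rule underlying this line of work can be written down compactly; the sketch below is a generic illustration with made-up noise levels and size estimates, not the authors' analysis.

      import numpy as np

      def combine_cues(est_visual, sigma_visual, est_haptic, sigma_haptic):
          """Reliability-weighted combination of two size estimates.

          Weights are proportional to the reciprocal variances, so a tool that makes
          the haptic signal noisier automatically shifts weight toward vision.
          """
          r_v, r_h = 1.0 / sigma_visual**2, 1.0 / sigma_haptic**2
          w_v = r_v / (r_v + r_h)
          combined = w_v * est_visual + (1.0 - w_v) * est_haptic
          combined_sigma = np.sqrt(1.0 / (r_v + r_h))
          return combined, combined_sigma, w_v

      # Hypothetical example: grasping with pliers doubles the haptic noise.
      print(combine_cues(est_visual=50.0, sigma_visual=2.0, est_haptic=54.0, sigma_haptic=4.0))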

  18. The Efficacy of Social Media as a Research Tool and Information Source for Safeguards Verification

    International Nuclear Information System (INIS)

    Skoeld, T.; Feldman, Y.

    2015-01-01

    The IAEA Department of Safeguards aims to provide credible assurances to the international community that States are fulfilling their safeguards obligations in that all nuclear material remains in peaceful use. In order to draw a soundly-based safeguards conclusion for a State that has a safeguards agreement in force with the IAEA, the Department establishes a knowledge base of the State's nuclear-related infrastructure and activities against which a State's declarations are evaluated for correctness and completeness. Open source information is one stream of data that is used in the evaluation of nuclear fuel cycle activities in the State. The Department is continuously working to ensure that it has access to the most up-to-date, accurate, relevant and credible open source information available, and has begun to examine the use of social media as a new source of information. The use of social networking sites has increased exponentially in the last decade. In fact, social media has emerged as the key vehicle for delivering and acquiring information in near real-time. Therefore, it has become necessary for the open source analyst to consider social media as an essential element in the broader concept of open source information. Characteristics such as "immediacy", "recency" and "interactiveness", which set social networks apart from the "traditional media", are also the same attributes that present a challenge for using social media as an efficient information-delivery platform and a credible source of information. New tools and technologies for social media analytics have begun to emerge to help systematically monitor and mine this large body of data. The paper will survey the social media landscape in an effort to identify platforms that could be of value for safeguards verification purposes. It will explore how a number of social networking sites, such as Twitter

  19. VISUALIZATION IN THE PACKAGE AUTODESK INVENTOR SKETCH GEOMETRY WHEN USING TOOLS IN THE THEORY OF R-FUNCTIONS

    Directory of Open Access Journals (Sweden)

    E. Іvanov

    2015-12-01

    Full Text Available The paper deals with the possibility of automating and controlling the computational process when using tools based on the theory of R-functions. These functions possess the properties of logic algebra while remaining within the class of elementary functions, which makes it possible to build the equations of geometric objects of almost arbitrary shape. The use of computer graphics makes it possible to represent the equations of the boundary surface and the conical region of the curved-tooth coupling, and of the whole disk, in analytical form, with visualization in the Autodesk Inventor package.
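
    A small sketch of the R-function machinery referred to above (the R0 system of Rvachev functions), used here to build a single analytic implicit function for a simple planar region; the shape is illustrative and is not the coupling geometry studied in the paper.

      import numpy as np

      # R0 (Rvachev) operations: act like logical AND/OR on sign, stay elementary.
      def r_and(f, g):
          return f + g - np.sqrt(f**2 + g**2)

      def r_or(f, g):
          return f + g + np.sqrt(f**2 + g**2)

      # Primitives written so that f >= 0 inside the region.
      x, y = np.meshgrid(np.linspace(-2, 2, 400), np.linspace(-2, 2, 400))
      disk = 1.0 - (x**2 + y**2)           # unit disk
      half_plane = y                        # upper half-plane
      notch = 0.04 - x**2                   # thin vertical strip

      # Half-disk with the strip removed: A AND B AND (NOT C), where NOT f is -f.
      shape = r_and(r_and(disk, half_plane), -notch)

      # The zero level set of `shape` is the boundary; shape >= 0 marks the interior.
      inside = shape >= 0
      print(f"approximate area: {inside.mean() * 16:.3f} (grid spans a 4 x 4 square)")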

  20. Acoustic holography as a metrological tool for characterizing medical ultrasound sources and fields

    Science.gov (United States)

    Sapozhnikov, Oleg A.; Tsysar, Sergey A.; Khokhlova, Vera A.; Kreider, Wayne

    2015-01-01

    Acoustic holography is a powerful technique for characterizing ultrasound sources and the fields they radiate, with the ability to quantify source vibrations and reduce the number of required measurements. These capabilities are increasingly appealing for meeting measurement standards in medical ultrasound; however, associated uncertainties have not been investigated systematically. Here errors associated with holographic representations of a linear, continuous-wave ultrasound field are studied. To facilitate the analysis, error metrics are defined explicitly, and a detailed description of a holography formulation based on the Rayleigh integral is provided. Errors are evaluated both for simulations of a typical therapeutic ultrasound source and for physical experiments with three different ultrasound sources. Simulated experiments explore sampling errors introduced by the use of a finite number of measurements, geometric uncertainties in the actual positions of acquired measurements, and uncertainties in the properties of the propagation medium. Results demonstrate the theoretical feasibility of keeping errors less than about 1%. Typical errors in physical experiments were somewhat larger, on the order of a few percent; comparison with simulations provides specific guidelines for improving the experimental implementation to reduce these errors. Overall, results suggest that holography can be implemented successfully as a metrological tool with small, quantifiable errors. PMID:26428789
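
    For readers unfamiliar with the Rayleigh-integral formulation mentioned above, the sketch below numerically evaluates the discretized integral for a uniformly vibrating, baffled circular piston and reports the on-axis pressure; the geometry, medium properties and e^{+i omega t} time convention are assumptions of the example, not parameters from the paper.

      import numpy as np

      # Medium and source parameters (hypothetical, roughly water at 1 MHz).
      rho0, c0, f0 = 1000.0, 1500.0, 1.0e6            # kg/m3, m/s, Hz
      omega, k = 2 * np.pi * f0, 2 * np.pi * f0 / c0
      a, v0 = 0.01, 1.0                                # piston radius (m), normal velocity (m/s)

      # Discretize the source plane (z = 0) over a square patch containing the piston.
      n = 201
      xs = np.linspace(-a, a, n)
      dx = xs[1] - xs[0]
      X, Y = np.meshgrid(xs, xs)
      vn = np.where(X**2 + Y**2 <= a**2, v0, 0.0)      # uniform piston velocity

      def pressure(field_point):
          """Rayleigh integral p(r) = (i*omega*rho0 / 2*pi) * sum vn * exp(-i*k*R)/R dS."""
          R = np.sqrt((field_point[0] - X)**2 + (field_point[1] - Y)**2 + field_point[2]**2)
          return (1j * omega * rho0 / (2 * np.pi)) * np.sum(vn * np.exp(-1j * k * R) / R) * dx * dx

      # On-axis pressure amplitude at a few distances.
      for z in (0.05, 0.10, 0.20):
          print(f"z = {z:.2f} m : |p| = {abs(pressure((0.0, 0.0, z))):.3e} Pa")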

  1. Acoustic holography as a metrological tool for characterizing medical ultrasound sources and fields.

    Science.gov (United States)

    Sapozhnikov, Oleg A; Tsysar, Sergey A; Khokhlova, Vera A; Kreider, Wayne

    2015-09-01

    Acoustic holography is a powerful technique for characterizing ultrasound sources and the fields they radiate, with the ability to quantify source vibrations and reduce the number of required measurements. These capabilities are increasingly appealing for meeting measurement standards in medical ultrasound; however, associated uncertainties have not been investigated systematically. Here errors associated with holographic representations of a linear, continuous-wave ultrasound field are studied. To facilitate the analysis, error metrics are defined explicitly, and a detailed description of a holography formulation based on the Rayleigh integral is provided. Errors are evaluated both for simulations of a typical therapeutic ultrasound source and for physical experiments with three different ultrasound sources. Simulated experiments explore sampling errors introduced by the use of a finite number of measurements, geometric uncertainties in the actual positions of acquired measurements, and uncertainties in the properties of the propagation medium. Results demonstrate the theoretical feasibility of keeping errors less than about 1%. Typical errors in physical experiments were somewhat larger, on the order of a few percent; comparison with simulations provides specific guidelines for improving the experimental implementation to reduce these errors. Overall, results suggest that holography can be implemented successfully as a metrological tool with small, quantifiable errors.

  2. Web based tools for visualizing imaging data and development of XNATView, a zero footprint image viewer.

    Science.gov (United States)

    Gutman, David A; Dunn, William D; Cobb, Jake; Stoner, Richard M; Kalpathy-Cramer, Jayashree; Erickson, Bradley

    2014-01-01

    Advances in web technologies now allow direct visualization of imaging data sets without necessitating the download of large file sets or the installation of software. This allows centralization of file storage and facilitates image review and analysis. XNATView is a lightweight framework recently developed in our lab to visualize DICOM images stored in The Extensible Neuroimaging Archive Toolkit (XNAT). It consists of a PyXNAT-based framework that wraps the REST application programming interface (API) to query the data in XNAT. XNATView was developed to simplify quality assurance, help organize imaging data, and facilitate data sharing for intra- and inter-laboratory collaborations. Its zero-footprint design allows the user to connect to XNAT from a web browser, navigate through projects, experiments, and subjects, and view DICOM images with accompanying metadata, all within a single viewing instance.
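
    A minimal sketch of how a zero-footprint client can talk to an XNAT-style REST API using only the requests library; the server URL and credentials are placeholders, and the /data/projects listing endpoint and JSON layout are assumed to follow the usual XNAT REST convention rather than being taken from this paper.

      import requests

      XNAT_BASE = "https://xnat.example.org"           # placeholder server
      AUTH = ("username", "password")                   # placeholder credentials

      def list_projects():
          """Query the project listing endpoint (path and JSON layout assumed to
          follow the usual XNAT REST convention)."""
          resp = requests.get(f"{XNAT_BASE}/data/projects",
                              params={"format": "json"}, auth=AUTH, timeout=30)
          resp.raise_for_status()
          # Assumed response layout: {"ResultSet": {"Result": [{"ID": ..., "name": ...}, ...]}}
          return resp.json().get("ResultSet", {}).get("Result", [])

      if __name__ == "__main__":
          for project in list_projects():
              print(project.get("ID"), "-", project.get("name"))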

  3. Open Source Tools for Numerical Simulation of Urban Greenhouse Gas Emissions

    Science.gov (United States)

    Nottrott, A.; Tan, S. M.; He, Y.

    2016-12-01

    vertical dispersion of the plume due to building wake effects. The OpenFOAM flow fields were combined with an inverse, stochastic dispersion model to quantify and visualize the sensitivity of point sensors to upwind sources in various built environments.

  4. Visual development as a tool for storytelling in animated feature films

    OpenAIRE

    Moura, João Garcia de Lima de

    2014-01-01

    This dissertation aims to study and deepen the understanding of Visual Development and the way it is used towards storytelling. In animated feature and short films every element is studied and created in order to help the viewer understand the story. We will study concepts like color and light to understand how they are used in order to create an emotional connection between the animation and the viewer. The animated film ‘Beauty and the Beast’ (1991) from Walt Disney studio...

  5. Ubiquitous Computing: Using everyday object as ambient visualization tools for persuasive design

    OpenAIRE

    Cahier, Jenny; Gullberg, Eric

    2008-01-01

    In order for companies to survive and advance in today’s competitive society, a massive amount of personal information from citizens is gathered. This thesis investigates how these digital footprints can be obtained and visualized to create awareness about personal actions and encourage change in behavior. In order to decide which data would be interesting and accessible, a map of possible application fields was generated and one single field was chosen for further study. The result is a bus...

  6. Visualization of E-commerce Transaction Data : USING BUSINESS INTELLIGENCE TOOLS

    OpenAIRE

    Safari, Arash

    2015-01-01

    Customer Value (CV) is a data analytics company experiencing problems presenting the results of their analytics in a satisfactory manner. As a result, they considered the use of data visualization and business intelligence software. The purpose of such software is, amongst other things, to virtually represent data in an interactive and perceptible manner to the viewer. There are, however, a large number of these types of applications on the market, making it hard for companies to find the one...

  7. AppEEARS: A Simple Tool that Eases Complex Data Integration and Visualization Challenges for Users

    Science.gov (United States)

    Maiersperger, T.

    2017-12-01

    The Application for Extracting and Exploring Analysis-Ready Samples (AppEEARS) offers a simple and efficient way to perform discovery, processing, visualization, and acquisition across large quantities and varieties of Earth science data. AppEEARS brings significant value to a very broad array of user communities by 1) significantly reducing data volumes, at-archive, based on user-defined space-time-variable subsets, 2) promoting interoperability across a wide variety of datasets via format and coordinate reference system harmonization, 3) increasing the velocity of both data analysis and insight by providing analysis-ready data packages and by allowing interactive visual exploration of those packages, and 4) ensuring veracity by making data quality measures more apparent and usable and by providing standards-based metadata and processing provenance. Development and operation of AppEEARS is led by the National Aeronautics and Space Administration (NASA) Land Processes Distributed Active Archive Center (LP DAAC). The LP DAAC also partners with several other archives to extend the capability across a larger federation of geospatial data providers. Over one hundred datasets are currently available, covering a diversity of variables including land cover, population, elevation, vegetation indices, and land surface temperature. Many hundreds of users have already used this new web-based capability to make the complex tasks of data integration and visualization much simpler and more efficient.

  8. Reverse alignment "mirror image" visualization as a laparoscopic training tool improves task performance.

    Science.gov (United States)

    Dunnican, Ward J; Singh, T Paul; Ata, Ashar; Bendana, Emma E; Conlee, Thomas D; Dolce, Charles J; Ramakrishnan, Rakesh

    2010-06-01

    Reverse alignment (mirror image) visualization is a disconcerting situation occasionally faced during laparoscopic operations. This occurs when the camera faces back at the surgeon in the opposite direction from which the surgeon's body and instruments are facing. Most surgeons will attempt to optimize trocar and camera placement to avoid this situation. The authors' objective was to determine whether the intentional use of reverse alignment visualization during laparoscopic training would improve performance. A standard box trainer was configured for reverse alignment, and 34 medical students and junior surgical residents were randomized to train with either forward alignment (DIRECT) or reverse alignment (MIRROR) visualization. Enrollees were tested on both modalities before and after a 4-week structured training program specific to their modality. Student's t test was used to determine differences in task performance between the 2 groups. Twenty-one participants completed the study (10 DIRECT, 11 MIRROR). There were no significant differences in performance time between DIRECT or MIRROR participants during forward or reverse alignment initial testing. At final testing, DIRECT participants had improved times only in forward alignment performance; they demonstrated no significant improvement in reverse alignment performance. MIRROR participants had significant time improvement in both forward and reverse alignment performance at final testing. Reverse alignment imaging for laparoscopic training improves task performance for both reverse alignment and forward alignment tasks. This may be translated into improved performance in the operating room when faced with reverse alignment situations. Minimal lab training can account for drastic adaptation to this environment.

  9. Creating User-Friendly Tools for Data Analysis and Visualization in K-12 Classrooms: A Fortran Dinosaur Meets Generation Y

    Science.gov (United States)

    Chambers, L. H.; Chaudhury, S.; Page, M. T.; Lankey, A. J.; Doughty, J.; Kern, Steven; Rogerson, Tina M.

    2008-01-01

    During the summer of 2007, as part of the second year of a NASA-funded project in partnership with Christopher Newport University called SPHERE (Students as Professionals Helping Educators Research the Earth), a group of undergraduate students spent 8 weeks in a research internship at or near NASA Langley Research Center. Three students from this group formed the Clouds group along with a NASA mentor (Chambers), and the brief addition of a local high school student fulfilling a mentorship requirement. The Clouds group was given the task of exploring and analyzing ground-based cloud observations obtained by K-12 students as part of the Students' Cloud Observations On-Line (S'COOL) Project, and the corresponding satellite data. This project began in 1997. The primary analysis tools developed for it were in FORTRAN, a computer language none of the students were familiar with. While they persevered through computer challenges and picky syntax, it eventually became obvious that this was not the most fruitful approach for a project aimed at motivating K-12 students to do their own data analysis. Thus, about halfway through the summer the group shifted its focus to more modern data analysis and visualization tools, namely spreadsheets and Google(tm) Earth. The result of their efforts, so far, is two different Excel spreadsheets and a Google(tm) Earth file. The spreadsheets are set up to allow participating classrooms to paste in a particular dataset of interest, using the standard S'COOL format, and easily perform a variety of analyses and comparisons of the ground cloud observation reports and their correspondence with the satellite data. This includes summarizing cloud occurrence and cloud cover statistics, and comparing cloud cover measurements from the two points of view. A visual classification tool is also provided to compare the cloud levels reported from the two viewpoints. This provides a statistical counterpart to the existing S'COOL data visualization tool

  10. Open-Source tools: Incidence in the wireless security of the Technical University of Babahoyo

    Directory of Open Access Journals (Sweden)

    Joffre León-Acurio

    2018-02-01

    Full Text Available Computer security is a fundamental part of an organization, especially in Higher Education institutions, where there is very sensitive information that is vulnerable to different methods of intrusion, the most common being free access through wireless points. The main objective of this research is to analyze the impact of the open source tools in charge of managing the security information of the wireless network, such as OSSIM, a set of active and passive components used to manage events that generate traffic within the network. This research exposes the use of free software as a viable, low-cost option to solve the problems that afflict students, such as lack of access to academic services and problems of wireless interconnectivity, with the purpose of restoring students' confidence in the use of the services offered by the institution for research-related development and guaranteeing free access to the internet. The level of dissatisfaction on the part of the students confirms the problem presented at the Technical University of Babahoyo and supports the positive influence of the Open-Source tools on the institution’s wireless security.

  11. Open Source Tools for Assessment of Global Water Availability, Demands, and Scarcity

    Science.gov (United States)

    Li, X.; Vernon, C. R.; Hejazi, M. I.; Link, R. P.; Liu, Y.; Feng, L.; Huang, Z.; Liu, L.

    2017-12-01

    Water availability and water demands are essential factors for estimating water scarcity conditions. To reproduce historical observations and to quantify future changes in water availability and water demand, two open source tools have been developed by the JGCRI (Joint Global Change Research Institute): Xanthos and GCAM-STWD. Xanthos is a gridded global hydrologic model, designed to quantify and analyze water availability in 235 river basins. Xanthos uses runoff generation and river routing modules to simulate both historical and future estimates of total runoff and streamflows on a monthly time step at a spatial resolution of 0.5 degrees. GCAM-STWD is a spatiotemporal water disaggregation model used with the Global Change Assessment Model (GCAM) to spatially downscale global water demands for six major end-use sectors (irrigation, domestic, electricity generation, mining, and manufacturing) from the region scale to the scale of 0.5 degrees. GCAM-STWD then temporally downscales the gridded annual global water demands to monthly results. These two tools, written in Python, can be integrated to assess global, regional or basin-scale water scarcity or water stress. Both of the tools are extensible to ensure flexibility and promote contribution from researchers that utilize GCAM and study global water use and supply.
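
    A schematic sketch of the two downscaling steps GCAM-STWD performs, reduced to their simplest possible form: a regional annual demand is spread over grid cells with a weighting layer (a population proxy here) and then split into months with fixed fractions; the weights and fractions are invented for the example and are not the model's actual algorithms.

      import numpy as np

      def downscale_annual_demand(annual_demand_km3, weights):
          """Spatially downscale one region's annual demand using a gridded weight layer."""
          w = np.asarray(weights, dtype=float)
          return annual_demand_km3 * w / w.sum()

      def to_monthly(gridded_annual, monthly_fractions):
          """Temporally downscale gridded annual values to 12 monthly grids."""
          f = np.asarray(monthly_fractions, dtype=float)
          f = f / f.sum()
          return gridded_annual[None, :, :] * f[:, None, None]

      # Toy example: a 3 x 3 grid of population weights and a flat seasonal profile.
      population = np.array([[10, 20, 5], [40, 80, 15], [5, 10, 5]])
      annual_grid = downscale_annual_demand(12.0, population)        # km3/yr
      monthly_grid = to_monthly(annual_grid, np.ones(12))             # km3/month
      print(monthly_grid.shape, monthly_grid.sum())                   # (12, 3, 3), ~12.0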

  12. Role of Open Source Tools and Resources in Virtual Screening for Drug Discovery.

    Science.gov (United States)

    Karthikeyan, Muthukumarasamy; Vyas, Renu

    2015-01-01

    Advances in chemoinformatics research, in parallel with the availability of high-performance computing platforms, have made it easier to handle large-scale, multi-dimensional scientific data for high-throughput drug discovery. In this study we have explored publicly available molecular databases with the help of open-source based, integrated in-house molecular informatics tools for virtual screening. The virtual screening literature of the past decade has been extensively investigated and thoroughly analyzed to reveal interesting patterns with respect to the drug, target, scaffold and disease space. The review also focuses on integrated chemoinformatics tools that are capable of harvesting chemical data from textual literature information and transforming it into truly computable chemical structures, identifying unique fragments and scaffolds from a class of compounds, automatically generating focused virtual libraries, computing molecular descriptors for structure-activity relationship studies, and applying both the conventional filters used in lead discovery and in-house developed exhaustive PTC (Pharmacophore, Toxicophores and Chemophores) filters, as well as machine learning tools for the design of potential disease-specific inhibitors. A case study on kinase inhibitors is provided as an example.
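
    As a concrete open-source example of the descriptor calculation and rule-based filtering mentioned above (RDKit stands in here for the in-house tools, which are not publicly described in the abstract), the sketch computes a few Lipinski-style properties and flags molecules that pass a rule-of-five filter.

      from rdkit import Chem
      from rdkit.Chem import Descriptors, Lipinski

      smiles = {
          "aspirin": "CC(=O)Oc1ccccc1C(=O)O",
          "caffeine": "Cn1cnc2c1c(=O)n(C)c(=O)n2C",
      }

      def lipinski_pass(mol):
          """Rule-of-five style filter: MW, logP, H-bond donors and acceptors."""
          return (Descriptors.MolWt(mol) <= 500
                  and Descriptors.MolLogP(mol) <= 5
                  and Lipinski.NumHDonors(mol) <= 5
                  and Lipinski.NumHAcceptors(mol) <= 10)

      for name, smi in smiles.items():
          mol = Chem.MolFromSmiles(smi)
          print(f"{name}: MW={Descriptors.MolWt(mol):.1f}, "
                f"logP={Descriptors.MolLogP(mol):.2f}, passes={lipinski_pass(mol)}")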

  13. Open source GIS based tools to improve hydrochemical water resources management in EU H2020 FREEWAT platform

    Science.gov (United States)

    Criollo, Rotman; Velasco, Violeta; Vázquez-Suñé, Enric; Nardi, Albert; Marazuela, Miguel A.; Rossetto, Rudy; Borsi, Iacopo; Foglia, Laura; Cannata, Massimiliano; De Filippis, Giovanna

    2017-04-01

    Due to the general increase in water scarcity (Steduto et al., 2012), water quantity and quality must be well known to ensure proper access to water resources in compliance with local and regional directives. This can be supported by tools that facilitate the process of data management and analysis. Such analyses have to provide researchers, professionals, policy makers and users with the ability to improve the management of water resources in line with standard regulatory guidelines. Compliance with these guidelines (with a special focus on requirements deriving from the GWD) requires effective monitoring, evaluation, and interpretation of a large number of physical and chemical parameters. These datasets have to be assessed and interpreted by: (i) integrating data from different sources, gathered with different data access techniques and formats; (ii) managing data with varying temporal and spatial extent; and (iii) integrating groundwater quality information with other relevant information, such as further hydrogeological data (Velasco et al., 2014), and pre-processing these data, generally for the realization of groundwater models. In this context, the Hydrochemical Analysis Tools (akvaGIS Tools) have been implemented within the H2020 FREEWAT project, which aims to support water resource management by modelling water resources in an open source GIS platform (QGIS desktop). The main goal of akvaGIS Tools is to improve water quality analysis through capabilities that improve the case-study conceptual model, managing all related data in its geospatial database (implemented in Spatialite), together with a set of tools for improving the harmonization, integration, standardization, visualization and interpretation of hydrochemical data. To achieve that, different commands cover a wide range of methodologies for querying, interpreting, and comparing groundwater quality data and facilitate the pre-processing analysis for

  14. When complex is easy on the mind: internal repetition of visual information in complex objects is a source of perceptual fluency

    NARCIS (Netherlands)

    Linda Steg; Roos Pals; Ayça Berfu Ünal; Yannick Joye

    2015-01-01

    Across 3 studies, we investigated whether visual complexity deriving from internally repeating visual information over many scale levels is a source of perceptual fluency. Such continuous repetition of visual information is formalized in fractal geometry and is a key property of natural structures.

  15. MASTODON: A geosciences simulation tool built using the open-source framework MOOSE

    Science.gov (United States)

    Slaughter, A.

    2017-12-01

    The Department of Energy (DOE) is currently investing millions of dollars annually into various modeling and simulation tools for all aspects of nuclear energy. An important part of this effort includes developing applications based on the open-source Multiphysics Object Oriented Simulation Environment (MOOSE; mooseframework.org) from Idaho National Laboratory (INL). Thanks to the efforts of the DOE and outside collaborators, MOOSE currently contains a large set of physics modules, including phase field, level set, heat conduction, tensor mechanics, Navier-Stokes, fracture (extended finite-element method), and porous media, among others. The tensor mechanics and contact modules, in particular, are well suited for nonlinear geosciences problems. Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON; https://seismic-research.inl.gov/SitePages/Mastodon.aspx)--a MOOSE-based application--is capable of analyzing the response of 3D soil-structure systems to external hazards, with current development focused on earthquakes. It is capable of simulating seismic events and can perform extensive "source-to-site" simulations including earthquake fault rupture, nonlinear wave propagation, and nonlinear soil-structure interaction analysis. MASTODON also includes a dynamic probabilistic risk assessment capability that enables analysts to not only perform deterministic analyses, but also easily perform probabilistic or stochastic simulations for the purpose of risk assessment. Although MASTODON has been developed for the nuclear industry, it can be used to assess the risk for any structure subjected to earthquakes. The geosciences community can learn from the nuclear industry and harness the enormous effort underway to build simulation tools that are open, modular, and share a common framework. In particular, MOOSE-based multiphysics solvers are inherently parallel, dimension agnostic, adaptive in time and space, fully coupled, and capable of interacting with other

  16. TreeQ-VISTA: An Interactive Tree Visualization Tool with Functional Annotation Query Capabilities

    Energy Technology Data Exchange (ETDEWEB)

    Gu, Shengyin; Anderson, Iain; Kunin, Victor; Cipriano, Michael; Minovitsky, Simon; Weber, Gunther; Amenta, Nina; Hamann, Bernd; Dubchak, Inna

    2007-05-07

    Summary: We describe a general multiplatform exploratory tool called TreeQ-Vista, designed for presenting functional annotations in a phylogenetic context. Traits, such as phenotypic and genomic properties, are interactively queried from a relational database with a user-friendly interface which provides a set of tools for users with or without SQL knowledge. The query results are projected onto a phylogenetic tree and can be displayed in multiple color groups. A rich set of browsing, grouping and query tools are provided to facilitate trait exploration, comparison and analysis. Availability: The program, detailed tutorial and examples are available online at http://genome-test.lbl.gov/vista/TreeQVista.

  17. Data-Proximate Analysis and Visualization in the Cloud using Cloudstream, an Open-Source Application Streaming Technology Stack

    Science.gov (United States)

    Fisher, W. I.

    2017-12-01

    The rise in cloud computing, coupled with the growth of "Big Data", has led to a migration away from local scientific data storage. The increasing size of remote scientific data sets, however, makes it difficult for scientists to subject them to large-scale analysis and visualization. These large datasets can take an inordinate amount of time to download; subsetting is a potential solution, but subsetting services are not yet ubiquitous. Data providers may also pay steep prices, as many cloud providers meter data based on how much data leaves their cloud service. The solution to this problem is a deceptively simple one: move data analysis and visualization tools to the cloud, so that scientists may perform data-proximate analysis and visualization. This results in increased transfer speeds, while egress costs are lowered or completely eliminated. Moving standard desktop analysis and visualization tools to the cloud is enabled via a technique called "Application Streaming". This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations. When coupled with containerization technology such as Docker, we are able to easily deploy legacy analysis and visualization software to the cloud whilst retaining access via a desktop, netbook, a smartphone, or the next generation of hardware, whatever it may be. Unidata has created a Docker-based solution for easily adapting legacy software for Application Streaming. This technology stack, dubbed Cloudstream, allows desktop software to run in the cloud with little-to-no effort. The Docker container is configured by editing text files, and the legacy software does not need to be modified in any way. This work will discuss the underlying technologies used by Cloudstream, and outline how to use Cloudstream to deploy an existing desktop application to the cloud and access it remotely.

  18. Development and formative evaluation of a visual e-tool to help decision makers navigate the evidence around health financing.

    Science.gov (United States)

    Skordis-Worrall, Jolene; Pulkki-Brännström, Anni-Maria; Utley, Martin; Kembhavi, Gayatri; Bricki, Nouria; Dutoit, Xavier; Rosato, Mikey; Pagel, Christina

    2012-12-21

    There are calls for low and middle income countries to develop robust health financing policies to increase service coverage. However, existing evidence around financing options is complex and often difficult for policy makers to access. Our aim was to summarize the evidence on the impact of financing health systems and to develop an e-tool to help decision makers navigate the findings. After reviewing the literature, we used thematic analysis to summarize the impact of 7 common health financing mechanisms on 5 common health system goals. Information on the relevance of each study to a user's context was provided by 11 country indicators. A Web-based e-tool was then developed to assist users in navigating the literature review. This tool