WorldWideScience

Sample records for source visualization tool

  1. Large Data Visualization with Open-Source Tools

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Visualization and post-processing of large data have become increasingly challenging and require more and more tools to support the diversity of data to process. In this seminar, we will present a suite of open-source tools supported and developed by Kitware to perform large-scale data visualization and analysis. In particular, we will present ParaView, an open-source tool for parallel visualization of massive datasets, the Visualization Toolkit (VTK), an open-source toolkit for scientific visualization, and Tangelohub, a suite of tools for large data analytics. About the speaker Julien Jomier is directing Kitware's European subsidiary in Lyon, France, where he focuses on European business development. Julien works on a variety of projects in the areas of parallel and distributed computing, mobile computing, image processing, and visualization. He is one of the developers of the Insight Toolkit (ITK), the Visualization Toolkit (VTK), and ParaView. Julien is also leading the CDash project, an open-source co...
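
    As a rough illustration of the kind of VTK pipeline that ParaView and the Visualization Toolkit are built around, the minimal Python sketch below renders a simple dataset; the sphere source stands in for real simulation output, and a standard vtk Python installation is assumed.

        # Minimal VTK rendering pipeline in Python (sketch; the sphere stands in for real data).
        import vtk

        source = vtk.vtkSphereSource()          # placeholder data source
        source.SetThetaResolution(64)
        source.SetPhiResolution(64)

        mapper = vtk.vtkPolyDataMapper()
        mapper.SetInputConnection(source.GetOutputPort())

        actor = vtk.vtkActor()
        actor.SetMapper(mapper)

        renderer = vtk.vtkRenderer()
        renderer.AddActor(actor)

        window = vtk.vtkRenderWindow()
        window.AddRenderer(renderer)

        interactor = vtk.vtkRenderWindowInteractor()
        interactor.SetRenderWindow(window)

        window.Render()
        interactor.Start()                      # interactive 3D view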

  2. Text mining and visualization case studies using open-source tools

    CERN Document Server

    Chisholm, Andrew

    2016-01-01

    Text Mining and Visualization: Case Studies Using Open-Source Tools provides an introduction to text mining using some of the most popular and powerful open-source tools: KNIME, RapidMiner, Weka, R, and Python. The contributors, all highly experienced with text mining and open-source software, explain how text data are gathered and processed from a wide variety of sources, including books, server access logs, websites, social media sites, and message boards. Each chapter presents a case study that you can follow as part of a step-by-step, reproducible example. You can also easily apply and extend the techniques to other problems. All the examples are available on a supplementary website. The book shows you how to exploit your text data, offering successful application examples and blueprints for you to tackle your text mining tasks and benefit from open and freely available tools. It gets you up to date on the latest and most powerful tools, the data mining process, and specific text mining activities.
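
    As a small, hedged illustration of the kind of step such case studies walk through, the Python sketch below builds a TF-IDF representation of a toy corpus with scikit-learn; the documents are invented placeholders, not examples from the book.

        # TF-IDF weighting of a toy corpus with scikit-learn (illustrative only).
        from sklearn.feature_extraction.text import TfidfVectorizer

        documents = [
            "open source tools for text mining",
            "visualization of text data with python",
            "mining server access logs for patterns",
        ]

        vectorizer = TfidfVectorizer(stop_words="english")
        tfidf = vectorizer.fit_transform(documents)

        # Top-weighted terms for the first document.
        terms = vectorizer.get_feature_names_out()
        weights = tfidf[0].toarray().ravel()
        print(sorted(zip(terms, weights), key=lambda pair: -pair[1])[:5])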

  3. Applying open source data visualization tools to standard based medical data.

    Science.gov (United States)

    Kopanitsa, Georgy; Taranik, Maxim

    2014-01-01

    Presentation of medical data in personal health records (PHRs) requires flexible, platform-independent tools to ensure easy access to the information. Patients' differing backgrounds, and especially elderly users, call for a simple graphical presentation of the data. Data in PHRs can be collected from heterogeneous sources. The use of standard-based medical data allows the development of generic visualization methods. Focusing on the deployment of open-source tools, in this paper we applied JavaScript libraries to create data presentations for standard-based medical data.

  4. Online characterization of planetary surfaces: PlanetServer, an open-source analysis and visualization tool

    Science.gov (United States)

    Marco Figuera, R.; Pham Huu, B.; Rossi, A. P.; Minin, M.; Flahaut, J.; Halder, A.

    2018-01-01

    The lack of open-source tools for hyperspectral data visualization and analysis creates a demand for new tools. In this paper we present the new PlanetServer, a set of tools comprising a web Geographic Information System (GIS) and a recently developed Python Application Programming Interface (API) capable of visualizing and analyzing a wide variety of hyperspectral data from different planetary bodies. Current open-source WebGIS tools are evaluated in order to give an overview and to contextualize how PlanetServer can help in these matters. The web client and the datasets available in PlanetServer are described in detail, and the reasons for developing the Python API are explained. Two examples of mineral characterization of different hydrosilicates, such as chlorites, prehnites and kaolinites, in the Nili Fossae area on Mars are presented. As the results compare favorably with previous literature on hyperspectral analysis and visualization, we suggest using the PlanetServer approach for such investigations.
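
    PlanetServer's analysis layer is built around OGC Web Coverage Processing Service (WCPS) queries served over HTTP, so issuing a query from Python typically reduces to a web request. The sketch below is only a guess at the shape of such a call: the endpoint URL, coverage name and band arithmetic are hypothetical placeholders, not the actual PlanetServer API.

        # Hypothetical WCPS query from Python; endpoint, coverage and band names are placeholders.
        import requests

        endpoint = "https://example.org/rasdaman/ows"   # hypothetical WCPS service endpoint
        query = """
        for $c in (hypothetical_crism_coverage)
        return encode(($c.band_233 - $c.band_13) / ($c.band_233 + $c.band_13), "csv")
        """

        response = requests.post(endpoint, data={"query": query}, timeout=60)
        response.raise_for_status()
        print(response.text[:200])                      # spectral-index values as CSV text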

  5. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    Science.gov (United States)

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    The C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture system data. However, few software packages can visualize and modify the entirety of the data in a C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis system using the standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (MATLAB, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, Mac OS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
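
    The Python bindings mirror the C++ API, so reading a C3D acquisition takes only a few lines; the sketch below assumes the btk Python package is installed and uses a hypothetical file name.

        # Reading a C3D acquisition with the BTK Python bindings (sketch; file name hypothetical).
        import btk

        reader = btk.btkAcquisitionFileReader()
        reader.SetFilename("trial.c3d")                 # hypothetical motion-capture trial
        reader.Update()
        acq = reader.GetOutput()

        print("Point frequency (Hz):", acq.GetPointFrequency())
        print("Number of trajectories:", acq.GetPointNumber())

        for i in range(acq.GetPointNumber()):
            point = acq.GetPoint(i)
            print(point.GetLabel(), point.GetValues().shape)   # (frames, 3) array per marker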

  6. An open source GIS tool to quantify the visual impact of wind turbines and photovoltaic panels

    International Nuclear Information System (INIS)

    Minelli, Annalisa; Marchesini, Ivan; Taylor, Faith E.; De Rosa, Pierluigi; Casagrande, Luca; Cenci, Michele

    2014-01-01

    Although there are clear economic and environmental incentives for producing energy from solar and wind power, there can be local opposition to their installation due to their impact upon the landscape. To date, no international guidelines exist to guide quantitative visual impact assessment of these facilities, making the planning process somewhat subjective. In this paper we demonstrate the development of a method and an Open Source GIS tool to quantitatively assess the visual impact of these facilities using line-of-sight techniques. The methods here build upon previous studies by (i) more accurately representing the shape of energy-producing facilities, (ii) taking into account the distortion of the perceived shape and size of facilities caused by the location of the observer, (iii) calculating the possible obscuring of facilities caused by terrain morphology and (iv) allowing the combination of various facilities to more accurately represent the landscape. The tool has been applied to real and synthetic case studies and compared to recently published results from other models, and demonstrates an improvement in the accuracy of the calculated visual impact of facilities. The tool is named r.wind.sun and is freely available from the GRASS GIS AddOns repository. - Highlights: • We develop a tool to quantify wind turbine and photovoltaic panel visual impact. • The tool is freely available to download and edit as a module of GRASS GIS. • The tool takes into account visual distortion of the shape and size of objects. • The accuracy of calculation of visual impact is improved over previous methods.
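
    Since r.wind.sun ships as a GRASS GIS add-on, it would normally be installed and driven from a GRASS session through the standard Python scripting interface, as sketched below. The option names passed to the module are hypothetical; the actual parameters should be taken from the module's manual or its --help output.

        # Installing and calling a GRASS GIS add-on from Python (option names are hypothetical).
        import grass.script as gs

        # One-time installation of the add-on into the current GRASS database.
        gs.run_command("g.extension", extension="r.wind.sun")

        # Hypothetical invocation; check `r.wind.sun --help` for the real options.
        gs.run_command(
            "r.wind.sun",
            elevation="dem",            # input digital elevation model (placeholder name)
            output="visual_impact",     # output raster of visual impact (placeholder name)
        )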

  7. An open source GIS tool to quantify the visual impact of wind turbines and photovoltaic panels

    Energy Technology Data Exchange (ETDEWEB)

    Minelli, Annalisa, E-mail: Annalisa.Minelli@univ-brest.fr [Institut Universitaire Européen de la Mer, Université de la Bretagne Occidentale, Rue Dumont D'Urville, 29280 Plouzané (France); Marchesini, Ivan, E-mail: Ivan.Marchesini@irpi.cnr.it [National Research Council (CNR), Research Institute for Geo-hydrological Protection (IRPI), Strada della Madonna Alta 126, 06125 Perugia (Italy); Taylor, Faith E., E-mail: Faith.Taylor@kcl.ac.uk [Earth and Environmental Dynamics Research Group, Department of Geography, King's College London, Strand, London WC2R 2LS (United Kingdom); De Rosa, Pierluigi, E-mail: Pierluigi.Derosa@unipg.it [Physics and Geology Department, University of Perugia, Via Zefferino Faina 4, 06123 Perugia (Italy); Casagrande, Luca, E-mail: Luca.Casagrande@gfosservices.it [Gfosservices S.A., Open Source GIS-WebGIS Solutions, Spatial Data Infrastructures, Planning and Counseling, Via F.lli Cairoli 24, 06127 Perugia (Italy); Cenci, Michele, E-mail: mcenci@regione.umbria.it [Servizio Energia qualità dell'ambiente, rifiuti, attività estrattive, Regione Umbria, Corso Vannucci 96, 06121 Perugia (Italy)

    2014-11-15

    Although there are clear economic and environmental incentives for producing energy from solar and wind power, there can be local opposition to their installation due to their impact upon the landscape. To date, no international guidelines exist to guide quantitative visual impact assessment of these facilities, making the planning process somewhat subjective. In this paper we demonstrate the development of a method and an Open Source GIS tool to quantitatively assess the visual impact of these facilities using line-of-sight techniques. The methods here build upon previous studies by (i) more accurately representing the shape of energy-producing facilities, (ii) taking into account the distortion of the perceived shape and size of facilities caused by the location of the observer, (iii) calculating the possible obscuring of facilities caused by terrain morphology and (iv) allowing the combination of various facilities to more accurately represent the landscape. The tool has been applied to real and synthetic case studies and compared to recently published results from other models, and demonstrates an improvement in the accuracy of the calculated visual impact of facilities. The tool is named r.wind.sun and is freely available from the GRASS GIS AddOns repository. - Highlights: • We develop a tool to quantify wind turbine and photovoltaic panel visual impact. • The tool is freely available to download and edit as a module of GRASS GIS. • The tool takes into account visual distortion of the shape and size of objects. • The accuracy of calculation of visual impact is improved over previous methods.

  8. EEGNET: An Open Source Tool for Analyzing and Visualizing M/EEG Connectome.

    Science.gov (United States)

    Hassan, Mahmoud; Shamas, Mohamad; Khalil, Mohamad; El Falou, Wassim; Wendling, Fabrice

    2015-01-01

    The brain is a large-scale complex network often referred to as the "connectome". Exploring the dynamic behavior of the connectome is a challenging issue, as both excellent time and space resolution are required. In this context, magneto/electroencephalography (M/EEG) are effective neuroimaging techniques allowing for analysis of the dynamics of functional brain networks at the scalp level and/or at reconstructed sources. However, a tool that can cover all the processing steps of identifying brain networks from M/EEG data is still missing. In this paper, we report a novel software package, called EEGNET, running under MATLAB (MathWorks, Inc.), and allowing for analysis and visualization of functional brain networks from M/EEG recordings. EEGNET is developed to analyze networks either at the level of scalp electrodes or at the level of reconstructed cortical sources. It includes i) basic steps in preprocessing M/EEG signals, ii) the solution of the inverse problem to localize/reconstruct the cortical sources, iii) the computation of functional connectivity among signals collected at surface electrodes and/or time courses of reconstructed sources and iv) the computation of network measures based on graph theory analysis. EEGNET is the only tool that combines M/EEG functional connectivity analysis with the computation of network measures derived from graph theory. The first version of EEGNET is easy to use, flexible and user friendly. EEGNET is an open source tool and can be freely downloaded from this webpage: https://sites.google.com/site/eegnetworks/.
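
    The final, graph-theoretical step can be illustrated independently of EEGNET itself: the Python sketch below (NumPy plus NetworkX, not the MATLAB package) thresholds a made-up connectivity matrix and computes a few standard network measures of the kind the abstract refers to.

        # Graph measures from a functional connectivity matrix (illustrative, not EEGNET code).
        import numpy as np
        import networkx as nx

        rng = np.random.default_rng(0)
        n_channels = 32
        conn = np.abs(rng.standard_normal((n_channels, n_channels)))   # hypothetical connectivity
        conn = (conn + conn.T) / 2                                      # symmetrize
        np.fill_diagonal(conn, 0)

        adjacency = (conn > np.percentile(conn, 90)).astype(int)       # keep strongest 10% of edges
        graph = nx.from_numpy_array(adjacency)

        print("Mean degree:", np.mean([deg for _, deg in graph.degree()]))
        print("Average clustering:", nx.average_clustering(graph))
        print("Global efficiency:", nx.global_efficiency(graph))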

  9. Bloom: A Relationship Visualization Tool for Complex Networks

    Directory of Open Access Journals (Sweden)

    Frank Horsfall

    2010-07-01

    Faced with an ever-increasing capacity to collect and store data, organizations must find a way to make sense of it to their advantage. Methods are required to simplify the data so that it can inform strategic decisions and help solve problems. Visualization tools are becoming increasingly popular since they can display complex relationships in a simple, visual format. This article describes Bloom, a project at Carleton University to develop an open source visualization tool for complex networks and business ecosystems. It provides an overview of the visualization technology used in the project and demonstrates its potential impact through a case study using real-world data.

  10. Visualization tool for human-machine interface designers

    Science.gov (United States)

    Prevost, Michael P.; Banda, Carolyn P.

    1991-06-01

    As modern human-machine systems continue to grow in capabilities and complexity, system operators are faced with integrating and managing increased quantities of information. Since many information components are highly related to each other, optimizing the spatial and temporal aspects of presenting information to the operator has become a formidable task for the human-machine interface (HMI) designer. The authors describe a tool in an early stage of development, the Information Source Layout Editor (ISLE). This tool is to be used for information presentation design and analysis; it uses human factors guidelines to assist the HMI designer in the spatial layout of the information required by machine operators to perform their tasks effectively. These human factors guidelines address such areas as the functional and physical relatedness of information sources. By representing these relationships with metaphors such as spring tension, attractors, and repellers, the tool can help designers visualize the complex constraint space and interacting effects of moving displays to various alternate locations. The tool contains techniques for visualizing the relative 'goodness' of a configuration, as well as mechanisms such as optimization vectors to provide guidance toward a more optimal design. Also available is a rule-based design checker to determine compliance with selected human factors guidelines.
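
    The spring/attractor metaphor described above amounts to iteratively moving display elements under pairwise forces derived from how related their information sources are. The toy Python sketch below illustrates that idea only; it is not the ISLE implementation, and the relatedness weights are invented.

        # Toy spring-relaxation layout for related information sources (illustrative, not ISLE).
        import numpy as np

        rng = np.random.default_rng(1)
        n = 5                                        # number of display elements
        pos = rng.uniform(0.0, 1.0, size=(n, 2))     # initial panel positions
        related = rng.uniform(0.0, 1.0, size=(n, n)) # hypothetical relatedness weights
        related = (related + related.T) / 2

        rest_length, step = 0.2, 0.05
        for _ in range(200):
            forces = np.zeros_like(pos)
            for i in range(n):
                for j in range(n):
                    if i == j:
                        continue
                    delta = pos[j] - pos[i]
                    dist = np.linalg.norm(delta) + 1e-9
                    # Related displays are pulled toward a short rest length (spring tension).
                    forces[i] += related[i, j] * (dist - rest_length) * delta / dist
            pos += step * forces

        print(pos)                                   # relaxed layout positions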

  11. The Visual Code Navigator : An Interactive Toolset for Source Code Investigation

    NARCIS (Netherlands)

    Lommerse, Gerard; Nossin, Freek; Voinea, Lucian; Telea, Alexandru

    2005-01-01

    We present the Visual Code Navigator, a set of three interrelated visual tools that we developed for exploring large source code software projects from three different perspectives, or views: The syntactic view shows the syntactic constructs in the source code. The symbol view shows the objects a

  12. iCAVE: an open source tool for visualizing biomolecular networks in 3D, stereoscopic 3D and immersive 3D.

    Science.gov (United States)

    Liluashvili, Vaja; Kalayci, Selim; Fluder, Eugene; Wilson, Manda; Gabow, Aaron; Gümüs, Zeynep H

    2017-08-01

    Visualizations of biomolecular networks assist in systems-level data exploration in many cellular processes. Data generated from high-throughput experiments increasingly inform these networks, yet current tools do not adequately scale with concomitant increase in their size and complexity. We present an open source software platform, interactome-CAVE (iCAVE), for visualizing large and complex biomolecular interaction networks in 3D. Users can explore networks (i) in 3D using a desktop, (ii) in stereoscopic 3D using 3D-vision glasses and a desktop, or (iii) in immersive 3D within a CAVE environment. iCAVE introduces 3D extensions of known 2D network layout, clustering, and edge-bundling algorithms, as well as new 3D network layout algorithms. Furthermore, users can simultaneously query several built-in databases within iCAVE for network generation or visualize their own networks (e.g., disease, drug, protein, metabolite). iCAVE has a modular structure that allows rapid development by addition of algorithms, datasets, or features without affecting other parts of the code. Overall, iCAVE is the first freely available open source tool that enables 3D (optionally stereoscopic or immersive) visualizations of complex, dense, or multi-layered biomolecular networks. While primarily designed for researchers utilizing biomolecular networks, iCAVE can assist researchers in any field. © The Authors 2017. Published by Oxford University Press.

  13. Genovar: a detection and visualization tool for genomic variants.

    Science.gov (United States)

    Jung, Kwang Su; Moon, Sanghoon; Kim, Young Jin; Kim, Bong-Jo; Park, Kiejung

    2012-05-08

    Along with single nucleotide polymorphisms (SNPs), copy number variation (CNV) is considered an important source of genetic variation associated with disease susceptibility. Despite the importance of CNV, the tools currently available for its analysis often produce false positive results due to limitations such as the low resolution of array platforms, platform specificity, and the type of CNV. To resolve this problem, spurious signals must be separated from true signals by visual inspection. None of the previously reported CNV analysis tools supports this function together with the simultaneous visualization of array comparative genomic hybridization (aCGH) data and sequence alignments. The purpose of the present study was to develop a useful program for the efficient detection and visualization of CNV regions that enables the manual exclusion of erroneous signals. A Java-based stand-alone program called Genovar was developed. To ascertain whether a detected CNV region is a novel variant, Genovar compares the detected CNV regions with previously reported CNV regions using the Database of Genomic Variants (DGV, http://projects.tcag.ca/variation) and the Single Nucleotide Polymorphism Database (dbSNP). The current version of Genovar is capable of visualizing genomic data from sources such as aCGH data files and sequence alignment format files. Genovar is freely accessible and provides a user-friendly graphical user interface (GUI) to facilitate the detection of CNV regions. The program also provides comprehensive information to help in the elimination of spurious signals by visual inspection, making Genovar a valuable tool for reducing false positive CNV results. http://genovar.sourceforge.net/.

  14. Visualization and analysis of atomistic simulation data with OVITO–the Open Visualization Tool

    International Nuclear Information System (INIS)

    Stukowski, Alexander

    2010-01-01

    The Open Visualization Tool (OVITO) is a new 3D visualization software package designed for post-processing atomistic data obtained from molecular dynamics or Monte Carlo simulations. Unique analysis, editing and animation functions are integrated into its easy-to-use graphical user interface. The software is written in object-oriented C++, controllable via Python scripts and easily extendable through a plug-in interface. It is distributed as open-source software and can be downloaded from the website http://ovito.sourceforge.net/
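
    OVITO's scripting interface has evolved considerably since this 2010 description, but the general pattern of a scripted post-processing pipeline can be sketched with the present-day ovito Python package; the trajectory file name is a placeholder.

        # Scripted post-processing with the ovito Python package (sketch; file name hypothetical).
        from ovito.io import import_file
        from ovito.modifiers import CommonNeighborAnalysisModifier

        pipeline = import_file("dump.lammpstrj")             # hypothetical MD trajectory
        pipeline.modifiers.append(CommonNeighborAnalysisModifier())

        data = pipeline.compute()                            # evaluate the pipeline on frame 0
        print(dict(data.attributes))                         # e.g. per-structure-type atom counts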

  15. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    Science.gov (United States)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    The BYMUR software aims to provide an easy-to-use open source tool for both computing multi-risk and managing/visualizing/comparing all the inputs (e.g. hazard, fragilities and exposure) as well as the corresponding results (e.g. risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products provided by the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by the Italian Ministry of Education, Universities and Research (MIUR), which is focused on (i) providing a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) applying the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will be able to separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and will use them to estimate both the single- and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing a dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and feature-rich Python scientific modules (NumPy, Matplotlib, SciPy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for
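
    A risk curve of the kind BYMUR is meant to output, exceedance probability plotted against a loss measure, can be sketched with the same NumPy/Matplotlib stack the project builds on; the numbers below are purely illustrative and not project results.

        # Illustrative risk curve with the NumPy/Matplotlib stack BYMUR builds on.
        import numpy as np
        import matplotlib.pyplot as plt

        loss = np.linspace(0, 100, 200)            # hypothetical loss metric
        exceedance = np.exp(-loss / 20.0)          # hypothetical annual exceedance probabilities

        plt.semilogy(loss, exceedance)
        plt.xlabel("Loss")
        plt.ylabel("Annual probability of exceedance")
        plt.title("Illustrative single-hazard risk curve")
        plt.show()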

  16. Open-source web-enabled data management, analyses, and visualization of very large data in geosciences using Jupyter, Apache Spark, and community tools

    Science.gov (United States)

    Chaudhary, A.

    2017-12-01

    Current simulation models and sensors are producing high-resolution, high-velocity data in the geosciences domain. Knowledge discovery from these complex and large datasets requires tools that are capable of handling very large data and providing interactive data analytics features to researchers. To this end, Kitware and its collaborators are producing the open-source tools GeoNotebook, GeoJS, Gaia, and Minerva for geosciences that use hardware-accelerated graphics and advancements in parallel and distributed processing (Celery and Apache Spark) and can be loosely coupled to solve real-world use cases. GeoNotebook (https://github.com/OpenGeoscience/geonotebook) is co-developed by Kitware and NASA-Ames and is an extension to the Jupyter Notebook. It provides interactive visualization and Python-based analysis of geospatial data and, depending on the backend (KTile or GeoPySpark), can handle data sizes of hundreds of gigabytes to terabytes. GeoNotebook uses GeoJS (https://github.com/OpenGeoscience/geojs) to render very large geospatial data on the map using WebGL and the Canvas2D API. GeoJS is more than just a GIS library, as users can create scientific plots such as vector and contour plots and can embed InfoVis plots using D3.js. GeoJS aims for high-performance visualization and interactive data exploration of scientific and geospatial location-aware datasets and supports features such as Point, Line, and Polygon, and advanced features such as Pixelmap, Contour, Heatmap, and Choropleth. Another of our open-source tools, Minerva (https://github.com/kitware/minerva), is a geospatial application built on top of the open-source web-based data management system Girder (https://github.com/girder/girder), which provides the ability to access data from HDFS or Amazon S3 buckets and provides capabilities to perform visualization and analyses on geosciences data in a web environment using GDAL and GeoPandas wrapped in a unified API provided by Gaia (https
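
    As a hedged illustration of the Spark back end this stack can be coupled to, the PySpark sketch below aggregates a hypothetical CSV of station observations; the path and column names are placeholders, not part of GeoNotebook or Gaia.

        # Aggregating a large observation table with PySpark (paths and columns are hypothetical).
        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("geo-aggregation").getOrCreate()

        df = spark.read.csv("s3a://example-bucket/observations.csv",
                            header=True, inferSchema=True)
        summary = (df.groupBy("station_id")
                     .agg(F.avg("temperature").alias("mean_temperature"),
                          F.count("*").alias("n_obs")))
        summary.show(10)
        spark.stop()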

  17. Big Data Visualization Tools

    OpenAIRE

    Bikakis, Nikos

    2018-01-01

    Data visualization is the presentation of data in a pictorial or graphical format, and a data visualization tool is the software that generates this presentation. Data visualization provides users with intuitive means to interactively explore and analyze data, enabling them to effectively identify interesting patterns, infer correlations and causalities, and supporting sense-making activities.

  18. A Flexible Visualization Tool for Rapid Access to EFIT Results

    International Nuclear Information System (INIS)

    Zhang Ruirui; Xiao Bingjia; Luo Zhengping

    2014-01-01

    This paper introduces the design and implementation of an interactive tool, the EASTViewer, for the visualization of plasma equilibrium reconstruction results for EAST (the Experimental Advanced Superconducting Tokamak). To keep the tool independent of the operating system, Python, combined with the PyGTK toolkit, is used as the programming language. Using a modular design, the EASTViewer provides a unified interface with great flexibility. It is easy to access numerous data sources, either from local data files or an MDSplus tree, and with pre-defined configuration files it can be extended to other tokamaks. The EASTViewer has been used as the major tool to visualize equilibrium data since the second EAST campaign in 2008, and it has been verified that the EASTViewer features a user-friendly interface, easy access to numerous data sources, and cross-platform operation. (fusion engineering)
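
    Pulling a signal out of an MDSplus tree, one of the data sources EASTViewer reads, generally looks like the Python sketch below; the server name, tree, shot number and node path are hypothetical placeholders rather than EAST specifics.

        # Fetching a signal from an MDSplus server (server, tree, shot and node are placeholders).
        from MDSplus import Connection

        conn = Connection("mds.example.org")       # hypothetical MDSplus server
        conn.openTree("east", 12345)               # tree name and shot number are placeholders
        current = conn.get(r"\ip").data()          # plasma-current node path is a placeholder
        time = conn.get(r"dim_of(\ip)").data()     # corresponding time base
        print(current.shape, time.shape)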

  19. Standalone visualization tool for three-dimensional DRAGON geometrical models

    International Nuclear Information System (INIS)

    Lukomski, A.; McIntee, B.; Moule, D.; Nichita, E.

    2008-01-01

    DRAGON is a neutron transport and depletion code able to solve one-, two- and three-dimensional problems. To date, DRAGON provides two visualization modules, able to represent two- and three-dimensional geometries, respectively. The two-dimensional visualization module generates a postscript file, while the three-dimensional visualization module generates a MATLAB M-file with instructions for drawing the tracks in the DRAGON TRACKING data structure, which implicitly provide a representation of the geometry. The current work introduces a new, standalone tool based on the open-source Visualization Toolkit (VTK) software package which allows the visualization of three-dimensional geometrical models by reading the DRAGON GEOMETRY data structure and generating an axonometric image which can be manipulated interactively by the user. (author)

  20. Visualizing Cloud Properties and Satellite Imagery: A Tool for Visualization and Information Integration

    Science.gov (United States)

    Chee, T.; Nguyen, L.; Smith, W. L., Jr.; Spangenberg, D.; Palikonda, R.; Bedka, K. M.; Minnis, P.; Thieman, M. M.; Nordeen, M.

    2017-12-01

    Providing public access to research products, including cloud macro- and microphysical properties and satellite imagery, is a key concern for the NASA Langley Research Center Cloud and Radiation Group. This work describes a web-based visualization tool and API that allow end users to easily create customized cloud product and satellite imagery, ground site data and satellite ground track information that is generated dynamically. The tool has two uses: one to visualize the dynamically created imagery and the other to provide access to the dynamically generated imagery directly at a later time. Internally, we leverage our practical experience with large, scalable application practices to develop a system that has the largest potential for scalability as well as the ability to be deployed on the cloud to accommodate scalability issues. We build upon the NASA Langley Cloud and Radiation Group's experience with making real-time and historical satellite cloud product information, satellite imagery, ground site data and satellite track information accessible and easily searchable. This tool is the culmination of our prior experience with dynamic imagery generation and provides a way to build a "mash-up" of dynamically generated imagery and related kinds of information that are visualized together to add value to disparate but related information. In support of NASA strategic goals, our group aims to make as much scientific knowledge, observations and products as possible available to the citizen science, research and interested communities, as well as to automated systems that can acquire the same information for data mining or other analytic purposes. This tool and the underlying APIs provide a valuable research tool to a wide audience, both as a standalone research tool and as an easily accessed data source that can be mined or used with existing tools.

  1. Methods and tools for big data visualization

    OpenAIRE

    Zubova, Jelena; Kurasova, Olga

    2015-01-01

    In this paper, methods and tools for big data visualization have been investigated. Challenges faced by the big data analysis and visualization have been identified. Technologies for big data analysis have been discussed. A review of methods and tools for big data visualization has been done. Functionalities of the tools have been demonstrated by examples in order to highlight their advantages and disadvantages.

  2. Visualization Tools for Teaching Computer Security

    Science.gov (United States)

    Yuan, Xiaohong; Vega, Percy; Qadah, Yaseen; Archer, Ricky; Yu, Huiming; Xu, Jinsheng

    2010-01-01

    Using animated visualization tools has been an important teaching approach in computer science education. We have developed three visualization and animation tools that demonstrate various information security concepts and actively engage learners. The information security concepts illustrated include: packet sniffer and related computer network…

  3. Visual intelligence Microsoft tools and techniques for visualizing data

    CERN Document Server

    Stacey, Mark; Jorgensen, Adam

    2013-01-01

    Go beyond design concepts and learn to build state-of-the-art visualizations The visualization experts at Microsoft's Pragmatic Works have created a full-color, step-by-step guide to building specific types of visualizations. The book thoroughly covers the Microsoft toolset for data analysis and visualization, including Excel, and explores best practices for choosing a data visualization design, selecting tools from the Microsoft stack, and building a dynamic data visualization from start to finish. You'll examine different types of visualizations, their strengths and weaknesses, a

  4. Integrated Data Visualization and Virtual Reality Tool

    Science.gov (United States)

    Dryer, David A.

    1998-01-01

    The Integrated Data Visualization and Virtual Reality Tool (IDVVRT) Phase II effort was for the design and development of an innovative Data Visualization Environment Tool (DVET) for NASA engineers and scientists, enabling them to visualize complex multidimensional and multivariate data in a virtual environment. The objectives of the project were to: (1) demonstrate the transfer and manipulation of standard engineering data in a virtual world; (2) demonstrate the effects of design changes using finite element analysis tools; and (3) determine the training and engineering design and analysis effectiveness of the visualization system.

  5. Iterating between Tools to Create and Edit Visualizations.

    Science.gov (United States)

    Bigelow, Alex; Drucker, Steven; Fisher, Danyel; Meyer, Miriah

    2017-01-01

    A common workflow for visualization designers begins with a generative tool, like D3 or Processing, to create the initial visualization, and proceeds to a drawing tool, like Adobe Illustrator or Inkscape, for editing and cleaning. Unfortunately, this is typically a one-way process: once a visualization is exported from the generative tool into a drawing tool, it is difficult to make further, data-driven changes. In this paper, we propose a bridge model to allow designers to bring their work back from the drawing tool to re-edit in the generative tool. Our key insight is to recast this iteration challenge as a merge problem, similar to when two people are editing a document and the changes between them need to be reconciled. We also present a specific instantiation of this model, a tool called Hanpuku, which bridges between D3 scripts and Illustrator. We show several examples of visualizations that are iteratively created using Hanpuku in order to illustrate the flexibility of the approach. We further describe several hypothetical tools that bridge between other visualization tools to emphasize the generality of the model.

  6. CMS tracker visualization tools

    Energy Technology Data Exchange (ETDEWEB)

    Mennea, M.S. [Dipartimento Interateneo di Fisica 'Michelangelo Merlin' e INFN sezione di Bari, Via Amendola 173 - 70126 Bari (Italy); Osborne, I. [Northeastern University, 360 Huntington Avenue, Boston, MA 02115 (United States); Regano, A. [Dipartimento Interateneo di Fisica 'Michelangelo Merlin' e INFN sezione di Bari, Via Amendola 173 - 70126 Bari (Italy); Zito, G. [Dipartimento Interateneo di Fisica 'Michelangelo Merlin' e INFN sezione di Bari, Via Amendola 173 - 70126 Bari (Italy)]. E-mail: giuseppe.zito@ba.infn.it

    2005-08-21

    This document will review the design considerations, implementations and performance of the CMS Tracker visualization tools. In view of the great complexity of this sub-detector (more than 50 million channels organized in 16,540 modules, each of which is a complete detector), the standard CMS visualization tools (IGUANA and IGUANACMS), which provide basic 3D capabilities and integration within the CMS framework, respectively, have been complemented with additional 2D graphics objects. Based on the experience acquired using this software to debug and understand both hardware and software during the construction phase, we propose possible future improvements to cope with online monitoring and event analysis during data taking.

  7. CMS tracker visualization tools

    CERN Document Server

    Zito, G; Osborne, I; Regano, A

    2005-01-01

    This document will review the design considerations, implementations and performance of the CMS Tracker visualization tools. In view of the great complexity of this sub-detector (more than 50 million channels organized in 16,540 modules, each of which is a complete detector), the standard CMS visualization tools (IGUANA and IGUANACMS), which provide basic 3D capabilities and integration within the CMS framework, respectively, have been complemented with additional 2D graphics objects. Based on the experience acquired using this software to debug and understand both hardware and software during the construction phase, we propose possible future improvements to cope with online monitoring and event analysis during data taking.

  8. CMS tracker visualization tools

    International Nuclear Information System (INIS)

    Mennea, M.S.; Osborne, I.; Regano, A.; Zito, G.

    2005-01-01

    This document will review the design considerations, implementations and performance of the CMS Tracker visualization tools. In view of the great complexity of this sub-detector (more than 50 million channels organized in 16,540 modules, each of which is a complete detector), the standard CMS visualization tools (IGUANA and IGUANACMS), which provide basic 3D capabilities and integration within the CMS framework, respectively, have been complemented with additional 2D graphics objects. Based on the experience acquired using this software to debug and understand both hardware and software during the construction phase, we propose possible future improvements to cope with online monitoring and event analysis during data taking.

  9. uVis: A Formula-Based Visualization Tool

    DEFF Research Database (Denmark)

    Pantazos, Kostas; Xu, Shangjin; Kuhail, Mohammad Amin

    Several tools use programming approaches for developing advanced visualizations. Others can, with a few steps, create simple visualizations with built-in patterns, and users with limited IT experience can use them. However, it is programming- and time-demanding to create and customize these visualizations. We introduce uVis, a tool that allows users with advanced spreadsheet-like IT knowledge and basic database understanding to create simple as well as advanced visualizations. These users construct visualizations by combining building blocks (i.e. controls, shapes). They specify spreadsheet...

  10. Three-dimensional visualization of ensemble weather forecasts – Part 1: The visualization tool Met.3D (version 1.0)

    Directory of Open Access Journals (Sweden)

    M. Rautenhaus

    2015-07-01

    We present "Met.3D", a new open-source tool for the interactive three-dimensional (3-D) visualization of numerical ensemble weather predictions. The tool has been developed to support weather forecasting during aircraft-based atmospheric field campaigns; however, it is applicable to further forecasting, research and teaching activities. Our work approaches challenging topics related to the visual analysis of numerical atmospheric model output – 3-D visualization, ensemble visualization and how both can be used in a meaningful way suited to weather forecasting. Met.3D builds a bridge from proven 2-D visualization methods commonly used in meteorology to 3-D visualization by combining both visualization types in a 3-D context. We address the issue of spatial perception in the 3-D view and present approaches to using the ensemble to allow the user to assess forecast uncertainty. Interactivity is key to our approach. Met.3D uses modern graphics technology to achieve interactive visualization on standard consumer hardware. The tool supports forecast data from the European Centre for Medium-Range Weather Forecasts (ECMWF) and can operate directly on ECMWF hybrid sigma-pressure level grids. We describe the employed visualization algorithms, and analyse the impact of the ECMWF grid topology on computing 3-D ensemble statistical quantities. Our techniques are demonstrated with examples from the T-NAWDEX-Falcon 2012 (THORPEX – North Atlantic Waveguide and Downstream Impact Experiment) campaign.
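
    On the hybrid sigma-pressure grids mentioned above, the pressure on a model level is commonly obtained from level coefficients and the surface pressure as p_k = a_k + b_k * p_s; the short NumPy sketch below applies that relation with made-up coefficient values (the real ECMWF coefficient tables are not reproduced here).

        # Pressure on hybrid sigma-pressure levels: p_k = a_k + b_k * p_s (coefficients invented).
        import numpy as np

        a = np.array([0.0, 2000.0, 6000.0, 10000.0])   # Pa, hypothetical hybrid coefficients
        b = np.array([1.0, 0.8, 0.4, 0.0])             # dimensionless, hypothetical
        surface_pressure = 101325.0                    # Pa

        pressure_levels = a + b * surface_pressure
        print(pressure_levels)                         # pressure on each model level, in Pa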

  11. Development of Visualization Tools for ZPPR-15 Analysis

    International Nuclear Information System (INIS)

    Lee, Min Jae; Kim, Sang Ji

    2014-01-01

    ZPPR-15 cores consist of various drawer masters that exhibit great heterogeneity. In order to build a proper homogenization strategy, the geometry of the drawer masters should be carefully analyzed with a visualization. Additionally, a visualization of the drawer masters and the core configuration is necessary for minimizing human error during input processing. For this purpose, visualization tools for ZPPR-15 analysis have been developed based on a Perl script. In the following section, the implementation of the visualization tools is described and various visualization samples for both drawer masters and ZPPR-15 cores are demonstrated. Visualization tools for drawer masters and the core configuration were successfully developed for ZPPR-15 analysis. The visualization tools are expected to be useful for understanding the ZPPR-15 experiments and for finding deterministic models of ZPPR-15. It turned out that generating VTK files is straightforward, and the resulting files become powerful when explored with the VisIt program.

  12. SNPversity: a web-based tool for visualizing diversity

    Science.gov (United States)

    Schott, David A; Vinnakota, Abhinav G; Portwood, John L; Andorf, Carson M

    2018-01-01

    Many stand-alone desktop software suites exist to visualize single nucleotide polymorphism (SNP) diversity, but web-based software that can be easily implemented and used for biological databases is absent. SNPversity was created to answer this need by building an open-source visualization tool that can be implemented on a Unix-like machine and served through a web browser accessible worldwide. SNPversity consists of an HDF5 database back-end for SNPs, a data exchange layer powered by TASSEL libraries that represent data in JSON format, and an interface layer using PHP to visualize SNP information. SNPversity displays data in real time through a web browser in grids that are color-coded according to a given SNP's allelic status and mutational state. SNPversity is currently available at MaizeGDB, the maize community's database, and will soon be available at GrainGenes, the clade-oriented database for Triticeae and Avena species, including wheat, barley, rye, and oat. The code and documentation are uploaded to GitHub, and they are freely available to the public. We expect that the tool will be highly useful for other biological databases with a similar need to display SNP diversity through their web interfaces. Database URL: https://www.maizegdb.org/snpversity PMID:29688387
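
    An HDF5 back end of the kind described here can be read directly from Python with h5py; the sketch below shows the general pattern, although the file layout and dataset names are hypothetical rather than SNPversity's actual schema.

        # Reading a SNP genotype matrix from an HDF5 file (layout and dataset names hypothetical).
        import h5py

        with h5py.File("snps.h5", "r") as handle:       # hypothetical file
            genotypes = handle["genotypes"][:100, :]    # first 100 markers across all samples
            samples = [name.decode() for name in handle["samples"][:]]

        print(genotypes.shape, samples[:5])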

  13. VisBOL: Web-Based Tools for Synthetic Biology Design Visualization.

    Science.gov (United States)

    McLaughlin, James Alastair; Pocock, Matthew; Mısırlı, Göksel; Madsen, Curtis; Wipat, Anil

    2016-08-19

    VisBOL is a Web-based application that allows the rendering of genetic circuit designs, enabling synthetic biologists to visually convey designs in SBOL visual format. VisBOL designs can be exported to formats including PNG and SVG images to be embedded in Web pages, presentations and publications. The VisBOL tool enables the automated generation of visualizations from designs specified using the Synthetic Biology Open Language (SBOL) version 2.0, as well as a range of well-known bioinformatics formats including GenBank and Pigeoncad notation. VisBOL is provided both as a user accessible Web site and as an open-source (BSD) JavaScript library that can be used to embed diagrams within other content and software.

  14. Helioviewer: A Web 2.0 Tool for Visualizing Heterogeneous Heliophysics Data

    Science.gov (United States)

    Hughitt, V. K.; Ireland, J.; Lynch, M. J.; Schmeidel, P.; Dimitoglou, G.; Müeller, D.; Fleck, B.

    2008-12-01

    Solar physics datasets are becoming larger, richer, more numerous and more distributed. Feature/event catalogs (describing objects of interest in the original data) are becoming important tools in navigating these data. In the wake of this increasing influx of data and catalogs there has been a growing need for highly sophisticated tools for accessing and visualizing this wealth of information. Helioviewer is a novel tool for integrating and visualizing disparate sources of solar and Heliophysics data. Taking advantage of the newly available power of modern web application frameworks, Helioviewer merges image and feature catalog data, and provides for Heliophysics data a familiar interface not unlike Google Maps or MapQuest. In addition to streamlining the process of combining heterogeneous Heliophysics datatypes such as full-disk images and coronagraphs, the inclusion of visual representations of automated and human-annotated features provides the user with an integrated and intuitive view of how different factors may be interacting on the Sun. Currently, Helioviewer offers images from The Extreme ultraviolet Imaging Telescope (EIT), The Large Angle and Spectrometric COronagraph experiment (LASCO) and the Michelson Doppler Imager (MDI) instruments onboard The Solar and Heliospheric Observatory (SOHO), as well as The Transition Region and Coronal Explorer (TRACE). Helioviewer also incorporates feature/event information from the LASCO CME List, NOAA Active Regions, CACTus CME and Type II Radio Bursts feature/event catalogs. The project is undergoing continuous development with many more data sources and additional functionality planned for the near future.

  15. Rocker: Open source, easy-to-use tool for AUC and enrichment calculations and ROC visualization.

    Science.gov (United States)

    Lätti, Sakari; Niinivehmas, Sanna; Pentikäinen, Olli T

    2016-01-01

    The receiver operating characteristic (ROC) curve, together with the calculation of the area under the curve (AUC), is a useful tool for evaluating the performance of biomedical and chemoinformatics data. For example, in virtual drug screening ROC curves are very often used to visualize the efficiency of the application used to separate active ligands from inactive molecules. Unfortunately, most of the available tools for ROC analysis are implemented in commercially available software packages, or are plugins for statistical software, which are not always the easiest to use. Here, we present Rocker, a simple ROC curve visualization tool that can be used for the generation of publication-quality images. Rocker also includes automatic calculation of the AUC for the ROC curve and of the Boltzmann-enhanced discrimination of ROC (BEDROC) metric. Furthermore, in virtual screening campaigns it is often important to understand the early enrichment of active ligand identification; for this, Rocker offers an automated calculation routine. To enable further development of Rocker, it is freely available (MIT-GPL license) for use and modification from our web site (http://www.jyu.fi/rocker).
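
    For comparison, the ROC/AUC computation that Rocker automates can be reproduced with standard open-source Python tooling; the activity labels and docking-style scores below are hypothetical.

        # ROC curve and AUC for a hypothetical virtual-screening ranking (scikit-learn assumed).
        import numpy as np
        import matplotlib.pyplot as plt
        from sklearn.metrics import roc_curve, roc_auc_score

        labels = np.array([1, 1, 0, 1, 0, 0, 1, 0, 0, 0])   # 1 = active, 0 = inactive (invented)
        scores = np.array([0.9, 0.8, 0.75, 0.7, 0.6, 0.55, 0.5, 0.4, 0.3, 0.1])

        fpr, tpr, _ = roc_curve(labels, scores)
        print("AUC:", roc_auc_score(labels, scores))

        plt.plot(fpr, tpr)
        plt.plot([0, 1], [0, 1], linestyle="--")            # random-classifier reference line
        plt.xlabel("False positive rate")
        plt.ylabel("True positive rate")
        plt.show()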

  16. ASCI visualization tool evaluation, Version 2.0

    Energy Technology Data Exchange (ETDEWEB)

    Kegelmeyer, P. [ed.] [Sandia National Labs., Livermore, CA (United States). Center for Computational Engineering

    1997-04-01

    The charter of the ASCI Visualization Common Tools subgroup was to investigate and evaluate 3D scientific visualization tools. As part of that effort, a Tri-Lab evaluation effort was launched in February of 1996. The first step was to agree on a thoroughly documented list of 32 features against which all tool candidates would be evaluated. These evaluation criteria were both gleaned from a user survey and determined from informed extrapolation into the future, particularly as concerns the 3D nature and extremely large size of ASCI data sets. The second step was to winnow a field of 41 candidate tools down to 11. The selection principle was to be as inclusive as practical, retaining every tool that seemed to hold any promise of fulfilling all of ASCI's visualization needs. These 11 tools were then closely investigated by volunteer evaluators distributed across LANL, LLNL, and SNL. This report contains the results of those evaluations, as well as a discussion of the evaluation philosophy and criteria.

  17. TacTool: a tactile rapid prototyping tool for visual interfaces

    NARCIS (Netherlands)

    Keyson, D.V.; Tang, H.K.; Anzai, Y.; Ogawa, K.; Mori, H.

    1995-01-01

    This paper describes the TacTool development tool and input device for designing and evaluating visual user interfaces with tactile feedback. TacTool is currently supported by the IPO trackball with force feedback in the x and y directions. The tool is designed to enable both the designer and the

  18. Visual illusion of tool use recalibrates tactile perception

    Science.gov (United States)

    Miller, Luke E.; Longo, Matthew R.; Saygin, Ayse P.

    2018-01-01

    Brief use of a tool recalibrates multisensory representations of the user’s body, a phenomenon called tool embodiment. Despite two decades of research, little is known about its boundary conditions. It has been widely argued that embodiment requires active tool use, suggesting a critical role for somatosensory and motor feedback. The present study used a visual illusion to cast doubt on this view. We used a mirror-based setup to induce a visual experience of tool use with an arm that was in fact stationary. Following illusory tool use, tactile perception was recalibrated on this stationary arm, and with equal magnitude as physical use. Recalibration was not found following illusory passive tool holding, and could not be accounted for by sensory conflict or general interhemispheric plasticity. These results suggest visual tool-use signals play a critical role in driving tool embodiment. PMID:28196765

  19. Rapid development of medical imaging tools with open-source libraries.

    Science.gov (United States)

    Caban, Jesus J; Joshi, Alark; Nagy, Paul

    2007-11-01

    Rapid prototyping is an important element in researching new imaging analysis techniques and developing custom medical applications. In the last ten years, the open source community and the number of open source libraries and freely available frameworks for biomedical research have grown significantly. What they offer are now considered standards in medical image analysis, computer-aided diagnosis, and medical visualization. A cursory review of the peer-reviewed literature in imaging informatics (indeed, in almost any information technology-dependent scientific discipline) indicates the current reliance on open source libraries to accelerate development and validation of processes and techniques. In this survey paper, we review and compare a few of the most successful open source libraries and frameworks for medical application development. Our dual intentions are to provide evidence that these approaches already constitute a vital and essential part of medical image analysis, diagnosis, and visualization and to motivate the reader to use open source libraries and software for rapid prototyping of medical applications and tools.
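
    As one concrete example of the rapid prototyping these libraries enable, the sketch below uses SimpleITK, a widely used open-source wrapper around ITK, to load and smooth an image; the file names are hypothetical.

        # Load and smooth a medical image with SimpleITK (file names are hypothetical).
        import SimpleITK as sitk

        image = sitk.ReadImage("scan.nii.gz")                    # hypothetical NIfTI volume
        smoothed = sitk.CurvatureFlow(image1=image,
                                      timeStep=0.125,
                                      numberOfIterations=5)      # edge-preserving smoothing
        sitk.WriteImage(smoothed, "scan_smoothed.nii.gz")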

  20. Visualizing data mining results with the Brede tools

    Directory of Open Access Journals (Sweden)

    Finn A Nielsen

    2009-07-01

    A few neuroinformatics databases now exist that record results from neuroimaging studies in the form of brain coordinates in stereotaxic space. The Brede Toolbox was originally developed to extract, analyze and visualize data from one of them, the BrainMap database. Since then the Brede Toolbox has expanded and now includes its own database with coordinates along with ontologies for brain regions and functions: the Brede Database. With the Brede Toolbox and Database combined, we set up automated workflows for extraction of data, mass meta-analytic data mining and visualizations. Most of the web presence of the Brede Database is established by a single script executing a workflow involving these steps together with a final generation of web pages with embedded visualizations and links to interactive three-dimensional models in the Virtual Reality Modeling Language. Apart from the Brede tools, I briefly review alternative visualization tools and methods for Internet-based visualization and information visualization, as well as portals for visualization tools.

  1. IViPP: A Tool for Visualization in Particle Physics

    Science.gov (United States)

    Tran, Hieu; Skiba, Elizabeth; Baldwin, Doug

    2011-10-01

    Experiments and simulations in physics generate a lot of data, and visualization is helpful in preparing that data for analysis. IViPP (Interactive Visualizations in Particle Physics) is an interactive computer program that visualizes results of particle physics simulations or experiments. IViPP can handle data from different simulators, such as SRIM or MCNP. It can display relevant geometry and measured scalar data, and it can do simple selection from the visualized data. In order to be an effective visualization tool, IViPP must have a software architecture that can flexibly adapt to new data sources and display styles. It must be able to display complicated geometry and measured data with a high dynamic range. We therefore organize it in a highly modular structure, develop libraries to describe geometry algorithmically, use rendering algorithms running on the GPU to display 3-D geometry at interactive rates, and represent scalar values in a visual form of scientific notation that shows both mantissa and exponent. This work was supported in part by the US Department of Energy through the Laboratory for Laser Energetics (LLE), with special thanks to Craig Sangster at LLE.

  2. Business intelligence tools for radiology: creating a prototype model using open-source tools.

    Science.gov (United States)

    Prevedello, Luciano M; Andriole, Katherine P; Hanson, Richard; Kelly, Pauline; Khorasani, Ramin

    2010-04-01

    Digital radiology departments could benefit from the ability to integrate and visualize data (e.g. information reflecting complex workflow states) from all of their imaging and information management systems in one composite presentation view. Leveraging data warehousing tools developed in the business world may be one way to achieve this capability. Collectively, the concept of managing the information available in such a data repository is known as Business Intelligence, or BI. This paper describes the concepts used in Business Intelligence, their importance to modern radiology, and the steps used in the creation of a prototype model of a data warehouse for BI using open-source tools.

  3. chimeraviz: a tool for visualizing chimeric RNA.

    Science.gov (United States)

    Lågstad, Stian; Zhao, Sen; Hoff, Andreas M; Johannessen, Bjarne; Lingjærde, Ole Christian; Skotheim, Rolf I

    2017-09-15

    Advances in high-throughput RNA sequencing have enabled more efficient detection of fusion transcripts, but the technology and associated software used for fusion detection from sequencing data often yield a high false discovery rate. Good prioritization of the results is important, and this can be helped by a visualization framework that automatically integrates RNA data with known genomic features. Here we present chimeraviz, a Bioconductor package that automates the creation of chimeric RNA visualizations. The package supports input from nine different fusion-finder tools: deFuse, EricScript, InFusion, JAFFA, FusionCatcher, FusionMap, PRADA, SOAPfuse and STAR-FUSION. chimeraviz is an R package available via Bioconductor (https://bioconductor.org/packages/release/bioc/html/chimeraviz.html) under the Artistic-2.0 license. Source code and support are available at GitHub (https://github.com/stianlagstad/chimeraviz). rolf.i.skotheim@rr-research.no. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  4. Visual Impairment Screening Assessment (VISA) tool: pilot validation.

    Science.gov (United States)

    Rowe, Fiona J; Hepworth, Lauren R; Hanna, Kerry L; Howard, Claire

    2018-03-06

    To report and evaluate a new Vision Impairment Screening Assessment (VISA) tool intended for use by the stroke team to improve identification of visual impairment in stroke survivors. Prospective case cohort comparative study. Stroke units at two secondary care hospitals and one tertiary centre. 116 stroke survivors were screened, 62 by naïve and 54 by non-naïve screeners. Both the VISA screening tool and the comprehensive specialist vision assessment measured case history, visual acuity, eye alignment, eye movements, visual field and visual inattention. Full completion of VISA tool and specialist vision assessment was achieved for 89 stroke survivors. Missing data for one or more sections typically related to patient's inability to complete the assessment. Sensitivity and specificity of the VISA screening tool were 90.24% and 85.29%, respectively; the positive and negative predictive values were 93.67% and 78.36%, respectively. Overall agreement was significant; k=0.736. Lowest agreement was found for screening of eye movement and visual inattention deficits. This early validation of the VISA screening tool shows promise in improving detection accuracy for clinicians involved in stroke care who are not specialists in vision problems and lack formal eye training, with potential to lead to more prompt referral with fewer false positives and negatives. Pilot validation indicates acceptability of the VISA tool for screening of visual impairment in stroke survivors. Sensitivity and specificity were high indicating the potential accuracy of the VISA tool for screening purposes. Results of this study have guided the revision of the VISA screening tool ahead of full clinical validation. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
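
    The screening statistics reported above follow directly from a 2x2 confusion table; the Python sketch below reproduces the arithmetic with hypothetical counts chosen only to roughly match the reported percentages (the abstract does not give the raw counts).

        # Sensitivity, specificity, PPV and NPV from a 2x2 screening table (counts are hypothetical).
        tp, fn = 74, 8     # impaired survivors flagged / missed by the screen
        tn, fp = 29, 5     # unimpaired survivors correctly passed / wrongly flagged

        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        ppv = tp / (tp + fp)          # positive predictive value
        npv = tn / (tn + fn)          # negative predictive value

        print(f"sensitivity={sensitivity:.2%}  specificity={specificity:.2%}  "
              f"PPV={ppv:.2%}  NPV={npv:.2%}")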

  5. GAViT: Genome Assembly Visualization Tool for Short Read Data

    Energy Technology Data Exchange (ETDEWEB)

    Syed, Aijazuddin; Shapiro, Harris; Tu, Hank; Pangilinan, Jasmyn; Trong, Stephan

    2008-03-14

    It is a challenging job for genome analysts to accurately debug, troubleshoot, and validate genome assembly results. Genome analysts rely on visualization tools to help validate and troubleshoot assembly results, including such problems as mis-assemblies, low-quality regions, and repeats. Short read data adds further complexity and makes it extremely challenging for the visualization tools to scale and to view all needed assembly information. As a result, there is a need for a visualization tool that can scale to display assembly data from the new sequencing technologies. We present Genome Assembly Visualization Tool (GAViT), a highly scalable and interactive assembly visualization tool developed at the DOE Joint Genome Institute (JGI).

  6. Haptic over visual information in the distribution of visual attention after tool-use in near and far space.

    Science.gov (United States)

    Park, George D; Reed, Catherine L

    2015-10-01

    Despite attentional prioritization for grasping space near the hands, tool-use appears to transfer attentional bias to the tool's end/functional part. The contributions of haptic and visual inputs to attentional distribution along a tool were investigated as a function of tool-use in near (Experiment 1) and far (Experiment 2) space. Visual attention was assessed with a 50/50, go/no-go, target discrimination task, while a tool was held next to targets appearing near the tool-occupied hand or tool-end. Target response times (RTs) and sensitivity (d-prime) were measured at target locations, before and after functional tool practice for three conditions: (1) open-tool: tool-end visible (visual + haptic inputs), (2) hidden-tool: tool-end visually obscured (haptic input only), and (3) short-tool: stick missing tool's length/end (control condition: hand occupied but no visual/haptic input). In near space, both open- and hidden-tool groups showed a tool-end, attentional bias (faster RTs toward tool-end) before practice; after practice, RTs near the hand improved. In far space, the open-tool group showed no bias before practice; after practice, target RTs near the tool-end improved. However, the hidden-tool group showed a consistent tool-end bias despite practice. Lack of short-tool group results suggested that hidden-tool group results were specific to haptic inputs. In conclusion, (1) allocation of visual attention along a tool due to tool practice differs in near and far space, and (2) visual attention is drawn toward the tool's end even when visually obscured, suggesting haptic input provides sufficient information for directing attention along the tool.

  7. Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT)

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Silva, Claudio [New York Univ. (NYU), NY (United States). Computer Science and Engineering Dept.

    2013-09-30

    For the past three years, a large analysis and visualization effort—funded by the Department of Energy’s Office of Biological and Environmental Research (BER), the National Aeronautics and Space Administration (NASA), and the National Oceanic and Atmospheric Administration (NOAA)—has brought together a wide variety of industry-standard scientific computing libraries and applications to create Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) to serve the global climate simulation and observational research communities. To support interactive analysis and visualization, all components connect through a provenance application–programming interface to capture meaningful history and workflow. Components can be loosely coupled into the framework for fast integration or tightly coupled for greater system functionality and communication with other components. The overarching goal of UV-CDAT is to provide a new paradigm for access to and analysis of massive, distributed scientific data collections by leveraging distributed data architectures located throughout the world. The UV-CDAT framework addresses challenges in analysis and visualization and incorporates new opportunities, including parallelism for better efficiency, higher speed, and more accurate scientific inferences. Today, it provides more than 600 users access to more analysis and visualization products than any other single source.

  8. MEG/EEG source reconstruction, statistical evaluation, and visualization with NUTMEG.

    Science.gov (United States)

    Dalal, Sarang S; Zumer, Johanna M; Guggisberg, Adrian G; Trumpis, Michael; Wong, Daniel D E; Sekihara, Kensuke; Nagarajan, Srikantan S

    2011-01-01

    NUTMEG is a source analysis toolbox geared towards cognitive neuroscience researchers using MEG and EEG, including intracranial recordings. Evoked and unaveraged data can be imported to the toolbox for source analysis in either the time or time-frequency domains. NUTMEG offers several variants of adaptive beamformers, probabilistic reconstruction algorithms, as well as minimum-norm techniques to generate functional maps of spatiotemporal neural source activity. Lead fields can be calculated from single and overlapping sphere head models or imported from other software. Group averages and statistics can be calculated as well. In addition to data analysis tools, NUTMEG provides a unique and intuitive graphical interface for visualization of results. Source analyses can be superimposed onto a structural MRI or headshape to provide a convenient visual correspondence to anatomy. These results can also be navigated interactively, with the spatial maps and source time series or spectrogram linked accordingly. Animations can be generated to view the evolution of neural activity over time. NUTMEG can also display brain renderings and perform spatial normalization of functional maps using SPM's engine. As a MATLAB package, the end user may easily link with other toolboxes or add customized functions.

  9. MindSeer: a portable and extensible tool for visualization of structural and functional neuroimaging data

    Directory of Open Access Journals (Sweden)

    Brinkley James F

    2007-10-01

    Full Text Available Abstract Background Three-dimensional (3-D visualization of multimodality neuroimaging data provides a powerful technique for viewing the relationship between structure and function. A number of applications are available that include some aspect of 3-D visualization, including both free and commercial products. These applications range from highly specific programs for a single modality, to general purpose toolkits that include many image processing functions in addition to visualization. However, few if any of these combine both stand-alone and remote multi-modality visualization in an open source, portable and extensible tool that is easy to install and use, yet can be included as a component of a larger information system. Results We have developed a new open source multimodality 3-D visualization application, called MindSeer, that has these features: integrated and interactive 3-D volume and surface visualization, Java and Java3D for true cross-platform portability, one-click installation and startup, integrated data management to help organize large studies, extensibility through plugins, transparent remote visualization, and the ability to be integrated into larger information management systems. We describe the design and implementation of the system, as well as several case studies that demonstrate its utility. These case studies are available as tutorials or demos on the associated website: http://sig.biostr.washington.edu/projects/MindSeer. Conclusion MindSeer provides a powerful visualization tool for multimodality neuroimaging data. Its architecture and unique features also allow it to be extended into other visualization domains within biomedicine.

  10. VisTool: A user interface and visualization development system

    DEFF Research Database (Denmark)

    Xu, Shangjin

    In Software Engineering, software engineers who develop user interfaces do not follow the classical usability approach. In many cases, it is desirable to use graphical presentations, because a graphical presentation gives a better overview than text forms, and can improve task efficiency and user satisfaction. However, it is more difficult to follow the classical usability approach for graphical presentation development. These difficulties result from the fact that designers cannot implement user interfaces with interactions and real data. We developed VisTool – a user interface and visualization development system – to simplify user interface development. VisTool allows user interface development without real programming. With VisTool a designer assembles visual objects (e.g. textboxes, ellipses, etc.) to visualize database contents. In VisTool, visual properties (e.g. color, position, etc.) can be formulas.

  11. Intervene: a tool for intersection and visualization of multiple gene or genomic region sets.

    Science.gov (United States)

    Khan, Aziz; Mathelier, Anthony

    2017-05-31

    A common task for scientists relies on comparing lists of genes or genomic regions derived from high-throughput sequencing experiments. While several tools exist to intersect and visualize sets of genes, similar tools dedicated to the visualization of genomic region sets are currently limited. To address this gap, we have developed the Intervene tool, which provides an easy and automated interface for the effective intersection and visualization of genomic region or list sets, thus facilitating their analysis and interpretation. Intervene contains three modules: venn to generate Venn diagrams of up to six sets, upset to generate UpSet plots of multiple sets, and pairwise to compute and visualize intersections of multiple sets as clustered heat maps. Intervene, and its interactive web ShinyApp companion, generate publication-quality figures for the interpretation of genomic region and list sets. Intervene and its web application companion provide an easy command line and an interactive web interface to compute intersections of multiple genomic and list sets. They have the capacity to plot intersections using easy-to-interpret visual approaches. Intervene is developed and designed to meet the needs of both computer scientists and biologists. The source code is freely available at https://bitbucket.org/CBGR/intervene , with the web application available at https://asntech.shinyapps.io/intervene .
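
    To give a feel for what a pairwise intersection module computes, the sketch below builds a pairwise intersection-count matrix for a few made-up gene lists. It is a minimal Python illustration of the idea, not Intervene's own code.

        # Minimal sketch (not Intervene's implementation): pairwise intersection
        # counts for a few made-up gene lists, arranged as a matrix that a heat
        # map could be drawn from.
        gene_sets = {
            "setA": {"TP53", "BRCA1", "EGFR", "MYC"},
            "setB": {"EGFR", "MYC", "KRAS"},
            "setC": {"TP53", "KRAS", "PTEN"},
        }

        names = sorted(gene_sets)
        matrix = [[len(gene_sets[a] & gene_sets[b]) for b in names] for a in names]

        print("\t" + "\t".join(names))
        for name, row in zip(names, matrix):
            print(name + "\t" + "\t".join(str(v) for v in row))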

  12. Visualization of Broadband Sound Sources

    Directory of Open Access Journals (Sweden)

    Sukhanov Dmitry

    2016-01-01

    Full Text Available In this paper, a method for imaging wideband audio sources is proposed, based on 2D microphone array measurements of the sound field recorded simultaneously by all microphones. The microphone array consists of 160 microphones and digitizes the signals at 7200 Hz. The measured signals are processed with a dedicated algorithm that produces a flat image of the wideband sound sources. It is shown experimentally that the visualization does not depend on the waveform but is determined by the bandwidth. The developed system can visualize sources with a resolution of up to 10 cm.
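
    The abstract does not spell out the reconstruction algorithm, so the sketch below shows a generic delay-and-sum approach to acoustic imaging with a planar microphone array. The geometry and helper names are invented for illustration; only the 7200 Hz figure is taken from the abstract, and the code is an assumption rather than the authors' method.

        # Generic delay-and-sum imaging sketch (an assumption, not the authors'
        # method). Signals from a planar microphone array are time-shifted by the
        # relative propagation delay from a candidate source pixel and summed;
        # high summed energy marks a likely source position.
        import numpy as np

        C = 343.0      # speed of sound, m/s
        FS = 7200.0    # sampling frequency from the abstract, Hz

        def focused_energy(signals, mic_xy, pixel_xyz):
            """signals: (n_mics, n_samples); mic_xy: (n_mics, 2) positions in the
            z=0 plane; pixel_xyz: length-3 candidate source position (metres)."""
            n_mics, n_samples = signals.shape
            mic_xyz = np.c_[np.asarray(mic_xy), np.zeros(len(mic_xy))]
            dists = np.sqrt(((mic_xyz - np.asarray(pixel_xyz)) ** 2).sum(axis=1))
            shifts = np.round((dists - dists.min()) / C * FS).astype(int)
            shifts = np.minimum(shifts, n_samples)
            acc = np.zeros(n_samples)
            for m in range(n_mics):
                acc[:n_samples - shifts[m]] += signals[m, shifts[m]:]
            return float(np.sum(acc ** 2))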

  13. A low complexity visualization tool that helps to perform complex systems analysis

    International Nuclear Information System (INIS)

    Beiro, M G; Alvarez-Hamelin, J I; Busch, J R

    2008-01-01

    In this paper, we present an extension of large network visualization (LaNet-vi), a tool to visualize large-scale networks using the k-core decomposition. One of the new features is how vertices compute their angular position. While in the previous version it is done using shell clusters, in this version we use the angular coordinate of vertices in higher k-shells, and arrange the highest shell according to a cliques decomposition. The time complexity goes from O(n√n) to O(n) under bounds on a heavy-tailed degree distribution. The tool also performs a k-core-connectivity analysis, highlighting vertices that are not k-connected; for example, this property is useful to measure robustness or quality of service (QoS) capabilities in communication networks. Finally, the current version of LaNet-vi can draw labels and all the edges using transparencies, yielding an accurate visualization. Based on the obtained figure, it is possible to distinguish different sources and types of complex networks at a glance, in a sort of 'network iris-print'.
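
    A minimal way to reproduce the underlying k-core decomposition (not LaNet-vi itself) is NetworkX's core_number routine, sketched below on a synthetic heavy-tailed graph.

        # Sketch of the k-core decomposition that LaNet-vi visualizes, computed
        # here with NetworkX on a synthetic heavy-tailed graph (this is not
        # LaNet-vi's own code).
        import networkx as nx

        G = nx.barabasi_albert_graph(1000, 3)     # heavy-tailed example network
        core_number = nx.core_number(G)           # k-shell index of every vertex
        max_k = max(core_number.values())
        top_shell = [v for v, k in core_number.items() if k == max_k]
        print(f"highest shell: k = {max_k}, containing {len(top_shell)} vertices")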

  14. A low complexity visualization tool that helps to perform complex systems analysis

    Science.gov (United States)

    Beiró, M. G.; Alvarez-Hamelin, J. I.; Busch, J. R.

    2008-12-01

    In this paper, we present an extension of large network visualization (LaNet-vi), a tool to visualize large-scale networks using the k-core decomposition. One of the new features is how vertices compute their angular position. While in the previous version it is done using shell clusters, in this version we use the angular coordinate of vertices in higher k-shells, and arrange the highest shell according to a cliques decomposition. The time complexity goes from O(n√n) to O(n) under bounds on a heavy-tailed degree distribution. The tool also performs a k-core-connectivity analysis, highlighting vertices that are not k-connected; for example, this property is useful to measure robustness or quality of service (QoS) capabilities in communication networks. Finally, the current version of LaNet-vi can draw labels and all the edges using transparencies, yielding an accurate visualization. Based on the obtained figure, it is possible to distinguish different sources and types of complex networks at a glance, in a sort of 'network iris-print'.

  15. 5D Task Analysis Visualization Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  16. Interactive Data Visualization for HIV Cohorts: Leveraging Data Exchange Standards to Share and Reuse Research Tools.

    Directory of Open Access Journals (Sweden)

    Meridith Blevins

    Full Text Available To develop and disseminate tools for interactive visualization of HIV cohort data. If a picture is worth a thousand words, then an interactive video, composed of a long string of pictures, can produce an even richer presentation of HIV population dynamics. We developed an HIV cohort data visualization tool using open-source software (R statistical language). The tool requires that the data structure conform to the HIV Cohort Data Exchange Protocol (HICDEP), and our implementation utilized Caribbean, Central and South America network (CCASAnet) data. This tool currently presents patient-level data in three classes of plots: (1) Longitudinal plots showing changes in measurements viewed alongside event probability curves allowing for simultaneous inspection of outcomes by relevant patient classes. (2) Bubble plots showing changes in indicators over time allowing for observation of group level dynamics. (3) Heat maps of levels of indicators changing over time allowing for observation of spatial-temporal dynamics. Examples of each class of plot are given using CCASAnet data investigating trends in CD4 count and AIDS at antiretroviral therapy (ART) initiation, CD4 trajectories after ART initiation, and mortality. We invite researchers interested in this data visualization effort to use these tools and to suggest new classes of data visualization. We aim to contribute additional shareable tools in the spirit of open scientific collaboration and hope that these tools further the participation in open data standards like HICDEP by the HIV research community.

  17. Visualization and analytics tools for infectious disease epidemiology: a systematic review.

    Science.gov (United States)

    Carroll, Lauren N; Au, Alan P; Detwiler, Landon Todd; Fu, Tsung-Chieh; Painter, Ian S; Abernethy, Neil F

    2014-10-01

    A myriad of new tools and algorithms have been developed to help public health professionals analyze and visualize the complex data used in infectious disease control. To better understand approaches to meet these users' information needs, we conducted a systematic literature review focused on the landscape of infectious disease visualization tools for public health professionals, with a special emphasis on geographic information systems (GIS), molecular epidemiology, and social network analysis. The objectives of this review are to: (1) identify public health user needs and preferences for infectious disease information visualization tools; (2) identify existing infectious disease information visualization tools and characterize their architecture and features; (3) identify commonalities among approaches applied to different data types; and (4) describe tool usability evaluation efforts and barriers to the adoption of such tools. We identified articles published in English from January 1, 1980 to June 30, 2013 from five bibliographic databases. Articles with a primary focus on infectious disease visualization tools, needs of public health users, or usability of information visualizations were included in the review. A total of 88 articles met our inclusion criteria. Users were found to have diverse needs, preferences and uses for infectious disease visualization tools, and the existing tools are correspondingly diverse. The architecture of the tools was inconsistently described, and few tools in the review discussed the incorporation of usability studies or plans for dissemination. Many studies identified concerns regarding data sharing, confidentiality and quality. Existing tools offer a range of features and functions that allow users to explore, analyze, and visualize their data, but the tools are often for siloed applications. Commonly cited barriers to widespread adoption included lack of organizational support, access issues, and misconceptions about tool

  18. SnopViz, an interactive snow profile visualization tool

    Science.gov (United States)

    Fierz, Charles; Egger, Thomas; Gerber, Matthias; Bavay, Mathias; Techel, Frank

    2016-04-01

    SnopViz is a visualization tool for both simulation outputs of the snow-cover model SNOWPACK and observed snow profiles. It has been designed to fulfil the needs of operational services (Swiss Avalanche Warning Service, Avalanche Canada) as well as offer the flexibility required to satisfy the specific needs of researchers. This JavaScript application runs on any modern browser and does not require an active Internet connection. The open source code is available for download from models.slf.ch where examples can also be run. Both the SnopViz library and the SnopViz User Interface will become a full replacement of the current research visualization tool SN_GUI for SNOWPACK. The SnopViz library is a stand-alone application that parses the provided input files, for example, a single snow profile (CAAML file format) or multiple snow profiles as output by SNOWPACK (PRO file format). A plugin architecture allows for handling JSON objects (JavaScript Object Notation) as well and plugins for other file formats may be added easily. The outputs are provided either as vector graphics (SVG) or JSON objects. The SnopViz User Interface (UI) is a browser based stand-alone interface. It runs in every modern browser, including IE, and allows user interaction with the graphs. SVG, the XML based standard for vector graphics, was chosen because of its easy interaction with JS and a good software support (Adobe Illustrator, Inkscape) to manipulate graphs outside SnopViz for publication purposes. SnopViz provides new visualization for SNOWPACK timeline output as well as time series input and output. The actual output format for SNOWPACK timelines was retained while time series are read from SMET files, a file format used in conjunction with the open source data handling code MeteoIO. Finally, SnopViz is able to render single snow profiles, either observed or modelled, that are provided as CAAML-file. This file format (caaml.org/Schemas/V5.0/Profiles/SnowProfileIACS) is an international

  19. New target prediction and visualization tools incorporating open source molecular fingerprints for TB Mobile 2.0.

    Science.gov (United States)

    Clark, Alex M; Sarker, Malabika; Ekins, Sean

    2014-01-01

    We recently developed a freely available mobile app (TB Mobile) for both iOS and Android platforms that displays Mycobacterium tuberculosis (Mtb) active molecule structures and their targets with links to associated data. The app was developed to make target information available to as large an audience as possible. We now report a major update of the iOS version of the app. This includes enhancements that use an implementation of ECFP_6 fingerprints that we have made open source. Using these fingerprints, the user can propose compounds with possible anti-TB activity, and view the compounds within a cluster landscape. Proposed compounds can also be compared to existing target data, using a naïve Bayesian scoring system to rank probable targets. We have curated an additional 60 new compounds and their targets for Mtb and added these to the original set of 745 compounds. We have also curated 20 further compounds (many without targets in TB Mobile) to evaluate this version of the app with 805 compounds and associated targets. TB Mobile can now manage a small collection of compounds that can be imported from external sources, or exported by various means such as email or app-to-app inter-process communication. This means that TB Mobile can be used as a node within a growing ecosystem of mobile apps for cheminformatics. It can also cluster compounds and use internal algorithms to help identify potential targets based on molecular similarity. TB Mobile represents a valuable dataset, data-visualization aid and target prediction tool.
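
    The ECFP_6 fingerprints mentioned above can be approximated with open-source cheminformatics toolkits. The sketch below uses RDKit's Morgan fingerprints with radius 3, a common ECFP_6 analogue, and Tanimoto similarity on two example anti-TB drugs; it is an illustration, not TB Mobile's own fingerprint code.

        # Stand-in sketch using RDKit: Morgan fingerprints with radius 3 are the
        # common open-source analogue of ECFP_6. This is an illustration, not
        # TB Mobile's own implementation.
        from rdkit import Chem, DataStructs
        from rdkit.Chem import AllChem

        isoniazid = Chem.MolFromSmiles("NNC(=O)c1ccncc1")
        pyrazinamide = Chem.MolFromSmiles("NC(=O)c1cnccn1")

        fp1 = AllChem.GetMorganFingerprintAsBitVect(isoniazid, 3, nBits=2048)
        fp2 = AllChem.GetMorganFingerprintAsBitVect(pyrazinamide, 3, nBits=2048)

        print("Tanimoto similarity:", DataStructs.TanimotoSimilarity(fp1, fp2))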

  20. Python tools for Visual Studio

    CERN Document Server

    Wang, Cathy

    2014-01-01

    This is a hands-on guide that provides exemplary coverage of all the features and concepts related to PTVS.The book is intended for developers who are aiming to enhance their productivity in Python projects with automation tools that Visual Studio provides for the .Net community. Some basic knowledge of Python programming is essential.

  1. Visual Arts as a Tool for Phenomenology

    Directory of Open Access Journals (Sweden)

    Anna S. CohenMiller

    2017-12-01

    Full Text Available In this article I explain the process and benefits of using visual arts as a tool within a transcendental phenomenological study. I present and discuss drawings created and described by four participants over the course of twelve interviews. Findings suggest the utility of visual arts methods within the phenomenological toolset to encourage participant voice through easing communication and facilitating understanding.

  2. Development of a Carbon Sequestration Visualization Tool using Google Earth Pro

    Science.gov (United States)

    Keating, G. N.; Greene, M. K.

    2008-12-01

    The Big Sky Carbon Sequestration Partnership seeks to prepare organizations throughout the western United States for a possible carbon-constrained economy. Through the development of CO2 capture and subsurface sequestration technology, the Partnership is working to enable the region to cleanly utilize its abundant fossil energy resources. The intent of the Los Alamos National Laboratory Big Sky Visualization tool is to allow geochemists, geologists, geophysicists, project managers, and other project members to view, identify, and query the data collected from CO2 injection tests using a single data source platform, a mission to which Google Earth Pro is uniquely and ideally suited. The visualization framework enables fusion of data from disparate sources and allows investigators to fully explore spatial and temporal trends in CO2 fate and transport within a reservoir. 3-D subsurface wells are projected above ground in Google Earth as the KML anchor points for the presentation of various surface and subsurface data. This solution is the most integrative and cost-effective one possible for the variety of users in the Big Sky community.
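
    For orientation, the kind of KML that anchors such markers in Google Earth is quite small. The sketch below writes a single placemark with hypothetical coordinates and labels; it is not the Big Sky tool's code.

        # Minimal sketch of the kind of KML a Google Earth based tool emits for a
        # well marker (hypothetical name and coordinates; not the Big Sky tool's code).
        KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
        <kml xmlns="http://www.opengis.net/kml/2.2">
          <Document>
            <Placemark>
              <name>{name}</name>
              <description>{description}</description>
              <Point><coordinates>{lon},{lat},{alt}</coordinates></Point>
            </Placemark>
          </Document>
        </kml>
        """

        with open("injection_well.kml", "w") as fh:
            fh.write(KML_TEMPLATE.format(name="CO2 injection well (example)",
                                         description="Hypothetical monitoring point",
                                         lon=-110.5, lat=45.7, alt=0))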

  3. Visualization tool for three-dimensional plasma velocity distributions (ISEE_3D) as a plug-in for SPEDAS

    Science.gov (United States)

    Keika, Kunihiro; Miyoshi, Yoshizumi; Machida, Shinobu; Ieda, Akimasa; Seki, Kanako; Hori, Tomoaki; Miyashita, Yukinaga; Shoji, Masafumi; Shinohara, Iku; Angelopoulos, Vassilis; Lewis, Jim W.; Flores, Aaron

    2017-12-01

    This paper introduces ISEE_3D, an interactive visualization tool for three-dimensional plasma velocity distribution functions, developed by the Institute for Space-Earth Environmental Research, Nagoya University, Japan. The tool provides a variety of methods to visualize the distribution function of space plasma: scatter, volume, and isosurface modes. The tool also has a wide range of functions, such as displaying magnetic field vectors and two-dimensional slices of distributions to facilitate extensive analysis. The coordinate transformation to the magnetic field coordinates is also implemented in the tool. The source codes of the tool are written as scripts of a widely used data analysis software language, Interactive Data Language, which has been widespread in the field of space physics and solar physics. The current version of the tool can be used for data files of the plasma distribution function from the Geotail satellite mission, which are publicly accessible through the Data Archives and Transmission System of the Institute of Space and Astronautical Science (ISAS)/Japan Aerospace Exploration Agency (JAXA). The tool is also available in the Space Physics Environment Data Analysis Software to visualize plasma data from the Magnetospheric Multiscale and the Time History of Events and Macroscale Interactions during Substorms missions. The tool is planned to be applied to data from other missions, such as Arase (ERG) and Van Allen Probes after replacing or adding data loading plug-ins. This visualization tool helps scientists understand the dynamics of space plasma better, particularly in the regions where the magnetohydrodynamic approximation is not valid, for example, the Earth's inner magnetosphere, magnetopause, bow shock, and plasma sheet.

  4. Real-time analysis, visualization, and steering of microtomography experiments at photon sources

    International Nuclear Information System (INIS)

    Laszeski, G. von; Insley, J.A.; Foster, I.; Bresnahan, J.; Kesselman, C.; Su, M.; Thiebaux, M.; Rivers, M.L.; Wang, S.; Tieman, B.; McNulty, I.

    2000-01-01

    A new generation of specialized scientific instruments called synchrotron light sources allow the imaging of materials at very fine scales. However, in contrast to a traditional microscope, interactive use has not previously been possible because of the large amounts of data generated and the considerable computation required translating this data into a useful image. The authors describe a new software architecture that uses high-speed networks and supercomputers to enable quasi-real-time and hence interactive analysis of synchrotron light source data. This architecture uses technologies provided by the Globus computational grid toolkit to allow dynamic creation of a reconstruction pipeline that transfers data from a synchrotron source beamline to a preprocessing station, next to a parallel reconstruction system, and then to multiple visualization stations. Collaborative analysis tools allow multiple users to control data visualization. As a result, local and remote scientists can see and discuss preliminary results just minutes after data collection starts. The implications for more efficient use of this scarce resource and for more effective science appear tremendous

  5. Visual Data Comm: A Tool for Visualizing Data Communication in the Multi Sector Planner Study

    Science.gov (United States)

    Lee, Hwasoo Eric

    2010-01-01

    Data comm is a new technology proposed for the future air transport system as a potential tool to provide comprehensive data connectivity. It is a key enabler for managing 4D trajectories digitally, potentially resulting in improved flight times and increased throughput. Future concepts with data comm integration have been tested in a number of human-in-the-loop studies, but analyzing the results has proven to be particularly challenging because the future traffic environment in which data comm is fully enabled assumes high traffic density, resulting in data sets with large amounts of information. This paper describes the motivation, design, current and potential future application of Visual Data Comm (VDC), a tool for visualizing data developed in Java using the Processing library, a tool package designed for interactive visualization programming. This paper includes an example of an application of VDC on data pertaining to the most recent Multi Sector Planner study, conducted at NASA's Airspace Operations Laboratory in 2009, in which VDC was used to visualize and interpret data comm activities.

  6. VisIt: An End-User Tool for Visualizing and Analyzing Very Large Data

    Energy Technology Data Exchange (ETDEWEB)

    Childs, Hank; Brugger, Eric; Whitlock, Brad; Meredith, Jeremy; Ahern, Sean; Pugmire, David; Biagas, Kathleen; Miller, Mark; Weber, Gunther H.; Krishnan, Hari; Fogal, Thomas; Sanderson, Allen; Garth, Christoph; Bethel, E. Wes; Camp, David; Ruebel, Oliver; Durant, Marc; Favre, Jean; Navratil, Paul

    2012-11-01

    VisIt is a popular open source tool for visualizing and analyzing big data. It owes its success to its focus on increasing data understanding, supporting large data, and providing a robust and usable product, as well as to its underlying design, which fits today's supercomputing landscape. This report, which draws heavily from an earlier publication at the SciDAC Conference in 2011, describes the VisIt project and its accomplishments.

  7. VCS: Tool for Visualizing Copy Number Variation and Single Nucleotide Polymorphism

    Directory of Open Access Journals (Sweden)

    HyoYoung Kim

    2014-12-01

    Full Text Available Copy number variation (CNV) and single nucleotide polymorphism (SNP) are useful genetic resources that aid in understanding complex phenotypes and disease susceptibility. Although thousands of CNVs and SNPs are currently available in public databases, they are somewhat difficult to use for analyses without visualization tools. We developed a web-based tool called VCS (visualization of CNV or SNP) to visualize detected CNVs or SNPs. The VCS tool helps to interpret the biological meaning of the numerical values of CNVs and SNPs. VCS provides six visualization tools: (i) the enrichment of genome contents in CNVs; (ii) the physical distribution of CNVs or SNPs on chromosomes; (iii) the distribution of the log2 ratio of CNVs filtered by criteria of interest; (iv) the number of CNVs or SNPs per binning unit; (v) the distribution of homozygosity of SNP genotypes; and (vi) a cytomap of genes within CNV or SNP regions.

  8. A web-based data visualization tool for the MIMIC-II database.

    Science.gov (United States)

    Lee, Joon; Ribey, Evan; Wallace, James R

    2016-02-04

    Although MIMIC-II, a public intensive care database, has been recognized as an invaluable resource for many medical researchers worldwide, becoming a proficient MIMIC-II researcher requires knowledge of SQL programming and an understanding of the MIMIC-II database schema. These are challenging requirements especially for health researchers and clinicians who may have limited computer proficiency. In order to overcome this challenge, our objective was to create an interactive, web-based MIMIC-II data visualization tool that first-time MIMIC-II users can easily use to explore the database. The tool offers two main features: Explore and Compare. The Explore feature enables the user to select a patient cohort within MIMIC-II and visualize the distributions of various administrative, demographic, and clinical variables within the selected cohort. The Compare feature enables the user to select two patient cohorts and visually compare them with respect to a variety of variables. The tool is also helpful to experienced MIMIC-II researchers who can use it to substantially accelerate the cumbersome and time-consuming steps of writing SQL queries and manually visualizing extracted data. Any interested researcher can use the MIMIC-II data visualization tool for free to quickly and conveniently conduct a preliminary investigation on MIMIC-II with a few mouse clicks. Researchers can also use the tool to learn the characteristics of the MIMIC-II patients. Since it is still impossible to conduct multivariable regression inside the tool, future work includes adding analytics capabilities. Also, the next version of the tool will aim to utilize MIMIC-III which contains more data.

  9. 5D Task Analysis Visualization Tool Phase II, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The creation of a five-dimensional task analysis visualization (5D-TAV) software tool for Task Analysis and Workload Planning using multi-dimensional visualization...

  10. New tools to aid in scientific computing and visualization

    International Nuclear Information System (INIS)

    Wallace, M.G.; Christian-Frear, T.L.

    1992-01-01

    In this paper, two computer programs are described which aid in the pre- and post-processing of computer generated data. CoMeT (Computational Mechanics Toolkit) is a customizable, interactive, graphical, menu-driven program that provides the analyst with a consistent user-friendly interface to analysis codes. Trans Vol (Transparent Volume Visualization) is a specialized tool for the scientific three-dimensional visualization of complex solids by the technique of volume rendering. Both tools are described in basic detail along with an application example concerning the simulation of contaminant migration from an underground nuclear repository

  11. Three-D Google Earth based geospatial visualization tool for the smart grid distribution

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, K. [Enterprise Horizons, Fremont, CA (United States)

    2009-07-01

    Smart grids can be used to liberalize markets, ensure reliability and reduce the environmental footprint of electric utilities. This presentation discussed a geo-spatial visualization tool for smart grid distribution. The visualization tool can be used to visualize transmission lines, substations, and is capable of viewing millions of topographical components. The tool was designed to track and monitor the health of assets and to increase awareness of vulnerabilities, vegetation, and regional demographics. The tool is also capable of identifying potential issues before a rolling blackout situation as well as anticipating islanding spike events. The visualization tool can be segmented by population and industrial belts, and is able to provide diagnostics on power factor turbulence for congestion bottlenecks. When used for transmission line and substation siting, the tool can provide terrain feasibility analyses and environmental impact analyses. Weather-based demand forecasting can be used to determine critical customers impacted by potential outages. CAD drawings can be used to visualize assets in virtual reality and can be linked to consumer indexing and smart metering initiatives. It was concluded that the web-based tool can also be used for workforce and dispatch management. tabs., figs.

  12. HMMEditor: a visual editing tool for profile hidden Markov model

    Directory of Open Access Journals (Sweden)

    Cheng Jianlin

    2008-03-01

    Full Text Available Abstract Background Profile Hidden Markov Model (HMM) is a powerful statistical model to represent a family of DNA, RNA, and protein sequences. Profile HMM has been widely used in bioinformatics research such as sequence alignment, gene structure prediction, motif identification, protein structure prediction, and biological database search. However, few comprehensive, visual editing tools for profile HMM are publicly available. Results We develop a visual editor for profile Hidden Markov Models (HMMEditor). HMMEditor can visualize the profile HMM architecture, transition probabilities, and emission probabilities. Moreover, it provides functions to edit and save HMM and parameters. Furthermore, HMMEditor allows users to align a sequence against the profile HMM and to visualize the corresponding Viterbi path. Conclusion HMMEditor provides a set of unique functions to visualize and edit a profile HMM. It is a useful tool for biological sequence analysis and modeling. Both HMMEditor software and web service are freely available.
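
    The Viterbi path mentioned above is computed by the standard Viterbi dynamic program. A compact, generic implementation is sketched below with a toy two-state model; the parameters are hypothetical and this is not HMMEditor's code.

        # Generic Viterbi decoding sketch (not HMMEditor's implementation): finds
        # the most probable state path of an HMM for an observed sequence.
        import math

        def viterbi(obs, states, start_p, trans_p, emit_p):
            """Probabilities are plain floats; log-space is used internally to
            avoid underflow on long sequences."""
            V = [{s: (math.log(start_p[s]) + math.log(emit_p[s][obs[0]]), [s])
                  for s in states}]
            for t in range(1, len(obs)):
                V.append({})
                for s in states:
                    best_prev, best_score = max(
                        ((p, V[t - 1][p][0] + math.log(trans_p[p][s])) for p in states),
                        key=lambda x: x[1])
                    V[t][s] = (best_score + math.log(emit_p[s][obs[t]]),
                               V[t - 1][best_prev][1] + [s])
            return max(V[-1].values(), key=lambda x: x[0])

        # Toy two-state example with hypothetical parameters.
        score, path = viterbi(
            obs=["A", "C", "G", "A"],
            states=["match", "insert"],
            start_p={"match": 0.8, "insert": 0.2},
            trans_p={"match": {"match": 0.9, "insert": 0.1},
                     "insert": {"match": 0.5, "insert": 0.5}},
            emit_p={"match": {"A": 0.4, "C": 0.3, "G": 0.2, "T": 0.1},
                    "insert": {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}})
        print(path, score)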

  13. Intrusion Detection using Open Source Tools

    OpenAIRE

    Jack TIMOFTE

    2008-01-01

    We have witnessed in the recent years that open source tools have gained popularity among all types of users, from individuals or small businesses to large organizations and enterprises. In this paper we will present three open source IDS tools: OSSEC, Prelude and SNORT.

  14. Spectacle and SpecViz: New Spectral Analysis and Visualization Tools

    Science.gov (United States)

    Earl, Nicholas; Peeples, Molly; JDADF Developers

    2018-01-01

    A new era of spectroscopic exploration of our universe is being ushered in with advances in instrumentation and next-generation space telescopes. The advent of new spectroscopic instruments has highlighted a pressing need for tools scientists can use to analyze and explore these new data. We have developed Spectacle, a software package for analyzing both synthetic spectra from hydrodynamic simulations as well as real COS data with an aim of characterizing the behavior of the circumgalactic medium. It allows easy reduction of spectral data and analytic line generation capabilities. Currently, the package is focused on automatic determination of absorption regions and line identification with custom line list support, simultaneous line fitting using Voigt profiles via least-squares or MCMC methods, and multi-component modeling of blended features. Non-parametric measurements, such as equivalent widths, delta v90, and full-width half-max are available. Spectacle also provides the ability to compose compound models used to generate synthetic spectra allowing the user to define various LSF kernels, uncertainties, and to specify sampling.We also present updates to the visualization tool SpecViz, developed in conjunction with the JWST data analysis tools development team, to aid in the exploration of spectral data. SpecViz is an open source, Python-based spectral 1-D interactive visualization and analysis application built around high-performance interactive plotting. It supports handling general and instrument-specific data and includes advanced tool-sets for filtering and detrending one-dimensional data, along with the ability to isolate absorption regions using slicing and manipulate spectral features via spectral arithmetic. Multi-component modeling is also possible using a flexible model fitting tool-set that supports custom models to be used with various fitting routines. It also features robust user extensions such as custom data loaders and support for user
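
    Of the non-parametric measurements listed, the equivalent width is the simplest to illustrate numerically: it is the integral of (1 - F/F_continuum) over wavelength. The sketch below evaluates it on a synthetic, normalized absorption line; it is not Spectacle's implementation.

        # Generic sketch of an equivalent-width measurement on a synthetic,
        # normalized absorption line (not Spectacle's implementation).
        import numpy as np

        wavelength = np.linspace(1215.0, 1219.0, 400)      # Angstrom, hypothetical grid
        continuum = np.ones_like(wavelength)
        line = 0.7 * np.exp(-0.5 * ((wavelength - 1217.0) / 0.3) ** 2)
        flux = continuum - line                            # synthetic absorption profile

        # EW = integral of (1 - F / F_continuum) over wavelength
        dlam = np.gradient(wavelength)
        equivalent_width = np.sum((1.0 - flux / continuum) * dlam)
        print(f"equivalent width = {equivalent_width:.3f} Angstrom")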

  15. A new web-based tool for data visualization in MDSplus

    Energy Technology Data Exchange (ETDEWEB)

    Manduchi, G., E-mail: gabriele.manduchi@igi.cnr.it [Consorzio RFX, Euratom-ENEA Association, Corso Stati Uniti 4, Padova 35127 (Italy); Fredian, T.; Stillerman, J. [Massachusetts Institute of Technology, 175 Albany Street, Cambridge, MA 02139 (United States)

    2014-05-15

    Highlights: • The paper describes a new web-based data visualization tool for MDSplus. • It describes the experience gained with the previous data visualization tools. • It describes the used technologies for web data access and visualization. • It describes the current architecture of the tool and the new foreseen features. - Abstract: The Java tool jScope has been widely used for years to display acquired waveform in MDSplus. The choice of the Java programming language for its implementation has been successful for several reasons among which the fact that Java supports a multiplatform environment and it is well suited for graphics and the management of network communication. jScope can be used both as a local and remote application. In the latter case, data are acquired via TCP/IP communication using the mdsip protocol. Exporting data in this way however introduces several security problems due to the necessity of opening firewall holes for the user ports. For this reason, and also due to the fact that JavaScript is becoming a widely used language for web applications, a new tool written in JavaScript and called WebScope has been developed for the visualization of MDSplus data in web browsers. Data communication is now achieved via http protocol using Asynchronous JavaScript and XML (AJAX) technology. At the server side, data access is carried out by a Python module that interacts with the web server via Web Server Gateway Interface (WSGI). When a data item, described by an MDSplus expression, is requested by the web browser for visualization, it is returned as a binary message and then handled by callback JavaScript functions activated by the web browser. Scalable Vector Graphics (SVG) technology is used to handle graphics within the web browser and to carry out the same interactive data visualization provided by jScope. In addition to mouse events, touch events are supported to provide interactivity also on touch screens. In this way, waveforms can be
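
    The server-side pattern described above, a Python module answering browser requests through WSGI with binary messages, can be illustrated with a minimal generic WSGI callable. The query parameter name and the packing format below are assumptions, and the sketch is not WebScope's actual module.

        # Minimal generic WSGI sketch of the server-side pattern described in the
        # abstract: a Python callable answers a data request with a binary message.
        # Illustrative only; the "expr" query parameter and packing format are
        # assumptions, not WebScope's actual interface.
        import struct
        from urllib.parse import parse_qs
        from wsgiref.simple_server import make_server

        def application(environ, start_response):
            query = parse_qs(environ.get("QUERY_STRING", ""))
            expr = query.get("expr", ["\\top:signal"])[0]   # MDSplus-style expression
            samples = [0.0, 0.5, 1.0, 0.5, 0.0]             # placeholder for real data access
            body = struct.pack("<I%df" % len(samples), len(samples), *samples)
            start_response("200 OK", [("Content-Type", "application/octet-stream"),
                                      ("Content-Length", str(len(body))),
                                      ("X-Requested-Expression", expr)])
            return [body]

        if __name__ == "__main__":
            make_server("localhost", 8080, application).serve_forever()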

  16. A new web-based tool for data visualization in MDSplus

    International Nuclear Information System (INIS)

    Manduchi, G.; Fredian, T.; Stillerman, J.

    2014-01-01

    Highlights: • The paper describes a new web-based data visualization tool for MDSplus. • It describes the experience gained with the previous data visualization tools. • It describes the used technologies for web data access and visualization. • It describes the current architecture of the tool and the new foreseen features. - Abstract: The Java tool jScope has been widely used for years to display acquired waveform in MDSplus. The choice of the Java programming language for its implementation has been successful for several reasons among which the fact that Java supports a multiplatform environment and it is well suited for graphics and the management of network communication. jScope can be used both as a local and remote application. In the latter case, data are acquired via TCP/IP communication using the mdsip protocol. Exporting data in this way however introduces several security problems due to the necessity of opening firewall holes for the user ports. For this reason, and also due to the fact that JavaScript is becoming a widely used language for web applications, a new tool written in JavaScript and called WebScope has been developed for the visualization of MDSplus data in web browsers. Data communication is now achieved via http protocol using Asynchronous JavaScript and XML (AJAX) technology. At the server side, data access is carried out by a Python module that interacts with the web server via Web Server Gateway Interface (WSGI). When a data item, described by an MDSplus expression, is requested by the web browser for visualization, it is returned as a binary message and then handled by callback JavaScript functions activated by the web browser. Scalable Vector Graphics (SVG) technology is used to handle graphics within the web browser and to carry out the same interactive data visualization provided by jScope. In addition to mouse events, touch events are supported to provide interactivity also on touch screens. In this way, waveforms can be

  17. A Visualization-Based Tutoring Tool for Engineering Education

    Science.gov (United States)

    Nguyen, Tang-Hung; Khoo, I.-Hung

    2010-06-01

    In engineering disciplines, students usually have a hard time visualizing different aspects of engineering analysis and design, which are inherently too complex or abstract to fully understand without the aid of visual explanations or visualizations. For example, when learning the materials and sequences of a construction process, students need to visualize how all components of a constructed facility are assembled. Such visualization cannot be achieved in a textbook or a traditional lecturing environment. In this paper, the authors present the development of computer tutoring software in which different visualization tools, including video clips, 3-dimensional models, drawings, and pictures/photos together with complementary texts, are used to assist students in deeply understanding and effectively mastering the material. The paper also discusses the implementation and the effectiveness evaluation of the proposed tutoring software, which was used to teach a construction engineering management course offered at California State University, Long Beach.

  18. VMEXT: A Visualization Tool for Mathematical Expression Trees

    OpenAIRE

    Schubotz, Moritz; Meuschke, Norman; Hepp, Thomas; Cohl, Howard S.; Gipp, Bela

    2017-01-01

    Mathematical expressions can be represented as a tree consisting of terminal symbols, such as identifiers or numbers (leaf nodes), and functions or operators (non-leaf nodes). Expression trees are an important mechanism for storing and processing mathematical expressions as well as the most frequently used visualization of the structure of mathematical expressions. Typically, researchers and practitioners manually visualize expression trees using general-purpose tools. This approach is laborious...
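
    A minimal version of the data structure described above, with numbers and identifiers as leaf nodes and operators as non-leaf nodes, is sketched below together with a plain-text rendering; it is illustrative only and unrelated to VMEXT's implementation.

        # Minimal expression-tree sketch: numbers/identifiers are leaf nodes and
        # operators are non-leaf nodes, matching the description above. It is
        # illustrative only and unrelated to VMEXT's implementation.
        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class Node:
            label: str                               # operator, identifier, or number
            children: List["Node"] = field(default_factory=list)

            def is_leaf(self) -> bool:
                return not self.children

        def render(node: Node, indent: int = 0) -> str:
            """Simple indented text rendering of the tree structure."""
            lines = [" " * indent + node.label]
            for child in node.children:
                lines.append(render(child, indent + 2))
            return "\n".join(lines)

        # a*x + 2  ->  root "+" with children ("*" over a, x) and leaf "2"
        expr = Node("+", [Node("*", [Node("a"), Node("x")]), Node("2")])
        print(render(expr))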

  19. π Scope: python based scientific workbench with visualization tool for MDSplus data

    Science.gov (United States)

    Shiraiwa, S.

    2014-10-01

    π Scope is a python based scientific data analysis and visualization tool constructed on wxPython and Matplotlib. Although it is designed to be a generic tool, the primary motivation for developing the new software is 1) to provide an updated tool to browse MDSplus data, with functionalities beyond dwscope and jScope, and 2) to provide a universal foundation to construct interface tools to perform computer simulation and modeling for Alcator C-Mod. It provides many features to visualize MDSplus data during tokamak experiments including overplotting different signals and discharges, various plot types (line, contour, image, etc.), in-panel data analysis using python scripts, and publication quality graphics generation. Additionally, the logic to produce multi-panel plots is designed to be backward compatible with dwscope, enabling smooth migration for dwscope users. πScope uses multi-threading to reduce data transfer latency, and its object-oriented design makes it easy to modify and expand while the open source nature allows portability. A built-in tree data browser allows a user to approach the data structure both from a GUI and a script, enabling relatively complex data analysis workflow to be built quickly. As an example, an IDL-based interface to perform GENRAY/CQL3D simulations was ported on πScope, thus allowing LHCD simulation to be run between-shot using C-Mod experimental profiles. This workflow is being used to generate a large database to develop a LHCD actuator model for the plasma control system. Supported by USDoE Award DE-FC02-99ER54512.

  20. STRING 3: An Advanced Groundwater Flow Visualization Tool

    Science.gov (United States)

    Schröder, Simon; Michel, Isabel; Biedert, Tim; Gräfe, Marius; Seidel, Torsten; König, Christoph

    2016-04-01

    The visualization of 3D groundwater flow is a challenging task. Previous versions of our software STRING [1] solely focused on intuitive visualization of complex flow scenarios for non-professional audiences. STRING, developed by Fraunhofer ITWM (Kaiserslautern, Germany) and delta h Ingenieurgesellschaft mbH (Witten, Germany), provides the necessary means for visualization of both 2D and 3D data on planar and curved surfaces. In this contribution we discuss how to extend this approach to a full 3D tool and its challenges in continuation of Michel et al. [2]. This elevates STRING from a post-production to an exploration tool for experts. In STRING moving pathlets provide an intuition of velocity and direction of both steady-state and transient flows. The visualization concept is based on the Lagrangian view of the flow. To capture every detail of the flow an advanced method for intelligent, time-dependent seeding is used building on the Finite Pointset Method (FPM) developed by Fraunhofer ITWM. Lifting our visualization approach from 2D into 3D provides many new challenges. With the implementation of a seeding strategy for 3D one of the major problems has already been solved (see Schröder et al. [3]). As pathlets only provide an overview of the velocity field other means are required for the visualization of additional flow properties. We suggest the use of Direct Volume Rendering and isosurfaces for scalar features. In this regard we were able to develop an efficient approach for combining the rendering through raytracing of the volume and regular OpenGL geometries. This is achieved through the use of Depth Peeling or A-Buffers for the rendering of transparent geometries. Animation of pathlets requires a strict boundary of the simulation domain. Hence, STRING needs to extract the boundary, even from unstructured data, if it is not provided. In 3D we additionally need a good visualization of the boundary itself. For this the silhouette based on the angle of
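
    The pathlet idea follows the Lagrangian view: seed points are advected through the velocity field and their recent trajectory is drawn. The sketch below advects a few seeds through a made-up steady velocity field with explicit Euler steps; STRING's FPM-based machinery is far more sophisticated, so treat this purely as an illustration of the concept.

        # Generic sketch of Lagrangian pathlet tracing: seed points are advected
        # through a made-up, steady velocity field with explicit Euler steps.
        # Not STRING's implementation.
        import numpy as np

        def velocity(p):
            """Hypothetical steady 3-D velocity field."""
            x, y, z = p
            return np.array([1.0 - 0.1 * y, 0.1 * x, -0.05 * z])

        def trace_pathlet(seed, n_steps=200, dt=0.5):
            path = [np.asarray(seed, dtype=float)]
            for _ in range(n_steps):
                path.append(path[-1] + dt * velocity(path[-1]))
            return np.array(path)

        for seed in [(0.0, 0.0, 10.0), (5.0, 2.0, 8.0)]:
            path = trace_pathlet(seed)
            print(f"seed {seed} -> end point {path[-1].round(2)}")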

  1. Visualization of Broadband Sound Sources

    OpenAIRE

    Sukhanov Dmitry; Erzakova Nadezhda

    2016-01-01

    In this paper, a method for imaging wideband audio sources is proposed, based on 2D microphone array measurements of the sound field recorded simultaneously by all microphones. The microphone array consists of 160 microphones and digitizes the signals at 7200 Hz. The measured signals are processed with a dedicated algorithm that produces a flat image of the wideband sound sources. It is shown experimentally that the visualization is not dependent on the...

  2. VarB Plus: An Integrated Tool for Visualization of Genome Variation Datasets

    KAUST Repository

    Hidayah, Lailatul

    2012-07-01

    Research on genomic sequences has been improving significantly as more advanced technology for sequencing has been developed. This opens enormous opportunities for sequence analysis. Various analytical tools have been built for purposes such as sequence assembly, read alignments, genome browsing, comparative genomics, and visualization. From the visualization perspective, there is an increasing trend towards use of large-scale computation. However, more than power is required to produce an informative image. This is a challenge that we address by providing several ways of representing biological data in order to advance the inference endeavors of biologists. This thesis focuses on visualization of variations found in genomic sequences. We develop several visualization functions and embed them in an existing variation visualization tool as extensions. The tool we improved is named VarB, hence the nomenclature for our enhancement is VarB Plus. To the best of our knowledge, besides VarB, there is no tool that provides the capability of dynamic visualization of genome variation datasets as well as statistical analysis. Dynamic visualization allows users to toggle different parameters on and off and see the results on the fly. The statistical analysis includes Fixation Index, Relative Variant Density, and Tajima’s D. Hence we focused our efforts on this tool. The scope of our work includes plots of per-base genome coverage, Principal Coordinate Analysis (PCoA), integration with a read alignment viewer named LookSeq, and visualization of geo-biological data. In addition to description of embedded functionalities, significance, and limitations, future improvements are discussed. The result is four extensions embedded successfully in the original tool, which is built on the Qt framework in C++. Hence it is portable to numerous platforms. Our extensions have shown acceptable execution time in a beta testing with various high-volume published datasets, as well as positive
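
    Among the listed analyses, principal coordinate analysis has a compact generic formulation (classical multidimensional scaling of a distance matrix). The sketch below shows that computation on a made-up distance matrix; it is not VarB Plus's code.

        # Generic principal coordinate analysis (classical multidimensional
        # scaling): embeds samples in 2-D from a pairwise distance matrix.
        # Not VarB Plus's code; the distance matrix is a made-up example.
        import numpy as np

        def pcoa(D, n_axes=2):
            n = D.shape[0]
            J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
            B = -0.5 * J @ (D ** 2) @ J                # double-centred matrix
            eigval, eigvec = np.linalg.eigh(B)
            order = np.argsort(eigval)[::-1][:n_axes]  # largest eigenvalues first
            return eigvec[:, order] * np.sqrt(np.maximum(eigval[order], 0.0))

        D = np.array([[0.0, 2.0, 6.0],
                      [2.0, 0.0, 5.0],
                      [6.0, 5.0, 0.0]])
        print(pcoa(D))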

  3. Web-based Data Exploration, Exploitation and Visualization Tools for Satellite Sensor VIS/IR Calibration Applications

    Science.gov (United States)

    Gopalan, A.; Doelling, D. R.; Scarino, B. R.; Chee, T.; Haney, C.; Bhatt, R.

    2016-12-01

    The CERES calibration group at NASA/LaRC has developed and deployed a suite of online data exploration and visualization tools targeted towards a range of spaceborne VIS/IR imager calibration applications for the Earth Science community. These web-based tools are driven by the open-source R (Language for Statistical Computing and Visualization) with a web interface for the user to customize the results according to their application. The tool contains a library of geostationary and sun-synchronous imager spectral response functions (SRF), incoming solar spectra, SCIAMACHY and Hyperion Earth reflected visible hyper-spectral data, and IASI IR hyper-spectral data. The suite of six specific web-based tools was designed to provide critical information necessary for sensor cross-calibration. One of the challenges of sensor cross-calibration is accounting for spectral band differences, which may introduce biases if not handled properly. The spectral band adjustment factors (SBAF) are a function of the Earth target, atmospheric and cloud conditions or scene type, and angular conditions when obtaining sensor radiance pairs. The SBAF will need to be customized for each inter-calibration target and sensor pair. The advantages of having a community open source tool are: 1) only one archive of SCIAMACHY, Hyperion, and IASI datasets needs to be maintained, which is on the order of 50TB; 2) the framework will allow easy incorporation of new satellite SRFs and hyper-spectral datasets and associated coincident atmospheric and cloud properties, such as PW; 3) web tool or SBAF algorithm improvements or suggestions, when incorporated, can benefit the community at large; 4) the customization effort is on the user rather than on the host. In this paper we discuss each of these tools in detail and explore the variety of advanced options that can be used to constrain the results, along with specific use cases to highlight the value added by these datasets.
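
    The SBAF itself reduces to a ratio of SRF-weighted (band-averaged) values of the same hyper-spectral scene for two sensors. The sketch below illustrates this with synthetic reflectance and Gaussian SRFs; all numbers and function names are invented for illustration and do not come from the CERES group's tool.

        # Illustrative sketch of the SBAF idea: convolve a synthetic hyper-spectral
        # reflectance with two sensors' spectral response functions (SRFs) and take
        # the ratio of the band-averaged values. All numbers are made up; this is
        # not the NASA/LaRC tool's code.
        import numpy as np

        wl = np.linspace(400.0, 900.0, 501)              # wavelength grid, nm
        reflectance = 0.3 + 0.1 * np.sin(wl / 60.0)      # synthetic Earth scene spectrum

        def gaussian_srf(center, width):
            return np.exp(-0.5 * ((wl - center) / width) ** 2)

        def band_average(spectrum, srf):
            return np.sum(spectrum * srf) / np.sum(srf)  # SRF-weighted mean

        srf_ref = gaussian_srf(650.0, 30.0)              # "reference" sensor band
        srf_tgt = gaussian_srf(670.0, 20.0)              # "target" sensor band
        sbaf = band_average(reflectance, srf_ref) / band_average(reflectance, srf_tgt)
        print(f"SBAF (reference / target) = {sbaf:.4f}")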

  4. A Review of Pathway-Based Analysis Tools That Visualize Genetic Variants

    Directory of Open Access Journals (Sweden)

    Elisa Cirillo

    2017-11-01

    Full Text Available Pathway analysis is a powerful method for data analysis in genomics, most often applied to gene expression analysis. It is also promising for single-nucleotide polymorphism (SNP data analysis, such as genome-wide association study data, because it allows the interpretation of variants with respect to the biological processes in which the affected genes and proteins are involved. Such analyses support an interactive evaluation of the possible effects of variations on function, regulation or interaction of gene products. Current pathway analysis software often does not support data visualization of variants in pathways as an alternate method to interpret genetic association results, and specific statistical methods for pathway analysis of SNP data are not combined with these visualization features. In this review, we first describe the visualization options of the tools that were identified by a literature review, in order to provide insight for improvements in this developing field. Tool evaluation was performed using a computational epistatic dataset of gene–gene interactions for obesity risk. Next, we report the necessity to include in these tools statistical methods designed for the pathway-based analysis with SNP data, expressly aiming to define features for more comprehensive pathway-based analysis tools. We conclude by recognizing that pathway analysis of genetic variations data requires a sophisticated combination of the most useful and informative visual aspects of the various tools evaluated.

  5. An interactive visualization tool for mobile objects

    Science.gov (United States)

    Kobayashi, Tetsuo

    Recent advancements in mobile devices---such as Global Positioning System (GPS), cellular phones, car navigation system, and radio-frequency identification (RFID)---have greatly influenced the nature and volume of data about individual-based movement in space and time. Due to the prevalence of mobile devices, vast amounts of mobile objects data are being produced and stored in databases, overwhelming the capacity of traditional spatial analytical methods. There is a growing need for discovering unexpected patterns, trends, and relationships that are hidden in the massive mobile objects data. Geographic visualization (GVis) and knowledge discovery in databases (KDD) are two major research fields that are associated with knowledge discovery and construction. Their major research challenges are the integration of GVis and KDD, enhancing the ability to handle large volume mobile objects data, and high interactivity between the computer and users of GVis and KDD tools. This dissertation proposes a visualization toolkit to enable highly interactive visual data exploration for mobile objects datasets. Vector algebraic representation and online analytical processing (OLAP) are utilized for managing and querying the mobile object data to accomplish high interactivity of the visualization tool. In addition, reconstructing trajectories at user-defined levels of temporal granularity with time aggregation methods allows exploration of the individual objects at different levels of movement generality. At a given level of generality, individual paths can be combined into synthetic summary paths based on three similarity measures, namely, locational similarity, directional similarity, and geometric similarity functions. A visualization toolkit based on the space-time cube concept exploits these functionalities to create a user-interactive environment for exploring mobile objects data. Furthermore, the characteristics of visualized trajectories are exported to be utilized for data

  6. CTViz: A tool for the visualization of transport in nanocomposites.

    Science.gov (United States)

    Beach, Benjamin; Brown, Joshua; Tarlton, Taylor; Derosa, Pedro A

    2016-05-01

    A visualization tool (CTViz) for charge transport processes in 3-D hybrid materials (nanocomposites) was developed, inspired by the need for a graphical application to assist in code debugging and data presentation of an existing in-house code. As the simulation code grew, troubleshooting problems grew increasingly difficult without an effective way to visualize 3-D samples and charge transport in those samples. CTViz is able to produce publication and presentation quality visuals of the simulation box, as well as static and animated visuals of the paths of individual carriers through the sample. CTViz was designed to provide a high degree of flexibility in the visualization of the data. A feature that characterizes this tool is the use of shade and transparency levels to highlight important details in the morphology or in the transport paths by hiding or dimming elements of little relevance to the current view. This is fundamental for the visualization of 3-D systems with complex structures. The code presented here provides these required capabilities, but has gone beyond the original design and could be used as is or easily adapted for the visualization of other particulate transport where transport occurs on discrete paths. Copyright © 2016 Elsevier Inc. All rights reserved.
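
    The shade-and-transparency technique described above, dimming context geometry while keeping one carrier path fully opaque, can be mimicked in a few lines of matplotlib; this is only an illustration of the visual idea, not CTViz itself, and the geometry is random filler.

        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(1)
        fig = plt.figure()
        ax = fig.add_subplot(projection="3d")

        # Context geometry: filler segments drawn with low alpha so they recede visually.
        for _ in range(40):
            seg = rng.uniform(0, 10, size=(2, 3))
            ax.plot(*seg.T, color="gray", alpha=0.15, linewidth=1)

        # Focus: one carrier path drawn fully opaque on top of the dimmed context.
        path = np.cumsum(rng.normal(0, 0.5, size=(50, 3)), axis=0) + 5
        ax.plot(*path.T, color="crimson", alpha=1.0, linewidth=2)

        plt.savefig("carrier_path.png", dpi=150)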

  7. Allen Brain Atlas-Driven Visualizations: a web-based gene expression energy visualization tool.

    Science.gov (United States)

    Zaldivar, Andrew; Krichmar, Jeffrey L

    2014-01-01

    The Allen Brain Atlas-Driven Visualizations (ABADV) is a publicly accessible web-based tool created to retrieve and visualize expression energy data from the Allen Brain Atlas (ABA) across multiple genes and brain structures. Though the ABA offers their own search engine and software for researchers to view their growing collection of online public data sets, including extensive gene expression and neuroanatomical data from human and mouse brain, many of their tools limit the number of genes and brain structures researchers can view at once. To complement their work, ABADV generates multiple pie charts, bar charts and heat maps of expression energy values for any given set of genes and brain structures. Such a suite of free and easy-to-understand visualizations allows for easy comparison of gene expression across multiple brain areas. In addition, each visualization links back to the ABA so researchers may view a summary of the experimental detail. ABADV is currently supported on modern web browsers and is compatible with expression energy data from the Allen Mouse Brain Atlas in situ hybridization experiments. By creating this web application, researchers can immediately obtain and survey large amounts of expression energy data from the ABA, which they can then use to supplement their work or perform meta-analysis. In the future, we hope to enable ABADV across multiple data resources.

  8. Allen Brain Atlas-Driven Visualizations: A Web-Based Gene Expression Energy Visualization Tool

    Directory of Open Access Journals (Sweden)

    Andrew Zaldivar

    2014-05-01

    Full Text Available The Allen Brain Atlas-Driven Visualizations (ABADV) is a publicly accessible web-based tool created to retrieve and visualize expression energy data from the Allen Brain Atlas (ABA) across multiple genes and brain structures. Though the ABA offers their own search engine and software for researchers to view their growing collection of online public data sets, including extensive gene expression and neuroanatomical data from human and mouse brain, many of their tools limit the number of genes and brain structures researchers can view at once. To complement their work, ABADV generates multiple pie charts, bar charts and heat maps of expression energy values for any given set of genes and brain structures. Such a suite of free and easy-to-understand visualizations allows for easy comparison of gene expression across multiple brain areas. In addition, each visualization links back to the ABA so researchers may view a summary of the experimental detail. ABADV is currently supported on modern web browsers and is compatible with expression energy data from the Allen Mouse Brain Atlas in situ hybridization experiments. By creating this web application, researchers can immediately obtain and survey large amounts of expression energy data from the ABA, which they can then use to supplement their work or perform meta-analysis. In the future, we hope to enable ABADV across multiple data resources.
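
    A generic sketch of the genes-by-structures heat map that ABADV produces is given below; the gene and structure names are an arbitrary selection and the expression-energy values are random placeholders, so nothing here calls the actual Allen Brain Atlas services.

        import numpy as np
        import matplotlib.pyplot as plt

        genes = ["Drd1", "Drd2", "Bdnf", "Fos"]           # hypothetical gene selection
        structures = ["Cortex", "Striatum", "Thalamus"]   # hypothetical structure selection

        # Placeholder expression-energy matrix; a real workflow would fill this
        # from data downloaded out of the Allen Brain Atlas.
        energy = np.random.default_rng(2).uniform(0, 10, (len(genes), len(structures)))

        fig, ax = plt.subplots()
        im = ax.imshow(energy, cmap="viridis")
        ax.set_xticks(range(len(structures)))
        ax.set_xticklabels(structures)
        ax.set_yticks(range(len(genes)))
        ax.set_yticklabels(genes)
        fig.colorbar(im, ax=ax, label="expression energy (arbitrary units)")
        plt.savefig("expression_heatmap.png", dpi=150)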

  9. Development of in-situ visualization tool for PIC simulation

    International Nuclear Information System (INIS)

    Ohno, Nobuaki; Ohtani, Hiroaki

    2014-01-01

    As the capability of supercomputers improves, the sizes of simulations and of their output data also become larger and larger. Visualization is usually carried out on a researcher's PC with interactive visualization software after the simulation has finished; however, the data are becoming too large for this approach. A promising answer is in-situ visualization, in which the simulation code is coupled with the visualization code and visualization is performed alongside the simulation on the same supercomputer. We developed an in-situ visualization tool for particle-in-cell (PIC) simulation, provided as a Fortran module. We coupled it with a PIC simulation code, tested the coupled code on the Plasma Simulator supercomputer, and confirmed that it works. (author)
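
    The in-situ idea, rendering inside the simulation loop instead of writing raw particle data for later post-processing, is sketched below with a toy particle push in Python; the actual tool is a Fortran module coupled to a PIC code, so every detail here is an illustrative stand-in.

        import numpy as np
        import matplotlib
        matplotlib.use("Agg")                 # off-screen rendering, as on a compute node
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(3)
        pos = rng.uniform(-1.0, 1.0, (500, 2))       # toy particles
        vel = rng.normal(0.0, 0.05, (500, 2))
        dt, n_steps, render_every = 0.05, 200, 50

        def render(step, pos):
            """In-situ hook: convert the current particle state into an image right
            away instead of dumping the raw arrays for later post-processing."""
            plt.figure(figsize=(4, 4))
            plt.scatter(pos[:, 0], pos[:, 1], s=2)
            plt.title(f"step {step}")
            plt.savefig(f"frame_{step:04d}.png", dpi=120)
            plt.close()

        for step in range(n_steps):
            vel += -0.1 * pos * dt            # toy restoring force standing in for field solves
            pos += vel * dt
            if step % render_every == 0:
                render(step, pos)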

  10. AR4VI: AR as an Accessibility Tool for People with Visual Impairments

    OpenAIRE

    Coughlan, James M.; Miele, Joshua

    2017-01-01

    Although AR technology has been largely dominated by visual media, a number of AR tools using both visual and auditory feedback have been developed specifically to assist people with low vision or blindness – an application domain that we term Augmented Reality for Visual Impairment (AR4VI). We describe two AR4VI tools developed at Smith-Kettlewell, as well as a number of pre-existing examples. We emphasize that AR4VI is a powerful tool with the potential to remove or significantly reduce a r...

  11. AR4VI: AR as an Accessibility Tool for People with Visual Impairments.

    Science.gov (United States)

    Coughlan, James M; Miele, Joshua

    2017-10-01

    Although AR technology has been largely dominated by visual media, a number of AR tools using both visual and auditory feedback have been developed specifically to assist people with low vision or blindness - an application domain that we term Augmented Reality for Visual Impairment (AR4VI). We describe two AR4VI tools developed at Smith-Kettlewell, as well as a number of pre-existing examples. We emphasize that AR4VI is a powerful tool with the potential to remove or significantly reduce a range of accessibility barriers. Rather than being restricted to use by people with visual impairments, AR4VI is a compelling universal design approach offering benefits for mainstream applications as well.

  12. A Visualization Tool for Integrating Research Results at an Underground Mine

    Science.gov (United States)

    Boltz, S.; Macdonald, B. D.; Orr, T.; Johnson, W.; Benton, D. J.

    2016-12-01

    Researchers with the National Institute for Occupational Safety and Health are conducting research at a deep, underground metal mine in Idaho to develop improvements in ground control technologies that reduce the effects of dynamic loading on mine workings, thereby decreasing the risk to miners. This research is multifaceted and includes: photogrammetry, microseismic monitoring, geotechnical instrumentation, and numerical modeling. When managing research involving such a wide range of data, understanding how the data relate to each other and to the mining activity quickly becomes a daunting task. In an effort to combine this diverse research data into a single, easy-to-use system, a three-dimensional visualization tool was developed. The tool was created using the Unity3d video gaming engine and includes the mine development entries, production stopes, important geologic structures, and user-input research data. The tool provides the user with a first-person, interactive experience where they are able to walk through the mine as well as navigate the rock mass surrounding the mine to view and interpret the imported data in the context of the mine and as a function of time. The tool was developed using data from a single mine; however, it is intended to be a generic tool that can be easily extended to other mines. For example, a similar visualization tool is being developed for an underground coal mine in Colorado. The ultimate goal is for NIOSH researchers and mine personnel to be able to use the visualization tool to identify trends that may not otherwise be apparent when viewing the data separately. This presentation highlights the features and capabilities of the mine visualization tool and explains how it may be used to more effectively interpret data and reduce the risk of ground fall hazards to underground miners.

  13. Scientific Visualization Tools for Enhancement of Undergraduate Research

    Science.gov (United States)

    Rodriguez, W. J.; Chaudhury, S. R.

    2001-05-01

    Undergraduate research projects that utilize remote sensing satellite instrument data to investigate atmospheric phenomena pose many challenges. A significant challenge is processing large amounts of multi-dimensional data. Remote sensing data initially requires mining; filtering of undesirable spectral, instrumental, or environmental features; and subsequently sorting and reformatting to files for easy and quick access. The data must then be transformed according to the needs of the investigation(s) and displayed for interpretation. These multidimensional datasets require views that can range from two-dimensional plots to multivariable-multidimensional scientific visualizations with animations. Science undergraduate students generally find these data processing tasks daunting. Generally, researchers are required to fully understand the intricacies of the dataset and write computer programs or rely on commercially available software, which may not be trivial to use. In the time that undergraduate researchers have available for their research projects, learning the data formats, programming languages, and/or visualization packages is impractical. When dealing with large multi-dimensional data sets appropriate Scientific Visualization tools are imperative in allowing students to have a meaningful and pleasant research experience, while producing valuable scientific research results. The BEST Lab at Norfolk State University has been creating tools for multivariable-multidimensional analysis of Earth Science data. EzSAGE and SAGE4D have been developed to sort, analyze and visualize SAGE II (Stratospheric Aerosol and Gas Experiment) data with ease. Three- and four-dimensional visualizations in interactive environments can be produced. EzSAGE provides atmospheric slices in three-dimensions where the researcher can change the scales in the three-dimensions, color tables and degree of smoothing interactively to focus on particular phenomena. SAGE4D provides a navigable

  14. Coastal On-line Assessment and Synthesis Tool 2.0

    Science.gov (United States)

    Brown, Richard; Navard, Andrew; Nguyen, Beth

    2011-01-01

    COAST (Coastal On-line Assessment and Synthesis Tool) is a 3D, open-source Earth data browser developed by leveraging and enhancing previous NASA open-source tools. These tools use satellite imagery and elevation data in a way that allows any user to zoom from orbit view down into any place on Earth, and enables the user to experience Earth terrain in a visually rich 3D view. The benefits associated with taking advantage of an open-source geo-browser are that it is free, extensible, and offers a worldwide developer community that is available to provide additional development and improvement potential. What makes COAST unique is that it simplifies the process of locating and accessing data sources, and allows a user to combine them into a multi-layered and/or multi-temporal visual analytical look into possible data interrelationships and coeffectors for coastal environment phenomenology. COAST provides users with new data visual analytic capabilities. COAST has been upgraded to maximize use of open-source data access, viewing, and data manipulation software tools. The COAST 2.0 toolset has been developed to increase access to a larger realm of the most commonly implemented data formats used by the coastal science community. New and enhanced functionalities that upgrade COAST to COAST 2.0 include the development of the Temporal Visualization Tool (TVT) plug-in, the Recursive Online Remote Data-Data Mapper (RECORD-DM) utility, the Import Data Tool (IDT), and the Add Points Tool (APT). With these improvements, users can integrate their own data with other data sources, and visualize the resulting layers of different data types (such as spatial and spectral, for simultaneous visual analysis), and visualize temporal changes in areas of interest.

  15. Magnetic source localization of early visual mismatch response

    NARCIS (Netherlands)

    Susac, A.; Heslenfeld, D.J.; Huonker, R.; Supek, S.

    2014-01-01

    Previous studies have reported a visual analogue of the auditory mismatch negativity (MMN) response that is based on sensory memory. The neural generators and attention dependence of the visual MMN (vMMN) still remain unclear. We used magnetoencephalography (MEG) and spatio-temporal source

  16. The connectome viewer toolkit: an open source framework to manage, analyze, and visualize connectomes.

    Science.gov (United States)

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit - a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/

  17. The Connectome Viewer Toolkit: an open source framework to manage, analyze and visualize connectomes

    Directory of Open Access Journals (Sweden)

    Stephan Gerhard

    2011-06-01

    Full Text Available Advanced neuroinformatics tools are required for methods of connectome mapping, analysis and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration and sharing. We have designed and implemented the Connectome Viewer Toolkit --- a set of free and extensible open-source neuroimaging tools written in Python. The key components of the toolkit are as follows: 1. The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. 2. The Connectome File Format Library enables management and sharing of connectome files. 3. The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/.
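
    As a generic illustration of the kind of graph analysis such a toolkit enables (using networkx rather than the Connectome Viewer Toolkit's own API), a toy structural connectome can be built and summarized in a few lines; the region names and connection weights are invented.

        import networkx as nx

        # Toy structural connectome: nodes are brain regions, edge weights are
        # fiber counts; names and numbers are invented for illustration only.
        edges = [("lh-precuneus", "rh-precuneus", 120),
                 ("lh-precuneus", "lh-hippocampus", 45),
                 ("rh-precuneus", "rh-hippocampus", 50),
                 ("lh-hippocampus", "rh-hippocampus", 30)]

        G = nx.Graph()
        for a, b, w in edges:
            G.add_edge(a, b, weight=w)

        # Simple network measures of the kind a connectome analysis would report.
        strength = {n: sum(d["weight"] for _, _, d in G.edges(n, data=True)) for n in G.nodes}
        print("node strength:", strength)
        print("betweenness:", nx.betweenness_centrality(G, weight="weight"))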

  18. Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-05-19

    A partnership across government, academic, and private sectors has created a novel system that enables climate researchers to solve current and emerging data analysis and visualization challenges. The Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) software project utilizes the Python application programming interface (API) combined with C/C++/Fortran implementations for performance-critical software that offers the best compromise between "scalability" and "ease-of-use." The UV-CDAT system is highly extensible and customizable for high-performance interactive and batch visualization and analysis for climate science and other disciplines of geosciences. For complex, climate data-intensive computing, UV-CDAT's inclusive framework supports Message Passing Interface (MPI) parallelism as well as task farming and other forms of parallelism. More specifically, the UV-CDAT framework supports the execution of Python scripts running in parallel using the MPI executable commands and leverages Department of Energy (DOE)-funded general-purpose, scalable parallel visualization tools such as ParaView and VisIt. This is the first system to be successfully designed in this way and with these features. The climate community leverages these tools and others, in support of a parallel client-server paradigm, allowing extreme-scale, server-side computing for maximum possible speed-up.
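
    UV-CDAT exposes its own Python API, but the MPI-parallel script execution described above looks, in spirit, like the minimal mpi4py sketch below; the domain-decomposed "climate field" and the reduction are placeholders, not UV-CDAT code.

        # Run with, for example: mpiexec -n 4 python global_mean.py
        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        # Each rank pretends to own one chunk of a domain-decomposed climate field.
        local_field = np.random.default_rng(rank).normal(288.0, 5.0, 10_000)

        # Reduce partial sums to rank 0 and print the global mean there.
        global_sum = comm.reduce(local_field.sum(), op=MPI.SUM, root=0)
        global_n = comm.reduce(local_field.size, op=MPI.SUM, root=0)
        if rank == 0:
            print("global mean of the field:", global_sum / global_n)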

  19. Multiband Study of Radio Sources of the RCR Catalogue with Virtual Observatory Tools

    Directory of Open Access Journals (Sweden)

    Zhelenkova O. P.

    2012-09-01

    Full Text Available We present early results of our multiband study of the RATAN Cold Revised (RCR) catalogue obtained from seven cycles of the “Cold” survey carried out with the RATAN-600 radio telescope at 7.6 cm in 1980-1999, at the declination of the SS 433 source. We used the 2MASS and LAS UKIDSS infrared surveys, the DSS-II and SDSS DR7 optical surveys, as well as the USNO-B1 and GSC-II catalogues, the VLSS, TXS, NVSS, FIRST and GB6 radio surveys to accumulate information about the sources. For radio sources that have no detectable optical candidate in optical or infrared catalogues, we additionally looked through images in several bands from the SDSS, LAS UKIDSS, DPOSS, 2MASS surveys and also used co-added frames in different bands. We reliably identified 76% of radio sources of the RCR catalogue. We used the ALADIN and SAOImage DS9 scripting capabilities, interoperability services of ALADIN and TOPCAT, and also other Virtual Observatory (VO) tools and resources, such as CASJobs, NED, Vizier, and WSA, for effective data access, visualization and analysis. Without VO tools it would have been problematic to perform our study.

  20. An Excel®-based visualization tool of 2-D soil gas concentration profiles in petroleum vapor intrusion.

    Science.gov (United States)

    Verginelli, Iason; Yao, Yijun; Suuberg, Eric M

    2016-01-01

    In this study we present a petroleum vapor intrusion tool implemented in Microsoft® Excel® using Visual Basic for Applications (VBA) and integrated within a graphical interface. The latter helps users easily visualize two-dimensional soil gas concentration profiles and indoor concentrations as a function of site-specific conditions such as source strength and depth, biodegradation reaction rate constant, soil characteristics and building features. This tool is based on a two-dimensional explicit analytical model that combines steady-state diffusion-dominated vapor transport in a homogeneous soil with a piecewise first-order aerobic biodegradation model, in which rate is limited by oxygen availability. As recommended in the recently released United States Environmental Protection Agency's final Petroleum Vapor Intrusion guidance, a sensitivity analysis and a simplified Monte Carlo uncertainty analysis are also included in the spreadsheet.
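
    The modeling idea can be made concrete with a 1-D simplification: steady-state diffusion with first-order decay, D c'' = k c, has a closed-form profile between the vapor source and the open ground surface, and a simplified Monte Carlo then propagates parameter uncertainty. The sketch below uses this 1-D analogue with illustrative parameter ranges; it is not the spreadsheet's 2-D, oxygen-limited model.

        import numpy as np

        def profile(z, L, c_src, D, k):
            """Steady-state solution of D*c'' = k*c on 0 <= z <= L, with the vapor
            source at z = 0 (c = c_src) and the open ground surface at z = L (c = 0).
            z is the height above the source in metres."""
            m = np.sqrt(k / D)
            return c_src * np.sinh(m * (L - z)) / np.sinh(m * L)

        L_depth, c_src = 3.0, 10.0      # source 3 m below the surface, 10 mg/m^3 (illustrative)
        rng = np.random.default_rng(4)

        # Simplified Monte Carlo: sample effective diffusivity and first-order
        # biodegradation rate, then look at the spread of the soil-gas
        # concentration 0.1 m below the ground surface.
        D = rng.lognormal(mean=np.log(2e-6), sigma=0.5, size=10_000)   # m^2/s (assumed range)
        k = rng.lognormal(mean=np.log(1e-5), sigma=1.0, size=10_000)   # 1/s (assumed range)
        c_near_surface = profile(L_depth - 0.1, L_depth, c_src, D, k)

        print("median [mg/m^3]:", np.median(c_near_surface))
        print("5th-95th percentile:", np.percentile(c_near_surface, [5, 95]))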

  1. Visual color matching system based on RGB LED light source

    Science.gov (United States)

    Sun, Lei; Huang, Qingmei; Feng, Chen; Li, Wei; Wang, Chaofeng

    2018-01-01

    In order to study the properties and performance of LEDs as RGB primary-color light sources for color mixture in visual psychophysical experiments, and to find out how LED sources differ from traditional light sources, a visual color matching experiment system based on LED light sources as RGB primaries has been built. By simulating the traditional metameric color matching experiment of the CIE 1931 RGB color system, it can be used for visual color matching experiments to obtain a set of spectral tristimulus values, commonly called color-matching functions (CMFs). This system consists of three parts: a monochromatic light part using a blazed grating, a light mixing part where the summation of three LED illuminations is to be visually matched with a monochromatic illumination, and a visual observation part. The three narrow-band LEDs used have dominant wavelengths of 640 nm (red), 522 nm (green) and 458 nm (blue), respectively, and their intensities can be controlled independently. After calibration of the wavelength and luminance of the LED sources with a spectrophotometer, a series of visual color matching experiments was carried out by 5 observers. The results are compared with those from the CIE 1931 RGB color system and have been used to compute an average locus for the spectral colors in the color triangle, with white at the center. It has been shown that the use of LEDs is feasible and has the advantages of easy control, good stability and low cost.
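
    As a purely radiometric illustration of mixing three primaries, the sketch below finds non-negative drive levels for three LED-like spectra that best approximate a target band in a least-squares sense. The Gaussian spectra are stand-ins for the real 640/522/458 nm LEDs, and a real visual match would compare tristimulus values computed through color-matching functions rather than raw spectra.

        import numpy as np
        from scipy.optimize import nnls

        wl = np.arange(380.0, 781.0, 1.0)          # wavelength grid in nm

        def led(peak, width=20.0):
            """Hypothetical Gaussian stand-in for a narrow-band LED spectrum."""
            return np.exp(-0.5 * ((wl - peak) / width) ** 2)

        # Three primaries placed roughly at the paper's dominant wavelengths.
        A = np.column_stack([led(640.0), led(522.0), led(458.0)])

        # Target: a narrower monochromatic test band around 580 nm.
        target = led(580.0, width=5.0)

        # Non-negative least squares gives the R, G, B drive levels that best
        # approximate the target spectrum in a purely radiometric sense.
        weights, residual = nnls(A, target)
        print("R, G, B drive levels:", weights, "residual:", residual)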

  2. Tools and procedures for visualization of proteins and other biomolecules.

    Science.gov (United States)

    Pan, Lurong; Aller, Stephen G

    2015-04-01

    Proteins, peptides, and nucleic acids are biomolecules that drive biological processes in living organisms. An enormous amount of structural data for a large number of these biomolecules has been described with atomic precision in the form of structural "snapshots" that are freely available in public repositories. These snapshots can help explain how the biomolecules function, the nature of interactions between multi-molecular complexes, and even how small-molecule drugs can modulate the biomolecules for clinical benefits. Furthermore, these structural snapshots serve as inputs for sophisticated computer simulations to turn the biomolecules into moving, "breathing" molecular machines for understanding their dynamic properties in real-time computer simulations. In order for the researcher to take advantage of such a wealth of structural data, it is necessary to gain competency in the use of computer molecular visualization tools for exploring the structures and visualizing three-dimensional spatial representations. Here, we present protocols for using two common visualization tools--the Web-based Jmol and the stand-alone PyMOL package--as well as a few examples of other popular tools. Copyright © 2015 John Wiley & Sons, Inc.
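
    A few lines of the PyMOL cmd API of the sort these protocols walk through are shown below; the script is meant to be run inside a PyMOL session, and the PDB entry (ubiquitin, 1UBQ) and coloring choices are arbitrary examples.

        # Run inside PyMOL's built-in Python interpreter (e.g. File > Run Script...).
        from pymol import cmd

        cmd.fetch("1ubq")                        # example structure: ubiquitin, fetched from the PDB
        cmd.hide("everything")
        cmd.show("cartoon")                      # secondary-structure cartoon representation
        cmd.color("marine", "ss H")              # helices
        cmd.color("yellow", "ss S")              # strands
        cmd.show("sticks", "resn LEU+ILE+VAL")   # side chains of a few residue types
        cmd.orient()
        cmd.png("ubiquitin.png", width=1200, height=900, dpi=300, ray=1)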

  3. ELATE: an open-source online application for analysis and visualization of elastic tensors

    International Nuclear Information System (INIS)

    Gaillac, Romain; Coudert, François-Xavier; Pullumbi, Pluton

    2016-01-01

    We report on the implementation of a tool for the analysis of second-order elastic stiffness tensors, provided with both an open-source Python module and a standalone online application allowing the visualization of anisotropic mechanical properties. After describing the software features, how we compute the conventional elastic constants and how we represent them graphically, we explain our technical choices for the implementation. In particular, we focus on why a Python module is used to generate the HTML web page with embedded Javascript for dynamical plots. (paper)
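
    For readers who want the arithmetic behind "conventional elastic constants", the sketch below computes Voigt polycrystalline averages and the derived isotropic moduli from a 6x6 stiffness matrix with numpy; the matrix values are illustrative (close to silicon) and the code is independent of ELATE.

        import numpy as np

        # 6x6 stiffness matrix in Voigt notation (GPa); illustrative values,
        # roughly those of silicon.
        C = np.array([
            [165.7,  63.9,  63.9,  0.0,  0.0,  0.0],
            [ 63.9, 165.7,  63.9,  0.0,  0.0,  0.0],
            [ 63.9,  63.9, 165.7,  0.0,  0.0,  0.0],
            [  0.0,   0.0,   0.0, 79.6,  0.0,  0.0],
            [  0.0,   0.0,   0.0,  0.0, 79.6,  0.0],
            [  0.0,   0.0,   0.0,  0.0,  0.0, 79.6],
        ])

        # Voigt (upper-bound) polycrystalline averages of bulk and shear moduli.
        K_voigt = (C[0, 0] + C[1, 1] + C[2, 2] + 2 * (C[0, 1] + C[0, 2] + C[1, 2])) / 9
        G_voigt = (C[0, 0] + C[1, 1] + C[2, 2] - (C[0, 1] + C[0, 2] + C[1, 2])
                   + 3 * (C[3, 3] + C[4, 4] + C[5, 5])) / 15

        # Isotropic Young's modulus and Poisson ratio derived from (K, G).
        E = 9 * K_voigt * G_voigt / (3 * K_voigt + G_voigt)
        nu = (3 * K_voigt - 2 * G_voigt) / (2 * (3 * K_voigt + G_voigt))
        print(f"K = {K_voigt:.1f} GPa, G = {G_voigt:.1f} GPa, E = {E:.1f} GPa, nu = {nu:.3f}")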

  4. EcoBrowser: a web-based tool for visualizing transcriptome data of Escherichia coli

    Directory of Open Access Journals (Sweden)

    Jia Peng

    2011-10-01

    Full Text Available Background: Escherichia coli has been extensively studied as a prokaryotic model organism whose whole genome was determined in 1997. However, it is difficult to identify all the gene products involved in diverse functions by using whole genome sequences alone. High-resolution transcriptome mapping using tiling arrays has proved effective in improving the annotation of transcript units and discovering new ncRNA transcripts. While abundant tiling array data have been generated, the lack of appropriate visualization tools to accommodate and integrate multiple sources of data has become apparent. Findings: EcoBrowser is a web-based tool for visualizing genome annotations and transcriptome data of E. coli. Important tiling array data of E. coli from different experimental platforms are collected and processed for query. An AJAX-based genome browser is embedded for visualization. Thus, genome annotations can be compared with transcript profiling and genome occupancy profiling from independent experiments, which will be helpful in discovering new transcripts, including novel mRNAs and ncRNAs, generating a detailed description of the transcription unit architecture, and further providing clues for the investigation of prokaryotic transcriptional regulation, which has proved to be far more complex than previously thought. Conclusions: With the help of EcoBrowser, users can get a systemic view from both the vertical and parallel sides, as well as inspiration for the design of new experiments that will expand our understanding of the regulation mechanism.

  5. Demonstrating High-Accuracy Orbital Access Using Open-Source Tools

    Science.gov (United States)

    Gilbertson, Christian; Welch, Bryan

    2017-01-01

    Orbit propagation is fundamental to almost every space-based analysis. Currently, many system analysts use commercial software to predict the future positions of orbiting satellites. This is one of many capabilities that can be replicated, with great accuracy, without using expensive, proprietary software. NASA's SCaN (Space Communication and Navigation) Center for Engineering, Networks, Integration, and Communications (SCENIC) project plans to provide its analysis capabilities using a combination of internal and open-source software, allowing for a much greater measure of customization and flexibility, while reducing recurring software license costs. MATLAB and the open-source Orbit Determination Toolbox created by Goddard Space Flight Center (GSFC) were utilized to develop tools with the capability to propagate orbits, perform line-of-sight (LOS) availability analyses, and visualize the results. The developed programs are modular and can be applied for mission planning and viability analysis in a variety of Solar System applications. The tools can perform 2- and N-body orbit propagation, find inter-satellite and satellite-to-ground-station LOS access (accounting for intermediate oblate spheroid body blocking, geometric restrictions of the antenna field-of-view (FOV), and relativistic corrections), and create animations of planetary movement, satellite orbits, and LOS accesses. The code is the basis for SCENIC's broad analysis capabilities including dynamic link analysis, dilution-of-precision navigation analysis, and orbital availability calculations.
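
    A minimal two-body propagation of the kind described, written in Python with scipy rather than the MATLAB/Orbit Determination Toolbox setup used by the project, is sketched below; the initial state is an arbitrary near-circular low Earth orbit.

        import numpy as np
        from scipy.integrate import solve_ivp

        MU_EARTH = 398600.4418        # km^3/s^2

        def two_body(t, y):
            """Point-mass two-body dynamics; state y = [x, y, z, vx, vy, vz] in km and km/s."""
            r = y[:3]
            acc = -MU_EARTH * r / np.linalg.norm(r) ** 3
            return np.concatenate([y[3:], acc])

        # Roughly circular low Earth orbit at about 700 km altitude (illustrative state).
        y0 = np.array([7078.0, 0.0, 0.0, 0.0, 7.5, 0.0])
        t_span = (0.0, 2 * 3600.0)    # propagate for two hours
        sol = solve_ivp(two_body, t_span, y0, rtol=1e-9, atol=1e-12, dense_output=True)

        # Sample the trajectory once per minute; these positions could feed a
        # line-of-sight availability check against a ground station.
        t = np.arange(0.0, t_span[1], 60.0)
        positions = sol.sol(t)[:3].T
        print("final orbital radius [km]:", np.linalg.norm(positions[-1]))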

  6. Visual DMDX: A web-based authoring tool for DMDX, a Windows display program with millisecond accuracy.

    Science.gov (United States)

    Garaizar, Pablo; Reips, Ulf-Dietrich

    2015-09-01

    DMDX is a software package for the experimental control and timing of stimulus display for Microsoft Windows systems. DMDX is reliable, flexible, millisecond accurate, and can be downloaded free of charge; therefore it has become very popular among experimental researchers. However, setting up a DMDX-based experiment is burdensome because of its command-based interface. Further, DMDX relies on RTF files in which parts of the stimuli, design, and procedure of an experiment are defined in a complicated (DMASTR-compatible) syntax. Other experiment software, such as E-Prime, Psychopy, and WEXTOR, became successful as a result of integrated visual authoring tools. Such an intuitive interface was lacking for DMDX. We therefore created and present here Visual DMDX (http://visualdmdx.com/), a HTML5-based web interface to set up experiments and export them to DMDX item files format in RTF. Visual DMDX offers most of the features available from the rich DMDX/DMASTR syntax, and it is a useful tool to support researchers who are new to DMDX. Both old and modern versions of DMDX syntax are supported. Further, with Visual DMDX, we go beyond DMDX by having added export to JSON (a versatile web format), easy backup, and a preview option for experiments. In two examples, one experiment each on lexical decision making and affective priming, we explain in a step-by-step fashion how to create experiments using Visual DMDX. We release Visual DMDX under an open-source license to foster collaboration in its continuous improvement.

  7. Helioviewer.org: An Open-source Tool for Visualizing Solar Data

    Science.gov (United States)

    Hughitt, V. Keith; Ireland, J.; Schmiedel, P.; Dimitoglou, G.; Mueller, D.; Fleck, B.

    2009-05-01

    As the amount of solar data available to scientists continues to increase at faster and faster rates, it is important that there exist simple tools for navigating this data quickly with a minimal amount of effort. By combining heterogeneous solar physics datatypes such as full-disk images and coronagraphs, along with feature and event information, Helioviewer offers a simple and intuitive way to browse multiple datasets simultaneously. Images are stored in a repository using the JPEG 2000 format and tiled dynamically upon a client's request. By tiling images and serving only the portions of the image requested, it is possible for the client to work with very large images without having to fetch all of the data at once. Currently, Helioviewer enables users to browse the entire SOHO data archive, updated hourly, as well as data feature/event catalog data from eight different catalogs including active region, flare, coronal mass ejection, type II radio burst data. In addition to a focus on intercommunication with other virtual observatories and browsers (VSO, HEK, etc), Helioviewer will offer a number of externally-available application programming interfaces (APIs) to enable easy third party use, adoption and extension. Future functionality will include: support for additional data-sources including TRACE, SDO and STEREO, dynamic movie generation, a navigable timeline of recorded solar events, social annotation, and basic client-side image processing.

  8. Classifying Desirable Features of Software Visualization Tools for Corrective Maintenance

    NARCIS (Netherlands)

    Sensalire, Mariam; Ogao, Patrick; Telea, Alexandru

    2008-01-01

    We provide an evaluation of 15 software visualization tools applicable to corrective maintenance. The tasks supported as well as the techniques used are presented and graded based on the support level. By analyzing user acceptation of current tools, we aim to help developers to select what to

  9. Vizic: A Jupyter-based interactive visualization tool for astronomical catalogs

    Science.gov (United States)

    Yu, W.; Carrasco Kind, M.; Brunner, R. J.

    2017-07-01

    The ever-growing datasets in observational astronomy have challenged scientists in many aspects, including efficient and interactive data exploration and visualization. Many tools have been developed to confront this challenge. However, they usually focus on displaying the actual images or on visualizing patterns within catalogs in a predefined way. In this paper we introduce Vizic, a Python visualization library that builds the connection between images and catalogs through an interactive map of the sky region. Vizic visualizes catalog data over a custom background canvas using the shape, size and orientation of each object in the catalog. The displayed objects in the map are highly interactive and customizable compared to those in the observation images. These objects can be filtered by or colored by their property values, such as redshift and magnitude. They can also be sub-selected using a lasso-like tool for further analysis using standard Python functions, and everything is done from inside a Jupyter notebook. Furthermore, Vizic allows custom overlays to be appended dynamically on top of the sky map. We have initially implemented several overlays, namely Voronoi, Delaunay, Minimum Spanning Tree and HEALPix grid layers, which are helpful for visualizing large-scale structure. All these overlays can be generated, added or removed interactively with just one line of code. The catalog data are stored in a non-relational database, and the interfaces have been developed in JavaScript and Python to work within the Jupyter Notebook, which allows the creation of customizable widgets and user-generated scripts to analyze and plot the data selected/displayed in the interactive map. This unique design makes Vizic a very powerful and flexible interactive analysis tool. Vizic can be adopted in a variety of exercises, for example data inspection, clustering analysis, galaxy alignment studies, outlier identification or just large-scale visualizations.
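
    The overlay computations named above map onto standard scipy calls, as the sketch below shows for a synthetic catalog; this reproduces the geometry only, not Vizic's Jupyter widgets or database layer.

        import numpy as np
        from scipy.spatial import Delaunay, Voronoi
        from scipy.spatial.distance import pdist, squareform
        from scipy.sparse.csgraph import minimum_spanning_tree

        # Toy catalog: 200 synthetic sky positions (in arbitrary projected units).
        rng = np.random.default_rng(5)
        points = rng.uniform(0.0, 1.0, (200, 2))

        tri = Delaunay(points)        # triangulation overlay
        vor = Voronoi(points)         # Voronoi-cell overlay

        # Minimum spanning tree on the complete distance graph; fine for a small
        # catalog, while a large one would restrict edges to Delaunay neighbours.
        mst = minimum_spanning_tree(squareform(pdist(points)))

        print("Delaunay triangles:", len(tri.simplices))
        print("Voronoi regions:", len(vor.regions))
        print("MST total edge length:", float(mst.sum()))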

  10. Visualizing Debugging Activity in Source Code Repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    2007-01-01

    We present the use of the CVSgrab visualization tool for understanding the debugging activity in the Mozilla project. We show how to display the distribution of different bug types over the project structure, locate project components which undergo heavy debugging activity, and get insight in the bug evolution in time.

  11. Scoping Study of Machine Learning Techniques for Visualization and Analysis of Multi-source Data in Nuclear Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Yonggang

    2018-05-07

    In implementation of nuclear safeguards, many different techniques are being used to monitor operation of nuclear facilities and safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagers, digital seals to open source search and reports of onsite inspections/verifications. Each technique measures one or more unique properties related to nuclear materials or operation processes. Because these data sets have no or loose correlations, it could be beneficial to analyze the data sets together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning based multi-modality analysis could be effective tools in such integrated analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential values in nuclear safeguards.

  12. Visualization tool. 3DAVS and polarization-type VR system

    International Nuclear Information System (INIS)

    Takeda, Yasuhiro; Ueshima, Yutaka

    2003-01-01

    In visualization work on simulation data in advanced research fields, the results presented in reports and presentations have largely remained at the stage of still pictures or 2-dimensional animations, despite the recent abundance of visualization software. With the recent progress of computational environments, however, ever more complicated phenomena can be computed easily, so the results increasingly need to be presented in a comprehensible and intelligible form. This inevitably calls for animation rather than still pictures, and for 3-dimensional (virtual reality) display rather than 2-dimensional display. In this report, two visualization tools, 3DAVS and the Polarization-Type VR system, are described as methods for presenting data after visualization processing. (author)

  13. VRML and Collaborative Environments: New Tools for Networked Visualization

    Science.gov (United States)

    Crutcher, R. M.; Plante, R. L.; Rajlich, P.

    We present two new applications that engage the network as a tool for astronomical research and/or education. The first is a VRML server which allows users over the Web to interactively create three-dimensional visualizations of FITS images contained in the NCSA Astronomy Digital Image Library (ADIL). The server's Web interface allows users to select images from the ADIL, fill in processing parameters, and create renderings featuring isosurfaces, slices, contours, and annotations; the often extensive computations are carried out on an NCSA SGI supercomputer server without the user having an individual account on the system. The user can then download the 3D visualizations as VRML files, which may be rotated and manipulated locally on virtually any class of computer. The second application is the ADILBrowser, a part of the NCSA Horizon Image Data Browser Java package. ADILBrowser allows a group of participants to browse images from the ADIL within a collaborative session. The collaborative environment is provided by the NCSA Habanero package which includes text and audio chat tools and a white board. The ADILBrowser is just an example of a collaborative tool that can be built with the Horizon and Habanero packages. The classes provided by these packages can be assembled to create custom collaborative applications that visualize data either from local disk or from anywhere on the network.
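
    A small stand-alone sketch of the underlying workflow, reading a FITS cube and rendering a slice with contours, is given below using astropy and matplotlib; the cube is synthetic and written to disk first so that the example is self-contained, and the isosurface/VRML generation of the original service is not reproduced.

        import numpy as np
        from astropy.io import fits
        import matplotlib.pyplot as plt

        # Build a small synthetic spectral-line cube and round-trip it through FITS,
        # standing in for an image fetched from a digital image library.
        z, y, x = np.mgrid[0:32, 0:64, 0:64]
        cube = np.exp(-((x - 32) ** 2 + (y - 32) ** 2) / 200.0) * np.exp(-((z - 16) ** 2) / 40.0)
        fits.writeto("synthetic_cube.fits", cube.astype("float32"), overwrite=True)

        data = fits.getdata("synthetic_cube.fits")

        # A channel slice plus intensity contours: two of the rendering styles
        # (slices, contours) mentioned in the abstract.
        channel = data[16]
        plt.imshow(channel, origin="lower", cmap="gray")
        plt.contour(channel, levels=5, colors="cyan")
        plt.savefig("channel_slice.png", dpi=150)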

  14. Which energy mix for the UK (United Kingdom)? An evolutive descriptive mapping with the integrated GAIA (graphical analysis for interactive aid)–AHP (analytic hierarchy process) visualization tool

    International Nuclear Information System (INIS)

    Ishizaka, Alessio; Siraj, Sajid; Nemery, Philippe

    2016-01-01

    Although Multi-Criteria Decision Making methods have been extensively used in energy planning, their descriptive use has rarely been considered. In this paper, we add an evolutionary description phase as an extension to the AHP (analytic hierarchy process) method that helps policy makers gain insights into their decision problems. The proposed extension has been implemented in open-source software that allows users to visualize the difference of opinions within a decision process, and also the evolution of preferences over time. The method was tested in a two-phase experiment to understand the evolution of opinions on energy sources. Participants were asked to provide their preferences for different energy sources for the next twenty years for the United Kingdom. They were first asked to compare the options intuitively without using any structured approach, and then were given three months to compare the same set of options after collecting detailed information on the technical, economic, environmental and social impacts created by each of the selected energy sources. The proposed visualization method allows us to quickly discover the preference directions, and also the changes in preferences from the first to the second phase. The proposed tool can help policy makers better understand energy planning problems, leading towards better planning and decisions in the energy sector. - Highlights: • We introduce a descriptive visual analysis tool for the analytic hierarchy process. • The method has been implemented as an open-source preference elicitation tool. • We analyse user preferences in the energy sector using this method. • The tool also provides a way to visualize temporal preference changes. • The main negative temporal shift in the ranking was found for nuclear energy.
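
    For readers unfamiliar with the AHP machinery behind the tool, the sketch below derives priorities from a pairwise comparison matrix via its principal eigenvector and checks consistency; the judgments for the four example energy options are invented and are not the study's data.

        import numpy as np

        # Pairwise comparison matrix for four energy options (invented judgments,
        # not the study's data); rows/columns: wind, solar, gas, nuclear.
        A = np.array([
            [1.0, 2.0, 4.0, 3.0],
            [1/2, 1.0, 3.0, 2.0],
            [1/4, 1/3, 1.0, 1/2],
            [1/3, 1/2, 2.0, 1.0],
        ])

        # AHP priorities: principal right eigenvector of A, normalised to sum to 1.
        vals, vecs = np.linalg.eig(A)
        idx = np.argmax(np.real(vals))
        w = np.abs(np.real(vecs[:, idx]))
        w = w / w.sum()

        # Consistency index and ratio (random index RI = 0.90 for n = 4);
        # a consistency ratio below 0.1 is conventionally considered acceptable.
        lam = np.real(vals[idx])
        ci = (lam - A.shape[0]) / (A.shape[0] - 1)
        print("priorities:", w.round(3), "consistency ratio:", round(ci / 0.90, 3))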

  15. Users’ perception of visual aesthetics and usefulness of a web-based educational tool

    OpenAIRE

    Sánchez Franco, Manuel Jesús; Villarejo Ramos, Ángel Francisco; Peral Peral, Begoña; Buitrago Esquinas, Eva María; Roldán Salgueiro, José Luis

    2013-01-01

    As a result of our research we have become increasingly aware of the relevance of visual design in understanding learners’ attitudes towards the use of virtual tools. Likewise, perceived usefulness is an essential antecedent of the cumulative impressions of, and preferences for, such tools. Therefore, the aim of this study is to investigate the main effects of visual design and usefulness on learning and productivity in the domain of web-based educational tools. Structural Equation M...

  16. Open source tools for fluorescent imaging.

    Science.gov (United States)

    Hamilton, Nicholas A

    2012-01-01

    As microscopy becomes increasingly automated and imaging expands in the spatial and time dimensions, quantitative analysis tools for fluorescent imaging are becoming critical to remove both bottlenecks in throughput as well as fully extract and exploit the information contained in the imaging. In recent years there has been a flurry of activity in the development of bio-image analysis tools and methods with the result that there are now many high-quality, well-documented, and well-supported open source bio-image analysis projects with large user bases that cover essentially every aspect from image capture to publication. These open source solutions are now providing a viable alternative to commercial solutions. More importantly, they are forming an interoperable and interconnected network of tools that allow data and analysis methods to be shared between many of the major projects. Just as researchers build on, transmit, and verify knowledge through publication, open source analysis methods and software are creating a foundation that can be built upon, transmitted, and verified. Here we describe many of the major projects, their capabilities, and features. We also give an overview of the current state of open source software for fluorescent microscopy analysis and the many reasons to use and develop open source methods. Copyright © 2012 Elsevier Inc. All rights reserved.

  17. Next generation tools for genomic data generation, distribution, and visualization.

    Science.gov (United States)

    Nix, David A; Di Sera, Tonya L; Dalley, Brian K; Milash, Brett A; Cundick, Robert M; Quinn, Kevin S; Courdy, Samir J

    2010-09-09

    With the rapidly falling cost and availability of high throughput sequencing and microarray technologies, the bottleneck for effectively using genomic analysis in the laboratory and clinic is shifting to one of effectively managing, analyzing, and sharing genomic data. Here we present three open-source, platform independent, software tools for generating, analyzing, distributing, and visualizing genomic data. These include a next generation sequencing/microarray LIMS and analysis project center (GNomEx); an application for annotating and programmatically distributing genomic data using the community vetted DAS/2 data exchange protocol (GenoPub); and a standalone Java Swing application (GWrap) that makes cutting edge command line analysis tools available to those who prefer graphical user interfaces. Both GNomEx and GenoPub use the rich client Flex/Flash web browser interface to interact with Java classes and a relational database on a remote server. Both employ a public-private user-group security model enabling controlled distribution of patient and unpublished data alongside public resources. As such, they function as genomic data repositories that can be accessed manually or programmatically through DAS/2-enabled client applications such as the Integrated Genome Browser. These tools have gained wide use in our core facilities, research laboratories and clinics and are freely available for non-profit use. See http://sourceforge.net/projects/gnomex/, http://sourceforge.net/projects/genoviz/, and http://sourceforge.net/projects/useq.

  18. Open source Modeling and optimization tools for Planning

    Energy Technology Data Exchange (ETDEWEB)

    Peles, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    Open source modeling and optimization tools for planning The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in the state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  19. PRIDE Inspector Toolsuite: Moving Toward a Universal Visualization Tool for Proteomics Data Standard Formats and Quality Assessment of ProteomeXchange Datasets.

    Science.gov (United States)

    Perez-Riverol, Yasset; Xu, Qing-Wei; Wang, Rui; Uszkoreit, Julian; Griss, Johannes; Sanchez, Aniel; Reisinger, Florian; Csordas, Attila; Ternent, Tobias; Del-Toro, Noemi; Dianes, Jose A; Eisenacher, Martin; Hermjakob, Henning; Vizcaíno, Juan Antonio

    2016-01-01

    The original PRIDE Inspector tool was developed as an open source standalone tool to enable the visualization and validation of mass-spectrometry (MS)-based proteomics data before data submission, or of data already publicly available in the Proteomics Identifications (PRIDE) database. The initial implementation of the tool focused on visualizing PRIDE data by supporting the PRIDE XML format and direct access to private (password protected) and public experiments in PRIDE. The ProteomeXchange (PX) Consortium has been set up to enable a better integration of existing public proteomics repositories, maximizing its benefit to the scientific community through the implementation of standard submission and dissemination pipelines. Within the Consortium, PRIDE is focused on supporting submissions of tandem MS data. The increasing use and popularity of the new Proteomics Standards Initiative (PSI) data standards such as mzIdentML and mzTab, and the diversity of workflows supported by the PX resources, prompted us to design and implement a new suite of algorithms and libraries that would build upon the success of the original PRIDE Inspector and would enable users to visualize and validate PX "complete" submissions. The PRIDE Inspector Toolsuite supports the handling and visualization of different experimental output files, ranging from spectra (mzML, mzXML, and the most popular peak list formats) and peptide and protein identification results (mzIdentML, PRIDE XML, mzTab) to quantification data (mzTab, PRIDE XML), using a modular and extensible set of open-source, cross-platform libraries. We believe that the PRIDE Inspector Toolsuite represents a milestone in the visualization and quality assessment of proteomics data. It is freely available at http://github.com/PRIDE-Toolsuite/. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  20. Visualizing Debugging Activity in Source Code Repositories

    OpenAIRE

    Voinea, Lucian; Telea, Alexandru

    2007-01-01

    We present the use of the CVSgrab visualization tool for understanding the debugging activity in the Mozilla project. We show how to display the distribution of different bug types over the project structure, locate project components which undergo heavy debugging activity, and get insight in the bug evolution in time.

  1. Development of software tools for 4-D visualization and quantitative analysis of PHITS simulation results

    International Nuclear Information System (INIS)

    Furutaka, Kazuyoshi

    2015-02-01

    A suite of software tools has been developed to facilitate the development of apparatus using a radiation transport simulation code PHITS by enabling 4D visualization (3D space and time) and quantitative analysis of so-called dieaway plots. To deliver useable tools as soon as possible, the existing software was utilized as much as possible; ParaView will be used for the 4D visualization of the results, whereas the analyses of dieaway plots will be done with ROOT toolkit with a tool named “diana”. To enable 4D visualization using ParaView, a group of tools (angel2vtk, DispDCAS1, CamPos) has been developed for the conversion of the data format to the one which can be read from ParaView and to ease the visualization. (author)
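
    The data-format conversion step can be illustrated with a generic writer for the legacy ASCII VTK format, which ParaView reads directly; this is a minimal stand-in for the angel2vtk conversion, with a synthetic scalar field in place of PHITS output.

        import numpy as np

        def write_vtk_structured_points(filename, field, spacing=(1.0, 1.0, 1.0)):
            """Write a 3-D scalar field (indexed [z, y, x]) to a legacy ASCII VTK
            STRUCTURED_POINTS file that ParaView can open directly."""
            nz, ny, nx = field.shape
            with open(filename, "w") as f:
                f.write("# vtk DataFile Version 3.0\n")
                f.write("scalar field\nASCII\nDATASET STRUCTURED_POINTS\n")
                f.write(f"DIMENSIONS {nx} {ny} {nz}\n")
                f.write("ORIGIN 0 0 0\n")
                f.write(f"SPACING {spacing[0]} {spacing[1]} {spacing[2]}\n")
                f.write(f"POINT_DATA {nx * ny * nz}\n")
                f.write("SCALARS value float 1\nLOOKUP_TABLE default\n")
                for v in field.ravel():   # C order on a [z, y, x] array: x varies fastest, as VTK expects
                    f.write(f"{v:.6e}\n")

        # Example: a toy dose-rate-like field decaying away from the centre of the box.
        z, y, x = np.mgrid[0:20, 0:30, 0:30]
        field = np.exp(-((x - 15) ** 2 + (y - 15) ** 2 + (z - 10) ** 2) / 50.0)
        write_vtk_structured_points("field.vtk", field)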

  2. Development and Evaluation of Secure Socket Layer Visualization Tool with Packet Capturing Function

    Directory of Open Access Journals (Sweden)

    Arai Masayuki

    2015-01-01

    Full Text Available Secure Socket Layer (SSL) has become a fundamental technology that secures browser-processed personal details sent to the server. As a result, communication and computer engineers are advised to learn the protocol. However, understanding SSL is very difficult because of its intricate communication procedure. To solve this problem, we developed a visualization tool for understanding SSL. This paper describes the design, implementation methods, and evaluation of the tool. The evaluation results show that the visualization tool is effective for learning SSL.

  3. A survey of open source tools for business intelligence

    DEFF Research Database (Denmark)

    Thomsen, Christian; Pedersen, Torben Bach

    2005-01-01

    The industrial use of open source Business Intelligence (BI) tools is not yet common. It is therefore of interest to explore which possibilities are available for open source BI and compare the tools. In this survey paper, we consider the capabilities of a number of open source tools for BI....... In the paper, we consider three Extract-Transform-Load (ETL) tools, three On-Line Analytical Processing (OLAP) servers, two OLAP clients, and four database management systems (DBMSs). Further, we describe the licenses that the products are released under. It is argued that the ETL tools are still not very...

  4. Footprints: A Visual Search Tool that Supports Discovery and Coverage Tracking.

    Science.gov (United States)

    Isaacs, Ellen; Domico, Kelly; Ahern, Shane; Bart, Eugene; Singhal, Mudita

    2014-12-01

    Searching a large document collection to learn about a broad subject involves the iterative process of figuring out what to ask, filtering the results, identifying useful documents, and deciding when one has covered enough material to stop searching. We are calling this activity "discoverage," discovery of relevant material and tracking coverage of that material. We built a visual analytic tool called Footprints that uses multiple coordinated visualizations to help users navigate through the discoverage process. To support discovery, Footprints displays topics extracted from documents that provide an overview of the search space and are used to construct searches visuospatially. Footprints allows users to triage their search results by assigning a status to each document (To Read, Read, Useful), and those status markings are shown on interactive histograms depicting the user's coverage through the documents across dates, sources, and topics. Coverage histograms help users notice biases in their search and fill any gaps in their analytic process. To create Footprints, we used a highly iterative, user-centered approach in which we conducted many evaluations during both the design and implementation stages and continually modified the design in response to feedback.

  5. Do Bedside Visual Tools Improve Patient and Caregiver Satisfaction? A Systematic Review of the Literature.

    Science.gov (United States)

    Goyal, Anupama A; Tur, Komalpreet; Mann, Jason; Townsend, Whitney; Flanders, Scott A; Chopra, Vineet

    2017-11-01

    Although common, the impact of low-cost bedside visual tools, such as whiteboards, on patient care is unclear. To systematically review the literature and assess the influence of bedside visual tools on patient satisfaction. Medline, Embase, SCOPUS, Web of Science, CINAHL, and CENTRAL. Studies of adult or pediatric hospitalized patients reporting physician identification, understanding of provider roles, patient-provider communication, and satisfaction with care from the use of visual tools were included. Outcomes were categorized as positive, negative, or neutral based on survey responses for identification, communication, and satisfaction. Two reviewers screened studies, extracted data, and assessed the risk of study bias. Sixteen studies met the inclusion criteria. Visual tools included whiteboards (n = 4), physician pictures (n = 7), whiteboard and picture (n = 1), electronic medical record-based patient portals (n = 3), and formatted notepads (n = 1). Tools improved patients' identification of providers (13/13 studies). The impact on understanding the providers' roles was largely positive (8/10 studies). Visual tools improved patient-provider communication (4/5 studies) and satisfaction (6/8 studies). In adults, satisfaction varied between positive with the use of whiteboards (2/5 studies) and neutral with pictures (1/5 studies). Satisfaction related to pictures in pediatric patients was either positive (1/3 studies) or neutral (1/3 studies). Differences in tool format (individual pictures vs handouts with pictures of all providers) and study design (randomized vs cohort) may explain variable outcomes. The use of bedside visual tools appears to improve patient recognition of providers and patient-provider communication. Future studies that include better design and outcome assessment are necessary before widespread use can be recommended. © 2017 Society of Hospital Medicine

  6. The Tools, Approaches and Applications of Visual Literacy in the Visual Arts Department of Cross River University of Technology, Calabar, Nigeria

    Science.gov (United States)

    Ecoma, Victor

    2016-01-01

    The paper reflects upon the tools, approaches and applications of visual literacy in the Visual Arts Department of Cross River University of Technology, Calabar, Nigeria. The objective of the discourse is to examine how the visual arts training and practice equip students with skills in visual literacy through methods of production, materials and…

  7. KENO3D Visualization Tool for KENO V.a and KENO-VI Geometry Models

    International Nuclear Information System (INIS)

    Horwedel, J.E.; Bowman, S.M.

    2000-01-01

    Criticality safety analyses often require detailed modeling of complex geometries. Effective visualization tools can enhance checking the accuracy of these models. This report describes the KENO3D visualization tool developed at the Oak Ridge National Laboratory (ORNL) to provide visualization of KENO V.a and KENO-VI criticality safety models. The development of KENO3D is part of the current efforts to enhance the SCALE (Standardized Computer Analyses for Licensing Evaluations) computer software system

  8. Porcupine: A visual pipeline tool for neuroimaging analysis.

    Directory of Open Access Journals (Sweden)

    Tim van Mourik

    2018-05-01

    Full Text Available The field of neuroimaging is rapidly adopting a more reproducible approach to data acquisition and analysis. Data structures and formats are being standardised and data analyses are getting more automated. However, as data analysis becomes more complicated, researchers often have to write longer analysis scripts, spanning different tools across multiple programming languages. This makes it more difficult to share or recreate code, reducing the reproducibility of the analysis. We present a tool, Porcupine, that constructs one's analysis visually and automatically produces analysis code. The graphical representation improves understanding of the performed analysis, while retaining the flexibility of modifying the produced code manually to custom needs. Not only does Porcupine produce the analysis code, it also creates a shareable environment for running the code in the form of a Docker image. Together, this forms a reproducible way of constructing, visualising and sharing one's analysis. Currently, Porcupine links to Nipype functionalities, which in turn accesses most standard neuroimaging analysis tools. Our goal is to release researchers from the constraints of specific implementation details, thereby freeing them to think about novel and creative ways to solve a given problem. Porcupine improves the overview researchers have of their processing pipelines, and facilitates both the development and communication of their work. This will reduce the threshold at which less expert users can generate reusable pipelines. With Porcupine, we bridge the gap between a conceptual and an implementational level of analysis and make it easier for researchers to create reproducible and shareable science. We provide a wide range of examples and documentation, as well as installer files for all platforms on our website: https://timvanmourik.github.io/Porcupine. Porcupine is free, open source, and released under the GNU General Public License v3.0.
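
    For orientation, a hand-written Nipype pipeline of the kind Porcupine constructs graphically might look like the sketch below; the interface choices (FSL BET and FLIRT) and file names are assumptions for illustration, and this is not actual Porcupine-generated code.

        # A minimal hand-written Nipype pipeline; it assumes FSL is installed
        # and that the two input files exist.
        from nipype import Node, Workflow
        from nipype.interfaces.fsl import BET, FLIRT

        skullstrip = Node(BET(in_file="sub-01_T1w.nii.gz", frac=0.5), name="skullstrip")
        register = Node(FLIRT(reference="template.nii.gz"), name="register")

        wf = Workflow(name="minimal_preproc", base_dir="work")
        wf.connect(skullstrip, "out_file", register, "in_file")
        wf.run()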

  9. Funding Sources for Visually Impaired Students in Higher Education.

    Science.gov (United States)

    Traber, M.

    1987-01-01

    Financial aid sources available to visually handicapped students for postsecondary educational, vocational, or technical programs are outlined. Sources include national and state blindness agencies, colleges and universities, state vocational rehabilitation agencies, and the federal government. (Author/JDD)

  10. An integrated audio-visual impact tool for wind turbine installations

    International Nuclear Information System (INIS)

    Lymberopoulos, N.; Belessis, M.; Wood, M.; Voutsinas, S.

    1996-01-01

    An integrated software tool was developed for the design of wind parks that takes into account their visual and audio impact. The application is built on a powerful hardware platform and is fully operated through a graphical user interface. The topography, the wind turbines and the daylight conditions are realised digitally. The wind park can be animated in real time and the user can take virtual walks in it while the set-up of the park can be altered interactively. In parallel, the wind speed levels on the terrain, the emitted noise intensity, the annual energy output and the cash flow can be estimated at any stage of the session and prompt the user for rearrangements. The tool has been used to visually simulate existing wind parks in St. Breok, UK and Andros Island, Greece. The results lead to the conclusion that such a tool can assist in the public acceptance and licensing procedures of wind parks. (author)

  11. 3D Immersive Visualization: An Educational Tool in Geosciences

    Science.gov (United States)

    Pérez-Campos, N.; Cárdenas-Soto, M.; Juárez-Casas, M.; Castrejón-Pineda, R.

    2007-05-01

    3D immersive visualization is an innovative tool currently used in various disciplines, such as medicine, architecture, engineering, video games, etc. Recently, the Universidad Nacional Autónoma de México (UNAM) mounted a visualization theater (Ixtli) with leading-edge technology, for academic and research purposes that require immersive 3D tools for a better understanding of the concepts involved. The Division of Engineering in Earth Sciences of the School of Engineering, UNAM, is running a project focused on visualization of geoscience data. Its objective is to incorporate educational material in geoscience courses in order to support and to improve the teaching-learning process, especially in well-known difficult topics for students. As part of the project, professors and students are trained in visualization techniques, then their data are adapted and visualized in Ixtli as part of a class or a seminar, where all the attendees can interact, not only among each other but also with the object under study. As part of our results, we present specific examples used in basic geophysics courses, such as interpreted seismic cubes, seismic-wave propagation models, and structural models from bathymetric, gravimetric and seismological data; as well as examples from ongoing applied projects, such as a modeled SH upward wave, the occurrence of an earthquake cluster in 1999 in the Popocatepetl volcano, and a risk atlas from Delegación Alvaro Obregón in Mexico City. All these examples, plus those to come, constitute a library for students and professors willing to explore another dimension of the teaching-learning process. Furthermore, this experience can be enhanced by rich discussions and interactions via videoconferences with other universities and researchers.

  12. Next generation tools for genomic data generation, distribution, and visualization

    Directory of Open Access Journals (Sweden)

    Nix David A

    2010-09-01

    Full Text Available Abstract Background With the rapidly falling cost and availability of high throughput sequencing and microarray technologies, the bottleneck for effectively using genomic analysis in the laboratory and clinic is shifting to one of effectively managing, analyzing, and sharing genomic data. Results Here we present three open-source, platform independent, software tools for generating, analyzing, distributing, and visualizing genomic data. These include a next generation sequencing/microarray LIMS and analysis project center (GNomEx); an application for annotating and programmatically distributing genomic data using the community-vetted DAS/2 data exchange protocol (GenoPub); and a standalone Java Swing application (GWrap) that makes cutting-edge command-line analysis tools available to those who prefer graphical user interfaces. Both GNomEx and GenoPub use the rich client Flex/Flash web browser interface to interact with Java classes and a relational database on a remote server. Both employ a public-private user-group security model enabling controlled distribution of patient and unpublished data alongside public resources. As such, they function as genomic data repositories that can be accessed manually or programmatically through DAS/2-enabled client applications such as the Integrated Genome Browser. Conclusions These tools have gained wide use in our core facilities, research laboratories and clinics and are freely available for non-profit use. See http://sourceforge.net/projects/gnomex/, http://sourceforge.net/projects/genoviz/, and http://sourceforge.net/projects/useq.

  13. Open source tracking and analysis of adult Drosophila locomotion in Buridan's paradigm with and without visual targets.

    Directory of Open Access Journals (Sweden)

    Julien Colomb

    Full Text Available BACKGROUND: Insects have been among the most widely used model systems for studying the control of locomotion by nervous systems. In Drosophila, we implemented a simple test for locomotion: in Buridan's paradigm, flies walk back and forth between two inaccessible visual targets [1]. Until today, the lack of easily accessible tools for tracking the fly position and analyzing its trajectory has probably contributed to the slow acceptance of Buridan's paradigm. METHODOLOGY/PRINCIPAL FINDINGS: We present here a package of open source software designed to track a single animal walking in a homogeneous environment (Buritrack) and to analyze its trajectory. The Centroid Trajectory Analysis (CeTrAn) software is coded in the open source statistics project R. It extracts eleven metrics and includes correlation analyses and a Principal Components Analysis (PCA). It was designed to be easily customized to personal requirements. In combination with inexpensive hardware, these tools can readily be used for teaching and research purposes. We demonstrate the capabilities of our package by measuring the locomotor behavior of adult Drosophila melanogaster (whose wings were clipped), either in the presence or in the absence of visual targets, and comparing the latter to different computer-generated data. The analysis of the trajectories confirms that flies are centrophobic and shows that inaccessible visual targets can alter the orientation of the flies without changing their overall patterns of activity. CONCLUSIONS/SIGNIFICANCE: Using computer generated data, the analysis software was tested, and chance values for some metrics (as well as chance values for their correlation) were set. Our results prompt the hypothesis that fixation behavior is observed only if negative phototaxis can overcome the propensity of the flies to avoid the center of the platform. Together with our companion paper, we provide new tools to promote Open Science as well as the collection and
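
    CeTrAn itself is written in R; as a rough, language-shifted illustration of the kind of trajectory metrics such a tool extracts (the two measures below are generic examples and not necessarily among CeTrAn's eleven metrics), a short Python sketch over tracked x/y positions could look like this:

    ```python
    # Toy trajectory metrics in the spirit of CeTrAn (the original tool is in R).
    # Path length and mean distance from the arena centre are generic examples,
    # not necessarily the metrics computed by the published software.
    import numpy as np

    def trajectory_metrics(x, y, centre=(0.0, 0.0)):
        x = np.asarray(x, dtype=float)
        y = np.asarray(y, dtype=float)
        # Total path length: sum of distances between consecutive positions.
        path_length = np.sum(np.hypot(np.diff(x), np.diff(y)))
        # Mean distance from the platform centre (a simple centrophobism indicator).
        mean_centre_dist = np.mean(np.hypot(x - centre[0], y - centre[1]))
        return {"path_length": path_length,
                "mean_centre_distance": mean_centre_dist}

    if __name__ == "__main__":
        # Synthetic walk oscillating between two targets on the x axis.
        t = np.linspace(0, 4 * np.pi, 200)
        print(trajectory_metrics(np.cos(t), 0.2 * np.sin(t)))
    ```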

  14. Statistical and Visualization Data Mining Tools for Foundry Production

    Directory of Open Access Journals (Sweden)

    M. Perzyk

    2007-07-01

    Full Text Available In recent years, a rapid development of a new, interdisciplinary knowledge area called data mining has been observed. Its main task is extracting useful information from previously collected large amounts of data. The main possibilities and potential applications of data mining in manufacturing industry are characterized. The main types of data mining techniques are briefly discussed, including statistical, artificial intelligence, database and visualization tools. The statistical methods and visualization methods are presented in more detail, showing their general possibilities, advantages as well as characteristic examples of applications in foundry production. Results of the author's research are presented, aimed at validation of selected statistical tools which can be easily and effectively used in manufacturing industry. A performance analysis of ANOVA- and contingency-table-based methods, dedicated to determination of the most significant process parameters as well as to detection of possible interactions among them, has been made. Several numerical tests have been performed using simulated data sets, with assumed hidden relationships, as well as some real data, related to the strength of ductile cast iron, collected in a foundry. It is concluded that the statistical methods offer relatively easy and fairly reliable tools for extraction of that type of knowledge about foundry manufacturing processes. However, further research is needed, aimed at explanation of some imperfections of the investigated tools as well as assessment of their validity for more complex tasks.
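
    As a minimal, self-contained illustration of the two statistical tools examined in the paper (one-way ANOVA and contingency-table analysis), the following Python sketch applies SciPy to made-up process data; the variable names and numbers are hypothetical and are not the author's foundry dataset:

    ```python
    # One-way ANOVA and a chi-square contingency-table test on synthetic data,
    # illustrating the significance tests discussed for foundry process parameters.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Hypothetical tensile-strength samples for three melt-treatment variants.
    strength_a = rng.normal(420, 15, size=30)
    strength_b = rng.normal(435, 15, size=30)
    strength_c = rng.normal(425, 15, size=30)
    f_stat, p_anova = stats.f_oneway(strength_a, strength_b, strength_c)
    print("ANOVA: F = %.2f, p = %.3f" % (f_stat, p_anova))

    # Hypothetical contingency table: casting defects versus mould type.
    #                  defect  no defect
    table = np.array([[12,     88],     # mould type 1
                      [25,     75]])    # mould type 2
    chi2, p_chi2, dof, _ = stats.chi2_contingency(table)
    print("Chi-square: chi2 = %.2f, dof = %d, p = %.3f" % (chi2, dof, p_chi2))
    ```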

  15. Experiences of graduate students: Using Cabri as a visualization tool in math education

    Directory of Open Access Journals (Sweden)

    Çiğdem Gül

    2014-12-01

    Full Text Available Through the use of graphic calculators and dynamic software running on computers and mobile devices, students can learn complex algebraic concepts. The purpose of this study is to investigate the experiences of graduate students using Cabri as a visualization tool in math education. A qualitative case study design was used. Five graduate students studying in the non-thesis math program of a university located in the Black Sea region were the participants of the study. As a dynamic learning tool, Cabri provided participants with an environment where they visually discovered the geometry. It was concluded that dynamic learning tools like Cabri have huge potential for visually teaching the challenging concepts that students struggle to imagine. Further research should investigate the potential plans for integrating the use of dynamic learning software into the math curriculum.

  16. Users’ perception of visual design and usefulness of a web-based educational tool

    OpenAIRE

    Sánchez Franco, Manuel Jesús; Villarejo Ramos, Ángel Francisco; Peral Peral, Begoña; Buitrago Esquinas, Eva María; Roldán Salgueiro, José Luis

    2012-01-01

    Our research has become increasingly aware of the relevance of visual design in understanding learners’ attitudes towards the use of virtual tools. Likewise, perceived usefulness is an essential antecedent of the cumulative impressions of and preferences for them. Therefore, the aim of this study is to investigate the main effects of visual design and usefulness on learning and productivity in the domain of web-based educational tools. A Structural Equation Modelling, specifically Partial Lea...

  17. Visualization and Quality Control Web Tools for CERES Products

    Science.gov (United States)

    Mitrescu, C.; Doelling, D. R.

    2017-12-01

    The NASA CERES project continues to provide the scientific community with a wide variety of satellite-derived data products such as observed TOA broadband shortwave and longwave fluxes, computed TOA and surface fluxes, as well as cloud, aerosol, and other atmospheric parameters. They encompass a wide range of temporal and spatial resolutions, suited to specific applications. CERES data is used mostly by climate modeling communities but also by a wide variety of educational institutions. To better serve our users, a web-based Ordering and Visualization Tool (OVT) was developed using Open Source Software such as Eclipse, Java, JavaScript, OpenLayers, Flot, Google Maps, Python, and others. Due to increased demand by our own scientists, we also implemented a series of specialized functions to be used in the process of CERES Data Quality Control (QC) such as 1- and 2-D histograms, anomalies and differences, temporal and spatial averaging, side-by-side parameter comparison, and others that made the process of QC far easier and faster, but more importantly far more portable. With the integration of ground-site observed surface fluxes, we further enable the CERES project to QC the CERES-computed surface fluxes. An overview of the CERES OVT basic functions using Open Source Software, as well as future steps in expanding its capabilities, will be presented at the meeting.
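
    The QC functions listed above are built into the web tool itself; simply to illustrate the kinds of operations they perform (differences and anomalies, temporal averaging, 2-D histograms), the following standalone Python sketch applies them to a small synthetic flux array (the grid size, values and variable names are invented for the example):

    ```python
    # Toy versions of QC-style operations mentioned for the CERES OVT: temporal
    # averaging, anomaly and difference fields, and a 2-D histogram. The arrays
    # are synthetic placeholders, not CERES data products.
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical monthly fluxes on a tiny 12 x (10 x 20) lat/lon grid.
    sw_flux = rng.normal(100, 10, size=(12, 10, 20))
    lw_flux = rng.normal(240, 8, size=(12, 10, 20))

    # Temporal average and anomaly relative to the time mean.
    sw_mean = sw_flux.mean(axis=0)
    sw_anomaly = sw_flux - sw_mean[None, :, :]

    # Difference between two parameters, e.g. for side-by-side comparison.
    sw_minus_lw = sw_flux - lw_flux

    # 2-D histogram relating the two parameters over all samples.
    hist2d, sw_edges, lw_edges = np.histogram2d(
        sw_flux.ravel(), lw_flux.ravel(), bins=25)

    print("max |anomaly|  :", np.abs(sw_anomaly).max())
    print("histogram shape:", hist2d.shape)
    ```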

  18. Experimental Evaluation of Electric Power Grid Visualization Tools in the EIOC

    Energy Technology Data Exchange (ETDEWEB)

    Greitzer, Frank L.; Dauenhauer, Peter M.; Wierks, Tamara G.; Podmore, Robin; Dalton, Angela C.

    2009-12-01

    The present study follows an initial human factors evaluation of four electric power grid visualization tools and reports on an empirical evaluation of two of the four tools: Graphical Contingency Analysis, and Phasor State Estimator. The evaluation was conducted within specific experimental studies designed to measure the impact on decision making performance.

  19. Using data visualization tools to support degradation assessment in nuclear piping

    International Nuclear Information System (INIS)

    Jyrkama, M.I.; Pandey, M.D.

    2012-01-01

    Nuclear utilities collect a vast amount of in-service inspection data as part of periodic inspection plans and the detailed assessment and monitoring of various degradation mechanisms, such as fretting, corrosion, and creep. In many cases, the focus is primarily on ensuring that the observed minimum or maximum values are within the acceptable regulatory limits, while the rest of the (often costly) surveillance data remains unused and unanalyzed. The objective of this study is to illustrate how data visualization tools can be used effectively to analyze and consider all of the in-service inspection data, and hence provide valuable support for the degradation assessment in nuclear piping. The 2D and 3D visualization tools discussed in this paper were developed mainly in the context of flow accelerated corrosion (FAC) assessment in feeder piping, where the complex pipe geometries and flow conditions have a significant impact on the ultrasonic (UT) wall thickness measurements. The visualization of eddy current inspection results from the assessment of pitting corrosion of steam generator tubing will also be discussed briefly. The visualization tools provide a more comprehensive view of the degree and extent of degradation, and hence directly support the planning of future inspection of critical components by identifying key locations and areas for detailed monitoring. The results furthermore increase the confidence and reliability of fitness-for-service (FFS) assessments and life cycle management (LCM) planning decisions with respect to component repair or replacement. (author)

  20. Noise Source Visualization Using a Digital Voice Recorder and Low-Cost Sensors.

    Science.gov (United States)

    Cho, Yong Thung

    2018-04-03

    Accurate sound visualization of noise sources is required for optimal noise control. Typically, noise measurement systems require microphones, an analog-digital converter, cables, a data acquisition system, etc., which may not be affordable for potential users. Also, many such systems are not highly portable and may not be convenient for travel. Handheld personal electronic devices such as smartphones and digital voice recorders with relatively lower costs and higher performance have become widely available recently. Even though such devices are highly portable, directly implementing them for noise measurement may lead to erroneous results since such equipment was originally designed for voice recording. In this study, external microphones were connected to a digital voice recorder to conduct measurements and the input received was processed for noise visualization. In this way, a low cost, compact sound visualization system was designed and introduced to visualize two actual noise sources for verification with different characteristics: an enclosed loud speaker and a small air compressor. Reasonable accuracy of noise visualization for these two sources was shown over a relatively wide frequency range. This very affordable and compact sound visualization system can be used for many actual noise visualization applications in addition to educational purposes.

  1. Mesoscale brain explorer, a flexible python-based image analysis and visualization tool.

    Science.gov (United States)

    Haupt, Dirk; Vanni, Matthieu P; Bolanos, Federico; Mitelut, Catalin; LeDue, Jeffrey M; Murphy, Tim H

    2017-07-01

    Imaging of mesoscale brain activity is used to map interactions between brain regions. This work has benefited from the pioneering studies of Grinvald et al., who employed optical methods to image brain function by exploiting the properties of intrinsic optical signals and small molecule voltage-sensitive dyes. Mesoscale interareal brain imaging techniques have been advanced by cell targeted and selective recombinant indicators of neuronal activity. Spontaneous resting state activity is often collected during mesoscale imaging to provide the basis for mapping of connectivity relationships using correlation. However, the information content of mesoscale datasets is vast and is only superficially presented in manuscripts given the need to constrain measurements to a fixed set of frequencies, regions of interest, and other parameters. We describe a new open source tool written in python, termed mesoscale brain explorer (MBE), which provides an interface to process and explore these large datasets. The platform supports automated image processing pipelines with the ability to assess multiple trials and combine data from different animals. The tool provides functions for temporal filtering, averaging, and visualization of functional connectivity relations using time-dependent correlation. Here, we describe the tool and show applications, where previously published datasets were reanalyzed using MBE.
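
    MBE's pipelines operate on full imaging stacks; as a schematic of the final analysis step described above (temporal filtering followed by correlation-based connectivity mapping), the following Python sketch processes synthetic region-of-interest traces (the sampling rate, filter band and number of ROIs are arbitrary assumptions for the example):

    ```python
    # Band-pass filter ROI time series and compute a correlation-based
    # connectivity matrix, schematically mirroring the processing steps
    # described for mesoscale brain explorer. All data here are synthetic.
    import numpy as np
    from scipy.signal import butter, filtfilt

    rng = np.random.default_rng(2)
    fs = 30.0                      # assumed sampling rate in Hz
    n_rois, n_samples = 6, 3000
    traces = rng.normal(size=(n_rois, n_samples))
    traces[1] += 0.5 * traces[0]   # give two ROIs a shared component

    # 0.1-1.0 Hz band-pass (an arbitrary choice for the example).
    b, a = butter(3, [0.1 / (fs / 2), 1.0 / (fs / 2)], btype="bandpass")
    filtered = filtfilt(b, a, traces, axis=1)

    # Pearson correlation between all ROI pairs gives the connectivity matrix.
    connectivity = np.corrcoef(filtered)
    print(np.round(connectivity, 2))
    ```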

  2. Plasma sources for EUV lithography exposure tools

    International Nuclear Information System (INIS)

    Banine, Vadim; Moors, Roel

    2004-01-01

    The source is an integral part of an extreme ultraviolet lithography (EUVL) tool. Such a source, like the EUVL tool as a whole, has to meet extremely demanding technical and cost requirements. The EUVL tool operates at a wavelength in the range 13-14 nm, which requires a major re-thinking of state-of-the-art lithography systems operating in the DUV range. The light production mechanism changes from conventional lamps and lasers to relatively high-temperature emitting plasmas. The light transport, mainly refractive for DUV, should become reflective for EUV. The source specifications are derived from the customer requirements for the complete tool, which are: throughput, cost of ownership (CoO) and imaging quality. The EUVL system is considered a follow-up of the existing DUV-based lithography technology and, while improving the feature resolution, it has to maintain high wafer throughput performance, which is driven by the overall CoO picture. This in turn puts quite high requirements on the collectable in-band power produced by an EUV source. Critical dimension (CD) control requirements, tightened by the improved feature resolution, together with the restrictions of reflective optics, necessitate pulse-to-pulse repeatability, spatial stability control and repetition rates that are substantially better than those of current optical systems. Altogether, the following aspects of the source specification will be addressed: the operating wavelength, the EUV power, the hot spot size, the collectable angle, the repetition rate, the pulse-to-pulse repeatability and the debris-induced lifetime of components.

  3. A Hyperbolic Ontology Visualization Tool for Model Application Programming Interface Documentation

    Science.gov (United States)

    Hyman, Cody

    2011-01-01

    Spacecraft modeling, a critically important part of validating planned spacecraft activities, is currently carried out using a time-consuming method of mission-to-mission model implementation and integration. A current project in early development, Integrated Spacecraft Analysis (ISCA), aims to remedy this hindrance by providing reusable architectures and reducing time spent integrating models with planning and sequencing tools. The principal objective of this internship was to develop a user interface for an experimental ontology-based structure visualization of navigation and attitude control system modeling software. To satisfy this, a number of tree and graph visualization tools were researched and a Java-based hyperbolic graph viewer was selected for experimental adaptation. Early results show promise in the ability to organize and display large amounts of spacecraft model documentation efficiently and effectively through a web browser. This viewer serves as a conceptual implementation for future development, but trials with both ISCA developers and end users should be performed to truly evaluate the effectiveness of continued development of such visualizations.

  4. ALPHACAL: A new user-friendly tool for the calibration of alpha-particle sources.

    Science.gov (United States)

    Timón, A Fernández; Vargas, M Jurado; Gallardo, P Álvarez; Sánchez-Oro, J; Peralta, L

    2018-05-01

    In this work, we present and describe the program ALPHACAL, specifically developed for the calibration of alpha-particle sources. It is therefore more user-friendly and less time-consuming than multipurpose codes developed for a wide range of applications. The program is based on the recently developed code AlfaMC, which simulates specifically the transport of alpha particles. Both cylindrical and point sources mounted on the surface of polished backings can be simulated, as is the convention in experimental measurements of alpha-particle sources. In addition to the efficiency calculation and determination of the backscattering coefficient, some additional tools are available to the user, such as visualization of the energy spectrum and use of energy cut-offs or low-energy tail corrections. ALPHACAL has been implemented in C++ using the Qt library, so it is available for Windows, macOS and Linux platforms. It is free and can be provided upon request to the authors. Copyright © 2018 Elsevier Ltd. All rights reserved.
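
    ALPHACAL itself relies on the AlfaMC transport code; as a deliberately simplified, purely geometric illustration of the Monte Carlo idea behind a counting-efficiency estimate (ignoring the energy loss, scattering and backscattering that AlfaMC actually models), one could write a sketch such as the following:

    ```python
    # Toy geometric Monte Carlo: fraction of isotropically emitted alpha particles
    # from an on-axis point source that reach a coaxial circular detector. This
    # ignores all transport physics handled by a real code such as AlfaMC.
    import numpy as np

    def geometric_efficiency(n=1_000_000, det_radius=1.0, distance=2.0, seed=0):
        rng = np.random.default_rng(seed)
        # Isotropic emission into the upper hemisphere: cos(theta) uniform in [0, 1).
        cos_theta = rng.uniform(0.0, 1.0, n)
        sin_theta = np.sqrt(1.0 - cos_theta**2)
        # Hit condition: distance * tan(theta) <= det_radius, written without division.
        hits = np.count_nonzero(distance * sin_theta <= det_radius * cos_theta)
        # Efficiency per full 4*pi emission: the downward half always misses.
        return 0.5 * hits / n

    if __name__ == "__main__":
        eff = geometric_efficiency()
        # Analytic solid-angle result: 0.5 * (1 - d / sqrt(d**2 + R**2)).
        expected = 0.5 * (1.0 - 2.0 / np.sqrt(2.0**2 + 1.0**2))
        print("MC efficiency  :", round(eff, 4))
        print("analytic value :", round(expected, 4))
    ```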

  5. An interactive visualization tool for multi-channel confocal microscopy data in neurobiology research

    KAUST Repository

    Yong Wan,

    2009-11-01

    Confocal microscopy is widely used in neurobiology for studying the three-dimensional structure of the nervous system. Confocal image data are often multi-channel, with each channel resulting from a different fluorescent dye or fluorescent protein; one channel may have dense data, while another has sparse; and there are often structures at several spatial scales: subneuronal domains, neurons, and large groups of neurons (brain regions). Even qualitative analysis can therefore require visualization using techniques and parameters fine-tuned to a particular dataset. Despite the plethora of volume rendering techniques that have been available for many years, the techniques standardly used in neurobiological research are somewhat rudimentary, such as looking at image slices or maximal intensity projections. Thus there is a real demand from neurobiologists, and biologists in general, for a flexible visualization tool that allows interactive visualization of multi-channel confocal data, with rapid fine-tuning of parameters to reveal the three-dimensional relationships of structures of interest. Together with neurobiologists, we have designed such a tool, choosing visualization methods to suit the characteristics of confocal data and a typical biologist's workflow. We use interactive volume rendering with intuitive settings for multidimensional transfer functions, multiple render modes and multi-views for multi-channel volume data, and embedding of polygon data into volume data for rendering and editing. As an example, we apply this tool to visualize confocal microscopy datasets of the developing zebrafish visual system.

  6. Open Source Next Generation Visualization Software for Interplanetary Missions

    Science.gov (United States)

    Trimble, Jay; Rinker, George

    2016-01-01

    Mission control is evolving quickly, driven by the requirements of new missions, and enabled by modern computing capabilities. Distributed operations, access to data anywhere, data visualization for spacecraft analysis that spans multiple data sources, flexible reconfiguration to support multiple missions, and operator use cases, are driving the need for new capabilities. NASA's Advanced Multi-Mission Operations System (AMMOS), Ames Research Center (ARC) and the Jet Propulsion Laboratory (JPL) are collaborating to build a new generation of mission operations software for visualization, to enable mission control anywhere, on the desktop, tablet and phone. The software is built on an open source platform that is open for contributions (http://nasa.github.io/openmct).

  7. πScope: Python based scientific workbench with MDSplus data visualization tool

    Energy Technology Data Exchange (ETDEWEB)

    Shiraiwa, S., E-mail: shiraiwa@PSFC.MIT.EDU; Fredian, T.; Hillairet, J.; Stillerman, J.

    2016-11-15

    Highlights: • πScope provides great enhancement in MDSplus data visualization. • πScope provides a single platform for both data browsing and complicated analysis. • πScope is scriptable and easily expandable due to its object-oriented design. • πScope is written in Python and available from http://piscope.psfc.mit.edu/. - Abstract: A newly developed Python-based scientific data analysis and visualization tool, πScope (http://piscope.psfc.mit.edu), is reported. The primary motivation is 1) to provide an updated tool to browse MDSplus data beyond the existing dwscope/jScope and 2) to realize a universal foundation on which to construct interface tools that perform computer modeling from experimental data. To visualize MDSplus data, πScope has many features including overplotting different signals and discharges, generating various plot types (line, contour, image, etc.), performing in-panel data analysis using Python scripts, and producing publication-quality graphics. The logic for generating multi-panel plots is designed to be backward compatible with dwscope, enabling smooth migration for users. πScope uses multi-threading in data loading, and is easy to modify and expand due to its object-oriented design. Furthermore, a user can access the data structure both from a GUI and from a script, enabling relatively complex data analysis workflows to be built quickly on πScope.

  8. πScope: Python based scientific workbench with MDSplus data visualization tool

    International Nuclear Information System (INIS)

    Shiraiwa, S.; Fredian, T.; Hillairet, J.; Stillerman, J.

    2016-01-01

    Highlights: • πScope provides great enhancement in MDSplus data visualization. • πScope provides a single platform for both data browsing and complicated analysis. • πScope is scriptable and easily expandable due to its object-oriented design. • πScope is written in Python and available from http://piscope.psfc.mit.edu/. - Abstract: A newly developed Python-based scientific data analysis and visualization tool, πScope (http://piscope.psfc.mit.edu), is reported. The primary motivation is 1) to provide an updated tool to browse MDSplus data beyond the existing dwscope/jScope and 2) to realize a universal foundation on which to construct interface tools that perform computer modeling from experimental data. To visualize MDSplus data, πScope has many features including overplotting different signals and discharges, generating various plot types (line, contour, image, etc.), performing in-panel data analysis using Python scripts, and producing publication-quality graphics. The logic for generating multi-panel plots is designed to be backward compatible with dwscope, enabling smooth migration for users. πScope uses multi-threading in data loading, and is easy to modify and expand due to its object-oriented design. Furthermore, a user can access the data structure both from a GUI and from a script, enabling relatively complex data analysis workflows to be built quickly on πScope.

  9. Open source tools for ATR development and performance evaluation

    Science.gov (United States)

    Baumann, James M.; Dilsavor, Ronald L.; Stubbles, James; Mossing, John C.

    2002-07-01

    Early in almost every engineering project, a decision must be made about tools; should I buy off-the-shelf tools or should I develop my own. Either choice can involve significant cost and risk. Off-the-shelf tools may be readily available, but they can be expensive to purchase and to maintain licenses, and may not be flexible enough to satisfy all project requirements. On the other hand, developing new tools permits great flexibility, but it can be time- (and budget-) consuming, and the end product still may not work as intended. Open source software has the advantages of both approaches without many of the pitfalls. This paper examines the concept of open source software, including its history, unique culture, and informal yet closely followed conventions. These characteristics influence the quality and quantity of software available, and ultimately its suitability for serious ATR development work. We give an example where Python, an open source scripting language, and OpenEV, a viewing and analysis tool for geospatial data, have been incorporated into ATR performance evaluation projects. While this case highlights the successful use of open source tools, we also offer important insight into risks associated with this approach.

  10. Open-source tools for data mining.

    Science.gov (United States)

    Zupan, Blaz; Demsar, Janez

    2008-03-01

    With a growing volume of biomedical databases and repositories, the need to develop a set of tools to address their analysis and support knowledge discovery is becoming acute. The data mining community has developed a substantial set of techniques for computational treatment of these data. In this article, we discuss the evolution of open-source toolboxes that data mining researchers and enthusiasts have developed over the span of a few decades and review several currently available open-source data mining suites. The approaches we review are diverse in data mining methods and user interfaces and also demonstrate that the field and its tools are ready to be fully exploited in biomedical research.

  11. Noise Source Visualization Using a Digital Voice Recorder and Low-Cost Sensors

    Directory of Open Access Journals (Sweden)

    Yong Thung Cho

    2018-04-01

    Full Text Available Accurate sound visualization of noise sources is required for optimal noise control. Typically, noise measurement systems require microphones, an analog-digital converter, cables, a data acquisition system, etc., which may not be affordable for potential users. Also, many such systems are not highly portable and may not be convenient for travel. Handheld personal electronic devices such as smartphones and digital voice recorders with relatively lower costs and higher performance have become widely available recently. Even though such devices are highly portable, directly implementing them for noise measurement may lead to erroneous results since such equipment was originally designed for voice recording. In this study, external microphones were connected to a digital voice recorder to conduct measurements and the input received was processed for noise visualization. In this way, a low cost, compact sound visualization system was designed and introduced to visualize two actual noise sources for verification with different characteristics: an enclosed loud speaker and a small air compressor. Reasonable accuracy of noise visualization for these two sources was shown over a relatively wide frequency range. This very affordable and compact sound visualization system can be used for many actual noise visualization applications in addition to educational purposes.

  12. ProteoLens: a visual analytic tool for multi-scale database-driven biological network data mining.

    Science.gov (United States)

    Huan, Tianxiao; Sivachenko, Andrey Y; Harrison, Scott H; Chen, Jake Y

    2008-08-12

    New systems biology studies require researchers to understand how interplay among myriads of biomolecular entities is orchestrated in order to achieve high-level cellular and physiological functions. Many software tools have been developed in the past decade to help researchers visually navigate large networks of biomolecular interactions with built-in template-based query capabilities. To further advance researchers' ability to interrogate global physiological states of cells through multi-scale visual network explorations, new visualization software tools still need to be developed to empower the analysis. A robust visual data analysis platform driven by database management systems to perform bi-directional data processing-to-visualizations with declarative querying capabilities is needed. We developed ProteoLens as a Java-based visual analytic software tool for creating, annotating and exploring multi-scale biological networks. It supports direct database connectivity to either Oracle or PostgreSQL database tables/views, on which SQL statements using both Data Definition Language (DDL) and Data Manipulation Language (DML) may be specified. The robust query languages embedded directly within the visualization software help users to bring their network data into a visualization context for annotation and exploration. ProteoLens supports graph/network represented data in standard Graph Modeling Language (GML) formats, and this enables interoperation with a wide range of other visual layout tools. The architectural design of ProteoLens enables the de-coupling of complex network data visualization tasks into two distinct phases: 1) creating network data association rules, which are mapping rules between network node IDs or edge IDs and data attributes such as functional annotations, expression levels, scores, synonyms, descriptions, etc.; 2) applying network data association rules to build the network and perform the visual annotation of graph nodes and edges.
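
    ProteoLens reads its association rules from relational database tables; as a loose, tool-agnostic illustration of the same idea (mapping node and edge IDs to data attributes and saving the annotated network in GML), here is a small NetworkX sketch in which the gene names and values are invented:

    ```python
    # Loose illustration of "network data association rules": attach data attributes
    # to nodes and edges by ID, then export the annotated network as GML.
    # Gene names, expression values and scores are invented; this is not ProteoLens code.
    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([("TP53", "MDM2"), ("TP53", "CDKN1A"), ("MDM2", "CDKN1A")])

    # Rule 1: node ID -> expression level (hypothetical values).
    expression = {"TP53": 2.4, "MDM2": -1.1, "CDKN1A": 0.3}
    nx.set_node_attributes(G, expression, name="expression")

    # Rule 2: edge (node pair) -> interaction confidence score.
    scores = {("TP53", "MDM2"): 0.9, ("TP53", "CDKN1A"): 0.7, ("MDM2", "CDKN1A"): 0.5}
    nx.set_edge_attributes(G, scores, name="score")

    nx.write_gml(G, "annotated_network.gml")   # GML output keeps the annotations
    print(list(G.nodes(data=True)))
    ```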

  13. 2014 Earth System Grid Federation and Ultrascale Visualization Climate Data Analysis Tools Conference Report

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-01-27

    The climate and weather data science community met December 9–11, 2014, in Livermore, California, for the fourth annual Earth System Grid Federation (ESGF) and Ultrascale Visualization Climate Data Analysis Tools (UV-CDAT) Face-to-Face (F2F) Conference, hosted by the Department of Energy, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, the European Infrastructure for the European Network of Earth System Modelling, and the Australian Department of Education. Both ESGF and UV-CDAT remain global collaborations committed to developing a new generation of open-source software infrastructure that provides distributed access and analysis to simulated and observed data from the climate and weather communities. The tools and infrastructure created under these international multi-agency collaborations are critical to understanding extreme weather conditions and long-term climate change. In addition, the F2F conference fosters a stronger climate and weather data science community and facilitates a stronger federated software infrastructure. The 2014 F2F conference detailed the progress of ESGF, UV-CDAT, and other community efforts over the year and set new priorities and requirements for existing and impending national and international community projects, such as the Coupled Model Intercomparison Project Phase Six. Specifically discussed at the conference were project capabilities and enhancement needs for data distribution, analysis, visualization, hardware and network infrastructure, standards, and resources.

  14. EUV source development for high-volume chip manufacturing tools

    Science.gov (United States)

    Stamm, Uwe; Yoshioka, Masaki; Kleinschmidt, Jürgen; Ziener, Christian; Schriever, Guido; Schürmann, Max C.; Hergenhan, Guido; Borisov, Vladimir M.

    2007-03-01

    Xenon-fueled gas discharge produced plasma (DPP) sources were integrated into Micro Exposure Tools already in 2004. Operation of these tools in a research environment gave early learning for the development of EUV sources for Alpha and Beta-Tools. Further experiments with these sources were performed for basic understanding on EUV source technology and limits, especially the achievable power and reliability. The intermediate focus power of Alpha-Tool sources under development is measured to values above 10 W. Debris mitigation schemes were successfully integrated into the sources leading to reasonable collector mirror lifetimes with target of 10 billion pulses due to the effective debris flux reduction. Source collector mirrors, which withstand the radiation and temperature load of Xenon-fueled sources, have been developed in cooperation with MediaLario Technologies to support intermediate focus power well above 10 W. To fulfill the requirements for High Volume chip Manufacturing (HVM) applications, a new concept for HVM EUV sources with higher efficiency has been developed at XTREME technologies. The discharge produced plasma (DPP) source concept combines the use of rotating disk electrodes (RDE) with laser exited droplet targets. The source concept is called laser assisted droplet RDE source. The fuel of these sources has been selected to be Tin. The conversion efficiency achieved with the laser assisted droplet RDE source is 2-3x higher compared to Xenon. Very high pulse energies well above 200 mJ / 2π sr have been measured with first prototypes of the laser assisted droplet RDE source. If it is possible to maintain these high pulse energies at higher repetition rates a 10 kHz EUV source could deliver 2000 W / 2π sr. According to the first experimental data the new concept is expected to be scalable to an intermediate focus power on the 300 W level.

  15. Abstractocyte: A Visual Tool for Exploring Nanoscale Astroglial Cells

    KAUST Repository

    Mohammed, Haneen; Al-Awami, Ali K.; Beyer, Johanna; Cali, Corrado; Magistretti, Pierre J.; Pfister, Hanspeter; Hadwiger, Markus

    2017-01-01

    This paper presents Abstractocyte, a system for the visual analysis of astrocytes and their relation to neurons, in nanoscale volumes of brain tissue. Astrocytes are glial cells, i.e., non-neuronal cells that support neurons and the nervous system. The study of astrocytes has immense potential for understanding brain function. However, their complex and widely-branching structure requires high-resolution electron microscopy imaging and makes visualization and analysis challenging. Furthermore, the structure and function of astrocytes is very different from neurons, and therefore requires the development of new visualization and analysis tools. With Abstractocyte, biologists can explore the morphology of astrocytes using various visual abstraction levels, while simultaneously analyzing neighboring neurons and their connectivity. We define a novel, conceptual 2D abstraction space for jointly visualizing astrocytes and neurons. Neuroscientists can choose a specific joint visualization as a point in this space. Interactively moving this point allows them to smoothly transition between different abstraction levels in an intuitive manner. In contrast to simply switching between different visualizations, this preserves the visual context and correlations throughout the transition. Users can smoothly navigate from concrete, highly-detailed 3D views to simplified and abstracted 2D views. In addition to investigating astrocytes, neurons, and their relationships, we enable the interactive analysis of the distribution of glycogen, which is of high importance to neuroscientists. We describe the design of Abstractocyte, and present three case studies in which neuroscientists have successfully used our system to assess astrocytic coverage of synapses, glycogen distribution in relation to synapses, and astrocytic-mitochondria coverage.

  16. Abstractocyte: A Visual Tool for Exploring Nanoscale Astroglial Cells

    KAUST Repository

    Mohammed, Haneen

    2017-08-28

    This paper presents Abstractocyte, a system for the visual analysis of astrocytes and their relation to neurons, in nanoscale volumes of brain tissue. Astrocytes are glial cells, i.e., non-neuronal cells that support neurons and the nervous system. The study of astrocytes has immense potential for understanding brain function. However, their complex and widely-branching structure requires high-resolution electron microscopy imaging and makes visualization and analysis challenging. Furthermore, the structure and function of astrocytes is very different from neurons, and therefore requires the development of new visualization and analysis tools. With Abstractocyte, biologists can explore the morphology of astrocytes using various visual abstraction levels, while simultaneously analyzing neighboring neurons and their connectivity. We define a novel, conceptual 2D abstraction space for jointly visualizing astrocytes and neurons. Neuroscientists can choose a specific joint visualization as a point in this space. Interactively moving this point allows them to smoothly transition between different abstraction levels in an intuitive manner. In contrast to simply switching between different visualizations, this preserves the visual context and correlations throughout the transition. Users can smoothly navigate from concrete, highly-detailed 3D views to simplified and abstracted 2D views. In addition to investigating astrocytes, neurons, and their relationships, we enable the interactive analysis of the distribution of glycogen, which is of high importance to neuroscientists. We describe the design of Abstractocyte, and present three case studies in which neuroscientists have successfully used our system to assess astrocytic coverage of synapses, glycogen distribution in relation to synapses, and astrocytic-mitochondria coverage.

  17. Measurement Tools for the Immersive Visualization Environment: Steps Toward the Virtual Laboratory.

    Science.gov (United States)

    Hagedorn, John G; Dunkers, Joy P; Satterfield, Steven G; Peskin, Adele P; Kelso, John T; Terrill, Judith E

    2007-01-01

    This paper describes a set of tools for performing measurements of objects in a virtual-reality-based immersive visualization environment. These tools enable the use of the immersive environment as an instrument for extracting quantitative information from data representations that had hitherto been used solely for qualitative examination. We provide, within the virtual environment, ways for the user to analyze and interact with the quantitative data generated. We describe results generated by these methods to obtain dimensional descriptors of tissue-engineered medical products. We regard this toolbox as our first step in the implementation of a virtual measurement laboratory within an immersive visualization environment.

  18. Human Factors Evaluation of Advanced Electric Power Grid Visualization Tools

    Energy Technology Data Exchange (ETDEWEB)

    Greitzer, Frank L.; Dauenhauer, Peter M.; Wierks, Tamara G.; Podmore, Robin

    2009-04-01

    This report describes initial human factors evaluation of four visualization tools (Graphical Contingency Analysis, Force Directed Graphs, Phasor State Estimator and Mode Meter/ Mode Shapes) developed by PNNL, and proposed test plans that may be implemented to evaluate their utility in scenario-based experiments.

  19. A Survey of Open Source Tools for Business Intelligence

    DEFF Research Database (Denmark)

    Thomsen, Christian; Pedersen, Torben Bach

    2009-01-01

    The industrial use of open source Business Intelligence (BI) tools is becoming more common, but is still not as widespread as for other types of software. It is therefore of interest to explore which possibilities are available for open source BI and compare the tools. In this survey paper, we co...

  20. Quantitative and comparative visualization applied to cosmological simulations

    International Nuclear Information System (INIS)

    Ahrens, James; Heitmann, Katrin; Habib, Salman; Ankeny, Lee; McCormick, Patrick; Inman, Jeff; Armstrong, Ryan; Ma, Kwan-Liu

    2006-01-01

    Cosmological simulations follow the formation of nonlinear structure in dark and luminous matter. The associated simulation volumes and dynamic range are very large, making visualization both a necessary and challenging aspect of the analysis of these datasets. Our goal is to understand sources of inconsistency between different simulation codes that are started from the same initial conditions. Quantitative visualization supports the definition of, and reasoning about, analytically defined features of interest. Comparative visualization supports the ability to visually study, side by side, multiple related visualizations of these simulations. For instance, a scientist can visually distinguish that there are fewer halos (localized lumps of tracer particles) in low-density regions for one simulation code out of a collection. This qualitative result will enable the scientist to develop a hypothesis, such as loss of halos in low-density regions due to limited resolution, to explain the inconsistency between the different simulations. Quantitative support then allows one to confirm or reject the hypothesis. If the hypothesis is rejected, this step may lead to new insights and a new hypothesis, not available from the purely qualitative analysis. We will present methods to significantly improve the scientific analysis process by incorporating quantitative analysis as the driver for visualization. Aspects of this work are included as part of two visualization tools, ParaView, an open-source large data visualization tool, and Scout, an analysis-language-based, hardware-accelerated visualization tool.

  1. Comparative Study of Load Testing Tools: Apache JMeter, HP LoadRunner, Microsoft Visual Studio (TFS), Siege

    Directory of Open Access Journals (Sweden)

    Rabiya Abbas

    2017-12-01

    Full Text Available Software testing is the process of verifying and validating the user's requirements, and it is an ongoing process throughout software development. Software testing is commonly categorized into three main types. In black-box testing, the tester has no knowledge of the internal logic and design of the system. In white-box testing, the tester knows the internal logic of the code. In grey-box testing, the tester has partial knowledge of the internal structure and workings of the system; this approach is commonly used in integration testing. Load testing helps analyze the performance of a system under heavy load or under zero load, and is carried out with the help of a load testing tool. The intention of this research is to compare four load testing tools, i.e. Apache JMeter, LoadRunner, Microsoft Visual Studio (TFS) and Siege, based on certain criteria: test script generation, result reports, application support, plug-in support, and cost. The main focus is to study these load testing tools and identify which tool is better and more efficient. We expect this comparison to help in selecting the most appropriate tool and to motivate the use of open source load testing tools.

  2. The Exercise: An Exercise Generator Tool for the SOURCe Project

    Science.gov (United States)

    Kakoyianni-Doa, Fryni; Tziafa, Eleni; Naskos, Athanasios

    2016-01-01

    The Exercise, an Exercise generator in the SOURCe project, is a tool that complements the properties and functionalities of the SOURCe project, which includes the search engine for the Searchable Online French-Greek parallel corpus for the UniveRsity of Cyprus (SOURCe) (Kakoyianni-Doa & Tziafa, 2013), the PENCIL (an alignment tool)…

  3. SieveSifter: a web-based tool for visualizing the sieve analyses of HIV-1 vaccine efficacy trials.

    Science.gov (United States)

    Fiore-Gartland, Andrew; Kullman, Nicholas; deCamp, Allan C; Clenaghan, Graham; Yang, Wayne; Magaret, Craig A; Edlefsen, Paul T; Gilbert, Peter B

    2017-08-01

    Analysis of HIV-1 virions from participants infected in a randomized controlled preventive HIV-1 vaccine efficacy trial can help elucidate mechanisms of partial protection. By comparing the genetic sequence of viruses from vaccine and placebo recipients to the sequence of the vaccine itself, a technique called 'sieve analysis', one can identify functional specificities of vaccine-induced immune responses. We have created an interactive web-based visualization and data access tool for exploring the results of sieve analyses performed on four major preventive HIV-1 vaccine efficacy trials: (i) the HIV Vaccine Trial Network (HVTN) 502/Step trial, (ii) the RV144/Thai trial, (iii) the HVTN 503/Phambili trial and (iv) the HVTN 505 trial. The tool acts simultaneously as a platform for rapid reinterpretation of sieve effects and as a portal for organizing and sharing the viral sequence data. Access to these valuable datasets also enables the development of novel methodology for future sieve analyses. Visualization: http://sieve.fredhutch.org/viz . Source code: https://github.com/nkullman/SIEVE . Data API: http://sieve.fredhutch.org/data . agartlan@fredhutch.org. © The Author(s) 2017. Published by Oxford University Press.

  4. The DiaCog: A Prototype Tool for Visualizing Online Dialog Games' Interactions

    Science.gov (United States)

    Yengin, Ilker; Lazarevic, Bojan

    2014-01-01

    This paper proposes and explains the design of a prototype learning tool named the DiaCog. The DiaCog visualizes dialog interactions within an online dialog game by using dynamically created cognitive maps. As a purposefully designed tool for enhancing learning effectiveness the DiaCog might be applicable to dialogs at discussion boards within a…

  5. Visualization of protein interaction networks: problems and solutions

    Directory of Open Access Journals (Sweden)

    Agapito Giuseppe

    2013-01-01

    possibility to interact with external databases. Results Currently, many tools are available and it is not easy for users to choose among them. Some tools offer sophisticated 2D and 3D network visualization making available many layout algorithms, while other tools are more data-oriented and support integration of interaction data coming from different sources and data annotation. Finally, some specialized tools are dedicated to the analysis of pathways and cellular processes and are oriented toward systems biology studies, where the dynamic aspects of the processes being studied are central. Conclusion A current trend is the deployment of open, extensible visualization tools (e.g. Cytoscape), which may be incrementally enriched by the interactomics community with novel and more powerful functions for PIN analysis through the development of plug-ins. On the other hand, another emerging trend regards the efficient and parallel implementation of the visualization engine, which may provide high interactivity and near real-time response times, as in NAViGaTOR. From a technological point of view, open-source, free and extensible tools like Cytoscape guarantee long-term sustainability due to the size of their developer and user communities, and provide great flexibility since new functions are continuously added by the developer community through new plug-ins, while emerging parallel, often closed-source tools like NAViGaTOR can offer near real-time response times even in the analysis of very large PINs.

  6. AstroBlend: An astrophysical visualization package for Blender

    Science.gov (United States)

    Naiman, J. P.

    2016-04-01

    The rapid growth in scale and complexity of both computational and observational astrophysics over the past decade necessitates efficient and intuitive methods for examining and visualizing large datasets. Here, I present AstroBlend, an open-source Python library for use within the three dimensional modeling software, Blender. While Blender has been a popular open-source software among animators and visual effects artists, in recent years it has also become a tool for visualizing astrophysical datasets. AstroBlend combines the three dimensional capabilities of Blender with the analysis tools of the widely used astrophysical toolset, yt, to afford both computational and observational astrophysicists the ability to simultaneously analyze their data and create informative and appealing visualizations. The introduction of this package includes a description of features, work flow, and various example visualizations. A website - www.astroblend.com - has been developed which includes tutorials, and a gallery of example images and movies, along with links to downloadable data, three dimensional artistic models, and various other resources.

  7. Visualization of Multi-mission Astronomical Data with ESASky

    Science.gov (United States)

    Baines, Deborah; Giordano, Fabrizio; Racero, Elena; Salgado, Jesús; López Martí, Belén; Merín, Bruno; Sarmiento, María-Henar; Gutiérrez, Raúl; Ortiz de Landaluce, Iñaki; León, Ignacio; de Teodoro, Pilar; González, Juan; Nieto, Sara; Segovia, Juan Carlos; Pollock, Andy; Rosa, Michael; Arviset, Christophe; Lennon, Daniel; O'Mullane, William; de Marchi, Guido

    2017-02-01

    ESASky is a science-driven discovery portal to explore the multi-wavelength sky and visualize and access multiple astronomical archive holdings. The tool is a web application that requires no prior knowledge of any of the missions involved and gives users world-wide simplified access to the highest-level science data products from multiple astronomical space-based astronomy missions plus a number of ESA source catalogs. The first public release of ESASky features interfaces for the visualization of the sky in multiple wavelengths, the visualization of query results summaries, and the visualization of observations and catalog sources for single and multiple targets. This paper describes these features within ESASky, developed to address use cases from the scientific community. The decisions regarding the visualization of large amounts of data and the technologies used were made to maximize the responsiveness of the application and to keep the tool as useful and intuitive as possible.

  8. CellNetVis: a web tool for visualization of biological networks using force-directed layout constrained by cellular components.

    Science.gov (United States)

    Heberle, Henry; Carazzolle, Marcelo Falsarella; Telles, Guilherme P; Meirelles, Gabriela Vaz; Minghim, Rosane

    2017-09-13

    The advent of "omics" science has brought new perspectives in contemporary biology through the high-throughput analyses of molecular interactions, providing new clues in protein/gene function and in the organization of biological pathways. Biomolecular interaction networks, or graphs, are simple abstract representations where the components of a cell (e.g. proteins, metabolites, etc.) are represented by nodes and their interactions are represented by edges. An appropriate visualization of data is crucial for understanding such networks, since pathways are related to functions that occur in specific regions of the cell. The force-directed layout is an important and widely used technique to draw networks according to their topologies. Placing the networks into cellular compartments helps to quickly identify where network elements are located and, more specifically, concentrated. Currently, only a few tools provide the capability of visually organizing networks by cellular compartments. Most of them cannot handle large and dense networks. Even for small networks with hundreds of nodes the available tools are not able to reposition the network while the user is interacting, limiting the visual exploration capability. Here we propose CellNetVis, a web tool to easily display biological networks in a cell diagram employing a constrained force-directed layout algorithm. The tool is freely available and open-source. It was originally designed for networks generated by the Integrated Interactome System and can be used with networks from other databases, like InnateDB. CellNetVis has been demonstrated to be applicable to the dynamic investigation of complex networks over a consistent representation of a cell on the Web, with capabilities not matched elsewhere.
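
    CellNetVis implements its own constrained force-directed algorithm in the browser; a rough Python analogue of the general idea, in which anchor nodes for each compartment are pinned and the remaining nodes settle around them under a spring layout, can be sketched with NetworkX (the network, compartments and coordinates below are invented, and this is not the CellNetVis algorithm):

    ```python
    # Rough analogue of a compartment-constrained force-directed layout: pin one
    # anchor node per "compartment", tether each molecule to its compartment, and
    # let a spring layout place the free nodes. Not the actual CellNetVis method.
    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([
        ("receptor", "kinase"), ("kinase", "adaptor"),
        ("kinase", "tf"), ("tf", "target_gene"),
    ])

    # Hypothetical compartment anchors with fixed coordinates.
    anchors = {"membrane": (0.0, 1.0), "cytosol": (0.0, 0.0), "nucleus": (0.0, -1.0)}
    membership = {"receptor": "membrane", "kinase": "cytosol",
                  "adaptor": "cytosol", "tf": "nucleus", "target_gene": "nucleus"}
    for node, compartment in membership.items():
        G.add_edge(node, compartment)   # tether pulls the node toward its compartment

    init = {**anchors, **{n: (0.1, 0.1) for n in membership}}
    pos = nx.spring_layout(G, pos=init, fixed=list(anchors), seed=42)
    for node in membership:
        print(node, tuple(round(float(c), 2) for c in pos[node]))
    ```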

  9. jSPyDB, an open source database-independent tool for data management

    Science.gov (United States)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-12-01

    Nowadays, the number of commercial tools available for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only with a specific kind of database, and they are platform-dependent and very CPU and memory consuming. jSPyDB is a free web-based tool written in Python and JavaScript. It relies on jQuery and Python libraries, and is intended to provide a simple handler for different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install and configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a backend server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users are allowed to create their customized views for better data visualization. In this way, we optimize the performance of database servers by avoiding short connections and concurrent sessions. In addition, security is enforced since we do not give users the possibility to directly execute any SQL statement.
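
    As a rough illustration of the server layer described above, the snippet below uses SQLAlchemy to reflect an existing database and hand rows to a browser client as JSON. It is a minimal sketch, not jSPyDB's code; the connection URL and table handling are placeholders.

```python
# Hypothetical sketch of the server-layer idea: reflect an existing database
# with SQLAlchemy and serialize rows as JSON for the browser client.
import json
from sqlalchemy import create_engine, MetaData, select

engine = create_engine("sqlite:///example.db")   # any SQLAlchemy URL would do
meta = MetaData()
meta.reflect(bind=engine)                        # discover existing tables

def table_as_json(name, limit=50):
    table = meta.tables[name]
    with engine.connect() as conn:
        rows = conn.execute(select(table).limit(limit)).mappings().all()
    # default=str keeps dates/decimals serializable for the browser client
    return json.dumps([dict(r) for r in rows], default=str)

print(list(meta.tables))   # what the front end would list for the user
```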

  10. jSPyDB, an open source database-independent tool for data management

    International Nuclear Information System (INIS)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-01-01

    Nowadays, the number of commercial tools available for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only with a specific kind of database, and they are platform-dependent and very CPU and memory consuming. jSPyDB is a free web-based tool written in Python and JavaScript. It relies on jQuery and Python libraries, and is intended to provide a simple handler for different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install and configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a backend server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users are allowed to create their customized views for better data visualization. In this way, we optimize the performance of database servers by avoiding short connections and concurrent sessions. In addition, security is enforced since we do not give users the possibility to directly execute any SQL statement.

  11. Tools for Authentication

    Energy Technology Data Exchange (ETDEWEB)

    White, G

    2008-07-09

    Many recent Non-proliferation and Arms Control software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary, and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool must be based on a complete language compiler infrastructure, that is, one that can parse and digest the full language through its standard grammar. ROSE is precisely such a compiler infrastructure developed within DOE. ROSE is a robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. This year, it has been extended to support the automated analysis of binaries. We continue to extend ROSE to address a number of security-specific requirements and apply it to software authentication for Non-proliferation and Arms Control projects. We will give an update on the status of our work.
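
    ROSE itself targets C, C++ and FORTRAN, but the rule-based flagging of suspicious constructs it enables can be illustrated, by analogy, with Python's own ast module. The rule set below is an invented example, not one of the project's actual authentication rules.

```python
# Analogy only: rule-based detection of suspicious call sites over a parsed
# syntax tree, using Python's ast module instead of the ROSE infrastructure.
import ast

SUSPICIOUS_CALLS = {"eval", "exec", "system", "popen"}   # example rule set

def flag_suspicious(source, filename="<memory>"):
    findings = []
    for node in ast.walk(ast.parse(source, filename)):
        if isinstance(node, ast.Call):
            name = getattr(node.func, "id", getattr(node.func, "attr", ""))
            if name in SUSPICIOUS_CALLS:
                findings.append((filename, node.lineno, name))
    return findings

print(flag_suspicious("import os\nos.system('rm -rf /tmp/x')\n"))
# [('<memory>', 2, 'system')]
```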

  12. Tools for Authentication

    International Nuclear Information System (INIS)

    White, G.

    2008-01-01

    Many recent Non-proliferation and Arms Control software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary, and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool must be based on a complete language compiler infrastructure, that is, one that can parse and digest the full language through its standard grammar. ROSE is precisely such a compiler infrastructure developed within DOE. ROSE is a robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. This year, it has been extended to support the automated analysis of binaries. We continue to extend ROSE to address a number of security-specific requirements and apply it to software authentication for Non-proliferation and Arms Control projects. We will give an update on the status of our work.

  13. Neural sources of visual working memory maintenance in human parietal and ventral extrastriate visual cortex.

    Science.gov (United States)

    Becke, Andreas; Müller, Notger; Vellage, Anne; Schoenfeld, Mircea Ariel; Hopf, Jens-Max

    2015-04-15

    Maintaining information in visual working memory is reliably indexed by the contralateral delay activity (CDA) - a sustained modulation of the event-related potential (ERP) with a topographical maximum over posterior scalp regions contralateral to the memorized input. Based on scalp topography, it is hypothesized that the CDA reflects neural activity in the parietal cortex, but the precise cortical origin of underlying electric activity was never determined. Here we combine ERP recordings with magnetoencephalography based source localization to characterize the cortical current sources generating the CDA. Observers performed a cued delayed match to sample task where either the color or the relative position of colored dots had to be maintained in memory. A detailed source-localization analysis of the magnetic activity in the retention interval revealed that the magnetic analog of the CDA (mCDA) is generated by current sources in the parietal cortex. Importantly, we find that the mCDA also receives contribution from current sources in the ventral extrastriate cortex that display a time-course similar to the parietal sources. On the basis of the magnetic responses, forward modeling of ERP data reveals that the ventral sources have non-optimal projections and that these sources are therefore concealed in the ERP by overlapping fields with parietal projections. The present observations indicate that visual working memory maintenance, as indexed by the CDA, involves the parietal cortical regions as well as the ventral extrastriate regions, which code the sensory representation of the memorized content. Copyright © 2015 Elsevier Inc. All rights reserved.

  14. Early visual analysis tool using magnetoencephalography for treatment and recovery of neuronal dysfunction.

    Science.gov (United States)

    Rasheed, Waqas; Neoh, Yee Yik; Bin Hamid, Nor Hisham; Reza, Faruque; Idris, Zamzuri; Tang, Tong Boon

    2017-10-01

    Functional neuroimaging modalities play an important role in deciding the diagnosis and course of treatment of neuronal dysfunction and degeneration. This article presents an analytical tool with visualization by exploiting the strengths of the MEG (magnetoencephalographic) neuroimaging technique. The tool automates MEG data import (in tSSS format), channel information extraction, time/frequency decomposition, and circular graph visualization (connectogram) for simple result inspection. For advanced users, the tool also provides magnitude squared coherence (MSC) values allowing personalized threshold levels, and the computation of default model from MEG data of control population. Default model obtained from healthy population data serves as a useful benchmark to diagnose and monitor neuronal recovery during treatment. The proposed tool further provides optional labels with international 10-10 system nomenclature in order to facilitate comparison studies with EEG (electroencephalography) sensor space. Potential applications in epilepsy and traumatic brain injury studies are also discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
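
    The magnitude squared coherence (MSC) values mentioned above follow the standard definition C_xy(f) = |P_xy(f)|^2 / (P_xx(f) P_yy(f)). A sketch of that computation on synthetic channels with SciPy is shown below; the sampling rate and window length are illustrative, not the tool's defaults.

```python
# Magnitude squared coherence between two sensor channels, the quantity
# thresholded to build connectograms. Signals here are synthetic.
import numpy as np
from scipy.signal import coherence

fs = 1000.0                                  # Hz, assumed sampling rate
t = np.arange(0, 10, 1 / fs)
shared = np.sin(2 * np.pi * 12 * t)          # common 12 Hz component
ch1 = shared + 0.5 * np.random.randn(t.size)
ch2 = shared + 0.5 * np.random.randn(t.size)

f, msc = coherence(ch1, ch2, fs=fs, nperseg=1024)
print(f"MSC near 12 Hz: {msc[np.argmin(np.abs(f - 12))]:.2f}")
```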

  15. MONGKIE: an integrated tool for network analysis and visualization for multi-omics data.

    Science.gov (United States)

    Jang, Yeongjun; Yu, Namhee; Seo, Jihae; Kim, Sun; Lee, Sanghyuk

    2016-03-18

    Network-based integrative analysis is a powerful technique for extracting biological insights from multilayered omics data such as somatic mutations, copy number variations, and gene expression data. However, integrated analysis of multi-omics data is quite complicated and can hardly be done in an automated way. Thus, a powerful interactive visual mining tool supporting diverse analysis algorithms for identification of driver genes and regulatory modules is much needed. Here, we present a software platform that integrates network visualization with omics data analysis tools seamlessly. The visualization unit supports various options for displaying multi-omics data as well as unique network models for describing sophisticated biological networks such as complex biomolecular reactions. In addition, we implemented diverse in-house algorithms for network analysis including network clustering and over-representation analysis. Novel functions include facile definition and optimized visualization of subgroups, comparison of a series of data sets in an identical network by data-to-visual mapping and subsequent overlaying function, and management of custom interaction networks. Utility of MONGKIE for network-based visual data mining of multi-omics data was demonstrated by analysis of the TCGA glioblastoma data. MONGKIE was developed in Java based on the NetBeans plugin architecture, thus being OS-independent with intrinsic support of module extension by third-party developers. We believe that MONGKIE would be a valuable addition to network analysis software by supporting many unique features and visualization options, especially for analysing multi-omics data sets in cancer and other diseases.
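
    One of the analyses named above, over-representation analysis, reduces in its simplest form to a hypergeometric test. The sketch below illustrates that test with made-up counts; it is not MONGKIE's implementation, which is written in Java.

```python
# Over-representation analysis in its simplest form: test whether a gene set
# (e.g. a network cluster) overlaps a pathway more than expected by chance.
from scipy.stats import hypergeom

M = 20000       # background genes
n = 150         # genes annotated to the pathway
N = 300         # genes in the cluster being tested
k = 12          # overlap between cluster and pathway

p_value = hypergeom.sf(k - 1, M, n, N)   # P(X >= k)
print(f"enrichment p-value: {p_value:.3e}")
```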

  16. Development and Validation of a Standardized Tool for Prioritization of Information Sources.

    Science.gov (United States)

    Akwar, Holy; Kloeze, Harold; Mukhi, Shamir

    2016-01-01

    To validate the utility and effectiveness of a standardized tool for prioritization of information sources for early detection of diseases. The tool was developed with input from diverse public health experts garnered through a survey. Ten raters used the tool to evaluate ten information sources, and reliability among raters was computed. The Proc Mixed procedure with a random effect statement and SAS macros were used to compute multiple raters' Fleiss kappa agreement and Kendall's coefficient of concordance. The ten disparate information sources evaluated obtained the following composite scores: ProMed 91%; WAHID 90%; Eurosurv 87%; MediSys 85%; SciDaily 84%; EurekAl 83%; CSHB 78%; GermTrax 75%; Google 74%; and CBC 70%. A Fleiss kappa agreement of 50.7% was obtained for the ten information sources and 72.5% for a sub-set of five sources rated, which is substantial agreement, validating the utility and effectiveness of the tool. This study validated the utility and effectiveness of a standardized criteria tool developed to prioritize information sources. The new tool was used to identify five information sources suited for use by the KIWI system in the CEZD-IIR project to improve surveillance of infectious diseases. The tool can be generalized to situations where prioritization of numerous information sources is necessary.
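
    For reference, Fleiss' kappa, the agreement statistic reported above, can be computed from an items-by-categories count matrix as sketched below. The rating matrix is invented for illustration and is unrelated to the study's actual data.

```python
# Minimal Fleiss' kappa: rows are items (information sources), columns are
# rating categories, each cell counts raters assigning that category.
import numpy as np

def fleiss_kappa(counts):
    counts = np.asarray(counts, dtype=float)   # shape: (items, categories)
    n = counts.sum(axis=1)[0]                  # raters per item (assumed equal)
    p_j = counts.sum(axis=0) / counts.sum()    # category proportions
    P_i = ((counts ** 2).sum(axis=1) - n) / (n * (n - 1))
    P_bar, Pe_bar = P_i.mean(), (p_j ** 2).sum()
    return (P_bar - Pe_bar) / (1 - Pe_bar)

ratings = [[7, 2, 1], [6, 3, 1], [2, 7, 1], [1, 1, 8]]   # 10 raters, 3 categories
print(round(fleiss_kappa(ratings), 3))
```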

  17. Seeing the Point: Using Visual Sources to Understand the Arguments for Women's Suffrage

    Science.gov (United States)

    Card, Jane

    2011-01-01

    Visual sources, Jane Card argues, are a powerful resource for historical learning but using them in the classroom requires careful thought and planning. Card here shares how she has used visual source material in order to teach her students about the women's suffrage movement. In particular, Card shows how a chain of questions that moves from the…

  18. Visible Earthquakes: a web-based tool for visualizing and modeling InSAR earthquake data

    Science.gov (United States)

    Funning, G. J.; Cockett, R.

    2012-12-01

    InSAR (Interferometric Synthetic Aperture Radar) is a technique for measuring the deformation of the ground using satellite radar data. One of the principal applications of this method is in the study of earthquakes; in the past 20 years over 70 earthquakes have been studied in this way, and forthcoming satellite missions promise to enable the routine and timely study of events in the future. Despite the utility of the technique and its widespread adoption by the research community, InSAR does not feature in the teaching curricula of most university geoscience departments. This is, we believe, due to a lack of accessibility to software and data. Existing tools for the visualization and modeling of interferograms are often research-oriented, command line-based and/or prohibitively expensive. Here we present a new web-based interactive tool for comparing real InSAR data with simple elastic models. The overall design of this tool was focused on ease of access and use. This tool should allow interested nonspecialists to gain a feel for the use of such data and greatly facilitate integration of InSAR into upper division geoscience courses, giving students practice in comparing actual data to modeled results. The tool, provisionally named 'Visible Earthquakes', uses web-based technologies to instantly render the displacement field that would be observable using InSAR for a given fault location, geometry, orientation, and slip. The user can adjust these 'source parameters' using a simple, clickable interface, and see how these affect the resulting model interferogram. By visually matching the model interferogram to a real earthquake interferogram (processed separately and included in the web tool) a user can produce their own estimates of the earthquake's source parameters. Once satisfied with the fit of their models, users can submit their results and see how they compare with the distribution of all other contributed earthquake models, as well as the mean and median
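
    The comparison at the heart of the tool, projecting a modeled displacement field into the radar line of sight and wrapping it into interferogram fringes, can be sketched as below. The displacement model here is a toy Gaussian rather than an elastic dislocation, and the radar geometry values are illustrative assumptions.

```python
# Toy forward model: project vertical displacement into the line of sight and
# wrap into fringes (one fringe per half wavelength of two-way path change).
import numpy as np

wavelength = 0.056                                  # m, roughly C-band radar
incidence = np.deg2rad(23)                          # assumed incidence angle
x, y = np.meshgrid(np.linspace(-10, 10, 200), np.linspace(-10, 10, 200))  # km
uplift = 0.3 * np.exp(-(x**2 + y**2) / 8)           # m, toy vertical motion

los = uplift * np.cos(incidence)                    # line-of-sight component
wrapped = np.mod(los, wavelength / 2)               # interferogram fringes
print(wrapped.min(), wrapped.max())
```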

  19. ThManager: An Open Source Tool for Creating and Visualizing SKOS

    Directory of Open Access Journals (Sweden)

    Javier Lacasta

    2007-09-01

    Full Text Available Knowledge organization systems denote formally represented knowledge that is used within the context of digital libraries to improve data sharing and information retrieval. To increase their use, and to reuse them when possible, it is vital to manage them adequately and to provide them in a standard interchange format. Simple Knowledge Organization System (SKOS) seems to be the most promising representation for the type of knowledge models used in digital libraries, but there is a lack of tools that are able to properly manage it. This work presents a tool that fills this gap, facilitating the use of such models in different environments and using SKOS as an interchange format.
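
    SKOS itself is an RDF vocabulary, so a thesaurus fragment can be produced with any RDF library; the sketch below uses Python's rdflib purely for illustration and is unrelated to ThManager's Java implementation. The namespace and concept names are invented.

```python
# A tiny SKOS fragment (concept scheme, one concept, one label) built with rdflib.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, SKOS

EX = Namespace("http://example.org/thesaurus/")
g = Graph()
g.add((EX.scheme, RDF.type, SKOS.ConceptScheme))
g.add((EX.hydrology, RDF.type, SKOS.Concept))
g.add((EX.hydrology, SKOS.prefLabel, Literal("hydrology", lang="en")))
g.add((EX.hydrology, SKOS.inScheme, EX.scheme))
print(g.serialize(format="turtle"))   # the standard interchange serialization
```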

  20. EvolView, an online tool for visualizing, annotating and managing phylogenetic trees.

    Science.gov (United States)

    Zhang, Huangkai; Gao, Shenghan; Lercher, Martin J; Hu, Songnian; Chen, Wei-Hua

    2012-07-01

    EvolView is a web application for visualizing, annotating and managing phylogenetic trees. First, EvolView is a phylogenetic tree viewer and customization tool; it visualizes trees in various formats, customizes them through built-in functions that can link information from external datasets, and exports the customized results to publication-ready figures. Second, EvolView is a tree and dataset management tool: users can easily organize related trees into distinct projects, add new datasets to trees and edit and manage existing trees and datasets. To make EvolView easy to use, it is equipped with an intuitive user interface. With a free account, users can save data and manipulations on the EvolView server. EvolView is freely available at: http://www.evolgenius.info/evolview.html.

  1. A Survey of Open Source Tools for Business Intelligence

    DEFF Research Database (Denmark)

    Thomsen, Christian; Pedersen, Torben Bach

    The industrial use of open source Business Intelligence (BI) tools is becoming more common, but is still not as widespread as for other types of software. It is therefore of interest to explore which possibilities are available for open source BI and compare the tools. In this survey paper, we consider the capabilities of a number of open source tools for BI. In the paper, we consider a number of Extract-Transform-Load (ETL) tools, database management systems (DBMSs), On-Line Analytical Processing (OLAP) servers, and OLAP clients. We find that, unlike the situation a few years ago, there now...

  2. Computer-Assisted Visual Search/Decision Aids as a Training Tool for Mammography

    National Research Council Canada - National Science Library

    Nodine, Calvin

    1999-01-01

    The primary goal of the project is to develop a computer-assisted visual search (CAVS) mammography training tool that will improve the perceptual and cognitive skills of trainees leading to mammographic expertise...

  3. Computer-Assisted Visual Search/Decision Aids as a Training Tool for Mammography

    National Research Council Canada - National Science Library

    Nodine, Calvin

    1998-01-01

    The primary goal of the project is to develop a computer-assisted visual search (CAVS) mammography training tool that will improve the perceptual and cognitive skills of trainees leading to mammographic expertise...

  4. Computer-Assisted Visual Search/Decision Aids as a Training Tool for Mammography

    National Research Council Canada - National Science Library

    Nodine, Calvin

    2000-01-01

    The primary goal of the project is to develop a computer-assisted visual search (CAVS) mammography training tool that will improve the perceptual and cognitive skills of trainees leading to mammographic expertise...

  5. Applying Dataflow Architecture and Visualization Tools to In Vitro Pharmacology Data Automation.

    Science.gov (United States)

    Pechter, David; Xu, Serena; Kurtz, Marc; Williams, Steven; Sonatore, Lisa; Villafania, Artjohn; Agrawal, Sony

    2016-12-01

    The pace and complexity of modern drug discovery places ever-increasing demands on scientists for data analysis and interpretation. Data flow programming and modern visualization tools address these demands directly. Three different requirements (one for allosteric modulator analysis, one for a specialized clotting analysis, and one for enzyme global progress curve analysis) are reviewed, and their execution in a combined data flow/visualization environment is outlined. © 2016 Society for Laboratory Automation and Screening.

  6. A Survey of Visualization Tools Assessed for Anomaly-Based Intrusion Detection Analysis

    Science.gov (United States)

    2014-04-01

    The visualization tools assessed include the Complex System SCILAB Toolbox, GraphViz, Igraph, NetDraw, Network Workbench, OpenDX, Prefuse, Sci² Tool, and the Visualization Toolkit (VTK). [Flattened excerpt from the report's tool-capabilities table (columns: Name; Web Sites, all accessed 01/29/2014; Strengths; Weaknesses), of which only the first row is partly recoverable: Complex Systems SCILAB Tool; http://www.randomfactory.com/openastro...osx/scilab-info.html; measures graph parameters; Academic Free License (AFL), works on UNIX and Windows, programming language is MATLAB.]

  7. iVCJ: A tool for Interactive Visualization of high explosives CJ states

    Energy Technology Data Exchange (ETDEWEB)

    Wooten, Hasani Omar [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Aslam, Tariq Dennis [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Whitley, Von Howard [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-12-12

    A graphical user interface (GUI) tool has been developed that facilitates the visualization and analysis of the Chapman-Jouguet state for high explosives gaseous products using the Jones-Wilkins-Lee equation of state.
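
    The Jones-Wilkins-Lee (JWL) products equation of state referenced above has the standard form P(V, E) = A(1 - w/(R1 V))exp(-R1 V) + B(1 - w/(R2 V))exp(-R2 V) + w E/V, with V the relative volume. A minimal evaluation is sketched below; the parameter values are illustrative and are not taken from iVCJ.

```python
# Evaluate the JWL products equation of state at a few relative volumes.
# Parameter values are illustrative only (units consistent with A, B, E/V).
import math

def jwl_pressure(V, E, A=8.545, B=0.205, R1=4.6, R2=1.35, w=0.25):
    return (A * (1 - w / (R1 * V)) * math.exp(-R1 * V)
            + B * (1 - w / (R2 * V)) * math.exp(-R2 * V)
            + w * E / V)

for V in (0.6, 0.8, 1.0, 2.0):
    print(V, round(jwl_pressure(V, E=0.085), 4))
```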

  8. Open Source and Proprietary Project Management Tools for SMEs.

    Directory of Open Access Journals (Sweden)

    Veronika Abramova

    2017-05-01

    Full Text Available The dimensional growth and increasing difficulty in project management promoted the development of different tools that serve to facilitate project management and track project schedule, resources and overall progress. These tools offer a variety of features, from task and time management up to integrated CRM (Customer Relationship Management) and ERP (Enterprise Resource Planning) modules. Currently, a large number of project management software packages are available to assist the project team during the entire project lifecycle. We present the main differences between open source and proprietary project management tools and how those could be important for SMEs, describing the key features and how those can assist the project manager and the development team. In this paper, we analyse four open-source project management tools: OpenProject, ProjectLibre, Redmine and LibrePlan, and four proprietary tools: Bitrix24, JIRA, Microsoft Project and Asana.

  9. Using McIDAS-V data analysis and visualization software as an educational tool for understanding the atmosphere

    Science.gov (United States)

    Achtor, T. H.; Rink, T.

    2010-12-01

    The University of Wisconsin’s Space Science and Engineering Center (SSEC) has been at the forefront in developing data analysis and visualization tools for environmental satellites and other geophysical data. The fifth generation of the Man-computer Interactive Data Access System (McIDAS-V) is Java-based, open-source, freely available software that operates on Linux, Macintosh and Windows systems. The software tools provide powerful new data manipulation and visualization capabilities that work with geophysical data in research, operational and educational environments. McIDAS-V provides unique capabilities to support innovative techniques for evaluating research results, teaching and training. McIDAS-V is based on three powerful software elements. VisAD is a Java library for building interactive, collaborative, 4 dimensional visualization and analysis tools. The Integrated Data Viewer (IDV) is a reference application based on the VisAD system and developed by the Unidata program that demonstrates the flexibility that is needed in this evolving environment, using a modern, object-oriented software design approach. The third tool, HYDRA, allows users to build, display and interrogate multi and hyperspectral environmental satellite data in powerful ways. The McIDAS-V software is being used for training and education in several settings. The McIDAS User Group provides training workshops at its annual meeting. Numerous online tutorials with training data sets have been developed to aid users in learning simple and more complex operations in McIDAS-V, all are available online. In a University of Wisconsin-Madison undergraduate course in Radar and Satellite Meteorology, McIDAS-V is used to create and deliver laboratory exercises using case study and real time data. At the high school level, McIDAS-V is used in several exercises in our annual Summer Workshop in Earth and Atmospheric Sciences to provide young scientists the opportunity to examine data with friendly and

  10. IVisTMSA: Interactive Visual Tools for Multiple Sequence Alignments.

    Science.gov (United States)

    Pervez, Muhammad Tariq; Babar, Masroor Ellahi; Nadeem, Asif; Aslam, Naeem; Naveed, Nasir; Ahmad, Sarfraz; Muhammad, Shah; Qadri, Salman; Shahid, Muhammad; Hussain, Tanveer; Javed, Maryam

    2015-01-01

    IVisTMSA is a software package of seven graphical tools for multiple sequence alignments. MSApad is an editing and analysis tool. It can load 409% more data than Jalview, STRAP, CINEMA, and Base-by-Base. MSA comparator allows the user to visualize consistent and inconsistent regions of reference and test alignments of more than 21 MB in size in less than 12 seconds. MSA comparator is 5,200% and more than 40% more efficient than the BALiBASE C program and FastSP, respectively. MSA reconstruction tool provides graphical user interfaces for four popular aligners and allows the user to load several sequence files at a time. FASTA generator converts seven formats of alignments of unlimited size into FASTA format in a few seconds. MSA ID calculator calculates the identity matrix of more than 11,000 sequences with a sequence length of 2,696 base pairs in less than 100 seconds. The Tree and Distance Matrix calculation tools generate a phylogenetic tree and distance matrix, respectively, using neighbor joining, % identity and the BLOSUM 62 matrix.
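
    The identity-matrix calculation mentioned above can be illustrated with a toy percent-identity computation over already-aligned sequences. IVisTMSA itself is a Java GUI package; this sketch only shows the underlying arithmetic, on invented sequences.

```python
# Percent identity between every pair of aligned sequences, ignoring gap columns.
def identity_matrix(aligned):
    names, seqs = zip(*aligned)
    n = len(seqs)
    matrix = [[100.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            pairs = [(a, b) for a, b in zip(seqs[i], seqs[j])
                     if a != '-' and b != '-']
            ident = sum(a == b for a, b in pairs) / max(len(pairs), 1) * 100
            matrix[i][j] = matrix[j][i] = round(ident, 1)
    return names, matrix

names, m = identity_matrix([("s1", "ACGT-ACGT"), ("s2", "ACGTTACGA"), ("s3", "ACGA-ACGA")])
print(names, m)
```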

  11. Visual Tools for Crowdsourcing Data Validation Within the GLOBELAND30 Geoportal

    Science.gov (United States)

    Chuprikova, E.; Wu, H.; Murphy, C. E.; Meng, L.

    2016-06-01

    This research aims to investigate the role of visualization of the user generated data that can empower the geoportal of GlobeLand30 produced by NGCC (National Geomatics Center of China). The focus is set on the development of a concept of tools that can extend the Geo-tagging functionality and make use of it for different target groups. The anticipated tools should improve the continuous data validation, updating and efficient use of the remotely-sensed data distributed within GlobeLand30.

  12. Open Source for Knowledge and Learning Management: Strategies beyond Tools

    Science.gov (United States)

    Lytras, Miltiadis, Ed.; Naeve, Ambjorn, Ed.

    2007-01-01

    In the last years, knowledge and learning management have made a significant impact on the IT research community. "Open Source for Knowledge and Learning Management: Strategies Beyond Tools" presents learning and knowledge management from a point of view where the basic tools and applications are provided by open source technologies.…

  13. Intuitive Visualization of Transient Flow: Towards a Full 3D Tool

    Science.gov (United States)

    Michel, Isabel; Schröder, Simon; Seidel, Torsten; König, Christoph

    2015-04-01

    Visualization of geoscientific data is a challenging task, especially when targeting a non-professional audience. In particular, the graphical presentation of transient vector data can be a significant problem. With STRING, Fraunhofer ITWM (Kaiserslautern, Germany), in collaboration with delta h Ingenieurgesellschaft mbH (Witten, Germany), developed commercial software for intuitive 2D visualization of 3D flow problems. Through the intuitive character of the visualization, experts can more easily transport their findings to non-professional audiences. In STRING, pathlets moving with the flow provide an intuition of the velocity and direction of both steady-state and transient flow fields. The visualization concept is based on the Lagrangian view of the flow, which means that the pathlets' movement is along the direction given by pathlines. In order to capture every detail of the flow, an advanced method for intelligent, time-dependent seeding of the pathlets is implemented based on ideas of the Finite Pointset Method (FPM), originally conceived at and continuously developed by Fraunhofer ITWM. Furthermore, by the same method pathlets are removed during the visualization to avoid visual cluttering. Additional scalar flow attributes, for example concentration or potential, can either be mapped directly to the pathlets or displayed in the background of the pathlets on the 2D visualization plane. The extensive capabilities of STRING are demonstrated with the help of different applications in groundwater modeling. We will discuss the strengths and current restrictions of STRING which have surfaced during daily use of the software, for example by delta h. Although the software focuses on the graphical presentation of flow data for non-professional audiences, its intuitive visualization has also proven useful to experts when investigating details of flow fields. Due to the popular reception of STRING and its limitation to 2D, the need arises for the extension to a full 3D tool.
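
    The Lagrangian pathlet idea can be sketched by seeding points and advecting them with the local velocity so that their motion traces pathlines. The snippet below uses a synthetic circulating flow and a plain explicit Euler step; STRING's FPM-based seeding and removal logic is far more involved.

```python
# Advect seeded pathlets through a synthetic 2D velocity field (explicit Euler).
import numpy as np

def velocity(p):                       # illustrative circulating flow field
    x, y = p[..., 0], p[..., 1]
    return np.stack([-y, x], axis=-1)

def advect(pathlets, dt=0.05, steps=100):
    traces = [pathlets.copy()]
    for _ in range(steps):
        pathlets = pathlets + dt * velocity(pathlets)   # Euler step along pathlines
        traces.append(pathlets.copy())
    return np.array(traces)            # (steps + 1, n_pathlets, 2)

seeds = np.random.uniform(-1, 1, size=(50, 2))
print(advect(seeds).shape)
```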

  14. Visual attention: Linking prefrontal sources to neuronal and behavioral correlates.

    Science.gov (United States)

    Clark, Kelsey; Squire, Ryan Fox; Merrikhi, Yaser; Noudoost, Behrad

    2015-09-01

    Attention is a means of flexibly selecting and enhancing a subset of sensory input based on the current behavioral goals. Numerous signatures of attention have been identified throughout the brain, and now experimenters are seeking to determine which of these signatures are causally related to the behavioral benefits of attention, and the source of these modulations within the brain. Here, we review the neural signatures of attention throughout the brain, their theoretical benefits for visual processing, and their experimental correlations with behavioral performance. We discuss the importance of measuring cue benefits as a way to distinguish between impairments on an attention task, which may instead be visual or motor impairments, and true attentional deficits. We examine evidence for various areas proposed as sources of attentional modulation within the brain, with a focus on the prefrontal cortex. Lastly, we look at studies that aim to link sources of attention to its neuronal signatures elsewhere in the brain. Copyright © 2015. Published by Elsevier Ltd.

  15. Electronic invoice management with .NET (Visual Studio Tools for Office)

    OpenAIRE

    Gimeno Capín, Pablo

    2008-01-01

    Creation of electronic invoice management software developed on this technological platform, with explicit indication of the use of the VSTO (Visual Studio Tools for Office) tools in their latest version.

  16. Tool-Based Curricula and Visual Learning

    Directory of Open Access Journals (Sweden)

    Dragica Vasileska

    2013-12-01

    Full Text Available In the last twenty years nanotechnology has revolutionized the world of information theory, computers and other important disciplines, such as medicine, where it has contributed significantly to the creation of more sophisticated diagnostic tools. Therefore, it is important for people working in nanotechnology to better understand basic concepts to be more creative and productive. To further foster the progress of nanotechnology in the USA, the National Science Foundation has created the Network for Computational Nanotechnology (NCN), and the dissemination of all the information from member and non-member participants of the NCN is enabled by the community website www.nanoHUB.org. nanoHUB's signature service, online simulation, enables the operation of sophisticated research and educational simulation engines with a common browser. No software installation or local computing power is needed. The simulation tools as well as nano-concepts are augmented by educational materials, assignments, and tool-based curricula, which are assemblies of tools that help students excel in a particular area. As elaborated later in the text, it is the visual mode of learning that we are exploiting in achieving faster and better results with students that go through simulation tool-based curricula. There are several tool-based curricula already developed on the nanoHUB and undergoing further development, out of which five are directly related to nanoelectronics. They are: ABACUS - device simulation module; ACUTE - Computational Electronics module; ANTSY - bending toolkit; and AQME - quantum mechanics module. The methodology behind tool-based curricula is discussed in detail. Then, the current status of each module is presented, including user statistics and student learning indicatives. A particular simulation tool is explored further to demonstrate the ease with which students can grasp information. Representative of ABACUS is PN-Junction Lab; representative of AQME is PCPBT tool; and

  17. Methods and apparatus for safely handling radioactive sources in measuring-while-drilling tools

    International Nuclear Information System (INIS)

    Wraight, P.D.

    1989-01-01

    This patent describes a method for removing a chemical radioactive source from a MWD tool which is coupled in a drill string supported by a drilling rig while a borehole is drilled and includes logging means for measuring formation characteristics in response to irradiation of the adjacent formations by the radioactive source during the drilling operation. The steps of the method are: halting the drilling operation and then removing the drill string from the borehole for moving the MWD tool to a work station at the surface where the source is at a safe working distance from the drilling rig and will be accessible by way of one end of the MWD tool; positioning a radiation shield at a location adjacent to the one end of the MWD tool where the shield is ready for receiving the source as it is moved away from the other end of the MWD tool, and then moving the source away from the other end of the MWD tool for enclosing the source within the shield; and, once the source is enclosed within the shield, removing the shield together with the enclosed source from the MWD tool for transferring the enclosed source to another work station.

  18. VisualUrText: A Text Analytics Tool for Unstructured Textual Data

    Science.gov (United States)

    Zainol, Zuraini; Jaymes, Mohd T. H.; Nohuddin, Puteri N. E.

    2018-05-01

    The growing amount of unstructured text on the Internet is tremendous. Text repositories come from Web 2.0, business intelligence and social networking applications. It is also believed that 80-90% of future data growth is available in the form of unstructured text databases that may potentially contain interesting patterns and trends. Text Mining is a well-known technique for discovering interesting patterns and trends, which are non-trivial knowledge, from massive unstructured text data. Text Mining covers multidisciplinary fields involving information retrieval (IR), text analysis, natural language processing (NLP), data mining, machine learning, statistics and computational linguistics. This paper discusses the development of a text analytics tool that is proficient in extracting, processing and analyzing unstructured text data and visualizing the cleaned text data in multiple forms such as a Document Term Matrix (DTM), Frequency Graph, Network Analysis Graph, Word Cloud and Dendrogram. This tool, VisualUrText, is developed to assist students and researchers in extracting interesting patterns and trends in document analyses.
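
    The Document Term Matrix (DTM) named above is the standard bag-of-words representation. A minimal construction with scikit-learn is sketched below; VisualUrText's own implementation is not described in the abstract, so this is only an illustration on invented documents.

```python
# Build a small document-term matrix from raw text.
from sklearn.feature_extraction.text import CountVectorizer

docs = ["text mining finds patterns in text",
        "unstructured text grows on the web",
        "word clouds visualize term frequency"]

vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(docs)             # sparse documents x terms
print(vectorizer.get_feature_names_out())
print(dtm.toarray())                             # the DTM as a dense array
```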

  19. Visual operations management tools applied to the oil pipelines and terminals standardization process: the experience of TRANSPETRO

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Maria Fatima Ludovico de [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio/ITUC), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico; Santiago, Adilson; Ribeiro, Kassandra Senra; Arruda, Daniela Mendonca [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper describes the process by which visual operations management (VOM) tools were implemented, concerning standards and operational procedures in TRANSPETRO's Oil Pipelines and Terminals Unit. It provides: a brief literature review of visual operations management tools applied to total quality management and the standardization processes; a discussion of the assumptions from the second level of VOM (visual standards) upon which TRANSPETRO's oil pipelines and terminals business processes and operational procedures are based; and a description of the VOM implementation process involving more than 100 employees and one illustrative example of 'Quick Guides' for right-of-way management activities. Finally, it discusses the potential impacts and benefits of using VOM tools in the current practices in TRANSPETRO's Oil Pipelines and Terminals Unit, reinforcing the importance of such visual guides as vital to implement regional and corporate procedures, focusing on the main operational processes. (author)

  20. GenomeCAT: a versatile tool for the analysis and integrative visualization of DNA copy number variants.

    Science.gov (United States)

    Tebel, Katrin; Boldt, Vivien; Steininger, Anne; Port, Matthias; Ebert, Grit; Ullmann, Reinhard

    2017-01-06

    The analysis of DNA copy number variants (CNV) has increasing impact in the field of genetic diagnostics and research. However, the interpretation of CNV data derived from high resolution array CGH or NGS platforms is complicated by the considerable variability of the human genome. Therefore, tools for multidimensional data analysis and comparison of patient cohorts are needed to assist in the discrimination of clinically relevant CNVs from others. We developed GenomeCAT, a standalone Java application for the analysis and integrative visualization of CNVs. GenomeCAT is composed of three modules dedicated to the inspection of single cases, comparative analysis of multidimensional data, and group comparisons aiming at the identification of recurrent aberrations in patients sharing the same phenotype, respectively. Its flexible import options ease the comparative analysis of one's own results derived from microarray or NGS platforms with data from the literature or public repositories. Multidimensional data obtained from different experiment types can be merged into a common data matrix to enable common visualization and analysis. All results are stored in the integrated MySQL database, but can also be exported as tab-delimited files for further statistical calculations in external programs. GenomeCAT offers a broad spectrum of visualization and analysis tools that assist in the evaluation of CNVs in the context of other experiment data and annotations. The use of GenomeCAT does not require any specialized computer skills. The various R packages implemented for data analysis are fully integrated into GenomeCAT's graphical user interface and the installation process is supported by a wizard. The flexibility in terms of data import and export in combination with the ability to create a common data matrix makes the program also well suited as an interface between genomic data from heterogeneous sources and external software tools. Due to the modular architecture the functionality of

  1. [The Performance Analysis for Lighting Sources in Highway Tunnel Based on Visual Function].

    Science.gov (United States)

    Yang, Yong; Han, Wen-yuan; Yan, Ming; Jiang, Hai-feng; Zhu, Li-wei

    2015-10-01

    Under mesopic vision conditions, the spectral luminous efficiency function is a family of curves whose peak wavelength and intensity are affected by the light spectrum, background luminance and other factors, so the effect of a light source on lighting visibility cannot be characterized by a single optical parameter. In this experiment the reaction time of visual cognition is used as the evaluation index, and visual cognition is tested with the visual function method under different speeds and luminous environments. The light sources include high pressure sodium, electrodeless fluorescent lamp and white LED with three color temperatures (the color temperature range is 1958 to 5537 K). The background luminance values, between 1 and 5 cd/m², are typical of the basic section of highway tunnel illumination and general outdoor illumination, and all fall within the mesopic range. Test results show that, under the same speed and luminance, the visual cognition reaction time corresponding to high color temperature light sources is shorter than that corresponding to low color temperature sources, and the reaction time corresponding to visual targets at high speed is shorter than at low speed; at the end moment, however, the visual angle of the target in the observer's visual field is larger at low speed than at high speed. Based on the MOVE model, the equivalent luminance of human mesopic vision was calculated for the different emission spectra and background luminances produced by the test lighting sources. Compared with the photopic vision result, the standard deviation (CV) of the time-reaction curve corresponding to the equivalent luminance of mesopic vision is smaller. Under mesopic vision conditions, the discrepancy between the equivalent luminance of different lighting sources and photopic vision is one of the main reasons for causing the

  2. BAT: An open-source, web-based audio events annotation tool

    OpenAIRE

    Blai Meléndez-Catalan, Emilio Molina, Emilia Gómez

    2017-01-01

    In this paper we present BAT (BMAT Annotation Tool), an open-source, web-based tool for the manual annotation of events in audio recordings developed at BMAT (Barcelona Music and Audio Technologies). The main feature of the tool is that it provides an easy way to annotate the salience of simultaneous sound sources. Additionally, it allows to define multiple ontologies to adapt to multiple tasks and offers the possibility to cross-annotate audio data. Moreover, it is easy to install and deploy...

  3. VISUAL TOOLS FOR CROWDSOURCING DATA VALIDATION WITHIN THE GLOBELAND30 GEOPORTAL

    Directory of Open Access Journals (Sweden)

    E. Chuprikova

    2016-06-01

    Full Text Available This research aims to investigate the role of visualization of the user generated data that can empower the geoportal of GlobeLand30 produced by NGCC (National Geomatics Center of China). The focus is set on the development of a concept of tools that can extend the Geo-tagging functionality and make use of it for different target groups. The anticipated tools should improve the continuous data validation, updating and efficient use of the remotely-sensed data distributed within GlobeLand30.

  4. Parallel analysis tools and new visualization techniques for ultra-large climate data set

    Energy Technology Data Exchange (ETDEWEB)

    Middleton, Don [National Center for Atmospheric Research, Boulder, CO (United States); Haley, Mary [National Center for Atmospheric Research, Boulder, CO (United States)

    2014-12-10

    ParVis was a project funded under LAB 10-05: “Earth System Modeling: Advanced Scientific Visualization of Ultra-Large Climate Data Sets”. Argonne was the lead lab with partners at PNNL, SNL, NCAR and UC-Davis. This report covers progress from January 1st, 2013 through Dec 1st, 2014. Two previous reports covered the period from Summer, 2010, through September 2011 and October 2011 through December 2012, respectively. While the project was originally planned to end on April 30, 2013, personnel and priority changes allowed many of the institutions to continue work through FY14 using existing funds. A primary focus of ParVis was introducing parallelism to climate model analysis to greatly reduce the time-to-visualization for ultra-large climate data sets. Work in the first two years was conducted on two tracks with different time horizons: one track to provide immediate help to climate scientists already struggling to apply their analysis to existing large data sets and another focused on building a new data-parallel library and tool for climate analysis and visualization that will give the field a platform for performing analysis and visualization on ultra-large datasets for the foreseeable future. In the final 2 years of the project, we focused mostly on the new data-parallel library and associated tools for climate analysis and visualization.

  5. PANDA-view: An easy-to-use tool for statistical analysis and visualization of quantitative proteomics data.

    Science.gov (United States)

    Chang, Cheng; Xu, Kaikun; Guo, Chaoping; Wang, Jinxia; Yan, Qi; Zhang, Jian; He, Fuchu; Zhu, Yunping

    2018-05-22

    Compared with the numerous software tools developed for identification and quantification of -omics data, there remains a lack of suitable tools for both downstream analysis and data visualization. To help researchers better understand the biological meanings in their -omics data, we present an easy-to-use tool, named PANDA-view, for both statistical analysis and visualization of quantitative proteomics data and other -omics data. PANDA-view contains various kinds of analysis methods such as normalization, missing value imputation, statistical tests, clustering and principal component analysis, as well as the most commonly-used data visualization methods including an interactive volcano plot. Additionally, it provides user-friendly interfaces for protein-peptide-spectrum representation of the quantitative proteomics data. PANDA-view is freely available at https://sourceforge.net/projects/panda-view/. Contact: 1987ccpacer@163.com and zhuyunping@gmail.com. Supplementary data are available at Bioinformatics online.
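
    A volcano plot combines a per-protein fold change with a test p-value. The sketch below computes those two quantities on synthetic data and applies common thresholds; it is an illustration of the idea, not PANDA-view's code.

```python
# Compute the two axes of a volcano plot (log2 fold change, -log10 p-value)
# for synthetic proteomics intensities and count proteins passing thresholds.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
control = rng.normal(10, 1, size=(200, 3))        # 200 proteins x 3 replicates
treated = rng.normal(10, 1, size=(200, 3))
treated[:20] += 2                                  # spike in 20 regulated proteins

log2_fc = np.log2(treated.mean(axis=1) / control.mean(axis=1))
p_values = ttest_ind(treated, control, axis=1).pvalue
neg_log_p = -np.log10(p_values)

significant = (np.abs(log2_fc) > 1) & (p_values < 0.05)
print(f"{significant.sum()} proteins pass the volcano-plot thresholds")
```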

  6. Building Eclectic Personal Learning Landscapes with Open Source Tools

    NARCIS (Netherlands)

    Kalz, Marco

    2008-01-01

    Kalz, M. (2005). Building Eclectic Personal Learning Landscapes with Open Source Tools. In F. de Vries, G. Attwell, R. Elferink & A. Tödt (Eds.), Open Source for Education in Europe. Research & Practice (= Proceedings of the Open Source for Education in Europe Conference) (pp. 163-168). 2005,

  7. GeneWiz browser: An Interactive Tool for Visualizing Sequenced Chromosomes

    DEFF Research Database (Denmark)

    Hallin, Peter Fischer; Stærfeldt, Hans Henrik; Rotenberg, Eva

    2009-01-01

    GeneWiz browser is an interactive web tool for visualizing sequenced chromosomes, offering improved readability and increased functionality compared to other browsers. The tool allows the user to select the display of various genomic features, color settings and data ranges. Custom numerical data can be added to the plot, allowing for example visualization of gene expression and regulation data. Furthermore, standard atlases are pre-generated for all prokaryotic genomes available in GenBank, providing a fast overview of all available genomes, including recently deposited genome sequences. The tool is available online from http://www.cbs.dtu.dk/services/gwBrowser. Supplemental material including interactive atlases is available online at http://www.cbs.dtu.dk/services/gwBrowser/suppl/.

  8. Building Eclectic Personal Learning Landscapes with Open Source Tools

    OpenAIRE

    Kalz, Marco

    2008-01-01

    Kalz, M. (2005). Building Eclectic Personal Learning Landscapes with Open Source Tools. In F. de Vries, G. Attwell, R. Elferink & A. Tödt (Eds.), Open Source for Education in Europe. Research & Practice (= Proceedings of the Open Source for Education in Europe Conference) (pp. 163-168). 2005, Heerlen, The Netherlands.

  9. Visual operations management tools applied to the oil pipelines and terminals standardization process: the experience of TRANSPETRO

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Maria Fatima Ludovico de [Pontificia Universidade Catolica do Rio de Janeiro (PUC-Rio/ITUC), Rio de Janeiro, RJ (Brazil). Instituto Tecnologico; Santiago, Adilson; Ribeiro, Kassandra Senra; Arruda, Daniela Mendonca [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    This paper describes the process by which visual operations management (VOM) tools were implemented, concerning standards and operational procedures in TRANSPETRO's Oil Pipelines and Terminals Unit. It provides: a brief literature review of visual operations management tools applied to total quality management and the standardization processes; a discussion of the assumptions from the second level of VOM (visual standards) upon which TRANSPETRO's oil pipelines and terminals business processes and operational procedures are based; and a description of the VOM implementation process involving more than 100 employees and one illustrative example of 'Quick Guides' for right-of-way management activities. Finally, it discusses the potential impacts and benefits of using VOM tools in the current practices in TRANSPETRO's Oil Pipelines and Terminals Unit, reinforcing the importance of such visual guides as vital to implement regional and corporate procedures, focusing on the main operational processes. (author)

  10. Developing an Interactive Data Visualization Tool to Assess the Impact of Decision Support on Clinical Operations.

    Science.gov (United States)

    Huber, Timothy C; Krishnaraj, Arun; Monaghan, Dayna; Gaskin, Cree M

    2018-05-18

    Due to mandates from recent legislation, clinical decision support (CDS) software is being adopted by radiology practices across the country. This software provides imaging study decision support for referring providers at the point of order entry. CDS systems produce a large volume of data, providing opportunities for research and quality improvement. In order to better visualize and analyze trends in this data, an interactive data visualization dashboard was created using a commercially available data visualization platform. Following the integration of a commercially available clinical decision support product into the electronic health record, a dashboard was created using a commercially available data visualization platform (Tableau, Seattle, WA). Data generated by the CDS were exported from the data warehouse, where they were stored, into the platform. This allowed for real-time visualization of the data generated by the decision support software. The creation of the dashboard allowed the output from the CDS platform to be more easily analyzed and facilitated hypothesis generation. Integrating data visualization tools into clinical decision support tools allows for easier data analysis and can streamline research and quality improvement efforts.

  11. The Film as Visual Aided Learning Tool in Classroom Management Course

    Science.gov (United States)

    Altinay Gazi, Zehra; Altinay Aksal, Fahriye

    2011-01-01

    This research aims to investigate the impact of the visual aided learning on pre-service teachers' co-construction of subject matter knowledge in teaching practice. The study revealed the examination of film as an active cognizing and learning tool in classroom management course within teacher education programme. Within the framework of action…

  12. Visual Tools for Eliciting Connections and Cohesiveness in Mixed Methods Research

    Science.gov (United States)

    Murawska, Jaclyn M.; Walker, David A.

    2017-01-01

    In this commentary, we offer a set of visual tools that can assist education researchers, especially those in the field of mathematics, in developing cohesiveness from a mixed methods perspective, commencing at a study's research questions and literature review, through its data collection and analysis, and finally to its results. This expounds…

  13. Microbial source tracking: a tool for identifying sources of microbial contamination in the food chain.

    Science.gov (United States)

    Fu, Ling-Lin; Li, Jian-Rong

    2014-01-01

    The ability to trace fecal indicators and food-borne pathogens to the point of origin has major ramifications for food industry, food regulatory agencies, and public health. Such information would enable food producers and processors to better understand sources of contamination and thereby take corrective actions to prevent transmission. Microbial source tracking (MST), which currently is largely focused on determining sources of fecal contamination in waterways, is also providing the scientific community tools for tracking both fecal bacteria and food-borne pathogens contamination in the food chain. Approaches to MST are commonly classified as library-dependent methods (LDMs) or library-independent methods (LIMs). These tools will have widespread applications, including the use for regulatory compliance, pollution remediation, and risk assessment. These tools will reduce the incidence of illness associated with food and water. Our aim in this review is to highlight the use of molecular MST methods in application to understanding the source and transmission of food-borne pathogens. Moreover, the future directions of MST research are also discussed.

  14. LabKey Server NAb: A tool for analyzing, visualizing and sharing results from neutralizing antibody assays

    Directory of Open Access Journals (Sweden)

    Gao Hongmei

    2011-05-01

    Full Text Available Background: Multiple types of assays allow sensitive detection of virus-specific neutralizing antibodies. For example, the extent of antibody neutralization of HIV-1, SIV and SHIV can be measured in the TZM-bl cell line through the degree of luciferase reporter gene expression after infection. In the past, neutralization curves and titers for this standard assay have been calculated using an Excel macro. Updating all instances of such a macro with new techniques can be unwieldy and introduce non-uniformity across multi-lab teams. Using Excel also poses challenges in centrally storing, sharing and associating raw data files and results. Results: We present LabKey Server's NAb tool for organizing, analyzing and securely sharing data, files and results for neutralizing antibody (NAb) assays, including the luciferase-based TZM-bl NAb assay. The customizable tool supports high-throughput experiments and includes a graphical plate template designer, allowing researchers to quickly adapt calculations to new plate layouts. The tool calculates the percent neutralization for each serum dilution based on luminescence measurements, fits a range of neutralization curves to titration results and uses these curves to estimate the neutralizing antibody titers for benchmark dilutions. Results, curve visualizations and raw data files are stored in a database and shared through a secure, web-based interface. NAb results can be integrated with other data sources based on sample identifiers. It is simple to make results public after publication by updating folder security settings. Conclusions: Standardized tools for analyzing, archiving and sharing assay results can improve the reproducibility, comparability and reliability of results obtained across many labs. LabKey Server and its NAb tool are freely available as open source software at http://www.labkey.com under the Apache 2.0 license. Many members of the HIV research community can also access the Lab
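
    The core calculation described above, percent neutralization per serum dilution from luminescence readings, can be sketched as below, together with a naive interpolation of the 50% titer. Plate values are invented, and LabKey's tool fits full neutralization curves rather than interpolating.

```python
# Percent neutralization from luminescence (RLU) and a rough ID50 estimate.
import numpy as np

virus_ctrl, cell_ctrl = 120000.0, 2000.0            # mean control wells (RLU)
dilutions = np.array([20, 60, 180, 540, 1620, 4860])
sample_rlu = np.array([5000, 9000, 25000, 60000, 95000, 115000])

neut = 100 * (1 - (sample_rlu - cell_ctrl) / (virus_ctrl - cell_ctrl))
# interpolate the dilution at which neutralization crosses 50%
id50 = np.interp(50, neut[::-1], np.log10(dilutions)[::-1])
print(np.round(neut, 1), f"ID50 ~ 1:{10 ** id50:.0f}")
```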

  15. Introducing Product Lines through Open Source Tools

    OpenAIRE

    Haugen, Øystein

    2008-01-01

    We present an approach to introducing product lines to companies that lowers their initial risk by applying open source tools and providing a smooth learning curve into the use and creation of domain-specific modeling combined with standardized variability modeling.

  16. The pyPHaz software, an interactive tool to analyze and visualize results from probabilistic hazard assessments

    Science.gov (United States)

    Tonini, Roberto; Selva, Jacopo; Costa, Antonio; Sandri, Laura

    2014-05-01

    Probabilistic Hazard Assessment (PHA) is becoming an essential tool for risk mitigation policies, since it quantifies the hazard due to hazardous phenomena and, unlike the deterministic approach, accounts for both aleatory and epistemic uncertainties. On the other hand, one of the main disadvantages of PHA methods is that their results are not easy to understand and interpret by people who are not specialists in probabilistic tools. For scientists, this leads to the issue of providing tools that can be easily used and understood by decision makers (i.e., risk managers or local authorities). The work presented here addresses the problem of simplifying the transfer of scientific knowledge into land protection policies by providing an interface between scientists, who produce PHA results, and decision makers, who use PHA results for risk analyses. In this framework we present pyPHaz, an open tool developed and designed to visualize and analyze PHA results due to one or more phenomena affecting a specific area of interest. The software has been fully developed with the free and open-source Python programming language and several Python-based libraries and modules. The pyPHaz tool allows the user to visualize the Hazard Curves (HC) calculated in a selected target area, together with different levels of uncertainty (mean and percentiles), on maps that can be interactively created and modified by the user, thanks to a dedicated Graphical User Interface (GUI). Moreover, the tool can be used to compare the results of different PHA models and to merge them by creating ensemble models. The pyPHaz software has been designed to store and access all data through a MySQL database and to read as input the XML-based standard file formats defined in the frame of GEM (Global Earthquake Model). This format model is easy to extend to any other kind of hazard, as will be shown in the applications
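
    The ensemble and percentile statistics that pyPHaz displays can be illustrated with a short sketch: given hazard curves (exceedance probability versus intensity) from several alternative models, a weighted ensemble mean and percentile curves are computed level by level. The numbers, model weights, and curve shapes below are invented for illustration and do not come from pyPHaz.

```python
# Illustrative sketch (not pyPHaz code): ensemble mean and percentiles of hazard curves.
import numpy as np

intensity = np.logspace(-2, 1, 50)                 # hypothetical intensity-measure levels
# Hypothetical annual exceedance probabilities from three alternative PHA models.
models = np.array([0.1 * np.exp(-intensity / scale) for scale in (0.8, 1.0, 1.3)])
weights = np.array([0.2, 0.5, 0.3])                # hypothetical model weights

ensemble_mean = np.average(models, axis=0, weights=weights)
p16, p84 = np.percentile(models, [16, 84], axis=0)

for im, mean, lo, hi in zip(intensity[::10], ensemble_mean[::10], p16[::10], p84[::10]):
    print(f"IM={im:8.3f}  mean={mean:.3e}  16th={lo:.3e}  84th={hi:.3e}")
```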

  17. From a Gloss to a Learning Tool: Does Visual Aids Enhance Better Sentence Comprehension?

    Science.gov (United States)

    Sato, Takeshi; Suzuki, Akio

    2012-01-01

    The aim of this study is to optimize CALL environments as a learning tool rather than a gloss, focusing on the learning of polysemous words which refer to spatial relationships between objects. A lot of research has already been conducted to examine the efficacy of visual glosses while reading L2 texts and has reported that visual glosses can be…

  18. Open source tools for large-scale neuroscience.

    Science.gov (United States)

    Freeman, Jeremy

    2015-06-01

    New technologies for monitoring and manipulating the nervous system promise exciting biology but pose challenges for analysis and computation. Solutions can be found in the form of modern approaches to distributed computing, machine learning, and interactive visualization. But embracing these new technologies will require a cultural shift: away from independent efforts and proprietary methods and toward an open source and collaborative neuroscience. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.

  19. Improvement of visual debugging tool. Shortening the elapsed time for getting data and adding new functions to compare/combine a set of visualized data

    International Nuclear Information System (INIS)

    Matsuda, Katsuyuki; Takemiya, Hiroshi

    2001-03-01

    The visual debugging tool 'vdebug', designed for the debugging of programs for scientific computing, has been improved in two respects: (1) shortening the elapsed time required to get the appropriate data to visualize; (2) adding new functions that enable users to compare and/or combine sets of visualized data originating from two or more different programs. As for shortening the elapsed time for getting data, the improved version of 'vdebug' achieved a speed-up of over a hundred times with dbx and pdbx on the SX-4, and of over ten times with ndb on the SR2201. As for the new functions to compare/combine visualized data, it was confirmed that the consistency between the computational results obtained at each calculation step on two different computers, SP and ONYX, could easily be checked. In this report, we illustrate how the tool 'vdebug' has been improved with an example. (author)

  20. Sleep: An Open-Source Python Software for Visualization, Analysis, and Staging of Sleep Data.

    Science.gov (United States)

    Combrisson, Etienne; Vallat, Raphael; Eichenlaub, Jean-Baptiste; O'Reilly, Christian; Lajnef, Tarek; Guillot, Aymeric; Ruby, Perrine M; Jerbi, Karim

    2017-01-01

    We introduce Sleep, a new Python open-source graphical user interface (GUI) dedicated to the visualization, scoring and analysis of sleep data. Among its most prominent features are: (1) Dynamic display of polysomnographic data, spectrogram, hypnogram and topographic maps with several customizable parameters, (2) Implementation of automatic detection of several sleep features such as spindles, K-complexes, slow waves, and rapid eye movements (REM), (3) Implementation of practical signal processing tools such as re-referencing or filtering, and (4) Display of main descriptive statistics including publication-ready tables and figures. The software package supports loading and reading raw EEG data from standard file formats such as European Data Format, in addition to a range of commercial data formats. Most importantly, Sleep is built on top of the VisPy library, which provides GPU-based fast and high-level visualization. As a result, it is capable of efficiently handling and displaying large sleep datasets. Sleep is freely available (http://visbrain.org/sleep) and comes with sample datasets and extensive documentation. Novel functionalities will continue to be added and open-science community efforts are expected to enhance the capacities of this module.
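
    The spectrogram view mentioned among the display features can be reproduced offline with standard scientific Python tools. The sketch below computes a spectrogram of a synthetic EEG channel in 30-second epochs and a per-epoch delta-band power, purely as an illustration of that kind of display; it is not code from the Sleep package, and the sampling rate and signal are made up.

```python
# Sketch of a sleep-style spectrogram display (not code from the Sleep package).
import numpy as np
from scipy.signal import spectrogram

sf = 100.0                                   # assumed sampling frequency (Hz)
t = np.arange(0, 600, 1 / sf)                # ten minutes of synthetic EEG
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 12 * t) + 0.5 * rng.standard_normal(t.size)

# 30-second windows, a common epoch length in sleep scoring.
freqs, times, power = spectrogram(eeg, fs=sf, nperseg=int(30 * sf))
delta_power = power[(freqs >= 0.5) & (freqs <= 4.0)].sum(axis=0)  # delta band per epoch
print(np.round(delta_power, 3))
```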

  1. Sleep: An Open-Source Python Software for Visualization, Analysis, and Staging of Sleep Data

    Directory of Open Access Journals (Sweden)

    Etienne Combrisson

    2017-09-01

    Full Text Available We introduce Sleep, a new Python open-source graphical user interface (GUI) dedicated to the visualization, scoring and analysis of sleep data. Among its most prominent features are: (1) Dynamic display of polysomnographic data, spectrogram, hypnogram and topographic maps with several customizable parameters, (2) Implementation of automatic detection of several sleep features such as spindles, K-complexes, slow waves, and rapid eye movements (REM), (3) Implementation of practical signal processing tools such as re-referencing or filtering, and (4) Display of main descriptive statistics including publication-ready tables and figures. The software package supports loading and reading raw EEG data from standard file formats such as European Data Format, in addition to a range of commercial data formats. Most importantly, Sleep is built on top of the VisPy library, which provides GPU-based fast and high-level visualization. As a result, it is capable of efficiently handling and displaying large sleep datasets. Sleep is freely available (http://visbrain.org/sleep) and comes with sample datasets and extensive documentation. Novel functionalities will continue to be added and open-science community efforts are expected to enhance the capacities of this module.

  2. Adding tools to the open source toolbox: The Internet

    Science.gov (United States)

    Porth, Tricia

    1994-01-01

    The Internet offers researchers additional sources of information not easily available from traditional sources such as print volumes or commercial data bases. Internet tools such as e-mail and file transfer protocol (ftp) speed up the way researchers communicate and transmit data. Mosaic, one of the newest additions to the Internet toolbox, allows users to combine tools such as ftp, gopher, wide area information server, and the world wide web with multimedia capabilities. Mosaic has quickly become a popular means of making information available on the Internet because it is versatile and easily customizable.

  3. Tools for visualization of phosphoinositides in the cell nucleus.

    Science.gov (United States)

    Kalasova, Ilona; Fáberová, Veronika; Kalendová, Alžběta; Yildirim, Sukriye; Uličná, Lívia; Venit, Tomáš; Hozák, Pavel

    2016-04-01

    Phosphoinositides (PIs) are glycerol-based phospholipids containing a hydrophilic inositol ring. The inositol ring is mono-, bis-, or tris-phosphorylated, yielding seven PI family members. Ample evidence shows that PIs localize both to the cytoplasm and to the nucleus. However, tools for direct visualization of nuclear PIs are limited, and many studies thus employ indirect approaches, such as staining of their metabolic enzymes. Since the localization and mobility of PIs differ from those of their metabolic enzymes, these approaches may result in incomplete data. In this paper, we tested commercially available PI antibodies by light microscopy on fixed cells, tested their specificity using a protein-lipid overlay assay and a blocking assay, and compared their staining patterns. Additionally, we prepared recombinant PI-binding domains and tested them on both fixed and live cells by light microscopy. The results provide a useful overview of the usability of the tools tested and stress that the selection of adequate tools is critical. Knowing the localization of individual PIs in various functional compartments should enable us to better understand the roles of PIs in the cell nucleus.

  4. Survey of Network Visualization Tools

    Science.gov (United States)

    2007-12-01

    programming languages such as Java, C#, Delphi and Visual Basic. AlgoCOMs Network also supports Visual Basic for Applications (VBA). Availability: commercially available; cost $101… Application Monitoring: constantly watch the health of your mission-critical applications (MS SQL, MS Exchange, MS IIS, Active Directory). Event

  5. Enhancing interdisciplinary collaboration and decisionmaking with J-Earth: an open source data sharing, visualization and GIS analysis platform

    Science.gov (United States)

    Prashad, L. C.; Christensen, P. R.; Fink, J. H.; Anwar, S.; Dickenshied, S.; Engle, E.; Noss, D.

    2010-12-01

    local to global levels. J-Earth is a Geographic Information System (GIS) that provides analytical tools for visualizing high-resolution and hyperspectral remote sensing imagery along with numeric and vector data. J-Earth is part of the JMARS (Java Mission-planning and Analysis for Remote Sensing) suite of tools which were first created to target NASA instruments on Mars and Lunar missions. Data can currently be incorporated in J-Earth at a scale of over 32,000 pixels per degree. Among other GIS functions, users can analyze trends along a transect line, or across vector regions, over multiple stacked numerical data layers and export their results. Open source tools, like J-Earth, are not only generally free or low-cost to users but provide the opportunity for users to contribute direction, functionality, and data standards to these projects. The flexible nature of open source projects often facilitates the incorporation of unique and emerging data sources, such as mobile phone data, sensor networks, crowdsourced inputs, and social networking. The J-Earth team plans to incorporate data sources such as these with the feedback and participation of the user community.

  6. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data.

    Science.gov (United States)

    Jaitly, Navdeep; Mayampurath, Anoop; Littlefield, Kyle; Adkins, Joshua N; Anderson, Gordon A; Smith, Richard D

    2009-03-17

    Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to identify and quantify peptides in the
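
    One of the processing steps named above, theoretical isotope distribution modelling, can be approximated quite simply: for a peptide of a given average mass, the number of heavy isotopes per molecule is roughly Poisson-distributed, with a rate proportional to the mass (an "averagine"-style assumption). The sketch below is only that coarse approximation in Python, not Decon2LS's actual algorithm; the rate constant is an assumed round number.

```python
# Coarse sketch of theoretical isotope-envelope modelling (not the Decon2LS algorithm):
# the number of heavy isotopes in a peptide of given average mass is approximated as
# Poisson-distributed, with an assumed rate of ~4.8e-4 heavy isotopes per dalton.
import math

def isotope_envelope(mass_da, n_peaks=6, heavy_rate_per_da=4.8e-4):
    lam = heavy_rate_per_da * mass_da                    # expected heavy isotopes per molecule
    probs = [math.exp(-lam) * lam**k / math.factorial(k) for k in range(n_peaks)]
    top = max(probs)
    return [p / top for p in probs]                      # normalized to the tallest peak

print([round(p, 3) for p in isotope_envelope(2000.0)])   # ~2 kDa peptide
```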

  7. Using a Self-Administered Visual Basic Software Tool To Teach Psychological Concepts.

    Science.gov (United States)

    Strang, Harold R.; Sullivan, Amie K.; Schoeny, Zahrl G.

    2002-01-01

    Introduces LearningLinks, a Visual Basic software tool that allows teachers to create individualized learning modules that use constructivist and behavioral learning principles. Describes field testing of undergraduates at the University of Virginia that tested a module designed to improve understanding of the psychological concepts of…

  8. iRaster: a novel information visualization tool to explore spatiotemporal patterns in multiple spike trains.

    Science.gov (United States)

    Somerville, J; Stuart, L; Sernagor, E; Borisyuk, R

    2010-12-15

    Over the last few years, simultaneous recordings of multiple spike trains have become widely used by neuroscientists. Therefore, it is important to develop new tools for analysing multiple spike trains in order to gain new insight into the function of neural systems. This paper describes how techniques from the field of visual analytics can be used to reveal specific patterns of neural activity. An interactive raster plot called iRaster has been developed. This software incorporates a selection of statistical procedures for visualization and flexible manipulations with multiple spike trains. For example, there are several procedures for the re-ordering of spike trains which can be used to unmask activity propagation, spiking synchronization, and many other important features of multiple spike train activity. Additionally, iRaster includes a rate representation of neural activity, a combined representation of rate and spikes, spike train removal and time interval removal. Furthermore, it provides multiple coordinated views, time and spike train zooming windows, a fisheye lens distortion, and dissemination facilities. iRaster is a user friendly, interactive, flexible tool which supports a broad range of visual representations. This tool has been successfully used to analyse both synthetic and experimentally recorded datasets. In this paper, the main features of iRaster are described and its performance and effectiveness are demonstrated using various types of data including experimental multi-electrode array recordings from the ganglion cell layer in mouse retina. iRaster is part of an ongoing research project called VISA (Visualization of Inter-Spike Associations) at the Visualization Lab in the University of Plymouth. The overall aim of the VISA project is to provide neuroscientists with the ability to freely explore and analyse their data. The software is freely available from the Visualization Lab website (see www.plymouth.ac.uk/infovis). Copyright © 2010
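
    The basic view that iRaster builds on, a raster of many spike trains that can be re-ordered by a statistic such as firing rate, is easy to sketch with matplotlib's eventplot. The synthetic Poisson spike trains and the rate-based ordering below are illustrative only and are not taken from the iRaster implementation.

```python
# Minimal raster-plot sketch in the spirit of iRaster (not its implementation):
# draw many spike trains and re-order them by firing rate.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
duration = 10.0                                          # seconds
# Hypothetical recording: 20 Poisson spike trains with different rates.
trains = [np.sort(rng.uniform(0.0, duration, rng.poisson(rate * duration)))
          for rate in rng.uniform(1.0, 20.0, 20)]

order = np.argsort([len(tr) / duration for tr in trains])  # sort trains by firing rate
plt.eventplot([trains[i] for i in order], linelengths=0.8)
plt.xlabel("Time (s)")
plt.ylabel("Spike train (sorted by rate)")
plt.show()
```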

  9. Digital administrative maps – A tool for visualization of epidemiological data

    Directory of Open Access Journals (Sweden)

    Ewa Niewiadomska

    2013-08-01

    Full Text Available Background: The aim of the study is to present methods for the visualization of epidemiological data using digital contour maps that take into account the administrative division of Poland. Materials and Methods: The possibilities of visualizing epidemiological data in a geographical order, at the administrative levels of the country, voivodeships and poviats (counties), are presented. They are crucial for the process of identifying and undertaking adequate prophylactic activities directed towards decreasing risk and improving the population's health. This paper presents tools and techniques available in the Geographic Information System ArcGIS and the statistical software package R. Results: The work includes our own data reflecting: (1) the values of specific mortality rates due to respiratory diseases, Poland, 2010, based on the Central Statistical Office data, using the R statistical software package; (2) the averaged registered incidence rates of sarcoidosis in 2006-2010 for the population aged 19+ in the Silesian voivodeship, using the Geographic Information System ArcGIS; and (3) the number of children with diagnosed respiratory diseases in the city of Legnica in 2009, taking into account their place of residence, using layered maps in the Geographic Information System ArcGIS. Conclusions: The tools presented and described in this paper make it possible to visualize the results of research, to increase the attractiveness of courses for students, as well as to enhance the skills and competence of students and course participants. Med Pr 2013;64(4):533–539

  10. The sensory timecourses associated with conscious visual item memory and source memory.

    Science.gov (United States)

    Thakral, Preston P; Slotnick, Scott D

    2015-09-01

    Previous event-related potential (ERP) findings have suggested that during visual item and source memory, nonconscious and conscious sensory (occipital-temporal) activity onsets may be restricted to early (0-800 ms) and late (800-1600 ms) temporal epochs, respectively. In an ERP experiment, we tested this hypothesis by separately assessing whether the onset of conscious sensory activity was restricted to the late epoch during source (location) memory and item (shape) memory. We found that conscious sensory activity had a late (>800 ms) onset during source memory and an early (<800 ms) onset during item memory. In a follow-up fMRI experiment, conscious sensory activity was localized to BA17, BA18, and BA19. Of primary importance, the distinct source memory and item memory ERP onsets contradict the hypothesis that there is a fixed temporal boundary separating nonconscious and conscious processing during all forms of visual conscious retrieval. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Visi—A VTK- and QT-Based Open-Source Project for Scientific Data Visualization

    Science.gov (United States)

    Li, Yiming; Chen, Cheng-Kai

    2009-03-01

    In this paper, we present an open-source project, Visi, for high-dimensional engineering and scientific data visualization. Visi provides a state-of-the-art interactive user interface and graphics kernels based upon Qt (a cross-platform GUI toolkit) and VTK (an object-oriented visualization library). When Visi is initialized, a preliminary window is activated by Qt, and the VTK kernel is simultaneously embedded into the window, where the graphics resources are allocated. Visualization is driven through an interactive interface so that the data are rendered according to the user's preferences. The developed framework possesses high flexibility and extensibility for advanced functions (e.g., object combination) and further applications. Applications of Visi to data visualization in various fields, such as protein structure in bioinformatics, 3D semiconductor transistors, and interconnects of very-large-scale integration (VLSI) layouts, are also illustrated to show the performance of Visi. The developed open-source project is available on our project website [1].
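
    The Qt-plus-VTK coupling described above (a Qt window with an embedded VTK rendering kernel) is a standard pattern. The sketch below shows a minimal Python analogue using PyQt5 and the VTK Python bindings; Visi itself is a separate project, and the cone-source scene here is just placeholder data. Depending on the VTK version, the interactor widget may need to be imported from vtkmodules.qt instead of vtk.qt.

```python
# Minimal Python analogue of the Qt + VTK coupling described above (not Visi's source).
import sys
import vtk
from PyQt5 import QtWidgets
from vtk.qt.QVTKRenderWindowInteractor import QVTKRenderWindowInteractor
# On newer VTK wheels this import may instead be:
# from vtkmodules.qt.QVTKRenderWindowInteractor import QVTKRenderWindowInteractor

app = QtWidgets.QApplication(sys.argv)
window = QtWidgets.QMainWindow()
vtk_widget = QVTKRenderWindowInteractor(window)    # VTK render window embedded in Qt
window.setCentralWidget(vtk_widget)

renderer = vtk.vtkRenderer()
vtk_widget.GetRenderWindow().AddRenderer(renderer)

# Placeholder scene: a simple cone instead of real engineering data.
cone = vtk.vtkConeSource()
mapper = vtk.vtkPolyDataMapper()
mapper.SetInputConnection(cone.GetOutputPort())
actor = vtk.vtkActor()
actor.SetMapper(mapper)
renderer.AddActor(actor)

window.show()
vtk_widget.Initialize()                            # start the interactor
sys.exit(app.exec_())
```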

  12. Web-Based Tools for Data Visualization and Decision Support for South Asia

    Science.gov (United States)

    Jones, N.; Nelson, J.; Pulla, S. T.; Ames, D. P.; Souffront, M.; David, C. H.; Zaitchik, B. F.; Gatlin, P. N.; Matin, M. A.

    2017-12-01

    The objective of the NASA SERVIR project is to assist developing countries in using information provided by Earth observing satellites to assess and manage climate risks, land use, and water resources. We present a collection of web apps that integrate Earth observations and in situ data to facilitate the deployment of data and water resources models as decision-making tools in support of this effort. The interactive nature of web apps makes this an excellent medium for creating decision support tools that harness cutting edge modeling techniques. Thin client apps hosted in a cloud portal eliminate the need for decision makers to procure and maintain the high performance hardware required by the models, deal with issues related to software installation and platform incompatibilities, or monitor and install software updates, a problem that is exacerbated for many of the regional SERVIR hubs where both financial and technical capacity may be limited. All that is needed to use the system is an Internet connection and a web browser. We take advantage of these technologies to develop tools which can be centrally maintained but openly accessible. Advanced mapping and visualization make results intuitive and the derived information actionable. We also take advantage of the emerging standards for sharing water information across the web using the OGC- and WMO-approved WaterML standards. This makes our tools interoperable and extensible via application programming interfaces (APIs) so that tools and data from other projects can both consume and share the tools developed in our project. Our approach enables the integration of multiple types of data and models, thus facilitating collaboration between science teams in SERVIR. The apps developed thus far by our team process time-varying netCDF files from Earth observations and large-scale computer simulations and allow visualization and exploration via raster animation and extraction of time series at selected points and/or regions.
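
    One of the operations described, extracting a time series at a selected point from a time-varying netCDF file, can be sketched with xarray. The file name, variable name, and coordinates below are hypothetical and stand in for whichever Earth-observation product a given app serves.

```python
# Sketch of point time-series extraction from a time-varying netCDF file; the file
# name, variable name, and coordinates are hypothetical placeholders.
import xarray as xr

ds = xr.open_dataset("satellite_soil_moisture.nc")          # hypothetical product
point = ds["soil_moisture"].sel(lat=27.7, lon=85.3, method="nearest")

monthly = point.resample(time="1MS").mean()                 # aggregate to monthly means
monthly.to_dataframe().to_csv("soil_moisture_point.csv")
```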

  13. Custom Visualization without Real Programming

    DEFF Research Database (Denmark)

    Pantazos, Kostas

    Information Visualization tools have simplified visualization development. Some tools help simple users construct standard visualizations; others help programmers develop custom visualizations. This thesis contributes to the field of Information Visualization and End-User Development. The first contribution of the thesis is a taxonomy for Information Visualization development tools. Existing taxonomies from the Information Visualization field are helpful, but none of them can properly categorize visualization tools from a user development perspective. The categorization of 20 Information Visualization tools proves the applicability of this taxonomy, and the result showed that there are no Drag-and-Drop tools that allow end-user developers as well as programmers to create custom visualizations. The second contribution is a new visualization development approach, the Drag...

  14. Procedures and Tools Used by Teachers When Completing Functional Vision Assessments with Children with Visual Impairments

    Science.gov (United States)

    Kaiser, Justin T.; Herzberg, Tina S.

    2017-01-01

    Introduction: This study analyzed survey responses from 314 teachers of students with visual impairments regarding the tools and procedures used in completing functional vision assessments (FVAs). Methods: Teachers of students with visual impairments in the United States and Canada completed an online survey during spring 2016. Results: The…

  15. Tools for Trade Analysis and Open Source Information Monitoring for Non-proliferation

    International Nuclear Information System (INIS)

    Cojazzi, G.G.M.; Versino, C.; Wolfart, E.; Renda, G.; Janssens, W.A.M.; )

    2015-01-01

    The new state-level approach being proposed by the IAEA envisions an objective-based and information-driven safeguards approach utilizing all relevant information to improve the effectiveness and efficiency of safeguards. To this end the IAEA also makes use of open source information, here broadly defined as any information that is neither classified nor proprietary. It includes, but is not limited to: media sources, government and non-governmental reports and analyses, commercial data, and scientific/technical literature, including trade data. Within the EC support programme to the IAEA, JRC has surveyed and catalogued open sources on import-export customs trade data and developed tools for supporting the use of the related databases in safeguards. The JRC software The Big Table (TBT) supports, inter alia: a) searching through a collection of reference documents relevant to trade analysis (legal/regulatory documents, technical handbooks); b) selecting items of interest for specific verifications; and c) mapping these items to customs commodities searchable in trade databases. In the field of open source monitoring, JRC is developing and operating a "Nuclear Security Media Monitor" (NSMM), which is a web-based multilingual news aggregation system that automatically collects news articles from pre-defined web sites. NSMM is a domain-specific version of the general JRC-Europe Media Monitor (EMM). NSMM has been established within the EC support programme with the aim of streamlining IAEA's process of open source information monitoring. In the first part, the paper will recall the trade data sources relevant for non-proliferation and will then illustrate the main features of TBT, recently coupled with the IAEA Physical Model, and new visualization techniques applied to trade data. In the second part it will present the main aspects of the NSMM, also illustrating some of the uses made of it at JRC. (author)

  16. KENO3D visualization tool for KENO V.a geometry models

    International Nuclear Information System (INIS)

    Bowman, S.M.; Horwedel, J.E.

    1999-01-01

    The standardized computer analyses for licensing evaluations (SCALE) computer software system developed at Oak Ridge National Laboratory (ORNL) is widely used and accepted around the world for criticality safety analyses. SCALE includes the well-known KENO V.a three-dimensional Monte Carlo criticality computer code. Criticality safety analyses often require detailed modeling of complex geometries. Checking the accuracy of these models can be enhanced by effective visualization tools. To address this need, ORNL has recently developed a powerful state-of-the-art visualization tool called KENO3D that enables KENO V.a users to interactively display their three-dimensional geometry models. The interactive options include the following: (1) shaded or wireframe images; (2) standard views, such as top view, side view, front view, and isometric three-dimensional view; (3) rotating the model; (4) zooming in on selected locations; (5) selecting parts of the model to display; (6) editing colors and displaying legends; (7) displaying properties of any unit in the model; (8) creating cutaway views; (9) removing units from the model; and (10) printing the image or saving it to common graphics formats

  17. Different visual exploration of tool-related gestures in left hemisphere brain damaged patients is associated with poor gestural imitation.

    Science.gov (United States)

    Vanbellingen, Tim; Schumacher, Rahel; Eggenberger, Noëmi; Hopfner, Simone; Cazzoli, Dario; Preisig, Basil C; Bertschi, Manuel; Nyffeler, Thomas; Gutbrod, Klemens; Bassetti, Claudio L; Bohlhalter, Stephan; Müri, René M

    2015-05-01

    According to the direct matching hypothesis, perceived movements automatically activate existing motor components through matching of the perceived gesture and its execution. The aim of the present study was to test the direct matching hypothesis by assessing whether visual exploration behavior correlates with deficits in gestural imitation in left hemisphere damaged (LHD) patients. Eighteen LHD patients and twenty healthy control subjects took part in the study. Gesture imitation performance was measured by the test for upper limb apraxia (TULIA). Visual exploration behavior was measured by an infrared eye-tracking system. Short videos including forty gestures (20 meaningless and 20 communicative gestures) were presented. Cumulative fixation duration was measured in different regions of interest (ROIs), namely the face, the gesturing hand, the body, and the surrounding environment. Compared to healthy subjects, patients fixated the ROIs comprising the face and the gesturing hand significantly less during the exploration of emblematic and tool-related gestures. Moreover, visual exploration of tool-related gestures significantly correlated with tool-related imitation as measured by TULIA in LHD patients. Patients and controls did not differ in the visual exploration of meaningless gestures, and no significant relationships were found between visual exploration behavior and the imitation of emblematic and meaningless gestures in TULIA. The present study thus suggests that altered visual exploration may lead to disturbed imitation of tool-related gestures, but not of emblematic and meaningless gestures. Consequently, our findings partially support the direct matching hypothesis. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. HydroDesktop: An Open Source GIS-Based Platform for Hydrologic Data Discovery, Visualization, and Analysis

    Science.gov (United States)

    Ames, D. P.; Kadlec, J.; Cao, Y.; Grover, D.; Horsburgh, J. S.; Whiteaker, T.; Goodall, J. L.; Valentine, D. W.

    2010-12-01

    A growing number of hydrologic information servers are being deployed by government agencies, university networks, and individual researchers using the Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) Hydrologic Information System (HIS). The CUAHSI HIS Project has developed a standard software stack, called HydroServer, for publishing hydrologic observations data. It includes the Observations Data Model (ODM) database and Water Data Service web services, which together enable publication of data on the Internet in a standard format called Water Markup Language (WaterML). Metadata describing available datasets hosted on these servers is compiled within a central metadata catalog called HIS Central at the San Diego Supercomputer Center and is searchable through a set of predefined web-service-based queries. Together, these servers and central catalog service comprise a federated HIS of a scale and comprehensiveness never previously available. This presentation will briefly review/introduce the CUAHSI HIS system with special focus on a new HIS software tool called "HydroDesktop" and the open source software development web portal, www.HydroDesktop.org, which supports community development and maintenance of the software. HydroDesktop is a client-side, desktop software application that acts as a search and discovery tool for exploring the distributed network of HydroServers, downloading specific data series, visualizing and summarizing data series and exporting these to formats needed for analysis by external software. HydroDesktop is based on the open source DotSpatial GIS developer toolkit which provides it with map-based data interaction and visualization, and a plug-in interface that can be used by third party developers and researchers to easily extend the software using Microsoft .NET programming languages. HydroDesktop plug-ins that are presently available or currently under development within the project and by third party

  19. A Visualization Tool to Analyse Usage of Web-Based Interventions: The Example of Positive Online Weight Reduction (POWeR)

    Science.gov (United States)

    Smith, Emily; Bradbury, Katherine; Morrison, Leanne; Dennison, Laura; Michaelides, Danius; Yardley, Lucy

    2015-01-01

    Background Attrition is a significant problem in Web-based interventions. Consequently, this research aims to identify the relation between Web usage and benefit from such interventions. A visualization tool has been developed that enables researchers to more easily examine large datasets on intervention usage that can be difficult to make sense of using traditional descriptive or statistical techniques alone. Objective This paper demonstrates how the visualization tool was used to explore patterns in participants’ use of a Web-based weight management intervention, termed "positive online weight reduction (POWeR)." We also demonstrate how the visualization tool can be used to perform subsequent statistical analyses of the association between usage patterns, participant characteristics, and intervention outcome. Methods The visualization tool was used to analyze data from 132 participants who had accessed at least one session of the POWeR intervention. Results There was a drop in usage of optional sessions after participants had accessed the initial, core POWeR sessions, but many users nevertheless continued to complete goal and weight reviews. The POWeR tools relating to the food diary and steps diary were reused most often. Differences in participant characteristics and usage of other intervention components were identified between participants who did and did not choose to access optional POWeR sessions (in addition to the initial core sessions) or reuse the food and steps diaries. Reuse of the steps diary and the getting support tools was associated with greater weight loss. Conclusions The visualization tool provided a quick and efficient method for exploring patterns of Web usage, which enabled further analyses of whether different usage patterns were associated with participant characteristics or differences in intervention outcome. Further usage of visualization techniques is recommended to (1) make sense of large datasets more quickly and efficiently; (2

  20. Math for visualization, visualizing math

    NARCIS (Netherlands)

    Wijk, van J.J.; Hart, G.; Sarhangi, R.

    2013-01-01

    I present an overview of our work in visualization, and reflect on the role of mathematics therein. First, mathematics can be used as a tool to produce visualizations, which is illustrated with examples from information visualization, flow visualization, and cartography. Second, mathematics itself

  1. Visual-Haptic Integration: Cue Weights are Varied Appropriately, to Account for Changes in Haptic Reliability Introduced by Using a Tool

    Directory of Open Access Journals (Sweden)

    Chie Takahashi

    2011-10-01

    Full Text Available Tools such as pliers systematically change the relationship between an object's size and the hand opening required to grasp it. Previous work suggests the brain takes this into account, integrating visual and haptic size information that refers to the same object, independent of the similarity of the ‘raw’ visual and haptic signals (Takahashi et al., VSS 2009). Variations in tool geometry also affect the reliability (precision) of haptic size estimates, however, because they alter the change in hand opening caused by a given change in object size. Here, we examine whether the brain appropriately adjusts the weights given to visual and haptic size signals when tool geometry changes. We first estimated each cue's reliability by measuring size-discrimination thresholds in vision-alone and haptics-alone conditions. We varied haptic reliability using tools with different object-size:hand-opening ratios (1:1, 0.7:1, and 1.4:1). We then measured the weights given to vision and haptics with each tool, using a cue-conflict paradigm. The weight given to haptics varied with tool type in a manner that was well predicted by the single-cue reliabilities (MLE model; Ernst and Banks, 2002). This suggests that the process of visual-haptic integration appropriately accounts for variations in haptic reliability introduced by different tool geometries.
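
    The MLE prediction referred to above weights each cue by its relative reliability (the inverse of its variance), so a tool geometry that degrades haptic precision should lower the haptic weight. The sketch below computes those predicted weights; the threshold values and the assumption that haptic noise scales with the object-size:hand-opening ratio are illustrative, not the study's data.

```python
# Sketch of the MLE cue-combination prediction (Ernst & Banks style weights).
# Threshold values and the noise-scaling assumption are illustrative, not the study's data.
sigma_visual = 1.0                      # visual size-discrimination noise (arbitrary units)
haptic_base_sigma = 1.5                 # haptic noise with a 1:1 tool

for ratio in (1.0, 0.7, 1.4):           # object-size : hand-opening ratios of the tools
    # Assumption: haptic noise in object units scales with the tool ratio
    # (a 0.7:1 tool amplifies hand opening per unit object size, so noise shrinks).
    sigma_haptic = haptic_base_sigma * ratio
    r_v, r_h = 1.0 / sigma_visual**2, 1.0 / sigma_haptic**2
    haptic_weight = r_h / (r_v + r_h)   # predicted weight given to the haptic estimate
    print(f"tool {ratio:.1f}:1 -> predicted haptic weight {haptic_weight:.2f}")
```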

  2. The role of 3-D interactive visualization in blind surveys of H I in galaxies

    Science.gov (United States)

    Punzo, D.; van der Hulst, J. M.; Roerdink, J. B. T. M.; Oosterloo, T. A.; Ramatsoku, M.; Verheijen, M. A. W.

    2015-09-01

    Upcoming H I surveys will deliver large datasets, and automated processing using the full 3-D information (two positional dimensions and one spectral dimension) to find and characterize H I objects is imperative. In this context, visualization is an essential tool for enabling qualitative and quantitative human control on an automated source finding and analysis pipeline. We discuss how Visual Analytics, the combination of automated data processing and human reasoning, creativity and intuition, supported by interactive visualization, enables flexible and fast interaction with the 3-D data, helping the astronomer to deal with the analysis of complex sources. 3-D visualization, coupled to modeling, provides additional capabilities helping the discovery and analysis of subtle structures in the 3-D domain. The requirements for a fully interactive visualization tool are: coupled 1-D/2-D/3-D visualization, quantitative and comparative capabilities, combined with supervised semi-automated analysis. Moreover, the source code must have the following characteristics for enabling collaborative work: open, modular, well documented, and well maintained. We review four state-of-the-art 3-D visualization packages, assessing their capabilities and feasibility for use in the case of 3-D astronomical data.

  3. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh

    , called ForSyDe. ForSyDe is available under the open source approach, which allows small and medium enterprises (SMEs) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system-level modeling of a simple industrial use case, and we...

  4. Data visualization a guide to visual storytelling for libraries

    CERN Document Server

    2016-01-01

    Data Visualization: A Guide to Visual Storytelling for Libraries is a practical guide to the skills and tools needed to create beautiful and meaningful visual stories through data visualization. Learn how to sift through complex datasets to better understand a variety of metrics, such as trends in user behavior and electronic resource usage, return on investment (ROI) and impact metrics, and learning and reference analytics. Sections include: identifying and interpreting datasets for visualization; tools and technologies for creating meaningful visualizations; and case studies in data visualization and dashboards. Understanding and communicating trends from your organization's data is essential. Whether you are looking to make more informed decisions by visualizing organizational data, or to tell the story of your library's impact on your community, this book will give you the tools to make it happen.

  5. Savant Genome Browser 2: visualization and analysis for population-scale genomics.

    Science.gov (United States)

    Fiume, Marc; Smith, Eric J M; Brook, Andrew; Strbenac, Dario; Turner, Brian; Mezlini, Aziz M; Robinson, Mark D; Wodak, Shoshana J; Brudno, Michael

    2012-07-01

    High-throughput sequencing (HTS) technologies are providing an unprecedented capacity for data generation, and there is a corresponding need for efficient data exploration and analysis capabilities. Although most existing tools for HTS data analysis are developed for either automated (e.g. genotyping) or visualization (e.g. genome browsing) purposes, such tools are most powerful when combined. For example, integration of visualization and computation allows users to iteratively refine their analyses by updating computational parameters within the visual framework in real-time. Here we introduce the second version of the Savant Genome Browser, a standalone program for visual and computational analysis of HTS data. Savant substantially improves upon its predecessor and existing tools by introducing innovative visualization modes and navigation interfaces for several genomic datatypes, and synergizing visual and automated analyses in a way that is powerful yet easy even for non-expert users. We also present a number of plugins that were developed by the Savant Community, which demonstrate the power of integrating visual and automated analyses using Savant. The Savant Genome Browser is freely available (open source) at www.savantbrowser.com.

  6. Specvis: Free and open-source software for visual field examination.

    Science.gov (United States)

    Dzwiniel, Piotr; Gola, Mateusz; Wójcik-Gryciuk, Anna; Waleszczyk, Wioletta J

    2017-01-01

    Visual field impairment affects more than 100 million people globally. However, due to the lack of access to appropriate ophthalmic healthcare in less developed regions, as a result of the associated costs and required expertise, this number may be an underestimate. Improved access to affordable diagnostic software designed for visual field examination could slow the progression of diseases such as glaucoma by allowing early diagnosis and intervention. We have developed Specvis, a free and open-source application written in the Java programming language that can run on any personal computer, to meet this requirement (http://www.specvis.pl/). Specvis was tested on patients with glaucoma, retinitis pigmentosa and stroke, and the results were compared to results obtained with the Medmont M700 Automated Static Perimeter. The application was also tested for inter-test intrapersonal variability. The results from both validation studies indicated low inter-test intrapersonal variability and suitable reliability for a fast and simple assessment of visual field impairment. Specvis easily identifies visual field areas of zero sensitivity and allows for evaluation of sensitivity levels throughout the visual field. Thus, Specvis is a new, reliable application that can be successfully used for visual field examination and can fill the gap between confrontation and perimetry tests. The main advantages of Specvis over existing methods are its availability (free), affordability (runs on any personal computer), and reliability (comparable to high-cost solutions).

  7. The development of a visualization tool for displaying analysis and test results

    International Nuclear Information System (INIS)

    Uncapher, W.L.; Ammerman, D.J.; Ludwigsen, J.S.; Wix, S.D.

    1995-01-01

    The evaluation and certification of packages for transportation of radioactive materials is performed by analysis, testing, or a combination of both. Within the last few years, many transport packages that were certified have used a combination of analysis and testing. The ability to combine and display both kinds of data with interactive graphical tools allows a faster and more complete understanding of the response of the package to these environments. Sandia National Laboratories has developed an initial version of a visualization tool that allows the comparison and display of test and of analytical data as part of a Department of Energy-sponsored program to support advanced analytical techniques and test methodologies. The capability of the tool extends to both mechanical (structural) and thermal data

  8. Living Color Frame System: PC graphics tool for data visualization

    Science.gov (United States)

    Truong, Long V.

    1993-01-01

    Living Color Frame System (LCFS) is a personal computer software tool for generating real-time graphics applications. It is highly applicable for a wide range of data visualization in virtual environment applications. Engineers often use computer graphics to enhance the interpretation of data under observation. These graphics become more complicated when 'run time' animations are required, such as found in many typical modern artificial intelligence and expert systems. Living Color Frame System solves many of these real-time graphics problems.

  9. An Earthquake Information Service with Free and Open Source Tools

    Science.gov (United States)

    Schroeder, M.; Stender, V.; Jüngling, S.

    2015-12-01

    At the GFZ German Research Centre for Geosciences in Potsdam, the working group Earthquakes and Volcano Physics examines the spatiotemporal behavior of earthquakes. In this context, the hazards of volcanic eruptions and tsunamis are also explored. The aim is to collect related information after the occurrence of such extreme events and make it available for science, and partly to the public, as quickly as possible. The overall objective of this research is to reduce the geological risks that emanate from such natural hazards. In order to meet the stated objectives, to provide a quick overview of the seismicity of a particular region, and to compare the current situation to historical events, a comprehensive visualization was desired. Based on the web-accessible data from the well-known GFZ GEOFON network, a user-friendly web mapping application was realized. Furthermore, this web service integrates historical and current earthquake information from the USGS earthquake database and more historical events from various other catalogues, such as Pacheco and the International Seismological Centre (ISC). This compilation of sources is unique in the Earth sciences. Additionally, information about historical and current occurrences of volcanic eruptions and tsunamis is also retrievable. Another special feature of the application is temporal filtering via a time-shifting tool: users can interactively vary the visualization by moving the time slider. Furthermore, the application was realized using the newest JavaScript libraries, which enables it to run on displays and devices of all sizes. Our contribution will present the making of, the architecture behind, and a few examples of the look and feel of this application.

  10. XCluSim: a visual analytics tool for interactively comparing multiple clustering results of bioinformatics data

    Science.gov (United States)

    2015-01-01

    Background Though cluster analysis has become a routine analytic task for bioinformatics research, it is still arduous for researchers to assess the quality of a clustering result. To select the best clustering method and its parameters for a dataset, researchers have to run multiple clustering algorithms and compare them. However, such a comparison task with multiple clustering results is cognitively demanding and laborious. Results In this paper, we present XCluSim, a visual analytics tool that enables users to interactively compare multiple clustering results based on the Visual Information Seeking Mantra. We build a taxonomy for categorizing existing techniques of clustering results visualization in terms of the Gestalt principles of grouping. Using the taxonomy, we choose the most appropriate interactive visualizations for presenting individual clustering results from different types of clustering algorithms. The efficacy of XCluSim is shown through case studies with a bioinformatician. Conclusions Compared to other relevant tools, XCluSim enables users to compare multiple clustering results in a more scalable manner. Moreover, XCluSim supports diverse clustering algorithms and dedicated visualizations and interactions for different types of clustering results, allowing more effective exploration of details on demand. Through case studies with a bioinformatics researcher, we received positive feedback on the functionalities of XCluSim, including its ability to help identify stably clustered items across multiple clustering results. PMID:26328893
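
    The pairwise comparisons that XCluSim presents visually can also be quantified with an agreement score such as the adjusted Rand index. The sketch below runs three clusterings on synthetic data and prints their pairwise agreement using scikit-learn; it illustrates the comparison task only and is not XCluSim code.

```python
# Sketch of pairwise agreement between clustering results (not XCluSim code).
from sklearn.cluster import AgglomerativeClustering, KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import adjusted_rand_score

X, _ = make_blobs(n_samples=300, centers=4, random_state=0)   # synthetic data
results = {
    "kmeans_k3": KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X),
    "kmeans_k4": KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X),
    "ward_k4": AgglomerativeClustering(n_clusters=4).fit_predict(X),
}

names = list(results)
for i, a in enumerate(names):
    for b in names[i + 1:]:
        print(f"{a} vs {b}: ARI = {adjusted_rand_score(results[a], results[b]):.2f}")
```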

  11. Using the Visualization Software Evaluation Rubric to explore six freely available visualization applications

    Directory of Open Access Journals (Sweden)

    Thea P. Atwood

    2018-01-01

    Full Text Available Objective: As a variety of visualization tools become available to librarians and researchers, it can be challenging to select a tool that is robust and flexible enough to provide the desired visualization outcomes for work or personal use. In this article, the authors provide guidance on several freely available tools, and offer a rubric for use in evaluating visualization tools. Methods: A rubric was generated to assist the authors in assessing the selected six freely available visualization tools. Each author analyzed three tools, and discussed the differences, similarities, challenges, and successes of each. Results: Of the six visualization tools, two tools emerged with high marks. The authors found that the rubric was a successful evaluation tool, and facilitated discussion on the strengths and weaknesses of the six selected visualization software packages. Conclusions: Of the six different visualization tools analyzed, all had different functions and features available to best meet the needs of users. In a situation where there are many options available, and it is difficult at first glance to determine a clear winner, a rubric can be useful in providing a method to quickly assess and communicate the effectiveness of a tool.

  12. Tools and Methods for Visualization of Mesoscale Ocean Eddies

    Science.gov (United States)

    Bemis, K. G.; Liu, L.; Silver, D.; Kang, D.; Curchitser, E.

    2017-12-01

    Mesoscale ocean eddies form in the Gulf Stream and transport heat and nutrients across the ocean basin. The internal structure of these three-dimensional eddies and the kinematics with which they move are critical to a full understanding of their transport capacity. A series of visualization tools have been developed to extract, characterize, and track ocean eddies from 3D modeling results, to visually show the ocean eddy story by applying various illustrative visualization techniques, and to interactively view results stored on a server from a conventional browser. In this work, we apply a feature-based method to track instances of ocean eddies through the time steps of a high-resolution multidecadal regional ocean model and generate a series of eddy paths which reflect the life cycle of individual eddy instances. The basic method uses the Okubo-Weiss parameter to define eddy cores but could be adapted to alternative specifications of an eddy. Stored results include pixel-lists for each eddy instance, tracking metadata for eddy paths, and physical and geometric properties. In the simplest view, isosurfaces are used to display eddies along an eddy path. Individual eddies can then be selected and viewed independently or an eddy path can be viewed in the context of all eddy paths (longer than a specified duration) and the ocean basin. To tell the story of mesoscale ocean eddies, we combined illustrative visualization techniques, including visual effectiveness enhancement, focus+context, and smart visibility, with the extracted volume features to explore eddy characteristics at multiple scales from ocean basin to individual eddy. An evaluation by domain experts indicates that combining our feature-based techniques with illustrative visualization techniques provides an insight into the role eddies play in ocean circulation. A web-based GUI is under development to facilitate easy viewing of stored results. The GUI provides the user control to choose amongst available
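
    The eddy-core definition mentioned above rests on the Okubo-Weiss parameter, W = s_n² + s_s² − ω², where s_n and s_s are the normal and shear strain and ω the relative vorticity; strongly negative W marks vortex-dominated (core) regions. The sketch below evaluates W for a synthetic Gaussian vortex on a regular grid; the grid spacing, vortex, and threshold are illustrative and are not taken from the model output used in the study.

```python
# Sketch of the Okubo-Weiss criterion for flagging eddy cores (illustrative only):
# W = normal_strain**2 + shear_strain**2 - vorticity**2, strongly negative inside vortices.
import numpy as np

def okubo_weiss(u, v, dx, dy):
    # u, v are 2-D velocity components with shape (ny, nx) on a regular grid.
    dudx, dudy = np.gradient(u, dx, axis=1), np.gradient(u, dy, axis=0)
    dvdx, dvdy = np.gradient(v, dx, axis=1), np.gradient(v, dy, axis=0)
    normal_strain = dudx - dvdy
    shear_strain = dvdx + dudy
    vorticity = dvdx - dudy
    return normal_strain**2 + shear_strain**2 - vorticity**2

# Synthetic Gaussian vortex on a 2 km grid (made-up numbers).
spacing, L = 2e3, 50e3                                   # metres
y, x = np.mgrid[-100:100, -100:100] * spacing
psi = 1e4 * np.exp(-(x**2 + y**2) / (2 * L**2))          # stream function
u = -np.gradient(psi, spacing, axis=0)                   # u = -dpsi/dy
v = np.gradient(psi, spacing, axis=1)                    # v =  dpsi/dx
W = okubo_weiss(u, v, spacing, spacing)

core = W < -0.2 * W.std()                                # a common heuristic threshold
print("grid cells flagged as eddy core:", int(core.sum()))
```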

  13. Decon2LS: An open-source software package for automated processing and visualization of high resolution mass spectrometry data

    Directory of Open Access Journals (Sweden)

    Anderson Gordon A

    2009-03-01

    Full Text Available Abstract Background Data generated from liquid chromatography coupled to high-resolution mass spectrometry (LC-MS)-based studies of a biological sample can contain large amounts of biologically significant information in the form of proteins, peptides, and metabolites. Interpreting this data involves inferring the masses and abundances of biomolecules injected into the instrument. Because of the inherent complexity of mass spectral patterns produced by these biomolecules, the analysis is significantly enhanced by using visualization capabilities to inspect and confirm results. In this paper we describe Decon2LS, an open-source software package for automated processing and visualization of high-resolution MS data. Drawing extensively on algorithms developed over the last ten years for ICR2LS, Decon2LS packages the algorithms as a rich set of modular, reusable processing classes for performing diverse functions such as reading raw data, routine peak finding, theoretical isotope distribution modelling, and deisotoping. Because the source code is openly available, these functionalities can now be used to build derivative applications in a relatively fast manner. In addition, Decon2LS provides an extensive set of visualization tools, such as high performance chart controls. Results With a variety of options that include peak processing, deisotoping, isotope composition, etc., Decon2LS supports processing of multiple raw data formats. Deisotoping can be performed on an individual scan, an individual dataset, or on multiple datasets using batch processing. Other processing options include creating a two dimensional view of mass and liquid chromatography (LC) elution time features, generating spectrum files for tandem MS data, creating total intensity chromatograms, and visualizing theoretical peptide profiles. Application of Decon2LS to deisotope different datasets obtained across different instruments yielded a high number of features that can be used to

  14. Novel Scientific Visualization Interfaces for Interactive Information Visualization and Sharing

    Science.gov (United States)

    Demir, I.; Krajewski, W. F.

    2012-12-01

    rainfall conditions are available in the IFIS. 2D and 3D interactive visualizations in the IFIS make the data more understandable to the general public. Users are able to filter data sources for their communities and selected rivers. The data and information on IFIS are also accessible through web services and mobile applications. The IFIS is optimized for various browsers and screen sizes to provide access through multiple platforms, including tablets and mobile devices. Multiple view modes in the IFIS accommodate different user types, from the general public to researchers and decision makers, by providing different levels of tools and detail. River view mode allows users to visualize data from multiple IFC bridge sensors and USGS stream gauges to follow flooding conditions along a river. The IFIS will help communities make better-informed decisions on the occurrence of floods, and will alert communities in advance to help minimize flood damage.

  15. PACOM: A Versatile Tool for Integrating, Filtering, Visualizing, and Comparing Multiple Large Mass Spectrometry Proteomics Data Sets.

    Science.gov (United States)

    Martínez-Bartolomé, Salvador; Medina-Aunon, J Alberto; López-García, Miguel Ángel; González-Tejedo, Carmen; Prieto, Gorka; Navajas, Rosana; Salazar-Donate, Emilio; Fernández-Costa, Carolina; Yates, John R; Albar, Juan Pablo

    2018-04-06

    Mass-spectrometry-based proteomics has evolved into a high-throughput technology in which numerous large-scale data sets are generated from diverse analytical platforms. Furthermore, several scientific journals and funding agencies have emphasized the storage of proteomics data in public repositories to facilitate its evaluation, inspection, and reanalysis. (1) As a consequence, public proteomics data repositories are growing rapidly. However, tools are needed to integrate multiple proteomics data sets to compare different experimental features or to perform quality control analysis. Here, we present a new Java stand-alone tool, Proteomics Assay COMparator (PACOM), that is able to import, combine, and simultaneously compare numerous proteomics experiments to check the integrity of the proteomic data as well as verify data quality. With PACOM, the user can detect sources of errors that may have been introduced in any step of a proteomics workflow and that influence the final results. Data sets can be easily compared and integrated, and data quality and reproducibility can be visually assessed through a rich set of graphical representations of proteomics data features as well as a wide variety of data filters. Its flexibility and easy-to-use interface make PACOM a unique tool for daily use in a proteomics laboratory. PACOM is available at https://github.com/smdb21/pacom.

  16. Development and validation of an open source quantification tool for DSC-MRI studies.

    Science.gov (United States)

    Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J

    2015-03-01

    This work presents the development of an open source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open source tools implemented on open platforms to allow external developers to implement their own quantification methods easily and without the need of paying for a development license. This quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, in such a way that the addition of new methods can be done without breaking any of the existing functionalities. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package. The resulting perfusion parameters were then compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma-fitting is used, an excellent agreement with the tool used as a gold-standard was obtained (R² > 0.8 and values are within 95% CI limits in Bland-Altman plots). An open source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated using a clinical software package. It works as an ImageJ plugin and the source code has been published with an open source license. Copyright © 2015 Elsevier Ltd. All rights reserved.
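
    The agreement analysis reported above (R² and Bland-Altman limits) is easy to reproduce outside the plugin. The Python sketch below, which is not part of the ImageJ plugin, computes the bias and the 95% limits of agreement for two hypothetical sets of perfusion values.

        # Minimal sketch of a Bland-Altman agreement check between two tools'
        # perfusion estimates; the arrays below are hypothetical placeholders.
        import numpy as np

        def bland_altman(a, b):
            diff = np.asarray(a, float) - np.asarray(b, float)
            bias = diff.mean()
            sd = diff.std(ddof=1)
            return bias, (bias - 1.96 * sd, bias + 1.96 * sd)  # bias, 95% limits

        plugin_values = [4.1, 3.8, 5.2, 4.7, 6.0, 3.9, 5.5]    # e.g. values from the plugin
        clinical_values = [4.0, 3.9, 5.0, 4.9, 5.8, 4.1, 5.6]  # same cases, clinical package
        bias, limits = bland_altman(plugin_values, clinical_values)
        print(f"bias = {bias:.3f}, 95% limits of agreement = {limits}")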

  17. E-Sourcing platforms as reasonable marketing tools for suppliers

    OpenAIRE

    Göbl, Martin; Greiter, Thomas

    2014-01-01

    Research questions: E-sourcing platforms often offer purchasing organisations easy access to a high number of relevant suppliers, their goods and services, and the according prices. For the suppliers, e-sourcing platforms are a good and easy possibility to present their products and services to the relevant buyers and to get in contact with potential customers. The subject of this research is the question of whether e-sourcing platforms are also a reasonable marketing tool for suppliers in or...

  18. GO(vis), a gene ontology visualization tool based on multi-dimensional values.

    Science.gov (United States)

    Ning, Zi; Jiang, Zhenran

    2010-05-01

    Most gene product similarity measurements concentrate on the information content of Gene Ontology (GO) terms or use a path-based similarity between GO terms, which may ignore other important information contained in the structure of the ontology. In our study, we integrate different GO similarity measures to analyze the functional relationship of genes and gene products with a new triangle-based visualization tool called GO(Vis). The purpose of this tool is to demonstrate the effect of three important information factors when measuring the similarity between gene products. One advantage of this tool is that its importance ratio can be adjusted to meet different measuring requirements according to the biological knowledge of each factor. The experimental results demonstrate that GO(Vis) can effectively display diagrams of the functional relationships between gene products.

  19. IIS--Integrated Interactome System: a web-based platform for the annotation, analysis and visualization of protein-metabolite-gene-drug interactions by integrating a variety of data sources and tools.

    Science.gov (United States)

    Carazzolle, Marcelo Falsarella; de Carvalho, Lucas Miguel; Slepicka, Hugo Henrique; Vidal, Ramon Oliveira; Pereira, Gonçalo Amarante Guimarães; Kobarg, Jörg; Meirelles, Gabriela Vaz

    2014-01-01

    High-throughput screening of physical, genetic and chemical-genetic interactions brings important perspectives in the Systems Biology field, as the analysis of these interactions provides new insights into protein/gene function, cellular metabolic variations and the validation of therapeutic targets and drug design. However, such analysis depends on a pipeline connecting different tools that can automatically integrate data from diverse sources and result in a more comprehensive dataset that can be properly interpreted. We describe here the Integrated Interactome System (IIS), an integrative platform with a web-based interface for the annotation, analysis and visualization of the interaction profiles of proteins/genes, metabolites and drugs of interest. IIS works in four connected modules: (i) Submission module, which receives raw data derived from Sanger sequencing (e.g. two-hybrid system); (ii) Search module, which enables the user to search for the processed reads to be assembled into contigs/singlets, or for lists of proteins/genes, metabolites and drugs of interest, and add them to the project; (iii) Annotation module, which assigns annotations from several databases for the contigs/singlets or lists of proteins/genes, generating tables with automatic annotation that can be manually curated; and (iv) Interactome module, which maps the contigs/singlets or the uploaded lists to entries in our integrated database, building networks that gather novel identified interactions, protein and metabolite expression/concentration levels, subcellular localization and computed topological metrics, GO biological processes and KEGG pathways enrichment. This module generates a XGMML file that can be imported into Cytoscape or be visualized directly on the web. We have developed IIS by the integration of diverse databases following the need of appropriate tools for a systematic analysis of physical, genetic and chemical-genetic interactions. IIS was validated with yeast two
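
    The Interactome module described above exports networks as XGMML for Cytoscape. The Python sketch below illustrates the same general idea by assembling a small interaction network and writing it out for Cytoscape; it is not IIS code, the interaction list is hypothetical, and GraphML (which Cytoscape can also import) is used as a stand-in because networkx has no XGMML writer.

        # Minimal sketch: build a small interaction network and export it in a
        # format Cytoscape can import. Edges and attributes are hypothetical.
        import networkx as nx

        interactions = [
            ("TP53", "MDM2", "physical"),
            ("TP53", "CDKN1A", "genetic"),
            ("MDM2", "Nutlin-3", "chemical-genetic"),
        ]

        g = nx.Graph()
        for a, b, kind in interactions:
            g.add_edge(a, b, interaction=kind)
        nx.set_node_attributes(g, {"Nutlin-3": "drug"}, name="type")

        nx.write_graphml(g, "demo_interactome.graphml")
        print(nx.number_of_nodes(g), "nodes,", nx.number_of_edges(g), "edges")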

  20. Sinistrals are rarely ‘right’: evidence from tool-affordance processing in visual half-field paradigms

    Directory of Open Access Journals (Sweden)

    Bartosz Michałowski

    2015-03-01

    Full Text Available Although current neuroscience and behavioral studies provide substantial understanding of tool representations (e.g., the processing of tool-related affordances) in the human brain, most of this knowledge is limited to right-handed individuals with typical organization of cognitive and manual skills. Therefore, any insights from these lines of research may be of little value in rehabilitation of patients with atypical laterality of praxis and/or hand dominance. To fill this gap, we tested perceptual processing of man-made objects in 18 healthy left-handers who were likely to show greater incidence of right-sided or bilateral (atypical) lateralization of functions. In the two experiments reported here, participants performed a tool vs. non-tool categorization task. In Exp. 1, target and distracter objects were presented for 200 ms in the left (LVF) or right (RVF) visual field, followed by 200 ms masks. In Exp. 2, the centrally presented targets were preceded by masked primes of 35 ms duration, again presented in the LVF or RVF. Based on results from both studies, i.e., response times to correctly discriminated stimuli irrespective of their category, participants were divided into two groups showing privileged processing in either the left (N = 9) or right (N = 9) visual field. In Exp. 1, only individuals with RVF advantage showed significantly faster categorization of tools in their dominant visual field, whereas those with LVF advantage revealed merely a trend towards such an effect. In Exp. 2, when targets were preceded by identical primes, the ‘atypical’ group showed significantly facilitated categorization of non-tools, whereas the ‘typical’ group demonstrated a trend towards faster categorization of tools. These results indicate that in subjects with atypically organized cognitive skills, tool-related processes are not just mirror reversed. Thus, our outcomes call for particular caution in neurorehabilitation directed at left

  1. 'tomo_display' and 'vol_tools': IDL VM Packages for Tomography Data Reconstruction, Processing, and Visualization

    Science.gov (United States)

    Rivers, M. L.; Gualda, G. A.

    2009-05-01

    One of the challenges in tomography is the availability of suitable software for image processing and analysis in 3D. We present here 'tomo_display' and 'vol_tools', two packages created in IDL that enable reconstruction, processing, and visualization of tomographic data. They complement in many ways the capabilities offered by Blob3D (Ketcham 2005 - Geosphere, 1: 32-41, DOI: 10.1130/GES00001.1) and, in combination, allow users without programming knowledge to perform all steps necessary to obtain qualitative and quantitative information using tomographic data. The package 'tomo_display' was created and is maintained by Mark Rivers. It allows the user to: (1) preprocess and reconstruct parallel beam tomographic data, including removal of anomalous pixels, ring artifact reduction, and automated determination of the rotation center, (2) visualization of both raw and reconstructed data, either as individual frames, or as a series of sequential frames. The package 'vol_tools' consists of a series of small programs created and maintained by Guilherme Gualda to perform specific tasks not included in other packages. Existing modules include simple tools for cropping volumes, generating histograms of intensity, sample volume measurement (useful for porous samples like pumice), and computation of volume differences (for differential absorption tomography). The module 'vol_animate' can be used to generate 3D animations using rendered isosurfaces around objects. Both packages use the same NetCDF format '.volume' files created using code written by Mark Rivers. Currently, only 16-bit integer volumes are created and read by the packages, but floating point and 8-bit data can easily be stored in the NetCDF format as well. A simple GUI to convert sequences of tiffs into '.volume' files is available within 'vol_tools'. Both 'tomo_display' and 'vol_tools' include options to (1) generate onscreen output that allows for dynamic visualization in 3D, (2) save sequences of tiffs to disk
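
    Since both packages read volumes stored as NetCDF '.volume' files, the cropping and intensity-histogram operations described above can be sketched in a few lines of Python. The sketch below is not part of 'tomo_display' or 'vol_tools'; the file name, the variable name "VOLUME", and the crop window are assumptions, since the exact internal layout of the '.volume' format is not given here.

        # Minimal sketch: crop a sub-volume and build an intensity histogram from
        # a NetCDF tomography volume. The variable name "VOLUME" is an assumption.
        import numpy as np
        from netCDF4 import Dataset

        with Dataset("sample.volume", "r") as nc:
            vol = np.asarray(nc.variables["VOLUME"][:])  # assumed 3-D 16-bit array

        cropped = vol[100:400, 50:450, 50:450]           # z, y, x crop window
        counts, edges = np.histogram(cropped, bins=256)
        print("cropped shape:", cropped.shape)
        print("modal intensity bin starts at:", edges[np.argmax(counts)])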

  2. The SCEC/UseIT Intern Program: Creating Open-Source Visualization Software Using Diverse Resources

    Science.gov (United States)

    Francoeur, H.; Callaghan, S.; Perry, S.; Jordan, T.

    2004-12-01

    The Southern California Earthquake Center undergraduate IT intern program (SCEC UseIT) conducts IT research to benefit collaborative earth science research. Through this program, interns have developed real-time, interactive, 3D visualization software using open-source tools. Dubbed LA3D, a distribution of this software is now in use by the seismic community. LA3D enables the user to interactively view Southern California datasets and models of importance to earthquake scientists, such as faults, earthquakes, fault blocks, digital elevation models, and seismic hazard maps. LA3D is now being extended to support visualizations anywhere on the planet. The new software, called SCEC-VIDEO (Virtual Interactive Display of Earth Objects), makes use of a modular, plugin-based software architecture which supports easy development and integration of new data sets. Currently SCEC-VIDEO is in beta testing, with a full open-source release slated for the future. Both LA3D and SCEC-VIDEO were developed using a wide variety of software technologies. These, which included relational databases, web services, software management technologies, and 3-D graphics in Java, were necessary to integrate the heterogeneous array of data sources which comprise our software. Currently the interns are working to integrate new technologies and larger data sets to increase software functionality and value. In addition, both LA3D and SCEC-VIDEO allow the user to script and create movies. Thus program interns with computer science backgrounds have been writing software while interns with other interests, such as cinema, geology, and education, have been making movies that have proved of great use in scientific talks, media interviews, and education. Thus, SCEC UseIT incorporates a wide variety of scientific and human resources to create products of value to the scientific and outreach communities. The program plans to continue with its interdisciplinary approach, increasing the relevance of the

  3. Visual Climate Knowledge Discovery within a Grid Environment

    Science.gov (United States)

    Heitzler, Magnus; Kiertscher, Simon; Lang, Ulrich; Nocke, Thomas; Wahnes, Jens; Winkelmann, Volker

    2013-04-01

    The C3Grid-INAD project aims to provide a common grid infrastructure for the climate science community to improve access to climate-related data and domain workflows via the Internet. To make sense of the heterogeneous, often large-sized or even dynamically generated and modified files originating from C3Grid, highly flexible and user-friendly analysis software is needed to run on different high-performance computing nodes within the grid environment, when requested by a user. Because visual analysis tools directly address human visual perception and are therefore considered highly intuitive, two distinct visualization workflows have been integrated in C3Grid-INAD, targeting different application backgrounds. First, a GrADS-based workflow enables the ad-hoc visualization of selected datasets with respect to data source, temporal and spatial extent, as well as variables of interest. Being low in resource demands, this workflow allows users to gain fast insights through basic spatial visualization. For more advanced visual analysis purposes, a second workflow enables the user to start a visualization session via Virtual Network Computing (VNC) and VirtualGL to access high-performance computing nodes on which a wide variety of different visual analysis tools are provided. These are made available using the easy-to-use software system SimEnvVis. Considering metadata as well as user preferences and analysis goals, SimEnvVis evaluates the attached tools and launches the selected visual analysis tool by providing a dynamically parameterized template. This approach facilitates the selection of the most suitable tools, and at the same time eases the process of familiarization with them. Because of a higher demand for computational resources, SimEnvVis sessions are restricted to a smaller set of users at a time. This architecture enables climate scientists not only to remotely access, but also to visually analyze highly heterogeneous data originating from C3

  4. Visual Representation in GENESIS as a tool for Physical Modeling, Sound Synthesis and Musical Composition

    OpenAIRE

    Villeneuve, Jérôme; Cadoz, Claude; Castagné, Nicolas

    2015-01-01

    The motivation of this paper is to highlight the importance of visual representations for artists when modeling and simulating mass-interaction physical networks in the context of sound synthesis and musical composition. GENESIS is a musician-oriented software environment for sound synthesis and musical composition. However, despite this orientation, a substantial amount of effort has been put into building a rich variety of tools based on static or dynamic visual representations of models an...

  5. A new tool for virtual scientific and autostereoscopic visualization of EAST

    International Nuclear Information System (INIS)

    Li, Dan; Xiao, B.J.; Xia, J.Y.; Wang, K.R.; Chen, S.L.; Luo, W.L.

    2016-01-01

    Highlights: • The 3D effect of the virtual EAST has been improved and data visualization has been realized in the ASEAST system. • Interaction behavior is provided so that users can retrieve information from the database. • The system integrates data acquisition, data visualization and model visualization. • Qt libraries are adopted to realize a cross-platform and impressive graphical interface. • In order to manage the models, a web-based model manager system is constructed. - Abstract: The Experimental Advanced Superconducting Tokamak (EAST) Device began operation in 2006. EAST visualization work has received more and more attention for simulating its running state and inner structure. The VEAST system had been developed to display the 3D model of the EAST facility and some diagnostic data based on Java3D. Compared with the VEAST system, a new system named autostereoscopic scientific EAST (ASEAST) has been developed to improve the 3D effect of the virtual EAST and visualize the experimental data, using a C/S (Client/Server) structure in combination with OpenGL, the open-source 3D computer graphics and visualization toolkit VTK (Visualization Toolkit), and the Qt5 libraries for the graphical user interface (GUI). ASEAST provides access to EAST information and physical properties. In addition, as a general system, ASEAST supports a wide variety of 3D formats. The visualization result can be output in the format corresponding to the input. In order to improve the rendering speed, we used the classic QEM algorithm to simplify the models in a preprocessing stage. A user survey confirmed that the system achieves a good 3D effect.
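
    The QEM simplification step mentioned above can be illustrated with the Python bindings of VTK, the same toolkit ASEAST builds on. The sketch below is an illustrative stand-in rather than ASEAST code, and the STL file names are hypothetical.

        # Minimal sketch: quadric-error-metric (QEM) mesh simplification with VTK,
        # removing ~70% of the triangles before rendering. File names are hypothetical.
        import vtk

        reader = vtk.vtkSTLReader()
        reader.SetFileName("east_component.stl")

        decimate = vtk.vtkQuadricDecimation()
        decimate.SetInputConnection(reader.GetOutputPort())
        decimate.SetTargetReduction(0.7)  # keep ~30% of the original triangles
        decimate.Update()

        writer = vtk.vtkSTLWriter()
        writer.SetFileName("east_component_simplified.stl")
        writer.SetInputConnection(decimate.GetOutputPort())
        writer.Write()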

  6. A new tool for virtual scientific and autostereoscopic visualization of EAST

    Energy Technology Data Exchange (ETDEWEB)

    Li, Dan, E-mail: lidan@ipp.ac.cn [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei, Anhui (China); Xiao, B.J.; Xia, J.Y.; Wang, K.R. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei, Anhui (China); University of Science and Technology of China, Hefei, Anhui (China); Chen, S.L. [Institute of Plasma Physics, Chinese Academy of Sciences, Hefei, Anhui (China); Luo, W.L. [709th Research lnstitute, China Shipbuilding lndustry Corporation, Wuhan, Hubei (China)

    2016-11-15

    Highlights: • The 3D effect of the virtual EAST has been improved and data visualization has been realized in the ASEAST system. • Interaction behavior is provided so that users can retrieve information from the database. • The system integrates data acquisition, data visualization and model visualization. • Qt libraries are adopted to realize a cross-platform and impressive graphical interface. • In order to manage the models, a web-based model manager system is constructed. - Abstract: The Experimental Advanced Superconducting Tokamak (EAST) Device began operation in 2006. EAST visualization work has received more and more attention for simulating its running state and inner structure. The VEAST system had been developed to display the 3D model of the EAST facility and some diagnostic data based on Java3D. Compared with the VEAST system, a new system named autostereoscopic scientific EAST (ASEAST) has been developed to improve the 3D effect of the virtual EAST and visualize the experimental data, using a C/S (Client/Server) structure in combination with OpenGL, the open-source 3D computer graphics and visualization toolkit VTK (Visualization Toolkit), and the Qt5 libraries for the graphical user interface (GUI). ASEAST provides access to EAST information and physical properties. In addition, as a general system, ASEAST supports a wide variety of 3D formats. The visualization result can be output in the format corresponding to the input. In order to improve the rendering speed, we used the classic QEM algorithm to simplify the models in a preprocessing stage. A user survey confirmed that the system achieves a good 3D effect.

  7. Visual-haptic integration with pliers and tongs: signal ‘weights’ take account of changes in haptic sensitivity caused by different tools

    Directory of Open Access Journals (Sweden)

    Chie Takahashi

    2014-02-01

    Full Text Available When we hold an object while looking at it, estimates from visual and haptic cues to size are combined in a statistically optimal fashion, whereby the ‘weight’ given to each signal reflects their relative reliabilities. This allows object properties to be estimated more precisely than would otherwise be possible. Tools such as pliers and tongs systematically perturb the mapping between object size and the hand opening. This could complicate visual-haptic integration because it may alter the reliability of the haptic signal, thereby disrupting the determination of appropriate signal weights. To investigate this we first measured the reliability of haptic size estimates made with virtual pliers-like tools (created using a stereoscopic display and force-feedback robots) with different ‘gains’ between hand opening and object size. Haptic reliability in tool use was straightforwardly determined by a combination of sensitivity to changes in hand opening and the effects of tool geometry. The precise pattern of sensitivity to hand opening, which violated Weber’s law, meant that haptic reliability changed with tool gain. We then examined whether the visuo-motor system accounts for these reliability changes. We measured the weight given to visual and haptic stimuli when both were available, again with different tool gains, by measuring the perceived size of stimuli in which visual and haptic sizes were varied independently. The weight given to each sensory cue changed with tool gain in a manner that closely resembled the predictions of optimal sensory integration. The results are consistent with the idea that different tool geometries are modelled by the brain, allowing it to calculate not only the distal properties of objects felt with tools, but also the certainty with which those properties are known. These findings highlight the flexibility of human sensory integration and tool-use, and potentially provide an approach for optimising the
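
    The "statistically optimal" weighting referred to above is reliability (inverse-variance) weighting: each cue's weight is its reliability divided by the summed reliabilities, and the fused estimate is more precise than either cue alone. The Python sketch below illustrates the arithmetic with hypothetical noise levels and size estimates; it is not the authors' analysis code.

        # Minimal sketch of reliability-weighted (statistically optimal) cue fusion.
        # The sigma values and size estimates are hypothetical.
        sigma_vision, sigma_haptic = 2.0, 4.0    # single-cue noise levels (mm)
        r_v, r_h = 1 / sigma_vision**2, 1 / sigma_haptic**2
        w_v, w_h = r_v / (r_v + r_h), r_h / (r_v + r_h)

        size_vision, size_haptic = 50.0, 54.0    # single-cue size estimates (mm)
        fused_size = w_v * size_vision + w_h * size_haptic
        fused_sigma = (1 / (r_v + r_h)) ** 0.5   # smaller than either sigma alone
        print(w_v, w_h, fused_size, fused_sigma)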

  8. ggCyto: Next Generation Open-Source Visualization Software for Cytometry.

    Science.gov (United States)

    Van, Phu; Jiang, Wenxin; Gottardo, Raphael; Finak, Greg

    2018-06-01

    Open source software for computational cytometry has gained in popularity over the past few years. Efforts such as FlowCAP, the Lyoplate and Euroflow projects have highlighted the importance of efforts to standardize both experimental and computational aspects of cytometry data analysis. The R/BioConductor platform hosts the largest collection of open source cytometry software covering all aspects of data analysis and providing infrastructure to represent and analyze cytometry data with all relevant experimental, gating, and cell population annotations, enabling fully reproducible data analysis. Data visualization frameworks to support this infrastructure have lagged behind. ggCyto is a new open-source BioConductor software package for cytometry data visualization built on ggplot2 that enables ggplot-like functionality with the core BioConductor flow cytometry data structures. Amongst its features are the ability to transform data and axes on-the-fly using cytometry-specific transformations, plot faceting by experimental meta-data variables, and partial matching of channel, marker, and cell population names to the contents of the BioConductor cytometry data structures. We demonstrate the salient features of the package using publicly available cytometry data with complete reproducible examples in a supplementary material vignette. https://bioconductor.org/packages/devel/bioc/html/ggcyto.html. gfinak@fredhutch.org. Supplementary data are available at Bioinformatics online and at http://rglab.org/ggcyto/.

  9. MARs Tools for Interactive ANalysis (MARTIAN): Google Maps Tools for Visual Exploration of Geophysical Modeling on Mars

    Science.gov (United States)

    Dimitrova, L. L.; Haines, M.; Holt, W. E.; Schultz, R. A.; Richard, G.; Haines, A. J.

    2006-12-01

    Interactive maps of surface-breaking faults and stress models on Mars provide important tools to engage undergraduate students, educators, and scientists with current geological and geophysical research. We have developed a map based on the Google Maps API -- an Internet-based tool combining DHTML and AJAX -- which allows very large maps to be viewed over the World Wide Web. Typically, small portions of the maps are downloaded as needed, rather than the entire image at once. This set-up enables relatively fast access for users with low bandwidth. Furthermore, Google Maps provides an extensible interactive interface making it ideal for visualizing multiple data sets at the user's choice. The Google Maps API works primarily with data referenced to latitudes and longitudes, which are then mapped in the Mercator projection only. We have developed utilities for general cylindrical coordinate systems by converting these coordinates into the equivalent Mercator projection before including them on the map. The MARTIAN project is available at http://rock.geo.sunysb.edu/~holt/Mars/MARTIAN/. We begin with an introduction to the Martian surface using a topography model. Faults from several datasets are classified by type (extension vs. compression) and by time epoch. Deviatoric stresses due to gravitational potential energy differences, calculated from the topography and crustal thickness, can be overlain. Several quantitative measures for the fit of the stress field to the faults are also included. We provide introductory text and exercises spanning a range of topics: how are faults identified, what stress is and how it relates to faults, what gravitational potential energy is and how variations in it produce stress, how the models are created, and how these models can be evaluated and interpreted. The MARTIAN tool is used at Stony Brook University in GEO 310: Introduction to Geophysics, a class geared towards junior and senior geosciences majors. Although this project is in its
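
    The conversion step described above (latitude/longitude mapped into the equivalent Mercator projection) amounts to the standard spherical Mercator forward mapping. The Python sketch below is not the MARTIAN utilities themselves; the Mars radius value is used only to give the projected coordinates a physical scale, and the sample point is arbitrary.

        # Minimal sketch of a spherical Mercator forward projection, as needed
        # before overlaying lat/lon data on a Google-Maps-style tile grid.
        import math

        MARS_RADIUS_M = 3_389_500.0  # mean radius, for scale only

        def mercator(lon_deg, lat_deg, radius=MARS_RADIUS_M):
            lon, lat = math.radians(lon_deg), math.radians(lat_deg)
            x = radius * lon
            y = radius * math.log(math.tan(math.pi / 4 + lat / 2))
            return x, y

        print(mercator(287.0, -5.0))  # an arbitrary illustrative point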

  10. Using Visual Simulation Tools And Learning Outcomes-Based Curriculum To Help Transportation Engineering Students And Practitioners To Better Understand And Design Traffic Signal Control Systems

    Science.gov (United States)

    2012-06-01

    The use of visual simulation tools to convey complex concepts has become valuable in education as well as in research. This report describes a project that developed curriculum and visualization tools to train transportation engineering studen...

  11. OmicsNet: a web-based tool for creation and visual analysis of biological networks in 3D space.

    Science.gov (United States)

    Zhou, Guangyan; Xia, Jianguo

    2018-06-07

    Biological networks play increasingly important roles in omics data integration and systems biology. Over the past decade, many excellent tools have been developed to support creation, analysis and visualization of biological networks. However, important limitations remain: most tools are standalone programs, the majority of them focus on protein-protein interaction (PPI) or metabolic networks, and visualizations often suffer from 'hairball' effects when networks become large. To help address these limitations, we developed OmicsNet - a novel web-based tool that allows users to easily create different types of molecular interaction networks and visually explore them in a three-dimensional (3D) space. Users can upload one or multiple lists of molecules of interest (genes/proteins, microRNAs, transcription factors or metabolites) to create and merge different types of biological networks. The 3D network visualization system was implemented using the powerful Web Graphics Library (WebGL) technology that works natively in most major browsers. OmicsNet supports force-directed layout, multi-layered perspective layout, as well as spherical layout to help visualize and navigate complex networks. A rich set of functions have been implemented to allow users to perform coloring, shading, topology analysis, and enrichment analysis. OmicsNet is freely available at http://www.omicsnet.ca.
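
    The force-directed layout mentioned above can be sketched outside the browser as well. The Python example below computes a three-dimensional spring (force-directed) layout with networkx for a tiny hypothetical gene-metabolite network; OmicsNet itself performs this in WebGL, so this is only an illustration of the layout idea.

        # Minimal sketch: a 3-D force-directed (spring) layout for a small,
        # hypothetical gene-metabolite interaction network.
        import networkx as nx

        edges = [("HK2", "glucose-6-phosphate"), ("HK2", "MYC"),
                 ("MYC", "LDHA"), ("LDHA", "lactate")]
        g = nx.Graph(edges)

        pos3d = nx.spring_layout(g, dim=3, seed=42)  # node -> (x, y, z)
        for node, (x, y, z) in pos3d.items():
            print(f"{node:22s} {x:+.3f} {y:+.3f} {z:+.3f}")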

  12. Distributed visualization of gridded geophysical data: the Carbon Data Explorer, version 0.2.3

    Science.gov (United States)

    Endsley, K. A.; Billmire, M. G.

    2016-01-01

    Due to the proliferation of geophysical models, particularly climate models, the increasing resolution of their spatiotemporal estimates of Earth system processes, and the desire to easily share results with collaborators, there is a genuine need for tools to manage, aggregate, visualize, and share data sets. We present a new, web-based software tool - the Carbon Data Explorer - that provides these capabilities for gridded geophysical data sets. While originally developed for visualizing carbon flux, this tool can accommodate any time-varying, spatially explicit scientific data set, particularly NASA Earth system science level III products. In addition, the tool's open-source licensing and web presence facilitate distributed scientific visualization, comparison with other data sets and uncertainty estimates, and data publishing and distribution.

  13. Visualization: A Tool for Enhancing Students' Concept Images of Basic Object-Oriented Concepts

    Science.gov (United States)

    Cetin, Ibrahim

    2013-01-01

    The purpose of this study was twofold: to investigate students' concept images about class, object, and their relationship and to help them enhance their learning of these notions with a visualization tool. Fifty-six second-year university students participated in the study. To investigate his/her concept images, the researcher developed a survey…

  14. Visual dynamic e-module as a tool to fulfill informational needs and care continuum for diabetic patients

    Directory of Open Access Journals (Sweden)

    Mohan Shinde

    2015-01-01

    Full Text Available Introduction: Diabetes can be envisaged as a lifelong condition carrying ominous odds of multisystemic involvement over the duration of the disease. The probabilities of the occurrence of these events are influenced by the adopted lifestyle. Hence, information about the disease and lifestyle modification are vital from the perspective of prognostics. This study attempts to explore the potential of a "visual dynamic tool" for imparting knowledge and, consequently, the acumen received by diabetic patients. Objectives: To appraise the effectiveness of a constructed visual dynamic module (encompassing the various dimensions related to and affected by diabetes) by capturing the opinions, perceptions, and experiences of the diabetic patients who underwent intervention through the module. Materials and Methods: A visual e-module with dynamically imposed and animated images in the vernacular (Hindi) was prepared. This module was instituted among the diabetic patients in a logical sequence for 3 consecutive days. All the diabetic patients who underwent this intervention were interviewed in depth in order to ascertain the effectiveness of the module. These interviews were analyzed by thematic and framework analyses. Results: The visual module was perceived by the diabetic patients as an optically engaging tool for receiving, connecting, and synthesizing information about diabetes. They expressed ease in connecting with the images and described the received information as inclusive. Conclusion: Initial evidence suggests that the visual e-module is an effective and efficient tool for knowledge management in diabetes. This issue may be further explored in diverse academic and clinical settings to gather more evidence of efficacy.

  15. Open source tools and toolkits for bioinformatics: significance, and where are we?

    Science.gov (United States)

    Stajich, Jason E; Lapp, Hilmar

    2006-09-01

    This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.

  16. Autonomous Micro-Air-Vehicle Control Based on Visual Sensing for Odor Source Localization

    Directory of Open Access Journals (Sweden)

    Kenzo Kurotsuchi

    2017-07-01

    Full Text Available In this paper, we propose a novel control method for autonomous odor-source localization using visual and odor sensing by micro air vehicles (MAVs). Our method is based on biomimetics, which enables highly autonomous localization. Our method does not need any instruction signals, not even global positioning system (GPS) signals. An experimenter simply blows a whistle, and the MAV will then start to hover, seek an odor source, and keep hovering near the source. The GPS-signal-free control based on visual sensing enables indoor/underground use. Moreover, the MAV is lightweight (85 grams) and does not cause harm to others even if it accidentally falls. Experiments conducted in the real world succeeded in odor source localization using the MAV with a bio-inspired searching method. The distance error of the localization was 63 cm, more accurate than the target distance of 120 cm for individual identification. These localization experiments are a first step toward a proof of concept for a danger warning system that could enable a safer and more secure society.

  17. Development of Tool Representations in the Dorsal and Ventral Visual Object Processing Pathways

    Science.gov (United States)

    Kersey, Alyssa J.; Clark, Tyia S.; Lussier, Courtney A.; Mahon, Bradford Z.; Cantlon, Jessica F.

    2016-01-01

    Tools represent a special class of objects, because they are processed across both the dorsal and ventral visual object processing pathways. Three core regions are known to be involved in tool processing: the left posterior middle temporal gyrus, the medial fusiform gyrus (bilaterally), and the left inferior parietal lobule. A critical and relatively unexplored issue concerns whether, in development, tool preferences emerge at the same time and to a similar degree across all regions of the tool-processing network. To test this issue, we used functional magnetic resonance imaging to measure the neural amplitude, peak location, and the dispersion of tool-related neural responses in the youngest sample of children tested to date in this domain (ages 4–8 years). We show that children recruit overlapping regions of the adult tool-processing network and also exhibit similar patterns of co-activation across the network to adults. The amplitude and co-activation data show that the core components of the tool-processing network are established by age 4. Our findings on the distributions of peak location and dispersion of activation indicate that the tool network undergoes refinement between ages 4 and 8 years. PMID:26108614

  18. Visualization and interaction tools for aerial photograph mosaics

    Science.gov (United States)

    Fernandes, João Pedro; Fonseca, Alexandra; Pereira, Luís; Faria, Adriano; Figueira, Helder; Henriques, Inês; Garção, Rita; Câmara, António

    1997-05-01

    This paper describes the development of a digital spatial library based on mosaics of digital orthophotos, called Interactive Portugal, that will enable users both to retrieve geospatial information existing in the Portuguese National System for Geographic Information World Wide Web server, and to develop local databases connected to the main system. A set of navigation, interaction, and visualization tools are proposed and discussed. They include sketching, dynamic sketching, and navigation capabilities over the digital orthophotos mosaics. Main applications of this digital spatial library are pointed out and discussed, namely for education, professional, and tourism markets. Future developments are considered. These developments are related to user reactions, technological advancements, and projects that also aim at delivering and exploring digital imagery on the World Wide Web. Future capabilities for site selection and change detection are also considered.

  19. Development of a visual tool to analyze interactions in forums in an e-learning environment

    Directory of Open Access Journals (Sweden)

    Cláudio Filipe Tereso

    2016-12-01

    Full Text Available This article presents VAFAE – Forum Access Visualization on a Distance Learning Environment, a web tool that visually maps Universidade Aberta’s (UAb) students’ interaction with a course available on the e-learning platform. Raw data are extracted from the log files and then transformed into the necessary format. Next, different visualization techniques are applied with the aim of improving and streamlining the underlying information. More specifically, VAFAE aims to help teachers better understand the level and quality of the students’ interaction with the modules of the learning units in UAb’s distance learning environment.

  20. Efficient Delivery and Visualization of Long Time-Series Datasets Using Das2 Tools

    Science.gov (United States)

    Piker, C.; Granroth, L.; Faden, J.; Kurth, W. S.

    2017-12-01

    For over 14 years the University of Iowa Radio and Plasma Wave Group has utilized a network transparent data streaming and visualization system for most daily data review and collaboration activities. This system, called Das2, was originally designed in support of the Cassini Radio and Plasma Wave Science (RPWS) investigation, but is now relied on for daily review and analysis of Voyager, Polar, Cluster, Mars Express, Juno and other mission results. In light of current efforts to promote automatic data distribution in space physics it seems prudent to provide an overview of our open source Das2 programs and interface definitions to the wider community and to recount lessons learned. This submission will provide an overview of interfaces that define the system, describe the relationship between the Das2 effort and Autoplot and will examine handling Cassini RPWS Wideband waveforms and dynamic spectra as examples of dealing with long time-series data sets. In addition, the advantages and limitations of the current Das2 tool set will be discussed, as well as lessons learned that are applicable to other data sharing initiatives. Finally, plans for future developments including improved catalogs to support 'no-software' data sources and redundant multi-server fail over, as well as new adapters for CSV (Comma Separated Values) and JSON (Javascript Object Notation) output to support Cassini closeout and the HAPI (Heliophysics Application Programming Interface) initiative are outlined.

  1. Open source intelligence: A tool to combat illicit trafficking

    Energy Technology Data Exchange (ETDEWEB)

    Sjoeberg, J [Swedish Armed Forces HQ, Stockholm (Sweden)

    2001-10-01

    The purpose of my presentation is to provide some thoughts on Open Sources and how Open Sources can be used as tools for detecting illicit trafficking and proliferation. To fulfill this purpose I would like to deal with the following points during my presentation: What is Open Source? How can it be defined? - Different sources - Methods. Open Source information can be defined as publicly available information as well as other unclassified information that has limited public distribution or access to it. It comes in print, electronic or oral form. It can be found distributed either to the mass public by print or electronic media or to a much more limited customer base like companies, experts or specialists of some kind, including the so-called gray literature. Open Source information is not a single source but a multi-source. Thus, the term Open Source says nothing about the information itself; it only refers to whether or not the information is classified.

  2. Open source intelligence: A tool to combat illicit trafficking

    International Nuclear Information System (INIS)

    Sjoeberg, J.

    2001-01-01

    The purpose of my presentation is to provide some thoughts on Open Sources and how Open Sources can be used as tools for detecting illicit trafficking and proliferation. To fulfill this purpose I would like to deal with the following points during my presentation: What is Open Source? How can it be defined? - Different sources - Methods. Open Source information can be defined as publicly available information as well as other unclassified information that has limited public distribution or access to it. It comes in print, electronic or oral form. It can be found distributed either to the mass public by print or electronic media or to a much more limited customer base like companies, experts or specialists of some kind, including the so-called gray literature. Open Source information is not a single source but a multi-source. Thus, the term Open Source says nothing about the information itself; it only refers to whether or not the information is classified.

  3. Vapor Intrusion Estimation Tool for Unsaturated Zone Contaminant Sources. User’s Guide

    Science.gov (United States)

    2016-08-30

    estimation process when applying the tool. The tool described here is focused on vapor-phase diffusion from the current vadose zone source. The estimated soil gas contaminant concentration is obtained from the pre-modeled scenarios for a building; assessing impacts beyond the currently defined vadose zone source would require a full site-specific numerical model.

  4. On the road to a stronger public health workforce: visual tools to address complex challenges.

    Science.gov (United States)

    Drehobl, Patricia; Stover, Beth H; Koo, Denise

    2014-11-01

    The public health workforce is vital to protecting the health and safety of the public, yet for years, state and local governmental public health agencies have reported substantial workforce losses and other challenges to the workforce that threaten the public's health. These challenges are complex, often involve multiple influencing or related causal factors, and demand comprehensive solutions. However, proposed solutions often focus on selected factors and might be fragmented rather than comprehensive. This paper describes approaches to characterizing the situation more comprehensively and includes two visual tools: (1) a fishbone, or Ishikawa, diagram that depicts multiple factors affecting the public health workforce; and (2) a roadmap that displays key elements (goals and strategies) to strengthen the public health workforce, thus moving from the problems depicted in the fishbone toward solutions. The visual tools aid thinking about ways to strengthen the public health workforce through collective solutions and to help leverage resources and build on each other's work. The strategic roadmap is intended to serve as a dynamic tool for partnership, prioritization, and gap assessment. These tools reflect and support CDC's commitment to working with partners on the highest priorities for strengthening the workforce to improve the public's health. Published by Elsevier Inc.

  5. Windows Developer Power Tools Turbocharge Windows development with more than 170 free and open source tools

    CERN Document Server

    Avery, James

    2007-01-01

    Software developers need to work harder and harder to bring value to their development process in order to build high quality applications and remain competitive. Developers can accomplish this by improving their productivity, quickly solving problems, and writing better code. A wealth of open source and free software tools are available for developers who want to improve the way they create, build, deploy, and use software. Tools, components, and frameworks exist to help developers at every point in the development process. Windows Developer Power Tools offers an encyclopedic guide to m

  6. New Tools for Sea Ice Data Analysis and Visualization: NSIDC's Arctic Sea Ice News and Analysis

    Science.gov (United States)

    Vizcarra, N.; Stroeve, J.; Beam, K.; Beitler, J.; Brandt, M.; Kovarik, J.; Savoie, M. H.; Skaug, M.; Stafford, T.

    2017-12-01

    Arctic sea ice has long been recognized as a sensitive climate indicator and has undergone a dramatic decline over the past thirty years. Antarctic sea ice continues to be an intriguing and active field of research. The National Snow and Ice Data Center's Arctic Sea Ice News & Analysis (ASINA) offers researchers and the public a transparent view of sea ice data and analysis. We have released a new set of tools for sea ice analysis and visualization. In addition to Charctic, our interactive sea ice extent graph, the new Sea Ice Data and Analysis Tools page provides access to Arctic and Antarctic sea ice data organized in seven different data workbooks, updated daily or monthly. An interactive tool lets scientists, or the public, quickly compare changes in ice extent and location. Another tool allows users to map trends, anomalies, and means for user-defined time periods. Animations of September Arctic and Antarctic monthly average sea ice extent and concentration may also be accessed from this page. Our tools help the NSIDC scientists monitor and understand sea ice conditions in near real time. They also allow the public to easily interact with and explore sea ice data. Technical innovations in our data center helped NSIDC quickly build these tools and more easily maintain them. The tools were made publicly accessible to meet the desire from the public and members of the media to access the numbers and calculations that power our visualizations and analysis. This poster explores these tools and how other researchers, the media, and the general public are using them.
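
    The "trends, anomalies, and means for user-defined time periods" described above reduce to simple time-series arithmetic. The Python sketch below computes anomalies against a baseline mean and a linear trend; the extent values are hypothetical placeholders, not NSIDC data.

        # Minimal sketch: anomalies relative to a baseline mean and a linear trend
        # for a September sea ice extent series. Values are hypothetical.
        import numpy as np

        years = np.arange(2007, 2017)
        extent = np.array([4.3, 4.7, 5.4, 4.9, 4.6, 3.6, 5.4, 5.3, 4.6, 4.7])  # 10^6 km^2

        baseline = extent[:5].mean()            # first five years as the baseline period
        anomalies = extent - baseline
        slope, intercept = np.polyfit(years, extent, 1)
        print("anomalies:", np.round(anomalies, 2))
        print(f"trend: {slope:+.3f} million km^2 per year")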

  7. Three-Dimensional Online Visualization and Engagement Tools for the Geosciences

    Science.gov (United States)

    Cockett, R.; Moran, T.; Pidlisecky, A.

    2013-12-01

    Educational tools often sacrifice interactivity in favour of scalability so they can reach more users. This compromise leads to tools that may be viewed as second tier when compared to more engaging activities performed in a laboratory; however, the resources required to deliver scalable laboratory exercises are often impractical. Geoscience education is well situated to benefit from interactive online learning tools that allow users to work in a 3D environment. Visible Geology (http://3ptscience.com/visiblegeology) is an innovative web-based application designed to enable visualization of geologic structures and processes through the use of interactive 3D models. The platform allows users to conceptualize difficult, yet important geologic principles in a scientifically accurate manner by developing unique geologic models. The environment allows students to interactively practice their visualization and interpretation skills by creating and interacting with their own models and terrains. Visible Geology has been designed from a user-centric perspective, resulting in a simple and intuitive interface. The platform directs students to build their own geologic models by adding beds and creating geologic events such as tilting, folding, or faulting. The level of ownership and interactivity encourages engagement, leading learners to discover geologic relationships on their own, in the context of guided assignments. In January 2013, an interactive geologic history assignment was developed for a 700-student introductory geology class at The University of British Columbia. The assignment required students to distinguish the relative age of geologic events to construct a geologic history. Traditionally this type of exercise has been taught through the use of simple geologic cross-sections showing crosscutting relationships; from these cross-sections students infer the relative age of geologic events. In contrast, the Visible Geology assignment offers students a unique

  8. Tools for 3D scientific visualization in computational aerodynamics at NASA Ames Research Center

    International Nuclear Information System (INIS)

    Bancroft, G.; Plessel, T.; Merritt, F.; Watson, V.

    1989-01-01

    Hardware, software, and techniques used by the Fluid Dynamics Division (NASA) for performing visualization of computational aerodynamics, which can be applied to the visualization of flow fields from computer simulations of fluid dynamics about the Space Shuttle, are discussed. Three visualization techniques applied, post-processing, tracking, and steering, are described, as well as the post-processing software packages used, PLOT3D, SURF (Surface Modeller), GAS (Graphical Animation System), and FAST (Flow Analysis software Toolkit). Using post-processing methods a flow simulation was executed on a supercomputer and, after the simulation was complete, the results were processed for viewing. It is shown that the high-resolution, high-performance three-dimensional workstation combined with specially developed display and animation software provides a good tool for analyzing flow field solutions obtained from supercomputers. 7 refs

  9. Interactive metagenomic visualization in a Web browser

    Directory of Open Access Journals (Sweden)

    Phillippy Adam M

    2011-09-01

    Full Text Available Abstract Background A critical output of metagenomic studies is the estimation of abundances of taxonomical or functional groups. The inherent uncertainty in assignments to these groups makes it important to consider both their hierarchical contexts and their prediction confidence. The current tools for visualizing metagenomic data, however, omit or distort quantitative hierarchical relationships and lack the facility for displaying secondary variables. Results Here we present Krona, a new visualization tool that allows intuitive exploration of relative abundances and confidences within the complex hierarchies of metagenomic classifications. Krona combines a variant of radial, space-filling displays with parametric coloring and interactive polar-coordinate zooming. The HTML5 and JavaScript implementation enables fully interactive charts that can be explored with any modern Web browser, without the need for installed software or plug-ins. This Web-based architecture also allows each chart to be an independent document, making them easy to share via e-mail or post to a standard Web server. To illustrate Krona's utility, we describe its application to various metagenomic data sets and its compatibility with popular metagenomic analysis tools. Conclusions Krona is both a powerful metagenomic visualization tool and a demonstration of the potential of HTML5 for highly accessible bioinformatic visualizations. Its rich and interactive displays facilitate more informed interpretations of metagenomic analyses, while its implementation as a browser-based application makes it extremely portable and easily adopted into existing analysis packages. Both the Krona rendering code and conversion tools are freely available under a BSD open-source license, and available from: http://krona.sourceforge.net.
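
    Krona's radial, space-filling display is essentially a zoomable sunburst of hierarchical abundances. The Python sketch below reproduces the display idea with plotly's sunburst chart as a stand-in renderer (Krona itself is HTML5/JavaScript); the taxa and read counts are hypothetical.

        # Minimal sketch: a sunburst (radial, space-filling) view of hierarchical
        # taxonomic abundances. Taxa and read counts are hypothetical.
        import pandas as pd
        import plotly.express as px

        df = pd.DataFrame({
            "domain": ["Bacteria", "Bacteria", "Bacteria", "Archaea"],
            "phylum": ["Firmicutes", "Firmicutes", "Proteobacteria", "Euryarchaeota"],
            "genus":  ["Bacillus", "Clostridium", "Escherichia", "Methanococcus"],
            "reads":  [1200, 800, 450, 150],
        })

        fig = px.sunburst(df, path=["domain", "phylum", "genus"], values="reads")
        fig.write_html("abundance_sunburst.html")  # self-contained, shareable chart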

  10. EUV sources for the alpha-tools

    Science.gov (United States)

    Pankert, Joseph; Apetz, Rolf; Bergmann, Klaus; Damen, Marcel; Derra, Günther; Franken, Oliver; Janssen, Maurice; Jonkers, Jeroen; Klein, Jürgen; Kraus, Helmar; Krücken, Thomas; List, Andreas; Loeken, Micheal; Mader, Arnaud; Metzmacher, Christof; Neff, Willi; Probst, Sven; Prümmer, Ralph; Rosier, Oliver; Schwabe, Stefan; Seiwert, Stefan; Siemons, Guido; Vaudrevange, Dominik; Wagemann, Dirk; Weber, Achim; Zink, Peter; Zitzen, Oliver

    2006-03-01

    In this paper, we report on the recent progress of the Philips Extreme UV source. The Philips source concept is based on a discharge plasma ignited in a Sn vapor plume that is ablated by a laser pulse. Using rotating electrodes covered with a regenerating tin surface, the problems of electrode erosion and power scaling are fundamentally solved. Most of the work of the past year has been dedicated to developing a lamp system which operates very reliably and stably under full scanner remote control. Topics addressed were the development of the scanner interface, a dose control system, thermo-mechanical design, positional stability of the source, tin handling, and many more. The resulting EUV source, the Philips NovaTin(R) source, can operate at more than 10 kW electrical input power and delivers 200 W in-band EUV into 2π continuously. The source is very small, so nearly 100% of the EUV radiation can be collected within etendue limits. The lamp system is fully automated and can operate unattended under full scanner remote control. 500 million shots of continuous operation without interruption have been realized, and electrode lifetime is at least 2 billion shots. Three sources are currently being prepared; two of them will be integrated into the first EUV Alpha Demonstration tools of ASML. The debris problem was reduced to a level which is well acceptable for scanner operation. First, a considerable reduction of the Sn emission of the source has been realized. The debris mitigation system is based on a two-step concept using a foil trap based stage and a chemical cleaning stage. Both steps were improved considerably. A collector lifetime of 1 billion shots is achieved; after this operating time a cleaning would be applied. The cleaning step has been verified to work with tolerable Sn residues. From the experimental results, a total collector lifetime of more than 10 billion shots can be expected.

  11. AWE: Aviation Weather Data Visualization Environment

    Science.gov (United States)

    Spirkovska, Lilly; Lodha, Suresh K.; Norvig, Peter (Technical Monitor)

    2000-01-01

    Weather is one of the major causes of aviation accidents. General aviation (GA) flights account for 92% of all aviation accidents. In spite of all the official and unofficial sources of weather visualization tools available to pilots, there is an urgent need for visualizing weather-related data tailored for general aviation pilots. Our system, the Aviation Weather Data Visualization Environment (AWE), presents graphical displays of meteorological observations, terminal area forecasts, and winds aloft forecasts onto a cartographic grid specific to the pilot's area of interest. Decisions regarding the graphical display and design are made based on careful consideration of user needs. The integral visual display of these elements of weather reports is designed for use by GA pilots as a weather briefing and route selection tool. AWE links the weather information to the flight's path and schedule. The pilot can interact with the system to obtain aviation-specific weather for the entire area or for a specific route, to explore what-if scenarios and make "go/no-go" decisions. The system, as evaluated by some pilots at NASA Ames Research Center, was found to be useful.

  12. Documentation of high impact visualizations and improvement plans for utilization of VisIt for reactor simulation

    Energy Technology Data Exchange (ETDEWEB)

    R.Childs, H; Bremer, D J

    2008-10-03

    The primary goal of this milestone was to meet the visualization and analysis needs of the campaign's simulation codes. This goal was fully accomplished. We have extended the VisIt visualization and analysis tool to be suitable for the Nek, UNIC, SAS, and DIABLO code teams. This represented a significant development effort, primarily in terms of tuning the processing of the very large data sets produced by the Nek code. As a result of our development, and of the support we provided, these groups have been able to successfully accomplish their visualization and analysis activities using VisIt. Visualization is an important part of the simulation process. It allows stakeholders to explore simulations and discover phenomena, to confirm assumptions, and to convey findings to a larger audience. Further, visualization software is complex and is an active research area, especially in the area of visualization of very large data sets, such as those produced by the Reactor campaign's Nek code. To meet the campaign's visualization and analysis needs, we chose to leverage the existing software tool, VisIt. VisIt is an open source, parallel visualization and analysis tool for interactively exploring scientific data. The tool represents approximately fifty man-years' worth of effort, much of which was dedicated to techniques for processing large data and also to user interfaces. VisIt originated in the DOE's Advanced Simulation and Computing Initiative (ASCI) program, but is also actively developed by the Office of Science's Scientific Discovery through Advanced Computing (SciDAC) program, as well as by the open source community at large, including university partners. Our work for this effort consisted both of customizing VisIt to meet Reactor campaign needs and of providing support for stakeholders in the Reactor campaign to ensure they were successful using the tool.
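
    The abstract above describes scripted, batch use of VisIt on very large simulation output. As a purely illustrative sketch (not the campaign's actual scripts), the session below shows the style of Python driving that VisIt supports; it assumes VisIt's Python CLI ("visit -cli"), which provides these functions, and the database and variable names are hypothetical.

    ```python
    # Hypothetical batch VisIt session: open a result file, render a scalar
    # field, and save an image. Run inside "visit -cli", where OpenDatabase,
    # AddPlot, DrawPlots and SaveWindow are defined by VisIt itself.
    OpenDatabase("nek_output.silo")        # hypothetical Nek result file
    AddPlot("Pseudocolor", "temperature")  # color the mesh by a scalar field
    AddPlot("Mesh", "mesh")                # overlay the computational mesh
    DrawPlots()                            # render, in parallel if a compute engine is launched
    SaveWindow()                           # write the rendered image to disk
    ```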

  13. Cytoscape: the network visualization tool for GenomeSpace workflows [v2; ref status: indexed, http://f1000r.es/47f]

    Directory of Open Access Journals (Sweden)

    Barry Demchak

    2014-08-01

    Full Text Available Modern genomic analysis often requires workflows incorporating multiple best-of-breed tools. GenomeSpace is a web-based visual workbench that combines a selection of these tools with mechanisms that create data flows between them. One such tool is Cytoscape 3, a popular application that enables analysis and visualization of graph-oriented genomic networks. As Cytoscape runs on the desktop, and not in a web browser, integrating it into GenomeSpace required special care in creating a seamless user experience and enabling appropriate data flows. In this paper, we present the design and operation of the Cytoscape GenomeSpace app, which accomplishes this integration, thereby providing critical analysis and visualization functionality for GenomeSpace users. It has been downloaded over 850 times since the release of its first version in September 2013.

  14. Cytoscape: the network visualization tool for GenomeSpace workflows [v1; ref status: indexed, http://f1000r.es/3ph]

    Directory of Open Access Journals (Sweden)

    Barry Demchak

    2014-07-01

    Full Text Available Modern genomic analysis often requires workflows incorporating multiple best-of-breed tools. GenomeSpace is a web-based visual workbench that combines a selection of these tools with mechanisms that create data flows between them. One such tool is Cytoscape 3, a popular application that enables analysis and visualization of graph-oriented genomic networks. As Cytoscape runs on the desktop, and not in a web browser, integrating it into GenomeSpace required special care in creating a seamless user experience and enabling appropriate data flows. In this paper, we present the design and operation of the Cytoscape GenomeSpace app, which accomplishes this integration, thereby providing critical analysis and visualization functionality for GenomeSpace users. It has been downloaded over 850 times since the release of its first version in September 2013.

  15. Visual-Haptic Integration: Cue Weights are Varied Appropriately, to Account for Changes in Haptic Reliability Introduced by Using a Tool

    OpenAIRE

    Chie Takahashi; Simon J Watt

    2011-01-01

    Tools such as pliers systematically change the relationship between an object's size and the hand opening required to grasp it. Previous work suggests the brain takes this into account, integrating visual and haptic size information that refers to the same object, independent of the similarity of the ‘raw’ visual and haptic signals (Takahashi et al., VSS 2009). Variations in tool geometry also affect the reliability (precision) of haptic size estimates, however, because they alter the change ...

  16. Nanobodies as Versatile Tools to Understand, Diagnose, Visualize and Treat Cancer

    Directory of Open Access Journals (Sweden)

    Isabel Van Audenhove

    2016-06-01

    Full Text Available Since their discovery, nanobodies have been used extensively in the fields of research, diagnostics and therapy. These antigen binding fragments, originating from Camelid heavy-chain antibodies, possess unusual hallmarks in terms of (small) size, stability, solubility and specificity, hence allowing cost-effective production and sometimes outperforming monoclonal antibodies. In this review, we evaluate the current status of nanobodies to study, diagnose, visualize or inhibit cancer-specific proteins and processes. Nanobodies are highly adaptable tools for cancer research as they enable specific modulation of targets, enzymatic and non-enzymatic proteins alike. Molecular imaging studies benefit from the rapid, homogeneous tumor accumulation of nanobodies and their fast blood clearance, permitting previously unattainable fast tumor visualization. Moreover, they are endowed with considerable therapeutic potential as inhibitors of receptor-ligand pairs and deliverers of drugs or drug-loaded nanoparticles towards tumors. More in vivo and clinical studies are however eagerly awaited to unleash their full potential.

  17. A Workflow for Simulation and Visualization Of Seismic Wave Propagation Using SeisSol, VisIt and Avizo

    KAUST Repository

    Passone, Luca

    2011-08-01

    Ground motion estimation and subsurface exploration are main research areas in computational seismology; they are fundamental for implementing earthquake engineering requirements and for modern subsurface reservoir assessment. In this study we propose a workflow for discretizing, simulating and visualizing near-source ground motion due to earthquake rupture. For data generation we use an elastic wave equation solver called SeisSol, based on the Discontinuous Galerkin formulation with Arbitrary high-order DERivatives (ADER-DG). SeisSol is capable of highly accurate treatment of any earthquake source characterization occurring on geometrically complex fault systems embedded in geologically complicated earth structures. We then visualize the results with two tools: VisIt (“a free interactive parallel visualization and graphical analysis tool for viewing scientific data”) and Avizo (“The 3D Analysis Software for Scientific and Industrial data”). We investigate each approach, include our experiences from model generation to visualization in highly immersive environments, and conclude with a set of general recommendations for earthquake visualization.

  18. Data Cube Visualization with Blender

    Science.gov (United States)

    Kent, Brian R.; Gárate, Matías

    2017-06-01

    With the increasing data acquisition rates from observational and computational astrophysics, new tools are needed to study and visualize data. We present a methodology for rendering 3D data cubes using the open-source 3D software Blender. By importing processed observations and numerical simulations through the Voxel Data format, we are able to use the Blender interface and Python API to create high-resolution animated visualizations. We review the methods for data import, animation, and camera movement, and present examples of this methodology. The 3D rendering of data cubes gives scientists the ability to create appealing displays that can be used for both scientific presentations as well as public outreach.
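
    As a rough illustration of the Voxel Data import path mentioned above (not the authors' code), the sketch below writes a NumPy cube in what is assumed to be the legacy Blender .bvox voxel layout: a header of four little-endian 32-bit integers (nx, ny, nz, number of frames) followed by float32 values normalized to [0, 1]. The file name and grid size are hypothetical.

    ```python
    # Minimal sketch: convert a 3-D NumPy array into an (assumed) .bvox file
    # that Blender's voxel-data texture can load for volume rendering.
    import numpy as np

    def write_bvox(cube, path):
        """Write a 3-D array as a single-frame .bvox voxel file."""
        cube = np.asarray(cube, dtype=np.float32)
        lo, hi = float(cube.min()), float(cube.max())
        cube = ((cube - lo) / max(hi - lo, 1e-30)).astype(np.float32)  # scale to [0, 1]
        nz, ny, nx = cube.shape
        header = np.array([nx, ny, nz, 1], dtype=np.int32)  # assumed header layout
        with open(path, "wb") as f:
            header.tofile(f)
            cube.tofile(f)

    # Toy cube: a Gaussian blob standing in for an observed or simulated data cube.
    z, y, x = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
    write_bvox(np.exp(-(x**2 + y**2 + z**2) / 0.1), "cube.bvox")
    ```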

  19. A Model-Driven Visualization Tool for Use with Model-Based Systems Engineering Projects

    Science.gov (United States)

    Trase, Kathryn; Fink, Eric

    2014-01-01

    Model-Based Systems Engineering (MBSE) promotes increased consistency between a system's design and its design documentation through the use of an object-oriented system model. The creation of this system model facilitates data presentation by providing a mechanism from which information can be extracted by automated manipulation of model content. Existing MBSE tools enable model creation, but are often too complex for the unfamiliar model viewer to easily use. These tools do not yet provide many opportunities for easing into the development and use of a system model when system design documentation already exists. This study creates a Systems Modeling Language (SysML) Document Traceability Framework (SDTF) for integrating design documentation with a system model, and develops an Interactive Visualization Engine for SysML Tools (InVEST), which exports consistent, clear, and concise views of SysML model data. These exported views are each meaningful to a variety of project stakeholders with differing subjects of concern and depth of technical involvement. InVEST allows a model user to generate multiple views and reports from an MBSE model, including wiki pages and interactive visualizations of data. System data can also be filtered to present only the information relevant to the particular stakeholder, resulting in a view that is consistent with both the larger system model and other model views. Viewing the relationships between system artifacts and documentation, and filtering through data to see specialized views, improves the value of the system as a whole, as data becomes information.

  20. Visualizing Dataflow Graphs of Deep Learning Models in TensorFlow.

    Science.gov (United States)

    Wongsuphasawat, Kanit; Smilkov, Daniel; Wexler, James; Wilson, Jimbo; Mane, Dandelion; Fritz, Doug; Krishnan, Dilip; Viegas, Fernanda B; Wattenberg, Martin

    2018-01-01

    We present a design study of the TensorFlow Graph Visualizer, part of the TensorFlow machine intelligence platform. This tool helps users understand complex machine learning architectures by visualizing their underlying dataflow graphs. The tool works by applying a series of graph transformations that enable standard layout techniques to produce a legible interactive diagram. To declutter the graph, we decouple non-critical nodes from the layout. To provide an overview, we build a clustered graph using the hierarchical structure annotated in the source code. To support exploration of nested structure on demand, we perform edge bundling to enable stable and responsive cluster expansion. Finally, we detect and highlight repeated structures to emphasize a model's modular composition. To demonstrate the utility of the visualizer, we describe example usage scenarios and report user feedback. Overall, users find the visualizer useful for understanding, debugging, and sharing the structures of their models.
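
    For context, the sketch below shows one assumed way a dataflow graph is handed to this visualizer (the Graphs tab of TensorBoard) using the TF1-compatible API; it is not taken from the paper, and the model, names, and shapes are hypothetical. Name scopes become the collapsible clusters the paper describes.

    ```python
    # Minimal sketch: build a tiny graph and write its definition for TensorBoard.
    import tensorflow as tf

    tf1 = tf.compat.v1
    tf1.disable_eager_execution()

    graph = tf.Graph()
    with graph.as_default():
        # The "model" name scope becomes one collapsible cluster in the visualizer.
        with tf1.name_scope("model"):
            x = tf1.placeholder(tf.float32, shape=[None, 4], name="x")
            w = tf1.get_variable("w", shape=[4, 1])
            y = tf1.matmul(x, w, name="y")

    # Writing the graph definition is all the Graphs tab needs:
    #   tensorboard --logdir logs/
    writer = tf1.summary.FileWriter("logs/", graph)
    writer.close()
    ```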

  1. The Visual Geophysical Exploration Environment: A Multi-dimensional Scientific Visualization

    Science.gov (United States)

    Pandya, R. E.; Domenico, B.; Murray, D.; Marlino, M. R.

    2003-12-01

    The Visual Geophysical Exploration Environment (VGEE) is an online learning environment designed to help undergraduate students understand fundamental Earth system science concepts. The guiding principle of the VGEE is the importance of hands-on interaction with scientific visualization and data. The VGEE consists of four elements: 1) an online, inquiry-based curriculum for guiding student exploration; 2) a suite of El Nino-related data sets adapted for student use; 3) a learner-centered interface to a scientific visualization tool; and 4) a set of concept models (interactive tools that help students understand fundamental scientific concepts). There are two key innovations featured in this interactive poster session. One is the integration of concept models and the visualization tool. Concept models are simple, interactive, Java-based illustrations of fundamental physical principles. We developed eight concept models and integrated them into the visualization tool to enable students to probe data. The ability to probe data using a concept model addresses the common problem of transfer: the difficulty students have in applying theoretical knowledge to everyday phenomena. The other innovation is a visualization environment and data that are discoverable in digital libraries and can be installed, configured, and used for investigations over the web. By collaborating with the Integrated Data Viewer developers, we were able to embed a web-launchable visualization tool and access to distributed data sets into the online curricula. The Thematic Real-time Environmental Data Distributed Services (THREDDS) project is working to provide catalogs of datasets that can be used in new VGEE curricula under development. By cataloging these curricula in the Digital Library for Earth System Education (DLESE), learners and educators can discover the data and visualization tool within a framework that guides their use.

  2. Top-Down Control of Visual Alpha Oscillations: Sources of Control Signals and Their Mechanisms of Action

    Science.gov (United States)

    Wang, Chao; Rajagovindan, Rajasimhan; Han, Sahng-Min; Ding, Mingzhou

    2016-01-01

    Alpha oscillations (8–12 Hz) are thought to inversely correlate with cortical excitability. Goal-oriented modulation of alpha has been studied extensively. In visual spatial attention, alpha over the region of visual cortex corresponding to the attended location decreases, signifying increased excitability to facilitate the processing of impending stimuli. In contrast, in retention of verbal working memory, alpha over visual cortex increases, signifying decreased excitability to gate out stimulus input to protect the information held online from sensory interference. According to the prevailing model, this goal-oriented biasing of sensory cortex is effected by top-down control signals from frontal and parietal cortices. The present study tests and substantiates this hypothesis by (a) identifying the signals that mediate the top-down biasing influence, (b) examining whether the cortical areas issuing these signals are task-specific or task-independent, and (c) establishing the possible mechanism of the biasing action. High-density human EEG data were recorded in two experimental paradigms: a trial-by-trial cued visual spatial attention task and a modified Sternberg working memory task. Applying Granger causality to both sensor-level and source-level data we report the following findings. In covert visual spatial attention, the regions exerting top-down control over visual activity are lateralized to the right hemisphere, with the dipoles located at the right frontal eye field (FEF) and the right inferior frontal gyrus (IFG) being the main sources of top-down influences. During retention of verbal working memory, the regions exerting top-down control over visual activity are lateralized to the left hemisphere, with the dipoles located at the left middle frontal gyrus (MFG) being the main source of top-down influences. In both experiments, top-down influences are mediated by alpha oscillations, and the biasing effect is likely achieved via an inhibition

  3. Decision support tool for diagnosing the source of variation

    Science.gov (United States)

    Masood, Ibrahim; Azrul Azhad Haizan, Mohamad; Norbaya Jumali, Siti; Ghazali, Farah Najihah Mohd; Razali, Hazlin Syafinaz Md; Shahir Yahya, Mohd; Azlan, Mohd Azwir bin

    2017-08-01

    Identifying the source of unnatural variation (SOV) in a manufacturing process is essential for quality control. Shewhart control chart patterns (CCPs) are commonly used to monitor SOVs. However, proper interpretation of CCPs and their associated SOVs requires a highly skilled industrial practitioner, and a lack of knowledge in process engineering will lead to erroneous corrective action. The objective of this study is to design the operating procedures of a computerized decision support tool (DST) for process diagnosis. The DST is a tool embedded in a CCP recognition scheme. The design methodology involves analysis of the relationships between geometrical features, the manufacturing process, and CCPs. The DST contains information about CCPs and their possible root-cause errors, together with descriptions of SOV phenomena such as process deterioration due to tool bluntness, tool offset, loading error, and changes in material hardness. The DST will be useful for an industrial practitioner in effective troubleshooting.

  4. End-User Development of Information Visualization

    DEFF Research Database (Denmark)

    Pantazos, Kostas; Lauesen, Søren; Vatrapu, Ravi

    2013-01-01

    This paper investigates End-User Development of Information Visualization. More specifically, we investigated how existing visualization tools allow end-user developers to construct visualizations. End-user developers have some developing or scripting skills to perform relatively advanced tasks such as data manipulation, but no formal training in programming. 18 visualization tools were surveyed from an end-user developer perspective. The results of this survey study show that end-user developers need better tools to create and modify custom visualizations. A closer collaboration between End-User Development and Information Visualization researchers could contribute towards the development of better tools to support custom visualizations. In addition, as empirical evaluations of these tools are lacking, both research communities should focus more on this aspect. The study serves as a starting point

  5. Visual comparison for information visualization

    KAUST Repository

    Gleicher, M.; Albers, D.; Walker, R.; Jusufi, I.; Hansen, C. D.; Roberts, J. C.

    2011-01-01

    Data analysis often involves the comparison of complex objects. With the ever increasing amounts and complexity of data, the demand for systems to help with these comparisons is also growing. Increasingly, information visualization tools support such comparisons explicitly, beyond simply allowing a viewer to examine each object individually. In this paper, we argue that the design of information visualizations of complex objects can, and should, be studied in general, that is independently of what those objects are. As a first step in developing this general understanding of comparison, we propose a general taxonomy of visual designs for comparison that groups designs into three basic categories, which can be combined. To clarify the taxonomy and validate its completeness, we provide a survey of work in information visualization related to comparison. Although we find a great diversity of systems and approaches, we see that all designs are assembled from the building blocks of juxtaposition, superposition and explicit encodings. This initial exploration shows the power of our model, and suggests future challenges in developing a general understanding of comparative visualization and facilitating the development of more comparative visualization tools. © The Author(s) 2011.

  6. Visual comparison for information visualization

    KAUST Repository

    Gleicher, M.

    2011-09-07

    Data analysis often involves the comparison of complex objects. With the ever increasing amounts and complexity of data, the demand for systems to help with these comparisons is also growing. Increasingly, information visualization tools support such comparisons explicitly, beyond simply allowing a viewer to examine each object individually. In this paper, we argue that the design of information visualizations of complex objects can, and should, be studied in general, that is independently of what those objects are. As a first step in developing this general understanding of comparison, we propose a general taxonomy of visual designs for comparison that groups designs into three basic categories, which can be combined. To clarify the taxonomy and validate its completeness, we provide a survey of work in information visualization related to comparison. Although we find a great diversity of systems and approaches, we see that all designs are assembled from the building blocks of juxtaposition, superposition and explicit encodings. This initial exploration shows the power of our model, and suggests future challenges in developing a general understanding of comparative visualization and facilitating the development of more comparative visualization tools. © The Author(s) 2011.

  7. Data visualization, bar naked: A free tool for creating interactive graphics.

    Science.gov (United States)

    Weissgerber, Tracey L; Savic, Marko; Winham, Stacey J; Stanisavljevic, Dejana; Garovic, Vesna D; Milic, Natasa M

    2017-12-15

    Although bar graphs are designed for categorical data, they are routinely used to present continuous data in studies that have small sample sizes. This presentation is problematic, as many data distributions can lead to the same bar graph, and the actual data may suggest different conclusions from the summary statistics. To address this problem, many journals have implemented new policies that require authors to show the data distribution. This paper introduces a free, web-based tool for creating an interactive alternative to the bar graph (http://statistika.mfub.bg.ac.rs/interactive-dotplot/). This tool allows authors with no programming expertise to create customized interactive graphics, including univariate scatterplots, box plots, and violin plots, for comparing values of a continuous variable across different study groups. Individual data points may be overlaid on the graphs. Additional features facilitate visualization of subgroups or clusters of non-independent data. A second tool enables authors to create interactive graphics from data obtained with repeated independent experiments (http://statistika.mfub.bg.ac.rs/interactive-repeated-experiments-dotplot/). These tools are designed to encourage exploration and critical evaluation of the data behind the summary statistics and may be valuable for promoting transparency, reproducibility, and open science in basic biomedical research. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  8. Applying Open Source Game Engine for Building Visual Simulation Training System of Fire Fighting

    Science.gov (United States)

    Yuan, Diping; Jin, Xuesheng; Zhang, Jin; Han, Dong

    There's a growing need for fire departments to adopt a safe and fair method of training to ensure that the firefighting commander is in a position to manage a fire incident. Visual simulation training systems, with their ability to replicate and interact with virtual fire scenarios through the use of computer graphics or VR, have become an effective and efficient method for fire ground education. This paper describes the system architecture and functions of a visual simulation training system for fighting oil storage fires, which adopts Delta3D, an open source game and simulation engine, to provide realistic 3D views. It shows that using open source technology provides not only commercial-level 3D effects but also a great reduction in cost.

  9. deepTools2: a next generation web server for deep-sequencing data analysis.

    Science.gov (United States)

    Ramírez, Fidel; Ryan, Devon P; Grüning, Björn; Bhardwaj, Vivek; Kilpert, Fabian; Richter, Andreas S; Heyne, Steffen; Dündar, Friederike; Manke, Thomas

    2016-07-08

    We present an update to our Galaxy-based web server for processing and visualizing deeply sequenced data. Its core tool set, deepTools, allows users to perform complete bioinformatic workflows ranging from quality controls and normalizations of aligned reads to integrative analyses, including clustering and visualization approaches. Since we first described our deepTools Galaxy server in 2014, we have implemented new solutions for many requests from the community and our users. Here, we introduce significant enhancements and new tools to further improve data visualization and interpretation. deepTools continues to be open to all users and freely available as a web service at deeptools.ie-freiburg.mpg.de. The new deepTools2 suite can be easily deployed within any Galaxy framework via the toolshed repository, and we also provide source code for command line usage under Linux and Mac OS X. A public and documented API for access to deepTools functionality is also available. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  10. Using Open Source Tools to Create a Mobile Optimized, Crowdsourced Translation Tool

    Directory of Open Access Journals (Sweden)

    Evviva Weinraub Lajoie

    2014-04-01

    Full Text Available In late 2012, OSU Libraries and Press partnered with Maria's Libraries, an NGO in Rural Kenya, to provide users the ability to crowdsource translations of folk tales and existing children's books into a variety of African languages, sub-languages, and dialects. Together, these two organizations have been creating a mobile optimized platform using open source libraries such as Wink Toolkit (a library which provides mobile-friendly interaction from a website) and Globalize3 (to allow for multiple translations of database entries) in a Ruby on Rails application. Research regarding successes of similar tools has been utilized in providing a consistent user interface. The OSU Libraries & Press team delivered a proof-of-concept tool that has the opportunity to promote technology exploration, improve early childhood literacy, change the way we approach foreign language learning, and to provide opportunities for cost-effective, multi-language publishing.

  11. Monitoring CMS tracker construction and data quality using a Grid/Web service based on a visualization tool

    CERN Document Server

    Zito, Giuseppe; Regano, A

    2004-01-01

    The complexity of the CMS tracker (more than 50 million channels to monitor), now under construction in ten laboratories worldwide with hundreds of people involved, will require new tools for monitoring both the hardware and the software. In our approach we use both visualization tools and Grid services to make this monitoring possible. The use of visualization enables us to represent all those million channels at once on a single computer screen. The Grid makes it possible to gather enough data and computing power to check every channel and also to reach experts everywhere in the world, allowing the early discovery of problems. We report here on a first prototype developed using the Grid environment already available in CMS, i.e., LCG2. This prototype consists of a Java client, which implements the GUI for tracker visualization, and two data servers connected to the tracker construction database and to Grid catalogs of event datasets. All the communication between client and servers is done using ...

  12. Commissioning software tools at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Emery, L.

    1995-01-01

    A software tool-oriented approach has been adopted in the commissioning of the Advanced Photon Source (APS) at Argonne National Laboratory, particularly in the commissioning of the Positron Accumulator Ring (PAR). The general philosophy is to decompose a complicated procedure involving measurement, data processing, and control into a series of simpler steps, each accomplished by a generic toolkit program. The implementation is greatly facilitated by adopting the SDDS (self-describing data set) protocol, which comes with its own toolkit. The combined toolkit has made accelerator physics measurements easier. For instance, the measurement of the optical functions of the PAR and the beamlines connected to it has been largely automated. Complicated measurements are feasible with a combination of tools running independently

  13. Gas discharge visualization: an imaging and modeling tool for medical biometrics.

    Science.gov (United States)

    Kostyuk, Nataliya; Cole, Phyadragren; Meghanathan, Natarajan; Isokpehi, Raphael D; Cohly, Hari H P

    2011-01-01

    The need for automated identification of a disease makes the issue of medical biometrics very current in our society. Not all biometric tools available provide real-time feedback. We introduce gas discharge visualization (GDV) technique as one of the biometric tools that have the potential to identify deviations from the normal functional state at early stages and in real time. GDV is a nonintrusive technique to capture the physiological and psychoemotional status of a person and the functional status of different organs and organ systems through the electrophotonic emissions of fingertips placed on the surface of an impulse analyzer. This paper first introduces biometrics and its different types and then specifically focuses on medical biometrics and the potential applications of GDV in medical biometrics. We also present our previous experience with GDV in the research regarding autism and the potential use of GDV in combination with computer science for the potential development of biological pattern/biomarker for different kinds of health abnormalities including cancer and mental diseases.

  14. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    International Nuclear Information System (INIS)

    Etmektzoglou, A; Mishra, P; Svatos, M

    2015-01-01

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment allows also for parameterization of trajectories thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly
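
    As a generic illustration only (neither the SAGE spreadsheet itself nor the TrueBeam Developer Mode XML schema), the sketch below samples a parameterized elliptical "virtual isocenter" couch trajectory as a function of delivered MU; the axis choices and values are hypothetical.

    ```python
    # Minimal sketch: sample an analytic, parameterized trajectory versus MU.
    import numpy as np

    total_mu = 200.0
    mu = np.linspace(0.0, total_mu, 181)      # control points vs. delivered MU
    a_cm, b_cm = 3.0, 1.5                     # hypothetical ellipse semi-axes
    phase = 2.0 * np.pi * mu / total_mu

    couch_lat = a_cm * np.cos(phase)          # lateral couch position (cm)
    couch_lng = b_cm * np.sin(phase)          # longitudinal couch position (cm)
    gantry = np.full(mu.size, 180.0)          # gantry held fixed in this sketch

    for m, lat, lng, g in list(zip(mu, couch_lat, couch_lng, gantry))[:3]:
        print(f"MU={m:6.1f}  lat={lat:+.2f} cm  lng={lng:+.2f} cm  gantry={g:.1f} deg")
    ```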

  15. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    Energy Technology Data Exchange (ETDEWEB)

    Etmektzoglou, A; Mishra, P; Svatos, M [Varian Medical Systems, Palo Alto, CA (United States)

    2015-06-15

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment allows also for parameterization of trajectories thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly

  16. Explorative visual analytics on interval-based genomic data and their metadata.

    Science.gov (United States)

    Jalili, Vahid; Matteucci, Matteo; Masseroli, Marco; Ceri, Stefano

    2017-12-04

    With the spreading of public repositories of NGS processed data, the availability of user-friendly and effective tools for data exploration, analysis and visualization is becoming very relevant. These tools enable interactive analytics, an exploratory approach for the seamless "sense-making" of data through on-the-fly integration of analysis and visualization phases, suggested not only for evaluating processing results, but also for designing and adapting NGS data analysis pipelines. This paper presents abstractions for supporting the early analysis of NGS processed data and their implementation in an associated tool, named GenoMetric Space Explorer (GeMSE). This tool serves the needs of the GenoMetric Query Language, an innovative cloud-based system for computing complex queries over heterogeneous processed data. It can also be used starting from any text files in standard BED, BroadPeak, NarrowPeak, GTF, or general tab-delimited format, containing numerical features of genomic regions; metadata can be provided as text files in tab-delimited attribute-value format. GeMSE allows interactive analytics, consisting of on-the-fly cycling among steps of data exploration, analysis and visualization that help biologists and bioinformaticians in making sense of heterogeneous genomic datasets. By means of an explorative interaction support, users can trace past activities and quickly recover their results, seamlessly going backward and forward in the analysis steps and comparative visualizations of heatmaps. GeMSE's effective application and practical usefulness are demonstrated through significant use cases of biological interest. GeMSE is available at http://www.bioinformatics.deib.polimi.it/GeMSE/, and its source code is available at https://github.com/Genometric/GeMSE under the GPLv3 open-source license.

  17. Plasma diagnostic tools for optimizing negative hydrogen ion sources

    International Nuclear Information System (INIS)

    Fantz, U.; Falter, H.D.; Franzen, P.; Speth, E.; Hemsworth, R.; Boilson, D.; Krylov, A.

    2006-01-01

    The powerful diagnostic tool of optical emission spectroscopy is used to measure the plasma parameters in negative hydrogen ion sources based on the surface mechanism. Results for electron temperature, electron density, atomic-to-molecular hydrogen density ratio, and gas temperature are presented for two types of sources, a rf source and an arc source, which are currently under development for a neutral beam heating system of ITER. The amount of cesium in the plasma volume is obtained from cesium radiation: the Cs neutral density is five to ten orders of magnitude lower than the hydrogen density and the Cs ion density is two to three orders of magnitude lower than the electron density in front of the grid. It is shown that monitoring of cesium lines is very useful for monitoring the cesium balance in the source. From a line-ratio method negative ion densities are determined. In a well-conditioned source the negative ion density is of the same order of magnitude as the electron density and correlates with extracted current densities

  18. A new energy analysis tool for ground source heat pump systems

    Energy Technology Data Exchange (ETDEWEB)

    Michopoulos, A.; Kyriakis, N. [Process Equipment Design Laboratory, Mechanical Engineering Department, Aristotle University of Thessaloniki, POB 487, 541 24 Thessaloniki (Greece)

    2009-09-15

    A new tool, suitable for energy analysis of vertical ground source heat pump systems, is presented. The tool is based on analytical equations describing the heat exchanged with the ground, developed in the Matlab® environment. The time step of the simulation can be freely chosen by the user (e.g. 1, 2 h etc.) and the calculation time required is very short. The heating and cooling loads of the building, at the aforementioned time step, are needed as input, along with the thermophysical properties of the soil and of the ground heat exchanger, the operation characteristic curves of the system's heat pumps and the basic ground source heat exchanger dimensions. The results include the electricity consumption of the system and the heat absorbed from or rejected to the ground. The efficiency of the tool is verified through comparison with actual electricity consumption data collected from an existing large scale ground coupled heat pump installation over a three-year period. (author)

  19. Phylo-mLogo: an interactive and hierarchical multiple-logo visualization tool for alignment of many sequences

    Directory of Open Access Journals (Sweden)

    Lee DT

    2007-02-01

    Full Text Available Background: When aligning several hundreds or thousands of sequences, such as epidemic virus sequences or homologous/orthologous sequences of some big gene families, to reconstruct the epidemiological history or their phylogenies, how to analyze and visualize the alignment results of many sequences has become a new challenge for computational biologists. Although there are several tools available for visualization of very long sequence alignments, few of them are applicable to the alignments of many sequences. Results: A multiple-logo alignment visualization tool, called Phylo-mLogo, is presented in this paper. Phylo-mLogo calculates the variabilities and homogeneities of alignment sequences by base frequencies or entropies. Different from the traditional representations of sequence logos, Phylo-mLogo not only displays the global logo patterns of the whole alignment of multiple sequences, but also demonstrates their local homologous logos for each clade hierarchically. In addition, Phylo-mLogo also allows the user to focus only on the analysis of some important, structurally or functionally constrained sites in the alignment selected by the user or by built-in automatic calculation. Conclusion: With Phylo-mLogo, the user can symbolically and hierarchically visualize hundreds of aligned sequences simultaneously and easily check the changes of their amino acid sites when analyzing many homologous/orthologous or influenza virus sequences. More information of Phylo-mLogo can be found at URL http://biocomp.iis.sinica.edu.tw/phylomlogo.
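
    To make the "base frequencies or entropies" measure concrete, here is a minimal, generic sketch (not Phylo-mLogo's code) that scores each alignment column by its Shannon entropy; the toy sequences are made up.

    ```python
    # Minimal sketch: per-column variability of an alignment via Shannon entropy.
    import math
    from collections import Counter

    def column_entropy(column, alphabet="ACGT"):
        """Shannon entropy (bits) of one column; lower means more homogeneous."""
        counts = Counter(b for b in column if b in alphabet)
        total = sum(counts.values())
        if total == 0:
            return 0.0
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    alignment = ["ACGT", "ACGA", "ACTT"]      # toy aligned sequences of equal length
    entropies = [column_entropy(col) for col in zip(*alignment)]
    print(entropies)                          # [0.0, 0.0, 0.918..., 0.918...]
    ```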

  20. APT: Aperture Photometry Tool

    Science.gov (United States)

    Laher, Russ

    2012-08-01

    Aperture Photometry Tool (APT) is software for astronomers and students interested in manually exploring the photometric qualities of astronomical images. It has a graphical user interface (GUI) which allows the image data associated with aperture photometry calculations for point and extended sources to be visualized and, therefore, more effectively analyzed. Mouse-clicking on a source in the displayed image draws a circular or elliptical aperture and sky annulus around the source and computes the source intensity and its uncertainty, along with several commonly used measures of the local sky background and its variability. The results are displayed and can be optionally saved to an aperture-photometry-table file and plotted on graphs in various ways using functions available in the software. APT is geared toward processing sources in a small number of images and is not suitable for bulk processing a large number of images, unlike other aperture photometry packages (e.g., SExtractor). However, APT does have a convenient source-list tool that enables calculations for a large number of detections in a given image. The source-list tool can be run either in automatic mode to generate an aperture photometry table quickly or in manual mode to permit inspection and adjustment of the calculation for each individual detection. APT displays a variety of useful graphs, including image histogram, and aperture slices, source scatter plot, sky scatter plot, sky histogram, radial profile, curve of growth, and aperture-photometry-table scatter plots and histograms. APT has functions for customizing calculations, including outlier rejection, pixel “picking” and “zapping,” and a selection of source and sky models. The radial-profile-interpolation source model, accessed via the radial-profile-plot panel, allows recovery of source intensity from pixels with missing data and can be especially beneficial in crowded fields.
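
    The circular-aperture-plus-sky-annulus measurement APT performs interactively can be sketched with the photutils package; the snippet below is an assumed, generic illustration rather than APT's own implementation, and the image and aperture parameters are made up.

    ```python
    # Minimal sketch: aperture sum minus a sky estimate from a surrounding annulus.
    import math
    import numpy as np
    from photutils.aperture import CircularAperture, CircularAnnulus, aperture_photometry

    data = np.random.poisson(10, size=(64, 64)).astype(float)  # stand-in image
    data[30:34, 30:34] += 200.0                                 # fake point source

    position = [(31.5, 31.5)]
    aperture = CircularAperture(position, r=4.0)
    annulus = CircularAnnulus(position, r_in=8.0, r_out=12.0)

    phot = aperture_photometry(data, aperture)
    sky = aperture_photometry(data, annulus)

    # Mean sky level per pixel from the annulus, removed over the aperture area.
    annulus_area = math.pi * (12.0**2 - 8.0**2)
    aperture_area = math.pi * 4.0**2
    sky_per_pix = sky["aperture_sum"][0] / annulus_area
    net_counts = phot["aperture_sum"][0] - sky_per_pix * aperture_area
    print(f"background-subtracted source intensity: {net_counts:.1f} counts")
    ```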

  1. Preoperative automatic visual behavioural analysis as a tool for intraocular lens choice in cataract surgery

    Directory of Open Access Journals (Sweden)

    Heloisa Neumann Nogueira

    2015-04-01

    Full Text Available Purpose: Cataract is the main cause of blindness, affecting 18 million people worldwide, with the highest incidence in the population above 50 years of age. Low visual acuity caused by cataract may have a negative impact on patient quality of life. The current treatment is surgery in order to replace the natural lens with an artificial intraocular lens (IOL, which can be mono- or multifocal. However, due to potential side effects, IOLs must be carefully chosen to ensure higher patient satisfaction. Thus, studies on the visual behavior of these patients may be an important tool to determine the best type of IOL implantation. This study proposed an anamnestic add-on for optimizing the choice of IOL. Methods: We used a camera that automatically takes pictures, documenting the patient’s visual routine in order to obtain additional information about the frequency of distant, intermediate, and near sights. Results: The results indicated an estimated frequency percentage, suggesting that visual analysis of routine photographic records of a patient with cataract may be useful for understanding behavioural gaze and for choosing visual management strategy after cataract surgery, simultaneously stimulating interest for customized IOL manufacturing according to individual needs.

  2. Visual Control of Locomotion

    National Research Council Canada - National Science Library

    Loomis, Jack M; Beall, Andrew C

    2005-01-01

    The accomplishments were threefold. First, a software tool for rendering virtual environments was developed, a tool useful for other researchers interested in visual perception and visual control of action...

  3. Developing new serious games tools to improve radiation protection

    International Nuclear Information System (INIS)

    Majersky, T.; Rapant, T.; Bayer, M.; Majersky, D.

    2008-01-01

    In this paper, novel software technologies for simulation and training of workers in radiologically dangerous conditions are presented. Such new software tools enable radiation protection managers and workers to better evaluate, visualize and intuitively understand the radiation situation. In the first part of the paper, the virtual reality planning tool ALPLANNER is introduced. ALPLANNER enables computation of workers' doses and 3D simulation of planned activities in the environment. In the second part of the paper, the software technology SPACEVISION for real-time interactive 3D visualization of radioactivity is presented. Radiation fields can be spatially and dynamically visualized in the environment using computer games technologies. Such real-time visualization can be used by RP staff to compute and visualize direct responses of the radiation field to the effects of shielding. Another presented application is the determination and visualization of activity sources in inhomogeneous radiation fields. A practical example of how the mentioned software technologies are used during the decommissioning of NPP A-1 Jaslovske Bohunice is provided. (authors)

  4. Developing new serious games tools to improve radiation protection

    International Nuclear Information System (INIS)

    Majersky, T.; Rapant, T.; Bayer, M.; Majersky, D.

    2009-01-01

    In this paper, novel software technologies for simulation and training of workers in radiologically dangerous conditions are presented. Such new software tools enable radiation protection managers and workers to better evaluate, visualize and intuitively understand the radiation situation. In the first part of the paper, the virtual reality planning tool ALPLANNER is introduced. ALPLANNER enables computation of workers' doses and 3D simulation of planned activities in the environment. In the second part of the paper, the software technology SPACEVISION for real-time interactive 3D visualization of radioactivity is presented. Radiation fields can be spatially and dynamically visualized in the environment using computer games technologies. Such real-time visualization can be used by RP staff to compute and visualize direct responses of the radiation field to the effects of shielding. Another presented application is the determination and visualization of activity sources in inhomogeneous radiation fields. A practical example of how the mentioned software technologies are used during the decommissioning of NPP A-1 Jaslovske Bohunice is provided. (authors)

  5. sbml-diff: A Tool for Visually Comparing SBML Models in Synthetic Biology.

    Science.gov (United States)

    Scott-Brown, James; Papachristodoulou, Antonis

    2017-07-21

    We present sbml-diff, a tool that is able to read a model of a biochemical reaction network in SBML format and produce a range of diagrams showing different levels of detail. Each diagram type can be used to visualize a single model or to visually compare two or more models. The default view depicts species as ellipses, reactions as rectangles, rules as parallelograms, and events as diamonds. A cartoon view replaces the symbols used for reactions on the basis of the associated Systems Biology Ontology terms. An abstract view represents species as ellipses and draws edges between them to indicate whether a species increases or decreases the production or degradation of another species. sbml-diff is freely licensed under the three-clause BSD license and can be downloaded from https://github.com/jamesscottbrown/sbml-diff and used as a python package called from other software, as a free-standing command-line application, or online using the form at http://sysos.eng.ox.ac.uk/tebio/upload.

  6. Aperture Photometry Tool

    Science.gov (United States)

    Laher, Russ R.; Gorjian, Varoujan; Rebull, Luisa M.; Masci, Frank J.; Fowler, John W.; Helou, George; Kulkarni, Shrinivas R.; Law, Nicholas M.

    2012-07-01

    Aperture Photometry Tool (APT) is software for astronomers and students interested in manually exploring the photometric qualities of astronomical images. It is a graphical user interface (GUI) designed to allow the image data associated with aperture photometry calculations for point and extended sources to be visualized and, therefore, more effectively analyzed. The finely tuned layout of the GUI, along with judicious use of color-coding and alerting, is intended to give maximal user utility and convenience. Simply mouse-clicking on a source in the displayed image will instantly draw a circular or elliptical aperture and sky annulus around the source and will compute the source intensity and its uncertainty, along with several commonly used measures of the local sky background and its variability. The results are displayed and can be optionally saved to an aperture-photometry-table file and plotted on graphs in various ways using functions available in the software. APT is geared toward processing sources in a small number of images and is not suitable for bulk processing a large number of images, unlike other aperture photometry packages (e.g., SExtractor). However, APT does have a convenient source-list tool that enables calculations for a large number of detections in a given image. The source-list tool can be run either in automatic mode to generate an aperture photometry table quickly or in manual mode to permit inspection and adjustment of the calculation for each individual detection. APT displays a variety of useful graphs with just the push of a button, including image histogram, x and y aperture slices, source scatter plot, sky scatter plot, sky histogram, radial profile, curve of growth, and aperture-photometry-table scatter plots and histograms. APT has many functions for customizing the calculations, including outlier rejection, pixel “picking” and “zapping,” and a selection of source and sky models. The radial-profile-interpolation source

  7. Particle Track Visualization using the MCNP Visual Editor

    International Nuclear Information System (INIS)

    Schwarz, Randolph A.; Carter, Lee; Brown, Wendi A.

    2001-01-01

    The Monte Carlo N-Particle (MCNP) visual editor [1-3] is used throughout the world for displaying and creating complex MCNP geometries. The visual editor combines the Los Alamos MCNP Fortran code with a C front end to provide a visual interface. A big advantage of this approach is that the particle transport routines for MCNP are available to the visual front end. The latest release of the visual editor by Pacific Northwest National Laboratory enables the user to plot transport data points on top of a two-dimensional geometry plot. The user can plot source points, collision points, surface crossings, and tally contributions. This capability can be used to show where particle collisions are occurring, verify the effectiveness of the particle biasing, or show which collisions contribute to a tally. For a KCODE (criticality source) calculation, the visual editor can be used to plot the source points for specific cycles.

  8. Enhancing Nuclear Newcomer Training with 3D Visualization Learning Tools

    International Nuclear Information System (INIS)

    Gagnon, V.

    2016-01-01

    Full text: While the nuclear power industry is trying to reinforce its safety and regain public support post-Fukushima, it is also faced with a very real challenge that affects its day-to-day activities: a rapidly aging workforce. Statistics show that close to 40% of the current nuclear power industry workforce will retire within the next five years. For newcomer countries, the challenge is even greater, having to develop a completely new workforce. The workforce replacement effort introduces nuclear newcomers of a new generation with different backgrounds and affinities. Major lifestyle differences between the two generations of workers result, amongst other things, in different learning habits and needs for this new breed of learners. Interactivity, high visual content and quick access to information are now necessary to achieve a high level of retention. To enhance existing training programmes or to support the establishment of new training programmes for newcomer countries, L-3 MAPPS has devised learning tools to enhance these training programmes focused on the “Practice-by-Doing” principle. L-3 MAPPS has coupled 3D computer visualization with high-fidelity simulation to bring real-time, simulation-driven animated components and systems allowing immersive and participatory, individual or classroom learning. (author

  9. Book4All: A Tool to Make an e-Book More Accessible to Students with Vision/Visual-Impairments

    Science.gov (United States)

    Calabrò, Antonello; Contini, Elia; Leporini, Barbara

    Empowering people who are blind or otherwise visually impaired includes ensuring that products and electronic materials incorporate a broad range of accessibility features and work well with screen readers and other assistive technology devices. This is particularly important for students with vision impairments. Unfortunately, authors and publishers often do not include specific criteria when preparing the contents. Consequently, e-books can be inadequate for blind and low vision users, especially for students. In this paper we describe a semi-automatic tool developed to support operators who adapt e-documents for visually impaired students. The proposed tool can be used to convert a PDF e-book into a more suitable accessible and usable format readable on desktop computer or on mobile devices.

  10. Image Processing Tools for Improved Visualization and Analysis of Remotely Sensed Images for Agriculture and Forest Classifications

    OpenAIRE

    SINHA G. R.

    2017-01-01

    This paper suggests image processing tools for improved visualization and better analysis of remotely sensed images. There are methods already available in the literature for this purpose, but the most important challenge among their limitations is a lack of robustness. We propose an optimal method for image enhancement using fuzzy-based approaches and a few optimization tools. The segmented images subsequently obtained after de-noising will be classified into distinct information and th...

  11. A simple quality assurance test tool for the visual verification of light and radiation field congruent using electronic portal images device and computed radiography

    International Nuclear Information System (INIS)

    Njeh, Christopher F; Caroprese, Blas; Desai, Pushkar

    2012-01-01

    The radiation field on most megavoltage radiation therapy units is indicated by a light field projected through the collimator by a light source mounted inside the collimator. The light field is traditionally used for patient alignment; hence it is imperative that the light field be congruent with the radiation field. A simple quality assurance tool has been designed for rapid and simple testing of light and radiation field congruence using an electronic portal imaging device (EPID) or computed radiography (CR). We tested this QA tool using the Varian PortalVision and Elekta iViewGT EPID systems and a Kodak CR system. Both the single and double exposure techniques were evaluated, with the double exposure technique providing better visualization of the light-radiation field markers. The light and radiation congruence could be detected within 1 mm. This satisfies the American Association of Physicists in Medicine Task Group 142 recommendation of a 2 mm tolerance. The QA tool can be used with either an EPID or CR to provide a simple and rapid method to verify light and radiation field congruence

  12. VSEARCH: a versatile open source tool for metagenomics.

    Science.gov (United States)

    Rognes, Torbjørn; Flouri, Tomáš; Nichols, Ben; Quince, Christopher; Mahé, Frédéric

    2016-01-01

    VSEARCH is an open source and free of charge multithreaded 64-bit tool for processing and preparing metagenomics, genomics and population genomics nucleotide sequence data. It is designed as an alternative to the widely used USEARCH tool (Edgar, 2010) for which the source code is not publicly available, algorithm details are only rudimentarily described, and only a memory-confined 32-bit version is freely available for academic use. When searching nucleotide sequences, VSEARCH uses a fast heuristic based on words shared by the query and target sequences in order to quickly identify similar sequences, a similar strategy is probably used in USEARCH. VSEARCH then performs optimal global sequence alignment of the query against potential target sequences, using full dynamic programming instead of the seed-and-extend heuristic used by USEARCH. Pairwise alignments are computed in parallel using vectorisation and multiple threads. VSEARCH includes most commands for analysing nucleotide sequences available in USEARCH version 7 and several of those available in USEARCH version 8, including searching (exact or based on global alignment), clustering by similarity (using length pre-sorting, abundance pre-sorting or a user-defined order), chimera detection (reference-based or de novo ), dereplication (full length or prefix), pairwise alignment, reverse complementation, sorting, and subsampling. VSEARCH also includes commands for FASTQ file processing, i.e., format detection, filtering, read quality statistics, and merging of paired reads. Furthermore, VSEARCH extends functionality with several new commands and improvements, including shuffling, rereplication, masking of low-complexity sequences with the well-known DUST algorithm, a choice among different similarity definitions, and FASTQ file format conversion. VSEARCH is here shown to be more accurate than USEARCH when performing searching, clustering, chimera detection and subsampling, while on a par with USEARCH for paired

  13. A Tool for Interactive Data Visualization: Application to Over 10,000 Brain Imaging and Phantom MRI Data Sets

    OpenAIRE

    Panta, Sandeep R.; Wang, Runtang; Fries, Jill; Kalyanam, Ravi; Speer, Nicole; Banich, Marie; Kiehl, Kent; King, Margaret; Milham, Michael; Wager, Tor D.; Turner, Jessica A.; Plis, Sergey M.; Calhoun, Vince D.

    2016-01-01

    In this paper we propose a web-based approach for quick visualization of big data from brain magnetic resonance imaging (MRI) scans using a combination of an automated image capture and processing system, nonlinear embedding, and interactive data visualization tools. We draw upon thousands of MRI scans captured via the COllaborative Imaging and Neuroinformatics Suite (COINS). We then interface the output of several analysis pipelines based on structural and functional data to a t-distributed ...
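
    As a hedged illustration of the nonlinear embedding step mentioned in this record (t-distributed stochastic neighbor embedding), the sketch below projects a matrix of per-scan features to two dimensions with scikit-learn and writes coordinates for an interactive scatter plot; the feature matrix is synthetic and merely stands in for the COINS-derived metrics, which are not reproduced here.

      # Illustrative only: embed per-scan feature vectors in 2-D with t-SNE
      # (scikit-learn). The random feature matrix stands in for real QC metrics.
      import numpy as np
      from sklearn.manifold import TSNE

      rng = np.random.default_rng(0)
      features = rng.normal(size=(500, 32))   # 500 hypothetical scans, 32 metrics each

      embedding = TSNE(n_components=2, perplexity=30, init="pca",
                       random_state=0).fit_transform(features)

      np.savetxt("scan_embedding.csv", embedding, delimiter=",",
                 header="x,y", comments="")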

  14. An Open-Source Label Atlas Correction Tool and Preliminary Results on Huntington's Disease Whole-Brain MRI Atlases.

    Science.gov (United States)

    Forbes, Jessica L; Kim, Regina E Y; Paulsen, Jane S; Johnson, Hans J

    2016-01-01

    The creation of high-quality medical imaging reference atlas datasets with consistent dense anatomical region labels is a challenging task. Reference atlases have many uses in medical image applications and are essential components of atlas-based segmentation tools commonly used for producing personalized anatomical measurements for individual subjects. The process of manual identification of anatomical regions by experts is regarded as a so-called gold standard; however, it is usually impractical because of the labor-intensive costs. Further, as the number of regions of interest increases, these manually created atlases often contain many small, inconsistently labeled or disconnected regions that need to be identified and corrected. This project proposes an efficient process to drastically reduce the time necessary for manual revision in order to improve atlas label quality. We introduce the LabelAtlasEditor tool, a SimpleITK-based open-source label atlas correction tool distributed within the image visualization software 3D Slicer. LabelAtlasEditor incorporates several 3D Slicer widgets into one consistent interface and provides label-specific correction tools, allowing for rapid identification, navigation, and modification of the small, disconnected erroneous labels within an atlas. The technical details for the implementation and performance of LabelAtlasEditor are demonstrated using an application of improving a set of 20 Huntington's Disease-specific multi-modal brain atlases. Additionally, we present the advantages and limitations of automatic atlas correction. After the correction of atlas inconsistencies and small, disconnected regions, the number of unidentified voxels for each dataset was reduced on average by 68.48%.
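
    As a rough, hedged sketch of the kind of check such a correction tool automates, the following code flags small, disconnected islands within each label of a 3D atlas volume; it uses scipy rather than the SimpleITK/3D Slicer machinery of LabelAtlasEditor, and the size threshold is an arbitrary assumption.

      # Hedged sketch: list small disconnected components per label in a 3-D
      # label image; not the LabelAtlasEditor implementation.
      import numpy as np
      from scipy import ndimage

      def small_islands(atlas, min_voxels=50):
          """Return {label: [sizes of components below min_voxels]}."""
          report = {}
          for label in np.unique(atlas):
              if label == 0:                         # assume 0 is background
                  continue
              components, n = ndimage.label(atlas == label)
              sizes = np.bincount(components.ravel())[1:]
              small = [int(s) for s in sizes if s < min_voxels]
              if small:
                  report[int(label)] = small
          return report

      # Toy volume: one label with a deliberately disconnected stray voxel.
      toy = np.zeros((20, 20, 20), dtype=np.int32)
      toy[2:8, 2:8, 2:8] = 1
      toy[15, 15, 15] = 1
      print(small_islands(toy))                      # expect {1: [1]}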

  15. ThinkHazard!: an open-source, global tool for understanding hazard information

    Science.gov (United States)

    Fraser, Stuart; Jongman, Brenden; Simpson, Alanna; Nunez, Ariel; Deparday, Vivien; Saito, Keiko; Murnane, Richard; Balog, Simone

    2016-04-01

    Rapid and simple access to added-value natural hazard and disaster risk information is a key issue for various stakeholders of the development and disaster risk management (DRM) domains. Accessing available data often requires specialist knowledge of heterogeneous data, which are often highly technical and can be difficult for non-specialists in DRM to find and exploit. Thus, availability, accessibility and processing of these information sources are crucial issues, and an important reason why many development projects suffer significant impacts from natural hazards. The World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR) is currently developing a new open-source tool to address this knowledge gap: ThinkHazard! The main aim of the ThinkHazard! project is to develop an analytical tool dedicated to facilitating improvements in knowledge and understanding of natural hazards among non-specialists in DRM. It also aims at providing users with relevant guidance and information on handling the threats posed by the natural hazards present in a chosen location. Furthermore, all aspects of this tool will be open and transparent, in order to give users enough information to understand its operational principles. In this presentation, we will explain the technical approach behind the tool, which translates state-of-the-art probabilistic natural hazard data into understandable hazard classifications and practical recommendations. We will also demonstrate the functionality of the tool, and discuss limitations from a scientific as well as an operational perspective.

  16. Dynamic visualizations as tools for supporting cosmological literacy

    Science.gov (United States)

    Buck, Zoe Elizabeth

    My dissertation research is designed to improve access to STEM content through the development of cosmology visualizations that support all learners as they engage in cosmological sense-making. To better understand how to design visualizations that work toward breaking cycles of power and access in the sciences, I orient my work to following "meta-question": How might educators use visualizations to support diverse ways of knowing and learning in order to expand access to cosmology, and to science? In this dissertation, I address this meta-question from a pragmatic epistemological perspective, through a sociocultural lens, following three lines of inquiry: experimental methods (Creswell, 2003) with a focus on basic visualization design, activity analysis (Wells, 1996; Ash, 2001; Rahm, 2012) with a focus on culturally and linguistically diverse learners, and case study (Creswell, 2000) with a focus on expansive learning at a planetarium (Engestrom, 2001; Ash, 2014). My research questions are as follows, each of which corresponds to a self contained course of inquiry with its own design, data, analysis and results: 1) Can mediational cues like color affect the way learners interpret the content in a cosmology visualization? 2) How do cosmology visualizations support cosmological sense-making for diverse students? 3) What are the shared objects of dynamic networks of activity around visualization production and use in a large, urban planetarium and how do they affect learning? The result is a mixed-methods design (Sweetman, Badiee & Creswell, 2010) where both qualitative and quantitative data are used when appropriate to address my research goals. In the introduction I begin by establishing a theoretical framework for understanding visualizations within cultural historical activity theory (CHAT) and situating the chapters that follow within that framework. I also introduce the concept of cosmological literacy, which I define as the set of conceptual, semiotic and

  17. FREEWAT: an HORIZON 2020 project to build open source tools for water management.

    Science.gov (United States)

    Rossetto, Rudy; Borsi, Iacopo; Foglia, Laura

    2015-04-01

    tools for better producing feasibility and management plans; (ii) a set of activities devoted to fix bugs and to provide a well-integrated interface for the different tools implemented. Further capabilities to be integrated are: - a dedicated module for water management and planning that will help to manage and aggregate all the distributed data coming from the simulation scenarios; - a whole module for calibration, uncertainty and sensitivity analysis; - a module for solute transport in the unsaturated zone; - a module for crop growth and water requirements in agriculture; - tools for dealing with groundwater quality issues; - tools for the analysis, interpretation and visualization of hydrogeological data. Through creating a common environment among water research/professionals, policy makers and implementers, FREEWAT main impact will be on enhancing science- and participatory approach and evidence-based decision making in water resource management, hence producing relevant and appropriate outcomes for policy implementation. The Consortium is constituted by partners from various water sectors from 10 EU countries, plus Turkey and Ukraine. Synergies with the UNESCO HOPE initiative on free and open source software in water management greatly boost the value of the project. Large stakeholders involvement is thought to guarantee results dissemination and exploitation. Acknowledgements This paper is presented within the framework of the project FREEWAT, which has received funding from the European Union's Horizon 2020 research and innovation programme under Grant Agreement n. 642224. References MARSOL (2014). Demonstrating Managed Aquifer Recharge as a Solution to Water Scarcity and Drought www.marsol.eu [accessed 4 January 2015] Rossetto, R., Borsi, I., Schifani, C., Bonari, E., Mogorovich P. & Primicerio M. (2013) - SID&GRID: integrating hydrological modeling in GIS environment hydroinformatics system for the management of the water resource. Rendiconti Online Societa

  18. Drishti: a volume exploration and presentation tool

    Science.gov (United States)

    Limaye, Ajay

    2012-10-01

    Among several rendering techniques for volumetric data, direct volume rendering is a powerful visualization tool for a wide variety of applications. This paper describes the major features of the hardware-based volume exploration and presentation tool Drishti. The word Drishti stands for vision or insight in Sanskrit, an ancient Indian language. Drishti is a cross-platform open-source volume rendering system that delivers high-quality, state-of-the-art renderings. The features in Drishti include, but are not limited to, production-quality rendering, volume sculpting, multi-resolution zooming, transfer function blending, profile generation, measurement tools, mesh generation, and stereo/anaglyph/crosseye renderings. Ultimately, Drishti provides an intuitive and powerful interface for choreographing animations.

  19. Visualization of documents and concepts in neuroinformatics with the 3D-SE viewer

    Directory of Open Access Journals (Sweden)

    Antoine P Naud

    2007-11-01

    Full Text Available A new interactive visualization tool is proposed for mining text data from various fields of neuroscience. Applications to several text datasets are presented to demonstrate the capability of the proposed interactive tool to visualize complex relationships between pairs of lexical entities (with some semantic contents) such as terms, keywords, posters, or papers' abstracts. Implemented as a Java applet, this tool is based on the spherical embedding (SE) algorithm, which was designed for the visualization of bipartite graphs. Items such as words and documents are linked on the basis of occurrence relationships, which can be represented in a bipartite graph. These items are visualized by embedding the vertices of the bipartite graph on spheres in a three-dimensional (3-D) space. The main advantage of the proposed visualization tool is that 3-D layouts can convey more information than planar or linear displays of items or graphs. Different kinds of information extracted from texts, such as keywords, indexing terms, or topics are visualized, allowing interactive browsing of various fields of research featured by keywords, topics, or research teams. A typical use of the 3D-SE viewer is quick browsing of topics displayed on a sphere; then selecting one or several item(s) displays links to related terms on another sphere representing, e.g., documents or abstracts, and provides direct online access to the document source in a database, such as the Visiome Platform or the SfN Annual Meeting. Developed as a Java applet, it operates as a tool on top of existing resources.
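
    The bipartite occurrence model behind the viewer can be sketched as follows; this is only an illustration of the data structure with a placeholder layout on two spheres (a Fibonacci lattice), not the spherical embedding algorithm itself, and the toy keyword/document sets are invented.

      # Hedged illustration: a keyword-document occurrence graph with each vertex
      # set placed on its own sphere. The Fibonacci-lattice placement is only a
      # placeholder; the actual SE algorithm optimizes positions from the graph.
      import numpy as np

      docs = {"doc1": {"retina", "spike"}, "doc2": {"spike", "cortex"},
              "doc3": {"cortex", "retina"}}                     # toy keyword sets
      keywords = sorted(set().union(*docs.values()))
      edges = [(k, d) for d, kws in sorted(docs.items()) for k in sorted(kws)]

      def fibonacci_sphere(n, radius=1.0):
          i = np.arange(n)
          phi = np.arccos(1 - 2 * (i + 0.5) / n)                # polar angle
          theta = np.pi * (1 + 5 ** 0.5) * i                    # golden-angle azimuth
          return np.c_[np.sin(phi) * np.cos(theta),
                       np.sin(phi) * np.sin(theta),
                       np.cos(phi)] * radius

      kw_xyz = dict(zip(keywords, fibonacci_sphere(len(keywords), 1.0)))
      doc_xyz = dict(zip(sorted(docs), fibonacci_sphere(len(docs), 2.0)))
      print(edges[0], kw_xyz[edges[0][0]], doc_xyz[edges[0][1]])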

  1. Synchrotron light sources: A powerful tool for science and technology

    International Nuclear Information System (INIS)

    Schlachter, F.; Robinson, A.

    1996-01-01

    A new generation of synchrotron light sources is producing extremely bright beams of vacuum-ultraviolet and x-ray radiation, powerful new tools for research in a wide variety of basic and applied sciences. Spectromicroscopy using high spectral and spatial resolution is a new way of seeing, offering many opportunities in the study of matter. Development of a new light source provides the country or region of the world in which the light source is located many new opportunities: a focal point for research in many scientific and technological areas, a means of upgrading the technology infrastructure of the country, a means of training students, and a potential service to industry. A light source for Southeast Asia would thus be a major resource for many years. Scientists and engineers from light sources around the world look forward to providing assistance to make this a reality in Southeast Asia

  2. AceTree: a tool for visual analysis of Caenorhabditis elegans embryogenesis

    Directory of Open Access Journals (Sweden)

    Araya Carlos L

    2006-06-01

    Full Text Available Abstract Background The invariant lineage of the nematode Caenorhabditis elegans has potential as a powerful tool for the description of mutant phenotypes and gene expression patterns. We previously described procedures for the imaging and automatic extraction of the cell lineage from C. elegans embryos. That method uses time-lapse confocal imaging of a strain expressing histone-GFP fusions and a software package, StarryNite, processes the thousands of images and produces output files that describe the location and lineage relationship of each nucleus at each time point. Results We have developed a companion software package, AceTree, which links the images and the annotations using tree representations of the lineage. This facilitates curation and editing of the lineage. AceTree also contains powerful visualization and interpretive tools, such as space filling models and tree-based expression patterning, that can be used to extract biological significance from the data. Conclusion By pairing a fast lineaging program written in C with a user interface program written in Java we have produced a powerful software suite for exploring embryonic development.

  3. NeuroTessMesh: A Tool for the Generation and Visualization of Neuron Meshes and Adaptive On-the-Fly Refinement

    Directory of Open Access Journals (Sweden)

    Juan J. Garcia-Cantero

    2017-06-01

    Full Text Available Gaining a better understanding of the human brain continues to be one of the greatest challenges for science, largely because of the overwhelming complexity of the brain and the difficulty of analyzing the features and behavior of dense neural networks. Regarding analysis, 3D visualization has proven to be a useful tool for the evaluation of complex systems. However, the large number of neurons in non-trivial circuits, together with their intricate geometry, makes the visualization of a neuronal scenario an extremely challenging computational problem. Previous work in this area dealt with the generation of 3D polygonal meshes that approximated the cells’ overall anatomy but did not attempt to deal with the extremely high storage and computational cost required to manage a complex scene. This paper presents NeuroTessMesh, a tool specifically designed to cope with many of the problems associated with the visualization of neural circuits that are comprised of large numbers of cells. In addition, this method facilitates the recovery and visualization of the 3D geometry of cells included in databases, such as NeuroMorpho, and provides the tools needed to approximate missing information such as the soma’s morphology. This method takes as its only input the available compact, yet incomplete, morphological tracings of the cells as acquired by neuroscientists. It uses a multiresolution approach that combines an initial, coarse mesh generation with subsequent on-the-fly adaptive mesh refinement stages using tessellation shaders. For the coarse mesh generation, a novel approach, based on the Finite Element Method, allows approximation of the 3D shape of the soma from its incomplete description. Subsequently, the adaptive refinement process performed in the graphic card generates meshes that provide good visual quality geometries at a reasonable computational cost, both in terms of memory and rendering time. All the described techniques have been

  4. [Visual cues as a therapeutic tool in Parkinson's disease. A systematic review].

    Science.gov (United States)

    Muñoz-Hellín, Elena; Cano-de-la-Cuerda, Roberto; Miangolarra-Page, Juan Carlos

    2013-01-01

    Sensory stimuli or sensory cues are being used as a therapeutic tool for improving gait disorders in Parkinson's disease patients, but most studies seem to focus on auditory stimuli. The aim of this study was to conduct a systematic review regarding the use of visual cues over gait disorders, dual tasks during gait, freezing and the incidence of falls in patients with Parkinson to obtain therapeutic implications. We conducted a systematic review in main databases such as Cochrane Database of Systematic Reviews, TripDataBase, PubMed, Ovid MEDLINE, Ovid EMBASE and Physiotherapy Evidence Database, during 2005 to 2012, according to the recommendations of the Consolidated Standards of Reporting Trials, evaluating the quality of the papers included with the Downs & Black Quality Index. 21 articles were finally included in this systematic review (with a total of 892 participants) with variable methodological quality, achieving an average of 17.27 points in the Downs and Black Quality Index (range: 11-21). Visual cues produce improvements over temporal-spatial parameters in gait, turning execution, reducing the appearance of freezing and falls in Parkinson's disease patients. Visual cues appear to benefit dual tasks during gait, reducing the interference of the second task. Further studies are needed to determine the preferred type of stimuli for each stage of the disease. Copyright © 2012 SEGG. Published by Elsevier Espana. All rights reserved.

  5. Advancements to Visualization Control System (VCS, part of UV-CDAT), a Visualization Package Designed for Climate Scientists

    Science.gov (United States)

    Lipsa, D.; Chaudhary, A.; Williams, D. N.; Doutriaux, C.; Jhaveri, S.

    2017-12-01

    Climate Data Analysis Tools (UV-CDAT, https://uvcdat.llnl.gov) is a data analysis and visualization software package developed at Lawrence Livermore National Laboratory and designed for climate scientists. Core components of UV-CDAT include: 1) the Community Data Management System (CDMS), which provides I/O support and a data model for climate data; 2) CDAT Utilities (GenUtil), which process data using spatial and temporal averaging and statistic functions; and 3) the Visualization Control System (VCS) for interactive visualization of the data. VCS is a Python visualization package primarily built for climate scientists; however, because of its generality and breadth of functionality, it can be a useful tool for other scientific applications. VCS provides 1D, 2D and 3D visualization functions such as scatter plots and line graphs for 1D data; boxfill, meshfill, isofill and isoline for 2D scalar data; vector glyphs and streamlines for 2D vector data; and 3d_scalar and 3d_vector for 3D data. Specifically for climate data, our plotting routines include projections, Skew-T plots and Taylor diagrams. While VCS provided a user-friendly API, its previous implementation relied on a slow-performing vector-graphics (Cairo) backend, which is suitable only for smaller datasets and non-interactive graphics. The LLNL and Kitware team has added a new backend to VCS that uses the Visualization Toolkit (VTK) as its visualization backend. VTK is one of the most popular open-source, multi-platform scientific visualization libraries written in C++. Its use of OpenGL and a pipeline processing architecture results in a highly performant VCS library. Its multitude of available data formats and visualization algorithms results in easy adoption of new visualization methods and new data formats in VCS. In this presentation, we describe recent contributions to VCS that include new visualization plots, continuous integration testing using Conda and CircleCI, tutorials and examples using Jupyter notebooks as well as
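
    A minimal sketch of the kind of VCS call described in this record is given below; it assumes a CDAT/UV-CDAT installation, uses the "clt.nc"/"clt" sample file and variable commonly shipped with CDAT, and method names may differ slightly between releases.

      # Hedged sketch of a basic VCS boxfill plot; assumes CDAT/UV-CDAT is
      # installed and that "clt.nc"/"clt" name an available file and variable.
      import cdms2
      import vcs

      f = cdms2.open("clt.nc")          # open a NetCDF file through CDMS
      clt = f("clt")                    # read the variable with its metadata

      canvas = vcs.init()               # create a VCS canvas (VTK backend)
      boxfill = canvas.createboxfill()  # 2-D scalar graphic method
      canvas.plot(clt, boxfill)         # render the field
      canvas.png("clt_boxfill")         # save the result to a PNG file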

  6. Evaluation of a visual risk communication tool: effects on knowledge and perception of blood transfusion risk.

    Science.gov (United States)

    Lee, D H; Mehta, M D

    2003-06-01

    Effective risk communication in transfusion medicine is important for health-care consumers, but understanding the numerical magnitude of risks can be difficult. The objective of this study was to determine the effect of a visual risk communication tool on the knowledge and perception of transfusion risk. Laypeople were randomly assigned to receive transfusion risk information with either a written or a visual presentation format for communicating and comparing the probabilities of transfusion risks relative to other hazards. Knowledge of transfusion risk was ascertained with a multiple-choice quiz and risk perception was ascertained by psychometric scaling and principal components analysis. Two-hundred subjects were recruited and randomly assigned. Risk communication with both written and visual presentation formats increased knowledge of transfusion risk and decreased the perceived dread and severity of transfusion risk. Neither format changed the perceived knowledge and control of transfusion risk, nor the perceived benefit of transfusion. No differences in knowledge or risk perception outcomes were detected between the groups randomly assigned to written or visual presentation formats. Risk communication that incorporates risk comparisons in either written or visual presentation formats can improve knowledge and reduce the perception of transfusion risk in laypeople.

  7. VisComposer: A Visual Programmable Composition Environment for Information Visualization

    Directory of Open Access Journals (Sweden)

    Honghui Mei

    2018-03-01

    Full Text Available As the amount of data being collected has increased, the need for tools that can enable the visual exploration of data has also grown. This has led to the development of a variety of widely used programming frameworks for information visualization. Unfortunately, such frameworks demand comprehensive visualization and coding skills and require users to develop visualizations from scratch. An alternative is to create interactive visualization design environments that require little to no programming. However, these tools only support a small portion of visual forms. We present a programmable integrated development environment (IDE), VisComposer, that supports the development of expressive visualizations using a drag-and-drop visual interface. VisComposer exposes the programmability by customizing desired components within a modularized visualization composition pipeline, effectively balancing the capability gap between expert coders and visualization artists. The implemented system empowers users to compose comprehensive visualizations with real-time preview and optimization features, and supports prototyping, sharing and reuse of the effects by means of an intuitive visual composer. Visual programming and textual programming integrated in our system allow users to compose more complex visual effects while retaining the simplicity of use. We demonstrate the performance of VisComposer with a variety of examples and an informal user evaluation. Keywords: Information Visualization, Visualization authoring, Interactive development environment

  8. Web tools for effective retrieval, visualization, and evaluation of cardiology medical images and records

    Science.gov (United States)

    Masseroli, Marco; Pinciroli, Francesco

    2000-12-01

    To provide easy retrieval, integration and evaluation of multimodal cardiology images and data in a web browser environment, distributed application technologies and Java programming were used to implement a client-server architecture based on software agents. The server side manages secure connections and queries to heterogeneous remote databases and file systems containing patient personal and clinical data. The client side is a Java applet running in a web browser and providing a friendly medical user interface to perform queries on patient and medical test data and to integrate and properly visualize the various query results. A set of tools based on the Java Advanced Imaging API enables processing and analysis of the retrieved cardiology images, and quantification of their features in different regions of interest. The platform independence of Java technology makes the developed prototype easy to manage in a centralized form and to provide at each site where an intranet or internet connection is available. Giving healthcare providers effective tools for querying, visualizing and comprehensively evaluating cardiology medical images and records in all locations where they may need them (i.e. emergency, operating theaters, ward, or even outpatient clinics), the developed prototype represents an important aid in providing more efficient diagnoses and medical treatments.

  9. The effectiveness of dental health education tools for visually impaired students in Bukit Mertajam

    Science.gov (United States)

    Shahabudin, Saadiah; Hashim, Hasnah; Omar, Maizurah

    2016-12-01

    Oral health is a vital component of overall health. It is important in adults and children alike, however, it is even more crucial for children with special needs as they have limited ability to perform oral health practices. Disabled children deserve the same opportunity for oral health as normal children. Unfortunately, oral health care is the most unattended health needs of the disabled children. This study aimed to assess the effectiveness of dental health education tools for visually impaired students in two schools in Bukit Mertajam, Penang. The project utilized dental health education tools consisting of an oral health module (printed in braille for the blind and in font 18px for the partially blind), an audio narration of the module were prepared and content-validated by an expert panel. Baseline plaque scores of 38 subjects aged 6-17 years were determined by a trained dental staff nurse. The module was then administered to the subjects facilitated by the teachers. Post intervention plaque scores were recorded again after one month. The pre and post intervention data were analyzed using Wilcoxon Signed Ranks Test with a significant p value set at among students with visual impairment. We recommend for further studies to be conducted on a bigger sample.

  10. Noodles: a tool for visualization of numerical weather model ensemble uncertainty.

    Science.gov (United States)

    Sanyal, Jibonananda; Zhang, Song; Dyer, Jamie; Mercer, Andrew; Amburn, Philip; Moorhead, Robert J

    2010-01-01

    Numerical weather prediction ensembles are routinely used for operational weather forecasting. The members of these ensembles are individual simulations with either slightly perturbed initial conditions or different model parameterizations, or occasionally both. Multi-member ensemble output is usually large, multivariate, and challenging to interpret interactively. Forecast meteorologists are interested in understanding the uncertainties associated with numerical weather prediction; specifically variability between the ensemble members. Currently, visualization of ensemble members is mostly accomplished through spaghetti plots of a single mid-troposphere pressure surface height contour. In order to explore new uncertainty visualization methods, the Weather Research and Forecasting (WRF) model was used to create a 48-hour, 18 member parameterization ensemble of the 13 March 1993 "Superstorm". A tool was designed to interactively explore the ensemble uncertainty of three important weather variables: water-vapor mixing ratio, perturbation potential temperature, and perturbation pressure. Uncertainty was quantified using individual ensemble member standard deviation, inter-quartile range, and the width of the 95% confidence interval. Bootstrapping was employed to overcome the dependence on normality in the uncertainty metrics. A coordinated view of ribbon and glyph-based uncertainty visualization, spaghetti plots, iso-pressure colormaps, and data transect plots was provided to two meteorologists for expert evaluation. They found it useful in assessing uncertainty in the data, especially in finding outliers in the ensemble run and therefore avoiding the WRF parameterizations that lead to these outliers. Additionally, the meteorologists could identify spatial regions where the uncertainty was significantly high, allowing for identification of poorly simulated storm environments and physical interpretation of these model issues.
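
    The per-grid-point uncertainty metrics named in this record (member standard deviation, inter-quartile range, and a bootstrapped 95% confidence-interval width) can be sketched as follows; the 18-member "ensemble" is synthetic random data, not WRF output.

      # Hedged sketch of ensemble uncertainty metrics with a simple bootstrap of
      # the ensemble mean; the data are synthetic stand-ins for WRF members.
      import numpy as np

      rng = np.random.default_rng(1)
      ensemble = rng.normal(size=(18, 30, 30))      # members x lat x lon (synthetic)

      std = ensemble.std(axis=0, ddof=1)            # member standard deviation
      q25, q75 = np.percentile(ensemble, [25, 75], axis=0)
      iqr = q75 - q25                               # inter-quartile range

      # Bootstrap the ensemble mean (resample members with replacement) to get a
      # 95% confidence-interval width without assuming normality.
      B = 500
      idx = rng.integers(0, ensemble.shape[0], size=(B, ensemble.shape[0]))
      boot_means = ensemble[idx].mean(axis=1)       # B x lat x lon
      lo, hi = np.percentile(boot_means, [2.5, 97.5], axis=0)
      ci_width = hi - lo

      print(std.mean(), iqr.mean(), ci_width.mean())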

  11. PRI-CAT: a web-tool for the analysis, storage and visualization of plant ChIP-seq experiments.

    NARCIS (Netherlands)

    Muino, J.M.; Hoogstraat, M.; Ham, van R.C.H.J.; Dijk, van A.D.J.

    2011-01-01

    Although several tools for the analysis of ChIP-seq data have been published recently, there is a growing demand, in particular in the plant research community, for computational resources with which such data can be processed, analyzed, stored, visualized and integrated within a single,

  12. Evolview v2: an online visualization and management tool for customized and annotated phylogenetic trees.

    Science.gov (United States)

    He, Zilong; Zhang, Huangkai; Gao, Shenghan; Lercher, Martin J; Chen, Wei-Hua; Hu, Songnian

    2016-07-08

    Evolview is an online visualization and management tool for customized and annotated phylogenetic trees. It allows users to visualize phylogenetic trees in various formats, customize the trees through built-in functions and user-supplied datasets and export the customization results to publication-ready figures. Its 'dataset system' contains not only the data to be visualized on the tree, but also 'modifiers' that control various aspects of the graphical annotation. Evolview is a single-page application (like Gmail); its carefully designed interface allows users to upload, visualize, manipulate and manage trees and datasets all in a single webpage. Developments since the last public release include a modern dataset editor with keyword highlighting functionality, seven newly added types of annotation datasets, collaboration support that allows users to share their trees and datasets and various improvements of the web interface and performance. In addition, we included eleven new 'Demo' trees to demonstrate the basic functionalities of Evolview, and five new 'Showcase' trees inspired by publications to showcase the power of Evolview in producing publication-ready figures. Evolview is freely available at: http://www.evolgenius.info/evolview/. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  13. Adoption of Free Open Source Geographic Information System Solution for Health Sector in Zanzibar Tanzania

    OpenAIRE

    BAKAR, Abubakar D.; KIMARO, Honest C.; SULTAN, Abu Bakar MD; HAMIAR, S.

    2014-01-01

    The study aims at developing in-depth understanding on how Open Source Geographic Information System technology is used to provide solutions for data visualization in the health sector of Zanzibar, Tanzania. The study focuses on implementing the health visualization solutions for the purpose of bridging the gap during the transition period from proprietary software to the Free Open-Source Software using Key Indicator Data System. The developed tool facilitates data integration between the two...

  14. Public data and open source tools for multi-assay genomic investigation of disease.

    Science.gov (United States)

    Kannan, Lavanya; Ramos, Marcel; Re, Angela; El-Hachem, Nehme; Safikhani, Zhaleh; Gendoo, Deena M A; Davis, Sean; Gomez-Cabrero, David; Castelo, Robert; Hansen, Kasper D; Carey, Vincent J; Morgan, Martin; Culhane, Aedín C; Haibe-Kains, Benjamin; Waldron, Levi

    2016-07-01

    Molecular interrogation of a biological sample through DNA sequencing, RNA and microRNA profiling, proteomics and other assays, has the potential to provide a systems level approach to predicting treatment response and disease progression, and to developing precision therapies. Large publicly funded projects have generated extensive and freely available multi-assay data resources; however, bioinformatic and statistical methods for the analysis of such experiments are still nascent. We review multi-assay genomic data resources in the areas of clinical oncology, pharmacogenomics and other perturbation experiments, population genomics and regulatory genomics and other areas, and tools for data acquisition. Finally, we review bioinformatic tools that are explicitly geared toward integrative genomic data visualization and analysis. This review provides starting points for accessing publicly available data and tools to support development of needed integrative methods. © The Author 2015. Published by Oxford University Press.

  15. An open-source optimization tool for solar home systems: A case study in Namibia

    International Nuclear Information System (INIS)

    Campana, Pietro Elia; Holmberg, Aksel; Pettersson, Oscar; Klintenberg, Patrik; Hangula, Abraham; Araoz, Fabian Benavente; Zhang, Yang; Stridh, Bengt; Yan, Jinyue

    2016-01-01

    Highlights: • An open-source optimization tool for solar home systems (SHSs) design is developed. • The optimization tool is written in MS Excel-VBA. • The optimization tool is validated with a commercial and open-source software. • The optimization tool has the potential of improving future SHS installations. - Abstract: Solar home systems (SHSs) represent a viable technical solution for providing electricity to households and improving standard of living conditions in areas not reached by the national grid or local grids. For this reason, several rural electrification programmes in developing countries, including Namibia, have been relying on SHSs to electrify rural off-grid communities. However, the limited technical know-how of service providers, often resulting in over- or under-sized SHSs, is an issue that has to be solved to avoid dissatisfaction of SHSs’ users. The solution presented here is to develop an open-source software that service providers can use to optimally design SHS components based on the specific electricity requirements of the end-user. The aim of this study is to develop and validate an optimization model written in MS Excel-VBA which calculates the optimal SHS component capacities guaranteeing the minimum costs and the maximum system reliability. The results obtained with the developed tool showed good agreement with commercial software and a computational code used in research activities. When applying the developed optimization tool to existing systems, the results identified that several components were incorrectly sized. The tool thus has the potential of improving future SHS installations, contributing to increasing satisfaction of end-users.
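
    The cost-versus-reliability optimization described here can be illustrated with a greatly simplified, hedged sketch: enumerate PV and battery capacities, simulate a daily energy balance, and keep the cheapest combination whose loss-of-load fraction stays under a limit. The load profile, solar yield, component prices and reliability limit below are invented assumptions, not values from the paper, and the actual tool is implemented in MS Excel-VBA rather than Python.

      # Hedged, simplified SHS sizing sketch; all numbers are illustrative.
      import itertools
      import numpy as np

      rng = np.random.default_rng(2)
      days = 365
      daily_load = 0.6                                    # kWh/day, assumed demand
      solar_yield = rng.normal(4.5, 1.5, days).clip(0.5)  # kWh per kWp per day

      def loss_of_load(pv_kwp, batt_kwh, efficiency=0.85):
          soc, unmet = batt_kwh, 0
          for y in solar_yield:
              soc = min(batt_kwh, soc + pv_kwp * y * efficiency)
              if soc >= daily_load:
                  soc -= daily_load
              else:
                  soc, unmet = 0.0, unmet + 1
          return unmet / days

      best = None
      for pv, batt in itertools.product(np.arange(0.1, 1.01, 0.05),
                                        np.arange(0.5, 5.01, 0.25)):
          cost = 900 * pv + 250 * batt                    # assumed prices per kWp/kWh
          if loss_of_load(pv, batt) <= 0.05 and (best is None or cost < best[0]):
              best = (cost, pv, batt)

      print(best)                                         # cheapest acceptable design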

  16. How Formal Dynamic Verification Tools Facilitate Novel Concurrency Visualizations

    Science.gov (United States)

    Aananthakrishnan, Sriram; Delisi, Michael; Vakkalanka, Sarvani; Vo, Anh; Gopalakrishnan, Ganesh; Kirby, Robert M.; Thakur, Rajeev

    With the exploding scale of concurrency, presenting valuable pieces of information collected by formal verification tools intuitively and graphically can greatly enhance concurrent system debugging. Traditional MPI program debuggers present trace views of MPI program executions. Such views are redundant, often containing equivalent traces that permute independent MPI calls. In our ISP formal dynamic verifier for MPI programs, we present a collection of alternate views made possible by the use of formal dynamic verification. Some of ISP’s views help pinpoint errors, some facilitate discerning errors by eliminating redundancy, while others help understand the program better by displaying concurrent event orderings that must be respected by all MPI implementations, in the form of completes-before graphs. In this paper, we describe ISP’s graphical user interface (GUI) capabilities in all these areas, which are currently supported by a portable Java-based GUI, a Microsoft Visual Studio GUI, and an Eclipse-based GUI whose development is in progress.

  17. Evaluating role of interactive visualization tool in improving students' conceptual understanding of chemical equilibrium

    Science.gov (United States)

    Sampath Kumar, Bharath

    The purpose of this study is to examine the role of partnering visualization tool such as simulation towards development of student's concrete conceptual understanding of chemical equilibrium. Students find chemistry concepts abstract, especially at the microscopic level. Chemical equilibrium is one such topic. While research studies have explored effectiveness of low tech instructional strategies such as analogies, jigsaw, cooperative learning, and using modeling blocks, fewer studies have explored the use of visualization tool such as simulations in the context of dynamic chemical equilibrium. Research studies have identified key reasons behind misconceptions such as lack of systematic understanding of foundational chemistry concepts, failure to recognize the system is dynamic, solving numerical problems on chemical equilibrium in an algorithmic fashion, erroneous application Le Chatelier's principle (LCP) etc. Kress et al. (2001) suggested that external representation in the form of visualization is more than a tool for learning, because it enables learners to make meanings or express their ideas which cannot be readily done so through a verbal representation alone. Mixed method study design was used towards data collection. The qualitative portion of the study is aimed towards understanding the change in student's mental model before and after the intervention. A quantitative instrument was developed based on common areas of misconceptions identified by research studies. A pilot study was conducted prior to the actual study to obtain feedback from students on the quantitative instrument and the simulation. Participants for the pilot study were sampled from a single general chemistry class. Following the pilot study, the research study was conducted with a total of 27 students (N=15 in experimental group and N=12 in control group). Prior to participating in the study, students have completed their midterm test on the topic of chemical equilibrium. Qualitative

  18. Deploying web-based visual exploration tools on the grid

    Energy Technology Data Exchange (ETDEWEB)

    Jankun-Kelly, T.J.; Kreylos, Oliver; Shalf, John; Ma, Kwan-Liu; Hamann, Bernd; Joy, Kenneth; Bethel, E. Wes

    2002-02-01

    We discuss a web-based portal for the exploration, encapsulation, and dissemination of visualization results over the Grid. This portal integrates three components: an interface client for structured visualization exploration, a visualization web application to manage the generation and capture of the visualization results, and a centralized portal application server to access and manage grid resources. We demonstrate the usefulness of the developed system using an example for Adaptive Mesh Refinement (AMR) data visualization.

  19. When complex is easy on the mind: internal repetition of visual information in complex objects is a source of perceptual fluency

    NARCIS (Netherlands)

    Linda Steg; Roos Pals; Ayça Berfu Ünal; Yannick Joye

    2015-01-01

    Across 3 studies, we investigated whether visual complexity deriving from internally repeating visual information over many scale levels is a source of perceptual fluency. Such continuous repetition of visual information is formalized in fractal geometry and is a key-property of natural structures.

  20. Biological data integration: wrapping data and tools.

    Science.gov (United States)

    Lacroix, Zoé

    2002-06-01

    Nowadays scientific data is inevitably digital and stored in a wide variety of formats in heterogeneous systems. Scientists need to access an integrated view of remote or local heterogeneous data sources with advanced data accessing, analyzing, and visualization tools. Building a digital library for scientific data requires accessing and manipulating data extracted from flat files or databases, documents retrieved from the Web as well as data generated by software. We present an approach to wrapping web data sources, databases, flat files, or data generated by tools through a database view mechanism. Generally, a wrapper has two tasks: it first sends a query to the source to retrieve data and, second builds the expected output with respect to the virtual structure. Our wrappers are composed of a retrieval component based on an intermediate object view mechanism called search views mapping the source capabilities to attributes, and an eXtensible Markup Language (XML) engine, respectively, to perform these two tasks. The originality of the approach consists of: 1) a generic view mechanism to access seamlessly data sources with limited capabilities and 2) the ability to wrap data sources as well as the useful specific tools they may provide. Our approach has been developed and demonstrated as part of the multidatabase system supporting queries via uniform object protocol model (OPM) interfaces.

  1. Towards a New Generation of Time-Series Visualization Tools in the ESA Heliophysics Science Archives

    Science.gov (United States)

    Perez, H.; Martinez, B.; Cook, J. P.; Herment, D.; Fernandez, M.; De Teodoro, P.; Arnaud, M.; Middleton, H. R.; Osuna, P.; Arviset, C.

    2017-12-01

    During the last decades, a varied set of Heliophysics missions has allowed the scientific community to gain better knowledge of the solar atmosphere and activity. The remote sensing images of missions such as SOHO have paved the ground for Helio-based spatial data visualization software such as JHelioViewer/Helioviewer. On the other hand, the huge amount of in-situ measurements provided by other missions such as Cluster provides a wide base for plot visualization software whose reach is still far from being fully exploited. The Heliophysics Science Archives within the ESAC Science Data Center (ESDC) already provide a first generation of tools for time-series visualization focusing on each mission's needs: visualization of quicklook plots, cross-calibration time series, pre-generated/on-demand multi-plot stacks (Cluster), basic plot zoom in/out options (Ulysses) and easy navigation through the plots in time (Ulysses, Cluster, ISS-Solaces). However, the needs evolve, and scientists involved in new missions require plotting of multi-variable data, heat-map stacks, interactive synchronization and axis variable selection, among other improvements. The new Heliophysics archives (such as Solar Orbiter) and the evolution of existing ones (Cluster) intend to address these new challenges. This paper provides an overview of the different approaches for visualizing time-series followed within the ESA Heliophysics Archives and their foreseen evolution.

  2. iELVis: An open source MATLAB toolbox for localizing and visualizing human intracranial electrode data.

    Science.gov (United States)

    Groppe, David M; Bickel, Stephan; Dykstra, Andrew R; Wang, Xiuyuan; Mégevand, Pierre; Mercier, Manuel R; Lado, Fred A; Mehta, Ashesh D; Honey, Christopher J

    2017-04-01

    Intracranial electrical recordings (iEEG) and brain stimulation (iEBS) are invaluable human neuroscience methodologies. However, the value of such data is often unrealized as many laboratories lack tools for localizing electrodes relative to anatomy. To remedy this, we have developed a MATLAB toolbox for intracranial electrode localization and visualization, iELVis. NEW METHOD: iELVis uses existing tools (BioImage Suite, FSL, and FreeSurfer) for preimplant magnetic resonance imaging (MRI) segmentation, neuroimaging coregistration, and manual identification of electrodes in postimplant neuroimaging. Subsequently, iELVis implements methods for correcting electrode locations for postimplant brain shift with millimeter-scale accuracy and provides interactive visualization on 3D surfaces or in 2D slices with optional functional neuroimaging overlays. iELVis also localizes electrodes relative to FreeSurfer-based atlases and can combine data across subjects via the FreeSurfer average brain. It takes 30-60min of user time and 12-24h of computer time to localize and visualize electrodes from one brain. We demonstrate iELVis's functionality by showing that three methods for mapping primary hand somatosensory cortex (iEEG, iEBS, and functional MRI) provide highly concordant results. COMPARISON WITH EXISTING METHODS: iELVis is the first public software for electrode localization that corrects for brain shift, maps electrodes to an average brain, and supports neuroimaging overlays. Moreover, its interactive visualizations are powerful and its tutorial material is extensive. iELVis promises to speed the progress and enhance the robustness of intracranial electrode research. The software and extensive tutorial materials are freely available as part of the EpiSurg software project: https://github.com/episurg/episurg. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. VSEARCH: a versatile open source tool for metagenomics

    Directory of Open Access Journals (Sweden)

    Torbjørn Rognes

    2016-10-01

    Full Text Available Background VSEARCH is an open source and free of charge multithreaded 64-bit tool for processing and preparing metagenomics, genomics and population genomics nucleotide sequence data. It is designed as an alternative to the widely used USEARCH tool (Edgar, 2010) for which the source code is not publicly available, algorithm details are only rudimentarily described, and only a memory-confined 32-bit version is freely available for academic use. Methods When searching nucleotide sequences, VSEARCH uses a fast heuristic based on words shared by the query and target sequences in order to quickly identify similar sequences; a similar strategy is probably used in USEARCH. VSEARCH then performs optimal global sequence alignment of the query against potential target sequences, using full dynamic programming instead of the seed-and-extend heuristic used by USEARCH. Pairwise alignments are computed in parallel using vectorisation and multiple threads. Results VSEARCH includes most commands for analysing nucleotide sequences available in USEARCH version 7 and several of those available in USEARCH version 8, including searching (exact or based on global alignment), clustering by similarity (using length pre-sorting, abundance pre-sorting or a user-defined order), chimera detection (reference-based or de novo), dereplication (full length or prefix), pairwise alignment, reverse complementation, sorting, and subsampling. VSEARCH also includes commands for FASTQ file processing, i.e., format detection, filtering, read quality statistics, and merging of paired reads. Furthermore, VSEARCH extends functionality with several new commands and improvements, including shuffling, rereplication, masking of low-complexity sequences with the well-known DUST algorithm, a choice among different similarity definitions, and FASTQ file format conversion. VSEARCH is here shown to be more accurate than USEARCH when performing searching, clustering, chimera detection and subsampling
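
    The "optimal global sequence alignment ... using full dynamic programming" contrasted here with seed-and-extend can be illustrated with a hedged sketch of the classic Needleman-Wunsch score recurrence; the scoring values are arbitrary and this is nowhere near VSEARCH's vectorised, multithreaded implementation.

      # Hedged sketch of full-dynamic-programming global alignment scoring
      # (Needleman-Wunsch); arbitrary scores, no vectorisation or threading.
      def global_alignment_score(a, b, match=2, mismatch=-4, gap=-2):
          rows, cols = len(a) + 1, len(b) + 1
          score = [[0] * cols for _ in range(rows)]
          for i in range(1, rows):
              score[i][0] = i * gap
          for j in range(1, cols):
              score[0][j] = j * gap
          for i in range(1, rows):
              for j in range(1, cols):
                  diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                  score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
          return score[-1][-1]

      print(global_alignment_score("ACGTACGT", "ACGTTCGT"))   # 7 matches, 1 mismatch: 10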

  4. Data-Proximate Analysis and Visualization in the Cloud using Cloudstream, an Open-Source Application Streaming Technology Stack

    Science.gov (United States)

    Fisher, W. I.

    2017-12-01

    The rise in cloud computing, coupled with the growth of "Big Data", has led to a migration away from local scientific data storage. The increasing size of remote scientific data sets, however, makes it difficult for scientists to subject them to large-scale analysis and visualization. These large datasets can take an inordinate amount of time to download; subsetting is a potential solution, but subsetting services are not yet ubiquitous. Data providers may also pay steep prices, as many cloud providers meter data based on how much data leaves their cloud service. The solution to this problem is a deceptively simple one: move data analysis and visualization tools to the cloud, so that scientists may perform data-proximate analysis and visualization. This results in increased transfer speeds, while egress costs are lowered or completely eliminated. Moving standard desktop analysis and visualization tools to the cloud is enabled via a technique called "Application Streaming". This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations. When coupled with containerization technology such as Docker, we are able to easily deploy legacy analysis and visualization software to the cloud whilst retaining access via a desktop, netbook, a smartphone, or the next generation of hardware, whatever it may be. Unidata has created a Docker-based solution for easily adapting legacy software for Application Streaming. This technology stack, dubbed Cloudstream, allows desktop software to run in the cloud with little-to-no effort. The Docker container is configured by editing text files, and the legacy software does not need to be modified in any way. This work will discuss the underlying technologies used by Cloudstream, and outline how to use Cloudstream to run and access an existing desktop application in the cloud.

  5. An Open-Source Tool Set Enabling Analog-Digital-Software Co-Design

    Directory of Open Access Journals (Sweden)

    Michelle Collins

    2016-02-01

    Full Text Available This paper presents an analog-digital hardware-software co-design environment for simulating and programming reconfigurable systems. The tool simulates, designs, as well as enables experimental measurements after compiling to configurable systems in the same integrated design tool framework. High-level software in Scilab/Xcos (open-source programs similar to MATLAB/Simulink) converts the high-level block description by the user to blif format (sci2blif), which acts as an input to the modified VPR tool, including the code vpr2swcs, encoding the specific platform through specific architecture files and resulting in a targetable switch list on the resulting configurable analog–digital system. The resulting tool uses an analog and mixed-signal library of components, giving users and future researchers access to the basic analog operations/computations that are possible.

  6. Customizable Time-Oriented Visualizations

    DEFF Research Database (Denmark)

    Kuhail, Mohammad Amin; Pantazos, Kostas; Lauesen, Søren

    2012-01-01

    Most commercial visualization tools support an easy and quick creation of conventional time-oriented visualizations such as line charts, but customization is limited. In contrast, some academic visualization tools and programming languages support the creation of some customizable time-oriented visualizations but it is time consuming and hard. To combine efficiency, the effort required to develop a visualization, and customizability, the ability to tailor a visualization, we developed time-oriented building blocks that address the specifics of time (e.g. linear vs. cyclic or point-based vs. interval-based) and consist of inner customizable parts (e.g. ticks). A combination of the time-oriented and other primitive graphical building blocks allowed the creation of several customizable advanced time-oriented visualizations. The appearance and behavior of the blocks are specified using spreadsheet-like formulas. We...

  7. Measuring temporal summation in visual detection with a single-photon source.

    Science.gov (United States)

    Holmes, Rebecca; Victora, Michelle; Wang, Ranxiao Frances; Kwiat, Paul G

    2017-11-01

    Temporal summation is an important feature of the visual system which combines visual signals that arrive at different times. Previous research estimated complete summation to last for 100ms for stimuli judged "just detectable." We measured the full range of temporal summation for much weaker stimuli using a new paradigm and a novel light source, developed in the field of quantum optics for generating small numbers of photons with precise timing characteristics and reduced variance in photon number. Dark-adapted participants judged whether a light was presented to the left or right of their fixation in each trial. In Experiment 1, stimuli contained a stream of photons delivered at a constant rate while the duration was systematically varied. Accuracy should increase with duration as long as the later photons can be integrated with the proceeding ones into a single signal. The temporal integration window was estimated as the point that performance no longer improved, and was found to be 650ms on average. In Experiment 2, the duration of the visual stimuli was kept short (100ms or photons was varied to explore the efficiency of summation over the integration window compared to Experiment 1. There was some indication that temporal summation remains efficient over the integration window, although there is variation between individuals. The relatively long integration window measured in this study may be relevant to studies of the absolute visual threshold, i.e., tests of single-photon vision, where "single" photons should be separated by greater than the integration window to avoid summation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. The Value of Open Source Software Tools in Qualitative Research

    Science.gov (United States)

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  9. 802.11s Wireless Mesh Network Visualization Application

    Science.gov (United States)

    Mauldin, James Alexander

    2014-01-01

    Results of past experimentation at NASA Johnson Space Center showed that the IEEE 802.11s standard has better performance than the widely implemented alternative protocol B.A.T.M.A.N. (Better Approach to Mobile Ad hoc Networking). 802.11s is now formally incorporated into the Wi-Fi 802.11-2012 standard, which specifies a hybrid wireless mesh networking protocol (HWMP). In order to quickly analyze changes to the routing algorithm and to support optimizing the mesh network behavior for our intended application, a visualization tool was developed by modifying and integrating open source tools.

  10. A simple quality assurance test tool for the visual verification of light and radiation field congruent using electronic portal images device and computed radiography

    Directory of Open Access Journals (Sweden)

    Njeh Christopher F

    2012-03-01

    Full Text Available Abstract Background The radiation field on most megavoltage radiation therapy units is shown by a light field projected through the collimator by a light source mounted inside the collimator. The light field is traditionally used for patient alignment. Hence it is imperative that the light field is congruent with the radiation field. Method A simple quality assurance tool has been designed for a rapid and simple test of light and radiation field congruence using an electronic portal imaging device (EPID) or computed radiography (CR). We tested this QA tool using Varian PortalVision and Elekta iViewGT EPID systems and a Kodak CR system. Results Both the single and double exposure techniques were evaluated, with the double exposure technique providing better visualization of the light-radiation field markers. Light and radiation congruence could be detected within 1 mm. This satisfies the American Association of Physicists in Medicine Task Group report number 142 recommendation of a 2 mm tolerance. Conclusion The QA tool can be used with either an EPID or CR to provide a simple and rapid method to verify light and radiation field congruence.

  11. Critical visualization: a case for rethinking how we visualize risk and security

    OpenAIRE

    Hall, Peter; Heath, Claude; Coles-Kemp, Lizzie

    2015-01-01

    In an era of high-profile hacks, information leaks and cybercrime, cybersecurity is the focus of much corporate and state-funded research. Data visualization is regarded as an important tool in the detection and prediction of risk and vulnerability in cybersecurity, but discussion tends to remain at the level of the usability of visualization tools and how to reduce the cognitive load on the consumers of the visualizations. This focus is rooted in a desire to simplify the complexity of cybers...

  12. Visual tool for estimating the fractal dimension of images

    Science.gov (United States)

    Grossu, I. V.; Besliu, C.; Rusu, M. V.; Jipa, Al.; Bordeianu, C. C.; Felea, D.

    2009-10-01

    This work presents a new Visual Basic 6.0 application for estimating the fractal dimension of images, based on an optimized version of the box-counting algorithm. Following the attempt to separate the real information from "noise", we also considered the family of all band-pass filters with the same band-width (specified as a parameter). The fractal dimension can thus be represented as a function of the pixel color code. The program was used for the study of cracks in paintings, as an additional tool which can help the critic decide whether an artistic work is original or not.
    Program summary
    Program title: Fractal Analysis v01
    Catalogue identifier: AEEG_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEG_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 29 690
    No. of bytes in distributed program, including test data, etc.: 4 967 319
    Distribution format: tar.gz
    Programming language: MS Visual Basic 6.0
    Computer: PC
    Operating system: MS Windows 98 or later
    RAM: 30M
    Classification: 14
    Nature of problem: Estimating the fractal dimension of images.
    Solution method: Optimized implementation of the box-counting algorithm. Use of a band-pass filter for separating the real information from "noise". User-friendly graphical interface.
    Restrictions: Although various file types can be used, the application was mainly conceived for the 8-bit grayscale Windows bitmap file format.
    Running time: In a first approximation, the algorithm is linear.
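
    As an illustration of the box-counting idea this record describes (the distributed program itself is written in Visual Basic 6.0), the following sketch estimates a fractal dimension from a binary image; the function name and the test image are hypothetical and not taken from the program.

```python
import numpy as np

def box_counting_dimension(binary_image, box_sizes=(2, 4, 8, 16, 32)):
    """Estimate the box-counting (fractal) dimension of a 2-D binary image.

    For each box size s, count the number N(s) of s-by-s boxes containing
    at least one foreground pixel; the dimension is the slope of
    log N(s) versus log(1/s).
    """
    counts = []
    height, width = binary_image.shape
    for s in box_sizes:
        n = 0
        for i in range(0, height, s):
            for j in range(0, width, s):
                if binary_image[i:i + s, j:j + s].any():
                    n += 1
        counts.append(n)
    # Linear fit in log-log space: log N(s) = D * log(1/s) + c
    slope, _ = np.polyfit(np.log(1.0 / np.array(box_sizes)), np.log(counts), 1)
    return slope

# A filled square should give a dimension close to 2.
image = np.zeros((128, 128), dtype=bool)
image[32:96, 32:96] = True
print(box_counting_dimension(image))
```

    Applying a band-pass filter to the grayscale image before thresholding, as the abstract describes, would make the estimated dimension a function of the pixel color code.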

  13. 3rd Annual Earth System Grid Federation and Ultrascale Visualization Climate Data Analysis Tools Face-to-Face Meeting Report December 2013

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2014-02-21

    The climate and weather data science community gathered December 3–5, 2013, at Lawrence Livermore National Laboratory, in Livermore, California, for the third annual Earth System Grid Federation (ESGF) and Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) Face-to-Face (F2F) Meeting, which was hosted by the Department of Energy, National Aeronautics and Space Administration, National Oceanic and Atmospheric Administration, the European Infrastructure for the European Network of Earth System Modelling, and the Australian Department of Education. Both ESGF and UV-CDAT are global collaborations designed to develop a new generation of open-source software infrastructure that provides distributed access and analysis to observed and simulated data from the climate and weather communities. The tools and infrastructure developed under these international multi-agency collaborations are critical to understanding extreme weather conditions and long-term climate change, while the F2F meetings help to build a stronger climate and weather data science community and stronger federated software infrastructure. The 2013 F2F meeting determined requirements for existing and impending national and international community projects; enhancements needed for data distribution, analysis, and visualization infrastructure; and standards and resources needed for better collaborations.

  14. A graph algebra for scalable visual analytics.

    Science.gov (United States)

    Shaverdian, Anna A; Zhou, Hao; Michailidis, George; Jagadish, Hosagrahar V

    2012-01-01

    Visual analytics (VA), which combines analytical techniques with advanced visualization features, is fast becoming a standard tool for extracting information from graph data. Researchers have developed many tools for this purpose, suggesting a need for formal methods to guide these tools' creation. Increased data demands on computing require redesigning VA tools to consider performance and reliability in the context of analysis of exascale datasets. Furthermore, visual analysts need a way to document their analyses for reuse and results justification. A VA graph framework encapsulated in a graph algebra helps address these needs. Its atomic operators include selection and aggregation. The framework employs a visual operator and supports dynamic attributes of data to enable scalable visual exploration of data.
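
    The abstract names selection and aggregation as the algebra's atomic operators; the sketch below is an interpretation of what such operators might look like on an attributed graph (it is not the authors' implementation), using the networkx library and hypothetical node attributes.

```python
import networkx as nx

def select(g, predicate):
    """Selection operator: keep only nodes satisfying the predicate, with induced edges."""
    return g.subgraph([n for n, attrs in g.nodes(data=True) if predicate(attrs)]).copy()

def aggregate(g, key):
    """Aggregation operator: collapse nodes sharing the same value of `key`
    into one super-node; edge weights count the collapsed edges."""
    agg = nx.Graph()
    for _, attrs in g.nodes(data=True):
        agg.add_node(attrs[key])
    for u, v in g.edges():
        a, b = g.nodes[u][key], g.nodes[v][key]
        if a != b:
            weight = agg.get_edge_data(a, b, {"weight": 0})["weight"]
            agg.add_edge(a, b, weight=weight + 1)
    return agg

# Toy attributed graph
g = nx.Graph()
g.add_node(1, kind="server")
g.add_node(2, kind="client")
g.add_node(3, kind="client")
g.add_edges_from([(1, 2), (1, 3)])
print(select(g, lambda a: a["kind"] == "client").nodes())
print(aggregate(g, "kind").edges(data=True))
```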

  15. Visual Artist or Visual Designer? Visual Communication Design Education

    OpenAIRE

    Arsoy, Aysu

    2010-01-01

    ABSTRACT: Design tools and contents have been digitalized, forming the contemporary fields of the visual arts and design. Corporate culture demands techno-social experts who understand the arts, design, culture and society, while also having a high level of technological proficiency. New departments have opened offering alternatives in art and design education such as Visual Communication Design (VCD) and are dedicated to educating students in the practical aspect of using digital technologi...

  16. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools

  17. Maximizing Impact: Pairing interactive web visualizations with traditional print media

    Science.gov (United States)

    Read, E. K.; Appling, A.; Carr, L.; De Cicco, L.; Read, J. S.; Walker, J. I.; Winslow, L. A.

    2016-12-01

    Our Nation's rapidly growing store of environmental data makes new demands on researchers: to take on increasingly broad-scale, societally relevant analyses and to rapidly communicate findings to the public. Interactive web-based data visualizations now commonly supplement or comprise journalism, and science journalism has followed suit. To maximize the impact of US Geological Survey (USGS) science, the USGS Office of Water Information Data Science team builds tools and products that combine traditional static research products (e.g., print journal articles) with web-based, interactive data visualizations that target non-scientific audiences. We developed a lightweight, open-source framework for web visualizations to reduce time to production. The framework provides templates for a data visualization workflow and the packaging of text, interactive figures, and images into an appealing web interface with a standardized look and feel, usage tracking, and responsiveness. By partnering with subject matter experts to focus on timely, societally relevant issues, we use these tools to produce appealing visual stories targeting specific audiences, including managers, the general public, and scientists, on diverse topics including drought, microplastic pollution, and fisheries response to climate change. We will describe the collaborative and technical methodologies used, present examples of how the approach has worked, and discuss challenges and opportunities for the future.

  18. Web-based discovery, access and analysis tools for the provision of different data sources like remote sensing products and climate data

    Science.gov (United States)

    Eberle, J.; Hese, S.; Schmullius, C.

    2012-12-01

    To provide different Earth Observation products for the area of Siberia, the Siberian Earth System Science Cluster (SIB-ESS-C) was established as a spatial data infrastructure at the University of Jena (Germany), Department for Earth Observation. The infrastructure implements standards published by the Open Geospatial Consortium (OGC) and the International Organization for Standardization (ISO) for data discovery, data access and data analysis. The objective of SIB-ESS-C is to facilitate environmental research and Earth system science in Siberia. Several products from the Moderate Resolution Imaging Spectroradiometer sensor were integrated by serving ISO-compliant metadata and providing an OGC-compliant Web Map Service for data visualization and Web Coverage / Web Feature Services for data access. Furthermore, climate data from the World Meteorological Organization were downloaded, converted, and provided as an OGC Sensor Observation Service. Each climate data station is described with ISO-compliant metadata. All these datasets from multiple sources are provided within the SIB-ESS-C infrastructure (figure 1), and an automatic workflow integrates updates of these datasets daily. The brokering approach within the SIB-ESS-C system is to collect data from different sources, convert the data into common data formats where necessary, and provide them through standardized Web services. Additional tools are made available within the SIB-ESS-C Geoportal for easy access to download and analysis functions (figure 2). The data can be visualized, accessed and analysed with this Geoportal. Because OGC-compliant services are provided, the data can also be accessed with other OGC-compliant clients. (Figure 1: Technical concept of SIB-ESS-C providing different data sources. Figure 2: Screenshot of the web-based SIB-ESS-C system.)

  19. High performance geospatial and climate data visualization using GeoJS

    Science.gov (United States)

    Chaudhary, A.; Beezley, J. D.

    2015-12-01

    GeoJS (https://github.com/OpenGeoscience/geojs) is an open-source library developed to support interactive scientific and geospatial visualization of climate and earth science datasets in a web environment. GeoJS has a convenient application programming interface (API) that enables users to harness the fast performance of the WebGL and Canvas 2D APIs with sophisticated Scalable Vector Graphics (SVG) features in a consistent and convenient manner. We started the project in response to the need for an open-source JavaScript library that can combine traditional geographic information systems (GIS) and scientific visualization on the web. Many libraries, some of which are open source, support mapping or other GIS capabilities, but lack the features required to visualize scientific and other geospatial datasets. For instance, such libraries are not capable of rendering climate plots from NetCDF files, and some libraries are limited with regard to geoinformatics (infovis in a geospatial environment). While libraries such as d3.js are extremely powerful for these kinds of plots, in order to integrate them into other GIS libraries, the construction of geoinformatics visualizations must be completed manually and separately, or the code must somehow be mixed in an unintuitive way. We developed GeoJS with the following motivations: to create an open-source geovisualization and GIS library that combines scientific visualization with GIS and informatics; to develop an extensible library that can combine data from multiple sources and render them using multiple backends; and to build a library that works well with existing scientific visualization tools such as VTK. We have successfully deployed GeoJS-based applications for multiple domains across various projects. The ClimatePipes project funded by the Department of Energy, for example, used GeoJS to visualize NetCDF datasets from climate data archives. Other projects built visualizations using GeoJS for interactively exploring

  20. Learning Visualizations by Analogy: Promoting Visual Literacy through Visualization Morphing.

    Science.gov (United States)

    Ruchikachorn, Puripant; Mueller, Klaus

    2015-09-01

    We propose the concept of teaching (and learning) unfamiliar visualizations by analogy, that is, demonstrating an unfamiliar visualization method by linking it to another, more familiar one, where the in-betweens are designed to bridge the gap between these two visualizations and explain the difference in a gradual manner. As opposed to a textual description, our morphing explains an unfamiliar visualization through purely visual means. We demonstrate our idea by way of four visualization pair examples: data table and parallel coordinates, scatterplot matrix and hyperbox, linear chart and spiral chart, and hierarchical pie chart and treemap. The analogy is commutative, i.e., any member of the pair can be the unfamiliar visualization. A series of studies showed that this new paradigm can be an effective teaching tool. The participants could understand the unfamiliar visualization methods in all of the four pairs either fully or at least significantly better after they observed or interacted with the transitions from the familiar counterpart. The four examples suggest how helpful visualization pairings can be identified, and they will hopefully inspire other visualization morphings and associated transition strategies to be identified.

  1. SNPexp - A web tool for calculating and visualizing correlation between HapMap genotypes and gene expression levels

    Directory of Open Access Journals (Sweden)

    Franke Andre

    2010-12-01

    Full Text Available Abstract Background Expression levels for 47294 transcripts in lymphoblastoid cell lines from all 270 HapMap phase II individuals, and genotypes (both HapMap phase II and III) of 3.96 million single nucleotide polymorphisms (SNPs) in the same individuals are publicly available. We aimed to generate a user-friendly web based tool for visualization of the correlation between SNP genotypes within a specified genomic region and a gene of interest, which is also well-known as an expression quantitative trait locus (eQTL) analysis. Results SNPexp is implemented as a server-side script, and publicly available on this website: http://tinyurl.com/snpexp. Correlation between genotype and transcript expression levels is calculated by performing linear regression and the Wald test as implemented in PLINK and visualized using the UCSC Genome Browser. Validation of SNPexp using previously published eQTLs yielded comparable results. Conclusions SNPexp provides a convenient and platform-independent way to calculate and visualize the correlation between HapMap genotypes within a specified genetic region anywhere in the genome and gene expression levels. This allows for investigation of both cis and trans effects. The web interface and utilization of publicly available and widely used software resources makes it an attractive supplement to more advanced bioinformatic tools. For the advanced user the program can be used on a local computer on custom datasets.
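
    The core computation the record describes, regressing transcript expression levels on SNP genotype, can be sketched as follows; this is an illustrative stand-in for the PLINK-based pipeline, with simulated data and a hypothetical function name.

```python
import numpy as np
from scipy import stats

def eqtl_association(genotypes, expression):
    """Test association between one SNP and one transcript.

    genotypes:  allele dosages (0, 1 or 2 copies of the minor allele)
    expression: expression levels for the same individuals
    Returns the regression slope (effect size) and p-value, analogous to the
    additive-model test reported by PLINK.
    """
    slope, intercept, r_value, p_value, stderr = stats.linregress(genotypes, expression)
    return slope, p_value

# Toy example with 270 simulated individuals
rng = np.random.default_rng(0)
genotypes = rng.integers(0, 3, size=270)
expression = 0.4 * genotypes + rng.normal(0.0, 1.0, size=270)
print(eqtl_association(genotypes, expression))
```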

  2. Integrative Genomics Viewer (IGV): high-performance genomics data visualization and exploration.

    Science.gov (United States)

    Thorvaldsdóttir, Helga; Robinson, James T; Mesirov, Jill P

    2013-03-01

    Data visualization is an essential component of genomic data analysis. However, the size and diversity of the data sets produced by today's sequencing and array-based profiling methods present major challenges to visualization tools. The Integrative Genomics Viewer (IGV) is a high-performance viewer that efficiently handles large heterogeneous data sets, while providing a smooth and intuitive user experience at all levels of genome resolution. A key characteristic of IGV is its focus on the integrative nature of genomic studies, with support for both array-based and next-generation sequencing data, and the integration of clinical and phenotypic data. Although IGV is often used to view genomic data from public sources, its primary emphasis is to support researchers who wish to visualize and explore their own data sets or those from colleagues. To that end, IGV supports flexible loading of local and remote data sets, and is optimized to provide high-performance data visualization and exploration on standard desktop systems. IGV is freely available for download from http://www.broadinstitute.org/igv, under a GNU LGPL open-source license.

  3. Design and visualization of synthetic holograms for security applications

    International Nuclear Information System (INIS)

    Škeren, M; Nývlt, M; Svoboda, J

    2013-01-01

    In this paper we present software for the design and visualization of holographic elements containing a full range of visual effects. It enables simulation of the observation of holographic elements under general conditions, including different light sources with various spectral and coherence properties and various reconstruction geometries. Furthermore, recent technologies offer interesting possibilities for 3D visualization, such as 3D techniques based on shutter or polarization glasses, anaglyphs, etc. The presented software is compatible with the mentioned techniques and enables the application of 3D hardware tools for visualization. The software package can be used not only for visualization of existing designs, but also for fine tuning of the spatial, kinetic, and color properties of a hologram. Moreover, holograms containing all types of 3D effects, general color mixing, kinetic behavior, diffractive cryptograms, etc. can be translated using the software directly to a high resolution micro-structure.

  4. Interactive web visualization tools to the results interpretation of a seismic risk study aimed at the emergency levels definition

    Science.gov (United States)

    Rivas-Medina, A.; Gutierrez, V.; Gaspar-Escribano, J. M.; Benito, B.

    2009-04-01

    Results of a seismic risk assessment study are often applied and interpreted by users unspecialised in the topic or lacking a scientific background. In this context, the availability of tools that help translate essentially scientific content for broader audiences (such as decision makers or civil defence officials), as well as represent and manage results in a user-friendly fashion, is of indubitable value. One such tool is the visualization tool VISOR-RISNA, a web tool developed within the RISNA project (financed by the Emergency Agency of Navarre, Spain) for regional seismic risk assessment of Navarre and the subsequent development of emergency plans. The RISNA study included seismic hazard evaluation, geotechnical characterization of soils, incorporation of site effects into expected ground motions, vulnerability distribution assessment and estimation of expected damage distributions for a 10% probability of exceedance in 50 years. The main goal of RISNA was the identification of higher-risk areas on which to focus detailed, local-scale risk studies and the corresponding urban emergency plans in the future. A geographic information system was used to combine different information layers, generate tables of results and represent maps with partial and final results. The visualization tool VISOR-RISNA is intended to facilitate the interpretation and representation of the collection of results, with the ultimate purpose of defining actuation plans. A number of criteria for defining actuation priorities are proposed in this work. They are based on combinations of risk parameters resulting from the risk study (such as expected ground motion, damage and exposed population), as determined by risk assessment specialists. Although the values that these parameters take are a result of the risk study, their distribution into several classes depends on the intervals defined by decision makers or civil defence officials. These criteria provide a ranking of

  5. Visualizing Matrix Multiplication

    Science.gov (United States)

    Daugulis, Peteris; Sondore, Anita

    2018-01-01

    Efficient visualizations of computational algorithms are important tools for students, educators, and researchers. In this article, we point out an innovative visualization technique for matrix multiplication. This method differs from the standard, formal approach by using block matrices to make computations more visual. We find this method a…

  6. DisEpi: Compact Visualization as a Tool for Applied Epidemiological Research.

    Science.gov (United States)

    Benis, Arriel; Hoshen, Moshe

    2017-01-01

    Outcomes research and evidence-based medical practice are being positively impacted by the proliferation of healthcare databases. Modern epidemiologic studies require complex data comprehension. A new tool, DisEpi, facilitates visual exploration of epidemiological data, supporting Public Health Knowledge Discovery. It provides domain experts with a compact visualization of information at the population level. In this study, DisEpi is applied to Attention-Deficit/Hyperactivity Disorder (ADHD) patients within Clalit Health Services, analyzing the socio-demographic and filled ADHD prescription data between 2006 and 2016 of 1,605,800 children aged 6 to 17 years. DisEpi's goals are to facilitate the identification of (1) links between attributes and/or events, (2) changes in these relationships over time, and (3) clusters of population attributes for similar trends. DisEpi combines hierarchical clustering graphics and a heatmap where color shades reflect disease time-trends. In the ADHD context, DisEpi allowed the domain expert to visually analyze a snapshot summary of data mining results. Accordingly, the domain expert was able to efficiently identify that: (1) relatively younger children, and particularly the youngest children in a class, are treated more often, (2) medication incidence increased between 2006 and 2011 but then stabilized, and (3) the progression rate of medication incidence is different for each of the 3 main discovered clusters (aka profiles) of treated children. DisEpi delivered results similar to those previously published which used classical statistical approaches. DisEpi requires minimal preparation and fewer iterations, generating results in a user-friendly format for the domain expert. DisEpi will be wrapped as a package containing the end-to-end discovery process. Optionally, it may provide automated annotation using calendar events (such as policy changes or media interest), which can improve discovery efficiency, interpretation, and policy implementation.
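
    The combination the abstract describes, hierarchical clustering of incidence time-trends rendered as a heatmap, can be sketched roughly as below; the data, labels, and plotting choices are hypothetical and do not reproduce DisEpi itself.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, leaves_list

# Hypothetical yearly treatment-incidence rates (rows: population subgroups, columns: years)
rng = np.random.default_rng(1)
rates = np.cumsum(rng.normal(0.5, 0.2, size=(8, 11)), axis=1)
years = list(range(2006, 2017))

# Cluster subgroups by the similarity of their time-trends, then reorder the heatmap rows
order = leaves_list(linkage(rates, method="ward"))

fig, ax = plt.subplots()
im = ax.imshow(rates[order], aspect="auto", cmap="viridis")
ax.set_xticks(range(len(years)))
ax.set_xticklabels(years, rotation=45)
ax.set_yticks(range(len(order)))
ax.set_yticklabels([f"profile {i}" for i in order])
fig.colorbar(im, label="incidence per 1,000")
plt.tight_layout()
plt.show()
```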

  7. NanoPack: visualizing and processing long read sequencing data.

    Science.gov (United States)

    De Coster, Wouter; D'Hert, Svenn; Schultz, Darrin T; Cruts, Marc; Van Broeckhoven, Christine

    2018-03-14

    Here we describe NanoPack, a set of tools developed for visualization and processing of long read sequencing data from Oxford Nanopore Technologies and Pacific Biosciences. The NanoPack tools are written in Python3 and released under the GNU GPL3.0 License. The source code can be found at https://github.com/wdecoster/nanopack, together with links to separate scripts and their documentation. The scripts are compatible with Linux, Mac OS and the MS Windows 10 subsystem for Linux and are available as a graphical user interface, a web service at http://nanoplot.bioinf.be and command line tools. wouter.decoster@molgen.vib-ua.be. Supplementary tables and figures are available at Bioinformatics online.

  8. Online social media analysis and visualization

    CERN Document Server

    Kawash, Jalal

    2015-01-01

    This edited volume addresses the vast challenges of adapting Online Social Media (OSM) to developing research methods and applications. The topics cover generating realistic social network topologies, awareness of user activities, topic and trend generation, estimation of user attributes from their social content, behavior detection, mining social content for common trends, identifying and ranking social content sources, building friend-comprehension tools, and many others. Each of the ten chapters tackle one or more of these issues by proposing new analysis methods or new visualization techn

  9. Visualizing Patterns of Marine Eukaryotic Diversity from Metabarcoding Data Using QIIME.

    Science.gov (United States)

    Leray, Matthieu; Knowlton, Nancy

    2016-01-01

    PCR amplification followed by deep sequencing of homologous gene regions is increasingly used to characterize the diversity and taxonomic composition of marine eukaryotic communities. This approach may generate millions of sequences for hundreds of samples simultaneously. Therefore, tools that researchers can use to visualize complex patterns of diversity for these massive datasets are essential. Efforts by microbiologists to understand the Earth and human microbiomes using high-throughput sequencing of the 16S rRNA gene have led to the development of several user-friendly, open-source software packages that can be similarly used to analyze eukaryotic datasets. Quantitative Insights Into Microbial Ecology (QIIME) offers some of the most helpful data visualization tools. Here, we describe functionalities to import OTU tables generated with any molecular marker (e.g., 18S, COI, ITS) and associated metadata into QIIME. We then present a range of analytical tools implemented within QIIME that can be used to obtain insights about patterns of alpha and beta diversity for marine eukaryotes.
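
    As a minimal illustration of the kind of alpha-diversity summary that can be computed from an OTU table (this is not QIIME code), the sketch below computes the Shannon index per sample for a hypothetical table.

```python
import numpy as np

def shannon_diversity(counts):
    """Shannon alpha-diversity index H' = -sum(p_i * ln p_i) for one sample."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log(p)).sum()

# Hypothetical OTU table: rows are OTUs, columns are two marine samples
otu_table = np.array([
    [120,  40],
    [ 30,  90],
    [  5,   0],
    [  0,  60],
])
for j in range(otu_table.shape[1]):
    print(f"sample {j}: H' = {shannon_diversity(otu_table[:, j]):.3f}")
```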

  10. Methods and tools to evaluate the availability of renewable energy sources

    International Nuclear Information System (INIS)

    Angelis-Dimakis, Athanasios; Kartalidis, Avraam; Biberacher, Markus; Gadocha, Sabine; Dominguez, Javier; Pinedo, Irene; Fiorese, Giulia; Gnansounou, Edgard; Panichelli, Luis; Guariso, Giorgio; Robba, Michela

    2011-01-01

    The recent statements of both the European Union and the US Presidency pushed in the direction of using renewable forms of energy, in order to act against climate changes induced by the growing concentration of carbon dioxide in the atmosphere. In this paper, a survey regarding methods and tools presently available to determine potential and exploitable energy in the most important renewable sectors (i.e., solar, wind, wave, biomass and geothermal energy) is presented. Moreover, challenges for each renewable resource are highlighted as well as the available tools that can help in evaluating the use of a mix of different sources. (author)

  11. Tools for Visualizing HIV in Cure Research.

    Science.gov (United States)

    Niessl, Julia; Baxter, Amy E; Kaufmann, Daniel E

    2018-02-01

    The long-lived HIV reservoir remains a major obstacle for an HIV cure. Current techniques to analyze this reservoir are generally population-based. We highlight recent developments in methods visualizing HIV, which offer a different, complementary view, and provide indispensable information for cure strategy development. Recent advances in fluorescence in situ hybridization techniques enabled key developments in reservoir visualization. Flow cytometric detection of HIV mRNAs, concurrently with proteins, provides a high-throughput approach to study the reservoir on a single-cell level. On a tissue level, key spatial information can be obtained detecting viral RNA and DNA in situ by fluorescence microscopy. At total-body level, advancements in non-invasive immuno-positron emission tomography (PET) detection of HIV proteins may allow an encompassing view of HIV reservoir sites. HIV imaging approaches provide important, complementary information regarding the size, phenotype, and localization of the HIV reservoir. Visualizing the reservoir may contribute to the design, assessment, and monitoring of HIV cure strategies in vitro and in vivo.

  12. Anatomy of BioJS, an open source community for the life sciences.

    Science.gov (United States)

    Yachdav, Guy; Goldberg, Tatyana; Wilzbach, Sebastian; Dao, David; Shih, Iris; Choudhary, Saket; Crouch, Steve; Franz, Max; García, Alexander; García, Leyla J; Grüning, Björn A; Inupakutika, Devasena; Sillitoe, Ian; Thanki, Anil S; Vieira, Bruno; Villaveces, José M; Schneider, Maria V; Lewis, Suzanna; Pettifer, Steve; Rost, Burkhard; Corpas, Manuel

    2015-07-08

    BioJS is an open source software project that develops visualization tools for different types of biological data. Here we report on the factors that influenced the growth of the BioJS user and developer community, and outline our strategy for building on this growth. The lessons we have learned on BioJS may also be relevant to other open source software projects.

  13. HMM Logos for visualization of protein families

    Directory of Open Access Journals (Sweden)

    Schultz Jörg

    2004-01-01

    Full Text Available Abstract Background Profile Hidden Markov Models (pHMMs) are a widely used tool for protein family research. Up to now, however, there exists no method to visualize all of their central aspects graphically in an intuitively understandable way. Results We present a visualization method that incorporates both emission and transition probabilities of the pHMM, thus extending the sequence logos introduced by Schneider and Stephens. For each emitting state of the pHMM, we display a stack of letters. The stack height is determined by the deviation of the position's letter emission frequencies from the background frequencies. The stack width visualizes both the probability of reaching the state (the hitting probability) and the expected number of letters the state emits during a pass through the model (the state's expected contribution). A web interface offering online creation of HMM Logos and the corresponding source code can be found at the Logos web server of the Max Planck Institute for Molecular Genetics http://logos.molgen.mpg.de. Conclusions We demonstrate that HMM Logos can be a useful tool for the biologist: we use them to highlight differences between two homologous subfamilies of GTPases, Rab and Ras, and we show that they are able to indicate structural elements of Ras.
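
    The stack-height rule described above, deviation of a state's emission frequencies from the background, corresponds to a relative-entropy calculation; the following sketch is an interpretation of that idea (not the HMM Logos source code) for a toy alphabet.

```python
import numpy as np

def stack_height(emission, background):
    """Height of a logo letter stack: relative entropy (in bits) between the
    state's emission distribution and the background distribution."""
    emission = np.asarray(emission, dtype=float)
    background = np.asarray(background, dtype=float)
    mask = emission > 0
    return float((emission[mask] * np.log2(emission[mask] / background[mask])).sum())

def letter_heights(emission, background, alphabet):
    """Split the stack height among letters in proportion to their emission frequency."""
    total = stack_height(emission, background)
    return sorted(((a, total * p) for a, p in zip(alphabet, emission)), key=lambda x: -x[1])

# Toy example over a 4-letter alphabet with a uniform background
print(letter_heights([0.7, 0.1, 0.1, 0.1], [0.25] * 4, "ACGT"))
```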

  14. Images as tools. On visual epistemic practices in the biological sciences.

    Science.gov (United States)

    Samuel, Nina

    2013-06-01

    Contemporary visual epistemic practices in the biological sciences raise new questions of how to transform aniconic data measurements into images, and how the process of an imaging technique may change the material it is 'depicting'. This case-oriented study investigates microscopic imagery, which is used by systems and synthetic biologists alike. The core argument is developed around the analysis of two recent methods, developed between 2003 and 2006: localization microscopy and photo-induced cell death. Far from functioning merely as illustrations of work done by other means, images can be understood as tools for discovery in their own right and as objects of investigation. Both methods deploy different constellations of intended and unintended interactions between visual appearance and underlying biological materiality. To characterize these new ways of interaction, the article introduces the notions of 'operational images' and 'operational agency'. Despite all their novelty, operational images are still subject to conventions of seeing and depicting: phenomena emerging with the new method of localization microscopy have to be designed according to the image traditions of older, conventional fluorescence microscopy to function properly as devices for communication between physicists and biologists. The article emerged from a laboratory study based on interviews conducted with researchers from the Kirchhoff-Institute for Physics and the German Cancer Research Center (DKFZ) at Bioquant, Heidelberg, in 2011. Copyright © 2013 Elsevier Ltd. All rights reserved.

  15. Validity of the growth model of the 'computerized visual perception assessment tool for Chinese characters structures'.

    Science.gov (United States)

    Wu, Huey-Min; Li, Cheng-Hsaun; Kuo, Bor-Chen; Yang, Yu-Mao; Lin, Chin-Kai; Wan, Wei-Hsiang

    2017-08-01

    Morphological awareness is the foundation for the important developmental skills involved with vocabulary, as well as understanding the meaning of words, orthographic knowledge, reading, and writing. Visual perception of space and radicals in the two-dimensional positions of Chinese characters' morphology is very important in identifying Chinese characters. The important predictive variables of spatial and visual perception in Chinese character identification were investigated with a growth model in this research. The assessment tool is the "Computerized Visual Perception Assessment Tool for Chinese Characters Structures" developed by this study. There are two constructs, basic stroke and character structure. In the basic stroke, there are three subtests of one, two, and more than three strokes. In the character structure, there are three subtests of single-component character, horizontal-compound character, and vertical-compound character. This study used purposive sampling. In the first year, 551 children 4-6 years old participated in the study and were monitored for one year. In the second year, 388 children remained in the study, a successful follow-up rate of 70.4%. This study used a two-wave cross-lagged panel design to validate the growth model of the basic stroke and the character structure. There was significant correlation between the basic stroke and the character structure at different time points. The abilities in the basic stroke and in the character structure developed steadily over time for preschool children. Children's knowledge of the basic stroke effectively predicted their later knowledge of the basic stroke and the character structure. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Determining the sources of fine-grained sediment using the Sediment Source Assessment Tool (Sed_SAT)

    Science.gov (United States)

    Gorman Sanisaca, Lillian E.; Gellis, Allen C.; Lorenz, David L.

    2017-07-27

    A sound understanding of the sources contributing to instream sediment flux in a watershed is important when developing total maximum daily load (TMDL) management strategies designed to reduce suspended sediment in streams. Sediment fingerprinting and sediment budget approaches are two techniques that, when used jointly, can qualify and quantify the major sources of sediment in a given watershed. The sediment fingerprinting approach uses trace element concentrations from samples in known potential source areas to determine a clear signature of each potential source. A mixing model is then used to determine the relative source contribution to the target suspended sediment samples. The computational steps required to apportion sediment for each target sample are quite involved and time-intensive, a problem the Sediment Source Assessment Tool (Sed_SAT) addresses. Sed_SAT is a user-friendly statistical model that guides the user through the steps necessary to quantify the relative contributions of sediment sources in a given watershed. The model is written using the statistical software R (R Core Team, 2016b) and utilizes Microsoft Access® as a user interface, but requires no prior knowledge of R or Microsoft Access® to run the model successfully. Sed_SAT identifies outliers, corrects for differences in size and organic content in the source samples relative to the target samples, evaluates the conservative behavior of tracers used in fingerprinting by applying a "Bracket Test," identifies tracers with the highest discriminatory power, and provides robust error analysis through a Monte Carlo simulation following the mixing model. Quantifying sediment source contributions using the sediment fingerprinting approach provides local, State, and Federal land management agencies with important information needed to implement effective strategies to reduce sediment. Sed_SAT is designed to assist these agencies in applying the sediment fingerprinting
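
    The mixing-model step the record describes can be sketched as a constrained least-squares un-mixing problem. The sketch below is a simplified stand-in for Sed_SAT's R implementation (which additionally handles size and organic-content corrections, the Bracket Test, and Monte Carlo error analysis); the tracer values are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize

def unmix(source_means, target):
    """Estimate relative source contributions for one target sediment sample.

    source_means: (n_sources, n_tracers) mean tracer concentrations per source
    target:       (n_tracers,) tracer concentrations of the suspended-sediment sample
    Returns non-negative proportions summing to 1 that minimize the squared
    relative error between the modelled mixture and the target.
    """
    n_sources = source_means.shape[0]

    def objective(p):
        mixed = p @ source_means
        return np.sum(((target - mixed) / target) ** 2)

    result = minimize(
        objective,
        x0=np.full(n_sources, 1.0 / n_sources),
        bounds=[(0.0, 1.0)] * n_sources,
        constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1.0}],
        method="SLSQP",
    )
    return result.x

# Hypothetical tracer means for three sources (rows) and four tracers (columns)
sources = np.array([[10.0, 5.0, 200.0, 1.2],
                    [ 4.0, 9.0, 150.0, 0.8],
                    [ 7.0, 2.0, 300.0, 2.0]])
target = 0.5 * sources[0] + 0.3 * sources[1] + 0.2 * sources[2]
print(unmix(sources, target).round(3))  # approximately [0.5, 0.3, 0.2]
```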

  17. Visualizing Repertory Grid Data for Formative Assessment

    DEFF Research Database (Denmark)

    Pantazos, Kostas; Vatrapu, Ravi; Hussain, Abid

    2013-01-01

    ...at facilitating data analysis through a visual and interactive approach, which allows users to understand their data, reflect, and make better decisions. This paper presents an interactive visualization tool for teachers and students. The tool visualizes repertory grid data using two dashboards, where teachers and students can investigate constructs and rating elements of students at the individual or group level. Visualizing the repertory grid data is an initial attempt towards teaching analytics. Future work will focus on evaluating the tool in a real setting with teachers and students, and collecting suggestions...

  18. ANLIZE: a molecular mechanics force field visualization tool and its application to 18-crown-6.

    Science.gov (United States)

    Stolworthy, L D; Shirts, R B

    1997-03-01

    We describe a software tool that allows one to visualize and analyze the importance of each individual steric interaction in a molecular mechanics force field. ANLIZE is presently implemented for the Dreiding force field for use with the Cerius2 software package, but could be implemented in any molecular mechanics package with a graphical user interface. ANLIZE calculates individual interactions in the force field, sorts them by size, and displays them in several ways from a menu of choices. This allows the user to scan through selected interactions to visualize which interactions are the primary determinants of preferred conformations. The features of ANLIZE are illustrated using 18-crown-6 as an example, and the factors governing conformational preference in 18-crown-6 are demonstrated. Users of molecular mechanics packages are encouraged to demand this functionality from commercial software producers.

  19. Health figures: an open source JavaScript library for health data visualization.

    Science.gov (United States)

    Ledesma, Andres; Al-Musawi, Mohammed; Nieminen, Hannu

    2016-03-22

    The way we look at data has a great impact on how we can understand it, particularly when the data is related to health and wellness. Due to the increased use of self-tracking devices and the ongoing shift towards preventive medicine, better understanding of our health data is an important part of improving the general welfare of citizens. Electronic Health Records, self-tracking devices and mobile applications provide a rich variety of data, but the data often become difficult to understand. We implemented the hFigures library, inspired by the hGraph visualization, with additional improvements. The purpose of the library is to provide a visual representation of the evolution of health measurements in a complete and useful manner. We investigated the usefulness and usability of the library by building an application for health data visualization in a health coaching program. We performed a user evaluation with Heuristic Evaluation, Controlled User Testing and Usability Questionnaires. In the Heuristic Evaluation the average response was 6.3 out of 7 points, and the Cognitive Walkthrough done by usability experts indicated no design or mismatch errors. In the CSUQ usability test the system obtained an average score of 6.13 out of 7, and in the ASQ usability test the overall satisfaction score was 6.64 out of 7. We developed hFigures, an open source library for visualizing a complete, accurate and normalized graphical representation of health data. The idea is based on the concept of the hGraph but it provides additional key features, including a comparison of multiple health measurements over time. We conducted a usability evaluation of the library as a key component of an application for health and wellness monitoring. The results indicate that the data visualization library was helpful in assisting users in understanding health data and its evolution over time.

  20. Supporting interactive visual analytics of energy behavior in buildings through affine visualizations

    DEFF Research Database (Denmark)

    Nielsen, Matthias; Brewer, Robert S.; Grønbæk, Kaj

    2016-01-01

    Domain experts dealing with big data are typically not familiar with advanced data mining tools. This especially holds true for domain experts within energy management. In this paper, we introduce a visual analytics approach that empowers such users to visually analyze energy behavior based on ...Viz, which interactively maps data from real world buildings. It is an overview+detail interactive visual analytics tool supporting both rapid ad hoc explorations and structured evaluation of hypotheses about patterns and anomalies in resource consumption data mixed with occupant survey data. We have evaluated the approach with five domain experts within energy management, and further with 10 data analytics experts, and found that it was easily attainable and that it supported visual analysis of mixed consumption and survey data. Finally, we discuss future perspectives of affine visual analytics for mixed...

  1. Design and implementation of visualization methods for the CHANGES Spatial Decision Support System

    Science.gov (United States)

    Cristal, Irina; van Westen, Cees; Bakker, Wim; Greiving, Stefan

    2014-05-01

    The CHANGES Spatial Decision Support System (SDSS) is a web-based system aimed at risk assessment and the evaluation of optimal risk reduction alternatives at the local level, serving as a decision support tool in long-term natural risk management. The SDSS uses multidimensional information, integrating thematic, spatial, temporal and documentary data. The role of visualization in this context becomes of vital importance for efficiently representing each dimension. This multidimensional aspect of the risk information required by the system, combined with the diversity of the end-users, imposes the use of sophisticated visualization methods and tools. The key goal of the present work is to exploit the large amount of data efficiently in relation to the needs of the end-user, utilizing proper visualization techniques. Three main tasks have been accomplished for this purpose: categorization of the end-users, definition of the system's modules, and data definition. The graphical representation of the data and the visualization tools were designed to be relevant to the data type and the purpose of the analysis. Depending on the end-user category, each user should have access to different modules of the system and thus to the proper visualization environment. The technologies used for the development of the visualization component combine the latest and most innovative open source JavaScript frameworks, such as OpenLayers 2.13.1, ExtJS 4 and GeoExt 2. Moreover, the model-view-controller (MVC) pattern is used in order to ensure flexibility of the system at the implementation level. Using the above technologies, the visualization techniques implemented so far offer interactive map navigation, querying and comparison tools. The map comparison tools are of great importance within the SDSS and include the following: a swiping tool for comparison of different data at the same location; raster subtraction for comparison of the same phenomena varying in time; linked views for comparison

  2. Data Visualization within the Python ecosystem

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    Data analysis is integral to what we do at CERN. Data visualization is at the foundation of this workflow and is also an important part of the Python stack. Python's plotting ecosystem offers numerous open source solutions. These solutions can offer ease of use, detailed configuration, interactivity and web readiness. This talk will cover three of the most robust and well-supported packages: matplotlib, bokeh, and plotly. It aims to provide an overview of these packages and, in addition, to give suggestions as to where these tools might fit in an analysis workflow.
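
    A minimal example of the kind of plot such packages produce, here with matplotlib and simulated data, might look like this.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical detector data: an exponentially decaying count rate with noise
t = np.linspace(0, 10, 200)
counts = 1000 * np.exp(-t / 3.0) + np.random.default_rng(42).normal(0, 15, t.size)

fig, ax = plt.subplots(figsize=(6, 4))
ax.plot(t, counts, ".", label="measured")
ax.plot(t, 1000 * np.exp(-t / 3.0), "-", label="model")
ax.set_xlabel("time [s]")
ax.set_ylabel("counts")
ax.legend()
fig.tight_layout()
plt.show()
```

    Bokeh and plotly expose comparable APIs but render interactive, web-ready figures, which is where they tend to fit later in an analysis workflow.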

  3. Updates in metabolomics tools and resources: 2014-2015.

    Science.gov (United States)

    Misra, Biswapriya B; van der Hooft, Justin J J

    2016-01-01

    Data processing and interpretation represent the most challenging and time-consuming steps in high-throughput metabolomic experiments, regardless of the analytical platforms (MS or NMR spectroscopy based) used for data acquisition. Improved machinery in metabolomics generates increasingly complex datasets that create the need for more and better processing and analysis software and in silico approaches to understand the resulting data. However, a comprehensive source of information describing the utility of the most recently developed and released metabolomics resources--in the form of tools, software, and databases--is currently lacking. Thus, here we provide an overview of freely available, open-source tools, algorithms, and frameworks to make both upcoming and established metabolomics researchers aware of the recent developments, in an attempt to advance and facilitate data processing workflows in their metabolomics research. The major topics include tools and resources for data processing, data annotation, and data visualization in MS- and NMR-based metabolomics. Most of the tools described in this review are dedicated to untargeted metabolomics workflows; however, some more specialist tools are described as well. All tools and resources described, including their analytical and computational platform dependencies, are summarized in an overview table. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Principles of Information Visualization for Business Research

    OpenAIRE

    Ioan I. ANDONE

    2008-01-01

    In the era of data-centric science, a large number of visualization tools have been created to help researchers understand increasingly rich business databases. Information visualization is a process of constructing a visual presentation of quantitative business data, especially prepared for managerial use. Interactive information visualization provides researchers with remarkable tools for discovery and innovation. By combining powerful data mining methods with user-controlled interfaces, use...

  5. WannierTools: An open-source software package for novel topological materials

    Science.gov (United States)

    Wu, QuanSheng; Zhang, ShengNan; Song, Hai-Feng; Troyer, Matthias; Soluyanov, Alexey A.

    2018-03-01

    We present an open-source software package, WannierTools, a tool for the investigation of novel topological materials. The code works in the tight-binding framework, which can be generated by another software package, Wannier90 (Mostofi et al., 2008). It can help to classify the topological phase of a given material by calculating the Wilson loop, and can obtain the surface state spectrum, which is detected by angle-resolved photoemission spectroscopy (ARPES) and in scanning tunneling microscopy (STM) experiments. It also identifies the positions of Weyl/Dirac points and nodal line structures, and calculates the Berry phase around a closed momentum loop and the Berry curvature in a part of the Brillouin zone (BZ).
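
    The Berry phase around a closed momentum loop mentioned above is commonly evaluated as the phase of a product of overlaps between neighboring eigenstates along the loop; the sketch below illustrates that discrete formula on a toy two-band model and is not WannierTools code.

```python
import numpy as np

def berry_phase(hamiltonian, loop, band=0):
    """Discrete Berry phase of one band along a closed momentum loop.

    hamiltonian: function k -> Hermitian matrix H(k)
    loop:        sequence of k-points tracing the closed path (first point not repeated)
    The phase is -Im log of the product of overlaps <u_k | u_{k+1}>.
    """
    states = []
    for k in loop:
        _, vecs = np.linalg.eigh(hamiltonian(k))
        states.append(vecs[:, band])
    states.append(states[0])  # close the loop
    product = 1.0 + 0.0j
    for u, v in zip(states[:-1], states[1:]):
        product *= np.vdot(u, v)
    return -np.angle(product)

# Toy two-band model h(k) = kx*sigma_x + ky*sigma_y + m*sigma_z
sx = np.array([[0, 1], [1, 0]])
sy = np.array([[0, -1j], [1j, 0]])
sz = np.diag([1.0, -1.0])
def h(k, m=0.1):
    kx, ky = k
    return kx * sx + ky * sy + m * sz

loop = [(np.cos(t), np.sin(t)) for t in np.linspace(0, 2 * np.pi, 100, endpoint=False)]
print(berry_phase(h, loop))
```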

  6. Optimal power allocation and joint source-channel coding for wireless DS-CDMA visual sensor networks

    Science.gov (United States)

    Pandremmenou, Katerina; Kondi, Lisimachos P.; Parsopoulos, Konstantinos E.

    2011-01-01

    In this paper, we propose a scheme for the optimal allocation of power, source coding rate, and channel coding rate for each of the nodes of a wireless Direct Sequence Code Division Multiple Access (DS-CDMA) visual sensor network. The optimization is quality-driven, i.e. the received quality of the video that is transmitted by the nodes is optimized. The scheme takes into account the fact that the sensor nodes may be imaging scenes with varying levels of motion. Nodes that image low-motion scenes will require a lower source coding rate, so they will be able to allocate a greater portion of the total available bit rate to channel coding. Stronger channel coding will mean that such nodes will be able to transmit at lower power. This will both increase battery life and reduce interference to other nodes. Two optimization criteria are considered. One that minimizes the average video distortion of the nodes and one that minimizes the maximum distortion among the nodes. The transmission powers are allowed to take continuous values, whereas the source and channel coding rates can assume only discrete values. Thus, the resulting optimization problem lies in the field of mixed-integer optimization tasks and is solved using Particle Swarm Optimization. Our experimental results show the importance of considering the characteristics of the video sequences when determining the transmission power, source coding rate and channel coding rate for the nodes of the visual sensor network.
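
    The optimization engine named in the abstract, Particle Swarm Optimization, can be sketched generically as below. This toy version handles only continuous variables (such as the transmit powers) and uses a made-up distortion-plus-interference objective; the paper's actual problem is mixed-integer (discrete coding rates) and driven by received video quality.

```python
import numpy as np

def pso(objective, bounds, n_particles=30, iterations=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer for a continuous objective over box bounds."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    dim = lo.size
    x = rng.uniform(lo, hi, size=(n_particles, dim))          # positions
    v = np.zeros_like(x)                                       # velocities
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iterations):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Toy stand-in for a quality criterion over two nodes' transmit powers,
# trading off distortion (decreasing in power) against mutual interference.
def toy_distortion(p):
    return np.sum(1.0 / (1.0 + p)) + 0.05 * p[0] * p[1]

best_powers, best_value = pso(toy_distortion, bounds=[(0.0, 2.0), (0.0, 2.0)])
print(best_powers, best_value)
```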

  7. Visualization of neutron flux and power distributions in TRIGA Mark II reactor as an educational tool

    International Nuclear Information System (INIS)

    Snoj, Luka; Ravnik, Matjaz; Lengar, Igor

    2008-01-01

    Modern Monte Carlo computer codes for neutron transport (e.g. MCNP) allow calculation of detailed neutron flux and power distributions in complex geometries with a resolution of ∼1 mm. Moreover, they enable the calculation of individual particle tracks, scattering and absorption events. With the use of advanced software for 3D visualization (e.g. Amira, Voxler, etc.) one can create and present neutron flux and power distributions in a 'user friendly' way convenient for educational purposes. One can view axial, radial or any other spatial distribution of the neutron flux and power in a nuclear reactor from various perspectives and in various modalities of presentation. By visualizing the distribution of scattering and absorption events and individual particle tracks, one can illustrate neutron transport parameters (mean free path, diffusion length, macroscopic cross section, up-scattering, thermalization, etc.) from an elementary point of view. Most people remember better if they can visualize a process. Therefore, the representation of the reactor and neutron transport parameters is a convenient modern educational tool for nuclear power plant operators, nuclear engineers, students and specialists involved in reactor operation and design. The visualization of neutron flux and power distributions in the Jozef Stefan Institute TRIGA Mark II research reactor is treated in this paper. The distributions are calculated with the MCNP computer code and presented using the Amira and Voxler software. The results are presented in the paper as figures, together with comments qualitatively explaining them. (authors)
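
    The transport parameters listed above are related by standard one-group textbook expressions (not specific to this paper), which such visualizations can help illustrate:

```latex
\lambda = \frac{1}{\Sigma_t}, \qquad
D \approx \frac{1}{3\,\Sigma_{tr}}, \qquad
L = \sqrt{\frac{D}{\Sigma_a}}
```

    Here \lambda is the mean free path, \Sigma_t the total macroscopic cross section, \Sigma_{tr} the transport cross section, D the diffusion coefficient, \Sigma_a the absorption cross section, and L the diffusion length.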

  8. Critical evaluation of reverse engineering tool Imagix 4D!

    Science.gov (United States)

    Yadav, Rashmi; Patel, Ravindra; Kothari, Abhay

    2016-01-01

    Legacy code is difficult to comprehend. Various commercial reengineering tools are available; each has a unique working style and comes with its inherent capabilities and shortcomings. The focus of the available tools is on visualizing static behavior, not dynamic behavior. This makes the work of people engaged in software product maintenance, code understanding, and reengineering/reverse engineering difficult. Consequently, the need for a comprehensive reengineering/reverse engineering tool arises. We found Imagix 4D useful as it generates the most pictorial representations, in the form of flow charts, flow graphs, class diagrams, metrics and, to a partial extent, dynamic visualizations. We evaluated Imagix 4D with the help of a case study involving a few samples of source code. The behavior of the tool was analyzed on multiple small codes and on a large code, the gcc C parser. The large-code evaluation was performed to uncover dead code, unstructured code, and the effect of not including required files at the preprocessing level. The utility of Imagix 4D in preparing decision density and complexity metrics for a large code was found to be helpful in determining how much reengineering is required. At the same time, Imagix 4D showed limitations in dynamic visualizations, flow chart separation (for large code), and parsing loops. The outcome of the evaluation will eventually help in upgrading Imagix 4D and points to the need for fully featured tools in the area of software reengineering/reverse engineering. It will also help the research community, especially those interested in the realm of software reengineering tool building.

  9. ProteoWizard: open source software for rapid proteomics tools development.

    Science.gov (United States)

    Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag

    2008-11-01

    The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access, and performs standard proteomics and LCMS dataset computations. The library contains readers and writers of the mzML data format, which has been written using modern C++ techniques and design principles and supports a variety of platforms with native compilers. The software has been specifically released under the Apache v2 license to ensure it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e. GCC on Linux, MSVC on Windows and XCode on OSX) is available for download free of charge at http://proteowizard.sourceforge.net. This website also provides code examples and documentation. It is our hope that the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution and further development are strongly encouraged.

  10. M2Lite: An Open-source, Light-weight, Pluggable and Fast Proteome Discoverer MSF to mzIdentML Tool.

    Science.gov (United States)

    Aiyetan, Paul; Zhang, Bai; Chen, Lily; Zhang, Zhen; Zhang, Hui

    2014-04-28

    Proteome Discoverer is one of many tools used for protein database search and peptide-to-spectrum assignment in mass spectrometry-based proteomics. However, the inadequacy of conversion tools makes it challenging to compare and integrate its results with those of other analytical tools. Here we present M2Lite, an open-source, light-weight, easily pluggable and fast conversion tool. M2Lite converts Proteome Discoverer-derived MSF files to the proteomics community-defined standard, the mzIdentML file format. M2Lite's source code is available as open-source at https://bitbucket.org/paiyetan/m2lite/src and its compiled binaries and documentation can be freely downloaded at https://bitbucket.org/paiyetan/m2lite/downloads.

  11. Vortex filament method as a tool for computational visualization of quantum turbulence

    Science.gov (United States)

    Hänninen, Risto; Baggaley, Andrew W.

    2014-01-01

    The vortex filament model has become a standard and powerful tool to visualize the motion of quantized vortices in helium superfluids. In this article, we present an overview of the method and highlight its impact in aiding our understanding of quantum turbulence, particularly superfluid helium. We present an analysis of the structure and arrangement of quantized vortices. Our results are in agreement with previous studies showing that under certain conditions, vortices form coherent bundles, which allows for classical vortex stretching, giving quantum turbulence a classical nature. We also offer an explanation for the differences between the observed properties of counterflow and pure superflow turbulence in a pipe. Finally, we suggest a mechanism for the generation of coherent structures in the presence of normal fluid shear. PMID:24704873
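
    For context, the core equation behind the vortex filament model is the Biot-Savart integral for the superfluid velocity induced by the quantized vortex lines; the form below is the standard one from the literature, with generic notation rather than notation taken from this article.

    ```latex
    \[
      \mathbf{v}_s(\mathbf{r}) \;=\; \frac{\kappa}{4\pi}
      \int_{\mathcal{L}} \frac{(\mathbf{s}_1 - \mathbf{r}) \times \mathrm{d}\mathbf{s}_1}
           {\lvert \mathbf{s}_1 - \mathbf{r} \rvert^{3}},
    \]
    % where $\kappa = h/m_4$ is the quantum of circulation in superfluid $^4$He and the
    % integral runs along the entire vortex configuration $\mathcal{L}$.
    ```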

  12. DeltaSA tool for source apportionment benchmarking, description and sensitivity analysis

    Science.gov (United States)

    Pernigotti, D.; Belis, C. A.

    2018-05-01

    DeltaSA is an R-package and a Java on-line tool developed at the EC-Joint Research Centre to assist and benchmark source apportionment applications. Its key functionalities support two critical tasks in this kind of study: the assignment of a factor to a source in factor analytical models (source identification) and the model performance evaluation. The source identification is based on the similarity between a given factor and source chemical profiles from public databases. The model performance evaluation is based on statistical indicators used to compare model output with reference values generated in intercomparison exercises. The reference values are calculated as the ensemble average of the results reported by participants that have passed a set of testing criteria based on chemical profiles and time series similarity. In this study, a sensitivity analysis of the model performance criteria is carried out using the results of a synthetic dataset where "a priori" references are available. The consensus-modulated standard deviation punc proves to be the best choice for model performance evaluation when a conservative approach is adopted.
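
    A minimal sketch of the kind of indicator such benchmarking relies on: a z-score of a candidate model result against the ensemble reference. The exact formula of the consensus-modulated punc is not given in the abstract, so a plain standard deviation is used here, and the contribution values are invented.

    ```python
    import numpy as np

    def z_score(candidate, reference_values):
        """Compare one model's source contribution with the ensemble reference.

        candidate        : scalar contribution reported by the model under test
        reference_values : contributions reported by participants that passed
                           the testing criteria (the ensemble)
        """
        ref_mean = np.mean(reference_values)
        ref_sd = np.std(reference_values, ddof=1)   # plain SD; DeltaSA modulates this term
        return (candidate - ref_mean) / ref_sd

    # Invented example: a traffic-source contribution (ug/m3) from five reference participants.
    ensemble = np.array([4.1, 3.8, 4.5, 4.0, 4.3])
    print(z_score(5.2, ensemble))   # |z| <= 2 is a common acceptance rule in proficiency testing
    ```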

  13. Data Fusion and Visualization with the OpenEarth Framework (OEF)

    Science.gov (United States)

    Nadeau, D. R.; Baru, C.; Fouch, M. J.; Crosby, C. J.

    2010-12-01

    Data fusion is an increasingly important problem to solve as we strive to integrate data from multiple sources and build better models of the complex processes operating at the Earth’s surface and its interior. These data are often large, multi-dimensional, and subject to differing conventions for file formats, data structures, coordinate spaces, units of measure, and metadata organization. When visualized, these data require differing, and often conflicting, conventions for visual representations, dimensionality, icons, color schemes, labeling, and interaction. These issues make the visualization of fused Earth science data particularly difficult. The OpenEarth Framework (OEF) is an open-source suite of data fusion and visualization software being developed at the Supercomputer Center at the University of California, San Diego. Funded by the NSF, the project is leveraging virtual globe technology from NASA’s WorldWind to create interactive 3D visualization tools that combine layered data from a variety of sources to create a holistic view of features at, above, and beneath the Earth’s surface. The OEF architecture is cross-platform, multi-threaded, modular, and based upon Java. The OEF’s modular approach yields a collection of compatible mix-and-match components for assembling custom applications. Available modules support file format handling, web service communications, data management, data filtering, user interaction, and 3D visualization. File parsers handle a variety of formal and de facto standard file formats. Each one imports data into a general-purpose data representation that supports multidimensional grids, topography, points, lines, polygons, images, and more. From there, these data may be manipulated, merged, filtered, reprojected, and visualized. Visualization features support conventional and new visualization techniques for looking at topography, tomography, maps, and feature geometry. 3D grid data such as seismic tomography may be

  14. Open source GIS based tools to improve hydrochemical water resources management in EU H2020 FREEWAT platform

    Science.gov (United States)

    Criollo, Rotman; Velasco, Violeta; Vázquez-Suñé, Enric; Nardi, Albert; Marazuela, Miguel A.; Rossetto, Rudy; Borsi, Iacopo; Foglia, Laura; Cannata, Massimiliano; De Filippis, Giovanna

    2017-04-01

    Due to the general increase in water scarcity (Steduto et al., 2012), water quantity and quality must be well known to ensure proper access to water resources in compliance with local and regional directives. This can be supported by tools that facilitate the process of data management and analysis. Such analyses have to provide researchers, professionals, policy makers and users with the ability to improve the management of water resources in line with standard regulatory guidelines. Compliance with the established regulatory guidelines (with a special focus on requirements deriving from the Groundwater Directive, GWD) requires effective monitoring, evaluation and interpretation of a large number of physical and chemical parameters. These large datasets have to be assessed and interpreted by: (i) integrating data from different sources, gathered with different data access techniques and formats; (ii) managing data with varying temporal and spatial extent; and (iii) integrating groundwater quality information with other relevant information, such as further hydrogeological data (Velasco et al., 2014), and pre-processing these data, generally for the realization of groundwater models. In this context, the Hydrochemical Analysis Tools (akvaGIS Tools) have been implemented within the H2020 FREEWAT project, which aims to support water resource management by modelling in an open-source GIS platform (QGIS desktop). The main goal of akvaGIS Tools is to improve water quality analysis by improving the case-study conceptual model, managing all related data in its geospatial database (implemented in SpatiaLite), and providing a set of tools for improving the harmonization, integration, standardization, visualization and interpretation of hydrochemical data. To achieve that, different commands cover a wide range of methodologies for querying, interpreting, and comparing groundwater quality data and facilitate the pre-processing analysis for

  15. Web-Based Geospatial Visualization of GPM Data with CesiumJS

    Science.gov (United States)

    Lammers, Matt

    2018-01-01

    Advancements in the capabilities of JavaScript frameworks and web browsing technology have made online visualization of large geospatial datasets such as those coming from precipitation satellites viable. These data benefit from being visualized on and above a three-dimensional surface. The open-source JavaScript framework CesiumJS (http://cesiumjs.org), developed by Analytical Graphics, Inc., leverages the WebGL protocol to do just that. This presentation will describe how CesiumJS has been used in three-dimensional visualization products developed as part of the NASA Precipitation Processing System (PPS) STORM data-order website. Existing methods of interacting with Global Precipitation Measurement (GPM) Mission data primarily focus on two-dimensional static images, whether displaying vertical slices or horizontal surface/height-level maps. These methods limit interactivity with the robust three-dimensional data coming from the GPM core satellite. Integrating the data with CesiumJS in a web-based user interface has allowed us to create the following products. We have linked an on-the-fly visualization tool for any GPM/partner satellite orbit to the data-order interface. A version of this tool also focuses on high-impact weather events. It enables viewing of combined radar and microwave-derived precipitation data on mobile devices and in a way that can be embedded into other websites. We also have used CesiumJS to visualize a method of integrating gridded precipitation data with modeled wind speeds that animates over time. Emphasis in the presentation will be placed on how a variety of technical methods were used to create these tools, and how the flexibility of the CesiumJS framework facilitates creative approaches to interact with the data.

  16. Monocular tool control, eye dominance, and laterality in New Caledonian crows.

    Science.gov (United States)

    Martinho, Antone; Burns, Zackory T; von Bayern, Auguste M P; Kacelnik, Alex

    2014-12-15

    Tool use, though rare, is taxonomically widespread, but morphological adaptations for tool use are virtually unknown. We focus on the New Caledonian crow (NCC, Corvus moneduloides), which displays some of the most innovative tool-related behavior among nonhumans. One of their major food sources is larvae extracted from burrows with sticks held diagonally in the bill, oriented with individual, but not species-wide, laterality. Among possible behavioral and anatomical adaptations for tool use, NCCs possess unusually wide binocular visual fields (up to 60°), suggesting that extreme binocular vision may facilitate tool use. Here, we establish that during natural extractions, tool tips can only be viewed by the contralateral eye. Thus, maintaining binocular view of tool tips is unlikely to have selected for wide binocular fields; the selective factor is more likely to have been to allow each eye to see far enough across the midsagittal line to view the tool's tip monocularly. Consequently, we tested the hypothesis that tool side preference follows eye preference and found that eye dominance does predict tool laterality across individuals. This contrasts with humans' species-wide motor laterality and uncorrelated motor-visual laterality, possibly because bill-held tools are viewed monocularly and move in concert with eyes, whereas hand-held tools are visible to both eyes and allow independent combinations of eye preference and handedness. This difference may affect other models of coordination between vision and mechanical control, not necessarily involving tools. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. The Effect of Using a Visual Representation Tool in a Teaching-Learning Sequence for Teaching Newton's Third Law

    Science.gov (United States)

    Savinainen, Antti; Mäkynen, Asko; Nieminen, Pasi; Viiri, Jouni

    2017-01-01

    This paper presents a research-based teaching-learning sequence (TLS) that focuses on the notion of interaction in teaching Newton's third law (N3 law) which is, as earlier studies have shown, a challenging topic for students to learn. The TLS made systematic use of a visual representation tool--an interaction diagram (ID)--highlighting…

  18. SmartR: an open-source platform for interactive visual analytics for translational research data.

    Science.gov (United States)

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-07-15

    In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types such as clinical, pre-clinical or OMICS data, combined with strong visual analytical capabilities, will significantly accelerate scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART that not only equips the platform with several dynamic visual analytical workflows, but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js or AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR. reinhard.schneider@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  19. A common source of attention for auditory and visual tracking.

    Science.gov (United States)

    Fougnie, Daryl; Cockhren, Jurnell; Marois, René

    2018-05-01

    Tasks that require tracking visual information reveal the severe limitations of our capacity to attend to multiple objects that vary in time and space. Although these limitations have been extensively characterized in the visual domain, very little is known about tracking information in other sensory domains. Does tracking auditory information exhibit characteristics similar to those of tracking visual information, and to what extent do these two tracking tasks draw on the same attention resources? We addressed these questions by asking participants to perform either single or dual tracking tasks from the same (visual-visual) or different (visual-auditory) perceptual modalities, with the difficulty of the tracking tasks being manipulated across trials. The results revealed that performing two concurrent tracking tasks, whether they were in the same or different modalities, affected tracking performance as compared to performing each task alone (concurrence costs). Moreover, increasing task difficulty also led to increased costs in both the single-task and dual-task conditions (load-dependent costs). The comparison of concurrence costs between visual-visual and visual-auditory dual-task performance revealed slightly greater interference when two visual tracking tasks were paired. Interestingly, however, increasing task difficulty led to equivalent costs for visual-visual and visual-auditory pairings. We concluded that visual and auditory tracking draw largely, though not exclusively, on common central attentional resources.

  20. WiseView: Visualizing motion and variability of faint WISE sources

    Science.gov (United States)

    Caselden, Dan; Westin, Paul, III; Meisner, Aaron; Kuchner, Marc; Colin, Guillaume

    2018-06-01

    WiseView renders image blinks of Wide-field Infrared Survey Explorer (WISE) coadds spanning a multi-year time baseline in a browser. The software allows for easy visual identification of motion and variability for sources far beyond the single-frame detection limit, a key threshold not surmounted by many studies. WiseView transparently gathers small image cutouts drawn from many terabytes of unWISE coadds, facilitating access to this large and unique dataset. Users need only input the coordinates of interest and can interactively tune parameters including the image stretch, colormap and blink rate. WiseView was developed in the context of the Backyard Worlds: Planet 9 citizen science project, and has enabled hundreds of brown dwarf candidate discoveries by citizen scientists and professional astronomers.
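
    A minimal sketch of the blink idea that WiseView implements in the browser, written with matplotlib and astropy rather than WiseView's own code; the FITS cutout file names and the percentile stretch are assumptions.

    ```python
    import numpy as np
    import matplotlib.pyplot as plt
    from matplotlib.animation import FuncAnimation
    from astropy.io import fits

    # Two unWISE coadd cutouts of the same sky position, years apart (assumed files).
    epochs = [fits.getdata("cutout_epoch1.fits"), fits.getdata("cutout_epoch2.fits")]

    def stretch(img, lo=5, hi=99):
        """Simple percentile stretch, analogous to WiseView's adjustable stretch."""
        vmin, vmax = np.percentile(img, [lo, hi])
        return np.clip((img - vmin) / (vmax - vmin), 0, 1)

    fig, ax = plt.subplots()
    im = ax.imshow(stretch(epochs[0]), cmap="gray", origin="lower")

    def blink(frame):
        im.set_data(stretch(epochs[frame % 2]))   # alternate epochs to reveal motion
        return (im,)

    anim = FuncAnimation(fig, blink, frames=20, interval=500)  # 0.5 s blink rate
    plt.show()
    ```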

  1. The LandCarbon Web Application: Advanced Geospatial Data Delivery and Visualization Tools for Communication about Ecosystem Carbon Sequestration and Greenhouse Gas Fluxes

    Science.gov (United States)

    Thomas, N.; Galey, B.; Zhu, Z.; Sleeter, B. M.; Lehmer, E.

    2015-12-01

    The LandCarbon web application (http://landcarbon.org) is a collaboration between the U.S. Geological Survey and U.C. Berkeley's Geospatial Innovation Facility (GIF). The LandCarbon project is a national assessment focused on improved understanding of carbon sequestration and greenhouse gas fluxes in and out of ecosystems related to land use, using scientific capabilities from USGS and other organizations. The national assessment is conducted at a regional scale, covers all 50 states, and incorporates data from remote sensing, land change studies, aquatic and wetland data, hydrological and biogeochemical modeling, and wildfire mapping to estimate baseline and future potential carbon storage and greenhouse gas fluxes. The LandCarbon web application is a geospatial portal that allows for a sophisticated data delivery system as well as a suite of engaging tools that showcase the LandCarbon data using interactive web based maps and charts. The web application was designed to be flexible and accessible to meet the needs of a variety of users. Casual users can explore the input data and results of the assessment for a particular area of interest in an intuitive and interactive map, without the need for specialized software. Users can view and interact with maps, charts, and statistics that summarize the baseline and future potential carbon storage and fluxes for U.S. Level 2 Ecoregions for 3 IPCC emissions scenarios. The application allows users to access the primary data sources and assessment results for viewing and download, and also to learn more about the assessment's objectives, methods, and uncertainties through published reports and documentation. The LandCarbon web application is built on free and open source libraries including Django and D3. The GIF has developed the Django-Spillway package, which facilitates interactive visualization and serialization of complex geospatial raster data. The underlying LandCarbon data is available through an open application

  2. BioJS: an open source JavaScript framework for biological data visualization.

    Science.gov (United States)

    Gómez, John; García, Leyla J; Salazar, Gustavo A; Villaveces, Jose; Gore, Swanand; García, Alexander; Martín, Maria J; Launay, Guillaume; Alcántara, Rafael; Del-Toro, Noemi; Dumousseau, Marine; Orchard, Sandra; Velankar, Sameer; Hermjakob, Henning; Zong, Chenggong; Ping, Peipei; Corpas, Manuel; Jiménez, Rafael C

    2013-04-15

    BioJS is an open-source project whose main objective is the visualization of biological data in JavaScript. BioJS provides an easy-to-use consistent framework for bioinformatics application programmers. It follows a community-driven standard specification that includes a collection of components purposely designed to require a very simple configuration and installation. In addition to the programming framework, BioJS provides a centralized repository of components available for reutilization by the bioinformatics community. http://code.google.com/p/biojs/. Supplementary data are available at Bioinformatics online.

  3. Visualization tools for insurance risk processes

    OpenAIRE

    Krzysztof Burnecki; Rafal Weron

    2006-01-01

    This chapter focuses on risk processes, which are perhaps the most suitable of all insurance objects for computer visualization. At the same time, risk processes are basic instruments for any non-life actuary – they are vital for calculating the amount of loss that an insurance company may incur.
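
    For reference, the object being visualized is, in its classical form, the Cramér-Lundberg risk process; the formula below is the standard textbook definition rather than a quotation from the chapter.

    ```latex
    \[
      R(t) \;=\; u \;+\; c\,t \;-\; \sum_{i=1}^{N(t)} X_i ,
    \]
    % where $u$ is the initial capital, $c$ the premium rate, $N(t)$ a Poisson
    % claim-arrival process, and $X_i$ the i.i.d. claim sizes; ruin occurs the
    % first time $R(t) < 0$.
    ```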

  4. MutScan: fast detection and visualization of target mutations by scanning FASTQ data.

    Science.gov (United States)

    Chen, Shifu; Huang, Tanxiao; Wen, Tiexiang; Li, Hong; Xu, Mingyan; Gu, Jia

    2018-01-22

    Some types of clinical genetic tests, such as cancer testing using circulating tumor DNA (ctDNA), require sensitive detection of known target mutations. However, conventional next-generation sequencing (NGS) data analysis pipelines typically involve different steps of filtering, which may cause missed detection of key mutations with low frequencies. Variant validation is also indicated for key mutations detected by bioinformatics pipelines. Typically, this process can be executed using alignment visualization tools such as IGV or GenomeBrowse. However, these tools are too heavy and therefore unsuitable for validating mutations in ultra-deep sequencing data. We developed MutScan to address problems of sensitive detection and efficient validation for target mutations. MutScan involves highly optimized string-searching algorithms, which can scan input FASTQ files to grab all reads that support target mutations. The collected supporting reads for each target mutation will be piled up and visualized using web technologies such as HTML and JavaScript. Algorithms such as rolling hash and Bloom filters are applied to accelerate scanning, making MutScan able to detect or visualize target mutations very quickly. MutScan is a tool for the detection and visualization of target mutations by only scanning FASTQ raw data directly. Compared to conventional pipelines, this offers very high performance, executing about 20 times faster, and offering maximal sensitivity since it can grab mutations with even a single supporting read. MutScan visualizes detected mutations by generating interactive pile-ups using web technologies. These can serve to validate target mutations, thus avoiding false positives. Furthermore, MutScan can visualize all mutation records in a VCF file to HTML pages for cloud-friendly VCF validation. MutScan is an open source tool available at GitHub: https://github.com/OpenGene/MutScan.
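
    A minimal sketch of the core idea (scanning FASTQ reads for an exact target sequence with a Rabin-Karp rolling hash); this is an illustration under assumed file and target names, not MutScan's optimized C++ implementation.

    ```python
    import gzip

    BASE, MOD = 4, (1 << 61) - 1
    ENC = {"A": 0, "C": 1, "G": 2, "T": 3}

    def seq_hash(seq):
        h = 0
        for c in seq:
            h = (h * BASE + ENC.get(c, 0)) % MOD
        return h

    def hits(read, target, target_hash):
        """Yield start positions in `read` whose k-mer hash matches the target's."""
        k = len(target)
        if len(read) < k:
            return
        top = pow(BASE, k - 1, MOD)
        h = seq_hash(read[:k])
        for i in range(len(read) - k + 1):
            if h == target_hash and read[i:i + k] == target:  # verify to rule out collisions
                yield i
            if i + k < len(read):
                h = ((h - ENC.get(read[i], 0) * top) * BASE + ENC.get(read[i + k], 0)) % MOD

    # Assumed target: ~20 bp of reference context on each side of a known hotspot variant.
    TARGET = "ACGTACGTACGTACGTACGTGACGTACGTACGTACGTACGT"
    T_HASH = seq_hash(TARGET)

    supporting = 0
    with gzip.open("sample_R1.fastq.gz", "rt") as fq:        # assumed input file
        for line_no, line in enumerate(fq):
            if line_no % 4 == 1:                             # FASTQ sequence lines only
                supporting += sum(1 for _ in hits(line.strip(), TARGET, T_HASH))
    print(f"{supporting} reads support the target mutation")
    ```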

  5. Visual Basic, Excel-based fish population modeling tool - The pallid sturgeon example

    Science.gov (United States)

    Moran, Edward H.; Wildhaber, Mark L.; Green, Nicholas S.; Albers, Janice L.

    2016-02-10

    The model presented in this report is a spreadsheet-based model using Visual Basic for Applications within Microsoft Excel (http://dx.doi.org/10.5066/F7057D0Z) prepared in cooperation with the U.S. Army Corps of Engineers and U.S. Fish and Wildlife Service. It uses the same model structure and, initially, parameters as used by Wildhaber and others (2015) for pallid sturgeon. The difference between the model structure used for this report and that used by Wildhaber and others (2015) is that variance is not partitioned. For the model of this report, all variance is applied at the iteration and time-step levels of the model. Wildhaber and others (2015) partition variance into parameter variance (uncertainty about the value of a parameter itself) applied at the iteration level and temporal variance (uncertainty caused by random environmental fluctuations with time) applied at the time-step level. They included implicit individual variance (uncertainty caused by differences between individuals) within the time-step level. The interface developed for the model of this report is designed to allow the user the flexibility to change population model structure and parameter values and uncertainty separately for every component of the model. This flexibility makes the modeling tool potentially applicable to any fish species; however, the flexibility inherent in this modeling tool makes it possible for the user to obtain spurious outputs. The value and reliability of the model outputs are only as good as the model inputs. Using this modeling tool with improper or inaccurate parameter values, or for species for which the structure of the model is inappropriate, could lead to untenable management decisions. By facilitating fish population modeling, this modeling tool allows the user to evaluate a range of management options and implications. The goal is to provide a user-friendly tool for developing fish population models useful to natural resource
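
    A minimal sketch of the variance structure described above, applied to a deliberately simple stochastic projection: variance is drawn once per iteration and again at every time step. The growth-rate parameters and quasi-extinction threshold are illustrative assumptions, not values from the pallid sturgeon model.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def project(n0=1000, years=50, iterations=500,
                mean_growth=1.02, sd_growth=0.10):
        """Stochastic exponential projection with two levels of variance.

        Each iteration draws its own mean growth rate (iteration-level variance),
        and each time step draws a yearly rate around that iteration-level mean
        (time-step-level variance), mirroring the structure described above.
        """
        trajectories = np.empty((iterations, years + 1))
        trajectories[:, 0] = n0
        for it in range(iterations):
            iter_mean = rng.normal(mean_growth, sd_growth)        # iteration-level draw
            for t in range(years):
                yearly = rng.normal(iter_mean, sd_growth)         # time-step-level draw
                trajectories[it, t + 1] = max(trajectories[it, t] * yearly, 0.0)
        return trajectories

    runs = project()
    print("median final population:", np.median(runs[:, -1]))
    print("probability of quasi-extinction (<50):", np.mean(runs[:, -1] < 50))
    ```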

  6. Mathematica data visualization

    CERN Document Server

    Saquib, Nazmus

    2014-01-01

    If you are planning to create data analysis and visualization tools in the context of science, engineering, economics, or social science, then this book is for you. With this book, you will quickly become a visualization expert using Mathematica.

  7. Metaviz: interactive statistical and visual analysis of metagenomic data.

    Science.gov (United States)

    Wagner, Justin; Chelaru, Florin; Kancherla, Jayaram; Paulson, Joseph N; Zhang, Alexander; Felix, Victor; Mahurkar, Anup; Elmqvist, Niklas; Corrada Bravo, Héctor

    2018-04-06

    Large studies profiling microbial communities and their association with healthy or disease phenotypes are now commonplace. Processed data from many of these studies are publicly available, but significant effort is required for users to effectively organize, explore and integrate them, limiting the utility of these rich data resources. Effective integrative and interactive visual and statistical tools to analyze many metagenomic samples can greatly increase the value of these data for researchers. We present Metaviz, a tool for interactive exploratory data analysis of annotated microbiome taxonomic community profiles derived from marker gene or whole metagenome shotgun sequencing. Metaviz is uniquely designed to address the challenge of browsing the hierarchical structure of metagenomic data features while rendering visualizations of data values that are dynamically updated in response to user navigation. We use Metaviz to provide the UMD Metagenome Browser web service, allowing users to browse and explore data for more than 7000 microbiomes from published studies. Users can also deploy Metaviz as a web service, or use it to analyze data through the metavizr package to interoperate with state-of-the-art analysis tools available through Bioconductor. Metaviz is free and open source with the code, documentation and tutorials publicly accessible.

  8. Mapping as a visual health communication tool: promises and dilemmas.

    Science.gov (United States)

    Parrott, Roxanne; Hopfer, Suellen; Ghetian, Christie; Lengerich, Eugene

    2007-01-01

    In the era of evidence-based public health promotion and planning, the use of maps as a form of evidence to communicate about the multiple determinants of cancer is on the rise. Geographic information systems and mapping technologies make future proliferation of this strategy likely. Yet disease maps as a communication form remain largely unexamined. This content analysis considers the presence of multivariate information, credibility cues, and the communication function of publicly accessible maps for cancer control activities. Thirty-six state comprehensive cancer control plans were publicly available in July 2005 and were reviewed for the presence of maps. Fourteen of the 36 state cancer plans (39%) contained map images (N = 59 static maps). A continuum of map interactivity was observed, with 10 states having interactive mapping tools available to query and map cancer information. Four states had both cancer plans with map images and interactive mapping tools available to the public on their Web sites. Of the 14 state cancer plans that depicted map images, two displayed multivariate data in a single map. Nine of the 10 states with interactive mapping capability offered the option to display multivariate health risk messages. The most frequent content category mapped was cancer incidence and mortality, with stage at diagnosis infrequently available. The most frequent communication function served by the maps reviewed was redundancy, as maps repeated information contained in textual forms. The social and ethical implications for communicating about cancer through the use of visual geographic representations are discussed.

  9. Visualizing spikes in source-space

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Duez, Lene; Scherg, Michael

    2016-01-01

    OBJECTIVE: Reviewing magnetoencephalography (MEG) recordings is time-consuming: signals from the 306 MEG-sensors are typically reviewed divided into six arrays of 51 sensors each, thus browsing each recording six times in order to evaluate all signals. A novel method of reconstructing the MEG signals in source-space was developed using a source-montage of 29 brain-regions and two spatial components to remove magnetocardiographic (MKG) artefacts. Our objective was to evaluate the accuracy of reviewing MEG in source-space. METHODS: In 60 consecutive patients with epilepsy, we prospectively evaluated the accuracy of reviewing the MEG signals in source-space as compared to the classical method of reviewing them in sensor-space. RESULTS: All 46 spike-clusters identified in sensor-space were also identified in source-space. Two additional spike-clusters were identified in source-space. As 29...

  10. An open source GIS-based tool to integrate the fragmentation mechanism in rockfall propagation

    Science.gov (United States)

    Matas, Gerard; Lantada, Nieves; Gili, Josep A.; Corominas, Jordi

    2015-04-01

    Rockfalls are frequent instability processes in road cuts, open pit mines and quarries, steep slopes and cliffs. Even though the stability of rock slopes can be determined using analytical approaches, the assessment of large rock cliffs requires simplifying assumptions due to the difficulty of working with a large number of joints and the scatter of both their orientations and strength parameters. The attitude and persistence of joints within the rock mass define the size of kinematically unstable rock volumes. Furthermore, the rock block will eventually split into several fragments during its propagation downhill due to its impacts with the ground surface. Knowledge of the size, energy, trajectory… of each block resulting from fragmentation is critical in determining the vulnerability of buildings and protection structures. The objective of this contribution is to present a simple and open source tool to simulate the fragmentation mechanism in rockfall propagation models and in the calculation of impact energies. This tool includes common modes of motion for falling boulders based on the previous literature. The final tool is being implemented in a GIS (Geographic Information System) using open source Python programming. The tool under development will be simple, modular, compatible with any GIS environment, open source, and able to model rockfall phenomena correctly. It could be used in any area susceptible to rockfalls after a prior adjustment of the model parameters. Once the parameters are adjusted to a given area, a simulation can be performed to obtain maps of kinetic energy, frequency, stopping density and passing heights. This GIS-based tool and the analysis of the fragmentation laws using data collected from recent rockfalls have been developed within the RockRisk Project (2014-2016). This project is funded by the Spanish Ministerio de Economía y Competitividad and entitled "Rockfalls in cliffs: risk quantification and its prevention" (BIA2013-42582-P).

  11. Updates on resources, software tools, and databases for plant proteomics in 2016-2017.

    Science.gov (United States)

    Misra, Biswapriya B

    2018-02-08

    Proteomics data processing, annotation, and analysis can often lead to major hurdles in large-scale high-throughput bottom-up proteomics experiments. Given the recent rise in protein-based big datasets being generated, efforts in in silico tool development have increased at an unprecedented rate; so much so that it has become increasingly difficult to keep track of all the advances in a particular academic year. However, these tools benefit the plant proteomics community in circumventing critical issues in data analysis and visualization, and these continually developing, open-source, community-developed tools hold potential for future research efforts. This review aims to introduce and summarize more than 50 software tools, databases, and resources developed and published during 2016-2017 under the following categories: tools for data pre-processing and analysis, statistical analysis tools, peptide identification tools, databases and spectral libraries, and data visualization and interpretation tools. Finally, efforts in data archiving and validation datasets for this well-informed proteomics community will be discussed as well. Additionally, the author delineates the current and most commonly used proteomics tools in order to introduce novice readers to this -omics discovery platform. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Mapping healthcare systems: a policy relevant analytic tool.

    Science.gov (United States)

    Sekhri Feachem, Neelam; Afshar, Ariana; Pruett, Cristina; Avanceña, Anton L V

    2017-07-01

    In the past decade, an international consensus on the value of well-functioning systems has driven considerable health systems research. This research falls into two broad categories. The first provides conceptual frameworks that take complex healthcare systems and create simplified constructs of interactions and functions. The second focuses on granular inputs and outputs. This paper presents a novel translational mapping tool - the University of California, San Francisco mapping tool (the Tool) - which bridges the gap between these two areas of research, creating a platform for multi-country comparative analysis. Using the Murray-Frenk framework, we create a macro-level representation of a country's structure, focusing on how it finances and delivers healthcare. The map visually depicts the fundamental policy questions in healthcare system design: funding sources and amount spent through each source, purchasers, populations covered, provider categories; and the relationship between these entities. We use the Tool to provide a macro-level comparative analysis of the structure of India's and Thailand's healthcare systems. As part of the systems strengthening arsenal, the Tool can stimulate debate about the merits and consequences of different healthcare systems structural designs, using a common framework that fosters multi-country comparative analyses. © The Author 2017. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene.

  13. Visual Thinking in Teaching History: Reading the Visual Thinking Skills of 12 Year-Old Pupils in Istanbul

    Science.gov (United States)

    Dilek, Gulcin

    2010-01-01

    This study aims to explore the visual thinking skills of some sixth grade (12-13 year-old) primary pupils who created visual interpretations during history courses. Pupils drew pictures describing historical scenes or events based on visual sources. They constructed these illustrations by using visual and written primary and secondary sources in…

  14. SpacePy - a Python-based library of tools for the space sciences

    International Nuclear Information System (INIS)

    Morley, Steven K.; Welling, Daniel T.; Koller, Josef; Larsen, Brian A.; Henderson, Michael G.

    2010-01-01

    Space science deals with the bodies within the solar system and the interplanetary medium; the primary focus is on atmospheres and above - at Earth, the short-timescale variation in the geomagnetic field, the Van Allen radiation belts and the deposition of energy into the upper atmosphere are key areas of investigation. SpacePy is a package for Python, targeted at the space sciences, that aims to make basic data analysis, modeling and visualization easier. It builds on the capabilities of the well-known NumPy and MatPlotLib packages. Publication-quality output direct from analyses is emphasized. The SpacePy project seeks to promote accurate and open research standards by providing an open environment for code development. In the space physics community there has long been a significant reliance on proprietary languages that restrict free transfer of data and reproducibility of results. By providing a comprehensive, open-source library of widely used analysis and visualization tools in a free, modern and intuitive language, we hope that this reliance will be diminished. SpacePy includes implementations of widely used empirical models, statistical techniques used frequently in space science (e.g. superposed epoch analysis), and interfaces to advanced tools such as electron drift shell calculations for radiation belt studies. SpacePy also provides analysis and visualization tools for components of the Space Weather Modeling Framework - currently this only includes the BATS-R-US 3-D magnetohydrodynamic model and the RAM ring current model - including streamline tracing in vector fields. Further development is currently underway. External libraries, which include well-known magnetic field models, high-precision time conversions and coordinate transformations, are wrapped for access from Python using SWIG and f2py. The rest of the tools have been implemented directly in Python. The provision of open-source tools to perform common tasks will provide openness in the
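
    As an illustration of one technique named above, the sketch below performs a superposed epoch analysis in plain NumPy; SpacePy ships its own implementation, and the synthetic data, window length and epoch definition here are assumptions.

    ```python
    import numpy as np

    def superposed_epoch(series, epoch_indices, half_window):
        """Stack windows of `series` centred on each epoch and average them."""
        windows = []
        for idx in epoch_indices:
            if half_window <= idx < len(series) - half_window:
                windows.append(series[idx - half_window: idx + half_window + 1])
        stack = np.vstack(windows)
        lags = np.arange(-half_window, half_window + 1)
        return lags, stack.mean(axis=0), stack.std(axis=0)

    # Synthetic example: a noisy index that dips after each "storm onset" epoch.
    rng = np.random.default_rng(0)
    t = np.arange(5000)
    onsets = rng.choice(t[100:-100], size=40, replace=False)
    signal = rng.normal(0, 1, t.size)
    for o in onsets:
        signal[o:o + 48] -= np.linspace(3, 0, 48)       # superposed response

    lags, mean_resp, sd_resp = superposed_epoch(signal, onsets, half_window=72)
    print("minimum of mean response at lag:", lags[np.argmin(mean_resp)])
    ```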

  15. Information visualization of the minority game

    Science.gov (United States)

    Jiang, W.; Herbert, R. D.; Webber, R.

    2008-02-01

    Many dynamical systems produce large quantities of data. How can the system be understood from the output data? Often people are simply overwhelmed by the data. Traditional tools such as tables and plots are often not adequate, and new techniques are needed to help people to analyze the system. In this paper, we propose the use of two spacefilling visualization tools to examine the output from a complex agent-based financial model. We measure the effectiveness and performance of these tools through usability experiments. Based on the experimental results, we develop two new visualization techniques that combine the advantages and discard the disadvantages of the information visualization tools. The model we use is an evolutionary version of the Minority Game which simulates a financial market.

  16. Information visualization of the minority game

    International Nuclear Information System (INIS)

    Jiang, W; Herbert, R D; Webber, R

    2008-01-01

    Many dynamical systems produce large quantities of data. How can the system be understood from the output data? Often people are simply overwhelmed by the data. Traditional tools such as tables and plots are often not adequate, and new techniques are needed to help people to analyze the system. In this paper, we propose the use of two spacefilling visualization tools to examine the output from a complex agent-based financial model. We measure the effectiveness and performance of these tools through usability experiments. Based on the experimental results, we develop two new visualization techniques that combine the advantages and discard the disadvantages of the information visualization tools. The model we use is an evolutionary version of the Minority Game which simulates a financial market
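
    For readers unfamiliar with the model underlying the two records above, the sketch below implements the basic (non-evolutionary) minority game in Python; the agent count, memory length, number of strategies and step count are illustrative assumptions, not the parameters of the evolutionary version studied in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N_AGENTS, MEMORY, N_STRATEGIES, STEPS = 301, 3, 2, 2000
    N_HISTORIES = 2 ** MEMORY

    # Each agent holds fixed random strategies: a map from history -> action (+1/-1).
    strategies = rng.choice([-1, 1], size=(N_AGENTS, N_STRATEGIES, N_HISTORIES))
    scores = np.zeros((N_AGENTS, N_STRATEGIES))
    history = 0                      # bit-encoded string of recent minority outcomes
    attendance = []

    for _ in range(STEPS):
        best = scores.argmax(axis=1)                       # each agent plays its best strategy
        actions = strategies[np.arange(N_AGENTS), best, history]
        A = actions.sum()
        minority = -np.sign(A) if A != 0 else rng.choice([-1, 1])
        attendance.append(A)
        # Reward every strategy that would have predicted the minority side.
        scores += (strategies[:, :, history] == minority)
        history = ((history << 1) | int(minority == 1)) % N_HISTORIES

    print("std of attendance A(t):", np.std(attendance))   # volatility, the usual observable
    ```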

  17. Visual exploration of images

    Science.gov (United States)

    Suaste-Gomez, Ernesto; Leybon, Jaime I.; Rodriguez, D.

    1998-07-01

    The visual scanpath has been an important technique in neuro-ophthalmic and psychological studies, because it serves as a tool to assess some pathologies, such as those affecting visual perception of color or black/white images, color blindness, etc. On the other hand, this tool has reached a broad field of applications such as marketing. The scanpath over a specific picture shows the observer's interest in color, shapes, letter size, etc.; even when the picture is among a group of images, this tool has been shown to be helpful in capturing people's interest in a specific advertisement.

  18. Design of UAV-Embedded Microphone Array System for Sound Source Localization in Outdoor Environments

    Directory of Open Access Journals (Sweden)

    Kotaro Hoshiba

    2017-11-01

    In search and rescue activities, unmanned aerial vehicles (UAVs) should exploit sound information to compensate for poor visual information. This paper describes the design and implementation of a UAV-embedded microphone array system for sound source localization in outdoor environments. Four critical development problems included water-resistance of the microphone array, efficiency in assembling, reliability of wireless communication, and sufficiency of visualization tools for operators. To solve these problems, we developed a spherical microphone array system (SMAS) consisting of a microphone array, a stable wireless network communication system, and intuitive visualization tools. The performance of SMAS was evaluated with simulated data and a demonstration in the field. Results confirmed that the SMAS provides highly accurate localization, water resistance, prompt assembly, stable wireless communication, and intuitive information for observers and operators.

  19. Design of UAV-Embedded Microphone Array System for Sound Source Localization in Outdoor Environments.

    Science.gov (United States)

    Hoshiba, Kotaro; Washizaki, Kai; Wakabayashi, Mizuho; Ishiki, Takahiro; Kumon, Makoto; Bando, Yoshiaki; Gabriel, Daniel; Nakadai, Kazuhiro; Okuno, Hiroshi G

    2017-11-03

    In search and rescue activities, unmanned aerial vehicles (UAV) should exploit sound information to compensate for poor visual information. This paper describes the design and implementation of a UAV-embedded microphone array system for sound source localization in outdoor environments. Four critical development problems included water-resistance of the microphone array, efficiency in assembling, reliability of wireless communication, and sufficiency of visualization tools for operators. To solve these problems, we developed a spherical microphone array system (SMAS) consisting of a microphone array, a stable wireless network communication system, and intuitive visualization tools. The performance of SMAS was evaluated with simulated data and a demonstration in the field. Results confirmed that the SMAS provides highly accurate localization, water resistance, prompt assembly, stable wireless communication, and intuitive information for observers and operators.
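
    As a generic illustration of the kind of computation a microphone-array localizer performs, and not necessarily the algorithm used in the SMAS, the sketch below estimates a source direction from the time difference of arrival between two microphones via cross-correlation; the sample rate, microphone spacing and synthetic chirp are assumptions.

    ```python
    import numpy as np

    FS = 16000          # sample rate [Hz], assumed
    C = 343.0           # speed of sound [m/s]
    MIC_DISTANCE = 0.2  # microphone spacing [m], assumed

    def tdoa_angle(sig_a, sig_b):
        """Estimate arrival angle from the cross-correlation peak of two mic signals."""
        corr = np.correlate(sig_a, sig_b, mode="full")
        lag = np.argmax(corr) - (len(sig_b) - 1)          # delay between mics in samples
        delay = lag / FS
        # Far-field assumption: delay = d * sin(theta) / c
        return np.degrees(np.arcsin(np.clip(delay * C / MIC_DISTANCE, -1, 1)))

    # Synthetic test: a chirp arriving roughly 30 degrees off broadside.
    t = np.arange(0, 0.1, 1 / FS)
    source = np.sin(2 * np.pi * (500 + 3000 * t) * t)
    true_delay = MIC_DISTANCE * np.sin(np.radians(30)) / C
    shift = int(round(true_delay * FS))
    mic_a = np.pad(source, (shift, 0))[:len(source)]      # delayed copy at mic A
    mic_b = source
    print(f"estimated angle: {tdoa_angle(mic_a, mic_b):.1f} deg")
    ```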

  20. Data Visualization: Conversion of Data to Animation Files

    National Research Council Canada - National Science Library

    Kimbler, Nate

    2004-01-01

    Because visualization tools are vital to understanding complex physical phenomena, these tools attempt to facilitate converting data into animations that can be saved and used in data...

  1. Theoretical and methodological notes on visual and audiovisual sources in researches on Life Stories and Self-referential Memorials

    Directory of Open Access Journals (Sweden)

    Maria Helena Menna Barreto Abrahão

    2014-01-01

    The text makes explicit the reflection that underlies the use of pictures, films and video films as sources in research on Life Stories and Self-referential Memorials in Teachers' Education. After referring to the research in which we have used this support since 1988, we work with two complementary pairs of theoretical dimensions of narratives in visual and audiovisual sources and their use in such empirical research: subjectivity/truth and space/time. These dimensions are grounded in Barthes (1984) to propose an interpretative effort on these sources to understand the essence of photography according to the photographer and the essence of photography according to the photographed person, combined with the essence of photography according to the researcher. The Barthesian constructs studium and punctum are applied to reading the narratives of the filmic and photographic material, reaching the more radical expression of the Barthesian punctum: the real or representational death of the referent that serves the photos and movies. The discussion of these dimensions for the analysis of (audio)visual sources is complemented with the support of several other authors.

  2. RETRANS - A tool to verify the functional equivalence of automatically generated source code with its specification

    International Nuclear Information System (INIS)

    Miedl, H.

    1998-01-01

    According to the relevant technical standards (e.g. IEC 880), it is necessary to verify each step in the development process of safety-critical software. This also holds for the verification of automatically generated source code. To avoid human errors during this verification step and to limit the cost, a tool should be used that has been developed independently of the code generator. For this purpose, ISTec has developed the tool RETRANS, which demonstrates the functional equivalence of automatically generated source code with its underlying specification. (author)

  3. VISIBIOweb: visualization and layout services for BioPAX pathway models

    Science.gov (United States)

    Dilek, Alptug; Belviranli, Mehmet E.; Dogrusoz, Ugur

    2010-01-01

    With recent advancements in techniques for cellular data acquisition, information on cellular processes has been increasing at a dramatic rate. Visualization is critical to analyzing and interpreting complex information; representing cellular processes or pathways is no exception. VISIBIOweb is a free, open-source, web-based pathway visualization and layout service for pathway models in BioPAX format. With VISIBIOweb, one can obtain well-laid-out views of pathway models using the standard notation of the Systems Biology Graphical Notation (SBGN), and can embed such views within one's web pages as desired. Pathway views may be navigated using zoom and scroll tools; pathway object properties, including any external database references available in the data, may be inspected interactively. The automatic layout component of VISIBIOweb may also be accessed programmatically from other tools using Hypertext Transfer Protocol (HTTP). The web site is free and open to all users and there is no login requirement. It is available at: http://visibioweb.patika.org. PMID:20460470

  4. Genome Context Viewer: visual exploration of multiple annotated genomes using microsynteny.

    Science.gov (United States)

    Cleary, Alan; Farmer, Andrew

    2018-05-01

    The Genome Context Viewer is a visual data-mining tool that allows users to search across multiple providers of genome data for regions with similarly annotated content that may be aligned and visualized at the level of their shared functional elements. By handling ordered sequences of gene family memberships as a unit of search and comparison, the user interface enables quick and intuitive assessment of the degree of gene content divergence and the presence of various types of structural events within syntenic contexts. Insights into functionally significant differences seen at this level of abstraction can then serve to direct the user to more detailed explorations of the underlying data in other interconnected, provider-specific tools. GCV is provided under the GNU General Public License version 3 (GPL-3.0). Source code is available at https://github.com/legumeinfo/lis_context_viewer. adf@ncgr.org. Supplementary data are available at Bioinformatics online.

  5. An integrated, open-source set of tools for urban vulnerability monitoring from Earth observation data

    Science.gov (United States)

    De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel

    2015-04-01

    Aim: The paper introduces an integrated set of open-source tools designed to process medium- and high-resolution imagery with the aim of extracting vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extent of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information (e.g. OrfeoToolbox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. Therefore, it is important to provide end-users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms is a combination of low-level image processing and geospatial information handling tools along with high-level workflows. In particular, two main products are released under the GPL license: the source code, oriented to developers, and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance are guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: In the absence of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs for already available models such as the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform can guarantee support and enlarge the community of end-users. Keywords: Vulnerability monitoring, remote sensing, optical imagery, open-source software tools References [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30th January 2014

  6. IRscope: An online program to visualize the junction sites of chloroplast genomes.

    Science.gov (United States)

    Amiryousefi, Ali; Hyvönen, Jaakko; Poczai, Peter

    2018-04-05

    Genome plotting is performed using a wide range of visualization tools, each with emphasis on a different informative dimension of the genome. These tools can provide a deeper insight into the genomic structure of the organism. Here we announce a new visualization tool that is specifically designed for chloroplast genomes. It allows users to depict the genetic architecture of up to ten chloroplast genomes in the vicinity of the sites connecting the inverted repeats to the short and long single-copy regions. The software and its dependent libraries are fully coded in R, and the resulting plot is scaled to the realistic size in nucleotide base pairs in the vicinity of the junction sites. We introduce a website for easier use of the program, as well as the R source code of the software, which can be modified if preferences are to be changed and integrated into personal pipelines. The input of the program is an annotation GenBank (.gb) file, the accession or GI number of the sequence, or a DOGMA output file. The software was tested using over a hundred embryophyte chloroplast genomes and in all cases a reliable output was obtained. Source code and the online suite are available @ https://irscope.shinyapps.io/irapp/ or @ https://github.com/Limpfrog/irscope. ali.amiryousefi@helsinki.fi.

  7. A spectroscopic tool for identifying sources of origin for materials of military interest

    Science.gov (United States)

    Miziolek, Andrzej W.; De Lucia, Frank C.

    2014-05-01

    There is a need to identify the source of origin for many items of military interest, including ammunition and weapons that may be circulated and traded in illicit markets. Both fieldable systems (man-portable or handheld) as well as benchtop systems in field and home base laboratories are desired for screening and attribution purposes. Laser Induced Breakdown Spectroscopy (LIBS) continues to show significant capability as a promising new tool for materials identification, matching, and provenance. With the use of the broadband, high resolution spectrometer systems, the LIBS devices can not only determine the elemental inventory of the sample, but they are also capable of elemental fingerprinting to signify sources of origin of various materials. We present the results of an initial study to differentiate and match spent cartridges from different manufacturers and countries. We have found that using Partial Least Squares Discriminant Analysis (PLS-DA) we are able to achieve on average 93.3% True Positives and 5.3% False Positives. These results add to the large body of publications that have demonstrated that LIBS is a particularly suitable tool for source of origin determinations.
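
    A minimal sketch of PLS-DA classification as it is commonly applied to spectral fingerprints, using scikit-learn's PLSRegression on one-hot class labels; the synthetic "spectra", class count and component number are assumptions and do not reproduce the LIBS dataset or the results reported above.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(7)

    # Synthetic "spectra": 3 manufacturers x 40 cartridges x 500 spectral channels.
    n_classes, n_per_class, n_channels = 3, 40, 500
    X = rng.normal(0, 1, (n_classes * n_per_class, n_channels))
    y = np.repeat(np.arange(n_classes), n_per_class)
    for c in range(n_classes):                     # give each class a few marker lines
        X[y == c, 50 * (c + 1)] += 3.0

    Y = np.eye(n_classes)[y]                       # one-hot targets for PLS-DA
    X_tr, X_te, Y_tr, Y_te, _, y_te = train_test_split(
        X, Y, y, test_size=0.3, random_state=0, stratify=y)

    plsda = PLSRegression(n_components=5).fit(X_tr, Y_tr)
    pred = plsda.predict(X_te).argmax(axis=1)      # assign to the highest-scoring class
    print("true positive rate:", np.mean(pred == y_te))
    ```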

  8. Abstractocyte: A Visual Tool for Exploring Nanoscale Astroglial Cells

    KAUST Repository

    Mohammed, Haneen

    2017-06-12

    This thesis presents the design and implementation of Abstractocyte, a system for the visual analysis of astrocytes, and their relation to neurons, in nanoscale volumes of brain tissue. Astrocytes are glial cells, i.e., non-neuronal cells that support neurons and the nervous system. Even though glial cells make up around 50 percent of all cells in the mammalian brain, so far they have been far less studied than neurons. Nevertheless, the study of astrocytes has immense potential for understanding brain function. However, the complex and widely-branching structure of astrocytes requires high-resolution electron microscopy imaging and makes visualization and analysis challenging. Using Abstractocyte, biologists can explore the morphology of astrocytes at various visual abstraction levels, while simultaneously analyzing neighboring neurons and their connectivity. We define a novel, conceptual 2D abstraction space for jointly visualizing astrocytes and neurons. Neuroscientists can choose a joint visualization as a specific point in that 2D abstraction space. Dragging this point allows them to smoothly transition between different abstraction levels in an intuitive manner. We describe the design of Abstractocyte, and present three case studies in which neuroscientists have successfully used our system to assess astrocytic coverage of synapses, glycogen distribution in relation to synapses, and astrocytic-mitochondria coverage.

  9. Usability Evaluation of the Spatial OLAP Visualization and Analysis Tool (SOVAT).

    Science.gov (United States)

    Scotch, Matthew; Parmanto, Bambang; Monaco, Valerie

    2007-02-01

    Increasingly sophisticated technologies, such as On-Line Analytical Processing (OLAP) and Geospatial Information Systems (GIS), are being leveraged for conducting community health assessments (CHA). Little is known about the usability of OLAP and GIS interfaces with respect to CHA. We conducted an iterative usability evaluation of the Spatial OLAP Visualization and Analysis Tool (SOVAT), a software application that combines OLAP and GIS. A total of nine graduate students and six community health researchers were asked to think-aloud while completing five CHA questions using SOVAT. The sessions were analyzed after every three participants and changes to the interface were made based on the findings. Measures included elapsed time, answers provided, erroneous actions, and satisfaction. Traditional OLAP interface features were poorly understood by participants and combined OLAP-GIS features needed to be better emphasized. The results suggest that the changes made to the SOVAT interface resulted in increases in both usability and user satisfaction.

  10. Pulseq-Graphical Programming Interface: Open source visual environment for prototyping pulse sequences and integrated magnetic resonance imaging algorithm development.

    Science.gov (United States)

    Ravi, Keerthi Sravan; Potdar, Sneha; Poojar, Pavan; Reddy, Ashok Kumar; Kroboth, Stefan; Nielsen, Jon-Fredrik; Zaitsev, Maxim; Venkatesan, Ramesh; Geethanath, Sairam

    2018-03-11

    To provide a single open-source platform for comprehensive MR algorithm development inclusive of simulations, pulse sequence design and deployment, reconstruction, and image analysis. We integrated the "Pulseq" platform for vendor-independent pulse programming with Graphical Programming Interface (GPI), a scientific development environment based on Python. Our integrated platform, Pulseq-GPI, permits sequences to be defined visually and exported to the Pulseq file format for execution on an MR scanner. For comparison, Pulseq files using either MATLAB only ("MATLAB-Pulseq") or Python only ("Python-Pulseq") were generated. We demonstrated three fundamental sequences on a 1.5 T scanner. Execution times of the three variants of implementation were compared on two operating systems. In vitro phantom images indicate equivalence with the vendor-supplied implementations and MATLAB-Pulseq. The examples demonstrated in this work illustrate the unifying capability of Pulseq-GPI. The execution times of all three implementations were fast (a few seconds). The software is capable of user-interface based development and/or command line programming. The tool demonstrated here, Pulseq-GPI, integrates the open-source simulation, reconstruction and analysis capabilities of GPI Lab with the pulse sequence design and deployment features of Pulseq. Current and future work includes providing an ISMRMRD interface and incorporating Specific Absorption Rate and Peripheral Nerve Stimulation computations. Copyright © 2018 Elsevier Inc. All rights reserved.

  11. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    Science.gov (United States)

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus-a model system to investigate gut-brain communication, for example, in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software-an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research. Copyright © 2016 the American Physiological Society.
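
    As a small illustration of the kind of spike train analysis such a notebook might contain, here is a hedged Python sketch of a peri-stimulus time histogram (PSTH); the spike times and stimulus onsets are made-up placeholders, not data from the study.

        # Peri-stimulus time histogram (PSTH) sketch with placeholder data.
        import numpy as np
        import matplotlib.pyplot as plt

        spike_times = np.sort(np.random.uniform(0, 600, 5000))   # seconds (placeholder)
        stim_onsets = np.arange(10, 590, 20)                      # one stimulus every 20 s

        window, bin_width = (-1.0, 3.0), 0.1
        edges = np.arange(window[0], window[1] + bin_width, bin_width)

        # Align spikes to each stimulus onset and accumulate counts per bin.
        counts = np.zeros(len(edges) - 1)
        for onset in stim_onsets:
            sel = (spike_times >= onset + window[0]) & (spike_times < onset + window[1])
            counts += np.histogram(spike_times[sel] - onset, bins=edges)[0]

        rate = counts / (len(stim_onsets) * bin_width)            # spikes per second
        plt.bar(edges[:-1], rate, width=bin_width, align="edge")
        plt.xlabel("time from stimulus onset (s)")
        plt.ylabel("firing rate (Hz)")
        plt.show()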

  12. Custom Formula-Based Visualizations for Savvy Designers

    DEFF Research Database (Denmark)

    Kuhail, Mohammad Amin

    Despite their usefulness in many domains (e.g. healthcare, finance, etc.), custom visualizations remain tedious and hard to implement. It would be advantageous if savvy designers (designers with end-user development skills and much domain knowledge) could refine visualizations to their needs. For instance, it would save time and money if a clinician familiar with spreadsheet formulas could refine a visualization (e.g. the lifelines) rather than hiring a programmer. Existing approaches to visualization are one of the two: accessible to savvy designers but limited in customizability, or inaccessible … and expressive. For instance, chart tools are easy to use, but support only predefined visualizations, while visualization tools support custom visualizations, but require program-like specifications. This thesis presents Uvis, a visualization system that targets savvy designers. With Uvis, designers drag…

  13. Virtual Reality: A Tool for Cartographic Visualization | Quaye-Ballard ...

    African Journals Online (AJOL)

    Visualization methods in the analysis of geographical datasets are based on static models, which restrict the visual analysis capabilities. The use of virtual reality, which provides a three-dimensional (3D) perspective, gives the user the ability to change viewpoints and models dynamically and thereby overcomes the static limitations of ...

  14. Design of UAV-Embedded Microphone Array System for Sound Source Localization in Outdoor Environments †

    Science.gov (United States)

    Hoshiba, Kotaro; Washizaki, Kai; Wakabayashi, Mizuho; Ishiki, Takahiro; Bando, Yoshiaki; Gabriel, Daniel; Nakadai, Kazuhiro; Okuno, Hiroshi G.

    2017-01-01

    In search and rescue activities, unmanned aerial vehicles (UAV) should exploit sound information to compensate for poor visual information. This paper describes the design and implementation of a UAV-embedded microphone array system for sound source localization in outdoor environments. Four critical development problems included water-resistance of the microphone array, efficiency in assembling, reliability of wireless communication, and sufficiency of visualization tools for operators. To solve these problems, we developed a spherical microphone array system (SMAS) consisting of a microphone array, a stable wireless network communication system, and intuitive visualization tools. The performance of SMAS was evaluated with simulated data and a demonstration in the field. Results confirmed that the SMAS provides highly accurate localization, water resistance, prompt assembly, stable wireless communication, and intuitive information for observers and operators. PMID:29099790

  15. New Abstraction Networks and a New Visualization Tool in Support of Auditing the SNOMED CT Content

    Science.gov (United States)

    Geller, James; Ochs, Christopher; Perl, Yehoshua; Xu, Junchuan

    2012-01-01

    Medical terminologies are large and complex. Frequently, errors are hidden in this complexity. Our objective is to find such errors, which can be aided by deriving abstraction networks from a large terminology. Abstraction networks preserve important features but eliminate many minor details, which are often not useful for identifying errors. Providing visualizations for such abstraction networks aids auditors by allowing them to quickly focus on elements of interest within a terminology. Previously we introduced area taxonomies and partial area taxonomies for SNOMED CT. In this paper, two advanced, novel kinds of abstraction networks, the relationship-constrained partial area subtaxonomy and the root-constrained partial area subtaxonomy are defined and their benefits are demonstrated. We also describe BLUSNO, an innovative software tool for quickly generating and visualizing these SNOMED CT abstraction networks. BLUSNO is a dynamic, interactive system that provides quick access to well organized information about SNOMED CT. PMID:23304293

  16. FocusStack and StimServer: a new open source MATLAB toolchain for visual stimulation and analysis of two-photon calcium neuronal imaging data.

    Science.gov (United States)

    Muir, Dylan R; Kampa, Björn M

    2014-01-01

    Two-photon calcium imaging of neuronal responses is an increasingly accessible technology for probing population responses in cortex at single cell resolution, and with reasonable and improving temporal resolution. However, analysis of two-photon data is usually performed using ad-hoc solutions. To date, no publicly available software exists for straightforward analysis of stimulus-triggered two-photon imaging experiments. In addition, the increasing data rates of two-photon acquisition systems imply increasing cost of computing hardware required for in-memory analysis. Here we present a Matlab toolbox, FocusStack, for simple and efficient analysis of two-photon calcium imaging stacks on consumer-level hardware, with minimal memory footprint. We also present a Matlab toolbox, StimServer, for generation and sequencing of visual stimuli, designed to be triggered over a network link from a two-photon acquisition system. FocusStack is compatible out of the box with several existing two-photon acquisition systems, and is simple to adapt to arbitrary binary file formats. Analysis tools such as stack alignment for movement correction, automated cell detection and peri-stimulus time histograms are already provided, and further tools can be easily incorporated. Both packages are available as publicly-accessible source-code repositories.

  17. Visual explorer facilitator's guide

    CERN Document Server

    Palus, Charles J

    2010-01-01

    Grounded in research and practice, the Visual Explorer™ Facilitator's Guide provides a method for supporting collaborative, creative conversations about complex issues through the power of images. The guide is available as a component in the Visual Explorer Facilitator's Letter-sized Set, Visual Explorer Facilitator's Post card-sized Set, Visual Explorer Playing Card-sized Set, and is also available as a stand-alone title for purchase to assist multiple tool users in an organization.

  18. Visual Analysis of Air Traffic Data

    Science.gov (United States)

    Albrecht, George Hans; Pang, Alex

    2012-01-01

    In this paper, we present visual analysis tools to help study the impact of policy changes on air traffic congestion. The tools support visualization of time-varying air traffic density over an area of interest using different time granularity. We use this visual analysis platform to investigate how changing the aircraft separation volume can reduce congestion while maintaining key safety requirements. The same platform can also be used as a decision aid for processing requests for unmanned aerial vehicle operations.

  19. jSPyDB, an open source database-independent tool for data management

    CERN Document Server

    Pierro, Giuseppe Antonio

    2010-01-01

    Nowadays, the number of commercial tools available for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only with a specific kind of database, they are platform-dependent and very CPU and memory consuming. jSPyDB is a free web-based tool written using Python and JavaScript. It relies on jQuery and Python libraries, and is intended to provide a simple handler for different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install and configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a backend server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. ...
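
    The server-side, database-agnostic access pattern described here can be sketched with SQLAlchemy; the snippet below is illustrative only (the connection URL, table, and columns are placeholders, not jSPyDB code), and swapping the URL is what lets the same code target different database technologies.

        # Database-agnostic access sketch with SQLAlchemy; placeholder URL and table.
        from sqlalchemy import create_engine, inspect, text

        engine = create_engine("sqlite:///example.db")   # e.g. postgresql://... or oracle://...

        with engine.begin() as conn:                     # data definition/manipulation on the server
            conn.execute(text("CREATE TABLE IF NOT EXISTS runs (id INTEGER PRIMARY KEY, status TEXT)"))
            conn.execute(text("INSERT INTO runs (status) VALUES (:s)"), {"s": "ok"})

        print(inspect(engine).get_table_names())         # what the client would browse
        with engine.connect() as conn:
            for row in conn.execute(text("SELECT id, status FROM runs")):
                print(row.id, row.status)                # results returned to the client for display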

  20. Email-Set Visualization: Facilitating Re-Finding in Email Archives

    OpenAIRE

    Gorton, Douglas; Murthy, Uma; Vemuri, Naga Srinivas; Pérez-Quiñones, Manuel A.

    2007-01-01

    In this paper we describe ESVT – EmailSet Visualization Tool, an email archive tool that provides users a visualization to re-find and discover information in their email archive. ESVT is an end-to-end email archive tool that can be used from archiving a user’s email messages to visualizing queries on the email archive. We address email archiving by allowing import of email messages from an email server or from a standard existing email client. The central idea in ESVT’s visualization, an “em...

  1. FROMS3D: New Software for 3-D Visualization of Fracture Network System in Fractured Rock Masses

    Science.gov (United States)

    Noh, Y. H.; Um, J. G.; Choi, Y.

    2014-12-01

    New software (FROMS3D) is presented for visualizing fracture network systems in 3-D. The software consists of several modules that handle management of borehole and field fracture data, fracture network modelling, visualization of fracture geometry in 3-D, and calculation and visualization of intersections and equivalent pipes between fractures. Intel Parallel Studio XE 2013, Visual Studio.NET 2010 and the open-source VTK library were utilized as development tools to efficiently implement the modules and the graphical user interface of the software. The results suggest that the developed software is effective in visualizing 3-D fracture network systems and can provide useful information for tackling engineering geological problems related to the strength, deformability and hydraulic behavior of fractured rock masses.

  2. Visualizing data mining results with the Brede tools

    DEFF Research Database (Denmark)

    Nielsen, Finn Årup

    2009-01-01

    A few neuroinformatics databases now exist that record results from neuroimaging studies in the form of brain coordinates in stereotaxic space. The Brede Toolbox was originally developed to extract, analyze and visualize data from one of them --- the BrainMap database. Since then the Brede Toolbox has expanded and now includes its own database with coordinates along with ontologies for brain regions and functions: the Brede Database. With Brede Toolbox and Database combined we set up automated workflows for extraction of data, mass meta-analytic data mining and visualizations. Most of the Web…

  3. Development of Environmental Decision Support System: Unifying Cross-Discipline Data Access Through Open Source Tools

    Science.gov (United States)

    Freeman, S.; Darmenova, K.; Higgins, G. J.; Apling, D.

    2012-12-01

    A common theme when it comes to accessing climate and environmental datasets is that it can be difficult to answer the five basic questions: Who, What, When, Where, and Why. Sometimes even the act of locating a data set or determining how it was generated can prove difficult. It is even more challenging for non-scientific individuals such as planners and policy makers who need to access and include such information in their work. Our Environmental Decision Support System (EDSS) attempts to address this issue by integrating several open source packages to create a simple yet robust web application for conglomerating, searching, viewing, and downloading environmental information for both scientists and decision makers alike. The system is comprised of several open source components, each playing an important role in the EDSS. The Geoportal web application provides an intuitive interface for searching and managing metadata ingested from data sets/data sources. The GeoServer and ncWMS web applications provide overlays and information for visual presentations of the data through web mapping services (WMS) by ingesting ESRI shapefiles, NetCDF, and HDF files. Users of the EDSS can browse the catalog of available products, enter a simple search string, or even constrain searches by temporal and spatial extents. Combined with a custom visualization web application, the EDSS provides a simple yet efficient means for users to not only access and manipulate climate and environmental data, but also trace the data source and the analytical methods used in the final decision aids products.
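
    To make the data-access idea concrete, the hedged sketch below shows how a client could pull a map overlay from a WMS endpoint such as the GeoServer/ncWMS services described above, using the OWSLib library; the URL and layer name are placeholders, not the project's actual services.

        # WMS client sketch with OWSLib; placeholder endpoint and layer name.
        from owslib.wms import WebMapService

        wms = WebMapService("https://example.org/geoserver/wms", version="1.3.0")
        print(list(wms.contents))                        # layers advertised by the server

        img = wms.getmap(layers=["demo:temperature"],    # hypothetical layer
                         srs="EPSG:4326",
                         bbox=(-125.0, 24.0, -66.0, 50.0),
                         size=(800, 400),
                         format="image/png",
                         transparent=True)
        with open("overlay.png", "wb") as f:
            f.write(img.read())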

  4. Model-based evaluation of the use of polycyclic aromatic hydrocarbons molecular diagnostic ratios as a source identification tool

    International Nuclear Information System (INIS)

    Katsoyiannis, Athanasios; Breivik, Knut

    2014-01-01

    Polycyclic Aromatic Hydrocarbons (PAHs) molecular diagnostic ratios (MDRs) are unitless concentration ratios of pair-PAHs with the same molecular weight (MW); MDRs have long been used as a tool for PAHs source identification purposes. In the present paper, the efficiency of the MDR methodology is evaluated through the use of a multimedia fate model, the calculation of characteristic travel distances (CTD) and the estimation of air concentrations for individual PAHs as a function of distance from an initial point source. The results show that PAHs with the same MW are sometimes characterized by substantially different CTDs and therefore their air concentrations and hence MDRs are predicted to change as the distance from the original source increases. From the assessed pair-PAHs, the biggest CTD difference is seen for Fluoranthene (107 km) vs. Pyrene (26 km). This study provides a strong indication that MDRs are of limited use as a source identification tool. -- Highlights: • Model-based evaluation of the PAHs molecular diagnostic ratios efficiency. • Individual PAHs are characterized by different characteristic travel distances. • MDRs are proven to be a limited tool for source identification. • Use of MDRs for other environmental media is likely unfeasible. -- PAHs molecular diagnostic ratios which change greatly as a function of distance from the emitting source are improper for source identification purposes
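
    The drift that undermines MDRs can be illustrated with a back-of-the-envelope calculation: if, as a simple approximation, air concentration falls off as c(x) = c0 * exp(-x/CTD), then the Fluoranthene/Pyrene pair with the CTDs quoted above (107 km vs. 26 km) yields a diagnostic ratio that changes markedly with distance. The Python sketch below is purely illustrative and is not the authors' multimedia fate model; the source concentrations are placeholders.

        # Illustrative MDR drift under exponential decay; not the authors' model.
        import numpy as np

        ctd_flt, ctd_pyr = 107.0, 26.0        # km, characteristic travel distances from the abstract
        c0_flt, c0_pyr = 1.0, 1.0             # placeholder source concentrations

        for x in (0, 10, 25, 50, 100):        # km from the source
            c_flt = c0_flt * np.exp(-x / ctd_flt)
            c_pyr = c0_pyr * np.exp(-x / ctd_pyr)
            mdr = c_flt / (c_flt + c_pyr)     # Flt/(Flt+Pyr), a commonly used MDR form
            print(f"{x:4d} km   Flt/(Flt+Pyr) = {mdr:.2f}")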

  5. Independent sources of anisotropy in visual orientation representation: a visual and a cognitive oblique effect.

    Science.gov (United States)

    Balikou, Panagiota; Gourtzelidis, Pavlos; Mantas, Asimakis; Moutoussis, Konstantinos; Evdokimidis, Ioannis; Smyrnis, Nikolaos

    2015-11-01

    The representation of visual orientation is more accurate for cardinal orientations compared to oblique, and this anisotropy has been hypothesized to reflect a low-level visual process (visual, "class 1" oblique effect). The reproduction of directional and orientation information also leads to a mean error away from cardinal orientations or directions. This anisotropy has been hypothesized to reflect a high-level cognitive process of space categorization (cognitive, "class 2," oblique effect). This space categorization process would be more prominent when the visual representation of orientation degrades such as in the case of working memory with increasing cognitive load, leading to increasing magnitude of the "class 2" oblique effect, while the "class 1" oblique effect would remain unchanged. Two experiments were performed in which an array of orientation stimuli (1-4 items) was presented and then subjects had to realign a probe stimulus within the previously presented array. In the first experiment, the delay between stimulus presentation and probe varied, while in the second experiment, the stimulus presentation time varied. The variable error was larger for oblique compared to cardinal orientations in both experiments reproducing the visual "class 1" oblique effect. The mean error also reproduced the tendency away from cardinal and toward the oblique orientations in both experiments (cognitive "class 2" oblique effect). The accuracy of the reproduced orientation degraded (increasing variable error) and the cognitive "class 2" oblique effect increased with increasing memory load (number of items) in both experiments and presentation time in the second experiment. In contrast, the visual "class 1" oblique effect was not significantly modulated by any one of these experimental factors. These results confirmed the theoretical predictions for the two anisotropies in visual orientation reproduction and provided support for models proposing the categorization of

  6. Collaboratively Conceived, Designed and Implemented: Matching Visualization Tools with Geoscience Data Collections and Geoscience Data Collections with Visualization Tools via the ToolMatch Service.

    Science.gov (United States)

    Hoebelheinrich, N. J.; Lynnes, C.; West, P.; Ferritto, M.

    2014-12-01

    Two problems common to many geoscience domains are the difficulties in finding tools to work with a given dataset collection, and conversely, the difficulties in finding data for a known tool. A collaborative team from the Earth Science Information Partnership (ESIP) has gotten together to design and create a web service, called ToolMatch, to address these problems. The team began their efforts by defining an initial, relatively simple conceptual model that addressed the two use cases briefly described above. The conceptual model is expressed as an ontology using OWL (Web Ontology Language) and DCterms (Dublin Core Terms), and utilizing standard ontologies such as DOAP (Description of a Project), FOAF (Friend of a Friend), SKOS (Simple Knowledge Organization System) and DCAT (Data Catalog Vocabulary). The ToolMatch service will be taking advantage of various Semantic Web and Web standards, such as OpenSearch, RESTful web services, SWRL (Semantic Web Rule Language) and SPARQL (Simple Protocol and RDF Query Language). The first version of the ToolMatch service was deployed in early fall 2014. While more complete testing is required, a number of communities besides ESIP member organizations have expressed interest in collaborating to create, test and use the service and incorporate it into their own web pages, tools and/or services, including the USGS Data Catalog service, DataONE, the Deep Carbon Observatory, Virtual Solar Terrestrial Observatory (VSTO), and the U.S. Global Change Research Program. In this session, presenters will discuss the inception and development of the ToolMatch service, the collaborative process used to design, refine, and test the service, and future plans for the service.
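
    As a purely hypothetical illustration of how such a service might be queried over SPARQL, the sketch below uses SPARQLWrapper with the vocabularies named above (DOAP, DCAT, Dublin Core); the endpoint URL and the tm:canOperateOn predicate are invented placeholders, not part of the actual ToolMatch model.

        # Hypothetical SPARQL query sketch; endpoint and tm: predicate are placeholders.
        from SPARQLWrapper import SPARQLWrapper, JSON

        endpoint = SPARQLWrapper("https://example.org/toolmatch/sparql")
        endpoint.setQuery("""
            PREFIX doap: <http://usefulinc.com/ns/doap#>
            PREFIX dcat: <http://www.w3.org/ns/dcat#>
            PREFIX dct:  <http://purl.org/dc/terms/>
            PREFIX tm:   <http://example.org/toolmatch#>

            SELECT ?tool ?name WHERE {
              ?tool a doap:Project ;
                    doap:name ?name ;
                    tm:canOperateOn ?dataset .        # hypothetical predicate
              ?dataset a dcat:Dataset ;
                       dct:format "application/x-netcdf" .
            }
        """)
        endpoint.setReturnFormat(JSON)
        for row in endpoint.query().convert()["results"]["bindings"]:
            print(row["name"]["value"], "->", row["tool"]["value"])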

  7. Screening methods for post-stroke visual impairment: a systematic review.

    Science.gov (United States)

    Hanna, Kerry Louise; Hepworth, Lauren Rachel; Rowe, Fiona

    2017-12-01

    To provide a systematic overview of the various tools available to screen for post-stroke visual impairment. A review of the literature was conducted including randomised controlled trials, controlled trials, cohort studies, observational studies, systematic reviews and retrospective medical note reviews. All languages were included and translation was obtained. Participants included adults ≥18 years old diagnosed with a visual impairment as a direct result of a stroke. We searched a broad range of scholarly online resources and hand-searched article registers of published, unpublished and ongoing trials. Search terms included a variety of MeSH terms and alternatives in relation to stroke and visual conditions. Study selection was performed by two authors independently. The quality of the evidence and risk of bias were assessed using the STROBE, GRACE and PRISMA statements. A total of 25 articles (n = 2924) were included in this review. The articles appraised reported on tools screening solely for visual impairments or for general post-stroke disabilities inclusive of vision. The majority of identified tools screen for visual perception including visual neglect (VN), with few screening for visual acuity (VA), visual field (VF) loss or ocular motility (OM) defects. Six articles reported on nine screening tools which combined visual screening assessment alongside screening for general stroke disabilities. Of these, three included screening for VA; three screened for VF loss; three screened for OM defects and all screened for VN. Two tools screened for all visual impairments. A further 19 articles were found which reported on individual vision screening tests in stroke populations; two for VF loss; 11 for VN and six for other visual perceptual defects. Most tools cannot accurately account for those with aphasia or communicative deficits, which are common problems following a stroke. There is currently no standardised visual screening tool which can accurately

  8. Visual impairment and traits of autism in children.

    Science.gov (United States)

    Wrzesińska, Magdalena; Kapias, Joanna; Nowakowska-Domagała, Katarzyna; Kocur, Józef

    2017-04-30

    Visual impairment present from birth or early childhood may lead to psychosocial and emotional disorders. Between 11% and 40% of children with visual impairment show traits of autism. The aim of this paper was to present selected examples of how visual impairment in children is related to the occurrence of autism and to describe the available tools for diagnosing autism in children with visual impairment. So far the relation between visual impairment in children and autism has not been sufficiently confirmed. Psychiatric and psychological diagnosis of children with visual impairment is complicated by the difficulty of differentiating between "blindism" and traits typical of autism, which results from a lack of standardized diagnostic tools for children with visual impairment. Another difficulty in diagnosing autism in children with visual impairment is the coexistence of other disabilities in most of these children. Additionally, apart from the difficulties in diagnosing autistic disorders in children with eye dysfunctions, there is also the question of what tools should be used in the therapy and rehabilitation of these patients.

  9. NASA's Global Imagery Browse Services - Technologies for Visualizing Earth Science Data

    Science.gov (United States)

    Cechini, M. F.; Boller, R. A.; Baynes, K.; Schmaltz, J. E.; Thompson, C. K.; Roberts, J. T.; Rodriguez, J.; Wong, M. M.; King, B. A.; King, J.; De Luca, A. P.; Pressley, N. N.

    2017-12-01

    For more than 20 years, the NASA Earth Observing System (EOS) has collected earth science data for thousands of scientific parameters now totaling nearly 15 Petabytes of data. In 2013, NASA's Global Imagery Browse Services (GIBS) formed its vision to "transform how end users interact and discover [EOS] data through visualizations." This vision included leveraging scientific and community best practices and standards to provide a scalable, compliant, and authoritative source for EOS earth science data visualizations. Since that time, GIBS has grown quickly and now services millions of daily requests for over 500 imagery layers representing hundreds of earth science parameters to a broad community of users. For many of these parameters, visualizations are available within hours of acquisition from the satellite. For others, visualizations are available for the entire mission of the satellite. The GIBS system is built upon the OnEarth and MRF open source software projects, which are provided by the GIBS team. This software facilitates standards-based access for compliance with existing GIS tools. The GIBS imagery layers are predominantly rasterized images represented in two-dimensional coordinate systems, though multiple projections are supported. The OnEarth software also supports the GIBS ingest pipeline to facilitate low latency updates to new or updated visualizations. This presentation will focus on the following topics: an overview of GIBS visualizations and the user community; current benefits and limitations of the OnEarth and MRF software projects and related standards; GIBS access methods and their (in)compatibilities with existing GIS libraries and applications; considerations for visualization accuracy and understandability; future plans for more advanced visualization concepts, including Vertical Profiles and Vector-Based Representations; and future plans for Amazon Web Service support and deployments.
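
    For readers who want to try the service, the hedged snippet below fetches a single GIBS tile over the WMTS REST interface; the endpoint pattern, layer name, and tile matrix set follow the publicly documented conventions at the time of writing and should be treated as assumptions to verify against the current GIBS documentation.

        # Fetch one GIBS WMTS tile; URL pattern, layer and matrix set are assumptions to verify.
        import requests

        url = ("https://gibs.earthdata.nasa.gov/wmts/epsg4326/best/"
               "MODIS_Terra_CorrectedReflectance_TrueColor/default/"
               "2017-12-01/250m/0/0/0.jpg")   # {layer}/default/{date}/{matrix_set}/{zoom}/{row}/{col}

        resp = requests.get(url, timeout=30)
        resp.raise_for_status()
        with open("gibs_tile.jpg", "wb") as f:
            f.write(resp.content)
        print("saved", len(resp.content), "bytes")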

  10. Linking Science and Management in an Interactive Geospatial, Multi-Criterion, Structured Decision Support Framework: Use Case Studies of the "Future Forests Geo-visualization and Decision Support Tool"

    Science.gov (United States)

    Pontius, J.; Duncan, J.

    2017-12-01

    Land managers are often faced with balancing management activities to accomplish a diversity of management objectives, in systems faced with many stress agents. Advances in ecosystem modeling provide a rich source of information to inform management. Coupled with advances in decision support techniques and computing capabilities, interactive tools are now accessible for a broad audience of stakeholders. Here we present one such tool designed to capture information on how climate change may impact forested ecosystems, and how that impact varies spatially across the landscape. This tool integrates empirical models of current and future forest structure and function in a structured decision framework that allows users to customize weights for multiple management objectives and visualize suitability outcomes across the landscape. Combined with climate projections, the resulting products allow stakeholders to compare the relative success of various management objectives on a pixel by pixel basis and identify locations where management outcomes are most likely to be met. Here we demonstrate this approach with the integration of several of the preliminary models developed to map species distributions, sugar maple health, forest fragmentation risk and hemlock vulnerability to hemlock woolly adelgid under current and future climate scenarios. We compare three use case studies with objective weightings designed to: 1) Identify key parcels for sugarbush conservation and management, 2) Target state lands that may serve as hemlock refugia from hemlock woolly adelgid induced mortality, and 3) Examine how climate change may alter the success of managing for both sugarbush and hemlock across privately owned lands. This tool highlights the value of flexible models that can be easily run with customized weightings in a dynamic, integrated assessment that allows users to hone in on their potentially complex management objectives, and to visualize and prioritize locations across the
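
    The core of such a structured decision framework, weighting normalized objective layers into a single suitability surface, can be sketched in a few lines of Python; the layers and weights below are placeholders for illustration and are not the project's models.

        # Weighted multi-criterion suitability overlay sketch; placeholder layers and weights.
        import numpy as np

        rng = np.random.default_rng(0)
        layers = {                                   # hypothetical per-pixel model outputs
            "sugar_maple_health": rng.random((100, 100)),
            "hemlock_refugia":    rng.random((100, 100)),
            "low_fragmentation":  rng.random((100, 100)),
        }
        weights = {"sugar_maple_health": 0.5, "hemlock_refugia": 0.3, "low_fragmentation": 0.2}

        def normalize(a):
            # Rescale a layer to 0-1 so the user-defined weights are comparable.
            return (a - a.min()) / (a.max() - a.min())

        suitability = sum(w * normalize(layers[k]) for k, w in weights.items())
        flat_order = np.argsort(suitability, axis=None)[::-1][:5]
        rows, cols = np.unravel_index(flat_order, suitability.shape)
        print("top-5 pixels:", list(zip(rows.tolist(), cols.tolist())))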

  11. Data Representations, Transformations, and Statistics for Visual Reasoning

    CERN Document Server

    Maciejewski, Ross

    2011-01-01

    Analytical reasoning techniques are methods by which users explore their data to obtain insight and knowledge that can directly support situational awareness and decision making. Recently, the analytical reasoning process has been augmented through the use of interactive visual representations and tools which utilize cognitive, design and perceptual principles. These tools are commonly referred to as visual analytics tools, and the underlying methods and principles have roots in a variety of disciplines. This chapter provides an introduction to young researchers as an overview of common visual

  12. Getting a handle on virtual tools: An examination of the neuronal activity associated with virtual tool use.

    Science.gov (United States)

    Rallis, Austin; Fercho, Kelene A; Bosch, Taylor J; Baugh, Lee A

    2018-01-31

    Tool use is associated with three visual streams-dorso-dorsal, ventro-dorsal, and ventral visual streams. These streams are involved in processing online motor planning, action semantics, and tool semantics features, respectively. Little is known about the way in which the brain represents virtual tools. To directly assess this question, a virtual tool paradigm was created that provided the ability to manipulate tool components in isolation of one another. During functional magnetic resonance imaging (fMRI), adult participants performed a series of virtual tool manipulation tasks in which vision and movement kinematics of the tool were manipulated. Reaction time and hand movement direction were monitored while the tasks were performed. Functional imaging revealed that activity within all three visual streams was present, in a similar pattern to what would be expected with physical tool use. However, a previously unreported network of right-hemisphere activity was found including right inferior parietal lobule, middle and superior temporal gyri and supramarginal gyrus - regions well known to be associated with tool processing within the left hemisphere. These results provide evidence that both virtual and physical tools are processed within the same brain regions, though virtual tools recruit bilateral tool processing regions to a greater extent than physical tools. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. Constructing visual representations

    DEFF Research Database (Denmark)

    Huron, Samuel; Jansen, Yvonne; Carpendale, Sheelagh

    2014-01-01

    The accessibility of infovis authoring tools to a wide audience has been identified as a major research challenge. A key task in the authoring process is the development of visual mappings. While the infovis community has long been deeply interested in finding effective visual mappings, comparatively little attention has been placed on how people construct visual mappings. In this paper, we present the results of a study designed to shed light on how people transform data into visual representations. We asked people to create, update and explain their own information visualizations using only tangible building blocks. We learned that all participants, most of whom had little experience in visualization authoring, were readily able to create and talk about their own visualizations. Based on our observations, we discuss participants' actions during the development of their visual representations…

  14. Experiences in using DISCUS for visualizing human communication

    Science.gov (United States)

    Groehn, Matti; Nieminen, Marko; Haho, Paeivi; Smeds, Riitta

    2000-02-01

    In this paper, we present further improvements to the DISCUS software, which can be used to record and analyze the flow and content of business process simulation session discussions. The tool was initially introduced at the 'Visual Data Exploration and Analysis IV' conference. The initial features of the tool enabled the visualization of discussion flow in business process simulation sessions and the creation of SOM analyses. The improvements to the tool consist of additional visualization possibilities that enable quick on-line analyses and improved graphical statistics. We have also created the first interface to audio data and implemented two ways to visualize it. We also outline additional possibilities for using the tool in other application areas: these include usability testing and capturing design rationale in a product development process. The data gathered with DISCUS may be used in other applications, and further work may be done with data mining techniques.

  15. Sensitivity to the visual field origin of natural image patches in human low-level visual cortex

    Directory of Open Access Journals (Sweden)

    Damien J. Mannion

    2015-06-01

    Asymmetries in the response to visual patterns in the upper and lower visual fields (above and below the centre of gaze) have been associated with ecological factors relating to the structure of typical visual environments. Here, we investigated whether the content of the upper and lower visual field representations in low-level regions of human visual cortex are specialised for visual patterns that arise from the upper and lower visual fields in natural images. We presented image patches, drawn from above or below the centre of gaze of an observer navigating a natural environment, to either the upper or lower visual fields of human participants (n = 7) while we used functional magnetic resonance imaging (fMRI) to measure the magnitude of evoked activity in the visual areas V1, V2, and V3. We found a significant interaction between the presentation location (upper or lower visual field) and the image patch source location (above or below fixation); the responses to lower visual field presentation were significantly greater for image patches sourced from below than above fixation, while the responses in the upper visual field were not significantly different for image patches sourced from above and below fixation. This finding demonstrates an association between the representation of the lower visual field in human visual cortex and the structure of the visual input that is likely to be encountered below the centre of gaze.

  16. Visualizing the Heliosphere

    Science.gov (United States)

    Bridgman, William T.; Shirah, Greg W.; Mitchell, Horace G.

    2008-01-01

    Today, scientific data and models can combine with modern animation tools to produce compelling visualizations to inform and educate. The Scientific Visualization Studio at Goddard Space Flight Center merges these techniques from the very different worlds of entertainment and science to enable scientists and the general public to 'see the unseeable' in new ways.

  17. Planetary SUrface Portal (PSUP): a tool for easy visualization and analysis of Martian surface

    Science.gov (United States)

    Poulet, Francois; Quantin-Nataf, Cathy; Ballans, Hervé; Lozac'h, Loic; Audouard, Joachim; Carter, John; Dassas, Karin; Malapert, Jean-Christophe; Marmo, Chiara; Poulleau, Gilles; Riu, Lucie; Séjourné, Antoine

    2016-10-01

    PSUP comprises two software application platforms for working with raster, vector, DTM, and hyperspectral data acquired by various space instruments analyzing the surface of Mars from orbit. The first platform of PSUP is MarsSI (Martian surface data processing Information System, http://emars.univ-lyon1.fr). It provides data analysis functionalities to select and download ready-to-use products or to process data through specific and validated pipelines. To date, MarsSI handles CTX, HiRISE and CRISM data of the NASA/MRO mission, HRSC and OMEGA data of the ESA/MEx mission and THEMIS data of the NASA/ODY mission (Lozac'h et al., EPSC 2015). The second part of PSUP is also open to the scientific community and can be visited at http://psup.ias.u-psud.fr/. This web-based user interface provides access to many data products for Mars: image footprints and rasters from the MarsSI tool; compositional maps from OMEGA and TES; albedo and thermal inertia from OMEGA and TES; mosaics from THEMIS, Viking, and CTX; and high-level specific products (defined as catalogues) such as hydrated mineral sites derived from CRISM and OMEGA data, central peaks mineralogy,… In addition, OMEGA C channel data cubes corrected for atmospheric and aerosol contributions can be downloaded. The architecture of PSUP data management and visualization is based on SITools2 and MIZAR, two CNES generic tools developed through a joint effort between CNES and scientific laboratories. SITools2 provides a self-manageable data access layer deployed on the PSUP data, while MIZAR is a 3D application in a browser for discovering and visualizing geospatial data. Further developments, including the addition of high-level products for Mars (regional geological maps, new global compositional maps,…), are foreseen. Ultimately, PSUP will be adapted to other planetary surfaces and space missions in which French research institutes are involved.

  18. Scientific visualization of gravitational lenses

    International Nuclear Information System (INIS)

    Magallon, M.

    1999-01-01

    Concepts related to gravitational lenses are discussed and applied to develop an interactive visualization tool that allows us to investigate them. Optimization strategies were applied in building the tool. Some results obtained from applying the tool are shown.

  19. The Systems Biology Research Tool: evolvable open-source software

    Directory of Open Access Journals (Sweden)

    Wright Jeremiah

    2008-06-01

    Background: Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system-level. Results: We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion: The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.

  20. EpiTools: An Open-Source Image Analysis Toolkit for Quantifying Epithelial Growth Dynamics.

    Science.gov (United States)

    Heller, Davide; Hoppe, Andreas; Restrepo, Simon; Gatti, Lorenzo; Tournier, Alexander L; Tapon, Nicolas; Basler, Konrad; Mao, Yanlan

    2016-01-11

    Epithelia grow and undergo extensive rearrangements to achieve their final size and shape. Imaging the dynamics of tissue growth and morphogenesis is now possible with advances in time-lapse microscopy, but a true understanding of their complexities is limited by the automated image analysis tools available for extracting quantitative data. To overcome such limitations, we have designed a new open-source image analysis toolkit called EpiTools. It provides user-friendly graphical user interfaces for accurately segmenting and tracking the contours of cell membrane signals obtained from 4D confocal imaging. It is designed for a broad audience, especially biologists with no computer-science background. Quantitative data extraction is integrated into a larger bioimaging platform, Icy, to increase the visibility and usability of our tools. We demonstrate the usefulness of EpiTools by analyzing Drosophila wing imaginal disc growth, revealing previously overlooked properties of this dynamic tissue, such as the patterns of cellular rearrangements. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  1. The visual in sport history: approaches, methodologies and sources

    OpenAIRE

    Huggins, Mike

    2015-01-01

    Historians of sport now increasingly accept that visual inquiry offers another dimension to social and cultural research into sport and its history. It is complex and its boundaries are rapidly evolving. This overview offers a justification for placing more emphasis on visual approaches and an introduction to the study and interpretation of visual culture in relation to the history of sport. It stresses the importance of adopting a critical approach and the need to be reflective about that cr...

  2. Interactive visualization tools for the structural biologist.

    Science.gov (United States)

    Porebski, Benjamin T; Ho, Bosco K; Buckle, Ashley M

    2013-10-01

    In structural biology, management of a large number of Protein Data Bank (PDB) files and raw X-ray diffraction images often presents a major organizational problem. Existing software packages that manipulate these file types were not designed for these kinds of file-management tasks. This is typically encountered when browsing through a folder of hundreds of X-ray images, with the aim of rapidly inspecting the diffraction quality of a data set. To solve this problem, a useful functionality of the Macintosh operating system (OSX) has been exploited that allows custom visualization plugins to be attached to certain file types. Software plugins have been developed for diffraction images and PDB files, which in many scenarios can save considerable time and effort. The direct visualization of diffraction images and PDB structures in the file browser can be used to identify key files of interest simply by scrolling through a list of files.

  3. Towards a Synesthesia Laboratory: Real-time Localization and Visualization of a Sound Source for Virtual Reality Applications

    OpenAIRE

    Kose, Ahmet; Tepljakov, Aleksei; Astapov, Sergei; Draheim, Dirk; Petlenkov, Eduard; Vassiljeva, Kristina

    2018-01-01

    In this paper, we present our findings related to the problem of localization and visualization of a sound source placed in the same room as the listener. The particular effect that we aim to investigate is called synesthesia—the act of experiencing one sense modality as another, e.g., a person may vividly experience flashes of colors when listening to a series of sounds. Towards that end, we apply a series of recently developed methods for detecting sound source in a three-dimensional space ...

  4. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    Science.gov (United States)

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  5. Windmill Noise Annoyance, Visual Aesthetics, and Attitudes towards Renewable Energy Sources

    Directory of Open Access Journals (Sweden)

    Ronny Klæboe

    2016-07-01

    A small focused socio-acoustic after-study of annoyance from a windmill park was undertaken after local health officials demanded a health impact study to look into neighborhood complaints. The windmill park consists of 31 turbines and is located in the South of Norway where it affects 179 dwellings. Simple exposure-effect relationships indicate stronger reactions to windmills and wind turbine noise than shown internationally, with the caveat that the sample size is small (n = 90) and responses are colored by the existing local conflict. Pulsating swishing sounds and turbine engine hum are the main causes of noise annoyance. About 60 per cent of those who participated in the survey were of the opinion that windmills degrade the landscape aesthetically, and were far from convinced that land-based windmills are desirable as a renewable energy source (hydropower is an important alternative source of renewables in Norway). Attitudes play an important role in addition to visual aesthetics in determining the acceptance of windmills and the resulting noise annoyance. To compare results from different wind turbine noise studies it seems necessary to assess the impact of important modifying factors.

  6. Windmill Noise Annoyance, Visual Aesthetics, and Attitudes towards Renewable Energy Sources

    Science.gov (United States)

    Klæboe, Ronny; Sundfør, Hanne Beate

    2016-01-01

    A small focused socio-acoustic after-study of annoyance from a windmill park was undertaken after local health officials demanded a health impact study to look into neighborhood complaints. The windmill park consists of 31 turbines and is located in the South of Norway where it affects 179 dwellings. Simple exposure-effect relationships indicate stronger reactions to windmills and wind turbine noise than shown internationally, with the caveat that the sample size is small (n = 90) and responses are colored by the existing local conflict. Pulsating swishing sounds and turbine engine hum are the main causes of noise annoyance. About 60 per cent of those who participated in the survey were of the opinion that windmills degrade the landscape aesthetically, and were far from convinced that land-based windmills are desirable as a renewable energy source (hydropower is an important alternative source of renewables in Norway). Attitudes play an important role in addition to visual aesthetics in determining the acceptance of windmills and the resulting noise annoyance. To compare results from different wind turbine noise studies it seems necessary to assess the impact of important modifying factors. PMID:27455301

  7. Thermal Insulation System Analysis Tool (TISTool) User's Manual. Version 1.0.0

    Science.gov (United States)

    Johnson, Wesley; Fesmire, James; Leucht, Kurt; Demko, Jonathan

    2010-01-01

    The Thermal Insulation System Analysis Tool (TISTool) was developed starting in 2004 by Jonathan Demko and James Fesmire. The first edition was written in Excel as Visual Basic macros. It included the basic shapes such as a flat plate, cylinder, dished head, and sphere. The data came from several KSC tests that were already in the public literature realm, as well as data from NIST and other highly respectable sources. More recently, the tool has been updated with more test data from the Cryogenics Test Laboratory, and the tank shape was added. Additionally, the tool was converted to FORTRAN 95 to allow for easier distribution of the material and tool. This document reviews the user instructions for the operation of this system.

  8. Visualizing the Limits of Low Vision in Detecting Natural Image Features

    NARCIS (Netherlands)

    Hogervorst, M.A.; Damme, W.J.M. van

    2008-01-01

    Purpose. The purpose of our study was to develop a tool to visualize the limitations posed by visual impairments in detecting small and low-contrast elements in natural images. This visualization tool incorporates existing models of several aspects of visual perception, such as the band-limited

  9. Visualization of RNA structure models within the Integrative Genomics Viewer.

    Science.gov (United States)

    Busan, Steven; Weeks, Kevin M

    2017-07-01

    Analyses of the interrelationships between RNA structure and function are increasingly important components of genomic studies. The SHAPE-MaP strategy enables accurate RNA structure probing and realistic structure modeling of kilobase-length noncoding RNAs and mRNAs. Existing tools for visualizing RNA structure models are not suitable for efficient analysis of long, structurally heterogeneous RNAs. In addition, structure models are often advantageously interpreted in the context of other experimental data and gene annotation information, for which few tools currently exist. We have developed a module within the widely used and well supported open-source Integrative Genomics Viewer (IGV) that allows visualization of SHAPE and other chemical probing data, including raw reactivities, data-driven structural entropies, and data-constrained base-pair secondary structure models, in context with linear genomic data tracks. We illustrate the usefulness of visualizing RNA structure in the IGV by exploring structure models for a large viral RNA genome, comparing bacterial mRNA structure in cells with its structure under cell- and protein-free conditions, and comparing a noncoding RNA structure modeled using SHAPE data with a base-pairing model inferred through sequence covariation analysis. © 2017 Busan and Weeks; Published by Cold Spring Harbor Laboratory Press for the RNA Society.

  10. A Benchmarking Analysis of Open-Source Business Intelligence Tools in Healthcare Environments

    Directory of Open Access Journals (Sweden)

    Andreia Brandão

    2016-10-01

    In recent years, a wide range of Business Intelligence (BI) technologies have been applied to different areas in order to support the decision-making process. BI enables the extraction of knowledge from the data stored. The healthcare industry is no exception, and so BI applications have been under investigation across multiple units of different institutions. Thus, in this article, we intend to analyze some open-source/free BI tools on the market and their applicability in the clinical sphere, taking into consideration the general characteristics of the clinical environment. For this purpose, six BI tools were selected, analyzed, and tested in a practical environment. Then, a comparison metric and a ranking were defined for the tested applications in order to choose the one that best applies to the extraction of useful knowledge and clinical data in a healthcare environment. Finally, a pervasive BI platform was developed using a real case in order to prove the tool's viability.

  11. EPIPOI: A user-friendly analytical tool for the extraction and visualization of temporal parameters from epidemiological time series

    Directory of Open Access Journals (Sweden)

    Alonso Wladimir J

    2012-11-01

    Background: There is an increasing need for processing and understanding relevant information generated by the systematic collection of public health data over time. However, the analysis of those time series usually requires advanced modeling techniques, which are not necessarily mastered by staff, technicians and researchers working on public health and epidemiology. Here a user-friendly tool, EPIPOI, is presented that facilitates the exploration and extraction of parameters describing trends, seasonality and anomalies that characterize epidemiological processes. It also enables the inspection of those parameters across geographic regions. Although the visual exploration and extraction of relevant parameters from time series data is crucial in epidemiological research, until now it had been largely restricted to specialists. Methods: EPIPOI is freely available software developed in Matlab (The MathWorks Inc.) that runs both on PC and Mac computers. Its friendly interface guides users intuitively through useful comparative analyses including the comparison of spatial patterns in temporal parameters. Results: EPIPOI is able to handle complex analyses in an accessible way. A prototype has already been used to assist researchers in a variety of contexts from didactic use in public health workshops to the main analytical tool in published research. Conclusions: EPIPOI can assist public health officials and students to explore time series data using a broad range of sophisticated analytical and visualization tools. It also provides an analytical environment where even advanced users can benefit by enabling a higher degree of control over model assumptions, such as those associated with detecting disease outbreaks and pandemics.
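
    The seasonality parameters EPIPOI extracts can be approximated with a simple harmonic regression, sketched below in Python rather than MATLAB purely for illustration; the weekly incidence series is synthetic, and the approach (least-squares fit of annual and semi-annual sine/cosine terms) is a generic technique, not the tool's exact implementation.

        # Harmonic-regression sketch for extracting seasonality; synthetic data.
        import numpy as np

        rng = np.random.default_rng(1)
        t = np.arange(520) / 52.0                                  # ten years of weekly data, in years
        series = 100 + 30 * np.cos(2 * np.pi * (t - 0.1)) + rng.normal(0, 5, t.size)

        # Design matrix: intercept, linear trend, annual and semi-annual harmonics.
        X = np.column_stack([np.ones_like(t), t,
                             np.cos(2 * np.pi * t), np.sin(2 * np.pi * t),
                             np.cos(4 * np.pi * t), np.sin(4 * np.pi * t)])
        beta, *_ = np.linalg.lstsq(X, series, rcond=None)

        amplitude = np.hypot(beta[2], beta[3])                     # annual amplitude
        peak = (np.arctan2(beta[3], beta[2]) / (2 * np.pi)) % 1.0  # peak timing, fraction of year
        print(f"annual amplitude ~ {amplitude:.1f}, peak ~ {peak:.2f} of the year")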

  12. An online model composition tool for system biology models.

    Science.gov (United States)

    Coskun, Sarp A; Cicek, A Ercument; Lai, Nicola; Dash, Ranjan K; Ozsoyoglu, Z Meral; Ozsoyoglu, Gultekin

    2013-09-05

    There are multiple representation formats for Systems Biology computational models, and the Systems Biology Markup Language (SBML) is one of the most widely used. SBML is used to capture, store, and distribute computational models by Systems Biology data sources (e.g., the BioModels Database) and researchers. Therefore, there is a need for all-in-one web-based solutions that support advanced SBML functionalities such as uploading, editing, composing, visualizing, simulating, querying, and browsing computational models. We present the design and implementation of the Model Composition Tool (Interface) within the PathCase-SB (PathCase Systems Biology) web portal. The tool helps users compose systems biology models to facilitate the complex process of merging systems biology models. We also present three tools that support the model composition tool, namely, (1) Model Simulation Interface that generates a visual plot of the simulation according to user's input, (2) iModel Tool as a platform for users to upload their own models to compose, and (3) SimCom Tool that provides a side-by-side comparison of models being composed in the same pathway. Finally, we provide a web site that hosts BioModels Database models and a separate web site that hosts SBML Test Suite models. Model composition tool (and the other three tools) can be used with little or no knowledge of the SBML document structure. For this reason, students or anyone who wants to learn about systems biology will benefit from the described functionalities. SBML Test Suite models will be a nice starting point for beginners. And, for more advanced purposes, users will be able to access and employ models of the BioModels Database as well.
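
    A first step in composing two models, finding the species they share, can be sketched with the python-libsbml bindings; the snippet below is a generic illustration (the file names are placeholders) and is not PathCase-SB code.

        # python-libsbml sketch: list species shared by two SBML models (placeholder files).
        import libsbml

        def species_ids(path):
            doc = libsbml.readSBML(path)
            model = doc.getModel()
            if model is None:
                raise ValueError(f"could not read an SBML model from {path}")
            return {model.getSpecies(i).getId() for i in range(model.getNumSpecies())}

        a = species_ids("glycolysis.xml")      # placeholder model files
        b = species_ids("tca_cycle.xml")
        print("shared species:", sorted(a & b))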

  13. Web-based, GPU-accelerated, Monte Carlo simulation and visualization of indirect radiation imaging detector performance

    Energy Technology Data Exchange (ETDEWEB)

    Dong, Han; Sharma, Diksha; Badano, Aldo, E-mail: aldo.badano@fda.hhs.gov [Division of Imaging, Diagnostics, and Software Reliability, Center for Devices and Radiological Health, U.S. Food and Drug Administration, Silver Spring, Maryland 20993 (United States)

    2014-12-15

    Purpose: Monte Carlo simulations play a vital role in understanding the fundamental limitations, design, and optimization of existing and emerging medical imaging systems. Efforts in this area have resulted in the development of a wide variety of open-source software packages. One such package, hybridMANTIS, uses a novel hybrid concept to model indirect scintillator detectors by balancing the computational load between dual CPU and graphics processing unit (GPU) processors, obtaining computational efficiency with reasonable accuracy. In this work, the authors describe two open-source visualization interfaces, webMANTIS and visualMANTIS, that facilitate the setup of computational experiments via hybridMANTIS. Methods: The visualization tools visualMANTIS and webMANTIS enable the user to control simulation properties through a user interface. In the case of webMANTIS, control via a web browser allows access through mobile devices such as smartphones or tablets. webMANTIS acts as a server back-end and communicates with an NVIDIA GPU computing cluster that can support multiuser environments where users execute different experiments in parallel. Results: The output consists of the point response, pulse-height spectrum, and optical transport statistics generated by hybridMANTIS. Users can download the output images and statistics as a zip file for future reference. In addition, webMANTIS provides a visualization window that displays a few selected optical photon paths as they are transported through the detector columns and allows the user to trace the history of the optical photons. Conclusions: The visualization tools visualMANTIS and webMANTIS provide features such as on-the-fly generation of pulse-height spectra and response functions for microcolumnar x-ray imagers, while allowing users to save simulation parameters and results from prior experiments. The graphical interfaces simplify the simulation setup and allow the user to go directly from specifying
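
    hybridMANTIS and its interfaces are described only at a high level here, so the following is merely a toy Monte Carlo sketch of how a pulse-height spectrum for an indirect (scintillator) detector can be generated: x-ray interaction depths are sampled from an exponential distribution, and each optical photon is counted as collected only if its sampled path exceeds the distance to the exit surface. All parameter values are assumptions chosen for illustration and do not correspond to hybridMANTIS defaults.

    ```python
    # Toy Monte Carlo sketch (not hybridMANTIS code): pulse-height spectrum of an
    # indirect scintillator detector. All parameters are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(42)

    THICKNESS_UM = 500.0          # scintillator column height (assumed)
    XRAY_MFP_UM = 150.0           # x-ray mean free path in the scintillator (assumed)
    OPTICAL_YIELD = 1000          # optical photons per absorbed x ray (assumed)
    OPTICAL_ABS_LENGTH_UM = 2000  # optical bulk absorption length (assumed)

    def pulse_height_spectrum(n_xrays=10_000):
        """Collected-photon counts per absorbed x ray (a crude pulse-height spectrum)."""
        depth = rng.exponential(XRAY_MFP_UM, n_xrays)
        depth = depth[depth < THICKNESS_UM]          # keep x rays absorbed in the column
        counts = []
        for z in depth:
            # A photon is "collected" if its sampled absorption-free path exceeds
            # the remaining distance to the exit surface of the column.
            paths = rng.exponential(OPTICAL_ABS_LENGTH_UM, OPTICAL_YIELD)
            counts.append(int(np.count_nonzero(paths > (THICKNESS_UM - z))))
        return np.asarray(counts)

    spectrum = pulse_height_spectrum()
    print(f"mean collected photons per event: {spectrum.mean():.0f}")
    ```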

  15. Open Source Tools for Numerical Simulation of Urban Greenhouse Gas Emissions

    Science.gov (United States)

    Nottrott, A.; Tan, S. M.; He, Y.

    2016-12-01

    vertical dispersion of the plume due to building wake effects. The OpenFOAM flow fields were combined with an inverse, stochastic dispersion model to quantify and visualize the sensitivity of point sensors to upwind sources in various built environments.
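
    Only the tail of this abstract survives, but the inverse step it mentions can be sketched generically: with a sensitivity ("footprint") matrix relating upwind sources to point-sensor concentrations, source strengths follow from a least-squares inversion. The matrix and measurement values below are synthetic and purely illustrative; this is not the authors' model.

    ```python
    # Generic sketch of an inverse dispersion step (not the authors' code): estimate
    # source strengths q from sensor readings y given a footprint matrix F, y = F q.
    import numpy as np

    rng = np.random.default_rng(1)

    n_sensors, n_sources = 8, 3
    footprint = rng.uniform(0.0, 1.0, (n_sensors, n_sources))  # sensitivity of each sensor to each source
    true_emissions = np.array([10.0, 4.0, 1.5])                 # assumed source strengths
    measurements = footprint @ true_emissions + rng.normal(0.0, 0.1, n_sensors)

    estimated, *_ = np.linalg.lstsq(footprint, measurements, rcond=None)
    print("estimated source strengths:", np.round(estimated, 2))
    ```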

  16. Feature Usage Explorer: Usage Monitoring and Visualization Tool in HTML5 Based Applications

    Directory of Open Access Journals (Sweden)

    Sarunas Marciuska

    2013-10-01

    Feature Usage Explorer is a JavaScript library that automatically detects features in HTML5-based applications and monitors their usage. The collected information can be visualized in a Feature Usage Diagram, which is automatically generated from an input JSON file. Currently, users of Feature Usage Explorer have to build their own tooling to generate the JSON file from the collected usage information; this design keeps the library from constraining the user's choice of preferred data storage. Feature Usage Explorer can be reused in any HTML5-based application where an understanding of how users interact with the system is required (e.g., user experience and usability studies, human-computer interaction research, or requirements prioritization).
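
    Because the library deliberately leaves storage to the user, the shape of the collected data is not fixed; the sketch below therefore assumes a hypothetical event layout and shows how raw usage events could be aggregated into the per-feature counts a Feature Usage Diagram is generated from.

    ```python
    # Illustrative sketch only: the event layout and file names are hypothetical,
    # since Feature Usage Explorer leaves the storage format to the user.
    import json
    from collections import Counter

    raw_events = """
    [
      {"feature": "search", "timestamp": "2013-10-01T10:00:00Z"},
      {"feature": "export", "timestamp": "2013-10-01T10:02:00Z"},
      {"feature": "search", "timestamp": "2013-10-01T10:05:00Z"}
    ]
    """

    usage = Counter(event["feature"] for event in json.loads(raw_events))

    # Write the aggregated counts as an input file for diagram generation.
    with open("feature_usage.json", "w") as fh:
        json.dump([{"feature": f, "count": c} for f, c in usage.items()], fh, indent=2)

    print(dict(usage))
    ```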

  17. Interactive data visualization foundations, techniques, and applications

    CERN Document Server

    Ward, Matthew; Keim, Daniel

    2010-01-01

    Visualization is the process of representing data, information, and knowledge in a visual form to support the tasks of exploration, confirmation, presentation, and understanding. This book is designed as a textbook for students, researchers, analysts, professionals, and designers of visualization techniques, tools, and systems. It covers the full spectrum of the field, including mathematical and analytical aspects, ranging from its foundations to human visual perception; from coded algorithms for different types of data, information and tasks to the design and evaluation of new visualization techniques. Sample programs are provided as starting points for building one's own visualization tools. Numerous data sets have been made available that highlight different application areas and allow readers to evaluate the strengths and weaknesses of different visualization methods. Exercises, programming projects, and related readings are given for each chapter. The book concludes with an examination of several existin...

  18. Cross-Dataset Analysis and Visualization Driven by Expressive Web Services

    Science.gov (United States)

    Alexandru Dumitru, Mircea; Catalin Merticariu, Vlad

    2015-04-01

    The deluge of data hitting us every day from satellite and airborne sensors is changing the workflow of environmental data analysts and modelers. Web geo-services now play a fundamental role: the data no longer need to be downloaded and stored beforehand, but are instead accessed in real time by GIS applications. Because of the very large amount of data that is curated and made available by web services, it is crucial to deploy smart solutions for optimizing network bandwidth, reducing duplication of data, and moving the processing closer to the data. In this context we have created a visualization application for the analysis and cross-comparison of aerosol optical thickness datasets. The application aims to help researchers identify and visualize discrepancies between datasets coming from various sources with different spatial and temporal resolutions. It also acts as a proof of concept for the integration of OGC Web Services under a user-friendly interface that provides beautiful visualizations of the explored data. The tool was built on top of the World Wind engine, a Java-based virtual globe built by NASA and the open-source community. For data retrieval and processing we exploited the OGC Web Coverage Service (WCS), in particular its processing extension, the OGC Web Coverage Processing Service (WCPS) standard. A WCPS-compliant service allows a client to execute a processing query on any coverage offered by the server. By exploiting a full grammar, several different kinds of information can be retrieved from one or more datasets together: scalar condensers, cross-sectional profiles, comparison maps and plots, and so on. This combination of technologies makes the application versatile and portable. As the processing is done on the server side, we ensured that a minimal amount of data is transferred and that the processing is done on a fully capable server, leaving the client hardware resources free for rendering the visualization.
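
    As an illustration of how such server-side processing is invoked, the sketch below sends a WCPS query through the WCS ProcessCoverages request. The endpoint URL and coverage name are hypothetical, and the exact query syntax accepted may vary between server implementations; this is a sketch, not the application's actual code.

    ```python
    # Hedged sketch: ask a WCPS-capable WCS server to evaluate a scalar condenser
    # (the mean of a coverage) so that only the result, not the data, is transferred.
    import requests

    ENDPOINT = "https://example.org/rasdaman/ows"                  # hypothetical service URL
    QUERY = "for $c in (AerosolOpticalThickness) return avg($c)"   # hypothetical coverage name

    response = requests.get(
        ENDPOINT,
        params={
            "service": "WCS",
            "version": "2.0.1",
            "request": "ProcessCoverages",
            "query": QUERY,
        },
        timeout=30,
    )
    response.raise_for_status()
    print("server-side mean:", response.text)
    ```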

  19. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    Science.gov (United States)

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and presents the model's performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model developed for a systematically integrated forward osmosis-nanofiltration process for the removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values have been observed to corroborate well with extensive experimental investigations and were found to be consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. A low relative error (RE = 0.09) and a high Willmott index of agreement (d = 0.981) reflect a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for the system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
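
    The two goodness-of-fit measures quoted above are standard and easy to reproduce; the sketch below computes them for synthetic observed/predicted pairs. The original tool is written in Visual Basic, so this Python snippet only illustrates the metrics, not the software itself.

    ```python
    # Sketch of the reported goodness-of-fit metrics; the observed/predicted values
    # below are synthetic and for illustration only.
    import numpy as np

    def relative_error(observed, predicted):
        """Mean relative error, mean(|P - O| / O)."""
        observed, predicted = np.asarray(observed, float), np.asarray(predicted, float)
        return np.mean(np.abs(predicted - observed) / observed)

    def willmott_d(observed, predicted):
        """Willmott index of agreement: d = 1 - sum((P-O)^2) / sum((|P-Obar| + |O-Obar|)^2)."""
        observed, predicted = np.asarray(observed, float), np.asarray(predicted, float)
        o_bar = observed.mean()
        num = np.sum((predicted - observed) ** 2)
        den = np.sum((np.abs(predicted - o_bar) + np.abs(observed - o_bar)) ** 2)
        return 1.0 - num / den

    observed = np.array([0.82, 0.75, 0.91, 0.68, 0.88])   # e.g., measured removal fractions
    predicted = np.array([0.80, 0.78, 0.90, 0.70, 0.86])  # model-predicted values
    print(f"RE = {relative_error(observed, predicted):.3f}, d = {willmott_d(observed, predicted):.3f}")
    ```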

  20. Interactive Visual Intervention Planning: Interactive Visualization for Intervention Planning in Particle Accelerator Environments with Ionizing Radiation

    CERN Document Server

    Fabry, Thomas; Feral, Bruno

    2013-01-01

    Intervention planning is crucial for maintenance operations in particle accelerator environments with ionizing radiation, during which the radiation dose contracted by maintenance workers should be reduced to a minimum. In this context, we discuss the visualization aspects of a new software tool that integrates interactive exploration of a scene depicting an accelerator facility, augmented with residual radiation level simulations, with the visualization of intervention data such as the followed trajectory and the maintenance tasks. The visualization of each of these aspects affects the final predicted contracted radiation dose. Against this background, we explore the possible benefits of a user study aimed at enhancing the visual conditions under which the intervention planner uses the software tool to minimize the radiation dose.
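
    The quantity being minimized, the predicted dose for a planned intervention, can be sketched as the dose rate at each trajectory waypoint multiplied by the dwell time there. The dose-rate field and waypoints below are synthetic illustrative values, not output of the CERN tool.

    ```python
    # Conceptual sketch (not the CERN tool): accumulate the predicted dose along a
    # planned trajectory as dose-rate x dwell-time per waypoint. All values are synthetic.
    import numpy as np

    def dose_rate(x, y):
        """Toy residual dose-rate map in uSv/h with a single hot spot at (2 m, 3 m)."""
        return 500.0 * np.exp(-((x - 2.0) ** 2 + (y - 3.0) ** 2))

    # Planned trajectory: (x [m], y [m], dwell time [h]) for each waypoint or task.
    trajectory = [
        (0.0, 0.0, 0.05),   # walk in
        (2.5, 2.5, 0.50),   # maintenance task near the hot spot
        (0.0, 0.0, 0.05),   # walk out
    ]

    predicted_dose = sum(dose_rate(x, y) * t for x, y, t in trajectory)
    print(f"predicted dose for this plan: {predicted_dose:.1f} uSv")
    ```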