WorldWideScience

Sample records for r-package graphic user

  1. fgui: A Method for Automatically Creating Graphical User Interfaces for Command-Line R Packages.

    Science.gov (United States)

    Hoffmann, Thomas J; Laird, Nan M

    2009-04-01

    The fgui R package is designed for developers of R packages, to help rapidly, and sometimes fully automatically, create a graphical user interface for a command-line R package. The interface is built upon the Tcl/Tk graphical interface included in R. The package further assists the developer by loading the help files from the command-line functions to provide context-sensitive help to the user with no additional effort from the developer. Passing a function as the argument to the routines in the fgui package creates a graphical interface for the function, and further options are available to tweak this interface for those who want more flexibility.
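
    A minimal sketch of the pattern described above, assuming fgui's gui() routine as the entry point; the example function and its arguments are illustrative, not from the paper:

```r
library(fgui)

# An ordinary command-line function such as a package might export
power_calc <- function(n = 100, effect = 0.5, alpha = 0.05) {
  power.t.test(n = n, delta = effect, sig.level = alpha)$power
}

# Passing the function to gui() builds a Tcl/Tk dialog with one input per
# argument; help text can be pulled from the function's Rd documentation.
gui(power_calc)
```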

  2. Glotaran: A Java-Based Graphical User Interface for the R Package TIMP

    NARCIS (Netherlands)

    Snellenburg, J.J.; Laptenok, S.; Seger, R.; Mullen, K.M.; Stokkum, van I.H.M.

    2012-01-01

    In this work the software application called Glotaran is introduced as a Java-based graphical user interface to the R package TIMP, a problem-solving environment for fitting superposition models to multi-dimensional data. TIMP uses a command-line user interface for the interaction with data, the specification of models and viewing of analysis results.

  3. Glotaran: A Java-Based Graphical User Interface for the R Package TIMP

    Directory of Open Access Journals (Sweden)

    Katharine M. Mullen

    2012-06-01

    In this work the software application called Glotaran is introduced as a Java-based graphical user interface to the R package TIMP, a problem-solving environment for fitting superposition models to multi-dimensional data. TIMP uses a command-line user interface for the interaction with data, the specification of models and viewing of analysis results. Instead, Glotaran provides a graphical user interface which features interactive and dynamic data inspection, easier model specification (assisted by the user interface) and interactive viewing of results. The interactivity component is especially helpful when working with large, multi-dimensional datasets such as those that often result from time-resolved spectroscopy measurements, allowing the user to easily pre-select and manipulate data before analysis and to quickly zoom in to regions of interest in the analysis results. Glotaran has been developed on top of the NetBeans rich client platform and communicates with R through the Java-to-R interface Rserve. The background and the functionality of the application are described here. In addition, the design, development and implementation process of Glotaran is documented in a generic way.
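
    On the R side, the Glotaran-to-R link relies on a running Rserve process; a minimal sketch of starting one (this setup is assumed here, not prescribed by the paper):

```r
# install.packages(c("Rserve", "TIMP"))  # one-time setup (assumed)
library(Rserve)

# Launch a local Rserve daemon; a Java client such as Glotaran can then
# connect over TCP and evaluate TIMP calls in an R session it controls.
Rserve(args = "--vanilla")
```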

  4. Easing access to R using 'shiny' to create graphical user interfaces: An example for the R package 'Luminescence'

    Science.gov (United States)

    Burow, Christoph; Kreutzer, Sebastian; Dietze, Michael; Fuchs, Margret C.; Schmidt, Christoph; Fischer, Manfred; Brückner, Helmut

    2017-04-01

    Since the release of the R package 'Luminescence' (Kreutzer et al., 2012) the functionality of the package has been greatly enhanced by implementing further functions for measurement data processing, statistical analysis and graphical output. Despite its capabilities for complex and non-standard analysis of luminescence data, working with the command-line interface (CLI) of R can be tedious at best and overwhelming at worst, especially for users without experience in programming languages. Even though much work is put into simplifying the usage of the package to continuously lower the entry threshold, at least basic knowledge of R will always be required. Thus, the potential user base of the package cannot be exhausted, at least as long as the CLI is the only means of utilising the 'Luminescence' package. But even experienced users may find it tedious to iteratively run a function until a satisfying result is produced. For example, plotting data is at least partly subject to personal aesthetic taste in accordance with the information it is supposed to convey, and iterating through all the possible options in the R CLI can be a time-consuming task. An alternative approach to the CLI is the graphical user interface (GUI), which allows direct, interactive manipulation of and interaction with the underlying software. For users with little or no experience with command lines, a GUI offers intuitive access that counteracts the perceived steep learning curve of a CLI. Even though R lacks native support for GUIs, its ability to link to other programming languages makes it possible to utilise external frameworks to build graphical user interfaces. A recent attempt to provide a GUI toolkit for R was the introduction of the 'shiny' package (Chang et al., 2016), which allows automatic construction of HTML-, CSS- and JavaScript-based user interfaces straight from R. Here, we give (1) a brief introduction to the 'shiny' framework for R, before we (2) present a GUI for
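
    A minimal 'shiny' app illustrating the kind of GUI construction the abstract refers to; the inputs and plot below are placeholders, not part of the 'Luminescence' interface itself:

```r
library(shiny)

ui <- fluidPage(
  titlePanel("Toy GUI"),
  sliderInput("n", "Number of points", min = 10, max = 500, value = 100),
  plotOutput("scatter")
)

server <- function(input, output) {
  output$scatter <- renderPlot({
    plot(rnorm(input$n), rnorm(input$n), pch = 16,
         xlab = "x", ylab = "y", main = "Random scatter")
  })
}

shinyApp(ui = ui, server = server)  # builds the HTML/CSS/JavaScript front end from R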

  5. A novel R-package graphic user interface for the analysis of metabonomic profiles

    Directory of Open Access Journals (Sweden)

    Villa Palmira

    2009-10-01

    Background: Analysis of the plethora of metabolites found in the NMR spectra of biological fluids or tissues requires data complexity to be simplified. We present a graphical user interface (GUI) for NMR-based metabonomic analysis. The "Metabonomic Package" has been developed for metabonomics research as open-source software and uses the R statistical libraries. Results: The package offers the following options: raw 1-dimensional spectra processing (phase, baseline correction and normalization); importing of processed spectra; inclusion/exclusion of spectral ranges, optional binning and bucketing, detection and alignment of peaks; sorting of metabolites based on their ability to discriminate, metabolite selection, and outlier identification; multivariate unsupervised analysis (principal components analysis, PCA); multivariate supervised analysis (partial least squares, PLS; linear discriminant analysis, LDA; k-nearest neighbor classification); neural networks; visualization and overlapping of spectra; and plotting of the chemical shift position values for different samples. Furthermore, the "Metabonomic" GUI includes a console to enable other kinds of analyses and to take advantage of all R statistical tools. Conclusion: We made complex multivariate analysis user-friendly for both experienced and novice users, which could help to expand the use of NMR-based metabonomics.
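
    A sketch of the unsupervised step such a GUI wraps, using simulated binned spectra and base R's prcomp(); the "Metabonomic Package" itself performs the preprocessing and exposes this kind of analysis through menus:

```r
set.seed(1)
spectra <- matrix(rnorm(40 * 200), nrow = 40)   # 40 samples x 200 spectral bins (simulated)
rownames(spectra) <- paste0("sample", 1:40)

# Principal components analysis of the binned spectra
pca <- prcomp(spectra, center = TRUE, scale. = TRUE)
plot(pca$x[, 1:2], pch = 16, xlab = "PC1", ylab = "PC2", main = "PCA scores")
```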

  6. User guide to Exploration and Graphics for RivEr Trends (EGRET) and dataRetrieval: R packages for hydrologic data

    Science.gov (United States)

    Hirsch, Robert M.; De Cicco, Laura A.

    2015-01-01

    Evaluating long-term changes in river conditions (water quality and discharge) is an important use of hydrologic data. To carry out such evaluations, the hydrologist needs tools to facilitate several key steps in the process: acquiring the data records from a variety of sources, structuring it in ways that facilitate the analysis, processing the data with routines that extract information about changes that may be happening, and displaying findings with graphical techniques. A pair of tightly linked R packages, called dataRetrieval and EGRET (Exploration and Graphics for RivEr Trends), have been developed for carrying out each of these steps in an integrated manner. They are designed to easily accept data from three sources: U.S. Geological Survey hydrologic data, U.S. Environmental Protection Agency (EPA) STORET data, and user-supplied flat files. The dataRetrieval package not only serves as a “front end” to the EGRET package, it can also be used to easily download many types of hydrologic data and organize it in ways that facilitate many other hydrologic applications. The EGRET package has components oriented towards the description of long-term changes in streamflow statistics (high flow, average flow, and low flow) as well as changes in water quality. For the water-quality analysis, it uses Weighted Regressions on Time, Discharge and Season (WRTDS) to describe long-term trends in both concentration and flux. EGRET also creates a wide range of graphical presentations of the water-quality data and of the WRTDS results. This report serves as a user guide to these two R packages, providing detailed guidance on installation and use of the software, documentation of the analysis methods used, as well as guidance on some of the kinds of questions and approaches that the software can facilitate.
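
    A hedged sketch of the retrieve-then-analyze workflow the report documents; function names follow recent releases of dataRetrieval/EGRET, and the site and parameter codes are illustrative only:

```r
library(dataRetrieval)
library(EGRET)

siteNumber  <- "01491000"   # example USGS streamgage
parameterCd <- "00631"      # nitrate plus nitrite

Daily  <- readNWISDaily(siteNumber, "00060", "1980-01-01", "2010-12-31")   # daily discharge
Sample <- readNWISSample(siteNumber, parameterCd, "1980-01-01", "2010-12-31")
INFO   <- readNWISInfo(siteNumber, parameterCd, interactive = FALSE)

eList <- mergeReport(INFO, Daily, Sample)  # bundle metadata, flow and samples
eList <- modelEstimation(eList)            # fit the WRTDS model
plotConcHist(eList)                        # long-term concentration trend graphic
```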

  8. PKgraph: an R package for graphically diagnosing population pharmacokinetic models.

    Science.gov (United States)

    Sun, Xiaoyong; Wu, Kai; Cook, Dianne

    2011-12-01

    Population pharmacokinetic (PopPK) modeling has become increasingly important in drug development because it handles unbalanced designs, sparse data and the study of individual variation. However, the increased complexity of the models makes it more of a challenge to diagnose the fit. Graphics can play an important and unique role in PopPK model diagnostics. The software described in this paper, PKgraph, provides a graphical user interface for PopPK model diagnosis. It also provides an integrated and comprehensive platform for the analysis of pharmacokinetic data, including exploratory data analysis, goodness of model fit, model validation and model comparison. Results from a variety of model fitting software, including NONMEM, Monolix, SAS and R, can be used. PKgraph is programmed in R, and uses the R packages lattice and ggplot2 for static graphics, and rggobi for interactive graphics.
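
    A sketch of the kind of static goodness-of-fit graphic PKgraph builds with ggplot2, using simulated observed versus population-predicted concentrations; the column names are illustrative, not PKgraph's own:

```r
library(ggplot2)

set.seed(42)
pk <- data.frame(PRED = runif(200, 0, 10))       # population predictions
pk$DV <- pk$PRED * exp(rnorm(200, sd = 0.2))     # observations with residual error

ggplot(pk, aes(x = PRED, y = DV)) +
  geom_point(alpha = 0.6) +
  geom_abline(slope = 1, intercept = 0, linetype = "dashed") +  # line of identity
  labs(x = "Population prediction", y = "Observed concentration",
       title = "Observed vs. predicted")
```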

  9. miRE: A Graphical R Package for MicroRNA-Related Analysis

    Institute of Scientific and Technical Information of China (English)

    Xing-qi Yan; Kang Tu; Lu Xie; Yi-xue Li; Bin Yin; Yan-hua Gong; Jian-gang Yuan; Bo-qin Qiang; Xiao-zhong Peng

    2008-01-01

    Objective: To provide a set of useful analysis tools for researchers to explore microRNA data. Methods: The R language was used for generating the graphical user interface and implementing most functions. Some Practical Extraction and Report Language (Perl) scripts were used for parsing source files. Results: We developed a graphical R package named miRE, which was designed for the analysis of microRNA functions, genomic organization, etc. This package provides effective and convenient tools for molecular biologists to deal with routine analyses in microRNA-related research. With its help, users are able to build a desktop-centered microRNA research environment quite easily and effectively. miRE is freely available at http://www.biosino.org/~kanghu/WorkPresentation/miRE/miRE.html. A detailed user manual and tutorials with example code and images are also available. Conclusion: miRE is a tool providing an open-source, user-friendly, integrated interface for microRNA-related analysis. With its help, researchers can perform microRNA-related analysis more efficiently.

  10. phyloseq: an R package for reproducible interactive analysis and graphics of microbiome census data.

    Directory of Open Access Journals (Sweden)

    Paul J McMurdie

    The analysis of microbial communities through DNA sequencing brings many challenges: the integration of different types of data with methods from ecology, genetics, phylogenetics, multivariate statistics, visualization and testing. With the increased breadth of experimental designs now being pursued, project-specific statistical analyses are often needed, and these analyses are often difficult (or impossible) for peer researchers to independently reproduce. The vast majority of the requisite tools for performing these analyses reproducibly are already implemented in R and its extensions (packages), but with limited support for high throughput microbiome census data. Here we describe a software project, phyloseq, dedicated to the object-oriented representation and analysis of microbiome census data in R. It supports importing data from a variety of common formats, as well as many analysis techniques. These include calibration, filtering, subsetting, agglomeration, multi-table comparisons, diversity analysis, parallelized Fast UniFrac, ordination methods, and production of publication-quality graphics; all in a manner that is easy to document, share, and modify. We show how to apply functions from other R packages to phyloseq-represented data, illustrating the availability of a large number of open source analysis techniques. We discuss the use of phyloseq with tools for reproducible research, a practice common in other fields but still rare in the analysis of highly parallel microbiome census data. We have made available all of the materials necessary to completely reproduce the analysis and figures included in this article, an example of best practices for reproducible research. The phyloseq project for R is a new open-source software package, freely available on the web from both GitHub and Bioconductor.
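
    A short phyloseq session of the kind described above, using GlobalPatterns, one of the example data sets shipped with the package:

```r
library(phyloseq)
data(GlobalPatterns)

# Drop taxa never observed, then ordinate and plot with publication-quality graphics
gp  <- filter_taxa(GlobalPatterns, function(x) sum(x) > 0, prune = TRUE)
ord <- ordinate(gp, method = "PCoA", distance = "bray")
plot_ordination(gp, ord, color = "SampleType")
```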

  11. Programming Graphical User Interfaces in R

    CERN Document Server

    Verzani, John

    2012-01-01

    Programming Graphical User Interfaces with R introduces each of the major R packages for GUI programming: RGtk2, qtbase, Tcl/Tk, and gWidgets. With examples woven through the text as well as stand-alone demonstrations of simple yet reasonably complete applications, the book features topics especially relevant to statisticians who aim to provide a practical interface to functionality implemented in R. The book offers: a how-to guide for developing GUIs within R; the fundamentals for users with limited knowledge of programming within R and other languages; GUI design for specific functions or as l
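
    A toy example in the spirit of the book, using the toolkit-independent gWidgets package; the choice of the tcltk back end is an assumption here:

```r
library(gWidgets)
options(guiToolkit = "tcltk")   # assumed back end; RGtk2 or Qt could be used instead

win <- gwindow("Hello from R", visible = TRUE)
grp <- ggroup(container = win)
gbutton("Press me", container = grp,
        handler = function(h, ...) gmessage("Button pressed"))
```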

  12. Independencies Induced from a Graphical Markov Model After Marginalization and Conditioning: The R Package ggm

    Directory of Open Access Journals (Sweden)

    Giovanni M. Marchetti

    2006-02-01

    We describe some functions in the R package ggm to derive from a given Markov model, represented by a directed acyclic graph, different types of graphs induced after marginalizing over and conditioning on some of the variables. The package has a few basic functions that find the essential graph, the induced concentration and covariance graphs, and several types of chain graphs implied by the directed acyclic graph (DAG) after grouping and reordering the variables. These functions can be useful to explore the impact of latent variables or of selection effects on a chosen data generating model.
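
    A hedged sketch based on the functions named in the abstract; the sel argument for the variables retained after marginalization follows the package documentation and should be treated as an assumption:

```r
library(ggm)

# A DAG on four variables: u -> x, u -> z, x -> y, z -> y
amat <- DAG(y ~ x + z, x ~ u, z ~ u)

essentialGraph(amat)                           # representative of the Markov equivalence class
inducedCovGraph(amat, sel = c("x", "z", "y"))  # covariance graph after marginalizing over u
```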

  13. Statistical Graphical User Interface Plug-In for Survival Analysis in R Statistical and Graphics Language and Environment

    Directory of Open Access Journals (Sweden)

    Daniel C. LEUCUŢA

    2008-12-01

    Introduction: R is a statistical and graphics language and environment. Although it is used extensively from the command line, graphical user interfaces exist to ease new users' accommodation with it. Rcmdr is an R package providing a basic-statistics graphical user interface to R. A survival analysis interface is not provided by Rcmdr. The aim of this paper was to create a plug-in for Rcmdr to provide a survival analysis user interface for some basic R survival analysis functions. Materials and Methods: The Rcmdr plug-in code was written in Tinn-R. The plug-in package was tested and built with Rtools. The plug-in was installed and tested in R with the Rcmdr package on a Windows XP workstation with the "aml" and "kidney" data sets from the survival R package. Results: The Rcmdr survival analysis plug-in was successfully built and it provides the functionality it was designed to offer: an interface for Kaplan-Meier and log-log survival graphs, an interface for the log-rank test, an interface to create a Cox proportional hazards regression model, and interface commands to test and graphically assess the proportional hazards assumption and influential observations. Conclusion: Rcmdr and R, through their flexible and well-planned structure, offer an easy way to expand their functionality, which was used here to make the statistical environment more user-friendly with respect to survival analysis.
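
    The plug-in is a front end to survival-package functions such as these, shown here on the same "aml" data set mentioned above:

```r
library(survival)

fit <- survfit(Surv(time, status) ~ x, data = aml)   # Kaplan-Meier curves by treatment group
plot(fit, lty = 1:2, xlab = "Time", ylab = "Survival probability")

survdiff(Surv(time, status) ~ x, data = aml)         # log-rank test
cox <- coxph(Surv(time, status) ~ x, data = aml)     # Cox proportional hazards model
cox.zph(cox)                                         # check the proportional hazards assumption
```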

  14. RGtk2: A Graphical User Interface Toolkit for R

    Directory of Open Access Journals (Sweden)

    Duncan Temple Lang

    2011-01-01

    Graphical user interfaces (GUIs) are growing in popularity as a complement or alternative to the traditional command-line interfaces to R. RGtk2 is an R package for creating GUIs in R. The package provides programmatic access to GTK+ 2.0, an open-source GUI toolkit written in C. To construct a GUI, the R programmer calls RGtk2 functions that map to functions in the underlying GTK+ library. This paper introduces the basic concepts underlying GTK+ and explains how to use RGtk2 to construct GUIs from R. The tutorial is based on simple and practical programming examples. We also provide more complex examples illustrating the advanced features of the package. The design of the RGtk2 API and the low-level interface from R to GTK+ are discussed at length. We compare RGtk2 to alternative GUI toolkits for R.
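
    A minimal RGtk2 window with one button, in the style of the paper's introductory examples; treat the exact constructor arguments as a sketch:

```r
library(RGtk2)

win <- gtkWindowNew("toplevel", show = FALSE)
win$setTitle("Hello GTK+ from R")

btn <- gtkButtonNewWithLabel("Quit")
gSignalConnect(btn, "clicked", function(widget) win$destroy())  # GTK+ signal -> R callback

win$add(btn)
win$show()
```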

  15. Disjoint forms in graphical user interfaces

    NARCIS (Netherlands)

    Evers, S.; Achten, P.M.; Plasmeijer, M.J.; Loidl, H.W.

    2006-01-01

    Forms are parts of a graphical user interface (GUI) that show a set of values and allow the user to update them. The declarative form construction library FunctionalForms is extended with disjoint form combinators to capture some common patterns in which the form structure expresses a choice. We dem

  16. Digital Data Acquisition Graphical User Interface

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, Matthew W.; Aalseth, Craig E.; Ely, James H.; Haas, Derek A.; Hayes, James C.; McIntyre, Justin I.; Schrom, Brian T.

    2010-09-21

    Traditional radioxenon measurements have been done by ground-based fixed systems; however, in recent years there has been an increased need for systems capable of quick deployment or even complete mobility. Using the Pixie-4 data acquisition (DAQ) system can help reduce the electronics footprint of both current systems, like the radioxenon Radionuclide Laboratory 16 (RL-16) and the Swedish Automatic Unit for Noble Gas Acquisition (SAUNA), as well as future systems. Pacific Northwest National Laboratory (PNNL) has developed a Linux-based graphical user interface (GUI), Nyx, for Pixie-4 cards. The Nyx software can be installed on various Linux platforms and is written in C++. This software offers a rich user interface for configuring and operating the Pixie-4 card and PNNL-designed high voltage (HV) cards. Nyx allows one to quickly get a nuclear detector operational by maintaining the core diagnostic features built into the Pixie-4 cards. First, Nyx maintains the multitude of adjustable parameters accessible in the Pixie-4 cards, which allows one to customize settings to take full advantage of a particular detector. Nyx also maintains an oscilloscope feature, which is extremely useful for optimizing settings and verifying proper detector behavior, and is often the first feature used in Nyx during detector setup. Finally, Nyx allows the user to collect data in several formats, ranging from full pulse shapes to basic histograms. Overall, it is the cornerstone for the transition of beta-gamma systems to a state-of-the-art digitizing DAQ.

  17. Simulation Control Graphical User Interface Logging Report

    Science.gov (United States)

    Hewling, Karl B., Jr.

    2012-01-01

    One of the many tasks of my project was to revise the code of the Simulation Control Graphical User Interface (SIM GUI) to enable logging functionality to a file. I was also tasked with developing a script that directed the startup and initialization flow of the various LCS software components. This makes sure that a software component will not spin up until all the appropriate dependencies have been configured properly. I was also able to assist hardware modelers in verifying the configuration of models after they had been upgraded to a new software version. I developed some code that analyzes the MDL files to determine if any errors were generated due to the upgrade process. Another one of the projects assigned to me was supporting the End-to-End Hardware/Software Daily Tag-up meeting.

  18. PAMLX: a graphical user interface for PAML.

    Science.gov (United States)

    Xu, Bo; Yang, Ziheng

    2013-12-01

    This note announces pamlX, a graphical user interface/front end for the paml (for Phylogenetic Analysis by Maximum Likelihood) program package (Yang Z. 1997. PAML: a program package for phylogenetic analysis by maximum likelihood. Comput Appl Biosci. 13:555-556; Yang Z. 2007. PAML 4: Phylogenetic analysis by maximum likelihood. Mol Biol Evol. 24:1586-1591). pamlX is written in C++ using the Qt library and communicates with paml programs through files. It can be used to create, edit, and print control files for paml programs and to launch paml runs. The interface is available for free download at http://abacus.gene.ucl.ac.uk/software/paml.html.

  19. An Overview on R Packages for Structural Equation Modeling

    Directory of Open Access Journals (Sweden)

    Haibin Qiu

    2014-05-01

    The aim of this study is to present an overview of R packages for structural equation modeling. Structural equation modeling, a statistical technique for testing and estimating causal relations using a combination of statistical data and qualitative causal hypotheses, allows both confirmatory and exploratory modeling, meaning it is suited to both hypothesis testing and theory development. R, a free and popular programming language and software environment for statistical computing and graphics, is widely used among statisticians for developing statistical software and for data analysis. The major finding is that a sufficient set of high-quality structural equation modeling packages needs to be available for R users to do research. Numerous R packages for structural equation modeling are introduced in this study, and most of them are included in the Comprehensive R Archive Network task view Psychometrics.
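
    For illustration, one package listed in that task view is lavaan (not named in the abstract); a standard confirmatory factor analysis with its built-in example data:

```r
library(lavaan)

model <- '
  visual  =~ x1 + x2 + x3
  textual =~ x4 + x5 + x6
  speed   =~ x7 + x8 + x9
'
fit <- cfa(model, data = HolzingerSwineford1939)
summary(fit, fit.measures = TRUE)
```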

  20. FGB: A Graphical and Haptic User Interface for Creating Graphical, Haptic User Interfaces

    Energy Technology Data Exchange (ETDEWEB)

    ANDERSON,THOMAS G.; BRECKENRIDGE,ARTHURINE; DAVIDSON,GEORGE S.

    1999-10-18

    The emerging field of haptics represents a fundamental change in human-computer interaction (HCI), and presents solutions to problems that are difficult or impossible to solve with a two-dimensional, mouse-based interface. To take advantage of the potential of haptics, however, innovative interaction techniques and programming environments are needed. This paper describes FGB (FLIGHT GHUI Builder), a programming tool that can be used to create an application-specific graphical and haptic user interface (GHUI). FGB is itself a graphical and haptic user interface with which a programmer can intuitively create and manipulate components of a GHUI in real time in a graphical environment through the use of a haptic device. The programmer can create a GHUI without writing any programming code. After a user interface is created, FGB writes the appropriate programming code to a file, using the FLIGHT API, to recreate what the programmer created in the FGB interface. FGB saves programming time and increases productivity, because a programmer can see the end result as it is created, and FGB does much of the programming itself. Interestingly, as FGB was created, it was used to help build itself. The further FGB was in its development, the more easily and quickly it could be used to create additional functionality and improve its own design. As a finished product, FGB can be used to recreate itself in much less time than it originally required, and with much less programming. This paper describes FGB's GHUI components, the techniques used in the interface, how the output code is created, where programming additions and modifications should be placed, and how it can be compared to and integrated with existing APIs such as MFC and Visual C++, OpenGL, and GHOST.

  1. Open|SpeedShop Graphical User Interface Technology Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to create a new graphical user interface (GUI) for an existing parallel application performance and profiling tool, Open|SpeedShop. The current GUI has...

  2. CentiServer: A Comprehensive Resource, Web-Based Application and R Package for Centrality Analysis.

    Science.gov (United States)

    Jalili, Mahdi; Salehzadeh-Yazdi, Ali; Asgari, Yazdan; Arab, Seyed Shahriar; Yaghmaie, Marjan; Ghavamzadeh, Ardeshir; Alimoghaddam, Kamran

    2015-01-01

    Various disciplines are trying to solve one of the most noteworthy queries and broadly used concepts in biology, essentiality. Centrality is a primary index and a promising method for identifying essential nodes, particularly in biological networks. The newly created CentiServer is a comprehensive online resource that provides over 110 definitions of different centrality indices, their computational methods, and algorithms in the form of an encyclopedia. In addition, CentiServer allows users to calculate 55 centralities with the help of an interactive web-based application tool and provides a numerical result as a comma separated value (csv) file format or a mapped graphical format as a graph modeling language (GML) file. The standalone version of this application has been developed in the form of an R package. The web-based application (CentiServer) and R package (centiserve) are freely available at http://www.centiserver.org/.
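
    The standalone centiserve package works on igraph graph objects; a sketch of the kind of centrality computation it extends, using igraph's built-in indices (the random graph stands in for a biological network):

```r
library(igraph)

set.seed(3)
g <- sample_gnp(50, 0.08)           # random graph as a stand-in for a biological network

deg <- degree(g)
btw <- betweenness(g)
cls <- closeness(g)

head(sort(btw, decreasing = TRUE))  # most central nodes by betweenness
```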

  3. CentiServer: A Comprehensive Resource, Web-Based Application and R Package for Centrality Analysis.

    Directory of Open Access Journals (Sweden)

    Mahdi Jalili

    Various disciplines are trying to solve one of the most noteworthy queries and broadly used concepts in biology, essentiality. Centrality is a primary index and a promising method for identifying essential nodes, particularly in biological networks. The newly created CentiServer is a comprehensive online resource that provides over 110 definitions of different centrality indices, their computational methods, and algorithms in the form of an encyclopedia. In addition, CentiServer allows users to calculate 55 centralities with the help of an interactive web-based application tool and provides a numerical result as a comma separated value (CSV) file format or a mapped graphical format as a graph modeling language (GML) file. The standalone version of this application has been developed in the form of an R package. The web-based application (CentiServer) and R package (centiserve) are freely available at http://www.centiserver.org/.

  4. A Functional Programming Technique for Forms in Graphical User Interfaces

    NARCIS (Netherlands)

    Evers, S.; Kuper, J.; Achten, P.M.; Grelck, G.; Huch, F.; Michaelson, G.; Trinder, Ph.W.

    2005-01-01

    This paper presents FunctionalForms, a new combinator library for constructing fully functioning forms in a concise and flexible way. A form is a part of a graphical user interface (GUI) restricted to displaying a value and allowing the user to modify it. The library is built on top of the medium-le

  5. A Graphical User Interface for Formal Proofs in Geometry.

    OpenAIRE

    Narboux, Julien

    2007-01-01

    We present in this paper the design of a graphical user interface to deal with proofs in geometry. The software developed combines three tools: a dynamic geometry software to explore, measure and invent conjectures, an automatic theorem prover to check facts and an interactive proof system (Coq) to mechanically check proofs built interactively by the user.

  6. tclust: An R Package for a Trimming Approach to Cluster Analysis

    Directory of Open Access Journals (Sweden)

    2012-04-01

    Outlying data can heavily influence standard clustering methods. At the same time, clustering principles can be useful when robustifying statistical procedures. These two reasons motivate the development of feasible robust model-based clustering approaches. With this in mind, an R package for performing non-hierarchical robust clustering, called tclust, is presented here. Instead of trying to “fit” noisy data, a proportion α of the most outlying observations is trimmed. The tclust package efficiently handles different cluster scatter constraints. Graphical exploratory tools are also provided to help the user make sensible choices for the trimming proportion as well as the number of clusters to search for.
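
    A hedged sketch of a tclust call on simulated data: trim a proportion alpha of the most outlying points while searching for k clusters (argument names follow the package documentation):

```r
library(tclust)

set.seed(7)
x <- rbind(matrix(rnorm(200, mean = 0), ncol = 2),
           matrix(rnorm(200, mean = 5), ncol = 2),
           matrix(runif(20, -10, 15), ncol = 2))    # add a few gross outliers

res <- tclust(x, k = 2, alpha = 0.05, restr.fact = 12)
plot(res)   # cluster assignments with trimmed observations marked
```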

  7. Creating R Packages: A Tutorial

    OpenAIRE

    Leisch, Friedrich

    2008-01-01

    This tutorial gives a practical introduction to creating R packages. We discuss how object-oriented programming and S formulas can be used to give R code the usual look and feel, how to start a package from a collection of R functions, and how to test the code once the package has been created. As a running example we use functions for standard linear regression analysis which are developed from scratch.
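
    The mechanics the tutorial walks through can be started from a collection of functions with base R's package.skeleton(); the package and function names below are placeholders:

```r
lm_coefs <- function(formula, data) coef(lm(formula, data = data))

package.skeleton(name = "regutils", list = c("lm_coefs"))
# Then edit DESCRIPTION and the Rd stubs, and build/check from the shell:
#   R CMD build regutils
#   R CMD check regutils_*.tar.gz
```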

  8. Defining Domain Language of Graphical User Interfaces

    OpenAIRE

    Baciková, Michaela; Porubän, Jaroslav; Lakatos, Dominik

    2013-01-01

    Domain-specific languages are computer (programming, modeling, specification) languages devoted to solving problems in a specific domain. The least examined DSL development phases are analysis and design. Various formal methodologies exist, however domain analysis is still done informally most of the time. There are also methodologies of deriving DSLs from existing ontologies but the presumption is to have an ontology for the specific domain. We propose a solution of a user interface driven d...

  10. A Graphical User Interface to Generalized Linear Models in MATLAB

    Directory of Open Access Journals (Sweden)

    Peter Dunn

    1999-07-01

    Generalized linear models unite a wide variety of statistical models in a common theoretical framework. This paper discusses GLMLAB, software that enables such models to be fitted in the popular mathematical package MATLAB. It provides a graphical user interface to the powerful MATLAB computational engine to produce a program that is easy to use but with many features, including offsets, prior weights and user-defined distributions and link functions. MATLAB's graphical capacities are also utilized in providing a number of simple residual diagnostic plots.

  11. Reservation system with graphical user interface

    KAUST Repository

    Mohamed, Mahmoud A. Abdelhamid

    2012-01-05

    Techniques for providing a reservation system are provided. The techniques include displaying a scalable visualization object, wherein the scalable visualization object comprises an expanded view element of the reservation system depicting information in connection with a selected interval of time and a compressed view element of the reservation system depicting information in connection with one or more additional intervals of time, maintaining a visual context between the expanded view and the compressed view within the visualization object, and enabling a user to switch between the expanded view and the compressed view to facilitate use of the reservation system.

  12. BatTool: an R package with GUI for assessing the effect of White-nose syndrome and other take events on Myotis spp. of bats

    Science.gov (United States)

    Erickson, Richard A.; Thogmartin, Wayne E.; Szymanski, Jennifer A.

    2014-01-01

    Background: Myotis species of bats such as the Indiana Bat and Little Brown Bat are facing population declines because of White-nose syndrome (WNS). These species also face threats from anthropogenic activities such as wind energy development. Population models may be used to provide insights into threats facing these species. We developed a population model, BatTool, as an R package to help decision makers and natural resource managers examine factors influencing the dynamics of these species. The R package includes two components: 1) deterministic and stochastic models that are accessible from the command line, and 2) a graphical user interface (GUI). Results: BatTool is an R package allowing natural resource managers and decision makers to understand Myotis spp. population dynamics. Through the use of a GUI, the model allows users to understand how WNS and other take events may affect the population. The results are saved both graphically and as data files. Additionally, R-savvy users may access the population functions through the command line and reuse the code as part of future research. This R package could also be used as part of a population dynamics or wildlife management course. Conclusions: BatTool provides access to a Myotis spp. population model. This tool can help natural resource managers and decision makers with Endangered Species Act deliberations for these species and with issuing take permits as part of regulatory decision making. The tool is available online as part of this publication.

  13. A Graphical User Interface in WLAN Monitoring and Management System

    Directory of Open Access Journals (Sweden)

    Jiantao Gu

    2012-04-01

    This paper aims at providing a graphical user interface for the WLAN monitoring and management system “WLAN Inspector”, which gives network operators the software and performance management tools necessary to monitor and manage network availability, achieve real-time monitoring (7 × 24 hours) and intelligent management, report on IP network performance, and troubleshoot issues through a single Web-based graphical user interface. The overall framework design of the graphical interface, a brief description of each module, and the detailed design of the basic information interface are discussed in this paper. The WLAN monitoring and management system has multiple functions: real-time network monitoring, real-time protocol analysis, information, statistics, safety testing and network performance monitoring, etc. This system can give Video Frame Capture for Mac, analyze the WLAN traffic characteristics, detect possible security vulnerabilities, and give the appropriate solution.

  14. Helping Students Test Programs That Have Graphical User Interfaces

    Directory of Open Access Journals (Sweden)

    Matthew Thornton

    2008-08-01

    Within computer science education, many educators are incorporating software testing activities into regular programming assignments. Tools like JUnit and its relatives make software testing tasks much easier, bringing them into the realm of even introductory students. At the same time, many introductory programming courses are now including graphical interfaces as part of student assignments to improve student interest and engagement. Unfortunately, writing software tests for programs that have significant graphical user interfaces is beyond the skills of typical students (and many educators). This paper presents initial work at combining educationally oriented and open-source tools to create an infrastructure for writing tests for Java programs that have graphical user interfaces. Critically, these tools are intended to be appropriate for introductory (CS1/CS2) student use, and to dovetail with current teaching approaches that incorporate software testing in programming assignments. We also include in our findings our proposed approach to evaluating our techniques.

  15. Circumventing Graphical User Interfaces in Chemical Engineering Plant Design

    Science.gov (United States)

    Romey, Noel; Schwartz, Rachel M.; Behrend, Douglas; Miao, Peter; Cheung, H. Michael; Beitle, Robert

    2007-01-01

    Graphical User Interfaces (GUIs) are pervasive elements of most modern technical software and represent a convenient tool for student instruction. For example, GUIs are used for [chemical] process design software (e.g., CHEMCAD, PRO/II and ASPEN) typically encountered in the senior capstone course. Drag and drop aspects of GUIs are challenging for…

  16. Graphical User Interface Color Display Animation Interaction Tool

    Energy Technology Data Exchange (ETDEWEB)

    1999-10-05

    The Nuclear Plant Analyzer (NPA) is a highly flexible graphical user interface for displaying the results of a calculation, typically generated by RELAP5 or another code. This display consists of one or more pictures, called masks, that mimic the host code input. A mask can be animated to display user-specified code output information mapped as colors, dials, moving arrows, etc., on the mask. The user can also interact with the control systems of the host input file as the execution progresses, thereby controlling aspects of the calculation. The Computer Visual System (CVS) creates, edits, and animates the masks for use in the NPA.

  17. Toward a graphical user interface for the SPIRE spectrometer pipeline

    Science.gov (United States)

    Ordenovic, C.; Surace, C.; Baluteau, J. P.; Benielli, D.; Davis, P.; Fulton, T.

    2008-08-01

    Herschel is a satellite mission led by ESA and involving an international consortium of countries. The HCSS is in charge of the data processing pipeline. This pipeline is written in Jython and includes Java classes. We present a convenient way for a user to deal with SPIRE photometer and spectrometer pipeline scripts. The provided graphical user interface is built up automatically from the Jython script. The user can choose tasks to be executed, parameterise them and set breakpoints during the pipeline execution. Results can be displayed and saved in FITS and VOTable formats.

  18. A remote computer graphics user at General Motors

    Science.gov (United States)

    Murphy, H. S.

    1982-01-01

    The successful use of automotive body surface design data is described. These data were originally created elsewhere on GM's two large computer graphics systems, CADANCE and Fisher Graphics. As a supplier of exterior lighting components, radiator grilles, energy-absorbing soft-faced bumper systems, and other associated items, Guide has become highly dependent on the corporate computer graphics systems to supply accurate car body styling and sheet metal surfacing information for the design of its products. The presentation covers the origin and transfer of design data to a remote user site; its use in the design of products; and the ultimate production of detailed drawings, N/C punched tapes, and subsequent downstream transfers of detailed part data to a turnkey system for tool design purposes.

  19. SRF Test Areas Cryogenic System Controls Graphical User Interface

    Energy Technology Data Exchange (ETDEWEB)

    DeGraff, B.D.; Ganster, G.; Klebaner, A.; Petrov, A.D.; Soyars, W.M.; /Fermilab

    2011-06-09

    Fermi National Accelerator Laboratory has constructed a superconducting 1.3 GHz cavity test facility at Meson Detector Building (MDB) and a superconducting 1.3 GHz cryomodule test facility located at the New Muon Lab Building (NML). The control of these 2K cryogenic systems is accomplished by using a Synoptic graphical user interface (GUI) to interact with the underlying Fermilab Accelerator Control System. The design, testing and operational experience of employing the Synoptic client-server system for graphical representation will be discussed. Details on the Synoptic deployment to the MDB and NML cryogenic sub-systems will also be discussed. The implementation of the Synoptic as the GUI for both NML and MDB has been a success. Both facilities are currently fulfilling their individual roles in SCRF testing as a result of successful availability of the cryogenic systems. The tools available for creating Synoptic pages will continue to be developed to serve the evolving needs of users.

  20. WASAT. A graphical user interface for visualization of wave spectrograms

    Energy Technology Data Exchange (ETDEWEB)

    Joergensen, R.

    1996-12-01

    The report describes a technique for the decoding and visualization of sounding rocket data sets. A specific application for the visualization of three dimensional wave HF FFT spectra obtained from the SCIFER sounding rocket launched January 25, 1995, is made. The data set was decoded from its original data format which was the NASA DITES I/II format. A graphical user interface, WASAT (WAve Spectrogram Analysis Tool), using the Interactive Data Language was created. The data set was visualized using IDL image tools overlayed with contour routines. The user interface was based on the IDL widget concept. 9 refs., 7 figs.

  1. Comparing Text-based and Graphic User Interfaces for novice and expert users.

    Science.gov (United States)

    Chen, Jung-Wei; Zhang, Jiajie

    2007-10-11

    A Graphic User Interface (GUI) is commonly considered to be superior to a Text-based User Interface (TUI). This study compares GUI and TUI in an electronic dental record system. Several usability analysis techniques were used to compare the relative effectiveness of a GUI and a TUI. Expert users and novice users were evaluated on the time required and the steps needed to complete the task. A within-subject design was used to evaluate whether experience with either interface affects task performance. The results show that the GUI was not better than the TUI for expert users; the GUI was better for novice users. For novice users there was a learning transfer effect from TUI to GUI. This suggests that whether a user interface is user-friendly depends on the mapping between the user interface and the tasks. A GUI by itself may or may not be better than a TUI.

  2. Graphical user interface prototyping for distributed requirements engineering

    CERN Document Server

    Scheibmayr, Sven

    2014-01-01

    Finding and understanding the right requirements is essential for every software project. This book deals with the challenge to improve requirements engineering in distributed software projects. The use of graphical user interface (GUI) prototypes can help stakeholders in such projects to elicit and specify high quality requirements. The research objective of this study is to develop a method and a software artifact to support the activities in the early requirements engineering phase in order to overcome some of the difficulties and improve the quality of the requirements, which should eventu

  3. Graphical user interface for wireless sensor networks simulator

    Science.gov (United States)

    Paczesny, Tomasz; Paczesny, Daniel; Weremczuk, Jerzy

    2008-01-01

    Wireless Sensor Networks (WSN) are currently a very popular area of development. They can be suited to many applications, from military through environmental monitoring, healthcare, home automation and others. Those networks, when working in a dynamic, ad-hoc model, need effective protocols which must differ from common computer network algorithms. Research on those protocols would be difficult without a simulation tool, because real applications often use many nodes and tests on such big networks take much effort and cost. The paper presents a Graphical User Interface (GUI) for a simulator which is dedicated to WSN studies, especially in routing and data link protocol evaluation.

  4. SPIKY: A graphical user interface for monitoring spike train synchrony

    CERN Document Server

    Bozanic, Nebojsa

    2014-01-01

    Techniques for recording large-scale neuronal spiking activity are developing very fast. This leads to an increasing demand for algorithms capable of analyzing large amounts of experimental spike train data. One of the most crucial and demanding tasks is the identification of similarity patterns with a very high temporal resolution and across different spatial scales. To address this task, in recent years three time-resolved measures of spike train synchrony have been proposed, the ISI-distance, the SPIKE-distance, and event synchronization. The Matlab source codes for calculating and visualizing these measures have been made publicly available. However, due to the many different possible representations of the results the use of these codes is rather complicated and their application requires some basic knowledge of Matlab. Thus it became desirable to provide a more user-friendly and interactive interface. Here we address this need and present SPIKY, a graphical user interface which facilitates the applicati...

  5. NASA access mechanism: Graphical user interface information retrieval system

    Science.gov (United States)

    Hunter, Judy; Generous, Curtis; Duncan, Denise

    1993-01-01

    Access to online information sources of aerospace, scientific, and engineering data, a mission focus for NASA's Scientific and Technical Information Program, has always been limited by factors such as telecommunications, query language syntax, lack of standardization in the information, and the lack of adequate tools to assist in searching. Today, the NASA STI Program's NASA Access Mechanism (NAM) prototype offers a solution to these problems by providing the user with a set of tools that provide a graphical interface to remote, heterogeneous, and distributed information in a manner adaptable to both casual and expert users. Additionally, the NAM provides access to many Internet-based services such as Electronic Mail, the Wide Area Information Servers system, Peer Locating tools, and electronic bulletin boards.

  6. NASA Access Mechanism - Graphical user interface information retrieval system

    Science.gov (United States)

    Hunter, Judy F.; Generous, Curtis; Duncan, Denise

    1993-01-01

    Access to online information sources of aerospace, scientific, and engineering data, a mission focus for NASA's Scientific and Technical Information Program, has always been limited by factors such as telecommunications, query language syntax, lack of standardization in the information, and the lack of adequate tools to assist in searching. Today, the NASA STI Program's NASA Access Mechanism (NAM) prototype offers a solution to these problems by providing the user with a set of tools that provide a graphical interface to remote, heterogeneous, and distributed information in a manner adaptable to both casual and expert users. Additionally, the NAM provides access to many Internet-based services such as Electronic Mail, the Wide Area Information Servers system, Peer Locating tools, and electronic bulletin boards.

  7. A Graphical User Interface for RELAX3D

    Science.gov (United States)

    Jones, F. W.

    1997-05-01

    The Laplace/Poisson solver RELAX3D has been used extensively in cyclotron central region design and other accelerator and beam physics applications. It is typically run in an interactive mode where the user types in commands and parameters to initiate and control the solution process and to view or output the results. This paper describes a prototype graphical user interface (GUI), developed using Tcl/Tk, that eliminates most of this typing and makes for more efficient user interaction. The use of a unique package called Expect (a Tcl/Tk extension) allows the interface to be implemented as an independent front-end process that communicates with the running RELAX3D program, thus requiring minimal modifications to RELAX3D itself. Since Expect can control multiple processes, and since RELAX3D results are often sent to some subsequent program for visualization, particle tracking, etc., there are interesting opportunities to integrate these post-processing tasks into the same GUI that is used for RELAX3D.

  8. Some computer graphical user interfaces in radiation therapy

    Institute of Scientific and Technical Information of China (English)

    James C L Chow

    2016-01-01

    In this review, five graphical user interfaces (GUIs) used in radiation therapy practices and researches are introduced. They are: (1) the treatment time calculator, superficial X-ray treatment time calculator (SUPCALC) used in the superficial X-ray radiation therapy; (2) the monitor unit calculator, electron monitor unit calculator (EMUC) used in the electron radiation therapy; (3) the multileaf collimator machine file creator, sliding window intensity modulated radiotherapy (SWIMRT) used in generating fluence map for research and quality assurance in intensity modulated radiation therapy; (4) the treatment planning system, DOSCTP used in the calculation of 3D dose distribution using Monte Carlo simulation; and (5) the monitor unit calculator, photon beam monitor unit calculator (PMUC) used in photon beam radiation therapy. One common issue of these GUIs is that all user-friendly interfaces are linked to complex formulas and algorithms based on various theories, which do not have to be understood and noted by the user. In that case, user only needs to input the required information with help from graphical elements in order to produce desired results. SUPCALC is a superficial radiation treatment time calculator using the GUI technique to provide a convenient way for radiation therapist to calculate the treatment time, and keep a record for the skin cancer patient. EMUC is an electron monitor unit calculator for electron radiation therapy. Instead of doing hand calculation according to pre-determined dosimetric tables, clinical user needs only to input the required drawing of electron field in computer graphical file format, prescription dose, and beam parameters to EMUC to calculate the required monitor unit for the electron beam treatment. EMUC is based on a semi-experimental theory of sector-integration algorithm. SWIMRT is a multileaf collimator machine file creator to generate a fluence map produced by a medical linear accelerator. This machine file controls the

  9. Some computer graphical user interfaces in radiation therapy.

    Science.gov (United States)

    Chow, James C L

    2016-03-28

    In this review, five graphical user interfaces (GUIs) used in radiation therapy practices and researches are introduced. They are: (1) the treatment time calculator, superficial X-ray treatment time calculator (SUPCALC) used in the superficial X-ray radiation therapy; (2) the monitor unit calculator, electron monitor unit calculator (EMUC) used in the electron radiation therapy; (3) the multileaf collimator machine file creator, sliding window intensity modulated radiotherapy (SWIMRT) used in generating fluence map for research and quality assurance in intensity modulated radiation therapy; (4) the treatment planning system, DOSCTP used in the calculation of 3D dose distribution using Monte Carlo simulation; and (5) the monitor unit calculator, photon beam monitor unit calculator (PMUC) used in photon beam radiation therapy. One common issue of these GUIs is that all user-friendly interfaces are linked to complex formulas and algorithms based on various theories, which do not have to be understood and noted by the user. In that case, user only needs to input the required information with help from graphical elements in order to produce desired results. SUPCALC is a superficial radiation treatment time calculator using the GUI technique to provide a convenient way for radiation therapist to calculate the treatment time, and keep a record for the skin cancer patient. EMUC is an electron monitor unit calculator for electron radiation therapy. Instead of doing hand calculation according to pre-determined dosimetric tables, clinical user needs only to input the required drawing of electron field in computer graphical file format, prescription dose, and beam parameters to EMUC to calculate the required monitor unit for the electron beam treatment. EMUC is based on a semi-experimental theory of sector-integration algorithm. SWIMRT is a multileaf collimator machine file creator to generate a fluence map produced by a medical linear accelerator. This machine file controls

  10. A graphical user interface for infant ERP analysis.

    Science.gov (United States)

    Kaatiala, Jussi; Yrttiaho, Santeri; Forssman, Linda; Perdue, Katherine; Leppänen, Jukka

    2014-09-01

    Recording of event-related potentials (ERPs) is one of the best-suited technologies for examining brain function in human infants. Yet the existing software packages are not optimized for the unique requirements of analyzing artifact-prone ERP data from infants. We developed a new graphical user interface that enables an efficient implementation of a two-stage approach to the analysis of infant ERPs. In the first stage, video records of infant behavior are synchronized with ERPs at the level of individual trials to reject epochs with noncompliant behavior and other artifacts. In the second stage, the interface calls MATLAB and EEGLAB (Delorme & Makeig, Journal of Neuroscience Methods 134(1):9-21, 2004) functions for further preprocessing of the ERP signal itself (i.e., filtering, artifact removal, interpolation, and rereferencing). Finally, methods are included for data visualization and analysis by using bootstrapped group averages. Analyses of simulated and real EEG data demonstrated that the proposed approach can be effectively used to establish task compliance, remove various types of artifacts, and perform representative visualizations and statistical comparisons of ERPs. The interface is available for download from http://www.uta.fi/med/icl/methods/eeg.html in a format that is widely applicable to ERP studies with special populations and open for further editing by users.

  11. Development of Graphical User Interface Student Electoral System

    Directory of Open Access Journals (Sweden)

    Challiz Delima-Omorog

    2016-08-01

    Full Text Available The study was conducted to design and obtain evidence concerning the software quality and acceptance of a graphical user interface (GUI) student electoral voting system. The intention of this research is three-fold: firstly, a system based on the ISO 9126 software quality characteristics; secondly, a system that conforms to current hardware and software standards; and lastly, improved student participation in decision-making. Designing a usable system in the context of the users' perceptions (needs), and letting these perceptions dictate the design, is therefore a great challenge. This study used the descriptive-development research method. Data were collected through guided interviews and survey questionnaires from the respondents. The researcher adopted the Princeton Development Methodology through the entire life cycle of the software development process. A very substantial majority of the respondents stated that the new voting system is highly acceptable compared to the old system, both in terms of the development (maintainability and portability) and implementation (efficiency, functionality, reliability and usability) requirements of ISO 9126. The researcher concluded that usability is tied to the four software characteristics. Users' perception of software quality-implementation requirements is correlated specifically with usability. Based on the data and the problems encountered, respondents placed low importance on metrics that are not well represented in the interface. When the interface fails, users are more likely to take longer to vote, failing efficiency targets, and to be less reliable, weakening functionality

  12. An intuitive graphical user interface for small UAS

    Science.gov (United States)

    Stroumtsos, Nicholas; Gilbreath, Gary; Przybylski, Scott

    2013-05-01

    Thousands of small UAVs are in active use by the US military and are generally operated by trained but not necessarily skilled personnel. The user interfaces for these devices often seem to be more engineering-focused than usability-focused, which can lead to operator frustration, poor mission effectiveness, reduced situational awareness, and sometimes loss of the vehicle. In addition, coordinated control of both air and ground vehicles is a frequently desired objective, usually with the intent of increasing situational awareness for the ground vehicle. The Space and Naval Warfare Systems Center Pacific (SSCPAC) is working under a Naval Innovative Science and Engineering project to address these topics. The UAS currently targeted are the Raven/Puma/Wasp family of air vehicles as they are small, all share the same communications protocol, and are in wide-spread use. The stock ground control station (GCS) consists of a hand control unit, radio, interconnect hub, and laptop. The system has been simplified to an X-box controller, radio and a laptop, resulting in a smaller hardware footprint, but most importantly the number of personnel required to operate the system has been reduced from two to one. The stock displays, including video with text overlay on one and FalconView on the other, are replaced with a single, graphics-based, integrated user interface, providing the user with much improved situational awareness. The SSCPAC government-developed GCS (the Multi-robot Operator Control Unit) already has the ability to control ground robots and this is leveraged to realize simultaneous multi-vehicle operations including autonomous UAV over-watch for enhanced UGV situational awareness.

  13. SPIKY: a graphical user interface for monitoring spike train synchrony.

    Science.gov (United States)

    Kreuz, Thomas; Mulansky, Mario; Bozanic, Nebojsa

    2015-05-01

    Techniques for recording large-scale neuronal spiking activity are developing very fast. This leads to an increasing demand for algorithms capable of analyzing large amounts of experimental spike train data. One of the most crucial and demanding tasks is the identification of similarity patterns with a very high temporal resolution and across different spatial scales. To address this task, in recent years three time-resolved measures of spike train synchrony have been proposed, the ISI-distance, the SPIKE-distance, and event synchronization. The Matlab source codes for calculating and visualizing these measures have been made publicly available. However, due to the many different possible representations of the results the use of these codes is rather complicated and their application requires some basic knowledge of Matlab. Thus it became desirable to provide a more user-friendly and interactive interface. Here we address this need and present SPIKY, a graphical user interface that facilitates the application of time-resolved measures of spike train synchrony to both simulated and real data. SPIKY includes implementations of the ISI-distance, the SPIKE-distance, and the SPIKE-synchronization (an improved and simplified extension of event synchronization) that have been optimized with respect to computation speed and memory demand. It also comprises a spike train generator and an event detector that makes it capable of analyzing continuous data. Finally, the SPIKY package includes additional complementary programs aimed at the analysis of large numbers of datasets and the estimation of significance levels. Copyright © 2015 the American Physiological Society.

  14. Beowulf - Beta-Gamma Detector Calibration Graphical User Interface

    Energy Technology Data Exchange (ETDEWEB)

    McIntyre, Justin I.; Schrom, Brian T.; Cooper, Matthew W.; Haas, Derek A.; Hayes, James C.

    2009-09-21

    Pacific Northwest National Laboratory (PNNL) has demonstrated significant advancement in using beta-gamma coincidence detectors to detect a wide range of radioxenon isotopes. To obtain accurate activities, the detector must be properly calibrated by measuring a series of calibration gas samples. The data are analyzed to create the calibration block used in the International Monitoring System file format. Doing the calibration manually has proven to be tedious and prone to errors, requiring a high degree of expertise. The Beowulf graphical user interface (GUI) is a software application that encompasses several components of the calibration task and generates a calibration block, as well as a detailed report describing the specific calibration process used. This additional document can be used as a quality assurance certificate to assist in auditing the calibration. This paper consists of two sections. Section 1 describes the capabilities of Beowulf, and Section 2 presents a representative report generated for the 137Cs calibration and quality assurance source.

  15. GCL – An Easy Way for Creating Graphical User Interfaces

    Directory of Open Access Journals (Sweden)

    Mariusz Trzaska

    2011-02-01

    Full Text Available Graphical User Interfaces (GUIs) can be created using several approaches. Besides using visual editors or a manually written source code, it is possible to employ a declarative method. Such a solution usually allows working on a higher abstraction level, which saves the developers' time and reduces errors. The approach can follow many ideas. One of them is based on utilizing a Domain Specific Language (DSL). In this paper we present the results of our research concerning a DSL language called GCL (GUI Creating Language). The prototype is implemented as a library for Java with an API emulating the syntax and semantics of a DSL language. A programmer, using a few keywords, is able to create different types of GUIs, including forms, panels, dialogs, etc. The widgets of the GUI are built automatically during the run-time phase based on a given data instance (an ordinary Java object) and can optionally be customized by the programmer. The main contribution of our work is delivering a working library for a popular platform. The library could be easily ported to other programming languages such as MS C#.

  16. Optoelectronic polarimeter controlled by a graphical user interface of Matlab

    Science.gov (United States)

    Vilardy, J. M.; Jimenez, C. J.; Torres, R.

    2017-01-01

    We show the design and implementation of an optical polarimeter using electronic control. The polarimeter has software with a graphical user interface (GUI) that controls the optoelectronic setup and captures the optical intensity measurements, and finally, this software evaluates the Stokes vector of a state of polarization (SOP) by means of the synchronous detection of optical waves. The proposed optoelectronic polarimeter can determine the Stokes vector of a SOP in a rapid and efficient way. Using the polarimeter proposed in this paper, students will be able to observe (on an optical bench) and understand the different interactions of the SOP when the optical waves pass through linear polarizers and retarder wave plates. The polarimeter prototype could be used as a main tool for students in order to learn the theoretical and experimental aspects of the SOP for optical waves via the Stokes vector measurement. The proposed polarimeter controlled by a Matlab GUI is an attractive and suitable tool for teaching and learning the polarization of optical waves.

  17. Extending Graphic Statics for User-Controlled Structural Morphogenesis

    OpenAIRE

    Fivet, Corentin; Zastavni, Denis; Cap, Jean-François; Structural Morphology Group International Seminar 2011

    2011-01-01

    The first geometrical definitions of any structure are of primary importance when considering pertinence and efficiency in structural design processes. Engineering history has taught us how graphic statics can be a very powerful tool since it allows the designer to take shapes and forces into account simultaneously. However, current and past graphic statics methods are more suitable for analysis than structural morphogenesis. This contribution introduces new graphical methods that can supp...

  18. Graphical User Interface for Simulink Integrated Performance Analysis Model

    Science.gov (United States)

    Durham, R. Caitlyn

    2009-01-01

    The J-2X Engine (built by Pratt & Whitney Rocketdyne), in the Upper Stage of the Ares I Crew Launch Vehicle, will only start within a certain range of temperature and pressure for Liquid Hydrogen and Liquid Oxygen propellants. The purpose of the Simulink Integrated Performance Analysis Model is to verify that in all reasonable conditions the temperature and pressure of the propellants are within the required J-2X engine start boxes. In order to run the simulation, test variables must be entered at all reasonable values of parameters such as heat leak and mass flow rate. To make this testing process as efficient as possible, in order to save the maximum amount of time and money, and to show that the J-2X engine will start when it is required to do so, a graphical user interface (GUI) was created to allow the input of values to be used as parameters in the Simulink Model, without opening or altering the contents of the model. The GUI must allow for test data to come from Microsoft Excel files, allow those values to be edited before testing, place those values into the Simulink Model, and get the output from the Simulink Model. The GUI was built using MATLAB, and will run the Simulink simulation when the Simulate option is activated. After running the simulation, the GUI will construct a new Microsoft Excel file, as well as a MATLAB matrix file, using the output values from each test of the simulation so that they may be graphed and compared to other values.

  19. Simultaneous Optimization of Multiple Responses with the R Package JOP

    Directory of Open Access Journals (Sweden)

    Sonja Kuhnt

    2013-09-01

    Full Text Available A joint optimization plot (JOP for short) graphically displays the result of a loss-function-based robust parameter design for multiple responses. Different levels of importance for reaching a target value can be assigned to the individual responses by weights. The JOP method simultaneously runs through a whole range of possible weights. For each weight matrix a parameter setting is derived which minimizes the estimated expected loss. The joint optimization plot displays these settings together with the corresponding expected values and standard deviations of the response variables. The R package JOP provides all tools necessary to apply the JOP approach to a given data set. It also returns parameter settings for a desirable compromise of achieved expected responses chosen from the plot.

  20. Using R in Introductory Statistics Courses with the pmg Graphical User Interface

    Science.gov (United States)

    Verzani, John

    2008-01-01

    The pmg add-on package for the open source statistics software R is described. This package provides a simple to use graphical user interface (GUI) that allows introductory statistics students, without advanced computing skills, to quickly create the graphical and numeric summaries expected of them. (Contains 9 figures.)

  1. smwrData—An R package of example hydrologic data, version 1.1.1

    Science.gov (United States)

    Lorenz, David L.

    2015-11-06

    A collection of 24 datasets, including streamflow, well characteristics, groundwater elevations, and discrete water-quality concentrations, is provided to produce a consistent set of example data to demonstrate typical data manipulations or statistical analysis of hydrologic data. These example data are provided in an R package called smwrData. The data in the package have been collected by the U.S. Geological Survey or published in its reports, for example Helsel and Hirsch (2002). The R package provides a convenient mechanism for distributing the data to users of R within the U.S. Geological Survey and other users in the R community.

  2. Phxnlme: An R package that facilitates pharmacometric workflow of Phoenix NLME analyses.

    Science.gov (United States)

    Lim, Chay Ngee; Liang, Shuang; Feng, Kevin; Chittenden, Jason; Henry, Ana; Mouksassi, Samer; Birnbaum, Angela K

    2017-03-01

    Pharmacometric analyses are integral components of the drug development process, and Phoenix NLME is one of the popular software packages used to conduct such analyses. To address current limitations with model diagnostic graphics and efficiency of the workflow for this software, we developed an R package, Phxnlme, to facilitate its workflow and provide improved graphical diagnostics. Phxnlme was designed to provide functionality for the major tasks that are usually performed in pharmacometric analyses (i.e., nonlinear mixed effects modeling, basic model diagnostics, visual predictive checks and bootstrap). Various estimation methods for modeling using the R package are made available through the Phoenix NLME engine. The Phxnlme R package utilizes other packages such as ggplot2 and lattice to produce the graphical output, and various features were included to allow customizability of the output. Interactive features for some plots were also added using the manipulate R package. Phxnlme provides enhanced capabilities for nonlinear mixed effects modeling that can be accessed using the phxnlme() command. Output from the model can be graphed to assess the adequacy of model fits and to further explore relationships in the data using various functions included in this R package, such as phxplot() and phxvpc.plot(). Bootstraps, stratified by up to three variables, can also be performed to obtain confidence intervals around the model estimates. With the use of an R interface, different R projects can be created to allow multi-tasking, which addresses a current limitation of the Phoenix NLME desktop software. In addition, there is a wide selection of diagnostic and exploratory plots in the Phxnlme package, with improvements in the customizability of plots compared to Phoenix NLME. The Phxnlme package is a flexible tool that allows implementation of the analytical workflow of Phoenix NLME with R, with features for greater overall efficiency and improved customizable graphics. Phxnlme is

  4. ada: An R Package for Stochastic Boosting

    Directory of Open Access Journals (Sweden)

    Mark Culp

    2006-09-01

    Full Text Available Boosting is an iterative algorithm that combines simple classification rules with "mediocre" performance in terms of misclassification error rate to produce a highly accurate classification rule. Stochastic gradient boosting provides an enhancement which incorporates a random mechanism at each boosting step showing an improvement in performance and speed in generating the ensemble. ada is an R package that implements three popular variants of boosting, together with a version of stochastic gradient boosting. In addition, useful plots for data analytic purposes are provided along with an extension to the multi-class case. The algorithms are illustrated with synthetic and real data sets.
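
    As a brief illustration of the workflow the abstract describes, the hedged sketch below fits a discrete AdaBoost ensemble with the package's ada() formula interface on simulated two-class data; the data set, the number of iterations and the variant chosen are illustrative assumptions, not values from the paper.

      # Hedged sketch (simulated data, illustrative settings): discrete AdaBoost
      # via the ada() formula interface described in the abstract.
      library(ada)

      set.seed(1)
      n   <- 200
      x1  <- rnorm(n)
      x2  <- rnorm(n)
      y   <- factor(ifelse(x1 + x2 + rnorm(n, sd = 0.5) > 0, "pos", "neg"))
      dat <- data.frame(y, x1, x2)

      # 'iter' is the number of boosting iterations; type = "discrete" selects
      # the discrete AdaBoost variant.
      fit <- ada(y ~ ., data = dat, iter = 50, type = "discrete")
      print(fit)   # training summary, including error rates
      plot(fit)    # error as a function of the boosting iteration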

  5. An Improved User Interface for an Interactive Graphics Figure Illustrator.

    Science.gov (United States)

    1987-06-01

    user to be knowledgeable of the local text editor on the computer system being used. Additionally, the user needs acceptable typing skills to reduce ...

  6. The Langevin Approach: An R Package for Modeling Markov Processes

    Directory of Open Access Journals (Sweden)

    Philip Rinn

    2016-08-01

    Full Text Available We describe an R package developed by the research group Turbulence, Wind energy and Stochastics (TWiSt) at the Carl von Ossietzky University of Oldenburg, which extracts the (stochastic) evolution equation underlying a set of data or measurements. The method can be directly applied to data sets with one or two stochastic variables. Examples for the one-dimensional and two-dimensional cases are provided. This framework is valid under a small set of conditions which are explicitly presented and which imply simple preliminary test procedures to the data. For Markovian processes involving Gaussian white noise, a stochastic differential equation is derived straightforwardly from the time series and captures the full dynamical properties of the underlying process. Still, even in the case such conditions are not fulfilled, there are alternative versions of this method which we discuss briefly and provide the user with the necessary bibliography.
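
    To make the underlying idea concrete, the plain base-R sketch below estimates drift and diffusion functions from the conditional moments of the increments of a simulated Ornstein-Uhlenbeck series; it illustrates the general Langevin/Kramers-Moyal approach only, does not use the package's own interface, and the bin number and time step are arbitrary choices.

      # Plain base-R sketch of the Langevin approach (not the package interface):
      # estimate drift D1(x) and diffusion D2(x) from conditional moments.
      set.seed(42)
      dt <- 0.01
      N  <- 1e5
      x  <- numeric(N)
      for (i in 2:N)                              # Ornstein-Uhlenbeck test series
        x[i] <- x[i - 1] - x[i - 1] * dt + sqrt(dt) * rnorm(1)

      bins <- cut(x[-N], breaks = 50)             # condition on the current state
      dx   <- diff(x)
      D1 <- tapply(dx,   bins, mean) / dt         # drift     ~ <dx   | x> / dt
      D2 <- tapply(dx^2, bins, mean) / (2 * dt)   # diffusion ~ <dx^2 | x> / (2 dt)

      plot(D1, type = "b", xlab = "state bin", ylab = "estimated drift")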

  7. Bayesian cost-effectiveness analysis with the R package BCEA

    CERN Document Server

    Baio, Gianluca; Heath, Anna

    2017-01-01

    The book provides a description of the process of health economic evaluation and modelling for cost-effectiveness analysis, particularly from the perspective of a Bayesian statistical approach. Some relevant theory and introductory concepts are presented using practical examples and two running case studies. The book also describes in detail how to perform health economic evaluations using the R package BCEA (Bayesian Cost-Effectiveness Analysis). BCEA can be used to post-process the results of a Bayesian cost-effectiveness model and perform advanced analyses producing standardised and highly customisable outputs. It presents all the features of the package, including its many functions and their practical application, as well as its user-friendly web interface. The book is a valuable resource for statisticians and practitioners working in the field of health economics wanting to simplify and standardise their workflow, for example in the preparation of dossiers in support of marketing authorisation, or acade...

  8. The Langevin Approach: An R Package for Modeling Markov Processes

    CERN Document Server

    Rinn, Philip; Wächter, Matthias; Peinke, Joachim

    2016-01-01

    We describe an R package developed by the research group Turbulence, Wind energy and Stochastics (TWiSt) at the Carl von Ossietzky University of Oldenburg, which extracts the (stochastic) evolution equation underlying a set of data or measurements. The method can be directly applied to data sets with one or two stochastic variables. Examples for the one-dimensional and two-dimensional cases are provided. This framework is valid under a small set of conditions which are explicitly presented and which imply simple preliminary test procedures to the data. For Markovian processes involving Gaussian white noise, a stochastic differential equation is derived straightforwardly from the time series and captures the full dynamical properties of the underlying process. Still, even in the case such conditions are not fulfilled, there are alternative versions of this method which we discuss briefly and provide the user with the necessary bibliography.

  9. Easyplot: An Interactive, User-Friendly Graphics Program.

    Science.gov (United States)

    1984-09-01

    EASYPLOT: An Interactive, User-Friendly Graphics Program (Naval Postgraduate School, Monterey). ... (standard and log) and three-dimensional (standard). Graph correction and alteration are possible with a minimum of effort. The program was designed for the ... through the design process to produce the desired graph. The program is completely interactive. It prompts the user for all the necessary ...

  10. The art of visualising dose distributions: Improved plotting flexibility for the R-package 'Luminescence'

    Science.gov (United States)

    Dietze, Michael; Kreutzer, Sebastian; Burow, Christoph; Fuchs, Margret; Fischer, Manfred; Schmidt, Christoph

    2014-05-01

    Luminescence dating profoundly relies on the compelling presentation of equivalent doses. However, there is no perfect way to depict equivalent dose distributions with all their measures of uncertainty. Amongst others, the most common approaches are the Radial Plot and kernel density estimate (KDE) graphs. Both plot types are supported by the R-package 'Luminescence', a comprehensive and flexible compilation of functions for convenient analysis and presentation of luminescence dating data. In its upcoming version, the package comprises updated versions of these two most popular plot functions to allow the user sound control over a wide variety of graphical parameters. Furthermore, a new plot type is added: the Abanico Plot (plot_AbanicoPlot()). It combines the strengths of both the classic Radial Plot and a KDE plot. Our contribution will show all updated data visualisation approaches and provide a quick guide (workflow chart) on how to get from measurement data to high-quality dose distribution plots. It may serve to raise further discussions about the package in general and specific plot approaches in particular.
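
    A minimal call might look like the sketch below, which assumes (as an illustration, with invented values) that plot_AbanicoPlot() accepts a two-column data frame of equivalent doses and their standard errors; consult the package documentation for the full set of graphical parameters mentioned above.

      # Hedged sketch with invented equivalent-dose data (De and its error in Gy).
      library(Luminescence)

      set.seed(2)
      de <- data.frame(
        De       = rnorm(40, mean = 65, sd = 7),   # equivalent doses
        De_error = runif(40, min = 2, max = 5)     # associated errors
      )

      plot_AbanicoPlot(data = de)                  # Abanico Plot of the dose distribution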

  11. GUIdock: Using Docker Containers with a Common Graphics User Interface to Address the Reproducibility of Research.

    Directory of Open Access Journals (Sweden)

    Ling-Hong Hung

    Full Text Available Reproducibility is vital in science. For complex computational methods, it is often necessary to recreate not just the code, but also the software and hardware environment, to reproduce results. Virtual machines, and container software such as Docker, make it possible to reproduce the exact environment regardless of the underlying hardware and operating system. However, workflows that use Graphical User Interfaces (GUIs) remain difficult to replicate on different host systems as there is no high-level graphical software layer common to all platforms. GUIdock allows for the facile distribution of a systems biology application along with its graphics environment. Complex graphics-based workflows, ubiquitous in systems biology, can now be easily exported and reproduced on many different platforms. GUIdock uses Docker, an open source project that provides a container with only the absolutely necessary software dependencies, and configures a common X Windows (X11) graphic interface on Linux, Macintosh and Windows platforms. As proof of concept, we present a Docker package that contains a Bioconductor application written in R and C++ called networkBMA for gene network inference. Our package also includes Cytoscape, a Java-based platform with a graphical user interface for visualizing and analyzing gene networks, and the CyNetworkBMA app, a Cytoscape app that allows the use of networkBMA via the user-friendly Cytoscape interface.

  12. Application of the AMBUR R package for spatio-temporal analysis of shoreline change: Jekyll Island, Georgia, USA

    Science.gov (United States)

    Jackson, Chester W.; Alexander, Clark R.; Bush, David M.

    2012-04-01

    The AMBUR (Analyzing Moving Boundaries Using R) package for the R software environment provides a collection of functions for assisting with analyzing and visualizing historical shoreline change. The package allows import and export of geospatial data in ESRI shapefile format, which is compatible with most commercial and open-source GIS software. The "baseline and transect" method is the primary technique used to quantify distances and rates of shoreline movement, and to detect classification changes across time. Along with the traditional "perpendicular" transect method, two new transect methods, "near" and "filtered," assist with quantifying changes along curved shorelines that are problematic for perpendicular transect methods. Output from the analyses includes data tables, graphics, and geospatial data, which are useful in rapidly assessing trends and potential errors in the dataset. A forecasting function also allows the user to estimate the future location of the shoreline and store the results in a shapefile. Other utilities and tools provided in the package assist with preparing and manipulating geospatial data, error checking, and generating supporting graphics and shapefiles. The package can be customized to perform additional statistical, graphical, and geospatial functions, and it is capable of analyzing the movement of any boundary (e.g., shorelines, glacier termini, fire edges, and marine and terrestrial ecozones).

  13. REPPlab: An R package for detecting clusters and outliers using exploratory projection pursuit

    OpenAIRE

    Fischer, Daniel; Berro, Alain; Nordhausen, Klaus; Ruiz-Gazen, Anne

    2016-01-01

    The R-package REPPlab is designed to explore multivariate data sets using one-dimensional unsupervised projection pursuit. It is useful in practice as a preprocessing step to find clusters or as an outlier detection tool for multivariate numerical data. Apart from the package tourr, which implements smooth sequences of projection matrices, and rggobi, which provides an interface to a dynamic graphics package called GGobi, there is no implementation of exploratory projection pursuit tools availabl...

  14. smwrGraphs—An R package for graphing hydrologic data, version 1.1.2

    Science.gov (United States)

    Lorenz, David L.; Diekoff, Aliesha L.

    2017-01-31

    This report describes an R package called smwrGraphs, which consists of a collection of graphing functions for hydrologic data within R, a programming language and software environment for statistical computing. The functions in the package have been developed by the U.S. Geological Survey to create high-quality graphs for publication or presentation of hydrologic data that meet U.S. Geological Survey graphics guidelines.

  15. VMSbase: an R-package for VMS and logbook data management and analysis in fisheries ecology.

    Directory of Open Access Journals (Sweden)

    Tommaso Russo

    Full Text Available VMSbase is an R package devised to manage, process and visualize information about fishing vessels activity (provided by the vessel monitoring system--VMS) and catches/landings (as reported in the logbooks). VMSbase is primarily conceived to be user-friendly; to this end, a suite of state-of-the-art analyses is accessible via a graphical interface. In addition, the package uses a database platform allowing large datasets to be stored, managed and processed very efficiently. Methodologies include data cleaning, that is removal of redundant or evidently erroneous records, and data enhancing, that is interpolation and merging with external data sources. In particular, VMSbase is able to estimate sea bottom depth for single VMS pings using an on-line connection to the National Oceanic and Atmospheric Administration (NOAA) database. It also allows VMS pings to be assigned to whatever geographic partitioning has been selected by users. Standard analyses comprise: 1) métier identification (using a modified CLARA clustering approach on Logbook data or Artificial Neural Networks on VMS data); 2) linkage between VMS and Logbook records, with the former organized into fishing trips; 3) discrimination between steaming and fishing points; 4) computation of spatial effort with respect to user-selected grids; 5) calculation of standard fishing effort indicators within the Data Collection Framework; 6) a variety of mapping tools, including an interface for Google viewer; 7) estimation of trawled area. Here we report a sample workflow for the accessory sample datasets (available with the package) in order to explore the potentialities of VMSbase. In addition, the results of some performance tests on two large datasets (1×10(5) and 1×10(6) VMS signals, respectively) are reported to inform about the time required for the analyses. The results, although merely illustrative, indicate that VMSbase can represent a step forward in extracting and enhancing information from VMS

  16. VMSbase: an R-package for VMS and logbook data management and analysis in fisheries ecology.

    Science.gov (United States)

    Russo, Tommaso; D'Andrea, Lorenzo; Parisi, Antonio; Cataudella, Stefano

    2014-01-01

    VMSbase is an R package devised to manage, process and visualize information about fishing vessels activity (provided by the vessel monitoring system--VMS) and catches/landings (as reported in the logbooks). VMSbase is primarily conceived to be user-friendly; to this end, a suite of state-of-the-art analyses is accessible via a graphical interface. In addition, the package uses a database platform allowing large datasets to be stored, managed and processed very efficiently. Methodologies include data cleaning, that is removal of redundant or evidently erroneous records, and data enhancing, that is interpolation and merging with external data sources. In particular, VMSbase is able to estimate sea bottom depth for single VMS pings using an on-line connection to the National Oceanic and Atmospheric Administration (NOAA) database. It also allows VMS pings to be assigned to whatever geographic partitioning has been selected by users. Standard analyses comprise: 1) métier identification (using a modified CLARA clustering approach on Logbook data or Artificial Neural Networks on VMS data); 2) linkage between VMS and Logbook records, with the former organized into fishing trips; 3) discrimination between steaming and fishing points; 4) computation of spatial effort with respect to user-selected grids; 5) calculation of standard fishing effort indicators within Data Collection Framework; 6) a variety of mapping tools, including an interface for Google viewer; 7) estimation of trawled area. Here we report a sample workflow for the accessory sample datasets (available with the package) in order to explore the potentialities of VMSbase. In addition, the results of some performance tests on two large datasets (1×10(5) and 1×10(6) VMS signals, respectively) are reported to inform about the time required for the analyses. The results, although merely illustrative, indicate that VMSbase can represent a step forward in extracting and enhancing information from VMS/logbook data

  17. Java-based Graphical User Interface for MAVERIC-II

    Science.gov (United States)

    Seo, Suk Jai

    2005-01-01

    A computer program entitled "Marshall Aerospace Vehicle Representation in C II (MAVERIC-II)" is a vehicle flight simulation program written primarily in the C programming language. It is written by James W. McCarter at NASA/Marshall Space Flight Center. The goal of the MAVERIC-II development effort is to provide a simulation tool that facilitates the rapid development of high-fidelity flight simulations for launch, orbital, and reentry vehicles of any user-defined configuration for all phases of flight. MAVERIC-II has been found invaluable in performing flight simulations for various Space Transportation Systems. The flexibility provided by MAVERIC-II has allowed several different launch vehicles, including the Saturn V, a Space Launch Initiative Two-Stage-to-Orbit concept and a Shuttle-derived launch vehicle, to be simulated during ascent and portions of on-orbit flight in an extremely efficient manner. It was found that MAVERIC-II provided the high-fidelity vehicle and flight environment models as well as the program modularity to allow efficient integration, modification and testing of advanced guidance and control algorithms. In addition to serving as an analysis tool for technology development, many researchers have found MAVERIC-II to be an efficient, powerful analysis tool that evaluates guidance, navigation, and control designs, vehicle robustness, and requirements. MAVERIC-II is currently designed to execute in a UNIX environment. The input to the program is composed of three segments: 1) the vehicle models such as propulsion, aerodynamics, and guidance, navigation, and control, 2) the environment models such as atmosphere and gravity, and 3) a simulation framework which is responsible for executing the vehicle and environment models, propagating the vehicle's states forward in time, and handling user input/output. MAVERIC users prepare data files for the above models and run the simulation program. They can see the output on screen and/or store in

  18. DGCA: A comprehensive R package for Differential Gene Correlation Analysis.

    Science.gov (United States)

    McKenzie, Andrew T; Katsyv, Igor; Song, Won-Min; Wang, Minghui; Zhang, Bin

    2016-11-15

    Dissecting the regulatory relationships between genes is a critical step towards building accurate predictive models of biological systems. A powerful approach towards this end is to systematically study the differences in correlation between gene pairs in more than one distinct condition. In this study we develop an R package, DGCA (for Differential Gene Correlation Analysis), which offers a suite of tools for computing and analyzing differential correlations between gene pairs across multiple conditions. To minimize parametric assumptions, DGCA computes empirical p-values via permutation testing. To understand differential correlations at a systems level, DGCA performs higher-order analyses such as measuring the average difference in correlation and multiscale clustering analysis of differential correlation networks. Through a simulation study, we show that the straightforward z-score based method that DGCA employs significantly outperforms the existing alternative methods for calculating differential correlation. Application of DGCA to the TCGA RNA-seq data in breast cancer not only identifies key changes in the regulatory relationships between TP53 and PTEN and their target genes in the presence of inactivating mutations, but also reveals an immune-related differential correlation module that is specific to triple negative breast cancer (TNBC). DGCA is an R package for systematically assessing the difference in gene-gene regulatory relationships under different conditions. This user-friendly, effective, and comprehensive software tool will greatly facilitate the application of differential correlation analysis in many biological studies and thus will help identification of novel signaling pathways, biomarkers, and targets in complex biological systems and diseases.
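
    The z-score comparison that the abstract refers to can be sketched in plain R as below; this illustrates the statistic for a single gene pair using Fisher's transformation on simulated data, whereas DGCA itself computes empirical p-values by permutation and scales the analysis to many pairs.

      # Plain-R illustration of a differential-correlation z-score for one gene
      # pair across two conditions (simulated data, parametric p-value).
      set.seed(3)
      n1 <- 60; n2 <- 60
      geneA_c1 <- rnorm(n1); geneB_c1 <- 0.8 * geneA_c1 + rnorm(n1, sd = 0.4)  # correlated
      geneA_c2 <- rnorm(n2); geneB_c2 <- rnorm(n2)                             # uncorrelated

      r1 <- cor(geneA_c1, geneB_c1)
      r2 <- cor(geneA_c2, geneB_c2)

      # Fisher z-transform each correlation, then form a difference z-score.
      z_diff <- (atanh(r1) - atanh(r2)) / sqrt(1 / (n1 - 3) + 1 / (n2 - 3))
      p_val  <- 2 * pnorm(-abs(z_diff))            # two-sided normal p-value
      c(r1 = r1, r2 = r2, z = z_diff, p = p_val)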

  19. Graphical User Interface for Simplified Neutron Transport Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Schwarz, Randolph; Carter, Leland L

    2011-07-18

    A number of codes perform simple photon physics calculations. The nuclear industry is lacking in similar tools to perform simplified neutron physics shielding calculations. With the increased importance of performing neutron calculations for homeland security applications and defense nuclear nonproliferation tasks, having an efficient method for performing simple neutron transport calculations becomes increasingly important. Codes such as Monte Carlo N-particle (MCNP) can perform the transport calculations; however, the technical details in setting up, running, and interpreting the required simulations are quite complex and typically go beyond the abilities of most users who need a simple answer to a neutron transport calculation. The work documented in this report resulted in the development of the NucWiz program, which can create an MCNP input file for a set of simple geometries, source, and detector configurations. The user selects source, shield, and tally configurations from a set of pre-defined lists, and the software creates a complete MCNP input file that can be optionally run and the results viewed inside NucWiz.

  20. mediation: R Package for Causal Mediation Analysis

    Directory of Open Access Journals (Sweden)

    Dustin Tingley

    2014-09-01

    Full Text Available In this paper, we describe the R package mediation for conducting causal mediation analysis in applied empirical research. In many scientific disciplines, the goal of researchers is not only estimating causal effects of a treatment but also understanding the process in which the treatment causally affects the outcome. Causal mediation analysis is frequently used to assess potential causal mechanisms. The mediation package implements a comprehensive suite of statistical tools for conducting such an analysis. The package is organized into two distinct approaches. Using the model-based approach, researchers can estimate causal mediation effects and conduct sensitivity analysis under the standard research design. Furthermore, the design-based approach provides several analysis tools that are applicable under different experimental designs. This approach requires weaker assumptions than the model-based approach. We also implement a statistical method for dealing with multiple (causally dependent) mediators, which are often encountered in practice. Finally, the package also offers a methodology for assessing causal mediation in the presence of treatment noncompliance, a common problem in randomized trials.
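
    A minimal model-based analysis along the lines described above is sketched below on simulated data; the variable names, effect sizes and the small number of simulations are illustrative assumptions only.

      # Hedged sketch: treatment 'treat' affects mediator 'med', which affects
      # outcome 'y' (all simulated); mediate() combines the two fitted models.
      library(mediation)

      set.seed(4)
      n     <- 200
      treat <- rbinom(n, 1, 0.5)
      med   <- 0.6 * treat + rnorm(n)
      y     <- 0.4 * med + 0.3 * treat + rnorm(n)
      dat   <- data.frame(y, med, treat)

      m.fit <- lm(med ~ treat, data = dat)         # mediator model
      y.fit <- lm(y ~ med + treat, data = dat)     # outcome model

      med.out <- mediate(m.fit, y.fit, treat = "treat", mediator = "med",
                         sims = 200)               # quasi-Bayesian simulations
      summary(med.out)   # ACME, ADE, total effect, proportion mediated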

  1. A modular approach for item response theory modeling with the R package flirt.

    Science.gov (United States)

    Jeon, Minjeong; Rijmen, Frank

    2016-06-01

    The new R package flirt is introduced for flexible item response theory (IRT) modeling of psychological, educational, and behavior assessment data. flirt integrates a generalized linear and nonlinear mixed modeling framework with graphical model theory. The graphical model framework allows for efficient maximum likelihood estimation. The key feature of flirt is its modular approach to facilitate convenient and flexible model specifications. Researchers can construct customized IRT models by simply selecting various modeling modules, such as parametric forms, number of dimensions, item and person covariates, person groups, link functions, etc. In this paper, we describe major features of flirt and provide examples to illustrate how flirt works in practice.

  2. SDD: An R Package for Serial Dependence Diagrams

    Directory of Open Access Journals (Sweden)

    Luca Bagnato

    2015-03-01

    Full Text Available Detecting and measuring lag-dependencies is very important in time-series analysis. This study is commonly carried out by focusing on the linear lag-dependencies via the well-known autocorrelogram. However, in practice, there are many situations in which the autocorrelogram fails because of the nonlinear structure of the serial dependence. To cope with this problem, in this paper the R package SDD is introduced. Among the available approaches to analyze the lag-dependencies in an omnibus way, the SDD package considers the autodependogram and some of its variants. The autodependogram, defined by computing the classical Pearson χ2-statistic at various lags, is a graphical device recently proposed in the literature to analyze lag-dependencies. The concept of reproducibility probability, and several density-based measures of divergence, are considered to define the variants of the autodependogram. An application to daily returns of the Swiss Market Index is also presented to exemplify the use of the package.
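
    The idea behind the autodependogram can be sketched in a few lines of plain R, as below: for each lag, the series and its lagged copy are binned and a Pearson chi-squared statistic of independence is computed. The bin count and lag range are arbitrary choices, and the SDD package's own functions and variants are not used here.

      # Plain-R sketch of an autodependogram-style statistic (not the SDD interface).
      set.seed(5)
      x <- arima.sim(model = list(ar = 0.5), n = 500)   # example AR(1) series

      autodep <- function(x, max.lag = 20, nbins = 5) {
        brks <- quantile(x, probs = seq(0, 1, length.out = nbins + 1))
        b    <- cut(as.numeric(x), breaks = brks, include.lowest = TRUE)
        sapply(seq_len(max.lag), function(k) {
          n <- length(x) - k
          suppressWarnings(
            chisq.test(table(b[1:n], b[(k + 1):(k + n)]))$statistic
          )
        })
      }

      plot(autodep(x), type = "h", xlab = "lag", ylab = "Pearson chi-squared statistic")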

  3. Teaching Photovoltaic Array Modelling and Characterization Using a Graphical User Interface and a Flash Solar Simulator

    DEFF Research Database (Denmark)

    Spataru, Sergiu; Sera, Dezso; Kerekes, Tamas

    2012-01-01

    This paper presents a set of laboratory tools aimed to support students with various backgrounds (no programming) to understand photovoltaic array modelling and characterization techniques. A graphical user interface (GUI) has been developed in Matlab for modelling PV arrays and characterizing...

  4. Graphical User Interface Development and Design to Support Airport Runway Configuration Management

    Science.gov (United States)

    Jones, Debra G.; Lenox, Michelle; Onal, Emrah; Latorella, Kara A.; Lohr, Gary W.; Le Vie, Lisa

    2015-01-01

    The objective of this effort was to develop a graphical user interface (GUI) for the National Aeronautics and Space Administration's (NASA) System Oriented Runway Management (SORM) decision support tool to support runway management. This tool is expected to be used by traffic flow managers and supervisors in the Airport Traffic Control Tower (ATCT) and Terminal Radar Approach Control (TRACON) facilities.

  5. Guidance from the Graphical User Interface (GUI) Experience: What GUI Teaches about Technology Access.

    Science.gov (United States)

    National Council on Disability, Washington, DC.

    This report investigates the use of the graphical user interface (GUI) in computer programs, the problems it creates for individuals with visual impairments or blindness, and advocacy efforts concerning this issue, which have been targeted primarily at Microsoft, producer of Windows. The report highlights the concerns of individuals with visual…

  6. tmle : An R Package for Targeted Maximum Likelihood Estimation

    Directory of Open Access Journals (Sweden)

    Susan Gruber

    2012-11-01

    Full Text Available Targeted maximum likelihood estimation (TMLE) is a general approach for constructing an efficient double-robust semi-parametric substitution estimator of a causal effect parameter or statistical association measure. tmle is a recently developed R package that implements TMLE of the effect of a binary treatment at a single point in time on an outcome of interest, controlling for user-supplied covariates, including an additive treatment effect, relative risk, odds ratio, and the controlled direct effect of a binary treatment controlling for a binary intermediate variable on the pathway from treatment to the outcome. Estimation of the parameters of a marginal structural model is also available. The package allows outcome data with missingness, and experimental units that contribute repeated records of the point-treatment data structure, thereby allowing the analysis of longitudinal data structures. Relevant factors of the likelihood may be modeled or fit data-adaptively according to user specifications, or passed in from an external estimation procedure. Effect estimates, variances, p values, and 95% confidence intervals are provided by the software.
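
    A minimal point-treatment analysis of the kind described is sketched below on simulated data; the covariate structure and effect sizes are invented, and the call relies on the package's default machine-learning libraries for the nuisance fits.

      # Hedged sketch: additive effect of binary treatment A on outcome Y,
      # adjusting for baseline covariates W (all simulated).
      library(tmle)

      set.seed(6)
      n <- 500
      W <- data.frame(W1 = rnorm(n), W2 = rbinom(n, 1, 0.5))
      A <- rbinom(n, 1, plogis(0.4 * W$W1))
      Y <- 1 + A + W$W1 + 0.5 * W$W2 + rnorm(n)

      fit <- tmle(Y = Y, A = A, W = W, family = "gaussian")
      fit$estimates$ATE   # additive treatment effect with variance and 95% CI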

  7. A Prototype Lisp-Based Soft Real-Time Object-Oriented Graphical User Interface for Control System Development

    Science.gov (United States)

    Litt, Jonathan; Wong, Edmond; Simon, Donald L.

    1994-01-01

    A prototype Lisp-based soft real-time object-oriented Graphical User Interface for control system development is presented. The Graphical User Interface executes alongside a test system in laboratory conditions to permit observation of the closed loop operation through animation, graphics, and text. Since it must perform interactive graphics while updating the screen in real time, techniques are discussed which allow quick, efficient data processing and animation. Examples from an implementation are included to demonstrate some typical functionalities which allow the user to follow the control system's operation.

  8. Database Graphic User Interface correspondence with Ellis Information Seeking behavior Model

    Directory of Open Access Journals (Sweden)

    Muhammad Azami

    2010-03-01

    Full Text Available A graphical user interface serves as a bridge between humans and databases. Its primary purpose is to assist users by establishing interaction with computer systems. Database user interface designers have seldom focused on the impact of users' information-seeking behaviors on database user interface structures. Therefore, it is crucial to incorporate user information-seeking behavior within database software design, as well as to analyze its impact on the upgrade and optimization of the user interface environment. The present study intends to determine the degree of correspondence of database interfaces with the information-seeking behavioral components of Ellis' model. The components studied were starting, chaining, browsing, differentiating, monitoring and extracting. The investigators employed a direct observation method, using a checklist, in order to see how well the database interfaces support these components. Results indicated that the information-seeking behavior components outlined by Ellis' model are not fully considered in database user interface design. Some of the components, such as starting, chaining and differentiating, were to some extent supported by some of the database user interfaces studied. However, elements such as browsing, monitoring and extracting have not been incorporated within the user interface structures of these databases. On the whole, the degree of correspondence and correlation of database user interfaces with Ellis' information-seeking components is about average. Therefore, incorporating these elements in the design and evaluation of the user interface environment could have a high impact on better optimization of the database interface environment and consequently the very process of search and retrieval.

  9. Spatial issues in user interface design from a graphic design perspective

    Science.gov (United States)

    Marcus, Aaron

    1989-01-01

    The user interface of a computer system is a visual display that provides information about the status of operations on data within the computer and control options to the user that enable adjustments to these operations. From the very beginning of computer technology the user interface was a spatial display, although its spatial features were not necessarily complex or explicitly recognized by the users. All text and nonverbal signs appeared in a virtual space generally thought of as a single flat plane of symbols. Current technology of high performance workstations permits any element of the display to appear as dynamic, multicolor, 3-D signs in a virtual 3-D space. The complexity of appearance and the user's interaction with the display provide significant challenges to the graphic designer of current and future user interfaces. In particular, spatial depiction provides many opportunities for effective communication of objects, structures, processes, navigation, selection, and manipulation. Issues are presented that are relevant to the graphic designer seeking to optimize the user interface's spatial attributes for effective visual communication.

  10. Prototyping the graphical user interface for the operator of the Cherenkov Telescope Array

    Science.gov (United States)

    Sadeh, I.; Oya, I.; Schwarz, J.; Pietriga, E.

    2016-07-01

    The Cherenkov Telescope Array (CTA) is a planned gamma-ray observatory. CTA will incorporate about 100 imaging atmospheric Cherenkov telescopes (IACTs) at a Southern site, and about 20 in the North. Previous IACT experiments have used up to five telescopes. Subsequently, the design of a graphical user interface (GUI) for the operator of CTA involves new challenges. We present a GUI prototype, the concept for which is being developed in collaboration with experts from the field of Human-Computer Interaction (HCI). The prototype is based on Web technology; it incorporates a Python web server, Web Sockets and graphics generated with the d3.js Javascript library.

  11. Prototyping the graphical user interface for the operator of the Cherenkov Telescope Array

    CERN Document Server

    Sadeh, Iftach; Schwarz, Joseph; Pietriga, Emmanuel

    2016-01-01

    The Cherenkov Telescope Array (CTA) is a planned gamma-ray observatory. CTA will incorporate about 100 imaging atmospheric Cherenkov telescopes (IACTs) at a Southern site, and about 20 in the North. Previous IACT experiments have used up to five telescopes. Subsequently, the design of a graphical user interface (GUI) for the operator of CTA involves new challenges. We present a GUI prototype, the concept for which is being developed in collaboration with experts from the field of Human-Computer Interaction. The prototype is based on Web technology; it incorporates a Python web server, Web Sockets and graphics generated with the d3.js Javascript library.

  12. BGRAPH -- A Program for Biplot Multivariate Graphics. Version 1. User’s Guide.

    Science.gov (United States)

    1981-09-01

    BGRAPH -- A Program for Biplot Multivariate Graphics. Version 1: User's Guide. Michael C. Tsianco, Charles L. Odoroff, Sandra Plumb, and K... (University of Rochester, Division of Biostatistics, September 1981). Contents include: Summary of BGRAPH Commands; 2.0 Introduction; 2.1 A Simple Example: A Biplot and its BGRAPH Display; 2.2 Description of BGRAPH -- How to Invoke BGRAPH on

  13. An Overview on R Packages for Seasonal Analysis of Time Series

    Directory of Open Access Journals (Sweden)

    Haibin Qiu

    2014-05-01

    Full Text Available Time series analysis consists of approaches for analysing time series data so that important information and other features can be isolated from the data. Time series forecasting is the use of a model to predict prospective values on the basis of previously observed values. Statisticians generally use the R project, or R language, a free and popular programming language and software environment for statistical computing and graphics, for developing statistical software and for data analysis. Plenty of time series display cyclic variation known as seasonality, periodic variation, or periodic fluctuations. This study introduces the functions in the R packages TSA, mar1s, deseasonalize and season for analysing seasonal processes in time series. Note that the R packages mar1s, deseasonalize and season are included in the comprehensive R archive network (CRAN) task view TimeSeries.
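
    Independently of the specific packages reviewed, the base-R sketch below shows the kind of seasonal analysis in question: a classic monthly series is decomposed into trend, seasonal and remainder components with stl().

      # Base-R illustration of seasonal decomposition (not tied to the reviewed packages).
      data(AirPassengers)                          # classic monthly example series
      fit <- stl(log(AirPassengers), s.window = "periodic")
      plot(fit)                                    # trend / seasonal / remainder panels
      head(fit$time.series)                        # the three decomposed components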

  14. TopKLists: a comprehensive R package for statistical inference, stochastic aggregation, and visualization of multiple omics ranked lists.

    Science.gov (United States)

    Schimek, Michael G; Budinská, Eva; Kugler, Karl G; Švendová, Vendula; Ding, Jie; Lin, Shili

    2015-06-01

    High-throughput sequencing techniques are increasingly affordable and produce massive amounts of data. Together with other high-throughput technologies, such as microarrays, there is an enormous amount of resources in databases. The collection of these valuable data has been routine for more than a decade. Despite different technologies, many experiments share the same goal. For instance, the aims of RNA-seq studies often coincide with those of differential gene expression experiments based on microarrays. As such, it would be logical to utilize all available data. However, there is a lack of biostatistical tools for the integration of results obtained from different technologies. Although diverse technological platforms produce different raw data, one commonality for experiments with the same goal is that all the outcomes can be transformed into a platform-independent data format - rankings - for the same set of items. Here we present the R package TopKLists, which allows for statistical inference on the lengths of informative (top-k) partial lists, for stochastic aggregation of full or partial lists, and for graphical exploration of the input and consolidated output. A graphical user interface has also been implemented for providing access to the underlying algorithms. To illustrate the applicability and usefulness of the package, we integrated microRNA data of non-small cell lung cancer across different measurement techniques and drew conclusions. The package can be obtained from CRAN under a LGPL-3 license.
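
    The platform-independent ranking idea can be illustrated with the plain-R sketch below, in which three hypothetical platforms rank the same genes and a simple Borda-type aggregation by mean rank is formed; this is only an illustration of rank aggregation in general, not of the inference and aggregation algorithms implemented in TopKLists.

      # Plain-R illustration of rank aggregation across platforms (invented ranks).
      set.seed(7)
      genes <- paste0("gene", 1:10)
      ranks <- data.frame(
        microarray = sample(10), rnaseq = sample(10), qpcr = sample(10),
        row.names  = genes
      )
      borda <- sort(rowMeans(ranks))   # smaller mean rank = stronger consensus
      head(borda, 3)                   # consensus top-3 items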

  15. The use of Graphic User Interface for development of a user-friendly CRS-Stack software

    Science.gov (United States)

    Sule, Rachmat; Prayudhatama, Dythia; Perkasa, Muhammad D.; Hendriyana, Andri; Fatkhan; Sardjito; Adriansyah

    2017-04-01

    The development of a user-friendly Common Reflection Surface (CRS) Stack software built by implementing a Graphical User Interface (GUI) is described in this paper. The original CRS-Stack software developed by the WIT Consortium is compiled in the unix/linux environment and is not user-friendly, so a user must write the commands and parameters manually in a script file. Due to this limitation, the CRS-Stack has become an unpopular method, although applying it is actually a promising way to obtain better seismic sections, with better reflector continuity and S/N ratio. After obtaining successful results, tested using several seismic data sets belonging to oil companies in Indonesia, the idea arose to develop a user-friendly software in our own laboratory. A Graphical User Interface (GUI) is a type of user interface that allows people to interact with computer programs in a better way. Rather than typing commands and module parameters, a GUI allows users to operate computer programs much more simply and easily; it transforms the text-based interface into graphical icons and visual indicators, and the use of complicated Seismic Unix shell scripts can be avoided. The Java Swing GUI library is used to develop this CRS-Stack GUI. Every shell script that represents a seismic process is invoked from the Java environment. Besides providing an interactive GUI to perform CRS-Stack processing, this CRS-Stack GUI is designed to help geophysicists manage a project with complex seismic processing procedures. The CRS-Stack GUI software is composed of input directories, operators, and output directories, which are defined as a seismic data processing workflow. The CRS-Stack processing workflow involves four steps: automatic CMP stack, initial CRS-Stack, optimized CRS-Stack, and CRS-Stack Supergather. Those operations are visualized in an informative flowchart with a self-explanatory system to guide the user inputting the

  16. FIRINPC and FIRACPC graphics post-processor support user's guide and programmer's reference

    Energy Technology Data Exchange (ETDEWEB)

    Hensel, E. [New Mexico State Univ., Las Cruces, NM (United States). Dept. of Mechanical Engineering

    1992-03-01

    FIRIN is a computer program used by DOE fire protection engineers to simulate hypothetical fire accidents in compartments at DOE facilities. The FIRIN code is typically used in conjunction with a ventilation system code such as FIRAC, which models the impact of the fire compartment upon the rest of the system. The code described here, FIRINPC, is a PC-based implementation of the full mainframe code FIRIN. In addition, FIRINPC contains graphics support for monitoring the progress of the simulation during execution and for reviewing the complete results of the simulation upon completion of the run. This document describes how to install, test, and subsequently use the code FIRINPC, and addresses differences in usage between the PC version of the code and its mainframe predecessor. The PC version contains all of the modeling capabilities of the earlier version, with additional graphics support. This user's guide is a supplement to the original FIRIN report published by the NRC. FIRAC is a computer program used by DOE fire protection engineers to simulate the transient response of a complete ventilation system to fire-induced transients. FIRAC has the ability to use the FIRIN code as the driving function or source term for the ventilation system response. The current version of FIRAC does not contain interactive graphics capabilities. A third program, called POST, is made available for reviewing the results of a previous FIRIN or FIRAC simulation, without having to recompute the numerical simulation. POST uses the output data files created by FIRINPC and FIRACPC to avoid recomputation.

  17. gems: An R Package for Simulating from Disease Progression Models

    Directory of Open Access Journals (Sweden)

    Nello Blaser

    2015-03-01

    Full Text Available Mathematical models of disease progression predict disease outcomes and are useful epidemiological tools for planners and evaluators of health interventions. The R package gems is a tool that simulates disease progression in patients and predicts the effect of different interventions on patient outcome. Disease progression is represented by a series of events (e.g., diagnosis, treatment and death), displayed in a directed acyclic graph. The vertices correspond to disease states and the directed edges represent events. The package gems allows simulations based on a generalized multistate model that can be described by a directed acyclic graph with continuous transition-specific hazard functions. The user can specify an arbitrary hazard function and its parameters. The model includes parameter uncertainty, does not need to be a Markov model, and may take the history of previous events into account. Applications are not limited to the medical field and extend to other areas where multistate simulation is of interest. We provide a technical explanation of the multistate models used by gems, explain the functions of gems and their arguments, and show a sample application.
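
    The kind of simulation gems performs can be illustrated with a stripped-down example: draw transition-specific sojourn times along a small directed acyclic graph and keep the earliest competing transition. The base-R sketch below uses fixed exponential and Weibull hazards and is a conceptual toy, not the gems interface (which accepts arbitrary hazard functions and models parameter uncertainty).

      # Toy three-state progression: healthy -> diagnosed -> dead, with a
      # competing direct healthy -> dead transition.
      simulate_patient <- function() {
        t_diag  <- rexp(1, rate = 0.10)   # healthy -> diagnosed
        t_dead0 <- rexp(1, rate = 0.02)   # healthy -> dead (competing transition)
        if (t_diag < t_dead0) {
          t_dead1 <- t_diag + rweibull(1, shape = 1.5, scale = 8)  # diagnosed -> dead
          c(diagnosed = t_diag, dead = t_dead1)
        } else {
          c(diagnosed = NA, dead = t_dead0)
        }
      }

      set.seed(42)
      cohort <- t(replicate(1000, simulate_patient()))
      mean(!is.na(cohort[, "diagnosed"]))  # proportion diagnosed before dying
      median(cohort[, "dead"])             # median time to death in the cohort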

  18. Multiscale analysis of river networks using the R package linbin

    Science.gov (United States)

    Welty, Ethan Z.; Torgersen, Christian E.; Brenkman, Samuel J.; Duda, Jeffrey J.; Armstrong, Jonathan B.

    2015-01-01

    Analytical tools are needed in riverine science and management to bridge the gap between GIS and statistical packages that were not designed for the directional and dendritic structure of streams. We introduce linbin, an R package developed for the analysis of riverscapes at multiple scales. With this software, riverine data on aquatic habitat and species distribution can be scaled and plotted automatically with respect to their position in the stream network or—in the case of temporal data—their position in time. The linbin package aggregates data into bins of different sizes as specified by the user. We provide case studies illustrating the use of the software for (1) exploring patterns at different scales by aggregating variables at a range of bin sizes, (2) comparing repeat observations by aggregating surveys into bins of common coverage, and (3) tailoring analysis to data with custom bin designs. Furthermore, we demonstrate the utility of linbin for summarizing patterns throughout an entire stream network, and we analyze the diel and seasonal movements of tagged fish past a stationary receiver to illustrate how linbin can be used with temporal data. In short, linbin enables more rapid analysis of complex data sets by fisheries managers and stream ecologists and can reveal underlying spatial and temporal patterns of fish distribution and habitat throughout a riverscape.
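
    The central operation - aggregating point observations into bins of user-specified size along a river's length and comparing results across bin sizes - can be sketched in base R as below. This illustrates the idea only and is not linbin's interface; the distances and counts are simulated.

      # Fish counts observed at known distances upstream (metres).
      set.seed(7)
      obs <- data.frame(dist_m = runif(200, 0, 5000),
                        count  = rpois(200, lambda = 3))

      # Aggregate the same variable at two different bin sizes.
      bin_totals <- function(x, value, bin_size) {
        breaks <- seq(0, max(x) + bin_size, by = bin_size)
        tapply(value, cut(x, breaks, include.lowest = TRUE), sum)
      }

      coarse <- bin_totals(obs$dist_m, obs$count, bin_size = 1000)
      fine   <- bin_totals(obs$dist_m, obs$count, bin_size = 250)
      barplot(coarse, las = 2, main = "Counts per 1000 m bin")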

  19. smwrBase—An R package for managing hydrologic data, version 1.1.1

    Science.gov (United States)

    Lorenz, David L.

    2015-12-09

    This report describes an R package called smwrBase, which consists of a collection of functions to import, transform, manipulate, and manage hydrologic data within the R statistical environment. Functions in the package allow users to import surface-water and groundwater data from the U.S. Geological Survey’s National Water Information System database and other sources. Additional functions are provided to transform, manipulate, and manage hydrologic data in ways necessary for analyzing the data.
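
    As one concrete example of the kind of transformation such a package provides, hydrologic analyses are commonly organized by water year (October through September) rather than calendar year. A hand-rolled version of that transform in base R might look like the sketch below; it is an illustration, not smwrBase's own function.

      # Assign each date to its water year: a water year runs 1 October to
      # 30 September and is labelled by the calendar year in which it ends.
      water_year <- function(dates) {
        d <- as.POSIXlt(dates)
        d$year + 1900 + (d$mon >= 9)   # months are 0-based, so 9 == October
      }

      water_year(as.Date(c("2014-09-30", "2014-10-01", "2015-03-15")))
      #> [1] 2014 2015 2015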

  20. Design and validation of an improved graphical user interface with the 'Tool ball'.

    Science.gov (United States)

    Lee, Kuo-Wei; Lee, Ying-Chu

    2012-01-01

    The purpose of this research is to introduce the design of an improved graphical user interface (GUI) and to verify the operational efficiency of the proposed interface. Until now, clicking the toolbar with the mouse is the usual way to operate software functions. In our research, we designed an improved graphical user interface - a tool ball that is operated by a mouse wheel to perform software functions. Several experiments are conducted to measure the time needed to operate certain software functions with the traditional combination of "mouse click + tool button" and the proposed integration of "mouse wheel + tool ball". The results indicate that the tool ball design can accelerate the speed of operating software functions, decrease the number of icons on the screen, and enlarge the applications of the mouse wheel. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  1. AutoAssemblyD: a graphical user interface system for several genome assemblers.

    Science.gov (United States)

    Veras, Adonney Allan de Oliveira; de Sá, Pablo Henrique Caracciolo Gomes; Azevedo, Vasco; Silva, Artur; Ramos, Rommel Thiago Jucá

    2013-01-01

    Next-generation sequencing technologies have increased the amount of biological data generated. Thus, bioinformatics has become important because new methods and algorithms are necessary to manipulate and process such data. However, certain challenges have emerged, such as genome assembly using short reads and high-throughput platforms. In this context, several algorithms have been developed, such as Velvet, Abyss, Euler-SR, Mira, Edna, Maq, SHRiMP, Newbler, ALLPATHS, Bowtie and BWA. However, most such assemblers do not have a graphical interface, which makes their use difficult for users without computing experience given the complexity of the assembler syntax. Thus, to make the operation of such assemblers accessible to users without a computing background, we developed AutoAssemblyD, which is a graphical tool for genome assembly submission and remote management by multiple assemblers through XML templates. AutoAssemblyD is freely available at https://sourceforge.net/projects/autoassemblyd. It requires Sun jdk 6 or higher.

  2. dcGOR: an R package for analysing ontologies and protein domain annotations.

    Directory of Open Access Journals (Sweden)

    Hai Fang

    2014-10-01

    Full Text Available I introduce an open-source R package 'dcGOR' to provide the bioinformatics community with the ease to analyse ontologies and protein domain annotations, particularly those in the dcGO database. The dcGO is a comprehensive resource for protein domain annotations using a panel of ontologies including Gene Ontology. Although increasing in popularity, this database needs statistical and graphical support to meet its full potential. Moreover, there are no bioinformatics tools specifically designed for domain ontology analysis. As an add-on package built in the R software environment, dcGOR offers a basic infrastructure with great flexibility and functionality. It implements new data structure to represent domains, ontologies, annotations, and all analytical outputs as well. For each ontology, it provides various mining facilities, including: (i) domain-based enrichment analysis and visualisation; (ii) construction of a domain (semantic similarity) network according to ontology annotations; and (iii) significance analysis for estimating a contact (statistical significance) network. To reduce runtime, most analyses support high-performance parallel computing. Taking as inputs a list of protein domains of interest, the package is able to easily carry out in-depth analyses in terms of functional, phenotypic and diseased relevance, and network-level understanding. More importantly, dcGOR is designed to allow users to import and analyse their own ontologies and annotations on domains (taken from SCOP, Pfam and InterPro) and RNAs (from Rfam) as well. The package is freely available at CRAN for easy installation, and also at GitHub for version control. The dedicated website with reproducible demos can be found at http://supfam.org/dcGOR.

  3. dcGOR: an R package for analysing ontologies and protein domain annotations.

    Science.gov (United States)

    Fang, Hai

    2014-10-01

    I introduce an open-source R package 'dcGOR' to provide the bioinformatics community with the ease to analyse ontologies and protein domain annotations, particularly those in the dcGO database. The dcGO is a comprehensive resource for protein domain annotations using a panel of ontologies including Gene Ontology. Although increasing in popularity, this database needs statistical and graphical support to meet its full potential. Moreover, there are no bioinformatics tools specifically designed for domain ontology analysis. As an add-on package built in the R software environment, dcGOR offers a basic infrastructure with great flexibility and functionality. It implements new data structure to represent domains, ontologies, annotations, and all analytical outputs as well. For each ontology, it provides various mining facilities, including: (i) domain-based enrichment analysis and visualisation; (ii) construction of a domain (semantic similarity) network according to ontology annotations; and (iii) significance analysis for estimating a contact (statistical significance) network. To reduce runtime, most analyses support high-performance parallel computing. Taking as inputs a list of protein domains of interest, the package is able to easily carry out in-depth analyses in terms of functional, phenotypic and diseased relevance, and network-level understanding. More importantly, dcGOR is designed to allow users to import and analyse their own ontologies and annotations on domains (taken from SCOP, Pfam and InterPro) and RNAs (from Rfam) as well. The package is freely available at CRAN for easy installation, and also at GitHub for version control. The dedicated website with reproducible demos can be found at http://supfam.org/dcGOR.
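
    The enrichment analysis at the core of such ontology tools usually reduces to a hypergeometric test per ontology term: are more of the domains of interest annotated with the term than expected by chance? The base-R sketch below shows that single test with invented numbers; it is not dcGOR's interface, which adds the data structures, multiple-testing handling and visualisation around it.

      # Hypergeometric enrichment of one hypothetical ontology term.
      universe_size <- 5000   # all annotated protein domains
      term_size     <- 120    # domains annotated with the term
      list_size     <- 200    # size of the domain list of interest
      hits          <- 15     # domains of interest carrying the term

      # P(X >= hits) when drawing list_size domains without replacement.
      p_enrich <- phyper(hits - 1, term_size, universe_size - term_size,
                         list_size, lower.tail = FALSE)
      p_enrich
      # In practice one such test is run per term, followed by multiple-testing
      # correction, e.g. p.adjust(p_values, method = "BH").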

  4. New Graphical User Interface for EXAFS analysis with the GNXAS suite of programs

    Science.gov (United States)

    Hatada, Keisuke; Iesari, Fabio; Properzi, Leonardo; Minicucci, M.; di Cicco, Andrea

    2016-05-01

    GNXAS is a suite of programs based on multiple scattering calculations which performs a structural refinement of EXAFS spectra. It can be used for any system although it has been mainly developed to determine the local structure of disordered substances. We developed a user-friendly graphical user interface (GUI) to facilitate use of the codes by using wxPython. The developed GUI and the codes are multiplatform running on Windows, Macintosh and Linux systems, and are free shareware (http://gnxas.unicam.it). In this work we illustrate features and potentials of this newly developed version of GNXAS (w-GNXAS).

  5. Towards automatically generating graphical user interfaces from openEHR archetypes.

    Science.gov (United States)

    Schuler, Thilo; Garde, Sebastian; Heard, Sam; Beale, Thomas

    2006-01-01

    One of the main challenges in the field of Electronic Health Records (EHRs) is semantic interoperability. To utilise the full potential of interoperable EHR systems they have to be accepted by their users, the health care providers. Good Graphical User Interfaces (GUIs) that support customisation and data validation play a decisive role for user acceptance and data quality. This study investigates the use of openEHR archetypes to automatically generate coherent, customizable, data-validating GUIs. Using the Mozilla XML User Interface Language (XUL) a series of prototypes has been developed. The results show that the automatic generation of GUIs from openEHR archetypes is feasible in principle. Although XUL revealed some problems, the advantages of XML-based GUI languages are evident.

  6. MuSim, a Graphical User Interface for Multiple Simulation Programs

    Energy Technology Data Exchange (ETDEWEB)

    Roberts, Thomas [MUONS Inc., Batavia]; Cummings, Mary Anne [MUONS Inc., Batavia]; Johnson, Rolland [MUONS Inc., Batavia]; Neuffer, David [Fermilab]

    2016-06-01

    MuSim is a new user-friendly program designed to interface to many different particle simulation codes, regardless of their data formats or geometry descriptions. It presents the user with a compelling graphical user interface that includes a flexible 3-D view of the simulated world plus powerful editing and drag-and-drop capabilities. All aspects of the design can be parametrized so that parameter scans and optimizations are easy. It is simple to create plots and display events in the 3-D viewer (with a slider to vary the transparency of solids), allowing for an effortless comparison of different simulation codes. Simulation codes: G4beamline, MAD-X, and MCNP; more coming. Many accelerator design tools and beam optics codes were written long ago, with primitive user interfaces by today's standards. MuSim is specifically designed to make it easy to interface to such codes, providing a common user experience for all, and permitting the construction and exploration of models with very little overhead. For today's technology-driven students, graphical interfaces meet their expectations far better than text-based tools, and education in accelerator physics is one of our primary goals.

  7. IncucyteDRC: An R package for the dose response analysis of live cell imaging data

    OpenAIRE

    Philip J. Chapman; Dominic I. James; Amanda J. Watson; Hopkins, Gemma V.; Waddell, Ian D.; Ogilvie, Donald J.

    2016-01-01

    We present IncucyteDRC, an R package for the analysis of data from live cell imaging cell proliferation experiments carried out on the Essen Biosciences IncuCyte ZOOM instrument. The package provides a simple workflow for summarising data into a form that can be used to calculate dose response curves and EC50 values for small molecule inhibitors. Data from different cell lines, or cell lines grown under different conditions, can be normalised as to their doubling time. A simple graphical web ...
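
    The downstream fitting step that such summarized data feed into can be sketched with the widely used drc package: a four-parameter log-logistic fit followed by an EC50 estimate. This is a generic dose-response example on simulated data, not IncucyteDRC's own functions.

      library(drc)   # widely used dose-response package (assumed installed)

      set.seed(3)
      doses <- rep(c(0.01, 0.03, 0.1, 0.3, 1, 3, 10), each = 3)   # concentrations
      resp  <- 100 / (1 + (doses / 0.4)^1.2) + rnorm(length(doses), sd = 4)
      d     <- data.frame(dose = doses, response = resp)

      fit <- drm(response ~ dose, data = d, fct = LL.4())  # 4-parameter log-logistic
      summary(fit)
      ED(fit, 50)    # EC50 with standard error
      plot(fit)      # fitted curve over the raw data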

  8. GUIDS:A Graphical User Interface Development System in UniECAD

    Institute of Scientific and Technical Information of China (English)

    许建国; 魏文欣

    1994-01-01

    UniECAD is an integrated electronic CAD system; the user interface development system is key to the integration of UniECAD. This paper presents the architecture of GUIDS, a graphical user interface development system in UniECAD, and then discusses a series of new techniques and methods in the design and implementation of this system, covering the following aspects: the editing environment of interface elements, the implementation of dialogue control, and the automatic generation of interface code. As an example, the generation of the main interfaces of UniECAD shows the procedure of developing user interfaces with this development system.

  9. Transportable Applications Environment (TAE) Plus - A NASA productivity tool used to develop graphical user interfaces

    Science.gov (United States)

    Szczur, Martha R.

    1991-01-01

    The Transportable Applications Environment (TAE) Plus, developed at NASA's Goddard Space Flight Center, is an advanced portable user interface development environment which simplifies the process of creating and managing complex application graphical user interfaces (GUIs), supports prototyping, allows applications to be ported easily between different platforms, and encourages appropriate levels of user interface consistency between applications. This paper discusses the capabilities of the TAE Plus tool, and how it makes the job of designing and developing GUIs easier for the application developers. The paper also explains how tools like TAE Plus provide for reusability and ensure reliability of UI software components, as well as how they aid in the reduction of development and maintenance costs.

  10. An R package for statistical provenance analysis

    Science.gov (United States)

    Vermeesch, Pieter; Resentini, Alberto; Garzanti, Eduardo

    2016-05-01

    This paper introduces provenance, a software package within the statistical programming environment R, which aims to facilitate the visualisation and interpretation of large amounts of sedimentary provenance data, including mineralogical, petrographic, chemical and isotopic provenance proxies, or any combination of these. provenance comprises functions to: (a) calculate the sample size required to achieve a given detection limit; (b) plot distributional data such as detrital zircon U-Pb age spectra as Cumulative Age Distributions (CADs) or adaptive Kernel Density Estimates (KDEs); (c) plot compositional data as pie charts or ternary diagrams; (d) correct the effects of hydraulic sorting on sandstone petrography and heavy mineral composition; (e) assess the settling equivalence of detrital minerals and grain-size dependence of sediment composition; (f) quantify the dissimilarity between distributional data using the Kolmogorov-Smirnov and Sircombe-Hazelton distances, or between compositional data using the Aitchison and Bray-Curtis distances; (g) interpret multi-sample datasets by means of (classical and nonmetric) Multidimensional Scaling (MDS) and Principal Component Analysis (PCA); and (h) simplify the interpretation of multi-method datasets by means of Generalised Procrustes Analysis (GPA) and 3-way MDS. All these tools can be accessed through an intuitive query-based user interface, which does not require knowledge of the R programming language. provenance is free software released under the GPL-2 licence and will be further expanded based on user feedback.
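
    Two of the operations listed above - quantifying the dissimilarity between distributional data with the Kolmogorov-Smirnov statistic and ordinating samples with multidimensional scaling - can be sketched in base R as follows. The age spectra are simulated and the code illustrates the concepts rather than the provenance package's functions.

      # Simulated detrital zircon U-Pb age spectra (Ma) for four samples.
      set.seed(11)
      samples <- list(
        A = c(rnorm(60, 1050, 40), rnorm(40, 2700, 80)),
        B = c(rnorm(70, 1100, 50), rnorm(30, 2650, 90)),
        C = c(rnorm(50, 550, 30),  rnorm(50, 1050, 40)),
        D = c(rnorm(80, 560, 35),  rnorm(20, 2700, 70))
      )

      # Pairwise Kolmogorov-Smirnov distances between the age distributions.
      n  <- length(samples)
      ks <- matrix(0, n, n, dimnames = list(names(samples), names(samples)))
      for (i in seq_len(n)) for (j in seq_len(n))
        ks[i, j] <- suppressWarnings(ks.test(samples[[i]], samples[[j]])$statistic)

      # Classical multidimensional scaling of the dissimilarity matrix.
      mds <- cmdscale(as.dist(ks), k = 2)
      plot(mds, type = "n", xlab = "MDS 1", ylab = "MDS 2")
      text(mds, labels = rownames(mds))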

  11. A MATLAB Graphical User Interface Dedicated to the Optimal Design of the High Power Induction Motor with Heavy Starting Conditions

    Directory of Open Access Journals (Sweden)

    Maria Brojboiu

    2014-09-01

    Full Text Available In this paper, a Matlab graphical user interface dedicated to the optimal design of the high power induction motor with heavy starting conditions is presented. This graphical user interface allows the user to input the rated parameters and to select the induction motor type as well as the optimization criterion of the induction motor design. For the squirrel cage induction motor, the graphical user interface allows the selection of the rotor bar geometry and material, as well as the fastening technology of the shorting ring on the rotor bar. The Matlab graphical user interface is developed and applied to the general optimal design program of the induction motor described in [1], [2].

  12. Development of MATLAB-Based Digital Signal Processing Teaching Module with Graphical User Interface Environment for Nigerian University

    National Research Council Canada - National Science Library

    Oyetunji Samson Ade; Daniel Ale

    2013-01-01

    .... This paper annexes the potential of Peripheral Interface Controllers (PICs) with MATLAB resources to develop a PIC-based system with graphic user interface environment suitable for data acquisition and signal processing...

  13. Implementation of a graphical user interface for the virtual multifrequency spectrometer: The VMS-Draw tool.

    Science.gov (United States)

    Licari, Daniele; Baiardi, Alberto; Biczysko, Malgorzata; Egidi, Franco; Latouche, Camille; Barone, Vincenzo

    2015-02-15

    This article presents the setup and implementation of a graphical user interface (VMS-Draw) for a virtual multifrequency spectrometer. Special attention is paid to ease of use, generality and robustness for a panel of spectroscopic techniques and quantum mechanical approaches. Depending on the kind of data to be analyzed, VMS-Draw produces different types of graphical representations, including two-dimensional or three-dimensional (3D) plots, bar charts, or heat maps. Among other integrated features, one may quote the convolution of stick spectra to obtain realistic line-shapes. It is also possible to analyze and visualize, together with the structure, the molecular orbitals and/or the vibrational motions of molecular systems thanks to 3D interactive tools. On these grounds, VMS-Draw could represent a useful additional tool for spectroscopic studies integrating measurements and computer simulations.

  14. A standard format and a graphical user interface for spin system specification

    CERN Document Server

    Biternas, A G; Kuprov, Ilya

    2013-01-01

    We introduce a simple and general XML format for spin system description that is the result of extensive consultations within the Magnetic Resonance community and unifies under one roof all major existing spin interaction specification conventions. The format is human-readable, easy to edit and easy to parse using standard XML libraries. We also describe a graphical user interface that was designed to facilitate construction and visualization of complicated spin systems. The interface is capable of generating input files for several popular spin dynamics simulation packages.

  15. Design and Implementation of a User Friendly OpenModelica Graphical Connection Editor

    OpenAIRE

    Asghar, Syed Adeel; Tariq, Sonia

    2010-01-01

    OpenModelica (www.openmodelica.org) is an open-source Modelica-based modeling and simulation environment intended for industrial as well as academic usage. Its long-term development is supported by a non-profit organization – the Open Source Modelica Consortium OSMC, where Linköping University is a member.The main reason behind this thesis was the need for a user friendly, efficient and modular OpenModelica graphical connection editor. The already existing open source editors were either text...

  16. The PyRosetta Toolkit: a graphical user interface for the Rosetta software suite.

    Directory of Open Access Journals (Sweden)

    Jared Adolf-Bryfogle

    Full Text Available The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.

  17. Graphical user interface (GUI for design of passenger car suspension system using random road profile

    Directory of Open Access Journals (Sweden)

    Duna Tariq Yaseen

    2016-01-01

    Full Text Available In this paper, an interactive approach to the design of a passenger car suspension system subjected to a random road profile is presented, based on a graphical user interface (GUI) built using Matlab/Guide. The aim of the work is to show the importance and usefulness of the developed GUI in designing and describing the dynamic behavior of the car suspension system for different design criteria. Common problems in the field of design of suspension systems for the quarter-car passive model are analyzed. The results show that the designed GUI is very convenient for engineers, analysts, and designers of car suspension systems.

  18. Database-centric Development of Menus and Graphic User Interfaces

    Directory of Open Access Journals (Sweden)

    R.B. Aggarwal

    2007-01-01

    Full Text Available The database-centric approach to graphic user interface (GUI) development quickly and easily manages standardisation and modification of labels and of the look and feel of controls by keeping various control-creation data in the database. The runtime generation of controls provides the flexibility to control their creation and modification. This method freezes the application code once development is over, and the process of recompilation is eliminated when controls are created or modified. Dynamic controls such as menus, labels, text boxes, buttons, combo boxes, list boxes, group boxes, check boxes, radio buttons, tab controls, spin buttons and tree controls can be easily formed and controlled using this approach.

  19. The PyRosetta Toolkit: a graphical user interface for the Rosetta software suite.

    Science.gov (United States)

    Adolf-Bryfogle, Jared; Dunbrack, Roland L

    2013-01-01

    The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.

  20. A standard format and a graphical user interface for spin system specification

    Science.gov (United States)

    Biternas, A. G.; Charnock, G. T. P.; Kuprov, Ilya

    2014-03-01

    We introduce a simple and general XML format for spin system description that is the result of extensive consultations within the Magnetic Resonance community and unifies under one roof all major existing spin interaction specification conventions. The format is human-readable, easy to edit and easy to parse using standard XML libraries. We also describe a graphical user interface that was designed to facilitate construction and visualization of complicated spin systems. The interface is capable of generating input files for several popular spin dynamics simulation packages.

  1. MuSim, a graphical user interface for multiple simulation programs

    CERN Document Server

    Roberts, Thomas J; Johnson, Rolland Paul; Neuffer, David Vincent

    2016-01-01

    MuSim is a new user-friendly program designed to interface to many different particle simulation codes, regardless of their data formats or geometry descriptions. It presents the user with a compelling graphical user interface that includes a flexible 3-D view of the simulated world plus powerful editing and drag-and-drop capabilities. All aspects of the design can be parameterized so that parameter scans and optimizations are easy. It is simple to create plots and display events in the 3-D viewer (with a slider to vary the transparency of solids), allowing for an effortless comparison of different simulation codes. Simulation codes: G4beamline 3.02 and MCNP 6.1; more are coming. Many accelerator design tools and beam optics codes were written long ago, with primitive user interfaces by today's standards. MuSim is specifically designed to make it easy to interface to such codes, providing a common user experience for all, and permitting the construction and exploration of models with very little overhead. For...

  2. Graphical user interface for input output characterization of single variable and multivariable highly nonlinear systems

    Directory of Open Access Journals (Sweden)

    Shahrukh Adnan Khan M. D.

    2017-01-01

    Full Text Available This paper presents a Graphical User Interface (GUI) software utility for the input/output characterization of single variable and multivariable nonlinear systems by obtaining the sinusoidal input describing function (SIDF) of the plant. The software utility is developed on the MATLAB R2011a environment. The developed GUI places no restriction on the nonlinearity type, arrangement, or system order, provided that the output(s) of the system are obtainable either through simulation or experiments. An insight into the GUI and its features is presented in this paper, and example problems from both single variable and multivariable cases are demonstrated. The formulation of the input/output behavior of the system is discussed, and the nucleus of the MATLAB commands underlying the user interface has been outlined. Industries that would benefit from this software utility include, but are not limited to, aerospace, defense technology, robotics, and automotive.

  3. Theorema 2.0: A Graphical User Interface for a Mathematical Assistant System

    Directory of Open Access Journals (Sweden)

    Wolfgang Windsteiger

    2013-07-01

    Full Text Available Theorema 2.0 stands for a re-design including a complete re-implementation of the Theorema system, which was originally designed, developed, and implemented by Bruno Buchberger and his Theorema group at RISC. In this paper, we present the first prototype of a graphical user interface (GUI for the new system. It heavily relies on powerful interactive capabilities introduced in recent releases of the underlying Mathematica system, most importantly the possibility of having dynamic objects connected to interface elements like sliders, menus, check-boxes, radio-buttons and the like. All these features are fully integrated into the Mathematica programming environment and allow the implementation of a modern user interface.

  4. Development and New Directions for the RELAP5-3D Graphical Users Interface

    Energy Technology Data Exchange (ETDEWEB)

    Mesina, George Lee

    2001-09-01

    The direction of development for the RELAP5 Graphical User Interfaces (RGUI) has been extended. In addition to existing plans for displaying all aspects of RELAP5 calculations, the plan now includes displaying the calculations of a variety of codes, including SCDAP, RETRAN and FLUENT. Recent work has included such extensions along with the previously planned and user-requested improvements. Visualization of heat structures has been added. Adaptations were made for another computer program, SCDAP-3D, including plant core views. An input model builder for generating RELAP5-3D input files was partially implemented. All these are reported. Plans for future work are also summarized. These include an input processor that transfers steady-state conditions into an input file.

  5. iFlow: A Graphical User Interface for Flow Cytometry Tools in Bioconductor

    Directory of Open Access Journals (Sweden)

    Kyongryun Lee

    2009-01-01

    Full Text Available Flow cytometry (FCM has become an important analysis technology in health care and medical research, but the large volume of data produced by modern high-throughput experiments has presented significant new challenges for computational analysis tools. The development of an FCM software suite in Bioconductor represents one approach to overcome these challenges. In the spirit of the R programming language (Tree Star Inc., “FlowJo”, these tools are predominantly console-driven, allowing for programmatic access and rapid development of novel algorithms. Using this software requires a solid understanding of programming concepts and of the R language. However, some of these tools|in particular the statistical graphics and novel analytical methods|are also useful for nonprogrammers. To this end, we have developed an open source, extensible graphical user interface (GUI iFlow, which sits on top of the Bioconductor backbone, enabling basic analyses by means of convenient graphical menus and wizards. We envision iFlow to be easily extensible in order to quickly integrate novel methodological developments.

  6. GeoXp : An R Package for Exploratory Spatial Data Analysis

    Directory of Open Access Journals (Sweden)

    Thibault Laurent

    2012-04-01

    Full Text Available We present GeoXp, an R package implementing interactive graphics for exploratory spatial data analysis. We use a data set concerning public schools of the French Midi-Pyrenees region to illustrate the use of these exploratory techniques based on the coupling between a statistical graph and a map. Besides elementary plots like boxplots, histograms or simple scatterplots, GeoXp also couples maps with Moran scatterplots, variogram clouds, Lorenz curves and other graphical tools. In order to make the most of the multidimensionality of the data, GeoXp includes dimension reduction techniques such as principal components analysis and cluster analysis whose results are also linked to the map.

  7. User perception and interpretation of tornado probabilistic hazard information: Comparison of four graphical designs.

    Science.gov (United States)

    Miran, Seyed M; Ling, Chen; James, Joseph J; Gerard, Alan; Rothfusz, Lans

    2017-11-01

    Effective design for presenting severe weather information is important to reduce devastating consequences of severe weather. The Probabilistic Hazard Information (PHI) system for severe weather is being developed by NOAA National Severe Storms Laboratory (NSSL) to communicate probabilistic hazardous weather information. This study investigates the effects of four PHI graphical designs for tornado threat, namely "four-color", "red-scale", "grayscale" and "contour", on users' perception, interpretation, and reaction to threat information. PHI is presented on either a map background or a radar background. Analysis showed that the accuracy was significantly higher and response time faster when PHI was displayed on a map background as compared to a radar background, due to better contrast. When displayed on a radar background, the "grayscale" design resulted in higher accuracy of responses. Possibly due to familiarity, participants reported the four-color design as their favorite, which also resulted in the fastest recognition of probability levels on both backgrounds. Our study shows the importance of using intuitive color-coding and sufficient contrast in conveying probabilistic threat information via graphical design. We also found that users follow a rational perceiving-judging-feeling-acting approach in processing probabilistic hazard information for tornadoes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. ESA New Generation Science Archives: New Technologies Applied to Graphical User Interface Creation

    Science.gov (United States)

    Fernandez, M.; Arviset, C.; Barbarisi, I.; Castellanos, J.; Cheek, N.; Costa, H.; Fajersztejn, N.; Gonzalez, J.; Laruelo, A.; Leon, I.; Ortiz, I.; Osuna, P.; Salgado, J.; Stebe, A.; Tapiador, D.

    2010-12-01

    The Science Archives and VO Team (SAT) has undertaken the effort to build state-of-the-art sub-systems for its new generation of archives. At the time of writing this abstract, the new technology has already been applied to the creation of the SOHO and EXOSAT Science Archives and will be used to re-engineer some of the already existing ESA Science Archives in the future. The Graphical User Interface sub-system has been designed and developed upon the premise of building a lightweight rich client application to query and retrieve scientific data quickly and efficiently; special attention has been paid to the usability and ergonomics of the interface. The system architecture relies on the Model View Controller pattern, which isolates logic from the graphical interface. Multiple window layout arrangements are possible using a docking windows framework with virtually no limitations (InfoNode). New graphical components have been developed to fulfill project-specific user requirements. For example, video animations can be generated at runtime based on image data requests matching specific search criteria. In addition, interoperability is achieved with other tools for data visualization purposes using internationally approved standards (cf. IVOA SAMP), a messaging protocol already adopted by several analysis tools (ds9, Aladin, Gaia). In order to avoid the increasingly common network constraints affecting the end-user's daily work, the system has been designed to cope with possibly restrictive firewall setups. Therefore, ESA New Generation archives are accessible from any place where standard port 80 HTTP connections are available.

  9. Solving Differential Equations in R: Package deSolve

    NARCIS (Netherlands)

    Soetaert, K.E.R.; Petzoldt, T.; Setzer, R.W.

    2010-01-01

    In this paper we present the R package deSolve to solve initial value problems (IVP) written as ordinary differential equations (ODE), differential algebraic equations (DAE) of index 0 or 1 and partial differential equations (PDE), the latter solved using the method of lines approach. The differenti
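
    A minimal deSolve call looks like the sketch below (a logistic growth equation); ode() and its arguments are part of the package's documented interface, while the model and parameter values here are arbitrary.

      library(deSolve)

      # Logistic growth: dN/dt = r * N * (1 - N / K)
      logistic <- function(t, state, parms) {
        with(as.list(c(state, parms)), {
          dN <- r * N * (1 - N / K)
          list(dN)
        })
      }

      out <- ode(y = c(N = 2), times = seq(0, 50, by = 0.5),
                 func = logistic, parms = c(r = 0.2, K = 100))
      head(out)
      plot(out)   # deSolve provides a plot method for the returned object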

  10. Structural Equation Modeling Diagnostics Using R Package Semdiag and EQS

    Science.gov (United States)

    Yuan, Ke-Hai; Zhang, Zhiyong

    2012-01-01

    Yuan and Hayashi (2010) introduced 2 scatter plots for model and data diagnostics in structural equation modeling (SEM). However, the generation of the plots requires in-depth understanding of their underlying technical details. This article develops and introduces an R package semdiag for easily drawing the 2 plots. With a model specified in EQS…

  11. Graphical user interface for yield and dose estimations for cyclotron-produced technetium.

    Science.gov (United States)

    Hou, X; Vuckovic, M; Buckley, K; Bénard, F; Schaffer, P; Ruth, T; Celler, A

    2014-07-07

    The cyclotron-based (100)Mo(p,2n)(99m)Tc reaction has been proposed as an alternative method for solving the shortage of (99m)Tc. With this production method, however, even if highly enriched molybdenum is used, various radioactive and stable isotopes will be produced simultaneously with (99m)Tc. In order to optimize reaction parameters and estimate potential patient doses from radiotracers labeled with cyclotron produced (99m)Tc, the yields for all reaction products must be estimated. Such calculations, however, are extremely complex and time consuming. Therefore, the objective of this study was to design a graphical user interface (GUI) that would automate these calculations, facilitate analysis of the experimental data, and predict dosimetry. The resulting GUI, named Cyclotron production Yields and Dosimetry (CYD), is based on Matlab®. It has three parts providing (a) reaction yield calculations, (b) predictions of gamma emissions and (c) dosimetry estimations. The paper presents the outline of the GUI, lists the parameters that must be provided by the user, discusses the details of calculations and provides examples of the results. Our initial experience shows that the proposed GUI allows the user to very efficiently calculate the yields of reaction products and analyze gamma spectroscopy data. However, it is expected that the main advantage of this GUI will be at the later clinical stage when entering reaction parameters will allow the user to predict production yields and estimate radiation doses to patients for each particular cyclotron run.

  12. AGUIA: autonomous graphical user interface assembly for clinical trials semantic data services

    Directory of Open Access Journals (Sweden)

    Hayashi Yuki

    2010-10-01

    Full Text Available Abstract Background AGUIA is a front-end web application originally developed to manage clinical, demographic and biomolecular patient data collected during clinical trials at MD Anderson Cancer Center. The diversity of methods involved in patient screening and sample processing generates a variety of data types that require a resource-oriented architecture to capture the associations between the heterogeneous data elements. AGUIA uses a semantic web formalism, resource description framework (RDF), and a bottom-up design of knowledge bases that employ the S3DB tool as the starting point for the client's interface assembly. Methods The data web service, S3DB, meets the necessary requirements of generating the RDF and of explicitly distinguishing the description of the domain from its instantiation, while allowing for continuous editing of both. Furthermore, it uses an HTTP-REST protocol, has a SPARQL endpoint, and has open source availability in the public domain, which facilitates the development and dissemination of this application. However, S3DB alone does not address the issue of representing content in a form that makes sense for domain experts. Results We identified an autonomous set of descriptors, the GBox, that provides user and domain specifications for the graphical user interface. This was achieved by identifying a formalism that makes use of an RDF schema to enable the automatic assembly of graphical user interfaces in a meaningful manner while using only resources native to the client web browser (JavaScript interpreter, document object model). We defined a generalized RDF model such that changes in the graphic descriptors are automatically and immediately (locally) reflected into the configuration of the client's interface application. Conclusions The design patterns identified for the GBox benefit from and reflect the specific requirements of interacting with data generated by clinical trials, and they contain clues for a general

  13. Two graphical user interfaces for managing and analyzing MODFLOW groundwater-model scenarios

    Science.gov (United States)

    Banta, Edward R.

    2014-01-01

    Scenario Manager and Scenario Analyzer are graphical user interfaces that facilitate the use of calibrated, MODFLOW-based groundwater models for investigating possible responses to proposed stresses on a groundwater system. Scenario Manager allows a user, starting with a calibrated model, to design and run model scenarios by adding or modifying stresses simulated by the model. Scenario Analyzer facilitates the process of extracting data from model output and preparing such display elements as maps, charts, and tables. Both programs are designed for users who are familiar with the science on which groundwater modeling is based but who may not have a groundwater modeler’s expertise in building and calibrating a groundwater model from start to finish. With Scenario Manager, the user can manipulate model input to simulate withdrawal or injection wells, time-variant specified hydraulic heads, recharge, and such surface-water features as rivers and canals. Input for stresses to be simulated comes from user-provided geographic information system files and time-series data files. A Scenario Manager project can contain multiple scenarios and is self-documenting. Scenario Analyzer can be used to analyze output from any MODFLOW-based model; it is not limited to use with scenarios generated by Scenario Manager. Model-simulated values of hydraulic head, drawdown, solute concentration, and cell-by-cell flow rates can be presented in display elements. Map data can be represented as lines of equal value (contours) or as a gradated color fill. Charts and tables display time-series data obtained from output generated by a transient-state model run or from user-provided text files of time-series data. A display element can be based entirely on output of a single model run, or, to facilitate comparison of results of multiple scenarios, an element can be based on output from multiple model runs. Scenario Analyzer can export display elements and supporting metadata as a Portable

  14. Perception of Graphical Virtual Environments by Blind Users via Sensory Substitution.

    Science.gov (United States)

    Maidenbaum, Shachar; Buchs, Galit; Abboud, Sami; Lavi-Rotbain, Ori; Amedi, Amir

    2016-01-01

    Graphical virtual environments are currently far from accessible to blind users as their content is mostly visual. This is especially unfortunate as these environments hold great potential for this population for purposes such as safe orientation, education, and entertainment. Previous tools have increased accessibility but there is still a long way to go. Visual-to-audio Sensory-Substitution-Devices (SSDs) can increase accessibility generically by sonifying on-screen content regardless of the specific environment and offer increased accessibility without the use of expensive dedicated peripherals like electrode/vibrator arrays. Using SSDs virtually utilizes similar skills as when using them in the real world, enabling both training on the device and training on environments virtually before real-world visits. This could enable more complex, standardized and autonomous SSD training and new insights into multisensory interaction and the visually-deprived brain. However, whether congenitally blind users, who have never experienced virtual environments, will be able to use this information for successful perception and interaction within them is currently unclear. We tested this using the EyeMusic SSD, which conveys whole-scene visual information, to perform virtual tasks otherwise impossible without vision. Congenitally blind users had to navigate virtual environments and find doors, differentiate between them based on their features (Experiment 1: task 1) and surroundings (Experiment 1: task 2) and walk through them; these tasks were accomplished with a 95% and 97% success rate, respectively. We further explored the reactions of congenitally blind users during their first interaction with a more complex virtual environment than in the previous tasks: walking down a virtual street, recognizing different features of houses and trees, navigating to cross-walks, etc. Users reacted enthusiastically and reported feeling immersed within the environment. They highlighted the

  15. COREMAP: Graphical user interface for displaying reactor core data in an interactive hexagon map

    Energy Technology Data Exchange (ETDEWEB)

    Muscat, F.L.; Derstine, K.L.

    1995-06-01

    COREMAP is a Graphical User Interface (GUI) designed to assist users in reading and checking reactor core data from multidimensional neutronic simulation models, in color and/or as text, in an interactive 2D planar grid of hexagonal subassemblies. COREMAP is a complete GEODST/RUNDESC viewing tool which enables the user to access multi-data-set files (e.g., planes, moments, energy groups, ...) and display up to two data sets simultaneously, one as color and the other as text. The user (1) controls color scale characteristics such as type (linear or logarithmic) and range limits, (2) controls the text display based upon conditional statements on data spelling and value, (3) chooses zoom features such as core map size, number of rings and surrounding subassemblies, and (4) specifies the data selection for supplied popup subwindows which display a selection of data currently off-screen for a selected cell, as a list of data and/or as a graph. COREMAP includes a RUNDESC file editing tool which creates "proposed" Run-description files by point-and-click revisions to subassembly assignments in an existing EBRII Run-description file. COREMAP includes a fully automated printing option which creates high quality PostScript color or greyscale images of the core map independent of the monitor used, e.g., color prints can be generated from a session on a color or monochrome monitor. The automated PostScript output is an alternative to the xgrabsc based printing option. COREMAP includes a plotting option which creates graphs related to a selected cell. The user specifies the X and Y coordinate types (planes, moment, group, flux, ...) and a parameter, P, when displaying several curves for the specified (X, Y) pair. COREMAP supports hexagonal geometry reactor core configurations specified by the GEODST file, binary Standard Interface Files, and the RUNDESC ordering.

  16. Transit Analysis Package: An IDL Graphical User Interface for Exoplanet Transit Photometry

    Directory of Open Access Journals (Sweden)

    J. Zachary Gazak

    2012-01-01

    Full Text Available We present an IDL graphical user-interface-driven software package designed for the analysis of exoplanet transit light curves. The Transit Analysis Package (TAP) software uses Markov Chain Monte Carlo (MCMC) techniques to fit light curves using the analytic model of Mandel and Agol (2002). The package incorporates a wavelet-based likelihood function developed by Carter and Winn (2009), which allows the MCMC to assess parameter uncertainties more robustly than classic χ2 methods by parameterizing uncorrelated “white” and correlated “red” noise. The software is able to simultaneously analyze multiple transits observed in different conditions (instrument, filter, weather, etc.). The graphical interface allows for the simple execution and interpretation of Bayesian MCMC analysis tailored to a user’s specific data set and has been thoroughly tested on ground-based and Kepler photometry. This paper describes the software release and provides applications to new and existing data. Reanalysis of ground-based observations of TrES-1b, WASP-4b, and WASP-10b (Winn et al., 2007, 2009; Johnson et al., 2009, resp.) and space-based Kepler 4b–8b (Kipping and Bakos 2010) show good agreement between TAP and those publications. We also present new multi-filter light curves of WASP-10b and we find excellent agreement with previously published values for a smaller radius.
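
    The MCMC machinery behind such a fit can be illustrated with a deliberately simplified example: a random-walk Metropolis sampler estimating the depth and mid-time of a box-shaped dip in simulated photometry under independent Gaussian noise. This is a conceptual toy only; TAP itself fits the Mandel and Agol model and uses a wavelet-based likelihood for correlated noise.

      # Toy transit: flux = 1 outside the transit window, 1 - depth inside it.
      set.seed(5)
      t_obs <- seq(0, 1, length.out = 400)
      model <- function(p, t) ifelse(abs(t - p["t0"]) < 0.05, 1 - p["depth"], 1)
      flux  <- model(c(t0 = 0.48, depth = 0.012), t_obs) + rnorm(400, sd = 0.002)

      loglik <- function(p) sum(dnorm(flux, mean = model(p, t_obs), sd = 0.002, log = TRUE))

      # Random-walk Metropolis over (t0, depth).
      n_iter <- 5000
      chain  <- matrix(NA, n_iter, 2, dimnames = list(NULL, c("t0", "depth")))
      p  <- c(t0 = 0.5, depth = 0.01)
      ll <- loglik(p)
      for (i in seq_len(n_iter)) {
        prop    <- p + rnorm(2, sd = c(0.002, 0.0005))
        ll_prop <- loglik(prop)
        if (log(runif(1)) < ll_prop - ll) { p <- prop; ll <- ll_prop }
        chain[i, ] <- p
      }
      # Posterior summaries after discarding burn-in.
      apply(chain[-(1:1000), ], 2, quantile, probs = c(0.16, 0.5, 0.84))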

  17. A graphical user interface for numerical modeling of acclimation responses of vegetation to climate change

    Science.gov (United States)

    Le, Phong V. V.; Kumar, Praveen; Drewry, Darren T.; Quijano, Juan C.

    2012-12-01

    Ecophysiological models that vertically resolve vegetation canopy states are becoming a powerful tool for studying the exchange of mass, energy, and momentum between the land surface and the atmosphere. A mechanistic multilayer canopy-soil-root system model (MLCan) developed by Drewry et al. (2010a) has been used to capture the emergent vegetation responses to elevated atmospheric CO2 for both C3 and C4 plants under various climate conditions. However, processing input data and setting up such a model can be time-consuming and error-prone. In this paper, a graphical user interface that has been developed for MLCan is presented. The design of this interface aims to provide visualization capabilities and interactive support for processing input meteorological forcing data and vegetation parameter values to facilitate the use of this model. In addition, the interface also provides graphical tools for analyzing the forcing data and simulated numerical results. The model and its interface are both written in the MATLAB programming language. Finally, an application of this model package for capturing the ecohydrological responses of three bioenergy crops (maize, miscanthus, and switchgrass) to local environmental drivers at two different sites in the Midwestern United States is presented.

  18. LiTrack A Fast longitudinal phase space tracking code with graphical user interface

    CERN Document Server

    Emma, Paul

    2005-01-01

    Many linear accelerators, such as linac-based light sources and linear colliders, apply longitudinal phase space manipulations in their design, including electron bunch compression and wakefield-induced energy spread control. Several computer codes handle such issues, but most require detailed information on the transverse focusing lattice. In fact, in most linear accelerators, the transverse distributions do not significantly affect the longitudinal, and can be ignored initially. This allows the use of a fast 2D code to study longitudinal aspects without time-consuming considerations of the transverse focusing. LiTrack is based on a 15-year-old code (same name) originally written by one of us (KB), which is now a MATLAB-based code with additional features, such as a graphical user interface and output plotting. The single-bunch tracking includes RF acceleration, bunch compression to 3rd order, geometric and resistive wakefields, aperture limits, synchrotron radiation, and flexible output plotting. The code w...

  19. A Developed Graphical User Interface for Power System Stability and Robustness Studies

    Directory of Open Access Journals (Sweden)

    GHOURAF Djamel Eddine

    2015-06-01

    Full Text Available This paper presents the realization and development of a graphical user interface (GUI) to study the stability and robustness of power systems (analysis and synthesis), using Conventional Power System Stabilizers (CPSS, realized on a PID scheme) or advanced controllers (based on adaptive and robust control), applied to the automatic excitation control of powerful synchronous generators in order to improve dynamic performance and robustness. The GUI is a useful means to facilitate the stability study of a power system, with analysis and synthesis of regulators and resolution of the compromise between result precision and calculation speed. The simulation results obtained with our GUI, realized under MATLAB, show considerable improvements in static and dynamic performance, greater stability and enhanced robustness of the power system, with good precision and minimum operating time. This study was performed for different types of powerful synchronous generators.

  20. Design and Development of the Graphical User Interface for Sindhi Language

    Directory of Open Access Journals (Sweden)

    Imdad Ali Ismaili

    2011-10-01

    Full Text Available This paper describes the design and implementation of a Unicode-based GUISL (Graphical User Interface for Sindhi Language). The idea is to provide a software platform to the people of Sindh as well as Sindhi diasporas living across the globe to make use of computing for basic tasks such as editing, composition, formatting, and printing of documents in Sindhi by using GUISL. The implementation of the GUISL has been done in the Java technology to make the system platform independent. The paper describes several design issues of the Sindhi GUI in the context of existing software tools and technologies and explains how mapping and concatenation techniques have been employed to achieve the cursive shape of Sindhi script.

  1. Graphical user interface for a remote medical monitoring system: U.S. Army medic recommendations.

    Science.gov (United States)

    Kaushik, Sangeeta; Tharion, William J

    2009-11-01

    We obtained recommendations for a graphical user interface (GUI) design for a new medical monitoring system. Data were obtained from 26 combat-experienced medics. Volunteers were briefed on the medical monitoring system. They then completed a questionnaire on background medical treatment experience, provided drawings on how and what information should be displayed on the GUI screens for use on a personal digital assistant, and participated in focus group sessions with four to seven medics per group to obtain group consensus on what information the GUI screens should contain. Detailed displays on seven screens provide the medical and situational awareness information medics need for triage decisions and for early processing of a casualty. The created GUI screens are a combination of object-based and text-based information using a color-coded system. Medics believed the information displayed with these GUI designs would improve treatment of casualties on the battlefield.

  2. USER-DEFINED CONTENT IN A MODERN LEARNING ENVIRONMENT FOR ENGINEERING GRAPHICS

    Directory of Open Access Journals (Sweden)

    DOLGA Lia

    2008-07-01

    Full Text Available New pedagogic methods are being developed in the current “knowledge-based era”. They replace the “taught lesson” with collaboration, reflection and iteration; in this context, the internet should not remain merely a convenient and cheap (if not free) mechanism for delivering traditional materials online. As the amount of available information continues to grow and diversify, the skills needed to access and process this information quickly become outdated. The ability to use new technologies and a wide range of multimedia tools will define success. This paper outlines the important role played by user-generated content in defining new pedagogical approaches to learning in the context of online communities. Graphical subjects, such as “Computer Graphics” and “Computer Aided Design”, require the active participation of the student. Student-led lessons and student-generated content give consistency and add value to the educational process. The term “teaching” transforms into “studying”.

  3. Impact of representational systems on color selections for graphic user interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Brown-VanHoozer, S.A.; Brownson, L.W.

    1996-04-01

    This paper is based on a study involving representational systems and color preference on graphic user interfaces (GUI). The study is an extension of a general exploratory experiment (GEE) conducted in October of 1993, wherein individuals' favored sensory representational systems (visual, auditory and kinesthetic) (FRS) were compared to their GUI comfort parameters. The results of the study show that an individual's FRS is a significant factor in their acceptance of a GUI design, and that further in-depth study of the various display attributes relative to an individual's FRS is required. This research is the first in a series of follow-up studies to be conducted regarding specific characteristics of GUIs (i.e., fonts, character density, etc.) with respect to an individual's FRS. The present study focuses on the attribute of color preference for GUI design.

  4. A Matlab-Based Graphical User Interface for Simulation and Control Design of a Hydrogen Mixer

    Science.gov (United States)

    Richter, Hanz; Figueroa, Fernando

    2003-01-01

    A Graphical User Interface (GUI) that facilitates prediction and control design tasks for a propellant mixer is described. The Hydrogen mixer is used in rocket test stand operations at the NASA John C. Stennis Space Center. The mixer injects gaseous hydrogen (GH2) into a stream of liquid hydrogen (LH2) to obtain a combined flow with desired thermodynamic properties. The flows of GH2 and LH2 into the mixer are regulated by two control valves, and a third control valve is installed at the exit of the mixer to regulate the combined flow. The three valves may be simultaneously operated in order to achieve any desired combination of total flow, exit temperature and mixer pressure within the range of operation. The mixer, thus, constitutes a three-input, three-output system. A mathematical model of the mixer has been obtained and validated with experimental data. The GUI presented here uses the model to predict mixer response under diverse conditions.

  6. METAGUI 3: A graphical user interface for choosing the collective variables in molecular dynamics simulations

    Science.gov (United States)

    Giorgino, Toni; Laio, Alessandro; Rodriguez, Alex

    2017-08-01

    Molecular dynamics (MD) simulations allow the exploration of the phase space of biopolymers through the integration of equations of motion of their constituent atoms. The analysis of MD trajectories often relies on the choice of collective variables (CVs) along which the dynamics of the system is projected. We developed a graphical user interface (GUI) for facilitating the interactive choice of the appropriate CVs. The GUI allows: defining interactively new CVs; partitioning the configurations into microstates characterized by similar values of the CVs; calculating the free energies of the microstates for both unbiased and biased (metadynamics) simulations; clustering the microstates in kinetic basins; visualizing the free energy landscape as a function of a subset of the CVs used for the analysis. A simple mouse click allows one to quickly inspect structures corresponding to specific points in the landscape.

  7. Graphical User Interface (GUI) Design for Ballistic Research Laboratory-Computer-Aided Design’s (BRL-CAD’s) Geometry Difference (GDiff) Tool

    Science.gov (United States)

    2015-04-01

    Contractor report ARL-CR-0756 (April 2015), U.S. Army Research Laboratory, Aberdeen Proving Ground, MD 21005, covering work performed from June to August 2014 on the graphical user interface (GUI) design for the Ballistic Research Laboratory-Computer-Aided Design's (BRL-CAD's) Geometry Difference (GDiff) tool.

  8. admixturegraph: an R package for admixture graph manipulation and fitting.

    Science.gov (United States)

    Leppälä, Kalle; Nielsen, Svend V; Mailund, Thomas

    2017-06-01

    Admixture graphs generalize phylogenetic trees by allowing genetic lineages to merge as well as split. In this paper we present the R package admixturegraph containing tools for building and visualizing admixture graphs, for fitting graph parameters to genetic data, for visualizing goodness of fit and for evaluating the relative goodness of fit between different graphs. GitHub: https://github.com/mailund/admixture_graph and CRAN: https://cran.r-project.org/web/packages/admixturegraph. Contact: mailund@birc.au.dk.

  9. Foundations of statistical algorithms with references to R packages

    CERN Document Server

    Weihs, Claus; Ligges, Uwe

    2013-01-01

    A new and refreshingly different approach to presenting the foundations of statistical algorithms, Foundations of Statistical Algorithms: With References to R Packages reviews the historical development of basic algorithms to illuminate the evolution of today's more powerful statistical algorithms. It emphasizes recurring themes in all statistical algorithms, including computation, assessment and verification, iteration, intuition, randomness, repetition and parallelization, and scalability. Unique in scope, the book reviews the upcoming challenge of scaling many of the established techniques

  10. ipw: An R Package for Inverse Probability Weighting

    Directory of Open Access Journals (Sweden)

    Ronald B. Geskus

    2011-10-01

    Full Text Available We describe the R package ipw for estimating inverse probability weights. We show how to use the package to fit marginal structural models through inverse probability weighting, to estimate causal effects. Our package can be used with data from a point treatment situation as well as with a time-varying exposure and time-varying confounders. It can be used with binomial, categorical, ordinal and continuous exposure variables.
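
    A minimal sketch of the intended workflow is shown below; the data set and variable names are hypothetical, and the ipwpoint() call plus the marginal structural model fit via the survey package follow common usage and are assumptions where not stated in the abstract.

      library(ipw)
      library(survey)
      # Stabilized inverse probability weights for a binary point-treatment exposure
      w <- ipwpoint(exposure = treat, family = "binomial", link = "logit",
                    numerator = ~ 1, denominator = ~ age + sex, data = mydata)
      mydata$sw <- w$ipw.weights
      # Weighted (marginal structural) model for the outcome
      msm <- svyglm(outcome ~ treat,
                    design = svydesign(ids = ~ 1, weights = ~ sw, data = mydata))
      summary(msm)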

  11. Visualization for Hyper-Heuristics. Front-End Graphical User Interface

    Energy Technology Data Exchange (ETDEWEB)

    Kroenung, Lauren [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-03-01

    Modern society is faced with ever more complex problems, many of which can be formulated as generate-and-test optimization problems. General-purpose optimization algorithms are not well suited for real-world scenarios where many instances of the same problem class need to be repeatedly and efficiently solved because they are not targeted to a particular scenario. Hyper-heuristics automate the design of algorithms to create a custom algorithm for a particular scenario. While such automated design has great advantages, it can often be difficult to understand exactly how a design was derived and why it should be trusted. This project aims to address these issues of usability by creating an easy-to-use graphical user interface (GUI) for hyper-heuristics to support practitioners, as well as scientific visualization of the produced automated designs. My contributions to this project are exhibited in the user-facing portion of the developed system and the detailed scientific visualizations created from back-end data.

  12. A Student-Friendly Graphical User Interface to Extract Data from Remote Sensing Level-2 Products.

    Science.gov (United States)

    Bernardello, R.

    2016-02-01

    The remote sensing era has provided an unprecedented amount of publicly available data. The United States National Aeronautics and Space Administration Goddard Space Flight Center (NASA-GSFC) has achieved remarkable results in the distribution of these data to the scientific community through the OceanColor web page (http://oceancolor.gsfc.nasa.gov/). However, access to these data is not straightforward and requires a certain investment of time in learning the use of existing software. Satellite sensors acquire raw data that are processed through several steps towards a format usable by the scientific community. These products are distributed in Hierarchical Data Format (HDF), which often represents the first obstacle for students, teachers and scientists not used to dealing with extensive matrices. We present here SATellite data PROcessing (SATPRO), a newly developed Graphical User Interface (GUI) designed in the MATLAB environment to provide an easy, immediate yet reliable way to select and extract Level-2 data from the NASA SeaWiFS and MODIS-Aqua databases for oceanic surface temperature and chlorophyll. Since no previous experience with MATLAB is required, SATPRO allows the user to explore the available dataset without investing any software-learning time. SATPRO is an ideal tool to introduce undergraduate students to the use of remote sensing data in oceanography and can also be useful for research projects at the graduate level.

  13. M-Split: A Graphical User Interface to Analyze Multilayered Anisotropy from Shear Wave Splitting

    Science.gov (United States)

    Abgarmi, Bizhan; Ozacar, A. Arda

    2017-04-01

    Shear wave splitting analyses are commonly used to infer deep anisotropic structure. For simple cases, delay times and fast-axis orientations obtained from reliable results are averaged to define the anisotropy beneath recording seismic stations. However, splitting parameters show systematic variations with back azimuth in the presence of complex anisotropy and cannot be represented by an average delay time and fast-axis orientation. Previous researchers have identified anisotropic complexities in different tectonic settings and applied various approaches to model them. Most commonly, such complexities are modeled using multiple anisotropic layers with a priori constraints from geologic data. In this study, a graphical user interface called M-Split is developed to easily process and model multilayered anisotropy, with capabilities to properly address the inherent non-uniqueness. The M-Split program runs user-defined grid searches through the model parameter space for two-layer anisotropy using the formulation of Silver and Savage (1994) and creates sensitivity contour plots to locate local maxima and analyze all possible models with parameter trade-offs. In order to minimize model ambiguity and identify robust model parameters, various misfit calculation procedures are also developed and embedded in M-Split; these can be used depending on the quality of the observations and their back-azimuthal coverage. Case studies were carried out to evaluate the reliability of the program using real noisy data, and for this purpose stations from two different networks were utilized. The first seismic network is the Kandilli Observatory and Earthquake Research Institute (KOERI), which includes long-running permanent stations, and the second comprises seismic stations deployed temporarily as part of the "Continental Dynamics-Central Anatolian Tectonics (CD-CAT)" project funded by NSF. It is also worth noting that M-Split is designed as an open-source program which can be modified by users for

  14. LFSTAT - An R-Package for Low-Flow Analysis

    Science.gov (United States)

    Koffler, D.; Laaha, G.

    2012-04-01

    When analysing daily streamflow data focusing on low flow and drought, the state of the art is well documented in the Manual on Low-Flow Estimation and Prediction [1] published by the WMO. While it is clear what has to be done, it is less clear how to perform the analysis and make the calculations as reproducible as possible. Our software solution extends the high-performing statistical open-source software package R to analyse daily streamflow data with a focus on low flows. As command-line based programs are not everyone's preference, we also offer a plug-in for the R Commander, an easy-to-use graphical user interface (GUI) for analysing data in R. Functionality includes estimation of the most important low-flow indices. Besides the standard flow indices, the base flow index (BFI) and recession constants can be computed. The main applications of L-moment-based extreme value analysis and regional frequency analysis (RFA) are available. Calculation of streamflow deficits is another important feature. The most common graphics are prepared and can easily be modified according to the user's preferences. Graphics include hydrographs for different periods, flexible streamflow deficit plots, baseflow visualisation, flow duration curves and double mass curves, just to name a few. The package uses an S3 class called lfobj (low-flow objects). Once these objects are created, analyses can be performed by mouse click, and a script can be saved to make the analysis easily reproducible. At the moment we offer implementations of all major methods proposed in the WMO Manual on Low-Flow Estimation and Prediction. Future plans include, e.g., report export to odt files using odfWeave. We hope to offer a tool that eases and structures the analysis of streamflow data focusing on low flows and makes analyses transparent and communicable. The package is designed for hydrological research and water management practice, but can also be used to teach students the first steps in low-flow hydrology.
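
    A rough sketch of the intended workflow is shown below; the function and argument names (createlfobj(), BFI(), hydrograph(), hyearstart) are recalled from the package and should be treated as assumptions, and flowdata is a hypothetical data frame with columns day, month, year and flow.

      library(lfstat)
      lfobj <- createlfobj(flowdata, hyearstart = 4)   # hydrological year starting in April
      BFI(lfobj)                                       # base flow index
      hydrograph(lfobj)                                # hydrograph plot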

  15. MixSim : An R Package for Simulating Data to Study Performance of Clustering Algorithms

    Directory of Open Access Journals (Sweden)

    Volodymyr Melnykov

    2012-11-01

    Full Text Available The R package MixSim is a new tool that allows simulating mixtures of Gaussian distributions with different levels of overlap between mixture components. Pairwise overlap, defined as the sum of two misclassification probabilities, measures the degree of interaction between components and can be readily employed to control the clustering complexity of datasets simulated from mixtures. These datasets can then be used for systematic performance investigation of clustering and finite mixture modeling algorithms. Other capabilities of MixSim include computing the exact overlap for Gaussian mixtures, simulating Gaussian and non-Gaussian data, simulating outliers and noise variables, calculating various measures of agreement between two partitionings, and constructing parallel distribution plots for the graphical display of finite mixture models. All features of the package are illustrated in great detail. The utility of the package is highlighted through a small comparison study of several popular clustering algorithms.
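
    A minimal sketch, assuming the MixSim()/simdataset() interface described above: simulate a four-component bivariate Gaussian mixture with a prescribed average pairwise overlap, cluster it with k-means, and measure agreement with the true partition.

      library(MixSim)
      set.seed(1)
      Q <- MixSim(BarOmega = 0.05, K = 4, p = 2)               # mixture with average overlap 0.05
      A <- simdataset(n = 500, Pi = Q$Pi, Mu = Q$Mu, S = Q$S)  # simulated data and true labels
      cl <- kmeans(A$X, centers = 4)$cluster
      RandIndex(cl, A$id)                                      # agreement indices with the truth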

  16. The R Commander: A Basic-Statistics Graphical User Interface to R

    Directory of Open Access Journals (Sweden)

    John Fox

    2005-08-01

    Full Text Available Unlike S-PLUS, R does not incorporate a statistical graphical user interface (GUI), but it does include tools for building GUIs. Based on the tcltk package (which furnishes an interface to the Tcl/Tk GUI toolkit), the Rcmdr package provides a basic-statistics graphical user interface to R called the "R Commander." The design objectives of the R Commander were as follows: to support, through an easy-to-use, extensible, cross-platform GUI, the statistical functionality required for a basic-statistics course (though its current functionality has grown to include support for linear and generalized-linear models, and other more advanced features); to make it relatively difficult to do unreasonable things; and to render visible the relationship between choices made in the GUI and the R commands that they generate. The R Commander uses a simple and familiar menu/dialog-box interface. Top-level menus include File, Edit, Data, Statistics, Graphs, Models, Distributions, Tools, and Help, with the complete menu tree given in the paper. Each dialog box includes a Help button, which leads to a relevant help page. Menu and dialog-box selections generate R commands, which are recorded in a script window and are echoed, along with output, to an output window. The script window also provides the ability to edit, enter, and re-execute commands. Error messages, warnings, and some other information appear in a separate messages window. Data sets in the R Commander are simply R data frames, and can be read from attached packages or imported from files. Although several data frames may reside in memory, only one is "active" at any given time. There may also be an active statistical model (e.g., an R lm or glm object). The purpose of this paper is to introduce and describe the use of the R Commander GUI; to describe the design and development of the R Commander; and to explain how the R Commander GUI can be extended. The second part of the paper (following a brief
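
    Getting started is a one-liner from a plain R session (a standard usage sketch; the GUI opens in a separate Tcl/Tk window):

      install.packages("Rcmdr")   # install once from CRAN
      library(Rcmdr)              # loading the package launches the R Commander GUI
      Commander()                 # restarts the GUI if its window has been closed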

  17. Visual design for the user interface, Part 2: Graphics in the interface.

    Science.gov (United States)

    Lynch, P J

    1994-01-01

    Highly interactive multimedia electronic documents pose unique graphic information design problems. This paper is a discussion of some of the graphic design considerations that are unique to electronic documents, including the challenges of adapting existing graphic design skills to electronic documents that are displayed and read from computer screens.

  18. Extending R packages to support 64-bit compiled code: An illustration with spam64 and GIMMS NDVI3g data

    Science.gov (United States)

    Gerber, Florian; Mösinger, Kaspar; Furrer, Reinhard

    2017-07-01

    Software packages for spatial data often implement a hybrid approach of interpreted and compiled programming languages. The compiled parts are usually written in C, C++, or Fortran, and are efficient in terms of computational speed and memory usage. Conversely, the interpreted part serves as a convenient user interface and calls the compiled code for computationally demanding operations. The price paid for the user friendliness of the interpreted component is, besides performance, the limited access to low-level and optimized code. An example of such a restriction is the 64-bit vector support of the widely used statistical language R. On the R side, users do not need to change existing code and may not even notice the extension. On the other hand, interfacing 64-bit compiled code efficiently is challenging. Since many R packages for spatial data could benefit from 64-bit vectors, we investigate strategies to efficiently pass 64-bit vectors to compiled languages. More precisely, we show how to simply extend existing R packages using the foreign function interface to seamlessly support 64-bit vectors. This extension is shown with the sparse matrix algebra R package spam. The new capabilities are illustrated with an example of GIMMS NDVI3g data featuring a parametric modeling approach for a non-stationary covariance matrix.
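
    For orientation, the sketch below shows the classic .C() foreign function interface that such packages build on (hypothetical routine name; the C source must first be compiled into a shared library, e.g. with R CMD SHLIB sum.c). The classic interface indexes vectors with 32-bit integers, which is exactly the limitation that 64-bit extensions such as spam64 address.

      # /* sum.c */
      # void vecsum(double *x, int *n, double *out) {
      #     double s = 0.0;
      #     for (int i = 0; i < *n; i++) s += x[i];
      #     *out = s;
      # }
      dyn.load("sum.so")    # shared-library extension is platform dependent
      x <- runif(10)
      .C("vecsum", as.double(x), as.integer(length(x)), out = double(1))$out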

  19. topicmodels: An R Package for Fitting Topic Models

    Directory of Open Access Journals (Sweden)

    Bettina Grun

    2011-05-01

    Full Text Available Topic models allow the probabilistic modeling of term frequency occurrences in documents. The fitted model can be used to estimate the similarity between documents as well as between a set of specified keywords using an additional layer of latent variables which are referred to as topics. The R package topicmodels provides basic infrastructure for fitting topic models based on data structures from the text mining package tm. The package includes interfaces to two algorithms for fitting topic models: the variational expectation-maximization algorithm provided by David M. Blei and co-authors and an algorithm using Gibbs sampling by Xuan-Hieu Phan and co-authors.
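
    A minimal usage sketch, assuming the AssociatedPress document-term matrix distributed with the package as example data (any DocumentTermMatrix built with tm works the same way):

      library(topicmodels)
      data("AssociatedPress", package = "topicmodels")
      lda <- LDA(AssociatedPress[1:100, ], k = 5, control = list(seed = 1))  # VEM by default
      terms(lda, 5)       # five most probable terms per topic
      topics(lda)[1:10]   # most likely topic for the first ten documents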

  20. Graphical user interface for a dual-module EMCCD x-ray detector array

    Science.gov (United States)

    Wang, Weiyuan; Ionita, Ciprian; Kuhls-Gilcrist, Andrew; Huang, Ying; Qu, Bin; Gupta, Sandesh K.; Bednarek, Daniel R.; Rudin, Stephen

    2011-03-01

    A new Graphical User Interface (GUI) was developed using Laboratory Virtual Instrumentation Engineering Workbench (LabVIEW) for a high-resolution, high-sensitivity Solid State X-ray Image Intensifier (SSXII), which is a new x-ray detector for radiographic and fluoroscopic imaging, consisting of an array of Electron-Multiplying CCDs (EMCCDs) each having a variable on-chip electron-multiplication gain of up to 2000x to reduce the effect of readout noise. To enlarge the field-of-view (FOV), each EMCCD sensor is coupled to an x-ray phosphor through a fiberoptic taper. Two EMCCD camera modules are used in our prototype to form a computer-controlled array; however, larger arrays are under development. The new GUI provides patient registration, EMCCD module control, image acquisition, and patient image review. Images from the array are stitched into a 2kx1k pixel image that can be acquired and saved at a rate of 17 Hz (faster with pixel binning). When reviewing the patient's data, the operator can select images from the patient's directory tree listed by the GUI and cycle through the images using a slider bar. Commonly used camera parameters including exposure time, trigger mode, and individual EMCCD gain can be easily adjusted using the GUI. The GUI is designed to accommodate expansion of the EMCCD array to even larger FOVs with more modules. The high-resolution, high-sensitivity EMCCD modular-array SSXII imager with the new user-friendly GUI should enable angiographers and interventionalists to visualize smaller vessels and endovascular devices, helping them to make more accurate diagnoses and to perform more precise image-guided interventions.

  1. Overview of Graphical User Interface for ARRBOD (Acute Radiation Risk and BRYNTRN Organ Dose Projection)

    Science.gov (United States)

    Kim, Myung-Hee Y.; Hu, Shaowen; Nounu, Hatem; Cucinotta, Francis A.

    Solar particle events (SPEs) pose the risk of acute radiation sickness (ARS) to astronauts because organ doses from large SPEs may reach critical levels during extravehicular activities (EVAs) or in lightly shielded spacecraft. NASA has developed an organ dose projection model of the Baryon transport code (BRYNTRN) with an output data processing module, SUMDOSE, and a probabilistic model of acute radiation risk (ARR). BRYNTRN code operation requires extensive input preparation, and the risk projection models of organ doses and ARR take the output from BRYNTRN as an input to their calculations. With a graphical user interface (GUI) to handle input and output for BRYNTRN, these response models can be connected easily and correctly to BRYNTRN in a user-friendly way. The GUI for the Acute Radiation Risk and BRYNTRN Organ Dose (ARRBOD) projection code provides seamless integration of the input and output manipulations required for operation of the ARRBOD modules: BRYNTRN, SUMDOSE, and the ARR probabilistic response model. The ARRBOD GUI is intended for mission planners, radiation shield designers, space operations in the mission operations directorate (MOD), and space biophysics researchers. Assessment of astronauts' organ doses and ARS from exposure to historically large SPEs is in support of mission design and operation planning to avoid ARS and stay within the current NASA short-term dose limits. The ARRBOD GUI will serve as a proof-of-concept for future integration of other risk projection models for human space applications. We present an overview of the ARRBOD GUI product, which is a new self-contained product, for the major components of the overall system, subsystem interconnections, and external interfaces.

  2. Downsizer - A Graphical User Interface-Based Application for Browsing, Acquiring, and Formatting Time-Series Data for Hydrologic Modeling

    Science.gov (United States)

    Ward-Garrison, Christian; Markstrom, Steven L.; Hay, Lauren E.

    2009-01-01

    The U.S. Geological Survey Downsizer is a computer application that selects, downloads, verifies, and formats station-based time-series data for environmental-resource models, particularly the Precipitation-Runoff Modeling System. Downsizer implements the client-server software architecture. The client presents a map-based, graphical user interface that is intuitive to modelers; the server provides streamflow and climate time-series data from over 40,000 measurement stations across the United States. This report is the Downsizer user's manual and provides (1) an overview of the software design, (2) installation instructions, (3) a description of the graphical user interface, (4) a description of selected output files, and (5) troubleshooting information.

  3. Towards a responsive and interactive graphical user interface for neutron data reduction and visualization

    Energy Technology Data Exchange (ETDEWEB)

    Chatterjee, Alok; Worlton, T.; Hammonds, J.; Loong, C.K. [Argonne National Laboratory, Argonne, IL (United States); Mikkelson, D.; Mikkelson, R. [Univ. of Wisconsin-Stout, Menomonie, WI (United States); Chen, D. [Neutron Scattering Laboratory, China Institute of Atomic Energy, Beijing (China)

    2001-03-01

    An Integrated Spectral Analysis Workbench, ISAW, has been developed at IPNS with the goal of providing a flexible and powerful tool to visualize and analyze neutron scattering time-of-flight data. The software, written in Java, is platform independent, object oriented and modular, making it easier to maintain and add features. The graphical user interface (GUI) for ISAW allows intuitive and interactive loading and manipulation of multiple spectra from different 'runs'. ISAW provides multiple displays of the spectra in a 'Runfile' and most of the functions can be performed through the GUI menu bar as well as through command scripts. All displays are simultaneously updated when the data is changed, using the observable-observer object-model pattern. All displays are observers of the DataSet (observable) and respond to changes or selections in it simultaneously. A 'tree' display of the spectra in run files is provided for a detailed view of detector elements and easy selection of spectra. The operations menu is instrument sensitive, so that it displays the appropriate set of operators. Automatic menu generation is made possible by the ability of the DataSet objects to furnish a list of operations contained in the particular DataSet selected at the time the menu bar is accessed. The transformed and corrected data can be saved to disk in different file formats for further analyses (e.g., GSAS for structure refinement). (author)

  4. A computer graphical user interface for survival mixture modelling of recurrent infections.

    Science.gov (United States)

    Lee, Andy H; Zhao, Yun; Yau, Kelvin K W; Ng, S K

    2009-03-01

    Recurrent infections data are commonly encountered in medical research, where the recurrent events are characterised by an acute phase followed by a stable phase after the index episode. Two-component survival mixture models, in both proportional hazards and accelerated failure time settings, are presented as a flexible method of analysing such data. To account for the inherent dependency of the recurrent observations, random effects are incorporated within the conditional hazard function, in the manner of generalised linear mixed models. Assuming a Weibull or log-logistic baseline hazard in both mixture components of the survival mixture model, an EM algorithm is developed for the residual maximum quasi-likelihood estimation of fixed effect and variance component parameters. The methodology is implemented as a graphical user interface coded using Microsoft Visual C++. Application to modelling recurrent urinary tract infections in elderly women is illustrated, where significant individual variations are evident at both the acute and stable phases. The survival mixture methodology developed enables practitioners to identify pertinent risk factors affecting the recurrent times and to draw valid conclusions inferred from these correlated and heterogeneous survival data.

  5. A Graphical User Interface for Parameterizing Biochemical Models of Photosynthesis and Chlorophyll Fluorescence

    Science.gov (United States)

    Kornfeld, A.; Van der Tol, C.; Berry, J. A.

    2015-12-01

    Recent advances in optical remote sensing of photosynthesis offer great promise for estimating gross primary productivity (GPP) at leaf, canopy and even global scale. These methods, including solar-induced chlorophyll fluorescence (SIF) emission, fluorescence spectra, and hyperspectral features such as the red edge and the photochemical reflectance index (PRI), can be used to greatly enhance the predictive power of global circulation models (GCMs) by providing better constraints on GPP. The way to use measured optical data to parameterize existing models such as SCOPE (Soil Canopy Observation, Photochemistry and Energy fluxes) is not trivial, however. We have therefore extended a biochemical model to include fluorescence and other parameters in a coupled treatment. To help parameterize the model, we then use nonlinear curve-fitting routines to determine the parameter set that enables model results to best fit leaf-level gas exchange and optical data measurements. To make the tool more accessible to all practitioners, we have further designed a graphical user interface (GUI) based front-end to allow researchers to analyze data with a minimum of effort while, at the same time, allowing them to change parameters interactively to visualize how variation in model parameters affects predicted outcomes such as photosynthetic rates, electron transport, and chlorophyll fluorescence. Here we discuss the tool and its effectiveness, using recently gathered leaf-level data.

  6. A Wide-range Survey on Recall-Based Graphical User Authentications Algorithms Based on ISO and Attack Patterns

    Directory of Open Access Journals (Sweden)

    Arash Habibi Lashkari

    2009-12-01

    Full Text Available Nowadays, user authentication is one of the important topics in information security. Text-based strong password schemes can provide a certain degree of security. However, the fact that strong passwords are difficult to memorize often leads their owners to write them down on paper or even save them in a computer file. Graphical user authentication (GUA) has been proposed as a possible alternative to text-based authentication, motivated particularly by the fact that humans can remember images better than text. In recent years, many networks, computer systems and Internet-based environments have tried to use GUA techniques for user authentication. All GUA algorithms have two different aspects, usability and security, and unfortunately none of the graphical algorithms has been able to cover both of these aspects at the same time. This paper presents a wide-range survey of the pure and cued recall-based algorithms in GUA, based on ISO standards for usability and attack-pattern standards for security. After explaining the usability ISO standards and the international attack-pattern standards, we collect the major attributes of usability and security in GUA. Finally, we present comparison tables of all recall-based algorithms based on the usability attributes and attack patterns we found. Keywords - Recall-Based Graphical User Authentication; Graphical Password; Usability and security; ISO 9241-11; ISO 9126; ISO 13407; Attack Patterns; Brute force; Dictionary attacks; Guessing; Spyware; Shoulder surfing; Social engineering.

  7. An R package for the integrated analysis of metabolomics and spectral data.

    Science.gov (United States)

    Costa, Christopher; Maraschin, Marcelo; Rocha, Miguel

    2016-06-01

    Recently, there has been a growing interest in the field of metabolomics, materialized by a remarkable growth in experimental techniques, available data and related biological applications. Indeed, techniques such as nuclear magnetic resonance, gas or liquid chromatography, mass spectrometry, infrared and UV-visible spectroscopies have provided extensive datasets that can help in tasks such as biological and biomedical discovery, biotechnology and drug development. However, as happens with other omics data, the analysis of metabolomics datasets provides multiple challenges, both in terms of methodologies and in the development of appropriate computational tools. Indeed, of the available software tools, none addresses the multiplicity of existing techniques and data analysis tasks. In this work, we make available a novel R package, named specmine, which provides a set of methods for metabolomics data analysis, including data loading in different formats, pre-processing, metabolite identification, univariate and multivariate data analysis, machine learning, and feature selection. Importantly, the implemented methods provide adequate support for the analysis of data from diverse experimental techniques, integrating a large set of functions from several R packages in a powerful, yet simple-to-use environment. The package, already available in CRAN, is accompanied by a web site where users can deposit datasets, scripts and analysis reports to be shared with the community, promoting the efficient sharing of metabolomics data analysis pipelines. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  8. parallelMCMCcombine: an R package for bayesian methods for big data and analytics.

    Directory of Open Access Journals (Sweden)

    Alexey Miroshnikov

    Full Text Available Recent advances in big data and analytics research have provided a wealth of large data sets that are too big to be analyzed in their entirety, due to restrictions on computer memory or storage size. New Bayesian methods have been developed for data sets that are large only due to large sample sizes. These methods partition big data sets into subsets and perform independent Bayesian Markov chain Monte Carlo analyses on the subsets. The methods then combine the independent subset posterior samples to estimate a posterior density given the full data set. These approaches were shown to be effective for Bayesian models including logistic regression models, Gaussian mixture models and hierarchical models. Here, we introduce the R package parallelMCMCcombine which carries out four of these techniques for combining independent subset posterior samples. We illustrate each of the methods using a Bayesian logistic regression model for simulation data and a Bayesian Gamma model for real data; we also demonstrate features and capabilities of the R package. The package assumes the user has carried out the Bayesian analysis and has produced the independent subposterior samples outside of the package. The methods are primarily suited to models with unknown parameters of fixed dimension that exist in continuous parameter spaces. We envision this tool will allow researchers to explore the various methods for their specific applications and will assist future progress in this rapidly developing field.

  9. parallelMCMCcombine: an R package for bayesian methods for big data and analytics.

    Science.gov (United States)

    Miroshnikov, Alexey; Conlon, Erin M

    2014-01-01

    Recent advances in big data and analytics research have provided a wealth of large data sets that are too big to be analyzed in their entirety, due to restrictions on computer memory or storage size. New Bayesian methods have been developed for data sets that are large only due to large sample sizes. These methods partition big data sets into subsets and perform independent Bayesian Markov chain Monte Carlo analyses on the subsets. The methods then combine the independent subset posterior samples to estimate a posterior density given the full data set. These approaches were shown to be effective for Bayesian models including logistic regression models, Gaussian mixture models and hierarchical models. Here, we introduce the R package parallelMCMCcombine which carries out four of these techniques for combining independent subset posterior samples. We illustrate each of the methods using a Bayesian logistic regression model for simulation data and a Bayesian Gamma model for real data; we also demonstrate features and capabilities of the R package. The package assumes the user has carried out the Bayesian analysis and has produced the independent subposterior samples outside of the package. The methods are primarily suited to models with unknown parameters of fixed dimension that exist in continuous parameter spaces. We envision this tool will allow researchers to explore the various methods for their specific applications and will assist future progress in this rapidly developing field.
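
    A rough sketch of the expected calling pattern is given below; the combining-function name consensusMCindep() and its subchain argument are recalled from the package description and should be treated as assumptions. The user supplies an array of subposterior draws with dimensions d parameters x nsamp samples x M subsets.

      library(parallelMCMCcombine)
      d <- 2; nsamp <- 1000; M <- 4
      subchain <- array(rnorm(d * nsamp * M), dim = c(d, nsamp, M))  # placeholder subposterior draws
      combined <- consensusMCindep(subchain = subchain)              # d x nsamp matrix of combined draws
      apply(combined, 1, mean)                                       # posterior means of the parameters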

  10. clues: An R Package for Nonparametric Clustering Based on Local Shrinking

    Directory of Open Access Journals (Sweden)

    Fang Chang

    2010-02-01

    Full Text Available Determining the optimal number of clusters appears to be a persistent and controversial issue in cluster analysis. Most existing R packages targeting clustering require the user to specify the number of clusters in advance. However, if this subjectively chosen number is far from optimal, clustering may produce seriously misleading results. In order to address this vexing problem, we develop the R package clues to automate and evaluate the selection of an optimal number of clusters, which is widely applicable in the field of clustering analysis. Package clues uses two main procedures, shrinking and partitioning, to estimate an optimal number of clusters by maximizing an index function, either the CH index or the Silhouette index, rather than relying on guessing a pre-specified number. Five agreement indices (Rand index, Hubert and Arabie's adjusted Rand index, Morey and Agresti's adjusted Rand index, Fowlkes and Mallows index and Jaccard index), which measure the degree of agreement between any two partitions, are also provided in clues. In addition to numerical evidence, clues also supplies a deeper insight into the partitioning process with trajectory plots.
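
    A brief sketch of the kind of call the package supports; the clues() entry point, the strengthMethod argument and the K/mem result components are recalled from the package and should be treated as assumptions, and the iris measurements serve as example data.

      library(clues)
      res <- clues(iris[, 1:4], strengthMethod = "sil")   # Silhouette-based selection
      res$K                                               # estimated number of clusters
      table(res$mem, iris$Species)                        # compare partition with species labels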

  11. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation

    Science.gov (United States)

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C.; Wong, Willy; Daskalakis, Zafiris J.; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEP)s. With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application comprised of multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control check points with visual feedback of TEPs throughout the data processing workflow; and (v) capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in initial setup and configuration of parameters for each processing step. This is partly accomplished through a close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols. As the first open-source GUI-based pipeline

  12. TMSEEG: A MATLAB-Based Graphical User Interface for Processing Electrophysiological Signals during Transcranial Magnetic Stimulation.

    Science.gov (United States)

    Atluri, Sravya; Frehlich, Matthew; Mei, Ye; Garcia Dominguez, Luis; Rogasch, Nigel C; Wong, Willy; Daskalakis, Zafiris J; Farzan, Faranak

    2016-01-01

    Concurrent recording of electroencephalography (EEG) during transcranial magnetic stimulation (TMS) is an emerging and powerful tool for studying brain health and function. Despite a growing interest in adaptation of TMS-EEG across neuroscience disciplines, its widespread utility is limited by signal processing challenges. These challenges arise due to the nature of TMS and the sensitivity of EEG to artifacts that often mask TMS-evoked potentials (TEP)s. With an increase in the complexity of data processing methods and a growing interest in multi-site data integration, analysis of TMS-EEG data requires the development of a standardized method to recover TEPs from various sources of artifacts. This article introduces TMSEEG, an open-source MATLAB application comprised of multiple algorithms organized to facilitate a step-by-step procedure for TMS-EEG signal processing. Using a modular design and interactive graphical user interface (GUI), this toolbox aims to streamline TMS-EEG signal processing for both novice and experienced users. Specifically, TMSEEG provides: (i) targeted removal of TMS-induced and general EEG artifacts; (ii) a step-by-step modular workflow with flexibility to modify existing algorithms and add customized algorithms; (iii) a comprehensive display and quantification of artifacts; (iv) quality control check points with visual feedback of TEPs throughout the data processing workflow; and (v) capability to label and store a database of artifacts. In addition to these features, the software architecture of TMSEEG ensures minimal user effort in initial setup and configuration of parameters for each processing step. This is partly accomplished through a close integration with EEGLAB, a widely used open-source toolbox for EEG signal processing. In this article, we introduce TMSEEG, validate its features and demonstrate its application in extracting TEPs across several single- and multi-pulse TMS protocols. As the first open-source GUI-based pipeline

  13. soilphysics: An R package to determine soil preconsolidation pressure

    Science.gov (United States)

    da Silva, Anderson Rodrigo; de Lima, Renato Paiva

    2015-11-01

    Preconsolidation pressure is a parameter obtained from the soil compression curve and has been used as an indicator of the load-bearing capacity of soil, as well as to characterize the impacts caused by the use of machines. Despite its importance in soil physics, there are few software tools or computational routines to support its determination. In this paper we present a computational package in the R language, the package soilphysics, which contains implementations of the main methods for determining preconsolidation pressure, such as the method of Casagrande, Pacheco Silva, regression methods and the method of the virgin compression line intercept. There is still a consensus that Casagrande is the standard method, although the method of Pacheco Silva has shown similar values. The method of the virgin compression line intercept can be used when a more conservative (smaller) value of preconsolidation pressure is desired. Furthermore, Casagrande could be replaced by a regression method when the compression curve is obtained from saturated soils. The theory behind each method is presented and the algorithms are thoroughly described. We also give some support on how to use the R functions. Examples are used to illustrate the capabilities of the package, and the results are briefly discussed. The latter were validated using a recently published VBA. With soilphysics, the user has all the graphical and statistical power of R to determine preconsolidation pressure using different methods. The package is distribution free (under the GPL-2|3) and is currently available from the Comprehensive R Archive Network.
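
    An illustrative sketch is given below; the sigmaP() function name and its method argument are recalled from the package and should be treated as assumptions, and the stress/void-ratio values represent a hypothetical compression test.

      library(soilphysics)
      stress    <- c(1, 12.5, 25, 50, 100, 200, 400, 800, 1600)   # applied load (kPa)
      voidratio <- c(0.846, 0.829, 0.820, 0.802, 0.767, 0.717,
                     0.660, 0.595, 0.532)                         # measured void ratio
      sigmaP(voidratio, stress, method = "casagrande")            # preconsolidation pressure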

  14. Parietal neural prosthetic control of a computer cursor in a graphical-user-interface task

    Science.gov (United States)

    Revechkis, Boris; Aflalo, Tyson NS; Kellis, Spencer; Pouratian, Nader; Andersen, Richard A.

    2014-12-01

    Objective. To date, the majority of Brain-Machine Interfaces have been used to perform simple tasks with sequences of individual targets in otherwise blank environments. In this study we developed a more practical and clinically relevant task that approximated modern computers and graphical user interfaces (GUIs). This task could be problematic given the known sensitivity of areas typically used for BMIs to visual stimuli, eye movements, decision-making, and attentional control. Consequently, we sought to assess the effect of a complex, GUI-like task on the quality of neural decoding. Approach. A male rhesus macaque monkey was implanted with two 96-channel electrode arrays in area 5d of the superior parietal lobule. The animal was trained to perform a GUI-like ‘Face in a Crowd’ task on a computer screen that required selecting one cued, icon-like, face image from a group of alternatives (the ‘Crowd’) using a neurally controlled cursor. We assessed whether the crowd affected decodes of intended cursor movements by comparing it to a ‘Crowd Off’ condition in which only the matching target appeared without alternatives. We also examined if training a neural decoder with the Crowd On rather than Off had any effect on subsequent decode quality. Main results. Despite the additional demands of working with the Crowd On, the animal was able to robustly perform the task under Brain Control. The presence of the crowd did not itself affect decode quality. Training the decoder with the Crowd On relative to Off had no negative influence on subsequent decoding performance. Additionally, the subject was able to gaze around freely without influencing cursor position. Significance. Our results demonstrate that area 5d recordings can be used for decoding in a complex, GUI-like task with free gaze. Thus, this area is a promising source of signals for neural prosthetics that utilize computing devices with GUI interfaces, e.g. personal computers, mobile devices, and tablet

  15. Linear mixed-effects models for within-participant psychology experiments: an introductory tutorial and free, graphical user interface (LMMgui)

    OpenAIRE

    Magezi, David A.

    2015-01-01

    Linear mixed-effects models (LMMs) are increasingly being used for data analysis in cognitive neuroscience and experimental psychology, where within-participant designs are common. The current article provides an introductory review of the use of LMMs for within-participant data analysis and describes a free, simple, graphical user interface (LMMgui). LMMgui uses the package lme4 (Bates et al., 2014a,b) in the statistical environment R (R Core Team).

  16. Linear mixed-effects models for within-participant psychology experiments: an introductory tutorial and free, graphical user interface (LMMgui).

    Science.gov (United States)

    Magezi, David A

    2015-01-01

    Linear mixed-effects models (LMMs) are increasingly being used for data analysis in cognitive neuroscience and experimental psychology, where within-participant designs are common. The current article provides an introductory review of the use of LMMs for within-participant data analysis and describes a free, simple, graphical user interface (LMMgui). LMMgui uses the package lme4 (Bates et al., 2014a,b) in the statistical environment R (R Core Team).
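
    A minimal lme4 sketch of the kind of within-participant model that LMMgui wraps (variable names are hypothetical): reaction time modelled as a function of condition, with by-participant random intercepts and slopes.

      library(lme4)
      m <- lmer(rt ~ condition + (1 + condition | participant), data = expdata)
      summary(m)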

  17. Development of MATLAB-Based Digital Signal Processing Teaching Module with Graphical User Interface Environment for Nigerian University

    OpenAIRE

    2013-01-01

    The development of a teaching aid module for digital signal processing (DSP) in Nigerian universities was undertaken to address the problem associated with the non-availability of instructional modules. This paper combines the potential of Peripheral Interface Controllers (PICs) with MATLAB resources to develop a PIC-based system with a graphical user interface environment suitable for data acquisition and signal processing. The module accepts data from three different sources: real time acquisition, pre-r...

  18. Development of Graphical User Interface for Finite Element Analysis of Static Loading of a Column using MATLAB

    OpenAIRE

    Moses Omolayo PETINRIN

    2010-01-01

    In this work, the capability of the MATLAB software package to develop graphical user interface (GUI) packages was demonstrated. A GUI was successfully developed using the MATLAB programming language to study the behaviour of a suspended column under uniaxial static loading by solving a numerical model created based on the finite element method (FEM). The comparison between the exact solution from previous research and the numerical analysis showed good agreement. The column average strain, averag...

  19. mmnet: An R Package for Metagenomics Systems Biology Analysis

    Directory of Open Access Journals (Sweden)

    Yang Cao

    2015-01-01

    Full Text Available The human microbiome plays important roles in human health and disease. Previous microbiome studies focused mainly on single pure species function and overlooked the interactions in complex communities at the system level. A metagenomic approach introduced recently integrates metagenomic data with community-level metabolic network modeling, but no comprehensive tool was available for such approaches. To facilitate these kinds of studies, we developed an R package, mmnet, to implement community-level metabolic network reconstruction. The package also implements a set of functions for automatic analysis pipeline construction, including functional annotation of metagenomic reads, abundance estimation of enzymatic genes, community-level metabolic network reconstruction, and integrated network analysis. The result can be represented in an intuitive way and sent to Cytoscape for further exploration. The package has substantial potential for metagenomic studies that focus on identifying system-level variations of the human microbiome associated with disease.

  20. mmnet: An R Package for Metagenomics Systems Biology Analysis.

    Science.gov (United States)

    Cao, Yang; Zheng, Xiaofei; Li, Fei; Bo, Xiaochen

    2015-01-01

    The human microbiome plays important roles in human health and disease. Previous microbiome studies focused mainly on single pure species function and overlooked the interactions in complex communities at the system level. A metagenomic approach introduced recently integrates metagenomic data with community-level metabolic network modeling, but no comprehensive tool was available for such approaches. To facilitate these kinds of studies, we developed an R package, mmnet, to implement community-level metabolic network reconstruction. The package also implements a set of functions for automatic analysis pipeline construction, including functional annotation of metagenomic reads, abundance estimation of enzymatic genes, community-level metabolic network reconstruction, and integrated network analysis. The result can be represented in an intuitive way and sent to Cytoscape for further exploration. The package has substantial potential for metagenomic studies that focus on identifying system-level variations of the human microbiome associated with disease.

  1. MNP: R Package for Fitting the Multinomial Probit Model

    Directory of Open Access Journals (Sweden)

    Kosuke Imai

    2005-05-01

    Full Text Available MNP is a publicly available R package that fits the Bayesian multinomial probit model via Markov chain Monte Carlo. The multinomial probit model is often used to analyze the discrete choices made by individuals recorded in survey data. Examples where the multinomial probit model may be useful include the analysis of product choice by consumers in market research and the analysis of candidate or party choice by voters in electoral studies. The MNP software can also fit the model with different choice sets for each individual, and complete or partial individual choice orderings of the available alternatives from the choice set. The estimation is based on the efficient marginal data augmentation algorithm that is developed by Imai and van Dyk (2005).
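
    A minimal usage sketch; the data set and covariate names are hypothetical, and the response is assumed to be a factor whose levels are the choice alternatives.

      library(MNP)
      fit <- mnp(choice ~ income + age, data = survey_data,
                 n.draws = 5000, burnin = 1000, verbose = TRUE)
      summary(fit)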

  2. dglars: An R Package to Estimate Sparse Generalized Linear Models

    Directory of Open Access Journals (Sweden)

    Luigi Augugliaro

    2014-09-01

    Full Text Available dglars is a publicly available R package that implements the method proposed in Augugliaro, Mineo, and Wit (2013), developed to study the sparse structure of a generalized linear model. This method, called dgLARS, is based on a differential geometrical extension of the least angle regression method proposed in Efron, Hastie, Johnstone, and Tibshirani (2004). The core of the dglars package consists of two algorithms implemented in Fortran 90 to efficiently compute the solution curve: a predictor-corrector algorithm, proposed in Augugliaro et al. (2013), and a cyclic coordinate descent algorithm, proposed in Augugliaro, Mineo, and Wit (2012). The latter algorithm, as shown here, is significantly faster than the predictor-corrector algorithm. For comparison purposes, we have implemented both algorithms.
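
    A short sketch of fitting a sparse logistic regression path with dgLARS on simulated data; the glm-style formula/family interface is assumed here and is not stated in the abstract.

      library(dglars)
      set.seed(1)
      X <- matrix(rnorm(100 * 10), 100, 10)          # 10 candidate predictors
      y <- rbinom(100, 1, plogis(X[, 1] - X[, 2]))   # only the first two are active
      fit <- dglars(y ~ X, family = binomial)
      print(fit)                                     # solution curve along the dgLARS path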

  3. GrassmannOptim: An R Package for Grassmann Manifold Optimization

    Directory of Open Access Journals (Sweden)

    Ko Placid Adragni

    2012-07-01

    Full Text Available The optimization of a real-valued objective function f(U), where U is a p x d (p > d) semi-orthogonal matrix such that U'U = I_d, and f is invariant under right orthogonal transformation of U, is often referred to as Grassmann manifold optimization. Manifold optimization appears in a wide variety of computational problems in the applied sciences. In this article, we present GrassmannOptim, an R package for Grassmann manifold optimization. The implementation uses gradient-based algorithms and embeds a stochastic gradient method for global search. We describe the algorithms, provide some illustrative examples on the relevance of manifold optimization and, finally, show some practical usages of the package.

  4. CoinCalc-A new R package for quantifying simultaneities of event series

    Science.gov (United States)

    Siegmund, Jonatan F.; Siegmund, Nicole; Donner, Reik V.

    2017-01-01

    We present the new R package CoinCalc for performing event coincidence analysis (ECA), a novel statistical method to quantify the simultaneity of events contained in two series of observations, either as simultaneous or lagged coincidences within a user-specified temporal tolerance window. The package also provides different analytical as well as surrogate-based significance tests (valid under different assumptions about the nature of the observed event series) as well as an intuitive visualization of the identified coincidences. We demonstrate the usage of CoinCalc based on two typical geoscientific example problems addressing the relationship between meteorological extremes and plant phenology as well as that between soil properties and land cover.

  5. SAE: an R package for early stopping rules in clinical trials.

    Science.gov (United States)

    Bascoul-Mollevi, C; Laplanche, A; Le Deley, M C; Kramar, A

    2011-11-01

    In the case of an unexpectedly high frequency of serious adverse events (SAE), statistical methods are needed to help in the decision making process as to the continuation of accrual to the trial. This paper describes an R package, named SAE, that implements a recently developed method defining stopping rules after each observed SAE. The package functions control for excessive toxicity either during the trial at the observation of each SAE (function SAE) or during the planning phase of a clinical trial (function DESIGN). This description and the package documentation are complementary and help users apply the method. The main difficulty in implementing the method is the choice of the a priori parameters. Data from an ongoing clinical trial are presented as an example to improve the understanding and use of the package.

  6. CoinCalc -- A new R package for quantifying simultaneities of event series

    CERN Document Server

    Siegmund, Jonathan F; Donner, Reik V

    2016-01-01

    We present the new R package CoinCalc for performing event coincidence analysis (ECA), a novel statistical method to quantify the simultaneity of events contained in two series of observations, either as simultaneous or lagged coincidences within a user-specified temporal tolerance window. The package also provides different analytical as well as surrogate-based significance tests (valid under different assumptions about the nature of the observed event series) as well as an intuitive visualization of the identified coincidences. We demonstrate the usage of CoinCalc based on two typical geoscientific example problems addressing the relationship between meteorological extremes and plant phenology as well as that between soil properties and land cover.

  7. The MEDIGATE graphical user interface for entry of physical findings: design principles and implementation. Medical Examination Direct Iconic and Graphic Augmented Text Entry System.

    Science.gov (United States)

    Yoder, J W; Schultz, D F; Williams, B T

    1998-10-01

    The solution to many of the problems of computer-based recording of the medical record has been elusive, largely due to difficulties in the capture of those data elements that comprise the records of the Present Illness and of the Physical Findings. Reliable input of data has proven to be more complex than originally envisioned by early work in the field. This has led to more research and development into better data collection protocols and easy-to-use human-computer interfaces as support tools. The Medical Examination Direct Iconic and Graphic Augmented Text Entry System (MEDIGATE System) is a computer-enhanced interactive graphic and textual record of the findings from physical examinations, designed to provide ease of user input and to support organization and processing of the data characterizing these findings. The primary design objective of the MEDIGATE System is to develop and evaluate different interface designs for recording observations from the physical examination in an attempt to overcome some of the deficiencies in this major component of the individual record of health and illness.

  8. RadShield: semiautomated shielding design using a floor plan driven graphical user interface.

    Science.gov (United States)

    DeLorenzo, Matthew C; Wu, Dee H; Yang, Kai; Rutel, Isaac B

    2016-09-01

    The purpose of this study was to introduce and describe the development of RadShield, a Java-based graphical user interface (GUI) that provides a base design which uniquely performs thorough, spatially distributed calculations at many points and reports the maximum air-kerma rate and barrier thickness for each barrier pursuant to NCRP Report 147 methodology. Semiautomated shielding design calculations are validated by two approaches: a geometry-based approach and a manual approach. A series of geometry-based equations were derived giving the maximum air-kerma rate magnitude and location through a first-derivative root-finding approach. The second approach consisted of comparing RadShield results with those found by manual shielding design by an American Board of Radiology (ABR)-certified medical physicist for two clinical room situations: two adjacent catheterization labs, and a radiographic and fluoroscopic (R&F) exam room. RadShield's efficacy in finding the maximum air-kerma rate was compared against the geometry-based approach, and the overall shielding recommendations by RadShield were compared against the medical physicist's shielding results. Percentage errors between the geometry-based approach and RadShield's approach in finding the magnitude and location of the maximum air-kerma rate were within 0.00124% and 14 mm, respectively. RadShield's barrier thickness calculations were found to be within 0.156 mm lead (Pb) and 0.150 mm lead (Pb) for the adjacent catheterization labs and R&F room examples, respectively. However, in the R&F room example, the most sensitive calculation point on the floor plan for one of the barriers was not considered in the medical physicist's calculation and was revealed by the RadShield calculations. RadShield is shown to accurately find the maximum values of air-kerma rate and barrier thickness using NCRP Report 147 methodology. Visual inspection alone of the 2D X-ray exam distribution by a medical physicist may not

  9. MODIStsp: An R package for automatic preprocessing of MODIS Land Products time series

    Science.gov (United States)

    Busetto, L.; Ranghetti, L.

    2016-12-01

    MODIStsp is a new R package for automating the creation of raster time series derived from MODIS Land Products. It performs several preprocessing steps (e.g. download, mosaicking, reprojection and resizing) on MODIS products over a selected time period and area. All processing parameters can be set with a user-friendly GUI, allowing users to select which specific layers of the original MODIS HDF files have to be processed and which Quality Indicators have to be extracted from the aggregated MODIS Quality Assurance layers. Moreover, the tool allows on-the-fly computation of time series of Spectral Indexes (either standard or custom-specified by the user through the GUI) from surface reflectance bands. Outputs are saved as single-band rasters corresponding to each available acquisition date and output layer. Virtual files allowing easy access to the entire time series as a single file using common image processing/GIS software or R scripts can also be created. Non-interactive execution within an R script and stand-alone execution outside an R environment exploiting a previously created Options File are also possible; the latter allows scheduling MODIStsp runs so that a time series is automatically updated when a new image is available. The proposed software constitutes a very useful tool for the Remote Sensing community, since it allows performing all the main preprocessing steps required for the creation of MODIS time series within a common framework, and without requiring any particular programming skills from its users.
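
    A sketch of the two execution modes mentioned above follows; the options file name is hypothetical, and the assumption is that MODIStsp() with no arguments opens the GUI while gui = FALSE replays a previously saved configuration.

        library(MODIStsp)

        ## Interactive use: open the GUI, choose product, layers, dates and
        ## output options, and save them to an options file.
        MODIStsp()

        ## Non-interactive use, e.g. from a scheduled script, to extend an
        ## existing time series when new images become available.
        MODIStsp(gui = FALSE, opts_file = "MODIStsp_options.json")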

  10. R graphics

    CERN Document Server

    Murrell, Paul

    2005-01-01

    R is revolutionizing the world of statistical computing. Powerful, flexible, and best of all free, R is now the program of choice for tens of thousands of statisticians. Destined to become an instant classic, R Graphics presents the first complete, authoritative exposition on the R graphical system. Paul Murrell, widely known as the leading expert on R graphics, has developed an in-depth resource that takes nothing for granted and helps both neophyte and seasoned users master the intricacies of R graphics. After an introductory overview of R graphics facilities, the presentation first focuses

  11. tgcd: An R package for analyzing thermoluminescence glow curves

    Science.gov (United States)

    Peng, Jun; Dong, ZhiBao; Han, FengQing

    Thermoluminescence (TL) glow curves are widely used in dosimetric studies. Many commercial and freely distributed programs are used to deconvolute TL glow curves. This study introduces an open-source R package, tgcd, to conduct TL glow curve analysis, such as kinetic parameter estimation, glow peak simulation, and peak shape analysis. TL glow curves can be deconvoluted according to the general-order empirical expression or the semi-analytical expression derived from the one trap-one recombination center (OTOR) model based on the Lambert W function, using a modified Levenberg-Marquardt algorithm in which any of the parameters can be constrained or fixed. The package provides an interactive environment to initialize parameters and offers an automated "trial-and-error" protocol to obtain optimal fit results. First-order, second-order, and general-order glow peaks (curves) are simulated according to a number of simple kinetic models. The package was developed using a combination of Fortran and R programming languages to improve efficiency and flexibility.
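
    A minimal deconvolution sketch follows; the argument names (npeak, model) and the output component are assumptions based on the description above, and 'glow' stands for a hypothetical two-column matrix of temperature and TL intensity.

        library(tgcd)

        ## 'glow' is a hypothetical matrix: column 1 = temperature (K),
        ## column 2 = measured TL intensity.
        fit <- tgcd(glow, npeak = 3, model = "g1")   # general-order deconvolution
        fit$pars                                     # fitted kinetic parameters (assumed slot name)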

  12. ROTS: An R package for reproducibility-optimized statistical testing.

    Science.gov (United States)

    Suomi, Tomi; Seyednasrollah, Fatemeh; Jaakkola, Maria K; Faux, Thomas; Elo, Laura L

    2017-05-01

    Differential expression analysis is one of the most common types of analyses performed on various biological data (e.g. RNA-seq or mass spectrometry proteomics). It is the process that detects features, such as genes or proteins, showing statistically significant differences between the sample groups under comparison. A major challenge in the analysis is the choice of an appropriate test statistic, as different statistics have been shown to perform well in different datasets. To this end, the reproducibility-optimized test statistic (ROTS) adjusts a modified t-statistic according to the inherent properties of the data and provides a ranking of the features based on their statistical evidence for differential expression between two groups. ROTS has already been successfully applied in a range of different studies from transcriptomics to proteomics, showing competitive performance against other state-of-the-art methods. To promote its widespread use, we introduce here a Bioconductor R package for performing ROTS analysis conveniently on different types of omics data. To illustrate the benefits of ROTS in various applications, we present three case studies, involving proteomics and RNA-seq data from public repositories, including both bulk and single cell data. The package is freely available from Bioconductor (https://www.bioconductor.org/packages/ROTS).
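
    A minimal sketch of a two-group analysis with the package's main ROTS() function; the expression matrix 'expr' and the group vector 'groups' are hypothetical, while B and K control the number of bootstrap resamplings and the top-list sizes considered.

        library(ROTS)

        ## 'expr' is a hypothetical feature-by-sample matrix (e.g. log-intensities),
        ## 'groups' a vector of 0/1 labels giving the group of each column.
        res <- ROTS(data = expr, groups = groups, B = 1000, K = 500, seed = 1234)

        summary(res, fdr = 0.05)   # features detected at a 5% false discovery rate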

  13. Continuous Time Structural Equation Modeling with R Package ctsem

    Directory of Open Access Journals (Sweden)

    Charles C. Driver

    2017-04-01

    Full Text Available We introduce ctsem, an R package for continuous time structural equation modeling of panel (N > 1) and time series (N = 1) data, using full information maximum likelihood. Most dynamic models (e.g., cross-lagged panel models) in the social and behavioural sciences are discrete time models. An assumption of discrete time models is that time intervals between measurements are equal, and that all subjects were assessed at the same intervals. Violations of this assumption are often ignored due to the difficulty of accounting for varying time intervals, therefore parameter estimates can be biased and the time course of effects becomes ambiguous. By using stochastic differential equations to estimate an underlying continuous process, continuous time models allow for any pattern of measurement occasions. By interfacing to OpenMx, ctsem combines the flexible specification of structural equation models with the enhanced data gathering opportunities and improved estimation of continuous time models. ctsem can estimate relationships over time for multiple latent processes, measured by multiple noisy indicators with varying time intervals between observations. Within and between effects are estimated simultaneously by modeling both observed covariates and unobserved heterogeneity. Exogenous shocks with different shapes, group differences, higher order diffusion effects and oscillating processes can all be simply modeled. We first introduce and define continuous time models, then show how to specify and estimate a range of continuous time models using ctsem.
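
    A minimal sketch of the specify-then-fit workflow, assuming the ctModel()/ctFit() functions described in the paper; a single latent process is measured by two indicators over five occasions, and the wide-format data set 'mydata' is hypothetical.

        library(ctsem)

        ## One latent process, two manifest indicators, five measurement occasions
        model <- ctModel(n.latent   = 1,
                         n.manifest = 2,
                         Tpoints    = 5,
                         LAMBDA     = matrix(c(1, 1), nrow = 2))  # factor loadings

        fit <- ctFit(mydata, model)   # 'mydata' must be in ctsem's wide format
        summary(fit)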

  14. HDDTOOLS: an R package serving Hydrological Data Discovery Tools

    Science.gov (United States)

    Vitolo, C.; Buytaert, W.

    2014-12-01

    Many governmental bodies and institutions are currently committed to publishing open data as the result of a trend of increasing transparency, based on which a wide variety of information produced at public expense is now becoming open and freely available to improve public involvement in the process of decision and policy making. Discovery, access and retrieval of information are, however, not always simple tasks. Especially when programmatic access to data resources is not provided, downloading metadata catalogues, selecting the information needed, requesting datasets, decompressing, converting, manually filtering and parsing can become rather tedious. The R package "hddtools" is an open source project designed to make all the above operations more efficient by means of re-usable functions. The package facilitates access to various online data sources such as the Global Runoff Data Centre, NASA's TRMM mission and the Data60UK database, amongst others. This package complements R's growing functionality in environmental web technologies by bridging the gap between data providers and data consumers, and it is designed to be the starting building block of scientific workflows for linking data and models in a seamless fashion.
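
    A minimal sketch of a catalogue query; the function name follows the data sources listed above and should be treated as an assumption about the package API.

        library(hddtools)

        ## Retrieve the metadata catalogue of Global Runoff Data Centre stations
        grdc_catalogue <- catalogueGRDC()
        head(grdc_catalogue)   # inspect station identifiers, coordinates, etc.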

  15. Mixed Frequency Data Sampling Regression Models: The R Package midasr

    Directory of Open Access Journals (Sweden)

    Eric Ghysels

    2016-08-01

    Full Text Available When modeling economic relationships it is increasingly common to encounter data sampled at different frequencies. We introduce the R package midasr which enables estimating regression models with variables sampled at different frequencies within a MIDAS regression framework put forward in work by Ghysels, Santa-Clara, and Valkanov (2002). In this article we define a general autoregressive MIDAS regression model with multiple variables of different frequencies and show how it can be specified using the familiar R formula interface and estimated using various optimization methods chosen by the researcher. We discuss how to check the validity of the estimated model both in terms of numerical convergence and statistical adequacy of a chosen regression specification, how to perform model selection based on an information criterion, how to assess the forecasting accuracy of the MIDAS regression model and how to obtain a forecast aggregation of different MIDAS regression models. We illustrate the capabilities of the package with a simulated MIDAS regression model and give two empirical examples of the application of MIDAS regression.
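
    A minimal sketch of a MIDAS regression using the package's formula interface, assuming the midas_r()/mls() functions and the nealmon (normalized exponential Almon) weighting scheme described in the paper; the simulated series are purely illustrative.

        library(midasr)

        set.seed(1)
        n <- 250
        x <- rnorm(3 * n)   # monthly regressor
        y <- rnorm(n)       # quarterly response (illustrative only)

        ## Regress quarterly y on 8 monthly lags of x, restricted by a
        ## normalized exponential Almon lag polynomial (nealmon).
        fit <- midas_r(y ~ mls(x, 0:7, 3, nealmon),
                       start = list(x = c(1, -0.5)))
        summary(fit)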

  16. The R Package bgmm : Mixture Modeling with Uncertain Knowledge

    Directory of Open Access Journals (Sweden)

    Przemys law Biecek

    2012-04-01

    Full Text Available Classical supervised learning enjoys the luxury of accessing the true known labels for the observations in a modeled dataset. Real life, however, poses an abundance of problems where the labels are only partially defined, i.e., are uncertain and given only for a subset of observations. Such partial labels can occur regardless of the knowledge source. For example, an experimental assessment of labels may have limited capacity and is prone to measurement errors. Also, expert knowledge is often restricted to a specialized area and is thus unlikely to provide trustworthy labels for all observations in the dataset. Partially supervised mixture modeling is able to process such sparse and imprecise input. Here, we present an R package called bgmm, which implements two partially supervised mixture modeling methods: soft-label and belief-based modeling. For completeness, we also equipped the package with the functionality of unsupervised, semi- and fully supervised mixture modeling. On real data we present the usage of bgmm for basic model fitting in all modeling variants. The package can also be applied to select the best-fitting model from a set of models with different component numbers or constraints on their structures. This functionality is presented on an artificial dataset, which can be simulated in bgmm from a distribution defined by a given model.
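
    A minimal sketch of a belief-based fit; the function and argument names follow the modeling variants listed above but should be read as assumptions, and X, knowns and B are hypothetical objects (the full data matrix, its labeled subset, and the matrix of label beliefs).

        library(bgmm)

        ## X      - numeric matrix of all observations
        ## knowns - rows of X with (uncertain) labels
        ## B      - matrix of beliefs about the labels of 'knowns'
        fit <- belief(X = X, knowns = knowns, B = B, k = 3)  # 3-component mixture
        plot(fit)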

  17. Fitting Additive Binomial Regression Models with the R Package blm

    Directory of Open Access Journals (Sweden)

    Stephanie Kovalchik

    2013-09-01

    Full Text Available The R package blm provides functions for fitting a family of additive regression models to binary data. The included models are the binomial linear model, in which all covariates have additive effects, and the linear-expit (lexpit) model, which allows some covariates to have additive effects and other covariates to have logistic effects. Additive binomial regression is a model of event probability, and the coefficients of linear terms estimate covariate-adjusted risk differences. Thus, in contrast to logistic regression, additive binomial regression puts the focus on absolute risk and risk differences. In this paper, we give an overview of the methodology we have developed to fit the binomial linear and lexpit models to binary outcomes from cohort and population-based case-control studies. We illustrate the blm package's methods for additive model estimation, diagnostics, and inference with risk association analyses of a bladder cancer nested case-control study in the NIH-AARP Diet and Health Study.
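
    A minimal sketch of fitting the binomial linear model; the cohort data frame 'df' and its columns are hypothetical, and blm() is assumed to follow the usual formula/data interface.

        library(blm)

        ## 'event' is a 0/1 outcome; the coefficients of the fitted model are
        ## covariate-adjusted risk differences rather than log odds ratios.
        fit <- blm(event ~ exposure + age, data = df)
        summary(fit)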

  18. Unified Geostatistical Modeling for Data Fusion and Spatial Heteroskedasticity with R Package ramps

    Directory of Open Access Journals (Sweden)

    Brian J. Smith

    2008-03-01

    Full Text Available This article illustrates usage of the ramps R package, which implements the reparameterized and marginalized posterior sampling (RAMPS) algorithm for complex Bayesian geostatistical models. The RAMPS methodology allows joint modeling of areal and point-source data arising from the same underlying spatial process. A reparametrization of variance parameters facilitates slice sampling based on simplexes, which can be useful in general when multiple variances are present. Prediction at arbitrary points can be made, which is critical in applications where maps are needed. Our implementation takes advantage of sparse matrix operations in the Matrix package and can provide substantial savings in computing time for large datasets. A user-friendly interface, similar to the nlme mixed effects models package, enables users to analyze datasets with little programming effort. Support is provided for numerous spatial and spatiotemporal correlation structures, user-defined correlation structures, and non-spatial random effects. The package features are illustrated via a synthetic dataset of spatially correlated observations distributed across the state of Iowa, USA.

  19. A Prototype Graphical User Interface for Co-op: A Group Decision Support System.

    Science.gov (United States)

    1992-03-01

    Master's thesis by P. Steven Posey, Lieutenant, United States Navy (B.S., University of Arkansas, 1985). Only front-matter and table-of-contents fragments of this record survive (covering interface design topics such as color, screen layout and typography), together with a partial abstract on helping graphical user interfaces achieve their potential to communicate: "Information-oriented, systematic graphic design is the use of typography, symbols, color, and other static and ..."

  20. OptimalCutpoints: An R Package for Selecting Optimal Cutpoints in Diagnostic Tests

    Directory of Open Access Journals (Sweden)

    Mónica López-Ratón

    2014-11-01

    Full Text Available Continuous diagnostic tests are often used for discriminating between healthy and diseased populations. For the clinical application of such tests, it is useful to select a cutpoint or discrimination value c that defines positive and negative test results. In general, individuals with a diagnostic test value of c or higher are classified as diseased. Several search strategies have been proposed for choosing optimal cutpoints in diagnostic tests, depending on the underlying reason for this choice. This paper introduces an R package, known as OptimalCutpoints, for selecting optimal cutpoints in diagnostic tests. It incorporates criteria that take the costs of the different diagnostic decisions into account, as well as the prevalence of the target disease and several methods based on measures of diagnostic test accuracy. Moreover, it enables optimal levels to be calculated according to levels of given (categorical) covariates. While the numerical output includes the optimal cutpoint values and associated accuracy measures with their confidence intervals, the graphical output includes the receiver operating characteristic (ROC) and predictive ROC curves. An illustration of the use of OptimalCutpoints is provided, using a real biomedical dataset.
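
    A minimal sketch using the Youden index criterion; the data frame 'dat' and its columns 'marker' and 'status' are hypothetical, and optimal.cutpoints() is assumed to take the marker and status variables by name as in the package documentation.

        library(OptimalCutpoints)

        ## 'dat' is a hypothetical data frame with a continuous test result
        ## ('marker') and a disease indicator ('status': 0 = healthy, 1 = diseased).
        oc <- optimal.cutpoints(X = "marker", status = "status",
                                tag.healthy = 0, methods = "Youden", data = dat)
        summary(oc)   # optimal cutpoint and accuracy measures with confidence intervals
        plot(oc)      # ROC and predictive ROC curves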

  1. Statistical Disclosure Control for Microdata Using the R-Package sdcMicro

    Directory of Open Access Journals (Sweden)

    Matthias Templ

    2008-08-01

    Full Text Available The demand for high quality microdata for analytical purposes has grown rapidly among researchers and the public over the last few years. In order to respect existing laws on data privacy and to be able to provide microdata to researchers and the public, statistical institutes, agencies and other institutions may provide masked data. Using our flexible software tools, with which one can apply protection methods in an exploratory manner, it is possible to generate high quality confidential (micro)data. In this paper we present highly flexible and easy to use software for the generation of anonymized microdata and give insights into the implementation and the design of the R package sdcMicro. R is a highly extendable system for statistical computing and graphics, distributed over the net. sdcMicro contains almost all popular methods for the anonymization of both categorical and continuous variables. Furthermore, several new methods have been implemented. The package can also be used for the comparison of methods and for measuring the information loss and disclosure risk of the masked data.
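
    A minimal sketch of the kind of anonymization workflow the package supports, aimed at recent versions of sdcMicro; the data frame 'dat' and its variable names are hypothetical.

        library(sdcMicro)

        ## Define an SDC problem: categorical key variables and continuous variables
        sdc <- createSdcObj(dat,
                            keyVars = c("region", "sex", "age"),
                            numVars = c("income", "expenditure"))

        sdc <- microaggregation(sdc)   # mask the continuous variables
        print(sdc)                     # summarize the masked data object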

  2. MATLAB GUI (graphical user interface) for the design of GRIN components for optical systems as an educational tool

    Science.gov (United States)

    Gómez-Varela, A. I.; Bao-Varela, C.

    2014-07-01

    New technologies and the available computing tools are becoming more important every day in the evolution of teaching. The use of Graphical User Interfaces (GUI) with MATLAB enables the implementation of practical teaching methodologies that make the comprehension of a given subject easier. In this work, we report on the application of GUIs in order to provide the students with a simple tool for a better understanding of how to design GRIN elements for optical systems. Another advantage of GUIs is that they can be converted to an executable file, so any student can use the interface on their own computer without having a MATLAB license. We present a graphical interface to show the performance of an optical device for controlling beam size and for deflecting light for coupling purposes, by means of a simple geometrical optics study of a tapered GRIN lens illuminated by a parallel beam of tilted rays. We also show a graphical interface to obtain the maximum coupling efficiency between the fundamental modes of two single-mode fibers by a scaling operation carried out by a GRIN fiber lens. With this interface the students can vary the magnification and the image plane in order to get the most suitable GRIN fiber lens to maximize the coupling efficiency between two fibers.

  3. The convergence of robotics, vision, and computer graphics for user interaction

    Energy Technology Data Exchange (ETDEWEB)

    Hollerback, J.M.; Thompson, W.B.; Shirley, P.

    1999-11-01

    Mechanical interfaces to virtual environments and the creation of virtual environments represent important and relatively new application areas for robotics. The creation of immersive interfaces will require codevelopment of visual displays that complement mechanical stimuli with appropriate visual cues, ultimately determined from human psychophysics. Advances in interactive rendering and geometric modeling from computer graphics will play a key role. Examples are drawn from haptic and locomotion interface projects.

  4. The R-package 'eseis' - towards a toolbox for comprehensive seismic data analysis

    Science.gov (United States)

    Dietze, Michael

    2015-04-01

    There are plenty of software solutions to process seismic data. However, most of these are either not free and open-source, are focused on specialised tasks, lack appropriate documentation/examples or are limited to command-line processing. R is among the most widely used and fastest growing scientific software environments worldwide. This free and open-source software allows contribution of user-built function packages (currently 6091) that cover nearly all scientific research fields. However, support for seismic data has so far been limited. This contribution presents the R package 'eseis', a collection of functions to handle seismic data, mostly for but not limited to "environmental seismology", i.e. the analysis of seismic signals emitted by Earth surface processes such as landslides, rockfalls or debris flows. The package allows import/export/conversion of different data formats (cube, mseed, sac), signal processing (deconvolution, filtering, clipping/merging, power spectral density estimates), event handling (triggering, locating) and data visualisation (2D plots, images, animations). The main advantages of using this package are the embedding of processed data in a large framework of other scientific analysis approaches, the presence of sound documentation and tested examples, the benefit of a worldwide help and discussion network, and the possibility for users to modify all functions and extend the functionality.
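
    A minimal sketch of the intended import-process-plot workflow; the function and argument names follow the capabilities listed above but should be treated as assumptions about the package API, and the file name is hypothetical.

        library(eseis)

        s <- read_mseed("station_01.mseed")    # import a miniSEED file
        s <- signal_deconvolve(s)              # remove the instrument response
        s <- signal_filter(s, f = c(1, 90))    # band-pass filter, 1-90 Hz
        plot_signal(s)                         # plot the processed trace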

  5. A MATLAB-based graphical user interface for the identification of muscular activations from surface electromyography signals.

    Science.gov (United States)

    Mengarelli, Alessandro; Cardarelli, Stefano; Verdini, Federica; Burattini, Laura; Fioretti, Sandro; Di Nardo, Francesco

    2016-08-01

    In this paper a graphical user interface (GUI) built in the MATLAB® environment is presented. This interactive tool has been developed for the analysis of surface electromyography (sEMG) signals and in particular for the assessment of muscle activation time intervals. After the signal import, the tool performs a first analysis in a totally user-independent way, providing a reliable computation of the muscular activation sequences. Furthermore, the user has the opportunity to modify each parameter of the on/off identification algorithm implemented in the presented tool. The presence of a user-friendly GUI allows the immediate evaluation of the effects that the modification of every single parameter has on the recognition of activation intervals, through the real-time updating and visualization of the muscular activation/deactivation sequences. The possibility to accept the initial signal analysis or to modify the on/off identification with respect to each considered signal, with real-time visual feedback, makes this GUI-based tool a valuable instrument in clinical and research applications, and also from an educational perspective.

  6. sdef: an R package to synthesize lists of significant features in related experiments

    Directory of Open Access Journals (Sweden)

    Richardson Sylvia

    2010-05-01

    Full Text Available Abstract Background: In microarray studies researchers are often interested in the comparison of relevant quantities between two or more similar experiments, involving different treatments, tissues, or species. Typically each experiment reports measures of significance (e.g. p-values) or other measures that rank its features (e.g. genes). Our objective is to find a list of features that are significant in all experiments, to be further investigated. In this paper we present an R package called sdef, that allows the user to quantify the evidence of communality between the experiments using previously proposed statistical methods based on the ranked lists of p-values. sdef implements two approaches that address this objective: the first is a permutation test of the maximal ratio of observed to expected common features under the hypothesis of independence between the experiments. The second approach, set in a Bayesian framework, is more flexible as it takes into account the uncertainty in the number of genes differentially expressed in each experiment. Results: We used sdef to re-analyze publicly available data (i) on Type 2 diabetes susceptibility in mice on liver and skeletal muscle (two experiments); (ii) on molecular similarities between mammalian sexes (three experiments). For the first example, we found between 68 and 104 genes commonly perturbed between the two tissues, using the two methods described above, and enrichment of the inflammation pathways, which are related to obesity and diabetes. For the second example, looking at three lists of features, we found 110 genes commonly perturbed between the three tissues, using the same two methods, and enrichment of genes involved in cell development. Conclusions: sdef is an R package that provides researchers with an easy and powerful methodology to find lists of features commonly perturbed in two or more experiments to be further investigated. The package is provided with plots and tables to help the user

  7. ModelMuse: A U.S. Geological Survey Open-Source, Graphical User Interface for Groundwater Models

    Science.gov (United States)

    Winston, R. B.

    2013-12-01

    ModelMuse is a free publicly-available graphical preprocessor used to generate the input and display the output for several groundwater models. It is written in Object Pascal and the source code is available on the USGS software web site. Supported models include the MODFLOW family of models, PHAST (version 1), and SUTRA version 2.2. With MODFLOW and PHAST, the user generates a grid and uses 'objects' (points, lines, and polygons) to define boundary conditions and the spatial variation in aquifer properties. Because the objects define the spatial variation, the grid can be changed without the user needing to re-enter spatial data. The same paradigm is used with SUTRA except that the user generates a quadrilateral finite-element mesh instead of a rectangular grid. The user interacts with the model in a top view and in a vertical cross section. The cross section can be at any angle or location. There is also a three-dimensional view of the model. For SUTRA, a new method of visualizing the permeability and related properties has been introduced. In three dimensional SUTRA models, the user specifies the permeability tensor by specifying permeability in three mutually orthogonal directions that can be oriented in space in any direction. Because it is important for the user to be able to check both the magnitudes and directions of the permeabilities, ModelMuse displays the permeabilities as either a two-dimensional or a three-dimensional vector plot. Color is used to differentiate the maximum, middle, and minimum permeability vectors. The magnitude of the permeability is shown by the vector length. The vector angle shows the direction of the maximum, middle, or minimum permeability. Contour and color plots can also be used to display model input and output data.

  8. MethLAB: a graphical user interface package for the analysis of array-based DNA methylation data.

    Science.gov (United States)

    Kilaru, Varun; Barfield, Richard T; Schroeder, James W; Smith, Alicia K; Conneely, Karen N

    2012-03-01

    Recent evidence suggests that DNA methylation changes may underlie numerous complex traits and diseases. The advent of commercial, array-based methods to interrogate DNA methylation has led to a profusion of epigenetic studies in the literature. Array-based methods, such as the popular Illumina GoldenGate and Infinium platforms, estimate the proportion of DNA methylated at single-base resolution for thousands of CpG sites across the genome. These arrays generate enormous amounts of data, but few software resources exist for efficient and flexible analysis of these data. We developed a software package called MethLAB (http://genetics.emory.edu/conneely/MethLAB) using R, an open source statistical language that can be edited to suit the needs of the user. MethLAB features a graphical user interface (GUI) with a menu-driven format designed to efficiently read in and manipulate array-based methylation data in a user-friendly manner. MethLAB tests for association between methylation and relevant phenotypes by fitting a separate linear model for each CpG site. These models can incorporate both continuous and categorical phenotypes and covariates, as well as fixed or random batch or chip effects. MethLAB accounts for multiple testing by controlling the false discovery rate (FDR) at a user-specified level. Standard output includes a spreadsheet-ready text file and an array of publication-quality figures. Considering the growing interest in and availability of DNA methylation data, there is a great need for user-friendly open source analytical tools. With MethLAB, we present a timely resource that will allow users with no programming experience to implement flexible and powerful analyses of DNA methylation data.

  9. Solving Differential Equations in R: Package deSolve

    Directory of Open Access Journals (Sweden)

    Karline Soetaert

    2010-02-01

    Full Text Available In this paper we present the R package deSolve to solve initial value problems (IVP) written as ordinary differential equations (ODE), differential algebraic equations (DAE) of index 0 or 1, and partial differential equations (PDE), the latter solved using the method of lines approach. The differential equations can be represented in R code or as compiled code. In the latter case, R is used as a tool to trigger the integration and post-process the results, which facilitates model development and application, whilst the compiled code significantly increases simulation speed. The methods implemented are efficient, robust, and well documented public-domain Fortran routines. They include four integrators from the ODEPACK package (LSODE, LSODES, LSODA, LSODAR), DVODE and DASPK2.0. In addition, a suite of Runge-Kutta integrators and special-purpose solvers to efficiently integrate 1-, 2- and 3-dimensional partial differential equations are available. The routines solve both stiff and non-stiff systems, and include many options, e.g., to deal in an efficient way with the sparsity of the Jacobian matrix, or finding the root of equations. In this article, our objectives are threefold: (1) to demonstrate the potential of using R for dynamic modeling, (2) to highlight typical uses of the different methods implemented and (3) to compare the performance of models specified in R code and in compiled code for a number of test cases. These comparisons demonstrate that, if the use of loops is avoided, R code can efficiently integrate problems comprising several thousands of state variables. Nevertheless, the same problem may be solved from 2 to more than 50 times faster by using compiled code compared to an implementation using only R code. Still, amongst the benefits of R are a more flexible and interactive implementation, better readability of the code, and access to R’s high-level procedures. deSolve is the successor of package odesolve which will be deprecated in
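
    A minimal, self-contained illustration of the core ode() workflow; the logistic growth model is only an example and is not taken from the paper.

        library(deSolve)

        ## Logistic growth: dN/dt = r * N * (1 - N / K)
        logistic <- function(t, state, parms) {
          with(as.list(c(state, parms)), {
            dN <- r * N * (1 - N / K)
            list(c(dN))
          })
        }

        out <- ode(y     = c(N = 1),
                   times = seq(0, 50, by = 0.5),
                   func  = logistic,
                   parms = c(r = 0.2, K = 100))

        head(out)   # matrix with time and state columns
        plot(out)   # quick diagnostic plot of the solution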

  10. Formal Model for Data Dependency Analysis between Controls and Actions of a Graphical User Interface

    Directory of Open Access Journals (Sweden)

    SKVORC, D.

    2012-02-01

    Full Text Available End-user development is an emerging computer science discipline that provides programming paradigms, techniques, and tools suitable for users not trained in software engineering. One of the techniques that allow ordinary computer users to develop their own applications without the need to learn a classic programming language is a GUI-level programming based on programming-by-demonstration. To build wizard-based tools that assist users in application development and to verify the correctness of user programs, a computer-supported method for GUI-level data dependency analysis is necessary. Therefore, formal model for GUI representation is needed. In this paper, we present a finite state machine for modeling the data dependencies between GUI controls and GUI actions. Furthermore, we present an algorithm for automatic construction of finite state machine for arbitrary GUI application. We show that proposed state aggregation scheme successfully manages state explosion in state machine construction algorithm, which makes the model applicable for applications with complex GUIs.

  11. Neuronvisio: a Graphical User Interface with 3D capabilities for NEURON

    Directory of Open Access Journals (Sweden)

    Michele eMattioni

    2012-06-01

    Full Text Available The NEURON simulation environment is a commonly used tool to perform electrical simulation of neurons and neuronal networks. The NEURON User Interface, based on the now discontinued InterViews library, provides some limited facilities to explore models and to plot their simulation results. Other limitations include the inability to generate a three-dimensional visualization and the lack of a standard means to save the results of simulations or to store the model geometry within the results. Neuronvisio (http://mattions.github.com/neuronvisio/) aims to address these deficiencies through a set of well-designed Python APIs and provides an improved UI, allowing users to explore and interact with the model. Neuronvisio also facilitates access to previously published models, allowing users to browse, download and locally run NEURON models stored in ModelDB. Neuronvisio uses the matplotlib library to plot simulation results and uses the HDF standard format to store simulation results. Neuronvisio can be viewed as an extension of NEURON, facilitating typical user workflows such as model browsing, selection, download, compilation and simulation. The 3D viewer simplifies the exploration of complex model structures, while matplotlib permits the plotting of high-quality graphs. The newly introduced ability to save numerical results allows users to perform additional analysis on their previous simulations.

  12. imDEV: a graphical user interface to R multivariate analysis tools in Microsoft Excel

    Science.gov (United States)

    Interactive modules for data exploration and visualization (imDEV) is a Microsoft Excel spreadsheet embedded application providing an integrated environment for the analysis of omics data sets with a user-friendly interface. Individual modules were designed to provide toolsets to enable interactive ...

  13. Librarian driven analysis with graphic user interface for nuclides quantification by gamma spectra

    Energy Technology Data Exchange (ETDEWEB)

    Kondrashov, V.S. E-mail: vlkondra@cdrewu.edu; Rothenberg, S.J.; Petersone, I

    2001-09-11

    For a set of a priori given radionuclides extracted from a general nuclide data library, the authors use median estimates of the gamma-peak areas to produce a list of possible radionuclides matching the gamma-ray line(s). An a priori determined list of nuclides is obtained by searching for a match with the energy information of the database. This procedure is performed in an interactive graphic mode by markers that superimpose, on the spectral data, the energy information and yields provided by a general gamma-ray data library. This library of experimental data includes approximately 17,000 gamma-energy lines related to 756 known gamma-emitter radionuclides listed by the ICRP.

  14. animation : An R Package for Creating Animations and Demonstrating Statistical Methods

    Directory of Open Access Journals (Sweden)

    Yihui Xie

    2013-04-01

    Full Text Available Animated graphs that demonstrate statistical ideas and methods can both attract interest and assist understanding. In this paper we first discuss how animations can be related to statistical topics such as iterative algorithms, random simulations, (re)sampling methods and dynamic trends, then we describe the approaches that may be used to create animations, and give an overview of the R package animation, including its design, usage and the statistical topics covered in the package. With the animation package, we can export the animations produced by R into a variety of formats, such as a web page, a GIF animation, a Flash movie, a PDF document, or an MP4/AVI video, so that users can publish the animations fairly easily. The design of this package is flexible enough to be readily incorporated into web applications, e.g., we can generate animations online with Rweb, which means we do not even need R to be installed locally to create animations. We show examples of the use of animations in teaching statistics and in the presentation of statistical reports using Sweave or knitr. In fact, this paper itself was written with the knitr and animation packages, and the animations are embedded in the PDF document, so that readers can watch the animations in real time when they read the paper (Adobe Reader is required). Animations can add insight and interest to traditional static approaches to teaching statistics and reporting, making statistics a more interesting and appealing subject.
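
    A minimal example of exporting an animation to a GIF with saveGIF(); an external ImageMagick (or GraphicsMagick) installation is needed for this output format, and the random-walk plots are only an illustration.

        library(animation)

        saveGIF({
          set.seed(1)
          for (i in 1:20) {
            ## Each iteration draws one frame of the animation
            plot(cumsum(rnorm(100)), type = "l", ylim = c(-25, 25),
                 xlab = "step", ylab = "position",
                 main = paste("Random walk, replicate", i))
          }
        }, movie.name = "random_walks.gif", interval = 0.2)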

  15. Redevelopment of ANSYS Graphical User Interface Based on UIDL and Tcl/Tk

    Institute of Scientific and Technical Information of China (English)

    张朋; 王丽娟

    2013-01-01

    Operating ANSYS for specific professional problems is cumbersome and difficult to master, so this paper introduces a method for the secondary development of the ANSYS graphical user interface using UIDL and Tcl/Tk to address this shortcoming. Following this method, a graphical user interface for single-pile engineering was developed, providing a reference for the redevelopment of the ANSYS graphical user interface for other problems.

  16. RGUI 1.0, New Graphical User Interface for RELAP5-3D

    Energy Technology Data Exchange (ETDEWEB)

    G. L. Mesina; J. Galbraith

    1999-04-01

    With the advent of three-dimensional modeling in nuclear safety analysis codes, the need has arisen for a new display methodology. Currently, analysts either sort through voluminous numerical displays of data at points in a region, or view color coded interpretations of the data on a two-dimensional rendition of the plant. RGUI 1.0 provides 3D capability for displaying data. The 3D isometric hydrodynamic image is built automatically from the input deck without additional input from the user. Standard view change features allow the user to focus on only the important data. Familiar features that are standard to the nuclear industry, such as run, interact, and monitor, are included. RGUI 1.0 reduces the difficulty of analyzing complex three-dimensional plants.

  17. RGUI 1.0, New Graphical User Interface for RELAP5-3D

    Energy Technology Data Exchange (ETDEWEB)

    Mesina, George Lee; Galbraith, James Andrew

    1999-04-01

    With the advent of three-dimensional modeling in nuclear safety analysis codes, the need has arisen for a new display methodology. Currently, analysts either sort through voluminous numerical displays of data at points in a region, or view color coded interpretations of the data on a two-dimensional rendition of the plant. RGUI 1.0 provides 3D capability for displaying data. The 3D isometric hydrodynamic image is built automatically from the input deck without additional input from the user. Standard view change features allow the user to focus on only the important data. Familiar features that are standard to the nuclear industry, such as run, interact, and monitor, are included. RGUI 1.0 reduces the difficulty of analyzing complex three dimensional plants.

  18. Development of Graphical User Interface for Finite Element Analysis of Static Loading of a Column using MATLAB

    Directory of Open Access Journals (Sweden)

    Moses Omolayo PETINRIN

    2010-12-01

    Full Text Available In this work, the capability of the MATLAB software package to develop a graphical user interface (GUI) package was demonstrated. A GUI was successfully developed using the MATLAB programming language to study the behaviour of a suspended column under uniaxial static loading by solving a numerical model created based on the finite element method (FEM). The comparison between the exact solution from previous research and the numerical analysis showed good agreement. The column average strain, average stress and average load are equivalent to, but more accurate than, the ones obtained when the whole column is taken as one element (two nodes) in a one-dimensional linear finite element problem. It was established in this work that MATLAB is not only a software package for numerical computation but also for application development.

  19. SYRMEP Tomo Project: a graphical user interface for customizing CT reconstruction workflows.

    Science.gov (United States)

    Brun, Francesco; Massimi, Lorenzo; Fratini, Michela; Dreossi, Diego; Billé, Fulvio; Accardo, Agostino; Pugliese, Roberto; Cedola, Alessia

    2017-01-01

    When considering the acquisition of experimental synchrotron radiation (SR) X-ray CT data, the reconstruction workflow cannot be limited to the essential computational steps of flat fielding and filtered back projection (FBP). More refined image processing is often required, usually to compensate for artifacts and enhance the quality of the reconstructed images. In principle, it would be desirable to optimize the reconstruction workflow at the facility during the experiment (beamtime). However, several practical factors affect the image reconstruction part of the experiment and users are likely to conclude the beamtime with sub-optimal reconstructed images. Through an example of application, this article presents SYRMEP Tomo Project (STP), an open-source software tool conceived to let users design custom CT reconstruction workflows. STP has been designed for post-beamtime (off-line) use and for new reconstructions of past archived data at the user's home institution, where simple computing resources are available. Releases of the software can be downloaded at the Elettra Scientific Computing group GitHub repository https://github.com/ElettraSciComp/STP-Gui.

  20. HT-BONE: a graphical user interface for the identification of bone profiles in CT images via extended Hough transform

    Science.gov (United States)

    Campi, Cristina; Perasso, Annalisa; Beltrametti, Mauro C.; Piana, Michele; Sambuceti, Gianmario; Massone, Anna Maria

    2016-03-01

    It has been recently proved that the computational analysis of X-ray Computed Tomography (CT) images allows clinicians to assess the alteration of the compact bone asset due to hematological diseases. HT-BONE implements a new method, based on an extension of the Hough transform (HT) to a wide class of algebraic curves, for accurately measuring global and regional geometric properties of trabecular and compact bone districts. In the case of CT/PET analysis, the segmentation of the CT images provides masks for Positron Emission Tomography (PET) data, extracting the metabolic activity in the region surrounded by compact bone tissue. HT-BONE offers an intuitive, user-friendly, MATLAB-based Graphical User Interface (GUI) for all input/output procedures and for the automatic management of the segmentation process, even for non-expert users: the CT/PET data can be loaded and browsed easily, and the only pre-processing required from the user is the drawing of Regions Of Interest (ROIs) around the bone districts under consideration. For each bone district, specific families of curves, whose reliability has already been tested in previous works, are automatically selected for the recognition task via the HT. As output, the software returns masks of the segmented compact bone regions, images of the Standard Uptake Values (SUV) in the masked regions of PET slices, and the values of the parameters in the curve equations utilized in the HT procedure. This information can be used for all pathologies and clinical conditions for which the alteration of the compact bone asset or bone marrow distribution plays a crucial role.

  1. Robot services for elderly with cognitive impairment: testing usability of graphical user interfaces.

    Science.gov (United States)

    Granata, C; Pino, M; Legouverneur, G; Vidal, J-S; Bidaud, P; Rigaud, A-S

    2013-01-01

    Socially assistive robotics for elderly care is a growing field. However, although robotics has the potential to support the elderly in daily tasks by offering specific services, the development of usable interfaces is still a challenge. Since several factors, such as age- or disease-related changes in perceptual or cognitive abilities and familiarity with computer technologies, influence technology use, they must be considered when designing interfaces for these users. This paper presents findings from usability testing of two different services provided by a socially assistive robot intended for elderly people with cognitive impairment: a grocery shopping list and an agenda application. The main goal of this study is to identify the usability problems of the robot interface for target end-users as well as to isolate the human factors that affect the use of the technology by the elderly. Socio-demographic characteristics and computer experience were examined as factors that could have an influence on task performance. A group of 11 elderly persons with Mild Cognitive Impairment and a group of 11 cognitively healthy elderly individuals took part in this study. Performance measures (task completion time and number of errors) were collected. Cognitive profile, age and computer experience were found to impact task performance. Participants with cognitive impairment completed the tasks committing more errors than the cognitively healthy elderly. Instead, younger participants and those with previous computer experience were faster at completing the tasks, confirming previous findings in the literature. The overall results suggested that the interfaces and contents of the services assessed were usable by older adults with cognitive impairment. However, some usability problems were identified and should be addressed to better meet the needs and capacities of the target end-users.

  2. INSPECT: A graphical user interface software package for IDARC-2D

    Science.gov (United States)

    AlHamaydeh, Mohammad; Najib, Mohamad; Alawnah, Sameer

    Modern-day Performance-Based Earthquake Engineering (PBEE) pivots on nonlinear analysis and its feasibility. IDARC-2D is a widely used and accepted software package for nonlinear analysis; it possesses many attractive features and capabilities. However, it is operated from the command prompt in DOS/Unix systems and requires the user to create elaborate text-based input files. To complement and facilitate the use of IDARC-2D, a pre-processing GUI software package (INSPECT) is introduced herein. INSPECT is created in the C# environment and utilizes the .NET libraries and an SQLite database. Extensive testing and verification demonstrated successful and high-fidelity re-creation of several existing IDARC-2D input files. Its design and built-in features aim at expediting, simplifying and assisting in the modeling process. Moreover, this practical aid enhances the reliability of the results and improves accuracy by reducing and/or eliminating many potential and common input mistakes. Such benefits would be appreciated by novice and veteran IDARC-2D users alike.

  3. rAvis: an R-package for downloading information stored in Proyecto AVIS, a citizen science bird project.

    Science.gov (United States)

    Varela, Sara; González-Hernández, Javier; Casabella, Eduardo; Barrientos, Rafael

    2014-01-01

    Citizen science projects store an enormous amount of information about species distribution, diversity and characteristics. Researchers are now beginning to make use of this rich collection of data. However, access to these databases is not always straightforward. Apart from the largest international projects, citizen science repositories often lack specific Application Programming Interfaces (APIs) to connect them to scientific computing environments. Thus, it is necessary to develop simple routines that allow researchers to take advantage of the information collected by smaller citizen science projects, for instance by programming specific packages to connect them to popular scientific environments (like R). Here, we present rAvis, an R package to connect R users with Proyecto AVIS (http://proyectoavis.com), a Spanish citizen science project with more than 82,000 bird observation records. We developed several functions to explore the database, to plot the geographic distribution of species occurrences, and to generate personal queries to the database about species occurrences (number of individuals, distribution, etc.) and birdwatcher observations (number of species recorded by each collaborator, UTMs visited, etc.). This new R package will allow scientists to access this database and to exploit the information generated by Spanish birdwatchers over the last 40 years.

  4. parallelnewhybrid: an R package for the parallelization of hybrid detection using newhybrids.

    Science.gov (United States)

    Wringe, Brendan F; Stanley, Ryan R E; Jeffery, Nicholas W; Anderson, Eric C; Bradbury, Ian R

    2017-01-01

    Hybridization among populations and species is a central theme in many areas of biology, and the study of hybridization has direct applicability to testing hypotheses about evolution, speciation and genetic recombination, as well as having conservation, legal and regulatory implications. Yet, despite being a topic of considerable interest, the identification of hybrid individuals, and quantification of the (un)certainty surrounding the identifications, remains difficult. Unlike other programs that exist to identify hybrids based on genotypic information, newhybrids is able to assign individuals to specific hybrid classes (e.g. F1, F2) because it makes use of patterns of gene inheritance within each locus, rather than just the proportions of gene inheritance within each individual. For each comparison and set of markers, multiple independent runs of each data set should be used to develop an estimate of the hybrid class assignment accuracy. The necessity of analysing multiple simulated data sets, constructed from large genomewide data sets, presents significant computational challenges. To address these challenges, we present parallelnewhybrid, an R package designed to decrease user burden when undertaking multiple newhybrids analyses. parallelnewhybrid does so by taking advantage of the parallel computational capabilities inherent in modern computers to efficiently and automatically execute separate newhybrids runs in parallel. We show that parallelization of analyses using this package affords users several-fold reductions in time over a traditional serial analysis. parallelnewhybrid consists of an example data set, a readme and three operating-system-specific functions to execute parallel newhybrids analyses on each of a computer's c cores. parallelnewhybrid is freely available on the long-term software hosting site GitHub (www.github.com/bwringe/parallelnewhybrid). © 2016 John Wiley & Sons Ltd.

  5. Louisiana coastal GIS network: Graphical user interface for access to spatial data

    Science.gov (United States)

    Hiland, Matteson; McBride, Randolph A.; Davis, Donald; Braud, Dewitt; Streiffer, Henry; Jones, Farrell; Lewis, Anthony; Williams, S.

    1991-01-01

    Louisiana's coastal wetlands support a large percentage of the nation's seafood and fur industries, vast deposits of oil and natural gas, habitat for thousands of species of plants and animals, winter nesting grounds and migratory paths for numerous waterfowl, and many recreational resources enjoyed by residents and tourists. Louisiana's wetlands also have the highest rates of coastal erosion and wetland loss in the nation. While numerous studies across many disciplines have been conducted on both local and regional scales, no complete inventory exists for this information. The Louisiana Coastal Geographic Information System Network (LCGISN) is currently being developed to facilitate access to existing data for coastal zone planners, managers, and researchers. The Louisiana Geological Survey (LGS), in cooperation with the LSU Department of Geography and Anthropology, the Computer Aided Design and Geographic Information Systems Research Laboratory (CADGIS), and others, is pursuing this project under the terms of a cooperative agreement with the U.S. Geological Survey. LCGISN is an automated system for searching and retrieving geographic, cartographic, and bibliographic data. By linking original programming with an existing GIS software package and an industry standard relational database management system, LCGISN will provide the capability for users to search for data references by interactively defining the area of interest on a displayed map/image reference background. Several agencies will be networked to provide easy access to a wide variety of information. LCGISN, with its headquarters at LGS, will serve as the central node on the network, providing data format conversions, projection and datum transformations, and storage of several of the most commonly used data sets. Thematic mapper data, USGS 7.5-minute quadrangle map boundaries, political and legal boundaries, major transportation routes, and other digital data will provide a base map to aid the user in

  6. Leaf extraction and analysis framework graphical user interface: segmenting and analyzing the structure of leaf veins and areoles.

    Science.gov (United States)

    Price, Charles A; Symonova, Olga; Mileyko, Yuriy; Hilley, Troy; Weitz, Joshua S

    2011-01-01

    Interest in the structure and function of physical biological networks has spurred the development of a number of theoretical models that predict optimal network structures across a broad array of taxonomic groups, from mammals to plants. In many cases, direct tests of predicted network structure are impossible given the lack of suitable empirical methods to quantify physical network geometry with sufficient scope and resolution. There is a long history of empirical methods to quantify the network structure of plants, from roots, to xylem networks in shoots and within leaves. However, with few exceptions, current methods emphasize the analysis of portions of, rather than entire networks. Here, we introduce the Leaf Extraction and Analysis Framework Graphical User Interface (LEAF GUI), a user-assisted software tool that facilitates improved empirical understanding of leaf network structure. LEAF GUI takes images of leaves where veins have been enhanced relative to the background, and following a series of interactive thresholding and cleaning steps, returns a suite of statistics and information on the structure of leaf venation networks and areoles. Metrics include the dimensions, position, and connectivity of all network veins, and the dimensions, shape, and position of the areoles they surround. Available for free download, the LEAF GUI software promises to facilitate improved understanding of the adaptive and ecological significance of leaf vein network structure.

  7. Electro pneumatic trainer embedded with programmable integrated circuit (PIC) microcontroller and graphical user interface platform for aviation industries training purposes

    Science.gov (United States)

    Burhan, I.; Azman, A. A.; Othman, R.

    2016-10-01

    An electro pneumatic trainer embedded with a programmable integrated circuit (PIC) microcontroller and a Visual Basic (VB) platform was fabricated as a supporting tool for the existing teaching and learning process, to achieve the objectives and learning outcomes of enhancing students' knowledge and hands-on skills, especially with electro pneumatic devices. The existing learning process for electro pneumatic courses conducted in the classroom does not emphasize simulation or complex practical aspects. VB is used as the platform for the graphical user interface (GUI), while the PIC serves as the interface circuit between the GUI and the hardware of the electro pneumatic apparatus. The trainer interfaces the PIC with VB and incorporates multiple types of electro pneumatic apparatus, such as a linear drive, an air motor, a semi-rotary motor, a double-acting cylinder and a single-acting cylinder. The newly fabricated microcontroller interface can be programmed and re-programmed for numerous combinations of tasks. Based on a survey of 175 student participants, 97% of the respondents agreed that the newly fabricated trainer is user-friendly, safe and attractive, and 96.8% of the respondents strongly agreed that it improved knowledge development and hands-on skills in their learning process. Furthermore, the Lab Practical Evaluation record indicated that the respondents improved their academic performance (hands-on skills) by an average of 23.5%.

  8. SUPPLY CHAIN MANAGEMENT SYSTEM USING THE PROPERTY OF GRAPHICAL USER INTERFACE

    Directory of Open Access Journals (Sweden)

    Venkatesan.M,

    2011-02-01

    Full Text Available Manufacturing companies use supply chain management (SCM) systems. SCM is a system for managing raw material and finished goods requirements in a manufacturing process. It is a set of techniques that uses inventory data and requirements for materials and goods; the system also makes recommendations for purchasing, selling and sending raw materials out for job work. The main objective of the project is to maintain the raw materials and finished goods in the manufacturing organization. In this software the information is stored in a database, which prepares reports on request, behind a reliable front-end with GUI properties that lets even a lay user understand the system and work with it correctly. The project is developed using Oracle Developer 2000 Forms 6i as the front end, Oracle 8.0 as the back end, and Oracle Developer 2000 Reports 6i as the reporting tool.

  9. Graphical user interface (GUIDE) and semi-automatic system for the acquisition of anaglyphs

    Science.gov (United States)

    Canchola, Marco A.; Arízaga, Juan A.; Cortés, Obed; Tecpanecatl, Eduardo; Cantero, Jose M.

    2013-09-01

    Diverse educational experiences have shown that children accept ideas related to science more readily than adults. That fact, together with their great curiosity, makes children a promising audience for scientific outreach efforts. Moreover, 3D digital images have become a topic of growing importance in various areas, mainly entertainment, film and video games, but also in fields such as medical practice, where they are essential for disease detection. This article presents a system model for 3D images for educational purposes that allows students of various grade levels, from school to college, to approach image processing, explaining the use of filters for stereoscopic images that give the brain an impression of depth. The system is based on two elements: hardware centered on an Arduino board, and software based on Matlab. The paper presents the design and construction of each element, presents information on the images obtained and, finally, describes how users can interact with the device.

  10. WiSPR - A Graphical User Interface for Accessing a Sybase Database

    Science.gov (United States)

    Williamson, Ramon L., II

    WiSPR is a Tcl/Tk script in the X environment that builds a display of the fields of an arbitrary database table in an easy-to-read format on the fly. Each field of the database table has a text widget into which search strings for that field may be entered. When all desired search fields have been filled, an SQL query is constructed and values are returned into the text widgets one row at a time. Subsequent matches to the query are displayed until all matching rows have been retrieved. WiSPR recognizes all SQL wildcards when doing queries, and once values are returned, the results may be saved to a file or printed on a printer. In addition, if write access to the database in question is available, the user can add new rows to the database or update entries in a row already retrieved. On-line help is available at any time using a menu-driven help system. Sybase database access is accomplished by means of the sybtcl library, written as an extension to Tcl/Tk by Tom Poindexter of Denver, Colorado.

  11. moult : An R Package to Analyze Moult in Birds

    Directory of Open Access Journals (Sweden)

    Birgit Erni

    2013-01-01

    Full Text Available Moult is the process by which birds replace their feathers. It is a costly process in terms of energy and reduced flight ability but necessary for the maintenance of the plumage and its functions. Because birds generally avoid moulting while engaged in other energy-demanding activities such as breeding and migration, the analysis of moult data gives insight into how birds fit this life stage into the annual cycle, into time constraints in the annual cycle, and into the effects of environmental variables on the timing of moult. The analysis of moult data requires non-standard statistical techniques. More than 20 years ago Underhill and Zucchini developed a likelihood approach for estimating duration, mean start date and variation in start date of a population of moulting birds. However, use of these models has been limited, mainly due to the lack of user-friendly software. The moult package for R implements the Underhill-Zucchini models, allowing the user to specify moult models in a regression-type formula. In addition, the functions allow the moult parameters (duration, and mean and variation in start date) to depend on explanatory variables. We here describe the package, give a brief summary of the theory and illustrate the models on two datasets included in the package.
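
    A minimal sketch of the regression-type interface described above follows; the example data set name and its column names are assumptions that should be checked against the package documentation.

    library(moult)
    data(sanderlings)                    # example data shipped with the package (assumed name)

    # Underhill-Zucchini model: moult index as a function of day of year
    m1 <- moult(MIndex ~ Day, data = sanderlings)
    summary(m1)                          # estimated duration, mean start date and start-date SD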

  12. Graphical User Interface Software for Gross Defect Detection at the Atucha-I Plant

    Energy Technology Data Exchange (ETDEWEB)

    Wong, A C; Sitaraman, S; Ham, Y S; Peixoto, O

    2012-05-10

    At the Atucha-I pressurized heavy water reactor in Argentina, fuel assemblies in the spent fuel pools are stored by suspending them in two vertically stacked layers. This introduces the unique problem of verifying the presence of fuel in either layer without physically moving the fuel assemblies. Movement of fuel, especially from the lower layer, would involve a major effort on the part of the operator. Given that the facility uses both natural uranium and slightly enriched uranium at 0.85 w% {sup 235}U, and has been in operation since 1974, a wide range of burnups and cooling times can exist in any given pool. Additionally, while fuel assemblies are grouped together in a uniform fashion, the packing density from group to group can vary within a single pool. A tool called the Spent Fuel Neutron Counter (SFNC) was developed and successfully tested at the site to verify, in an in-situ condition, the presence of fuel up to burnups of 8,000 MWd/t. Since the neutron source term becomes a nonlinear function of burnup beyond this burnup, a new algorithm was developed to predict expected response from the SFNC at measurement locations covering the entire range of burnups, cooling times, and initial enrichments. With the aid of a static database of parameters including intrinsic sources and energy group-wise detector response functions, as well as explicit spent fuel information including burnups, cooling times, enrichment types, and spacing between fuel assemblies, an expected response for any given location can be calculated by summing the contributions from the relevant neighboring fuel assemblies. Thus, the new algorithm maps the expected responses across the various pools providing inspectors with a visual aid in verifying the presence of the spent fuel assemblies. This algorithm has been fully integrated into a standalone application built in LabVIEW. The GUI uses a step-by-step approach to allow the end-user to first calibrate the predicted database against a set of

  13. Arabidopsis Gene Family Profiler (aGFP – user-oriented transcriptomic database with easy-to-use graphic interface

    Directory of Open Access Journals (Sweden)

    Reňák David

    2007-07-01

    Full Text Available Abstract Background Microarray technologies now belong to the standard functional genomics toolbox and have undergone massive development leading to increased genome coverage, accuracy and reliability. The number of experiments exploiting microarray technology has markedly increased in recent years. In parallel with the rapid accumulation of transcriptomic data, on-line analysis tools are being introduced to simplify their use. Global statistical data analysis methods contribute to the development of overall concepts about gene expression patterns and to query and compose working hypotheses. More recently, these applications are being supplemented with more specialized products offering visualization and specific data mining tools. We present a curated gene family-oriented gene expression database, Arabidopsis Gene Family Profiler (aGFP; http://agfp.ueb.cas.cz), which gives the user access to a large collection of normalised Affymetrix ATH1 microarray datasets. The database currently contains NASC Array and AtGenExpress transcriptomic datasets for various tissues at different developmental stages of wild type plants gathered from nearly 350 gene chips. Results The Arabidopsis GFP database has been designed as an easy-to-use tool for users needing an easily accessible resource for expression data of single genes, pre-defined gene families or custom gene sets, with the further possibility of keyword search. Arabidopsis Gene Family Profiler presents a user-friendly web interface using both graphic and text output. Data are stored at the MySQL server and individual queries are created in PHP script. The most distinguishable features of Arabidopsis Gene Family Profiler database are: (1) the presentation of normalized datasets (Affymetrix MAS algorithm) and calculation of model-based gene-expression values based on the Perfect Match-only model; (2) the choice between two different normalization algorithms (Affymetrix MAS4 or MAS5 algorithms); (3) an intuitive

  14. NuChart: an R package to study gene spatial neighbourhoods with multi-omics annotations.

    Directory of Open Access Journals (Sweden)

    Ivan Merelli

    Full Text Available Long-range chromosomal associations between genomic regions, and their repositioning in the 3D space of the nucleus, are now considered to be key contributors to the regulation of gene expression, and important links have been highlighted with other genomic features involved in DNA rearrangements. Recent Chromosome Conformation Capture (3C) measurements performed with high-throughput sequencing (Hi-C) and molecular dynamics studies show that there is a large correlation between colocalization and coregulation of genes, but this important research is hampered by the lack of biologist-friendly analysis and visualisation software. Here, we describe NuChart, an R package that allows the user to annotate and statistically analyse a list of input genes with information relying on Hi-C data, integrating knowledge about genomic features that are involved in the chromosome spatial organization. NuChart works directly with sequenced reads to identify the related Hi-C fragments, with the aim of creating gene-centric neighbourhood graphs on which multi-omics features can be mapped. Predictions about CTCF binding sites, isochores and cryptic Recombination Signal Sequences are provided directly with the package for mapping, although other annotation data in bed format can be used (such as methylation profiles and histone patterns). Gene expression data can be automatically retrieved and processed from the Gene Expression Omnibus and ArrayExpress repositories to highlight the expression profile of genes in the identified neighbourhood. Moreover, statistical inferences about the graph structure and correlations between its topology and multi-omics features can be performed using Exponential-family Random Graph Models. The Hi-C fragment visualisation provided by NuChart allows the comparison of cells in different conditions, thus providing the possibility of novel biomarker identification. NuChart is compliant with the Bioconductor standard and it is freely

  15. Genome-Assisted Prediction of Quantitative Traits Using the R Package sommer.

    Directory of Open Access Journals (Sweden)

    Giovanny Covarrubias-Pazaran

    Full Text Available Most traits of agronomic importance are quantitative in nature, and genetic markers have been used for decades to dissect such traits. Recently, genomic selection has earned attention as next generation sequencing technologies became feasible for major and minor crops. Mixed models have become a key tool for fitting genomic selection models, but most current genomic selection software can only include a single variance component other than the error, making hybrid prediction using additive, dominance and epistatic effects unfeasible for species displaying heterotic effects. Moreover, likelihood-based software for fitting mixed models with multiple random effects that allows the user to specify the variance-covariance structure of random effects has not been fully exploited. A new open-source R package called sommer is presented to facilitate the use of mixed models for genomic selection and hybrid prediction purposes using more than one variance component and allowing specification of covariance structures. The use of sommer for genomic prediction is demonstrated through several examples using maize and wheat genotypic and phenotypic data. At its core, the program contains three algorithms for estimating variance components: Average information (AI), Expectation-Maximization (EM) and Efficient Mixed Model Association (EMMA). Kernels for calculating the additive, dominance and epistatic relationship matrices are included, along with other useful functions for genomic analysis. Results from sommer were comparable to other software, but the analysis was faster than Bayesian counterparts in the magnitude of hours to days. In addition, the ability to deal with missing data, combined with greater flexibility and speed than other REML-based software, was achieved by putting together some of the most efficient algorithms to fit models in a gentle environment such as R.
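
    As a hedged sketch of the kind of call described above (not taken from the paper), a genomic prediction model with an additive genomic covariance might look as follows. The formula-style interface corresponds to recent versions of sommer, and the data objects (a phenotype data frame pheno with columns yield and id, and an additive relationship matrix A) are assumptions.

    library(sommer)

    fit <- mmer(yield ~ 1,                  # fixed effects
                random = ~ vs(id, Gu = A),  # genotype effect with genomic covariance matrix A
                rcov   = ~ units,           # residual variance structure
                data   = pheno)
    summary(fit)$varcomp                    # REML estimates of the variance components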

  16. Capabilities of R Package mixAK for Clustering Based on Multivariate Continuous and Discrete Longitudinal Data

    Directory of Open Access Journals (Sweden)

    Arnošt Komárek

    2014-09-01

    Full Text Available R package mixAK originally implemented routines primarily for Bayesian estimation of finite normal mixture models for possibly interval-censored data. The functionality of the package was considerably enhanced by implementing methods for Bayesian estimation of mixtures of multivariate generalized linear mixed models proposed in Komárek and Komárková (2013). Among other things, this allows for a cluster analysis (classification) based on multivariate continuous and discrete longitudinal data that arise whenever multiple outcomes of a different nature are recorded in a longitudinal study. The package also allows for a data-driven selection of the number of clusters, as methods for selecting the number of mixture components were implemented. A model and clustering methodology for multivariate continuous and discrete longitudinal data is overviewed. Further, a step-by-step cluster analysis based jointly on three longitudinal variables of different types (continuous, count, dichotomous) is given, which provides a user manual for using the package for similar problems.
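
    A heavily abbreviated sketch of such a joint clustering call is given below. GLMM_MCMC is the package's main fitting routine, but the argument layout shown here is recalled from the package documentation, and the long-format data frame dat (with columns id, time and three outcomes) is an assumption; the package vignette should be consulted for the full prior specification.

    library(mixAK)

    fit <- GLMM_MCMC(y    = dat[, c("cont", "count", "binary")],        # three longitudinal outcomes
                     dist = c("gaussian", "poisson(log)", "binomial(logit)"),
                     id   = dat$id,                                     # subject identifier
                     x    = list(cont = "empty", count = "empty", binary = "empty"),  # no extra fixed effects
                     z    = list(cont = dat$time, count = dat$time, binary = dat$time),
                     random.intercept = rep(TRUE, 3),
                     prior.b = list(Kmax = 3),                          # at most 3 mixture components (clusters)
                     nMCMC = c(burn = 500, keep = 5000, thin = 5, info = 500))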

  17. Nonlinearities and transit times in soil organic matter models: new developments in the SoilR package

    Science.gov (United States)

    Sierra, Carlos; Müller, Markus

    2016-04-01

    SoilR is an R package for implementing diverse models representing soil organic matter dynamics. In previous releases of this package, we presented the implementation of linear first-order models with any number of pools as well as radiocarbon dynamics. We present here new improvements of the package: the possibility to implement models with nonlinear interactions among state variables, and the possibility to calculate ages and transit times for nonlinear models with time dependencies. We show examples of how to implement model structures with Michaelis-Menten terms for explicit microbial growth and resource-use efficiency, and Langmuir isotherms for representing adsorption of organic matter to mineral surfaces. These nonlinear terms can be implemented for any number of organic matter pools, microbial functional groups, or mineral phases, depending on the user's requirements. Through a simple example, we also show how transit times of organic matter in soils are controlled by the time dependencies of the input terms.
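
    To make the kind of nonlinearity mentioned above concrete, the sketch below writes a minimal two-pool Michaelis-Menten microbial model as an ODE system solved with deSolve. This is a generic illustration of the model structure, not the SoilR interface itself, and all parameter values are arbitrary.

    library(deSolve)

    mm_model <- function(t, state, pars) {
      with(as.list(c(state, pars)), {
        uptake <- Vmax * B * C / (Km + C)    # Michaelis-Menten substrate uptake by microbes
        dC <- inputs - uptake + kB * B       # substrate pool: inputs minus uptake plus dead microbial biomass
        dB <- eff * uptake - kB * B          # microbial pool: growth at efficiency 'eff' minus turnover
        list(c(dC, dB))
      })
    }

    out <- ode(y = c(C = 100, B = 2),
               times = seq(0, 50, by = 0.1),
               func = mm_model,
               parms = c(inputs = 1, Vmax = 0.8, Km = 150, eff = 0.4, kB = 0.2))
    head(out)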

  18. The last developments of the airGR R-package, an open source software for rainfall-runoff modelling

    Science.gov (United States)

    Thirel, Guillaume; Delaigue, Olivier; Coron, Laurent; Perrin, Charles; Andréassian, Vazken

    2017-04-01

    Lumped hydrological models are useful and convenient tools for research, engineering and educational purposes. They propose catchment-scale representations of the precipitation-discharge relationship. Thanks to their limited data requirements, they can be easily implemented and run. With such models, it is possible to simulate a number of key hydrological processes over the catchment with limited structural and parametric complexity, typically evapotranspiration, runoff, underground losses, etc. The Hydrology Group at Irstea (Antony) has been developing a suite of rainfall-runoff models over the past 30 years with the main objectives of designing models that are as efficient as possible in terms of streamflow simulation, applicable to a wide range of catchments and having low data requirements. This resulted in a suite of models running at different time steps (from hourly to annual) applicable to various issues including water balance estimation, forecasting, simulation of impacts and scenario testing. Recently, Irstea has developed an easy-to-use R-package (R Core Team, 2016), called airGR (Coron et al., 2016, 2017), to make these models widely available. It includes: the annual water balance model GR1A; the monthly GR2M model; three versions of the daily model, namely GR4J, GR5J and GR6J; the hourly GR4H model; and the CemaNeige degree-day snow model. The airGR package has been designed to facilitate use by non-expert users and to allow the addition of evaluation criteria, models or calibration algorithms selected by the end-user. Each model core is coded in FORTRAN to ensure low computational time. The other package functions (i.e. mainly the calibration algorithm and the efficiency criteria) are coded in R. The package is also used for educational purposes. It allows for convenient implementation of model inter-comparisons and large-sample hydrology experiments. The airGR package undergoes continuous developments for improving the efficiency, computational time
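
    A hedged sketch of a daily GR4J simulation with airGR is shown below. The function and data set names (CreateInputsModel, CreateRunOptions, RunModel_GR4J and the bundled example catchment L0123001) follow the package as recalled here, and the parameter values are illustrative only.

    library(airGR)
    data(L0123001)                                        # example catchment: BasinObs with P, E, Qmm, DatesR

    InputsModel <- CreateInputsModel(FUN_MOD = RunModel_GR4J, DatesR = BasinObs$DatesR,
                                     Precip = BasinObs$P, PotEvap = BasinObs$E)
    Ind_Run    <- seq(366, length(BasinObs$DatesR))       # simulate after a one-year warm-up
    RunOptions <- CreateRunOptions(FUN_MOD = RunModel_GR4J, InputsModel = InputsModel,
                                   IndPeriod_Run = Ind_Run)

    Param   <- c(X1 = 257, X2 = 1.0, X3 = 88, X4 = 2.2)   # illustrative GR4J parameter set
    Outputs <- RunModel_GR4J(InputsModel = InputsModel, RunOptions = RunOptions, Param = Param)
    plot(Outputs, Qobs = BasinObs$Qmm[Ind_Run])           # diagnostic plots (plotting helper may vary with version)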

  19. A Web-based graphical user interface for evidence-based decision making for health care allocations in rural areas

    Directory of Open Access Journals (Sweden)

    Leight Margo

    2008-09-01

    Full Text Available Abstract Background The creation of successful health policy and location of resources increasingly relies on evidence-based decision-making. The development of intuitive, accessible tools to analyse, display and disseminate spatial data potentially provides the basis for sound policy and resource allocation decisions. As health services are rationalized, the development of tools such as graphical user interfaces (GUIs) is especially valuable, as they assist decision makers in allocating resources such that the maximum number of people are served. GIS can be used to develop GUIs that enable spatial decision making. Results We have created a Web-based GUI (wGUI) to assist health policy makers and administrators in the Canadian province of British Columbia make well-informed decisions about the location and allocation of time-sensitive service capacities in rural regions of the province. This tool integrates datasets for existing hospitals and services, regional populations and road networks to allow users to ascertain the percentage of population in any given service catchment who are served by a specific health service, or baskets of linked services. The wGUI allows policy makers to map trauma and obstetric services against rural populations within pre-specified travel distances, illustrating service capacity by region. Conclusion The wGUI can be used by health policy makers and administrators with little or no formal GIS training to visualize multiple health resource allocation scenarios. The GUI is poised to become a critical decision-making tool, especially as evidence is increasingly required for distribution of health services.

  20. Culvert Analysis Program Graphical User Interface 1.0--A preprocessing and postprocessing tool for estimating flow through culvert

    Science.gov (United States)

    Bradley, D. Nathan

    2013-01-01

    The peak discharge of a flood can be estimated from the elevation of high-water marks near the inlet and outlet of a culvert after the flood has occurred. This type of discharge estimate is called an “indirect measurement” because it relies on evidence left behind by the flood, such as high-water marks on trees or buildings. When combined with the cross-sectional geometry of the channel upstream from the culvert and the culvert size, shape, roughness, and orientation, the high-water marks define a water-surface profile that can be used to estimate the peak discharge by using the methods described by Bodhaine (1968). This type of measurement is in contrast to a “direct” measurement of discharge made during the flood where cross-sectional area is measured and a current meter or acoustic equipment is used to measure the water velocity. When a direct discharge measurement cannot be made at a streamgage during high flows because of logistics or safety reasons, an indirect measurement of a peak discharge is useful for defining the high-flow section of the stage-discharge relation (rating curve) at the streamgage, resulting in more accurate computation of high flows. The Culvert Analysis Program (CAP) (Fulford, 1998) is a command-line program written in Fortran for computing peak discharges and culvert rating surfaces or curves. CAP reads input data from a formatted text file and prints results to another formatted text file. Preparing and correctly formatting the input file may be time-consuming and prone to errors. This document describes the CAP graphical user interface (GUI)—a modern, cross-platform, menu-driven application that prepares the CAP input file, executes the program, and helps the user interpret the output

  1. LABVIEW graphical user interface for precision multichannel alignment of Raman lidar at Jet Propulsion Laboratory, Table Mountain Facility.

    Science.gov (United States)

    Aspey, R A; McDermid, I S; Leblanc, T; Howe, J W; Walsh, T D

    2008-09-01

    The Jet Propulsion Laboratory operates lidar systems at Table Mountain Facility (TMF), California (34.4 degrees N, 117.7 degrees W) and Mauna Loa Observatory, Hawaii (19.5 degrees N, 155.6 degrees W) under the framework of the Network for the Detection of Atmospheric Composition Change. To complement these systems a new Raman lidar has been developed at TMF with particular attention given to optimizing water vapor profile measurements up to the tropopause and lower stratosphere. The lidar has been designed for accuracies of 5% up to 12 km in the free troposphere and a detection capability of LABVIEW/C++ graphical user interface (GUI). This allows the lidar to be aligned on any channel while simultaneously displaying signals from other channels at configurable altitude/bin combinations. The general lidar instrumental setup and the details of the alignment control system, data acquisition, and GUI alignment software are described. Preliminary validation results using radiosonde and lidar intercomparisons are briefly presented.

  2. A flexible R package for nonnegative matrix factorization

    Directory of Open Access Journals (Sweden)

    Seoighe Cathal

    2010-07-01

    Full Text Available Abstract Background Nonnegative Matrix Factorization (NMF) is an unsupervised learning technique that has been applied successfully in several fields, including signal processing, face recognition and text mining. Recent applications of NMF in bioinformatics have demonstrated its ability to extract meaningful information from high-dimensional data such as gene expression microarrays. Developments in NMF theory and applications have resulted in a variety of algorithms and methods. However, most NMF implementations have been on commercial platforms, while those that are freely available typically require programming skills. This limits their use by the wider research community. Results Our objective is to provide the bioinformatics community with an open-source, easy-to-use and unified interface to standard NMF algorithms, as well as with a simple framework to help implement and test new NMF methods. For that purpose, we have developed a package for the R/BioConductor platform. The package ports public code to R, and is structured to enable users to easily modify and/or add algorithms. It includes a number of published NMF algorithms and initialization methods and facilitates the combination of these to produce new NMF strategies. Commonly used benchmark data and visualization methods are provided to help in the comparison and interpretation of the results. Conclusions The NMF package helps realize the potential of Nonnegative Matrix Factorization, especially in bioinformatics, providing easy access to methods that have already yielded new insights in many applications. Documentation, source code and sample data are available from CRAN.
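
    A minimal sketch of the package's unified interface follows; the random matrix stands in for a nonnegative genes-by-samples expression matrix, and the method and rank are illustrative choices.

    library(NMF)
    set.seed(1)
    V <- matrix(runif(200 * 20), nrow = 200)               # nonnegative data matrix (e.g., expression values)

    res <- nmf(V, rank = 3, method = "brunet", nrun = 10)  # factorization with 10 random restarts
    W <- basis(res)                                        # basis matrix ("metagenes")
    H <- coef(res)                                         # coefficient matrix (sample profiles)
    consensusmap(res)                                      # consensus clustering heatmap across the runs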

  3. Graphic user interface testing based on Petri net%基于 Petri 网的图形用户界面测试

    Institute of Scientific and Technical Information of China (English)

    林涛; 高建华

    2016-01-01

    In uncertain, context-sensitive environments, graphical user interface testing relies mainly on random testing and on the testers' professional experience, and its effectiveness is low. Improving the efficiency of graphical user interface testing remains an unresolved problem. This paper therefore introduces the Petri net theory of discrete, parallel systems and defines the concepts of events, event sequences and event decomposition for graphical user interfaces. Important Petri net properties such as reachability, boundedness, liveness and strong connectedness are extended to this field in order to improve the coverage and efficiency of graphical user interface testing. The approach also attempts to address six categories of graphical user interface defects: unreachable states, lack of strong connectedness, deadlock, unboundedness, inconsistency with the initial model, and incorrect jumps. Experiments show that graphical user interface testing based on Petri nets outperforms other methods in event coverage, covered lines of code and the number of defects detected.

  4. Graphic Design Principle of Iconized User Interface%图形化用户界面图标设计原则

    Institute of Scientific and Technical Information of China (English)

    张振中

    2015-01-01

    This paper studies the design principles of graphical user interface icons. Five aspects are explored in detail: the principle of unity; user cognition and national character of icons; simplicity and symbolism of icons; icon size; and the use of text in icons. The aim is to identify more reasonable strategies and methods for designing graphical user interface icons.

  5. The R package "sperrorest" : Parallelized spatial error estimation and variable importance assessment for geospatial machine learning

    Science.gov (United States)

    Schratz, Patrick; Herrmann, Tobias; Brenning, Alexander

    2017-04-01

    Computational and statistical prediction methods such as the support vector machine have gained popularity in remote-sensing applications in recent years and are often compared to more traditional approaches like maximum-likelihood classification. However, the accuracy assessment of such predictive models in a spatial context needs to account for the presence of spatial autocorrelation in geospatial data by using spatial cross-validation and bootstrap strategies instead of their now more widely used non-spatial equivalent. The R package sperrorest by A. Brenning [IEEE International Geoscience and Remote Sensing Symposium, 1, 374 (2012)] provides a generic interface for performing (spatial) cross-validation of any statistical or machine-learning technique available in R. Since spatial statistical models as well as flexible machine-learning algorithms can be computationally expensive, parallel computing strategies are required to perform cross-validation efficiently. The most recent major release of sperrorest therefore comes with two new features (aside from improved documentation): The first one is the parallelized version of sperrorest(), parsperrorest(). This function features two parallel modes to greatly speed up cross-validation runs. Both parallel modes are platform independent and provide progress information. par.mode = 1 relies on the pbapply package and calls interactively (depending on the platform) parallel::mclapply() or parallel::parApply() in the background. While forking is used on Unix-Systems, Windows systems use a cluster approach for parallel execution. par.mode = 2 uses the foreach package to perform parallelization. This method uses a different way of cluster parallelization than the parallel package does. In summary, the robustness of parsperrorest() is increased with the implementation of two independent parallel modes. A new way of partitioning the data in sperrorest is provided by partition.factor.cv(). This function gives the user the
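
    A hedged sketch of a parallel spatial cross-validation call is shown below. parsperrorest() and par.mode are named in the abstract, while the remaining argument names are assumptions based on the sperrorest() interface; the data set d, the formula and the partitioning function are placeholders.

    library(sperrorest)

    fo <- landslide ~ slope + elevation + ndvi                   # assumed response and predictors
    res <- parsperrorest(formula = fo, data = d, coords = c("x", "y"),
                         model.fun = glm,  model.args = list(family = "binomial"),
                         pred.fun  = predict, pred.args = list(type = "response"),
                         smp.fun   = partition.kmeans,           # spatial partitioning for spatial CV
                         smp.args  = list(repetition = 1:10, nfold = 5),
                         par.mode  = 1)                          # pbapply/parallel backend (see abstract)
    summary(res$error.rep)                                       # repetition-level spatial error estimates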

  6. vrmlgen: An R Package for 3D Data Visualization on the Web

    Directory of Open Access Journals (Sweden)

    Enrico Glaab

    2010-10-01

    Full Text Available The 3-dimensional representation and inspection of complex data is a frequently used strategy in many data analysis domains. Existing data mining software often lacks functionality that would enable users to explore 3D data interactively, especially if one wishes to make dynamic graphical representations directly viewable on the web. In this paper we present vrmlgen, a software package for the statistical programming language R to create 3D data visualizations in web formats like the Virtual Reality Markup Language (VRML) and LiveGraphics3D. vrmlgen can be used to generate 3D charts and bar plots, scatter plots with density estimation contour surfaces, and visualizations of height maps, 3D object models and parametric functions. For greater flexibility, the user can also access low-level plotting methods through a unified interface and freely group different function calls together to create new higher-level plotting methods. Additionally, we present a web tool allowing users to visualize 3D data online and test some of vrmlgen's features without the need to install any software on their computer.
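
    A hedged sketch of exporting a simple 3D scatter plot to VRML follows; cloud3d() is recalled as one of vrmlgen's high-level plotting functions, and the exact argument set may differ between package versions.

    library(vrmlgen)
    set.seed(42)
    x <- rnorm(100); y <- rnorm(100); z <- rnorm(100)

    cloud3d(x, y, z, filename = "scatter.wrl")   # writes a VRML scene viewable with a browser plugin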

  7. Informatic system for a global tissue–fluid biorepository with a graph theory–oriented graphical user interface

    Directory of Open Access Journals (Sweden)

    William E. Butler

    2014-09-01

    Full Text Available The Richard Floor Biorepository supports collaborative studies of extracellular vesicles (EVs found in human fluids and tissue specimens. The current emphasis is on biomarkers for central nervous system neoplasms but its structure may serve as a template for collaborative EV translational studies in other fields. The informatic system provides specimen inventory tracking with bar codes assigned to specimens and containers and projects, is hosted on globalized cloud computing resources, and embeds a suite of shared documents, calendars, and video-conferencing features. Clinical data are recorded in relation to molecular EV attributes and may be tagged with terms drawn from a network of externally maintained ontologies thus offering expansion of the system as the field matures. We fashioned the graphical user interface (GUI around a web-based data visualization package. This system is now in an early stage of deployment, mainly focused on specimen tracking and clinical, laboratory, and imaging data capture in support of studies to optimize detection and analysis of brain tumour–specific mutations. It currently includes 4,392 specimens drawn from 611 subjects, the majority with brain tumours. As EV science evolves, we plan biorepository changes which may reflect multi-institutional collaborations, proteomic interfaces, additional biofluids, changes in operating procedures and kits for specimen handling, novel procedures for detection of tumour-specific EVs, and for RNA extraction and changes in the taxonomy of EVs. We have used an ontology-driven data model and web-based architecture with a graph theory–driven GUI to accommodate and stimulate the semantic web of EV science.

  8. Development of a clinical applicable graphical user interface to automatically detect exercise oscillatory ventilation: The VOdEX-tool.

    Science.gov (United States)

    Cornelis, Justien; Denis, Tim; Beckers, Paul; Vrints, Christiaan; Vissers, Dirk; Goossens, Maggy

    2017-08-01

    Cardiopulmonary exercise testing (CPET) has gained importance in the prognostic assessment, especially of patients with heart failure (HF). A meaningful prognostic parameter for early mortality in HF is exercise oscillatory ventilation (EOV). This abnormal respiratory pattern is recognized by hypo- and hyperventilation during CPET. Up until now, assessment of EOV has mainly been done by visual agreement or manual calculation. The purpose of this research was to automate the interpretation of EOV so that this prognostic parameter could be readily investigated during CPET. First, four definitions describing the original characteristics of EOV were selected and integrated in the "Ventilatory Oscillations during Exercise-tool" (VOdEX-tool), a graphical user interface that allows automated calculation of EOV. A Discrete Meyer Level 2 wavelet transformation appeared to be the optimal filter to apply to the collected breath-by-breath minute ventilation CPET data. Diverse aspects of the definitions, i.e. cycle length, amplitude, regularity and total duration of EOV, were combined and calculated. The oscillations meeting the criteria were visualised. Filter methods and cut-off criteria were made adjustable for clinical application and research. The VOdEX-tool was connected to a database. The VOdEX-tool provides the possibility to calculate EOV automatically and to present the clinician with an overview of the presence of EOV at a glance. The computerized analysis of EOV can be made readily available in clinical practice by integrating the tool in the manufacturer's existing CPET software. The VOdEX-tool enhances assessment of EOV and therefore contributes to the estimation of prognosis, especially in patients with HF. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  9. Informatic system for a global tissue-fluid biorepository with a graph theory-oriented graphical user interface.

    Science.gov (United States)

    Butler, William E; Atai, Nadia; Carter, Bob; Hochberg, Fred

    2014-01-01

    The Richard Floor Biorepository supports collaborative studies of extracellular vesicles (EVs) found in human fluids and tissue specimens. The current emphasis is on biomarkers for central nervous system neoplasms but its structure may serve as a template for collaborative EV translational studies in other fields. The informatic system provides specimen inventory tracking with bar codes assigned to specimens and containers and projects, is hosted on globalized cloud computing resources, and embeds a suite of shared documents, calendars, and video-conferencing features. Clinical data are recorded in relation to molecular EV attributes and may be tagged with terms drawn from a network of externally maintained ontologies thus offering expansion of the system as the field matures. We fashioned the graphical user interface (GUI) around a web-based data visualization package. This system is now in an early stage of deployment, mainly focused on specimen tracking and clinical, laboratory, and imaging data capture in support of studies to optimize detection and analysis of brain tumour-specific mutations. It currently includes 4,392 specimens drawn from 611 subjects, the majority with brain tumours. As EV science evolves, we plan biorepository changes which may reflect multi-institutional collaborations, proteomic interfaces, additional biofluids, changes in operating procedures and kits for specimen handling, novel procedures for detection of tumour-specific EVs, and for RNA extraction and changes in the taxonomy of EVs. We have used an ontology-driven data model and web-based architecture with a graph theory-driven GUI to accommodate and stimulate the semantic web of EV science.

  10. JaxoDraw: A graphical user interface for drawing Feynman diagrams. Version 2.0 release notes

    Science.gov (United States)

    Binosi, D.; Collins, J.; Kaufhold, C.; Theussl, L.

    2009-09-01

    A new version of the Feynman graph plotting tool JaxoDraw is presented. Version 2.0 is a fundamental re-write of most of the JaxoDraw core and some functionalities, in particular importing graphs, are not backward-compatible with the 1.x branch. The most prominent new features include: drawing of Bézier curves for all particle modes, on-the-fly update of edited objects, multiple undo/redo functionality, the addition of a plugin infrastructure, and a general improved memory performance. A new LaTeX style file is presented that has been written specifically on top of the original axodraw.sty to meet the needs of this new version. New version program summaryProgram title: JaxoDraw Catalogue identifier: ADUA_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/ADUA_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GPL No. of lines in distributed program, including test data, etc.: 103 544 No. of bytes in distributed program, including test data, etc.: 3 745 814 Distribution format: tar.gz Programming language: Java Computer: Any Java-enabled platform Operating system: Any Java-enabled platform, tested on Linux, Windows XP, Mac OS X Classification: 14 Catalogue identifier of previous version: ADUA_v1_0 Journal reference of previous version: Comput. Phys. Comm. 161 (2004) 76 Does the new version supersede the previous version?: Yes Nature of problem: Existing methods for drawing Feynman diagrams usually require some hard-coding in one or the other programming or scripting language. It is not very convenient and often time consuming, to generate relatively simple diagrams. Solution method: A program is provided that allows for the interactive drawing of Feynman diagrams with a graphical user interface. The program is easy to learn and use, produces high quality output in several formats and runs on any operating system where a Java Runtime Environment is available. Reasons for new version: A

  11. ontoCAT : an R package for ontology traversal and search

    NARCIS (Netherlands)

    Kurbatova, Natalja; Adamusiak, Tomasz; Kurnosov, Pavel; Swertz, Morris A.; Kapushesky, Misha

    2011-01-01

    There exist few simple and easily accessible methods to integrate ontologies programmatically in the R environment. We present ontoCAT-an R package to access ontologies in widely used standard formats, stored locally in the filesystem or available online. The ontoCAT package supports a number of tra

  12. ShapeSelectForest: a new r package for modeling landsat time series

    Science.gov (United States)

    Mary Meyer; Xiyue Liao; Gretchen Moisen; Elizabeth. Freeman

    2015-01-01

    We present a new R package called ShapeSelectForest recently posted to the Comprehensive R Archive Network. The package was developed to fit nonparametric shape-restricted regression splines to time series of Landsat imagery for the purpose of modeling, mapping, and monitoring annual forest disturbance dynamics over nearly three decades. For each pixel and spectral...

  13. dlmap: An R Package for Mixed Model QTL and Association Analysis

    Directory of Open Access Journals (Sweden)

    B. Emma Huang

    2012-08-01

    Full Text Available dlmap is a software package capable of mapping quantitative trait loci (QTL) in a variety of genetic studies. Unlike most other QTL mapping packages, dlmap is built on a linear mixed model platform, and thus can simultaneously handle multiple sources of genetic and environmental variation. Furthermore, it can accommodate both experimental crosses and association mapping populations within a versatile modeling framework. The software implements a mapping algorithm with separate detection and localization stages in a user-friendly manner. It accepts data in various common formats, has a flexible modeling environment, and summarizes results both graphically and numerically.

  14. proportion: A comprehensive R package for inference on single Binomial proportion and Bayesian computations

    Science.gov (United States)

    Subbiah, M.; Rajeswaran, V.

    Extensive statistical practice has shown the importance and relevance of the inferential problem of estimating probability parameters in a binomial experiment, especially the issue of competing intervals from frequentist, Bayesian, and bootstrap approaches. The package, written in the free R environment and presented in this paper, addresses these issues by pooling a number of widely available and well-performing methods and adding essential variations to them. A wide range of functions helps users with differing skills to estimate, evaluate and summarize, numerically and graphically, various measures adopting either the frequentist or the Bayesian paradigm.
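
    As a generic illustration of the kind of interval comparison the package automates (not the package's own functions, which are not named in the abstract), the snippet below contrasts a frequentist and a Bayesian interval for the same binomial outcome using base R.

    x <- 18; n <- 25                              # observed successes out of n trials

    binom.test(x, n)$conf.int                     # exact (Clopper-Pearson) 95% interval
    qbeta(c(0.025, 0.975), x + 0.5, n - x + 0.5)  # 95% credible interval under a Jeffreys Beta(0.5, 0.5) prior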

  15. MATLAB GUIDE在数字图像处理教学中的应用%Applications of MATLAB Graphical User Interfaces in Teaching the Digital Image Processing

    Institute of Scientific and Technical Information of China (English)

    邱广萍

    2014-01-01

    MATLAB GUIDE (Graphical User Interfaces) is designed for building demonstration programs that present the basic functions of MATLAB. Using several design examples commonly encountered in digital image processing, this paper develops and uses a MATLAB GUIDE based automated teaching system, demonstrating the advantages of MATLAB GUIDE in the computer-aided teaching of the digital image processing course.

  16. Development Based on CEGUI of Game Graphic User Interface%基于CEGUI的游戏图形用户界面的开发

    Institute of Scientific and Technical Information of China (English)

    吴华蕾

    2013-01-01

    This paper studies the CEGUI system framework, its resource management scheme, methods for creating UIs, and the application of Lua scripting. In the development of a DirectX-based 3D game, CEGUI is used to build the game's graphical user interface.

  17. Simplification of 3D Graphics for Mobile Devices: Exploring the Trade-off Between Energy Savings and User Perceptions of Visual Quality

    Science.gov (United States)

    Vatjus-Anttila, Jarkko; Koskela, Timo; Lappalainen, Tuomas; Häkkilä, Jonna

    2017-03-01

    3D graphics have quickly become a popular form of media that can also be accessed with today's mobile devices. However, the use of 3D applications with mobile devices is typically a very energy-consuming task due to the processing complexity and the large file size of 3D graphics. As a result, their use may lead to rapid depletion of the limited battery life. In this paper, we investigate how much energy savings can be gained in the transmission and rendering of 3D graphics by simplifying geometry data. In this connection, we also examine users' perceptions on the visual quality of the simplified 3D models. The results of this paper provide new knowledge on the energy savings that can be gained through geometry simplification, as well as on how much the geometry can be simplified before the visual quality of 3D models becomes unacceptable for the mobile users. Based on the results, it can be concluded that geometry simplification can provide significant energy savings for mobile devices without disturbing the users. When geometry simplification is combined with distance based adjustment of detail, up to 52% energy savings were gained in our experiments compared to using only a single high quality 3D model.

  18. Bayesian accrual prediction for interim review of clinical studies: open source R package and smartphone application.

    Science.gov (United States)

    Jiang, Yu; Guarino, Peter; Ma, Shuangge; Simon, Steve; Mayo, Matthew S; Raghavan, Rama; Gajewski, Byron J

    2016-07-22

    Subject recruitment for medical research is challenging. Slow patient accrual leads to increased costs and delays in treatment advances. Researchers need reliable tools to manage and predict the accrual rate. The previously developed Bayesian method integrates researchers' experience on former trials and data from an ongoing study, providing a reliable prediction of accrual rate for clinical studies. In this paper, we present a user-friendly graphical user interface program developed in R. A closed-form solution for the total subjects that can be recruited within a fixed time is derived. We also present a built-in Android system using Java for web browsers and mobile devices. Using the accrual software, we re-evaluated the Veteran Affairs Cooperative Studies Program 558- ROBOTICS study. The application of the software in monitoring and management of recruitment is illustrated for different stages of the trial. This developed accrual software provides a more convenient platform for estimation and prediction of the accrual process.
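
    The snippet below is a generic illustration (not the package's interface, which is not detailed in the abstract) of how a prior expectation about the accrual rate can be pooled with observed enrolment in a conjugate Gamma-Poisson model and used to simulate the remaining accrual; all numbers are arbitrary.

    prior_n <- 120; prior_t <- 24     # prior belief: about 120 subjects over 24 months
    obs_n   <- 35;  obs_t   <- 12     # observed: 35 subjects enrolled in the first 12 months
    t_left  <- 12                     # months remaining in the recruitment window

    set.seed(1)
    lambda <- rgamma(10000, shape = prior_n + obs_n, rate = prior_t + obs_t)  # posterior accrual rate
    future <- rpois(10000, lambda * t_left)                                   # simulated further enrolment
    quantile(obs_n + future, c(0.025, 0.5, 0.975))                            # predicted total at study end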

  19. Approaches in highly parameterized inversion-PESTCommander, a graphical user interface for file and run management across networks

    Science.gov (United States)

    Karanovic, Marinko; Muffels, Christopher T.; Tonkin, Matthew J.; Hunt, Randall J.

    2012-01-01

    Models of environmental systems have become increasingly complex, incorporating increasingly large numbers of parameters in an effort to represent physical processes on a scale approaching that at which they occur in nature. Consequently, the inverse problem of parameter estimation (specifically, model calibration) and subsequent uncertainty analysis have become increasingly computation-intensive endeavors. Fortunately, advances in computing have made computational power equivalent to that of dozens to hundreds of desktop computers accessible through a variety of alternate means: modelers have various possibilities, ranging from traditional Local Area Networks (LANs) to cloud computing. Commonly used parameter estimation software is well suited to take advantage of the availability of such increased computing power. Unfortunately, logistical issues become increasingly important as an increasing number and variety of computers are brought to bear on the inverse problem. To facilitate efficient access to disparate computer resources, the PESTCommander program documented herein has been developed to provide a Graphical User Interface (GUI) that facilitates the management of model files ("file management") and remote launching and termination of "slave" computers across a distributed network of computers ("run management"). In version 1.0 described here, PESTCommander can access and ascertain resources across traditional Windows LANs: however, the architecture of PESTCommander has been developed with the intent that future releases will be able to access computing resources (1) via trusted domains established in Wide Area Networks (WANs) in multiple remote locations and (2) via heterogeneous networks of Windows- and Unix-based operating systems. The design of PESTCommander also makes it suitable for extension to other computational resources, such as those that are available via cloud computing. Version 1.0 of PESTCommander was developed primarily to work with the

  20. P.R.E.S.S.--an R-package for exploring residual-level protein structural statistics.

    Science.gov (United States)

    Huang, Yuanyuan; Bonett, Stephen; Kloczkowski, Andrzej; Jernigan, Robert; Wu, Zhijun

    2012-06-01

    P.R.E.S.S. is an R-package developed to allow researchers to get access to and manipulate a large set of statistical data on protein residue-level structural properties such as residue-level virtual bond lengths, virtual bond angles, and virtual torsion angles. A large set of high-resolution protein structures is downloaded and surveyed. Their residue-level structural properties are calculated and documented. The statistical distributions and correlations of these properties can be queried and displayed. Tools are also provided for modeling and analyzing a given structure in terms of its residue-level structural properties. In particular, new tools for computing residue-level statistical potentials and displaying residue-level Ramachandran-like plots are developed for structural analysis and refinement. P.R.E.S.S. has been released in R as an open source software package, with a user-friendly GUI, accessible and executable by a public user in any R environment. P.R.E.S.S. can also be downloaded directly at http://www.math.iastate.edu/press/.

  1. pubmed.mineR: an R package with text-mining algorithms to analyse PubMed abstracts.

    Science.gov (United States)

    Rani, Jyoti; Shah, A B Rauf; Ramachandran, Srinivasan

    2015-10-01

    The PubMed literature database is a valuable source of information for scientific research. It is rich in biomedical literature with more than 24 million citations. Data-mining of voluminous literature is a challenging task. Although several text-mining algorithms have been developed in recent years with focus on data visualization, they have limitations such as speed, are rigid and are not available in the open source. We have developed an R package, pubmed.mineR, wherein we have combined the advantages of existing algorithms, overcome their limitations, and offer user flexibility and link with other packages in Bioconductor and the Comprehensive R Archive Network (CRAN) in order to expand the user capabilities for executing multifaceted approaches. Three case studies are presented, namely, 'Evolving role of diabetes educators', 'Cancer risk assessment' and 'Dynamic concepts on disease and comorbidity' to illustrate the use of pubmed.mineR. The package generally runs fast with small elapsed times in regular workstations even on large corpus sizes and with compute intensive functions. The pubmed.mineR is available at http://cran.r-project.org/web/packages/pubmed.mineR.
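
    A hedged sketch of a basic pubmed.mineR session follows. The function names (readabs, word_atomizations, gene_atomization) are recalled from the package documentation, and the input file is assumed to be a PubMed search result saved in text (abstract) format.

    library(pubmed.mineR)

    abs <- readabs("pubmed_result.txt")    # parse abstracts downloaded from a PubMed search
    head(word_atomizations(abs))           # corpus-wide word frequency table
    head(gene_atomization(abs))            # HGNC gene symbols detected in the abstracts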

  2. pubmed.mineR: An R package with text-mining algorithms to analyse PubMed abstracts

    Indian Academy of Sciences (India)

    Jyoti Rani; Ab Rauf Shah; Srinivasan Ramachandran

    2015-10-01

    The PubMed literature database is a valuable source of information for scientific research. It is rich in biomedical literature, with more than 24 million citations. Data-mining of such voluminous literature is a challenging task. Although several text-mining algorithms have been developed in recent years with a focus on data visualization, they have limitations: they can be slow, they are rigid, and they are not available as open source. We have developed an R package, pubmed.mineR, in which we have combined the advantages of existing algorithms, overcome their limitations, and offer user flexibility and links to other packages in Bioconductor and the Comprehensive R Archive Network (CRAN) in order to expand the user's capabilities for executing multifaceted approaches. Three case studies are presented, namely `Evolving role of diabetes educators', `Cancer risk assessment' and `Dynamic concepts on disease and comorbidity', to illustrate the use of pubmed.mineR. The package generally runs fast, with small elapsed times on regular workstations, even on large corpora and with compute-intensive functions. pubmed.mineR is available at http://cran.r-project.org/web/packages/pubmed.mineR.

  3. Generalizing the Convex Hull of a Sample: The R Package alphahull

    Directory of Open Access Journals (Sweden)

    Beatriz Pateiro-López

    2010-10-01

    Full Text Available This paper presents the R package alphahull, which implements the α-convex hull and the α-shape of a finite set of points in the plane. These geometric structures provide an informative overview of the shape and properties of the point set. Unlike the convex hull, the α-convex hull and the α-shape are able to reconstruct non-convex sets. This flexibility makes them especially useful in set estimation. Since the implementation is based on the intimate relation of these constructs with Delaunay triangulations, the R package alphahull also includes functions to compute Voronoi and Delaunay tessellations. The usefulness of the package is illustrated with two small simulation studies on boundary length estimation.
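
    As a quick, hedged illustration of the workflow described above, the sketch below computes the α-convex hull, the α-shape and the underlying Delaunay/Voronoi tessellation for a simulated, non-convex (ring-shaped) point set; the data and the value of α are invented for the example.

        # Simulated ring-shaped point cloud; alpha chosen by eye for illustration
        library(alphahull)

        set.seed(1)
        theta <- runif(300, 0, 2 * pi)
        xy <- cbind(cos(theta), sin(theta)) + matrix(rnorm(600, sd = 0.05), ncol = 2)

        ah   <- ahull(xy, alpha = 0.4)    # alpha-convex hull
        ashp <- ashape(xy, alpha = 0.4)   # alpha-shape
        dv   <- delvor(xy)                # Delaunay triangulation and Voronoi diagram

        plot(ah)                          # each object has its own plot method
        plot(ashp)
        plot(dv)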

  4. Gogadget: An R Package for Interpretation and Visualization of GO Enrichment Results.

    Science.gov (United States)

    Nota, Benjamin

    2017-05-01

    Gene expression profiling followed by gene ontology (GO) term enrichment analysis can generate long lists of significant GO terms. To interpret these results and gain biological insight into the data, filtering and rearranging these long lists of GO terms may be desirable. The R package gogadget provides functions to modify GO analysis results using a simple filter strategy. Furthermore, it groups redundant GO terms with hierarchical clustering and presents the results in a colorful heatmap. The filtered GO term enrichment results can also be exported by the package for subsequent analysis in Cytoscape Enrichment Map. The R package is freely available under the terms of the GNU GPLv3 at https://sourceforge.net/projects/gogadget/. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. pophelper: an R package and web app to analyse and visualize population structure.

    Science.gov (United States)

    Francis, R M

    2017-01-01

    The pophelper R package and web app are software tools to aid population structure analyses. They can be used for the analysis and visualization of output generated from population assignment programs such as ADMIXTURE, STRUCTURE and TESS. The functions include parsing output run files to tabulate data, estimating K using the Evanno method, generating files for CLUMPP, and creating barplots. These functions can be streamlined into standard R analysis workflows. The latest version of the package is available on GitHub (https://github.com/royfrancis/pophelper). An interactive web version of the pophelper package is available which covers the same functionality as the R package, with features such as interactive plots, cluster alignment during plotting, sorting of individuals and ordering of population groups. The interactive version is available at http://pophelper.com/.
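
    The following is a hedged sketch of how such a workflow might look in R; the directory of STRUCTURE run files is a placeholder and the function names follow the package documentation.

        # 'structure_runs' is a placeholder directory of STRUCTURE output files
        library(pophelper)

        qfiles <- list.files("structure_runs", full.names = TRUE)
        qlist  <- readQ(qfiles)                 # parse run files into a qlist
        tr     <- tabulateQ(qlist)              # tabulate runs by K
        sr     <- summariseQ(tr)                # summary table across repeats
        evannoMethodStructure(sr)               # estimate K with the Evanno method
        plotQ(qlist[1:2], imgoutput = "join")   # joined barplots for two runs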

  6. PSF : Introduction to R Package for Pattern Sequence Based Forecasting Algorithm

    OpenAIRE

    Bokde, Neeraj; Asencio-Cortés, Gualberto; Martínez-Álvarez, Francisco; Kulat, Kishore

    2016-01-01

    This paper discusses an R package that implements the Pattern Sequence based Forecasting (PSF) algorithm, which was developed for univariate time series forecasting. This algorithm has been successfully applied in many different fields. The PSF algorithm consists of two major parts: clustering and prediction. The clustering part includes selection of the optimum number of clusters and labels the time series data with reference to these clusters. The prediction part includes functions like op...

  7. ParallelPC: an R package for efficient constraint based causal exploration

    OpenAIRE

    Le, Thuc Duy; Hoang, Tao; Li, Jiuyong; Liu, Lin; Hu, Shu

    2015-01-01

    Discovering causal relationships from data is the ultimate goal of many research areas. Constraint-based causal exploration algorithms, such as PC, FCI, RFCI, PC-simple, IDA and Joint-IDA, have achieved significant progress and have many applications. A common problem with these methods is their high computational complexity, which hinders their application to real-world high-dimensional datasets, e.g., gene expression datasets. In this paper, we present an R package, ParallelPC, that includes th...

  8. Robust causal inference using directed acyclic graphs: the R package 'dagitty'.

    Science.gov (United States)

    Textor, Johannes; van der Zander, Benito; Gilthorpe, Mark S; Liśkiewicz, Maciej; Ellison, George T H

    2017-01-15

    Directed acyclic graphs (DAGs), which offer systematic representations of causal relationships, have become an established framework for the analysis of causal inference in epidemiology, often being used to determine covariate adjustment sets for minimizing confounding bias. DAGitty is a popular web application for drawing and analysing DAGs. Here we introduce the R package 'dagitty', which provides access to all of the capabilities of the DAGitty web application within the R platform for statistical computing, and also offers several new functions. We describe how the R package 'dagitty' can be used to: evaluate whether a DAG is consistent with the dataset it is intended to represent; enumerate 'statistically equivalent' but causally different DAGs; and identify exposure-outcome adjustment sets that are valid for causally different but statistically equivalent DAGs. This functionality enables epidemiologists to detect causal misspecifications in DAGs and make robust inferences that remain valid for a range of different DAGs. The R package 'dagitty' is available through the comprehensive R archive network (CRAN) at [https://cran.r-project.org/web/packages/dagitty/]. The source code is available on github at [https://github.com/jtextor/dagitty]. The web application 'DAGitty' is free software, licensed under the GNU general public licence (GPL) version 2 and is available at [http://dagitty.net/].
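
    A minimal, hedged sketch of this workflow is shown below; the DAG itself is invented for illustration.

        library(dagitty)

        # A toy causal diagram, written in the dagitty syntax
        g <- dagitty("dag {
          smoking -> cancer
          stress -> smoking
          stress -> coffee
          coffee -> cancer
        }")

        # Covariate sets that block confounding of the smoking -> cancer effect
        adjustmentSets(g, exposure = "smoking", outcome = "cancer")

        # Conditional independencies implied by the DAG (testable against data)
        impliedConditionalIndependencies(g)

        # Statistically equivalent but causally different DAGs
        equivalentDAGs(g)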

  9. hydroPSO: A Versatile Particle Swarm Optimisation R Package for Calibration of Environmental Models

    Science.gov (United States)

    Zambrano-Bigiarini, M.; Rojas, R.

    2012-04-01

    Particle Swarm Optimisation (PSO) is a recent and powerful population-based stochastic optimisation technique inspired by the social behaviour of bird flocking, which shares similarities with other evolutionary techniques such as Genetic Algorithms (GA). In PSO, however, each individual of the population, known as a particle in PSO terminology, adjusts its flying trajectory over the multi-dimensional search space according to its own experience (best-known personal position) and that of its neighbours in the swarm (best-known local position). PSO has recently received a surge of attention given its flexibility, ease of programming, low memory and CPU requirements, and efficiency. Despite these advantages, PSO may still get trapped in sub-optimal solutions, suffer from swarm explosion or converge prematurely. Thus, the development of enhancements to the "canonical" PSO is an active area of research. To date, several modifications to the canonical PSO have been proposed in the literature, resulting in a large and dispersed collection of codes and algorithms which might well be used for similar if not identical purposes. In this work we present hydroPSO, a platform-independent R package implementing several enhancements to the canonical PSO that we consider of utmost importance to bring this technique to the attention of a broader community of scientists and practitioners. hydroPSO is model-independent, allowing the user to interface any model code with the calibration engine without having to invest considerable effort in customizing PSO to a new calibration problem. Some of the controlling options to fine-tune hydroPSO are: four alternative topologies, several types of inertia weight, time-variant acceleration coefficients, time-variant maximum velocity, regrouping of particles when premature convergence is detected, different types of boundary conditions and many others. Additionally, hydroPSO implements recent PSO variants such as: Improved Particle Swarm
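
    As a hedged illustration of the general-purpose optimisation interface (rather than a full model calibration), the sketch below minimises a simple analytic test function; the control settings are arbitrary illustrative values.

        library(hydroPSO)

        # Sphere test function: global minimum of 0 at the origin
        sphere <- function(x) sum(x^2)

        set.seed(42)
        out <- hydroPSO(fn      = sphere,
                        lower   = rep(-5, 4),
                        upper   = rep( 5, 4),
                        control = list(maxit = 200, npart = 40))
        out$par     # best parameter set found
        out$value   # corresponding objective value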

  10. Microarray Я US: a user-friendly graphical interface to Bioconductor tools that enables accurate microarray data analysis and expedites comprehensive functional analysis of microarray results

    Directory of Open Access Journals (Sweden)

    Dai Yilin

    2012-06-01

    Full Text Available Abstract Background Microarray data analysis presents a significant challenge to researchers who are unable to use the powerful Bioconductor and its numerous tools due to their lack of knowledge of the R language. Among the few existing software programs that offer a graphical user interface to Bioconductor packages, none have implemented a comprehensive strategy to address the accuracy and reliability issues of microarray data analysis caused by the well-known probe design problems associated with many widely used microarray chips. There is also a lack of tools that would expedite the functional analysis of microarray results. Findings We present Microarray Я US, an R-based graphical user interface that implements over a dozen popular Bioconductor packages to offer researchers a streamlined workflow for routine differential microarray expression data analysis without the need to learn the R language. In order to enable a more accurate analysis and interpretation of microarray data, we incorporated the latest custom probe re-definition and re-annotation for Affymetrix and Illumina chips. A versatile microarray results output utility tool was also implemented for easy and fast generation of input files for over 20 of the most widely used functional analysis software programs. Conclusion Coupled with a well-designed user interface, Microarray Я US leverages cutting-edge Bioconductor packages for researchers with no knowledge of the R language. It also enables more reliable and accurate microarray data analysis and expedites downstream functional analysis of microarray results.

  11. Application of Tables in the Design of MATLAB Graphical User Interfaces

    Institute of Scientific and Technical Information of China (English)

    龚妙昆

    2010-01-01

    Matrix computation is the principal feature of MATLAB, and the Graphical User Interface (GUI) is another of its strengths. By using the user-interface table component, these two features are combined to design and build a piece of teaching software for optimizing linear programming transportation models. After compilation, the software runs in a Windows environment.

  12. Neurovascular Network Explorer 1.0: a database of 2-photon single-vessel diameter measurements with MATLAB® graphical user interface

    Directory of Open Access Journals (Sweden)

    Vishnu B Sridhar

    2014-05-01

    Full Text Available We present a database client software – Neurovascular Network Explorer 1.0 (NNE 1.0) – that uses a MATLAB® based Graphical User Interface (GUI) for interaction with a database of 2-photon single-vessel diameter measurements from our previous publication [1]. These data are of particular interest for modeling the hemodynamic response. NNE 1.0 is downloaded by the user and then runs either as a MATLAB script or as a standalone program on a Windows platform. The GUI allows browsing the database according to parameters specified by the user, simple manipulation and visualization of the retrieved records (such as averaging and peak-normalization), and export of the results. Further, we provide the NNE 1.0 source code. With this source code, the user can build a database of their own experimental results, given the appropriate data structure and naming conventions, and thus share their data in a user-friendly format with other investigators. NNE 1.0 provides an example of a seamless and low-cost solution for sharing of experimental data by a regular-size neuroscience laboratory and may serve as a general template, facilitating dissemination of biological results and accelerating data-driven modeling approaches.

  13. Neurovascular Network Explorer 1.0: a database of 2-photon single-vessel diameter measurements with MATLAB(®) graphical user interface.

    Science.gov (United States)

    Sridhar, Vishnu B; Tian, Peifang; Dale, Anders M; Devor, Anna; Saisan, Payam A

    2014-01-01

    We present a database client software - Neurovascular Network Explorer 1.0 (NNE 1.0) - that uses a MATLAB(®) based Graphical User Interface (GUI) for interaction with a database of 2-photon single-vessel diameter measurements from our previous publication (Tian et al., 2010). These data are of particular interest for modeling the hemodynamic response. NNE 1.0 is downloaded by the user and then runs either as a MATLAB script or as a standalone program on a Windows platform. The GUI allows browsing the database according to parameters specified by the user, simple manipulation and visualization of the retrieved records (such as averaging and peak-normalization), and export of the results. Further, we provide the NNE 1.0 source code. With this source code, the user can build a database of their own experimental results, given the appropriate data structure and naming conventions, and thus share their data in a user-friendly format with other investigators. NNE 1.0 provides an example of a seamless and low-cost solution for sharing of experimental data by a regular-size neuroscience laboratory and may serve as a general template, facilitating dissemination of biological results and accelerating data-driven modeling approaches.

  14. Graphical user interface simplifies infusion pump programming and enhances the ability to detect pump-related faults.

    Science.gov (United States)

    Syroid, Noah; Liu, David; Albert, Robert; Agutter, James; Egan, Talmage D; Pace, Nathan L; Johnson, Ken B; Dowdle, Michael R; Pulsipher, Daniel; Westenskow, Dwayne R

    2012-11-01

    Drug administration errors are frequent and are often associated with the misuse of IV infusion pumps. One source of these errors may be the infusion pump's user interface. We used failure modes-and-effects analyses to identify programming errors and to guide the design of a new syringe pump user interface. We designed the new user interface to clearly show the pump's operating state simultaneously in more than 1 monitoring location. We evaluated anesthesia residents in laboratory and simulated environments on programming accuracy and error detection between the new user interface and the user interface of a commercially available infusion pump. With the new user interface, we observed the number of programming errors reduced by 81%, the number of keystrokes per task reduced from 9.2 ± 5.0 to 7.5 ± 5.5 (mean ± SD), the time required per task reduced from 18.1 ± 14.1 seconds to 10.9 ± 9.5 seconds and significantly less perceived workload. Residents detected 38 of 70 (54%) of the events with the new user interface and 37 of 70 (53%) with the existing user interface, despite no experience with the new user interface and extensive experience with the existing interface. The number of programming errors and workload were reduced partly because it took less time and fewer keystrokes to program the pump when using the new user interface. Despite minimal training, residents quickly identified preexisting infusion pump problems with the new user interface. Intuitive and easy-to-program infusion pump interfaces may reduce drug administration errors and infusion pump-related adverse events.

  15. visCOS: An R-package to evaluate model performance of hydrological models

    Science.gov (United States)

    Klotz, Daniel; Herrnegger, Mathew; Wesemann, Johannes; Schulz, Karsten

    2016-04-01

    The evaluation of model performance is a central part of (hydrological) modelling. Much attention has been given to the development of evaluation criteria and diagnostic frameworks (Klemeš, 1986; Gupta et al., 2008; among many others). Nevertheless, many applications exist for which objective functions do not yet provide satisfying summaries. Thus, the necessity to visualize results arises in order to explore a wider range of model capacities, be they strengths or deficiencies. Visualizations are usually devised for specific projects and these efforts are often not distributed to a broader community (e.g. via open source software packages). Hence, the opportunity to explicitly discuss a state-of-the-art presentation technique is often missed. We therefore present a comprehensive R-package for evaluating model performance by visualizing and exploring different aspects of hydrological time-series. The presented package comprises a set of useful plots and visualization methods, which complement existing packages, such as hydroGOF (Zambrano-Bigiarini et al., 2012). It is derived from practical applications of the hydrological models COSERO and COSEROreg (Kling et al., 2014). visCOS, providing an interface in R, represents an easy-to-use software package for visualizing and assessing model performance and can be implemented in the process of model calibration or model development. The package provides functions to load hydrological data into R, clean, process, visualize and explore the data, and finally save the results in a consistent way. Together with an interactive zoom function for the time series, an online calculation of the objective functions for variable time-windows is included. Common hydrological objective functions, such as the Nash-Sutcliffe Efficiency and the Kling-Gupta Efficiency, can also be evaluated and visualized in different ways for defined sub-periods such as hydrological years or seasonal sections. Many hydrologists use long-term water-balances as a

  16. speedR: An R Package for Interactive Data Import, Filtering and Ready-to-Use Code Generation

    Directory of Open Access Journals (Sweden)

    Ilhami Visne

    2012-10-01

    Full Text Available Emerging technologies in the experimental sciences have opened the way for large-scale experiments. Such experiments generate ever-growing amounts of data from which researchers need to extract relevant pieces for subsequent analysis. R offers a great environment for statistical analysis. However, due to the diversity of possible data sources and formats, data preprocessing and import can be time-consuming, especially with data that require user interaction such as editing, filtering or formatting. Writing code for these tasks can be time-consuming, error-prone and rather complex. We present speedR, an R package for interactive data import, filtering and code generation that addresses these needs. Using speedR, researchers can import new data, make basic corrections, examine current R session objects, open them in the speedR environment for filtering (subsetting), put the filtered data back into R, and even create new R functions with the applied import and filtering constraints to speed up their productivity.

  17. DASAF: An R Package for Deep Sequencing-Based Detection of Fetal Autosomal Abnormalities from Maternal Cell-Free DNA

    Directory of Open Access Journals (Sweden)

    Baohong Liu

    2016-01-01

    Full Text Available Background. With the development of massively parallel sequencing (MPS), noninvasive prenatal diagnosis using maternal cell-free DNA is fast becoming the preferred method of fetal chromosomal abnormality detection, due to its inherent high accuracy and low risk. Typically, MPS data is parsed to calculate a risk score, which is used to predict whether a fetal chromosome is normal or not. Although there are several highly sensitive and specific MPS data-parsing algorithms, there are currently no tools that implement these methods. Results. We developed an R package, detection of autosomal abnormalities for fetus (DASAF), that implements the three most popular trisomy detection methods—the standard Z-score (STDZ) method, the GC correction Z-score (GCCZ) method, and the internal reference Z-score (IRZ) method—together with one subchromosome abnormality identification method (SCAZ). Conclusions. With the cost of DNA sequencing declining and with advances in personalized medicine, the demand for noninvasive prenatal testing will undoubtedly increase, which will in turn trigger an increase in the tools available for subsequent analysis. DASAF is a user-friendly tool, implemented in R, that supports identification of whole-chromosome as well as subchromosome abnormalities, based on maternal cell-free DNA sequencing data after genome mapping.

  18. TreeExp1.0: R Package for Analyzing Expression Evolution Based on RNA-Seq Data.

    Science.gov (United States)

    Ruan, Hang; Su, Zhixi; Gu, Xun

    2016-11-01

    Recent innovations in RNA-seq technology have shed light on transcriptomic evolution studies, especially research on tissue-specific expression evolution. Phylogenetic analysis of transcriptome data may help to identify causal gene expression differences underlying the evolutionary changes in morphological, physiological, and developmental characters of interest. However, there is a deficiency of software to phylogenetically analyze transcriptome data. To address this need, we have developed an R package, TreeExp, that can perform comparative expression evolution analysis based on RNA-seq data, including optimized input formatting, normalization, pairwise expression distance estimation, expression character tree inference, and preliminary expression phylogenetic network analysis. TreeExp also enables users to map expression distances onto a customized phylogenetic tree. By applying TreeExp to two cases of mammalian gene expression evolution, we observed that (1) expression trees of brain and testis are largely consistent with the known mammalian species tree, with minor discrepancies; (2) inter-tissue expression divergences (brain versus testis) are more substantial than inter-species expression divergences across mammalian species; and (3) gene modules related to nervous system development exhibit specific expression patterns in the primate brain compared to housekeeping genes. These tissue-specific expression patterns might give insights into the evo-devo mechanisms of complex organisms. TreeExp is released under the GPL v3 open source license, and its current stable version 1.0 is freely available at the GitHub developer site (https://github.com/hr1912/TreeExp).

  19. airGRteaching: an R-package designed for teaching hydrology with lumped hydrological models

    Science.gov (United States)

    Thirel, Guillaume; Delaigue, Olivier; Coron, Laurent; Andréassian, Vazken; Brigode, Pierre

    2017-04-01

    Lumped hydrological models are useful and convenient tools for research, engineering and educational purposes. They propose catchment-scale representations of the precipitation-discharge relationship. Thanks to their limited data requirements, they can be easily implemented and run. With such models, it is possible to simulate a number of key hydrological processes over the catchment with limited structural and parametric complexity, typically evapotranspiration, runoff, underground losses, etc. The Hydrology Group at Irstea (Antony) has been developing a suite of rainfall-runoff models over the past 30 years. This resulted in a suite of models running at different time steps (from hourly to annual) applicable to various issues including water balance estimation, forecasting, simulation of impacts and scenario testing. Recently, Irstea has developed an easy-to-use R-package (R Core Team, 2016), called airGR (Coron et al., 2016, 2017), to make these models widely available. Although its initial target audience was hydrological modellers, the package is already used for educational purposes. Indeed, simple models allow the effects of parameterizations and model components on flow hydrographs to be visualised rapidly. In order to avoid the difficulties that students may have when manipulating R and datasets, we developed (Delaigue and Coron, 2016): - three simplified functions to prepare data, calibrate a model and run a simulation; - simplified and dynamic plot functions; - a shiny (Chang et al., 2016) interface that connects this R-package to a browser-based visualisation tool. With this interface, students can use different hydrological models (including the possibility to use a snow-accounting model), manually modify their parameters and automatically calibrate their parameters with diverse objective functions. One of the visualisation tabs of the interface includes observed precipitation and temperature, simulated snowpack (if any), observed and simulated

  20. DiagTest3Grp: An R Package for Analyzing Diagnostic Tests with Three Ordinal Groups

    Directory of Open Access Journals (Sweden)

    Jingqin Luo

    2012-10-01

    Full Text Available Medical researchers endeavor to identify potentially useful biomarkers to develop marker-based screening assays for disease diagnosis and prevention. Useful summary measures which properly evaluate the discriminative ability of diagnostic markers are critical for this purpose. The literature and existing software, for example R packages, nicely cover summary measures for diagnostic markers used in the binary case (e.g., healthy vs. diseased). In many disease processes, an intermediate population at an early disease stage usually exists between the healthy and the fully diseased population. Supporting utilities for three-group diagnostic tests are highly desirable and important for identifying patients at the early disease stage for timely treatment. However, application packages which provide summary measures for three ordinal groups are currently lacking. This paper focuses on two summary measures of diagnostic accuracy with three diagnostic groups—the volume under the receiver operating characteristic surface and the extended Youden index. We provide the R package DiagTest3Grp to estimate, under both parametric and nonparametric assumptions, the two summary measures and the associated variances, as well as the optimal cut-points for disease diagnosis. An omnibus test for multiple markers and a Wald test for two markers, on independent or paired samples, are incorporated to compare diagnostic accuracy across biomarkers. Sample size calculation under the normality assumption can be performed in the R package to design future diagnostic studies. A real-world application evaluating the diagnostic accuracy of neuropsychological markers for Alzheimer's disease is used to guide readers through a step-by-step implementation of DiagTest3Grp to demonstrate its utility.

  1. Synth: An R Package for Synthetic Control Methods in Comparative Case Studies

    Directory of Open Access Journals (Sweden)

    Alberto Abadie

    2011-08-01

    Full Text Available The R package Synth implements synthetic control methods for comparative case studies designed to estimate the causal effects of policy interventions and other events of interest (Abadie and Gardeazabal 2003; Abadie, Diamond, and Hainmueller 2010). These techniques are particularly well suited to investigate events occurring at an aggregate level (i.e., countries, cities, regions, etc.) and affecting a relatively small number of units. Benefits and features of the Synth package are illustrated using data from Abadie and Gardeazabal (2003), who examined the economic impact of the terrorist conflict in the Basque Country.

  2. clusterProfiler: an R package for comparing biological themes among gene clusters.

    Science.gov (United States)

    Yu, Guangchuang; Wang, Li-Gen; Han, Yanyan; He, Qing-Yu

    2012-05-01

    Increasing amounts of quantitative data generated from transcriptomics and proteomics require integrative strategies for analysis. Here, we present an R package, clusterProfiler, that automates the process of biological-term classification and the enrichment analysis of gene clusters. The analysis module and visualization module were combined into a reusable workflow. Currently, clusterProfiler supports three species: humans, mice, and yeast. The methods provided in this package can be easily extended to other species and ontologies. The clusterProfiler package is released under the Artistic-2.0 License within the Bioconductor project. The source code and vignette are freely available at http://bioconductor.org/packages/release/bioc/html/clusterProfiler.html.
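
    As a hedged sketch of a GO over-representation analysis (using the argument names of recent Bioconductor releases of the package, which differ from the original 2012 interface), the example below assumes a vector of Entrez gene IDs from a differential expression analysis; the IDs shown are placeholders and the org.Hs.eg.db annotation package is assumed to be installed.

        library(clusterProfiler)
        library(org.Hs.eg.db)   # human gene annotation (assumed installed)

        # Placeholder Entrez IDs standing in for a real list of differentially
        # expressed genes
        deGenes <- c("4312", "8318", "10874", "55143", "55388", "991")

        ego <- enrichGO(gene          = deGenes,
                        OrgDb         = org.Hs.eg.db,
                        ont           = "BP",
                        pAdjustMethod = "BH",
                        pvalueCutoff  = 0.05,
                        readable      = TRUE)
        head(as.data.frame(ego))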

  3. prepdat- An R Package for Preparing Experimental Data for Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Ayala S. Allon

    2016-11-01

    Full Text Available In many research fields, the outcome of running an experiment is a raw data file for each subject, containing a table in which each row describes one trial conducted during the experiment. The next step is to merge all files into one big table, and then aggregate it into one finalized table in which each row corresponds (usually) to the averaged performance of each subject. prepdat, an R package, makes it easy to perform these steps, and includes several options for dependent measures and trimming procedures. prepdat helps researchers to optimize and speed up their analysis, and to better understand their results.

  4. TripleR: an R package for social relations analyses based on round-robin designs.

    Science.gov (United States)

    Schönbrodt, Felix D; Back, Mitja D; Schmukle, Stefan C

    2012-06-01

    In this article, we present TripleR, an R package for the calculation of social relations analyses (Kenny, 1994) based on round-robin designs. The scope of existing software solutions is ported to R and enhanced with previously unimplemented methods of significance testing in single groups (Lashley & Bond, 1997) and handling of missing values. The package requires only minimal knowledge of R, and results can be exported for subsequent analyses to other software packages. We demonstrate the use of TripleR with several didactic examples.

  5. Tools for automated acoustic monitoring within the R package monitoR

    DEFF Research Database (Denmark)

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese

    2016-01-01

    The R package monitoR contains tools for managing an acoustic-monitoring program including survey metadata, template creation and manipulation, automated detection and results management. These tools are scalable for use with small projects as well as larger long-term projects and those with expansive spatial extents. Here, we describe typical workflow when using the tools in monitoR. Typical workflow utilizes a generic sequence of functions, with the option for either binary point matching or spectrogram cross-correlation detectors.

  6. CAFE: an R package for the detection of gross chromosomal abnormalities from gene expression microarray data.

    Science.gov (United States)

    Bollen, Sander; Leddin, Mathias; Andrade-Navarro, Miguel A; Mah, Nancy

    2014-05-15

    The current methods available to detect chromosomal abnormalities from DNA microarray expression data are cumbersome and inflexible. CAFE has been developed to alleviate these issues. It is implemented as an R package that analyzes Affymetrix *.CEL files and comes with flexible plotting functions, easing visualization of chromosomal abnormalities. CAFE is available from https://bitbucket.org/cob87icW6z/cafe/ as both source and compiled packages for Linux and Windows. It is released under the GPL version 3 license. CAFE will also be freely available from Bioconductor. sander.h.bollen@gmail.com or nancy.mah@mdc-berlin.de Supplementary data are available at Bioinformatics online.

  7. ltm: An R Package for Latent Variable Modeling and Item Response Analysis

    Directory of Open Access Journals (Sweden)

    Dimitris Rizopoulos

    2006-11-01

    Full Text Available The R package ltm has been developed for the analysis of multivariate dichotomous and polytomous data using latent variable models, under the Item Response Theory approach. For dichotomous data the Rasch, the Two-Parameter Logistic, and Birnbaum's Three-Parameter models have been implemented, whereas for polytomous data Samejima's Graded Response model is available. Parameter estimates are obtained under marginal maximum likelihood using the Gauss-Hermite quadrature rule. The capabilities and features of the package are illustrated using two real data examples.
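
    A short, hedged sketch using the LSAT item-response data shipped with the package is given below; the model comparison is added purely for illustration.

        library(ltm)

        fit.rasch <- rasch(LSAT)      # Rasch model for the five dichotomous LSAT items
        fit.2pl   <- ltm(LSAT ~ z1)   # two-parameter logistic model with one latent trait

        summary(fit.2pl)
        anova(fit.rasch, fit.2pl)     # likelihood-ratio comparison of the nested fits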

  8. UpSetR: an R package for the visualization of intersecting sets and their properties.

    Science.gov (United States)

    Conway, Jake R; Lex, Alexander; Gehlenborg, Nils

    2017-09-15

    Venn and Euler diagrams are a popular yet inadequate solution for quantitative visualization of set intersections. A scalable alternative to Venn and Euler diagrams for visualizing intersecting sets and their properties is needed. We developed UpSetR, an open source R package that employs a scalable matrix-based visualization to show intersections of sets, their size, and other properties. UpSetR is available at https://github.com/hms-dbmi/UpSetR/ and released under the MIT License. A Shiny app is available at https://gehlenborglab.shinyapps.io/upsetr/ . nils@hms.harvard.edu. Supplementary data are available at Bioinformatics online.
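
    A minimal, hedged sketch of the list interface is shown below; the three gene sets are invented for the example.

        library(UpSetR)

        listInput <- list(
          setA = c("g1", "g2", "g3", "g5"),
          setB = c("g2", "g3", "g6"),
          setC = c("g1", "g3", "g6", "g7", "g8")
        )

        # fromList() turns named sets into the binary membership matrix expected
        # by upset(); intersections are ordered by size
        upset(fromList(listInput), order.by = "freq")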

  9. High-Dimensional Bayesian Clustering with Variable Selection: The R Package bclust

    Directory of Open Access Journals (Sweden)

    Vahid Partovi Nia

    2012-04-01

    Full Text Available The R package bclust is useful for clustering high-dimensional continuous data. The package uses a parametric spike-and-slab Bayesian model to downweight the effect of noise variables and to quantify the importance of each variable in agglomerative clustering. We take advantage of the existence of closed-form marginal distributions to estimate the model hyper-parameters using empirical Bayes, thereby yielding a fully automatic method. We discuss computational problems arising in implementation of the procedure and illustrate the usefulness of the package through examples.

  10. NHPoisson: An R Package for Fitting and Validating Nonhomogeneous Poisson Processes

    Directory of Open Access Journals (Sweden)

    Ana C. Cebrián

    2015-03-01

    Full Text Available NHPoisson is an R package for the modeling of nonhomogeneous Poisson processes in one dimension. It includes functions for data preparation, maximum likelihood estimation, covariate selection and inference based on asymptotic distributions and simulation methods. It also provides specific methods for the estimation of Poisson processes resulting from a peak-over-threshold approach. In addition, the package supports a wide range of model validation tools and functions for generating nonhomogeneous Poisson process trajectories. This paper is a description of the package and aims to help those interested in modeling data using nonhomogeneous Poisson processes.

  11. demoniche – an R-package for simulating spatially-explicit population dynamics

    DEFF Research Database (Denmark)

    Nenzén, Hedvig K.; Swab, Rebecca Marie; Keith, David A.

    2012-01-01

    demoniche is a freely available R-package which simulates stochastic population dynamics in multiple populations of a species. A demographic model projects population sizes utilizing several transition matrices that can represent impacts on species growth. The demoniche model offers options for setting demographic stochasticity, carrying capacity, and dispersal. The demographic projection in each population is linked to spatially-explicit niche values, which affect the species growth. With the demoniche package it is possible to compare the influence of scenarios of environmental changes on future population sizes, extinction probabilities, and range shifts of species.

  12. BlueSNP: R package for highly scalable genome-wide association studies using Hadoop clusters.

    Science.gov (United States)

    Huang, Hailiang; Tata, Sandeep; Prill, Robert J

    2013-01-01

    Computational workloads for genome-wide association studies (GWAS) are growing in scale and complexity outpacing the capabilities of single-threaded software designed for personal computers. The BlueSNP R package implements GWAS statistical tests in the R programming language and executes the calculations across computer clusters configured with Apache Hadoop, a de facto standard framework for distributed data processing using the MapReduce formalism. BlueSNP makes computationally intensive analyses, such as estimating empirical p-values via data permutation, and searching for expression quantitative trait loci over thousands of genes, feasible for large genotype-phenotype datasets. http://github.com/ibm-bioinformatics/bluesnp

  13. msSurv: An R Package for Nonparametric Estimation of Multistate Models

    Directory of Open Access Journals (Sweden)

    Nicole Ferguson

    2012-09-01

    Full Text Available We present an R package, msSurv, to calculate the marginal (that is, not conditional on any covariates) state occupation probabilities, the state entry and exit time distributions, and the marginal integrated transition hazard for a general, possibly non-Markov, multistate system under left truncation and right censoring. For a Markov model, msSurv also calculates and returns the transition probability matrix between any two states. Dependent censoring is handled via modeling the censoring hazard through observable covariates. Pointwise confidence intervals for the above-mentioned quantities are obtained and returned for independent censoring from closed-form variance estimators and for dependent censoring using the bootstrap.

  14. Nonparametric Kernel Distribution Function Estimation with kerdiest: An R Package for Bandwidth Choice and Applications

    Directory of Open Access Journals (Sweden)

    Alejandro Quintela-del-Rio

    2012-08-01

    Full Text Available The R package kerdiest has been designed for computing kernel estimators of the distribution function and other related functions. Because of its usefulness in real applications, the bandwidth parameter selection problem has been considered, and a cross-validation method and two plug-in type methods have been implemented. Moreover, three functions relevant to natural hazards have also been programmed. The package is completed with two interesting data sets, one of geological type (a complete catalogue of the earthquakes occurring in the northwest of the Iberian Peninsula) and another containing the maximum peak flow levels of a river in the United States of America.

  15. Tools for automated acoustic monitoring within the R package monitoR

    Science.gov (United States)

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese

    2016-01-01

    The R package monitoR contains tools for managing an acoustic-monitoring program including survey metadata, template creation and manipulation, automated detection and results management. These tools are scalable for use with small projects as well as larger long-term projects and those with expansive spatial extents. Here, we describe typical workflow when using the tools in monitoR. Typical workflow utilizes a generic sequence of functions, with the option for either binary point matching or spectrogram cross-correlation detectors.
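
    The sketch below is a hedged outline of the template-matching workflow; the WAV file names, time limits and frequency limits are placeholders, and the function names follow the package documentation.

        library(monitoR)

        # Build a spectrogram cross-correlation template from an annotated clip
        # ('clip_song.wav' is a placeholder recording of the target vocalization)
        tmpl <- makeCorTemplate("clip_song.wav", t.lim = c(0.5, 2.5),
                                frq.lim = c(2, 6), name = "song1")

        # Score a survey recording against the template and extract detections
        scores <- corMatch("survey_2016-05-01.wav", tmpl)
        peaks  <- findPeaks(scores)
        getDetections(peaks)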

  16. edcc : An R Package for the Economic Design of the Control Chart

    Directory of Open Access Journals (Sweden)

    Weicheng Zhu

    2013-01-01

    Full Text Available The basic purpose of the economic design of control charts is to find the optimum control chart parameters that minimize the process cost. In this paper, an R package, edcc (economic design of control charts), which provides a numerical method to find the optimum chart parameters, is presented using the unified approach to economic design. Some examples are also given to illustrate how to use the package. The types of control chart available in the edcc package are the X̅, CUSUM (cumulative sum), and EWMA (exponentially weighted moving average) control charts.

  17. Design and Implementation of a Graphical User Interface Based on Linux

    Institute of Scientific and Technical Information of China (English)

    吴春祥

    2011-01-01

    This paper introduces the design ideas and implementation methods of a graphical user interface based on Linux, developed in standard C. It provides the user with a simple graphical user programming interface, on top of which the user can efficiently and easily develop graphical interface applications with good portability.

  18. Defining Dynamic Graphics by a Graphical Language

    Institute of Scientific and Technical Information of China (English)

    毛其昌; 戴汝为

    1991-01-01

    A graphical language that can be used for defining dynamic pictures and applying control actions to them is defined with an expanded attributed grammar. Based on this, a system is built for developing the presentation of application data in user interfaces. This system provides user interface designers with a friendly and highly efficient programming environment.

  19. Design of a Graphical User Interface for an E-Paper Book Based on Qt

    Institute of Scientific and Technical Information of China (English)

    方世烟; 林东

    2012-01-01

    E-books are widely used in daily life, and the e-paper book has emerged so that users can read them more intuitively and comfortably. In this paper, the design of a graphical user interface for an e-paper book is implemented on the Linux operating system with the Qt graphical user interface development tools, so that text documents can be displayed on electronic paper. The program implements the creation of the main window and several dialog boxes, as well as the switching and interaction between the main window and the dialog boxes. In addition, decoding and paging of text documents with different encodings are implemented in the process of loading text documents.

  20. LabVIEW Graphical User Interface for a New High Sensitivity, High Resolution Micro-Angio-Fluoroscopic and ROI-CBCT System.

    Science.gov (United States)

    Keleshis, C; Ionita, Cn; Yadava, G; Patel, V; Bednarek, Dr; Hoffmann, Kr; Verevkin, A; Rudin, S

    2008-01-01

    A graphical user interface based on LabVIEW software was developed to enable clinical evaluation of a new High-Sensitivity Micro-Angio-Fluoroscopic (HSMAF) system for real-time acquisition, display and rapid frame transfer of high-resolution region-of-interest images. The HSMAF detector consists of a CsI(Tl) phosphor, a light image intensifier (LII), and a fiber-optic taper coupled to a progressive-scan, frame-transfer, charge-coupled device (CCD) camera which provides real-time 12-bit, 1k × 1k images capable of greater than 10 lp/mm resolution. Images can be captured in continuous or triggered mode, and the camera can be programmed by a computer using Camera Link serial communication. A graphical user interface was developed to control camera modes such as gain and pixel binning as well as to acquire, store, display, and process the images. The program, written in LabVIEW, has the following capabilities: camera initialization, synchronized image acquisition with the x-ray pulses, roadmap and digital subtraction angiography (DSA) acquisition, flat-field correction, brightness and contrast control, last-frame hold in fluoroscopy, looped playback of the acquired images in angiography, recursive temporal filtering and LII gain control. Frame rates can be up to 30 fps in full-resolution mode. The user-friendly implementation of the interface, along with the high frame-rate acquisition and display for this unique high-resolution detector, should provide angiographers and interventionalists with a new capability for visualizing details of small vessels and endovascular devices such as stents and hence enable more accurate diagnoses and image-guided interventions. (Support: NIH Grants R01NS43924, R01EB002873).

  1. ProbMetab: an R package for Bayesian probabilistic annotation of LC–MS-based metabolomics

    Science.gov (United States)

    Silva, Ricardo R.; Jourdan, Fabien; Salvanha, Diego M.; Letisse, Fabien; Jamin, Emilien L.; Guidetti-Gonzalez, Simone; Labate, Carlos A.; Vêncio, Ricardo Z. N.

    2014-01-01

    Summary: We present ProbMetab, an R package that promotes substantial improvement in automatic probabilistic liquid chromatography–mass spectrometry-based metabolome annotation. The inference engine core is based on a Bayesian model implemented to (i) allow diverse source of experimental data and metadata to be systematically incorporated into the model with alternative ways to calculate the likelihood function and (ii) allow sensitive selection of biologically meaningful biochemical reaction databases as Dirichlet-categorical prior distribution. Additionally, to ensure result interpretation by system biologists, we display the annotation in a network where observed mass peaks are connected if their candidate metabolites are substrate/product of known biochemical reactions. This graph can be overlaid with other graph-based analysis, such as partial correlation networks, in a visualization scheme exported to Cytoscape, with web and stand-alone versions. Availability and implementation: ProbMetab was implemented in a modular manner to fit together with established upstream (xcms, CAMERA, AStream, mzMatch.R, etc) and downstream R package tools (GeneNet, RCytoscape, DiffCorr, etc). ProbMetab, along with extensive documentation and case studies, is freely available under GNU license at: http://labpib.fmrp.usp.br/methods/probmetab/. Contact: rvencio@usp.br Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24443383

  2. geoCount: An R Package for the Analysis of Geostatistical Count Data

    Directory of Open Access Journals (Sweden)

    Liang Jing

    2015-02-01

    Full Text Available We describe the R package geoCount for the analysis of geostatistical count data. The package performs Bayesian analysis for the Poisson-lognormal and binomial-logitnormal spatial models, which are subclasses of the class of generalized linear spatial models proposed by Diggle, Tawn, and Moyeed (1998). The package implements the computationally intensive tasks in C++ using an R/C++ interface, and has parallel computation capabilities to speed up the computations. geoCount also implements group updating, Langevin-Hastings algorithms and a data-based parameterization, algorithmic approaches proposed by Christensen, Roberts, and Sköld (2006) to improve the efficiency of the Markov chain Monte Carlo algorithms. In addition, the package includes functions for simulation and visualization, as well as three geostatistical count datasets taken from the literature, one of which is used to illustrate the package's capabilities. Finally, we provide a side-by-side comparison between geoCount and the R packages geoRglm and INLA.

  3. robustlmm: An R Package for Robust Estimation of Linear Mixed-Effects Models

    Directory of Open Access Journals (Sweden)

    Manuel Koller

    2016-12-01

    Full Text Available Like any real-life data, data modeled by linear mixed-effects models often contain outliers or other contamination. Even a little contamination can drive the classic estimates far away from what they would be without it. At the same time, datasets that require mixed-effects modeling are often complex and large, which makes it difficult to spot contamination. Robust estimation methods aim to solve both problems: to provide estimates on which contamination has only little influence, and to detect and flag contamination. We introduce an R package, robustlmm, to robustly fit linear mixed-effects models. The package's functions and methods are designed to closely parallel those offered by lme4, the R package that implements classic linear mixed-effects model estimation in R. The robust estimation method in robustlmm is based on the random effects contamination model and the central contamination model. Contamination can be detected at all levels of the data. The estimation method does not make any assumption on the data's grouping structure except that the model parameters are estimable. robustlmm supports hierarchical and non-hierarchical (e.g., crossed) grouping structures. The robustness of the estimates and their asymptotic efficiency are fully controlled through the function interface. Individual parts (e.g., fixed effects and variance components) can be tuned independently. In this tutorial, we show how to fit robust linear mixed-effects models using robustlmm, how to assess the model fit, how to detect outliers, and how to compare different fits.
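
    As a hedged sketch of the interface, the example below re-fits a classic lme4 example robustly; the sleepstudy data ship with lme4, and the rlmer() formula mirrors the corresponding lmer() call.

        library(lme4)
        library(robustlmm)

        fit.classic <- lmer(Reaction ~ Days + (Days | Subject), data = sleepstudy)
        fit.robust  <- rlmer(Reaction ~ Days + (Days | Subject), data = sleepstudy)

        summary(fit.robust)
        # Diagnostic plots; robustness weights close to 1 indicate well-behaved
        # observations, small weights flag potential outliers
        plot(fit.robust)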

  4. Evaluating variability and uncertainty separately in microbial quantitative risk assessment using two R packages.

    Science.gov (United States)

    Pouillot, Régis; Delignette-Muller, Marie Laure

    2010-09-01

    Quantitative risk assessment has emerged as a valuable tool to enhance the scientific basis of regulatory decisions in the food safety domain. This article introduces the use of two new computing resources (R packages) specifically developed to help risk assessors in their projects. The first package, "fitdistrplus", gathers tools for choosing and fitting a parametric univariate distribution to a given dataset. The data may be continuous or discrete. Continuous data may be right-, left- or interval-censored as is frequently obtained with analytical methods, with the possibility of various censoring thresholds within the dataset. Bootstrap procedures then allow the assessor to evaluate and model the uncertainty around the parameters and to transfer this information into a quantitative risk assessment model. The second package, "mc2d", helps to build and study two dimensional (or second-order) Monte-Carlo simulations in which the estimation of variability and uncertainty in the risk estimates is separated. This package easily allows the transfer of separated variability and uncertainty along a chain of conditional mathematical and probabilistic models. The usefulness of these packages is illustrated through a risk assessment of hemolytic and uremic syndrome in children linked to the presence of Escherichia coli O157:H7 in ground beef. These R packages are freely available at the Comprehensive R Archive Network (cran.r-project.org).
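
    The toy sketch below illustrates both packages under stated assumptions: the data are simulated rather than taken from the E. coli O157:H7 case study, and the chosen distributions and simulation sizes are arbitrary.

        library(fitdistrplus)
        library(mc2d)

        # fitdistrplus: fit a parametric distribution and bootstrap parameter uncertainty
        conc <- rlnorm(50, meanlog = 1, sdlog = 0.8)   # simulated concentration data
        fit  <- fitdist(conc, "lnorm")
        boot <- bootdist(fit, niter = 500)
        summary(boot)

        # mc2d: propagate variability (V) and uncertainty (U) separately in a
        # two-dimensional (second-order) Monte Carlo simulation
        ndvar(1001)   # number of iterations in the variability dimension
        ndunc(101)    # number of iterations in the uncertainty dimension
        mu   <- mcstoc(rnorm, type = "U", mean = 1, sd = 0.1)            # uncertain parameter
        expo <- mcstoc(rlnorm, type = "VU", meanlog = mu, sdlog = 0.8)   # variable exposure
        sim  <- mc(mu, expo)
        summary(sim)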

  5. NFP: An R Package for Characterizing and Comparing of Annotated Biological Networks

    Directory of Open Access Journals (Sweden)

    Yang Cao

    2017-01-01

    Full Text Available A large number of biological networks exist for representing different types of interaction data, such as genetic, metabolic, gene regulatory, and protein-protein relationships. Recent approaches to biological network study are based on different mathematical concepts, so it is necessary to construct a uniform framework to judge the functionality of biological networks. We recently introduced a knowledge-based computational framework that reliably characterizes biological networks at the system level. The method works by making systematic comparisons to a set of well-studied “basic networks,” measuring both the functional and topological similarities. A biological network can thus be characterized as a spectrum-like vector consisting of similarities to the basic networks. Here, to facilitate the application, development, and adoption of this framework, we present an R package called NFP. This package extends our previous pipeline, offering a powerful set of functions for Network Fingerprint analysis. The software shows great potential for biological network study. The open source NFP R package is freely available under the GNU General Public License v2.0 at CRAN, along with its vignette.

  6. I-pot: a new approach utilising visual and contextual cues to support users in graphical web browser revisitation

    OpenAIRE

    Shen, Siu-Tsen; Prior, Stephen D.; Chen, Kuen-Meau

    2010-01-01

    With a quarter of the world’s population now having access to the internet, the area of web efficiency and optimal use is of growing importance to all users. The function of revisitation, where a user wants to return to a website that they have visited in the recent past becomes more important. Current static and textual approaches developed within the latest versions of mainstream web browsers leave much to be desired. This paper suggests a new approach via the use of organic visual and cont...

  7. airGR: a suite of lumped hydrological models in an R-package

    Science.gov (United States)

    Coron, Laurent; Perrin, Charles; Delaigue, Olivier; Andréassian, Vazken; Thirel, Guillaume

    2016-04-01

    Lumped hydrological models are useful and convenient tools for research, engineering and educational purposes. They propose catchment-scale representations of the precipitation-discharge relationship. Thanks to their limited data requirements, they can be easily implemented and run. With such models, it is possible to simulate a number of key hydrological processes over the catchment with limited structural and parametric complexity, typically evapotranspiration, runoff, underground losses, etc. The Hydrology Group at Irstea (Antony) has been developing a suite of rainfall-runoff models over the past 30 years, with the main objectives of designing models as efficient as possible in terms of streamflow simulation, applicable to a wide range of catchments and having low data requirements. This resulted in a suite of models running at different time steps (from hourly to annual) applicable to various issues including water balance estimation, forecasting, simulation of impacts and scenario testing. Recently, Irstea has developed an easy-to-use R-package (R Core Team, 2015), called airGR, to make these models widely available. It includes: - the annual water-balance model GR1A (Mouelhi et al., 2006), - the monthly model GR2M (Mouelhi, 2003), - three versions of the daily model, namely GR4J (Perrin et al., 2003), GR5J (Le Moine, 2008) and GR6J (Pushpalatha et al., 2011), - the hourly GR4H model (Mathevet, 2005), - a degree-day snow module, CemaNeige (Valéry et al., 2014). The airGR package has been designed to facilitate use by non-expert users and to allow the addition of evaluation criteria, models or calibration algorithms selected by the end-user. Each model core is coded in FORTRAN to ensure low computational time. The other package functions (i.e. mainly the calibration algorithm and the efficiency criteria) are coded in R. The package is already used for educational purposes. The presentation will detail the main functionalities of the package and present a case
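
    As a hedged sketch of the calling sequence (function and argument names follow recent releases of the package and its documented examples; the GR4J parameter values below are arbitrary illustrative numbers, not calibrated values), a daily GR4J run on the example dataset shipped with airGR might look like this:

        library(airGR)

        data(L0123001)   # example catchment shipped with airGR: BasinObs, BasinInfo

        # Assemble model inputs from observed precipitation and potential evapotranspiration
        InputsModel <- CreateInputsModel(FUN_MOD = RunModel_GR4J, DatesR = BasinObs$DatesR,
                                         Precip = BasinObs$P, PotEvap = BasinObs$E)

        # Simulation period: calendar years 1990-1999 of the example record
        Ind_Run <- which(format(BasinObs$DatesR, "%Y") %in% as.character(1990:1999))
        RunOptions <- CreateRunOptions(FUN_MOD = RunModel_GR4J, InputsModel = InputsModel,
                                       IndPeriod_Run = Ind_Run)

        # Run GR4J with illustrative (uncalibrated) parameter values X1-X4
        Param <- c(X1 = 257, X2 = 1.0, X3 = 88, X4 = 2.2)
        OutputsModel <- RunModel_GR4J(InputsModel = InputsModel, RunOptions = RunOptions,
                                      Param = Param)

        # Compare simulated and observed streamflow over the run period
        plot(OutputsModel, Qobs = BasinObs$Qmm[Ind_Run])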

  8. THE AUTOMATED TESTING SYSTEM OF PROGRAMS WITH THE GRAPHIC USER INTERFACE WITHIN THE CONTEXT OF EDUCATIONAL PROCESS

    OpenAIRE

    2009-01-01

    The paper describes the problems of automating the educational process in the course "Programming in a high-level language. Algorithmic languages". The complexities of testing programs with a user interface are noted. Existing analogues are considered, and methods for automating the testing of students' assignments are proposed.

  9. Free, cross-platform gRaphical software

    DEFF Research Database (Denmark)

    Dethlefsen, Claus

    2006-01-01

    -recursive graphical models, and models defined using the BUGS language. Today, there exists a wide range of packages to support the analysis of data using graphical models. Here, we focus on Open Source software, making it possible to extend the functionality by integrating these packages into more general tools. We will attempt to give an overview of the available Open Source software, with focus on the gR project. This project was launched in 2002 to make facilities in R for graphical modelling. Several R packages have been developed within the gR project, both for display and analysis of graphical models...

  11. Publication-quality computer graphics

    Energy Technology Data Exchange (ETDEWEB)

    Slabbekorn, M.H.; Johnston, R.B. Jr.

    1981-01-01

    A user-friendly graphic software package is being used at Oak Ridge National Laboratory to produce publication-quality computer graphics. Close interaction between the graphic designer and the computer programmer has helped to create a highly flexible computer graphics system. The programmer-oriented environment of computer graphics has been modified to allow the graphic designer the freedom to exercise his expertise with lines, form, typography, and color. The resulting product rivals or surpasses work previously done by hand. This presentation of computer-generated graphs, charts, diagrams, and line drawings clearly demonstrates the latitude and versatility of the software when directed by a graphic designer.

  12. Automating calibration, sensitivity and uncertainty analysis of complex models using the R package Flexible Modeling Environment (FME): SWAT as an example

    Science.gov (United States)

    Wu, Y.; Liu, S.

    2012-01-01

    Parameter optimization and uncertainty issues are a great challenge for the application of large environmental models like the Soil and Water Assessment Tool (SWAT), which is a physically-based hydrological model for simulating water and nutrient cycles at the watershed scale. In this study, we present a comprehensive modeling environment for SWAT, including automated calibration, and sensitivity and uncertainty analysis capabilities through integration with the R package Flexible Modeling Environment (FME). To address challenges (e.g., calling the model in R and transferring variables between Fortran and R) in developing such a two-language coupling framework, (1) we converted the Fortran-based SWAT model to an R function (R-SWAT) using the RFortran platform, and, alternatively, (2) we compiled SWAT as a Dynamic Link Library (DLL). We then wrapped SWAT (via R-SWAT) with FME to perform complex applications including parameter identifiability, inverse modeling, and sensitivity and uncertainty analysis in the R environment. The final R-SWAT-FME framework has the following key functionalities: automatic initialization of R, running Fortran-based SWAT and R commands in parallel, transferring parameters and model output between SWAT and R, and inverse modeling with visualization. To examine this framework and demonstrate how it works, a case study simulating streamflow in the Cedar River Basin in Iowa in the United States was used, and we compared it with the built-in auto-calibration tool of SWAT in parameter optimization. Results indicate that both methods performed well and similarly in searching for a set of optimal parameters. Nonetheless, R-SWAT-FME is more attractive due to its instant visualization and its potential to take advantage of other R packages (e.g., for inverse modeling and statistical graphics). The methods presented in the paper are readily adaptable to other model applications that require capability for automated calibration, and sensitivity and uncertainty
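
    The coupling rests on the standard FME pattern: wrap the external model in an R function that returns a model cost, then hand that function to FME's fitting and sensitivity routines. The sketch below illustrates this pattern with a stand-in analytical model in place of SWAT; run_model, its parameters and the synthetic observations are hypothetical placeholders, while modCost, modFit and sensFun are the FME functions the abstract refers to.

      library(FME)

      ## Hypothetical wrapper standing in for the R-SWAT function of the paper:
      ## runs the model for a parameter set and returns time series output
      run_model <- function(pars) {
        time <- 1:100
        flow <- pars["a"] * exp(-pars["b"] * time)   # placeholder dynamics
        data.frame(time = time, flow = flow)
      }

      ## Synthetic "observations" to calibrate against (illustrative only)
      obs <- data.frame(time = 1:100,
                        flow = 5 * exp(-0.05 * (1:100)) + rnorm(100, sd = 0.1))

      ## Cost function: model-observation residuals packaged by FME::modCost
      cost <- function(pars) {
        modCost(model = run_model(pars), obs = obs, x = "time")
      }

      ## Inverse modeling (parameter estimation) and local sensitivity analysis
      fit  <- modFit(f = cost, p = c(a = 1, b = 0.1))
      summary(fit)
      sens <- sensFun(func = cost, parms = fit$par)
      plot(sens)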

  13. ADaCGH: A parallelized web-based application and R package for the analysis of aCGH data.

    Directory of Open Access Journals (Sweden)

    Ramón Díaz-Uriarte

    Full Text Available BACKGROUND: Copy number alterations (CNAs) in genomic DNA have been associated with complex human diseases, including cancer. One of the most common techniques to detect CNAs is array-based comparative genomic hybridization (aCGH). The availability of aCGH platforms and the need for identification of CNAs has resulted in a wealth of methodological studies. METHODOLOGY/PRINCIPAL FINDINGS: ADaCGH is an R package and a web-based application for the analysis of aCGH data. It implements eight methods for detection of CNAs, gains and losses of genomic DNA, including all of the best performing ones from two recent reviews (CBS, GLAD, CGHseg, HMM). For improved speed, we use parallel computing (via MPI). Additional information (GO terms, PubMed citations, KEGG and Reactome pathways) is available for individual genes, and for sets of genes with altered copy numbers. CONCLUSIONS/SIGNIFICANCE: ADaCGH represents a qualitative increase in the standards of these types of applications: (a) all of the best performing algorithms are included, not just one or two; (b) we do not limit ourselves to providing a thin layer of CGI on top of existing BioConductor packages, but instead carefully use parallelization, examining different schemes, and are able to achieve significant decreases in user waiting time (factors up to 45x); (c) we have added functionality not currently available in some methods, to adapt to recent recommendations (e.g., merging of segmentation results in wavelet-based and CGHseg algorithms); (d) we incorporate redundancy, fault-tolerance and checkpointing, which are unique among web-based, parallelized applications; (e) all of the code is available under open source licenses, allowing others to build upon, copy, and adapt our code for other software projects.

  14. HumMeth27QCReport: an R package for quality control and primary analysis of Illumina Infinium methylation data

    Directory of Open Access Journals (Sweden)

    Mancuso Francesco M

    2011-12-01

    Full Text Available Abstract Background The study of the human DNA methylome has gained particular interest in the last few years. Researchers can nowadays investigate the potential role of DNA methylation in common disorders by taking advantage of new high-throughput technologies. Among these, Illumina Infinium assays can interrogate the methylation levels of hundreds of thousands of CpG sites, offering an ideal solution for genome-wide methylation profiling. However, as with other high-throughput technologies, the main bottleneck remains at the stage of data analysis rather than data production. Findings We have developed HumMeth27QCReport, an R package aimed at researchers who want to quickly analyse their Illumina Infinium methylation arrays. This package automates quality control steps by generating a report including sample-independent and sample-dependent quality plots, and performs primary analysis of raw methylation calls by computing data normalization, statistics, and sample similarities. The package is available from the CRAN repository, and can be integrated into any Galaxy instance through the implementation of ad hoc scripts accessible at the Galaxy Tool Shed. Conclusions Our package provides users of the Illumina Infinium Methylation assays with a simplified, automated, open-source quality control and primary analysis of their methylation data. Moreover, to enhance its use by experimental researchers, the tool is distributed along with the scripts necessary for its implementation in the Galaxy workbench. Finally, although it was originally developed for HumanMethylation27, we have verified its compatibility with data generated with the HumanMethylation450 BeadChip.

  15. A flexible open-source toolbox for robust end-member modelling analysis - The R-package EMMAgeo

    Science.gov (United States)

    Dietze, Michael; Dietze, Elisabeth

    2013-04-01

    Interpreting geomorphological and sedimentological processes from grain-size data in environmental archives typically runs into problems when source- and process-related grain-size distributions become mixed during deposition. A powerful approach to overcome this ambiguity is to statistically "unmix" the samples. Typical algorithms use eigenspace decomposition and techniques of dimension reduction. This contribution presents a package for the free statistical software R. Among the great advantages of R and R-packages are the open code structure, flexibility and low programming effort. The package contains a series of flexible, ready-to-use functions to perform different tasks of data testing, preparation, modelling and visualisation. The package originated from a recently presented Matlab-based end-member modelling algorithm (Dietze et al., 2012, SedGeol). It supports straightforward modelling of grain-size end-member loadings and scores (eigenspace extraction, factor rotation, data scaling, non-negative least squares solving) along with several measures of model quality. The package further provides preprocessing tools (e.g. grain-size scale conversions, tests of data structure, weight factor limit inference, determination of the minimum, optimum and maximum number of meaningful end-members) and allows modelling of data sets with artificial or user-defined end-member loadings. EMMAgeo also supports uncertainty estimation from a series of plausible model runs and the determination of robust end-members. The contribution presents important package functions, illustrating how large data sets of artificial and natural grain-size samples from different depositional environments can be analysed to infer quantified process-related proxies.

  16. glmulti: An R Package for Easy Automated Model Selection with (Generalized) Linear Models

    Directory of Open Access Journals (Sweden)

    Vincent Calcagno

    2010-10-01

    Full Text Available We introduce glmulti, an R package for automated model selection and multi-model inference with glm and related functions. From a list of explanatory variables, the provided function glmulti builds all possible unique models involving these variables and, optionally, their pairwise interactions. Restrictions can be specified for candidate models, by excluding specific terms, enforcing marginality, or controlling model complexity. Models are fitted with standard R functions like glm. The n best models and their support (e.g., (Q)AIC, (Q)AICc, or BIC) are returned, allowing model selection and multi-model inference through standard R functions. The package is optimized for large candidate sets by avoiding memory limitations, facilitating parallelization and providing, in addition to exhaustive screening, a compiled genetic algorithm method. This article briefly presents the statistical framework and introduces the package, with applications to simulated and real data.
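
    As a rough illustration of the workflow described above, the sketch below screens all main-effect and pairwise-interaction logistic regression models for three simulated predictors and ranks them by AICc. The call follows the formula interface documented for glmulti; the data are simulated for illustration, and the pass-through of family to glm as well as the default exhaustive search method are assumptions based on the package documentation.

      library(glmulti)

      ## Simulated data with three candidate predictors (illustrative only)
      set.seed(1)
      d <- data.frame(x1 = rnorm(100), x2 = rnorm(100), x3 = rnorm(100))
      d$y <- rbinom(100, 1, plogis(0.5 * d$x1 - 0.8 * d$x3))

      ## Exhaustive screening of all models with main effects and pairwise
      ## interactions (level = 2), fitted by glm and ranked by AICc
      res <- glmulti(y ~ x1 + x2 + x3, data = d, level = 2,
                     fitfunction = glm, family = binomial,
                     crit = "aicc", confsetsize = 10)
      print(res)
      weightable(res)   # best models with their information-criterion weights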

  17. Spatio-Temporal Analysis of Epidemic Phenomena Using the R Package surveillance

    Directory of Open Access Journals (Sweden)

    Sebastian Meyer

    2017-05-01

    Full Text Available The availability of geocoded health data and the inherent temporal structure of communicable diseases have led to an increased interest in statistical models and software for spatio-temporal data with epidemic features. The open source R package surveillance can handle various levels of aggregation at which infective events have been recorded: individual-level time-stamped geo-referenced data (case reports) in either continuous space or discrete space, as well as counts aggregated by period and region. For each of these data types, the surveillance package implements tools for visualization, likelihood inference and simulation from recently developed statistical regression frameworks capturing endemic and epidemic dynamics. Altogether, this paper is a guide to the spatio-temporal modeling of epidemic phenomena, exemplified by analyses of public health surveillance data on measles and invasive meningococcal disease.
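
    For the count-data case, the package's endemic-epidemic workflow revolves around its "sts" class and the hhh4 model-fitting function. The sketch below is a minimal example along those lines, assuming the measlesWeserEms example data and the control-list structure shown in the package documentation; the specific control options are illustrative and may vary across versions.

      library(surveillance)

      ## Weekly measles counts by district, shipped with the package as an "sts" object
      data("measlesWeserEms")
      plot(measlesWeserEms, type = observed ~ time)   # aggregated time series
      plot(measlesWeserEms, type = observed ~ unit)   # map of cumulative counts

      ## Endemic-epidemic count regression (hhh4): seasonal endemic component
      ## plus an autoregressive epidemic component, negative binomial errors
      fit <- hhh4(measlesWeserEms,
                  control = list(end = list(f = addSeason2formula(~1, period = 52)),
                                 ar  = list(f = ~1),
                                 family = "NegBin1"))
      summary(fit)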

  18. Spatio-Temporal Analysis of Epidemic Phenomena Using the R Package surveillance

    CERN Document Server

    Meyer, Sebastian; Höhle, Michael

    2014-01-01

    The availability of geocoded health data and the inherent temporal structure of communicable diseases have led to an increased interest in statistical models and software for spatio-temporal data with epidemic features. The open source R package surveillance can handle various levels of aggregation at which infective events have been recorded: individual-level time-stamped geo-referenced data (case reports) in either continuous space or discrete space, as well as counts aggregated by period and region. For each of these data types, the surveillance package implements tools for visualization, likelihood inference and simulation from recently developed statistical regression frameworks capturing endemic and epidemic dynamics. Altogether, this paper is a guide to the spatio-temporal modeling of epidemic phenomena, exemplified by analyses of public health surveillance data on measles and invasive meningococcal disease.

  19. DRIFTSEL: an R package for detecting signals of natural selection in quantitative traits.

    Science.gov (United States)

    Karhunen, M; Merilä, J; Leinonen, T; Cano, J M; Ovaskainen, O

    2013-07-01

    Approaches and tools to differentiate between natural selection and genetic drift as causes of population differentiation are in frequent demand in evolutionary biology. Based on the approach of Ovaskainen et al. (2011), we have developed an R package (DRIFTSEL) that can be used to differentiate between stabilizing selection, diversifying selection and random genetic drift as causes of population differentiation in quantitative traits when neutral marker and quantitative genetic data are available. Apart from illustrating the use of this method and the interpretation of results using simulated data, we apply the package to data from three-spined sticklebacks (Gasterosteus aculeatus) to highlight its virtues. DRIFTSEL can also be used to perform usual quantitative genetic analyses in common-garden study designs. © 2013 John Wiley & Sons Ltd.

  20. An R Package for a General Class of Inverse Gaussian Distributions

    Directory of Open Access Journals (Sweden)

    Victor Leiva

    2007-03-01

    Full Text Available The inverse Gaussian distribution is a positively skewed probability model that has received great attention in the last 20 years. Recently, a family that generalizes this model, called inverse Gaussian type distributions, has been developed. The new R package named ig has been designed to analyze data from inverse Gaussian type distributions. This package contains basic probabilistic functions, lifetime indicators and a random number generator for this model. Also, parameter estimates and diagnostic analyses can be obtained using likelihood methods by means of this package. In addition, goodness-of-fit methods are implemented in order to assess the suitability of the model for the data. The capabilities and features of the ig package are illustrated using simulated and real data sets. Furthermore, some new results related to the inverse Gaussian type distribution are also obtained. Moreover, a simulation study is conducted to evaluate the estimation method implemented in the ig package.

  1. strucchange: An R Package for Testing for Structural Change in Linear Regression Models

    Directory of Open Access Journals (Sweden)

    Achim Zeileis

    2002-01-01

    Full Text Available This paper reviews tests for structural change in linear regression models from the generalized fluctuation test framework as well as from the F test (Chow test) framework. It introduces a unified approach for implementing these tests and presents how these ideas have been realized in an R package called strucchange. Enhancing the standard significance test approach, the package contains methods to fit, plot and test empirical fluctuation processes (like CUSUM, MOSUM and estimates-based processes) and to compute, plot and test sequences of F statistics with the supF, aveF and expF tests. Thus, it makes powerful tools available to display information about structural changes in regression relationships and to assess their significance. Furthermore, it is described how incoming data can be monitored.
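
    A short, hedged example of the two frameworks on a standard time series (UK driver deaths, available in base R): an OLS-based CUSUM fluctuation process with its significance test, a sequence of F statistics with the supF test, and dating of the breaks. The calls follow the package documentation; plotting and tuning options are left at their defaults.

      library(strucchange)

      ## Monthly UK driver deaths (base R dataset), on the log10 scale
      data("UKDriverDeaths")
      y <- log10(UKDriverDeaths)

      ## Generalized fluctuation test: OLS-based CUSUM process and its test
      ocus <- efp(y ~ 1, type = "OLS-CUSUM")
      plot(ocus)
      sctest(ocus)

      ## F test framework: sequence of Chow-type F statistics and the supF test
      fs <- Fstats(y ~ 1)
      plot(fs)
      sctest(fs, type = "supF")

      ## Dating the breaks by minimizing the residual sum of squares
      bp <- breakpoints(y ~ 1)
      summary(bp)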

  2. truncSP: An R Package for Estimation of Semi-Parametric Truncated Linear Regression Models

    Directory of Open Access Journals (Sweden)

    Maria Karlsson

    2014-05-01

    Full Text Available Problems with truncated data occur in many areas, complicating estimation and inference. Regarding linear regression models, the ordinary least squares estimator is inconsistent and biased for these types of data and is therefore unsuitable for use. Alternative estimators, designed for the estimation of truncated regression models, have been developed. This paper presents the R package truncSP. The package contains functions for the estimation of semi-parametric truncated linear regression models using three different estimators: the symmetrically trimmed least squares, quadratic mode, and left truncated estimators, all of which have been shown to have good asymptotic and finite sample properties. The package also provides functions for the analysis of the estimated models. Data from the environmental sciences are used to illustrate the functions in the package.

  3. Technical note: An R package for fitting Bayesian regularized neural networks with applications in animal breeding.

    Science.gov (United States)

    Pérez-Rodríguez, P; Gianola, D; Weigel, K A; Rosa, G J M; Crossa, J

    2013-08-01

    In recent years, several statistical models have been developed for predicting genetic values for complex traits using information on dense molecular markers, pedigrees, or both. These models include, among others, the Bayesian regularized neural networks (BRNN) that have been widely used in prediction problems in other fields of application and, more recently, for genome-enabled prediction. The R package described here (brnn) implements BRNN models and extends these to include both additive and dominance effects. The implementation takes advantage of multicore architectures via a parallel computing approach using openMP (Open Multiprocessing) for the computations. This note briefly describes the classes of models that can be fitted using the brnn package, and it also illustrates its use through several real examples.
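
    A minimal sketch of fitting such a network to simulated data, assuming the x/y interface and the predict method described in the brnn documentation; the data and the choice of two hidden neurons are purely illustrative.

      library(brnn)

      ## Toy regression problem: predict a response from two inputs
      set.seed(42)
      n <- 200
      x <- matrix(rnorm(2 * n), ncol = 2)
      y <- sin(x[, 1]) + 0.5 * x[, 2]^2 + rnorm(n, sd = 0.2)

      ## Bayesian regularized neural network with two neurons in the hidden layer
      fit <- brnn(x, y, neurons = 2)
      print(fit)

      ## In-sample predictions
      yhat <- predict(fit, newdata = x)
      cor(y, yhat)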

  4. SEQMINER: An R-Package to Facilitate the Functional Interpretation of Sequence-Based Associations.

    Science.gov (United States)

    Zhan, Xiaowei; Liu, Dajiang J

    2015-12-01

    Next-generation sequencing has enabled the study of a comprehensive catalogue of genetic variants for their impact on various complex diseases. Numerous consortia studies of complex traits have publicly released their summary association statistics, which have become an invaluable resource for learning the underlying biology, understanding the genetic architecture, and guiding clinical translations. There is great interest in the field in developing novel statistical methods for analyzing and interpreting results from these genotype-phenotype association studies. One popular platform for method development and data analysis is R. In order to enable these analyses in R, it is necessary to develop packages that can efficiently query files of summary association statistics, explore the linkage disequilibrium structure between variants, and integrate various bioinformatics databases. The complexity and scale of sequence datasets and databases pose significant computational challenges for method developers. To address these challenges and facilitate method development, we developed the R package SEQMINER for annotating and querying files of sequence variants (e.g., VCF/BCF files) and summary association statistics (e.g., METAL/RAREMETAL files), and for integrating bioinformatics databases. SEQMINER provides an infrastructure where novel methods can be distributed and applied to analyzing sequence datasets in practice. We illustrate the performance of SEQMINER using datasets from the 1000 Genomes Project. We show that SEQMINER is highly efficient and easy to use. It will greatly accelerate the process of applying statistical innovations to analyze and interpret sequence-based associations. The R package, its source code and documentation are available from http://cran.r-project.org/web/packages/seqminer and http://seqminer.genomic.codes/.

  5. A suite of R packages for web-enabled modeling and analysis of surface waters

    Science.gov (United States)

    Read, J. S.; Winslow, L. A.; Nüst, D.; De Cicco, L.; Walker, J. I.

    2014-12-01

    Researchers often create redundant methods for downloading, manipulating, and analyzing data from online resources. Moreover, the reproducibility of science can be hampered by complicated and voluminous data, lack of time for documentation and long-term maintenance of software, and fear of exposing programming skills. The combination of these factors can encourage unshared one-off programmatic solutions instead of openly provided reusable methods. Federal and academic researchers in the water resources and informatics domains have collaborated to address these issues. The result of this collaboration is a suite of modular R packages that can be used independently or as elements in reproducible analytical workflows. These documented and freely available R packages were designed to fill basic needs for the effective use of water data: the retrieval of time-series and spatial data from web resources (dataRetrieval, geoknife), performing quality assurance and quality control checks of these data with robust statistical methods (sensorQC), the creation of useful data derivatives (including physically- and biologically-relevant indices; GDopp, LakeMetabolizer), and the execution and evaluation of models (glmtools, rLakeAnalyzer). Here, we share details and recommendations for the collaborative coding process, and highlight the benefits of an open-source tool development pattern with a popular programming language in the water resources discipline (such as R). We provide examples of reproducible science driven by large volumes of web-available data using these tools, explore benefits of accessing packages as standardized web processing services (WPS) and present a working platform that allows domain experts to publish scientific algorithms in a service-oriented architecture (WPS4R). We assert that in the era of open data, tools that leverage these data should also be freely shared, transparent, and developed in an open innovation environment.
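
    As an example of the data-retrieval building block of this suite, the sketch below pulls one year of daily streamflow for a single USGS gage with dataRetrieval; the site number and date range are arbitrary choices for illustration.

      library(dataRetrieval)

      ## Daily mean streamflow (USGS parameter code 00060) for a single gage,
      ## retrieved from the USGS NWIS web service; site and dates are arbitrary
      site <- "05427718"
      flow <- readNWISdv(siteNumbers = site,
                         parameterCd = "00060",
                         startDate = "2013-01-01",
                         endDate   = "2013-12-31")
      flow <- renameNWISColumns(flow)   # friendlier column names (e.g., "Flow")

      summary(flow$Flow)
      plot(flow$Date, flow$Flow, type = "l",
           xlab = "Date", ylab = "Daily mean discharge (cfs)")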

  6. GeneSrF and varSelRF: a web-based tool and R package for gene selection and classification using random forest

    Directory of Open Access Journals (Sweden)

    Diaz-Uriarte Ramón

    2007-09-01

    Full Text Available Abstract Background Microarray data are often used for patient classification and gene selection. An appropriate tool for end users and biomedical researchers should combine user friendliness with statistical rigor, including carefully avoiding selection biases and allowing analysis of multiple solutions, together with access to additional functional information on selected genes. Methodologically, such a tool would be of greater use if it incorporates state-of-the-art computational approaches and makes its source code available. Results We have developed GeneSrF, a web-based tool, and varSelRF, an R package, that implement, in the context of patient classification, a validated method for selecting very small sets of genes while preserving classification accuracy. Computation is parallelized, allowing users to take advantage of multicore CPUs and clusters of workstations. Output includes bootstrapped estimates of prediction error rate and assessments of the stability of the solutions. Clickable tables link to additional information for each gene (GO terms, PubMed citations, KEGG pathways), and output can be sent to PaLS for examination of PubMed references, GO terms, and KEGG and Reactome pathways characteristic of sets of genes selected for class prediction. The full source code is available, allowing others to extend the software. The web-based application is available from http://genesrf2.bioinfo.cnio.es. All source code is available from Bioinformatics.org or The Launchpad. The R package is also available from CRAN. Conclusion varSelRF and GeneSrF implement a validated method for gene selection including bootstrap estimates of classification error rate. They are valuable tools for applied biomedical researchers, especially for exploratory work with microarray data. Because of the underlying technology used (a combination of parallelization with a web-based application), they are also of methodological interest to bioinformaticians and biostatisticians.
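
    A minimal sketch of the R-package side of this workflow on simulated expression data; the varSelRF call and the selected.vars component follow the package documentation, while the data dimensions and tuning values are arbitrary.

      library(varSelRF)

      ## Simulated expression matrix: 60 samples x 200 genes, two classes,
      ## with signal in the first 10 genes (illustrative data only)
      set.seed(1)
      x  <- matrix(rnorm(60 * 200), nrow = 60)
      cl <- factor(rep(c("A", "B"), each = 30))
      x[cl == "B", 1:10] <- x[cl == "B", 1:10] + 1.5

      ## Backwards elimination of genes driven by the random forest OOB error
      rf.vs <- varSelRF(xdata = x, Class = cl,
                        ntree = 2000, vars.drop.frac = 0.2)
      rf.vs$selected.vars   # the small set of genes retained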

  7. The LMPCA program: a graphical user interface for fitting the linked-mode PARAFAC-PCA model to coupled real-valued data.

    Science.gov (United States)

    Wilderjans, Tom F; Ceulemans, Eva; Kiers, Henk A L; Meers, Kristof

    2009-11-01

    In behavioral research, PARAFAC analysis, a three-mode generalization of standard principal component analysis (PCA), is often used to disclose the structure of three-way three-mode data. To get insight into the underlying mechanisms, one often wants to relate the component matrices resulting from such a PARAFAC analysis to external (two-way two-mode) information, regarding one of the modes of the three-way data. To this end, linked-mode PARAFAC-PCA analysis can be used, in which the three-way and the two-way data set, which have one mode in common, are simultaneously analyzed. More specifically, a PARAFAC and a PCA model are fitted to the three-way and the two-way data, respectively, restricting the component matrix for the common mode to be equal in both models. Until now, however, no software program has been publicly available to perform such an analysis. Therefore, in this article, the LMPCA program, a free and easy-to-use MATLAB graphical user interface, is presented to perform a linked-mode PARAFAC-PCA analysis. The LMPCA software can be obtained from the authors at http://ppw.kuleuven.be/okp/software/LMPCA. For users who do not have access to MATLAB, a stand-alone version is provided.

  8. GURU v2.0: An interactive Graphical User interface to fit rheometer curves in Han’s model for rubber vulcanization

    Directory of Open Access Journals (Sweden)

    G. Milani

    2016-01-01

    Full Text Available A GUI software tool (GURU) for fitting experimental rheometer curves of Natural Rubber (NR) vulcanized with sulphur at different curing temperatures is presented. Experimental data are automatically loaded into GURU from an Excel spreadsheet produced by the experimental machine (moving die rheometer). To fit the experimental data, the general reaction scheme proposed by Han and co-workers for NR vulcanized with sulphur is considered. From the simplified kinetic scheme adopted, a closed-form solution can be found for the crosslink density, with the only limitation that the induction period is excluded from the computations. Three kinetic constants must be determined in such a way as to minimize the absolute error between normalized experimental data and the numerical prediction. Usually, this result is achieved by means of standard least-squares data fitting. In contrast, GURU works interactively by means of a Graphical User Interface (GUI) to minimize the error and allows an interactive calibration of the kinetic constants by means of sliders. A simple mouse click on the sliders assigns a value to each kinetic constant and gives a visual comparison between numerical and experimental curves. Users thus find optimal values of the constants by means of a classic trial-and-error strategy. An experimental case of technical relevance is shown as a benchmark.

  9. COPASutils: an R package for reading, processing, and visualizing data from COPAS large-particle flow cytometers.

    Science.gov (United States)

    Shimko, Tyler C; Andersen, Erik C

    2014-01-01

    The R package COPASutils provides a logical workflow for the reading, processing, and visualization of data obtained from the Union Biometrica Complex Object Parametric Analyzer and Sorter (COPAS) or the BioSorter large-particle flow cytometers. Data obtained from these powerful experimental platforms can be unwieldy, leading to difficulties in the ability to process and visualize the data using existing tools. Researchers studying small organisms, such as Caenorhabditis elegans, Anopheles gambiae, and Danio rerio, and using these devices will benefit from this streamlined and extensible R package. COPASutils offers a powerful suite of functions for the rapid processing and analysis of large high-throughput screening data sets.

  10. COPASutils: an R package for reading, processing, and visualizing data from COPAS large-particle flow cytometers.

    Directory of Open Access Journals (Sweden)

    Tyler C Shimko

    Full Text Available The R package COPASutils provides a logical workflow for the reading, processing, and visualization of data obtained from the Union Biometrica Complex Object Parametric Analyzer and Sorter (COPAS) or the BioSorter large-particle flow cytometers. Data obtained from these powerful experimental platforms can be unwieldy, leading to difficulties in the ability to process and visualize the data using existing tools. Researchers studying small organisms, such as Caenorhabditis elegans, Anopheles gambiae, and Danio rerio, and using these devices will benefit from this streamlined and extensible R package. COPASutils offers a powerful suite of functions for the rapid processing and analysis of large high-throughput screening data sets.

  11. Upgrade to MODFLOW-GUI; addition of MODPATH, ZONEBDGT, and additional MODFLOW packages to the U.S. Geological Survey MODFLOW-96 Graphical-User Interface

    Science.gov (United States)

    Winston, R.B.

    1999-01-01

    This report describes enhancements to a Graphical-User Interface (GUI) for MODFLOW-96, the U.S. Geological Survey (USGS) modular, three-dimensional, finite-difference ground-water flow model, and MOC3D, the USGS three-dimensional, method-of-characteristics solute-transport model. The GUI is a plug-in extension (PIE) for the commercial program Argus ONE. The GUI has been modified to support MODPATH (a particle tracking post-processing package for MODFLOW), ZONEBDGT (a computer program for calculating subregional water budgets), and the Stream, Horizontal-Flow Barrier, and Flow and Head Boundary packages in MODFLOW. Context-sensitive help has been added to make the GUI easier to use and to understand. In large part, the help consists of quotations from the relevant sections of this report and its predecessors. The revised interface includes automatic creation of geospatial information layers required for the added programs and packages, and menus and dialog boxes for input of parameters for simulation control. The GUI creates formatted ASCII files that can be read by MODFLOW-96, MOC3D, MODPATH, and ZONEBDGT. All four programs can be executed within the Argus ONE application (Argus Interware, Inc., 1997). Spatial results of MODFLOW-96, MOC3D, and MODPATH can be visualized within Argus ONE. Results from ZONEBDGT can be visualized in an independent program that can also be used to view budget data from MODFLOW, MOC3D, and SUTRA. Another independent program extracts hydrographs of head or drawdown at individual cells from formatted MODFLOW head and drawdown files. A web-based tutorial on the use of MODFLOW with Argus ONE has also been updated. The internal structure of the GUI has been modified to make it possible for advanced users to easily customize the GUI. Two additional, independent PIEs were developed to allow users to edit the positions of nodes and to facilitate exporting the grid geometry to external programs.

  12. airGR: an R-package suitable for large sample hydrology presenting a suite of lumped hydrological models

    Science.gov (United States)

    Thirel, G.; Delaigue, O.; Coron, L.; Perrin, C.; Andreassian, V.

    2016-12-01

    Lumped hydrological models are useful and convenient tools for research, engineering and educational purposes. They provide catchment-scale representations of the precipitation-discharge relationship. Thanks to their limited data requirements, they can be easily implemented and run. With such models, it is possible to simulate a number of key hydrological processes over the catchment with limited structural and parametric complexity, typically evapotranspiration, runoff, underground losses, etc. The Hydrology Group at Irstea (Antony) has been developing a suite of rainfall-runoff models over the past 30 years with the main objectives of designing models as efficient as possible in terms of streamflow simulation, applicable to a wide range of catchments and having low data requirements. This resulted in a suite of models running at different time steps (from hourly to annual) applicable for various issues including water balance estimation, forecasting, simulation of impacts and scenario testing. Recently, Irstea has developed an easy-to-use R-package (R Core Team, 2015; Coron et al., 2016), called airGR, to make these models widely available. It includes: - the annual water balance model GR1A (Mouelhi et al., 2006), - the monthly GR2M model (Mouelhi, 2003), - three versions of the daily model, namely GR4J (Perrin et al., 2003), GR5J (Le Moine, 2008) and GR6J (Pushpalatha et al., 2011), - the hourly GR4H model (Mathevet, 2005), - the CemaNeige degree-day snow module (Valéry et al., 2014). The airGR package has been designed to facilitate use by non-expert users and to allow the addition of evaluation criteria, models or calibration algorithms selected by the end-user. Each model core is coded in FORTRAN to ensure low computational time. The other package functions (i.e. mainly the calibration algorithm and the efficiency criteria) are coded in R. The package is already used for educational purposes. It allows for convenient implementation of model inter-comparisons and
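
    Complementing the simulation sketch given earlier, the outline below shows how a calibration run with the built-in Michel algorithm and the NSE criterion might look, again assuming the example dataset and the function names from the package documentation; note in particular that the name of the observation argument of CreateInputsCrit (Obs here) has changed across airGR versions.

      library(airGR)
      data(L0123001)   # example catchment data (assumed dataset name), provides BasinObs

      InputsModel <- CreateInputsModel(FUN_MOD = RunModel_GR4J,
                                       DatesR  = BasinObs$DatesR,
                                       Precip  = BasinObs$P,
                                       PotEvap = BasinObs$E)
      Ind_Run <- which(format(BasinObs$DatesR, "%Y") >= "1990")
      RunOptions <- CreateRunOptions(FUN_MOD = RunModel_GR4J,
                                     InputsModel = InputsModel,
                                     IndPeriod_Run = Ind_Run)

      ## Efficiency criterion (NSE) computed against observed discharge
      InputsCrit <- CreateInputsCrit(FUN_CRIT = ErrorCrit_NSE,
                                     InputsModel = InputsModel,
                                     RunOptions = RunOptions,
                                     Obs = BasinObs$Qmm[Ind_Run])

      ## Built-in local search calibration algorithm
      CalibOptions <- CreateCalibOptions(FUN_MOD = RunModel_GR4J,
                                         FUN_CALIB = Calibration_Michel)
      OutputsCalib <- Calibration_Michel(InputsModel = InputsModel,
                                         RunOptions = RunOptions,
                                         InputsCrit = InputsCrit,
                                         CalibOptions = CalibOptions,
                                         FUN_MOD = RunModel_GR4J)
      OutputsCalib$ParamFinalR   # calibrated GR4J parameter set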

  13. PlotStuff: A class for plotting stuff from Overture based on: GL_GraphicsInterface: A graphics interface based on OpenGL based on: GenericGraphicsInterface: A generic graphics interface: User guide, Version 1.00

    Energy Technology Data Exchange (ETDEWEB)

    Henshaw, B.

    1996-10-16

    PlotStuff can be used to interactively plot objects from Overture such as mappings, grids and grid functions. PlotStuff can be used to plot contours, surfaces, streamlines and grids. It can also be used to make one-dimensional line plots. GL_GraphicsInterface is a class (from which PlotStuff is derived) that implements some standard plotting functions using OpenGL. GL_GraphicsInterface is itself derived from the class GenericGraphicsInterface, which defines some standard plotting functions that are independent of any particular graphics package.

  14. VIDENTE: a graphical user interface and decision support system for stochastic modelling of water table fluctuations at a single location; includes documentation of the programs KALMAX, KALTFN, SSD and EMERALD and introductions to stochastic modelling

    NARCIS (Netherlands)

    Bierkens, M.F.P.; Bron, W.A.

    2000-01-01

    The VIDENTE program contains a decision support system (DSS) for choosing between different models for stochastic modelling of water-table depths, and a graphical user interface to facilitate operating and running the four implemented models: KALMAX, KALTFN, SSD and EMERALD. In self-contained parts each of

  15. A Test Set for stiff Initial Value Problem Solvers in the open source software R: Package deTestSet

    NARCIS (Netherlands)

    Mazzia, F.; Cash, J.R.; Soetaert, K.

    2012-01-01

    In this paper we present the R package deTestSet that includes challenging test problems written as ordinary differential equations (ODEs), differential algebraic equations (DAEs) of index up to 3, and implicit differential equations (IDEs). In addition it includes 6 new codes to solve initial value

  16. An R package for spatial coverage sampling and random sampling from compact geographical strata by k-means

    NARCIS (Netherlands)

    Walvoort, D.J.J.; Brus, D.J.; Gruijter, de J.J.

    2010-01-01

    Both for mapping and for estimating spatial means of an environmental variable, the accuracy of the result will usually be increased by dispersing the sample locations so that they cover the study area as uniformly as possible. We developed a new R package for designing spatial coverage samples for

  17. Monitoring of intratidal lung mechanics: a Graphical User Interface for a model-based decision support system for PEEP-titration in mechanical ventilation.

    Science.gov (United States)

    Buehler, S; Lozano-Zahonero, S; Schumann, S; Guttmann, J

    2014-12-01

    In mechanical ventilation, a careful setting of the ventilation parameters in accordance with the current individual state of the lung is crucial to minimize ventilator-induced lung injury. Positive end-expiratory pressure (PEEP) has to be set to prevent collapse of the alveoli, while at the same time overdistension should be avoided. Classic approaches of analyzing static respiratory system mechanics fail in particular if lung injury already prevails. A new approach of analyzing dynamic respiratory system mechanics to set PEEP uses the intratidal, volume-dependent compliance, which is believed to stay relatively constant during one breath only if neither atelectasis nor overdistension occurs. To test the success of this dynamic approach systematically at the bedside or in an animal study, automation of the computing steps is necessary. A decision support system for optimizing PEEP in the form of a Graphical User Interface (GUI) was targeted. Respiratory system mechanics were analyzed using the gliding SLICE method. The resulting shapes of the intratidal compliance-volume curve were classified into one of six categories, each associated with a PEEP suggestion. The GUI should include a graphical representation of the results as well as a quality check to judge the reliability of the suggestion. The implementation of a user-friendly GUI was successfully realized. The agreement between modelled and measured pressure data [expressed as root-mean-square (RMS)], tested during the implementation phase with real respiratory data from two patient studies, was below 0.2 mbar for data taken in volume controlled mode and below 0.4 mbar for data taken in pressure controlled mode, except for two cases with RMS < 0.6 mbar. Visual inspections showed that good and medium quality data could be reliably identified. The new GUI allows visualization of intratidal compliance-volume curves on a breath-by-breath basis. The automatic categorisation of curve shape into one of six shape

  18. bspmma: An R Package for Bayesian Semiparametric Models for Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Deborah Burr

    2012-07-01

    Full Text Available We introduce an R package, bspmma, which implements a Dirichlet-based random effects model specific to meta-analysis. In meta-analysis, when combining effect estimates from several heterogeneous studies, it is common to use a random-effects model. The usual frequentist or Bayesian models specify a normal distribution for the true effects. However, in many situations, the effect distribution is not normal, e.g., it can have thick tails, be skewed, or be multi-modal. A Bayesian nonparametric model based on mixtures of Dirichlet process priors has been proposed in the literature, for the purpose of accommodating the non-normality. We review this model and then describe a competitor, a semiparametric version which has the feature that it allows for a well-defined centrality parameter convenient for determining whether the overall effect is significant. This second Bayesian model is based on a different version of the Dirichlet process prior, and we call it the "conditional Dirichlet model". The package contains functions to carry out analyses based on either the ordinary or the conditional Dirichlet model, functions for calculating certain Bayes factors that provide a check on the appropriateness of the conditional Dirichlet model, and functions that enable an empirical Bayes selection of the precision parameter of the Dirichlet process. We illustrate the use of the package on two examples, and give an interpretation of the results in these two different scenarios.

  19. A review of R-packages for random-intercept probit regression in small clusters

    Directory of Open Access Journals (Sweden)

    Haeike Josephy

    2016-10-01

    Full Text Available Generalized Linear Mixed Models (GLMMs) are widely used to model clustered categorical outcomes. To tackle the intractable integration over the random effects distributions, several approximation approaches have been developed for likelihood-based inference. As these seldom yield satisfactory results when analyzing binary outcomes from small clusters, estimation within the Structural Equation Modeling (SEM) framework is proposed as an alternative. We compare the performance of R-packages for random-intercept probit regression relying on: the Laplace approximation, adaptive Gaussian quadrature (AGQ), penalized quasi-likelihood, an MCMC-implementation, and integrated nested Laplace approximation within the GLMM-framework, and a robust diagonally weighted least squares estimation within the SEM-framework. In terms of bias for the fixed and random effect estimators, SEM usually performs best for cluster size two, while AGQ prevails in terms of precision (mainly because of SEM's robust standard errors). As the cluster size increases, however, AGQ becomes the best choice for both bias and precision.
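
    The GLMM-side approaches compared in such reviews can be illustrated with lme4, a package commonly used for this model class (named here as an assumption, since the abstract does not list the packages). The sketch below fits a random-intercept probit model to simulated two-member clusters with adaptive Gaussian quadrature; setting nAGQ = 1 instead would give the Laplace approximation.

      library(lme4)

      ## Simulated two-member clusters with a random intercept (illustrative only)
      set.seed(123)
      n_clusters <- 200
      cluster <- rep(seq_len(n_clusters), each = 2)
      u <- rnorm(n_clusters, sd = 1)                 # cluster-level random intercepts
      x <- rnorm(2 * n_clusters)
      y <- rbinom(2 * n_clusters, 1, pnorm(-0.5 + 0.8 * x + u[cluster]))
      d <- data.frame(y, x, cluster = factor(cluster))

      ## Random-intercept probit regression fitted by adaptive Gaussian quadrature;
      ## nAGQ = 1 would instead give the Laplace approximation
      fit <- glmer(y ~ x + (1 | cluster), data = d,
                   family = binomial(link = "probit"), nAGQ = 15)
      summary(fit)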

  20. A new R package and web application for detecting bilateral asymmetry in parasitic infections.

    Science.gov (United States)

    Wayland, Matthew T; Chubb, James C

    2016-11-10

    When parasites invade paired structures of their host non-randomly, the resulting asymmetry may have both pathological and ecological significance. To facilitate the detection and visualisation of asymmetric infections we have developed a free software tool, Analysis of Symmetry of Parasitic Infections (ASPI). This tool has been implemented as an R package (https://cran.r-project.org/package=aspi) and a web application (https://wayland.shinyapps.io/aspi). ASPI can detect both consistent bias towards one side, and inconsistent bias in which the left side is favoured in some hosts and the right in others. Application of ASPI is demonstrated using previously unpublished data on the distribution of metacercariae of species of Diplostomum von Nordmann, 1832 in the eyes of ruffe Gymnocephalus cernua (Linnaeus). Invasion of the lenses appeared to be random, with the proportion of metacercariae in the left and right lenses showing the pattern expected by chance. However, analysis of counts of metacercariae from the humors, choroid and retina revealed asymmetry between eyes in 38% of host fish.

  1. ClustOfVar: An R Package for the Clustering of Variables

    Directory of Open Access Journals (Sweden)

    Marie Chavent

    2012-09-01

    Full Text Available Clustering of variables is a way to arrange variables into homogeneous clusters, i.e., groups of variables which are strongly related to each other and thus convey the same information. These approaches can be useful for dimension reduction and variable selection. Several specific methods have been developed for the clustering of numerical variables. However, concerning qualitative variables or mixtures of quantitative and qualitative variables, far fewer methods have been proposed. The R package ClustOfVar was specifically developed for this purpose. The homogeneity criterion of a cluster is defined as the sum of correlation ratios (for qualitative variables) and squared correlations (for quantitative variables) to a synthetic quantitative variable, summarizing "as well as possible" the variables in the cluster. This synthetic variable is the first principal component obtained with the PCAMIX method. Two clustering algorithms are proposed to optimize the homogeneity criterion: an iterative relocation algorithm and ascendant hierarchical clustering. We also propose a bootstrap approach in order to determine suitable numbers of clusters. We illustrate the methodologies and the associated package on small datasets.
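
    A brief sketch of the two-step workflow on a small mixed dataset (mtcars from base R): hierarchical clustering of the variables, a bootstrap stability check, and cutting the tree into a chosen number of clusters. Function names follow the package documentation; the choice of variables and of three clusters is arbitrary.

      library(ClustOfVar)

      ## Mixed quantitative/qualitative data from base R
      data(mtcars)
      X.quanti <- mtcars[, c("mpg", "disp", "hp", "drat", "wt", "qsec")]
      X.quali  <- data.frame(cyl  = factor(mtcars$cyl),
                             am   = factor(mtcars$am),
                             gear = factor(mtcars$gear))

      ## Ascendant hierarchical clustering of the variables
      tree <- hclustvar(X.quanti = X.quanti, X.quali = X.quali)
      plot(tree)

      ## Bootstrap stability of partitions into 2, 3, ... clusters
      stab <- stability(tree, B = 20)

      ## Cut the tree into three clusters of variables (an arbitrary choice here)
      part <- cutreevar(tree, k = 3)
      print(part)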

  2. Inferring signalling networks from longitudinal data using sampling based approaches in the R-package 'ddepn'

    Directory of Open Access Journals (Sweden)

    Korf Ulrike

    2011-07-01

    Full Text Available Abstract Background Network inference from high-throughput data has become an important means of current analysis of biological systems. For instance, in cancer research, the functional relationships of cancer related proteins, summarised into signalling networks, are of central interest for the identification of pathways that influence tumour development. Cancer cell lines can be used as model systems to study the cellular response to drug treatments in a time-resolved way. Based on these kinds of data, modelling approaches for the signalling relationships are needed that allow hypotheses on potential interference points in the networks to be generated. Results We present the R-package 'ddepn' that implements our recent approach to network reconstruction from longitudinal data generated after external perturbation of network components. We extend our approach by two novel methods: a Markov Chain Monte Carlo method for sampling network structures with two edge types (activation and inhibition), and an extension of a prior model that penalises deviances from a given reference network while incorporating these two types of edges. Further, as an alternative prior we include a model that learns signalling networks with the scale-free property. Conclusions The package 'ddepn' is freely available on R-Forge and CRAN (http://ddepn.r-forge.r-project.org, http://cran.r-project.org). It allows network inference from longitudinal high-throughput data to be performed conveniently, using two different sampling-based network structure search algorithms.

  3. Analyzing State Sequences with Probabilistic Suffix Trees: The PST R Package

    Directory of Open Access Journals (Sweden)

    Alexis Gabadinho

    2016-08-01

    Full Text Available This article presents the PST R package for categorical sequence analysis with probabilistic suffix trees (PSTs), i.e., structures that store variable-length Markov chains (VLMCs). VLMCs allow high-order dependencies in categorical sequences to be modelled with parsimonious models based on simple estimation procedures. The package is specifically adapted to the field of social sciences, as it allows VLMC models to be learned from sets of individual sequences possibly containing missing values; in addition, the package is extended to account for case weights. This article describes how a VLMC model is learned from one or more categorical sequences and stored in a PST. The PST can then be used for sequence prediction, i.e., to assign a probability to whole observed or artificial sequences. This feature supports data mining applications such as the extraction of typical patterns and outliers. The article also introduces original visualization tools for both the model and the outcomes of sequence prediction. Other features such as functions for pattern mining and artificial sequence generation are described as well. The PST package also allows for the computation of probabilistic divergence between two models and the fitting of segmented VLMCs, where sub-models fitted to distinct strata of the learning sample are stored in a single PST.

  4. CARBayes: An R Package for Bayesian Spatial Modeling with Conditional Autoregressive Priors

    Directory of Open Access Journals (Sweden)

    Duncan Lee

    2013-11-01

    Full Text Available Conditional autoregressive models are commonly used to represent spatial autocorrelation in data relating to a set of non-overlapping areal units, which arise in a wide variety of applications including agriculture, education, epidemiology and image analysis. Such models are typically specified in a hierarchical Bayesian framework, with inference based on Markov chain Monte Carlo (MCMC) simulation. The most widely used software to fit such models is WinBUGS or OpenBUGS, but in this paper we introduce the R package CARBayes. The main advantage of CARBayes compared with the BUGS software is its ease of use, because: (1) the spatial adjacency information is easy to specify as a binary neighbourhood matrix; and (2) given the neighbourhood matrix, the models can be implemented by a single function call in R. This paper outlines the general class of Bayesian hierarchical models that can be implemented in the CARBayes software, describes their implementation via MCMC simulation techniques, and illustrates their use with two worked examples in the fields of house price analysis and disease mapping.
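
    The "single function call" workflow can be sketched as follows for a Poisson, disease-mapping-style model with a Leroux CAR random effect; the data and the neighbourhood construction (via spdep) are simulated for illustration, and the model-fitting function name (S.CARleroux) follows the current CARBayes documentation, which has changed across package versions.

      library(CARBayes)
      library(spdep)   # used only to build the binary neighbourhood matrix

      ## Hypothetical areal data: 100 areas on a 10 x 10 grid with Poisson counts
      set.seed(1)
      grid <- as.matrix(expand.grid(x = 1:10, y = 1:10))
      W.nb <- dnearneigh(grid, d1 = 0, d2 = 1.1)   # rook-type neighbours
      W    <- nb2mat(W.nb, style = "B")            # binary adjacency matrix

      K  <- nrow(grid)
      x1 <- rnorm(K)
      E  <- rep(10, K)                             # expected counts
      dat <- data.frame(y = rpois(K, lambda = E * exp(0.3 * x1)), x1 = x1, E = E)

      ## Poisson log-linear model with a Leroux CAR random effect
      fit <- S.CARleroux(formula = y ~ x1 + offset(log(E)),
                         family = "poisson", data = dat, W = W,
                         burnin = 2000, n.sample = 12000)
      print(fit)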

  5. KernSmoothIRT: An R Package for Kernel Smoothing in Item Response Theory

    Directory of Open Access Journals (Sweden)

    Angelo Mazza

    2014-06-01

    Full Text Available Item response theory (IRT) models are a class of statistical models used to describe the response behaviors of individuals to a set of items having a certain number of options. They are adopted by researchers in social science, particularly in the analysis of performance or attitudinal data, in psychology, education, medicine, marketing and other fields where the aim is to measure latent constructs. Most IRT analyses use parametric models that rely on assumptions that often are not satisfied. In such cases, a nonparametric approach might be preferable; nevertheless, there are not many software implementations allowing its use. To address this gap, this paper presents the R package KernSmoothIRT. It implements kernel smoothing for the estimation of option characteristic curves, and adds several plotting and analytical tools to evaluate the whole test/questionnaire, the items, and the subjects. In order to show the package's capabilities, two real datasets are used, one employing multiple-choice responses, and the other scaled responses.

  6. MCMC Methods for Multi-Response Generalized Linear Mixed Models: The MCMCglmm R Package

    Directory of Open Access Journals (Sweden)

    Jarrod Hadfield

    2010-02-01

    Full Text Available Generalized linear mixed models provide a flexible framework for modeling a range of data, although with non-Gaussian response variables the likelihood cannot be obtained in closed form. Markov chain Monte Carlo methods solve this problem by sampling from a series of simpler conditional distributions that can be evaluated. The R package MCMCglmm implements such an algorithm for a range of model-fitting problems. More than one response variable can be analyzed simultaneously, and these variables are allowed to follow Gaussian, Poisson, multinomial/binomial, exponential, zero-inflated and censored distributions. A range of variance structures are permitted for the random effects, including interactions with categorical or continuous variables (i.e., random regression), and more complicated variance structures that arise through shared ancestry, either through a pedigree or through a phylogeny. Missing values are permitted in the response variable(s), and data can be known up to some level of measurement error, as in meta-analysis. All simulation is done in C/C++ using the CSparse library for sparse linear systems.
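
    A minimal sketch of a single-response Poisson fit with a random intercept, using simulated data; the prior specification follows the standard list structure from the package documentation, and the chain-length settings are illustrative only.

      library(MCMCglmm)

      ## Simulated repeated Poisson counts within individuals (illustrative only)
      set.seed(1)
      n_id <- 100
      d <- data.frame(id = factor(rep(1:n_id, each = 5)),
                      x  = rnorm(5 * n_id))
      u <- rnorm(n_id, sd = 0.5)                     # individual-level effects
      d$y <- rpois(nrow(d), exp(0.2 + 0.5 * d$x + u[as.integer(d$id)]))

      ## Poisson GLMM with a random intercept for individual and
      ## weakly informative inverse-Wishart priors on the variance components
      prior <- list(R = list(V = 1, nu = 0.002),
                    G = list(G1 = list(V = 1, nu = 0.002)))
      fit <- MCMCglmm(fixed = y ~ x, random = ~ id,
                      family = "poisson", data = d, prior = prior,
                      nitt = 13000, burnin = 3000, thin = 10, verbose = FALSE)
      summary(fit)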

  7. DAKS: An R Package for Data Analysis Methods in Knowledge Space Theory

    Directory of Open Access Journals (Sweden)

    Ali Ünlü

    2010-11-01

    Full Text Available Knowledge space theory is part of psychometrics and provides a theoretical framework for the modeling, assessment, and training of knowledge. It utilizes the idea that some pieces of knowledge may imply others, and is based on order and set theory. We introduce the R package DAKS for performing basic and advanced operations in knowledge space theory. This package implements three inductive item tree analysis algorithms for deriving quasi orders from binary data, the original, corrected, and minimized corrected algorithms, in sample as well as population quantities. It provides functions for computing population and estimated asymptotic variances of, and one- and two-sample Z tests for, the diff fit measures, and for switching between test item and knowledge state representations. Other features are a function for computing response pattern and knowledge state frequencies, a data (based on a finite mixture latent variable model) and quasi order simulation tool, and a Hasse diagram drawing device. We describe the functions of the package and demonstrate their usage with real and simulated data examples.

  8. Meta-Statistics for Variable Selection: The R Package BioMark

    Directory of Open Access Journals (Sweden)

    Ron Wehrens

    2012-11-01

    Full Text Available Biomarker identification is an ever more important topic in the life sciences. With the advent of measurement methodologies based on microarrays and mass spectrometry, thousands of variables are routinely being measured on complex biological samples. Often, the question is what makes two groups of samples different. Classical hypothesis testing suffers from the multiple testing problem; however, correcting for this often leads to a lack of power. In addition, choosing α cutoff levels remains somewhat arbitrary. Also in a regression context, a model depending on few but relevant variables will be more accurate and precise, and easier to interpret biologically. We propose an R package, BioMark, implementing two meta-statistics for variable selection. The first, higher criticism, presents a data-dependent selection threshold for significance, instead of a cookbook value of α = 0.05. It is applicable in all cases where two groups are compared. The second, stability selection, is more general, and can also be applied in a regression context. This approach uses repeated subsampling of the data in order to assess the variability of the model coefficients and selects those that remain consistently important. It is shown using experimental spike-in data from the field of metabolomics that both approaches work well with real data. BioMark also contains functionality for simulating data with specific characteristics for algorithm development and testing.

  9. OptGS: An R Package for Finding Near-Optimal Group-Sequential Designs

    Directory of Open Access Journals (Sweden)

    James Wason

    2015-08-01

    Full Text Available A group-sequential clinical trial design is one in which interim analyses of the data are conducted after groups of patients are recruited. After each interim analysis, the trial may stop early if the evidence so far shows the new treatment is particularly effective or ineffective. Such designs are ethical and cost-effective, and so are of great interest in practice. An optimal group-sequential design is one which controls the type-I error rate and power at a specified level, but minimizes the expected sample size of the trial when the true treatment effect is equal to some specified value. Searching for an optimal group-sequential design is a significant computational challenge because of the high number of parameters. In this paper the R package OptGS is described. Package OptGS searches for near-optimal and balanced (i.e., ones which balance more than one optimality criterion) group-sequential designs for randomized controlled trials with normally distributed outcomes. Package OptGS uses a two-parameter family of functions to determine the stopping boundaries, which improves the speed of the search process whilst still allowing flexibility in the possible shape of stopping boundaries. The resulting package allows optimal designs to be found in a matter of seconds, much faster than a previous approach.

  10. msBP: An R Package to Perform Bayesian Nonparametric Inference Using Multiscale Bernstein Polynomials Mixtures

    Directory of Open Access Journals (Sweden)

    Antonio Canale

    2017-06-01

    Full Text Available msBP is an R package that implements a new method to perform Bayesian multiscale nonparametric inference introduced by Canale and Dunson (2016). The method, based on mixtures of multiscale beta dictionary densities, overcomes the drawbacks of Pólya trees and inherits many of the advantages of Dirichlet process mixture models. The key idea is that an infinitely deep binary tree is introduced, with a beta dictionary density assigned to each node of the tree. Using a multiscale stick-breaking characterization, stochastically decreasing weights are assigned to each node. The result is an infinite mixture model. The package msBP implements a series of basic functions to deal with this family of priors, such as random density and number generation, creation and manipulation of binary tree objects, and generic functions to plot and print the results. In addition, it implements the Gibbs samplers for posterior computation to perform multiscale density estimation and multiscale testing of group differences described in Canale and Dunson (2016).

  11. Introducing COZIGAM: An R Package for Unconstrained and Constrained Zero-Inflated Generalized Additive Model Analysis

    Directory of Open Access Journals (Sweden)

    Hai Liu

    2010-10-01

    The zero-inflation problem is very common in ecological studies as well as other areas. Nonparametric regression with zero-inflated data may be studied via the zero-inflated generalized additive model (ZIGAM), which assumes that the zero-inflated responses come from a probabilistic mixture of zero and a regular component whose distribution belongs to the 1-parameter exponential family. With the further assumption that the probability of non-zero-inflation is some monotonic function of the mean of the regular component, we propose the constrained zero-inflated generalized additive model (COZIGAM) for analyzing zero-inflated data. When the hypothesized constraint holds, the new approach provides a unified framework for modeling zero-inflated data, which is more parsimonious and efficient than the unconstrained ZIGAM. We have developed an R package COZIGAM which contains functions that implement an iterative algorithm for fitting ZIGAMs and COZIGAMs to zero-inflated data based on the penalized likelihood approach. Other functions included in the package are useful for model prediction and model selection. We demonstrate the use of the COZIGAM package via some simulation studies and a real application.
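
    The constraint that distinguishes COZIGAM from an unconstrained ZIGAM can be mimicked with a few lines of base R: the probability of non-zero-inflation is tied to the mean of the regular component through a monotone link. The simulation below is a conceptual sketch with assumed parameter values, not the package's fitting routine.

```r
## Zero-inflated Poisson data with p linked monotonically to the regular mean mu.
set.seed(3)
n <- 500
x <- runif(n)
mu <- exp(1 + sin(2 * pi * x))         # mean of the regular Poisson component
alpha <- -1; delta <- 0.8              # assumed constraint parameters
p <- plogis(alpha + delta * log(mu))   # P(not zero-inflated), a function of mu
y <- rbinom(n, 1, p) * rpois(n, mu)    # mixture of structural zeros and Poisson counts
table(y == 0)                          # excess zeros relative to a plain Poisson model
```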

  12. drought2015: an R-package to facilitate pan-European drought mapping

    Science.gov (United States)

    Gauster, Tobias; Laaha, Gregor

    2016-04-01

    Hydrological processes do not stop at country borders, whereas hydrological data sets (released by national hydrological surveys) cover single countries only. Compiling up-to-date hydrological data on a trans-national scale usually involves difficulties, for example highly-varying file formats and licence restrictions. We developed an R package called drought2015 to describe the complete spatial extent of the streamflow drought that hit parts of Europe in 2015. The key concept is to distribute the package to every participating country and ask only for the data needed to carry out the final analysis. By enabling the participants to easily and autonomously perform the computation, instead of requesting complete streamflow records, all partners showed a willingness to cooperate. drought2015 enhances the well-established package lfstat with country-specific import routines for national file formats, specialised functions to easily compare low flow extremes and convenient plotting methods. Enforcing a uniform data structure and a consistent methodology in the distributed computation has enabled the data collection and facilitated the interpretation of the results. It became apparent that countries are much more willing to share derived data rather than the original raw data sets.

  13. Developing a Graphical User Interface to Automate the Estimation and Prediction of Risk Values for Flood Protective Structures using Artificial Neural Network

    Science.gov (United States)

    Hasan, M.; Helal, A.; Gabr, M.

    2014-12-01

    In this project, we focus on providing a computer-automated platform for a better assessment of the potential failures and retrofit measures of flood-protecting earth structures, e.g., dams and levees. Such structures play an important role during extreme flooding events as well as during normal operating conditions. Furthermore, they are part of other civil infrastructures such as water storage and hydropower generation. Hence, there is a clear need for accurate evaluation of stability and functionality levels during their service lifetime so that the rehabilitation and maintenance costs are effectively guided. Among condition assessment approaches based on the factor of safety, the limit states (LS) approach utilizes numerical modeling to quantify the probability of potential failures. The parameters for LS numerical modeling include i) geometry and side slopes of the embankment, ii) loading conditions in terms of rate of rise and duration of high water levels in the reservoir, and iii) cycles of rising and falling water levels simulating the effect of consecutive storms throughout the service life of the structure. Sample data regarding the correlations of these parameters are available through previous research studies. We have unified these criteria and extended the risk assessment in terms of loss of life through the implementation of a graphical user interface that automates the input parameters, divides the data into training and testing sets, and then feeds them into an Artificial Neural Network (ANN) tool through MATLAB programming. The ANN modeling allows us to predict risk values of flood protective structures based on user feedback quickly and easily. In the future, we expect to fine-tune the software by adding extensive data on variations of parameters.
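
    The described workflow (train/test split followed by an ANN fit) is implemented in MATLAB; purely as an illustration, the R sketch below performs the analogous steps with the nnet package on simulated data. All variable names, parameter ranges and the risk formula are hypothetical.

```r
## Illustrative R analogue of the workflow: split data, fit a small neural net,
## and check hold-out predictions of a risk value (toy data, hypothetical names).
library(nnet)
set.seed(11)
dat <- data.frame(slope     = runif(200, 2, 4),       # side slope (H:V)
                  rise_rate = runif(200, 0.1, 2),     # reservoir rise rate (m/day)
                  duration  = runif(200, 1, 30),      # high-water duration (days)
                  cycles    = sample(1:10, 200, TRUE))
dat$risk <- with(dat, plogis(-2 + 0.5 * rise_rate + 0.05 * duration +
                             0.1 * cycles - 0.3 * slope) + rnorm(200, 0, 0.02))
train <- sample(nrow(dat), 150)                       # training/testing split
fit <- nnet(risk ~ ., data = dat[train, ], size = 4, linout = TRUE,
            decay = 1e-3, maxit = 500, trace = FALSE)
pred <- predict(fit, dat[-train, ])
cor(pred, dat$risk[-train])                           # simple hold-out check
```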

  14. Low-cost Graphical User Interface Regression Test Framework

    Institute of Scientific and Technical Information of China (English)

    华涛; 李红红; 李来祥

    2011-01-01

    A graphical user interface (GUI) is created with rapid prototyping and has characteristics that distinguish it from traditional software, so traditional software testing techniques cannot be applied to GUIs directly. This paper analyses the interaction between GUI events, investigates why certain event interactions can lead to defects, and proposes a low-cost automated GUI regression test framework based on the Event Interaction Graph (EIG), together with a corresponding regression test process, intended to provide the best combination of defect detection rate and cost.

  15. Open-Source Assisted Laboratory Automation through Graphical User Interfaces and 3D Printers: Application to Equipment Hyphenation for Higher-Order Data Generation.

    Science.gov (United States)

    Siano, Gabriel G; Montemurro, Milagros; Alcaráz, Mirta R; Goicoechea, Héctor C

    2017-09-25

    Higher-order data generation implies some automation challenges, which are mainly related to the hidden programming languages and electronic details of the equipment. When techniques and/or equipment hyphenation are the key to obtaining higher-order data, the required simultaneous control of them demands funds for new hardware, software, and licenses, in addition to very skilled operators. In this work, we present Design of Inputs-Outputs with Sikuli (DIOS), a free and open-source code program that provides a general framework for the design of automated experimental procedures without prior knowledge of programming or electronics. Basically, instruments and devices are considered as nodes in a network, and every node is associated both with physical and virtual inputs and outputs. Virtual components, such as graphical user interfaces (GUIs) of equipment, are handled by means of image recognition tools provided by Sikuli scripting language, while handling of their physical counterparts is achieved using an adapted open-source three-dimensional (3D) printer. Two previously reported experiments of our research group, related to fluorescence matrices derived from kinetics and high-performance liquid chromatography, were adapted to be carried out in a more automated fashion. Satisfactory results, in terms of analytical performance, were obtained. Similarly, advantages derived from open-source tools assistance could be appreciated, mainly in terms of lesser intervention of operators and cost savings.

  16. VSDMIP 1.5: an automated structure- and ligand-based virtual screening platform with a PyMOL graphical user interface.

    Science.gov (United States)

    Cabrera, Álvaro Cortés; Gil-Redondo, Rubén; Perona, Almudena; Gago, Federico; Morreale, Antonio

    2011-09-01

    A graphical user interface (GUI) for our previously published virtual screening (VS) and data management platform VSDMIP (Gil-Redondo et al. J Comput Aided Mol Design, 23:171-184, 2009) that has been developed as a plugin for the popular molecular visualization program PyMOL is presented. In addition, a ligand-based VS module (LBVS) has been implemented that complements the already existing structure-based VS (SBVS) module and can be used in those cases where the receptor's 3D structure is not known or for pre-filtering purposes. This updated version of VSDMIP is placed in the context of similar available software and its LBVS and SBVS capabilities are tested here on a reduced set of the Directory of Useful Decoys database. Comparison of results from both approaches confirms the trend found in previous studies that LBVS outperforms SBVS. We also show that by combining LBVS and SBVS, and using a cluster of ~100 modern processors, it is possible to perform complete VS studies of several million molecules in less than a month. As the main processes in VSDMIP are 100% scalable, more powerful processors and larger clusters would notably decrease this time span. The plugin is distributed under an academic license upon request from the authors. © Springer Science+Business Media B.V. 2011

  17. Introducing an R-package for calculating channel width and other basic metrics for irregular river polygons

    Science.gov (United States)

    Golly, Antonius; Turowski, Jens

    2017-04-01

    The width of fluvial streams and channel beds is an important metric for a large number of hydraulic, geomorphic and ecological applications. For example, for a given discharge the local channel width determines the water flow velocity and thus the sediment transport capacity of a reach. Since streams often have irregular shapes with uneven channel banks, the channel width strongly varies along the channel. Although the geometry of streams or their beds can be measured easily in the field (e.g. with a Total Station or GPS) or from maps or aerial images in a GIS, the width of the stream cannot be identified objectively without further data processing, since the results are more or less irregular polygons with sometimes bent shapes. An objective quantification of the channel width and other metrics requires automated algorithms that are applicable over a range of channel shapes and spatial scales. Here, we present a lightweight software suite with a small number of functions that process 2D or 3D geometrical data of channels or channel beds. The software, written as an R-package, accepts various text data formats and can be configured through five parameters. It creates interactive overview plots (if desired) and produces three basic channel metrics: the centerline, the channel width along the centerline and the slope along the centerline. The centerline is an optimized line that minimizes the distances to both channel banks. This centerline also gives a measure for the real length and slope of the channel. From this centerline, perpendicular transects are generated which allow for the calculation of the channel width where they intersect with the channel banks. Briefly, we present an example and demonstrate the importance of these metrics in a use case of a steep stream, the Erlenbach stream in Switzerland. We were motivated to develop and publish the algorithm in an open-source framework, since only proprietary solutions were available at that time. The software is
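
    A crude version of the width and centerline computation can be sketched in base R: for each point on one bank, take the distance to the nearest point on the opposite bank and record the midpoint. This is a simplification of the transect-based approach described above, shown only to make the metrics concrete; the toy bank geometries are assumptions.

```r
## Minimal geometric sketch (not the package's algorithm): approximate local
## channel width and a crude centerline from two bank polylines.
set.seed(5)
s <- seq(0, 10, by = 0.1)
left  <- cbind(x = s, y =  1 + 0.3 * sin(s))          # left bank (toy data)
right <- cbind(x = s, y = -1 + 0.3 * sin(s + 0.5))    # right bank (toy data)
nearest <- function(p, pts) {
  d <- sqrt((pts[, 1] - p[1])^2 + (pts[, 2] - p[2])^2)
  list(dist = min(d), point = pts[which.min(d), ])
}
width <- numeric(nrow(left)); centre <- matrix(NA, nrow(left), 2)
for (i in seq_len(nrow(left))) {
  hit <- nearest(left[i, ], right)
  width[i] <- hit$dist                                # local width estimate
  centre[i, ] <- (left[i, ] + hit$point) / 2          # midpoint between banks
}
summary(width)            # channel width along the approximate centerline
head(centre)              # approximate centerline coordinates
```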

  18. User Study of Tactile Graphics Design for Blind Students

    Institute of Scientific and Technical Information of China (English)

    江宁; 鲁晓波; 李元; 徐迎庆

    2011-01-01

    Tactile displays for visually impaired people have difficulty conveying accurate graphical information effectively. In this paper, we introduce a user study of tactile graphics design for blind students. The study starts by exploring the needs and problems that blind people encounter when using tactile graphics, reviews the background knowledge of the related human factors, and then proposes several design principles that can help designers improve the design of tactile graphic displays. It then reports three of our T-graphics experiments and the analysis of their results. T-graphics greatly enhance not only the usability of tactile graphics, but also the efficiency with which visually impaired users access graphical information. The results demonstrate the effectiveness and feasibility of these design principles.

  19. Assessment of error rates in acoustic monitoring with the R package monitoR

    Science.gov (United States)

    Katz, Jonathan; Hafner, Sasha D.; Donovan, Therese

    2016-01-01

    Detecting population-scale reactions to climate change and land-use change may require monitoring many sites for many years, a process that is suited for an automated system. We developed and tested monitoR, an R package for long-term, multi-taxa acoustic monitoring programs. We tested monitoR with two northeastern songbird species: black-throated green warbler (Setophaga virens) and ovenbird (Seiurus aurocapilla). We compared detection results from monitoR in 52 10-minute surveys recorded at 10 sites in Vermont and New York, USA to a subset of songs identified by a human that were of a single song type and had visually identifiable spectrograms (e.g. a signal:noise ratio of at least 10 dB: 166 out of 439 total songs for black-throated green warbler, 502 out of 990 total songs for ovenbird). monitoR’s automated detection process uses a ‘score cutoff’, which is the minimum match needed for an unknown event to be considered a detection and results in a true positive, true negative, false positive or false negative detection. At the chosen score cut-offs, monitoR correctly identified presence for black-throated green warbler and ovenbird in 64% and 72% of the 52 surveys using binary point matching, respectively, and 73% and 72% of the 52 surveys using spectrogram cross-correlation, respectively. Of individual songs, 72% of black-throated green warbler songs and 62% of ovenbird songs were identified by binary point matching. Spectrogram cross-correlation identified 83% of black-throated green warbler songs and 66% of ovenbird songs. False positive rates were  for song event detection.
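
    The role of the score cutoff can be illustrated without the package: given match scores for candidate events and human-verified truth, classifying events at a cutoff yields the true/false positive counts of the kind reported above. The sketch below uses simulated scores and an assumed cutoff, not monitoR's template-matching functions.

```r
## Conceptual sketch (not the monitoR API): evaluate detections at a score cutoff.
set.seed(2)
scores <- c(rnorm(60, 0.7, 0.1), rnorm(140, 0.4, 0.1))  # toy template-match scores
truth  <- rep(c(TRUE, FALSE), c(60, 140))               # human verification of events
cutoff <- 0.55                                          # assumed score cutoff
detected <- scores >= cutoff
table(truth = truth, detected = detected)               # TP/FP/TN/FN counts
c(sensitivity     = mean(detected[truth]),              # songs correctly detected
  false_pos_rate  = mean(detected[!truth]))             # non-songs wrongly detected
```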

  20. 'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling

    Science.gov (United States)

    Sawicka, Kasia; Heuvelink, Gerard

    2017-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability or an ability to deal with case studies involving spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in
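
    The core Monte Carlo idea is easy to show in base R: draw realizations of the uncertain inputs, run the model once per realization, and summarise the spread of the outputs. The sketch below is a minimal non-spatial illustration with an assumed toy model, not the spup interface.

```r
## Minimal Monte Carlo uncertainty propagation sketch (not the spup API).
set.seed(9)
n <- 1000
rain  <- rnorm(n, mean = 800, sd = 60)      # uncertain input (mm/yr), assumed
coeff <- rnorm(n, mean = 0.35, sd = 0.05)   # uncertain model parameter, assumed
runoff_model <- function(rain, coeff) coeff * rain   # toy environmental model
out <- runoff_model(rain, coeff)            # one model run per MC realization
quantile(out, c(0.05, 0.5, 0.95))           # prediction uncertainty summary
```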

  1. 'spup' - an R package for uncertainty propagation in spatial environmental modelling

    Science.gov (United States)

    Sawicka, Kasia; Heuvelink, Gerard

    2016-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability, including case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected static and interactive visualization methods that are understandable by non-experts with limited background in

  2. SamplingStrata: An R Package for the Optimization of Stratified Sampling

    Directory of Open Access Journals (Sweden)

    Giulio Barcaroli

    2014-11-01

    When designing a sampling survey, usually constraints are set on the desired precision levels regarding one or more target estimates (the Ys). If a sampling frame is available, containing auxiliary information related to each unit (the Xs), it is possible to adopt a stratified sample design. For any given stratification of the frame, in the multivariate case it is possible to solve the problem of the best allocation of units in strata, by minimizing a cost function subject to precision constraints (or, conversely, by maximizing the precision of the estimates under a given budget). The problem is to determine the best stratification in the frame, i.e., the one that ensures the overall minimal cost of the sample necessary to satisfy precision constraints. The Xs can be categorical or continuous; continuous ones can be transformed into categorical ones. The most detailed stratification is given by the Cartesian product of the Xs (the atomic strata). A way to determine the best stratification is to explore exhaustively the set of all possible partitions derivable by the set of atomic strata, evaluating each one by calculating the corresponding cost in terms of the sample required to satisfy precision constraints. This is unaffordable in practical situations, where the dimension of the space of the partitions can be very high. Another possible way is to explore the space of partitions with an algorithm that is particularly suitable in such situations: the genetic algorithm. The R package SamplingStrata, based on the use of a genetic algorithm, allows one to determine the best stratification for a population frame, i.e., the one that ensures the minimum sample cost necessary to satisfy precision constraints, in a multivariate and multi-domain case.
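
    Inside any candidate stratification, the allocation sub-problem has a classical closed form; the base-R sketch below computes a Neyman allocation for three assumed strata to make that inner step concrete. It is not the SamplingStrata genetic-algorithm search.

```r
## Neyman allocation of a fixed sample size across given strata (illustrative).
frame <- data.frame(stratum = c("A", "B", "C"),
                    N = c(5000, 3000, 2000),        # stratum population sizes (assumed)
                    S = c(12, 30, 8))               # stratum std. deviations of Y (assumed)
n_total <- 400
alloc <- with(frame, n_total * N * S / sum(N * S))  # Neyman allocation formula
data.frame(frame, n = round(alloc))                 # sample size per stratum
```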

  3. Drought: A comprehensive R package for drought monitoring, prediction and analysis

    Science.gov (United States)

    Hao, Zengchao; Hao, Fanghua; Singh, Vijay P.; Cheng, Hongguang

    2015-04-01

    Drought may impose serious challenges on human societies and ecosystems. Due to its complicated causes and wide-ranging impacts, a universally accepted definition of drought does not exist. Drought indicators are commonly used to characterize drought properties such as duration or severity. Various drought indicators have been developed in the past few decades for monitoring particular aspects of drought conditions, along with multivariate drought indices for characterizing drought from multiple sources or hydro-climatic variables. Reliable drought prediction with suitable drought indicators is critical to drought preparedness plans that aim to reduce potential drought impacts. In addition, drought analysis that quantifies the risk of drought properties provides useful information for operational drought management. Drought monitoring, prediction and risk analysis are thus important components of drought modeling and assessment. In this study, a comprehensive R package "drought" is developed to aid drought monitoring, prediction and risk analysis (available from R-Forge and CRAN soon). The drought monitoring component of the package computes a suite of univariate and multivariate drought indices that integrate drought information from various sources such as precipitation, temperature, soil moisture, and runoff. The drought prediction/forecasting component consists of statistical drought predictions to enhance drought early warning for decision making. Analysis of drought properties such as duration and severity is also provided in the package for drought risk assessment. Based on this package, a drought monitoring and prediction/forecasting system is under development as a decision support tool. The package will be provided freely to the public to aid drought modeling and assessment for researchers and practitioners.
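
    As an illustration of what a univariate drought index involves, the sketch below computes a standardized precipitation-type index in base R (plus MASS for the gamma fit): aggregate precipitation, fit a gamma distribution, and map the fitted probabilities to z-scores. It is a conceptual example, not the package's implementation, and the simulated precipitation series is an assumption.

```r
## Standardized precipitation-type index on simulated monthly data (illustrative).
set.seed(4)
precip <- rgamma(360, shape = 2, rate = 0.02)         # 30 years of monthly totals (toy)
agg3 <- stats::filter(precip, rep(1, 3), sides = 1)   # 3-month running totals
agg3 <- as.numeric(na.omit(agg3))
fit <- MASS::fitdistr(agg3, "gamma")                  # estimate shape and rate
p <- pgamma(agg3, shape = fit$estimate["shape"], rate = fit$estimate["rate"])
spi3 <- qnorm(p)                                      # standardized index values
summary(spi3); mean(spi3 <= -1.5)                     # fraction flagged as severe drought
```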

  4. The R Package MitISEM: Efficient and Robust Simulation Procedures for Bayesian Inference

    Directory of Open Access Journals (Sweden)

    Nalan Baştürk

    2017-07-01

    This paper presents the R package MitISEM (mixture of t by importance sampling weighted expectation maximization), which provides an automatic and flexible two-stage method to approximate a non-elliptical target density kernel - typically a posterior density kernel - using an adaptive mixture of Student t densities as approximating density. In the first stage a mixture of Student t densities is fitted to the target using an expectation maximization algorithm where each step of the optimization procedure is weighted using importance sampling. In the second stage this mixture density is a candidate density for efficient and robust application of importance sampling or the Metropolis-Hastings (MH) method to estimate properties of the target distribution. The package enables Bayesian inference and prediction on model parameters and probabilities, in particular for models where densities have multi-modal or other non-elliptical shapes like curved ridges. These shapes occur in research topics in several scientific fields: for instance, analysis of DNA data in bio-informatics, obtaining loans in the banking sector by heterogeneous groups in financial economics, and analysis of education's effect on earned income in labor economics. The package MitISEM also provides an extended algorithm, 'sequential MitISEM', which substantially decreases computation time when the target density has to be approximated for increasing data samples. This occurs when the posterior or predictive density is updated with new observations and/or when one computes model probabilities using predictive likelihoods. We illustrate the MitISEM algorithm using three canonical statistical and econometric models that are characterized by several types of non-elliptical posterior shapes and that describe well-known data patterns in econometrics and finance. We show that MH using the candidate density obtained by MitISEM outperforms, in terms of numerical efficiency, MH using a simpler
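
    The second-stage use of a Student t candidate can be illustrated with plain importance sampling in base R: draw from a t proposal, weight by the ratio of target to proposal densities, and form a self-normalised estimate. The target, proposal parameters and quantity of interest below are assumptions; the sketch is not the MitISEM algorithm itself.

```r
## Importance sampling with a Student-t candidate density (illustrative only).
set.seed(8)
log_target <- function(x) dgamma(x, shape = 3, rate = 1, log = TRUE)  # toy target
m <- 3; s <- 2; nu <- 5                        # assumed t-proposal parameters
x <- m + s * rt(1e5, df = nu)                  # draws from the candidate density
logw <- log_target(x) - (dt((x - m) / s, df = nu, log = TRUE) - log(s))
w <- exp(logw - max(logw))                     # stabilised importance weights
sum(w * x) / sum(w)                            # self-normalised estimate of E[X] (true value 3)
```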

  5. HCsnip: An R Package for Semi-supervised Snipping of the Hierarchical Clustering Tree.

    Science.gov (United States)

    Obulkasim, Askar; van de Wiel, Mark A

    2015-01-01

    Hierarchical clustering (HC) is one of the most frequently used methods in computational biology in the analysis of high-dimensional genomics data. Given a data set, HC outputs a binary tree, the leaves of which are the data points and the internal nodes of which represent clusters of various sizes. Normally, a fixed-height cut on the HC tree is chosen, and each contiguous branch of data points below that height is considered as a separate cluster. However, the fixed-height branch cut may not be ideal in situations where one expects a complicated tree structure with nested clusters. Furthermore, due to lack of utilization of related background information in selecting the cutoff, induced clusters are often difficult to interpret. This paper describes a novel procedure that aims to automatically extract meaningful clusters from the HC tree in a semi-supervised way. The procedure is implemented in the R package HCsnip available from Bioconductor. Rather than cutting the HC tree at a fixed height, HCsnip probes various ways of snipping, possibly at variable heights, to tease out hidden clusters ensconced deep down in the tree. The cluster extraction process utilizes, along with the data set from which the HC tree is derived, commonly available background information. Consequently, the extracted clusters are highly reproducible and robust against the various sources of variation that "haunt" high-dimensional genomics data. Since the clustering process is guided by the background information, clusters are easy to interpret. Unlike existing packages, no constraint is placed on the data type on which clustering is desired. In particular, the package accepts patient follow-up data for guiding the cluster extraction process. To our knowledge, HCsnip is the first package that is able to decompose the HC tree into clusters with piecewise snipping under the guidance of patient time-to-event information. Our implementation of the semi-supervised HC tree snipping framework is generic, and can
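
    The difference between a fixed-height cut and other ways of cutting the tree can be seen with base R's hclust and cutree; the sketch below compares a height-based cut with a cut chosen by cluster number on toy data. It does not reproduce HCsnip's semi-supervised snipping.

```r
## Fixed-height versus cluster-number cuts of a hierarchical clustering tree.
set.seed(6)
x <- rbind(matrix(rnorm(50, 0), ncol = 2),
           matrix(rnorm(50, 4), ncol = 2),
           matrix(rnorm(50, 8), ncol = 2))      # three toy clusters
hc <- hclust(dist(x), method = "ward.D2")
fixed  <- cutree(hc, h = 10)    # fixed-height branch cut (assumed height)
chosen <- cutree(hc, k = 3)     # cut selected by the number of clusters
table(fixed, chosen)            # compare the two partitions
```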

  6. On Improving the User Experience of Web Site Graphic Design

    Institute of Scientific and Technical Information of China (English)

    隋涌

    2012-01-01

    The concept of user experience design originates from Western product design theory and is now widely applied to the design of Internet products. As an important medium through which web sites convey information, graphics place an increasing emphasis on satisfying user needs and enhancing the user experience. Starting from the concept of user experience, this paper explains the basis of web site graphic design, classifies web sites according to the core experience points that different types of sites should emphasize, and discusses the principles and specific methods of the web graphic design process.

  7. ABAEnrichment: an R package to test for gene set expression enrichment in the adult and developing human brain.

    Science.gov (United States)

    Grote, Steffi; Prüfer, Kay; Kelso, Janet; Dannemann, Michael

    2016-10-15

    We present ABAEnrichment, an R package that tests for expression enrichment in specific brain regions at different developmental stages using expression information gathered from multiple regions of the adult and developing human brain, together with ontologically organized structural information about the brain, both provided by the Allen Brain Atlas. We validate ABAEnrichment by successfully recovering the origin of gene sets identified in specific brain cell-types and developmental stages. ABAEnrichment was implemented as an R package and is available under GPL (≥ 2) from the Bioconductor website (http://bioconductor.org/packages/3.3/bioc/html/ABAEnrichment.html). Contact: steffi_grote@eva.mpg.de, kelso@eva.mpg.de or michael_dannemann@eva.mpg.de. Supplementary information: Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
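
    A bare-bones enrichment test of the kind underlying such analyses can be written with base R's hypergeometric distribution; the sketch below tests whether a candidate gene set is over-represented among genes assumed to be expressed in one brain region. Gene identifiers and set sizes are invented, and this is not the ABAEnrichment interface.

```r
## Hypergeometric over-representation test for a candidate gene set (illustrative).
set.seed(10)
universe  <- paste0("gene", 1:20000)                 # all annotated genes (toy)
expressed <- sample(universe, 3000)                  # genes expressed in a region (toy)
candidate <- c(sample(expressed, 60),                # candidate set: 60 expressed,
               sample(setdiff(universe, expressed), 140))  # 140 not expressed
overlap <- sum(candidate %in% expressed)
phyper(overlap - 1, m = length(expressed),
       n = length(universe) - length(expressed),
       k = length(candidate), lower.tail = FALSE)    # enrichment p-value
```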

  8. Temporal and Spatial Independent Component Analysis for fMRI Data Sets Embedded in the AnalyzeFMRI R Package

    Directory of Open Access Journals (Sweden)

    Pierre Lafaye de Micheaux

    2011-10-01

    For statistical analysis of functional magnetic resonance imaging (fMRI) data sets, we propose a data-driven approach based on independent component analysis (ICA), implemented in a new version of the AnalyzeFMRI R package. For fMRI data sets, the spatial dimension being much greater than the temporal dimension, spatial ICA is the computationally tractable approach generally proposed. However, for some neuroscientific applications, temporal independence of source signals can be assumed and temporal ICA then becomes an attractive exploratory technique. In this work, we use a classical linear algebra result ensuring the tractability of temporal ICA. We report several experiments on synthetic data and real MRI data sets that demonstrate the potential interest of our R package.
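
    The flavour of temporal ICA can be shown with the fastICA package, used here only as a convenient stand-in and not as the AnalyzeFMRI code: two temporal sources are mixed and then unmixed, and the recovered components are compared with the originals.

```r
## Unmixing two temporal sources with ICA (illustrative, toy signals).
library(fastICA)
set.seed(12)
tt <- seq(0, 10, by = 0.01)
S <- cbind(sin(2 * pi * 0.5 * tt),                  # source 1: slow sinusoid
           sign(sin(2 * pi * 1.3 * tt)))            # source 2: square wave
A <- matrix(c(0.6, 0.4, 0.35, 0.65), 2, 2)          # assumed mixing matrix
X <- S %*% A                                        # observed mixed signals
ica <- fastICA(X, n.comp = 2)
cor(ica$S, S)        # recovered components match the sources up to sign/order
```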

  9. Software design and code generation for the engineering graphical user interface of the ASTRI SST-2M prototype for the Cherenkov Telescope Array

    Science.gov (United States)

    Tanci, Claudio; Tosti, Gino; Antolini, Elisa; Gambini, Giorgio F.; Bruno, Pietro; Canestrari, Rodolfo; Conforti, Vito; Lombardi, Saverio; Russo, Federico; Sangiorgi, Pierluca; Scuderi, Salvatore

    2016-08-01

    ASTRI is an on-going project developed in the framework of the Cherenkov Telescope Array (CTA). An end-to-end prototype of a dual-mirror small-size telescope (SST-2M) has been installed at the INAF observing station on Mt. Etna, Italy. The next step is the development of the ASTRI mini-array composed of nine ASTRI SST-2M telescopes proposed to be installed at the CTA southern site. The ASTRI mini-array is a collaborative and international effort carried out by Italy, Brazil and South Africa and led by the Italian National Institute of Astrophysics, INAF. To control the ASTRI telescopes, a specific ASTRI Mini-Array Software System (MASS) was designed using a scalable and distributed architecture to monitor all the hardware devices for the telescopes. Using code generation we built automatically from the ASTRI Interface Control Documents a set of communication libraries and extensive Graphical User Interfaces that provide full access to the capabilities offered by the telescope hardware subsystems for testing and maintenance. Leveraging these generated libraries and components we then implemented a human-designed, integrated Engineering GUI for MASS to perform the verification of the whole prototype and test shared services such as the alarms, configurations, control systems, and scientific on-line outcomes. In our experience the use of code generation dramatically reduced the amount of effort in development, integration and testing of the more basic software components and resulted in a fast software release life cycle. This approach could be valuable for the whole CTA project, characterized by a large diversity of hardware components.

  10. FIELD SCALE MODELING TO ESTIMATE PHOSPHORUS AND SEDIMENT LOAD REDUCTIONS USING A NEWLY DEVELOPED GRAPHICAL USER INTERFACE FOR SOIL AND WATER ASSESSMENT TOOL

    Directory of Open Access Journals (Sweden)

    Aaron R. Mittelstet

    2012-01-01

    Streams throughout the North Canadian River watershed in northwest Oklahoma, USA have elevated levels of nutrients and sediment. The Soil and Water Assessment Tool (SWAT) was used to identify areas that likely contributed disproportionate amounts of phosphorus (P) and sediment to Lake Overholser, the receiving reservoir at the watershed outlet. These sites were then targeted by the Oklahoma Conservation Commission (OCC) to implement conservation practices, such as conservation tillage and pasture planting, as part of a US Environmental Protection Agency Section 319(h) project. Conservation practices were implemented on 238 fields. The objective of this project was to evaluate conservation practice effectiveness on these fields using the Texas Best Management Evaluation Tool (TBET), a simplified Graphical User Interface (GUI) for SWAT developed for field-scale application. TBET was applied on each field to predict the effects of conservation practice implementation on P and sediment loads. These predictions were used to evaluate the implementation cost (per kg of pollutant) associated with these reductions. Overall the implemented practices were predicted to reduce P loads to Lake Overholser by nine percent. The 'riparian exclusion' and 'riparian exclusion with buffer' practices provided the greatest reduction in P load, while 'conservation tillage' and 'converting wheat to bermuda grass' produced the largest reduction in sediment load. The most cost-efficient practices were 'converting wheat to bermuda grass' or 'native range' and 'riparian exclusion'. This project demonstrates the importance of conservation practice selection and evaluation prior to implementation in order to optimize cost share funds. In addition, this information may lead to the implementation of more cost-effective practices and an improvement in the overall effectiveness of water quality programs.

  11. The NOD3 software package: A graphical user interface-supported reduction package for single-dish radio continuum and polarisation observations

    Science.gov (United States)

    Müller, Peter; Krause, Marita; Beck, Rainer; Schmidt, Philip

    2017-10-01

    Context. The venerable NOD2 data reduction software package for single-dish radio continuum observations, which was developed for use at the 100-m Effelsberg radio telescope, has been successfully applied over many decades. Modern computing facilities, however, call for a new design. Aims: We aim to develop an interactive software tool with a graphical user interface for the reduction of single-dish radio continuum maps. We make a special effort to reduce the distortions along the scanning direction (scanning effects) by combining maps scanned in orthogonal directions or dual- or multiple-horn observations that need to be processed in a restoration procedure. The package should also process polarisation data and offer the possibility to include special tasks written by the individual user. Methods: Based on the ideas of the NOD2 package we developed NOD3, which includes all necessary tasks from the raw maps to the final maps in total intensity and linear polarisation. Furthermore, plot routines and several methods for map analysis are available. The NOD3 package is written in Python, which allows the extension of the package via additional tasks. The required data format for the input maps is FITS. Results: The NOD3 package is a sophisticated tool to process and analyse maps from single-dish observations that are affected by scanning effects from clouds, receiver instabilities, or radio-frequency interference. The "basket-weaving" tool combines orthogonally scanned maps into a final map that is almost free of scanning effects. The new restoration tool for dual-beam observations reduces the noise by a factor of about two compared to the NOD2 version. Combining single-dish with interferometer data in the map plane ensures the full recovery of the total flux density. Conclusions: This software package is available under the open source license GPL for free use at other single-dish radio telescopes of the astronomical community. The NOD3 package is designed to be

  12. Spatio-Temporal Multiway Data Decomposition Using Principal Tensor Analysis on k-Modes: The R Package PTAk

    Directory of Open Access Journals (Sweden)

    Didier G. Leibovici

    2010-10-01

    The purpose of this paper is to describe the R package PTAk and how the spatio-temporal context can be taken into account in the analyses. Essentially, PTAk is a multiway multidimensional method to decompose a multi-entry data array, seen mathematically as a tensor of any order. This PTAk-modes method proposes a way of generalizing SVD (singular value decomposition), as well as some other well-known methods included in the R package, such as PARAFAC or CANDECOMP and the PCAn-modes or Tucker-n model. The example datasets cover different domains with various spatio-temporal characteristics and issues: (i) medical imaging in neuropsychology with a functional MRI (magnetic resonance imaging) study, (ii) pharmaceutical research with a pharmacodynamic study with EEG (electro-encephalographic) data for a central nervous system (CNS) drug, and (iii) geographical information system (GIS) data with a climatic dataset that characterizes arid and semi-arid variations. All the methods implemented in the R package PTAk also support non-identity metrics, as well as penalizations during the optimization process. As a result of these flexibilities, together with pre-processing facilities, PTAk constitutes a framework for devising extensions of multidimensional methods such as correspondence analysis, discriminant analysis, and multidimensional scaling, also enabling spatio-temporal constraints.

  13. Graphical Independence Networks with the gRain Package for R

    Directory of Open Access Journals (Sweden)

    Soren Hojsgaard

    2012-01-01

    In this paper we present the R package gRain for propagation in graphical independence networks (of which Bayesian networks are a special instance). The paper includes a description of the theory behind the computations. The main part of the paper is an illustration of how to use the package. The paper also illustrates how to turn a graphical model and data into an independence network.
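
    A minimal propagation example in the spirit of the package's documented cptable/grain/querygrain workflow is sketched below; the three-node rain/sprinkler/wet network and its probabilities are invented for illustration and are not taken from the paper.

```r
## Small independence network built and queried with gRain (illustrative network).
library(gRain)
yn <- c("yes", "no")
r <- cptable(~rain, values = c(20, 80), levels = yn)
s <- cptable(~sprinkler | rain, values = c(1, 99, 40, 60), levels = yn)
w <- cptable(~wet | rain:sprinkler,
             values = c(99, 1, 90, 10, 80, 20, 1, 99), levels = yn)
net <- grain(compileCPT(list(r, s, w)))
querygrain(net, nodes = "wet")                     # marginal before evidence
net2 <- setEvidence(net, nodes = "wet", states = "yes")
querygrain(net2, nodes = "rain")                   # posterior given a wet lawn
```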

  14. NATURAL graphics

    Science.gov (United States)

    Jones, R. H.

    1984-01-01

    The hardware and software developments in computer graphics are discussed. Major topics include: system capabilities, hardware design, system compatibility, and software interface with the data base management system.

  15. Graphical passwords: a qualitative study of password patterns

    CSIR Research Space (South Africa)

    Vorster, J

    2015-03-01

    … focus on a quantitative analysis of graphical passwords. During this study users from commercial companies were asked to enter graphical passwords. These passwords were then analysed and patterns identified. Users were also asked what their password...

  16. FunctSNP: an R package to link SNPs to functional knowledge and dbAutoMaker: a suite of Perl scripts to build SNP databases

    Directory of Open Access Journals (Sweden)

    Watson-Haigh Nathan S

    2010-06-01

    Background: Whole genome association studies using highly dense single nucleotide polymorphisms (SNPs) are a set of methods to identify DNA markers associated with variation in a particular complex trait of interest. One of the main outcomes from these studies is a subset of statistically significant SNPs. Finding the potential biological functions of such SNPs can be an important step towards further use in human and agricultural populations (e.g., for identifying genes related to susceptibility to complex diseases or genes playing key roles in development or performance). The current challenge is that the information holding the clues to SNP functions is distributed across many different databases. Efficient bioinformatics tools are therefore needed to seamlessly integrate up-to-date functional information on SNPs. Many web services have arisen to meet the challenge but most work only within the framework of human medical research. Although we acknowledge the importance of human research, we identify that there is a need for SNP annotation tools for other organisms. Description: We introduce an R package called FunctSNP, which is the user interface to custom-built species-specific databases. The local relational databases contain SNP data together with functional annotations extracted from online resources. FunctSNP provides a unified bioinformatics resource to link SNPs with functional knowledge (e.g., genes, pathways, ontologies). We also introduce dbAutoMaker, a suite of Perl scripts, which can be scheduled to run periodically to automatically create/update the customised SNP databases. We illustrate the use of FunctSNP with a livestock example, but the approach and software tools presented here can also be applied to human and other organisms. Conclusions: Finding the potential functional significance of SNPs is important when further using the outcomes from whole genome association studies. FunctSNP is unique in that it is the only R

  17. Graphic Storytelling

    Science.gov (United States)

    Thompson, John

    2009-01-01

    Graphic storytelling is a medium that allows students to make and share stories, while developing their art communication skills. American comics today are more varied in genre, approach, and audience than ever before. When considering the impact of Japanese manga on the youth, graphic storytelling emerges as a powerful player in pop culture. In…

  19. Grid OCL: A Graphical Object Connecting Language

    Science.gov (United States)

    Taylor, I. J.; Schutz, B. F.

    In this paper, we present an overview of the Grid OCL graphical object connecting language. Grid OCL is an extension of Grid, introduced last year, that allows users to interactively build complex data processing systems by selecting a set of desired tools and connecting them together graphically. Algorithms written in this way can now also be run outside the graphical environment.

  20. Adhoc: an R package to calculate ad hoc distance thresholds for DNA barcoding identification

    Directory of Open Access Journals (Sweden)

    Gontran Sonet

    2013-12-01

    Identification by DNA barcoding is more likely to be erroneous when it is based on a large distance between the query (the barcode sequence of the specimen to identify) and its best match in a reference barcode library. The number of such false positive identifications can be decreased by setting a distance threshold above which identification has to be rejected. To this end, we proposed recently to use an ad hoc distance threshold producing identifications with an estimated relative error probability that can be fixed by the user (e.g., 5%). Here we introduce two R functions that automate the calculation of ad hoc distance thresholds for reference libraries of DNA barcodes. The scripts of both functions, a user manual and an example file are available on the JEMU website (http://jemu.myspecies.info/computer-programs) as well as on the Comprehensive R Archive Network (CRAN, http://cran.r-project.org).

  1. label.switching: An R Package for Dealing with the Label Switching Problem in MCMC Outputs

    Directory of Open Access Journals (Sweden)

    Panagiotis Papastamoulis

    2016-02-01

    Label switching is a well-known and fundamental problem in Bayesian estimation of mixture or hidden Markov models. If the prior distribution of the model parameters is the same for all states, then both the likelihood and posterior distribution are invariant to permutations of the parameters. This property makes Markov chain Monte Carlo (MCMC) samples simulated from the posterior distribution non-identifiable. In this paper, the label.switching package is introduced. It contains one probabilistic and seven deterministic relabeling algorithms in order to post-process a given MCMC sample, provided by the user. Each method returns a set of permutations that can be used to reorder the MCMC output. Then, any parametric function of interest can be inferred using the reordered MCMC sample. A set of user-defined permutations is also accepted, allowing the researcher to benchmark new relabeling methods against the available ones.
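
    The simplest way to see what relabeling achieves is an ordering constraint applied draw by draw, sketched below in base R on artificially switched output; this is only a conceptual illustration and is not one of the eight algorithms shipped in the package.

```r
## Undo label switching in a two-component mixture by ordering the means per draw.
set.seed(18)
draws <- cbind(mu1 = rnorm(1000, sample(c(-2, 2), 1000, TRUE), 0.1), mu2 = NA)
draws[, "mu2"] <- -draws[, "mu1"] + rnorm(1000, 0, 0.1)   # artificially switched labels
colMeans(draws)                     # raw means are meaningless (labels switch)
relabeled <- t(apply(draws, 1, sort))                     # impose mu1 < mu2 in each draw
colMeans(relabeled)                 # component means are now identifiable
```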

  2. FW: An R Package for Finlay-Wilkinson Regression that Incorporates Genomic/Pedigree Information and Covariance Structures Between Environments.

    Science.gov (United States)

    Lian, Lian; de Los Campos, Gustavo

    2015-12-29

    The Finlay-Wilkinson regression (FW) is a popular method among plant breeders to describe genotype by environment interaction. The standard implementation is a two-step procedure that uses environment (sample) means as covariates in a within-line ordinary least squares (OLS) regression. This procedure can be suboptimal for at least four reasons: (1) in the first step environmental means are typically estimated without considering genetic-by-environment interactions, (2) in the second step uncertainty about the environmental means is ignored, (3) estimation is performed regarding lines and environment as fixed effects, and (4) the procedure does not incorporate genetic (either pedigree-derived or marker-derived) relationships. Su et al. proposed to address these problems using a Bayesian method that allows simultaneous estimation of environmental and genotype parameters, and allows incorporation of pedigree information. In this article we: (1) extend the model presented by Su et al. to allow integration of genomic information [e.g., single nucleotide polymorphism (SNP)] and covariance between environments, (2) present an R package (FW) that implements these methods, and (3) illustrate the use of the package using examples based on real data. The FW R package implements both the two-step OLS method and a full Bayesian approach for Finlay-Wilkinson regression with a very simple interface. Using a real wheat data set we demonstrate that the prediction accuracy of the Bayesian approach is consistently higher than the one achieved by the two-step OLS method. Copyright © 2016 Lian and Campos.
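
    The classical two-step OLS procedure criticised above is short enough to sketch in base R: estimate environment indices as centred environment means, then regress each line's phenotype on that index. The simulated data and effect sizes are assumptions; the package's Bayesian model is not reproduced here.

```r
## Two-step OLS Finlay-Wilkinson regression on simulated line-by-environment data.
set.seed(13)
dat <- expand.grid(line = paste0("G", 1:10), env = paste0("E", 1:8))
env_eff <- rnorm(8, 0, 2)[as.integer(dat$env)]        # assumed environment effects
sens    <- runif(10, 0.5, 1.5)[as.integer(dat$line)]  # assumed line sensitivities
dat$y   <- 5 + sens * env_eff + rnorm(nrow(dat), 0, 0.5)

## Step 1: environment index = centred environment mean
h <- with(dat, tapply(y, env, mean)) - mean(dat$y)
dat$h <- h[as.character(dat$env)]

## Step 2: within-line regression of phenotype on the environment index
fw <- lapply(split(dat, dat$line), function(d) coef(lm(y ~ h, data = d)))
do.call(rbind, fw)      # per-line intercept and FW sensitivity (slope)
```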

  3. Graphic Review

    DEFF Research Database (Denmark)

    Breiting, Søren

    2002-01-01

    Introduction to 'graphic review' as a method for carrying understanding from one teaching session to the next in teacher education and primary school.

  4. Graphics gems

    CERN Document Server

    Glassner, Andrew S

    1993-01-01

    ""The GRAPHICS GEMS Series"" was started in 1990 by Andrew Glassner. The vision and purpose of the Series was - and still is - to provide tips, techniques, and algorithms for graphics programmers. All of the gems are written by programmers who work in the field and are motivated by a common desire to share interesting ideas and tools with their colleagues. Each volume provides a new set of innovative solutions to a variety of programming problems.

  5. pdc: An R Package for Complexity-Based Clustering of Time Series

    Directory of Open Access Journals (Sweden)

    Andreas M. Brandmaier

    2015-10-01

    Permutation distribution clustering is a complexity-based approach to clustering time series. The dissimilarity of time series is formalized as the squared Hellinger distance between the permutation distribution of embedded time series. The resulting distance measure has linear time complexity, is invariant to phase and monotonic transformations, and is robust to outliers. A probabilistic interpretation allows the determination of the number of significantly different clusters. An entropy-based heuristic relieves the user of the need to choose the parameters of the underlying time-delayed embedding manually and, thus, makes it possible to regard the approach as parameter-free. This approach is illustrated with examples on empirical data.
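
    The dissimilarity itself is simple to prototype in base R: embed each series, tabulate the ordinal (permutation) patterns, and compare the two pattern distributions with the squared Hellinger distance. The embedding dimension below is an assumption and the sketch is not the pdc implementation.

```r
## Permutation-distribution dissimilarity between two time series (illustrative).
perm_dist <- function(x, m = 3) {                    # m: assumed embedding dimension
  codes <- sapply(seq_len(length(x) - m + 1),
                  function(i) paste(order(x[i:(i + m - 1)]), collapse = ""))
  tab <- table(codes)
  tab / sum(tab)                                     # relative pattern frequencies
}
hellinger2 <- function(p, q) {                       # squared Hellinger distance
  lev <- union(names(p), names(q))
  pv <- p[lev]; qv <- q[lev]
  pv[is.na(pv)] <- 0; qv[is.na(qv)] <- 0
  0.5 * sum((sqrt(pv) - sqrt(qv))^2)
}
set.seed(14)
a <- sin(seq(0, 20, 0.1)) + rnorm(201, 0, 0.1)       # structured series
b <- rnorm(201)                                      # white noise
hellinger2(perm_dist(a), perm_dist(b))               # dissimilarity of the two series
```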

  7. R-HYPE - an open R-package for management and evaluation of HYPE-data

    Science.gov (United States)

    Capell, Rene; Strömbäck, Lena; Gustafsson, David

    2015-04-01

    The Hydrological Predictions for the Environment (HYPE) model is a dynamic, semi-distributed, process-based, integrated catchment model. It uses well-known hydrological and nutrient transport concepts and can be applied for both small and large scale assessments of water resources and status. The model uses a sub-catchment approach to discretize the model domain. Within sub-catchments, a hydrological response unit (HRU) approach is used to calculate the model response. HYPE source code and tools are available through the HYPE Open Source Community (OSC) website (hype.sourceforge.net). HYPE code is released by the SMHI under the Lesser GNU Public License to strengthen international collaboration in hydrological modelling and hydrological data production. New versions of the main code are delivered frequently as new versions of the HYPE model are developed. HYPE OSC is open to everyone interested in hydrology, hydrological modelling and code development - e.g. scientists, authorities, and consultancies. To support users, HYPE OSC contains manuals, wiki-pages, sample models and tools. One such tool is R-HYPE. R-HYPE is an R add-on package, providing a continuously growing set of functions to support and simplify processing files and analyzing results of the HYPE model. HYPE is typically used in large scale model set-ups which involve large data sets. Analysing such data requires recurring data handling operations and analysis steps. R-HYPE strives to simplify these frequently needed analysis steps in order to support users in their specific research tasks. The current version of the package contains functionality for: - Importing HYPE model set-up and data files as well as model result files into an R workspace - Analyzing and manipulating a HYPE model set-up - Identifying and analyzing model sub-sets or specific sub-catchments - Analysing and manipulating forcing data sets - Plotting results - Exporting data from R into HYPE files R-HYPE is currently hosted on GitHub (https

  8. PtProcess: An R Package for Modelling Marked Point Processes Indexed by Time

    Directory of Open Access Journals (Sweden)

    David Harte

    2010-10-01

    This paper describes the package PtProcess which uses the R statistical language. The package provides a unified approach to fitting and simulating a wide variety of temporal point process or temporal marked point process models. The models are specified by an intensity function which is conditional on the history of the process. The user needs to provide routines for calculating the conditional intensity function. Then the package enables one to carry out maximum likelihood fitting, goodness of fit testing, simulation and comparison of models. The package includes the routines for the conditional intensity functions for a variety of standard point process models. The package is intended to simplify the fitting of point process models indexed by time in much the same way as generalized linear model programs have simplified the fitting of various linear models. The primary examples used in this paper are earthquake sequences but the package is intended to have a much wider applicability.
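
    To make "conditional intensity" concrete, the sketch below writes the intensity and log-likelihood of a simple self-exciting (Hawkes-type) process with exponential decay in base R and maximises it with optim; the toy event times and starting values are assumptions, and this is not the PtProcess interface.

```r
## Conditional intensity and ML fit of a simple Hawkes-type process (illustrative).
set.seed(15)
times <- sort(runif(100, 0, 100))                  # observed event times (toy data)
lambda <- function(t, hist, mu, alpha, beta)       # conditional intensity at time t
  mu + alpha * sum(exp(-beta * (t - hist[hist < t])))
loglik <- function(par, times, Tend) {
  mu <- par[1]; alpha <- par[2]; beta <- par[3]
  li <- sapply(seq_along(times),
               function(i) lambda(times[i], times[seq_len(i - 1)], mu, alpha, beta))
  comp <- mu * Tend + (alpha / beta) * sum(1 - exp(-beta * (Tend - times)))
  sum(log(li)) - comp                              # log-likelihood = sum log λ - ∫ λ
}
optim(c(0.5, 0.2, 1), function(p) -loglik(p, times, 100),
      method = "L-BFGS-B", lower = rep(1e-4, 3))$par
```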

  9. SAFE(R): A Matlab/Octave Toolbox (and R Package) for Global Sensitivity Analysis

    Science.gov (United States)

    Pianosi, Francesca; Sarrazin, Fanny; Gollini, Isabella; Wagener, Thorsten

    2015-04-01

    Global Sensitivity Analysis (GSA) is increasingly used in the development and assessment of hydrological models, as well as for dominant control analysis and for scenario discovery to support water resource management under deep uncertainty. Here we present a toolbox for the application of GSA, called SAFE (Sensitivity Analysis For Everybody), that implements several established GSA methods, including the method of Morris, Regional Sensitivity Analysis, variance-based sensitivity analysis (Sobol') and FAST. It also includes new approaches and visualization tools to complement these established methods. The toolbox is released in two versions, one running under Matlab/Octave (called SAFE) and one running in R (called SAFER). Thanks to its modular structure, SAFE(R) can be easily integrated with other toolboxes and packages, and with models running in a different computing environment. Another interesting feature of SAFE(R) is that all the implemented methods include specific functions for assessing the robustness and convergence of the sensitivity estimates. Furthermore, SAFE(R) includes numerous visualisation tools for the effective investigation and communication of GSA results. The toolbox is designed to make GSA accessible to non-specialist users, and to provide fully commented code for more experienced users to complement their own tools. The documentation includes a set of workflow scripts with practical guidelines on how to apply GSA and how to use the toolbox. SAFE(R) is open source and freely available from the following website: http://bristol.ac.uk/cabot/resources/safe-toolbox/. Ultimately, SAFE(R) aims at improving the diffusion and quality of GSA practice in the hydrological modelling community.
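
    A stripped-down elementary-effects screening, in the spirit of the method of Morris mentioned above, can be written in a few lines of base R; the toy model, perturbation size and number of repetitions below are assumptions, and the sketch does not use the SAFE(R) toolbox.

```r
## One-at-a-time elementary effects for a toy three-input model (illustrative).
set.seed(16)
model <- function(x) sin(x[1]) + 2 * x[2]^2 + 0.1 * x[3]   # assumed toy model
k <- 3; r <- 50; delta <- 0.1                              # inputs, repetitions, step
ee <- matrix(NA, r, k)
for (j in seq_len(r)) {
  x0 <- runif(k)                                           # random base point in [0,1]^k
  for (i in seq_len(k)) {
    x1 <- x0; x1[i] <- x1[i] + delta
    ee[j, i] <- (model(x1) - model(x0)) / delta            # elementary effect of input i
  }
}
colMeans(abs(ee))      # mu*: screening measure of input importance
```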

  10. Graphics gems

    CERN Document Server

    Heckbert, Paul S

    1994-01-01

    Graphics Gems IV contains practical techniques for 2D and 3D modeling, animation, rendering, and image processing. The book presents articles on polygons and polyhedra; a mix of formulas, optimized algorithms, and tutorial information on the geometry of 2D, 3D, and n-D space; transformations; and parametric curves and surfaces. The text also includes articles on ray tracing; shading 3D models; and frame buffer techniques. Articles on image processing; algorithms for graphical layout; basic interpolation methods; and subroutine libraries for vector and matrix algebra are also demonstrated. Com

  11. Graphic Ecologies

    Directory of Open Access Journals (Sweden)

    Brook Weld Muller

    2014-12-01

    This essay describes strategic approaches to graphic representation associated with critical environmental engagement and that build from the idea of works of architecture as stitches in the ecological fabric of the city. It focuses on the building up of partial or fragmented graphics in order to describe inclusive, open-ended possibilities for making architecture that marry rich experience and responsive performance. An aphoristic approach to crafting drawings involves complex layering, conscious absence and the embracing of tension. A self-critical attitude toward the generation of imagery characterized by the notion of ‘loose precision’ may lead to more transformative and environmentally responsive architectures.

  12. Design and Implementation of an Embedded Graphic User Interface

    Institute of Scientific and Technical Information of China (English)

    潘宁喦; 张萌; 王超

    2012-01-01

    To meet the user-interface design requirements of embedded systems, this paper presents the design of an embedded graphical user interface system, SKY-GUI, and its implementation on embedded Linux. SKY-GUI consists of four main components: an input abstraction layer, a display abstraction layer, an event system and a window system. Its features are an attractive interface, low resource consumption and high runtime efficiency, and it has already been applied in an embedded video surveillance project. Experiments show that the design is feasible, performs well and is suitable for typical embedded system projects.

  13. SubVis: an interactive R package for exploring the effects of multiple substitution matrices on pairwise sequence alignment

    Directory of Open Access Journals (Sweden)

    Scott Barlowe

    2017-06-01

    Understanding how proteins mutate is critical to solving a host of biological problems. Mutations occur when an amino acid is substituted for another in a protein sequence. The set of likelihoods for amino acid substitutions is stored in a matrix and input to alignment algorithms. The quality of the resulting alignment is used to assess the similarity of two or more sequences and can vary according to assumptions modeled by the substitution matrix. Substitution strategies with minor parameter variations are often grouped together in families. For example, the BLOSUM and PAM matrix families are commonly used because they provide a standard, predefined way of modeling substitutions. However, researchers often do not know if a given matrix family or any individual matrix within a family is the most suitable. Furthermore, predefined matrix families may inaccurately reflect a particular hypothesis that a researcher wishes to model or otherwise result in unsatisfactory alignments. In these cases, the ability to compare the effects of one or more custom matrices may be needed. This laborious process is often performed manually because the ability to simultaneously load multiple matrices and then compare their effects on alignments is not readily available in current software tools. This paper presents SubVis, an interactive R package for loading and applying multiple substitution matrices to pairwise alignments. Users can simultaneously explore alignments resulting from multiple predefined and custom substitution matrices. SubVis utilizes several of the alignment functions found in R, a common language among protein scientists. Functions are tied together with the Shiny platform which allows the modification of input parameters. Information regarding alignment quality and individual amino acid substitutions is displayed with the JavaScript language which provides interactive visualizations for revealing both high-level and low-level alignment information.

  14. SubVis: an interactive R package for exploring the effects of multiple substitution matrices on pairwise sequence alignment

    Science.gov (United States)

    Coan, Heather B.; Youker, Robert T.

    2017-01-01

    Understanding how proteins mutate is critical to solving a host of biological problems. Mutations occur when an amino acid is substituted for another in a protein sequence. The set of likelihoods for amino acid substitutions is stored in a matrix and input to alignment algorithms. The quality of the resulting alignment is used to assess the similarity of two or more sequences and can vary according to assumptions modeled by the substitution matrix. Substitution strategies with minor parameter variations are often grouped together in families. For example, the BLOSUM and PAM matrix families are commonly used because they provide a standard, predefined way of modeling substitutions. However, researchers often do not know if a given matrix family or any individual matrix within a family is the most suitable. Furthermore, predefined matrix families may inaccurately reflect a particular hypothesis that a researcher wishes to model or otherwise result in unsatisfactory alignments. In these cases, the ability to compare the effects of one or more custom matrices may be needed. This laborious process is often performed manually because the ability to simultaneously load multiple matrices and then compare their effects on alignments is not readily available in current software tools. This paper presents SubVis, an interactive R package for loading and applying multiple substitution matrices to pairwise alignments. Users can simultaneously explore alignments resulting from multiple predefined and custom substitution matrices. SubVis utilizes several of the alignment functions found in R, a common language among protein scientists. Functions are tied together with the Shiny platform which allows the modification of input parameters. Information regarding alignment quality and individual amino acid substitutions is displayed with the JavaScript language which provides interactive visualizations for revealing both high-level and low-level alignment information. PMID:28674656
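
    The kind of comparison SubVis automates can be approximated directly with the Bioconductor package Biostrings; the sketch below is a generic illustration, not SubVis code, and the two matrices are standard data sets shipped with Biostrings.

        library(Biostrings)              # provides pairwiseAlignment() and standard substitution matrices
        data(BLOSUM62); data(PAM250)     # two predefined matrices to compare
        p <- AAString("HEAGAWGHEE")
        s <- AAString("PAWHEAE")
        for (m in c("BLOSUM62", "PAM250")) {
          aln <- pairwiseAlignment(p, s, substitutionMatrix = get(m),
                                   gapOpening = 10, gapExtension = 0.5, type = "local")
          cat(m, "score:", score(aln), "\n")   # how the alignment score shifts with the matrix
        }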

  15. FluxSimulator: An R Package to Simulate Isotopomer Distributions in Metabolic Networks

    Directory of Open Access Journals (Sweden)

    Thomas W. Binsl

    2007-01-01

    The representation of biochemical knowledge in terms of fluxes (transformation rates) in a metabolic network is often a crucial step in the development of new drugs and efficient bioreactors. Mass spectroscopy (MS) and nuclear magnetic resonance spectroscopy (NMRS) in combination with 13C labeled substrates are experimental techniques resulting in data that may be used to quantify fluxes in the metabolic network underlying a process. The massive amount of data generated by spectroscopic experiments increasingly requires software which models the dynamics of the underlying biological system. In this work we present an approach to handle isotopomer distributions in metabolic networks using an object-oriented programming approach, implemented using S4 classes in R. The developed package is called FluxSimulator and provides a user friendly interface to specify the topological information of the metabolic network as well as carbon atom transitions in plain text files. The package automatically derives the mathematical representation of the formulated network, and assembles a set of ordinary differential equations (ODEs) describing the change of each isotopomer pool over time. These ODEs are subsequently solved numerically. In a case study FluxSimulator was applied to an example network. Our results indicate that the package is able to reproduce exact changes in isotopomer compositions of the metabolite pools over time at given flux rates.
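
    FluxSimulator assembles and solves such ODE systems automatically from the network description; as a hand-written stand-in (not the package's interface), a minimal two-pool 13C labeling system can be solved with the deSolve package:

        library(deSolve)   # generic ODE solver; the model below is a toy, not a FluxSimulator network
        # Fraction of 13C-labeled molecules in pools A and B, with flux v from A to B,
        # pool sizes cA and cB, and a fully labeled substrate feeding A.
        labeling <- function(t, y, p) {
          with(as.list(c(y, p)), {
            dA <- v / cA * (1 - A)   # labeled inflow gradually replaces unlabeled A
            dB <- v / cB * (A - B)   # B becomes labeled at the rate set by A
            list(c(dA, dB))
          })
        }
        out <- ode(y = c(A = 0, B = 0), times = seq(0, 60, by = 1),
                   func = labeling, parms = c(v = 1, cA = 5, cB = 10))
        head(out)   # labeling fractions over time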

  16. MortalitySmooth: An R Package for Smoothing Poisson Counts with P-Splines

    Directory of Open Access Journals (Sweden)

    Carlo G. Camarda

    2012-07-01

    The MortalitySmooth package provides a framework for smoothing count data in both one- and two-dimensional settings. Although general in its purposes, the package is specifically tailored to demographers, actuaries, epidemiologists, and geneticists who may be interested in using a practical tool for smoothing mortality data over ages and/or years. The total number of deaths over a specified age- and year-interval is assumed to be Poisson-distributed, and P-splines and generalized linear array models are employed as a suitable regression methodology. Extra-Poisson variation can also be accommodated. Structured in an S3 object orientation system, MortalitySmooth has two main functions which fit the data and define two classes of objects: Mort1Dsmooth and Mort2Dsmooth. The methods for these classes (print, summary, plot, predict, and residuals) are also included. These features make it easy for users to extract and manipulate the outputs. In addition, a collection of mortality data is provided. This paper provides an overview of the design, aims, and principles of MortalitySmooth, as well as strategies for applying it and extending its use.
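
    A minimal call might look as follows; the argument names (x, y, offset) are an assumption based on the interface described above, and the data are simulated rather than taken from the package's own collection.

        library(MortalitySmooth)
        ages   <- 40:100
        expos  <- rep(1e4, length(ages))                     # person-years at risk (toy values)
        rates  <- exp(-9 + 0.09 * (ages - 40))               # Gompertz-like toy hazard
        deaths <- rpois(length(ages), expos * rates)         # Poisson death counts
        fit <- Mort1Dsmooth(x = ages, y = deaths, offset = log(expos))   # argument names assumed
        summary(fit)
        plot(fit)   # fitted log-mortality over age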

  17. Graphic notation

    DEFF Research Database (Denmark)

    Bergstrøm-Nielsen, Carl

    2010-01-01

    Graphic notation is taught to music therapy students at Aalborg University in both simple and elaborate forms. This is a method of depicting music visually, and notations may serve as memory aids, as aids for analysis and reflection, and for communication purposes such as supervision or within...

  18. Coulomb 3.3 Graphic-rich deformation and stress-change software for earthquake, tectonic, and volcano research and teaching-user guide

    Science.gov (United States)

    Toda, Shingi; Stein, Ross S.; Sevilgen, Volkan; Lin, Jian

    2011-01-01

    Coulomb is intended both for publication-directed research and for college and graduate school classroom instruction. We believe that one learns best when one can see the most and can explore alternatives quickly. So the principal feature of Coulomb is ease of input, rapid interactive modification, and intuitive visualization of the results. The program has menus and check-items, and dialogue boxes to ease operation. The internal graphics are suitable for publication, and can be easily imported into Illustrator, GMT, Google Earth, or Flash for further enhancements.

  19. rEHR: An R package for manipulating and analysing Electronic Health Record data

    Science.gov (United States)

    Springate, David A.; Parisi, Rosa; Olier, Ivan; Reeves, David

    2017-01-01

    Research with structured Electronic Health Records (EHRs) is expanding as data becomes more accessible; analytic methods advance; and the scientific validity of such studies is increasingly accepted. However, data science methodology to enable the rapid searching/extraction, cleaning and analysis of these large, often complex, datasets is less well developed. In addition, commonly used software is inadequate, resulting in bottlenecks in research workflows and in obstacles to increased transparency and reproducibility of the research. Preparing a research-ready dataset from EHRs is a complex and time-consuming task requiring substantial data science skills, even for simple designs. In addition, certain aspects of the workflow are computationally intensive, for example extraction of longitudinal data and matching controls to a large cohort, which may take days or even weeks to run using standard software. The rEHR package simplifies and accelerates the process of extracting ready-for-analysis datasets from EHR databases. It has a simple import function to a database backend that greatly accelerates data access times. A set of generic query functions allow users to extract data efficiently without needing detailed knowledge of SQL queries. Longitudinal data extractions can also be made in a single command, making use of parallel processing. The package also contains functions for cutting data by time-varying covariates, matching controls to cases, unit conversion and construction of clinical code lists. There are also functions to synthesise dummy EHR data. The package has been tested with one of the largest primary care EHRs, the Clinical Practice Research Datalink (CPRD), but allows for a common interface to other EHRs. This simplified and accelerated workflow for EHR data extraction results in simpler, cleaner scripts that are more easily debugged, shared and reproduced. PMID:28231289

  20. rEHR: An R package for manipulating and analysing Electronic Health Record data.

    Science.gov (United States)

    Springate, David A; Parisi, Rosa; Olier, Ivan; Reeves, David; Kontopantelis, Evangelos

    2017-01-01

    Research with structured Electronic Health Records (EHRs) is expanding as data becomes more accessible; analytic methods advance; and the scientific validity of such studies is increasingly accepted. However, data science methodology to enable the rapid searching/extraction, cleaning and analysis of these large, often complex, datasets is less well developed. In addition, commonly used software is inadequate, resulting in bottlenecks in research workflows and in obstacles to increased transparency and reproducibility of the research. Preparing a research-ready dataset from EHRs is a complex and time-consuming task requiring substantial data science skills, even for simple designs. In addition, certain aspects of the workflow are computationally intensive, for example extraction of longitudinal data and matching controls to a large cohort, which may take days or even weeks to run using standard software. The rEHR package simplifies and accelerates the process of extracting ready-for-analysis datasets from EHR databases. It has a simple import function to a database backend that greatly accelerates data access times. A set of generic query functions allow users to extract data efficiently without needing detailed knowledge of SQL queries. Longitudinal data extractions can also be made in a single command, making use of parallel processing. The package also contains functions for cutting data by time-varying covariates, matching controls to cases, unit conversion and construction of clinical code lists. There are also functions to synthesise dummy EHR data. The package has been tested with one of the largest primary care EHRs, the Clinical Practice Research Datalink (CPRD), but allows for a common interface to other EHRs. This simplified and accelerated workflow for EHR data extraction results in simpler, cleaner scripts that are more easily debugged, shared and reproduced.
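
    rEHR's own import and query helpers are not reproduced here; the generic DBI/RSQLite sketch below merely illustrates the database-backend pattern that the package wraps, accelerates and hides behind higher-level functions.

        library(DBI); library(RSQLite)   # generic sketch of the backend workflow, not the rEHR API
        con <- dbConnect(SQLite(), "ehr_demo.sqlite")
        clinical <- data.frame(patid = c(1, 1, 2),
                               medcode = c(101, 205, 101),
                               eventdate = c("2001-03-02", "2004-07-19", "2003-01-11"))
        dbWriteTable(con, "clinical", clinical, overwrite = TRUE)
        codelist <- c(101)               # a clinical code list, here a single code
        res <- dbGetQuery(con, sprintf("SELECT * FROM clinical WHERE medcode IN (%s)",
                                       paste(codelist, collapse = ",")))
        dbDisconnect(con)
        res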

  1. Raster graphics display library

    Science.gov (United States)

    Grimsrud, Anders; Stephenson, Michael B.

    1987-01-01

    The Raster Graphics Display Library (RGDL) is a high level subroutine package that gives the advanced raster graphics display capabilities needed. The RGDL uses FORTRAN source code routines to build subroutines modular enough to use as stand-alone routines in a black box type of environment. Six examples are presented which will teach the use of RGDL in the fastest, most complete way possible. Routines within the display library that are used to produce raster graphics are presented in alphabetical order, each on a separate page. Each user-callable routine is described by function and calling parameters. All common blocks that are used in the display library are listed and the use of each variable within each common block is discussed. A reference on the include files that are necessary to compile the display library is included. Each include file and its purpose are listed. The link map for MOVIE.BYU version 6, a general purpose computer graphics display system that uses RGDL software, is also contained.

  2. Resurfacing Graphics

    Directory of Open Access Journals (Sweden)

    Prof. Patty K. Wongpakdee

    2013-06-01

    “Resurfacing Graphics” deals with the subject of unconventional design, with the purpose of engaging the viewer to experience the graphics beyond paper’s passive surface. Unconventional designs serve to reinvigorate people, whose senses are dulled by the typical, printed graphics, which bombard them each day. Today’s cutting-edge designers, illustrators and artists utilize graphics in a unique manner that allows for tactile interaction. Such works serve as valuable teaching models and encourage students to do the following: 1) investigate the trans-disciplines of art and technology; 2) appreciate that this approach can have a positive effect on the environment; 3) examine and research other approaches of design communications and 4) utilize new mediums to stretch the boundaries of artistic endeavor. This paper examines how visual communicators are “Resurfacing Graphics” by using atypical surfaces and materials such as textile, wood, ceramics and even water. Such non-traditional transmissions of visual language serve to demonstrate students’ overreliance on paper as an outdated medium. With this exposure, students can become forward-thinking, eco-friendly, creative leaders by expanding their creative breadth and continuing the perpetual exploration for new ways to make their mark.

  4. pcaGoPromoter--an R package for biological and regulatory interpretation of principal components in genome-wide gene expression data

    DEFF Research Database (Denmark)

    Hansen, Morten; Gerds, Thomas Alexander; Nielsen, Ole Haagen

    2012-01-01

    Analyzing data obtained from genome-wide gene expression experiments is challenging due to the quantity of variables, the need for multivariate analyses, and the demands of managing large amounts of data. Here we present the R package pcaGoPromoter, which facilitates the interpretation of genome-...
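
    A generic first step of the kind of analysis the package interprets, plain principal component analysis of an expression matrix, can be run in base R (this is not the pcaGoPromoter interface):

        set.seed(1)
        expr <- matrix(rnorm(1000 * 12), nrow = 1000,
                       dimnames = list(paste0("gene", 1:1000), paste0("sample", 1:12)))
        pca <- prcomp(t(expr), scale. = TRUE)        # samples as observations, genes as variables
        summary(pca)$importance[, 1:3]               # variance explained by the first components
        plot(pca$x[, 1:2])                           # sample scores on PC1 and PC2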

  5. The R Package MitISEM: Mixture of Student-t Distributions using Importance Sampling Weighted Expectation Maximization for Efficient and Robust Simulation

    NARCIS (Netherlands)

    N. Basturk (Nalan); L.F. Hoogerheide (Lennart); A. Opschoor (Anne); H.K. van Dijk (Herman)

    2012-01-01

    This paper presents the R package MitISEM, which provides an automatic and flexible method to approximate a non-elliptical target density using adaptive mixtures of Student-t densities, where only a kernel of the target density is required. The approximation can be used as a candidate de

  6. Recursive filtering for zero offset correction of diving depth time series with GNU R package diveMove.

    Directory of Open Access Journals (Sweden)

    Sebastián P Luque

    Zero offset correction of diving depth measured by time-depth recorders is required to remove artifacts arising from temporal changes in accuracy of pressure transducers. Currently used methods for this procedure are in the proprietary software domain, where researchers cannot study it in sufficient detail, so they have little or no control over how their data were changed. GNU R package diveMove implements a procedure in the Free Software domain that consists of recursively smoothing and filtering the input time series using moving quantiles. This paper describes, demonstrates, and evaluates the proposed method by using a "perfect" data set, which is subsequently corrupted to provide input for the proposed procedure. The method is evaluated by comparing the corrected time series to the original, uncorrupted, data set from an Antarctic fur seal (Arctocephalus gazella Peters, 1875). The Root Mean Square Error of the corrected data set, relative to the "perfect" data set, was nearly identical to the magnitude of noise introduced into the latter. The method, thus, provides a flexible, reliable, and efficient mechanism to perform zero offset correction for analyses of diving behaviour. We illustrate applications of the method to data sets from four species with large differences in diving behaviour, measured using different sampling protocols and instrument characteristics.
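
    A hedged sketch of the workflow follows; the function and argument names (createTDR, calibrateDepth, zoc.method = "filter", k, probs) are assumed from the package documentation, and the depth series is a toy simulation rather than real tag data.

        library(diveMove)
        n     <- 2000
        tt    <- as.POSIXct("2020-01-01 00:00:00", tz = "GMT") + seq(0, by = 5, length.out = n)
        depth <- pmax(0, 30 * sin(seq(0, 20 * pi, length.out = n))) + 1 + rnorm(n, 0, 0.2)
        tdr   <- createTDR(time = tt, depth = depth, dtime = 5, file = "toy")
        dcal  <- calibrateDepth(tdr, zoc.method = "filter",
                                k = c(3, 360), probs = c(0.5, 0.02),   # window widths and quantiles of the recursive filter
                                dive.thr = 4)                          # minimum depth (m) defining a dive
        dcal   # calibrated object with the zero-offset-corrected depth series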

  7. HPOSim: an R package for phenotypic similarity measure and enrichment analysis based on the human phenotype ontology.

    Science.gov (United States)

    Deng, Yue; Gao, Lin; Wang, Bingbo; Guo, Xingli

    2015-01-01

    Phenotypic features associated with genes and diseases play an important role in disease-related studies and most of the available methods focus solely on the Online Mendelian Inheritance in Man (OMIM) database without considering the controlled vocabulary. The Human Phenotype Ontology (HPO) provides a standardized and controlled vocabulary covering phenotypic abnormalities in human diseases, and becomes a comprehensive resource for computational analysis of human disease phenotypes. Most of the existing HPO-based software tools cannot be used offline and provide only few similarity measures. Therefore, there is a critical need for developing a comprehensive and offline software for phenotypic features similarity based on HPO. HPOSim is an R package for analyzing phenotypic similarity for genes and diseases based on HPO data. Seven commonly used semantic similarity measures are implemented in HPOSim. Enrichment analysis of gene sets and disease sets are also implemented, including hypergeometric enrichment analysis and network ontology analysis (NOA). HPOSim can be used to predict disease genes and explore disease-related function of gene modules. HPOSim is open source and freely available at SourceForge (https://sourceforge.net/p/hposim/).

  8. EntropyExplorer: an R package for computing and comparing differential Shannon entropy, differential coefficient of variation and differential expression.

    Science.gov (United States)

    Wang, Kai; Phillips, Charles A; Saxton, Arnold M; Langston, Michael A

    2015-12-30

    Differential Shannon entropy (DSE) and differential coefficient of variation (DCV) are effective metrics for the study of gene expression data. They can serve to augment differential expression (DE), and be applied in numerous settings whenever one seeks to measure differences in variability rather than mere differences in magnitude. A general purpose, easily accessible tool for DSE and DCV would help make these two metrics available to data scientists. Automated p value computations would additionally be useful, and are often easier to interpret than raw test statistic values alone. EntropyExplorer is an R package for calculating DSE, DCV and DE. It also computes corresponding p values for each metric. All features are available through a single R function call. Based on extensive investigations in the literature, the Fligner-Killeen test was chosen to compute DCV p values. No standard method was found to be appropriate for DSE, and so permutation testing is used to calculate DSE p values. EntropyExplorer provides a convenient resource for calculating DSE, DCV, DE and associated p values. The package, along with its source code and reference manual, are freely available from the CRAN public repository at http://cran.r-project.org/web/packages/EntropyExplorer/index.html.
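
    The two variability metrics can be illustrated in a few lines of base R; this is not the package's own estimator (discretizing continuous values into bins is only one common way to estimate Shannon entropy), but it shows what DSE and DCV measure.

        shannon <- function(x, bins = 10) {
          p <- table(cut(x, bins)) / length(x)   # discretize expression values, estimate probabilities
          p <- p[p > 0]
          -sum(p * log2(p))
        }
        set.seed(42)
        cond1 <- rnorm(50, mean = 8, sd = 1.0)   # toy expression values for one gene, condition 1
        cond2 <- rnorm(50, mean = 8, sd = 2.5)   # same mean, larger variability in condition 2
        dse <- shannon(cond2) - shannon(cond1)
        dcv <- sd(cond2) / mean(cond2) - sd(cond1) / mean(cond1)
        c(DSE = dse, DCV = dcv)                  # differences in variability despite equal means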

  9. pulver: an R package for parallel ultra-rapid p-value computation for linear regression interaction terms.

    Science.gov (United States)

    Molnos, Sophie; Baumbach, Clemens; Wahl, Simone; Müller-Nurasyid, Martina; Strauch, Konstantin; Wang-Sattler, Rui; Waldenberger, Melanie; Meitinger, Thomas; Adamski, Jerzy; Kastenmüller, Gabi; Suhre, Karsten; Peters, Annette; Grallert, Harald; Theis, Fabian J; Gieger, Christian

    2017-09-29

    Genome-wide association studies allow us to understand the genetics of complex diseases. Human metabolism provides information about the disease-causing mechanisms, so it is usual to investigate the associations between genetic variants and metabolite levels. However, only considering genetic variants and their effects on one trait ignores the possible interplay between different "omics" layers. Existing tools only consider single-nucleotide polymorphism (SNP)-SNP interactions, and no practical tool is available for large-scale investigations of the interactions between pairs of arbitrary quantitative variables. We developed an R package called pulver to compute p-values for the interaction term in a very large number of linear regression models. Comparisons based on simulated data showed that pulver is much faster than the existing tools. This is achieved by using the correlation coefficient to test the null-hypothesis, which avoids the costly computation of inversions. Additional tricks are a rearrangement of the order, when iterating through the different "omics" layers, and implementing this algorithm in the fast programming language C++. Furthermore, we applied our algorithm to data from the German KORA study to investigate a real-world problem involving the interplay among DNA methylation, genetic variants, and metabolite levels. The pulver package is a convenient and rapid tool for screening huge numbers of linear regression models for significant interaction terms in arbitrary pairs of quantitative variables. pulver is written in R and C++, and can be downloaded freely from CRAN at https://cran.r-project.org/web/packages/pulver/ .
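
    For orientation, the naive screen that pulver accelerates looks like the loop below (illustration only; pulver obtains the same p-values without fitting each model explicitly):

        set.seed(1)
        n <- 200
        X <- matrix(rnorm(n * 3), n, 3)   # e.g., methylation probes
        Z <- matrix(rnorm(n * 4), n, 4)   # e.g., genetic variants
        y <- rnorm(n)                     # e.g., a metabolite level
        pvals <- matrix(NA_real_, ncol(X), ncol(Z))
        for (i in seq_len(ncol(X)))
          for (j in seq_len(ncol(Z))) {
            fit <- lm(y ~ X[, i] * Z[, j])              # y ~ x + z + x:z
            pvals[i, j] <- coef(summary(fit))[4, 4]     # p-value of the interaction term
          }
        pvals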

  10. MmPalateMiRNA, an R package compendium illustrating analysis of miRNA microarray data.

    Science.gov (United States)

    Brock, Guy N; Mukhopadhyay, Partha; Pihur, Vasyl; Webb, Cynthia; Greene, Robert M; Pisano, M Michele

    2013-01-08

    MicroRNAs (miRNAs) constitute the largest family of noncoding RNAs involved in gene silencing and represent critical regulators of cell and tissue differentiation. Microarray expression profiling of miRNAs is an effective means of acquiring genome-level information of miRNA activation and inhibition, as well as the potential regulatory role that these genes play within a biological system. As with mRNA expression profiling arrays, miRNA microarrays come in a variety of platforms from numerous manufacturers, and there are a multitude of techniques available for reducing and analyzing these data. In this paper, we present an analysis of a typical two-color miRNA microarray experiment using publicly available packages from R and Bioconductor, the open-source software project for the analysis of genomic data. Covered topics include visualization, normalization, quality checking, differential expression, cluster analysis, miRNA target identification, and gene set enrichment analysis. Many of these tools carry-over from the analysis of mRNA microarrays, but with some notable differences that require special attention. The paper is presented as a "compendium" which, along with the accompanying R package MmPalateMiRNA, contains all of the experimental data and source code to reproduce the analyses contained in the paper. The compendium presented in this paper will provide investigators with an access point for applying the methods available in R and Bioconductor for analysis of their own miRNA array data.
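
    As a flavor of the differential-expression step covered in the compendium, here is a self-contained limma sketch on simulated, already-normalized log-ratios; it is not the compendium's own code, and the group sizes and effect are arbitrary.

        library(limma)
        set.seed(3)
        M <- matrix(rnorm(500 * 6), 500, 6,
                    dimnames = list(paste0("miR-", 1:500), paste0("array", 1:6)))  # normalized log-ratios
        M[1:20, 4:6] <- M[1:20, 4:6] + 1.5                   # 20 miRNAs shifted in the second group
        design <- cbind(Intercept = 1, Group2 = rep(0:1, each = 3))
        fit <- eBayes(lmFit(M, design))                      # moderated t-statistics per miRNA
        topTable(fit, coef = "Group2", number = 5)           # top differentially expressed miRNAs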

  11. spMC: an R-package for 3D lithological reconstructions based on spatial Markov chains

    Science.gov (United States)

    Sartore, Luca; Fabbri, Paolo; Gaetan, Carlo

    2016-09-01

    The paper presents the spatial Markov Chains (spMC) R-package and a case study of subsoil simulation/prediction located in a plain site of Northeastern Italy. spMC is a fairly complete collection of advanced methods for data inspection; in addition, spMC implements Markov chain models to estimate experimental transition probabilities of categorical lithological data. Furthermore, simulation methods based on the best-known prediction methods (such as indicator kriging and co-kriging) were implemented in the spMC package. Moreover, other more advanced methods are available for simulations, e.g. path methods and Bayesian procedures that exploit the maximum entropy. Since the spMC package was developed for intensive geostatistical computations, part of the code is implemented for parallel computations via the OpenMP constructs. A final analysis of this computational efficiency compares the simulation/prediction algorithms by using different numbers of CPU cores, and considering the example data set of the case study included in the package.

  12. Configurable software for satellite graphics

    Energy Technology Data Exchange (ETDEWEB)

    Hartzman, P D

    1977-12-01

    An important goal in interactive computer graphics is to provide users with both quick system responses for basic graphics functions and enough computing power for complex calculations. One solution is to have a distributed graphics system in which a minicomputer and a powerful large computer share the work. The most versatile type of distributed system is an intelligent satellite system in which the minicomputer is programmable by the application user and can do most of the work while the large remote machine is used for difficult computations. At New York University, the hardware was configured from available equipment. The level of system intelligence resulted almost completely from software development. Unlike previous work with intelligent satellites, the resulting system had system control centered in the satellite. It also had the ability to reconfigure software during real-time operation. The design of the system was done at a very high level using set theoretic language. The specification clearly illustrated processor boundaries and interfaces. The high-level specification also produced a compact, machine-independent virtual graphics data structure for picture representation. The software was written in a systems implementation language; thus, only one set of programs was needed for both machines. A user can program both machines in a single language. Tests of the system with an application program indicate that it has very high potential. A major result of this work is the demonstration that a gigantic investment in new hardware is not necessary for computing facilities interested in graphics.

  13. Graphic Novels in the Classroom

    Science.gov (United States)

    Martin, Adam

    2009-01-01

    Today many authors and artists adapt works of classic literature into a medium more "user friendly" to the increasingly visual student population. Stefan Petrucha and Kody Chamberlain's version of "Beowulf" is one example. The graphic novel captures the entire epic in arresting images and contrasts the darkness of the setting and characters with…

  14. GRADE: a graphical programming environment for multicomputers

    OpenAIRE

    Kacsuk, P; G. Dózsa; T. Fadfyas; R. Lovas

    2012-01-01

    To provide high-level graphical support for developing message passing programs, an integrated programming environment (GRADE) is being developed. GRADE currently provides tools to construct, execute, debug, monitor and visualize message passing based parallel programs. GRADE offers the programmer an integrated graphical user interface during the whole life-cycle of program development and provides high-level graphical programming abstraction mechanisms to construct parallel applications. The...

  15. A study of perceptions of graphical passwords

    CSIR Research Space (South Africa)

    Vorster, JS

    2015-10-01

    Depending on the graphical password schema, the key-space can be even bigger than alpha-numeric passwords. However, in conventional passwords, users will re-use letters within a password. This study investigates graphical passwords for symbol...

  16. Graphic Design in Libraries: A Conceptual Process

    Science.gov (United States)

    Ruiz, Miguel

    2014-01-01

    Providing successful library services requires efficient and effective communication with users; therefore, it is important that content creators who develop visual materials understand key components of design and, specifically, develop a holistic graphic design process. Graphic design, as a form of visual communication, is the process of…

  18. Heuristic attacks against graphical password generators

    CSIR Research Space (South Africa)

    Peach, S

    2010-05-01

    In this paper the authors explore heuristic attacks against graphical password generators. A new trend is emerging to use user clickable pictures to generate passwords. This technique of authentication can be successfully used for - for example...

  19. fullfact: an R package for the analysis of genetic and maternal variance components from full factorial mating designs.

    Science.gov (United States)

    Houde, Aimee Lee S; Pitcher, Trevor E

    2016-03-01

    Full factorial breeding designs are useful for quantifying the amount of additive genetic, nonadditive genetic, and maternal variance that explain phenotypic traits. Such variance estimates are important for examining evolutionary potential. Traditionally, full factorial mating designs have been analyzed using a two-way analysis of variance, which may produce negative variance values and is not suited for unbalanced designs. Mixed-effects models do not produce negative variance values and are suited for unbalanced designs. However, extracting the variance components, calculating significance values, and estimating confidence intervals and/or power values for the components are not straightforward using traditional analytic methods. We introduce fullfact - an R package that addresses these issues and facilitates the analysis of full factorial mating designs with mixed-effects models. Here, we summarize the functions of the fullfact package. The observed data functions extract the variance explained by random and fixed effects and provide their significance. We then calculate the additive genetic, nonadditive genetic, and maternal variance components explaining the phenotype. In particular, we integrate nonnormal error structures for estimating these components for nonnormal data types. The resampled data functions are used to produce bootstrap-t confidence intervals, which can then be plotted using a simple function. We explore the fullfact package through a worked example. This package will facilitate the analyses of full factorial mating designs in R, especially for the analysis of binary, proportion, and/or count data types and for the ability to incorporate additional random and fixed effects and power analyses.
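
    The mixed model underneath this approach can be sketched directly with lme4; this is a generic illustration of what fullfact automates and extends (confidence intervals, power, non-normal responses), not the fullfact interface itself.

        library(lme4)
        set.seed(7)
        dd <- expand.grid(dam = factor(1:5), sire = factor(1:5), rep = 1:4)   # 5 x 5 factorial, 4 offspring per cross
        dam_eff  <- rnorm(5, sd = 0.8)
        sire_eff <- rnorm(5, sd = 0.5)
        dd$y <- dam_eff[dd$dam] + sire_eff[dd$sire] + rnorm(nrow(dd))
        fit <- lmer(y ~ 1 + (1 | dam) + (1 | sire) + (1 | dam:sire), data = dd)
        VarCorr(fit)   # classically, sire variance ~ additive/4, dam:sire ~ nonadditive/4, dam also absorbs maternal effects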

  20. hsphase: an R package for pedigree reconstruction, detection of recombination events, phasing and imputation of half-sib family groups

    Science.gov (United States)

    2014-01-01

    Background Identification of recombination events and which chromosomal segments contributed to an individual is useful for a number of applications in genomic analyses including haplotyping, imputation, signatures of selection, and improved estimates of relationship and probability of identity by descent. Genotypic data on half-sib family groups are widely available in livestock genomics. This structure makes it possible to identify recombination events accurately even with only a few individuals and it lends itself well to a range of applications such as parentage assignment and pedigree verification. Results Here we present hsphase, an R package that exploits the genetic structure found in half-sib livestock data to identify and count recombination events, impute and phase un-genotyped sires and phase its offspring. The package also allows reconstruction of family groups (pedigree inference), identification of pedigree errors and parentage assignment. Additional functions in the package allow identification of genomic mapping errors, imputation of paternal high density genotypes from low density genotypes, evaluation of phasing results either from hsphase or from other phasing programs. Various diagnostic plotting functions permit rapid visual inspection of results and evaluation of datasets. Conclusion The hsphase package provides a suite of functions for analysis and visualization of genomic structures in half-sib family groups implemented in the widely used R programming environment. Low level functions were implemented in C++ and parallelized to improve performance. hsphase was primarily designed for use with high density SNP array data but it is fast enough to run directly on sequence data once they become more widely available. The package is available (GPL 3) from the Comprehensive R Archive Network (CRAN) or from http://www-personal.une.edu.au/~cgondro2/hsphase.htm. PMID:24906803

  1. Alarm annunciation in a graphical environment

    Energy Technology Data Exchange (ETDEWEB)

    Adams, D.G.

    1994-08-01

    Well-designed graphical user interfaces, such as Microsoft Windows™ or UNIX™-based X-Windows, provide a capability for enhanced display of security alarm information. Conversely, a poorly designed interface can quickly overwhelm an operator. This paper describes types of graphical information that can be displayed and offers guidance on how to best display that information. Limits are proposed for the complexity of the user interface, and guidelines are suggested for the display of maps and sensors.

  2. Graphical Language for Data Processing

    Science.gov (United States)

    Alphonso, Keith

    2011-01-01

    A graphical language for processing data allows processing elements to be connected with virtual wires that represent data flows between processing modules. The processing of complex data, such as lidar data, requires many different algorithms to be applied. The purpose of this innovation is to automate the processing of complex data, such as LIDAR, without the need for complex scripting and programming languages. The system consists of a set of user-interface components that allow the user to drag and drop various algorithmic and processing components onto a process graph. By working graphically, the user can completely visualize the process flow and create complex diagrams. This innovation supports the nesting of graphs, such that a graph can be included in another graph as a single step for processing. In addition to the user interface components, the system includes a set of .NET classes that represent the graph internally. These classes provide the internal system representation of the graphical user interface. The system includes a graph execution component that reads the internal representation of the graph (as described above) and executes that graph. The execution of the graph follows the interpreted model of execution in that each node is traversed and executed from the original internal representation. In addition, there are components that allow external code elements, such as algorithms, to be easily integrated into the system, thus making the system infinitely expandable.

  3. Graphical Multiprocessing Analysis Tool (GMAT)

    Energy Technology Data Exchange (ETDEWEB)

    Seager, M.K.; Campbell, S.; Sikora, S.; Strout, R.; Zosel, M.

    1988-03-01

    The design and debugging of parallel programs is a difficult task due to the complex synchronization and data scoping issues involved. To aid the programmer in parallel code development we have developed two methodologies for the graphical display of execution of parallel codes. The Graphical Multiprocessing Analysis Tools (GMAT) consist of stategraph, which represents an inheritance tree of task states, and timeline, which represents tasks as a flowing sequence of events. Information about the code can be displayed as the application runs (dynamic mode) or played back with time under user control (static mode). This document discusses the design and user interface issues involved in developing the parallel application display GMAT family. Also, we present an introductory user's guide for both tools. 4 figs.

  4. Graphics editors in CPDev environment

    Directory of Open Access Journals (Sweden)

    Marcin Jamro

    2012-03-01

    According to the IEC 61131-3 standard, controllers and distributed control systems can be programmed in textual and graphical languages. In many scenarios using a graphical language is preferred by the user, because diagrams can be more legible and easier to understand or modify, also by people who do not have strong programming skills. What is more, they can be attached to the documentation to present a part of a system implementation. CPDev is an engineering environment that makes it possible to program PLCs, PACs, softPLCs and distributed control systems with the usage of languages defined in the IEC 61131-3 standard. In earlier versions, it supported only textual languages - ST and IL. Currently, graphics editors for FBD, LD and SFC languages are also available, so users can choose a suitable language depending on their skills and the specificity of the program that they have to prepare. The article presents the implementation of the graphics editors, made by the author, which support creating program organization units in all graphical languages defined in the IEC 61131-3 standard. They are equipped with a set of basic and complex functionalities to provide an easy and intuitive way of creating programs, function blocks and functions with visual programming. In the article the project structure and some important mechanisms are described. They include e.g. automatic connection finding (with the A* algorithm), translation to ST code, conversion to and from XML format and an execution mode supporting multiple data sources and breakpoints.

  5. Design Graphics

    Science.gov (United States)

    1990-01-01

    A mathematician, David R. Hedgley, Jr., developed a computer program that considers whether a line in a graphic model of a three-dimensional object should or should not be visible. Known as the Hidden Line Computer Code, the program automatically removes superfluous lines and displays an object from a specific viewpoint, just as the human eye would see it. An example of how one company uses the program is the experience of Birdair which specializes in production of fabric skylights and stadium covers. The fabric called SHEERFILL is a Teflon coated fiberglass material developed in cooperation with DuPont Company. SHEERFILL glazed structures are either tension structures or air-supported tension structures. Both are formed by patterned fabric sheets supported by a steel or aluminum frame or cable network. Birdair uses the Hidden Line Computer Code to illustrate a prospective structure to an architect or owner. The program generates a three-dimensional perspective with the hidden lines removed. This program is still used by Birdair and continues to be commercially available to the public.

  6. Guidebook to R graphics using Microsoft Windows

    CERN Document Server

    Takezawa, Kunio

    2012-01-01

    Introduces the graphical capabilities of R to readers new to the software. Due to its flexibility and availability, R has become the computing software of choice for statistical computing and generating graphics across various fields of research. Guidebook to R Graphics Using Microsoft® Windows offers a unique presentation of R, guiding new users through its many benefits, including the creation of high-quality graphics. Beginning with getting the program up and running, this book takes readers step by step through the process of creating histograms, boxplots, strip charts, time series gra
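
    A few of the basic plot types the guidebook walks through can be produced from data sets shipped with R:

        data(airquality)
        hist(airquality$Temp, main = "Daily temperature", xlab = "Temperature (F)")
        boxplot(Temp ~ Month, data = airquality, xlab = "Month", ylab = "Temperature (F)")
        stripchart(Temp ~ Month, data = airquality, method = "jitter", vertical = TRUE)
        plot(AirPassengers)   # a classic time series plot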

  7. Inference in Graphical Gaussian Models with Edge and Vertex Symmetries with the gRc Package for R

    DEFF Research Database (Denmark)

    Højsgaard, Søren; Lauritzen, Steffen L

    2007-01-01

    In this paper we present the R package gRc for statistical inference in graphical Gaussian models in which symmetry restrictions have been imposed on the concentration or partial correlation matrix. The models are represented by coloured graphs where parameters associated with edges or vertices of the same colour are restricted to being identical. We describe algorithms for maximum likelihood estimation and discuss model selection issues. The paper illustrates the practical use of the gRc package.
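
    A hedged sketch of fitting such a model follows; the rcox() constructor and its vcc/ecc colour-class arguments are assumed from the package documentation, and the data are simulated.

        library(gRc)
        set.seed(1)
        S <- matrix(c(1, .5, .3,
                      .5, 1, .3,
                      .3, .3, 1), nrow = 3)
        d <- as.data.frame(MASS::mvrnorm(200, mu = rep(0, 3), Sigma = S, empirical = TRUE))
        names(d) <- c("a", "b", "c")
        m <- rcox(vcc = list(~a + b, ~c),        # vertices a and b share one concentration parameter
                  ecc = list(~a:b, ~a:c + b:c),  # edge a-b on its own; edges a-c and b-c share a parameter
                  data = d)
        m   # fitted RCON model with the imposed symmetry restrictions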

  8. Design and Realization of a Graphical User Interface Platform for a Linear Algebra Experiment Course Based on MATLAB

    Institute of Scientific and Technical Information of China (English)

    崔秋珍

    2012-01-01

    Based on the teaching content and requirements of the linear algebra course, this paper describes the process and methods of designing and implementing a graphical user interface (GUI) platform for a linear algebra experiment course using MATLAB. The platform helps students deepen their understanding and mastery of linear algebra, while also training them to use the computer and MATLAB to solve linear algebra problems, and it provides an effective auxiliary tool for the practical teaching of linear algebra.

  9. CARMA Correlator Graphical Setup

    Science.gov (United States)

    Wu, D.; Shaya, B.; Pound, M. W.

    2011-07-01

    CARMA Correlator Graphical Setup (CGS) is a Java tool to help users of the Combined Array for Research in Millimeter-wave Astronomy (CARMA) plan observations. It allows users to visualize the correlator bands overlaid on frequency space and view spectral lines within each band. Bands can be click-dragged to anywhere in frequency and can have their properties (e.g., bandwidth, quantization level, rest frequency) changed interactively. Spectral lines can be filtered from the view by expected line strength to reduce visual clutter. Once the user is happy with the setup, a button click generates the Python commands needed to configure the correlator within the observing script. CGS can also read Python configurations from an observing script and reproduce the correlator setup that was used. Because the correlator hardware description is defined in an XML file, the tool can be rapidly reconfigured for changing hardware. This has been quite useful as CARMA has recently commissioned a new correlator. The tool was written in Java by high school summer interns working in UMD's Laboratory for Millimeter Astronomy and has become an essential planning tool for CARMA PIs.

  10. Human response times in a graphic environment

    CERN Document Server

    Yule, A

    1972-01-01

    A summary of the results obtained from measuring the response times of the users of an interactive graphics system available on the CERN central computers is presented. These results are then used to find an optimum time to wait before rolling the user's program to disc.

  11. On Repairing Generated Behaviors for Graphical Characters

    DEFF Research Database (Denmark)

    Corradini, Andrea; Mehta, Manish

    2016-01-01

    In this paper, we continue our work on the creation of artificial intelligence (AI) behaviors for graphical interactive characters by novice users. We refer to novice users as any persons who do not have any particular skills, training and experience in both programming and design. The focus...

  12. GnuForPlot Graphics

    Energy Technology Data Exchange (ETDEWEB)

    2015-11-04

    Gnuforplot Graphics is a Fortran90 program designed to generate two and three dimensional plots of data on a personal computer. The program uses calls to the open source code Gnuplot to generate the plots. Two Fortran90 programs have been written to use the Gnuplot graphics capabilities. The first program, named Plotsetup.f90 reads data from output files created by either the Stadium or LeachXS/Orchestra modeling codes and saves the data in arrays for plotting. This program then calls Gnuforplot which takes the data array along with user specified parameters to set plot specifications and issues Gnuplot commands that generate the screen plots. The user can view the plots and optionally save copies in jpeg format.

  13. Weapon-Centric Graphic Controller User Evaluation

    Science.gov (United States)

    2009-08-01

    Abstract not available; the indexed excerpt preserves only fragmentary evaluation comments in which users rated the trackball as more accurate, faster, less jerky and better for drawing zig-zag lines than the joystick.

  14. Design and Realization of MATLAB Graphical User Interfaces Based on the WebBrowser Controls

    Institute of Scientific and Technical Information of China (English)

    徐辉; 王忠芝

    2011-01-01

    The MATLAB graphical user interface development environment (GUIDE) has never been able to provide scroll-bar functionality comparable to that of VC forms. This paper introduces a method that embeds a WebBrowser ActiveX control in MATLAB GUIDE and exploits the automatic scroll bars of a web page to display large amounts of content. It focuses on the creation of the WebBrowser control; the addition and display of buttons, text and images; and the transfer of data between MATLAB and the WebBrowser control. The method is validated with an image classification system, and the results show that it is simple and easily extensible for MATLAB graphical user interface (GUI) design.

  15. Slope-Area Computation Program Graphical User Interface 1.0—A Preprocessing and Postprocessing Tool for Estimating Peak Flood Discharge Using the Slope-Area Method

    Science.gov (United States)

    Bradley, D. Nathan

    2012-01-01

    The slope-area method is a technique for estimating the peak discharge of a flood after the water has receded (Dalrymple and Benson, 1967). This type of discharge estimate is called an “indirect measurement” because it relies on evidence left behind by the flood, such as high-water marks (HWMs) on trees or buildings. These indicators of flood stage are combined with measurements of the cross-sectional geometry of the stream, estimates of channel roughness, and a mathematical model that balances the total energy of the flow between cross sections. This is in contrast to a “direct” measurement of discharge during the flood where cross-sectional area is measured and a current meter or acoustic equipment is used to measure the water velocity. When a direct discharge measurement cannot be made at a gage during high flows because of logistics or safety reasons, an indirect measurement of a peak discharge is useful for defining the high-flow section of the stage-discharge relation (rating curve) at the stream gage, resulting in more accurate computation of high flows. The Slope-Area Computation program (SAC; Fulford, 1994) is an implementation of the slope-area method that computes a peak-discharge estimate from inputs of water-surface slope (from surveyed HWMs), channel geometry, and estimated channel roughness. SAC is a command line program written in Fortran that reads input data from a formatted text file and prints results to another formatted text file. Preparing the input file can be time-consuming and prone to errors. This document describes the SAC graphical user interface (GUI), a crossplatform “wrapper” application that prepares the SAC input file, executes the program, and helps the user interpret the output. The SAC GUI is an update and enhancement of the slope-area method (SAM; Hortness, 2004; Berenbrock, 1996), an earlier spreadsheet tool used to aid field personnel in the completion of a slope-area measurement. The SAC GUI reads survey data
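
    For a single cross section, the hydraulics underneath a slope-area estimate reduce to Manning's equation; the sketch below shows only that relation (SAC itself balances the total energy between multiple cross sections, which is not reproduced here), and the input numbers are arbitrary.

        # Manning's equation in US customary units: Q = (1.486 / n) * A * R^(2/3) * sqrt(S)
        manning_q <- function(area_ft2, wetted_perim_ft, slope, n) {
          R <- area_ft2 / wetted_perim_ft          # hydraulic radius (ft)
          K <- (1.486 / n) * area_ft2 * R^(2 / 3)  # conveyance
          K * sqrt(slope)                          # discharge (ft^3/s)
        }
        manning_q(area_ft2 = 850, wetted_perim_ft = 130, slope = 0.0021, n = 0.035)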

  16. A Survey on Graphical Programming Systems

    Directory of Open Access Journals (Sweden)

    Gurudatt Kulkarni

    2014-04-01

    Recently there has been an increasing interest in the use of graphics to help programming and understanding of computer systems. Graphical programming and program simulation are exciting areas of active computer science research that show promise for improving the programming process. An array of different design methodologies have arisen from research efforts and many graphical programming systems have been developed to address both general programming tasks and specific application areas such as physical simulation and user interface design. This paper presents a survey of the field of graphical programming languages, starting with a historical overview of some of the pioneering efforts in the field. In addition this paper also presents different classifications of graphical programming languages.

  17. Graphic Turbulence Guidance

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Forecast turbulence hazards identified by the Graphical Turbulence Guidance algorithm. The Graphical Turbulence Guidance product depicts mid-level and upper-level...

  18. Repellency Awareness Graphic

    Science.gov (United States)

    Companies can apply to use the voluntary new graphic on product labels of skin-applied insect repellents. This graphic is intended to help consumers easily identify the protection time for mosquitoes and ticks and select appropriately.

  19. Graphical Turbulence Guidance - Composite

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Forecast turbulence hazards identified by the Graphical Turbulence Guidance algorithm. The Graphical Turbulence Guidance product depicts mid-level and upper-level...

  20. CAD/CAM from the graphic-design perspective

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, A.

    1982-11-01

    CAD/CAM systems have evolved elaborate human-computer interfaces in order to facilitate the creation of highly detailed and specialized schematic diagrams and texts. Although these systems have powerful capacities in terms of graphics editing, data manipulation, and data storage, insufficient attention has been given to making the online interface (together with supporting documentation) user-friendly, i.e., understandable, memorable, and appealing to the general user. Graphic-design considerations in particular have been routinely overlooked. Graphic design concerns typography, symbol design, color, spatial layout, and temporal sequencing. Graphic design can assist computer science by providing insight and expertise in designing effective communication between human being and machine.

  1. Graphical Models with R

    DEFF Research Database (Denmark)

    Højsgaard, Søren; Edwards, David; Lauritzen, Steffen

    The book provides examples of how more advanced aspects of graphical modeling can be represented and handled within R. Topics covered in the seven chapters include graphical models for contingency tables, Gaussian and mixed graphical models, Bayesian networks and modeling high dimensional data...

  2. The PC graphics handbook

    CERN Document Server

    Sanchez, Julio

    2003-01-01

    Part I - Graphics Fundamentals. PC Graphics Overview: History and Evolution; Short History of PC Video; PS/2 Video Systems; SuperVGA; Graphics Coprocessors and Accelerators; Graphics Applications; State-of-the-Art in PC Graphics; 3D Application Programming Interfaces. Polygonal Modeling: Vector and Raster Data; Coordinate Systems; Modeling with Polygons. Image Transformations: Matrix-based Representations; Matrix Arithmetic; 3D Transformations. Programming Matrix Transformations: Numeric Data in Matrix Form; Array Processing. Projections and Rendering: Perspective; The Rendering Pipeline. Lighting and Shading: Lightin

  3. Graphics gems II

    CERN Document Server

    Arvo, James

    1991-01-01

    Graphics Gems II is a collection of articles shared by a diverse group of people that reflect ideas and approaches in graphics programming which can benefit other computer graphics programmers. This volume presents techniques for doing well-known graphics operations faster or easier. The book contains chapters devoted to topics on two-dimensional and three-dimensional geometry and algorithms, image processing, frame buffer techniques, and ray tracing techniques. The radiosity approach, matrix techniques, and numerical and programming techniques are likewise discussed. Graphics artists and comput

  4. MaskDensity14: An R package for the density approximant of a univariate based on noise multiplied data

    Directory of Open Access Journals (Sweden)

    Yan-Xia Lin

    2015-12-01

    Full Text Available Lin (2014) developed a framework for the method of the sample-moment-based density approximant, for estimating the probability density function of microdata based on noise-multiplied data. Theoretically, it provides a promising way for data users to generate synthetic versions of the original data without accessing the original data; however, technical issues can cause problems in implementing the method. In this paper, we describe a software package called MaskDensity14, written in the R language, that uses a computational approach to solve these technical issues and makes the method of the sample-moment-based density approximant feasible. MaskDensity14 has applications in many areas, such as sharing clinical trial data and survey data without releasing the original data.
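
    The following is a minimal, generic R sketch of the noise-multiplication masking scheme that the package's density approximant targets; it is not the MaskDensity14 API, and the data and noise distribution are purely illustrative.

```r
# Generic illustration of noise multiplication (not the MaskDensity14 API):
# each confidential value is multiplied by random noise with a known distribution,
# and only the masked values are released. MaskDensity14's contribution is the
# reverse step: approximating the density of x from y and the noise distribution.
set.seed(1)
x <- rgamma(5000, shape = 2, rate = 0.5)     # hypothetical confidential microdata
noise <- runif(5000, min = 0.5, max = 1.5)   # multiplicative noise, E[noise] = 1
y <- x * noise                               # released (masked) data

# Naive comparison of the masked and original densities
plot(density(x), main = "Original vs noise-multiplied data", lwd = 2)
lines(density(y), lty = 2, lwd = 2)
legend("topright", c("original x", "masked y"), lty = c(1, 2), lwd = 2)
```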

  5. Graphical Models with R

    DEFF Research Database (Denmark)

    Højsgaard, Søren; Edwards, David; Lauritzen, Steffen

    Graphical models in their modern form have been around since the late 1970s and appear today in many areas of the sciences. Along with the ongoing developments of graphical models, a number of different graphical modeling software programs have been written over the years. In recent years many of these software developments have taken place within the R community, either in the form of new packages or by providing an R interface to existing software. This book attempts to give the reader a gentle introduction to graphical modeling using R and the main features of some of these packages. In addition, the book provides examples of how more advanced aspects of graphical modeling can be represented and handled within R. Topics covered in the seven chapters include graphical models for contingency tables, Gaussian and mixed graphical models, Bayesian networks and modeling high dimensional data...

  6. DiceKriging, DiceOptim: Two R Packages for the Analysis of Computer Experiments by Kriging-Based Metamodeling and Optimization

    Directory of Open Access Journals (Sweden)

    Olivier Roustant

    2012-10-01

    Full Text Available We present two recently released R packages, DiceKriging and DiceOptim, for the approximation and the optimization of expensive-to-evaluate deterministic functions. Following a self-contained mini tutorial on Kriging-based approximation and optimization, the functionalities of both packages are detailed and demonstrated in two distinct sections. In particular, the versatility of DiceKriging with respect to trend and noise specifications, covariance parameter estimation, as well as conditional and unconditional simulations are illustrated on the basis of several reproducible numerical experiments. We then put to the fore the implementation of sequential and parallel optimization strategies relying on the expected improvement criterion on the occasion of DiceOptim’s presentation. An appendix is dedicated to complementary mathematical and computational details.
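
    As a small worked example of the Kriging-based approximation described above, the sketch below fits a metamodel to a cheap stand-in for an expensive deterministic simulator and predicts at new points. It assumes the standard km()/predict() interface of DiceKriging; the test function, design size, and covariance choice are illustrative only.

```r
# Sketch: fit a Kriging metamodel to an expensive deterministic function and
# predict at new points (illustrative settings; defaults may differ by version).
library(DiceKriging)

branin_like <- function(x) {                 # stand-in for an expensive simulator
  sum(sin(5 * x)) + sum((x - 0.5)^2)
}

set.seed(42)
design <- data.frame(x1 = runif(20), x2 = runif(20))   # 20 design points in [0,1]^2
response <- apply(design, 1, branin_like)

fit <- km(design = design, response = response, covtype = "matern5_2")

newdata <- data.frame(x1 = runif(5), x2 = runif(5))
pred <- predict(fit, newdata = newdata, type = "UK")    # universal Kriging prediction
cbind(newdata, mean = pred$mean, sd = pred$sd)
```

    The predicted mean and standard deviation returned here are exactly the quantities that DiceOptim's expected-improvement criterion builds on when selecting the next evaluation point.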

  7. ChIP-seq Analysis in R (CSAR: An R package for the statistical detection of protein-bound genomic regions

    Directory of Open Access Journals (Sweden)

    van Ham Roeland CHJ

    2011-05-01

    Full Text Available Abstract Background In vivo detection of protein-bound genomic regions can be achieved by combining chromatin-immunoprecipitation with next-generation sequencing technology (ChIP-seq. The large amount of sequence data produced by this method needs to be analyzed in a statistically proper and computationally efficient manner. The generation of high copy numbers of DNA fragments as an artifact of the PCR step in ChIP-seq is an important source of bias of this methodology. Results We present here an R package for the statistical analysis of ChIP-seq experiments. Taking the average size of DNA fragments subjected to sequencing into account, the software calculates single-nucleotide read-enrichment values. After normalization, sample and control are compared using a test based on the ratio test or the Poisson distribution. Test statistic thresholds to control the false discovery rate are obtained through random permutations. Computational efficiency is achieved by implementing the most time-consuming functions in C++ and integrating these in the R package. An analysis of simulated and experimental ChIP-seq data is presented to demonstrate the robustness of our method against PCR-artefacts and its adequate control of the error rate. Conclusions The software ChIP-seq Analysis in R (CSAR enables fast and accurate detection of protein-bound genomic regions through the analysis of ChIP-seq experiments. Compared to existing methods, we found that our package shows greater robustness against PCR-artefacts and better control of the error rate.
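
    To make the statistical idea concrete, here is a generic R sketch (deliberately not the CSAR API) of the approach the abstract describes: per-position read counts in sample and control are compared with a Poisson-based score, and a significance threshold is derived by random permutation. All counts and the spiked region are simulated for illustration.

```r
# Generic illustration of the statistical idea (not the CSAR functions): compare
# sample vs. control read counts per position with a Poisson tail score and derive
# a score threshold by randomly permuting the sample/control labels.
set.seed(7)
n <- 1e4
control <- rpois(n, lambda = 5)                       # hypothetical control coverage
sample  <- rpois(n, lambda = 5)
sample[4000:4050] <- rpois(51, lambda = 25)           # hypothetical protein-bound region

score <- -ppois(sample - 1, lambda = pmax(control, 1),
                lower.tail = FALSE, log.p = TRUE)     # -log P(X >= sample | control)

# Permutation: swap sample/control at random positions and record the maximum score
perm_score <- replicate(20, {
  swap <- runif(n) < 0.5
  s <- ifelse(swap, control, sample)
  c <- ifelse(swap, sample, control)
  max(-ppois(s - 1, lambda = pmax(c, 1), lower.tail = FALSE, log.p = TRUE))
})
threshold <- quantile(perm_score, 0.95)               # crude permutation-based cutoff
which(score > threshold)                              # candidate enriched positions
```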

  8. Circuit II--A Conversational Graphical Interface.

    Science.gov (United States)

    Singer, Ronald A.

    1993-01-01

    Provides an overview of Circuit II, an interactive system that provides users with a graphical representation of an electronic circuit within which questions may be posed and manipulated, and discusses how mouse selections have analogous roles to certain natural language features, such as anaphora, deixis, and ellipsis. (13 references) (EA)

  9. lumpR 2.0.0: an R package facilitating landscape discretisation for hillslope-based hydrological models

    Science.gov (United States)

    Pilz, Tobias; Francke, Till; Bronstert, Axel

    2017-08-01

    The characteristics of a landscape are essential factors for hydrological processes. Therefore, an adequate representation of the landscape of a catchment in hydrological models is vital. However, many such models exist, differing, amongst other things, in their spatial concept and discretisation. The latter constitutes an essential pre-processing step, for which many different algorithms along with numerous software implementations exist. In that context, existing solutions are often model specific, commercial, or depend on commercial back-end software, and allow only a limited or no workflow automation at all. Consequently, a new package for the scientific software and scripting environment R, called lumpR, was developed. lumpR employs an algorithm for hillslope-based landscape discretisation directed to large-scale application via a hierarchical multi-scale approach. The package addresses existing limitations as it is free and open source, easily extendible to other hydrological models, and the workflow can be fully automated. Moreover, it is user-friendly as the direct coupling to a GIS allows for immediate visual inspection and manual adjustment. Sufficient control is furthermore retained via parameter specification and the option to include expert knowledge. Conversely, completely automatic operation also allows for extensive analysis of aspects related to landscape discretisation. In a case study, the application of the package is presented. A sensitivity analysis of the most important discretisation parameters demonstrates its efficient workflow automation. Considering multiple streamflow metrics, the employed model proved reasonably robust to the discretisation parameters. However, parameters determining the sizes of subbasins and hillslopes proved to be more important than the others, including the number of representative hillslopes, the number of attributes employed for the lumping algorithm, and the number of sub-discretisations of the representative hillslopes.

  10. Intelligent Computer Graphics 2012

    CERN Document Server

    Miaoulis, Georgios

    2013-01-01

    In Computer Graphics, the use of intelligent techniques started more recently than in other research areas. However, over the last two decades the use of intelligent Computer Graphics techniques has grown year after year, and more and more interesting techniques are presented in this area. The purpose of this volume is to present current work of the Intelligent Computer Graphics community, a community growing year after year. This volume is a kind of continuation of the previously published Springer volumes “Artificial Intelligence Techniques for Computer Graphics” (2008), “Intelligent Computer Graphics 2009” (2009), “Intelligent Computer Graphics 2010” (2010) and “Intelligent Computer Graphics 2011” (2011). Usually, this kind of volume contains, each year, selected extended papers from the corresponding 3IA Conference. However, the current volume is made from directly reviewed and selected papers, submitted for publication in the volume “Intelligent Computer Gr...

  11. Deterministic Graphical Games Revisited

    DEFF Research Database (Denmark)

    Andersson, Daniel; Hansen, Kristoffer Arnsfelt; Miltersen, Peter Bro

    2008-01-01

    We revisit the deterministic graphical games of Washburn. A deterministic graphical game can be described as a simple stochastic game (a notion due to Anne Condon), except that we allow arbitrary real payoffs but disallow moves of chance. We study the complexity of solving deterministic graphical games and obtain an almost-linear time comparison-based algorithm for computing an equilibrium of such a game. The existence of a linear time comparison-based algorithm remains an open problem.

  12. Introduction to Graphical Modelling

    CERN Document Server

    Scutari, Marco

    2010-01-01

    The aim of this chapter is twofold. In the first part we will provide a brief overview of the mathematical and statistical foundations of graphical models, along with their fundamental properties, estimation and basic inference procedures. In particular we will develop Markov networks (also known as Markov random fields) and Bayesian networks, which comprise most past and current literature on graphical models. In the second part we will review some applications of graphical models in systems biology.

  13. The computer graphics metafile

    CERN Document Server

    Henderson, LR; Shepherd, B; Arnold, D B

    1990-01-01

    The Computer Graphics Metafile deals with the Computer Graphics Metafile (CGM) standard and covers topics ranging from the structure and contents of a metafile to CGM functionality, metafile elements, and real-world applications of CGM. Binary Encoding, Character Encoding, application profiles, and implementations are also discussed. This book is comprised of 18 chapters divided into five sections and begins with an overview of the CGM standard and how it can meet some of the requirements for storage of graphical data within a graphics system or application environment. The reader is then intr

  14. Graphical Models with R

    CERN Document Server

    Højsgaard, Søren; Lauritzen, Steffen

    2012-01-01

    Graphical models in their modern form have been around since the late 1970s and appear today in many areas of the sciences. Along with the ongoing developments of graphical models, a number of different graphical modeling software programs have been written over the years. In recent years many of these software developments have taken place within the R community, either in the form of new packages or by providing an R interface to existing software. This book attempts to give the reader a gentle introduction to graphical modeling using R and the main features of some of these packages. In add

  15. The computer graphics interface

    CERN Document Server

    Steinbrugge Chauveau, Karla; Niles Reed, Theodore; Shepherd, B

    2014-01-01

    The Computer Graphics Interface provides a concise discussion of computer graphics interface (CGI) standards. The title is comprised of seven chapters that cover the concepts of the CGI standard. Figures and examples are also included. The first chapter provides a general overview of CGI; this chapter covers graphics standards, functional specifications, and syntactic interfaces. Next, the book discusses the basic concepts of CGI, such as inquiry, profiles, and registration. The third chapter covers the CGI concepts and functions, while the fourth chapter deals with the concept of graphic obje

  16. Use of Interactive Computer Graphics to Solve Routing Problems.

    Science.gov (United States)

    Gillett, B. E.; Lawrence, J. L.

    1981-01-01

    Discusses vehicle routing problems and solutions. Describes testing of an interactive computer graphics package combining several types of solutions that allows users with little or no experience to work out routing problems. (Author/RW)

  17. Scientific and Graphic Design Foundations for C2

    Science.gov (United States)

    2007-06-01

    ... the elements in the composition. This section presents a summary of concepts in graphic design layout, typography, color, and data graphics that assist users in perceiving and recognizing patterns in information. Typography is the art and technique of designing textual...

  18. Graphical Modeling Language Tool

    NARCIS (Netherlands)

    Rumnit, M.

    2003-01-01

    The group of the faculty EE-Math-CS of the University of Twente is developing a graphical modeling language for specifying concurrency in software design. This graphical modeling language has a mathematical background based on the theory of CSP. This language contains the power to create trustworth

  19. A Linux Workstation for High Performance Graphics

    Science.gov (United States)

    Geist, Robert; Westall, James

    2000-01-01

    The primary goal of this effort was to provide a low-cost method of obtaining high-performance 3-D graphics using an industry standard library (OpenGL) on PC class computers. Previously, users interested in doing substantial visualization or graphical manipulation were constrained to using specialized, custom hardware most often found in computers from Silicon Graphics (SGI). We provided an alternative to expensive SGI hardware by taking advantage of third-party, 3-D graphics accelerators that have now become available at very affordable prices. To make use of this hardware our goal was to provide a free, redistributable, and fully-compatible OpenGL work-alike library so that existing bodies of code could simply be recompiled for PC class machines running a free version of Unix. This should allow substantial cost savings while greatly expanding the population of people with access to a serious graphics development and viewing environment. This should offer a means for NASA to provide a spectrum of graphics performance to its scientists, supplying high-end specialized SGI hardware for high-performance visualization while fulfilling the requirements of medium and lower performance applications with generic, off-the-shelf components and still maintaining compatibility between the two.

  20. The complete guide to blender graphics computer modeling and animation

    CERN Document Server

    Blain, John M

    2014-01-01

    Smoothly Leads Users into the Subject of Computer Graphics through the Blender GUI. Blender, the free and open source 3D computer modeling and animation program, allows users to create and animate models and figures in scenes, compile feature movies, and interact with the models and create video games. Reflecting the latest version of Blender, The Complete Guide to Blender Graphics: Computer Modeling & Animation, 2nd Edition helps beginners learn the basics of computer animation using this versatile graphics program. This edition incorporates many new features of Blender, including developments

  1. Perception in statistical graphics

    Science.gov (United States)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  2. Using the computerized glow curve deconvolution method and the R package tgcd to determination of thermoluminescence kinetic parameters of chilli powder samples by GOK model and OTOR one

    Science.gov (United States)

    Sang, Nguyen Duy; Van Hung, Nguyen; Van Hung, Tran; Hien, Nguyen Quoc

    2017-03-01

    The kinetic parameters of thermoluminescence (TL) glow peaks of chilli powder irradiated by gamma rays at doses of 0, 4 and 8 kGy have been calculated and estimated by the computerized glow curve deconvolution (CGCD) method and the R package tgcd, using the TL glow-curve data. The kinetic parameters of the TL glow peaks (i.e. activation energies (E), order of kinetics (b), trapping and recombination probability coefficients (R) and frequency factors (s)) are fitted with the general-order kinetics (GOK) model and the one trap-one recombination (OTOR) model. The kinetic parameters of the chilli powder differ with sample storage time, radiation dose, and the model used (GOK or OTOR). Samples stored for a shorter period have smaller kinetic parameter values than samples stored for a longer period. Comparing the kinetic parameter values of the three samples shows that the values for the non-irradiated samples are lowest, whereas the values for the 4 kGy irradiated samples are greater than those for the 8 kGy irradiated samples.
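
    For readers unfamiliar with the quantity being fitted, the sketch below writes out the standard textbook general-order kinetics (GOK) glow-peak expression as an R function and plots a synthetic peak. This is the generic GOK formula, not the internal code of the tgcd package, and the parameter values used in the example are hypothetical.

```r
# Standard general-order kinetics (GOK) glow-peak expression:
#   I(T) = n0 * s * exp(-E/(k*T)) * [1 + (b-1)*(s/beta) * Int_T0^T exp(-E/(k*u)) du]^(-b/(b-1))
# with activation energy E (eV), frequency factor s (1/s), kinetic order b,
# linear heating rate beta (K/s); k is Boltzmann's constant in eV/K.
gok_intensity <- function(Temp, E, s, b, n0 = 1, beta = 1, T0 = 300) {
  k <- 8.617e-5                                     # Boltzmann constant [eV/K]
  sapply(Temp, function(Tt) {
    integ <- integrate(function(u) exp(-E / (k * u)), lower = T0, upper = Tt)$value
    n0 * s * exp(-E / (k * Tt)) * (1 + (b - 1) * (s / beta) * integ)^(-b / (b - 1))
  })
}

# Plot a synthetic glow peak for plausible (hypothetical) parameters
Temp <- seq(300, 600, by = 1)
plot(Temp, gok_intensity(Temp, E = 1.1, s = 1e12, b = 1.5),
     type = "l", xlab = "Temperature [K]", ylab = "TL intensity [a.u.]")
```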

  3. Bayesian Graphical Models

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Nielsen, Thomas Dyhre

    2016-01-01

    Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical model is the Bayesian network. The structural part of a Bayesian graphical model is a graph consisting of nodes... is largely due to the availability of efficient inference algorithms for answering probabilistic queries about the states of the variables in the network. Furthermore, to support the construction of Bayesian network models, learning algorithms are also available. We give an overview of the Bayesian network...

  4. Putting Down Roots: A Graphical Exploration of Community Attachment

    CERN Document Server

    Kaplan, Andee

    2016-01-01

    In this paper, we explore the relationships that individuals have with their communities. This work was prepared as part of the ASA Data Expo '13 sponsored by the Graphics Section and the Computing Section, using data provided by the Knight Foundation Soul of the Community survey. The Knight Foundation in cooperation with Gallup surveyed 43,000 people over three years in 26 communities across the United States with the intention of understanding the association between community attributes and the degree of attachment people feel towards their community. These include the different facets of both urban and rural communities, the impact of quality education, and the trend in the perceived economic conditions of a community over time. The goal of our work is to facilitate understanding of why people feel attachment to their communities through the use of an interactive and web-based visualization. We will explain the development and use of web-based interactive graphics, including an overview of the R package S...

  5. R Package - httk (Syngenta webinar)

    Science.gov (United States)

    Functions and data tables for simulation and statistical analysis of chemical toxicokinetics ("TK") using data obtained from relatively high throughput, in vitro studies. Both physiologically-based ("PBTK") and empirical (e.g., one compartment) "TK" models can be parameterized fo...
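
    Typical usage of the httk package looks roughly like the sketch below: simulating the PBTK model for a repeated dose and computing an analytic steady-state concentration. The function names follow the package documentation, but the chemical, dosing scheme, and defaults shown are illustrative assumptions and may differ across package versions.

```r
# Sketch of typical httk usage (illustrative chemical and dosing; the chemical
# must be present in httk's in vitro data tables for these calls to succeed).
library(httk)

# Simulate the PBTK model for one oral dose per day over 10 days
out <- solve_pbtk(chem.name = "Bisphenol A", days = 10, doses.per.day = 1)
head(out)                                    # time course of tissue/plasma concentrations

# Analytic steady-state plasma concentration under the default dosing assumptions
calc_analytic_css(chem.name = "Bisphenol A")
```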

  6. Indian Graphic Symbols.

    Science.gov (United States)

    Stump, Sarain

    1979-01-01

    Noting Indian tribes had invented ways to record facts and ideas, with graphic symbols that sometimes reached the complexity of hieroglyphs, this article illustrates and describes Indian symbols. (Author/RTS)

  7. Digital Raster Graphics

    Data.gov (United States)

    U.S. Geological Survey, Department of the Interior — A digital raster graphic (DRG) is a scanned image of a U.S. Geological Survey (USGS) topographic map. The scanned image includes all map collar information. The...

  8. Graphical Potential Games

    OpenAIRE

    Ortiz, Luis E.

    2015-01-01

    Potential games, originally introduced in the early 1990's by Lloyd Shapley, the 2012 Nobel Laureate in Economics, and his colleague Dov Monderer, are a very important class of models in game theory. They have special properties such as the existence of Nash equilibria in pure strategies. This note introduces graphical versions of potential games. Special cases of graphical potential games have already found applicability in many areas of science and engineering beyond economics, including ar...

  9. Graphical symbol recognition

    OpenAIRE

    K.C., Santosh; Wendling, Laurent

    2015-01-01

    The chapter focuses on one of the key issues in document image processing i.e., graphical symbol recognition. Graphical symbol recognition is a sub-field of a larger research domain: pattern recognition. The chapter covers several approaches (i.e., statistical, structural and syntactic) and specially designed symbol recognition techniques inspired by real-world industrial problems. It, in general, contains research problems, state-of-the-art methods that convey basic s...

  10. HEP graphics and visualization

    CERN Document Server

    Drevermann, Hans; CERN. Geneva

    1992-01-01

    The lectures will give an overview of the use of graphics in high-energy physics, i.e. for detector design, event representation and interactive analysis in 2D and 3D. An introduction to graphics packages (GKS, PHIGS, etc.) will be given, including discussion of the basic concepts of graphics programming. Emphasis is put on new ideas about graphical representation of events. Non-linear visualisation techniques, to improve the ease of understanding, will be described in detail. Physiological aspects, which play a role when using colours and when drawing mathematical objects like points and lines, are discussed. An analysis will be made of the power of graphics to represent very complex data in 2 and 3 dimensions, and the advantages of different representations will be compared.New techniques based on graphics are emerging today, such as multimedia or real-life pictures. Some are used in other domains of scientific research, as will be described and an overview of possible applications in our field will be give...

  11. Justine user's manual

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S.R.

    1995-10-01

    Justine is the graphical user interface to the Los Alamos Radiation Modeling Interactive Environment (LARAMIE). It provides LARAMIE customers with a powerful, robust, easy-to-use, WYSIWYG interface that facilitates geometry construction and problem specification. It is assumed that the reader is familiar with LARAMIE, and the transport codes available, i.e., MCNP™ and DANTSYS™. No attempt is made in this manual to describe these codes in detail. Information about LARAMIE, DANTSYS, and MCNP is available elsewhere. It is also assumed that the reader is familiar with the Unix operating system and with Motif widgets and their look and feel. However, a brief description of Motif and how one interacts with it can be found in Appendix A.

  12. Engineering computer graphics in gas turbine engine design, analysis and manufacture

    Science.gov (United States)

    Lopatka, R. S.

    1975-01-01

    A time-sharing and computer graphics facility designed to provide effective interactive tools to a large number of engineering users with varied requirements is described. The application of computer graphics displays at several levels of hardware complexity and capability is discussed, with examples of graphics systems tracing gas turbine product development from preliminary design through manufacture. Highlights of an operating system stylized for interactive engineering graphics are described.

  13. Exploring spatial data representation with dynamic graphics

    Science.gov (United States)

    Dykes, Jason A.

    1997-05-01

    Dynamic mapping capabilities are providing enormous potential for visualizing spatial data. Dynamic maps which exhibit observer-related behaviour are particularly appropriate for exploratory analysis, where multiple, short-term, slightly different, views of a data set, each produced with a specific task or question in mind, are an essential part of the analytical process. This paper and the associated coloured and dynamic illustrations take advantage of World Wide Web (WWW) delivery and the digital medium by using interactive graphics to introduce an approach to dynamic cartography based upon the Tcl/Tk graphical user interface (GUI) builder. Generic ways of programming observer-related behaviour, such as brushing, dynamic re-expression, and dynamic comparison, are outlined and demonstrated to show that specialist dynamic views can be developed rapidly in an open, flexible, and high-level graphic environment. Such an approach provides opportunities to reinforce traditional cartographic and statistical representations of spatial data with dynamic graphics and transient symbolism which give supplementary information about a symbol or statistic on demand. A series of examples from recent work which uses the approach demonstrates ways in which dynamic graphics can be effective in complementing methods of measurement and mapping which are well established in geographic enquiry.
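
    The paper programs observer-related behaviour directly in Tcl/Tk; a roughly analogous effect can be obtained from R through its bundled tcltk interface. The sketch below is a generic illustration (not the author's code) of "dynamic re-expression": a slider widget re-draws a scatterplot under a user-controlled power transform, with all data and the transform chosen purely for demonstration.

```r
# Generic illustration of dynamic re-expression using R's bundled tcltk interface:
# a Tk slider re-draws a scatterplot under a user-controlled power transform.
library(tcltk)

set.seed(1)
x <- runif(200, 1, 100)                       # hypothetical attribute values for 200 zones
y <- x^0.7 * exp(rnorm(200, sd = 0.2))

power <- tclVar(1)                            # Tcl variable bound to the slider

redraw <- function(...) {
  p <- as.numeric(tclvalue(power))
  plot(x^p, y, xlab = sprintf("x^%.2f", p), ylab = "y",
       main = "Dynamic re-expression")
}

win <- tktoplevel()
tkwm.title(win, "Re-expression control")
tkpack(tklabel(win, text = "Power transform exponent"))
tkpack(tkscale(win, from = 0.1, to = 2, resolution = 0.05,
               orient = "horizontal", variable = power, command = redraw))
```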

  14. Quasi-Graphic Matroids (retracted)

    NARCIS (Netherlands)

    J. Geelen (Jim); A.M.H. Gerards (Bert); G. Whittle (Geoff)

    2017-01-01

    textabstractFrame matroids and lifted-graphic matroids are two interesting generalizations of graphic matroids. Here, we introduce a new generalization, quasi-graphic matroids, that unifies these two existing classes. Unlike frame matroids and lifted-graphic matroids, it is easy to certify that a

  15. Graphic Simulation of a Jackson Network.

    Science.gov (United States)

    1986-09-01

    ... Z, which are algebraic combinations of other point estimates. For example, the quantity W1 is estimated using the quotient L1'/R. ... JACKQUE will run on IBM-compatible machines. The programming language: JACKQUE is written in IBM Advanced BASIC (BASICA). This language is provided with PC... BASICA must be loaded from DOS, and the machine must be configured with a color graphics adapter. Program steps include printing the title screen and loading user-defined ...

  16. Mathematical structures for computer graphics

    CERN Document Server

    Janke, Steven J

    2014-01-01

    A comprehensive exploration of the mathematics behind the modeling and rendering of computer graphics scenes Mathematical Structures for Computer Graphics presents an accessible and intuitive approach to the mathematical ideas and techniques necessary for two- and three-dimensional computer graphics. Focusing on the significant mathematical results, the book establishes key algorithms used to build complex graphics scenes. Written for readers with various levels of mathematical background, the book develops a solid foundation for graphics techniques and fills in relevant grap

  17. FAST User Guide

    Science.gov (United States)

    Walatka, Pamela P.; Clucas, Jean; McCabe, R. Kevin; Plessel, Todd; Potter, R.; Cooper, D. M. (Technical Monitor)

    1994-01-01

    The Flow Analysis Software Toolkit, FAST, is a software environment for visualizing data. FAST is a collection of separate programs (modules) that run simultaneously and allow the user to examine the results of numerical and experimental simulations. The user can load data files, perform calculations on the data, visualize the results of these calculations, construct scenes of 3D graphical objects, and plot, animate and record the scenes. Computational Fluid Dynamics (CFD) visualization is the primary intended use of FAST, but FAST can also assist in the analysis of other types of data. FAST combines the capabilities of such programs as PLOT3D, RIP, SURF, and GAS into one environment with modules that share data. Sharing data between modules eliminates the drudgery of transferring data between programs. All the modules in the FAST environment have a consistent, highly interactive graphical user interface. Most commands are entered by pointing and clicking. The modular construction of FAST makes it flexible and extensible. The environment can be custom configured and new modules can be developed and added as needed. The following modules have been developed for FAST: VIEWER, FILE IO, CALCULATOR, SURFER, TOPOLOGY, PLOTTER, TITLER, TRACER, ARCGRAPH, GQ, SURFERU, SHOTET, and ISOLEVU. A utility is also included to make the inclusion of user defined modules in the FAST environment easy. The VIEWER module is the central control for the FAST environment. From VIEWER, the user can change object attributes, interactively position objects in three-dimensional space, define and save scenes, create animations, spawn new FAST modules, add additional view windows, and save and execute command scripts. The FAST User Guide uses text and FAST MAPS (graphical representations of the entire user interface) to guide the user through the use of FAST. Chapters include: Maps, Overview, Tips, Getting Started Tutorial, a separate chapter for each module, file formats, and system

  18. XTV users guide

    Energy Technology Data Exchange (ETDEWEB)

    Dearing, J.F.; Johns, R.C. [Los Alamos National Lab., NM (United States). Technology and Safety Assessment Div.

    1996-09-01

    XTV is an X-Windows based Graphical User Interface for viewing results of Transient Reactor Analysis Code (TRAC) calculations. It provides static and animated color mapped visualizations of both thermal-hydraulic and heat conduction components in a TRAC model of a nuclear power plant, as well as both on-screen and hard copy two-dimensional plot capabilities. XTV is the successor to TRAP, the former TRAC postprocessor using the proprietary DISSPLA graphics library. This manual describes Version 2.0, which requires TRAC version 5.4.20 or later for full visualization capabilities.

  19. Introduction to regression graphics

    CERN Document Server

    Cook, R Dennis

    2009-01-01

    Covers the use of dynamic and interactive computer graphics in linear regression analysis, focusing on analytical graphics. Features new techniques like plot rotation. The authors have composed their own regression code, using Xlisp-Stat language called R-code, which is a nearly complete system for linear regression analysis and can be utilized as the main computer program in a linear regression course. The accompanying disks, for both Macintosh and Windows computers, contain the R-code and Xlisp-Stat. An Instructor's Manual presenting detailed solutions to all the problems in the book is ava

  20. Graphics and Animation on iOS A Beginner's Guide to Core Graphics and Core Animation

    CERN Document Server

    Nahavandipoor, Vandad

    2011-01-01

    Jazz up your iPhone and iPad apps with some slick graphics and animation, and keep users from looking elsewhere. This short and concise book shows developers with even little Cocoa programming experience how to create impressive graphics and animation effects with relatively easy coding. Learn how to incorporate smooth animations and draw images in your apps to achieve the classy look you want. The recipes in this book include step-by-step instructions and simple code solutions that you can put to work right away. Learn basic concepts for adapting to different screen sizes; construct, set, and

  1. Mathematical Graphic Organizers

    Science.gov (United States)

    Zollman, Alan

    2009-01-01

    As part of a math-science partnership, a university mathematics educator and ten elementary school teachers developed a novel approach to mathematical problem solving derived from research on reading and writing pedagogy. Specifically, research indicates that students who use graphic organizers to arrange their ideas improve their comprehension…

  2. Comics & Graphic Novels

    Science.gov (United States)

    Cleaver, Samantha

    2008-01-01

    Not so many years ago, comic books in school were considered the enemy. Students caught sneaking comics between the pages of bulky--and less engaging--textbooks were likely sent to the principal. Today, however, comics, including classics such as "Superman" but also their generally more complex, nuanced cousins, graphic novels, are not only…

  3. Graphics Conference Calendar

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    1. The 13th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision'2005, University of West Bohemia, Campus-Bory Plzen (very close to Prague, the capital of the Czech Republic)Czech Republic, January 31 - February 4, 2005. http://wscg.zcu.cz, skala@kiv.zcu.cz

  4. Graphic Novels: A Roundup.

    Science.gov (United States)

    Kan, Katherine L.

    1994-01-01

    Reviews graphic novels for young adults, including five titles from "The Adventures of Tintin," a French series that often uses ethnic and racial stereotypes which reflect the time in which they were published, and "Wolverine," a Marvel comic character adventure. (Contains six references.) (LRW)

  5. The WebACS - An Accessible Graphical Editor.

    Science.gov (United States)

    Parker, Stefan; Nussbaum, Gerhard; Pölzer, Stephan

    2017-01-01

    This paper is about solving accessibility problems encountered when implementing a graphical editor; a major challenge is ensuring that blind and vision-impaired users can comprehend the relationships between graphical components. In this case the HTML5 canvas and JavaScript were used. Accessibility was achieved by implementing a list view of elements, which also enhances the usability of the editor.

  6. Graphics gems V (Macintosh version)

    CERN Document Server

    Paeth, Alan W

    1995-01-01

    Graphics Gems V is the newest volume in The Graphics Gems Series. It is intended to provide the graphics community with a set of practical tools for implementing new ideas and techniques, and to offer working solutions to real programming problems. These tools are written by a wide variety of graphics programmers from industry, academia, and research. The books in the series have become essential, time-saving tools for many programmers. Latest collection of graphics tips in The Graphics Gems Series written by the leading programmers in the field. Contains over 50 new gems displaying some of t

  7. MPGT - THE MISSION PLANNING GRAPHICAL TOOL

    Science.gov (United States)

    Jeletic, J. F.

    1994-01-01

    The Mission Planning Graphical Tool (MPGT) provides mission analysts with a mouse driven graphical representation of the spacecraft and environment data used in spaceflight planning. Developed by the Flight Dynamics Division at NASA's Goddard Space Flight Center, MPGT is designed to be a generic tool that can be configured to analyze any specified earth orbiting spacecraft mission. The data is presented as a series of overlays on top of a 2-dimensional or 3-dimensional projection of the earth. Up to six spacecraft orbit tracks can be drawn at one time. Position data can be obtained by either an analytical process or by use of ephemeris files. If the user chooses to propagate the spacecraft orbit using an ephemeris file, then Goddard Trajectory Determination System (GTDS) formatted ephemeris files must be supplied. The MPGT User's Guide provides a complete description of the GTDS ephemeris file format so that users can create their own. Other overlays included are ground station antenna masks, solar and lunar ephemeris, Tracking Data and Relay Satellite System (TDRSS) coverage, a field-of-view swath, and orbit number. From these graphical representations an analyst can determine such spacecraft-related constraints as communication coverage, interference zone infringement, sunlight availability, and instrument target visibility. The presentation of time and geometric data as graphical overlays on a world map makes possible quick analyses of trends and time-oriented parameters. For instance, MPGT can display the propagation of the position of the Sun and Moon over time, shadowing of sunrise/sunset terminators to indicate spacecraft and Earth day/night, and color coding of the spacecraft orbit tracks to indicate spacecraft day/night. With the 3-dimensional display, the user specifies a vector that represents the position in the universe from which the user wishes to view the earth. From these "viewpoint" parameters the user can zoom in on or rotate around the earth

  8. Hardware accelerated computer graphics algorithms

    OpenAIRE

    Rhodes, DT

    2008-01-01

    The advent of shaders in the latest generations of graphics hardware, which has made consumer level graphics hardware partially programmable, makes now an ideal time to investigate new graphical techniques and algorithms as well as attempting to improve upon existing ones. This work looks at areas of current interest within the graphics community such as Texture Filtering, Bump Mapping and Depth of Field simulation. These are all areas which have enjoyed much interest over the history of comp...

  9. Space Spurred Computer Graphics

    Science.gov (United States)

    1983-01-01

    Dicomed Corporation was asked by NASA in the early 1970s to develop processing capabilities for recording images sent from Mars by Viking spacecraft. The company produced a film recorder which increased the intensity levels and the capability for color recording. This development led to a strong technology base resulting in sophisticated computer graphics equipment. Dicomed systems are used to record CAD (computer aided design) and CAM (computer aided manufacturing) equipment, to update maps and produce computer generated animation.

  10. The IGUANA Interactive Graphics TOolkit with Examples from CMS and D0

    Institute of Scientific and Technical Information of China (English)

    GeorgeAlverson; IannaOsborne; 等

    2001-01-01

    IGUANA (Interactive Graphics for User ANAlysis) is a C++ toolkit for developing graphical user interfaces and high performance 2-D and 3-D graphics applications, such as data browsers and detector and event visualisation programs. The IGUANA strategy is to use freely available software (e.g. Qt, SoQt, OpenInventor, OpenGL, HEPVis) and package and extend it to provide a general-purpose and experiment-independent toolkit. We describe the evaluation and choices of publicly available GUI/graphics software and the additional functionality currently provided by IGUANA. We demonstrate the use of IGUANA with several applications built for CMS and D0.

  11. Graphic engine resource management

    Science.gov (United States)

    Bautin, Mikhail; Dwarakinath, Ashok; Chiueh, Tzi-cker

    2008-01-01

    Modern consumer-grade 3D graphic cards boast a computation/memory resource that can easily rival or even exceed that of standard desktop PCs. Although these cards are mainly designed for 3D gaming applications, their enormous computational power has attracted developers to port an increasing number of scientific computation programs to these cards, including matrix computation, collision detection, cryptography, database sorting, etc. As more and more applications run on 3D graphic cards, there is a need to allocate the computation/memory resource on these cards among the sharing applications more fairly and efficiently. In this paper, we describe the design, implementation and evaluation of a Graphic Processing Unit (GPU) scheduler based on Deficit Round Robin scheduling that successfully allocates to every process an equal share of the GPU time regardless of their demand. This scheduler, called GERM, estimates the execution time of each GPU command group based on dynamically collected statistics, and controls each process's GPU command production rate through its CPU scheduling priority. Measurements on the first GERM prototype show that this approach can keep the maximal GPU time consumption difference among concurrent GPU processes consistently below 5% for a variety of application mixes.
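
    To illustrate the scheduling policy the abstract describes, the sketch below implements a generic deficit round robin allocator in R (chosen for consistency with the rest of this collection); it is not GERM's code, and the queues, costs, and quantum are hypothetical.

```r
# Generic sketch of the deficit round robin (DRR) policy GERM is based on.
# Each process has a queue of GPU command groups with estimated costs; every
# round a quantum is added to its deficit counter, and a command group is
# dispatched only when its cost fits within the accumulated deficit.
drr_schedule <- function(queues, quantum = 10, rounds = 50) {
  deficit <- setNames(numeric(length(queues)), names(queues))
  served  <- setNames(numeric(length(queues)), names(queues))
  for (r in seq_len(rounds)) {
    for (p in names(queues)) {
      if (length(queues[[p]]) == 0) next
      deficit[p] <- deficit[p] + quantum
      while (length(queues[[p]]) > 0 && queues[[p]][1] <= deficit[p]) {
        cost <- queues[[p]][1]
        deficit[p] <- deficit[p] - cost
        served[p]  <- served[p] + cost
        queues[[p]] <- queues[[p]][-1]
      }
    }
  }
  served          # total GPU time granted to each process
}

# Two hypothetical processes: one issuing cheap command groups, one issuing expensive ones.
# Both end up with roughly equal GPU time per round, regardless of per-command cost.
drr_schedule(list(light = rep(2, 200), heavy = rep(15, 200)), quantum = 10)
```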

  12. Graphical Facility Information Center (GraFIC™)

    Energy Technology Data Exchange (ETDEWEB)

    Dunigan, J.J.; Gaby, J.E.; Hickerson, T.W.; Miller, M.A. [and others]

    1997-07-01

    The Graphical Facility Information Center (GraFIC™) is an information system that provides an inexpensive and flexible method of remotely verifying complete "up-to-the-minute" inventory status of stored items and facility assets. In addition, GraFIC™ provides features needed for day-to-day management of storage and other facilities. GraFIC™ combines an easy-to-use graphical user interface with extensive online help so that users need little training. GraFIC™ can be configured to work with most sensor systems used to monitor facility assets.

  13. Selecting Mangas and Graphic Novels

    Science.gov (United States)

    Nylund, Carol

    2007-01-01

    The decision to add graphic novels, and particularly the Japanese style called manga, was one the author has debated for a long time. In this article, the author shares her experience when she purchased graphic novels and mangas to add to her library collection. She shares how graphic novels and mangas have revitalized the library.

  15. TIA Software User's Manual

    Science.gov (United States)

    Cramer, K. Elliott; Syed, Hazari I.

    1995-01-01

    This user's manual describes the installation and operation of TIA, the Thermal-Imaging acquisition and processing Application, developed by the Nondestructive Evaluation Sciences Branch at NASA Langley Research Center, Hampton, Virginia. TIA is a user-friendly graphical interface application for the Macintosh II and higher series computers. The software has been developed to interface with the Perceptics/Westinghouse Pixelpipe(TM) and PixelStore(TM) NuBus cards and the GW Instruments MacADIOS(TM) input-output (I/O) card for the Macintosh for imaging thermal data. The software is also capable of performing generic image-processing functions.

  16. User Interface History

    DEFF Research Database (Denmark)

    Jørgensen, Anker Helms; Myers, Brad A

    2008-01-01

    User Interfaces have been around as long as computers have existed, even well before the field of Human-Computer Interaction was established. Over the years, some papers on the history of Human-Computer Interaction and User Interfaces have appeared, primarily focusing on the graphical interface era and early visionaries such as Bush, Engelbart and Kay. With the User Interface being a decisive factor in the proliferation of computers in society and since it has become a cultural phenomenon, it is time to paint a more comprehensive picture of its history. This SIG will investigate the possibilities of launching a concerted effort towards creating a History of User Interfaces.

  17. The Case for Graphic Novels

    Directory of Open Access Journals (Sweden)

    Steven Hoover

    2012-04-01

    Full Text Available Many libraries and librarians have embraced graphic novels. A number of books, articles, and presentations have focused on the history of the medium and offered advice on building and maintaining collections, but very little attention has been given to the question of how to integrate graphic novels into a library’s instructional efforts. This paper will explore the characteristics of graphic novels that make them a valuable resource for librarians who focus on research and information literacy instruction, identify skills and competencies that can be taught by the study of graphic novels, and will provide specific examples of how to incorporate graphic novels into instruction.

  18. Graphics Gems III IBM version

    CERN Document Server

    Kirk, David

    1994-01-01

    This sequel to Graphics Gems (Academic Press, 1990) and Graphics Gems II (Academic Press, 1991) is a practical collection of computer graphics programming tools and techniques. Graphics Gems III contains a larger percentage of gems related to modeling and rendering, particularly lighting and shading. This new edition also covers image processing, numerical and programming techniques, modeling and transformations, 2D and 3D geometry and algorithms, ray tracing and radiosity, rendering, and more clever new tools and tricks for graphics programming. Volume III also includes a

  19. U.S. Geological Survey groundwater toolbox, a graphical and mapping interface for analysis of hydrologic data (version 1.0): user guide for estimation of base flow, runoff, and groundwater recharge from streamflow data

    Science.gov (United States)

    Barlow, Paul M.; Cunningham, William L.; Zhai, Tong; Gray, Mark

    2015-01-01

    This report is a user guide for the streamflow-hydrograph analysis methods provided with version 1.0 of the U.S. Geological Survey (USGS) Groundwater Toolbox computer program. These include six hydrograph-separation methods to determine the groundwater-discharge (base-flow) and surface-runoff components of streamflow—the Base-Flow Index (BFI; Standard and Modified), HYSEP (Fixed Interval, Sliding Interval, and Local Minimum), and PART methods—and the RORA recession-curve displacement method and associated RECESS program to estimate groundwater recharge from streamflow data. The Groundwater Toolbox is a customized interface built on the nonproprietary, open source MapWindow geographic information system software. The program provides graphing, mapping, and analysis capabilities in a Microsoft Windows computing environment. In addition to the four hydrograph-analysis methods, the Groundwater Toolbox allows for the retrieval of hydrologic time-series data (streamflow, groundwater levels, and precipitation) from the USGS National Water Information System, downloading of a suite of preprocessed geographic information system coverages and meteorological data from the National Oceanic and Atmospheric Administration National Climatic Data Center, and analysis of data with several preprocessing and postprocessing utilities. With its data retrieval and analysis tools, the Groundwater Toolbox provides methods to estimate many of the components of the water budget for a hydrologic basin, including precipitation; streamflow; base flow; runoff; groundwater recharge; and total, groundwater, and near-surface evapotranspiration.

  20. What is Graphic Justice?

    Directory of Open Access Journals (Sweden)

    Thomas Giddens

    2016-12-01

    Full Text Available This article reproduces a poster presented at the Socio-Legal Studies Association annual conference, 5–7 April 2016 at Lancaster University, UK. The poster outlines the emerging study of the legal and jurisprudential dimensions of comics. Seeking to answer the question ‘what is graphic justice?’, the poster highlights the variety of potential topics, questions, concerns, issues, and intersections that the crossover between law and comics might encounter. A transcript of the poster’s text is provided for easier reuse, as well as a list of references and suggested readings.