WorldWideScience

Sample records for graphical analysis software

  1. Gamma camera image processing and graphical analysis interactive software system

    International Nuclear Information System (INIS)

    Wang Zhiqian; Chen Yongming; Ding Ailian; Ling Zhiye; Jin Yongjie

    1992-01-01

    GCCS is a dedicated interactive software system for gamma camera image processing and graphical analysis. It is mainly used to analyse patient data acquired from a gamma camera. The system runs on the IBM PC, PC/XT or PC/AT and consists of several parts: system management, data management, device management, a program package and user programs. It provides two kinds of user interface: command menus and command characters. Because it is highly modularized, the system is easy to modify and extend. The user programs include almost all clinical protocols in current use.

  2. Software Graphical User Interface For Analysis Of Images

    Science.gov (United States)

    Leonard, Desiree M.; Nolf, Scott R.; Avis, Elizabeth L.; Stacy, Kathryn

    1992-01-01

    CAMTOOL software provides graphical interface between Sun Microsystems workstation and Eikonix Model 1412 digitizing camera system. Camera scans and digitizes images, halftones, reflectives, transmissives, rigid or flexible flat material, or three-dimensional objects. Users digitize images and select from three destinations: work-station display screen, magnetic-tape drive, or hard disk. Written in C.

  3. Explicet: graphical user interface software for metadata-driven management, analysis and visualization of microbiome data.

    Science.gov (United States)

    Robertson, Charles E; Harris, J Kirk; Wagner, Brandie D; Granger, David; Browne, Kathy; Tatem, Beth; Feazel, Leah M; Park, Kristin; Pace, Norman R; Frank, Daniel N

    2013-12-01

    Studies of the human microbiome, and microbial community ecology in general, have blossomed of late and are now a burgeoning source of exciting research findings. Along with the advent of next-generation sequencing platforms, which have dramatically increased the scope of microbiome-related projects, several high-performance sequence analysis pipelines (e.g. QIIME, MOTHUR, VAMPS) are now available to investigators for microbiome analysis. The subject of our manuscript, the graphical user interface-based Explicet software package, fills a previously unmet need for a robust, yet intuitive means of integrating the outputs of the software pipelines with user-specified metadata and then visualizing the combined data.

  4. Software for graphic display systems

    International Nuclear Information System (INIS)

    Karlov, A.A.

    1978-01-01

    In this paper some aspects of graphic display systems are discussed. The design of a display subroutine library is described, with an example, and graphic dialogue software is considered primarily from the point of view of the programmer who uses a high-level language. (Auth.)

  5. Configurable software for satellite graphics

    Energy Technology Data Exchange (ETDEWEB)

    Hartzman, P D

    1977-12-01

    An important goal in interactive computer graphics is to provide users with both quick system responses for basic graphics functions and enough computing power for complex calculations. One solution is to have a distributed graphics system in which a minicomputer and a powerful large computer share the work. The most versatile type of distributed system is an intelligent satellite system in which the minicomputer is programmable by the application user and can do most of the work while the large remote machine is used for difficult computations. At New York University, the hardware was configured from available equipment. The level of system intelligence resulted almost completely from software development. Unlike previous work with intelligent satellites, the resulting system had system control centered in the satellite. It also had the ability to reconfigure software during real-time operation. The design of the system was done at a very high level using set theoretic language. The specification clearly illustrated processor boundaries and interfaces. The high-level specification also produced a compact, machine-independent virtual graphics data structure for picture representation. The software was written in a systems implementation language; thus, only one set of programs was needed for both machines. A user can program both machines in a single language. Tests of the system with an application program indicate that it has very high potential. A major result of this work is the demonstration that a gigantic investment in new hardware is not necessary for computing facilities interested in graphics.

  6. Collection Of Software For Computer Graphics

    Science.gov (United States)

    Hibbard, Eric A.; Makatura, George

    1990-01-01

    Ames Research Graphics System (ARCGRAPH) collection of software libraries and software utilities assisting researchers in generating, manipulating, and visualizing graphical data. Defines metafile format containing device-independent graphical data. File format used with various computer-graphics-manipulation and -animation software packages at Ames, including SURF (COSMIC Program ARC-12381) and GAS (COSMIC Program ARC-12379). Consists of two-stage "pipeline" used to output graphical primitives. ARCGRAPH libraries developed on VAX computer running VMS.

  7. Trend Monitoring System (TMS) graphics software

    Science.gov (United States)

    Brown, J. S.

    1979-01-01

    A prototype bus communications system, which is being used to support the Trend Monitoring System (TMS) and to evaluate the bus concept, is considered. A set of FORTRAN-callable graphics subroutines for the host MODCOMP computer, and an approach to splitting graphics work between the host and the system's intelligent graphics terminals, are described. The graphics software in the MODCOMP and the operating software package written for the graphics terminals are included.

  8. Programming Language Software For Graphics Applications

    Science.gov (United States)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.

  9. Free, cross-platform gRaphical software

    DEFF Research Database (Denmark)

    Dethlefsen, Claus

    2006-01-01

    -recursive graphical models, and models defined using the BUGS language. Today, there exists a wide range of packages to support the analysis of data using graphical models. Here, we focus on Open Source software, making it possible to extend the functionality by integrating these packages into more general tools. We will attempt to give an overview of the available Open Source software, with focus on the gR project. This project was launched in 2002 to make facilities in R for graphical modelling. Several R packages have been developed within the gR project both for display and analysis of graphical models...

  10. Graphical modelling software in R - status

    DEFF Research Database (Denmark)

    Dethlefsen, Claus; Højsgaard, Søren; Lauritzen, Steffen L

    2007-01-01

    Graphical models in their modern form have been around for nearly a quarter of a century. Various computer programs for inference in graphical models have been developed over that period. Some examples of free software programs are BUGS (Thomas 1994), CoCo (Badsberg 2001), Digram (Klein, Keiding, and Kreiner 1995), MIM (Edwards 2000), and Tetrad (Glymour, Scheines, Spirtes, and Kelley 1987). The gR initiative (Lauritzen 2002) aims at making graphical models available in R (R Development Core Team 2006). A small grant from the Danish Science Foundation supported this initiative. We will summarize the results of the initiative so far. Specifically we will illustrate some of the R packages for graphical modelling currently on CRAN and discuss their strengths and weaknesses....

  11. The graphics software of the Saclay Linear Accelerator control system

    International Nuclear Information System (INIS)

    Gournay, J.F.

    1988-01-01

    The graphics software used for the control of the Saclay Linear Accelerator is described. The specific requirements that such software must meet in this environment are outlined and some typical applications are presented. (orig.)

  12. The graphics software of the Saclay linear accelerator control system

    International Nuclear Information System (INIS)

    Gournay, J.F.

    1987-06-01

    The control system of the Saclay Linear Accelerator is based upon modern hardware technology. In the graphics software, pictures are created in exactly the same manner for all the graphic devices supported by the system. The information used to draw a picture is stored in an array called a graphic segment. Three output primitives are used to add graphic material to a segment. Three coordinate systems are defined.

  13. TRITON: graphic software for rational engineering of enzymes.

    Science.gov (United States)

    Damboský, J; Prokop, M; Koca, J

    2001-01-01

    Engineering of the catalytic properties of enzymes requires knowledge about amino acid residues interacting with the transition state of the substrate. TRITON is a graphic software package for modelling enzymatic reactions, for the analysis of essential interactions between the enzyme and its substrate, and for in silico construction of protein mutants. The reactions are modelled using semi-empirical quantum-mechanic methods and the protein mutants are constructed by homology modelling. The users are guided through the calculation and data analysis by wizards.

  14. General-Purpose Software For Computer Graphics

    Science.gov (United States)

    Rogers, Joseph E.

    1992-01-01

    NASA Device Independent Graphics Library (NASADIG) is a general-purpose computer-graphics package for computer-based engineering and management applications which provides the opportunity to translate data into effective graphical displays for presentation. Features include two- and three-dimensional plotting, spline and polynomial interpolation, control of blanking of areas, multiple log and/or linear axes, control of legends and text, control of thicknesses of curves, and multiple text fonts. Included are subroutines for definition of areas and axes of plots; setup and display of text; blanking of areas; setup of style, interpolation, and plotting of lines; control of patterns and of shading of colors; control of legends, blocks of text, and characters; initialization of devices; and setting of mixed alphabets. Written in FORTRAN 77.

  15. IMAGE information monitoring and applied graphics software environment. Volume 2. Software description

    International Nuclear Information System (INIS)

    Hallam, J.W.; Ng, K.B.; Upham, G.L.

    1986-09-01

    The EPRI Information Monitoring and Applied Graphics Environment (IMAGE) system is designed for 'fast prototyping' of advanced concepts for computer-aided plant operations tools. It is a flexible software system which can be used for rapidly creating, dynamically driving and evaluating advanced operator aid displays. The software is written to be both host computer and graphic device independent.

  16. Graphic software ''MiniG'' for the Mini-6

    International Nuclear Information System (INIS)

    Zen, J.

    1984-06-01

    MiniG is a set of subprograms, written in Fortran and intended for graphics applications in nuclear physics (histograms or scatter plots). It includes three axis-scale representation modes (linear, semi-log and square root), five types of vectors and numerous graphic symbols for spectrum representation with or without annotation (circle, cross, arrow, triangle, spiral, etc.). It also offers the capabilities of the Tektronix ''Plot-10'' software, and accepts all the types of SATD graphic terminals connected to the Mini-6.

  17. Graphics based PC analysis of alpha spectra

    International Nuclear Information System (INIS)

    Chapman, T.C.

    1991-01-01

    New personal computer (PC) software performs interactive analysis of alpha spectra using EGA graphics. Spectra are collected with a commercial MCA board and analyzed using the software described here. The operator is required to approve each peak integration area before analysis proceeds. Sample analysis can use detector efficiencies or spike yields or both. Background corrections are made and upper limit values are calculated when specified. Nuclide identification uses a library of up to 64 nuclides with up to 8 alpha lines for each nuclide. Any one of 32 subset libraries can be used in an analysis. Analysis time is short and is limited by interaction with the operator, not by calculation time. Utilities include nuclide library editing, library subset editing, energy calibration, efficiency calibration, and background update.
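
    The analysis steps named above (background correction, detector efficiency, spike yield) can be pictured with a short Python sketch; the routine and all numbers are illustrative assumptions, not the reviewed PC software.

        # Illustrative activity calculation of the kind the abstract describes:
        # an operator-approved, background-corrected peak integral is converted
        # to activity using the detector efficiency and chemical (spike) yield.

        def activity_bq(net_counts, live_time_s, efficiency, chemical_yield):
            """Activity in Bq from a background-corrected alpha peak."""
            return net_counts / (live_time_s * efficiency * chemical_yield)

        # Example: 1500 net counts in 60000 s at 25% efficiency, 80% yield.
        print(activity_bq(1500, 60000, 0.25, 0.80))  # -> 0.125 Bq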

  18. A graphical simulation software for instruction in cardiovascular mechanics physiology

    Directory of Open Access Journals (Sweden)

    Wenger Roland H

    2011-01-01

    Background: Computer-supported, interactive e-learning systems are widely used in the teaching of physiology. However, the currently available complementary software tools in the field of the physiology of cardiovascular mechanics have not yet been adapted to the latest systems software. Therefore, a simple-to-use replacement for undergraduate and graduate students' education was needed, including up-to-date graphical software that is validated and field-tested. Methods: Software compatible with Windows, based on modified versions of existing mathematical algorithms, has been newly developed. Testing was performed during a full term of physiological lecturing to medical and biology students. Results: The newly developed CLabUZH software models a reduced human cardiovascular loop containing all basic compartments: an isolated heart including an artificial electrical stimulator, main vessels and the peripheral resistive components. Students can alter several physiological parameters interactively. The resulting output variables are printed in x-y diagrams and in addition shown in an animated, graphical model. CLabUZH offers insight into the relations of volume, pressure and time dependency in the circulation and their correlation to the electrocardiogram (ECG). Established mechanisms such as the Frank-Starling Law or the Windkessel Effect are considered in this model. The CLabUZH software is self-contained with no extra installation required and runs on most of today's personal computer systems. Conclusions: CLabUZH is a user-friendly interactive computer programme that has proved to be useful in teaching the basic physiological principles of heart mechanics.
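
    As a hint of the kind of model behind the Windkessel Effect mentioned above, here is a minimal two-element Windkessel sketch in Python; the parameter values and the forward-Euler integration are illustrative assumptions, not taken from CLabUZH.

        import math

        # Two-element Windkessel: arterial pressure P driven by pulsatile
        # inflow Q(t), with peripheral resistance R and compliance C.
        R, C = 1.0, 1.5      # mmHg*s/mL and mL/mmHg (textbook-scale values)
        T, dt = 0.8, 0.001   # heart period and time step, s
        P = 80.0             # initial pressure, mmHg

        for step in range(int(5 * T / dt)):              # five beats
            t = (step * dt) % T
            Q = 300.0 * math.sin(math.pi * t / 0.3) if t < 0.3 else 0.0
            P += (Q / C - P / (R * C)) * dt              # dP/dt = Q/C - P/(RC)

        print(round(P, 1))   # pressure at the end of the last diastole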

  19. Development of virtual hands using animation software and graphical modelling

    International Nuclear Information System (INIS)

    Oliveira, Erick da S.; Junior, Alberico B. de C.

    2016-01-01

    Numerical dosimetry uses virtual anthropomorphic simulators to represent the human being in a computational framework and thus assess the risks associated with exposure to a radioactive source. The development of computer animation software has made building these simulators easier: knowledge of human anatomy suffices to prepare various types of simulators (man, woman, child and baby) in various positions (sitting, standing, running) or parts thereof (head, trunk and limbs). The simulators are constructed by manipulating meshes, and thanks to the versatility of the method one can create irradiation geometries that were not possible before. In this work, we have built a scenario of a radiopharmaceutical worker manipulating radioactive material, using animation software, graphical modelling and an anatomical database. (author)

  20. Interactive graphics for the Macintosh: software review of FlexiGraphs.

    Science.gov (United States)

    Antonak, R F

    1990-01-01

    While this product is clearly unique, its usefulness to individuals outside small business environments is somewhat limited. FlexiGraphs is, however, a reasonable first attempt to design a microcomputer software package that controls data through interactive editing within a graph. Although the graphics capabilities of mainframe programs such as MINITAB (Ryan, Joiner, & Ryan, 1981) and the graphic manipulations available through exploratory data analysis (e.g., Velleman & Hoaglin, 1981) will not be surpassed anytime soon by this program, a researcher may want to add this program to a software library containing other Macintosh statistics, drawing, and graphics programs if only to obtain the easy-to-use curve-fitting and line-smoothing options. I welcome the opportunity to review the enhanced "scientific" version of FlexiGraphs that the author of the program indicates is currently under development. An MS-DOS version of the program should be available within the year.

  1. Software Design for Interactive Graphic Radiation Treatment Simulation Systems*

    Science.gov (United States)

    Kalet, Ira J.; Sweeney, Christine; Jacky, Jonathan

    1990-01-01

    We examine issues in the design of interactive computer graphic simulation programs for radiation treatment planning (RTP), as well as expert system programs that automate parts of the RTP process, in light of ten years of experience at designing, building and using such programs. An experiment in object-oriented design using standard Pascal shows that while some advantage is gained from the design, it is still difficult to achieve modularity and to integrate expert system components. A new design based on the Common LISP Object System (CLOS) is described. This series of designs for RTP software shows that this application benefits in specific ways from object-oriented design methods and appropriate languages and tools.

  2. Codesign Analysis of a Computer Graphics Application

    DEFF Research Database (Denmark)

    Madsen, Jan; Brage, Jens P.

    1996-01-01

    This paper describes a codesign case study where a computer graphics application is examined with the intention to speed up its execution. The application is specified as a C program, and is characterized by the lack of a simple compute-intensive kernel. The hardware/software partitioning is based...

  3. Use of Cloud-Based Graphic Narrative Software in Medical Ethics Teaching

    Science.gov (United States)

    Weber, Alan S.

    2015-01-01

    Although used as a common pedagogical tool in K-12 education, online graphic narrative ("comics") software has not generally been incorporated into advanced professional or technical education. This contribution reports preliminary data from a study on the use of cloud-based graphics software Pixton.com to teach basic medical ethics…

  4. GPFrontend and GPGraphics: graphical analysis tools for genetic association studies.

    Science.gov (United States)

    Uebe, Steffen; Pasutto, Francesca; Krumbiegel, Mandy; Schanze, Denny; Ekici, Arif B; Reis, André

    2010-09-21

    Most software packages for whole genome association studies are non-graphical, purely text based programs originally designed to run with UNIX-like operating systems. Graphical output is often not intended or supposed to be performed with other command line tools, e.g. gnuplot. Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data. This enables GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate output of different WGA analysis programs, among them also GenePool. Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.

  5. GPFrontend and GPGraphics: graphical analysis tools for genetic association studies

    Directory of Open Access Journals (Sweden)

    Schanze Denny

    2010-09-01

    Background: Most software packages for whole genome association studies are non-graphical, purely text based programs originally designed to run with UNIX-like operating systems. Graphical output is often not intended or supposed to be performed with other command line tools, e.g. gnuplot. Results: Using the Microsoft .NET 2.0 platform and Visual Studio 2005, we have created a graphical software package to analyze data from microarray whole genome association studies, both for a DNA-pooling based approach as well as regular single sample data. Part of this package was made to integrate with GenePool 0.8.2, a previously existing software suite for GNU/Linux systems, which we have modified to run in a Microsoft Windows environment. Further modifications cause it to generate some additional data. This enables GenePool to interact with the .NET parts created by us. The programs we developed are GPFrontend, a graphical user interface and frontend to use GenePool and create metadata files for it, and GPGraphics, a program to further analyze and graphically evaluate output of different WGA analysis programs, among them also GenePool. Conclusions: Our programs enable regular MS Windows users without much experience in bioinformatics to easily visualize whole genome data from a variety of sources.

  6. HAZARD ANALYSIS SOFTWARE

    International Nuclear Information System (INIS)

    Sommer, S; Tinh Tran, T.

    2008-01-01

    Washington Safety Management Solutions, LLC developed web-based software to improve the efficiency and consistency of hazard identification and analysis, control selection and classification, and to standardize analysis reporting at Savannah River Site. In the new nuclear age, information technology provides methods to improve the efficiency of the documented safety analysis development process which includes hazard analysis activities. This software provides a web interface that interacts with a relational database to support analysis, record data, and to ensure reporting consistency. A team of subject matter experts participated in a series of meetings to review the associated processes and procedures for requirements and standard practices. Through these meetings, a set of software requirements was developed and compiled into a requirements traceability matrix from which software could be developed. The software was tested to ensure compliance with the requirements. Training was provided to the hazard analysis leads. Hazard analysis teams using the software have verified its operability. The software has been classified as NQA-1, Level D, as it supports the analysis team but does not perform the analysis. The software can be transported to other sites with alternate risk schemes. The software is being used to support the development of 14 hazard analyses. User responses have been positive with a number of suggestions for improvement which are being incorporated as time permits. The software has enforced a uniform implementation of the site procedures. The software has significantly improved the efficiency and standardization of the hazard analysis process.

  7. SAS and R data management, statistical analysis, and graphics

    CERN Document Server

    Kleinman, Ken

    2009-01-01

    An all-in-one resource for using SAS and R to carry out common tasks, providing a path between languages that is easier than reading complete documentation. SAS and R: Data Management, Statistical Analysis, and Graphics presents an easy way to learn how to perform an analytical task in both SAS and R, without having to navigate through the extensive, idiosyncratic, and sometimes unwieldy software documentation. The book covers many common tasks, such as data management, descriptive summaries, inferential procedures, regression analysis, and the creation of graphics, along with more complex applications…

  8. Formal Analysis of Graphical Security Models

    DEFF Research Database (Denmark)

    Aslanyan, Zaruhi

    , software components and human actors interacting with each other to form so-called socio-technical systems. The importance of socio-technical systems to modern societies requires verifying their security properties formally, while their inherent complexity makes manual analyses impracticable. Graphical models for security offer an unrivalled opportunity to describe socio-technical systems, for they allow to represent different aspects like human behaviour, computation and physical phenomena in an abstract yet uniform manner. Moreover, these models can be assigned a formal semantics, thereby allowing formal verification of their properties. Finally, their appealing graphical notations enable to communicate security concerns in an understandable way also to non-experts, often in charge of the decision making. This dissertation argues that automated techniques can be developed on graphical security...

  9. Graphical Acoustic Liner Design and Analysis Tool

    Science.gov (United States)

    Howerton, Brian M. (Inventor); Jones, Michael G. (Inventor)

    2016-01-01

    An interactive liner design and impedance modeling tool comprises software utilized to design acoustic liners for use in constrained spaces, both regularly and irregularly shaped. A graphical user interface allows the acoustic channel geometry to be drawn in a liner volume while the surface impedance calculations are updated and displayed in real-time. A one-dimensional transmission line model may be used as the basis for the impedance calculations.
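
    The one-dimensional transmission line model named above can be illustrated with a short Python sketch; the lossless rigid-backed-channel formula and the chosen depth are textbook assumptions, not the tool's actual impedance calculation.

        import numpy as np

        # Normalized surface impedance of a rigid-backed channel of depth L
        # under the lossless 1-D transmission line model: zeta = -j*cot(kL).
        c = 343.0                      # speed of sound, m/s
        L = 0.038                      # channel depth, m (illustrative)
        f = np.linspace(500, 3000, 6)  # frequencies, Hz
        k = 2 * np.pi * f / c          # wavenumber

        zeta = -1j / np.tan(k * L)
        for fi, zi in zip(f, zeta):
            print(f"{fi:6.0f} Hz  zeta = {zi.imag:+.2f}j")
        # Resonance (zeta -> 0) falls near f = c/(4L), about 2.26 kHz here.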

  10. Advanced software development workstation project: Engineering scripting language. Graphical editor

    Science.gov (United States)

    1992-01-01

    Software development is widely considered to be a bottleneck in the development of complex systems, both in terms of development and in terms of maintenance of deployed systems. Cost of software development and maintenance can also be very high. One approach to reducing costs and relieving this bottleneck is increasing the reuse of software designs and software components. A method for achieving such reuse is a software parts composition system. Such a system consists of a language for modeling software parts and their interfaces, a catalog of existing parts, an editor for combining parts, and a code generator that takes a specification and generates code for that application in the target language. The Advanced Software Development Workstation is intended to be an expert system shell designed to provide the capabilities of a software part composition system.

  11. Computer graphics in reactor safety analysis

    International Nuclear Information System (INIS)

    Fiala, C.; Kulak, R.F.

    1989-01-01

    This paper describes a family of three computer graphics codes designed to assist the analyst in three areas: the modelling of complex three-dimensional finite element models of reactor structures; the interpretation of computational results; and the reporting of the results of numerical simulations. The purpose and key features of each code are presented. Graphics output from actual safety analyses is used to illustrate the capabilities of each code. 5 refs., 10 figs.

  12. Using R for Data Management, Statistical Analysis, and Graphics

    CERN Document Server

    Horton, Nicholas J

    2010-01-01

    This title offers quick and easy access to key elements of documentation. It includes worked examples across a wide variety of applications, tasks, and graphics. Using R for Data Management, Statistical Analysis, and Graphics presents an easy way to learn how to perform an analytical task in R, without having to navigate through the extensive, idiosyncratic, and sometimes unwieldy software documentation and vast number of add-on packages. Organized by short, clear descriptive entries, the book covers many common tasks, such as data management, descriptive summaries, inferential procedures…

  13. ACHIEVING HIGH INTEGRITY OF PROCESS-CONTROL SOFTWARE BY GRAPHICAL DESIGN AND FORMAL VERIFICATION

    NARCIS (Netherlands)

    HALANG, WA; Kramer, B.J.

    The International Electrotechnical Commission is currently standardising four compatible languages for designing and implementing programmable logic controllers (PLCs). The language family includes a diagrammatic notation that supports the idea of software ICs to encourage graphical design

  14. Office of Education Guide to Graphic Art Software

    Science.gov (United States)

    Davis, Angela M.

    1995-01-01

    During the summer experience in the LARSS program, the author created a performance support system showing the techniques of creating text in Quark XPress, placed the text into Adobe Illustrator along with scanned images, signatures and art work partially created in Adobe Photoshop. The purpose of the project was to familiarize the Office of Education Staff with Graphic Arts and the computer skills utilized to typeset and design certificates, brochures, cover pages, manuals, etc.

  15. Software Graphics Processing Unit (sGPU) for Deep Space Applications

    Science.gov (United States)

    McCabe, Mary; Salazar, George; Steele, Glen

    2015-01-01

    A graphics processing capability will be required for deep space missions and must include a range of applications, from safety-critical vehicle health status to telemedicine for crew health. However, preliminary radiation testing of commercial graphics processing cards suggests they cannot operate in the deep space radiation environment. Investigation into a Software Graphics Processing Unit (sGPU) comprised of commercial-equivalent radiation hardened/tolerant single board computers, field programmable gate arrays, and safety-critical display software shows promising results. Preliminary performance of approximately 30 frames per second (FPS) has been achieved. Use of multi-core processors may provide a significant increase in performance.

  16. A Graphical User Interface for the Computational Fluid Dynamics Software OpenFOAM

    OpenAIRE

    Melbø, Henrik Kaald

    2014-01-01

    A graphical user interface for the computational fluid dynamics software OpenFOAM has been constructed. OpenFOAM is an open-source and powerful numerical software package, but leaves much to be wanted in the field of user friendliness. In this thesis the basic operation of OpenFOAM is introduced, and the work culminates in a graphical user interface written in PyQt. The graphical user interface will make the use of OpenFOAM simpler, and hopefully make this powerful tool more available for the gene...

  17. Software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.

    1996-02-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably well understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper.

  18. IDAS, software support for mathematical models and map-based graphics

    International Nuclear Information System (INIS)

    Birnbaum, M.D.; Wecker, D.B.

    1984-01-01

    IDAS (Intermediate Dose Assessment System) was developed for the U.S. Nuclear Regulatory Commission as a hardware/software host for radiological models and display of map-based plume graphics at the Operations Center (HQ), regional incident response centers, and site emergency facilities. IDAS design goals acknowledged the likelihood of future changes in the suite of models and the composition of map features for analysis and graphical display. IDAS provides a generalized software support environment to programmers and users of modeling programs. A database manager process provides multi-user access control to all input and output data for modeling programs. A programmer-created data description file (schema) specifies data field names, data types, legal and recommended ranges, default values, preferred units of measurement, and ''help'' text. Subroutine calls to IDAS from a model program invoke a consistent user interface which can show any of the schema contents, convert units of measurement, and route data to multiple logical devices, including the database. A stand-alone data editor allows the user to read and write model data records without execution of a model. IDAS stores digitized map features in a 4-level naming hierarchy. A user can select the map icon, color, and whether to show a stored name tag, for each map feature. The user also selects image scale (zoom) within limits set by map digitization. The resulting image combines static map information, computed analytic modeling results, and the user's feature selections for display to decision-makers.
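
    A hypothetical Python sketch of such a schema entry and its use follows; the field names, ranges and units are invented for illustration and are not IDAS's actual data description format.

        # One data-description (schema) entry: name, type, legal and
        # recommended ranges, default, preferred units, and "help" text.
        RELEASE_RATE = {
            "name": "release_rate",
            "type": float,
            "legal_range": (0.0, 1.0e6),
            "recommended_range": (0.0, 1.0e3),
            "default": 0.0,
            "units": "Ci/h",
            "help": "Radionuclide release rate at the stack.",
        }

        def validate(schema, value):
            """Range-check and coerce a model input against its schema."""
            lo, hi = schema["legal_range"]
            if not lo <= value <= hi:
                raise ValueError(f"{schema['name']} outside legal range")
            return schema["type"](value)

        print(validate(RELEASE_RATE, 42.0))  # -> 42.0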

  19. Graphic Communications. Occupational Competency Analysis Profile.

    Science.gov (United States)

    Ohio State Univ., Columbus. Vocational Instructional Materials Lab.

    This Occupational Competency Analysis Profile (OCAP), which is one of a series of OCAPs developed to identify the skills that Ohio employers deem necessary to entering a given occupation/occupational area, lists the occupational, academic, and employability skills required of individuals entering graphic communications occupations. The…

  20. IMAGE information monitoring and applied graphics software environment. Volume 4. Applications description

    International Nuclear Information System (INIS)

    Hallam, J.W.; Ng, K.B.; Upham, G.L.

    1986-09-01

    The EPRI Information Monitoring and Applied Graphics Environment (IMAGE) system is designed for 'fast prototyping' of advanced concepts for computer-aided plant operations tools. It is a flexible software system which can be used for rapidly creating, dynamically driving and evaluating advanced operator aid displays. The software is written to be both host computer and graphic device independent. This four-volume report includes an Executive Overview of the IMAGE package (Volume I), followed by Software Description (Volume II), User's Guide (Volume III), and Description of Example Applications (Volume IV).

  1. Internet-based hardware/software co-design framework for embedded 3D graphics applications

    Directory of Open Access Journals (Sweden)

    Wong Weng-Fai

    2011-01-01

    Advances in technology are making it possible to run three-dimensional (3D) graphics applications on embedded and handheld devices. In this article, we propose a hardware/software co-design environment for 3D graphics application development that includes the 3D graphics software, OpenGL ES application programming interface (API), device driver, and 3D graphics hardware simulators. We developed a 3D graphics system-on-a-chip (SoC) accelerator using transaction-level modeling (TLM). This gives software designers early access to the hardware even before it is ready. On the other hand, hardware designers also stand to gain from the more complex test benches made available in the software for verification. A unique aspect of our framework is that it allows hardware and software designers from geographically dispersed areas to cooperate and work on the same framework. Designs can be entered and executed from anywhere in the world without full access to the entire framework, which may include proprietary components. This results in controlled and secure transparency and reproducibility, granting leveled access to users of various roles.
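
    The transaction-level modelling (TLM) idea described above can be suggested with a loose Python sketch: the driver exchanges whole bus transactions (function calls) with a hardware model, so software can run before the RTL exists. All class, register and method names here are hypothetical.

        class GpuModelTLM:
            """Stand-in for a 3D-graphics SoC simulator."""
            def __init__(self):
                self.registers = {}

            def write(self, addr, value):   # one bus write = one transaction
                self.registers[addr] = value

            def read(self, addr):
                return self.registers.get(addr, 0)

        class Driver:
            """Device driver developed against the TLM model."""
            CMD, VERTEX_BASE = 0x00, 0x04

            def __init__(self, hw):
                self.hw = hw

            def draw(self, vertex_buffer_addr):
                self.hw.write(self.VERTEX_BASE, vertex_buffer_addr)
                self.hw.write(self.CMD, 1)  # kick the modelled pipeline

        Driver(GpuModelTLM()).draw(0x80000000)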

  2. ModelMate - A graphical user interface for model analysis

    Science.gov (United States)

    Banta, Edward R.

    2011-01-01

    ModelMate is a graphical user interface designed to facilitate use of model-analysis programs with models. This initial version of ModelMate supports one model-analysis program, UCODE_2005, and one model software program, MODFLOW-2005. ModelMate can be used to prepare input files for UCODE_2005, run UCODE_2005, and display analysis results. A link to the GW_Chart graphing program facilitates visual interpretation of results. ModelMate includes capabilities for organizing directories used with the parallel-processing capabilities of UCODE_2005 and for maintaining files in those directories to be identical to a set of files in a master directory. ModelMate can be used on its own or in conjunction with ModelMuse, a graphical user interface for MODFLOW-2005 and PHAST.

  3. Design of Mariner 9 Science Sequences using Interactive Graphics Software

    Science.gov (United States)

    Freeman, J. E.; Sturms, F. M., Jr.; Webb, W. A.

    1973-01-01

    This paper discusses the analyst/computer system used to design the daily science sequences required to carry out the desired Mariner 9 science plan. The Mariner 9 computer environment, the development and capabilities of the science sequence design software, and the techniques followed in the daily mission operations are discussed. Included is a discussion of the overall mission operations organization and the individual components which played an essential role in the sequence design process. A summary of actual sequences processed, a discussion of problems encountered, and recommendations for future applications are given.

  4. Graphic Design Of “Green Mission” Education Game Using Software Based On Vector

    Directory of Open Access Journals (Sweden)

    Nur Yanti

    2018-01-01

    An educational game is a digital game designed around educational elements, supporting teaching and learning through interactive media technology. An educational game generally has a fun look, an easy-to-use menu, and GUI-based (Graphical User Interface) colour combinations that create appeal for users, since it is undeniable that the human brain tends to grasp learning through visual images more quickly than through text. The graphic design of an educational game is therefore one of the important points. Software applications are one solution for producing game designs; among them are vector-based software applications. Various software packages can be used according to their respective functions and uses, but in general they work in much the same way.

  5. Western aeronautical test range real-time graphics software package MAGIC

    Science.gov (United States)

    Malone, Jacqueline C.; Moore, Archie L.

    1988-01-01

    The master graphics interactive console (MAGIC) software package used on the Western Aeronautical Test Range (WATR) of the NASA Ames Research Center is described. MAGIC is a resident real-time research tool available to flight researchers and scientists in the NASA mission control centers of the WATR at the Dryden Flight Research Facility at Edwards, California. The hardware configuration and capabilities of the real-time software package are also discussed.

  6. Effective Results Analysis for the Similar Software Products’ Orthogonality

    Directory of Open Access Journals (Sweden)

    Ion Ivan

    2009-10-01

    The concept of similar software is defined. Conditions for archiving the software components are established. The orthogonality evaluation is carried out, and the correlation between the orthogonality and the complexity of the homogeneous software components is analyzed. Groups of similar software products are then built, belonging to the orthogonality intervals. The results of the analysis are presented in graphical form. Aspects of the functioning of the software product allocated for the orthogonality are detailed.

  7. Early Algebra with Graphics Software as a Type II Application of Technology

    Science.gov (United States)

    Abramovich, Sergei

    2006-01-01

    This paper describes the use of Kid Pix, graphics software for creative activities of young children, in the context of early algebra as determined by the mathematics core curriculum of New York State. It shows how grade-two appropriate pedagogy makes it possible to bring about a qualitative change in the learning process of those commonly…

  8. Janus: Graphical Software for Analyzing In-Situ Measurements of Solar-Wind Ions

    Science.gov (United States)

    Maruca, B.; Stevens, M. L.; Kasper, J. C.; Korreck, K. E.

    2016-12-01

    In-situ observations of solar-wind ions provide tremendous insights into the physics of space plasmas. Instruments on spacecraft measure distributions of ion energies, which can be processed into scientifically useful data (e.g., values for ion densities and temperatures). This analysis requires a strong, technical understanding of the instrument, so it has traditionally been carried out by the instrument teams using automated software that they had developed for that purpose. The automated routines are optimized for typical solar-wind conditions, so they can fail to capture the complex (and scientifically interesting) microphysics of transient solar wind, such as coronal mass ejections (CMEs) and co-rotating interaction regions (CIRs), which are often better analyzed manually. This presentation reports on the ongoing development of Janus, a new software package for processing in-situ measurements of solar-wind ions. Janus will provide users with an easy-to-use graphical user interface (GUI) for carrying out highly customized analyses. Transparently to the user, Janus will automatically handle the most technical tasks (e.g., the retrieval and calibration of measurements). For the first time, users with only limited knowledge about the instruments (e.g., non-instrumentalists and students) will be able to easily process measurements of solar-wind ions. Version 1 of Janus focuses specifically on such measurements from the Wind spacecraft's Faraday Cups and is slated for public release in time for this presentation.
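
    To suggest what processing ion energy distributions into densities and temperatures involves, here is a small Python sketch of numerical velocity moments over a synthetic 1-D Maxwellian; the grid and plasma parameters are invented and this is not Janus code.

        import numpy as np

        # Synthetic 1-D velocity distribution f(v) for a proton beam.
        v = np.linspace(250e3, 550e3, 64)      # m/s
        n0, u0, w0 = 5e6, 400e3, 30e3          # density, bulk speed, thermal speed
        f = n0 / (np.sqrt(np.pi) * w0) * np.exp(-((v - u0) / w0) ** 2)

        dv = v[1] - v[0]
        n = np.sum(f) * dv                     # 0th moment: density
        u = np.sum(v * f) * dv / n             # 1st moment: bulk speed
        w = np.sqrt(2 * np.sum((v - u) ** 2 * f) * dv / n)  # thermal speed

        print(f"n = {n:.2e} m^-3, u = {u/1e3:.0f} km/s, w = {w/1e3:.0f} km/s")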

  9. Partial wave analysis using graphics processing units

    Energy Technology Data Exchange (ETDEWEB)

    Berger, Niklaus; Liu Beijiang; Wang Jike, E-mail: nberger@ihep.ac.c [Institute of High Energy Physics, Chinese Academy of Sciences, 19B Yuquan Lu, Shijingshan, 100049 Beijing (China)

    2010-04-01

    Partial wave analysis is an important tool for determining resonance properties in hadron spectroscopy. For large data samples however, the un-binned likelihood fits employed are computationally very expensive. At the Beijing Spectrometer (BES) III experiment, an increase in statistics compared to earlier experiments of up to two orders of magnitude is expected. In order to allow for a timely analysis of these datasets, additional computing power with short turnover times has to be made available. It turns out that graphics processing units (GPUs) originally developed for 3D computer games have an architecture of massively parallel single instruction multiple data floating point units that is almost ideally suited for the algorithms employed in partial wave analysis. We have implemented a framework for tensor manipulation and partial wave fits called GPUPWA. The user writes a program in pure C++ whilst the GPUPWA classes handle computations on the GPU, memory transfers, caching and other technical details. In conjunction with a recent graphics processor, the framework provides a speed-up of the partial wave fit by more than two orders of magnitude compared to legacy FORTRAN code.
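
    The computation being parallelized can be suggested by a small Python/NumPy stand-in for the un-binned likelihood; GPUPWA itself is C++ with the tensor work dispatched to the GPU, and the random "amplitudes" below are placeholders for real partial wave amplitudes.

        import numpy as np

        rng = np.random.default_rng(0)
        n_events, n_waves = 100_000, 4

        # Precomputed complex amplitudes A_a(event) for data and phase-space MC.
        amps = (rng.normal(size=(n_events, n_waves))
                + 1j * rng.normal(size=(n_events, n_waves)))
        mc_amps = (rng.normal(size=(n_events, n_waves))
                   + 1j * rng.normal(size=(n_events, n_waves)))

        def nll(c):
            """Negative log-likelihood for complex couplings c, one per wave."""
            intensity = np.abs(amps @ c) ** 2         # |sum_a c_a A_a|^2 per event
            norm = np.mean(np.abs(mc_amps @ c) ** 2)  # MC estimate of the integral
            return -np.sum(np.log(intensity / norm))

        print(nll(np.ones(n_waves, dtype=complex)))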

  10. Path generation algorithm for UML graphic modeling of aerospace test software

    Science.gov (United States)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Chen, Chao

    2018-03-01

    Traditionally, aerospace software testing engineers have relied on their own work experience and on communication with the software developers to describe the software under test, writing test cases by hand; this is time-consuming, inefficient and leaves loopholes. With the highly reliable MBT (model-based testing) tool developed by our company, a single modelling pass can automatically generate the test case documents, which is efficient and accurate. Accurately expressing requirements with a UML model depends on the paths that can be reached. Existing path generation algorithms are either too simple, unable to combine branch paths and loops into complete paths, or so cumbersome and complicated that they generate arrangements of meaningless paths that are superfluous for aerospace software testing. Drawing on our years of aerospace experience, we developed a tailored path generation algorithm for UML graphical descriptions of aerospace test software.
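
    As a rough illustration of the general idea (not the authors' algorithm), the Python sketch below enumerates paths through a small, hypothetical activity graph, taking each loop's back edge at most once so branch-and-loop combinations are covered without endless unrolling.

        GRAPH = {                           # hypothetical UML activity graph
            "start": ["check"],
            "check": ["loop_body", "end"],  # branch: enter loop or exit
            "loop_body": ["check"],         # back edge closing the loop
            "end": [],
        }

        def paths(node="start", taken=frozenset(), path=()):
            path = path + (node,)
            if not GRAPH[node]:
                yield path                  # reached a terminal node
            for nxt in GRAPH[node]:
                edge = (node, nxt)
                if nxt in path and edge in taken:
                    continue                # each back edge at most once
                more = taken | ({edge} if nxt in path else frozenset())
                yield from paths(nxt, more, path)

        for p in paths():
            print(" -> ".join(p))
        # start -> check -> loop_body -> check -> end
        # start -> check -> end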

  11. Automated Software Vulnerability Analysis

    Science.gov (United States)

    Sezer, Emre C.; Kil, Chongkyung; Ning, Peng

    Despite decades of research, software continues to have vulnerabilities. Successful exploitations of these vulnerabilities by attackers cost millions of dollars to businesses and individuals. Unfortunately, most effective defensive measures, such as patching and intrusion prevention systems, require an intimate knowledge of the vulnerabilities. Many systems for detecting attacks have been proposed. However, the analysis of the exploited vulnerabilities is left to security experts and programmers. Both the human effort involved and the slow analysis process are unfavorable for timely defensive measures to be deployed. The problem is exacerbated by zero-day attacks.

  12. Setting up graphics software for data examination of multidetectors used in heavy ion physics

    International Nuclear Information System (INIS)

    Le Flecher, C.

    1990-01-01

    In this work, the first chapter presents the multidetectors of the NAUTILUS vacuum chamber, from a geometrical and parametrical point of view, as well as the manual analysis phase. The second chapter deals with the main notions of the Graphical Kernel System language. Finally, the third chapter presents the different graphics programs written to accelerate and rationalize the analysis phase.

  13. Effective Results Analysis for the Similar Software Products’ Orthogonality

    OpenAIRE

    Ion Ivan; Daniel Milodin

    2009-01-01

    The concept of similar software is defined. Conditions for archiving the software components are established. The orthogonality evaluation is carried out, and the correlation between the orthogonality and the complexity of the homogeneous software components is analyzed. Groups of similar software products are then built, belonging to the orthogonality intervals. The results of the analysis are presented in graphical form. There are detailed aspects of the functioning o...

  14. ATM Technology Demonstration-1 Phase II Boeing Configurable Graphical Display (CGD) Software Design Description

    Science.gov (United States)

    Wilber, George F.

    2017-01-01

    This Software Description Document (SDD) captures the design for developing the Flight Interval Management (FIM) system Configurable Graphics Display (CGD) software. Specifically this SDD describes aspects of the Boeing CGD software and the surrounding context and interfaces. It does not describe the Honeywell components of the CGD system. The SDD provides the system overview, architectural design, and detailed design with all the necessary information to implement the Boeing components of the CGD software and integrate them into the CGD subsystem within the larger FIM system. Overall system and CGD system-level requirements are derived from the CGD SRS (in turn derived from the Boeing System Requirements Design Document (SRDD)). Display and look-and-feel requirements are derived from Human Machine Interface (HMI) design documents and working group recommendations. This Boeing CGD SDD is required to support the upcoming Critical Design Review (CDR).

  15. Network Distributed Data Acquisition, Storage, and Graphical Live Display Software for a Laser Ion Source at CERN

    CERN Document Server

    Rossel, Ralf Erik; Rothe, Sebastian

    2014-01-01

    This project documentation outlines the requirements and implementation details for the measurement data recording software currently in development for the Resonance Ionisation Laser Ion Source (RILIS) at CERN. The software is capable of acquiring data from multiple laser parameter monitoring devices and associating the gathered values to represent qualitative and quantitative measurements. The measurement data is displayed graphically within the program and recorded to files for later analysis. The main application of the software is the acquisition coordination and recording of measurement data during spectroscopy experiments performed by RILIS and collaborating experiments. This document describes the design concept and detailed program implementation status at the end of July 2014 and provides an outlook to future developments in RILIS spectroscopy data acquisition.

  16. siGnum: graphical user interface for EMG signal analysis.

    Science.gov (United States)

    Kaur, Manvinder; Mathur, Shilpi; Bhatia, Dinesh; Verma, Suresh

    2015-01-01

    Electromyography (EMG) signals that represent the electrical activity of muscles can be used for various clinical and biomedical applications. These are complicated and highly varying signals that depend on the anatomical location and physiological properties of the muscles. EMG signals acquired from the muscles require advanced methods for detection, decomposition and processing. This paper proposes siGnum, a novel Graphical User Interface (GUI) developed in MATLAB that applies efficient and effective techniques to process raw EMG signals and decompose them in a simpler manner. It can be used independently of MATLAB by employing a deploy tool. This should enable researchers to gain a good understanding of the EMG signal and its analysis procedures, which can be utilized for more powerful, flexible and efficient applications in the near future.
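
    One typical processing step such a GUI might wrap is sketched below in Python (full-wave rectification plus a moving-RMS envelope); the signal is synthetic and the method is a common textbook step, not necessarily siGnum's actual algorithm.

        import numpy as np

        fs = 1000                             # sampling rate, Hz
        t = np.arange(0, 2.0, 1 / fs)
        burst = (t > 0.5) & (t < 1.2)         # simulated muscle activation
        emg = np.random.default_rng(1).normal(0.0, 0.05 + 0.4 * burst)

        win = int(0.05 * fs)                  # 50 ms RMS window
        rms = np.sqrt(np.convolve(emg ** 2, np.ones(win) / win, mode="same"))
        print("rest:", round(rms[:400].mean(), 3), "peak:", round(rms.max(), 3))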

  17. PC Software graphics tool for conceptual design of space/planetary electrical power systems

    Science.gov (United States)

    Truong, Long V.

    1995-01-01

    This paper describes the Decision Support System (DSS), a personal computer software graphics tool for designing conceptual space and/or planetary electrical power systems. By using the DSS, users can obtain desirable system design and operating parameters, such as system weight, electrical distribution efficiency, and bus power. With this tool, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The DSS is a user-friendly, menu-driven tool with online help and a custom graphical user interface. An example design and results are illustrated for a typical space power system with multiple types of power sources, frequencies, energy storage systems, and loads.

  18. Software quality testing process analysis

    OpenAIRE

    Mera Paz, Julián

    2016-01-01

    Introduction: This article is the result of reading, reviewing and analyzing books, magazines and articles well known for their scientific and research quality, which have addressed the software quality testing process. The author, based on his work experience in software development companies, teaching and other areas, has compiled and selected information to argue for and substantiate the importance of the software quality testing process. Methodology: the existing literature on the software qualit...

  19. Mvox: Interactive 2-4D medical image and graphics visualization software

    DEFF Research Database (Denmark)

    Bro-Nielsen, Morten

    1996-01-01

    Mvox is a new tool for visualization, segmentation and manipulation of a wide range of 2-4D grey level and colour images, and 3D surface graphics, which has been developed at the Department of Mathematical Modelling, Technical University of Denmark. The principal idea behind the software has been to provide a flexible tool that is able to handle all the kinds of data that are typically used in a research environment for medical imaging and visualization. At the same time the software should be easy to use and have a consistent interface providing locally only the functions relevant to the context. This has been achieved by using Unix standards such as X/Motif/OpenGL and conforming to modern standards of interactive windowed programs...

  20. Software FMEA analysis for safety-related application software

    International Nuclear Information System (INIS)

    Park, Gee-Yong; Kim, Dong Hoon; Lee, Dong Young

    2014-01-01

    Highlights: • We develop a modified FMEA analysis suited to application to software architecture. • A template for failure modes on a specific software language is established. • A detailed-level software FMEA analysis on nuclear safety software is presented. - Abstract: A method of software safety analysis is described in this paper for safety-related application software. The target software system is a software code installed at an Automatic Test and Interface Processor (ATIP) in a digital reactor protection system (DRPS). For the ATIP software safety analysis, at first, an overall safety or hazard analysis is performed over the software architecture and modules, and then a detailed safety analysis based on the software FMEA (Failure Modes and Effects Analysis) method is applied to the ATIP program. For an efficient analysis, the software FMEA analysis is carried out based on the so-called failure-mode template extracted from the function blocks used in the function block diagram (FBD) for the ATIP software. The software FMEA analysis, applied to the ATIP software code, which had been integrated and had passed through a very rigorous system test procedure, proved able to provide very valuable results (i.e., software defects) that could not be identified during the various system tests.
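
    The failure-mode template idea can be pictured with a hypothetical Python sketch: each FBD block type carries a reusable list of failure modes that the analysis instantiates per block use. The block types and modes below are invented, not the actual ATIP template.

        FAILURE_MODES = {
            "AND":  ["output stuck TRUE", "output stuck FALSE", "input not updated"],
            "TON":  ["timer never expires", "timer expires early", "preset corrupted"],
            "MOVE": ["stale value propagated", "wrong destination written"],
        }

        def fmea_rows(blocks):
            """Expand (instance, block_type) pairs into FMEA worksheet rows."""
            return [
                {"instance": inst, "type": btype, "failure_mode": mode,
                 "effect": "TBD", "criticality": "TBD"}
                for inst, btype in blocks
                for mode in FAILURE_MODES.get(btype, ["unanalyzed block type"])
            ]

        for row in fmea_rows([("trip_logic_1", "AND"), ("delay_2", "TON")]):
            print(row)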

  1. Modularity analysis of automotive control software

    OpenAIRE

    Dajsuren, Y.; Brand, van den, M.G.J.; Serebrenik, A.

    2013-01-01

    A design language and tool like MATLAB/Simulink is used for the graphical modelling and simulation of automotive control software. As the functionality based on electronics and software systems increases in motor vehicles, it is becoming increasingly important for system/software architects and control engineers in the automotive industry to ensure the quality of the highly complex MATLAB/Simulink control software. For automotive software, modularity is recognized as being a crucial quality a...

  2. A novel graphical technique for Pinch Analysis applications: Energy Targets and grassroots design

    International Nuclear Information System (INIS)

    Gadalla, Mamdouh A.

    2015-01-01

    Graphical abstract: A new HEN graphical design. - Highlights: • A new graphical technique for heat exchanger network design. • Pinch Analysis principles and design rules are better interpreted. • Graphical guidelines for optimum heat integration. • New temperature-based graphs provide user-interactive features. - Abstract: Pinch Analysis has been for decades a leading tool for energy integration in retrofit and design. This paper presents a new graphical technique, based on Pinch Analysis, for the grassroots design of heat exchanger networks. In the new graph, the temperatures of hot streams are plotted versus those of the cold streams. The temperature–temperature based graph is constructed to include temperatures of hot and cold streams as straight lines, horizontal lines for hot streams, and vertical lines for cold streams. The graph is applied to determine the pinch temperatures and Energy Targets. It is then used to synthesise graphically a complete exchanger network, achieving the Energy Targets. Within the new graph, exchangers are represented by inclined straight lines, whose slopes are proportional to the ratio of heat capacities and flows. Pinch Analysis principles for design are easily interpreted using this new graphical technique to design a complete exchanger network. Network designs achieved by the new technique can guarantee maximum heat recovery. The new technique can also be employed to simulate basic designs of heat exchanger networks. The strengths of the new tool are that it is simply applied using computers, requires no commercial software, and can be used for academic purposes/engineering education.
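
    The central construction above (an exchanger as a straight line in the hot-versus-cold temperature graph) follows from the counter-current heat balance CPh*(Th_in - Th_out) = CPc*(Tc_out - Tc_in); the Python sketch below uses invented stream data to show the slope relation.

        CPh, CPc = 30.0, 20.0        # heat-capacity flowrates, kW/K
        Th_in, Tc_in = 180.0, 60.0   # inlet temperatures, deg C
        duty = 1200.0                # heat exchanged, kW

        Th_out = Th_in - duty / CPh  # 140 deg C
        Tc_out = Tc_in + duty / CPc  # 120 deg C

        # Line endpoints in the (T_cold, T_hot) graph for this match.
        slope = (Th_in - Th_out) / (Tc_out - Tc_in)   # = CPc/CPh
        print((Tc_in, Th_out), "->", (Tc_out, Th_in), "slope =", round(slope, 3))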

  3. Animated analysis of geoscientific datasets: An interactive graphical application

    Science.gov (United States)

    Morse, Peter; Reading, Anya; Lueg, Christopher

    2017-12-01

    Geoscientists are required to analyze and draw conclusions from increasingly large volumes of data. There is a need to recognise and characterise features and changing patterns of Earth observables within such large datasets. It is also necessary to identify significant subsets of the data for more detailed analysis. We present an innovative, interactive software tool and workflow to visualise, characterise, sample and tag large geoscientific datasets from both local and cloud-based repositories. It uses an animated interface and human-computer interaction to utilise the capacity of human expert observers to identify features via enhanced visual analytics. 'Tagger' enables users to analyze datasets that are too large in volume to be drawn legibly on a reasonable number of single static plots. Users interact with the moving graphical display, tagging data ranges of interest for subsequent attention. The tool provides a rapid pre-pass process using fast GPU-based OpenGL graphics and data-handling and is coded in the Quartz Composer visual programming language (VPL) on Mac OS X. It makes use of interoperable data formats, and cloud-based (or local) data storage and compute. In a case study, Tagger was used to characterise a decade (2000-2009) of data recorded by the Cape Sorell Waverider Buoy, located approximately 10 km off the west coast of Tasmania, Australia. These data serve as a proxy for the understanding of Southern Ocean storminess, which has both local and global implications. This example shows use of the tool to identify and characterise four different types of storm and non-storm events during this time. Events characterised in this way are compared with conventional analysis, noting advantages and limitations of data analysis using animation and human interaction. Tagger provides a new ability to make use of humans as feature detectors in computer-based analysis of large-volume geoscience and other data.

  4. Modularity analysis of automotive control software

    NARCIS (Netherlands)

    Dajsuren, Y.; Brand, van den M.G.J.; Serebrenik, A.

    2013-01-01

    A design language and tool like MATLAB/Simulink is used for the graphical modelling and simulation of automotive control software. As the functionality based on electronics and software systems increases in motor vehicles, it is becoming increasingly important for system/software architects and

  5. Integration of rocket turbine design and analysis through computer graphics

    Science.gov (United States)

    Hsu, Wayne; Boynton, Jim

    1988-01-01

    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.

  6. SraTailor: graphical user interface software for processing and visualizing ChIP-seq data.

    Science.gov (United States)

    Oki, Shinya; Maehara, Kazumitsu; Ohkawa, Yasuyuki; Meno, Chikara

    2014-12-01

    Raw data from ChIP-seq (chromatin immunoprecipitation combined with massively parallel DNA sequencing) experiments are deposited in public databases as SRAs (Sequence Read Archives) that are publicly available to all researchers. However, to graphically visualize ChIP-seq data of interest, the corresponding SRAs must be downloaded and converted into BigWig format, a process that involves complicated command-line processing. This task requires users to possess skill with script languages and sequence data processing, a requirement that prevents a wide range of biologists from exploiting SRAs. To address these challenges, we developed SraTailor, a GUI (Graphical User Interface) software package that automatically converts an SRA into a BigWig-formatted file. Simplicity of use is one of the most notable features of SraTailor: entering an accession number of an SRA and clicking the mouse are the only steps required to obtain BigWig-formatted files and to graphically visualize the extents of reads at given loci. SraTailor is also able to make peak calls, generate files of other formats, process users' own data, and accept various command-line-like options. Therefore, this software makes ChIP-seq data fully exploitable by a wide range of biologists. SraTailor is freely available at http://www.devbio.med.kyushu-u.ac.jp/sra_tailor/, and runs on both Mac and Windows machines. © 2014 The Authors Genes to Cells © 2014 by the Molecular Biology Society of Japan and Wiley Publishing Asia Pty Ltd.

  7. Development of data acquisition and analysis software for multichannel detectors

    International Nuclear Information System (INIS)

    Chung, Y.

    1988-06-01

    This report describes the development of data acquisition and analysis software for Apple Macintosh computers, capable of controlling two multichannel detectors. With the help of the Macintosh's outstanding graphics capabilities, an easy-to-use user interface, and several other built-in convenience features, this application has enhanced the productivity and efficiency of data analysis. 2 refs., 6 figs

  8. PAW [Physics Analysis Workstation] at Fermilab: CORE based graphics implementation of HIGZ [High Level Interface to Graphics and Zebra]

    International Nuclear Information System (INIS)

    Johnstad, H.

    1989-06-01

    The Physics Analysis Workstation system (PAW) is primarily intended to be the last link in the analysis chain of experimental data. The graphical part of PAW is based on HIGZ (High Level Interface to Graphics and Zebra), which is based on the ISO and ANSI standard Graphics Kernel System (GKS). HIGZ is written in the context of PAW. At Fermilab, the CORE-based graphics system DI-3000, by Precision Visuals Inc., is widely used in the analysis of experimental data. The graphical part of the PAW routines has been totally rewritten and implemented in the Fermilab environment. 3 refs

  9. Accelerating epistasis analysis in human genetics with consumer graphics hardware

    Directory of Open Access Journals (Sweden)

    Cancare Fabio

    2009-07-01

    Full Text Available Abstract Background Human geneticists are now capable of measuring more than one million DNA sequence variations from across the human genome. The new challenge is to develop computationally feasible methods capable of analyzing these data for associations with common human disease, particularly in the context of epistasis. Epistasis describes the situation where multiple genes interact in a complex non-linear manner to determine an individual's disease risk and is thought to be ubiquitous for common diseases. Multifactor Dimensionality Reduction (MDR) is an algorithm capable of detecting epistasis. An exhaustive analysis with MDR is often computationally expensive, particularly for high order interactions. This challenge has previously been met with parallel computation and expensive hardware. The option we examine here exploits commodity hardware designed for computer graphics. In modern computers Graphics Processing Units (GPUs) have more memory bandwidth and computational capability than Central Processing Units (CPUs) and are well suited to this problem. Advances in the video game industry have led to an economy of scale creating a situation where these powerful components are readily available at very low cost. Here we implement and evaluate the performance of the MDR algorithm on GPUs. Of primary interest are the time required for an epistasis analysis and the price to performance ratio of available solutions. Findings We found that using MDR on GPUs consistently increased performance per machine over both a feature rich Java software package and a C++ cluster implementation. The performance of a GPU workstation running a GPU implementation reduces computation time by a factor of 160 compared to an 8-core workstation running the Java implementation on CPUs. This GPU workstation performs similarly to 150 cores running an optimized C++ implementation on a Beowulf cluster. Furthermore this GPU system provides extremely cost effective

  10. Accelerating epistasis analysis in human genetics with consumer graphics hardware.

    Science.gov (United States)

    Sinnott-Armstrong, Nicholas A; Greene, Casey S; Cancare, Fabio; Moore, Jason H

    2009-07-24

    Human geneticists are now capable of measuring more than one million DNA sequence variations from across the human genome. The new challenge is to develop computationally feasible methods capable of analyzing these data for associations with common human disease, particularly in the context of epistasis. Epistasis describes the situation where multiple genes interact in a complex non-linear manner to determine an individual's disease risk and is thought to be ubiquitous for common diseases. Multifactor Dimensionality Reduction (MDR) is an algorithm capable of detecting epistasis. An exhaustive analysis with MDR is often computationally expensive, particularly for high order interactions. This challenge has previously been met with parallel computation and expensive hardware. The option we examine here exploits commodity hardware designed for computer graphics. In modern computers Graphics Processing Units (GPUs) have more memory bandwidth and computational capability than Central Processing Units (CPUs) and are well suited to this problem. Advances in the video game industry have led to an economy of scale creating a situation where these powerful components are readily available at very low cost. Here we implement and evaluate the performance of the MDR algorithm on GPUs. Of primary interest are the time required for an epistasis analysis and the price to performance ratio of available solutions. We found that using MDR on GPUs consistently increased performance per machine over both a feature rich Java software package and a C++ cluster implementation. The performance of a GPU workstation running a GPU implementation reduces computation time by a factor of 160 compared to an 8-core workstation running the Java implementation on CPUs. This GPU workstation performs similarly to 150 cores running an optimized C++ implementation on a Beowulf cluster. Furthermore this GPU system provides extremely cost effective performance while leaving the CPU available for other
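    Taking the figures quoted above at face value, the price-to-performance argument can be sketched with simple arithmetic; only the speedup ratios come from the abstract, while the dollar prices are hypothetical placeholders.

```python
# Back-of-envelope reading of the performance figures quoted in the
# abstract. Only the speedup ratios come from the text; the dollar
# prices below are hypothetical placeholders.
gpu_speedup_vs_java8core = 160   # GPU workstation vs 8-core Java (from text)
cluster_core_equivalents = 150   # GPU workstation ~ 150 C++ cluster cores (from text)

gpu_workstation_cost = 3000.0    # hypothetical USD
cluster_cost_per_core = 500.0    # hypothetical USD

equivalent_cluster_cost = cluster_core_equivalents * cluster_cost_per_core
print(f"price advantage: {equivalent_cluster_cost / gpu_workstation_cost:.0f}x")
```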

  11. On-Orbit Software Analysis

    Science.gov (United States)

    Moran, Susanne I.

    2004-01-01

    The On-Orbit Software Analysis Research Infusion Project was done by Intrinsyx Technologies Corporation (Intrinsyx) at the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC). The Project was a joint collaborative effort between NASA Codes IC and SL, Kestrel Technology (Kestrel), and Intrinsyx. The primary objectives of the Project were: Discovery and verification of software program properties and dependencies, Detection and isolation of software defects across different versions of software, and Compilation of historical data and technical expertise for future applications

  12. An Imaging And Graphics Workstation For Image Sequence Analysis

    Science.gov (United States)

    Mostafavi, Hassan

    1990-01-01

    This paper describes an application-specific engineering workstation designed and developed to analyze imagery sequences from a variety of sources. The system combines the software and hardware environment of the modern graphic-oriented workstations with the digital image acquisition, processing and display techniques. The objective is to achieve automation and high throughput for many data reduction tasks involving metric studies of image sequences. The applications of such an automated data reduction tool include analysis of the trajectory and attitude of aircraft, missile, stores and other flying objects in various flight regimes including launch and separation as well as regular flight maneuvers. The workstation can also be used in an on-line or off-line mode to study three-dimensional motion of aircraft models in simulated flight conditions such as wind tunnels. The system's key features are: 1) Acquisition and storage of image sequences by digitizing real-time video or frames from a film strip; 2) computer-controlled movie loop playback, slow motion and freeze frame display combined with digital image sharpening, noise reduction, contrast enhancement and interactive image magnification; 3) multiple leading edge tracking in addition to object centroids at up to 60 fields per second from either live input video or a stored image sequence; 4) automatic and manual field-of-view and spatial calibration; 5) image sequence data base generation and management, including the measurement data products; 6) off-line analysis software for trajectory plotting and statistical analysis; 7) model-based estimation and tracking of object attitude angles; and 8) interface to a variety of video players and film transport sub-systems.

  13. Graphical interface between the CIRSSE testbed and CimStation software with MCS/CTOS

    Science.gov (United States)

    Hron, Anna B.

    1992-01-01

    This research is concerned with developing a graphical simulation of the testbed at the Center for Intelligent Robotic Systems for Space Exploration (CIRSSE) and the interface which allows for communication between the two. Such an interface is useful in telerobotic operations and as a functional interaction tool for testbed users. Creating a simulated model of a real-world system generates inevitable calibration discrepancies between the two. This thesis gives a brief overview of the work done to date in the area of workcell representation and communication, describes the development of the CIRSSE interface, and gives a direction for future work in the area of system calibration. The CimStation software used for development of this interface is a highly versatile robotic workcell simulation package which has been programmed for this application with a scale graphical model of the testbed and supporting interface menu code. A need for this tool has been identified for path previewing, as a window on teleoperation, and for calibration of simulated versus real-world models. The interface allows information (i.e., joint angles) generated by CimStation to be sent as motion goal positions to the testbed robots. As an option of the interface, joint angle information generated by supporting testbed algorithms (i.e., TG, collision avoidance) can be piped through CimStation as a visual preview of the path.

  14. Software architecture analysis tool : software architecture metrics collection

    NARCIS (Netherlands)

    Muskens, J.; Chaudron, M.R.V.; Westgeest, R.

    2002-01-01

    The Software Engineering discipline lacks the ability to evaluate software architectures. Here we describe a tool for software architecture analysis that is based on metrics. Metrics can be used to detect possible problems and bottlenecks in software architectures. Even though metrics do not give a

  15. A graphical user interface for infant ERP analysis.

    Science.gov (United States)

    Kaatiala, Jussi; Yrttiaho, Santeri; Forssman, Linda; Perdue, Katherine; Leppänen, Jukka

    2014-09-01

    Recording of event-related potentials (ERPs) is one of the best-suited technologies for examining brain function in human infants. Yet the existing software packages are not optimized for the unique requirements of analyzing artifact-prone ERP data from infants. We developed a new graphical user interface that enables an efficient implementation of a two-stage approach to the analysis of infant ERPs. In the first stage, video records of infant behavior are synchronized with ERPs at the level of individual trials to reject epochs with noncompliant behavior and other artifacts. In the second stage, the interface calls MATLAB and EEGLAB (Delorme & Makeig, Journal of Neuroscience Methods 134(1):9-21, 2004) functions for further preprocessing of the ERP signal itself (i.e., filtering, artifact removal, interpolation, and rereferencing). Finally, methods are included for data visualization and analysis by using bootstrapped group averages. Analyses of simulated and real EEG data demonstrated that the proposed approach can be effectively used to establish task compliance, remove various types of artifacts, and perform representative visualizations and statistical comparisons of ERPs. The interface is available for download from http://www.uta.fi/med/icl/methods/eeg.html in a format that is widely applicable to ERP studies with special populations and open for further editing by users.
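    As a rough sketch of the bootstrapped group averages mentioned above (the interface's exact procedure may differ in detail), subjects can be resampled with replacement to obtain pointwise confidence bands:

```python
import numpy as np

def bootstrap_group_average(erps, n_boot=1000, ci=95, seed=0):
    """Bootstrapped group-average ERP with pointwise confidence bands.

    erps: array of shape (n_subjects, n_timepoints). Subjects are
    resampled with replacement; the grand average is recomputed for
    each resample.
    """
    rng = np.random.default_rng(seed)
    n = erps.shape[0]
    boots = np.array([erps[rng.integers(0, n, size=n)].mean(axis=0)
                      for _ in range(n_boot)])
    lo, hi = np.percentile(boots, [(100 - ci) / 2, (100 + ci) / 2], axis=0)
    return erps.mean(axis=0), lo, hi

# Synthetic example: 12 infants, 200 samples per averaged epoch.
erps = np.random.default_rng(1).normal(size=(12, 200))
grand_avg, lo, hi = bootstrap_group_average(erps)
```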

  16. Analysis of graphic representations of activity theory in international journals

    Directory of Open Access Journals (Sweden)

    Marco André Mazzarotto

    2016-05-01

    Full Text Available Activity theory is a relevant framework for the Design field, and its graphic representations are cognitive artifacts that aid the understanding, use and communication of this theory. However, there is a lack of consistency in the graphics and labels used in these representations. Based on this, the aim of this study was to identify, analyze and evaluate these differences and propose a representation that aims to be more suitable for the theory. To this end, it uses as its method a literature review based on Engeström (2001) and his three generations of visual models, combined with graphical analysis of representations collected from a hundred papers in international journals.

  17. Incident sequence analysis; event trees, methods and graphical symbols

    International Nuclear Information System (INIS)

    1980-11-01

    When analyzing incident sequences, unwanted events resulting from a certain cause are looked for. Graphical symbols and explanations of graphical representations are presented. The method applies to the analysis of incident sequences in all types of facilities. By means of the incident sequence diagram, incident sequences, i.e. the logical and chronological course of repercussions initiated by the failure of a component or by an operating error, can be presented and analyzed simply and clearly

  18. ggplot2 elegant graphics for data analysis

    CERN Document Server

    Wickham, Hadley

    2016-01-01

    This new edition of the classic book by ggplot2 creator Hadley Wickham highlights compatibility with knitr and RStudio. ggplot2 is a data visualization package for R that helps users create data graphics, including those that are multi-layered, with ease. With ggplot2, it's easy to: • produce handsome, publication-quality plots with automatic legends created from the plot specification • superimpose multiple layers (points, lines, maps, tiles, box plots) from different data sources with automatically adjusted common scales • add customizable smoothers that use powerful modeling capabilities of R, such as loess, linear models, generalized additive models, and robust regression • save any ggplot2 plot (or part thereof) for later modification or reuse • create custom themes that capture in-house or journal style requirements and that can easily be applied to multiple plots • approach a graph from a visual perspective, thinking about how each component of the data is represented on the final plot. This...

  19. Software development for teleroentgenogram analysis

    Science.gov (United States)

    Goshkoderov, A. A.; Khlebnikov, N. A.; Obabkov, I. N.; Serkov, K. V.; Gajniyarov, I. M.; Aliev, A. A.

    2017-09-01

    A framework for the analysis and calculation of teleroentgenograms was developed. Software development was carried out in the Department of Children's Dentistry and Orthodontics at Ural State Medical University. The software calculates the teleroentgenogram by an original method developed in this department, and also allows users to design their own methods for calculating teleroentgenograms. It is planned to use machine-learning technology (neural networks) in the software; this will help make the process of calculating teleroentgenograms easier, because methodological points will be placed automatically.

  20. The PyRosetta Toolkit: a graphical user interface for the Rosetta software suite.

    Science.gov (United States)

    Adolf-Bryfogle, Jared; Dunbrack, Roland L

    2013-01-01

    The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.

  1. GUIdock-VNC: using a graphical desktop sharing system to provide a browser-based interface for containerized software.

    Science.gov (United States)

    Mittal, Varun; Hung, Ling-Hong; Keswani, Jayant; Kristiyanto, Daniel; Lee, Sung Bong; Yeung, Ka Yee

    2017-04-01

    Software container technology such as Docker can be used to package and distribute bioinformatics workflows consisting of multiple software implementations and dependencies. However, Docker is a command line-based tool, and many bioinformatics pipelines consist of components that require a graphical user interface. We present a container tool called GUIdock-VNC that uses a graphical desktop sharing system to provide a browser-based interface for containerized software. GUIdock-VNC uses the Virtual Network Computing protocol to render the graphics within most commonly used browsers. We also present a minimal image builder that can add our proposed graphical desktop sharing system to any Docker packages, with the end result that any Docker packages can be run using a graphical desktop within a browser. In addition, GUIdock-VNC uses the Oauth2 authentication protocols when deployed on the cloud. As a proof-of-concept, we demonstrated the utility of GUIdock-noVNC in gene network inference. We benchmarked our container implementation on various operating systems and showed that our solution creates minimal overhead. © The Authors 2017. Published by Oxford University Press.

  2. Graphical analysis of French nuclear power plant production data

    Energy Technology Data Exchange (ETDEWEB)

    Jourdan, J.P. [Electricite de France (EDF), Projet Production EPR 1, 93 - Saint-Denis (France)

    2001-07-01

    The analysis of plant production data presented here uses an original method of graphical analysis. This method clarifies various difficulties of analysing big experience-feedback databases, among them language interpretation and the distinction between scarce events and multi-annual events. In general, the method shows the logical processes that production values obey (pure chance logic, administrative logic, and willpower). This method of graphical analysis provides a tool to observe and question in a concrete way, so that each person involved can put the events in which he played a role into the general context of other plants. It is a deductive method for improving this big and complex system. (author)

  3. Graphical analysis of French nuclear power plant production data

    International Nuclear Information System (INIS)

    Jourdan, J.P.

    2001-01-01

    The analysis of plant production data presented here uses an original method of graphical analysis. This method clarifies various difficulties of analysing big experience-feedback databases, among them language interpretation and the distinction between scarce events and multi-annual events. In general, the method shows the logical processes that production values obey (pure chance logic, administrative logic, and willpower). This method of graphical analysis provides a tool to observe and question in a concrete way, so that each person involved can put the events in which he played a role into the general context of other plants. It is a deductive method for improving this big and complex system. (author)

  4. Discrete Discriminant analysis based on tree-structured graphical models

    DEFF Research Database (Denmark)

    Perez de la Cruz, Gonzalo; Eslava, Guillermina

    The purpose of this paper is to illustrate the potential use of discriminant analysis based on tree-structured graphical models for discrete variables. This is done by comparing its empirical performance using estimated error rates for real and simulated data. The results show that discriminant analysis based on tree-structured graphical models is a simple nonlinear method that is competitive with, and sometimes superior to, other well-known linear methods like those assuming mutual independence between variables and linear logistic regression.

  5. Software Design for Smile Analysis

    Directory of Open Access Journals (Sweden)

    A. Sarkhosh

    2010-12-01

    Full Text Available Introduction: Esthetics and attractiveness of the smile is one of the major demands in contemporary orthodontic treatment. In order to improve a smile design, it is necessary to record the "posed smile" as an intentional, non-pressure, static, natural and reproducible smile. The record then should be analyzed to determine its characteristics. In this study, we intended to design and introduce software to analyze the smile rapidly and precisely in order to produce an attractive smile for the patients. Materials and Methods: For this purpose, a practical study was performed to design the multimedia software "Smile Analysis", which can receive patients' photographs and videographs. After loading records into the software, the operator marks the points and lines displayed in the system's guide and defines the correct scale for each image. Thirty-three variables are measured by the software and displayed on the report page. Reliability of measurements in both image and video was significantly high (0.7-1). Results: In order to evaluate intra-operator and inter-operator reliability, five cases were selected randomly. Statistical analysis showed that calculations performed in the Smile Analysis software were both valid and highly reliable (for both video and photo). Conclusion: The results obtained from smile analysis could be used in diagnosis, treatment planning and evaluation of treatment progress.

  6. A Graphical User Interface for Software-assisted Tracking of Protein Concentration in Dynamic Cellular Protrusions.

    Science.gov (United States)

    Saha, Tanumoy; Rathmann, Isabel; Galic, Milos

    2017-07-11

    Filopodia are dynamic, finger-like cellular protrusions associated with migration and cell-cell communication. In order to better understand the complex signaling mechanisms underlying filopodial initiation, elongation and subsequent stabilization or retraction, it is crucial to determine the spatio-temporal protein activity in these dynamic structures. To analyze protein function in filopodia, we recently developed a semi-automated tracking algorithm that adapts to filopodial shape-changes, thus allowing parallel analysis of protrusion dynamics and relative protein concentration along the whole filopodial length. Here, we present a detailed step-by-step protocol for optimized cell handling, image acquisition and software analysis. We further provide instructions for the use of optional features during image analysis and data representation, as well as troubleshooting guidelines for all critical steps along the way. Finally, we also include a comparison of the described image analysis software with other programs available for filopodia quantification. Together, the presented protocol provides a framework for accurate analysis of protein dynamics in filopodial protrusions using image analysis software.
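    As a hedged illustration of what "relative protein concentration along the whole filopodial length" can mean in practice, the sketch below normalizes a marker-channel profile by a volume-reference channel; whether the published algorithm uses exactly this ratio is an assumption.

```python
import numpy as np

def relative_concentration(marker, reference, eps=1e-9):
    """Relative protein concentration along a filopodium, base to tip.

    marker and reference are 1-D intensity profiles sampled along the
    tracked filopodial backbone; reference is a volume marker used to
    correct for local thickness. Normalizing by the mean makes profiles
    comparable across filopodia of different brightness.
    """
    ratio = np.asarray(marker) / (np.asarray(reference) + eps)
    return ratio / ratio.mean()

# Toy profile showing enrichment towards the tip.
profile = relative_concentration([3.0, 4.0, 6.0, 9.0], [1.0, 1.1, 1.2, 1.0])
print(profile)
```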

  7. Data analysis using a data base driven graphics animation system

    International Nuclear Information System (INIS)

    Schwieder, D.H.; Stewart, H.D.; Curtis, J.N.

    1985-01-01

    A graphics animation system has been developed at the Idaho National Engineering Laboratory (INEL) to assist engineers in the analysis of large amounts of time series data. Most prior attempts at computer animation of data involve the development of large and expensive problem-specific systems. This paper discusses a generalized interactive computer animation system designed to be used in a wide variety of data analysis applications. By using relational data base storage of graphics and control information, considerable flexibility in design and development of animated displays is achieved

  8. A Theoretical Analysis of Learning with Graphics--Implications for Computer Graphics Design.

    Science.gov (United States)

    ChanLin, Lih-Juan

    This paper reviews the literature pertinent to learning with graphics. The dual coding theory provides an explanation of how graphics are stored and processed in semantic memory. The level of processing theory suggests how graphics can be employed in learning to encourage deeper processing. In addition to dual coding theory and level of processing…

  9. Design and Development of a Graphical Setup Software for the CMS Global Trigger

    CERN Document Server

    Glaser, P; Bergauer, H; Padrta, M; Taurok, A; Wulz, C E

    2006-01-01

    The CMS experiment at CERN's Large Hadron Collider will search for new physics at high energies. Its trigger system is an essential component in the selection process of potentially interesting events. The Global Trigger is the final stage of the first-level selection process. It is implemented as a complex electronic system containing logic devices, which need to be programmed according to physics requirements. It has to reject or accept events for further processing based on coarse measurements of particle properties such as energies, momenta and location. Algorithms similar to the ones used in the physics analysis are executed in parallel during the event selection process. A graphical setup program to define these algorithms and to subsequently configure the hardware has been developed. The design and implementation of the program, guided by the principal requirements of flexibility, quality assurance, platform-independence and extensibility, are described.

  10. Graphic Communications--Commercial Photography. Ohio's Competency Analysis Profile.

    Science.gov (United States)

    Ohio State Univ., Columbus. Vocational Instructional Materials Lab.

    This Ohio Competency Analysis Profile (OCAP), derived from a modified Developing a Curriculum (DACUM) process, is a current comprehensive and verified employer competency program list for graphic communications--commercial photography. Each unit (with or without subunits) contains competencies and competency builders that identify the…

  11. KMWin--a convenient tool for graphical presentation of results from Kaplan-Meier survival time analysis.

    Science.gov (United States)

    Gross, Arnd; Ziepert, Marita; Scholz, Markus

    2012-01-01

    Analysis of clinical studies often necessitates multiple graphical representations of the results. Many professional software packages are available for this purpose. Most packages are either only commercially available or hard to use especially if one aims to generate or customize a huge number of similar graphical outputs. We developed a new, freely available software tool called KMWin (Kaplan-Meier for Windows) facilitating Kaplan-Meier survival time analysis. KMWin is based on the statistical software environment R and provides an easy to use graphical interface. Survival time data can be supplied as SPSS (sav), SAS export (xpt) or text file (dat), which is also a common export format of other applications such as Excel. Figures can directly be exported in any graphical file format supported by R. On the basis of a working example, we demonstrate how to use KMWin and present its main functions. We show how to control the interface, customize the graphical output, and analyse survival time data. A number of comparisons are performed between KMWin and SPSS regarding graphical output, statistical output, data management and development. Although the general functionality of SPSS is larger, KMWin comprises a number of features useful for survival time analysis in clinical trials and other applications. These are for example number of cases and number of cases under risk within the figure or provision of a queue system for repetitive analyses of updated data sets. Moreover, major adjustments of graphical settings can be performed easily on a single window. We conclude that our tool is well suited and convenient for repetitive analyses of survival time data. It can be used by non-statisticians and provides often used functions as well as functions which are not supplied by standard software packages. The software is routinely applied in several clinical study groups.
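    For reference, the curve that any Kaplan-Meier tool such as KMWin draws is the standard product-limit estimator of the survival function:

```latex
% Kaplan-Meier estimator of the survival function: at each ordered
% event time t_i, d_i events occur among the n_i subjects still at risk.
\hat{S}(t) = \prod_{i:\; t_i \le t} \left(1 - \frac{d_i}{n_i}\right)
```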

  12. KMWin – A Convenient Tool for Graphical Presentation of Results from Kaplan-Meier Survival Time Analysis

    Science.gov (United States)

    Gross, Arnd; Ziepert, Marita; Scholz, Markus

    2012-01-01

    Background Analysis of clinical studies often necessitates multiple graphical representations of the results. Many professional software packages are available for this purpose. Most packages are either only commercially available or hard to use especially if one aims to generate or customize a huge number of similar graphical outputs. We developed a new, freely available software tool called KMWin (Kaplan-Meier for Windows) facilitating Kaplan-Meier survival time analysis. KMWin is based on the statistical software environment R and provides an easy to use graphical interface. Survival time data can be supplied as SPSS (sav), SAS export (xpt) or text file (dat), which is also a common export format of other applications such as Excel. Figures can directly be exported in any graphical file format supported by R. Results On the basis of a working example, we demonstrate how to use KMWin and present its main functions. We show how to control the interface, customize the graphical output, and analyse survival time data. A number of comparisons are performed between KMWin and SPSS regarding graphical output, statistical output, data management and development. Although the general functionality of SPSS is larger, KMWin comprises a number of features useful for survival time analysis in clinical trials and other applications. These are for example number of cases and number of cases under risk within the figure or provision of a queue system for repetitive analyses of updated data sets. Moreover, major adjustments of graphical settings can be performed easily on a single window. Conclusions We conclude that our tool is well suited and convenient for repetitive analyses of survival time data. It can be used by non-statisticians and provides often used functions as well as functions which are not supplied by standard software packages. The software is routinely applied in several clinical study groups. PMID:22723912

  13. KMWin--a convenient tool for graphical presentation of results from Kaplan-Meier survival time analysis.

    Directory of Open Access Journals (Sweden)

    Arnd Gross

    Full Text Available BACKGROUND: Analysis of clinical studies often necessitates multiple graphical representations of the results. Many professional software packages are available for this purpose. Most packages are either only commercially available or hard to use especially if one aims to generate or customize a huge number of similar graphical outputs. We developed a new, freely available software tool called KMWin (Kaplan-Meier for Windows) facilitating Kaplan-Meier survival time analysis. KMWin is based on the statistical software environment R and provides an easy to use graphical interface. Survival time data can be supplied as SPSS (sav), SAS export (xpt) or text file (dat), which is also a common export format of other applications such as Excel. Figures can directly be exported in any graphical file format supported by R. RESULTS: On the basis of a working example, we demonstrate how to use KMWin and present its main functions. We show how to control the interface, customize the graphical output, and analyse survival time data. A number of comparisons are performed between KMWin and SPSS regarding graphical output, statistical output, data management and development. Although the general functionality of SPSS is larger, KMWin comprises a number of features useful for survival time analysis in clinical trials and other applications. These are for example number of cases and number of cases under risk within the figure or provision of a queue system for repetitive analyses of updated data sets. Moreover, major adjustments of graphical settings can be performed easily on a single window. CONCLUSIONS: We conclude that our tool is well suited and convenient for repetitive analyses of survival time data. It can be used by non-statisticians and provides often used functions as well as functions which are not supplied by standard software packages. The software is routinely applied in several clinical study groups.

  14. Integrated graphical user interface for the back-end software sub-system

    International Nuclear Information System (INIS)

    Badescu, E.; Caprini, M.

    2001-01-01

    The ATLAS data acquisition and Event Filter prototype '-1' project was intended to produce a prototype system for evaluating candidate technologies and architectures for the final ATLAS DAQ system on the LHC accelerator at CERN. Within the prototype project, the back-end sub-system encompasses the software for configuring, controlling and monitoring the data acquisition (DAQ). The back-end sub-system includes core components and detector integration components. One of the detector integration components is the Integrated Graphical User Interface (IGUI), which is intended to give a view of the status of the DAQ system and its sub-systems (Dataflow, Event Filter and Back-end) and to allow the user (general users, such as a shift operator at a test beam or experts, in order to control and debug the DAQ system) to control its operation. The IGUI is intended to be a Status Display and a Control Interface too, so there are three groups of functional requirements: display requirements (the information to be displayed); control requirements (the actions the IGUI shall perform on the DAQ components); general requirements, applying to the general functionality of the IGUI. The constraint requirements include requirements related to the access control (shift operator or expert user). The quality requirements are related to the portability on different platforms. The IGUI has to interact with many components in a distributed environment. The following design guidelines have been considered in order to fulfil the requirements: use a modular design with easy possibility to integrate different sub-systems; use Java language for portability and powerful graphical features; use CORBA interfaces for communication with other components. The actual implementation of Back-end software components use Inter-Language Unification (ILU) for inter-process communication. Different methods of access of Java applications to ILU C++ servers have been evaluated (native methods, ILU Java support

  15. Novel software for quantitative evaluation and graphical representation of masticatory efficiency.

    Science.gov (United States)

    Halazonetis, D J; Schimmel, M; Antonarakis, G S; Christou, P

    2013-05-01

    Blending of chewing gums of different colours is used in the clinical setting as a simple and reliable means for the assessment of chewing efficiency. However, the available software is difficult to use in an everyday clinical setting, and there is no possibility of automated classification of the patient's chewing ability in a graph, to facilitate visualisation of the results and to evaluate potential chewing difficulties. The aims of this study were to test the validity of ViewGum, a novel image-analysis software package for the evaluation of boli derived from a two-colour mixing ability test, and to establish a baseline graph for the representation of masticatory efficiency in a healthy population. Image analysis demonstrated a significant decrease in hue variation as the number of chewing cycles increased, indicating a higher degree of colour mixture. Standard deviation of hue (SDHue) was significantly different between all chewing cycles. Regression of the log-transformed values of the medians of SDHue on the number of chewing cycles showed a high, statistically significant correlation (r² = 0.94). ViewGum is distinguished from established test methods by the simplicity of its application. The newly developed ViewGum software adds speed, ease of use and immediate extraction of clinically useful conclusions to the already established method of chewing efficiency evaluation and is a valid adjunct for the evaluation of masticatory efficiency with two-colour chewing gum. © 2013 Blackwell Publishing Ltd.
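    Because hue is an angular quantity, one reasonable way to compute an SDHue-like statistic is with circular statistics, as sketched below; whether ViewGum itself treats hue circularly is an assumption.

```python
import numpy as np
from scipy.stats import circstd  # circular standard deviation

def sd_hue(hue_degrees):
    """SDHue: standard deviation of hue treated as circular data (0-360 deg)."""
    return np.degrees(circstd(np.radians(hue_degrees)))

# Two colour clusters (poorly mixed) vs one blended hue (well mixed):
poorly_mixed = np.array([5.0, 10.0, 350.0, 170.0, 175.0, 180.0])
well_mixed = np.array([85.0, 88.0, 90.0, 91.0, 92.0, 95.0])
print(sd_hue(poorly_mixed), sd_hue(well_mixed))  # SDHue drops with mixing
```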

  16. Scalable Adaptive Graphics Environment (SAGE) Software for the Visualization of Large Data Sets on a Video Wall

    Science.gov (United States)

    Jedlovec, Gary; Srikishen, Jayanthi; Edwards, Rita; Cross, David; Welch, Jon; Smith, Matt

    2013-01-01

    The use of collaborative scientific visualization systems for the analysis, visualization, and sharing of "big data" available from new high resolution remote sensing satellite sensors or four-dimensional numerical model simulations is propelling the wider adoption of ultra-resolution tiled display walls interconnected by high speed networks. These systems require a globally connected and well-integrated operating environment that provides persistent visualization and collaboration services. This abstract and subsequent presentation describe a new collaborative visualization system installed for NASA's Short-term Prediction Research and Transition (SPoRT) program at Marshall Space Flight Center and its use for Earth science applications. The system consists of a 3 x 4 array of 1920 x 1080 pixel thin bezel video monitors mounted on a wall in a scientific collaboration lab. The monitors are physically and virtually integrated into a 14' x 7' video display. The display of scientific data on the video wall is controlled by a single Alienware Aurora PC with a 2nd Generation Intel Core 4.1 GHz processor, 32 GB memory, and an AMD Fire Pro W600 video card with 6 mini display port connections. Six mini display-to-dual DVI cables are used to connect the 12 individual video monitors. The open source Scalable Adaptive Graphics Environment (SAGE) windowing and media control framework, running on top of the Ubuntu 12 Linux operating system, allows several users to simultaneously control the display and storage of high resolution still and moving graphics in a variety of formats, on tiled display walls of any size. The Ubuntu operating system supports the open source Scalable Adaptive Graphics Environment (SAGE) software which provides a common environment, or framework, enabling its users to access, display and share a variety of data-intensive information. This information can be digital-cinema animations, high-resolution images, high-definition video

  17. Scalable Adaptive Graphics Environment (SAGE) Software for the Visualization of Large Data Sets on a Video Wall

    Science.gov (United States)

    Jedlovec, G.; Srikishen, J.; Edwards, R.; Cross, D.; Welch, J. D.; Smith, M. R.

    2013-12-01

    The use of collaborative scientific visualization systems for the analysis, visualization, and sharing of 'big data' available from new high resolution remote sensing satellite sensors or four-dimensional numerical model simulations is propelling the wider adoption of ultra-resolution tiled display walls interconnected by high speed networks. These systems require a globally connected and well-integrated operating environment that provides persistent visualization and collaboration services. This abstract and subsequent presentation describe a new collaborative visualization system installed for NASA's Short-term Prediction Research and Transition (SPoRT) program at Marshall Space Flight Center and its use for Earth science applications. The system consists of a 3 x 4 array of 1920 x 1080 pixel thin bezel video monitors mounted on a wall in a scientific collaboration lab. The monitors are physically and virtually integrated into a 14' x 7' video display. The display of scientific data on the video wall is controlled by a single Alienware Aurora PC with a 2nd Generation Intel Core 4.1 GHz processor, 32 GB memory, and an AMD Fire Pro W600 video card with 6 mini display port connections. Six mini display-to-dual DVI cables are used to connect the 12 individual video monitors. The open source Scalable Adaptive Graphics Environment (SAGE) windowing and media control framework, running on top of the Ubuntu 12 Linux operating system, allows several users to simultaneously control the display and storage of high resolution still and moving graphics in a variety of formats, on tiled display walls of any size. The Ubuntu operating system supports the open source Scalable Adaptive Graphics Environment (SAGE) software which provides a common environment, or framework, enabling its users to access, display and share a variety of data-intensive information. This information can be digital-cinema animations, high-resolution images, high-definition video

  18. Development Of 12 Head GAMMA Detection And Graphical Presentation Software Suitable For Industrial Process Investigation By Radiotracer Technique

    International Nuclear Information System (INIS)

    Saengchantr, Dhanaj; Chueinta, Siripone

    2009-07-01

    Full text: Data-logging software with prompt graphical presentation, accommodating gamma radiation signals from 12 scintillation detectors through a standard RS-232 interface, has been developed. Laboratory testing by detection of an injected radioactive tracer in a fluid flowing inside a pipe was conducted. The fluid mixed with radioactive tracer passed through the detectors located at several points along the pipe, and the generated signals corresponding to the mass flow inside the pipe were recorded. Up to 10,000 data points at a fast (20 millisecond) dwell time could be accumulated. Graphical presentation allowed fast interpretation, while the output data were suitable for more accurate evaluation with standard software, e.g. Residence Time Distribution (RTD) and Computed Tomography Visualization. Further utilization in industry, in conjunction with radiotracer techniques, for troubleshooting and process optimization will be carried out.

  19. Application of Software Safety Analysis Methods

    International Nuclear Information System (INIS)

    Park, G. Y.; Hur, S.; Cheon, S. W.; Kim, D. H.; Lee, D. Y.; Kwon, K. C.; Lee, S. J.; Koo, Y. H.

    2009-01-01

    A fully digitalized reactor protection system, which is called the IDiPS-RPS, was developed through the KNICS project. The IDiPS-RPS has four redundant and separated channels. Each channel is mainly composed of a group of bistable processors which redundantly compare process variables with their corresponding setpoints and a group of coincidence processors that generate a final trip signal when a trip condition is satisfied. Each channel also contains a test processor called the ATIP and a display and command processor called the COM. All the functions were implemented in software. During the development of the safety software, various software safety analysis methods were applied, in parallel to the verification and validation (V and V) activities, along the software development life cycle. The software safety analysis methods employed were the software hazard and operability (Software HAZOP) study, the software fault tree analysis (Software FTA), and the software failure modes and effects analysis (Software FMEA)

  20. Interactive graphics analysis system for nuclear engineering applications

    International Nuclear Information System (INIS)

    Danchak, M.; Moyer, W.R.; Becker, M.

    1973-01-01

    From working with continuous slowing down theory, the need was recognized for a system which allowed rapid calculation of the theoretical flux, instant comparison with experiment and a simple means of iterating on the slowing down parameters to force flux agreement and reflect cross section modification. Similar requirements exist in other areas of nuclear work for streamlining and simplifying the data analysis process. As a solution, a unique interactive graphics analysis system (RIGAS) was devised to allow a user to calculate, display, compare, manipulate and modify his data without requiring any programming on his part. This was accomplished by establishing human primacy, through extensive human factor considerations, and designing a man-machine dialogue which responds to the mere push of a button. This system results in an instrument which maximizes man's decision making capability and the computer's speed to improve graphic communication and data analysis. (14 figs) (U.S.)

  1. MAUS: MICE Analysis User Software

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    The Muon Ionization Cooling Experiment (MICE) has developed the MICE Analysis User Software (MAUS) to simulate and analyse experimental data. It serves as the primary codebase for the experiment, providing for online data quality checks and offline batch simulation and reconstruction. The code is structured in a Map-Reduce framework to allow parallelization whether on a personal machine or in the control room. Various software engineering practices from industry are also used to ensure correct and maintainable physics code, which include unit, functional and integration tests, continuous integration and load testing, code reviews, and distributed version control systems. Lastly, there are various small design decisions like using JSON as the data structure, using SWIG to allow developers to write components in either Python or C++, or using the SCons python-based build system that may be of interest to other experiments.

  2. Development of a Computer-aided Learning System for Graphical Analysis of Continuous-Time Control Systems

    Directory of Open Access Journals (Sweden)

    J. F. Opadiji

    2010-06-01

    Full Text Available We present the development and deployment process of a computer-aided learning tool which serves as a training aid for undergraduate control engineering courses. We show the process of algorithm construction and implementation of the software which is also aimed at teaching software development at undergraduate level. The scope of this project is limited to graphical analysis of continuous-time control systems.
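    As a hedged illustration of the kind of graphical analysis such a tool automates, the sketch below computes step and Bode responses for a hypothetical second-order plant with SciPy; the plant and the use of Python are assumptions for illustration, not details of the tool described above.

```python
import numpy as np
from scipy import signal
import matplotlib.pyplot as plt

# Hypothetical underdamped second-order plant G(s) = 1 / (s^2 + 0.6 s + 1).
system = signal.TransferFunction([1.0], [1.0, 0.6, 1.0])

t, y = signal.step(system)           # step response
w, mag, phase = signal.bode(system)  # frequency response (dB, degrees)

fig, (ax1, ax2) = plt.subplots(2, 1)
ax1.plot(t, y); ax1.set_xlabel("time (s)"); ax1.set_ylabel("output")
ax2.semilogx(w, mag); ax2.set_xlabel("rad/s"); ax2.set_ylabel("gain (dB)")
plt.tight_layout(); plt.show()
```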

  3. INSPECT: A graphical user interface software package for IDARC-2D

    Science.gov (United States)

    AlHamaydeh, Mohammad; Najib, Mohamad; Alawnah, Sameer

    Modern-day Performance-Based Earthquake Engineering (PBEE) pivots on nonlinear analysis and its feasibility. IDARC-2D is a widely used and accepted software for nonlinear analysis; it possesses many attractive features and capabilities. However, it is operated from the command prompt in the DOS/Unix systems and requires elaborate text-based input files creation by the user. To complement and facilitate the use of IDARC-2D, a pre-processing GUI software package (INSPECT) is introduced herein. INSPECT is created in the C# environment and utilizes the .NET libraries and SQLite database. Extensive testing and verification demonstrated successful and high-fidelity re-creation of several existing IDARC-2D input files. Its design and built-in features aim at expediting, simplifying and assisting in the modeling process. Moreover, this practical aid enhances the reliability of the results and improves accuracy by reducing and/or eliminating many potential and common input mistakes. Such benefits would be appreciated by novice and veteran IDARC-2D users alike.

  4. GWAMA: software for genome-wide association meta-analysis

    Directory of Open Access Journals (Sweden)

    Mägi Reedik

    2010-05-01

    Full Text Available Abstract Background Despite the recent success of genome-wide association studies in identifying novel loci contributing effects to complex human traits, such as type 2 diabetes and obesity, much of the genetic component of variation in these phenotypes remains unexplained. One way of improving power to detect further novel loci is through meta-analysis of studies from the same population, increasing the sample size over any individual study. Although statistical software analysis packages incorporate routines for meta-analysis, they are ill equipped to meet the challenges of the scale and complexity of data generated in genome-wide association studies. Results We have developed flexible, open-source software for the meta-analysis of genome-wide association studies. The software incorporates a variety of error trapping facilities, and provides a range of meta-analysis summary statistics. The software is distributed with scripts that allow simple formatting of files containing the results of each association study and generate graphical summaries of genome-wide meta-analysis results. Conclusions The GWAMA (Genome-Wide Association Meta-Analysis) software has been developed to perform meta-analysis of summary statistics generated from genome-wide association studies of dichotomous phenotypes or quantitative traits. Software with source files, documentation and example data files are freely available online at http://www.well.ox.ac.uk/GWAMA.
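    For background, the summary statistic at the heart of a fixed-effects genome-wide meta-analysis is the standard inverse-variance weighted estimate; assuming GWAMA follows this conventional scheme (a reasonable but unverified assumption here), for each SNP across K studies:

```latex
% Fixed-effects inverse-variance meta-analysis of one SNP across K
% studies, with per-study effect estimates beta_k and standard errors se_k:
w_k = \frac{1}{se_k^{2}}, \qquad
\hat{\beta} = \frac{\sum_{k=1}^{K} w_k \,\beta_k}{\sum_{k=1}^{K} w_k}, \qquad
\mathrm{se}(\hat{\beta}) = \left(\sum_{k=1}^{K} w_k\right)^{-1/2}
```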

  5. The use of Graphic User Interface for development of a user-friendly CRS-Stack software

    Science.gov (United States)

    Sule, Rachmat; Prayudhatama, Dythia; Perkasa, Muhammad D.; Hendriyana, Andri; Fatkhan; Sardjito; Adriansyah

    2017-04-01

    The development of user-friendly Common Reflection Surface (CRS) Stack software built around a Graphical User Interface (GUI) is described in this paper. The original CRS-Stack software developed by the WIT Consortium is compiled in the Unix/Linux environment and is not user-friendly: the user must write commands and parameters manually in a script file. Due to this limitation, the CRS-Stack has not become a popular method, although applying it is a promising way to obtain better seismic sections, with better reflector continuity and S/N ratio. After successful results on several seismic datasets belonging to oil companies in Indonesia, the idea arose to develop user-friendly software in our own laboratory. A Graphical User Interface (GUI) is a type of user interface that allows people to interact with computer programs in a better way: rather than typing commands and module parameters, users work with graphical icons and visual indicators, which makes programs much simpler and easier to use. The use of complicated Seismic Unix shell scripts can thus be avoided. The Java Swing GUI library is used to develop this CRS-Stack GUI, and every shell script that represents a seismic process is invoked from the Java environment. Besides providing an interactive GUI to perform CRS-Stack processing, this CRS-Stack GUI is designed to help geophysicists manage projects with complex seismic processing procedures. The CRS-Stack GUI software is composed of input directories, operators, and output directories, which together define a seismic data processing workflow. The CRS-Stack processing workflow involves four steps, i.e. automatic CMP stack, initial CRS-Stack, optimized CRS-Stack, and CRS-Stack Supergather. Those operations are visualized in an informative flowchart with a self-explanatory system to guide the user inputting the

  6. GRAPHICAL ANALYSIS OF LAFFER'S THEORY FOR EUROPEAN UNION MEMBER STATES

    Directory of Open Access Journals (Sweden)

    LILIANA BUNESCU

    2013-04-01

    Full Text Available More often than not, the current situation of a country depends on the historical development of its own tax system. A practical question for any government is to determine the optimal taxation rate, the level that brings the state the highest tax revenues. A good place to start is with what is popularly known as the Laffer curve. This paper aims to determine, in graphical terms, where European economies rank along the Laffer curve, using data series provided by the European Commission and the World Bank. A graphical analysis of Laffer's theory can only indicate the positioning on one side or the other of the revenue-maximizing point, a position that can nevertheless influence fiscal policy decisions. The conclusions at European Union level are simple: the taxation rate at the fiscal optimum varies from one Member State to another, from 48.9% in Denmark to 28% in Romania, with an average of 37.1% for the EU-27.
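
    The graphical exercise described here amounts to fitting a concave revenue-versus-rate curve and reading off its peak. A minimal sketch with invented data (the numbers are not the paper's, and a quadratic is only the simplest concave Laffer shape):

    ```python
    import numpy as np

    # Hypothetical (rate, revenue) observations for one country.
    rates = np.array([20, 25, 30, 35, 40, 45, 50])               # tax rate, %
    revenue = np.array([14, 16.5, 18.2, 19.1, 19.3, 18.8, 17.6]) # % of GDP

    a, b, c = np.polyfit(rates, revenue, deg=2)  # revenue ~ a*r**2 + b*r + c
    optimal_rate = -b / (2 * a)                  # vertex of the parabola
    print(f"revenue-maximising rate is about {optimal_rate:.1f}%")
    ```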

  7. Analysis of graphical representation among freshmen in undergraduate physics laboratory

    Science.gov (United States)

    Adam, A. S.; Anggrayni, S.; Kholiq, A.; Putri, N. P.; Suprapto, N.

    2018-03-01

    Understanding physics concepts is the main purpose of the physics laboratory for freshmen in the undergraduate program; this includes the ability to interpret the meaning of a graph and draw an appropriate conclusion. This particular study analyses graphical representation among freshmen in an undergraduate physics laboratory. The study is empirical, with a quantitative approach. The graphical representation covers three physics topics: the velocity of sound, the simple pendulum and the spring system. The results show that most of the freshmen (90% of the sample) can make a graph based on data from the physics laboratory, meaning that they can transfer raw data presented in a table into a physics graph. Most freshmen use the principle of proportionality between variables in graph analysis. However, they cannot plot the graph in terms of variables appropriate for gaining more information, and cannot analyse the graph to obtain useful information from its slope.

  8. A software platform for the analysis of dermatology images

    Science.gov (United States)

    Vlassi, Maria; Mavraganis, Vlasios; Asvestas, Panteleimon

    2017-11-01

    The purpose of this paper is to present a software platform, developed in the Python programming environment, that can be used for the processing and analysis of dermatology images. The platform provides the capability to read files containing dermatology images and supports image formats such as Windows bitmaps, JPEG, JPEG2000, portable network graphics and TIFF. Furthermore, it provides suitable tools for selecting, either manually or automatically, a region of interest (ROI) on the image; the automated selection of a ROI includes filtering for smoothing the image, followed by thresholding. The proposed software platform has a friendly and clear graphical user interface and could be a useful second-opinion tool for a dermatologist. Furthermore, it could be used to classify images from other anatomical parts, such as breast or lung, after proper re-training of the classification algorithms.
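
    A minimal sketch of the automated ROI selection step (smoothing followed by thresholding) as it might look in Python; the choice of Gaussian smoothing and Otsu thresholding is an assumption, since the abstract does not name the exact filters:

    ```python
    from skimage import io, filters

    def select_roi(path, sigma=2.0):
        """Automated ROI selection: Gaussian smoothing followed by
        Otsu thresholding, as outlined in the abstract."""
        image = io.imread(path, as_gray=True)
        smoothed = filters.gaussian(image, sigma=sigma)
        threshold = filters.threshold_otsu(smoothed)
        # Lesions are typically darker than the surrounding skin.
        return smoothed < threshold

    # Usage (hypothetical file name):
    #   mask = select_roi("lesion.jpg")   # boolean ROI mask, same shape as image
    ```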

  9. Plots, Calculations and Graphics Tools (PCG2). Software Transfer Request Presentation

    Science.gov (United States)

    Richardson, Marilou R.

    2010-01-01

    This slide presentation reviews the development of the Plots, Calculations and Graphics Tools (PCG2) system. PCG2 is an easy to use tool that provides a single user interface to view data in a pictorial, tabular or graphical format. It allows the user to view the same display and data in the Control Room, engineering office area, or remote sites. PCG2 supports extensive and regular engineering needs that are both planned and unplanned and it supports the ability to compare, contrast and perform ad hoc data mining over the entire domain of a program's test data.

  10. The SAVI Vulnerability Analysis Software Package

    International Nuclear Information System (INIS)

    Mc Aniff, R.J.; Paulus, W.K.; Key, B.; Simpkins, B.

    1987-01-01

    SAVI (Systematic Analysis of Vulnerability to Intrusion) is a new PC-based software package for modeling Physical Protection Systems (PPS). SAVI utilizes a path analysis approach based on the Adversary Sequence Diagram (ASD) methodology. A highly interactive interface allows the user to accurately model complex facilities, maintain a library of these models on disk, and calculate the most vulnerable paths through any facility. Recommendations are provided to help the user choose facility upgrades which should reduce identified path vulnerabilities. Pop-up windows throughout SAVI are used for the input and display of information. A menu at the top of the screen presents all options to the user. These options are further explained on a message line directly below the menu. A diagram on the screen graphically represents the current protection system model. All input is checked for errors, and data are presented in a logical and clear manner. Print utilities provide the user with hard copies of all information and calculated results
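
    Path analysis over an Adversary Sequence Diagram can be reduced to a shortest-path search. A simplified sketch with hypothetical detection probabilities; SAVI's actual metric also accounts for timely detection and response, which this toy model omits:

    ```python
    import math
    import networkx as nx

    # Hypothetical adversary sequence diagram: each edge carries the
    # probability of detection p_d for that protection element.
    edges = [
        ("offsite", "fence", 0.3), ("fence", "door", 0.5),
        ("fence", "wall", 0.2), ("door", "vault", 0.9),
        ("wall", "vault", 0.7),
    ]

    G = nx.DiGraph()
    for u, v, pd in edges:
        # Maximising the product of (1 - p_d) along a path equals
        # minimising the sum of -log(1 - p_d): a shortest-path problem.
        G.add_edge(u, v, weight=-math.log(1.0 - pd), pd=pd)

    path = nx.dijkstra_path(G, "offsite", "vault")
    p_evade = math.prod(1.0 - G[u][v]["pd"] for u, v in zip(path, path[1:]))
    print(path, f"adversary evades detection with p={p_evade:.2f}")
    ```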

  11. Software Performs Complex Design Analysis

    Science.gov (United States)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability (a major advancement in solving these two aforementioned problems) to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.

  12. SDAR: a practical tool for graphical analysis of two-dimensional data

    Directory of Open Access Journals (Sweden)

    Weeratunga Saroja

    2012-08-01

    Full Text Available Abstract Background Two-dimensional data needs to be processed and analysed in almost any experimental laboratory. Some tasks in this context may be performed with generic software such as spreadsheet programs, which are available ubiquitously; others may require more specialised software that requires paid licences. Additionally, more complex software packages typically require more time from the individual user to understand and operate. Practical and convenient graphical data analysis software in Java with a user-friendly interface is rare. Results We have developed SDAR, a Java application to analyse two-dimensional data with an intuitive graphical user interface. A smart ASCII parser allows import of data into SDAR without particular format requirements. The centrepiece of SDAR is the Java class GraphPanel, which provides methods for generic tasks of data visualisation. Data can be manipulated and analysed with respect to the most common operations experienced in an experimental biochemical laboratory. Images of the data plots can be generated in SVG, TIFF or PNG format. Data exported by SDAR is annotated with commands compatible with the Grace software. Conclusion Since SDAR is implemented in Java, it is truly cross-platform compatible. The software is easy to install and very convenient to use, judging by experience in our own laboratories. It is freely available to academic users at http://www.structuralchemistry.org/pcsb/. To download SDAR, users will be asked for their name, institution and email address. A manual, as well as the source code of the GraphPanel class, can also be downloaded from this site.
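
    The "smart ASCII parser" idea (import without particular format requirements) can be sketched as follows; this is a loose Python analogue of the Java original, and the specific heuristics are assumptions:

    ```python
    import re

    def parse_xy(path):
        """Tolerant two-column ASCII parser: skips comment/header lines,
        accepts comma, tab or whitespace delimiters, and keeps only lines
        that yield at least two numbers."""
        number = re.compile(r"[-+]?\d*\.?\d+(?:[eE][-+]?\d+)?")
        xs, ys = [], []
        with open(path) as fh:
            for line in fh:
                if line.lstrip().startswith(("#", "//", ";")):
                    continue
                values = number.findall(line.replace(",", " "))
                if len(values) >= 2:
                    xs.append(float(values[0]))
                    ys.append(float(values[1]))
        return xs, ys
    ```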

  13. Software applications for flux balance analysis.

    Science.gov (United States)

    Lakshmanan, Meiyappan; Koh, Geoffrey; Chung, Bevan K S; Lee, Dong-Yup

    2014-01-01

    Flux balance analysis (FBA) is a widely used computational method for characterizing and engineering intrinsic cellular metabolism. The increasing number of its successful applications and its growing popularity are possibly attributable to the availability of specific software tools for FBA. Each tool has its unique features and limitations with respect to operational environment, user interface and supported analysis algorithms. Presented herein is an in-depth evaluation of currently available FBA applications, focusing mainly on usability, functionality, graphical representation and inter-operability. Overall, most of the applications are able to perform the basic features of model creation and FBA simulation. The COBRA toolbox, OptFlux and FASIMU are versatile, supporting advanced in silico algorithms to identify environmental and genetic targets for strain design. SurreyFBA, WEbcoli, Acorn, FAME, GEMSiRV and MetaFluxNet are the distinct tools which provide user-friendly interfaces for model handling. In terms of software architecture, FBA-SimVis and OptFlux have flexible environments, as they enable the plug-in/add-on feature to aid prospective functional extensions. Notably, an increasing trend towards the implementation of more tailored e-services, such as central model repositories and assistance to collaborative efforts, was observed among the web-based applications with the help of advanced web technologies. Furthermore, the most recent applications, such as Model SEED, FAME, MetaFlux and MicrobesFlux, have even included several routines to facilitate the reconstruction of genome-scale metabolic models. Finally, a brief discussion of the future directions of FBA applications is given for the benefit of potential tool developers.
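
    At its core, FBA is a linear program: maximise an objective flux subject to steady-state mass balance S v = 0 and flux bounds. A minimal sketch with a toy three-reaction network (the matrix and bounds are invented for illustration):

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy stoichiometric matrix S (metabolites x reactions); a real
    # genome-scale model has thousands of reactions.
    S = np.array([
        [1, -1,  0],   # metabolite A: produced by v0, consumed by v1
        [0,  1, -1],   # metabolite B: produced by v1, consumed by v2
    ])
    bounds = [(0, 10), (0, 10), (0, 10)]  # flux bounds for v0..v2
    c = np.array([0, 0, -1.0])            # maximise v2 (e.g. biomass): minimise -v2

    # FBA: maximise the objective subject to the steady state S v = 0.
    res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds,
                  method="highs")
    print("optimal fluxes:", res.x)       # expected: [10, 10, 10]
    ```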

  14. UPVapor: Cofrentes nuclear power plant production results analysis software

    International Nuclear Information System (INIS)

    Curiel, M.; Palomo, M. J.; Baraza, A.; Vaquer, J.

    2010-10-01

    UPVapor software version 02 has been developed for the Data Analysis Department of the Cofrentes nuclear power plant (Spain). It is a graphical analysis environment in which users have available all the plant variables registered in the process computer system (SIEC). To this end, UPVapor offers many advanced graphic tools for ease of work, as well as a friendly environment that is easy to use and highly configurable. Plant variables are classified in the same way as in the SIEC computer, and their values are taken from it through the Iberdrola network. UPVapor can generate two different types of graphics: evolution graphs and X-Y graphs. The first type analyses the evolution of up to twenty plant variables over a user-defined time period, according to historic plant files. Many tools are available: cursors, graphic configuration, moving averages, visualization of invalid data, and more. Moreover, a particular analysis configuration can be saved as a preselection, giving the possibility of loading the preselection directly and quickly monitoring a group of preselected plant variables. In X-Y graphs, it is possible to analyse one variable against another over a defined time. As an option, users can filter the data by the range of a chosen variable, with the possibility of programming up to five filters. Like the other graph type, the X-Y graph has many configuration, saving and printing options. With UPVapor, data analysts can save valuable time during daily work and, as it is easy to use, it permits other users to perform their own analyses without asking the analysts to develop them. Besides, it can be used from any work centre with access to the network. (Author)

  15. UPVapor: Cofrentes nuclear power plant production results analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Curiel, M. [Logistica y Acondicionamientos Industriales SAU, Sorolla Center, local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain); Palomo, M. J. [ISIRYM, Universidad Politecnica de Valencia, Camino de Vera s/n, Valencia (Spain); Baraza, A. [Iberdrola Generacion S. A., Central Nuclear Cofrentes, Carretera Almansa Requena s/n, 04662 Cofrentes, Valencia (Spain); Vaquer, J., E-mail: m.curiel@lainsa.co [TITANIA Servicios Tecnologicos SL, Sorolla Center, local 10, Av. de las Cortes Valencianas No. 58, 46015 Valencia (Spain)

    2010-10-15

    UPVapor software version 02 has been developed for the Data Analysis Department of the Cofrentes nuclear power plant (Spain). It is a graphical analysis environment in which users have available all the plant variables registered in the process computer system (SIEC). To this end, UPVapor offers many advanced graphic tools for ease of work, as well as a friendly environment that is easy to use and highly configurable. Plant variables are classified in the same way as in the SIEC computer, and their values are taken from it through the Iberdrola network. UPVapor can generate two different types of graphics: evolution graphs and X-Y graphs. The first type analyses the evolution of up to twenty plant variables over a user-defined time period, according to historic plant files. Many tools are available: cursors, graphic configuration, moving averages, visualization of invalid data, and more. Moreover, a particular analysis configuration can be saved as a preselection, giving the possibility of loading the preselection directly and quickly monitoring a group of preselected plant variables. In X-Y graphs, it is possible to analyse one variable against another over a defined time. As an option, users can filter the data by the range of a chosen variable, with the possibility of programming up to five filters. Like the other graph type, the X-Y graph has many configuration, saving and printing options. With UPVapor, data analysts can save valuable time during daily work and, as it is easy to use, it permits other users to perform their own analyses without asking the analysts to develop them. Besides, it can be used from any work centre with access to the network. (Author)

  16. EPRI compact analyzer: A compact, interactive and color-graphics based simulator for power plant analysis

    International Nuclear Information System (INIS)

    Ipakchi, A.; Khadem, M.; Chen, H.; Colley, R.W.

    1986-01-01

    This paper presents the results of an EPRI sponsored project (RP2395-2) for the design and development of an interactive, color-graphics based simulator for power plant analysis. The system is called the Compact Analyzer and can be applied to engineering and training applications in the utility industry. The Compact Analyzer's software and system design are described. Results of two demonstration systems, one for a nuclear plant and one for a fossil plant, are presented, and the application of the Compact Analyzer to operating procedures evaluation is discussed

  17. Intercomparison of gamma ray analysis software packages

    International Nuclear Information System (INIS)

    1998-04-01

    The IAEA undertook an intercomparison exercise to review available software for gamma ray spectra analysis. This document describes the methods used in the intercomparison exercise, characterizes the software packages reviewed and presents the results obtained. Only direct results are given without any recommendation for a particular software or method for gamma ray spectra analysis

  18. Infusing Reliability Techniques into Software Safety Analysis

    Science.gov (United States)

    Shi, Ying

    2015-01-01

    Software safety analysis for a large software intensive system is always a challenge. Software safety practitioners need to ensure that software related hazards are completely identified, controlled, and tracked. This paper discusses in detail how to incorporate the traditional reliability techniques into the entire software safety analysis process. In addition, this paper addresses how information can be effectively shared between the various practitioners involved in the software safety analyses. The author has successfully applied the approach to several aerospace applications. Examples are provided to illustrate the key steps of the proposed approach.

  19. Graphical analysis of power systems for mobile robotics

    Science.gov (United States)

    Raade, Justin William

    The field of mobile robotics places stringent demands on the power system. Energetic autonomy, or the ability to function for a useful operation time independent of any tether, refueling, or recharging, is a driving force in a robot designed for a field application. The focus of this dissertation is the development of two graphical analysis tools, namely Ragone plots and optimal hybridization plots, for the design of human scale mobile robotic power systems. These tools contribute to the intuitive understanding of the performance of a power system and expand the toolbox of the design engineer. Ragone plots are useful for graphically comparing the merits of different power systems for a wide range of operation times. They plot the specific power versus the specific energy of a system on logarithmic scales. The driving equations in the creation of a Ragone plot are derived in terms of several important system parameters. Trends at extreme operation times (both very short and very long) are examined. Ragone plot analysis is applied to the design of several power systems for high-power human exoskeletons. Power systems examined include a monopropellant-powered free piston hydraulic pump, a gasoline-powered internal combustion engine with hydraulic actuators, and a fuel cell with electric actuators. Hybrid power systems consist of two or more distinct energy sources that are used together to meet a single load. They can often outperform non-hybrid power systems in low duty-cycle applications or those with widely varying load profiles and long operation times. Two types of energy sources are defined: engine-like and capacitive. The hybridization rules for different combinations of energy sources are derived using graphical plots of hybrid power system mass versus the primary system power. Optimal hybridization analysis is applied to several power systems for low-power human exoskeletons. Hybrid power systems examined include a fuel cell and a solar panel coupled with
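
    A Ragone curve can be constructed by splitting the source mass into an energy part and a power part and sweeping the operation time. The model form and parameter values below are illustrative assumptions, not the dissertation's:

    ```python
    import numpy as np
    import matplotlib.pyplot as plt

    def ragone(e_hat, p_hat, times):
        """System specific energy/power vs. operation time for a source whose
        mass splits into an energy part (P*t/e_hat) and a power part (P/p_hat).
        e_hat: specific energy of the energy section, J/kg
        p_hat: specific power of the power section, W/kg"""
        P = 100.0                              # load power, W (cancels out)
        mass = P / p_hat + P * times / e_hat   # total source mass, kg
        return P * times / mass, P / mass      # (J/kg, W/kg)

    t = np.logspace(0, 5, 200)                 # 1 s .. roughly 28 h
    for name, e_hat, p_hat in [("battery", 6e5, 300.0),
                               ("engine+fuel", 4e7, 100.0)]:
        se, sp = ragone(e_hat, p_hat, t)
        plt.loglog(se, sp, label=name)         # log-log, as for Ragone plots
    plt.xlabel("specific energy (J/kg)"); plt.ylabel("specific power (W/kg)")
    plt.legend(); plt.show()
    ```

    The limiting behaviour matches the trends the dissertation examines: at very short times the specific power approaches p_hat, and at very long times the specific energy approaches e_hat.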

  20. A graphical user-interface for propulsion system analysis

    Science.gov (United States)

    Curlett, Brian P.; Ryall, Kathleen

    1993-01-01

    NASA LeRC uses a series of computer codes to calculate installed propulsion system performance and weight. The need to evaluate more advanced engine concepts with a greater degree of accuracy has resulted in an increase in complexity of this analysis system. Therefore, a graphical user interface was developed to allow the analyst to more quickly and easily apply these codes. The development of this interface and the rationale for the approach taken are described. The interface consists of a method of pictorially representing and editing the propulsion system configuration, forms for entering numerical data, on-line help and documentation, post processing of data, and a menu system to control execution.

  1. Development of Software for Measurement and Analysis of Solar Radiation

    International Nuclear Information System (INIS)

    Mohamad Idris Taib; Abul Adli Anuar; Noor Ezati Shuib

    2015-01-01

    This software is being developed in LabVIEW for use with StellarNet spectrometer systems that communicate with the computer via USB. LabVIEW has capabilities in hardware interfacing, graphical user interfaces and mathematical calculation, including array manipulation and processing. The software reads data from the StellarNet spectrometer in real time and then processes it for analysis. Several measurements and analyses of solar radiation have been performed. Solar radiation consists mainly of infrared, visible light and ultraviolet. From solar radiation spectrum data, information on weather and on the suitability of plants can be gathered and analyzed. Furthermore, the optimized utilization of solar radiation, and the corresponding safety precautions, can be planned. Using this software, more research and development in the utilization and safety of solar radiation can be explored. (author)

  2. Development of virtual hands using animation software and graphical modelling; Elaboracao de maos virtuais usando software de animacao e modelagem grafica

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Erick da S.; Junior, Alberico B. de C. [Universidade Federal de Sergipe (UFSE), Sao Cristovao, SE (Brazil)

    2016-07-01

    Numerical dosimetry uses virtual anthropomorphic simulators to represent the human being in a computational framework and thus assess the risks associated with exposure to a radioactive source. The development of computer animation software has facilitated the creation of these simulators, requiring only knowledge of human anatomy to prepare various types of simulators (man, woman, child and baby) in various positions (sitting, standing, running), or parts thereof (head, trunk and limbs). These simulators are constructed by manipulating meshes and, owing to the versatility of the method, one can create various irradiation geometries that were not possible before. In this work, we have built a radiopharmaceutical handling scenario in which virtual hands manipulate radioactive material, using animation and graphical modelling software together with an anatomical database. (author)

  3. Development of Emittance Analysis Software for Ion Beam Characterization

    International Nuclear Information System (INIS)

    Padilla, M.J.; Liu, Yuan

    2007-01-01

    Transverse beam emittance is a crucial property of charged particle beams that describes their angular and spatial spread. It is a figure of merit frequently used to determine the quality of ion beams, the compatibility of an ion beam with a given beam transport system, and the ability to suppress neighboring isotopes at on-line mass separator facilities. Generally, a high-quality beam is characterized by a small emittance. In order to determine and improve the quality of ion beams used at the Holifield Radioactive Ion Beam Facility (HRIBF) for nuclear physics and nuclear astrophysics research, the emittances of the ion beams are measured at the off-line Ion Source Test Facilities. In this project, emittance analysis software was developed to perform various data processing tasks for noise reduction, to evaluate root-mean-square emittance, Twiss parameters, and area emittance of different beam fractions. The software also provides 2D and 3D graphical views of the emittance data, beam profiles, emittance contours, and RMS. Noise exclusion is essential for accurate determination of beam emittance values. A Self-Consistent, Unbiased Elliptical Exclusion (SCUBEEx) method is employed. Numerical data analysis techniques such as interpolation and nonlinear fitting are also incorporated into the software. The software will provide a simplified, fast tool for comprehensive emittance analysis. The main functions of the software package have been completed. In preliminary tests with experimental emittance data, the analysis results using the software were shown to be accurate
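
    The RMS emittance and Twiss parameters named above follow from second moments of the measured (x, x') distribution. A minimal sketch; intensity weighting and the SCUBEEx noise exclusion are omitted here:

    ```python
    import numpy as np

    def rms_emittance(x, xp):
        """Statistical RMS emittance and Twiss parameters from sampled
        transverse coordinates x (mm) and divergences x' (mrad)."""
        x = np.asarray(x) - np.mean(x)
        xp = np.asarray(xp) - np.mean(xp)
        s_xx, s_pp, s_xp = np.mean(x * x), np.mean(xp * xp), np.mean(x * xp)
        eps = np.sqrt(s_xx * s_pp - s_xp ** 2)   # RMS emittance, mm*mrad
        alpha, beta, gamma = -s_xp / eps, s_xx / eps, s_pp / eps
        return eps, alpha, beta, gamma           # beta*gamma - alpha^2 = 1

    # Toy correlated beam: 10k particles with a slight x-x' correlation.
    rng = np.random.default_rng(0)
    x = rng.normal(0, 1.0, 10_000)
    xp = 0.3 * x + rng.normal(0, 0.5, 10_000)
    print(rms_emittance(x, xp))
    ```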

  4. DEVELOPMENT OF EMITTANCE ANALYSIS SOFTWARE FOR ION BEAM CHARACTERIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Padilla, M. J.; Liu, Y.

    2007-01-01

    Transverse beam emittance is a crucial property of charged particle beams that describes their angular and spatial spread. It is a figure of merit frequently used to determine the quality of ion beams, the compatibility of an ion beam with a given beam transport system, and the ability to suppress neighboring isotopes at on-line mass separator facilities. Generally a high quality beam is characterized by a small emittance. In order to determine and improve the quality of ion beams used at the Holifield Radioactive Ion Beam Facility (HRIBF) for nuclear physics and nuclear astrophysics research, the emittances of the ion beams are measured at the off-line Ion Source Test Facilities. In this project, emittance analysis software was developed to perform various data processing tasks for noise reduction, to evaluate root-mean-square emittance, Twiss parameters, and area emittance of different beam fractions. The software also provides 2D and 3D graphical views of the emittance data, beam profiles, emittance contours, and RMS. Noise exclusion is essential for accurate determination of beam emittance values. A Self-Consistent, Unbiased Elliptical Exclusion (SCUBEEx) method is employed. Numerical data analysis techniques such as interpolation and nonlinear fitting are also incorporated into the software. The software will provide a simplified, fast tool for comprehensive emittance analysis. The main functions of the software package have been completed. In preliminary tests with experimental emittance data, the analysis results using the software were shown to be accurate.

  5. Flowfield computer graphics

    Science.gov (United States)

    Desautel, Richard

    1993-01-01

    The objectives of this research include supporting the Aerothermodynamics Branch's research by developing graphical visualization tools for both the branch's adaptive grid code and flow field ray tracing code. The completed research for the reporting period includes development of a graphical user interface (GUI) and its implementation into the NAS Flowfield Analysis Software Tool kit (FAST), for both the adaptive grid code (SAGE) and the flow field ray tracing code (CISS).

  6. C language program analysis system (CLAS) part 1: graphical user interface (GUI)

    International Nuclear Information System (INIS)

    Bhattacharjee, A.K.; Seby, A.; Sen, Gopa; Dhodapkar, S.D.

    1994-01-01

    CLAS (C Language Program Analysis System) is a reverse engineering tool intended for use in the verification and validation (V and V) phase of software programs developed in the ANSI C language. From the source code, CLAS generates data pertaining to two conceptual models of software programs, viz. the Entity-Relationship (E-R) model and the Control Flow Graph (CFG) model. Browsing tools within CLAS make use of this data to provide different graphical views of the project. Static analysis tools had been developed earlier for analysing assembly language programs; CLAS is a continuation of this work, providing automated support for the analysis of ANSI C language programs. CLAS provides an integrated Graphical User Interface (GUI) based environment under which programs can be analysed into the above mentioned models and the analysed data can be viewed using the browsing tools. The GUI of CLAS is implemented using an OPEN LOOK compliant toolkit, XVIEW, on a Sun SPARC IPC workstation running Sun OS 4.1.1 rev. B. This report describes the GUI of CLAS. CLAS is also expected to be useful in other contexts which may involve understanding the architecture/structure of already developed C language programs. Such requirements can arise while carrying out activities like code modification, parallelising etc. (author). 5 refs., 13 figs., 1 appendix

  7. Fault tree analysis of KNICS RPS software

    International Nuclear Information System (INIS)

    Park, Gee Yong; Kwon, Kee Choon; Koh, Kwang Yong; Jee, Eun Kyoung; Seong, Poong Hyun; Lee, Dae Hyung

    2008-01-01

    This paper describes the application of a software Fault Tree Analysis (FTA) as one of the analysis techniques for a Software Safety Analysis (SSA) at the design phase and its analysis results for the safety-critical software of a digital reactor protection system, which is called the KNICS RPS, being developed in the KNICS (Korea Nuclear Instrumentation and Control Systems) project. The software modules in the design description were represented by Function Blocks (FBs), and the software FTA was performed based on the well-defined fault tree templates for the FBs. The SSA, which is part of the verification and validation (V and V) activities, was activated at each phase of the software lifecycle for the KNICS RPS. At the design phase, the software HAZOP (Hazard and Operability) and the software FTA were employed in the SSA in such a way that the software HAZOP was performed first and then the software FTA was applied. The software FTA was applied to some critical modules selected from the software HAZOP analysis
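
    A fault tree bottoms out in simple gate algebra: an AND gate multiplies input probabilities, an OR gate complements the product of complements. A minimal evaluator assuming independent basic events; the KNICS analysis itself works from well-defined fault tree templates for Function Blocks, which this sketch does not reproduce:

    ```python
    import math

    # A gate is ("AND" | "OR", [children]); a basic event is a float
    # failure probability. Independence of basic events is assumed.
    def probability(node):
        if isinstance(node, float):
            return node
        gate, children = node
        probs = [probability(c) for c in children]
        if gate == "AND":              # all inputs must fail
            return math.prod(probs)
        if gate == "OR":               # any single input failing suffices
            return 1.0 - math.prod(1.0 - p for p in probs)
        raise ValueError(gate)

    # Hypothetical top event: trip fails if both channels fail, where each
    # channel fails on sensor OR comparator-logic failure.
    channel = ("OR", [1e-3, 5e-4])
    print(probability(("AND", [channel, channel])))   # about 2.25e-6
    ```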

  8. Path Planning Software and Graphics Interface for an Autonomous Vehicle, Accounting for Terrain Features

    National Research Council Canada - National Science Library

    Hurezeanu, Vlad

    2000-01-01

    .... This vehicle performs tasks to include surveying fields, laying mines, and teleoperation. The capability of the vehicle will be increased if its supporting software plans paths that take into account the terrain features...

  9. A graphical user interface for RAId, a knowledge integrated proteomics analysis suite with accurate statistics.

    Science.gov (United States)

    Joyce, Brendan; Lee, Danny; Rubio, Alex; Ogurtsov, Aleksey; Alves, Gelio; Yu, Yi-Kuo

    2018-03-15

    RAId is a software package that has been actively developed for the past 10 years for computationally and visually analyzing MS/MS data. Founded on rigorous statistical methods, RAId's core program computes accurate E-values for peptides and proteins identified during database searches. Making this robust tool readily accessible for the proteomics community by developing a graphical user interface (GUI) is our main goal here. We have constructed a graphical user interface to facilitate the use of RAId on users' local machines. Written in Java, RAId_GUI not only makes easy executions of RAId but also provides tools for data/spectra visualization, MS-product analysis, molecular isotopic distribution analysis, and graphing the retrieval versus the proportion of false discoveries. The results viewer displays and allows the users to download the analyses results. Both the knowledge-integrated organismal databases and the code package (containing source code, the graphical user interface, and a user manual) are available for download at https://www.ncbi.nlm.nih.gov/CBBresearch/Yu/downloads/raid.html .

  10. Dependability Analysis Methods For Configurable Software

    International Nuclear Information System (INIS)

    Dahll, Gustav; Pulkkinen, Urho

    1996-01-01

    Configurable software systems are systems which are built up by standard software components in the same way as a hardware system is built up by standard hardware components. Such systems are often used in the control of NPPs, also in safety related applications. A reliability analysis of such systems is therefore necessary. This report discusses what configurable software is, and what is particular with respect to reliability assessment of such software. Two very commonly used techniques in traditional reliability analysis, viz. failure mode, effect and criticality analysis (FMECA) and fault tree analysis are investigated. A real example is used to illustrate the discussed methods. Various aspects relevant to the assessment of the software reliability in such systems are discussed. Finally some models for quantitative software reliability assessment applicable on configurable software systems are described. (author)

  11. A real-time GNSS-R system based on software-defined radio and graphics processing units

    Science.gov (United States)

    Hobiger, Thomas; Amagai, Jun; Aida, Masanori; Narita, Hideki

    2012-04-01

    Reflected signals of the Global Navigation Satellite System (GNSS) from the sea or land surface can be utilized to deduce and monitor physical and geophysical parameters of the reflecting area. Unlike most other remote sensing techniques, GNSS-Reflectometry (GNSS-R) operates as a passive radar that takes advantage of the increasing number of navigation satellites broadcasting their L-band signals. To date, most GNSS-R receiver architectures have been based on dedicated hardware solutions. Software-defined radio (SDR) technology has advanced in recent years and enabled signal processing in real time, which makes it an ideal candidate for the realization of a flexible GNSS-R system. Additionally, modern commodity graphics cards, which offer massive parallel computing performance, allow the whole signal processing chain to be handled without interfering with the PC's CPU. Thus, this paper describes a GNSS-R system which has been developed on the principles of software-defined radio supported by General Purpose Graphics Processing Units (GPGPUs), and presents results from initial field tests which confirm the anticipated capability of the system.
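
    The inner loop of a GNSS-R processor is correlation of the received signal against a local code replica over delay (and Doppler) bins. A minimal FFT-based sketch in Python; Doppler compensation is omitted, and on a GPU the same structure would simply run its FFTs on the device:

    ```python
    import numpy as np

    def delay_correlate(received, replica):
        """Circular cross-correlation of a received block against a local
        code replica via FFTs: the core operation repeated for every
        delay cell in a GNSS-R waveform."""
        R = np.fft.fft(received)
        C = np.fft.fft(replica)
        return np.abs(np.fft.ifft(R * np.conj(C)))

    # Toy example: a +/-1 code, received with a 123-sample delay plus noise.
    rng = np.random.default_rng(1)
    code = rng.choice([-1.0, 1.0], size=4096)
    rx = np.roll(code, 123) + rng.normal(0, 1, 4096)
    print(int(np.argmax(delay_correlate(rx, code))))   # prints 123
    ```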

  12. Reliability analysis of software based safety functions

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1993-05-01

    The methods applicable to the reliability analysis of software based safety functions are described in the report. Although the safety functions also include other components, the main emphasis in the report is on the reliability analysis of software. Checklist-type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as well as software fault tree analysis. Safety analysis based on Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described. The most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that recent software reliability models do not directly produce the estimates needed in PSA. Some recommendations and conclusions are drawn from the study: the need for formal methods in the analysis and development of software based systems, the applicability of qualitative reliability engineering methods in connection with PSA, and the need to make the requirements for software based systems and their analyses in the regulatory guides more precise. (orig.). (46 refs., 13 figs., 1 tab.)

  13. Software architecture analysis of usability

    NARCIS (Netherlands)

    Folmer, Eelke

    2005-01-01

    One of the qualities that has received increased attention in recent decades is usability. A software product with poor usability is likely to fail in a highly competitive market; therefore software developing organizations are paying more and more attention to ensuring the usability of their

  14. Advanced software development workstation. Engineering scripting language graphical editor: DRAFT design document

    Science.gov (United States)

    1991-01-01

    The Engineering Scripting Language (ESL) is a language designed to allow non-programming users to write Higher Order Language (HOL) programs by drawing directed graphs that represent the program and having the system generate the corresponding program in HOL. The ESL system supports user generation of HOL programs through the manipulation of directed graphs. The components of these graphs (nodes, ports, and connectors) are objects, each of which has its own properties and property values. The purpose of the ESL graphical editor is to allow the user to create or edit the graph objects which represent programs.
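
    The underlying idea (a directed graph of operations, ordered and emitted as straight-line code) can be sketched compactly; the node set and the emitted pseudo-HOL below are invented for illustration:

    ```python
    from graphlib import TopologicalSorter

    # node name -> (operation template, names of input nodes)
    nodes = {
        "a":   ("read()", []),
        "b":   ("read()", []),
        "sum": ("{0} + {1}", ["a", "b"]),
        "out": ("print({0})", ["sum"]),
    }

    # Emit each node after all of its inputs, i.e. in topological order.
    deps = {name: set(inputs) for name, (_, inputs) in nodes.items()}
    for name in TopologicalSorter(deps).static_order():
        template, inputs = nodes[name]
        print(f"{name} = " + template.format(*inputs))
    ```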

  15. Development of output user interface software to support analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wahanani, Nursinta Adi, E-mail: sintaadi@batan.go.id; Natsir, Khairina, E-mail: sintaadi@batan.go.id; Hartini, Entin, E-mail: sintaadi@batan.go.id [Center for Development of Nuclear Informatics - National Nuclear Energy Agency, PUSPIPTEK, Serpong, Tangerang, Banten (Indonesia)

    2014-09-30

    Data processing software packages such as VSOP and MCNPX are scientifically proven and complete, but their results are huge and complex text files. During analysis, users need additional processing, e.g. with Microsoft Excel, to obtain informative results. This research develops user interface software for the output of VSOP and MCNPX: the VSOP output is used to support neutronic analysis and the MCNPX output to support burn-up analysis. Software development used iterative development methods, which allow revision and the addition of features according to user needs. Processing time with this software is 500 times faster than with the conventional method using Microsoft Excel. PYTHON is used as the programming language, because Python is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac and Amiga, among others. The values that support neutronic analysis are k-eff, burn-up and the masses of Pu-239 and Pu-241. Burn-up analysis uses the mass inventory values of the actinides (thorium, plutonium, neptunium and uranium). Values are visualized graphically to support analysis.
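
    A typical task for such an output post-processor is pulling one quantity, e.g. k-eff, out of a huge text file and plotting it. A sketch; the line format in the regex is hypothetical, since the abstract does not show the real VSOP/MCNPX layout:

    ```python
    import re

    # Hypothetical line format, e.g. "k-eff = 1.02345"; illustrative only.
    pattern = re.compile(r"k-eff\s*=\s*([\d.]+)", re.IGNORECASE)

    def extract_keff(path):
        """Pull every k-eff value out of a large plain-text output file."""
        with open(path) as fh:
            return [float(m.group(1)) for line in fh
                    if (m := pattern.search(line))]

    # Usage (assuming a file produced by VSOP):
    #   import matplotlib.pyplot as plt
    #   keffs = extract_keff("vsop_output.txt")
    #   plt.plot(keffs, marker="o"); plt.xlabel("burn-up step")
    #   plt.ylabel("k-eff"); plt.show()
    ```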

  16. Development of output user interface software to support analysis

    International Nuclear Information System (INIS)

    Wahanani, Nursinta Adi; Natsir, Khairina; Hartini, Entin

    2014-01-01

    Data processing software packages such as VSOP and MCNPX are scientifically proven and complete, but their results are huge and complex text files. During analysis, users need additional processing, e.g. with Microsoft Excel, to obtain informative results. This research develops user interface software for the output of VSOP and MCNPX: the VSOP output is used to support neutronic analysis and the MCNPX output to support burn-up analysis. Software development used iterative development methods, which allow revision and the addition of features according to user needs. Processing time with this software is 500 times faster than with the conventional method using Microsoft Excel. PYTHON is used as the programming language, because Python is available for all major operating systems: Windows, Linux/Unix, OS/2, Mac and Amiga, among others. The values that support neutronic analysis are k-eff, burn-up and the masses of Pu-239 and Pu-241. Burn-up analysis uses the mass inventory values of the actinides (thorium, plutonium, neptunium and uranium). Values are visualized graphically to support analysis

  17. Integrating PAW, a graphical analysis interface to Sybase

    International Nuclear Information System (INIS)

    Fry, A.; Chow, I.

    1993-04-01

    The program PAW (Physics Analysis Workstation) enjoys tremendous popularity within the high energy physics community. It is implemented on a large number of platforms and is available to the high energy physics community free of charge from the CERN computing division. PAW combines extensive graphical display capability (HPLOT/HIGZ) with histogramming (HBOOK4), file and data handling (ZEBRA), vector arithmetic manipulation (SIGMA), user defined functions (COMIS), powerful function minimization (MINUIT), and a command interpreter (KUIP). To facilitate the use of relational databases in physics analysis, we have added an SQL interface to PAW. This interface allows users to create PAW N-tuples from Sybase tables and vice versa. We discuss the implementation below

  18. TREEFINDER: a powerful graphical analysis environment for molecular phylogenetics

    Directory of Open Access Journals (Sweden)

    von Haeseler Arndt

    2004-06-01

    Full Text Available Abstract Background Most analysis programs for inferring molecular phylogenies are difficult to use, in particular for researchers with little programming experience. Results TREEFINDER is an easy-to-use, integrative, platform-independent analysis environment for molecular phylogenetics. In this paper the main features of TREEFINDER (version of April 2004) are described. TREEFINDER is written in ANSI C and Java and implements powerful statistical approaches for inferring gene trees and related analyses. In addition, it provides a user-friendly graphical interface and a phylogenetic programming language. Conclusions TREEFINDER is a versatile framework for analyzing phylogenetic data across different platforms that is suited for both exploratory and advanced studies.

  19. Analysis of open source GIS software

    OpenAIRE

    Božnis, Andrius

    2006-01-01

    GIS is one of the most promising spheres of information technology. GIS combines digital image analysis with database systems, which makes it widely applicable but also demanding of very high skills. There is a lot of commercial GIS software that is well advertised and whose functionality is quite well known, while open source software tends to be overlooked. This diploma work analyses the open source GIS software available on the Internet, in the scope of different projects interr...

  20. STARS software tool for analysis of reliability and safety

    International Nuclear Information System (INIS)

    Poucet, A.; Guagnini, E.

    1989-01-01

    This paper reports on the STARS (Software Tool for the Analysis of Reliability and Safety) project, which aims at developing an integrated set of computer-aided reliability analysis tools for the various tasks involved in systems safety and reliability analysis, including hazard identification, qualitative analysis, and logic model construction and evaluation. Expert system technology offers the most promising perspective for developing a computer-aided reliability analysis tool. Combined with graphics and analysis capabilities, it can provide a natural, engineering-oriented environment for computer-assisted reliability and safety modelling and analysis. For hazard identification and fault tree construction, a frame/rule based expert system is used, in which the deductive (goal driven) reasoning and the heuristics applied during manual fault tree construction are modelled. Expert systems can explain their reasoning, so that the analyst becomes aware of why and how results are being obtained. Hence, the learning aspect involved in manual reliability and safety analysis can be maintained and improved

  1. Contextual analysis of Biology and Chemistry academic graphical abstracts

    Directory of Open Access Journals (Sweden)

    Cristiane Salete Florek

    2016-10-01

    Full Text Available http://dx.doi.org/10.5007/1984-8412.2016v13n3p1363 The Graphical Abstract (GA) is a non-regular discursive practice of the academic context which, when it occurs, coexists with the academic abstract (AA) in the tables of contents of scientific journals and in the HTML versions of academic articles, materialized by the combination of verbal and visual semiotics. In this paper, in the light of Critical Genre Analysis (MEURER, 2002; BHATIA, 2004; MOTTA-ROTH, 2006, 2008), which allows a text to be studied on the basis of a critical investigation of its context, we present the results of a contextual analysis of GAs in the areas of Biology and Chemistry. This analysis was done by: (i) interviews with researchers in the investigated areas; and (ii) documentary analysis. The results show that, in general, the GA: (i) is highlighted by presenting an advertising nature, which seeks to attract the reader's attention; (ii) summarizes the topic and the main findings of the scientific research; and (iii) does not replace the academic abstract (AA).

  2. Development of a Monte Carlo software to photon transportation in voxel structures using graphic processing units

    International Nuclear Information System (INIS)

    Bellezzo, Murillo

    2014-01-01

    As the most accurate method to estimate absorbed dose in radiotherapy, the Monte Carlo Method (MCM) has been widely used in radiotherapy treatment planning. Nevertheless, its efficiency can be improved for clinical routine applications. In this thesis, the CUBMC code is presented, a GPU-based MC photon transport algorithm for dose calculation under the Compute Unified Device Architecture (CUDA) platform. The simulation of physical events is based on the algorithm used in PENELOPE, and the cross-section table used is the one generated by the MATERIAL routine, also present in the PENELOPE code. Photons are transported in voxel-based geometries with different compositions. Two distinct approaches are used for transport simulation. The first forces the photon to stop at every voxel frontier; the second is the Woodcock method, where the photon ignores the existence of borders and travels in a homogeneous fictitious medium. The CUBMC code aims to be an alternative Monte Carlo simulation code that, by using the parallel processing capability of graphics processing units (GPUs), provides high-performance simulations on low-cost compact machines, and can thus be applied in clinical cases and incorporated into treatment planning systems for radiotherapy. (author)
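
    The Woodcock (delta-tracking) approach mentioned above samples flight distances against a majorant cross-section and classifies collisions as real or virtual, so the photon never stops at voxel boundaries. A minimal sketch with made-up cross-sections:

    ```python
    import numpy as np

    def woodcock_step(pos, direction, voxels, sigma_max, rng):
        """One Woodcock flight: sample path lengths against the majorant
        cross-section sigma_max, and accept a collision as real with
        probability sigma(voxel)/sigma_max; otherwise it is virtual and
        the photon keeps flying through the fictitious medium."""
        while True:
            pos = pos + direction * (-np.log(rng.random()) / sigma_max)
            ix, iy, iz = np.floor(pos).astype(int)
            if not (0 <= ix < voxels.shape[0] and 0 <= iy < voxels.shape[1]
                    and 0 <= iz < voxels.shape[2]):
                return None                      # photon escaped the phantom
            if rng.random() < voxels[ix, iy, iz] / sigma_max:
                return pos                       # real collision site

    # 1-cm voxels with invented total cross-sections (1/cm).
    voxels = np.full((10, 10, 10), 0.05)
    voxels[4:7, 4:7, 4:7] = 0.30                 # a denser insert
    rng = np.random.default_rng(7)
    site = woodcock_step(np.array([5.0, 5.0, 0.0]),
                         np.array([0.0, 0.0, 1.0]), voxels, voxels.max(), rng)
    print(site)
    ```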

  3. Gamma-Ray Spectrum Analysis Software GDA

    International Nuclear Information System (INIS)

    Wanabongse, P.

    1998-01-01

    The developmental work on computer software for gamma-ray spectrum analysis has been completed as a software package, version 1.02, named GDA, an acronym for Gamma-spectrum Deconvolution and Analysis. The software package consists of three 3.5-inch diskettes for setup and a user's manual. GDA can be installed for use on a personal computer running the Windows 95 or Windows NT 4.0 operating system. The computer may be an 80486-CPU machine with 8 megabytes of memory

  4. An analysis software of tritium distribution in food and environmental water in China

    International Nuclear Information System (INIS)

    Li Wenhong; Xu Cuihua; Ren Tianshan; Deng Guilong

    2006-01-01

    Objective: The purpose of developing this analysis software for the distribution of tritium in food and environmental water is to collect tritium monitoring data and to analyze the data automatically, statistically and graphically, as well as to study and share the data. Methods: Based on previously obtained data, the analysis software was written using VC++.NET as the development tool. The software first transfers data from Excel into a database. It has a data-append function, so operators can easily incorporate new monitoring data. Results: After the monitoring data, saved as Excel files by the original researchers, are turned into a database, they can be accessed easily. The software provides a tool for analysing the distribution of tritium. Conclusion: This software is a first attempt at analysing data on tritium levels in food and environmental water in China. With the software, data retrieval, searching and analysis become easy and direct. (authors)

  5. Software architecture analysis of usability

    NARCIS (Netherlands)

    Folmer, E; van Gurp, J; Bosch, J; Bastide, R; Palanque, P; Roth, J

    2005-01-01

    Studies of software engineering projects show that a large number of usability related change requests are made after its deployment. Fixing usability problems during the later stages of development often proves to be costly, since many of the necessary changes require changes to the system that

  6. Graphical representation of ribosomal RNA probe accessibility data using ARB software package

    Directory of Open Access Journals (Sweden)

    Amann Rudolf

    2005-03-01

    Full Text Available Abstract Background Taxon specific hybridization probes in combination with a variety of commonly used hybridization formats are nowadays standard tools in microbial identification. A frequently applied technology, fluorescence in situ hybridization (FISH), besides single cell identification, allows the localization and functional studies of the microbial community composition. Careful in silico design and evaluation of potential oligonucleotide probe targets is therefore crucial for performing successful hybridization experiments. Results The PROBE Design tools of the ARB software package take into consideration several criteria such as number, position and quality of diagnostic sequence differences while designing oligonucleotide probes. Additionally, new visualization tools were developed to enable the user to easily examine further sequence associated criteria such as higher order structure, conservation, G+C content, transition-transversion profiles and in situ target accessibility patterns. The different types of sequence associated information (SAI can be visualized by user defined background colors within the ARB primary and secondary structure editors as well as in the PROBE Match tool. Conclusion Using this tool, in silico probe design and evaluation can be performed with respect to in situ probe accessibility data. The evaluation of proposed probe targets with respect to higher-order rRNA structure is of importance for successful design and performance of in situ hybridization experiments. The entire ARB software package along with the probe accessibility data is available from the ARB home page http://www.arb-home.de.

  7. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of the Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty, are incorporated into the IMMAN software (http://mobiosd-hub.com/imman-soft/), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing-value processing, dataset partitioning, and browsing. Moreover, single-parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction and feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between the IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
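
    The equal-interval discretization plus information-gain ranking mentioned above can be sketched in a few lines; the data below are a toy example, and IMMAN itself (written in Java) offers many more criteria:

    ```python
    import numpy as np

    def entropy(labels):
        """Shannon entropy H(Y) in bits."""
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def information_gain(feature, labels, bins=5):
        """IG(Y; X) = H(Y) - H(Y|X), with X discretized into equal-width
        intervals, mirroring an equal-interval discretization approach."""
        digitized = np.digitize(feature, np.histogram_bin_edges(feature, bins))
        h_cond = sum((digitized == b).mean() * entropy(labels[digitized == b])
                     for b in np.unique(digitized))
        return entropy(labels) - h_cond

    # Toy data: feature x0 separates the classes, x1 is pure noise.
    rng = np.random.default_rng(3)
    y = np.repeat([0, 1], 100)
    x0 = y + rng.normal(0, 0.3, 200)
    x1 = rng.normal(0, 1, 200)
    print(information_gain(x0, y), ">", information_gain(x1, y))
    ```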

  8. Development of a fatigue analysis software system

    International Nuclear Information System (INIS)

    Choi, B. I.; Lee, H. J.; Han, S. W.; Kim, J. Y.; Hwang, K. H.; Kang, J. Y.

    2001-01-01

    A general purpose fatigue analysis software package to predict the fatigue lives of mechanical components and structures was developed. This software has some characteristic features, including functions for searching weak regions on the free surface in order to reduce computing time significantly, a database of fatigue properties for various materials, and an expert system which can assist any user in obtaining more appropriate results. The software can be used in an environment consisting of commercial finite element packages. Using the software, fatigue analyses for an SAE keyhole specimen and an automobile knuckle were carried out; the results were observed to agree well with those from commercial packages

  9. Numerical methods in software and analysis

    CERN Document Server

    Rice, John R

    1992-01-01

    Numerical Methods, Software, and Analysis, Second Edition introduces science and engineering students to the methods, tools, and ideas of numerical computation. Introductory courses in numerical methods face a fundamental problem-there is too little time to learn too much. This text solves that problem by using high-quality mathematical software. In fact, the objective of the text is to present scientific problem solving using standard mathematical software. This book discusses numerous programs and software packages focusing on the IMSL library (including the PROTRAN system) and ACM Algorithm

  10. Software Process Improvement Using Force Field Analysis ...

    African Journals Online (AJOL)

    An improvement plan is then drawn and implemented. This paper studied the state of Nigerian software development organizations based on selected attributes. Force field analysis is used to partition the factors obtained into driving and restraining forces. An attempt was made to improve the software development process ...

  11. Using Interactive Graphics to Teach Multivariate Data Analysis to Psychology Students

    Science.gov (United States)

    Valero-Mora, Pedro M.; Ledesma, Ruben D.

    2011-01-01

    This paper discusses the use of interactive graphics to teach multivariate data analysis to Psychology students. Three techniques are explored through separate activities: parallel coordinates/boxplots; principal components/exploratory factor analysis; and cluster analysis. With interactive graphics, students may perform important parts of the…

  12. Development of a graphical user interface for the TRAC plant/safety analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, A.E.; Harkins, C.K.; Smith, R.J.

    1995-09-01

    A graphical user interface (GUI) for the Transient Reactor Analysis Code (TRAC) has been developed at Knolls Atomic Power Laboratory. This X Window based GUI supports the design and analysis process, acting as a preprocessor, runtime editor, help system and post processor to TRAC-PF1/MOD2. TRAC was developed at the Los Alamos National Laboratory (LANL). The preprocessor is an icon-based interface which allows the user to create a TRAC model. When the model is complete, the runtime editor provides the capability to execute and monitor TRAC runs on the workstation or supercomputer. After runs are made, the output processor allows the user to extract and format data from the TRAC graphics file. The TRAC GUI is currently compatible with TRAC-PF1/MOD2 V5.3 and is available with documentation from George Niederauer, Section Leader of the Software Development Section, Group TSA-8, at LANL. Users may become functional in creating, running, and interpreting results from TRAC without having to know Unix commands and the detailed format of any of the data files. This reduces model development and debug time and increases quality control. Integration with post-processing and visualization tools increases engineering effectiveness.

  13. Development of a graphical user interface for the TRAC plant/safety analysis code

    International Nuclear Information System (INIS)

    Kelly, A.E.; Harkins, C.K.; Smith, R.J.

    1995-01-01

    A graphical user interface (GUI) for the Transient Reactor Analysis Code (TRAC) has been developed at Knolls Atomic Power Laboratory. This X Window based GUI supports the design and analysis process, acting as a preprocessor, runtime editor, help system and post processor to TRAC-PF1/MOD2. TRAC was developed at the Los Alamos National Laboratory (LANL). The preprocessor is an icon-based interface which allows the user to create a TRAC model. When the model is complete, the runtime editor provides the capability to execute and monitor TRAC runs on the workstation or supercomputer. After runs are made, the output processor allows the user to extract and format data from the TRAC graphics file. The TRAC GUI is currently compatible with TRAC-PF1/MOD2 V5.3 and is available with documentation from George Niederauer, Section Leader of the Software Development Section, Group TSA-8, at LANL. Users may become functional in creating, running, and interpreting results from TRAC without having to know Unix commands and the detailed format of any of the data files. This reduces model development and debug time and increases quality control. Integration with post-processing and visualization tools increases engineering effectiveness

  14. Use of interactive graphics in bridge analysis and design.

    Science.gov (United States)

    1983-01-01

    This study evaluated the role of computer-aided design (CAD), including interactive graphics, in engineering design applications, especially in the design activities of the Virginia Department of Highways and Transportation. A review of the hardware ...

  15. Graphical Data Analysis on the Circle: Wrap-Around Time Series Plots for (Interrupted) Time Series Designs.

    Science.gov (United States)

    Rodgers, Joseph Lee; Beasley, William Howard; Schuelke, Matthew

    2014-01-01

    Many data structures, particularly time series data, are naturally seasonal, cyclical, or otherwise circular. Past graphical methods for time series have focused on linear plots. In this article, we move graphical analysis onto the circle. We focus on 2 particular methods, one old and one new. Rose diagrams are circular histograms and can be produced in several different forms using the RRose software system. In addition, we propose, develop, illustrate, and provide software support for a new circular graphical method, called Wrap-Around Time Series Plots (WATS Plots), which is a graphical method useful to support time series analyses in general but in particular in relation to interrupted time series designs. We illustrate the use of WATS Plots with an interrupted time series design evaluating the effect of the Oklahoma City bombing on birthrates in Oklahoma County during the 10 years surrounding the bombing of the Murrah Building in Oklahoma City. We compare WATS Plots with linear time series representations and overlay them with smoothing and error bands. Each method is shown to have advantages in relation to the other; in our example, the WATS Plots more clearly show the existence and effect size of the fertility differential.
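
    A minimal Python sketch of the wrap-around idea, assuming matplotlib is available; the monthly counts are hypothetical stand-ins, not data from the study:

        import numpy as np
        import matplotlib.pyplot as plt

        # Hypothetical monthly counts; a real WATS Plot wraps a multi-year
        # series around the circle so seasonal structure lines up by angle.
        months = np.arange(12)
        counts = np.array([310, 295, 330, 350, 360, 340, 355, 345, 320, 335, 325, 315])

        theta = 2 * np.pi * months / 12          # one full turn = one year
        width = 2 * np.pi / 12                   # each bar spans one month

        ax = plt.subplot(projection="polar")
        ax.bar(theta, counts, width=width, bottom=0.0, align="edge", alpha=0.6)
        ax.set_xticks(theta)
        ax.set_xticklabels(["J", "F", "M", "A", "M", "J", "J", "A", "S", "O", "N", "D"])
        ax.set_title("Rose-diagram sketch of a seasonal series (hypothetical data)")
        plt.show()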

  16. Software safety analysis practice in installation phase

    Energy Technology Data Exchange (ETDEWEB)

    Huang, H. W.; Chen, M. H.; Shyu, S. S., E-mail: hwhwang@iner.gov.t [Institute of Nuclear Energy Research, No. 1000 Wenhua Road, Chiaan Village, Longtan Township, 32546 Taoyuan County, Taiwan (China)

    2010-10-15

    This work performed a software safety analysis in the installation phase of the Lungmen nuclear power plant in Taiwan, under the cooperation of the Institute of Nuclear Energy Research and TPC. The US Nuclear Regulatory Commission requests licensees to perform software safety analysis and software verification and validation in each phase of the software development life cycle, per Branch Technical Position 7-14. In this work, 37 safety-grade digital instrumentation and control systems were analyzed by failure mode and effects analysis, as suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety-grade network and point-to-point tests were performed. The failure mode and effects analysis showed that all single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (Author)

  17. Software safety analysis practice in installation phase

    International Nuclear Information System (INIS)

    Huang, H. W.; Chen, M. H.; Shyu, S. S.

    2010-10-01

    This work performed a software safety analysis in the installation phase of the Lungmen nuclear power plant in Taiwan, under the cooperation of the Institute of Nuclear Energy Research and TPC. The US Nuclear Regulatory Commission requests licensees to perform software safety analysis and software verification and validation in each phase of the software development life cycle, per Branch Technical Position 7-14. In this work, 37 safety-grade digital instrumentation and control systems were analyzed by failure mode and effects analysis, as suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety-grade network and point-to-point tests were performed. The failure mode and effects analysis showed that all single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (Author)

  18. Computer-assisted qualitative data analysis software.

    Science.gov (United States)

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually, and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  19. User's guide to the CALVEC software library: a computer program for emulation of CALCOMP graphics on a Versatec printer/plotter

    International Nuclear Information System (INIS)

    Gray, W.H.

    1978-08-01

    This document describes a set of FORTRAN subroutines collectively called the CALVEC subprogram library. The purpose of the CALVEC software library is the emulation of CALCOMP pen and ink graphics on a DECsystem 10. A user level interface with CALVEC software allows standard CALCOMP subprogram calls to produce a VECtor file, SEGMNT.VEC. This vector file may subsequently be postprocessed into an image in a variety of ways

  20. User's guide to the CALVEC software library: a computer program for emulation of CALCOMP graphics on a versatec printer/plotter

    International Nuclear Information System (INIS)

    Gray, W.H.

    1979-03-01

    This document describes a set of FORTRAN subroutines collectively called the CALVEC subprogram library. The purpose of the CALVEC software library is the emulation of CALCOMP pen and ink graphics on a DECsystem 10. A user level interface with CALVEC software allows standard CALCOMP subprogram calls to produce a VECtor file, FOR24.VEC. This vector file may subsequently be postprocessed into an image in a variety of ways

  1. Integrated Graphics Operations and Analysis Lab Development of Advanced Computer Graphics Algorithms

    Science.gov (United States)

    Wheaton, Ira M.

    2011-01-01

    The focus of this project is to aid the IGOAL in researching and implementing algorithms for advanced computer graphics. First, this project focused on porting the current International Space Station (ISS) Xbox experience to the web. Previously, the ISS interior fly-around education and outreach experience only ran on an Xbox 360. One of the desires was to take this experience and make it into something that can be put on NASA's educational site for anyone to be able to access. The current code works in the Unity game engine, which does have cross-platform capability but is not 100% compatible. The tasks for an intern to complete this portion consisted of gaining familiarity with Unity and the current ISS Xbox code, porting the Xbox code to the web as is, and modifying the code to work well as a web application. In addition, a procedurally generated cloud algorithm will be developed. Currently, the clouds used in AGEA animations and the Xbox experiences are a texture map. The desire is to create a procedurally generated cloud algorithm to provide dynamically generated clouds for both AGEA animations and the Xbox experiences. This task consists of gaining familiarity with AGEA and the plug-in interface, developing the algorithm, creating an AGEA plug-in to implement the algorithm inside AGEA, and creating a Unity script to implement the algorithm for the Xbox. This portion of the project was unable to be completed in the time frame of the internship; however, the IGOAL will continue to work on it in the future.
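
    A minimal sketch of one common procedural-cloud technique, fractal value noise built from octaves of upsampled random grids; this is a generic approach, not the IGOAL's actual algorithm:

        import numpy as np
        from scipy.ndimage import zoom

        def cloud_texture(size=256, octaves=5, persistence=0.5, seed=0):
            """Sum octaves of smoothly upsampled random noise into a cloud-like field."""
            rng = np.random.default_rng(seed)
            tex = np.zeros((size, size))
            amplitude, total = 1.0, 0.0
            for o in range(octaves):
                n = 2 ** (o + 2)                       # coarse grid resolution per octave
                grid = rng.random((n, n))
                layer = zoom(grid, size / n, order=3)  # smooth bicubic upsampling
                tex += amplitude * layer[:size, :size]
                total += amplitude
                amplitude *= persistence               # finer octaves contribute less
            return tex / total                         # normalized to roughly [0, 1]

        clouds = cloud_texture()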

  2. Potku – New analysis software for heavy ion elastic recoil detection analysis

    International Nuclear Information System (INIS)

    Arstila, K.; Julin, J.; Laitinen, M.I.; Aalto, J.; Konu, T.; Kärkkäinen, S.; Rahkonen, S.; Raunio, M.; Itkonen, J.; Santanen, J.-P.; Tuovinen, T.; Sajavaara, T.

    2014-01-01

    Time-of-flight elastic recoil detection (ToF-ERD) analysis software has been developed. The software combines a Python-language graphical front-end with a C code computing back-end in a user-friendly way. The software uses a list of coincident time-of-flight–energy (ToF–E) events as an input. The ToF calibration can be determined with a simple graphical procedure. The graphical interface allows the user to select different elements and isotopes from a ToF–E histogram and to convert the selections to individual elemental energy and depth profiles. The resulting sample composition can be presented as relative or absolute concentrations by integrating the depth profiles over user-defined ranges. Beam induced composition changes can be studied by displaying the event-based data in fractions relative to the substrate reference data. Optional angular input data allows for kinematic correction of the depth profiles. This open source software is distributed under the GPL license for Linux, Mac, and Windows environments

  3. Potku – New analysis software for heavy ion elastic recoil detection analysis

    Energy Technology Data Exchange (ETDEWEB)

    Arstila, K., E-mail: kai.arstila@jyu.fi [Department of Physics, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland); Julin, J.; Laitinen, M.I. [Department of Physics, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland); Aalto, J.; Konu, T.; Kärkkäinen, S.; Rahkonen, S.; Raunio, M.; Itkonen, J.; Santanen, J.-P.; Tuovinen, T. [Department of Mathematical Information Technology, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland); Sajavaara, T. [Department of Physics, University of Jyväskylä, P.O. Box 35, FI-40014, Jyväskylä (Finland)

    2014-07-15

    Time-of-flight elastic recoil detection (ToF-ERD) analysis software has been developed. The software combines a Python-language graphical front-end with a C code computing back-end in a user-friendly way. The software uses a list of coincident time-of-flight–energy (ToF–E) events as an input. The ToF calibration can be determined with a simple graphical procedure. The graphical interface allows the user to select different elements and isotopes from a ToF–E histogram and to convert the selections to individual elemental energy and depth profiles. The resulting sample composition can be presented as relative or absolute concentrations by integrating the depth profiles over user-defined ranges. Beam induced composition changes can be studied by displaying the event-based data in fractions relative to the substrate reference data. Optional angular input data allows for kinematic correction of the depth profiles. This open source software is distributed under the GPL license for Linux, Mac, and Windows environments.
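
    A hedged sketch of the event-selection step, not Potku's actual code: a rectangular cut on a synthetic list of coincident ToF–E events stands in for the graphical selection, and the selected events are histogrammed into an elemental energy spectrum:

        import numpy as np

        # Hypothetical coincident (tof, energy) event list; Potku reads these from file.
        rng = np.random.default_rng(1)
        tof = rng.normal(180.0, 8.0, 10000)      # ns
        energy = rng.normal(3.2, 0.25, 10000)    # MeV

        # Potku lets the user draw a selection on the ToF-E histogram; a simple
        # rectangular cut serves here as a stand-in for that graphical selection.
        cut = (tof > 165) & (tof < 195) & (energy > 2.7) & (energy < 3.7)

        # Events inside the cut become one element's energy spectrum, which a real
        # analysis would then convert to a depth profile using stopping-power data.
        spectrum, edges = np.histogram(energy[cut], bins=50)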

  4. Software safety analysis application in installation phase

    International Nuclear Information System (INIS)

    Huang, H. W.; Yih, S.; Wang, L. H.; Liao, B. C.; Lin, J. M.; Kao, T. M.

    2010-01-01

    This work performed a software safety analysis (SSA) in the installation phase of the Lungmen nuclear power plant (LMNPP) in Taiwan, under the cooperation of INER and TPC. The US Nuclear Regulatory Commission (USNRC) requests licensees to perform software safety analysis (SSA) and software verification and validation (SV&V) in each phase of the software development life cycle, per Branch Technical Position (BTP) 7-14. In this work, 37 safety-grade digital instrumentation and control (I&C) systems were analyzed by Failure Mode and Effects Analysis (FMEA), as suggested by IEEE Standard 7-4.3.2-2003. During the installation phase, skew tests for the safety-grade network and point-to-point tests were performed. The FMEA showed that all single failure modes can be resolved by redundant means. Most of the common mode failures can be resolved by operator manual actions. (authors)

  5. Graphical User Interface for Simulink Integrated Performance Analysis Model

    Science.gov (United States)

    Durham, R. Caitlyn

    2009-01-01

    The J-2X Engine (built by Pratt & Whitney Rocketdyne), in the Upper Stage of the Ares I Crew Launch Vehicle, will only start within a certain range of temperature and pressure for the Liquid Hydrogen and Liquid Oxygen propellants. The purpose of the Simulink Integrated Performance Analysis Model is to verify that in all reasonable conditions the temperature and pressure of the propellants are within the required J-2X engine start boxes. In order to run the simulation, test variables must be entered at all reasonable values of parameters such as heat leak and mass flow rate. To make this testing process as efficient as possible, in order to save the maximum amount of time and money and to show that the J-2X engine will start when it is required to do so, a graphical user interface (GUI) was created to allow the input of values to be used as parameters in the Simulink model, without opening or altering the contents of the model. The GUI must allow test data to come from Microsoft Excel files, allow those values to be edited before testing, place those values into the Simulink model, and get the output from the Simulink model. The GUI was built using MATLAB, and will run the Simulink simulation when the Simulate option is activated. After running the simulation, the GUI will construct a new Microsoft Excel file, as well as a MATLAB matrix file, using the output values for each test of the simulation so that they may be graphed and compared with other values.

  6. Interactive graphics for data analysis principles and examples

    CERN Document Server

    Theus, Martin

    2008-01-01

    Introduction. PRINCIPLES: Interactivity; Queries; Selection and Linked Highlighting; Linking Analyses; Interacting with Graphics. Examining a Single Variable: Categorical Data; Continuous Data; Transforming Data; Weighted Plots. Interactions between Two Variables: Two Categorical Variables; One Categorical Variable and One Continuous Variable; Two Continuous Variables. Multidimensional Plots: Mosaic Plots; Parallel Coordinate Plots; Trellis Displays. Plot Ensembles and Statistical Models: Response Models; ANOVA; Loglinear Models; Geographical Data. More Interactivity: Sorting and Ordering; Zooming; Multiple Views. Interactive Graphics ...

  7. Reflector antenna analysis using physical optics on Graphics Processing Units

    DEFF Research Database (Denmark)

    Borries, Oscar Peter; Sørensen, Hans Henrik Brandenborg; Dammann, Bernd

    2014-01-01

    The Physical Optics approximation is a widely used asymptotic method for calculating the scattering from electrically large bodies. It requires significant computational work and little memory, and is thus well suited for application on a Graphics Processing Unit. Here, we investigate the perform...

  8. A novel R-package graphic user interface for the analysis of metabonomic profiles

    Directory of Open Access Journals (Sweden)

    Villa Palmira

    2009-10-01

    Abstract. Background: Analysis of the plethora of metabolites found in the NMR spectra of biological fluids or tissues requires data complexity to be simplified. We present a graphical user interface (GUI) for NMR-based metabonomic analysis. The "Metabonomic Package" has been developed for metabonomics research as open-source software and uses the R statistical libraries. Results: The package offers the following options: raw 1-dimensional spectra processing (phase, baseline correction and normalization); importing processed spectra; including/excluding spectral ranges, optional binning and bucketing, detection and alignment of peaks; sorting of metabolites based on their ability to discriminate, metabolite selection, and outlier identification; multivariate unsupervised analysis with principal components analysis (PCA); multivariate supervised analysis with partial least squares (PLS), linear discriminant analysis (LDA) and k-nearest neighbor classification; neural networks; visualization and overlapping of spectra; and plotting values of the chemical shift position for different samples. Furthermore, the "Metabonomic" GUI includes a console to enable other kinds of analyses and to take advantage of all R statistical tools. Conclusion: We made complex multivariate analysis user-friendly for both experienced and novice users, which could help to expand the use of NMR-based metabonomics.
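
    A minimal Python/scikit-learn sketch of the bucketing, normalization and unsupervised PCA steps described above (synthetic spectra; the package itself is R-based, so this only illustrates the idea):

        import numpy as np
        from sklearn.decomposition import PCA

        # Hypothetical data: 20 NMR spectra, 1000 chemical-shift points each.
        rng = np.random.default_rng(0)
        spectra = rng.random((20, 1000))

        # Bucketing: average consecutive points into 40 bins of 25 points each.
        bucketed = spectra.reshape(20, 40, 25).mean(axis=2)

        # Normalize each spectrum to unit total intensity, then unsupervised PCA.
        bucketed /= bucketed.sum(axis=1, keepdims=True)
        scores = PCA(n_components=2).fit_transform(bucketed)  # sample scores to plot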

  9. Software criticality analysis of COTS/SOUP

    International Nuclear Information System (INIS)

    Bishop, Peter; Bloomfield, Robin; Clement, Tim; Guerra, Sofia

    2003-01-01

    This paper describes the Software Criticality Analysis (SCA) approach that was developed to support the justification of using commercial off-the-shelf software (COTS) in a safety-related system. The primary objective of SCA is to assess the importance to safety of the software components within the COTS and to show there is segregation between software components with different safety importance. The approach taken was a combination of Hazops based on design documents and on a detailed analysis of the actual code (100 kloc). Considerable effort was spent on validation and ensuring the conservative nature of the results. The results from reverse engineering from the code showed that results based only on architecture and design documents would have been misleading

  10. Software criticality analysis of COTS/SOUP

    Energy Technology Data Exchange (ETDEWEB)

    Bishop, Peter; Bloomfield, Robin; Clement, Tim; Guerra, Sofia

    2003-09-01

    This paper describes the Software Criticality Analysis (SCA) approach that was developed to support the justification of using commercial off-the-shelf software (COTS) in a safety-related system. The primary objective of SCA is to assess the importance to safety of the software components within the COTS and to show there is segregation between software components with different safety importance. The approach taken was a combination of Hazops based on design documents and on a detailed analysis of the actual code (100 kloc). Considerable effort was spent on validation and ensuring the conservative nature of the results. The results from reverse engineering from the code showed that results based only on architecture and design documents would have been misleading.

  11. SIMA: Python software for analysis of dynamic fluorescence imaging data

    Directory of Open Access Journals (Sweden)

    Patrick eKaifosh

    2014-09-01

    Fluorescence imaging is a powerful method for monitoring dynamic signals in the nervous system. However, analysis of dynamic fluorescence imaging data remains burdensome, in part due to the shortage of available software tools. To address this need, we have developed SIMA, an open source Python package that facilitates common analysis tasks related to fluorescence imaging. Functionality of this package includes correction of motion artifacts occurring during in vivo imaging with laser-scanning microscopy, segmentation of imaged fields into regions of interest (ROIs), and extraction of signals from the segmented ROIs. We have also developed a graphical user interface (GUI) for manual editing of the automatically segmented ROIs and automated registration of ROIs across multiple imaging datasets. This software has been designed with flexibility in mind to allow for future extension with different analysis methods and potential integration with other packages. Software, documentation, and source code for the SIMA package and ROI Buddy GUI are freely available at http://www.losonczylab.org/sima/.
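
    A generic NumPy sketch of the signal-extraction step, mean fluorescence per frame within each ROI mask; this illustrates the task, not SIMA's actual API:

        import numpy as np

        def extract_roi_signals(movie, masks):
            """Mean fluorescence per frame for each boolean ROI mask.

            movie: (frames, height, width) array; masks: list of (height, width) bools.
            """
            return np.array([movie[:, m].mean(axis=1) for m in masks])

        # Hypothetical 100-frame movie and two square ROIs.
        movie = np.random.default_rng(0).random((100, 64, 64))
        m1 = np.zeros((64, 64), bool); m1[10:20, 10:20] = True
        m2 = np.zeros((64, 64), bool); m2[40:50, 30:40] = True
        signals = extract_roi_signals(movie, [m1, m2])   # shape (2, 100)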

  12. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage the large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analysis of core photos and images; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes an emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software that features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  13. Perception and multimeaning analysis of graphic symbols for Thai picture-based communication system.

    Science.gov (United States)

    Chompoobutr, Sarinya; Potibal, Puttachart; Boriboon, Monthika; Phantachat, Wantanee

    2013-03-01

    Graphic symbols are a vital part of most augmentative and alternative communication systems. The communication fluency of a graphic symbol user depends on how well the relationships between symbols and their referents are learnt. The first aim of this study is to survey the perception of selected graphic symbols across seven age groups of participants with different educational backgrounds. Sixty-five individuals who identified themselves as Thai, ranging in age from 10 to 50 years, participated in the investigation, which used 64 graphic symbols. The second aim of this study is to demonstrate the analysis of multimeaning graphic symbols to be used in the Thai picture-based communication system. Twenty graphic symbols with 9-14 meanings each are analyzed in both syntactic and semantic aspects. The meanings are divided into five categories: noun, verb/adjective, size, color and shape. With respect to the first aim, the results suggest that the participants under investigation, across sexes, age groups and educational levels, perceive the features or inherent characteristics of such graphic symbols similarly. The results of the multimeaning analysis indicate that the foundation of Minspeak, the polysemy and redundancy of words, illustrates the inherent meanings of real-life objects; they also convey that the Thai graphic symbols are influenced by numerous factors in the Thai context, such as ability, motivation, experience, worldview and culture.

  14. Acoustic Emission Analysis Applet (AEAA) Software

    Science.gov (United States)

    Nichols, Charles T.; Roth, Don J.

    2013-01-01

    NASA Glenn Research Center and NASA White Sands Test Facility have developed software supporting an automated pressure vessel structural health monitoring (SHM) system based on acoustic emissions (AE). The software, referred to as the Acoustic Emission Analysis Applet (AEAA), provides analysts with a tool that can interrogate data collected on Digital Wave Corp. and Physical Acoustics Corp. software using a wide spectrum of powerful filters and charts. This software can be made to work with any data once the data format is known. The applet will compute basic AE statistics, and statistics as a function of time and pressure (see figure). AEAA provides value added beyond the analysis provided by the respective vendors' analysis software. The software can handle data sets of unlimited size. A wide variety of government and commercial applications could benefit from this technology, notably requalification and usage tests for compressed gas and hydrogen-fueled vehicles. Future enhancements will add features similar to a "check engine" light on a vehicle. Once installed, the system will ultimately be used to alert International Space Station crewmembers to critical structural instabilities, but will have little impact to missions otherwise. Diagnostic information could then be transmitted to experienced technicians on the ground in a timely manner to determine whether pressure vessels have been impacted, are structurally unsound, or can be safely used to complete the mission.

  15. User-driven integrated software lives: "Paleomag" paleomagnetics analysis on the Macintosh

    Science.gov (United States)

    Jones, Craig H.

    2002-12-01

    "PaleoMag," a paleomagnetics analysis package originally developed for the Macintosh operating system in 1988, allows examination of demagnetization of individual samples and analysis of directional data from collections of samples. Prior to recent reinvigorated development of the software for both Macintosh and Windows, it was widely used despite not running properly on machines and operating systems sold after 1995. This somewhat surprising situation demonstrates that there is a continued need for integrated analysis software within the earth sciences, in addition to well-developed scripting and batch-mode software. One distinct advantage of software like PaleoMag is in the ability to combine quality control with analysis within a unique graphical environment. Because such demands are frequent within the earth sciences, means of nurturing the development of similar software should be found.

  16. User's manual for the Graphical Constituent Loading Analysis System (GCLAS)

    Science.gov (United States)

    Koltun, G.F.; Eberle, Michael; Gray, J.R.; Glysson, G.D.

    2006-01-01

    This manual describes the Graphical Constituent Loading Analysis System (GCLAS), an interactive cross-platform program for computing the mass (load) and average concentration of a constituent that is transported in stream water over a period of time. GCLAS computes loads as a function of an equal-interval streamflow time series and an equal- or unequal-interval time series of constituent concentrations. The constituent-concentration time series may be composed of measured concentrations or a combination of measured and estimated concentrations. GCLAS is not intended for use in situations where concentration data (or an appropriate surrogate) are collected infrequently or where an appreciable number of the concentration values are censored. It is assumed that the constituent-concentration time series used by GCLAS adequately represents the true time-varying concentration. Commonly, measured constituent concentrations are collected at a frequency that is less than ideal (from a load-computation standpoint), so estimated concentrations must be inserted in the time series to better approximate the expected chemograph. GCLAS provides tools to facilitate estimation and entry of instantaneous concentrations for that purpose. Water-quality samples collected for load computation frequently are collected in a single vertical or at a single point in a stream cross section. Several factors, some of which may vary as a function of time and (or) streamflow, can affect whether the sample concentrations are representative of the mean concentration in the cross section. GCLAS provides tools to aid the analyst in assessing whether concentrations in samples collected in a single vertical or at a single point in a stream cross section exhibit systematic bias with respect to the mean concentrations. In cases where bias is evident, the analyst can construct coefficient relations in GCLAS to reduce or eliminate the observed bias. GCLAS can export load and concentration data in formats
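
    A minimal sketch of the basic load computation for an equal-interval record, assuming the common USGS unit conversion (load in tons/day = streamflow in ft3/s x concentration in mg/L x 0.0027); the series are hypothetical, and GCLAS itself adds estimation and bias-adjustment tools on top of this:

        import numpy as np

        # Hypothetical equal-interval series: streamflow in ft^3/s, concentration in mg/L.
        q = np.array([120.0, 150.0, 300.0, 260.0, 180.0])   # streamflow
        c = np.array([10.0, 12.0, 25.0, 20.0, 14.0])        # concentration
        dt_days = 1.0                                       # interval length

        # 0.0027 converts (ft^3/s)*(mg/L)*day to tons; unit-conversion constants
        # like this depend on the unit system chosen for the record.
        K = 0.0027
        load_tons = K * np.sum(q * c) * dt_days
        mean_conc = np.sum(q * c) / np.sum(q)               # flow-weighted average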

  17. Introducing a New Software for Geodetic Analysis

    Science.gov (United States)

    Hjelle, Geir Arne; Dähnn, Michael; Fausk, Ingrid; Kirkvik, Ann-Silje; Mysen, Eirik

    2017-04-01

    At the Norwegian Mapping Authority, we are currently developing Where, a new software for geodetic analysis. Where is built on our experiences with the Geosat software, and will be able to analyse and combine data from VLBI, SLR, GNSS and DORIS. The software is mainly written in Python which has proved very fruitful. The code is quick to write and the architecture is easily extendable and maintainable, while at the same time taking advantage of well-tested code like the SOFA and IERS libraries. This presentation will show some of the current capabilities of Where, including benchmarks against other software packages, and outline our plans for further progress. In addition we will report on some investigations we have done experimenting with alternative weighting strategies for VLBI.

  18. Software for Graph Analysis and Visualization

    Directory of Open Access Journals (Sweden)

    M. I. Kolomeychenko

    2014-01-01

    This paper describes software for graph storage, analysis and visualization. The article presents a comparative analysis of existing software for the analysis and visualization of graphs, describes the overall architecture of the application, and explains the basic principles of construction and operation of the main modules. Furthermore, a description of the developed graph storage, oriented to the storage and processing of large-scale graphs, is presented. The developed algorithm for finding communities and the implemented automatic graph-layout algorithms are the main functionality of the product. The main advantage of the developed software is high-speed processing of large networks (up to millions of nodes and links). Moreover, the proposed graph storage architecture is unique and has no analogues. The developed approaches and algorithms are optimized for operating with big graphs and have high productivity.
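
    A hedged sketch of the community-finding task using networkx on a small stand-in graph (the paper's own storage and algorithms are proprietary and tuned for much larger networks):

        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        # Small stand-in graph; the described software targets millions of nodes.
        G = nx.karate_club_graph()

        # Modularity-based detection, one common way of "finding communities".
        communities = greedy_modularity_communities(G)
        for i, c in enumerate(communities):
            print(f"community {i}: {sorted(c)}")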

  19. Structure Analysis of the Graphic Simulator for the PRIDE Equipment

    International Nuclear Information System (INIS)

    Kim, Chang Hoi; Kim, Seong Hyun; Park, Byung Suk; Lee, Jong Kwang; Lee, Hyo Jik; Kim, Ki Ho

    2010-12-01

    Simulation technology based on computer graphics can minimize trial and error and dramatically reduce development cost and time at the design stage of pyroprocessing facility construction and equipment development. For this purpose, the 3D graphic simulation program named HotCell has been developed. HotCell has been continuously updated with functional additions and bug fixes, and has now reached its third version. The digital mockup of PRIDE is furnished with the MSM (master-slave manipulator), the BDSM (bridge-transported dual-arm servo manipulator) and a crane for remote handling of the processing equipment. The HotCell program can interface with a 3D mouse, a haptic device and a joystick for realistic operation of these devices. The posture of the MSM can be recorded with simple keyboard operations in order to reproduce the behavior of the MSM

  20. Network, system, and status software enhancements for the autonomously managed electrical power system breadboard. Volume 4: Graphical status display

    Science.gov (United States)

    Mckee, James W.

    1990-01-01

    This volume (4 of 4) contains the description, structured flow charts, prints of the graphical displays, and source code to generate the displays for the AMPS graphical status system. The function of these displays is to present to the manager of the AMPS system a graphical status display with hot boxes that allow the manager to get more detailed status on selected portions of the AMPS system. The development of the graphical displays is divided into two processes: the creation of the screen images and their storage in files on the computer, and the running of the status program, which uses the screen images.

  1. Development of a User Interface for a Regression Analysis Software Tool

    Science.gov (United States)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  2. Power and performance software analysis and optimization

    CERN Document Server

    Kukunas, Jim

    2015-01-01

    Power and Performance: Software Analysis and Optimization is a guide to solving performance problems in modern Linux systems. Power-efficient chips are no help if the software those chips run on is inefficient. Starting with the necessary architectural background as a foundation, the book demonstrates the proper usage of performance analysis tools in order to pinpoint the cause of performance problems, and includes best practices for handling common performance issues those tools identify. Provides expert perspective from a key member of Intel's optimization team on how processors and memory

  3. Computer graphics at VAX JINR

    International Nuclear Information System (INIS)

    Balashov, V.K.

    1991-01-01

    The structure of the software for computer graphics at VAX JINR is described. It consists of the graphical packages GKS and WAND and a set of graphical packages for High Energy Physics applications designed at CERN. 17 refs.; 1 tab.

  4. The Focinator v2-0 - Graphical Interface, Four Channels, Colocalization Analysis and Cell Phase Identification.

    Science.gov (United States)

    Oeck, Sebastian; Malewicz, Nathalie M; Hurst, Sebastian; Al-Refae, Klaudia; Krysztofiak, Adam; Jendrossek, Verena

    2017-07-01

    The quantitative analysis of foci plays an important role in various cell biological methods. In the fields of radiation biology and experimental oncology, the effect of ionizing radiation, chemotherapy or molecularly targeted drugs on DNA damage induction and repair is frequently assessed by the analysis of protein clusters or phosphorylated proteins recruited to so-called repair foci at DNA damage sites, involving for example γ-H2A.X, 53BP1 or RAD51. We recently developed "The Focinator" as a reliable and fast tool for automated quantitative and qualitative analysis of nuclei and DNA damage foci. The refined software is now even more user-friendly due to a graphical interface and further features. Thus, we included an R-script-based mode for automated image opening, file naming, progress monitoring and error reporting. Consequently, the evaluation no longer requires the attendance of the operator after initial parameter definition. Moreover, the Focinator v2-0 is now able to perform multi-channel analysis of four channels and evaluation of protein-protein colocalization by comparison of up to three foci channels. This enables, for example, the quantification of foci in cells of a specific cell cycle phase.
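
    A minimal SciPy sketch of the core foci-counting step, thresholding plus connected-component labeling with a minimum-size filter; the Focinator's actual implementation differs, and the image here is synthetic:

        import numpy as np
        from scipy import ndimage

        # Hypothetical nucleus image: background noise plus a few bright foci.
        rng = np.random.default_rng(2)
        img = rng.normal(100.0, 5.0, (128, 128))
        for y, x in [(30, 40), (60, 90), (100, 20)]:
            img[y-2:y+3, x-2:x+3] += 80.0

        # Threshold relative to background, label connected bright regions,
        # and drop regions smaller than a minimum focus size.
        mask = img > img.mean() + 5 * img.std()
        labels, n = ndimage.label(mask)
        sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
        foci_count = int(np.sum(sizes >= 4))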

  5. PIV/HPIV Film Analysis Software Package

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A PIV/HPIV film analysis software system was developed that calculates the 2-dimensional spatial autocorrelations of subregions of Particle Image Velocimetry (PIV) or Holographic Particle Image Velocimetry (HPIV) film recordings. The software controls three hardware subsystems including (1) a Kodak Megaplus 1.4 camera and EPIX 4MEG framegrabber subsystem, (2) an IEEE/Unidex 11 precision motion control subsystem, and (3) an Alacron I860 array processor subsystem. The software runs on an IBM PC/AT host computer running either the Microsoft Windows 3.1 or Windows 95 operating system. It is capable of processing five PIV or HPIV displacement vectors per second, and is completely automated with the exception of user input to a configuration file prior to analysis execution for update of various system parameters.
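
    A NumPy sketch of the 2D spatial autocorrelation step via the Wiener-Khinchin theorem; for a double-exposure subregion, the off-center correlation peak gives the particle displacement (synthetic data, not the package's code):

        import numpy as np

        def autocorrelation_2d(sub):
            """2D spatial autocorrelation via FFT (Wiener-Khinchin theorem)."""
            f = np.fft.fft2(sub - sub.mean())
            acf = np.fft.ifft2(f * np.conj(f)).real
            return np.fft.fftshift(acf)          # zero-lag peak moved to the center

        # Hypothetical double-exposure subregion: a sparse particle pattern plus
        # a copy shifted by (3, 5) pixels, as a second exposure would record.
        rng = np.random.default_rng(3)
        base = (rng.random((64, 64)) > 0.97).astype(float)
        sub = base + np.roll(base, (3, 5), axis=(0, 1))

        acf = autocorrelation_2d(sub)
        acf[32, 32] = 0.0                        # suppress the trivial zero-lag peak
        dy, dx = np.unravel_index(np.argmax(acf), acf.shape)
        displacement = (dy - 32, dx - 32)        # +/-(3, 5), up to sign ambiguity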

  6. Radiation and environmental data analysis computer (REDAC) hardware, software and analysis procedures

    International Nuclear Information System (INIS)

    Hendricks, T.J.

    1985-01-01

    The REDAC was conceived originally as a tape verifier for the Radiation and Environmental Data Acquisition Recorder (REDAR). From that simple beginning in 1971, the REDAC has evolved into a family of systems used for complete analysis of data obtained by the REDAR and other acquisition systems. Portable or mobile REDACs are deployed to support checkout and analysis tasks in the field. Laboratory systems are additionally used for software development, physics investigations, data base management and graphics. System configurations range from man-portable systems to a large laboratory-based system which supports time-shared analysis and development tasks. Custom operating software allows the analyst to process data either interactively or by batch procedures. Analysis packages are provided for numerous necessary functions. All these analysis procedures can be performed even on the smallest man-portable REDAC. Examples of the multi-isotope stripping and radiation isopleth mapping are presented. Techniques utilized for these operations are also presented

  7. Software for computerised analysis of cardiotocographic traces.

    Science.gov (United States)

    Romano, M; Bifulco, P; Ruffo, M; Improta, G; Clemente, F; Cesarelli, M

    2016-02-01

    Despite the widespread use of cardiotocography in foetal monitoring, the evaluation of foetal status suffers from considerable inter- and intra-observer variability. In order to overcome the main limitations of visual cardiotocographic assessment, computerised methods to analyse cardiotocographic recordings have recently been developed. In this study, new software for the automated analysis of foetal heart rate is presented. It provides an automatic procedure for measuring the most relevant parameters derivable from cardiotocographic traces. Simulated and real cardiotocographic traces were analysed to test software reliability. In artificial traces, we simulated a set number of events (accelerations, decelerations and contractions) to be recognised. In the case of real signals, results of the computerised analysis were compared with the visual assessment performed by 18 expert clinicians, and three performance indexes were computed to gain information about the performance of the proposed software. The software showed preliminary performance we judged satisfactory, in that the results matched the requirements completely, as proved by tests on artificial signals in which all simulated events were detected by the software. The performance indexes computed in comparison with the obstetricians' evaluations are, on the contrary, less satisfactory; they yielded a sensitivity of 93%, a positive predictive value of 82% and an accuracy of 77%. Very probably this arises from the high variability of trace annotation carried out by clinicians. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
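
    The three reported indexes reduce to simple ratios of agreement counts; a minimal sketch with hypothetical counts (not the study's raw numbers):

        # Hypothetical agreement counts between software detections and the
        # clinicians' annotations (accelerations, decelerations, contractions).
        tp, fp, fn, tn = 93, 20, 7, 30

        sensitivity = tp / (tp + fn)            # fraction of annotated events found
        ppv = tp / (tp + fp)                    # fraction of detections that are real
        accuracy = (tp + tn) / (tp + fp + fn + tn)

        print(f"sensitivity={sensitivity:.0%}, PPV={ppv:.0%}, accuracy={accuracy:.0%}")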

  8. GRACAT, Software for grounding and collision analysis

    DEFF Research Database (Denmark)

    Friis-Hansen, Peter; Simonsen, Bo Cerup

    2002-01-01

    and grounding accidents. The software consists of three basic analysis modules and one risk mitigation module: 1) frequency, 2) damage, and 3) consequence. These modules can be used individually or in series and the analyses can be performed in deterministic or probabilistic mode. Finally, in the mitigation...

  9. Daylighting Design in Classroom Based on Yearly-Graphic Analysis

    Directory of Open Access Journals (Sweden)

    Yang Guan

    2016-07-01

    In China, existing buildings comprise more than 40 billion square meters, most with high energy consumption. A substantial reduction in electrical energy costs could be obtained through greater use of daylight. Daylight varies widely due to the movement of the sun, changing seasons and diverse weather conditions. Customary static daylight assessments, whose simulations represent only one time of the year or one time of the day, are inadequate to evaluate the dynamics of daylight variability. Using the intuitive graphic tool Temporal Map to display annual daylight data, this study compared different passive architectural design strategies under the climate conditions of five representative Chinese cities and selected the most suitable design scheme for each city. In this study, the dynamic yearly-graphic tool was utilized for architectural design in China, and we integrated the optimal design with the Chinese academic calendar to achieve improvements within the occupancy time. This modified map connects design work with human activity, which makes daylight evaluation more accurate and efficient. The results of this study will provide preliminary recommendations for energy-saving design in China and a reference for other similar studies.

  10. Graphical analysis of electron inertia induced acoustic instability

    International Nuclear Information System (INIS)

    Karmakar, P.K.; Deka, U.; Dwivedi, C.B.

    2005-01-01

    Recently, the practical significance of the asymptotic limit m_e/m_i → 0 for the electron density distribution has been judged in a two-component plasma system with drifting ions. It is reported that in the presence of drifting ions with drift speed exceeding the ion acoustic wave speed, the electron inertial delay effect facilitates the resonance coupling of the usual fluid ion acoustic mode with the ion-beam mode. In this contribution the same instability is analyzed by graphical and numerical methods. Note that the obtained dispersion relation differs from those of the other known normal modes of low-frequency ion plasma oscillations and waves. This is due to the consideration of electron inertial delay in the derivation of the dispersion relation of the ion acoustic wave fluctuations. Numerical calculations of the dispersion relation and wave energy are carried out to depict the graphical appearance of poles and positive-negative energy modes. It is found that the electron inertia induced ion acoustic wave instability arises out of linear resonance coupling between the negative and positive energy modes. A characterization of the resonance nature of the instability in Mach number space for different wave numbers of the ion acoustic mode is presented

  11. Analysis of graphic representation ability in oscillation phenomena

    Science.gov (United States)

    Dewi, A. R. C.; Putra, N. M. D.; Susilo

    2018-03-01

    This study aims to investigate students' ability to represent graphs of linear and harmonic functions in understanding oscillation phenomena. The research used mixed methods with a concurrent embedded design. The subjects were 35 students of class X MIA 3, SMA 1 Bae Kudus. Data were collected through essays and interviews addressing the ability to read and draw graphs for the material of Hooke's law and oscillation characteristics. The results showed that most of the students had difficulty in drawing graphs of the linear function and of the harmonic function of deviation versus time. Students' difficulties in drawing the graph of the linear function were: analyzing the variable data needed in graph making, confusing the placement of variable data on the coordinate axes, determining the scale interval on each coordinate, and variation in how to connect the dots forming the graph. Students' difficulties in representing the graph of the harmonic function were: determining the time interval of the sine harmonic function, determining the initial deviation point of the drawing, finding the deviation equation in the case of oscillation characteristics, and confusing the maximum deviation (amplitude) with the length of the spring caused by the load. Given the complexity of the characteristic attributes of oscillation graphs, students tend to perform less well in the graphical representation of harmonic functions than in that of linear functions.

  12. Design of a Software for Calculating Isoelectric Point of a Polypeptide According to Their Net Charge Using the Graphical Programming Language LabVIEW

    Science.gov (United States)

    Tovar, Glomen

    2018-01-01

    A software to calculate the net charge and to predict the isoelectric point (pI) of a polypeptide is developed in this work using the graphical programming language LabVIEW. Through this instrument the net charges of the ionizable residues of the chains of the proteins are calculated at different pH values, tabulated, pI is predicted and an Excel…
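
    A hedged Python sketch of the standard net-charge calculation behind such a tool: a Henderson-Hasselbalch term per ionizable group and bisection for the pH of zero net charge. The pKa values are textbook approximations that vary by source, and this mirrors the idea rather than the LabVIEW implementation:

        # Approximate textbook pKa values; published tables differ slightly.
        PKA_POS = {"K": 10.5, "R": 12.5, "H": 6.0, "nterm": 9.0}            # protonated = +1
        PKA_NEG = {"D": 3.9, "E": 4.1, "C": 8.3, "Y": 10.1, "cterm": 2.0}   # deprotonated = -1

        def net_charge(seq, ph):
            charge = 1 / (1 + 10 ** (ph - PKA_POS["nterm"]))     # N-terminus
            charge -= 1 / (1 + 10 ** (PKA_NEG["cterm"] - ph))    # C-terminus
            for aa in seq:
                if aa in PKA_POS:
                    charge += 1 / (1 + 10 ** (ph - PKA_POS[aa]))
                elif aa in PKA_NEG:
                    charge -= 1 / (1 + 10 ** (PKA_NEG[aa] - ph))
            return charge

        def isoelectric_point(seq, lo=0.0, hi=14.0, tol=1e-4):
            # Net charge decreases monotonically with pH, so bisection finds the zero.
            while hi - lo > tol:
                mid = (lo + hi) / 2
                lo, hi = (mid, hi) if net_charge(seq, mid) > 0 else (lo, mid)
            return (lo + hi) / 2

        print(isoelectric_point("ACDKRH"))  # hypothetical hexapeptide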

  13. PROMETHEE Method and Sensitivity Analysis in the Software Application for the Support of Decision-Making

    Directory of Open Access Journals (Sweden)

    Petr Moldrik

    2008-01-01

    PROMETHEE is one of the methods that fall under multi-criteria analysis (MCA). The MCA, as the name itself indicates, deals with the evaluation of particular variants according to several criteria. The developed software application for the support of multi-criteria decision-making (MCA8) was upgraded with the PROMETHEE method and a graphic tool that enables the execution of sensitivity analysis. This analysis is used to ascertain how a given model output depends upon the input parameters. The MCA8 software application with the mentioned graphic upgrade was developed for the purpose of solving multi-criteria decision tasks. In MCA8 it is possible to perform sensitivity analysis in a simple form, through column graphs. Criteria significances (weights) can be changed directly in these column graphs, and the resulting changes in the order of variants observed immediately.
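
    A minimal PROMETHEE II sketch with the "usual" preference function and a hypothetical decision matrix; MCA8 presumably supports richer preference functions, so this only illustrates the flow computation:

        import numpy as np

        # Hypothetical decision matrix: 3 variants x 2 criteria (both to maximize),
        # with criteria weights summing to 1.
        A = np.array([[7.0, 4.0],
                      [5.0, 9.0],
                      [8.0, 6.0]])
        w = np.array([0.6, 0.4])
        n = len(A)

        # "Usual" preference function: P(a, b) = 1 per criterion where a beats b.
        pi = np.zeros((n, n))                      # aggregated preference indices
        for a in range(n):
            for b in range(n):
                if a != b:
                    pi[a, b] = np.sum(w * (A[a] > A[b]))

        phi_plus = pi.sum(axis=1) / (n - 1)        # leaving flow
        phi_minus = pi.sum(axis=0) / (n - 1)       # entering flow
        phi = phi_plus - phi_minus                 # PROMETHEE II net flow
        ranking = np.argsort(-phi)                 # best variant first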

  14. Residence time distribution software analysis. User's manual

    International Nuclear Information System (INIS)

    1996-01-01

    Radiotracer applications cover a wide range of industrial activities in chemical and metallurgical processes, water treatment, mineral processing, environmental protection and civil engineering. Experiment design, data acquisition, treatment and interpretation are the basic elements of tracer methodology. The application of radiotracers to determine the impulse response, i.e. the residence time distribution (RTD), as well as the technical conditions for conducting experiments in industry and in the environment, creates a need for data processing using special software. Important progress has been made during recent years in the preparation of software programs for data treatment and interpretation. The software package developed for industrial process analysis and diagnosis by stimulus-response methods contains all the methods for processing data from radiotracer experiments
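
    A minimal sketch of the basic RTD processing step, normalizing a tracer response curve and computing its first two moments by numerical integration (hypothetical data):

        import numpy as np
        from scipy.integrate import trapezoid

        # Hypothetical tracer response C(t) recorded at the vessel outlet.
        t = np.linspace(0.0, 60.0, 121)                 # minutes
        c = t * np.exp(-t / 8.0)                        # arbitrary impulse response

        e = c / trapezoid(c, t)                         # RTD: E(t) = C(t) / integral of C dt
        mean_rt = trapezoid(t * e, t)                   # mean residence time
        variance = trapezoid((t - mean_rt) ** 2 * e, t) # spread of the RTD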

  15. Improving Software Systems By Flow Control Analysis

    Directory of Open Access Journals (Sweden)

    Piotr Poznanski

    2012-01-01

    Using agile methods during the implementation of a system that meets mission-critical requirements can be a real challenge. Changing a system built of dozens or even hundreds of specialized devices with embedded software requires the cooperation of a large group of engineers. This article presents a solution that supports the parallel work of groups of system analysts and software developers. Applying formal rules to requirements written in natural language enables formal analysis of artifacts that form a bridge between software and system requirements. The formalism and textual form of the requirements allowed the automatic generation of a message flow graph for the (sub)system, called the "big-picture model". Flow diagram analysis helped to avoid a large number of defects whose repair cost, in extreme cases, could undermine the legitimacy of agile methods in projects of this scale. Retrospectively, a reduction of technical debt was observed. Continuous analysis of the "big-picture model" improves control of the quality parameters of the software architecture. The article also tries to explain why a commercial platform based on the UML modeling language may not be sufficient in projects of this complexity.

  16. Intraprocedural dataflow analysis for software product lines

    DEFF Research Database (Denmark)

    Brabrand, Claus; Ribeiro, Márcio; Tolêdo, Társis

    2013-01-01

    Software product lines (SPLs) developed using annotative approaches such as conditional compilation come with an inherent risk of constructing erroneous products. For this reason, it is essential to be able to analyze such SPLs. However, as dataflow analysis techniques are not able to deal with SP...... and memory characteristics on five qualitatively different SPLs. On our benchmarks, the combined analysis strategy is up to almost eight times faster than the brute-force approach....

  17. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  18. RNAstructure: software for RNA secondary structure prediction and analysis.

    Science.gov (United States)

    Reuter, Jessica S; Mathews, David H

    2010-03-15

    To understand an RNA sequence's mechanism of action, the structure must be known. Furthermore, target RNA structure is an important consideration in the design of small interfering RNAs and antisense DNA oligonucleotides. RNA secondary structure prediction, using thermodynamics, can be used to develop hypotheses about the structure of an RNA sequence. RNAstructure is a software package for RNA secondary structure prediction and analysis. It uses thermodynamics and utilizes the most recent set of nearest neighbor parameters from the Turner group. It includes methods for secondary structure prediction (using several algorithms), prediction of base pair probabilities, bimolecular structure prediction, and prediction of a structure common to two sequences. This contribution describes new extensions to the package, including a library of C++ classes for incorporation into other programs, a user-friendly graphical user interface written in JAVA, and new Unix-style text interfaces. The original graphical user interface for Microsoft Windows is still maintained. The extensions to RNAstructure serve to make RNA secondary structure prediction user-friendly. The package is available for download from the Mathews lab homepage at http://rna.urmc.rochester.edu/RNAstructure.html.

  19. A graphical user interface for RAId, a knowledge integrated proteomics analysis suite with accurate statistics

    OpenAIRE

    Joyce, Brendan; Lee, Danny; Rubio, Alex; Ogurtsov, Aleksey; Alves, Gelio; Yu, Yi-Kuo

    2018-01-01

    Abstract. Objective: RAId is a software package that has been actively developed for the past 10 years for computationally and visually analyzing MS/MS data. Founded on rigorous statistical methods, RAId's core program computes accurate E-values for peptides and proteins identified during database searches. Making this robust tool readily accessible for the proteomics community by developing a graphical user interface (GUI) is our main goa...

  20. Development of Graphical User Interface for Finite Element Analysis of Static Loading of a Column using MATLAB

    Directory of Open Access Journals (Sweden)

    Moses Omolayo PETINRIN

    2010-12-01

    In this work, the capability of the MATLAB software package to develop graphical user interface (GUI) packages was demonstrated. A GUI was successfully developed using the MATLAB programming language to study the behaviour of a suspended column under uniaxial static loading by solving a numerical model created with the finite element method (FEM). The comparison between the exact solution from previous research and the numerical analysis showed good agreement. The column average strain, average stress and average load are equivalent to, but more accurate than, the ones obtained when the whole column is taken as one element (two nodes for a one-dimensional linear finite element problem). It was established in this work that MATLAB is not only a software package for numerical computation but also for application development.
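
    A hedged sketch of the underlying one-dimensional linear FEM, in Python rather than the paper's MATLAB: two-node bar elements are assembled, the top node is fixed, an end load is applied, and element strain and stress are recovered (all parameter values hypothetical):

        import numpy as np

        # Hypothetical column: length 2 m, E = 200 GPa, area 0.01 m^2, end load 100 kN.
        E, A, L, P, n_el = 200e9, 0.01, 2.0, 100e3, 4
        le = L / n_el
        n_nodes = n_el + 1

        # Assemble the global stiffness from identical two-node bar elements.
        K = np.zeros((n_nodes, n_nodes))
        ke = (E * A / le) * np.array([[1.0, -1.0], [-1.0, 1.0]])
        for e in range(n_el):
            K[e:e+2, e:e+2] += ke

        f = np.zeros(n_nodes)
        f[-1] = P                                   # axial load at the free end

        # Suspended column: node 0 fixed; solve the reduced system.
        u = np.zeros(n_nodes)
        u[1:] = np.linalg.solve(K[1:, 1:], f[1:])

        strain = np.diff(u) / le                    # constant per element
        stress = E * strain                         # uniform P/A here, as expected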

  1. Development of neutron activation analysis software

    International Nuclear Information System (INIS)

    Wang Liyu

    1987-10-01

    The software for quantitative neutron activation analysis was developed to run under the MS/DOS operating system. The programmes of the IBM/SPAN package include: spectra file transfer from and to a Canberra Series 35 multichannel analyzer, spectrum evaluation routines, calibration subprogrammes, and quantitative analysis. The programmes for spectrum analysis include a fitting routine for the separation of multiple lines, which reproduces the peak shape with a combination of Gaussian and exponential terms. The programmes were tested on an IBM/AT-compatible computer. The programmes and the sources are available cost-free for IAEA projects of Technical Cooperation. 7 refs, 3 figs
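
    As a generic sketch of the peak-shape idea described above, a Gaussian with an exponential tail fitted by least squares; the exact tail form and parameters used by the original programmes are not specified here and are assumed:

```python
# Generic Gaussian-plus-exponential-tail peak fit; the tail form and the
# parameter values are assumptions, not the original IBM/SPAN implementation.
import numpy as np
from scipy.optimize import curve_fit

def peak(x, amp, mu, sigma, tail_amp, tail_slope):
    gauss = amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2)
    tail = tail_amp * np.exp(tail_slope * (x - mu)) * (x < mu)  # low-E tail
    return gauss + tail

x = np.arange(100.0)
rng = np.random.default_rng(0)
y = peak(x, 1000, 50, 3, 50, 0.2) + rng.normal(0, 5, x.size)  # synthetic data

popt, _ = curve_fit(peak, x, y, p0=[900, 49, 2.5, 40, 0.1])
print("fitted centroid:", popt[1], " sigma:", popt[2])
```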

  2. Using R and RStudio for data management, statistical analysis and graphics

    CERN Document Server

    Horton, Nicholas J

    2015-01-01

    This is the second edition of the popular book on using R for statistical analysis and graphics. The authors, who run a popular blog supplementing their books, have focused on adding many new examples to this new edition. These examples are presented primarily in new chapters based on the following themes: simulation, probability, statistics, mathematics/computing, and graphics. The authors have also added many other updates, including a discussion of RStudio, a very popular development environment for R.

  3. mcaGUI: microbial community analysis R-Graphical User Interface (GUI)

    OpenAIRE

    Copeland, Wade K.; Krishnan, Vandhana; Beck, Daniel; Settles, Matt; Foster, James A.; Cho, Kyu-Chul; Day, Mitch; Hickey, Roxana; Schütte, Ursel M.E.; Zhou, Xia; Williams, Christopher J.; Forney, Larry J.; Abdo, Zaid

    2012-01-01

    Summary: Microbial communities have an important role in natural ecosystems and have an impact on animal and human health. Intuitive graphic and analytical tools that can facilitate the study of these communities are in short supply. This article introduces Microbial Community Analysis GUI, a graphical user interface (GUI) for the R-programming language (R Development Core Team, 2010). With this application, researchers can input aligned and clustered sequence data to create custom abundance ...

  4. Stochastic Analysis of a Queue Length Model Using a Graphics Processing Unit

    Czech Academy of Sciences Publication Activity Database

    Přikryl, Jan; Kocijan, J.

    2012-01-01

    Roč. 5, č. 2 (2012), s. 55-62 ISSN 1802-971X R&D Projects: GA MŠk(CZ) MEB091015 Institutional support: RVO:67985556 Keywords: graphics processing unit * GPU * Monte Carlo simulation * computer simulation * modeling Subject RIV: BC - Control Systems Theory http://library.utia.cas.cz/separaty/2012/AS/prikryl-stochastic analysis of a queue length model using a graphics processing unit.pdf

  5. IUE Data Analysis Software for Personal Computers

    Science.gov (United States)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton, P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  6. Graphical Models with R

    DEFF Research Database (Denmark)

    Højsgaard, Søren; Edwards, David; Lauritzen, Steffen

    Graphical models in their modern form have been around since the late 1970s and appear today in many areas of the sciences. Along with the ongoing developments of graphical models, a number of different graphical modeling software programs have been written over the years. In recent years many of these software developments have taken place within the R community, either in the form of new packages or by providing an R interface to existing software. This book attempts to give the reader a gentle introduction to graphical modeling using R and the main features of some of these packages. In addition, the book provides examples of how more advanced aspects of graphical modeling can be represented and handled within R. Topics covered in the seven chapters include graphical models for contingency tables, Gaussian and mixed graphical models, Bayesian networks and modeling high dimensional data...

  7. Graphical Models with R

    CERN Document Server

    Højsgaard, Søren; Lauritzen, Steffen

    2012-01-01

    Graphical models in their modern form have been around since the late 1970s and appear today in many areas of the sciences. Along with the ongoing developments of graphical models, a number of different graphical modeling software programs have been written over the years. In recent years many of these software developments have taken place within the R community, either in the form of new packages or by providing an R interface to existing software. This book attempts to give the reader a gentle introduction to graphical modeling using R and the main features of some of these packages. In addition…

  8. Application of Metric-based Software Reliability Analysis to Example Software

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Smidts, Carol

    2008-07-01

    The software reliability of the TELLERFAST ATM software is analyzed by using two metric-based software reliability analysis methods: a state transition diagram-based method and a test coverage-based method. The procedures for the software reliability analysis using the two methods, and the analysis results, are provided in this report. It is found that the two methods complement each other, and further research on combining the two methods, so that software reliability analysis benefits from this complementary effect, is therefore recommended

  9. ATLAS tile calorimeter cesium calibration control and analysis software

    International Nuclear Information System (INIS)

    Solovyanov, O; Solodkov, A; Starchenko, E; Karyukhin, A; Isaev, A; Shalanda, N

    2008-01-01

    An online control system to calibrate and monitor the ATLAS Barrel hadronic calorimeter (TileCal) with a movable radioactive source, driven by liquid flow, is described. To read out and control the system, online software has been developed using ATLAS TDAQ components such as DVS (Diagnostic and Verification System) to verify the hardware before running, IS (Information Server) for data and status exchange between networked computers, and DDC (DCS to DAQ Connection) to connect to the PVSS-based slow control systems of the Tile Calorimeter, high voltage and low voltage. A system of scripting facilities, based on the Python language, is used to handle all the calibration and monitoring processes, from the hardware level to final data storage, including various abnormal situations. A Qt-based graphical user interface to display the status of the calibration system during the cesium source scan is described. The software for analysis of the detector response, using online data, is discussed. The performance of the system and first experience from the ATLAS pit are presented

  10. ATLAS tile calorimeter cesium calibration control and analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Solovyanov, O; Solodkov, A; Starchenko, E; Karyukhin, A; Isaev, A; Shalanda, N [Institute for High Energy Physics, Protvino 142281 (Russian Federation)], E-mail: Oleg.Solovyanov@ihep.ru

    2008-07-01

    An online control system to calibrate and monitor the ATLAS Barrel hadronic calorimeter (TileCal) with a movable radioactive source, driven by liquid flow, is described. To read out and control the system, online software has been developed using ATLAS TDAQ components such as DVS (Diagnostic and Verification System) to verify the hardware before running, IS (Information Server) for data and status exchange between networked computers, and DDC (DCS to DAQ Connection) to connect to the PVSS-based slow control systems of the Tile Calorimeter, high voltage and low voltage. A system of scripting facilities, based on the Python language, is used to handle all the calibration and monitoring processes, from the hardware level to final data storage, including various abnormal situations. A Qt-based graphical user interface to display the status of the calibration system during the cesium source scan is described. The software for analysis of the detector response, using online data, is discussed. The performance of the system and first experience from the ATLAS pit are presented.

  11. The NOD3 software package: A graphical user interface-supported reduction package for single-dish radio continuum and polarisation observations

    Science.gov (United States)

    Müller, Peter; Krause, Marita; Beck, Rainer; Schmidt, Philip

    2017-10-01

    Context. The venerable NOD2 data reduction software package for single-dish radio continuum observations, which was developed for use at the 100-m Effelsberg radio telescope, has been successfully applied over many decades. Modern computing facilities, however, call for a new design. Aims: We aim to develop an interactive software tool with a graphical user interface for the reduction of single-dish radio continuum maps. We make a special effort to reduce the distortions along the scanning direction (scanning effects) by combining maps scanned in orthogonal directions, or from dual- or multiple-horn observations, which need to be processed in a restoration procedure. The package should also process polarisation data and offer the possibility to include special tasks written by the individual user. Methods: Based on the ideas of the NOD2 package we developed NOD3, which includes all necessary tasks from the raw maps to the final maps in total intensity and linear polarisation. Furthermore, plot routines and several methods for map analysis are available. The NOD3 package is written in Python, which allows the extension of the package via additional tasks. The required data format for the input maps is FITS. Results: The NOD3 package is a sophisticated tool to process and analyse maps from single-dish observations that are affected by scanning effects from clouds, receiver instabilities, or radio-frequency interference. The "basket-weaving" tool combines orthogonally scanned maps into a final map that is almost free of scanning effects. The new restoration tool for dual-beam observations reduces the noise by a factor of about two compared to the NOD2 version. Combining single-dish with interferometer data in the map plane ensures the full recovery of the total flux density. Conclusions: This software package is available under the open source license GPL for free use at other single-dish radio telescopes of the astronomical community. The NOD3 package is designed to be

  12. Digital PIV (DPIV) Software Analysis System

    Science.gov (United States)

    Blackshire, James L.

    1997-01-01

    A software package was developed to provide a Digital PIV (DPIV) capability for NASA LaRC. The system provides an automated image capture, test correlation, and autocorrelation analysis capability for the Kodak Megaplus 1.4 digital camera system for PIV measurements. The package includes three separate programs that, when used together with the PIV data validation algorithm, constitute a complete DPIV analysis capability. The programs run on an IBM PC/AT host computer under either Microsoft Windows 3.1 or Windows 95, using a 'quickwin' format that provides a simple user interface and output capabilities in the Windows environment.

  13. Microarray Я US: a user-friendly graphical interface to Bioconductor tools that enables accurate microarray data analysis and expedites comprehensive functional analysis of microarray results.

    Science.gov (United States)

    Dai, Yilin; Guo, Ling; Li, Meng; Chen, Yi-Bu

    2012-06-08

    Microarray data analysis presents a significant challenge to researchers who are unable to use the powerful Bioconductor and its numerous tools due to their lack of knowledge of the R language. Among the few existing software programs that offer a graphic user interface to Bioconductor packages, none have implemented a comprehensive strategy to address the accuracy and reliability issues of microarray data analysis that arise from the well-known probe design problems associated with many widely used microarray chips. There is also a lack of tools that would expedite the functional analysis of microarray results. We present Microarray Я US, an R-based graphical user interface that implements over a dozen popular Bioconductor packages to offer researchers a streamlined workflow for routine differential microarray expression data analysis without the need to learn the R language. In order to enable a more accurate analysis and interpretation of microarray data, we incorporated the latest custom probe re-definitions and re-annotations for Affymetrix and Illumina chips. A versatile microarray results output utility tool was also implemented for easy and fast generation of input files for over 20 of the most widely used functional analysis software programs. Coupled with a well-designed user interface, Microarray Я US leverages cutting-edge Bioconductor packages for researchers with no knowledge of the R language. It also enables a more reliable and accurate microarray data analysis and expedites downstream functional analysis of microarray results.

  14. Specdata: Automated Analysis Software for Broadband Spectra

    Science.gov (United States)

    Oliveira, Jasmine N.; Martin-Drumel, Marie-Aline; McCarthy, Michael C.

    2017-06-01

    With the advancement of chirped-pulse techniques, broadband rotational spectra with a few tens to several hundred GHz of spectral coverage are now routinely recorded. When studying multi-component mixtures that might result, for example, with the use of an electrical discharge, lines of new chemical species are often obscured by those of known compounds, and analysis can be laborious. To address this issue, we have developed SPECdata, an open source, interactive tool which is designed to simplify and greatly accelerate spectral analysis and discovery. Our software tool combines both automated and manual components that free the user from computation, while giving him/her considerable flexibility to assign, manipulate, interpret and export their analysis. The automated, and key, component of the new software is a database query system that rapidly assigns transitions of known species in an experimental spectrum. For each experiment, the software identifies spectral features, and subsequently assigns them to known molecules within an in-house database (Pickett .cat files, list of frequencies...), or those catalogued in Splatalogue (using automatic on-line queries). With suggested assignments, control is then handed over to the user, who can choose to accept, decline or add additional species. Data visualization, statistical information, and interactive widgets assist the user in making decisions about their data. SPECdata has several other useful features intended to improve the user experience. Exporting a full report of the analysis, or a peak file in which assigned lines have been removed, is among several options. A user may also save their progress to continue at another time. Additional features of SPECdata help the user to maintain and expand their database for future use. A user-friendly interface allows one to search, upload, edit or update catalog or experiment entries.
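
    The core assignment step, matching detected line frequencies against catalogued transitions within a tolerance, can be sketched generically; the catalog contents, names and tolerance below are invented, and this is not SPECdata's code:

```python
# Generic peak-to-catalog assignment within a frequency tolerance; catalog
# entries and the tolerance are illustrative only.
import numpy as np

catalog_freqs = np.array([9501.25, 10234.60, 11873.10])    # MHz, sorted
catalog_names = ["species A", "species B", "species C"]
peaks = np.array([9501.30, 10500.00, 11873.05])            # detected, MHz
tol = 0.20                                                 # MHz

idx = np.searchsorted(catalog_freqs, peaks)
for f, i in zip(peaks, idx):
    # compare against the catalog neighbours on either side of the peak
    cands = [j for j in (i - 1, i) if 0 <= j < len(catalog_freqs)]
    j = min(cands, key=lambda c: abs(catalog_freqs[c] - f))
    if abs(catalog_freqs[j] - f) <= tol:
        print(f"{f:.2f} MHz -> {catalog_names[j]}")
    else:
        print(f"{f:.2f} MHz -> unassigned")
```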

  15. Analysis of signal acquisition in GPS receiver software

    Directory of Open Access Journals (Sweden)

    Vlada S. Sokolović

    2011-01-01

    The whole receiver is implemented in software, in the MATLAB package. One of the processes during the signal processing is the initial synchronization (acquisition), where a signal is detected and the carrier frequency, the code phase, and the carrier Doppler frequency are determined. The acquisition aim is to determine, in the shortest time possible, the parameters of the detected signals and forward them to the next block in synchronization. Depending on the speed and accuracy of the signal parameter determination, different methods of acquisition are applied in practice. The paper presents the methods of serial, parallel and cyclic convolution. For comparison purposes, the signal processing architectures of the particular methods, as implemented in receiver software, are shown. All measurements were performed on the same signal under the same conditions. On the basis of the tests performed, a detailed analysis of the collected data was carried out and the most acceptable acquisition method for implementation in a software GPS receiver was proposed. Because of a relatively high level of noise at the receiver entrance and the received signal interference, the comparison of the results has been done on the basis of the analytical results and the mean time of signal synchronization. The measurement results are shown in tables for easy comparison. The results of measurements using the proposed method are presented as well. The technology of receiver software allows the user to easily access the architecture of the receiver and therefore allows simple changes of parameters. The influence of the parameters on the process of signal acquisition is also shown in the paper. The graphic presentation shows how and to what extent some of the parameters affect the process of the receiver signal processing. All listed acquisition methods are used in practice. The proposed method is the most suitable for application in software receivers. Based on the analysis
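
    Of the three methods compared, the cyclic-convolution (parallel code phase search) acquisition evaluates every code phase of a Doppler bin with a single FFT pair. A minimal NumPy sketch of that idea follows; the sample rate, the stand-in code, and the Doppler grid are invented and do not reproduce the paper's receiver:

```python
# FFT-based parallel code phase search: per Doppler bin, one circular
# correlation over all code phases. All signal parameters are toy values.
import numpy as np

fs = 4.0e6                                   # sample rate, Hz (assumed)
n = 4000                                     # one code period at fs
t = np.arange(n) / fs
rng = np.random.default_rng(1)
code = rng.choice([-1.0, 1.0], n)            # stand-in for a PRN replica

# Simulated received signal: code delayed 700 samples on a 2 kHz carrier.
sig = np.roll(code, 700) * np.cos(2 * np.pi * 2000.0 * t)
sig += rng.normal(0, 1.0, n)

code_fft = np.conj(np.fft.fft(code))
best = (0.0, None, None)
for doppler in np.arange(-5000.0, 5001.0, 500.0):       # Doppler bins, Hz
    baseband = sig * np.exp(-2j * np.pi * doppler * t)  # carrier wipe-off
    corr = np.abs(np.fft.ifft(np.fft.fft(baseband) * code_fft)) ** 2
    if corr.max() > best[0]:
        best = (corr.max(), doppler, int(corr.argmax()))

print("Doppler:", best[1], "Hz  code phase:", best[2], "samples")
```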

  16. Software development processes and analysis software: a mismatch and a novel framework

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2011-01-01

    This paper discusses the salient characteristics of analysis software and the impact of those characteristics on its development. From this discussion, it can be seen that mainstream software development processes, usually characterized as Plan Driven or Agile, are built upon assumptions that are mismatched to the development and maintenance of analysis software. We propose a novel software development framework that would match the process normally observed in the development of analysis software. In the discussion of this framework, we suggest areas of research and directions for future work. (author)

  17. CMS Computing Software and Analysis Challenge 2006

    Energy Technology Data Exchange (ETDEWEB)

    De Filippis, N. [Dipartimento interateneo di Fisica M. Merlin and INFN Bari, Via Amendola 173, 70126 Bari (Italy)

    2007-10-15

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, such as the prompt reconstruction, the data streaming, iterative calibration and alignment executions, the data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project were also exercised, to give physicists submitting production and analysis jobs user-friendly access to the data and the resources. An overview of the status and results of the CSA06 is presented in this work.

  18. CMS Computing Software and Analysis Challenge 2006

    International Nuclear Information System (INIS)

    De Filippis, N.

    2007-01-01

    The CMS (Compact Muon Solenoid) collaboration is making a big effort to test the workflow and the dataflow associated with the data handling model. With this purpose the Computing, Software and Analysis Challenge 2006, namely CSA06, started on the 15th of September. It was a 50 million event exercise that included all the steps of the analysis chain, such as the prompt reconstruction, the data streaming, iterative calibration and alignment executions, the data distribution to regional sites, up to the end-user analysis. Grid tools provided by the LCG project were also exercised, to give physicists submitting production and analysis jobs user-friendly access to the data and the resources. An overview of the status and results of the CSA06 is presented in this work

  19. Integrating R and Java for Enhancing Interactivity of Algorithmic Data Analysis Software Solutions

    Directory of Open Access Journals (Sweden)

    Titus Felix FURTUNĂ

    2016-06-01

    Full Text Available Conceiving software solutions for statistical processing and algorithmic data analysis involves handling diverse data, fetched from various sources and in different formats, and presenting the results in a suggestive, tailorable manner. Our ongoing research aims to design programming techniques for integrating the R development environment with the Java programming language for interoperability at a source code level. The goal is to combine the intensive data processing capabilities of the R programming language, along with its multitude of statistical function libraries, with the flexibility offered by the Java programming language and platform, in terms of graphical user interface and mathematical function libraries. Both development environments are multiplatform oriented, and can complement each other through interoperability. R is a comprehensive and concise programming language, benefiting from a continuously expanding and evolving set of packages for statistical analysis, developed by the open source community. While it is a very efficient environment for statistical data processing, the R platform lacks support for developing user friendly, interactive, graphical user interfaces (GUIs). Java, on the other hand, is a high level object oriented programming language, which supports designing and developing performant and interactive frameworks for general purpose software solutions, through Java Foundation Classes, JavaFX and various graphical libraries. In this paper we treat both aspects of integration and interoperability that refer to integrating Java code into R applications, and bringing R processing sequences into Java driven software solutions. Our research has been conducted focusing on case studies concerning pattern recognition and cluster analysis.

  20. Radio-science performance analysis software

    Science.gov (United States)

    Morabito, D. D.; Asmar, S. W.

    1995-02-01

    The Radio Science Systems Group (RSSG) provides various support functions for several flight project radio-science teams. Among these support functions are uplink and sequence planning, real-time operations monitoring and support, data validation, archiving and distribution functions, and data processing and analysis. This article describes the support functions that encompass radio-science data performance analysis. The primary tool used by the RSSG to fulfill this support function is the STBLTY program set. STBLTY is used to reconstruct observable frequencies and calculate model frequencies, frequency residuals, frequency stability in terms of Allan deviation, reconstructed phase, frequency and phase power spectral density, and frequency drift rates. In the case of one-way data, using an ultrastable oscillator (USO) as a frequency reference, the program set computes the spacecraft transmitted frequency and maintains a database containing the in-flight history of the USO measurements. The program set also produces graphical displays. Some examples and discussions on operating the program set on Galileo and Ulysses data will be presented.
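
    The Allan deviation reported by STBLTY has a standard textbook estimator, sketched here for reference (non-overlapping form; this is not the STBLTY code itself):

```python
# Non-overlapping Allan deviation from fractional-frequency samples y taken
# at spacing tau0; tau = m * tau0. Textbook estimator, toy white-FM data.
import numpy as np

def allan_deviation(y, m):
    n = (len(y) // m) * m
    ybar = y[:n].reshape(-1, m).mean(axis=1)    # averages over tau
    return np.sqrt(0.5 * np.mean(np.diff(ybar) ** 2))

rng = np.random.default_rng(2)
y = rng.normal(0, 1e-12, 10_000)                # white-FM toy data
for m in (1, 10, 100):
    print(f"tau = {m} * tau0:  sigma_y = {allan_deviation(y, m):.2e}")
```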

  1. Static analysis of software the abstract interpretation

    CERN Document Server

    Boulanger, Jean-Louis

    2013-01-01

    The existing literature currently available to students and researchers is very general, covering only the formal techniques of static analysis. This book presents real examples of the formal techniques called "abstract interpretation" currently being used in various industrial fields: railway, aeronautics, space, automotive, etc. The purpose of this book is to present students and researchers, in a single book, with the wealth of experience of people who are intrinsically involved in the realization and evaluation of software-based safety critical systems. As the authors are people curr

  2. A new paradigm for the development of analysis software

    International Nuclear Information System (INIS)

    Kelly, D.; Harauz, J.

    2012-01-01

    For the CANDU industry, analysis software is an important tool for scientists and engineers to examine issues related to safety, operation, and design. However, the software quality assurance approach currently used for these tools assumes the software is the delivered product. In this paper, we present a model that shifts the emphasis from software being the end-product to software being support for the end-product, the science. We describe a novel software development paradigm that supports this shift and provides the groundwork for re-examining the quality assurance practices used for analysis software. (author)

  3. Fast analysis of molecular dynamics trajectories with graphics processing units-Radial distribution function histogramming

    International Nuclear Information System (INIS)

    Levine, Benjamin G.; Stone, John E.; Kohlmeyer, Axel

    2011-01-01

    The calculation of radial distribution functions (RDFs) from molecular dynamics trajectory data is a common and computationally expensive analysis task. The rate limiting step in the calculation of the RDF is building a histogram of the distance between atom pairs in each trajectory frame. Here we present an implementation of this histogramming scheme for multiple graphics processing units (GPUs). The algorithm features a tiling scheme to maximize the reuse of data at the fastest levels of the GPU's memory hierarchy and dynamic load balancing to allow high performance on heterogeneous configurations of GPUs. Several versions of the RDF algorithm are presented, utilizing the specific hardware features found on different generations of GPUs. We take advantage of larger shared memory and atomic memory operations available on state-of-the-art GPUs to accelerate the code significantly. The use of atomic memory operations allows the fast, limited-capacity on-chip memory to be used much more efficiently, resulting in a fivefold increase in performance compared to the version of the algorithm without atomic operations. The ultimate version of the algorithm running in parallel on four NVIDIA GeForce GTX 480 (Fermi) GPUs was found to be 92 times faster than a multithreaded implementation running on an Intel Xeon 5550 CPU. On this multi-GPU hardware, the RDF between two selections of 1,000,000 atoms each can be calculated in 26.9 s per frame. The multi-GPU RDF algorithms described here are implemented in VMD, a widely used and freely available software package for molecular dynamics visualization and analysis.
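
    For reference, the quantity those kernels histogram can be written in a few NumPy lines; this is the naive O(N^2) CPU version that the paper's tiled multi-GPU kernels accelerate, with toy box parameters:

```python
# Naive CPU g(r): histogram minimum-image pair distances in a cubic periodic
# box and normalize by the ideal-gas expectation. Parameters are toy values.
import numpy as np

rng = np.random.default_rng(3)
box, n, nbins, rmax = 10.0, 500, 50, 5.0
pos = rng.uniform(0, box, (n, 3))

d = pos[:, None, :] - pos[None, :, :]
d -= box * np.round(d / box)                  # minimum-image convention
r = np.sqrt((d ** 2).sum(-1))[np.triu_indices(n, k=1)]

hist, edges = np.histogram(r, bins=nbins, range=(0.0, rmax))
shell = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
ideal = shell * (n / box ** 3) * n / 2.0      # expected pair counts for a gas
g = hist / ideal
print(g[:5])                                  # ~1 for uncorrelated positions
```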

  4. Development of an analysis methodology applied to 4πβ-γ software coincidence data acquisition system

    International Nuclear Information System (INIS)

    Brancaccio, Franco; Dias, Mauro da Silva; Toledo, Fabio de

    2009-01-01

    The present work describes the new software methodology under development at the IPEN Nuclear Metrology Laboratory for radionuclide standardizations with the 4πβ-γ coincidence technique. The software includes the Coincidence Graphic User Interface (GUI) and the Coincidence Analysis Program. The first results for a 60Co sample measurement are discussed and compared to the results obtained with two different conventional coincidence systems. (author)

  5. Quantitative graphical analysis of simultaneous dynamic PET/MRI for assessment of prostate cancer.

    Science.gov (United States)

    Rosenkrantz, Andrew B; Koesters, Thomas; Vahle, Anne-Kristin; Friedman, Kent; Bartlett, Rachel M; Taneja, Samir S; Ding, Yu-Shin; Logan, Jean

    2015-04-01

    Dynamic FDG imaging for prostate cancer characterization is limited by the generally small size and low uptake of prostate tumors. Our aim in this pilot study was to explore the feasibility of simultaneous PET/MRI to guide localization of prostate lesions for dynamic FDG analysis using a graphical approach. Three patients with biopsy-proven prostate cancer underwent simultaneous FDG PET/MRI, incorporating dynamic prostate imaging. Histology and multiparametric MRI findings were used to localize tumors, which in turn guided identification of tumors on FDG images. Regions of interest were manually placed on tumor and benign prostate tissue. Blood activity was extracted from a region of interest placed on the femoral artery on PET images. FDG data were analyzed by graphical analysis using the influx constant Ki (Patlak analysis) when FDG binding appeared irreversible, and the distribution volume VT (reversible graphical analysis) when FDG binding appeared reversible given the presence of washout. Given the inherent coregistration, simultaneous acquisition facilitated the use of MRI data to localize small lesions on PET and subsequent graphical analysis in all cases. In 2 cases with irreversible binding, tumor had higher Ki than benign tissue using Patlak analysis (0.023 vs 0.006 and 0.019 vs 0.008 mL/cm3 per minute). In 1 case appearing reversible, tumor had higher VT than benign tissue using reversible graphical analysis (0.68 vs 0.52 mL/cm3). Simultaneous PET/MRI allows localization of small prostate tumors for dynamic PET analysis. By taking advantage of the inclusion of the femoral arteries in the FOV, we applied advanced PET data analysis methods beyond conventional static measures and without blood sampling.
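
    For reference, the standard Patlak formulation behind the influx constant Ki reported above (notation assumed: tissue activity C_t, plasma input C_p, intercept V_0):

```latex
\[
  \frac{C_t(t)}{C_p(t)}
  \;=\; K_i \,\frac{\int_0^{t} C_p(\tau)\,d\tau}{C_p(t)} \;+\; V_0 ,
\]
% Plotting y = C_t/C_p against the "normalized time"
% x = \int_0^t C_p(\tau)\,d\tau / C_p(t) yields, for irreversible binding,
% a late-time straight line whose slope is the influx constant K_i.
```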

  6. A simple, sensitive graphical method of treating thermogravimetric analysis data

    Science.gov (United States)

    Abraham Broido

    1969-01-01

    Thermogravimetric Analysis (TGA) is finding increasing utility in investigations of the pyrolysis and combustion behavior of materials. Although a theoretical treatment of the TGA behavior of an idealized reaction is relatively straightforward, major complications can be introduced when the reactions are complex, e.g., in the pyrolysis of cellulose, and when...

  7. Development of Image Analysis Software of MAXI

    Science.gov (United States)

    Eguchi, S.; Ueda, Y.; Hiroi, K.; Isobe, N.; Sugizaki, M.; Suzuki, M.; Tomida, H.; Maxi Team

    2010-12-01

    Monitor of All-sky X-ray Image (MAXI) is an X-ray all-sky monitor, attached to the Japanese experiment module Kibo on the International Space Station. The main scientific goals of the MAXI mission include the discovery of X-ray novae followed by prompt alerts to the community (Negoro et al., in this conference), and production of X-ray all-sky maps and new source catalogs with unprecedented sensitivities. To extract the best capabilities of the MAXI mission, we are working on the development of detailed image analysis tools. We utilize maximum likelihood fitting to a projected sky image, where we take account of the complicated detector responses, such as the background and point spread functions (PSFs). The modeling of PSFs, which strongly depend on the orbit and attitude of MAXI, is a key element in the image analysis. In this paper, we present the status of our software development.

  8. STAR: Software Toolkit for Analysis Research

    International Nuclear Information System (INIS)

    Doak, J.E.; Prommel, J.M.; Whiteson, R.; Hoffbauer, B.L.; Thomas, T.R.; Helman, P.

    1993-01-01

    Analyzing vast quantities of data from diverse information sources is an increasingly important element of nonproliferation and arms control analysis. Much of the work in this area has used human analysts to assimilate, integrate, and interpret complex information gathered from various sources. With the advent of fast computers, we now have the capability to automate this process, thereby shifting this burden away from humans. In addition, there now exist huge data storage capabilities, which have made it possible to formulate large integrated databases comprising terabytes of information spanning a variety of subjects. We are currently designing a Software Toolkit for Analysis Research (STAR) to address these issues. The goal of STAR is to produce a research tool that facilitates the development and interchange of algorithms for locating phenomena of interest to nonproliferation and arms control experts. One major component deals with the preparation of information. The ability to manage and effectively transform raw data into a meaningful form is a prerequisite for analysis by any methodology. The relevant information to be analyzed can be unstructured text, structured data, signals, or images. Text can be numerical and/or character, stored in raw data files, databases, streams of bytes, or compressed into bits, in formats ranging from fixed, to character-delimited, to a count followed by content. The data can be analyzed in real-time or batch mode. Once the data are preprocessed, different analysis techniques can be applied. Some are built using expert knowledge. Others are trained using data collected over a period of time. Currently, we are considering three classes of analyzers for use in our software toolkit: (1) traditional machine learning techniques, (2) purely statistical systems, and (3) expert systems

  9. The ESA's Space Trajectory Analysis software suite

    Science.gov (United States)

    Ortega, Guillermo

    The European Space Agency (ESA) initiated in 2005 an internal activity to develop an open source software suite involving university science departments and research institutions all over the world. This project is called the "Space Trajectory Analysis", or STA. This article describes the birth of STA and its present configuration. One of STA's aims is to promote the exchange of technical ideas, and to raise knowledge and competence in the areas of applied mathematics, space engineering, and informatics at the university level. Conceived as a research and education tool to support the analysis phase of a space mission, STA is able to visualize a wide range of space trajectories. These include, among others, ascent, re-entry, descent and landing trajectories, orbits around planets and moons, interplanetary trajectories, rendezvous trajectories, etc. The article explains that the STA project is an original idea of the Technical Directorate of ESA. It was born in August 2005 to provide a framework for astrodynamics research at the university level. As research and education software applicable to academia, a number of universities support this development by joining ESA in leading it. The partnership between ESA and the universities is expressed in the STA Steering Board. Together with ESA, each university has a chair on the board, whose tasks are to develop, control, promote, maintain, and expand the software suite. The article describes that STA provides calculations in the fields of spacecraft tracking, attitude analysis, coverage and visibility analysis, orbit determination, position and velocity of solar system bodies, etc. STA implements the concept of a "space scenario" composed of solar system bodies, spacecraft, ground stations, pads, etc. It is able to propagate the orbit of a spacecraft, and orbital propagators are included. STA is able to compute communication links between objects of a scenario (coverage, line of sight), and to represent the trajectory computations and

  10. Computer generated multi-color graphics in whole body gamma spectral analysis

    International Nuclear Information System (INIS)

    Phillips, W.G.; Curtis, S.P.; Environmental Protection Agency, Las Vegas, NV)

    1984-01-01

    A medium resolution color graphics terminal (512 x 512 pixels) was appended to a computerized gamma spectrometer for the display of whole body counting data. The color display enhances the ability of a spectroscopist to identify at a glance multicolored spectral regions of interest, permitting immediate qualitative interpretation. Spectral data from subjects containing low concentrations of gamma emitters, obtained by both NaI(Tl) and phoswich detectors, are viewed with the method. In addition, software generates a multispectral display by which the gross, background, and net spectra are displayed in color simultaneously on a single screen

  11. PyElph - a software tool for gel images analysis and phylogenetics

    Directory of Open Access Journals (Sweden)

    Pavel Ana Brânduşa

    2012-01-01

    Full Text Available Abstract Background This paper presents PyElph, a software tool which automatically extracts data from gel images, computes the molecular weights of the analyzed molecules or fragments, compares DNA patterns which result from experiments with molecular genetic markers and, also, generates phylogenetic trees computed by five clustering methods, using the information extracted from the analyzed gel image. The software can be successfully used for population genetics, phylogenetics, taxonomic studies and other applications which require gel image analysis. Researchers and students working in molecular biology and genetics would benefit greatly from the proposed software because it is free, open source, easy to use, has a friendly Graphical User Interface and does not depend on specific image acquisition devices like other commercial programs with similar functionalities do. Results PyElph software tool is entirely implemented in Python, which is a very popular programming language among the bioinformatics community. It provides a very friendly Graphical User Interface which was designed in six steps that gradually lead to the results. The user is guided through the following steps: image loading and preparation, lane detection, band detection, molecular weights computation based on a molecular weight marker, band matching and finally, the computation and visualization of phylogenetic trees. A strong point of the software is the visualization component for the processed data. The Graphical User Interface provides operations for image manipulation and highlights lanes, bands and band matching in the analyzed gel image. All the data and images generated in each step can be saved. The software has been tested on several DNA patterns obtained from experiments with different genetic markers. Examples of genetic markers which can be analyzed using PyElph are RFLP (Restriction Fragment Length Polymorphism), AFLP (Amplified Fragment Length Polymorphism), RAPD
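
    The molecular-weight step the tool automates is commonly done by fitting log10(size) of the marker bands against migration distance and interpolating the unknown bands. A generic sketch follows (ladder sizes and pixel distances are invented; this is not PyElph's code):

```python
# Generic gel molecular-weight estimation: fit log10(size) of a known ladder
# vs. migration distance, then interpolate unknown bands. Values are made up.
import numpy as np

ladder_bp = np.array([1000.0, 750.0, 500.0, 250.0, 100.0])   # marker sizes
ladder_px = np.array([120.0, 150.0, 190.0, 250.0, 330.0])    # migration, px

slope, intercept = np.polyfit(ladder_px, np.log10(ladder_bp), 1)

unknown_px = np.array([170.0, 280.0])                        # sample bands
sizes = 10 ** (slope * unknown_px + intercept)
print("estimated sizes (bp):", np.round(sizes))
```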

  12. Graphic Organizers and Students with Learning Disabilities: A Meta-Analysis

    Science.gov (United States)

    Dexter, Douglas D.; Hughes, Charles A.

    2011-01-01

    This meta-analysis reviews experimental and quasi-experimental studies in which upper-elementary, intermediate, and secondary students with learning disabilities learned from graphic organizers. Following an exhaustive search for studies meeting specified design criteria, 55 standardized mean effect sizes were extracted from 16 articles involving…

  13. Elementary study on γ analysis software for low level measurement

    International Nuclear Information System (INIS)

    Ruan Guanglin; Huang Xianguo; Xing Shixiong

    2001-01-01

    The difficulty of using fashionable γ analysis software in low level measurement is discussed. The ROI report file of the ORTEC operating system has been chosen as the interface file for writing γ analysis software for low-level measurement. The author gives the software flowchart and an applied example, and discusses existing problems

  14. Thin-plate spline (TPS) graphical analysis of the mandible on cephalometric radiographs.

    Science.gov (United States)

    Chang, H P; Liu, P H; Chang, H F; Chang, C H

    2002-03-01

    We describe two cases of Class III malocclusion with and without orthodontic treatment. A thin-plate spline (TPS) analysis of lateral cephalometric radiographs was used to visualize transformations of the mandible. The actual sites of mandibular skeletal change are not detectable with conventional cephalometric analysis. These case analyses indicate that specific patterns of mandibular transformation are associated with Class III malocclusion with or without orthopaedic therapy, and visualization of these deformations is feasible using TPS graphical analysis.
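
    For reference, the standard two-dimensional thin-plate spline used in such morphometric analyses (notation assumed: landmarks p_i, affine coefficients a, warp weights w_i):

```latex
\[
  f(x,y) \;=\; a_0 + a_x x + a_y y
  \;+\; \sum_{i=1}^{n} w_i \, U\!\bigl(\lVert (x,y) - p_i \rVert\bigr),
  \qquad U(r) \;=\; r^2 \log r^2 ,
\]
% with the weights chosen so that f interpolates the target landmarks while
% minimizing the integrated bending energy; the resulting deformation grid is
% what visualizes where the mandible actually changes shape.
```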

  15. Visual querying and analysis of large software repositories

    NARCIS (Netherlands)

    Voinea, Lucian; Telea, Alexandru

    We present a software framework for mining software repositories. Our extensible framework enables the integration of data extraction from repositories with data analysis and interactive visualization. We demonstrate the applicability of the framework by presenting several case studies performed on

  16. Image processing and analysis software development

    International Nuclear Information System (INIS)

    Shahnaz, R.

    1999-01-01

    The work presented in this project is aimed at developing software, 'IMAGE GALLERY', to investigate various image processing and analysis techniques. The work was divided into two parts, namely image processing techniques and pattern recognition, the latter comprising character and face recognition. Various image enhancement techniques, including negative imaging, contrast stretching, compression of dynamic range, neon, diffuse, emboss etc., have been studied. Segmentation techniques including point detection, line detection and edge detection have been studied, as have some of the smoothing and sharpening filters. All these imaging techniques have been implemented in a window-based computer program written in Visual Basic. Neural network techniques based on the Perceptron model have been applied for face and character recognition. (author)
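
    As a minimal sketch of one of the enhancements listed above, linear contrast stretching rescales the observed intensity range to the full display range (illustrative Python, not the Visual Basic program described):

```python
# Minimal linear contrast stretch: map the observed intensity range to 0..255.
import numpy as np

def stretch_contrast(img: np.ndarray) -> np.ndarray:
    lo, hi = float(img.min()), float(img.max())
    if hi == lo:                      # flat image: nothing to stretch
        return np.zeros_like(img, dtype=np.uint8)
    out = (img.astype(float) - lo) * 255.0 / (hi - lo)
    return out.astype(np.uint8)

img = np.array([[60, 80], [100, 120]], dtype=np.uint8)
print(stretch_contrast(img))          # now spans 0..255
```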

  17. A new graphical method for Pinch Analysis applications: Heat exchanger network retrofit and energy integration

    International Nuclear Information System (INIS)

    Gadalla, Mamdouh A.

    2015-01-01

    Energy integration is a key solution in the chemical process and crude refining industries to minimise external fuel consumption and to face the impact of growing energy crises. Typical energy integration projects can reach a reduction of heating fuels and cold utilities by up to 40% compared with original designs or existing installations. Pinch Analysis is a leading tool, regarded as an efficient method to increase energy efficiency and minimise fuel consumption. It is valid for both natures of design: grassroots and retrofit situations. It can practically be applied to synthesise a HEN (heat exchanger network) or to modify an existing preheat train for minimum energy consumption. Heat recovery systems or HENs are networks for exchanging heat between hot and cold process sources. All heat transferred from hot process sources into cold process sinks represents the scope for energy integration. On the other hand, energies required beyond this integrated amount are to be satisfied by external utilities. Graphical representations of Pinch Analysis, such as the Composite and Grand Composite Curves, are very useful for grassroots designs. Nevertheless, in retrofit situations the analysis is not adequate, and it is graphically tedious to represent existing exchangers on such graphs. This research proposes a new graphical method for the analysis of heat recovery systems, applicable to HEN retrofit. The new graphical method is based on plotting temperatures of process hot streams versus temperatures of process cold streams. A new graph is constructed for representing existing HENs. For a given network, each existing exchanger is represented by a straight line, whose slope is proportional to the ratio of heat capacity flows. Further, the length of each exchanger line is related to the heat flow transferred across this exchanger. This new graphical representation can easily identify exchangers across the pinch, Network Pinch, pinching matches and improper placement
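
    The slope property follows from the heat balance across a single exchanger; a sketch in standard pinch notation, with CP the heat capacity flow rate (the paper's exact axis convention may differ):

```latex
\[
  Q \;=\; CP_h\,\bigl(T_{h,\mathrm{in}} - T_{h,\mathrm{out}}\bigr)
    \;=\; CP_c\,\bigl(T_{c,\mathrm{out}} - T_{c,\mathrm{in}}\bigr),
\]
% so on a plot of hot-stream temperature against cold-stream temperature an
% exchanger traces a straight line with slope
\[
  \frac{\Delta T_h}{\Delta T_c} \;=\; \frac{CP_c}{CP_h},
\]
% and the line's extent grows with the exchanger duty Q.
```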

  18. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    Science.gov (United States)

    Sun, Z. J.; Wells, D.; Segebade, C.; Green, J.

    2011-06-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. To save time and manpower and to minimize error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it usable anywhere the internet is accessible. By switching the underlying nuclide library and the related formulae, the new software can easily be expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementing this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.

  19. Standardizing Activation Analysis: New Software for Photon Activation Analysis

    International Nuclear Information System (INIS)

    Sun, Z. J.; Wells, D.; Green, J.; Segebade, C.

    2011-01-01

    Photon Activation Analysis (PAA) of environmental, archaeological and industrial samples requires extensive data analysis that is susceptible to error. To save time and manpower and to minimize error, a computer program was designed, built and implemented using SQL, Access 2007 and asp.net technology to automate this process. Based on the peak information of the spectrum and assisted by its PAA library, the program automatically identifies elements in the samples and calculates their concentrations and respective uncertainties. The software can also be operated in browser/server mode, which makes it usable anywhere the internet is accessible. By switching the underlying nuclide library and the related formulae, the new software can easily be expanded to neutron activation analysis (NAA), charged particle activation analysis (CPAA) or proton-induced X-ray emission (PIXE). Implementing this would standardize the analysis of nuclear activation data. Results from this software were compared to standard PAA analysis with excellent agreement. With minimum input from the user, the software has proven to be fast, user-friendly and reliable.

  20. Graphic notation

    DEFF Research Database (Denmark)

    Bergstrøm-Nielsen, Carl

    2010-01-01

    Graphic notation is taught to music therapy students at Aalborg University in both simple and elaborate forms. This is a method of depicting music visually, and notations may serve as memory aids, as aids for analysis and reflection, and for communication purposes such as supervision or within...

  1. Spike-train acquisition, analysis and real-time experimental control using a graphical programming language (LabView).

    Science.gov (United States)

    Nordstrom, M A; Mapletoft, E A; Miles, T S

    1995-11-01

    A solution is described for the acquisition on a personal computer of standard pulses derived from neuronal discharge, measurement of neuronal discharge times, real-time control of stimulus delivery based on specified inter-pulse interval conditions in the neuronal spike train, and on-line display and analysis of the experimental data. The hardware consisted of an Apple Macintosh IIci computer and a plug-in card (National Instruments NB-MIO16) that supports A/D, D/A, digital I/O and timer functions. The software was written in the object-oriented graphical programming language LabView. Essential elements of the source code of the LabView program are presented and explained. The use of the system is demonstrated in an experiment in which the reflex responses to muscle stretch are assessed for a single motor unit in the human masseter muscle.
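
    The original system is implemented in the graphical language LabView; as a hedged rendering in Python of the same inter-pulse-interval trigger logic it describes, with invented interval bounds and spike times:

```python
# Python rendering of an inter-pulse-interval (IPI) trigger: deliver a
# stimulus only when the latest interval falls inside a specified window.
# Interval bounds and pulse times are invented; the original runs in LabView.
def make_ipi_trigger(ipi_min_s, ipi_max_s, stimulate):
    last_spike = [None]                    # time of the previous pulse

    def on_spike(t_s):
        if last_spike[0] is not None:
            ipi = t_s - last_spike[0]
            if ipi_min_s <= ipi <= ipi_max_s:
                stimulate(t_s)             # condition met: fire stimulator
        last_spike[0] = t_s

    return on_spike

trigger = make_ipi_trigger(0.050, 0.080,
                           lambda t: print(f"stimulus at {t:.3f} s"))
for t in [0.000, 0.062, 0.200, 0.450]:     # example pulse times (s)
    trigger(t)
```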

  2. HeteroGenius: A Framework for Hybrid Analysis of Heterogeneous Software Specifications

    Directory of Open Access Journals (Sweden)

    Manuel Giménez

    2014-01-01

    Full Text Available Nowadays, software artifacts are ubiquitous in our lives, being an essential part of home appliances, cars and cell phones, and even of more critical activities like aeronautics and the health sciences. In this context, software failures may produce enormous losses, either economic or, in the worst case, in human lives. Software analysis is an area of software engineering concerned with the application of diverse techniques in order to prove the absence of errors in software pieces. In many cases different analysis techniques are applied by following specific methodological combinations that ensure better results. These interactions between tools are usually carried out at the user level and are not supported by the tools themselves. In this work we present HeteroGenius, a framework conceived to develop tools that allow users to perform hybrid analysis of heterogeneous software specifications. HeteroGenius was designed prioritising the possibility of adding new specification languages and analysis tools, and enabling a synergic relation of the techniques under a graphical interface satisfying several well-known usability-enhancement criteria. As a case study we implemented the functionality of Dynamite on top of HeteroGenius.

  3. LANDSAFE: LANDING SITE RISK ANALYSIS SOFTWARE FRAMEWORK

    Directory of Open Access Journals (Sweden)

    R. Schmidt

    2012-08-01

    Full Text Available The European Space Agency (ESA) is planning a Lunar Lander mission in the 2018 timeframe that will demonstrate precise soft landing at the polar regions of the Moon. To ensure a safe and successful landing, a careful risk analysis has to be carried out. This comprises identifying favorable target areas and evaluating the surface conditions in these areas. Features like craters, boulders, steep slopes, rough surfaces and shadow areas have to be identified in order to assess the risk associated with a landing site in terms of a successful touchdown and subsequent surface operation of the lander. In addition, global illumination conditions at the landing site have to be simulated and analyzed. The Landing Site Risk Analysis software framework (LandSAfe) is a system for the analysis, selection and certification of safe landing sites on the lunar surface. LandSAfe generates several data products including high resolution digital terrain models (DTMs), hazard maps, illumination maps, temperature maps and surface reflectance maps which assist the user in evaluating potential landing site candidates. This paper presents the LandSAfe system and describes the methods and products of the different modules. For one candidate landing site on the rim of Shackleton crater at the south pole of the Moon a high resolution DTM is showcased.

  4. The Supertree Toolkit 2: a new and improved software package with a Graphical User Interface for supertree construction

    Directory of Open Access Journals (Sweden)

    Jon Hill

    2014-03-01

    Full Text Available Building large supertrees involves the collection, storage, and processing of thousands of individual phylogenies to create large phylogenies with thousands to tens of thousands of taxa. Such large phylogenies are useful for macroevolutionary studies, comparative biology and in conservation and biodiversity. No easy to use and fully integrated software package currently exists to carry out this task. Here, we present a new Python-based software package that uses well defined XML schema to manage both data and metadata. It builds on previous versions by 1) including new processing steps, such as Safe Taxonomic Reduction, 2) using a user-friendly GUI that guides the user to complete at least the minimum information required and includes context-sensitive documentation, and 3) a revised storage format that integrates both tree- and meta-data into a single file. These data can then be manipulated according to a well-defined, but flexible, processing pipeline using either the GUI or a command-line based tool. Processing steps include standardising names, deleting or replacing taxa, ensuring adequate taxonomic overlap, ensuring data independence, and safe taxonomic reduction. This software has been successfully used to store and process data consisting of over 1000 trees ready for analyses using standard supertree methods. This software makes large supertree creation a much easier task and provides far greater flexibility for further work.

  5. The Supertree Toolkit 2: a new and improved software package with a Graphical User Interface for supertree construction.

    Science.gov (United States)

    Hill, Jon; Davis, Katie E

    2014-01-01

    Building large supertrees involves the collection, storage, and processing of thousands of individual phylogenies to create large phylogenies with thousands to tens of thousands of taxa. Such large phylogenies are useful for macroevolutionary studies, comparative biology and in conservation and biodiversity. No easy to use and fully integrated software package currently exists to carry out this task. Here, we present a new Python-based software package that uses well defined XML schema to manage both data and metadata. It builds on previous versions by 1) including new processing steps, such as Safe Taxonomic Reduction, 2) using a user-friendly GUI that guides the user to complete at least the minimum information required and includes context-sensitive documentation, and 3) a revised storage format that integrates both tree- and meta-data into a single file. These data can then be manipulated according to a well-defined, but flexible, processing pipeline using either the GUI or a command-line based tool. Processing steps include standardising names, deleting or replacing taxa, ensuring adequate taxonomic overlap, ensuring data independence, and safe taxonomic reduction. This software has been successfully used to store and process data consisting of over 1000 trees ready for analyses using standard supertree methods. This software makes large supertree creation a much easier task and provides far greater flexibility for further work.

  6. SimHap GUI: an intuitive graphical user interface for genetic association analysis.

    Science.gov (United States)

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-12-25

    Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that would allow anyone but a professional statistician to effectively utilise the tool. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimation of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.

  7. Software patterns, knowledge maps, and domain analysis

    CERN Document Server

    Fayad, Mohamed E; Hegde, Srikanth GK; Basia, Anshu; Vakil, Ashka

    2014-01-01

    Preface; Acknowledgments; Authors. Introduction: An Overview of Knowledge Maps. Key concepts: Software Stable Models, Knowledge Maps, Pattern Language, Goals, Capabilities (Enduring Business Themes + Business Objects); The Motivation; The Problem; The Objectives; Overview of Software Stability Concepts; Overview of Knowledge Maps; Pattern Languages versus Knowledge Maps: A Brief Comparison; The Solution: Knowledge Maps Methodology or Concurrent Software Development Model; Why Knowledge Maps?; Research Methodology Undertaken; Research Verification and Validation; The Stratification of This Book; Summary

  8. Graphical Derivatives and Stability Analysis for Parameterized Equilibria with Conic Constraints

    Czech Academy of Sciences Publication Activity Database

    Mordukhovich, B. S.; Outrata, Jiří; Ramírez, H. C.

    2015-01-01

    Roč. 23, č. 4 (2015), s. 687-704 ISSN 1877-0533 R&D Projects: GA ČR(CZ) GAP201/12/0671 Institutional support: RVO:67985556 Keywords : Variational analysis and optimization * Parameterized equilibria * Conic constraints * Sensitivity and stability analysis * Solution maps * Graphical derivatives * Normal and tangent cones Subject RIV: BA - General Mathematics Impact factor: 0.973, year: 2015 http://library.utia.cas.cz/separaty/2015/MTR/outrata-0449259.pdf

  9. A Graphical Adversarial Risk Analysis Model for Oil and Gas Drilling Cybersecurity

    OpenAIRE

    Vieira, Aitor Couce; Houmb, Siv Hilde; Insua, David Rios

    2014-01-01

    Oil and gas drilling relies increasingly on operational technology, whose cybersecurity is complicated by several challenges. We propose a graphical model for cybersecurity risk assessment based on Adversarial Risk Analysis to address those challenges. We also provide an example of the model in the context of an offshore drilling rig. The proposed model provides a more formal and comprehensive analysis of risks, while still using the standard business language of decisions, risks, and value.

  10. A Graphical Adversarial Risk Analysis Model for Oil and Gas Drilling Cybersecurity

    Directory of Open Access Journals (Sweden)

    Aitor Couce Vieira

    2014-04-01

    Oil and gas drilling relies increasingly on operational technology, whose cybersecurity is complicated by several challenges. We propose a graphical model for cybersecurity risk assessment based on Adversarial Risk Analysis to address those challenges. We also provide an example of the model in the context of an offshore drilling rig. The proposed model provides a more formal and comprehensive analysis of risks, while still using the standard business language of decisions, risks, and value.

  11. Exploring the field of public construction clients by a graphical network analysis

    OpenAIRE

    Eisma, P.R.; Volker, L.

    2014-01-01

    Because public construction clients form the majority of construction clients and procure over 40% of the construction output in most countries, they are important actors in the construction industry. Yet, the field of research on clients is still underdeveloped. In order to identify the research gaps in this field, a graphical network analysis of existing literature is performed. The analysis is based on a query executed in the scientific database Scopus resulting in around 3,300 publication...

  12. Social network analysis in software process improvement

    DEFF Research Database (Denmark)

    Nielsen, Peter Axel; Tjørnehøj, Gitte

    2010-01-01

    Software process improvement in small organisations is often problematic, and communication and knowledge sharing are more informal. To improve software processes, we need to understand how practitioners communicate and share knowledge. In this article we have studied the company SmallSoft through action research...

  13. Graphics and Statistics for Cardiology: Data visualisation for meta-analysis.

    Science.gov (United States)

    Kiran, Amit; Crespillo, Abel Pérez; Rahimi, Kazem

    2017-01-01

    Graphical displays play a pivotal role in understanding data sets and disseminating results. For meta-analysis, they are instrumental in presenting findings from multiple studies. This report presents guidance to authors wishing to submit graphical displays as part of their meta-analysis to a clinical cardiology journal, such as Heart. When using graphical displays for meta-analysis, we recommend the following: use a flow diagram to describe the number of studies returned from the initial search, the inclusion/exclusion criteria applied, and the final number of studies used in the meta-analysis; present results from the meta-analysis using a figure that incorporates a forest plot and the underlying (tabulated) statistics, including a test for heterogeneity; use displays such as the funnel plot (minimum 10 studies) and the Galbraith plot to visually present the distribution of effect sizes or associations in order to evaluate small-study effects and publication bias; for meta-regression, the bubble plot is a useful display for assessing associations by study-level factors; and always perform final checks on graphs, such as appropriate use of axis scale, line pattern, text size and graph resolution.

  14. Leaf extraction and analysis framework graphical user interface: segmenting and analyzing the structure of leaf veins and areoles.

    Science.gov (United States)

    Price, Charles A; Symonova, Olga; Mileyko, Yuriy; Hilley, Troy; Weitz, Joshua S

    2011-01-01

    Interest in the structure and function of physical biological networks has spurred the development of a number of theoretical models that predict optimal network structures across a broad array of taxonomic groups, from mammals to plants. In many cases, direct tests of predicted network structure are impossible given the lack of suitable empirical methods to quantify physical network geometry with sufficient scope and resolution. There is a long history of empirical methods to quantify the network structure of plants, from roots to xylem networks in shoots and within leaves. However, with few exceptions, current methods emphasize the analysis of portions of networks rather than entire networks. Here, we introduce the Leaf Extraction and Analysis Framework Graphical User Interface (LEAF GUI), a user-assisted software tool that facilitates improved empirical understanding of leaf network structure. LEAF GUI takes images of leaves in which veins have been enhanced relative to the background and, following a series of interactive thresholding and cleaning steps, returns a suite of statistics and information on the structure of leaf venation networks and areoles. Metrics include the dimensions, position, and connectivity of all network veins, and the dimensions, shape, and position of the areoles they surround. Available for free download, the LEAF GUI software promises to facilitate improved understanding of the adaptive and ecological significance of leaf vein network structure.
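
    As a rough illustration of the segmentation steps described, here is a minimal Python sketch using scikit-image (not the LEAF GUI code itself; the synthetic "leaf" image is invented for the example):

        # Minimal sketch: threshold a vein-enhanced image, skeletonize the vein
        # network, and count areoles as background regions enclosed by veins.
        import numpy as np
        from skimage.filters import threshold_otsu
        from skimage.measure import label
        from skimage.morphology import skeletonize

        rng = np.random.default_rng(0)
        image = rng.random((200, 200)) * 0.3         # stand-in for a leaf scan
        image[95:105, :] = 1.0                       # fake horizontal "vein"
        image[:, 95:105] = 1.0                       # fake vertical "vein"

        veins = image > threshold_otsu(image)        # binary vein network
        skeleton = skeletonize(veins)                # 1-pixel-wide centrelines
        areoles = label(~veins, connectivity=1)      # enclosed background regions
        print("vein pixels:", int(veins.sum()),
              "skeleton pixels:", int(skeleton.sum()),
              "areole count:", int(areoles.max()))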

  15. Design of a software for calculating isoelectric point of a polypeptide according to their net charge using the graphical programming language LabVIEW.

    Science.gov (United States)

    Tovar, Glomen

    2018-01-01

    A software package to calculate the net charge and predict the isoelectric point (pI) of a polypeptide is developed in this work using the graphical programming language LabVIEW. Through this instrument the net charges of the ionizable residues of the polypeptide chains of proteins are calculated at different pH values and tabulated, the pI is predicted, and an Excel-type (.xls) file is generated. The experimental pI values of different proteins are compared with the graphically calculated pI values, achieving a correlation coefficient (R) of 0.934746, which represents good reliability. The program can serve as an instrument in the laboratory, facilitating the calculation for graduate students and junior researchers. © 2017 The International Union of Biochemistry and Molecular Biology, 46(1):39-46, 2018.
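
    The underlying arithmetic is standard; here is a minimal Python sketch (not the LabVIEW code) of the net charge from Henderson-Hasselbalch terms and the pI found by bisection, assuming one common textbook pKa set (values vary by source):

        # Net charge of a polypeptide at a given pH, and pI by bisection on the
        # monotonically decreasing charge-vs-pH curve. pKa values are one common
        # textbook set; different sources use slightly different values.
        PKA_POS = {"K": 10.5, "R": 12.5, "H": 6.0, "Nterm": 9.0}          # protonated -> +1
        PKA_NEG = {"D": 3.9, "E": 4.1, "C": 8.3, "Y": 10.1, "Cterm": 2.0} # deprotonated -> -1

        def net_charge(seq, ph):
            charge = 1.0 / (1.0 + 10 ** (ph - PKA_POS["Nterm"]))
            charge -= 1.0 / (1.0 + 10 ** (PKA_NEG["Cterm"] - ph))
            for aa in seq:
                if aa in PKA_POS:
                    charge += 1.0 / (1.0 + 10 ** (ph - PKA_POS[aa]))
                elif aa in PKA_NEG:
                    charge -= 1.0 / (1.0 + 10 ** (PKA_NEG[aa] - ph))
            return charge

        def isoelectric_point(seq, lo=0.0, hi=14.0, tol=1e-4):
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                if net_charge(seq, mid) > 0:
                    lo = mid        # still positive: pI lies at higher pH
                else:
                    hi = mid
            return round(0.5 * (lo + hi), 2)

        print(isoelectric_point("ACDEFGHIKLMNPQRSTVWY"))

    Because the net charge decreases monotonically with pH, bisection on its sign is guaranteed to converge on the pI.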

  16. Application of standard software of the CDC-6500 interactive graphic terminal for representation of spiral scanning data and event saving

    International Nuclear Information System (INIS)

    Nehrguj, B.; Ososkov, G.A.

    1978-01-01

    A system of programs, based on standard graphic display software, has been developed which enables the user to display the results of spiral scanning using a terminal keyboard and a FILTR program. Quality assessment of the filtering is also available. The use of a cursor, which provides a feedback between the display and the CDC-6500 computer, gives good capabilities for investigating failures of the filtering program and for saving the most interesting events. To speed up the scanning of events, a special program was written which performs pre-filtering and reduces the amount of source numerical data of track projections by a factor of 4 to 5. Its flowchart is based on the well-known method of chords, which allows 10-12 events/hour to be saved

  17. Meta-Analyst: software for meta-analysis of binary, continuous and diagnostic data

    Directory of Open Access Journals (Sweden)

    Schmid Christopher H

    2009-12-01

    Background: Meta-analysis is increasingly used as a key source of evidence synthesis to inform clinical practice. The theory and statistical foundations of meta-analysis continually evolve, providing solutions to many new and challenging problems. In practice, most meta-analyses are performed in general statistical packages or dedicated meta-analysis programs. Results: Herein, we introduce Meta-Analyst, a novel, powerful, intuitive, and free meta-analysis program for the meta-analysis of a variety of problems. Meta-Analyst is implemented in C# atop the Microsoft .NET framework, and features a graphical user interface. The software performs several meta-analysis and meta-regression models for binary and continuous outcomes, as well as analyses for diagnostic and prognostic test studies in the frequentist and Bayesian frameworks. Moreover, Meta-Analyst includes a flexible tool to edit and customize generated meta-analysis graphs (e.g., forest plots) and provides output in many formats (images, Adobe PDF, Microsoft Word-ready RTF). The software architecture employed allows for rapid changes to be made to either the graphical user interface (GUI) or the analytic modules. We verified the numerical precision of Meta-Analyst by comparing its output with that from standard meta-analysis routines in Stata over a large database of 11,803 meta-analyses of binary outcome data and 6,881 meta-analyses of continuous outcome data from the Cochrane Library of Systematic Reviews. Results from analyses of diagnostic and prognostic test studies have been verified in a limited number of meta-analyses against MetaDisc and MetaTest. Bayesian statistical analyses use the OpenBUGS calculation engine (and are thus as accurate as the standalone OpenBUGS software). Conclusion: We have developed and validated a new program for conducting meta-analyses that combines the advantages of existing software for this task.
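
    For orientation, a minimal Python sketch of the core arithmetic behind such binary-outcome pooling (fixed-effect, inverse-variance weighting of log odds ratios; the study counts are illustrative, not from the paper):

        # Fixed-effect pooling of log odds ratios over (events, total) per arm.
        import math

        studies = [(12, 100, 20, 100), (8, 50, 15, 55), (30, 200, 45, 210)]

        weights, effects = [], []
        for a, n1, c, n2 in studies:
            b, d = n1 - a, n2 - c
            log_or = math.log((a * d) / (b * c))
            var = 1/a + 1/b + 1/c + 1/d          # Woolf's variance of the log OR
            effects.append(log_or)
            weights.append(1.0 / var)

        pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
        se = math.sqrt(1.0 / sum(weights))
        print(f"pooled OR = {math.exp(pooled):.3f} "
              f"(95% CI {math.exp(pooled - 1.96*se):.3f}-{math.exp(pooled + 1.96*se):.3f})")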

  18. Development of a gamma ray spectrometry software for neutron activation analysis using the open source concept

    International Nuclear Information System (INIS)

    Lucia, Silvio Rogerio de; Maihara, Vera Akiko; Menezes, Mario O. de

    2009-01-01

    In this work, a new software package - SAANI (Instrumental Neutron Activation Analysis Software) - was developed and used for gamma-ray spectrum analysis in the Neutron Activation Laboratory (LAN) of the Nuclear and Energetic Research Institute (IPEN-CNEN/SP). The software was developed to completely replace the old one, VISPECT. Besides the visual improvement in the user interface, the new software allows the standardization of several procedures which are nowadays done in different ways by each researcher, avoiding intermediate steps in the calculations. By using a modern programming language, Python, together with the graphical library Qt (by Trolltech), both multi-platform, the new software is able to run on Windows, Linux and other platforms. In addition, the new software has been designed to be extensible through plug-ins. In order to achieve the proposed initial scope, that is, to completely replace the old software, SAANI has undergone several different kinds of tests, using spectra from certified reference materials, standards, and common spectra already analyzed by other software or used in international inter-comparisons. The results obtained by SAANI in all tests were considered very good. Some small discrepancies were found, and after careful investigation their source was identified as an accuracy bug in the old software. Usability and robustness tests were conducted by installing SAANI on several laboratory computers and following its daily use. The results of these tests also indicated that SAANI was ready to be used by all researchers in LAN-IPEN. (author)

  19. Graphical analysis of NMR structural quality and interactive contact map of NOE assignments in ARIA

    Directory of Open Access Journals (Sweden)

    Malliavin Thérèse E

    2008-06-01

    Background: The Ambiguous Restraints for Iterative Assignment (ARIA) approach is widely used for NMR structure determination. It is based on simultaneously calculating structures and assigning NOEs through an iterative protocol. The final solution consists of a set of conformers and a list of the most probable assignments for the input NOE peak list. Results: ARIA was extended with a series of graphical tools to facilitate a detailed analysis of the intermediate and final results of the ARIA protocol. These additional features provide (i) an interactive contact map, serving as a tool for the analysis of assignments, and (ii) graphical representations of structure quality scores and restraint statistics. The interactive contact map between residues can be clicked to obtain information about the restraints and their contributions. Profiles of quality scores are plotted along the protein sequence, and contact maps provide information on the agreement with the data at the residue-pair level. Conclusion: The graphical tools and outputs described here significantly extend the validation and analysis possibilities for NOE assignments given by ARIA, as well as the analysis of the quality of the final structure ensemble. These tools are included in the latest version of ARIA, which is available at http://aria.pasteur.fr. The Web site also contains an installation guide, a user manual and example calculations.

  20. mcaGUI: microbial community analysis R-Graphical User Interface (GUI).

    Science.gov (United States)

    Copeland, Wade K; Krishnan, Vandhana; Beck, Daniel; Settles, Matt; Foster, James A; Cho, Kyu-Chul; Day, Mitch; Hickey, Roxana; Schütte, Ursel M E; Zhou, Xia; Williams, Christopher J; Forney, Larry J; Abdo, Zaid

    2012-08-15

    Microbial communities have an important role in natural ecosystems and have an impact on animal and human health. Intuitive graphic and analytical tools that can facilitate the study of these communities are in short supply. This article introduces Microbial Community Analysis GUI (mcaGUI), a graphical user interface for the R programming language (R Development Core Team, 2010). With this application, researchers can input aligned and clustered sequence data to create custom abundance tables and perform analyses specific to their needs. This GUI provides a flexible modular platform, expandable to include other statistical tools for microbial community analysis in the future. The mcaGUI package and source are freely available as part of Bioconductor at http://www.bioconductor.org/packages/release/bioc/html/mcaGUI.html
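
    The abundance-table step is easy to picture; a minimal Python/pandas sketch (the input format is invented for the example; mcaGUI itself is R-based):

        # From per-sequence cluster assignments to a sample-by-OTU count table.
        import pandas as pd

        assignments = pd.DataFrame({
            "sample": ["s1", "s1", "s1", "s2", "s2", "s3"],
            "otu":    ["otu1", "otu1", "otu2", "otu2", "otu3", "otu1"],
        })
        abundance = pd.crosstab(assignments["sample"], assignments["otu"])
        relative = abundance.div(abundance.sum(axis=1), axis=0)  # per-sample fractions
        print(abundance, relative, sep="\n\n")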

  1. DIII-D Thomson Scattering Diagnostic Data Acquisition, Processing and Analysis Software

    International Nuclear Information System (INIS)

    Middaugh, K.R.; Bray, B.D.; Hsieh, C.L.; McHarg, B.B.Jr.; Penaflor, B.G.

    1999-01-01

    One of the diagnostic systems critical to the success of the DIII-D tokamak experiment is the Thomson scattering diagnostic. This diagnostic is unique in that it measures local electron temperature and density: (1) at multiple locations within the tokamak plasma; and (2) at different times throughout the plasma duration. Thomson 'raw' data are digitized signals of scattered light, measured at different times and locations, from the laser beam paths fired into the plasma. Real-time acquisition of this data is performed by specialized hardware. Once obtained, the raw data are processed into meaningful temperature and density values which can be analyzed for measurement quality. This paper will provide an overview of the entire Thomson scattering diagnostic software and will focus on the data acquisition, processing, and analysis software implementation. The software falls into three general categories: (1) Set-up and Control: initializes and controls all Thomson hardware and software, synchronizes with other DIII-D computers, and invokes other Thomson software as appropriate. (2) Data Acquisition and Processing: obtains raw measured data from memory and processes it into temperature and density values. (3) Analysis: provides a graphical user interface in which to perform analysis and sophisticated plotting of analysis parameters

  2. Structural zooming research and development of an interactive computer graphical interface for stress analysis of cracks

    Science.gov (United States)

    Gerstle, Walter

    1989-01-01

    Engineering problems sometimes involve the numerical solution of boundary value problems over domains containing geometric features with widely varying scales. Often, a detailed solution is required at one or more of these features. Small details in large structures may have profound effects upon global performance. Conversely, large-scale conditions may affect local performance. Many man-hours and CPU-hours are currently spent in modeling such problems. With the structural zooming technique, it is now possible to design an integrated program which allows the analyst to interactively focus upon a small region of interest, to modify the local geometry, and then to obtain highly accurate responses in that region which reflect both the properties of the overall structure and the local detail. A boundary integral equation analysis program, called BOAST, was recently developed for the stress analysis of cracks. This program can accurately analyze two-dimensional linear elastic fracture mechanics problems with far less computational effort than existing finite element codes. An interactive computer graphical interface to BOAST was written. The graphical interface had several requirements: it would be menu-driven, with mouse input; all aspects of input would be entered graphically; the results of a BOAST analysis would be displayed pictorially, but the user would also be able to probe interactively to get numerical values of displacement and stress at desired locations within the analysis domain; the entire procedure would be integrated into a single, easy-to-use package; and it would be written using calls to the graphics package called HOOPS. The program is nearing completion. All of the preprocessing features are working satisfactorily and were debugged. The postprocessing features are under development, and rudimentary postprocessing should be available by the end of the summer. The program was developed and run on a VAX workstation, and must be ported to the SUN

  3. COMPARATIVE ANALYSIS OF SOFTWARE DEVELOPMENT MODELS

    OpenAIRE

    Sandeep Kaur*

    2017-01-01

    No geek is unfamiliar with the concept of the software development life cycle (SDLC). This research deals with the various SDLC models, covering the waterfall, spiral, iterative, agile, V-shaped and prototype models. In the modern era, all software systems are fallible, as none can stand with certainty. So all aspects of the various models, with their pros and cons, are compared so that it is easy to choose a particular model when the need arises

  4. Interactive graphical system for small-angle scattering analysis of polydisperse systems

    International Nuclear Information System (INIS)

    Konarev, P V; Volkov, V V; Svergun, D I

    2016-01-01

    A program suite for one-dimensional small-angle scattering analysis of polydisperse systems and multiple data sets is presented. The main program, POLYSAS, has a menu-driven graphical user interface calling computational modules from the ATSAS package to perform data treatment and analysis. The graphical menu interface allows one to process multiple (time-, concentration- or temperature-dependent) data sets and interactively change the parameters for the data modelling using sliders. The graphical representation of the data is done via the Winteracter-based program SASPLOT. The package is designed for the analysis of polydisperse systems and mixtures, and permits one to obtain size distributions and evaluate the volume fractions of the components using linear and non-linear fitting algorithms as well as model-independent singular value decomposition. The use of the POLYSAS package is illustrated by recent examples of its application to the study of concentration-dependent oligomeric states of proteins and the time kinetics of polymer micelles for anticancer drug delivery. (paper)
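
    A minimal Python sketch of the linear mixture idea (not the POLYSAS code; the component curves are toy functions): the measured curve of a mixture is modelled as a linear combination of known component curves, and volume fractions are recovered by non-negative least squares.

        # Recover component volume fractions from a mixture scattering curve.
        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(0)
        q = np.linspace(0.01, 0.5, 200)
        comp1 = np.exp(-(q * 30) ** 2 / 3)           # toy Guinier-like curves
        comp2 = np.exp(-(q * 15) ** 2 / 3)
        A = np.column_stack([comp1, comp2])

        true_fractions = np.array([0.3, 0.7])
        I_mix = A @ true_fractions + rng.normal(0, 1e-3, q.size)

        fractions, _ = nnls(A, I_mix)                # non-negative least squares
        print(fractions / fractions.sum())           # ~ [0.3, 0.7]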

  5. CAX: a software for automated spectrum analysis

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.

    2017-01-01

    In this work, the scripting capabilities of Genie-2000 were used to develop software that automatically analyses all spectrum files in either Ortec's CHN or Canberra's MCA or CNF formats in a folder, generating two output files: a print-ready text file (.DAT) and a Comma-Separated Values (.CSV) file which can be easily imported into any major spreadsheet software. This software, named CAX ('Convert and Analyse for eXcel'), uses Genie-2000's functions to import spectrum files into Genie's native CNF format and analyze the converted spectra. The software can also, if requested, import energy and FWHM calibrations from a stored calibrated spectrum. The print-ready output file (.DAT) is generated by Genie-2000 using a customized script, and the CSV file is generated by a custom-built DAT2CSV program which produces a CSV file that complies with the Brazilian standards, with commas as the decimal indicator and semicolons as field separators. This software is already used in the daily routines of IPEN's Neutron Activation Laboratory, greatly reducing the time required for sample analyses as well as the possibility of transcription errors. (author)
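
    The CSV convention described is easy to reproduce; a minimal Python sketch (the field layout is invented for the example) that writes semicolon-separated fields with comma decimal marks:

        # Write a Brazilian-standard CSV: ';' field separator, ',' decimal mark.
        import csv

        rows = [("Cs-137", 661.66, 12345.6), ("Co-60", 1332.49, 6789.0)]

        with open("report.csv", "w", newline="", encoding="utf-8") as fh:
            writer = csv.writer(fh, delimiter=";")
            writer.writerow(["nuclide", "energy_keV", "area"])
            for nuclide, energy, area in rows:
                writer.writerow([nuclide,
                                 f"{energy:.2f}".replace(".", ","),
                                 f"{area:.1f}".replace(".", ",")])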

  6. CAX: a software for automated spectrum analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zahn, Guilherme S.; Genezini, Frederico A., E-mail: gzahn@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (CRPq/IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro do Reator de Pesquisas

    2017-11-01

    In this work, the scripting capabilities of Genie-2000 were used to develop software that automatically analyses all spectrum files in either Ortec's CHN or Canberra's MCA or CNF formats in a folder, generating two output files: a print-ready text file (.DAT) and a Comma-Separated Values (.CSV) file which can be easily imported into any major spreadsheet software. This software, named CAX ('Convert and Analyse for eXcel'), uses Genie-2000's functions to import spectrum files into Genie's native CNF format and analyze the converted spectra. The software can also, if requested, import energy and FWHM calibrations from a stored calibrated spectrum. The print-ready output file (.DAT) is generated by Genie-2000 using a customized script, and the CSV file is generated by a custom-built DAT2CSV program which produces a CSV file that complies with the Brazilian standards, with commas as the decimal indicator and semicolons as field separators. This software is already used in the daily routines of IPEN's Neutron Activation Laboratory, greatly reducing the time required for sample analyses as well as the possibility of transcription errors. (author)

  7. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    Science.gov (United States)

    Pal, P; Kumar, R; Srivastava, N; Chaudhuri, J

    2014-02-01

    A Visual Basic simulation software package (WATTPPA) has been developed to analyse the performance of an advanced wastewater treatment plant. This user-friendly, menu-driven software is based on a dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological and membrane-based unit operations. The software-predicted results agree very well with the experimental findings, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization, and exhibits the performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as of the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for hazardous wastewater.

  8. Development of a New VLBI Data Analysis Software

    Science.gov (United States)

    Bolotin, Sergei; Gipson, John M.; MacMillan, Daniel S.

    2010-01-01

    We present an overview of new VLBI analysis software under development at NASA GSFC. The new software will replace CALC/SOLVE and many related utility programs. It will have the capabilities of the current system and will also incorporate new models and data analysis techniques. In this paper we give a conceptual overview of the new software and formulate its main goals. The software should be flexible and modular, to implement models and estimation techniques that exist currently or will appear in the future. On the other hand, it should be reliable and of production quality for processing standard VLBI sessions. It also needs to be capable of processing observations from a fully deployed network of VLBI2010 stations in a reasonable time. We describe the software development process and outline the software architecture.

  9. Continuous software quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The software for the ATLAS experiment on the Large Hadron Collider at CERN has evolved over many years to meet the demands of Monte Carlo simulation, particle detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by an active worldwide developer community. In order to run the experiment software efficiently at hundreds of computing centres it is essential to maintain a high level of software quality standards. Methods are proposed to improve software quality practices by incorporating checks into the new ATLAS software build infrastructure.

  10. Contribution to the sample mean plot for graphical and numerical sensitivity analysis

    International Nuclear Information System (INIS)

    Bolado-Lavin, R.; Castaings, W.; Tarantola, S.

    2009-01-01

    The contribution to the sample mean plot, originally proposed by Sinclair, is revived and further developed as a practical tool for global sensitivity analysis. The potential of this simple and versatile graphical tool is discussed. Beyond the qualitative assessment provided by this approach, a statistical test is proposed for sensitivity analysis. A case study that simulates the transport of radionuclides through the geosphere from an underground disposal vault containing nuclear waste is considered as a benchmark. The new approach is tested against a very efficient sensitivity analysis method based on state-dependent parameter meta-modelling
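
    A minimal Python sketch of how such a contribution-to-the-sample-mean (CSM) curve can be built from a plain Monte Carlo sample (the test function is invented): sort the runs by one input and plot the running share of the output sum against that input's sample quantile; a curve close to the diagonal suggests little influence.

        # CSM plot from a Monte Carlo sample of inputs x and output y.
        import numpy as np
        import matplotlib.pyplot as plt

        rng = np.random.default_rng(0)
        x = rng.uniform(size=(1000, 2))              # two inputs
        y = 5 * x[:, 0] ** 2 + 0.1 * x[:, 1]         # output dominated by input 0

        for j in range(2):
            order = np.argsort(x[:, j])
            csm = np.cumsum(y[order]) / y.sum()      # running fraction of the mean
            quantiles = np.arange(1, len(y) + 1) / len(y)
            plt.plot(quantiles, csm, label=f"input {j}")
        plt.plot([0, 1], [0, 1], "k--", label="no influence")
        plt.xlabel("input sample quantile"); plt.ylabel("fraction of output sum")
        plt.legend(); plt.show()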

  11. The use of computer-generated color graphic images for transient thermal analysis. [for hypersonic aircraft

    Science.gov (United States)

    Edwards, C. L. W.; Meissner, F. T.; Hall, J. B.

    1979-01-01

    Color computer graphics techniques were investigated as a means of rapidly scanning and interpreting large sets of transient heating data. The data presented were generated to support the conceptual design of a heat-sink thermal protection system (TPS) for a hypersonic research airplane. Color-coded vector and raster displays of the numerical geometry used in the heating calculations were employed to analyze skin thicknesses and surface temperatures of the heat-sink TPS under a variety of trajectory flight profiles. Both vector and raster displays proved to be effective means for rapidly identifying heat-sink mass concentrations, regions of high heating, and potentially adverse thermal gradients. The color-coded (raster) surface displays are a very efficient means for displaying surface-temperature and heating histories, and thereby the more stringent design requirements can quickly be identified. The related hardware and software developments required to implement both the vector and the raster displays for this application are also discussed.

  12. GRO/EGRET data analysis software: An integrated system of custom and commercial software using standard interfaces

    Science.gov (United States)

    Laubenthal, N. A.; Bertsch, D.; Lal, N.; Etienne, A.; Mcdonald, L.; Mattox, J.; Sreekumar, P.; Nolan, P.; Fierro, J.

    1992-01-01

    The Energetic Gamma Ray Telescope Experiment (EGRET) on the Compton Gamma Ray Observatory has been in orbit for more than a year and is being used to map the full sky for gamma rays in a wide energy range from 30 to 20,000 MeV. Already these measurements have resulted in a wide range of exciting new information on quasars, pulsars, galactic sources, and diffuse gamma ray emission. The central part of the analysis is done with sky maps that typically cover an 80 x 80 degree section of the sky for an exposure time of several days. Specific software developed for this program generates the counts, exposure, and intensity maps. The analysis is done on a network of UNIX based workstations and takes full advantage of a custom-built user interface called X-dialog. The maps that are generated are stored in the FITS format for a collection of energies. These, along with similar diffuse emission background maps generated from a model calculation, serve as input to a maximum likelihood program that produces maps of likelihood with optional contours that are used to evaluate regions for sources. Likelihood also evaluates the background corrected intensity at each location for each energy interval from which spectra can be generated. Being in a standard FITS format permits all of the maps to be easily accessed by the full complement of tools available in several commercial astronomical analysis systems. In the EGRET case, IDL is used to produce graphics plots in two and three dimensions and to quickly implement any special evaluation that might be desired. Other custom-built software, such as the spectral and pulsar analyses, take advantage of the XView toolkit for display and Postscript output for the color hard copy. This poster paper outlines the data flow and provides examples of the user interfaces and output products. It stresses the advantages that are derived from the integration of the specific instrument-unique software and powerful commercial tools for graphics and

  13. Software Piracy in Research: A Moral Analysis.

    Science.gov (United States)

    Santillanes, Gary; Felder, Ryan Marshall

    2015-08-01

    Researchers in virtually every discipline rely on sophisticated proprietary software for their work. However, some researchers are unable to afford the licenses and instead procure the software illegally. We discuss the prohibition of software piracy by intellectual property laws, and argue that the moral basis for the copyright law offers the possibility of cases where software piracy may be morally justified. The ethics codes that scientific institutions abide by are informed by a rule-consequentialist logic: by preserving personal rights to authored works, people able to do so will be incentivized to create. By showing that the law has this rule-consequentialist grounding, we suggest that scientists who blindly adopt their institutional ethics codes will commit themselves to accepting that software piracy could be morally justified, in some cases. We hope that this conclusion will spark debate over important tensions between ethics codes, copyright law, and the underlying moral basis for these regulations. We conclude by offering practical solutions (other than piracy) for researchers.

  14. Civacuve analysis software for mis machine examination of pressurized water reactor vessels

    International Nuclear Information System (INIS)

    Dubois, Ph.; Gagnor, A.

    2001-01-01

    The CIVACUVE software product is used by INTERCONTROLE for the analysis of detection UT examinations performed by the In-Service Inspection Machine (MIS) on the vessels of nuclear power plants. The software is based on an adaptation of the SEGMENTATION algorithm (CEA CEREM), which is applied prior to any analysis, and is equipped with tools adapted to industrial use. It allows the user to: - perform image analysis with advanced graphic tools (zooms, true B-scan, 'contour' selection...); - back up all data in a database (complete and transparent backup of all information used and obtained during the different analysis operations); - connect PCs to the database (export of reports and even of segmented points); - issue examination reports, operating condition sheets, sizing curves...; - and, last, perform a graphic and numerical comparison between different inspections of the same vessel. Used in Belgium and France on different kinds of reactor vessels, CIVACUVE has shown that the principle of SEGMENTATION can be adapted to detection exams. The use of CIVACUVE brings an important gain in time as well as improved quality of analysis. Its wide data opening towards PCs allows real flexibility with regard to clients' requirements and preoccupations

  15. A proposal for performing software safety hazard analysis

    International Nuclear Information System (INIS)

    Lawrence, J.D.; Gallagher, J.M.

    1997-01-01

    Techniques for analyzing the safety and reliability of analog-based electronic protection systems that serve to mitigate hazards in process control systems have been developed over many years, and are reasonably understood. An example is the protection system in a nuclear power plant. The extension of these techniques to systems which include digital computers is not well developed, and there is little consensus among software engineering experts and safety experts on how to analyze such systems. One possible technique is to extend hazard analysis to include digital computer-based systems. Software is frequently overlooked during system hazard analyses, but this is unacceptable when the software is in control of a potentially hazardous operation. In such cases, hazard analysis should be extended to fully cover the software. A method for performing software hazard analysis is proposed in this paper. The method concentrates on finding hazards during the early stages of the software life cycle, using an extension of HAZOP

  16. Graphical Model Debugger Framework for Embedded Systems

    DEFF Research Database (Denmark)

    Zeng, Kebin

    2010-01-01

    Model Driven Software Development has offered a faster way to design and implement embedded real-time software by moving the design to a model level, and by transforming models to code. However, the testing of embedded systems has remained at the code level. This paper presents a Graphical Model Debugger Framework, providing an auxiliary avenue of analysis of system models at runtime by executing generated code and updating models synchronously, which allows embedded developers to focus on the model level. With the model debugger, embedded developers can graphically test their design model...

  17. INFOS: spectrum fitting software for NMR analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Albert A., E-mail: alsi@nmr.phys.chem.ethz.ch [ETH Zürich, Physical Chemistry (Switzerland)

    2017-02-15

    Software for fitting of NMR spectra in MATLAB is presented. Spectra are fitted in the frequency domain, using Fourier transformed lineshapes, which are derived using the experimental acquisition and processing parameters. This yields more accurate fits compared to common fitting methods that use Lorentzian or Gaussian functions. Furthermore, a very time-efficient algorithm for calculating and fitting spectra has been developed. The software also performs initial peak picking, followed by subsequent fitting and refinement of the peak list, by iteratively adding and removing peaks to improve the overall fit. Estimation of error on fitting parameters is performed using a Monte-Carlo approach. Many fitting options allow the software to be flexible enough for a wide array of applications, while still being straightforward to set up with minimal user input.
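
    As an illustration of the Monte-Carlo error estimate described, a minimal Python sketch (a plain Lorentzian stands in here for INFOS's Fourier-transformed lineshapes; the data are synthetic):

        # Fit a lineshape, then refit many noise-perturbed copies of the data
        # and take the spread of the refitted parameters as their error.
        import numpy as np
        from scipy.optimize import curve_fit

        def lorentzian(f, a, f0, w):
            return a * w**2 / ((f - f0) ** 2 + w**2)

        f = np.linspace(-50, 50, 400)
        rng = np.random.default_rng(1)
        data = lorentzian(f, 1.0, 3.0, 5.0) + rng.normal(0, 0.02, f.size)

        popt, _ = curve_fit(lorentzian, f, data, p0=(0.8, 0.0, 4.0))
        resid_sigma = np.std(data - lorentzian(f, *popt))

        samples = []
        for _ in range(200):                     # Monte-Carlo replicates
            fake = lorentzian(f, *popt) + rng.normal(0, resid_sigma, f.size)
            p, _ = curve_fit(lorentzian, f, fake, p0=popt)
            samples.append(p)
        print("fit:", popt, "MC std:", np.std(samples, axis=0))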

  18. Fourier Analysis: Graphical Animation and Analysis of Experimental Data with Excel

    Directory of Open Access Journals (Sweden)

    Margarida Oliveira

    2012-05-01

    According to the Fourier formulation, any function that can be represented in a graph may be approximated by the 'sum' of infinitely many sinusoidal functions (a Fourier series), termed 'waves'. The adopted approach is accessible to students in the first years of university studies, in which the emphasis is put on the understanding of mathematical concepts through illustrative graphic representations, the students being encouraged to prepare animated Excel-based computational modules (VBA - Visual Basic for Applications). Reference is made to the part played by both trigonometric and complex representations of Fourier series in the concept of the discrete Fourier transform. Its connection with the continuous Fourier transform is demonstrated, and brief mention is made of the generalization leading to the Laplace transform. As an application, the example presented refers to the analysis of vibrations measured on engineering structures: horizontal accelerations of a one-storey building deriving from environmental noise. This example is integrated in the curriculum of the discipline 'Matemática Aplicada à Engenharia Civil' (Mathematics Applied to Civil Engineering), lectured at ISEL (Instituto Superior de Engenharia de Lisboa). In this discipline, the students have the possibility of performing measurements using an accelerometer and a data acquisition system, which, when connected to a PC, make it possible to record the measured accelerations in a file format recognizable by Excel.
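
    The same exercise ports directly to other environments; a minimal NumPy sketch (rather than Excel/VBA) of a partial Fourier sum converging to a square wave:

        # Trigonometric Fourier series of a square wave: odd harmonics only.
        import numpy as np
        import matplotlib.pyplot as plt

        t = np.linspace(0, 2 * np.pi, 1000, endpoint=False)
        square = np.sign(np.sin(t))

        partial = np.zeros_like(t)
        for k in range(1, 20, 2):                    # harmonics 1, 3, 5, ...
            partial += (4 / (np.pi * k)) * np.sin(k * t)

        plt.plot(t, square, label="square wave")
        plt.plot(t, partial, label="partial Fourier sum (k < 20)")
        plt.legend(); plt.show()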

  19. InterFace: A software package for face image warping, averaging, and principal components analysis.

    Science.gov (United States)

    Kramer, Robin S S; Jenkins, Rob; Burton, A Mike

    2017-12-01

    We describe InterFace, a software package for research in face recognition. The package supports image warping, reshaping, averaging of multiple face images, and morphing between faces. It also supports principal components analysis (PCA) of face images, along with tools for exploring the "face space" produced by PCA. The package uses a simple graphical user interface, allowing users to perform these sophisticated image manipulations without any need for programming knowledge. The program is available for download in the form of an app, which requires that users also have access to the (freely available) MATLAB Runtime environment.
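
    A minimal NumPy sketch of the PCA step on face images (random pixels stand in for aligned grayscale images; this is not the InterFace code, which runs in the MATLAB Runtime):

        # "Eigenfaces": flatten images, centre on the mean face, and take the
        # principal components of the centred data via SVD.
        import numpy as np

        rng = np.random.default_rng(0)
        faces = rng.random((40, 64 * 64))            # stand-in for 40 aligned images
        mean_face = faces.mean(axis=0)
        centered = faces - mean_face

        u, s, vt = np.linalg.svd(centered, full_matrices=False)
        coords = centered @ vt[:5].T                 # position in 5-D face space
        reconstruction = mean_face + coords @ vt[:5] # back-projected faces
        print(coords.shape, reconstruction.shape)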

  20. ACEMAN (II): a PDP-11 software package for acoustic emission analysis

    International Nuclear Information System (INIS)

    Tobias, A.

    1976-01-01

    A powerful, but easy-to-use, software package (ACEMAN) for acoustic emission analysis has been developed at Berkeley Nuclear Laboratories. The system is based on a PDP-11 minicomputer with 24 K of memory, an RK05 DISK Drive and a Tektronix 4010 Graphics terminal. The operation of the system is described in detail in terms of the functions performed in response to the various command mnemonics. The ACEMAN software package offers many useful facilities not found on other acoustic emission monitoring systems. Its main features, many of which are unique, are summarised. The ACEMAN system automatically handles arrays of up to 12 sensors in real-time operation during which data are acquired, analysed, stored on the computer disk for future analysis and displayed on the terminal if required. (author)

  1. Software design and code generation for the engineering graphical user interface of the ASTRI SST-2M prototype for the Cherenkov Telescope Array

    Science.gov (United States)

    Tanci, Claudio; Tosti, Gino; Antolini, Elisa; Gambini, Giorgio F.; Bruno, Pietro; Canestrari, Rodolfo; Conforti, Vito; Lombardi, Saverio; Russo, Federico; Sangiorgi, Pierluca; Scuderi, Salvatore

    2016-08-01

    ASTRI is an on-going project developed in the framework of the Cherenkov Telescope Array (CTA). An end-to-end prototype of a dual-mirror small-size telescope (SST-2M) has been installed at the INAF observing station on Mt. Etna, Italy. The next step is the development of the ASTRI mini-array, composed of nine ASTRI SST-2M telescopes proposed to be installed at the CTA southern site. The ASTRI mini-array is a collaborative and international effort carried out by Italy, Brazil and South Africa and led by the Italian National Institute of Astrophysics, INAF. To control the ASTRI telescopes, a specific ASTRI Mini-Array Software System (MASS) was designed, using a scalable and distributed architecture to monitor all the hardware devices of the telescopes. Using code generation we built automatically, from the ASTRI Interface Control Documents, a set of communication libraries and extensive graphical user interfaces that provide full access to the capabilities offered by the telescope hardware subsystems for testing and maintenance. Leveraging these generated libraries and components, we then implemented a human-designed, integrated Engineering GUI for MASS to perform the verification of the whole prototype and test shared services such as the alarms, configurations, control systems, and scientific on-line outcomes. In our experience the use of code generation dramatically reduced the amount of effort in development, integration and testing of the more basic software components and resulted in a fast software release life cycle. This approach could be valuable for the whole CTA project, which is characterized by a large diversity of hardware components.
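
    A toy Python sketch of the code-generation idea (the interface-description format and all names are invented; the real system works from the ASTRI Interface Control Documents):

        # Emit typed accessor stubs from an interface description, so client
        # libraries and GUIs stay in sync with the document they derive from.
        icd = {"telescope.azimuth": "float", "telescope.state": "str"}

        lines = ["class TelescopeInterface:"]
        for point, typ in icd.items():
            name = point.split(".")[-1]
            lines.append(f"    def get_{name}(self) -> {typ}:")
            lines.append(f"        return self._read('{point}')  # generated accessor")
        generated_source = "\n".join(lines)
        print(generated_source)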

  2. Granatum: a graphical single-cell RNA-Seq analysis pipeline for genomics scientists.

    Science.gov (United States)

    Zhu, Xun; Wolfgruber, Thomas K; Tasato, Austin; Arisdakessian, Cédric; Garmire, David G; Garmire, Lana X

    2017-12-05

    Single-cell RNA sequencing (scRNA-Seq) is an increasingly popular platform to study heterogeneity at the single-cell level. Computational methods to process scRNA-Seq data are not very accessible to bench scientists as they require a significant amount of bioinformatic skills. We have developed Granatum, a web-based scRNA-Seq analysis pipeline to make analysis more broadly accessible to researchers. Without a single line of programming code, users can click through the pipeline, setting parameters and visualizing results via the interactive graphical interface. Granatum conveniently walks users through various steps of scRNA-Seq analysis. It has a comprehensive list of modules, including plate merging and batch-effect removal, outlier-sample removal, gene-expression normalization, imputation, gene filtering, cell clustering, differential gene expression analysis, pathway/ontology enrichment analysis, protein network interaction visualization, and pseudo-time cell series construction. Granatum enables broad adoption of scRNA-Seq technology by empowering bench scientists with an easy-to-use graphical interface for scRNA-Seq data analysis. The package is freely available for research use at http://garmiregroup.org/granatum/app.
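
    For comparison, a minimal scripted equivalent of a few of those steps using the scanpy library (not Granatum itself; the random count matrix stands in for real data):

        # Normalization, dimensionality reduction and an embedding with scanpy.
        import numpy as np
        import scanpy as sc
        from anndata import AnnData

        rng = np.random.default_rng(0)
        adata = AnnData(rng.poisson(1.0, size=(300, 100)).astype(np.float32))

        sc.pp.normalize_total(adata, target_sum=1e4)  # gene-expression normalization
        sc.pp.log1p(adata)
        sc.pp.pca(adata, n_comps=15)
        sc.pp.neighbors(adata, n_neighbors=10)        # kNN graph for clustering/UMAP
        sc.tl.umap(adata)
        print(adata.obsm["X_umap"].shape)             # (300, 2) embedding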

  3. Incorporating prior information into differential network analysis using non-paranormal graphical models.

    Science.gov (United States)

    Zhang, Xiao-Fei; Ou-Yang, Le; Yan, Hong

    2017-08-15

    Understanding how gene regulatory networks change under different cellular states is important for revealing insights into network dynamics. Gaussian graphical models, which assume that the data follow a joint normal distribution, have been used recently to infer differential networks. However, the distributions of omics data are non-normal in general. Furthermore, although much biological knowledge (or prior information) has been accumulated, most existing methods ignore this valuable prior information. Therefore, new statistical methods are needed to relax the normality assumption and make full use of prior information. We propose a new differential network analysis method to address the above challenges. Instead of using Gaussian graphical models, we employ a non-paranormal graphical model that can relax the normality assumption. We develop a principled model to take into account the following prior information: (i) a differential edge less likely exists between two genes that do not participate together in the same pathway; (ii) changes in the networks are driven by certain regulator genes that are perturbed across different cellular states; and (iii) the differential networks estimated from multi-view gene expression data likely share common structures. Simulation studies demonstrate that our method outperforms other graphical model-based algorithms. We apply our method to identify the differential networks between platinum-sensitive and platinum-resistant ovarian tumors, and the differential networks between the proneural and mesenchymal subtypes of glioblastoma. Hub nodes in the estimated differential networks rediscover known cancer-related regulator genes and contain interesting predictions. The source code is available at https://github.com/Zhangxf-ccnu/pDNA. Contact: szuouyl@gmail.com. Supplementary data are available at Bioinformatics online.
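
    A simplified Python sketch of the estimation idea (not the authors' pDNA code, and without the prior-information terms): Gaussianize each variable with a rank-based transform (the nonparanormal step), fit a sparse precision matrix per condition with the graphical lasso, and difference the two networks.

        # Nonparanormal transform + graphical lasso per condition.
        import numpy as np
        from scipy.stats import norm, rankdata
        from sklearn.covariance import GraphicalLasso

        def nonparanormal(X):
            n = X.shape[0]
            ranks = np.apply_along_axis(rankdata, 0, X)
            return norm.ppf(ranks / (n + 1))          # Gaussianized copy of X

        rng = np.random.default_rng(0)
        X1 = rng.normal(size=(200, 8)) ** 3           # non-normal "condition 1"
        X2 = rng.normal(size=(200, 8)) ** 3
        X2[:, 1] += 0.8 * X2[:, 0]                    # extra edge in condition 2

        P1 = GraphicalLasso(alpha=0.1).fit(nonparanormal(X1)).precision_
        P2 = GraphicalLasso(alpha=0.1).fit(nonparanormal(X2)).precision_
        diff = P2 - P1                                # candidate differential edges
        print(np.round(diff, 2))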

  4. Enhancement to the Tektronix PLOT-10 Terminal Control System for creation of graphics metafiles

    International Nuclear Information System (INIS)

    Gray, W.H.

    1983-01-01

    Many data handling and analysis codes at the Oak Ridge National Laboratory (ORNL) use the Tektronix PLOT-10 Terminal Control System to graphically display data upon Tektronix or Tektronix-emulating graphics devices. Prior to the development of the software libraries and postprocessors discussed within this report, ORNL users were limited to the type of hardcopy output obtainable from the Tektronix PLOT-10 software library. Only Tektronix graphics devices are supported by the PLOT-10 library. The graphics library presented here eliminates this restriction by implementing a suite of software that optionally creates a graphics metafile within the user's disk area while simultaneously drawing a display image on the screen of a user's Tektronix terminal. This graphics metafile can then be postprocessed onto any of the graphics devices at ORNL via the ORNL PLOT command

  5. Research and Development on Food Nutrition Statistical Analysis Software System

    OpenAIRE

    Du Li; Ke Yun

    2013-01-01

    Designing and developing a set of food nutrition component statistical analysis software can automate nutrition calculation, improve the working efficiency of nutrition professionals and achieve the informatization of nutrition propaganda and education. In the software development process, software engineering methods and database technology are used to calculate the human daily nutritional intake, and an intelligent system is used to evaluate the user's hea...

  6. Cross-instrument Analysis Correlation Software

    Energy Technology Data Exchange (ETDEWEB)

    2017-06-28

    This program has been designed to assist with the tracking of a sample from one analytical instrument to another, such as SEMs, microscopes, micro X-ray diffraction and other instruments where particular positions/locations on the sample are examined, photographed, etc. The software is designed for easy entry of the positions of fiducials and locations of interest, such that in a future session on the same or a different instrument the positions of interest can be re-found by using the known fiducial locations in the current and reference sessions to transform the points into the current session's coordinate system. The software is dialog-box driven, guiding the user through the necessary data entry and program choices. Information is stored in a series of text-based extensible markup language (XML) files.
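
    A minimal NumPy sketch of the relocation math (the fiducial coordinates are invented): solve for the affine transform mapping reference-session fiducials onto the current session, then push a stored point of interest through it.

        # Affine transform from matched fiducials via least squares.
        import numpy as np

        ref = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])    # old session
        cur = np.array([[2.1, 1.0], [11.9, 2.2], [0.8, 10.9]])    # same fiducials now

        A = np.hstack([ref, np.ones((3, 1))])        # [x, y, 1] design matrix
        T, *_ = np.linalg.lstsq(A, cur, rcond=None)  # 3x2 affine parameters

        point_of_interest = np.array([5.0, 5.0, 1.0])
        print("re-found at:", point_of_interest @ T)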

  7. Nuclear Fuel Depletion Analysis Using Matlab Software

    Science.gov (United States)

    Faghihi, F.; Nematollahi, M. R.

    Coupled first-order IVPs are frequently used in many parts of engineering and the sciences. In this article, we present a code comprising three computer programs, working jointly with the Matlab software, to solve and plot the solutions of first-order coupled stiff or non-stiff IVPs. Some engineering and scientific problems related to IVPs are given, and fuel depletion (production of the 239Pu isotope) in a Pressurized Water Nuclear Reactor (PWR) is computed by the present code.
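
    A minimal Python sketch of such a coupled IVP (SciPy rather than Matlab; the capture rate is illustrative, not reactor data), for a simplified 238U -> 239Np -> 239Pu production chain:

        # Solve a stiff-capable coupled IVP for 239Pu build-up.
        import numpy as np
        from scipy.integrate import solve_ivp

        sigma_phi = 1e-9                             # 238U capture rate (1/s), illustrative
        lam_np = np.log(2) / (2.356 * 24 * 3600)     # 239Np half-life ~2.356 d

        def rhs(t, y):
            u238, np239, pu239 = y
            return [-sigma_phi * u238,
                    sigma_phi * u238 - lam_np * np239,
                    lam_np * np239]

        t_end = 180 * 24 * 3600                      # half a year of irradiation
        sol = solve_ivp(rhs, (0, t_end), [1.0, 0.0, 0.0], method="LSODA",
                        t_eval=np.linspace(0, t_end, 7))
        print(sol.y[2])                              # 239Pu build-up (fraction)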

  8. Software Framework for Development of Web-GIS Systems for Analysis of Georeferenced Geophysical Data

    Science.gov (United States)

    Okladnikov, I.; Gordov, E. P.; Titov, A. G.

    2011-12-01

    Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are currently actively used in numerous applications, including modeling, interpretation and forecasting of climatic and ecosystem changes on various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which may reach tens of terabytes for a single dataset, present-day studies in the area of climate and environmental change require special software support. A dedicated software framework for the rapid development of information-computational systems providing such support, based on Web-GIS technologies, has been created. The software framework consists of three basic parts: a computational kernel developed using ITTVIS Interactive Data Language (IDL), a set of PHP controllers run within a specialized web portal, and a JavaScript class library for the development of typical components of web-mapping application graphical user interfaces (GUIs) based on AJAX technology. The computational kernel comprises a number of modules for dataset access, mathematical and statistical data analysis, and visualization of results. The specialized web portal consists of the Apache web server, the OGC-compliant GeoServer software, which is used as a base for presenting cartographical information over the Web, and a set of PHP controllers implementing the web-mapping application logic and governing the computational kernel. The JavaScript library aimed at graphical user interface development is based on the GeoExt library, combining the ExtJS framework and OpenLayers software. Based on the software framework, an information-computational system for complex analysis of large georeferenced data archives was developed. Structured environmental datasets available for processing now include two editions of NCEP/NCAR Reanalysis, JMA/CRIEPI JRA-25 Reanalysis, ECMWF ERA-40 Reanalysis, ECMWF ERA Interim Reanalysis, MRI/JMA APHRODITE's Water Resources Project Reanalysis

  9. A study of software safety analysis system for safety-critical software

    International Nuclear Information System (INIS)

    Chang, H. S.; Shin, H. K.; Chang, Y. W.; Jung, J. C.; Kim, J. H.; Han, H. H.; Son, H. S.

    2004-01-01

    The core factors and requirements for safety-critical software are traced, and the methodology adopted in each stage of the software life cycle is presented. In the concept phase, a Failure Modes and Effects Analysis (FMEA) for the system was performed. A feasibility evaluation of the selected safety parameter was performed, and a Preliminary Hazards Analysis list was prepared using the HAZOP (Hazard and Operability) technique. A check list for management control was produced via the walk-through technique. Based on the evaluation of the check list, activities to be performed in the requirement phase were determined. In the design phase, a hazard analysis was performed to check the safety capability of the system with regard to the safety software algorithm, using Fault Tree Analysis (FTA). In the test phase, the test items based on FMEA were checked for fitness, guided by an accident scenario. The pressurizer low-pressure trip algorithm was selected as a sample to which the FTA method was applied for software safety analysis. By applying a CASE tool, the requirements traceability of the safety-critical system was enhanced during all software life cycle phases

  10. ClonoCalc and ClonoPlot: immune repertoire analysis from raw files to publication figures with graphical user interface.

    Science.gov (United States)

    Fähnrich, Anke; Krebbel, Moritz; Decker, Normann; Leucker, Martin; Lange, Felix D; Kalies, Kathrin; Möller, Steffen

    2017-03-11

    Next generation sequencing (NGS) technologies enable studies and analyses of the diversity of both T and B cell receptors (TCR and BCR) in human and animal systems to elucidate immune functions in health and disease. Over the last few years, several algorithms and tools have been developed to support respective analyses of raw sequencing data of the immune repertoire. These tools focus on distinct aspects of the data processing and require a strong bioinformatics background. To facilitate the analysis of T and B cell repertoires by less experienced users, software is needed that combines the most common tools for repertoire analysis. We introduce a graphical user interface (GUI) providing a complete analysis pipeline for processing raw NGS data for human and animal TCR and BCR clonotype determination and advanced differential repertoire studies. It provides two applications. ClonoCalc prepares the raw data for downstream analyses. It combines a demultiplexer for barcode splitting and employs MiXCR for paired-end read merging and the extraction of human and animal TCR/BCR sequences. ClonoPlot wraps the R package tcR and further contributes self-developed plots for the descriptive comparative investigation of immune repertoires. This workflow reduces the amount of programming required to perform the respective analyses and supports both communication and training between scientists and technicians, and across scientific disciplines. The Open Source development in Java and R is modular and invites advanced users to extend its functionality. Software and documentation are freely available at https://bitbucket.org/ClonoSuite/clonocalc-plot .

  11. Change Impact Analysis of Crosscutting in Software Architectural Design

    NARCIS (Netherlands)

    van den Berg, Klaas

    2006-01-01

    Software architectures should be amenable to changes in user requirements and implementation technology. The analysis of the impact of these changes can be based on traceability of architectural design elements. Design elements have dependencies with other software artifacts but also evolve in time.

  12. Integrated analysis software for bulk power system stability

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, T; Nagao, T; Takahashi, K [Central Research Inst. of Electric Power Industry, Tokyo (Japan)

    1994-12-31

    This paper presents three software packages developed by the Central Research Institute of Electric Power Industry (CRIEPI) for bulk power network analysis, together with a user support system which arranges, easily and with high reliability, the tremendous amounts of data necessary for these packages. (author) 3 refs., 7 figs., 2 tabs.

  13. Development of analysis software for radiation time-series data with the use of visual studio 2005

    International Nuclear Information System (INIS)

    Hohara, Sin-ya; Horiguchi, Tetsuo; Ito, Shin

    2008-01-01

    Time-series analysis supplies a new vision that conventional analysis methods, such as energy spectroscopy, have never achieved. However, the application of time-series analysis to radiation measurements needs much effort in software and hardware development. By taking advantage of Visual Studio 2005, we developed an analysis program, 'ListFileConverter', for the time-series radiation measurement system known as 'MPA-3'. The software is based on a graphical user interface (GUI) architecture that enables us to save a large amount of operation time in the analysis, and moreover makes for easy access to the special file structure of MPA-3 data. In this paper, the detailed structure of ListFileConverter is fully explained, and experimental results for the counting capability of the MPA-3 hardware system and for neutron measurements with our UTR-KINKI reactor are also given. (author)
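
    A minimal Python sketch of the kind of time-series reduction such a converter enables (the list-file layout is invented): bin event arrival times into a count-rate trace.

        # From event timestamps to a count-rate time series.
        import numpy as np

        rng = np.random.default_rng(0)
        timestamps = np.sort(rng.uniform(0, 60.0, size=5000))   # seconds

        bin_width = 1.0
        edges = np.arange(0, 60.0 + bin_width, bin_width)
        counts, _ = np.histogram(timestamps, bins=edges)
        rate = counts / bin_width                               # counts per second
        print(rate[:10])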

  14. A projection graphic display for the computer aided analysis of bubble chamber images

    International Nuclear Information System (INIS)

    Solomos, E.

    1979-01-01

    A projection graphic display for aiding the analysis of bubble chamber photographs has been developed by the Instrumentation Group of EF Division at CERN. The display image is generated on a very high brightness cathode ray tube and projected onto the table of the scanning-measuring machines, superimposed on the image of the bubble chamber. The display can send messages to the operator and aid the measurement by indicating directly on the chamber image which tracks have or have not been measured correctly. (orig.)

  15. Continuous Software Quality analysis for the ATLAS experiment

    CERN Document Server

    Washbrook, Andrew; The ATLAS collaboration

    2017-01-01

    The regular application of software quality tools in large collaborative projects is required to reduce code defects to an acceptable level. If left unchecked, the accumulation of defects invariably results in performance degradation at scale and problems with the long-term maintainability of the code. Although software quality tools are effective at identifying defects, there remains a non-trivial sociological challenge to resolve them in a timely manner. This is an ongoing concern for the ATLAS software, which has evolved over many years to meet the demands of Monte Carlo simulation, detector reconstruction and data analysis. At present over 3.8 million lines of C++ code (and close to 6 million total lines of code) are maintained by a community of hundreds of developers worldwide. It is therefore preferable to address code defects before they are introduced into a widely used software release. Recent wholesale changes to the ATLAS software infrastructure have provided an ideal opportunity to apply software quali...

  16. Meta-DiSc: a software for meta-analysis of test accuracy data.

    Science.gov (United States)

    Zamora, Javier; Abraira, Victor; Muriel, Alfonso; Khan, Khalid; Coomarasamy, Arri

    2006-07-12

    Systematic reviews and meta-analyses of test accuracy studies are increasingly being recognised as central in guiding clinical practice. However, there is currently no dedicated and comprehensive software for meta-analysis of diagnostic data. In this article, we present Meta-DiSc, a Windows-based, user-friendly, freely available (for academic use) software that we have developed, piloted, and validated to perform diagnostic meta-analysis. Meta-DiSc a) allows exploration of heterogeneity, with a variety of statistics including chi-square, I-squared and Spearman correlation tests, b) implements meta-regression techniques to explore the relationships between study characteristics and accuracy estimates, c) performs statistical pooling of sensitivities, specificities, likelihood ratios and diagnostic odds ratios using fixed and random effects models, both overall and in subgroups and d) produces high quality figures, including forest plots and summary receiver operating characteristic curves that can be exported for use in manuscripts for publication. All computational algorithms have been validated through comparison with different statistical tools and published meta-analyses. Meta-DiSc has a Graphical User Interface with roll-down menus, dialog boxes, and online help facilities. Meta-DiSc is a comprehensive and dedicated test accuracy meta-analysis software. It has already been used and cited in several meta-analyses published in high-ranking journals. The software is publicly available at http://www.hrc.es/investigacion/metadisc_en.htm.
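
    To illustrate the pooling step, the short Python sketch below performs fixed-effect (inverse-variance) pooling of sensitivities on the logit scale, one of the standard computations a tool like Meta-DiSc carries out; the study counts are invented for the example.

        # Fixed-effect (inverse-variance) pooling of sensitivity on the logit scale.
        # Illustrative data: hypothetical (true positive, false negative) pairs.
        import math

        studies = [(45, 5), (30, 10), (60, 12)]

        weights, logits = [], []
        for tp, fn in studies:
            sens = tp / (tp + fn)
            logit = math.log(sens / (1 - sens))
            var = 1 / tp + 1 / fn          # approximate variance of the logit
            logits.append(logit)
            weights.append(1 / var)        # inverse-variance weight

        pooled_logit = sum(w * l for w, l in zip(weights, logits)) / sum(weights)
        pooled_sens = 1 / (1 + math.exp(-pooled_logit))
        se = math.sqrt(1 / sum(weights))
        lo = 1 / (1 + math.exp(-(pooled_logit - 1.96 * se)))
        hi = 1 / (1 + math.exp(-(pooled_logit + 1.96 * se)))
        print(f"pooled sensitivity = {pooled_sens:.3f} (95% CI {lo:.3f}-{hi:.3f})")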

  17. Melanie II--a third-generation software package for analysis of two-dimensional electrophoresis images: II. Algorithms.

    Science.gov (United States)

    Appel, R D; Vargas, J R; Palagi, P M; Walther, D; Hochstrasser, D F

    1997-12-01

    After two generations of software systems for the analysis of two-dimensional electrophoresis (2-DE) images, a third generation of such software packages has recently emerged that combines state-of-the-art graphical user interfaces with comprehensive spot data analysis capabilities. A key characteristic common to most of these software packages is that many of their tools are implementations of algorithms that resulted from research areas such as image processing, vision, artificial intelligence or machine learning. This article presents the main algorithms implemented in the Melanie II 2-D PAGE software package. The applications of these algorithms, embodied as features of the program, are explained in an accompanying article (R. D. Appel et al.; Electrophoresis 1997, 18, 2724-2734).

  18. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    Science.gov (United States)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  19. GPUmotif: an ultra-fast and energy-efficient motif analysis program using graphics processing units.

    Science.gov (United States)

    Zandevakili, Pooya; Hu, Ming; Qin, Zhaohui

    2012-01-01

    Computational detection of TF binding patterns has become an indispensable tool in functional genomics research. With the rapid advance of new sequencing technologies, large amounts of protein-DNA interaction data have been produced. Analyzing this data can provide substantial insight into the mechanisms of transcriptional regulation. However, the massive amount of sequence data presents daunting challenges. In our previous work, we have developed a novel algorithm called Hybrid Motif Sampler (HMS) that enables more scalable and accurate motif analysis. Despite much improvement, HMS is still time-consuming due to the requirement to calculate matching probabilities position-by-position. Using the NVIDIA CUDA toolkit, we developed a graphics processing unit (GPU)-accelerated motif analysis program named GPUmotif. We proposed a "fragmentation" technique to hide data transfer time between memories. Performance comparison studies showed that commonly-used model-based motif scan and de novo motif finding procedures such as HMS can be dramatically accelerated when running GPUmotif on NVIDIA graphics cards. As a result, energy consumption can also be greatly reduced when running motif analysis using GPUmotif. The GPUmotif program is freely available at http://sourceforge.net/projects/gpumotif/
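
    The position-by-position matching probability computation that dominates HMS run time is essentially a position weight matrix (PWM) scan. A minimal CPU reference version in Python/NumPy, the kind of per-position baseline that GPU implementations parallelize, might look as follows; the PWM and sequence are made up for illustration.

        # Naive CPU reference for a PWM log-odds scan over a DNA sequence.
        import numpy as np

        BASES = {"A": 0, "C": 1, "G": 2, "T": 3}

        def pwm_scan(seq, pwm, background=0.25):
            """Return the motif's log-odds score at every position of seq."""
            w = pwm.shape[0]                      # motif width; pwm is (w, 4)
            logodds = np.log(pwm / background)
            idx = np.array([BASES[b] for b in seq])
            scores = np.empty(len(seq) - w + 1)
            for i in range(len(scores)):          # position-by-position scan
                scores[i] = logodds[np.arange(w), idx[i:i + w]].sum()
            return scores

        pwm = np.array([[0.7, 0.1, 0.1, 0.1],     # hypothetical 3-bp motif
                        [0.1, 0.1, 0.7, 0.1],
                        [0.1, 0.1, 0.1, 0.7]])
        print(pwm_scan("ACGTAGAGT", pwm).round(2))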

  20. GPUmotif: an ultra-fast and energy-efficient motif analysis program using graphics processing units.

    Directory of Open Access Journals (Sweden)

    Pooya Zandevakili

    Full Text Available Computational detection of TF binding patterns has become an indispensable tool in functional genomics research. With the rapid advance of new sequencing technologies, large amounts of protein-DNA interaction data have been produced. Analyzing this data can provide substantial insight into the mechanisms of transcriptional regulation. However, the massive amount of sequence data presents daunting challenges. In our previous work, we have developed a novel algorithm called Hybrid Motif Sampler (HMS) that enables more scalable and accurate motif analysis. Despite much improvement, HMS is still time-consuming due to the requirement to calculate matching probabilities position-by-position. Using the NVIDIA CUDA toolkit, we developed a graphics processing unit (GPU)-accelerated motif analysis program named GPUmotif. We proposed a "fragmentation" technique to hide data transfer time between memories. Performance comparison studies showed that commonly-used model-based motif scan and de novo motif finding procedures such as HMS can be dramatically accelerated when running GPUmotif on NVIDIA graphics cards. As a result, energy consumption can also be greatly reduced when running motif analysis using GPUmotif. The GPUmotif program is freely available at http://sourceforge.net/projects/gpumotif/

  1. Analysis of Local Dependence and Multidimensionality in Graphical Loglinear Rasch Models

    DEFF Research Database (Denmark)

    Kreiner, Svend; Christensen, Karl Bang

    2004-01-01

    Local independence; Multidimensionality; Differential item functioning; Uniform local dependence and DIF; Graphical Rasch models; Loglinear Rasch model

  2. Long-term preservation of analysis software environment

    International Nuclear Information System (INIS)

    Toppe Larsen, Dag; Blomer, Jakob; Buncic, Predrag; Charalampidis, Ioannis; Haratyunyan, Artem

    2012-01-01

    Long-term preservation of scientific data represents a challenge to experiments, especially regarding the analysis software. Preserving data is not enough; the full software and hardware environment is needed. Virtual machines (VMs) make it possible to preserve hardware “in software”. A complete infrastructure package has been developed for easy deployment and management of VMs, based on CERN virtual machine (CernVM). Further, an HTTP-based file system, CernVM file system (CVMFS), is used for the distribution of the software. It is possible to process data with any given software version, and a matching, regenerated VM version. A point-and-click web user interface is being developed for setting up the complete processing chain, including VM and software versions, number and type of processing nodes, and the particular type of analysis and data. This paradigm also allows for distributed cloud-computing on private and public clouds, for both legacy and contemporary experiments.

  3. Engineering computer graphics in gas turbine engine design, analysis and manufacture

    Science.gov (United States)

    Lopatka, R. S.

    1975-01-01

    A time-sharing and computer graphics facility designed to provide effective interactive tools to a large number of engineering users with varied requirements is described. The application of computer graphics displays at several levels of hardware complexity and capability is discussed, with examples of graphics systems tracing gas turbine product development from preliminary design through manufacture. Highlights of an operating system stylized for interactive engineering graphics are described.

  4. VACTIV-DELPHI graphical dialog based program for the analysis of gamma-ray spectra

    International Nuclear Information System (INIS)

    Zlokazov, V.B.

    2002-01-01

    The program VACTIV - Visual ACTIV - has been developed for the analysis of gamma-ray spectra and is a standard graphical dialog based Windows XX application, driven by a menu, mouse and keyboard. On the one hand, it is a conversion of an existing Fortran program, ACTIV, to the DELPHI-5 language; on the other hand, it is a transformation of the sequential syntax of Fortran programming into a new object-oriented style based on the organization of event interaction. Since VACTIV is apparently the first attempt to apply the newest programming languages and styles to spectrum analysis systems, the goal of its creation was both to obtain a convenient and efficient data-processing technique and to gain experience with object-oriented methods and event-driven design. The program is now widely used for the processing of gamma-ray spectra in experiments on activation analysis.

  5. A complete graphical criterion for the adjustment formula in mediation analysis.

    Science.gov (United States)

    Shpitser, Ilya; VanderWeele, Tyler J

    2011-03-04

    Various assumptions have been used in the literature to identify natural direct and indirect effects in mediation analysis. These effects are of interest because they allow for effect decomposition of a total effect into a direct and indirect effect even in the presence of interactions or non-linear models. In this paper, we consider the relation and interpretation of various identification assumptions in terms of causal diagrams interpreted as a set of non-parametric structural equations. We show that for such causal diagrams, two sets of assumptions for identification that have been described in the literature are in fact equivalent in the sense that if either set of assumptions holds for all models inducing a particular causal diagram, then the other set of assumptions will also hold for all models inducing that diagram. We moreover build on prior work concerning a complete graphical identification criterion for covariate adjustment for total effects to provide a complete graphical criterion for using covariate adjustment to identify natural direct and indirect effects. Finally, we show that this criterion is equivalent to the two sets of independence assumptions used previously for mediation analysis.
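
    For readers unfamiliar with the notation, the natural direct and indirect effects discussed here are standardly defined in counterfactual terms as follows (a and a* are two treatment levels and M the mediator); this is the conventional definition, not an equation reproduced from the paper:

        \mathrm{NDE} = E\big[\,Y\big(a, M(a^{*})\big) - Y\big(a^{*}, M(a^{*})\big)\,\big], \qquad
        \mathrm{NIE} = E\big[\,Y\big(a, M(a)\big) - Y\big(a, M(a^{*})\big)\,\big]

    so that the total effect decomposes as E[Y(a) - Y(a*)] = NDE + NIE even in the presence of interactions or non-linearities.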

  6. Ignominy: a tool for software dependency and metric analysis with examples from large HEP packages

    International Nuclear Information System (INIS)

    Tuura, L.A.; Taylor, L.

    2001-01-01

    Ignominy is a tool developed in the CMS IGUANA project to analyse the structure of software systems. Its primary component is a dependency scanner that distills information into human-usable forms. It also includes several tools to visualise the collected data in the form of graphical views and numerical metrics. Ignominy was designed to adapt to almost any reasonable structure, and it has been used to analyse several large projects. The original purpose of Ignominy was to help us better ensure the quality of our own software, and in particular warn us about possible structural problems early on. As a part of this activity it is now used as a standard part of our release procedure. The authors also use it to evaluate and study the quality of external packages they plan to make use of. The authors describe what Ignominy can find out, and how it can be used to visualise and assess a software structure. The authors also discuss the inherent problems of the analysis as well as the different approaches to modularity the tool makes quite evident. The focus is the illustration of these issues through the analysis results for several sizable HEP software projects

  7. Development of design and analysis software for advanced nuclear system

    International Nuclear Information System (INIS)

    Wu Yican; Hu Liqin; Long Pengcheng; Luo Yuetong; Li Yazhou; Zeng Qin; Lu Lei; Zhang Junjun; Zou Jun; Xu Dezheng; Bai Yunqing; Zhou Tao; Chen Hongli; Peng Lei; Song Yong; Huang Qunying

    2010-01-01

    A series of professional codes, comprising the software tools and data libraries necessary for advanced nuclear system design and analysis, was developed by the FDS Team, including codes for automatic modeling, physics and engineering calculation, virtual simulation and visualization, system engineering and safety analysis, and the related database management. The development of this software series was proposed as an exercise in nuclear informatics. This paper introduces the main functions and key techniques of the software series, as well as some tests and practical applications. (authors)

  8. Visual evaluation of kinetic characteristics of PET probe for neuroreceptors using a two-phase graphic plot analysis.

    Science.gov (United States)

    Ito, Hiroshi; Ikoma, Yoko; Seki, Chie; Kimura, Yasuyuki; Kawaguchi, Hiroshi; Takuwa, Hiroyuki; Ichise, Masanori; Suhara, Tetsuya; Kanno, Iwao

    2017-05-01

    Objectives: In PET studies for neuroreceptors, tracer kinetics are described by the two-tissue compartment model (2-TCM), and binding parameters, including the total distribution volume (V_T), non-displaceable distribution volume (V_ND), and binding potential (BP_ND), can be determined from model parameters estimated by kinetic analysis. The stability of binding parameter estimates depends on the kinetic characteristics of radioligands. To describe these kinetic characteristics, we previously developed a two-phase graphic plot analysis in which V_ND and V_T can be estimated from the x-intercepts of regression lines for the early and delayed phases, respectively. In this study, we applied this graphic plot analysis to visual evaluation of the kinetic characteristics of radioligands for neuroreceptors, and investigated the relationship between the shape of these graphic plots and the stability of binding parameters estimated by kinetic analysis with 2-TCM in simulated brain tissue time-activity curves (TACs) with various binding parameters. Methods: 90-min TACs were generated with the arterial input function and assumed kinetic parameters according to 2-TCM. Graphic plot analysis was applied to these simulated TACs, and the curvature of the plot for each TAC was evaluated visually. TACs with several noise levels were also generated with various kinetic parameters, and the bias and variation of binding parameters estimated by kinetic analysis were calculated for each TAC. These bias and variation values were compared with the shape of the graphic plots. Results: The graphic plots showed larger curvature for TACs with higher specific binding and slower dissociation of specific binding. The quartile deviations of V_ND and BP_ND determined by kinetic analysis were smaller for radioligands with slow dissociation. Conclusions: The larger curvature of graphic plots for radioligands with slow dissociation might indicate a stable determination of V_ND and BP_ND by kinetic analysis.
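
    For reference, the binding parameters named here are related to the 2-TCM rate constants in the standard way; the identities below are conventional compartmental-modeling relations, included as background rather than equations quoted from this abstract:

        V_{ND} = \frac{K_1}{k_2}, \qquad
        V_T = \frac{K_1}{k_2}\left(1 + \frac{k_3}{k_4}\right), \qquad
        BP_{ND} = \frac{V_T - V_{ND}}{V_{ND}} = \frac{k_3}{k_4}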

  9. MethLAB: a graphical user interface package for the analysis of array-based DNA methylation data.

    Science.gov (United States)

    Kilaru, Varun; Barfield, Richard T; Schroeder, James W; Smith, Alicia K; Conneely, Karen N

    2012-03-01

    Recent evidence suggests that DNA methylation changes may underlie numerous complex traits and diseases. The advent of commercial, array-based methods to interrogate DNA methylation has led to a profusion of epigenetic studies in the literature. Array-based methods, such as the popular Illumina GoldenGate and Infinium platforms, estimate the proportion of DNA methylated at single-base resolution for thousands of CpG sites across the genome. These arrays generate enormous amounts of data, but few software resources exist for efficient and flexible analysis of these data. We developed a software package called MethLAB (http://genetics.emory.edu/conneely/MethLAB) using R, an open source statistical language that can be edited to suit the needs of the user. MethLAB features a graphical user interface (GUI) with a menu-driven format designed to efficiently read in and manipulate array-based methylation data in a user-friendly manner. MethLAB tests for association between methylation and relevant phenotypes by fitting a separate linear model for each CpG site. These models can incorporate both continuous and categorical phenotypes and covariates, as well as fixed or random batch or chip effects. MethLAB accounts for multiple testing by controlling the false discovery rate (FDR) at a user-specified level. Standard output includes a spreadsheet-ready text file and an array of publication-quality figures. Considering the growing interest in and availability of DNA methylation data, there is a great need for user-friendly open source analytical tools. With MethLAB, we present a timely resource that will allow users with no programming experience to implement flexible and powerful analyses of DNA methylation data.
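
    The core computation described here (a separate linear model per CpG site, followed by FDR control) is easy to express outside of R as well; a minimal Python sketch using statsmodels, with simulated placeholder data, is given below as an illustration of the approach, not MethLAB's actual implementation.

        # Per-CpG linear models with Benjamini-Hochberg FDR control,
        # in the style of MethLAB's core analysis. Data are simulated.
        import numpy as np
        import statsmodels.api as sm
        from statsmodels.stats.multitest import multipletests

        rng = np.random.default_rng(0)
        n_samples, n_cpgs = 40, 1000
        beta = rng.uniform(0, 1, size=(n_samples, n_cpgs))  # methylation values
        phenotype = rng.normal(size=n_samples)               # outcome of interest
        age = rng.normal(50, 10, size=n_samples)             # covariate

        X = sm.add_constant(np.column_stack([phenotype, age]))
        pvals = np.empty(n_cpgs)
        for j in range(n_cpgs):                              # one model per CpG
            fit = sm.OLS(beta[:, j], X).fit()
            pvals[j] = fit.pvalues[1]                        # phenotype term

        reject, qvals, _, _ = multipletests(pvals, alpha=0.05, method="fdr_bh")
        print(f"{reject.sum()} CpG sites significant at FDR 0.05")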

  10. Analysis of impact of general-purpose graphics processor units in supersonic flow modeling

    Science.gov (United States)

    Emelyanov, V. N.; Karpenko, A. G.; Kozelkov, A. S.; Teterina, I. V.; Volkov, K. N.; Yalozo, A. V.

    2017-06-01

    Computational methods are widely used in the prediction of complex flowfields associated with off-normal situations in aerospace engineering. Modern graphics processing units (GPU) provide architectures and new programming models that make it possible to harness their large processing power and to design computational fluid dynamics (CFD) simulations at both high performance and low cost. Possibilities of the use of GPUs for the simulation of external and internal flows on unstructured meshes are discussed. The finite volume method is applied to solve three-dimensional unsteady compressible Euler and Navier-Stokes equations on unstructured meshes with high resolution numerical schemes. CUDA technology is used for the programming implementation of parallel computational algorithms. Solutions of some benchmark test cases on GPUs are reported, and the computed results are compared with experimental and computational data. Approaches to optimization of the CFD code related to the use of different types of memory are considered. The speedup of the solution on GPUs with respect to the solution on a central processing unit (CPU) is compared. Performance measurements show that the numerical schemes developed achieve a 20-50x speedup on GPU hardware compared to the CPU reference implementation. The results obtained provide a promising perspective for designing a GPU-based software framework for applications in CFD.

  11. smRithm: Graphical user interface for heart rate variability analysis.

    Science.gov (United States)

    Nara, Sanjeev; Kaur, Manvinder; Datta, Saurav

    2015-01-01

    Over the past 25 years, heart rate variability (HRV) has become a non-invasive research and clinical tool for indirectly investigating cardiac and autonomic system function in both health and disease. It provides valuable information about a wide range of cardiovascular disorders, pulmonary diseases, neurological diseases, etc., and its primary purpose is to assess the functioning of the nervous system. The source of information for HRV analysis is the continuous beat-to-beat measurement of inter-beat intervals. Electrocardiography (ECG or EKG) is considered the best way to measure inter-beat intervals. This paper proposes an open source graphical user interface (GUI), smRithm, developed in MATLAB for HRV analysis, which applies effective techniques to raw ECG signals to process and decompose them, yielding information that can be utilized for more powerful and efficient HRV-related applications.
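
    As a concrete example of the time-domain statistics such a GUI derives from the beat-to-beat (RR) interval series, here is a short Python sketch of SDNN and RMSSD computed on hypothetical intervals; it illustrates the measures, not smRithm's MATLAB code.

        # Two standard time-domain HRV measures from RR intervals (ms).
        import numpy as np

        rr = np.array([812, 800, 790, 805, 821, 809, 798, 815])  # hypothetical

        sdnn = rr.std(ddof=1)                  # overall variability
        diffs = np.diff(rr)
        rmssd = np.sqrt(np.mean(diffs ** 2))   # short-term, beat-to-beat variability

        print(f"SDNN  = {sdnn:.1f} ms")
        print(f"RMSSD = {rmssd:.1f} ms")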

  12. Development of Point Kernel Shielding Analysis Computer Program Implementing Recent Nuclear Data and Graphic User Interfaces

    International Nuclear Information System (INIS)

    Kang, Sang Ho; Lee, Seung Gi; Chung, Chan Young; Lee, Choon Sik; Lee, Jai Ki

    2001-01-01

    In order to comply with revised national regulations on radiological protection and to implement recent nuclear data and dose conversion factors, KOPEC developed a new point kernel gamma and beta ray shielding analysis computer program. This new code, named VisualShield, adopted mass attenuation coefficients and buildup factors from recent ANSI/ANS standards and flux-to-dose conversion factors from the International Commission on Radiological Protection (ICRP) Publication 74 for estimation of the effective/equivalent dose recommended in ICRP 60. VisualShield utilizes graphical user interfaces and 3-D visualization of the geometric configuration for preparing input data sets and analyzing results, which leads users to error-free processing with visual effects. Code validation and data analysis were performed by comparing the results of various calculations to the data outputs of previous programs such as MCNP 4B, ISOSHLD-II, QAD-CGGP, etc.
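
    The underlying point-kernel computation is compact: for a point isotropic source, the dose rate at a distance through a shield follows the attenuation-with-buildup expression sketched below. The attenuation coefficient, buildup factor and conversion factor values are purely illustrative; real codes such as VisualShield take them from standard data libraries.

        # Point-kernel gamma dose-rate estimate for a point isotropic source.
        import math

        def dose_rate(S, mu, t, r, buildup, flux_to_dose):
            """S: photons/s, mu: 1/cm, t: shield thickness (cm), r: distance (cm)."""
            flux = S * buildup * math.exp(-mu * t) / (4 * math.pi * r ** 2)
            return flux * flux_to_dose   # e.g. (Sv/h) per (photons/cm^2/s)

        # Hypothetical source behind 5 cm of shield material, scored at 1 m:
        print(dose_rate(S=1e9, mu=0.06, t=5.0, r=100.0,
                        buildup=1.8, flux_to_dose=1.6e-9))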

  13. Change impact analysis for software product lines

    Directory of Open Access Journals (Sweden)

    Jihen Maâzoun

    2016-10-01

    Full Text Available A software product line (SPL) represents a family of products in a given application domain. Each SPL is constructed to provide for the derivation of new products by covering a wide range of features in its domain. Nevertheless, over time, some domain features may become obsolete with the appearance of new features, while others may become refined. Accordingly, the SPL must be maintained to account for the domain evolution. Such evolution requires a means for managing the impact of changes on the SPL models, including the feature model and design. This paper presents an automated method that analyzes feature model evolution, traces its impact on the SPL design, and offers a set of recommendations to ensure the consistency of both models. The proposed method defines a set of new metrics adapted to SPL evolution to identify the effort needed to maintain the SPL models consistently and with a quality as good as that of the original models. The method and its tool are illustrated through an example of an SPL in the text editing domain. In addition, they are experimentally evaluated in terms of both the quality of the maintained SPL models and the precision of the change impact management.

  14. Dispersion analysis of biotoxins using HPAC software

    International Nuclear Information System (INIS)

    Wu, A.; Nurthen, N.; Horstman, A.; Watson, R.; Phillips, M.

    2009-01-01

    Biotoxins are emerging threat agents produced by living organisms: bacteria, plants, or animals. They are generally classified as cyanotoxins, hemotoxins, necrotoxins, neurotoxins, and cytotoxins. The application of classical biotoxins as weapons of terror has been realized because of their extreme potency and lethality; the ease of their production, transport, and misuse; and the need for prolonged intensive care among affected persons. Recently, emerging biotoxins such as ricin and T-2 mycotoxin have been used clandestinely by terrorist groups or in military combat operations. It is thus highly desirable to have a modeling system that simulates the dispersion of biotoxins in a terrorist attack scenario in order to provide prompt technical support and casualty estimates to first responders and military rescuers. The Hazard Prediction and Assessment Capability (HPAC) automated software system provides the means to accurately predict the effects of hazardous material released into the atmosphere and its impact on civilian and military populations. The system uses integrated source terms, high-resolution weather forecasts and atmospheric transport and dispersion analyses to model hazard areas produced by military or terrorist incidents and industrial accidents. We have successfully incorporated the physical, chemical, epidemiological and biological characteristics of a variety of biotoxins into the HPAC system and have conducted numerous analyses for our emergency responders. The health effects caused by these hazards are closely reflected in HPAC output results. (author)

  15. OST: analysis tool for real time software by simulation of material and software environments

    International Nuclear Information System (INIS)

    Boulc'h; Le Meur; Lapassat; Salichon; Segalard

    1988-07-01

    The use of microprocessor systems in nuclear installation control demands a high degree of operational safety, both for the operation of the installation and for the protection of the environment. For the safety analysis of these installations, the Institute of Protection and Nuclear Safety (IPSN) will have at its disposal tools that permit checks throughout the life of the software. The simulation and test tool (OST) that has been created is implemented entirely in software. It runs on VAX computers and can easily be ported to other machines. [fr]

  16. Dedicated algorithm and software for the integrated analysis of AC and DC electrical outputs of piezoelectric vibration energy harvesters

    International Nuclear Information System (INIS)

    Kim, Jae Eum

    2014-01-01

    DC electrical outputs of a piezoelectric vibration energy harvester by nonlinear rectifying circuitry can hardly be obtained either by any mathematical models developed so far or by finite element analysis. To address the issue, this work used an equivalent electrical circuit model and newly developed an algorithm to efficiently identify relevant circuit parameters of arbitrarily-shaped cantilevered piezoelectric energy harvesters. The developed algorithm was then realized as a dedicated software module by adopting ANSYS finite element analysis software for the parameters identification and the Tcl/Tk programming language for a graphical user interface and linkage with ANSYS. For verifications, various AC electrical outputs by the developed software were compared with those by traditional finite element analysis. DC electrical outputs through rectifying circuitry were also examined for varying values of the smoothing capacitance and load resistance.

  17. Dedicated algorithm and software for the integrated analysis of AC and DC electrical outputs of piezoelectric vibration energy harvesters

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Eum [Catholic University of Daegu, Gyeongsan (Korea, Republic of)

    2014-10-15

    DC electrical outputs of a piezoelectric vibration energy harvester by nonlinear rectifying circuitry can hardly be obtained either by any mathematical models developed so far or by finite element analysis. To address the issue, this work used an equivalent electrical circuit model and newly developed an algorithm to efficiently identify relevant circuit parameters of arbitrarily-shaped cantilevered piezoelectric energy harvesters. The developed algorithm was then realized as a dedicated software module by adopting ANSYS finite element analysis software for the parameters identification and the Tcl/Tk programming language for a graphical user interface and linkage with ANSYS. For verifications, various AC electrical outputs by the developed software were compared with those by traditional finite element analysis. DC electrical outputs through rectifying circuitry were also examined for varying values of the smoothing capacitance and load resistance.

  18. TLM-Tracker: software for cell segmentation, tracking and lineage analysis in time-lapse microscopy movies.

    Science.gov (United States)

    Klein, Johannes; Leupold, Stefan; Biegler, Ilona; Biedendieck, Rebekka; Münch, Richard; Jahn, Dieter

    2012-09-01

    Time-lapse imaging in combination with fluorescence microscopy techniques enables the investigation of gene regulatory circuits and has uncovered phenomena like culture heterogeneity. In this context, computational image processing for the analysis of single cell behaviour plays an increasing role in systems biology and mathematical modelling approaches. Consequently, we developed a software package with a graphical user interface for the analysis of single bacterial cell behaviour. The new software, TLM-Tracker, allows flexible and user-friendly segmentation, tracking and lineage analysis of microbial cells in time-lapse movies. The software package, including manual, tutorial video and examples, is available as Matlab code or executable binaries at http://www.tlmtracker.tu-bs.de.
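
    A minimal version of the segmentation stage (threshold, clean up, label cells) can be sketched with scikit-image; this is a generic single-frame illustration of the technique, not TLM-Tracker's actual Matlab implementation.

        # Generic single-frame cell segmentation sketch (threshold + labeling),
        # illustrating the first stage of a tracking pipeline.
        import numpy as np
        from skimage import filters, measure, morphology

        def segment_frame(frame):
            """frame: 2-D grayscale image as a NumPy array."""
            thresh = filters.threshold_otsu(frame)           # global threshold
            mask = frame > thresh
            mask = morphology.remove_small_objects(mask, min_size=30)
            labels = measure.label(mask)                     # connected components
            return labels, measure.regionprops(labels)       # per-cell properties

        # Example with a synthetic image containing two bright blobs:
        img = np.zeros((64, 64))
        img[10:20, 10:20] = 1.0
        img[40:55, 30:45] = 1.0
        labels, props = segment_frame(img)
        print(f"{labels.max()} cells, areas: {[p.area for p in props]}")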

  19. ANALYSIS OF CELLULAR REACTION TO IFN-γ STIMULATION BY A SOFTWARE PACKAGE GeneExpressionAnalyser

    Directory of Open Access Journals (Sweden)

    A. V. Saetchnikov

    2014-01-01

    Full Text Available The software package GeneExpressionAnalyser for the analysis of DNA microarray experimental data has been developed. The algorithms for data analysis, differentially expressed genes and biological functions of the cell are described. The efficiency of the developed package is tested on published experimental data devoted to time-course research of changes in human melanoma cells under the influence of IFN-γ. The developed software has a number of advantages over existing software: it is free, has a simple and intuitive graphical interface, allows the analysis of different types of DNA microarrays, contains a set of methods for complete data analysis and performs effective gene annotation for a selected list of genes.

  20. Power Analysis Software for Educational Researchers

    Science.gov (United States)

    Peng, Chao-Ying Joanne; Long, Haiying; Abaci, Serdar

    2012-01-01

    Given the importance of statistical power analysis in quantitative research and the repeated emphasis on it by American Educational Research Association/American Psychological Association journals, the authors examined the reporting practice of power analysis by the quantitative studies published in 12 education/psychology journals between 2005…
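
    For readers who want to run such a power analysis directly, the statsmodels package exposes the standard computations; the sketch below solves for the per-group sample size of a two-sample t-test under conventional, illustrative choices of effect size, alpha and power.

        # A priori power analysis for a two-sample t-test using statsmodels.
        from statsmodels.stats.power import TTestIndPower

        analysis = TTestIndPower()
        n_per_group = analysis.solve_power(effect_size=0.5,  # Cohen's d (medium)
                                           alpha=0.05,
                                           power=0.80,
                                           alternative="two-sided")
        print(f"required n per group: {n_per_group:.1f}")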

  1. JEM-X science analysis software

    DEFF Research Database (Denmark)

    Westergaard, Niels Jørgen Stenfeldt; Kretschmar, P.; Oxborrow, Carol Anne

    2003-01-01

    The science analysis of the data from JEM-X on INTEGRAL is performed through a number of levels including corrections, good time selection, imaging and source finding, spectrum and light-curve extraction. These levels consist of individual executables and the running of the complete analysis...

  2. Analysis of Variance: What Is Your Statistical Software Actually Doing?

    Science.gov (United States)

    Li, Jian; Lomax, Richard G.

    2011-01-01

    Users assume statistical software packages produce accurate results. In this article, the authors systematically examined Statistical Package for the Social Sciences (SPSS) and Statistical Analysis System (SAS) for 3 analysis of variance (ANOVA) designs: mixed-effects ANOVA, fixed-effects analysis of covariance (ANCOVA), and nested ANOVA. For each…

  3. R graphics

    CERN Document Server

    Murrell, Paul

    2005-01-01

    R is revolutionizing the world of statistical computing. Powerful, flexible, and best of all free, R is now the program of choice for tens of thousands of statisticians. Destined to become an instant classic, R Graphics presents the first complete, authoritative exposition on the R graphical system. Paul Murrell, widely known as the leading expert on R graphics, has developed an in-depth resource that takes nothing for granted and helps both neophyte and seasoned users master the intricacies of R graphics. After an introductory overview of R graphics facilities, the presentation first focuses

  4. Adapted wavelet analysis from theory to software

    CERN Document Server

    Wickerhauser, Mladen Victor

    1994-01-01

    This detail-oriented text is intended for engineers and applied mathematicians who must write computer programs to perform wavelet and related analysis on real data. It contains an overview of mathematical prerequisites and proceeds to describe hands-on programming techniques to implement special programs for signal analysis and other applications. From the table of contents: - Mathematical Preliminaries - Programming Techniques - The Discrete Fourier Transform - Local Trigonometric Transforms - Quadrature Filters - The Discrete Wavelet Transform - Wavelet Packets - The Best Basis Algorithm - Multidimensional Library Trees - Time-Frequency Analysis - Some Applications - Solutions to Some of the Exercises - List of Symbols - Quadrature Filter Coefficients
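
    In a modern scripting language, the hands-on techniques the book describes take only a few lines; for instance, a multilevel discrete wavelet decomposition and reconstruction with the PyWavelets package (assuming it is installed) might look like this.

        # Discrete wavelet transform round-trip with PyWavelets (pywt).
        # Signal and wavelet choice are illustrative.
        import numpy as np
        import pywt

        t = np.linspace(0, 1, 256)
        signal = (np.sin(2 * np.pi * 5 * t)
                  + 0.2 * np.random.default_rng(0).normal(size=256))

        coeffs = pywt.wavedec(signal, "db4", level=4)    # multilevel DWT
        reconstructed = pywt.waverec(coeffs, "db4")      # inverse transform

        print("max reconstruction error:", np.abs(signal - reconstructed).max())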

  5. Development of Cell Analysis Software for Cultivated Corneal Endothelial Cells.

    Science.gov (United States)

    Okumura, Naoki; Ishida, Naoya; Kakutani, Kazuya; Hongo, Akane; Hiwa, Satoru; Hiroyasu, Tomoyuki; Koizumi, Noriko

    2017-11-01

    To develop analysis software for cultured human corneal endothelial cells (HCECs). Software was designed to recognize cell borders and to provide parameters such as cell density, coefficient of variation, and polygonality of cultured HCECs based on phase contrast images. Cultured HCECs with high or low cell density were incubated with Ca-free and Mg-free phosphate-buffered saline for 10 minutes to reveal the cell borders and were then analyzed with the software (n = 50). Phase contrast images showed that cell borders were not distinctly outlined, but these borders became more distinct after phosphate-buffered saline treatment and were recognized by the cell analysis software. The cell density value provided by the software was similar to that obtained by manual cell counting by an experienced researcher. Morphometric parameters, such as the coefficient of variation and polygonality, were also produced by the software, and these values were significantly correlated with cell density (Pearson correlation coefficients -0.62 and 0.63, respectively). The software described here provides morphometric information from phase contrast images, and it enables objective and noninvasive quality assessment for tissue engineering therapy of the corneal endothelium.

  6. Graphical interpretation of numerical model results

    International Nuclear Information System (INIS)

    Drewes, D.R.

    1979-01-01

    Computer software has been developed to produce high quality graphical displays of data from a numerical grid model. The code uses an existing graphical display package (DISSPLA) and overcomes some of the problems of both line-printer output and traditional graphics. The software has been designed to be flexible enough to handle arbitrarily placed computation grids and a variety of display requirements

  7. UNCERT: geostatistics, uncertainty analysis and visualization software applied to groundwater flow and contaminant transport modeling

    International Nuclear Information System (INIS)

    Wingle, W.L.; Poeter, E.P.; McKenna, S.A.

    1999-01-01

    UNCERT is a 2D and 3D geostatistics, uncertainty analysis and visualization software package applied to ground water flow and contaminant transport modeling. It is a collection of modules that provides tools for linear regression, univariate statistics, semivariogram analysis, inverse-distance gridding, trend-surface analysis, simple and ordinary kriging and discrete conditional indicator simulation. Graphical user interfaces for MODFLOW and MT3D, ground water flow and contaminant transport models, are provided for streamlined data input and result analysis. Visualization tools are included for displaying data input and output. These include, but are not limited to, 2D and 3D scatter plots, histograms, box and whisker plots, 2D contour maps, surface renderings of 2D gridded data and 3D views of gridded data. By design, UNCERT's graphical user interface and visualization tools facilitate model design and analysis. There are few built in restrictions on data set sizes and each module (with two exceptions) can be run in either graphical or batch mode. UNCERT is in the public domain and is available from the World Wide Web with complete on-line and printable (PDF) documentation. UNCERT is written in ANSI-C with a small amount of FORTRAN77, for UNIX workstations running X-Windows and Motif (or Lesstif). This article discusses the features of each module and demonstrates how they can be used individually and in combination. The tools are applicable to a wide range of fields and are currently used by researchers in the ground water, mining, mathematics, chemistry and geophysics, to name a few disciplines. (Copyright (c) 1999 Elsevier Science B.V., Amsterdam. All rights reserved.)
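
    Of the geostatistical modules listed, the semivariogram is the simplest to show concretely; an empirical (isotropic) semivariogram estimator in plain Python/NumPy, with synthetic data standing in for real field measurements, is sketched below.

        # Empirical isotropic semivariogram estimator, the building block
        # behind modules such as UNCERT's semivariogram analysis.
        import numpy as np

        rng = np.random.default_rng(1)
        coords = rng.uniform(0, 100, size=(200, 2))     # sample locations
        values = np.sin(coords[:, 0] / 20) + 0.1 * rng.normal(size=200)

        def semivariogram(coords, values, lags):
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            sq = 0.5 * (values[:, None] - values[None, :]) ** 2
            gamma = []
            for lo, hi in zip(lags[:-1], lags[1:]):     # bin pairs by distance
                mask = (d >= lo) & (d < hi) & (d > 0)
                gamma.append(sq[mask].mean() if mask.any() else np.nan)
            return np.array(gamma)

        lags = np.linspace(0, 50, 11)
        print(semivariogram(coords, values, lags).round(3))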

  8. Integrated structural analysis tool using the linear matching method part 1 – Software development

    International Nuclear Information System (INIS)

    Ure, James; Chen, Haofeng; Tipping, David

    2014-01-01

    A number of direct methods based upon the Linear Matching Method (LMM) framework have been developed to address structural integrity issues for components subjected to cyclic thermal and mechanical load conditions. This paper presents a new integrated structural analysis tool using the LMM framework for the assessment of load carrying capacity, shakedown limit, ratchet limit and steady state cyclic response of structures. First, the development of the LMM for the evaluation of design limits in plasticity is introduced. Second, preliminary considerations for the development of the LMM into a tool which can be used on a regular basis by engineers are discussed. After the re-structuring of the LMM subroutines for multiple central processing unit (CPU) solution, the LMM software tool for the assessment of design limits in plasticity is implemented by developing an Abaqus CAE plug-in with graphical user interfaces. Further demonstration of this new LMM analysis tool including practical application and verification is presented in an accompanying paper. - Highlights: • A new structural analysis tool using the Linear Matching Method (LMM) is developed. • The software tool is able to evaluate the design limits in plasticity. • Able to assess limit load, shakedown, ratchet limit and steady state cyclic response. • Re-structuring of the LMM subroutines for multiple CPU solution is conducted. • The software tool is implemented by developing an Abaqus CAE plug-in with GUI

  9. MAPPIX: A software package for off-line micro-pixe single particle aerosol analysis

    International Nuclear Information System (INIS)

    Ceccato, D.

    2009-01-01

    In the framework of a multiannual experiment performed at Baia Terra Nova, Antarctica, size-segregated aerosol samples were collected using a 12-stage SDI impactor (Hillamo design). Approximately 2800 particles, belonging to the first four supermicrometric SDI stages - 8.39, 4.08, 2.68, and 1.66 μm aerodynamic diameter cuts - were analyzed at the INFN-LNL micro-PIXE facility, a three-lens Oxford Microprobe (OM) product installed in the early nineties. Four regions on each of the 12 sub-samples were measured; 60 aerosol particles were detected on average in each analyzed region. The off-line single aerosol particle (SAP) analysis of such a large amount of data required software able to rapidly handle the acquired data, with a simple and fast area selection procedure; subsequent automated PIXE spectra analysis with a specialized code was also needed. The MAPPIX 2.0 software was designed to make the user's job easier and faster during SAP analysis. The package is composed of two separate routines: the first is devoted to data format conversion (OM-LMF file format to MAPPIX format), while the second is devoted to micro-PIXE map graphical presentation and the aerosol particle selection procedure. The MAPPIX data format and software features are discussed, and a short report of the speed performance is presented.

  10. Automated Freedom from Interference Analysis for Automotive Software

    OpenAIRE

    Leitner-Fischer , Florian; Leue , Stefan; Liu , Sirui

    2016-01-01

    Freedom from interference for automotive software systems developed according to the ISO 26262 standard means that a fault in a less safety-critical software component will not lead to a fault in a more safety-critical component. It is an important concern in the realm of functional safety for automotive systems. We present an automated method for the analysis of concurrency-related interferences based on the QuantUM approach and tool that we have previously developed....

  11. EDS operator and control software

    International Nuclear Information System (INIS)

    Ott, L.L.

    1985-04-01

    The Enrichment Diagnostic System (EDS) was developed at Lawrence Livermore National Laboratory (LLNL) to acquire, display and analyze large quantities of transient data for a real-time Advanced Vapor Laser Isotope Separation (AVLIS) experiment. Major topics discussed in this paper are the EDS operator interface (SHELL) program, the data acquisition and analysis scheduling software, and the graphics software. The workstation concept used in EDS, the software used to configure a user's workstation, and the ownership and management of a diagnostic are described. An EDS diagnostic is a combination of hardware and software designed to study specific aspects of the process. Overall system performance is discussed from the standpoint of scheduling techniques, evaluation tools, optimization techniques, and program-to-program communication methods. EDS is based on a data driven design which keeps the need to modify software to a minimum. This design requires a fast and reliable data base management system. A third party data base management product, Berkeley Software System Database, written explicitly for HP1000's, is used for all EDS data bases. All graphics is done with an in-house graphics product, Device Independent Graphics Library (DIGLIB). Examples of devices supported by DIGLIB are: Versatec printer/plotters, Raster Technologies Graphic Display Controllers, and HP terminals (HP264x and HP262x). The benefits derived by using HP hardware and software as well as obstacles imposed by the HP environment are presented in relation to EDS development and implementation

  12. FTA, Fault Tree Analysis for Minimal Cut Sets, Graphics for CALCOMP

    International Nuclear Information System (INIS)

    Van Slyke, W.J.; Griffing, D.E.; Diven, J.

    1978-01-01

    1 - Description of problem or function: The FTA (Fault Tree Analysis) system was designed to predict probabilities of the modes of failure for complex systems and to graphically present the structure of systems. There are three programs in the system. Program ALLCUTS performs the calculations. Program KILMER constructs a CalComp plot file of the system fault tree. Program BRANCH builds a cross-reference list of the system fault tree. 2 - Method of solution: ALLCUTS employs a top-down set expansion algorithm to find fault tree cut-sets and then optionally calculates their probability using a currently accepted cut-set quantification method. The methodology is adapted from that in WASH-1400 (draft), August 1974. 3 - Restrictions on the complexity of the problem: Maxima of: 175 basic events, 425 rate events. ALLCUTS may be expanded to solve larger problems depending on available core memory
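
    Once minimal cut sets are known, the quantification step used by programs like ALLCUTS reduces to combining basic-event probabilities; the Python sketch below shows the common rare-event and min-cut upper-bound approximations for a hypothetical set of cut sets, as a generic illustration of the method rather than ALLCUTS itself.

        # Quantifying a fault tree's top-event probability from minimal cut sets.
        # Basic-event probabilities and cut sets are hypothetical.
        from math import prod

        p = {"A": 1e-3, "B": 5e-4, "C": 2e-3, "D": 1e-4}
        min_cut_sets = [["A", "B"], ["C"], ["A", "D"]]

        cut_probs = [prod(p[e] for e in cs) for cs in min_cut_sets]

        rare_event = sum(cut_probs)                       # first-order approximation
        upper_bound = 1 - prod(1 - q for q in cut_probs)  # min-cut upper bound

        print(f"rare-event approx:   {rare_event:.3e}")
        print(f"min-cut upper bound: {upper_bound:.3e}")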

  13. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety critical software, in order to identify the state of the art in this field and to give the software safety engineer a trail map between the codes-and-standards layer and the design methodology and documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures, high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines.

  14. Software safety analysis techniques for developing safety critical software in the digital protection system of the LMR

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Cheon, Se Woo; Kim, Chang Hoi; Sim, Yun Sub

    2001-02-01

    This report describes software safety analysis techniques and engineering guidelines for developing safety critical software, in order to identify the state of the art in this field and to give the software safety engineer a trail map between the codes-and-standards layer and the design methodology and documents layer. We have surveyed the management aspects of software safety activities during the software lifecycle in order to improve safety. After identifying the conventional safety analysis techniques for systems, we have surveyed in detail the software safety analysis techniques: software FMEA (Failure Mode and Effects Analysis), software HAZOP (Hazard and Operability Analysis), and software FTA (Fault Tree Analysis). We have also surveyed the state of the art in software reliability assessment techniques. The most important results from the reliability techniques are not the specific probability numbers generated, but the insights into the risk importance of software features. To defend against potential common-mode failures, high quality, defense-in-depth, and diversity are considered to be key elements in digital I and C system design. To minimize the possibility of CMFs and thus increase plant reliability, we have provided defense-in-depth and diversity (D-in-D&D) analysis guidelines.

  15. The IGUANA interactive graphics toolkit with examples from CMS and D0

    International Nuclear Information System (INIS)

    Alverson, G.; Osborne, I.; Taylor, L.; Tuura, L.

    2001-01-01

    IGUANA (Interactive Graphics for User ANAlysis) is a C++ toolkit for developing graphical user interfaces and high performance 2-D and 3-D graphics applications, such as data browsers and detector and event visualisation programs. The IGUANA strategy is to use freely available software (e.g. Qt, SoQt, OpenInventor, OpenGL, HEPVis) and package and extend it to provide a general-purpose and experiment-independent toolkit. The authors describe the evaluation and choices of publicly available GUI/graphics software and the additional functionality currently provided by IGUANA. The authors demonstrate the use of IGUANA with several applications built for CMS and D0

  16. LANDSAFE: LANDING SITE RISK ANALYSIS SOFTWARE FRAMEWORK

    OpenAIRE

    Schmidt, Ralph; Bostelmann, Jonas; Cornet, Yves; Heipke, Christian; Philippe, Christian; Poncelet, Nadia; de Rosa, Diego; Vandeloise, Yannick

    2012-01-01

    The European Space Agency (ESA) is planning a Lunar Lander mission in the 2018 timeframe that will demonstrate precise soft landing at the polar regions of the Moon. To ensure a safe and successful landing a careful risk analysis has to be carried out. This is comprised of identifying favorable target areas and evaluating the surface conditions in these areas. Features like craters, boulders, steep slopes, rough surfaces and shadow areas have to be identified in order to assess the risk assoc...

  17. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Full text: Following the massive adoption of digital instrumentation and control (I and C) systems in nuclear power plants (NPPs), various software safety analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and thereby to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination becomes more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination to perform the analysis on the basis of available resources. This research evaluated the software safety analysis techniques applicable nowadays, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts choose the best SSA combination and arrange their own software safety plans. With this proposed method, analysts can evaluate various SSA combinations for specific purposes. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive, owing to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for understanding the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio; however, their disadvantages are completeness and complexity

  18. How to do Meta-Analysis using HLM software

    OpenAIRE

    Petscher, Yaacov

    2013-01-01

    This is a step-by-step presentation of how to run a meta-analysis using HLM software. Because it is a variance-known model, it is not run through the GUI but in batch mode. These slides show how to prepare the data and run the analysis.

  19. A relational approach to support software architecture analysis

    NARCIS (Netherlands)

    Feijs, L.M.G.; Krikhaar, R.L.; van Ommering, R.C.

    1998-01-01

    This paper reports on our experience with a relational approach to support the analysis of existing software architectures. The analysis options provide for visualization and view calculation. The approach has been applied for reverse engineering. It is also possible to check concrete designs

  20. SEDA: A software package for the Statistical Earthquake Data Analysis

    Science.gov (United States)

    Lombardi, A. M.

    2017-03-01

    In this paper, the first version of the software SEDA (SEDAv1.0), designed to help seismologists statistically analyze earthquake data, is presented. The package consists of a user-friendly Matlab-based interface, which allows the user to easily interact with the application, and a computational core of Fortran codes, to guarantee the maximum speed. The primary factor driving the development of SEDA is to guarantee the research reproducibility, which is a growing movement among scientists and highly recommended by the most important scientific journals. SEDAv1.0 is mainly devoted to produce accurate and fast outputs. Less care has been taken for the graphic appeal, which will be improved in the future. The main part of SEDAv1.0 is devoted to the ETAS modeling. SEDAv1.0 contains a set of consistent tools on ETAS, allowing the estimation of parameters, the testing of model on data, the simulation of catalogs, the identification of sequences and forecasts calculation. The peculiarities of routines inside SEDAv1.0 are discussed in this paper. More specific details on the software are presented in the manual accompanying the program package.
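
    For orientation, the ETAS conditional intensity that such tools estimate is commonly written as below; this is the standard parameterization from the ETAS literature, not notation taken from the SEDA manual:

        \lambda(t \mid \mathcal{H}_t) = \mu + \sum_{t_i < t} \frac{K\, e^{\alpha\,(m_i - m_0)}}{(t - t_i + c)^{p}}

    where mu is the background rate, the sum runs over past events of magnitude m_i above the threshold m_0, and K, alpha, c, p are the aftershock-productivity and Omori-decay parameters.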

  1. Confirmatory Factor Analysis Alternative: Free, Accessible CBID Software.

    Science.gov (United States)

    Bott, Marjorie; Karanevich, Alex G; Garrard, Lili; Price, Larry R; Mudaranthakam, Dinesh Pal; Gajewski, Byron

    2018-02-01

    New software that performs Classical and Bayesian Instrument Development (CBID) is reported that seamlessly integrates expert (content validity) and participant data (construct validity) to produce entire reliability estimates with smaller sample requirements. The free CBID software can be accessed through a website and used by clinical investigators in new instrument development. Demonstrations are presented of the three approaches using the CBID software: (a) traditional confirmatory factor analysis (CFA), (b) Bayesian CFA using flat uninformative prior, and (c) Bayesian CFA using content expert data (informative prior). Outcomes of usability testing demonstrate the need to make the user-friendly, free CBID software available to interdisciplinary researchers. CBID has the potential to be a new and expeditious method for instrument development, adding to our current measurement toolbox. This allows for the development of new instruments for measuring determinants of health in smaller diverse populations or populations of rare diseases.

  2. A 'Toolbox' Equivalent Process for Safety Analysis Software

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Eng, Tony

    2004-01-01

    Defense Nuclear Facilities Safety Board (DNFSB) Recommendation 2002-1 (Quality Assurance for Safety-Related Software) identified a number of quality assurance issues on the use of software in Department of Energy (DOE) facilities for analyzing hazards, and designing and operating controls that prevent or mitigate potential accidents. The development and maintenance of a collection, or 'toolbox', of multiple-site use, standard solution, Software Quality Assurance (SQA)-compliant safety software is one of the major improvements identified in the associated DOE Implementation Plan (IP). The DOE safety analysis toolbox will contain a set of appropriately quality-assured, configuration-controlled safety analysis codes, recognized for DOE-wide safety basis applications. Currently, six widely applied safety analysis computer codes have been designated for toolbox consideration. While the toolbox concept considerably reduces SQA burdens among DOE users of these codes, many users of unique, single-purpose, or single-site software may still have sufficient technical justification to continue use of their computer code of choice, but are thwarted by the multiple-site condition on toolbox candidate software. The process discussed here provides a roadmap for an equivalency argument, i.e., establishing satisfactory SQA credentials for single-site software that can be deemed 'toolbox-equivalent'. The process is based on the model established to meet IP Commitment 4.2.1.2: establish SQA criteria for the safety analysis 'toolbox' codes. The implementing criteria that establish the set of prescriptive SQA requirements are based on implementation plans and procedures from the Savannah River Site, also incorporating aspects of those from the Waste Isolation Pilot Plant (SNL component) and the Yucca Mountain Project. The major requirements are met with evidence of a software quality assurance plan, software requirements and design documentation, user's instructions, and a test report.

  3. Immunogenetic Management Software: a new tool for visualization and analysis of complex immunogenetic datasets.

    Science.gov (United States)

    Johnson, Z P; Eady, R D; Ahmad, S F; Agravat, S; Morris, T; Else, J; Lank, S M; Wiseman, R W; O'Connor, D H; Penedo, M C T; Larsen, C P; Kean, L S

    2012-04-01

    Here we describe the Immunogenetic Management Software (IMS) system, a novel web-based application that permits multiplexed analysis of complex immunogenetic traits that are necessary for the accurate planning and execution of experiments involving large animal models, including nonhuman primates. IMS is capable of housing complex pedigree relationships, microsatellite-based MHC typing data, and MHC pyrosequencing expression analysis of class I alleles. It includes a novel, automated MHC haplotype naming algorithm and an innovative visualization protocol that allows users to view multiple familial and MHC haplotype relationships through a single, interactive graphical interface. Detailed DNA- and RNA-based data can also be queried and analyzed in a highly accessible fashion, and flexible search capabilities allow experimental choices to be made based on multiple, individualized, and expandable immunogenetic factors. This web application is implemented in Java, MySQL, Tomcat, and Apache, with supported browsers including Internet Explorer and Firefox on Windows and Safari on Mac OS. The software is freely available for distribution to noncommercial users by contacting Leslie.kean@emory.edu. A demonstration site for the software is available at http://typing.emory.edu/typing_demo , user name: imsdemo7@gmail.com and password: imsdemo.

  4. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    Science.gov (United States)

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution is software that runs on a user's laptop or workstation, can access all image settings, and provides quick, easy-to-use analysis of the data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed in Java (Sun Microsystems). Briefly, the IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.

  5. FORECAST: Regulatory effects cost analysis software annual

    International Nuclear Information System (INIS)

    Lopez, B.; Sciacca, F.W.

    1991-11-01

    Over the past several years the NRC has developed a generic cost methodology for the quantification of cost/economic impacts associated with a wide range of new or revised regulatory requirements. This methodology has been developed to aid the NRC in preparing Regulatory Impact Analyses (RIAs). These generic costing methods can be useful in quantifying impacts both to industry and to the NRC. The FORECAST program was developed to facilitate the use of the generic costing methodology. This PC program integrates the major cost considerations that may be required because of a regulatory change. FORECAST automates much of the calculation typically needed in an RIA and thus reduces the time and labor required to perform these analyses. More importantly, its integrated and consistent treatment of the different cost elements should help assure comprehensiveness, uniformity, and accuracy in the preparation of needed cost estimates.

  6. Equipment Obsolescence Analysis and Management Software

    Energy Technology Data Exchange (ETDEWEB)

    Redmond, J.; Carret, L.; Shaon, S.; Schultz, C.

    2015-07-01

    The procurement engineering resources at Nuclear Power Plants (NPPs) are experiencing an increasing backlog of procurement items, primarily due to the inability to order the original replacement parts. The level of effort and time required to prepare procurement packages is growing because the number of obsolete parts is increasing exponentially. Procurement packages for obsolete components and parts are much more complex and take more time to prepare because of the need to perform equivalency evaluations, develop testing requirements and test acceptance criteria, carry out commercial grade dedication or equipment qualification, and make increasing efforts to verify that no fraudulent or counterfeit parts are procured. This problem will be further compounded when NPPs pursue license renewal and approval for plant-life extension. Advanced planning and advance knowledge of equipment obsolescence are required to allow sufficient time to properly procure replacement parts for obsolete items. Uncertain supply chain capability due to obsolescence is a real problem and poses a risk to reliable plant operations due to the potential lack of available spare parts and replacement components to support outages and unplanned component failures. Advance notification of obsolescence is increasingly important to ensure that adequate time and planning are scheduled to procure the proper replacement parts. A thorough analysis of Original Equipment Manufacturer (OEM) availability and inventory, as well as an analysis of failure rates and usage rates, is required to predict critical part needs and to allow early identification of obsolescence issues so that a planned and controlled strategy to qualify replacement equipment can be implemented. (Author)

  7. Thin-plate spline graphical analysis of the mandible in mandibular prognathism.

    Science.gov (United States)

    Chang, Hsin-Fu; Chang, Hong-Po; Liu, Pao-Hsin; Chang, Chih-Han

    2002-11-01

    The chin cup has been used to treat skeletal mandibular prognathism in growing patients for 200 years. The pull on the orthopedic-force chin cup is oriented along a line from the mandibular symphysis to the mandibular condyle. Various levels of success have been reported with this restraining device. The vertical chin cup produces strong vertical compression stress on the maxillary molar regions when the direction of traction is 20 degrees more vertical than the chin-condyle line. This treatment strategy may prevent relapse due to counter-clockwise rotation of the mandible. In this report, we describe a new strategy for using chin-cup therapy involving thin-plate spline (TPS) analysis of lateral cephalometric roentgenograms to visualize transformation of the mandible. The actual sites of mandibular skeletal change are not detectable with conventional cephalometric analysis. A case of mandibular prognathism treated with a chin cup and a case of dental Class III malocclusion without orthodontic treatment are described. The case analysis illustrates that specific patterns of mandibular transformation are associated with Class III malocclusion with or without orthopedic therapy, and that visualization of these deformations is feasible using TPS graphical analysis.
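
    For readers unfamiliar with TPS, the sketch below shows the core of such a graphical analysis using SciPy's thin-plate radial basis functions; the landmark coordinates are hypothetical stand-ins for digitized cephalometric landmarks, and the grid warp is what visualizes local mandibular transformation.

    ```python
    import numpy as np
    from scipy.interpolate import Rbf

    # Hypothetical "before" and "after" landmark configurations.
    src = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.5, 0.5]])
    dst = np.array([[0.0, 0.0], [1.1, 0.1], [1.0, 1.0], [0.0, 0.9], [0.6, 0.5]])

    # One thin-plate spline per output coordinate (kernel U(r) = r^2 log r).
    fx = Rbf(src[:, 0], src[:, 1], dst[:, 0], function='thin_plate')
    fy = Rbf(src[:, 0], src[:, 1], dst[:, 1], function='thin_plate')

    # Passing a regular grid through (fx, fy) deforms it; plotting the warped
    # grid is the classic TPS visualization of local shape change.
    gx, gy = np.meshgrid(np.linspace(0, 1, 5), np.linspace(0, 1, 5))
    warped_x, warped_y = fx(gx, gy), fy(gx, gy)
    ```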

  8. Development and validation of MIX: comprehensive free software for meta-analysis of causal research data

    Directory of Open Access Journals (Sweden)

    Ikeda Noriaki

    2006-10-01

    Full Text Available Abstract Background Meta-analysis has become a well-known method for synthesis of quantitative data from previously conducted research in applied health sciences. So far, meta-analysis has been particularly useful in evaluating and comparing therapies and in assessing causes of disease. Consequently, the number of software packages that can perform meta-analysis has increased over the years. Unfortunately, it can take a substantial amount of time to get acquainted with some of these programs, and most contain little or no interactive educational material. We set out to create and validate an easy-to-use and comprehensive meta-analysis package that would be simple enough programming-wise to remain available as a free download. We specifically aimed at students and researchers who are new to meta-analysis, with important parts of the development oriented towards creating internal interactive tutoring tools and designing features that would facilitate usage of the software as a companion to existing books on meta-analysis. Results We took an unconventional approach and created a program that uses Excel as a calculation and programming platform. The main programming language was Visual Basic, as implemented in Visual Basic 6 and Visual Basic for Applications in Excel 2000 and higher. The development took approximately two years and resulted in the 'MIX' program, which can be downloaded from the program's website free of charge. Next, we set out to validate the MIX output with two major software packages as reference standards, namely STATA (metan, metabias, and metatrim) and Comprehensive Meta-Analysis Version 2. Eight meta-analyses that had been published in major journals were used as data sources. All numerical and graphical results from analyses with MIX were identical to their counterparts in STATA and CMA. The MIX program distinguishes itself from most other programs by its extensive graphical output, its click-and-go Excel interface, and its interactive educational material.
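
    The pooled estimate at the heart of such a meta-analysis package is an inverse-variance weighted average. A minimal fixed-effect sketch follows; it is not MIX's actual code (which is Visual Basic running in Excel), just the standard calculation.

    ```python
    import numpy as np

    def fixed_effect_meta(effects, variances):
        """Inverse-variance weighted pooled effect (fixed-effect model)."""
        e = np.asarray(effects, float)
        w = 1.0 / np.asarray(variances, float)   # study weights
        pooled = np.sum(w * e) / np.sum(w)
        se = np.sqrt(1.0 / np.sum(w))            # standard error of the pool
        q = np.sum(w * (e - pooled) ** 2)        # Cochran's Q heterogeneity
        return pooled, se, q

    # Example: three hypothetical study effects with their variances.
    print(fixed_effect_meta([0.30, 0.45, 0.20], [0.04, 0.09, 0.02]))
    ```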

  9. Microcomputer Simulated CAD for Engineering Graphics.

    Science.gov (United States)

    Huggins, David L.; Myers, Roy E.

    1983-01-01

    Describes a simulated computer-aided design (CAD) program at The Pennsylvania State University. Rationale for the program, facilities, microcomputer equipment (Apple) used, and development of a software package for simulating applied engineering graphics are considered. (JN)

  10. A new graphic plot analysis for determination of neuroreceptor binding in positron emission tomography studies.

    Science.gov (United States)

    Ito, Hiroshi; Yokoi, Takashi; Ikoma, Yoko; Shidahara, Miho; Seki, Chie; Naganawa, Mika; Takahashi, Hidehiko; Takano, Harumasa; Kimura, Yuichi; Ichise, Masanori; Suhara, Tetsuya

    2010-01-01

    In positron emission tomography (PET) studies with radioligands for neuroreceptors, tracer kinetics have been described by the standard two-tissue compartment model that includes the compartments of nondisplaceable binding and specific binding to receptors. In the present study, we have developed a new graphic plot analysis to determine the total distribution volume (V(T)) and the nondisplaceable distribution volume (V(ND)) independently, and therefore the binding potential (BP(ND)). In this plot, Y(t) is the ratio of brain tissue activity to the time-integrated arterial input function, and X(t) is the ratio of time-integrated brain tissue activity to the time-integrated arterial input function. The x-intercept of the linear regression of the plots for the early phase represents V(ND), and the x-intercept of the linear regression of the plots for the delayed phase after the equilibrium time represents V(T). BP(ND) can be calculated as BP(ND) = V(T)/V(ND) - 1. Dynamic PET scanning with measurement of the arterial input function was performed on six healthy men after intravenous rapid bolus injection of [(11)C]FLB457. The plot yielded a curve in regions with specific binding, while it yielded a straight line through all plot data in regions with no specific binding. V(ND), V(T), and BP(ND) values calculated by the present method were in good agreement with those obtained by the conventional nonlinear least-squares fitting procedure. This method can be used to distinguish graphically whether the radioligand binding includes specific binding or not.
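
    A minimal numerical sketch of this plot analysis might look as follows; the function and variable names are illustrative, not taken from the paper.

    ```python
    import numpy as np

    def graphic_plot(t, c_tissue, c_plasma, early_mask, late_mask):
        """t: mid-frame times; c_tissue: tissue activity; c_plasma: arterial
        input; early_mask/late_mask: boolean arrays choosing the two phases."""
        int_cp = np.array([np.trapz(c_plasma[:i + 1], t[:i + 1]) for i in range(len(t))])
        int_ct = np.array([np.trapz(c_tissue[:i + 1], t[:i + 1]) for i in range(len(t))])
        ok = int_cp > 0                              # skip frames with zero integral
        y = c_tissue / np.where(ok, int_cp, 1.0)     # Y(t)
        x = int_ct / np.where(ok, int_cp, 1.0)       # X(t)

        def x_intercept(mask):
            m = mask & ok
            slope, intercept = np.polyfit(x[m], y[m], 1)
            return -intercept / slope                # where the fit crosses y = 0

        v_nd = x_intercept(early_mask)               # early phase   -> V(ND)
        v_t = x_intercept(late_mask)                 # delayed phase -> V(T)
        return v_nd, v_t, v_t / v_nd - 1.0           # BP(ND) = V(T)/V(ND) - 1
    ```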

  11. The software analysis project for the Office of Human Resources

    Science.gov (United States)

    Tureman, Robert L., Jr.

    1994-01-01

    There were two major sections of the project for the Office of Human Resources (OHR). The first section was to conduct a planning study to analyze software use with the goal of recommending software purchases and determining whether the need exists for a file server. The second section was analysis and distribution planning for a retirement planning computer program entitled VISION, provided by NASA Headquarters. The software planning study was developed to help OHR analyze the current administrative desktop computing environment and make decisions regarding software acquisition and implementation. There were three major areas addressed by the study: the current environment, new software requirements, and strategies regarding the implementation of a server in the Office. To gather data on the current environment, employees were surveyed and an inventory of computers was produced. The surveys were compiled and analyzed by the ASEE fellow with interpretation help from OHR staff. New software requirements represented a compilation and analysis of the surveyed requests of OHR personnel. Finally, the information on the use of a server represents research done by the ASEE fellow and analysis of survey data to determine software requirements for a server. This included selection of a methodology to estimate the number of copies of each software program required given current use and estimated growth. The report presents the results of the computing survey, a description of the current computing environment, recommendations for changes in the computing environment, current software needs, management advantages of using a server, and management considerations in the implementation of a server. In addition, detailed specifications were presented for the hardware and software recommendations to offer a complete picture to OHR management. The retirement planning computer program available to NASA employees will aid in long-range retirement planning. The intended audience is the NASA civil servant.

  12. Attitudes and Motivation of Students in an Introductory Technical Graphics Course: A Meta-Analysis Study

    Science.gov (United States)

    Ernst, Jeremy V.; Clark, Aaron C.

    2012-01-01

    Students in introductory engineering graphics courses at North Carolina State University (NCSU) were asked to complete surveys to help educators and administrators understand their attitudes toward learning and their motivation to learn. Analyses of the completed surveys provided the Graphic Communications Program at NCSU with an understanding of…

  13. Science Textbooks' Use of Graphical Representation: A Descriptive Analysis of Four Sixth Grade Science Texts

    Science.gov (United States)

    Slough, Scott W.; McTigue, Erin M.; Kim, Suyeon; Jennings, Susan K.

    2010-01-01

    Middle school teachers tend to rely heavily on texts that have become increasing more visual. There is little information available about the graphical demands of general middle grades' science texts. The purpose of this study was to quantify the type and quality of the graphical representations and how they interacted with the textual material in…

  14. The use of computer graphics in the visual analysis of the proposed Sunshine Ski Area expansion

    Science.gov (United States)

    Mark Angelo

    1979-01-01

    This paper describes the use of computer graphics in designing part of the Sunshine Ski Area in Banff National Park. The program used was capable of generating perspective landscape drawings from a number of different viewpoints. This allowed managers to predict, and subsequently reduce, the adverse visual impacts of ski-run development. Computer graphics have proven,...

  15. Graphical Interfaces for Simulation.

    Science.gov (United States)

    Hollan, J. D.; And Others

    This document presents a discussion of the development of a set of software tools to assist in the construction of interfaces to simulations and real-time systems. Presuppositions to the approach to interface design that was used are surveyed, the tools are described, and the conclusions drawn from these experiences in graphical interface design…

  16. Development of interactive software for fuel management analysis

    International Nuclear Information System (INIS)

    Graves, H.W. Jr.

    1986-01-01

    Electronic computation plays a central part in engineering analysis of all types. Utilization of microcomputers for calculations that were formerly carried out on large mainframe computers presents a unique opportunity to develop software that not only takes advantage of the lower cost of using these machines, but also increases the efficiency of the engineers performing these calculations. This paper reviews the use of electronic computers in engineering analysis, discusses the potential for microcomputer utilization in this area, and describes a series of steps to be followed in software development that can yield significant gains in engineering design efficiency

  17. GENIE: a software package for gene-gene interaction analysis in genetic association studies using multiple GPU or CPU cores

    Directory of Open Access Journals (Sweden)

    Wang Kai

    2011-05-01

    Full Text Available Abstract Background Gene-gene interaction in genetic association studies is computationally intensive when a large number of SNPs are involved. Most of the latest Central Processing Units (CPUs) have multiple cores, whereas Graphics Processing Units (GPUs) have hundreds of cores and have recently been used to implement faster scientific software. However, currently there are no genetic analysis software packages that allow users to fully utilize the computing power of these multi-core devices for genetic interaction analysis of binary traits. Findings Here we present a novel software package, GENIE, which utilizes the power of multiple GPU or CPU processor cores to parallelize the interaction analysis. GENIE reads an entire genetic association study dataset into memory and partitions the dataset into fragments with non-overlapping sets of SNPs. For each fragment, GENIE analyzes (1) the interaction of SNPs within it in parallel, and (2) the interaction between the SNPs of the current fragment and those of the other fragments in parallel. We tested GENIE on a large-scale candidate gene study on high-density lipoprotein cholesterol. Using an NVIDIA Tesla C1060 graphics card, the GPU mode of GENIE achieves a speedup of 27 times over its single-core CPU mode. Conclusions GENIE is open-source, economical, user-friendly, and scalable. Since the computing power and memory capacity of graphics cards are increasing rapidly while their cost is going down, we anticipate that GENIE will achieve greater speedups with faster GPU cards. Documentation, source code, and precompiled binaries can be downloaded from http://www.cceb.upenn.edu/~mli/software/GENIE/.
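
    The fragment-based parallelization strategy can be sketched with Python's multiprocessing module; the interaction statistic below is a deliberately simple placeholder (correlation of the SNP product term with the trait), not GENIE's actual test.

    ```python
    import itertools
    import numpy as np
    from multiprocessing import Pool

    def interaction_stat(args):
        """Placeholder score: |correlation| between the SNP product term and
        the trait. GENIE's real test is a model fit (assumption)."""
        a, b, genotypes, trait = args
        term = genotypes[:, a] * genotypes[:, b]
        if term.std() == 0:
            return a, b, 0.0
        return a, b, abs(np.corrcoef(term, trait)[0, 1])

    def scan_pairs(genotypes, trait, workers=4, fragment=64):
        """Scan all SNP pairs; `fragment` plays the role of GENIE's
        non-overlapping work chunks handed out to parallel workers."""
        pairs = itertools.combinations(range(genotypes.shape[1]), 2)
        jobs = ((a, b, genotypes, trait) for a, b in pairs)
        with Pool(workers) as pool:   # call under `if __name__ == "__main__":`
            return pool.map(interaction_stat, jobs, chunksize=fragment)
    ```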

  18. A software package for biomedical image processing and analysis

    International Nuclear Information System (INIS)

    Goncalves, J.G.M.; Mealha, O.

    1988-01-01

    The decreasing cost of computing power and the introduction of low cost imaging boards justify the increasing number of applications of digital image processing techniques in the area of biomedicine. There is, however, a large software gap to be filled between the application and the equipment. The requirements to bridge this gap are twofold: good knowledge of the hardware provided and its interface to the host computer, and expertise in digital image processing and analysis techniques. A software package incorporating these two requirements was developed using the C programming language, in order to create a user-friendly image processing programming environment. The software package can be considered in two different ways: as a data structure adapted to image processing and analysis, which acts as the backbone and the standard of communication for all the software; and as a set of routines implementing the basic algorithms used in image processing and analysis. Hardware dependency is restricted to a single module upon which all hardware calls are based. The data structure that was built has four main features: it is hierarchical, open, object oriented, and has object-dependent dimensions. Considering the vast amount of memory needed by imaging applications and the memory available in small imaging systems, an effective image memory management scheme was implemented. This software package has been in use for more than one and a half years by users with different applications. It proved to be an excellent tool for helping people get adapted to the system, and for standardizing and exchanging software, yet preserving flexibility and allowing for users' specific implementations. The philosophy of the software package is discussed and the data structure that was built is described in detail.

  19. Using MATLAB software with Tomcat server and Java platform for remote image analysis in pathology.

    Science.gov (United States)

    Markiewicz, Tomasz

    2011-03-30

    Matlab is one of the most advanced development tools for applications in engineering practice. From our point of view the most important part is the image processing toolbox, offering many built-in functions, including mathematical morphology, as well as implementations of many artificial neural networks. It is a very popular platform for the creation of specialized programs for image analysis, including in pathology. Based on the latest version of the Matlab Builder Java toolbox, it is possible to create software serving as a remote system for image analysis in pathology via internet communication. The internet platform can be realized with JavaServer Pages (JSP) and a Tomcat server as the servlet container. In the presented software implementation we propose remote image analysis realized by Matlab algorithms. These algorithms can be compiled to an executable jar file with the help of the Matlab Builder Java toolbox. The Matlab function must be declared with the set of input data, an output structure with numerical results, and a Matlab web figure. Any function prepared in that manner can be used as a Java function in JSP. The graphical user interface providing the input data and displaying the results (also in graphical form) must be implemented in JSP. Additionally, data storage to a database can be implemented within the algorithm written in Matlab, with the help of the Matlab Database Toolbox, directly alongside the image processing. The complete JSP page can be run by the Tomcat server. The proposed tool for remote image analysis was tested on the Computerized Analysis of Medical Images (CAMI) software developed by the author. The user provides an image and case information (diagnosis, staining, image parameters, etc.). When analysis is initialized, the input data and image are sent to the servlet on Tomcat. When analysis is done, the client obtains the graphical results as an image with the recognized cells marked, together with the quantitative output. Additionally, the results are stored in a database on the server.

  20. Assessing compositional variability through graphical analysis and Bayesian statistical approaches: case studies on transgenic crops.

    Science.gov (United States)

    Harrigan, George G; Harrison, Jay M

    2012-01-01

    New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case-studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.

  1. Applications of the BEam Cross section Analysis Software (BECAS)

    DEFF Research Database (Denmark)

    Blasques, José Pedro Albergaria Amaral; Bitsche, Robert; Fedorov, Vladimir

    2013-01-01

    A newly developed framework is presented for structural design and analysis of long slender beam-like structures, e.g., wind turbine blades. The framework is based on the BEam Cross section Analysis Software – BECAS – a finite element based cross section analysis tool. BECAS is used for the generation of beam finite element models which correctly account for effects stemming from material anisotropy and inhomogeneity in cross sections of arbitrary geometry. This type of modelling approach allows for an accurate yet computationally inexpensive representation of a general class of three-dimensional beam structures.

  2. The HEASARC graphical user interface

    Science.gov (United States)

    White, N.; Barrett, P.; Jacobs, P.; Oneel, B.

    1992-01-01

    An OSF/Motif-based graphical user interface has been developed to facilitate the use of the database and data analysis software packages available from the High Energy Astrophysics Science Archive Research Center (HEASARC). It can also be used as an interface to other, similar, routines. A small number of tables are constructed to specify the possible commands and command parameters for a given set of analysis routines. These tables can be modified by a designer to affect the appearance of the interface screens. They can also be dynamically changed in response to parameter adjustments made while the underlying program is running. Additionally, a communication protocol has been designed so that the interface can operate locally or across a network. It is intended that this software be able to run on a variety of workstations and X terminals.

  3. Graphic Storytelling

    Science.gov (United States)

    Thompson, John

    2009-01-01

    Graphic storytelling is a medium that allows students to make and share stories, while developing their art communication skills. American comics today are more varied in genre, approach, and audience than ever before. When considering the impact of Japanese manga on the youth, graphic storytelling emerges as a powerful player in pop culture. In…

  4. SeDA: A software package for the statistical analysis of the instrument drift

    International Nuclear Information System (INIS)

    Lee, H. J.; Jang, S. C.; Lim, T. J.

    2006-01-01

    The setpoints for safety-related equipment are affected by many sources of uncertainty. ANSI/ISA-S67.04.01-2000 [1] and ISA-RP67.04.02-2000 [2] suggested statistical approaches for ensuring that safety-related instrument setpoints are established and maintained within the technical specification limits [3]. However, Jang et al. [4] indicated that the preceding methodologies for setpoint drift analysis might be insufficient to manage setpoint drift on an instrumentation device and proposed new statistical analysis procedures for the management of setpoint drift, based on plant-specific as-found/as-left data. Although IHPA (Instrument History Performance Analysis) is a widely known commercial software package for analyzing instrument setpoint drift, several steps in the new procedure cannot be performed with it because it is based on the statistical approaches suggested in ANSI/ISA-S67.04.01-2000 [1] and ISA-RP67.04.02-2000 [2]. In this paper we present a software package (SeDA: Setpoint Drift Analysis) that implements the new methodologies and is easy to use, as it is accompanied by powerful graphical tools. (authors)

  5. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    Science.gov (United States)

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and shows the software performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model, developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values were observed to corroborate well with extensive experimental investigations and were found to be consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. Low values of the relative error (RE = 0.09) and high values of Willmott's index of agreement (d = 0.981) reflect a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
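
    For reference, the two agreement measures quoted above can be computed as follows; Willmott's index of agreement is standard, while the exact relative-error definition used by the authors is an assumption (mean absolute relative error is shown).

    ```python
    import numpy as np

    def willmott_d(observed, predicted):
        """Willmott's index of agreement d (1 = perfect agreement)."""
        o = np.asarray(observed, float)
        p = np.asarray(predicted, float)
        o_bar = o.mean()
        return 1.0 - np.sum((o - p) ** 2) / np.sum(
            (np.abs(p - o_bar) + np.abs(o - o_bar)) ** 2)

    def relative_error(observed, predicted):
        """Mean absolute relative error (assumed definition of RE)."""
        o = np.asarray(observed, float)
        p = np.asarray(predicted, float)
        return np.mean(np.abs(o - p) / np.abs(o))
    ```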

  6. Evaluation of peak-fitting software for gamma spectrum analysis

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.; Moralles, Mauricio

    2009-01-01

    In all applications of gamma-ray spectroscopy, one of the most important and delicate parts of the data analysis is the fitting of the gamma-ray spectra, where information such as the number of counts, the position of the centroid, and the width are associated with each peak of each spectrum. There is a wide choice of computer programs that perform this type of analysis, and the ones most commonly used in routine work automatically locate and fit the peaks. The fit can be made in several different ways - the most common are to fit a Gaussian function to each peak or simply to integrate the area under the peak - but some programs go well beyond this and include several small corrections to the simple Gaussian peak function, in order to compensate for secondary effects. In this work several gamma-ray spectroscopy programs are compared in the task of finding and fitting the gamma-ray peaks in spectra taken with standard sources of 137Cs, 60Co, 133Ba, and 152Eu. The results show that all of the automatic programs can be properly used for finding and fitting peaks, with the exception of GammaVision; it was also possible to verify that the automatic peak-fitting programs performed as well as - and sometimes even better than - a manual peak-fitting program. (author)
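
    The operation all of these programs share, fitting a Gaussian plus background to a peak region, can be sketched in a few lines with SciPy; real spectroscopy codes add step and tail corrections on top of this.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gaussian_peak(x, area, centroid, sigma, bg):
        """Gaussian on a flat background (the simplest peak model)."""
        return (area / (sigma * np.sqrt(2 * np.pi))
                * np.exp(-(x - centroid) ** 2 / (2 * sigma ** 2)) + bg)

    def fit_peak(channels, counts, guess):
        """guess = (area, centroid, sigma, background); counts are weighted
        by sqrt(N), the Poisson uncertainty per channel."""
        popt, pcov = curve_fit(gaussian_peak, channels, counts, p0=guess,
                               sigma=np.sqrt(np.maximum(counts, 1)))
        return popt, np.sqrt(np.diag(pcov))   # parameters and 1-sigma errors
    ```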

  7. FIMTrack: An open source tracking and locomotion analysis software for small animals.

    Directory of Open Access Journals (Sweden)

    Benjamin Risse

    2017-05-01

    Full Text Available Imaging and analyzing the locomotion behavior of small animals such as Drosophila larvae or C. elegans worms has become an integral subject of biological research. In the past we have introduced FIM, a novel imaging system capable of extracting high-contrast images. This system, in combination with the associated tracking software FIMTrack, is already used by many groups all over the world. However, so far there has not been an in-depth discussion of the technical aspects. Here we elaborate on the implementation details of FIMTrack and give an in-depth explanation of the algorithms used. Among others, the software offers several tracking strategies to cover a wide range of different model organisms, locomotion types, and camera properties. Furthermore, the software facilitates stimuli-based analysis in combination with built-in manual tracking and correction functionalities. All features are integrated in an easy-to-use graphical user interface. To demonstrate the potential of FIMTrack we provide an evaluation of its accuracy using manually labeled data. The source code is available under the GNU GPLv3 at https://github.com/i-git/FIMTrack and pre-compiled binaries for Windows and Mac are available at http://fim.uni-muenster.de.

  8. Perception in statistical graphics

    Science.gov (United States)

    VanderPlas, Susan Ruth

    There has been quite a bit of research on statistical graphics and visualization, generally focused on new types of graphics, new software to create graphics, interactivity, and usability studies. Our ability to interpret and use statistical graphics hinges on the interface between the graph itself and the brain that perceives and interprets it, and there is substantially less research on the interplay between graph, eye, brain, and mind than is sufficient to understand the nature of these relationships. The goal of the work presented here is to further explore the interplay between a static graph, the translation of that graph from paper to mental representation (the journey from eye to brain), and the mental processes that operate on that graph once it is transferred into memory (mind). Understanding the perception of statistical graphics should allow researchers to create more effective graphs which produce fewer distortions and viewer errors while reducing the cognitive load necessary to understand the information presented in the graph. Taken together, these experiments should lay a foundation for exploring the perception of statistical graphics. There has been considerable research into the accuracy of numerical judgments viewers make from graphs, and these studies are useful, but it is more effective to understand how errors in these judgments occur so that the root cause of the error can be addressed directly. Understanding how visual reasoning relates to the ability to make judgments from graphs allows us to tailor graphics to particular target audiences. In addition, understanding the hierarchy of salient features in statistical graphics allows us to clearly communicate the important message from data or statistical models by constructing graphics which are designed specifically for the perceptual system.

  9. UTOOLS: microcomputer software for spatial analysis and landscape visualization.

    Science.gov (United States)

    Alan A. Ager; Robert J. McGaughey

    1997-01-01

    UTOOLS is a collection of programs designed to integrate various spatial data in a way that allows versatile spatial analysis and visualization. The programs were designed for watershed-scale assessments in which a wide array of resource data must be integrated, analyzed, and interpreted. UTOOLS software combines raster, attribute, and vector data into "spatial databases".

  10. Using Business Analysis Software in a Business Intelligence Course

    Science.gov (United States)

    Elizondo, Juan; Parzinger, Monica J.; Welch, Orion J.

    2011-01-01

    This paper presents an example of a project used in an undergraduate business intelligence class which integrates concepts from statistics, marketing, and information systems disciplines. SAS Enterprise Miner software is used as the foundation for predictive analysis and data mining. The course culminates with a competition and the project is used…

  11. Application of software technology to automatic test data analysis

    Science.gov (United States)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  12. WinDAM C earthen embankment internal erosion analysis software

    Science.gov (United States)

    Two primary causes of dam failure are overtopping and internal erosion. For the purpose of evaluating dam safety for existing and proposed earthen embankment dams, the Windows Dam Analysis Modules C (WinDAM C) software will simulate either internal erosion or erosion resulting from overtopping.

  13. ANALYSIS OF CONTEMPORARY SOFTWARE BEING USED FOR FORWARDING SERVICES

    Directory of Open Access Journals (Sweden)

    Naumov, V.

    2013-01-01

    Full Text Available The role of information technologies in the forwarding services has been specified. The typical structure of the logistic sites providing the search of requests of freight owners and carriers has been described. The analysis of the software for transportation companies was conducted. The perspective directions of improvement of forwarding services process have been revealed.

  14. Toward a Shared Vocabulary for Visual Analysis: An Analytic Toolkit for Deconstructing the Visual Design of Graphic Novels

    Science.gov (United States)

    Connors, Sean P.

    2012-01-01

    Literacy educators might advocate using graphic novels to develop students' visual literacy skills, but teachers who lack a vocabulary for engaging in close analysis of visual texts may be reluctant to teach them. Recognizing this, teacher educators should equip preservice teachers with a vocabulary for analyzing visual texts. This article…

  15. A Knowledge-based Environment for Software Process Performance Analysis

    Directory of Open Access Journals (Sweden)

    Natália Chaves Lessa Schots

    2015-08-01

    Full Text Available Background: Process performance analysis is a key step for implementing continuous improvement in software organizations. However, the knowledge needed to execute such analysis is not trivial, and the person responsible for executing it must be provided with appropriate support. Aim: This paper presents a knowledge-based environment, named SPEAKER, proposed for supporting software organizations during the execution of process performance analysis. SPEAKER comprises a body of knowledge and a set of activities and tasks for software process performance analysis, along with supporting tools for executing these activities and tasks. Method: We conducted an informal literature review and a systematic mapping study, which provided basic requirements for the proposed environment. We implemented the SPEAKER environment by integrating supporting tools for the execution of performance analysis activities and tasks with the knowledge necessary to execute them, in order to meet the variability presented by the characteristics of these activities. Results: In this paper, we describe each SPEAKER module and the individual evaluations of these modules, and also present an example of use showing how the environment can guide the user through a specific performance analysis activity. Conclusion: Although we only conducted individual evaluations of SPEAKER's modules, the example of use indicates the feasibility of the proposed environment. Therefore, the environment as a whole will be further evaluated to verify whether it attains its goal of assisting non-specialists in the execution of process performance analysis.

  16. eRNA: a graphic user interface-based tool optimized for large data analysis from high-throughput RNA sequencing.

    Science.gov (United States)

    Yuan, Tiezheng; Huang, Xiaoyi; Dittmar, Rachel L; Du, Meijun; Kohli, Manish; Boardman, Lisa; Thibodeau, Stephen N; Wang, Liang

    2014-03-05

    RNA sequencing (RNA-seq) is emerging as a critical approach in biological research. However, its high-throughput advantage is significantly limited by the capacity of bioinformatics tools. The research community urgently needs user-friendly tools to efficiently analyze the complicated data generated by high-throughput sequencers. We developed a standalone tool with graphic user interface (GUI)-based analytic modules, known as eRNA. The capacity for parallel processing and sample management facilitates large data analyses by maximizing hardware usage and freeing users from tediously handling sequencing data. The module "miRNA identification" includes GUIs for raw data reading, adapter removal, sequence alignment, and read counting. The module "mRNA identification" includes GUIs for reference sequences, genome mapping, transcript assembling, and differential expression. The module "Target screening" provides expression profiling analyses and graphic visualization. The module "Self-testing" offers the directory setups, sample management, and a check for third-party package dependency. Integration of other GUIs, including Bowtie, miRDeep2, and miRspring, extends the program's functionality. eRNA focuses on the common tools required for the mapping and quantification analysis of miRNA-seq and mRNA-seq data. The software package provides an additional choice for scientists who require a user-friendly computing environment and high-throughput capacity for large data analysis. eRNA is available for free download at https://sourceforge.net/projects/erna/?source=directory.

  17. CDP a graphic system for the interactive simulation and the dynamic analysis of continuous systems

    International Nuclear Information System (INIS)

    Ricci, A.; Teolis, A.

    1973-01-01

    An IBM 2250 graphic system for the interactive simulation of continuous systems is illustrated. Time-dependent quantities can be plotted, or an animated, real or schematic representation of the system being studied can be given.

  18. Synchronized analysis of testbeam data with the Judith software

    CERN Document Server

    McGoldrick, Garrin; Gorišek, Andrej

    2014-01-01

    The Judith software performs pixel detector analysis tasks utilizing two different data streams such as those produced by the reference and tested devices typically found in a testbeam. This software addresses and fixes problems arising from the desynchronization of the two simultaneously triggered data streams by detecting missed triggers in either of the streams. The software can perform all tasks required to generate particle tracks using multiple detector planes: it can align the planes, cluster hits and generate tracks from these clusters. This information can then be used to measure the properties of a particle detector with very fine spatial resolution. It was tested at DESY in the Kartel telescope, a silicon tracking detector, with ATLAS Diamond Beam Monitor modules as a device under test.

  19. One-Click Data Analysis Software for Science Operations

    Science.gov (United States)

    Navarro, Vicente

    2015-12-01

    One of the important activities of ESA Science Operations Centre is to provide Data Analysis Software (DAS) to enable users and scientists to process data further to higher levels. During operations and post-operations, Data Analysis Software (DAS) is fully maintained and updated for new OS and library releases. Nonetheless, once a Mission goes into the "legacy" phase, there are very limited funds and long-term preservation becomes more and more difficult. Building on Virtual Machine (VM), Cloud computing and Software as a Service (SaaS) technologies, this project has aimed at providing long-term preservation of Data Analysis Software for the following missions: - PIA for ISO (1995) - SAS for XMM-Newton (1999) - Hipe for Herschel (2009) - EXIA for EXOSAT (1983) Following goals have guided the architecture: - Support for all operations, post-operations and archive/legacy phases. - Support for local (user's computer) and cloud environments (ESAC-Cloud, Amazon - AWS). - Support for expert users, requiring full capabilities. - Provision of a simple web-based interface. This talk describes the architecture, challenges, results and lessons learnt gathered in this project.

  20. Application of econometric and ecology analysis methods in physics software

    Science.gov (United States)

    Han, Min Cheol; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Sung Hun; Grazia Pia, Maria; Ronchieri, Elisabetta; Saracco, Paolo

    2017-10-01

    Some data analysis methods typically used in econometric studies and in ecology have been evaluated and applied in physics software environments. They concern the evolution of observables through objective identification of change points and trends, and measurements of inequality, diversity, and evenness across a data set. Within each analysis area, various statistical tests and measures have been examined. This conference paper gives a brief overview of some of these methods.
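
    As a flavor of the two analysis areas mentioned, the sketch below implements the simplest change-point criterion (the best two-segment mean split) and one common evenness measure (Pielou's J); the paper's actual tests and measures may differ.

    ```python
    import numpy as np

    def single_change_point(series):
        """Index that best splits the series into two constant-mean segments
        (minimum total squared error) -- the simplest change-point criterion."""
        x = np.asarray(series, float)
        best_k, best_cost = None, np.inf
        for k in range(1, len(x)):
            cost = (((x[:k] - x[:k].mean()) ** 2).sum()
                    + ((x[k:] - x[k:].mean()) ** 2).sum())
            if cost < best_cost:
                best_k, best_cost = k, cost
        return best_k

    def shannon_evenness(counts):
        """Pielou's evenness J = H / ln(S), one common evenness measure."""
        p = np.asarray(counts, float)
        p = p[p > 0] / p.sum()
        return -(p * np.log(p)).sum() / np.log(len(p))
    ```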

  1. Comparison of two three-dimensional cephalometric analysis computer software.

    Science.gov (United States)

    Sawchuk, Dena; Alhadlaq, Adel; Alkhadra, Thamer; Carlyle, Terry D; Kusnoto, Budi; El-Bialy, Tarek

    2014-10-01

    Three-dimensional cephalometric analyses are attracting more attention in orthodontics. The aim of this study was to compare two software packages for evaluating three-dimensional cephalometric analyses of orthodontic treatment outcomes. Twenty cone beam computed tomography images were obtained with the i-CAT® imaging system from patients' records as part of their regular orthodontic records. The images were analyzed using InVivoDental5.0 (Anatomage Inc.) and 3DCeph™ (University of Illinois at Chicago, Chicago, IL, USA) software. Pre- and post-treatment data were analyzed using t-tests. The reliability test using the intraclass correlation coefficient was stronger for InVivoDental5.0 (0.83-0.98) than for 3DCeph™ (0.51-0.90). Paired t-test comparison of the two packages shows no statistically significant difference in the measurements made with them, in either linear or angular measurements. InVivoDental5.0 measurements are more reproducible and user-friendly compared with 3DCeph™, and 3DCeph™ is more time-consuming for performing three-dimensional analysis than InVivoDental5.0.

  2. Diagrams and Relational Maps: The Use of Graphic Elicitation Techniques with Interviewing for Data Collection, Analysis, and Display

    Directory of Open Access Journals (Sweden)

    Andrea J. Copeland PhD

    2012-12-01

    Full Text Available Graphic elicitation techniques, which ask research participants to provide visual data representing personal understandings of concepts, experiences, beliefs, or behaviors, can be especially useful in helping participants to express complex or abstract ideas or opinions. The benefits and drawbacks of using graphic elicitation techniques for data collection, data analysis, and data display in qualitative research studies are analyzed using examples from a research study that employed data matrices and relational maps in conjunction with semi-structured interviews. Results from this analysis demonstrate that the use of these combined techniques for data collection facilitates triangulation and helps to establish internal consistency of data, thereby increasing the trustworthiness of the interpretation of that data and lending support to validity and reliability claims. Findings support the notion that graphic elicitation techniques can be highly useful in qualitative research studies at the data collection, the data analysis, and the data reporting stages. For example, this study found that graphic elicitation techniques are especially useful for eliciting data related to emotions and emotional experiences.

  3. Accelerated fluctuation analysis by graphic cards and complex pattern formation in financial markets

    International Nuclear Information System (INIS)

    Preis, Tobias; Virnau, Peter; Paul, Wolfgang; Schneider, Johannes J

    2009-01-01

    The compute unified device architecture is an almost conventional programming approach for managing computations on a graphics processing unit (GPU) as a data-parallel computing device. With a maximum number of 240 cores in combination with a high memory bandwidth, a recent GPU offers resources for computational physics. We apply this technology to methods of fluctuation analysis, which includes determination of the scaling behavior of a stochastic process and the equilibrium autocorrelation function. Additionally, the recently introduced pattern formation conformity (Preis T et al 2008 Europhys. Lett. 82 68005), which quantifies pattern-based complex short-time correlations of a time series, is calculated on a GPU and analyzed in detail. Results are obtained up to 84 times faster than on a current central processing unit core. When we apply this method to high-frequency time series of the German BUND future, we find significant pattern-based correlations on short time scales. Furthermore, an anti-persistent behavior can be found on short time scales. Additionally, we compare the recent GPU generation, which provides a theoretical peak performance of up to roughly 10^12 floating point operations per second, with the previous one.
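
    The CPU reference version of the equilibrium autocorrelation function is a short loop over lags, which is exactly the kind of data-parallel computation that maps well onto a GPU (for example, one thread block per lag); a NumPy sketch:

    ```python
    import numpy as np

    def autocorrelation(x, max_lag):
        """Equilibrium autocorrelation estimated from a single time series;
        each lag is independent, so the loop parallelizes trivially."""
        x = np.asarray(x, float) - np.mean(x)
        n = len(x)
        var = np.dot(x, x) / n
        return np.array([np.dot(x[:n - k], x[k:]) / ((n - k) * var)
                         for k in range(max_lag + 1)])
    ```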

  4. A study of perceptual analysis in a high-level autistic subject with exceptional graphic abilities.

    Science.gov (United States)

    Mottron, L; Belleville, S

    1993-11-01

    We report here the case study of a patient (E.C.) with Asperger syndrome, or autism with quasi-normal intelligence, who shows an outstanding ability for three-dimensional drawing of inanimate objects (savant syndrome). An assessment of the subsystems proposed in recent models of object recognition evidenced intact perceptual analysis and identification. The initial (primal sketch), viewer-centered (2-1/2-D), and object-centered (3-D) representations and the recognition and naming levels were functional. In contrast, E.C.'s pattern of performance in three different types of tasks converges to suggest an anomaly in the hierarchical organization of the local and global parts of a figure: a local interference effect with incongruent hierarchical visual stimuli, a deficit in relating local parts to global form information in impossible figures, and an absence of feature grouping in graphic recall. The results are discussed in relation to normal visual perception and to current accounts of the savant syndrome in autism.

  5. Decision Engines for Software Analysis Using Satisfiability Modulo Theories Solvers

    Science.gov (United States)

    Bjorner, Nikolaj

    2010-01-01

    The area of software analysis, testing and verification is now undergoing a revolution thanks to the use of automated and scalable support for logical methods. A well-recognized premise is that at the core of software analysis engines is invariably a component using logical formulas for describing states and transformations between system states. The process of using this information for discovering and checking program properties (including such important properties as safety and security) amounts to automatic theorem proving. In particular, theorem provers that directly support common software constructs offer a compelling basis. Such provers are commonly called satisfiability modulo theories (SMT) solvers. Z3 is a state-of-the-art SMT solver developed at Microsoft Research. It can be used to check the satisfiability of logical formulas over one or more theories such as arithmetic, bit-vectors, lists, records and arrays. The talk describes some of the technology behind modern SMT solvers, including Z3. Z3 is currently mainly targeted at solving problems that arise in software analysis and verification. It has been applied in various contexts, such as systems for dynamic symbolic execution (Pex, SAGE, Vigilante), program verification and extended static checking (Spec#/Boogie, VCC, HAVOC), software model checking (Yogi, SLAM), model-based design (FORMULA), security protocol code (F7), and program run-time analysis and invariant generation (VS3). We will describe how it integrates support for a variety of theories that arise naturally in the context of these applications. There are several new promising avenues, and the talk will touch on some of these and the challenges related to SMT solvers.

  6. A case study of GAMM (graphical analysis for maintenance management) in the mining industry

    International Nuclear Information System (INIS)

    Barberá, Luis; Crespo, Adolfo; Viveros, Pablo; Stegmaier, Raúl

    2014-01-01

    This paper presents a case of practical application of the GAMM method, which has been developed and published by the authors (Barberá L., Crespo A. and Viveros P.). The GAMM method supports decision-making in overall maintenance management through the visualization and graphical analysis of data. In addition, it allows for the identification of anomalous behavior in the equipment analyzed, whether derived from its own operation, maintenance activities, improper use of the equipment, or even design errors in the equipment itself. As a basis for analysis, the GAMM method uses a nonparametric estimator of the reliability function using all historical data or, alternatively, part of the history, allowing it to perform an analysis even with limited available data. In the case study developed here, GAMM has been used to analyze two slurry pumps in a mining plant located in Chile. Both pumps are part of the same industrial process, which is described in Section 3, and both had high failure rates, one more so than the other. GAMM identified deficiencies in each of the pumps studied, thus improving the decision-making and problem-solving process related to the maintenance of the pumps. This work initially provides a description of the GAMM method (Section 1) and then describes the approach to the problem (Section 2). Section 3 presents the background of the industrial context. Section 4 then shows the application of the GAMM method step by step. Finally, results and conclusions are presented in Section 5, where the main improvements obtained are summarized.

  7. Power consumption analysis of pump station control systems based on fuzzy controllers with discrete terms in iThink software

    Science.gov (United States)

    Muravyova, E. A.; Bondarev, A. V.; Sharipov, M. I.; Galiaskarova, G. R.; Kubryak, A. I.

    2018-03-01

    In this article, the power consumption of pumping station control systems is discussed. To study the issue, two simulation models of oil-level control have been developed in the iThink software: one using a frequency converter only, and one using a frequency converter together with a fuzzy controller. A simulation of the oil-level control was carried out in graphic form, and plots of the pumps' power consumption were obtained. Based on the initial and obtained data, the efficiency of the two control systems has been compared, and the power consumption of each system (frequency converter only, and frequency converter plus fuzzy controller) was shown graphically. Analysis of the models has shown that a control circuit with a frequency converter and a fuzzy controller is more economical and safer to use.
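
    The record gives neither membership functions nor a rule base, so the following only illustrates the general shape of a fuzzy controller with discrete linguistic terms (triangular memberships and weighted-average defuzzification); every term, range and output frequency below is hypothetical.

        # Sketch of a fuzzy level controller with discrete terms; all
        # values are hypothetical, not taken from the article.
        def tri(x, a, b, c):
            """Triangular membership function peaking at b."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def pump_frequency(level_error):
            """Map oil-level error (m) to a pump drive frequency (Hz)."""
            terms = {                                  # discrete terms
                'negative': tri(level_error, -1.0, -0.5, 0.0),
                'zero':     tri(level_error, -0.5,  0.0, 0.5),
                'positive': tri(level_error,  0.0,  0.5, 1.0),
            }
            rules = {'negative': 30.0, 'zero': 40.0, 'positive': 50.0}
            weight = sum(terms.values()) or 1.0
            return sum(mu * rules[t] for t, mu in terms.items()) / weight

        print(pump_frequency(0.2))   # 44.0 Hz for this toy rule base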

  8. THERAPIE - THErmix-RAps-Plot-InterfacE. A graphic software for representation of THERMIX-2D results with the interactive plot program RAPS

    International Nuclear Information System (INIS)

    Duensing, P.; Jahn, W.; Rehm, W.

    1986-09-01

    The performance of safety analyses for gas-cooled high temperature reactor power plants requires efficient plot codes for the evaluation and representation of computer results. The report describes the coupling between the thermodynamic simulation code THERMIX and the graphic plot code RAPS via the interface program THERAPIE. Especially the structure and the handling of the interface program are explained as well as the dialogue with the plot code. Further options of the colour graphic system are demonstrated for the representation of temperature distributions in components of HTR concepts (HTR-500). (orig.) [de

  9. Graphics gems

    CERN Document Server

    Glassner, Andrew S

    1993-01-01

    ""The GRAPHICS GEMS Series"" was started in 1990 by Andrew Glassner. The vision and purpose of the Series was - and still is - to provide tips, techniques, and algorithms for graphics programmers. All of the gems are written by programmers who work in the field and are motivated by a common desire to share interesting ideas and tools with their colleagues. Each volume provides a new set of innovative solutions to a variety of programming problems.

  10. Graphic notation

    DEFF Research Database (Denmark)

    Bergstrøm-Nielsen, Carl

    1992-01-01

    Textbook to be used along with training in the practice of graphic notation. Describes method; exercises; bibliography; collection of examples. If you can read Danish, please refer to that edition, which is far more up to date.

  11. Knickpoint finder: A software tool that improves neotectonic analysis

    Science.gov (United States)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software used a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), which is an area of constant intraplate seismicity and non-orogenic active tectonics and exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing using Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
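
    Knickpoint Finder itself is coded in Python for the ArcGIS platform; purely to illustrate the underlying idea of locating relief breakpoints along a drainage profile, a platform-independent sketch can flag points where the channel gradient changes abruptly. The profile and threshold below are hypothetical, and the published method's criteria are more elaborate.

        import numpy as np

        def knickpoint_candidates(distance, elevation, threshold=0.02):
            """Indices where the channel gradient breaks by > threshold."""
            slope = np.gradient(elevation, distance)   # local gradient
            breaks = np.abs(np.diff(slope))            # change in gradient
            return np.where(breaks > threshold)[0] + 1

        d = np.array([0, 100, 200, 300, 400, 500], dtype=float)    # m
        z = np.array([500, 495, 490, 470, 468, 466], dtype=float)  # m
        print(knickpoint_candidates(d, z))             # e.g. [1 3]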

  12. BROCCOLI: Software for Fast fMRI Analysis on Many-Core CPUs and GPUs

    Directory of Open Access Journals (Sweden)

    Anders Eklund

    2014-03-01

    Analysis of functional magnetic resonance imaging (fMRI) data is becoming ever more computationally demanding as temporal and spatial resolutions improve and large, publicly available data sets proliferate. Moreover, methodological improvements in the neuroimaging pipeline, such as non-linear spatial normalization, non-parametric permutation tests and Bayesian Markov Chain Monte Carlo approaches, can dramatically increase the computational burden. Despite these challenges, there do not yet exist any fMRI software packages which leverage inexpensive and powerful graphics processing units (GPUs) to perform these analyses. Here, we therefore present BROCCOLI, a free software package written in OpenCL (Open Computing Language) that can be used for parallel analysis of fMRI data on a large variety of hardware configurations. BROCCOLI has, for example, been tested with an Intel CPU, an Nvidia GPU and an AMD GPU. These tests show that parallel processing of fMRI data can lead to significantly faster analysis pipelines. This speedup can be achieved on relatively standard hardware, but further, dramatic speed improvements require only a modest investment in GPU hardware. BROCCOLI (running on a GPU) can perform non-linear spatial normalization to a 1 mm³ brain template in 4-6 seconds, and run a second-level permutation test with 10,000 permutations in about a minute. These non-parametric tests are generally more robust than their parametric counterparts, and can also enable more sophisticated analyses by estimating complicated null distributions. Additionally, BROCCOLI includes support for Bayesian first-level fMRI analysis using a Gibbs sampler. The new software is freely available under GNU GPL3 and can be downloaded from GitHub (https://github.com/wanderine/BROCCOLI/).
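
    Independent of BROCCOLI's OpenCL implementation, the second-level permutation test mentioned above can be stated in a few lines of plain NumPy; the sketch below runs a one-sample test by sign flipping on hypothetical subject contrasts.

        import numpy as np

        rng = np.random.default_rng(0)
        contrasts = rng.normal(0.3, 1.0, size=20)  # hypothetical subjects

        observed = contrasts.mean()
        n_perm = 10_000
        null = np.empty(n_perm)
        for i in range(n_perm):
            signs = rng.choice([-1.0, 1.0], size=contrasts.size)
            null[i] = (signs * contrasts).mean()   # sign-flipped statistic

        p = (np.sum(null >= observed) + 1) / (n_perm + 1)
        print(f"permutation p-value: {p:.4f}")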

  13. XplorSeq: a software environment for integrated management and phylogenetic analysis of metagenomic sequence data.

    Science.gov (United States)

    Frank, Daniel N

    2008-10-07

    Advances in automated DNA sequencing technology have accelerated the generation of metagenomic DNA sequences, especially environmental ribosomal RNA gene (rDNA) sequences. As the scale of rDNA-based studies of microbial ecology has expanded, need has arisen for software that is capable of managing, annotating, and analyzing the plethora of diverse data accumulated in these projects. XplorSeq is a software package that facilitates the compilation, management and phylogenetic analysis of DNA sequences. XplorSeq was developed for, but is not limited to, high-throughput analysis of environmental rRNA gene sequences. XplorSeq integrates and extends several commonly used UNIX-based analysis tools by use of a Macintosh OS-X-based graphical user interface (GUI). Through this GUI, users may perform basic sequence import and assembly steps (base-calling, vector/primer trimming, contig assembly), perform BLAST (Basic Local Alignment Search Tool) searches of NCBI and local databases, create multiple sequence alignments, build phylogenetic trees, assemble Operational Taxonomic Units, estimate biodiversity indices, and summarize data in a variety of formats. Furthermore, sequences may be annotated with user-specified meta-data, which then can be used to sort data and organize analyses and reports. A document-based architecture permits parallel analysis of sequence data from multiple clones or amplicons, with sequences and other data stored in a single file. XplorSeq should benefit researchers who are engaged in analyses of environmental sequence data, especially those with little experience using bioinformatics software. Although XplorSeq was developed for management of rDNA sequence data, it can be applied to almost any sequencing project. The application is available free of charge for non-commercial use at http://vent.colorado.edu/phyloware.

  14. Graphical analysis of pH-dependent properties of proteins predicted using PROPKA.

    Science.gov (United States)

    Rostkowski, Michał; Olsson, Mats H M; Søndergaard, Chresten R; Jensen, Jan H

    2011-01-26

    Charge states of ionizable residues in proteins determine their pH-dependent properties through their pKa values. Thus, various theoretical methods to determine ionization constants of residues in biological systems have been developed. One of the more widely used approaches for predicting pKa values in proteins is the PROPKA program, which provides convenient structural rationalization of the predicted pKa values without any additional calculations. The PROPKA Graphical User Interface (GUI) is a new tool for studying the pH-dependent properties of proteins such as charge and stabilization energy. It facilitates a quantitative analysis of pKa values of ionizable residues together with their structural determinants by providing a direct link between the pKa data, predicted by the PROPKA calculations, and the structure via the Visual Molecular Dynamics (VMD) program. The GUI also calculates contributions to the pH-dependent unfolding free energy at a given pH for each ionizable group in the protein. Moreover, the PROPKA-computed pKa values or energy contributions of the ionizable residues in question can be displayed interactively. The PROPKA GUI can also be used for comparing pH-dependent properties of more than one structure at the same time. The GUI considerably extends the analysis and validation possibilities of the PROPKA approach. The PROPKA GUI can conveniently be used to investigate ionizable groups, and their interactions, of residues with significantly perturbed pKa values or residues that contribute to the stabilization energy the most. Charge-dependent properties can be studied either for a single protein or simultaneously with other homologous structures, which makes it a helpful tool, for instance, in protein design studies or structure-based function predictions. The GUI is implemented as a Tcl/Tk plug-in for VMD, and can be obtained online at http://propka.ki.ku.dk/~luca/wiki/index.php/GUI_Web.
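
    The charge-versus-pH quantities the GUI displays all derive from predicted pKa values; as a minimal illustration (this is the textbook Henderson-Hasselbalch relation, not PROPKA's code), the average charge of a single ionizable residue at a given pH is:

        def average_charge(pka, ph, acidic):
            """acidic=True for Asp/Glu-like (0/-1) groups,
            False for Lys/Arg/His-like (+1/0) groups."""
            frac_deprotonated = 1.0 / (1.0 + 10.0 ** (pka - ph))
            return -frac_deprotonated if acidic else 1.0 - frac_deprotonated

        # a hypothetical Asp with a perturbed predicted pKa of 6.5, at pH 7
        print(average_charge(6.5, 7.0, acidic=True))   # about -0.76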

  15. A graphical user interface for real-time analysis of XPCS using HPC

    Energy Technology Data Exchange (ETDEWEB)

    Sikorski, M., E-mail: sikorski@aps.anl.gov [Argonne National Laboratory, Advanced Photon Source, 9700 S Cass Ave, Argonne, IL 60439 (United States); Jiang, Z. [Argonne National Laboratory, Advanced Photon Source, 9700 S Cass Ave, Argonne, IL 60439 (United States); Sprung, M. [HASYLAB at DESY, Notkestr. 85, D 22-607 Hamburg (Germany); Narayanan, S.; Sandy, A.R.; Tieman, B. [Argonne National Laboratory, Advanced Photon Source, 9700 S Cass Ave, Argonne, IL 60439 (United States)

    2011-09-01

    With the development of third generation synchrotron radiation sources, X-ray photon correlation spectroscopy has emerged as a powerful technique for characterizing equilibrium and non-equilibrium dynamics in complex materials at nanometer length scales over a wide range of time-scales (0.001-1000 s). Moreover, the development of powerful new direct detection CCD cameras has allowed investigation of faster dynamical processes. A consequence of these technical improvements is the need to reduce a very large amount of area detector data within a short time. This problem can be solved by utilizing a large number of processors (32-64) in the cluster architecture to improve the efficiency of the calculations by 1-2 orders of magnitude (Tieman et al., this issue). However, to make such a data analysis system operational, powerful and user-friendly control software needs to be developed. As a part of the effort to maintain a high data acquisition and reduction rate, we have developed a Matlab-based software that acts as an interface between the user and the high performance computing (HPC) cluster.

  16. A graphical user interface for real-time analysis of XPCS using HPC

    International Nuclear Information System (INIS)

    Sikorski, M.; Jiang, Z.; Sprung, M.; Narayanan, S.; Sandy, A.R.; Tieman, B.

    2011-01-01

    With the development of third generation synchrotron radiation sources, X-ray photon correlation spectroscopy has emerged as a powerful technique for characterizing equilibrium and non-equilibrium dynamics in complex materials at nanometer length scales over a wide range of time-scales (0.001-1000 s). Moreover, the development of powerful new direct detection CCD cameras has allowed investigation of faster dynamical processes. A consequence of these technical improvements is the need to reduce a very large amount of area detector data within a short time. This problem can be solved by utilizing a large number of processors (32-64) in the cluster architecture to improve the efficiency of the calculations by 1-2 orders of magnitude (Tieman et al., this issue). However, to make such a data analysis system operational, powerful and user-friendly control software needs to be developed. As a part of the effort to maintain a high data acquisition and reduction rate, we have developed a Matlab-based software that acts as an interface between the user and the high performance computing (HPC) cluster.
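
    Both records describe the same reduction; its central quantity is the normalized intensity autocorrelation g2(q, t). A direct, unoptimized NumPy sketch for a single pixel or q-bin follows; the HPC cluster performs the equivalent reduction in parallel over the whole detector.

        import numpy as np

        def g2(intensity, max_lag):
            """g2(tau) = <I(t) I(t+tau)> / <I>^2 for tau = 1..max_lag."""
            mean_sq = intensity.mean() ** 2
            return np.array([
                np.mean(intensity[:-lag] * intensity[lag:]) / mean_sq
                for lag in range(1, max_lag + 1)
            ])

        frames = np.random.poisson(5.0, 1000).astype(float)  # toy pixel
        print(g2(frames, max_lag=10))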

  17. Confidence ellipses: A variation based on parametric bootstrapping applicable on Multiple Factor Analysis results for rapid graphical evaluation

    DEFF Research Database (Denmark)

    Dehlholm, Christian; Brockhoff, Per B.; Bredie, Wender L. P.

    2012-01-01

    A new way of parametric bootstrapping allows similar construction of confidence ellipses applicable to all results from Multiple Factor Analysis obtained from the FactoMineR package in the statistical program R. With this procedure, a similar approach can be applied to Multiple Factor Analysis results obtained in different studies performed on the same set of products. In addition, the graphical display of confidence ellipses eases interpretation and communication of results.

  18. STAMPS: development and verification of swallowing kinematic analysis software.

    Science.gov (United States)

    Lee, Woo Hyung; Chun, Changmook; Seo, Han Gil; Lee, Seung Hak; Oh, Byung-Mo

    2017-10-17

    Swallowing impairment is a common complication in various geriatric and neurodegenerative diseases. Swallowing kinematic analysis is essential to quantitatively evaluate the swallowing motion of the oropharyngeal structures. This study aims to develop novel swallowing kinematic analysis software, called spatio-temporal analyzer for motion and physiologic study (STAMPS), and verify its validity and reliability. STAMPS was developed in MATLAB, which is one of the most popular platforms for biomedical analysis. This software was constructed to acquire, process, and analyze the data of swallowing motion. The target swallowing structures include bony structures (hyoid bone, mandible, maxilla, and cervical vertebral bodies), cartilages (epiglottis and arytenoid), soft tissues (larynx and upper esophageal sphincter), and the food bolus. Numerous functions are available for the spatiotemporal parameters of the swallowing structures. Testing for validity and reliability was performed in 10 dysphagia patients with diverse etiologies and with an instrumental swallowing model designed to mimic the motion of the hyoid bone and the epiglottis. The intra- and inter-rater reliability tests showed excellent agreement for displacement and moderate to excellent agreement for velocity. The Pearson correlation coefficients between the measured and instrumental reference values were nearly 1.00. This software is expected to be useful for researchers who are interested in swallowing motion analysis.
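
    Once a structure such as the hyoid bone has been tracked frame by frame, the displacement and velocity parameters mentioned above reduce to finite differences of the tracked coordinates; a minimal sketch follows (STAMPS itself is MATLAB-based, and its processing is far richer).

        import numpy as np

        def kinematics(xy, fps):
            """xy: (n_frames, 2) coordinates in mm; returns displacement
            from the start (mm) and frame-to-frame speed (mm/s)."""
            displacement = np.linalg.norm(xy - xy[0], axis=1)
            step = np.diff(xy, axis=0)
            speed = np.linalg.norm(step, axis=1) * fps
            return displacement, speed

        track = np.array([[0.0, 0.0], [1.0, 2.0], [2.0, 5.0], [2.5, 6.0]])
        print(kinematics(track, fps=30))   # hypothetical 4-frame track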

  19. Characterization of Disulfide-Linked Peptides Using Tandem Mass Spectrometry Coupled with Automated Data Analysis Software.

    Science.gov (United States)

    Liang, Zhidan; McGuinness, Kenneth N; Crespo, Alejandro; Zhong, Wendy

    2018-01-25

    Disulfide bond formation is critical for maintaining the structural stability and function of many peptides and proteins. Mass spectrometry has become an important tool for the elucidation of molecular connectivity. However, the interpretation of tandem mass spectral data of disulfide-linked peptides has been a major challenge due to the lack of appropriate tools. Developing proper data analysis software is essential to quickly characterize disulfide-linked peptides. A thorough and in-depth understanding of how disulfide-linked peptides fragment in a mass spectrometer is key to developing software to interpret the tandem mass spectra of these peptides. Two model peptides with inter- and intra-chain disulfide linkages were used to study fragmentation behavior in both collisional-activated dissociation (CAD) and electron-based dissociation (ExD) experiments. Fragments generated from CAD and ExD can be categorized into three major types, which result from different S-S and C-S bond cleavage patterns. DiSulFinder is a computer algorithm that was newly developed based on the fragmentation observed in these peptides. The software is vendor neutral and capable of quickly and accurately identifying a variety of fragments generated from disulfide-linked peptides. DiSulFinder identifies peptide backbone fragments with S-S and C-S bond cleavages and, more importantly, can also identify fragments with the S-S bond still intact to aid disulfide linkage determination. With the assistance of this software, more comprehensive disulfide connectivity characterization can be achieved.

  20. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of software developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 marker positions) were manually tracked to determine the markers' center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software package (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and to correct the position of the cursor when the distance between the calculated marker coordinate and the reference one was higher than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software presented here can be used as a valid and useful tool for underwater motion analysis.
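
    DVP is described as building on the Kanade-Lucas-Tomasi feature tracker; a minimal sketch of KLT point tracking with OpenCV follows (not the DVP code itself; the video path and initial marker position are hypothetical).

        import cv2
        import numpy as np

        cap = cv2.VideoCapture("underwater_exercise.avi")  # hypothetical
        ok, prev = cap.read()
        if not ok:
            raise SystemExit("could not read video")
        prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

        # initial marker position in pixels, e.g. clicked by the operator
        pts = np.array([[[320.0, 240.0]]], dtype=np.float32)

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            pts, status, err = cv2.calcOpticalFlowPyrLK(
                prev_gray, gray, pts, None, winSize=(21, 21), maxLevel=3)
            prev_gray = gray
            print(pts.reshape(-1, 2))   # tracked coordinates per frame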

  1. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  2. Hardware and software constructs for a vibration analysis network

    International Nuclear Information System (INIS)

    Cook, S.A.; Crowe, R.D.; Toffer, H.

    1985-01-01

    Vibration level monitoring and analysis has been initiated at N Reactor, the dual purpose reactor operated at Hanford, Washington by UNC Nuclear Industries (UNC) for the Department of Energy (DOE). The machinery to be monitored was located in several buildings scattered over the plant site, necessitating an approach using satellite stations to collect, monitor and temporarily store data. The satellite stations are, in turn, linked to a centralized processing computer for further analysis. The advantages of a networked data analysis system are discussed in this paper along with the hardware and software required to implement such a system

  3. GGT2.0: Versatile Software for visualization and analysis of genetic data

    NARCIS (Netherlands)

    Berloo, van R.

    2008-01-01

    Ever since its first release in 1999, the free software package for visualization of molecular marker data, graphical genotype (GGT), has been constantly adapted and improved. The GGT package was developed in a plant-breeding context and thus focuses on plant genetic data, but it is not limited to plant data.

  4. Calibration Analysis Software for the ATLAS Pixel Detector

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00372086; The ATLAS collaboration

    2016-01-01

    The calibration of the ATLAS Pixel detector at the LHC fulfils two main purposes: to tune the front-end configuration parameters for establishing the best operational settings and to measure the tuning performance through a subset of scans. An analysis framework has been set up in order to take actions on the detector given the outcome of a calibration scan (e.g. to create a mask for disabling noisy pixels). The software framework to control all aspects of the Pixel detector scans and analyses is called the Calibration Console. The introduction of a new layer, equipped with new Front-End-I4 chips, required an update of the Console architecture. It now handles scans and scan analyses applied together to chips with different characteristics. An overview of the newly developed Calibration Analysis Software will be presented, together with some preliminary results.

  5. Using Statistical Analysis Software to Advance Nitro Plasticizer Wettability

    Energy Technology Data Exchange (ETDEWEB)

    Shear, Trevor Allan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-29

    Statistical analysis in science is an extremely powerful tool that is often underutilized. Additionally, it is frequently the case that data is misinterpreted or not used to its fullest extent. Utilizing the advanced software JMP®, many aspects of experimental design and data analysis can be evaluated and improved. This overview will detail the features of JMP® and how they were used to advance a project, resulting in time and cost savings, as well as the collection of scientifically sound data. The project analyzed in this report addresses the inability of a nitro plasticizer to coat a gold-coated quartz crystal sensor used in a quartz crystal microbalance. Through the use of the JMP® software, the wettability of the nitro plasticizer was increased by over 200% using an atmospheric plasma pen, ensuring good sample preparation and reliable results.

  6. Spectrum analysis on quality requirements consideration in software design documents.

    Science.gov (United States)

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source code and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the quality requirements considered in a requirements document are reflected in its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
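
    As a crude caricature of such a spectrum (the paper's actual derivation is more involved), one can profile how strongly each quality characteristic is represented in a document and compare the profiles of a requirements document and its design document; the keyword lists below are hypothetical.

        KEYWORDS = {  # hypothetical keyword lists per quality characteristic
            "security":    ["encrypt", "authenticate", "authorize"],
            "performance": ["latency", "throughput", "response time"],
            "usability":   ["accessible", "learnable", "error message"],
        }

        def spectrum(text):
            text = text.lower()
            return {q: sum(text.count(k) for k in kws)
                    for q, kws in KEYWORDS.items()}

        def gaps(requirements_text, design_text):
            """Characteristics present in requirements, absent in design."""
            req, des = spectrum(requirements_text), spectrum(design_text)
            return [q for q in req if req[q] > 0 and des[q] == 0]

        print(gaps("Users must authenticate; response time under 1 s.",
                   "The design encrypts all traffic between modules."))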

  7. A versatile system for the rapid collection, handling and graphics analysis of multidimensional data

    International Nuclear Information System (INIS)

    O'Brien, P.M.; Moloney, G.; O'Connor, A.; Legge, G.J.F.

    1991-01-01

    The paper discusses the performance of a versatile computerized system, developed at the Microanalytical Research Centre of Melbourne University, for handling multiparameter data that may arise from a variety of experiments - nuclear, accelerator mass spectrometry, microprobe elemental analysis or 3-D microtomography. Some of the most demanding requirements arise in the application of microprobes to quantitative elemental mapping and to microtomography. A system to handle data from such experiments has been under continuous development. It has been reprogrammed to run on a DG DS7540 workstation. The whole system of software has been rewritten, greatly expanded and made much more powerful and faster by use of modern computer technology - a VME bus computer with a real-time operating system and a RISC workstation running UNIX and the X-window environment

  8. Software for a measuring facility for activation analysis

    International Nuclear Information System (INIS)

    De Keyser, A.; De Roost, E.

    1985-01-01

    A software package has been developed for an Apple PC. The programs are intended to control an automated measuring station for photon activation analysis at GELINA, the linear accelerator of C.B.N.M. at Geel (Belgium). They allow the user to set up a measuring scheme, to execute it under computer control, to accumulate and store 2K spectra using a built-in ADC, and to output the results as listings, plots or evaluated reports

  9. Phenomenology and Qualitative Data Analysis Software (QDAS): A Careful Reconciliation

    OpenAIRE

    Brian Kelleher Sohn

    2017-01-01

    An oft-cited phenomenological methodologist, Max VAN MANEN (2014), claims that qualitative data analysis software (QDAS) is not an appropriate tool for phenomenological research. Yet phenomenologists rarely describe how phenomenology is to be done: pencil, paper, computer? DAVIDSON and DI GREGORIO (2011) urge QDAS contrarians such as VAN MANEN to get over their methodological loyalties and join the digital world, claiming that all qualitative researchers, whatever their methodology, perform p...

  10. Nuclear analysis software. Pt. 1: Spectrum transfer and reformatting (SPEDAC)

    International Nuclear Information System (INIS)

    1991-01-01

    GANAAS (Gamma, Activity, and Neutron Activation Analysis System) is one of a family of software packages developed under the auspices of the International Atomic Energy Agency. Primarily, the package was intended to support the IAEA Technical Assistance and Cooperation projects in developing countries. However, it is open domain software that can be copied and used by anybody, except for commercial purposes. All the nuclear analysis software provided by the IAEA has the same design philosophy and similar structure. The intention was to provide the user with maximum flexibility and, at the same time, a simple and logical organization that requires minimum digging through the manuals. GANAAS is a modular system. It consists of several programmes that can be installed on the hard disk as they are needed. Obviously, some parts of the system are required in all cases. Those are installed at the beginning, without consulting the operator. GANAAS offers the opportunity to expand and improve the system. Gamma spectrum evaluation programmes using different fitting algorithms can be added to GANAAS, under the condition that the format of their input and output files corresponds to the rules of GANAAS. The same applies to the quantitative analysis parts of the programme

  11. Visualization of scientific data for high energy physics: PAW, a general-purpose portable software tool for data analysis and presentation

    International Nuclear Information System (INIS)

    Brun, R.; Couet, O.; Vandoni, C.E.; Zanarini, P.

    1990-01-01

    Visualization of scientific data, although a fashionable term in the world of computer graphics, is not a new invention: it is hundreds of years old. With the advent of computer graphics, the visualization of scientific data has become a well understood and widely used technology, with hundreds of applications in the most diverse fields, ranging from media applications to strictly scientific ones. In the present paper, we discuss the design concepts of visualization of scientific data systems, in particular in the specific field of high energy physics. During the last twenty years, CERN has played a leading role as the focus for development of packages and software libraries to solve problems related to high energy physics (HEP). The results of the integration of resources from many different laboratories can be expressed in several million lines of code written at CERN during this period, used at CERN and distributed to collaborating laboratories. Nowadays, this role of software developer is considered very important by the entire HEP community. In this paper a large software package, in which man-machine interaction and graphics play a key role (PAW, the Physics Analysis Workstation), is described. PAW is essentially an interactive system which includes many different software tools, strongly oriented towards data analysis and data presentation. Some of these tools have been available in different forms and with different human interfaces for several years. 6 figs

  12. Freud: a software suite for high-throughput simulation analysis

    Science.gov (United States)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

    Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
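
    As an illustration of the simplest analysis named above, the radial distribution function, here is a plain NumPy version for a non-periodic point set; Freud's own routines are parallel C++ with periodic-box support, so this only shows the quantity being computed.

        import numpy as np

        def rdf(points, r_max, bins, volume):
            """g(r): histogram pair distances into shells, normalized by
            the expected ideal-gas pair count per shell."""
            d = np.linalg.norm(points[:, None, :] - points[None, :, :],
                               axis=-1)
            d = d[np.triu_indices(len(points), k=1)]   # unique pairs
            counts, edges = np.histogram(d, bins=bins, range=(0.0, r_max))
            shell_vol = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
            ideal = shell_vol * (len(points) / volume) * len(points) / 2.0
            return 0.5 * (edges[1:] + edges[:-1]), counts / ideal

        pts = np.random.default_rng(1).uniform(0, 10, size=(500, 3))
        r, g = rdf(pts, r_max=3.0, bins=30, volume=1000.0)
        print(g[:5])   # ~1 everywhere for an ideal-gas-like point set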

  13. An ion beam analysis software based on ImageJ

    International Nuclear Information System (INIS)

    Udalagama, C.; Chen, X.; Bettiol, A.A.; Watt, F.

    2013-01-01

    The suite of techniques (RBS, STIM, ERDS, PIXE, IL, IF,…) available in ion beam analysis yields a variety of rich information. Typically, after the initial challenge of acquiring data we are then faced with the task of having to extract relevant information or to present the data in a format with the greatest impact. This process sometimes requires developing new software tools. When faced with such situations the usual practice at the Centre for Ion Beam Applications (CIBA) in Singapore has been to use our computational expertise to develop ad hoc software tools as and when we need them. It then became apparent that the whole ion beam community can benefit from such tools; specifically from a common software toolset that can be developed and maintained by everyone with freedom to use and allowance to modify. In addition to the benefits of readymade tools and sharing the onus of development, this also opens up the possibility for collaborators to access and analyse ion beam data without having to depend on an ion beam lab. This has the virtue of making the ion beam techniques more accessible to a broader scientific community. We have identified ImageJ as an appropriate software base to develop such a common toolset. In addition to being in the public domain and being set up for collaborative tool development, ImageJ is accompanied by hundreds of modules (plugins) that allow great breadth in analysis. The present work is the first step towards integrating ion beam analysis into ImageJ. Some of the features of the current version of the ImageJ ‘ion beam’ plugin are: (1) reading list mode or event-by-event files, (2) energy gates/sorts, (3) sort stacks, (4) colour function, (5) real time map updating, (6) real time colour updating and (7) median and average map creation

  14. An ion beam analysis software based on ImageJ

    Energy Technology Data Exchange (ETDEWEB)

    Udalagama, C., E-mail: chammika@nus.edu.sg [Centre for Ion Beam Applications (CIBA), Department of Physics, National University of Singapore, 2 Science Drive 3, Singapore 117 542 (Singapore); Chen, X.; Bettiol, A.A.; Watt, F. [Centre for Ion Beam Applications (CIBA), Department of Physics, National University of Singapore, 2 Science Drive 3, Singapore 117 542 (Singapore)

    2013-07-01

    The suite of techniques (RBS, STIM, ERDS, PIXE, IL, IF,…) available in ion beam analysis yields a variety of rich information. Typically, after the initial challenge of acquiring data we are then faced with the task of having to extract relevant information or to present the data in a format with the greatest impact. This process sometimes requires developing new software tools. When faced with such situations the usual practice at the Centre for Ion Beam Applications (CIBA) in Singapore has been to use our computational expertise to develop ad hoc software tools as and when we need them. It then became apparent that the whole ion beam community can benefit from such tools; specifically from a common software toolset that can be developed and maintained by everyone with freedom to use and allowance to modify. In addition to the benefits of readymade tools and sharing the onus of development, this also opens up the possibility for collaborators to access and analyse ion beam data without having to depend on an ion beam lab. This has the virtue of making the ion beam techniques more accessible to a broader scientific community. We have identified ImageJ as an appropriate software base to develop such a common toolset. In addition to being in the public domain and being set up for collaborative tool development, ImageJ is accompanied by hundreds of modules (plugins) that allow great breadth in analysis. The present work is the first step towards integrating ion beam analysis into ImageJ. Some of the features of the current version of the ImageJ ‘ion beam’ plugin are: (1) reading list mode or event-by-event files, (2) energy gates/sorts, (3) sort stacks, (4) colour function, (5) real time map updating, (6) real time colour updating and (7) median and average map creation.
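
    Feature (2) above, energy gating of list-mode data, amounts to selecting events whose detected energy falls inside a window and histogramming their positions into a map; a plain sketch outside ImageJ follows, with a synthetic event list standing in for a real list-mode file.

        import numpy as np

        def gated_map(events, e_lo, e_hi, shape=(256, 256)):
            """events: (n, 3) array of x, y, energy; returns a counts map."""
            x, y, e = events[:, 0], events[:, 1], events[:, 2]
            keep = (e >= e_lo) & (e <= e_hi)           # the energy gate
            img, _, _ = np.histogram2d(
                y[keep], x[keep], bins=shape,
                range=[[0, shape[0]], [0, shape[1]]])
            return img

        rng = np.random.default_rng(2)
        ev = np.column_stack([rng.uniform(0, 256, 10000),
                              rng.uniform(0, 256, 10000),
                              rng.normal(1740, 30, 10000)])  # ~Si K X-rays
        print(gated_map(ev, 1700, 1780).sum())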

  15. Survey and analyses of computer software usage in Calabar ...

    African Journals Online (AJOL)

    This work sets out to find the most used software and the types of jobs most often done. A descriptive analysis using simple percentages revealed that word processing software is the most used software, followed by graphics, database and accounting software in decreasing order. A comparative examination of the use of the ...

  16. POST-CASKETSS: a graphic computer program for thermal and structural analysis of nuclear fuel shipping casks

    International Nuclear Information System (INIS)

    Ikushima, Takeshi

    1988-12-01

    A computer program, POST-CASKETSS, has been developed for the representation of calculation results from the thermal and structural analysis computer code system CASKETSS (CASKETSS means a modular code system for CASK Evaluation for Thermal and Structural Safety). The main features of POST-CASKETSS are as follows: (1) representation of calculation results for thermal and structural analysis computer programs is provided in the program; (2) two- and three-dimensional graphic representations for finite element and finite difference programs are available; (3) graphics of geometry, temperature contours and temperature-time curves are provided for thermal analysis; (4) graphics of geometry, deformation, stress contours, displacement-time curves, velocity-time curves, acceleration-time curves, stress-time curves, force-time curves and moment-time curves are provided for structural analysis; (5) the program operates under both the time-sharing system and the batch system. In the paper, a brief illustration of the calculation method, input data and sample calculations are presented. (author)

  17. Comparative Investigation on Modal analysis of LM25 Aluminium alloy with other Aluminium alloys using Finite element analysis software

    Science.gov (United States)

    Arunkumar, S.; Baskaralal, V. P. M.; Muthuraman, V.

    2017-03-01

    The basic steps of modal analysis and simulation are carried out. The modal analysis is carried out on cantilever beams made of different aluminum alloys. The cantilever beam is designed in the graphical environment of ANSYS. The beam is fixed at one end, with all degrees of freedom at this end constrained so that it can neither move nor rotate. Mode shapes and natural frequencies are computed in ANSYS with a numerical formulation of the direct solver including the block Lanczos method. Aluminum alloys are widely utilized in many applications due to their excellent weight-to-strength properties. Many research works have been carried out to improve the mechanical properties of aluminum alloys. The composition of alloying elements plays a significant role in deciding the properties of an alloy. In this study a numerical analysis tool, finite element analysis (FEA), is utilized. The work presented in this paper is aimed at studying the modal behavior of different aluminum alloys. The modeling and analysis are carried out using the ANSYS FEA software. A modal analysis is carried out to understand the modal frequency behavior of the materials considered. Modal analysis plays a vital role in the design of components subjected to high vibration.

  18. Versatile software for semiautomatic analysis and processing of laser-induced plasma spectra

    International Nuclear Information System (INIS)

    Mateo, M.P.; Nicolas, G.; Pinon, V.; Alvarez, J.C.; Ramil, A.; Yanez, A.

    2005-01-01

    The present article describes the main characteristics and operation of SALIPS (software for the analysis of laser-induced plasma spectra), a computer program designed for use in spectroscopy. In recent years laser-induced plasma spectroscopy (LIPS) has grown in popularity and different applications have been developed in several fields. However, until now no software has been reported that performs the recognition of the elemental composition of a generic sample from its LIP spectrum; this must otherwise be done by hand, in a tedious process of comparing experimental peaks with emission lines from databases. For this reason, a computer program that includes several tools to provide a semi-automatic identification of the peaks of a LIP spectrum has been developed. The program, written in Microsoft® Visual Basic® code, has a user-friendly graphical interface and is a flexible tool that makes it possible to handle, edit, copy and print a quick presentation of the data, automatically including the identification results in the graph. SALIPS also provides some physical properties of the elements and includes algorithms for performing the simulation of spectra. The potential of the program is illustrated with some examples
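
    The comparison SALIPS semi-automates, matching experimental peak wavelengths against tabulated emission lines, reduces in its crudest form to a tolerance search; the sketch below uses a deliberately tiny line table (a handful of strong lines) where a production tool would query a full database.

        LINES_NM = {                  # element -> a few strong lines (nm)
            "Fe I":  [371.99, 404.58, 438.35],
            "Ca II": [393.37, 396.85],
            "Al I":  [394.40, 396.15],
        }

        def identify(peaks_nm, tol=0.05):
            """Match each peak to every tabulated line within tol nm."""
            hits = []
            for peak in peaks_nm:
                for element, lines in LINES_NM.items():
                    for line in lines:
                        if abs(peak - line) <= tol:
                            hits.append((peak, element, line))
            return hits

        print(identify([393.35, 396.17, 438.33]))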

  19. ZODET: software for the identification, analysis and visualisation of outlier genes in microarray expression data.

    Directory of Open Access Journals (Sweden)

    Daniel L Roden

    Complex human diseases can show significant heterogeneity between patients with the same phenotypic disorder. An outlier detection strategy was developed to identify variants at the level of gene transcription that are of potential biological and phenotypic importance. Here we describe a graphical software package, z-score outlier detection (ZODET), that enables identification and visualisation of gross abnormalities in gene expression (outliers) in individuals, using whole genome microarray data. The mean and standard deviation of expression in a healthy control cohort are used to detect both over- and under-expressed probes in individual test subjects. We compared the potential of ZODET to detect outlier genes in gene expression datasets with a previously described statistical method, gene tissue index (GTI), using a simulated expression dataset and a publicly available monocyte-derived macrophage microarray dataset. Taken together, these results support ZODET as a novel approach to identify outlier genes of potential pathogenic relevance in complex human diseases. The algorithm is implemented using R packages and Java. The software is freely available from http://www.ucl.ac.uk/medicine/molecular-medicine/publications/microarray-outlier-analysis.
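
    The z-score criterion at the heart of ZODET is simple enough to state directly; a sketch follows (the actual package is R/Java-based), flagging outlier probes in one test subject against a healthy control cohort on synthetic data.

        import numpy as np

        def outlier_probes(controls, subject, z_cut=3.0):
            """controls: (n_controls, n_probes); subject: (n_probes,)."""
            mu = controls.mean(axis=0)
            sd = controls.std(axis=0, ddof=1)
            z = (subject - mu) / sd
            return np.where(z > z_cut)[0], np.where(z < -z_cut)[0]

        rng = np.random.default_rng(3)
        ctrl = rng.normal(8.0, 0.5, size=(40, 1000))  # log2-like expression
        subj = rng.normal(8.0, 0.5, size=1000)
        subj[10] += 4.0                               # implant one outlier
        over, under = outlier_probes(ctrl, subj)
        print(over, under)                            # probe 10 flagged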

  20. Graphics workflow optimization when editing standard tasks using modern graphics editing programs

    OpenAIRE

    Khabirova, Maja

    2012-01-01

    This work focuses on the description and characteristics of common problems which graphic designers face daily when working for advertising agencies. This work describes tasks and organises them according to the type of graphic being processed and the types of output. In addition, this work describes the ways these common tasks can be completed using modern graphics editing software. It also provides a practical definition of a graphic designer and graphic agency. The aim of this work is to m...

  1. Expected Utility Illustrated: A Graphical Analysis of Gambles with More than Two Possible Outcomes

    Science.gov (United States)

    Chen, Frederick H.

    2010-01-01

    The author presents a simple geometric method to graphically illustrate the expected utility from a gamble with more than two possible outcomes. This geometric result gives economics students a simple visual aid for studying expected utility theory and enables them to analyze a richer set of decision problems under uncertainty compared to what…

  2. A Visualisation-Based Semiotic Analysis of Learners' Conceptual Understanding of Graphical Functional Relationships

    Science.gov (United States)

    Mudaly, Vimolan

    2014-01-01

    Within the South African school curriculum, the section on graphical functional relationships consists of signs which include symbols, notation and imagery. In a previous article we explored the role visualisation played in the way learners understood mathematical concepts. That paper reported on the learners' fixation with the physical features…

  3. Needs Analysis for Graphic Design Learning Module Based on Technology & Learning Styles of Deaf Students

    Science.gov (United States)

    Ibrahim, Zainuddin; Alias, Norlidah; Nordin, Abu Bakar

    2016-01-01

    The field of Information Communication Technology has offered a promising future for deaf students. Web design, animation, and multimedia application design are a branch of graphic design area, which aim to aid their learning visually. However, most of the technical terms cannot be interpreted in Malaysian sign language. Moreover, the development…

  4. Graphic Ecologies

    Directory of Open Access Journals (Sweden)

    Brook Weld Muller

    2014-12-01

    This essay describes strategic approaches to graphic representation that are associated with critical environmental engagement and that build from the idea of works of architecture as stitches in the ecological fabric of the city. It focuses on the building up of partial or fragmented graphics in order to describe inclusive, open-ended possibilities for making architecture that marry rich experience and responsive performance. An aphoristic approach to crafting drawings involves complex layering, conscious absence and the embracing of tension. A self-critical attitude toward the generation of imagery characterized by the notion of ‘loose precision’ may lead to more transformative and environmentally responsive architectures.

  5. Graphics gems

    CERN Document Server

    Heckbert, Paul S

    1994-01-01

    Graphics Gems IV contains practical techniques for 2D and 3D modeling, animation, rendering, and image processing. The book presents articles on polygons and polyhedra; a mix of formulas, optimized algorithms, and tutorial information on the geometry of 2D, 3D, and n-D space; transformations; and parametric curves and surfaces. The text also includes articles on ray tracing; shading 3D models; and frame buffer techniques. Articles on image processing; algorithms for graphical layout; basic interpolation methods; and subroutine libraries for vector and matrix algebra are also demonstrated.

  6. Open source software and crowdsourcing for energy analysis

    International Nuclear Information System (INIS)

    Bazilian, Morgan; Rice, Andrew; Rotich, Juliana; Howells, Mark; DeCarolis, Joseph; Macmillan, Stuart; Brooks, Cameron; Bauer, Florian; Liebreich, Michael

    2012-01-01

    Informed energy decision making requires effective software, high-quality input data, and a suitably trained user community. Developing these resources can be expensive and time consuming. Even when data and tools are intended for public re-use they often come with technical, legal, economic and social barriers that make them difficult to adopt, adapt and combine for use in new contexts. We focus on the promise of open, publicly accessible software and data as well as crowdsourcing techniques to develop robust energy analysis tools that can deliver crucial, policy-relevant insight, particularly in developing countries, where planning resources are highly constrained—and the need to adapt these resources and methods to the local context is high. We survey existing research, which argues that these techniques can produce high-quality results, and also explore the potential role that linked, open data can play in both supporting the modelling process and in enhancing public engagement with energy issues. - Highlights: ► We focus on the promise of open, publicly accessible software and data. ► These emerging techniques can produce high-quality results for energy analysis. ► Developing economies require new techniques for energy planning.

  7. PuMA: the Porous Microstructure Analysis software

    Science.gov (United States)

    Ferguson, Joseph C.; Panerai, Francesco; Borner, Arnaud; Mansour, Nagi N.

    2018-01-01

    The Porous Microstructure Analysis (PuMA) software has been developed in order to compute effective material properties and perform material response simulations on digitized microstructures of porous media. PuMA is able to import digital three-dimensional images obtained from X-ray microtomography or to generate artificial microstructures. PuMA also provides a module for interactive 3D visualizations. Version 2.1 includes modules to compute porosity, volume fractions, and surface area. Two finite difference Laplace solvers have been implemented to compute the continuum tortuosity factor, effective thermal conductivity, and effective electrical conductivity. A random method has been developed to compute tortuosity factors from the continuum to rarefied regimes. Representative elementary volume analysis can be performed on each property. The software also includes a time-dependent, particle-based model for the oxidation of fibrous materials. PuMA was developed for Linux operating systems and is available as NASA software under a US & Foreign release.
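
    The simplest of the listed modules, porosity, gives a feel for the kind of computation run on a digitized microstructure: the fraction of void voxels in the segmented 3-D image. A sketch on a random phantom follows (PuMA's own implementation is its own).

        import numpy as np

        def porosity(segmented):
            """segmented: 3-D array, 0 = void, 1 = solid."""
            return np.count_nonzero(segmented == 0) / segmented.size

        rng = np.random.default_rng(4)
        tomo = (rng.random((128, 128, 128)) < 0.35).astype(np.uint8)
        print(f"porosity = {porosity(tomo):.3f}")   # ~0.65 for this phantom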

  8. Uses of software in digital image analysis: a forensic report

    Science.gov (United States)

    Sharma, Mukesh; Jha, Shailendra

    2010-02-01

    Forensic image analysis requires expertise to interpret the content of an image, or the image itself, in legal matters. Major sub-disciplines of forensic image analysis with law enforcement applications include photogrammetry, photographic comparison, content analysis and image authentication. It has wide applications in forensic science, ranging from documenting crime scenes to enhancing faint or indistinct patterns such as partial fingerprints. The process of forensic image analysis can involve several different tasks, regardless of the type of image analysis performed. Through this paper the authors have tried to explain these tasks, which are described in three categories: Image Compression, Image Enhancement & Restoration and Measurement Extraction, with the help of examples such as signature comparison, counterfeit currency comparison and foot-wear sole impressions, using the software Canvas and Corel Draw.

  9. New analysis software for Viking Lander meteorological data

    Directory of Open Access Journals (Sweden)

    O. Kemppinen

    2013-02-01

    We have developed a set of tools that enable us to process Viking Lander meteorological data beyond what has previously been publicly available. Besides providing data for new periods of time, the tools significantly enhance the resolution of the existing data periods. This was accomplished by first transferring the original Prime computer version of the data analysis software to a standard Linux platform, and then by modifying the software to be able to process the data despite irregularities in the original raw data, and by reverse engineering various parameter files. In addition, the processing pipeline has been streamlined, making processing the data faster and easier. As a case example of new data, freshly processed Viking Lander 1 and 2 temperature records are described and briefly analyzed in ways that have not previously been possible due to the lack of data.

  10. OVERVIEW OF THE SAPHIRE PROBABILISTIC RISK ANALYSIS SOFTWARE

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L.; Wood, Ted; Knudsen, James; Ma, Zhegang

    2016-10-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. In this paper, we provide an overview of the current technical capabilities found in SAPHIRE Version 8, including the user interface and enhanced solving algorithms.

  11. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    Science.gov (United States)

    2011-09-30

    ... Software AGENCY: Nuclear Regulatory Commission. ACTION: Regulatory issue summary; request for comment... computer software package, WESTEMS TM , to demonstrate compliance with Section III, ``Rules for... Software Addressees All holders of, and applicants for, a power reactor operating license or construction...

  12. A software architectural framework specification for neutron activation analysis

    International Nuclear Information System (INIS)

    Preston, J.A.; Grant, C.N.

    2013-01-01

    Neutron Activation Analysis (NAA) is a sensitive multi-element nuclear analytical technique that has been routinely applied by research reactor (RR) facilities to environmental, nutritional, health related, geological and geochemical studies. As RR facilities face calls to increase their research output and impact, with existing or reducing budgets, automation of NAA offers a possible solution. However, automation has many challenges, not the least of which is a lack of system architecture standards to establish acceptable mechanisms for the various hardware/software and software/software interactions among data acquisition systems, specialised hardware such as sample changers, sample loaders, and data processing modules. This lack of standardization often results in automation hardware and software being incompatible with existing system components when a facility looks to automate its NAA operations. This limits the availability of automation to a few RR facilities with adequate budgets or in-house engineering resources. What is needed is a modern open system architecture for NAA that provides the required set of functionalities. This paper describes such an 'architectural framework' (OpenNAA), and portions of a reference implementation. As an example of the benefits, calculations indicate that applying this architecture to the compilation and QA steps associated with the analysis of 35 elements in 140 samples, with 14 SRMs, can reduce the time required by over 80%. The adoption of open standards in the nuclear industry has been very successful over the years in promoting interchangeability and maximising the lifetime and output of nuclear measurement systems. OpenNAA will provide similar benefits within the NAA application space, safeguarding user investments in their current system, while providing a solid path for development into the future. (author)

  13. Design, testing, and delivery of an interactive graphics display subsystem

    Science.gov (United States)

    Holmes, B.

    1973-01-01

    An interactive graphics display system was designed to be used in locating components on a printed circuit card and outputting data concerning their thermal values. The manner in which this was accomplished in terms of both hardware and software is described. An analysis of the accuracy of this approach is also included.

  14. AUSPEX: a graphical tool for X-ray diffraction data analysis.

    Science.gov (United States)

    Thorn, Andrea; Parkhurst, James; Emsley, Paul; Nicholls, Robert A; Vollmar, Melanie; Evans, Gwyndaf; Murshudov, Garib N

    2017-09-01

    In this paper, AUSPEX, a new software tool for experimental X-ray data analysis, is presented. Exploring the behaviour of diffraction intensities and the associated estimated uncertainties facilitates the discovery of underlying problems and can help users to improve their data acquisition and processing in order to obtain better structural models. The program enables users to inspect the distribution of observed intensities (or amplitudes) against resolution as well as the associated estimated uncertainties (sigmas). It is demonstrated how AUSPEX can be used to visually and automatically detect ice-ring artefacts in integrated X-ray diffraction data. Such artefacts can hamper structure determination, but may be difficult to identify from the raw diffraction images produced by modern pixel detectors. The analysis suggests that a significant portion of the data sets deposited in the PDB contain ice-ring artefacts. Furthermore, it is demonstrated how other problems in experimental X-ray data caused, for example, by scaling and data-conversion procedures can be detected by AUSPEX.
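
    The ice-ring screening described above can be caricatured in a few lines: hexagonal ice produces strong powder rings near d-spacings of roughly 3.90, 3.67 and 3.44 angstroms, so resolution shells around those spacings whose mean intensity spikes relative to neighbouring shells are suspect. A rough sketch, not the program's actual algorithm:

        import numpy as np

        ICE_D = [3.90, 3.67, 3.44]   # approximate hexagonal-ice rings (A)

        def ice_flags(d_spacing, intensity, width=0.03, factor=2.0):
            """Flag ice-ring d-spacings where the in-ring mean intensity
            exceeds factor times the neighbouring-shell mean."""
            flags = []
            for d0 in ICE_D:
                ring = np.abs(d_spacing - d0) < width
                near = (np.abs(d_spacing - d0) < 4 * width) & ~ring
                if ring.any() and near.any():
                    if intensity[ring].mean() > factor * intensity[near].mean():
                        flags.append(d0)
            return flags

        # d_spacing and intensity would come from integrated, unmerged data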

  15. BIM Software Capability and Interoperability Analysis: An analytical approach toward structural usage of BIM software (S-BIM)

    OpenAIRE

    A. Taher, Ali

    2016-01-01

    This study focused on the structural analysis of BIM models. Different commercial software packages (Autodesk products and Rhinoceros) are presented through the modelling and analysis of different structures with varying complexity, section properties, geometry, and material. Besides the commercial software, different architectural tools and different tools for structural analysis are evaluated (Dynamo, Grasshopper, add-on tools, direct link, indirect link via IFC). BIM and Structural BIM (S-BIM)

  16. Gamma-ray spectral analysis software designed for extreme ease of use or unattended operation

    International Nuclear Information System (INIS)

    Buckley, W.M.; Carlson, J.B.; Romine, W.A.

    1993-07-01

    We are developing isotopic analysis software in the Safeguards Technology Program that advances usability in two complementary directions. The first direction is towards Graphical User Interfaces (GUIs) for very easy-to-use applications. The second is toward a minimal user interface, but with additional features for unattended or fully automatic applications. We are developing a GUI-based spectral viewing engine that is currently running in the MS-Windows environment. We intend to use this core application to provide the common user interface for our data analysis, and subsequently data acquisition and instrument control applications. We are also investigating sets of cases where the MGA methodology produces reduced-accuracy results, incorrect errors, or incorrect results. We try to determine the root cause of the problem and extend or replace portions of the methodology so that MGA will function over a wider domain of analysis without requiring intervention and analysis by a spectroscopist. This effort is necessary for applications where such intervention is inconvenient or impractical.

  17. Melanie II--a third-generation software package for analysis of two-dimensional electrophoresis images: I. Features and user interface.

    Science.gov (United States)

    Appel, R D; Palagi, P M; Walther, D; Vargas, J R; Sanchez, J C; Ravier, F; Pasquali, C; Hochstrasser, D F

    1997-12-01

    Although two-dimensional electrophoresis (2-DE) computer analysis software packages have existed ever since 2-DE technology was developed, it is only now that the hardware and software technology allows large-scale studies to be performed on low-cost personal computers or workstations, and that setting up a 2-DE computer analysis system in a small laboratory is no longer considered a luxury. After a first attempt in the seventies and early eighties to develop 2-DE analysis software systems on hardware that had poor or even no graphical capabilities, followed in the late eighties by a wave of innovative software developments made possible by new graphical interface standards such as X Windows, a third generation of 2-DE analysis software packages has now come to maturity. It can be run on a variety of low-cost, general-purpose personal computers, thus making the purchase of a 2-DE analysis system easily attainable for even the smallest laboratory involved in proteome research. Melanie II 2-D PAGE, developed at the University Hospital of Geneva, is such a third-generation software system for 2-DE analysis. Based on unique image processing algorithms, this user-friendly object-oriented software package runs on multiple platforms, including Unix, MS-Windows 95 and NT, and Power Macintosh. It provides efficient spot detection and quantitation, state-of-the-art image comparison and statistical data analysis facilities, and is Internet-ready. Linked to proteome databases such as those available on the World Wide Web, it represents a valuable tool for the "Virtual Lab" of the post-genome era.

  18. An Analysis of Related Software Cycles Among Organizations, People and the Software Industry

    Science.gov (United States)

    2008-06-01

    confirmatory, theory-testing type of research. Miles and Huberman (1994) claim: Much qualitative research lies between these two extremes. Something is...either graphically or in narrative form, the main things to be studied” (Miles and Huberman 1994). The “main things” mentioned here refer to the key

  19. Statistical Methods and Software for the Analysis of Occupational Exposure Data with Non-detectable Values

    Energy Technology Data Exchange (ETDEWEB)

    Frome, EL

    2005-09-20

    Environmental exposure measurements are, in general, positive and may be subject to left censoring; i.e., the measured value is less than a "detection limit". In occupational monitoring, strategies for assessing workplace exposures typically focus on the mean exposure level or the probability that any measurement exceeds a limit. Parametric methods used to determine acceptable levels of exposure are often based on a two-parameter lognormal distribution. The mean exposure level, an upper percentile, and the exceedance fraction are used to characterize exposure levels, and confidence limits are used to describe the uncertainty in these estimates. Statistical methods for random samples (without non-detects) from the lognormal distribution are well known for each of these situations. In this report, methods for estimating these quantities based on the maximum likelihood method for randomly left-censored lognormal data are described, and graphical methods are used to evaluate the lognormal assumption. If the lognormal model is in doubt and an alternative distribution for the exposure profile of a similar exposure group is not available, then nonparametric methods for left-censored data are used. The mean exposure level, along with the upper confidence limit, is obtained using the product limit estimate, and the upper confidence limit on an upper percentile (i.e., the upper tolerance limit) is obtained using a nonparametric approach. All of these methods are well known, but computational complexity has limited their use in routine data analysis with left-censored data. The recent development of the R environment for statistical data analysis and graphics has greatly enhanced the availability of high-quality nonproprietary (open source) software that serves as the basis for implementing the methods in this paper.
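
    As a minimal illustration of the parametric approach just described (a hedged Python sketch with invented data; the report itself works in R, whose exact routines are not reproduced here), the censored-data likelihood lets detects contribute the log-density and non-detects the log-probability of falling below the detection limit:

        import numpy as np
        from scipy import optimize, stats

        def censored_lognormal_mle(values, detected):
            """MLE of (mu, sigma) for the log of left-censored lognormal data.

            values   : measured value if detected, else the detection limit
            detected : boolean mask, False marks a non-detect
            """
            x = np.log(np.asarray(values, dtype=float))
            d = np.asarray(detected, dtype=bool)

            def neg_log_lik(theta):
                mu, log_sigma = theta
                sigma = np.exp(log_sigma)                        # keeps sigma > 0
                ll = stats.norm.logpdf(x[d], mu, sigma).sum()    # detects: density
                ll += stats.norm.logcdf(x[~d], mu, sigma).sum()  # non-detects: P(X < DL)
                return -ll

            start = [x.mean(), np.log(x.std() + 1e-6)]
            res = optimize.minimize(neg_log_lik, x0=start)
            return res.x[0], np.exp(res.x[1])                    # mu-hat, sigma-hat

        # Hypothetical data: three non-detects reported at a detection limit of 0.5
        vals = [1.2, 0.8, 2.5, 0.5, 0.5, 0.5, 3.1, 1.7]
        det = [True, True, True, False, False, False, True, True]
        print(censored_lognormal_mle(vals, det))

    From the fitted mu and sigma, the exceedance fraction for a limit L follows as 1 - Phi((ln L - mu)/sigma); confidence limits require the likelihood curvature and are omitted from this sketch.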

  20. Computer graphics from basic to application

    International Nuclear Information System (INIS)

    Kim, Do Hyeong; Mun, Sung Min

    1998-04-01

    This book covers the concepts of computer graphics, their background history and necessity, and applied fields such as construction design, image processing, automobile design, fashion design and TV broadcasting; the basic principles of computers; computer graphics hardware; computer graphics software such as the Adobe Illustrator toolbox and Adobe Photoshop; QuarkXPress, including an introduction, applications and its operating environment; 3D graphics, with a summary and the differences between versions of 3D Studio and their system requirements; and AutoCAD applications.

  1. Computer graphics from basic to application

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Do Hyeong; Mun, Sung Min

    1998-04-15

    This book covers the concepts of computer graphics, their background history and necessity, and applied fields such as construction design, image processing, automobile design, fashion design and TV broadcasting; the basic principles of computers; computer graphics hardware; computer graphics software such as the Adobe Illustrator toolbox and Adobe Photoshop; QuarkXPress, including an introduction, applications and its operating environment; 3D graphics, with a summary and the differences between versions of 3D Studio and their system requirements; and AutoCAD applications.

  2. International Atomic Energy Agency intercomparison of ion beam analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Barradas, N.P. [Instituto Tecnologico e Nuclear, Estrada Nacional No. 10, Apartado 21, 2686-953 Sacavem (Portugal); Centro de Fisica Nuclear da Universidade de Lisboa, Avenida do Professor Gama Pinto 2, 1649-003 Lisboa (Portugal)], E-mail: nunoni@itn.pt; Arstila, K. [K.U. Leuven, Instituut voor Kern-en Stralingsfysica, Celestijnenlaan 200D, B-3001 Leuven (Belgium); Battistig, G. [MFA Research Institute for Technical Physics and Materials Science, P.O. Box 49, H-1525 Budapest (Hungary); Bianconi, M. [CNR-IMM-Sezione di Bologna, Via P. Gobetti, 101, I-40129 Bologna (Italy); Dytlewski, N. [International Atomic Energy Agency, Wagramer Strasse 5, P.O. Box 100, A-1400 Vienna (Austria); Jeynes, C. [Surrey Ion Beam Centre, University of Surrey, Guildford, Surrey GU2 7XH (United Kingdom); Kotai, E. [KFKI Research Institute for Particle and Nuclear Physics, P.O. Box 49, H-1525 Budapest (Hungary); Lulli, G. [CNR-IMM-Sezione di Bologna, Via P. Gobetti, 101, I-40129 Bologna (Italy); Mayer, M. [Max-Planck-Institut fuer Plasmaphysik, EURATOM Association, Boltzmannstrasse 2, D-85748 Garching (Germany); Rauhala, E. [Accelerator Laboratory, Department of Physics, University of Helsinki, P.O. Box 43, FIN-00014 Helsinki (Finland); Szilagyi, E. [KFKI Research Institute for Particle and Nuclear Physics, P.O. Box 49, H-1525 Budapest (Hungary); Thompson, M. [Department of MS and E/Bard Hall 328, Cornell University, Ithaca, NY 14853 (United States)

    2007-09-15

    Ion beam analysis (IBA) includes a group of techniques for the determination of elemental concentration depth profiles of thin film materials. Often the final results rely on simulations, fits and calculations, made by dedicated codes written for specific techniques. Here we evaluate numerical codes dedicated to the analysis of Rutherford backscattering spectrometry, non-Rutherford elastic backscattering spectrometry, elastic recoil detection analysis and non-resonant nuclear reaction analysis data. Several software packages have been presented and made available to the community. New codes regularly appear, and old codes continue to be used and occasionally updated and expanded. However, those codes have to date not been validated, or even compared to each other. Consequently, IBA practitioners use codes whose validity, correctness and accuracy have never been verified beyond the authors' own efforts. In this work, we present the results of an IBA software intercomparison exercise in which seven different packages participated. These were DEPTH, GISA, DataFurnace (NDF), RBX, RUMP, SIMNRA (all analytical codes) and MCERD (a Monte Carlo code). In a first step, a series of simulations were defined, testing different capabilities of the codes for fixed conditions. In a second step, a set of real experimental data were analysed. The main conclusion is that the codes perform well within the limits of their design, and that the largest differences in the results obtained are due to differences in the fundamental databases used (stopping power and scattering cross section). In particular, spectra can be calculated including Rutherford cross sections with screening, energy resolution convolutions including energy straggling, and pileup effects, with agreement between the codes at the 0.1% level. The same agreement is also available for the non-RBS techniques. This agreement is not limited to calculation of spectra from particular structures with predetermined

  3. Research and Development of Statistical Analysis Software System of Maize Seedling Experiment

    OpenAIRE

    Hui Cao

    2014-01-01

    In this study, software engineering methods were used to develop a software system for the statistics and analysis of maize seedling experiments. During development, a B/S-structure software design method was used and a set of statistical indicators for maize seedling evaluation was established. The experimental results indicated that this software system could perform quality statistics and analysis for maize seedlings very well. The development of this software system explored a...

  4. A Parallel Software Pipeline for DMET Microarray Genotyping Data Analysis

    Directory of Open Access Journals (Sweden)

    Giuseppe Agapito

    2018-06-01

    Personalized medicine is an aspect of P4 medicine (predictive, preventive, personalized and participatory), based on the customization of all medical characteristics of each subject. In personalized medicine, the development of medical treatments and drugs is tailored to the individual characteristics and needs of each subject, according to the study of diseases at different scales, from genotype to phenotype. To make the goal of personalized medicine concrete, it is necessary to employ high-throughput methodologies such as Next Generation Sequencing (NGS), Genome-Wide Association Studies (GWAS), mass spectrometry or microarrays, which are able to investigate a single disease from a broad perspective. A side effect of high-throughput methodologies is the massive amount of data produced for each single experiment, which poses several challenges (e.g., high execution time and memory requirements) to bioinformatics software. Thus a main requirement of modern bioinformatics software is the use of good software engineering methods and efficient programming techniques able to face those challenges, including the use of parallel programming and of efficient, compact data structures. This paper presents the design and the experimentation of a comprehensive software pipeline, named microPipe, for the preprocessing, annotation and analysis of microarray-based Single Nucleotide Polymorphism (SNP) genotyping data. A use case in pharmacogenomics is presented. The main advantages of using microPipe are: the reduction of errors that may happen when trying to make data compatible among different tools; the possibility to analyze huge datasets in parallel; and the easy annotation and integration of data. microPipe is available under a Creative Commons license and is freely downloadable for academic and not-for-profit institutions.

  5. Comparison of two software versions for assessment of body-composition analysis by DXA

    DEFF Research Database (Denmark)

    Vozarova, B; Wang, J; Weyer, C

    2001-01-01

    To compare two software versions provided by the Lunar Co. for assessment of body-composition analysis by DXA.

  6. Theoretical Foundations of Software Technology.

    Science.gov (United States)

    1983-02-14

    major research interests are software testing, artificial intelligence, pattern recognition, and computer graphics. Dr. Chandrasekaran is currently...produce PASCAL language code for the problems. Because of its relationship to many issues in Artificial Intelligence, we also investigated problems of...analysis to concurrent-process software re- are not "intelligent" enough to discover these by themselves, ... more complex control flow models. The PAF

  7. Probabilistic Graphical Models for the Analysis and Synthesis of Musical Audio

    Science.gov (United States)

    2010-11-01

    Graphical model for the HDP...Chinese Restaurant Franchise (CRF) for three groups of eight observations...associated with observations indirectly through table assignments in the Chinese Restaurant Franchise (CRF). This means that the concentration...other kj,−i in the same song j and on the global component proportions β is given by the Chinese restaurant franchise: p(kji | kj,−i, β, α) = n·kj

  8. Three dimensional analysis of coelacanth body structure by computer graphics and X-ray CT images

    International Nuclear Information System (INIS)

    Suzuki, Naoki; Hamada, Takashi.

    1990-01-01

    Three-dimensional imaging processes were applied for the structural and functional analysis of the modern coelacanth (Latimeria chalumnae). Visualization of the obtained images is performed with computer graphics on the basis of serial images acquired by X-ray CT scanning. Reconstruction of three-dimensional images of the body structure of the coelacanth using the volume rendering and surface rendering methods provides various information about the external and internal shapes of this exquisite fish. (author)

  9. Formal Model for Data Dependency Analysis between Controls and Actions of a Graphical User Interface

    Directory of Open Access Journals (Sweden)

    SKVORC, D.

    2012-02-01

    End-user development is an emerging computer science discipline that provides programming paradigms, techniques, and tools suitable for users not trained in software engineering. One of the techniques that allow ordinary computer users to develop their own applications without the need to learn a classic programming language is GUI-level programming based on programming-by-demonstration. To build wizard-based tools that assist users in application development and to verify the correctness of user programs, a computer-supported method for GUI-level data dependency analysis is necessary, and therefore a formal model for GUI representation is needed. In this paper, we present a finite state machine for modeling the data dependencies between GUI controls and GUI actions. Furthermore, we present an algorithm for automatic construction of the finite state machine for an arbitrary GUI application. We show that the proposed state aggregation scheme successfully manages state explosion in the state machine construction algorithm, which makes the model applicable to applications with complex GUIs.
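
    To picture the kind of dependency information such a model must expose, the following hedged Python toy (control and action names are invented; this is a fixpoint over read/write sets, not the paper's finite state machine or notation) computes which controls can influence a given control through GUI actions:

        # Each GUI action reads some controls and writes others (invented names).
        actions = {
            "compute_total": {"reads": {"price", "quantity"}, "writes": {"total"}},
            "apply_discount": {"reads": {"total", "coupon"}, "writes": {"total"}},
            "show_summary": {"reads": {"total"}, "writes": {"summary"}},
        }

        def depends_on(control, actions):
            """All controls whose values can flow, transitively, into `control`."""
            deps, frontier = set(), {control}
            while frontier:
                c = frontier.pop()
                for action in actions.values():
                    if c in action["writes"]:
                        new = action["reads"] - deps
                        deps |= new
                        frontier |= new
            return deps - {control}

        # -> price, quantity, coupon and total (in some order)
        print(depends_on("summary", actions))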

  10. Coulomb 3.3 Graphic-rich deformation and stress-change software for earthquake, tectonic, and volcano research and teaching-user guide

    Science.gov (United States)

    Toda, Shingi; Stein, Ross S.; Sevilgen, Volkan; Lin, Jian

    2011-01-01

    Coulomb is intended both for publication-directed research and for college and graduate school classroom instruction. We believe that one learns best when one can see the most and can explore alternatives quickly. So the principal feature of Coulomb is ease of input, rapid interactive modification, and intuitive visualization of the results. The program has menus and check-items, and dialogue boxes to ease operation. The internal graphics are suitable for publication, and can be easily imported into Illustrator, GMT, Google Earth, or Flash for further enhancements.

  11. Graph based communication analysis for hardware/software codesign

    DEFF Research Database (Denmark)

    Knudsen, Peter Voigt; Madsen, Jan

    1999-01-01

    In this paper we present a coarse-grain CDFG (Control/Data Flow Graph) model suitable for hardware/software partitioning of single processes and demonstrate how it is necessary to perform various transformations on the graph structure before partitioning in order to achieve a structure that allows for accurate estimation of communication overhead between nodes mapped to different processors. In particular, we demonstrate how various transformations of control structures can lead to a more accurate communication analysis and more efficient implementations. The purpose of the transformations is to obtain

  12. Development of RCM analysis software for Korean nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Ho; Choi, Kwang Hee; Jeong, Hyeong Jong [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    1999-12-31

    A software package called the KEPCO RCM workstation (KRCM) has been developed to optimize the maintenance strategies of Korean nuclear power plants. The program modules of the KRCM were designed in a manner that combines EPRI methodologies and KEPRI analysis techniques. The KRCM is being applied to three pilot systems: the chemical and volume control system, the main steam system, and the compressed air system of Yonggwang Units 1 and 2. In addition, the KRCM can be utilized as a tool to meet part of the requirements of the maintenance rule (MR) imposed by the U.S. NRC. 3 refs., 4 figs. (Author)

  13. Development of RCM analysis software for Korean nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Ho; Choi, Kwang Hee; Jeong, Hyeong Jong [Korea Electric Power Research Institute, Taejon (Korea, Republic of)

    1998-12-31

    A software package called the KEPCO RCM workstation (KRCM) has been developed to optimize the maintenance strategies of Korean nuclear power plants. The program modules of the KRCM were designed in a manner that combines EPRI methodologies and KEPRI analysis techniques. The KRCM is being applied to three pilot systems: the chemical and volume control system, the main steam system, and the compressed air system of Yonggwang Units 1 and 2. In addition, the KRCM can be utilized as a tool to meet part of the requirements of the maintenance rule (MR) imposed by the U.S. NRC. 3 refs., 4 figs. (Author)

  14. Discriminant Analysis of the Effects of Software Cost Drivers on ...

    African Journals Online (AJOL)

    The paper investigates the effect of software cost drivers on the project schedule estimation of software development projects in Nigeria. Specifically, the paper determines the extent to which software cost variables affect software project time schedules in our environment. Such studies are lacking in the recent ...

  15. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
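
    The n-factor combinatorial idea can be sketched for n = 2 (pairwise coverage). The fragment below is a hedged Python toy rather than the actual tool; the parameter names and the greedy random search are illustrative only:

        import itertools
        import random

        params = {                       # hypothetical simulation parameters
            "mass_kg": [80, 100, 120],
            "thrust_level": [0.5, 0.75, 1.0],
            "sensor_noise": ["low", "high"],
        }

        def pairwise_cases(params, tries=200, seed=0):
            """Greedily cover every 2-factor value combination with few cases."""
            rng = random.Random(seed)
            names = list(params)
            uncovered = {((a, va), (b, vb))
                         for a, b in itertools.combinations(names, 2)
                         for va in params[a] for vb in params[b]}
            cases = []
            while uncovered:
                best, best_gain = None, 0
                for _ in range(tries):   # keep the candidate covering most pairs
                    cand = {n: rng.choice(params[n]) for n in names}
                    gain = sum(cand[a] == va and cand[b] == vb
                               for (a, va), (b, vb) in uncovered)
                    if gain > best_gain:
                        best, best_gain = cand, gain
                if best is None:         # force progress on a stubborn pair
                    (a, va), (b, vb) = next(iter(uncovered))
                    best = {n: rng.choice(params[n]) for n in names}
                    best[a], best[b] = va, vb
                cases.append(best)
                uncovered = {p for p in uncovered
                             if not (best[p[0][0]] == p[0][1]
                                     and best[p[1][0]] == p[1][1])}
            return cases

        for case in pairwise_cases(params):
            print(case)

    Here a handful of cases covers all 21 two-factor combinations that exhaustive enumeration would need 18 runs to exercise; the saving grows quickly with the number of parameters.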

  16. Development of a software for INAA analysis automation

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.; Figueiredo, Ana Maria G.; Ticianelli, Regina B.

    2013-01-01

    In this work, software has been developed to automate the post-counting tasks in comparative INAA; it aims to be more flexible than the available options, integrating with some of the routines currently in use in the IPEN Activation Analysis Laboratory and allowing the user to choose between a fully automatic analysis and an Excel-oriented one. The software makes use of the Genie 2000 data importing and analysis routines and stores each 'energy-counts-uncertainty' table as a separate ASCII file that can be used later if required by the analyst. Moreover, it generates an Excel-compatible CSV (comma-separated values) file with only the relevant results from the analyses for each sample or comparator, as well as the results of the concentration calculations and the results obtained with four different statistical tools (unweighted average, weighted average, normalized residuals and the Rajeval technique), allowing the analyst to double-check the results. Finally, a 'summary' CSV file is also produced, with the final concentration results obtained for each element in each sample. (author)
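
    Two of the four statistical tools named above have simple textbook forms. The following hedged Python sketch (invented numbers, a simplified normalized-residuals test, and no connection to the Genie 2000 routines) shows how replicate results can be combined and cross-checked:

        import numpy as np

        def weighted_average(x, u):
            """Inverse-variance weighted mean and its standard uncertainty."""
            x, u = np.asarray(x, float), np.asarray(u, float)
            w = 1.0 / u**2
            return np.sum(w * x) / np.sum(w), np.sqrt(1.0 / np.sum(w))

        def normalized_residuals(x, u):
            """Each result's distance from the weighted mean, in sigma units;
            values beyond about 2-3 flag potentially discrepant measurements."""
            mean, _ = weighted_average(x, u)
            return (np.asarray(x, float) - mean) / np.asarray(u, float)

        conc = [12.1, 11.8, 12.6, 14.9]   # replicate results for one element (mg/kg)
        unc = [0.4, 0.5, 0.6, 0.5]
        print(weighted_average(conc, unc))
        print(normalized_residuals(conc, unc))   # the 14.9 result stands out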

  17. What is sought from graphic designers? A first thematic analysis of job offers for graphic design positions in the United Kingdom

    OpenAIRE

    Nicoletti Dziobczenski, Paulo; Person, Oscar

    2016-01-01

    An empirically grounded understanding of which knowledge and skills are sought from designers is missing for a number of professional subfields of design. This gap in research challenges i) design educators in planning their educational offerings and ii) design practitioners and students in articulating their contribution to clients and future employers. In this paper, we study the references that are made to knowledge and skills in job offers for graphic designers in the UK. Based on a f

  18. Models for composing software : an analysis of software composition and objects

    NARCIS (Netherlands)

    Bergmans, Lodewijk

    1999-01-01

    In this report, we investigate component-based software construction with a focus on composition. In particular we try to analyze the requirements and issues for components and software composition. As a means to understand this research area, we introduce a canonical model for representing

  19. MetaComp: comprehensive analysis software for comparative meta-omics including comparative metagenomics.

    Science.gov (United States)

    Zhai, Peng; Yang, Longshu; Guo, Xiao; Wang, Zhe; Guo, Jiangtao; Wang, Xiaoqi; Zhu, Huaiqiu

    2017-10-02

    During the past decade, the development of high-throughput nucleic acid sequencing and mass spectrometry analysis techniques has enabled the characterization of microbial communities through metagenomics, metatranscriptomics, metaproteomics and metabolomics data. To reveal the diversity of microbial communities and the interactions between living conditions and microbes, it is necessary to introduce comparative analysis based upon the integration of all four types of data mentioned above. Comparative meta-omics, especially comparative metagenomics, has been established as a routine process to highlight significant differences in taxon composition and functional gene abundance among microbiota samples. Meanwhile, biologists are increasingly concerned about the correlations between meta-omics features and environmental factors, which may further decipher the adaptation strategy of a microbial community. We developed comprehensive graphical analysis software named MetaComp, comprising a series of statistical analysis approaches with visualized results for metagenomics and other meta-omics data comparison. This software is capable of reading files generated by a variety of upstream programs. After data loading, it offers analyses such as multivariate statistics; hypothesis testing for two-sample, multi-sample and two-group designs; and a novel function-regression analysis of environmental factors. Here, regression analysis regards meta-omic features as independent variables and environmental factors as dependent variables. Moreover, MetaComp is capable of automatically choosing an appropriate two-group sample test based upon the traits of the input abundance profiles. We further evaluate the performance of its choice, and exhibit applications for metagenomics, metaproteomics and metabolomics samples. MetaComp, integrative software applicable to all meta-omics data, distills the influence of the living environment on a microbial community by regression analysis

  20. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    DEFF Research Database (Denmark)

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

    In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient—if poorly implemented—set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns, including arguments for community-wide open source software development and “big data” compatible solutions for the future. However, there is clearly a real need for robust tools, standard operating procedures and general acceptance of best practices. Thus we submit to the proteomics community a call for a community-wide open set of proteomics analysis challenges—PROTEINCHALLENGE—that directly target and compare data analysis workflows. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate.

  1. An open source software for analysis of dynamic contrast enhanced magnetic resonance images: UMMPerfusion revisited.

    Science.gov (United States)

    Zöllner, Frank G; Daab, Markus; Sourbron, Steven P; Schad, Lothar R; Schoenberg, Stefan O; Weisser, Gerald

    2016-01-14

    Perfusion imaging has become an important image-based tool to derive physiological information in various applications, like tumor diagnostics and therapy, stroke, (cardio-)vascular diseases, or functional assessment of organs. However, even after 20 years of intense research in this field, perfusion imaging still remains a research tool without broad clinical usage. One problem is the lack of standardization in technical aspects which have to be considered for successful quantitative evaluation; the second problem is a lack of tools that allow a direct integration into the diagnostic workflow in radiology. Five compartment models, namely a one-compartment model (1CP), a two-compartment exchange model (2CXM), a two-compartment uptake model (2CUM), a two-compartment filtration model (2FM) and finally the extended Tofts model (ETM), were implemented as a plugin for the DICOM workstation OsiriX. Moreover, the plugin has a clean graphical user interface and provides means for quality management during perfusion data analysis. Based on reference test data, the implementation was validated against a reference implementation. No differences were found in the calculated parameters. We developed open source software to analyse DCE-MRI perfusion data. The software is designed as a plugin for the DICOM workstation OsiriX. It features a clean GUI and provides a simple workflow for data analysis, while it could also be seen as a toolbox providing an implementation of several recent compartment models to be applied in research tasks. Integration into the infrastructure of a radiology department is given via OsiriX. Results can be saved automatically, and reports generated automatically during data analysis ensure a degree of quality control.
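
    To make the modelling step concrete, here is a hedged Python toy of the simplest of the five models, the one-compartment model (1CP), fitted to synthetic data; the arterial input function and parameter values are invented, and this is not the OsiriX plugin's implementation:

        import numpy as np
        from scipy.optimize import curve_fit

        t = np.linspace(0, 120, 121)            # time in seconds
        aif = (t / 30.0) * np.exp(-t / 30.0)    # toy arterial input function

        def one_compartment(t, Fp, Tc):
            """1CP tissue curve: plasma flow Fp times the AIF convolved with
            the compartment's exponential impulse response exp(-t/Tc)."""
            dt = t[1] - t[0]
            irf = np.exp(-t / Tc)
            return Fp * np.convolve(aif, irf)[: len(t)] * dt

        rng = np.random.default_rng(1)
        measured = one_compartment(t, 0.05, 15.0) + rng.normal(0, 1e-4, t.size)

        (Fp_hat, Tc_hat), _ = curve_fit(one_compartment, t, measured, p0=[0.01, 10.0])
        print(Fp_hat, Tc_hat)                   # close to the true 0.05 and 15.0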

  2. An open source software for analysis of dynamic contrast enhanced magnetic resonance images: UMMPerfusion revisited

    International Nuclear Information System (INIS)

    Zöllner, Frank G.; Daab, Markus; Sourbron, Steven P.; Schad, Lothar R.; Schoenberg, Stefan O.; Weisser, Gerald

    2016-01-01

    Perfusion imaging has become an important image-based tool to derive physiological information in various applications, like tumor diagnostics and therapy, stroke, (cardio-)vascular diseases, or functional assessment of organs. However, even after 20 years of intense research in this field, perfusion imaging still remains a research tool without broad clinical usage. One problem is the lack of standardization in technical aspects which have to be considered for successful quantitative evaluation; the second problem is a lack of tools that allow a direct integration into the diagnostic workflow in radiology. Five compartment models, namely a one-compartment model (1CP), a two-compartment exchange model (2CXM), a two-compartment uptake model (2CUM), a two-compartment filtration model (2FM) and finally the extended Tofts model (ETM), were implemented as a plugin for the DICOM workstation OsiriX. Moreover, the plugin has a clean graphical user interface and provides means for quality management during perfusion data analysis. Based on reference test data, the implementation was validated against a reference implementation. No differences were found in the calculated parameters. We developed open source software to analyse DCE-MRI perfusion data. The software is designed as a plugin for the DICOM workstation OsiriX. It features a clean GUI and provides a simple workflow for data analysis, while it could also be seen as a toolbox providing an implementation of several recent compartment models to be applied in research tasks. Integration into the infrastructure of a radiology department is given via OsiriX. Results can be saved automatically, and reports generated automatically during data analysis ensure a degree of quality control.

  3. Hazard Analysis of Software Requirements Specification for Process Module of FPGA-based Controllers in NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Sejin; Kim, Eui-Sub; Yoo, Junbeom [Konkuk University, Seoul (Korea, Republic of); Keum, Jong Yong; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2016-10-15

    Software in PLCs and FPGAs used to develop I&C systems should also be analyzed for hazards and risks before use. NUREG/CR-6430 proposes a method for performing software hazard analysis. It suggests analysis techniques for software-affected hazards and notes that software hazard analysis should be performed across the software life cycle phases, such as requirements analysis, design, detailed design and implementation. It also provides guide phrases for applying software hazard analysis. HAZOP (hazard and operability analysis) is one of the analysis techniques introduced in NUREG/CR-6430, and it is a useful technique for applying guide phrases. HAZOP is sometimes used to analyze the safety of software. The analysis method of NUREG/CR-6430 has been used for PLC software development in Korean nuclear power plants: appropriate guide phrases and analysis processes were selected for efficient application, and these studies identified NUREG/CR-6430 as providing applicable methods for software hazard analysis. We perform software hazard analysis of an FPGA software requirements specification with two approaches: NUREG/CR-6430, and HAZOP using general guide words (GW). We also perform a comparative analysis of the two. The NUREG/CR-6430 approach has several pros and cons compared with the HAZOP approach using general guide words. It is sufficiently applicable for analyzing the software requirements specification of an FPGA.

  4. Resurfacing Graphics

    Directory of Open Access Journals (Sweden)

    Prof. Patty K. Wongpakdee

    2013-06-01

    “Resurfacing Graphics” deals with the subject of unconventional design, with the purpose of engaging the viewer to experience the graphics beyond paper’s passive surface. Unconventional designs serve to reinvigorate people, whose senses are dulled by the typical printed graphics which bombard them each day. Today’s cutting-edge designers, illustrators and artists utilize graphics in a unique manner that allows for tactile interaction. Such works serve as valuable teaching models and encourage students to do the following: 1) investigate the trans-disciplines of art and technology; 2) appreciate that this approach can have a positive effect on the environment; 3) examine and research other approaches to design communication; and 4) utilize new mediums to stretch the boundaries of artistic endeavor. This paper examines how visual communicators are “Resurfacing Graphics” by using atypical surfaces and materials such as textile, wood, ceramics and even water. Such non-traditional transmissions of visual language serve to demonstrate students’ overreliance on paper as an outdated medium. With this exposure, students can become forward-thinking, eco-friendly, creative leaders by expanding their creative breadth and continuing the perpetual exploration for new ways to make their mark.

  5. Resurfacing Graphics

    Directory of Open Access Journals (Sweden)

    Prof. Patty K. Wongpakdee

    2013-06-01

    “Resurfacing Graphics” deals with the subject of unconventional design, with the purpose of engaging the viewer to experience the graphics beyond paper’s passive surface. Unconventional designs serve to reinvigorate people, whose senses are dulled by the typical printed graphics which bombard them each day. Today’s cutting-edge designers, illustrators and artists utilize graphics in a unique manner that allows for tactile interaction. Such works serve as valuable teaching models and encourage students to do the following: 1) investigate the trans-disciplines of art and technology; 2) appreciate that this approach can have a positive effect on the environment; 3) examine and research other approaches to design communication; and 4) utilize new mediums to stretch the boundaries of artistic endeavor. This paper examines how visual communicators are “Resurfacing Graphics” by using atypical surfaces and materials such as textile, wood, ceramics and even water. Such non-traditional transmissions of visual language serve to demonstrate students’ overreliance on paper as an outdated medium. With this exposure, students can become forward-thinking, eco-friendly, creative leaders by expanding their creative breadth and continuing the perpetual exploration for new ways to make their mark.

  6. Don't Blame the Software: Using Qualitative Data Analysis Software Successfully in Doctoral Research

    Directory of Open Access Journals (Sweden)

    Michelle Salmona

    2016-07-01

    In this article, we explore the learning experiences of doctoral candidates as they use qualitative data analysis software (QDAS). Of particular interest is the process of adopting technology during the development of research methodology. Using an action research approach, data were gathered over five years from advanced doctoral research candidates and supervisors. The technology acceptance model (TAM) was then applied as a theoretical analytic lens for better understanding how students interact with new technology. Findings relate to two significant barriers which doctoral students confront: 1. aligning perceptions of ease of use and usefulness is essential in overcoming resistance to technological change; 2. transparency into the research process through technology promotes insights into methodological challenges. Transitioning through both barriers requires a competent foundation in qualitative research. The study acknowledges the importance of higher degree research, curriculum reform and doctoral supervision in post-graduate research training, together with their interconnected relationships in support of high-quality inquiry. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1603117

  7. eXtended CASA Line Analysis Software Suite (XCLASS)

    Science.gov (United States)

    Möller, T.; Endres, C.; Schilke, P.

    2017-02-01

    The eXtended CASA Line Analysis Software Suite (XCLASS) is a toolbox for the Common Astronomy Software Applications package (CASA) containing new functions for modeling interferometric and single-dish data. Among the tools is the myXCLASS program, which calculates synthetic spectra by solving the radiative transfer equation for an isothermal object in one dimension, taking finite source size and dust attenuation into account as well. Molecular data required by the myXCLASS program are taken from an embedded SQLite3 database containing entries from the Cologne Database for Molecular Spectroscopy (CDMS) and JPL, accessed via the Virtual Atomic and Molecular Data Center (VAMDC) portal. Additionally, the toolbox provides an interface for the model optimizer package Modeling and Analysis Generic Interface for eXternal numerical codes (MAGIX), which helps to find the best description of observational data using myXCLASS (or another external model program), that is, the parameter set that most closely reproduces the data. http://www.astro.uni-koeln.de/projects/schilke/myXCLASSInterface A copy of the code is available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/598/A7
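
    The isothermal one-dimensional radiative transfer solution underlying such modeling has a compact closed form. The following hedged Python sketch uses the standard textbook expression, not the actual myXCLASS code (which additionally treats source size and dust attenuation):

        import numpy as np

        H_OVER_K = 4.7992e-11   # Planck constant over Boltzmann constant, in K s

        def j_nu(T, nu):
            """Radiation temperature J_nu(T) = (h nu / k) / (exp(h nu / k T) - 1)."""
            return (H_OVER_K * nu) / np.expm1(H_OVER_K * nu / T)

        def brightness_temperature(nu, T_ex, tau, T_bg=2.73):
            """Isothermal slab: T_B = (J(T_ex) - J(T_bg)) * (1 - exp(-tau))."""
            return (j_nu(T_ex, nu) - j_nu(T_bg, nu)) * -np.expm1(-tau)

        nu = 230.538e9          # CO(2-1) rest frequency in Hz
        for tau in (0.1, 1.0, 10.0):
            print(tau, brightness_temperature(nu, T_ex=50.0, tau=tau))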

  8. Sub-pixel analysis to support graphic security after scanning at low resolution

    Science.gov (United States)

    Haas, Bertrand; Cordery, Robert; Gou, Hongmei; Decker, Steve

    2006-02-01

    Whether in the domain of audio, video or finance, our world tends to become increasingly digital. However, for diverse reasons, the transition from analog to digital is often much extended in time, and proceeds in long steps (and sometimes never completes). One such step is the conversion of information on analog media to digital information. We focus in this paper on the conversion (scanning) of printed documents to digital images. Analog media have the advantage over digital channels that they can harbor much imperceptible information that can be used for fraud detection and forensic purposes. But this secondary information usually fails to be retrieved during the conversion step. This is particularly relevant since the Check 21 Act (Check Clearing for the 21st Century Act) became effective in 2004 and allows images of checks to be handled by banks as usual paper checks. We use this situation of check scanning as our primary benchmark for graphic security features after scanning. We will first present a quick review of the most common graphic security features currently found on checks, with their specific purpose, qualities and disadvantages, and we demonstrate their poor survivability under the average scanning conditions expected from the Check 21 Act. We will then present a novel method of measuring distances between, and rotations of, line elements in a scanned image: based on an appropriate print model, we refine direct measurements to an accuracy beyond the size of a scanning pixel, so we can then determine expected distances, periodicity, sharpness and print quality of known characters, symbols and other graphic elements in a document image. Finally we will apply our method to fraud detection of documents after gray-scale scanning at 300 dpi resolution. We show in particular that alterations on legitimate checks or copies of checks can be successfully detected by measuring with sub-pixel accuracy the irregularities inherently introduced
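
    One common way to reach accuracy below the size of a scanning pixel, in the spirit of the refinement described (a hedged Python toy, not the authors' print-model method), is to fit a parabola through a sampled line profile's extreme value and its two neighbours:

        import numpy as np

        def subpixel_peak(profile):
            """Peak location of a 1-D profile, refined below one pixel by
            fitting a parabola through the maximum and its two neighbours."""
            p = np.asarray(profile, float)
            i = int(np.argmax(p))
            if i == 0 or i == len(p) - 1:
                return float(i)              # border peak: no refinement possible
            denom = p[i - 1] - 2.0 * p[i] + p[i + 1]
            return i + 0.5 * (p[i - 1] - p[i + 1]) / denom

        x = np.arange(10)
        profile = np.exp(-((x - 5.3) ** 2) / 2.0)   # synthetic line centred at 5.3
        print(subpixel_peak(profile))               # ~5.25: well below one pixel off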

  9. System and software safety analysis for the ERA control computer

    International Nuclear Information System (INIS)

    Beerthuizen, P.G.; Kruidhof, W.

    2001-01-01

    The European Robotic Arm (ERA) is a seven-degrees-of-freedom relocatable anthropomorphic robotic manipulator system, to be used in manned space operation on the International Space Station, supporting the assembly and external servicing of the Russian segment. The safety design concept and implementation of the ERA are described, in particular with respect to the central computer's software design. A top-down analysis and specification process is used to flow down the safety aspects of the ERA system to the subsystems, which are produced by a consortium of companies in many countries. The user requirements documents and the critical function list are the key documents in this process. Bottom-up analysis (FMECA) and testing, at both subsystem and system level, are the basis for safety verification. A number of examples show the use of the approach and methods used.

  10. Compositional Solution Space Quantification for Probabilistic Software Analysis

    Science.gov (United States)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
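
    The quantification step reduces to estimating the probability mass of a constraint over a bounded box. Below is a hedged Python toy of the plain Monte Carlo baseline with an invented path condition; the paper's compositional approach additionally uses interval constraint propagation to discard empty sub-boxes and focus samples where solutions lie:

        import random

        def path_condition(x, y):   # invented constraint from symbolic execution
            return x * x + y > 1.0 and x - y < 0.5

        def estimate_fraction(bounds, n=100_000, seed=0):
            """Hit-or-miss Monte Carlo estimate of the satisfying fraction."""
            rng = random.Random(seed)
            (lox, hix), (loy, hiy) = bounds
            hits = sum(path_condition(rng.uniform(lox, hix), rng.uniform(loy, hiy))
                       for _ in range(n))
            p = hits / n
            return p, (p * (1 - p) / n) ** 0.5   # estimate and binomial std. error

        print(estimate_fraction(((-2.0, 2.0), (-2.0, 2.0))))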

  11. HistFitter software framework for statistical data analysis

    CERN Document Server

    Baak, M.; Côte, D.; Koutsman, A.; Lorenz, J.; Short, D.

    2015-01-01

    We present a software framework for statistical data analysis, called HistFitter, that has been used extensively by the ATLAS Collaboration to analyze big datasets originating from proton-proton collisions at the Large Hadron Collider at CERN. Since 2012 HistFitter has been the standard statistical tool in searches for supersymmetric particles performed by ATLAS. HistFitter is a programmable and flexible framework to build, book-keep, fit, interpret and present results of data models of nearly arbitrary complexity. Starting from an object-oriented configuration, defined by users, the framework builds probability density functions that are automatically fitted to data and interpreted with statistical tests. A key innovation of HistFitter is its design, which is rooted in core analysis strategies of particle physics. The concepts of control, signal and validation regions are woven into its very fabric. These are progressively treated with statistically rigorous built-in methods. Being capable of working with mu...

  12. Facts and figures: a graphical analysis of world energy up to 1991

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Various data are given in graphical form on primary energy production, consumption and reserves. The energy carriers considered are oil, gas, coal, hydro and nuclear (uranium). The subdivision of countries is done in categories such as developing countries, OECD, Eastern Europe and OPEC, or alternatively by geographical region. On page 18 there is a section on 'nuclear power electricity generation capacity'. Another group of data covers non-energy figures such as GDP and trade; here the partners are the groups 'developing countries' vs 'industrialized countries'.

  13. An Analysis of Related Software Cycles Among Organizations, People and the Software Industry

    National Research Council Canada - National Science Library

    Moore, Robert; Adams, Brady

    2008-01-01

    .... This thesis intends to explore the moderating factors of these three distinct and disjointed cycles and propose courses of action towards mitigating various issues and problems inherent in the software upgrade process...

  14. Computer graphics in piping structural engineering

    International Nuclear Information System (INIS)

    Revesz, Z.

    1985-01-01

    Computer graphics in piping structural engineering is gaining in popularity. The large number of systems and the growing complexity of the load cases and structure models require human assimilation of large amounts of data. An effort has been made to ease the evaluation of numerical data and to visualize as much of it as possible, thus eliminating a source of error and accelerating analysis and reporting. The product of this effort is PAID, the Piping Analysis and Interactive Design software. While developing PAID, interest has been focused on accelerating the work done mainly by PIPESTRESS. Some installed and tested capabilities of PAID are presented in this paper. Examples are given from the graphic output in report form, and the conversation necessary to obtain it is demonstrated. (orig.)

  15. Using CASE Software to Teach Undergraduates Systems Analysis and Design.

    Science.gov (United States)

    Wilcox, Russell E.

    1988-01-01

    Describes the design and delivery of a college course for information system students utilizing a Computer-Aided Software Engineering program. Discusses class assignments, cooperative learning, student attitudes, and the advantages of using this software in the course. (CW)

  16. A menu-driven software package of Bayesian nonparametric (and parametric) mixed models for regression analysis and density estimation.

    Science.gov (United States)

    Karabatsos, George

    2017-02-01

    Most of applied statistics involves regression analysis of data. In practice, it is important to specify a regression model that has minimal assumptions which are not violated by data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone and menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, constructed from MATLAB Compiler. Currently, this package gives the user a choice from 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable, and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis), and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, Pitman-Yor process (including the normalized stable process), beta (two-parameter) process, normalized inverse-Gaussian process, geometric weights prior, dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and model predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options. This includes output of MCMC convergence analyses, and estimates of the model's posterior predictive distribution, for selected
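
    The BNP mixture priors listed above share one standard construction. As a hedged Python sketch (illustrative only, not the package's MATLAB internals), Dirichlet process weights can be drawn by truncated stick-breaking:

        import numpy as np

        def stick_breaking(alpha, n_atoms, rng):
            """Truncated stick-breaking weights of a Dirichlet process prior."""
            betas = rng.beta(1.0, alpha, size=n_atoms)
            remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas[:-1])))
            return betas * remaining

        rng = np.random.default_rng(42)
        w = stick_breaking(alpha=2.0, n_atoms=20, rng=rng)
        atoms = rng.normal(0.0, 3.0, size=20)       # component means from a base measure

        z = rng.choice(20, size=5, p=w / w.sum())   # renormalize the truncation
        print(w.sum(), atoms[z])                    # weights near 1; sampled means

    Larger alpha spreads the weights over more components; the truncation level only needs to be large enough that the discarded tail mass is negligible.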

  17. SCALE Graphical Developments for Improved Criticality Safety Analyses

    International Nuclear Information System (INIS)

    Barnett, D.L.; Bowman, S.M.; Horwedel, J.E.; Petrie, L.M.

    1999-01-01

    New computer graphics developments at Oak Ridge National Laboratory (ORNL) are being used to provide visualization of criticality safety models and calculational results, as well as tools for criticality safety analysis input preparation. The purpose of this paper is to present the status of current development efforts to continue to enhance the SCALE (Standardized Computer Analyses for Licensing Evaluations) computer software system. Applications for criticality safety analysis in the areas of three-dimensional (3-D) model visualization, input preparation and execution via a graphical user interface (GUI), and two-dimensional (2-D) plotting of results are discussed.

  18. Software for 3D diagnostic image reconstruction and analysis

    International Nuclear Information System (INIS)

    Taton, G.; Rokita, E.; Sierzega, M.; Klek, S.; Kulig, J.; Urbanik, A.

    2005-01-01

    Recent advances in computer technologies have opened new frontiers in medical diagnostics. Interesting possibilities are the use of three-dimensional (3D) imaging and the combination of images from different modalities. Software prepared in our laboratories devoted to 3D image reconstruction and analysis from computed tomography and ultrasonography is presented. In developing our software it was assumed that it should be applicable in standard medical practice, i.e. it should work effectively with a PC. An additional feature is the possibility of combining 3D images from different modalities. The reconstruction and data processing can be conducted using a standard PC, so low investment costs result in the introduction of advanced and useful diagnostic possibilities. The program was tested on a PC using DICOM data from computed tomography and TIFF files obtained from a 3D ultrasound system. Results from an anthropomorphic phantom and from patient data were taken into consideration. A new approach was used to achieve spatial correlation of two independently obtained 3D images. The method relies on the use of four pairs of markers within the regions under consideration. The user selects the markers manually and the computer calculates the transformations necessary for coupling the images. The main software feature is the possibility of 3D image reconstruction from a series of two-dimensional (2D) images. The reconstructed 3D image can be: (1) viewed with the most popular methods of 3D image viewing, (2) filtered and processed to improve image quality, (3) analyzed quantitatively (geometrical measurements), and (4) coupled with another, independently acquired 3D image. The reconstructed and processed 3D image can be stored at every stage of image processing. The overall software performance was good considering the relatively low costs of the hardware used and the huge data sets processed. The program can be freely used and tested (source code and program available at
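
    The marker-based coupling step can be realized with the classical SVD-based (Kabsch) solution for a rigid transform from point pairs; the following hedged Python sketch uses invented coordinates and is not necessarily the algorithm the authors implemented:

        import numpy as np

        def rigid_transform(P, Q):
            """Least-squares rotation R and translation t with R @ p + t ~ q."""
            P, Q = np.asarray(P, float), np.asarray(Q, float)
            cP, cQ = P.mean(axis=0), Q.mean(axis=0)
            H = (P - cP).T @ (Q - cQ)          # 3x3 covariance of centred landmarks
            U, _, Vt = np.linalg.svd(H)
            d = np.sign(np.linalg.det(Vt.T @ U.T))
            R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # proper rotation, no reflection
            t = cQ - R @ cP
            return R, t

        ct = [[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]]   # markers in CT volume
        us = [[1, 2, 3], [1, 2, 13], [1, 12, 3], [-9, 2, 3]]   # same markers in 3D US
        R, t = rigid_transform(ct, us)
        print(np.round(R, 3))
        print(np.round(t, 3))

    With four (or more) marker pairs the problem is overdetermined, so small manual picking errors are averaged out rather than propagated.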

  19. Phenomenology and Qualitative Data Analysis Software (QDAS: A Careful Reconciliation

    Directory of Open Access Journals (Sweden)

    Brian Kelleher Sohn

    2017-01-01

    An oft-cited phenomenological methodologist, Max VAN MANEN (2014), claims that qualitative data analysis software (QDAS) is not an appropriate tool for phenomenological research. Yet phenomenologists rarely describe how phenomenology is to be done: pencil, paper, computer? DAVIDSON and DI GREGORIO (2011) urge QDAS contrarians such as VAN MANEN to get over their methodological loyalties and join the digital world, claiming that all qualitative researchers, whatever their methodology, perform processes aided by QDAS: disaggregation and recontextualization of texts. Other phenomenologists exemplify DAVIDSON and DI GREGORIO's observation that arguments against QDAS often identify problems more closely related to the researchers than to QDAS. But the concerns about technology of McLUHAN (2003 [1964]), HEIDEGGER (2008 [1977]), and FLUSSER (2013) cannot be ignored. In this conceptual article I answer the questions of phenomenologists and the call of QDAS methodologists by describing how I used QDAS to carry out a phenomenological study, in order to guide others who choose to reconcile the use of software to assist their research. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1701142

  20. CSpace: an integrated workplace for the graphical and algebraic analysis of phase assemblages on 32-bit wintel platforms

    Science.gov (United States)

    Torres-Roldan, Rafael L.; Garcia-Casco, Antonio; Garcia-Sanchez, Pedro A.

    2000-08-01

    CSpace is a program for the graphical and algebraic analysis of composition relations within chemical systems. The program is particularly suited to the needs of petrologists, but could also prove useful for mineralogists, geochemists and other environmental scientists. A few examples of what can be accomplished with CSpace are the mapping of compositions into some desired set of system/phase components, the estimation of reaction/mixing coefficients and assessment of phase-rule compatibility relations within or between complex mineral assemblages. The program also allows dynamic inspection of compositional relations by means of barycentric plots. CSpace provides an integrated workplace for data management, manipulation and plotting. Data management is done through a built-in spreadsheet-like editor, which also acts as a data repository for the graphical and algebraic procedures. Algebraic capabilities are provided by a mapping engine and a matrix analysis tool, both of which are based on singular-value decomposition. The mapping engine uses a general approach to linear mapping, capable of handling determined, underdetermined and overdetermined problems. The matrix analysis tool is implemented as a task "wizard" that guides the user through a number of steps to perform matrix approximation (finding nearest rank-deficient models of an input composition matrix), and inspection of null-reaction space relationships (i.e. of implicit linear relations among the elements of the composition matrix). Graphical capabilities are provided by a graph engine that directly links with the contents of the data editor. The graph engine can generate sophisticated 2D ternary (triangular) and 3D quaternary (tetrahedral) barycentric plots and includes features such as interactive re-sizing and rotation, on-the-fly coordinate scaling and support for automated drawing of tie lines.
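
    Both algebraic tools rest on the singular-value decomposition. The following hedged Python sketch applies the two operations to an invented composition matrix (phases in rows, components in columns), illustrating the idea rather than CSpace's code:

        import numpy as np

        A = np.array([[1.0, 2.0, 0.0],     # toy phase compositions
                      [0.0, 1.0, 1.0],
                      [1.0, 3.0, 1.0],     # = row0 + row1: a built-in dependence
                      [2.0, 5.0, 1.0]])    # = 2*row0 + row1

        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        print("singular values:", s.round(6))   # near-zero values flag dependences

        k = 2                                   # nearest rank-2 model (Eckart-Young)
        A_k = (U[:, :k] * s[:k]) @ Vt[:k]

        # Null space of A_k's row space: coefficient vectors v with v @ A_k = 0,
        # i.e. implicit linear (mass-balance) relations among the phases.
        relations = np.linalg.svd(A_k.T)[2][k:]
        print("reaction space:\n", relations.round(3))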

  1. Experimental software for modeling and interpreting educational data analysis processes

    Directory of Open Access Journals (Sweden)

    Natalya V. Zorina

    2017-12-01

    Full Text Available This article considers the problems, tasks and processes of educational data mining. The objective is to create a fundamentally new university information system that uses the results of educational data analysis; one function of such a system is extracting knowledge from the data accumulated during operation. Creating a system of this type is an iterative, time-consuming process that requires preliminary studies and incremental prototyping of modules, and because few existing systems have been developed with this methodology, a series of experiments was carried out to collect data, choose appropriate methods of study and interpret the results. Through these experiments the authors identified the data sources available for analysis in the information environment of their home university: semester performance records from the information system of the training department of the Institute of IT, MTU MIREA; data produced by students' independent work; and data gathered with specially designed Google Forms. To automate the collection and analysis of educational data, an experimental software package was created, its development guided by the methodologies of rational-empirical complexes (REX) and single-experimentation program technologies (TPEI). The program implementation of the package is described in detail, and conclusions are drawn about the availability of the data sources used and the prospects for further development.

  2. Introduction to regression graphics

    CERN Document Server

    Cook, R Dennis

    2009-01-01

    Covers the use of dynamic and interactive computer graphics in linear regression analysis, focusing on analytical graphics. Features new techniques such as plot rotation. The authors have composed their own regression code, written in the Xlisp-Stat language and called R-code, which is a nearly complete system for linear regression analysis and can be utilized as the main computer program in a linear regression course. The accompanying disks, for both Macintosh and Windows computers, contain the R-code and Xlisp-Stat. An Instructor's Manual presenting detailed solutions to all the problems in the book is available.

  3. Visual data mining and analysis of software repositories

    NARCIS (Netherlands)

    Voinea, S.L.; Telea, A.C.

    2007-01-01

    In this article we describe an ongoing effort to integrate information visualization techniques into the process of configuration management for software systems. Our focus is to help software engineers manage the evolution of large and complex software systems by offering them effective and

  4. MeV+R: using MeV as a graphical user interface for Bioconductor applications in microarray analysis

    Science.gov (United States)

    Chu, Vu T; Gottardo, Raphael; Raftery, Adrian E; Bumgarner, Roger E; Yeung, Ka Yee

    2008-01-01

    We present MeV+R, an integration of the Java MultiExperiment Viewer program with Bioconductor packages. This integration of MultiExperiment Viewer and R is easily extensible to other R packages and provides users with point-and-click access to traditionally command-line-driven tools written in R. We demonstrate the ability to use MultiExperiment Viewer as a graphical user interface for Bioconductor applications in microarray data analysis by incorporating three Bioconductor packages: RAMA, BRIDGE and iterativeBMA. PMID:18652698

  5. Automated software analysis of nuclear core discharge data

    International Nuclear Information System (INIS)

    Larson, T.W.; Halbig, J.K.; Howell, J.A.; Eccleston, G.W.; Klosterbuer, S.F.

    1993-03-01

    Monitoring the fueling process of an on-load nuclear reactor is a full-time job for nuclear safeguarding agencies. Nuclear core discharge monitors (CDMs) can provide continuous, unattended recording of the reactor's fueling activity for later qualitative review by a safeguards inspector. A quantitative analysis of the collected data could prove a great asset to inspectors, because more information can be extracted from the data and the analysis time can be reduced considerably. This paper presents a prototype of an automated software analysis system capable of identifying when fuel bundle pushes occurred and of monitoring the power level of the reactor. Neural network models were developed for calculating the region of the reactor face from which the fuel was discharged and for predicting the burnup; these models were created and tested using actual data collected from a CDM system at an on-load reactor facility (an illustrative stand-in for such a model is sketched below). Collectively, these automated quantitative analysis programs could help safeguarding agencies gain a better perspective on the complete picture of the fueling activity of an on-load nuclear reactor. A system of this type can provide a cost-effective solution for automated monitoring of on-load reactors, significantly reducing time and effort.
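
    The record does not publish the original models or data, so the following is only an illustrative stand-in: a small feed-forward network regressing burnup on detector-count features, with invented feature names, shapes and data.

    ```python
    # Hypothetical sketch of a burnup-prediction regressor; not the authors'
    # model. Training data are synthetic.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    X = rng.uniform(0, 1, size=(500, 6))     # e.g. normalized detector counts
    burnup = 3.0 * X[:, 0] + 1.5 * X[:, 3] + rng.normal(0, 0.05, 500)

    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                         random_state=0)
    model.fit(X[:400], burnup[:400])         # train on the first 400 samples
    print("held-out R^2:", round(model.score(X[400:], burnup[400:]), 3))
    ```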

  6. Integrated Software Environment for Pressurized Thermal Shock Analysis

    Directory of Open Access Journals (Sweden)

    Dino Araneo

    2011-01-01

    Full Text Available The present paper describes the main features, and an application to a real Nuclear Power Plant (NPP), of an Integrated Software Environment (in the following referred to as “platform”) developed at the University of Pisa (UNIPI) to perform Pressurized Thermal Shock (PTS) analysis. The platform is written in Java for portability and implements all the steps foreseen in the methodology developed at UNIPI for the deterministic analysis of PTS scenarios. The methodology starts with the thermal hydraulic analysis of the NPP with a system code (such as Relap5-3D or Cathare2) during a selected transient scenario. The results so obtained are then processed to provide boundary conditions for the next step, a CFD calculation. Once the system pressure and the RPV wall temperature are known, the stresses inside the RPV wall can be calculated by means of a Finite Element (FE) code. The last step of the methodology is the Fracture Mechanics (FM) analysis, using weight functions, aimed at evaluating the stress intensity factor (KI) at the crack tip, to be compared with the critical stress intensity factor (KIc); a generic sketch of this step is given below. The platform automates all the steps foreseen in the methodology once the user specifies a number of boundary conditions at the beginning of the simulation.
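
    A minimal sketch of the weight-function idea behind the FM step: KI is an integral of the through-wall stress profile against a weight function. The textbook weight function for a center crack in an infinite plate is used purely for illustration; real RPV analyses use geometry-specific weight functions, and the stress profile below is invented.

    ```python
    # K_I = integral_0^a sigma(x) * m(x, a) dx, with the illustrative weight
    # function m = 2 / (sqrt(pi*a) * sqrt(1 - (x/a)^2)). The substitution
    # x = a*sin(t) removes the endpoint singularity: the dx = a*cos(t) dt
    # factor cancels the 1/cos(t) in m.
    import numpy as np

    def k_i(sigma, a, n=2000):
        t = np.linspace(0.0, np.pi / 2.0, n)
        x = a * np.sin(t)
        integrand = sigma(x) * (2.0 / np.sqrt(np.pi * a)) * a
        return np.trapz(integrand, t)

    # Hypothetical linear through-wall stress profile (e.g. from the FE step):
    sigma = lambda x: 200.0 - 150.0 * x / 0.02          # MPa
    print(k_i(sigma, a=0.02))   # MPa*sqrt(m); compare against K_Ic
    ```

    Sanity check: for a uniform stress p the routine reproduces the closed form KI = p*sqrt(pi*a).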

  7. Cost Analysis of Poor Quality Using a Software Simulation

    Directory of Open Access Journals (Sweden)

    Jana Fabianová

    2017-02-01

    Full Text Available The issues of quality, the cost of poor quality and the factors affecting quality are crucial to maintaining business competitiveness. Software applications and computer simulation enable more effective quality management: simulation tools make it possible to incorporate the variability of several input variables in an experiment and to evaluate their combined impact on the final output. The article presents a case study on the use of Monte Carlo computer simulation in quality management. Two approaches for determining the cost of poor quality are introduced: a retrospective one, in which the cost of poor quality in the production process is calculated from historical data, and a prospective one, which uses probabilistic characterizations of the input variables by means of simulation (a minimal sketch of this approach follows). Simulation outputs in the form of tornado and sensitivity charts complement the risk analysis.
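
    A minimal Monte Carlo sketch of the prospective approach, assuming invented distributions and cost parameters; the case study's actual inputs are not given in the record.

    ```python
    # Simulate the cost of poor quality (COPQ) from random cost drivers and
    # rank inputs by rank correlation, as a crude tornado-style sensitivity.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(seed=1)
    n = 100_000
    units = 50_000                                        # annual volume

    scrap_rate  = rng.triangular(0.01, 0.02, 0.05, n)     # fraction scrapped
    rework_rate = rng.triangular(0.02, 0.04, 0.08, n)
    cost_scrap  = rng.normal(12.0, 1.5, n)                # cost per scrapped unit
    cost_rework = rng.normal(4.0, 0.8, n)

    copq = units * (scrap_rate * cost_scrap + rework_rate * cost_rework)
    print(f"mean COPQ: {copq.mean():,.0f}  95th pct: {np.percentile(copq, 95):,.0f}")

    for name, v in [("scrap_rate", scrap_rate), ("rework_rate", rework_rate),
                    ("cost_scrap", cost_scrap), ("cost_rework", cost_rework)]:
        print(name, round(spearmanr(v, copq)[0], 2))
    ```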

  8. Knowledge-based requirements analysis for automating software development

    Science.gov (United States)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention takes the form of providing problem- and domain-specific engineering knowledge, not writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.

  9. Nonlinear analysis of reinforced concrete structures using software package abaqus

    Directory of Open Access Journals (Sweden)

    Marković Nemanja

    2014-01-01

    Full Text Available Reinforced concrete (RC) is characterized by strong inhomogeneity, arising from the material characteristics of concrete, and by quasi-brittle behavior at failure. These and other phenomena require the introduction of material nonlinearity into the modeling of reinforced concrete structures. This paper presents the modeling of reinforced concrete in the software package ABAQUS. A brief theoretical overview is given of methods such as Concrete Damage Plasticity (CDP), Smeared Concrete Cracking (CSC), Cap Plasticity (CP) and the Drucker-Prager model (DPM). We performed a nonlinear analysis of a two-storey reinforced concrete frame, applying the CDP method to model the material nonlinearity of the concrete, and analyzed the damage zones, crack propagation and load-deflection behavior.

  10. The Database and Data Analysis Software of Radiation Monitoring System

    International Nuclear Information System (INIS)

    Wang Weizhen; Li Jianmin; Wang Xiaobing; Hua Zhengdong; Xu Xunjiang

    2009-01-01

    Shanghai Synchrotron Radiation Facility (SSRF for short) is a third-generation light source being built in China, comprising a 150 MeV injector, a 3.5 GeV booster, a 3.5 GeV storage ring and a number of beamline stations. Data are fetched by the monitoring computer from collecting modules at the front end and saved in a MySQL database on the managing computer. The data analysis software is written in Python, a scripting language, to query, summarize and plot the data of a given monitoring channel over a given period and export the results to an external file; warning events can be queried separately (a hedged sketch of such a query-and-plot task is given below). The website for historical and real-time data inquiry and plotting is written in PHP. (authors)
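
    A hedged sketch of the kind of query-summarize-plot task described above. The table and column names, channel id and credentials are invented; the SSRF schema is not published in the record.

    ```python
    # Query one monitoring channel over a period from MySQL and export a plot.
    # Uses the mysql-connector-python package; all identifiers are assumptions.
    import matplotlib.pyplot as plt
    import mysql.connector

    conn = mysql.connector.connect(host="localhost", user="reader",
                                   password="secret", database="radmon")
    cur = conn.cursor()
    cur.execute(
        "SELECT ts, dose_rate FROM channel_readings "
        "WHERE channel_id = %s AND ts BETWEEN %s AND %s ORDER BY ts",
        (42, "2009-01-01", "2009-01-31"))
    rows = cur.fetchall()
    conn.close()

    times, doses = zip(*rows)
    plt.plot(times, doses)
    plt.xlabel("time")
    plt.ylabel("dose rate")
    plt.savefig("channel_42_jan2009.png")   # export to an external file
    ```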

  11. Image analysis software for following progression of peripheral neuropathy

    Science.gov (United States)

    Epplin-Zapf, Thomas; Miller, Clayton; Larkin, Sean; Hermesmeyer, Eduardo; Macy, Jenny; Pellegrini, Marco; Luccarelli, Saverio; Staurenghi, Giovanni; Holmes, Timothy

    2009-02-01

    A relationship has been reported by several research groups [1-4] between the density and shapes of nerve fibers in the cornea and the existence and severity of peripheral neuropathy. Peripheral neuropathy is a complication of several prevalent diseases or conditions, including diabetes, HIV, prolonged alcohol overconsumption and aging. A common clinical technique for confirming the condition is intramuscular electromyography (EMG), which is invasive; a noninvasive technique like the one proposed here therefore carries important potential advantages for the physician and patient. A software program that automatically detects the nerve fibers, counts them and measures their shapes is being developed and tested (a generic sketch of such a pipeline is given below). Tests were carried out with a database of subjects with levels of severity of diabetic neuropathy as determined by EMG testing. Results from this testing, which include a linear regression analysis, are shown.
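
    The authors' actual algorithm is not described in the record, so this is only a generic fiber detection and counting sketch using a ridge filter, with invented threshold and length parameters.

    ```python
    # Generic fiber-counting pipeline: enhance curvilinear structures,
    # binarize, skeletonize, count skeleton segments, and compute a crude
    # density proxy. Parameters are assumptions, not the authors' values.
    import numpy as np
    from skimage import filters, measure, morphology

    def count_fibers(image, ridge_thresh=0.02, min_len_px=30):
        ridges = filters.frangi(image)           # emphasizes nerve-like ridges
        binary = ridges > ridge_thresh
        skel = morphology.skeletonize(binary)
        labels = measure.label(skel, connectivity=2)
        sizes = np.bincount(labels.ravel())[1:]  # pixels per skeleton segment
        n_fibers = int((sizes >= min_len_px).sum())
        density = skel.sum() / skel.size         # skeleton pixels per pixel
        return n_fibers, density
    ```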

  12. PROTEINCHALLENGE: Crowd sourcing in proteomics analysis and software development

    DEFF Research Database (Denmark)

    Martin, Sarah F.; Falkenberg, Heiner; Dyrlund, Thomas Franck

    2013-01-01

    In large-scale proteomics studies there is a temptation, after months of experimental work, to plug resulting data into a convenient, if poorly implemented, set of tools, which may neither do the data justice nor help answer the scientific question. In this paper we have captured key concerns, including arguments for community-wide open source software development and “big data” compatible solutions for the future. For the meantime, we have laid out ten top tips for data processing. With these at hand, a first large-scale proteomics analysis hopefully becomes less daunting to navigate, with the aim of setting a community-driven gold standard for data handling, reporting and sharing. This article is part of a Special Issue entitled: New Horizons and Applications for Proteomics [EuPA 2012].

  13. Development of Spectrometer Software for Electromagnetic Radiation Measurement and Analysis

    International Nuclear Information System (INIS)

    Mohd Idris Taib; Noor Ezati Shuib; Wan Saffiey Wan Abdullah

    2013-01-01

    This software is under development in LabVIEW for use with the StellarNet spectrometer system. The StellarNet spectrometer is supplied with the SpectraWiz operating software, which measures spectral data for real-time spectroscopy. The LabVIEW software accesses real-time data through the SpectraWiz dynamic link library, which serves as the hardware interface, and acquires the amplitude at each electromagnetic wavelength at periodic intervals. In addition to hardware interfacing, the user interface capabilities of the software include plotting of spectral data in various modes, including scope, absorbance, transmission and irradiance. The software can be used for research and development on the application, utilization and safety of electromagnetic radiation, especially solar, laser and ultraviolet sources. Off-line capabilities are extensive thanks to the mathematical and signal-processing functions in the LabVIEW add-on libraries. (author)

  14. GRAPHICAL ANALYSIS OF LAFFER'S THEORY FOR BENELUX COUNTRIES DURING 1995-2012

    Directory of Open Access Journals (Sweden)

    L. Bunescu

    2015-06-01

    Full Text Available Finding the tax burden rate that generates the largest amount of tax revenue has always attracted researchers' attention. The scarcity of public financial resources relative to public expenditure requires continuous monitoring of the paired concepts of fiscal pressure and tax revenue. The simplest and most practical approach is given by the well-known Laffer curve. This paper aims to determine a graphical representation of the curve for Belgium, the Netherlands and Luxembourg. The research is based on data provided by the European Commission for 18 years. For the Benelux countries, the optimum tax burden turns out to be very close to the maximum tax burden actually applied (the differences are below 1 percentage point, and zero for Belgium). Moreover, Luxembourg and Belgium are positioned in the admissible area of this theory, while the position of the Netherlands fluctuates. A sketch of the fitting exercise behind such a curve is given below.
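
    A hedged sketch of the graphical exercise: fit a concave (quadratic) Laffer curve to burden/revenue pairs and locate its peak. The data points below are invented; the paper uses European Commission series.

    ```python
    # Fit revenue ~ b2*t^2 + b1*t + b0 and read off the revenue-maximizing
    # tax burden at the vertex. Values are illustrative only.
    import numpy as np

    burden  = np.array([42.0, 43.1, 44.0, 44.8, 45.3, 45.9, 46.4])   # % of GDP
    revenue = np.array([95.0, 97.8, 99.5, 100.4, 100.7, 100.5, 99.9])  # index

    b2, b1, b0 = np.polyfit(burden, revenue, 2)   # concave fit: b2 < 0
    t_star = -b1 / (2.0 * b2)                     # vertex of the parabola
    print(f"estimated optimal tax burden: {t_star:.1f}% of GDP")
    ```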

  15. Librarian driven analysis with graphic user interface for nuclides quantification by gamma spectra

    Energy Technology Data Exchange (ETDEWEB)

    Kondrashov, V.S. E-mail: vlkondra@cdrewu.edu; Rothenberg, S.J.; Petersone, I

    2001-09-11

    For a set of a priori given radionuclides extracted from a general nuclide data library, the authors use median estimates of the gamma-peak areas to produce a list of possible radionuclides matching the observed gamma-ray line(s). The a priori list of nuclides is obtained by searching for matches with the energy information in the database. This procedure is performed in an interactive graphic mode: markers superimpose, on the spectral data, the energy and yield information provided by a general gamma-ray data library. This library of experimental data includes approximately 17,000 gamma-energy lines related to 756 known gamma-emitting radionuclides listed by the ICRP.
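
    A minimal sketch of the energy-matching step: measured peak energies are compared against library lines within a tolerance window. The three-line library and the tolerance below are toy stand-ins for the ~17,000-line library the record mentions.

    ```python
    # Match measured gamma-peak energies to candidate nuclides by energy.
    LIBRARY = [  # (nuclide, energy in keV, emission probability)
        ("Cs-137", 661.7, 0.851),
        ("Co-60", 1173.2, 0.999),
        ("Co-60", 1332.5, 1.000),
    ]

    def candidate_nuclides(peak_energies_kev, tol_kev=1.5):
        """Nuclides whose library lines match measured peaks within tol."""
        matches = {}
        for e_meas in peak_energies_kev:
            for nuclide, e_lib, emission in LIBRARY:
                if abs(e_meas - e_lib) <= tol_kev:
                    matches.setdefault(nuclide, []).append((e_meas, e_lib, emission))
        return matches

    print(candidate_nuclides([661.9, 1173.0, 1332.9]))
    ```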

  16. Fault tree synthesis for software design analysis of PLC based safety-critical systems

    International Nuclear Information System (INIS)

    Koo, S. R.; Cho, C. H.; Seong, P. H.

    2006-01-01

    As software verification and validation must be performed in the development of PLC-based safety-critical systems, software safety analysis is also considered across the entire software life cycle. In this paper, we propose a technique for software safety analysis in the design phase. Among the various software hazard analysis techniques, fault tree analysis is the most widely used for the safety analysis of nuclear power plant systems; it also has the most intuitive notation and supports both qualitative and quantitative analyses (a toy quantitative evaluation is sketched below). To analyze the design phase more effectively, we propose a fault tree synthesis technique, along with a universal fault tree template for the architecture modules of nuclear software. Consequently, the safety of software can be analyzed on the basis of fault tree synthesis. (authors)
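
    A minimal sketch of the quantitative side of fault tree analysis: top-event probability for a small AND/OR tree, assuming independent basic events. The tree structure and probabilities are invented, not taken from the paper.

    ```python
    # Evaluate a fault tree given as nested ('AND'|'OR', [children]) tuples,
    # where leaves are basic-event names with independent probabilities.
    def evaluate(node, p_basic):
        if isinstance(node, str):
            return p_basic[node]
        gate, children = node
        probs = [evaluate(c, p_basic) for c in children]
        if gate == "AND":                 # all children must fail
            out = 1.0
            for p in probs:
                out *= p
            return out
        out = 1.0                         # OR: 1 - prod(1 - p_i)
        for p in probs:
            out *= (1.0 - p)
        return 1.0 - out

    tree = ("OR", [("AND", ["sensor_fault", "watchdog_fault"]), "logic_error"])
    print(evaluate(tree, {"sensor_fault": 1e-3, "watchdog_fault": 2e-3,
                          "logic_error": 5e-5}))
    ```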

  17. BioXTAS RAW, a software program for high-throughput automated small-angle X-ray scattering data reduction and preliminary analysis

    DEFF Research Database (Denmark)

    Nielsen, S.S.; Toft, K.N.; Snakenborg, Detlef

    2009-01-01

    A fully open source software program for automated two-dimensional and one-dimensional data reduction and preliminary analysis of isotropic small-angle X-ray scattering (SAXS) data is presented. The program is freely distributed, following the open-source philosophy, and does not rely on any commercial software packages. BioXTAS RAW is a fully automated program that, via an online feature, reads raw two-dimensional SAXS detector output files and processes and plots data as the data files are created during measurement sessions. The software handles all steps in the data reduction, including mask creation, radial averaging, error bar calculation, artifact removal, normalization and q calibration. Further data reduction such as background subtraction and absolute intensity scaling is fast and easy via the graphical user interface. BioXTAS RAW also provides preliminary analysis of one-dimensional data.
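
    A sketch of the radial-averaging step named above: collapse a 2D detector image into a 1D intensity-versus-radius profile. The beam-center coordinates and the synthetic image are invented, and this is not BioXTAS RAW's own implementation.

    ```python
    # Mean intensity in annular bins around the beam center.
    import numpy as np

    def radial_average(image, center, n_bins=200, mask=None):
        y, x = np.indices(image.shape)
        r = np.hypot(x - center[0], y - center[1])
        if mask is not None:                     # exclude beamstop/bad pixels
            r, image = r[mask], image[mask]
        bins = np.linspace(0, r.max(), n_bins + 1)
        which = np.clip(np.digitize(r.ravel(), bins) - 1, 0, n_bins - 1)
        total = np.bincount(which, weights=image.ravel(), minlength=n_bins)
        count = np.bincount(which, minlength=n_bins)
        return bins[:-1], total / np.maximum(count, 1)

    img = np.random.poisson(100, (512, 512)).astype(float)  # synthetic frame
    radius, intensity = radial_average(img, center=(256, 256))
    ```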

  18. Software for rapid time dependent ChIP-sequencing analysis (TDCA).

    Science.gov (United States)

    Myschyshyn, Mike; Farren-Dai, Marco; Chuang, Tien-Jui; Vocadlo, David

    2017-11-25

    Chromatin immunoprecipitation followed by DNA sequencing (ChIP-seq) and associated methods are widely used to define the genome-wide distribution of chromatin-associated proteins, post-translational epigenetic marks, and modifications found on DNA bases. An area of emerging interest is the study of time-dependent changes in the distribution of such proteins and marks, using serial ChIP-seq experiments performed in a time-resolved manner. Although such time-resolved studies are becoming increasingly common, software to facilitate their analysis in a robust, automated manner is limited. We have designed software called Time-Dependent ChIP-Sequencing Analyser (TDCA), the first program to automate analysis of time-dependent ChIP-seq data by fitting sigmoidal curves (a conceptual sketch of this fitting is given below). We provide users with guidance for experimental design in TDCA modeling of time course (TC) ChIP-seq data using two simulated data sets. Furthermore, we demonstrate that this fitting strategy is widely applicable by showing that automated analysis of three previously published TC data sets accurately recapitulates key findings reported in those studies. Using each of these data sets, we highlight how biologically relevant findings can be readily obtained by exploiting TDCA to yield intuitive parameters that describe behavior at either a single locus or sets of loci. TDCA enables customizable analysis of user-input aligned DNA sequencing data, coupled with graphical outputs in the form of publication-ready figures that describe behavior at either individual loci or sets of loci sharing common traits defined by the user. TDCA accepts sequencing data as standard binary alignment map (BAM) files and loci of interest in browser extensible data (BED) file format. TDCA accurately models the number of sequencing reads, or coverage, at loci from TC ChIP-seq studies or conceptually related TC sequencing experiments. TC experiments are reduced to intuitive parametric values that facilitate biologically
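
    A conceptual sketch of the curve fitting TDCA automates: coverage at one locus across a time course fit to a logistic (sigmoidal) model. The data are simulated, and TDCA's own parameterization may differ.

    ```python
    # Fit read coverage vs. time to a four-parameter logistic curve.
    import numpy as np
    from scipy.optimize import curve_fit

    def logistic(t, base, amp, t_half, slope):
        return base + amp / (1.0 + np.exp(-(t - t_half) / slope))

    t = np.array([0, 1, 2, 4, 6, 8, 12, 24], float)            # hours
    cov = np.array([10, 11, 14, 25, 38, 44, 48, 50], float)    # reads, one locus

    p0 = [cov.min(), np.ptp(cov), float(np.median(t)), 1.0]    # initial guess
    params, _ = curve_fit(logistic, t, cov, p0=p0, maxfev=10_000)
    base, amp, t_half, slope = params
    print(f"half-maximal time: {t_half:.1f} h, amplitude: {amp:.0f} reads")
    ```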

  19. AGSuite: Software to conduct feature analysis of artificial grammar learning performance.

    Science.gov (United States)

    Cook, Matthew T; Chubala, Chrissy M; Jamieson, Randall K

    2017-10-01

    To simplify the problem of studying how people learn natural language, researchers use the artificial grammar learning (AGL) task. In this task, participants study letter strings constructed according to the rules of an artificial grammar and subsequently attempt to discriminate grammatical from ungrammatical test strings. Although the data from these experiments are usually analyzed by comparing the mean discrimination performance between experimental conditions, this practice discards information about the individual items and participants that could otherwise help uncover the particular features of strings associated with grammaticality judgments. However, feature analysis is tedious to compute, often complicated, and ill-defined in the literature. Moreover, the data violate the assumption of independence underlying standard linear regression models, leading to Type I error inflation. To solve these problems, we present AGSuite, a free Shiny application for researchers studying AGL. The suite's intuitive Web-based user interface allows researchers to generate strings from a database of published grammars, compute feature measures (e.g., Levenshtein distance) for each letter string, and conduct a feature analysis on the strings using linear mixed effects (LME) analyses. The LME analysis solves the inflation of Type I errors that afflicts more common methods of repeated measures regression analysis. Finally, the software can generate a number of graphical representations of the data to support an accurate interpretation of results. We hope the ease and availability of these tools will encourage researchers to take full advantage of item-level variance in their datasets in the study of AGL. We moreover discuss the broader applicability of the tools for researchers looking to conduct feature analysis in any field.
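
    Among the feature measures mentioned above is the Levenshtein (edit) distance between letter strings; a standard dynamic-programming sketch follows, not AGSuite's own code.

    ```python
    # Minimum number of insertions, deletions and substitutions turning a
    # into b, computed with two rolling rows of the DP table.
    def levenshtein(a: str, b: str) -> int:
        prev = list(range(len(b) + 1))
        for i, ca in enumerate(a, 1):
            cur = [i]
            for j, cb in enumerate(b, 1):
                cur.append(min(prev[j] + 1,                 # deletion
                               cur[j - 1] + 1,              # insertion
                               prev[j - 1] + (ca != cb)))   # substitution
            prev = cur
        return prev[-1]

    # Distance of a test string to a grammatical study string:
    print(levenshtein("MTVRXR", "MTVRX"))   # -> 1
    ```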

  20. MulRF: a software package for phylogenetic analysis using multi-copy gene trees.

    Science.gov (United States)

    Chaudhary, Ruchi; Fernández-Baca, David; Burleigh, John Gordon

    2015-02-01

    MulRF is a platform-independent software package for phylogenetic analysis using multi-copy gene trees. It seeks the species tree that minimizes the Robinson-Foulds (RF) distance to the input trees, using a generalization of the RF distance to multi-labeled trees (a toy illustration of the RF distance is given below). The underlying generic tree distance measure and fast running time make MulRF useful for inferring phylogenies from large collections of gene trees, in which multiple evolutionary processes as well as phylogenetic error may contribute to gene tree discord. MulRF implements several features for customizing the species tree search and assessing the results, and it provides a user-friendly graphical user interface (GUI) with tree visualization. The species tree search is implemented in C++ and the GUI in Java Swing. MulRF's executable, sample datasets and manual are available at http://genome.cs.iastate.edu/CBL/MulRF/; the source code is available at https://github.com/ruchiherself/MulRFRepo. Contact: ruchic@ufl.edu. Supplementary data are available at Bioinformatics online.
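
    The classic RF distance that MulRF generalizes counts the splits (bipartitions) found in one tree but not the other. A toy sketch on precomputed split sets follows; real tools extract and canonicalize splits from Newick trees, which is omitted here.

    ```python
    # RF distance as the symmetric difference of two trees' split sets.
    def rf_distance(splits_a, splits_b):
        return len(splits_a ^ splits_b)

    # Splits encoded as frozensets of taxa on one side of each internal edge
    # (assumed already canonicalized to a consistent side):
    tree1 = {frozenset({"A", "B"}), frozenset({"A", "B", "C"})}
    tree2 = {frozenset({"A", "B"}), frozenset({"A", "B", "D"})}
    print(rf_distance(tree1, tree2))   # -> 2
    ```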